On 6/3/2015 2:41 PM, MichaelQuigley@xxxxxxxxxx wrote:

The whole process is that we export data from the IBM i using CPYTOIMPF to
a delimited file--i.e., using DTAFMT(*DLM) and RMVBLANK(*TRAILING). Some
of the character fields are all blanks. In the resulting comma-separated
file, these all-blank fields are rendered as " "--i.e., Quote,Space,Quote.
The PC process which later uses the data doesn't interpret these as blank
fields. So I wrote a little utility to take out the intervening space
between the two quotes.

At this point you probably aren't interested in a change, but sed can do
that in one step. Posted for the benefit of the archives.

cat midrange.csv

"Now", "is", "1345", "in the afternoon", " ", "the previous had only
a space", "."

sed -e "s/, \" \"/, \"\"/" < midrange.csv

"Now", "is", "1345", "in the afternoon", "", "the previous had only a
space", "."

Redirection would put it into another file:
sed -e "the regex" < fromfile > tofile

'Some people, when confronted with a problem, think "I know, I'll use
regular expressions." Now they have two problems.' -- Jamie Zawinski

http://blog.codinghorror.com/regular-expressions-now-you-have-two-problems/

