You're pretty much describing it as-is. We FTP the CSV file(s) - there may be 1 to n. They then get copied to a flat 2048-byte record before the preliminary transform. That then gets copied to a real PF.
The twist on this file is that I cannot do a SQL replace() against the commas, since they are also the field delimiter. For another partner I can replace() the commas with SQL, because they send tab-delimited files.
For this oddball, I just threw together a quickie RPG program that spins through the 2048-byte flat record and updates it in place, removing the $ and the thousands-separator comma.
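For the record, on a release with REGEXP_REPLACE (7.1 TR9 / 7.2 and up, I believe) the same cleanup could probably be done in one SQL pass by matching only the comma inside a quoted "$n,nnn.nn" value. Rough sketch - the file and field names are made up:

-- FLATWORK/FLATREC are placeholder names for the 2048-byte work file.
-- Matches a quoted "$n,nnn.nn" amount, drops the $ and the comma, and
-- leaves the comma field delimiters alone ($1/$2 are the capture groups).
-- Amounts of a million or more would need a more general pattern.
UPDATE FLATWORK
   SET FLATREC = REGEXP_REPLACE(FLATREC,
                                '"\$(\d{1,3}),(\d{3}\.\d{2})"',
                                '"$1$2"');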
Roger Harman
COMMON Certified Application Developer - ILE RPG on IBM i on Power
________________________________
From: MIDRANGE-L <midrange-l-bounces@xxxxxxxxxxxx> on behalf of Justin Dearing <zippy1981@xxxxxxxxx>
Sent: Tuesday, September 6, 2016 12:11 PM
To: Midrange Systems Technical Discussion
Subject: Re: Removing thousands separator from CSV quoted string
Roger,
You need to load the file first, in its original format, into a loading table.
Then transform it. Then put it into the final table. Whatever UDF you have
that turns money into decimals should be a scalar UDF that takes a string
as input and outputs a decimal.
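Something like this, as a plain SQL scalar UDF - the name and sizes are just placeholders:

-- Takes the raw money text ("$2,123.45", quotes and all) and returns a decimal.
CREATE FUNCTION MONEY_TO_DEC (MONEY_TXT VARCHAR(32))
   RETURNS DECIMAL(15, 2)
   LANGUAGE SQL
   DETERMINISTIC
   NO EXTERNAL ACTION
   RETURN DECIMAL(REPLACE(REPLACE(REPLACE(MONEY_TXT, '"', ''), '$', ''), ',', ''), 15, 2);

Every load that has a money column can then just call MONEY_TO_DEC in its INSERT ... SELECT.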
If you load the raw CSV or tab-delimited file into a loading table (it can be
a temp table) and then transform it and load it into the real physical
file (sketched below), you won't have to worry about the file being comma-delimited. You will
end up with small reusable components for other file transformations. You
will also be able to easily add logging and reporting at different steps.
You can start recording line-by-line error messages and sending them back
to your trading partner via whatever FTP site they send the files to.
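The loading-table piece, roughly - file and column names are made up, and the raw copy itself could be CPYFRMIMPF or whatever you do today:

-- Illustrative names only.
CREATE TABLE MYLIB.STG_INVOICE (
   INV_NO     VARCHAR(20),
   INV_DATE   VARCHAR(10),
   AMOUNT_TXT VARCHAR(20)     -- arrives as "$2,123.45"
);

-- ... copy the raw CSV into MYLIB.STG_INVOICE here ...

INSERT INTO MYLIB.INVOICE (INV_NO, INV_DATE, AMOUNT)
   SELECT INV_NO,
          INV_DATE,
          MONEY_TO_DEC(AMOUNT_TXT)    -- the scalar UDF above
     FROM MYLIB.STG_INVOICE;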
Justin
On Tue, Sep 6, 2016 at 2:52 PM Roger Harman <roger.harman@xxxxxxxxxxx>
wrote:
I have a CSV file coming from a trading partner that uses formatted dollar
amounts - "$2,123.45". I need to remove the thousands separator comma
before I import the file into a real PF.
I receive a number of other formats from trading partners and have SQL
routines to remove the $ sign, trailer records, etc. from the data stream.
I remove commas in another file but it is tab-delimited.
Looking for suggestions on removing just the thousands separator, if it
exists, without killing the comma field separators. Preferably SQL, as a lot
of this upfront preprocessing happens in a generic CL that has the RUNSQL
routines I need.
I've only come up with brute force - search for $1,.. $2,.. etc.
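i.e. something along these lines, which obviously doesn't cover amounts with more than one leading digit:

-- Brute force: one REPLACE per leading digit (placeholder names again).
UPDATE FLATWORK
   SET FLATREC = REPLACE(REPLACE(REPLACE(FLATREC,
                   '$1,', '$1'),
                   '$2,', '$2'),
                   '$3,', '$3');   -- ... and so on through '$9,'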
Thanks.
Roger Harman
COMMON Certified Application Developer - ILE RPG on IBM i on Power
--
This is the Midrange Systems Technical Discussion (MIDRANGE-L) mailing list
To post a message email: MIDRANGE-L@xxxxxxxxxxxx
To subscribe, unsubscribe, or change list options,
visit: http://lists.midrange.com/mailman/listinfo/midrange-l
or email: MIDRANGE-L-request@xxxxxxxxxxxx
Before posting, please take a moment to review the archives
at http://archive.midrange.com/midrange-l.
Please contact support@xxxxxxxxxxxx for any subscription related questions.