It is possible in IBM i OS to define an FDF (Field Description File), like the one used with IBM i Access file transfer. CPYFRMIMPF can then use the FDF to map the CSV data into a matching DB2 table. Honestly, this is still REALLY UGLY, and I would probably parse the CSV data directly from the stream file...
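The direct-parse idea looks roughly like this. A minimal sketch in Python (purely illustrative; on IBM i the same loop could be written in RPGLE reading the IFS stream file, and the function name and path are my own inventions):

```python
import csv

# Sketch: read the CSV straight from the stream file.
# The first record is the column-header row, so each data
# row can be paired with the field names it is meant to update.
def read_csv_records(path):
    with open(path, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)          # first record: field names
        for row in reader:
            # pair each value with its column name
            yield dict(zip(header, row))
```

Each yielded record is then a name-to-value mapping, which sidesteps the fixed-column problem entirely.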
From: midrange-l-bounces@xxxxxxxxxxxx [mailto:midrange-l-bounces@xxxxxxxxxxxx] On Behalf Of Michael Schutte
Sent: Monday, February 04, 2013 3:27 PM
To: Midrange Systems Technical Discussion
Subject: Looking for Ideas For Processing CSV
I have a request to process a CSV file and import the data into a file.
The first record will have column headers with the names of the fields that
they're supposed to update.
For example, if file ITEM_MASTER has fields ITEM and DESCRIPTION,
the CSV would look like this:
"ITEM","DESCRIPTION"
"1234","ITEM NUMBER 1"
The user would like to first check whether the item already exists,
then update it when found or write a new record.
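That update-or-write step can be sketched as a simple upsert. Again a hedged Python illustration, using an in-memory dict as a stand-in for ITEM_MASTER (on the real system this would be an SQL UPDATE/INSERT pair or a MERGE):

```python
# Hypothetical stand-in for the ITEM_MASTER file, keyed by ITEM.
def upsert_item(item_master, record):
    key = record["ITEM"]
    if key in item_master:
        item_master[key].update(record)   # item exists: update it
        return "updated"
    item_master[key] = dict(record)       # not found: write a new record
    return "inserted"
```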
So I got to thinking (bad idea): maybe I could somehow take the entire CSV
record and put it into an array, easily... For example, with an SQL
procedure or function you could:
SET fields = ARRAY['ITEM', 'DESCRIPTION'];
SET data = ARRAY['1234', 'ITEM NUMBER 1'];
Then I could loop through the arrays and do validation; when a record
passes, update the item master file.
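The loop over the two parallel arrays might look like this, sketched in Python for illustration (the validator functions are hypothetical examples, not anything from the actual item master rules):

```python
# Sketch of the two-array idea: one array of field names, one of data,
# walked in step so each value is checked by its field's validator.
def validate_record(fields, data, validators):
    errors = []
    for name, value in zip(fields, data):
        check = validators.get(name)
        if check is not None and not check(value):
            errors.append(name)
    return errors   # empty list means the record passed
```

The same shape works however many columns the import file happens to carry, since the header row drives which validators run.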
What do you think? Is there a better method when the number of columns in
the import file could differ?
If you like this idea: I'm having trouble finding examples of using SQL
arrays in RPGLE.
P.S. There will be a ton of validation, and the data will be put into a
holding file prior to updating the ITEM master.