Kelly,

You can use an interim file: do the CPYFRMIMPF into that file and then either:
a) CPYF thatfile FROMRCD(2) MBROPT(*ADD)
b) process it sequentially in an RPG program, positioning to the second record
(e.g. with SETLL) before you start reading the file.
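Alternatively, the heading can be stripped from the stream file itself before CPYFRMIMPF ever runs, using the IFS C APIs (open(), read(), write(), close()). A minimal sketch, in C rather than RPG; the function name, buffer size, and error handling are my own choices, and the caller supplies the source and target paths:

```c
#include <fcntl.h>
#include <string.h>
#include <unistd.h>

/* Copy a stream file, skipping everything up to and including the
   first '\n' (the heading line). Returns 0 on success, -1 on error.
   Paths are supplied by the caller; nothing here is IBM-specific
   beyond the assumption of POSIX-style IFS APIs. */
int strip_header(const char *src, const char *dst)
{
    int in = open(src, O_RDONLY);
    if (in < 0) return -1;
    int out = open(dst, O_WRONLY | O_CREAT | O_TRUNC, 0644);
    if (out < 0) { close(in); return -1; }

    char buf[4096];
    ssize_t n;
    int skipping = 1;                      /* still inside the heading line */
    while ((n = read(in, buf, sizeof buf)) > 0) {
        char *p = buf;
        if (skipping) {
            char *nl = memchr(buf, '\n', (size_t)n);
            if (!nl) continue;             /* heading continues past buffer */
            skipping = 0;
            n -= (nl + 1 - buf);           /* keep only bytes after '\n' */
            p = nl + 1;
        }
        if (n > 0 && write(out, p, (size_t)n) != n) { n = -1; break; }
    }
    close(in);
    close(out);
    return n < 0 ? -1 : 0;
}
```

Run it from the real CSV into a scratch stream file, then CPYFRMIMPF the scratch file with no special first-record handling.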

Just some thoughts.

Regards,
Carel Teijgeler.

*********** REPLY SEPARATOR  ***********

On 19-3-04 at 10:49 Kelly Cookson wrote:

>I'm wanting to use an IFS stream file (CSV) to serve as an interface between MS
>Excel and an iSeries DB2 file. I
>have a CL program that uses CPYTOIMPF and CPYFRMIMPF to transfer data between
>the DB2 file and the stream
>file. However, the copy commands do not deal with headers for the stream file.
>
>I know I can put headers in the first record of the DB2 file, but this puts 
>limitations on how fields are
>defined in the DB2 file. For example, I want to use DB2 files that have 
>uniquely keyed numeric fields.
>
>I also have a way to put headings into the CSV file using the DSPFFD command
>and the IFS APIs in an HLL
>program. (DSPFFD to an outfile, call HLL to read outfile, create heading line 
>for CSV file, use IFS
>APIs to clear CSV file and write heading line, then perform CPYTOIMPF with the 
>*ADD option for data
>following the heading line).
>
>My problem is dealing with the headings when transferring data from the CSV to
>a DB2 file not designed
>to handle a record for headings. My only idea at this point is to use the IFS 
>APIs in an HLL program to read
>all CSV records except the first one into a temporary CSV file, then perform 
>CPYFRMIMPF on the temporary CSV
>file.  
>
>Anyone have any better ideas for deleting the first line (headings) in a 
>stream file in the IFS?
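On the export side, the step Kelly describes above — clear the CSV, write the heading line, then CPYTOIMPF with the *ADD option — could use the same IFS APIs. A sketch, assuming the heading text has already been built from the DSPFFD outfile; the function name is hypothetical:

```c
#include <fcntl.h>
#include <string.h>
#include <unistd.h>

/* Truncate the stream file and write a single heading line, leaving
   the file ready for CPYTOIMPF ... MBROPT(*ADD) to append the data.
   Returns 0 on success, -1 on error. */
int write_heading(const char *path, const char *heading)
{
    /* O_TRUNC clears any old contents, so no separate clear step */
    int fd = open(path, O_WRONLY | O_CREAT | O_TRUNC, 0644);
    if (fd < 0) return -1;
    size_t len = strlen(heading);
    if (write(fd, heading, len) != (ssize_t)len ||
        write(fd, "\n", 1) != 1) {         /* terminate the heading record */
        close(fd);
        return -1;
    }
    return close(fd);                      /* 0 on success */
}
```

Because O_TRUNC empties the file on open, the "clear CSV file" step and the heading write collapse into one call.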





This mailing list archive is Copyright 1997-2022 by midrange.com and David Gibbs as a compilation work. Use of the archive is restricted to research of a business or technical nature. Any other uses are prohibited. Full details are available on our policy page. If you have questions about this, please contact [javascript protected email address].