


Frank,

Either way will, of course, work. The only time that I ever had to do this I used Scott Klement's tutorial (www.scottklement.com) and just read directly from the IFS; saves the step of copying to a PF.
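For anyone finding this in the archive later: a bare-bones sketch of the direct-read approach, roughly in the style of Scott's IFS tutorial. The path is invented, and the prototypes and constant values are from memory, so check them against his examples and QSYSINC before relying on them:

     D open            pr            10i 0 extproc('open')
     D   path                          *   value options(*string)
     D   oflag                       10i 0 value
     D read            pr            10i 0 extproc('read')
     D   fildes                      10i 0 value
     D   buf                           *   value
     D   nbytes                      10u 0 value
     D close           pr            10i 0 extproc('close')
     D   fildes                      10i 0 value

     D O_RDONLY        c                   1
     D O_TEXTDATA      c                   16777216

     D fd              s             10i 0
     D len             s             10i 0
     D buffer          s          32767a

      /free
        // O_RDONLY + O_TEXTDATA: read-only, translated to the job CCSID
        fd = open('/home/frank/containers.txt': O_RDONLY + O_TEXTDATA);
        if fd < 0;
           // open failed; check errno
        endif;

        len = read(fd: %addr(buffer): %size(buffer));
        dow len > 0;
           // buffer holds raw stream data; records are separated by
           // line feeds (x'25' once translated to EBCDIC)
           len = read(fd: %addr(buffer): %size(buffer));
        enddo;

        close(fd);
        *inlr = *on;
      /end-free

Note that read() hands back a buffer, not a record, so the program still has to split on the line-feed characters itself.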


The hard part is going to be parsing the data elements, though unlike mine, yours appears to be fixed (positional). The delimiters ( | ) in mine floated!
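For the archive: pulling the fields out of a delimited record is mostly a %scan/%subst loop. A rough free-format sketch (variable names invented, no error handling):

     D record          s            256a   varying
     D field           s             64a   dim(10)
     D pos             s             10i 0
     D delim           s             10i 0
     D i               s             10i 0
     D reclen          s             10i 0

      /free
        record = 'TGHU7051481-860580|TGHU7051481|40HC|||'
               + '290.00000000000|EAC|||';
        reclen = %len(record);
        pos = 1;
        i = 0;

        dow pos <= reclen and i < %elem(field);
           delim = %scan('|': record: pos);
           if delim = 0;              // no more delimiters:
              delim = reclen + 1;     //   last field runs to end of record
           endif;
           i += 1;
           if delim > pos;
              field(i) = %subst(record: pos: delim - pos);
           else;
              field(i) = *blanks;     // empty field between adjacent pipes
           endif;
           pos = delim + 1;
        enddo;
      /end-free

Because the loop walks delimiter to delimiter rather than by fixed column positions, it handles "floating" delimiters and empty fields (the ||| runs in Frank's sample) the same way.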

Thanks.

Jerry C. Adams
IBM System i5/iSeries Programmer/Analyst
B&W Wholesale Distributors, Inc.
voice: 615.995.7024
fax: 615.995.1201
email: jerry@xxxxxxxxxxxxxxx



fkany@xxxxxxxxxxxxxxxxxx wrote:
Good afternoon,

I have a text file in the IFS that has two records (they display as one line
when opened in Notepad):
TGHU7051481-860580|TGHU7051481|40HC|||290.00000000000|EAC|||
TTNU9726230-860578|TTNU9726230|40HC|||290.00000000000|EAC|||

Which of the three methods below would be the simplest to set up for
processing the data?

1) Copy records from the text file to a physical file, then use RPG to READ
the file?

2) READ text file records directly from RPG?

3) Your suggestion...

If it will help, the system I'm working on is at V5R4.

Thank you,

Frank
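For option 1 above, it's worth noting that CPYFRMIMPF can do the copy and the pipe-delimited field parsing in one step, so the RPG program just READs an externally described file afterward. A CL sketch (the stream file path and target file name are invented; the target PF must already exist with matching fields):

CPYFRMIMPF FROMSTMF('/home/frank/containers.txt') +
           TOFILE(MYLIB/CONTAINER) MBROPT(*REPLACE) +
           RCDDLM(*ALL) FLDDLM('|') STRDLM(*NONE)

RCDDLM(*ALL) accepts CR, LF, or CRLF as the record separator, which also covers the "displays as one line in Notepad" symptom (Notepad at that time only recognized CRLF).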




This mailing list archive is Copyright 1997-2024 by midrange.com and David Gibbs as a compilation work. Use of the archive is restricted to research of a business or technical nature; any other uses are prohibited. Full details are available on our policy page.