Buzz,

This might help you, if I understand your question correctly.
It's an answer to another post from a long time ago:

>A quick and 'dirty' way with FTP:

>Make a CLLE pgm with the following:
>CHGCURLIB QGPL

>Create a script with:
>FTP 127.0.0.1  (loopback)
>UserID / PW
>CD /TMP (or whichever directory you prefer on the IFS)
>LS (DISK
>QUIT

>In library QGPL you will now find a file named LSOUTPUT.
>Inside this file you will find all the directories and files that
>exist in /TMP.

>Now you can read this file within CLLE or RPGLE and do whatever
>you want to do.
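
If it helps, the whole thing could look roughly like this in CL (untested
as typed here; the library, source member, profile and path names are only
examples).  The script goes in a source member, one FTP subcommand per line:

   MYUSER MYPASSWORD
   cd /tmp
   ls (disk
   quit

And a CL program drives the FTP and reads the result back:

             PGM
             /* LSOUTPUT must already exist for DCLF to compile against;  */
             /* run the LS (DISK once by hand first.  DSPFFD QGPL/LSOUTPUT */
             /* shows the actual field name in the record format.          */
             DCLF       FILE(QGPL/LSOUTPUT)

             CHGCURLIB  CURLIB(QGPL)             /* LS (DISK writes here  */

             /* The FTP client reads its subcommands from file INPUT and  */
             /* writes its log to file OUTPUT, so override both to the    */
             /* source members holding the script.                        */
             OVRDBF     FILE(INPUT)  TOFILE(QGPL/QCLSRC) MBR(FTPCMDS)
             OVRDBF     FILE(OUTPUT) TOFILE(QGPL/QCLSRC) MBR(FTPLOG)
             FTP        RMTSYS('127.0.0.1')
             DLTOVR     FILE(INPUT OUTPUT)

             /* QGPL/LSOUTPUT now holds one record per entry in /tmp.     */
 READ:       RCVF
             MONMSG     MSGID(CPF0864) EXEC(GOTO CMDLBL(DONE)) /* end of file */
             /* ... use the name in the record here, e.g. to build a      */
             /* CPYFRMSTMF command for that stream file ...               */
             GOTO       CMDLBL(READ)
 DONE:       ENDPGM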

Best regards,
Leif

----- Original Message ----- 
From: "Buzz Fenner" <bfenner@xxxxxxxxxxxxxxxx>
To: "Midrange Discussion" <Midrange-L@xxxxxxxxxxxx>
Sent: 8 March 2005 22:20
Subject: Enumerate Files


> Is anyone familiar with a "green screen" method to enumerate file names in a
> directory on the IFS?  A WRKLNK > directory_files.txt if you will?
> 
> I have a situation where I'll be getting multiple files from an FTP site
> (using Scott Klement's cool procs), then bringing the data up to a DB2 file
> via CPYFRMSTMF (appending the data as I go).  So, I'm looking for a
> mechanism to identify the names of the stream files to use with the copy
> command.  I did a quick scan of the APIs and nothing jumped out at me.
> Anyone done this sort of thing?
> 
> Buzz Fenner
> Systems Analyst/Network Administrator
> City Water & Light
> 870.930.3374
> mailto:bfenner@xxxxxxxxxxxxxxxx
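
For the CPYFRMSTMF step described in the question, appending each
downloaded stream file into the same physical file member would look
something like this (the paths, library and file names are only examples;
in practice the stream file name would come out of the LSOUTPUT records
above):

   CPYFRMSTMF FROMSTMF('/tmp/incoming01.csv') +
              TOMBR('/QSYS.LIB/MYLIB.LIB/STAGING.FILE/STAGING.MBR') +
              MBROPT(*ADD)

MBROPT(*ADD) is what appends; MBROPT(*REPLACE) would clear the member first.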



