Hi Scott,
The file name will be the same in each subdirectory.
If the file exists, I want to copy it to a DB2 file.
ex: /MS_SQL/01/External_File.txt
    /MS_SQL/021/External_File.txt
    ...
    /MS_SQL/84/External_File.txt
I need to traverse each of the subdirectories in the MS_SQL directory and
if the file External_File.txt is present, copy it to a DB2 file then delete
it.
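The traversal described above can be sketched in QShell (or any POSIX shell). This is a minimal sketch, not a drop-in script: it builds a throwaway demo tree in place of /MS_SQL, and the actual DB2 copy (CPYFRMSTMF invoked through QShell's system utility) is shown only as a comment, with a plain append standing in so the loop logic stays visible and runnable anywhere. The subdirectory names and the library/file names in the comment are illustrative assumptions.

```shell
#!/bin/sh
# Sketch: visit each subdirectory, and where External_File.txt exists,
# copy its contents out and then delete it.
# On IBM i, BASE would be /MS_SQL and the append below would be replaced
# by something like (names are placeholders):
#   system "CPYFRMSTMF FROMSTMF('$f') TOMBR('/QSYS.LIB/MYLIB.LIB/MYFILE.FILE/MYMBR.MBR') MBROPT(*ADD)"
BASE=$(mktemp -d)             # stand-in for /MS_SQL
TARGET="$BASE/collected.txt"  # stand-in for the DB2 file

# Build a small demo tree: 01 and 84 have the file, 21 does not.
for d in 01 21 84; do mkdir -p "$BASE/$d"; done
echo "row from 01" > "$BASE/01/External_File.txt"
echo "row from 84" > "$BASE/84/External_File.txt"

for dir in "$BASE"/*/; do          # trailing slash: directories only
    f="${dir}External_File.txt"
    if [ -f "$f" ]; then
        cat "$f" >> "$TARGET"      # on IBM i: CPYFRMSTMF, as noted above
        rm "$f"                    # the DEL step, after a successful copy
    fi
done
```

Subdirectories without the file (21 in the demo) are simply skipped, and the source file is only removed after the copy step runs.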
Thanks,
On Mon, Jan 23, 2012 at 6:12 PM, Scott Klement
<midrange-l@xxxxxxxxxxxxxxxx> wrote:
Hi Jeff,
On 1/23/2012 4:23 PM, Jeff Young wrote:
This data will be coming from an external server.
Each subdirectory will represent data from a different country.
The client has decided that they want a separate folder for each country.
Okay. So what will be in these "folders" (directories) for each country?
Is everything in the subdirectories a file to be processed? Or is
there a search criterion that needs to be used to determine whether a
file is one to be processed?
I can create a loop to traverse each directory, use CPYFRMSTMF to copy
the data, and remove the file with DEL after the copy, but I was hoping
for a simple one-line solution for the copy and for the delete.
You mean you want to run the READMYMIND command? (Or, the Unix
equivalent, ./readmymind)
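Short of READMYMIND, a genuine one-liner for the copy-and-delete does exist in QShell: find(1) can locate every External_File.txt itself and run the processing step per match. A hedged sketch, again using a throwaway demo tree and a plain append in place of the real CPYFRMSTMF call:

```shell
#!/bin/sh
# Demo setup: stand-ins for /MS_SQL and the DB2 target.
BASE=$(mktemp -d); TARGET=$(mktemp)
mkdir -p "$BASE/01" "$BASE/84"
printf 'a\n' > "$BASE/01/External_File.txt"
printf 'b\n' > "$BASE/84/External_File.txt"

# The one-liner: for each match, copy then delete. On IBM i the
# cat would become a CPYFRMSTMF run via QShell's system utility.
find "$BASE" -type f -name External_File.txt \
     -exec sh -c 'cat "$1" >> "$2" && rm "$1"' sh {} "$TARGET" \;
```

The && ensures the delete only happens when the copy step succeeds, which matters when the copy is a CPYFRMSTMF that can fail on record-length or CCSID problems.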
--
This is the Midrange Systems Technical Discussion (MIDRANGE-L) mailing list
To post a message email: MIDRANGE-L@xxxxxxxxxxxx
To subscribe, unsubscribe, or change list options,
visit: http://lists.midrange.com/mailman/listinfo/midrange-l
or email: MIDRANGE-L-request@xxxxxxxxxxxx
Before posting, please take a moment to review the archives
at http://archive.midrange.com/midrange-l.