On Wed 05-May-2011 09:12, fbocch2595@xxxxxxx wrote:

I want to copy all joblogs for a certain job to a PF in my library.
There are hundreds of joblogs for the job. I've been manually using

CPYSPLF FILE(QPJOBLOG) TOFILE(ZPGMR97/FILEA) JOB(855542/FB/QPRTJOB)
SPLNBR(487) MBROPT(*ADD)

I've still got hundreds to go.

Do you know of a fast way to do this or do you have a pgm that might
help me out?

As always, any help is appreciated.


If that were something I had to do often, I would code something specific; however, for a one-off or occasional requirement:

<code>

/* Add a source member to hold the REXX "program" that performs the work: */
ADDPFM FILE(QREXSRC) MBR(CPYJLS) TEXT('Copy/add joblogs from a job')
       SRCTYPE(REXX)

/* Insert the following statements as rows of the source member added above: */
parse arg Spl_A Spl_Z .              /* first and last spooled-file numbers */
NxtSpl=Spl_A
do while NxtSpl<=Spl_Z
  /* Uncomment the next line to merely display, rather than execute,  */
  /* each generated CL request (the comma continues onto the command): */
  /* say , */
  "CPYSPLF FILE(QPJOBLOG) TOFILE(ZPGMR97/FILEA) ",
    " JOB(855542/FB/QPRTJOB) SPLNBR("NxtSpl") MBROPT(*ADD)"
  if rc<>0 then nop  /* ignore errors; or, could react somehow here */
  NxtSpl=NxtSpl+1
end

/* Given the last spooled joblog is numbered 789, copy numbers 488 to 789: */
STRREXPRC SRCMBR(CPYJLS) SRCFILE(*LIBL/QREXSRC) PARM('488 789')

</code>
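If REXX is not to your taste, the same loop is trivial to script elsewhere. A minimal sketch in Python, assuming the file, library, and job names from this thread as placeholders: it only *generates* the CPYSPLF command strings for a range of spooled-file numbers, which you could then review and submit however you prefer (paste into a CL source member, feed to QCMDEXC, etc.).

```python
def cpysplf_commands(first, last,
                     tofile="ZPGMR97/FILEA",
                     job="855542/FB/QPRTJOB"):
    """Return one CPYSPLF command string per spooled-file number
    in the inclusive range first..last."""
    template = ("CPYSPLF FILE(QPJOBLOG) TOFILE({tofile}) "
                "JOB({job}) SPLNBR({nbr}) MBROPT(*ADD)")
    return [template.format(tofile=tofile, job=job, nbr=n)
            for n in range(first, last + 1)]

# Same range as the STRREXPRC example: spooled files 488 through 789.
cmds = cpysplf_commands(488, 789)
print(len(cmds))   # 302 commands generated
print(cmds[0])
```

Reviewing the generated commands before running them serves the same purpose as the commented-out `say ,` line in the REXX version.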

Regards, Chuck

