hi Dan,

In recent releases, QShell has a 'catsplf' utility that can be used to read a spooled file (which in turn can be piped to the grep command). Unfortunately, I don't think it existed at V4R5, though it might be worth going into QShell and typing the catsplf command to see if it's there. It definitely exists at 7.1, however:

$ catsplf -j 123456/MYUSER/MYJOB SPLFNAME SPLFNO
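
So, for example, to search a single spooled file you could pipe it straight into grep (the job ID, file name, and search string here are made up, just to show the shape of it):

$ catsplf -j 123456/MYUSER/MYJOB QPRINT 1 | grep 'SEARCH-STRING'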


You would need to get a list of the spooled files first, though: you'll need the job ID, spooled file name, and spooled file number to use catsplf (or CPYSPLF, for that matter). A simple way to get that list in a one-off program is the WRKOUTQ command. If you run it via QShell's system utility, it automatically takes the *PRINT option and redirects the resulting spooled file to the screen, so from QShell you'd do:

$ system 'WRKOUTQ OUTQ(The-OutQ)'

You'd then need to write a loop to spin through the output of that command, which would be pretty easy to do in a shell script.
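
Something along these lines, for instance. This is an untested sketch: MYLIB/MYOUTQ and SEARCH-STRING are placeholders, and the field positions I pull out of each listing line are guesses, since the exact column layout of the WRKOUTQ *PRINT listing varies by release. Print one listing and check it before trusting the positions.

system 'WRKOUTQ OUTQ(MYLIB/MYOUTQ)' |
while read line
do
  # Split the listing line on whitespace into positional parameters.
  set -- $line
  # Guessed positions: file name and user first, with job name,
  # job number, and spooled file number further along. Adjust
  # these to match your actual listing.
  splf=$1; user=$2; job=$7; jobnbr=$8; splfnbr=$9
  # Only act on detail lines where the job number is six digits;
  # this skips headings and blank lines.
  case "$jobnbr" in
    [0-9][0-9][0-9][0-9][0-9][0-9])
      catsplf -j "$jobnbr/$user/$job" "$splf" "$splfnbr" |
        grep 'SEARCH-STRING'
      ;;
  esac
done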

-SK


On 7/10/2013 2:06 PM, Dan Kimmel wrote:

V4R5. I need to parse the data in 36,000+ spooled files. One-time
deal. Any suggestions on how I could quickly make them all parseable?

One thought is to set up a remote outq to an outq on a V7R1 box and
print them all there using the TOFILE attribute somehow and then GREP
them in QSH.
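
Something like this on the V4R5 side, maybe (all the names and the
address are made up):

CRTOUTQ OUTQ(MYLIB/RMTQ) RMTSYS(*INTNETADR) RMTPRTQ('TARGETQ')
        CNNTYPE(*IP) DESTTYPE(*OS400) TRANSFORM(*NO)
        INTNETADR('10.1.1.2') AUTOSTRWTR(1)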


