We are running V7R2. The job logs are spooling for the jobs I am interested
in.

During a 'release', after all files and programs are installed on the
production system, a number of jobs are submitted whose names are based on
the project numbers being installed. One project number could have more
than one job, and all of those would run under the same job name.

The requirement is to determine whether all of the jobs ended with an end
code of '0'. So we are copying the job log spool files and parsing them to
determine the end code.
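
For illustration, the copy step for a single job might look like the CL
sketch below. The library, file, and qualified job name are placeholders,
and SPLNBR(*LAST) assumes the most recent QPJOBLOG is the one wanted; the
end code itself is carried in the text of the job-completion message
(CPF1164), which is what the parse would look for.

```
/* Sketch: copy one job's QPJOBLOG spool file to a physical     */
/* file for parsing. All object and job names are placeholders. */
             CPYSPLF    FILE(QPJOBLOG) TOFILE(MYLIB/JOBLOGPF) +
                          JOB(123456/PRODUSR/PRJ00123) SPLNBR(*LAST) +
                          MBROPT(*REPLACE) CTLCHAR(*NONE)
```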

I know it sounds crazy, as one would think that if any job went into MSGW
someone would know, but it seems they want this automated rather than
relying on whoever answers the MSGW to report it.

Also, I searched as per Chuck's suggestion and was able to code the loop to
take care of duplicate jobs.

I would still be interested in knowing if the end result can be achieved
some other way.

Vinay

On Tue, Apr 26, 2016 at 7:09 AM, Rob Berendt <rob@xxxxxxxxx> wrote:

The big question is WHY do you want to do this?
If you could also tell us what version of IBM i you are running, that
would help.
And, are your job logs spooling up? That's still the default for many
people but we changed the system value QLOGOUTPUT to *PND years ago.

Why do I ask these questions?
See these for starters:

http://www.ibm.com/support/knowledgecenter/ssw_ibm_i_73/rzajq/rzajqudfjobloginfo.htm?lang=en

http://www.ibm.com/support/knowledgecenter/ssw_ibm_i_73/rzajq/rzajqservicesworkmgmt.htm?lang=en
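
The first link covers the QSYS2.JOBLOG_INFO table function, which can read
a job's log directly, including pending (*PND) job logs, without ever
spooling or parsing QPJOBLOG. A minimal sketch of using it from CL via
RUNSQL, assuming a recent enough DB2 for i group PTF level; the qualified
job name and QTEMP table name are placeholders:

```
/* Sketch: pull one job's log messages into QTEMP with the     */
/* JOBLOG_INFO table function; escape messages there indicate  */
/* an abnormal end. The qualified job name is a placeholder.   */
             RUNSQL     SQL('CREATE TABLE QTEMP/JOBLOGMSG AS (SELECT +
                          MESSAGE_ID, MESSAGE_TYPE, MESSAGE_TEXT FROM +
                          TABLE(QSYS2.JOBLOG_INFO(+
                          ''123456/PRODUSR/PRJ00123'')) X) +
                          WITH DATA') COMMIT(*NONE)
```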


Rob Berendt
--
IBM Certified System Administrator - IBM i 6.1
Group Dekko
Dept 1600
Mail to: 2505 Dekko Drive
Garrett, IN 46738
Ship to: Dock 108
6928N 400E
Kendallville, IN 46755
http://www.dekko.com





From: Vinay Gavankar <vinaygav@xxxxxxxxx>
To: Midrange Systems Technical Discussion <midrange-l@xxxxxxxxxxxx>
Date: 04/25/2016 06:20 PM
Subject: CPYSPLF with Duplicate jobs
Sent by: "MIDRANGE-L" <midrange-l-bounces@xxxxxxxxxxxx>



I want to copy the QPJOBLOG spool file of a given job name (coming in as a
parameter) to a physical file in a CL program.

When there are duplicate jobs with the name, I want to copy for all of
those jobs.

I think it can be done by using WRKJOB with DUPJOBOPT(*MSG) and then using
RCVMSG to get all of the job numbers in a loop and doing the CPYSPLF.

I am not sure how exactly to do it, or whether there is another way to do
this.
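
One possible shape for that WRKJOB / RCVMSG / CPYSPLF idea, strictly as a
sketch: the diagnostic messages that DUPJOBOPT(*MSG) produces, and in
particular the substring positions used below to dig the job user and
number out of the message data, are assumptions that must be checked
against the actual message descriptions on your system; object names are
placeholders.

```
             PGM        PARM(&JOBNAME)
             DCL        VAR(&JOBNAME) TYPE(*CHAR) LEN(10)
             DCL        VAR(&MSGID)   TYPE(*CHAR) LEN(7)
             DCL        VAR(&MSGDTA)  TYPE(*CHAR) LEN(100)
             DCL        VAR(&JOBUSR)  TYPE(*CHAR) LEN(10)
             DCL        VAR(&JOBNBR)  TYPE(*CHAR) LEN(6)

/* With duplicate jobs, DUPJOBOPT(*MSG) reports each duplicate in */
/* a message rather than presenting the selection display.        */
             WRKJOB     JOB(&JOBNAME) OUTPUT(*PRINT) DUPJOBOPT(*MSG)
             MONMSG     MSGID(CPF0000) /* duplicates handled below */

/* Receive one diagnostic per duplicate job. The message-data     */
/* layout (user in 11-20, number in 21-26) is an assumption.      */
 LOOP:       RCVMSG     MSGTYPE(*DIAG) RMV(*YES) MSGID(&MSGID) +
                          MSGDTA(&MSGDTA)
             IF         COND(&MSGID *EQ ' ') THEN(GOTO ENDLOOP)
             CHGVAR     VAR(&JOBUSR) VALUE(%SST(&MSGDTA 11 10))
             CHGVAR     VAR(&JOBNBR) VALUE(%SST(&MSGDTA 21 6))
             CPYSPLF    FILE(QPJOBLOG) TOFILE(MYLIB/JOBLOGPF) +
                          JOB(&JOBNBR/&JOBUSR/&JOBNAME) +
                          SPLNBR(*LAST) MBROPT(*ADD)
             MONMSG     MSGID(CPF0000) /* e.g. no QPJOBLOG yet    */
             GOTO       CMDLBL(LOOP)
 ENDLOOP:    ENDPGM
```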

Any help will be greatly appreciated.

Vinay
--
This is the Midrange Systems Technical Discussion (MIDRANGE-L) mailing
list
To post a message email: MIDRANGE-L@xxxxxxxxxxxx
To subscribe, unsubscribe, or change list options,
visit: http://lists.midrange.com/mailman/listinfo/midrange-l
or email: MIDRANGE-L-request@xxxxxxxxxxxx
Before posting, please take a moment to review the archives
at http://archive.midrange.com/midrange-l.

Please contact support@xxxxxxxxxxxx for any subscription related
questions.



