

  • Subject: Re: Multi-processor machines.
  • From: Bob Larkin <blarkin@xxxxxx>
  • Date: Wed, 13 Aug 1997 20:08:43 -0700
  • Organization: Larkin Computer Consulting

James W. Kilgore wrote:
> 
> Viv Bassani wrote:
> >
> > We currently have a single processor 50S #2122 running V3R7
> > There is a requirement to run a highly CPU/IO-intensive batch
> > job, which has the system totally dedicated to it. We have run the
> > job on the 50S and we are achieving around 100 records per hour. As
> > this machine is not powerful enough for the task, we now have a
> > new option.
> >
> A couple of years ago, an IBM AS/400 Newsletter described how to
> use data queues to sync a disk prefetch for long-running batch jobs.
> There were some dramatic reductions in run time (based on the actual
> application, your mileage may vary).
> 
> In a nutshell here is how it works:
> 
> Your main program has some sort of primary file it is processing and
> doing CHAIN/READ along the way.
> 
> In a separate storage pool, you start a prefetch job which also
> processes the same file, skips the first record and does the chain(s)
> for the second record.  It basically stays one record ahead of your main
> program.  It waits on the data queue to know that your main program is
> starting record two so the prefetch starts record three.
> 
> The main program benefits by overlapping its computations with the
> prefetch program's disk I/O, so that when the main program does a disk
> request the record is already in cache.
> 
> You can clone the main program, gut its calculations, and add the data
> queue wait, and you have the prefetch program.  It might be worth
> spending a day giving it a try.
> 
> James W. Kilgore
> QAPPDSN@ibm.net

As a user of DtaQs and author of an article on them, I can say the
solution mentioned would work. The problem is that the secondary files
almost always related to the program will still cause I/O bottlenecks,
unless you can have prefetch jobs running for those files as well.
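The prefetch pattern James describes can be sketched in Python, with a thread and a `queue.Queue` standing in for the prefetch job and the AS/400 data queue. This is a minimal illustration, not AS/400 code: the `DB` dict, `cache`, and `prefetch_job` names are all hypothetical stand-ins for the primary file, main storage, and the cloned prefetch program.

```python
import queue
import threading

DB = {i: f"record-{i}" for i in range(1, 6)}   # stand-in for the primary file
cache = {}                                     # records already pulled off "disk"
dtaq = queue.Queue()                           # stand-in for the AS/400 data queue

def prefetch_job():
    # Skip record 1 and pre-read record 2, then stay one record ahead:
    # each entry on the data queue says "main program is starting record n",
    # so the prefetcher fetches record n + 1.
    cache[2] = DB[2]
    while True:
        n = dtaq.get()
        if n is None:                          # sentinel: main program is done
            return
        if n + 1 in DB:
            cache[n + 1] = DB[n + 1]           # the CHAIN, overlapped with CPU work

threading.Thread(target=prefetch_job, daemon=True).start()

results = []
for n in sorted(DB):
    dtaq.put(n)                                # tell the prefetcher where we are
    rec = cache.get(n) or DB[n]                # hit the cache first, "disk" second
    results.append(rec.upper())                # stand-in for the heavy calculations
dtaq.put(None)                                 # let the prefetch job end
```

The `cache.get(n) or DB[n]` fallback mirrors the real behavior: if the prefetch job has not gotten to a record yet, the main program simply does the disk read itself, so the result is correct either way and the prefetch is purely a speedup.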

The big question is ... why does this job take so long?!?!?
Improper design can cause a problem, but often very small changes can
make a BIG difference. Just this week, I wrote a program with less than
ten lines of executable code (COBOL). This program was added to the job
stream of a job that ran 11 HOURS (660 minutes). The run time was
reduced to 17 MINUTES. Rather dramatic :)
The problem was that an RPG program was calling a COBOL program repeatedly.
Due to the way the system handles program initiation/termination, a
COBOL program encapsulating the process eliminated a BUNCH of overhead.
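The effect Bob describes, paying program initiation/termination on every call versus once per job, can be sketched in Python. Here `expensive_init` is a hypothetical stand-in for program activation cost (opening files, building work areas); the two functions contrast the slow call-per-record pattern with the encapsulated fix.

```python
import time

def expensive_init():
    # Stand-in for program initiation: opening files, allocating work areas.
    time.sleep(0.001)

def called_per_record(record):
    # Pattern of the slow job: the "program" is invoked once per record,
    # so the initiation cost is paid on every single call.
    expensive_init()
    return record * 2

def encapsulated(records):
    # Pattern of the fix: one call owns the loop; initiation happens once.
    expensive_init()
    return [r * 2 for r in records]

records = list(range(200))
slow = [called_per_record(r) for r in records]   # ~200 x init cost
fast = encapsulated(records)                     # 1 x init cost
assert slow == fast                              # same answer, far less overhead
```

The computation per record is identical; only where the initialization happens changes, which is why a wrapper of under ten lines could cut an 11-hour job to 17 minutes.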
One client regularly processes batch jobs of 80 million transactions with
heavy computation. Performance is greatly enhanced by using the
caching available through the system.
-- 
Bob Larkin
Larkin Computer Consulting
blarkin@wt.net
* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *
* This is the Midrange System Mailing List!  To submit a new message,   *
* send your mail to "MIDRANGE-L@midrange.com".  To unsubscribe from     *
* this list send email to MAJORDOMO@midrange.com and specify            *
* 'unsubscribe MIDRANGE-L' in the body of your message.  Questions      *
* should be directed to the list owner / operator: david@midrange.com   *
* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *



