This, to me, is an example of: "What would you suggest I do if I am in a
bad section of town? It's dark, I have no weapon, I'm in a dark alley,
and there are twenty people ready to do me bodily harm..."

I personally would start with what totals have to be updated, and why.
Why is program A reading a file to initiate this process? I have a
strong suspicion that the database design, business rules, and process
definitions are "not quite right." Summing up: the majority of the
programming gymnastics I see having to be done, time and time again, is
a direct result of database design. Don't submit millions of jobs; ask
yourself why this whole thing is happening.

$.02
John

> Hi folks,
>
> Program A reads a file and for each record submits about 20 jobs to
> batch to update totals further up the hierarchy of records in the
> file. The batch jobs are going to a dedicated jobq/subsystem which can
> have up to 25 active jobs. At the moment no more than 3 or 4 jobs are
> active at a time because Program A is not submitting them fast enough.
> We have tried submitting the jobs directly from the RPG program using
> QCMDEXC and by calling a CLP to submit them (contrary to (my)
> expectations, calling the CL was faster). At the speeds we are seeing
> at the moment it would take 83 days of continuous processing to
> complete the job (we are talking about millions of records). Is there
> a way to speed up submission of the batch jobs from program A?
>
> TIA
>
> Martin
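For context, the CLP approach Martin mentions boils down to wrapping
SBMJOB in a small CL program that program A calls once per record. A
minimal sketch of such a CLP, using hypothetical names (UPDTOTAL for the
totals-update program, MYLIB/UPDJOBQ for the dedicated job queue, and
&RECKEY for the record key), might look like this:

    /* Hypothetical CLP: submit one totals-update job per record.      */
    /* Names (UPDTOTAL, MYLIB/UPDJOBQ, &RECKEY) are placeholders.      */
    PGM        PARM(&RECKEY)
      DCL        VAR(&RECKEY) TYPE(*CHAR) LEN(10)
      /* Submit the update to the dedicated job queue. MSGQ(*NONE)     */
      /* suppresses the per-job completion message.                    */
      SBMJOB     CMD(CALL PGM(UPDTOTAL) PARM(&RECKEY)) +
                   JOB(UPDTOT) JOBQ(MYLIB/UPDJOBQ) MSGQ(*NONE)
    ENDPGM

Even with the submission itself trimmed down like this, every SBMJOB
still carries the cost of creating and routing a new job, so with
millions of records the redesign John suggests is likely to pay off far
more than any faster way of submitting.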