Venu,

OK, so not "threads" in the Unix sense (or the OS/400 V4R2 sense!), but a subsystem with a job queue that can handle more than one active job. Easy enough.

If you do DSPSBSD QBATCH and take option 6 for job queues, you will see that each job queue has its own 'Max Active' setting. So submit all your jobs to one specific job queue, and raise that queue's limit with CHGJOBQE SBSD(QBATCH) JOBQ(queuename) MAXACT(10).

Now, while still on the DSPSBSD screen, take option 1 to see the total number of concurrent jobs allowed in the subsystem (across all attached job queues). This obviously needs to be at least 10, so it doesn't override the setting on the job queue, and you should make allowance for any other job queues attached to the subsystem too. This is changed with CHGSBSD SBSD(subsystemname) MAXJOBS(value). (Rough sketches of both steps follow at the end of this message.)

Now don't go overboard and initiate too many jobs concurrently. Depending on the processing power of your system model, and perhaps on the disk accesses required by the jobs versus the number of disk arms you have, you can quickly reach the point of diminishing returns, where the system runs out of resources and spends an increasing amount of time thrashing (swapping active jobs in and out). Other settings come into play there as well, such as the Purge setting on the class used for the job, and whether the storage pool uses the *FIXED or *CALC paging option.

So try testing with different numbers of concurrent jobs, and see whether the total run time for single-threaded execution (all jobs run one after another) is actually quicker or slower than for 10 concurrent jobs. If the 10 concurrent jobs are quicker, you should still test with, say, 8, 6, and 4 concurrent jobs - you may find the total time actually increases when you run fewer jobs concurrently. And don't forget to consider the impact these 10 concurrent jobs may have on any other jobs running at the same time.

Neil Palmer
NxTrend Technology - Canada
Thornhill, Ontario, Canada
Phone: (905) 731-9000 x238
Cell.: (416) 565-1682 x238
Fax: (905) 731-9202
mailto:NPalmer@NxTrend.com
AS/400 - The Ultimate Business Server
http://www.NxTrend.com

> -----Original Message-----
> From: VENU YAMAJALA [SMTP:venu_yamajala@goodyear.com]
> Sent: Thursday, July 23, 1998 4:21 PM
> To: MIDRANGE-L@midrange.com
> Subject: Multi-threading
>
> Hi All,
>
> I want to know how to implement multi-threaded jobs on the AS/400. My
> requirement is as follows:
>
> I have a CL program (CL1). It reads records from a physical file PF1.
> For each record of PF1, I submit a batch job with different
> parameters. The driver CL program (CL1) itself runs in batch. So
> after submitting CL1 in batch, if I look at the JOBQ, it forks into
> several batch jobs (one per record in PF1). But each of these batch
> jobs runs sequentially, i.e., one after the other. I am losing a lot
> of time here. I want a mechanism whereby, when the CL1 program forks
> into several batch jobs, a specified number of those batch jobs run
> simultaneously (say 10 at a time). This way I can reduce the total
> run time: if there are 100 jobs, I can reduce the total time by more
> than 75%. I know that it is possible but I am missing a lot of
> points. Can anybody guide me on this? Thanks in advance for any
> help/suggestion.
>
> Regards,
> Venu
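P.S. Here is a rough sketch of the two tuning commands discussed above, assuming the IBM-supplied QBATCH description in QGPL and a job queue MULTIQ in library MYLIB (MYLIB and MULTIQ are placeholder names; adjust to your environment):

    /* Let up to 10 jobs from this queue be active at once.      */
    CHGJOBQE   SBSD(QGPL/QBATCH) JOBQ(MYLIB/MULTIQ) MAXACT(10)

    /* The subsystem-wide cap must be at least 10, with headroom */
    /* for jobs arriving from other queues attached to QBATCH.   */
    CHGSBSD    SBSD(QGPL/QBATCH) MAXJOBS(25)

If MULTIQ is not attached to QBATCH yet, ADDJOBQE SBSD(QGPL/QBATCH) JOBQ(MYLIB/MULTIQ) MAXACT(10) attaches it with the limit set in one step.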
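And a minimal sketch of what the driver CL program could look like, submitting one job per PF1 record to that throttled queue. WORKER, MYLIB, and the field name &CUSTNO are placeholders; PF1 is from Venu's question quoted above:

    PGM
    DCLF       FILE(MYLIB/PF1)          /* PF1 fields become CL vars  */
    LOOP:      RCVF                     /* read the next PF1 record   */
    MONMSG     MSGID(CPF0864) EXEC(GOTO CMDLBL(DONE)) /* end of file  */
    /* SBMJOB returns immediately; MAXACT(10) on the queue decides    */
    /* how many of the submitted jobs actually run at the same time.  */
    SBMJOB     CMD(CALL PGM(MYLIB/WORKER) PARM(&CUSTNO)) +
               JOB(WORKER) JOBQ(MYLIB/MULTIQ)
    GOTO       CMDLBL(LOOP)
    DONE:      ENDPGM

The driver itself finishes almost immediately; the queue then feeds the subsystem 10 jobs at a time until all of them have run.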