


Mike,

It's probably I/O bound waiting on disk access.

How many years of data are in the file? How many non-retail items are in the file? Do you have room to make a copy of the data? Copying out the data for just those items and the past twelve months, then running the process over the copy, might help.

It might also help to peg the copied or original files into a main storage pool with SETOBJACC.
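Something along these lines, for example (the library and shared pool here are just placeholders, and the pool would need to be sized large enough to hold the files):

SETOBJACC OBJ(MYLIB/IMSPRECH) OBJTYPE(*FILE) POOL(*SHRPOOL2)
SETOBJACC OBJ(MYLIB/IMSPRECD) OBJTYPE(*FILE) POOL(*SHRPOOL2)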

Paul

Principal Programmer Analyst
IS Supply Chain/Replenishment


-----Original Message-----
From: midrange-l-bounces@xxxxxxxxxxxx [mailto:midrange-l-bounces@xxxxxxxxxxxx] On Behalf Of Schutte, Michael D
Sent: Tuesday, December 21, 2010 12:50 PM
To: Midrange Systems Technical Discussion
Subject: RE: Processing Unit Time

Yes, I have thought about it. Unfortunately that requires time to redo the program. I even suggested it to my boss, but he's leaning toward moving it to a different job queue. I'm just not understanding the 1448 seconds of CPU time. What isn't being accounted for in this measurement? I know it says nine seconds, but it really doesn't take that long; if you run it interactively, it's more like 12 ms. Anyway, thanks for the input. When I have time, I'll redesign it by summarizing the table first and querying the summary table.

-----Original Message-----
From: midrange-l-bounces@xxxxxxxxxxxx [mailto:midrange-l-bounces@xxxxxxxxxxxx] On Behalf Of Morgan, Paul
Sent: Tuesday, December 21, 2010 11:50 AM
To: Midrange Systems Technical Discussion
Subject: RE: Processing Unit Time

Mike,

240,000 runs times the estimated query run time of 9 seconds is 2,160,000 seconds = 600 hours. Not bad that it finished in 11 hours.

Have you thought about modifying the query so it's only executed once, into a work table, for all retail items (maybe linking to another table that lists the retail items), summarizing each month of the past twelve months? Then you run through the work table to make your calls to updHTMLvar/WrtSection.
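Something roughly like this, as a sketch (WRKSUM is a made-up work table name, I'm guessing from the PRTSQLINF output below that ECDEXT is the item number, and I'm assuming ECHDTE is a date column; system naming):

CREATE TABLE QTEMP/WRKSUM AS (
   SELECT ECDEXT,
          YEAR(ECHDTE)  AS RCPYEAR,
          MONTH(ECHDTE) AS RCPMONTH,
          SUM(ECDQTY)   AS TOTQTY,
          SUM(ECDEAM)   AS TOTAMT
     FROM IMSPRECH
     JOIN IMSPRECD
       ON ECDCONO = ECHCONO
      AND ECDSTR  = ECHSTR
      AND ECDRCT  = ECHRCT
    WHERE ECHDTE BETWEEN CURRENT DATE - 12 MONTHS AND CURRENT DATE
    GROUP BY ECDEXT, YEAR(ECHDTE), MONTH(ECHDTE)
) WITH DATA

One pass over the two files builds the whole summary, and the program then just reads WRKSUM row by row for the updHTMLvar/WrtSection calls instead of running the query 240,000 times.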

Paul Morgan

Principal Programmer Analyst
IS Supply Chain/Replenishment


-----Original Message-----
From: midrange-l-bounces@xxxxxxxxxxxx [mailto:midrange-l-bounces@xxxxxxxxxxxx] On Behalf Of Schutte, Michael D
Sent: Tuesday, December 21, 2010 11:15 AM
To: Midrange Systems Technical Discussion
Subject: RE: Processing Unit Time

No, the access plan isn't getting updated on each call. It was last saved at 10:28:05, which is when the program started; it finished at 21:34. I suppose I cannot count; it only took 11 hours, not 12. Haha. I believe yesterday this SQL statement was executed for every active retail item (20,000 items) and for each of the past 12 months, so 20,000 * 12 = 240,000 times the statement ran. After every call to the SQL statement, I'm executing updHTMLvar and WrtSection. I know that this would slow the process overall.

Thinking back, this was probably poor design, but it wasn't really intended to run for all items and all vendors, even though the prompt program allows the user to do so. Still, I can live with 1448 seconds of CPU time; I cannot live with 11 hours of actual run time. That's what I don't understand.


-----Original Message-----
From: midrange-l-bounces@xxxxxxxxxxxx [mailto:midrange-l-bounces@xxxxxxxxxxxx] On Behalf Of Morgan, Paul
Sent: Tuesday, December 21, 2010 10:31 AM
To: Midrange Systems Technical Discussion
Subject: RE: Processing Unit Time

Michael,

How many times does this get run in the process? You're talking about repeated calls to CGIDEV2 so does this get run once for each call? Is the access plan getting updated on each call?

Paul

Principal Programmer Analyst
IS Supply Chain/Replenishment


-----Original Message-----
From: midrange-l-bounces@xxxxxxxxxxxx [mailto:midrange-l-bounces@xxxxxxxxxxxx] On Behalf Of Schutte, Michael D
Sent: Tuesday, December 21, 2010 10:10 AM
To: Midrange Systems Technical Discussion
Subject: RE: Processing Unit Time

This is from the PRTSQLINF command.

SELECT SUM ( CASE : H WHEN 'DR' THEN ECDEAM ELSE ECDQTY END )
INTO : H
FROM IMSPRECH
JOIN IMSPRECD
ON ECDCONO = ECHCONO
AND ECDSTR = ECHSTR
AND ECDRCT = ECHRCT
WHERE ECHCONO = : H
AND ECHDTE BETWEEN : H AND : H
AND ECDCONO = : H
AND ECOVND = : H
AND ECDEXT = : H

SQL4021 Access plan last saved on 12/20/10 at 10:28:05.
SQL4020 Estimated query run time is 9 seconds.
SQL4017 Host variables implemented as reusable ODP.
SQL4007 Query implementation for join position 1 table 2.
SQL4006 All indexes considered for table 2.
SQL4008 Index IMSLRECD9 used for table 2.
SQL4011 Index scan-key row positioning used on table 2.
SQL4007 Query implementation for join position 2 table 1.
SQL4006 All indexes considered for table 1.
SQL4008 Index IMSLRECH1 used for table 1.
SQL4014 1 join column pair(s) are used for this join position.
SQL4015 From-column 2.ECDCONO, to-column 1.ECHCONO, join operator MF, join predicate 1.
SQL4015 From-column 2.ECDSTR, to-column 1.ECHSTR, join operator MF, join predicate 2.
SQL4015 From-column 2.ECDRCT, to-column 1.ECHRCT, join operator MF, join predicate 3.
SQL4011 Index scan-key row positioning used on table 1.

If it were doing a table scan, the printout above would show it, and wouldn't the CPU time be a lot greater? FYI, we didn't have any other jobs running on the system with higher priority, and CPU% was around 70%. I don't know. I'm starting to believe that it had nothing to do with the statement itself but rather with the repeated calls to WrtSection in the CGIDEV2 procedures.

-----Original Message-----
From: midrange-l-bounces@xxxxxxxxxxxx [mailto:midrange-l-bounces@xxxxxxxxxxxx] On Behalf Of Morgan, Paul
Sent: Tuesday, December 21, 2010 9:37 AM
To: Midrange Systems Technical Discussion
Subject: RE: Processing Unit Time

Michael,

Try running the process under debug (STRDBG/ENDDBG). That will get the SQL access plan printed in the job log. You can then review the access plan to see if it's actually using the logicals. Sometimes SQL will reject a logical in favor of sequential access if the previous run was very fast; it figures a sequential read through the file will be faster than taking on the overhead of an indexed read.
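For example (MYLIB/MYPGM is just a placeholder for the program you're testing):

STRDBG UPDPROD(*YES)
CALL PGM(MYLIB/MYPGM)
ENDDBG
DSPJOBLOG OUTPUT(*PRINT)

The query optimizer messages end up in the job log and will tell you which access path, if any, was chosen and why.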

Paul Morgan

Principal Programmer Analyst
IS Supply Chain/Replenishment

-----Original Message-----
From: midrange-l-bounces@xxxxxxxxxxxx [mailto:midrange-l-bounces@xxxxxxxxxxxx] On Behalf Of Schutte, Michael D
Sent: Tuesday, December 21, 2010 8:37 AM
To: Midrange Systems Technical Discussion
Subject: Processing Unit Time

I'm having trouble finding an explanation of this, so I was wondering if someone could explain it here. A user submitted a job the other day that took 6 hours to complete. Looking at the SQL statement in the program, I saw where an improvement could be made. I created a new logical keyed by the selection columns in the WHERE clause, so now the statement takes less than a second to run for a month of data and the entire company. The user resubmitted the job yesterday, and this time it took 12 hours to complete while using the new logical. The job log says it only used 1448 seconds of processing unit time, which is only about 24 minutes. So why did it take 12 hours? Is it because it created a 23,000-page job log?

There are two messages that are repeated over and over again that I can fix: "Data area CGIDEBUG not found" and "null indicator required". The reason for the CGIDEBUG message is that I'm using the CGIDEV2 procedures to build an HTML page and then the write-to-stream-file function so that I can email it to the user. As for the null indicator, the SQL statement is summing a field, but I'm not checking any statuses or using a null indicator; I've always just set the INTO field to zero prior to executing the statement. I should change my habit and always use a null indicator so that this message isn't created in the job log. But I don't see why this would cause the program to run for 12 hours.
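For what it's worth, the indicator version would look something like this (a free-format RPG sketch; the variable names are made up and the declarations are only for illustration):

dcl-s totalAmt packed(11:2);
dcl-s totalInd int(5);

exec sql
   select sum(case :drCrFlag when 'DR' then ECDEAM else ECDQTY end)
     into :totalAmt :totalInd
     from IMSPRECH join IMSPRECD
       on ECDCONO = ECHCONO and ECDSTR = ECHSTR and ECDRCT = ECHRCT
    where ECHCONO = :company
      and ECHDTE between :fromDate and :toDate
      and ECDCONO = :company
      and ECOVND  = :vendor
      and ECDEXT  = :item;

if totalInd < 0;       // SUM over no rows returns NULL
   totalAmt = 0;
endif;

That keeps the "null indicator required" message out of the job log without changing the result.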

Any thoughts?





