At a minimum, he should run the statement while the job is in debug mode - run STRDBG with no program specified. The job log will then contain optimizer messages telling which indexes were used, which were built temporarily, and so on.
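As a sketch, the command sequence might look like this (UPDPROD(*NO) is just the safe default so debug mode can't update production files):

```
STRDBG UPDPROD(*NO)   /* put the job in debug mode; no program is needed  */
/* ... run the SQL statement here (STRSQL, the RPGLE program, etc.) ...   */
ENDDBG                /* end debug mode                                   */
DSPJOBLOG             /* review the optimizer messages in the job log     */
```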

Even more info can be had from Visual Explain in Navigator's Database component - which at more recent versions has an "advised indexes" feature. Note that not all of the advised indexes actually get used at run time; some only help the optimizer decide what to do in the access plan.
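On recent releases the optimizer's index advice is also accumulated in a catalog table, so a query along these lines can show what the optimizer wished it had (the SYSIXADV table and these column names are an assumption - check what your release actually provides):

```sql
-- Index advice accumulated by the query optimizer.
-- KEY_COLUMNS_ADVISED shows the keys it would have liked to use.
SELECT TABLE_SCHEMA, TABLE_NAME,
       KEY_COLUMNS_ADVISED, TIMES_ADVISED
  FROM QSYS2.SYSIXADV
 ORDER BY TIMES_ADVISED DESC
 FETCH FIRST 20 ROWS ONLY
```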

-------------- Original message --------------
From: "Jim Franz" <franz400@xxxxxxxxxxxx>

I have a customer with a 10 million record database, and an RPGLE SQL
pgm for users to make selections, on a tiny system. Some selections
can force a table scan, but I made sure thru edits that there had to be
some selection (like customer) so that the %like only gets executed
when a single customer is selected, and logical views exist so that sql
does not have to build a key. You might want to (if you have not
already done this) analyze the file(s) and logicals so the system doesn't
need to build large temporary structures.
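The pattern Jim describes - require a selective predicate before allowing the wildcard - might look like this (the table and column names here are made up for illustration):

```sql
-- Hypothetical order-history table; CUSTNO is the leading key of an
-- existing logical file, so this predicate avoids a table scan.
SELECT ORDNO, CUSTNO, ITEMDESC
  FROM ORDHIST
 WHERE CUSTNO = ?            -- required, highly selective selection
   AND ITEMDESC LIKE ?       -- wildcard only scans one customer's rows
```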
jim franz
----- Original Message -----
From: "Tim Gornall"
Sent: Friday, May 30, 2008 10:46 AM
Subject: I almost dropped the system due to SQL using large amounts

We have a SQLRPGLE program that dynamically builds the SQL statement based
on user input. Sometimes the statement can be fairly large, and when it is it
uses very large amounts of both processing and storage resources. I have
seen it gobble up processor, but was surprised yesterday when I noticed
the system ASP went through the roof. The ASP was up to 93 percent and
climbing when I got word. As soon as I ended the job the ASP came back down
almost immediately. I backed out the process completely until I can be
assured this won't happen again.

The job is run interactively within an order entry process. We implemented
it last week and instantly noticed slow system response when large queries
are run. As a quick fix, I change the job priority within the program
before the SQL runs and then change it back afterwards. This makes it run a
bit slower for the user, but the system does not take the hit.
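A minimal sketch of that quick fix, assuming the priority change is done by calling QCMDEXC from the RPGLE program (shown in later free-format syntax; the priority values 50 and 20 are just illustrative):

```
// RPGLE sketch: drop to a batch-like run priority around the big query.
// QCMDEXC takes the command string and its length as a packed(15:5) value.
dcl-pr QCmdExc extpgm('QCMDEXC');
  cmd    char(100) const;
  cmdLen packed(15:5) const;
end-pr;

QCmdExc('CHGJOB RUNPTY(50)' : 17);   // lower priority before the query
exsr RunBigQuery;                    // run the dynamic SQL statement
QCmdExc('CHGJOB RUNPTY(20)' : 17);   // restore interactive priority
```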

But now with the ASP issue, I need a real fix. Is there a way to limit
the amount of resources the job can allocate? Both processor and ASP.
Basically put some type of governor on SQL. Ideally it would be a global
setting for all SQL run on the machine.
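For what it's worth, the query governor Tim is asking about does exist: the CHGQRYA command can cap a query's estimated run time and temporary storage for a job, and the QQRYTIMLMT system value sets a machine-wide time default. A per-job sketch (the limit values are just examples):

```
CHGQRYA QRYTIME(600)     /* cancel queries estimated to run over 600 sec  */
        QRYSTGLMT(1000)  /* cancel queries estimated to need over 1000 MB */
                         /* of temporary storage                          */
```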

Thanks, Tim

This is the Midrange Systems Technical Discussion (MIDRANGE-L) mailing list
To post a message email: MIDRANGE-L@xxxxxxxxxxxx
To subscribe, unsubscribe, or change list options,
or email: MIDRANGE-L-request@xxxxxxxxxxxx
Before posting, please take a moment to review the archives
