|
Hi Charles, and thanks for your answer.
"Having said that, 20min still seems like way too much time."
If I test one single customer in ACS the cost is between 0,9 and 1,3
seconds, so multiplied by 1.260 executions that's not far from 20
minutes. The I/O is really high (I guess a full scan for each execution).
Sadly I have no time to ask IBM.
Thanks!
--
Marco Facchinetti
Mr S.r.l.
Tel. 035 962885
Cel. 393 9620498
Skype: facchinettimarco
On Wed, Aug 21, 2024 at 4:01 PM Charles Wilt <
charles.wilt@xxxxxxxxx> wrote:
A few thoughts...
First off, to find the answer you're looking for, you're going to
need to capture a DB trace and probably hand it over to IBM.
I'd agree that what you're seeing doesn't make sense. I'd expect
the cursor to be slower.
I think Daniel hit the nail on the head talking about a difference
in optimization.
My understanding is that while static statements are optimized at
compile time, there is some level of re-optimization / planning that
happens at run time, so that the DB can ensure the best performance in
case things have changed; for instance, an index has been added.
This run-time optimization happens once for the cursor variant, since
the DECLARE happens at compile time and the statement is only optimized
during the first OPEN - but 1260 times for the static SELECT INTO.
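A minimal sketch of the two patterns in embedded SQL (MYTABLE and the
select list are made up for illustration; only ABCDUTENTE and the host
variable LcCDUTENTE come from this thread):

    // Cursor variant: the statement is optimized once, at the first OPEN
    exec sql DECLARE c1 CURSOR FOR
             SELECT total FROM MYTABLE WHERE ABCDUTENTE = :LcCDUTENTE;
    exec sql OPEN c1;
    exec sql FETCH c1 INTO :total;
    exec sql CLOSE c1;

    // Static SELECT INTO: may go through run-time optimization on
    // every one of the 1260 executions
    exec sql SELECT total INTO :total
             FROM MYTABLE WHERE ABCDUTENTE = :LcCDUTENTE;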
Having said that, 20min still seems like way too much time.
I suspect you'd see even better performance with a dynamic VALUES
INTO that is PREPAREd once and EXECUTEd 1260 times.
Of course, I agree with everybody else who suggested removing the
need to run the statement 1260 times.
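A minimal sketch of that pattern (statement text and names are
illustrative assumptions; I'm relying here on Db2 for i accepting an
INTO clause on EXECUTE for a prepared statement):

    stmtTxt = 'VALUES((SELECT total FROM MYTABLE WHERE ABCDUTENTE = ?))';
    exec sql PREPARE s1 FROM :stmtTxt;     // optimized once

    // inside the 1260-iteration loop:
    exec sql EXECUTE s1 INTO :total USING :LcCDUTENTE;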
You say that the "customer's list (ABCDUTENTE) is too complex with
SQL". I suspect it's not as difficult as you may think :)
Basically, what you need is a result set containing the 1260 values
used for ABCDUTENTE.
This could be an SQL UDTF or, if really necessary, an RPG UDTF.
Then at worst, you have:
    AND LcCDUTENTE IN (SELECT abcdutente FROM TABLE(myudtf(parms)))
But I'd probably look at joining the UDTF results.
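The join could look something like this (mytable and its columns are
assumptions for illustration; only ABCDUTENTE is from the thread):

    SELECT t.*
      FROM TABLE(myudtf(parms)) u
      JOIN mytable t
        ON t.ABCDUTENTE = u.abcdutente

That way the optimizer sees the whole set of 1260 customers at once
instead of being probed one value at a time.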
HTH,
Charles
On Wed, Aug 21, 2024 at 2:43 AM Marco Facchinetti <
marco.facchinetti@xxxxxxxxx> wrote:
Hi Daniel, thanks for your time.
I agree about data sets and, when possible, I design SQL access in
that way, but in this specific program obtaining the customer's list
(ABCDUTENTE) is too complex with SQL.
The program with the cursor works and is completely satisfactory,
but I would like to know why with the Select the times are so high.
Is it a problem in the links between the tables? In the conditions,
or in the sequence in which they are specified?
What puzzles me is why a cursor is so fast (2/3 SECONDS) and a
Select takes 20 MINUTES. The code of the Select and of the Cursor is
exactly the same, and the general logic too: it will be executed
1.260 times.
TIA
--
Marco Facchinetti
Mr S.r.l.
Tel. 035 962885
Cel. 393 9620498
Skype: facchinettimarco
On Tue, Aug 20, 2024 at 5:17 PM Daniel Gross <daniel@xxxxxxxx>
wrote:
Hi Marco,
On 20.08.2024 at 16:50, Marco Facchinetti <marco.facchinetti@xxxxxxxxx> wrote:
Hi all, I'm talking about embedded SQL.
I execute the following code 1.260 times:
<snip>
Takes 2/3 seconds.
Executing 1.260 times any variation of these 4 SQL statements (the
last one is the cursor's code):
<snip>
You can try the statements in iACS and look at what Visual Explain
says about them. Sometimes the problem is a missing index - sometimes
something that makes it impossible to cache the access plan or result.
But first, you should think about the application logic - repeating a
statement (or a whole SQL cursor loop) over 1.000 times is a big
no-no.
To utilize the full power of SQL you should think about "data sets"
and how the related tables are linked together - maybe you can join
the data to another table - so that you don't have thousands of
open/close operations or repeated select-into statements.
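For illustration, one set-based statement instead of 1.260 single-row
lookups (all table and column names here are invented):

    SELECT c.abcdutente, SUM(o.amount) AS total
      FROM customers c
      JOIN orders o
        ON o.abcdutente = c.abcdutente
     GROUP BY c.abcdutente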
So rethinking the application design might be a very good idea - from
this restricted point of view.
HTH
Daniel
--
This is the RPG programming on IBM i (RPG400-L) mailing list To
post a message email: RPG400-L@xxxxxxxxxxxxxxxxxx To subscribe,
unsubscribe, or change list options,
visit: https://lists.midrange.com/mailman/listinfo/rpg400-l
or email: RPG400-L-request@xxxxxxxxxxxxxxxxxx
Before posting, please take a moment to review the archives at
https://archive.midrange.com/rpg400-l.
Please contact support@xxxxxxxxxxxxxxxxxxxx for any subscription
related questions.