@Marco
I tried what @Charles suggested, but it doesn't work: it returns only the
number of fetched rows, not the total number of rows.

So far, I have successfully tested @Kyle's suggestion to use the SQL
ROW_COUNT() function in reverse order.

On Tue, 28 Jul 2020 at 18:23, Marco Facchinetti <
marco.facchinetti@xxxxxxxxx> wrote:

Hi Maria, just answering the count() part of your question:

// Prepare the dynamic statement from the request's SQL string
EXEC SQL
  PREPARE S1 FROM :W_String;

// An INSENSITIVE SCROLL cursor materializes the full result set at OPEN
EXEC SQL
  DECLARE Curs1 INSENSITIVE SCROLL CURSOR FOR S1;

EXEC SQL
  OPEN Curs1;

// After OPEN, DB2_NUMBER_ROWS holds the total number of rows
// in the materialized result table
EXEC SQL
  GET DIAGNOSTICS :Risposta.Recordtotali = DB2_NUMBER_ROWS;

HTH
--
Marco Facchinetti

Mr S.r.l.

Tel. 035 962885
Cel. 393 9620498

Skype: facchinettimarco


On Tue, 28 Jul 2020 at 11:17, Maria Lucia Stoppa <
mlstoppa@xxxxxxxxx> wrote:

Hi all,
Hope you are well.

A REST API I am working on returns data split into pages: many calls are
necessary to get the complete data set, but this way the user doesn't
wait too long for the first page. Data are retrieved within an RPG ILE
service program by a static SQL statement which applies some dynamic
filters that come with the request.

Everything works fine, except that the same static SQL statement is run
at least twice: once to get the total number of rows (a simple COUNT(*))
and once to fetch the rows themselves, page by page.
Now, on the same data retrieved by this SQL statement, other SELECT
statements must be run to get some totals according to different GROUP BY
clauses, in order to present the data set distribution to the final user.

I reluctantly accepted the idea of running the same SQL statement (which,
by the way, is pretty complex) at least twice to get both the total and
the rows, but I can't stand the idea of even more runs to get the data
distribution.

There might be errors in my design of how the procedure should work;
nonetheless, I wonder whether a single common table expression can be
used many times to serve many different SELECT statements, as I would
like to avoid other solutions such as global temporary tables.
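For context on the CTE question: a common table expression is scoped to the single statement that defines it, so it cannot by itself feed several separate SELECTs. One common workaround is to fold the total into the paging query with a window function, so the count and the page come back in one pass. A minimal sketch of that idea, illustrated here with Python and SQLite rather than DB2 for i, and with hypothetical table and column names:

```python
import sqlite3

# Hypothetical sample data standing in for the real, filtered result set
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, region TEXT)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [(1, "EU"), (2, "EU"), (3, "US"), (4, "US"), (5, "US")],
)

# COUNT(*) OVER () is evaluated over the whole filtered result set
# before LIMIT is applied, so every row of the page also carries the
# grand total -- one run, no separate COUNT(*) statement.
page = conn.execute(
    """
    WITH filtered AS (SELECT id, region FROM orders)
    SELECT id, region, COUNT(*) OVER () AS total_rows
    FROM filtered
    ORDER BY id
    LIMIT 2 OFFSET 0
    """
).fetchall()

for row in page:
    print(row)  # each page row includes total_rows = 5
```

COUNT(*) OVER () is also supported on DB2 for i, so the same pattern applies there; however, reusing one materialized intermediate result across several independent statements (the different GROUP BY totals) still requires something statement-spanning, such as a global temporary table.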

Any suggestion is really appreciated.

Many thanks

--

Maria Lucia Stoppa
mlstoppa@xxxxxxxxx
--
This is the RPG programming on IBM i (RPG400-L) mailing list
To post a message email: RPG400-L@xxxxxxxxxxxxxxxxxx
To subscribe, unsubscribe, or change list options,
visit: https://lists.midrange.com/mailman/listinfo/rpg400-l
or email: RPG400-L-request@xxxxxxxxxxxxxxxxxx
Before posting, please take a moment to review the archives
at https://archive.midrange.com/rpg400-l.

Please contact support@xxxxxxxxxxxxxxxxxxxx for any subscription related
questions.

Help support midrange.com by shopping at amazon.com with our affiliate
link: https://amazon.midrange.com




