


If you use ROW_NUMBER() or an ORDER BY in a subquery and then select only 20
rows, the complete query must be executed first (i.e. all rows must be
sorted and numbered) before the desired (first 20) rows can be selected. For
huge tables this may take a lot of time.

What if you execute the sub-query first to return the 20 rows you need, and
number the result in a second sub-query?
Something like this:
exec sql
  select b.SeqNbr, b.RowNum
    into :vSeqnbr, :vRownum
    from (Select SeqNbr, Row_Number() Over(Order By SeqNbr) as RowNum
            From (Select SeqNbr
                    From dsh3526p
                    Where SeqNbr > :inTopKey.seqnbr
                    Order By SeqNbr
                    Fetch First 20 Rows Only) a) b
    where RowNum >= :inSflPagSx
    order by RowNum
    fetch first row only;
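The idea is limit-then-number: the innermost query uses the key to fetch only 20 rows, and only that small result is numbered. As a rough, hypothetical illustration (SQLite via Python with made-up sample data, not the original DB2 for i embedded SQL; the table and column names merely echo the thread):

```python
import sqlite3

# Hypothetical sketch of the nested-subquery form above, using SQLite
# instead of DB2 for i: the innermost query restricts by key and fetches
# only 20 rows; ROW_NUMBER then runs over just that 20-row result.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dsh3526p (seqnbr INTEGER PRIMARY KEY)")
conn.executemany("INSERT INTO dsh3526p VALUES (?)",
                 [(n,) for n in range(1, 1001)])  # 1000 sample rows

top_key, page_pos = 100, 5  # stand-ins for :inTopKey.seqnbr, :inSflPagSx

row = conn.execute("""
    SELECT b.seqnbr, b.rownum
      FROM (SELECT seqnbr,
                   ROW_NUMBER() OVER (ORDER BY seqnbr) AS rownum
              FROM (SELECT seqnbr
                      FROM dsh3526p
                      WHERE seqnbr > ?
                      ORDER BY seqnbr
                      LIMIT 20) a) b
     WHERE b.rownum >= ?
     ORDER BY b.rownum
     LIMIT 1
    """, (top_key, page_pos)).fetchone()

print(row)  # (105, 5): the 5th row after key 100
```

(SQLite uses LIMIT where DB2 uses FETCH FIRST n ROWS ONLY; window functions require SQLite 3.25 or later.)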

or (IMHO this is more readable)
Exec SQL
  With x as (Select SeqNbr
               From DSH3526P
               Where SeqNbr > :inTopKey.SeqNbr
               Order By SeqNbr
               Fetch First 20 Rows Only),
       y as (Select Row_Number() Over(Order By SeqNbr) as RowNum, SeqNbr
               From x)
  Select SeqNbr, RowNum into :vSeqnbr, :vRownum
    From y
    Where RowNum >= :inSflPagSx
    Fetch First Row Only;

Also make sure that there is an index (or logical file) with SeqNbr as the
key field.
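The CTE variant can be sketched the same way (again a hypothetical SQLite/Python illustration with made-up data; here the PRIMARY KEY on seqnbr plays the role of the index or logical file recommended above):

```python
import sqlite3

# Hypothetical sketch of the CTE variant above, in SQLite: x fetches the
# 20-row page via the key (served here by the PRIMARY KEY index), and y
# numbers only those 20 rows before the outer query picks one.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dsh3526p (seqnbr INTEGER PRIMARY KEY)")
conn.executemany("INSERT INTO dsh3526p VALUES (?)",
                 [(n,) for n in range(1, 1001)])  # 1000 sample rows

top_key, page_pos = 100, 5  # stand-ins for the host variables

row = conn.execute("""
    WITH x AS (SELECT seqnbr
                 FROM dsh3526p
                 WHERE seqnbr > ?
                 ORDER BY seqnbr
                 LIMIT 20),
         y AS (SELECT ROW_NUMBER() OVER (ORDER BY seqnbr) AS rownum,
                      seqnbr
                 FROM x)
    SELECT seqnbr, rownum
      FROM y
     WHERE rownum >= ?
     ORDER BY rownum
     LIMIT 1
    """, (top_key, page_pos)).fetchone()

print(row)  # (105, 5)
```

Because x is restricted by the key before anything is numbered, the cost stays proportional to the page size, not to the table size.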

Mit freundlichen Grüßen / Best regards

Birgitta Hauser

"Shoot for the moon, even if you miss, you'll land among the stars." (Les
Brown)
"If you think education is expensive, try ignorance." (Derek Bok)
"What is worse than training your staff and losing them? Not training them
and keeping them!"

-----Original Message-----
From: MIDRANGE-L [mailto:midrange-l-bounces@xxxxxxxxxxxx] On Behalf Of
Steve Richter
Sent: Monday, March 23, 2015 19:21
To: Midrange Systems Technical Discussion
Subject: row_number performing badly in table with a lot of rows

Experimenting with using row_number to implement a SKIP type function,
where an inner query uses row_number to assign a sequential number to the
selected rows, and then the outer query says WHERE rownum >= 20 to skip over
the first 20 of the selected rows. The objective is to page through the
rows of a table, either forward or backward.

Here is the code for paging forward:
exec sql
  select a.seqnbr, a.rownum
    into :vSeqnbr, :vRownum
    from (select a.seqnbr,
                 row_number() over(order by a.seqnbr) rownum
            from dsh3526p a
            where a.seqnbr > :inTopKey.seqnbr
            order by a.seqnbr
            fetch first 20 rows only) a
    where a.rownum >= :inSflPagSx
    order by a.rownum
    fetch first row only;

The code works, and it performs well enough when there are up to 50,000 rows
in the table. But the more rows in the table, the slower the execution.

How can I use row_number in this manner efficiently? I tried a variant with
a third nested query which simply runs "fetch first 20 rows only", returning
those rows to the query that assigns the row_number, which in turn returns
its result rows to the outer query. Same poor performance.

thanks,
--
This is the Midrange Systems Technical Discussion (MIDRANGE-L) mailing list
To post a message email: MIDRANGE-L@xxxxxxxxxxxx To subscribe, unsubscribe,
or change list options,
visit: http://lists.midrange.com/mailman/listinfo/midrange-l
or email: MIDRANGE-L-request@xxxxxxxxxxxx Before posting, please take a
moment to review the archives at http://archive.midrange.com/midrange-l.
