On 2 February 2016 at 21:50, Booth Martin <booth@xxxxxxxxxxxx> wrote:
I am doing that, as step 1. It works great. It's step 2 that's troubling
me.

Step 1: gather all records for ItemA1.
Repeat for ItemA2, A3, B5, B8, R8, etc.

In the end I will have gathered 30 to 40 records for each of 10 to 60
different items.

I expect to have a separate array and keep track of positioning, then add
each gathering to the end of the array. SQL has so often surprised me with
flexibility I did not expect that it seemed reasonable to ask.

Apologies to the list for this inline reply. I'm too old to be able
to think upside down, and just can't manage to top post like a normal
person. So the salient background is below this :-(

Where are you getting items A1, A2, A3, B5, etc.? The point is that if
they are coming in as a group - say, a subfile - this is a set, and
SQL loves sets. So instead of doing a SELECT... WHERE PARTNO = :ITEM
for each one, maybe you could try WHERE PARTNO IN (:ITEMA, :ITEMB...)
or some similar construct. If you intend to manipulate the resultant
set of A1 through R8, grab them all in one go in the first place,
rather than grabbing them piecemeal and then appending them into a
final set.
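To make the set-based idea concrete, here is a small sketch - in Python against an in-memory SQLite table rather than RPG/DB2, and with hypothetical table and column names (`parts`, `partno`, `qty`) - showing one `WHERE ... IN` query replacing a loop of single-item SELECTs:

```python
import sqlite3

# Hypothetical table standing in for the item master file.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE parts (partno TEXT, qty INTEGER)")
conn.executemany("INSERT INTO parts VALUES (?, ?)",
                 [("A1", 5), ("A2", 3), ("B5", 7), ("R8", 2), ("Z9", 1)])

# Instead of one SELECT per item, fetch the whole set in one go
# with WHERE partno IN (...), building the placeholder list to
# match however many items the "subfile" handed us.
items = ["A1", "A2", "B5", "R8"]
placeholders = ", ".join("?" for _ in items)
rows = conn.execute(
    f"SELECT partno, qty FROM parts WHERE partno IN ({placeholders})",
    items,
).fetchall()

print(rows)  # all matching rows arrive as a single result set
```

The host-variable list in embedded SQL is fixed at compile time, so the dynamic placeholder trick above is the dynamic-SQL analogue of `WHERE PARTNO IN (:ITEMA, :ITEMB...)`.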

--buck

On 2/1/2016 3:53 PM, Charles Wilt wrote:

Why not just fetch all 1000 at once?

c/exec sql
c+ fetch myCursor for 1000 rows
c+ into :myOccursDS
c/end-exec

c eval wNumRowsRtn = SQLERRD(3)
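The multi-row FETCH above pulls a block of rows into a data-structure array in one call, with SQLERRD(3) reporting how many actually arrived. The same blocked-fetch idea, sketched here in Python with `sqlite3` and `fetchmany` (hypothetical `orders` table; not the DB2 API):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER)")
conn.executemany("INSERT INTO orders VALUES (?)",
                 [(i,) for i in range(2500)])

cur = conn.execute("SELECT id FROM orders")
total = 0
while True:
    block = cur.fetchmany(1000)  # up to 1000 rows per call, like FOR 1000 ROWS
    if not block:                # empty block: cursor exhausted
        break
    total += len(block)          # len(block) plays the role of SQLERRD(3)

print(total)  # → 2500
```

As in the RPG version, the last block is typically short, so the loop checks the actual count returned rather than assuming a full block every time.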

Charles


This mailing list archive is Copyright 1997-2024 by midrange.com and David Gibbs as a compilation work. Use of the archive is restricted to research of a business or technical nature. Any other uses are prohibited. Full details are available on our policy page. If you have questions about this, please contact [javascript protected email address].
