Bob, I have to admit stupidity. I have never seen this done before. It is
exactly what I need. Thank you. I had visions of CL programs with OPNQRYF,
headaches, and performance problems, or some such. Now that I see what you've
done and how simple it is, I'm feeling really dumb. Still, I suppose if I
hadn't asked I'd be the only one who didn't know.

_______________________
Booth Martin
boothm@earth.goddard.edu
http://www.spy.net/~booth
_______________________


Bob Larkin <blarkin@wt.net>
Sent by: owner-rpg400-l@midrange.com
12/09/1999 12:46 AM
Please respond to RPG400-L

To:      RPG400-L@midrange.com
cc:
Subject: Re: Alphabetizing detail records to a header file


If I understand the question, create a logical or view with the DETAIL file
as the primary file joining to the Header file. The join would be on the
"arrival sequence Unique ID number" field. Select the Alpha Name from the
secondary file (Header) as the key. The maximum number of reads would then be
the total number of records in the detail file.

It sounds like you have used the JDFTVAL keyword in your join, in which case
the join is backwards, resulting in a view whose total number of records
equals the number of records in the Header file (unless you use an inner
join, which is what you get when the JDFTVAL keyword is omitted).

The logical below should work. Let us know if it does the trick.

|...+....1....+....2....+....3....+....4....+....5....+....6....+....7....+....8
A          R JOINREC                   JFILE(DETAIL HEADER)
A          J                           JOIN(DETAIL HEADER)
A                                      JFLD(DetailUniqueID HeaderUniqueID)
A            UniqueID                  JREF(DetailUniqueID)
A            NAME
A            ..... fields .......
A          K NAME

Bob


Booth Martin (by way of David Gibbs) wrote:
> I have an idea I may have missed something obvious.
>
> Here's the situation: I have a Header File of names and addresses. The
> Header File has a unique identifying number for each name. The numbers
> are assigned as each new name is added to the file, so the numbers really
> amount to arrival sequence, not to alphabetic sequence. I do have a
> logical over the Header File based upon the name, so that any reports can
> be generated in alphabetic order. All this seems simple and
> straightforward.
>
> Now comes the part where I think I am missing something: I also have an
> application with a small detail file for a specialized application. The
> small group is probably going to be about 300 records but certainly never
> over 1,000 records. I've linked it logically with the Header File by the
> unique Header File I.D. number. To present an alphabetic subfile on the
> screen of that small grouping I have to read through the entire alpha
> logical on the Header File. As you can imagine, performance is horrid.
>
> In the past I've loaded an *INZSR array, and I've denormalized the data.
> Neither method has felt right. Is there some simple thing I've missed?
>
> Thanks.
> _______________________
> Booth Martin
> boothm@earth.goddard.edu
> http://www.spy.net/~booth
> _______________________
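
For comparison, the "logical or view" Bob mentions could also be built as an
SQL view. The sketch below is only illustrative: the view name DTLBYNAME is
made up, and the file and field names are the same placeholders used in the
DDS join logical above. Unlike the keyed join logical, an SQL view carries no
key, so the alphabetic order would come from an ORDER BY when the view is read.

    -- Sketch only: DTLBYNAME is a hypothetical view name; DETAIL, HEADER,
    -- DetailUniqueID, HeaderUniqueID, UniqueID, and NAME are the placeholder
    -- names from the DDS example above.
    CREATE VIEW DTLBYNAME AS
        SELECT D.UniqueID,
               H.Name
        FROM   DETAIL D
               JOIN HEADER H
                 ON D.DetailUniqueID = H.HeaderUniqueID

    -- Read it in alphabetic order, e.g. from embedded or interactive SQL:
    -- SELECT * FROM DTLBYNAME ORDER BY Name

Either way, DETAIL drives the join and HEADER is probed once per detail
record, so the cost stays proportional to the 300-1,000 detail records rather
than to the full Header file.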
+---
| This is the RPG/400 Mailing List!
| To submit a new message, send your mail to RPG400-L@midrange.com.
| To subscribe to this list send email to RPG400-L-SUB@midrange.com.
| To unsubscribe from this list send email to RPG400-L-UNSUB@midrange.com.
| Questions should be directed to the list owner/operator: david@midrange.com
+---