On 3/19/06, rick baird <rick.baird@xxxxxxxxx> wrote:
> I've got a file with 2 or 3 years' worth of history in it.  The key
> fields are not unique, so duplicates abound.
>
> I want to extract the last instance (by rrn) of each key value.
>
> will this work?
>
> copy the file (by RRN) to a sequential file (this might not be necessary?)
>
> create an empty duplicate file with the UNIQUE keyword specified.
>
> CPYF to the empty file using MBROPT(*UPDADD)
>
> sound kosher?

Looks good. I never knew of MBROPT(*UPDADD). When was that option
added to the CPYF command?  Does CPYF look only at the keys of the
tofile to determine that a record is a duplicate (as opposed to
examining all logicals of the tofile)?
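
For reference, the two CPYF commands being described would look
something like this (file and library names are hypothetical):

  /* Copy to a flat work file in arrival (RRN) order.  Specifying */
  /* FROMRCD forces CPYF to read a keyed from-file in arrival     */
  /* sequence instead of keyed sequence.                          */
  CPYF FROMFILE(MYLIB/HIST) TOFILE(QTEMP/HISTSEQ) CRTFILE(*YES) +
       FROMRCD(1)

  /* Copy into the empty, uniquely keyed duplicate.  On a duplicate */
  /* key, MBROPT(*UPDADD) updates the existing record instead of    */
  /* failing, so the last occurrence of each key wins.              */
  CPYF FROMFILE(QTEMP/HISTSEQ) TOFILE(MYLIB/HISTUNQ) +
       MBROPT(*UPDADD) FROMRCD(1)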

To maintain compatibility with the p5, I would do this as an SQL
procedure (a sketch of the statements follows these steps):
  - Insert all the columns of FROMFILE, plus a new column (RRNCOL)
that is assigned the RRN of each row, into a temporary table T1.
  - Insert a grouping of T1 on the key columns, with MAX(RRNCOL),
into a temporary table T2.  T2 is now uniquely keyed on the key
columns.
  - Join from T2 back to T1 on the key columns and the RRNCOL column.
This gives you all the columns of T1.  Insert all the T1 columns
except RRNCOL into a resulting table that is created like the
FROMFILE.
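
A minimal sketch of those three steps, assuming FROMFILE has key
columns KEY1 and KEY2 plus one data column DATA1 (all hypothetical
names; system naming is used for QTEMP):

  -- Step 1: every row of FROMFILE together with its relative
  -- record number.
  CREATE TABLE QTEMP/T1 (KEY1 CHAR(10), KEY2 CHAR(10),
                         DATA1 CHAR(50), RRNCOL BIGINT);
  INSERT INTO QTEMP/T1
    SELECT f.KEY1, f.KEY2, f.DATA1, RRN(f)
      FROM FROMFILE f;

  -- Step 2: the highest RRN per key value; T2 is now unique
  -- on (KEY1, KEY2).
  CREATE TABLE QTEMP/T2 (KEY1 CHAR(10), KEY2 CHAR(10), RRNCOL BIGINT);
  INSERT INTO QTEMP/T2
    SELECT KEY1, KEY2, MAX(RRNCOL)
      FROM QTEMP/T1
     GROUP BY KEY1, KEY2;

  -- Step 3: join back to T1 to recover the full row, dropping RRNCOL.
  CREATE TABLE RESULT LIKE FROMFILE;
  INSERT INTO RESULT
    SELECT t1.KEY1, t1.KEY2, t1.DATA1
      FROM QTEMP/T1 t1
      JOIN QTEMP/T2 t2
        ON  t1.KEY1   = t2.KEY1
        AND t1.KEY2   = t2.KEY2
        AND t1.RRNCOL = t2.RRNCOL;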

-Steve

