I'd pick some logical file, any logical file, for that physical, so long as it included all fields. I'd define a data structure externally. I'd define a field New data Like(The Data Structure) and eval it as equal to the new record.. Then I would use the key to do setll/read loop while the key is unchanged. In the loop if the datastructure equals the new data field, then I'd process my duplicate record subroutine. --------------------------------------------------------- Booth Martin http://www.MartinVT.com Booth@xxxxxxxxxxxx --------------------------------------------------------- -------Original Message------- From: Midrange Systems Technical Discussion Date: Friday, May 09, 2003 12:52:40 To: midrange-l@xxxxxxxxxxxx Subject: Best Way to find duplicate records in same file I need to add duplicate record checking in a file we received from a switch. (I'll just add it to the CL that already FIXES a bunch of stuff in the file anyways) and was wondering what the best (i.e easy/fast) to do it would be. I've come up with: 1. Run a query against the file, sorted by entire record, level break on entire record, count. Output Query results to file and then do cpyf. 2. Build a file keyed on all 210 bytes, read original chain to new file, if not found write it 3. SQL ??? not done much with it but maybe it could do? Mark Allen IS Manager Wilkes Telephone & Electric 11 W. Court Street Washington, GA 30673 Phone: (706) 678-9565 Fax: (706) 678-1000 _______________________________________________ This is the Midrange Systems Technical Discussion (MIDRANGE-L) mailing list To post a message email: MIDRANGE-L@xxxxxxxxxxxx To subscribe, unsubscribe, or change list options, visit: http://lists.midrange.com/mailman/listinfo.cgi/midrange-l or email: MIDRANGE-L-request@xxxxxxxxxxxx Before posting, please take a moment to review the archives at http://archive.midrange.com/midrange-l. .