Do you need to check the entire record, or a "key field"?

If it's just a key field, something like this SQL may work:

SELECT keyfield, COUNT(*)
FROM somefile
GROUP BY keyfield
HAVING COUNT(*) > 1

If you need to check the entire record (is it one field or multiple
fields?), you can do the same, but grouping on all the fields:

SELECT field1 || field2 || CHAR(field3) || ..., COUNT(*) /* I think */
FROM somefile
GROUP BY field1, field2, field3, ...
HAVING COUNT(*) > 1
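
If the goal is to actually remove the extra copies rather than just count
them, a correlated DELETE on the relative record number is a commonly used
follow-on. This is only a sketch using the placeholder names above; it
assumes DB2/400's RRN() function and that you want to keep the first
physical copy in each duplicate group:

DELETE FROM somefile a
WHERE RRN(a) > (SELECT MIN(RRN(b))    -- keep the lowest relative record
                FROM somefile b       -- number in each duplicate group
                WHERE b.keyfield = a.keyfield)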

HTH,
Loyd

--  
Loyd Goodbar
Programmer/analyst
BorgWarner Incorporated
ETS/Water Valley
662-473-5713
lgoodbar@xxxxxxxxxxxxxx


-----Original Message-----
From: Mark Allen [mailto:mallen@xxxxxxxx] 
Sent: Friday, May 09, 2003 12:16 PM
To: midrange-l@xxxxxxxxxxxx
Subject: Best Way to find duplicate records in same file


I need to add duplicate record checking to a file we received from a switch
(I'll just add it to the CL that already FIXES a bunch of stuff in the file
anyway) and was wondering what the best (i.e., easy/fast) way to do it would
be. I've come up with:
 
1.  Run a query against the file, sorted by the entire record, with a level
    break on the entire record and a count. Output the query results to a
    file and then do a CPYF.
2.  Build a file keyed on all 210 bytes, read the original, CHAIN to the new
    file, and write the record if it's not found.
3.  SQL??? I've not done much with it, but maybe it could do the job? (See
    the sketch below.)
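
For option 3, a set-based copy is one possibility. This is only a sketch:
NEWFILE is a hypothetical work file assumed to already exist with the same
210-byte record layout as the original.

-- Write exactly one instance of each distinct record to the work file;
-- the original member can then be cleared and reloaded (e.g. with CPYF).
INSERT INTO newfile
    SELECT DISTINCT *
    FROM somefile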
