


Probably going to go with the following, as I need the deletes for the
file captured in the journal:

Create file FILEW with a key on all fields.

RPG PGM (pseudocode):

Read FILE
Dow not end-of-file
  Chain to FILEW
  If not found in FILEW
    Write FILEW      (first occurrence: record it)
  Else
    Delete FILE      (duplicate: delete it, and journal the delete)
  Endif
  Read FILE
Enddo
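The mirror-file loop above can be sketched in Python to show the logic (a set stands in for the keyed work file FILEW; the names and the list-of-records shape are illustrative, not the actual RPG files):

```python
def dedup_keep_first(records):
    """Mimic the FILEW loop: keep the first occurrence of each record,
    collect later duplicates as the ones that would be DELETEd."""
    filew = set()      # plays the role of the keyed work file FILEW
    kept = []
    deleted = []       # records whose DELETE would land in the journal
    for rec in records:
        if rec not in filew:   # CHAIN did not find it -> WRITE to FILEW
            filew.add(rec)
            kept.append(rec)
        else:                  # CHAIN found it -> DELETE from FILE
            deleted.append(rec)
    return kept, deleted
```

With the three records from the example below, the second "ABCDEFGHIJKL" is the only one deleted, which matches the requirement of keeping record 1 and dropping record 2.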


-----Original Message-----
From: midrange-l-bounces@xxxxxxxxxxxx
[mailto:midrange-l-bounces@xxxxxxxxxxxx] On Behalf Of Mlpolutta@xxxxxxx
Sent: Friday, May 09, 2003 2:18 PM
To: Midrange Systems Technical Discussion
Subject: Re: Best Way to find duplicate records in same file


You can also use OPNQRYF/CPYFRMQRYF to do this (one-time approach):

OPNQRYF FILE(YOURFILE) KEYFLD((KEY 1) (KEY 2)) UNIQUEKEY(*ALL)
CPYFRMQRYF OPNID(YOURFILE) TOFILE(UNIQUEFILE) CRTFILE(*YES) MBROPT(*REPLACE)

Now UNIQUEFILE has the unique records based on the keys given.
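The effect of UNIQUEKEY(*ALL) followed by the copy can be sketched in Python — one record survives per key value, copied into a new collection rather than deleting in place (field positions and the key function here are illustrative assumptions, not the actual file layout):

```python
def copy_unique(records, key):
    """Mimic OPNQRYF UNIQUEKEY(*ALL) + CPYFRMQRYF: copy only one
    record per distinct key value into a new 'file'."""
    seen = set()
    unique = []
    for rec in records:
        k = key(rec)
        if k not in seen:   # first record with this key is the one copied
            seen.add(k)
            unique.append(rec)
    return unique
```

Note the trade-off versus the RPG approach: this produces a clean copy (UNIQUEFILE), but the duplicates are simply not copied, so no delete operations are recorded in the journal of the original file.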

HTH,
Michael

> Actually what I need it to do is delete the dups:
> 
> Record 1: ABCDEFGHIJKL
> Record 2: ABCDEFGHIJKL
> Record 3: 123456789012
> 
> I want to delete record 2
_______________________________________________
This is the Midrange Systems Technical Discussion (MIDRANGE-L) mailing list.
To post a message, email: MIDRANGE-L@xxxxxxxxxxxx
To subscribe, unsubscribe, or change list options, visit:
http://lists.midrange.com/mailman/listinfo.cgi/midrange-l
or email: MIDRANGE-L-request@xxxxxxxxxxxx
Before posting, please take a moment to review the archives at
http://archive.midrange.com/midrange-l.



This mailing list archive is Copyright 1997-2024 by midrange.com and David Gibbs as a compilation work. Use of the archive is restricted to research of a business or technical nature. Any other uses are prohibited. Full details are available on our policy page. If you have questions about this, please contact [javascript protected email address].
