Murali,

If you really want to leave the existing duplicate records in the file and only enforce uniqueness going forward (personally, I would clean this up), there is a way, assuming you have a current transaction date field or a sequentially generated key field somewhere in the file.

Create a new logical file with the same keys as your existing file, make it UNIQUE, and add select/omit criteria so the new LF includes only records from this day forward: use "greater than date X" if you have a current-date field, or "greater than seq. # X" if you have a sequentially generated key. The new LF will therefore not include any of the dups written previously, but should any of your programs attempt to add a duplicate key from now on, the new LF will object.

Bill
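As a rough sketch of what Bill describes, the DDS for the new logical file might look something like this. All names here are made up for illustration (CUSTLF1 over a physical file CUSTPF, key field CUSTNO, transaction date field TRNDAT stored as 8,0 numeric in YYYYMMDD form) -- substitute your own file, record format, and field names, and pick the cutoff value that matches your data:

```dds
     A* Hypothetical unique-keyed LF over CUSTPF.  UNIQUE enforces
     A* the key going forward; the select spec keeps all of the
     A* pre-existing duplicates out of this access path.
     A                                      UNIQUE
     A          R CUSTREC                   PFILE(CUSTPF)
     A          K CUSTNO
     A          S TRNDAT                    COMP(GT 20250101)
```

If you have a sequentially generated key instead of a date, the select spec would compare that field against the last sequence number already written, e.g. `S SEQNBR ... COMP(GT 123456)` (again, hypothetical names and value). Only records that pass the select criteria participate in the unique access path, which is why the old dups don't trip the UNIQUE keyword.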
This mailing list archive is Copyright 1997-2025 by midrange.com and David Gibbs as a compilation work. Use of the archive is restricted to research of a business or technical nature. Any other uses are prohibited. Full details are available on our policy page. If you have questions about this, please contact [javascript protected email address].