Al, 

This can very easily be done. You need to create the query by joining the
file to itself. Join on all the fields that you want to consider for
duplicate record checking. For the type of join, specify "3=Unmatched
Records with primary file." This should give you a listing of all the
non-duplicated records.
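
For anyone who would rather do this with SQL on the iSeries instead of a
Query/400 join, here is a rough sketch of the same idea. ITEMFILE, ITNBR
and WHSE are hypothetical names standing in for your file and whatever
fields you want checked for duplicates:

   -- One output row per distinct combination of the checked fields
   -- (each duplicated record shows up once):
   SELECT DISTINCT ITNBR, WHSE
     FROM ITEMFILE

   -- Or, to list only the combinations that occur exactly once,
   -- i.e. records that have no duplicate at all:
   SELECT ITNBR, WHSE
     FROM ITEMFILE
    GROUP BY ITNBR, WHSE
   HAVING COUNT(*) = 1

Which of the two you want depends on whether "I don't want the duplicated
records" means one copy of each, or none of the records that have a twin.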

Bud Roble 

-----Original Message-----
From: avron gershen [mailto:aldg3@xxxxxxxxxxx] 
Sent: Tuesday, July 15, 2003 3:46 PM
To: MapicsML
Subject: Query/400: Selecting Records based on Record No?

Hi group:

I don't believe this can be done but I'll run it by you 
for verification:

If I have a DB file with ten records in it and five of the 
records are duplicated, using Query/400 can I output five 
of the records to a report?  I don't want the duplicated 
records in the report.

I don't see this in the IBM Query/400 Use (Version 3) 
manual.  Did I miss something?

I look forward to your comments.

Regards,
Al Gershen
ECS Composites
Grants Pass, OR
aldg3@xxxxxxxxxxx
_______________________________________________
This is the MAPICS ERP System Discussion (MAPICS-L) mailing list
To post a message email: MAPICS-L@xxxxxxxxxxxx
To subscribe, unsubscribe, or change list options,
visit: http://lists.midrange.com/mailman/listinfo/mapics-l
or email: MAPICS-L-request@xxxxxxxxxxxx
Before posting, please take a moment to review the archives
at http://archive.midrange.com/mapics-l.
