


Darryl

Man, this is mysterious - and probably has a simple answer!

Anyhow, here's another stab at it.

You say the other data is not important - is that correct? So all you really want is one record per item number, with anything at all in the other fields?

If so, try this brute-force method - use CPYF MBROPT(*ADD) to put EVERYTHING into a work file.

Then use MIN() functions against all fields except item number.

SELECT ITEM, MIN(FIELD1), MIN(FIELD2), ... FROM WORKFILE GROUP BY ITEM

This should give you exactly one record for each item number, though the other fields may end up with a mixed set of values drawn from different source records.
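To make that concrete, here's a sketch of the whole sequence. The column names FIELD1 and FIELD2 are placeholders for illustration - substitute your real non-key columns:

```sql
-- Append each of the five company files into the work file first, e.g.:
--   CPYF FROMFILE(FILE_A) TOFILE(WORKFILE) MBROPT(*ADD)
-- (repeat the CPYF for each source file)

-- Then collapse to one row per ITEM; MIN() picks a single value
-- for each non-key column, so duplicates cannot survive the GROUP BY.
INSERT INTO COMMON_FILE
SELECT ITEM, MIN(FIELD1), MIN(FIELD2)
FROM WORKFILE
GROUP BY ITEM
```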

HTH
Vern

On 10/24/2014 11:56 AM, Darryl Freinkel wrote:
Please see my comments inline below:

1) COMMON_FILE is empty when you start -----YES
2) ITEM values are unique in FILE_A -----YES

Then I think all the records in FILE_A should be inserted into
COMMON_FILE. Does this happen? -----YES it does

How are you proving "The exists is not working and as a result the insert
is inserting duplicates"? Are you sure you are doing this correctly?
--------Before doing the next insert, I list all duplicates. There are
none. After running the INSERT, I run the duplicate check again, and there
are now duplicates.


You can check for duplicate ITEMS simply with something like this:
SELECT ITEM,COUNT(*)
FROM COMMON_FILE
GROUP BY ITEM
HAVING COUNT(*) > 1

---- This is already being done.



On Thu, Oct 23, 2014 at 9:57 PM, Sam_L <lennon_s_j@xxxxxxxxxxx> wrote:

So, given:

1) COMMON_FILE is empty when you start
2) ITEM values are unique in FILE_A

Then I think all the records in FILE_A should be inserted into
COMMON_FILE. Does this happen?

How are you proving "The exists is not working and as a result the insert
is inserting duplicates"? Are you sure you are doing this correctly?

You can check for duplicate ITEMS simply with something like this:
SELECT ITEM,COUNT(*)
FROM COMMON_FILE
GROUP BY ITEM
HAVING COUNT(*) > 1

Sam

On 10/23/2014 10:56 AM, Darryl Freinkel wrote:

I need some help on this one.

I am merging 5 company records into 1 file and need to drop any
duplicates.
This is the statement:
INSERT INTO COMMON_FILE
(SELECT * FROM FILE_A A1
WHERE NOT EXISTS (SELECT 1 FROM COMMON_FILE A2 WHERE A1.ITEM = A2.ITEM))

Problem:
The exists is not working and as a result the insert is inserting
duplicates.

What alternative ways are there to achieve this merge or what mistake am I
not seeing?
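One common cause of this symptom is duplicate ITEM values within the source file itself: the NOT EXISTS subquery only sees rows that were in COMMON_FILE when the statement started, so two source rows with the same ITEM both pass the test and both get inserted. A hedged alternative (again using FIELD1 and FIELD2 as placeholder column names) is to dedupe the source in the same statement:

```sql
-- Keep one row per ITEM from the source file, and skip any item
-- already present in the target before this statement started.
INSERT INTO COMMON_FILE
SELECT ITEM, MIN(FIELD1), MIN(FIELD2)
FROM FILE_A A1
WHERE NOT EXISTS
      (SELECT 1 FROM COMMON_FILE A2 WHERE A2.ITEM = A1.ITEM)
GROUP BY ITEM
```

Even if ITEM really is unique in each source file, the GROUP BY costs little and makes each of the five statements safe to rerun.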

I have 5 similar SQL statements to run.

TIA




--
This is the Midrange Systems Technical Discussion (MIDRANGE-L) mailing list
To post a message email: MIDRANGE-L@xxxxxxxxxxxx
To subscribe, unsubscribe, or change list options,
visit: http://lists.midrange.com/mailman/listinfo/midrange-l
or email: MIDRANGE-L-request@xxxxxxxxxxxx
Before posting, please take a moment to review the archives
at http://archive.midrange.com/midrange-l.







This mailing list archive is Copyright 1997-2024 by midrange.com and David Gibbs as a compilation work. Use of the archive is restricted to research of a business or technical nature. Any other uses are prohibited. Full details are available on our policy page. If you have questions about this, please contact [javascript protected email address].
