Vern's on the right track, but I believe it's more than a single-column difference. In

    select * from ppay408a a inner join
      (select tmlssn, tmljob, tmlped, tmldat from ppay408a
       group by tmlssn, tmljob, tmlped, tmldat
       having count(*) > 1) as dups

the SELECT * causes your output to consist of every column of ppay408a plus every column of dups (4 more). If you're inserting into a duplicate of ppay408a, then you may want to use SELECT A.* to get just the columns from ppay408a. But I'm not sure what good that will do: the new table will contain the same data as the old. How about the following:

    insert into ptmp408a (
      with dups as
        (select tmlssn, tmljob, tmlped, tmldat from ppay408a
         group by tmlssn, tmljob, tmlped, tmldat
         having count(*) > 1),
      first as
        (select a.tmlssn, a.tmljob, a.tmlped, a.tmldat,
                min(a.tmllin) as line
         from ppay408a a, dups
         where a.tmlssn = dups.tmlssn
           and a.tmljob = dups.tmljob
           and a.tmlped = dups.tmlped
           and a.tmldat = dups.tmldat
         group by a.tmlssn, a.tmljob, a.tmlped, a.tmldat)
      select a.*
      from ppay408a a, first
      where a.tmlssn = first.tmlssn
        and a.tmljob = first.tmljob
        and a.tmlped = first.tmlped
        and a.tmldat = first.tmldat
        and a.tmllin = first.line
    )

I'm sure there's an easier way to do this, but for quick and dirty it should do what you want.

HTH,
Charles

> -----Original Message-----
> From: Vern Hamberg [mailto:vhamberg@xxxxxxxxxxxxxxxxxxxxxxxxx]
> Sent: Saturday, May 22, 2004 9:08 AM
> To: Midrange Systems Technical Discussion
> Subject: Re: Hairy SQL issue with duplicate (almost) records
>
> Hi, Pete
>
> The second-level text on the message gives you the possible reasons for
> this error. It's worth taking a look (F1 is your best friend). In this
> case, your subselect has 4 columns, and your table has 5.
>
> At least 4 options:
>
> 1. Don't do the CRTDUPOBJ. Rather, from interactive SQL, press F13 and
> take option 1 to change the output type to file. The first time, set it
> to create the file.
> Then turn your subselect into a select (run it by itself) - next time,
> set the option to replace the member, or whatever works.
>
> 2. Don't do the CRTDUPOBJ. Rather, put the subselect, by itself, into a
> *QMQRY. You can use STRQM in SQL rather than PROMPT mode. Or put the
> statement into a RCDLEN(91) source file and use CRTQMQRY. Then use
> STRQMQRY, which has an OUTPUT(*OUTFILE) option and has OUTMBR(*FIRST
> *REPLACE) as well, just like CPYF.
>
> 3. Don't use CRTDUPOBJ. (Is this getting boring? Nothing wrong with it,
> as far as it goes, but it is not necessary.) Get Martin Rowe's very
> excellent EXCSQL command at <www.dbg400.net/excsql.html>. This lets you
> run an SQL statement from a command line. It has OUTPUT options, also,
> since it's based on QMQRYs. There are others: Midrange Computing had
> one, I wrote my own once, Buck Calabro has written one.
>
> 4. Use CRTDUPOBJ. In interactive SQL, run the ALTER TABLE statement to
> DROP the offending column. Then use your INSERT INTO.
>
> I strongly recommend QM queries. In the long term, they are more
> flexible than RUNSQLSTM or Query/400 queries. There are QM forms that
> give you the layout for reports. There are several better options in QM
> forms than in QRYDFNs. And, perhaps best of all, you can use
> substitution variables. QRYDFNs have this, too, but are not callable
> from CL with the values. The secret of Martin's command, et al., is to
> use a QMQRY with only substitution variables.
>
> HTH
> Vern
>
> At 08:51 PM 5/21/2004, you wrote:
> >This is the classic duplicate record issue where my customer posted
> >data twice and now has some duplicate data in a file. The file is
> >keyed such that a "duplicate" really doesn't occur, because there is a
> >line number in the record that can be incremented by the RPG program
> >when a duplicate is entered (there are reasons for this... which has
> >come back to haunt me).
> >
> >So I have a table (ppay408a) that looks like:
> >
> >tmlssn, tmljob, tmlped, tmldat, tmllin, etc.
> >
> >A "duplicate" is when the first four columns have identical values and
> >the line number (tmllin) is different. So I wrote an SQL statement
> >that shows me the duplicate records:
> >
> >select * from ppay408a a inner join
> >(select tmlssn,tmljob,tmlped,tmldat from ppay408a
> >group by tmlssn,tmljob,tmlped,tmldat
> >having count(*) > 1) as dups
> >on a.tmlssn = dups.tmlssn
> >and a.tmljob = dups.tmljob
> >and a.tmlped = dups.tmlped
> >and a.tmldat = dups.tmldat
> >
> >Sweet! All my duplicated records are there. So I want to write them to
> >a file, so I change the statement to be:
> >
> >insert into ptmp408a
> >select * from ppay408a a inner join
> >(select tmlssn,tmljob,tmlped,tmldat from ppay408a
> >group by tmlssn,tmljob,tmlped,tmldat
> >having count(*) > 1) as dups
> >on a.tmlssn = dups.tmlssn
> >and a.tmljob = dups.tmljob
> >and a.tmlped = dups.tmlped
> >and a.tmldat = dups.tmldat
> >
> >And I get an SQL error (SQL0117) saying "Statement contains wrong
> >number of values."
> >
> >Is that because of the subselect? Something else? ptmp408a was created
> >using the CRTDUPOBJ command, so the tables are identical, aren't they?
> >
> >Second question: is there a better way to go about this (without
> >writing a program to do it, which is my alternative)?
> >
> >Thanks,
> >
> >Pete Helgren
> >Value Added Software, Inc.
> >801.581.1154 x202
>
> _______________________________________________
> This is the Midrange Systems Technical Discussion (MIDRANGE-L) mailing list
> To post a message email: MIDRANGE-L@xxxxxxxxxxxx
> To subscribe, unsubscribe, or change list options,
> visit: http://lists.midrange.com/mailman/listinfo/midrange-l
> or email: MIDRANGE-L-request@xxxxxxxxxxxx
> Before posting, please take a moment to review the archives
> at http://archive.midrange.com/midrange-l.
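[Archive note] Charles's guess that "there's an easier way" holds up on SQL engines that support window functions (later DB2 for i releases do; the 2004-era system in this thread likely did not). ROW_NUMBER() OVER (PARTITION BY the four key columns) picks one representative line per duplicated key in a single statement. A minimal runnable sketch, using Python's sqlite3 as a stand-in engine and the thread's table and column names; the sample rows are made up for illustration:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Toy version of ppay408a: the four key columns plus the tie-breaking line number.
cur.execute("CREATE TABLE ppay408a (tmlssn, tmljob, tmlped, tmldat, tmllin)")
cur.executemany(
    "INSERT INTO ppay408a VALUES (?, ?, ?, ?, ?)",
    [
        ("111", "J1", "P1", "D1", 1),  # duplicated key, line 1
        ("111", "J1", "P1", "D1", 2),  # duplicated key, line 2
        ("222", "J2", "P1", "D1", 1),  # unique key, should not be selected
    ],
)

# For each key group: rn numbers the rows by tmllin, cnt counts the group size.
# Keeping rn = 1 where cnt > 1 yields the lowest-numbered line of each
# duplicated key, the same rows Charles's dups/first CTEs produce.
rows = cur.execute("""
    SELECT tmlssn, tmljob, tmlped, tmldat, tmllin
    FROM (SELECT a.*,
                 ROW_NUMBER() OVER (PARTITION BY tmlssn, tmljob, tmlped, tmldat
                                    ORDER BY tmllin) AS rn,
                 COUNT(*) OVER (PARTITION BY tmlssn, tmljob, tmlped, tmldat) AS cnt
          FROM ppay408a a)
    WHERE cnt > 1 AND rn = 1
""").fetchall()

print(rows)  # [('111', 'J1', 'P1', 'D1', 1)]
```

The same SELECT, prefixed with INSERT INTO ptmp408a, would populate the work file in one statement, with no correlated GROUP BY join needed.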