You're spot on - this all depends on what the meaning of IS is - or rather, of duplicate. The QM solution assumes absolutely identical across the entire row - timestamps would clearly make that difficult, at least.


Luis Rodriguez wrote:

Nice idea, hadn't thought of using QM. The reason I suggested the view
was that I thought that there could be some fields, like timestamp
fields, that would not work with a SELECT DISTINCT (here, we timestamp
everything, so there are not two exact records (I hope :-) )).

He could also (*maybe*) output the SELECT DISTINCT statement to a temp
file, and then copy that data into his original file.

We really need a little more info about his files...

Luis Rodriguez
IBM Certified Systems Expert — eServer i5 iSeries

On Tue, Sep 1, 2009 at 8:43 AM, Vern Hamberg <vhamberg@xxxxxxxxxxx> wrote:

A very cool technique, albeit a little scary, is to use a query
management query. You can run a SELECT statement with a DISTINCT in the
field list, then have it put the result right back into the PF. Like this -
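For example, assuming a QM query named DEDUP in MYLIB and a physical file MYLIB/MYPF (placeholder names - substitute your own):

```
STRQMQRY QMQRY(MYLIB/DEDUP) OUTPUT(*OUTFILE) +
           OUTFILE(MYLIB/MYPF) OUTMBR(*FIRST *REPLACE)
```

The OUTMBR second element of *REPLACE is what clears the member and writes the distinct result back over it.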


The statement would be
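Presumably something along these lines (again, the file name is a placeholder):

```sql
SELECT DISTINCT * FROM MYLIB/MYPF
```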


To do this,
- create a source file called QQMQRYSRC with record length = 91 (yes, 91)
- add a member to it, call it what you will
- put the above SELECT statement into it on one record - obviously use
the name of the PF you want to clean up
- save the member
- run CRTQMQRY using that source member as the source - you'll see where
to specify it if you prompt CRTQMQRY
- run STRQMQRY as shown above, again, putting your library and object
names in it
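The steps above can be sketched in command form (all names here are placeholders):

```
CRTSRCPF FILE(MYLIB/QQMQRYSRC) RCDLEN(91)
ADDPFM FILE(MYLIB/QQMQRYSRC) MBR(DEDUP)
/* edit member DEDUP so it contains the SELECT DISTINCT
   statement against your PF, then: */
CRTQMQRY QMQRY(MYLIB/DEDUP) SRCFILE(MYLIB/QQMQRYSRC) SRCMBR(DEDUP)
STRQMQRY QMQRY(MYLIB/DEDUP) OUTPUT(*OUTFILE) +
           OUTFILE(MYLIB/MYPF) OUTMBR(*FIRST *REPLACE)
```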

Since you probably have STRSQL, you could use STRQM instead for the
first 5 steps - no need to have a source file. Just be sure, in STRQM
option 1, to change it from PROMPT to SQL - F19 will do that.

You can even run the query from STRQM option 1 - option 9 there is the
ticket - just specify output number 3, I think, for a file.

There is no safety net in this process - it does not warn you that you
are replacing everything, so back up, back up, back up!


jmmckee wrote:
I made a mistake today. I accidentally ran a posting program three additional times. I normally try to automate things so dumb things like this don't happen. But this has not been a project developed in the proper manner - meaning GET IT DONE!

The file is a join logical. Common fields are 1) hospital number, 2) account, 3) date, 4) time, and 5) id of poster. The first four fields are numeric and the last is alpha.

The two files are a main file with 30 characters of text, and a second file that contains multiple records of additional text.

Something like 5000 records posted three extra times.

The problem will disappear in about 6 months, if nothing is done. Just looks bad and a lot of people will see my blunder and have to deal with it.

Can SQL be used to identify when there are multiple records for the same hospital and account number that have identical date, time, and poster id?
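In SQL terms, the duplicate groups can be found with GROUP BY and HAVING. A sketch, with guessed column and file names (substitute the real ones):

```sql
-- list each hospital/account/date/time/poster combination
-- that occurs more than once, along with its count
SELECT HOSPNO, ACCTNO, POSTDATE, POSTTIME, POSTID, COUNT(*) AS CNT
  FROM MYLIB/POSTPF
 GROUP BY HOSPNO, ACCTNO, POSTDATE, POSTTIME, POSTID
HAVING COUNT(*) > 1
```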

Can SQL be used to remove three of the four multiples, keeping one?
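One way, assuming DB2 for i: the relative record number, RRN(), distinguishes otherwise-identical rows, so everything but the first copy of each group can be deleted. Again with guessed column and file names:

```sql
-- delete every row whose RRN is higher than the lowest RRN
-- in its duplicate group, keeping exactly one copy of each
DELETE FROM MYLIB/POSTPF A
 WHERE RRN(A) > (SELECT MIN(RRN(B))
                   FROM MYLIB/POSTPF B
                  WHERE B.HOSPNO   = A.HOSPNO
                    AND B.ACCTNO   = A.ACCTNO
                    AND B.POSTDATE = A.POSTDATE
                    AND B.POSTTIME = A.POSTTIME
                    AND B.POSTID   = A.POSTID)
```

For testing against one specific account first, add AND A.ACCTNO = nnnn to the outer WHERE clause.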

Can SQL be used to do the previous two operations, but for one specific account - for testing?

Would it be easier if the original posting file was available? My assumption is yes, and that file can be made available.

This doesn't affect any totals. Just looks really bad.

Seems this would be pretty simple in RPG, especially with the original file available. Just don't know if I can get the time to devote to this.

Is this something a) best left alone, b) best fixed with SQL, or c) best fixed with RPG?

Easiest approach would be to back out all the records and then post once, I am guessing.

Wish I hadn't made the mess, in any case.

John McKee

This is the Midrange Systems Technical Discussion (MIDRANGE-L) mailing list
To post a message email: MIDRANGE-L@xxxxxxxxxxxx
To subscribe, unsubscribe, or change list options,
or email: MIDRANGE-L-request@xxxxxxxxxxxx
Before posting, please take a moment to review the archives
