On 12 January 2018 at 09:01, Jay Vaughn <jeffersonvaughn@xxxxxxxxx> wrote:
So the only way I know to detect bad data using SQL is when doing the
actual insert of the data into another file.
I have walked many, many miles in these shoes.
The only way to solve the issue permanently is to edit the data before
it goes into the file. That, and find the last remnants of
program-described files, make them externally described, and stop
putting blanks in packed fields :-) This is a lot of work for very
little perceived benefit, but if you can't trust the data, what's the
point?
Now that you've found and rectified the programs that are corrupting
the data, you will need to go through the database and identify the
corrupted rows.
Is there "select" way of doing it instead?
Yes. ish.
-- X'40' is the EBCDIC blank, so a 5-byte packed column that was
-- filled with blanks hexes out to '4040404040'
select keycolumn, hex(prefix)
from ldataicl.srvfm50
where hex(prefix) = '4040404040'
order by keycolumn;
The trick here, if you will, is to keep the database manager from
trying to interpret PREFIX as a number.
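Once the all-blank rows are identified, the same hex() trick drives
the repair. A minimal sketch, assuming zero is the right substitute
for a PREFIX that was never set:

-- hex() in the WHERE clause avoids interpreting the bad packed data;
-- the SET only writes, so the decimal data error never fires
update ldataicl.srvfm50
set prefix = 0
where hex(prefix) = '4040404040';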
The problem is that PREFIX might not be blanks. It might be something
else, like 12.34 with the decimal point character actually stored in
the packed bytes. For those situations, I've 'scanned' for '4B' (the
EBCDIC period). The situation rapidly gets uglier as more patterns
are discovered.
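A sketch of that scan, reusing the file and column from the query
above. LOCATE hunts for the '4B' pair anywhere in the hexed value, so
it can match some legitimately valid values too; eyeball the results
before acting on them:

-- find rows whose packed bytes contain X'4B', the EBCDIC period
select keycolumn, hex(prefix)
from ldataicl.srvfm50
where locate('4B', hex(prefix)) > 0
order by keycolumn;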
In my opinion, the best generic way to find these devils in all
columns of a file is to write a program that writes a program. Query
the numeric columns out of SYSCOLUMNS where the table is 'SRVFM50';
that catalog will tell you which columns are numeric. Then use RPG
record level access (Input Primary) to read the file, and for each of
the numeric columns, call a sub-procedure that tests whether the bytes
actually hold a valid number.
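A minimal sketch of the generator idea, done in SQL rather than RPG:
it emits one all-blanks scan per numeric column, using the STORAGE
column of QSYS2.SYSCOLUMNS for the byte width. The LDATAICL schema is
carried over from the earlier query; adjust for your library.

-- emit one scan statement per DECIMAL/NUMERIC column; each generated
-- statement finds rows where that column is nothing but EBCDIC blanks
select 'select keycolumn, hex(' concat column_name concat
       ') from ldataicl.srvfm50 where hex(' concat column_name concat
       ') = ''' concat repeat('40', storage) concat ''';'
from qsys2.syscolumns
where table_schema = 'LDATAICL'
  and table_name = 'SRVFM50'
  and data_type in ('DECIMAL', 'NUMERIC')
order by ordinal_position;

Run the emitted statements one at a time; any rows they return are the
all-blank offenders. Patterns like the embedded decimal point still
need the LOCATE-style scan above.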
That sounds like a lot more work than you were hoping to do, and I'm
sorry to be the messenger.
Also, when a select is done in STRSQL over the file, the bad data
shows up as "+++++", so I just need to know how STRSQL detects it and
shows "+++++" instead of the data.
As far as I know, we can't turn that off.
--buck