After a more thorough review of my comments in my most recent\prior reply in this thread, and after searching and reviewing some of my SQL source, I recall that the complete solution depends upon the precision of the packed decimal column. I had alluded to that in some prior replies, but I failed to account for it in my solution. So unfortunately, the solution given in my prior message is valid *only* for odd-precision packed decimal, because as coded, nothing ensures the first nibble is always zero for an even-precision column :-(
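
To illustrate why the precision matters, here is a minimal sketch [assuming a throw-away table PACKDEMO in QTEMP; the names are illustrative]: an even-precision packed column occupies the same number of bytes as the next higher odd precision, with a leading pad nibble that must always be zero:

≥ create table qtemp.packdemo ( odd5 dec(5, 0), even4 dec(4, 0) );
≥ insert into qtemp.packdemo values( 123, 123 );
≥ select hex(odd5), hex(even4) from qtemp.packdemo
≥ -- both typically give '00123F' [three bytes; F is the positive sign];
≥ -- for the DEC(4, 0) the first nibble is merely a pad, and must be zero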

Some time in the past I was preparing a UDF as a solution, but I never offered it [and probably that is why I could not find one in a web search]. My progress was stopped because the SQL does not provide any scalar functions for PRECISION or SCALE; my comments from an excursion into that territory very long ago suggest as much. I was apparently going to write a PROCEDURE that would create a VIEW in QTEMP, generating the appropriate expressions with the assistance of the SYSCOLUMNS data to obtain the precision, but I never continued with that. That approach is actually nicer, because it can more easily handle all BCD fields; i.e. generate the validation for every BCD field as part of a VIEW that could be queried. A UDF would need to have the precision passed as a parameter along with the HEX() of the column, because there is no way to know whether, for example, a five-byte string in hex notation is for an 8P##\Dec(8,##) or a 9P##\Dec(9,##).
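
For reference, a hedged sketch of how such a generator might obtain the precision from the catalog [assuming the file is DATAFILE in library MYLIB; both names are illustrative]:

≥ select column_name, numeric_precision, numeric_scale
≥   from qsys2.syscolumns
≥   where table_schema = 'MYLIB'
≥     and table_name = 'DATAFILE'
≥     and data_type = 'DECIMAL' /* packed BCD columns */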

I did a head-slap when I later realized that DIGITS could give me the answer for PRECISION. However, I was harshly reminded that although the precision is available via the expression "LENGTH(DIGITS(dec_column))", that expression is not helpful in a bad decimal data scenario. That is because, just like every other SQL [or UDF] scalar other than HEX, the database SQL will /choke/ on the bad decimal data :-(
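
For clarity, the expression applied to a literal, which sidesteps the bad-data problem entirely [SYSIBM/SYSDUMMY1 is the standard one-row table]:

≥ select length(digits(dec(0, 7, 2))) from sysibm.sysdummy1
≥ -- returns 7; DIGITS of a DEC(7, 2) yields a CHAR(7),
≥ -- so its LENGTH is the precision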

However, with the creative use of another query against the same column being interrogated for bad decimal data, the precision can still be obtained within the same query doing the interrogation. Thus I will provide a correction that checks validity of values for all supported precisions, both odd-precision and even-precision [although I did not specifically review the efficacy for the odd case of a precision of one; I assume it works].
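
The trick, isolated here as a sketch: an aggregate over a never-true predicate yields a NULL that still carries the column's data type, without ever decoding a row of data, and IFNULL preserves that type [assuming the scale is less than the precision] so DIGITS can report the precision:

≥ select mod(length(digits(ifnull(m.OrderQty, dec(0, 1)))), 2) odd_prec
≥   from ( select max(OrderQty) OrderQty
≥          from Datafile where 1<>1 ) m
≥ -- odd_prec is 1 for an odd-precision OrderQty, else 0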

But first, a reminder of the first query, which finds the RRNs of the rows with bad decimal data in column OrderQty [but only when that column is odd-precision], along with the row data; e.g. to be used in a report:

≥ select rrn(a) , a.*
≥   from Datafile as a
≥   where right(hex(a.OrderQty), 1) not in ('A','B','C','D','E','F')
≥      or translate( left(hex(a.OrderQty), length(hex(a.OrderQty))-1)
≥                  , ' '          /* decimal digits to blanks; blank pad */
≥                  , '0987654321' ) <> ''
≥ -- invalid sign or any digits are not 0-9 [i.e. any digit is A-F]

And the inverse selection logic, which gets just the rows of the file with valid decimal values [and thus no longer selects the RRNs] in the column OrderQty being interrogated [but again, only when that column is odd-precision]; e.g. used to copy only the good rows, as alluded to in the OP and other replies [a sketch of the copy follows the query]:

≥ select a.*
≥   from Datafile as a
≥   where right(hex(a.OrderQty), 1) in ('A','B','C','D','E','F')
≥     and translate( left(hex(a.OrderQty), length(hex(a.OrderQty))-1)
≥                  , ' '          /* decimal digits to blanks; blank pad */
≥                  , '0987654321' ) = ''
≥ -- valid sign nibble and all digits are decimal 0-9
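
As a sketch of that copy scenario [assuming a target file GoodFile with the same record format; the name is illustrative], the same selection can feed an INSERT:

≥ insert into GoodFile
≥ select a.*
≥   from Datafile as a
≥   where right(hex(a.OrderQty), 1) in ('A','B','C','D','E','F')
≥     and translate( left(hex(a.OrderQty), length(hex(a.OrderQty))-1)
≥                  , ' ' , '0987654321' ) = ''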

Now for the complete validation, irrespective of even-precision vs odd-precision, used to select all rows with any type of bad decimal data in the column OrderQty being validated\interrogated [regardless of whether presentation or anything else might effectively overlook\ignore the decimal data error]:

≥ select rrn(a) , a.*
≥   from Datafile as a
≥      , ( select
≥            mod(length(digits(ifnull(m.OrderQty, dec(0, 1)))), 2) odd_prec
≥          from ( select max(OrderQty) OrderQty
≥                 from Datafile where 1<>1 ) m
≥        ) b
≥   where right(hex(a.OrderQty), 1) not in ('A','B','C','D','E','F')
≥      or translate( left(hex(a.OrderQty), length(hex(a.OrderQty))-1)
≥                  , ' '          /* decimal digits to blanks; blank pad */
≥                  , '0987654321' ) <> ''
≥      or ( b.odd_prec = 0        /* even-precision */
≥           and left(hex(a.OrderQty), 1) <> '0' )
≥ -- invalid sign or any digits are not 0-9 [i.e. any digit is A-F]
≥ -- or first digit is not 0 for an even-precision

Similarly, the complete validation, irrespective of even-precision vs odd-precision, used to select only the rows with no bad decimal data in the column OrderQty being validated\interrogated:

≥ select a.*
≥   from Datafile as a
≥      , ( select
≥            mod(length(digits(ifnull(m.OrderQty, dec(0, 1)))), 2) odd_prec
≥          from ( select max(OrderQty) OrderQty
≥                 from Datafile where 1<>1 ) m
≥        ) b
≥   where right(hex(a.OrderQty), 1) in ('A','B','C','D','E','F')
≥     and translate( left(hex(a.OrderQty), length(hex(a.OrderQty))-1)
≥                  , ' '          /* decimal digits to blanks; blank pad */
≥                  , '0987654321' ) = ''
≥     and ( b.odd_prec = 1        /* odd-precision; validation done */
≥           or ( b.odd_prec = 0   /* even-precision */
≥                and left(hex(a.OrderQty), 1) = '0' )
≥         )
≥ -- valid sign nibble and all digits are decimal 0-9
≥ -- and either odd-precision [so first digit already validated]
≥ -- or the first digit is a 0 for an even-precision


kwds: sql validate packed decimal data udf create function

