On 08-Jul-2016 09:14 -0500, CRPence wrote:
On 08-Jul-2016 00:38 -0500, Birgitta Hauser wrote:
On Friday, 08.7 2016 00:15 CRPence wrote:

<<SNIP and combine doc link with contiguous quoted snippet:>>

[http://www.ibm.com/support/knowledgecenter/api/content/ssw_ibm_i_71/db2/rbafzscatsformat.htm]

"A substring of the string-expression representing a component of
a ≥date or≤ timestamp (such as year, month, day, hour, minutes,
seconds) can include less than the maximum number of digits for
that component of the ≥date or≤ timestamp. Any missing digits
default to zero. For example, with a format-string of
'YYYY-MM-DD HH24:MI:SS', an input value of '999-3-9 5:7:2' would
produce the same result as '0999-03-09 05:07:02'.

Leading zeroes can be specified for any component of the ≥date
or≤ timestamp value (for example, month, day, hour, minutes,
seconds) that does not have the maximum number of significant
digits for the corresponding format element in the
format-string.
..."

<<SNIP>>

I wonder if anyone can get a valid TIMESTAMP [rather than the
data mapping errors I get for the TO_DATE expression <<SNIP>>]
when running the following query on any of [a somewhat
up-to-date] v7r1, v7r2, or v7r3:

with dt2v (c8) as (
  values ('01/02/03'), ( '8/6/14'), ( '5/10/15')
       , ('09/7/16' ), ( '3/4/7' ), ('12/18/30')
       , ('12/30/42') )
select substr(c8, 1, 08) as c8
     , to_date(substr(c8, 1, 08), 'MM/DD/YY') as c8_to_date
     , date( insert( c8
                   , locate('/', c8, 4) + 1
                   , case when length(rtrim(c8)) > 5
                          then 0 end
                   , case when right( rtrim(c8), 2 )
                               between '00' and '39'
                          then '20'
                          else '19'
                     end
                   )
           ) as usa_date_fmt
from dt2v
; -- STRSQL report I get, running with ISO date formatting:
....+....1....+....2....+....3....+....4....+....5
C8        C8_TO_DATE                  USA_DATE_FMT
01/02/03  2003-01-02-00.00.00.000000  2003-01-02
8/6/14    ++++++++++++++++++++++++++  2014-08-06
5/10/15   ++++++++++++++++++++++++++  2015-05-10
09/7/16   ++++++++++++++++++++++++++  2016-09-07
3/4/7     ++++++++++++++++++++++++++  -
12/18/30  2030-12-18-00.00.00.000000  2030-12-18
12/30/42  2042-12-30-00.00.00.000000  1942-12-30
******** End of data ********
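FWiW, the intent of that usa_date_fmt expression can be sketched outside of SQL. Here is a rough Python equivalent [my own illustration; the function name usa_date is invented, and the two-digit-year length check approximates the SQL length(rtrim(c8))>5 test]:

```python
from datetime import date
from typing import Optional

def usa_date(c8: str) -> Optional[date]:
    """Mimic the SQL insert/locate/case expression above:
    'MM/DD/YY' input, leading zeros optional on month and day,
    century chosen by a 00-39 pivot on the two-digit year."""
    parts = c8.strip().split('/')
    if len(parts) != 3:
        return None
    mm, dd, yy = parts
    if len(yy) != 2:
        # e.g. '3/4/7' has no usable 2-digit year; the SQL CASE
        # yields NULL for that row as well
        return None
    century = '20' if '00' <= yy <= '39' else '19'
    return date(int(century + yy), int(mm), int(dd))
```

Run against the example values, that sketch produces the same USA_DATE_FMT column as the report: 2003-01-02, 2014-08-06, ... and None [shown as '-'] for '3/4/7'.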


Of course, given the prior v7r1 doc reference snippet I included
<<SNIP>>, I am expecting instead, from what I infer that doc
suggests, the following report, without any mapping errors:

....+....1....+....2....+....3....+....4....+....5
C8        C8_TO_DATE                  USA_DATE_FMT
01/02/03  2003-01-02-00.00.00.000000  2003-01-02
8/6/14    2014-08-06-00.00.00.000000  2014-08-06
5/10/15   2015-05-10-00.00.00.000000  2015-05-10
09/7/16   2016-09-07-00.00.00.000000  2016-09-07
3/4/7     2007-03-04-00.00.00.000000  -
12/18/30  2030-12-18-00.00.00.000000  2030-12-18
12/30/42  2042-12-30-00.00.00.000000  1942-12-30
******** End of data ********

<ed: reworded; to emphasize example is from the doc ref above>
Revising the above CTE and TO_DATE usage to match the expression
and example data from the docs, for their allusion to support for
date\timestamp components lacking leading zeroes, I get exactly
the result the docs suggest [even whilst being a bit more
aggressive on the YYYY component than their example data];
happily, without data mapping errors!:

with dt2v (cv) as (
  values ('9-3-9 5:7:2'         )
       , ('99-3-9 5:7:2'        )
       , ('999-3-9 5:7:2'       )
       , ('0999-03-09 05:07:02' ) )
select substr(cv, 1, 19) as cv
     , to_date( substr(cv, 1, 19)
              , 'YYYY-MM-DD HH24:MI:SS' )
       as cv_to_ts
from dt2v
; -- the report I get follows:

....+....1....+....2....+....3....+....4....+..
CV                   CV_TO_TS
9-3-9 5:7:2          0009-03-09-05.07.02.000000
99-3-9 5:7:2         0099-03-09-05.07.02.000000
999-3-9 5:7:2        0999-03-09-05.07.02.000000
0999-03-09 05:07:02  0999-03-09-05.07.02.000000
******** End of data ********
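As a point of comparison [my own aside, nothing to do with DB2 internals], Python's strptime applies much the same leniency the DB2 docs describe for components lacking leading zeros, though CPython's %Y directive, unlike the documented YYYY format element, wants a full four-digit year:

```python
from datetime import datetime

# Month, day, and time components without leading zeros parse fine;
# only the year here is fully padded, since CPython's %Y requires
# exactly four digits (stricter than the DB2 doc's YYYY claim).
ts = datetime.strptime('2016-9-7 5:7:2', '%Y-%m-%d %H:%M:%S')
print(ts)  # 2016-09-07 05:07:02
```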

So <ed: that the above example is functional> leaves me
wondering "Why the failures using my earlier examples?"

p.s. Despite the claim by those docs that "Separator characters
can also be specified at the start or end of format-string.
These separator characters can be used in any combination in the
format string, for example 'YYYY/MM-DD HH:MM.SS'. Separator
characters specified in a string-expression are used to separate
components and are not required to match the separator characters
specified in the format-string.", simple changes to use '/' vs
'-' often cause the pseudo-builtin to fail as the
[system-supplied] UDF that it is, with msg CPF426B rc13
F/QQQSVRTN FM/QQINVUDF FP/SIGNALUDFERROR and an implication that
I should "Ensure that the format string correctly describes the
timestamp string specified." That is ironic, as it seems almost
to suggest that I need to match the separators, when clearly the
doc implies *not*! The UDF seems, IMO, to be far too fickle to be
trusted generally; I would tend to avoid using the feature in
production, especially if the input data might not be very
exacting and well tested with that UDF.

If you are working with a character representation of a date in
a format with a 2-digit year, leading zeros are required for the
day and month specifications (this holds for all date conversion
methods in SQL and even RPG).


I concede that "Leading zeros can be omitted from the month and day
portions [only] when using the IBM® SQL standard formats", all of
which are 4-digit-year formats, holds true for the standard\typical
casting scalars from a Date String. But that is not germane. Re
Date Strings, see:

IBM i->IBM i 7.1->Database->Reference->SQL reference->Language
elements->Data types->Datetime values->String representations of
datetime values->Date strings
[http://www.ibm.com/support/knowledgecenter/api/content/ssw_ibm_i_71/db2/rbafzdatestrings.htm]


Note however the documentation for TIMESTAMP_FORMAT, for which "The
TIMESTAMP_FORMAT function returns a timestamp that is based on the
interpretation of the input string using the specified format."
That function does *not* require any of those formal\standard Date
String variations, because the pseudo-builtin scalar allows a
/format-string/ to be specified, with various /components/ that are
[best I can infer] *conspicuously documented* to support the lack
of a leading zero when /interpreted/. In fact there are examples
where this function clearly handles such data, even for the year
component; e.g. little different from my prior failing example:

with dt2v (cv) as (
  values ('1.2.3'), ('5.6.11')
       , ('99.3.9 ') )
select substr(cv, 1, 07) as cv
     , to_date(substr(cv, 1, 10), 'YY.MM.DD')
       as cv_to_ts
from dt2v
; -- report from the above query:
....+....1....+....2....+....3....+
CV       CV_TO_TS
1.2.3    2001-02-03-00.00.00.000000
5.6.11   2005-06-11-00.00.00.000000
99.3.9   2099-03-09-00.00.00.000000
******** End of data ********

And simply changing the /separator/ character in any one value,
which is documented [again quite conspicuously] to be effectively
irrelevant to the capability\functionality of the pseudo-builtin
scalar UDF, suddenly sees the function fail to convert that value:

with dt2v (cv) as (
  values ('1.2.3'), ('5.6.11')
       , ('99-3-9 ') )
select substr(cv, 1, 07) as cv
     , to_date(substr(cv, 1, 10), 'YY.MM.DD')
       as cv_to_ts
from dt2v
; -- report from the above query:
*none*; instead: msg SQ20448 "Expression not valid using format
string specified for TIMESTAMP_FORMAT." per msg CPF426B rc13,
followed by msg CPF503E ec24 rc24 for an error invoking the
associated external program or service program QQQSVUSR with
program entry point or external name QQQTimestamp_Format
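A separator-agnostic interpreter of the kind the docs describe is not hard to sketch. This Python illustration [mine; purely to show the documented intent, not the DB2 implementation] splits the input on any run of non-digits, so the separator in the format string is irrelevant, and completes the YY element with the first two digits of the current year, matching the 2099 result shown above:

```python
import re
from datetime import datetime

def to_date_yy_mm_dd(cv: str) -> datetime:
    """Sketch of the documented behavior for format 'YY.MM.DD':
    any run of non-digit characters separates components, and the
    YY element is prefixed with the current century."""
    yy, mm, dd = (int(p) for p in re.split(r'\D+', cv.strip()))
    century = (datetime.now().year // 100) * 100
    return datetime(century + yy, mm, dd)
```

With this sketch, '1.2.3' and '99-3-9' both convert [to 2001-02-03 and 2099-03-09, assuming a 20xx run date], exactly as the doc quote says they should, with no sensitivity to whether '.' or '-' appears in the value.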


One question is: why does this feature so unpredictably both
support and fail to support various values as input, while making
such bold claims of support\capability? And if unsupported, then
please just fail consistently, instead of giving the impression
that the apparently deceptively worded docs are honest about what
should be supported -- and drop all the then-presumably-false
claims from the docs. I am still wondering if anyone on an
up-to-date v7r1 or a more recent release can report that, by their
tests, the feature performs without these errors, thus apparently
eventually providing consistent support for date\time components
lacking the leading zero.


FWiW: Apparently nobody else was curious enough to [read any of my post or then additionally] try those queries and report feedback.

So now that I have access to a v7r2 system, I can report that all of the queries I showed above as failing function as expected on pub400.com, at the present RT\cumulative\group maintenance of v7r2 [C6127720; SF99702 "DB2 FOR IBM I" group:13]. But run on the v7r1 [C5317710; SF99701 "DB2 FOR IBM I" group:38] system I use, they still .puke. as described in the quoted message, on the same queries\values.

So maybe the function has indeed matured [enough] by then, to a point where I might be more trusting and use TO_DATE [or its "also known as" variants] in /production/ code rather than writing my own.

Or maybe not; because while the following query functions well\as-expected on v7r1, the identical query .pukes. on that v7r2 the same way the other queries fail on v7r1:

with d (cwd) as (
  values ( cast('121226145503+0530' as varchar(20)) ) )
select to_date(cwd, 'YYMMDDHH24MISS')
from d -- omitting "+0530" somehow mollifies the finicky function

So perhaps just suffer through such regression issues and\or remaining surprise failures and await the necessary corrective fixes; or perhaps trust only thyself and settle on RYO, because clearly _they_ cannot be trusted; still ;-)
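For the RYO route, a sketch of that last case [mine; the function name is invented] would take only the leading digits for 'YYMMDDHH24MISS' and simply ignore any trailing zone offset such as '+0530', so neither release's quirk bites:

```python
import re
from datetime import datetime

def parse_yymmddhhmiss(cwd: str) -> datetime:
    """RYO sketch: consume six 2-digit fields for 'YYMMDDHH24MISS'
    and ignore anything that trails them, e.g. a '+0530' offset."""
    m = re.match(r'(\d{2})(\d{2})(\d{2})(\d{2})(\d{2})(\d{2})',
                 cwd.strip())
    if m is None:
        raise ValueError(f'not a YYMMDDHHMISS value: {cwd!r}')
    yy, mo, dd, hh, mi, ss = (int(g) for g in m.groups())
    century = (datetime.now().year // 100) * 100
    return datetime(century + yy, mo, dd, hh, mi, ss)
```

Here '121226145503+0530' converts [assuming a 20xx run date] to 2012-12-26-14.55.03, with or without the trailing offset.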

