On 15-Dec-2015 08:47 -0600, Wilson, Jonathan wrote:
This question is based in part on an ISBN structure, but is not
strictly following the rules.
Both short and long answers are appreciated, even ones that say "look
at the ... statement in the ... manual" or "this link does something
similar, you'll have to adapt it to DB2/i" as I'm working with very
very little experience of SQL (basic selects/inserts/updates is my
current level of experience).
I would suggest that, at least in part, that seems to be an incorrect
approach, as directly mimicking RLA in SQL is not generally helpful.
The first two questions, regarding review of the data by SQL, appear
to imply a desire to mimic what RLA would effect, but with little
conspicuous value-add. That is to suggest that the goal seems more
likely to be about ensuring the integrity of the data [the latter two
questions] rather than checking the integrity of existing data.
Further comments inline.
An ISBN is made up of 3 parts (Excluding the new prefix, and
checksum):
• Country (Also language, as "English" is denoted by 0 and 1)
• Imprint (publisher)
• Book
All the parts are variable in length, so if a country produces few
books its country code is longer than a country producing a lot of
titles.
The same holds true of the Imprint: someone like OUP has a short
imprint number but a long title number split; JoeBlogsPublishing has
a long imprint and a short title number split, as he will publish
hardly any books.
A couple of examples (not valid re:checksum or actual codes)
0-117-00001X (X denotes checksum)
81-01154-00X
9222-4101-2X
9229-41-011X
Now each section has to be "partial" unique, and the concatenation
of sections also has to be unique. (Actually I'm not totally sure if
the imprint has to be "partial" unique on its own, or only as a
concatenation with the country.) Assume the following already exist
in the file:
Country/language
0
1
80
81
82
9222
9229
So with the given, the setup for an actual test case:
create table isbn_file
( Country varchar(6)
, Imprint varchar(6)
, Book varchar(6)
)
;
create unique index isbn_x_ucc /* per later comment */
on isbn_file (country)
;
insert into isbn_file (country) values
('0' )
,('1' )
,('80' )
,('81' )
,('82' )
,('9222')
,('9229')
;
Not only that, but also no country code to be added can have, as a
part of it, a previous country code:
0
01 << not allowed to be added
0177 << not allowed to be added
1
80
81
82
821 << not allowed to be added
9922
9925 << no problem - added
9929
The above example seems flawed, but only insofar as the 92## values
seem to have changed to 99##; i.e. the described effects, marked with
"<<", are understood to be accurate.
Now in RPG it's easy, when adding records, to test for a previous
record being "similar" with something along the lines of:
SetLL NewCountryCode
ReadP
If CountryCodeRead=%subst(NewCountryCode:1:%len(CountryCodeRead))
// partial match hit - error.
endif
Likewise, adding a new country code that is a partial key match of
a longer, already existing record can also be tested by:
0
1
8 << not allowed to be added
80
81
82
92 << not allowed to be added
922 << not allowed to be added
9222
9925
9926 << no problem - added
9229
This example also seems flawed; the 92## values return, though it
seems the previously oddly appearing 9929 was lost to the returning
9229. Again, that does not matter to the described effects, but
consistency is preferable.
SetLL NewCountryCode
Read
if %subst(CountryCodeRead:1:%len(NewCountryCode))=NewCountryCode
// partial match hit - error
endif
Now my questions are: (V5R3 only as that's the highest level I can
test on.)
1) how do I test both conditions in SQL, using two SQL operations?
(Actually return the partial match hits with two selects
- select from file when partial match previous.
select max(country) as partial_prev
from isbn_file
where length(country)<length( :addCode )
and country = left( :addCode ,length(country))
-- return the NULL value when no matching rows
-- min vs max immaterial for the as-described scenario
And
- select from file when partial match next.)
select min(country) as partial_next
from isbn_file
where length(country)>length( :addcode )
and left(country, length( :addcode )) = :addcode
-- return the NULL value when no matching rows
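For instance [untested, but should behave this way against the sample
rows inserted above], with a literal in place of the host variable:
select max(country) as partial_prev /* '92' as the candidate value */
from isbn_file
where length(country)<length('92')
and country = left('92',length(country))
-- expect the NULL value; no shorter country is a prefix of '92'
;
select min(country) as partial_next
from isbn_file
where length(country)>length('92')
and left(country, length('92')) = '92'
-- expect '9222', so '92' must not be added
Whereas for a candidate such as '9925', both queries should return the
NULL value, so that value could be added.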
2) Is it possible to perform both tests with only one SQL statement
and if so how? (a single select)
select from file
when partial match previous
or partial match next.
The same predicates as shown prior [in response to "1)"] can be
combined with an OR; or, using the variation on the predicates that
Charles provided, those are similarly combined with an OR:
select *
from isbn_file
where :addCode like (country concat '%') /* any partial_prev */
or country like (:addCode concat '%') /* any partial_next */
But notice that the above query returns all rows with partial
matches, not just [the one previous or] the _one_ next; and it
returns zero rows when the value of :addCode could be added to the
[static] set without breaking any of the rules. However, simply by
adding the FETCH FIRST 1 ROW ONLY clause, the set can be limited to
no more than one row. Optionally, also add an indication of *prv vs
*nxt to the one row [if any row is] returned, revealing whether the
value blocking the insert is a prior key value or a next key value;
e.g. using a CASE:
select /* :addCode as code_to_add, */
case when length(country)<length(:addCode) then 'PREV'
else 'NEXT'
end as Prv_or_Nxt /* which existing country blocks an insert */
, p.*
from isbn_file as p
where :addCode like (country concat '%') /* any partial_prev */
or country like (:addCode concat '%') /* any partial_next */
fetch first 1 row only
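For instance [untested, but should behave this way against the sample
rows], with the literal '821' in place of :addCode, that query should
return the single row 'PREV' with country '82', i.e. the existing '82'
blocks the add of '821'. Note that without an ORDER BY, when more than
one row blocks a value, which one the FETCH FIRST clause returns is
arbitrary.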
The prior two separate queries [above, after "1)"], because they
each return just one row, and always one row, can be combined into a
"single SELECT" as two separate scalar subselects [coded to allow
running on V5R3]:
select
( select max(country) from isbn_file
where length(country)<length( :addcode )
and country = left( :addcode ,length(country))
) as partial_prev
, cast(:addcode as varchar(6)) as code_to_add
, ( select min(country) from isbn_file
where length(country)>length( :addcode )
and left(country, length( :addcode ))= :addcode
) as partial_next
from qsqptabl /* any one-row table */
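Untested, but run against the sample rows with the literal '811' in
place of :addcode, that statement should return the one row:
partial_prev '81', code_to_add '811', and partial_next the NULL value;
i.e. the existing '81' blocks the add of '811'.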
3) How would I express?:
insert into file
where partial previous <ed: does not exist>
and partial next does not exist.
insert into isbn_file (country)
select :addcode from qsqptabl
where not exists /* neither rule preventing insert exists */
(select '1'
from isbn_file
where
( length(country)>length( :addcode )
and left(country, length( :addcode ))= :addcode
) /* the new value is a prefix of an existing country */
or
( length(country)<length( :addcode )
and country = left( :addcode ,length(country))
) /* an existing country is a prefix of the new value */
)
Or mimicking the use of the LIKE predicates:
insert into isbn_file (country)
select :addcode from qsqptabl
where not exists /* neither rule preventing insert exists */
(select '1'
from isbn_file
where :addCode like (country concat '%') /* any partial_prev */
or country like (:addCode concat '%') /* any partial_next */
)
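If the statement will end up inside an SQL routine or embedded SQL
anyway, then whether the guarded INSERT actually inserted anything is
the reliable test; e.g. a sketch of an SQL procedure [illustrative
names; untested] that signals an error when zero rows were inserted:
create procedure add_country ( in new_code varchar(6) )
language sql
begin
declare rows_added integer default 0 ;
insert into isbn_file (country)
select new_code from qsqptabl
where not exists /* neither rule preventing insert exists */
(select '1'
from isbn_file
where new_code like (country concat '%')
or country like (new_code concat '%')
)
;
get diagnostics rows_added = ROW_COUNT ;
if rows_added = 0 then /* a partial prev or next blocked the insert */
signal sqlstate '75001' set message_text = 'Country code rejected' ;
end if ;
end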
4) How would I express this as an RI constraint, so the DB handles
the prevention of incorrect partial key match adds? (I already
prevent duplicates via unique key.) Assuming such a thing is
possible? E.g.: on insert or update, reject if a partial previous or
partial next is found.
Best I know, the restrictions on implementing constraints would
disallow any constraint definition that might assist in that regard;
e.g. no User Defined Functions (UDFs) can be used, no references to
other row data can be included, parent-to-child data relationships
are purely equivalence matches, etc.
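That is, a check constraint can validate only values within the
single row being changed; e.g. something like the following
[illustrative only] is allowed, but nothing comparing against other
rows can be declared:
alter table isbn_file
add constraint isbn_country_chk
check ( length(country) between 1 and 6 )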
Please note, these will be entered SQL at the green screen.
Unsure what "these" are; nor even what 5250 application, if "these"
is the dynamic SQL statements. So I might presume that "these" refers
to the /statements/, most notably the INSERT; a statement that by
itself should be able to ensure the validity, but of course recoding
a new literal on an INSERT using SELECT, instead of an INSERT using
VALUES, would be a pain. For that, though, the QM Query feature
allows replacement variables if the SQL will not be coded into an
HLL; see the Set Variables (SETVAR) parameter of the Start Query
Management Query (STRQMQRY) command. Or I might presume that "these"
refers to the literal values, the input could be via a custom Display
File (DSPF), and the implication is that some HLL with embedded or
dynamic SQL will encapsulate the statements. Anyhow, that statement
leaves me feeling more confused than informed; I would hope the SQL
would be performed as part of a program that is using the INSERT
statement to enter only the valid values and return an error to the
user when a value did not pass the test that would allow the write.
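For the STRQMQRY option, the QM query source might look something
along these lines [illustrative names; untested], with the value
supplied at run time, e.g. STRQMQRY QMQRY(MYLIB/ADDCTRY)
SETVAR((ADDCODE '9925')):
-- QM query source member ADDCTRY; &ADDCODE is the replacement variable
insert into isbn_file (country)
select '&ADDCODE' from qsqptabl
where not exists
(select '1'
from isbn_file
where '&ADDCODE' like (country concat '%')
or country like ('&ADDCODE' concat '%')
)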
The SELECT queries that return which value would prohibit the insert
might be worthwhile when debugging a problem or a test case\scenario,
but they are best not used either proactively or reactively, except
on static data, because concurrent changes could invalidate the
merely-ephemeral answer to the most recent SELECT inquiry. The answer
from the predicates used in the INSERT statement itself is what
should be used, to ensure the best results. Only selection under an
isolation level that prevents other updates would ensure the feedback
was correct about which row prevents the insert, if access is
concurrent rather than static.
A potentially nice option for use in a dynamic environment like
Start Interactive SQL (STRSQL) would be to direct the INSERT activity
to a VIEW that was created using WITH CHECK OPTION. That is because
the effective selection preventing the insert is coded in the VIEW
rather than needing to be coded on the INSERT. But sadly, the
restrictions for that clause include prohibiting the use of a
subselect to validate the data, much like the restrictions prohibit
the use of constraints. For a simpler data check, the effect of an
INSERT of a row that does not meet the criteria is sqlcode -161, i.e.
msg SQL0161 stating that "INSERT or UPDATE not allowed because a
resulting row does not satisfy view definition..."
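For example, a row-local rule can be enforced that way [illustrative
only; the partial-key rules above cannot be, per the no-subselect
restriction]:
create view isbn_file_v as
select * from isbn_file
where length(country) <= 4 /* an arbitrary row-local check */
with check option
;
insert into isbn_file_v (country) values ('92251') /* fails w/SQL0161 */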
Many thanks in advance.
However, all of what is expressed as desired for data validation is
available from an INSERT trigger; and similarly for UPDATE, if
updates are to be allowed in that capacity, or an UPDATE trigger
could even be used to prohibit such changes if desirable. The trigger
can even feed back which value blocked the insert; effectively
proactively, but within the trigger, making the decision almost as
good as within the INSERT, because the effect is essentially the
same.
The following two slightly different triggers [transcribed, not
copied directly; apologies for any errors] differ both in the
predicates used and in using a SELECT ... INTO vs a VALUES ... INTO,
and in how each blocks the insert of the rows. Each should function
as written and, even though the messaging\return-code handling is
coded poorly, properly reveals via the message_text the restriction
that blocked the insert.
Using SELECT ... INTO and using the LIKE predicates:
drop trigger isbn_f_bi
;
create trigger isbn_f_bi
before insert
on isbn_file
referencing new as n
for each row mode db2row
set option dbgview=*SOURCE
begin
/* local variables */
declare err0100 char(1) default '0' ;
declare errmsg varchar(1000) not null default 'Bad thing!' ;
declare StmState char(5) default '00000';
/* sql conditions */
declare NoRows condition for sqlstate '02000' ;
declare NoNull condition for sqlstate '23502' ;
/* sql return-codes; sqlcode & sqlstate */
declare sqlstate char(5) not null default '00000';
declare sqlcode int default 0 ;
/* sql cursors */
/* sql handlers */
declare exit handler for SQLEXCEPTION
SIGNAL SQLSTATE 'ISBNX'
SET MESSAGE_TEXT = errmsg ;
declare continue handler for NoRows
NR: begin
set err0100 = '1' ; /* just exit quietly; let insert happen */
end NR;
declare exit handler for NoNull
NN: begin
set StmState='23502';
signal sqlstate value '02N44' set message_text=errmsg ;
end NN;
select
case when length(p.country)<length(n.country)
then 'Partial_PREV key value "'
else 'Partial_NEXT key value "'
end
concat p.country
concat '" blocked INSERT of value "'
concat n.country concat '"'
into errmsg
from isbn_file as p
where n.country like (p.country concat '%') /* any partial_prev */
or p.country like (n.country concat '%') /* any partial_next */
fetch first 1 row only
;
if err0100='0' then /* no NoRows condition; block insert */
set errmsg = cast(null as varchar(1000) ) ;/* frc NoNull exit */
end if ;
end
Using VALUES ... INTO and not using LIKE predicates:
drop trigger isbn_f_bi
;
create trigger isbn_f_bi
before insert
on isbn_file
referencing new as n
for each row mode db2row
set option dbgview=*SOURCE
begin
/* local variables */
declare err0100 char(1) default '0' ;
declare errmsg varchar(1000) not null default 'Bad thing!' ;
declare errind smallint not null default 0 ; /* no idea how */
declare StmState char(5) default '00000';
/* sql conditions */
declare NoIndi condition for sqlstate '22002' ;
declare NoInd2 condition for sqlstate '22004' ;
declare NoNull condition for sqlstate '23502' ;
/* sql return-codes; sqlcode & sqlstate */
declare sqlstate char(5) not null default '00000';
declare sqlcode int default 0 ;
/* sql cursors */
/* sql handlers */
declare exit handler for SQLEXCEPTION
SIGNAL SQLSTATE 'ISBNX'
SET MESSAGE_TEXT = errmsg ;
declare exit handler for NoNull, NoIndi, NoInd2
NN: begin
set StmState='23502';
set err0100 = '1' ; /* just exit quietly; let insert happen */
end NN;
values
(
coalesce(
(select 'Partial_PREV key value "' concat max(c.country)
from isbn_file as c
where length(c.country)<length( n.country )
and c.country = left( n.country ,length(c.country)))
,
(select 'Partial_NEXT key value "' concat min(c.country)
from isbn_file as c
where length(c.country)>length( n.country )
and left(c.country, length( n.country ))= n.country )
)
concat '" blocked INSERT of value "'
concat n.country concat '"'
) into errmsg /* how to specify indicator variable? */
;
if err0100='0' then /* no NoNull condition; block insert */
signal sqlstate value '02N44' set message_text=errmsg ;
end if ;
end
--
Regards, Chuck