"RPG400-L" <rpg400-l-bounces@xxxxxxxxxxxx> wrote on 11/28/2016 01:00:01
PM:
----- Message from "gio.cot" <gio.cot@xxxxxxxxxxx> on Mon, 28 Nov
2016 15:42:24 +0100 -----

To: <rpg400-l@xxxxxxxxxxxx>

Subject: Generic trigger - dynamically find field positions

I need to write a generic trigger program that updates the date and time
change fields.

This is my scenario:

The trigger program receives the two standard parameters
(https://publib.boulder.ibm.com/html/as400/v4r5/ic2979/info/db2/rbafomst288.htm#HDRRZAHFTRC).

Since the date and time change fields are always the same (for example
DATA_LOG and TIME_LOG), I need to find each field's starting position in
the input record, update the values in the record string with substring
operations, and then return it, so that the record contains the updated
date and time.

Can someone help me with an example, or with how to find the starting
field positions?
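The offset-and-substring idea above can be sketched generically: compute each field's start position by summing the lengths of the fields that precede it in the record format, then overwrite the date and time slices in place. The sketch below is Python rather than RPG, purely for illustration; the field layout and values are made up. On IBM i the layout could be retrieved once per file, e.g. with the QUSLFLD (List Fields) API or the QSYS2.SYSCOLUMNS catalog view.

```python
# Illustration of the offset-and-substring technique: derive each field's
# start position from an ordered record layout, then patch DATA_LOG and
# TIME_LOG inside the fixed-length record string. Layout and data are
# hypothetical examples, not an actual file definition.

def field_offsets(layout):
    """Map field name -> (start, length), 0-based, from an ordered layout."""
    offsets, pos = {}, 0
    for name, length in layout:
        offsets[name] = (pos, length)
        pos += length
    return offsets

def stamp_record(record, layout, stamps):
    """Return a copy of the fixed-length record with the given fields replaced."""
    offsets = field_offsets(layout)
    buf = list(record)
    for name, value in stamps.items():
        start, length = offsets[name]
        # Pad/truncate to the field length so the record stays fixed-length.
        buf[start:start + length] = value.ljust(length)[:length]
    return "".join(buf)

# Hypothetical record format: two data fields followed by the audit fields.
layout = [("CUSNO", 6), ("CUSNAME", 20), ("DATA_LOG", 8), ("TIME_LOG", 6)]
record = "000042" + "ACME SRL".ljust(20) + "20150101" + "120000"

updated = stamp_record(record, layout,
                       {"DATA_LOG": "20161128", "TIME_LOG": "150001"})
print(updated[26:34], updated[34:40])  # the patched date and time fields
```

Because only the layout table differs per file, one trigger program can serve all 25 files by looking the layout up (and caching it) keyed on the file name passed in the trigger buffer.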



Keep in mind I have about 25 files that have the same date and time
fields, and I would like to write only one trigger program to update
these two fields.
Thanks in advance


You can also simply add an SQL trigger:

runsql sql('create or replace trigger YourFileTrigger
            before insert or update on YourFile
            referencing new NewRecord
            for each row
            mode db2row
            set NewRecord.Date_log = current date ,
                NewRecord.Time_log = current time')

Of course you would have to create the above trigger on each file. But
you'll need to do as much to associate your generic trigger program with
each file.
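Since the trigger body is identical apart from the file name, the ~25 CREATE TRIGGER statements don't have to be written by hand; they can be generated from a file list. A small sketch (the file names are made-up examples):

```python
# Sketch: generate one CREATE TRIGGER statement per file, since only the
# file name changes between them. The list of files is hypothetical.

TEMPLATE = (
    "create or replace trigger {file}_ts\n"
    "before insert or update on {file}\n"
    "referencing new NewRecord\n"
    "for each row\n"
    "mode db2row\n"
    "set NewRecord.Date_log = current date ,\n"
    "    NewRecord.Time_log = current time"
)

def trigger_sql(files):
    """Return one CREATE TRIGGER statement per file name."""
    return [TEMPLATE.format(file=f) for f in files]

for stmt in trigger_sql(["CUSTOMER", "ORDERS"]):  # ...plus the other ~23 files
    print(stmt)
    print("-" * 40)
```

Each generated statement could then be run through RUNSQL (or collected into a source member and run with RUNSQLSTM) once per library/file.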

Raul Jager's method is even simpler, but you would have to be willing to
add a column (field) to each table/file.

Michael Quigley
Computer Services
The Way International
www.TheWay.org
