I found your "Trigger Mediator Documentation" through a Google search.

This looks complex. Is it commonly used in iSeries shops?

What are the big advantages?


-----Original Message-----
From: midrange-l-bounces@xxxxxxxxxxxx [mailto:midrange-l-bounces@xxxxxxxxxxxx] On Behalf Of Alan Campin
Sent: Tuesday, June 12, 2012 10:32 AM
To: Midrange Systems Technical Discussion
Subject: Re: multiple trigger programs for many data files - is there a better methodology?

Think about a data queue server. Just have one trigger program that
receives the trigger buffer and writes it to a data queue. The data
queue server job picks it up, does the transformation, and sends it
over to be updated.
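
Roughly, the generic trigger side could be as small as this free-form
ILE RPG sketch (the data queue TRGDTAQ and library MYLIB are made-up
names, and the server job that dequeues and transforms is not shown):

ctl-opt dftactgrp(*no) actgrp(*caller);

// QSNDDTAQ API: send one entry to a data queue
dcl-pr SendToDtaq extpgm('QSNDDTAQ');
  dtaqName char(10) const;
  dtaqLib  char(10) const;
  dataLen  packed(5:0) const;
  data     char(32767) const options(*varsize);
end-pr;

// Parameters every external trigger program receives:
// the trigger buffer and its length
dcl-pi *n;
  trgBuffer char(32767) options(*varsize);
  trgBufLen int(10);
end-pi;

// Forward the raw buffer; one copy of this program can be attached
// to every file, because the buffer already carries the file name,
// library, event type, and the before/after record images.
SendToDtaq('TRGDTAQ' : 'MYLIB' : trgBufLen : trgBuffer);

return;

The server job is then a plain batch program sitting on QRCVDTAQ; it
can afford the heavier logic because it runs outside the trigger.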

You could also use RCVJRNE running in a batch job to receive the
journal entries, do the transformation, and send the data over. There
are some tricks to this. I have code that I have used in the past.
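
The launch side of that can be sketched in free-form RPG like this
(MYJRN, CUST, and the exit program JRNEXIT are made-up names; the exit
program that actually receives each entry is not shown):

ctl-opt dftactgrp(*no) actgrp(*caller);

// QCMDEXC: run a CL command from RPG
dcl-pr RunCmd extpgm('QCMDEXC');
  cmd    char(3000) const options(*varsize);
  cmdLen packed(15:5) const;
end-pr;

dcl-s cmd varchar(3000);

// Start the receive; RCVJRNE calls MYLIB/JRNEXIT with each journal
// entry it picks up for the CUST file. Entry-type filtering and
// blocking options are worth reviewing in the command help.
cmd = 'RCVJRNE JRN(MYLIB/MYJRN) FILE((MYLIB/CUST)) '
    + 'EXITPGM(MYLIB/JRNEXIT)';
RunCmd(cmd : %len(cmd));

return;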

Again, my warning: do not put any substantive logic in a trigger
program without the use of a trigger mediator. Using triggers without
a mediator is scary to me, no matter what.



On Tue, Jun 12, 2012 at 8:53 AM, Chris Bipes
<chris.bipes@xxxxxxxxxxxxxxx> wrote:
I would take a closer look at journaling the files.  If the remote system is also an iSeries, you can set up remote journaling.

Daily you can detach the current journal receiver and attach a new one.  You can then have one program process the journal receiver you just detached and perform the updates.  What is the external system?  Another application on the same server, or a different server/platform?  The answers here will drive the recommendations in the correct direction.
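
For what it's worth, the daily swap-and-dump step might look roughly
like this free-form RPG sketch, run via QCMDEXC (MYJRN, MYRCV0001, and
JRNOUT are made-up names; the program that reads the outfile and
applies the updates is not shown):

ctl-opt dftactgrp(*no) actgrp(*caller);

// QCMDEXC: run a CL command from RPG
dcl-pr RunCmd extpgm('QCMDEXC');
  cmd    char(3000) const options(*varsize);
  cmdLen packed(15:5) const;
end-pr;

dcl-s cmd varchar(3000);

// Detach the current receiver and attach a newly generated one
cmd = 'CHGJRN JRN(MYLIB/MYJRN) JRNRCV(*GEN)';
RunCmd(cmd : %len(cmd));

// Dump the record-level entries from the receiver just detached
// into an outfile that the batch program can read and process
cmd = 'DSPJRN JRN(MYLIB/MYJRN) '
    + 'RCVRNG(MYLIB/MYRCV0001 MYLIB/MYRCV0001) '
    + 'ENTTYP(*RCD) OUTPUT(*OUTFILE) OUTFILE(MYLIB/JRNOUT)';
RunCmd(cmd : %len(cmd));

return;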

--
Chris Bipes
Director of Information Services
CrossCheck, Inc.

-----Original Message-----
From: midrange-l-bounces@xxxxxxxxxxxx [mailto:midrange-l-bounces@xxxxxxxxxxxx] On Behalf Of Stone, Joel
Sent: Tuesday, June 12, 2012 7:45 AM
To: 'Midrange Systems Technical Discussion'
Subject: multiple trigger programs for many data files - is there a better methodology?

I have been tasked with copying records from several files to an external system when the data in those records changes.

So a series of trigger programs seems like a good solution.

I will have a pile of 20 files, but let's say there are 3 files for simplification.

I need a different trigger pgm for each file:

File        Program
--------    --------
Cust        pgmA
Product     pgmB
Order       pgmC




Would it be possible and worthwhile to build only one trigger pgm?  I was thinking that maybe there is a way to capture ONLY the file-name and the primary key values into a new file.

Then later, when the data transfer to the external system occurs, an SQL statement would take the file name and keys and retrieve the actual full data record to transfer.
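
Roughly, that later fetch step might look like this embedded-SQL RPG
sketch (CHGLOG, CLFILE, CLLIB, CLKEY, and CUSTNO are only placeholder
names; a row whose layout varies by file would really need an SQLDA or
per-file fetch logic):

ctl-opt dftactgrp(*no);

dcl-s stmt     varchar(512);
dcl-s fileName char(10);
dcl-s libName  char(10);
dcl-s keyValue varchar(50);

// Read one queued change from the key-capture file (placeholder names)
exec sql
  select CLFILE, CLLIB, CLKEY
    into :fileName, :libName, :keyValue
    from CHGLOG
    fetch first 1 row only;

// Build a dynamic statement to pull the full record. The key column
// (CUSTNO here) would really come from a per-file lookup or catalog.
stmt = 'select * from ' + %trim(libName) + '.' + %trim(fileName)
     + ' where CUSTNO = ?';
exec sql prepare fullRow from :stmt;
exec sql declare c1 cursor for fullRow;
exec sql open c1 using :keyValue;
// ... fetch (layout varies by file), transform, and send the record
//     to the external system ...
exec sql close c1;

return;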

Would this be worth the extra work to make one flexible pgm to handle all data files?

I could possibly make use of the system files QADBKFLD and QADBIFLD to construct the key and file information needed to identify the changed records.
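
As an alternative to reading QADBKFLD/QADBIFLD directly, the SQL
catalog views appear to carry the same key information. A rough
embedded-SQL sketch (MYLIB and CUST are placeholders):

ctl-opt dftactgrp(*no);

dcl-s keyColumn varchar(128);
dcl-s keyPos    int(10);

// Primary-key columns for one table, in key order, from the catalog
exec sql declare keyCols cursor for
  select k.COLUMN_NAME, k.ORDINAL_POSITION
    from QSYS2.SYSCST    c
    join QSYS2.SYSKEYCST k
      on k.CONSTRAINT_SCHEMA = c.CONSTRAINT_SCHEMA
     and k.CONSTRAINT_NAME   = c.CONSTRAINT_NAME
   where c.CONSTRAINT_TYPE = 'PRIMARY KEY'
     and c.TABLE_SCHEMA    = 'MYLIB'
     and c.TABLE_NAME      = 'CUST'
   order by k.ORDINAL_POSITION;

exec sql open keyCols;
dow 1 = 1;
  exec sql fetch keyCols into :keyColumn, :keyPos;
  if sqlcode <> 0;
    leave;
  endif;
  // ... add keyColumn to the key list being built for this file ...
enddo;
exec sql close keyCols;

return;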

Can this be done??  Or not worth it??

Has anyone done this?  Can you share some code??

--
This is the Midrange Systems Technical Discussion (MIDRANGE-L) mailing list
To post a message email: MIDRANGE-L@xxxxxxxxxxxx
To subscribe, unsubscribe, or change list options,
visit: http://lists.midrange.com/mailman/listinfo/midrange-l
or email: MIDRANGE-L-request@xxxxxxxxxxxx
Before posting, please take a moment to review the archives
at http://archive.midrange.com/midrange-l.

