Remote journaling is one possibility. A data queue is another, especially since you are already selecting records via a trigger. Have your trigger send the necessary data to a data queue. Then, on the remote system, run a job that creates a DDM data queue in QTEMP pointing back to the source system, reads the data queue, and processes the entries locally. If there is a DDM error, have the job resubmit itself and end; the new job gets a whole new DDM data queue in QTEMP, and IBM handles all the communications. Real easy and simple, and very close to real time.

Chris Bipes

-----Original Message-----

I am looking for some ideas. Let me explain the problem. We have an iSeries located in a building; let's call this iSeries A. We have another iSeries located in the main office; let's call this iSeries B. iSeries A is used to process information about the sales of cars. iSeries B has web services running on the machine and processes a subset of a file from iSeries A.

We have a trigger on the file on iSeries A that processes only the records needed for this purpose. The trigger program saves the information we need to a hold file. What we need to do is transfer the information in the hold file to iSeries B, either in real time or near real time. The file must be transferred over the internet, so DDM is out of the question (there are too many issues with error recovery in DDM when the communications go down). We were kicking around the idea of FTPing the file, but this must be all automated. Any ideas on a solution?
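A minimal CL sketch of the remote-side job Chris describes might look like the following. All object names here (CARQ, MYLIB, RCVCARS, PRCENTRY, and the remote location ISERIESA) are hypothetical placeholders, not from the original post; the commands and the QRCVDTAQ API parameter list are the standard IBM i ones.

```cl
/* Hypothetical sketch: receive entries from iSeries A via a DDM  */
/* data queue created fresh in QTEMP each time the job starts.    */
PGM
  DCL VAR(&LEN)  TYPE(*DEC)  LEN(5 0)
  DCL VAR(&DATA) TYPE(*CHAR) LEN(100)
  DCL VAR(&WAIT) TYPE(*DEC)  LEN(5 0) VALUE(60)

  /* DDM data queue in QTEMP pointing at the real data queue     */
  /* (filled by the trigger) on the source system, iSeries A.    */
  CRTDTAQ DTAQ(QTEMP/CARQ) TYPE(*DDM) +
          RMTDTAQ(MYLIB/CARQ) RMTLOCNAME(ISERIESA *IP)

LOOP:
  /* Wait up to &WAIT seconds for an entry; &LEN comes back 0    */
  /* when the wait times out with nothing on the queue.          */
  CALL PGM(QRCVDTAQ) PARM('CARQ      ' 'QTEMP     ' &LEN &DATA &WAIT)
  MONMSG MSGID(CPF0000) EXEC(DO)
     /* DDM/communications error: resubmit this job and end.     */
     /* The new job builds a fresh DDM data queue in QTEMP and   */
     /* the system re-establishes the connection.                */
     SBMJOB CMD(CALL PGM(MYLIB/RCVCARS)) JOB(RCVCARS)
     RETURN
  ENDDO

  /* Process the entry locally (hypothetical program).           */
  IF COND(&LEN *GT 0) THEN(CALL PGM(MYLIB/PRCENTRY) PARM(&DATA))
  GOTO CMDLBL(LOOP)
ENDPGM
```

Because the DDM data queue lives in QTEMP, there is nothing to clean up after a failure: ending the job discards it, and the resubmitted job starts from a clean state, which is what makes the error recovery so simple.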