I have a requirement to maintain a second set of files in different
libraries (in more-or-less real time) based on activity on the first set of
files. The file formats are not the same and I have to work with individual
column names (up to 300 columns). Although I have some concerns with
record-level SQL IO performance (many of the files will be active), I
decided to use embedded SQL and ILE RPG instead of logical files with qualified
PFILE()'s. I don't want to use any asynchronous techniques, like data
queues.
My INSERT statement looks like this:
INSERT INTO LIB2/FILE2
(account2,
name2,
address2,
city2,
{others})
VALUES
(:account1,
:name1,
:address1,
:city1,
{others})
"{others}" represents 200 or more columns. The full version doesn't get
through the SQL precompiler because of SQL0101 (statement too long or too complex).
The model works fine with fewer columns, and now I'm trying to figure out
whether there's an elegant way to handle the problem (a data structure of
some sort, or a special coding technique) or whether I should just use LF's. I'd
appreciate any suggestions.
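
For example, is something like this viable? This is only an uncompiled
sketch: File2DS is a name I made up, it assumes an externally described
data structure whose subfields line up one-for-one with FILE2's columns,
and I don't know whether the precompiler still runs into the same limit
after it expands the host structure into the individual columns.

     D File2DS       E DS                  ExtName(FILE2)
      *
      * (move the FILE1 fields into the File2DS subfields here)
      *
     C/EXEC SQL
     C+ INSERT INTO LIB2/FILE2
     C+   VALUES(:File2DS)
     C/END-EXEC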
Thanks,
rf