"I know that Jon P. doesn't grok XMLTABLE, and I stumble over complex stuff with XML-INTO, and all the code I have to use for XML-SAX, well, the SQL solution is just cleaner all around for me."
In cases such as this simple "record layout" type of XML, Vern, I have no issues with it. It is when dealing with nested structures that I find XML-INTO easier to deal with. Maybe that is because it has been a long time since I had an XML doc that was not easily mapped into a DS.
On Feb 21, 2019, at 4:58 PM, Vernon Hamberg <vhamberg@xxxxxxxxxxxxxxx> wrote:
I'll take a stab at this - I'm going to say that XMLTABLE is probably the fastest - at least no slower, and, for me, pretty easy to set up - again, for me, easier than XML-INTO.
Here's why - we receive several XML files from our field associates, one of which is for the various components of the service they do.
It was being processed with XML-SAX, and at 7.1 we had trouble with conversion of "interesting" characters into CCSID(37).
I learned about XMLTABLE, and we use that now - it's a dream, and it did character substitution better than RPG did. I have around half a dozen invocations of XMLTABLE, yet the time to process things is no worse than it was with XML-SAX. The latter would make a single pass through the XML, while multiple uses of XMLTABLE make a pass for each use - even so, the overall time was no longer with those multiple passes.
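For readers who have not seen one, a minimal XMLTABLE invocation looks like this (the table, column, and element names here are made up purely for illustration):

    select x.*
      from mylib.orders o,
           xmltable('$doc/Order/Item'
                    passing o.xml_doc as "doc"
                    columns
                      itemId  varchar(10) path 'Id'
                    , itemQty integer     path 'Qty') as x

Each match of the row expression ('$doc/Order/Item') becomes one result row, and each COLUMNS entry evaluates its PATH relative to that matched element.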
I know that Jon P. doesn't grok XMLTABLE, and I stumble over complex stuff with XML-INTO, and all the code I have to use for XML-SAX, well, the SQL solution is just cleaner all around for me.
So that's as much as I know and have seen here - HTH!
On 2/21/2019 1:37 PM, Therrien, Paul via RPG400-L wrote:
(Cross Posting from Midrange-L)--
We are on IBM i V7R3.
The immediate question: Is XMLTABLE the best way to process a large XML file?
We have XML files that contain over 40,000 individual data records, with file sizes over 7 MB - one even over 70 MB.
I am parsing the data from XML into an SQL table using the following statement:
insert into pjtlib/zmpsnxfl
select x.*
  from xmltable('...'
       passing info as "doc"
       columns
         mailingNumber varchar(5) default ' ' path '../../MailingNumber'
       , imbNumber varchar(21) default ' ' path '../IMB_Number'
       , psnValue varchar(100) default ' ' path 'PsnAlpha'
       , psnCode varchar(3) default ' ' path 'PsnCde'
       , MFentity varchar(3) default ' ' path 'MFentity'
       , MFelement varchar(3) default ' ' path 'MFelement'
       ) as x
This works great for a file with a few hundred transactions in it. But when we have 40,000 transactions, it takes a long time. (I have a file that is 35 MB in size, with over 50,000 records, and it took 20 minutes to complete.)
Is there a way to speed this up using XMLTABLE() or is there a more efficient way other than XMLTABLE?
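One commonly suggested direction - a sketch only, since the table names and row expression below are illustrative and whether it helps depends on where the time is actually going - is to make sure the document is parsed only once, and to pull the parent-level values with relative paths in a single XMLTABLE rather than running several separate invocations:

    insert into pjtlib/zmpsnxfl
    select x.*
      from xmltable('$doc/Mailing/Piece/Psn'      -- illustrative row pattern
                    passing xmlparse(document :info) as "doc"
                    columns
                      mailingNumber varchar(5)   default ' ' path '../../MailingNumber'
                    , imbNumber     varchar(21)  default ' ' path '../IMB_Number'
                    , psnValue      varchar(100) default ' ' path 'PsnAlpha'
                    , psnCode       varchar(3)   default ' ' path 'PsnCde') as x

If the source value is already of the XML data type, the XMLPARSE wrapper is unnecessary; wrapping a character value in XMLPARSE (or storing it once in an XML variable) avoids re-parsing the text on each reference. Choosing a row expression close to the repeating detail element, with '../..' steps only for the few ancestor values needed, keeps the whole load to one walk of the document.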
This is the RPG programming on the IBM i (AS/400 and iSeries) (RPG400-L) mailing list