The immediate question: Is XMLTABLE the best way to process a large XML file?
We have XML files that contain over 40,000 individual data records; file sizes run over 7 MB, and in one instance over 70 MB.
I am parsing the data from XML into an SQL table using the following statement:
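(The statement itself did not survive in the archive. For context, a typical DB2 for i pattern would look something like the following; the table, column, and element names here are hypothetical, not the poster's actual ones.)

```sql
-- Illustrative sketch only: reads an XML document from a staging table
-- and shreds repeating <Transaction> elements into relational rows.
INSERT INTO mylib.transactions (trans_id, trans_date, amount)
  SELECT x.trans_id, x.trans_date, x.amount
  FROM mylib.xml_staging s,
       XMLTABLE(
         '$doc/Transactions/Transaction'
         PASSING XMLPARSE(DOCUMENT s.xml_clob) AS "doc"
         COLUMNS
           trans_id   INTEGER       PATH 'Id',
           trans_date DATE          PATH 'Date',
           amount     DECIMAL(11,2) PATH 'Amount'
       ) AS x;
```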
This works great for a file with a few hundred transactions in it, but with 40,000 transactions it takes a long time. (I have a 35 MB file with over 50,000 records, and it took 20 minutes to complete.)
Is there a way to speed this up using XMLTABLE(), or is there a more efficient approach than XMLTABLE?
This mailing list archive is Copyright 1997-2021 by midrange.com and David Gibbs as a compilation work. Use of the archive is restricted to research of a business or technical nature. Any other uses are prohibited. Full details are available on our policy page. If you have questions about this, please contact