I have a very deeply nested XML document, with each level having a variable
number of entries.
It is my understanding that in order to use XML-INTO, I create a data
structure for each level and have it DIM'd within the previous level's data
structure. So, assuming only 4 levels of variable entries (and there are
more in some areas), a possible 50 entries per level, and 200 bytes for the
bottom level, that means I have 6.25M occurrences of the bottom-level data
structure (50 * 50 * 50 * 50) for a total of 1.25 GB of memory. OUCH!!!
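Here is a rough free-form sketch of what I mean, just so I'm describing it
correctly. The element names (level1 through level4), the DIM(50), the
countprefix subfields, and the file path are all placeholders, not my real
document:

**free
dcl-ds level4_t qualified template;
  payload varchar(200);                  // ~200 bytes at the bottom level
end-ds;

dcl-ds level3_t qualified template;
  level4 likeds(level4_t) dim(50);
  num_level4 int(10);                    // filled in via countprefix
end-ds;

dcl-ds level2_t qualified template;
  level3 likeds(level3_t) dim(50);
  num_level3 int(10);
end-ds;

dcl-ds level1_t qualified template;
  level2 likeds(level2_t) dim(50);
  num_level2 int(10);
end-ds;

dcl-ds doc qualified;
  level1 likeds(level1_t) dim(50);
  num_level1 int(10);
end-ds;

dcl-s xmlFile varchar(256) inz('/tmp/big.xml');

// every DIM(50) array below is reserved up front, whether the XML uses it or not
xml-into doc %xml(xmlFile : 'doc=file case=any countprefix=num_');

*inlr = *on;
return;

All of those arrays exist whether or not the document fills them, which is
where the 50 * 50 * 50 * 50 * 200 bytes = 1.25 GB comes from.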
If I use SAX, I think I can allocate the memory for the data structures as I
need it, reducing the footprint well below 1.25 GB, but then I have to keep
track of where I am in the XML myself. That seems a lot more difficult.
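This is roughly the shape of the SAX version I'm picturing -- again just a
sketch with made-up names (saxHandler, state_t, the file path), and it only
tracks the depth and counts character data instead of doing the real
allocation and path bookkeeping:

**free
ctl-opt dftactgrp(*no);

dcl-ds state_t qualified template;
  depth int(10);                         // current nesting depth
  charBytes int(20);                     // bytes of character data seen so far
end-ds;

dcl-ds state likeds(state_t) inz;
dcl-s xmlFile varchar(256) inz('/tmp/big.xml');

xml-sax %handler(saxHandler : state) %xml(xmlFile : 'doc=file');

*inlr = *on;
return;

dcl-proc saxHandler;
  dcl-pi *n int(10);
    comm likeds(state_t);
    event int(10) value;
    string pointer value;
    stringLen int(20) value;
    exceptionId int(10) value;
  end-pi;

  dcl-s chunk char(65535) based(string);
  dcl-s name varchar(256);

  select;
    when event = *XML_START_ELEMENT;
      comm.depth += 1;
      name = %subst(chunk : 1 : stringLen);   // element that just opened
      // real code would push "name" onto a path stack and allocate storage
    when event = *XML_CHARS;
      comm.charBytes += stringLen;
      // real code would copy %subst(chunk : 1 : stringLen) into the storage
      // for whatever path we are currently inside
    when event = *XML_END_ELEMENT;
      comm.depth -= 1;
  endsl;

  return 0;                                   // zero so parsing continues
end-proc;

The storage problem goes away, but all of the "where am I" work that
XML-INTO does for me ends up in that handler.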
Am I missing something in these two options or is there another way to
process my file that I haven't found yet?