On 11/11/05, Bob Cozzi <cozzi@xxxxxxxxx> wrote:
> Storing an entire database in XML or translating and then transmitting a
> multi-million record file is NOT what XML was intended to be used for. It
> looks to me like it was intended to allow you to read about a "page" of
> data for transmission or to be displayed (via a browser). It does that
> wonderfully.

I can't agree more. One of my current projects is to replicate iSeries data to a Pocket PC application. At first, we were using the .NET DataSet WriteXml() to create an XML file, then we would physically move that file to the device, and a device program would use ReadXml() to load the data into its DataSet object.

What we quickly learned was that even for small numbers of records, say only a few hundred, the XML files were inordinately large, and the time it took the program to read them into a DataSet was terrible. Since the program (at the time) was using the DataSet to read the records and INSERT them into an SQL CE 2.0 database, I replaced the XML file with a text file containing just the INSERT statements themselves. The resulting files were much, much smaller and performance was at least ten times better.

Of course, that project has moved well beyond that stage, and now we have .NET programs reading and writing directly to the handheld database (which is the bee's knees), but it definitely proved to me that XML is no good for real-time processing of chunks of records.

--
Joel Cochran
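A minimal sketch of the swap Joel describes, writing an INSERT-statement script instead of a DataSet XML file. The "Customers" table, its columns, and the file names are assumptions for illustration only, not details from the original project:

using System;
using System.Data;
using System.IO;

class InsertScriptExample
{
    // Writes one INSERT statement per row, replacing the WriteXml() output file.
    static void WriteInsertScript(DataSet ds, string path)
    {
        using (StreamWriter sw = new StreamWriter(path))
        {
            foreach (DataRow row in ds.Tables["Customers"].Rows)
            {
                // Double any embedded apostrophes so the string literal stays valid SQL.
                string name = row["Name"].ToString().Replace("'", "''");
                sw.WriteLine("INSERT INTO Customers (Id, Name) VALUES ({0}, '{1}')",
                             row["Id"], name);
            }
        }
    }

    static void Main()
    {
        // Hypothetical DataSet standing in for the data pulled from the iSeries.
        DataSet ds = new DataSet();
        DataTable t = ds.Tables.Add("Customers");
        t.Columns.Add("Id", typeof(int));
        t.Columns.Add("Name", typeof(string));
        t.Rows.Add(1, "O'Brien");

        ds.WriteXml("customers.xml");           // XML route: large file, slow ReadXml() on the device
        WriteInsertScript(ds, "customers.sql"); // text route: small script the device just executes
    }
}

The point of the comparison is that the XML file carries schema and element markup for every row, while the script carries only the values plus a fixed INSERT prefix, which is why the text file ends up so much smaller.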