Hi Jon,

<snip>
>> Does this XML implementation support DTDs or XML schemas?

No - not yet. It makes the assumption that the XML is well-formed and valid, although if there are missing/extra elements/attributes you can deal with it.
</snip>

That's a big assumption. How do we check for well-formedness before processing the document? This is a BIG issue with XML processing. If you use the simple SAX parsing model and you are not using transaction control (such as commitment control), you have a real headache if your doc goes bang halfway through processing and you have already written data to tables. DOM is different: it builds the whole object model in memory and will surface errors before you ever get a handle to it. But that is expensive for simple read-only processing.
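To make that concrete, here is a minimal sketch in plain Java/JAXP (not the toolkit API, and the file name is made up for illustration): a first SAX pass with a no-op handler purely to confirm well-formedness, so the second pass that actually writes to tables never starts on a broken document.

    import java.io.File;
    import javax.xml.parsers.SAXParser;
    import javax.xml.parsers.SAXParserFactory;
    import org.xml.sax.helpers.DefaultHandler;

    public class WellFormedCheck {
        // Pass 1: parse with a no-op handler just to confirm well-formedness.
        static boolean isWellFormed(File doc) {
            try {
                SAXParser parser = SAXParserFactory.newInstance().newSAXParser();
                parser.parse(doc, new DefaultHandler()); // fatalError() throws on malformed XML
                return true;
            } catch (Exception e) {
                System.err.println("Not well-formed: " + e.getMessage());
                return false;
            }
        }

        public static void main(String[] args) throws Exception {
            File doc = new File("order.xml"); // hypothetical document
            if (isWellFormed(doc)) {
                // Pass 2: now it is safe to run the real handler that writes to tables.
            }
        }
    }

You pay for reading the document twice, but that is still far cheaper than building a full DOM, and you never have to unwind half-written database rows.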
<snip>
>> And imho any parser should work on those DTDs/XML schemas.

Not sure why you say that. I have never noticed any that do - but then I wasn't looking.
</snip>

The XML Toolkit provided by IBM for the iSeries contains the service program QXML4PR531 in library QXMLTOOLS. This service program allows procedural languages like RPG to use the DOM and SAX parsers provided within the toolkit. I know for certain that the SAX parser in there supports full schema validation, because we use it in our RPG programs.
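For comparison, this is roughly what schema-validated SAX parsing looks like in standard Java/JAXP - a sketch only, not the QXML4PR531 interface, and the schema and document file names are hypothetical:

    import java.io.File;
    import javax.xml.XMLConstants;
    import javax.xml.parsers.SAXParser;
    import javax.xml.parsers.SAXParserFactory;
    import javax.xml.validation.Schema;
    import javax.xml.validation.SchemaFactory;
    import org.xml.sax.SAXException;
    import org.xml.sax.SAXParseException;
    import org.xml.sax.helpers.DefaultHandler;

    public class ValidatingSaxDemo {
        public static void main(String[] args) throws Exception {
            // Compile the agreed interface schema once.
            Schema schema = SchemaFactory
                    .newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI)
                    .newSchema(new File("interface.xsd")); // hypothetical schema

            SAXParserFactory spf = SAXParserFactory.newInstance();
            spf.setNamespaceAware(true);
            spf.setSchema(schema); // every parse through this factory now validates

            SAXParser parser = spf.newSAXParser();
            parser.parse(new File("order.xml"), new DefaultHandler() {
                @Override
                public void error(SAXParseException e) throws SAXException {
                    throw e; // surface validation errors; DefaultHandler swallows them by default
                }
                // a real handler would also override startElement()/characters() etc.
            });
            System.out.println("Document is well-formed AND valid against the schema.");
        }
    }

The point is that validation rides along on the same streaming parse - you don't need DOM just to get schema checking.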
I would say, though, that we only have full schema validation turned on when developing/testing/debugging. The beauty of a validating schema is very simple: it defines the interface. When we define an interface to a new system, we also define a validating schema. That definition should be all that matters to both parties with regard to the architectural side of the interface XML. It means I can test my XML generation, and as long as the data I pass validates against the schema, I know I'm ready to go. This is invaluable when you are testing in a sandbox prior to hooking up with the other party. It would be a nightmare if I couldn't get on with my development without constant reference to what the developers on the other side of the interface are doing.
Validating schemas also provide a reference point. If something goes bang on the other side, we can validate the XML to see whether it is "valid". If it isn't, we fix our XML generation program; if it is, we get them to fix their XML decomposition program. On the rare occasions when the doc is "valid" but incorrectly defined, we get together and re-engineer the schema, and that then becomes the new point of reference for the interface guys.
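As a sketch of that triage step (again plain Java/JAXP rather than the toolkit, with hypothetical command-line arguments), you can run the disputed document through a standalone validator that reports every violation with its line and column, so both teams can see exactly where it departs from the agreed schema:

    import java.io.File;
    import javax.xml.XMLConstants;
    import javax.xml.transform.stream.StreamSource;
    import javax.xml.validation.SchemaFactory;
    import javax.xml.validation.Validator;
    import org.xml.sax.ErrorHandler;
    import org.xml.sax.SAXParseException;

    public class InterfaceTriage {
        public static void main(String[] args) throws Exception {
            Validator v = SchemaFactory
                    .newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI)
                    .newSchema(new File(args[0]))   // the agreed interface schema
                    .newValidator();

            // Collect every violation rather than stopping at the first one.
            v.setErrorHandler(new ErrorHandler() {
                public void warning(SAXParseException e)    { report("warning", e); }
                public void error(SAXParseException e)      { report("error", e); }
                public void fatalError(SAXParseException e) throws SAXParseException {
                    report("fatal", e);
                    throw e; // not even well-formed - no point continuing
                }
                private void report(String level, SAXParseException e) {
                    System.err.printf("%s at %d:%d  %s%n", level,
                            e.getLineNumber(), e.getColumnNumber(), e.getMessage());
                }
            });

            v.validate(new StreamSource(new File(args[1]))); // the disputed document
        }
    }

A clean run points at the decomposition side; a list of violations points at the generation side. Either way, nobody has to argue from memory about what the interface is supposed to look like.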
Life is MUCH easier with validating schemas. :-)

Cheers,

Larry Ducie