10-02-2002, 07:27 PM
If I am putting together documents that will need to be searched heavily and often, should I use XML? If I did, I would need to know exactly which element each search result was found in, etc. And how can I search or query XML files using the XMLDOM?
10-12-2002, 08:34 PM
Well, since no one else has responded yet, I figured I'd kick this puppy off ... :D
First off, I'm only going to mention the MSXML parser, and not SAX, since I have had no exposure to that technology. The MSXML parser as it exists today, version 4.0, has vastly improved in speed (XPath), but is still nowhere near as quick as SQL database technology. Of the versions prior to 4.0, it wasn't until version 3.0 of the parser that MS claimed full compliance with the XPath and XSLT specs. Version 4.0 introduced a ton of speed improvements and fixes, but requires you to (1) distribute the version 4.0 XML DLLs and (2) handle your XSLT through script, calling the XML 4.0 DOM directly.
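To give a feel for the kind of XPath query you'd run through the DOM (in MSXML you'd call `selectNodes` on the document object), here's a minimal sketch using Python's standard `xml.etree.ElementTree` instead; the `records`/`patient` element names are invented for illustration:

```python
import xml.etree.ElementTree as ET

# Hypothetical document; element and attribute names are made up.
doc = ET.fromstring("""
<records>
  <patient id="1"><name>Smith</name></patient>
  <patient id="2"><name>Jones</name></patient>
  <patient id="3"><name>Smith</name></patient>
</records>
""")

# XPath-style query: every <patient> whose <name> child is 'Smith'.
# MSXML's rough equivalent would be doc.selectNodes("//patient[name='Smith']").
matches = doc.findall(".//patient[name='Smith']")

for p in matches:
    print(p.get("id"), p.findtext("name"))
```

Because the query returns the matching element nodes themselves, you know exactly which element each hit was found in, which answers the original question about locating results.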
I deal mainly with medical systems, and the few we've used XML on (XML datasets containing a couple thousand records) are slow going through the XMLDOM. Searches, inserts, deletions ... all slow. So slow, in fact, that we run our processes in a separate thread to keep the application from bogging down while the XMLDOM parses out the data.
As I said, version 4.0 of the parser is much better, but if you're going to be doing heavy searches against a large amount of data ... steer away from MSXML right now.
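One way to soften those slow DOM searches, if you're stuck with XML, is to walk the tree once and build an in-memory index keyed on the field you search, rather than re-running a DOM query for every lookup. A sketch in Python (element names again invented, and a plain dictionary standing in for whatever map structure your language offers):

```python
import xml.etree.ElementTree as ET
from collections import defaultdict

xml_data = """
<records>
  <patient id="1"><name>Smith</name></patient>
  <patient id="2"><name>Jones</name></patient>
  <patient id="3"><name>Smith</name></patient>
</records>
"""

# Walk the tree once, indexing each record by the search field,
# instead of repeating a full XPath scan for every query.
index = defaultdict(list)
for patient in ET.fromstring(xml_data).iter("patient"):
    index[patient.findtext("name")].append(patient.get("id"))

print(index["Smith"])  # every id whose <name> is Smith
```

The up-front parse is still as slow as ever, but each subsequent lookup becomes a hash lookup instead of a tree traversal, which matters once the searches are "heavy and often" as the original poster described.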
In terms of the SAX parser, it may be better, but then you might find yourself fighting portability issues; I honestly don't know.
Hope that helps a tad bit. Perhaps others will respond now as well.
10-12-2002, 08:36 PM
Thank you for the input. It helps a lot.