On Monday, December 17, 2001, at 06:45 AM, Ross Gardler wrote:
We are currently nearing the end of a development project using TEXTML. Unfortunately we have hit a problem with this database that makes it completely unworkable in our system. Ixiasoft (the company behind TEXTML) are working on a solution, but in the meantime we are looking at alternatives.
Our requirements are for a medium-sized XML database containing around 2 million documents, giving a total of around 50 million nodes. The largest documents will have around 50 nodes and be around 1 MB in size. We expect the size of the database to grow by around 20% year on year for at least the next five years, after which growth will slow.
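For planning purposes, that 20% year-on-year growth compounds to roughly 2.5x over five years. A quick sketch (the starting figures are the ones quoted above; treating the growth as compound is my assumption):

```python
# Rough capacity projection: 20% compound growth per year.
# Starting figures are taken from the post; compounding is assumed.
docs = 2_000_000
nodes = 50_000_000
for year in range(1, 6):
    docs = int(docs * 1.2)
    nodes = int(nodes * 1.2)
    print(f"year {year}: ~{docs:,} documents, ~{nodes:,} nodes")
```

So by year five the store would need to hold on the order of 5 million documents and 125 million nodes.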
We currently have 50 users making changes to the database simultaneously; this is likely to grow to around 100 over the coming year. There are also a further 100 or so users reading the database at any one time.
Since this is a commercial project (due for roll-out in around two months), my concern is simple: can dbXML cope with this workload?
The largest number of documents I've ever put into the server is a little over 1 million in a single collection. I stopped because I was tired of it slowing down my workstation. This was not a real application, though, and on many file systems it would have hit the file size limit.

While our goal is certainly for it to be able to handle this type of thing, the reality is that at this point it has not been proven to handle it in a real application, and I would expect some level of problems. My recommendation is that if you're looking to do this, you should run your own set of tests. A lot will also depend on the platform you are using. Getting some more feedback here would be very helpful.
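A test along those lines might start by generating a batch of synthetic documents and timing how fast they can be loaded. A minimal sketch (the document shape, node counts, and file-based timing are invented placeholders, not the poster's actual schema or the dbXML API):

```python
import os
import tempfile
import time

def make_doc(i, node_count=50):
    # Build a synthetic XML document with the given number of child nodes,
    # mirroring the "around 50 nodes" figure quoted in the requirements.
    nodes = "".join(f"<node id='{n}'>payload {n}</node>" for n in range(node_count))
    return f"<doc id='{i}'>{nodes}</doc>"

def generate_batch(directory, count=1000):
    # Write `count` documents to disk and report throughput.
    # Writing files here is a stand-in for inserting into the
    # database under test; swap in real insert calls to benchmark it.
    start = time.time()
    for i in range(count):
        with open(os.path.join(directory, f"doc{i}.xml"), "w") as f:
            f.write(make_doc(i))
    elapsed = time.time() - start
    return count / elapsed  # documents per second

with tempfile.TemporaryDirectory() as d:
    rate = generate_batch(d, count=1000)
    print(f"wrote 1000 docs at ~{rate:.0f} docs/sec")
```

Scaling the count toward the 2 million target, with document sizes pushed up toward 1 MB, would give a much more realistic picture than any general assurance.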
Kimbro Staken
XML Database Software, Consulting and Writing
http://www.xmldatabases.org/