Xindice is definitely not suited for large XML files; if your file size can be measured in megabytes, Xindice is not a good idea (see Xindice FAQ No. 10). It works well for tiny to small documents and can handle a lot of them. So if you use, let's say, 10 collections, each containing 500 files of 20 KB each (= 100 MB of data), and this is sufficient for your needs, and you can split your data up that way, then it may work (unless the 500 users must write data, as I mentioned earlier). Nevertheless, storing the records of an RDBMS as XML only makes sense if the data has a structure that XML suits well. It does not make any sense to put purely relational data into any kind of XML database, for whatever reason; the RDBMS will always be a lot faster in that case.
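To make the splitting idea concrete, here is one rough sketch of how small documents could be spread over a fixed set of collections through the standard XML:DB API. The URI, the collection names (data0 .. data9) and the hash-based choice of collection are only examples, not something Xindice requires; adjust them to your own setup.

    // Rough sketch only: spread small documents over a fixed set of Xindice
    // collections via the XML:DB API. Assumes collections /db/data0 .. /db/data9
    // already exist and that the base URI matches your installation.
    import org.xmldb.api.DatabaseManager;
    import org.xmldb.api.base.Collection;
    import org.xmldb.api.base.Database;
    import org.xmldb.api.modules.XMLResource;

    public class ShardedStore {

        private static final String BASE_URI = "xmldb:xindice://localhost:8080/db/data";
        private static final int COLLECTIONS = 10;

        public static void store(String docId, String xml) throws Exception {
            // Register the Xindice XML:DB driver (do this once per JVM in real code).
            Database driver = (Database) Class
                    .forName("org.apache.xindice.client.xmldb.DatabaseImpl")
                    .newInstance();
            DatabaseManager.registerDatabase(driver);

            // Pick one of the collections by hashing the document id, so each
            // collection only ever holds a few hundred small files.
            int shard = Math.abs(docId.hashCode()) % COLLECTIONS;
            Collection col = DatabaseManager.getCollection(BASE_URI + shard);
            try {
                XMLResource res = (XMLResource)
                        col.createResource(docId, XMLResource.RESOURCE_TYPE);
                res.setContent(xml);
                col.storeResource(res);
            } finally {
                col.close();
            }
        }
    }

This way every single collection stays small even though the total amount of data is around 100 MB.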
Bjoern Eickvonder

From: Sasikanth Tenneti [mailto:[EMAIL PROTECTED]
Sent: Thursday, 28 July 2005 08:31
To: xindice-users@xml.apache.org
Subject: RE: Performance
Importance: High
Thank you very much, Eickvonder. I am sorry for the confusion about the number I gave: in my current RDBMS I have 20 million records, and I am trying to push most of them (around 60%) into Xindice as XML (maybe in 10 different XML files). Each XML file will definitely be very bulky. We are doing this because, based on our client data, we are not finding any standard normalization technique for our current RDBMS data model; it changes too frequently. I thought of putting all the un-normalized data into XML and the remaining data into the RDBMS.

Regards
--SasiKanth
From: Eickvonder Bjoern [mailto:[EMAIL PROTECTED]

That's really a huge number. You mean you have 20 million XML files to store? What size does each file have? What kind of requests do you expect (give me this or that document, or do you have to search for specific data within all of these 20 million records)? I personally think that Xindice will not be suited for your needs. I already had trouble with a few dozen collections, because Xindice opens file descriptors for each of the .tbl and .idx files for concurrent access but never closes them (until shutdown), so I got a "too many open files" error on a Linux system. Moreover, I had problems when multiple users were adding/writing (different) files to a collection concurrently (some data was later corrupt or missing), so I had to synchronize the write access to each collection. So if all of these 500 users have to write data, this will be a problem for you too.
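One simple way to serialize the writes per collection could look like the sketch below. This is illustrative only, not the code actually used on that system; storeDocument() is a placeholder for whatever XML:DB call really performs the insert, and it only protects against concurrent writes from within the same JVM.

    // Rough sketch: serialize write access per collection so that two threads
    // never store into the same Xindice collection at the same time.
    import java.util.concurrent.ConcurrentHashMap;
    import java.util.concurrent.ConcurrentMap;

    public class CollectionWriteGuard {

        // One lock object per collection name.
        private final ConcurrentMap<String, Object> locks =
                new ConcurrentHashMap<String, Object>();

        public void write(String collection, String docId, String xml) throws Exception {
            Object lock = locks.get(collection);
            if (lock == null) {
                Object fresh = new Object();
                Object existing = locks.putIfAbsent(collection, fresh);
                lock = (existing != null) ? existing : fresh;
            }
            synchronized (lock) {
                // Only one thread at a time writes into this collection.
                storeDocument(collection, docId, xml);
            }
        }

        private void storeDocument(String collection, String docId, String xml)
                throws Exception {
            // Placeholder for the actual XML:DB store call.
        }
    }

If several processes write to the same collection, this kind of in-process locking is of course not enough and some external coordination would be needed.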
Bjoern Eickvonder

From: Sasikanth Tenneti [mailto:[EMAIL PROTECTED]

Dear All,
I am trying to move the content of one of our product databases into an XML Xindice DB. The volume I am looking at is around 20 million records. Can Xindice scale to that many records efficiently, with about 500 concurrent user requests? Please help me get some focus on this.

Thank you in advance
--SasiKanth