Hello Vijay, please continue this thread on the user-list, because it is clearly not a dev-list question.
Also, I hope you realize your question is rather vague. Are you asking how large a single binary data blob can be, or how many nodes Jackrabbit can handle? Obviously, it depends on your hardware, clustering, the data modelling of the repository, the queries you run, which persistence manager you use, the response times you expect, the number of queries per second, and so on.

Perhaps you can provide us (on the user-list) with some information about the kind of data and model you have in mind, and we may be able to give you some pointers. At the moment, all I can say is that if you model your content correctly [1] and know which kinds of queries might be heavy [2], you should be able to have 'a lot' of data handled efficiently. I know 'a lot' is pretty vague, but it is the most sensible answer I can give based on the information you have provided.

Regards,
Ard

[1] http://wiki.apache.org/jackrabbit/DavidsModel
[2] http://mail-archives.apache.org/mod_mbox/jackrabbit-users/200801.mbox/%3 [EMAIL PROTECTED]

> Hi,
>
> What is maximum amount of data that can be handled
> efficiently by Apache Jackrabbit?
>
> Regards,
> Vijay Makhija
