Hi all,

Back in April 2013, I asked the community about DSpace scalability; see:

http://dspace.2283337.n4.nabble.com/DSpace-scalability-tens-of-hundreds-TBs-tt4662988.html#a4663047

Now, in 2019, it is time to ask the same question :-).

How much data / how many items can DSpace handle? The DSpace system at 
Cambridge University (https://www.repository.cam.ac.uk/) was reported as 
the largest back then. I can see it stores about 245 thousand items nowadays.

Does anyone else have a bigger one? Is there any new information on 
scalability since 2013?

Regards,

Vlastik Krejčíř

--
----------------------------------------------------------------------------
Vlastimil Krejčíř
Library and Information Centre, Institute of Computer Science
Masaryk University, Brno, Czech Republic
Email: krejcir (at) ics (dot) muni (dot) cz
Phone: +420 549 49 3872
OpenPGP key: https://kic-internal.ics.muni.cz/~krejvl/pgp/ 
Fingerprint: 7800 64B2 6E20 645B 56AF  C303 34CB 1495 C641 11B9
----------------------------------------------------------------------------
