Hello all,
I have already posted this to dspace-general, but without much effect. I
hope it will fare better here :-).
I was recently asked a question about DSpace scalability. Assume the
following project:
16 million items (bitstreams totalling about 230 TB), growing by 3
million items (86 TB) per year
Can DSpace handle this? My answer was "I don't know". Is anyone
working with data at this scale? What is your opinion?
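For what it's worth, here is a quick back-of-the-envelope sketch of what
those figures imply per item and over time (the five-year horizon and the
binary interpretation of "TB" are my own assumptions, not part of the
original requirements):

    # Rough arithmetic for the repository described above.
    # Assumes binary terabytes; the 5-year horizon is illustrative only.
    TB = 1024 ** 4  # bytes per TB (binary)

    initial_items = 16_000_000
    initial_bytes = 230 * TB
    yearly_items = 3_000_000
    yearly_bytes = 86 * TB

    print(f"avg existing item: {initial_bytes / initial_items / 2**20:.1f} MiB")
    print(f"avg new item:      {yearly_bytes / yearly_items / 2**20:.1f} MiB")

    for year in range(1, 6):
        items = initial_items + year * yearly_items
        size_tb = (initial_bytes + year * yearly_bytes) / TB
        print(f"after year {year}: {items:,} items, ~{size_tb:.0f} TB of bitstreams")

So after five years we would be at roughly 31 million items and over 650 TB
of bitstreams.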
Regards,
Vlastik
----------------------------------------------------------------------------
Vlastimil Krejčíř
Library and Information Centre, Institute of Computer Science
Masaryk University, Brno, Czech Republic
Email: krejcir (at) ics (dot) muni (dot) cz
Phone: +420 549 49 3872
ICQ: 163963217
Jabber: [email protected]
----------------------------------------------------------------------------