> Given that the test in the paper uses neither postgres nor the DSpace
> import tool, that seems unlikely.

30,000 items shouldn't pose a big problem for any mature DBMS (e.g. Postgres
or MySQL). If there are problems at that scale, they are more likely to lie
in other parts of the system.

We've recently finished testing DSpace ingest up to 333,000 items using
Postgres. Again, this wasn't using the batch importer, but SWORD deposits
instead. Deposits into an empty repository took about 1.5 seconds each, and
at a third of a million items they took about 7 seconds each. So the
problems probably aren't with Postgres. For details see:

http://blog.stuartlewis.com/2009/01/19/dspace-at-a-third-of-a-million-items/
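
If you want to run a rough version of this kind of test yourself, here's a
minimal sketch in Python. The endpoint URL, credentials, and package file
are placeholders (not the values from our test), and it assumes a DSpace
SWORD 1.3 deposit endpoint that accepts a METS/DSpaceSIP zip package:

  #!/usr/bin/env python
  # Time repeated SWORD deposits into a DSpace collection.
  # The URL, credentials, and package file below are placeholders.
  import time
  import requests

  DEPOSIT_URL = "http://localhost:8080/sword/deposit/123456789/2"
  PACKAGE = "item.zip"  # a METS/DSpaceSIP zip package
  AUTH = ("user@example.com", "password")

  with open(PACKAGE, "rb") as f:
      payload = f.read()

  headers = {
      "Content-Type": "application/zip",
      # SWORD 1.3 packaging header telling DSpace how to unpack the SIP
      "X-Packaging": "http://purl.org/net/sword-types/METSDSpaceSIP",
      "Content-Disposition": "filename=item.zip",
  }

  for i in range(100):
      start = time.time()
      r = requests.post(DEPOSIT_URL, data=payload, headers=headers,
                        auth=AUTH)
      # HTTP 201 Created indicates a successful deposit
      print("deposit %d: HTTP %d in %.2f seconds"
            % (i, r.status_code, time.time() - start))

Each deposit creates a new item, so the per-deposit time can be tracked
against repository size as it grows.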

Cheers,


Stuart
_________________________________________________________________

Gwasanaethau Gwybodaeth                      Information Services
Prifysgol Aberystwyth                      Aberystwyth University

            E-bost / E-mail: [email protected]
                 Ffon / Tel: (01970) 622860
_________________________________________________________________

