Hi Ilias,

>  I am using the DSpace import tool for batch ingesting into a Postgres
> database, and each record commit is extremely slow. Initially the speed
> was normal, but once the item count reached around 30,000, the time per
> commit became unacceptable.
> Is there any known problem with the maximum size of a DSpace database
> using Postgres, or with the import tool?
> 
> Any comments will be helpful.

The following paper discusses this issue, and reports how DSpace performs when
ingesting one million items:

Dharitri Misra, James Seamans, and George R. Thoma, "Testing the Scalability
of a DSpace-based Archive", National Library of Medicine, Bethesda, Maryland,
USA

http://www.dspace.org/images/stories/ist2008_paper_submitted1.pdf

Is this one big import of 30,000 items, or do you break it up into smaller
chunks?
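If you aren't chunking yet, a minimal sketch of what I mean is below. It just
walks the item range in batches of 1,000 and runs one import per batch; the
eperson, collection handle, source paths, and mapfile names are placeholders,
and the exact import flags depend on your DSpace version, so treat the
commented-out command as an assumption to adapt, not a recipe:

```shell
#!/bin/sh
# Sketch: split a 30,000-item batch into chunks of 1,000 and run one
# import per chunk, so each run commits a bounded amount of work.
CHUNK=1000
TOTAL=30000

i=0
nchunks=0
while [ "$i" -lt "$TOTAL" ]; do
  echo "importing chunk starting at item $i"
  # Hypothetical invocation -- adjust paths/flags for your install:
  # [dspace]/bin/dspace import --add --eperson=admin@example.com \
  #   --collection=123456789/1 --source=./chunk_"$i" --mapfile=map_"$i".txt
  i=$((i + CHUNK))
  nchunks=$((nchunks + 1))
done
echo "ran $nchunks chunked imports"
```

Each chunk also gets its own mapfile, which makes it easier to resume or roll
back a failed batch without touching the items that already went in.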

Thanks,


Stuart
_________________________________________________________________

Gwasanaethau Gwybodaeth                      Information Services
Prifysgol Aberystwyth                      Aberystwyth University

            E-bost / E-mail: stuart.le...@aber.ac.uk
                 Ffon / Tel: (01970) 622860
_________________________________________________________________


_______________________________________________
DSpace-tech mailing list
DSpace-tech@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/dspace-tech
