On 27 Jan 2009, at 10:27, Stuart Lewis wrote:

> Hi Ilias,
>
>> I am using the DSpace import tool to batch-ingest into a Postgres
>> database, and each record commit has become extremely slow.
>> Initially the speed was normal, but once the number of items reaches
>> around 30 thousand, the time taken per commit is unacceptable.
>> Is there any known problem with the maximum size of a DSpace database
>> using Postgres, or with the import tool?
>>
>> Any comments will be helpful.
>
> The following paper talks about this, and how DSpace performs when
> ingesting 1 million items:
>
> Testing the Scalability of a DSpace-based Archive, Dharitri Misra,
> James Seamans, George R. Thoma, National Library of Medicine,
> Bethesda, Maryland, USA
>
> http://www.dspace.org/images/stories/ist2008_paper_submitted1.pdf
>
> Is this one big import of 30,000 items, or do you break them up into
> smaller chunks?


Can I enquire where, within the standard DSpace toolset, the SIPIngestManager
used in the tests in this paper may be found? I haven't been able to
locate it.

--
Simon Brown <[email protected]> - Cambridge University Computing Service
+44 1223 3 34714 - New Museums Site, Pembroke Street, Cambridge CB2 3QH


