Hi Marcelo,

> Every day I import hundreds of PDFs into my DSpace, and I developed a small 
> program to help with this task. Afterwards, I need to update my index.
> 
> I run filter-media.sh with the -s parameter to skip the previous top 
> communities. But the index-update is very slow, taking around 2 hours. How can I 
> optimize this task?

It's hard to say without knowing a bit more detail. For example:

 - How big are the PDFs?
 - What is your server setup like? (One server, or DSpace on a different server 
to your database?)
 - How fast are your disks?

Really you'll need to do a bit of work to see where the bottlenecks are. It 
may be that your disks are slow, you're running out of RAM (databases perform 
better with lots of RAM), the PDFs are very big so they take a while to 
process, your processors are at capacity, etc.
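As a starting point, something like the following sketch can capture basic system stats and time the re-index, so you can compare runs. The paths (the `[dspace]` install directory, the log location) are assumptions for illustration; adjust them to your setup:

```shell
#!/bin/sh
# Hypothetical diagnostic sketch -- paths are assumptions, adjust to your install.
# Capture basic system stats, then time the indexer, to see whether disk,
# RAM, or CPU is the bottleneck.

LOG=$(mktemp /tmp/dspace-index-stats.XXXXXX)
DSPACE_BIN=/dspace/bin                   # assumption: your [dspace]/bin directory

date          >> "$LOG"
free -m       >> "$LOG" 2>&1 || true     # RAM and swap usage (Linux)
vmstat 1 3    >> "$LOG" 2>&1 || true     # a high 'wa' column means waiting on disk
df -h         >> "$LOG" 2>&1 || true     # space and mounts for the assetstore disk

# Time the re-index to get a baseline you can compare after each change:
if [ -x "$DSPACE_BIN/index-update" ]; then
    { time "$DSPACE_BIN/index-update"; } >> "$LOG" 2>&1
fi

echo "Stats written to $LOG"
```

If the `wa` (I/O wait) figure is high while the indexer runs, look at your disks first; if swap usage grows, look at RAM.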

If you can answer some of these questions, we may be able to help you tune your 
setup, or suggest ways of improving the process.
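On the database side, if you're on PostgreSQL (the DSpace default), the usual starting points are the memory settings in postgresql.conf. The values below are illustrative assumptions for a machine with around 4 GB of RAM, not DSpace defaults -- tune them to your hardware:

```
# postgresql.conf -- illustrative values, assuming ~4 GB RAM
shared_buffers = 512MB        # roughly 1/8 of RAM is a common starting point
effective_cache_size = 2GB    # estimate of what the OS file cache holds
work_mem = 16MB               # per-sort/per-hash memory
maintenance_work_mem = 128MB  # speeds up index builds and VACUUM
```

Restart PostgreSQL after changing these, and adjust one setting at a time so you can see what actually helps.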

Thanks, and good luck,


Stuart Lewis
IT Innovations Analyst and Developer
Te Tumu Herenga The University of Auckland Library
Auckland Mail Centre, Private Bag 92019, Auckland 1142, New Zealand
Ph: +64 (0)9 373 7599 x81928


_______________________________________________
DSpace-tech mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/dspace-tech
