Hi Havercamp,
Thanks a lot for your reply. But won't more VMs consume more resources
and more CPU? I have only one server for everything (DNS, mail, web
hosting, ...).
Best, 
Cachiusa

Mr Havercamp wrote:
> 
> Splitting the load across virtual machines may help. Instead of
> running 24 DSpace instances on one machine, use something like 2
> virtual machines with 12 instances each, or 4 with 6, or 3 with 8,
> etc.
> 
> cachiusa wrote:
>> Dear all,
>> I am looking for a solution to run 24 DSpace instances, one per
>> community, on a single server.
>> Which solution should I choose:
>>
>> 1) 24 instances of Tomcat, one for each DSpace instance, or
>> 2) one Tomcat hosting all 24 DSpace instances.
>>
>> I chose the second solution, following
>> http://www.nabble.com/Changing-DB-Name-%C2%BFhow--td19053520.html
>> and have already finished the installation. But the server breaks if
>> I try to access all 24 DSpace sites at the same time (opening 12 is
>> fine).
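>>
>> (If I went back to the first solution, my understanding is that the
>> extra Tomcats would not need separate installations: they can share
>> one CATALINA_HOME and each get their own CATALINA_BASE. A minimal
>> sketch, assuming a standard Tomcat layout; the paths and instance
>> name below are made up:
>>
>>   # one shared Tomcat installation
>>   export CATALINA_HOME=/usr/local/tomcat
>>   # per-instance directory holding its own conf/, logs/, temp/,
>>   # webapps/ and work/
>>   export CATALINA_BASE=/srv/tomcat/dspace01
>>   # conf/server.xml in each CATALINA_BASE must use unique ports
>>   $CATALINA_HOME/bin/startup.sh
>>
>> But 24 separate JVMs would multiply the memory cost, which is why I
>> preferred the second solution.)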
>>
>> Some sites show the error:
>> java.lang.OutOfMemoryError: PermGen space
>>
>> The others return errors:
>>  java.io.FileNotFoundException:
>> /usr/local/gmseenet/partners/gmseenet/dspace/webapps/jspui/WEB-INF/web.xml
>> (Too many open files)
>>  
>> or: 
>> org.apache.jasper.JasperException: Unable to compile class for JSP
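>>
>> (I wonder whether the "Too many open files" error means the
>> process's file descriptor limit is too low; 24 webapps keep a lot of
>> jars, JSPs and sockets open at once. A sketch of what I could check,
>> assuming a Linux-style system; the 8192 value is only an
>> illustration:
>>
>>   # show the limit in the shell that launches Tomcat
>>   ulimit -n
>>   # raise it before startup, e.g. in the Tomcat init script
>>   ulimit -n 8192
>>
>> A permanent limit could go in /etc/security/limits.conf as a
>> "nofile" entry for the user that runs Tomcat.)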
>>
>> I have already set the Tomcat heap to 2048 MB with
>> JAVA_OPTS="-Xmx2048M -Xms64M -Dfile.encoding=UTF-8", but I still
>> face the same problem.
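>>
>> (As I understand it, -Xmx only sizes the heap; the "PermGen space"
>> error comes from the permanent generation, which the Sun JVM sizes
>> separately. A sketch of what I could try, assuming the Sun JVM; the
>> 256M value is just a guess:
>>
>>   JAVA_OPTS="-Xmx2048M -Xms64M -XX:MaxPermSize=256M -Dfile.encoding=UTF-8"
>>
>> Each deployed webapp loads its own copy of the DSpace classes, so 24
>> instances in one Tomcat need far more PermGen than one does.)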
>>
>> I have to run all 24 DSpace instances on one machine, so what is
>> the solution for me? Please help me. Thanks in advance.
>> Best,
>> Cachiusa
>>
>>
>> Robin Taylor-2 wrote:
>>   
>>> Hi Mika,
>>>
>>> For what it is worth, we currently run 5 DSpace instances in the
>>> same Tomcat without any problems. Using Tomcat manager allows us to
>>> stop/start/deploy one instance without affecting the others. We are
>>> running on a Sun/Solaris v240. We run an Apache web server in front
>>> of Tomcat so that users do not have to append :8080 to the URL. We
>>> allocate 2 GB of memory to Tomcat; my understanding is that Tomcat
>>> doesn't like having more, although that may just be a vicious
>>> rumour. We did investigate separating the DSpace instances into
>>> different virtual machines, or into different instances of Tomcat
>>> on the same machine, but concluded that we would be introducing a
>>> layer of complexity and increased maintenance for no real gain.
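>>>
>>> (For the Apache front end, the relevant part of our setup is
>>> roughly the following; a from-memory sketch assuming mod_proxy is
>>> loaded, and the context name and port are illustrative:
>>>
>>>   ProxyPass        /jspui http://localhost:8080/jspui
>>>   ProxyPassReverse /jspui http://localhost:8080/jspui
>>>
>>> mod_jk is a common alternative to mod_proxy for the same job.)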
>>>
>>> Cheers, Robin.
>>>
>>> -- 
>>> The University of Edinburgh is a charitable body, registered in
>>> Scotland, with registration number SC005336.
>>>
>>>
>>> -----Original Message-----
>>> From: [EMAIL PROTECTED]
>>> [mailto:[EMAIL PROTECTED] On Behalf Of Mika
>>> Stenberg
>>> Sent: 08 April 2008 12:38
>>> To: '[email protected]'
>>> Subject: [Dspace-tech] Hardware used in DSpace repositories
>>>
>>> We are planning to centralize all the DSpace instances used in our
>>> University onto one server. Eventually a total of 5-6 individual
>>> DSpace instances would end up running on the same platform, either
>>> in virtualized OSes or simply distributed across different ports.
>>>
>>> What I'd be interested in is learning about your experiences with
>>> this and gathering the following information from other members of
>>> the community:
>>>
>>> 1)  How many instances of DSpace does your institution run and
>>>     what kind of hardware (CPU, memory, disk space & OS) are they
>>>     running on?
>>>
>>> 2)  How many items are stored in your repository?
>>>
>>> 3)  Have you experienced any performance issues with your
>>>     repositories using the hardware described above?
>>>
>>>
>>> You can reply directly to me by e-mail at
>>> mika.stenberg(at)helsinki.fi. If other parties are interested in a
>>> summary, I will be happy to provide it.
>>>
>>> Thanks for the input,
>>>
>>> Mika Stenberg
>>> University of Helsinki
>>> Finland
>>>

_______________________________________________
DSpace-tech mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/dspace-tech
