Re: [xwiki-users] reduce -Xmx after trying too large -Xmx?
On 03/17/2011 12:10 AM, Legeinfo wrote:
> Hello! Sergiu, you are right. The database is 230 MB (on my virtual mini Windows XP with 1.5 GB RAM). I tried your suggestion, but without luck.

Is all the data in the database important? You could try several things:

- clean up the database (while still using the large -Xmx) by removing unneeded attachments, cleaning the recycle bins, cleaning the attachment history table, resetting the history for all documents, and removing the previous activity events; hopefully this will reduce the database size enough to fit in memory, but the problem will resurface as subsequent users create new content
- continue turning other tables into CACHED tables until it works
- switch to a different database that doesn't load everything into memory but still works embedded (Apache Derby is a good candidate, and is supported by XWiki)
- if you're not in a hurry, you can wait for the 3.0 release and configure the filesystem attachment storage, which will not load attachments into memory at all; this will still be a problem, though, if the database size comes from documents rather than their attachments

-- 
Sergiu Dumitriu
http://purl.org/net/sergiu/
___ users mailing list users@xwiki.org http://lists.xwiki.org/mailman/listinfo/users
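[For readers considering the Derby option: a hedged sketch of the relevant Hibernate connection properties for an embedded Apache Derby database. The property names are standard Hibernate configuration keys; the database path in the URL is an assumption and must match your actual layout — this is not the exact XWiki-shipped configuration.]

```xml
<!-- Sketch only: swap the HSQLDB connection settings in hibernate.cfg.xml
     for embedded Apache Derby. The "database/xwiki" path is a placeholder. -->
<property name="connection.url">jdbc:derby:database/xwiki;create=true</property>
<property name="connection.driver_class">org.apache.derby.jdbc.EmbeddedDriver</property>
<property name="dialect">org.hibernate.dialect.DerbyDialect</property>
```

Unlike HSQLDB's default MEMORY tables, Derby keeps table data on disk and caches pages, so the heap requirement no longer grows with the database size.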
Re: [xwiki-users] reduce -Xmx after trying too large -Xmx?
Hello!

- Using HSQLDB from the latest xwiki-enterprise-installer-generic-3.0-milestone-3-standard.jar
- on Windows XP
- Java Standard Edition Runtime Environment version 6

I have increased -Xmx in start_xwiki.bat too much, from 300m to 900m. After reducing it back to 300m I get HTTP ERROR 500. What can I do to go back to -Xmx300m without an export/import?

Volker

HTTP ERROR 500
Problem accessing /xwiki/bin/view/Main/. Reason:
Error number 3 in 0: Could not initialize main XWiki context
Wrapped Exception: Error number 3001 in 3: Cannot load class com.xpn.xwiki.store.migration.hibernate.XWikiHibernateMigrationManager from param xwiki.store.migration.manager.class
Wrapped Exception: Error number 0 in 3: Exception while hibernate execute
Wrapped Exception: Could not create a DBCP pool. There is an error in the hibernate configuration file, please review it.
Caused by: com.xpn.xwiki.XWikiException: Error number 3 in 0: Could not initialize main XWiki context
Wrapped Exception: Error number 3001 in 3: Cannot load class com.xpn.xwiki.store.migration.hibernate.XWikiHibernateMigrationManager from param xwiki.store.migration.manager.class
Wrapped Exception: Error number 0 in 3: Exception while hibernate execute
Wrapped Exception: Could not create a DBCP pool. There is an error in the hibernate configuration file, please review it.
	at com.xpn.xwiki.XWiki.getMainXWiki(XWiki.java:416)
	at com.xpn.xwiki.XWiki.getXWiki(XWiki.java:485)
	at com.xpn.xwiki.web.XWikiAction.execute(XWikiAction.java:137)
	at com.xpn.xwiki.web.XWikiAction.execute(XWikiAction.java:117)
	at org.apache.struts.action.RequestProcessor.processActionPerform(RequestProcessor.java:431)
	at org.apache.struts.action.RequestProcessor.process(RequestProcessor.java:236)
	at org.apache.struts.action.ActionServlet.process(ActionServlet.java:1196)
	at org.apache.struts.action.ActionServlet.doGet(ActionServlet.java:414)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:707)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
	at org.mortbay.jetty.servlet.ServletHolder.handle(ServletHolder.java:511)
	at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1166)
	at com.xpn.xwiki.web.ActionFilter.doFilter(ActionFilter.java:129)
	at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1157)
	at com.xpn.xwiki.wysiwyg.server.filter.ConversionFilter.doFilter(ConversionFilter.java:152)
	at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1157)
	at com.xpn.xwiki.plugin.webdav.XWikiDavFilter.doFilter(XWikiDavFilter.java:68)
	at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1157)
	at org.xwiki.container.servlet.filters.internal.SavedRequestRestorerFilter.doFilter(SavedRequestRestorerFilter.java:218)
	at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1157)
	at org.xwiki.container.servlet.filters.internal.SetCharacterEncodingFilter.doFilter(SetCharacterEncodingFilter.java:112)
	at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1157)
	at org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:388)
	at org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:216)
	at org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:182)
	at org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:765)
	at org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:418)
	at org.mortbay.jetty.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:230)
	at org.mortbay.jetty.handler.HandlerCollection.handle(HandlerCollection.java:114)
	at org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152)
	at org.mortbay.jetty.Server.handle(Server.java:326)
	at org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:536)
	at org.mortbay.jetty.HttpConnection$RequestHandler.headerComplete(HttpConnection.java:915)
	at org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:539)
	at org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:212)
	at org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:405)
	at org.mortbay.io.nio.SelectChannelEndPoint.run(SelectChannelEndPoint.java:409)
	at org.mortbay.thread.QueuedThreadPool$PoolThread.run(QueuedThreadPool.java:582)
Wrapped Exception: com.xpn.xwiki.XWikiException: Error number 3001 in 3: Cannot load class com.xpn.xwiki.store.migration.hibernate.XWikiHibernateMigrationManager from param xwiki.store.migration.manager.class
Wrapped Exception: Error number 0 in 3: Exception while hibernate execute
Wrapped Exception: Could not create a DBCP pool. There is an
Re: [xwiki-users] reduce -Xmx after trying too large -Xmx?
On 03/16/2011 07:25 PM, volker wrote:
> Hello!
> - Using HSQLDB
> I have increased -Xmx too much, to 900m

Why do you say it is too much? There really shouldn't be any problem with that size, unless the machine on which XWiki is running has at most 1 GB of RAM.

> After reducing to 300m I get HTTP ERROR 500. What can I do to go back to -Xmx300m without an export/import?

Unless you have a syntax error, it should work fine with -Xmx300m. Note that there's no = in that parameter, so it's -Xmx300m and not -Xmx=300m.

> Volker

-- 
Sergiu Dumitriu
http://purl.org/net/sergiu/
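[To make the syntax point concrete, a sketch of the relevant line in start_xwiki.bat; the variable name XWIKI_OPTS matches what Volker quotes elsewhere in this thread, the rest of the file is not shown.]

```bat
REM Correct: the heap size is appended directly to -Xmx, with no '='
set XWIKI_OPTS=-Xmx300m

REM Wrong: the JVM rejects this form and fails to start
REM set XWIKI_OPTS=-Xmx=300m
```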
Re: [xwiki-users] reduce -Xmx after trying too large -Xmx?
Thanks for the reply, Sergiu Dumitriu.

From my start_xwiki.bat: set XWIKI_OPTS=-Xmx300m

I am developing a USB flash memory Windows solution that has to work on a minimal Windows XP platform, i.e. on every Windows XP machine. Is there a way back from xxx mb to 300m?

Volker
Re: [xwiki-users] reduce -Xmx after trying too large -Xmx?
On 03/16/2011 09:58 PM, Legeinfo wrote:
> Thanks for the reply, Sergiu Dumitriu. From my start_xwiki.bat: set XWIKI_OPTS=-Xmx300m
> I am developing a USB flash memory Windows solution that has to work on a minimal Windows XP platform, i.e. on every Windows XP machine. Is there a way back from xxx mb to 300m?

There really shouldn't be any technical reason preventing that. Perhaps you have too much data in the database? Note that HSQLDB is, by default, an in-memory database, meaning that it holds all its data in memory at runtime, so depending on the size of the database you need to increase the heap size accordingly.

You could try to manually change the database file so that large tables are of type CACHED instead of MEMORY. This will reduce the memory requirement, but will also decrease performance. To do this, edit database\xwiki_db.script and change the lines at the start from:

CREATE MEMORY TABLE ...

to:

CREATE CACHED TABLE ...

You can start by changing only the XWIKIATTACHMENT_ARCHIVE and XWIKIATTACHMENT_CONTENT tables, which are the ones holding attachments and which require the most memory. This reduces the memory requirement the most while having only a small impact on performance.

-- 
Sergiu Dumitriu
http://purl.org/net/sergiu/
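[The edit described above can be done in any text editor; as a sketch, here is the same change scripted with GNU sed, assuming the wiki is stopped and the script file is at database/xwiki_db.script as stated above. The backup filename is my own choice.]

```shell
# Back up the HSQLDB script file before touching it (wiki must be stopped).
cp database/xwiki_db.script database/xwiki_db.script.bak

# Switch only the two attachment tables from MEMORY to CACHED,
# leaving every other table definition untouched.
sed -i \
  -e 's/^CREATE MEMORY TABLE \(XWIKIATTACHMENT_ARCHIVE\)/CREATE CACHED TABLE \1/' \
  -e 's/^CREATE MEMORY TABLE \(XWIKIATTACHMENT_CONTENT\)/CREATE CACHED TABLE \1/' \
  database/xwiki_db.script
```

On a bare Windows XP machine there is no sed, so opening the file in Notepad and changing those two lines by hand is equivalent.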
Re: [xwiki-users] reduce -Xmx after trying too large -Xmx?
Hello! Sergiu, you are right. The database is 230 MB (on my virtual mini Windows XP with 1.5 GB RAM). I tried your suggestion, but without luck.

Volker