Hi, I have a Spring-based solution using ActiveMQ. I recently upgraded ActiveMQ from 4.1 to 5.0.
What I am observing is that ActiveMQ 5.0 throws an OutOfMemoryError under the stress-test conditions described below, whereas ActiveMQ 4.1 did not fail at the same load. My setup is as follows.

Jar used: activemq-all-5.0.0.jar

Configuration file used:

<!-- START SNIPPET: example -->
<beans xmlns="http://activemq.org/config/1.0">

  <!-- Allows us to use system properties as variables in this configuration file -->
  <bean class="org.springframework.beans.factory.config.PropertyPlaceholderConfigurer"/>

  <broker useJmx="true">

    <!-- Use the following to set the broker memory limit (in bytes)
    <memoryManager>
      <usageManager id="memory-manager" limit="1048576"/>
    </memoryManager>
    -->

    <!-- memoryUsage> <limit>20 mb</limit> </memoryUsage -->

    <!-- Use the following to configure how ActiveMQ is exposed in JMX
    <managementContext>
      <managementContext connectorPort="1099" jmxDomainName="org.apache.activemq"/>
    </managementContext>
    -->

    <!-- In ActiveMQ 4, you can setup destination policies -->
    <destinationPolicy>
      <policyMap><policyEntries>
        <policyEntry topic="FOO.>">
          <dispatchPolicy>
            <strictOrderDispatchPolicy />
          </dispatchPolicy>
          <subscriptionRecoveryPolicy>
            <lastImageSubscriptionRecoveryPolicy />
          </subscriptionRecoveryPolicy>
        </policyEntry>
      </policyEntries></policyMap>
    </destinationPolicy>

    <persistenceAdapter>
      <journaledJDBC journalLogFiles="5" dataDirectory="${activemq.home}/activemq-data"/>
      <!-- To use a different datasource, use the following syntax : -->
      <!--
      <journaledJDBC journalLogFiles="5" dataDirectory="../activemq-data" dataSource="#postgres-ds"/>
      -->
    </persistenceAdapter>

    <transportConnectors>
      <transportConnector name="default" uri="tcp://localhost:61616" discoveryUri="multicast://default"/>
      <transportConnector name="stomp" uri="stomp://localhost:61613"/>
    </transportConnectors>

    <systemUsage>
      <systemUsage>
        <memoryUsage>
          <memoryUsage limit="5 mb" percentUsageMinDelta="20"/>
        </memoryUsage>
        <tempUsage>
          <tempUsage limit="10 mb"/>
        </tempUsage>
        <storeUsage>
          <storeUsage limit="1 gb" name="foo"/>
        </storeUsage>
      </systemUsage>
    </systemUsage>

    <networkConnectors>
      <!-- by default just auto discover the other brokers -->
      <networkConnector name="default" uri="multicast://default"/>
      <!--
      <networkConnector name="host1 and host2" uri="static://(tcp://host1:61616,tcp://host2:61616)" failover="true"/>
      -->
    </networkConnectors>

  </broker>

  <!-- This xbean configuration file supports all the standard spring xml configuration options -->

  <!-- Postgres DataSource Sample Setup -->
  <!--
  <bean id="postgres-ds" class="org.postgresql.ds.PGPoolingDataSource">
    <property name="serverName" value="localhost"/>
    <property name="databaseName" value="activemq"/>
    <property name="portNumber" value="0"/>
    <property name="user" value="activemq"/>
    <property name="password" value="activemq"/>
    <property name="dataSourceName" value="postgres"/>
    <property name="initialConnections" value="1"/>
    <property name="maxConnections" value="10"/>
  </bean>
  -->

  <!-- MySql DataSource Sample Setup -->
  <!--
  <bean id="mysql-ds" class="org.apache.commons.dbcp.BasicDataSource" destroy-method="close">
    <property name="driverClassName" value="com.mysql.jdbc.Driver"/>
    <property name="url" value="jdbc:mysql://localhost/activemq?relaxAutoCommit=true"/>
    <property name="username" value="activemq"/>
    <property name="password" value="activemq"/>
    <property name="poolPreparedStatements" value="true"/>
  </bean>
  -->

  <!-- Embedded Derby DataSource Sample Setup -->
  <!--
  <bean id="derby-ds" class="org.apache.derby.jdbc.EmbeddedDataSource">
    <property name="databaseName" value="derbydb"/>
    <property name="createDatabase" value="create"/>
  </bean>
  -->

</beans>
<!-- END SNIPPET: example -->

Load generated: I create 100 topics with 1 subscriber per topic. The topics are created at runtime and use all default settings. A module publishes messages in a loop to generate heavy message load; I publish 1000 messages per topic. The acknowledgement mode is AUTO_ACKNOWLEDGE. (A simplified sketch of the test code is in the P.S. below.)

Results: With ActiveMQ 5.0, I get an OutOfMemoryError when 27000 messages have been published and consumed. The consumer is a do-nothing consumer. I have tried configuring the broker's system usage limits, but there is no change.

Is there anything that I am missing or doing incorrectly?

~Tushar