That is what should happen, unless you acknowledge the message only after
attempting to send another message to the same queue.  In that situation you
get a deadlock: the send blocks on the full queue, the acknowledgement never
happens, and so no messages are removed from the queue to free space.  I
suggest you send to a different queue.
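To illustrate the ordering problem, here is a minimal sketch using plain
java.util.concurrent (not the ActiveMQ API); a bounded BlockingQueue stands
in for a queue that has hit its memory limit:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.TimeUnit;

public class SameQueueDeadlockSketch {
    public static void main(String[] args) throws InterruptedException {
        // A bounded queue stands in for a queue at its memory limit.
        BlockingQueue<String> queue = new ArrayBlockingQueue<>(2);
        queue.put("m1");
        queue.put("m2"); // the queue is now "full"

        // Deadlock-prone order: try to send to the same full queue
        // *before* acking (i.e. removing) the message being processed.
        // With a real broker this send blocks forever; here we use a
        // timeout so the sketch terminates.
        boolean sent = queue.offer("reply", 100, TimeUnit.MILLISECONDS);
        System.out.println("send before ack succeeded: " + sent); // false

        // Safe order: ack first (remove the message, freeing space),
        // then send -- or better, send to a different queue entirely.
        queue.take();
        sent = queue.offer("reply", 100, TimeUnit.MILLISECONDS);
        System.out.println("send after ack succeeded: " + sent); // true
    }
}
```

With a real broker the blocked send never times out, so the consumer thread
never reaches the acknowledge call and the whole system wedges, which is why
sending replies to a separate queue avoids the cycle.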


On 6/27/06, bdiesenhaus <[EMAIL PROTECTED]> wrote:


We are running a test with 1 producer and 20 consumers, with the memory
limit on the queue set to 1 GB. When the memory limit is hit, everything is
wedged. What I would expect to happen is that once a message is consumed,
the producer can send another message.

We are using amq 4.0.1.

Broker config is:

<beans>

  <broker persistent="false" useJmx="true"
          xmlns="http://activemq.org/config/1.0">

    <memoryManager>
        <usageManager id="memory-manager" limit="2147483648"/>
    </memoryManager>

    <destinationPolicy>
      <policyMap><policyEntries>
          <policyEntry topic=">" memoryLimit="104857600"/>
          <policyEntry queue=">" memoryLimit="104857600"/>
      </policyEntries></policyMap>
    </destinationPolicy>


    <transportConnectors>
       <transportConnector name="default" uri="tcp://localhost:61616"
discoveryUri="multicast://bedrock"/>
    </transportConnectors>

  </broker>

</beans>


--
View this message in context:
http://www.nabble.com/Queue-Memory-Limits-tf1857084.html#a5071490
Sent from the ActiveMQ - User forum at Nabble.com.




--
Regards,
Hiram

Blog: http://hiramchirino.com
