[ https://issues.apache.org/jira/browse/AMQ-5942?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14746444#comment-14746444 ]

Christopher L. Shannon commented on AMQ-5942:
---------------------------------------------

[~boday], Seems like the best thing to do here is to switch the default 
implementation to SimpleMessageGroupMapFactory instead of an LRU cache (as you 
mentioned in the original description) to prevent confusion, and then document 
how to opt into an LRU cache through configuration if desired.

[~tabish121], any reason you can think of not to switch out the default LRU 
implementation?  My main concern would be breaking existing users who expect 
an LRUCache to be used by default, but I'm not sure how much of an issue that 
really is. 
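
For reference, a minimal sketch of what that configuration switch could look 
like on an embedded broker.  This is only a sketch: it assumes the 5.x 
broker-region policy classes and a build where the cached factory's size is 
configurable, so verify the class and setter names against your release:

{code:java}
import org.apache.activemq.broker.BrokerService;
import org.apache.activemq.broker.region.group.CachedMessageGroupMapFactory;
import org.apache.activemq.broker.region.group.SimpleMessageGroupMapFactory;
import org.apache.activemq.broker.region.policy.PolicyEntry;
import org.apache.activemq.broker.region.policy.PolicyMap;

public class MessageGroupPolicyExample {
    public static void main(String[] args) throws Exception {
        BrokerService broker = new BrokerService();

        PolicyEntry policy = new PolicyEntry();
        policy.setQueue(">"); // apply to all queues

        // Option 1: unbounded map, so group owners are never evicted
        policy.setMessageGroupMapFactory(new SimpleMessageGroupMapFactory());

        // Option 2: keep the LRU cache but size it above the expected
        // number of distinct JMSXGroupID values (assumes a build where
        // the cache size is configurable)
        //CachedMessageGroupMapFactory cached = new CachedMessageGroupMapFactory();
        //cached.setCacheSize(2048);
        //policy.setMessageGroupMapFactory(cached);

        PolicyMap policyMap = new PolicyMap();
        policyMap.setDefaultEntry(policy);
        broker.setDestinationPolicy(policyMap);
        broker.start();
    }
}
{code}

The same choice should also be expressible declaratively in activemq.xml via 
the destination policy; see the message-groups documentation for the exact 
element names.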

> CachedMessageGroupMapFactory fails with large key sets
> ------------------------------------------------------
>
>                 Key: AMQ-5942
>                 URL: https://issues.apache.org/jira/browse/AMQ-5942
>             Project: ActiveMQ
>          Issue Type: Bug
>    Affects Versions: 5.11.1
>            Reporter: Ben O'Day
>         Attachments: MessageGroupFactoryRouteTest.java
>
>
> the current default factory is the CachedMessageGroupMapFactory, which uses an 
> LRUMap with a maxSize of 1024 keys.  If you use it with more than 1024 keys 
> and do not explicitly increase the maxSize, message grouping fails to ensure 
> per-group ordering and same-thread processing per group, resulting in 
> overlapping execution.  
> I have reproduced this behavior in the attached unit test (using Camel routes 
> as consumers).  If you switch to the SimpleMessageGroupMapFactory, or increase 
> the max size of the cache above the number of keys, the issues go away.
> two suggestions:
> - throw an error when the maxSize is exceeded if using the 
>   CachedMessageGroupMapFactory
> - make the SimpleMessageGroupMapFactory the default (unlimited)
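
To make the failure mode described above concrete, here is a self-contained 
sketch (plain JDK code, not ActiveMQ internals): a size-capped, access-ordered 
LinkedHashMap stands in for the broker's LRU cache, with a tiny cap of 2 so 
the eviction is visible:

{code:java}
import java.util.LinkedHashMap;
import java.util.Map;

public class GroupEvictionSketch {
    public static void main(String[] args) {
        final int maxSize = 2; // stands in for the default cap of 1024 keys

        // Access-ordered map that drops its least-recently-used entry once
        // the cap is exceeded -- the same behavior as an LRU cache.
        Map<String, String> groupToConsumer =
                new LinkedHashMap<String, String>(16, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<String, String> eldest) {
                return size() > maxSize;
            }
        };

        groupToConsumer.put("group-A", "consumer-1");
        groupToConsumer.put("group-B", "consumer-2");
        groupToConsumer.put("group-C", "consumer-3"); // silently evicts group-A

        // The next message for group-A finds no owner, so the broker would
        // hand the group to whichever consumer takes it next -- possibly not
        // consumer-1 -- breaking per-group ordering and single-threaded
        // processing once the key set outgrows the cache.
        System.out.println("owner of group-A: " + groupToConsumer.get("group-A")); // prints null
    }
}
{code}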


