By the way, I have routes with JMS endpoints, and AFAIR, they work
fine. So I have to check.

Regards
JB

On Fri, Feb 18, 2022 at 10:36 AM Jean-Baptiste Onofré <j...@nanthrax.net> wrote:
>
> Thanks for the update.
>
> Let me try to reproduce it with a simple route.
>
> Regards
> JB
>
> On Fri, Feb 18, 2022 at 10:20 AM Zinner, Frank Uwe Alfred
> <frank.uwe.alfred.zin...@externe.dfs.de> wrote:
> >
> > Hi JB,
> >
> > this doesn't work either.
> >
> > These are the relevant log parts from karaf.log and activemq.log for 
> > further investigation:
> >
> > #### KARAF LOG
> >
> > 2022-02-17T17:06:55,593 | INFO  | CM Configuration Updater (Update: 
> > pid=org.ops4j.pax.logging) | EventAdminConfigurationNotifier  | 6 - 
> > org.ops4j.pax.logging.pax-logging-api - 1.11.13 | Sending Event Admin 
> > notification (configuration successful) to 
> > org/ops4j/pax/logging/Configuration
> > 2022-02-17T17:08:52,202 | INFO  | pipe-bundle:restart 337 | 
> > BlueprintExtender                | 22 - org.apache.aries.blueprint.core - 
> > 1.10.3 | Destroying container for blueprint bundle 
> > de.dfs.services.xxx.xxx.xxx.CatoAftnFilterRouter/3.0.0.SNAPSHOT
> > 2022-02-17T17:08:52,205 | INFO  | pipe-bundle:restart 337 | 
> > AbstractCamelContext             | 78 - org.apache.camel.camel-base-engine 
> > - 3.11.5 | Apache Camel 3.11.5 (CATO-FILTER-Context) shutting down 
> > (timeout:5s)
> > 2022-02-17T17:08:53,180 | INFO  | pipe-bundle:restart 337 | 
> > AbstractCamelContext             | 78 - org.apache.camel.camel-base-engine 
> > - 3.11.5 | Routes shutdown summary (total:1 stopped:1)
> > 2022-02-17T17:08:53,180 | INFO  | pipe-bundle:restart 337 | 
> > AbstractCamelContext             | 78 - org.apache.camel.camel-base-engine 
> > - 3.11.5 |     Stopped CATO-FILTER-Router 
> > (opsActivemq://topic:aftn.inbound.records)
> > 2022-02-17T17:08:53,196 | INFO  | pipe-bundle:restart 337 | 
> > AbstractCamelContext             | 78 - org.apache.camel.camel-base-engine 
> > - 3.11.5 | Apache Camel 3.11.5 (CATO-FILTER-Context) shutdown in 991ms 
> > (uptime:3m4s)
> > 2022-02-17T17:08:53,201 | INFO  | pipe-bundle:restart 337 | 
> > CommandExtension                 | 188 - org.apache.karaf.shell.core - 
> > 4.2.15 | Unregistering commands for bundle 
> > de.dfs.services.xxx.xxx.xxx.CatoAftnFilterRouter/3.0.0.SNAPSHOT
> > 2022-02-17T17:08:53,218 | INFO  | pipe-bundle:restart 337 | 
> > AbstractCamelContextFactoryBean  | 91 - org.apache.camel.camel-core-xml - 
> > 3.11.5 | Using custom ShutdownStrategy: 
> > org.apache.camel.impl.engine.DefaultShutdownStrategy@abb6303
> > 2022-02-17T17:08:53,222 | INFO  | pipe-bundle:restart 337 | 
> > BlueprintCamelStateService       | 139 - 
> > org.apache.camel.karaf.camel-blueprint - 3.11.5 | Karaf BundleStateService 
> > not accessible. Bundle state won't reflect Camel context state
> > 2022-02-17T17:08:53,223 | INFO  | pipe-bundle:restart 337 | 
> > BlueprintContainerImpl           | 22 - org.apache.aries.blueprint.core - 
> > 1.10.3 | Blueprint bundle 
> > de.dfs.services.xxx.xxx.xxx.CatoAftnFilterRouter/3.0.0.SNAPSHOT has been 
> > started
> > 2022-02-17T17:08:53,223 | INFO  | Blueprint Event Dispatcher: 1 | 
> > BlueprintCamelContext            | 139 - 
> > org.apache.camel.karaf.camel-blueprint - 3.11.5 | Attempting to start 
> > CamelContext: CATO-FILTER-Context
> > 2022-02-17T17:08:53,226 | INFO  | Blueprint Event Dispatcher: 1 | 
> > JmxManagementStrategy            | 111 - org.apache.camel.camel-management 
> > - 3.11.5 | JMX is enabled
> > 2022-02-17T17:08:53,257 | WARN  | Blueprint Event 
> > Dispatcher: 1 | CoreTypeConverterRegistry        | 77 - 
> > org.apache.camel.camel-base - 3.11.5 | Overriding type converter from: 
> > StaticMethodTypeConverter: public static 
> > org.apache.activemq.command.ActiveMQDestination 
> > org.apache.activemq.camel.converter.ActiveMQConverter.toDestination(java.lang.String)
> >  to: org.apache.camel.support.SimpleTypeConverter@6b36dbea
> > 2022-02-17T17:08:53,257 | WARN  | Blueprint Event Dispatcher: 1 | 
> > CoreTypeConverterRegistry        | 77 - org.apache.camel.camel-base - 
> > 3.11.5 | Overriding type converter from: InstanceMethodTypeConverter: 
> > public org.apache.activemq.command.ActiveMQMessage 
> > org.apache.activemq.camel.converter.ActiveMQMessageConverter.toMessage(org.apache.camel.Exchange)
> >  throws javax.jms.JMSException to: 
> > org.apache.camel.support.SimpleTypeConverter@1bb26af7
> > 2022-02-17T17:08:53,257 | WARN  | Blueprint Event Dispatcher: 1 | 
> > CoreTypeConverterRegistry        | 77 - org.apache.camel.camel-base - 
> > 3.11.5 | Overriding type converter from: InstanceMethodTypeConverter: 
> > public org.apache.camel.Processor 
> > org.apache.activemq.camel.converter.ActiveMQMessageConverter.toProcessor(javax.jms.MessageListener)
> >  to: org.apache.camel.support.SimpleTypeConverter@44944087
> > 2022-02-17T17:08:53,270 | INFO  | Blueprint Event Dispatcher: 1 | 
> > AbstractCamelContext             | 78 - org.apache.camel.camel-base-engine 
> > - 3.11.5 | Routes startup summary (total:1 started:1)
> > 2022-02-17T17:08:53,270 | INFO  | Blueprint Event Dispatcher: 1 | 
> > AbstractCamelContext             | 78 - org.apache.camel.camel-base-engine 
> > - 3.11.5 |     Started CATO-FILTER-Router 
> > (opsActivemq://topic:aftn.inbound.records)
> > 2022-02-17T17:08:53,270 | INFO  | Blueprint Event Dispatcher: 1 | 
> > AbstractCamelContext             | 78 - org.apache.camel.camel-base-engine 
> > - 3.11.5 | Apache Camel 3.11.5 (CATO-FILTER-Context) started in 48ms 
> > (build:1ms init:9ms start:38ms)
> > 2022-02-17T17:08:53,274 | INFO  | pipe-bundle:restart 337 | 
> > CommandExtension                 | 188 - org.apache.karaf.shell.core - 
> > 4.2.15 | Registering commands for bundle 
> > de.dfs.services.xxx.xxx.xxx.CatoAftnFilterRouter/3.0.0.SNAPSHOT
> > 2022-02-17T17:09:28,590 | INFO  | pipe-logout      | LogoutAction           
> >           | 186 - org.apache.karaf.shell.commands - 4.2.15 | Disconnecting 
> > from current session...
> > 2022-02-17T17:09:32,829 | INFO  | Karaf Shutdown Socket Thread | 
> > ShutdownSocketThread             | 6 - 
> > org.ops4j.pax.logging.pax-logging-api - 1.11.13 | Karaf shutdown socket: 
> > received shutdown command. Stopping framework...
> >
> >
> > #### ActiveMQ / CAMEL Log
> >
> > 2022-02-17 17:08:52,149 [e[dise] Task-11] - DEBUG Queue                     
> >      -                                 aftn.outbound        - 
> > queue://aftn.outbound, subscriptions=0, memory=5%, size=341, pending=0 
> > toPageIn: 11, force:false, Inflight: 0, pagedInMessages.size 330, 
> > pagedInPendingDispatch.size 330, enqueueCount: 341, dequeueCount: 100, 
> > memUsage:598016, maxPageSize:200
> > 2022-02-17 17:08:52,171 [ vm://dise#17-1] - DEBUG Queue                     
> >      -            vm://dise                                 - dise Message 
> > ID:clienthost-40191-1645113948293-4:5:1:1:86 sent to queue://aftn.outbound
> > 2022-02-17 17:08:52,171 [e[dise] Task-11] - DEBUG Queue                     
> >      -                                 aftn.outbound        - 
> > queue://aftn.outbound, subscriptions=0, memory=5%, size=342, pending=0 
> > toPageIn: 12, force:false, Inflight: 0, pagedInMessages.size 330, 
> > pagedInPendingDispatch.size 330, enqueueCount: 342, dequeueCount: 100, 
> > memUsage:599596, maxPageSize:200
> > 2022-02-17 17:08:53,169 [nbound.records]] - DEBUG ActiveMQMessageConsumer   
> >      -                                                      - remove: 
> > ID:clienthost-40191-1645113948293-8:1:1:1, lastDeliveredSequenceId: 
> > 1154285264
> > 2022-02-17 17:08:53,190 [dle:restart 337] - DEBUG ThreadPoolUtils           
> >      -                                                      - Shutdown of 
> > ExecutorService: 
> > java.util.concurrent.ThreadPoolExecutor@66fc82ca[Terminated, pool size = 0, 
> > active threads = 0, queued tasks = 0, completed tasks = 342] is shutdown: 
> > true and terminated: false took: 0.000 seconds.
> > 2022-02-17 17:08:53,191 [dle:restart 337] - DEBUG ThreadPoolUtils           
> >      -                                                      - Shutdown of 
> > ExecutorService: 
> > java.util.concurrent.ThreadPoolExecutor@4dee60a1[Terminated, pool size = 0, 
> > active threads = 0, queued tasks = 0, completed tasks = 0] is shutdown: 
> > true and terminated: true took: 0.000 seconds.
> > 2022-02-17 17:08:53,191 [0.7:61616@54932] - DEBUG ThreadPoolUtils           
> >      -                                                      - Shutdown of 
> > ExecutorService: java.util.concurrent.ThreadPoolExecutor@889e3b[Terminated, 
> > pool size = 0, active threads = 0, queued tasks = 0, completed tasks = 0] 
> > is shutdown: true and terminated: true took: 0.000 seconds.
> > 2022-02-17 17:08:53,191 [dle:restart 337] - DEBUG FailoverTransport         
> >      -                                                      - Stopped 
> > tcp://hostname:61616
> > 2022-02-17 17:08:53,191 [dle:restart 337] - DEBUG ThreadPoolUtils           
> >      -                                                      - Forcing 
> > shutdown of ExecutorService: 
> > java.util.concurrent.ThreadPoolExecutor@2864d088[Running, pool size = 0, 
> > active threads = 0, queued tasks = 0, completed tasks = 2]
> > 2022-02-17 17:08:53,192 [dle:restart 337] - DEBUG TcpTransport              
> >      -                                                      - Stopping 
> > transport tcp://hostname/xxx.xxx.xxx.xxx:61616@54932
> > 2022-02-17 17:08:53,192 [dle:restart 337] - DEBUG TaskRunnerFactory         
> >      -                                                      - Initialized 
> > TaskRunnerFactory[ActiveMQ Task] using ExecutorService: 
> > java.util.concurrent.ThreadPoolExecutor@780bfee[Running, pool size = 0, 
> > active threads = 0, queued tasks = 0, completed tasks = 0]
> > 2022-02-17 17:08:53,193 [ActiveMQ Task-1] - DEBUG TcpTransport              
> >      -                                                      - Closed socket 
> > Socket[addr=hostname/xxx.xxx.xxx.xxx,port=61616,localport=54932]
> > 2022-02-17 17:08:53,193 [dle:restart 337] - DEBUG ThreadPoolUtils           
> >      -                                                      - Forcing 
> > shutdown of ExecutorService: 
> > java.util.concurrent.ThreadPoolExecutor@780bfee[Running, pool size = 1, 
> > active threads = 0, queued tasks = 0, completed tasks = 1]
> > 2022-02-17 17:08:53,266 [e[dise] Task-11] - DEBUG TaskRunnerFactory         
> >      -                                                      - Initialized 
> > TaskRunnerFactory[ActiveMQ VMTransport: vm://dise#25] using 
> > ExecutorService: java.util.concurrent.ThreadPoolExecutor@5ace9ef1[Running, 
> > pool size = 0, active threads = 0, queued tasks = 0, completed tasks = 0]
> > 2022-02-17 17:08:53,266 [t Dispatcher: 1] - DEBUG TaskRunnerFactory         
> >      -                                                      - Initialized 
> > TaskRunnerFactory[ActiveMQ VMTransport: vm://dise#24] using 
> > ExecutorService: java.util.concurrent.ThreadPoolExecutor@c271484[Running, 
> > pool size = 0, active threads = 0, queued tasks = 0, completed tasks = 0]
> > 2022-02-17 17:08:53,268 [ vm://dise#25-1] - DEBUG TransportConnection       
> >      -            vm://dise                                 - Setting up 
> > new connection id: ID:clienthost-40191-1645113948293-16:1, address: 
> > vm://dise#24, info: ConnectionInfo {commandId = 1, responseRequired = true, 
> > connectionId = ID:clienthost-40191-1645113948293-16:1, clientId = 
> > ID:clienthost-40191-1645113948293-15:1, clientIp = null, userName = 
> > yyyyyyyyy, password = *****, brokerPath = null, brokerMasterConnector = 
> > false, manageable = true, clientMaster = true, faultTolerant = false, 
> > failoverReconnect = false}
> > 2022-02-17 17:08:53,268 [ vm://dise#25-1] - DEBUG TransportConnector        
> >      -            vm://dise                                 - Publishing: 
> > vm://dise for broker transport URI: vm://dise
> > 2022-02-17 17:08:53,268 [ vm://dise#25-1] - DEBUG TransportConnector        
> >      -            vm://dise                                 - Publishing: 
> > vm://dise for broker transport URI: vm://dise
> > 2022-02-17 17:08:53,268 [ vm://dise#25-1] - DEBUG AbstractRegion            
> >      -            vm://dise                                 - dise adding 
> > consumer: ID:clienthost-40191-1645113948293-16:1:-1:1 for destination: 
> > ActiveMQ.Advisory.TempQueue,ActiveMQ.Advisory.TempTopic
> > 2022-02-17 17:08:53,268 [ vm://dise#25-1] - DEBUG PolicyEntry               
> >      -            vm://dise                                 - Setting the 
> > maximumPendingMessages size to: 1000 for consumer: 
> > ID:clienthost-40191-1645113948293-16:1:-1:1
> > 2022-02-17 17:08:53,270 [ vm://dise#25-1] - DEBUG AbstractRegion            
> >      -            vm://dise                                 - dise adding 
> > consumer: ID:clienthost-40191-1645113948293-16:1:1:1 for destination: 
> > topic://aftn.inbound.records
> > 2022-02-17 17:08:53,270 [ vm://dise#25-1] - DEBUG PolicyEntry               
> >      -            vm://dise                                 - Setting the 
> > maximumPendingMessages size to: 1000 for consumer: 
> > ID:clienthost-40191-1645113948293-16:1:1:1
> > 2022-02-17 17:08:53,271 [ vm://dise#25-1] - DEBUG TransportConnector        
> >      -            vm://dise                                 - Publishing: 
> > vm://dise for broker transport URI: vm://dise
> > 2022-02-17 17:08:53,271 [ vm://dise#25-1] - DEBUG TransportConnector        
> >      -            vm://dise                                 - Publishing: 
> > vm://dise for broker transport URI: vm://dise
> > 2022-02-17 17:08:53,271 [ vm://dise#25-1] - DEBUG AbstractRegion            
> >      -            vm://dise                                 - dise adding 
> > destination: topic://ActiveMQ.Advisory.Consumer.Topic.aftn.inbound.records
> > 2022-02-17 17:08:53,272 [nbound.records]] - DEBUG TaskRunnerFactory         
> >      -                                                      - Initialized 
> > TaskRunnerFactory[ActiveMQ Session Task] using ExecutorService: 
> > java.util.concurrent.ThreadPoolExecutor@4a07d63e[Running, pool size = 0, 
> > active threads = 0, queued tasks = 0, completed tasks = 0]
> > 2022-02-17 17:08:53,595 [eckpoint Worker] - DEBUG MessageDatabase           
> >      - dise                                                 - Checkpoint 
> > started.
> > 2022-02-17 17:08:53,601 [eckpoint Worker] - DEBUG MessageDatabase           
> >      - dise                                                 - Checkpoint 
> > done.
> > 2022-02-17 17:08:58,607 [eckpoint Worker] - DEBUG MessageDatabase           
> >      - dise                                                 - Checkpoint 
> > started.
> > 2022-02-17 17:08:58,607 [eckpoint Worker] - DEBUG MessageDatabase           
> >      - dise                                                 - Checkpoint 
> > done.
> > 2022-02-17 17:09:03,614 [eckpoint Worker] - DEBUG MessageDatabase           
> >      - dise                                                 - Checkpoint 
> > started.
> > 2022-02-17 17:09:03,614 [eckpoint Worker] - DEBUG MessageDatabase           
> >      - dise                                                 - Checkpoint 
> > done.
> > 2022-02-17 17:09:08,622 [eckpoint Worker] - DEBUG MessageDatabase           
> >      - dise                                                 - Checkpoint 
> > started.
> >
> > Regards
> > Frank
> >
> > -----Original Message-----
> > From: Jean-Baptiste Onofré <j...@nanthrax.net>
> > Sent: Thursday, 17 February 2022 17:53
> > To: users@camel.apache.org
> > Subject: EXT:Re: Lost connection to ActiveMQ for Camel routes after 
> > configuration changes
> >
> > Hi Frank,
> >
> > does restarting the blueprint container help (not the bundle, I mean 
> > restarting the route using camel:* commands, for instance)?
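> >
> > (A rough sketch of what I mean, assuming the camel-karaf shell commands 
> > feature is installed; the exact argument order can differ per version, 
> > camel:route-stop --help shows it:)

```shell
# List contexts/routes to find the ids, then bounce just the route.
karaf@root()> camel:route-list
karaf@root()> camel:route-stop Router
karaf@root()> camel:route-start Router
```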
> >
> > Regards
> > JB
> >
> > On Thu, Feb 17, 2022 at 3:37 PM Zinner, Frank Uwe Alfred 
> > <frank.uwe.alfred.zin...@externe.dfs.de> wrote:
> > >
> > >
> > > Hi,
> > > I have a running Karaf 4.2.15 with Camel 3.11.5 and ActiveMQ 5.16.2, and 
> > > a Camel route to an external ActiveMQ broker.
> > > When I start my local Karaf, the Camel routes connect to both ActiveMQ 
> > > servers, local and external, and everything works as expected.
> > > Then, when I change the local configuration, Karaf restarts the route but 
> > > the connection is lost.
> > > I can see on the external broker side that the client connection is no 
> > > longer present after the configuration changed or after I restart the 
> > > bundle.
> > >
> > > The configuration properties do get propagated to the Camel component, 
> > > e.g. when I change one of the values, but I wasn't able to track/debug 
> > > down why the connection is lost.
> > > Only a restart of the whole Karaf brings the connection up again.
> > >
> > > Any ideas what I can do here or has someone experienced the same as me?
> > >
> > > The external ActiveMQ broker has version 5.13.3; the local broker has 
> > > version 5.16.2.
> > >
> > > I connect to an external topic and I use a RouteBuilder to create this 
> > > route:
> > >
> > > from(source).id("Router").description("Filter messages")
> > >     .choice()
> > >         .when(predicate)              // predicate is a Camel Predicate
> > >             .to(destination)          // a topic on a local ActiveMQ broker
> > >         .otherwise()
> > >             .to(destinationFiltered); // also a topic on a local ActiveMQ broker
> > >
> > > The broker URL is tcp://hostname:61616, where hostname is the external 
> > > 5.13.3 ActiveMQ broker.
> > >
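> > > (A self-contained sketch of that RouteBuilder; the class name, the 
> > > destination topics and the predicate below are illustrative placeholders, 
> > > not our actual configuration, and it needs camel-core on the classpath:)

```java
import org.apache.camel.Predicate;
import org.apache.camel.builder.RouteBuilder;

// Sketch only: the "opsActivemq" component name and the source topic appear
// in the logs above; the destination topics and the predicate are placeholders.
public class CatoFilterRouteBuilder extends RouteBuilder {

    @Override
    public void configure() {
        // Placeholder predicate standing in for the real AFTN record filter.
        Predicate predicate = header("recordType").isEqualTo("AFTN");

        from("opsActivemq:topic:aftn.inbound.records")
            .id("Router").description("Filter messages")
            .choice()
                .when(predicate)
                    .to("opsActivemq:topic:aftn.accepted")   // placeholder local topic
                .otherwise()
                    .to("opsActivemq:topic:aftn.filtered");  // placeholder local topic
    }
}
```
> > >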
> > > Nothing special there. The route connects to the remote broker topic, 
> > > fetches the incoming messages, filters them and sends them to the local 
> > > broker.
> > > Again, when I change a parameter in the configuration.cfg file on my 
> > > locally running Karaf, or even restart the bundle, Camel loses the 
> > > connection.
> > >
> > >
> > > Frank Uwe Alfred Zinner (extern)
> > >
> > > DFS Deutsche Flugsicherung GmbH
> > > SH/AM
> > > Am DFS-Campus 7
> > > 63225 Langen
> > >
> > > Mail: frank.uwe.alfred.zin...@externe.dfs.de
> > >
> > >
> > >
> > > DFS Deutsche Flugsicherung GmbH * Am DFS-Campus * 63225 Langen * Tel.:
> > > +49 6103 707-0 * Registered office: Langen/Hessen * Register court:
> > > AG Offenbach am Main, HRB 34977 * Chair of the supervisory board:
> > > Antje Geese * Managing directors: Arndt Schoenemann (Chair),
> > > Dr. Kerstin Böcker, Dirk Mahns, Friedrich-Wilhelm Menge * www.dfs.de
> > >
> > > If you are not the intended recipient of this e-mail, please delete it.
