Hi, I have an aggregate AE with a remote primitive delegate (OntoAnnotator). Both services have their input queues on the same broker. Clients send requests to the aggregate using the sendCAS() method.
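For context, the relevant part of the deployment descriptor looks roughly like the sketch below. Only the service queue (q_async_ae) and the delegate key (OntoAnnotator) are taken from my logs; the delegate queue name, descriptor path, CAS pool size, and the error-configuration values are placeholders, not my exact settings:

```xml
<analysisEngineDeploymentDescription
    xmlns="http://uima.apache.org/resourceSpecifier">
  <name>AnalysisAggregator</name>
  <deployment protocol="jms" provider="activemq">
    <casPool numberOfCASes="5/>"
    <service>
      <!-- Queue the aggregate listens on (matches the logs) -->
      <inputQueue endpoint="q_async_ae" brokerURL="tcp://broker_ip:61616"/>
      <topDescriptor>
        <!-- Placeholder path to the aggregate's AE descriptor -->
        <import location="AggregateDescriptor.xml"/>
      </topDescriptor>
      <analysisEngine async="true">
        <delegates>
          <!-- The remote primitive that the aggregate loses contact with -->
          <remoteAnalysisEngine key="OntoAnnotator">
            <!-- Placeholder queue name; same broker as the aggregate -->
            <inputQueue endpoint="OntoAnnotatorQueue"
                        brokerURL="tcp://broker_ip:61616"/>
            <serializer method="xmi"/>
            <asyncAggregateErrorConfiguration>
              <!-- Placeholder values; the process timeout here is what
                   produces the occasional delegate timeouts mentioned below -->
              <processCasErrors maxRetries="0" timeout="30000"
                                thresholdCount="0" thresholdWindow="0"
                                thresholdAction="terminate"/>
            </asyncAggregateErrorConfiguration>
          </remoteAnalysisEngine>
        </delegates>
      </analysisEngine>
    </service>
  </deployment>
</analysisEngineDeploymentDescription>
```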
This was running fine for about 5-6 hours, but then the aggregate logged an error:

11/10/04 02:00:12 INFO cpe.DynamicFlowController$DynamicFlow: Next Executing Annotator :: OntoAnnotator
11/10/04 02:00:12 INFO activemq.JmsOutputChannel: Controller AnalysisAggregator Invalidating JMS Connection To Broker tcp://broker_ip:61616 and Closing Sessions To Delegates

The aggregate had received 4-5 timeouts from the remote delegate over time, but the last of those was at least a couple of hours before the log entry above. Both the broker and the remote delegate were still running and had not crashed. The aggregate continued processing requests after that: each CAS is processed by all of the collocated primitives but not by the remote one. Every CAS process request now gets an exception:

11/10/04 02:00:12 WARN activemq.JmsEndpointConnection_impl:
org.apache.uima.aae.error.DelegateConnectionLostException: Controller:AnalysisAggregator Lost Connection to Delegate:OntoAnnotator
    at org.apache.uima.adapter.jms.activemq.JmsEndpointConnection_impl.send(JmsEndpointConnection_impl.java:547)
    at org.apache.uima.adapter.jms.activemq.JmsEndpointConnection_impl.send(JmsEndpointConnection_impl.java:509)
    at org.apache.uima.adapter.jms.activemq.JmsOutputChannel.dispatch(JmsOutputChannel.java:1366)
    at org.apache.uima.adapter.jms.activemq.JmsOutputChannel.sendCasToRemoteEndpoint(JmsOutputChannel.java:1527)
    at org.apache.uima.adapter.jms.activemq.JmsOutputChannel.serializeCasAndSend(JmsOutputChannel.java:658)
    at org.apache.uima.adapter.jms.activemq.JmsOutputChannel.sendRequest(JmsOutputChannel.java:610)
    at org.apache.uima.aae.controller.AggregateAnalysisEngineController_impl.dispatch(AggregateAnalysisEngineController_impl.java:2395)
    at org.apache.uima.aae.controller.AggregateAnalysisEngineController_impl.dispatchProcessRequest(AggregateAnalysisEngineController_impl.java:2435)
    at org.apache.uima.aae.controller.AggregateAnalysisEngineController_impl.simpleStep(AggregateAnalysisEngineController_impl.java:1295)
    at org.apache.uima.aae.controller.AggregateAnalysisEngineController_impl.executeFlowStep(AggregateAnalysisEngineController_impl.java:2316)
    at org.apache.uima.aae.controller.AggregateAnalysisEngineController_impl.process(AggregateAnalysisEngineController_impl.java:1230)
    at org.apache.uima.aae.handler.HandlerBase.invokeProcess(HandlerBase.java:118)
    at org.apache.uima.aae.handler.input.ProcessResponseHandler.cancelTimerAndProcess(ProcessResponseHandler.java:108)

When I tried stopping the aggregate, the logs said the following, even though no CAS request was in process:

11/10/04 10:18:18 WARN service.UIMA_Service: Uima AS Service AnalysisAggregator Caught Kill Signal - Initiating Quiesce and Stop
11/10/04 10:18:18 INFO controller.BaseAnalysisEngineController: Stopping Controller: AnalysisAggregator
11/10/04 10:18:18 INFO activemq.JmsInputChannel: Stopping Service JMS Transport. Service: q_async_ae
11/10/04 10:18:18 INFO activemq.JmsInputChannel: Controller: AnalysisAggregator Stopped Listener on Endpoint: queue://q_async_ae Selector: Selector:Command=2000 OR Command=2002.
11/10/04 10:18:18 INFO activemq.JmsInputChannel: Stopping Service JMS Transport. Service: q_async_ae
11/10/04 10:18:18 INFO activemq.JmsInputChannel: Controller: AnalysisAggregator Stopped Listener on Endpoint: queue://q_async_ae Selector: Selector:Command=2001.
11/10/04 10:18:18 INFO controller.BaseAnalysisEngineController: Controller: AnalysisAggregator Registering onEmpty Callback With InProcessCache.
11/10/04 10:18:18 INFO controller.BaseAnalysisEngineController: Controller: AnalysisAggregator Awaiting onEmpty Callback From InProcessCache

After restarting just the aggregate, it connected to the remote AE just fine. So I'm wondering why the aggregate decided to stop communicating with it earlier. I've seen a previous thread with a similar error (http://thread.gmane.org/gmane.comp.apache.uima.general/3351/focus=3388), but there the broker was wilfully taken down, whereas I did no such thing.
Thanks, and sorry for the barrage of info.

Meghana
