Some Hadoop IPC calls are failing, possibly because the payload is serialized
with Kryo and there is some incompatibility with the new version. I will dig
in further to see what is going on.
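As a first diagnostic step, it may help to check which Kryo versions the build actually resolves, since a transitive version bump would be one plausible source of a serialization incompatibility. This is just a sketch of the check, not a confirmed fix; the `dependency:tree` invocation is standard Maven, but the wildcard filter (older Kryo releases used the `com.esotericsoftware.kryo` group id, newer ones `com.esotericsoftware`) and the `engine` module name are assumptions about this project's layout:

```shell
# List every resolved Kryo artifact per module, to spot mixed or
# unexpectedly upgraded versions (the partial wildcard matches both
# the old and new Kryo group ids).
mvn dependency:tree -Dincludes='*:kryo*'

# If two modules resolve different versions, pin one in
# dependencyManagement and re-run only the failing test
# (module name "engine" is an assumption):
mvn -pl engine test -Dtest=StramRecoveryTest
```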

On Tue, Jun 19, 2018 at 6:54 PM Aaron Bossert <aa...@punchcyber.com> wrote:

> Pramod,
>
> Thanks for taking the time to help!
>
> Here is the output (failing parts only) from running a full build (mvn
> clean install -X) on the master branch:
>
> Running com.datatorrent.stram.StramRecoveryTest
> 2018-06-19 21:34:28,137 [main] INFO  stram.StramRecoveryTest testRpcFailover - Mock server listening at macbook-pro-6.lan/192.168.87.125:62154
> 2018-06-19 21:34:28,678 [main] ERROR stram.RecoverableRpcProxy invoke - Giving up RPC connection recovery after 507 ms
> java.net.SocketTimeoutException: Call From macbook-pro-6.lan/192.168.87.125 to macbook-pro-6.lan:62154 failed on socket timeout exception: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62155 remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details see: http://wiki.apache.org/hadoop/SocketTimeout
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> at com.sun.proxy.$Proxy138.log(Unknown Source)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
> at com.sun.proxy.$Proxy138.log(Unknown Source)
> at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:561)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> at org.junit.runners.Suite.runChild(Suite.java:127)
> at org.junit.runners.Suite.runChild(Suite.java:26)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
> at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
> at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
> at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
> at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
> Caused by: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62155 remote=macbook-pro-6.lan/192.168.87.125:62154]
> at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
> at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
> at java.io.DataInputStream.readInt(DataInputStream.java:387)
> at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
> 2018-06-19 21:34:29,178 [IPC Server handler 0 on 62154] WARN  ipc.Server processResponse - IPC Server handler 0 on 62154, call log(containerId, timeout), rpc version=2, client version=201208081755, methodsFingerPrint=-1300451462 from 192.168.87.125:62155 Call#136 Retry#0: output error
> 2018-06-19 21:34:29,198 [main] WARN  stram.RecoverableRpcProxy invoke - RPC failure, will retry after 100 ms (remaining 994 ms)
> java.net.SocketTimeoutException: Call From macbook-pro-6.lan/192.168.87.125 to macbook-pro-6.lan:62154 failed on socket timeout exception: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62156 remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details see: http://wiki.apache.org/hadoop/SocketTimeout
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> at com.sun.proxy.$Proxy138.log(Unknown Source)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
> at com.sun.proxy.$Proxy138.log(Unknown Source)
> at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:575)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> at org.junit.runners.Suite.runChild(Suite.java:127)
> at org.junit.runners.Suite.runChild(Suite.java:26)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
> at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
> at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
> at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
> at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
> Caused by: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62156 remote=macbook-pro-6.lan/192.168.87.125:62154]
> at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
> at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
> at java.io.DataInputStream.readInt(DataInputStream.java:387)
> at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
> 2018-06-19 21:34:29,806 [main] WARN  stram.RecoverableRpcProxy invoke - RPC failure, will retry after 100 ms (remaining 386 ms)
> java.net.SocketTimeoutException: Call From macbook-pro-6.lan/192.168.87.125 to macbook-pro-6.lan:62154 failed on socket timeout exception: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62157 remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details see: http://wiki.apache.org/hadoop/SocketTimeout
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> at com.sun.proxy.$Proxy138.log(Unknown Source)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
> at com.sun.proxy.$Proxy138.log(Unknown Source)
> at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:575)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> at org.junit.runners.Suite.runChild(Suite.java:127)
> at org.junit.runners.Suite.runChild(Suite.java:26)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
> at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
> at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
> at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
> at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
> Caused by: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62157 remote=macbook-pro-6.lan/192.168.87.125:62154]
> at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
> at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
> at java.io.DataInputStream.readInt(DataInputStream.java:387)
> at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
> 2018-06-19 21:34:30,180 [IPC Server handler 0 on 62154] WARN  ipc.Server processResponse - IPC Server handler 0 on 62154, call log(containerId, timeout), rpc version=2, client version=201208081755, methodsFingerPrint=-1300451462 from 192.168.87.125:62156 Call#137 Retry#0: output error
> 2018-06-19 21:34:30,808 [main] ERROR stram.RecoverableRpcProxy invoke - Giving up RPC connection recovery after 506 ms
> java.net.SocketTimeoutException: Call From macbook-pro-6.lan/192.168.87.125 to macbook-pro-6.lan:62154 failed on socket timeout exception: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62159 remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details see: http://wiki.apache.org/hadoop/SocketTimeout
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> at com.sun.proxy.$Proxy138.log(Unknown Source)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
> at com.sun.proxy.$Proxy138.log(Unknown Source)
> at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:596)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> at org.junit.runners.Suite.runChild(Suite.java:127)
> at org.junit.runners.Suite.runChild(Suite.java:26)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
> at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
> at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
> at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
> at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
> Caused by: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62159 remote=macbook-pro-6.lan/192.168.87.125:62154]
> at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
> at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
> at java.io.DataInputStream.readInt(DataInputStream.java:387)
> at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
> 2018-06-19 21:34:31,307 [IPC Server handler 0 on 62154] WARN  ipc.Server processResponse - IPC Server handler 0 on 62154, call log(containerId, timeout), rpc version=2, client version=201208081755, methodsFingerPrint=-1300451462 from 192.168.87.125:62159 Call#141 Retry#0: output error
> 2018-06-19 21:34:31,327 [main] WARN  stram.RecoverableRpcProxy invoke - RPC failure, will retry after 100 ms (remaining 995 ms)
> java.net.SocketTimeoutException: Call From macbook-pro-6.lan/192.168.87.125 to macbook-pro-6.lan:62154 failed on socket timeout exception: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62160 remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details see: http://wiki.apache.org/hadoop/SocketTimeout
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> at com.sun.proxy.$Proxy138.reportError(Unknown Source)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
> at com.sun.proxy.$Proxy138.reportError(Unknown Source)
> at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:610)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> at org.junit.runners.Suite.runChild(Suite.java:127)
> at org.junit.runners.Suite.runChild(Suite.java:26)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
> at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
> at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
> at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
> at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
> Caused by: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62160 remote=macbook-pro-6.lan/192.168.87.125:62154]
> at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
> at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
> at java.io.DataInputStream.readInt(DataInputStream.java:387)
> at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
> 2018-06-19 21:34:31,931 [main] WARN  stram.RecoverableRpcProxy invoke - RPC failure, will retry after 100 ms (remaining 391 ms)
> java.net.SocketTimeoutException: Call From macbook-pro-6.lan/192.168.87.125 to macbook-pro-6.lan:62154 failed on socket timeout exception: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62161 remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details see: http://wiki.apache.org/hadoop/SocketTimeout
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> at com.sun.proxy.$Proxy138.reportError(Unknown Source)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
> at com.sun.proxy.$Proxy138.reportError(Unknown Source)
> at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:610)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> at org.junit.runners.Suite.runChild(Suite.java:127)
> at org.junit.runners.Suite.runChild(Suite.java:26)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
> at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
> at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
> at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
> at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
> Caused by: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62161 remote=macbook-pro-6.lan/192.168.87.125:62154]
> at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
> at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
> at java.io.DataInputStream.readInt(DataInputStream.java:387)
> at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
> 2018-06-19 21:34:32,310 [IPC Server handler 0 on 62154] WARN  ipc.Server processResponse - IPC Server handler 0 on 62154, call reportError(containerId, null, timeout, null), rpc version=2, client version=201208081755, methodsFingerPrint=-1300451462 from 192.168.87.125:62160 Call#142 Retry#0: output error
> 2018-06-19 21:34:32,512 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app1/recovery/log
> 2018-06-19 21:34:32,628 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app1/recovery/log
> 2018-06-19 21:34:32,696 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app2/recovery/log
> 2018-06-19 21:34:32,698 [main] INFO  stram.StramClient copyInitialState - Copying initial state took 32 ms
> 2018-06-19 21:34:32,799 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app2/recovery/log
> 2018-06-19 21:34:32,850 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app3/recovery/log
> 2018-06-19 21:34:32,851 [main] INFO  stram.StramClient copyInitialState - Copying initial state took 28 ms
> 2018-06-19 21:34:32,955 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app3/recovery/log
> 2018-06-19 21:34:32,976 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container without locality contraint due to insufficient resources.
> 2018-06-19 21:34:32,977 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container without locality contraint due to insufficient resources.
> 2018-06-19 21:34:32,977 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container without locality contraint due to insufficient resources.
> 2018-06-19 21:34:33,166 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container without locality contraint due to insufficient resources.
> 2018-06-19 21:34:33,166 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container without locality contraint due to insufficient resources.
> 2018-06-19 21:34:33,166 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container without locality contraint due to insufficient resources.
> 2018-06-19 21:34:33,338 [main] INFO  util.AsyncFSStorageAgent save - using /Users/mbossert/testIdea/apex-core/engine/target/chkp2603930902590449397 as the basepath for checkpointing.
> 2018-06-19 21:34:33,436 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app1/recovery/log
> 2018-06-19 21:34:33,505 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app1/recovery/log
> 2018-06-19 21:34:33,553 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app2/recovery/log
> 2018-06-19 21:34:33,554 [main] INFO  stram.StramClient copyInitialState - Copying initial state took 22 ms
> 2018-06-19 21:34:33,642 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app2/recovery/log
> 2018-06-19 21:34:33,690 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app3/recovery/log
> 2018-06-19 21:34:33,691 [main] INFO  stram.StramClient copyInitialState - Copying initial state took 29 ms
> 2018-06-19 21:34:33,805 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app3/recovery/log
> 2018-06-19 21:34:33,830 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container without locality contraint due to insufficient resources.
> 2018-06-19 21:34:33,830 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container without locality contraint due to insufficient resources.
> 2018-06-19 21:34:33,831 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container without locality contraint due to insufficient resources.
> 2018-06-19 21:34:33,831 [main] INFO  util.AsyncFSStorageAgent save - using /Users/mbossert/testIdea/apex-core/engine/target/chkp1878353095301008843 as the basepath for checkpointing.
> 2018-06-19 21:34:34,077 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container without locality contraint due to insufficient resources.
> 2018-06-19 21:34:34,077 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container without locality contraint due to insufficient resources.
> 2018-06-19 21:34:34,077 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container without locality contraint due to insufficient resources.
> 2018-06-19 21:34:34,077 [main] INFO  util.AsyncFSStorageAgent save - using /Users/mbossert/testIdea/apex-core/engine/target/chkp7337975615972280003 as the basepath for checkpointing.
> Tests run: 8, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 6.143 sec <<< FAILURE! - in com.datatorrent.stram.StramRecoveryTest
> testWriteAheadLog(com.datatorrent.stram.StramRecoveryTest)  Time elapsed: 0.111 sec  <<< FAILURE!
> java.lang.AssertionError: flush count expected:<1> but was:<2>
> at com.datatorrent.stram.StramRecoveryTest.testWriteAheadLog(StramRecoveryTest.java:326)
>
>
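As context for the RecoverableRpcProxy messages above ("RPC failure, will retry after 100 ms (remaining 391 ms)", "Giving up RPC connection recovery after 507 ms"), the pattern at work is a retry loop with a fixed delay and an overall recovery budget. The sketch below is a simplified, hypothetical illustration of that pattern only, not Apex's actual implementation; the class and constant names (RetryBudgetSketch, RETRY_DELAY_MS, BUDGET_MS) are made up for the example.

```java
import java.util.concurrent.Callable;

public class RetryBudgetSketch {
    static final long RETRY_DELAY_MS = 100; // mirrors "will retry after 100 ms"
    static final long BUDGET_MS = 500;      // rough overall recovery window

    // Retry the call on socket timeouts until the total budget is exhausted,
    // then rethrow (the "Giving up RPC connection recovery" case in the log).
    static <T> T invokeWithRecovery(Callable<T> call) throws Exception {
        long deadline = System.currentTimeMillis() + BUDGET_MS;
        while (true) {
            try {
                return call.call();
            } catch (java.net.SocketTimeoutException e) {
                long remaining = deadline - System.currentTimeMillis();
                if (remaining <= 0) {
                    throw e; // budget exhausted, give up
                }
                // the "(remaining ... ms)" part of the WARN line
                Thread.sleep(Math.min(RETRY_DELAY_MS, remaining));
            }
        }
    }

    public static void main(String[] args) throws Exception {
        final int[] attempts = {0};
        // Simulate a call that times out twice before succeeding.
        String result = invokeWithRecovery(() -> {
            if (++attempts[0] < 3) {
                throw new java.net.SocketTimeoutException("simulated timeout");
            }
            return "ok";
        });
        System.out.println(result + " after " + attempts[0] + " attempts");
    }
}
```

In the failing test, the mock server never recovers within the budget, so the proxy gives up and the ERROR line is logged.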
> Running com.datatorrent.stram.CustomControlTupleTest
> 2018-06-19 21:34:49,308 [main] INFO  util.AsyncFSStorageAgent save - using /Users/mbossert/testIdea/apex-core/engine/target/chkp1213673348429546877 as the basepath for checkpointing.
> 2018-06-19 21:34:49,451 [main] INFO  storage.DiskStorage <init> - using /Users/mbossert/testIdea/apex-core/engine/target as the basepath for spooling.
> 2018-06-19 21:34:49,451 [ProcessWideEventLoop] INFO  server.Server registered - Server started listening at /0:0:0:0:0:0:0:0:62181
> 2018-06-19 21:34:49,451 [main] INFO  stram.StramLocalCluster run - Buffer server started: localhost:62181
> 2018-06-19 21:34:49,452 [container-0] INFO  stram.StramLocalCluster run - Started container container-0
> 2018-06-19 21:34:49,452 [container-1] INFO  stram.StramLocalCluster run - Started container container-1
> 2018-06-19 21:34:49,452 [container-2] INFO  stram.StramLocalCluster run - Started container container-2
> 2018-06-19 21:34:49,452 [container-1] INFO  stram.StramLocalCluster log - container-1 msg: [container-1] Entering heartbeat loop..
> 2018-06-19 21:34:49,452 [container-0] INFO  stram.StramLocalCluster log - container-0 msg: [container-0] Entering heartbeat loop..
> 2018-06-19 21:34:49,452 [container-2] INFO  stram.StramLocalCluster log - container-2 msg: [container-2] Entering heartbeat loop..
> 2018-06-19 21:34:50,460 [container-2] INFO  engine.StreamingContainer processHeartbeatResponse - Deploy request: [OperatorDeployInfo[id=3,name=receiver,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=ProcessorToReceiver,sourceNodeId=2,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[]]]
> 2018-06-19 21:34:50,460 [container-0] INFO  engine.StreamingContainer processHeartbeatResponse - Deploy request: [OperatorDeployInfo[id=1,name=randomGenerator,type=INPUT,checkpoint={ffffffffffffffff, 0, 0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=genToProcessor,bufferServer=localhost]]]]
> 2018-06-19 21:34:50,460 [container-1] INFO  engine.StreamingContainer processHeartbeatResponse - Deploy request: [OperatorDeployInfo[id=2,name=process,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=localhost]]]]
> 2018-06-19 21:34:50,463 [container-0] INFO  engine.WindowGenerator activate - Catching up from 1529458489500 to 1529458490463
> 2018-06-19 21:34:50,465 [ProcessWideEventLoop] INFO  server.Server onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://localhost:62181/2.output.1, windowId=ffffffffffffffff, type=ProcessorToReceiver/3.input, upstreamIdentifier=2.output.1, mask=0, partitions=null, bufferSize=1024}
> 2018-06-19 21:34:50,466 [ProcessWideEventLoop] INFO  server.Server onMessage - Received publisher request: PublishRequestTuple{version=1.0, identifier=1.out.1, windowId=ffffffffffffffff}
> 2018-06-19 21:34:50,466 [ProcessWideEventLoop] INFO  server.Server onMessage - Received publisher request: PublishRequestTuple{version=1.0, identifier=2.output.1, windowId=ffffffffffffffff}
> 2018-06-19 21:34:50,466 [ProcessWideEventLoop] INFO  server.Server onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://localhost:62181/1.out.1, windowId=ffffffffffffffff, type=genToProcessor/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
> 2018-06-19 21:34:51,458 [main] INFO  stram.StramLocalCluster run - Stopping on exit condition
> 2018-06-19 21:34:51,458 [container-0] INFO  engine.StreamingContainer processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:34:51,458 [container-1] INFO  engine.StreamingContainer processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:34:51,458 [container-0] INFO  stram.StramLocalCluster log - container-0 msg: [container-0] Exiting heartbeat loop..
> 2018-06-19 21:34:51,458 [container-2] INFO  engine.StreamingContainer processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:34:51,458 [container-2] INFO  stram.StramLocalCluster log - container-2 msg: [container-2] Exiting heartbeat loop..
> 2018-06-19 21:34:51,458 [container-1] INFO  stram.StramLocalCluster log - container-1 msg: [container-1] Exiting heartbeat loop..
> 2018-06-19 21:34:51,461 [container-2] INFO  stram.StramLocalCluster run - Container container-2 terminating.
> 2018-06-19 21:34:51,467 [container-1] INFO  stram.StramLocalCluster run - Container container-1 terminating.
> 2018-06-19 21:34:51,467 [container-0] INFO  stram.StramLocalCluster run - Container container-0 terminating.
> 2018-06-19 21:34:51,467 [ServerHelper-86-1] INFO  server.Server run - Removing ln LogicalNode@7d88b4a4identifier=tcp://localhost:62181/2.output.1, upstream=2.output.1, group=ProcessorToReceiver/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@35d66f18{da=com.datatorrent.bufferserver.internal.DataList$Block@d43c092{identifier=2.output.1, data=1048576, readingOffset=0, writingOffset=481, starting_window=5b29af3900000001, ending_window=5b29af3900000005, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@4dca4fb0[identifier=2.output.1]
> 2018-06-19 21:34:51,468 [ServerHelper-86-1] INFO  server.Server run - Removing ln LogicalNode@3cb5be9fidentifier=tcp://localhost:62181/1.out.1, upstream=1.out.1, group=genToProcessor/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@5c9a41d0{da=com.datatorrent.bufferserver.internal.DataList$Block@5a324bf4{identifier=1.out.1, data=1048576, readingOffset=0, writingOffset=481, starting_window=5b29af3900000001, ending_window=5b29af3900000005, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@49665770[identifier=1.out.1]
> 2018-06-19 21:34:51,469 [ProcessWideEventLoop] INFO  server.Server run - Server stopped listening at /0:0:0:0:0:0:0:0:62181
> 2018-06-19 21:34:51,469 [main] INFO  stram.StramLocalCluster run - Application finished.
> 2018-06-19 21:34:51,469 [main] INFO  stram.CustomControlTupleTest testApp - Control Tuples received 3 expected 3
> 2018-06-19 21:34:51,492 [main] INFO  util.AsyncFSStorageAgent save - using /Users/mbossert/testIdea/apex-core/engine/target/chkp5496551078484285394 as the basepath for checkpointing.
> 2018-06-19 21:34:51,623 [main] INFO  storage.DiskStorage <init> - using /Users/mbossert/testIdea/apex-core/engine/target as the basepath for spooling.
> 2018-06-19 21:34:51,624 [ProcessWideEventLoop] INFO  server.Server registered - Server started listening at /0:0:0:0:0:0:0:0:62186
> 2018-06-19 21:34:51,624 [main] INFO  stram.StramLocalCluster run - Buffer server started: localhost:62186
> 2018-06-19 21:34:51,624 [container-0] INFO  stram.StramLocalCluster run - Started container container-0
> 2018-06-19 21:34:51,624 [container-0] INFO  stram.StramLocalCluster log - container-0 msg: [container-0] Entering heartbeat loop..
> 2018-06-19 21:34:52,628 [container-0] INFO  engine.StreamingContainer processHeartbeatResponse - Deploy request: [OperatorDeployInfo[id=1,name=randomGenerator,type=INPUT,checkpoint={ffffffffffffffff, 0, 0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=genToProcessor,bufferServer=<null>]]], OperatorDeployInfo[id=2,name=process,type=OIO,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=THREAD_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=<null>]]], OperatorDeployInfo[id=3,name=receiver,type=OIO,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=ProcessorToReceiver,sourceNodeId=2,sourcePortName=output,locality=THREAD_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[]]]
> 2018-06-19 21:34:52,630 [container-0] INFO  engine.WindowGenerator activate - Catching up from 1529458491500 to 1529458492630
> 2018-06-19 21:34:53,628 [main] INFO  stram.StramLocalCluster run - Stopping on exit condition
> 2018-06-19 21:34:53,629 [container-0] INFO  engine.StreamingContainer processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:34:53,630 [container-0] INFO  stram.StramLocalCluster log - container-0 msg: [container-0] Exiting heartbeat loop..
> 2018-06-19 21:34:53,640 [container-0] INFO  stram.StramLocalCluster run - Container container-0 terminating.
> 2018-06-19 21:34:53,641 [ProcessWideEventLoop] INFO  server.Server run - Server stopped listening at /0:0:0:0:0:0:0:0:62186
> 2018-06-19 21:34:53,642 [main] INFO  stram.StramLocalCluster run - Application finished.
> 2018-06-19 21:34:53,642 [main] INFO  stram.CustomControlTupleTest testApp - Control Tuples received 3 expected 3
> 2018-06-19 21:34:53,659 [main] INFO  util.AsyncFSStorageAgent save - using /Users/mbossert/testIdea/apex-core/engine/target/chkp2212795894390935125 as the basepath for checkpointing.
> 2018-06-19 21:34:53,844 [main] INFO  storage.DiskStorage <init> - using /Users/mbossert/testIdea/apex-core/engine/target as the basepath for spooling.
> 2018-06-19 21:34:53,844 [ProcessWideEventLoop] INFO  server.Server registered - Server started listening at /0:0:0:0:0:0:0:0:62187
> 2018-06-19 21:34:53,844 [main] INFO  stram.StramLocalCluster run - Buffer server started: localhost:62187
> 2018-06-19 21:34:53,845 [container-0] INFO  stram.StramLocalCluster run - Started container container-0
> 2018-06-19 21:34:53,845 [container-1] INFO  stram.StramLocalCluster run - Started container container-1
> 2018-06-19 21:34:53,845 [container-0] INFO  stram.StramLocalCluster log - container-0 msg: [container-0] Entering heartbeat loop..
> 2018-06-19 21:34:53,845 [container-2] INFO  stram.StramLocalCluster run - Started container container-2
> 2018-06-19 21:34:53,845 [container-1] INFO  stram.StramLocalCluster log - container-1 msg: [container-1] Entering heartbeat loop..
> 2018-06-19 21:34:53,845 [container-3] INFO  stram.StramLocalCluster run - Started container container-3
> 2018-06-19 21:34:53,845 [container-2] INFO  stram.StramLocalCluster log - container-2 msg: [container-2] Entering heartbeat loop..
> 2018-06-19 21:34:53,845 [container-3] INFO  stram.StramLocalCluster log - container-3 msg: [container-3] Entering heartbeat loop..
> 2018-06-19 21:34:54,850 [container-3] INFO  engine.StreamingContainer processHeartbeatResponse - Deploy request: [OperatorDeployInfo[id=1,name=randomGenerator,type=INPUT,checkpoint={ffffffffffffffff, 0, 0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=genToProcessor,bufferServer=localhost]]]]
> 2018-06-19 21:34:54,850 [container-1] INFO  engine.StreamingContainer processHeartbeatResponse - Deploy request: [OperatorDeployInfo[id=3,name=process,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=<null>,partitionMask=1,partitionKeys=[1]]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=localhost]]]]
> 2018-06-19 21:34:54,850 [container-0] INFO  engine.StreamingContainer processHeartbeatResponse - Deploy request: [OperatorDeployInfo[id=4,name=receiver,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=ProcessorToReceiver,sourceNodeId=5,sourcePortName=outputPort,locality=CONTAINER_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[]], OperatorDeployInfo.UnifierDeployInfo[id=5,name=process.output#unifier,type=UNIFIER,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=<merge#output>,streamId=ProcessorToReceiver,sourceNodeId=2,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>], OperatorDeployInfo.InputDeployInfo[portName=<merge#output>,streamId=ProcessorToReceiver,sourceNodeId=3,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=outputPort,streamId=ProcessorToReceiver,bufferServer=<null>]]]]
> 2018-06-19 21:34:54,850 [container-2] INFO  engine.StreamingContainer processHeartbeatResponse - Deploy request: [OperatorDeployInfo[id=2,name=process,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=<null>,partitionMask=1,partitionKeys=[0]]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=localhost]]]]
> 2018-06-19 21:34:54,852 [container-3] INFO  engine.WindowGenerator activate - Catching up from 1529458493500 to 1529458494852
> 2018-06-19 21:34:54,855 [ProcessWideEventLoop] INFO  server.Server onMessage - Received publisher request: PublishRequestTuple{version=1.0, identifier=1.out.1, windowId=ffffffffffffffff}
> 2018-06-19 21:34:54,857 [ProcessWideEventLoop] INFO  server.Server onMessage - Received publisher request: PublishRequestTuple{version=1.0, identifier=2.output.1, windowId=ffffffffffffffff}
> 2018-06-19 21:34:54,858 [ProcessWideEventLoop] INFO  server.Server onMessage - Received publisher request: PublishRequestTuple{version=1.0, identifier=3.output.1, windowId=ffffffffffffffff}
> 2018-06-19 21:34:54,858 [ProcessWideEventLoop] INFO  server.Server onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://localhost:62187/1.out.1, windowId=ffffffffffffffff, type=genToProcessor/3.input, upstreamIdentifier=1.out.1, mask=1, partitions=[1], bufferSize=1024}
> 2018-06-19 21:34:54,858 [ProcessWideEventLoop] INFO  server.Server onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://localhost:62187/1.out.1, windowId=ffffffffffffffff, type=genToProcessor/2.input, upstreamIdentifier=1.out.1, mask=1, partitions=[0], bufferSize=1024}
> 2018-06-19 21:34:54,858 [ProcessWideEventLoop] INFO  server.Server onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://localhost:62187/3.output.1, windowId=ffffffffffffffff, type=ProcessorToReceiver/5.<merge#output>(3.output), upstreamIdentifier=3.output.1, mask=0, partitions=null, bufferSize=1024}
> 2018-06-19 21:34:54,859 [ProcessWideEventLoop] INFO  server.Server onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://localhost:62187/2.output.1, windowId=ffffffffffffffff, type=ProcessorToReceiver/5.<merge#output>(2.output), upstreamIdentifier=2.output.1, mask=0, partitions=null, bufferSize=1024}
> 2018-06-19 21:34:55,851 [main] INFO  stram.StramLocalCluster run - Stopping on exit condition
> 2018-06-19 21:34:55,852 [container-2] INFO  engine.StreamingContainer processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:34:55,852 [container-3] INFO  engine.StreamingContainer processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:34:55,852 [container-3] INFO  stram.StramLocalCluster log - container-3 msg: [container-3] Exiting heartbeat loop..
> 2018-06-19 21:34:55,852 [container-0] INFO  engine.StreamingContainer processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:34:55,852 [container-2] INFO  stram.StramLocalCluster log - container-2 msg: [container-2] Exiting heartbeat loop..
> 2018-06-19 21:34:55,852 [container-1] INFO  engine.StreamingContainer processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:34:55,852 [container-0] INFO  stram.StramLocalCluster log - container-0 msg: [container-0] Exiting heartbeat loop..
> 2018-06-19 21:34:55,852 [container-1] INFO  stram.StramLocalCluster log - container-1 msg: [container-1] Exiting heartbeat loop..
> 2018-06-19 21:34:55,857 [container-1] INFO  stram.StramLocalCluster run - Container container-1 terminating.
> 2018-06-19 21:34:55,858 [container-3] INFO  stram.StramLocalCluster run - Container container-3 terminating.
> 2018-06-19 21:34:55,858 [ServerHelper-92-1] INFO  server.Server run - Removing ln LogicalNode@5dbf681cidentifier=tcp://localhost:62187/3.output.1, upstream=3.output.1, group=ProcessorToReceiver/5.<merge#output>(3.output), partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@6244ac9{da=com.datatorrent.bufferserver.internal.DataList$Block@60e28815{identifier=3.output.1, data=1048576, readingOffset=0, writingOffset=487, starting_window=5b29af3d00000001, ending_window=5b29af3d00000006, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@46bbe39d[identifier=3.output.1]
> 2018-06-19 21:34:55,858 [ServerHelper-92-1] INFO  server.Server run - Removing ln LogicalNode@7fb3226aidentifier=tcp://localhost:62187/1.out.1, upstream=1.out.1, group=genToProcessor/2.input, partitions=[BitVector{mask=1, bits=0}], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@2ad6890f{da=com.datatorrent.bufferserver.internal.DataList$Block@e00fc9e{identifier=1.out.1, data=1048576, readingOffset=0, writingOffset=487, starting_window=5b29af3d00000001, ending_window=5b29af3d00000006, refCount=3, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@7a566f6b[identifier=1.out.1]
> 2018-06-19 21:34:55,858 [ServerHelper-92-1] INFO  server.Server run - Removing ln LogicalNode@2551b8a4identifier=tcp://localhost:62187/1.out.1, upstream=1.out.1, group=genToProcessor/3.input, partitions=[BitVector{mask=1, bits=1}], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@6368ccb7{da=com.datatorrent.bufferserver.internal.DataList$Block@e00fc9e{identifier=1.out.1, data=1048576, readingOffset=0, writingOffset=487, starting_window=5b29af3d00000001, ending_window=5b29af3d00000006, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@7a566f6b[identifier=1.out.1]
> 2018-06-19 21:34:55,862 [container-2] INFO  stram.StramLocalCluster run - Container container-2 terminating.
> 2018-06-19 21:34:55,862 [ServerHelper-92-1] INFO  server.Server run - Removing ln LogicalNode@2e985326identifier=tcp://localhost:62187/2.output.1, upstream=2.output.1, group=ProcessorToReceiver/5.<merge#output>(2.output), partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@7d68bf24{da=com.datatorrent.bufferserver.internal.DataList$Block@7405581b{identifier=2.output.1, data=1048576, readingOffset=0, writingOffset=487, starting_window=5b29af3d00000001, ending_window=5b29af3d00000006, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@3de15cc7[identifier=2.output.1]
> 2018-06-19 21:34:55,862 [container-0] INFO  stram.StramLocalCluster run - Container container-0 terminating.
> 2018-06-19 21:34:55,864 [ProcessWideEventLoop] INFO  server.Server run - Server stopped listening at /0:0:0:0:0:0:0:0:62187
> 2018-06-19 21:34:55,864 [main] INFO  stram.StramLocalCluster run - Application finished.
> 2018-06-19 21:34:55,864 [main] INFO  stram.CustomControlTupleTest testApp - Control Tuples received 3 expected 3
> 2018-06-19 21:34:55,883 [main] INFO  util.AsyncFSStorageAgent save - using /Users/mbossert/testIdea/apex-core/engine/target/chkp8804999206923662400 as the basepath for checkpointing.
> 2018-06-19 21:34:56,032 [main] INFO  storage.DiskStorage <init> - using /Users/mbossert/testIdea/apex-core/engine/target as the basepath for spooling.
> 2018-06-19 21:34:56,032 [ProcessWideEventLoop] INFO  server.Server registered - Server started listening at /0:0:0:0:0:0:0:0:62195
> 2018-06-19 21:34:56,032 [main] INFO  stram.StramLocalCluster run - Buffer server started: localhost:62195
> 2018-06-19 21:34:56,033 [container-0] INFO  stram.StramLocalCluster run - Started container container-0
> 2018-06-19 21:34:56,033 [container-0] INFO  stram.StramLocalCluster log - container-0 msg: [container-0] Entering heartbeat loop..
> 2018-06-19 21:34:57,038 [container-0] INFO  engine.StreamingContainer processHeartbeatResponse - Deploy request: [OperatorDeployInfo[id=2,name=process,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=CONTAINER_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=<null>]]], OperatorDeployInfo[id=3,name=receiver,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=ProcessorToReceiver,sourceNodeId=2,sourcePortName=output,locality=CONTAINER_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[]], OperatorDeployInfo[id=1,name=randomGenerator,type=INPUT,checkpoint={ffffffffffffffff, 0, 0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=genToProcessor,bufferServer=<null>]]]]
> 2018-06-19 21:34:57,040 [container-0] INFO  engine.WindowGenerator activate - Catching up from 1529458495500 to 1529458497040
> 2018-06-19 21:34:58,042 [main] INFO  stram.StramLocalCluster run - Stopping on exit condition
> 2018-06-19 21:34:59,045 [main] WARN  stram.StramLocalCluster run - Container thread container-0 is still alive
> 2018-06-19 21:34:59,047 [ProcessWideEventLoop] INFO  server.Server run - Server stopped listening at /0:0:0:0:0:0:0:0:62195
> 2018-06-19 21:34:59,047 [container-0] INFO  engine.StreamingContainer processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:34:59,047 [main] INFO  stram.StramLocalCluster run - Application finished.
> 2018-06-19 21:34:59,047 [main] INFO  stram.CustomControlTupleTest testApp - Control Tuples received 4 expected 4
> 2018-06-19 21:34:59,047 [container-0] INFO  stram.StramLocalCluster log - container-0 msg: [container-0] Exiting heartbeat loop..
> 2018-06-19 21:34:59,057 [container-0] INFO  stram.StramLocalCluster run - Container container-0 terminating.
> 2018-06-19 21:34:59,064 [main] INFO  util.AsyncFSStorageAgent save - using /Users/mbossert/testIdea/apex-core/engine/target/chkp4046668014410536641 as the basepath for checkpointing.
> 2018-06-19 21:34:59,264 [main] INFO  storage.DiskStorage <init> - using /Users/mbossert/testIdea/apex-core/engine/target as the basepath for spooling.
> 2018-06-19 21:34:59,264 [ProcessWideEventLoop] INFO  server.Server registered - Server started listening at /0:0:0:0:0:0:0:0:62196
> 2018-06-19 21:34:59,265 [main] INFO  stram.StramLocalCluster run - Buffer
> server started: localhost:62196
> 2018-06-19 21:34:59,265 [container-0] INFO  stram.StramLocalCluster run -
> Started container container-0
> 2018-06-19 21:34:59,265 [container-0] INFO  stram.StramLocalCluster log -
> container-0 msg: [container-0] Entering heartbeat loop..
> 2018-06-19 21:34:59,265 [container-1] INFO  stram.StramLocalCluster run -
> Started container container-1
> 2018-06-19 21:34:59,265 [container-2] INFO  stram.StramLocalCluster run -
> Started container container-2
> 2018-06-19 21:34:59,266 [container-1] INFO  stram.StramLocalCluster log -
> container-1 msg: [container-1] Entering heartbeat loop..
> 2018-06-19 21:34:59,266 [container-3] INFO  stram.StramLocalCluster run -
> Started container container-3
> 2018-06-19 21:34:59,266 [container-2] INFO  stram.StramLocalCluster log -
> container-2 msg: [container-2] Entering heartbeat loop..
> 2018-06-19 21:34:59,266 [container-3] INFO  stram.StramLocalCluster log -
> container-3 msg: [container-3] Entering heartbeat loop..
> 2018-06-19 21:35:00,270 [container-0] INFO  engine.StreamingContainer
> processHeartbeatResponse - Deploy request:
>
> [OperatorDeployInfo[id=1,name=randomGenerator,type=INPUT,checkpoint={ffffffffffffffff,
> 0,
>
> 0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=genToProcessor,bufferServer=localhost]]]]
> 2018-06-19 21:35:00,270 [container-2] INFO  engine.StreamingContainer
> processHeartbeatResponse - Deploy request:
>
> [OperatorDeployInfo[id=2,name=process,type=GENERIC,checkpoint={ffffffffffffffff,
> 0,
>
> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=<null>,partitionMask=1,partitionKeys=[0]]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=localhost]]]]
> 2018-06-19 21:35:00,271 [container-1] INFO  engine.StreamingContainer
> processHeartbeatResponse - Deploy request:
>
> [OperatorDeployInfo[id=4,name=receiver,type=GENERIC,checkpoint={ffffffffffffffff,
> 0,
>
> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=ProcessorToReceiver,sourceNodeId=5,sourcePortName=outputPort,locality=CONTAINER_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[]],
>
> OperatorDeployInfo.UnifierDeployInfo[id=5,name=process.output#unifier,type=UNIFIER,checkpoint={ffffffffffffffff,
> 0,
>
> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=<merge#output>,streamId=ProcessorToReceiver,sourceNodeId=2,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>],
>
> OperatorDeployInfo.InputDeployInfo[portName=<merge#output>,streamId=ProcessorToReceiver,sourceNodeId=3,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=outputPort,streamId=ProcessorToReceiver,bufferServer=<null>]]]]
> 2018-06-19 21:35:00,270 [container-3] INFO  engine.StreamingContainer
> processHeartbeatResponse - Deploy request:
>
> [OperatorDeployInfo[id=3,name=process,type=GENERIC,checkpoint={ffffffffffffffff,
> 0,
>
> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=<null>,partitionMask=1,partitionKeys=[1]]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=localhost]]]]
> 2018-06-19 21:35:00,273 [container-0] INFO  engine.WindowGenerator activate
> - Catching up from 1529458499500 to 1529458500273
> 2018-06-19 21:35:00,274 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received publisher request: PublishRequestTuple{version=1.0,
> identifier=1.out.1, windowId=ffffffffffffffff}
> 2018-06-19 21:35:00,276 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received publisher request: PublishRequestTuple{version=1.0,
> identifier=3.output.1, windowId=ffffffffffffffff}
> 2018-06-19 21:35:00,277 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
> identifier=tcp://localhost:62196/1.out.1, windowId=ffffffffffffffff,
> type=genToProcessor/2.input, upstreamIdentifier=1.out.1, mask=1,
> partitions=[0], bufferSize=1024}
> 2018-06-19 21:35:00,278 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received publisher request: PublishRequestTuple{version=1.0,
> identifier=2.output.1, windowId=ffffffffffffffff}
> 2018-06-19 21:35:00,278 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
> identifier=tcp://localhost:62196/1.out.1, windowId=ffffffffffffffff,
> type=genToProcessor/3.input, upstreamIdentifier=1.out.1, mask=1,
> partitions=[1], bufferSize=1024}
> 2018-06-19 21:35:00,278 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
> identifier=tcp://localhost:62196/3.output.1, windowId=ffffffffffffffff,
> type=ProcessorToReceiver/5.<merge#output>(3.output),
> upstreamIdentifier=3.output.1, mask=0, partitions=null, bufferSize=1024}
> 2018-06-19 21:35:00,278 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
> identifier=tcp://localhost:62196/2.output.1, windowId=ffffffffffffffff,
> type=ProcessorToReceiver/5.<merge#output>(2.output),
> upstreamIdentifier=2.output.1, mask=0, partitions=null, bufferSize=1024}
> 2018-06-19 21:35:01,273 [main] INFO  stram.StramLocalCluster run - Stopping
> on exit condition
> 2018-06-19 21:35:01,273 [container-3] INFO  engine.StreamingContainer
> processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:35:01,273 [container-2] INFO  engine.StreamingContainer
> processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:35:01,273 [container-2] INFO  stram.StramLocalCluster log -
> container-2 msg: [container-2] Exiting heartbeat loop..
> 2018-06-19 21:35:01,273 [container-0] INFO  engine.StreamingContainer
> processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:35:01,274 [container-0] INFO  stram.StramLocalCluster log -
> container-0 msg: [container-0] Exiting heartbeat loop..
> 2018-06-19 21:35:01,273 [container-3] INFO  stram.StramLocalCluster log -
> container-3 msg: [container-3] Exiting heartbeat loop..
> 2018-06-19 21:35:01,273 [container-1] INFO  engine.StreamingContainer
> processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:35:01,274 [container-1] INFO  stram.StramLocalCluster log -
> container-1 msg: [container-1] Exiting heartbeat loop..
> 2018-06-19 21:35:01,279 [container-3] INFO  stram.StramLocalCluster run -
> Container container-3 terminating.
> 2018-06-19 21:35:01,279 [ServerHelper-98-1] INFO  server.Server run -
> Removing ln LogicalNode@d80a435identifier
> =tcp://localhost:62196/3.output.1,
> upstream=3.output.1, group=ProcessorToReceiver/5.<merge#output>(3.output),
> partitions=[],
>
> iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@2f53b5e7
> {da=com.datatorrent.bufferserver.internal.DataList$Block@1e2d4212
> {identifier=3.output.1,
> data=1048576, readingOffset=0, writingOffset=36,
> starting_window=5b29af4300000001, ending_window=5b29af4300000005,
> refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
> DataList@1684ecee[identifier=3.output.1]
> 2018-06-19 21:35:01,285 [container-2] INFO  stram.StramLocalCluster run -
> Container container-2 terminating.
> 2018-06-19 21:35:01,285 [container-1] INFO  stram.StramLocalCluster run -
> Container container-1 terminating.
> 2018-06-19 21:35:01,286 [container-0] INFO  stram.StramLocalCluster run -
> Container container-0 terminating.
> 2018-06-19 21:35:01,286 [ServerHelper-98-1] INFO  server.Server run -
> Removing ln LogicalNode@75d245a1identifier
> =tcp://localhost:62196/2.output.1,
> upstream=2.output.1, group=ProcessorToReceiver/5.<merge#output>(2.output),
> partitions=[],
>
> iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@719c5cd2
> {da=com.datatorrent.bufferserver.internal.DataList$Block@43d2338b
> {identifier=2.output.1,
> data=1048576, readingOffset=0, writingOffset=36,
> starting_window=5b29af4300000001, ending_window=5b29af4300000005,
> refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
> DataList@379bd431[identifier=2.output.1]
> 2018-06-19 21:35:01,286 [ServerHelper-98-1] INFO  server.Server run -
> Removing ln LogicalNode@54c0b0d5identifier=tcp://localhost:62196/1.out.1,
> upstream=1.out.1, group=genToProcessor/2.input,
> partitions=[BitVector{mask=1, bits=0}],
>
> iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@649adb0
> {da=com.datatorrent.bufferserver.internal.DataList$Block@15201b67
> {identifier=1.out.1,
> data=1048576, readingOffset=0, writingOffset=36,
> starting_window=5b29af4300000001, ending_window=5b29af4300000005,
> refCount=3, uniqueIdentifier=0, next=null, future=null}}} from dl
> DataList@47bc3c23[identifier=1.out.1]
> 2018-06-19 21:35:01,286 [ServerHelper-98-1] INFO  server.Server run -
> Removing ln LogicalNode@2422ada2identifier=tcp://localhost:62196/1.out.1,
> upstream=1.out.1, group=genToProcessor/3.input,
> partitions=[BitVector{mask=1, bits=1}],
>
> iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@2e6f42b9
> {da=com.datatorrent.bufferserver.internal.DataList$Block@15201b67
> {identifier=1.out.1,
> data=1048576, readingOffset=0, writingOffset=36,
> starting_window=5b29af4300000001, ending_window=5b29af4300000005,
> refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
> DataList@47bc3c23[identifier=1.out.1]
> 2018-06-19 21:35:01,287 [ProcessWideEventLoop] INFO  server.Server run -
> Server stopped listening at /0:0:0:0:0:0:0:0:62196
> 2018-06-19 21:35:01,287 [main] INFO  stram.StramLocalCluster run -
> Application finished.
> 2018-06-19 21:35:01,288 [main] INFO  stram.CustomControlTupleTest testApp -
> Control Tuples received 0 expected 1
> 2018-06-19 21:35:01,305 [main] INFO  util.AsyncFSStorageAgent save - using
> /Users/mbossert/testIdea/apex-core/engine/target/chkp6727909541678525259 as
> the basepath for checkpointing.
> 2018-06-19 21:35:01,460 [main] INFO  storage.DiskStorage <init> - using
> /Users/mbossert/testIdea/apex-core/engine/target as the basepath for
> spooling.
> 2018-06-19 21:35:01,460 [ProcessWideEventLoop] INFO  server.Server
> registered - Server started listening at /0:0:0:0:0:0:0:0:62204
> 2018-06-19 21:35:01,461 [main] INFO  stram.StramLocalCluster run - Buffer
> server started: localhost:62204
> 2018-06-19 21:35:01,461 [container-0] INFO  stram.StramLocalCluster run -
> Started container container-0
> 2018-06-19 21:35:01,461 [container-1] INFO  stram.StramLocalCluster run -
> Started container container-1
> 2018-06-19 21:35:01,461 [container-0] INFO  stram.StramLocalCluster log -
> container-0 msg: [container-0] Entering heartbeat loop..
> 2018-06-19 21:35:01,461 [container-2] INFO  stram.StramLocalCluster run -
> Started container container-2
> 2018-06-19 21:35:01,461 [container-1] INFO  stram.StramLocalCluster log -
> container-1 msg: [container-1] Entering heartbeat loop..
> 2018-06-19 21:35:01,462 [container-2] INFO  stram.StramLocalCluster log -
> container-2 msg: [container-2] Entering heartbeat loop..
> 2018-06-19 21:35:02,464 [container-0] INFO  engine.StreamingContainer
> processHeartbeatResponse - Deploy request:
>
> [OperatorDeployInfo[id=3,name=receiver,type=GENERIC,checkpoint={ffffffffffffffff,
> 0,
>
> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=ProcessorToReceiver,sourceNodeId=2,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[]]]
> 2018-06-19 21:35:02,464 [container-1] INFO  engine.StreamingContainer
> processHeartbeatResponse - Deploy request:
>
> [OperatorDeployInfo[id=2,name=process,type=GENERIC,checkpoint={ffffffffffffffff,
> 0,
>
> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=localhost]]]]
> 2018-06-19 21:35:02,464 [container-2] INFO  engine.StreamingContainer
> processHeartbeatResponse - Deploy request:
>
> [OperatorDeployInfo[id=1,name=randomGenerator,type=INPUT,checkpoint={ffffffffffffffff,
> 0,
>
> 0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=genToProcessor,bufferServer=localhost]]]]
> 2018-06-19 21:35:02,467 [container-2] INFO  engine.WindowGenerator activate
> - Catching up from 1529458501500 to 1529458502467
> 2018-06-19 21:35:02,469 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
> identifier=tcp://localhost:62204/2.output.1, windowId=ffffffffffffffff,
> type=ProcessorToReceiver/3.input, upstreamIdentifier=2.output.1, mask=0,
> partitions=null, bufferSize=1024}
> 2018-06-19 21:35:02,469 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received publisher request: PublishRequestTuple{version=1.0,
> identifier=1.out.1, windowId=ffffffffffffffff}
> 2018-06-19 21:35:02,470 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received publisher request: PublishRequestTuple{version=1.0,
> identifier=2.output.1, windowId=ffffffffffffffff}
> 2018-06-19 21:35:02,470 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
> identifier=tcp://localhost:62204/1.out.1, windowId=ffffffffffffffff,
> type=genToProcessor/2.input, upstreamIdentifier=1.out.1, mask=0,
> partitions=null, bufferSize=1024}
> 2018-06-19 21:35:03,463 [main] INFO  stram.StramLocalCluster run - Stopping
> on exit condition
> 2018-06-19 21:35:03,463 [container-1] INFO  engine.StreamingContainer
> processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:35:03,463 [container-2] INFO  engine.StreamingContainer
> processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:35:03,464 [container-2] INFO  stram.StramLocalCluster log -
> container-2 msg: [container-2] Exiting heartbeat loop..
> 2018-06-19 21:35:03,463 [container-1] INFO  stram.StramLocalCluster log -
> container-1 msg: [container-1] Exiting heartbeat loop..
> 2018-06-19 21:35:03,463 [container-0] INFO  engine.StreamingContainer
> processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:35:03,464 [container-0] INFO  stram.StramLocalCluster log -
> container-0 msg: [container-0] Exiting heartbeat loop..
> 2018-06-19 21:35:03,464 [container-2] INFO  stram.StramLocalCluster run -
> Container container-2 terminating.
> 2018-06-19 21:35:03,465 [ServerHelper-101-1] INFO  server.Server run -
> Removing ln LogicalNode@5a90f429identifier=tcp://localhost:62204/1.out.1,
> upstream=1.out.1, group=genToProcessor/2.input, partitions=[],
>
> iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@1fa9e9ce
> {da=com.datatorrent.bufferserver.internal.DataList$Block@6b00c947
> {identifier=1.out.1,
> data=1048576, readingOffset=0, writingOffset=481,
> starting_window=5b29af4500000001, ending_window=5b29af4500000005,
> refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
> DataList@67d38a09[identifier=1.out.1]
> 2018-06-19 21:35:03,470 [container-1] INFO  stram.StramLocalCluster run -
> Container container-1 terminating.
> 2018-06-19 21:35:03,470 [container-0] INFO  stram.StramLocalCluster run -
> Container container-0 terminating.
> 2018-06-19 21:35:03,471 [ServerHelper-101-1] INFO  server.Server run -
> Removing ln LogicalNode@1badfe12identifier
> =tcp://localhost:62204/2.output.1,
> upstream=2.output.1, group=ProcessorToReceiver/3.input, partitions=[],
>
> iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@3abf0b66
> {da=com.datatorrent.bufferserver.internal.DataList$Block@6a887266
> {identifier=2.output.1,
> data=1048576, readingOffset=0, writingOffset=481,
> starting_window=5b29af4500000001, ending_window=5b29af4500000005,
> refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
> DataList@7afd481[identifier=2.output.1]
> 2018-06-19 21:35:03,472 [ProcessWideEventLoop] INFO  server.Server run -
> Server stopped listening at /0:0:0:0:0:0:0:0:62204
> 2018-06-19 21:35:03,472 [main] INFO  stram.StramLocalCluster run -
> Application finished.
> 2018-06-19 21:35:03,472 [main] INFO  stram.CustomControlTupleTest testApp -
> Control Tuples received 3 expected 3
> 2018-06-19 21:35:03,489 [main] INFO  util.AsyncFSStorageAgent save - using
> /Users/mbossert/testIdea/apex-core/engine/target/chkp1123378605276624191 as
> the basepath for checkpointing.
> 2018-06-19 21:35