Branch 3.7.0 failing install related to Kryo version...perhaps


Aaron Bossert
I recently attempted to update Kryo from 2.24.0 to 4.0.2 to address a
serialization issue: newer Kryo versions support java.time.Instant and a
couple of other classes that 2.24.0 does not. My baseline build and install
(vanilla, no changes of any kind, just download apex-core and run
"mvn clean install") works fine. However, after updating the Kryo dependency
to 4.0.2, I get the following error, which is not obvious to me (running
"mvn clean install -X").
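For context, the dependency change itself was minimal. A sketch of the kind of edit I made (the exact location of the version in apex-core's poms may differ; note also that Kryo's Maven coordinates changed between the 2.x line, which was published as com.esotericsoftware.kryo:kryo, and 3.x/4.x, which are published as com.esotericsoftware:kryo):

```xml
<!-- old coordinates (2.x line) -->
<dependency>
  <groupId>com.esotericsoftware.kryo</groupId>
  <artifactId>kryo</artifactId>
  <version>2.24.0</version>
</dependency>

<!-- new coordinates (3.x/4.x line) -->
<dependency>
  <groupId>com.esotericsoftware</groupId>
  <artifactId>kryo</artifactId>
  <version>4.0.2</version>
</dependency>
```

If only the version number is bumped without switching the groupId, Maven will not find a 4.0.2 artifact under the old coordinates, so that is worth double-checking before digging into test failures.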
I also identified what may be a bug, or perhaps a feature. When building on
my macOS laptop, my IDEA project folder lives in iCloud, which is locally
stored in a directory whose name contains a space, so the path needs to be
escaped. When I initially built, I kept running into errors related to that.
I am not sure whether this is something that should be fixed (it is not as
straightforward as I had hoped) or whether we should simply require that
directory names contain no spaces. I have no control over the name of the
local iCloud folder; otherwise, I would have just renamed it.
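As a workaround (rather than a fix to the build), building through a symlink from a space-free location into the space-containing directory sidesteps the problem. A small sketch, using a temporary directory as a stand-in for the real iCloud folder (the actual path would be something like "$HOME/Library/Mobile Documents/com~apple~CloudDocs/..."):

```shell
# Stand-in for the iCloud folder whose name contains a space.
base=$(mktemp -d)
mkdir "$base/Idea Projects"
echo hello > "$base/Idea Projects/marker.txt"

# Create a space-free alias pointing at the space-containing directory.
ln -s "$base/Idea Projects" "$base/apex-src"

# Tools can now be pointed at the space-free path; quoting is no longer
# load-bearing for every command in the build.
cat "$base/apex-src/marker.txt"
```

In the real setup one would run "mvn clean install" from the symlinked path instead of the iCloud path.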

2018-06-18 12:43:24,485 [main] ERROR stram.RecoverableRpcProxy invoke - Giving up RPC connection recovery after 504 ms
java.net.SocketTimeoutException: Call From MacBook-Pro-6.local/10.37.129.2 to MacBook-Pro-6.local:65136 failed on socket timeout exception: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/10.37.129.2:65137 remote=MacBook-Pro-6.local/10.37.129.2:65136]; For more details see: http://wiki.apache.org/hadoop/SocketTimeout
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
    at org.apache.hadoop.ipc.Client.call(Client.java:1472)
    at org.apache.hadoop.ipc.Client.call(Client.java:1399)
    at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
    at com.sun.proxy.$Proxy138.log(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
    at com.sun.proxy.$Proxy138.log(Unknown Source)
    at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:561)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
    at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
    at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
    at org.junit.rules.RunRules.evaluate(RunRules.java:20)
    at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
    at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
    at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
    at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
    at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
    at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
    at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
    at org.junit.runners.Suite.runChild(Suite.java:127)
    at org.junit.runners.Suite.runChild(Suite.java:26)
    at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
    at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
    at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
    at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
    at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
    at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
    at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
    at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
    at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
    at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
    at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
    at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
    at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
    at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
    at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
Caused by: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/10.37.129.2:65137 remote=MacBook-Pro-6.local/10.37.129.2:65136]
    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
    at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
    at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
    at java.io.FilterInputStream.read(FilterInputStream.java:133)
    at java.io.FilterInputStream.read(FilterInputStream.java:133)
    at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
    at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
    at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
    at java.io.DataInputStream.readInt(DataInputStream.java:387)
    at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
    at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
2018-06-18 12:43:24,987 [IPC Server handler 0 on 65136] WARN  ipc.Server processResponse - IPC Server handler 0 on 65136, call log(containerId, timeout), rpc version=2, client version=201208081755, methodsFingerPrint=-1300451462 from 10.37.129.2:65137 Call#141 Retry#0: output error
2018-06-18 12:43:24,999 [main] WARN  stram.RecoverableRpcProxy invoke - RPC failure, will retry after 100 ms (remaining 998 ms)
java.net.SocketTimeoutException: Call From MacBook-Pro-6.local/10.37.129.2 to MacBook-Pro-6.local:65136 failed on socket timeout exception: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/10.37.129.2:65138 remote=MacBook-Pro-6.local/10.37.129.2:65136]; For more details see: http://wiki.apache.org/hadoop/SocketTimeout
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
    at org.apache.hadoop.ipc.Client.call(Client.java:1472)
    at org.apache.hadoop.ipc.Client.call(Client.java:1399)
    at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
    at com.sun.proxy.$Proxy138.log(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
    at com.sun.proxy.$Proxy138.log(Unknown Source)
    at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:575)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
    at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
    at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
    at org.junit.rules.RunRules.evaluate(RunRules.java:20)
    at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
    at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
    at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
    at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
    at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
    at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
    at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
    at org.junit.runners.Suite.runChild(Suite.java:127)
    at org.junit.runners.Suite.runChild(Suite.java:26)
    at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
    at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
    at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
    at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
    at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
    at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
    at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
    at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
    at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
    at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
    at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
    at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
    at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
    at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
    at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
Caused by: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/10.37.129.2:65138 remote=MacBook-Pro-6.local/10.37.129.2:65136]
    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
    at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
    at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
    at java.io.FilterInputStream.read(FilterInputStream.java:133)
    at java.io.FilterInputStream.read(FilterInputStream.java:133)
    at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
    at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
    at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
    at java.io.DataInputStream.readInt(DataInputStream.java:387)
    at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
    at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
2018-06-18 12:43:25,607 [main] WARN  stram.RecoverableRpcProxy invoke - RPC failure, will retry after 100 ms (remaining 390 ms)
java.net.SocketTimeoutException: Call From MacBook-Pro-6.local/10.37.129.2 to MacBook-Pro-6.local:65136 failed on socket timeout exception: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/10.37.129.2:65139 remote=MacBook-Pro-6.local/10.37.129.2:65136]; For more details see: http://wiki.apache.org/hadoop/SocketTimeout
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
    at org.apache.hadoop.ipc.Client.call(Client.java:1472)
    at org.apache.hadoop.ipc.Client.call(Client.java:1399)
    at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
    at com.sun.proxy.$Proxy138.log(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
    at com.sun.proxy.$Proxy138.log(Unknown Source)
    at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:575)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
    at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
    at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
    at org.junit.rules.RunRules.evaluate(RunRules.java:20)
    at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
    at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
    at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
    at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
    at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
    at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
    at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
    at org.junit.runners.Suite.runChild(Suite.java:127)
    at org.junit.runners.Suite.runChild(Suite.java:26)
    at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
    at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
    at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
    at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
    at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
    at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
    at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
    at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
    at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
    at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
    at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
    at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
    at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
    at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
    at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
Caused by: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/10.37.129.2:65139 remote=MacBook-Pro-6.local/10.37.129.2:65136]
    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
    at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
    at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
    at java.io.FilterInputStream.read(FilterInputStream.java:133)
    at java.io.FilterInputStream.read(FilterInputStream.java:133)
    at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
    at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
    at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
    at java.io.DataInputStream.readInt(DataInputStream.java:387)
    at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
    at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
2018-06-18 12:43:25,987 [IPC Server handler 0 on 65136] WARN  ipc.Server processResponse - IPC Server handler 0 on 65136, call log(containerId, timeout), rpc version=2, client version=201208081755, methodsFingerPrint=-1300451462 from 10.37.129.2:65138 Call#142 Retry#0: output error
2018-06-18 12:43:26,603 [main] ERROR stram.RecoverableRpcProxy invoke - Giving up RPC connection recovery after 501 ms
java.net.SocketTimeoutException: Call From MacBook-Pro-6.local/10.37.129.2 to MacBook-Pro-6.local:65136 failed on socket timeout exception: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/10.37.129.2:65141 remote=MacBook-Pro-6.local/10.37.129.2:65136]; For more details see: http://wiki.apache.org/hadoop/SocketTimeout
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
    at org.apache.hadoop.ipc.Client.call(Client.java:1472)
    at org.apache.hadoop.ipc.Client.call(Client.java:1399)
    at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
    at com.sun.proxy.$Proxy138.log(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
    at com.sun.proxy.$Proxy138.log(Unknown Source)
    at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:596)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
    at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
    at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
    at org.junit.rules.RunRules.evaluate(RunRules.java:20)
    at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
    at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
    at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
    at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
    at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
    at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
    at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
    at org.junit.runners.Suite.runChild(Suite.java:127)
    at org.junit.runners.Suite.runChild(Suite.java:26)
    at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
    at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
    at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
    at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
    at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
    at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
    at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
    at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
    at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
    at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
    at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
    at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
    at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
    at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
    at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
Caused by: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/10.37.129.2:65141 remote=MacBook-Pro-6.local/10.37.129.2:65136]
    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
    at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
    at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
    at java.io.FilterInputStream.read(FilterInputStream.java:133)
    at java.io.FilterInputStream.read(FilterInputStream.java:133)
    at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
    at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
    at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
    at java.io.DataInputStream.readInt(DataInputStream.java:387)
    at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
    at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
2018-06-18 12:43:27,105 [IPC Server handler 0 on 65136] WARN  ipc.Server processResponse - IPC Server handler 0 on 65136, call log(containerId, timeout), rpc version=2, client version=201208081755, methodsFingerPrint=-1300451462 from 10.37.129.2:65141 Call#146 Retry#0: output error
2018-06-18 12:43:27,114 [main] WARN  stram.RecoverableRpcProxy invoke - RPC failure, will retry after 100 ms (remaining 995 ms)
java.net.SocketTimeoutException: Call From MacBook-Pro-6.local/10.37.129.2 to MacBook-Pro-6.local:65136 failed on socket timeout exception: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/10.37.129.2:65142 remote=MacBook-Pro-6.local/10.37.129.2:65136]; For more details see: http://wiki.apache.org/hadoop/SocketTimeout
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
    at org.apache.hadoop.ipc.Client.call(Client.java:1472)
    at org.apache.hadoop.ipc.Client.call(Client.java:1399)
    at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
    at com.sun.proxy.$Proxy138.reportError(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
    at com.sun.proxy.$Proxy138.reportError(Unknown Source)
    at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:610)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
    at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
    at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
    at org.junit.rules.RunRules.evaluate(RunRules.java:20)
    at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
    at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
    at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
    at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
    at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
    at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
    at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
    at org.junit.runners.Suite.runChild(Suite.java:127)
    at org.junit.runners.Suite.runChild(Suite.java:26)
    at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
    at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
    at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
    at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
    at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
    at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
    at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
    at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
    at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
    at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
    at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
    at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
    at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
    at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
    at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
Caused by: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/10.37.129.2:65142 remote=MacBook-Pro-6.local/10.37.129.2:65136]
    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
    at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
    at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
    at java.io.FilterInputStream.read(FilterInputStream.java:133)
    at java.io.FilterInputStream.read(FilterInputStream.java:133)
    at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
    at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
    at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
    at java.io.DataInputStream.readInt(DataInputStream.java:387)
    at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
    at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
2018-06-18 12:43:27,722 [main] WARN  stram.RecoverableRpcProxy invoke - RPC failure, will retry after 100 ms (remaining 387 ms)
java.net.SocketTimeoutException: Call From MacBook-Pro-6.local/10.37.129.2 to MacBook-Pro-6.local:65136 failed on socket timeout exception: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/10.37.129.2:65143 remote=MacBook-Pro-6.local/10.37.129.2:65136]; For more details see: http://wiki.apache.org/hadoop/SocketTimeout
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
    at org.apache.hadoop.ipc.Client.call(Client.java:1472)
    at org.apache.hadoop.ipc.Client.call(Client.java:1399)
    at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
    at com.sun.proxy.$Proxy138.reportError(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
    at com.sun.proxy.$Proxy138.reportError(Unknown Source)
    at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:610)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
    at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
    at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
    at org.junit.rules.RunRules.evaluate(RunRules.java:20)
    at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
    at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
    at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
    at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
    at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
    at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
    at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
    at org.junit.runners.Suite.runChild(Suite.java:127)
    at org.junit.runners.Suite.runChild(Suite.java:26)
    at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
    at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
    at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
    at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
    at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
    at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
    at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
    at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
    at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
    at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
    at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
    at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
at
org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
at
org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
Caused by: java.net.SocketTimeoutException: 500 millis timeout while
waiting for channel to be ready for read. ch :
java.nio.channels.SocketChannel[connected local=/10.37.129.2:65143
remote=MacBook-Pro-6.local/10.37.129.2:65136]
at
org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
at java.io.FilterInputStream.read(FilterInputStream.java:133)
at java.io.FilterInputStream.read(FilterInputStream.java:133)
at
org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
at java.io.DataInputStream.readInt(DataInputStream.java:387)
at
org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
2018-06-18 12:43:28,109 [IPC Server handler 0 on 65136] WARN  ipc.Server
processResponse - IPC Server handler 0 on 65136, call
reportError(containerId, null, timeout, null), rpc version=2, client
version=201208081755, methodsFingerPrint=-1300451462 from
10.37.129.2:65142 Call#147
Retry#0: output error
2018-06-18 12:43:28,292 [main] INFO  stram.FSRecoveryHandler rotateLog -
Creating
target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app1/recovery/log
2018-06-18 12:43:28,423 [main] INFO  stram.FSRecoveryHandler rotateLog -
Creating
target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app1/recovery/log
2018-06-18 12:43:28,491 [main] INFO  stram.FSRecoveryHandler rotateLog -
Creating
target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app2/recovery/log
2018-06-18 12:43:28,492 [main] INFO  stram.StramClient copyInitialState -
Copying initial state took 32 ms
2018-06-18 12:43:28,607 [main] INFO  stram.FSRecoveryHandler rotateLog -
Creating
target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app2/recovery/log
2018-06-18 12:43:28,671 [main] INFO  stram.FSRecoveryHandler rotateLog -
Creating
target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app3/recovery/log
2018-06-18 12:43:28,673 [main] INFO  stram.StramClient copyInitialState -
Copying initial state took 35 ms
2018-06-18 12:43:28,805 [main] INFO  stram.FSRecoveryHandler rotateLog -
Creating
target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app3/recovery/log
2018-06-18 12:43:28,830 [main] WARN  physical.PhysicalPlan <init> -
Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container without
locality contraint due to insufficient resources.
2018-06-18 12:43:28,830 [main] WARN  physical.PhysicalPlan <init> -
Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container without
locality contraint due to insufficient resources.
2018-06-18 12:43:28,830 [main] WARN  physical.PhysicalPlan <init> -
Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container without
locality contraint due to insufficient resources.
2018-06-18 12:43:29,046 [main] WARN  physical.PhysicalPlan <init> -
Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container without
locality contraint due to insufficient resources.
2018-06-18 12:43:29,046 [main] WARN  physical.PhysicalPlan <init> -
Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container without
locality contraint due to insufficient resources.
2018-06-18 12:43:29,047 [main] WARN  physical.PhysicalPlan <init> -
Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container without
locality contraint due to insufficient resources.
2018-06-18 12:43:29,226 [main] INFO  util.AsyncFSStorageAgent save - using
/Users/mbossert/testIdea/apex-core/engine/target/chkp1927717229509930939 as
the basepath for checkpointing.
2018-06-18 12:43:29,339 [main] INFO  stram.FSRecoveryHandler rotateLog -
Creating
target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app1/recovery/log
2018-06-18 12:43:29,428 [main] INFO  stram.FSRecoveryHandler rotateLog -
Creating
target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app1/recovery/log
2018-06-18 12:43:29,493 [main] INFO  stram.FSRecoveryHandler rotateLog -
Creating
target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app2/recovery/log
2018-06-18 12:43:29,494 [main] INFO  stram.StramClient copyInitialState -
Copying initial state took 29 ms
2018-06-18 12:43:29,592 [main] INFO  stram.FSRecoveryHandler rotateLog -
Creating
target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app2/recovery/log
2018-06-18 12:43:29,649 [main] INFO  stram.FSRecoveryHandler rotateLog -
Creating
target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app3/recovery/log
2018-06-18 12:43:29,651 [main] INFO  stram.StramClient copyInitialState -
Copying initial state took 32 ms
2018-06-18 12:43:29,780 [main] INFO  stram.FSRecoveryHandler rotateLog -
Creating
target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app3/recovery/log
2018-06-18 12:43:29,808 [main] WARN  physical.PhysicalPlan <init> -
Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container without
locality contraint due to insufficient resources.
2018-06-18 12:43:29,809 [main] WARN  physical.PhysicalPlan <init> -
Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container without
locality contraint due to insufficient resources.
2018-06-18 12:43:29,809 [main] WARN  physical.PhysicalPlan <init> -
Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container without
locality contraint due to insufficient resources.
2018-06-18 12:43:29,809 [main] INFO  util.AsyncFSStorageAgent save - using
/Users/mbossert/testIdea/apex-core/engine/target/chkp1976097017195725194 as
the basepath for checkpointing.
2018-06-18 12:43:30,050 [main] WARN  physical.PhysicalPlan <init> -
Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container without
locality contraint due to insufficient resources.
2018-06-18 12:43:30,050 [main] WARN  physical.PhysicalPlan <init> -
Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container without
locality contraint due to insufficient resources.
2018-06-18 12:43:30,050 [main] WARN  physical.PhysicalPlan <init> -
Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container without
locality contraint due to insufficient resources.
2018-06-18 12:43:30,051 [main] INFO  util.AsyncFSStorageAgent save - using
/Users/mbossert/testIdea/apex-core/engine/target/chkp3935270209625805644 as
the basepath for checkpointing.
Tests run: 8, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 6.329 sec
<<< FAILURE! - in com.datatorrent.stram.StramRecoveryTest
testWriteAheadLog(com.datatorrent.stram.StramRecoveryTest)  Time elapsed:
0.097 sec  <<< FAILURE!
java.lang.AssertionError: flush count expected:<1> but was:<2>
at
com.datatorrent.stram.StramRecoveryTest.testWriteAheadLog(StramRecoveryTest.java:326)


--

M. Aaron Bossert
(571) 242-4021
Punch Cyber Analytics Group

Re: Branch 3.7.0 failing install related to Kryo version...perhaps

Aaron Bossert
Please disregard the first iteration...that ended up being caused by a hung
build running in the background, which I think produced the timeouts. I am
still seeing failures, but there are two of them and their root cause is
still a mystery to me. Here are the actual failures:

I don't immediately see how these are related to Kryo at all...but then
again, I am still familiarizing myself with the code base. I am hoping a
lightbulb turns on for someone out there who has some notion of how they
are related...
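
One way to rule Kryo in or out would be to confirm which Kryo artifact
Maven actually resolves for each module; this is a diagnostic sketch, not
something from the original thread. Note that Kryo changed its Maven
coordinates at 3.0 (2.x was com.esotericsoftware.kryo:kryo, 3.x/4.x is
com.esotericsoftware:kryo), so a transitive dependency could still be
pulling in the old artifact alongside the new one:

```shell
# List resolved Kryo artifacts per module; the wildcard catches both the
# old (com.esotericsoftware.kryo) and new (com.esotericsoftware) groupIds.
mvn dependency:tree -Dincludes='com.esotericsoftware*'
```

If both groupIds show up in the same module, the two Kryo versions would
coexist on the test classpath, which could explain behavior changes that
look unrelated to serialization.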

-------------------------------------------------------------------------------
Test set: com.datatorrent.stram.StramRecoveryTest
-------------------------------------------------------------------------------
Tests run: 8, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 6.119 sec
<<< FAILURE! - in com.datatorrent.stram.StramRecoveryTest
testWriteAheadLog(com.datatorrent.stram.StramRecoveryTest)  Time elapsed:
0.105 sec  <<< FAILURE!
java.lang.AssertionError: flush count expected:<1> but was:<2>
at
com.datatorrent.stram.StramRecoveryTest.testWriteAheadLog(StramRecoveryTest.java:326)

-------------------------------------------------------------------------------
Test set: com.datatorrent.stram.engine.StatsTest
-------------------------------------------------------------------------------
Tests run: 6, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 22.051 sec
<<< FAILURE! - in com.datatorrent.stram.engine.StatsTest
testQueueSizeForContainerLocalOperators(com.datatorrent.stram.engine.StatsTest)
 Time elapsed: 3.277 sec  <<< FAILURE!
java.lang.AssertionError: Validate input port queue size -1
at
com.datatorrent.stram.engine.StatsTest.baseTestForQueueSize(StatsTest.java:270)
at
com.datatorrent.stram.engine.StatsTest.testQueueSizeForContainerLocalOperators(StatsTest.java:285)
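
To iterate on just these two failures without a full "clean install", the
standard Surefire test filter can target a single test method in one
module. This is a sketch assuming the tests live in the engine module (the
checkpoint paths above under apex-core/engine/target suggest they do):

```shell
# Run only the failing test methods in the engine module.
# -pl selects the module, -am builds its dependencies first,
# -Dtest=Class#method is Surefire's single-test filter.
mvn test -pl engine -am -Dtest=StramRecoveryTest#testWriteAheadLog
mvn test -pl engine -am -Dtest=StatsTest#testQueueSizeForContainerLocalOperators
```
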

On Mon, Jun 18, 2018 at 1:20 PM Aaron Bossert <[hidden email]> wrote:

> I recently attempted to update Kryo from 2.24.0 to 4.0.2 to address a
> serialization issue related to support for Java Instant and a couple of
> other classes that are supported in newer Kryo versions.  My test build and
> install (vanilla, no changes of any kind, just download apex-core and
> "clean install") works fine, however, when updating the Kryo dependency to
> 4.0.2, I get this non-obvious (to me) error (running "clean install -X").
> I also identified a bug or perhaps a feature?  When building on my macOS
> laptop, I have an  Idea project folder in iCloud which is locally stored in
> a directory that contains a space in the name, which needs to be escaped.
> When I initially built, I kept running into errors related to that...not
> sure if that is something that should be fixed (it is not as
> straightforward as I had hoped) or simply require that directory names not
> include any spaces.  I have no control of the iCloud local folder
> name...otherwise, would have just fixed that.
>
> 2018-06-18 12:43:24,485 [main] ERROR stram.RecoverableRpcProxy invoke -
> Giving up RPC connection recovery after 504 ms
> java.net.SocketTimeoutException: Call From MacBook-Pro-6.local/10.37.129.2
>  to MacBook-Pro-6.local:65136 failed on socket timeout exception:
> java.net.SocketTimeoutException: 500 millis timeout while waiting for
> channel to be ready for read. ch :
> java.nio.channels.SocketChannel[connected local=/10.37.129.2:65137
> remote=MacBook-Pro-6.local/10.37.129.2:65136]; For more details see:
> http://wiki.apache.org/hadoop/SocketTimeout
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> at
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> at
> org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> at com.sun.proxy.$Proxy138.log(Unknown Source)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at
> com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
> at com.sun.proxy.$Proxy138.log(Unknown Source)
> at
> com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:561)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at
> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> at
> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> at
> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> at
> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> at
> org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> at
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> at
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> at org.junit.runners.Suite.runChild(Suite.java:127)
> at org.junit.runners.Suite.runChild(Suite.java:26)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
> at
> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
> at
> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
> at
> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
> at
> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
> at
> org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
> at
> org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
> at
> org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
> at
> org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
> Caused by: java.net.SocketTimeoutException: 500 millis timeout while
> waiting for channel to be ready for read. ch :
> java.nio.channels.SocketChannel[connected local=/10.37.129.2:65137
> remote=MacBook-Pro-6.local/10.37.129.2:65136]
> at
> org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> at
> org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
> at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
> at java.io.DataInputStream.readInt(DataInputStream.java:387)
> at
> org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
> 2018-06-18 12:43:24,987 [IPC Server handler 0 on 65136] WARN  ipc.Server
> processResponse - IPC Server handler 0 on 65136, call log(containerId,
> timeout), rpc version=2, client version=201208081755,
> methodsFingerPrint=-1300451462 from 10.37.129.2:65137 Call#141 Retry#0:
> output error
> 2018-06-18 12:43:24,999 [main] WARN  stram.RecoverableRpcProxy invoke -
> RPC failure, will retry after 100 ms (remaining 998 ms)
> java.net.SocketTimeoutException: Call From MacBook-Pro-6.local/10.37.129.2
>  to MacBook-Pro-6.local:65136 failed on socket timeout exception:
> java.net.SocketTimeoutException: 500 millis timeout while waiting for
> channel to be ready for read. ch :
> java.nio.channels.SocketChannel[connected local=/10.37.129.2:65138
> remote=MacBook-Pro-6.local/10.37.129.2:65136]; For more details see:
> http://wiki.apache.org/hadoop/SocketTimeout
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> at
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> at
> org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> at com.sun.proxy.$Proxy138.log(Unknown Source)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at
> com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
> at com.sun.proxy.$Proxy138.log(Unknown Source)
> at
> com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:575)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at
> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> at
> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> at
> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> at
> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> at
> org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> at
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> at
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> at org.junit.runners.Suite.runChild(Suite.java:127)
> at org.junit.runners.Suite.runChild(Suite.java:26)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
> at
> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
> at
> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
> at
> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
> at
> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
> at
> org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
> at
> org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
> at
> org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
> at
> org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
> Caused by: java.net.SocketTimeoutException: 500 millis timeout while
> waiting for channel to be ready for read. ch :
> java.nio.channels.SocketChannel[connected local=/10.37.129.2:65138
> remote=MacBook-Pro-6.local/10.37.129.2:65136]
> at
> org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> at
> org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
> at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
> at java.io.DataInputStream.readInt(DataInputStream.java:387)
> at
> org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
> 2018-06-18 12:43:25,607 [main] WARN  stram.RecoverableRpcProxy invoke -
> RPC failure, will retry after 100 ms (remaining 390 ms)
> java.net.SocketTimeoutException: Call From MacBook-Pro-6.local/10.37.129.2
>  to MacBook-Pro-6.local:65136 failed on socket timeout exception:
> java.net.SocketTimeoutException: 500 millis timeout while waiting for
> channel to be ready for read. ch :
> java.nio.channels.SocketChannel[connected local=/10.37.129.2:65139
> remote=MacBook-Pro-6.local/10.37.129.2:65136]; For more details see:
> http://wiki.apache.org/hadoop/SocketTimeout
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> at
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> at
> org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> at com.sun.proxy.$Proxy138.log(Unknown Source)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at
> com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
> at com.sun.proxy.$Proxy138.log(Unknown Source)
> at
> com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:575)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at
> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> at
> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> at
> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> at
> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> at
> org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> at
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> at
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> at org.junit.runners.Suite.runChild(Suite.java:127)
> at org.junit.runners.Suite.runChild(Suite.java:26)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
> at
> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
> at
> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
> at
> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
> at
> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
> at
> org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
> at
> org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
> at
> org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
> at
> org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
> Caused by: java.net.SocketTimeoutException: 500 millis timeout while
> waiting for channel to be ready for read. ch :
> java.nio.channels.SocketChannel[connected local=/10.37.129.2:65139
> remote=MacBook-Pro-6.local/10.37.129.2:65136]
> at
> org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> at
> org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
> at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
> at java.io.DataInputStream.readInt(DataInputStream.java:387)
> at
> org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
> 2018-06-18 12:43:25,987 [IPC Server handler 0 on 65136] WARN  ipc.Server
> processResponse - IPC Server handler 0 on 65136, call log(containerId,
> timeout), rpc version=2, client version=201208081755,
> methodsFingerPrint=-1300451462 from 10.37.129.2:65138 Call#142 Retry#0:
> output error
> 2018-06-18 12:43:26,603 [main] ERROR stram.RecoverableRpcProxy invoke -
> Giving up RPC connection recovery after 501 ms
> java.net.SocketTimeoutException: Call From MacBook-Pro-6.local/10.37.129.2
>  to MacBook-Pro-6.local:65136 failed on socket timeout exception:
> java.net.SocketTimeoutException: 500 millis timeout while waiting for
> channel to be ready for read. ch :
> java.nio.channels.SocketChannel[connected local=/10.37.129.2:65141
> remote=MacBook-Pro-6.local/10.37.129.2:65136]; For more details see:
> http://wiki.apache.org/hadoop/SocketTimeout
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> at
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> at
> org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> at com.sun.proxy.$Proxy138.log(Unknown Source)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at
> com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
> at com.sun.proxy.$Proxy138.log(Unknown Source)
> at
> com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:596)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at
> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> at
> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> at
> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> at
> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> at
> org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> at
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> at
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> at org.junit.runners.Suite.runChild(Suite.java:127)
> at org.junit.runners.Suite.runChild(Suite.java:26)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
> at
> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
> at
> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
> at
> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
> at
> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
> at
> org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
> at
> org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
> at
> org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
> at
> org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
> Caused by: java.net.SocketTimeoutException: 500 millis timeout while
> waiting for channel to be ready for read. ch :
> java.nio.channels.SocketChannel[connected local=/10.37.129.2:65141
> remote=MacBook-Pro-6.local/10.37.129.2:65136]
> at
> org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> at
> org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
> at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
> at java.io.DataInputStream.readInt(DataInputStream.java:387)
> at
> org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
> 2018-06-18 12:43:27,105 [IPC Server handler 0 on 65136] WARN  ipc.Server
> processResponse - IPC Server handler 0 on 65136, call log(containerId,
> timeout), rpc version=2, client version=201208081755,
> methodsFingerPrint=-1300451462 from 10.37.129.2:65141 Call#146 Retry#0:
> output error
> 2018-06-18 12:43:27,114 [main] WARN  stram.RecoverableRpcProxy invoke -
> RPC failure, will retry after 100 ms (remaining 995 ms)
> java.net.SocketTimeoutException: Call From MacBook-Pro-6.local/10.37.129.2
>  to MacBook-Pro-6.local:65136 failed on socket timeout exception:
> java.net.SocketTimeoutException: 500 millis timeout while waiting for
> channel to be ready for read. ch :
> java.nio.channels.SocketChannel[connected local=/10.37.129.2:65142
> remote=MacBook-Pro-6.local/10.37.129.2:65136]; For more details see:
> http://wiki.apache.org/hadoop/SocketTimeout
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> at
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> at
> org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> at com.sun.proxy.$Proxy138.reportError(Unknown Source)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at
> com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
> at com.sun.proxy.$Proxy138.reportError(Unknown Source)
> at
> com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:610)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at
> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> at
> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> at
> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> at
> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> at
> org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> at
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> at
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> at org.junit.runners.Suite.runChild(Suite.java:127)
> at org.junit.runners.Suite.runChild(Suite.java:26)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
> at
> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
> at
> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
> at
> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
> at
> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
> at
> org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
> at
> org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
> at
> org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
> at
> org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
> Caused by: java.net.SocketTimeoutException: 500 millis timeout while
> waiting for channel to be ready for read. ch :
> java.nio.channels.SocketChannel[connected local=/10.37.129.2:65142
> remote=MacBook-Pro-6.local/10.37.129.2:65136]
> at
> org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> at
> org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
> at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
> at java.io.DataInputStream.readInt(DataInputStream.java:387)
> at
> org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
> 2018-06-18 12:43:27,722 [main] WARN  stram.RecoverableRpcProxy invoke -
> RPC failure, will retry after 100 ms (remaining 387 ms)
> java.net.SocketTimeoutException: Call From MacBook-Pro-6.local/10.37.129.2
>  to MacBook-Pro-6.local:65136 failed on socket timeout exception:
> java.net.SocketTimeoutException: 500 millis timeout while waiting for
> channel to be ready for read. ch :
> java.nio.channels.SocketChannel[connected local=/10.37.129.2:65143
> remote=MacBook-Pro-6.local/10.37.129.2:65136]; For more details see:
> http://wiki.apache.org/hadoop/SocketTimeout
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> at
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> at
> org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> at com.sun.proxy.$Proxy138.reportError(Unknown Source)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at
> com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
> at com.sun.proxy.$Proxy138.reportError(Unknown Source)
> at
> com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:610)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at
> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> at
> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> at
> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> at
> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> at
> org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> at
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> at
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> at org.junit.runners.Suite.runChild(Suite.java:127)
> at org.junit.runners.Suite.runChild(Suite.java:26)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
> at
> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
> at
> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
> at
> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
> at
> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
> at
> org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
> at
> org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
> at
> org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
> at
> org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
> Caused by: java.net.SocketTimeoutException: 500 millis timeout while
> waiting for channel to be ready for read. ch :
> java.nio.channels.SocketChannel[connected local=/10.37.129.2:65143
> remote=MacBook-Pro-6.local/10.37.129.2:65136]
> at
> org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> at
> org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
> at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
> at java.io.DataInputStream.readInt(DataInputStream.java:387)
> at
> org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
> 2018-06-18 12:43:28,109 [IPC Server handler 0 on 65136] WARN  ipc.Server
> processResponse - IPC Server handler 0 on 65136, call
> reportError(containerId, null, timeout, null), rpc version=2, client
> version=201208081755, methodsFingerPrint=-1300451462 from
> 10.37.129.2:65142 Call#147 Retry#0: output error
> 2018-06-18 12:43:28,292 [main] INFO  stram.FSRecoveryHandler rotateLog -
> Creating
> target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app1/recovery/log
> 2018-06-18 12:43:28,423 [main] INFO  stram.FSRecoveryHandler rotateLog -
> Creating
> target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app1/recovery/log
> 2018-06-18 12:43:28,491 [main] INFO  stram.FSRecoveryHandler rotateLog -
> Creating
> target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app2/recovery/log
> 2018-06-18 12:43:28,492 [main] INFO  stram.StramClient copyInitialState -
> Copying initial state took 32 ms
> 2018-06-18 12:43:28,607 [main] INFO  stram.FSRecoveryHandler rotateLog -
> Creating
> target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app2/recovery/log
> 2018-06-18 12:43:28,671 [main] INFO  stram.FSRecoveryHandler rotateLog -
> Creating
> target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app3/recovery/log
> 2018-06-18 12:43:28,673 [main] INFO  stram.StramClient copyInitialState -
> Copying initial state took 35 ms
> 2018-06-18 12:43:28,805 [main] INFO  stram.FSRecoveryHandler rotateLog -
> Creating
> target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app3/recovery/log
> 2018-06-18 12:43:28,830 [main] WARN  physical.PhysicalPlan <init> -
> Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container without
> locality contraint due to insufficient resources.
> 2018-06-18 12:43:28,830 [main] WARN  physical.PhysicalPlan <init> -
> Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container without
> locality contraint due to insufficient resources.
> 2018-06-18 12:43:28,830 [main] WARN  physical.PhysicalPlan <init> -
> Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container without
> locality contraint due to insufficient resources.
> 2018-06-18 12:43:29,046 [main] WARN  physical.PhysicalPlan <init> -
> Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container without
> locality contraint due to insufficient resources.
> 2018-06-18 12:43:29,046 [main] WARN  physical.PhysicalPlan <init> -
> Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container without
> locality contraint due to insufficient resources.
> 2018-06-18 12:43:29,047 [main] WARN  physical.PhysicalPlan <init> -
> Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container without
> locality contraint due to insufficient resources.
> 2018-06-18 12:43:29,226 [main] INFO  util.AsyncFSStorageAgent save - using
> /Users/mbossert/testIdea/apex-core/engine/target/chkp1927717229509930939 as
> the basepath for checkpointing.
> 2018-06-18 12:43:29,339 [main] INFO  stram.FSRecoveryHandler rotateLog -
> Creating
> target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app1/recovery/log
> 2018-06-18 12:43:29,428 [main] INFO  stram.FSRecoveryHandler rotateLog -
> Creating
> target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app1/recovery/log
> 2018-06-18 12:43:29,493 [main] INFO  stram.FSRecoveryHandler rotateLog -
> Creating
> target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app2/recovery/log
> 2018-06-18 12:43:29,494 [main] INFO  stram.StramClient copyInitialState -
> Copying initial state took 29 ms
> 2018-06-18 12:43:29,592 [main] INFO  stram.FSRecoveryHandler rotateLog -
> Creating
> target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app2/recovery/log
> 2018-06-18 12:43:29,649 [main] INFO  stram.FSRecoveryHandler rotateLog -
> Creating
> target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app3/recovery/log
> 2018-06-18 12:43:29,651 [main] INFO  stram.StramClient copyInitialState -
> Copying initial state took 32 ms
> 2018-06-18 12:43:29,780 [main] INFO  stram.FSRecoveryHandler rotateLog -
> Creating
> target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app3/recovery/log
> 2018-06-18 12:43:29,808 [main] WARN  physical.PhysicalPlan <init> -
> Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container without
> locality contraint due to insufficient resources.
> 2018-06-18 12:43:29,809 [main] WARN  physical.PhysicalPlan <init> -
> Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container without
> locality contraint due to insufficient resources.
> 2018-06-18 12:43:29,809 [main] WARN  physical.PhysicalPlan <init> -
> Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container without
> locality contraint due to insufficient resources.
> 2018-06-18 12:43:29,809 [main] INFO  util.AsyncFSStorageAgent save - using
> /Users/mbossert/testIdea/apex-core/engine/target/chkp1976097017195725194 as
> the basepath for checkpointing.
> 2018-06-18 12:43:30,050 [main] WARN  physical.PhysicalPlan <init> -
> Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container without
> locality contraint due to insufficient resources.
> 2018-06-18 12:43:30,050 [main] WARN  physical.PhysicalPlan <init> -
> Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container without
> locality contraint due to insufficient resources.
> 2018-06-18 12:43:30,050 [main] WARN  physical.PhysicalPlan <init> -
> Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container without
> locality contraint due to insufficient resources.
> 2018-06-18 12:43:30,051 [main] INFO  util.AsyncFSStorageAgent save - using
> /Users/mbossert/testIdea/apex-core/engine/target/chkp3935270209625805644 as
> the basepath for checkpointing.
> Tests run: 8, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 6.329 sec
> <<< FAILURE! - in com.datatorrent.stram.StramRecoveryTest
> testWriteAheadLog(com.datatorrent.stram.StramRecoveryTest)  Time elapsed:
> 0.097 sec  <<< FAILURE!
> java.lang.AssertionError: flush count expected:<1> but was:<2>
> at
> com.datatorrent.stram.StramRecoveryTest.testWriteAheadLog(StramRecoveryTest.java:326)
>
>
> --
>
> M. Aaron Bossert
> (571) 242-4021
> Punch Cyber Analytics Group
>
>
>

--

M. Aaron Bossert
(571) 242-4021
Punch Cyber Analytics Group
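The `flush count expected:<1> but was:<2>` failure in testWriteAheadLog above is the kind of off-by-one that appears when an upgraded serialization library issues an extra flush of its own. A stdlib-only sketch of the counting pattern, under the assumption that the test pins an exact flush count (class and method names here are illustrative, not the actual Apex test code):

```java
import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.FilterOutputStream;
import java.io.IOException;
import java.io.OutputStream;

// Counts flush() calls passing through a stream, mimicking how a
// write-ahead-log test can assert on the number of flushes a writer performs.
public class FlushCountDemo {
    static final class CountingStream extends FilterOutputStream {
        int flushes = 0;
        CountingStream(OutputStream out) { super(out); }
        @Override public void flush() throws IOException {
            flushes++;               // record every flush that reaches the sink
            super.flush();
        }
    }

    // Writes one record; a library that adds its own flush on top of the
    // writer's explicit one yields 2 where the test expects 1.
    static int countFlushes() throws IOException {
        CountingStream cs = new CountingStream(new ByteArrayOutputStream());
        DataOutputStream out = new DataOutputStream(cs);
        out.writeInt(42);
        out.flush();                 // the flush the test expects
        out.flush();                 // an extra flush, e.g. from an upgraded dependency
        return cs.flushes;
    }

    public static void main(String[] args) throws IOException {
        System.out.println("flush count = " + countFlushes());
    }
}
```

If the Kryo 4 output path flushes once more than Kryo 2.24 did, a test pinned to an exact flush count fails exactly this way without any actual serialization bug.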

Re: Branch 3.7.0 failing install related to Kryo version...perhaps

Pramod Immaneni-3
Do you see the same errors when you run the individual tests in question in
isolation, e.g. with mvn test -Dtest=<test-class>? If you do, can you
paste the full logs of what you see when the individual tests fail?
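For reference, Surefire supports narrowing a run to a single class or method; the module selector below (`engine`, where the failing tests appear to live based on the checkpoint paths in the logs) is an assumption:

```shell
# Run one test class in isolation from the apex-core checkout
mvn test -Dtest=StramRecoveryTest -pl engine

# Or narrow further to the single failing method
mvn test -Dtest=StramRecoveryTest#testWriteAheadLog -pl engine
```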

Thanks

On Mon, Jun 18, 2018 at 11:41 AM Aaron Bossert <[hidden email]> wrote:

> Please disregard the first iteration...that turned out to be caused by a
> hung build running in the background, which I think triggered the
> timeouts.  I am still seeing failures, but there are two of them, and
> their root cause is still a mystery to me.  Here are the actual failures:
>
> I don't immediately see how these relate to Kryo at all...but then
> again, I am still familiarizing myself with the code base.  I am hoping
> that a lightbulb turns on for someone out there who has some notion of
> how they are related...
>
>
> -------------------------------------------------------------------------------
> Test set: com.datatorrent.stram.StramRecoveryTest
>
> -------------------------------------------------------------------------------
> Tests run: 8, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 6.119 sec
> <<< FAILURE! - in com.datatorrent.stram.StramRecoveryTest
> testWriteAheadLog(com.datatorrent.stram.StramRecoveryTest)  Time elapsed:
> 0.105 sec  <<< FAILURE!
> java.lang.AssertionError: flush count expected:<1> but was:<2>
> at
>
> com.datatorrent.stram.StramRecoveryTest.testWriteAheadLog(StramRecoveryTest.java:326)
>
>
> -------------------------------------------------------------------------------
> Test set: com.datatorrent.stram.engine.StatsTest
>
> -------------------------------------------------------------------------------
> Tests run: 6, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 22.051 sec
> <<< FAILURE! - in com.datatorrent.stram.engine.StatsTest
>
> testQueueSizeForContainerLocalOperators(com.datatorrent.stram.engine.StatsTest)
>  Time elapsed: 3.277 sec  <<< FAILURE!
> java.lang.AssertionError: Validate input port queue size -1
> at
>
> com.datatorrent.stram.engine.StatsTest.baseTestForQueueSize(StatsTest.java:270)
> at
>
> com.datatorrent.stram.engine.StatsTest.testQueueSizeForContainerLocalOperators(StatsTest.java:285)
>
> On Mon, Jun 18, 2018 at 1:20 PM Aaron Bossert <[hidden email]>
> wrote:
>
> > I recently attempted to update Kryo from 2.24.0 to 4.0.2 to address a
> > serialization issue related to support for Java Instant and a couple of
> > other classes that are supported in newer Kryo versions.  My test build
> > and install (vanilla, no changes of any kind, just download apex-core
> > and "clean install") works fine; however, when updating the Kryo
> > dependency to 4.0.2, I get this non-obvious (to me) error (running
> > "clean install -X").  I also identified a bug, or perhaps a feature?
> > When building on my macOS laptop, I have an Idea project folder in
> > iCloud which is locally stored in a directory that contains a space in
> > the name, which needs to be escaped.  When I initially built, I kept
> > running into errors related to that...not sure if that is something
> > that should be fixed (it is not as straightforward as I had hoped) or
> > whether we should simply require that directory names not include any
> > spaces.  I have no control over the iCloud local folder
> > name...otherwise, I would have just fixed that.
> >
> > 2018-06-18 12:43:24,485 [main] ERROR stram.RecoverableRpcProxy invoke -
> > Giving up RPC connection recovery after 504 ms
> > java.net.SocketTimeoutException: Call From MacBook-Pro-6.local/10.37.129.2
> >  to MacBook-Pro-6.local:65136 failed on socket timeout exception:
> > java.net.SocketTimeoutException: 500 millis timeout while waiting for
> > channel to be ready for read. ch :
> > java.nio.channels.SocketChannel[connected local=/10.37.129.2:65137
> > remote=MacBook-Pro-6.local/10.37.129.2:65136]; For more details see:
> > http://wiki.apache.org/hadoop/SocketTimeout
> > at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> > at
> >
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> > at
> >
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> > at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> > at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> > at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> > at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> > at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> > at
> >
> org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> > at com.sun.proxy.$Proxy138.log(Unknown Source)
> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > at
> >
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > at
> >
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > at java.lang.reflect.Method.invoke(Method.java:498)
> > at
> >
> com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
> > at com.sun.proxy.$Proxy138.log(Unknown Source)
> > at
> >
> com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:561)
> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > at
> >
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > at
> >
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > at java.lang.reflect.Method.invoke(Method.java:498)
> > at
> >
> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> > at
> >
> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> > at
> >
> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> > at
> >
> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> > at
> >
> org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> > at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> > at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> > at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> > at
> >
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> > at
> >
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> > at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> > at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> > at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> > at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> > at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> > at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> > at org.junit.runners.Suite.runChild(Suite.java:127)
> > at org.junit.runners.Suite.runChild(Suite.java:26)
> > at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> > at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> > at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> > at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> > at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> > at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> > at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
> > at
> >
> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
> > at
> >
> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
> > at
> >
> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
> > at
> >
> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
> > at
> >
> org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
> > at
> >
> org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
> > at
> >
> org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
> > at
> > org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
> > Caused by: java.net.SocketTimeoutException: 500 millis timeout while
> > waiting for channel to be ready for read. ch :
> > java.nio.channels.SocketChannel[connected local=/10.37.129.2:65137
> > remote=MacBook-Pro-6.local/10.37.129.2:65136]
> > at
> > org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
> > at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
> > at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
> > at java.io.FilterInputStream.read(FilterInputStream.java:133)
> > at java.io.FilterInputStream.read(FilterInputStream.java:133)
> > at
> >
> org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
> > at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> > at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
> > at java.io.DataInputStream.readInt(DataInputStream.java:387)
> > at
> >
> org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> > at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
> > 2018-06-18 12:43:24,987 [IPC Server handler 0 on 65136] WARN  ipc.Server
> > processResponse - IPC Server handler 0 on 65136, call log(containerId,
> > timeout), rpc version=2, client version=201208081755,
> > methodsFingerPrint=-1300451462 from 10.37.129.2:65137 Call#141 Retry#0:
> > output error
> > 2018-06-18 12:43:24,999 [main] WARN  stram.RecoverableRpcProxy invoke -
> > RPC failure, will retry after 100 ms (remaining 998 ms)
> > java.net.SocketTimeoutException: Call From MacBook-Pro-6.local/
> 10.37.129.2
> >  to MacBook-Pro-6.local:65136 failed on socket timeout exception:
> > java.net.SocketTimeoutException: 500 millis timeout while waiting for
> > channel to be ready for read. ch :
> > java.nio.channels.SocketChannel[connected local=/10.37.129.2:65138
> > remote=MacBook-Pro-6.local/10.37.129.2:65136]; For more details see:
> > http://wiki.apache.org/hadoop/SocketTimeout
> > at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> > at
> >
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> > at
> >
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> > at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> > at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> > at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> > at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> > at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> > at
> >
> org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> > at com.sun.proxy.$Proxy138.log(Unknown Source)
> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > at
> >
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > at
> >
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > at java.lang.reflect.Method.invoke(Method.java:498)
> > at
> >
> com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
> > at com.sun.proxy.$Proxy138.log(Unknown Source)
> > at
> >
> com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:575)
> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > at
> >
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > at
> >
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > at java.lang.reflect.Method.invoke(Method.java:498)
> > at
> >
> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> > at
> >
> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> > at
> >
> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> > at
> >
> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> > at
> >
> org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> > at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> > at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> > at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> > at
> >
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> > at
> >
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> > at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> > at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> > at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> > at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> > at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> > at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> > at org.junit.runners.Suite.runChild(Suite.java:127)
> > at org.junit.runners.Suite.runChild(Suite.java:26)
> > at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> > at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> > at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> > at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> > at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> > at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> > at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
> > at
> >
> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
> > at
> >
> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
> > at
> >
> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
> > at
> >
> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
> > at
> >
> org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
> > at
> >
> org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
> > at
> >
> org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
> > at
> > org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
> > Caused by: java.net.SocketTimeoutException: 500 millis timeout while
> > waiting for channel to be ready for read. ch :
> > java.nio.channels.SocketChannel[connected local=/10.37.129.2:65138
> > remote=MacBook-Pro-6.local/10.37.129.2:65136]
> > at
> > org.apache.hadoop.net
> .SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
> > at org.apache.hadoop.net
> .SocketInputStream.read(SocketInputStream.java:161)
> > at org.apache.hadoop.net
> .SocketInputStream.read(SocketInputStream.java:131)
> > at java.io.FilterInputStream.read(FilterInputStream.java:133)
> > at java.io.FilterInputStream.read(FilterInputStream.java:133)
> > at
> >
> org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
> > at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> > at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
> > at java.io.DataInputStream.readInt(DataInputStream.java:387)
> > at
> >
> org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> > at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
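The retry messages above ("will retry after 100 ms (remaining 998 ms)", "Giving up RPC connection recovery after 504 ms") suggest the proxy retries the failing RPC on a fixed delay until a recovery deadline expires, then rethrows. A minimal stdlib-only sketch of that pattern, for readers following along (the names `callWithRetry` and `RetrySketch` are illustrative, not the actual apex-core `RecoverableRpcProxy` API):

```java
import java.net.SocketTimeoutException;
import java.util.concurrent.Callable;

public class RetrySketch {
    static final long RETRY_DELAY_MS = 100;      // "will retry after 100 ms"
    static final long RECOVERY_WINDOW_MS = 1000; // "(remaining 998 ms)" implies a ~1 s budget

    // Retry the call on timeout until the recovery window is exhausted, then rethrow.
    static <T> T callWithRetry(Callable<T> call) throws Exception {
        long deadline = System.currentTimeMillis() + RECOVERY_WINDOW_MS;
        while (true) {
            try {
                return call.call();
            } catch (SocketTimeoutException e) {
                long remaining = deadline - System.currentTimeMillis();
                if (remaining <= 0) {
                    // corresponds to "Giving up RPC connection recovery after ... ms"
                    throw e;
                }
                // corresponds to "RPC failure, will retry after 100 ms (remaining ... ms)"
                Thread.sleep(Math.min(RETRY_DELAY_MS, remaining));
            }
        }
    }

    public static void main(String[] args) throws Exception {
        final int[] attempts = {0};
        // A simulated call that times out twice, then succeeds.
        String result = callWithRetry(() -> {
            if (++attempts[0] < 3) {
                throw new SocketTimeoutException("simulated 500 ms read timeout");
            }
            return "ok";
        });
        System.out.println(result + " after " + attempts[0] + " attempts");
    }
}
```

In the log the window is exhausted before the server recovers, which is why the test ends with "Giving up" rather than a successful retry.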
> > 2018-06-18 12:43:25,607 [main] WARN  stram.RecoverableRpcProxy invoke - RPC failure, will retry after 100 ms (remaining 390 ms)
> > java.net.SocketTimeoutException: Call From MacBook-Pro-6.local/10.37.129.2 to MacBook-Pro-6.local:65136 failed on socket timeout exception: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/10.37.129.2:65139 remote=MacBook-Pro-6.local/10.37.129.2:65136]; For more details see: http://wiki.apache.org/hadoop/SocketTimeout
> > at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> > at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> > at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> > at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> > at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> > at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> > at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> > at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> > at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> > at com.sun.proxy.$Proxy138.log(Unknown Source)
> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > at java.lang.reflect.Method.invoke(Method.java:498)
> > at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
> > at com.sun.proxy.$Proxy138.log(Unknown Source)
> > at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:575)
> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > at java.lang.reflect.Method.invoke(Method.java:498)
> > at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> > at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> > at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> > at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> > at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> > at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> > at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> > at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> > at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> > at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> > at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> > at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> > at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> > at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> > at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> > at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> > at org.junit.runners.Suite.runChild(Suite.java:127)
> > at org.junit.runners.Suite.runChild(Suite.java:26)
> > at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> > at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> > at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> > at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> > at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> > at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> > at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
> > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
> > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
> > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
> > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
> > at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
> > at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
> > at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
> > at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
> > Caused by: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/10.37.129.2:65139 remote=MacBook-Pro-6.local/10.37.129.2:65136]
> > at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
> > at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
> > at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
> > at java.io.FilterInputStream.read(FilterInputStream.java:133)
> > at java.io.FilterInputStream.read(FilterInputStream.java:133)
> > at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
> > at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> > at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
> > at java.io.DataInputStream.readInt(DataInputStream.java:387)
> > at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> > at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
> > 2018-06-18 12:43:25,987 [IPC Server handler 0 on 65136] WARN  ipc.Server processResponse - IPC Server handler 0 on 65136, call log(containerId, timeout), rpc version=2, client version=201208081755, methodsFingerPrint=-1300451462 from 10.37.129.2:65138 Call#142 Retry#0: output error
> > 2018-06-18 12:43:26,603 [main] ERROR stram.RecoverableRpcProxy invoke - Giving up RPC connection recovery after 501 ms
> > java.net.SocketTimeoutException: Call From MacBook-Pro-6.local/10.37.129.2 to MacBook-Pro-6.local:65136 failed on socket timeout exception: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/10.37.129.2:65141 remote=MacBook-Pro-6.local/10.37.129.2:65136]; For more details see: http://wiki.apache.org/hadoop/SocketTimeout
> > at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> > at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> > at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> > at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> > at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> > at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> > at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> > at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> > at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> > at com.sun.proxy.$Proxy138.log(Unknown Source)
> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > at java.lang.reflect.Method.invoke(Method.java:498)
> > at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
> > at com.sun.proxy.$Proxy138.log(Unknown Source)
> > at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:596)
> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > at java.lang.reflect.Method.invoke(Method.java:498)
> > at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> > at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> > at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> > at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> > at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> > at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> > at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> > at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> > at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> > at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> > at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> > at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> > at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> > at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> > at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> > at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> > at org.junit.runners.Suite.runChild(Suite.java:127)
> > at org.junit.runners.Suite.runChild(Suite.java:26)
> > at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> > at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> > at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> > at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> > at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> > at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> > at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
> > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
> > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
> > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
> > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
> > at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
> > at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
> > at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
> > at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
> > Caused by: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/10.37.129.2:65141 remote=MacBook-Pro-6.local/10.37.129.2:65136]
> > at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
> > at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
> > at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
> > at java.io.FilterInputStream.read(FilterInputStream.java:133)
> > at java.io.FilterInputStream.read(FilterInputStream.java:133)
> > at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
> > at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> > at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
> > at java.io.DataInputStream.readInt(DataInputStream.java:387)
> > at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> > at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
> > 2018-06-18 12:43:27,105 [IPC Server handler 0 on 65136] WARN  ipc.Server processResponse - IPC Server handler 0 on 65136, call log(containerId, timeout), rpc version=2, client version=201208081755, methodsFingerPrint=-1300451462 from 10.37.129.2:65141 Call#146 Retry#0: output error
> > 2018-06-18 12:43:27,114 [main] WARN  stram.RecoverableRpcProxy invoke - RPC failure, will retry after 100 ms (remaining 995 ms)
> > java.net.SocketTimeoutException: Call From MacBook-Pro-6.local/10.37.129.2 to MacBook-Pro-6.local:65136 failed on socket timeout exception: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/10.37.129.2:65142 remote=MacBook-Pro-6.local/10.37.129.2:65136]; For more details see: http://wiki.apache.org/hadoop/SocketTimeout
> > at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> > at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> > at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> > at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> > at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> > at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> > at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> > at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> > at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> > at com.sun.proxy.$Proxy138.reportError(Unknown Source)
> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > at java.lang.reflect.Method.invoke(Method.java:498)
> > at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
> > at com.sun.proxy.$Proxy138.reportError(Unknown Source)
> > at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:610)
> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > at java.lang.reflect.Method.invoke(Method.java:498)
> > at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> > at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> > at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> > at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> > at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> > at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> > at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> > at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> > at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> > at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> > at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> > at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> > at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> > at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> > at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> > at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> > at org.junit.runners.Suite.runChild(Suite.java:127)
> > at org.junit.runners.Suite.runChild(Suite.java:26)
> > at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> > at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> > at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> > at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> > at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> > at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> > at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
> > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
> > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
> > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
> > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
> > at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
> > at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
> > at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
> > at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
> > Caused by: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/10.37.129.2:65142 remote=MacBook-Pro-6.local/10.37.129.2:65136]
> > at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
> > at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
> > at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
> > at java.io.FilterInputStream.read(FilterInputStream.java:133)
> > at java.io.FilterInputStream.read(FilterInputStream.java:133)
> > at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
> > at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> > at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
> > at java.io.DataInputStream.readInt(DataInputStream.java:387)
> > at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> > at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
> > 2018-06-18 12:43:27,722 [main] WARN  stram.RecoverableRpcProxy invoke - RPC failure, will retry after 100 ms (remaining 387 ms)
> > java.net.SocketTimeoutException: Call From MacBook-Pro-6.local/10.37.129.2 to MacBook-Pro-6.local:65136 failed on socket timeout exception: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/10.37.129.2:65143 remote=MacBook-Pro-6.local/10.37.129.2:65136]; For more details see: http://wiki.apache.org/hadoop/SocketTimeout
> > at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> > at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> > at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> > at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> > at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> > at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> > at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> > at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> > at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> > at com.sun.proxy.$Proxy138.reportError(Unknown Source)
> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > at java.lang.reflect.Method.invoke(Method.java:498)
> > at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
> > at com.sun.proxy.$Proxy138.reportError(Unknown Source)
> > at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:610)
> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > at java.lang.reflect.Method.invoke(Method.java:498)
> > at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> > at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> > at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> > at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> > at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> > at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> > at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> > at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> > at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> > at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> > at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> > at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> > at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> > at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> > at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> > at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> > at org.junit.runners.Suite.runChild(Suite.java:127)
> > at org.junit.runners.Suite.runChild(Suite.java:26)
> > at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> > at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> > at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> > at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> > at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> > at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> > at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
> > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
> > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
> > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
> > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
> > at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
> > at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
> > at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
> > at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
> > Caused by: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/10.37.129.2:65143 remote=MacBook-Pro-6.local/10.37.129.2:65136]
> > at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
> > at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
> > at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
> > at java.io.FilterInputStream.read(FilterInputStream.java:133)
> > at java.io.FilterInputStream.read(FilterInputStream.java:133)
> > at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
> > at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> > at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
> > at java.io.DataInputStream.readInt(DataInputStream.java:387)
> > at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> > at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
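Stepping back from the test log to the original Kryo question: serializing java.time.Instant only requires recording two fields, the epoch second and the nano-of-second adjustment. A stdlib-only round-trip sketch of that encoding (what a custom Kryo `Serializer<Instant>`'s write/read pair would do with Kryo's `Output`/`Input`; the class and method names here are illustrative, not Kryo API):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.time.Instant;

public class InstantRoundTrip {
    // Encode an Instant as its two defining fields: epoch seconds + nanos.
    static byte[] write(Instant t) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (DataOutputStream out = new DataOutputStream(bos)) {
            out.writeLong(t.getEpochSecond());
            out.writeInt(t.getNano());
        }
        return bos.toByteArray();
    }

    // Rebuild the Instant from the same two fields.
    static Instant read(byte[] bytes) throws IOException {
        try (DataInputStream in = new DataInputStream(new ByteArrayInputStream(bytes))) {
            return Instant.ofEpochSecond(in.readLong(), in.readInt());
        }
    }

    public static void main(String[] args) throws IOException {
        Instant now = Instant.parse("2018-06-18T12:43:24.485Z");
        Instant copy = read(write(now));
        System.out.println(now.equals(copy)); // prints true
    }
}
```

The test failure above appears unrelated to this encoding; it is the RPC failover test timing out, which is worth ruling out as environment-specific before blaming the Kryo bump.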
> > 2018-06-18 12:43:28,109 [IPC Server handler 0 on 65136] WARN  ipc.Server processResponse - IPC Server handler 0 on 65136, call reportError(containerId, null, timeout, null), rpc version=2, client version=201208081755, methodsFingerPrint=-1300451462 from 10.37.129.2:65142 Call#147 Retry#0: output error
> > 2018-06-18 12:43:28,292 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app1/recovery/log
> > 2018-06-18 12:43:28,423 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app1/recovery/log
> > 2018-06-18 12:43:28,491 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app2/recovery/log
> > 2018-06-18 12:43:28,492 [main] INFO  stram.StramClient copyInitialState - Copying initial state took 32 ms
> > 2018-06-18 12:43:28,607 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app2/recovery/log
> > 2018-06-18 12:43:28,671 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app3/recovery/log
> > 2018-06-18 12:43:28,673 [main] INFO  stram.StramClient copyInitialState - Copying initial state took 35 ms
> > 2018-06-18 12:43:28,805 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app3/recovery/log
> > 2018-06-18 12:43:28,830 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container without locality contraint due to insufficient resources.
> > 2018-06-18 12:43:28,830 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container without locality contraint due to insufficient resources.
> > 2018-06-18 12:43:28,830 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container without locality contraint due to insufficient resources.
> > 2018-06-18 12:43:29,046 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container without locality contraint due to insufficient resources.
> > 2018-06-18 12:43:29,046 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container without locality contraint due to insufficient resources.
> > 2018-06-18 12:43:29,047 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container without locality contraint due to insufficient resources.
> > 2018-06-18 12:43:29,226 [main] INFO  util.AsyncFSStorageAgent save - using /Users/mbossert/testIdea/apex-core/engine/target/chkp1927717229509930939 as the basepath for checkpointing.
> > 2018-06-18 12:43:29,339 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app1/recovery/log
> > 2018-06-18 12:43:29,428 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app1/recovery/log
> > 2018-06-18 12:43:29,493 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app2/recovery/log
> > 2018-06-18 12:43:29,494 [main] INFO  stram.StramClient copyInitialState - Copying initial state took 29 ms
> > 2018-06-18 12:43:29,592 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app2/recovery/log
> > 2018-06-18 12:43:29,649 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app3/recovery/log
> > 2018-06-18 12:43:29,651 [main] INFO  stram.StramClient copyInitialState - Copying initial state took 32 ms
> > 2018-06-18 12:43:29,780 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app3/recovery/log
> > 2018-06-18 12:43:29,808 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container without locality contraint due to insufficient resources.
> > 2018-06-18 12:43:29,809 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container without locality contraint due to insufficient resources.
> > 2018-06-18 12:43:29,809 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container without locality contraint due to insufficient resources.
> > 2018-06-18 12:43:29,809 [main] INFO  util.AsyncFSStorageAgent save - using /Users/mbossert/testIdea/apex-core/engine/target/chkp1976097017195725194 as the basepath for checkpointing.
> > 2018-06-18 12:43:30,050 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container without locality contraint due to insufficient resources.
> > 2018-06-18 12:43:30,050 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container without locality contraint due to insufficient resources.
> > 2018-06-18 12:43:30,050 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container without locality contraint due to insufficient resources.
> > 2018-06-18 12:43:30,051 [main] INFO  util.AsyncFSStorageAgent save - using /Users/mbossert/testIdea/apex-core/engine/target/chkp3935270209625805644 as the basepath for checkpointing.
> > Tests run: 8, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 6.329 sec <<< FAILURE! - in com.datatorrent.stram.StramRecoveryTest
> > testWriteAheadLog(com.datatorrent.stram.StramRecoveryTest)  Time elapsed: 0.097 sec  <<< FAILURE!
> > java.lang.AssertionError: flush count expected:<1> but was:<2>
> > at com.datatorrent.stram.StramRecoveryTest.testWriteAheadLog(StramRecoveryTest.java:326)
> >
> >
> > --
> >
> > M. Aaron Bossert
> > (571) 242-4021
> > Punch Cyber Analytics Group

Re: Branch 3.7.0 failing install related to Kryo version...perhaps

Aaron Bossert
Pramod,

Thanks for taking the time to help!

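For context, the Kryo change I had been testing is just the dependency bump in the apex-core pom — roughly the fragment below. This is a sketch, not the exact diff (the real pom may declare the version through a property); note also that Kryo's Maven groupId changed between the 2.x line (com.esotericsoftware.kryo) and 3.x/4.x (com.esotericsoftware), so a version bump alone isn't sufficient:

```xml
<!-- Sketch of the dependency under test; the actual pom layout may differ. -->
<dependency>
  <!-- groupId for Kryo 3.x/4.x; the 2.24.0 artifact used com.esotericsoftware.kryo -->
  <groupId>com.esotericsoftware</groupId>
  <artifactId>kryo</artifactId>
  <version>4.0.2</version>
</dependency>
```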
Here is the output (failing parts only) from running a full build ("clean
install -X") on the master branch:

Running com.datatorrent.stram.StramRecoveryTest
2018-06-19 21:34:28,137 [main] INFO  stram.StramRecoveryTest
testRpcFailover - Mock server listening at macbook-pro-6.lan/
192.168.87.125:62154
2018-06-19 21:34:28,678 [main] ERROR stram.RecoverableRpcProxy invoke -
Giving up RPC connection recovery after 507 ms
java.net.SocketTimeoutException: Call From macbook-pro-6.lan/192.168.87.125
to macbook-pro-6.lan:62154 failed on socket timeout exception:
java.net.SocketTimeoutException: 500 millis timeout while waiting for
channel to be ready for read. ch :
java.nio.channels.SocketChannel[connected local=/192.168.87.125:62155
remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details see:
http://wiki.apache.org/hadoop/SocketTimeout
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
at org.apache.hadoop.ipc.Client.call(Client.java:1472)
at org.apache.hadoop.ipc.Client.call(Client.java:1399)
at
org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
at com.sun.proxy.$Proxy138.log(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at
com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
at com.sun.proxy.$Proxy138.log(Unknown Source)
at
com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:561)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at
org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
at
org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
at
org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
at
org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
at
org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
at org.junit.rules.RunRules.evaluate(RunRules.java:20)
at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
at
org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
at
org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
at org.junit.runners.Suite.runChild(Suite.java:127)
at org.junit.runners.Suite.runChild(Suite.java:26)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
at
org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
at
org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
at
org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
at
org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
at
org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
at
org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
at
org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
Caused by: java.net.SocketTimeoutException: 500 millis timeout while
waiting for channel to be ready for read. ch :
java.nio.channels.SocketChannel[connected local=/192.168.87.125:62155
remote=macbook-pro-6.lan/192.168.87.125:62154]
at
org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
at java.io.FilterInputStream.read(FilterInputStream.java:133)
at java.io.FilterInputStream.read(FilterInputStream.java:133)
at
org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
at java.io.DataInputStream.readInt(DataInputStream.java:387)
at
org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
2018-06-19 21:34:29,178 [IPC Server handler 0 on 62154] WARN  ipc.Server
processResponse - IPC Server handler 0 on 62154, call log(containerId,
timeout), rpc version=2, client version=201208081755,
methodsFingerPrint=-1300451462 from 192.168.87.125:62155 Call#136 Retry#0:
output error
2018-06-19 21:34:29,198 [main] WARN  stram.RecoverableRpcProxy invoke - RPC
failure, will retry after 100 ms (remaining 994 ms)
java.net.SocketTimeoutException: Call From macbook-pro-6.lan/192.168.87.125
to macbook-pro-6.lan:62154 failed on socket timeout exception:
java.net.SocketTimeoutException: 500 millis timeout while waiting for
channel to be ready for read. ch :
java.nio.channels.SocketChannel[connected local=/192.168.87.125:62156
remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details see:
http://wiki.apache.org/hadoop/SocketTimeout
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
at org.apache.hadoop.ipc.Client.call(Client.java:1472)
at org.apache.hadoop.ipc.Client.call(Client.java:1399)
at
org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
at com.sun.proxy.$Proxy138.log(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at
com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
at com.sun.proxy.$Proxy138.log(Unknown Source)
at
com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:575)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at
org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
at
org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
at
org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
at
org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
at
org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
at org.junit.rules.RunRules.evaluate(RunRules.java:20)
at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
at
org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
at
org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
at org.junit.runners.Suite.runChild(Suite.java:127)
at org.junit.runners.Suite.runChild(Suite.java:26)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
at
org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
at
org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
at
org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
at
org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
at
org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
at
org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
at
org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
Caused by: java.net.SocketTimeoutException: 500 millis timeout while
waiting for channel to be ready for read. ch :
java.nio.channels.SocketChannel[connected local=/192.168.87.125:62156
remote=macbook-pro-6.lan/192.168.87.125:62154]
at
org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
at java.io.FilterInputStream.read(FilterInputStream.java:133)
at java.io.FilterInputStream.read(FilterInputStream.java:133)
at
org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
at java.io.DataInputStream.readInt(DataInputStream.java:387)
at
org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
2018-06-19 21:34:29,806 [main] WARN  stram.RecoverableRpcProxy invoke - RPC
failure, will retry after 100 ms (remaining 386 ms)
java.net.SocketTimeoutException: Call From macbook-pro-6.lan/192.168.87.125
to macbook-pro-6.lan:62154 failed on socket timeout exception:
java.net.SocketTimeoutException: 500 millis timeout while waiting for
channel to be ready for read. ch :
java.nio.channels.SocketChannel[connected local=/192.168.87.125:62157
remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details see:
http://wiki.apache.org/hadoop/SocketTimeout
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
at org.apache.hadoop.ipc.Client.call(Client.java:1472)
at org.apache.hadoop.ipc.Client.call(Client.java:1399)
at
org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
at com.sun.proxy.$Proxy138.log(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at
com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
at com.sun.proxy.$Proxy138.log(Unknown Source)
at
com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:575)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at
org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
at
org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
at
org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
at
org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
at
org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
at org.junit.rules.RunRules.evaluate(RunRules.java:20)
at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
at
org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
at
org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
at org.junit.runners.Suite.runChild(Suite.java:127)
at org.junit.runners.Suite.runChild(Suite.java:26)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
at
org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
at
org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
at
org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
at
org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
at
org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
at
org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
at
org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
Caused by: java.net.SocketTimeoutException: 500 millis timeout while
waiting for channel to be ready for read. ch :
java.nio.channels.SocketChannel[connected local=/192.168.87.125:62157
remote=macbook-pro-6.lan/192.168.87.125:62154]
at
org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
at java.io.FilterInputStream.read(FilterInputStream.java:133)
at java.io.FilterInputStream.read(FilterInputStream.java:133)
at
org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
at java.io.DataInputStream.readInt(DataInputStream.java:387)
at
org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
2018-06-19 21:34:30,180 [IPC Server handler 0 on 62154] WARN  ipc.Server
processResponse - IPC Server handler 0 on 62154, call log(containerId,
timeout), rpc version=2, client version=201208081755,
methodsFingerPrint=-1300451462 from 192.168.87.125:62156 Call#137 Retry#0:
output error
2018-06-19 21:34:30,808 [main] ERROR stram.RecoverableRpcProxy invoke -
Giving up RPC connection recovery after 506 ms
java.net.SocketTimeoutException: Call From macbook-pro-6.lan/192.168.87.125
to macbook-pro-6.lan:62154 failed on socket timeout exception:
java.net.SocketTimeoutException: 500 millis timeout while waiting for
channel to be ready for read. ch :
java.nio.channels.SocketChannel[connected local=/192.168.87.125:62159
remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details see:
http://wiki.apache.org/hadoop/SocketTimeout
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
at org.apache.hadoop.ipc.Client.call(Client.java:1472)
at org.apache.hadoop.ipc.Client.call(Client.java:1399)
at
org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
at com.sun.proxy.$Proxy138.log(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at
com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
at com.sun.proxy.$Proxy138.log(Unknown Source)
at
com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:596)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at
org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
at
org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
at
org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
at
org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
at
org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
at org.junit.rules.RunRules.evaluate(RunRules.java:20)
at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
at
org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
at
org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
at org.junit.runners.Suite.runChild(Suite.java:127)
at org.junit.runners.Suite.runChild(Suite.java:26)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
at
org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
at
org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
at
org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
at
org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
at
org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
at
org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
at
org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
Caused by: java.net.SocketTimeoutException: 500 millis timeout while
waiting for channel to be ready for read. ch :
java.nio.channels.SocketChannel[connected local=/192.168.87.125:62159
remote=macbook-pro-6.lan/192.168.87.125:62154]
at
org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
at java.io.FilterInputStream.read(FilterInputStream.java:133)
at java.io.FilterInputStream.read(FilterInputStream.java:133)
at
org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
at java.io.DataInputStream.readInt(DataInputStream.java:387)
at
org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
2018-06-19 21:34:31,307 [IPC Server handler 0 on 62154] WARN  ipc.Server
processResponse - IPC Server handler 0 on 62154, call log(containerId,
timeout), rpc version=2, client version=201208081755,
methodsFingerPrint=-1300451462 from 192.168.87.125:62159 Call#141 Retry#0:
output error
2018-06-19 21:34:31,327 [main] WARN  stram.RecoverableRpcProxy invoke - RPC
failure, will retry after 100 ms (remaining 995 ms)
java.net.SocketTimeoutException: Call From macbook-pro-6.lan/192.168.87.125
to macbook-pro-6.lan:62154 failed on socket timeout exception:
java.net.SocketTimeoutException: 500 millis timeout while waiting for
channel to be ready for read. ch :
java.nio.channels.SocketChannel[connected local=/192.168.87.125:62160
remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details see:
http://wiki.apache.org/hadoop/SocketTimeout
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
at org.apache.hadoop.ipc.Client.call(Client.java:1472)
at org.apache.hadoop.ipc.Client.call(Client.java:1399)
at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
at com.sun.proxy.$Proxy138.reportError(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
at com.sun.proxy.$Proxy138.reportError(Unknown Source)
at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:610)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
at org.junit.rules.RunRules.evaluate(RunRules.java:20)
at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
at org.junit.runners.Suite.runChild(Suite.java:127)
at org.junit.runners.Suite.runChild(Suite.java:26)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
Caused by: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62160 remote=macbook-pro-6.lan/192.168.87.125:62154]
at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
at java.io.FilterInputStream.read(FilterInputStream.java:133)
at java.io.FilterInputStream.read(FilterInputStream.java:133)
at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
at java.io.DataInputStream.readInt(DataInputStream.java:387)
at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
2018-06-19 21:34:31,931 [main] WARN  stram.RecoverableRpcProxy invoke - RPC failure, will retry after 100 ms (remaining 391 ms)
java.net.SocketTimeoutException: Call From macbook-pro-6.lan/192.168.87.125 to macbook-pro-6.lan:62154 failed on socket timeout exception: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62161 remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details see: http://wiki.apache.org/hadoop/SocketTimeout
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
at org.apache.hadoop.ipc.Client.call(Client.java:1472)
at org.apache.hadoop.ipc.Client.call(Client.java:1399)
at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
at com.sun.proxy.$Proxy138.reportError(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
at com.sun.proxy.$Proxy138.reportError(Unknown Source)
at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:610)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
at org.junit.rules.RunRules.evaluate(RunRules.java:20)
at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
at org.junit.runners.Suite.runChild(Suite.java:127)
at org.junit.runners.Suite.runChild(Suite.java:26)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
Caused by: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62161 remote=macbook-pro-6.lan/192.168.87.125:62154]
at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
at java.io.FilterInputStream.read(FilterInputStream.java:133)
at java.io.FilterInputStream.read(FilterInputStream.java:133)
at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
at java.io.DataInputStream.readInt(DataInputStream.java:387)
at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
2018-06-19 21:34:32,310 [IPC Server handler 0 on 62154] WARN  ipc.Server
processResponse - IPC Server handler 0 on 62154, call
reportError(containerId, null, timeout, null), rpc version=2, client
version=201208081755, methodsFingerPrint=-1300451462 from
192.168.87.125:62160 Call#142 Retry#0: output error
2018-06-19 21:34:32,512 [main] INFO  stram.FSRecoveryHandler rotateLog -
Creating
target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app1/recovery/log
2018-06-19 21:34:32,628 [main] INFO  stram.FSRecoveryHandler rotateLog -
Creating
target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app1/recovery/log
2018-06-19 21:34:32,696 [main] INFO  stram.FSRecoveryHandler rotateLog -
Creating
target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app2/recovery/log
2018-06-19 21:34:32,698 [main] INFO  stram.StramClient copyInitialState -
Copying initial state took 32 ms
2018-06-19 21:34:32,799 [main] INFO  stram.FSRecoveryHandler rotateLog -
Creating
target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app2/recovery/log
2018-06-19 21:34:32,850 [main] INFO  stram.FSRecoveryHandler rotateLog -
Creating
target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app3/recovery/log
2018-06-19 21:34:32,851 [main] INFO  stram.StramClient copyInitialState -
Copying initial state took 28 ms
2018-06-19 21:34:32,955 [main] INFO  stram.FSRecoveryHandler rotateLog -
Creating
target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app3/recovery/log
2018-06-19 21:34:32,976 [main] WARN  physical.PhysicalPlan <init> -
Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container without
locality contraint due to insufficient resources.
2018-06-19 21:34:32,977 [main] WARN  physical.PhysicalPlan <init> -
Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container without
locality contraint due to insufficient resources.
2018-06-19 21:34:32,977 [main] WARN  physical.PhysicalPlan <init> -
Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container without
locality contraint due to insufficient resources.
2018-06-19 21:34:33,166 [main] WARN  physical.PhysicalPlan <init> -
Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container without
locality contraint due to insufficient resources.
2018-06-19 21:34:33,166 [main] WARN  physical.PhysicalPlan <init> -
Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container without
locality contraint due to insufficient resources.
2018-06-19 21:34:33,166 [main] WARN  physical.PhysicalPlan <init> -
Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container without
locality contraint due to insufficient resources.
2018-06-19 21:34:33,338 [main] INFO  util.AsyncFSStorageAgent save - using
/Users/mbossert/testIdea/apex-core/engine/target/chkp2603930902590449397 as
the basepath for checkpointing.
2018-06-19 21:34:33,436 [main] INFO  stram.FSRecoveryHandler rotateLog -
Creating
target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app1/recovery/log
2018-06-19 21:34:33,505 [main] INFO  stram.FSRecoveryHandler rotateLog -
Creating
target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app1/recovery/log
2018-06-19 21:34:33,553 [main] INFO  stram.FSRecoveryHandler rotateLog -
Creating
target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app2/recovery/log
2018-06-19 21:34:33,554 [main] INFO  stram.StramClient copyInitialState -
Copying initial state took 22 ms
2018-06-19 21:34:33,642 [main] INFO  stram.FSRecoveryHandler rotateLog -
Creating
target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app2/recovery/log
2018-06-19 21:34:33,690 [main] INFO  stram.FSRecoveryHandler rotateLog -
Creating
target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app3/recovery/log
2018-06-19 21:34:33,691 [main] INFO  stram.StramClient copyInitialState -
Copying initial state took 29 ms
2018-06-19 21:34:33,805 [main] INFO  stram.FSRecoveryHandler rotateLog -
Creating
target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app3/recovery/log
2018-06-19 21:34:33,830 [main] WARN  physical.PhysicalPlan <init> -
Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container without
locality contraint due to insufficient resources.
2018-06-19 21:34:33,830 [main] WARN  physical.PhysicalPlan <init> -
Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container without
locality contraint due to insufficient resources.
2018-06-19 21:34:33,831 [main] WARN  physical.PhysicalPlan <init> -
Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container without
locality contraint due to insufficient resources.
2018-06-19 21:34:33,831 [main] INFO  util.AsyncFSStorageAgent save - using
/Users/mbossert/testIdea/apex-core/engine/target/chkp1878353095301008843 as
the basepath for checkpointing.
2018-06-19 21:34:34,077 [main] WARN  physical.PhysicalPlan <init> -
Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container without
locality contraint due to insufficient resources.
2018-06-19 21:34:34,077 [main] WARN  physical.PhysicalPlan <init> -
Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container without
locality contraint due to insufficient resources.
2018-06-19 21:34:34,077 [main] WARN  physical.PhysicalPlan <init> -
Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container without
locality contraint due to insufficient resources.
2018-06-19 21:34:34,077 [main] INFO  util.AsyncFSStorageAgent save - using
/Users/mbossert/testIdea/apex-core/engine/target/chkp7337975615972280003 as
the basepath for checkpointing.
Tests run: 8, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 6.143 sec <<< FAILURE! - in com.datatorrent.stram.StramRecoveryTest
testWriteAheadLog(com.datatorrent.stram.StramRecoveryTest)  Time elapsed: 0.111 sec  <<< FAILURE!
java.lang.AssertionError: flush count expected:<1> but was:<2>
at com.datatorrent.stram.StramRecoveryTest.testWriteAheadLog(StramRecoveryTest.java:326)
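
For anyone trying to reproduce this run, the change under test is just the Kryo dependency bump from 2.24.0 to 4.0.2. One detail worth double-checking: Kryo's Maven groupId changed between those major versions (2.x was published under `com.esotericsoftware.kryo`, 3.x and later under `com.esotericsoftware`), so a plain version edit will not resolve. A minimal sketch of the POM change follows; where exactly apex-core declares Kryo (parent POM vs. engine POM) is not shown in this thread and is an assumption:

```xml
<!-- Before: Kryo 2.x coordinates -->
<!--
<dependency>
  <groupId>com.esotericsoftware.kryo</groupId>
  <artifactId>kryo</artifactId>
  <version>2.24.0</version>
</dependency>
-->
<!-- After: Kryo 4.x coordinates (note the shorter groupId) -->
<dependency>
  <groupId>com.esotericsoftware</groupId>
  <artifactId>kryo</artifactId>
  <version>4.0.2</version>
</dependency>
```

Kryo 4.x also changed serialization defaults relative to 2.x, so state written by one version is not guaranteed readable by the other; whether that explains the flush-count assertion above, or whether the failure is environmental like the RPC timeouts, is the open question here.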


Running com.datatorrent.stram.CustomControlTupleTest
2018-06-19 21:34:49,308 [main] INFO  util.AsyncFSStorageAgent save - using
/Users/mbossert/testIdea/apex-core/engine/target/chkp1213673348429546877 as
the basepath for checkpointing.
2018-06-19 21:34:49,451 [main] INFO  storage.DiskStorage <init> - using
/Users/mbossert/testIdea/apex-core/engine/target as the basepath for
spooling.
2018-06-19 21:34:49,451 [ProcessWideEventLoop] INFO  server.Server
registered - Server started listening at /0:0:0:0:0:0:0:0:62181
2018-06-19 21:34:49,451 [main] INFO  stram.StramLocalCluster run - Buffer
server started: localhost:62181
2018-06-19 21:34:49,452 [container-0] INFO  stram.StramLocalCluster run -
Started container container-0
2018-06-19 21:34:49,452 [container-1] INFO  stram.StramLocalCluster run -
Started container container-1
2018-06-19 21:34:49,452 [container-2] INFO  stram.StramLocalCluster run -
Started container container-2
2018-06-19 21:34:49,452 [container-1] INFO  stram.StramLocalCluster log -
container-1 msg: [container-1] Entering heartbeat loop..
2018-06-19 21:34:49,452 [container-0] INFO  stram.StramLocalCluster log -
container-0 msg: [container-0] Entering heartbeat loop..
2018-06-19 21:34:49,452 [container-2] INFO  stram.StramLocalCluster log -
container-2 msg: [container-2] Entering heartbeat loop..
2018-06-19 21:34:50,460 [container-2] INFO  engine.StreamingContainer
processHeartbeatResponse - Deploy request:
[OperatorDeployInfo[id=3,name=receiver,type=GENERIC,checkpoint={ffffffffffffffff,
0,
0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=ProcessorToReceiver,sourceNodeId=2,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[]]]
2018-06-19 21:34:50,460 [container-0] INFO  engine.StreamingContainer
processHeartbeatResponse - Deploy request:
[OperatorDeployInfo[id=1,name=randomGenerator,type=INPUT,checkpoint={ffffffffffffffff,
0,
0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=genToProcessor,bufferServer=localhost]]]]
2018-06-19 21:34:50,460 [container-1] INFO  engine.StreamingContainer
processHeartbeatResponse - Deploy request:
[OperatorDeployInfo[id=2,name=process,type=GENERIC,checkpoint={ffffffffffffffff,
0,
0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=localhost]]]]
2018-06-19 21:34:50,463 [container-0] INFO  engine.WindowGenerator activate
- Catching up from 1529458489500 to 1529458490463
2018-06-19 21:34:50,465 [ProcessWideEventLoop] INFO  server.Server
onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
identifier=tcp://localhost:62181/2.output.1, windowId=ffffffffffffffff,
type=ProcessorToReceiver/3.input, upstreamIdentifier=2.output.1, mask=0,
partitions=null, bufferSize=1024}
2018-06-19 21:34:50,466 [ProcessWideEventLoop] INFO  server.Server
onMessage - Received publisher request: PublishRequestTuple{version=1.0,
identifier=1.out.1, windowId=ffffffffffffffff}
2018-06-19 21:34:50,466 [ProcessWideEventLoop] INFO  server.Server
onMessage - Received publisher request: PublishRequestTuple{version=1.0,
identifier=2.output.1, windowId=ffffffffffffffff}
2018-06-19 21:34:50,466 [ProcessWideEventLoop] INFO  server.Server
onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
identifier=tcp://localhost:62181/1.out.1, windowId=ffffffffffffffff,
type=genToProcessor/2.input, upstreamIdentifier=1.out.1, mask=0,
partitions=null, bufferSize=1024}
2018-06-19 21:34:51,458 [main] INFO  stram.StramLocalCluster run - Stopping
on exit condition
2018-06-19 21:34:51,458 [container-0] INFO  engine.StreamingContainer
processHeartbeatResponse - Received shutdown request type ABORT
2018-06-19 21:34:51,458 [container-1] INFO  engine.StreamingContainer
processHeartbeatResponse - Received shutdown request type ABORT
2018-06-19 21:34:51,458 [container-0] INFO  stram.StramLocalCluster log -
container-0 msg: [container-0] Exiting heartbeat loop..
2018-06-19 21:34:51,458 [container-2] INFO  engine.StreamingContainer
processHeartbeatResponse - Received shutdown request type ABORT
2018-06-19 21:34:51,458 [container-2] INFO  stram.StramLocalCluster log -
container-2 msg: [container-2] Exiting heartbeat loop..
2018-06-19 21:34:51,458 [container-1] INFO  stram.StramLocalCluster log -
container-1 msg: [container-1] Exiting heartbeat loop..
2018-06-19 21:34:51,461 [container-2] INFO  stram.StramLocalCluster run -
Container container-2 terminating.
2018-06-19 21:34:51,467 [container-1] INFO  stram.StramLocalCluster run -
Container container-1 terminating.
2018-06-19 21:34:51,467 [container-0] INFO  stram.StramLocalCluster run -
Container container-0 terminating.
2018-06-19 21:34:51,467 [ServerHelper-86-1] INFO  server.Server run -
Removing ln LogicalNode@7d88b4a4identifier=tcp://localhost:62181/2.output.1,
upstream=2.output.1, group=ProcessorToReceiver/3.input, partitions=[],
iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@35d66f18
{da=com.datatorrent.bufferserver.internal.DataList$Block@d43c092{identifier=2.output.1,
data=1048576, readingOffset=0, writingOffset=481,
starting_window=5b29af3900000001, ending_window=5b29af3900000005,
refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
DataList@4dca4fb0[identifier=2.output.1]
2018-06-19 21:34:51,468 [ServerHelper-86-1] INFO  server.Server run -
Removing ln LogicalNode@3cb5be9fidentifier=tcp://localhost:62181/1.out.1,
upstream=1.out.1, group=genToProcessor/2.input, partitions=[],
iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@5c9a41d0
{da=com.datatorrent.bufferserver.internal.DataList$Block@5a324bf4{identifier=1.out.1,
data=1048576, readingOffset=0, writingOffset=481,
starting_window=5b29af3900000001, ending_window=5b29af3900000005,
refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
DataList@49665770[identifier=1.out.1]
2018-06-19 21:34:51,469 [ProcessWideEventLoop] INFO  server.Server run -
Server stopped listening at /0:0:0:0:0:0:0:0:62181
2018-06-19 21:34:51,469 [main] INFO  stram.StramLocalCluster run -
Application finished.
2018-06-19 21:34:51,469 [main] INFO  stram.CustomControlTupleTest testApp -
Control Tuples received 3 expected 3
2018-06-19 21:34:51,492 [main] INFO  util.AsyncFSStorageAgent save - using
/Users/mbossert/testIdea/apex-core/engine/target/chkp5496551078484285394 as
the basepath for checkpointing.
2018-06-19 21:34:51,623 [main] INFO  storage.DiskStorage <init> - using
/Users/mbossert/testIdea/apex-core/engine/target as the basepath for
spooling.
2018-06-19 21:34:51,624 [ProcessWideEventLoop] INFO  server.Server
registered - Server started listening at /0:0:0:0:0:0:0:0:62186
2018-06-19 21:34:51,624 [main] INFO  stram.StramLocalCluster run - Buffer
server started: localhost:62186
2018-06-19 21:34:51,624 [container-0] INFO  stram.StramLocalCluster run -
Started container container-0
2018-06-19 21:34:51,624 [container-0] INFO  stram.StramLocalCluster log -
container-0 msg: [container-0] Entering heartbeat loop..
2018-06-19 21:34:52,628 [container-0] INFO  engine.StreamingContainer
processHeartbeatResponse - Deploy request:
[OperatorDeployInfo[id=1,name=randomGenerator,type=INPUT,checkpoint={ffffffffffffffff,
0,
0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=genToProcessor,bufferServer=<null>]]],
OperatorDeployInfo[id=2,name=process,type=OIO,checkpoint={ffffffffffffffff,
0,
0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=THREAD_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=<null>]]],
OperatorDeployInfo[id=3,name=receiver,type=OIO,checkpoint={ffffffffffffffff,
0,
0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=ProcessorToReceiver,sourceNodeId=2,sourcePortName=output,locality=THREAD_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[]]]
2018-06-19 21:34:52,630 [container-0] INFO  engine.WindowGenerator activate
- Catching up from 1529458491500 to 1529458492630
2018-06-19 21:34:53,628 [main] INFO  stram.StramLocalCluster run - Stopping
on exit condition
2018-06-19 21:34:53,629 [container-0] INFO  engine.StreamingContainer
processHeartbeatResponse - Received shutdown request type ABORT
2018-06-19 21:34:53,630 [container-0] INFO  stram.StramLocalCluster log -
container-0 msg: [container-0] Exiting heartbeat loop..
2018-06-19 21:34:53,640 [container-0] INFO  stram.StramLocalCluster run -
Container container-0 terminating.
2018-06-19 21:34:53,641 [ProcessWideEventLoop] INFO  server.Server run -
Server stopped listening at /0:0:0:0:0:0:0:0:62186
2018-06-19 21:34:53,642 [main] INFO  stram.StramLocalCluster run -
Application finished.
2018-06-19 21:34:53,642 [main] INFO  stram.CustomControlTupleTest testApp -
Control Tuples received 3 expected 3
2018-06-19 21:34:53,659 [main] INFO  util.AsyncFSStorageAgent save - using
/Users/mbossert/testIdea/apex-core/engine/target/chkp2212795894390935125 as
the basepath for checkpointing.
2018-06-19 21:34:53,844 [main] INFO  storage.DiskStorage <init> - using
/Users/mbossert/testIdea/apex-core/engine/target as the basepath for
spooling.
2018-06-19 21:34:53,844 [ProcessWideEventLoop] INFO  server.Server
registered - Server started listening at /0:0:0:0:0:0:0:0:62187
2018-06-19 21:34:53,844 [main] INFO  stram.StramLocalCluster run - Buffer
server started: localhost:62187
2018-06-19 21:34:53,845 [container-0] INFO  stram.StramLocalCluster run -
Started container container-0
2018-06-19 21:34:53,845 [container-1] INFO  stram.StramLocalCluster run -
Started container container-1
2018-06-19 21:34:53,845 [container-0] INFO  stram.StramLocalCluster log -
container-0 msg: [container-0] Entering heartbeat loop..
2018-06-19 21:34:53,845 [container-2] INFO  stram.StramLocalCluster run -
Started container container-2
2018-06-19 21:34:53,845 [container-1] INFO  stram.StramLocalCluster log -
container-1 msg: [container-1] Entering heartbeat loop..
2018-06-19 21:34:53,845 [container-3] INFO  stram.StramLocalCluster run -
Started container container-3
2018-06-19 21:34:53,845 [container-2] INFO  stram.StramLocalCluster log -
container-2 msg: [container-2] Entering heartbeat loop..
2018-06-19 21:34:53,845 [container-3] INFO  stram.StramLocalCluster log -
container-3 msg: [container-3] Entering heartbeat loop..
2018-06-19 21:34:54,850 [container-3] INFO  engine.StreamingContainer
processHeartbeatResponse - Deploy request:
[OperatorDeployInfo[id=1,name=randomGenerator,type=INPUT,checkpoint={ffffffffffffffff,
0,
0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=genToProcessor,bufferServer=localhost]]]]
2018-06-19 21:34:54,850 [container-1] INFO  engine.StreamingContainer
processHeartbeatResponse - Deploy request:
[OperatorDeployInfo[id=3,name=process,type=GENERIC,checkpoint={ffffffffffffffff,
0,
0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=<null>,partitionMask=1,partitionKeys=[1]]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=localhost]]]]
2018-06-19 21:34:54,850 [container-0] INFO  engine.StreamingContainer
processHeartbeatResponse - Deploy request:
[OperatorDeployInfo[id=4,name=receiver,type=GENERIC,checkpoint={ffffffffffffffff,
0,
0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=ProcessorToReceiver,sourceNodeId=5,sourcePortName=outputPort,locality=CONTAINER_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[]],
OperatorDeployInfo.UnifierDeployInfo[id=5,name=process.output#unifier,type=UNIFIER,checkpoint={ffffffffffffffff,
0,
0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=<merge#output>,streamId=ProcessorToReceiver,sourceNodeId=2,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>],
OperatorDeployInfo.InputDeployInfo[portName=<merge#output>,streamId=ProcessorToReceiver,sourceNodeId=3,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=outputPort,streamId=ProcessorToReceiver,bufferServer=<null>]]]]
2018-06-19 21:34:54,850 [container-2] INFO  engine.StreamingContainer
processHeartbeatResponse - Deploy request:
[OperatorDeployInfo[id=2,name=process,type=GENERIC,checkpoint={ffffffffffffffff,
0,
0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=<null>,partitionMask=1,partitionKeys=[0]]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=localhost]]]]
2018-06-19 21:34:54,852 [container-3] INFO  engine.WindowGenerator activate
- Catching up from 1529458493500 to 1529458494852
2018-06-19 21:34:54,855 [ProcessWideEventLoop] INFO  server.Server
onMessage - Received publisher request: PublishRequestTuple{version=1.0,
identifier=1.out.1, windowId=ffffffffffffffff}
2018-06-19 21:34:54,857 [ProcessWideEventLoop] INFO  server.Server
onMessage - Received publisher request: PublishRequestTuple{version=1.0,
identifier=2.output.1, windowId=ffffffffffffffff}
2018-06-19 21:34:54,858 [ProcessWideEventLoop] INFO  server.Server
onMessage - Received publisher request: PublishRequestTuple{version=1.0,
identifier=3.output.1, windowId=ffffffffffffffff}
2018-06-19 21:34:54,858 [ProcessWideEventLoop] INFO  server.Server
onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
identifier=tcp://localhost:62187/1.out.1, windowId=ffffffffffffffff,
type=genToProcessor/3.input, upstreamIdentifier=1.out.1, mask=1,
partitions=[1], bufferSize=1024}
2018-06-19 21:34:54,858 [ProcessWideEventLoop] INFO  server.Server
onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
identifier=tcp://localhost:62187/1.out.1, windowId=ffffffffffffffff,
type=genToProcessor/2.input, upstreamIdentifier=1.out.1, mask=1,
partitions=[0], bufferSize=1024}
2018-06-19 21:34:54,858 [ProcessWideEventLoop] INFO  server.Server
onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
identifier=tcp://localhost:62187/3.output.1, windowId=ffffffffffffffff,
type=ProcessorToReceiver/5.<merge#output>(3.output),
upstreamIdentifier=3.output.1, mask=0, partitions=null, bufferSize=1024}
2018-06-19 21:34:54,859 [ProcessWideEventLoop] INFO  server.Server
onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
identifier=tcp://localhost:62187/2.output.1, windowId=ffffffffffffffff,
type=ProcessorToReceiver/5.<merge#output>(2.output),
upstreamIdentifier=2.output.1, mask=0, partitions=null, bufferSize=1024}
2018-06-19 21:34:55,851 [main] INFO  stram.StramLocalCluster run - Stopping
on exit condition
2018-06-19 21:34:55,852 [container-2] INFO  engine.StreamingContainer
processHeartbeatResponse - Received shutdown request type ABORT
2018-06-19 21:34:55,852 [container-3] INFO  engine.StreamingContainer
processHeartbeatResponse - Received shutdown request type ABORT
2018-06-19 21:34:55,852 [container-3] INFO  stram.StramLocalCluster log -
container-3 msg: [container-3] Exiting heartbeat loop..
2018-06-19 21:34:55,852 [container-0] INFO  engine.StreamingContainer
processHeartbeatResponse - Received shutdown request type ABORT
2018-06-19 21:34:55,852 [container-2] INFO  stram.StramLocalCluster log -
container-2 msg: [container-2] Exiting heartbeat loop..
2018-06-19 21:34:55,852 [container-1] INFO  engine.StreamingContainer
processHeartbeatResponse - Received shutdown request type ABORT
2018-06-19 21:34:55,852 [container-0] INFO  stram.StramLocalCluster log -
container-0 msg: [container-0] Exiting heartbeat loop..
2018-06-19 21:34:55,852 [container-1] INFO  stram.StramLocalCluster log -
container-1 msg: [container-1] Exiting heartbeat loop..
2018-06-19 21:34:55,857 [container-1] INFO  stram.StramLocalCluster run -
Container container-1 terminating.
2018-06-19 21:34:55,858 [container-3] INFO  stram.StramLocalCluster run -
Container container-3 terminating.
2018-06-19 21:34:55,858 [ServerHelper-92-1] INFO  server.Server run -
Removing ln LogicalNode@5dbf681cidentifier=tcp://localhost:62187/3.output.1,
upstream=3.output.1, group=ProcessorToReceiver/5.<merge#output>(3.output),
partitions=[],
iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@6244ac9
{da=com.datatorrent.bufferserver.internal.DataList$Block@60e28815{identifier=3.output.1,
data=1048576, readingOffset=0, writingOffset=487,
starting_window=5b29af3d00000001, ending_window=5b29af3d00000006,
refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
DataList@46bbe39d[identifier=3.output.1]
2018-06-19 21:34:55,858 [ServerHelper-92-1] INFO  server.Server run -
Removing ln LogicalNode@7fb3226aidentifier=tcp://localhost:62187/1.out.1,
upstream=1.out.1, group=genToProcessor/2.input,
partitions=[BitVector{mask=1, bits=0}],
iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@2ad6890f
{da=com.datatorrent.bufferserver.internal.DataList$Block@e00fc9e{identifier=1.out.1,
data=1048576, readingOffset=0, writingOffset=487,
starting_window=5b29af3d00000001, ending_window=5b29af3d00000006,
refCount=3, uniqueIdentifier=0, next=null, future=null}}} from dl
DataList@7a566f6b[identifier=1.out.1]
2018-06-19 21:34:55,858 [ServerHelper-92-1] INFO  server.Server run -
Removing ln LogicalNode@2551b8a4identifier=tcp://localhost:62187/1.out.1,
upstream=1.out.1, group=genToProcessor/3.input,
partitions=[BitVector{mask=1, bits=1}],
iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@6368ccb7
{da=com.datatorrent.bufferserver.internal.DataList$Block@e00fc9e{identifier=1.out.1,
data=1048576, readingOffset=0, writingOffset=487,
starting_window=5b29af3d00000001, ending_window=5b29af3d00000006,
refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
DataList@7a566f6b[identifier=1.out.1]
2018-06-19 21:34:55,862 [container-2] INFO  stram.StramLocalCluster run -
Container container-2 terminating.
2018-06-19 21:34:55,862 [ServerHelper-92-1] INFO  server.Server run -
Removing ln LogicalNode@2e985326identifier=tcp://localhost:62187/2.output.1,
upstream=2.output.1, group=ProcessorToReceiver/5.<merge#output>(2.output),
partitions=[],
iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@7d68bf24
{da=com.datatorrent.bufferserver.internal.DataList$Block@7405581b{identifier=2.output.1,
data=1048576, readingOffset=0, writingOffset=487,
starting_window=5b29af3d00000001, ending_window=5b29af3d00000006,
refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
DataList@3de15cc7[identifier=2.output.1]
2018-06-19 21:34:55,862 [container-0] INFO  stram.StramLocalCluster run -
Container container-0 terminating.
2018-06-19 21:34:55,864 [ProcessWideEventLoop] INFO  server.Server run -
Server stopped listening at /0:0:0:0:0:0:0:0:62187
2018-06-19 21:34:55,864 [main] INFO  stram.StramLocalCluster run -
Application finished.
2018-06-19 21:34:55,864 [main] INFO  stram.CustomControlTupleTest testApp -
Control Tuples received 3 expected 3
2018-06-19 21:34:55,883 [main] INFO  util.AsyncFSStorageAgent save - using
/Users/mbossert/testIdea/apex-core/engine/target/chkp8804999206923662400 as
the basepath for checkpointing.
2018-06-19 21:34:56,032 [main] INFO  storage.DiskStorage <init> - using
/Users/mbossert/testIdea/apex-core/engine/target as the basepath for
spooling.
2018-06-19 21:34:56,032 [ProcessWideEventLoop] INFO  server.Server
registered - Server started listening at /0:0:0:0:0:0:0:0:62195
2018-06-19 21:34:56,032 [main] INFO  stram.StramLocalCluster run - Buffer
server started: localhost:62195
2018-06-19 21:34:56,033 [container-0] INFO  stram.StramLocalCluster run -
Started container container-0
2018-06-19 21:34:56,033 [container-0] INFO  stram.StramLocalCluster log -
container-0 msg: [container-0] Entering heartbeat loop..
2018-06-19 21:34:57,038 [container-0] INFO  engine.StreamingContainer
processHeartbeatResponse - Deploy request:
[OperatorDeployInfo[id=2,name=process,type=GENERIC,checkpoint={ffffffffffffffff,
0,
0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=CONTAINER_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=<null>]]],
OperatorDeployInfo[id=3,name=receiver,type=GENERIC,checkpoint={ffffffffffffffff,
0,
0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=ProcessorToReceiver,sourceNodeId=2,sourcePortName=output,locality=CONTAINER_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[]],
OperatorDeployInfo[id=1,name=randomGenerator,type=INPUT,checkpoint={ffffffffffffffff,
0,
0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=genToProcessor,bufferServer=<null>]]]]
2018-06-19 21:34:57,040 [container-0] INFO  engine.WindowGenerator activate
- Catching up from 1529458495500 to 1529458497040
2018-06-19 21:34:58,042 [main] INFO  stram.StramLocalCluster run - Stopping
on exit condition
2018-06-19 21:34:59,045 [main] WARN  stram.StramLocalCluster run -
Container thread container-0 is still alive
2018-06-19 21:34:59,047 [ProcessWideEventLoop] INFO  server.Server run -
Server stopped listening at /0:0:0:0:0:0:0:0:62195
2018-06-19 21:34:59,047 [container-0] INFO  engine.StreamingContainer
processHeartbeatResponse - Received shutdown request type ABORT
2018-06-19 21:34:59,047 [main] INFO  stram.StramLocalCluster run -
Application finished.
2018-06-19 21:34:59,047 [main] INFO  stram.CustomControlTupleTest testApp -
Control Tuples received 4 expected 4
2018-06-19 21:34:59,047 [container-0] INFO  stram.StramLocalCluster log -
container-0 msg: [container-0] Exiting heartbeat loop..
2018-06-19 21:34:59,057 [container-0] INFO  stram.StramLocalCluster run -
Container container-0 terminating.
2018-06-19 21:34:59,064 [main] INFO  util.AsyncFSStorageAgent save - using
/Users/mbossert/testIdea/apex-core/engine/target/chkp4046668014410536641 as
the basepath for checkpointing.
2018-06-19 21:34:59,264 [main] INFO  storage.DiskStorage <init> - using
/Users/mbossert/testIdea/apex-core/engine/target as the basepath for
spooling.
2018-06-19 21:34:59,264 [ProcessWideEventLoop] INFO  server.Server
registered - Server started listening at /0:0:0:0:0:0:0:0:62196
2018-06-19 21:34:59,265 [main] INFO  stram.StramLocalCluster run - Buffer
server started: localhost:62196
2018-06-19 21:34:59,265 [container-0] INFO  stram.StramLocalCluster run -
Started container container-0
2018-06-19 21:34:59,265 [container-0] INFO  stram.StramLocalCluster log -
container-0 msg: [container-0] Entering heartbeat loop..
2018-06-19 21:34:59,265 [container-1] INFO  stram.StramLocalCluster run -
Started container container-1
2018-06-19 21:34:59,265 [container-2] INFO  stram.StramLocalCluster run -
Started container container-2
2018-06-19 21:34:59,266 [container-1] INFO  stram.StramLocalCluster log -
container-1 msg: [container-1] Entering heartbeat loop..
2018-06-19 21:34:59,266 [container-3] INFO  stram.StramLocalCluster run -
Started container container-3
2018-06-19 21:34:59,266 [container-2] INFO  stram.StramLocalCluster log -
container-2 msg: [container-2] Entering heartbeat loop..
2018-06-19 21:34:59,266 [container-3] INFO  stram.StramLocalCluster log -
container-3 msg: [container-3] Entering heartbeat loop..
2018-06-19 21:35:00,270 [container-0] INFO  engine.StreamingContainer
processHeartbeatResponse - Deploy request:
[OperatorDeployInfo[id=1,name=randomGenerator,type=INPUT,checkpoint={ffffffffffffffff,
0,
0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=genToProcessor,bufferServer=localhost]]]]
2018-06-19 21:35:00,270 [container-2] INFO  engine.StreamingContainer
processHeartbeatResponse - Deploy request:
[OperatorDeployInfo[id=2,name=process,type=GENERIC,checkpoint={ffffffffffffffff,
0,
0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=<null>,partitionMask=1,partitionKeys=[0]]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=localhost]]]]
2018-06-19 21:35:00,271 [container-1] INFO  engine.StreamingContainer
processHeartbeatResponse - Deploy request:
[OperatorDeployInfo[id=4,name=receiver,type=GENERIC,checkpoint={ffffffffffffffff,
0,
0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=ProcessorToReceiver,sourceNodeId=5,sourcePortName=outputPort,locality=CONTAINER_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[]],
OperatorDeployInfo.UnifierDeployInfo[id=5,name=process.output#unifier,type=UNIFIER,checkpoint={ffffffffffffffff,
0,
0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=<merge#output>,streamId=ProcessorToReceiver,sourceNodeId=2,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>],
OperatorDeployInfo.InputDeployInfo[portName=<merge#output>,streamId=ProcessorToReceiver,sourceNodeId=3,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=outputPort,streamId=ProcessorToReceiver,bufferServer=<null>]]]]
2018-06-19 21:35:00,270 [container-3] INFO  engine.StreamingContainer
processHeartbeatResponse - Deploy request:
[OperatorDeployInfo[id=3,name=process,type=GENERIC,checkpoint={ffffffffffffffff,
0,
0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=<null>,partitionMask=1,partitionKeys=[1]]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=localhost]]]]
2018-06-19 21:35:00,273 [container-0] INFO  engine.WindowGenerator activate
- Catching up from 1529458499500 to 1529458500273
2018-06-19 21:35:00,274 [ProcessWideEventLoop] INFO  server.Server
onMessage - Received publisher request: PublishRequestTuple{version=1.0,
identifier=1.out.1, windowId=ffffffffffffffff}
2018-06-19 21:35:00,276 [ProcessWideEventLoop] INFO  server.Server
onMessage - Received publisher request: PublishRequestTuple{version=1.0,
identifier=3.output.1, windowId=ffffffffffffffff}
2018-06-19 21:35:00,277 [ProcessWideEventLoop] INFO  server.Server
onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
identifier=tcp://localhost:62196/1.out.1, windowId=ffffffffffffffff,
type=genToProcessor/2.input, upstreamIdentifier=1.out.1, mask=1,
partitions=[0], bufferSize=1024}
2018-06-19 21:35:00,278 [ProcessWideEventLoop] INFO  server.Server
onMessage - Received publisher request: PublishRequestTuple{version=1.0,
identifier=2.output.1, windowId=ffffffffffffffff}
2018-06-19 21:35:00,278 [ProcessWideEventLoop] INFO  server.Server
onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
identifier=tcp://localhost:62196/1.out.1, windowId=ffffffffffffffff,
type=genToProcessor/3.input, upstreamIdentifier=1.out.1, mask=1,
partitions=[1], bufferSize=1024}
2018-06-19 21:35:00,278 [ProcessWideEventLoop] INFO  server.Server
onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
identifier=tcp://localhost:62196/3.output.1, windowId=ffffffffffffffff,
type=ProcessorToReceiver/5.<merge#output>(3.output),
upstreamIdentifier=3.output.1, mask=0, partitions=null, bufferSize=1024}
2018-06-19 21:35:00,278 [ProcessWideEventLoop] INFO  server.Server
onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
identifier=tcp://localhost:62196/2.output.1, windowId=ffffffffffffffff,
type=ProcessorToReceiver/5.<merge#output>(2.output),
upstreamIdentifier=2.output.1, mask=0, partitions=null, bufferSize=1024}
2018-06-19 21:35:01,273 [main] INFO  stram.StramLocalCluster run - Stopping
on exit condition
2018-06-19 21:35:01,273 [container-3] INFO  engine.StreamingContainer
processHeartbeatResponse - Received shutdown request type ABORT
2018-06-19 21:35:01,273 [container-2] INFO  engine.StreamingContainer
processHeartbeatResponse - Received shutdown request type ABORT
2018-06-19 21:35:01,273 [container-2] INFO  stram.StramLocalCluster log -
container-2 msg: [container-2] Exiting heartbeat loop..
2018-06-19 21:35:01,273 [container-0] INFO  engine.StreamingContainer
processHeartbeatResponse - Received shutdown request type ABORT
2018-06-19 21:35:01,274 [container-0] INFO  stram.StramLocalCluster log -
container-0 msg: [container-0] Exiting heartbeat loop..
2018-06-19 21:35:01,273 [container-3] INFO  stram.StramLocalCluster log -
container-3 msg: [container-3] Exiting heartbeat loop..
2018-06-19 21:35:01,273 [container-1] INFO  engine.StreamingContainer
processHeartbeatResponse - Received shutdown request type ABORT
2018-06-19 21:35:01,274 [container-1] INFO  stram.StramLocalCluster log -
container-1 msg: [container-1] Exiting heartbeat loop..
2018-06-19 21:35:01,279 [container-3] INFO  stram.StramLocalCluster run -
Container container-3 terminating.
2018-06-19 21:35:01,279 [ServerHelper-98-1] INFO  server.Server run -
Removing ln LogicalNode@d80a435identifier=tcp://localhost:62196/3.output.1,
upstream=3.output.1, group=ProcessorToReceiver/5.<merge#output>(3.output),
partitions=[],
iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@2f53b5e7
{da=com.datatorrent.bufferserver.internal.DataList$Block@1e2d4212{identifier=3.output.1,
data=1048576, readingOffset=0, writingOffset=36,
starting_window=5b29af4300000001, ending_window=5b29af4300000005,
refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
DataList@1684ecee[identifier=3.output.1]
2018-06-19 21:35:01,285 [container-2] INFO  stram.StramLocalCluster run -
Container container-2 terminating.
2018-06-19 21:35:01,285 [container-1] INFO  stram.StramLocalCluster run -
Container container-1 terminating.
2018-06-19 21:35:01,286 [container-0] INFO  stram.StramLocalCluster run -
Container container-0 terminating.
2018-06-19 21:35:01,286 [ServerHelper-98-1] INFO  server.Server run -
Removing ln LogicalNode@75d245a1identifier=tcp://localhost:62196/2.output.1,
upstream=2.output.1, group=ProcessorToReceiver/5.<merge#output>(2.output),
partitions=[],
iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@719c5cd2
{da=com.datatorrent.bufferserver.internal.DataList$Block@43d2338b{identifier=2.output.1,
data=1048576, readingOffset=0, writingOffset=36,
starting_window=5b29af4300000001, ending_window=5b29af4300000005,
refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
DataList@379bd431[identifier=2.output.1]
2018-06-19 21:35:01,286 [ServerHelper-98-1] INFO  server.Server run -
Removing ln LogicalNode@54c0b0d5identifier=tcp://localhost:62196/1.out.1,
upstream=1.out.1, group=genToProcessor/2.input,
partitions=[BitVector{mask=1, bits=0}],
iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@649adb0
{da=com.datatorrent.bufferserver.internal.DataList$Block@15201b67{identifier=1.out.1,
data=1048576, readingOffset=0, writingOffset=36,
starting_window=5b29af4300000001, ending_window=5b29af4300000005,
refCount=3, uniqueIdentifier=0, next=null, future=null}}} from dl
DataList@47bc3c23[identifier=1.out.1]
2018-06-19 21:35:01,286 [ServerHelper-98-1] INFO  server.Server run -
Removing ln LogicalNode@2422ada2identifier=tcp://localhost:62196/1.out.1,
upstream=1.out.1, group=genToProcessor/3.input,
partitions=[BitVector{mask=1, bits=1}],
iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@2e6f42b9
{da=com.datatorrent.bufferserver.internal.DataList$Block@15201b67{identifier=1.out.1,
data=1048576, readingOffset=0, writingOffset=36,
starting_window=5b29af4300000001, ending_window=5b29af4300000005,
refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
DataList@47bc3c23[identifier=1.out.1]
2018-06-19 21:35:01,287 [ProcessWideEventLoop] INFO  server.Server run -
Server stopped listening at /0:0:0:0:0:0:0:0:62196
2018-06-19 21:35:01,287 [main] INFO  stram.StramLocalCluster run -
Application finished.
2018-06-19 21:35:01,288 [main] INFO  stram.CustomControlTupleTest testApp -
Control Tuples received 0 expected 1
2018-06-19 21:35:01,305 [main] INFO  util.AsyncFSStorageAgent save - using
/Users/mbossert/testIdea/apex-core/engine/target/chkp6727909541678525259 as
the basepath for checkpointing.
2018-06-19 21:35:01,460 [main] INFO  storage.DiskStorage <init> - using
/Users/mbossert/testIdea/apex-core/engine/target as the basepath for
spooling.
2018-06-19 21:35:01,460 [ProcessWideEventLoop] INFO  server.Server
registered - Server started listening at /0:0:0:0:0:0:0:0:62204
2018-06-19 21:35:01,461 [main] INFO  stram.StramLocalCluster run - Buffer
server started: localhost:62204
2018-06-19 21:35:01,461 [container-0] INFO  stram.StramLocalCluster run -
Started container container-0
2018-06-19 21:35:01,461 [container-1] INFO  stram.StramLocalCluster run -
Started container container-1
2018-06-19 21:35:01,461 [container-0] INFO  stram.StramLocalCluster log -
container-0 msg: [container-0] Entering heartbeat loop..
2018-06-19 21:35:01,461 [container-2] INFO  stram.StramLocalCluster run -
Started container container-2
2018-06-19 21:35:01,461 [container-1] INFO  stram.StramLocalCluster log -
container-1 msg: [container-1] Entering heartbeat loop..
2018-06-19 21:35:01,462 [container-2] INFO  stram.StramLocalCluster log -
container-2 msg: [container-2] Entering heartbeat loop..
2018-06-19 21:35:02,464 [container-0] INFO  engine.StreamingContainer
processHeartbeatResponse - Deploy request:
[OperatorDeployInfo[id=3,name=receiver,type=GENERIC,checkpoint={ffffffffffffffff,
0,
0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=ProcessorToReceiver,sourceNodeId=2,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[]]]
2018-06-19 21:35:02,464 [container-1] INFO  engine.StreamingContainer
processHeartbeatResponse - Deploy request:
[OperatorDeployInfo[id=2,name=process,type=GENERIC,checkpoint={ffffffffffffffff,
0,
0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=localhost]]]]
2018-06-19 21:35:02,464 [container-2] INFO  engine.StreamingContainer
processHeartbeatResponse - Deploy request:
[OperatorDeployInfo[id=1,name=randomGenerator,type=INPUT,checkpoint={ffffffffffffffff,
0,
0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=genToProcessor,bufferServer=localhost]]]]
2018-06-19 21:35:02,467 [container-2] INFO  engine.WindowGenerator activate
- Catching up from 1529458501500 to 1529458502467
2018-06-19 21:35:02,469 [ProcessWideEventLoop] INFO  server.Server
onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
identifier=tcp://localhost:62204/2.output.1, windowId=ffffffffffffffff,
type=ProcessorToReceiver/3.input, upstreamIdentifier=2.output.1, mask=0,
partitions=null, bufferSize=1024}
2018-06-19 21:35:02,469 [ProcessWideEventLoop] INFO  server.Server
onMessage - Received publisher request: PublishRequestTuple{version=1.0,
identifier=1.out.1, windowId=ffffffffffffffff}
2018-06-19 21:35:02,470 [ProcessWideEventLoop] INFO  server.Server
onMessage - Received publisher request: PublishRequestTuple{version=1.0,
identifier=2.output.1, windowId=ffffffffffffffff}
2018-06-19 21:35:02,470 [ProcessWideEventLoop] INFO  server.Server
onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
identifier=tcp://localhost:62204/1.out.1, windowId=ffffffffffffffff,
type=genToProcessor/2.input, upstreamIdentifier=1.out.1, mask=0,
partitions=null, bufferSize=1024}
2018-06-19 21:35:03,463 [main] INFO  stram.StramLocalCluster run - Stopping
on exit condition
2018-06-19 21:35:03,463 [container-1] INFO  engine.StreamingContainer
processHeartbeatResponse - Received shutdown request type ABORT
2018-06-19 21:35:03,463 [container-2] INFO  engine.StreamingContainer
processHeartbeatResponse - Received shutdown request type ABORT
2018-06-19 21:35:03,464 [container-2] INFO  stram.StramLocalCluster log -
container-2 msg: [container-2] Exiting heartbeat loop..
2018-06-19 21:35:03,463 [container-1] INFO  stram.StramLocalCluster log -
container-1 msg: [container-1] Exiting heartbeat loop..
2018-06-19 21:35:03,463 [container-0] INFO  engine.StreamingContainer
processHeartbeatResponse - Received shutdown request type ABORT
2018-06-19 21:35:03,464 [container-0] INFO  stram.StramLocalCluster log -
container-0 msg: [container-0] Exiting heartbeat loop..
2018-06-19 21:35:03,464 [container-2] INFO  stram.StramLocalCluster run -
Container container-2 terminating.
2018-06-19 21:35:03,465 [ServerHelper-101-1] INFO  server.Server run -
Removing ln LogicalNode@5a90f429identifier=tcp://localhost:62204/1.out.1,
upstream=1.out.1, group=genToProcessor/2.input, partitions=[],
iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@1fa9e9ce
{da=com.datatorrent.bufferserver.internal.DataList$Block@6b00c947{identifier=1.out.1,
data=1048576, readingOffset=0, writingOffset=481,
starting_window=5b29af4500000001, ending_window=5b29af4500000005,
refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
DataList@67d38a09[identifier=1.out.1]
2018-06-19 21:35:03,470 [container-1] INFO  stram.StramLocalCluster run -
Container container-1 terminating.
2018-06-19 21:35:03,470 [container-0] INFO  stram.StramLocalCluster run -
Container container-0 terminating.
2018-06-19 21:35:03,471 [ServerHelper-101-1] INFO  server.Server run -
Removing ln LogicalNode@1badfe12identifier=tcp://localhost:62204/2.output.1,
upstream=2.output.1, group=ProcessorToReceiver/3.input, partitions=[],
iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@3abf0b66
{da=com.datatorrent.bufferserver.internal.DataList$Block@6a887266{identifier=2.output.1,
data=1048576, readingOffset=0, writingOffset=481,
starting_window=5b29af4500000001, ending_window=5b29af4500000005,
refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
DataList@7afd481[identifier=2.output.1]
2018-06-19 21:35:03,472 [ProcessWideEventLoop] INFO  server.Server run -
Server stopped listening at /0:0:0:0:0:0:0:0:62204
2018-06-19 21:35:03,472 [main] INFO  stram.StramLocalCluster run -
Application finished.
2018-06-19 21:35:03,472 [main] INFO  stram.CustomControlTupleTest testApp -
Control Tuples received 3 expected 3
2018-06-19 21:35:03,489 [main] INFO  util.AsyncFSStorageAgent save - using
/Users/mbossert/testIdea/apex-core/engine/target/chkp1123378605276624191 as
the basepath for checkpointing.
2018-06-19 21:35:03,633 [main] INFO  storage.DiskStorage <init> - using
/Users/mbossert/testIdea/apex-core/engine/target as the basepath for
spooling.
2018-06-19 21:35:03,633 [ProcessWideEventLoop] INFO  server.Server
registered - Server started listening at /0:0:0:0:0:0:0:0:62209
2018-06-19 21:35:03,633 [main] INFO  stram.StramLocalCluster run - Buffer
server started: localhost:62209
2018-06-19 21:35:03,634 [container-0] INFO  stram.StramLocalCluster run -
Started container container-0
2018-06-19 21:35:03,634 [container-0] INFO  stram.StramLocalCluster log -
container-0 msg: [container-0] Entering heartbeat loop..
2018-06-19 21:35:04,641 [container-0] INFO  engine.StreamingContainer
processHeartbeatResponse - Deploy request:
[OperatorDeployInfo[id=2,name=process,type=GENERIC,checkpoint={ffffffffffffffff,
0,
0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=CONTAINER_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=<null>]]],
OperatorDeployInfo[id=1,name=randomGenerator,type=INPUT,checkpoint={ffffffffffffffff,
0,
0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=genToProcessor,bufferServer=<null>]]],
OperatorDeployInfo[id=3,name=receiver,type=GENERIC,checkpoint={ffffffffffffffff,
0,
0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=ProcessorToReceiver,sourceNodeId=2,sourcePortName=output,locality=CONTAINER_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[]]]
2018-06-19 21:35:04,643 [container-0] INFO  engine.WindowGenerator activate
- Catching up from 1529458503500 to 1529458504643
2018-06-19 21:35:05,640 [main] INFO  stram.StramLocalCluster run - Stopping
on exit condition
2018-06-19 21:35:05,641 [container-0] INFO  engine.StreamingContainer
processHeartbeatResponse - Received shutdown request type ABORT
2018-06-19 21:35:05,641 [container-0] INFO  stram.StramLocalCluster log -
container-0 msg: [container-0] Exiting heartbeat loop..
2018-06-19 21:35:05,653 [container-0] INFO  stram.StramLocalCluster run -
Container container-0 terminating.
2018-06-19 21:35:05,655 [ProcessWideEventLoop] INFO  server.Server run -
Server stopped listening at /0:0:0:0:0:0:0:0:62209
2018-06-19 21:35:05,655 [main] INFO  stram.StramLocalCluster run -
Application finished.
2018-06-19 21:35:05,655 [main] INFO  stram.CustomControlTupleTest testApp -
Control Tuples received 3 expected 3
2018-06-19 21:35:05,672 [main] INFO  util.AsyncFSStorageAgent save - using
/Users/mbossert/testIdea/apex-core/engine/target/chkp9044425874557598001 as
the basepath for checkpointing.
2018-06-19 21:35:05,819 [main] INFO  storage.DiskStorage <init> - using
/Users/mbossert/testIdea/apex-core/engine/target as the basepath for
spooling.
2018-06-19 21:35:05,819 [ProcessWideEventLoop] INFO  server.Server
registered - Server started listening at /0:0:0:0:0:0:0:0:62211
2018-06-19 21:35:05,819 [main] INFO  stram.StramLocalCluster run - Buffer
server started: localhost:62211
2018-06-19 21:35:05,819 [container-0] INFO  stram.StramLocalCluster run -
Started container container-0
2018-06-19 21:35:05,819 [container-1] INFO  stram.StramLocalCluster run -
Started container container-1
2018-06-19 21:35:05,820 [container-0] INFO  stram.StramLocalCluster log -
container-0 msg: [container-0] Entering heartbeat loop..
2018-06-19 21:35:05,820 [container-1] INFO  stram.StramLocalCluster log -
container-1 msg: [container-1] Entering heartbeat loop..
2018-06-19 21:35:05,820 [container-2] INFO  stram.StramLocalCluster run -
Started container container-2
2018-06-19 21:35:05,820 [container-2] INFO  stram.StramLocalCluster log -
container-2 msg: [container-2] Entering heartbeat loop..
2018-06-19 21:35:06,826 [container-1] INFO  engine.StreamingContainer
processHeartbeatResponse - Deploy request:
[OperatorDeployInfo[id=3,name=receiver,type=GENERIC,checkpoint={ffffffffffffffff,
0,
0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=ProcessorToReceiver,sourceNodeId=2,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[]]]
2018-06-19 21:35:06,826 [container-2] INFO  engine.StreamingContainer
processHeartbeatResponse - Deploy request:
[OperatorDeployInfo[id=2,name=process,type=GENERIC,checkpoint={ffffffffffffffff,
0,
0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=localhost]]]]
2018-06-19 21:35:06,826 [container-0] INFO  engine.StreamingContainer
processHeartbeatResponse - Deploy request:
[OperatorDeployInfo[id=1,name=randomGenerator,type=INPUT,checkpoint={ffffffffffffffff,
0,
0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=genToProcessor,bufferServer=localhost]]]]
2018-06-19 21:35:06,830 [container-0] INFO  engine.WindowGenerator activate
- Catching up from 1529458505500 to 1529458506830
2018-06-19 21:35:06,831 [ProcessWideEventLoop] INFO  server.Server
onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
identifier=tcp://localhost:62211/2.output.1, windowId=ffffffffffffffff,
type=ProcessorToReceiver/3.input, upstreamIdentifier=2.output.1, mask=0,
partitions=null, bufferSize=1024}
2018-06-19 21:35:06,832 [ProcessWideEventLoop] INFO  server.Server
onMessage - Received publisher request: PublishRequestTuple{version=1.0,
identifier=1.out.1, windowId=ffffffffffffffff}
2018-06-19 21:35:06,832 [ProcessWideEventLoop] INFO  server.Server
onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
identifier=tcp://localhost:62211/1.out.1, windowId=ffffffffffffffff,
type=genToProcessor/2.input, upstreamIdentifier=1.out.1, mask=0,
partitions=null, bufferSize=1024}
2018-06-19 21:35:06,832 [ProcessWideEventLoop] INFO  server.Server
onMessage - Received publisher request: PublishRequestTuple{version=1.0,
identifier=2.output.1, windowId=ffffffffffffffff}
2018-06-19 21:35:07,828 [main] INFO  stram.StramLocalCluster run - Stopping
on exit condition
2018-06-19 21:35:07,829 [container-0] INFO  engine.StreamingContainer
processHeartbeatResponse - Received shutdown request type ABORT
2018-06-19 21:35:07,829 [container-1] INFO  engine.StreamingContainer
processHeartbeatResponse - Received shutdown request type ABORT
2018-06-19 21:35:07,829 [container-2] INFO  engine.StreamingContainer
processHeartbeatResponse - Received shutdown request type ABORT
2018-06-19 21:35:07,829 [container-1] INFO  stram.StramLocalCluster log -
container-1 msg: [container-1] Exiting heartbeat loop..
2018-06-19 21:35:07,829 [container-0] INFO  stram.StramLocalCluster log -
container-0 msg: [container-0] Exiting heartbeat loop..
2018-06-19 21:35:07,829 [container-2] INFO  stram.StramLocalCluster log -
container-2 msg: [container-2] Exiting heartbeat loop..
2018-06-19 21:35:07,834 [container-1] INFO  stram.StramLocalCluster run -
Container container-1 terminating.
2018-06-19 21:35:07,839 [container-2] INFO  stram.StramLocalCluster run -
Container container-2 terminating.
2018-06-19 21:35:07,839 [container-0] INFO  stram.StramLocalCluster run -
Container container-0 terminating.
2018-06-19 21:35:07,839 [ServerHelper-107-1] INFO  server.Server run -
Removing ln LogicalNode@16a2cf78identifier=tcp://localhost:62211/2.output.1,
upstream=2.output.1, group=ProcessorToReceiver/3.input, partitions=[],
iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@665bb8b6
{da=com.datatorrent.bufferserver.internal.DataList$Block@6682720a{identifier=2.output.1,
data=1048576, readingOffset=0, writingOffset=487,
starting_window=5b29af4900000001, ending_window=5b29af4900000006,
refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
DataList@1eac5147[identifier=2.output.1]
2018-06-19 21:35:07,839 [ServerHelper-107-1] INFO  server.Server run -
Removing ln LogicalNode@579fc543identifier=tcp://localhost:62211/1.out.1,
upstream=1.out.1, group=genToProcessor/2.input, partitions=[],
iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@37b03a0c
{da=com.datatorrent.bufferserver.internal.DataList$Block@c15ba44{identifier=1.out.1,
data=1048576, readingOffset=0, writingOffset=487,
starting_window=5b29af4900000001, ending_window=5b29af4900000006,
refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
DataList@79f29bee[identifier=1.out.1]
2018-06-19 21:35:07,840 [ProcessWideEventLoop] INFO  server.Server run -
Server stopped listening at /0:0:0:0:0:0:0:0:62211
2018-06-19 21:35:07,840 [main] INFO  stram.StramLocalCluster run -
Application finished.
2018-06-19 21:35:07,840 [main] INFO  stram.CustomControlTupleTest testApp -
Control Tuples received 3 expected 3
2018-06-19 21:35:07,857 [main] INFO  util.AsyncFSStorageAgent save - using
/Users/mbossert/testIdea/apex-core/engine/target/chkp628666253336272009 as
the basepath for checkpointing.
2018-06-19 21:35:08,003 [main] INFO  storage.DiskStorage <init> - using
/Users/mbossert/testIdea/apex-core/engine/target as the basepath for
spooling.
2018-06-19 21:35:08,004 [ProcessWideEventLoop] INFO  server.Server
registered - Server started listening at /0:0:0:0:0:0:0:0:62216
2018-06-19 21:35:08,004 [main] INFO  stram.StramLocalCluster run - Buffer
server started: localhost:62216
2018-06-19 21:35:08,004 [container-0] INFO  stram.StramLocalCluster run -
Started container container-0
2018-06-19 21:35:08,005 [container-0] INFO  stram.StramLocalCluster log -
container-0 msg: [container-0] Entering heartbeat loop..
2018-06-19 21:35:09,009 [container-0] INFO  engine.StreamingContainer
processHeartbeatResponse - Deploy request:
[OperatorDeployInfo[id=3,name=receiver,type=OIO,checkpoint={ffffffffffffffff,
0,
0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=ProcessorToReceiver,sourceNodeId=2,sourcePortName=output,locality=THREAD_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[]],
OperatorDeployInfo[id=1,name=randomGenerator,type=INPUT,checkpoint={ffffffffffffffff,
0,
0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=genToProcessor,bufferServer=<null>]]],
OperatorDeployInfo[id=2,name=process,type=OIO,checkpoint={ffffffffffffffff,
0,
0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=THREAD_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=<null>]]]]
2018-06-19 21:35:09,011 [container-0] INFO  engine.WindowGenerator activate
- Catching up from 1529458507500 to 1529458509011
2018-06-19 21:35:10,011 [main] INFO  stram.StramLocalCluster run - Stopping
on exit condition
2018-06-19 21:35:10,012 [container-0] INFO  engine.StreamingContainer
processHeartbeatResponse - Received shutdown request type ABORT
2018-06-19 21:35:10,012 [container-0] INFO  stram.StramLocalCluster log -
container-0 msg: [container-0] Exiting heartbeat loop..
2018-06-19 21:35:10,015 [container-0] INFO  stram.StramLocalCluster run -
Container container-0 terminating.
2018-06-19 21:35:10,017 [ProcessWideEventLoop] INFO  server.Server run -
Server stopped listening at /0:0:0:0:0:0:0:0:62216
2018-06-19 21:35:10,017 [main] INFO  stram.StramLocalCluster run -
Application finished.
2018-06-19 21:35:10,017 [main] INFO  stram.CustomControlTupleTest testApp -
Control Tuples received 4 expected 4
2018-06-19 21:35:10,034 [main] INFO  util.AsyncFSStorageAgent save - using
/Users/mbossert/testIdea/apex-core/engine/target/chkp1306712174461973573 as
the basepath for checkpointing.
2018-06-19 21:35:10,194 [main] INFO  storage.DiskStorage <init> - using
/Users/mbossert/testIdea/apex-core/engine/target as the basepath for
spooling.
2018-06-19 21:35:10,194 [ProcessWideEventLoop] INFO  server.Server
registered - Server started listening at /0:0:0:0:0:0:0:0:62217
2018-06-19 21:35:10,194 [main] INFO  stram.StramLocalCluster run - Buffer
server started: localhost:62217
2018-06-19 21:35:10,194 [container-0] INFO  stram.StramLocalCluster run -
Started container container-0
2018-06-19 21:35:10,195 [container-1] INFO  stram.StramLocalCluster run -
Started container container-1
2018-06-19 21:35:10,195 [container-2] INFO  stram.StramLocalCluster run -
Started container container-2
2018-06-19 21:35:10,195 [container-1] INFO  stram.StramLocalCluster log -
container-1 msg: [container-1] Entering heartbeat loop..
2018-06-19 21:35:10,195 [container-2] INFO  stram.StramLocalCluster log -
container-2 msg: [container-2] Entering heartbeat loop..
2018-06-19 21:35:10,195 [container-0] INFO  stram.StramLocalCluster log -
container-0 msg: [container-0] Entering heartbeat loop..
2018-06-19 21:35:11,201 [container-1] INFO  engine.StreamingContainer
processHeartbeatResponse - Deploy request:
[OperatorDeployInfo[id=3,name=receiver,type=GENERIC,checkpoint={ffffffffffffffff,
0,
0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=ProcessorToReceiver,sourceNodeId=2,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[]]]
2018-06-19 21:35:11,201 [container-2] INFO  engine.StreamingContainer
processHeartbeatResponse - Deploy request:
[OperatorDeployInfo[id=2,name=process,type=GENERIC,checkpoint={ffffffffffffffff,
0,
0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=localhost]]]]
2018-06-19 21:35:11,201 [container-0] INFO  engine.StreamingContainer
processHeartbeatResponse - Deploy request:
[OperatorDeployInfo[id=1,name=randomGenerator,type=INPUT,checkpoint={ffffffffffffffff,
0,
0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=genToProcessor,bufferServer=localhost]]]]
2018-06-19 21:35:11,205 [container-0] INFO  engine.WindowGenerator activate
- Catching up from 1529458510500 to 1529458511205
2018-06-19 21:35:11,206 [ProcessWideEventLoop] INFO  server.Server
onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
identifier=tcp://localhost:62217/2.output.1, windowId=ffffffffffffffff,
type=ProcessorToReceiver/3.input, upstreamIdentifier=2.output.1, mask=0,
partitions=null, bufferSize=1024}
2018-06-19 21:35:11,207 [ProcessWideEventLoop] INFO  server.Server
onMessage - Received publisher request: PublishRequestTuple{version=1.0,
identifier=1.out.1, windowId=ffffffffffffffff}
2018-06-19 21:35:11,208 [ProcessWideEventLoop] INFO  server.Server
onMessage - Received publisher request: PublishRequestTuple{version=1.0,
identifier=2.output.1, windowId=ffffffffffffffff}
2018-06-19 21:35:11,208 [ProcessWideEventLoop] INFO  server.Server
onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
identifier=tcp://localhost:62217/1.out.1, windowId=ffffffffffffffff,
type=genToProcessor/2.input, upstreamIdentifier=1.out.1, mask=0,
partitions=null, bufferSize=1024}
2018-06-19 21:35:12,202 [main] INFO  stram.StramLocalCluster run - Stopping
on exit condition
2018-06-19 21:35:12,203 [container-2] INFO  engine.StreamingContainer
processHeartbeatResponse - Received shutdown request type ABORT
2018-06-19 21:35:12,203 [container-1] INFO  engine.StreamingContainer
processHeartbeatResponse - Received shutdown request type ABORT
2018-06-19 21:35:12,203 [container-1] INFO  stram.StramLocalCluster log -
container-1 msg: [container-1] Exiting heartbeat loop..
2018-06-19 21:35:12,203 [container-2] INFO  stram.StramLocalCluster log -
container-2 msg: [container-2] Exiting heartbeat loop..
2018-06-19 21:35:12,203 [container-0] INFO  engine.StreamingContainer
processHeartbeatResponse - Received shutdown request type ABORT
2018-06-19 21:35:12,204 [container-1] INFO  stram.StramLocalCluster run -
Container container-1 terminating.
2018-06-19 21:35:12,204 [container-0] INFO  stram.StramLocalCluster log -
container-0 msg: [container-0] Exiting heartbeat loop..
2018-06-19 21:35:12,208 [container-2] INFO  stram.StramLocalCluster run -
Container container-2 terminating.
2018-06-19 21:35:12,209 [ServerHelper-113-1] INFO  server.Server run -
Removing ln LogicalNode@59cb59eidentifier=tcp://localhost:62217/2.output.1,
upstream=2.output.1, group=ProcessorToReceiver/3.input, partitions=[],
iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@157f4ac6
{da=com.datatorrent.bufferserver.internal.DataList$Block@75af3db2{identifier=2.output.1,
data=1048576, readingOffset=0, writingOffset=481,
starting_window=5b29af4e00000001, ending_window=5b29af4e00000005,
refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
DataList@6d9b966b[identifier=2.output.1]
2018-06-19 21:35:12,216 [container-0] INFO  stram.StramLocalCluster run -
Container container-0 terminating.
2018-06-19 21:35:12,217 [ServerHelper-113-1] INFO  server.Server run -
Removing ln LogicalNode@44a1bfa5identifier=tcp://localhost:62217/1.out.1,
upstream=1.out.1, group=genToProcessor/2.input, partitions=[],
iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@248e40ea
{da=com.datatorrent.bufferserver.internal.DataList$Block@4b4807c7{identifier=1.out.1,
data=1048576, readingOffset=0, writingOffset=481,
starting_window=5b29af4e00000001, ending_window=5b29af4e00000005,
refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
DataList@4786e1b1[identifier=1.out.1]
2018-06-19 21:35:12,228 [ProcessWideEventLoop] INFO  server.Server run -
Server stopped listening at /0:0:0:0:0:0:0:0:62217
2018-06-19 21:35:12,229 [main] INFO  stram.StramLocalCluster run -
Application finished.
2018-06-19 21:35:12,229 [main] INFO  stram.CustomControlTupleTest testApp -
Control Tuples received 3 expected 3
Tests run: 10, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 22.951 sec
<<< FAILURE! - in com.datatorrent.stram.CustomControlTupleTest
testDuplicateControlTuples(com.datatorrent.stram.CustomControlTupleTest)
 Time elapsed: 2.241 sec  <<< FAILURE!
java.lang.AssertionError: Incorrect Control Tuples
at
com.datatorrent.stram.CustomControlTupleTest.testApp(CustomControlTupleTest.java:259)
at
com.datatorrent.stram.CustomControlTupleTest.testDuplicateControlTuples(CustomControlTupleTest.java:283)

And here is what I get running the first test
(StramRecoveryTest.testWriteAheadLog) in isolation:

/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/bin/java
-ea "-Dmaven.home=/Applications/IntelliJ IDEA
CE.app/Contents/plugins/maven/lib/maven3"
"-Dmaven.multiModuleProjectDirectory=/Applications/IntelliJ IDEA
CE.app/Contents/plugins/maven/lib/maven3" -Dapex.version=3.7.1-SNAPSHOT
-Djava.io.tmpdir=/Users/mbossert/testIdea/apex-core/engine/target -Xmx2048m
-XX:MaxPermSize=128m -Didea.test.cyclic.buffer.size=1048576
"-javaagent:/Applications/IntelliJ IDEA
CE.app/Contents/lib/idea_rt.jar=62369:/Applications/IntelliJ IDEA
CE.app/Contents/bin" -Dfile.encoding=UTF-8 -classpath
"/Applications/IntelliJ IDEA
CE.app/Contents/lib/idea_rt.jar:/Applications/IntelliJ IDEA
CE.app/Contents/plugins/junit/lib/junit-rt.jar:/Applications/IntelliJ IDEA
CE.app/Contents/plugins/junit/lib/junit5-rt.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/charsets.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/deploy.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/ext/cldrdata.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/ext/dnsns.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/ext/jaccess.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/ext/jfxrt.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/ext/localedata.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/ext/nashorn.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/ext/sunec.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/ext/sunjce_provider.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/ext/sunpkcs11.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/ext/zipfs.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/javaws.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/jce.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/jfr.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/jfxswt.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/jsse.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/management-agent.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/plugin.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/resources.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/rt.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/lib/ant-javafx.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_17
2.jdk/Contents/Home/lib/dt.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/lib/javafx-mx.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/lib/jconsole.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/lib/packager.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/lib/sa-jdi.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/lib/tools.jar:/Users/mbossert/testIdea/apex-core/engine/target/test-classes:/Users/mbossert/testIdea/apex-core/engine/target/classes:/Users/mbossert/.m2/repository/org/apache/bval/bval-jsr303/0.5/bval-jsr303-0.5.jar:/Users/mbossert/.m2/repository/org/apache/bval/bval-core/0.5/bval-core-0.5.jar:/Users/mbossert/.m2/repository/org/apache/commons/commons-lang3/3.1/commons-lang3-3.1.jar:/Users/mbossert/testIdea/apex-core/bufferserver/target/classes:/Users/mbossert/testIdea/apex-core/common/target/classes:/Users/mbossert/testIdea/apex-core/api/target/classes:/Users/mbossert/.m2/repository/org/apache/hadoop/hadoop-common/2.6.0/hadoop-common-2.6.0.jar:/Users/mbossert/.m2/repository/org/apache/commons/commons-math3/3.1.1/commons-math3-3.1.1.jar:/Users/mbossert/.m2/repository/xmlenc/xmlenc/0.52/xmlenc-0.52.jar:/Users/mbossert/.m2/repository/commons-net/commons-net/3.1/commons-net-3.1.jar:/Users/mbossert/.m2/repository/tomcat/jasper-compiler/5.5.23/jasper-compiler-5.5.23.jar:/Users/mbossert/.m2/repository/tomcat/jasper-runtime/5.5.23/jasper-runtime-5.5.23.jar:/Users/mbossert/.m2/repository/javax/servlet/jsp/jsp-api/2.1/jsp-api-2.1.jar:/Users/mbossert/.m2/repository/commons-el/commons-el/1.0/commons-el-1.0.jar:/Users/mbossert/.m2/repository/net/java/dev/jets3t/jets3t/0.9.0/jets3t-0.9.0.jar:/Users/mbossert/.m2/repository/com/jamesmurty/utils/java-xmlbuilder/0.4/java-xmlbuilder-0.4.jar:/Users/mbossert/.m2/repository/commons-configuration/commons-configuration/1.6/commons-configuration-1.6.jar:/Users/mbossert/.m2/repository/commons-digester/commons-digester/1.8/comm
ons-digester-1.8.jar:/Users/mbossert/.m2/repository/com/google/code/gson/gson/2.2.4/gson-2.2.4.jar:/Users/mbossert/.m2/repository/org/apache/hadoop/hadoop-auth/2.6.0/hadoop-auth-2.6.0.jar:/Users/mbossert/.m2/repository/org/apache/directory/server/apacheds-kerberos-codec/2.0.0-M15/apacheds-kerberos-codec-2.0.0-M15.jar:/Users/mbossert/.m2/repository/org/apache/directory/server/apacheds-i18n/2.0.0-M15/apacheds-i18n-2.0.0-M15.jar:/Users/mbossert/.m2/repository/org/apache/directory/api/api-asn1-api/1.0.0-M20/api-asn1-api-1.0.0-M20.jar:/Users/mbossert/.m2/repository/org/apache/directory/api/api-util/1.0.0-M20/api-util-1.0.0-M20.jar:/Users/mbossert/.m2/repository/org/apache/curator/curator-framework/2.6.0/curator-framework-2.6.0.jar:/Users/mbossert/.m2/repository/com/jcraft/jsch/0.1.42/jsch-0.1.42.jar:/Users/mbossert/.m2/repository/org/apache/curator/curator-client/2.6.0/curator-client-2.6.0.jar:/Users/mbossert/.m2/repository/org/apache/curator/curator-recipes/2.6.0/curator-recipes-2.6.0.jar:/Users/mbossert/.m2/repository/org/htrace/htrace-core/3.0.4/htrace-core-3.0.4.jar:/Users/mbossert/.m2/repository/com/datatorrent/netlet/1.3.2/netlet-1.3.2.jar:/Users/mbossert/.m2/repository/com/esotericsoftware/kryo/4.0.2/kryo-4.0.2.jar:/Users/mbossert/.m2/repository/com/esotericsoftware/reflectasm/1.11.3/reflectasm-1.11.3.jar:/Users/mbossert/.m2/repository/org/ow2/asm/asm/5.0.4/asm-5.0.4.jar:/Users/mbossert/.m2/repository/com/esotericsoftware/minlog/1.3.0/minlog-1.3.0.jar:/Users/mbossert/.m2/repository/javax/validation/validation-api/1.1.0.Final/validation-api-1.1.0.Final.jar:/Users/mbossert/.m2/repository/com/sun/jersey/jersey-core/1.9/jersey-core-1.9.jar:/Users/mbossert/.m2/repository/org/apache/httpcomponents/httpclient/4.3.6/httpclient-4.3.6.jar:/Users/mbossert/.m2/repository/org/apache/httpcomponents/httpcore/4.3.3/httpcore-4.3.3.jar:/Users/mbossert/.m2/repository/commons-logging/commons-logging/1.1.3/commons-logging-1.1.3.jar:/Users/mbossert/.m2/repository/com/sun/jersey/contrib
s/jersey-apache-client4/1.9/jersey-apache-client4-1.9.jar:/Users/mbossert/.m2/repository/com/sun/jersey/jersey-client/1.9/jersey-client-1.9.jar:/Users/mbossert/.m2/repository/org/apache/hadoop/hadoop-yarn-client/2.6.0/hadoop-yarn-client-2.6.0.jar:/Users/mbossert/.m2/repository/com/google/guava/guava/11.0.2/guava-11.0.2.jar:/Users/mbossert/.m2/repository/com/google/code/findbugs/jsr305/1.3.9/jsr305-1.3.9.jar:/Users/mbossert/.m2/repository/commons-lang/commons-lang/2.6/commons-lang-2.6.jar:/Users/mbossert/.m2/repository/commons-cli/commons-cli/1.2/commons-cli-1.2.jar:/Users/mbossert/.m2/repository/log4j/log4j/1.2.17/log4j-1.2.17.jar:/Users/mbossert/.m2/repository/org/apache/hadoop/hadoop-annotations/2.6.0/hadoop-annotations-2.6.0.jar:/Users/mbossert/.m2/repository/org/apache/hadoop/hadoop-yarn-api/2.6.0/hadoop-yarn-api-2.6.0.jar:/Users/mbossert/.m2/repository/com/google/protobuf/protobuf-java/2.5.0/protobuf-java-2.5.0.jar:/Users/mbossert/.m2/repository/org/apache/hadoop/hadoop-yarn-common/2.6.0/hadoop-yarn-common-2.6.0.jar:/Users/mbossert/.m2/repository/javax/xml/bind/jaxb-api/2.2.2/jaxb-api-2.2.2.jar:/Users/mbossert/.m2/repository/javax/xml/stream/stax-api/1.0-2/stax-api-1.0-2.jar:/Users/mbossert/.m2/repository/javax/activation/activation/1.1/activation-1.1.jar:/Users/mbossert/.m2/repository/org/apache/commons/commons-compress/1.4.1/commons-compress-1.4.1.jar:/Users/mbossert/.m2/repository/org/tukaani/xz/1.0/xz-1.0.jar:/Users/mbossert/.m2/repository/org/mortbay/jetty/jetty-util/6.1.26/jetty-util-6.1.26.jar:/Users/mbossert/.m2/repository/com/google/inject/extensions/guice-servlet/3.0/guice-servlet-3.0.jar:/Users/mbossert/.m2/repository/commons-io/commons-io/2.4/commons-io-2.4.jar:/Users/mbossert/.m2/repository/com/google/inject/guice/3.0/guice-3.0.jar:/Users/mbossert/.m2/repository/javax/inject/javax.inject/1/javax.inject-1.jar:/Users/mbossert/.m2/repository/aopalliance/aopalliance/1.0/aopalliance-1.0.jar:/Users/mbossert/.m2/repository/com/sun/jersey/jersey-server/1.9
/jersey-server-1.9.jar:/Users/mbossert/.m2/repository/asm/asm/3.1/asm-3.1.jar:/Users/mbossert/.m2/repository/com/sun/jersey/jersey-json/1.9/jersey-json-1.9.jar:/Users/mbossert/.m2/repository/com/sun/xml/bind/jaxb-impl/2.2.3-1/jaxb-impl-2.2.3-1.jar:/Users/mbossert/.m2/repository/com/sun/jersey/contribs/jersey-guice/1.9/jersey-guice-1.9.jar:/Users/mbossert/.m2/repository/org/apache/hadoop/hadoop-yarn-server-tests/2.6.0/hadoop-yarn-server-tests-2.6.0-tests.jar:/Users/mbossert/.m2/repository/org/apache/hadoop/hadoop-yarn-server-common/2.6.0/hadoop-yarn-server-common-2.6.0.jar:/Users/mbossert/.m2/repository/org/apache/zookeeper/zookeeper/3.4.6/zookeeper-3.4.6.jar:/Users/mbossert/.m2/repository/org/slf4j/slf4j-log4j12/1.6.1/slf4j-log4j12-1.6.1.jar:/Users/mbossert/.m2/repository/io/netty/netty/3.7.0.Final/netty-3.7.0.Final.jar:/Users/mbossert/.m2/repository/org/fusesource/leveldbjni/leveldbjni-all/1.8/leveldbjni-all-1.8.jar:/Users/mbossert/.m2/repository/org/apache/hadoop/hadoop-yarn-server-nodemanager/2.6.0/hadoop-yarn-server-nodemanager-2.6.0.jar:/Users/mbossert/.m2/repository/org/codehaus/jettison/jettison/1.1/jettison-1.1.jar:/Users/mbossert/.m2/repository/org/apache/hadoop/hadoop-yarn-server-resourcemanager/2.6.0/hadoop-yarn-server-resourcemanager-2.6.0.jar:/Users/mbossert/.m2/repository/org/apache/hadoop/hadoop-yarn-server-applicationhistoryservice/2.6.0/hadoop-yarn-server-applicationhistoryservice-2.6.0.jar:/Users/mbossert/.m2/repository/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar:/Users/mbossert/.m2/repository/org/apache/hadoop/hadoop-yarn-server-web-proxy/2.6.0/hadoop-yarn-server-web-proxy-2.6.0.jar:/Users/mbossert/.m2/repository/commons-httpclient/commons-httpclient/3.1/commons-httpclient-3.1.jar:/Users/mbossert/.m2/repository/org/mortbay/jetty/jetty/6.1.26/jetty-6.1.26.jar:/Users/mbossert/.m2/repository/org/codehaus/jackson/jackson-mapper-asl/1.9.13/jackson-mapper-asl-1.9.13.jar:/Users/mbossert/.m2/repository/org/codehaus/jackson/
jackson-core-asl/1.9.13/jackson-core-asl-1.9.13.jar:/Users/mbossert/.m2/repository/jline/jline/2.11/jline-2.11.jar:/Users/mbossert/.m2/repository/org/apache/ant/ant/1.9.2/ant-1.9.2.jar:/Users/mbossert/.m2/repository/org/apache/ant/ant-launcher/1.9.2/ant-launcher-1.9.2.jar:/Users/mbossert/.m2/repository/net/engio/mbassador/1.1.9/mbassador-1.1.9.jar:/Users/mbossert/.m2/repository/org/mockito/mockito-core/1.10.19/mockito-core-1.10.19.jar:/Users/mbossert/.m2/repository/org/objenesis/objenesis/2.1/objenesis-2.1.jar:/Users/mbossert/.m2/repository/net/lingala/zip4j/zip4j/1.3.2/zip4j-1.3.2.jar:/Users/mbossert/.m2/repository/commons-beanutils/commons-beanutils/1.9.2/commons-beanutils-1.9.2.jar:/Users/mbossert/.m2/repository/commons-codec/commons-codec/1.10/commons-codec-1.10.jar:/Users/mbossert/.m2/repository/org/eclipse/jetty/jetty-servlet/8.1.10.v20130312/jetty-servlet-8.1.10.v20130312.jar:/Users/mbossert/.m2/repository/org/eclipse/jetty/jetty-security/8.1.10.v20130312/jetty-security-8.1.10.v20130312.jar:/Users/mbossert/.m2/repository/org/eclipse/jetty/jetty-server/8.1.10.v20130312/jetty-server-8.1.10.v20130312.jar:/Users/mbossert/.m2/repository/org/eclipse/jetty/orbit/javax.servlet/3.0.0.v201112011016/javax.servlet-3.0.0.v201112011016.jar:/Users/mbossert/.m2/repository/org/eclipse/jetty/jetty-continuation/8.1.10.v20130312/jetty-continuation-8.1.10.v20130312.jar:/Users/mbossert/.m2/repository/org/eclipse/jetty/jetty-websocket/8.1.10.v20130312/jetty-websocket-8.1.10.v20130312.jar:/Users/mbossert/.m2/repository/org/eclipse/jetty/jetty-util/8.1.10.v20130312/jetty-util-8.1.10.v20130312.jar:/Users/mbossert/.m2/repository/org/eclipse/jetty/jetty-io/8.1.10.v20130312/jetty-io-8.1.10.v20130312.jar:/Users/mbossert/.m2/repository/org/eclipse/jetty/jetty-http/8.1.10.v20130312/jetty-http-8.1.10.v20130312.jar:/Users/mbossert/.m2/repository/org/apache/xbean/xbean-asm5-shaded/4.3/xbean-asm5-shaded-4.3.jar:/Users/mbossert/.m2/repository/org/jctools/jctools-core/1.1/jctools-core-1.1.jar:/Us
ers/mbossert/.m2/repository/org/apache/apex/apex-shaded-ning19/1.0.0/apex-shaded-ning19-1.0.0.jar:/Users/mbossert/.m2/repository/org/apache/maven/maven-embedder/3.3.9/maven-embedder-3.3.9.jar:/Users/mbossert/.m2/repository/org/apache/maven/maven-settings/3.3.9/maven-settings-3.3.9.jar:/Users/mbossert/.m2/repository/org/apache/maven/maven-core/3.3.9/maven-core-3.3.9.jar:/Users/mbossert/.m2/repository/org/apache/maven/maven-model/3.3.9/maven-model-3.3.9.jar:/Users/mbossert/.m2/repository/org/apache/maven/maven-settings-builder/3.3.9/maven-settings-builder-3.3.9.jar:/Users/mbossert/.m2/repository/org/apache/maven/maven-repository-metadata/3.3.9/maven-repository-metadata-3.3.9.jar:/Users/mbossert/.m2/repository/org/apache/maven/maven-artifact/3.3.9/maven-artifact-3.3.9.jar:/Users/mbossert/.m2/repository/org/apache/maven/maven-aether-provider/3.3.9/maven-aether-provider-3.3.9.jar:/Users/mbossert/.m2/repository/org/eclipse/aether/aether-impl/1.0.2.v20150114/aether-impl-1.0.2.v20150114.jar:/Users/mbossert/.m2/repository/com/google/inject/guice/4.0/guice-4.0-no_aop.jar:/Users/mbossert/.m2/repository/org/codehaus/plexus/plexus-interpolation/1.21/plexus-interpolation-1.21.jar:/Users/mbossert/.m2/repository/org/apache/maven/maven-plugin-api/3.3.9/maven-plugin-api-3.3.9.jar:/Users/mbossert/.m2/repository/org/apache/maven/maven-model-builder/3.3.9/maven-model-builder-3.3.9.jar:/Users/mbossert/.m2/repository/org/apache/maven/maven-builder-support/3.3.9/maven-builder-support-3.3.9.jar:/Users/mbossert/.m2/repository/org/apache/maven/maven-compat/3.3.9/maven-compat-3.3.9.jar:/Users/mbossert/.m2/repository/org/codehaus/plexus/plexus-utils/3.0.22/plexus-utils-3.0.22.jar:/Users/mbossert/.m2/repository/org/codehaus/plexus/plexus-classworlds/2.5.2/plexus-classworlds-2.5.2.jar:/Users/mbossert/.m2/repository/org/eclipse/sisu/org.eclipse.sisu.plexus/0.3.2/org.eclipse.sisu.plexus-0.3.2.jar:/Users/mbossert/.m2/repository/javax/enterprise/cdi-api/1.0/cdi-api-1.0.jar:/Users/mbossert/.m2/reposit
ory/javax/annotation/jsr250-api/1.0/jsr250-api-1.0.jar:/Users/mbossert/.m2/repository/org/eclipse/sisu/org.eclipse.sisu.inject/0.3.2/org.eclipse.sisu.inject-0.3.2.jar:/Users/mbossert/.m2/repository/org/codehaus/plexus/plexus-component-annotations/1.6/plexus-component-annotations-1.6.jar:/Users/mbossert/.m2/repository/org/sonatype/plexus/plexus-sec-dispatcher/1.3/plexus-sec-dispatcher-1.3.jar:/Users/mbossert/.m2/repository/org/sonatype/plexus/plexus-cipher/1.7/plexus-cipher-1.7.jar:/Users/mbossert/.m2/repository/org/slf4j/slf4j-api/1.7.5/slf4j-api-1.7.5.jar:/Users/mbossert/.m2/repository/org/eclipse/aether/aether-connector-basic/1.0.2.v20150114/aether-connector-basic-1.0.2.v20150114.jar:/Users/mbossert/.m2/repository/org/eclipse/aether/aether-api/1.0.2.v20150114/aether-api-1.0.2.v20150114.jar:/Users/mbossert/.m2/repository/org/eclipse/aether/aether-spi/1.0.2.v20150114/aether-spi-1.0.2.v20150114.jar:/Users/mbossert/.m2/repository/org/eclipse/aether/aether-util/1.0.2.v20150114/aether-util-1.0.2.v20150114.jar:/Users/mbossert/.m2/repository/org/eclipse/aether/aether-transport-wagon/1.0.2.v20150114/aether-transport-wagon-1.0.2.v20150114.jar:/Users/mbossert/.m2/repository/org/apache/maven/wagon/wagon-http/2.10/wagon-http-2.10.jar:/Users/mbossert/.m2/repository/org/apache/maven/wagon/wagon-http-shared/2.10/wagon-http-shared-2.10.jar:/Users/mbossert/.m2/repository/org/jsoup/jsoup/1.7.2/jsoup-1.7.2.jar:/Users/mbossert/.m2/repository/org/apache/maven/wagon/wagon-provider-api/2.10/wagon-provider-api-2.10.jar:/Users/mbossert/.m2/repository/org/powermock/powermock-api-mockito/1.6.5/powermock-api-mockito-1.6.5.jar:/Users/mbossert/.m2/repository/org/hamcrest/hamcrest-core/1.3/hamcrest-core-1.3.jar:/Users/mbossert/.m2/repository/org/powermock/powermock-api-mockito-common/1.6.5/powermock-api-mockito-common-1.6.5.jar:/Users/mbossert/.m2/repository/org/powermock/powermock-api-support/1.6.5/powermock-api-support-1.6.5.jar:/Users/mbossert/.m2/repository/org/powermock/powermock-module-jun
it4-rule/1.6.5/powermock-module-junit4-rule-1.6.5.jar:/Users/mbossert/.m2/repository/org/powermock/powermock-classloading-base/1.6.5/powermock-classloading-base-1.6.5.jar:/Users/mbossert/.m2/repository/org/powermock/powermock-reflect/1.6.5/powermock-reflect-1.6.5.jar:/Users/mbossert/.m2/repository/org/powermock/powermock-core/1.6.5/powermock-core-1.6.5.jar:/Users/mbossert/.m2/repository/org/javassist/javassist/3.20.0-GA/javassist-3.20.0-GA.jar:/Users/mbossert/.m2/repository/org/powermock/powermock-module-junit4-common/1.6.5/powermock-module-junit4-common-1.6.5.jar:/Users/mbossert/.m2/repository/org/powermock/powermock-classloading-xstream/1.6.5/powermock-classloading-xstream-1.6.5.jar:/Users/mbossert/.m2/repository/com/thoughtworks/xstream/xstream/1.4.9/xstream-1.4.9.jar:/Users/mbossert/.m2/repository/xmlpull/xmlpull/
1.1.3.1/xmlpull-1.1.3.1.jar:/Users/mbossert/.m2/repository/xpp3/xpp3_min/1.1.4c/xpp3_min-1.1.4c.jar:/Users/mbossert/.m2/repository/com/sun/jersey/jersey-test-framework/jersey-test-framework-grizzly2/1.9/jersey-test-framework-grizzly2-1.9.jar:/Users/mbossert/.m2/repository/com/sun/jersey/jersey-test-framework/jersey-test-framework-core/1.9/jersey-test-framework-core-1.9.jar:/Users/mbossert/.m2/repository/javax/servlet/javax.servlet-api/3.0.1/javax.servlet-api-3.0.1.jar:/Users/mbossert/.m2/repository/com/sun/jersey/jersey-grizzly2/1.9/jersey-grizzly2-1.9.jar:/Users/mbossert/.m2/repository/org/glassfish/grizzly/grizzly-http/2.1.2/grizzly-http-2.1.2.jar:/Users/mbossert/.m2/repository/org/glassfish/grizzly/grizzly-framework/2.1.2/grizzly-framework-2.1.2.jar:/Users/mbossert/.m2/repository/org/glassfish/gmbal/gmbal-api-only/3.0.0-b023/gmbal-api-only-3.0.0-b023.jar:/Users/mbossert/.m2/repository/org/glassfish/external/management-api/3.0.0-b012/management-api-3.0.0-b012.jar:/Users/mbossert/.m2/repository/org/glassfish/grizzly/grizzly-http-server/2.1.2/grizzly-http-server-2.1.2.jar:/Users/mbossert/.m2/repository/org/glassfish/grizzly/grizzly-rcm/2.1.2/grizzly-rcm-2.1.2.jar:/Users/mbossert/.m2/repository/org/glassfish/grizzly/grizzly-http-servlet/2.1.2/grizzly-http-servlet-2.1.2.jar:/Users/mbossert/.m2/repository/org/glassfish/javax.servlet/3.1/javax.servlet-3.1.jar:/Users/mbossert/.m2/repository/javax/servlet/servlet-api/2.5/servlet-api-2.5.jar:/Users/mbossert/.m2/repository/junit/junit/4.11/junit-4.11.jar:/Users/mbossert/.m2/repository/pl/pragmatists/JUnitParams/1.0.4/JUnitParams-1.0.4.jar"
com.intellij.rt.execution.junit.JUnitStarter -ideVersion5 -junit4
com.datatorrent.stram.StramRecoveryTest,testWriteAheadLog
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option
MaxPermSize=128m; support was removed in 8.0
2018-06-19 21:47:09,707 [main] WARN  util.NativeCodeLoader <clinit> -
Unable to load native-hadoop library for your platform... using
builtin-java classes where applicable

java.lang.AssertionError: flush count
Expected :1
Actual   :2


at org.junit.Assert.fail(Assert.java:88)
at org.junit.Assert.failNotEquals(Assert.java:743)
at org.junit.Assert.assertEquals(Assert.java:118)
at org.junit.Assert.assertEquals(Assert.java:555)
at
com.datatorrent.stram.StramRecoveryTest.testWriteAheadLog(StramRecoveryTest.java:326)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at
org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
at
org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
at
org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
at
org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
at
org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
at org.junit.rules.RunRules.evaluate(RunRules.java:20)
at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
at
org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
at
org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
at org.junit.runner.JUnitCore.run(JUnitCore.java:160)
at
com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:68)
at
com.intellij.rt.execution.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:47)
at
com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:242)
at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:70)


Process finished with exit code 255

For the second test (StatsTest), the failure is intermittent: it shows
up during the full install, but I cannot reproduce it when the test is
run in isolation.

On Tue, Jun 19, 2018 at 5:08 PM Pramod Immaneni <[hidden email]>
wrote:

> Do you see the same errors when you run the individual tests in question in
> isolation, such as using mvn test -Dtest=<test-class>. If you do can you
> paste the full logs of what you see when the individual tests fail.
>
> Thanks
>
> On Mon, Jun 18, 2018 at 11:41 AM Aaron Bossert <[hidden email]>
> wrote:
>
> > please disregard the first iteration...this ended up being related to a
> > hung build running in the background causing timeouts, I think.  I am
> still
> > having failures, but there are two and are still mysterious to me as to
> > their root cause.  Here are the actual failures:
> >
> > I don't immediately see how these are related to Kryo at all...but then
> > again, I am still familiarizing myself with the code base.  I am hoping
> > that someone out there has a lightbulb turn on and has some notion of how
> > they are related...
> >
> >
> >
> -------------------------------------------------------------------------------
> > Test set: com.datatorrent.stram.StramRecoveryTest
> >
> >
> -------------------------------------------------------------------------------
> > Tests run: 8, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 6.119 sec
> > <<< FAILURE! - in com.datatorrent.stram.StramRecoveryTest
> > testWriteAheadLog(com.datatorrent.stram.StramRecoveryTest)  Time elapsed:
> > 0.105 sec  <<< FAILURE!
> > java.lang.AssertionError: flush count expected:<1> but was:<2>
> > at
> >
> >
> com.datatorrent.stram.StramRecoveryTest.testWriteAheadLog(StramRecoveryTest.java:326)
> >
> >
> >
> -------------------------------------------------------------------------------
> > Test set: com.datatorrent.stram.engine.StatsTest
> >
> >
> -------------------------------------------------------------------------------
> > Tests run: 6, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 22.051
> sec
> > <<< FAILURE! - in com.datatorrent.stram.engine.StatsTest
> >
> >
> testQueueSizeForContainerLocalOperators(com.datatorrent.stram.engine.StatsTest)
> >  Time elapsed: 3.277 sec  <<< FAILURE!
> > java.lang.AssertionError: Validate input port queue size -1
> > at
> >
> >
> com.datatorrent.stram.engine.StatsTest.baseTestForQueueSize(StatsTest.java:270)
> > at
> >
> >
> com.datatorrent.stram.engine.StatsTest.testQueueSizeForContainerLocalOperators(StatsTest.java:285)
> >
> > On Mon, Jun 18, 2018 at 1:20 PM Aaron Bossert <[hidden email]>
> > wrote:
> >
> > > I recently attempted to update Kryo from 2.24.0 to 4.0.2 to address a
> > > serialization issue related to support for Java Instant and a couple of
> > > other classes that are supported in newer Kryo versions.  My test build
> > > and install (vanilla, no changes of any kind, just download apex-core
> > > and run "clean install") works fine; however, after updating the Kryo
> > > dependency to 4.0.2, I get this non-obvious (to me) error (running
> > > "clean install -X").
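For reference, the dependency change being described is the usual Maven version bump; the coordinates below are the standard Kryo 4.x ones, but the exact property or dependency-management entry in apex-core's pom may differ:

```xml
<!-- Hypothetical pom.xml change: bump Kryo from 2.24.0 to 4.0.2.
     com.esotericsoftware:kryo is the groupId used by Kryo 3.x/4.x releases. -->
<dependency>
  <groupId>com.esotericsoftware</groupId>
  <artifactId>kryo</artifactId>
  <version>4.0.2</version>
</dependency>
```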
> > > I also identified a bug, or perhaps a feature?  When building on my
> > > macOS laptop, I have an IDEA project folder in iCloud which is locally
> > > stored in a directory whose name contains a space, which needs to be
> > > escaped.  When I initially built, I kept running into errors related to
> > > that...I am not sure whether that is something that should be fixed (it
> > > is not as straightforward as I had hoped) or whether directory names
> > > should simply be required to contain no spaces.  I have no control over
> > > the iCloud local folder name...otherwise, I would have just fixed that.
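The space-in-path failure mode is a common one: any tool that flattens a path into a command string and later re-splits on whitespace will mangle it, while APIs that keep arguments separate are unaffected. A small illustration (the path below is hypothetical):

```java
import java.util.Arrays;
import java.util.List;

public class SpacePath {
    public static void main(String[] args) {
        // Hypothetical iCloud-style local path containing spaces.
        String dir = "/Users/me/iCloud Drive/Idea Projects";

        // Naive handling: build one command string, then split on spaces --
        // roughly what an unquoted shell interpolation or script does.
        String[] mangled = ("ls " + dir).split(" ");
        System.out.println("naive tokens  = " + mangled.length);

        // Robust handling: keep each argument separate, as ProcessBuilder expects.
        List<String> argv = Arrays.asList("ls", dir);
        System.out.println("robust tokens = " + argv.size());
    }
}
```

The naive split breaks the single path into three fragments (four tokens in total), which is why builds in such directories fail in hard-to-diagnose ways unless every consumer of the path quotes it.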
> > >
> > > 2018-06-18 12:43:24,485 [main] ERROR stram.RecoverableRpcProxy invoke - Giving up RPC connection recovery after 504 ms
> > > java.net.SocketTimeoutException: Call From MacBook-Pro-6.local/10.37.129.2 to MacBook-Pro-6.local:65136 failed on socket timeout exception: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/10.37.129.2:65137 remote=MacBook-Pro-6.local/10.37.129.2:65136]; For more details see: http://wiki.apache.org/hadoop/SocketTimeout
> > > at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> > > at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> > > at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> > > at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> > > at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> > > at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> > > at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> > > at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> > > at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> > > at com.sun.proxy.$Proxy138.log(Unknown Source)
> > > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > > at java.lang.reflect.Method.invoke(Method.java:498)
> > > at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
> > > at com.sun.proxy.$Proxy138.log(Unknown Source)
> > > at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:561)
> > > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > > at java.lang.reflect.Method.invoke(Method.java:498)
> > > at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> > > at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> > > at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> > > at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> > > at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> > > at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> > > at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> > > at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> > > at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> > > at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> > > at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> > > at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> > > at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> > > at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> > > at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> > > at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> > > at org.junit.runners.Suite.runChild(Suite.java:127)
> > > at org.junit.runners.Suite.runChild(Suite.java:26)
> > > at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> > > at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> > > at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> > > at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> > > at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> > > at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> > > at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
> > > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
> > > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
> > > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
> > > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
> > > at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
> > > at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
> > > at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
> > > at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
> > > Caused by: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/10.37.129.2:65137 remote=MacBook-Pro-6.local/10.37.129.2:65136]
> > > at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
> > > at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
> > > at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
> > > at java.io.FilterInputStream.read(FilterInputStream.java:133)
> > > at java.io.FilterInputStream.read(FilterInputStream.java:133)
> > > at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
> > > at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> > > at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
> > > at java.io.DataInputStream.readInt(DataInputStream.java:387)
> > > at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> > > at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
> > > 2018-06-18 12:43:24,987 [IPC Server handler 0 on 65136] WARN  ipc.Server processResponse - IPC Server handler 0 on 65136, call log(containerId, timeout), rpc version=2, client version=201208081755, methodsFingerPrint=-1300451462 from 10.37.129.2:65137 Call#141 Retry#0: output error
> > > 2018-06-18 12:43:24,999 [main] WARN  stram.RecoverableRpcProxy invoke - RPC failure, will retry after 100 ms (remaining 998 ms)
> > > java.net.SocketTimeoutException: Call From MacBook-Pro-6.local/10.37.129.2 to MacBook-Pro-6.local:65136 failed on socket timeout exception: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/10.37.129.2:65138 remote=MacBook-Pro-6.local/10.37.129.2:65136]; For more details see: http://wiki.apache.org/hadoop/SocketTimeout
> > > at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> > > at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> > > at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> > > at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> > > at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> > > at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> > > at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> > > at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> > > at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> > > at com.sun.proxy.$Proxy138.log(Unknown Source)
> > > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > > at java.lang.reflect.Method.invoke(Method.java:498)
> > > at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
> > > at com.sun.proxy.$Proxy138.log(Unknown Source)
> > > at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:575)
> > > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > > at java.lang.reflect.Method.invoke(Method.java:498)
> > > at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> > > at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> > > at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> > > at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> > > at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> > > at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> > > at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> > > at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> > > at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> > > at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> > > at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> > > at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> > > at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> > > at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> > > at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> > > at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> > > at org.junit.runners.Suite.runChild(Suite.java:127)
> > > at org.junit.runners.Suite.runChild(Suite.java:26)
> > > at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> > > at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> > > at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> > > at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> > > at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> > > at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> > > at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
> > > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
> > > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
> > > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
> > > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
> > > at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
> > > at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
> > > at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
> > > at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
> > > Caused by: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/10.37.129.2:65138 remote=MacBook-Pro-6.local/10.37.129.2:65136]
> > > at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
> > > at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
> > > at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
> > > at java.io.FilterInputStream.read(FilterInputStream.java:133)
> > > at java.io.FilterInputStream.read(FilterInputStream.java:133)
> > > at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
> > > at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> > > at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
> > > at java.io.DataInputStream.readInt(DataInputStream.java:387)
> > > at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> > > at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
> > > 2018-06-18 12:43:25,607 [main] WARN  stram.RecoverableRpcProxy invoke - RPC failure, will retry after 100 ms (remaining 390 ms)
> > > java.net.SocketTimeoutException: Call From MacBook-Pro-6.local/10.37.129.2 to MacBook-Pro-6.local:65136 failed on socket timeout exception: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/10.37.129.2:65139 remote=MacBook-Pro-6.local/10.37.129.2:65136]; For more details see: http://wiki.apache.org/hadoop/SocketTimeout
> > > at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> > > at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> > > at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> > > at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> > > at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> > > at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> > > at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> > > at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> > > at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> > > at com.sun.proxy.$Proxy138.log(Unknown Source)
> > > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > > at java.lang.reflect.Method.invoke(Method.java:498)
> > > at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
> > > at com.sun.proxy.$Proxy138.log(Unknown Source)
> > > at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:575)
> > > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > > at java.lang.reflect.Method.invoke(Method.java:498)
> > > at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> > > at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> > > at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> > > at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> > > at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> > > at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> > > at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> > > at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> > > at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> > > at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> > > at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> > > at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> > > at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> > > at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> > > at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> > > at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> > > at org.junit.runners.Suite.runChild(Suite.java:127)
> > > at org.junit.runners.Suite.runChild(Suite.java:26)
> > > at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> > > at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> > > at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> > > at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> > > at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> > > at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> > > at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
> > > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
> > > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
> > > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
> > > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
> > > at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
> > > at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
> > > at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
> > > at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
> > > Caused by: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/10.37.129.2:65139 remote=MacBook-Pro-6.local/10.37.129.2:65136]
> > > at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
> > > at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
> > > at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
> > > at java.io.FilterInputStream.read(FilterInputStream.java:133)
> > > at java.io.FilterInputStream.read(FilterInputStream.java:133)
> > > at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
> > > at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> > > at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
> > > at java.io.DataInputStream.readInt(DataInputStream.java:387)
> > > at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> > > at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
> > > 2018-06-18 12:43:25,987 [IPC Server handler 0 on 65136] WARN  ipc.Server processResponse - IPC Server handler 0 on 65136, call log(containerId, timeout), rpc version=2, client version=201208081755, methodsFingerPrint=-1300451462 from 10.37.129.2:65138 Call#142 Retry#0: output error
> > > 2018-06-18 12:43:26,603 [main] ERROR stram.RecoverableRpcProxy invoke - Giving up RPC connection recovery after 501 ms
> > > java.net.SocketTimeoutException: Call From MacBook-Pro-6.local/10.37.129.2 to MacBook-Pro-6.local:65136 failed on socket timeout exception: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/10.37.129.2:65141 remote=MacBook-Pro-6.local/10.37.129.2:65136]; For more details see: http://wiki.apache.org/hadoop/SocketTimeout
> > > at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> > > at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> > > at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> > > at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> > > at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> > > at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> > > at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> > > at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> > > at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> > > at com.sun.proxy.$Proxy138.log(Unknown Source)
> > > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > > at java.lang.reflect.Method.invoke(Method.java:498)
> > > at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
> > > at com.sun.proxy.$Proxy138.log(Unknown Source)
> > > at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:596)
> > > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > > at java.lang.reflect.Method.invoke(Method.java:498)
> > > at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> > > at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> > > at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> > > at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> > > at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> > > at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> > > at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> > > at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> > > at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> > > at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> > > at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> > > at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> > > at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> > > at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> > > at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> > > at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> > > at org.junit.runners.Suite.runChild(Suite.java:127)
> > > at org.junit.runners.Suite.runChild(Suite.java:26)
> > > at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> > > at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> > > at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> > > at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> > > at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> > > at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> > > at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
> > > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
> > > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
> > > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
> > > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
> > > at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
> > > at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
> > > at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
> > > at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
> > > Caused by: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/10.37.129.2:65141 remote=MacBook-Pro-6.local/10.37.129.2:65136]
> > > at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
> > > at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
> > > at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
> > > at java.io.FilterInputStream.read(FilterInputStream.java:133)
> > > at java.io.FilterInputStream.read(FilterInputStream.java:133)
> > > at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
> > > at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> > > at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
> > > at java.io.DataInputStream.readInt(DataInputStream.java:387)
> > > at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> > > at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
> > > 2018-06-18 12:43:27,105 [IPC Server handler 0 on 65136] WARN  ipc.Server processResponse - IPC Server handler 0 on 65136, call log(containerId, timeout), rpc version=2, client version=201208081755, methodsFingerPrint=-1300451462 from 10.37.129.2:65141 Call#146 Retry#0: output error
> > > 2018-06-18 12:43:27,114 [main] WARN  stram.RecoverableRpcProxy invoke - RPC failure, will retry after 100 ms (remaining 995 ms)
> > > java.net.SocketTimeoutException: Call From MacBook-Pro-6.local/
> > 10.37.129.2
> > >  to MacBook-Pro-6.local:65136 failed on socket timeout exception:
> > > java.net.SocketTimeoutException: 500 millis timeout while waiting for
> > > channel to be ready for read. ch :
> > > java.nio.channels.SocketChannel[connected local=/10.37.129.2:65142
> > > remote=MacBook-Pro-6.local/10.37.129.2:65136]; For more details see:
> > > http://wiki.apache.org/hadoop/SocketTimeout
> > > at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
> Method)
> > > at
> > >
> >
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> > > at
> > >
> >
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> > > at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> > > at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> > > at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> > > at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> > > at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> > > at
> > >
> >
> org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> > > at com.sun.proxy.$Proxy138.reportError(Unknown Source)
> > > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > > at
> > >
> >
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > > at
> > >
> >
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > > at java.lang.reflect.Method.invoke(Method.java:498)
> > > at
> > >
> >
> com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
> > > at com.sun.proxy.$Proxy138.reportError(Unknown Source)
> > > at
> > >
> >
> com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:610)
> > > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > > at
> > >
> >
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > > at
> > >
> >
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > > at java.lang.reflect.Method.invoke(Method.java:498)
> > > at
> > >
> >
> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> > > at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> > > at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> > > at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> > > at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> > > at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> > > at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> > > at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> > > at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> > > at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> > > at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> > > at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> > > at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> > > at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> > > at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> > > at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> > > at org.junit.runners.Suite.runChild(Suite.java:127)
> > > at org.junit.runners.Suite.runChild(Suite.java:26)
> > > at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> > > at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> > > at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> > > at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> > > at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> > > at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> > > at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
> > > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
> > > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
> > > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
> > > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
> > > at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
> > > at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
> > > at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
> > > at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
> > > Caused by: java.net.SocketTimeoutException: 500 millis timeout while
> > > waiting for channel to be ready for read. ch :
> > > java.nio.channels.SocketChannel[connected local=/10.37.129.2:65142
> > > remote=MacBook-Pro-6.local/10.37.129.2:65136]
> > > at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
> > > at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
> > > at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
> > > at java.io.FilterInputStream.read(FilterInputStream.java:133)
> > > at java.io.FilterInputStream.read(FilterInputStream.java:133)
> > > at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
> > > at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> > > at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
> > > at java.io.DataInputStream.readInt(DataInputStream.java:387)
> > > at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> > > at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
> > > 2018-06-18 12:43:27,722 [main] WARN  stram.RecoverableRpcProxy invoke -
> > > RPC failure, will retry after 100 ms (remaining 387 ms)
> > > java.net.SocketTimeoutException: Call From MacBook-Pro-6.local/10.37.129.2
> > > to MacBook-Pro-6.local:65136 failed on socket timeout exception:
> > > java.net.SocketTimeoutException: 500 millis timeout while waiting for
> > > channel to be ready for read. ch :
> > > java.nio.channels.SocketChannel[connected local=/10.37.129.2:65143
> > > remote=MacBook-Pro-6.local/10.37.129.2:65136]; For more details see:
> > > http://wiki.apache.org/hadoop/SocketTimeout
> > > at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> > > at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> > > at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> > > at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> > > at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> > > at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> > > at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> > > at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> > > at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> > > at com.sun.proxy.$Proxy138.reportError(Unknown Source)
> > > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > > at java.lang.reflect.Method.invoke(Method.java:498)
> > > at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
> > > at com.sun.proxy.$Proxy138.reportError(Unknown Source)
> > > at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:610)
> > > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > > at java.lang.reflect.Method.invoke(Method.java:498)
> > > at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> > > at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> > > at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> > > at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> > > at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> > > at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> > > at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> > > at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> > > at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> > > at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> > > at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> > > at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> > > at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> > > at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> > > at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> > > at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> > > at org.junit.runners.Suite.runChild(Suite.java:127)
> > > at org.junit.runners.Suite.runChild(Suite.java:26)
> > > at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> > > at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> > > at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> > > at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> > > at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> > > at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> > > at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
> > > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
> > > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
> > > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
> > > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
> > > at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
> > > at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
> > > at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
> > > at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
> > > Caused by: java.net.SocketTimeoutException: 500 millis timeout while
> > > waiting for channel to be ready for read. ch :
> > > java.nio.channels.SocketChannel[connected local=/10.37.129.2:65143
> > > remote=MacBook-Pro-6.local/10.37.129.2:65136]
> > > at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
> > > at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
> > > at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
> > > at java.io.FilterInputStream.read(FilterInputStream.java:133)
> > > at java.io.FilterInputStream.read(FilterInputStream.java:133)
> > > at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
> > > at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> > > at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
> > > at java.io.DataInputStream.readInt(DataInputStream.java:387)
> > > at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> > > at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
> > > 2018-06-18 12:43:28,109 [IPC Server handler 0 on 65136] WARN  ipc.Server
> > > processResponse - IPC Server handler 0 on 65136, call
> > > reportError(containerId, null, timeout, null), rpc version=2, client
> > > version=201208081755, methodsFingerPrint=-1300451462 from
> > > 10.37.129.2:65142 Call#147 Retry#0: output error
> > > 2018-06-18 12:43:28,292 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating
> > > target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app1/recovery/log
> > > 2018-06-18 12:43:28,423 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating
> > > target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app1/recovery/log
> > > 2018-06-18 12:43:28,491 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating
> > > target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app2/recovery/log
> > > 2018-06-18 12:43:28,492 [main] INFO  stram.StramClient copyInitialState -
> > > Copying initial state took 32 ms
> > > 2018-06-18 12:43:28,607 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating
> > > target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app2/recovery/log
> > > 2018-06-18 12:43:28,671 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating
> > > target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app3/recovery/log
> > > 2018-06-18 12:43:28,673 [main] INFO  stram.StramClient copyInitialState -
> > > Copying initial state took 35 ms
> > > 2018-06-18 12:43:28,805 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating
> > > target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app3/recovery/log
> > > 2018-06-18 12:43:28,830 [main] WARN  physical.PhysicalPlan <init> -
> > > Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container without
> > > locality contraint due to insufficient resources.
> > > 2018-06-18 12:43:28,830 [main] WARN  physical.PhysicalPlan <init> -
> > > Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container without
> > > locality contraint due to insufficient resources.
> > > 2018-06-18 12:43:28,830 [main] WARN  physical.PhysicalPlan <init> -
> > > Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container without
> > > locality contraint due to insufficient resources.
> > > 2018-06-18 12:43:29,046 [main] WARN  physical.PhysicalPlan <init> -
> > > Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container without
> > > locality contraint due to insufficient resources.
> > > 2018-06-18 12:43:29,046 [main] WARN  physical.PhysicalPlan <init> -
> > > Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container without
> > > locality contraint due to insufficient resources.
> > > 2018-06-18 12:43:29,047 [main] WARN  physical.PhysicalPlan <init> -
> > > Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container without
> > > locality contraint due to insufficient resources.
> > > 2018-06-18 12:43:29,226 [main] INFO  util.AsyncFSStorageAgent save - using
> > > /Users/mbossert/testIdea/apex-core/engine/target/chkp1927717229509930939 as
> > > the basepath for checkpointing.
> > > 2018-06-18 12:43:29,339 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating
> > > target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app1/recovery/log
> > > 2018-06-18 12:43:29,428 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating
> > > target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app1/recovery/log
> > > 2018-06-18 12:43:29,493 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating
> > > target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app2/recovery/log
> > > 2018-06-18 12:43:29,494 [main] INFO  stram.StramClient copyInitialState -
> > > Copying initial state took 29 ms
> > > 2018-06-18 12:43:29,592 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating
> > > target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app2/recovery/log
> > > 2018-06-18 12:43:29,649 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating
> > > target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app3/recovery/log
> > > 2018-06-18 12:43:29,651 [main] INFO  stram.StramClient copyInitialState -
> > > Copying initial state took 32 ms
> > > 2018-06-18 12:43:29,780 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating
> > > target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app3/recovery/log
> > > 2018-06-18 12:43:29,808 [main] WARN  physical.PhysicalPlan <init> -
> > > Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container without
> > > locality contraint due to insufficient resources.
> > > 2018-06-18 12:43:29,809 [main] WARN  physical.PhysicalPlan <init> -
> > > Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container without
> > > locality contraint due to insufficient resources.
> > > 2018-06-18 12:43:29,809 [main] WARN  physical.PhysicalPlan <init> -
> > > Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container without
> > > locality contraint due to insufficient resources.
> > > 2018-06-18 12:43:29,809 [main] INFO  util.AsyncFSStorageAgent save - using
> > > /Users/mbossert/testIdea/apex-core/engine/target/chkp1976097017195725194 as
> > > the basepath for checkpointing.
> > > 2018-06-18 12:43:30,050 [main] WARN  physical.PhysicalPlan <init> -
> > > Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container without
> > > locality contraint due to insufficient resources.
> > > 2018-06-18 12:43:30,050 [main] WARN  physical.PhysicalPlan <init> -
> > > Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container without
> > > locality contraint due to insufficient resources.
> > > 2018-06-18 12:43:30,050 [main] WARN  physical.PhysicalPlan <init> -
> > > Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container without
> > > locality contraint due to insufficient resources.
> > > 2018-06-18 12:43:30,051 [main] INFO  util.AsyncFSStorageAgent save - using
> > > /Users/mbossert/testIdea/apex-core/engine/target/chkp3935270209625805644 as
> > > the basepath for checkpointing.
> > > Tests run: 8, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 6.329 sec
> > > <<< FAILURE! - in com.datatorrent.stram.StramRecoveryTest
> > > testWriteAheadLog(com.datatorrent.stram.StramRecoveryTest)  Time elapsed:
> > > 0.097 sec  <<< FAILURE!
> > > java.lang.AssertionError: flush count expected:<1> but was:<2>
> > > at com.datatorrent.stram.StramRecoveryTest.testWriteAheadLog(StramRecoveryTest.java:326)
>


--

M. Aaron Bossert
(571) 242-4021
Punch Cyber Analytics Group

Re: Branch 3.7.0 failing install related to Kryo version...perhaps

Vlad Rozov-2
At a minimum, one problem is caused by a change in Kryo's behavior. Please
take a look at Journal.setOutputStream(): it contains a workaround for Kryo's
flush() that is no longer needed:

if (out != null) {
  output = new Output(4096, -1)
  {
    @Override
    public void flush() throws KryoException
    {
      super.flush();
      // Kryo does not flush the underlying output stream during flush(), so flush it explicitly.
      try {
        getOutputStream().flush();
      } catch (IOException e) {
        throw new KryoException(e);
      }
    }
  };
}
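The failing assertion later in the thread ("flush count expected:<1> but was:<2>") is the symptom you would expect when both Kryo's own flush() and the old workaround flush the underlying stream. Here is a minimal, Kryo-free sketch of how a doubled flush can be counted; the `FlushCountingStream` helper and `FlushCountDemo` class are hypothetical names used only for illustration, not part of apex-core:

```java
import java.io.ByteArrayOutputStream;
import java.io.FilterOutputStream;
import java.io.IOException;
import java.io.OutputStream;

// Hypothetical helper: counts how many times flush() reaches the underlying stream.
class FlushCountingStream extends FilterOutputStream {
    int flushes = 0;

    FlushCountingStream(OutputStream out) {
        super(out);
    }

    @Override
    public void flush() throws IOException {
        flushes++;
        super.flush();
    }
}

public class FlushCountDemo {
    public static void main(String[] args) throws IOException {
        FlushCountingStream sink = new FlushCountingStream(new ByteArrayOutputStream());
        sink.write(42);
        sink.flush(); // flush propagated by the serializer itself
        sink.flush(); // a second flush, e.g. from a now-redundant workaround
        System.out.println("flushes=" + sink.flushes); // prints flushes=2
    }
}
```

If Kryo 4's Output.flush() already flushes its wrapped stream, a workaround like the one above adds a second flush per call, which a counting test like StramRecoveryTest.testWriteAheadLog would observe.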

Thank you,

Vlad
On 6/19/18 18:53, Aaron Bossert wrote:

> Pramod,
>
> Thanks for taking the time to help!
>
> Here is the output (just failed parts) when running full install (clean
> install -X) on the Master branch:
>
> Running com.datatorrent.stram.StramRecoveryTest
> 2018-06-19 21:34:28,137 [main] INFO  stram.StramRecoveryTest
> testRpcFailover - Mock server listening at macbook-pro-6.lan/
> 192.168.87.125:62154
> 2018-06-19 21:34:28,678 [main] ERROR stram.RecoverableRpcProxy invoke -
> Giving up RPC connection recovery after 507 ms
> java.net.SocketTimeoutException: Call From macbook-pro-6.lan/192.168.87.125
> to macbook-pro-6.lan:62154 failed on socket timeout exception:
> java.net.SocketTimeoutException: 500 millis timeout while waiting for
> channel to be ready for read. ch :
> java.nio.channels.SocketChannel[connected local=/192.168.87.125:62155
> remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details see:
> http://wiki.apache.org/hadoop/SocketTimeout
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> at
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> at
> org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> at com.sun.proxy.$Proxy138.log(Unknown Source)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at
> com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
> at com.sun.proxy.$Proxy138.log(Unknown Source)
> at
> com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:561)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at
> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> at
> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> at
> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> at
> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> at
> org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> at
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> at
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> at org.junit.runners.Suite.runChild(Suite.java:127)
> at org.junit.runners.Suite.runChild(Suite.java:26)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
> at
> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
> at
> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
> at
> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
> at
> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
> at
> org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
> at
> org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
> at
> org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
> at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
> Caused by: java.net.SocketTimeoutException: 500 millis timeout while
> waiting for channel to be ready for read. ch :
> java.nio.channels.SocketChannel[connected local=/192.168.87.125:62155
> remote=macbook-pro-6.lan/192.168.87.125:62154]
> at
> org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> at
> org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
> at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
> at java.io.DataInputStream.readInt(DataInputStream.java:387)
> at
> org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
> 2018-06-19 21:34:29,178 [IPC Server handler 0 on 62154] WARN  ipc.Server
> processResponse - IPC Server handler 0 on 62154, call log(containerId,
> timeout), rpc version=2, client version=201208081755,
> methodsFingerPrint=-1300451462 from 192.168.87.125:62155 Call#136 Retry#0:
> output error
> 2018-06-19 21:34:29,198 [main] WARN  stram.RecoverableRpcProxy invoke - RPC
> failure, will retry after 100 ms (remaining 994 ms)
> java.net.SocketTimeoutException: Call From macbook-pro-6.lan/192.168.87.125
> to macbook-pro-6.lan:62154 failed on socket timeout exception:
> java.net.SocketTimeoutException: 500 millis timeout while waiting for
> channel to be ready for read. ch :
> java.nio.channels.SocketChannel[connected local=/192.168.87.125:62156
> remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details see:
> http://wiki.apache.org/hadoop/SocketTimeout
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> at
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> at
> org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> at com.sun.proxy.$Proxy138.log(Unknown Source)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at
> com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
> at com.sun.proxy.$Proxy138.log(Unknown Source)
> at
> com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:575)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at
> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> at
> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> at
> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> at
> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> at
> org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> at
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> at
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> at org.junit.runners.Suite.runChild(Suite.java:127)
> at org.junit.runners.Suite.runChild(Suite.java:26)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
> at
> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
> at
> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
> at
> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
> at
> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
> at
> org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
> at
> org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
> at
> org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
> at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
> Caused by: java.net.SocketTimeoutException: 500 millis timeout while
> waiting for channel to be ready for read. ch :
> java.nio.channels.SocketChannel[connected local=/192.168.87.125:62156
> remote=macbook-pro-6.lan/192.168.87.125:62154]
> at
> org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> at
> org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
> at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
> at java.io.DataInputStream.readInt(DataInputStream.java:387)
> at
> org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
> 2018-06-19 21:34:29,806 [main] WARN  stram.RecoverableRpcProxy invoke - RPC
> failure, will retry after 100 ms (remaining 386 ms)
> java.net.SocketTimeoutException: Call From macbook-pro-6.lan/192.168.87.125
> to macbook-pro-6.lan:62154 failed on socket timeout exception:
> java.net.SocketTimeoutException: 500 millis timeout while waiting for
> channel to be ready for read. ch :
> java.nio.channels.SocketChannel[connected local=/192.168.87.125:62157
> remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details see:
> http://wiki.apache.org/hadoop/SocketTimeout
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> at
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> at
> org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> at com.sun.proxy.$Proxy138.log(Unknown Source)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at
> com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
> at com.sun.proxy.$Proxy138.log(Unknown Source)
> at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:575)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> at org.junit.runners.Suite.runChild(Suite.java:127)
> at org.junit.runners.Suite.runChild(Suite.java:26)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
> at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
> at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
> at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
> at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
> Caused by: java.net.SocketTimeoutException: 500 millis timeout while
> waiting for channel to be ready for read. ch :
> java.nio.channels.SocketChannel[connected local=/192.168.87.125:62157
> remote=macbook-pro-6.lan/192.168.87.125:62154]
> at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
> at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
> at java.io.DataInputStream.readInt(DataInputStream.java:387)
> at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
> 2018-06-19 21:34:30,180 [IPC Server handler 0 on 62154] WARN  ipc.Server
> processResponse - IPC Server handler 0 on 62154, call log(containerId,
> timeout), rpc version=2, client version=201208081755,
> methodsFingerPrint=-1300451462 from 192.168.87.125:62156 Call#137 Retry#0:
> output error
> 2018-06-19 21:34:30,808 [main] ERROR stram.RecoverableRpcProxy invoke -
> Giving up RPC connection recovery after 506 ms
> java.net.SocketTimeoutException: Call From macbook-pro-6.lan/192.168.87.125
> to macbook-pro-6.lan:62154 failed on socket timeout exception:
> java.net.SocketTimeoutException: 500 millis timeout while waiting for
> channel to be ready for read. ch :
> java.nio.channels.SocketChannel[connected local=/192.168.87.125:62159
> remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details see:
> http://wiki.apache.org/hadoop/SocketTimeout
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> at com.sun.proxy.$Proxy138.log(Unknown Source)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
> at com.sun.proxy.$Proxy138.log(Unknown Source)
> at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:596)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> at org.junit.runners.Suite.runChild(Suite.java:127)
> at org.junit.runners.Suite.runChild(Suite.java:26)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
> at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
> at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
> at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
> at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
> Caused by: java.net.SocketTimeoutException: 500 millis timeout while
> waiting for channel to be ready for read. ch :
> java.nio.channels.SocketChannel[connected local=/192.168.87.125:62159
> remote=macbook-pro-6.lan/192.168.87.125:62154]
> at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
> at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
> at java.io.DataInputStream.readInt(DataInputStream.java:387)
> at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
> 2018-06-19 21:34:31,307 [IPC Server handler 0 on 62154] WARN  ipc.Server
> processResponse - IPC Server handler 0 on 62154, call log(containerId,
> timeout), rpc version=2, client version=201208081755,
> methodsFingerPrint=-1300451462 from 192.168.87.125:62159 Call#141 Retry#0:
> output error
> 2018-06-19 21:34:31,327 [main] WARN  stram.RecoverableRpcProxy invoke - RPC
> failure, will retry after 100 ms (remaining 995 ms)
> java.net.SocketTimeoutException: Call From macbook-pro-6.lan/192.168.87.125
> to macbook-pro-6.lan:62154 failed on socket timeout exception:
> java.net.SocketTimeoutException: 500 millis timeout while waiting for
> channel to be ready for read. ch :
> java.nio.channels.SocketChannel[connected local=/192.168.87.125:62160
> remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details see:
> http://wiki.apache.org/hadoop/SocketTimeout
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> at com.sun.proxy.$Proxy138.reportError(Unknown Source)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
> at com.sun.proxy.$Proxy138.reportError(Unknown Source)
> at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:610)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> at org.junit.runners.Suite.runChild(Suite.java:127)
> at org.junit.runners.Suite.runChild(Suite.java:26)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
> at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
> at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
> at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
> at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
> Caused by: java.net.SocketTimeoutException: 500 millis timeout while
> waiting for channel to be ready for read. ch :
> java.nio.channels.SocketChannel[connected local=/192.168.87.125:62160
> remote=macbook-pro-6.lan/192.168.87.125:62154]
> at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
> at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
> at java.io.DataInputStream.readInt(DataInputStream.java:387)
> at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
> 2018-06-19 21:34:31,931 [main] WARN  stram.RecoverableRpcProxy invoke - RPC
> failure, will retry after 100 ms (remaining 391 ms)
> java.net.SocketTimeoutException: Call From macbook-pro-6.lan/192.168.87.125
> to macbook-pro-6.lan:62154 failed on socket timeout exception:
> java.net.SocketTimeoutException: 500 millis timeout while waiting for
> channel to be ready for read. ch :
> java.nio.channels.SocketChannel[connected local=/192.168.87.125:62161
> remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details see:
> http://wiki.apache.org/hadoop/SocketTimeout
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> at com.sun.proxy.$Proxy138.reportError(Unknown Source)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
> at com.sun.proxy.$Proxy138.reportError(Unknown Source)
> at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:610)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> at org.junit.runners.Suite.runChild(Suite.java:127)
> at org.junit.runners.Suite.runChild(Suite.java:26)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
> at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
> at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
> at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
> at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
> Caused by: java.net.SocketTimeoutException: 500 millis timeout while
> waiting for channel to be ready for read. ch :
> java.nio.channels.SocketChannel[connected local=/192.168.87.125:62161
> remote=macbook-pro-6.lan/192.168.87.125:62154]
> at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
> at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
> at java.io.DataInputStream.readInt(DataInputStream.java:387)
> at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
> 2018-06-19 21:34:32,310 [IPC Server handler 0 on 62154] WARN  ipc.Server
> processResponse - IPC Server handler 0 on 62154, call
> reportError(containerId, null, timeout, null), rpc version=2, client
> version=201208081755, methodsFingerPrint=-1300451462 from
> 192.168.87.125:62160 Call#142 Retry#0: output error
> 2018-06-19 21:34:32,512 [main] INFO  stram.FSRecoveryHandler rotateLog -
> Creating
> target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app1/recovery/log
> 2018-06-19 21:34:32,628 [main] INFO  stram.FSRecoveryHandler rotateLog -
> Creating
> target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app1/recovery/log
> 2018-06-19 21:34:32,696 [main] INFO  stram.FSRecoveryHandler rotateLog -
> Creating
> target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app2/recovery/log
> 2018-06-19 21:34:32,698 [main] INFO  stram.StramClient copyInitialState -
> Copying initial state took 32 ms
> 2018-06-19 21:34:32,799 [main] INFO  stram.FSRecoveryHandler rotateLog -
> Creating
> target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app2/recovery/log
> 2018-06-19 21:34:32,850 [main] INFO  stram.FSRecoveryHandler rotateLog -
> Creating
> target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app3/recovery/log
> 2018-06-19 21:34:32,851 [main] INFO  stram.StramClient copyInitialState -
> Copying initial state took 28 ms
> 2018-06-19 21:34:32,955 [main] INFO  stram.FSRecoveryHandler rotateLog -
> Creating
> target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app3/recovery/log
> 2018-06-19 21:34:32,976 [main] WARN  physical.PhysicalPlan <init> -
> Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container without
> locality contraint due to insufficient resources.
> 2018-06-19 21:34:32,977 [main] WARN  physical.PhysicalPlan <init> -
> Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container without
> locality contraint due to insufficient resources.
> 2018-06-19 21:34:32,977 [main] WARN  physical.PhysicalPlan <init> -
> Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container without
> locality contraint due to insufficient resources.
> 2018-06-19 21:34:33,166 [main] WARN  physical.PhysicalPlan <init> -
> Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container without
> locality contraint due to insufficient resources.
> 2018-06-19 21:34:33,166 [main] WARN  physical.PhysicalPlan <init> -
> Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container without
> locality contraint due to insufficient resources.
> 2018-06-19 21:34:33,166 [main] WARN  physical.PhysicalPlan <init> -
> Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container without
> locality contraint due to insufficient resources.
> 2018-06-19 21:34:33,338 [main] INFO  util.AsyncFSStorageAgent save - using
> /Users/mbossert/testIdea/apex-core/engine/target/chkp2603930902590449397 as
> the basepath for checkpointing.
> 2018-06-19 21:34:33,436 [main] INFO  stram.FSRecoveryHandler rotateLog -
> Creating
> target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app1/recovery/log
> 2018-06-19 21:34:33,505 [main] INFO  stram.FSRecoveryHandler rotateLog -
> Creating
> target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app1/recovery/log
> 2018-06-19 21:34:33,553 [main] INFO  stram.FSRecoveryHandler rotateLog -
> Creating
> target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app2/recovery/log
> 2018-06-19 21:34:33,554 [main] INFO  stram.StramClient copyInitialState -
> Copying initial state took 22 ms
> 2018-06-19 21:34:33,642 [main] INFO  stram.FSRecoveryHandler rotateLog -
> Creating
> target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app2/recovery/log
> 2018-06-19 21:34:33,690 [main] INFO  stram.FSRecoveryHandler rotateLog -
> Creating
> target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app3/recovery/log
> 2018-06-19 21:34:33,691 [main] INFO  stram.StramClient copyInitialState -
> Copying initial state took 29 ms
> 2018-06-19 21:34:33,805 [main] INFO  stram.FSRecoveryHandler rotateLog -
> Creating
> target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app3/recovery/log
> 2018-06-19 21:34:33,830 [main] WARN  physical.PhysicalPlan <init> -
> Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container without
> locality contraint due to insufficient resources.
> 2018-06-19 21:34:33,830 [main] WARN  physical.PhysicalPlan <init> -
> Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container without
> locality contraint due to insufficient resources.
> 2018-06-19 21:34:33,831 [main] WARN  physical.PhysicalPlan <init> -
> Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container without
> locality contraint due to insufficient resources.
> 2018-06-19 21:34:33,831 [main] INFO  util.AsyncFSStorageAgent save - using
> /Users/mbossert/testIdea/apex-core/engine/target/chkp1878353095301008843 as
> the basepath for checkpointing.
> 2018-06-19 21:34:34,077 [main] WARN  physical.PhysicalPlan <init> -
> Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container without
> locality contraint due to insufficient resources.
> 2018-06-19 21:34:34,077 [main] WARN  physical.PhysicalPlan <init> -
> Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container without
> locality contraint due to insufficient resources.
> 2018-06-19 21:34:34,077 [main] WARN  physical.PhysicalPlan <init> -
> Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container without
> locality contraint due to insufficient resources.
> 2018-06-19 21:34:34,077 [main] INFO  util.AsyncFSStorageAgent save - using
> /Users/mbossert/testIdea/apex-core/engine/target/chkp7337975615972280003 as
> the basepath for checkpointing.
> Tests run: 8, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 6.143 sec
> <<< FAILURE! - in com.datatorrent.stram.StramRecoveryTest
> testWriteAheadLog(com.datatorrent.stram.StramRecoveryTest)  Time elapsed:
> 0.111 sec  <<< FAILURE!
> java.lang.AssertionError: flush count expected:<1> but was:<2>
> at com.datatorrent.stram.StramRecoveryTest.testWriteAheadLog(StramRecoveryTest.java:326)
>
>
> Running com.datatorrent.stram.CustomControlTupleTest
> 2018-06-19 21:34:49,308 [main] INFO  util.AsyncFSStorageAgent save - using
> /Users/mbossert/testIdea/apex-core/engine/target/chkp1213673348429546877 as
> the basepath for checkpointing.
> 2018-06-19 21:34:49,451 [main] INFO  storage.DiskStorage <init> - using
> /Users/mbossert/testIdea/apex-core/engine/target as the basepath for
> spooling.
> 2018-06-19 21:34:49,451 [ProcessWideEventLoop] INFO  server.Server
> registered - Server started listening at /0:0:0:0:0:0:0:0:62181
> 2018-06-19 21:34:49,451 [main] INFO  stram.StramLocalCluster run - Buffer
> server started: localhost:62181
> 2018-06-19 21:34:49,452 [container-0] INFO  stram.StramLocalCluster run -
> Started container container-0
> 2018-06-19 21:34:49,452 [container-1] INFO  stram.StramLocalCluster run -
> Started container container-1
> 2018-06-19 21:34:49,452 [container-2] INFO  stram.StramLocalCluster run -
> Started container container-2
> 2018-06-19 21:34:49,452 [container-1] INFO  stram.StramLocalCluster log -
> container-1 msg: [container-1] Entering heartbeat loop..
> 2018-06-19 21:34:49,452 [container-0] INFO  stram.StramLocalCluster log -
> container-0 msg: [container-0] Entering heartbeat loop..
> 2018-06-19 21:34:49,452 [container-2] INFO  stram.StramLocalCluster log -
> container-2 msg: [container-2] Entering heartbeat loop..
> 2018-06-19 21:34:50,460 [container-2] INFO  engine.StreamingContainer
> processHeartbeatResponse - Deploy request:
> [OperatorDeployInfo[id=3,name=receiver,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=ProcessorToReceiver,sourceNodeId=2,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[]]]
> 2018-06-19 21:34:50,460 [container-0] INFO  engine.StreamingContainer
> processHeartbeatResponse - Deploy request:
> [OperatorDeployInfo[id=1,name=randomGenerator,type=INPUT,checkpoint={ffffffffffffffff, 0, 0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=genToProcessor,bufferServer=localhost]]]]
> 2018-06-19 21:34:50,460 [container-1] INFO  engine.StreamingContainer
> processHeartbeatResponse - Deploy request:
> [OperatorDeployInfo[id=2,name=process,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=localhost]]]]
> 2018-06-19 21:34:50,463 [container-0] INFO  engine.WindowGenerator activate
> - Catching up from 1529458489500 to 1529458490463
> 2018-06-19 21:34:50,465 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
> identifier=tcp://localhost:62181/2.output.1, windowId=ffffffffffffffff,
> type=ProcessorToReceiver/3.input, upstreamIdentifier=2.output.1, mask=0,
> partitions=null, bufferSize=1024}
> 2018-06-19 21:34:50,466 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received publisher request: PublishRequestTuple{version=1.0,
> identifier=1.out.1, windowId=ffffffffffffffff}
> 2018-06-19 21:34:50,466 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received publisher request: PublishRequestTuple{version=1.0,
> identifier=2.output.1, windowId=ffffffffffffffff}
> 2018-06-19 21:34:50,466 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
> identifier=tcp://localhost:62181/1.out.1, windowId=ffffffffffffffff,
> type=genToProcessor/2.input, upstreamIdentifier=1.out.1, mask=0,
> partitions=null, bufferSize=1024}
> 2018-06-19 21:34:51,458 [main] INFO  stram.StramLocalCluster run - Stopping
> on exit condition
> 2018-06-19 21:34:51,458 [container-0] INFO  engine.StreamingContainer
> processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:34:51,458 [container-1] INFO  engine.StreamingContainer
> processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:34:51,458 [container-0] INFO  stram.StramLocalCluster log -
> container-0 msg: [container-0] Exiting heartbeat loop..
> 2018-06-19 21:34:51,458 [container-2] INFO  engine.StreamingContainer
> processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:34:51,458 [container-2] INFO  stram.StramLocalCluster log -
> container-2 msg: [container-2] Exiting heartbeat loop..
> 2018-06-19 21:34:51,458 [container-1] INFO  stram.StramLocalCluster log -
> container-1 msg: [container-1] Exiting heartbeat loop..
> 2018-06-19 21:34:51,461 [container-2] INFO  stram.StramLocalCluster run -
> Container container-2 terminating.
> 2018-06-19 21:34:51,467 [container-1] INFO  stram.StramLocalCluster run -
> Container container-1 terminating.
> 2018-06-19 21:34:51,467 [container-0] INFO  stram.StramLocalCluster run -
> Container container-0 terminating.
> 2018-06-19 21:34:51,467 [ServerHelper-86-1] INFO  server.Server run -
> Removing ln LogicalNode@7d88b4a4identifier=tcp://localhost:62181/2.output.1, upstream=2.output.1, group=ProcessorToReceiver/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@35d66f18{da=com.datatorrent.bufferserver.internal.DataList$Block@d43c092{identifier=2.output.1, data=1048576, readingOffset=0, writingOffset=481, starting_window=5b29af3900000001, ending_window=5b29af3900000005, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@4dca4fb0[identifier=2.output.1]
> 2018-06-19 21:34:51,468 [ServerHelper-86-1] INFO  server.Server run -
> Removing ln LogicalNode@3cb5be9fidentifier=tcp://localhost:62181/1.out.1, upstream=1.out.1, group=genToProcessor/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@5c9a41d0{da=com.datatorrent.bufferserver.internal.DataList$Block@5a324bf4{identifier=1.out.1, data=1048576, readingOffset=0, writingOffset=481, starting_window=5b29af3900000001, ending_window=5b29af3900000005, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@49665770[identifier=1.out.1]
> 2018-06-19 21:34:51,469 [ProcessWideEventLoop] INFO  server.Server run -
> Server stopped listening at /0:0:0:0:0:0:0:0:62181
> 2018-06-19 21:34:51,469 [main] INFO  stram.StramLocalCluster run -
> Application finished.
> 2018-06-19 21:34:51,469 [main] INFO  stram.CustomControlTupleTest testApp -
> Control Tuples received 3 expected 3
> 2018-06-19 21:34:51,492 [main] INFO  util.AsyncFSStorageAgent save - using
> /Users/mbossert/testIdea/apex-core/engine/target/chkp5496551078484285394 as
> the basepath for checkpointing.
> 2018-06-19 21:34:51,623 [main] INFO  storage.DiskStorage <init> - using
> /Users/mbossert/testIdea/apex-core/engine/target as the basepath for
> spooling.
> 2018-06-19 21:34:51,624 [ProcessWideEventLoop] INFO  server.Server
> registered - Server started listening at /0:0:0:0:0:0:0:0:62186
> 2018-06-19 21:34:51,624 [main] INFO  stram.StramLocalCluster run - Buffer
> server started: localhost:62186
> 2018-06-19 21:34:51,624 [container-0] INFO  stram.StramLocalCluster run -
> Started container container-0
> 2018-06-19 21:34:51,624 [container-0] INFO  stram.StramLocalCluster log -
> container-0 msg: [container-0] Entering heartbeat loop..
> 2018-06-19 21:34:52,628 [container-0] INFO  engine.StreamingContainer
> processHeartbeatResponse - Deploy request:
> [OperatorDeployInfo[id=1,name=randomGenerator,type=INPUT,checkpoint={ffffffffffffffff,
> 0,
> 0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=genToProcessor,bufferServer=<null>]]],
> OperatorDeployInfo[id=2,name=process,type=OIO,checkpoint={ffffffffffffffff,
> 0,
> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=THREAD_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=<null>]]],
> OperatorDeployInfo[id=3,name=receiver,type=OIO,checkpoint={ffffffffffffffff,
> 0,
> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=ProcessorToReceiver,sourceNodeId=2,sourcePortName=output,locality=THREAD_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[]]]
> 2018-06-19 21:34:52,630 [container-0] INFO  engine.WindowGenerator activate
> - Catching up from 1529458491500 to 1529458492630
> 2018-06-19 21:34:53,628 [main] INFO  stram.StramLocalCluster run - Stopping
> on exit condition
> 2018-06-19 21:34:53,629 [container-0] INFO  engine.StreamingContainer
> processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:34:53,630 [container-0] INFO  stram.StramLocalCluster log -
> container-0 msg: [container-0] Exiting heartbeat loop..
> 2018-06-19 21:34:53,640 [container-0] INFO  stram.StramLocalCluster run -
> Container container-0 terminating.
> 2018-06-19 21:34:53,641 [ProcessWideEventLoop] INFO  server.Server run -
> Server stopped listening at /0:0:0:0:0:0:0:0:62186
> 2018-06-19 21:34:53,642 [main] INFO  stram.StramLocalCluster run -
> Application finished.
> 2018-06-19 21:34:53,642 [main] INFO  stram.CustomControlTupleTest testApp -
> Control Tuples received 3 expected 3
> 2018-06-19 21:34:53,659 [main] INFO  util.AsyncFSStorageAgent save - using
> /Users/mbossert/testIdea/apex-core/engine/target/chkp2212795894390935125 as
> the basepath for checkpointing.
> 2018-06-19 21:34:53,844 [main] INFO  storage.DiskStorage <init> - using
> /Users/mbossert/testIdea/apex-core/engine/target as the basepath for
> spooling.
> 2018-06-19 21:34:53,844 [ProcessWideEventLoop] INFO  server.Server
> registered - Server started listening at /0:0:0:0:0:0:0:0:62187
> 2018-06-19 21:34:53,844 [main] INFO  stram.StramLocalCluster run - Buffer
> server started: localhost:62187
> 2018-06-19 21:34:53,845 [container-0] INFO  stram.StramLocalCluster run -
> Started container container-0
> 2018-06-19 21:34:53,845 [container-1] INFO  stram.StramLocalCluster run -
> Started container container-1
> 2018-06-19 21:34:53,845 [container-0] INFO  stram.StramLocalCluster log -
> container-0 msg: [container-0] Entering heartbeat loop..
> 2018-06-19 21:34:53,845 [container-2] INFO  stram.StramLocalCluster run -
> Started container container-2
> 2018-06-19 21:34:53,845 [container-1] INFO  stram.StramLocalCluster log -
> container-1 msg: [container-1] Entering heartbeat loop..
> 2018-06-19 21:34:53,845 [container-3] INFO  stram.StramLocalCluster run -
> Started container container-3
> 2018-06-19 21:34:53,845 [container-2] INFO  stram.StramLocalCluster log -
> container-2 msg: [container-2] Entering heartbeat loop..
> 2018-06-19 21:34:53,845 [container-3] INFO  stram.StramLocalCluster log -
> container-3 msg: [container-3] Entering heartbeat loop..
> 2018-06-19 21:34:54,850 [container-3] INFO  engine.StreamingContainer
> processHeartbeatResponse - Deploy request:
> [OperatorDeployInfo[id=1,name=randomGenerator,type=INPUT,checkpoint={ffffffffffffffff,
> 0,
> 0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=genToProcessor,bufferServer=localhost]]]]
> 2018-06-19 21:34:54,850 [container-1] INFO  engine.StreamingContainer
> processHeartbeatResponse - Deploy request:
> [OperatorDeployInfo[id=3,name=process,type=GENERIC,checkpoint={ffffffffffffffff,
> 0,
> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=<null>,partitionMask=1,partitionKeys=[1]]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=localhost]]]]
> 2018-06-19 21:34:54,850 [container-0] INFO  engine.StreamingContainer
> processHeartbeatResponse - Deploy request:
> [OperatorDeployInfo[id=4,name=receiver,type=GENERIC,checkpoint={ffffffffffffffff,
> 0,
> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=ProcessorToReceiver,sourceNodeId=5,sourcePortName=outputPort,locality=CONTAINER_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[]],
> OperatorDeployInfo.UnifierDeployInfo[id=5,name=process.output#unifier,type=UNIFIER,checkpoint={ffffffffffffffff,
> 0,
> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=<merge#output>,streamId=ProcessorToReceiver,sourceNodeId=2,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>],
> OperatorDeployInfo.InputDeployInfo[portName=<merge#output>,streamId=ProcessorToReceiver,sourceNodeId=3,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=outputPort,streamId=ProcessorToReceiver,bufferServer=<null>]]]]
> 2018-06-19 21:34:54,850 [container-2] INFO  engine.StreamingContainer
> processHeartbeatResponse - Deploy request:
> [OperatorDeployInfo[id=2,name=process,type=GENERIC,checkpoint={ffffffffffffffff,
> 0,
> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=<null>,partitionMask=1,partitionKeys=[0]]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=localhost]]]]
> 2018-06-19 21:34:54,852 [container-3] INFO  engine.WindowGenerator activate
> - Catching up from 1529458493500 to 1529458494852
> 2018-06-19 21:34:54,855 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received publisher request: PublishRequestTuple{version=1.0,
> identifier=1.out.1, windowId=ffffffffffffffff}
> 2018-06-19 21:34:54,857 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received publisher request: PublishRequestTuple{version=1.0,
> identifier=2.output.1, windowId=ffffffffffffffff}
> 2018-06-19 21:34:54,858 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received publisher request: PublishRequestTuple{version=1.0,
> identifier=3.output.1, windowId=ffffffffffffffff}
> 2018-06-19 21:34:54,858 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
> identifier=tcp://localhost:62187/1.out.1, windowId=ffffffffffffffff,
> type=genToProcessor/3.input, upstreamIdentifier=1.out.1, mask=1,
> partitions=[1], bufferSize=1024}
> 2018-06-19 21:34:54,858 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
> identifier=tcp://localhost:62187/1.out.1, windowId=ffffffffffffffff,
> type=genToProcessor/2.input, upstreamIdentifier=1.out.1, mask=1,
> partitions=[0], bufferSize=1024}
> 2018-06-19 21:34:54,858 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
> identifier=tcp://localhost:62187/3.output.1, windowId=ffffffffffffffff,
> type=ProcessorToReceiver/5.<merge#output>(3.output),
> upstreamIdentifier=3.output.1, mask=0, partitions=null, bufferSize=1024}
> 2018-06-19 21:34:54,859 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
> identifier=tcp://localhost:62187/2.output.1, windowId=ffffffffffffffff,
> type=ProcessorToReceiver/5.<merge#output>(2.output),
> upstreamIdentifier=2.output.1, mask=0, partitions=null, bufferSize=1024}
> 2018-06-19 21:34:55,851 [main] INFO  stram.StramLocalCluster run - Stopping
> on exit condition
> 2018-06-19 21:34:55,852 [container-2] INFO  engine.StreamingContainer
> processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:34:55,852 [container-3] INFO  engine.StreamingContainer
> processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:34:55,852 [container-3] INFO  stram.StramLocalCluster log -
> container-3 msg: [container-3] Exiting heartbeat loop..
> 2018-06-19 21:34:55,852 [container-0] INFO  engine.StreamingContainer
> processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:34:55,852 [container-2] INFO  stram.StramLocalCluster log -
> container-2 msg: [container-2] Exiting heartbeat loop..
> 2018-06-19 21:34:55,852 [container-1] INFO  engine.StreamingContainer
> processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:34:55,852 [container-0] INFO  stram.StramLocalCluster log -
> container-0 msg: [container-0] Exiting heartbeat loop..
> 2018-06-19 21:34:55,852 [container-1] INFO  stram.StramLocalCluster log -
> container-1 msg: [container-1] Exiting heartbeat loop..
> 2018-06-19 21:34:55,857 [container-1] INFO  stram.StramLocalCluster run -
> Container container-1 terminating.
> 2018-06-19 21:34:55,858 [container-3] INFO  stram.StramLocalCluster run -
> Container container-3 terminating.
> 2018-06-19 21:34:55,858 [ServerHelper-92-1] INFO  server.Server run -
> Removing ln LogicalNode@5dbf681cidentifier=tcp://localhost:62187/3.output.1,
> upstream=3.output.1, group=ProcessorToReceiver/5.<merge#output>(3.output),
> partitions=[],
> iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@6244ac9
> {da=com.datatorrent.bufferserver.internal.DataList$Block@60e28815{identifier=3.output.1,
> data=1048576, readingOffset=0, writingOffset=487,
> starting_window=5b29af3d00000001, ending_window=5b29af3d00000006,
> refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
> DataList@46bbe39d[identifier=3.output.1]
> 2018-06-19 21:34:55,858 [ServerHelper-92-1] INFO  server.Server run -
> Removing ln LogicalNode@7fb3226aidentifier=tcp://localhost:62187/1.out.1,
> upstream=1.out.1, group=genToProcessor/2.input,
> partitions=[BitVector{mask=1, bits=0}],
> iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@2ad6890f
> {da=com.datatorrent.bufferserver.internal.DataList$Block@e00fc9e{identifier=1.out.1,
> data=1048576, readingOffset=0, writingOffset=487,
> starting_window=5b29af3d00000001, ending_window=5b29af3d00000006,
> refCount=3, uniqueIdentifier=0, next=null, future=null}}} from dl
> DataList@7a566f6b[identifier=1.out.1]
> 2018-06-19 21:34:55,858 [ServerHelper-92-1] INFO  server.Server run -
> Removing ln LogicalNode@2551b8a4identifier=tcp://localhost:62187/1.out.1,
> upstream=1.out.1, group=genToProcessor/3.input,
> partitions=[BitVector{mask=1, bits=1}],
> iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@6368ccb7
> {da=com.datatorrent.bufferserver.internal.DataList$Block@e00fc9e{identifier=1.out.1,
> data=1048576, readingOffset=0, writingOffset=487,
> starting_window=5b29af3d00000001, ending_window=5b29af3d00000006,
> refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
> DataList@7a566f6b[identifier=1.out.1]
> 2018-06-19 21:34:55,862 [container-2] INFO  stram.StramLocalCluster run -
> Container container-2 terminating.
> 2018-06-19 21:34:55,862 [ServerHelper-92-1] INFO  server.Server run -
> Removing ln LogicalNode@2e985326identifier=tcp://localhost:62187/2.output.1,
> upstream=2.output.1, group=ProcessorToReceiver/5.<merge#output>(2.output),
> partitions=[],
> iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@7d68bf24
> {da=com.datatorrent.bufferserver.internal.DataList$Block@7405581b{identifier=2.output.1,
> data=1048576, readingOffset=0, writingOffset=487,
> starting_window=5b29af3d00000001, ending_window=5b29af3d00000006,
> refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
> DataList@3de15cc7[identifier=2.output.1]
> 2018-06-19 21:34:55,862 [container-0] INFO  stram.StramLocalCluster run -
> Container container-0 terminating.
> 2018-06-19 21:34:55,864 [ProcessWideEventLoop] INFO  server.Server run -
> Server stopped listening at /0:0:0:0:0:0:0:0:62187
> 2018-06-19 21:34:55,864 [main] INFO  stram.StramLocalCluster run -
> Application finished.
> 2018-06-19 21:34:55,864 [main] INFO  stram.CustomControlTupleTest testApp -
> Control Tuples received 3 expected 3
> 2018-06-19 21:34:55,883 [main] INFO  util.AsyncFSStorageAgent save - using
> /Users/mbossert/testIdea/apex-core/engine/target/chkp8804999206923662400 as
> the basepath for checkpointing.
> 2018-06-19 21:34:56,032 [main] INFO  storage.DiskStorage <init> - using
> /Users/mbossert/testIdea/apex-core/engine/target as the basepath for
> spooling.
> 2018-06-19 21:34:56,032 [ProcessWideEventLoop] INFO  server.Server
> registered - Server started listening at /0:0:0:0:0:0:0:0:62195
> 2018-06-19 21:34:56,032 [main] INFO  stram.StramLocalCluster run - Buffer
> server started: localhost:62195
> 2018-06-19 21:34:56,033 [container-0] INFO  stram.StramLocalCluster run -
> Started container container-0
> 2018-06-19 21:34:56,033 [container-0] INFO  stram.StramLocalCluster log -
> container-0 msg: [container-0] Entering heartbeat loop..
> 2018-06-19 21:34:57,038 [container-0] INFO  engine.StreamingContainer
> processHeartbeatResponse - Deploy request:
> [OperatorDeployInfo[id=2,name=process,type=GENERIC,checkpoint={ffffffffffffffff,
> 0,
> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=CONTAINER_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=<null>]]],
> OperatorDeployInfo[id=3,name=receiver,type=GENERIC,checkpoint={ffffffffffffffff,
> 0,
> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=ProcessorToReceiver,sourceNodeId=2,sourcePortName=output,locality=CONTAINER_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[]],
> OperatorDeployInfo[id=1,name=randomGenerator,type=INPUT,checkpoint={ffffffffffffffff,
> 0,
> 0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=genToProcessor,bufferServer=<null>]]]]
> 2018-06-19 21:34:57,040 [container-0] INFO  engine.WindowGenerator activate
> - Catching up from 1529458495500 to 1529458497040
> 2018-06-19 21:34:58,042 [main] INFO  stram.StramLocalCluster run - Stopping
> on exit condition
> 2018-06-19 21:34:59,045 [main] WARN  stram.StramLocalCluster run -
> Container thread container-0 is still alive
> 2018-06-19 21:34:59,047 [ProcessWideEventLoop] INFO  server.Server run -
> Server stopped listening at /0:0:0:0:0:0:0:0:62195
> 2018-06-19 21:34:59,047 [container-0] INFO  engine.StreamingContainer
> processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:34:59,047 [main] INFO  stram.StramLocalCluster run -
> Application finished.
> 2018-06-19 21:34:59,047 [main] INFO  stram.CustomControlTupleTest testApp -
> Control Tuples received 4 expected 4
> 2018-06-19 21:34:59,047 [container-0] INFO  stram.StramLocalCluster log -
> container-0 msg: [container-0] Exiting heartbeat loop..
> 2018-06-19 21:34:59,057 [container-0] INFO  stram.StramLocalCluster run -
> Container container-0 terminating.
> 2018-06-19 21:34:59,064 [main] INFO  util.AsyncFSStorageAgent save - using
> /Users/mbossert/testIdea/apex-core/engine/target/chkp4046668014410536641 as
> the basepath for checkpointing.
> 2018-06-19 21:34:59,264 [main] INFO  storage.DiskStorage <init> - using
> /Users/mbossert/testIdea/apex-core/engine/target as the basepath for
> spooling.
> 2018-06-19 21:34:59,264 [ProcessWideEventLoop] INFO  server.Server
> registered - Server started listening at /0:0:0:0:0:0:0:0:62196
> 2018-06-19 21:34:59,265 [main] INFO  stram.StramLocalCluster run - Buffer
> server started: localhost:62196
> 2018-06-19 21:34:59,265 [container-0] INFO  stram.StramLocalCluster run -
> Started container container-0
> 2018-06-19 21:34:59,265 [container-0] INFO  stram.StramLocalCluster log -
> container-0 msg: [container-0] Entering heartbeat loop..
> 2018-06-19 21:34:59,265 [container-1] INFO  stram.StramLocalCluster run -
> Started container container-1
> 2018-06-19 21:34:59,265 [container-2] INFO  stram.StramLocalCluster run -
> Started container container-2
> 2018-06-19 21:34:59,266 [container-1] INFO  stram.StramLocalCluster log -
> container-1 msg: [container-1] Entering heartbeat loop..
> 2018-06-19 21:34:59,266 [container-3] INFO  stram.StramLocalCluster run -
> Started container container-3
> 2018-06-19 21:34:59,266 [container-2] INFO  stram.StramLocalCluster log -
> container-2 msg: [container-2] Entering heartbeat loop..
> 2018-06-19 21:34:59,266 [container-3] INFO  stram.StramLocalCluster log -
> container-3 msg: [container-3] Entering heartbeat loop..
> 2018-06-19 21:35:00,270 [container-0] INFO  engine.StreamingContainer
> processHeartbeatResponse - Deploy request:
> [OperatorDeployInfo[id=1,name=randomGenerator,type=INPUT,checkpoint={ffffffffffffffff,
> 0,
> 0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=genToProcessor,bufferServer=localhost]]]]
> 2018-06-19 21:35:00,270 [container-2] INFO  engine.StreamingContainer
> processHeartbeatResponse - Deploy request:
> [OperatorDeployInfo[id=2,name=process,type=GENERIC,checkpoint={ffffffffffffffff,
> 0,
> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=<null>,partitionMask=1,partitionKeys=[0]]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=localhost]]]]
> 2018-06-19 21:35:00,271 [container-1] INFO  engine.StreamingContainer
> processHeartbeatResponse - Deploy request:
> [OperatorDeployInfo[id=4,name=receiver,type=GENERIC,checkpoint={ffffffffffffffff,
> 0,
> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=ProcessorToReceiver,sourceNodeId=5,sourcePortName=outputPort,locality=CONTAINER_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[]],
> OperatorDeployInfo.UnifierDeployInfo[id=5,name=process.output#unifier,type=UNIFIER,checkpoint={ffffffffffffffff,
> 0,
> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=<merge#output>,streamId=ProcessorToReceiver,sourceNodeId=2,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>],
> OperatorDeployInfo.InputDeployInfo[portName=<merge#output>,streamId=ProcessorToReceiver,sourceNodeId=3,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=outputPort,streamId=ProcessorToReceiver,bufferServer=<null>]]]]
> 2018-06-19 21:35:00,270 [container-3] INFO  engine.StreamingContainer
> processHeartbeatResponse - Deploy request:
> [OperatorDeployInfo[id=3,name=process,type=GENERIC,checkpoint={ffffffffffffffff,
> 0,
> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=<null>,partitionMask=1,partitionKeys=[1]]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=localhost]]]]
> 2018-06-19 21:35:00,273 [container-0] INFO  engine.WindowGenerator activate
> - Catching up from 1529458499500 to 1529458500273
> 2018-06-19 21:35:00,274 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received publisher request: PublishRequestTuple{version=1.0,
> identifier=1.out.1, windowId=ffffffffffffffff}
> 2018-06-19 21:35:00,276 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received publisher request: PublishRequestTuple{version=1.0,
> identifier=3.output.1, windowId=ffffffffffffffff}
> 2018-06-19 21:35:00,277 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
> identifier=tcp://localhost:62196/1.out.1, windowId=ffffffffffffffff,
> type=genToProcessor/2.input, upstreamIdentifier=1.out.1, mask=1,
> partitions=[0], bufferSize=1024}
> 2018-06-19 21:35:00,278 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received publisher request: PublishRequestTuple{version=1.0,
> identifier=2.output.1, windowId=ffffffffffffffff}
> 2018-06-19 21:35:00,278 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
> identifier=tcp://localhost:62196/1.out.1, windowId=ffffffffffffffff,
> type=genToProcessor/3.input, upstreamIdentifier=1.out.1, mask=1,
> partitions=[1], bufferSize=1024}
> 2018-06-19 21:35:00,278 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
> identifier=tcp://localhost:62196/3.output.1, windowId=ffffffffffffffff,
> type=ProcessorToReceiver/5.<merge#output>(3.output),
> upstreamIdentifier=3.output.1, mask=0, partitions=null, bufferSize=1024}
> 2018-06-19 21:35:00,278 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
> identifier=tcp://localhost:62196/2.output.1, windowId=ffffffffffffffff,
> type=ProcessorToReceiver/5.<merge#output>(2.output),
> upstreamIdentifier=2.output.1, mask=0, partitions=null, bufferSize=1024}
> 2018-06-19 21:35:01,273 [main] INFO  stram.StramLocalCluster run - Stopping
> on exit condition
> 2018-06-19 21:35:01,273 [container-3] INFO  engine.StreamingContainer
> processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:35:01,273 [container-2] INFO  engine.StreamingContainer
> processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:35:01,273 [container-2] INFO  stram.StramLocalCluster log -
> container-2 msg: [container-2] Exiting heartbeat loop..
> 2018-06-19 21:35:01,273 [container-0] INFO  engine.StreamingContainer
> processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:35:01,274 [container-0] INFO  stram.StramLocalCluster log -
> container-0 msg: [container-0] Exiting heartbeat loop..
> 2018-06-19 21:35:01,273 [container-3] INFO  stram.StramLocalCluster log -
> container-3 msg: [container-3] Exiting heartbeat loop..
> 2018-06-19 21:35:01,273 [container-1] INFO  engine.StreamingContainer
> processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:35:01,274 [container-1] INFO  stram.StramLocalCluster log -
> container-1 msg: [container-1] Exiting heartbeat loop..
> 2018-06-19 21:35:01,279 [container-3] INFO  stram.StramLocalCluster run -
> Container container-3 terminating.
> 2018-06-19 21:35:01,279 [ServerHelper-98-1] INFO  server.Server run -
> Removing ln LogicalNode@d80a435identifier=tcp://localhost:62196/3.output.1,
> upstream=3.output.1, group=ProcessorToReceiver/5.<merge#output>(3.output),
> partitions=[],
> iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@2f53b5e7
> {da=com.datatorrent.bufferserver.internal.DataList$Block@1e2d4212{identifier=3.output.1,
> data=1048576, readingOffset=0, writingOffset=36,
> starting_window=5b29af4300000001, ending_window=5b29af4300000005,
> refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
> DataList@1684ecee[identifier=3.output.1]
> 2018-06-19 21:35:01,285 [container-2] INFO  stram.StramLocalCluster run -
> Container container-2 terminating.
> 2018-06-19 21:35:01,285 [container-1] INFO  stram.StramLocalCluster run -
> Container container-1 terminating.
> 2018-06-19 21:35:01,286 [container-0] INFO  stram.StramLocalCluster run -
> Container container-0 terminating.
> 2018-06-19 21:35:01,286 [ServerHelper-98-1] INFO  server.Server run -
> Removing ln LogicalNode@75d245a1identifier=tcp://localhost:62196/2.output.1,
> upstream=2.output.1, group=ProcessorToReceiver/5.<merge#output>(2.output),
> partitions=[],
> iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@719c5cd2
> {da=com.datatorrent.bufferserver.internal.DataList$Block@43d2338b{identifier=2.output.1,
> data=1048576, readingOffset=0, writingOffset=36,
> starting_window=5b29af4300000001, ending_window=5b29af4300000005,
> refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
> DataList@379bd431[identifier=2.output.1]
> 2018-06-19 21:35:01,286 [ServerHelper-98-1] INFO  server.Server run -
> Removing ln LogicalNode@54c0b0d5identifier=tcp://localhost:62196/1.out.1,
> upstream=1.out.1, group=genToProcessor/2.input,
> partitions=[BitVector{mask=1, bits=0}],
> iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@649adb0
> {da=com.datatorrent.bufferserver.internal.DataList$Block@15201b67{identifier=1.out.1,
> data=1048576, readingOffset=0, writingOffset=36,
> starting_window=5b29af4300000001, ending_window=5b29af4300000005,
> refCount=3, uniqueIdentifier=0, next=null, future=null}}} from dl
> DataList@47bc3c23[identifier=1.out.1]
> 2018-06-19 21:35:01,286 [ServerHelper-98-1] INFO  server.Server run -
> Removing ln LogicalNode@2422ada2identifier=tcp://localhost:62196/1.out.1,
> upstream=1.out.1, group=genToProcessor/3.input,
> partitions=[BitVector{mask=1, bits=1}],
> iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@2e6f42b9
> {da=com.datatorrent.bufferserver.internal.DataList$Block@15201b67{identifier=1.out.1,
> data=1048576, readingOffset=0, writingOffset=36,
> starting_window=5b29af4300000001, ending_window=5b29af4300000005,
> refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
> DataList@47bc3c23[identifier=1.out.1]
> 2018-06-19 21:35:01,287 [ProcessWideEventLoop] INFO  server.Server run -
> Server stopped listening at /0:0:0:0:0:0:0:0:62196
> 2018-06-19 21:35:01,287 [main] INFO  stram.StramLocalCluster run -
> Application finished.
> 2018-06-19 21:35:01,288 [main] INFO  stram.CustomControlTupleTest testApp -
> Control Tuples received 0 expected 1
> 2018-06-19 21:35:01,305 [main] INFO  util.AsyncFSStorageAgent save - using
> /Users/mbossert/testIdea/apex-core/engine/target/chkp6727909541678525259 as
> the basepath for checkpointing.
> 2018-06-19 21:35:01,460 [main] INFO  storage.DiskStorage <init> - using
> /Users/mbossert/testIdea/apex-core/engine/target as the basepath for
> spooling.
> 2018-06-19 21:35:01,460 [ProcessWideEventLoop] INFO  server.Server
> registered - Server started listening at /0:0:0:0:0:0:0:0:62204
> 2018-06-19 21:35:01,461 [main] INFO  stram.StramLocalCluster run - Buffer
> server started: localhost:62204
> 2018-06-19 21:35:01,461 [container-0] INFO  stram.StramLocalCluster run -
> Started container container-0
> 2018-06-19 21:35:01,461 [container-1] INFO  stram.StramLocalCluster run -
> Started container container-1
> 2018-06-19 21:35:01,461 [container-0] INFO  stram.StramLocalCluster log -
> container-0 msg: [container-0] Entering heartbeat loop..
> 2018-06-19 21:35:01,461 [container-2] INFO  stram.StramLocalCluster run -
> Started container container-2
> 2018-06-19 21:35:01,461 [container-1] INFO  stram.StramLocalCluster log -
> container-1 msg: [container-1] Entering heartbeat loop..
> 2018-06-19 21:35:01,462 [container-2] INFO  stram.StramLocalCluster log -
> container-2 msg: [container-2] Entering heartbeat loop..
> 2018-06-19 21:35:02,464 [container-0] INFO  engine.StreamingContainer
> processHeartbeatResponse - Deploy request:
> [OperatorDeployInfo[id=3,name=receiver,type=GENERIC,checkpoint={ffffffffffffffff,
> 0,
> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=ProcessorToReceiver,sourceNodeId=2,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[]]]
> 2018-06-19 21:35:02,464 [container-1] INFO  engine.StreamingContainer
> processHeartbeatResponse - Deploy request:
> [OperatorDeployInfo[id=2,name=process,type=GENERIC,checkpoint={ffffffffffffffff,
> 0,
> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=localhost]]]]
> 2018-06-19 21:35:02,464 [container-2] INFO  engine.StreamingContainer
> processHeartbeatResponse - Deploy request:
> [OperatorDeployInfo[id=1,name=randomGenerator,type=INPUT,checkpoint={ffffffffffffffff,
> 0,
> 0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=genToProcessor,bufferServer=localhost]]]]
> 2018-06-19 21:35:02,467 [container-2] INFO  engine.WindowGenerator activate
> - Catching up from 1529458501500 to 1529458502467
> 2018-06-19 21:35:02,469 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
> identifier=tcp://localhost:62204/2.output.1, windowId=ffffffffffffffff,
> type=ProcessorToReceiver/3.input, upstreamIdentifier=2.output.1, mask=0,
> partitions=null, bufferSize=1024}
> 2018-06-19 21:35:02,469 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received publisher request: PublishRequestTuple{version=1.0,
> identifier=1.out.1, windowId=ffffffffffffffff}
> 2018-06-19 21:35:02,470 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received publisher request: PublishRequestTuple{version=1.0,
> identifier=2.output.1, windowId=ffffffffffffffff}
> 2018-06-19 21:35:02,470 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
> identifier=tcp://localhost:62204/1.out.1, windowId=ffffffffffffffff,
> type=genToProcessor/2.input, upstreamIdentifier=1.out.1, mask=0,
> partitions=null, bufferSize=1024}
> 2018-06-19 21:35:03,463 [main] INFO  stram.StramLocalCluster run - Stopping
> on exit condition
> 2018-06-19 21:35:03,463 [container-1] INFO  engine.StreamingContainer
> processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:35:03,463 [container-2] INFO  engine.StreamingContainer
> processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:35:03,464 [container-2] INFO  stram.StramLocalCluster log -
> container-2 msg: [container-2] Exiting heartbeat loop..
> 2018-06-19 21:35:03,463 [container-1] INFO  stram.StramLocalCluster log -
> container-1 msg: [container-1] Exiting heartbeat loop..
> 2018-06-19 21:35:03,463 [container-0] INFO  engine.StreamingContainer
> processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:35:03,464 [container-0] INFO  stram.StramLocalCluster log -
> container-0 msg: [container-0] Exiting heartbeat loop..
> 2018-06-19 21:35:03,464 [container-2] INFO  stram.StramLocalCluster run -
> Container container-2 terminating.
> 2018-06-19 21:35:03,465 [ServerHelper-101-1] INFO  server.Server run -
> Removing ln LogicalNode@5a90f429identifier=tcp://localhost:62204/1.out.1,
> upstream=1.out.1, group=genToProcessor/2.input, partitions=[],
> iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@1fa9e9ce
> {da=com.datatorrent.bufferserver.internal.DataList$Block@6b00c947{identifier=1.out.1,
> data=1048576, readingOffset=0, writingOffset=481,
> starting_window=5b29af4500000001, ending_window=5b29af4500000005,
> refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
> DataList@67d38a09[identifier=1.out.1]
> 2018-06-19 21:35:03,470 [container-1] INFO  stram.StramLocalCluster run -
> Container container-1 terminating.
> 2018-06-19 21:35:03,470 [container-0] INFO  stram.StramLocalCluster run -
> Container container-0 terminating.
> 2018-06-19 21:35:03,471 [ServerHelper-101-1] INFO  server.Server run -
> Removing ln LogicalNode@1badfe12identifier=tcp://localhost:62204/2.output.1,
> upstream=2.output.1, group=ProcessorToReceiver/3.input, partitions=[],
> iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@3abf0b66
> {da=com.datatorrent.bufferserver.internal.DataList$Block@6a887266{identifier=2.output.1,
> data=1048576, readingOffset=0, writingOffset=481,
> starting_window=5b29af4500000001, ending_window=5b29af4500000005,
> refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
> DataList@7afd481[identifier=2.output.1]
> 2018-06-19 21:35:03,472 [ProcessWideEventLoop] INFO  server.Server run -
> Server stopped listening at /0:0:0:0:0:0:0:0:62204
> 2018-06-19 21:35:03,472 [main] INFO  stram.StramLocalCluster run -
> Application finished.
> 2018-06-19 21:35:03,472 [main] INFO  stram.CustomControlTupleTest testApp -
> Control Tuples received 3 expected 3
> 2018-06-19 21:35:03,489 [main] INFO  util.AsyncFSStorageAgent save - using
> /Users/mbossert/testIdea/apex-core/engine/target/chkp1123378605276624191 as
> the basepath for checkpointing.
> 2018-06-19 21:35:03,633 [main] INFO  storage.DiskStorage <init> - using
> /Users/mbossert/testIdea/apex-core/engine/target as the basepath for
> spooling.
> 2018-06-19 21:35:03,633 [ProcessWideEventLoop] INFO  server.Server
> registered - Server started listening at /0:0:0:0:0:0:0:0:62209
> 2018-06-19 21:35:03,633 [main] INFO  stram.StramLocalCluster run - Buffer
> server started: localhost:62209
> 2018-06-19 21:35:03,634 [container-0] INFO  stram.StramLocalCluster run -
> Started container container-0
> 2018-06-19 21:35:03,634 [container-0] INFO  stram.StramLocalCluster log -
> container-0 msg: [container-0] Entering heartbeat loop..
> 2018-06-19 21:35:04,641 [container-0] INFO  engine.StreamingContainer
> processHeartbeatResponse - Deploy request:
> [OperatorDeployInfo[id=2,name=process,type=GENERIC,checkpoint={ffffffffffffffff,
> 0,
> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=CONTAINER_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=<null>]]],
> OperatorDeployInfo[id=1,name=randomGenerator,type=INPUT,checkpoint={ffffffffffffffff,
> 0,
> 0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=genToProcessor,bufferServer=<null>]]],
> OperatorDeployInfo[id=3,name=receiver,type=GENERIC,checkpoint={ffffffffffffffff,
> 0,
> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=ProcessorToReceiver,sourceNodeId=2,sourcePortName=output,locality=CONTAINER_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[]]]
> 2018-06-19 21:35:04,643 [container-0] INFO  engine.WindowGenerator activate
> - Catching up from 1529458503500 to 1529458504643
> 2018-06-19 21:35:05,640 [main] INFO  stram.StramLocalCluster run - Stopping
> on exit condition
> 2018-06-19 21:35:05,641 [container-0] INFO  engine.StreamingContainer
> processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:35:05,641 [container-0] INFO  stram.StramLocalCluster log -
> container-0 msg: [container-0] Exiting heartbeat loop..
> 2018-06-19 21:35:05,653 [container-0] INFO  stram.StramLocalCluster run -
> Container container-0 terminating.
> 2018-06-19 21:35:05,655 [ProcessWideEventLoop] INFO  server.Server run -
> Server stopped listening at /0:0:0:0:0:0:0:0:62209
> 2018-06-19 21:35:05,655 [main] INFO  stram.StramLocalCluster run -
> Application finished.
> 2018-06-19 21:35:05,655 [main] INFO  stram.CustomControlTupleTest testApp -
> Control Tuples received 3 expected 3
> 2018-06-19 21:35:05,672 [main] INFO  util.AsyncFSStorageAgent save - using
> /Users/mbossert/testIdea/apex-core/engine/target/chkp9044425874557598001 as
> the basepath for checkpointing.
> 2018-06-19 21:35:05,819 [main] INFO  storage.DiskStorage <init> - using
> /Users/mbossert/testIdea/apex-core/engine/target as the basepath for
> spooling.
> 2018-06-19 21:35:05,819 [ProcessWideEventLoop] INFO  server.Server
> registered - Server started listening at /0:0:0:0:0:0:0:0:62211
> 2018-06-19 21:35:05,819 [main] INFO  stram.StramLocalCluster run - Buffer
> server started: localhost:62211
> 2018-06-19 21:35:05,819 [container-0] INFO  stram.StramLocalCluster run -
> Started container container-0
> 2018-06-19 21:35:05,819 [container-1] INFO  stram.StramLocalCluster run -
> Started container container-1
> 2018-06-19 21:35:05,820 [container-0] INFO  stram.StramLocalCluster log -
> container-0 msg: [container-0] Entering heartbeat loop..
> 2018-06-19 21:35:05,820 [container-1] INFO  stram.StramLocalCluster log -
> container-1 msg: [container-1] Entering heartbeat loop..
> 2018-06-19 21:35:05,820 [container-2] INFO  stram.StramLocalCluster run -
> Started container container-2
> 2018-06-19 21:35:05,820 [container-2] INFO  stram.StramLocalCluster log -
> container-2 msg: [container-2] Entering heartbeat loop..
> 2018-06-19 21:35:06,826 [container-1] INFO  engine.StreamingContainer
> processHeartbeatResponse - Deploy request:
> [OperatorDeployInfo[id=3,name=receiver,type=GENERIC,checkpoint={ffffffffffffffff,
> 0,
> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=ProcessorToReceiver,sourceNodeId=2,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[]]]
> 2018-06-19 21:35:06,826 [container-2] INFO  engine.StreamingContainer
> processHeartbeatResponse - Deploy request:
> [OperatorDeployInfo[id=2,name=process,type=GENERIC,checkpoint={ffffffffffffffff,
> 0,
> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=localhost]]]]
> 2018-06-19 21:35:06,826 [container-0] INFO  engine.StreamingContainer
> processHeartbeatResponse - Deploy request:
> [OperatorDeployInfo[id=1,name=randomGenerator,type=INPUT,checkpoint={ffffffffffffffff,
> 0,
> 0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=genToProcessor,bufferServer=localhost]]]]
> 2018-06-19 21:35:06,830 [container-0] INFO  engine.WindowGenerator activate
> - Catching up from 1529458505500 to 1529458506830
> 2018-06-19 21:35:06,831 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
> identifier=tcp://localhost:62211/2.output.1, windowId=ffffffffffffffff,
> type=ProcessorToReceiver/3.input, upstreamIdentifier=2.output.1, mask=0,
> partitions=null, bufferSize=1024}
> 2018-06-19 21:35:06,832 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received publisher request: PublishRequestTuple{version=1.0,
> identifier=1.out.1, windowId=ffffffffffffffff}
> 2018-06-19 21:35:06,832 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
> identifier=tcp://localhost:62211/1.out.1, windowId=ffffffffffffffff,
> type=genToProcessor/2.input, upstreamIdentifier=1.out.1, mask=0,
> partitions=null, bufferSize=1024}
> 2018-06-19 21:35:06,832 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received publisher request: PublishRequestTuple{version=1.0,
> identifier=2.output.1, windowId=ffffffffffffffff}
> 2018-06-19 21:35:07,828 [main] INFO  stram.StramLocalCluster run - Stopping
> on exit condition
> 2018-06-19 21:35:07,829 [container-0] INFO  engine.StreamingContainer
> processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:35:07,829 [container-1] INFO  engine.StreamingContainer
> processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:35:07,829 [container-2] INFO  engine.StreamingContainer
> processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:35:07,829 [container-1] INFO  stram.StramLocalCluster log -
> container-1 msg: [container-1] Exiting heartbeat loop..
> 2018-06-19 21:35:07,829 [container-0] INFO  stram.StramLocalCluster log -
> container-0 msg: [container-0] Exiting heartbeat loop..
> 2018-06-19 21:35:07,829 [container-2] INFO  stram.StramLocalCluster log -
> container-2 msg: [container-2] Exiting heartbeat loop..
> 2018-06-19 21:35:07,834 [container-1] INFO  stram.StramLocalCluster run -
> Container container-1 terminating.
> 2018-06-19 21:35:07,839 [container-2] INFO  stram.StramLocalCluster run -
> Container container-2 terminating.
> 2018-06-19 21:35:07,839 [container-0] INFO  stram.StramLocalCluster run -
> Container container-0 terminating.
> 2018-06-19 21:35:07,839 [ServerHelper-107-1] INFO  server.Server run -
> Removing ln LogicalNode@16a2cf78identifier=tcp://localhost:62211/2.output.1,
> upstream=2.output.1, group=ProcessorToReceiver/3.input, partitions=[],
> iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@665bb8b6
> {da=com.datatorrent.bufferserver.internal.DataList$Block@6682720a{identifier=2.output.1,
> data=1048576, readingOffset=0, writingOffset=487,
> starting_window=5b29af4900000001, ending_window=5b29af4900000006,
> refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
> DataList@1eac5147[identifier=2.output.1]
> 2018-06-19 21:35:07,839 [ServerHelper-107-1] INFO  server.Server run -
> Removing ln LogicalNode@579fc543identifier=tcp://localhost:62211/1.out.1,
> upstream=1.out.1, group=genToProcessor/2.input, partitions=[],
> iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@37b03a0c
> {da=com.datatorrent.bufferserver.internal.DataList$Block@c15ba44{identifier=1.out.1,
> data=1048576, readingOffset=0, writingOffset=487,
> starting_window=5b29af4900000001, ending_window=5b29af4900000006,
> refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
> DataList@79f29bee[identifier=1.out.1]
> 2018-06-19 21:35:07,840 [ProcessWideEventLoop] INFO  server.Server run -
> Server stopped listening at /0:0:0:0:0:0:0:0:62211
> 2018-06-19 21:35:07,840 [main] INFO  stram.StramLocalCluster run -
> Application finished.
> 2018-06-19 21:35:07,840 [main] INFO  stram.CustomControlTupleTest testApp -
> Control Tuples received 3 expected 3
> 2018-06-19 21:35:07,857 [main] INFO  util.AsyncFSStorageAgent save - using
> /Users/mbossert/testIdea/apex-core/engine/target/chkp628666253336272009 as
> the basepath for checkpointing.
> 2018-06-19 21:35:08,003 [main] INFO  storage.DiskStorage <init> - using
> /Users/mbossert/testIdea/apex-core/engine/target as the basepath for
> spooling.
> 2018-06-19 21:35:08,004 [ProcessWideEventLoop] INFO  server.Server
> registered - Server started listening at /0:0:0:0:0:0:0:0:62216
> 2018-06-19 21:35:08,004 [main] INFO  stram.StramLocalCluster run - Buffer
> server started: localhost:62216
> 2018-06-19 21:35:08,004 [container-0] INFO  stram.StramLocalCluster run -
> Started container container-0
> 2018-06-19 21:35:08,005 [container-0] INFO  stram.StramLocalCluster log -
> container-0 msg: [container-0] Entering heartbeat loop..
> 2018-06-19 21:35:09,009 [container-0] INFO  engine.StreamingContainer
> processHeartbeatResponse - Deploy request:
> [OperatorDeployInfo[id=3,name=receiver,type=OIO,checkpoint={ffffffffffffffff,
> 0,
> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=ProcessorToReceiver,sourceNodeId=2,sourcePortName=output,locality=THREAD_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[]],
> OperatorDeployInfo[id=1,name=randomGenerator,type=INPUT,checkpoint={ffffffffffffffff,
> 0,
> 0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=genToProcessor,bufferServer=<null>]]],
> OperatorDeployInfo[id=2,name=process,type=OIO,checkpoint={ffffffffffffffff,
> 0,
> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=THREAD_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=<null>]]]]
> 2018-06-19 21:35:09,011 [container-0] INFO  engine.WindowGenerator activate
> - Catching up from 1529458507500 to 1529458509011
> 2018-06-19 21:35:10,011 [main] INFO  stram.StramLocalCluster run - Stopping
> on exit condition
> 2018-06-19 21:35:10,012 [container-0] INFO  engine.StreamingContainer
> processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:35:10,012 [container-0] INFO  stram.StramLocalCluster log -
> container-0 msg: [container-0] Exiting heartbeat loop..
> 2018-06-19 21:35:10,015 [container-0] INFO  stram.StramLocalCluster run -
> Container container-0 terminating.
> 2018-06-19 21:35:10,017 [ProcessWideEventLoop] INFO  server.Server run -
> Server stopped listening at /0:0:0:0:0:0:0:0:62216
> 2018-06-19 21:35:10,017 [main] INFO  stram.StramLocalCluster run -
> Application finished.
> 2018-06-19 21:35:10,017 [main] INFO  stram.CustomControlTupleTest testApp -
> Control Tuples received 4 expected 4
> 2018-06-19 21:35:10,034 [main] INFO  util.AsyncFSStorageAgent save - using
> /Users/mbossert/testIdea/apex-core/engine/target/chkp1306712174461973573 as
> the basepath for checkpointing.
> 2018-06-19 21:35:10,194 [main] INFO  storage.DiskStorage <init> - using
> /Users/mbossert/testIdea/apex-core/engine/target as the basepath for
> spooling.
> 2018-06-19 21:35:10,194 [ProcessWideEventLoop] INFO  server.Server
> registered - Server started listening at /0:0:0:0:0:0:0:0:62217
> 2018-06-19 21:35:10,194 [main] INFO  stram.StramLocalCluster run - Buffer
> server started: localhost:62217
> 2018-06-19 21:35:10,194 [container-0] INFO  stram.StramLocalCluster run -
> Started container container-0
> 2018-06-19 21:35:10,195 [container-1] INFO  stram.StramLocalCluster run -
> Started container container-1
> 2018-06-19 21:35:10,195 [container-2] INFO  stram.StramLocalCluster run -
> Started container container-2
> 2018-06-19 21:35:10,195 [container-1] INFO  stram.StramLocalCluster log -
> container-1 msg: [container-1] Entering heartbeat loop..
> 2018-06-19 21:35:10,195 [container-2] INFO  stram.StramLocalCluster log -
> container-2 msg: [container-2] Entering heartbeat loop..
> 2018-06-19 21:35:10,195 [container-0] INFO  stram.StramLocalCluster log -
> container-0 msg: [container-0] Entering heartbeat loop..
> 2018-06-19 21:35:11,201 [container-1] INFO  engine.StreamingContainer
> processHeartbeatResponse - Deploy request:
> [OperatorDeployInfo[id=3,name=receiver,type=GENERIC,checkpoint={ffffffffffffffff,
> 0,
> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=ProcessorToReceiver,sourceNodeId=2,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[]]]
> 2018-06-19 21:35:11,201 [container-2] INFO  engine.StreamingContainer
> processHeartbeatResponse - Deploy request:
> [OperatorDeployInfo[id=2,name=process,type=GENERIC,checkpoint={ffffffffffffffff,
> 0,
> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=localhost]]]]
> 2018-06-19 21:35:11,201 [container-0] INFO  engine.StreamingContainer
> processHeartbeatResponse - Deploy request:
> [OperatorDeployInfo[id=1,name=randomGenerator,type=INPUT,checkpoint={ffffffffffffffff,
> 0,
> 0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=genToProcessor,bufferServer=localhost]]]]
> 2018-06-19 21:35:11,205 [container-0] INFO  engine.WindowGenerator activate
> - Catching up from 1529458510500 to 1529458511205
> 2018-06-19 21:35:11,206 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
> identifier=tcp://localhost:62217/2.output.1, windowId=ffffffffffffffff,
> type=ProcessorToReceiver/3.input, upstreamIdentifier=2.output.1, mask=0,
> partitions=null, bufferSize=1024}
> 2018-06-19 21:35:11,207 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received publisher request: PublishRequestTuple{version=1.0,
> identifier=1.out.1, windowId=ffffffffffffffff}
> 2018-06-19 21:35:11,208 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received publisher request: PublishRequestTuple{version=1.0,
> identifier=2.output.1, windowId=ffffffffffffffff}
> 2018-06-19 21:35:11,208 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
> identifier=tcp://localhost:62217/1.out.1, windowId=ffffffffffffffff,
> type=genToProcessor/2.input, upstreamIdentifier=1.out.1, mask=0,
> partitions=null, bufferSize=1024}
> 2018-06-19 21:35:12,202 [main] INFO  stram.StramLocalCluster run - Stopping
> on exit condition
> 2018-06-19 21:35:12,203 [container-2] INFO  engine.StreamingContainer
> processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:35:12,203 [container-1] INFO  engine.StreamingContainer
> processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:35:12,203 [container-1] INFO  stram.StramLocalCluster log -
> container-1 msg: [container-1] Exiting heartbeat loop..
> 2018-06-19 21:35:12,203 [container-2] INFO  stram.StramLocalCluster log -
> container-2 msg: [container-2] Exiting heartbeat loop..
> 2018-06-19 21:35:12,203 [container-0] INFO  engine.StreamingContainer
> processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:35:12,204 [container-1] INFO  stram.StramLocalCluster run -
> Container container-1 terminating.
> 2018-06-19 21:35:12,204 [container-0] INFO  stram.StramLocalCluster log -
> container-0 msg: [container-0] Exiting heartbeat loop..
> 2018-06-19 21:35:12,208 [container-2] INFO  stram.StramLocalCluster run -
> Container container-2 terminating.
> 2018-06-19 21:35:12,209 [ServerHelper-113-1] INFO  server.Server run -
> Removing ln LogicalNode@59cb59eidentifier=tcp://localhost:62217/2.output.1,
> upstream=2.output.1, group=ProcessorToReceiver/3.input, partitions=[],
> iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@157f4ac6
> {da=com.datatorrent.bufferserver.internal.DataList$Block@75af3db2{identifier=2.output.1,
> data=1048576, readingOffset=0, writingOffset=481,
> starting_window=5b29af4e00000001, ending_window=5b29af4e00000005,
> refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
> DataList@6d9b966b[identifier=2.output.1]
> 2018-06-19 21:35:12,216 [container-0] INFO  stram.StramLocalCluster run -
> Container container-0 terminating.
> 2018-06-19 21:35:12,217 [ServerHelper-113-1] INFO  server.Server run -
> Removing ln LogicalNode@44a1bfa5identifier=tcp://localhost:62217/1.out.1,
> upstream=1.out.1, group=genToProcessor/2.input, partitions=[],
> iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@248e40ea
> {da=com.datatorrent.bufferserver.internal.DataList$Block@4b4807c7{identifier=1.out.1,
> data=1048576, readingOffset=0, writingOffset=481,
> starting_window=5b29af4e00000001, ending_window=5b29af4e00000005,
> refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
> DataList@4786e1b1[identifier=1.out.1]
> 2018-06-19 21:35:12,228 [ProcessWideEventLoop] INFO  server.Server run -
> Server stopped listening at /0:0:0:0:0:0:0:0:62217
> 2018-06-19 21:35:12,229 [main] INFO  stram.StramLocalCluster run -
> Application finished.
> 2018-06-19 21:35:12,229 [main] INFO  stram.CustomControlTupleTest testApp -
> Control Tuples received 3 expected 3
> Tests run: 10, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 22.951 sec
> <<< FAILURE! - in com.datatorrent.stram.CustomControlTupleTest
> testDuplicateControlTuples(com.datatorrent.stram.CustomControlTupleTest)
>   Time elapsed: 2.241 sec  <<< FAILURE!
> java.lang.AssertionError: Incorrect Control Tuples
> at
> com.datatorrent.stram.CustomControlTupleTest.testApp(CustomControlTupleTest.java:259)
> at
> com.datatorrent.stram.CustomControlTupleTest.testDuplicateControlTuples(CustomControlTupleTest.java:283)
>
> and here is what I get when running these two tests individually:
>
> /Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/bin/java
> -ea "-Dmaven.home=/Applications/IntelliJ IDEA
> CE.app/Contents/plugins/maven/lib/maven3"
> "-Dmaven.multiModuleProjectDirectory=/Applications/IntelliJ IDEA
> CE.app/Contents/plugins/maven/lib/maven3" -Dapex.version=3.7.1-SNAPSHOT
> -Djava.io.tmpdir=/Users/mbossert/testIdea/apex-core/engine/target -Xmx2048m
> -XX:MaxPermSize=128m -Didea.test.cyclic.buffer.size=1048576
> "-javaagent:/Applications/IntelliJ IDEA
> CE.app/Contents/lib/idea_rt.jar=62369:/Applications/IntelliJ IDEA
> CE.app/Contents/bin" -Dfile.encoding=UTF-8 -classpath
> "/Applications/IntelliJ IDEA
> CE.app/Contents/lib/idea_rt.jar:/Applications/IntelliJ IDEA
> CE.app/Contents/plugins/junit/lib/junit-rt.jar:/Applications/IntelliJ IDEA
> CE.app/Contents/plugins/junit/lib/junit5-rt.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/charsets.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/deploy.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/ext/cldrdata.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/ext/dnsns.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/ext/jaccess.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/ext/jfxrt.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/ext/localedata.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/ext/nashorn.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/ext/sunec.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/ext/sunjce_provider.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/ext/sunpkcs11.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/ext/zipfs.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/javaws.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/jce.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/jfr.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/jfxswt.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/jsse.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/management-agent.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/plugin.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/resources.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/rt.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/lib/ant-javafx.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_
172.jdk/Contents/Home/lib/dt.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/lib/javafx-mx.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/lib/jconsole.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/lib/packager.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/lib/sa-jdi.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/lib/tools.jar:/Users/mbossert/testIdea/apex-core/engine/target/test-classes:/Users/mbossert/testIdea/apex-core/engine/target/classes:/Users/mbossert/.m2/repository/org/apache/bval/bval-jsr303/0.5/bval-jsr303-0.5.jar:/Users/mbossert/.m2/repository/org/apache/bval/bval-core/0.5/bval-core-0.5.jar:/Users/mbossert/.m2/repository/org/apache/commons/commons-lang3/3.1/commons-lang3-3.1.jar:/Users/mbossert/testIdea/apex-core/bufferserver/target/classes:/Users/mbossert/testIdea/apex-core/common/target/classes:/Users/mbossert/testIdea/apex-core/api/target/classes:/Users/mbossert/.m2/repository/org/apache/hadoop/hadoop-common/2.6.0/hadoop-common-2.6.0.jar:/Users/mbossert/.m2/repository/org/apache/commons/commons-math3/3.1.1/commons-math3-3.1.1.jar:/Users/mbossert/.m2/repository/xmlenc/xmlenc/0.52/xmlenc-0.52.jar:/Users/mbossert/.m2/repository/commons-net/commons-net/3.1/commons-net-3.1.jar:/Users/mbossert/.m2/repository/tomcat/jasper-compiler/5.5.23/jasper-compiler-5.5.23.jar:/Users/mbossert/.m2/repository/tomcat/jasper-runtime/5.5.23/jasper-runtime-5.5.23.jar:/Users/mbossert/.m2/repository/javax/servlet/jsp/jsp-api/2.1/jsp-api-2.1.jar:/Users/mbossert/.m2/repository/commons-el/commons-el/1.0/commons-el-1.0.jar:/Users/mbossert/.m2/repository/net/java/dev/jets3t/jets3t/0.9.0/jets3t-0.9.0.jar:/Users/mbossert/.m2/repository/com/jamesmurty/utils/java-xmlbuilder/0.4/java-xmlbuilder-0.4.jar:/Users/mbossert/.m2/repository/commons-configuration/commons-configuration/1.6/commons-configuration-1.6.jar:/Users/mbossert/.m2/repository/commons-digester/commons-digester/1.8/co
mmons-digester-1.8.jar:/Users/mbossert/.m2/repository/com/google/code/gson/gson/2.2.4/gson-2.2.4.jar:/Users/mbossert/.m2/repository/org/apache/hadoop/hadoop-auth/2.6.0/hadoop-auth-2.6.0.jar:/Users/mbossert/.m2/repository/org/apache/directory/server/apacheds-kerberos-codec/2.0.0-M15/apacheds-kerberos-codec-2.0.0-M15.jar:/Users/mbossert/.m2/repository/org/apache/directory/server/apacheds-i18n/2.0.0-M15/apacheds-i18n-2.0.0-M15.jar:/Users/mbossert/.m2/repository/org/apache/directory/api/api-asn1-api/1.0.0-M20/api-asn1-api-1.0.0-M20.jar:/Users/mbossert/.m2/repository/org/apache/directory/api/api-util/1.0.0-M20/api-util-1.0.0-M20.jar:/Users/mbossert/.m2/repository/org/apache/curator/curator-framework/2.6.0/curator-framework-2.6.0.jar:/Users/mbossert/.m2/repository/com/jcraft/jsch/0.1.42/jsch-0.1.42.jar:/Users/mbossert/.m2/repository/org/apache/curator/curator-client/2.6.0/curator-client-2.6.0.jar:/Users/mbossert/.m2/repository/org/apache/curator/curator-recipes/2.6.0/curator-recipes-2.6.0.jar:/Users/mbossert/.m2/repository/org/htrace/htrace-core/3.0.4/htrace-core-3.0.4.jar:/Users/mbossert/.m2/repository/com/datatorrent/netlet/1.3.2/netlet-1.3.2.jar:/Users/mbossert/.m2/repository/com/esotericsoftware/kryo/4.0.2/kryo-4.0.2.jar:/Users/mbossert/.m2/repository/com/esotericsoftware/reflectasm/1.11.3/reflectasm-1.11.3.jar:/Users/mbossert/.m2/repository/org/ow2/asm/asm/5.0.4/asm-5.0.4.jar:/Users/mbossert/.m2/repository/com/esotericsoftware/minlog/1.3.0/minlog-1.3.0.jar:/Users/mbossert/.m2/repository/javax/validation/validation-api/1.1.0.Final/validation-api-1.1.0.Final.jar:/Users/mbossert/.m2/repository/com/sun/jersey/jersey-core/1.9/jersey-core-1.9.jar:/Users/mbossert/.m2/repository/org/apache/httpcomponents/httpclient/4.3.6/httpclient-4.3.6.jar:/Users/mbossert/.m2/repository/org/apache/httpcomponents/httpcore/4.3.3/httpcore-4.3.3.jar:/Users/mbossert/.m2/repository/commons-logging/commons-logging/1.1.3/commons-logging-1.1.3.jar:/Users/mbossert/.m2/repository/com/sun/jersey/contr
ibs/jersey-apache-client4/1.9/jersey-apache-client4-1.9.jar:/Users/mbossert/.m2/repository/com/sun/jersey/jersey-client/1.9/jersey-client-1.9.jar:/Users/mbossert/.m2/repository/org/apache/hadoop/hadoop-yarn-client/2.6.0/hadoop-yarn-client-2.6.0.jar:/Users/mbossert/.m2/repository/com/google/guava/guava/11.0.2/guava-11.0.2.jar:/Users/mbossert/.m2/repository/com/google/code/findbugs/jsr305/1.3.9/jsr305-1.3.9.jar:/Users/mbossert/.m2/repository/commons-lang/commons-lang/2.6/commons-lang-2.6.jar:/Users/mbossert/.m2/repository/commons-cli/commons-cli/1.2/commons-cli-1.2.jar:/Users/mbossert/.m2/repository/log4j/log4j/1.2.17/log4j-1.2.17.jar:/Users/mbossert/.m2/repository/org/apache/hadoop/hadoop-annotations/2.6.0/hadoop-annotations-2.6.0.jar:/Users/mbossert/.m2/repository/org/apache/hadoop/hadoop-yarn-api/2.6.0/hadoop-yarn-api-2.6.0.jar:/Users/mbossert/.m2/repository/com/google/protobuf/protobuf-java/2.5.0/protobuf-java-2.5.0.jar:/Users/mbossert/.m2/repository/org/apache/hadoop/hadoop-yarn-common/2.6.0/hadoop-yarn-common-2.6.0.jar:/Users/mbossert/.m2/repository/javax/xml/bind/jaxb-api/2.2.2/jaxb-api-2.2.2.jar:/Users/mbossert/.m2/repository/javax/xml/stream/stax-api/1.0-2/stax-api-1.0-2.jar:/Users/mbossert/.m2/repository/javax/activation/activation/1.1/activation-1.1.jar:/Users/mbossert/.m2/repository/org/apache/commons/commons-compress/1.4.1/commons-compress-1.4.1.jar:/Users/mbossert/.m2/repository/org/tukaani/xz/1.0/xz-1.0.jar:/Users/mbossert/.m2/repository/org/mortbay/jetty/jetty-util/6.1.26/jetty-util-6.1.26.jar:/Users/mbossert/.m2/repository/com/google/inject/extensions/guice-servlet/3.0/guice-servlet-3.0.jar:/Users/mbossert/.m2/repository/commons-io/commons-io/2.4/commons-io-2.4.jar:/Users/mbossert/.m2/repository/com/google/inject/guice/3.0/guice-3.0.jar:/Users/mbossert/.m2/repository/javax/inject/javax.inject/1/javax.inject-1.jar:/Users/mbossert/.m2/repository/aopalliance/aopalliance/1.0/aopalliance-1.0.jar:/Users/mbossert/.m2/repository/com/sun/jersey/jersey-server/1
.9/jersey-server-1.9.jar:/Users/mbossert/.m2/repository/asm/asm/3.1/asm-3.1.jar:/Users/mbossert/.m2/repository/com/sun/jersey/jersey-json/1.9/jersey-json-1.9.jar:/Users/mbossert/.m2/repository/com/sun/xml/bind/jaxb-impl/2.2.3-1/jaxb-impl-2.2.3-1.jar:/Users/mbossert/.m2/repository/com/sun/jersey/contribs/jersey-guice/1.9/jersey-guice-1.9.jar:/Users/mbossert/.m2/repository/org/apache/hadoop/hadoop-yarn-server-tests/2.6.0/hadoop-yarn-server-tests-2.6.0-tests.jar:/Users/mbossert/.m2/repository/org/apache/hadoop/hadoop-yarn-server-common/2.6.0/hadoop-yarn-server-common-2.6.0.jar:/Users/mbossert/.m2/repository/org/apache/zookeeper/zookeeper/3.4.6/zookeeper-3.4.6.jar:/Users/mbossert/.m2/repository/org/slf4j/slf4j-log4j12/1.6.1/slf4j-log4j12-1.6.1.jar:/Users/mbossert/.m2/repository/io/netty/netty/3.7.0.Final/netty-3.7.0.Final.jar:/Users/mbossert/.m2/repository/org/fusesource/leveldbjni/leveldbjni-all/1.8/leveldbjni-all-1.8.jar:/Users/mbossert/.m2/repository/org/apache/hadoop/hadoop-yarn-server-nodemanager/2.6.0/hadoop-yarn-server-nodemanager-2.6.0.jar:/Users/mbossert/.m2/repository/org/codehaus/jettison/jettison/1.1/jettison-1.1.jar:/Users/mbossert/.m2/repository/org/apache/hadoop/hadoop-yarn-server-resourcemanager/2.6.0/hadoop-yarn-server-resourcemanager-2.6.0.jar:/Users/mbossert/.m2/repository/org/apache/hadoop/hadoop-yarn-server-applicationhistoryservice/2.6.0/hadoop-yarn-server-applicationhistoryservice-2.6.0.jar:/Users/mbossert/.m2/repository/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar:/Users/mbossert/.m2/repository/org/apache/hadoop/hadoop-yarn-server-web-proxy/2.6.0/hadoop-yarn-server-web-proxy-2.6.0.jar:/Users/mbossert/.m2/repository/commons-httpclient/commons-httpclient/3.1/commons-httpclient-3.1.jar:/Users/mbossert/.m2/repository/org/mortbay/jetty/jetty/6.1.26/jetty-6.1.26.jar:/Users/mbossert/.m2/repository/org/codehaus/jackson/jackson-mapper-asl/1.9.13/jackson-mapper-asl-1.9.13.jar:/Users/mbossert/.m2/repository/org/codehaus/jackso
n/jackson-core-asl/1.9.13/jackson-core-asl-1.9.13.jar:/Users/mbossert/.m2/repository/jline/jline/2.11/jline-2.11.jar:/Users/mbossert/.m2/repository/org/apache/ant/ant/1.9.2/ant-1.9.2.jar:/Users/mbossert/.m2/repository/org/apache/ant/ant-launcher/1.9.2/ant-launcher-1.9.2.jar:/Users/mbossert/.m2/repository/net/engio/mbassador/1.1.9/mbassador-1.1.9.jar:/Users/mbossert/.m2/repository/org/mockito/mockito-core/1.10.19/mockito-core-1.10.19.jar:/Users/mbossert/.m2/repository/org/objenesis/objenesis/2.1/objenesis-2.1.jar:/Users/mbossert/.m2/repository/net/lingala/zip4j/zip4j/1.3.2/zip4j-1.3.2.jar:/Users/mbossert/.m2/repository/commons-beanutils/commons-beanutils/1.9.2/commons-beanutils-1.9.2.jar:/Users/mbossert/.m2/repository/commons-codec/commons-codec/1.10/commons-codec-1.10.jar:/Users/mbossert/.m2/repository/org/eclipse/jetty/jetty-servlet/8.1.10.v20130312/jetty-servlet-8.1.10.v20130312.jar:/Users/mbossert/.m2/repository/org/eclipse/jetty/jetty-security/8.1.10.v20130312/jetty-security-8.1.10.v20130312.jar:/Users/mbossert/.m2/repository/org/eclipse/jetty/jetty-server/8.1.10.v20130312/jetty-server-8.1.10.v20130312.jar:/Users/mbossert/.m2/repository/org/eclipse/jetty/orbit/javax.servlet/3.0.0.v201112011016/javax.servlet-3.0.0.v201112011016.jar:/Users/mbossert/.m2/repository/org/eclipse/jetty/jetty-continuation/8.1.10.v20130312/jetty-continuation-8.1.10.v20130312.jar:/Users/mbossert/.m2/repository/org/eclipse/jetty/jetty-websocket/8.1.10.v20130312/jetty-websocket-8.1.10.v20130312.jar:/Users/mbossert/.m2/repository/org/eclipse/jetty/jetty-util/8.1.10.v20130312/jetty-util-8.1.10.v20130312.jar:/Users/mbossert/.m2/repository/org/eclipse/jetty/jetty-io/8.1.10.v20130312/jetty-io-8.1.10.v20130312.jar:/Users/mbossert/.m2/repository/org/eclipse/jetty/jetty-http/8.1.10.v20130312/jetty-http-8.1.10.v20130312.jar:/Users/mbossert/.m2/repository/org/apache/xbean/xbean-asm5-shaded/4.3/xbean-asm5-shaded-4.3.jar:/Users/mbossert/.m2/repository/org/jctools/jctools-core/1.1/jctools-core-1.1.jar:/
Users/mbossert/.m2/repository/org/apache/apex/apex-shaded-ning19/1.0.0/apex-shaded-ning19-1.0.0.jar:/Users/mbossert/.m2/repository/org/apache/maven/maven-embedder/3.3.9/maven-embedder-3.3.9.jar:/Users/mbossert/.m2/repository/org/apache/maven/maven-settings/3.3.9/maven-settings-3.3.9.jar:/Users/mbossert/.m2/repository/org/apache/maven/maven-core/3.3.9/maven-core-3.3.9.jar:/Users/mbossert/.m2/repository/org/apache/maven/maven-model/3.3.9/maven-model-3.3.9.jar:/Users/mbossert/.m2/repository/org/apache/maven/maven-settings-builder/3.3.9/maven-settings-builder-3.3.9.jar:/Users/mbossert/.m2/repository/org/apache/maven/maven-repository-metadata/3.3.9/maven-repository-metadata-3.3.9.jar:/Users/mbossert/.m2/repository/org/apache/maven/maven-artifact/3.3.9/maven-artifact-3.3.9.jar:/Users/mbossert/.m2/repository/org/apache/maven/maven-aether-provider/3.3.9/maven-aether-provider-3.3.9.jar:/Users/mbossert/.m2/repository/org/eclipse/aether/aether-impl/1.0.2.v20150114/aether-impl-1.0.2.v20150114.jar:/Users/mbossert/.m2/repository/com/google/inject/guice/4.0/guice-4.0-no_aop.jar:/Users/mbossert/.m2/repository/org/codehaus/plexus/plexus-interpolation/1.21/plexus-interpolation-1.21.jar:/Users/mbossert/.m2/repository/org/apache/maven/maven-plugin-api/3.3.9/maven-plugin-api-3.3.9.jar:/Users/mbossert/.m2/repository/org/apache/maven/maven-model-builder/3.3.9/maven-model-builder-3.3.9.jar:/Users/mbossert/.m2/repository/org/apache/maven/maven-builder-support/3.3.9/maven-builder-support-3.3.9.jar:/Users/mbossert/.m2/repository/org/apache/maven/maven-compat/3.3.9/maven-compat-3.3.9.jar:/Users/mbossert/.m2/repository/org/codehaus/plexus/plexus-utils/3.0.22/plexus-utils-3.0.22.jar:/Users/mbossert/.m2/repository/org/codehaus/plexus/plexus-classworlds/2.5.2/plexus-classworlds-2.5.2.jar:/Users/mbossert/.m2/repository/org/eclipse/sisu/org.eclipse.sisu.plexus/0.3.2/org.eclipse.sisu.plexus-0.3.2.jar:/Users/mbossert/.m2/repository/javax/enterprise/cdi-api/1.0/cdi-api-1.0.jar:/Users/mbossert/.m2/repos
itory/javax/annotation/jsr250-api/1.0/jsr250-api-1.0.jar:/Users/mbossert/.m2/repository/org/eclipse/sisu/org.eclipse.sisu.inject/0.3.2/org.eclipse.sisu.inject-0.3.2.jar:/Users/mbossert/.m2/repository/org/codehaus/plexus/plexus-component-annotations/1.6/plexus-component-annotations-1.6.jar:/Users/mbossert/.m2/repository/org/sonatype/plexus/plexus-sec-dispatcher/1.3/plexus-sec-dispatcher-1.3.jar:/Users/mbossert/.m2/repository/org/sonatype/plexus/plexus-cipher/1.7/plexus-cipher-1.7.jar:/Users/mbossert/.m2/repository/org/slf4j/slf4j-api/1.7.5/slf4j-api-1.7.5.jar:/Users/mbossert/.m2/repository/org/eclipse/aether/aether-connector-basic/1.0.2.v20150114/aether-connector-basic-1.0.2.v20150114.jar:/Users/mbossert/.m2/repository/org/eclipse/aether/aether-api/1.0.2.v20150114/aether-api-1.0.2.v20150114.jar:/Users/mbossert/.m2/repository/org/eclipse/aether/aether-spi/1.0.2.v20150114/aether-spi-1.0.2.v20150114.jar:/Users/mbossert/.m2/repository/org/eclipse/aether/aether-util/1.0.2.v20150114/aether-util-1.0.2.v20150114.jar:/Users/mbossert/.m2/repository/org/eclipse/aether/aether-transport-wagon/1.0.2.v20150114/aether-transport-wagon-1.0.2.v20150114.jar:/Users/mbossert/.m2/repository/org/apache/maven/wagon/wagon-http/2.10/wagon-http-2.10.jar:/Users/mbossert/.m2/repository/org/apache/maven/wagon/wagon-http-shared/2.10/wagon-http-shared-2.10.jar:/Users/mbossert/.m2/repository/org/jsoup/jsoup/1.7.2/jsoup-1.7.2.jar:/Users/mbossert/.m2/repository/org/apache/maven/wagon/wagon-provider-api/2.10/wagon-provider-api-2.10.jar:/Users/mbossert/.m2/repository/org/powermock/powermock-api-mockito/1.6.5/powermock-api-mockito-1.6.5.jar:/Users/mbossert/.m2/repository/org/hamcrest/hamcrest-core/1.3/hamcrest-core-1.3.jar:/Users/mbossert/.m2/repository/org/powermock/powermock-api-mockito-common/1.6.5/powermock-api-mockito-common-1.6.5.jar:/Users/mbossert/.m2/repository/org/powermock/powermock-api-support/1.6.5/powermock-api-support-1.6.5.jar:/Users/mbossert/.m2/repository/org/powermock/powermock-module-j
unit4-rule/1.6.5/powermock-module-junit4-rule-1.6.5.jar:/Users/mbossert/.m2/repository/org/powermock/powermock-classloading-base/1.6.5/powermock-classloading-base-1.6.5.jar:/Users/mbossert/.m2/repository/org/powermock/powermock-reflect/1.6.5/powermock-reflect-1.6.5.jar:/Users/mbossert/.m2/repository/org/powermock/powermock-core/1.6.5/powermock-core-1.6.5.jar:/Users/mbossert/.m2/repository/org/javassist/javassist/3.20.0-GA/javassist-3.20.0-GA.jar:/Users/mbossert/.m2/repository/org/powermock/powermock-module-junit4-common/1.6.5/powermock-module-junit4-common-1.6.5.jar:/Users/mbossert/.m2/repository/org/powermock/powermock-classloading-xstream/1.6.5/powermock-classloading-xstream-1.6.5.jar:/Users/mbossert/.m2/repository/com/thoughtworks/xstream/xstream/1.4.9/xstream-1.4.9.jar:/Users/mbossert/.m2/repository/xmlpull/xmlpull/
> 1.1.3.1/xmlpull-1.1.3.1.jar:/Users/mbossert/.m2/repository/xpp3/xpp3_min/1.1.4c/xpp3_min-1.1.4c.jar:/Users/mbossert/.m2/repository/com/sun/jersey/jersey-test-framework/jersey-test-framework-grizzly2/1.9/jersey-test-framework-grizzly2-1.9.jar:/Users/mbossert/.m2/repository/com/sun/jersey/jersey-test-framework/jersey-test-framework-core/1.9/jersey-test-framework-core-1.9.jar:/Users/mbossert/.m2/repository/javax/servlet/javax.servlet-api/3.0.1/javax.servlet-api-3.0.1.jar:/Users/mbossert/.m2/repository/com/sun/jersey/jersey-grizzly2/1.9/jersey-grizzly2-1.9.jar:/Users/mbossert/.m2/repository/org/glassfish/grizzly/grizzly-http/2.1.2/grizzly-http-2.1.2.jar:/Users/mbossert/.m2/repository/org/glassfish/grizzly/grizzly-framework/2.1.2/grizzly-framework-2.1.2.jar:/Users/mbossert/.m2/repository/org/glassfish/gmbal/gmbal-api-only/3.0.0-b023/gmbal-api-only-3.0.0-b023.jar:/Users/mbossert/.m2/repository/org/glassfish/external/management-api/3.0.0-b012/management-api-3.0.0-b012.jar:/Users/mbossert/.m2/repository/org/glassfish/grizzly/grizzly-http-server/2.1.2/grizzly-http-server-2.1.2.jar:/Users/mbossert/.m2/repository/org/glassfish/grizzly/grizzly-rcm/2.1.2/grizzly-rcm-2.1.2.jar:/Users/mbossert/.m2/repository/org/glassfish/grizzly/grizzly-http-servlet/2.1.2/grizzly-http-servlet-2.1.2.jar:/Users/mbossert/.m2/repository/org/glassfish/javax.servlet/3.1/javax.servlet-3.1.jar:/Users/mbossert/.m2/repository/javax/servlet/servlet-api/2.5/servlet-api-2.5.jar:/Users/mbossert/.m2/repository/junit/junit/4.11/junit-4.11.jar:/Users/mbossert/.m2/repository/pl/pragmatists/JUnitParams/1.0.4/JUnitParams-1.0.4.jar"
> com.intellij.rt.execution.junit.JUnitStarter -ideVersion5 -junit4
> com.datatorrent.stram.StramRecoveryTest,testWriteAheadLog
> Java HotSpot(TM) 64-Bit Server VM warning: ignoring option
> MaxPermSize=128m; support was removed in 8.0
> 2018-06-19 21:47:09,707 [main] WARN  util.NativeCodeLoader <clinit> -
> Unable to load native-hadoop library for your platform... using
> builtin-java classes where applicable
>
> java.lang.AssertionError: flush count
> Expected :1
> Actual   :2
>
>
> at org.junit.Assert.fail(Assert.java:88)
> at org.junit.Assert.failNotEquals(Assert.java:743)
> at org.junit.Assert.assertEquals(Assert.java:118)
> at org.junit.Assert.assertEquals(Assert.java:555)
> at
> com.datatorrent.stram.StramRecoveryTest.testWriteAheadLog(StramRecoveryTest.java:326)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at
> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> at
> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> at
> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> at
> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> at
> org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> at
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> at
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> at org.junit.runner.JUnitCore.run(JUnitCore.java:160)
> at
> com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:68)
> at
> com.intellij.rt.execution.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:47)
> at
> com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:242)
> at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:70)
>
>
> Process finished with exit code 255
>
> For the second test, it fails intermittently when running the whole
> install, but I cannot seem to reproduce the failure when the test is run
> in isolation.
>
> On Tue, Jun 19, 2018 at 5:08 PM Pramod Immaneni <[hidden email]>
> wrote:
>
>> Do you see the same errors when you run the individual tests in question
>> in isolation, e.g. using mvn test -Dtest=<test-class>? If you do, can you
>> paste the full logs of what you see when the individual tests fail?
>>
>> Thanks
>>
>> On Mon, Jun 18, 2018 at 11:41 AM Aaron Bossert <[hidden email]>
>> wrote:
>>
>>> Please disregard the first iteration; that turned out to be caused by a
>>> hung build running in the background and the resulting timeouts, I think.
>>> I am still having failures, but there are two whose root cause is still a
>>> mystery to me. Here are the actual failures:
>>>
>>> I don't immediately see how these are related to Kryo at all...but then
>>> again, I am still familiarizing myself with the code base. I am hoping
>>> that a lightbulb will turn on for someone out there who has some notion
>>> of how they are related...
>>>
>>>
>>>
>> -------------------------------------------------------------------------------
>>> Test set: com.datatorrent.stram.StramRecoveryTest
>>>
>>>
>> -------------------------------------------------------------------------------
>>> Tests run: 8, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 6.119 sec
>>> <<< FAILURE! - in com.datatorrent.stram.StramRecoveryTest
>>> testWriteAheadLog(com.datatorrent.stram.StramRecoveryTest)  Time elapsed:
>>> 0.105 sec  <<< FAILURE!
>>> java.lang.AssertionError: flush count expected:<1> but was:<2>
>>> at
>>>
>>>
>> com.datatorrent.stram.StramRecoveryTest.testWriteAheadLog(StramRecoveryTest.java:326)
>>>
>>>
>> -------------------------------------------------------------------------------
>>> Test set: com.datatorrent.stram.engine.StatsTest
>>>
>>>
>> -------------------------------------------------------------------------------
>>> Tests run: 6, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 22.051
>> sec
>>> <<< FAILURE! - in com.datatorrent.stram.engine.StatsTest
>>>
>>>
>> testQueueSizeForContainerLocalOperators(com.datatorrent.stram.engine.StatsTest)
>>>   Time elapsed: 3.277 sec  <<< FAILURE!
>>> java.lang.AssertionError: Validate input port queue size -1
>>> at
>>>
>>>
>> com.datatorrent.stram.engine.StatsTest.baseTestForQueueSize(StatsTest.java:270)
>>> at
>>>
>>>
>> com.datatorrent.stram.engine.StatsTest.testQueueSizeForContainerLocalOperators(StatsTest.java:285)
>>> On Mon, Jun 18, 2018 at 1:20 PM Aaron Bossert <[hidden email]>
>>> wrote:
>>>
>>>> I recently attempted to update Kryo from 2.24.0 to 4.0.2 to address a
>>>> serialization issue related to support for Java Instant and a couple of
>>>> other classes that are supported in newer Kryo versions. My test build
>>>> and install (vanilla, no changes of any kind, just download apex-core
>>>> and run "clean install") works fine; however, after updating the Kryo
>>>> dependency to 4.0.2, I get this non-obvious (to me) error (running
>>>> "clean install -X").
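[Editor's note] The version bump described above is, in pom terms, a one-line change. A minimal sketch, assuming Kryo's usual Maven coordinates (com.esotericsoftware:kryo) and that apex-core declares the version directly rather than through a property:

```xml
<!-- Hypothetical snippet: Kryo dependency bumped from 2.24.0 to 4.0.2 -->
<dependency>
  <groupId>com.esotericsoftware</groupId>
  <artifactId>kryo</artifactId>
  <version>4.0.2</version>
</dependency>
```

Note that Kryo changed its serialization format between the 2.x and 3.x/4.x lines, so state written with 2.24.0 may not deserialize under 4.0.2; that incompatibility is worth ruling out when chasing test failures after such an upgrade.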
>>>> I also identified a bug, or perhaps a feature? When building on my
>>>> macOS laptop, I have an IDEA project folder in iCloud that is stored
>>>> locally in a directory whose name contains a space, which needs to be
>>>> escaped. When I initially built, I kept running into errors related to
>>>> that. I am not sure whether that is something that should be fixed (it
>>>> is not as straightforward as I had hoped) or whether we should simply
>>>> require that directory names not include any spaces. I have no control
>>>> over the iCloud local folder name; otherwise, I would have just fixed
>>>> that.
>>>>
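[Editor's note] On the path-with-spaces problem above, one common workaround (a suggestion, not something from this thread; the path below is a stand-in, not the real iCloud location) is to give the folder a space-free symlink and build from that alias:

```shell
# Stand-in for the real iCloud directory whose name contains a space.
SRC="/tmp/Apex Project"
mkdir -p "$SRC"

# Create a space-free alias; build from it instead,
# e.g. cd /tmp/apex-project && mvn clean install
ln -sfn "$SRC" /tmp/apex-project
readlink /tmp/apex-project   # -> /tmp/Apex Project
```

The build then only ever sees the space-free path, so no escaping is needed and the iCloud folder name can stay as-is.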
>>>> 2018-06-18 12:43:24,485 [main] ERROR stram.RecoverableRpcProxy invoke -
>>>> Giving up RPC connection recovery after 504 ms
>>>> java.net.SocketTimeoutException: Call From MacBook-Pro-6.local/
>>> 10.37.129.2
>>>>   to MacBook-Pro-6.local:65136 failed on socket timeout exception:
>>>> java.net.SocketTimeoutException: 500 millis timeout while waiting for
>>>> channel to be ready for read. ch :
>>>> java.nio.channels.SocketChannel[connected local=/10.37.129.2:65137
>>>> remote=MacBook-Pro-6.local/10.37.129.2:65136]; For more details see:
>>>> http://wiki.apache.org/hadoop/SocketTimeout
>>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
>> Method)
>>>> at
>>>>
>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>>>> at
>>>>
>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>>> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>>>> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
>>>> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
>>>> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
>>>> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
>>>> at
>>>>
>> org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
>>>> at com.sun.proxy.$Proxy138.log(Unknown Source)
>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>> at
>>>>
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>>> at
>>>>
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>> at java.lang.reflect.Method.invoke(Method.java:498)
>>>> at
>>>>
>> com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
>>>> at com.sun.proxy.$Proxy138.log(Unknown Source)
>>>> at
>>>>
>> com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:561)
>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>> at
>>>>
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>>> at
>>>>
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>> at java.lang.reflect.Method.invoke(Method.java:498)
>>>> at
>>>>
>> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
>>>> at
>>>>
>> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>>>> at
>>>>
>> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
>>>> at
>>>>
>> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
>>>> at
>>>>
>> org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
>>>> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
>>>> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
>>>> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
>>>> at
>>>>
>> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
>>>> at
>>>>
>> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
>>>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>>>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>>>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>>>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>>>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>>>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>>>> at org.junit.runners.Suite.runChild(Suite.java:127)
>>>> at org.junit.runners.Suite.runChild(Suite.java:26)
>>>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>>>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>>>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>>>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>>>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>>>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>>>> at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
>>>> at
>>>>
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
>>>> at
>>>>
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
>>>> at
>>>>
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
>>>> at
>>>>
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
>>>> at
>>>>
>> org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
>>>> at
>>>>
>> org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
>>>> at
>>>>
>> org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
>>>> at
>>>>
>> org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
>>>> Caused by: java.net.SocketTimeoutException: 500 millis timeout while
>>>> waiting for channel to be ready for read. ch :
>>>> java.nio.channels.SocketChannel[connected local=/10.37.129.2:65137
>>>> remote=MacBook-Pro-6.local/10.37.129.2:65136]
>>>> at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
>>>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
>>>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
>>>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
>>>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
>>>> at
>>>>
>> org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
>>>> at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
>>>> at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
>>>> at java.io.DataInputStream.readInt(DataInputStream.java:387)
>>>> at
>>>>
>> org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
>>>> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
>>>> 2018-06-18 12:43:24,987 [IPC Server handler 0 on 65136] WARN
>> ipc.Server
>>>> processResponse - IPC Server handler 0 on 65136, call log(containerId,
>>>> timeout), rpc version=2, client version=201208081755,
>>>> methodsFingerPrint=-1300451462 from 10.37.129.2:65137 Call#141
>> Retry#0:
>>>> output error
>>>> 2018-06-18 12:43:24,999 [main] WARN  stram.RecoverableRpcProxy invoke -
>>>> RPC failure, will retry after 100 ms (remaining 998 ms)
>>>> java.net.SocketTimeoutException: Call From MacBook-Pro-6.local/
>>> 10.37.129.2
>>>>   to MacBook-Pro-6.local:65136 failed on socket timeout exception:
>>>> java.net.SocketTimeoutException: 500 millis timeout while waiting for
>>>> channel to be ready for read. ch :
>>>> java.nio.channels.SocketChannel[connected local=/10.37.129.2:65138
>>>> remote=MacBook-Pro-6.local/10.37.129.2:65136]; For more details see:
>>>> http://wiki.apache.org/hadoop/SocketTimeout
>>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
>> Method)
>>>> at
>>>>
>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>>>> at
>>>>
>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>>> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>>>> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
>>>> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
>>>> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
>>>> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
>>>> at
>>>>
>> org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
>>>> at com.sun.proxy.$Proxy138.log(Unknown Source)
>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>> at
>>>>
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>>> at
>>>>
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>> at java.lang.reflect.Method.invoke(Method.java:498)
>>>> at
>>>>
>> com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
>>>> at com.sun.proxy.$Proxy138.log(Unknown Source)
>>>> at
>>>>
>> com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:575)
>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>> at
>>>>
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>>> at
>>>>
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>> at java.lang.reflect.Method.invoke(Method.java:498)
>>>> at
>>>>
>> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
>>>> at
>>>>
>> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>>>> at
>>>>
>> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
>>>> at
>>>>
>> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
>>>> at
>>>>
>> org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
>>>> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
>>>> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
>>>> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
>>>> at
>>>>
>> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
>>>> at
>>>>
>> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
>>>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>>>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>>>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>>>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>>>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>>>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>>>> at org.junit.runners.Suite.runChild(Suite.java:127)
>>>> at org.junit.runners.Suite.runChild(Suite.java:26)
>>>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>>>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>>>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>>>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>>>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>>>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>>>> at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
>>>> at
>>>>
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
>>>> at
>>>>
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
>>>> at
>>>>
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
>>>> at
>>>>
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
>>>> at
>>>>
>> org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
>>>> at
>>>>
>> org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
>>>> at
>>>>
>> org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
>>>> at
>>>>
>> org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
>>>> Caused by: java.net.SocketTimeoutException: 500 millis timeout while
>>>> waiting for channel to be ready for read. ch :
>>>> java.nio.channels.SocketChannel[connected local=/10.37.129.2:65138
>>>> remote=MacBook-Pro-6.local/10.37.129.2:65136]
>>>> at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
>>>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
>>>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
>>>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
>>>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
>>>> at
>>>>
>> org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
>>>> at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
>>>> at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
>>>> at java.io.DataInputStream.readInt(DataInputStream.java:387)
>>>> at
>>>>
>> org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
>>>> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
>>>> 2018-06-18 12:43:25,607 [main] WARN  stram.RecoverableRpcProxy invoke -
>>>> RPC failure, will retry after 100 ms (remaining 390 ms)
>>>> java.net.SocketTimeoutException: Call From MacBook-Pro-6.local/
>>> 10.37.129.2
>>>>   to MacBook-Pro-6.local:65136 failed on socket timeout exception:
>>>> java.net.SocketTimeoutException: 500 millis timeout while waiting for
>>>> channel to be ready for read. ch :
>>>> java.nio.channels.SocketChannel[connected local=/10.37.129.2:65139
>>>> remote=MacBook-Pro-6.local/10.37.129.2:65136]; For more details see:
>>>> http://wiki.apache.org/hadoop/SocketTimeout
>>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
>> Method)
>>>> at
>>>>
>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>>>> at
>>>>
>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>>> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>>>> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
>>>> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
>>>> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
>>>> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
>>>> at
>>>>
>> org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
>>>> at com.sun.proxy.$Proxy138.log(Unknown Source)
>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>> at
>>>>
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>>> at
>>>>
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>> at java.lang.reflect.Method.invoke(Method.java:498)
>>>> at
>>>>
>> com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
>>>> at com.sun.proxy.$Proxy138.log(Unknown Source)
>>>> at
>>>>
>> com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:575)
>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>> at
>>>>
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>>> at
>>>>
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>> at java.lang.reflect.Method.invoke(Method.java:498)
>>>> at
>>>>
>> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
>>>> at
>>>>
>> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>>>> at
>>>>
>> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
>>>> at
>>>>
>> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
>>>> at
>>>>
>> org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
>>>> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
>>>> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
>>>> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
>>>> at
>>>>
>> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
>>>> at
>>>>
>> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
>>>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>>>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>>>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>>>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>>>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>>>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>>>> at org.junit.runners.Suite.runChild(Suite.java:127)
>>>> at org.junit.runners.Suite.runChild(Suite.java:26)
>>>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>>>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>>>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>>>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>>>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>>>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>>>> at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
>>>> at
>>>>
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
>>>> at
>>>>
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
>>>> at
>>>>
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
>>>> at
>>>>
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
>>>> at
>>>>
>> org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
>>>> at
>>>>
>> org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
>>>> at
>>>>
>> org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
>>>> at
>>>>
>> org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
>>>> Caused by: java.net.SocketTimeoutException: 500 millis timeout while
>>>> waiting for channel to be ready for read. ch :
>>>> java.nio.channels.SocketChannel[connected local=/10.37.129.2:65139
>>>> remote=MacBook-Pro-6.local/10.37.129.2:65136]
>>>> at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
>>>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
>>>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
>>>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
>>>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
>>>> at
>>>>
>> org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
>>>> at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
>>>> at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
>>>> at java.io.DataInputStream.readInt(DataInputStream.java:387)
>>>> at
>>>>
>> org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
>>>> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
>>>> 2018-06-18 12:43:25,987 [IPC Server handler 0 on 65136] WARN
>> ipc.Server
>>>> processResponse - IPC Server handler 0 on 65136, call log(containerId,
>>>> timeout), rpc version=2, client version=201208081755,
>>>> methodsFingerPrint=-1300451462 from 10.37.129.2:65138 Call#142
>> Retry#0:
>>>> output error
>>>> 2018-06-18 12:43:26,603 [main] ERROR stram.RecoverableRpcProxy invoke -
>>>> Giving up RPC connection recovery after 501 ms
>>>> java.net.SocketTimeoutException: Call From MacBook-Pro-6.local/
>>> 10.37.129.2
>>>>   to MacBook-Pro-6.local:65136 failed on socket timeout exception:
>>>> java.net.SocketTimeoutException: 500 millis timeout while waiting for
>>>> channel to be ready for read. ch :
>>>> java.nio.channels.SocketChannel[connected local=/10.37.129.2:65141
>>>> remote=MacBook-Pro-6.local/10.37.129.2:65136]; For more details see:
>>>> http://wiki.apache.org/hadoop/SocketTimeout
>>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
>> Method)
>>>> at
>>>>
>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>>>> at
>>>>
>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>>> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>>>> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
>>>> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
>>>> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
>>>> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
>>>> at
>>>>
>> org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
>>>> at com.sun.proxy.$Proxy138.log(Unknown Source)
>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>> at
>>>>
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>>> at
>>>>
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>> at java.lang.reflect.Method.invoke(Method.java:498)
>>>> at
>>>>
>> com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
>>>> at com.sun.proxy.$Proxy138.log(Unknown Source)
>>>> at
>>>>
>> com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:596)
>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>> at
>>>>
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>>> at
>>>>
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>> at java.lang.reflect.Method.invoke(Method.java:498)
>>>> at
>>>>
>> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
>>>> at
>>>>
>> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>>>> at
>>>>
>> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
>>>> at
>>>>
>> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
>>>> at
>>>>
>> org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
>>>> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
>>>> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
>>>> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
>>>> at
>>>>
>> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
>>>> at
>>>>
>> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
>>>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>>>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>>>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>>>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>>>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>>>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>>>> at org.junit.runners.Suite.runChild(Suite.java:127)
>>>> at org.junit.runners.Suite.runChild(Suite.java:26)
>>>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>>>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>>>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>>>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>>>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>>>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>>>> at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
>>>> at
>>>>
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
>>>> at
>>>>
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
>>>> at
>>>>
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
>>>> at
>>>>
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
>>>> at
>>>>
>> org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
>>>> at
>>>>
>> org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
>>>> at
>>>>
>> org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
>>>> at
>>>>
>> org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
>>>> Caused by: java.net.SocketTimeoutException: 500 millis timeout while
>>>> waiting for channel to be ready for read. ch :
>>>> java.nio.channels.SocketChannel[connected local=/10.37.129.2:65141
>>>> remote=MacBook-Pro-6.local/10.37.129.2:65136]
>>>> at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
>>>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
>>>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
>>>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
>>>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
>>>> at
>>>>
>> org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
>>>> at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
>>>> at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
>>>> at java.io.DataInputStream.readInt(DataInputStream.java:387)
>>>> at
>>>>
>> org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
>>>> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
>>>> 2018-06-18 12:43:27,105 [IPC Server handler 0 on 65136] WARN
>> ipc.Server
>>>> processResponse - IPC Server handler 0 on 65136, call log(containerId,
>>>> timeout), rpc version=2, client version=201208081755,
>>>> methodsFingerPrint=-1300451462 from 10.37.129.2:65141 Call#146
>> Retry#0:
>>>> output error
>>>> 2018-06-18 12:43:27,114 [main] WARN  stram.RecoverableRpcProxy invoke -
>>>> RPC failure, will retry after 100 ms (remaining 995 ms)
>>>> java.net.SocketTimeoutException: Call From MacBook-Pro-6.local/
>>> 10.37.129.2
>>>>   to MacBook-Pro-6.local:65136 failed on socket timeout exception:
>>>> java.net.SocketTimeoutException: 500 millis timeout while waiting for
>>>> channel to be ready for read. ch :
>>>> java.nio.channels.SocketChannel[connected local=/10.37.129.2:65142
>>>> remote=MacBook-Pro-6.local/10.37.129.2:65136]; For more details see:
>>>> http://wiki.apache.org/hadoop/SocketTimeout
>>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
>> Method)
>>>> at
>>>>
>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>>>> at
>>>>
>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>>> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>>>> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
>>>> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
>>>> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
>>>> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
>>>> at
>>>>
>> org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
>>>> at com.sun.proxy.$Proxy138.reportError(Unknown Source)
>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>> at
>>>>
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>>> at
>>>>
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>> at java.lang.reflect.Method.invoke(Method.java:498)
>>>> at
>>>>
>> com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
>>>> at com.sun.proxy.$Proxy138.reportError(Unknown Source)
>>>> at
>>>>
>> com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:610)
>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>> at
>>>>
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>>> at
>>>>
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>> at java.lang.reflect.Method.invoke(Method.java:498)
>>>> at
>>>>
>> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
>>>> at
>>>>
>> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>>>> at
>>>>
>> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
>>>> at
>>>>
>> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
>>>> at
>>>>
>> org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
>>>> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
>>>> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
>>>> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
>>>> at
>>>>
>> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
>>>> at
>>>>
>> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
>>>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>>>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>>>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>>>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>>>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>>>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>>>> at org.junit.runners.Suite.runChild(Suite.java:127)
>>>> at org.junit.runners.Suite.runChild(Suite.java:26)
>>>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>>>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>>>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>>>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>>>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>>>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>>>> at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
>>>> at
>>>>
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
>>>> at
>>>>
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
>>>> at
>>>>
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
>>>> at
>>>>
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
>>>> at
>>>>
>> org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
>>>> at
>>>>
>> org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
>>>> at
>>>>
>> org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
>>>> at
>>>>
>> org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
>>>> Caused by: java.net.SocketTimeoutException: 500 millis timeout while
>>>> waiting for channel to be ready for read. ch :
>>>> java.nio.channels.SocketChannel[connected local=/10.37.129.2:65142
>>>> remote=MacBook-Pro-6.local/10.37.129.2:65136]
>>>> at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
>>>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
>>>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
>>>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
>>>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
>>>> at
>>>>
>> org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
>>>> at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
>>>> at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
>>>> at java.io.DataInputStream.readInt(DataInputStream.java:387)
>>>> at
>>>>
>> org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
>>>> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
>>>> 2018-06-18 12:43:27,722 [main] WARN  stram.RecoverableRpcProxy invoke -
>>>> RPC failure, will retry after 100 ms (remaining 387 ms)
>>>> java.net.SocketTimeoutException: Call From MacBook-Pro-6.local/
>>> 10.37.129.2
>>>>   to MacBook-Pro-6.local:65136 failed on socket timeout exception:
>>>> java.net.SocketTimeoutException: 500 millis timeout while waiting for
>>>> channel to be ready for read. ch :
>>>> java.nio.channels.SocketChannel[connected local=/10.37.129.2:65143
>>>> remote=MacBook-Pro-6.local/10.37.129.2:65136]; For more details see:
>>>> http://wiki.apache.org/hadoop/SocketTimeout
>>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
>> Method)
>>>> at
>>>>
>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>>>> at
>>>>
>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>>> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>>>> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
>>>> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
>>>> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
>>>> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
>>>> at
>>>>
>> org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
>>>> at com.sun.proxy.$Proxy138.reportError(Unknown Source)
>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>> at
>>>>
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>>> at
>>>>
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>> at java.lang.reflect.Method.invoke(Method.java:498)
>>>> at
>>>>
>> com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
>>>> at com.sun.proxy.$Proxy138.reportError(Unknown Source)
>>>> at
>>>>
>> com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:610)
>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>> at
>>>>
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>>> at
>>>>
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>> at java.lang.reflect.Method.invoke(Method.java:498)
>>>> at
>>>>
>> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
>>>> at
>>>>
>> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>>>> at
>>>>
>> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
>>>> at
>>>>
>> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
>>>> at
>>>>
>> org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
>>>> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
>>>> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
>>>> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
>>>> at
>>>>
>> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
>>>> at
>>>>
>> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
>>>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>>>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>>>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>>>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>>>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>>>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>>>> at org.junit.runners.Suite.runChild(Suite.java:127)
>>>> at org.junit.runners.Suite.runChild(Suite.java:26)
>>>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>>>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>>>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>>>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>>>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>>>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>>>> at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
>>>> at
>>>>
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
>>>> at
>>>>
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
>>>> at
>>>>
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
>>>> at
>>>>
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
>>>> at
>>>>
>> org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
>>>> at
>>>>
>> org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
>>>> at
>>>>
>> org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
>>>> at
>>>>
>> org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
>>>> Caused by: java.net.SocketTimeoutException: 500 millis timeout while
>>>> waiting for channel to be ready for read. ch :
>>>> java.nio.channels.SocketChannel[connected local=/10.37.129.2:65143
>>>> remote=MacBook-Pro-6.local/10.37.129.2:65136]
>>>> at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
>>>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
>>>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
>>>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
>>>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
>>>> at
>>>>
>> org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
>>>> at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
>>>> at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
>>>> at java.io.DataInputStream.readInt(DataInputStream.java:387)
>>>> at
>>>>
>> org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
>>>> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
>>>> 2018-06-18 12:43:28,109 [IPC Server handler 0 on 65136] WARN
>> ipc.Server
>>>> processResponse - IPC Server handler 0 on 65136, call
>>>> reportError(containerId, null, timeout, null), rpc version=2, client
>>>> version=201208081755, methodsFingerPrint=-1300451462 from
>>>> 10.37.129.2:65142 Call#147 Retry#0: output error
>>>> 2018-06-18 12:43:28,292 [main] INFO  stram.FSRecoveryHandler rotateLog
>> -
>>>> Creating
>>>>
>> target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app1/recovery/log
>>>> 2018-06-18 12:43:28,423 [main] INFO  stram.FSRecoveryHandler rotateLog
>> -
>>>> Creating
>>>>
>> target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app1/recovery/log
>>>> 2018-06-18 12:43:28,491 [main] INFO  stram.FSRecoveryHandler rotateLog
>> -
>>>> Creating
>>>>
>> target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app2/recovery/log
>>>> 2018-06-18 12:43:28,492 [main] INFO  stram.StramClient
>> copyInitialState -
>>>> Copying initial state took 32 ms
>>>> 2018-06-18 12:43:28,607 [main] INFO  stram.FSRecoveryHandler rotateLog
>> -
>>>> Creating
>>>>
>> target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app2/recovery/log
>>>> 2018-06-18 12:43:28,671 [main] INFO  stram.FSRecoveryHandler rotateLog
>> -
>>>> Creating
>>>>
>> target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app3/recovery/log
>>>> 2018-06-18 12:43:28,673 [main] INFO  stram.StramClient
>> copyInitialState -
>>>> Copying initial state took 35 ms
>>>> 2018-06-18 12:43:28,805 [main] INFO  stram.FSRecoveryHandler rotateLog
>> -
>>>> Creating
>>>>
>> target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app3/recovery/log
>>>> 2018-06-18 12:43:28,830 [main] WARN  physical.PhysicalPlan <init> -
>>>> Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container
>> without
>>>> locality contraint due to insufficient resources.
>>>> 2018-06-18 12:43:28,830 [main] WARN  physical.PhysicalPlan <init> -
>>>> Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container
>> without
>>>> locality contraint due to insufficient resources.
>>>> 2018-06-18 12:43:28,830 [main] WARN  physical.PhysicalPlan <init> -
>>>> Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container
>> without
>>>> locality contraint due to insufficient resources.
>>>> 2018-06-18 12:43:29,046 [main] WARN  physical.PhysicalPlan <init> -
>>>> Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container
>> without
>>>> locality contraint due to insufficient resources.
>>>> 2018-06-18 12:43:29,046 [main] WARN  physical.PhysicalPlan <init> -
>>>> Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container
>> without
>>>> locality contraint due to insufficient resources.
>>>> 2018-06-18 12:43:29,047 [main] WARN  physical.PhysicalPlan <init> -
>>>> Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container
>> without
>>>> locality contraint due to insufficient resources.
>>>> 2018-06-18 12:43:29,226 [main] INFO  util.AsyncFSStorageAgent save -
>>> using
>> /Users/mbossert/testIdea/apex-core/engine/target/chkp1927717229509930939
>>> as
>>>> the basepath for checkpointing.
>>>> 2018-06-18 12:43:29,339 [main] INFO  stram.FSRecoveryHandler rotateLog
>> -
>>>> Creating
>>>>
>> target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app1/recovery/log
>>>> 2018-06-18 12:43:29,428 [main] INFO  stram.FSRecoveryHandler rotateLog
>> -
>>>> Creating
>>>>
>> target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app1/recovery/log
>>>> 2018-06-18 12:43:29,493 [main] INFO  stram.FSRecoveryHandler rotateLog
>> -
>>>> Creating
>>>>
>> target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app2/recovery/log
>>>> 2018-06-18 12:43:29,494 [main] INFO  stram.StramClient
>> copyInitialState -
>>>> Copying initial state took 29 ms
>>>> 2018-06-18 12:43:29,592 [main] INFO  stram.FSRecoveryHandler rotateLog
>> -
>>>> Creating
>>>>
>> target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app2/recovery/log
>>>> 2018-06-18 12:43:29,649 [main] INFO  stram.FSRecoveryHandler rotateLog
>> -
>>>> Creating
>>>>
>> target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app3/recovery/log
>>>> 2018-06-18 12:43:29,651 [main] INFO  stram.StramClient
>> copyInitialState -
>>>> Copying initial state took 32 ms
>>>> 2018-06-18 12:43:29,780 [main] INFO  stram.FSRecoveryHandler rotateLog
>> -
>>>> Creating
>>>>
>> target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app3/recovery/log
>>>> 2018-06-18 12:43:29,808 [main] WARN  physical.PhysicalPlan <init> -
>>>> Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container
>> without
>>>> locality contraint due to insufficient resources.
>>>> 2018-06-18 12:43:29,809 [main] WARN  physical.PhysicalPlan <init> -
>>>> Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container
>> without
>>>> locality contraint due to insufficient resources.
>>>> 2018-06-18 12:43:29,809 [main] WARN  physical.PhysicalPlan <init> -
>>>> Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container
>> without
>>>> locality contraint due to insufficient resources.
>>>> 2018-06-18 12:43:29,809 [main] INFO  util.AsyncFSStorageAgent save -
>>> using
>> /Users/mbossert/testIdea/apex-core/engine/target/chkp1976097017195725194
>>> as
>>>> the basepath for checkpointing.
>>>> 2018-06-18 12:43:30,050 [main] WARN  physical.PhysicalPlan <init> -
>>>> Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container
>> without
>>>> locality contraint due to insufficient resources.
>>>> 2018-06-18 12:43:30,050 [main] WARN  physical.PhysicalPlan <init> -
>>>> Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container
>> without
>>>> locality contraint due to insufficient resources.
>>>> 2018-06-18 12:43:30,050 [main] WARN  physical.PhysicalPlan <init> -
>>>> Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container
>> without
>>>> locality contraint due to insufficient resources.
>>>> 2018-06-18 12:43:30,051 [main] INFO  util.AsyncFSStorageAgent save -
>>> using
>> /Users/mbossert/testIdea/apex-core/engine/target/chkp3935270209625805644
>>> as
>>>> the basepath for checkpointing.
>>>> Tests run: 8, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 6.329 sec
>>>> <<< FAILURE! - in com.datatorrent.stram.StramRecoveryTest
>>>> testWriteAheadLog(com.datatorrent.stram.StramRecoveryTest)  Time elapsed: 0.097 sec  <<< FAILURE!
>>>> java.lang.AssertionError: flush count expected:<1> but was:<2>
>>>> at
>>>>
>> com.datatorrent.stram.StramRecoveryTest.testWriteAheadLog(StramRecoveryTest.java:326)
>>>>
>>>> --
>>>>
>>>> M. Aaron Bossert
>>>> (571) 242-4021
>>>> Punch Cyber Analytics Group
>>>>
>>>>
>
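The "RPC failure, will retry after 100 ms (remaining 995 ms)" and "Giving up RPC connection recovery after 504 ms" lines in the log above come from RecoverableRpcProxy retrying a failed RPC at a fixed interval until a total recovery budget runs out. A minimal sketch of that retry-until-deadline pattern (class and method names here are illustrative, not the actual Apex implementation):

```java
import java.util.concurrent.Callable;

// Sketch of a retry-until-deadline loop: retry a failing call at a fixed
// interval until the overall timeout budget is exhausted, then rethrow the
// last failure (the "Giving up ... recovery after N ms" case in the log).
public class RetryUntilDeadline {
    static <T> T invokeWithRecovery(Callable<T> call, long timeoutMillis,
                                    long retryDelayMillis) throws Exception {
        long deadline = System.currentTimeMillis() + timeoutMillis;
        Exception last;
        while (true) {
            try {
                return call.call();
            } catch (Exception e) {
                last = e;
                long remaining = deadline - System.currentTimeMillis();
                if (remaining <= 0) {
                    // Budget exhausted: give up and surface the last failure.
                    throw last;
                }
                // "RPC failure, will retry after X ms (remaining Y ms)"
                Thread.sleep(Math.min(retryDelayMillis, remaining));
            }
        }
    }

    public static void main(String[] args) throws Exception {
        final int[] attempts = {0};
        // Simulated RPC that fails twice, then succeeds within the budget.
        String result = invokeWithRecovery(() -> {
            if (++attempts[0] < 3) throw new RuntimeException("simulated RPC failure");
            return "ok";
        }, 1000, 50);
        System.out.println(result + " after " + attempts[0] + " attempts");
    }
}
```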


Re: Branch 3.7.0 failing install related to Kryo version...perhaps

Pramod Immaneni-3
In reply to this post by Aaron Bossert
The Hadoop IPC calls are failing, possibly because of their reliance on Kryo
for serializing the payload and some incompatibility with the new version. I
will dig in more to see what is going on.
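One gotcha worth noting for the upgrade itself: Kryo's Maven coordinates changed between the versions involved. If I recall correctly, 2.24.0 was published under the groupId com.esotericsoftware.kryo, while 3.x and 4.x (including 4.0.2) are published under com.esotericsoftware, so bumping only the version number will not resolve the new artifact. A sketch of the two dependency declarations (how apex-core actually manages this dependency in its pom may differ):

```xml
<!-- Old coordinates: Kryo 2.24.0 -->
<dependency>
  <groupId>com.esotericsoftware.kryo</groupId>
  <artifactId>kryo</artifactId>
  <version>2.24.0</version>
</dependency>

<!-- New coordinates: Kryo 4.0.2 -->
<dependency>
  <groupId>com.esotericsoftware</groupId>
  <artifactId>kryo</artifactId>
  <version>4.0.2</version>
</dependency>
```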

On Tue, Jun 19, 2018 at 6:54 PM Aaron Bossert <[hidden email]> wrote:

> Pramod,
>
> Thanks for taking the time to help!
>
> Here is the output (just failed parts) when running full install (clean
> install -X) on the Master branch:
>
> Running com.datatorrent.stram.StramRecoveryTest
> 2018-06-19 21:34:28,137 [main] INFO  stram.StramRecoveryTest
> testRpcFailover - Mock server listening at macbook-pro-6.lan/
> 192.168.87.125:62154
> 2018-06-19 21:34:28,678 [main] ERROR stram.RecoverableRpcProxy invoke -
> Giving up RPC connection recovery after 507 ms
> java.net.SocketTimeoutException: Call From macbook-pro-6.lan/
> 192.168.87.125
> to macbook-pro-6.lan:62154 failed on socket timeout exception:
> java.net.SocketTimeoutException: 500 millis timeout while waiting for
> channel to be ready for read. ch :
> java.nio.channels.SocketChannel[connected local=/192.168.87.125:62155
> remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details see:
> http://wiki.apache.org/hadoop/SocketTimeout
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at
>
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> at
>
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> at
>
> org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> at com.sun.proxy.$Proxy138.log(Unknown Source)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
> at com.sun.proxy.$Proxy138.log(Unknown Source)
> at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:561)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> at org.junit.runners.Suite.runChild(Suite.java:127)
> at org.junit.runners.Suite.runChild(Suite.java:26)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
> at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
> at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
> at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
> at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
> Caused by: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62155 remote=macbook-pro-6.lan/192.168.87.125:62154]
> at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
> at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
> at java.io.DataInputStream.readInt(DataInputStream.java:387)
> at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
> 2018-06-19 21:34:29,178 [IPC Server handler 0 on 62154] WARN  ipc.Server
> processResponse - IPC Server handler 0 on 62154, call log(containerId,
> timeout), rpc version=2, client version=201208081755,
> methodsFingerPrint=-1300451462 from 192.168.87.125:62155 Call#136 Retry#0:
> output error
> 2018-06-19 21:34:29,198 [main] WARN  stram.RecoverableRpcProxy invoke - RPC
> failure, will retry after 100 ms (remaining 994 ms)
> java.net.SocketTimeoutException: Call From macbook-pro-6.lan/192.168.87.125 to macbook-pro-6.lan:62154 failed on socket timeout exception: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62156 remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details see: http://wiki.apache.org/hadoop/SocketTimeout
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> at com.sun.proxy.$Proxy138.log(Unknown Source)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
> at com.sun.proxy.$Proxy138.log(Unknown Source)
> at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:575)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> at org.junit.runners.Suite.runChild(Suite.java:127)
> at org.junit.runners.Suite.runChild(Suite.java:26)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
> at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
> at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
> at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
> at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
> Caused by: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62156 remote=macbook-pro-6.lan/192.168.87.125:62154]
> at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
> at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
> at java.io.DataInputStream.readInt(DataInputStream.java:387)
> at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
> 2018-06-19 21:34:29,806 [main] WARN  stram.RecoverableRpcProxy invoke - RPC
> failure, will retry after 100 ms (remaining 386 ms)
> java.net.SocketTimeoutException: Call From macbook-pro-6.lan/192.168.87.125 to macbook-pro-6.lan:62154 failed on socket timeout exception: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62157 remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details see: http://wiki.apache.org/hadoop/SocketTimeout
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> at com.sun.proxy.$Proxy138.log(Unknown Source)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
> at com.sun.proxy.$Proxy138.log(Unknown Source)
> at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:575)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> at org.junit.runners.Suite.runChild(Suite.java:127)
> at org.junit.runners.Suite.runChild(Suite.java:26)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
> at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
> at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
> at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
> at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
> Caused by: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62157 remote=macbook-pro-6.lan/192.168.87.125:62154]
> at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
> at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
> at java.io.DataInputStream.readInt(DataInputStream.java:387)
> at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
> 2018-06-19 21:34:30,180 [IPC Server handler 0 on 62154] WARN  ipc.Server
> processResponse - IPC Server handler 0 on 62154, call log(containerId,
> timeout), rpc version=2, client version=201208081755,
> methodsFingerPrint=-1300451462 from 192.168.87.125:62156 Call#137 Retry#0:
> output error
> 2018-06-19 21:34:30,808 [main] ERROR stram.RecoverableRpcProxy invoke -
> Giving up RPC connection recovery after 506 ms
> java.net.SocketTimeoutException: Call From macbook-pro-6.lan/192.168.87.125 to macbook-pro-6.lan:62154 failed on socket timeout exception: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62159 remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details see: http://wiki.apache.org/hadoop/SocketTimeout
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> at com.sun.proxy.$Proxy138.log(Unknown Source)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
> at com.sun.proxy.$Proxy138.log(Unknown Source)
> at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:596)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> at org.junit.runners.Suite.runChild(Suite.java:127)
> at org.junit.runners.Suite.runChild(Suite.java:26)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
> at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
> at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
> at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
> at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
> Caused by: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62159 remote=macbook-pro-6.lan/192.168.87.125:62154]
> at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
> at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
> at java.io.DataInputStream.readInt(DataInputStream.java:387)
> at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
> 2018-06-19 21:34:31,307 [IPC Server handler 0 on 62154] WARN  ipc.Server
> processResponse - IPC Server handler 0 on 62154, call log(containerId,
> timeout), rpc version=2, client version=201208081755,
> methodsFingerPrint=-1300451462 from 192.168.87.125:62159 Call#141 Retry#0:
> output error
> 2018-06-19 21:34:31,327 [main] WARN  stram.RecoverableRpcProxy invoke - RPC
> failure, will retry after 100 ms (remaining 995 ms)
> java.net.SocketTimeoutException: Call From macbook-pro-6.lan/192.168.87.125 to macbook-pro-6.lan:62154 failed on socket timeout exception: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62160 remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details see: http://wiki.apache.org/hadoop/SocketTimeout
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> at com.sun.proxy.$Proxy138.reportError(Unknown Source)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
> at com.sun.proxy.$Proxy138.reportError(Unknown Source)
> at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:610)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> at org.junit.runners.Suite.runChild(Suite.java:127)
> at org.junit.runners.Suite.runChild(Suite.java:26)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
> at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
> at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
> at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
> at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
> Caused by: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62160 remote=macbook-pro-6.lan/192.168.87.125:62154]
> at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
> at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
> at java.io.DataInputStream.readInt(DataInputStream.java:387)
> at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
> 2018-06-19 21:34:31,931 [main] WARN  stram.RecoverableRpcProxy invoke - RPC
> failure, will retry after 100 ms (remaining 391 ms)
> java.net.SocketTimeoutException: Call From macbook-pro-6.lan/
> 192.168.87.125
> to macbook-pro-6.lan:62154 failed on socket timeout exception:
> java.net.SocketTimeoutException: 500 millis timeout while waiting for
> channel to be ready for read. ch :
> java.nio.channels.SocketChannel[connected local=/192.168.87.125:62161
> remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details see:
> http://wiki.apache.org/hadoop/SocketTimeout
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at
>
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> at
>
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> at
>
> org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> at com.sun.proxy.$Proxy138.reportError(Unknown Source)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
>
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at
>
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at
>
> com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
> at com.sun.proxy.$Proxy138.reportError(Unknown Source)
> at
>
> com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:610)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
>
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at
>
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at
>
> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> at
>
> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> at
>
> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> at
>
> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> at
>
> org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> at
>
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> at
>
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> at org.junit.runners.Suite.runChild(Suite.java:127)
> at org.junit.runners.Suite.runChild(Suite.java:26)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
> at
>
> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
> at
>
> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
> at
>
> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
> at
>
> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
> at
>
> org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
> at
>
> org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
> at
>
> org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
> at
> org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
> Caused by: java.net.SocketTimeoutException: 500 millis timeout while
> waiting for channel to be ready for read. ch :
> java.nio.channels.SocketChannel[connected local=/192.168.87.125:62161
> remote=macbook-pro-6.lan/192.168.87.125:62154]
> at
> org.apache.hadoop.net
> .SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
> at org.apache.hadoop.net
> .SocketInputStream.read(SocketInputStream.java:161)
> at org.apache.hadoop.net
> .SocketInputStream.read(SocketInputStream.java:131)
> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
> at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
> at java.io.DataInputStream.readInt(DataInputStream.java:387)
> at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
> 2018-06-19 21:34:32,310 [IPC Server handler 0 on 62154] WARN  ipc.Server
> processResponse - IPC Server handler 0 on 62154, call
> reportError(containerId, null, timeout, null), rpc version=2, client
> version=201208081755, methodsFingerPrint=-1300451462 from
> 192.168.87.125:62160 Call#142 Retry#0: output error
> 2018-06-19 21:34:32,512 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app1/recovery/log
> 2018-06-19 21:34:32,628 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app1/recovery/log
> 2018-06-19 21:34:32,696 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app2/recovery/log
> 2018-06-19 21:34:32,698 [main] INFO  stram.StramClient copyInitialState - Copying initial state took 32 ms
> 2018-06-19 21:34:32,799 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app2/recovery/log
> 2018-06-19 21:34:32,850 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app3/recovery/log
> 2018-06-19 21:34:32,851 [main] INFO  stram.StramClient copyInitialState - Copying initial state took 28 ms
> 2018-06-19 21:34:32,955 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app3/recovery/log
> 2018-06-19 21:34:32,976 [main] WARN  physical.PhysicalPlan <init> -
> Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container without
> locality contraint due to insufficient resources.
> 2018-06-19 21:34:32,977 [main] WARN  physical.PhysicalPlan <init> -
> Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container without
> locality contraint due to insufficient resources.
> 2018-06-19 21:34:32,977 [main] WARN  physical.PhysicalPlan <init> -
> Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container without
> locality contraint due to insufficient resources.
> 2018-06-19 21:34:33,166 [main] WARN  physical.PhysicalPlan <init> -
> Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container without
> locality contraint due to insufficient resources.
> 2018-06-19 21:34:33,166 [main] WARN  physical.PhysicalPlan <init> -
> Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container without
> locality contraint due to insufficient resources.
> 2018-06-19 21:34:33,166 [main] WARN  physical.PhysicalPlan <init> -
> Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container without
> locality contraint due to insufficient resources.
> 2018-06-19 21:34:33,338 [main] INFO  util.AsyncFSStorageAgent save - using
> /Users/mbossert/testIdea/apex-core/engine/target/chkp2603930902590449397 as
> the basepath for checkpointing.
> 2018-06-19 21:34:33,436 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app1/recovery/log
> 2018-06-19 21:34:33,505 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app1/recovery/log
> 2018-06-19 21:34:33,553 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app2/recovery/log
> 2018-06-19 21:34:33,554 [main] INFO  stram.StramClient copyInitialState - Copying initial state took 22 ms
> 2018-06-19 21:34:33,642 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app2/recovery/log
> 2018-06-19 21:34:33,690 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app3/recovery/log
> 2018-06-19 21:34:33,691 [main] INFO  stram.StramClient copyInitialState - Copying initial state took 29 ms
> 2018-06-19 21:34:33,805 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app3/recovery/log
> 2018-06-19 21:34:33,830 [main] WARN  physical.PhysicalPlan <init> -
> Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container without
> locality contraint due to insufficient resources.
> 2018-06-19 21:34:33,830 [main] WARN  physical.PhysicalPlan <init> -
> Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container without
> locality contraint due to insufficient resources.
> 2018-06-19 21:34:33,831 [main] WARN  physical.PhysicalPlan <init> -
> Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container without
> locality contraint due to insufficient resources.
> 2018-06-19 21:34:33,831 [main] INFO  util.AsyncFSStorageAgent save - using
> /Users/mbossert/testIdea/apex-core/engine/target/chkp1878353095301008843 as
> the basepath for checkpointing.
> 2018-06-19 21:34:34,077 [main] WARN  physical.PhysicalPlan <init> -
> Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container without
> locality contraint due to insufficient resources.
> 2018-06-19 21:34:34,077 [main] WARN  physical.PhysicalPlan <init> -
> Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container without
> locality contraint due to insufficient resources.
> 2018-06-19 21:34:34,077 [main] WARN  physical.PhysicalPlan <init> -
> Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container without
> locality contraint due to insufficient resources.
> 2018-06-19 21:34:34,077 [main] INFO  util.AsyncFSStorageAgent save - using
> /Users/mbossert/testIdea/apex-core/engine/target/chkp7337975615972280003 as
> the basepath for checkpointing.
> Tests run: 8, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 6.143 sec
> <<< FAILURE! - in com.datatorrent.stram.StramRecoveryTest
> testWriteAheadLog(com.datatorrent.stram.StramRecoveryTest)  Time elapsed:
> 0.111 sec  <<< FAILURE!
> java.lang.AssertionError: flush count expected:<1> but was:<2>
> at com.datatorrent.stram.StramRecoveryTest.testWriteAheadLog(StramRecoveryTest.java:326)
>
>
> Running com.datatorrent.stram.CustomControlTupleTest
> 2018-06-19 21:34:49,308 [main] INFO  util.AsyncFSStorageAgent save - using
> /Users/mbossert/testIdea/apex-core/engine/target/chkp1213673348429546877 as
> the basepath for checkpointing.
> 2018-06-19 21:34:49,451 [main] INFO  storage.DiskStorage <init> - using
> /Users/mbossert/testIdea/apex-core/engine/target as the basepath for
> spooling.
> 2018-06-19 21:34:49,451 [ProcessWideEventLoop] INFO  server.Server
> registered - Server started listening at /0:0:0:0:0:0:0:0:62181
> 2018-06-19 21:34:49,451 [main] INFO  stram.StramLocalCluster run - Buffer
> server started: localhost:62181
> 2018-06-19 21:34:49,452 [container-0] INFO  stram.StramLocalCluster run -
> Started container container-0
> 2018-06-19 21:34:49,452 [container-1] INFO  stram.StramLocalCluster run -
> Started container container-1
> 2018-06-19 21:34:49,452 [container-2] INFO  stram.StramLocalCluster run -
> Started container container-2
> 2018-06-19 21:34:49,452 [container-1] INFO  stram.StramLocalCluster log -
> container-1 msg: [container-1] Entering heartbeat loop..
> 2018-06-19 21:34:49,452 [container-0] INFO  stram.StramLocalCluster log -
> container-0 msg: [container-0] Entering heartbeat loop..
> 2018-06-19 21:34:49,452 [container-2] INFO  stram.StramLocalCluster log -
> container-2 msg: [container-2] Entering heartbeat loop..
> 2018-06-19 21:34:50,460 [container-2] INFO  engine.StreamingContainer processHeartbeatResponse - Deploy request:
> [OperatorDeployInfo[id=3,name=receiver,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=ProcessorToReceiver,sourceNodeId=2,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[]]]
> 2018-06-19 21:34:50,460 [container-0] INFO  engine.StreamingContainer processHeartbeatResponse - Deploy request:
> [OperatorDeployInfo[id=1,name=randomGenerator,type=INPUT,checkpoint={ffffffffffffffff, 0, 0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=genToProcessor,bufferServer=localhost]]]]
> 2018-06-19 21:34:50,460 [container-1] INFO  engine.StreamingContainer processHeartbeatResponse - Deploy request:
> [OperatorDeployInfo[id=2,name=process,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=localhost]]]]
> 2018-06-19 21:34:50,463 [container-0] INFO  engine.WindowGenerator activate
> - Catching up from 1529458489500 to 1529458490463
> 2018-06-19 21:34:50,465 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
> identifier=tcp://localhost:62181/2.output.1, windowId=ffffffffffffffff,
> type=ProcessorToReceiver/3.input, upstreamIdentifier=2.output.1, mask=0,
> partitions=null, bufferSize=1024}
> 2018-06-19 21:34:50,466 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received publisher request: PublishRequestTuple{version=1.0,
> identifier=1.out.1, windowId=ffffffffffffffff}
> 2018-06-19 21:34:50,466 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received publisher request: PublishRequestTuple{version=1.0,
> identifier=2.output.1, windowId=ffffffffffffffff}
> 2018-06-19 21:34:50,466 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
> identifier=tcp://localhost:62181/1.out.1, windowId=ffffffffffffffff,
> type=genToProcessor/2.input, upstreamIdentifier=1.out.1, mask=0,
> partitions=null, bufferSize=1024}
> 2018-06-19 21:34:51,458 [main] INFO  stram.StramLocalCluster run - Stopping
> on exit condition
> 2018-06-19 21:34:51,458 [container-0] INFO  engine.StreamingContainer
> processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:34:51,458 [container-1] INFO  engine.StreamingContainer
> processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:34:51,458 [container-0] INFO  stram.StramLocalCluster log -
> container-0 msg: [container-0] Exiting heartbeat loop..
> 2018-06-19 21:34:51,458 [container-2] INFO  engine.StreamingContainer
> processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:34:51,458 [container-2] INFO  stram.StramLocalCluster log -
> container-2 msg: [container-2] Exiting heartbeat loop..
> 2018-06-19 21:34:51,458 [container-1] INFO  stram.StramLocalCluster log -
> container-1 msg: [container-1] Exiting heartbeat loop..
> 2018-06-19 21:34:51,461 [container-2] INFO  stram.StramLocalCluster run -
> Container container-2 terminating.
> 2018-06-19 21:34:51,467 [container-1] INFO  stram.StramLocalCluster run -
> Container container-1 terminating.
> 2018-06-19 21:34:51,467 [container-0] INFO  stram.StramLocalCluster run -
> Container container-0 terminating.
> 2018-06-19 21:34:51,467 [ServerHelper-86-1] INFO  server.Server run - Removing ln LogicalNode@7d88b4a4identifier=tcp://localhost:62181/2.output.1, upstream=2.output.1, group=ProcessorToReceiver/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@35d66f18{da=com.datatorrent.bufferserver.internal.DataList$Block@d43c092{identifier=2.output.1, data=1048576, readingOffset=0, writingOffset=481, starting_window=5b29af3900000001, ending_window=5b29af3900000005, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@4dca4fb0[identifier=2.output.1]
> 2018-06-19 21:34:51,468 [ServerHelper-86-1] INFO  server.Server run - Removing ln LogicalNode@3cb5be9fidentifier=tcp://localhost:62181/1.out.1, upstream=1.out.1, group=genToProcessor/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@5c9a41d0{da=com.datatorrent.bufferserver.internal.DataList$Block@5a324bf4{identifier=1.out.1, data=1048576, readingOffset=0, writingOffset=481, starting_window=5b29af3900000001, ending_window=5b29af3900000005, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@49665770[identifier=1.out.1]
> 2018-06-19 21:34:51,469 [ProcessWideEventLoop] INFO  server.Server run -
> Server stopped listening at /0:0:0:0:0:0:0:0:62181
> 2018-06-19 21:34:51,469 [main] INFO  stram.StramLocalCluster run -
> Application finished.
> 2018-06-19 21:34:51,469 [main] INFO  stram.CustomControlTupleTest testApp -
> Control Tuples received 3 expected 3
> 2018-06-19 21:34:51,492 [main] INFO  util.AsyncFSStorageAgent save - using
> /Users/mbossert/testIdea/apex-core/engine/target/chkp5496551078484285394 as
> the basepath for checkpointing.
> 2018-06-19 21:34:51,623 [main] INFO  storage.DiskStorage <init> - using
> /Users/mbossert/testIdea/apex-core/engine/target as the basepath for
> spooling.
> 2018-06-19 21:34:51,624 [ProcessWideEventLoop] INFO  server.Server
> registered - Server started listening at /0:0:0:0:0:0:0:0:62186
> 2018-06-19 21:34:51,624 [main] INFO  stram.StramLocalCluster run - Buffer
> server started: localhost:62186
> 2018-06-19 21:34:51,624 [container-0] INFO  stram.StramLocalCluster run -
> Started container container-0
> 2018-06-19 21:34:51,624 [container-0] INFO  stram.StramLocalCluster log -
> container-0 msg: [container-0] Entering heartbeat loop..
> 2018-06-19 21:34:52,628 [container-0] INFO  engine.StreamingContainer processHeartbeatResponse - Deploy request:
> [OperatorDeployInfo[id=1,name=randomGenerator,type=INPUT,checkpoint={ffffffffffffffff, 0, 0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=genToProcessor,bufferServer=<null>]]], OperatorDeployInfo[id=2,name=process,type=OIO,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=THREAD_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=<null>]]], OperatorDeployInfo[id=3,name=receiver,type=OIO,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=ProcessorToReceiver,sourceNodeId=2,sourcePortName=output,locality=THREAD_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[]]]
> 2018-06-19 21:34:52,630 [container-0] INFO  engine.WindowGenerator activate
> - Catching up from 1529458491500 to 1529458492630
> 2018-06-19 21:34:53,628 [main] INFO  stram.StramLocalCluster run - Stopping
> on exit condition
> 2018-06-19 21:34:53,629 [container-0] INFO  engine.StreamingContainer
> processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:34:53,630 [container-0] INFO  stram.StramLocalCluster log -
> container-0 msg: [container-0] Exiting heartbeat loop..
> 2018-06-19 21:34:53,640 [container-0] INFO  stram.StramLocalCluster run -
> Container container-0 terminating.
> 2018-06-19 21:34:53,641 [ProcessWideEventLoop] INFO  server.Server run -
> Server stopped listening at /0:0:0:0:0:0:0:0:62186
> 2018-06-19 21:34:53,642 [main] INFO  stram.StramLocalCluster run -
> Application finished.
> 2018-06-19 21:34:53,642 [main] INFO  stram.CustomControlTupleTest testApp -
> Control Tuples received 3 expected 3
> 2018-06-19 21:34:53,659 [main] INFO  util.AsyncFSStorageAgent save - using
> /Users/mbossert/testIdea/apex-core/engine/target/chkp2212795894390935125 as
> the basepath for checkpointing.
> 2018-06-19 21:34:53,844 [main] INFO  storage.DiskStorage <init> - using
> /Users/mbossert/testIdea/apex-core/engine/target as the basepath for
> spooling.
> 2018-06-19 21:34:53,844 [ProcessWideEventLoop] INFO  server.Server
> registered - Server started listening at /0:0:0:0:0:0:0:0:62187
> 2018-06-19 21:34:53,844 [main] INFO  stram.StramLocalCluster run - Buffer
> server started: localhost:62187
> 2018-06-19 21:34:53,845 [container-0] INFO  stram.StramLocalCluster run -
> Started container container-0
> 2018-06-19 21:34:53,845 [container-1] INFO  stram.StramLocalCluster run -
> Started container container-1
> 2018-06-19 21:34:53,845 [container-0] INFO  stram.StramLocalCluster log -
> container-0 msg: [container-0] Entering heartbeat loop..
> 2018-06-19 21:34:53,845 [container-2] INFO  stram.StramLocalCluster run -
> Started container container-2
> 2018-06-19 21:34:53,845 [container-1] INFO  stram.StramLocalCluster log -
> container-1 msg: [container-1] Entering heartbeat loop..
> 2018-06-19 21:34:53,845 [container-3] INFO  stram.StramLocalCluster run -
> Started container container-3
> 2018-06-19 21:34:53,845 [container-2] INFO  stram.StramLocalCluster log -
> container-2 msg: [container-2] Entering heartbeat loop..
> 2018-06-19 21:34:53,845 [container-3] INFO  stram.StramLocalCluster log -
> container-3 msg: [container-3] Entering heartbeat loop..
> 2018-06-19 21:34:54,850 [container-3] INFO  engine.StreamingContainer processHeartbeatResponse - Deploy request:
> [OperatorDeployInfo[id=1,name=randomGenerator,type=INPUT,checkpoint={ffffffffffffffff, 0, 0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=genToProcessor,bufferServer=localhost]]]]
> 2018-06-19 21:34:54,850 [container-1] INFO  engine.StreamingContainer processHeartbeatResponse - Deploy request:
> [OperatorDeployInfo[id=3,name=process,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=<null>,partitionMask=1,partitionKeys=[1]]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=localhost]]]]
> 2018-06-19 21:34:54,850 [container-0] INFO  engine.StreamingContainer processHeartbeatResponse - Deploy request:
> [OperatorDeployInfo[id=4,name=receiver,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=ProcessorToReceiver,sourceNodeId=5,sourcePortName=outputPort,locality=CONTAINER_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[]], OperatorDeployInfo.UnifierDeployInfo[id=5,name=process.output#unifier,type=UNIFIER,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=<merge#output>,streamId=ProcessorToReceiver,sourceNodeId=2,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>], OperatorDeployInfo.InputDeployInfo[portName=<merge#output>,streamId=ProcessorToReceiver,sourceNodeId=3,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=outputPort,streamId=ProcessorToReceiver,bufferServer=<null>]]]]
> 2018-06-19 21:34:54,850 [container-2] INFO  engine.StreamingContainer processHeartbeatResponse - Deploy request:
> [OperatorDeployInfo[id=2,name=process,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=<null>,partitionMask=1,partitionKeys=[0]]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=localhost]]]]
> 2018-06-19 21:34:54,852 [container-3] INFO  engine.WindowGenerator activate
> - Catching up from 1529458493500 to 1529458494852
> 2018-06-19 21:34:54,855 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received publisher request: PublishRequestTuple{version=1.0,
> identifier=1.out.1, windowId=ffffffffffffffff}
> 2018-06-19 21:34:54,857 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received publisher request: PublishRequestTuple{version=1.0,
> identifier=2.output.1, windowId=ffffffffffffffff}
> 2018-06-19 21:34:54,858 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received publisher request: PublishRequestTuple{version=1.0,
> identifier=3.output.1, windowId=ffffffffffffffff}
> 2018-06-19 21:34:54,858 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
> identifier=tcp://localhost:62187/1.out.1, windowId=ffffffffffffffff,
> type=genToProcessor/3.input, upstreamIdentifier=1.out.1, mask=1,
> partitions=[1], bufferSize=1024}
> 2018-06-19 21:34:54,858 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
> identifier=tcp://localhost:62187/1.out.1, windowId=ffffffffffffffff,
> type=genToProcessor/2.input, upstreamIdentifier=1.out.1, mask=1,
> partitions=[0], bufferSize=1024}
> 2018-06-19 21:34:54,858 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
> identifier=tcp://localhost:62187/3.output.1, windowId=ffffffffffffffff,
> type=ProcessorToReceiver/5.<merge#output>(3.output),
> upstreamIdentifier=3.output.1, mask=0, partitions=null, bufferSize=1024}
> 2018-06-19 21:34:54,859 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
> identifier=tcp://localhost:62187/2.output.1, windowId=ffffffffffffffff,
> type=ProcessorToReceiver/5.<merge#output>(2.output),
> upstreamIdentifier=2.output.1, mask=0, partitions=null, bufferSize=1024}
> 2018-06-19 21:34:55,851 [main] INFO  stram.StramLocalCluster run - Stopping
> on exit condition
> 2018-06-19 21:34:55,852 [container-2] INFO  engine.StreamingContainer
> processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:34:55,852 [container-3] INFO  engine.StreamingContainer
> processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:34:55,852 [container-3] INFO  stram.StramLocalCluster log -
> container-3 msg: [container-3] Exiting heartbeat loop..
> 2018-06-19 21:34:55,852 [container-0] INFO  engine.StreamingContainer
> processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:34:55,852 [container-2] INFO  stram.StramLocalCluster log -
> container-2 msg: [container-2] Exiting heartbeat loop..
> 2018-06-19 21:34:55,852 [container-1] INFO  engine.StreamingContainer
> processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:34:55,852 [container-0] INFO  stram.StramLocalCluster log -
> container-0 msg: [container-0] Exiting heartbeat loop..
> 2018-06-19 21:34:55,852 [container-1] INFO  stram.StramLocalCluster log -
> container-1 msg: [container-1] Exiting heartbeat loop..
> 2018-06-19 21:34:55,857 [container-1] INFO  stram.StramLocalCluster run -
> Container container-1 terminating.
> 2018-06-19 21:34:55,858 [container-3] INFO  stram.StramLocalCluster run -
> Container container-3 terminating.
> 2018-06-19 21:34:55,858 [ServerHelper-92-1] INFO  server.Server run - Removing ln LogicalNode@5dbf681cidentifier=tcp://localhost:62187/3.output.1, upstream=3.output.1, group=ProcessorToReceiver/5.<merge#output>(3.output), partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@6244ac9{da=com.datatorrent.bufferserver.internal.DataList$Block@60e28815{identifier=3.output.1, data=1048576, readingOffset=0, writingOffset=487, starting_window=5b29af3d00000001, ending_window=5b29af3d00000006, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@46bbe39d[identifier=3.output.1]
> 2018-06-19 21:34:55,858 [ServerHelper-92-1] INFO  server.Server run - Removing ln LogicalNode@7fb3226aidentifier=tcp://localhost:62187/1.out.1, upstream=1.out.1, group=genToProcessor/2.input, partitions=[BitVector{mask=1, bits=0}], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@2ad6890f{da=com.datatorrent.bufferserver.internal.DataList$Block@e00fc9e{identifier=1.out.1, data=1048576, readingOffset=0, writingOffset=487, starting_window=5b29af3d00000001, ending_window=5b29af3d00000006, refCount=3, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@7a566f6b[identifier=1.out.1]
> 2018-06-19 21:34:55,858 [ServerHelper-92-1] INFO  server.Server run - Removing ln LogicalNode@2551b8a4identifier=tcp://localhost:62187/1.out.1, upstream=1.out.1, group=genToProcessor/3.input, partitions=[BitVector{mask=1, bits=1}], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@6368ccb7{da=com.datatorrent.bufferserver.internal.DataList$Block@e00fc9e{identifier=1.out.1, data=1048576, readingOffset=0, writingOffset=487, starting_window=5b29af3d00000001, ending_window=5b29af3d00000006, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@7a566f6b[identifier=1.out.1]
> 2018-06-19 21:34:55,862 [container-2] INFO  stram.StramLocalCluster run -
> Container container-2 terminating.
> 2018-06-19 21:34:55,862 [ServerHelper-92-1] INFO  server.Server run - Removing ln LogicalNode@2e985326identifier=tcp://localhost:62187/2.output.1, upstream=2.output.1, group=ProcessorToReceiver/5.<merge#output>(2.output), partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@7d68bf24{da=com.datatorrent.bufferserver.internal.DataList$Block@7405581b{identifier=2.output.1, data=1048576, readingOffset=0, writingOffset=487, starting_window=5b29af3d00000001, ending_window=5b29af3d00000006, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@3de15cc7[identifier=2.output.1]
> 2018-06-19 21:34:55,862 [container-0] INFO  stram.StramLocalCluster run -
> Container container-0 terminating.
> 2018-06-19 21:34:55,864 [ProcessWideEventLoop] INFO  server.Server run -
> Server stopped listening at /0:0:0:0:0:0:0:0:62187
> 2018-06-19 21:34:55,864 [main] INFO  stram.StramLocalCluster run -
> Application finished.
> 2018-06-19 21:34:55,864 [main] INFO  stram.CustomControlTupleTest testApp -
> Control Tuples received 3 expected 3
> 2018-06-19 21:34:55,883 [main] INFO  util.AsyncFSStorageAgent save - using
> /Users/mbossert/testIdea/apex-core/engine/target/chkp8804999206923662400 as
> the basepath for checkpointing.
> 2018-06-19 21:34:56,032 [main] INFO  storage.DiskStorage <init> - using
> /Users/mbossert/testIdea/apex-core/engine/target as the basepath for
> spooling.
> 2018-06-19 21:34:56,032 [ProcessWideEventLoop] INFO  server.Server
> registered - Server started listening at /0:0:0:0:0:0:0:0:62195
> 2018-06-19 21:34:56,032 [main] INFO  stram.StramLocalCluster run - Buffer
> server started: localhost:62195
> 2018-06-19 21:34:56,033 [container-0] INFO  stram.StramLocalCluster run -
> Started container container-0
> 2018-06-19 21:34:56,033 [container-0] INFO  stram.StramLocalCluster log -
> container-0 msg: [container-0] Entering heartbeat loop..
> 2018-06-19 21:34:57,038 [container-0] INFO  engine.StreamingContainer processHeartbeatResponse - Deploy request:
> [OperatorDeployInfo[id=2,name=process,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=CONTAINER_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=<null>]]], OperatorDeployInfo[id=3,name=receiver,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=ProcessorToReceiver,sourceNodeId=2,sourcePortName=output,locality=CONTAINER_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[]], OperatorDeployInfo[id=1,name=randomGenerator,type=INPUT,checkpoint={ffffffffffffffff, 0, 0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=genToProcessor,bufferServer=<null>]]]]
> 2018-06-19 21:34:57,040 [container-0] INFO  engine.WindowGenerator activate
> - Catching up from 1529458495500 to 1529458497040
> 2018-06-19 21:34:58,042 [main] INFO  stram.StramLocalCluster run - Stopping
> on exit condition
> 2018-06-19 21:34:59,045 [main] WARN  stram.StramLocalCluster run -
> Container thread container-0 is still alive
> 2018-06-19 21:34:59,047 [ProcessWideEventLoop] INFO  server.Server run -
> Server stopped listening at /0:0:0:0:0:0:0:0:62195
> 2018-06-19 21:34:59,047 [container-0] INFO  engine.StreamingContainer
> processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:34:59,047 [main] INFO  stram.StramLocalCluster run -
> Application finished.
> 2018-06-19 21:34:59,047 [main] INFO  stram.CustomControlTupleTest testApp -
> Control Tuples received 4 expected 4
> 2018-06-19 21:34:59,047 [container-0] INFO  stram.StramLocalCluster log -
> container-0 msg: [container-0] Exiting heartbeat loop..
> 2018-06-19 21:34:59,057 [container-0] INFO  stram.StramLocalCluster run -
> Container container-0 terminating.
> 2018-06-19 21:34:59,064 [main] INFO  util.AsyncFSStorageAgent save - using
> /Users/mbossert/testIdea/apex-core/engine/target/chkp4046668014410536641 as
> the basepath for checkpointing.
> 2018-06-19 21:34:59,264 [main] INFO  storage.DiskStorage <init> - using
> /Users/mbossert/testIdea/apex-core/engine/target as the basepath for
> spooling.
> 2018-06-19 21:34:59,264 [ProcessWideEventLoop] INFO  server.Server
> registered - Server started listening at /0:0:0:0:0:0:0:0:62196
> 2018-06-19 21:34:59,265 [main] INFO  stram.StramLocalCluster run - Buffer
> server started: localhost:62196
> 2018-06-19 21:34:59,265 [container-0] INFO  stram.StramLocalCluster run -
> Started container container-0
> 2018-06-19 21:34:59,265 [container-0] INFO  stram.StramLocalCluster log -
> container-0 msg: [container-0] Entering heartbeat loop..
> 2018-06-19 21:34:59,265 [container-1] INFO  stram.StramLocalCluster run -
> Started container container-1
> 2018-06-19 21:34:59,265 [container-2] INFO  stram.StramLocalCluster run -
> Started container container-2
> 2018-06-19 21:34:59,266 [container-1] INFO  stram.StramLocalCluster log -
> container-1 msg: [container-1] Entering heartbeat loop..
> 2018-06-19 21:34:59,266 [container-3] INFO  stram.StramLocalCluster run -
> Started container container-3
> 2018-06-19 21:34:59,266 [container-2] INFO  stram.StramLocalCluster log -
> container-2 msg: [container-2] Entering heartbeat loop..
> 2018-06-19 21:34:59,266 [container-3] INFO  stram.StramLocalCluster log -
> container-3 msg: [container-3] Entering heartbeat loop..
> 2018-06-19 21:35:00,270 [container-0] INFO  engine.StreamingContainer processHeartbeatResponse - Deploy request:
> [OperatorDeployInfo[id=1,name=randomGenerator,type=INPUT,checkpoint={ffffffffffffffff, 0, 0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=genToProcessor,bufferServer=localhost]]]]
> 2018-06-19 21:35:00,270 [container-2] INFO  engine.StreamingContainer processHeartbeatResponse - Deploy request:
> [OperatorDeployInfo[id=2,name=process,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=<null>,partitionMask=1,partitionKeys=[0]]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=localhost]]]]
> 2018-06-19 21:35:00,271 [container-1] INFO  engine.StreamingContainer processHeartbeatResponse - Deploy request:
> [OperatorDeployInfo[id=4,name=receiver,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=ProcessorToReceiver,sourceNodeId=5,sourcePortName=outputPort,locality=CONTAINER_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[]], OperatorDeployInfo.UnifierDeployInfo[id=5,name=process.output#unifier,type=UNIFIER,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=<merge#output>,streamId=ProcessorToReceiver,sourceNodeId=2,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>], OperatorDeployInfo.InputDeployInfo[portName=<merge#output>,streamId=ProcessorToReceiver,sourceNodeId=3,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=outputPort,streamId=ProcessorToReceiver,bufferServer=<null>]]]]
> 2018-06-19 21:35:00,270 [container-3] INFO  engine.StreamingContainer processHeartbeatResponse - Deploy request:
> [OperatorDeployInfo[id=3,name=process,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=<null>,partitionMask=1,partitionKeys=[1]]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=localhost]]]]
> 2018-06-19 21:35:00,273 [container-0] INFO  engine.WindowGenerator activate
> - Catching up from 1529458499500 to 1529458500273
> 2018-06-19 21:35:00,274 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received publisher request: PublishRequestTuple{version=1.0,
> identifier=1.out.1, windowId=ffffffffffffffff}
> 2018-06-19 21:35:00,276 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received publisher request: PublishRequestTuple{version=1.0,
> identifier=3.output.1, windowId=ffffffffffffffff}
> 2018-06-19 21:35:00,277 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
> identifier=tcp://localhost:62196/1.out.1, windowId=ffffffffffffffff,
> type=genToProcessor/2.input, upstreamIdentifier=1.out.1, mask=1,
> partitions=[0], bufferSize=1024}
> 2018-06-19 21:35:00,278 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received publisher request: PublishRequestTuple{version=1.0,
> identifier=2.output.1, windowId=ffffffffffffffff}
> 2018-06-19 21:35:00,278 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
> identifier=tcp://localhost:62196/1.out.1, windowId=ffffffffffffffff,
> type=genToProcessor/3.input, upstreamIdentifier=1.out.1, mask=1,
> partitions=[1], bufferSize=1024}
> 2018-06-19 21:35:00,278 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
> identifier=tcp://localhost:62196/3.output.1, windowId=ffffffffffffffff,
> type=ProcessorToReceiver/5.<merge#output>(3.output),
> upstreamIdentifier=3.output.1, mask=0, partitions=null, bufferSize=1024}
> 2018-06-19 21:35:00,278 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
> identifier=tcp://localhost:62196/2.output.1, windowId=ffffffffffffffff,
> type=ProcessorToReceiver/5.<merge#output>(2.output),
> upstreamIdentifier=2.output.1, mask=0, partitions=null, bufferSize=1024}
> 2018-06-19 21:35:01,273 [main] INFO  stram.StramLocalCluster run - Stopping
> on exit condition
> 2018-06-19 21:35:01,273 [container-3] INFO  engine.StreamingContainer
> processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:35:01,273 [container-2] INFO  engine.StreamingContainer
> processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:35:01,273 [container-2] INFO  stram.StramLocalCluster log -
> container-2 msg: [container-2] Exiting heartbeat loop..
> 2018-06-19 21:35:01,273 [container-0] INFO  engine.StreamingContainer
> processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:35:01,274 [container-0] INFO  stram.StramLocalCluster log -
> container-0 msg: [container-0] Exiting heartbeat loop..
> 2018-06-19 21:35:01,273 [container-3] INFO  stram.StramLocalCluster log -
> container-3 msg: [container-3] Exiting heartbeat loop..
> 2018-06-19 21:35:01,273 [container-1] INFO  engine.StreamingContainer
> processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:35:01,274 [container-1] INFO  stram.StramLocalCluster log -
> container-1 msg: [container-1] Exiting heartbeat loop..
> 2018-06-19 21:35:01,279 [container-3] INFO  stram.StramLocalCluster run -
> Container container-3 terminating.
> 2018-06-19 21:35:01,279 [ServerHelper-98-1] INFO  server.Server run -
> Removing ln LogicalNode@d80a435identifier=tcp://localhost:62196/3.output.1,
> upstream=3.output.1, group=ProcessorToReceiver/5.<merge#output>(3.output),
> partitions=[],
>
> iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@2f53b5e7
> {da=com.datatorrent.bufferserver.internal.DataList$Block@1e2d4212
> {identifier=3.output.1,
> data=1048576, readingOffset=0, writingOffset=36,
> starting_window=5b29af4300000001, ending_window=5b29af4300000005,
> refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
> DataList@1684ecee[identifier=3.output.1]
> 2018-06-19 21:35:01,285 [container-2] INFO  stram.StramLocalCluster run -
> Container container-2 terminating.
> 2018-06-19 21:35:01,285 [container-1] INFO  stram.StramLocalCluster run -
> Container container-1 terminating.
> 2018-06-19 21:35:01,286 [container-0] INFO  stram.StramLocalCluster run -
> Container container-0 terminating.
> 2018-06-19 21:35:01,286 [ServerHelper-98-1] INFO  server.Server run -
> Removing ln LogicalNode@75d245a1identifier=tcp://localhost:62196/2.output.1,
> upstream=2.output.1, group=ProcessorToReceiver/5.<merge#output>(2.output),
> partitions=[],
>
> iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@719c5cd2
> {da=com.datatorrent.bufferserver.internal.DataList$Block@43d2338b
> {identifier=2.output.1,
> data=1048576, readingOffset=0, writingOffset=36,
> starting_window=5b29af4300000001, ending_window=5b29af4300000005,
> refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
> DataList@379bd431[identifier=2.output.1]
> 2018-06-19 21:35:01,286 [ServerHelper-98-1] INFO  server.Server run -
> Removing ln LogicalNode@54c0b0d5identifier=tcp://localhost:62196/1.out.1,
> upstream=1.out.1, group=genToProcessor/2.input,
> partitions=[BitVector{mask=1, bits=0}],
>
> iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@649adb0
> {da=com.datatorrent.bufferserver.internal.DataList$Block@15201b67
> {identifier=1.out.1,
> data=1048576, readingOffset=0, writingOffset=36,
> starting_window=5b29af4300000001, ending_window=5b29af4300000005,
> refCount=3, uniqueIdentifier=0, next=null, future=null}}} from dl
> DataList@47bc3c23[identifier=1.out.1]
> 2018-06-19 21:35:01,286 [ServerHelper-98-1] INFO  server.Server run -
> Removing ln LogicalNode@2422ada2identifier=tcp://localhost:62196/1.out.1,
> upstream=1.out.1, group=genToProcessor/3.input,
> partitions=[BitVector{mask=1, bits=1}],
>
> iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@2e6f42b9
> {da=com.datatorrent.bufferserver.internal.DataList$Block@15201b67
> {identifier=1.out.1,
> data=1048576, readingOffset=0, writingOffset=36,
> starting_window=5b29af4300000001, ending_window=5b29af4300000005,
> refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
> DataList@47bc3c23[identifier=1.out.1]
> 2018-06-19 21:35:01,287 [ProcessWideEventLoop] INFO  server.Server run -
> Server stopped listening at /0:0:0:0:0:0:0:0:62196
> 2018-06-19 21:35:01,287 [main] INFO  stram.StramLocalCluster run -
> Application finished.
> 2018-06-19 21:35:01,288 [main] INFO  stram.CustomControlTupleTest testApp -
> Control Tuples received 0 expected 1
> 2018-06-19 21:35:01,305 [main] INFO  util.AsyncFSStorageAgent save - using
> /Users/mbossert/testIdea/apex-core/engine/target/chkp6727909541678525259 as
> the basepath for checkpointing.
> 2018-06-19 21:35:01,460 [main] INFO  storage.DiskStorage <init> - using
> /Users/mbossert/testIdea/apex-core/engine/target as the basepath for
> spooling.
> 2018-06-19 21:35:01,460 [ProcessWideEventLoop] INFO  server.Server
> registered - Server started listening at /0:0:0:0:0:0:0:0:62204
> 2018-06-19 21:35:01,461 [main] INFO  stram.StramLocalCluster run - Buffer
> server started: localhost:62204
> 2018-06-19 21:35:01,461 [container-0] INFO  stram.StramLocalCluster run -
> Started container container-0
> 2018-06-19 21:35:01,461 [container-1] INFO  stram.StramLocalCluster run -
> Started container container-1
> 2018-06-19 21:35:01,461 [container-0] INFO  stram.StramLocalCluster log -
> container-0 msg: [container-0] Entering heartbeat loop..
> 2018-06-19 21:35:01,461 [container-2] INFO  stram.StramLocalCluster run -
> Started container container-2
> 2018-06-19 21:35:01,461 [container-1] INFO  stram.StramLocalCluster log -
> container-1 msg: [container-1] Entering heartbeat loop..
> 2018-06-19 21:35:01,462 [container-2] INFO  stram.StramLocalCluster log -
> container-2 msg: [container-2] Entering heartbeat loop..
> 2018-06-19 21:35:02,464 [container-0] INFO  engine.StreamingContainer
> processHeartbeatResponse - Deploy request:
>
> [OperatorDeployInfo[id=3,name=receiver,type=GENERIC,checkpoint={ffffffffffffffff,
> 0,
>
> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=ProcessorToReceiver,sourceNodeId=2,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[]]]
> 2018-06-19 21:35:02,464 [container-1] INFO  engine.StreamingContainer
> processHeartbeatResponse - Deploy request:
>
> [OperatorDeployInfo[id=2,name=process,type=GENERIC,checkpoint={ffffffffffffffff,
> 0,
>
> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=localhost]]]]
> 2018-06-19 21:35:02,464 [container-2] INFO  engine.StreamingContainer
> processHeartbeatResponse - Deploy request:
>
> [OperatorDeployInfo[id=1,name=randomGenerator,type=INPUT,checkpoint={ffffffffffffffff,
> 0,
>
> 0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=genToProcessor,bufferServer=localhost]]]]
> 2018-06-19 21:35:02,467 [container-2] INFO  engine.WindowGenerator activate
> - Catching up from 1529458501500 to 1529458502467
> 2018-06-19 21:35:02,469 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
> identifier=tcp://localhost:62204/2.output.1, windowId=ffffffffffffffff,
> type=ProcessorToReceiver/3.input, upstreamIdentifier=2.output.1, mask=0,
> partitions=null, bufferSize=1024}
> 2018-06-19 21:35:02,469 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received publisher request: PublishRequestTuple{version=1.0,
> identifier=1.out.1, windowId=ffffffffffffffff}
> 2018-06-19 21:35:02,470 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received publisher request: PublishRequestTuple{version=1.0,
> identifier=2.output.1, windowId=ffffffffffffffff}
> 2018-06-19 21:35:02,470 [ProcessWideEventLoop] INFO  server.Server
> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
> identifier=tcp://localhost:62204/1.out.1, windowId=ffffffffffffffff,
> type=genToProcessor/2.input, upstreamIdentifier=1.out.1, mask=0,
> partitions=null, bufferSize=1024}
> 2018-06-19 21:35:03,463 [main] INFO  stram.StramLocalCluster run - Stopping
> on exit condition
> 2018-06-19 21:35:03,463 [container-1] INFO  engine.StreamingContainer
> processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:35:03,463 [container-2] INFO  engine.StreamingContainer
> processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:35:03,464 [container-2] INFO  stram.StramLocalCluster log -
> container-2 msg: [container-2] Exiting heartbeat loop..
> 2018-06-19 21:35:03,463 [container-1] INFO  stram.StramLocalCluster log -
> container-1 msg: [container-1] Exiting heartbeat loop..
> 2018-06-19 21:35:03,463 [container-0] INFO  engine.StreamingContainer
> processHeartbeatResponse - Received shutdown request type ABORT
> 2018-06-19 21:35:03,464 [container-0] INFO  stram.StramLocalCluster log -
> container-0 msg: [container-0] Exiting heartbeat loop..
> 2018-06-19 21:35:03,464 [container-2] INFO  stram.StramLocalCluster run -
> Container container-2 terminating.
> 2018-06-19 21:35:03,465 [ServerHelper-101-1] INFO  server.Server run -
> Removing ln LogicalNode@5a90f429identifier=tcp://localhost:62204/1.out.1,
> upstream=1.out.1, group=genToProcessor/2.input, partitions=[],
>
> iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@1fa9e9ce
> {da=com.datatorrent.bufferserver.internal.DataList$Block@6b00c947
> {identifier=1.out.1,
> data=1048576, readingOffset=0, writingOffset=481,
> starting_window=5b29af4500000001, ending_window=5b29af4500000005,
> refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
> DataList@67d38a09[identifier=1.out.1]
> 2018-06-19 21:35:03,470 [container-1] INFO  stram.StramLocalCluster run -
> Container container-1 terminating.
> 2018-06-19 21:35:03,470 [container-0] INFO  stram.StramLocalCluster run -
> Container container-0 terminating.
> 2018-06-19 21:35:03,471 [ServerHelper-101-1] INFO  server.Server run -
> Removing ln LogicalNode@1badfe12identifier=tcp://localhost:62204/2.output.1,
> upstream=2.output.1, group=ProcessorToReceiver/3.input, partitions=[],
>
> iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@3abf0b66
> {da=com.datatorrent.bufferserver.internal.DataList$Block@6a887266
> {identifier=2.output.1,
> data=1048576, readingOffset=0, writingOffset=481,
> starting_window=5b29af4500000001, ending_window=5b29af4500000005,
> refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
> DataList@7afd481[identifier=2.output.1]
> 2018-06-19 21:35:03,472 [ProcessWideEventLoop] INFO  server.Server run -
> Server stopped listening at /0:0:0:0:0:0:0:0:62204
> 2018-06-19 21:35:03,472 [main] INFO  stram.StramLocalCluster run -
> Application finished.
> 2018-06-19 21:35:03,472 [main] INFO  stram.CustomControlTupleTest testApp -
> Control Tuples received 3 expected 3
> 2018-06-19 21:35:03,489 [main] INFO  util.AsyncFSStorageAgent save - using
> /Users/mbossert/testIdea/apex-core/engine/target/chkp1123378605276624191 as
> the basepath for checkpointing.
> 2018-06-19 21:35

Re: Branch 3.7.0 failing install related to Kryo version...perhaps

Aaron Bossert
Gentlemen,

I am at the start of a fairly large project and have some questions about
the general health of Apex. I need some assurance that the project is not
going to die on the vine, as it were.

I see that the volume of commits and contributor activity dropped off
significantly in early 2016, and that there was another drop after
DataTorrent folded. What is your sense of the project? I really like the
framework and would prefer to use it as well as contribute back; I just
want to make sure I am not going to end up working on it solo or, worse,
having to switch to something else later.

Your thoughts?

Aaron

On Wed, Jun 20, 2018 at 12:10 PM Pramod Immaneni <[hidden email]>
wrote:

> The Hadoop IPC calls are failing, possibly because of their reliance on
> Kryo for serializing the payload and some incompatibility with the new
> version. I will dig in more to see what is going on.
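For context, the dependency change under test is a bump of Kryo from 2.24.0 to 4.0.2. A minimal sketch of what that looks like in a Maven POM follows; the real apex-core build may manage this version through a property or a `dependencyManagement` section, so treat the placement here as illustrative rather than as the project's actual configuration:

```xml
<!-- Illustrative sketch only: apex-core may pin this version via a
     property or dependencyManagement rather than a direct entry. -->
<dependency>
  <groupId>com.esotericsoftware</groupId>
  <artifactId>kryo</artifactId>
  <version>4.0.2</version>
</dependency>
```

One caveat worth keeping in mind: Kryo's serialized byte format is not guaranteed to be compatible across major versions, so any two processes that exchange Kryo-serialized payloads (as the STRAM RPC path appears to) generally need to agree on the Kryo major version at both ends.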
>
> On Tue, Jun 19, 2018 at 6:54 PM Aaron Bossert <[hidden email]>
> wrote:
>
> > Pramod,
> >
> > Thanks for taking the time to help!
> >
> > Here is the output (just failed parts) when running full install (clean
> > install -X) on the Master branch:
> >
> > Running com.datatorrent.stram.StramRecoveryTest
> > 2018-06-19 21:34:28,137 [main] INFO  stram.StramRecoveryTest
> > testRpcFailover - Mock server listening at macbook-pro-6.lan/
> > 192.168.87.125:62154
> > 2018-06-19 21:34:28,678 [main] ERROR stram.RecoverableRpcProxy invoke -
> > Giving up RPC connection recovery after 507 ms
> > java.net.SocketTimeoutException: Call From macbook-pro-6.lan/
> > 192.168.87.125
> > to macbook-pro-6.lan:62154 failed on socket timeout exception:
> > java.net.SocketTimeoutException: 500 millis timeout while waiting for
> > channel to be ready for read. ch :
> > java.nio.channels.SocketChannel[connected local=/192.168.87.125:62155
> > remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details see:
> > http://wiki.apache.org/hadoop/SocketTimeout
> > at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> > at
> >
> >
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> > at
> >
> >
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> > at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> > at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> > at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> > at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> > at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> > at
> >
> >
> org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> > at com.sun.proxy.$Proxy138.log(Unknown Source)
> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > at
> >
> >
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > at
> >
> >
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > at java.lang.reflect.Method.invoke(Method.java:498)
> > at
> >
> >
> com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
> > at com.sun.proxy.$Proxy138.log(Unknown Source)
> > at
> >
> >
> com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:561)
> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > at
> >
> >
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > at
> >
> >
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > at java.lang.reflect.Method.invoke(Method.java:498)
> > at
> >
> >
> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> > at
> >
> >
> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> > at
> >
> >
> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> > at
> >
> >
> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> > at
> >
> >
> org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> > at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> > at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> > at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> > at
> >
> >
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> > at
> >
> >
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> > at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> > at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> > at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> > at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> > at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> > at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> > at org.junit.runners.Suite.runChild(Suite.java:127)
> > at org.junit.runners.Suite.runChild(Suite.java:26)
> > at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> > at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> > at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> > at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> > at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> > at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> > at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
> > at
> >
> >
> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
> > at
> >
> >
> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
> > at
> >
> >
> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
> > at
> >
> >
> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
> > at
> >
> >
> org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
> > at
> >
> >
> org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
> > at
> >
> >
> org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
> > at
> > org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
> > Caused by: java.net.SocketTimeoutException: 500 millis timeout while
> > waiting for channel to be ready for read. ch :
> > java.nio.channels.SocketChannel[connected local=/192.168.87.125:62155
> > remote=macbook-pro-6.lan/192.168.87.125:62154]
> > at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
> > at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
> > at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
> > at java.io.FilterInputStream.read(FilterInputStream.java:133)
> > at java.io.FilterInputStream.read(FilterInputStream.java:133)
> > at
> >
> >
> org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
> > at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> > at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
> > at java.io.DataInputStream.readInt(DataInputStream.java:387)
> > at
> >
> >
> org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> > at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
> > 2018-06-19 21:34:29,178 [IPC Server handler 0 on 62154] WARN  ipc.Server
> > processResponse - IPC Server handler 0 on 62154, call log(containerId,
> > timeout), rpc version=2, client version=201208081755,
> > methodsFingerPrint=-1300451462 from 192.168.87.125:62155 Call#136
> Retry#0:
> > output error
> > 2018-06-19 21:34:29,198 [main] WARN  stram.RecoverableRpcProxy invoke -
> RPC
> > failure, will retry after 100 ms (remaining 994 ms)
> > java.net.SocketTimeoutException: Call From macbook-pro-6.lan/
> > 192.168.87.125
> > to macbook-pro-6.lan:62154 failed on socket timeout exception:
> > java.net.SocketTimeoutException: 500 millis timeout while waiting for
> > channel to be ready for read. ch :
> > java.nio.channels.SocketChannel[connected local=/192.168.87.125:62156
> > remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details see:
> > http://wiki.apache.org/hadoop/SocketTimeout
> > at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> > at
> >
> >
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> > at
> >
> >
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> > at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> > at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> > at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> > at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> > at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> > at
> >
> >
> org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> > at com.sun.proxy.$Proxy138.log(Unknown Source)
> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > at
> >
> >
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > at
> >
> >
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > at java.lang.reflect.Method.invoke(Method.java:498)
> > at
> >
> >
> com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
> > at com.sun.proxy.$Proxy138.log(Unknown Source)
> > at
> >
> >
> com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:575)
> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > at
> >
> >
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > at
> >
> >
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > at java.lang.reflect.Method.invoke(Method.java:498)
> > at
> >
> >
> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> > at
> >
> >
> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> > at
> >
> >
> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> > at
> >
> >
> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> > at
> >
> >
> org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> > at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> > at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> > at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> > at
> >
> >
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> > at
> >
> >
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> > at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> > at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> > at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> > at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> > at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> > at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> > at org.junit.runners.Suite.runChild(Suite.java:127)
> > at org.junit.runners.Suite.runChild(Suite.java:26)
> > at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> > at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> > at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> > at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> > at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> > at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> > at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
> > at
> >
> >
> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
> > at
> >
> >
> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
> > at
> >
> >
> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
> > at
> >
> >
> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
> > at
> >
> >
> org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
> > at
> >
> >
> org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
> > at
> >
> >
> org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
> > at
> > org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
> > Caused by: java.net.SocketTimeoutException: 500 millis timeout while
> > waiting for channel to be ready for read. ch :
> > java.nio.channels.SocketChannel[connected local=/192.168.87.125:62156
> > remote=macbook-pro-6.lan/192.168.87.125:62154]
> > at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
> > at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
> > at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
> > at java.io.FilterInputStream.read(FilterInputStream.java:133)
> > at java.io.FilterInputStream.read(FilterInputStream.java:133)
> > at
> >
> >
> org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
> > at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> > at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
> > at java.io.DataInputStream.readInt(DataInputStream.java:387)
> > at
> >
> >
> org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> > at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
> > 2018-06-19 21:34:29,806 [main] WARN  stram.RecoverableRpcProxy invoke -
> RPC
> > failure, will retry after 100 ms (remaining 386 ms)
> > java.net.SocketTimeoutException: Call From macbook-pro-6.lan/
> > 192.168.87.125
> > to macbook-pro-6.lan:62154 failed on socket timeout exception:
> > java.net.SocketTimeoutException: 500 millis timeout while waiting for
> > channel to be ready for read. ch :
> > java.nio.channels.SocketChannel[connected local=/192.168.87.125:62157
> > remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details see:
> > http://wiki.apache.org/hadoop/SocketTimeout
> > at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> > at
> >
> >
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> > at
> >
> >
> > at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> > at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> > at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> > at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> > at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> > at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> > at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> > at com.sun.proxy.$Proxy138.log(Unknown Source)
> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > at java.lang.reflect.Method.invoke(Method.java:498)
> > at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
> > at com.sun.proxy.$Proxy138.log(Unknown Source)
> > at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:575)
> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > at java.lang.reflect.Method.invoke(Method.java:498)
> > at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> > at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> > at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> > at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> > at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> > at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> > at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> > at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> > at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> > at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> > at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> > at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> > at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> > at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> > at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> > at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> > at org.junit.runners.Suite.runChild(Suite.java:127)
> > at org.junit.runners.Suite.runChild(Suite.java:26)
> > at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> > at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> > at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> > at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> > at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> > at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> > at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
> > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
> > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
> > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
> > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
> > at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
> > at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
> > at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
> > at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
> > Caused by: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62157 remote=macbook-pro-6.lan/192.168.87.125:62154]
> > at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
> > at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
> > at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
> > at java.io.FilterInputStream.read(FilterInputStream.java:133)
> > at java.io.FilterInputStream.read(FilterInputStream.java:133)
> > at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
> > at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> > at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
> > at java.io.DataInputStream.readInt(DataInputStream.java:387)
> > at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> > at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
> > 2018-06-19 21:34:30,180 [IPC Server handler 0 on 62154] WARN  ipc.Server processResponse - IPC Server handler 0 on 62154, call log(containerId, timeout), rpc version=2, client version=201208081755, methodsFingerPrint=-1300451462 from 192.168.87.125:62156 Call#137 Retry#0: output error
> > 2018-06-19 21:34:30,808 [main] ERROR stram.RecoverableRpcProxy invoke - Giving up RPC connection recovery after 506 ms
> > java.net.SocketTimeoutException: Call From macbook-pro-6.lan/192.168.87.125 to macbook-pro-6.lan:62154 failed on socket timeout exception: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62159 remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details see: http://wiki.apache.org/hadoop/SocketTimeout
> > at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> > at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> > at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> > at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> > at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> > at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> > at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> > at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> > at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> > at com.sun.proxy.$Proxy138.log(Unknown Source)
> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > at java.lang.reflect.Method.invoke(Method.java:498)
> > at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
> > at com.sun.proxy.$Proxy138.log(Unknown Source)
> > at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:596)
> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > at java.lang.reflect.Method.invoke(Method.java:498)
> > at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> > at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> > at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> > at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> > at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> > at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> > at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> > at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> > at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> > at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> > at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> > at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> > at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> > at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> > at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> > at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> > at org.junit.runners.Suite.runChild(Suite.java:127)
> > at org.junit.runners.Suite.runChild(Suite.java:26)
> > at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> > at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> > at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> > at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> > at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> > at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> > at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
> > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
> > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
> > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
> > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
> > at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
> > at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
> > at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
> > at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
> > Caused by: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62159 remote=macbook-pro-6.lan/192.168.87.125:62154]
> > at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
> > at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
> > at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
> > at java.io.FilterInputStream.read(FilterInputStream.java:133)
> > at java.io.FilterInputStream.read(FilterInputStream.java:133)
> > at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
> > at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> > at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
> > at java.io.DataInputStream.readInt(DataInputStream.java:387)
> > at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> > at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
> > 2018-06-19 21:34:31,307 [IPC Server handler 0 on 62154] WARN  ipc.Server processResponse - IPC Server handler 0 on 62154, call log(containerId, timeout), rpc version=2, client version=201208081755, methodsFingerPrint=-1300451462 from 192.168.87.125:62159 Call#141 Retry#0: output error
> > 2018-06-19 21:34:31,327 [main] WARN  stram.RecoverableRpcProxy invoke - RPC failure, will retry after 100 ms (remaining 995 ms)
> > java.net.SocketTimeoutException: Call From macbook-pro-6.lan/192.168.87.125 to macbook-pro-6.lan:62154 failed on socket timeout exception: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62160 remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details see: http://wiki.apache.org/hadoop/SocketTimeout
> > at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> > at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> > at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> > at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> > at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> > at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> > at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> > at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> > at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> > at com.sun.proxy.$Proxy138.reportError(Unknown Source)
> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > at java.lang.reflect.Method.invoke(Method.java:498)
> > at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
> > at com.sun.proxy.$Proxy138.reportError(Unknown Source)
> > at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:610)
> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > at java.lang.reflect.Method.invoke(Method.java:498)
> > at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> > at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> > at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> > at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> > at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> > at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> > at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> > at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> > at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> > at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> > at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> > at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> > at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> > at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> > at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> > at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> > at org.junit.runners.Suite.runChild(Suite.java:127)
> > at org.junit.runners.Suite.runChild(Suite.java:26)
> > at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> > at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> > at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> > at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> > at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> > at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> > at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
> > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
> > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
> > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
> > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
> > at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
> > at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
> > at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
> > at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
> > Caused by: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62160 remote=macbook-pro-6.lan/192.168.87.125:62154]
> > at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
> > at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
> > at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
> > at java.io.FilterInputStream.read(FilterInputStream.java:133)
> > at java.io.FilterInputStream.read(FilterInputStream.java:133)
> > at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
> > at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> > at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
> > at java.io.DataInputStream.readInt(DataInputStream.java:387)
> > at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> > at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
> > 2018-06-19 21:34:31,931 [main] WARN  stram.RecoverableRpcProxy invoke - RPC failure, will retry after 100 ms (remaining 391 ms)
> > java.net.SocketTimeoutException: Call From macbook-pro-6.lan/192.168.87.125 to macbook-pro-6.lan:62154 failed on socket timeout exception: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62161 remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details see: http://wiki.apache.org/hadoop/SocketTimeout
> > at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> > at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> > at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> > at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> > at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> > at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> > at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> > at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> > at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> > at com.sun.proxy.$Proxy138.reportError(Unknown Source)
> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > at java.lang.reflect.Method.invoke(Method.java:498)
> > at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
> > at com.sun.proxy.$Proxy138.reportError(Unknown Source)
> > at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:610)
> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > at java.lang.reflect.Method.invoke(Method.java:498)
> > at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> > at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> > at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> > at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> > at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> > at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> > at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> > at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> > at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> > at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> > at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> > at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> > at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> > at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> > at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> > at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> > at org.junit.runners.Suite.runChild(Suite.java:127)
> > at org.junit.runners.Suite.runChild(Suite.java:26)
> > at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> > at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> > at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> > at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> > at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> > at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> > at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
> > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
> > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
> > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
> > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
> > at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
> > at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
> > at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
> > at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
> > Caused by: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62161 remote=macbook-pro-6.lan/192.168.87.125:62154]
> > at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
> > at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
> > at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
> > at java.io.FilterInputStream.read(FilterInputStream.java:133)
> > at java.io.FilterInputStream.read(FilterInputStream.java:133)
> > at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
> > at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> > at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
> > at java.io.DataInputStream.readInt(DataInputStream.java:387)
> > at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> > at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
> > 2018-06-19 21:34:32,310 [IPC Server handler 0 on 62154] WARN  ipc.Server processResponse - IPC Server handler 0 on 62154, call reportError(containerId, null, timeout, null), rpc version=2, client version=201208081755, methodsFingerPrint=-1300451462 from 192.168.87.125:62160 Call#142 Retry#0: output error
> > 2018-06-19 21:34:32,512 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app1/recovery/log
> > 2018-06-19 21:34:32,628 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app1/recovery/log
> > 2018-06-19 21:34:32,696 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app2/recovery/log
> > 2018-06-19 21:34:32,698 [main] INFO  stram.StramClient copyInitialState - Copying initial state took 32 ms
> > 2018-06-19 21:34:32,799 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app2/recovery/log
> > 2018-06-19 21:34:32,850 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app3/recovery/log
> > 2018-06-19 21:34:32,851 [main] INFO  stram.StramClient copyInitialState - Copying initial state took 28 ms
> > 2018-06-19 21:34:32,955 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app3/recovery/log
> > 2018-06-19 21:34:32,976 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container without locality contraint due to insufficient resources.
> > 2018-06-19 21:34:32,977 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container without locality contraint due to insufficient resources.
> > 2018-06-19 21:34:32,977 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container without locality contraint due to insufficient resources.
> > 2018-06-19 21:34:33,166 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container without locality contraint due to insufficient resources.
> > 2018-06-19 21:34:33,166 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container without locality contraint due to insufficient resources.
> > 2018-06-19 21:34:33,166 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container without locality contraint due to insufficient resources.
> > 2018-06-19 21:34:33,338 [main] INFO  util.AsyncFSStorageAgent save - using /Users/mbossert/testIdea/apex-core/engine/target/chkp2603930902590449397 as the basepath for checkpointing.
> > 2018-06-19 21:34:33,436 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app1/recovery/log
> > 2018-06-19 21:34:33,505 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app1/recovery/log
> > 2018-06-19 21:34:33,553 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app2/recovery/log
> > 2018-06-19 21:34:33,554 [main] INFO  stram.StramClient copyInitialState - Copying initial state took 22 ms
> > 2018-06-19 21:34:33,642 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app2/recovery/log
> > 2018-06-19 21:34:33,690 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app3/recovery/log
> > 2018-06-19 21:34:33,691 [main] INFO  stram.StramClient copyInitialState - Copying initial state took 29 ms
> > 2018-06-19 21:34:33,805 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app3/recovery/log
> > 2018-06-19 21:34:33,830 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container without locality contraint due to insufficient resources.
> > 2018-06-19 21:34:33,830 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container without locality contraint due to insufficient resources.
> > 2018-06-19 21:34:33,831 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container without locality contraint due to insufficient resources.
> > 2018-06-19 21:34:33,831 [main] INFO  util.AsyncFSStorageAgent save - using /Users/mbossert/testIdea/apex-core/engine/target/chkp1878353095301008843 as the basepath for checkpointing.
> > 2018-06-19 21:34:34,077 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container without locality contraint due to insufficient resources.
> > 2018-06-19 21:34:34,077 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container without locality contraint due to insufficient resources.
> > 2018-06-19 21:34:34,077 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container without locality contraint due to insufficient resources.
> > 2018-06-19 21:34:34,077 [main] INFO  util.AsyncFSStorageAgent save - using /Users/mbossert/testIdea/apex-core/engine/target/chkp7337975615972280003 as the basepath for checkpointing.
> > Tests run: 8, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 6.143 sec <<< FAILURE! - in com.datatorrent.stram.StramRecoveryTest
> > testWriteAheadLog(com.datatorrent.stram.StramRecoveryTest)  Time elapsed: 0.111 sec  <<< FAILURE!
> > java.lang.AssertionError: flush count expected:<1> but was:<2>
> > at com.datatorrent.stram.StramRecoveryTest.testWriteAheadLog(StramRecoveryTest.java:326)
> >
> > Running com.datatorrent.stram.CustomControlTupleTest
> > 2018-06-19 21:34:49,308 [main] INFO  util.AsyncFSStorageAgent save - using /Users/mbossert/testIdea/apex-core/engine/target/chkp1213673348429546877 as the basepath for checkpointing.
> > 2018-06-19 21:34:49,451 [main] INFO  storage.DiskStorage <init> - using /Users/mbossert/testIdea/apex-core/engine/target as the basepath for spooling.
> > 2018-06-19 21:34:49,451 [ProcessWideEventLoop] INFO  server.Server registered - Server started listening at /0:0:0:0:0:0:0:0:62181
> > 2018-06-19 21:34:49,451 [main] INFO  stram.StramLocalCluster run - Buffer server started: localhost:62181
> > 2018-06-19 21:34:49,452 [container-0] INFO  stram.StramLocalCluster run - Started container container-0
> > 2018-06-19 21:34:49,452 [container-1] INFO  stram.StramLocalCluster run - Started container container-1
> > 2018-06-19 21:34:49,452 [container-2] INFO  stram.StramLocalCluster run - Started container container-2
> > 2018-06-19 21:34:49,452 [container-1] INFO  stram.StramLocalCluster log - container-1 msg: [container-1] Entering heartbeat loop..
> > 2018-06-19 21:34:49,452 [container-0] INFO  stram.StramLocalCluster log - container-0 msg: [container-0] Entering heartbeat loop..
> > 2018-06-19 21:34:49,452 [container-2] INFO  stram.StramLocalCluster log - container-2 msg: [container-2] Entering heartbeat loop..
> > 2018-06-19 21:34:50,460 [container-2] INFO  engine.StreamingContainer processHeartbeatResponse - Deploy request: [OperatorDeployInfo[id=3,name=receiver,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=ProcessorToReceiver,sourceNodeId=2,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[]]]
> > 2018-06-19 21:34:50,460 [container-0] INFO  engine.StreamingContainer processHeartbeatResponse - Deploy request: [OperatorDeployInfo[id=1,name=randomGenerator,type=INPUT,checkpoint={ffffffffffffffff, 0, 0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=genToProcessor,bufferServer=localhost]]]]
> > 2018-06-19 21:34:50,460 [container-1] INFO  engine.StreamingContainer processHeartbeatResponse - Deploy request: [OperatorDeployInfo[id=2,name=process,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=localhost]]]]
> > 2018-06-19 21:34:50,463 [container-0] INFO  engine.WindowGenerator activate - Catching up from 1529458489500 to 1529458490463
> > 2018-06-19 21:34:50,465 [ProcessWideEventLoop] INFO  server.Server onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://localhost:62181/2.output.1, windowId=ffffffffffffffff, type=ProcessorToReceiver/3.input, upstreamIdentifier=2.output.1, mask=0, partitions=null, bufferSize=1024}
> > 2018-06-19 21:34:50,466 [ProcessWideEventLoop] INFO  server.Server onMessage - Received publisher request: PublishRequestTuple{version=1.0, identifier=1.out.1, windowId=ffffffffffffffff}
> > 2018-06-19 21:34:50,466 [ProcessWideEventLoop] INFO  server.Server onMessage - Received publisher request: PublishRequestTuple{version=1.0, identifier=2.output.1, windowId=ffffffffffffffff}
> > 2018-06-19 21:34:50,466 [ProcessWideEventLoop] INFO  server.Server onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://localhost:62181/1.out.1, windowId=ffffffffffffffff, type=genToProcessor/2.input, upstreamIdentifier=1.out.1, mask=0, partitions=null, bufferSize=1024}
> > 2018-06-19 21:34:51,458 [main] INFO  stram.StramLocalCluster run - Stopping on exit condition
> > 2018-06-19 21:34:51,458 [container-0] INFO  engine.StreamingContainer processHeartbeatResponse - Received shutdown request type ABORT
> > 2018-06-19 21:34:51,458 [container-1] INFO  engine.StreamingContainer processHeartbeatResponse - Received shutdown request type ABORT
> > 2018-06-19 21:34:51,458 [container-0] INFO  stram.StramLocalCluster log - container-0 msg: [container-0] Exiting heartbeat loop..
> > 2018-06-19 21:34:51,458 [container-2] INFO  engine.StreamingContainer processHeartbeatResponse - Received shutdown request type ABORT
> > 2018-06-19 21:34:51,458 [container-2] INFO  stram.StramLocalCluster log - container-2 msg: [container-2] Exiting heartbeat loop..
> > 2018-06-19 21:34:51,458 [container-1] INFO  stram.StramLocalCluster log - container-1 msg: [container-1] Exiting heartbeat loop..
> > 2018-06-19 21:34:51,461 [container-2] INFO  stram.StramLocalCluster run - Container container-2 terminating.
> > 2018-06-19 21:34:51,467 [container-1] INFO  stram.StramLocalCluster run - Container container-1 terminating.
> > 2018-06-19 21:34:51,467 [container-0] INFO  stram.StramLocalCluster run -
> > Container container-0 terminating.
> > 2018-06-19 21:34:51,467 [ServerHelper-86-1] INFO  server.Server run -
> > Removing ln LogicalNode@7d88b4a4identifier
> > =tcp://localhost:62181/2.output.1,
> > upstream=2.output.1, group=ProcessorToReceiver/3.input, partitions=[],
> >
> >
> iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@35d66f18
> > {da=com.datatorrent.bufferserver.internal.DataList$Block@d43c092
> > {identifier=2.output.1,
> > data=1048576, readingOffset=0, writingOffset=481,
> > starting_window=5b29af3900000001, ending_window=5b29af3900000005,
> > refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
> > DataList@4dca4fb0[identifier=2.output.1]
> > 2018-06-19 21:34:51,468 [ServerHelper-86-1] INFO  server.Server run -
> > Removing ln LogicalNode@3cb5be9fidentifier
> =tcp://localhost:62181/1.out.1,
> > upstream=1.out.1, group=genToProcessor/2.input, partitions=[],
> >
> >
> iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@5c9a41d0
> > {da=com.datatorrent.bufferserver.internal.DataList$Block@5a324bf4
> > {identifier=1.out.1,
> > data=1048576, readingOffset=0, writingOffset=481,
> > starting_window=5b29af3900000001, ending_window=5b29af3900000005,
> > refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
> > DataList@49665770[identifier=1.out.1]
> > 2018-06-19 21:34:51,469 [ProcessWideEventLoop] INFO  server.Server run -
> > Server stopped listening at /0:0:0:0:0:0:0:0:62181
> > 2018-06-19 21:34:51,469 [main] INFO  stram.StramLocalCluster run -
> > Application finished.
> > 2018-06-19 21:34:51,469 [main] INFO  stram.CustomControlTupleTest
> testApp -
> > Control Tuples received 3 expected 3
> > 2018-06-19 21:34:51,492 [main] INFO  util.AsyncFSStorageAgent save -
> using
> > /Users/mbossert/testIdea/apex-core/engine/target/chkp5496551078484285394
> as
> > the basepath for checkpointing.
> > 2018-06-19 21:34:51,623 [main] INFO  storage.DiskStorage <init> - using
> > /Users/mbossert/testIdea/apex-core/engine/target as the basepath for
> > spooling.
> > 2018-06-19 21:34:51,624 [ProcessWideEventLoop] INFO  server.Server
> > registered - Server started listening at /0:0:0:0:0:0:0:0:62186
> > 2018-06-19 21:34:51,624 [main] INFO  stram.StramLocalCluster run - Buffer
> > server started: localhost:62186
> > 2018-06-19 21:34:51,624 [container-0] INFO  stram.StramLocalCluster run -
> > Started container container-0
> > 2018-06-19 21:34:51,624 [container-0] INFO  stram.StramLocalCluster log -
> > container-0 msg: [container-0] Entering heartbeat loop..
> > 2018-06-19 21:34:52,628 [container-0] INFO  engine.StreamingContainer
> > processHeartbeatResponse - Deploy request:
> >
> >
> [OperatorDeployInfo[id=1,name=randomGenerator,type=INPUT,checkpoint={ffffffffffffffff,
> > 0,
> >
> >
> 0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=genToProcessor,bufferServer=<null>]]],
> >
> OperatorDeployInfo[id=2,name=process,type=OIO,checkpoint={ffffffffffffffff,
> > 0,
> >
> >
> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=THREAD_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=<null>]]],
> >
> >
> OperatorDeployInfo[id=3,name=receiver,type=OIO,checkpoint={ffffffffffffffff,
> > 0,
> >
> >
> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=ProcessorToReceiver,sourceNodeId=2,sourcePortName=output,locality=THREAD_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[]]]
> > 2018-06-19 21:34:52,630 [container-0] INFO  engine.WindowGenerator
> activate
> > - Catching up from 1529458491500 to 1529458492630
> > 2018-06-19 21:34:53,628 [main] INFO  stram.StramLocalCluster run -
> Stopping
> > on exit condition
> > 2018-06-19 21:34:53,629 [container-0] INFO  engine.StreamingContainer
> > processHeartbeatResponse - Received shutdown request type ABORT
> > 2018-06-19 21:34:53,630 [container-0] INFO  stram.StramLocalCluster log -
> > container-0 msg: [container-0] Exiting heartbeat loop..
> > 2018-06-19 21:34:53,640 [container-0] INFO  stram.StramLocalCluster run -
> > Container container-0 terminating.
> > 2018-06-19 21:34:53,641 [ProcessWideEventLoop] INFO  server.Server run -
> > Server stopped listening at /0:0:0:0:0:0:0:0:62186
> > 2018-06-19 21:34:53,642 [main] INFO  stram.StramLocalCluster run -
> > Application finished.
> > 2018-06-19 21:34:53,642 [main] INFO  stram.CustomControlTupleTest
> testApp -
> > Control Tuples received 3 expected 3
> > 2018-06-19 21:34:53,659 [main] INFO  util.AsyncFSStorageAgent save -
> using
> > /Users/mbossert/testIdea/apex-core/engine/target/chkp2212795894390935125
> as
> > the basepath for checkpointing.
> > 2018-06-19 21:34:53,844 [main] INFO  storage.DiskStorage <init> - using
> > /Users/mbossert/testIdea/apex-core/engine/target as the basepath for
> > spooling.
> > 2018-06-19 21:34:53,844 [ProcessWideEventLoop] INFO  server.Server
> > registered - Server started listening at /0:0:0:0:0:0:0:0:62187
> > 2018-06-19 21:34:53,844 [main] INFO  stram.StramLocalCluster run - Buffer
> > server started: localhost:62187
> > 2018-06-19 21:34:53,845 [container-0] INFO  stram.StramLocalCluster run -
> > Started container container-0
> > 2018-06-19 21:34:53,845 [container-1] INFO  stram.StramLocalCluster run -
> > Started container container-1
> > 2018-06-19 21:34:53,845 [container-0] INFO  stram.StramLocalCluster log -
> > container-0 msg: [container-0] Entering heartbeat loop..
> > 2018-06-19 21:34:53,845 [container-2] INFO  stram.StramLocalCluster run -
> > Started container container-2
> > 2018-06-19 21:34:53,845 [container-1] INFO  stram.StramLocalCluster log -
> > container-1 msg: [container-1] Entering heartbeat loop..
> > 2018-06-19 21:34:53,845 [container-3] INFO  stram.StramLocalCluster run -
> > Started container container-3
> > 2018-06-19 21:34:53,845 [container-2] INFO  stram.StramLocalCluster log -
> > container-2 msg: [container-2] Entering heartbeat loop..
> > 2018-06-19 21:34:53,845 [container-3] INFO  stram.StramLocalCluster log -
> > container-3 msg: [container-3] Entering heartbeat loop..
> > 2018-06-19 21:34:54,850 [container-3] INFO  engine.StreamingContainer
> > processHeartbeatResponse - Deploy request:
> >
> >
> [OperatorDeployInfo[id=1,name=randomGenerator,type=INPUT,checkpoint={ffffffffffffffff,
> > 0,
> >
> >
> 0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=genToProcessor,bufferServer=localhost]]]]
> > 2018-06-19 21:34:54,850 [container-1] INFO  engine.StreamingContainer
> > processHeartbeatResponse - Deploy request:
> >
> >
> [OperatorDeployInfo[id=3,name=process,type=GENERIC,checkpoint={ffffffffffffffff,
> > 0,
> >
> >
> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=<null>,partitionMask=1,partitionKeys=[1]]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=localhost]]]]
> > 2018-06-19 21:34:54,850 [container-0] INFO  engine.StreamingContainer
> > processHeartbeatResponse - Deploy request:
> >
> >
> [OperatorDeployInfo[id=4,name=receiver,type=GENERIC,checkpoint={ffffffffffffffff,
> > 0,
> >
> >
> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=ProcessorToReceiver,sourceNodeId=5,sourcePortName=outputPort,locality=CONTAINER_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[]],
> >
> >
> OperatorDeployInfo.UnifierDeployInfo[id=5,name=process.output#unifier,type=UNIFIER,checkpoint={ffffffffffffffff,
> > 0,
> >
> >
> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=<merge#output>,streamId=ProcessorToReceiver,sourceNodeId=2,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>],
> >
> >
> OperatorDeployInfo.InputDeployInfo[portName=<merge#output>,streamId=ProcessorToReceiver,sourceNodeId=3,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=outputPort,streamId=ProcessorToReceiver,bufferServer=<null>]]]]
> > 2018-06-19 21:34:54,850 [container-2] INFO  engine.StreamingContainer
> > processHeartbeatResponse - Deploy request:
> >
> >
> [OperatorDeployInfo[id=2,name=process,type=GENERIC,checkpoint={ffffffffffffffff,
> > 0,
> >
> >
> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=<null>,partitionMask=1,partitionKeys=[0]]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=localhost]]]]
> > 2018-06-19 21:34:54,852 [container-3] INFO  engine.WindowGenerator
> activate
> > - Catching up from 1529458493500 to 1529458494852
> > 2018-06-19 21:34:54,855 [ProcessWideEventLoop] INFO  server.Server
> > onMessage - Received publisher request: PublishRequestTuple{version=1.0,
> > identifier=1.out.1, windowId=ffffffffffffffff}
> > 2018-06-19 21:34:54,857 [ProcessWideEventLoop] INFO  server.Server
> > onMessage - Received publisher request: PublishRequestTuple{version=1.0,
> > identifier=2.output.1, windowId=ffffffffffffffff}
> > 2018-06-19 21:34:54,858 [ProcessWideEventLoop] INFO  server.Server
> > onMessage - Received publisher request: PublishRequestTuple{version=1.0,
> > identifier=3.output.1, windowId=ffffffffffffffff}
> > 2018-06-19 21:34:54,858 [ProcessWideEventLoop] INFO  server.Server
> > onMessage - Received subscriber request:
> SubscribeRequestTuple{version=1.0,
> > identifier=tcp://localhost:62187/1.out.1, windowId=ffffffffffffffff,
> > type=genToProcessor/3.input, upstreamIdentifier=1.out.1, mask=1,
> > partitions=[1], bufferSize=1024}
> > 2018-06-19 21:34:54,858 [ProcessWideEventLoop] INFO  server.Server
> > onMessage - Received subscriber request:
> SubscribeRequestTuple{version=1.0,
> > identifier=tcp://localhost:62187/1.out.1, windowId=ffffffffffffffff,
> > type=genToProcessor/2.input, upstreamIdentifier=1.out.1, mask=1,
> > partitions=[0], bufferSize=1024}
> > 2018-06-19 21:34:54,858 [ProcessWideEventLoop] INFO  server.Server
> > onMessage - Received subscriber request:
> SubscribeRequestTuple{version=1.0,
> > identifier=tcp://localhost:62187/3.output.1, windowId=ffffffffffffffff,
> > type=ProcessorToReceiver/5.<merge#output>(3.output),
> > upstreamIdentifier=3.output.1, mask=0, partitions=null, bufferSize=1024}
> > 2018-06-19 21:34:54,859 [ProcessWideEventLoop] INFO  server.Server
> > onMessage - Received subscriber request:
> SubscribeRequestTuple{version=1.0,
> > identifier=tcp://localhost:62187/2.output.1, windowId=ffffffffffffffff,
> > type=ProcessorToReceiver/5.<merge#output>(2.output),
> > upstreamIdentifier=2.output.1, mask=0, partitions=null, bufferSize=1024}
> > 2018-06-19 21:34:55,851 [main] INFO  stram.StramLocalCluster run -
> Stopping
> > on exit condition
> > 2018-06-19 21:34:55,852 [container-2] INFO  engine.StreamingContainer
> > processHeartbeatResponse - Received shutdown request type ABORT
> > 2018-06-19 21:34:55,852 [container-3] INFO  engine.StreamingContainer
> > processHeartbeatResponse - Received shutdown request type ABORT
> > 2018-06-19 21:34:55,852 [container-3] INFO  stram.StramLocalCluster log -
> > container-3 msg: [container-3] Exiting heartbeat loop..
> > 2018-06-19 21:34:55,852 [container-0] INFO  engine.StreamingContainer
> > processHeartbeatResponse - Received shutdown request type ABORT
> > 2018-06-19 21:34:55,852 [container-2] INFO  stram.StramLocalCluster log -
> > container-2 msg: [container-2] Exiting heartbeat loop..
> > 2018-06-19 21:34:55,852 [container-1] INFO  engine.StreamingContainer
> > processHeartbeatResponse - Received shutdown request type ABORT
> > 2018-06-19 21:34:55,852 [container-0] INFO  stram.StramLocalCluster log -
> > container-0 msg: [container-0] Exiting heartbeat loop..
> > 2018-06-19 21:34:55,852 [container-1] INFO  stram.StramLocalCluster log -
> > container-1 msg: [container-1] Exiting heartbeat loop..
> > 2018-06-19 21:34:55,857 [container-1] INFO  stram.StramLocalCluster run -
> > Container container-1 terminating.
> > 2018-06-19 21:34:55,858 [container-3] INFO  stram.StramLocalCluster run -
> > Container container-3 terminating.
> > 2018-06-19 21:34:55,858 [ServerHelper-92-1] INFO  server.Server run -
> > Removing ln LogicalNode@5dbf681cidentifier
> > =tcp://localhost:62187/3.output.1,
> > upstream=3.output.1,
> group=ProcessorToReceiver/5.<merge#output>(3.output),
> > partitions=[],
> >
> >
> iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@6244ac9
> > {da=com.datatorrent.bufferserver.internal.DataList$Block@60e28815
> > {identifier=3.output.1,
> > data=1048576, readingOffset=0, writingOffset=487,
> > starting_window=5b29af3d00000001, ending_window=5b29af3d00000006,
> > refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
> > DataList@46bbe39d[identifier=3.output.1]
> > 2018-06-19 21:34:55,858 [ServerHelper-92-1] INFO  server.Server run -
> > Removing ln LogicalNode@7fb3226aidentifier
> =tcp://localhost:62187/1.out.1,
> > upstream=1.out.1, group=genToProcessor/2.input,
> > partitions=[BitVector{mask=1, bits=0}],
> >
> >
> iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@2ad6890f
> > {da=com.datatorrent.bufferserver.internal.DataList$Block@e00fc9e
> > {identifier=1.out.1,
> > data=1048576, readingOffset=0, writingOffset=487,
> > starting_window=5b29af3d00000001, ending_window=5b29af3d00000006,
> > refCount=3, uniqueIdentifier=0, next=null, future=null}}} from dl
> > DataList@7a566f6b[identifier=1.out.1]
> > 2018-06-19 21:34:55,858 [ServerHelper-92-1] INFO  server.Server run -
> > Removing ln LogicalNode@2551b8a4identifier
> =tcp://localhost:62187/1.out.1,
> > upstream=1.out.1, group=genToProcessor/3.input,
> > partitions=[BitVector{mask=1, bits=1}],
> >
> >
> iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@6368ccb7
> > {da=com.datatorrent.bufferserver.internal.DataList$Block@e00fc9e
> > {identifier=1.out.1,
> > data=1048576, readingOffset=0, writingOffset=487,
> > starting_window=5b29af3d00000001, ending_window=5b29af3d00000006,
> > refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
> > DataList@7a566f6b[identifier=1.out.1]
> > 2018-06-19 21:34:55,862 [container-2] INFO  stram.StramLocalCluster run -
> > Container container-2 terminating.
> > 2018-06-19 21:34:55,862 [ServerHelper-92-1] INFO  server.Server run -
> > Removing ln LogicalNode@2e985326identifier
> > =tcp://localhost:62187/2.output.1,
> > upstream=2.output.1,
> group=ProcessorToReceiver/5.<merge#output>(2.output),
> > partitions=[],
> >
> >
> iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@7d68bf24
> > {da=com.datatorrent.bufferserver.internal.DataList$Block@7405581b
> > {identifier=2.output.1,
> > data=1048576, readingOffset=0, writingOffset=487,
> > starting_window=5b29af3d00000001, ending_window=5b29af3d00000006,
> > refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
> > DataList@3de15cc7[identifier=2.output.1]
> > 2018-06-19 21:34:55,862 [container-0] INFO  stram.StramLocalCluster run -
> > Container container-0 terminating.
> > 2018-06-19 21:34:55,864 [ProcessWideEventLoop] INFO  server.Server run -
> > Server stopped listening at /0:0:0:0:0:0:0:0:62187
> > 2018-06-19 21:34:55,864 [main] INFO  stram.StramLocalCluster run -
> > Application finished.
> > 2018-06-19 21:34:55,864 [main] INFO  stram.CustomControlTupleTest
> testApp -
> > Control Tuples received 3 expected 3
> > 2018-06-19 21:34:55,883 [main] INFO  util.AsyncFSStorageAgent save -
> using
> > /Users/mbossert/testIdea/apex-core/engine/target/chkp8804999206923662400
> as
> > the basepath for checkpointing.
> > 2018-06-19 21:34:56,032 [main] INFO  storage.DiskStorage <init> - using
> > /Users/mbossert/testIdea/apex-core/engine/target as the basepath for
> > spooling.
> > 2018-06-19 21:34:56,032 [ProcessWideEventLoop] INFO  server.Server
> > registered - Server started listening at /0:0:0:0:0:0:0:0:62195
> > 2018-06-19 21:34:56,032 [main] INFO  stram.StramLocalCluster run - Buffer
> > server started: localhost:62195
> > 2018-06-19 21:34:56,033 [container-0] INFO  stram.StramLocalCluster run -
> > Started container container-0
> > 2018-06-19 21:34:56,033 [container-0] INFO  stram.StramLocalCluster log -
> > container-0 msg: [container-0] Entering heartbeat loop..
> > 2018-06-19 21:34:57,038 [container-0] INFO  engine.StreamingContainer
> > processHeartbeatResponse - Deploy request:
> >
> >
> [OperatorDeployInfo[id=2,name=process,type=GENERIC,checkpoint={ffffffffffffffff,
> > 0,
> >
> >
> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=CONTAINER_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=<null>]]],
> >
> >
> OperatorDeployInfo[id=3,name=receiver,type=GENERIC,checkpoint={ffffffffffffffff,
> > 0,
> >
> >
> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=ProcessorToReceiver,sourceNodeId=2,sourcePortName=output,locality=CONTAINER_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[]],
> >
> >
> OperatorDeployInfo[id=1,name=randomGenerator,type=INPUT,checkpoint={ffffffffffffffff,
> > 0,
> >
> >
> 0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=genToProcessor,bufferServer=<null>]]]]
> > 2018-06-19 21:34:57,040 [container-0] INFO  engine.WindowGenerator
> activate
> > - Catching up from 1529458495500 to 1529458497040
> > 2018-06-19 21:34:58,042 [main] INFO  stram.StramLocalCluster run -
> Stopping
> > on exit condition
> > 2018-06-19 21:34:59,045 [main] WARN  stram.StramLocalCluster run -
> > Container thread container-0 is still alive
> > 2018-06-19 21:34:59,047 [ProcessWideEventLoop] INFO  server.Server run -
> > Server stopped listening at /0:0:0:0:0:0:0:0:62195
> > 2018-06-19 21:34:59,047 [container-0] INFO  engine.StreamingContainer
> > processHeartbeatResponse - Received shutdown request type ABORT
> > 2018-06-19 21:34:59,047 [main] INFO  stram.StramLocalCluster run -
> > Application finished.
> > 2018-06-19 21:34:59,047 [main] INFO  stram.CustomControlTupleTest
> testApp -
> > Control Tuples received 4 expected 4
> > 2018-06-19 21:34:59,047 [container-0] INFO  stram.StramLocalCluster log -
> > container-0 msg: [container-0] Exiting heartbeat loop..
> > 2018-06-19 21:34:59,057 [container-0] INFO  stram.StramLocalCluster run -
> > Container container-0 terminating.
> > 2018-06-19 21:34:59,064 [main] INFO  util.AsyncFSStorageAgent save -
> using
> > /Users/mbossert/testIdea/apex-core/engine/target/chkp4046668014410536641
> as
> > the basepath for checkpointing.
> > 2018-06-19 21:34:59,264 [main] INFO  storage.DiskStorage <init> - using
> > /Users/mbossert/testIdea/apex-core/engine/target as the basepath for
> > spooling.
> > 2018-06-19 21:34:59,264 [ProcessWideEventLoop] INFO  server.Server
> > registered - Server started listening at /0:0:0:0:0:0:0:0:62196
> > 2018-06-19 21:34:59,265 [main] INFO  stram.StramLocalCluster run - Buffer
> > server started: localhost:62196
> > 2018-06-19 21:34:59,265 [container-0] INFO  stram.StramLocalCluster run -
> > Started container container-0
> > 2018-06-19 21:34:59,265 [container-0] INFO  stram.StramLocalCluster log -
> > container-0 msg: [container-0] Entering heartbeat loop..
> > 2018-06-19 21:34:59,265 [container-1] INFO  stram.StramLocalCluster run -
> > Started container container-1
> > 2018-06-19 21:34:59,265 [container-2] INFO  stram.StramLocalCluster run -
> > Started container container-2
> > 2018-06-19 21:34:59,266 [container-1] INFO  stram.StramLocalCluster log -
> > container-1 msg: [container-1] Entering heartbeat loop..
> > 2018-06-19 21:34:59,266 [container-3] INFO  stram.StramLocalCluster run -
> > Started container container-3
> > 2018-06-19 21:34:59,266 [container-2] INFO  stram.StramLocalCluster log -
> > container-2 msg: [container-2] Entering heartbeat loop..
> > 2018-06-19 21:34:59,266 [container-3] INFO  stram.StramLocalCluster log -
> > container-3 msg: [container-3] Entering heartbeat loop..
> > 2018-06-19 21:35:00,270 [container-0] INFO  engine.StreamingContainer
> > processHeartbeatResponse - Deploy request:
> >
> >
> [OperatorDeployInfo[id=1,name=randomGenerator,type=INPUT,checkpoint={ffffffffffffffff,
> > 0,
> >
> >
> 0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=genToProcessor,bufferServer=localhost]]]]
> > 2018-06-19 21:35:00,270 [container-2] INFO  engine.StreamingContainer
> > processHeartbeatResponse - Deploy request:
> >
> >
> [OperatorDeployInfo[id=2,name=process,type=GENERIC,checkpoint={ffffffffffffffff,
> > 0,
> >
> >
> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=<null>,partitionMask=1,partitionKeys=[0]]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=localhost]]]]
> > 2018-06-19 21:35:00,271 [container-1] INFO  engine.StreamingContainer
> > processHeartbeatResponse - Deploy request:
> >
> >
> [OperatorDeployInfo[id=4,name=receiver,type=GENERIC,checkpoint={ffffffffffffffff,
> > 0,
> >
> >
> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=ProcessorToReceiver,sourceNodeId=5,sourcePortName=outputPort,locality=CONTAINER_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[]],
> >
> >
> OperatorDeployInfo.UnifierDeployInfo[id=5,name=process.output#unifier,type=UNIFIER,checkpoint={ffffffffffffffff,
> > 0,
> >
> >
> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=<merge#output>,streamId=ProcessorToReceiver,sourceNodeId=2,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>],
> >
> >
> OperatorDeployInfo.InputDeployInfo[portName=<merge#output>,streamId=ProcessorToReceiver,sourceNodeId=3,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=outputPort,streamId=ProcessorToReceiver,bufferServer=<null>]]]]
> > 2018-06-19 21:35:00,270 [container-3] INFO  engine.StreamingContainer
> > processHeartbeatResponse - Deploy request:
> >
> >
> [OperatorDeployInfo[id=3,name=process,type=GENERIC,checkpoint={ffffffffffffffff,
> > 0,
> >
> >
> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=<null>,partitionMask=1,partitionKeys=[1]]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=localhost]]]]
> > 2018-06-19 21:35:00,273 [container-0] INFO  engine.WindowGenerator
> activate
> > - Catching up from 1529458499500 to 1529458500273
> > 2018-06-19 21:35:00,274 [ProcessWideEventLoop] INFO  server.Server
> > onMessage - Received publisher request: PublishRequestTuple{version=1.0,
> > identifier=1.out.1, windowId=ffffffffffffffff}
> > 2018-06-19 21:35:00,276 [ProcessWideEventLoop] INFO  server.Server
> > onMessage - Received publisher request: PublishRequestTuple{version=1.0,
> > identifier=3.output.1, windowId=ffffffffffffffff}
> > 2018-06-19 21:35:00,277 [ProcessWideEventLoop] INFO  server.Server
> > onMessage - Received subscriber request:
> SubscribeRequestTuple{version=1.0,
> > identifier=tcp://localhost:62196/1.out.1, windowId=ffffffffffffffff,
> > type=genToProcessor/2.input, upstreamIdentifier=1.out.1, mask=1,
> > partitions=[0], bufferSize=1024}
> > 2018-06-19 21:35:00,278 [ProcessWideEventLoop] INFO  server.Server
> > onMessage - Received publisher request: PublishRequestTuple{version=1.0,
> > identifier=2.output.1, windowId=ffffffffffffffff}
> > 2018-06-19 21:35:00,278 [ProcessWideEventLoop] INFO  server.Server
> > onMessage - Received subscriber request:
> SubscribeRequestTuple{version=1.0,
> > identifier=tcp://localhost:62196/1.out.1, windowId=ffffffffffffffff,
> > type=genToProcessor/3.input, upstreamIdentifier=1.out.1, mask=1,
> > partitions=[1], bufferSize=1024}
> > 2018-06-19 21:35:00,278 [ProcessWideEventLoop] INFO  server.Server
> > onMessage - Received subscriber request:
> SubscribeRequestTuple{version=1.0,
> > identifier=tcp://localhost:62196/3.output.1, windowId=ffffffffffffffff,
> > type=ProcessorToReceiver/5.<merge#output>(3.output),
> > upstreamIdentifier=3.output.1, mask=0, partitions=null, bufferSize=1024}
> > 2018-06-19 21:35:00,278 [ProcessWideEventLoop] INFO  server.Server
> > onMessage - Received subscriber request:
> SubscribeRequestTuple{version=1.0,
> > identifier=tcp://localhost:62196/2.output.1, windowId=ffffffffffffffff,
> > type=ProcessorToReceiver/5.<merge#output>(2.output),
> > upstreamIdentifier=2.output.1, mask=0, partitions=null, bufferSize=1024}
> > 2018-06-19 21:35:01,273 [main] INFO  stram.StramLocalCluster run -
> Stopping
> > on exit condition
> > 2018-06-19 21:35:01,273 [container-3] INFO  engine.StreamingContainer
> > processHeartbeatResponse - Received shutdown request type ABORT
> > 2018-06-19 21:35:01,273 [container-2] INFO  engine.StreamingContainer
> > processHeartbeatResponse - Received shutdown request type ABORT
> > 2018-06-19 21:35:01,273 [container-2] INFO  stram.StramLocalCluster log -
> > container-2 msg: [container-2] Exiting heartbeat loop..
> > 2018-06-19 21:35:01,273 [container-0] INFO  engine.StreamingContainer
> > processHeartbeatResponse - Received shutdown request type ABORT
> > 2018-06-19 21:35:01,274 [container-0] INFO  stram.StramLocalCluster log -
> > container-0 msg: [container-0] Exiting heartbeat loop..
> > 2018-06-19 21:35:01,273 [container-3] INFO  stram.StramLocalCluster log -
> > container-3 msg: [container-3] Exiting heartbeat loop..
> > 2018-06-19 21:35:01,273 [container-1] INFO  engine.StreamingContainer
> > processHeartbeatResponse - Received shutdown request type ABORT
> > 2018-06-19 21:35:01,274 [container-1] INFO  stram.StramLocalCluster log -
> > container-1 msg: [container-1] Exiting heartbeat loop..
> > 2018-06-19 21:35:01,279 [container-3] INFO  stram.StramLocalCluster run -
> > Container container-3 terminating.
> > 2018-06-19 21:35:01,279 [ServerHelper-98-1] INFO  server.Server run -
> > Removing ln LogicalNode@d80a435identifier
> > =tcp://localhost:62196/3.output.1,
> > upstream=3.output.1,
> group=ProcessorToReceiver/5.<merge#output>(3.output),
> > partitions=[],
> >
> >
> iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@2f53b5e7
> > {da=com.datatorrent.bufferserver.internal.DataList$Block@1e2d4212
> > {identifier=3.output.1,
> > data=1048576, readingOffset=0, writingOffset=36,
> > starting_window=5b29af4300000001, ending_window=5b29af4300000005,
> > refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
> > DataList@1684ecee[identifier=3.output.1]
> > 2018-06-19 21:35:01,285 [container-2] INFO  stram.StramLocalCluster run -
> > Container container-2 terminating.
> > 2018-06-19 21:35:01,285 [container-1] INFO  stram.StramLocalCluster run -
> > Container container-1 terminating.
> > 2018-06-19 21:35:01,286 [container-0] INFO  stram.StramLocalCluster run -
> > Container container-0 terminating.
> > 2018-06-19 21:35:01,286 [ServerHelper-98-1] INFO  server.Server run -
> > Removing ln LogicalNode@75d245a1identifier
> > =tcp://localhost:62196/2.output.1,
> > upstream=2.output.1,
> group=ProcessorToReceiver/5.<merge#output>(2.output),
> > partitions=[],
> >
> >
> iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@719c5cd2
> > {da=com.datatorrent.bufferserver.internal.DataList$Block@43d2338b
> > {identifier=2.output.1,
> > data=1048576, readingOffset=0, w



--

M. Aaron Bossert
(571) 242-4021
Punch Cyber Analytics Group

Re: Branch 3.7.0 failing install related to Kryo version...perhaps

Vlad Rozov-2
In reply to this post by Pramod Immaneni-3
AFAIK, those RPC calls are expected to fail (the test exercises RPC timeout
recovery). The test fails due to the change in Kryo flush behavior; please
see my other response.
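
The flush-behavior change is easy to reproduce with plain java.io: newer
Kryo versions buffer writes in their Output object, and nothing reaches the
underlying stream until flush() (or close()) is called. The sketch below is
an analogy using BufferedOutputStream rather than the Kryo API itself, but
it shows why a peer reading the socket can block until it times out: the
writer thinks the payload was sent, while the bytes are still sitting in
the buffer.

```java
import java.io.BufferedOutputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;

public class FlushDemo {
    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        // Stand-in for Kryo's Output: bytes accumulate in the buffer
        // and are not visible to the underlying stream until flush().
        BufferedOutputStream out = new BufferedOutputStream(sink, 1024);

        out.write("serialized payload".getBytes("UTF-8"));

        // Before flush() the sink has seen nothing -- an RPC peer reading
        // the corresponding socket would block and eventually time out.
        System.out.println("before flush: " + sink.size()); // 0

        out.flush();
        System.out.println("after flush: " + sink.size());  // 18
    }
}
```

If the RPC layer previously relied on Kryo flushing implicitly on each
write, an upgrade to 4.x would require an explicit flush after serializing
each payload.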

Thank you,

Vlad

On 6/20/18 09:09, Pramod Immaneni wrote:

> The Hadoop IPC calls are failing, possibly because of their reliance on
> Kryo for serializing the payload and some incompatibility with the new
> version. I will dig in more to see what is going on.
>
> On Tue, Jun 19, 2018 at 6:54 PM Aaron Bossert <[hidden email]> wrote:
>
>> Pramod,
>>
>> Thanks for taking the time to help!
>>
>> Here is the output (just failed parts) when running full install (clean
>> install -X) on the Master branch:
>>
>> Running com.datatorrent.stram.StramRecoveryTest
>> 2018-06-19 21:34:28,137 [main] INFO  stram.StramRecoveryTest
>> testRpcFailover - Mock server listening at macbook-pro-6.lan/
>> 192.168.87.125:62154
>> 2018-06-19 21:34:28,678 [main] ERROR stram.RecoverableRpcProxy invoke -
>> Giving up RPC connection recovery after 507 ms
>> java.net.SocketTimeoutException: Call From macbook-pro-6.lan/
>> 192.168.87.125
>> to macbook-pro-6.lan:62154 failed on socket timeout exception:
>> java.net.SocketTimeoutException: 500 millis timeout while waiting for
>> channel to be ready for read. ch :
>> java.nio.channels.SocketChannel[connected local=/192.168.87.125:62155
>> remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details see:
>> http://wiki.apache.org/hadoop/SocketTimeout
>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>> at
>>
>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>> at
>>
>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
>> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
>> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
>> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
>> at
>>
>> org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
>> at com.sun.proxy.$Proxy138.log(Unknown Source)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at
>>
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>> at
>>
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> at java.lang.reflect.Method.invoke(Method.java:498)
>> at
>>
>> com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
>> at com.sun.proxy.$Proxy138.log(Unknown Source)
>> at
>>
>> com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:561)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at
>>
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>> at
>>
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> at java.lang.reflect.Method.invoke(Method.java:498)
>> at
>>
>> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
>> at
>>
>> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>> at
>>
>> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
>> at
>>
>> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
>> at
>>
>> org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
>> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
>> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
>> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
>> at
>>
>> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
>> at
>>
>> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>> at org.junit.runners.Suite.runChild(Suite.java:127)
>> at org.junit.runners.Suite.runChild(Suite.java:26)
>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>> at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
>> at
>>
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
>> at
>>
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
>> at
>>
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
>> at
>>
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
>> at
>>
>> org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
>> at
>>
>> org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
>> at
>>
>> org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
>> at
>> org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
>> Caused by: java.net.SocketTimeoutException: 500 millis timeout while
>> waiting for channel to be ready for read. ch :
>> java.nio.channels.SocketChannel[connected local=/192.168.87.125:62155
>> remote=macbook-pro-6.lan/192.168.87.125:62154]
>> at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
>> at
>>
>> org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
>> at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
>> at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
>> at java.io.DataInputStream.readInt(DataInputStream.java:387)
>> at
>>
>> org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
>> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
>> 2018-06-19 21:34:29,178 [IPC Server handler 0 on 62154] WARN  ipc.Server
>> processResponse - IPC Server handler 0 on 62154, call log(containerId,
>> timeout), rpc version=2, client version=201208081755,
>> methodsFingerPrint=-1300451462 from 192.168.87.125:62155 Call#136 Retry#0:
>> output error
>> 2018-06-19 21:34:29,198 [main] WARN  stram.RecoverableRpcProxy invoke - RPC
>> failure, will retry after 100 ms (remaining 994 ms)
>> java.net.SocketTimeoutException: Call From macbook-pro-6.lan/
>> 192.168.87.125
>> to macbook-pro-6.lan:62154 failed on socket timeout exception:
>> java.net.SocketTimeoutException: 500 millis timeout while waiting for
>> channel to be ready for read. ch :
>> java.nio.channels.SocketChannel[connected local=/192.168.87.125:62156
>> remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details see:
>> http://wiki.apache.org/hadoop/SocketTimeout
>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>> at
>>
>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>> at
>>
>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
>> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
>> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
>> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
>> at
>>
>> org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
>> at com.sun.proxy.$Proxy138.log(Unknown Source)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at
>>
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>> at
>>
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> at java.lang.reflect.Method.invoke(Method.java:498)
>> at
>>
>> com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
>> at com.sun.proxy.$Proxy138.log(Unknown Source)
>> at
>>
>> com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:575)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at
>>
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>> at
>>
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> at java.lang.reflect.Method.invoke(Method.java:498)
>> at
>>
>> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
>> at
>>
>> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>> at
>>
>> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
>> at
>>
>> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
>> at
>>
>> org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
>> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
>> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
>> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
>> at
>>
>> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
>> at
>>
>> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>> at org.junit.runners.Suite.runChild(Suite.java:127)
>> at org.junit.runners.Suite.runChild(Suite.java:26)
>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>> at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
>> at
>>
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
>> at
>>
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
>> at
>>
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
>> at
>>
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
>> at
>>
>> org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
>> at
>>
>> org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
>> at
>>
>> org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
>> at
>> org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
>> Caused by: java.net.SocketTimeoutException: 500 millis timeout while
>> waiting for channel to be ready for read. ch :
>> java.nio.channels.SocketChannel[connected local=/192.168.87.125:62156
>> remote=macbook-pro-6.lan/192.168.87.125:62154]
>> at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
>> at
>>
>> org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
>> at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
>> at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
>> at java.io.DataInputStream.readInt(DataInputStream.java:387)
>> at
>>
>> org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
>> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
>> 2018-06-19 21:34:29,806 [main] WARN  stram.RecoverableRpcProxy invoke - RPC
>> failure, will retry after 100 ms (remaining 386 ms)
>> java.net.SocketTimeoutException: Call From macbook-pro-6.lan/
>> 192.168.87.125
>> to macbook-pro-6.lan:62154 failed on socket timeout exception:
>> java.net.SocketTimeoutException: 500 millis timeout while waiting for
>> channel to be ready for read. ch :
>> java.nio.channels.SocketChannel[connected local=/192.168.87.125:62157
>> remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details see:
>> http://wiki.apache.org/hadoop/SocketTimeout
>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>> at
>>
>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>> at
>>
>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
>> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
>> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
>> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
>> at
>>
>> org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
>> at com.sun.proxy.$Proxy138.log(Unknown Source)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at
>>
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>> at
>>
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> at java.lang.reflect.Method.invoke(Method.java:498)
>> at
>>
>> com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
>> at com.sun.proxy.$Proxy138.log(Unknown Source)
>> at
>>
>> com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:575)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at
>>
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>> at
>>
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> at java.lang.reflect.Method.invoke(Method.java:498)
>> at
>>
>> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
>> at
>>
>> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>> at
>>
>> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
>> at
>>
>> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
>> at
>>
>> org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
>> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
>> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
>> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
>> at
>>
>> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
>> at
>>
>> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>> at org.junit.runners.Suite.runChild(Suite.java:127)
>> at org.junit.runners.Suite.runChild(Suite.java:26)
>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>> at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
>> at
>>
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
>> at
>>
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
>> at
>>
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
>> at
>>
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
>> at
>>
>> org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
>> at
>>
>> org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
>> at
>>
>> org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
>> at
>> org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
>> Caused by: java.net.SocketTimeoutException: 500 millis timeout while
>> waiting for channel to be ready for read. ch :
>> java.nio.channels.SocketChannel[connected local=/192.168.87.125:62157
>> remote=macbook-pro-6.lan/192.168.87.125:62154]
>> at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
>> at
>>
>> org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
>> at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
>> at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
>> at java.io.DataInputStream.readInt(DataInputStream.java:387)
>> at
>>
>> org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
>> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
>> 2018-06-19 21:34:30,180 [IPC Server handler 0 on 62154] WARN  ipc.Server
>> processResponse - IPC Server handler 0 on 62154, call log(containerId,
>> timeout), rpc version=2, client version=201208081755,
>> methodsFingerPrint=-1300451462 from 192.168.87.125:62156 Call#137 Retry#0:
>> output error
>> 2018-06-19 21:34:30,808 [main] ERROR stram.RecoverableRpcProxy invoke -
>> Giving up RPC connection recovery after 506 ms
>> java.net.SocketTimeoutException: Call From macbook-pro-6.lan/
>> 192.168.87.125
>> to macbook-pro-6.lan:62154 failed on socket timeout exception:
>> java.net.SocketTimeoutException: 500 millis timeout while waiting for
>> channel to be ready for read. ch :
>> java.nio.channels.SocketChannel[connected local=/192.168.87.125:62159
>> remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details see:
>> http://wiki.apache.org/hadoop/SocketTimeout
>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>> at
>>
>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>> at
>>
>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
>> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
>> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
>> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
>> at
>>
>> org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
>> at com.sun.proxy.$Proxy138.log(Unknown Source)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at
>>
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>> at
>>
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> at java.lang.reflect.Method.invoke(Method.java:498)
>> at
>>
>> com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
>> at com.sun.proxy.$Proxy138.log(Unknown Source)
>> at
>>
>> com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:596)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at
>>
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>> at
>>
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> at java.lang.reflect.Method.invoke(Method.java:498)
>> at
>>
>> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
>> at
>>
>> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>> at
>>
>> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
>> at
>>
>> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
>> at
>>
>> org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
>> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
>> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
>> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
>> at
>>
>> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
>> at
>>
>> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>> at org.junit.runners.Suite.runChild(Suite.java:127)
>> at org.junit.runners.Suite.runChild(Suite.java:26)
>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>> at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
>> at
>>
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
>> at
>>
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
>> at
>>
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
>> at
>>
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
>> at
>>
>> org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
>> at
>>
>> org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
>> at
>>
>> org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
>> at
>> org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
>> Caused by: java.net.SocketTimeoutException: 500 millis timeout while
>> waiting for channel to be ready for read. ch :
>> java.nio.channels.SocketChannel[connected local=/192.168.87.125:62159
>> remote=macbook-pro-6.lan/192.168.87.125:62154]
>> at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
>> at
>>
>> org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
>> at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
>> at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
>> at java.io.DataInputStream.readInt(DataInputStream.java:387)
>> at
>>
>> org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
>> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
>> 2018-06-19 21:34:31,307 [IPC Server handler 0 on 62154] WARN  ipc.Server
>> processResponse - IPC Server handler 0 on 62154, call log(containerId,
>> timeout), rpc version=2, client version=201208081755,
>> methodsFingerPrint=-1300451462 from 192.168.87.125:62159 Call#141 Retry#0:
>> output error
>> 2018-06-19 21:34:31,327 [main] WARN  stram.RecoverableRpcProxy invoke - RPC
>> failure, will retry after 100 ms (remaining 995 ms)
>> java.net.SocketTimeoutException: Call From macbook-pro-6.lan/
>> 192.168.87.125
>> to macbook-pro-6.lan:62154 failed on socket timeout exception:
>> java.net.SocketTimeoutException: 500 millis timeout while waiting for
>> channel to be ready for read. ch :
>> java.nio.channels.SocketChannel[connected local=/192.168.87.125:62160
>> remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details see:
>> http://wiki.apache.org/hadoop/SocketTimeout
>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>> at
>>
>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>> at
>>
>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
>> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
>> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
>> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
>> at
>>
>> org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
>> at com.sun.proxy.$Proxy138.reportError(Unknown Source)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at
>>
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>> at
>>
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> at java.lang.reflect.Method.invoke(Method.java:498)
>> at
>>
>> com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
>> at com.sun.proxy.$Proxy138.reportError(Unknown Source)
>> at
>>
>> com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:610)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at
>>
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>> at
>>
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> at java.lang.reflect.Method.invoke(Method.java:498)
>> at
>>
>> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
>> at
>>
>> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>> at
>>
>> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
>> at
>>
>> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
>> at
>>
>> org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
>> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
>> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
>> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
>> at
>>
>> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
>> at
>>
>> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>> at org.junit.runners.Suite.runChild(Suite.java:127)
>> at org.junit.runners.Suite.runChild(Suite.java:26)
>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>> at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
>> at
>>
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
>> at
>>
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
>> at
>>
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
>> at
>>
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
>> at
>>
>> org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
>> at
>>
>> org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
>> at
>>
>> org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
>> at
>> org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
>> Caused by: java.net.SocketTimeoutException: 500 millis timeout while
>> waiting for channel to be ready for read. ch :
>> java.nio.channels.SocketChannel[connected local=/192.168.87.125:62160
>> remote=macbook-pro-6.lan/192.168.87.125:62154]
>> at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
>> at
>>
>> org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
>> at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
>> at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
>> at java.io.DataInputStream.readInt(DataInputStream.java:387)
>> at
>>
>> org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
>> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
>> 2018-06-19 21:34:31,931 [main] WARN  stram.RecoverableRpcProxy invoke - RPC
>> failure, will retry after 100 ms (remaining 391 ms)
>> java.net.SocketTimeoutException: Call From macbook-pro-6.lan/
>> 192.168.87.125
>> to macbook-pro-6.lan:62154 failed on socket timeout exception:
>> java.net.SocketTimeoutException: 500 millis timeout while waiting for
>> channel to be ready for read. ch :
>> java.nio.channels.SocketChannel[connected local=/192.168.87.125:62161
>> remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details see:
>> http://wiki.apache.org/hadoop/SocketTimeout
>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
>> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
>> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
>> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
>> at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
>> at com.sun.proxy.$Proxy138.reportError(Unknown Source)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> at java.lang.reflect.Method.invoke(Method.java:498)
>> at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
>> at com.sun.proxy.$Proxy138.reportError(Unknown Source)
>> at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:610)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> at java.lang.reflect.Method.invoke(Method.java:498)
>> at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
>> at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>> at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
>> at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
>> at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
>> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
>> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
>> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
>> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
>> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>> at org.junit.runners.Suite.runChild(Suite.java:127)
>> at org.junit.runners.Suite.runChild(Suite.java:26)
>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>> at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
>> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
>> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
>> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
>> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
>> at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
>> at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
>> at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
>> at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
>> Caused by: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62161 remote=macbook-pro-6.lan/192.168.87.125:62154]
>> at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
>> at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
>> at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
>> at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
>> at java.io.DataInputStream.readInt(DataInputStream.java:387)
>> at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
>> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
>> 2018-06-19 21:34:32,310 [IPC Server handler 0 on 62154] WARN  ipc.Server
>> processResponse - IPC Server handler 0 on 62154, call
>> reportError(containerId, null, timeout, null), rpc version=2, client
>> version=201208081755, methodsFingerPrint=-1300451462 from
>> 192.168.87.125:62160 Call#142 Retry#0: output error
>> 2018-06-19 21:34:32,512 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app1/recovery/log
>> 2018-06-19 21:34:32,628 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app1/recovery/log
>> 2018-06-19 21:34:32,696 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app2/recovery/log
>> 2018-06-19 21:34:32,698 [main] INFO  stram.StramClient copyInitialState - Copying initial state took 32 ms
>> 2018-06-19 21:34:32,799 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app2/recovery/log
>> 2018-06-19 21:34:32,850 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app3/recovery/log
>> 2018-06-19 21:34:32,851 [main] INFO  stram.StramClient copyInitialState - Copying initial state took 28 ms
>> 2018-06-19 21:34:32,955 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app3/recovery/log
>> 2018-06-19 21:34:32,976 [main] WARN  physical.PhysicalPlan <init> -
>> Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container without
>> locality contraint due to insufficient resources.
>> 2018-06-19 21:34:32,977 [main] WARN  physical.PhysicalPlan <init> -
>> Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container without
>> locality contraint due to insufficient resources.
>> 2018-06-19 21:34:32,977 [main] WARN  physical.PhysicalPlan <init> -
>> Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container without
>> locality contraint due to insufficient resources.
>> 2018-06-19 21:34:33,166 [main] WARN  physical.PhysicalPlan <init> -
>> Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container without
>> locality contraint due to insufficient resources.
>> 2018-06-19 21:34:33,166 [main] WARN  physical.PhysicalPlan <init> -
>> Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container without
>> locality contraint due to insufficient resources.
>> 2018-06-19 21:34:33,166 [main] WARN  physical.PhysicalPlan <init> -
>> Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container without
>> locality contraint due to insufficient resources.
>> 2018-06-19 21:34:33,338 [main] INFO  util.AsyncFSStorageAgent save - using
>> /Users/mbossert/testIdea/apex-core/engine/target/chkp2603930902590449397 as
>> the basepath for checkpointing.
>> 2018-06-19 21:34:33,436 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app1/recovery/log
>> 2018-06-19 21:34:33,505 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app1/recovery/log
>> 2018-06-19 21:34:33,553 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app2/recovery/log
>> 2018-06-19 21:34:33,554 [main] INFO  stram.StramClient copyInitialState - Copying initial state took 22 ms
>> 2018-06-19 21:34:33,642 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app2/recovery/log
>> 2018-06-19 21:34:33,690 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app3/recovery/log
>> 2018-06-19 21:34:33,691 [main] INFO  stram.StramClient copyInitialState - Copying initial state took 29 ms
>> 2018-06-19 21:34:33,805 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app3/recovery/log
>> 2018-06-19 21:34:33,830 [main] WARN  physical.PhysicalPlan <init> -
>> Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container without
>> locality contraint due to insufficient resources.
>> 2018-06-19 21:34:33,830 [main] WARN  physical.PhysicalPlan <init> -
>> Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container without
>> locality contraint due to insufficient resources.
>> 2018-06-19 21:34:33,831 [main] WARN  physical.PhysicalPlan <init> -
>> Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container without
>> locality contraint due to insufficient resources.
>> 2018-06-19 21:34:33,831 [main] INFO  util.AsyncFSStorageAgent save - using
>> /Users/mbossert/testIdea/apex-core/engine/target/chkp1878353095301008843 as
>> the basepath for checkpointing.
>> 2018-06-19 21:34:34,077 [main] WARN  physical.PhysicalPlan <init> -
>> Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container without
>> locality contraint due to insufficient resources.
>> 2018-06-19 21:34:34,077 [main] WARN  physical.PhysicalPlan <init> -
>> Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container without
>> locality contraint due to insufficient resources.
>> 2018-06-19 21:34:34,077 [main] WARN  physical.PhysicalPlan <init> -
>> Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container without
>> locality contraint due to insufficient resources.
>> 2018-06-19 21:34:34,077 [main] INFO  util.AsyncFSStorageAgent save - using
>> /Users/mbossert/testIdea/apex-core/engine/target/chkp7337975615972280003 as
>> the basepath for checkpointing.
>> Tests run: 8, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 6.143 sec
>> <<< FAILURE! - in com.datatorrent.stram.StramRecoveryTest
>> testWriteAheadLog(com.datatorrent.stram.StramRecoveryTest)  Time elapsed:
>> 0.111 sec  <<< FAILURE!
>> java.lang.AssertionError: flush count expected:<1> but was:<2>
>> at com.datatorrent.stram.StramRecoveryTest.testWriteAheadLog(StramRecoveryTest.java:326)
>>
>>
>> Running com.datatorrent.stram.CustomControlTupleTest
>> 2018-06-19 21:34:49,308 [main] INFO  util.AsyncFSStorageAgent save - using
>> /Users/mbossert/testIdea/apex-core/engine/target/chkp1213673348429546877 as
>> the basepath for checkpointing.
>> 2018-06-19 21:34:49,451 [main] INFO  storage.DiskStorage <init> - using
>> /Users/mbossert/testIdea/apex-core/engine/target as the basepath for
>> spooling.
>> 2018-06-19 21:34:49,451 [ProcessWideEventLoop] INFO  server.Server
>> registered - Server started listening at /0:0:0:0:0:0:0:0:62181
>> 2018-06-19 21:34:49,451 [main] INFO  stram.StramLocalCluster run - Buffer
>> server started: localhost:62181
>> 2018-06-19 21:34:49,452 [container-0] INFO  stram.StramLocalCluster run -
>> Started container container-0
>> 2018-06-19 21:34:49,452 [container-1] INFO  stram.StramLocalCluster run -
>> Started container container-1
>> 2018-06-19 21:34:49,452 [container-2] INFO  stram.StramLocalCluster run -
>> Started container container-2
>> 2018-06-19 21:34:49,452 [container-1] INFO  stram.StramLocalCluster log -
>> container-1 msg: [container-1] Entering heartbeat loop..
>> 2018-06-19 21:34:49,452 [container-0] INFO  stram.StramLocalCluster log -
>> container-0 msg: [container-0] Entering heartbeat loop..
>> 2018-06-19 21:34:49,452 [container-2] INFO  stram.StramLocalCluster log -
>> container-2 msg: [container-2] Entering heartbeat loop..
>> 2018-06-19 21:34:50,460 [container-2] INFO  engine.StreamingContainer processHeartbeatResponse - Deploy request: [OperatorDeployInfo[id=3,name=receiver,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=ProcessorToReceiver,sourceNodeId=2,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[]]]
>> 2018-06-19 21:34:50,460 [container-0] INFO  engine.StreamingContainer processHeartbeatResponse - Deploy request: [OperatorDeployInfo[id=1,name=randomGenerator,type=INPUT,checkpoint={ffffffffffffffff, 0, 0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=genToProcessor,bufferServer=localhost]]]]
>> 2018-06-19 21:34:50,460 [container-1] INFO  engine.StreamingContainer processHeartbeatResponse - Deploy request: [OperatorDeployInfo[id=2,name=process,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=localhost]]]]
>> 2018-06-19 21:34:50,463 [container-0] INFO  engine.WindowGenerator activate
>> - Catching up from 1529458489500 to 1529458490463
>> 2018-06-19 21:34:50,465 [ProcessWideEventLoop] INFO  server.Server
>> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
>> identifier=tcp://localhost:62181/2.output.1, windowId=ffffffffffffffff,
>> type=ProcessorToReceiver/3.input, upstreamIdentifier=2.output.1, mask=0,
>> partitions=null, bufferSize=1024}
>> 2018-06-19 21:34:50,466 [ProcessWideEventLoop] INFO  server.Server
>> onMessage - Received publisher request: PublishRequestTuple{version=1.0,
>> identifier=1.out.1, windowId=ffffffffffffffff}
>> 2018-06-19 21:34:50,466 [ProcessWideEventLoop] INFO  server.Server
>> onMessage - Received publisher request: PublishRequestTuple{version=1.0,
>> identifier=2.output.1, windowId=ffffffffffffffff}
>> 2018-06-19 21:34:50,466 [ProcessWideEventLoop] INFO  server.Server
>> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
>> identifier=tcp://localhost:62181/1.out.1, windowId=ffffffffffffffff,
>> type=genToProcessor/2.input, upstreamIdentifier=1.out.1, mask=0,
>> partitions=null, bufferSize=1024}
>> 2018-06-19 21:34:51,458 [main] INFO  stram.StramLocalCluster run - Stopping
>> on exit condition
>> 2018-06-19 21:34:51,458 [container-0] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Received shutdown request type ABORT
>> 2018-06-19 21:34:51,458 [container-1] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Received shutdown request type ABORT
>> 2018-06-19 21:34:51,458 [container-0] INFO  stram.StramLocalCluster log -
>> container-0 msg: [container-0] Exiting heartbeat loop..
>> 2018-06-19 21:34:51,458 [container-2] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Received shutdown request type ABORT
>> 2018-06-19 21:34:51,458 [container-2] INFO  stram.StramLocalCluster log -
>> container-2 msg: [container-2] Exiting heartbeat loop..
>> 2018-06-19 21:34:51,458 [container-1] INFO  stram.StramLocalCluster log -
>> container-1 msg: [container-1] Exiting heartbeat loop..
>> 2018-06-19 21:34:51,461 [container-2] INFO  stram.StramLocalCluster run -
>> Container container-2 terminating.
>> 2018-06-19 21:34:51,467 [container-1] INFO  stram.StramLocalCluster run -
>> Container container-1 terminating.
>> 2018-06-19 21:34:51,467 [container-0] INFO  stram.StramLocalCluster run -
>> Container container-0 terminating.
>> 2018-06-19 21:34:51,467 [ServerHelper-86-1] INFO  server.Server run - Removing ln LogicalNode@7d88b4a4identifier=tcp://localhost:62181/2.output.1, upstream=2.output.1, group=ProcessorToReceiver/3.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@35d66f18{da=com.datatorrent.bufferserver.internal.DataList$Block@d43c092{identifier=2.output.1, data=1048576, readingOffset=0, writingOffset=481, starting_window=5b29af3900000001, ending_window=5b29af3900000005, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@4dca4fb0[identifier=2.output.1]
>> 2018-06-19 21:34:51,468 [ServerHelper-86-1] INFO  server.Server run - Removing ln LogicalNode@3cb5be9fidentifier=tcp://localhost:62181/1.out.1, upstream=1.out.1, group=genToProcessor/2.input, partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@5c9a41d0{da=com.datatorrent.bufferserver.internal.DataList$Block@5a324bf4{identifier=1.out.1, data=1048576, readingOffset=0, writingOffset=481, starting_window=5b29af3900000001, ending_window=5b29af3900000005, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@49665770[identifier=1.out.1]
>> 2018-06-19 21:34:51,469 [ProcessWideEventLoop] INFO  server.Server run -
>> Server stopped listening at /0:0:0:0:0:0:0:0:62181
>> 2018-06-19 21:34:51,469 [main] INFO  stram.StramLocalCluster run -
>> Application finished.
>> 2018-06-19 21:34:51,469 [main] INFO  stram.CustomControlTupleTest testApp -
>> Control Tuples received 3 expected 3
>> 2018-06-19 21:34:51,492 [main] INFO  util.AsyncFSStorageAgent save - using
>> /Users/mbossert/testIdea/apex-core/engine/target/chkp5496551078484285394 as
>> the basepath for checkpointing.
>> 2018-06-19 21:34:51,623 [main] INFO  storage.DiskStorage <init> - using
>> /Users/mbossert/testIdea/apex-core/engine/target as the basepath for
>> spooling.
>> 2018-06-19 21:34:51,624 [ProcessWideEventLoop] INFO  server.Server
>> registered - Server started listening at /0:0:0:0:0:0:0:0:62186
>> 2018-06-19 21:34:51,624 [main] INFO  stram.StramLocalCluster run - Buffer
>> server started: localhost:62186
>> 2018-06-19 21:34:51,624 [container-0] INFO  stram.StramLocalCluster run -
>> Started container container-0
>> 2018-06-19 21:34:51,624 [container-0] INFO  stram.StramLocalCluster log -
>> container-0 msg: [container-0] Entering heartbeat loop..
>> 2018-06-19 21:34:52,628 [container-0] INFO  engine.StreamingContainer processHeartbeatResponse - Deploy request: [OperatorDeployInfo[id=1,name=randomGenerator,type=INPUT,checkpoint={ffffffffffffffff, 0, 0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=genToProcessor,bufferServer=<null>]]], OperatorDeployInfo[id=2,name=process,type=OIO,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=THREAD_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=<null>]]], OperatorDeployInfo[id=3,name=receiver,type=OIO,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=ProcessorToReceiver,sourceNodeId=2,sourcePortName=output,locality=THREAD_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[]]]
>> 2018-06-19 21:34:52,630 [container-0] INFO  engine.WindowGenerator activate
>> - Catching up from 1529458491500 to 1529458492630
>> 2018-06-19 21:34:53,628 [main] INFO  stram.StramLocalCluster run - Stopping
>> on exit condition
>> 2018-06-19 21:34:53,629 [container-0] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Received shutdown request type ABORT
>> 2018-06-19 21:34:53,630 [container-0] INFO  stram.StramLocalCluster log -
>> container-0 msg: [container-0] Exiting heartbeat loop..
>> 2018-06-19 21:34:53,640 [container-0] INFO  stram.StramLocalCluster run -
>> Container container-0 terminating.
>> 2018-06-19 21:34:53,641 [ProcessWideEventLoop] INFO  server.Server run -
>> Server stopped listening at /0:0:0:0:0:0:0:0:62186
>> 2018-06-19 21:34:53,642 [main] INFO  stram.StramLocalCluster run -
>> Application finished.
>> 2018-06-19 21:34:53,642 [main] INFO  stram.CustomControlTupleTest testApp -
>> Control Tuples received 3 expected 3
>> 2018-06-19 21:34:53,659 [main] INFO  util.AsyncFSStorageAgent save - using
>> /Users/mbossert/testIdea/apex-core/engine/target/chkp2212795894390935125 as
>> the basepath for checkpointing.
>> 2018-06-19 21:34:53,844 [main] INFO  storage.DiskStorage <init> - using
>> /Users/mbossert/testIdea/apex-core/engine/target as the basepath for
>> spooling.
>> 2018-06-19 21:34:53,844 [ProcessWideEventLoop] INFO  server.Server
>> registered - Server started listening at /0:0:0:0:0:0:0:0:62187
>> 2018-06-19 21:34:53,844 [main] INFO  stram.StramLocalCluster run - Buffer
>> server started: localhost:62187
>> 2018-06-19 21:34:53,845 [container-0] INFO  stram.StramLocalCluster run -
>> Started container container-0
>> 2018-06-19 21:34:53,845 [container-1] INFO  stram.StramLocalCluster run -
>> Started container container-1
>> 2018-06-19 21:34:53,845 [container-0] INFO  stram.StramLocalCluster log -
>> container-0 msg: [container-0] Entering heartbeat loop..
>> 2018-06-19 21:34:53,845 [container-2] INFO  stram.StramLocalCluster run -
>> Started container container-2
>> 2018-06-19 21:34:53,845 [container-1] INFO  stram.StramLocalCluster log -
>> container-1 msg: [container-1] Entering heartbeat loop..
>> 2018-06-19 21:34:53,845 [container-3] INFO  stram.StramLocalCluster run -
>> Started container container-3
>> 2018-06-19 21:34:53,845 [container-2] INFO  stram.StramLocalCluster log -
>> container-2 msg: [container-2] Entering heartbeat loop..
>> 2018-06-19 21:34:53,845 [container-3] INFO  stram.StramLocalCluster log -
>> container-3 msg: [container-3] Entering heartbeat loop..
>> 2018-06-19 21:34:54,850 [container-3] INFO  engine.StreamingContainer processHeartbeatResponse - Deploy request: [OperatorDeployInfo[id=1,name=randomGenerator,type=INPUT,checkpoint={ffffffffffffffff, 0, 0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=genToProcessor,bufferServer=localhost]]]]
>> 2018-06-19 21:34:54,850 [container-1] INFO  engine.StreamingContainer processHeartbeatResponse - Deploy request: [OperatorDeployInfo[id=3,name=process,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=<null>,partitionMask=1,partitionKeys=[1]]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=localhost]]]]
>> 2018-06-19 21:34:54,850 [container-0] INFO  engine.StreamingContainer processHeartbeatResponse - Deploy request: [OperatorDeployInfo[id=4,name=receiver,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=ProcessorToReceiver,sourceNodeId=5,sourcePortName=outputPort,locality=CONTAINER_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[]], OperatorDeployInfo.UnifierDeployInfo[id=5,name=process.output#unifier,type=UNIFIER,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=<merge#output>,streamId=ProcessorToReceiver,sourceNodeId=2,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>], OperatorDeployInfo.InputDeployInfo[portName=<merge#output>,streamId=ProcessorToReceiver,sourceNodeId=3,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=outputPort,streamId=ProcessorToReceiver,bufferServer=<null>]]]]
>> 2018-06-19 21:34:54,850 [container-2] INFO  engine.StreamingContainer processHeartbeatResponse - Deploy request: [OperatorDeployInfo[id=2,name=process,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=<null>,partitionMask=1,partitionKeys=[0]]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=localhost]]]]
>> 2018-06-19 21:34:54,852 [container-3] INFO  engine.WindowGenerator activate
>> - Catching up from 1529458493500 to 1529458494852
>> 2018-06-19 21:34:54,855 [ProcessWideEventLoop] INFO  server.Server
>> onMessage - Received publisher request: PublishRequestTuple{version=1.0,
>> identifier=1.out.1, windowId=ffffffffffffffff}
>> 2018-06-19 21:34:54,857 [ProcessWideEventLoop] INFO  server.Server
>> onMessage - Received publisher request: PublishRequestTuple{version=1.0,
>> identifier=2.output.1, windowId=ffffffffffffffff}
>> 2018-06-19 21:34:54,858 [ProcessWideEventLoop] INFO  server.Server
>> onMessage - Received publisher request: PublishRequestTuple{version=1.0,
>> identifier=3.output.1, windowId=ffffffffffffffff}
>> 2018-06-19 21:34:54,858 [ProcessWideEventLoop] INFO  server.Server
>> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
>> identifier=tcp://localhost:62187/1.out.1, windowId=ffffffffffffffff,
>> type=genToProcessor/3.input, upstreamIdentifier=1.out.1, mask=1,
>> partitions=[1], bufferSize=1024}
>> 2018-06-19 21:34:54,858 [ProcessWideEventLoop] INFO  server.Server
>> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
>> identifier=tcp://localhost:62187/1.out.1, windowId=ffffffffffffffff,
>> type=genToProcessor/2.input, upstreamIdentifier=1.out.1, mask=1,
>> partitions=[0], bufferSize=1024}
>> 2018-06-19 21:34:54,858 [ProcessWideEventLoop] INFO  server.Server
>> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
>> identifier=tcp://localhost:62187/3.output.1, windowId=ffffffffffffffff,
>> type=ProcessorToReceiver/5.<merge#output>(3.output),
>> upstreamIdentifier=3.output.1, mask=0, partitions=null, bufferSize=1024}
>> 2018-06-19 21:34:54,859 [ProcessWideEventLoop] INFO  server.Server
>> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
>> identifier=tcp://localhost:62187/2.output.1, windowId=ffffffffffffffff,
>> type=ProcessorToReceiver/5.<merge#output>(2.output),
>> upstreamIdentifier=2.output.1, mask=0, partitions=null, bufferSize=1024}
>> 2018-06-19 21:34:55,851 [main] INFO  stram.StramLocalCluster run - Stopping
>> on exit condition
>> 2018-06-19 21:34:55,852 [container-2] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Received shutdown request type ABORT
>> 2018-06-19 21:34:55,852 [container-3] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Received shutdown request type ABORT
>> 2018-06-19 21:34:55,852 [container-3] INFO  stram.StramLocalCluster log -
>> container-3 msg: [container-3] Exiting heartbeat loop..
>> 2018-06-19 21:34:55,852 [container-0] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Received shutdown request type ABORT
>> 2018-06-19 21:34:55,852 [container-2] INFO  stram.StramLocalCluster log -
>> container-2 msg: [container-2] Exiting heartbeat loop..
>> 2018-06-19 21:34:55,852 [container-1] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Received shutdown request type ABORT
>> 2018-06-19 21:34:55,852 [container-0] INFO  stram.StramLocalCluster log -
>> container-0 msg: [container-0] Exiting heartbeat loop..
>> 2018-06-19 21:34:55,852 [container-1] INFO  stram.StramLocalCluster log -
>> container-1 msg: [container-1] Exiting heartbeat loop..
>> 2018-06-19 21:34:55,857 [container-1] INFO  stram.StramLocalCluster run -
>> Container container-1 terminating.
>> 2018-06-19 21:34:55,858 [container-3] INFO  stram.StramLocalCluster run -
>> Container container-3 terminating.
>> 2018-06-19 21:34:55,858 [ServerHelper-92-1] INFO  server.Server run - Removing ln LogicalNode@5dbf681cidentifier=tcp://localhost:62187/3.output.1, upstream=3.output.1, group=ProcessorToReceiver/5.<merge#output>(3.output), partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@6244ac9{da=com.datatorrent.bufferserver.internal.DataList$Block@60e28815{identifier=3.output.1, data=1048576, readingOffset=0, writingOffset=487, starting_window=5b29af3d00000001, ending_window=5b29af3d00000006, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@46bbe39d[identifier=3.output.1]
>> 2018-06-19 21:34:55,858 [ServerHelper-92-1] INFO  server.Server run - Removing ln LogicalNode@7fb3226aidentifier=tcp://localhost:62187/1.out.1, upstream=1.out.1, group=genToProcessor/2.input, partitions=[BitVector{mask=1, bits=0}], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@2ad6890f{da=com.datatorrent.bufferserver.internal.DataList$Block@e00fc9e{identifier=1.out.1, data=1048576, readingOffset=0, writingOffset=487, starting_window=5b29af3d00000001, ending_window=5b29af3d00000006, refCount=3, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@7a566f6b[identifier=1.out.1]
>> 2018-06-19 21:34:55,858 [ServerHelper-92-1] INFO  server.Server run - Removing ln LogicalNode@2551b8a4identifier=tcp://localhost:62187/1.out.1, upstream=1.out.1, group=genToProcessor/3.input, partitions=[BitVector{mask=1, bits=1}], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@6368ccb7{da=com.datatorrent.bufferserver.internal.DataList$Block@e00fc9e{identifier=1.out.1, data=1048576, readingOffset=0, writingOffset=487, starting_window=5b29af3d00000001, ending_window=5b29af3d00000006, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@7a566f6b[identifier=1.out.1]
>> 2018-06-19 21:34:55,862 [container-2] INFO  stram.StramLocalCluster run - Container container-2 terminating.
>> 2018-06-19 21:34:55,862 [ServerHelper-92-1] INFO  server.Server run - Removing ln LogicalNode@2e985326identifier=tcp://localhost:62187/2.output.1, upstream=2.output.1, group=ProcessorToReceiver/5.<merge#output>(2.output), partitions=[], iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@7d68bf24{da=com.datatorrent.bufferserver.internal.DataList$Block@7405581b{identifier=2.output.1, data=1048576, readingOffset=0, writingOffset=487, starting_window=5b29af3d00000001, ending_window=5b29af3d00000006, refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl DataList@3de15cc7[identifier=2.output.1]
>> 2018-06-19 21:34:55,862 [container-0] INFO  stram.StramLocalCluster run -
>> Container container-0 terminating.
>> 2018-06-19 21:34:55,864 [ProcessWideEventLoop] INFO  server.Server run -
>> Server stopped listening at /0:0:0:0:0:0:0:0:62187
>> 2018-06-19 21:34:55,864 [main] INFO  stram.StramLocalCluster run -
>> Application finished.
>> 2018-06-19 21:34:55,864 [main] INFO  stram.CustomControlTupleTest testApp -
>> Control Tuples received 3 expected 3
>> 2018-06-19 21:34:55,883 [main] INFO  util.AsyncFSStorageAgent save - using
>> /Users/mbossert/testIdea/apex-core/engine/target/chkp8804999206923662400 as
>> the basepath for checkpointing.
>> 2018-06-19 21:34:56,032 [main] INFO  storage.DiskStorage <init> - using
>> /Users/mbossert/testIdea/apex-core/engine/target as the basepath for
>> spooling.
>> 2018-06-19 21:34:56,032 [ProcessWideEventLoop] INFO  server.Server
>> registered - Server started listening at /0:0:0:0:0:0:0:0:62195
>> 2018-06-19 21:34:56,032 [main] INFO  stram.StramLocalCluster run - Buffer
>> server started: localhost:62195
>> 2018-06-19 21:34:56,033 [container-0] INFO  stram.StramLocalCluster run -
>> Started container container-0
>> 2018-06-19 21:34:56,033 [container-0] INFO  stram.StramLocalCluster log -
>> container-0 msg: [container-0] Entering heartbeat loop..
>> 2018-06-19 21:34:57,038 [container-0] INFO  engine.StreamingContainer processHeartbeatResponse - Deploy request: [OperatorDeployInfo[id=2,name=process,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=CONTAINER_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=<null>]]], OperatorDeployInfo[id=3,name=receiver,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=ProcessorToReceiver,sourceNodeId=2,sourcePortName=output,locality=CONTAINER_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[]], OperatorDeployInfo[id=1,name=randomGenerator,type=INPUT,checkpoint={ffffffffffffffff, 0, 0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=genToProcessor,bufferServer=<null>]]]]
>> 2018-06-19 21:34:57,040 [container-0] INFO  engine.WindowGenerator activate
>> - Catching up from 1529458495500 to 1529458497040
>> 2018-06-19 21:34:58,042 [main] INFO  stram.StramLocalCluster run - Stopping
>> on exit condition
>> 2018-06-19 21:34:59,045 [main] WARN  stram.StramLocalCluster run -
>> Container thread container-0 is still alive
>> 2018-06-19 21:34:59,047 [ProcessWideEventLoop] INFO  server.Server run -
>> Server stopped listening at /0:0:0:0:0:0:0:0:62195
>> 2018-06-19 21:34:59,047 [container-0] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Received shutdown request type ABORT
>> 2018-06-19 21:34:59,047 [main] INFO  stram.StramLocalCluster run -
>> Application finished.
>> 2018-06-19 21:34:59,047 [main] INFO  stram.CustomControlTupleTest testApp -
>> Control Tuples received 4 expected 4
>> 2018-06-19 21:34:59,047 [container-0] INFO  stram.StramLocalCluster log -
>> container-0 msg: [container-0] Exiting heartbeat loop..
>> 2018-06-19 21:34:59,057 [container-0] INFO  stram.StramLocalCluster run -
>> Container container-0 terminating.
>> 2018-06-19 21:34:59,064 [main] INFO  util.AsyncFSStorageAgent save - using
>> /Users/mbossert/testIdea/apex-core/engine/target/chkp4046668014410536641 as
>> the basepath for checkpointing.
>> 2018-06-19 21:34:59,264 [main] INFO  storage.DiskStorage <init> - using
>> /Users/mbossert/testIdea/apex-core/engine/target as the basepath for
>> spooling.
>> 2018-06-19 21:34:59,264 [ProcessWideEventLoop] INFO  server.Server
>> registered - Server started listening at /0:0:0:0:0:0:0:0:62196
>> 2018-06-19 21:34:59,265 [main] INFO  stram.StramLocalCluster run - Buffer
>> server started: localhost:62196
>> 2018-06-19 21:34:59,265 [container-0] INFO  stram.StramLocalCluster run -
>> Started container container-0
>> 2018-06-19 21:34:59,265 [container-0] INFO  stram.StramLocalCluster log -
>> container-0 msg: [container-0] Entering heartbeat loop..
>> 2018-06-19 21:34:59,265 [container-1] INFO  stram.StramLocalCluster run -
>> Started container container-1
>> 2018-06-19 21:34:59,265 [container-2] INFO  stram.StramLocalCluster run -
>> Started container container-2
>> 2018-06-19 21:34:59,266 [container-1] INFO  stram.StramLocalCluster log -
>> container-1 msg: [container-1] Entering heartbeat loop..
>> 2018-06-19 21:34:59,266 [container-3] INFO  stram.StramLocalCluster run -
>> Started container container-3
>> 2018-06-19 21:34:59,266 [container-2] INFO  stram.StramLocalCluster log -
>> container-2 msg: [container-2] Entering heartbeat loop..
>> 2018-06-19 21:34:59,266 [container-3] INFO  stram.StramLocalCluster log -
>> container-3 msg: [container-3] Entering heartbeat loop..
>> 2018-06-19 21:35:00,270 [container-0] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Deploy request:
>>
>> [OperatorDeployInfo[id=1,name=randomGenerator,type=INPUT,checkpoint={ffffffffffffffff,
>> 0,
>>
>> 0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=genToProcessor,bufferServer=localhost]]]]
>> 2018-06-19 21:35:00,270 [container-2] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Deploy request:
>>
>> [OperatorDeployInfo[id=2,name=process,type=GENERIC,checkpoint={ffffffffffffffff,
>> 0,
>>
>> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=<null>,partitionMask=1,partitionKeys=[0]]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=localhost]]]]
>> 2018-06-19 21:35:00,271 [container-1] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Deploy request:
>>
>> [OperatorDeployInfo[id=4,name=receiver,type=GENERIC,checkpoint={ffffffffffffffff,
>> 0,
>>
>> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=ProcessorToReceiver,sourceNodeId=5,sourcePortName=outputPort,locality=CONTAINER_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[]],
>>
>> OperatorDeployInfo.UnifierDeployInfo[id=5,name=process.output#unifier,type=UNIFIER,checkpoint={ffffffffffffffff,
>> 0,
>>
>> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=<merge#output>,streamId=ProcessorToReceiver,sourceNodeId=2,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>],
>>
>> OperatorDeployInfo.InputDeployInfo[portName=<merge#output>,streamId=ProcessorToReceiver,sourceNodeId=3,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=outputPort,streamId=ProcessorToReceiver,bufferServer=<null>]]]]
>> 2018-06-19 21:35:00,270 [container-3] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Deploy request:
>>
>> [OperatorDeployInfo[id=3,name=process,type=GENERIC,checkpoint={ffffffffffffffff,
>> 0,
>>
>> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=<null>,partitionMask=1,partitionKeys=[1]]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=localhost]]]]
>> 2018-06-19 21:35:00,273 [container-0] INFO  engine.WindowGenerator activate
>> - Catching up from 1529458499500 to 1529458500273
>> 2018-06-19 21:35:00,274 [ProcessWideEventLoop] INFO  server.Server
>> onMessage - Received publisher request: PublishRequestTuple{version=1.0,
>> identifier=1.out.1, windowId=ffffffffffffffff}
>> 2018-06-19 21:35:00,276 [ProcessWideEventLoop] INFO  server.Server
>> onMessage - Received publisher request: PublishRequestTuple{version=1.0,
>> identifier=3.output.1, windowId=ffffffffffffffff}
>> 2018-06-19 21:35:00,277 [ProcessWideEventLoop] INFO  server.Server
>> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
>> identifier=tcp://localhost:62196/1.out.1, windowId=ffffffffffffffff,
>> type=genToProcessor/2.input, upstreamIdentifier=1.out.1, mask=1,
>> partitions=[0], bufferSize=1024}
>> 2018-06-19 21:35:00,278 [ProcessWideEventLoop] INFO  server.Server
>> onMessage - Received publisher request: PublishRequestTuple{version=1.0,
>> identifier=2.output.1, windowId=ffffffffffffffff}
>> 2018-06-19 21:35:00,278 [ProcessWideEventLoop] INFO  server.Server
>> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
>> identifier=tcp://localhost:62196/1.out.1, windowId=ffffffffffffffff,
>> type=genToProcessor/3.input, upstreamIdentifier=1.out.1, mask=1,
>> partitions=[1], bufferSize=1024}
>> 2018-06-19 21:35:00,278 [ProcessWideEventLoop] INFO  server.Server
>> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
>> identifier=tcp://localhost:62196/3.output.1, windowId=ffffffffffffffff,
>> type=ProcessorToReceiver/5.<merge#output>(3.output),
>> upstreamIdentifier=3.output.1, mask=0, partitions=null, bufferSize=1024}
>> 2018-06-19 21:35:00,278 [ProcessWideEventLoop] INFO  server.Server
>> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
>> identifier=tcp://localhost:62196/2.output.1, windowId=ffffffffffffffff,
>> type=ProcessorToReceiver/5.<merge#output>(2.output),
>> upstreamIdentifier=2.output.1, mask=0, partitions=null, bufferSize=1024}
>> 2018-06-19 21:35:01,273 [main] INFO  stram.StramLocalCluster run - Stopping
>> on exit condition
>> 2018-06-19 21:35:01,273 [container-3] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Received shutdown request type ABORT
>> 2018-06-19 21:35:01,273 [container-2] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Received shutdown request type ABORT
>> 2018-06-19 21:35:01,273 [container-2] INFO  stram.StramLocalCluster log -
>> container-2 msg: [container-2] Exiting heartbeat loop..
>> 2018-06-19 21:35:01,273 [container-0] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Received shutdown request type ABORT
>> 2018-06-19 21:35:01,274 [container-0] INFO  stram.StramLocalCluster log -
>> container-0 msg: [container-0] Exiting heartbeat loop..
>> 2018-06-19 21:35:01,273 [container-3] INFO  stram.StramLocalCluster log -
>> container-3 msg: [container-3] Exiting heartbeat loop..
>> 2018-06-19 21:35:01,273 [container-1] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Received shutdown request type ABORT
>> 2018-06-19 21:35:01,274 [container-1] INFO  stram.StramLocalCluster log -
>> container-1 msg: [container-1] Exiting heartbeat loop..
>> 2018-06-19 21:35:01,279 [container-3] INFO  stram.StramLocalCluster run -
>> Container container-3 terminating.
>> 2018-06-19 21:35:01,279 [ServerHelper-98-1] INFO  server.Server run -
>> Removing ln LogicalNode@d80a435identifier
>> =tcp://localhost:62196/3.output.1,
>> upstream=3.output.1, group=ProcessorToReceiver/5.<merge#output>(3.output),
>> partitions=[],
>>
>> iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@2f53b5e7
>> {da=com.datatorrent.bufferserver.internal.DataList$Block@1e2d4212
>> {identifier=3.output.1,
>> data=1048576, readingOffset=0, writingOffset=36,
>> starting_window=5b29af4300000001, ending_window=5b29af4300000005,
>> refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
>> DataList@1684ecee[identifier=3.output.1]
>> 2018-06-19 21:35:01,285 [container-2] INFO  stram.StramLocalCluster run -
>> Container container-2 terminating.
>> 2018-06-19 21:35:01,285 [container-1] INFO  stram.StramLocalCluster run -
>> Container container-1 terminating.
>> 2018-06-19 21:35:01,286 [container-0] INFO  stram.StramLocalCluster run -
>> Container container-0 terminating.
>> 2018-06-19 21:35:01,286 [ServerHelper-98-1] INFO  server.Server run -
>> Removing ln LogicalNode@75d245a1identifier
>> =tcp://localhost:62196/2.output.1,
>> upstream=2.output.1, group=ProcessorToReceiver/5.<merge#output>(2.output),
>> partitions=[],
>>
>> iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@719c5cd2
>> {da=com.datatorrent.bufferserver.internal.DataList$Block@43d2338b
>> {identifier=2.output.1,
>> data=1048576, readingOffset=0, writingOffset=36,
>> starting_window=5b29af4300000001, ending_window=5b29af4300000005,
>> refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
>> DataList@379bd431[identifier=2.output.1]
>> 2018-06-19 21:35:01,286 [ServerHelper-98-1] INFO  server.Server run -
>> Removing ln LogicalNode@54c0b0d5identifier=tcp://localhost:62196/1.out.1,
>> upstream=1.out.1, group=genToProcessor/2.input,
>> partitions=[BitVector{mask=1, bits=0}],
>>
>> iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@649adb0
>> {da=com.datatorrent.bufferserver.internal.DataList$Block@15201b67
>> {identifier=1.out.1,
>> data=1048576, readingOffset=0, writingOffset=36,
>> starting_window=5b29af4300000001, ending_window=5b29af4300000005,
>> refCount=3, uniqueIdentifier=0, next=null, future=null}}} from dl
>> DataList@47bc3c23[identifier=1.out.1]
>> 2018-06-19 21:35:01,286 [ServerHelper-98-1] INFO  server.Server run -
>> Removing ln LogicalNode@2422ada2identifier=tcp://localhost:62196/1.out.1,
>> upstream=1.out.1, group=genToProcessor/3.input,
>> partitions=[BitVector{mask=1, bits=1}],
>>
>> iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@2e6f42b9
>> {da=com.datatorrent.bufferserver.internal.DataList$Block@15201b67
>> {identifier=1.out.1,
>> data=1048576, readingOffset=0, writingOffset=36,
>> starting_window=5b29af4300000001, ending_window=5b29af4300000005,
>> refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
>> DataList@47bc3c23[identifier=1.out.1]
>> 2018-06-19 21:35:01,287 [ProcessWideEventLoop] INFO  server.Server run -
>> Server stopped listening at /0:0:0:0:0:0:0:0:62196
>> 2018-06-19 21:35:01,287 [main] INFO  stram.StramLocalCluster run -
>> Application finished.
>> 2018-06-19 21:35:01,288 [main] INFO  stram.CustomControlTupleTest testApp -
>> Control Tuples received 0 expected 1
>> 2018-06-19 21:35:01,305 [main] INFO  util.AsyncFSStorageAgent save - using
>> /Users/mbossert/testIdea/apex-core/engine/target/chkp6727909541678525259 as
>> the basepath for checkpointing.
>> 2018-06-19 21:35:01,460 [main] INFO  storage.DiskStorage <init> - using
>> /Users/mbossert/testIdea/apex-core/engine/target as the basepath for
>> spooling.
>> 2018-06-19 21:35:01,460 [ProcessWideEventLoop] INFO  server.Server
>> registered - Server started listening at /0:0:0:0:0:0:0:0:62204
>> 2018-06-19 21:35:01,461 [main] INFO  stram.StramLocalCluster run - Buffer
>> server started: localhost:62204
>> 2018-06-19 21:35:01,461 [container-0] INFO  stram.StramLocalCluster run -
>> Started container container-0
>> 2018-06-19 21:35:01,461 [container-1] INFO  stram.StramLocalCluster run -
>> Started container container-1
>> 2018-06-19 21:35:01,461 [container-0] INFO  stram.StramLocalCluster log -
>> container-0 msg: [container-0] Entering heartbeat loop..
>> 2018-06-19 21:35:01,461 [container-2] INFO  stram.StramLocalCluster run -
>> Started container container-2
>> 2018-06-19 21:35:01,461 [container-1] INFO  stram.StramLocalCluster log -
>> container-1 msg: [container-1] Entering heartbeat loop..
>> 2018-06-19 21:35:01,462 [container-2] INFO  stram.StramLocalCluster log -
>> container-2 msg: [container-2] Entering heartbeat loop..
>> 2018-06-19 21:35:02,464 [container-0] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Deploy request:
>>
>> [OperatorDeployInfo[id=3,name=receiver,type=GENERIC,checkpoint={ffffffffffffffff,
>> 0,
>>
>> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=ProcessorToReceiver,sourceNodeId=2,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[]]]
>> 2018-06-19 21:35:02,464 [container-1] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Deploy request:
>>
>> [OperatorDeployInfo[id=2,name=process,type=GENERIC,checkpoint={ffffffffffffffff,
>> 0,
>>
>> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=localhost]]]]
>> 2018-06-19 21:35:02,464 [container-2] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Deploy request:
>>
>> [OperatorDeployInfo[id=1,name=randomGenerator,type=INPUT,checkpoint={ffffffffffffffff,
>> 0,
>>
>> 0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=genToProcessor,bufferServer=localhost]]]]
>> 2018-06-19 21:35:02,467 [container-2] INFO  engine.WindowGenerator activate
>> - Catching up from 1529458501500 to 1529458502467
>> 2018-06-19 21:35:02,469 [ProcessWideEventLoop] INFO  server.Server
>> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
>> identifier=tcp://localhost:62204/2.output.1, windowId=ffffffffffffffff,
>> type=ProcessorToReceiver/3.input, upstreamIdentifier=2.output.1, mask=0,
>> partitions=null, bufferSize=1024}
>> 2018-06-19 21:35:02,469 [ProcessWideEventLoop] INFO  server.Server
>> onMessage - Received publisher request: PublishRequestTuple{version=1.0,
>> identifier=1.out.1, windowId=ffffffffffffffff}
>> 2018-06-19 21:35:02,470 [ProcessWideEventLoop] INFO  server.Server
>> onMessage - Received publisher request: PublishRequestTuple{version=1.0,
>> identifier=2.output.1, windowId=ffffffffffffffff}
>> 2018-06-19 21:35:02,470 [ProcessWideEventLoop] INFO  server.Server
>> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
>> identifier=tcp://localhost:62204/1.out.1, windowId=ffffffffffffffff,
>> type=genToProcessor/2.input, upstreamIdentifier=1.out.1, mask=0,
>> partitions=null, bufferSize=1024}
>> 2018-06-19 21:35:03,463 [main] INFO  stram.StramLocalCluster run - Stopping
>> on exit condition
>> 2018-06-19 21:35:03,463 [container-1] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Received shutdown request type ABORT
>> 2018-06-19 21:35:03,463 [container-2] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Received shutdown request type ABORT
>> 2018-06-19 21:35:03,464 [container-2] INFO  stram.StramLocalCluster log -
>> container-2 msg: [container-2] Exiting heartbeat loop..
>> 2018-06-19 21:35:03,463 [container-1] INFO  stram.StramLocalCluster log -
>> container-1 msg: [container-1] Exiting heartbeat loop..
>> 2018-06-19 21:35:03,463 [container-0] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Received shutdown request type ABORT
>> 2018-06-19 21:35:03,464 [container-0] INFO  stram.StramLocalCluster log -
>> container-0 msg: [container-0] Exiting heartbeat loop..
>> 2018-06-19 21:35:03,464 [container-2] INFO  stram.StramLocalCluster run -
>> Container container-2 terminating.
>> 2018-06-19 21:35:03,465 [ServerHelper-101-1] INFO  server.Server run -
>> Removing ln LogicalNode@5a90f429identifier=tcp://localhost:62204/1.out.1,
>> upstream=1.out.1, group=genToProcessor/2.input, partitions=[],
>>
>> iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@1fa9e9ce
>> {da=com.datatorrent.bufferserver.internal.DataList$Block@6b00c947
>> {identifier=1.out.1,
>> data=1048576, readingOffset=0, writingOffset=481,
>> starting_window=5b29af4500000001, ending_window=5b29af4500000005,
>> refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
>> DataList@67d38a09[identifier=1.out.1]
>> 2018-06-19 21:35:03,470 [container-1] INFO  stram.StramLocalCluster run -
>> Container container-1 terminating.
>> 2018-06-19 21:35:03,470 [container-0] INFO  stram.StramLocalCluster run -
>> Container container-0 terminating.
>> 2018-06-19 21:35:03,471 [ServerHelper-101-1] INFO  server.Server run -
>> Removing ln LogicalNode@1badfe12identifier
>> =tcp://localhost:62204/2.output.1,
>> upstream=2.output.1, group=ProcessorToReceiver/3.input, partitions=[],
>>
>> iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@3abf0b66
>> {da=com.datatorrent.bufferserver.internal.DataList$Block@6a887266
>> {identifier=2.output.1,
>> data=1048576, readingOffset=0, writingOffset=481,
>> starting_window=5b29af4500000001, ending_window=5b29af4500000005,
>> refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
>> DataList@7afd481[identifier=2.output.1]
>> 2018-06-19 21:35:03,472 [ProcessWideEventLoop] INFO  server.Server run -
>> Server stopped listening at /0:0:0:0:0:0:0:0:62204
>> 2018-06-19 21:35:03,472 [main] INFO  stram.StramLocalCluster run -
>> Application finished.
>> 2018-06-19 21:35:03,472 [main] INFO  stram.CustomControlTupleTest testApp -
>> Control Tuples received 3 expected 3
>> 2018-06-19 21:35:03,489 [main] INFO  util.AsyncFSStorageAgent save - using
>> /Users/mbossert/testIdea/apex-core/engine/target/chkp1123378605276624191 as
>> the basepath for checkpointing.
>> 2018-06-19 21:35


Re: Branch 3.7.0 failing install related to Kryo version...perhaps

Vlad Rozov-2
In reply to this post by Vlad Rozov-2
After removing the flush override, all tests pass on my machine, so that
appears to be the only issue with the Kryo upgrade.

Thank you,

Vlad

On 6/20/18 09:06, Vlad Rozov wrote:

> At a minimum, one problem is caused by a change in Kryo behavior. Please
> take a look at Journal.setOutputStream(). There is a workaround for
> Kryo flush() that is not needed anymore:
>
> if (out != null) {
>   output = new Output(4096, -1)
>   {
>     @Override
>     public void flush() throws KryoException
>     {
>       super.flush();
>       // Kryo does not flush its internal output stream during flush();
>       // we need to flush it explicitly.
>       try {
>         getOutputStream().flush();
>       } catch (IOException e) {
>         throw new KryoException(e);
>       }
>     }
>   };
> }
> Thank you,
>
> Vlad
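[Editor's note] The behavior Vlad describes can be sketched with a small,
self-contained example. `LegacyOutput` below is a hypothetical stand-in for
the pre-4.x Kryo `Output` (not the real Kryo class): its `flush()` hands
buffered bytes to the wrapped stream but never flushes that stream itself,
which is exactly why the override in Journal.setOutputStream() calls
`getOutputStream().flush()` explicitly. `FlushSensitiveStream` is likewise an
illustrative sink that only commits bytes when flushed, so a missing flush is
observable.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;

// Hypothetical stand-in for the old Kryo Output behavior: flush() hands
// buffered bytes to the wrapped stream but never calls that stream's flush().
class LegacyOutput {
    private final byte[] buffer = new byte[4096];
    private int position;
    private final OutputStream out;

    LegacyOutput(OutputStream out) {
        this.out = out;
    }

    OutputStream getOutputStream() {
        return out;
    }

    void writeByte(int b) {
        buffer[position++] = (byte) b;
    }

    void flush() throws IOException {
        out.write(buffer, 0, position); // bytes reach the wrapped stream...
        position = 0;                   // ...but out.flush() is never called
    }
}

// Illustrative sink that only commits bytes when flushed, so a missing
// flush() call shows up as lost data.
class FlushSensitiveStream extends OutputStream {
    private final ByteArrayOutputStream pending = new ByteArrayOutputStream();
    final ByteArrayOutputStream committed = new ByteArrayOutputStream();

    @Override
    public void write(int b) {
        pending.write(b);
    }

    @Override
    public void flush() throws IOException {
        pending.writeTo(committed);
        pending.reset();
    }
}

public class FlushDemo {
    public static void main(String[] args) throws IOException {
        FlushSensitiveStream sink = new FlushSensitiveStream();
        // The workaround pattern: override flush() to propagate it to the
        // wrapped stream, as the Journal code does.
        LegacyOutput output = new LegacyOutput(sink) {
            @Override
            void flush() throws IOException {
                super.flush();
                getOutputStream().flush();
            }
        };
        output.writeByte(42);
        output.flush();
        System.out.println(sink.committed.size()); // 1 with the override; 0 without it
    }
}
```

Per Vlad's observation, Kryo 4.x flushes the wrapped stream from its own
`flush()`, so deleting the override leaves behavior unchanged there, while
under the old behavior the override was what made the bytes reach the sink.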
> On 6/19/18 18:53, Aaron Bossert wrote:
>> Pramod,
>>
>> Thanks for taking the time to help!
>>
>> Here is the output (just failed parts) when running full install (clean
>> install -X) on the Master branch:
>>
>> Running com.datatorrent.stram.StramRecoveryTest
>> 2018-06-19 21:34:28,137 [main] INFO  stram.StramRecoveryTest
>> testRpcFailover - Mock server listening at macbook-pro-6.lan/
>> 192.168.87.125:62154
>> 2018-06-19 21:34:28,678 [main] ERROR stram.RecoverableRpcProxy invoke -
>> Giving up RPC connection recovery after 507 ms
>> java.net.SocketTimeoutException: Call From macbook-pro-6.lan/192.168.87.125
>> to macbook-pro-6.lan:62154 failed on socket timeout exception:
>> java.net.SocketTimeoutException: 500 millis timeout while waiting for
>> channel to be ready for read. ch :
>> java.nio.channels.SocketChannel[connected local=/192.168.87.125:62155
>> remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details see:
>> http://wiki.apache.org/hadoop/SocketTimeout
>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>> at
>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>> at
>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
>> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
>> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
>> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
>> at
>> org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
>> at com.sun.proxy.$Proxy138.log(Unknown Source)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>> at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> at java.lang.reflect.Method.invoke(Method.java:498)
>> at
>> com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
>> at com.sun.proxy.$Proxy138.log(Unknown Source)
>> at
>> com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:561)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>> at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> at java.lang.reflect.Method.invoke(Method.java:498)
>> at
>> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
>> at
>> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>> at
>> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
>> at
>> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
>> at
>> org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
>> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
>> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
>> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
>> at
>> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
>> at
>> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>> at org.junit.runners.Suite.runChild(Suite.java:127)
>> at org.junit.runners.Suite.runChild(Suite.java:26)
>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>> at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
>> at
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
>> at
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
>> at
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
>> at
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
>> at
>> org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
>> at
>> org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
>> at
>> org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
>> at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
>> Caused by: java.net.SocketTimeoutException: 500 millis timeout while
>> waiting for channel to be ready for read. ch :
>> java.nio.channels.SocketChannel[connected local=/192.168.87.125:62155
>> remote=macbook-pro-6.lan/192.168.87.125:62154]
>> at
>> org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
>> at
>> org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
>> at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
>> at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
>> at java.io.DataInputStream.readInt(DataInputStream.java:387)
>> at
>> org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
>> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
>> 2018-06-19 21:34:29,178 [IPC Server handler 0 on 62154] WARN  ipc.Server
>> processResponse - IPC Server handler 0 on 62154, call log(containerId,
>> timeout), rpc version=2, client version=201208081755,
>> methodsFingerPrint=-1300451462 from 192.168.87.125:62155 Call#136 Retry#0:
>> output error
>> 2018-06-19 21:34:29,198 [main] WARN  stram.RecoverableRpcProxy invoke - RPC
>> failure, will retry after 100 ms (remaining 994 ms)
>> java.net.SocketTimeoutException: Call From macbook-pro-6.lan/192.168.87.125
>> to macbook-pro-6.lan:62154 failed on socket timeout exception:
>> java.net.SocketTimeoutException: 500 millis timeout while waiting for
>> channel to be ready for read. ch :
>> java.nio.channels.SocketChannel[connected local=/192.168.87.125:62156
>> remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details see:
>> http://wiki.apache.org/hadoop/SocketTimeout
>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>> at
>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>> at
>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
>> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
>> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
>> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
>> at
>> org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
>> at com.sun.proxy.$Proxy138.log(Unknown Source)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>> at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> at java.lang.reflect.Method.invoke(Method.java:498)
>> at
>> com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
>> at com.sun.proxy.$Proxy138.log(Unknown Source)
>> at
>> com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:575)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>> at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> at java.lang.reflect.Method.invoke(Method.java:498)
>> at
>> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
>> at
>> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>> at
>> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
>> at
>> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
>> at
>> org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
>> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
>> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
>> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
>> at
>> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
>> at
>> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>> at org.junit.runners.Suite.runChild(Suite.java:127)
>> at org.junit.runners.Suite.runChild(Suite.java:26)
>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>> at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
>> at
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
>> at
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
>> at
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
>> at
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
>> at
>> org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
>> at
>> org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
>> at
>> org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
>> at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
>> Caused by: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62156 remote=macbook-pro-6.lan/192.168.87.125:62154]
>> at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
>> at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
>> at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
>> at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
>> at java.io.DataInputStream.readInt(DataInputStream.java:387)
>> at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
>> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
>> 2018-06-19 21:34:29,806 [main] WARN  stram.RecoverableRpcProxy invoke - RPC failure, will retry after 100 ms (remaining 386 ms)
>> java.net.SocketTimeoutException: Call From macbook-pro-6.lan/192.168.87.125 to macbook-pro-6.lan:62154 failed on socket timeout exception: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62157 remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details see: http://wiki.apache.org/hadoop/SocketTimeout
>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
>> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
>> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
>> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
>> at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
>> at com.sun.proxy.$Proxy138.log(Unknown Source)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> at java.lang.reflect.Method.invoke(Method.java:498)
>> at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
>> at com.sun.proxy.$Proxy138.log(Unknown Source)
>> at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:575)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> at java.lang.reflect.Method.invoke(Method.java:498)
>> at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
>> at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>> at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
>> at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
>> at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
>> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
>> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
>> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
>> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
>> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>> at org.junit.runners.Suite.runChild(Suite.java:127)
>> at org.junit.runners.Suite.runChild(Suite.java:26)
>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>> at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
>> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
>> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
>> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
>> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
>> at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
>> at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
>> at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
>> at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
>> Caused by: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62157 remote=macbook-pro-6.lan/192.168.87.125:62154]
>> at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
>> at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
>> at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
>> at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
>> at java.io.DataInputStream.readInt(DataInputStream.java:387)
>> at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
>> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
>> 2018-06-19 21:34:30,180 [IPC Server handler 0 on 62154] WARN  ipc.Server processResponse - IPC Server handler 0 on 62154, call log(containerId, timeout), rpc version=2, client version=201208081755, methodsFingerPrint=-1300451462 from 192.168.87.125:62156 Call#137 Retry#0: output error
>> 2018-06-19 21:34:30,808 [main] ERROR stram.RecoverableRpcProxy invoke - Giving up RPC connection recovery after 506 ms
>> java.net.SocketTimeoutException: Call From macbook-pro-6.lan/192.168.87.125 to macbook-pro-6.lan:62154 failed on socket timeout exception: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62159 remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details see: http://wiki.apache.org/hadoop/SocketTimeout
>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
>> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
>> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
>> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
>> at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
>> at com.sun.proxy.$Proxy138.log(Unknown Source)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> at java.lang.reflect.Method.invoke(Method.java:498)
>> at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
>> at com.sun.proxy.$Proxy138.log(Unknown Source)
>> at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:596)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> at java.lang.reflect.Method.invoke(Method.java:498)
>> at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
>> at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>> at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
>> at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
>> at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
>> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
>> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
>> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
>> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
>> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>> at org.junit.runners.Suite.runChild(Suite.java:127)
>> at org.junit.runners.Suite.runChild(Suite.java:26)
>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>> at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
>> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
>> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
>> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
>> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
>> at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
>> at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
>> at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
>> at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
>> Caused by: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62159 remote=macbook-pro-6.lan/192.168.87.125:62154]
>> at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
>> at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
>> at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
>> at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
>> at java.io.DataInputStream.readInt(DataInputStream.java:387)
>> at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
>> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
>> 2018-06-19 21:34:31,307 [IPC Server handler 0 on 62154] WARN  ipc.Server processResponse - IPC Server handler 0 on 62154, call log(containerId, timeout), rpc version=2, client version=201208081755, methodsFingerPrint=-1300451462 from 192.168.87.125:62159 Call#141 Retry#0: output error
>> 2018-06-19 21:34:31,327 [main] WARN  stram.RecoverableRpcProxy invoke - RPC failure, will retry after 100 ms (remaining 995 ms)
>> java.net.SocketTimeoutException: Call From macbook-pro-6.lan/192.168.87.125 to macbook-pro-6.lan:62154 failed on socket timeout exception: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62160 remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details see: http://wiki.apache.org/hadoop/SocketTimeout
>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
>> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
>> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
>> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
>> at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
>> at com.sun.proxy.$Proxy138.reportError(Unknown Source)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> at java.lang.reflect.Method.invoke(Method.java:498)
>> at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
>> at com.sun.proxy.$Proxy138.reportError(Unknown Source)
>> at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:610)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> at java.lang.reflect.Method.invoke(Method.java:498)
>> at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
>> at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>> at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
>> at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
>> at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
>> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
>> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
>> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
>> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
>> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>> at org.junit.runners.Suite.runChild(Suite.java:127)
>> at org.junit.runners.Suite.runChild(Suite.java:26)
>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>> at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
>> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
>> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
>> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
>> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
>> at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
>> at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
>> at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
>> at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
>> Caused by: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62160 remote=macbook-pro-6.lan/192.168.87.125:62154]
>> at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
>> at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
>> at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
>> at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
>> at java.io.DataInputStream.readInt(DataInputStream.java:387)
>> at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
>> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
>> 2018-06-19 21:34:31,931 [main] WARN  stram.RecoverableRpcProxy invoke - RPC failure, will retry after 100 ms (remaining 391 ms)
>> java.net.SocketTimeoutException: Call From macbook-pro-6.lan/192.168.87.125 to macbook-pro-6.lan:62154 failed on socket timeout exception: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62161 remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details see: http://wiki.apache.org/hadoop/SocketTimeout
>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
>> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
>> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
>> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
>> at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
>> at com.sun.proxy.$Proxy138.reportError(Unknown Source)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> at java.lang.reflect.Method.invoke(Method.java:498)
>> at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
>> at com.sun.proxy.$Proxy138.reportError(Unknown Source)
>> at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:610)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> at java.lang.reflect.Method.invoke(Method.java:498)
>> at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
>> at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>> at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
>> at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
>> at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
>> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
>> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
>> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
>> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
>> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>> at org.junit.runners.Suite.runChild(Suite.java:127)
>> at org.junit.runners.Suite.runChild(Suite.java:26)
>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>> at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
>> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
>> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
>> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
>> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
>> at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
>> at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
>> at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
>> at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
>> Caused by: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62161 remote=macbook-pro-6.lan/192.168.87.125:62154]
>> at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
>> at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
>> at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
>> at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
>> at java.io.DataInputStream.readInt(DataInputStream.java:387)
>> at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
>> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
>> 2018-06-19 21:34:32,310 [IPC Server handler 0 on 62154] WARN  ipc.Server processResponse - IPC Server handler 0 on 62154, call reportError(containerId, null, timeout, null), rpc version=2, client version=201208081755, methodsFingerPrint=-1300451462 from 192.168.87.125:62160 Call#142 Retry#0: output error
>> 2018-06-19 21:34:32,512 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app1/recovery/log
>> 2018-06-19 21:34:32,628 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app1/recovery/log
>> 2018-06-19 21:34:32,696 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app2/recovery/log
>> 2018-06-19 21:34:32,698 [main] INFO  stram.StramClient copyInitialState - Copying initial state took 32 ms
>> 2018-06-19 21:34:32,799 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app2/recovery/log
>> 2018-06-19 21:34:32,850 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app3/recovery/log
>> 2018-06-19 21:34:32,851 [main] INFO  stram.StramClient copyInitialState - Copying initial state took 28 ms
>> 2018-06-19 21:34:32,955 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app3/recovery/log
>> 2018-06-19 21:34:32,976 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container without locality contraint due to insufficient resources.
>> 2018-06-19 21:34:32,977 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container without locality contraint due to insufficient resources.
>> 2018-06-19 21:34:32,977 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container without locality contraint due to insufficient resources.
>> 2018-06-19 21:34:33,166 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container without locality contraint due to insufficient resources.
>> 2018-06-19 21:34:33,166 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container without locality contraint due to insufficient resources.
>> 2018-06-19 21:34:33,166 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container without locality contraint due to insufficient resources.
>> 2018-06-19 21:34:33,338 [main] INFO  util.AsyncFSStorageAgent save - using /Users/mbossert/testIdea/apex-core/engine/target/chkp2603930902590449397 as the basepath for checkpointing.
>> 2018-06-19 21:34:33,436 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app1/recovery/log
>> 2018-06-19 21:34:33,505 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app1/recovery/log
>> 2018-06-19 21:34:33,553 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app2/recovery/log
>> 2018-06-19 21:34:33,554 [main] INFO  stram.StramClient copyInitialState - Copying initial state took 22 ms
>> 2018-06-19 21:34:33,642 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app2/recovery/log
>> 2018-06-19 21:34:33,690 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app3/recovery/log
>> 2018-06-19 21:34:33,691 [main] INFO  stram.StramClient copyInitialState - Copying initial state took 29 ms
>> 2018-06-19 21:34:33,805 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app3/recovery/log
>> 2018-06-19 21:34:33,830 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container without locality contraint due to insufficient resources.
>> 2018-06-19 21:34:33,830 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container without locality contraint due to insufficient resources.
>> 2018-06-19 21:34:33,831 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container without locality contraint due to insufficient resources.
>> 2018-06-19 21:34:33,831 [main] INFO  util.AsyncFSStorageAgent save - using /Users/mbossert/testIdea/apex-core/engine/target/chkp1878353095301008843 as the basepath for checkpointing.
>> 2018-06-19 21:34:34,077 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container without locality contraint due to insufficient resources.
>> 2018-06-19 21:34:34,077 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container without locality contraint due to insufficient resources.
>> 2018-06-19 21:34:34,077 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container without locality contraint due to insufficient resources.
>> 2018-06-19 21:34:34,077 [main] INFO  util.AsyncFSStorageAgent save - using /Users/mbossert/testIdea/apex-core/engine/target/chkp7337975615972280003 as the basepath for checkpointing.
>> Tests run: 8, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 6.143 sec <<< FAILURE! - in com.datatorrent.stram.StramRecoveryTest
>> testWriteAheadLog(com.datatorrent.stram.StramRecoveryTest)  Time elapsed: 0.111 sec  <<< FAILURE!
>> java.lang.AssertionError: flush count expected:<1> but was:<2>
>> at com.datatorrent.stram.StramRecoveryTest.testWriteAheadLog(StramRecoveryTest.java:326)
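The flush-count assertion in testWriteAheadLog above is the only hard failure; the socket-timeout traces earlier come from testRpcFailover, which deliberately exercises RPC retry and recovery. To iterate on just the failing test instead of re-running the whole build, standard Maven/Surefire single-test selection works (a sketch assuming the failure is in the `engine` module, as the `apex-core/engine/target` paths above suggest):

```shell
# Re-run only the failing test method in the engine module,
# skipping the other modules of the reactor.
# -pl selects the module; -Dtest=Class#method is Surefire's
# single-method selection syntax (Surefire 2.7.3 and later).
mvn test -pl engine -Dtest=StramRecoveryTest#testWriteAheadLog
```

Adding `-X` as in the original run keeps the debug output for comparison against the Kryo 2.24.0 baseline.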
>>
>>
>> Running com.datatorrent.stram.CustomControlTupleTest
>> 2018-06-19 21:34:49,308 [main] INFO  util.AsyncFSStorageAgent save - using /Users/mbossert/testIdea/apex-core/engine/target/chkp1213673348429546877 as the basepath for checkpointing.
>> 2018-06-19 21:34:49,451 [main] INFO  storage.DiskStorage <init> - using /Users/mbossert/testIdea/apex-core/engine/target as the basepath for spooling.
>> 2018-06-19 21:34:49,451 [ProcessWideEventLoop] INFO  server.Server registered - Server started listening at /0:0:0:0:0:0:0:0:62181
>> 2018-06-19 21:34:49,451 [main] INFO  stram.StramLocalCluster run - Buffer server started: localhost:62181
>> 2018-06-19 21:34:49,452 [container-0] INFO  stram.StramLocalCluster run - Started container container-0
>> 2018-06-19 21:34:49,452 [container-1] INFO  stram.StramLocalCluster run - Started container container-1
>> 2018-06-19 21:34:49,452 [container-2] INFO  stram.StramLocalCluster run - Started container container-2
>> 2018-06-19 21:34:49,452 [container-1] INFO  stram.StramLocalCluster log - container-1 msg: [container-1] Entering heartbeat loop..
>> 2018-06-19 21:34:49,452 [container-0] INFO  stram.StramLocalCluster log - container-0 msg: [container-0] Entering heartbeat loop..
>> 2018-06-19 21:34:49,452 [container-2] INFO  stram.StramLocalCluster log - container-2 msg: [container-2] Entering heartbeat loop..
>> 2018-06-19 21:34:50,460 [container-2] INFO  engine.StreamingContainer processHeartbeatResponse - Deploy request: [OperatorDeployInfo[id=3,name=receiver,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=ProcessorToReceiver,sourceNodeId=2,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[]]]
>> 2018-06-19 21:34:50,460 [container-0] INFO  engine.StreamingContainer processHeartbeatResponse - Deploy request: [OperatorDeployInfo[id=1,name=randomGenerator,type=INPUT,checkpoint={ffffffffffffffff, 0, 0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=genToProcessor,bufferServer=localhost]]]]
>> 2018-06-19 21:34:50,460 [container-1] INFO  engine.StreamingContainer processHeartbeatResponse - Deploy request: [OperatorDeployInfo[id=2,name=process,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=localhost]]]]
>> 2018-06-19 21:34:50,463 [container-0] INFO  engine.WindowGenerator activate - Catching up from 1529458489500 to 1529458490463
>> 2018-06-19 21:34:50,465 [ProcessWideEventLoop] INFO  server.Server onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0, identifier=tcp://localhost:62181/2.output.1, windowId=ffffffffffffffff, type=ProcessorToReceiver/3.input, upstreamIdentifier=2.output.1, mask=0, partitions=null, bufferSize=1024}
>> 2018-06-19 21:34:50,466 [ProcessWideEventLoop] INFO  server.Server onMessage - Received publisher request: PublishRequestTuple{version=1.0, identifier=1.out.1, windowId=ffffffffffffffff}
>> 2018-06-19 21:34:50,466 [ProcessWideEventLoop] INFO  server.Server onMessage - Received publisher request: PublishRequestTuple{version=1.0, identifier=2.output.1, windowId=ffffffffffffffff}
>> 2018-06-19 21:34:50,466 [ProcessWideEventLoop] INFO  server.Server
>> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
>> identifier=tcp://localhost:62181/1.out.1, windowId=ffffffffffffffff,
>> type=genToProcessor/2.input, upstreamIdentifier=1.out.1, mask=0,
>> partitions=null, bufferSize=1024}
>> 2018-06-19 21:34:51,458 [main] INFO  stram.StramLocalCluster run - Stopping
>> on exit condition
>> 2018-06-19 21:34:51,458 [container-0] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Received shutdown request type ABORT
>> 2018-06-19 21:34:51,458 [container-1] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Received shutdown request type ABORT
>> 2018-06-19 21:34:51,458 [container-0] INFO  stram.StramLocalCluster log -
>> container-0 msg: [container-0] Exiting heartbeat loop..
>> 2018-06-19 21:34:51,458 [container-2] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Received shutdown request type ABORT
>> 2018-06-19 21:34:51,458 [container-2] INFO  stram.StramLocalCluster log -
>> container-2 msg: [container-2] Exiting heartbeat loop..
>> 2018-06-19 21:34:51,458 [container-1] INFO  stram.StramLocalCluster log -
>> container-1 msg: [container-1] Exiting heartbeat loop..
>> 2018-06-19 21:34:51,461 [container-2] INFO  stram.StramLocalCluster run -
>> Container container-2 terminating.
>> 2018-06-19 21:34:51,467 [container-1] INFO  stram.StramLocalCluster run -
>> Container container-1 terminating.
>> 2018-06-19 21:34:51,467 [container-0] INFO  stram.StramLocalCluster run -
>> Container container-0 terminating.
>> 2018-06-19 21:34:51,467 [ServerHelper-86-1] INFO  server.Server run -
>> Removing lnLogicalNode@7d88b4a4identifier=tcp://localhost:62181/2.output.1,
>> upstream=2.output.1, group=ProcessorToReceiver/3.input, partitions=[],
>> iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@35d66f18
>> {da=com.datatorrent.bufferserver.internal.DataList$Block@d43c092{identifier=2.output.1,
>> data=1048576, readingOffset=0, writingOffset=481,
>> starting_window=5b29af3900000001, ending_window=5b29af3900000005,
>> refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
>> DataList@4dca4fb0[identifier=2.output.1]
>> 2018-06-19 21:34:51,468 [ServerHelper-86-1] INFO  server.Server run -
>> Removing lnLogicalNode@3cb5be9fidentifier=tcp://localhost:62181/1.out.1,
>> upstream=1.out.1, group=genToProcessor/2.input, partitions=[],
>> iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@5c9a41d0
>> {da=com.datatorrent.bufferserver.internal.DataList$Block@5a324bf4{identifier=1.out.1,
>> data=1048576, readingOffset=0, writingOffset=481,
>> starting_window=5b29af3900000001, ending_window=5b29af3900000005,
>> refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
>> DataList@49665770[identifier=1.out.1]
>> 2018-06-19 21:34:51,469 [ProcessWideEventLoop] INFO  server.Server run -
>> Server stopped listening at /0:0:0:0:0:0:0:0:62181
>> 2018-06-19 21:34:51,469 [main] INFO  stram.StramLocalCluster run -
>> Application finished.
>> 2018-06-19 21:34:51,469 [main] INFO  stram.CustomControlTupleTest testApp -
>> Control Tuples received 3 expected 3
>> 2018-06-19 21:34:51,492 [main] INFO  util.AsyncFSStorageAgent save - using
>> /Users/mbossert/testIdea/apex-core/engine/target/chkp5496551078484285394 as
>> the basepath for checkpointing.
>> 2018-06-19 21:34:51,623 [main] INFO  storage.DiskStorage <init> - using
>> /Users/mbossert/testIdea/apex-core/engine/target as the basepath for
>> spooling.
>> 2018-06-19 21:34:51,624 [ProcessWideEventLoop] INFO  server.Server
>> registered - Server started listening at /0:0:0:0:0:0:0:0:62186
>> 2018-06-19 21:34:51,624 [main] INFO  stram.StramLocalCluster run - Buffer
>> server started: localhost:62186
>> 2018-06-19 21:34:51,624 [container-0] INFO  stram.StramLocalCluster run -
>> Started container container-0
>> 2018-06-19 21:34:51,624 [container-0] INFO  stram.StramLocalCluster log -
>> container-0 msg: [container-0] Entering heartbeat loop..
>> 2018-06-19 21:34:52,628 [container-0] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Deploy request:
>> [OperatorDeployInfo[id=1,name=randomGenerator,type=INPUT,checkpoint={ffffffffffffffff,
>> 0,
>> 0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=genToProcessor,bufferServer=<null>]]],
>> OperatorDeployInfo[id=2,name=process,type=OIO,checkpoint={ffffffffffffffff,
>> 0,
>> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=THREAD_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=<null>]]],
>> OperatorDeployInfo[id=3,name=receiver,type=OIO,checkpoint={ffffffffffffffff,
>> 0,
>> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=ProcessorToReceiver,sourceNodeId=2,sourcePortName=output,locality=THREAD_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[]]]
>> 2018-06-19 21:34:52,630 [container-0] INFO  engine.WindowGenerator activate
>> - Catching up from 1529458491500 to 1529458492630
>> 2018-06-19 21:34:53,628 [main] INFO  stram.StramLocalCluster run - Stopping
>> on exit condition
>> 2018-06-19 21:34:53,629 [container-0] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Received shutdown request type ABORT
>> 2018-06-19 21:34:53,630 [container-0] INFO  stram.StramLocalCluster log -
>> container-0 msg: [container-0] Exiting heartbeat loop..
>> 2018-06-19 21:34:53,640 [container-0] INFO  stram.StramLocalCluster run -
>> Container container-0 terminating.
>> 2018-06-19 21:34:53,641 [ProcessWideEventLoop] INFO  server.Server run -
>> Server stopped listening at /0:0:0:0:0:0:0:0:62186
>> 2018-06-19 21:34:53,642 [main] INFO  stram.StramLocalCluster run -
>> Application finished.
>> 2018-06-19 21:34:53,642 [main] INFO  stram.CustomControlTupleTest testApp -
>> Control Tuples received 3 expected 3
>> 2018-06-19 21:34:53,659 [main] INFO  util.AsyncFSStorageAgent save - using
>> /Users/mbossert/testIdea/apex-core/engine/target/chkp2212795894390935125 as
>> the basepath for checkpointing.
>> 2018-06-19 21:34:53,844 [main] INFO  storage.DiskStorage <init> - using
>> /Users/mbossert/testIdea/apex-core/engine/target as the basepath for
>> spooling.
>> 2018-06-19 21:34:53,844 [ProcessWideEventLoop] INFO  server.Server
>> registered - Server started listening at /0:0:0:0:0:0:0:0:62187
>> 2018-06-19 21:34:53,844 [main] INFO  stram.StramLocalCluster run - Buffer
>> server started: localhost:62187
>> 2018-06-19 21:34:53,845 [container-0] INFO  stram.StramLocalCluster run -
>> Started container container-0
>> 2018-06-19 21:34:53,845 [container-1] INFO  stram.StramLocalCluster run -
>> Started container container-1
>> 2018-06-19 21:34:53,845 [container-0] INFO  stram.StramLocalCluster log -
>> container-0 msg: [container-0] Entering heartbeat loop..
>> 2018-06-19 21:34:53,845 [container-2] INFO  stram.StramLocalCluster run -
>> Started container container-2
>> 2018-06-19 21:34:53,845 [container-1] INFO  stram.StramLocalCluster log -
>> container-1 msg: [container-1] Entering heartbeat loop..
>> 2018-06-19 21:34:53,845 [container-3] INFO  stram.StramLocalCluster run -
>> Started container container-3
>> 2018-06-19 21:34:53,845 [container-2] INFO  stram.StramLocalCluster log -
>> container-2 msg: [container-2] Entering heartbeat loop..
>> 2018-06-19 21:34:53,845 [container-3] INFO  stram.StramLocalCluster log -
>> container-3 msg: [container-3] Entering heartbeat loop..
>> 2018-06-19 21:34:54,850 [container-3] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Deploy request:
>> [OperatorDeployInfo[id=1,name=randomGenerator,type=INPUT,checkpoint={ffffffffffffffff,
>> 0,
>> 0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=genToProcessor,bufferServer=localhost]]]]
>> 2018-06-19 21:34:54,850 [container-1] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Deploy request:
>> [OperatorDeployInfo[id=3,name=process,type=GENERIC,checkpoint={ffffffffffffffff,
>> 0,
>> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=<null>,partitionMask=1,partitionKeys=[1]]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=localhost]]]]
>> 2018-06-19 21:34:54,850 [container-0] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Deploy request:
>> [OperatorDeployInfo[id=4,name=receiver,type=GENERIC,checkpoint={ffffffffffffffff,
>> 0,
>> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=ProcessorToReceiver,sourceNodeId=5,sourcePortName=outputPort,locality=CONTAINER_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[]],
>> OperatorDeployInfo.UnifierDeployInfo[id=5,name=process.output#unifier,type=UNIFIER,checkpoint={ffffffffffffffff,
>> 0,
>> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=<merge#output>,streamId=ProcessorToReceiver,sourceNodeId=2,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>],
>> OperatorDeployInfo.InputDeployInfo[portName=<merge#output>,streamId=ProcessorToReceiver,sourceNodeId=3,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=outputPort,streamId=ProcessorToReceiver,bufferServer=<null>]]]]
>> 2018-06-19 21:34:54,850 [container-2] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Deploy request:
>> [OperatorDeployInfo[id=2,name=process,type=GENERIC,checkpoint={ffffffffffffffff,
>> 0,
>> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=<null>,partitionMask=1,partitionKeys=[0]]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=localhost]]]]
>> 2018-06-19 21:34:54,852 [container-3] INFO  engine.WindowGenerator activate
>> - Catching up from 1529458493500 to 1529458494852
>> 2018-06-19 21:34:54,855 [ProcessWideEventLoop] INFO  server.Server
>> onMessage - Received publisher request: PublishRequestTuple{version=1.0,
>> identifier=1.out.1, windowId=ffffffffffffffff}
>> 2018-06-19 21:34:54,857 [ProcessWideEventLoop] INFO  server.Server
>> onMessage - Received publisher request: PublishRequestTuple{version=1.0,
>> identifier=2.output.1, windowId=ffffffffffffffff}
>> 2018-06-19 21:34:54,858 [ProcessWideEventLoop] INFO  server.Server
>> onMessage - Received publisher request: PublishRequestTuple{version=1.0,
>> identifier=3.output.1, windowId=ffffffffffffffff}
>> 2018-06-19 21:34:54,858 [ProcessWideEventLoop] INFO  server.Server
>> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
>> identifier=tcp://localhost:62187/1.out.1, windowId=ffffffffffffffff,
>> type=genToProcessor/3.input, upstreamIdentifier=1.out.1, mask=1,
>> partitions=[1], bufferSize=1024}
>> 2018-06-19 21:34:54,858 [ProcessWideEventLoop] INFO  server.Server
>> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
>> identifier=tcp://localhost:62187/1.out.1, windowId=ffffffffffffffff,
>> type=genToProcessor/2.input, upstreamIdentifier=1.out.1, mask=1,
>> partitions=[0], bufferSize=1024}
>> 2018-06-19 21:34:54,858 [ProcessWideEventLoop] INFO  server.Server
>> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
>> identifier=tcp://localhost:62187/3.output.1, windowId=ffffffffffffffff,
>> type=ProcessorToReceiver/5.<merge#output>(3.output),
>> upstreamIdentifier=3.output.1, mask=0, partitions=null, bufferSize=1024}
>> 2018-06-19 21:34:54,859 [ProcessWideEventLoop] INFO  server.Server
>> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
>> identifier=tcp://localhost:62187/2.output.1, windowId=ffffffffffffffff,
>> type=ProcessorToReceiver/5.<merge#output>(2.output),
>> upstreamIdentifier=2.output.1, mask=0, partitions=null, bufferSize=1024}
>> 2018-06-19 21:34:55,851 [main] INFO  stram.StramLocalCluster run - Stopping
>> on exit condition
>> 2018-06-19 21:34:55,852 [container-2] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Received shutdown request type ABORT
>> 2018-06-19 21:34:55,852 [container-3] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Received shutdown request type ABORT
>> 2018-06-19 21:34:55,852 [container-3] INFO  stram.StramLocalCluster log -
>> container-3 msg: [container-3] Exiting heartbeat loop..
>> 2018-06-19 21:34:55,852 [container-0] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Received shutdown request type ABORT
>> 2018-06-19 21:34:55,852 [container-2] INFO  stram.StramLocalCluster log -
>> container-2 msg: [container-2] Exiting heartbeat loop..
>> 2018-06-19 21:34:55,852 [container-1] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Received shutdown request type ABORT
>> 2018-06-19 21:34:55,852 [container-0] INFO  stram.StramLocalCluster log -
>> container-0 msg: [container-0] Exiting heartbeat loop..
>> 2018-06-19 21:34:55,852 [container-1] INFO  stram.StramLocalCluster log -
>> container-1 msg: [container-1] Exiting heartbeat loop..
>> 2018-06-19 21:34:55,857 [container-1] INFO  stram.StramLocalCluster run -
>> Container container-1 terminating.
>> 2018-06-19 21:34:55,858 [container-3] INFO  stram.StramLocalCluster run -
>> Container container-3 terminating.
>> 2018-06-19 21:34:55,858 [ServerHelper-92-1] INFO  server.Server run -
>> Removing lnLogicalNode@5dbf681cidentifier=tcp://localhost:62187/3.output.1,
>> upstream=3.output.1, group=ProcessorToReceiver/5.<merge#output>(3.output),
>> partitions=[],
>> iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@6244ac9
>> {da=com.datatorrent.bufferserver.internal.DataList$Block@60e28815{identifier=3.output.1,
>> data=1048576, readingOffset=0, writingOffset=487,
>> starting_window=5b29af3d00000001, ending_window=5b29af3d00000006,
>> refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
>> DataList@46bbe39d[identifier=3.output.1]
>> 2018-06-19 21:34:55,858 [ServerHelper-92-1] INFO  server.Server run -
>> Removing lnLogicalNode@7fb3226aidentifier=tcp://localhost:62187/1.out.1,
>> upstream=1.out.1, group=genToProcessor/2.input,
>> partitions=[BitVector{mask=1, bits=0}],
>> iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@2ad6890f
>> {da=com.datatorrent.bufferserver.internal.DataList$Block@e00fc9e{identifier=1.out.1,
>> data=1048576, readingOffset=0, writingOffset=487,
>> starting_window=5b29af3d00000001, ending_window=5b29af3d00000006,
>> refCount=3, uniqueIdentifier=0, next=null, future=null}}} from dl
>> DataList@7a566f6b[identifier=1.out.1]
>> 2018-06-19 21:34:55,858 [ServerHelper-92-1] INFO  server.Server run -
>> Removing lnLogicalNode@2551b8a4identifier=tcp://localhost:62187/1.out.1,
>> upstream=1.out.1, group=genToProcessor/3.input,
>> partitions=[BitVector{mask=1, bits=1}],
>> iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@6368ccb7
>> {da=com.datatorrent.bufferserver.internal.DataList$Block@e00fc9e{identifier=1.out.1,
>> data=1048576, readingOffset=0, writingOffset=487,
>> starting_window=5b29af3d00000001, ending_window=5b29af3d00000006,
>> refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
>> DataList@7a566f6b[identifier=1.out.1]
>> 2018-06-19 21:34:55,862 [container-2] INFO  stram.StramLocalCluster run -
>> Container container-2 terminating.
>> 2018-06-19 21:34:55,862 [ServerHelper-92-1] INFO  server.Server run -
>> Removing lnLogicalNode@2e985326identifier=tcp://localhost:62187/2.output.1,
>> upstream=2.output.1, group=ProcessorToReceiver/5.<merge#output>(2.output),
>> partitions=[],
>> iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@7d68bf24
>> {da=com.datatorrent.bufferserver.internal.DataList$Block@7405581b{identifier=2.output.1,
>> data=1048576, readingOffset=0, writingOffset=487,
>> starting_window=5b29af3d00000001, ending_window=5b29af3d00000006,
>> refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
>> DataList@3de15cc7[identifier=2.output.1]
>> 2018-06-19 21:34:55,862 [container-0] INFO  stram.StramLocalCluster run -
>> Container container-0 terminating.
>> 2018-06-19 21:34:55,864 [ProcessWideEventLoop] INFO  server.Server run -
>> Server stopped listening at /0:0:0:0:0:0:0:0:62187
>> 2018-06-19 21:34:55,864 [main] INFO  stram.StramLocalCluster run -
>> Application finished.
>> 2018-06-19 21:34:55,864 [main] INFO  stram.CustomControlTupleTest testApp -
>> Control Tuples received 3 expected 3
>> 2018-06-19 21:34:55,883 [main] INFO  util.AsyncFSStorageAgent save - using
>> /Users/mbossert/testIdea/apex-core/engine/target/chkp8804999206923662400 as
>> the basepath for checkpointing.
>> 2018-06-19 21:34:56,032 [main] INFO  storage.DiskStorage <init> - using
>> /Users/mbossert/testIdea/apex-core/engine/target as the basepath for
>> spooling.
>> 2018-06-19 21:34:56,032 [ProcessWideEventLoop] INFO  server.Server
>> registered - Server started listening at /0:0:0:0:0:0:0:0:62195
>> 2018-06-19 21:34:56,032 [main] INFO  stram.StramLocalCluster run - Buffer
>> server started: localhost:62195
>> 2018-06-19 21:34:56,033 [container-0] INFO  stram.StramLocalCluster run -
>> Started container container-0
>> 2018-06-19 21:34:56,033 [container-0] INFO  stram.StramLocalCluster log -
>> container-0 msg: [container-0] Entering heartbeat loop..
>> 2018-06-19 21:34:57,038 [container-0] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Deploy request:
>> [OperatorDeployInfo[id=2,name=process,type=GENERIC,checkpoint={ffffffffffffffff,
>> 0,
>> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=CONTAINER_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=<null>]]],
>> OperatorDeployInfo[id=3,name=receiver,type=GENERIC,checkpoint={ffffffffffffffff,
>> 0,
>> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=ProcessorToReceiver,sourceNodeId=2,sourcePortName=output,locality=CONTAINER_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[]],
>> OperatorDeployInfo[id=1,name=randomGenerator,type=INPUT,checkpoint={ffffffffffffffff,
>> 0,
>> 0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=genToProcessor,bufferServer=<null>]]]]
>> 2018-06-19 21:34:57,040 [container-0] INFO  engine.WindowGenerator activate
>> - Catching up from 1529458495500 to 1529458497040
>> 2018-06-19 21:34:58,042 [main] INFO  stram.StramLocalCluster run - Stopping
>> on exit condition
>> 2018-06-19 21:34:59,045 [main] WARN  stram.StramLocalCluster run -
>> Container thread container-0 is still alive
>> 2018-06-19 21:34:59,047 [ProcessWideEventLoop] INFO  server.Server run -
>> Server stopped listening at /0:0:0:0:0:0:0:0:62195
>> 2018-06-19 21:34:59,047 [container-0] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Received shutdown request type ABORT
>> 2018-06-19 21:34:59,047 [main] INFO  stram.StramLocalCluster run -
>> Application finished.
>> 2018-06-19 21:34:59,047 [main] INFO  stram.CustomControlTupleTest testApp -
>> Control Tuples received 4 expected 4
>> 2018-06-19 21:34:59,047 [container-0] INFO  stram.StramLocalCluster log -
>> container-0 msg: [container-0] Exiting heartbeat loop..
>> 2018-06-19 21:34:59,057 [container-0] INFO  stram.StramLocalCluster run -
>> Container container-0 terminating.
>> 2018-06-19 21:34:59,064 [main] INFO  util.AsyncFSStorageAgent save - using
>> /Users/mbossert/testIdea/apex-core/engine/target/chkp4046668014410536641 as
>> the basepath for checkpointing.
>> 2018-06-19 21:34:59,264 [main] INFO  storage.DiskStorage <init> - using
>> /Users/mbossert/testIdea/apex-core/engine/target as the basepath for
>> spooling.
>> 2018-06-19 21:34:59,264 [ProcessWideEventLoop] INFO  server.Server
>> registered - Server started listening at /0:0:0:0:0:0:0:0:62196
>> 2018-06-19 21:34:59,265 [main] INFO  stram.StramLocalCluster run - Buffer
>> server started: localhost:62196
>> 2018-06-19 21:34:59,265 [container-0] INFO  stram.StramLocalCluster run -
>> Started container container-0
>> 2018-06-19 21:34:59,265 [container-0] INFO  stram.StramLocalCluster log -
>> container-0 msg: [container-0] Entering heartbeat loop..
>> 2018-06-19 21:34:59,265 [container-1] INFO  stram.StramLocalCluster run -
>> Started container container-1
>> 2018-06-19 21:34:59,265 [container-2] INFO  stram.StramLocalCluster run -
>> Started container container-2
>> 2018-06-19 21:34:59,266 [container-1] INFO  stram.StramLocalCluster log -
>> container-1 msg: [container-1] Entering heartbeat loop..
>> 2018-06-19 21:34:59,266 [container-3] INFO  stram.StramLocalCluster run -
>> Started container container-3
>> 2018-06-19 21:34:59,266 [container-2] INFO  stram.StramLocalCluster log -
>> container-2 msg: [container-2] Entering heartbeat loop..
>> 2018-06-19 21:34:59,266 [container-3] INFO  stram.StramLocalCluster log -
>> container-3 msg: [container-3] Entering heartbeat loop..
>> 2018-06-19 21:35:00,270 [container-0] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Deploy request:
>> [OperatorDeployInfo[id=1,name=randomGenerator,type=INPUT,checkpoint={ffffffffffffffff,
>> 0,
>> 0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=genToProcessor,bufferServer=localhost]]]]
>> 2018-06-19 21:35:00,270 [container-2] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Deploy request:
>> [OperatorDeployInfo[id=2,name=process,type=GENERIC,checkpoint={ffffffffffffffff,
>> 0,
>> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=<null>,partitionMask=1,partitionKeys=[0]]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=localhost]]]]
>> 2018-06-19 21:35:00,271 [container-1] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Deploy request:
>> [OperatorDeployInfo[id=4,name=receiver,type=GENERIC,checkpoint={ffffffffffffffff,
>> 0,
>> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=ProcessorToReceiver,sourceNodeId=5,sourcePortName=outputPort,locality=CONTAINER_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[]],
>> OperatorDeployInfo.UnifierDeployInfo[id=5,name=process.output#unifier,type=UNIFIER,checkpoint={ffffffffffffffff,
>> 0,
>> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=<merge#output>,streamId=ProcessorToReceiver,sourceNodeId=2,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>],
>> OperatorDeployInfo.InputDeployInfo[portName=<merge#output>,streamId=ProcessorToReceiver,sourceNodeId=3,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=outputPort,streamId=ProcessorToReceiver,bufferServer=<null>]]]]
>> 2018-06-19 21:35:00,270 [container-3] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Deploy request:
>> [OperatorDeployInfo[id=3,name=process,type=GENERIC,checkpoint={ffffffffffffffff,
>> 0,
>> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=<null>,partitionMask=1,partitionKeys=[1]]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=localhost]]]]
>> 2018-06-19 21:35:00,273 [container-0] INFO  engine.WindowGenerator activate
>> - Catching up from 1529458499500 to 1529458500273
>> 2018-06-19 21:35:00,274 [ProcessWideEventLoop] INFO  server.Server
>> onMessage - Received publisher request: PublishRequestTuple{version=1.0,
>> identifier=1.out.1, windowId=ffffffffffffffff}
>> 2018-06-19 21:35:00,276 [ProcessWideEventLoop] INFO  server.Server
>> onMessage - Received publisher request: PublishRequestTuple{version=1.0,
>> identifier=3.output.1, windowId=ffffffffffffffff}
>> 2018-06-19 21:35:00,277 [ProcessWideEventLoop] INFO  server.Server
>> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
>> identifier=tcp://localhost:62196/1.out.1, windowId=ffffffffffffffff,
>> type=genToProcessor/2.input, upstreamIdentifier=1.out.1, mask=1,
>> partitions=[0], bufferSize=1024}
>> 2018-06-19 21:35:00,278 [ProcessWideEventLoop] INFO  server.Server
>> onMessage - Received publisher request: PublishRequestTuple{version=1.0,
>> identifier=2.output.1, windowId=ffffffffffffffff}
>> 2018-06-19 21:35:00,278 [ProcessWideEventLoop] INFO  server.Server
>> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
>> identifier=tcp://localhost:62196/1.out.1, windowId=ffffffffffffffff,
>> type=genToProcessor/3.input, upstreamIdentifier=1.out.1, mask=1,
>> partitions=[1], bufferSize=1024}
>> 2018-06-19 21:35:00,278 [ProcessWideEventLoop] INFO  server.Server
>> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
>> identifier=tcp://localhost:62196/3.output.1, windowId=ffffffffffffffff,
>> type=ProcessorToReceiver/5.<merge#output>(3.output),
>> upstreamIdentifier=3.output.1, mask=0, partitions=null, bufferSize=1024}
>> 2018-06-19 21:35:00,278 [ProcessWideEventLoop] INFO  server.Server
>> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
>> identifier=tcp://localhost:62196/2.output.1, windowId=ffffffffffffffff,
>> type=ProcessorToReceiver/5.<merge#output>(2.output),
>> upstreamIdentifier=2.output.1, mask=0, partitions=null, bufferSize=1024}
>> 2018-06-19 21:35:01,273 [main] INFO  stram.StramLocalCluster run - Stopping
>> on exit condition
>> 2018-06-19 21:35:01,273 [container-3] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Received shutdown request type ABORT
>> 2018-06-19 21:35:01,273 [container-2] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Received shutdown request type ABORT
>> 2018-06-19 21:35:01,273 [container-2] INFO  stram.StramLocalCluster log -
>> container-2 msg: [container-2] Exiting heartbeat loop..
>> 2018-06-19 21:35:01,273 [container-0] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Received shutdown request type ABORT
>> 2018-06-19 21:35:01,274 [container-0] INFO  stram.StramLocalCluster log -
>> container-0 msg: [container-0] Exiting heartbeat loop..
>> 2018-06-19 21:35:01,273 [container-3] INFO  stram.StramLocalCluster log -
>> container-3 msg: [container-3] Exiting heartbeat loop..
>> 2018-06-19 21:35:01,273 [container-1] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Received shutdown request type ABORT
>> 2018-06-19 21:35:01,274 [container-1] INFO  stram.StramLocalCluster log -
>> container-1 msg: [container-1] Exiting heartbeat loop..
>> 2018-06-19 21:35:01,279 [container-3] INFO  stram.StramLocalCluster run -
>> Container container-3 terminating.
>> 2018-06-19 21:35:01,279 [ServerHelper-98-1] INFO  server.Server run -
>> Removing lnLogicalNode@d80a435identifier=tcp://localhost:62196/3.output.1,
>> upstream=3.output.1, group=ProcessorToReceiver/5.<merge#output>(3.output),
>> partitions=[],
>> iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@2f53b5e7
>> {da=com.datatorrent.bufferserver.internal.DataList$Block@1e2d4212{identifier=3.output.1,
>> data=1048576, readingOffset=0, writingOffset=36,
>> starting_window=5b29af4300000001, ending_window=5b29af4300000005,
>> refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
>> DataList@1684ecee[identifier=3.output.1]
>> 2018-06-19 21:35:01,285 [container-2] INFO  stram.StramLocalCluster run -
>> Container container-2 terminating.
>> 2018-06-19 21:35:01,285 [container-1] INFO  stram.StramLocalCluster run -
>> Container container-1 terminating.
>> 2018-06-19 21:35:01,286 [container-0] INFO  stram.StramLocalCluster run -
>> Container container-0 terminating.
>> 2018-06-19 21:35:01,286 [ServerHelper-98-1] INFO  server.Server run -
>> Removing lnLogicalNode@75d245a1identifier=tcp://localhost:62196/2.output.1,
>> upstream=2.output.1, group=ProcessorToReceiver/5.<merge#output>(2.output),
>> partitions=[],
>> iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@719c5cd2
>> {da=com.datatorrent.bufferserver.internal.DataList$Block@43d2338b{identifier=2.output.1,
>> data=1048576, readingOffset=0, writingOffset=36,
>> starting_window=5b29af4300000001, ending_window=5b29af4300000005,
>> refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
>> DataList@379bd431[identifier=2.output.1]
>> 2018-06-19 21:35:01,286 [ServerHelper-98-1] INFO  server.Server run -
>> Removing lnLogicalNode@54c0b0d5identifier=tcp://localhost:62196/1.out.1,
>> upstream=1.out.1, group=genToProcessor/2.input,
>> partitions=[BitVector{mask=1, bits=0}],
>> iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@649adb0
>> {da=com.datatorrent.bufferserver.internal.DataList$Block@15201b67{identifier=1.out.1,
>> data=1048576, readingOffset=0, writingOffset=36,
>> starting_window=5b29af4300000001, ending_window=5b29af4300000005,
>> refCount=3, uniqueIdentifier=0, next=null, future=null}}} from dl
>> DataList@47bc3c23[identifier=1.out.1]
>> 2018-06-19 21:35:01,286 [ServerHelper-98-1] INFO  server.Server run -
>> Removing lnLogicalNode@2422ada2identifier=tcp://localhost:62196/1.out.1,
>> upstream=1.out.1, group=genToProcessor/3.input,
>> partitions=[BitVector{mask=1, bits=1}],
>> iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@2e6f42b9
>> {da=com.datatorrent.bufferserver.internal.DataList$Block@15201b67{identifier=1.out.1,
>> data=1048576, readingOffset=0, writingOffset=36,
>> starting_window=5b29af4300000001, ending_window=5b29af4300000005,
>> refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
>> DataList@47bc3c23[identifier=1.out.1]
>> 2018-06-19 21:35:01,287 [ProcessWideEventLoop] INFO  server.Server run -
>> Server stopped listening at /0:0:0:0:0:0:0:0:62196
>> 2018-06-19 21:35:01,287 [main] INFO  stram.StramLocalCluster run -
>> Application finished.
>> 2018-06-19 21:35:01,288 [main] INFO  stram.CustomControlTupleTest testApp -
>> Control Tuples received 0 expected 1
>> 2018-06-19 21:35:01,305 [main] INFO  util.AsyncFSStorageAgent save - using
>> /Users/mbossert/testIdea/apex-core/engine/target/chkp6727909541678525259 as
>> the basepath for checkpointing.
>> 2018-06-19 21:35:01,460 [main] INFO  storage.DiskStorage <init> - using
>> /Users/mbossert/testIdea/apex-core/engine/target as the basepath for
>> spooling.
>> 2018-06-19 21:35:01,460 [ProcessWideEventLoop] INFO  server.Server
>> registered - Server started listening at /0:0:0:0:0:0:0:0:62204
>> 2018-06-19 21:35:01,461 [main] INFO  stram.StramLocalCluster run - Buffer
>> server started: localhost:62204
>> 2018-06-19 21:35:01,461 [container-0] INFO  stram.StramLocalCluster run -
>> Started container container-0
>> 2018-06-19 21:35:01,461 [container-1] INFO  stram.StramLocalCluster run -
>> Started container container-1
>> 2018-06-19 21:35:01,461 [container-0] INFO  stram.StramLocalCluster log -
>> container-0 msg: [container-0] Entering heartbeat loop..
>> 2018-06-19 21:35:01,461 [container-2] INFO  stram.StramLocalCluster run -
>> Started container container-2
>> 2018-06-19 21:35:01,461 [container-1] INFO  stram.StramLocalCluster log -
>> container-1 msg: [container-1] Entering heartbeat loop..
>> 2018-06-19 21:35:01,462 [container-2] INFO  stram.StramLocalCluster log -
>> container-2 msg: [container-2] Entering heartbeat loop..
>> 2018-06-19 21:35:02,464 [container-0] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Deploy request:
>> [OperatorDeployInfo[id=3,name=receiver,type=GENERIC,checkpoint={ffffffffffffffff,
>> 0,
>> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=ProcessorToReceiver,sourceNodeId=2,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[]]]
>> 2018-06-19 21:35:02,464 [container-1] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Deploy request:
>> [OperatorDeployInfo[id=2,name=process,type=GENERIC,checkpoint={ffffffffffffffff,
>> 0,
>> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=localhost]]]]
>> 2018-06-19 21:35:02,464 [container-2] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Deploy request:
>> [OperatorDeployInfo[id=1,name=randomGenerator,type=INPUT,checkpoint={ffffffffffffffff,
>> 0,
>> 0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=genToProcessor,bufferServer=localhost]]]]
>> 2018-06-19 21:35:02,467 [container-2] INFO  engine.WindowGenerator activate
>> - Catching up from 1529458501500 to 1529458502467
>> 2018-06-19 21:35:02,469 [ProcessWideEventLoop] INFO  server.Server
>> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
>> identifier=tcp://localhost:62204/2.output.1, windowId=ffffffffffffffff,
>> type=ProcessorToReceiver/3.input, upstreamIdentifier=2.output.1, mask=0,
>> partitions=null, bufferSize=1024}
>> 2018-06-19 21:35:02,469 [ProcessWideEventLoop] INFO  server.Server
>> onMessage - Received publisher request: PublishRequestTuple{version=1.0,
>> identifier=1.out.1, windowId=ffffffffffffffff}
>> 2018-06-19 21:35:02,470 [ProcessWideEventLoop] INFO  server.Server
>> onMessage - Received publisher request: PublishRequestTuple{version=1.0,
>> identifier=2.output.1, windowId=ffffffffffffffff}
>> 2018-06-19 21:35:02,470 [ProcessWideEventLoop] INFO  server.Server
>> onMessage - Received subscriber request: SubscribeRequestTuple{version=1.0,
>> identifier=tcp://localhost:62204/1.out.1, windowId=ffffffffffffffff,
>> type=genToProcessor/2.input, upstreamIdentifier=1.out.1, mask=0,
>> partitions=null, bufferSize=1024}
>> 2018-06-19 21:35:03,463 [main] INFO  stram.StramLocalCluster run - Stopping
>> on exit condition
>> 2018-06-19 21:35:03,463 [container-1] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Received shutdown request type ABORT
>> 2018-06-19 21:35:03,463 [container-2] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Received shutdown request type ABORT
>> 2018-06-19 21:35:03,464 [container-2] INFO  stram.StramLocalCluster log -
>> container-2 msg: [container-2] Exiting heartbeat loop..
>> 2018-06-19 21:35:03,463 [container-1] INFO  stram.StramLocalCluster log -
>> container-1 msg: [container-1] Exiting heartbeat loop..
>> 2018-06-19 21:35:03,463 [container-0] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Received shutdown request type ABORT
>> 2018-06-19 21:35:03,464 [container-0] INFO  stram.StramLocalCluster log -
>> container-0 msg: [container-0] Exiting heartbeat loop..
>> 2018-06-19 21:35:03,464 [container-2] INFO  stram.StramLocalCluster run -
>> Container container-2 terminating.
>> 2018-06-19 21:35:03,465 [ServerHelper-101-1] INFO  server.Server run -
>> Removing lnLogicalNode@5a90f429identifier=tcp://localhost:62204/1.out.1,
>> upstream=1.out.1, group=genToProcessor/2.input, partitions=[],
>> iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@1fa9e9ce
>> {da=com.datatorrent.bufferserver.internal.DataList$Block@6b00c947{identifier=1.out.1,
>> data=1048576, readingOffset=0, writingOffset=481,
>> starting_window=5b29af4500000001, ending_window=5b29af4500000005,
>> refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
>> DataList@67d38a09[identifier=1.out.1]
>> 2018-06-19 21:35:03,470 [container-1] INFO  stram.StramLocalCluster run -
>> Container container-1 terminating.
>> 2018-06-19 21:35:03,470 [container-0] INFO  stram.StramLocalCluster run -
>> Container container-0 terminating.
>> 2018-06-19 21:35:03,471 [ServerHelper-101-1] INFO  server.Server run -
>> Removing lnLogicalNode@1badfe12identifier=tcp://localhost:62204/2.output.1,
>> upstream=2.output.1, group=ProcessorToReceiver/3.input, partitions=[],
>> iterator=com.datatorrent.bufferserver.internal.DataList$DataListIterator@3abf0b66
>> {da=com.datatorrent.bufferserver.internal.DataList$Block@6a887266{identifier=2.output.1,
>> data=1048576, readingOffset=0, writingOffset=481,
>> starting_window=5b29af4500000001, ending_window=5b29af4500000005,
>> refCount=2, uniqueIdentifier=0, next=null, future=null}}} from dl
>> DataList@7afd481[identifier=2.output.1]
>> 2018-06-19 21:35:03,472 [ProcessWideEventLoop] INFO  server.Server run -
>> Server stopped listening at /0:0:0:0:0:0:0:0:62204
>> 2018-06-19 21:35:03,472 [main] INFO  stram.StramLocalCluster run -
>> Application finished.
>> 2018-06-19 21:35:03,472 [main] INFO  stram.CustomControlTupleTest testApp -
>> Control Tuples received 3 expected 3
>> 2018-06-19 21:35:03,489 [main] INFO  util.AsyncFSStorageAgent save - using
>> /Users/mbossert/testIdea/apex-core/engine/target/chkp1123378605276624191 as
>> the basepath for checkpointing.
>> 2018-06-19 21:35:03,633 [main] INFO  storage.DiskStorage <init> - using
>> /Users/mbossert/testIdea/apex-core/engine/target as the basepath for
>> spooling.
>> 2018-06-19 21:35:03,633 [ProcessWideEventLoop] INFO  server.Server
>> registered - Server started listening at /0:0:0:0:0:0:0:0:62209
>> 2018-06-19 21:35:03,633 [main] INFO  stram.StramLocalCluster run - Buffer
>> server started: localhost:62209
>> 2018-06-19 21:35:03,634 [container-0] INFO  stram.StramLocalCluster run -
>> Started container container-0
>> 2018-06-19 21:35:03,634 [container-0] INFO  stram.StramLocalCluster log -
>> container-0 msg: [container-0] Entering heartbeat loop..
>> 2018-06-19 21:35:04,641 [container-0] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Deploy request:
>> [OperatorDeployInfo[id=2,name=process,type=GENERIC,checkpoint={ffffffffffffffff,
>> 0,
>> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=CONTAINER_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=output,streamId=ProcessorToReceiver,bufferServer=<null>]]],
>> OperatorDeployInfo[id=1,name=randomGenerator,type=INPUT,checkpoint={ffffffffffffffff,
>> 0,
>> 0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=genToProcessor,bufferServer=<null>]]],
>> OperatorDeployInfo[id=3,name=receiver,type=GENERIC,checkpoint={ffffffffffffffff,
>> 0,
>> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=ProcessorToReceiver,sourceNodeId=2,sourcePortName=output,locality=CONTAINER_LOCAL,partitionMask=0,partitionKeys=<null>]],outputs=[]]]
>> 2018-06-19 21:35:04,643 [container-0] INFO  engine.WindowGenerator activate
>> - Catching up from 1529458503500 to 1529458504643
>> 2018-06-19 21:35:05,640 [main] INFO  stram.StramLocalCluster run - Stopping
>> on exit condition
>> 2018-06-19 21:35:05,641 [container-0] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Received shutdown request type ABORT
>> 2018-06-19 21:35:05,641 [container-0] INFO  stram.StramLocalCluster log -
>> container-0 msg: [container-0] Exiting heartbeat loop..
>> 2018-06-19 21:35:05,653 [container-0] INFO  stram.StramLocalCluster run -
>> Container container-0 terminating.
>> 2018-06-19 21:35:05,655 [ProcessWideEventLoop] INFO  server.Server run -
>> Server stopped listening at /0:0:0:0:0:0:0:0:62209
>> 2018-06-19 21:35:05,655 [main] INFO  stram.StramLocalCluster run -
>> Application finished.
>> 2018-06-19 21:35:05,655 [main] INFO  stram.CustomControlTupleTest testApp -
>> Control Tuples received 3 expected 3
>> 2018-06-19 21:35:05,672 [main] INFO  util.AsyncFSStorageAgent save - using
>> /Users/mbossert/testIdea/apex-core/engine/target/chkp9044425874557598001 as
>> the basepath for checkpointing.
>> 2018-06-19 21:35:05,819 [main] INFO  storage.DiskStorage <init> - using
>> /Users/mbossert/testIdea/apex-core/engine/target as the basepath for
>> spooling.
>> 2018-06-19 21:35:05,819 [ProcessWideEventLoop] INFO  server.Server
>> registered - Server started listening at /0:0:0:0:0:0:0:0:62211
>> 2018-06-19 21:35:05,819 [main] INFO  stram.StramLocalCluster run - Buffer
>> server started: localhost:62211
>> 2018-06-19 21:35:05,819 [container-0] INFO  stram.StramLocalCluster run -
>> Started container container-0
>> 2018-06-19 21:35:05,819 [container-1] INFO  stram.StramLocalCluster run -
>> Started container container-1
>> 2018-06-19 21:35:05,820 [container-0] INFO  stram.StramLocalCluster log -
>> container-0 msg: [container-0] Entering heartbeat loop..
>> 2018-06-19 21:35:05,820 [container-1] INFO  stram.StramLocalCluster log -
>> container-1 msg: [container-1] Entering heartbeat loop..
>> 2018-06-19 21:35:05,820 [container-2] INFO  stram.StramLocalCluster run -
>> Started container container-2
>> 2018-06-19 21:35:05,820 [container-2] INFO  stram.StramLocalCluster log -
>> container-2 msg: [container-2] Entering heartbeat loop..
>> 2018-06-19 21:35:06,826 [container-1] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Deploy request:
>> [OperatorDeployInfo[id=3,name=receiver,type=GENERIC,checkpoint={ffffffffffffffff,
>> 0,
>> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=ProcessorToReceiver,sourceNodeId=2,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[]]]
>> 2018-06-19 21:35:06,826 [container-2] INFO  engine.StreamingContainer
>> processHeartbeatResponse - Deploy request:
>> [OperatorDeployInfo[id=2,name=process,type=GENERIC,checkpoint={ffffffffffffffff,
>> 0,
>> 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=genToProcessor,sourceNodeId=1,sourcePortName=out,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=