See
<https://builds.apache.org/job/beam_PostCommit_Python35_VR_Flink/1159/display/redirect>
Changes:
------------------------------------------
[...truncated 26.60 MB...]
at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$WrappedSdkHarnessClient.$closeResource(DefaultJobBundleFactory.java:478)
at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$WrappedSdkHarnessClient.close(DefaultJobBundleFactory.java:478)
at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$WrappedSdkHarnessClient.unref(DefaultJobBundleFactory.java:493)
at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$WrappedSdkHarnessClient.access$1600(DefaultJobBundleFactory.java:431)
at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory.lambda$createEnvironmentCaches$3(DefaultJobBundleFactory.java:168)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.processPendingNotifications(LocalCache.java:1809)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.runUnlockedCleanup(LocalCache.java:3462)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.postWriteCleanup(LocalCache.java:3438)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.clear(LocalCache.java:3215)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.clear(LocalCache.java:4270)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalManualCache.invalidateAll(LocalCache.java:4909)
at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory.close(DefaultJobBundleFactory.java:258)
at org.apache.beam.runners.fnexecution.control.DefaultExecutableStageContext.close(DefaultExecutableStageContext.java:43)
at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory$WrappedContext.closeActual(ReferenceCountingExecutableStageContextFactory.java:208)
at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory$WrappedContext.access$200(ReferenceCountingExecutableStageContextFactory.java:184)
at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory.release(ReferenceCountingExecutableStageContextFactory.java:173)
at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory.scheduleRelease(ReferenceCountingExecutableStageContextFactory.java:132)
at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory.access$300(ReferenceCountingExecutableStageContextFactory.java:44)
at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory$WrappedContext.close(ReferenceCountingExecutableStageContextFactory.java:204)
at org.apache.beam.runners.flink.translation.wrappers.streaming.ExecutableStageDoFnOperator.$closeResource(ExecutableStageDoFnOperator.java:455)
at org.apache.beam.runners.flink.translation.wrappers.streaming.ExecutableStageDoFnOperator.dispose(ExecutableStageDoFnOperator.java:483)
at org.apache.flink.streaming.runtime.tasks.StreamTask.tryDisposeAllOperators(StreamTask.java:562)
at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:443)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:705)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:530)
at java.lang.Thread.run(Thread.java:748)
[GroupByKey -> [5]{Map(<lambda at fn_api_runner_test.py:607>), assert_that}
(2/2)] INFO org.apache.flink.runtime.state.heap.HeapKeyedStateBackend -
Initializing heap keyed state backend with stream factory.
[assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (1/2)]
INFO org.apache.flink.runtime.state.heap.HeapKeyedStateBackend - Initializing
heap keyed state backend with stream factory.
[assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (2/2)]
INFO org.apache.flink.runtime.state.heap.HeapKeyedStateBackend - Initializing
heap keyed state backend with stream factory.
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel
for localhost:43055.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with
unbounded number of workers.
[grpc-default-executor-2] INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService -
Beam Fn Control client connected with id 32-1
[[4]assert_that/{Create, Group} (1/2)] INFO
org.apache.flink.runtime.taskmanager.Task - [4]assert_that/{Create, Group}
(1/2) (48af58c1612576e254573776c7967d70) switched from RUNNING to FINISHED.
[[4]assert_that/{Create, Group} (1/2)] INFO
org.apache.flink.runtime.taskmanager.Task - Freeing task resources for
[4]assert_that/{Create, Group} (1/2) (48af58c1612576e254573776c7967d70).
[[4]assert_that/{Create, Group} (1/2)] INFO
org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are
closed for task [4]assert_that/{Create, Group} (1/2)
(48af58c1612576e254573776c7967d70) [FINISHED]
[flink-akka.actor.default-dispatcher-5] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and
sending final execution state FINISHED to JobManager for task
[4]assert_that/{Create, Group} 48af58c1612576e254573776c7967d70.
[[1]Create/FlatMap(<lambda at core.py:2646>) (1/2)] INFO
org.apache.flink.runtime.taskmanager.Task - [1]Create/FlatMap(<lambda at
core.py:2646>) (1/2) (23a3322a96e5e81d6faddc3ca66a7076) switched from RUNNING
to FINISHED.
[[1]Create/FlatMap(<lambda at core.py:2646>) (1/2)] INFO
org.apache.flink.runtime.taskmanager.Task - Freeing task resources for
[1]Create/FlatMap(<lambda at core.py:2646>) (1/2)
(23a3322a96e5e81d6faddc3ca66a7076).
[[1]Create/FlatMap(<lambda at core.py:2646>) (1/2)] INFO
org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are
closed for task [1]Create/FlatMap(<lambda at core.py:2646>) (1/2)
(23a3322a96e5e81d6faddc3ca66a7076) [FINISHED]
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and
sending final execution state FINISHED to JobManager for task
[1]Create/FlatMap(<lambda at core.py:2646>)
23a3322a96e5e81d6faddc3ca66a7076.
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:32783.
[flink-akka.actor.default-dispatcher-5] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph -
[4]assert_that/{Create, Group} (1/2) (48af58c1612576e254573776c7967d70)
switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph -
[1]Create/FlatMap(<lambda at core.py:2646>) (1/2)
(23a3322a96e5e81d6faddc3ca66a7076) switched from RUNNING to FINISHED.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for
localhost:43081
[grpc-default-executor-2] INFO
org.apache.beam.runners.fnexecution.data.GrpcDataService - Beam Fn Data client
connected.
[[1]Create/FlatMap(<lambda at core.py:2646>) (2/2)] INFO
org.apache.flink.runtime.taskmanager.Task - [1]Create/FlatMap(<lambda at
core.py:2646>) (2/2) (ceaf3b36b47406e7b3a5fd2103e522de) switched from RUNNING
to FINISHED.
[[1]Create/FlatMap(<lambda at core.py:2646>) (2/2)] INFO
org.apache.flink.runtime.taskmanager.Task - Freeing task resources for
[1]Create/FlatMap(<lambda at core.py:2646>) (2/2)
(ceaf3b36b47406e7b3a5fd2103e522de).
[[1]Create/FlatMap(<lambda at core.py:2646>) (2/2)] INFO
org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are
closed for task [1]Create/FlatMap(<lambda at core.py:2646>) (2/2)
(ceaf3b36b47406e7b3a5fd2103e522de) [FINISHED]
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and
sending final execution state FINISHED to JobManager for task
[1]Create/FlatMap(<lambda at core.py:2646>) ceaf3b36b47406e7b3a5fd2103e522de.
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph -
[1]Create/FlatMap(<lambda at core.py:2646>) (2/2)
(ceaf3b36b47406e7b3a5fd2103e522de) switched from RUNNING to FINISHED.
[[4]assert_that/{Create, Group} (2/2)] INFO
org.apache.flink.runtime.taskmanager.Task - [4]assert_that/{Create, Group}
(2/2) (e4276e41968333b370214b251fb4893c) switched from RUNNING to FINISHED.
[[4]assert_that/{Create, Group} (2/2)] INFO
org.apache.flink.runtime.taskmanager.Task - Freeing task resources for
[4]assert_that/{Create, Group} (2/2) (e4276e41968333b370214b251fb4893c).
[[4]assert_that/{Create, Group} (2/2)] INFO
org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are
closed for task [4]assert_that/{Create, Group} (2/2)
(e4276e41968333b370214b251fb4893c) [FINISHED]
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and
sending final execution state FINISHED to JobManager for task
[4]assert_that/{Create, Group} e4276e41968333b370214b251fb4893c.
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph -
[4]assert_that/{Create, Group} (2/2) (e4276e41968333b370214b251fb4893c)
switched from RUNNING to FINISHED.
[[3]{Create, Map(<lambda at fn_api_runner_test.py:604>),
WindowInto(WindowIntoFn)} -> ToKeyedWorkItem (1/2)] INFO
org.apache.flink.runtime.taskmanager.Task - [3]{Create, Map(<lambda at
fn_api_runner_test.py:604>), WindowInto(WindowIntoFn)} -> ToKeyedWorkItem (1/2)
(033c61b1e95ee6df9de16de88b313dd4) switched from RUNNING to FINISHED.
[[3]{Create, Map(<lambda at fn_api_runner_test.py:604>),
WindowInto(WindowIntoFn)} -> ToKeyedWorkItem (1/2)] INFO
org.apache.flink.runtime.taskmanager.Task - Freeing task resources for
[3]{Create, Map(<lambda at fn_api_runner_test.py:604>),
WindowInto(WindowIntoFn)} -> ToKeyedWorkItem (1/2)
(033c61b1e95ee6df9de16de88b313dd4).
[[3]{Create, Map(<lambda at fn_api_runner_test.py:604>),
WindowInto(WindowIntoFn)} -> ToKeyedWorkItem (1/2)] INFO
org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are
closed for task [3]{Create, Map(<lambda at fn_api_runner_test.py:604>),
WindowInto(WindowIntoFn)} -> ToKeyedWorkItem (1/2)
(033c61b1e95ee6df9de16de88b313dd4) [FINISHED]
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and
sending final execution state FINISHED to JobManager for task [3]{Create,
Map(<lambda at fn_api_runner_test.py:604>), WindowInto(WindowIntoFn)} ->
ToKeyedWorkItem 033c61b1e95ee6df9de16de88b313dd4.
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph - [3]{Create,
Map(<lambda at fn_api_runner_test.py:604>), WindowInto(WindowIntoFn)} ->
ToKeyedWorkItem (1/2) (033c61b1e95ee6df9de16de88b313dd4) switched from RUNNING
to FINISHED.
[[3]{Create, Map(<lambda at fn_api_runner_test.py:604>),
WindowInto(WindowIntoFn)} -> ToKeyedWorkItem (2/2)] INFO
org.apache.flink.runtime.taskmanager.Task - [3]{Create, Map(<lambda at
fn_api_runner_test.py:604>), WindowInto(WindowIntoFn)} -> ToKeyedWorkItem (2/2)
(ad46ddff0d71bf579c231ede77bd9ce0) switched from RUNNING to FINISHED.
[[3]{Create, Map(<lambda at fn_api_runner_test.py:604>),
WindowInto(WindowIntoFn)} -> ToKeyedWorkItem (2/2)] INFO
org.apache.flink.runtime.taskmanager.Task - Freeing task resources for
[3]{Create, Map(<lambda at fn_api_runner_test.py:604>),
WindowInto(WindowIntoFn)} -> ToKeyedWorkItem (2/2)
(ad46ddff0d71bf579c231ede77bd9ce0).
[[3]{Create, Map(<lambda at fn_api_runner_test.py:604>),
WindowInto(WindowIntoFn)} -> ToKeyedWorkItem (2/2)] INFO
org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are
closed for task [3]{Create, Map(<lambda at fn_api_runner_test.py:604>),
WindowInto(WindowIntoFn)} -> ToKeyedWorkItem (2/2)
(ad46ddff0d71bf579c231ede77bd9ce0) [FINISHED]
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and
sending final execution state FINISHED to JobManager for task [3]{Create,
Map(<lambda at fn_api_runner_test.py:604>), WindowInto(WindowIntoFn)} ->
ToKeyedWorkItem ad46ddff0d71bf579c231ede77bd9ce0.
[GroupByKey -> [5]{Map(<lambda at fn_api_runner_test.py:607>), assert_that}
(1/2)] INFO org.apache.flink.runtime.taskmanager.Task - GroupByKey ->
[5]{Map(<lambda at fn_api_runner_test.py:607>), assert_that} (1/2)
(a7d2b7118ad3b4a2fbf38300556f7a2b) switched from RUNNING to FINISHED.
[GroupByKey -> [5]{Map(<lambda at fn_api_runner_test.py:607>), assert_that}
(1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources
for GroupByKey -> [5]{Map(<lambda at fn_api_runner_test.py:607>), assert_that}
(1/2) (a7d2b7118ad3b4a2fbf38300556f7a2b).
[GroupByKey -> [5]{Map(<lambda at fn_api_runner_test.py:607>), assert_that}
(1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem
streams are closed for task GroupByKey -> [5]{Map(<lambda at
fn_api_runner_test.py:607>), assert_that} (1/2)
(a7d2b7118ad3b4a2fbf38300556f7a2b) [FINISHED]
[flink-akka.actor.default-dispatcher-5] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and
sending final execution state FINISHED to JobManager for task GroupByKey ->
[5]{Map(<lambda at fn_api_runner_test.py:607>), assert_that}
a7d2b7118ad3b4a2fbf38300556f7a2b.
[ToKeyedWorkItem (1/2)] INFO org.apache.flink.runtime.taskmanager.Task -
ToKeyedWorkItem (1/2) (98f56b48cfaee320febc5d4787ec245b) switched from RUNNING
to FINISHED.
[ToKeyedWorkItem (1/2)] INFO org.apache.flink.runtime.taskmanager.Task -
Freeing task resources for ToKeyedWorkItem (1/2)
(98f56b48cfaee320febc5d4787ec245b).
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph - [3]{Create,
Map(<lambda at fn_api_runner_test.py:604>), WindowInto(WindowIntoFn)} ->
ToKeyedWorkItem (2/2) (ad46ddff0d71bf579c231ede77bd9ce0) switched from RUNNING
to FINISHED.
[ToKeyedWorkItem (1/2)] INFO org.apache.flink.runtime.taskmanager.Task -
Ensuring all FileSystem streams are closed for task ToKeyedWorkItem (1/2)
(98f56b48cfaee320febc5d4787ec245b) [FINISHED]
[flink-akka.actor.default-dispatcher-5] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and
sending final execution state FINISHED to JobManager for task ToKeyedWorkItem
98f56b48cfaee320febc5d4787ec245b.
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupByKey ->
[5]{Map(<lambda at fn_api_runner_test.py:607>), assert_that} (1/2)
(a7d2b7118ad3b4a2fbf38300556f7a2b) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph - ToKeyedWorkItem (1/2)
(98f56b48cfaee320febc5d4787ec245b) switched from RUNNING to FINISHED.
[GroupByKey -> [5]{Map(<lambda at fn_api_runner_test.py:607>), assert_that}
(2/2)] INFO org.apache.flink.runtime.taskmanager.Task - GroupByKey ->
[5]{Map(<lambda at fn_api_runner_test.py:607>), assert_that} (2/2)
(3e9780f77f4ef49c3f82b2811dea1f03) switched from RUNNING to FINISHED.
[GroupByKey -> [5]{Map(<lambda at fn_api_runner_test.py:607>), assert_that}
(2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources
for GroupByKey -> [5]{Map(<lambda at fn_api_runner_test.py:607>), assert_that}
(2/2) (3e9780f77f4ef49c3f82b2811dea1f03).
[GroupByKey -> [5]{Map(<lambda at fn_api_runner_test.py:607>), assert_that}
(2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem
streams are closed for task GroupByKey -> [5]{Map(<lambda at
fn_api_runner_test.py:607>), assert_that} (2/2)
(3e9780f77f4ef49c3f82b2811dea1f03) [FINISHED]
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and
sending final execution state FINISHED to JobManager for task GroupByKey ->
[5]{Map(<lambda at fn_api_runner_test.py:607>), assert_that}
3e9780f77f4ef49c3f82b2811dea1f03.
[ToKeyedWorkItem (2/2)] INFO org.apache.flink.runtime.taskmanager.Task -
ToKeyedWorkItem (2/2) (500d9fbb1307568fdaf78d6effe44348) switched from RUNNING
to FINISHED.
[ToKeyedWorkItem (2/2)] INFO org.apache.flink.runtime.taskmanager.Task -
Freeing task resources for ToKeyedWorkItem (2/2)
(500d9fbb1307568fdaf78d6effe44348).
[ToKeyedWorkItem (2/2)] INFO org.apache.flink.runtime.taskmanager.Task -
Ensuring all FileSystem streams are closed for task ToKeyedWorkItem (2/2)
(500d9fbb1307568fdaf78d6effe44348) [FINISHED]
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and
sending final execution state FINISHED to JobManager for task ToKeyedWorkItem
500d9fbb1307568fdaf78d6effe44348.
[assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (2/2)]
INFO org.apache.flink.runtime.taskmanager.Task - assert_that/Group/GroupByKey
-> [3]assert_that/{Group, Unkey, Match} (2/2)
(1317f09363f6e44faeec39c5c31fb585) switched from RUNNING to FINISHED.
[assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (2/2)]
INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for
assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (2/2)
(1317f09363f6e44faeec39c5c31fb585).
[assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (2/2)]
INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem
streams are closed for task assert_that/Group/GroupByKey ->
[3]assert_that/{Group, Unkey, Match} (2/2) (1317f09363f6e44faeec39c5c31fb585)
[FINISHED]
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and
sending final execution state FINISHED to JobManager for task
assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match}
1317f09363f6e44faeec39c5c31fb585.
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupByKey ->
[5]{Map(<lambda at fn_api_runner_test.py:607>), assert_that} (2/2)
(3e9780f77f4ef49c3f82b2811dea1f03) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph - ToKeyedWorkItem (2/2)
(500d9fbb1307568fdaf78d6effe44348) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph -
assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (2/2)
(1317f09363f6e44faeec39c5c31fb585) switched from RUNNING to FINISHED.
[assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (1/2)]
INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory -
Closing environment urn: "beam:env:external:v1"
payload: "\n\021\022\017localhost:39621"
INFO:apache_beam.runners.worker.sdk_worker:No more requests from control plane
INFO:apache_beam.runners.worker.sdk_worker:SDK Harness waiting for in-flight
requests to complete
[assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (1/2)]
WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer - Hanged up for
unknown endpoint.
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data
plane.
Traceback (most recent call last):
File "<https://builds.apache.org/job/beam_PostCommit_Python35_VR_Flink/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py>", line 419, in _read_inputs
    for elements in elements_iterator:
File "<https://builds.apache.org/job/beam_PostCommit_Python35_VR_Flink/ws/src/build/gradleenv/2022703439/lib/python3.5/site-packages/grpc/_channel.py>", line 416, in __next__
    return self._next()
File "<https://builds.apache.org/job/beam_PostCommit_Python35_VR_Flink/ws/src/build/gradleenv/2022703439/lib/python3.5/site-packages/grpc/_channel.py>", line 703, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
  status = StatusCode.CANCELLED
  details = "Multiplexer hanging up"
  debug_error_string = "{"created":"@1581014859.222440410","description":"Error received from peer ipv4:127.0.0.1:43081","file":"src/core/lib/surface/call.cc","file_line":1056,"grpc_message":"Multiplexer hanging up","grpc_status":1}"
>
INFO:apache_beam.runners.worker.data_plane:Closing all cached grpc data
channels.
INFO:apache_beam.runners.worker.sdk_worker:Closing all cached gRPC state
handlers.
INFO:apache_beam.runners.worker.sdk_worker:Done consuming work.
[assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (1/2)]
INFO org.apache.flink.runtime.taskmanager.Task - assert_that/Group/GroupByKey
-> [3]assert_that/{Group, Unkey, Match} (1/2)
(9468177719fe666d74b9cc22a1a2cbc3) switched from RUNNING to FINISHED.
[assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (1/2)]
INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for
assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (1/2)
(9468177719fe666d74b9cc22a1a2cbc3).
[assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (1/2)]
INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem
streams are closed for task assert_that/Group/GroupByKey ->
[3]assert_that/{Group, Unkey, Match} (1/2) (9468177719fe666d74b9cc22a1a2cbc3)
[FINISHED]
[flink-akka.actor.default-dispatcher-5] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and
sending final execution state FINISHED to JobManager for task
assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match}
9468177719fe666d74b9cc22a1a2cbc3.
[flink-akka.actor.default-dispatcher-5] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph -
assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (1/2)
(9468177719fe666d74b9cc22a1a2cbc3) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-5] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph - Job
test_windowing_1581014857.5093632 (f72694b2da93aa8b94d8a3e66dc45ee4) switched
from state RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-5] INFO
org.apache.flink.runtime.checkpoint.CheckpointCoordinator - Stopping checkpoint
coordinator for job f72694b2da93aa8b94d8a3e66dc45ee4.
[flink-akka.actor.default-dispatcher-5] INFO
org.apache.flink.runtime.checkpoint.StandaloneCompletedCheckpointStore -
Shutting down
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Job
f72694b2da93aa8b94d8a3e66dc45ee4 reached globally terminal state FINISHED.
[flink-akka.actor.default-dispatcher-5] INFO
org.apache.flink.runtime.jobmaster.JobMaster - Stopping the JobMaster for job
test_windowing_1581014857.5093632(f72694b2da93aa8b94d8a3e66dc45ee4).
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.taskexecutor.slot.TaskSlotTable - Free slot
TaskSlot(index:0, state:ACTIVE, resource profile:
ResourceProfile{cpuCores=1.7976931348623157E308, heapMemoryInMB=2147483647,
directMemoryInMB=2147483647, nativeMemoryInMB=2147483647,
networkMemoryInMB=2147483647, managedMemoryInMB=8132}, allocationId:
2fd2d76399105b4128d17197f81e3020, jobId: f72694b2da93aa8b94d8a3e66dc45ee4).
[flink-akka.actor.default-dispatcher-5] INFO
org.apache.flink.runtime.jobmaster.slotpool.SlotPoolImpl - Suspending SlotPool.
[flink-akka.actor.default-dispatcher-5] INFO
org.apache.flink.runtime.jobmaster.JobMaster - Close ResourceManager connection
6b2c978c10a668601e0fd2cb3ca28724: JobManager is shutting down..
[flink-akka.actor.default-dispatcher-5] INFO
org.apache.flink.runtime.jobmaster.slotpool.SlotPoolImpl - Stopping SlotPool.
[flink-akka.actor.default-dispatcher-4] INFO
org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Disconnect
job manager 8b1220dea7b6858c673c611bcc764679@akka://flink/user/jobmanager_63
for job f72694b2da93aa8b94d8a3e66dc45ee4 from the resource manager.
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.taskexecutor.slot.TaskSlotTable - Free slot
TaskSlot(index:1, state:ACTIVE, resource profile:
ResourceProfile{cpuCores=1.7976931348623157E308, heapMemoryInMB=2147483647,
directMemoryInMB=2147483647, nativeMemoryInMB=2147483647,
networkMemoryInMB=2147483647, managedMemoryInMB=8132}, allocationId:
ec36978747065efad90b05c9628a1db4, jobId: f72694b2da93aa8b94d8a3e66dc45ee4).
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.taskexecutor.JobLeaderService - Remove job
f72694b2da93aa8b94d8a3e66dc45ee4 from job leader monitoring.
[mini-cluster-io-thread-14] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - JobManager for job
f72694b2da93aa8b94d8a3e66dc45ee4 with leader id
8b1220dea7b6858c673c611bcc764679 lost leadership.
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Close JobManager
connection for job f72694b2da93aa8b94d8a3e66dc45ee4.
[flink-runner-job-invoker] INFO
org.apache.flink.runtime.minicluster.MiniCluster - Shutting down Flink Mini
Cluster
[flink-runner-job-invoker] INFO
org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shutting down rest
endpoint.
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Close JobManager
connection for job f72694b2da93aa8b94d8a3e66dc45ee4.
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.taskexecutor.JobLeaderService - Cannot reconnect to
job f72694b2da93aa8b94d8a3e66dc45ee4 because it is not registered.
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Close JobManager
connection for job f72694b2da93aa8b94d8a3e66dc45ee4.
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopping TaskExecutor
akka://flink/user/taskmanager_62.
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Close ResourceManager
connection 6b2c978c10a668601e0fd2cb3ca28724.
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader
service.
[flink-akka.actor.default-dispatcher-4] INFO
org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Closing
TaskExecutor connection 09291ef6-47a2-43a8-affd-6179cf2f43dc because: The
TaskExecutor is shutting down.
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.state.TaskExecutorLocalStateStoresManager - Shutting
down TaskExecutorLocalStateStoresManager.
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.io.disk.FileChannelManagerImpl - FileChannelManager
removed spill file directory /tmp/flink-io-1be6ce75-925e-4b05-9fe9-4090bf7d6fa8
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.io.network.NettyShuffleEnvironment - Shutting down the
network environment and its components.
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.io.disk.FileChannelManagerImpl - FileChannelManager
removed spill file directory
/tmp/flink-netty-shuffle-aa9add4b-dff7-4708-944f-11b7780638ad
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.taskexecutor.KvStateService - Shutting down the
kvState service and its components.
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader
service.
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.filecache.FileCache - removed file cache directory
/tmp/flink-dist-cache-fa18c8fe-7e8e-45fc-865e-41dc398028ab
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopped TaskExecutor
akka://flink/user/taskmanager_62.
[ForkJoinPool.commonPool-worker-9] INFO
org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Removing cache
directory /tmp/flink-web-ui
[ForkJoinPool.commonPool-worker-9] INFO
org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shut down complete.
[flink-akka.actor.default-dispatcher-4] INFO
org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Shut down
cluster because application is in CANCELED, diagnostics
DispatcherResourceManagerComponent has been closed..
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping dispatcher
akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping all
currently running jobs of dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-5] INFO
org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Closing
the SlotManager.
[flink-akka.actor.default-dispatcher-5] INFO
org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl -
Suspending the SlotManager.
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.rest.handler.legacy.backpressure.StackTraceSampleCoordinator
- Shutting down stack trace sample coordinator.
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopped dispatcher
akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.beam.runners.flink.metrics.FileReporter - wrote metrics to
/tmp/flinktest-conffit9plav/test-metrics.txt
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator -
Shutting down remote daemon.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator -
Remote daemon shut down; proceeding with flushing remote transports.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator -
Remoting shut down.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService -
Stopping Akka RPC service.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService -
Stopped Akka RPC service.
[flink-akka.actor.default-dispatcher-4] INFO
org.apache.flink.runtime.blob.PermanentBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-4] INFO
org.apache.flink.runtime.blob.TransientBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-4] INFO
org.apache.flink.runtime.blob.BlobServer - Stopped BLOB server at 0.0.0.0:37381
[flink-akka.actor.default-dispatcher-4] INFO
org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-runner-job-invoker] INFO
org.apache.beam.runners.flink.FlinkPipelineRunner - Execution finished in 198
msecs
[flink-runner-job-invoker] INFO
org.apache.beam.runners.flink.FlinkPipelineRunner - Final accumulator values:
[flink-runner-job-invoker] INFO
org.apache.beam.runners.flink.FlinkPipelineRunner - __metricscontainers :
MetricQueryResults(Counters(26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_33}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(<lambda at
fn_api_runner_test.py:607>)_23}: 0,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_40}:
0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_35}: 0,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_31}: 0,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0,
14Create/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2646>)_4}:
0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_35}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_10}: 5,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_31}: 0,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at
core.py:2646>)_27}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_11}: 5,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(<lambda at
fn_api_runner_test.py:607>)_23}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_12}: 5,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_30}: 0,
14Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_30}: 0,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Match_42}: 0,
14Create/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_35}: 0,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_22}: 1,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(<lambda at
fn_api_runner_test.py:604>)_17}: 0,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_31}: 0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_35}: 0,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_30}: 1,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Match_42}: 0,
14Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0,
14Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(<lambda at
fn_api_runner_test.py:607>)_23}: 0,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_41}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_WindowInto(WindowIntoFn)_18}: 0,
14Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_40}:
0,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(<lambda at
fn_api_runner_test.py:607>)_23}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_9}: 5,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(<lambda at
fn_api_runner_test.py:604>)_17}: 0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_30}: 0,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_18}: 1,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_29}: 0,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_17}: 1,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_19}: 1,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_41}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_34}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_WindowInto(WindowIntoFn)_18}: 0,
14Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_24:0}: 1,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_31}: 0,
14Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2646>)_4}:
0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_29}: 1,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_27}: 1,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_WindowInto(WindowIntoFn)_18}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_28}: 1,
14Create/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_29}: 0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_15:0}: 0,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_29}: 0,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_29}: 0,
14Create/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_1}: 1,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_WindowInto(WindowIntoFn)_18}: 0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_15}: 2,
14Create/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_2}: 5,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_16}: 2,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_33}: 0,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_40}:
0,
14Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_41}: 0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_34}: 0,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at
core.py:2646>)_27}: 0,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_41}: 0,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_40}:
0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_24:1}: 2,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0,
14Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2646>)_4}:
0,
14Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2646>)_4}:
0,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Match_42}: 0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_35}: 0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_34}: 0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_30}: 0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_15:0}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_15:0}: 0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_34}: 0,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_33}: 0,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_35}: 0,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_35}: 0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_20}: 2,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at
core.py:2646>)_27}: 0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_23}: 2,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Match_42}: 0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_21}: 2,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(<lambda at
fn_api_runner_test.py:604>)_17}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(<lambda at
fn_api_runner_test.py:604>)_17}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_35}: 0,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at
core.py:2646>)_27}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_15:0}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_33}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}:
0)Distributions(14Create/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_1}: DistributionResult{sum=13,
count=1, min=13, max=13},
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_10}: DistributionResult{sum=70,
count=5, min=14, max=14},
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_11}: DistributionResult{sum=100,
count=5, min=20, max=20},
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=140,
count=5, min=28, max=28},
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_24:1}: DistributionResult{sum=60,
count=2, min=29, max=31},
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=70,
count=5, min=14, max=14},
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_24:0}: DistributionResult{sum=19,
count=1, min=19, max=19},
14Create/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_2}: DistributionResult{sum=70,
count=5, min=14, max=14},
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_19}: DistributionResult{sum=15,
count=1, min=15, max=15},
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_15}: DistributionResult{sum=75,
count=2, min=36, max=39},
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_18}: DistributionResult{sum=15,
count=1, min=15, max=15},
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_16}: DistributionResult{sum=66,
count=2, min=32, max=34},
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_17}: DistributionResult{sum=13,
count=1, min=13, max=13},
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_30}: DistributionResult{sum=14,
count=1, min=14, max=14},
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_28}: DistributionResult{sum=47,
count=1, min=47, max=47},
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_29}: DistributionResult{sum=39,
count=1, min=39, max=39},
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_22}: DistributionResult{sum=17,
count=1, min=17, max=17},
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=46,
count=2, min=22, max=24},
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_21}: DistributionResult{sum=48,
count=2, min=23, max=25},
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_23}: DistributionResult{sum=52,
count=2, min=25, max=27},
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_27}: DistributionResult{sum=59,
count=1, min=59, max=59}))
[flink-runner-job-invoker] WARN
org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Failed
to remove job staging directory for token
{"sessionId":"job_a92ae3dc-b52b-48f1-86d3-e616e74b15d0","basePath":"/tmp/flinktestoyyrynl8"}:
{}
java.io.FileNotFoundException: /tmp/flinktestoyyrynl8/job_a92ae3dc-b52b-48f1-86d3-e616e74b15d0/MANIFEST (No such file or directory)
at java.io.FileInputStream.open0(Native Method)
at java.io.FileInputStream.open(FileInputStream.java:195)
at java.io.FileInputStream.<init>(FileInputStream.java:138)
at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:245)
at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:246)
at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:112)
at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:98)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
File "/usr/lib/python3.5/threading.py", line 914, in _bootstrap_inner
self.run()
File "/usr/lib/python3.5/threading.py", line 862, in run
self._target(*self._args, **self._kwargs)
File "<https://builds.apache.org/job/beam_PostCommit_Python35_VR_Flink/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py>", line 434, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
File "<https://builds.apache.org/job/beam_PostCommit_Python35_VR_Flink/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py>", line 419, in _read_inputs
    for elements in elements_iterator:
File "<https://builds.apache.org/job/beam_PostCommit_Python35_VR_Flink/ws/src/build/gradleenv/2022703439/lib/python3.5/site-packages/grpc/_channel.py>", line 416, in __next__
    return self._next()
File "<https://builds.apache.org/job/beam_PostCommit_Python35_VR_Flink/ws/src/build/gradleenv/2022703439/lib/python3.5/site-packages/grpc/_channel.py>", line 703, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
  status = StatusCode.CANCELLED
  details = "Multiplexer hanging up"
  debug_error_string = "{"created":"@1581014859.222440410","description":"Error received from peer ipv4:127.0.0.1:43081","file":"src/core/lib/surface/call.cc","file_line":1056,"grpc_message":"Multiplexer hanging up","grpc_status":1}"
>
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
INFO:__main__:removing conf dir: /tmp/flinktest-conffit9plav
----------------------------------------------------------------------
Ran 78 tests in 146.302s
OK (skipped=14)
FAILURE: Build failed with an exception.
* Where:
Script '<https://builds.apache.org/job/beam_PostCommit_Python35_VR_Flink/ws/src/sdks/python/test-suites/portable/common.gradle>' line: 55
* What went wrong:
Execution failed for task
':sdks:python:test-suites:portable:py35:flinkCompatibilityMatrixBatchLOOPBACK'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 43m 41s
72 actionable tasks: 54 executed, 17 from cache, 1 up-to-date
Publishing build scan...
https://gradle.com/s/pwbkwnlnz7z5q
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]