See <https://builds.apache.org/job/beam_PreCommit_Python2_PVR_Flink_Cron/351/display/redirect>
Changes:
------------------------------------------
[...truncated 24.94 MB...]
at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:514)
at org.apache.beam.sdk.fn.channel.ManagedChannelFactory.forDescriptor(ManagedChannelFactory.java:44)
at org.apache.beam.runners.fnexecution.environment.ExternalEnvironmentFactory$1.close(ExternalEnvironmentFactory.java:155)
at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$WrappedSdkHarnessClient.$closeResource(DefaultJobBundleFactory.java:476)
at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$WrappedSdkHarnessClient.close(DefaultJobBundleFactory.java:476)
at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$WrappedSdkHarnessClient.unref(DefaultJobBundleFactory.java:491)
at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$WrappedSdkHarnessClient.access$1800(DefaultJobBundleFactory.java:431)
at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory.lambda$createEnvironmentCaches$3(DefaultJobBundleFactory.java:168)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.processPendingNotifications(LocalCache.java:1809)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.runUnlockedCleanup(LocalCache.java:3462)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.postWriteCleanup(LocalCache.java:3438)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.clear(LocalCache.java:3215)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.clear(LocalCache.java:4270)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalManualCache.invalidateAll(LocalCache.java:4909)
at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory.close(DefaultJobBundleFactory.java:258)
at org.apache.beam.runners.fnexecution.control.DefaultExecutableStageContext.close(DefaultExecutableStageContext.java:43)
at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory$WrappedContext.closeActual(ReferenceCountingExecutableStageContextFactory.java:208)
at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory$WrappedContext.access$200(ReferenceCountingExecutableStageContextFactory.java:184)
at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory.release(ReferenceCountingExecutableStageContextFactory.java:173)
at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory.scheduleRelease(ReferenceCountingExecutableStageContextFactory.java:132)
at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory.access$300(ReferenceCountingExecutableStageContextFactory.java:44)
at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory$WrappedContext.close(ReferenceCountingExecutableStageContextFactory.java:204)
at org.apache.beam.runners.flink.translation.wrappers.streaming.ExecutableStageDoFnOperator.$closeResource(ExecutableStageDoFnOperator.java:489)
at org.apache.beam.runners.flink.translation.wrappers.streaming.ExecutableStageDoFnOperator.dispose(ExecutableStageDoFnOperator.java:489)
at org.apache.flink.streaming.runtime.tasks.StreamTask.tryDisposeAllOperators(StreamTask.java:562)
at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:443)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:705)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:530)
at java.lang.Thread.run(Thread.java:748)
[assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (1/2)]
INFO org.apache.flink.runtime.state.heap.HeapKeyedStateBackend - Initializing
heap keyed state backend with stream factory.
[assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (2/2)]
INFO org.apache.flink.runtime.state.heap.HeapKeyedStateBackend - Initializing
heap keyed state backend with stream factory.
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel
for localhost:35001.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with
unbounded number of workers.
[grpc-default-executor-2] INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService -
Beam Fn Control client connected with id 31-1
[[1]Create/FlatMap(<lambda at core.py:2591>) (2/2)] INFO
org.apache.flink.runtime.taskmanager.Task - [1]Create/FlatMap(<lambda at
core.py:2591>) (2/2) (e9e23092eb1f24fb4c7a800c66cb1419) switched from RUNNING
to FINISHED.
[[1]Create/FlatMap(<lambda at core.py:2591>) (2/2)] INFO
org.apache.flink.runtime.taskmanager.Task - Freeing task resources for
[1]Create/FlatMap(<lambda at core.py:2591>) (2/2)
(e9e23092eb1f24fb4c7a800c66cb1419).
[[1]Create/FlatMap(<lambda at core.py:2591>) (2/2)] INFO
org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are
closed for task [1]Create/FlatMap(<lambda at core.py:2591>) (2/2)
(e9e23092eb1f24fb4c7a800c66cb1419) [FINISHED]
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and
sending final execution state FINISHED to JobManager for task
[1]Create/FlatMap(<lambda at core.py:2591>) e9e23092eb1f24fb4c7a800c66cb1419.
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph -
[1]Create/FlatMap(<lambda at core.py:2591>) (2/2)
(e9e23092eb1f24fb4c7a800c66cb1419) switched from RUNNING to FINISHED.
[[4]assert_that/{Create, Group} (2/2)] INFO
org.apache.flink.runtime.taskmanager.Task - [4]assert_that/{Create, Group}
(2/2) (d984c3b047928819d40fdbd226052524) switched from RUNNING to FINISHED.
[[4]assert_that/{Create, Group} (2/2)] INFO
org.apache.flink.runtime.taskmanager.Task - Freeing task resources for
[4]assert_that/{Create, Group} (2/2) (d984c3b047928819d40fdbd226052524).
[[4]assert_that/{Create, Group} (2/2)] INFO
org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are
closed for task [4]assert_that/{Create, Group} (2/2)
(d984c3b047928819d40fdbd226052524) [FINISHED]
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and
sending final execution state FINISHED to JobManager for task
[4]assert_that/{Create, Group} d984c3b047928819d40fdbd226052524.
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph -
[4]assert_that/{Create, Group} (2/2) (d984c3b047928819d40fdbd226052524)
switched from RUNNING to FINISHED.
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for
localhost:33199.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for
localhost:38201
[grpc-default-executor-2] INFO
org.apache.beam.runners.fnexecution.data.GrpcDataService - Beam Fn Data client
connected.
[[1]Create/FlatMap(<lambda at core.py:2591>) (1/2)] INFO
org.apache.flink.runtime.taskmanager.Task - [1]Create/FlatMap(<lambda at
core.py:2591>) (1/2) (f97c09e0051cac4c269ffb6a8d08ef5b) switched from RUNNING
to FINISHED.
[[1]Create/FlatMap(<lambda at core.py:2591>) (1/2)] INFO
org.apache.flink.runtime.taskmanager.Task - Freeing task resources for
[1]Create/FlatMap(<lambda at core.py:2591>) (1/2)
(f97c09e0051cac4c269ffb6a8d08ef5b).
[[1]Create/FlatMap(<lambda at core.py:2591>) (1/2)] INFO
org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are
closed for task [1]Create/FlatMap(<lambda at core.py:2591>) (1/2)
(f97c09e0051cac4c269ffb6a8d08ef5b) [FINISHED]
[flink-akka.actor.default-dispatcher-5] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and
sending final execution state FINISHED to JobManager for task
[1]Create/FlatMap(<lambda at core.py:2591>) f97c09e0051cac4c269ffb6a8d08ef5b.
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph -
[1]Create/FlatMap(<lambda at core.py:2591>) (1/2)
(f97c09e0051cac4c269ffb6a8d08ef5b) switched from RUNNING to FINISHED.
[[4]assert_that/{Create, Group} (1/2)] INFO
org.apache.flink.runtime.taskmanager.Task - [4]assert_that/{Create, Group}
(1/2) (3ad278f50070c100233ffd10b0d674f5) switched from RUNNING to FINISHED.
[[4]assert_that/{Create, Group} (1/2)] INFO
org.apache.flink.runtime.taskmanager.Task - Freeing task resources for
[4]assert_that/{Create, Group} (1/2) (3ad278f50070c100233ffd10b0d674f5).
[[4]assert_that/{Create, Group} (1/2)] INFO
org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are
closed for task [4]assert_that/{Create, Group} (1/2)
(3ad278f50070c100233ffd10b0d674f5) [FINISHED]
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and
sending final execution state FINISHED to JobManager for task
[4]assert_that/{Create, Group} 3ad278f50070c100233ffd10b0d674f5.
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph -
[4]assert_that/{Create, Group} (1/2) (3ad278f50070c100233ffd10b0d674f5)
switched from RUNNING to FINISHED.
[[3]{Create, Map(<lambda at fn_api_runner_test.py:591>),
WindowInto(WindowIntoFn)} -> ToKeyedWorkItem (2/2)] INFO
org.apache.flink.runtime.taskmanager.Task - [3]{Create, Map(<lambda at
fn_api_runner_test.py:591>), WindowInto(WindowIntoFn)} -> ToKeyedWorkItem (2/2)
(f40bdadd6646bd594f46b364ad287aeb) switched from RUNNING to FINISHED.
[[3]{Create, Map(<lambda at fn_api_runner_test.py:591>),
WindowInto(WindowIntoFn)} -> ToKeyedWorkItem (2/2)] INFO
org.apache.flink.runtime.taskmanager.Task - Freeing task resources for
[3]{Create, Map(<lambda at fn_api_runner_test.py:591>),
WindowInto(WindowIntoFn)} -> ToKeyedWorkItem (2/2)
(f40bdadd6646bd594f46b364ad287aeb).
[[3]{Create, Map(<lambda at fn_api_runner_test.py:591>),
WindowInto(WindowIntoFn)} -> ToKeyedWorkItem (2/2)] INFO
org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are
closed for task [3]{Create, Map(<lambda at fn_api_runner_test.py:591>),
WindowInto(WindowIntoFn)} -> ToKeyedWorkItem (2/2)
(f40bdadd6646bd594f46b364ad287aeb) [FINISHED]
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and
sending final execution state FINISHED to JobManager for task [3]{Create,
Map(<lambda at fn_api_runner_test.py:591>), WindowInto(WindowIntoFn)} ->
ToKeyedWorkItem f40bdadd6646bd594f46b364ad287aeb.
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph - [3]{Create,
Map(<lambda at fn_api_runner_test.py:591>), WindowInto(WindowIntoFn)} ->
ToKeyedWorkItem (2/2) (f40bdadd6646bd594f46b364ad287aeb) switched from RUNNING
to FINISHED.
[[3]{Create, Map(<lambda at fn_api_runner_test.py:591>),
WindowInto(WindowIntoFn)} -> ToKeyedWorkItem (1/2)] INFO
org.apache.flink.runtime.taskmanager.Task - [3]{Create, Map(<lambda at
fn_api_runner_test.py:591>), WindowInto(WindowIntoFn)} -> ToKeyedWorkItem (1/2)
(1441df9c476a1ccc176bb1d79ecb8f10) switched from RUNNING to FINISHED.
[[3]{Create, Map(<lambda at fn_api_runner_test.py:591>),
WindowInto(WindowIntoFn)} -> ToKeyedWorkItem (1/2)] INFO
org.apache.flink.runtime.taskmanager.Task - Freeing task resources for
[3]{Create, Map(<lambda at fn_api_runner_test.py:591>),
WindowInto(WindowIntoFn)} -> ToKeyedWorkItem (1/2)
(1441df9c476a1ccc176bb1d79ecb8f10).
[[3]{Create, Map(<lambda at fn_api_runner_test.py:591>),
WindowInto(WindowIntoFn)} -> ToKeyedWorkItem (1/2)] INFO
org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are
closed for task [3]{Create, Map(<lambda at fn_api_runner_test.py:591>),
WindowInto(WindowIntoFn)} -> ToKeyedWorkItem (1/2)
(1441df9c476a1ccc176bb1d79ecb8f10) [FINISHED]
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and
sending final execution state FINISHED to JobManager for task [3]{Create,
Map(<lambda at fn_api_runner_test.py:591>), WindowInto(WindowIntoFn)} ->
ToKeyedWorkItem 1441df9c476a1ccc176bb1d79ecb8f10.
[GroupByKey -> [5]{Map(<lambda at fn_api_runner_test.py:594>), assert_that}
(2/2)] INFO org.apache.flink.runtime.taskmanager.Task - GroupByKey ->
[5]{Map(<lambda at fn_api_runner_test.py:594>), assert_that} (2/2)
(bbddcd1deba7d117077193d19f76bba0) switched from RUNNING to FINISHED.
[GroupByKey -> [5]{Map(<lambda at fn_api_runner_test.py:594>), assert_that}
(2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources
for GroupByKey -> [5]{Map(<lambda at fn_api_runner_test.py:594>), assert_that}
(2/2) (bbddcd1deba7d117077193d19f76bba0).
[GroupByKey -> [5]{Map(<lambda at fn_api_runner_test.py:594>), assert_that}
(2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem
streams are closed for task GroupByKey -> [5]{Map(<lambda at
fn_api_runner_test.py:594>), assert_that} (2/2)
(bbddcd1deba7d117077193d19f76bba0) [FINISHED]
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and
sending final execution state FINISHED to JobManager for task GroupByKey ->
[5]{Map(<lambda at fn_api_runner_test.py:594>), assert_that}
bbddcd1deba7d117077193d19f76bba0.
[ToKeyedWorkItem (2/2)] INFO org.apache.flink.runtime.taskmanager.Task -
ToKeyedWorkItem (2/2) (0229432aee74cd86e54de8287ef89ba6) switched from RUNNING
to FINISHED.
[ToKeyedWorkItem (2/2)] INFO org.apache.flink.runtime.taskmanager.Task -
Freeing task resources for ToKeyedWorkItem (2/2)
(0229432aee74cd86e54de8287ef89ba6).
[ToKeyedWorkItem (2/2)] INFO org.apache.flink.runtime.taskmanager.Task -
Ensuring all FileSystem streams are closed for task ToKeyedWorkItem (2/2)
(0229432aee74cd86e54de8287ef89ba6) [FINISHED]
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and
sending final execution state FINISHED to JobManager for task ToKeyedWorkItem
0229432aee74cd86e54de8287ef89ba6.
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph - [3]{Create,
Map(<lambda at fn_api_runner_test.py:591>), WindowInto(WindowIntoFn)} ->
ToKeyedWorkItem (1/2) (1441df9c476a1ccc176bb1d79ecb8f10) switched from RUNNING
to FINISHED.
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupByKey ->
[5]{Map(<lambda at fn_api_runner_test.py:594>), assert_that} (2/2)
(bbddcd1deba7d117077193d19f76bba0) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph - ToKeyedWorkItem (2/2)
(0229432aee74cd86e54de8287ef89ba6) switched from RUNNING to FINISHED.
[GroupByKey -> [5]{Map(<lambda at fn_api_runner_test.py:594>), assert_that}
(1/2)] INFO org.apache.flink.runtime.taskmanager.Task - GroupByKey ->
[5]{Map(<lambda at fn_api_runner_test.py:594>), assert_that} (1/2)
(9edbd53a9ab38cf11650d7ef3382002a) switched from RUNNING to FINISHED.
[GroupByKey -> [5]{Map(<lambda at fn_api_runner_test.py:594>), assert_that}
(1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources
for GroupByKey -> [5]{Map(<lambda at fn_api_runner_test.py:594>), assert_that}
(1/2) (9edbd53a9ab38cf11650d7ef3382002a).
[GroupByKey -> [5]{Map(<lambda at fn_api_runner_test.py:594>), assert_that}
(1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem
streams are closed for task GroupByKey -> [5]{Map(<lambda at
fn_api_runner_test.py:594>), assert_that} (1/2)
(9edbd53a9ab38cf11650d7ef3382002a) [FINISHED]
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and
sending final execution state FINISHED to JobManager for task GroupByKey ->
[5]{Map(<lambda at fn_api_runner_test.py:594>), assert_that}
9edbd53a9ab38cf11650d7ef3382002a.
[ToKeyedWorkItem (1/2)] INFO org.apache.flink.runtime.taskmanager.Task -
ToKeyedWorkItem (1/2) (cec2594396d7188073604adfe6484cfe) switched from RUNNING
to FINISHED.
[ToKeyedWorkItem (1/2)] INFO org.apache.flink.runtime.taskmanager.Task -
Freeing task resources for ToKeyedWorkItem (1/2)
(cec2594396d7188073604adfe6484cfe).
[ToKeyedWorkItem (1/2)] INFO org.apache.flink.runtime.taskmanager.Task -
Ensuring all FileSystem streams are closed for task ToKeyedWorkItem (1/2)
(cec2594396d7188073604adfe6484cfe) [FINISHED]
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and
sending final execution state FINISHED to JobManager for task ToKeyedWorkItem
cec2594396d7188073604adfe6484cfe.
[assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (2/2)]
INFO org.apache.flink.runtime.taskmanager.Task - assert_that/Group/GroupByKey
-> [3]assert_that/{Group, Unkey, Match} (2/2)
(105c94e09c76de3310ea953badd1e514) switched from RUNNING to FINISHED.
[assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (2/2)]
INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for
assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (2/2)
(105c94e09c76de3310ea953badd1e514).
[assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (2/2)]
INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem
streams are closed for task assert_that/Group/GroupByKey ->
[3]assert_that/{Group, Unkey, Match} (2/2) (105c94e09c76de3310ea953badd1e514)
[FINISHED]
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and
sending final execution state FINISHED to JobManager for task
assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match}
105c94e09c76de3310ea953badd1e514.
[flink-akka.actor.default-dispatcher-5] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupByKey ->
[5]{Map(<lambda at fn_api_runner_test.py:594>), assert_that} (1/2)
(9edbd53a9ab38cf11650d7ef3382002a) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-5] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph - ToKeyedWorkItem (1/2)
(cec2594396d7188073604adfe6484cfe) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-5] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph -
assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (2/2)
(105c94e09c76de3310ea953badd1e514) switched from RUNNING to FINISHED.
[assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (1/2)]
INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory -
Closing environment urn: "beam:env:external:v1"
payload: "\n\021\022\017localhost:39821"
[assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (1/2)]
WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer - Hanged up for
unknown endpoint.
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data
plane.
Traceback (most recent call last):
File "apache_beam/runners/worker/data_plane.py", line 421, in _read_inputs
for elements in elements_iterator:
File "<https://builds.apache.org/job/beam_PreCommit_Python2_PVR_Flink_Cron/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 413, in next
return self._next()
File "<https://builds.apache.org/job/beam_PreCommit_Python2_PVR_Flink_Cron/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 703, in _next
raise self
_MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
status = StatusCode.CANCELLED
details = "Multiplexer hanging up"
debug_error_string = "{"created":"@1577602496.297345253","description":"Error received from peer ipv4:127.0.0.1:38201","file":"src/core/lib/surface/call.cc","file_line":1056,"grpc_message":"Multiplexer hanging up","grpc_status":1}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
self.run()
File "/usr/lib/python2.7/threading.py", line 754, in run
self.__target(*self.__args, **self.__kwargs)
File "apache_beam/runners/worker/data_plane.py", line 436, in <lambda>
target=lambda: self._read_inputs(elements_iterator),
File "apache_beam/runners/worker/data_plane.py", line 421, in _read_inputs
for elements in elements_iterator:
File "<https://builds.apache.org/job/beam_PreCommit_Python2_PVR_Flink_Cron/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 413, in next
return self._next()
File "<https://builds.apache.org/job/beam_PreCommit_Python2_PVR_Flink_Cron/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 703, in _next
raise self
_MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
status = StatusCode.CANCELLED
details = "Multiplexer hanging up"
debug_error_string = "{"created":"@1577602496.297345253","description":"Error received from peer ipv4:127.0.0.1:38201","file":"src/core/lib/surface/call.cc","file_line":1056,"grpc_message":"Multiplexer hanging up","grpc_status":1}"
>
INFO:apache_beam.runners.worker.sdk_worker:No more requests from control plane
INFO:apache_beam.runners.worker.sdk_worker:SDK Harness waiting for in-flight
requests to complete
[assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (1/2)]
INFO org.apache.flink.runtime.taskmanager.Task - assert_that/Group/GroupByKey
-> [3]assert_that/{Group, Unkey, Match} (1/2)
(01b614e51a02c7fdd161a7c323f9b5b3) switched from RUNNING to FINISHED.
[assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (1/2)]
INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for
assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (1/2)
(01b614e51a02c7fdd161a7c323f9b5b3).
[assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (1/2)]
INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem
streams are closed for task assert_that/Group/GroupByKey ->
[3]assert_that/{Group, Unkey, Match} (1/2) (01b614e51a02c7fdd161a7c323f9b5b3)
[FINISHED]
[flink-akka.actor.default-dispatcher-5] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and
sending final execution state FINISHED to JobManager for task
assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match}
01b614e51a02c7fdd161a7c323f9b5b3.
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph -
assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (1/2)
(01b614e51a02c7fdd161a7c323f9b5b3) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph - Job
test_windowing_1577602493.9 (57770077b92cc62c4c04ff4c7f5d4092) switched from
state RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.checkpoint.CheckpointCoordinator - Stopping checkpoint
coordinator for job 57770077b92cc62c4c04ff4c7f5d4092.
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.checkpoint.StandaloneCompletedCheckpointStore -
Shutting down
INFO:apache_beam.runners.worker.data_plane:Closing all cached grpc data
channels.
INFO:apache_beam.runners.worker.sdk_worker:Closing all cached gRPC state
handlers.
INFO:apache_beam.runners.worker.sdk_worker:Done consuming work.
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Job
57770077b92cc62c4c04ff4c7f5d4092 reached globally terminal state FINISHED.
[flink-akka.actor.default-dispatcher-5] INFO
org.apache.flink.runtime.jobmaster.JobMaster - Stopping the JobMaster for job
test_windowing_1577602493.9(57770077b92cc62c4c04ff4c7f5d4092).
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.taskexecutor.slot.TaskSlotTable - Free slot
TaskSlot(index:1, state:ACTIVE, resource profile:
ResourceProfile{cpuCores=1.7976931348623157E308, heapMemoryInMB=2147483647,
directMemoryInMB=2147483647, nativeMemoryInMB=2147483647,
networkMemoryInMB=2147483647, managedMemoryInMB=8128}, allocationId:
1f16efe8416e6f4ea015efb5dd9be8de, jobId: 57770077b92cc62c4c04ff4c7f5d4092).
[flink-akka.actor.default-dispatcher-5] INFO
org.apache.flink.runtime.jobmaster.slotpool.SlotPoolImpl - Suspending SlotPool.
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.taskexecutor.slot.TaskSlotTable - Free slot
TaskSlot(index:0, state:ACTIVE, resource profile:
ResourceProfile{cpuCores=1.7976931348623157E308, heapMemoryInMB=2147483647,
directMemoryInMB=2147483647, nativeMemoryInMB=2147483647,
networkMemoryInMB=2147483647, managedMemoryInMB=8128}, allocationId:
68ad4c831aec1910249fda05467a4e37, jobId: 57770077b92cc62c4c04ff4c7f5d4092).
[flink-akka.actor.default-dispatcher-5] INFO
org.apache.flink.runtime.jobmaster.JobMaster - Close ResourceManager connection
6b87266ced46e1696cbd434d74895d2b: JobManager is shutting down..
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.taskexecutor.JobLeaderService - Remove job
57770077b92cc62c4c04ff4c7f5d4092 from job leader monitoring.
[flink-akka.actor.default-dispatcher-5] INFO
org.apache.flink.runtime.jobmaster.slotpool.SlotPoolImpl - Stopping SlotPool.
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Close JobManager
connection for job 57770077b92cc62c4c04ff4c7f5d4092.
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Disconnect
job manager 98477735d96d34cd530810929fd54a76@akka://flink/user/jobmanager_61
for job 57770077b92cc62c4c04ff4c7f5d4092 from the resource manager.
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Close JobManager
connection for job 57770077b92cc62c4c04ff4c7f5d4092.
[flink-runner-job-invoker] INFO
org.apache.flink.runtime.minicluster.MiniCluster - Shutting down Flink Mini
Cluster
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.taskexecutor.JobLeaderService - Cannot reconnect to
job 57770077b92cc62c4c04ff4c7f5d4092 because it is not registered.
[flink-runner-job-invoker] INFO
org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shutting down rest
endpoint.
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopping TaskExecutor
akka://flink/user/taskmanager_60.
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Close ResourceManager
connection 6b87266ced46e1696cbd434d74895d2b.
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader
service.
[flink-akka.actor.default-dispatcher-4] INFO
org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Closing
TaskExecutor connection 7d4d67b1-90f3-41e1-a7a0-744b3e3548d2 because: The
TaskExecutor is shutting down.
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.state.TaskExecutorLocalStateStoresManager - Shutting
down TaskExecutorLocalStateStoresManager.
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.io.disk.FileChannelManagerImpl - FileChannelManager
removed spill file directory /tmp/flink-io-48748c6a-b6e9-40ab-8abc-6e629a5328ca
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.io.network.NettyShuffleEnvironment - Shutting down the
network environment and its components.
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.io.disk.FileChannelManagerImpl - FileChannelManager
removed spill file directory
/tmp/flink-netty-shuffle-da128f16-7baf-4ca0-9543-3295e8d59764
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.taskexecutor.KvStateService - Shutting down the
kvState service and its components.
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader
service.
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.filecache.FileCache - removed file cache directory
/tmp/flink-dist-cache-98898a8e-da33-44f0-bc35-7cfe9d7377ce
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopped TaskExecutor
akka://flink/user/taskmanager_60.
[ForkJoinPool.commonPool-worker-9] INFO
org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Removing cache
directory /tmp/flink-web-ui
[ForkJoinPool.commonPool-worker-9] INFO
org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shut down complete.
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Shut down
cluster because application is in CANCELED, diagnostics
DispatcherResourceManagerComponent has been closed..
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Closing
the SlotManager.
[flink-akka.actor.default-dispatcher-5] INFO
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping dispatcher
akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-5] INFO
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping all
currently running jobs of dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl -
Suspending the SlotManager.
[flink-akka.actor.default-dispatcher-5] INFO
org.apache.flink.runtime.rest.handler.legacy.backpressure.StackTraceSampleCoordinator
- Shutting down stack trace sample coordinator.
[flink-akka.actor.default-dispatcher-5] INFO
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopped dispatcher
akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-5] INFO
org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-akka.actor.default-dispatcher-5] INFO
org.apache.beam.runners.flink.metrics.FileReporter - wrote metrics to
/tmp/flinktest-conftPF20A/test-metrics.txt
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator -
Shutting down remote daemon.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator -
Remote daemon shut down; proceeding with flushing remote transports.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator -
Remoting shut down.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService -
Stopping Akka RPC service.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService -
Stopped Akka RPC service.
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.blob.PermanentBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.blob.TransientBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.blob.BlobServer - Stopped BLOB server at 0.0.0.0:37235
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-runner-job-invoker] INFO
org.apache.beam.runners.flink.FlinkPipelineRunner - Execution finished in 471
msecs
[flink-runner-job-invoker] INFO
org.apache.beam.runners.flink.FlinkPipelineRunner - Final accumulator values:
[flink-runner-job-invoker] INFO
org.apache.beam.runners.flink.FlinkPipelineRunner - __metricscontainers :
MetricQueryResults(Counters(26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_33}: 0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(<lambda at
fn_api_runner_test.py:594>)_23}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(<lambda at
fn_api_runner_test.py:591>)_17}: 0,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at
core.py:2591>)_27}: 0,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at
core.py:2591>)_27}: 0,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at
core.py:2591>)_27}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(<lambda at
fn_api_runner_test.py:591>)_17}: 0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_40}:
0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_35}: 0,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_31}: 0,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_35}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_10}: 5,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_31}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_11}: 5,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_12}: 5,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_30}: 0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0,
14Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_30}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(<lambda at
fn_api_runner_test.py:591>)_17}: 0,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Match_42}: 0,
14Create/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2591>)_4}:
0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0,
14Create/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_35}: 0,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_22}: 1,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_31}: 0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_35}: 0,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_30}: 1,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Match_42}: 0,
14Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0,
14Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(<lambda at
fn_api_runner_test.py:591>)_17}: 0,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_41}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_WindowInto(WindowIntoFn)_18}: 0,
14Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_40}:
0,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_9}: 5,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_30}: 0,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_18}: 1,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_29}: 0,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_17}: 1,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at
core.py:2591>)_27}: 0,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_19}: 1,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_41}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_34}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_WindowInto(WindowIntoFn)_18}: 0,
14Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_24:1}: 1,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_31}: 0,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_29}: 1,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_27}: 1,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_WindowInto(WindowIntoFn)_18}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_28}: 1,
14Create/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_29}: 0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_15:0}: 0,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_29}: 0,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_29}: 0,
14Create/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_1}: 1,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_WindowInto(WindowIntoFn)_18}: 0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_15}: 2,
14Create/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_2}: 5,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_16}: 2,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_33}: 0,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_40}:
0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(<lambda at
fn_api_runner_test.py:594>)_23}: 0,
14Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_41}: 0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_34}: 0,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_41}: 0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_24:0}: 2,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_40}:
0,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Match_42}: 0,
14Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2591>)_4}:
0,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_35}: 0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_34}: 0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_30}: 0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_15:0}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0,
14Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2591>)_4}:
0,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_15:0}: 0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_34}: 0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_33}: 0,
14Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2591>)_4}:
0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(<lambda at
fn_api_runner_test.py:594>)_23}: 0,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_35}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_35}: 0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_20}: 2,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_23}: 2,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(<lambda at
fn_api_runner_test.py:594>)_23}: 0,
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Match_42}: 0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_21}: 2,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_35}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0,
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_15:0}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0,
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_33}: 0,
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}:
0)Distributions(14Create/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_1}: DistributionResult{sum=13,
count=1, min=13, max=13},
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_10}: DistributionResult{sum=70,
count=5, min=14, max=14},
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_11}: DistributionResult{sum=100,
count=5, min=20, max=20},
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_24:0}: DistributionResult{sum=60,
count=2, min=29, max=31},
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=140,
count=5, min=28, max=28},
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=70,
count=5, min=14, max=14},
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_24:1}: DistributionResult{sum=19,
count=1, min=19, max=19},
14Create/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_2}: DistributionResult{sum=70,
count=5, min=14, max=14},
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_19}: DistributionResult{sum=15,
count=1, min=15, max=15},
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_15}: DistributionResult{sum=75,
count=2, min=36, max=39},
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_18}: DistributionResult{sum=15,
count=1, min=15, max=15},
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_16}: DistributionResult{sum=66,
count=2, min=32, max=34},
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_17}: DistributionResult{sum=13,
count=1, min=13, max=13},
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_30}: DistributionResult{sum=14,
count=1, min=14, max=14},
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_28}: DistributionResult{sum=47,
count=1, min=47, max=47},
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_29}: DistributionResult{sum=39,
count=1, min=39, max=39},
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_22}: DistributionResult{sum=17,
count=1, min=17, max=17},
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=46,
count=2, min=22, max=24},
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_21}: DistributionResult{sum=48,
count=2, min=23, max=25},
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_23}: DistributionResult{sum=52,
count=2, min=25, max=27},
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_27}: DistributionResult{sum=59,
count=1, min=59, max=59}))
[flink-runner-job-invoker] WARN
org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Failed
to remove job staging directory for token
{"sessionId":"job_d8699269-c1c0-438d-8cac-a43d485e6b1a","basePath":"/tmp/flinktestwvn4_2"}:
{}
java.io.FileNotFoundException: /tmp/flinktestwvn4_2/job_d8699269-c1c0-438d-8cac-a43d485e6b1a/MANIFEST (No such file or directory)
at java.io.FileInputStream.open0(Native Method)
at java.io.FileInputStream.open(FileInputStream.java:195)
at java.io.FileInputStream.<init>(FileInputStream.java:138)
at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.INFO:__main__:removing conf dir: /tmp/flinktest-conftPF20A
----------------------------------------------------------------------
Ran 76 tests in 197.000s
OK (skipped=14)
FAILURE: Build failed with an exception.
* Where:
Script '<https://builds.apache.org/job/beam_PreCommit_Python2_PVR_Flink_Cron/ws/src/sdks/python/test-suites/portable/common.gradle>' line: 55
* What went wrong:
Execution failed for task
':sdks:python:test-suites:portable:py2:flinkCompatibilityMatrixBatchLOOPBACK'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 41m 42s
72 actionable tasks: 54 executed, 17 from cache, 1 up-to-date
Publishing build scan...
https://gradle.com/s/iecn2lhez7dxa
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]