See <https://builds.apache.org/job/beam_PostCommit_Python2/589/display/redirect>
Changes:
------------------------------------------
[...truncated 770.01 KB...]
[DataSink (DiscardingOutput) (1/2)] INFO
org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak
safety net for task DataSink (DiscardingOutput) (1/2)
(14bc917365f0a1777c2290a3dff733cd) [DEPLOYING]
[DataSink (DiscardingOutput) (1/2)] INFO
org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task DataSink
(DiscardingOutput) (1/2) (14bc917365f0a1777c2290a3dff733cd) [DEPLOYING].
[DataSink (DiscardingOutput) (1/2)] INFO
org.apache.flink.runtime.taskmanager.Task - Registering task at network:
DataSink (DiscardingOutput) (1/2) (14bc917365f0a1777c2290a3dff733cd)
[DEPLOYING].
[DataSink (DiscardingOutput) (1/2)] INFO
org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (1/2)
(14bc917365f0a1777c2290a3dff733cd) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink
(DiscardingOutput) (1/2) (14bc917365f0a1777c2290a3dff733cd) switched from
DEPLOYING to RUNNING.
[DataSink (DiscardingOutput) (1/2)] INFO
org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (1/2)
(14bc917365f0a1777c2290a3dff733cd) switched from RUNNING to FINISHED.
[DataSink (DiscardingOutput) (1/2)] INFO
org.apache.flink.runtime.taskmanager.Task - Freeing task resources for DataSink
(DiscardingOutput) (1/2) (14bc917365f0a1777c2290a3dff733cd).
[DataSink (DiscardingOutput) (1/2)] INFO
org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are
closed for task DataSink (DiscardingOutput) (1/2)
(14bc917365f0a1777c2290a3dff733cd) [FINISHED]
[flink-akka.actor.default-dispatcher-7] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and
sending final execution state FINISHED to JobManager for task DataSink
(DiscardingOutput) 14bc917365f0a1777c2290a3dff733cd.
[flink-akka.actor.default-dispatcher-7] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink
(DiscardingOutput) (1/2) (14bc917365f0a1777c2290a3dff733cd) switched from
RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-7] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph - Job
BeamApp-root-0930121022-f43326e0 (d852a6fe1a153a98d7d679884aa8d5c7) switched
from state RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-3] INFO
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Job
d852a6fe1a153a98d7d679884aa8d5c7 reached globally terminal state FINISHED.
[flink-akka.actor.default-dispatcher-7] INFO
org.apache.flink.runtime.jobmaster.JobMaster - Stopping the JobMaster for job
BeamApp-root-0930121022-f43326e0(d852a6fe1a153a98d7d679884aa8d5c7).
[flink-akka.actor.default-dispatcher-7] INFO
org.apache.flink.runtime.jobmaster.slotpool.SlotPoolImpl - Suspending SlotPool.
[flink-akka.actor.default-dispatcher-7] INFO
org.apache.flink.runtime.jobmaster.JobMaster - Close ResourceManager connection
821e63a4670109eee7416d039dc6decd: JobManager is shutting down..
[flink-akka.actor.default-dispatcher-7] INFO
org.apache.flink.runtime.jobmaster.slotpool.SlotPoolImpl - Stopping SlotPool.
[flink-akka.actor.default-dispatcher-4] INFO
org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Disconnect
job manager 98c4e4df1df4f0a995a1e2725ef747ac@akka://flink/user/jobmanager_1 for
job d852a6fe1a153a98d7d679884aa8d5c7 from the resource manager.
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.taskexecutor.slot.TaskSlotTable - Free slot
TaskSlot(index:1, state:ACTIVE, resource profile:
ResourceProfile{cpuCores=1.7976931348623157E308, heapMemoryInMB=2147483647,
directMemoryInMB=2147483647, nativeMemoryInMB=2147483647,
networkMemoryInMB=2147483647}, allocationId: 740560020ed912e642893a01897cc0f1,
jobId: d852a6fe1a153a98d7d679884aa8d5c7).
[flink-runner-job-invoker] INFO
org.apache.flink.runtime.minicluster.MiniCluster - Shutting down Flink Mini
Cluster
[flink-runner-job-invoker] INFO
org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shutting down rest
endpoint.
[mini-cluster-io-thread-15] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - JobManager for job
d852a6fe1a153a98d7d679884aa8d5c7 with leader id
98c4e4df1df4f0a995a1e2725ef747ac lost leadership.
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.taskexecutor.slot.TaskSlotTable - Free slot
TaskSlot(index:0, state:ACTIVE, resource profile:
ResourceProfile{cpuCores=1.7976931348623157E308, heapMemoryInMB=2147483647,
directMemoryInMB=2147483647, nativeMemoryInMB=2147483647,
networkMemoryInMB=2147483647}, allocationId: a867034bf3d761bded23aad50b4c6587,
jobId: d852a6fe1a153a98d7d679884aa8d5c7).
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.taskexecutor.JobLeaderService - Remove job
d852a6fe1a153a98d7d679884aa8d5c7 from job leader monitoring.
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Close JobManager
connection for job d852a6fe1a153a98d7d679884aa8d5c7.
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Close JobManager
connection for job d852a6fe1a153a98d7d679884aa8d5c7.
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.taskexecutor.JobLeaderService - Cannot reconnect to
job d852a6fe1a153a98d7d679884aa8d5c7 because it is not registered.
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopping TaskExecutor
akka://flink/user/taskmanager_0.
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader
service.
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.state.TaskExecutorLocalStateStoresManager - Shutting
down TaskExecutorLocalStateStoresManager.
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.io.disk.iomanager.IOManager - I/O manager removed
spill file directory /tmp/flink-io-24802c82-f725-4caa-993c-817d3d8890e9
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.io.network.NetworkEnvironment - Shutting down the
network environment and its components.
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader
service.
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.filecache.FileCache - removed file cache directory
/tmp/flink-dist-cache-c40a046c-3ad6-4881-8899-950f552e213e
[ForkJoinPool.commonPool-worker-11] INFO
org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Removing cache
directory /tmp/flink-web-ui
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopped TaskExecutor
akka://flink/user/taskmanager_0.
[flink-runner-job-invoker] INFO
org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shut down complete.
[flink-akka.actor.default-dispatcher-4] INFO
org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Shut down
cluster because application is in CANCELED, diagnostics
DispatcherResourceManagerComponent has been closed..
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping dispatcher
akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping all
currently running jobs of dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-4] INFO
org.apache.flink.runtime.resourcemanager.slotmanager.SlotManager - Closing the
SlotManager.
[flink-akka.actor.default-dispatcher-4] INFO
org.apache.flink.runtime.resourcemanager.slotmanager.SlotManager - Suspending
the SlotManager.
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.rest.handler.legacy.backpressure.StackTraceSampleCoordinator
- Shutting down stack trace sample coordinator.
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopped dispatcher
akka://flink/user/dispatcher.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator -
Shutting down remote daemon.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator -
Remote daemon shut down; proceeding with flushing remote transports.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator -
Remoting shut down.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService -
Stopping Akka RPC service.
[flink-akka.actor.default-dispatcher-4] INFO
org.apache.flink.runtime.blob.PermanentBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-4] INFO
org.apache.flink.runtime.blob.TransientBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-4] INFO
org.apache.flink.runtime.blob.BlobServer - Stopped BLOB server at 0.0.0.0:34767
[flink-akka.actor.default-dispatcher-4] INFO
org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-runner-job-invoker] INFO
org.apache.beam.runners.flink.FlinkPipelineRunner - Execution finished in 13586
msecs
[flink-runner-job-invoker] INFO
org.apache.beam.runners.flink.FlinkPipelineRunner - Final accumulator values:
[flink-runner-job-invoker] INFO
org.apache.beam.runners.flink.FlinkPipelineRunner - __metricscontainers :
MetricQueryResults(Counters(ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0,
ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0,
ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0,
ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0,
ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0,
ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0,
ref_PCollection_PCollection_17:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at
core.py:2474>)_26}: 0,
ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0,
ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=external_2root/Init/Map/ParMultiDo(Anonymous)}: 0,
ref_PCollection_PCollection_1:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2474>)_4}:
14,
ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0,
ref_PCollection_PCollection_12:beam:metric:element_count:v1
{PCOLLECTION=pcollection}: 5,
ref_PCollection_PCollection_1:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0,
ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Precombine}:
0,
ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0,
ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0,
ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0,
ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0,
ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0,
ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0,
ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0,
ref_PCollection_PCollection_9:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0,
ref_PCollection_PCollection_17:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_18}: 1,
ref_PCollection_PCollection_17:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_19}: 1,
ref_PCollection_PCollection_17:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0,
ref_PCollection_PCollection_17:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_17}: 1,
ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Precombine}:
0,
ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=external_2root/Init/Map/ParMultiDo(Anonymous)}: 0,
pcollection_1:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_14:0}: 0,
pcollection_1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_14:0}: 0,
ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 0,
pcollection_1:beam:metric:element_count:v1 {PCOLLECTION=pcollection_1}: 3,
ref_PCollection_PCollection_17:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_24:0}: 1,
ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0,
ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0,
ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at
core.py:2474>)_26}: 0,
ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 0,
ref_PCollection_PCollection_1:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_1}: 1,
ref_PCollection_PCollection_14:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0,
ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0,
ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0,
ref_PCollection_PCollection_1:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_2}: 12,
pcollection_1:beam:metric:element_count:v1 {PCOLLECTION=pcollection_2}: 3,
ref_PCollection_PCollection_27:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_30}: 1,
ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0,
ref_PCollection_PCollection_14:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0,
ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0,
pcollection_1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/read/pcollection_1:0}: 0,
ref_PCollection_PCollection_17:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 2,
ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:378>)_22}:
0,
ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0,
ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Precombine}:
0,
ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:378>)_22}:
0,
ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0,
ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:378>)_22}:
0,
ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:377>)_21}:
0,
ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=external_2root/Init/Map/ParMultiDo(Anonymous)}: 0,
ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}:
0,
ref_PCollection_PCollection_9:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0,
ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0,
ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at
core.py:2474>)_26}: 0,
ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0,
ref_PCollection_PCollection_27:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0,
ref_PCollection_PCollection_14:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0,
ref_PCollection_PCollection_27:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0,
pcollection_1:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_14}: 3,
ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0,
ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:377>)_21}:
0, pcollection_1:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Merge}:
0,
ref_PCollection_PCollection_14:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 2,
ref_PCollection_PCollection_17:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_22}: 1,
ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at
core.py:2474>)_26}: 0,
ref_PCollection_PCollection_27:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_28}: 1,
ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0,
ref_PCollection_PCollection_17:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0,
ref_PCollection_PCollection_27:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_29}: 1,
ref_PCollection_PCollection_12:beam:metric:element_count:v1
{PCOLLECTION=external_2root/Init/Map/ParMultiDo(Anonymous).output}: 6,
ref_PCollection_PCollection_27:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_27}: 1,
pcollection_1:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/read/pcollection_1:0}: 0,
ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0,
ref_PCollection_PCollection_17:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0,
ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0,
ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}:
0,
ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0,
ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}:
0,
ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0,
ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/write/pcollection:0}: 0,
ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0,
ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/write/pcollection:0}: 0,
ref_PCollection_PCollection_12:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_13}: 6,
ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0,
ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=fn/write/pcollection:0}: 0,
ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0,
ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0,
ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 0,
ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0,
ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 0,
ref_PCollection_PCollection_14:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_14}: 3,
ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0,
ref_PCollection_PCollection_9:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_9}: 12,
ref_PCollection_PCollection_14:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:378>)_22}:
0,
ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:377>)_21}:
0,
ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0,
ref_PCollection_PCollection_14:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_16}: 3,
ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0,
ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0,
ref_PCollection_PCollection_14:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_15}: 3,
ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0,
ref_PCollection_PCollection_1:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0,
ref_PCollection_PCollection_17:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0,
ref_PCollection_PCollection_12:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_12}: 12,
ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0,
ref_PCollection_PCollection_9:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0,
pcollection_1:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/ExtractOutputs}:
0, ref_PCollection_PCollection_14:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_24:1}: 3,
ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0,
ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 0,
ref_PCollection_PCollection_14:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0,
ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 0,
ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0,
ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0,
ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0,
ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 2,
ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0,
ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 0,
ref_PCollection_PCollection_9:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:373>)_18}:
0,
ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0,
ref_PCollection_PCollection_9:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0,
ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0,
ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2474>)_4}:
0,
ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0,
ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:373>)_18}:
0,
ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 2,
ref_PCollection_PCollection_14:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_23}: 3,
ref_PCollection_PCollection_27:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 0,
ref_PCollection_PCollection_14:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_20}: 3,
pcollection_1:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_14:0}: 0,
ref_PCollection_PCollection_14:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_21}: 3,
ref_PCollection_PCollection_9:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_11}: 12,
ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:373>)_18}:
0, ref_PCollection_PCollection_9:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_12}: 12,
ref_PCollection_PCollection_9:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_10}: 12,
ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2474>)_4}:
0,
ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2474>)_4}:
14,
ref_PCollection_PCollection_27:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}:
0,
ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0,
ref_PCollection_PCollection_14:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0,
ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=external_1root/ParDo(Anonymous)/ParMultiDo(Anonymous)}: 0,
ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0,
ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=external_1root/ParDo(Anonymous)/ParMultiDo(Anonymous)}: 0,
ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=external_1root/ParDo(Anonymous)/ParMultiDo(Anonymous)}: 0,
ref_PCollection_PCollection_14:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:377>)_21}:
0,
ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0,
ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:373>)_18}:
0)Distributions(ref_PCollection_PCollection_14:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_16}: DistributionResult{sum=54,
count=3, min=18, max=18},
ref_PCollection_PCollection_14:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_15}: DistributionResult{sum=51,
count=3, min=17, max=17},
ref_PCollection_PCollection_14:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_14}: DistributionResult{sum=45,
count=3, min=15, max=15},
ref_PCollection_PCollection_27:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_30}: DistributionResult{sum=14,
count=1, min=14, max=14},
ref_PCollection_PCollection_17:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_24:0}: DistributionResult{sum=19,
count=1, min=19, max=19},
ref_PCollection_PCollection_17:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_17}: DistributionResult{sum=13,
count=1, min=13, max=13},
ref_PCollection_PCollection_14:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_24:1}: DistributionResult{sum=72,
count=3, min=24, max=24},
ref_PCollection_PCollection_17:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_18}: DistributionResult{sum=16,
count=1, min=16, max=16},
ref_PCollection_PCollection_17:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_19}: DistributionResult{sum=15,
count=1, min=15, max=15},
ref_PCollection_PCollection_9:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=192,
count=12, min=16, max=16},
ref_PCollection_PCollection_17:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_22}: DistributionResult{sum=17,
count=1, min=17, max=17},
ref_PCollection_PCollection_9:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_10}: DistributionResult{sum=168,
count=12, min=14, max=14},
ref_PCollection_PCollection_9:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=168,
count=12, min=14, max=14},
ref_PCollection_PCollection_9:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_11}: DistributionResult{sum=168,
count=12, min=14, max=14},
ref_PCollection_PCollection_1:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_1}: DistributionResult{sum=13,
count=1, min=13, max=13},
ref_PCollection_PCollection_27:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_27}: DistributionResult{sum=58,
count=1, min=58, max=58},
ref_PCollection_PCollection_27:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_28}: DistributionResult{sum=41,
count=1, min=41, max=41},
ref_PCollection_PCollection_14:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_23}: DistributionResult{sum=63,
count=3, min=21, max=21},
ref_PCollection_PCollection_27:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_29}: DistributionResult{sum=33,
count=1, min=33, max=33},
ref_PCollection_PCollection_1:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_2}: DistributionResult{sum=180,
count=12, min=15, max=15},
ref_PCollection_PCollection_14:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_21}: DistributionResult{sum=57,
count=3, min=19, max=19},
ref_PCollection_PCollection_14:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=54,
count=3, min=18, max=18}))
[flink-runner-job-invoker] INFO
org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService
- Manifest at
/tmp/beam-artifact-staging/job_d7cc98fa-4f42-4fb0-875a-2c9d53dfbe96/MANIFEST
has 0 artifact locations
[flink-runner-job-invoker] INFO
org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService
- Removed dir
/tmp/beam-artifact-staging/job_d7cc98fa-4f42-4fb0-875a-2c9d53dfbe96/
INFO:root:Job state changed to DONE
> Task :sdks:python:test-suites:portable:py2:crossLanguageTests
> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_bigquery_tornadoes_it
(apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT)
... ok
test_autocomplete_it
(apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_datastore_wordcount_it
(apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT)
... ok
test_datastore_write_limit
(apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT)
... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:695:
BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use
WriteToBigQuery instead.
kms_key=transform.kms_key))
test_streaming_wordcount_it
(apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:793:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_leader_board_it
(apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it
(apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_user_score_it
(apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:577:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
experiments = p.options.view_as(DebugOptions).experiments or []
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:793:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1142:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_bigquery_read_1M_python
(apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:695:
BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use
WriteToBigQuery instead.
kms_key=transform.kms_key))
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_multiple_destinations_transform
(apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ...
ok
test_copy_batch
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest)
... ok
test_copy_rewrite_token
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232:
FutureWarning: MatchAll is experimental.
| 'GetPath' >> beam.Map(lambda metadata: metadata.path))
test_value_provider_transform
(apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243:
FutureWarning: MatchAll is experimental.
| 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243:
FutureWarning: ReadMatches is experimental.
| 'Checksums' >> beam.Map(compute_hash))
test_bqfl_streaming
(apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP:
TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform
(apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail
(apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_big_query_read
(apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types
(apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73:
BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use
WriteToBigQuery instead.
kms_key=kms_key))
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP:
https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ...
ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_streaming_data_only
(apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes
(apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT)
... ok
test_big_query_new_types
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT)
... ok
test_big_query_standard_sql
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT)
... ok
test_big_query_standard_sql_kms_key_native
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT)
... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it
(apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_big_query_write
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ...
SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_metrics_fnapi_it
(apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
... ok
test_metrics_it
(apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
... ok
test_datastore_write_limit
(apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok
======================================================================
ERROR: test_hourly_team_score_it
(apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/examples/complete/game/hourly_team_score_it_test.py>", line 89, in test_hourly_team_score_it
    self.test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/examples/complete/game/hourly_team_score.py>", line 303, in run
    }, options.view_as(GoogleCloudOptions).project))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py>", line 427, in __exit__
    self.run().wait_until_finish()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py>", line 420, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py>", line 53, in run_pipeline
    pipeline, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 484, in run_pipeline
    self.dataflow_client.create_job(self.job), self)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py>", line 206, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 530, in create_job
    self.create_job_description(job)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 560, in create_job_description
    resources = self._stage_resources(job.options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 490, in _stage_resources
    staging_location=google_cloud_options.staging_location)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/portability/stager.py>", line 168, in stage_job_resources
    requirements_cache_path)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py>", line 206, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/portability/stager.py>", line 487, in _populate_requirements_cache
    processes.check_output(cmd_args, stderr=processes.STDOUT)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/processes.py>", line 91, in check_output
    .format(traceback.format_exc(), args[0][6], error.output))
RuntimeError: Full traceback: Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/processes.py>", line 83, in check_output
    out = subprocess.check_output(*args, **kwargs)
  File "/usr/lib/python2.7/subprocess.py", line 574, in check_output
    raise CalledProcessError(retcode, cmd, output=output)
CalledProcessError: Command
'['<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/bin/python>',
'-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r',
'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']'
returned non-zero exit status 1
Pip install failed for package: -r
Output from execution of subprocess: DEPRECATION: Python 2.7 will reach the
end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7
won't be maintained after that date. A future version of pip will drop support
for Python 2.7. More details about Python 2 support in pip, can be found at
https://pip.pypa.io/en/latest/development/release-process/#python-2-support
Collecting pyhamcrest (from -r postcommit_requirements.txt (line 1))
File was already downloaded
/tmp/dataflow-requirements-cache/PyHamcrest-1.9.0.tar.gz
Collecting mock (from -r postcommit_requirements.txt (line 2))
ERROR: Could not find a version that satisfies the requirement mock (from -r
postcommit_requirements.txt (line 2)) (from versions: none)
ERROR: No matching distribution found for mock (from -r
postcommit_requirements.txt (line 2))
-------------------- >> begin captured logging << --------------------
root: WARNING: --region not set; will default to us-central1. Future releases
of Beam will require the user to set the region explicitly.
https://cloud.google.com/compute/docs/regions-zones/regions-zones
google.auth.transport._http_client: DEBUG: Making request: GET
http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET
http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3,
connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1):
metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET
/computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1"
200 144
google.auth.transport.requests: DEBUG: Making request: GET
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET
/computeMetadata/v1/instance/service-accounts/[email protected]/token
HTTP/1.1" 200 181
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1):
www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "POST
/bigquery/v2/projects/apache-beam-testing/datasets HTTP/1.1" 200 None
root: WARNING: --region not set; will default to us-central1. Future releases
of Beam will require the user to set the region explicitly.
https://cloud.google.com/compute/docs/regions-zones/regions-zones
apache_beam.io.filesystem: DEBUG: Listing files in
'gs://dataflow-samples/game/gaming_data'
apache_beam.io.filesystem: DEBUG: translate_pattern:
'gs://dataflow-samples/game/gaming_data*' ->
'gs\\:\\/\\/dataflow\\-samples\\/game\\/gaming\\_data[^/\\\\]*'
root: INFO: Setting socket default timeout to 60 seconds.
root: INFO: socket default timeout is 60.0 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Finished listing 2 files in 0.0710110664368 seconds.
root: WARNING: Typical end users should not use this worker jar feature. It can
only be used when FnAPI is enabled.
apache_beam.io.filesystem: DEBUG: Listing files in
'gs://dataflow-samples/game/gaming_data'
apache_beam.io.filesystem: DEBUG: translate_pattern:
'gs://dataflow-samples/game/gaming_data*' ->
'gs\\:\\/\\/dataflow\\-samples\\/game\\/gaming\\_data[^/\\\\]*'
root: INFO: Starting the size estimation of the input
root: INFO: Finished listing 2 files in 0.054297208786 seconds.
root: INFO: Starting GCS upload to
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0930120340-538122.1569845020.538352/pipeline.pb...
root: INFO: Completed GCS upload to
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0930120340-538122.1569845020.538352/pipeline.pb
in 0 seconds.
root: INFO: Starting GCS upload to
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0930120340-538122.1569845020.538352/requirements.txt...
root: INFO: Completed GCS upload to
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0930120340-538122.1569845020.538352/requirements.txt
in 0 seconds.
root: INFO: Executing command:
['<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/bin/python>',
'-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r',
'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']
--------------------- >> end captured logging << ---------------------
----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML:
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3986.574s
FAILED (SKIP=4, errors=1)
> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED
FAILURE: Build failed with an exception.
* Where:
Build file
'<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>'
line: 85
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 1h 7m 21s
111 actionable tasks: 86 executed, 22 from cache, 3 up-to-date
Publishing build scan...
https://gradle.com/s/6fmiympaqkszi
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]