See <https://builds.apache.org/job/beam_PostCommit_Python2/461/display/redirect>
------------------------------------------
[...truncated 778.71 KB...]
[flink-akka.actor.default-dispatcher-7] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink
(DiscardingOutput) (1/2) (6ba7515956dcf72f14d58ff77fba4086) switched from
SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-7] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying DataSink
(DiscardingOutput) (1/2) (attempt #0) to localhost
[flink-akka.actor.default-dispatcher-6] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task DataSink
(DiscardingOutput) (1/2).
[DataSink (DiscardingOutput) (1/2)] INFO
org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (1/2)
(6ba7515956dcf72f14d58ff77fba4086) switched from CREATED to DEPLOYING.
[DataSink (DiscardingOutput) (1/2)] INFO
org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak
safety net for task DataSink (DiscardingOutput) (1/2)
(6ba7515956dcf72f14d58ff77fba4086) [DEPLOYING]
[flink-akka.actor.default-dispatcher-5] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition
(MapPartition at [3]assert_that/{Group, Unkey, Match}) (1/2)
(61e90cc9098b5a81abd5597db56a2ef3) switched from RUNNING to FINISHED.
[DataSink (DiscardingOutput) (1/2)] INFO
org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task DataSink
(DiscardingOutput) (1/2) (6ba7515956dcf72f14d58ff77fba4086) [DEPLOYING].
[DataSink (DiscardingOutput) (1/2)] INFO
org.apache.flink.runtime.taskmanager.Task - Registering task at network:
DataSink (DiscardingOutput) (1/2) (6ba7515956dcf72f14d58ff77fba4086)
[DEPLOYING].
[DataSink (DiscardingOutput) (1/2)] INFO
org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (1/2)
(6ba7515956dcf72f14d58ff77fba4086) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-6] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink
(DiscardingOutput) (1/2) (6ba7515956dcf72f14d58ff77fba4086) switched from
DEPLOYING to RUNNING.
[DataSink (DiscardingOutput) (1/2)] INFO
org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (1/2)
(6ba7515956dcf72f14d58ff77fba4086) switched from RUNNING to FINISHED.
[DataSink (DiscardingOutput) (1/2)] INFO
org.apache.flink.runtime.taskmanager.Task - Freeing task resources for DataSink
(DiscardingOutput) (1/2) (6ba7515956dcf72f14d58ff77fba4086).
[DataSink (DiscardingOutput) (1/2)] INFO
org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are
closed for task DataSink (DiscardingOutput) (1/2)
(6ba7515956dcf72f14d58ff77fba4086) [FINISHED]
[flink-akka.actor.default-dispatcher-5] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and
sending final execution state FINISHED to JobManager for task DataSink
(DiscardingOutput) 6ba7515956dcf72f14d58ff77fba4086.
[flink-akka.actor.default-dispatcher-5] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink
(DiscardingOutput) (1/2) (6ba7515956dcf72f14d58ff77fba4086) switched from
RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-5] INFO
org.apache.flink.runtime.executiongraph.ExecutionGraph - Job
BeamApp-root-0913120825-b124b1c (67e5fed028eac573dcebb8c5d1ba1953) switched
from state RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Job
67e5fed028eac573dcebb8c5d1ba1953 reached globally terminal state FINISHED.
[flink-akka.actor.default-dispatcher-5] INFO
org.apache.flink.runtime.jobmaster.JobMaster - Stopping the JobMaster for job
BeamApp-root-0913120825-b124b1c(67e5fed028eac573dcebb8c5d1ba1953).
[flink-runner-job-invoker] INFO
org.apache.flink.runtime.minicluster.MiniCluster - Shutting down Flink Mini
Cluster
[flink-runner-job-invoker] INFO
org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shutting down rest
endpoint.
[flink-akka.actor.default-dispatcher-7] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopping TaskExecutor
akka://flink/user/taskmanager_0.
[flink-akka.actor.default-dispatcher-5] INFO
org.apache.flink.runtime.jobmaster.JobMaster - Close ResourceManager connection
bfa971469bc5f38ef9e34d4230b763d5: JobManager is shutting down..
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping dispatcher
akka://flink/user/dispatchera8433abb-3c45-46fb-9e8f-c73f615aff96.
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping all
currently running jobs of dispatcher
akka://flink/user/dispatchera8433abb-3c45-46fb-9e8f-c73f615aff96.
[flink-akka.actor.default-dispatcher-6] INFO
org.apache.flink.runtime.jobmaster.slotpool.SlotPool - Suspending SlotPool.
[flink-akka.actor.default-dispatcher-7] INFO
org.apache.flink.runtime.state.TaskExecutorLocalStateStoresManager - Shutting
down TaskExecutorLocalStateStoresManager.
[flink-akka.actor.default-dispatcher-6] INFO
org.apache.flink.runtime.jobmaster.slotpool.SlotPool - Stopping SlotPool.
[flink-akka.actor.default-dispatcher-6] INFO
org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Disconnect
job manager 981cb0324442ad26f724b85cc2d4448c@akka://flink/user/jobmanager_1 for
job 67e5fed028eac573dcebb8c5d1ba1953 from the resource manager.
[flink-akka.actor.default-dispatcher-6] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - JobManager for job
67e5fed028eac573dcebb8c5d1ba1953 with leader id
981cb0324442ad26f724b85cc2d4448c lost leadership.
[flink-akka.actor.default-dispatcher-7] INFO
org.apache.flink.runtime.io.disk.iomanager.IOManager - I/O manager removed
spill file directory /tmp/flink-io-e8049e9f-373c-4eab-9074-801316636989
[flink-akka.actor.default-dispatcher-7] INFO
org.apache.flink.runtime.io.network.NetworkEnvironment - Shutting down the
network environment and its components.
[flink-akka.actor.default-dispatcher-5] INFO
org.apache.flink.runtime.rest.handler.legacy.backpressure.StackTraceSampleCoordinator
- Shutting down stack trace sample coordinator.
[flink-akka.actor.default-dispatcher-5] INFO
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopped dispatcher
akka://flink/user/dispatchera8433abb-3c45-46fb-9e8f-c73f615aff96.
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.resourcemanager.slotmanager.SlotManager - Closing the
SlotManager.
[flink-akka.actor.default-dispatcher-2] INFO
org.apache.flink.runtime.resourcemanager.slotmanager.SlotManager - Suspending
the SlotManager.
[flink-akka.actor.default-dispatcher-7] INFO
org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader
service.
[flink-akka.actor.default-dispatcher-7] INFO
org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopped TaskExecutor
akka://flink/user/taskmanager_0.
[ForkJoinPool.commonPool-worker-2] INFO
org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Removing cache
directory /tmp/flink-web-ui
[ForkJoinPool.commonPool-worker-2] INFO
org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shut down complete.
[ForkJoinPool.commonPool-worker-2] INFO
org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-akka.actor.default-dispatcher-6] INFO
org.apache.flink.runtime.blob.PermanentBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-6] INFO
org.apache.flink.runtime.blob.TransientBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-6] INFO
org.apache.flink.runtime.blob.BlobServer - Stopped BLOB server at 0.0.0.0:36533
[flink-akka.actor.default-dispatcher-6] INFO
org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-runner-job-invoker] INFO
org.apache.beam.runners.flink.FlinkPipelineRunner - Execution finished in 15124
msecs
[flink-runner-job-invoker] INFO
org.apache.beam.runners.flink.FlinkPipelineRunner - Final accumulator values:
[flink-runner-job-invoker] INFO
org.apache.beam.runners.flink.FlinkPipelineRunner - __metricscontainers :
MetricQueryResults(Counters(ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0,
ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:366>)_18}:
0,
ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0,
ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0,
ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0,
ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0,
ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0,
ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0,
ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=external_2root/Init/Map/ParMultiDo(Anonymous)}: 0,
ref_PCollection_PCollection_14:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0,
ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0,
ref_PCollection_PCollection_12:beam:metric:element_count:v1
{PCOLLECTION=pcollection}: 5,
ref_PCollection_PCollection_1:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0,
ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Precombine}:
0,
ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0,
ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0,
ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0,
ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0,
ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0,
ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:370>)_21}:
0,
ref_PCollection_PCollection_9:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0,
ref_PCollection_PCollection_17:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_18}: 1,
ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0,
ref_PCollection_PCollection_17:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_19}: 1,
ref_PCollection_PCollection_17:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0,
ref_PCollection_PCollection_17:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_17}: 1,
ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Precombine}:
0,
ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0,
ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=external_2root/Init/Map/ParMultiDo(Anonymous)}: 0,
pcollection_1:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_14:0}: 0,
pcollection_1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_14:0}: 0,
ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 0,
pcollection_1:beam:metric:element_count:v1 {PCOLLECTION=pcollection_1}: 3,
ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0,
ref_PCollection_PCollection_17:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_24:1}: 1,
ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0,
ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 0,
ref_PCollection_PCollection_1:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_1}: 1,
ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:370>)_21}:
0,
ref_PCollection_PCollection_14:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0,
ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0,
ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0,
ref_PCollection_PCollection_1:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_2}: 12,
ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:370>)_21}:
0, pcollection_1:beam:metric:element_count:v1 {PCOLLECTION=pcollection_2}: 3,
ref_PCollection_PCollection_27:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_30}: 1,
ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0,
ref_PCollection_PCollection_14:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_14:0}: 0,
ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0,
pcollection_1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/read/pcollection_1:0}: 0,
ref_PCollection_PCollection_17:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 2,
ref_PCollection_PCollection_17:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0,
ref_PCollection_PCollection_14:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:371>)_22}:
0,
ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0,
ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Precombine}:
0,
ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0,
ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=external_2root/Init/Map/ParMultiDo(Anonymous)}: 0,
ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}:
0,
ref_PCollection_PCollection_9:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0,
ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0,
ref_PCollection_PCollection_27:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0,
ref_PCollection_PCollection_14:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_30}: 0,
ref_PCollection_PCollection_14:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:370>)_21}:
0,
ref_PCollection_PCollection_27:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0,
pcollection_1:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_14}: 3,
ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0,
pcollection_1:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/Merge}:
0,
ref_PCollection_PCollection_14:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 0,
ref_PCollection_PCollection_17:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_22}: 1,
ref_PCollection_PCollection_27:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_28}: 1,
ref_PCollection_PCollection_17:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0,
ref_PCollection_PCollection_27:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_29}: 1,
ref_PCollection_PCollection_12:beam:metric:element_count:v1
{PCOLLECTION=external_2root/Init/Map/ParMultiDo(Anonymous).output}: 6,
ref_PCollection_PCollection_27:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_27}: 1,
pcollection_1:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/read/pcollection_1:0}: 0,
ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0,
ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0,
ref_PCollection_PCollection_17:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at
core.py:2466>)_26}: 0,
ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}:
0,
ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0,
ref_PCollection_PCollection_9:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:366>)_18}:
11,
ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}:
0,
ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0,
ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0,
ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/write/pcollection:0}: 0,
ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0,
ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0,
ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:371>)_22}:
0,
ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/write/pcollection:0}: 0,
ref_PCollection_PCollection_12:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_13}: 6,
ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0,
ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=fn/write/pcollection:0}: 0,
ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(unicode)_17}: 0,
ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0,
ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 0,
ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0,
ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 0,
ref_PCollection_PCollection_14:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_14}: 3,
ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_12:0}: 0,
ref_PCollection_PCollection_9:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_9}: 12,
ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:371>)_22}:
0, ref_PCollection_PCollection_14:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_16}: 3,
ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0,
ref_PCollection_PCollection_14:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_15}: 3,
ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0,
ref_PCollection_PCollection_1:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0,
ref_PCollection_PCollection_17:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_34}: 0,
ref_PCollection_PCollection_12:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_12}: 12,
ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0,
ref_PCollection_PCollection_9:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0,
pcollection_1:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ExternalTransform(beam:transforms:xlang:count)/Combine.perKey(Count)/ExtractOutputs}:
0,
ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at
core.py:2466>)_26}: 0,
ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_40}: 0,
ref_PCollection_PCollection_14:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_24:0}: 3,
ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 0,
ref_PCollection_PCollection_1:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2466>)_4}:
2,
ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 0,
ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:366>)_18}:
0,
ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0,
ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0,
ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0,
ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_33}: 0,
ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0,
ref_PCollection_PCollection_27:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 0,
ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0,
ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at
core.py:2466>)_26}: 0,
ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at
core.py:2466>)_26}: 0,
ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0,
ref_PCollection_PCollection_9:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0,
ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0,
ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_28}: 0,
ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:366>)_18}:
11,
ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_32}: 2,
ref_PCollection_PCollection_14:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_23}: 3,
ref_PCollection_PCollection_27:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Match_41}: 0,
ref_PCollection_PCollection_14:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_20}: 3,
ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2466>)_4}:
2, pcollection_1:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_14:0}: 0,
ref_PCollection_PCollection_14:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_21}: 3,
ref_PCollection_PCollection_14:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Map(<lambda at external_test.py:371>)_22}:
0, ref_PCollection_PCollection_9:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_11}: 12,
ref_PCollection_PCollection_9:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_12}: 12,
ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2466>)_4}:
0, ref_PCollection_PCollection_9:beam:metric:element_count:v1
{PCOLLECTION=ref_PCollection_PCollection_10}: 12,
ref_PCollection_PCollection_27:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_39}:
0,
ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0,
ref_PCollection_PCollection_14:beam:metric:ptransform_execution_time:total_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_29}: 0,
ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=external_1root/ParDo(Anonymous)/ParMultiDo(Anonymous)}: 0,
ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0,
ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=external_1root/ParDo(Anonymous)/ParMultiDo(Anonymous)}: 0,
ref_PCollection_PCollection_12:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=external_1root/ParDo(Anonymous)/ParMultiDo(Anonymous)}: 0,
ref_PCollection_PCollection_9:beam:metric:pardo_execution_time:start_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0,
ref_PCollection_PCollection_1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
{PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2466>)_4}:
0,
ref_PCollection_PCollection_17:beam:metric:pardo_execution_time:process_bundle_msecs:v1
{PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}:
0)Distributions(ref_PCollection_PCollection_14:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_16}: DistributionResult{sum=54,
count=3, min=18, max=18},
ref_PCollection_PCollection_14:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_15}: DistributionResult{sum=51,
count=3, min=17, max=17},
ref_PCollection_PCollection_14:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_14}: DistributionResult{sum=45,
count=3, min=15, max=15},
ref_PCollection_PCollection_27:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_30}: DistributionResult{sum=14,
count=1, min=14, max=14},
ref_PCollection_PCollection_17:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_24:1}: DistributionResult{sum=19,
count=1, min=19, max=19},
ref_PCollection_PCollection_17:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_17}: DistributionResult{sum=13,
count=1, min=13, max=13},
ref_PCollection_PCollection_17:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_18}: DistributionResult{sum=16,
count=1, min=16, max=16},
ref_PCollection_PCollection_14:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_24:0}: DistributionResult{sum=72,
count=3, min=24, max=24},
ref_PCollection_PCollection_17:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_19}: DistributionResult{sum=15,
count=1, min=15, max=15},
ref_PCollection_PCollection_9:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=192,
count=12, min=16, max=16},
ref_PCollection_PCollection_17:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_22}: DistributionResult{sum=17,
count=1, min=17, max=17},
ref_PCollection_PCollection_9:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_10}: DistributionResult{sum=168,
count=12, min=14, max=14},
ref_PCollection_PCollection_9:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=168,
count=12, min=14, max=14},
ref_PCollection_PCollection_9:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_11}: DistributionResult{sum=168,
count=12, min=14, max=14},
ref_PCollection_PCollection_1:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_1}: DistributionResult{sum=13,
count=1, min=13, max=13},
ref_PCollection_PCollection_27:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_27}: DistributionResult{sum=58,
count=1, min=58, max=58},
ref_PCollection_PCollection_27:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_28}: DistributionResult{sum=41,
count=1, min=41, max=41},
ref_PCollection_PCollection_14:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_23}: DistributionResult{sum=63,
count=3, min=21, max=21},
ref_PCollection_PCollection_27:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_29}: DistributionResult{sum=33,
count=1, min=33, max=33},
ref_PCollection_PCollection_1:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_2}: DistributionResult{sum=180,
count=12, min=15, max=15},
ref_PCollection_PCollection_14:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_21}: DistributionResult{sum=57,
count=3, min=19, max=19},
ref_PCollection_PCollection_14:beam:metric:sampled_byte_size:v1
{PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=54,
count=3, min=18, max=18}))
[flink-runner-job-invoker] INFO
org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService
- Manifest at
/tmp/beam-artifact-staging/job_cfc1da77-18e6-4829-9e00-e328d5a1528a/MANIFEST
has 0 artifact locations
[flink-runner-job-invoker] INFO
org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService
- Removed dir
/tmp/beam-artifact-staging/job_cfc1da77-18e6-4829-9e00-e328d5a1528a/
> Task :sdks:python:test-suites:portable:py2:crossLanguageTests
> Task :sdks:python:test-suites:direct:py2:mongodbioIT
INFO:root:Writing 100000 documents to mongodb finished in 61.205 seconds
INFO:root:Missing pipeline option (runner). Executing pipeline using the
default runner: DirectRunner.
INFO:root:Reading from mongodb beam_mongodbio_it_db:integration_test_1568376477
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/mongodbio_it_test.py>:83:
FutureWarning: ReadFromMongoDB is experimental.
| 'Map' >> beam.Map(lambda doc: doc['number'])
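The FutureWarning above comes from the experimental ReadFromMongoDB transform this integration test exercises. As a rough usage sketch only (the uri/db/coll argument names are the ones this transform is generally documented with, and the collection name here is a placeholder rather than the timestamped one from this run):

    import apache_beam as beam
    from apache_beam.io.mongodbio import ReadFromMongoDB

    # Minimal read pipeline mirroring the 'Map' step quoted in the warning above.
    with beam.Pipeline() as p:
        numbers = (
            p
            | 'ReadFromMongoDB' >> ReadFromMongoDB(uri='mongodb://localhost:27017',
                                                   db='beam_mongodbio_it_db',
                                                   coll='integration_test_example')
            | 'Map' >> beam.Map(lambda doc: doc['number']))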
INFO:root:==================== <function annotate_downstream_side_inputs at
0x7f0daf36d050> ====================
INFO:root:==================== <function fix_side_input_pcoll_coders at
0x7f0daf36d140> ====================
INFO:root:==================== <function lift_combiners at 0x7f0daf36d1b8>
====================
INFO:root:==================== <function expand_sdf at 0x7f0daf36d230>
====================
INFO:root:==================== <function expand_gbk at 0x7f0daf36d2a8>
====================
INFO:root:==================== <function sink_flattens at 0x7f0daf36d398>
====================
INFO:root:==================== <function greedily_fuse at 0x7f0daf36d410>
====================
INFO:root:==================== <function read_to_impulse at 0x7f0daf36d488>
====================
INFO:root:==================== <function impulse_to_input at 0x7f0daf36d500>
====================
INFO:root:==================== <function inject_timer_pcollections at
0x7f0daf36d668> ====================
INFO:root:==================== <function sort_stages at 0x7f0daf36d6e0>
====================
INFO:root:==================== <function window_pcollection_coders at
0x7f0daf36d758> ====================
INFO:root:Running
((ref_AppliedPTransform_ReadFromMongoDB/Read_3)+((ref_AppliedPTransform_Map_4)+((ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_8)+((ref_AppliedPTransform_assert_that/ToVoidKey_9)+((ref_AppliedPTransform_assert_that/Group/pair_with_1_12)+(assert_that/Group/Flatten/Transcode/1))))))+(assert_that/Group/Flatten/Write/1)
INFO:root:Running
((ref_AppliedPTransform_assert_that/Create/Read_7)+((ref_AppliedPTransform_assert_that/Group/pair_with_0_11)+(assert_that/Group/Flatten/Transcode/0)))+(assert_that/Group/Flatten/Write/0)
INFO:root:Running
(assert_that/Group/Flatten/Read)+(assert_that/Group/GroupByKey/Write)
INFO:root:Running
(assert_that/Group/GroupByKey/Read)+((ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_18)+((ref_AppliedPTransform_assert_that/Unkey_19)+(ref_AppliedPTransform_assert_that/Match_20)))
INFO:root:Read 100000 documents from mongodb finished in 21.418 seconds
mongoioit27346
mongoioit27346
> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_bigquery_tornadoes_it
(apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT)
... ok
test_datastore_wordcount_it
(apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT)
... ok
test_autocomplete_it
(apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_leader_board_it
(apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit
(apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT)
... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:696:
BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use
WriteToBigQuery instead.
kms_key=transform.kms_key))
test_game_stats_it
(apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:793:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
temp_location = p.options.view_as(GoogleCloudOptions).temp_location
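As an aside on the BeamDeprecationWarning entries repeated in this output: the warning is about reading options back through <pipeline>.options. One way to avoid that access, sketched with standard apache_beam classes (the temp_location value is a made-up placeholder, not taken from this build):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

    # Keep a reference to the options you construct and read settings from it,
    # rather than going through p.options as the warned-about code does.
    options = PipelineOptions(['--temp_location=gs://example-bucket/tmp'])
    with beam.Pipeline(options=options) as p:
        temp_location = options.view_as(GoogleCloudOptions).temp_location
        _ = p | beam.Create([temp_location]) | beam.Map(print)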
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
... ok
test_user_score_it
(apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:577:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
experiments = p.options.view_as(DebugOptions).experiments or []
test_streaming_wordcount_it
(apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:793:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_hourly_team_score_it
(apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT)
... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1142:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:696:
BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use
WriteToBigQuery instead.
kms_key=transform.kms_key))
test_multiple_destinations_transform
(apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
... ok
test_value_provider_transform
(apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
... ok
test_bigquery_read_1M_python
(apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232:
FutureWarning: MatchAll is experimental.
| 'GetPath' >> beam.Map(lambda metadata: metadata.path))
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ...
ok
test_copy_batch
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest)
... ok
test_copy_rewrite_token
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243:
FutureWarning: MatchAll is experimental.
| 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243:
FutureWarning: ReadMatches is experimental.
| 'Checksums' >> beam.Map(compute_hash))
test_bqfl_streaming
(apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP:
TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform
(apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail
(apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_big_query_read
(apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types
(apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73:
BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use
WriteToBigQuery instead.
kms_key=kms_key))
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP:
https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ...
ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_streaming_data_only
(apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes
(apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_big_query_legacy_sql
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT)
... ok
test_big_query_new_types
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT)
... ok
test_big_query_standard_sql
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT)
... ok
test_big_query_standard_sql_kms_key_native
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT)
... ok
test_big_query_write
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ...
SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_job_python_from_python_it
(apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it
(apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
... ok
test_metrics_it
(apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
... ok
test_datastore_write_limit
(apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok
======================================================================
ERROR: Failure: SyntaxError (invalid syntax (external_test_py37.py, line 46))
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/loader.py>", line 418, in loadTestsFromName
    addr.filename, addr.module)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/importer.py>", line 47, in importFromPath
    return self.importFromDir(dir_path, fqname)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/importer.py>", line 94, in importFromDir
    mod = load_module(part_fqname, fh, filename, desc)
SyntaxError: invalid syntax (external_test_py37.py, line 46)
-------------------- >> begin captured logging << --------------------
root: INFO: Generating grammar tables from
/usr/lib/python2.7/lib2to3/Grammar.txt
root: INFO: Generating grammar tables from
/usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: WARNING: Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
root: WARNING: python-snappy is not installed; some tests will be skipped.
root: WARNING: Tensorflow is not installed, so skipping some tests.
root: WARNING: Make sure that locally built Python SDK docker image has Python
2.7 interpreter.
root: INFO: Using Python SDK docker image: apachebeam/python2.7_sdk:2.17.0.dev.
If the image is not available at local, we will try to pull from hub.docker.com
--------------------- >> end captured logging << ---------------------
----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML:
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 46 tests in 4483.201s
FAILED (SKIP=4, errors=1)
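About the single error above: the python2.7 site-packages paths in the traceback and the _py37 suffix of the failing module suggest a test file written for Python 3 being compiled by the Python 2.7 interpreter running this suite. A minimal, hypothetical illustration of that failure mechanism (the snippet is not the real external_test_py37.py; it only shows that version-mismatched syntax fails at compile time, before any test runs):

    # Source written for one major Python version fails to compile under the
    # other, so the loader reports a collection-time SyntaxError instead of a
    # test result. Runnable under Python 3; the string below is Python 2 syntax.
    py2_only_source = "print 'hello'"
    try:
        compile(py2_only_source, 'example_module.py', 'exec')
    except SyntaxError as err:
        print('collection failed before any test ran:', err)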
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-13_05_02_17-12959189162103630116?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-13_05_20_56-4865207679632901182?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-13_05_29_05-7908112554113905855?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-13_05_39_07-11783228301079586504?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-13_05_48_30-1495407176459481416?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-13_05_57_37-12721896028240586189?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-13_06_07_50-5377495361321138995?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-13_05_02_27-16102262057661516326?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-13_05_18_36-5338721825192977456?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-13_05_27_58-10524528825535074031?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-13_05_37_54-17590406217742279394?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-13_05_02_17-14036075730840044213?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-13_05_22_26-1442102014904448158?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-13_05_42_45-915759399021997941?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-13_05_02_23-11096428499239558632?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-13_05_16_17-8747757548230436407?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-13_05_24_28-12395171802039176255?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-13_05_32_17-9978632130580224787?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-13_05_41_31-6981853279545301667?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-13_05_02_18-8551558180356228726?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-13_05_11_21-9384554267341268718?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-13_05_20_34-1168743161699797932?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-13_05_30_37-11139236275856477514?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-13_05_39_11-2684063336060725453?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-13_05_02_17-14951383946854259133?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-13_05_10_36-14257342683846763840?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-13_05_21_00-9864060451646819824?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-13_05_30_56-4383618173779004105?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-13_05_39_12-11004949126018455555?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-13_05_02_19-5317881285947318210?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-13_05_11_19-16076715236525601925?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-13_05_20_48-8799916871060524701?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-13_05_29_46-6836644043078051118?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-13_05_38_24-16996738590847101954?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-13_05_47_21-1746402225092448646?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-13_05_02_19-14194304708783170879?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-13_05_12_41-17778533074307882027?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-13_05_23_48-15324801516834289039?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-13_05_32_48-571488811945078855?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-13_05_41_34-5174242588854599988?project=apache-beam-testing
> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 1h 16m 16s
111 actionable tasks: 86 executed, 22 from cache, 3 up-to-date
Publishing build scan...
https://gradle.com/s/rcw4jiiwffq7u
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure