See 
<https://builds.apache.org/job/beam_PostCommit_Python35/1045/display/redirect?page=changes>

Changes:

[robertwb] [BEAM-8629] Don't return mutable class type hints.


------------------------------------------
[...truncated 495.13 KB...]
[flink-akka.actor.default-dispatcher-13] INFO 
org.apache.flink.runtime.io.disk.FileChannelManagerImpl - FileChannelManager 
removed spill file directory /tmp/flink-io-baeea3c2-c12a-4dfa-b769-e28f7ab5ae26
[flink-akka.actor.default-dispatcher-13] INFO 
org.apache.flink.runtime.io.network.NettyShuffleEnvironment - Shutting down the 
network environment and its components.
[flink-akka.actor.default-dispatcher-13] INFO 
org.apache.flink.runtime.io.disk.FileChannelManagerImpl - FileChannelManager 
removed spill file directory 
/tmp/flink-netty-shuffle-b50501ff-d850-4659-9648-30f030c52e10
[flink-akka.actor.default-dispatcher-13] INFO 
org.apache.flink.runtime.taskexecutor.KvStateService - Shutting down the 
kvState service and its components.
[flink-akka.actor.default-dispatcher-13] INFO 
org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader 
service.
[flink-akka.actor.default-dispatcher-13] INFO 
org.apache.flink.runtime.filecache.FileCache - removed file cache directory 
/tmp/flink-dist-cache-81e5b6c3-9002-4112-9fa4-29d5c37ab48e
[flink-akka.actor.default-dispatcher-13] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopped TaskExecutor 
akka://flink/user/taskmanager_0.
[ForkJoinPool.commonPool-worker-9] INFO 
org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Removing cache 
directory /tmp/flink-web-ui
[ForkJoinPool.commonPool-worker-9] INFO 
org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shut down complete.
[flink-akka.actor.default-dispatcher-13] INFO 
org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Shut down 
cluster because application is in CANCELED, diagnostics 
DispatcherResourceManagerComponent has been closed..
[flink-akka.actor.default-dispatcher-13] INFO 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping dispatcher 
akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-13] INFO 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping all 
currently running jobs of dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-12] INFO 
org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Closing 
the SlotManager.
[flink-akka.actor.default-dispatcher-12] INFO 
org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - 
Suspending the SlotManager.
[flink-akka.actor.default-dispatcher-13] INFO 
org.apache.flink.runtime.rest.handler.legacy.backpressure.StackTraceSampleCoordinator
 - Shutting down stack trace sample coordinator.
[flink-akka.actor.default-dispatcher-13] INFO 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopped dispatcher 
akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-13] INFO 
org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - 
Shutting down remote daemon.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - 
Remote daemon shut down; proceeding with flushing remote transports.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - 
Remoting shut down.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - 
Stopping Akka RPC service.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - 
Stopped Akka RPC service.
[flink-akka.actor.default-dispatcher-12] INFO 
org.apache.flink.runtime.blob.PermanentBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-12] INFO 
org.apache.flink.runtime.blob.TransientBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-12] INFO 
org.apache.flink.runtime.blob.BlobServer - Stopped BLOB server at 0.0.0.0:44731
[flink-akka.actor.default-dispatcher-12] INFO 
org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-runner-job-invoker] INFO 
org.apache.beam.runners.flink.FlinkPipelineRunner - Execution finished in 40753 
msecs
[flink-runner-job-invoker] INFO 
org.apache.beam.runners.flink.FlinkPipelineRunner - Final accumulator values:
[flink-runner-job-invoker] INFO 
org.apache.beam.runners.flink.FlinkPipelineRunner - __metricscontainers : 
MetricQueryResults(Counters(19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_format_24}: 0, 
6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_35}: 0, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=fn/write/ref_PCollection_PCollection_29:0}: 0, 
36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:element_count:v1
 {PCOLLECTION=ref_PCollection_PCollection_9}: 1, 
36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda 
at core.py:2532>)_30}: 0, 
36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_18}: 0, 
36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_18}: 0, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:element_count:v1
 {PCOLLECTION=ref_PCollection_PCollection_29}: 0, 
36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_18:0}: 0, 
19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=fn/write/ref_PCollection_PCollection_17:0}: 0, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=fn/write/ref_PCollection_PCollection_29:0}: 0, 
36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_18:0}: 0, 
36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_18}: 0, 
19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_15:0}: 0, 
36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 
36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda 
at core.py:2532>)_30}: 0, 
36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:user
 {NAMESPACE=__main__.WordExtractingDoFn, 
PTRANSFORM=ref_AppliedPTransform_split_17, NAME=words}: 96, 
6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_35}: 0, 
17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_read/Read/Split_5}: 0, 
46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=fn/write/ref_PCollection_PCollection_28:0}: 0, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 
19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_format_24}: 0, 
6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
 
{PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_36}:
 0, 
36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_18:0}: 0, 
17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_read/Read/Split_5}: 0, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_42}: 0, 
36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_18:0}: 0, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_42}: 0, 
19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_15:0}: 0, 
19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_15:0}: 0, 
6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_35}: 0, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 
46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=fn/write/ref_PCollection_PCollection_28:0}: 0, 
19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_count_23}: 0, 
46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_41}: 0, 
36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 
36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
 
{PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_32}: 
0, 
36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 
36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_split_17}: 0, 
19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=fn/write/ref_PCollection_PCollection_17:0}: 0, 
19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 
{PCOLLECTION=ref_PCollection_PCollection_15}: 44, 
36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:element_count:v1
 {PCOLLECTION=ref_PCollection_PCollection_10}: 27, 
6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, 
19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 
{PCOLLECTION=ref_PCollection_PCollection_16}: 44, 
36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_split_17}: 0, 
36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:element_count:v1
 {PCOLLECTION=ref_PCollection_PCollection_11}: 96, 
36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 
19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_format_24}: 0, 
19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 
{PCOLLECTION=ref_PCollection_PCollection_17}: 44, 
19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_count_23}: 0, 
36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:element_count:v1
 {PCOLLECTION=ref_PCollection_PCollection_12}: 96, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_42}: 0, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43}: 0, 
19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=fn/write/ref_PCollection_PCollection_17:0}: 0, 
17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 
{PCOLLECTION=ref_PCollection_PCollection_2}: 1, 
17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 
{PCOLLECTION=ref_PCollection_PCollection_1}: 1, 
6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 
{PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_36}:
 0, 
6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 
{PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_36}:
 0, 
6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_34}: 0, 
17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 
36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_split_17}: 0, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_33}: 
0, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:element_count:v1
 {PCOLLECTION=ref_PCollection_PCollection_20}: 1, 
19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_15:0}: 0, 
46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_41}: 0, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 
19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_count_23}: 0, 
36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_split_17}: 0, 
46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_41}: 0, 
36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_read/Read/ReadSplits_16}: 0, 
36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 
6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 
46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Extract_41}: 0, 
6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 
{PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WindowInto(WindowIntoFn)_36}:
 0, 
19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_count_23}: 0, 
6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0}: 0, 
36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda 
at core.py:2532>)_30}: 0, 
36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_pair_with_one_18}: 0, 
6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0}: 0, 
6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0}: 0, 
36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 
6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, 
17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:element_count:v1
 {PCOLLECTION=ref_PCollection_PCollection_20}: 1, 
36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_read/Read/ReadSplits_16}: 0, 
36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_read/Read/ReadSplits_16}: 0, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=fn/write/ref_PCollection_PCollection_29:0}: 0, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:element_count:v1
 {PCOLLECTION=ref_PCollection_PCollection_21}: 1, 
36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_read/Read/ReadSplits_16}: 0, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43}: 0, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 
46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 
6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0}: 0, 
6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 
{PCOLLECTION=ref_PCollection_PCollection_17}: 44, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_33}: 
0, 
46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 
17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 
36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/FlatMap(<lambda 
at core.py:2532>)_30}: 0, 
36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 
{PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_32}: 
0, 
36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=fn/write/ref_PCollection_PCollection_20:0}: 0, 
36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1
 {PCOLLECTION=ref_PCollection_PCollection_18}: 1, 
36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 
36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1
 {PCOLLECTION=ref_PCollection_PCollection_19}: 1, 
36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 
17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_read/Read/Split_5}: 0, 
36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 
6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_34}: 0, 
36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 
6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 
{PCOLLECTION=ref_PCollection_PCollection_24}: 2, 
6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_34}: 0, 
6format.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/WriteBundles_34}: 0, 
17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 
36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 
19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=fn/write/ref_PCollection_PCollection_17:0}: 0, 
6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, 
6format.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/Pair_35}: 0, 
17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_33}: 
0, 
46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=fn/write/ref_PCollection_PCollection_28:0}: 0, 
36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:user
 {NAMESPACE=__main__.WordExtractingDoFn, 
PTRANSFORM=ref_AppliedPTransform_split_17, NAME=empty_lines}: 2, 
46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 
17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:element_count:v1
 {PCOLLECTION=ref_PCollection_PCollection_20}: 1, 
17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 
46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=fn/write/ref_PCollection_PCollection_28:0}: 0, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 
6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 
{PCOLLECTION=ref_PCollection_PCollection_23}: 2, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_20:0}: 0, 
6format.None/beam:env:external:v1:0:beam:metric:element_count:v1 
{PCOLLECTION=ref_PCollection_PCollection_22}: 2, 
17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 
36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:user
 {NAMESPACE=__main__.WordExtractingDoFn, 
PTRANSFORM=ref_AppliedPTransform_split_17, NAME=word_lengths}: 298, 
46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1
 {PCOLLECTION=ref_PCollection_PCollection_27}: 1, 
46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1
 {PCOLLECTION=ref_PCollection_PCollection_28}: 2, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43}: 0, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 
19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_format_24}: 0, 
36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 
{PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_32}: 
0, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=fn/write/ref_PCollection_PCollection_21:0}: 0, 
36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1
 {PCOLLECTION=ref_PCollection_PCollection_20}: 1, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=fn/write/ref_PCollection_PCollection_29:0}: 0, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:element_count:v1
 {PCOLLECTION=ref_PCollection_PCollection_30}: 2, 
17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_read/Read/Split_5}: 0, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:pardo_execution_time:start_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_43}: 0, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/InitializeWrite_33}: 
0, 
36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 
{PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/DoOnce/Map(decode)_32}: 
0, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:ptransform_execution_time:total_msecs:v1
 {PTRANSFORM=ref_AppliedPTransform_write/Write/WriteImpl/PreFinalize_42}: 0, 
46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1
 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 
0)Distributions(19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
 {PCOLLECTION=ref_PCollection_PCollection_17}: DistributionResult{sum=685, 
count=34, min=18, max=27}, 
19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
 {PCOLLECTION=ref_PCollection_PCollection_15}: DistributionResult{sum=869, 
count=37, min=20, max=29}, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:sampled_byte_size:v1
 {PCOLLECTION=ref_PCollection_PCollection_29}: DistributionResult{sum=0, 
count=0, min=9223372036854775807, max=-9223372036854775808}, 
19group/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
 {PCOLLECTION=ref_PCollection_PCollection_16}: DistributionResult{sum=697, 
count=36, min=17, max=26}, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:sampled_byte_size:v1
 {PCOLLECTION=ref_PCollection_PCollection_30}: DistributionResult{sum=106, 
count=2, min=53, max=53}, 
36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
 {PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=14, 
count=1, min=14, max=14}, 
17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
 {PCOLLECTION=ref_PCollection_PCollection_2}: DistributionResult{sum=684, 
count=1, min=684, max=684}, 
17read/Read/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
 {PCOLLECTION=ref_PCollection_PCollection_1}: DistributionResult{sum=13, 
count=1, min=13, max=13}, 
6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 
{PCOLLECTION=ref_PCollection_PCollection_17}: DistributionResult{sum=827, 
count=39, min=19, max=25}, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:2:beam:metric:sampled_byte_size:v1
 {PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=15, 
count=1, min=15, max=15}, 
6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 
{PCOLLECTION=ref_PCollection_PCollection_22}: DistributionResult{sum=276, 
count=2, min=138, max=138}, 
6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 
{PCOLLECTION=ref_PCollection_PCollection_23}: DistributionResult{sum=278, 
count=2, min=139, max=139}, 
36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
 {PCOLLECTION=ref_PCollection_PCollection_18}: DistributionResult{sum=13, 
count=1, min=13, max=13}, 
36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:user_distribution
 {NAMESPACE=__main__.WordExtractingDoFn, 
PTRANSFORM=ref_AppliedPTransform_split_17, NAME=word_len_dist}: 
DistributionResult{sum=298, count=96, min=1, max=10}, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:1:beam:metric:sampled_byte_size:v1
 {PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=15, 
count=1, min=15, max=15}, 
36write/Write/WriteImpl/DoOnce/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
 {PCOLLECTION=ref_PCollection_PCollection_19}: DistributionResult{sum=15, 
count=1, min=15, max=15}, 
46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
 {PCOLLECTION=ref_PCollection_PCollection_28}: DistributionResult{sum=276, 
count=2, min=138, max=138}, 
46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
 {PCOLLECTION=ref_PCollection_PCollection_27}: DistributionResult{sum=271, 
count=1, min=271, max=271}, 
36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
 {PCOLLECTION=ref_PCollection_PCollection_10}: DistributionResult{sum=676, 
count=19, min=14, max=84}, 
36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
 {PCOLLECTION=ref_PCollection_PCollection_11}: DistributionResult{sum=469, 
count=28, min=14, max=23}, 
36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=639, 
count=34, min=16, max=22}, 
36read/Read/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=686, 
count=1, min=686, max=686}, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
 {PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=15, 
count=1, min=15, max=15}, 
6format.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 
{PCOLLECTION=ref_PCollection_PCollection_24}: DistributionResult{sum=278, 
count=2, min=139, max=139}, 
40write/Write/WriteImpl/DoOnce/Map(decode).None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1
 {PCOLLECTION=ref_PCollection_PCollection_21}: DistributionResult{sum=81, 
count=1, min=81, max=81}))
[flink-runner-job-invoker] INFO 
org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService - 
Manifest at 
/tmp/beam-tempuehyocfh/artifactsd1mrcu6h/job_36f3ea65-ec2b-4bcd-9e69-9621fb8ec456/MANIFEST
 has 1 artifact locations
[flink-runner-job-invoker] INFO 
org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService
 - Removed dir 
/tmp/beam-tempuehyocfh/artifactsd1mrcu6h/job_36f3ea65-ec2b-4bcd-9e69-9621fb8ec456/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
[grpc-default-executor-0] INFO 
org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Getting 
job metrics for 
BeamApp-jenkins-1120233226-477352c1_ddc73667-a033-433b-ba08-b21515d349a7
[grpc-default-executor-0] INFO 
org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Finished 
getting job metrics for 
BeamApp-jenkins-1120233226-477352c1_ddc73667-a033-433b-ba08-b21515d349a7
INFO:root:number of empty lines: 2
INFO:root:average word length: 3

> Task :sdks:python:test-suites:portable:py35:portableWordCountSparkRunnerBatch
Traceback (most recent call last):
  File "/usr/lib/python3.5/runpy.py", line 184, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.5/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/examples/wordcount.py>", line 138, in <module>
    run()
  File "<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/examples/wordcount.py>", line 117, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/pipeline.py>", line 416, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/pipeline.py>", line 429, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/portability/spark_runner.py>", line 41, in run_pipeline
    return super(SparkRunner, self).run_pipeline(pipeline, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py>", line 207, in run_pipeline
    job_service = self.create_job_service(options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py>", line 126, in create_job_service
    return server.start()
  File "<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/portability/job_server.py>", line 81, in start
    self._endpoint = self._job_server.start()
  File "<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/portability/job_server.py>", line 106, in start
    cmd, endpoint = self.subprocess_cmd_and_endpoint()
  File "<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/portability/job_server.py>", line 142, in subprocess_cmd_and_endpoint
    jar_path = self.local_jar(self.path_to_jar())
  File "<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/portability/spark_runner.py>", line 74, in path_to_jar
    return self.path_to_beam_jar('runners:spark:job-server:shadowJar')
  File "<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/portability/job_server.py>", line 135, in path_to_beam_jar
    return subprocess_server.JavaJarServer.path_to_beam_jar(gradle_target)
  File "<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/utils/subprocess_server.py>", line 181, in path_to_beam_jar
    local_path, os.path.abspath(project_root), gradle_target))
RuntimeError: <https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/runners/spark/job-server/build/libs/beam-runners-spark-job-server-2.18.0-SNAPSHOT.jar> not found. Please build the server with
  cd <https://builds.apache.org/job/beam_PostCommit_Python35/ws/src>; ./gradlew runners:spark:job-server:shadowJar
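
In other words, the Spark job-server shadow jar was never produced (the :runners:spark:compileJava failure below is why), so the portable runner could not launch a job server. A minimal sketch of the remediation the error message asks for, assuming a local checkout of the Beam source tree at ~/beam (a hypothetical path), would be:

  # Build the Spark job-server fat jar that the Python portable runner shells out to
  cd ~/beam
  ./gradlew runners:spark:job-server:shadowJar
  # On success the jar lands under runners/spark/job-server/build/libs/,
  # which is where path_to_beam_jar('runners:spark:job-server:shadowJar') looks for it.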

> Task :sdks:python:test-suites:portable:py35:portableWordCountSparkRunnerBatch FAILED

> Task :sdks:python:test-suites:dataflow:py35:postCommitIT
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1214:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-20_15_32_04-15563250395635261010?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-20_15_47_05-281933988669512771?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-20_15_55_18-5353500117240609507?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-20_16_04_03-14110129815679263591?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-20_16_12_56-252446799979548892?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-20_15_31_59-15798608179562148260?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-20_15_55_49-11942784146857243433?project=apache-beam-testing
  kms_key=transform.kms_key))
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-20_16_04_23-2476868987929868641?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-20_16_13_08-14946170986505089516?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-20_15_32_04-15367264383800084776?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1214:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-20_15_45_01-14687264919397876105?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-20_15_53_47-8622218975425600769?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-20_15_31_59-7998699161568176427?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-20_15_51_30-4993858798828681140?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-20_16_00_32-520334318899753153?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1214:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-20_16_08_59-17424110764067131210?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296:
 FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307:
 FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307:
 FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-20_15_31_59-184526070992045160?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-20_15_41_48-13153415947818054752?project=apache-beam-testing
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-20_15_50_32-8260130746901616187?project=apache-beam-testing
  kms_key=transform.kms_key))
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-20_15_58_41-15355216112505040235?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-20_16_07_16-17248847361722296554?project=apache-beam-testing
  kms_key=kms_key))
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-20_15_32_01-10770152453577468194?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-20_15_40_48-4813960071900262035?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:726:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-20_15_49_51-14223051140850541536?project=apache-beam-testing
  kms_key=transform.kms_key))
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-20_15_59_29-18012552685685294769?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:651:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-20_16_06_49-17628310658780746797?project=apache-beam-testing
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1214:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-20_15_32_07-10405490595706048236?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-20_15_41_02-10403129892949528863?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-20_15_49_51-6532777631342614323?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-20_15_59_33-9026142265270782805?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-20_16_09_06-14964326018494862470?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-20_16_18_24-4158402351664624040?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-20_15_32_02-12371271199345063453?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-20_15_42_20-15506805463927216599?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-20_15_52_34-15643361537081452918?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-20_16_01_38-14866272517311609755?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-20_16_11_08-78272464669386476?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1214:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1214:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_datastore_wordcount_it 
(apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT)
 ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... SKIP: Due 
to a known issue in avro-python3 package, this test is skipped until BEAM-6522 
is addressed. 
test_bigquery_tornadoes_it 
(apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) 
... ok
test_streaming_wordcount_it 
(apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_autocomplete_it 
(apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) 
... ok
test_leader_board_it 
(apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it 
(apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_user_score_it 
(apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_bigquery_read_1M_python 
(apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_hourly_team_score_it 
(apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT)
 ... ok
test_multiple_destinations_transform 
(apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
 ... ok
test_datastore_write_limit 
(apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... SKIP: This 
test still needs to be fixed on Python 3. TODO: BEAM-4543
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... 
ok
test_copy_batch 
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms 
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token 
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) 
... ok
test_copy_rewrite_token 
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_value_provider_transform 
(apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
 ... ok
test_bqfl_streaming 
(apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: 
TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform 
(apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail 
(apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_big_query_read 
(apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types 
(apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: 
https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... 
ok
test_streaming_data_only 
(apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes 
(apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... ok
test_big_query_new_types 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... ok
test_big_query_standard_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... ok
test_big_query_standard_sql_kms_key_native 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it 
(apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_big_query_write 
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types 
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect 
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... 
SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema 
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_metrics_fnapi_it 
(apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
 ... ok
test_metrics_it 
(apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
 ... ok
test_datastore_write_limit 
(apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) 
... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py35.xml
----------------------------------------------------------------------
XML: 
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3322.959s

OK (SKIP=6)

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:compileJava'.
> Compilation failed with exit code 1; see the compiler error output for details.

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task 
':sdks:python:test-suites:portable:py35:portableWordCountSparkRunnerBatch'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 56m 34s
81 actionable tasks: 61 executed, 20 from cache

Publishing build scan...
https://gradle.com/s/qacn56llcdjrg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
