See <https://ci-beam.apache.org/job/beam_PostCommit_Python36/3529/display/redirect?page=changes>

Changes:

[Kyle Weaver] [BEAM-10925] Don't publish udf-test-provider to Maven.

[noreply] [BEAM-11780] Use vendored cloudbuild python client. (#13933)

[noreply] [BEAM-11804] Remove vendors/sdk-java-extensions-protobuf (#13968)

[noreply] [BEAM-7372][BEAM-9372] cleanup python 2.x and 3.5 codepaths (#13913)


------------------------------------------
[...truncated 29.18 MB...]
INFO:apache_beam.utils.subprocess_server:b'\tat org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:160)'
INFO:apache_beam.utils.subprocess_server:b'\tat org.apache.beam.vendor.grpc.v1p26p0.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:251)'
INFO:apache_beam.utils.subprocess_server:b'\tat org.apache.beam.vendor.grpc.v1p26p0.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)'
INFO:apache_beam.utils.subprocess_server:b'\tat org.apache.beam.vendor.grpc.v1p26p0.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)'
INFO:apache_beam.utils.subprocess_server:b'\tat org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailableInternal(ServerCallImpl.java:309)'
INFO:apache_beam.utils.subprocess_server:b'\tat org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:292)'
INFO:apache_beam.utils.subprocess_server:b'\tat org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:782)'
INFO:apache_beam.utils.subprocess_server:b'\tat org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)'
INFO:apache_beam.utils.subprocess_server:b'\tat org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)'
INFO:apache_beam.utils.subprocess_server:b'\tat java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)'
INFO:apache_beam.utils.subprocess_server:b'\tat java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)'
INFO:apache_beam.utils.subprocess_server:b'\t... 1 more'
INFO:apache_beam.utils.subprocess_server:b''
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:03 AM org.apache.flink.runtime.rest.handler.legacy.backpressure.BackPressureRequestCoordinator shutDown'
INFO:apache_beam.utils.subprocess_server:b'INFO: Shutting down back pressure request coordinator.'
ERROR:root:java.lang.RuntimeException: Error received from SDK harness for instruction 48: org.apache.beam.sdk.util.UserCodeException: java.lang.IllegalArgumentException: Multiple entries with same key: user-agent=Apache_Beam_Java/2.29.0-SNAPSHOT and user-agent=spanner-java/
        at org.apache.beam.sdk.util.UserCodeException.wrap(UserCodeException.java:39)
        at org.apache.beam.sdk.io.gcp.spanner.ReadSpannerSchema$DoFnInvoker.invokeSetup(Unknown Source)
        at org.apache.beam.fn.harness.FnApiDoFnRunner.<init>(FnApiDoFnRunner.java:473)
        at org.apache.beam.fn.harness.FnApiDoFnRunner$Factory.createRunnerForPTransform(FnApiDoFnRunner.java:183)
        at org.apache.beam.fn.harness.FnApiDoFnRunner$Factory.createRunnerForPTransform(FnApiDoFnRunner.java:157)
        at org.apache.beam.fn.harness.control.ProcessBundleHandler.createRunnerAndConsumersForPTransformRecursively(ProcessBundleHandler.java:247)
        at org.apache.beam.fn.harness.control.ProcessBundleHandler.createRunnerAndConsumersForPTransformRecursively(ProcessBundleHandler.java:208)
        at org.apache.beam.fn.harness.control.ProcessBundleHandler.createRunnerAndConsumersForPTransformRecursively(ProcessBundleHandler.java:208)
        at org.apache.beam.fn.harness.control.ProcessBundleHandler.createBundleProcessor(ProcessBundleHandler.java:518)
        at org.apache.beam.fn.harness.control.ProcessBundleHandler.lambda$processBundle$0(ProcessBundleHandler.java:287)
        at org.apache.beam.fn.harness.control.ProcessBundleHandler$BundleProcessorCache.get(ProcessBundleHandler.java:598)
        at org.apache.beam.fn.harness.control.ProcessBundleHandler.processBundle(ProcessBundleHandler.java:282)
        at org.apache.beam.fn.harness.control.BeamFnControlClient.delegateOnInstructionRequestType(BeamFnControlClient.java:173)
        at org.apache.beam.fn.harness.control.BeamFnControlClient.lambda$processInstructionRequests$0(BeamFnControlClient.java:157)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.IllegalArgumentException: Multiple entries with same key: user-agent=Apache_Beam_Java/2.29.0-SNAPSHOT and user-agent=spanner-java/
        at com.google.common.collect.ImmutableMap.conflictException(ImmutableMap.java:215)
        at com.google.common.collect.ImmutableMap.checkNoConflict(ImmutableMap.java:209)
        at com.google.common.collect.RegularImmutableMap.checkNoConflictInKeyBucket(RegularImmutableMap.java:147)
        at com.google.common.collect.RegularImmutableMap.fromEntryArray(RegularImmutableMap.java:110)
        at com.google.common.collect.ImmutableMap$Builder.build(ImmutableMap.java:393)
        at com.google.cloud.spanner.spi.v1.GapicSpannerRpc.<init>(GapicSpannerRpc.java:320)
        at com.google.cloud.spanner.SpannerOptions$DefaultSpannerRpcFactory.create(SpannerOptions.java:467)
        at com.google.cloud.spanner.SpannerOptions$DefaultSpannerRpcFactory.create(SpannerOptions.java:462)
        at com.google.cloud.ServiceOptions.getRpc(ServiceOptions.java:561)
        at com.google.cloud.spanner.SpannerOptions.getSpannerRpcV1(SpannerOptions.java:1169)
        at com.google.cloud.spanner.SpannerImpl.<init>(SpannerImpl.java:134)
        at com.google.cloud.spanner.SpannerOptions$DefaultSpannerFactory.create(SpannerOptions.java:457)
        at com.google.cloud.spanner.SpannerOptions$DefaultSpannerFactory.create(SpannerOptions.java:452)
        at com.google.cloud.ServiceOptions.getService(ServiceOptions.java:541)
        at org.apache.beam.sdk.io.gcp.spanner.SpannerAccessor.createAndConnect(SpannerAccessor.java:163)
        at org.apache.beam.sdk.io.gcp.spanner.SpannerAccessor.getOrCreate(SpannerAccessor.java:98)
        at org.apache.beam.sdk.io.gcp.spanner.ReadSpannerSchema.setup(ReadSpannerSchema.java:45)
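
A note on the root cause above: Guava's ImmutableMap.Builder accepts duplicate put() calls silently and only detects the conflict inside build(), which is why the clash between the Beam and Spanner client user-agent headers surfaces in GapicSpannerRpc's constructor. A minimal, self-contained sketch of that Guava behavior (illustrative demo code, not Beam code):

    import com.google.common.collect.ImmutableMap;

    public class DuplicateHeaderDemo {
      public static void main(String[] args) {
        // put() does not reject duplicate keys; build() scans the entries and
        // throws IllegalArgumentException("Multiple entries with same key: ..."),
        // the same message seen in the Caused-by section of the trace above.
        ImmutableMap.Builder<String, String> headers = ImmutableMap.builder();
        headers.put("user-agent", "Apache_Beam_Java/2.29.0-SNAPSHOT");
        headers.put("user-agent", "spanner-java/");
        headers.build();  // throws java.lang.IllegalArgumentException
      }
    }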

INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:03 AM org.apache.flink.runtime.state.TaskExecutorLocalStateStoresManager shutdown'
INFO:apache_beam.utils.subprocess_server:b'INFO: Shutting down TaskExecutorLocalStateStoresManager.'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:03 AM org.apache.flink.runtime.dispatcher.Dispatcher lambda$onStop$0'
INFO:apache_beam.utils.subprocess_server:b'INFO: Stopped dispatcher akka://flink/user/rpc/dispatcher_2.'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:03 AM org.apache.flink.runtime.io.disk.FileChannelManagerImpl lambda$getFileCloser$0'
INFO:apache_beam.utils.subprocess_server:b'INFO: FileChannelManager removed spill file directory /tmp/flink-io-50bb49d0-6bbc-4f56-a9ae-bfc3c2b19334'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:03 AM org.apache.flink.runtime.io.network.NettyShuffleEnvironment close'
INFO:apache_beam.utils.subprocess_server:b'INFO: Shutting down the network environment and its components.'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:03 AM org.apache.flink.runtime.io.disk.FileChannelManagerImpl lambda$getFileCloser$0'
INFO:apache_beam.utils.subprocess_server:b'INFO: FileChannelManager removed spill file directory /tmp/flink-netty-shuffle-07f8daae-4fe4-40fa-b313-3ce18bd032db'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:03 AM org.apache.flink.runtime.taskexecutor.KvStateService shutdown'
INFO:apache_beam.utils.subprocess_server:b'INFO: Shutting down the kvState service and its components.'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:03 AM org.apache.flink.runtime.taskexecutor.DefaultJobLeaderService stop'
INFO:apache_beam.utils.subprocess_server:b'INFO: Stop job leader service.'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:03 AM org.apache.flink.runtime.filecache.FileCache shutdown'
INFO:apache_beam.utils.subprocess_server:b'INFO: removed file cache directory /tmp/flink-dist-cache-53a3deb9-7eac-468f-8c17-ad121ff7d886'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:03 AM org.apache.flink.runtime.taskexecutor.TaskExecutor handleOnStopException'
INFO:apache_beam.utils.subprocess_server:b'INFO: Stopped TaskExecutor akka://flink/user/rpc/taskmanager_0.'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:03 AM org.apache.flink.runtime.rpc.akka.AkkaRpcService stopService'
INFO:apache_beam.utils.subprocess_server:b'INFO: Stopping Akka RPC service.'
INFO:apache_beam.runners.portability.portable_runner:Job state changed to FAILED
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:03 AM org.apache.flink.runtime.rpc.akka.AkkaRpcService stopService'
INFO:apache_beam.utils.subprocess_server:b'INFO: Stopping Akka RPC service.'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:03 AM org.apache.flink.runtime.rpc.akka.AkkaRpcService lambda$stopService$7'
INFO:apache_beam.utils.subprocess_server:b'INFO: Stopped Akka RPC service.'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:03 AM org.apache.flink.runtime.blob.AbstractBlobCache close'
INFO:apache_beam.utils.subprocess_server:b'INFO: Shutting down BLOB cache'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:03 AM org.apache.flink.runtime.blob.AbstractBlobCache close'
INFO:apache_beam.utils.subprocess_server:b'INFO: Shutting down BLOB cache'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:03 AM org.apache.flink.runtime.blob.BlobServer close'
INFO:apache_beam.utils.subprocess_server:b'INFO: Stopped BLOB server at 0.0.0.0:44201'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:03 AM org.apache.flink.runtime.rpc.akka.AkkaRpcService lambda$stopService$7'
INFO:apache_beam.utils.subprocess_server:b'INFO: Stopped Akka RPC service.'
ERROR
test_spanner_update (apache_beam.io.gcp.tests.xlang_spannerio_it_test.CrossLanguageSpannerIOTest) ... INFO:apache_beam.utils.subprocess_server:Using pre-built snapshot at <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/java/io/google-cloud-platform/expansion-service/build/libs/beam-sdks-java-io-google-cloud-platform-expansion-service-2.29.0-SNAPSHOT.jar>
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar' '<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/java/io/google-cloud-platform/expansion-service/build/libs/beam-sdks-java-io-google-cloud-platform-expansion-service-2.29.0-SNAPSHOT.jar'> '56211']
DEBUG:root:Waiting for grpc channel to be ready at localhost:56211.
INFO:apache_beam.utils.subprocess_server:b'Starting expansion service at localhost:56211'
DEBUG:root:Waiting for grpc channel to be ready at localhost:56211.
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:03 AM org.apache.beam.sdk.expansion.service.ExpansionService loadRegisteredTransforms'
INFO:apache_beam.utils.subprocess_server:b'INFO: Registering external transforms: [beam:external:java:pubsub:read:v1, beam:external:java:pubsub:write:v1, beam:external:java:spanner:insert:v1, beam:external:java:spanner:update:v1, beam:external:java:spanner:replace:v1, beam:external:java:spanner:insert_or_update:v1, beam:external:java:spanner:delete:v1, beam:external:java:spanner:read:v1, beam:external:java:generate_sequence:v1]'
INFO:apache_beam.utils.subprocess_server:b'\tbeam:external:java:pubsub:read:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$3/212628335@1f89ab83'
INFO:apache_beam.utils.subprocess_server:b'\tbeam:external:java:pubsub:write:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$3/212628335@e73f9ac'
INFO:apache_beam.utils.subprocess_server:b'\tbeam:external:java:spanner:insert:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$3/212628335@61064425'
INFO:apache_beam.utils.subprocess_server:b'\tbeam:external:java:spanner:update:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$3/212628335@7b1d7fff'
INFO:apache_beam.utils.subprocess_server:b'\tbeam:external:java:spanner:replace:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$3/212628335@299a06ac'
INFO:apache_beam.utils.subprocess_server:b'\tbeam:external:java:spanner:insert_or_update:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$3/212628335@383534aa'
INFO:apache_beam.utils.subprocess_server:b'\tbeam:external:java:spanner:delete:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$3/212628335@6bc168e5'
INFO:apache_beam.utils.subprocess_server:b'\tbeam:external:java:spanner:read:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$3/212628335@7b3300e5'
INFO:apache_beam.utils.subprocess_server:b'\tbeam:external:java:generate_sequence:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$3/212628335@2e5c649'
DEBUG:root:Waiting for grpc channel to be ready at localhost:56211.
DEBUG:root:Waiting for grpc channel to be ready at localhost:56211.
DEBUG:root:Waiting for grpc channel to be ready at localhost:56211.
DEBUG:root:Waiting for grpc channel to be ready at localhost:56211.
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:04 AM org.apache.beam.sdk.expansion.service.ExpansionService expand'
INFO:apache_beam.utils.subprocess_server:b"INFO: Expanding 'Write to Spanner' with URN 'beam:external:java:spanner:update:v1'"
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:06 AM org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader payloadToConfig'
INFO:apache_beam.utils.subprocess_server:b"WARNING: Configuration class 'org.apache.beam.sdk.io.gcp.spanner.SpannerTransformRegistrar$WriteBuilder$Configuration' has no schema registered. Attempting to construct with setter approach."
DEBUG:root:Sending SIGINT to job_server
INFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at localhost:44301
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.6 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.6_sdk:2.29.0.dev
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7ff5f5428a60> ====================
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:21 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:Stages: 
['ref_AppliedPTransform_Impulse_2\n  Impulse:beam:transform:impulse:v1\n  must 
follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Generate_3\n  Generate:beam:transform:pardo:v1\n  must 
follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Map to 
row_4\n  Map to row:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'external_10Write to 
Spanner/MapElements/Map/ParMultiDo(Anonymous)\n  Write to 
Spanner/MapElements/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must 
follow: \n  downstream_side_inputs: <unknown>', 'external_10Write to 
Spanner/SpannerIO.Write/To mutation group/ParMultiDo(ToMutationGroup)\n  Write 
to Spanner/SpannerIO.Write/To mutation 
group/ParMultiDo(ToMutationGroup):beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'external_10Write to 
Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Create 
Seed/Read(CreateSource)/Impulse\n  Write to Spanner/SpannerIO.Write/Write 
mutations to Cloud Spanner/Create 
Seed/Read(CreateSource)/Impulse:beam:transform:impulse:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'external_10Write to 
Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Create 
Seed/Read(CreateSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)\n
  Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Create 
Seed/Read(CreateSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 'external_10Write to 
Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Create 
Seed/Read(CreateSource)/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper)\n
  Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Create 
Seed/Read(CreateSource)/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 'external_10Write to 
Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Read information 
schema/ParMultiDo(ReadSpannerSchema)\n  Write to Spanner/SpannerIO.Write/Write 
mutations to Cloud Spanner/Read information 
schema/ParMultiDo(ReadSpannerSchema):beam:transform:pardo:v1\n  must follow: \n 
 downstream_side_inputs: <unknown>', 'external_10Write to 
Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map/ParMultiDo(Anonymous)\n
  Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 'Write to 
Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Precombine\n
  Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Precombine:beam:transform:combine_per_key_precombine:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 'Write to 
Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Group\n
  Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Group:beam:transform:group_by_key:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 'Write to 
Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Merge\n
  Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Merge:beam:transform:combine_per_key_merge_accumulators:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 'Write to 
Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/ExtractOutputs\n
  Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/ExtractOutputs:beam:transform:combine_per_key_extract_outputs:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 'external_10Write to 
Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map/ParMultiDo(Anonymous)\n
  Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 'external_10Write to 
Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/View.VoidKeyToMultimapMaterialization/ParDo(VoidKeyToMultimapMaterialization)/ParMultiDo(VoidKeyToMultimapMaterialization)\n
  Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/View.VoidKeyToMultimapMaterialization/ParDo(VoidKeyToMultimapMaterialization)/ParMultiDo(VoidKeyToMultimapMaterialization):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 'external_10Write to 
Spanner/SpannerIO.Write/Write mutations to Cloud 
Spanner/RewindowIntoGlobal/Window.Assign\n  Write to 
Spanner/SpannerIO.Write/Write mutations to Cloud 
Spanner/RewindowIntoGlobal/Window.Assign:beam:transform:window_into:v1\n  must 
follow: \n  downstream_side_inputs: <unknown>', 'external_10Write to 
Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Filter Unbatchable 
Mutations\n  Write to Spanner/SpannerIO.Write/Write mutations to Cloud 
Spanner/Filter Unbatchable Mutations:beam:transform:pardo:v1\n  must follow: \n 
 downstream_side_inputs: <unknown>', 'external_10Write to 
Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Gather Sort And Create 
Batches/ParMultiDo(GatherSortCreateBatches)\n  Write to 
Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Gather Sort And Create 
Batches/ParMultiDo(GatherSortCreateBatches):beam:transform:pardo:v1\n  must 
follow: \n  downstream_side_inputs: <unknown>', 'external_10Write to 
Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Merge\n  Write to 
Spanner/SpannerIO.Write/Write mutations to Cloud 
Spanner/Merge:beam:transform:flatten:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'external_10Write to 
Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Write batches to 
Spanner\n  Write to Spanner/SpannerIO.Write/Write mutations to Cloud 
Spanner/Write batches to Spanner:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>']
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7ff5f54291e0> ====================
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:21 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
DEBUG:apache_beam.runners.portability.fn_api_runner.translations:Stages: 
['ref_AppliedPTransform_Impulse_2\n  Impulse:beam:transform:impulse:v1\n  must 
follow: \n  downstream_side_inputs: <unknown>', 
'ref_AppliedPTransform_Generate_3\n  Generate:beam:transform:pardo:v1\n  must 
follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Map to 
row_4\n  Map to row:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'external_10Write to 
Spanner/MapElements/Map/ParMultiDo(Anonymous)\n  Write to 
Spanner/MapElements/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must 
follow: \n  downstream_side_inputs: <unknown>', 'external_10Write to 
Spanner/SpannerIO.Write/To mutation group/ParMultiDo(ToMutationGroup)\n  Write 
to Spanner/SpannerIO.Write/To mutation 
group/ParMultiDo(ToMutationGroup):beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'external_10Write to 
Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Create 
Seed/Read(CreateSource)/Impulse\n  Write to Spanner/SpannerIO.Write/Write 
mutations to Cloud Spanner/Create 
Seed/Read(CreateSource)/Impulse:beam:transform:impulse:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'external_10Write to 
Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Create 
Seed/Read(CreateSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)\n
  Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Create 
Seed/Read(CreateSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 'external_10Write to 
Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Create 
Seed/Read(CreateSource)/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper)\n
  Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Create 
Seed/Read(CreateSource)/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 'external_10Write to 
Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Read information 
schema/ParMultiDo(ReadSpannerSchema)\n  Write to Spanner/SpannerIO.Write/Write 
mutations to Cloud Spanner/Read information 
schema/ParMultiDo(ReadSpannerSchema):beam:transform:pardo:v1\n  must follow: \n 
 downstream_side_inputs: <unknown>', 'external_10Write to 
Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map/ParMultiDo(Anonymous)\n
  Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 'Write to 
Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Precombine\n
  Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Precombine:beam:transform:combine_per_key_precombine:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 'Write to 
Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Group\n
  Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Group:beam:transform:group_by_key:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 'Write to 
Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Merge\n
  Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Merge:beam:transform:combine_per_key_merge_accumulators:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 'Write to 
Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/ExtractOutputs\n
  Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/ExtractOutputs:beam:transform:combine_per_key_extract_outputs:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 'external_10Write to 
Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map/ParMultiDo(Anonymous)\n
  Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 'external_10Write to 
Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/View.VoidKeyToMultimapMaterialization/ParDo(VoidKeyToMultimapMaterialization)/ParMultiDo(VoidKeyToMultimapMaterialization)\n
  Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema 
View/Combine.GloballyAsSingletonView/View.VoidKeyToMultimapMaterialization/ParDo(VoidKeyToMultimapMaterialization)/ParMultiDo(VoidKeyToMultimapMaterialization):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: <unknown>', 'external_10Write to 
Spanner/SpannerIO.Write/Write mutations to Cloud 
Spanner/RewindowIntoGlobal/Window.Assign\n  Write to 
Spanner/SpannerIO.Write/Write mutations to Cloud 
Spanner/RewindowIntoGlobal/Window.Assign:beam:transform:window_into:v1\n  must 
follow: \n  downstream_side_inputs: <unknown>', 'external_10Write to 
Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Filter Unbatchable 
Mutations\n  Write to Spanner/SpannerIO.Write/Write mutations to Cloud 
Spanner/Filter Unbatchable Mutations:beam:transform:pardo:v1\n  must follow: \n 
 downstream_side_inputs: <unknown>', 'external_10Write to 
Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Gather Sort And Create 
Batches/ParMultiDo(GatherSortCreateBatches)\n  Write to 
Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Gather Sort And Create 
Batches/ParMultiDo(GatherSortCreateBatches):beam:transform:pardo:v1\n  must 
follow: \n  downstream_side_inputs: <unknown>', 'external_10Write to 
Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Merge\n  Write to 
Spanner/SpannerIO.Write/Write mutations to Cloud 
Spanner/Merge:beam:transform:flatten:v1\n  must follow: \n  
downstream_side_inputs: <unknown>', 'external_10Write to 
Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Write batches to 
Spanner\n  Write to Spanner/SpannerIO.Write/Write mutations to Cloud 
Spanner/Write batches to Spanner:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: <unknown>']
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar' '<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/runners/flink/1.12/job-server/build/libs/beam-runners-flink-1.12-job-server-2.29.0-SNAPSHOT.jar'> '--flink-master' '[auto]' '--artifacts-dir' '/tmp/beam-temphfpzpv53/artifacts0k_0qv07' '--job-port' '49827' '--artifact-port' '0' '--expansion-port' '0']
DEBUG:root:Waiting for grpc channel to be ready at localhost:49827.
DEBUG:root:Waiting for grpc channel to be ready at localhost:49827.
DEBUG:root:Waiting for grpc channel to be ready at localhost:49827.
DEBUG:root:Waiting for grpc channel to be ready at localhost:49827.
DEBUG:root:Waiting for grpc channel to be ready at localhost:49827.

> Task :sdks:python:test-suites:dataflow:py36:postCommitIT
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T06:27:56.134Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T06:28:02.318Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T06:28:16.729Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T06:28:16.774Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-02-12T06:28:16.811Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-02-11_22_19_42-10931258595129165273 is in state JOB_STATE_DONE
INFO:apache_beam.io.gcp.experimental.spannerio_read_it_test:.... PyVersion ---> 3.6.8 (default, Dec 24 2018, 19:24:27)
[GCC 5.4.0 20160609]
INFO:apache_beam.io.gcp.experimental.spannerio_read_it_test:.... Setting up!
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authentication via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authentication via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
INFO:apache_beam.io.gcp.experimental.spannerio_read_it_test:.... Spanner Client created!
INFO:apache_beam.io.gcp.experimental.spannerio_read_it_test:Creating test database: pybeam-read-d9e1eb2c484d3a4
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fspanner.admin
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/[email protected]/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fspanner.admin HTTP/1.1" 200 241
INFO:apache_beam.io.gcp.experimental.spannerio_read_it_test:Creating database: Done! name: "projects/apache-beam-testing/instances/beam-test/databases/pybeam-read-d9e1eb2c484d3a4"
state: READY
create_time {
  seconds: 1613111310
  nanos: 952774000
}

INFO:apache_beam.io.gcp.experimental.spannerio_read_it_test:Dummy Data: Adding dummy data...
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2021-02-11_22_22_19-8483241608154996889 after 360 seconds
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/[email protected]/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fspanner.data
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/[email protected]/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fspanner.data HTTP/1.1" 200 241
INFO:apache_beam.io.gcp.experimental.spannerio_read_it_test:Spanner Read IT Setup Complete...
INFO:apache_beam.io.gcp.experimental.spannerio_read_it_test:Running Spanner via sql
<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:126: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
  sql="select * from Users")
INFO:apache_beam.runners.portability.stager:Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/-1734967053/bin/python',> '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']

> Task :sdks:python:test-suites:portable:py36:postCommitPy36IT
DEBUG:root:Waiting for grpc channel to be ready at localhost:49827.
DEBUG:root:Waiting for grpc channel to be ready at localhost:49827.
DEBUG:root:Waiting for grpc channel to be ready at localhost:49827.
DEBUG:root:Waiting for grpc channel to be ready at localhost:49827.
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:12 AM org.apache.beam.runners.jobsubmission.JobServerDriver createArtifactStagingService'
INFO:apache_beam.utils.subprocess_server:b'INFO: ArtifactStagingService started on localhost:43877'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:12 AM org.apache.beam.runners.jobsubmission.JobServerDriver createExpansionService'
INFO:apache_beam.utils.subprocess_server:b'INFO: Java ExpansionService started on localhost:42347'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:12 AM org.apache.beam.runners.jobsubmission.JobServerDriver createJobServer'
INFO:apache_beam.utils.subprocess_server:b'INFO: JobService started on localhost:49827'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:12 AM org.apache.beam.runners.jobsubmission.JobServerDriver run'
INFO:apache_beam.utils.subprocess_server:b'INFO: Job server now running, terminate with Ctrl+C'
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'experiments' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'job_name' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'runner' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'temp_location' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'dataflow_kms_key' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'enable_streaming_engine' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'project' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'worker_region' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'worker_zone' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'zone' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'pubsub_root_url' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'streaming' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'environment_cache_millis' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'job_endpoint' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'output_executable_path' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'sdk_worker_parallelism' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'files_to_stage' was already added
DEBUG:apache_beam.runners.portability.portable_runner:Runner option 'flink_master' was already added
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:13 AM org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService$2 onNext'
INFO:apache_beam.utils.subprocess_server:b'INFO: Staging artifacts for job_2cd01418-0514-4560-8fe9-b59918c4d6aa.'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:13 AM org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService$2 resolveNextEnvironment'
INFO:apache_beam.utils.subprocess_server:b'INFO: Resolving artifacts for job_2cd01418-0514-4560-8fe9-b59918c4d6aa.external_10beam:env:docker:v1.'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:13 AM org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService$2 onNext'
INFO:apache_beam.utils.subprocess_server:b'INFO: Getting 7 artifacts for job_2cd01418-0514-4560-8fe9-b59918c4d6aa.ref_Environment_default_environment_1.'
INFO:apache_beam.utils.subprocess_server:b'Feb 12, 2021 6:28:13 AM org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService$2 resolveNextEnvironment'
INFO:apache_beam.utils.subprocess_server:b'INFO: Resolving artifacts for job_2cd01418-0514-4560-8fe9-b59918c4d6aa.ref_Environment_default_environment_1.'

> Task :sdks:python:test-suites:portable:py36:postCommitPy36IT
java.lang.OutOfMemoryError: GC overhead limit exceeded
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
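
The terminal OutOfMemoryError ("GC overhead limit exceeded") is thrown by the Gradle build JVM itself rather than by any single test. One common mitigation, sketched here with illustrative values that are assumptions rather than this job's actual settings, is to raise the build JVM's heap via gradle.properties:

    # gradle.properties (illustrative values, not taken from this job)
    org.gradle.jvmargs=-Xmx4g -XX:MaxMetaspaceSize=1g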


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
