See 
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/8/display/redirect?page=changes>

Changes:

[dhuntsperger] added GitHub example references to Python multilang quickstart

[noreply] Merge pull request #16579 from Revert "Revert "Merge pull request #15863

[noreply] Merge pull request #16606 from [BEAM-13247] [Playground] Embedding


------------------------------------------
[...truncated 418.29 KB...]
              self)
E         
apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow 
pipeline failed. State: FAILED, Error:
E         Workflow failed. Causes: Job appears to be stuck. Several workers 
have failed to start up in a row, and no worker has successfully started up for 
this job. Last error reported: Unable to pull container image due to error: 
image pull request failed with error: Error response from daemon: manifest for 
us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220128065948 not found: 
manifest unknown: Failed to fetch "20220128065948" from request 
"/v2/apache-beam-testing/java-postcommit-it/java/manifests/20220128065948".. 
This is likely due to an invalid SDK container image URL. Please verify any 
provided SDK container image is valid and that Dataflow workers have 
permissions to pull image..

apache_beam/runners/dataflow/dataflow_runner.py:1636: DataflowRuntimeException
------------------------------ Captured log call -------------------------------
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:238 Using 
pre-built snapshot at 
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/sdks/java/extensions/sql/expansion-service/build/libs/beam-sdks-java-extensions-sql-expansion-service-2.37.0-SNAPSHOT.jar>
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:116 Starting 
service with ['java' '-jar' 
'<https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/sdks/java/extensions/sql/expansion-service/build/libs/beam-sdks-java-extensions-sql-expansion-service-2.37.0-SNAPSHOT.jar>'
 '36425' 
'--filesToStage=<https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/sdks/java/extensions/sql/expansion-service/build/libs/beam-sdks-java-extensions-sql-expansion-service-2.37.0-SNAPSHOT.jar>']
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b'Starting expansion service at localhost:36425'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Jan 28, 
2022 7:27:49 AM org.apache.beam.sdk.expansion.service.ExpansionService 
loadRegisteredTransforms'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: 
Registering external transforms: [beam:external:java:sql:v1, 
beam:transform:org.apache.beam:pubsub_read:v1, 
beam:transform:org.apache.beam:pubsub_write:v1, 
beam:transform:org.apache.beam:pubsublite_write:v1, 
beam:transform:org.apache.beam:pubsublite_read:v1, 
beam:transform:org.apache.beam:spanner_insert:v1, 
beam:transform:org.apache.beam:spanner_update:v1, 
beam:transform:org.apache.beam:spanner_replace:v1, 
beam:transform:org.apache.beam:spanner_insert_or_update:v1, 
beam:transform:org.apache.beam:spanner_delete:v1, 
beam:transform:org.apache.beam:spanner_read:v1, 
beam:transform:org.apache.beam:kafka_read_with_metadata:v1, 
beam:transform:org.apache.beam:kafka_read_without_metadata:v1, 
beam:transform:org.apache.beam:kafka_write:v1, 
beam:external:java:generate_sequence:v1]'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b'\tbeam:external:java:sql:v1: 
org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$1@79efed2d'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b'\tbeam:transform:org.apache.beam:pubsub_read:v1: 
org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$1@2928854b'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b'\tbeam:transform:org.apache.beam:pubsub_write:v1: 
org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$1@27ae2fd0'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b'\tbeam:transform:org.apache.beam:pubsublite_write:v1: 
org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$1@29176cc1'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b'\tbeam:transform:org.apache.beam:pubsublite_read:v1: 
org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$1@2f177a4b'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b'\tbeam:transform:org.apache.beam:spanner_insert:v1: 
org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$1@4278a03f'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b'\tbeam:transform:org.apache.beam:spanner_update:v1: 
org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$1@147ed70f'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b'\tbeam:transform:org.apache.beam:spanner_replace:v1: 
org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$1@61dd025'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b'\tbeam:transform:org.apache.beam:spanner_insert_or_update:v1: 
org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$1@124c278f'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b'\tbeam:transform:org.apache.beam:spanner_delete:v1: 
org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$1@15b204a1'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b'\tbeam:transform:org.apache.beam:spanner_read:v1: 
org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$1@77167fb7'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b'\tbeam:transform:org.apache.beam:kafka_read_with_metadata:v1: 
org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$1@1fe20588'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b'\tbeam:transform:org.apache.beam:kafka_read_without_metadata:v1: 
org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$1@6ce139a4'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b'\tbeam:transform:org.apache.beam:kafka_write:v1: 
org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$1@6973bf95'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b'\tbeam:external:java:generate_sequence:v1: 
org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$1@2ddc8ecb'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Jan 28, 
2022 7:27:51 AM org.apache.beam.sdk.expansion.service.ExpansionService expand'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b"INFO: 
Expanding 'SqlTransform(beam:external:java:sql:v1)' with URN 
'beam:external:java:sql:v1'"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Jan 28, 
2022 7:27:51 AM 
org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader
 payloadToConfig'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b"WARNING: Configuration class 
'org.apache.beam.sdk.extensions.sql.expansion.ExternalSqlTransformRegistrar$Configuration'
 has no schema registered. Attempting to construct with setter approach."
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Jan 28, 
2022 7:27:52 AM 
org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader
 payloadToConfig'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b"WARNING: Configuration class 
'org.apache.beam.sdk.extensions.sql.expansion.ExternalSqlTransformRegistrar$Configuration'
 has no schema registered. Attempting to construct with setter approach."
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Jan 28, 
2022 7:27:56 AM org.apache.beam.sdk.extensions.sql.zetasql.ZetaSQLQueryPlanner 
convertToBeamRelInternal'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: 
BEAMPlan>'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b"BeamZetaSqlCalcRel(expr#0=[{inputs}], expr#1=[1:BIGINT], 
expr#2=['foo':VARCHAR], expr#3=[3.1400000000000001243E0:DOUBLE], int=[$t1], 
str=[$t2], flt=[$t3])"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'  
BeamValuesRel(tuples=[[{ 0 }]])'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b''
INFO     apache_beam.runners.portability.stager:stager.py:305 Copying Beam SDK 
"<https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/sdks/python/build/apache-beam.tar.gz>"
 to staging location.
WARNING  root:environments.py:371 Make sure that locally built Python SDK 
docker image has Python 3.8 interpreter.
INFO     root:environments.py:380 Default Python SDK image for environment is 
apache/beam_python3.8_sdk:2.37.0.dev
INFO     root:environments.py:295 Using provided Python SDK container image: 
gcr.io/cloud-dataflow/v1beta3/python38-fnapi:beam-master-20220117
INFO     root:environments.py:302 Python SDK container image set to 
"gcr.io/cloud-dataflow/v1beta3/python38-fnapi:beam-master-20220117" for Docker 
environment
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 
==================== <function pack_combiners at 0x7efc9d651670> 
====================
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 
==================== <function sort_stages at 0x7efc9d651e50> 
====================
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:454 
Defaulting to the temp_location as staging_location: 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:699 
Starting GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0128072802-771751.1643354882.772226/icedtea-sound-Bdoi2wYa757-fzq5vconCy4SSQ22ZaOq7yuC98fKPs8.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:715 
Completed GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0128072802-771751.1643354882.772226/icedtea-sound-Bdoi2wYa757-fzq5vconCy4SSQ22ZaOq7yuC98fKPs8.jar
 in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:699 
Starting GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0128072802-771751.1643354882.772226/jaccess-CMbK-IOdQPLKHEqCuDnE-yBk-VpbtVT-hgjbHRUGO78.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:715 
Completed GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0128072802-771751.1643354882.772226/jaccess-CMbK-IOdQPLKHEqCuDnE-yBk-VpbtVT-hgjbHRUGO78.jar
 in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:699 
Starting GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0128072802-771751.1643354882.772226/localedata-ae5Z0L6ak4922fztWeWy7ajiWXdG3ubNrwerJRFoFj0.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:715 
Completed GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0128072802-771751.1643354882.772226/localedata-ae5Z0L6ak4922fztWeWy7ajiWXdG3ubNrwerJRFoFj0.jar
 in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:699 
Starting GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0128072802-771751.1643354882.772226/nashorn-XHtz_UehGpYcLTOrATrTnMNVUgEVa_ttoWkPxnVfqTo.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:715 
Completed GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0128072802-771751.1643354882.772226/nashorn-XHtz_UehGpYcLTOrATrTnMNVUgEVa_ttoWkPxnVfqTo.jar
 in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:699 
Starting GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0128072802-771751.1643354882.772226/cldrdata-k07I6K9W3X5KTQbcDIEsqM0LXyM18f0eR6IaJw-P_xk.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:715 
Completed GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0128072802-771751.1643354882.772226/cldrdata-k07I6K9W3X5KTQbcDIEsqM0LXyM18f0eR6IaJw-P_xk.jar
 in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:699 
Starting GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0128072802-771751.1643354882.772226/dnsns-RGhCDg3GVOQVC2r6ka2N0hmI4eqQH6VobuoAnQ74MnE.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:715 
Completed GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0128072802-771751.1643354882.772226/dnsns-RGhCDg3GVOQVC2r6ka2N0hmI4eqQH6VobuoAnQ74MnE.jar
 in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:699 
Starting GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0128072802-771751.1643354882.772226/beam-sdks-java-extensions-sql-expansion-service-2.37.0-SNAPSHOT-8yMr5TOg5TKOmAzoOjjaVw7b_E_zX1geFgqS8rTs27E.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:715 
Completed GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0128072802-771751.1643354882.772226/beam-sdks-java-extensions-sql-expansion-service-2.37.0-SNAPSHOT-8yMr5TOg5TKOmAzoOjjaVw7b_E_zX1geFgqS8rTs27E.jar
 in 10 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:699 
Starting GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0128072802-771751.1643354882.772226/dataflow_python_sdk.tar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:715 
Completed GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0128072802-771751.1643354882.772226/dataflow_python_sdk.tar
 in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:699 
Starting GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0128072802-771751.1643354882.772226/pipeline.pb...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:715 
Completed GCS upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0128072802-771751.1643354882.772226/pipeline.pb
 in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:879 
Create job: <Job
 clientRequestId: '20220128072802773405-3474'
 createTime: '2022-01-28T07:28:15.135650Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-01-27_23_28_14-15126802493498932353'
 location: 'us-central1'
 name: 'beamapp-jenkins-0128072802-771751'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-01-28T07:28:15.135650Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:881 
Created job with id: [2022-01-27_23_28_14-15126802493498932353]
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:882 
Submitted job: 2022-01-27_23_28_14-15126802493498932353
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:883 To 
access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-27_23_28_14-15126802493498932353?project=apache-beam-testing
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:191 
Job 2022-01-27_23_28_14-15126802493498932353 is in state JOB_STATE_RUNNING
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:28:17.768Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 
2022-01-27_23_28_14-15126802493498932353. The number of workers will be between 
1 and 1000.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:28:18.030Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically 
enabled for job 2022-01-27_23_28_14-15126802493498932353.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:28:21.188Z: JOB_MESSAGE_BASIC: Worker configuration: 
e2-standard-2 in us-central1-b.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:28:21.748Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo 
operations into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:28:21.778Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton 
operations into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:28:21.842Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey 
operations into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:28:21.880Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step 
assert_that/Group/CoGroupByKeyImpl/GroupByKey: GroupByKey not followed by a 
combiner.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:28:21.917Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations 
into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:28:21.947Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner 
information.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:28:21.988Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, 
Write, and Flatten operations
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:28:22.021Z: JOB_MESSAGE_DEBUG: Inserted coder converter before 
flatten ref_AppliedPTransform_assert_that-Group-CoGroupByKeyImpl-Flatten_15
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:28:22.053Z: JOB_MESSAGE_DEBUG: Inserted coder converter before 
flatten ref_AppliedPTransform_assert_that-Group-CoGroupByKeyImpl-Flatten_15
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:28:22.088Z: JOB_MESSAGE_DETAILED: Unzipping flatten 
ref_AppliedPTransform_assert_that-Group-CoGroupByKeyImpl-Flatten_15 for input 
ref_AppliedPTransform_assert_that-Group-CoGroupByKeyImpl-Tag-0-_13.None-post13
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:28:22.120Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of 
assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write, through flatten 
assert_that/Group/CoGroupByKeyImpl/Flatten, into producer 
assert_that/Group/CoGroupByKeyImpl/Flatten/InputIdentity
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:28:22.154Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/CoGroupByKeyImpl/MapTuple(collect_values) into 
assert_that/Group/CoGroupByKeyImpl/GroupByKey/Read
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:28:22.185Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/RestoreTags into 
assert_that/Group/CoGroupByKeyImpl/MapTuple(collect_values)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:28:22.219Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Unkey into assert_that/Group/RestoreTags
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:28:22.259Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Match into assert_that/Unkey
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:28:22.346Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write into 
assert_that/Group/CoGroupByKeyImpl/Flatten/InputIdentity
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:28:22.395Z: JOB_MESSAGE_DETAILED: Fusing consumer 
SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_7/Create.Values/Read(CreateSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
 into 
SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_7/Create.Values/Read(CreateSource)/Impulse
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:28:22.459Z: JOB_MESSAGE_DETAILED: Fusing consumer 
external_2SqlTransform-beam-external-java-sql-v1--BeamValuesRel_7-Create-Values-Read-CreateSource--ParDo-Bound/PairWithRestriction
 into 
SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_7/Create.Values/Read(CreateSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:28:22.507Z: JOB_MESSAGE_DETAILED: Fusing consumer 
external_2SqlTransform-beam-external-java-sql-v1--BeamValuesRel_7-Create-Values-Read-CreateSource--ParDo-Bound/SplitWithSizing
 into 
external_2SqlTransform-beam-external-java-sql-v1--BeamValuesRel_7-Create-Values-Read-CreateSource--ParDo-Bound/PairWithRestriction
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:28:22.562Z: JOB_MESSAGE_DETAILED: Fusing consumer 
SqlTransform(beam:external:java:sql:v1)/BeamZetaSqlCalcRel_17/ParDo(Calc)/ParMultiDo(Calc)
 into 
external_2SqlTransform-beam-external-java-sql-v1--BeamValuesRel_7-Create-Values-Read-CreateSource--ParDo-Bound/ProcessElementAndRestrictionWithSizing
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:28:22.597Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/WindowInto(WindowIntoFn) into 
SqlTransform(beam:external:java:sql:v1)/BeamZetaSqlCalcRel_17/ParDo(Calc)/ParMultiDo(Calc)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:28:22.669Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Create/FlatMap(<lambda at core.py:3228>) into 
assert_that/Create/Impulse
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:28:22.811Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Create/Map(decode) into assert_that/Create/FlatMap(<lambda at 
core.py:3228>)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:28:22.833Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/CoGroupByKeyImpl/Tag[0] into assert_that/Create/Map(decode)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:28:22.857Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/CoGroupByKeyImpl/Flatten/InputIdentity into 
assert_that/Group/CoGroupByKeyImpl/Tag[0]
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:28:22.888Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:28:22.909Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/CoGroupByKeyImpl/Tag[1] into assert_that/ToVoidKey
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:28:22.934Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/CoGroupByKeyImpl/Flatten/InputIdentity into 
assert_that/Group/CoGroupByKeyImpl/Tag[1]
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:28:22.961Z: JOB_MESSAGE_DEBUG: Workflow config is missing a 
default resource spec.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:28:22.980Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and 
teardown to workflow graph.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:28:23.011Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop 
steps.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:28:23.032Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:28:23.156Z: JOB_MESSAGE_DEBUG: Executing wait step start23
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:28:23.204Z: JOB_MESSAGE_BASIC: Executing operation 
SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_7/Create.Values/Read(CreateSource)/Impulse+SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_7/Create.Values/Read(CreateSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)+external_2SqlTransform-beam-external-java-sql-v1--BeamValuesRel_7-Create-Values-Read-CreateSource--ParDo-Bound/PairWithRestriction+external_2SqlTransform-beam-external-java-sql-v1--BeamValuesRel_7-Create-Values-Read-CreateSource--ParDo-Bound/SplitWithSizing
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:28:23.237Z: JOB_MESSAGE_BASIC: Executing operation 
assert_that/Group/CoGroupByKeyImpl/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:28:23.261Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:28:23.291Z: JOB_MESSAGE_BASIC: Starting 1 workers in 
us-central1-b...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:28:23.358Z: JOB_MESSAGE_BASIC: Finished operation 
assert_that/Group/CoGroupByKeyImpl/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:28:23.416Z: JOB_MESSAGE_DEBUG: Value 
"assert_that/Group/CoGroupByKeyImpl/GroupByKey/Session" materialized.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:28:23.467Z: JOB_MESSAGE_BASIC: Executing operation 
assert_that/Create/Impulse+assert_that/Create/FlatMap(<lambda at 
core.py:3228>)+assert_that/Create/Map(decode)+assert_that/Group/CoGroupByKeyImpl/Tag[0]+assert_that/Group/CoGroupByKeyImpl/Flatten/InputIdentity+assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:28:59.064Z: JOB_MESSAGE_BASIC: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:29:07.327Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number 
of workers to 1 based on the rate of progress in the currently running stage(s).
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:29:51.535Z: JOB_MESSAGE_WARNING: A worker was unable to start up. 
 Error: Unable to pull container image due to error: image pull request failed 
with error: Error response from daemon: manifest for 
us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220128065948 not found: 
manifest unknown: Failed to fetch "20220128065948" from request 
"/v2/apache-beam-testing/java-postcommit-it/java/manifests/20220128065948".. 
This is likely due to an invalid SDK container image URL. Please verify any 
provided SDK container image is valid and that Dataflow workers have 
permissions to pull image.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:30:19.299Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number 
of workers to 1 based on the rate of progress in the currently running stage(s).
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:31:05.889Z: JOB_MESSAGE_WARNING: A worker was unable to start up. 
 Error: Unable to pull container image due to error: image pull request failed 
with error: Error response from daemon: manifest for 
us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220128065948 not found: 
manifest unknown: Failed to fetch "20220128065948" from request 
"/v2/apache-beam-testing/java-postcommit-it/java/manifests/20220128065948".. 
This is likely due to an invalid SDK container image URL. Please verify any 
provided SDK container image is valid and that Dataflow workers have 
permissions to pull image.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:31:34.452Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number 
of workers to 1 based on the rate of progress in the currently running stage(s).
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:32:17.787Z: JOB_MESSAGE_WARNING: A worker was unable to start up. 
 Error: Unable to pull container image due to error: image pull request failed 
with error: Error response from daemon: manifest for 
us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220128065948 not found: 
manifest unknown: Failed to fetch "20220128065948" from request 
"/v2/apache-beam-testing/java-postcommit-it/java/manifests/20220128065948".. 
This is likely due to an invalid SDK container image URL. Please verify any 
provided SDK container image is valid and that Dataflow workers have 
permissions to pull image.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:32:42.862Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number 
of workers to 1 based on the rate of progress in the currently running stage(s).
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:33:31.123Z: JOB_MESSAGE_WARNING: A worker was unable to start up. 
 Error: Unable to pull container image due to error: image pull request failed 
with error: Error response from daemon: manifest for 
us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220128065948 not found: 
manifest unknown: Failed to fetch "20220128065948" from request 
"/v2/apache-beam-testing/java-postcommit-it/java/manifests/20220128065948".. 
This is likely due to an invalid SDK container image URL. Please verify any 
provided SDK container image is valid and that Dataflow workers have 
permissions to pull image.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:33:56.384Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number 
of workers to 1 based on the rate of progress in the currently running stage(s).
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:34:50.566Z: JOB_MESSAGE_WARNING: A worker was unable to start up. 
 Error: Unable to pull container image due to error: image pull request failed 
with error: Error response from daemon: manifest for 
us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220128065948 not found: 
manifest unknown: Failed to fetch "20220128065948" from request 
"/v2/apache-beam-testing/java-postcommit-it/java/manifests/20220128065948".. 
This is likely due to an invalid SDK container image URL. Please verify any 
provided SDK container image is valid and that Dataflow workers have 
permissions to pull image.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:34:50.609Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: Job 
appears to be stuck. Several workers have failed to start up in a row, and no 
worker has successfully started up for this job. Last error reported: Unable to 
pull container image due to error: image pull request failed with error: Error 
response from daemon: manifest for 
us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220128065948 not found: 
manifest unknown: Failed to fetch "20220128065948" from request 
"/v2/apache-beam-testing/java-postcommit-it/java/manifests/20220128065948".. 
This is likely due to an invalid SDK container image URL. Please verify any 
provided SDK container image is valid and that Dataflow workers have 
permissions to pull image..
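The workers fail because no manifest exists for the tag they were told to pull. A minimal sketch of how one might verify that, assuming `gcloud` and `docker` are available locally and using the image path and tag reported in the log above (the `--filter` expression is one plausible way to narrow the tag listing, not something the build itself runs):

```shell
#!/bin/sh
# Image coordinates taken from the error message above.
IMAGE="us.gcr.io/apache-beam-testing/java-postcommit-it/java"
TAG="20220128065948"

# Ask the registry which tags it actually knows about (needs gcloud auth
# and read access to the apache-beam-testing project).
command -v gcloud >/dev/null 2>&1 \
  && gcloud container images list-tags "$IMAGE" --filter="tags=$TAG" \
  || echo "gcloud not available; skipping registry check"

# Alternatively, fetch just the manifest without pulling layers
# (docker manifest inspect may need experimental CLI features on old clients).
command -v docker >/dev/null 2>&1 \
  && docker manifest inspect "$IMAGE:$TAG" \
  || echo "docker not available; skipping manifest check"
```

If both checks come back empty, the tag was never pushed (or was deleted before the job ran), which matches the "manifest unknown" error rather than a permissions problem.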
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:34:50.666Z: JOB_MESSAGE_BASIC: Finished operation 
assert_that/Create/Impulse+assert_that/Create/FlatMap(<lambda at 
core.py:3228>)+assert_that/Create/Map(decode)+assert_that/Group/CoGroupByKeyImpl/Tag[0]+assert_that/Group/CoGroupByKeyImpl/Flatten/InputIdentity+assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:34:50.737Z: JOB_MESSAGE_WARNING: Unable to delete temp files: 
"gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0128072802-771751.1643354882.772226/dax-tmp-2022-01-27_23_28_14-15126802493498932353-S03-0-42cb1addf3fa1aa5/[email protected]."
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:34:50.767Z: JOB_MESSAGE_WARNING: 
S03:SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_7/Create.Values/Read(CreateSource)/Impulse+SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_7/Create.Values/Read(CreateSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)+external_2SqlTransform-beam-external-java-sql-v1--BeamValuesRel_7-Create-Values-Read-CreateSource--ParDo-Bound/PairWithRestriction+external_2SqlTransform-beam-external-java-sql-v1--BeamValuesRel_7-Create-Values-Read-CreateSource--ParDo-Bound/SplitWithSizing
 failed.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:34:50.796Z: JOB_MESSAGE_BASIC: Finished operation 
SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_7/Create.Values/Read(CreateSource)/Impulse+SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_7/Create.Values/Read(CreateSource)/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)+external_2SqlTransform-beam-external-java-sql-v1--BeamValuesRel_7-Create-Values-Read-CreateSource--ParDo-Bound/PairWithRestriction+external_2SqlTransform-beam-external-java-sql-v1--BeamValuesRel_7-Create-Values-Read-CreateSource--ParDo-Bound/SplitWithSizing
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:34:50.860Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:34:50.921Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:34:50.941Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:34:51.206Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker 
pool from 1 to 0.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:35:35.390Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-28T07:35:35.457Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:191 
Job 2022-01-27_23_28_14-15126802493498932353 is in state JOB_STATE_FAILED
=============================== warnings summary ===============================
apache_beam/io/filesystems_test.py:54
apache_beam/io/filesystems_test.py:54
apache_beam/io/filesystems_test.py:54
  
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/sdks/python/apache_beam/io/filesystems_test.py>:54:
 DeprecationWarning: invalid escape sequence \c
    self.assertIsNone(FileSystems.get_scheme('c:\\abc\cdf'))  # pylint: 
disable=anomalous-backslash-in-string

apache_beam/io/filesystems_test.py:62
apache_beam/io/filesystems_test.py:62
apache_beam/io/filesystems_test.py:62
  
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/sdks/python/apache_beam/io/filesystems_test.py>:62:
 DeprecationWarning: invalid escape sequence \d
    self.assertTrue(isinstance(FileSystems.get_filesystem('c:\\abc\def'),  # 
pylint: disable=anomalous-backslash-in-string

<https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/tenacity/_asyncio.py>:42
  
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.8/site-packages/tenacity/_asyncio.py>:42:
 DeprecationWarning: "@coroutine" decorator is deprecated since Python 3.8, use 
"async def" instead
    def call(self, fn, *args, **kwargs):

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: 
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/sdks/python/pytest_xlangSqlValidateRunner.xml>
 -
=================== 9 failed, 14 warnings in 1095.52 seconds ===================

> Task 
> :runners:google-cloud-dataflow-java:validatesCrossLanguageRunnerPythonUsingSql
>  FAILED
> Task :runners:google-cloud-dataflow-java:validatesCrossLanguageRunnerCleanup
> Task :runners:google-cloud-dataflow-java:cleanupXVR UP-TO-DATE

> Task :runners:google-cloud-dataflow-java:cleanUpDockerPythonImages FAILED
Error: No such image: 
us.gcr.io/apache-beam-testing/java-postcommit-it/python:20220128065948
ERROR: (gcloud.container.images.untag) Image could not be found: 
[us.gcr.io/apache-beam-testing/java-postcommit-it/python:20220128065948]

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages FAILED
Error: No such image: 
us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220128065948
ERROR: (gcloud.container.images.untag) Image could not be found: 
[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220128065948]
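Both cleanup tasks fail for the same underlying reason: they unconditionally run `gcloud container images untag` (per the `gcloud.container.images.untag` error above) against a tag that was never pushed. A hypothetical guard, assuming `gcloud` is installed and authenticated, that would make the cleanup a no-op when the tag is absent instead of failing the build:

```shell
#!/bin/sh
# Hypothetical cleanup guard; image path and tag taken from the log above.
IMAGE="us.gcr.io/apache-beam-testing/java-postcommit-it/java"
TAG="20220128065948"

# Only attempt the untag if the registry actually lists this tag.
if gcloud container images list-tags "$IMAGE" \
     --filter="tags=$TAG" --format='get(tags)' 2>/dev/null \
   | grep -q "$TAG"; then
  gcloud container images untag "$IMAGE:$TAG" --quiet
else
  echo "tag $TAG not present on $IMAGE; nothing to clean up"
fi
```

With a guard like this, a failed image build would surface only the real failure (the Python test task) rather than three cascading ones.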

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task 
':runners:google-cloud-dataflow-java:validatesCrossLanguageRunnerPythonUsingSql'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file 
'<https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/runners/google-cloud-dataflow-java/build.gradle'>
 line: 330

* What went wrong:
Execution failed for task 
':runners:google-cloud-dataflow-java:cleanUpDockerPythonImages'.
> Process 'command 'gcloud'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Build file 
'<https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/runners/google-cloud-dataflow-java/build.gradle'>
 line: 289

* What went wrong:
Execution failed for task 
':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages'.
> Process 'command 'gcloud'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings 
and determine if they come from your own scripts or plugins.

See 
https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during 
this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 37m 29s
153 actionable tasks: 115 executed, 32 from cache, 6 up-to-date

Publishing build scan...
https://gradle.com/s/bnmywoz4fdyig

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

