See
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Dataflow/1413/display/redirect>
Changes:
------------------------------------------
[...truncated 741.02 KB...]
'Job did not reach to a terminal state after waiting indefinitely.')
if terminated and self.state != PipelineState.DONE:
# TODO(BEAM-1290): Consider converting this to an error log based on
# the resolution of the issue.
raise DataflowRuntimeException(
'Dataflow pipeline failed. State: %s, Error:\n%s' %
(self.state, getattr(self._runner, 'last_error_msg', None)),
> self)
E apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
E Workflow failed. Causes: Job appears to be stuck. Several workers
have failed to start up in a row, and no worker has successfully started up for
this job. Last error reported: Unable to pull container image due to error:
image pull request failed with error: Error response from daemon: manifest for
us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211108000113 not found:
manifest unknown: Failed to fetch "20211108000113" from request
"/v2/apache-beam-testing/java-postcommit-it/java/manifests/20211108000113"..
This is likely due to an invalid SDK container image URL. Please verify any
provided SDK container image is valid and that Dataflow workers have
permissions to pull image..
apache_beam/runners/dataflow/dataflow_runner.py:1643: DataflowRuntimeException
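For context on the traceback above: the exception is raised from wait_until_finish() once the job reaches a FAILED terminal state. A minimal sketch (not part of this log) of how a submitting script would see it, assuming a pipeline launched with DataflowRunner and placeholder option values:

# Sketch only: placeholder project/bucket values, not the values from this run.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions
from apache_beam.runners.dataflow.dataflow_runner import DataflowRuntimeException

options = PipelineOptions([
    '--runner=DataflowRunner',
    '--project=my-project',                # placeholder
    '--region=us-central1',
    '--temp_location=gs://my-bucket/tmp',  # placeholder
])

try:
    # The Pipeline context manager calls run() and wait_until_finish() on exit;
    # a FAILED terminal state raises DataflowRuntimeException as shown above.
    with beam.Pipeline(options=options) as pipeline:
        _ = pipeline | beam.Create([1, 2, 3]) | beam.Map(lambda x: x * 2)
except DataflowRuntimeException as exc:
    # The message carries the job state and the service's last_error_msg,
    # e.g. the "Unable to pull container image" error reported here.
    print('Dataflow job failed: %s' % exc)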
------------------------------ Captured log call -------------------------------
INFO apache_beam.runners.portability.stager:stager.py:303 Copying Beam SDK
"<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Dataflow/ws/src/sdks/python/build/apache-beam.tar.gz">
to staging location.
WARNING root:environments.py:374 Make sure that locally built Python SDK
docker image has Python 3.6 interpreter.
INFO root:environments.py:380 Default Python SDK image for environment is
apache/beam_python3.6_sdk:2.35.0.dev
INFO root:environments.py:296 Using provided Python SDK container image:
gcr.io/cloud-dataflow/v1beta3/python36-fnapi:beam-master-20211015
INFO root:environments.py:304 Python SDK container image set to
"gcr.io/cloud-dataflow/v1beta3/python36-fnapi:beam-master-20211015" for Docker
environment
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678
==================== <function pack_combiners at 0x7f7a60c1e840>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678
==================== <function sort_stages at 0x7f7a60d72048>
====================
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:454
Defaulting to the temp_location as staging_location:
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:648
Starting GCS upload to
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1108002056-175296.1636330856.175813/icedtea-sound-Bdoi2wYa757-fzq5vconCy4SSQ22ZaOq7yuC98fKPs8.jar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:667
Completed GCS upload to
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1108002056-175296.1636330856.175813/icedtea-sound-Bdoi2wYa757-fzq5vconCy4SSQ22ZaOq7yuC98fKPs8.jar
in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:648
Starting GCS upload to
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1108002056-175296.1636330856.175813/jaccess-CMbK-IOdQPLKHEqCuDnE-yBk-VpbtVT-hgjbHRUGO78.jar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:667
Completed GCS upload to
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1108002056-175296.1636330856.175813/jaccess-CMbK-IOdQPLKHEqCuDnE-yBk-VpbtVT-hgjbHRUGO78.jar
in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:648
Starting GCS upload to
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1108002056-175296.1636330856.175813/localedata-ae5Z0L6ak4922fztWeWy7ajiWXdG3ubNrwerJRFoFj0.jar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:667
Completed GCS upload to
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1108002056-175296.1636330856.175813/localedata-ae5Z0L6ak4922fztWeWy7ajiWXdG3ubNrwerJRFoFj0.jar
in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:648
Starting GCS upload to
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1108002056-175296.1636330856.175813/nashorn-XHtz_UehGpYcLTOrATrTnMNVUgEVa_ttoWkPxnVfqTo.jar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:667
Completed GCS upload to
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1108002056-175296.1636330856.175813/nashorn-XHtz_UehGpYcLTOrATrTnMNVUgEVa_ttoWkPxnVfqTo.jar
in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:648
Starting GCS upload to
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1108002056-175296.1636330856.175813/cldrdata-k07I6K9W3X5KTQbcDIEsqM0LXyM18f0eR6IaJw-P_xk.jar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:667
Completed GCS upload to
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1108002056-175296.1636330856.175813/cldrdata-k07I6K9W3X5KTQbcDIEsqM0LXyM18f0eR6IaJw-P_xk.jar
in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:648
Starting GCS upload to
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1108002056-175296.1636330856.175813/dnsns-RGhCDg3GVOQVC2r6ka2N0hmI4eqQH6VobuoAnQ74MnE.jar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:667
Completed GCS upload to
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1108002056-175296.1636330856.175813/dnsns-RGhCDg3GVOQVC2r6ka2N0hmI4eqQH6VobuoAnQ74MnE.jar
in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:648
Starting GCS upload to
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1108002056-175296.1636330856.175813/beam-sdks-java-testing-expansion-service-testExpansionService-2.35.0-SNAPSHOT-UROKmaGNH684BTk9cZMtLqcho6glylYpdf6EKMYw3pA.jar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:667
Completed GCS upload to
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1108002056-175296.1636330856.175813/beam-sdks-java-testing-expansion-service-testExpansionService-2.35.0-SNAPSHOT-UROKmaGNH684BTk9cZMtLqcho6glylYpdf6EKMYw3pA.jar
in 3 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:648
Starting GCS upload to
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1108002056-175296.1636330856.175813/dataflow_python_sdk.tar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:667
Completed GCS upload to
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1108002056-175296.1636330856.175813/dataflow_python_sdk.tar
in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:648
Starting GCS upload to
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1108002056-175296.1636330856.175813/pipeline.pb...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:667
Completed GCS upload to
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1108002056-175296.1636330856.175813/pipeline.pb
in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:828
Create job: <Job
clientRequestId: '20211108002056176712-3474'
createTime: '2021-11-08T00:21:01.668201Z'
currentStateTime: '1970-01-01T00:00:00Z'
id:
'2021-11-07_16_21_01-15280342487259213274'
location: 'us-central1'
name: 'beamapp-jenkins-1108002056-175296'
projectId: 'apache-beam-testing'
stageStates: []
startTime: '2021-11-08T00:21:01.668201Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:830
Created job with id: [2021-11-07_16_21_01-15280342487259213274]
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:831
Submitted job: 2021-11-07_16_21_01-15280342487259213274
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:837 To
access the Dataflow monitoring console, please navigate to
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-11-07_16_21_01-15280342487259213274?project=apache-beam-testing
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:191
Job 2021-11-07_16_21_01-15280342487259213274 is in state JOB_STATE_RUNNING
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:04.138Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job
2021-11-07_16_21_01-15280342487259213274. The number of workers will be between
1 and 1000.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:04.185Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically
enabled for job 2021-11-07_16_21_01-15280342487259213274.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:05.828Z: JOB_MESSAGE_BASIC: Worker configuration:
e2-standard-2 in us-central1-b.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:06.333Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo
operations into optimizable parts.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:06.375Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton
operations into optimizable parts.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:06.426Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey
operations into optimizable parts.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:06.449Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step
check_odd/Group/CoGroupByKeyImpl/GroupByKey: GroupByKey not followed by a
combiner.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:06.466Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step
check_even/Group/CoGroupByKeyImpl/GroupByKey: GroupByKey not followed by a
combiner.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:06.501Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not
followed by a combiner.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:06.540Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations
into optimizable parts.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:06.566Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner
information.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:06.601Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read,
Write, and Flatten operations
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:06.635Z: JOB_MESSAGE_DEBUG: Inserted coder converter before
flatten ref_AppliedPTransform_check_even-Group-CoGroupByKeyImpl-Flatten_27
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:06.683Z: JOB_MESSAGE_DEBUG: Inserted coder converter before
flatten ref_AppliedPTransform_check_even-Group-CoGroupByKeyImpl-Flatten_27
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:06.704Z: JOB_MESSAGE_DEBUG: Inserted coder converter before
flatten ref_AppliedPTransform_check_odd-Group-CoGroupByKeyImpl-Flatten_45
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:06.736Z: JOB_MESSAGE_DEBUG: Inserted coder converter before
flatten ref_AppliedPTransform_check_odd-Group-CoGroupByKeyImpl-Flatten_45
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:06.772Z: JOB_MESSAGE_DETAILED: Unzipping flatten
ref_AppliedPTransform_check_even-Group-CoGroupByKeyImpl-Flatten_27 for input
ref_AppliedPTransform_check_even-Group-CoGroupByKeyImpl-Tag-0-_25.None-post19
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:06.803Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of
check_even/Group/CoGroupByKeyImpl/GroupByKey/Write, through flatten
check_even/Group/CoGroupByKeyImpl/Flatten, into producer
check_even/Group/CoGroupByKeyImpl/Flatten/InputIdentity
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:06.827Z: JOB_MESSAGE_DETAILED: Fusing consumer
check_even/Group/CoGroupByKeyImpl/MapTuple(collect_values) into
check_even/Group/CoGroupByKeyImpl/GroupByKey/Read
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:06.849Z: JOB_MESSAGE_DETAILED: Fusing consumer
check_even/Group/RestoreTags into
check_even/Group/CoGroupByKeyImpl/MapTuple(collect_values)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:06.873Z: JOB_MESSAGE_DETAILED: Fusing consumer
check_even/Unkey into check_even/Group/RestoreTags
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:06.902Z: JOB_MESSAGE_DETAILED: Fusing consumer
check_even/Match into check_even/Unkey
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:06.935Z: JOB_MESSAGE_DETAILED: Unzipping flatten
ref_AppliedPTransform_check_odd-Group-CoGroupByKeyImpl-Flatten_45 for input
ref_AppliedPTransform_check_odd-Group-CoGroupByKeyImpl-Tag-0-_43.None-post25
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:06.969Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of
check_odd/Group/CoGroupByKeyImpl/GroupByKey/Write, through flatten
check_odd/Group/CoGroupByKeyImpl/Flatten, into producer
check_odd/Group/CoGroupByKeyImpl/Flatten/InputIdentity
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:07.002Z: JOB_MESSAGE_DETAILED: Fusing consumer
check_odd/Group/CoGroupByKeyImpl/MapTuple(collect_values) into
check_odd/Group/CoGroupByKeyImpl/GroupByKey/Read
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:07.027Z: JOB_MESSAGE_DETAILED: Fusing consumer
check_odd/Group/RestoreTags into
check_odd/Group/CoGroupByKeyImpl/MapTuple(collect_values)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:07.055Z: JOB_MESSAGE_DETAILED: Fusing consumer check_odd/Unkey
into check_odd/Group/RestoreTags
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:07.085Z: JOB_MESSAGE_DETAILED: Fusing consumer check_odd/Match
into check_odd/Unkey
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:07.111Z: JOB_MESSAGE_DETAILED: Fusing consumer
check_even/Group/CoGroupByKeyImpl/GroupByKey/Write into
check_even/Group/CoGroupByKeyImpl/Flatten/InputIdentity
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:07.145Z: JOB_MESSAGE_DETAILED: Fusing consumer
check_odd/Group/CoGroupByKeyImpl/GroupByKey/Write into
check_odd/Group/CoGroupByKeyImpl/Flatten/InputIdentity
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:07.179Z: JOB_MESSAGE_DETAILED: Fusing consumer
Create/FlatMap(<lambda at core.py:3222>) into Create/Impulse
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:07.212Z: JOB_MESSAGE_DETAILED: Fusing consumer
Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap(<lambda at
core.py:3222>)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:07.240Z: JOB_MESSAGE_DETAILED: Fusing consumer
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into
Create/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:07.259Z: JOB_MESSAGE_DETAILED: Fusing consumer
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify into
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:07.289Z: JOB_MESSAGE_DETAILED: Fusing consumer
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write into
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:07.314Z: JOB_MESSAGE_DETAILED: Fusing consumer
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow into
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:07.337Z: JOB_MESSAGE_DETAILED: Fusing consumer
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:07.362Z: JOB_MESSAGE_DETAILED: Fusing consumer
Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:07.396Z: JOB_MESSAGE_DETAILED: Fusing consumer
Create/Map(decode) into Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:07.421Z: JOB_MESSAGE_DETAILED: Fusing consumer
ExternalTransform(beam:transforms:xlang:test:partition)/ParMultiDo(Partition)
into Create/Map(decode)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:07.456Z: JOB_MESSAGE_DETAILED: Fusing consumer
check_even/WindowInto(WindowIntoFn) into
ExternalTransform(beam:transforms:xlang:test:partition)/ParMultiDo(Partition)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:07.482Z: JOB_MESSAGE_DETAILED: Fusing consumer
check_odd/WindowInto(WindowIntoFn) into
ExternalTransform(beam:transforms:xlang:test:partition)/ParMultiDo(Partition)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:07.516Z: JOB_MESSAGE_DETAILED: Fusing consumer
check_even/Create/FlatMap(<lambda at core.py:3222>) into
check_even/Create/Impulse
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:07.545Z: JOB_MESSAGE_DETAILED: Fusing consumer
check_even/Create/Map(decode) into check_even/Create/FlatMap(<lambda at
core.py:3222>)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:07.576Z: JOB_MESSAGE_DETAILED: Fusing consumer
check_even/Group/CoGroupByKeyImpl/Tag[0] into check_even/Create/Map(decode)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:07.592Z: JOB_MESSAGE_DETAILED: Fusing consumer
check_even/Group/CoGroupByKeyImpl/Flatten/InputIdentity into
check_even/Group/CoGroupByKeyImpl/Tag[0]
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:07.623Z: JOB_MESSAGE_DETAILED: Fusing consumer
check_even/ToVoidKey into check_even/WindowInto(WindowIntoFn)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:07.651Z: JOB_MESSAGE_DETAILED: Fusing consumer
check_even/Group/CoGroupByKeyImpl/Tag[1] into check_even/ToVoidKey
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:07.685Z: JOB_MESSAGE_DETAILED: Fusing consumer
check_even/Group/CoGroupByKeyImpl/Flatten/InputIdentity into
check_even/Group/CoGroupByKeyImpl/Tag[1]
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:07.710Z: JOB_MESSAGE_DETAILED: Fusing consumer
check_odd/Create/FlatMap(<lambda at core.py:3222>) into check_odd/Create/Impulse
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:07.743Z: JOB_MESSAGE_DETAILED: Fusing consumer
check_odd/Create/Map(decode) into check_odd/Create/FlatMap(<lambda at
core.py:3222>)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:07.769Z: JOB_MESSAGE_DETAILED: Fusing consumer
check_odd/Group/CoGroupByKeyImpl/Tag[0] into check_odd/Create/Map(decode)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:07.795Z: JOB_MESSAGE_DETAILED: Fusing consumer
check_odd/Group/CoGroupByKeyImpl/Flatten/InputIdentity into
check_odd/Group/CoGroupByKeyImpl/Tag[0]
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:07.825Z: JOB_MESSAGE_DETAILED: Fusing consumer
check_odd/ToVoidKey into check_odd/WindowInto(WindowIntoFn)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:07.847Z: JOB_MESSAGE_DETAILED: Fusing consumer
check_odd/Group/CoGroupByKeyImpl/Tag[1] into check_odd/ToVoidKey
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:07.875Z: JOB_MESSAGE_DETAILED: Fusing consumer
check_odd/Group/CoGroupByKeyImpl/Flatten/InputIdentity into
check_odd/Group/CoGroupByKeyImpl/Tag[1]
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:07.907Z: JOB_MESSAGE_DEBUG: Workflow config is missing a
default resource spec.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:07.937Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and
teardown to workflow graph.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:07.970Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop
steps.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:08Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:08.140Z: JOB_MESSAGE_DEBUG: Executing wait step start39
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:08.197Z: JOB_MESSAGE_BASIC: Executing operation
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:08.219Z: JOB_MESSAGE_BASIC: Executing operation
check_even/Group/CoGroupByKeyImpl/GroupByKey/Create
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:08.233Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:08.251Z: JOB_MESSAGE_BASIC: Executing operation
check_odd/Group/CoGroupByKeyImpl/GroupByKey/Create
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:08.263Z: JOB_MESSAGE_BASIC: Starting 1 workers in
us-central1-b...
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:08.301Z: JOB_MESSAGE_BASIC: Finished operation
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:08.315Z: JOB_MESSAGE_BASIC: Finished operation
check_even/Group/CoGroupByKeyImpl/GroupByKey/Create
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:08.315Z: JOB_MESSAGE_BASIC: Finished operation
check_odd/Group/CoGroupByKeyImpl/GroupByKey/Create
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:08.355Z: JOB_MESSAGE_DEBUG: Value
"Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Session"
materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:08.381Z: JOB_MESSAGE_DEBUG: Value
"check_odd/Group/CoGroupByKeyImpl/GroupByKey/Session" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:08.409Z: JOB_MESSAGE_DEBUG: Value
"check_even/Group/CoGroupByKeyImpl/GroupByKey/Session" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:08.438Z: JOB_MESSAGE_BASIC: Executing operation
Create/Impulse+Create/FlatMap(<lambda at
core.py:3222>)+Create/MaybeReshuffle/Reshuffle/AddRandomKeys+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:08.473Z: JOB_MESSAGE_BASIC: Executing operation
check_odd/Create/Impulse+check_odd/Create/FlatMap(<lambda at
core.py:3222>)+check_odd/Create/Map(decode)+check_odd/Group/CoGroupByKeyImpl/Tag[0]+check_odd/Group/CoGroupByKeyImpl/Flatten/InputIdentity+check_odd/Group/CoGroupByKeyImpl/GroupByKey/Write
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:08.510Z: JOB_MESSAGE_BASIC: Executing operation
check_even/Create/Impulse+check_even/Create/FlatMap(<lambda at
core.py:3222>)+check_even/Create/Map(decode)+check_even/Group/CoGroupByKeyImpl/Tag[0]+check_even/Group/CoGroupByKeyImpl/Flatten/InputIdentity+check_even/Group/CoGroupByKeyImpl/GroupByKey/Write
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:37.437Z: JOB_MESSAGE_BASIC: Your project already contains 100
Dataflow-created metric descriptors, so new user metrics of the form
custom.googleapis.com/* will not be created. However, all user metrics are also
available in the metric dataflow.googleapis.com/job/user_counter. If you rely
on the custom metrics, you can delete old / unused metric descriptors. See
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
and
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:21:51.103Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number
of workers to 1 based on the rate of progress in the currently running stage(s).
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:22:31.723Z: JOB_MESSAGE_WARNING: A worker was unable to start up.
Error: Unable to pull container image due to error: image pull request failed
with error: Error response from daemon: manifest for
us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211108000113 not found:
manifest unknown: Failed to fetch "20211108000113" from request
"/v2/apache-beam-testing/java-postcommit-it/java/manifests/20211108000113"..
This is likely due to an invalid SDK container image URL. Please verify any
provided SDK container image is valid and that Dataflow workers have
permissions to pull image.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:22:59.121Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number
of workers to 1 based on the rate of progress in the currently running stage(s).
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:23:44.795Z: JOB_MESSAGE_WARNING: A worker was unable to start up.
Error: Unable to pull container image due to error: image pull request failed
with error: Error response from daemon: manifest for
us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211108000113 not found:
manifest unknown: Failed to fetch "20211108000113" from request
"/v2/apache-beam-testing/java-postcommit-it/java/manifests/20211108000113"..
This is likely due to an invalid SDK container image URL. Please verify any
provided SDK container image is valid and that Dataflow workers have
permissions to pull image.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:24:12.267Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number
of workers to 1 based on the rate of progress in the currently running stage(s).
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:24:54.097Z: JOB_MESSAGE_WARNING: A worker was unable to start up.
Error: Unable to pull container image due to error: image pull request failed
with error: Error response from daemon: manifest for
us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211108000113 not found:
manifest unknown: Failed to fetch "20211108000113" from request
"/v2/apache-beam-testing/java-postcommit-it/java/manifests/20211108000113"..
This is likely due to an invalid SDK container image URL. Please verify any
provided SDK container image is valid and that Dataflow workers have
permissions to pull image.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:25:21.729Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number
of workers to 1 based on the rate of progress in the currently running stage(s).
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:26:06.554Z: JOB_MESSAGE_WARNING: A worker was unable to start up.
Error: Unable to pull container image due to error: image pull request failed
with error: Error response from daemon: manifest for
us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211108000113 not found:
manifest unknown: Failed to fetch "20211108000113" from request
"/v2/apache-beam-testing/java-postcommit-it/java/manifests/20211108000113"..
This is likely due to an invalid SDK container image URL. Please verify any
provided SDK container image is valid and that Dataflow workers have
permissions to pull image.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:26:29.627Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number
of workers to 1 based on the rate of progress in the currently running stage(s).
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:27:10.902Z: JOB_MESSAGE_WARNING: A worker was unable to start up.
Error: Unable to pull container image due to error: image pull request failed
with error: Error response from daemon: manifest for
us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211108000113 not found:
manifest unknown: Failed to fetch "20211108000113" from request
"/v2/apache-beam-testing/java-postcommit-it/java/manifests/20211108000113"..
This is likely due to an invalid SDK container image URL. Please verify any
provided SDK container image is valid and that Dataflow workers have
permissions to pull image.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:27:10.928Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: Job
appears to be stuck. Several workers have failed to start up in a row, and no
worker has successfully started up for this job. Last error reported: Unable to
pull container image due to error: image pull request failed with error: Error
response from daemon: manifest for
us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211108000113 not found:
manifest unknown: Failed to fetch "20211108000113" from request
"/v2/apache-beam-testing/java-postcommit-it/java/manifests/20211108000113"..
This is likely due to an invalid SDK container image URL. Please verify any
provided SDK container image is valid and that Dataflow workers have
permissions to pull image..
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:27:10.992Z: JOB_MESSAGE_BASIC: Finished operation
check_even/Create/Impulse+check_even/Create/FlatMap(<lambda at
core.py:3222>)+check_even/Create/Map(decode)+check_even/Group/CoGroupByKeyImpl/Tag[0]+check_even/Group/CoGroupByKeyImpl/Flatten/InputIdentity+check_even/Group/CoGroupByKeyImpl/GroupByKey/Write
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:27:10.992Z: JOB_MESSAGE_BASIC: Finished operation
check_odd/Create/Impulse+check_odd/Create/FlatMap(<lambda at
core.py:3222>)+check_odd/Create/Map(decode)+check_odd/Group/CoGroupByKeyImpl/Tag[0]+check_odd/Group/CoGroupByKeyImpl/Flatten/InputIdentity+check_odd/Group/CoGroupByKeyImpl/GroupByKey/Write
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:27:10.992Z: JOB_MESSAGE_BASIC: Finished operation
Create/Impulse+Create/FlatMap(<lambda at
core.py:3222>)+Create/MaybeReshuffle/Reshuffle/AddRandomKeys+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:27:11.060Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:27:11.257Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:27:11.306Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:27:11.565Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker
pool from 1 to 0.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:27:53.057Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2021-11-08T00:27:53.088Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:191
Job 2021-11-07_16_21_01-15280342487259213274 is in state JOB_STATE_FAILED
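The repeated JOB_MESSAGE_WARNING entries above all point at the same cause: the tag us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211108000113 does not exist in the registry. A minimal sketch (not part of this log) of one way to pre-check that a worker container tag is pullable before submitting a job, assuming the gcloud CLI is installed and authenticated with read access to the repository:

# Sketch only: checks the image manifest up front instead of failing at
# Dataflow worker startup.
import subprocess

IMAGE = 'us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211108000113'

check = subprocess.run(
    ['gcloud', 'container', 'images', 'describe', IMAGE],
    stdout=subprocess.PIPE, stderr=subprocess.PIPE, universal_newlines=True)

if check.returncode != 0:
    # A missing manifest (as in this run) surfaces here as a non-zero exit.
    print('Image is not pullable:\n' + check.stderr)
else:
    print('Found manifest for ' + IMAGE)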
=============================== warnings summary ===============================
apache_beam/io/filesystems_test.py:54
apache_beam/io/filesystems_test.py:54
apache_beam/io/filesystems_test.py:54
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Dataflow/ws/src/sdks/python/apache_beam/io/filesystems_test.py>:54:
DeprecationWarning: invalid escape sequence \c
self.assertIsNone(FileSystems.get_scheme('c:\\abc\cdf'))  # pylint: disable=anomalous-backslash-in-string
apache_beam/io/filesystems_test.py:62
apache_beam/io/filesystems_test.py:62
apache_beam/io/filesystems_test.py:62
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Dataflow/ws/src/sdks/python/apache_beam/io/filesystems_test.py>:62:
DeprecationWarning: invalid escape sequence \d
self.assertTrue(isinstance(FileSystems.get_filesystem('c:\\abc\def'),  # pylint: disable=anomalous-backslash-in-string
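Both warnings come from backslash escapes like \c and \d that are not valid escape sequences in ordinary Python string literals. A minimal illustration (not taken from the test file) of the raw-string form that avoids the warning while keeping the same path value:

legacy = 'c:\\abc\cdf'   # '\c' is flagged as an invalid escape sequence
fixed = r'c:\abc\cdf'    # raw string: same characters, no deprecation warning
assert legacy == fixed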
-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_Dataflow/ws/src/sdks/python/pytest_xlangValidateRunner.xml> -
=============== 9 failed, 1 passed, 6 warnings in 944.02 seconds ===============
> Task :runners:google-cloud-dataflow-java:validatesCrossLanguageRunnerPythonUsingJava FAILED
> Task :runners:google-cloud-dataflow-java:validatesCrossLanguageRunnerCleanup
Stopping expansion service pid: 19744.
Stopping expansion service pid: 19748.
> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages FAILED
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211108000113
Untagged:
us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ba4c871c00d48a134b066790613b4ceef046d527a274d882dbdbc3be75fc86f5
Deleted: sha256:ce56c1d1380fc26f4aa0dbe100b7cc3ba48c32f2fcd585e684929080bfacee87
Deleted: sha256:f10eaeca0dd66e7bc6d972ad258a1a79e7da8b3d469b4471f179e649e970d1b2
Deleted: sha256:027c59e6baabc2a6584c4f14024ea8d5c86b59a89522af4b7353fa911649741c
Deleted: sha256:9fd7fd621a65fcac0d34da42f026e9a5265ce8298a2936bdf399103c6ea2b944
Deleted: sha256:93b807ba4f528fada1053449f13a9dc59e083370cfc8becae5df81adbaaf2421
Deleted: sha256:71ce7d7636e12db0d48f59a53e9f0b5cc444c81440aa756acee12c1b28e84d20
Deleted: sha256:a70cb6f2de4aace6de3f16d9fbeac872ecd5a2ad41c1156bb70c85df16e4a629
Deleted: sha256:89c9728a42877c7a12321f46080736e8822431b6b2462ad682904daa7ae91baf
Deleted: sha256:307a204c5eb9c3dee14870727b3d7df48ad1aa3dd50596b5e5a9b129e2ab9e2c
Deleted: sha256:4ff02cf3568e7994e11aee9ac7ce1e43303e9aa1964e7ae0acae7b6011704882
Deleted: sha256:18daf7527b6322d58e37766daac7dd8722abe7f47b8a7df883ecd33b8110b3d7
Deleted: sha256:2b19f85e1fc19ce4f9e423a3c04886fedb974c70ae01dd40030991686cd83534
Deleted: sha256:5db3e30a6c2f2a504cdb85ffbc87aaa2639072a787de612c672b555dcbbc3f4a
Deleted: sha256:4597a1f002546d4184f094699a57e98fccee13cf4c5ebaf87cc413e8995088ee
Deleted: sha256:77557c3dab2f802187863c7da54bcb26e8c38a0c32d548cbbd13a29270e279f5
Deleted: sha256:5fb95ef93c356d4529c38a47e6eab91ec4d21739a86ac1bed052509d2a8507ff
Deleted: sha256:b5f0c7d19656a01ee08d351744d2382dbb9836eebd8b6db2b1672eebffda9ada
Deleted: sha256:00bfb58c994048ff8771cb305b95535a196ae53652cc6955bc76954f9dd0b690
Deleted: sha256:39ef5984bec9a21282febb350aa4bde694d271b0864168ffb1db981d352162ab
Deleted: sha256:2d805fd0997a208c16642a12dc37f66a0516c96517d5069956c76db34a4a73f4
Deleted: sha256:969f9e0c365a74b10470b60f4c96c3df5120cc4d3d7f465f5d65ef320f80037b
Deleted: sha256:70da34f9659af14f40f9170465c12679a06b0c14706b9ca7b2c2d6466ab19ec4
Deleted: sha256:f7eae0085058cebe22ed363f8c71ef806d38b7d5b3aa7ce8fec8a793a11176ac
Deleted: sha256:c47bfb55bf6fa0abe9c7e0ab55da8a69d747ce7ae7f1cb0691ce85e75fdf0ff7
Deleted: sha256:ccfc3ac8d597d60b630b0674b09029eab9c4b0d05de810a0fe1fdbbbd2144cf4
Deleted: sha256:15a97c87c0f8449491fe60f89ed55ba1e5e4275d370337019a8a5ea979b2745a
Deleted: sha256:f9b09f17b6ab6912fd48019a68104ebb07eb9ab228eaf4bf047aed672d30d2df
Deleted: sha256:7094705cc90147761fc39e15a6d1d982d4341b4a324e2c46d3376cd46e90a3aa
ERROR: (gcloud.container.images.untag) Image could not be found:
[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20211108000113]
FAILURE: Build completed with 2 failures.
1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task
':runners:google-cloud-dataflow-java:validatesCrossLanguageRunnerPythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* Where:
Build file
'<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Dataflow/ws/src/runners/google-cloud-dataflow-java/build.gradle>'
line: 278
* What went wrong:
Execution failed for task
':runners:google-cloud-dataflow-java:cleanUpDockerImages'.
> Process 'command 'gcloud'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 28m 5s
132 actionable tasks: 94 executed, 34 from cache, 4 up-to-date
Publishing build scan...
https://gradle.com/s/oaz4oveqepmss
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]