See <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/322/display/redirect?page=changes>
Changes:

[chamikaramj] Renames ExternalPythonTransform to PythonExternalTransform

------------------------------------------
[...truncated 284.17 KB...]
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:43:12.211Z: JOB_MESSAGE_DEBUG: Executing wait step start44
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:43:12.300Z: JOB_MESSAGE_BASIC: Executing operation read_words/FilesToRemoveImpulse/Impulse+read_words/FilesToRemoveImpulse/FlatMap(<lambda at core.py:3320>)+read_words/FilesToRemoveImpulse/Map(decode)+read_words/MapFilesToRemove
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:43:12.332Z: JOB_MESSAGE_BASIC: Executing operation read corpus/FilesToRemoveImpulse/Impulse+read corpus/FilesToRemoveImpulse/FlatMap(<lambda at core.py:3320>)+read corpus/FilesToRemoveImpulse/Map(decode)+read corpus/MapFilesToRemove
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:43:12.347Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:43:12.369Z: JOB_MESSAGE_BASIC: Executing operation create_ignore_corpus/Impulse+create_ignore_corpus/FlatMap(<lambda at core.py:3320>)+create_ignore_corpus/Map(decode)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:43:12.381Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b...
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:43:12.404Z: JOB_MESSAGE_BASIC: Executing operation read corpus/Read/Impulse+read corpus/Read/Map(<lambda at iobase.py:898>)+ref_AppliedPTransform_read-corpus-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_13/PairWithRestriction+ref_AppliedPTransform_read-corpus-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_13/SplitWithSizing
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:43:12.439Z: JOB_MESSAGE_BASIC: Executing operation create_ignore_word/Impulse+create_ignore_word/FlatMap(<lambda at core.py:3320>)+create_ignore_word/Map(decode)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:43:12.472Z: JOB_MESSAGE_BASIC: Executing operation read_words/Read/Impulse+read_words/Read/Map(<lambda at iobase.py:898>)+ref_AppliedPTransform_read_words-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_34/PairWithRestriction+ref_AppliedPTransform_read_words-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_34/SplitWithSizing
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:43:12.506Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/DoOnce/Impulse+WriteToText/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:3320>)+WriteToText/Write/WriteImpl/DoOnce/Map(decode)+WriteToText/Write/WriteImpl/InitializeWrite
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:43:12.539Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/GroupByKey/Create
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:43:12.587Z: JOB_MESSAGE_BASIC: Executing operation create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:43:12.616Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/GroupByKey/Create
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:43:12.638Z: JOB_MESSAGE_BASIC: Finished operation create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:43:12.689Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/GroupByKey/Session" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:43:12.713Z: JOB_MESSAGE_DEBUG: Value "create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Session" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:43:12.812Z: JOB_MESSAGE_BASIC: Executing operation create groups/Impulse+create groups/FlatMap(<lambda at core.py:3320>)+create groups/MaybeReshuffle/Reshuffle/AddRandomKeys+create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:43:21.268Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:43:35.126Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:43:58.910Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:50:52.604Z: JOB_MESSAGE_BASIC: Finished operation read_words/Read/Impulse+read_words/Read/Map(<lambda at iobase.py:898>)+ref_AppliedPTransform_read_words-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_34/PairWithRestriction+ref_AppliedPTransform_read_words-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_34/SplitWithSizing
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:50:52.710Z: JOB_MESSAGE_DEBUG: Value "ref_AppliedPTransform_read_words-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_34-split-with-sizing-out9" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:50:52.720Z: JOB_MESSAGE_BASIC: Finished operation read corpus/Read/Impulse+read corpus/Read/Map(<lambda at iobase.py:898>)+ref_AppliedPTransform_read-corpus-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_13/PairWithRestriction+ref_AppliedPTransform_read-corpus-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_13/SplitWithSizing
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:50:52.797Z: JOB_MESSAGE_DEBUG: Value "ref_AppliedPTransform_read-corpus-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_13-split-with-sizing-out3" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:50:52.829Z: JOB_MESSAGE_BASIC: Executing operation ref_AppliedPTransform_read_words-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_34/ProcessElementAndRestrictionWithSizing+read_words/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:50:52.906Z: JOB_MESSAGE_BASIC: Executing operation ref_AppliedPTransform_read-corpus-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_13/ProcessElementAndRestrictionWithSizing+read corpus/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:04.709Z: JOB_MESSAGE_BASIC: Finished operation create groups/Impulse+create groups/FlatMap(<lambda at core.py:3320>)+create groups/MaybeReshuffle/Reshuffle/AddRandomKeys+create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:04.810Z: JOB_MESSAGE_BASIC: Executing operation create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Close
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:04.864Z: JOB_MESSAGE_BASIC: Finished operation create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Close
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:05.313Z: JOB_MESSAGE_BASIC: Finished operation read corpus/FilesToRemoveImpulse/Impulse+read corpus/FilesToRemoveImpulse/FlatMap(<lambda at core.py:3320>)+read corpus/FilesToRemoveImpulse/Map(decode)+read corpus/MapFilesToRemove
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:05.396Z: JOB_MESSAGE_DEBUG: Value "read corpus/MapFilesToRemove.None" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:05.455Z: JOB_MESSAGE_BASIC: Executing operation read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input1-read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:05.514Z: JOB_MESSAGE_BASIC: Finished operation read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input1-read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:05.600Z: JOB_MESSAGE_DEBUG: Value "read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input1-read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles).out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:08.715Z: JOB_MESSAGE_BASIC: Finished operation create_ignore_corpus/Impulse+create_ignore_corpus/FlatMap(<lambda at core.py:3320>)+create_ignore_corpus/Map(decode)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:08.801Z: JOB_MESSAGE_DEBUG: Value "create_ignore_corpus/Map(decode).None" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:08.908Z: JOB_MESSAGE_BASIC: Executing operation attach corpus/View-python_side_input1-attach corpus
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:08.978Z: JOB_MESSAGE_BASIC: Finished operation attach corpus/View-python_side_input1-attach corpus
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:09.064Z: JOB_MESSAGE_DEBUG: Value "attach corpus/View-python_side_input1-attach corpus.out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:09.497Z: JOB_MESSAGE_BASIC: Finished operation ref_AppliedPTransform_read_words-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_34/ProcessElementAndRestrictionWithSizing+read_words/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:09.568Z: JOB_MESSAGE_DEBUG: Value "read_words/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough).cleanup_signal" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:09.598Z: JOB_MESSAGE_DEBUG: Value "read_words/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough).None" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:09.646Z: JOB_MESSAGE_BASIC: Executing operation read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input0-read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:09.681Z: JOB_MESSAGE_BASIC: Executing operation attach word/View-python_side_input0-attach word
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:09.705Z: JOB_MESSAGE_BASIC: Finished operation read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input0-read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:09.737Z: JOB_MESSAGE_BASIC: Finished operation attach word/View-python_side_input0-attach word
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:09.779Z: JOB_MESSAGE_DEBUG: Value "read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input0-read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles).out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:09.806Z: JOB_MESSAGE_DEBUG: Value "attach word/View-python_side_input0-attach word.out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:12.515Z: JOB_MESSAGE_BASIC: Finished operation ref_AppliedPTransform_read-corpus-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_13/ProcessElementAndRestrictionWithSizing+read corpus/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:12.590Z: JOB_MESSAGE_DEBUG: Value "read corpus/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough).None" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:12.614Z: JOB_MESSAGE_DEBUG: Value "read corpus/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough).cleanup_signal" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:12.671Z: JOB_MESSAGE_BASIC: Executing operation attach corpus/View-python_side_input0-attach corpus
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:12.704Z: JOB_MESSAGE_BASIC: Executing operation read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input0-read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:12.741Z: JOB_MESSAGE_BASIC: Finished operation attach corpus/View-python_side_input0-attach corpus
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:12.766Z: JOB_MESSAGE_BASIC: Finished operation read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input0-read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:12.802Z: JOB_MESSAGE_DEBUG: Value "attach corpus/View-python_side_input0-attach corpus.out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:12.835Z: JOB_MESSAGE_DEBUG: Value "read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input0-read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles).out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:12.957Z: JOB_MESSAGE_BASIC: Executing operation read corpus/_PassThroughThenCleanup/Create/Impulse+read corpus/_PassThroughThenCleanup/Create/FlatMap(<lambda at core.py:3320>)+read corpus/_PassThroughThenCleanup/Create/Map(decode)+read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:12.981Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/DoOnce/Impulse+WriteToText/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:3320>)+WriteToText/Write/WriteImpl/DoOnce/Map(decode)+WriteToText/Write/WriteImpl/InitializeWrite
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:13.070Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/DoOnce/Map(decode).None" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:13.104Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/InitializeWrite.None" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:13.168Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/WriteBundles/View-python_side_input0-WriteToText/Write/WriteImpl/WriteBundles
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:13.198Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input0-WriteToText/Write/WriteImpl/FinalizeWrite
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:13.230Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/PreFinalize/View-python_side_input0-WriteToText/Write/WriteImpl/PreFinalize
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:13.247Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/WriteBundles/View-python_side_input0-WriteToText/Write/WriteImpl/WriteBundles
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:13.249Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input0-WriteToText/Write/WriteImpl/FinalizeWrite
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:13.284Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/PreFinalize/View-python_side_input0-WriteToText/Write/WriteImpl/PreFinalize
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:13.333Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/WriteBundles/View-python_side_input0-WriteToText/Write/WriteImpl/WriteBundles.out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:13.377Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input0-WriteToText/Write/WriteImpl/FinalizeWrite.out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:13.403Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/PreFinalize/View-python_side_input0-WriteToText/Write/WriteImpl/PreFinalize.out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:15.461Z: JOB_MESSAGE_BASIC: Finished operation read corpus/_PassThroughThenCleanup/Create/Impulse+read corpus/_PassThroughThenCleanup/Create/FlatMap(<lambda at core.py:3320>)+read corpus/_PassThroughThenCleanup/Create/Map(decode)+read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:16.310Z: JOB_MESSAGE_BASIC: Finished operation read_words/FilesToRemoveImpulse/Impulse+read_words/FilesToRemoveImpulse/FlatMap(<lambda at core.py:3320>)+read_words/FilesToRemoveImpulse/Map(decode)+read_words/MapFilesToRemove
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:16.370Z: JOB_MESSAGE_DEBUG: Value "read_words/MapFilesToRemove.None" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:16.436Z: JOB_MESSAGE_BASIC: Executing operation read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input1-read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:16.507Z: JOB_MESSAGE_BASIC: Finished operation read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input1-read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:16.561Z: JOB_MESSAGE_DEBUG: Value "read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input1-read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles).out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:16.627Z: JOB_MESSAGE_BASIC: Executing operation read_words/_PassThroughThenCleanup/Create/Impulse+read_words/_PassThroughThenCleanup/Create/FlatMap(<lambda at core.py:3320>)+read_words/_PassThroughThenCleanup/Create/Map(decode)+read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:19.080Z: JOB_MESSAGE_BASIC: Finished operation read_words/_PassThroughThenCleanup/Create/Impulse+read_words/_PassThroughThenCleanup/Create/FlatMap(<lambda at core.py:3320>)+read_words/_PassThroughThenCleanup/Create/Map(decode)+read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:19.555Z: JOB_MESSAGE_BASIC: Finished operation create_ignore_word/Impulse+create_ignore_word/FlatMap(<lambda at core.py:3320>)+create_ignore_word/Map(decode)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:19.616Z: JOB_MESSAGE_DEBUG: Value "create_ignore_word/Map(decode).None" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:19.701Z: JOB_MESSAGE_BASIC: Executing operation attach word/View-python_side_input1-attach word
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:19.756Z: JOB_MESSAGE_BASIC: Finished operation attach word/View-python_side_input1-attach word
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:19.824Z: JOB_MESSAGE_DEBUG: Value "attach word/View-python_side_input1-attach word.out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:19.935Z: JOB_MESSAGE_BASIC: Executing operation create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read+create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+create groups/MaybeReshuffle/Reshuffle/RemoveRandomKeys+create groups/Map(decode)+attach corpus+attach word+WriteToText/Write/WriteImpl/WindowInto(WindowIntoFn)+WriteToText/Write/WriteImpl/WriteBundles+WriteToText/Write/WriteImpl/Pair+WriteToText/Write/WriteImpl/GroupByKey/Write
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:23.235Z: JOB_MESSAGE_BASIC: Finished operation create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read+create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+create groups/MaybeReshuffle/Reshuffle/RemoveRandomKeys+create groups/Map(decode)+attach corpus+attach word+WriteToText/Write/WriteImpl/WindowInto(WindowIntoFn)+WriteToText/Write/WriteImpl/WriteBundles+WriteToText/Write/WriteImpl/Pair+WriteToText/Write/WriteImpl/GroupByKey/Write
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:23.298Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/GroupByKey/Close
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:23.353Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/GroupByKey/Close
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:23.406Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/GroupByKey/Read+WriteToText/Write/WriteImpl/Extract
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:25.832Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/GroupByKey/Read+WriteToText/Write/WriteImpl/Extract
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:25.924Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/Extract.None" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:25.982Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input1-WriteToText/Write/WriteImpl/FinalizeWrite
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:26.013Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/PreFinalize/View-python_side_input1-WriteToText/Write/WriteImpl/PreFinalize
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:26.031Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input1-WriteToText/Write/WriteImpl/FinalizeWrite
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:26.085Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/PreFinalize/View-python_side_input1-WriteToText/Write/WriteImpl/PreFinalize
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:26.102Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input1-WriteToText/Write/WriteImpl/FinalizeWrite.out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:26.159Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/PreFinalize/View-python_side_input1-WriteToText/Write/WriteImpl/PreFinalize.out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:26.243Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/PreFinalize
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:29.688Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/PreFinalize
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:29.766Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/PreFinalize.None" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:29.829Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input2-WriteToText/Write/WriteImpl/FinalizeWrite
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:29.877Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input2-WriteToText/Write/WriteImpl/FinalizeWrite
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:29.940Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input2-WriteToText/Write/WriteImpl/FinalizeWrite.out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:29.995Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/FinalizeWrite
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:32.460Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/FinalizeWrite
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:32.536Z: JOB_MESSAGE_DEBUG: Executing success step success42
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:32.620Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:32.760Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:51:32.790Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:52:07.412Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:52:07.457Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-28T07:52:07.493Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:197 Job 2022-04-28_00_43_02-5127582446640806820 is in state JOB_STATE_DONE
INFO apache_beam.io.gcp.gcsio:gcsio.py:559 Starting the size estimation of the input
INFO apache_beam.io.gcp.gcsio:gcsio.py:572 Finished listing 1 files in 0.043782711029052734 seconds.
PASSED
apache_beam/examples/cookbook/coders_it_test.py::CodersIT::test_coders_output_files_on_small_input
-------------------------------- live log call ---------------------------------
INFO root:coders_it_test.py:48 Creating file: gs://temp-storage-for-end-to-end-tests/py-it-cloud/input/4a4a1509-f042-403e-9955-4b8d8f73794f/input.txt
FAILED

=================================== FAILURES ===================================
_______________ CodersIT.test_coders_output_files_on_small_input _______________

self = <apache_beam.examples.cookbook.coders_it_test.CodersIT testMethod=test_coders_output_files_on_small_input>

    @pytest.mark.no_xdist
    @pytest.mark.examples_postcommit
    def test_coders_output_files_on_small_input(self):
      test_pipeline = TestPipeline(is_integration_test=True)

      # Setup the files with expected content.
      OUTPUT_FILE_DIR = \
          'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output'
      output = '/'.join([OUTPUT_FILE_DIR, str(uuid.uuid4()), 'result'])
      INPUT_FILE_DIR = \
          'gs://temp-storage-for-end-to-end-tests/py-it-cloud/input'
      input = '/'.join([INPUT_FILE_DIR, str(uuid.uuid4()), 'input.txt'])
      create_content_input_file(
          input, '\n'.join(map(json.dumps, self.SAMPLE_RECORDS)))
      extra_opts = {'input': input, 'output': output}
>     coders.run(test_pipeline.get_full_options_as_args(**extra_opts))

apache_beam/examples/cookbook/coders_it_test.py:93:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
apache_beam/examples/cookbook/coders.py:87: in run
    p
apache_beam/transforms/ptransform.py:1092: in __ror__
    return self.transform.__ror__(pvalueish, self.label)
apache_beam/transforms/ptransform.py:614: in __ror__
    result = p.apply(self, pvalueish, label)
apache_beam/pipeline.py:662: in apply
    return self.apply(transform, pvalueish)
apache_beam/pipeline.py:708: in apply
    pvalueish_result = self.runner.apply(transform, pvalueish, self._options)
apache_beam/runners/dataflow/dataflow_runner.py:141: in apply
    return super().apply(transform, input, options)
apache_beam/runners/runner.py:185: in apply
    return m(transform, input, options)
apache_beam/runners/runner.py:215: in apply_PTransform
    return transform.expand(input)
apache_beam/io/textio.py:690: in expand
    self._source.output_type_hint())
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <apache_beam.io.textio._TextSource object at 0x7fadfb9cb610>

    def output_type_hint(self):
      try:
>       return self._coder.to_type_hint()
E       AttributeError: 'JsonCoder' object has no attribute 'to_type_hint'

apache_beam/io/textio.py:409: AttributeError
------------------------------ Captured log call -------------------------------
INFO root:coders_it_test.py:48 Creating file: gs://temp-storage-for-end-to-end-tests/py-it-cloud/input/4a4a1509-f042-403e-9955-4b8d8f73794f/input.txt
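Note on the failure above: per the traceback, the text read transform's expand() calls self._source.output_type_hint(), which in turn calls to_type_hint() on the coder that was passed in, and the JsonCoder defined in the coders.py cookbook example does not implement that method, hence the AttributeError. As a rough illustration only (this is not the actual Beam patch, and the typehints.Any return value is an assumption), a JSON line coder that also advertises a type hint could look like this:

    # Illustrative sketch: a JSON coder that implements to_type_hint(), so
    # _TextSource.output_type_hint() can query it without an AttributeError.
    import json

    from apache_beam import coders
    from apache_beam.typehints import typehints


    class JsonCoder(coders.Coder):
      """Encodes and decodes each text line as a JSON value."""

      def encode(self, value):
        # Serialize a JSON-serializable value (e.g. a dict) to UTF-8 bytes.
        return json.dumps(value).encode('utf-8')

      def decode(self, encoded):
        # Parse one encoded line back into a Python object.
        return json.loads(encoded)

      def to_type_hint(self):
        # Assumed hint for this sketch: decoded records are arbitrary JSON
        # values, so advertise Any.
        return typehints.Any

Passing an instance of such a coder (coder=JsonCoder()) to ReadFromText/WriteToText, as the cookbook example already does, would then satisfy the to_type_hint() call shown in the traceback.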
=============================== warnings summary ===============================
<unknown>:54
  <unknown>:54: DeprecationWarning: invalid escape sequence \c

<unknown>:62
  <unknown>:62: DeprecationWarning: invalid escape sequence \d

<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/tenacity/_asyncio.py>:42
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/tenacity/_asyncio.py>:42: DeprecationWarning: "@coroutine" decorator is deprecated since Python 3.8, use "async def" instead
    def call(self, fn, *args, **kwargs):

apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:63: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    dataset_ref = client.dataset(unique_dataset_name, project=project)

apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2154: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    is_streaming_pipeline = p.options.view_as(StandardOptions).streaming

apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2160: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    experiments = p.options.view_as(DebugOptions).experiments or []

apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1128: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location

apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1130: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')

apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2454: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = pcoll.pipeline.options.view_as(

apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2456: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2480: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    pipeline_options=pcoll.pipeline.options,

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/pytest_postCommitIT-df-py39-no-xdist.xml> -
= 1 failed, 5 passed, 1 skipped, 5371 deselected, 14 warnings in 2482.17 seconds =

> Task :sdks:python:test-suites:dataflow:py39:examples FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 203

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py39:examples'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 56m 58s

15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/wzuce3rqjsus4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
