See <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/311/display/redirect>
Changes:

------------------------------------------
[...truncated 284.23 KB...]
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:44:53.944Z: JOB_MESSAGE_DEBUG: Executing wait step start44
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:44:54.006Z: JOB_MESSAGE_BASIC: Executing operation read_words/FilesToRemoveImpulse/Impulse+read_words/FilesToRemoveImpulse/FlatMap(<lambda at core.py:3320>)+read_words/FilesToRemoveImpulse/Map(decode)+read_words/MapFilesToRemove
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:44:54.040Z: JOB_MESSAGE_BASIC: Executing operation read corpus/FilesToRemoveImpulse/Impulse+read corpus/FilesToRemoveImpulse/FlatMap(<lambda at core.py:3320>)+read corpus/FilesToRemoveImpulse/Map(decode)+read corpus/MapFilesToRemove
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:44:54.054Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:44:54.071Z: JOB_MESSAGE_BASIC: Executing operation create_ignore_corpus/Impulse+create_ignore_corpus/FlatMap(<lambda at core.py:3320>)+create_ignore_corpus/Map(decode)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:44:54.083Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b...
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:44:54.096Z: JOB_MESSAGE_BASIC: Executing operation read corpus/Read/Impulse+read corpus/Read/Map(<lambda at iobase.py:898>)+ref_AppliedPTransform_read-corpus-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_13/PairWithRestriction+ref_AppliedPTransform_read-corpus-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_13/SplitWithSizing
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:44:54.123Z: JOB_MESSAGE_BASIC: Executing operation create_ignore_word/Impulse+create_ignore_word/FlatMap(<lambda at core.py:3320>)+create_ignore_word/Map(decode)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:44:54.157Z: JOB_MESSAGE_BASIC: Executing operation read_words/Read/Impulse+read_words/Read/Map(<lambda at iobase.py:898>)+ref_AppliedPTransform_read_words-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_34/PairWithRestriction+ref_AppliedPTransform_read_words-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_34/SplitWithSizing
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:44:54.191Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/DoOnce/Impulse+WriteToText/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:3320>)+WriteToText/Write/WriteImpl/DoOnce/Map(decode)+WriteToText/Write/WriteImpl/InitializeWrite
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:44:54.223Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/GroupByKey/Create
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:44:54.260Z: JOB_MESSAGE_BASIC: Executing operation create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:44:54.271Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/GroupByKey/Create
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:44:54.318Z: JOB_MESSAGE_BASIC: Finished operation create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:44:54.335Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/GroupByKey/Session" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:44:54.391Z: JOB_MESSAGE_DEBUG: Value "create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Session" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:44:54.470Z: JOB_MESSAGE_BASIC: Executing operation create groups/Impulse+create groups/FlatMap(<lambda at core.py:3320>)+create groups/MaybeReshuffle/Reshuffle/AddRandomKeys+create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:45:16.274Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:45:23.187Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:45:40.982Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:52:47.879Z: JOB_MESSAGE_BASIC: Finished operation create groups/Impulse+create groups/FlatMap(<lambda at core.py:3320>)+create groups/MaybeReshuffle/Reshuffle/AddRandomKeys+create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:52:47.961Z: JOB_MESSAGE_BASIC: Executing operation create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Close
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:52:48.005Z: JOB_MESSAGE_BASIC: Finished operation create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Close
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:52:51.947Z: JOB_MESSAGE_BASIC: Finished operation create_ignore_corpus/Impulse+create_ignore_corpus/FlatMap(<lambda at core.py:3320>)+create_ignore_corpus/Map(decode)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:52:52.031Z: JOB_MESSAGE_DEBUG: Value "create_ignore_corpus/Map(decode).None" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:52:52.079Z: JOB_MESSAGE_BASIC: Executing operation attach corpus/View-python_side_input1-attach corpus
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:52:52.126Z: JOB_MESSAGE_BASIC: Finished operation attach corpus/View-python_side_input1-attach corpus
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:52:52.187Z: JOB_MESSAGE_DEBUG: Value "attach corpus/View-python_side_input1-attach corpus.out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:52:55.262Z: JOB_MESSAGE_BASIC: Finished operation create_ignore_word/Impulse+create_ignore_word/FlatMap(<lambda at core.py:3320>)+create_ignore_word/Map(decode)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:52:55.324Z: JOB_MESSAGE_DEBUG: Value "create_ignore_word/Map(decode).None" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:52:55.391Z: JOB_MESSAGE_BASIC: Executing operation attach word/View-python_side_input1-attach word
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:52:55.455Z: JOB_MESSAGE_BASIC: Finished operation attach word/View-python_side_input1-attach word
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:52:55.538Z: JOB_MESSAGE_DEBUG: Value "attach word/View-python_side_input1-attach word.out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:52:58.719Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/DoOnce/Impulse+WriteToText/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:3320>)+WriteToText/Write/WriteImpl/DoOnce/Map(decode)+WriteToText/Write/WriteImpl/InitializeWrite
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:52:58.822Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/DoOnce/Map(decode).None" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:52:58.860Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/InitializeWrite.None" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:52:58.928Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/WriteBundles/View-python_side_input0-WriteToText/Write/WriteImpl/WriteBundles
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:52:58.950Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input0-WriteToText/Write/WriteImpl/FinalizeWrite
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:52:58.972Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/PreFinalize/View-python_side_input0-WriteToText/Write/WriteImpl/PreFinalize
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:52:58.980Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/WriteBundles/View-python_side_input0-WriteToText/Write/WriteImpl/WriteBundles
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:52:58.997Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input0-WriteToText/Write/WriteImpl/FinalizeWrite
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:52:59.033Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/PreFinalize/View-python_side_input0-WriteToText/Write/WriteImpl/PreFinalize
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:52:59.035Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/WriteBundles/View-python_side_input0-WriteToText/Write/WriteImpl/WriteBundles.out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:52:59.072Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input0-WriteToText/Write/WriteImpl/FinalizeWrite.out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:52:59.106Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/PreFinalize/View-python_side_input0-WriteToText/Write/WriteImpl/PreFinalize.out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:01.983Z: JOB_MESSAGE_BASIC: Finished operation read corpus/Read/Impulse+read corpus/Read/Map(<lambda at iobase.py:898>)+ref_AppliedPTransform_read-corpus-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_13/PairWithRestriction+ref_AppliedPTransform_read-corpus-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_13/SplitWithSizing
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:02.051Z: JOB_MESSAGE_DEBUG: Value "ref_AppliedPTransform_read-corpus-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_13-split-with-sizing-out3" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:02.056Z: JOB_MESSAGE_BASIC: Finished operation read corpus/FilesToRemoveImpulse/Impulse+read corpus/FilesToRemoveImpulse/FlatMap(<lambda at core.py:3320>)+read corpus/FilesToRemoveImpulse/Map(decode)+read corpus/MapFilesToRemove
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:02.120Z: JOB_MESSAGE_BASIC: Executing operation ref_AppliedPTransform_read-corpus-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_13/ProcessElementAndRestrictionWithSizing+read corpus/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:02.155Z: JOB_MESSAGE_DEBUG: Value "read corpus/MapFilesToRemove.None" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:02.218Z: JOB_MESSAGE_BASIC: Executing operation read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input1-read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:02.265Z: JOB_MESSAGE_BASIC: Finished operation read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input1-read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:02.320Z: JOB_MESSAGE_DEBUG: Value "read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input1-read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles).out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:16.422Z: JOB_MESSAGE_BASIC: Finished operation ref_AppliedPTransform_read-corpus-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_13/ProcessElementAndRestrictionWithSizing+read corpus/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:16.494Z: JOB_MESSAGE_DEBUG: Value "read corpus/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough).cleanup_signal" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:16.549Z: JOB_MESSAGE_DEBUG: Value "read corpus/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough).None" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:16.577Z: JOB_MESSAGE_BASIC: Executing operation read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input0-read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:16.614Z: JOB_MESSAGE_BASIC: Executing operation attach corpus/View-python_side_input0-attach corpus
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:16.628Z: JOB_MESSAGE_BASIC: Finished operation read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input0-read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:16.667Z: JOB_MESSAGE_BASIC: Finished operation attach corpus/View-python_side_input0-attach corpus
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:16.694Z: JOB_MESSAGE_DEBUG: Value "read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input0-read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles).out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:16.728Z: JOB_MESSAGE_DEBUG: Value "attach corpus/View-python_side_input0-attach corpus.out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:16.762Z: JOB_MESSAGE_BASIC: Executing operation read corpus/_PassThroughThenCleanup/Create/Impulse+read corpus/_PassThroughThenCleanup/Create/FlatMap(<lambda at core.py:3320>)+read corpus/_PassThroughThenCleanup/Create/Map(decode)+read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:18.297Z: JOB_MESSAGE_BASIC: Finished operation read_words/Read/Impulse+read_words/Read/Map(<lambda at iobase.py:898>)+ref_AppliedPTransform_read_words-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_34/PairWithRestriction+ref_AppliedPTransform_read_words-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_34/SplitWithSizing
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:18.365Z: JOB_MESSAGE_DEBUG: Value "ref_AppliedPTransform_read_words-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_34-split-with-sizing-out9" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:18.427Z: JOB_MESSAGE_BASIC: Executing operation ref_AppliedPTransform_read_words-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_34/ProcessElementAndRestrictionWithSizing+read_words/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:18.728Z: JOB_MESSAGE_BASIC: Finished operation read_words/FilesToRemoveImpulse/Impulse+read_words/FilesToRemoveImpulse/FlatMap(<lambda at core.py:3320>)+read_words/FilesToRemoveImpulse/Map(decode)+read_words/MapFilesToRemove
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:18.824Z: JOB_MESSAGE_DEBUG: Value "read_words/MapFilesToRemove.None" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:18.885Z: JOB_MESSAGE_BASIC: Executing operation read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input1-read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:18.942Z: JOB_MESSAGE_BASIC: Finished operation read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input1-read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:19.030Z: JOB_MESSAGE_DEBUG: Value "read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input1-read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles).out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:21.832Z: JOB_MESSAGE_BASIC: Finished operation read corpus/_PassThroughThenCleanup/Create/Impulse+read corpus/_PassThroughThenCleanup/Create/FlatMap(<lambda at core.py:3320>)+read corpus/_PassThroughThenCleanup/Create/Map(decode)+read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:28.828Z: JOB_MESSAGE_BASIC: Finished operation ref_AppliedPTransform_read_words-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_34/ProcessElementAndRestrictionWithSizing+read_words/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:28.914Z: JOB_MESSAGE_DEBUG: Value "read_words/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough).cleanup_signal" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:28.951Z: JOB_MESSAGE_DEBUG: Value "read_words/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough).None" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:28.982Z: JOB_MESSAGE_BASIC: Executing operation read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input0-read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:29.037Z: JOB_MESSAGE_BASIC: Finished operation read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input0-read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:29.038Z: JOB_MESSAGE_BASIC: Executing operation attach word/View-python_side_input0-attach word
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:29.094Z: JOB_MESSAGE_BASIC: Finished operation attach word/View-python_side_input0-attach word
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:29.106Z: JOB_MESSAGE_DEBUG: Value "read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input0-read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles).out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:29.151Z: JOB_MESSAGE_DEBUG: Value "attach word/View-python_side_input0-attach word.out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:29.183Z: JOB_MESSAGE_BASIC: Executing operation read_words/_PassThroughThenCleanup/Create/Impulse+read_words/_PassThroughThenCleanup/Create/FlatMap(<lambda at core.py:3320>)+read_words/_PassThroughThenCleanup/Create/Map(decode)+read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:29.219Z: JOB_MESSAGE_BASIC: Executing operation create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read+create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+create groups/MaybeReshuffle/Reshuffle/RemoveRandomKeys+create groups/Map(decode)+attach corpus+attach word+WriteToText/Write/WriteImpl/WindowInto(WindowIntoFn)+WriteToText/Write/WriteImpl/WriteBundles+WriteToText/Write/WriteImpl/Pair+WriteToText/Write/WriteImpl/GroupByKey/Write
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:30.945Z: JOB_MESSAGE_BASIC: Finished operation read_words/_PassThroughThenCleanup/Create/Impulse+read_words/_PassThroughThenCleanup/Create/FlatMap(<lambda at core.py:3320>)+read_words/_PassThroughThenCleanup/Create/Map(decode)+read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:31.324Z: JOB_MESSAGE_BASIC: Finished operation create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read+create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+create groups/MaybeReshuffle/Reshuffle/RemoveRandomKeys+create groups/Map(decode)+attach corpus+attach word+WriteToText/Write/WriteImpl/WindowInto(WindowIntoFn)+WriteToText/Write/WriteImpl/WriteBundles+WriteToText/Write/WriteImpl/Pair+WriteToText/Write/WriteImpl/GroupByKey/Write
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:31.383Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/GroupByKey/Close
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:31.444Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/GroupByKey/Close
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:31.547Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/GroupByKey/Read+WriteToText/Write/WriteImpl/Extract
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:34.807Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/GroupByKey/Read+WriteToText/Write/WriteImpl/Extract
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:34.868Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/Extract.None" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:34.940Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input1-WriteToText/Write/WriteImpl/FinalizeWrite
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:34.978Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/PreFinalize/View-python_side_input1-WriteToText/Write/WriteImpl/PreFinalize
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:35.001Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input1-WriteToText/Write/WriteImpl/FinalizeWrite
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:35.038Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/PreFinalize/View-python_side_input1-WriteToText/Write/WriteImpl/PreFinalize
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:35.062Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input1-WriteToText/Write/WriteImpl/FinalizeWrite.out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:35.113Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/PreFinalize/View-python_side_input1-WriteToText/Write/WriteImpl/PreFinalize.out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:35.159Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/PreFinalize
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:38.131Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/PreFinalize
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:38.196Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/PreFinalize.None" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:38.266Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input2-WriteToText/Write/WriteImpl/FinalizeWrite
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:38.320Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input2-WriteToText/Write/WriteImpl/FinalizeWrite
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:38.387Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input2-WriteToText/Write/WriteImpl/FinalizeWrite.out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:38.462Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/FinalizeWrite
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:40.790Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/FinalizeWrite
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:40.860Z: JOB_MESSAGE_DEBUG: Executing success step success42
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:40.960Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:41.105Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:53:41.141Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:54:14.904Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:54:14.954Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-25T13:54:14.985Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:197 Job 2022-04-25_06_44_47-10015677148878033011 is in state JOB_STATE_DONE
INFO apache_beam.io.gcp.gcsio:gcsio.py:559 Starting the size estimation of the input
INFO apache_beam.io.gcp.gcsio:gcsio.py:572 Finished listing 1 files in 0.05151844024658203 seconds.
PASSED
apache_beam/examples/cookbook/coders_it_test.py::CodersIT::test_coders_output_files_on_small_input
-------------------------------- live log call ---------------------------------
INFO root:coders_it_test.py:48 Creating file: gs://temp-storage-for-end-to-end-tests/py-it-cloud/input/ee76c962-7c09-4b4a-bbea-59f957276f46/input.txt
FAILED
=================================== FAILURES ===================================
_______________ CodersIT.test_coders_output_files_on_small_input _______________

self = <apache_beam.examples.cookbook.coders_it_test.CodersIT testMethod=test_coders_output_files_on_small_input>

    @pytest.mark.no_xdist
    @pytest.mark.examples_postcommit
    def test_coders_output_files_on_small_input(self):
      test_pipeline = TestPipeline(is_integration_test=True)
      # Setup the files with expected content.
      OUTPUT_FILE_DIR = \
          'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output'
      output = '/'.join([OUTPUT_FILE_DIR, str(uuid.uuid4()), 'result'])
      INPUT_FILE_DIR = \
          'gs://temp-storage-for-end-to-end-tests/py-it-cloud/input'
      input = '/'.join([INPUT_FILE_DIR, str(uuid.uuid4()), 'input.txt'])
      create_content_input_file(
          input, '\n'.join(map(json.dumps, self.SAMPLE_RECORDS)))
      extra_opts = {'input': input, 'output': output}
>     coders.run(test_pipeline.get_full_options_as_args(**extra_opts))

apache_beam/examples/cookbook/coders_it_test.py:93:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
apache_beam/examples/cookbook/coders.py:87: in run
    p
apache_beam/transforms/ptransform.py:1092: in __ror__
    return self.transform.__ror__(pvalueish, self.label)
apache_beam/transforms/ptransform.py:614: in __ror__
    result = p.apply(self, pvalueish, label)
apache_beam/pipeline.py:662: in apply
    return self.apply(transform, pvalueish)
apache_beam/pipeline.py:708: in apply
    pvalueish_result = self.runner.apply(transform, pvalueish, self._options)
apache_beam/runners/dataflow/dataflow_runner.py:141: in apply
    return super().apply(transform, input, options)
apache_beam/runners/runner.py:185: in apply
    return m(transform, input, options)
apache_beam/runners/runner.py:215: in apply_PTransform
    return transform.expand(input)
apache_beam/io/textio.py:690: in expand
    self._source.output_type_hint())
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <apache_beam.io.textio._TextSource object at 0x7f8674b47ac0>

    def output_type_hint(self):
      try:
>       return self._coder.to_type_hint()
E       AttributeError: 'JsonCoder' object has no attribute 'to_type_hint'

apache_beam/io/textio.py:409: AttributeError
------------------------------ Captured log call -------------------------------
INFO root:coders_it_test.py:48 Creating file: gs://temp-storage-for-end-to-end-tests/py-it-cloud/input/ee76c962-7c09-4b4a-bbea-59f957276f46/input.txt
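The trace shows ReadFromText's expand() asking the coder for an output type hint via self._coder.to_type_hint(), while the JsonCoder passed in by the coders.py cookbook example does not define that method. Below is a minimal sketch of a coder that would satisfy the call; it assumes the example's JsonCoder is a plain class wrapping json.dumps/json.loads and that its records decode to dicts (both assumptions, not confirmed by this log), and it is not the actual example source or the committed fix.

    # Sketch only, under the assumptions stated above.
    import json

    class JsonCoder(object):
      """Interprets each text line as a JSON object (assumed shape of the example coder)."""

      def encode(self, x):
        # TextIO exchanges bytes with the coder, so encode the JSON string.
        return json.dumps(x).encode('utf-8')

      def decode(self, x):
        return json.loads(x)

      def to_type_hint(self):
        # What _TextSource.output_type_hint() calls; the sample records decode to dicts.
        return dict

With a method like this, output_type_hint() would return dict instead of raising AttributeError; whether the example coder or textio's handling of coders without to_type_hint should change is not settled by this log.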
=============================== warnings summary ===============================
<unknown>:54
  <unknown>:54: DeprecationWarning: invalid escape sequence \c

<unknown>:62
  <unknown>:62: DeprecationWarning: invalid escape sequence \d

<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/tenacity/_asyncio.py>:42
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/tenacity/_asyncio.py>:42: DeprecationWarning: "@coroutine" decorator is deprecated since Python 3.8, use "async def" instead
    def call(self, fn, *args, **kwargs):

apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:63: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    dataset_ref = client.dataset(unique_dataset_name, project=project)

apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2154: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    is_streaming_pipeline = p.options.view_as(StandardOptions).streaming

apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2160: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    experiments = p.options.view_as(DebugOptions).experiments or []

apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1128: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location

apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1130: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')

apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2454: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = pcoll.pipeline.options.view_as(

apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2456: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2480: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    pipeline_options=pcoll.pipeline.options,

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/pytest_postCommitIT-df-py39-no-xdist.xml> -
= 1 failed, 5 passed, 1 skipped, 5371 deselected, 14 warnings in 2586.16 seconds =

> Task :sdks:python:test-suites:dataflow:py39:examples FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 183

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py39:examples'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 59m 5s

15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/lvoo7nihe3koe

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
