See <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/307/display/redirect>
Changes:

------------------------------------------
[...truncated 284.24 KB...]
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:46:33.675Z: JOB_MESSAGE_DEBUG: Executing wait step start44
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:46:33.741Z: JOB_MESSAGE_BASIC: Executing operation read_words/FilesToRemoveImpulse/Impulse+read_words/FilesToRemoveImpulse/FlatMap(<lambda at core.py:3320>)+read_words/FilesToRemoveImpulse/Map(decode)+read_words/MapFilesToRemove
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:46:33.767Z: JOB_MESSAGE_BASIC: Executing operation read corpus/FilesToRemoveImpulse/Impulse+read corpus/FilesToRemoveImpulse/FlatMap(<lambda at core.py:3320>)+read corpus/FilesToRemoveImpulse/Map(decode)+read corpus/MapFilesToRemove
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:46:33.791Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:46:33.821Z: JOB_MESSAGE_BASIC: Executing operation create_ignore_corpus/Impulse+create_ignore_corpus/FlatMap(<lambda at core.py:3320>)+create_ignore_corpus/Map(decode)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:46:33.821Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b...
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:46:33.856Z: JOB_MESSAGE_BASIC: Executing operation read corpus/Read/Impulse+read corpus/Read/Map(<lambda at iobase.py:898>)+ref_AppliedPTransform_read-corpus-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_13/PairWithRestriction+ref_AppliedPTransform_read-corpus-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_13/SplitWithSizing
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:46:33.889Z: JOB_MESSAGE_BASIC: Executing operation create_ignore_word/Impulse+create_ignore_word/FlatMap(<lambda at core.py:3320>)+create_ignore_word/Map(decode)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:46:33.922Z: JOB_MESSAGE_BASIC: Executing operation read_words/Read/Impulse+read_words/Read/Map(<lambda at iobase.py:898>)+ref_AppliedPTransform_read_words-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_34/PairWithRestriction+ref_AppliedPTransform_read_words-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_34/SplitWithSizing
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:46:33.954Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/DoOnce/Impulse+WriteToText/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:3320>)+WriteToText/Write/WriteImpl/DoOnce/Map(decode)+WriteToText/Write/WriteImpl/InitializeWrite
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:46:33.975Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/GroupByKey/Create
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:46:34.007Z: JOB_MESSAGE_BASIC: Executing operation create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:46:34.027Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/GroupByKey/Create
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:46:34.062Z: JOB_MESSAGE_BASIC: Finished operation create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:46:34.085Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/GroupByKey/Session" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:46:34.117Z: JOB_MESSAGE_DEBUG: Value "create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Session" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:46:34.185Z: JOB_MESSAGE_BASIC: Executing operation create groups/Impulse+create groups/FlatMap(<lambda at core.py:3320>)+create groups/MaybeReshuffle/Reshuffle/AddRandomKeys+create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:46:56.526Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:47:08.996Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:47:22.325Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:54:20.169Z: JOB_MESSAGE_BASIC: Finished operation create_ignore_word/Impulse+create_ignore_word/FlatMap(<lambda at core.py:3320>)+create_ignore_word/Map(decode)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:54:20.249Z: JOB_MESSAGE_DEBUG: Value "create_ignore_word/Map(decode).None" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:54:20.305Z: JOB_MESSAGE_BASIC: Executing operation attach word/View-python_side_input1-attach word
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:54:20.370Z: JOB_MESSAGE_BASIC: Finished operation attach word/View-python_side_input1-attach word
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:54:20.433Z: JOB_MESSAGE_DEBUG: Value "attach word/View-python_side_input1-attach word.out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:54:33.351Z: JOB_MESSAGE_BASIC: Finished operation read_words/Read/Impulse+read_words/Read/Map(<lambda at iobase.py:898>)+ref_AppliedPTransform_read_words-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_34/PairWithRestriction+ref_AppliedPTransform_read_words-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_34/SplitWithSizing
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:54:33.435Z: JOB_MESSAGE_DEBUG: Value "ref_AppliedPTransform_read_words-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_34-split-with-sizing-out9" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:54:33.513Z: JOB_MESSAGE_BASIC: Executing operation ref_AppliedPTransform_read_words-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_34/ProcessElementAndRestrictionWithSizing+read_words/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:54:36.135Z: JOB_MESSAGE_BASIC: Finished operation read corpus/Read/Impulse+read corpus/Read/Map(<lambda at iobase.py:898>)+ref_AppliedPTransform_read-corpus-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_13/PairWithRestriction+ref_AppliedPTransform_read-corpus-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_13/SplitWithSizing
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:54:36.192Z: JOB_MESSAGE_DEBUG: Value "ref_AppliedPTransform_read-corpus-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_13-split-with-sizing-out3" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:54:36.249Z: JOB_MESSAGE_BASIC: Executing operation ref_AppliedPTransform_read-corpus-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_13/ProcessElementAndRestrictionWithSizing+read corpus/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:54:45.514Z: JOB_MESSAGE_BASIC: Finished operation create groups/Impulse+create groups/FlatMap(<lambda at core.py:3320>)+create groups/MaybeReshuffle/Reshuffle/AddRandomKeys+create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:54:45.562Z: JOB_MESSAGE_BASIC: Executing operation create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Close
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:54:45.604Z: JOB_MESSAGE_BASIC: Finished operation create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Close
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:54:49.460Z: JOB_MESSAGE_BASIC: Finished operation ref_AppliedPTransform_read-corpus-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_13/ProcessElementAndRestrictionWithSizing+read corpus/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:54:49.527Z: JOB_MESSAGE_DEBUG: Value "read corpus/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough).None" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:54:49.551Z: JOB_MESSAGE_DEBUG: Value "read corpus/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough).cleanup_signal" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:54:49.630Z: JOB_MESSAGE_BASIC: Executing operation attach corpus/View-python_side_input0-attach corpus
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:54:49.687Z: JOB_MESSAGE_BASIC: Executing operation read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input0-read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:54:49.707Z: JOB_MESSAGE_BASIC: Finished operation attach corpus/View-python_side_input0-attach corpus
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:54:49.737Z: JOB_MESSAGE_BASIC: Finished operation read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input0-read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:54:49.755Z: JOB_MESSAGE_DEBUG: Value "attach corpus/View-python_side_input0-attach corpus.out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:54:49.800Z: JOB_MESSAGE_DEBUG: Value "read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input0-read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles).out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:54:49.875Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/DoOnce/Impulse+WriteToText/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:3320>)+WriteToText/Write/WriteImpl/DoOnce/Map(decode)+WriteToText/Write/WriteImpl/InitializeWrite
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:54:49.936Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/DoOnce/Map(decode).None" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:54:49.985Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/InitializeWrite.None" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:54:50.171Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/WriteBundles/View-python_side_input0-WriteToText/Write/WriteImpl/WriteBundles
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:54:50.271Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input0-WriteToText/Write/WriteImpl/FinalizeWrite
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:54:50.305Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/PreFinalize/View-python_side_input0-WriteToText/Write/WriteImpl/PreFinalize
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:54:50.343Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input0-WriteToText/Write/WriteImpl/FinalizeWrite
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:54:50.363Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/PreFinalize/View-python_side_input0-WriteToText/Write/WriteImpl/PreFinalize
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:54:50.386Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/WriteBundles/View-python_side_input0-WriteToText/Write/WriteImpl/WriteBundles
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:54:50.406Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input0-WriteToText/Write/WriteImpl/FinalizeWrite.out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:54:50.448Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/PreFinalize/View-python_side_input0-WriteToText/Write/WriteImpl/PreFinalize.out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:54:50.473Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/WriteBundles/View-python_side_input0-WriteToText/Write/WriteImpl/WriteBundles.out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:54:52.566Z: JOB_MESSAGE_BASIC: Finished operation read corpus/FilesToRemoveImpulse/Impulse+read corpus/FilesToRemoveImpulse/FlatMap(<lambda at core.py:3320>)+read corpus/FilesToRemoveImpulse/Map(decode)+read corpus/MapFilesToRemove
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:54:52.623Z: JOB_MESSAGE_DEBUG: Value "read corpus/MapFilesToRemove.None" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:54:52.724Z: JOB_MESSAGE_BASIC: Executing operation read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input1-read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:54:52.772Z: JOB_MESSAGE_BASIC: Finished operation read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input1-read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:54:52.841Z: JOB_MESSAGE_DEBUG: Value "read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input1-read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles).out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:54:52.896Z: JOB_MESSAGE_BASIC: Executing operation read corpus/_PassThroughThenCleanup/Create/Impulse+read corpus/_PassThroughThenCleanup/Create/FlatMap(<lambda at core.py:3320>)+read corpus/_PassThroughThenCleanup/Create/Map(decode)+read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:54:53.034Z: JOB_MESSAGE_BASIC: Finished operation read_words/FilesToRemoveImpulse/Impulse+read_words/FilesToRemoveImpulse/FlatMap(<lambda at core.py:3320>)+read_words/FilesToRemoveImpulse/Map(decode)+read_words/MapFilesToRemove
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:54:53.100Z: JOB_MESSAGE_DEBUG: Value "read_words/MapFilesToRemove.None" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:54:53.173Z: JOB_MESSAGE_BASIC: Executing operation read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input1-read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:54:53.225Z: JOB_MESSAGE_BASIC: Finished operation read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input1-read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:54:53.291Z: JOB_MESSAGE_DEBUG: Value "read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input1-read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles).out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:54:55.586Z: JOB_MESSAGE_BASIC: Finished operation read corpus/_PassThroughThenCleanup/Create/Impulse+read corpus/_PassThroughThenCleanup/Create/FlatMap(<lambda at core.py:3320>)+read corpus/_PassThroughThenCleanup/Create/Map(decode)+read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:54:56.388Z: JOB_MESSAGE_BASIC: Finished operation create_ignore_corpus/Impulse+create_ignore_corpus/FlatMap(<lambda at core.py:3320>)+create_ignore_corpus/Map(decode)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:54:56.454Z: JOB_MESSAGE_DEBUG: Value "create_ignore_corpus/Map(decode).None" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:54:56.520Z: JOB_MESSAGE_BASIC: Executing operation attach corpus/View-python_side_input1-attach corpus
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:54:56.576Z: JOB_MESSAGE_BASIC: Finished operation attach corpus/View-python_side_input1-attach corpus
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:54:56.683Z: JOB_MESSAGE_DEBUG: Value "attach corpus/View-python_side_input1-attach corpus.out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:55:00.522Z: JOB_MESSAGE_BASIC: Finished operation ref_AppliedPTransform_read_words-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_34/ProcessElementAndRestrictionWithSizing+read_words/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:55:00.588Z: JOB_MESSAGE_DEBUG: Value "read_words/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough).cleanup_signal" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:55:00.631Z: JOB_MESSAGE_DEBUG: Value "read_words/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough).None" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:55:00.678Z: JOB_MESSAGE_BASIC: Executing operation read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input0-read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:55:00.731Z: JOB_MESSAGE_BASIC: Executing operation attach word/View-python_side_input0-attach word
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:55:00.732Z: JOB_MESSAGE_BASIC: Finished operation read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input0-read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:55:00.777Z: JOB_MESSAGE_BASIC: Finished operation attach word/View-python_side_input0-attach word
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:55:00.799Z: JOB_MESSAGE_DEBUG: Value "read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input0-read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles).out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:55:00.854Z: JOB_MESSAGE_DEBUG: Value "attach word/View-python_side_input0-attach word.out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:55:00.877Z: JOB_MESSAGE_BASIC: Executing operation read_words/_PassThroughThenCleanup/Create/Impulse+read_words/_PassThroughThenCleanup/Create/FlatMap(<lambda at core.py:3320>)+read_words/_PassThroughThenCleanup/Create/Map(decode)+read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:55:00.922Z: JOB_MESSAGE_BASIC: Executing operation create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read+create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+create groups/MaybeReshuffle/Reshuffle/RemoveRandomKeys+create groups/Map(decode)+attach corpus+attach word+WriteToText/Write/WriteImpl/WindowInto(WindowIntoFn)+WriteToText/Write/WriteImpl/WriteBundles+WriteToText/Write/WriteImpl/Pair+WriteToText/Write/WriteImpl/GroupByKey/Write
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:55:02.725Z: JOB_MESSAGE_BASIC: Finished operation read_words/_PassThroughThenCleanup/Create/Impulse+read_words/_PassThroughThenCleanup/Create/FlatMap(<lambda at core.py:3320>)+read_words/_PassThroughThenCleanup/Create/Map(decode)+read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:55:04.176Z: JOB_MESSAGE_BASIC: Finished operation create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read+create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+create groups/MaybeReshuffle/Reshuffle/RemoveRandomKeys+create groups/Map(decode)+attach corpus+attach word+WriteToText/Write/WriteImpl/WindowInto(WindowIntoFn)+WriteToText/Write/WriteImpl/WriteBundles+WriteToText/Write/WriteImpl/Pair+WriteToText/Write/WriteImpl/GroupByKey/Write
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:55:04.241Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/GroupByKey/Close
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:55:04.291Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/GroupByKey/Close
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:55:04.358Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/GroupByKey/Read+WriteToText/Write/WriteImpl/Extract
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:55:06.632Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/GroupByKey/Read+WriteToText/Write/WriteImpl/Extract
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:55:06.730Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/Extract.None" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:55:06.791Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input1-WriteToText/Write/WriteImpl/FinalizeWrite
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:55:06.817Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/PreFinalize/View-python_side_input1-WriteToText/Write/WriteImpl/PreFinalize
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:55:06.834Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input1-WriteToText/Write/WriteImpl/FinalizeWrite
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:55:06.909Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/PreFinalize/View-python_side_input1-WriteToText/Write/WriteImpl/PreFinalize
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:55:06.920Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input1-WriteToText/Write/WriteImpl/FinalizeWrite.out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:55:06.969Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/PreFinalize/View-python_side_input1-WriteToText/Write/WriteImpl/PreFinalize.out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:55:07.025Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/PreFinalize
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:55:08.833Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/PreFinalize
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:55:08.899Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/PreFinalize.None" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:55:08.957Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input2-WriteToText/Write/WriteImpl/FinalizeWrite
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:55:09.001Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input2-WriteToText/Write/WriteImpl/FinalizeWrite
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:55:09.058Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input2-WriteToText/Write/WriteImpl/FinalizeWrite.out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:55:09.118Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/FinalizeWrite
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:55:11.420Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/FinalizeWrite
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:55:11.481Z: JOB_MESSAGE_DEBUG: Executing success step success42
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:55:11.571Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:55:11.683Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:55:11.717Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:55:46.547Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:55:46.597Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-24T13:55:46.632Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:197 Job 2022-04-24_06_46_25-11105999642520754664 is in state JOB_STATE_DONE
INFO apache_beam.io.gcp.gcsio:gcsio.py:559 Starting the size estimation of the input
INFO apache_beam.io.gcp.gcsio:gcsio.py:572 Finished listing 1 files in 0.047745704650878906 seconds.
PASSED
apache_beam/examples/cookbook/coders_it_test.py::CodersIT::test_coders_output_files_on_small_input
-------------------------------- live log call ---------------------------------
INFO root:coders_it_test.py:48 Creating file: gs://temp-storage-for-end-to-end-tests/py-it-cloud/input/cb94646d-ae64-4399-b144-271586355029/input.txt
FAILED

=================================== FAILURES ===================================
_______________ CodersIT.test_coders_output_files_on_small_input _______________

self = <apache_beam.examples.cookbook.coders_it_test.CodersIT testMethod=test_coders_output_files_on_small_input>

    @pytest.mark.no_xdist
    @pytest.mark.examples_postcommit
    def test_coders_output_files_on_small_input(self):
      test_pipeline = TestPipeline(is_integration_test=True)
      # Setup the files with expected content.
      OUTPUT_FILE_DIR = \
          'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output'
      output = '/'.join([OUTPUT_FILE_DIR, str(uuid.uuid4()), 'result'])
      INPUT_FILE_DIR = \
          'gs://temp-storage-for-end-to-end-tests/py-it-cloud/input'
      input = '/'.join([INPUT_FILE_DIR, str(uuid.uuid4()), 'input.txt'])
      create_content_input_file(
          input, '\n'.join(map(json.dumps, self.SAMPLE_RECORDS)))
      extra_opts = {'input': input, 'output': output}
>     coders.run(test_pipeline.get_full_options_as_args(**extra_opts))

apache_beam/examples/cookbook/coders_it_test.py:93:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
apache_beam/examples/cookbook/coders.py:87: in run
    p
apache_beam/transforms/ptransform.py:1092: in __ror__
    return self.transform.__ror__(pvalueish, self.label)
apache_beam/transforms/ptransform.py:614: in __ror__
    result = p.apply(self, pvalueish, label)
apache_beam/pipeline.py:662: in apply
    return self.apply(transform, pvalueish)
apache_beam/pipeline.py:708: in apply
    pvalueish_result = self.runner.apply(transform, pvalueish, self._options)
apache_beam/runners/dataflow/dataflow_runner.py:141: in apply
    return super().apply(transform, input, options)
apache_beam/runners/runner.py:185: in apply
    return m(transform, input, options)
apache_beam/runners/runner.py:215: in apply_PTransform
    return transform.expand(input)
apache_beam/io/textio.py:690: in expand
    self._source.output_type_hint())
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <apache_beam.io.textio._TextSource object at 0x7f024e011160>

    def output_type_hint(self):
      try:
>       return self._coder.to_type_hint()
E       AttributeError: 'JsonCoder' object has no attribute 'to_type_hint'

apache_beam/io/textio.py:409: AttributeError
------------------------------ Captured log call -------------------------------
INFO root:coders_it_test.py:48 Creating file: gs://temp-storage-for-end-to-end-tests/py-it-cloud/input/cb94646d-ae64-4399-b144-271586355029/input.txt
=============================== warnings summary ===============================
<unknown>:54
  <unknown>:54: DeprecationWarning: invalid escape sequence \c
<unknown>:62
  <unknown>:62: DeprecationWarning: invalid escape sequence \d
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/tenacity/_asyncio.py>:42
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/tenacity/_asyncio.py>:42: DeprecationWarning: "@coroutine" decorator is deprecated since Python 3.8, use "async def" instead
    def call(self, fn, *args, **kwargs):
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:63: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    dataset_ref = client.dataset(unique_dataset_name, project=project)
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2154: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    is_streaming_pipeline = p.options.view_as(StandardOptions).streaming
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2160: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    experiments = p.options.view_as(DebugOptions).experiments or []
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1128: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1130: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2454: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = pcoll.pipeline.options.view_as(
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2456: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2480: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    pipeline_options=pcoll.pipeline.options,
-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/pytest_postCommitIT-df-py39-no-xdist.xml> -
= 1 failed, 5 passed, 1 skipped, 5371 deselected, 14 warnings in 2699.81 seconds =

> Task :sdks:python:test-suites:dataflow:py39:examples FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 183

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py39:examples'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1h 37s

15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/rpghhvqocqa52

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
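
A note on the CodersIT failure above: apache_beam/io/textio.py:409 shows _TextSource.output_type_hint() asking the configured coder for to_type_hint(), and the cookbook example's JsonCoder does not define that method. The sketch below is illustrative only and is not the project's actual fix; the choice of base class and of typing.Any as the advertised element type are assumptions made for this example.

import json
from typing import Any

from apache_beam.coders.coders import Coder


class JsonCoder(Coder):
  """Illustrative sketch of a line-oriented JSON coder that also
  implements to_type_hint(), which output_type_hint() calls."""

  def encode(self, value):
    # Serialize each record as one UTF-8 encoded JSON line.
    return json.dumps(value).encode('utf-8')

  def decode(self, encoded):
    # Parse one JSON line back into a Python object.
    return json.loads(encoded)

  def to_type_hint(self):
    # Decoded elements are arbitrary JSON values, so advertise Any
    # (an assumption; a stricter hint could be used instead).
    return Any

An equally plausible resolution would be for textio to tolerate coders that lack to_type_hint(); this log alone does not indicate which change was intended.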
