See <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/317/display/redirect?page=changes>
Changes:
[msbukal] FhirIO: use .search() or .searchType instead of .setResourceType()
[noreply] fixes copy by value error for bytes.Buffer in Error (#17469)
[noreply] Merge pull request #17354 from [BEAM-14170] - Create a test that runs
[noreply] Merge pull request #17447 from [BEAM-14357] Fix
[noreply] [BEAM-14324, BEAM-14325] Staticcheck cleanup in test files (#17393)
------------------------------------------
[...truncated 284.25 KB...]
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:11:25.705Z: JOB_MESSAGE_DEBUG: Executing wait step start44
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:11:25.781Z: JOB_MESSAGE_BASIC: Executing operation read_words/FilesToRemoveImpulse/Impulse+read_words/FilesToRemoveImpulse/FlatMap(<lambda at core.py:3320>)+read_words/FilesToRemoveImpulse/Map(decode)+read_words/MapFilesToRemove
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:11:25.818Z: JOB_MESSAGE_BASIC: Executing operation read corpus/FilesToRemoveImpulse/Impulse+read corpus/FilesToRemoveImpulse/FlatMap(<lambda at core.py:3320>)+read corpus/FilesToRemoveImpulse/Map(decode)+read corpus/MapFilesToRemove
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:11:25.832Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:11:25.867Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b...
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:11:25.953Z: JOB_MESSAGE_BASIC: Executing operation create_ignore_corpus/Impulse+create_ignore_corpus/FlatMap(<lambda at core.py:3320>)+create_ignore_corpus/Map(decode)
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:11:25.980Z: JOB_MESSAGE_BASIC: Executing operation read corpus/Read/Impulse+read corpus/Read/Map(<lambda at iobase.py:898>)+ref_AppliedPTransform_read-corpus-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_13/PairWithRestriction+ref_AppliedPTransform_read-corpus-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_13/SplitWithSizing
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:11:26.013Z: JOB_MESSAGE_BASIC: Executing operation create_ignore_word/Impulse+create_ignore_word/FlatMap(<lambda at core.py:3320>)+create_ignore_word/Map(decode)
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:11:26.047Z: JOB_MESSAGE_BASIC: Executing operation read_words/Read/Impulse+read_words/Read/Map(<lambda at iobase.py:898>)+ref_AppliedPTransform_read_words-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_34/PairWithRestriction+ref_AppliedPTransform_read_words-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_34/SplitWithSizing
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:11:26.076Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/DoOnce/Impulse+WriteToText/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:3320>)+WriteToText/Write/WriteImpl/DoOnce/Map(decode)+WriteToText/Write/WriteImpl/InitializeWrite
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:11:26.103Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/GroupByKey/Create
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:11:26.131Z: JOB_MESSAGE_BASIC: Executing operation create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:11:26.181Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/GroupByKey/Create
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:11:26.200Z: JOB_MESSAGE_BASIC: Finished operation create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:11:26.246Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/GroupByKey/Session" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:11:26.296Z: JOB_MESSAGE_DEBUG: Value "create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Session" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:11:26.363Z: JOB_MESSAGE_BASIC: Executing operation create groups/Impulse+create groups/FlatMap(<lambda at core.py:3320>)+create groups/MaybeReshuffle/Reshuffle/AddRandomKeys+create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:11:30.298Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:11:47.728Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:12:11.857Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:07.713Z: JOB_MESSAGE_BASIC: Finished operation create_ignore_corpus/Impulse+create_ignore_corpus/FlatMap(<lambda at core.py:3320>)+create_ignore_corpus/Map(decode)
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:07.776Z: JOB_MESSAGE_BASIC: Finished operation read corpus/FilesToRemoveImpulse/Impulse+read corpus/FilesToRemoveImpulse/FlatMap(<lambda at core.py:3320>)+read corpus/FilesToRemoveImpulse/Map(decode)+read corpus/MapFilesToRemove
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:07.780Z: JOB_MESSAGE_DEBUG: Value "create_ignore_corpus/Map(decode).None" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:07.854Z: JOB_MESSAGE_BASIC: Executing operation attach corpus/View-python_side_input1-attach corpus
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:07.884Z: JOB_MESSAGE_DEBUG: Value "read corpus/MapFilesToRemove.None" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:07.927Z: JOB_MESSAGE_BASIC: Finished operation attach corpus/View-python_side_input1-attach corpus
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:07.982Z: JOB_MESSAGE_BASIC: Executing operation read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input1-read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:08.009Z: JOB_MESSAGE_DEBUG: Value "attach corpus/View-python_side_input1-attach corpus.out" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:08.041Z: JOB_MESSAGE_BASIC: Finished operation read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input1-read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:08.096Z: JOB_MESSAGE_DEBUG: Value "read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input1-read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles).out" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:11.192Z: JOB_MESSAGE_BASIC: Finished operation create_ignore_word/Impulse+create_ignore_word/FlatMap(<lambda at core.py:3320>)+create_ignore_word/Map(decode)
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:11.297Z: JOB_MESSAGE_DEBUG: Value "create_ignore_word/Map(decode).None" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:11.305Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/DoOnce/Impulse+WriteToText/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:3320>)+WriteToText/Write/WriteImpl/DoOnce/Map(decode)+WriteToText/Write/WriteImpl/InitializeWrite
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:11.359Z: JOB_MESSAGE_BASIC: Executing operation attach word/View-python_side_input1-attach word
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:11.382Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/DoOnce/Map(decode).None" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:11.413Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/InitializeWrite.None" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:11.415Z: JOB_MESSAGE_BASIC: Finished operation attach word/View-python_side_input1-attach word
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:11.480Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/WriteBundles/View-python_side_input0-WriteToText/Write/WriteImpl/WriteBundles
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:11.515Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input0-WriteToText/Write/WriteImpl/FinalizeWrite
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:11.538Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/WriteBundles/View-python_side_input0-WriteToText/Write/WriteImpl/WriteBundles
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:11.552Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/PreFinalize/View-python_side_input0-WriteToText/Write/WriteImpl/PreFinalize
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:11.568Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input0-WriteToText/Write/WriteImpl/FinalizeWrite
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:11.586Z: JOB_MESSAGE_DEBUG: Value "attach word/View-python_side_input1-attach word.out" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:11.607Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/PreFinalize/View-python_side_input0-WriteToText/Write/WriteImpl/PreFinalize
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:11.611Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/WriteBundles/View-python_side_input0-WriteToText/Write/WriteImpl/WriteBundles.out" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:11.633Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input0-WriteToText/Write/WriteImpl/FinalizeWrite.out" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:11.719Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/PreFinalize/View-python_side_input0-WriteToText/Write/WriteImpl/PreFinalize.out" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:13.558Z: JOB_MESSAGE_BASIC: Finished operation create groups/Impulse+create groups/FlatMap(<lambda at core.py:3320>)+create groups/MaybeReshuffle/Reshuffle/AddRandomKeys+create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:13.627Z: JOB_MESSAGE_BASIC: Executing operation create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Close
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:13.683Z: JOB_MESSAGE_BASIC: Finished operation create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Close
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:27.531Z: JOB_MESSAGE_BASIC: Finished operation read_words/Read/Impulse+read_words/Read/Map(<lambda at iobase.py:898>)+ref_AppliedPTransform_read_words-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_34/PairWithRestriction+ref_AppliedPTransform_read_words-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_34/SplitWithSizing
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:27.603Z: JOB_MESSAGE_DEBUG: Value "ref_AppliedPTransform_read_words-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_34-split-with-sizing-out9" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:27.668Z: JOB_MESSAGE_BASIC: Executing operation ref_AppliedPTransform_read_words-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_34/ProcessElementAndRestrictionWithSizing+read_words/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:30.976Z: JOB_MESSAGE_BASIC: Finished operation read corpus/Read/Impulse+read corpus/Read/Map(<lambda at iobase.py:898>)+ref_AppliedPTransform_read-corpus-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_13/PairWithRestriction+ref_AppliedPTransform_read-corpus-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_13/SplitWithSizing
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:31.050Z: JOB_MESSAGE_DEBUG: Value "ref_AppliedPTransform_read-corpus-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_13-split-with-sizing-out3" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:31.110Z: JOB_MESSAGE_BASIC: Executing operation ref_AppliedPTransform_read-corpus-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_13/ProcessElementAndRestrictionWithSizing+read corpus/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:41.420Z: JOB_MESSAGE_BASIC: Finished operation ref_AppliedPTransform_read_words-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_34/ProcessElementAndRestrictionWithSizing+read_words/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:41.558Z: JOB_MESSAGE_DEBUG: Value "read_words/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough).cleanup_signal" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:41.604Z: JOB_MESSAGE_DEBUG: Value "read_words/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough).None" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:41.638Z: JOB_MESSAGE_BASIC: Executing operation read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input0-read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:41.681Z: JOB_MESSAGE_BASIC: Executing operation attach word/View-python_side_input0-attach word
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:41.698Z: JOB_MESSAGE_BASIC: Finished operation read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input0-read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:41.736Z: JOB_MESSAGE_BASIC: Finished operation attach word/View-python_side_input0-attach word
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:41.761Z: JOB_MESSAGE_DEBUG: Value "read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input0-read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles).out" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:41.796Z: JOB_MESSAGE_DEBUG: Value "attach word/View-python_side_input0-attach word.out" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:43.943Z: JOB_MESSAGE_BASIC: Finished operation ref_AppliedPTransform_read-corpus-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_13/ProcessElementAndRestrictionWithSizing+read corpus/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:44.026Z: JOB_MESSAGE_DEBUG: Value "read corpus/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough).cleanup_signal" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:44.085Z: JOB_MESSAGE_DEBUG: Value "read corpus/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough).None" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:44.115Z: JOB_MESSAGE_BASIC: Executing operation read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input0-read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:44.147Z: JOB_MESSAGE_BASIC: Executing operation attach corpus/View-python_side_input0-attach corpus
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:44.167Z: JOB_MESSAGE_BASIC: Finished operation read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input0-read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:44.206Z: JOB_MESSAGE_BASIC: Finished operation attach corpus/View-python_side_input0-attach corpus
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:44.217Z: JOB_MESSAGE_DEBUG: Value "read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input0-read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles).out" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:44.292Z: JOB_MESSAGE_BASIC: Executing operation read corpus/_PassThroughThenCleanup/Create/Impulse+read corpus/_PassThroughThenCleanup/Create/FlatMap(<lambda at core.py:3320>)+read corpus/_PassThroughThenCleanup/Create/Map(decode)+read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:44.317Z: JOB_MESSAGE_DEBUG: Value "attach corpus/View-python_side_input0-attach corpus.out" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:44.382Z: JOB_MESSAGE_BASIC: Executing operation create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read+create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+create groups/MaybeReshuffle/Reshuffle/RemoveRandomKeys+create groups/Map(decode)+attach corpus+attach word+WriteToText/Write/WriteImpl/WindowInto(WindowIntoFn)+WriteToText/Write/WriteImpl/WriteBundles+WriteToText/Write/WriteImpl/Pair+WriteToText/Write/WriteImpl/GroupByKey/Write
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:44.438Z: JOB_MESSAGE_BASIC: Finished operation read_words/FilesToRemoveImpulse/Impulse+read_words/FilesToRemoveImpulse/FlatMap(<lambda at core.py:3320>)+read_words/FilesToRemoveImpulse/Map(decode)+read_words/MapFilesToRemove
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:44.536Z: JOB_MESSAGE_DEBUG: Value "read_words/MapFilesToRemove.None" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:44.630Z: JOB_MESSAGE_BASIC: Executing operation read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input1-read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:44.685Z: JOB_MESSAGE_BASIC: Finished operation read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input1-read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:44.773Z: JOB_MESSAGE_DEBUG: Value "read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input1-read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles).out" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:44.852Z: JOB_MESSAGE_BASIC: Executing operation read_words/_PassThroughThenCleanup/Create/Impulse+read_words/_PassThroughThenCleanup/Create/FlatMap(<lambda at core.py:3320>)+read_words/_PassThroughThenCleanup/Create/Map(decode)+read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:46.769Z: JOB_MESSAGE_BASIC: Finished operation read corpus/_PassThroughThenCleanup/Create/Impulse+read corpus/_PassThroughThenCleanup/Create/FlatMap(<lambda at core.py:3320>)+read corpus/_PassThroughThenCleanup/Create/Map(decode)+read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:47.390Z: JOB_MESSAGE_BASIC: Finished operation read_words/_PassThroughThenCleanup/Create/Impulse+read_words/_PassThroughThenCleanup/Create/FlatMap(<lambda at core.py:3320>)+read_words/_PassThroughThenCleanup/Create/Map(decode)+read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:50.876Z: JOB_MESSAGE_BASIC: Finished operation create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read+create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+create groups/MaybeReshuffle/Reshuffle/RemoveRandomKeys+create groups/Map(decode)+attach corpus+attach word+WriteToText/Write/WriteImpl/WindowInto(WindowIntoFn)+WriteToText/Write/WriteImpl/WriteBundles+WriteToText/Write/WriteImpl/Pair+WriteToText/Write/WriteImpl/GroupByKey/Write
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:50.948Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/GroupByKey/Close
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:51.002Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/GroupByKey/Close
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:51.059Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/GroupByKey/Read+WriteToText/Write/WriteImpl/Extract
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:53.522Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/GroupByKey/Read+WriteToText/Write/WriteImpl/Extract
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:53.600Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/Extract.None" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:53.681Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input1-WriteToText/Write/WriteImpl/FinalizeWrite
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:53.716Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/PreFinalize/View-python_side_input1-WriteToText/Write/WriteImpl/PreFinalize
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:53.742Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input1-WriteToText/Write/WriteImpl/FinalizeWrite
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:53.790Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input1-WriteToText/Write/WriteImpl/FinalizeWrite.out" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:53.791Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/PreFinalize/View-python_side_input1-WriteToText/Write/WriteImpl/PreFinalize
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:53.864Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/PreFinalize/View-python_side_input1-WriteToText/Write/WriteImpl/PreFinalize.out" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:53.945Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/PreFinalize
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:57.112Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/PreFinalize
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:57.164Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/PreFinalize.None" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:57.229Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input2-WriteToText/Write/WriteImpl/FinalizeWrite
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:57.298Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input2-WriteToText/Write/WriteImpl/FinalizeWrite
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:57.387Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input2-WriteToText/Write/WriteImpl/FinalizeWrite.out" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:57.441Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/FinalizeWrite
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:59.666Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/FinalizeWrite
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:59.755Z: JOB_MESSAGE_DEBUG: Executing success step success42
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:17:59.841Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:18:00.028Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:18:00.053Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:18:32.569Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:18:32.619Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-27T02:18:32.658Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:197 Job 2022-04-26_19_11_16-10047380238909920073 is in state JOB_STATE_DONE
INFO  apache_beam.io.gcp.gcsio:gcsio.py:559 Starting the size estimation of the input
INFO  apache_beam.io.gcp.gcsio:gcsio.py:572 Finished listing 1 files in 0.04544782638549805 seconds.
PASSED   apache_beam/examples/cookbook/coders_it_test.py::CodersIT::test_coders_output_files_on_small_input
-------------------------------- live log call ---------------------------------
INFO     root:coders_it_test.py:48 Creating file: gs://temp-storage-for-end-to-end-tests/py-it-cloud/input/d310128b-3458-4409-8dd1-37013f361c83/input.txt
FAILED
=================================== FAILURES ===================================
_______________ CodersIT.test_coders_output_files_on_small_input _______________

self = <apache_beam.examples.cookbook.coders_it_test.CodersIT testMethod=test_coders_output_files_on_small_input>

    @pytest.mark.no_xdist
    @pytest.mark.examples_postcommit
    def test_coders_output_files_on_small_input(self):
      test_pipeline = TestPipeline(is_integration_test=True)
      # Setup the files with expected content.
      OUTPUT_FILE_DIR = \
          'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output'
      output = '/'.join([OUTPUT_FILE_DIR, str(uuid.uuid4()), 'result'])
      INPUT_FILE_DIR = \
          'gs://temp-storage-for-end-to-end-tests/py-it-cloud/input'
      input = '/'.join([INPUT_FILE_DIR, str(uuid.uuid4()), 'input.txt'])
      create_content_input_file(
          input, '\n'.join(map(json.dumps, self.SAMPLE_RECORDS)))
      extra_opts = {'input': input, 'output': output}
>     coders.run(test_pipeline.get_full_options_as_args(**extra_opts))

apache_beam/examples/cookbook/coders_it_test.py:93:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
apache_beam/examples/cookbook/coders.py:87: in run
    p
apache_beam/transforms/ptransform.py:1092: in __ror__
    return self.transform.__ror__(pvalueish, self.label)
apache_beam/transforms/ptransform.py:614: in __ror__
    result = p.apply(self, pvalueish, label)
apache_beam/pipeline.py:662: in apply
    return self.apply(transform, pvalueish)
apache_beam/pipeline.py:708: in apply
    pvalueish_result = self.runner.apply(transform, pvalueish, self._options)
apache_beam/runners/dataflow/dataflow_runner.py:141: in apply
    return super().apply(transform, input, options)
apache_beam/runners/runner.py:185: in apply
    return m(transform, input, options)
apache_beam/runners/runner.py:215: in apply_PTransform
    return transform.expand(input)
apache_beam/io/textio.py:690: in expand
    self._source.output_type_hint())
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <apache_beam.io.textio._TextSource object at 0x7f226ebd55b0>

    def output_type_hint(self):
      try:
>       return self._coder.to_type_hint()
E       AttributeError: 'JsonCoder' object has no attribute 'to_type_hint'

apache_beam/io/textio.py:409: AttributeError
------------------------------ Captured log call -------------------------------
INFO     root:coders_it_test.py:48 Creating file: gs://temp-storage-for-end-to-end-tests/py-it-cloud/input/d310128b-3458-4409-8dd1-37013f361c83/input.txt
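Editor's note on the failure above: textio's _TextSource asks its coder for an element type via coder.to_type_hint(), and the cookbook example's JsonCoder does not define that method. The following is only a minimal sketch of one way such a coder could satisfy that call, assuming the sample records decode to Python dicts; it is an illustration, not necessarily the fix applied upstream.

    # Sketch: a JsonCoder (as in apache_beam/examples/cookbook/coders.py)
    # that also implements to_type_hint(), which ReadFromText uses to infer
    # the output element type. Assumes each line is a JSON object that
    # decodes to a Python dict.
    import json
    from apache_beam import coders

    class JsonCoder(coders.Coder):
      def encode(self, value):
        # Serialize one record to a UTF-8 JSON line.
        return json.dumps(value).encode('utf-8')

      def decode(self, encoded):
        # Parse one JSON line back into a Python object.
        return json.loads(encoded)

      def to_type_hint(self):
        # _TextSource.output_type_hint() delegates here; returning dict
        # matches what decode() produces for these sample records.
        return dict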
=============================== warnings summary ===============================
<unknown>:54
  <unknown>:54: DeprecationWarning: invalid escape sequence \c
<unknown>:62
  <unknown>:62: DeprecationWarning: invalid escape sequence \d
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/tenacity/_asyncio.py>:42
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/tenacity/_asyncio.py>:42: DeprecationWarning: "@coroutine" decorator is deprecated since Python 3.8, use "async def" instead
    def call(self, fn, *args, **kwargs):

apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:63: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    dataset_ref = client.dataset(unique_dataset_name, project=project)

apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2154: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    is_streaming_pipeline = p.options.view_as(StandardOptions).streaming

apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2160: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    experiments = p.options.view_as(DebugOptions).experiments or []

apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1128: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location

apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1130: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')

apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2454: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = pcoll.pipeline.options.view_as(

apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2456: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2480: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    pipeline_options=pcoll.pipeline.options,

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/pytest_postCommitIT-df-py39-no-xdist.xml> -
= 1 failed, 5 passed, 1 skipped, 5371 deselected, 14 warnings in 2479.37 seconds =

> Task :sdks:python:test-suites:dataflow:py39:examples FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 203

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py39:examples'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1h 23m 19s

15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/z2pnbd6jqml2o

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
