See <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/293/display/redirect?page=changes>
Changes:
[bulat.safiullin] [BEAM-14247] [Website] add image
[bulat.safiullin] [BEAM-14247] [Website] center image
[mattcasters] BEAM-1857 : CHANGES.md entry for 2.38.0
[mmack] [BEAM-14323] Improve IDE integration of Spark cross version builds
[noreply] [BEAM-14112] Fixed ReadFromBigQuery with Interactive Beam (#17306)
[noreply] Update .asf.yaml (#17409)
[noreply] [BEAM-14336] Sickbay flight delays test - dataset seems to be missing
[noreply] [BEAM-14338] Update watermark unit tests to use time.Time.Equals()
[noreply] [BEAM-14328] Tweaks to "Differences from pandas" page (#17413)
[Andrew Pilloud] [BEAM-14253] Disable broken test pending Dataflow fix
[yiru] fix: BigQuery Storage Connector trace id population missing bracket
[noreply] [BEAM-14330] Temporarily disable the clusters auto-cleanup (#17400)
[noreply] Update Beam website to release 2.38.0 (#17378)
[noreply] [BEAM-14213] Add API and construction time validation for Batched DoFns
[noreply] Minor: Update release guide regarding archive.apache.org (#17419)
------------------------------------------
[...truncated 283.51 KB...]
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:46:17.009Z: JOB_MESSAGE_DEBUG: Executing wait step start44
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:46:17.094Z: JOB_MESSAGE_BASIC: Executing operation read_words/FilesToRemoveImpulse/Impulse+read_words/FilesToRemoveImpulse/FlatMap(<lambda at core.py:3320>)+read_words/FilesToRemoveImpulse/Map(decode)+read_words/MapFilesToRemove
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:46:17.116Z: JOB_MESSAGE_BASIC: Executing operation read corpus/FilesToRemoveImpulse/Impulse+read corpus/FilesToRemoveImpulse/FlatMap(<lambda at core.py:3320>)+read corpus/FilesToRemoveImpulse/Map(decode)+read corpus/MapFilesToRemove
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:46:17.204Z: JOB_MESSAGE_BASIC: Executing operation create_ignore_corpus/Impulse+create_ignore_corpus/FlatMap(<lambda at core.py:3320>)+create_ignore_corpus/Map(decode)
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:46:17.235Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:46:17.271Z: JOB_MESSAGE_BASIC: Executing operation read corpus/Read/Impulse+read corpus/Read/Map(<lambda at iobase.py:898>)+ref_AppliedPTransform_read-corpus-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_13/PairWithRestriction+ref_AppliedPTransform_read-corpus-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_13/SplitWithSizing
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:46:17.353Z: JOB_MESSAGE_BASIC: Executing operation create_ignore_word/Impulse+create_ignore_word/FlatMap(<lambda at core.py:3320>)+create_ignore_word/Map(decode)
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:46:17.374Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b...
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:46:17.462Z: JOB_MESSAGE_BASIC: Executing operation read_words/Read/Impulse+read_words/Read/Map(<lambda at iobase.py:898>)+ref_AppliedPTransform_read_words-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_34/PairWithRestriction+ref_AppliedPTransform_read_words-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_34/SplitWithSizing
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:46:17.563Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/DoOnce/Impulse+WriteToText/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:3320>)+WriteToText/Write/WriteImpl/DoOnce/Map(decode)+WriteToText/Write/WriteImpl/InitializeWrite
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:46:17.837Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/GroupByKey/Create
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:46:17.903Z: JOB_MESSAGE_BASIC: Executing operation create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:46:17.963Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/GroupByKey/Create
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:46:17.963Z: JOB_MESSAGE_BASIC: Finished operation create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:46:18.062Z: JOB_MESSAGE_DEBUG: Value "create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Session" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:46:18.103Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/GroupByKey/Session" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:46:18.137Z: JOB_MESSAGE_BASIC: Executing operation create groups/Impulse+create groups/FlatMap(<lambda at core.py:3320>)+create groups/MaybeReshuffle/Reshuffle/AddRandomKeys+create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:46:42.701Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:46:45.224Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:47:08.934Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:54:25.407Z: JOB_MESSAGE_BASIC: Finished operation create_ignore_word/Impulse+create_ignore_word/FlatMap(<lambda at core.py:3320>)+create_ignore_word/Map(decode)
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:54:25.506Z: JOB_MESSAGE_DEBUG: Value "create_ignore_word/Map(decode).None" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:54:25.571Z: JOB_MESSAGE_BASIC: Executing operation attach word/View-python_side_input1-attach word
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:54:25.615Z: JOB_MESSAGE_BASIC: Finished operation attach word/View-python_side_input1-attach word
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:54:25.667Z: JOB_MESSAGE_DEBUG: Value "attach word/View-python_side_input1-attach word.out" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:54:28.632Z: JOB_MESSAGE_BASIC: Finished operation create_ignore_corpus/Impulse+create_ignore_corpus/FlatMap(<lambda at core.py:3320>)+create_ignore_corpus/Map(decode)
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:54:28.706Z: JOB_MESSAGE_DEBUG: Value "create_ignore_corpus/Map(decode).None" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:54:28.764Z: JOB_MESSAGE_BASIC: Executing operation attach corpus/View-python_side_input1-attach corpus
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:54:28.825Z: JOB_MESSAGE_BASIC: Finished operation attach corpus/View-python_side_input1-attach corpus
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:54:28.894Z: JOB_MESSAGE_DEBUG: Value "attach corpus/View-python_side_input1-attach corpus.out" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:54:31.960Z: JOB_MESSAGE_BASIC: Finished operation read_words/FilesToRemoveImpulse/Impulse+read_words/FilesToRemoveImpulse/FlatMap(<lambda at core.py:3320>)+read_words/FilesToRemoveImpulse/Map(decode)+read_words/MapFilesToRemove
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:54:32.026Z: JOB_MESSAGE_DEBUG: Value "read_words/MapFilesToRemove.None" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:54:32.088Z: JOB_MESSAGE_BASIC: Executing operation read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input1-read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:54:32.183Z: JOB_MESSAGE_BASIC: Finished operation read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input1-read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:54:32.232Z: JOB_MESSAGE_DEBUG: Value "read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input1-read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles).out" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:54:34.553Z: JOB_MESSAGE_BASIC: Finished operation create groups/Impulse+create groups/FlatMap(<lambda at core.py:3320>)+create groups/MaybeReshuffle/Reshuffle/AddRandomKeys+create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:54:34.605Z: JOB_MESSAGE_BASIC: Executing operation create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Close
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:54:34.659Z: JOB_MESSAGE_BASIC: Finished operation create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Close
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:54:38.473Z: JOB_MESSAGE_BASIC: Finished operation read corpus/FilesToRemoveImpulse/Impulse+read corpus/FilesToRemoveImpulse/FlatMap(<lambda at core.py:3320>)+read corpus/FilesToRemoveImpulse/Map(decode)+read corpus/MapFilesToRemove
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:54:38.490Z: JOB_MESSAGE_BASIC: Finished operation read_words/Read/Impulse+read_words/Read/Map(<lambda at iobase.py:898>)+ref_AppliedPTransform_read_words-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_34/PairWithRestriction+ref_AppliedPTransform_read_words-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_34/SplitWithSizing
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:54:38.529Z: JOB_MESSAGE_DEBUG: Value "ref_AppliedPTransform_read_words-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_34-split-with-sizing-out9" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:54:38.553Z: JOB_MESSAGE_DEBUG: Value "read corpus/MapFilesToRemove.None" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:54:38.575Z: JOB_MESSAGE_BASIC: Executing operation ref_AppliedPTransform_read_words-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_34/ProcessElementAndRestrictionWithSizing+read_words/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:54:38.620Z: JOB_MESSAGE_BASIC: Executing operation read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input1-read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:54:38.674Z: JOB_MESSAGE_BASIC: Finished operation read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input1-read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:54:38.721Z: JOB_MESSAGE_DEBUG: Value "read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input1-read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles).out" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:54:52.187Z: JOB_MESSAGE_BASIC: Finished operation ref_AppliedPTransform_read_words-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_34/ProcessElementAndRestrictionWithSizing+read_words/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:54:52.259Z: JOB_MESSAGE_DEBUG: Value "read_words/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough).cleanup_signal" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:54:52.306Z: JOB_MESSAGE_DEBUG: Value "read_words/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough).None" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:54:52.339Z: JOB_MESSAGE_BASIC: Executing operation read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input0-read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:54:52.373Z: JOB_MESSAGE_BASIC: Executing operation attach word/View-python_side_input0-attach word
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:54:52.400Z: JOB_MESSAGE_BASIC: Finished operation read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input0-read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:54:52.430Z: JOB_MESSAGE_BASIC: Finished operation attach word/View-python_side_input0-attach word
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:54:52.454Z: JOB_MESSAGE_DEBUG: Value "read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input0-read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles).out" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:54:52.505Z: JOB_MESSAGE_DEBUG: Value "attach word/View-python_side_input0-attach word.out" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:54:52.528Z: JOB_MESSAGE_BASIC: Executing operation read_words/_PassThroughThenCleanup/Create/Impulse+read_words/_PassThroughThenCleanup/Create/FlatMap(<lambda at core.py:3320>)+read_words/_PassThroughThenCleanup/Create/Map(decode)+read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:54:54.691Z: JOB_MESSAGE_BASIC: Finished operation read corpus/Read/Impulse+read corpus/Read/Map(<lambda at iobase.py:898>)+ref_AppliedPTransform_read-corpus-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_13/PairWithRestriction+ref_AppliedPTransform_read-corpus-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_13/SplitWithSizing
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:54:54.758Z: JOB_MESSAGE_DEBUG: Value "ref_AppliedPTransform_read-corpus-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_13-split-with-sizing-out3" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:54:54.806Z: JOB_MESSAGE_BASIC: Executing operation ref_AppliedPTransform_read-corpus-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_13/ProcessElementAndRestrictionWithSizing+read corpus/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:54:55.547Z: JOB_MESSAGE_BASIC: Finished operation read_words/_PassThroughThenCleanup/Create/Impulse+read_words/_PassThroughThenCleanup/Create/FlatMap(<lambda at core.py:3320>)+read_words/_PassThroughThenCleanup/Create/Map(decode)+read_words/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:54:59.654Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/DoOnce/Impulse+WriteToText/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:3320>)+WriteToText/Write/WriteImpl/DoOnce/Map(decode)+WriteToText/Write/WriteImpl/InitializeWrite
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:54:59.714Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/DoOnce/Map(decode).None" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:54:59.739Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/InitializeWrite.None" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:54:59.796Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/WriteBundles/View-python_side_input0-WriteToText/Write/WriteImpl/WriteBundles
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:54:59.828Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input0-WriteToText/Write/WriteImpl/FinalizeWrite
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:54:59.844Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/WriteBundles/View-python_side_input0-WriteToText/Write/WriteImpl/WriteBundles
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:54:59.852Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/PreFinalize/View-python_side_input0-WriteToText/Write/WriteImpl/PreFinalize
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:54:59.873Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input0-WriteToText/Write/WriteImpl/FinalizeWrite
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:54:59.896Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/WriteBundles/View-python_side_input0-WriteToText/Write/WriteImpl/WriteBundles.out" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:54:59.898Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/PreFinalize/View-python_side_input0-WriteToText/Write/WriteImpl/PreFinalize
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:54:59.943Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input0-WriteToText/Write/WriteImpl/FinalizeWrite.out" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:54:59.976Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/PreFinalize/View-python_side_input0-WriteToText/Write/WriteImpl/PreFinalize.out" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:55:05.126Z: JOB_MESSAGE_BASIC: Finished operation ref_AppliedPTransform_read-corpus-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_13/ProcessElementAndRestrictionWithSizing+read corpus/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:55:05.226Z: JOB_MESSAGE_DEBUG: Value "read corpus/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough).cleanup_signal" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:55:05.251Z: JOB_MESSAGE_DEBUG: Value "read corpus/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough).None" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:55:05.318Z: JOB_MESSAGE_BASIC: Executing operation read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input0-read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:55:05.350Z: JOB_MESSAGE_BASIC: Executing operation attach corpus/View-python_side_input0-attach corpus
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:55:05.355Z: JOB_MESSAGE_BASIC: Finished operation read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input0-read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:55:05.398Z: JOB_MESSAGE_BASIC: Finished operation attach corpus/View-python_side_input0-attach corpus
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:55:05.408Z: JOB_MESSAGE_DEBUG: Value "read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/View-python_side_input0-read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles).out" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:55:05.526Z: JOB_MESSAGE_BASIC: Executing operation read corpus/_PassThroughThenCleanup/Create/Impulse+read corpus/_PassThroughThenCleanup/Create/FlatMap(<lambda at core.py:3320>)+read corpus/_PassThroughThenCleanup/Create/Map(decode)+read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:55:05.555Z: JOB_MESSAGE_DEBUG: Value "attach corpus/View-python_side_input0-attach corpus.out" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:55:05.609Z: JOB_MESSAGE_BASIC: Executing operation create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read+create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+create groups/MaybeReshuffle/Reshuffle/RemoveRandomKeys+create groups/Map(decode)+attach corpus+attach word+WriteToText/Write/WriteImpl/WindowInto(WindowIntoFn)+WriteToText/Write/WriteImpl/WriteBundles+WriteToText/Write/WriteImpl/Pair+WriteToText/Write/WriteImpl/GroupByKey/Write
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:55:07.234Z: JOB_MESSAGE_BASIC: Finished operation read corpus/_PassThroughThenCleanup/Create/Impulse+read corpus/_PassThroughThenCleanup/Create/FlatMap(<lambda at core.py:3320>)+read corpus/_PassThroughThenCleanup/Create/Map(decode)+read corpus/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:55:08.723Z: JOB_MESSAGE_BASIC: Finished operation create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read+create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+create groups/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+create groups/MaybeReshuffle/Reshuffle/RemoveRandomKeys+create groups/Map(decode)+attach corpus+attach word+WriteToText/Write/WriteImpl/WindowInto(WindowIntoFn)+WriteToText/Write/WriteImpl/WriteBundles+WriteToText/Write/WriteImpl/Pair+WriteToText/Write/WriteImpl/GroupByKey/Write
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:55:08.786Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/GroupByKey/Close
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:55:08.836Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/GroupByKey/Close
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:55:08.900Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/GroupByKey/Read+WriteToText/Write/WriteImpl/Extract
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:55:11.252Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/GroupByKey/Read+WriteToText/Write/WriteImpl/Extract
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:55:11.330Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/Extract.None" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:55:11.409Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input1-WriteToText/Write/WriteImpl/FinalizeWrite
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:55:11.429Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/PreFinalize/View-python_side_input1-WriteToText/Write/WriteImpl/PreFinalize
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:55:11.442Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input1-WriteToText/Write/WriteImpl/FinalizeWrite
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:55:11.476Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/PreFinalize/View-python_side_input1-WriteToText/Write/WriteImpl/PreFinalize
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:55:11.508Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input1-WriteToText/Write/WriteImpl/FinalizeWrite.out" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:55:11.542Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/PreFinalize/View-python_side_input1-WriteToText/Write/WriteImpl/PreFinalize.out" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:55:11.598Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/PreFinalize
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:55:14.418Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/PreFinalize
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:55:14.478Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/PreFinalize.None" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:55:14.541Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input2-WriteToText/Write/WriteImpl/FinalizeWrite
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:55:14.591Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input2-WriteToText/Write/WriteImpl/FinalizeWrite
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:55:14.648Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input2-WriteToText/Write/WriteImpl/FinalizeWrite.out" materialized.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:55:14.719Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/FinalizeWrite
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:55:17.052Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/FinalizeWrite
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:55:17.103Z: JOB_MESSAGE_DEBUG: Executing success step success42
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:55:17.475Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:55:17.513Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:55:17.582Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:55:51.744Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:55:51.793Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:242 2022-04-21T01:55:51.815Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO  apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:197 Job 2022-04-20_18_46_07-8093763463795895743 is in state JOB_STATE_DONE
INFO  apache_beam.io.gcp.gcsio:gcsio.py:559 Starting the size estimation of the input
INFO  apache_beam.io.gcp.gcsio:gcsio.py:572 Finished listing 1 files in 0.044492483139038086 seconds.
[32mPASSED[0m apache_beam/examples/cookbook/coders_it_test.py::CodersIT::test_coders_output_files_on_small_input [1m-------------------------------- live log call ---------------------------------[0m [32mINFO [0m root:coders_it_test.py:48 Creating file: gs://temp-storage-for-end-to-end-tests/py-it-cloud/input/01642e82-7f4e-4813-ab4e-34b01804c5f2/input.txt [31mFAILED[0m =================================== FAILURES =================================== [31m[1m_______________ CodersIT.test_coders_output_files_on_small_input _______________[0m self = <apache_beam.examples.cookbook.coders_it_test.CodersIT testMethod=test_coders_output_files_on_small_input> [1m @pytest.mark.no_xdist[0m [1m @pytest.mark.examples_postcommit[0m [1m def test_coders_output_files_on_small_input(self):[0m [1m test_pipeline = TestPipeline(is_integration_test=True)[0m [1m # Setup the files with expected content.[0m [1m OUTPUT_FILE_DIR = \[0m [1m 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output'[0m [1m output = '/'.join([OUTPUT_FILE_DIR, str(uuid.uuid4()), 'result'])[0m [1m INPUT_FILE_DIR = \[0m [1m 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/input'[0m [1m input = '/'.join([INPUT_FILE_DIR, str(uuid.uuid4()), 'input.txt'])[0m [1m create_content_input_file([0m [1m input, '\n'.join(map(json.dumps, self.SAMPLE_RECORDS)))[0m [1m extra_opts = {'input': input, 'output': output}[0m [1m> coders.run(test_pipeline.get_full_options_as_args(**extra_opts))[0m [1m[31mapache_beam/examples/cookbook/coders_it_test.py[0m:93: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ [1m[31mapache_beam/examples/cookbook/coders.py[0m:87: in run [1m p[0m [1m[31mapache_beam/transforms/ptransform.py[0m:1092: in __ror__ [1m return self.transform.__ror__(pvalueish, self.label)[0m [1m[31mapache_beam/transforms/ptransform.py[0m:614: in __ror__ [1m result = p.apply(self, pvalueish, label)[0m [1m[31mapache_beam/pipeline.py[0m:662: in apply [1m return self.apply(transform, pvalueish)[0m 
apache_beam/pipeline.py:708: in apply
    pvalueish_result = self.runner.apply(transform, pvalueish, self._options)
apache_beam/runners/dataflow/dataflow_runner.py:141: in apply
    return super().apply(transform, input, options)
apache_beam/runners/runner.py:185: in apply
    return m(transform, input, options)
apache_beam/runners/runner.py:215: in apply_PTransform
    return transform.expand(input)
apache_beam/io/textio.py:690: in expand
    self._source.output_type_hint())
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <apache_beam.io.textio._TextSource object at 0x7fef4d8071c0>

    def output_type_hint(self):
      try:
>       return self._coder.to_type_hint()
E       AttributeError: 'JsonCoder' object has no attribute 'to_type_hint'

apache_beam/io/textio.py:409: AttributeError
------------------------------ Captured log call -------------------------------
INFO root:coders_it_test.py:48 Creating file: gs://temp-storage-for-end-to-end-tests/py-it-cloud/input/01642e82-7f4e-4813-ab4e-34b01804c5f2/input.txt
=============================== warnings summary ===============================
<unknown>:54
  <unknown>:54: DeprecationWarning: invalid escape sequence \c
<unknown>:62
  <unknown>:62: DeprecationWarning: invalid escape sequence \d
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/tenacity/_asyncio.py>:42
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/tenacity/_asyncio.py>:42: DeprecationWarning: "@coroutine" decorator is deprecated since Python 3.8, use "async def" instead
    def call(self, fn, *args, **kwargs):
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:63: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    dataset_ref = client.dataset(unique_dataset_name, project=project)
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2152: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    is_streaming_pipeline = p.options.view_as(StandardOptions).streaming
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2158: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    experiments = p.options.view_as(DebugOptions).experiments or []
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1128: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1130: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2452: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = pcoll.pipeline.options.view_as(
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2454: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2478: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    pipeline_options=pcoll.pipeline.options,
-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/pytest_postCommitIT-df-py39-no-xdist.xml> -
===== 1 failed, 5 passed, 5364 deselected, 14 warnings in 2691.20 seconds ======

> Task :sdks:python:test-suites:dataflow:py39:examples FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 183

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py39:examples'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1h 36s

15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/xfpevyrvyveji

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
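The root cause of the single failure above is that the cookbook example's custom `JsonCoder` does not define the `to_type_hint` method, which `_TextSource.output_type_hint()` in `apache_beam/io/textio.py` calls while expanding the read transform. A minimal sketch of a coder that provides it is below; to keep it runnable without a Beam installation, the class here is standalone (in the real example it subclasses `apache_beam.coders.Coder`), and the choice of `dict` as the type hint is an assumption based on the JSON records the test writes:

```python
import json


class JsonCoder:
    """Illustrative stand-in for the example's JsonCoder.

    In Beam this would subclass apache_beam.coders.Coder; only the
    method shapes are shown here.
    """

    def encode(self, value):
        # Serialize a Python object to UTF-8 JSON bytes.
        return json.dumps(value).encode('utf-8')

    def decode(self, encoded):
        # Parse UTF-8 JSON bytes back into a Python object.
        return json.loads(encoded)

    def is_deterministic(self):
        # json.dumps with default arguments is stable for this sketch.
        return True

    def to_type_hint(self):
        # textio's output_type_hint() calls this on the source's coder.
        # Returning a type (dict here, since the sample records are JSON
        # objects) is what the missing method would need to supply to
        # avoid the AttributeError in the traceback above.
        return dict
```

With `to_type_hint` present, the `AttributeError` at `apache_beam/io/textio.py:409` would no longer be raised; whether this matches the eventual upstream fix should be confirmed against the Beam repository.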
