See <https://ci-beam.apache.org/job/beam_CloudML_Benchmarks_Dataflow/175/display/redirect?page=changes>
Changes:
[noreply] Update CHANGES.md after 2.49.0 cut (#27328)
------------------------------------------
[...truncated 1.85 MB...]
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:10:38.001Z: JOB_MESSAGE_BASIC: Finished operation Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_15#vocabulary]/WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input2-Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_15#vocabulary]/WriteToText/Write/WriteImpl/FinalizeWrite
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:10:38.063Z: JOB_MESSAGE_DEBUG: Value "Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_15#vocabulary]/WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input2-Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_15#vocabulary]/WriteToText/Write/WriteImpl/FinalizeWrite.out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:10:38.107Z: JOB_MESSAGE_BASIC: Executing operation Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_15#vocabulary]/WriteToText/Write/WriteImpl/FinalizeWrite
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:10:38.915Z: JOB_MESSAGE_BASIC: Finished operation Analyze/VocabularyCountUnfiltered[compute_and_apply_vocabulary_2#vocabulary]/TotalVocabSize/CombineGlobally(CountCombineFn)/DoOnce/Impulse+Analyze/VocabularyCountUnfiltered[compute_and_apply_vocabulary_2#vocabulary]/TotalVocabSize/CombineGlobally(CountCombineFn)/DoOnce/FlatMap(<lambda at core.py:3722>)+Analyze/VocabularyCountUnfiltered[compute_and_apply_vocabulary_2#vocabulary]/TotalVocabSize/CombineGlobally(CountCombineFn)/DoOnce/Map(decode)+Analyze/VocabularyCountUnfiltered[compute_and_apply_vocabulary_2#vocabulary]/TotalVocabSize/CombineGlobally(CountCombineFn)/InjectDefault+Analyze/VocabularyCountUnfiltered[compute_and_apply_vocabulary_2#vocabulary]/FormatCount+Analyze/CreateTensorBinding[compute_and_apply_vocabulary_2#vocabulary#temporary_analyzer_output#vocab_compute_and_apply_vocabulary_2_vocabulary_unpruned_vocab_size]/ToTensorBinding
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:10:38.992Z: JOB_MESSAGE_DEBUG: Value "Analyze/CreateTensorBinding[compute_and_apply_vocabulary_2#vocabulary#temporary_analyzer_output#vocab_compute_and_apply_vocabulary_2_vocabulary_unpruned_vocab_size]/ToTensorBinding.None" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:10:39.067Z: JOB_MESSAGE_BASIC: Executing operation Analyze/CreateSavedModel[tf_v2_only]/ReplaceWithConstants/View-python_side_input7-Analyze/CreateSavedModel[tf_v2_only]/ReplaceWithConstants
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:10:39.126Z: JOB_MESSAGE_BASIC: Finished operation Analyze/CreateSavedModel[tf_v2_only]/ReplaceWithConstants/View-python_side_input7-Analyze/CreateSavedModel[tf_v2_only]/ReplaceWithConstants
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:10:39.184Z: JOB_MESSAGE_DEBUG: Value "Analyze/CreateSavedModel[tf_v2_only]/ReplaceWithConstants/View-python_side_input7-Analyze/CreateSavedModel[tf_v2_only]/ReplaceWithConstants.out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:10:40.605Z: JOB_MESSAGE_BASIC: Finished operation Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_15#vocabulary]/WriteToText/Write/WriteImpl/FinalizeWrite
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:10:40.681Z: JOB_MESSAGE_DEBUG: Value "Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_15#vocabulary]/WriteToText/Write/WriteImpl/FinalizeWrite.None" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:10:40.738Z: JOB_MESSAGE_BASIC: Executing operation Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_15#vocabulary]/WaitForVocabularyFile/View-python_side_input0-Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_15#vocabulary]/WaitForVocabularyFile
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:10:40.803Z: JOB_MESSAGE_BASIC: Finished operation Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_15#vocabulary]/WaitForVocabularyFile/View-python_side_input0-Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_15#vocabulary]/WaitForVocabularyFile
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:10:40.854Z: JOB_MESSAGE_DEBUG: Value "Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_15#vocabulary]/WaitForVocabularyFile/View-python_side_input0-Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_15#vocabulary]/WaitForVocabularyFile.out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:10:40.910Z: JOB_MESSAGE_BASIC: Executing operation Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_15#vocabulary]/CreatePath/Impulse+Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_15#vocabulary]/CreatePath/FlatMap(<lambda at core.py:3722>)+Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_15#vocabulary]/CreatePath/Map(decode)+Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_15#vocabulary]/WaitForVocabularyFile+Analyze/CreateTensorBinding[compute_and_apply_vocabulary_15#vocabulary#temporary_analyzer_output_2#Const]/ToTensorBinding
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:10:42.043Z: JOB_MESSAGE_BASIC: Finished operation Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_15#vocabulary]/CreatePath/Impulse+Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_15#vocabulary]/CreatePath/FlatMap(<lambda at core.py:3722>)+Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_15#vocabulary]/CreatePath/Map(decode)+Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_15#vocabulary]/WaitForVocabularyFile+Analyze/CreateTensorBinding[compute_and_apply_vocabulary_15#vocabulary#temporary_analyzer_output_2#Const]/ToTensorBinding
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:10:42.111Z: JOB_MESSAGE_DEBUG: Value "Analyze/CreateTensorBinding[compute_and_apply_vocabulary_15#vocabulary#temporary_analyzer_output_2#Const]/ToTensorBinding.None" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:10:42.165Z: JOB_MESSAGE_BASIC: Executing operation Analyze/CreateSavedModel[tf_v2_only]/ReplaceWithConstants/View-python_side_input48-Analyze/CreateSavedModel[tf_v2_only]/ReplaceWithConstants
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:10:42.213Z: JOB_MESSAGE_BASIC: Finished operation Analyze/CreateSavedModel[tf_v2_only]/ReplaceWithConstants/View-python_side_input48-Analyze/CreateSavedModel[tf_v2_only]/ReplaceWithConstants
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:10:42.272Z: JOB_MESSAGE_DEBUG: Value "Analyze/CreateSavedModel[tf_v2_only]/ReplaceWithConstants/View-python_side_input48-Analyze/CreateSavedModel[tf_v2_only]/ReplaceWithConstants.out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:10:44.228Z: JOB_MESSAGE_BASIC: Finished operation Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_2#vocabulary]/Prepare/Impulse+Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_2#vocabulary]/Prepare/FlatMap(<lambda at core.py:3722>)+Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_2#vocabulary]/Prepare/Map(decode)+Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_2#vocabulary]/OrderElements+Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_2#vocabulary]/WriteToText/Write/WriteImpl/Map(<lambda at iobase.py:1140>)+Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_2#vocabulary]/WriteToText/Write/WriteImpl/WindowInto(WindowIntoFn)+Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_2#vocabulary]/WriteToText/Write/WriteImpl/GroupByKey/Write
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:10:44.301Z: JOB_MESSAGE_BASIC: Executing operation Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_2#vocabulary]/WriteToText/Write/WriteImpl/GroupByKey/Close
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:10:45.557Z: JOB_MESSAGE_BASIC: Finished operation Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_2#vocabulary]/WriteToText/Write/WriteImpl/GroupByKey/Close
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:10:45.608Z: JOB_MESSAGE_BASIC: Executing operation Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_2#vocabulary]/WriteToText/Write/WriteImpl/GroupByKey/Read+Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_2#vocabulary]/WriteToText/Write/WriteImpl/WriteBundles
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:10:49.219Z: JOB_MESSAGE_BASIC: Finished operation Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_2#vocabulary]/WriteToText/Write/WriteImpl/GroupByKey/Read+Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_2#vocabulary]/WriteToText/Write/WriteImpl/WriteBundles
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:10:49.284Z: JOB_MESSAGE_DEBUG: Value "Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_2#vocabulary]/WriteToText/Write/WriteImpl/WriteBundles.None" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:10:49.562Z: JOB_MESSAGE_BASIC: Executing operation Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_2#vocabulary]/WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input1-Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_2#vocabulary]/WriteToText/Write/WriteImpl/FinalizeWrite
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:10:49.588Z: JOB_MESSAGE_BASIC: Executing operation Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_2#vocabulary]/WriteToText/Write/WriteImpl/PreFinalize/View-python_side_input1-Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_2#vocabulary]/WriteToText/Write/WriteImpl/PreFinalize
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:10:49.615Z: JOB_MESSAGE_BASIC: Finished operation Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_2#vocabulary]/WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input1-Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_2#vocabulary]/WriteToText/Write/WriteImpl/FinalizeWrite
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:10:49.640Z: JOB_MESSAGE_BASIC: Finished operation Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_2#vocabulary]/WriteToText/Write/WriteImpl/PreFinalize/View-python_side_input1-Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_2#vocabulary]/WriteToText/Write/WriteImpl/PreFinalize
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:10:49.673Z: JOB_MESSAGE_DEBUG: Value "Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_2#vocabulary]/WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input1-Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_2#vocabulary]/WriteToText/Write/WriteImpl/FinalizeWrite.out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:10:49.696Z: JOB_MESSAGE_DEBUG: Value "Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_2#vocabulary]/WriteToText/Write/WriteImpl/PreFinalize/View-python_side_input1-Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_2#vocabulary]/WriteToText/Write/WriteImpl/PreFinalize.out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:10:49.740Z: JOB_MESSAGE_BASIC: Executing operation Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_2#vocabulary]/WriteToText/Write/WriteImpl/PreFinalize
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:10:51.542Z: JOB_MESSAGE_BASIC: Finished operation Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_2#vocabulary]/WriteToText/Write/WriteImpl/PreFinalize
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:10:51.593Z: JOB_MESSAGE_DEBUG: Value "Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_2#vocabulary]/WriteToText/Write/WriteImpl/PreFinalize.None" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:10:51.635Z: JOB_MESSAGE_BASIC: Executing operation Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_2#vocabulary]/WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input2-Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_2#vocabulary]/WriteToText/Write/WriteImpl/FinalizeWrite
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:10:51.684Z: JOB_MESSAGE_BASIC: Finished operation Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_2#vocabulary]/WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input2-Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_2#vocabulary]/WriteToText/Write/WriteImpl/FinalizeWrite
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:10:51.738Z: JOB_MESSAGE_DEBUG: Value "Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_2#vocabulary]/WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input2-Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_2#vocabulary]/WriteToText/Write/WriteImpl/FinalizeWrite.out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:10:51.797Z: JOB_MESSAGE_BASIC: Executing operation Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_2#vocabulary]/WriteToText/Write/WriteImpl/FinalizeWrite
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:10:54.205Z: JOB_MESSAGE_BASIC: Finished operation Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_2#vocabulary]/WriteToText/Write/WriteImpl/FinalizeWrite
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:10:54.252Z: JOB_MESSAGE_DEBUG: Value "Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_2#vocabulary]/WriteToText/Write/WriteImpl/FinalizeWrite.None" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:10:54.310Z: JOB_MESSAGE_BASIC: Executing operation Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_2#vocabulary]/WaitForVocabularyFile/View-python_side_input0-Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_2#vocabulary]/WaitForVocabularyFile
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:10:54.354Z: JOB_MESSAGE_BASIC: Finished operation Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_2#vocabulary]/WaitForVocabularyFile/View-python_side_input0-Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_2#vocabulary]/WaitForVocabularyFile
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:10:54.416Z: JOB_MESSAGE_DEBUG: Value "Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_2#vocabulary]/WaitForVocabularyFile/View-python_side_input0-Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_2#vocabulary]/WaitForVocabularyFile.out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:10:54.463Z: JOB_MESSAGE_BASIC: Executing operation Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_2#vocabulary]/CreatePath/Impulse+Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_2#vocabulary]/CreatePath/FlatMap(<lambda at core.py:3722>)+Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_2#vocabulary]/CreatePath/Map(decode)+Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_2#vocabulary]/WaitForVocabularyFile+Analyze/CreateTensorBinding[compute_and_apply_vocabulary_2#vocabulary#temporary_analyzer_output_2#Const]/ToTensorBinding
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:10:55.640Z: JOB_MESSAGE_BASIC: Finished operation Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_2#vocabulary]/CreatePath/Impulse+Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_2#vocabulary]/CreatePath/FlatMap(<lambda at core.py:3722>)+Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_2#vocabulary]/CreatePath/Map(decode)+Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_2#vocabulary]/WaitForVocabularyFile+Analyze/CreateTensorBinding[compute_and_apply_vocabulary_2#vocabulary#temporary_analyzer_output_2#Const]/ToTensorBinding
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:10:55.698Z: JOB_MESSAGE_DEBUG: Value "Analyze/CreateTensorBinding[compute_and_apply_vocabulary_2#vocabulary#temporary_analyzer_output_2#Const]/ToTensorBinding.None" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:10:55.754Z: JOB_MESSAGE_BASIC: Executing operation Analyze/CreateSavedModel[tf_v2_only]/ReplaceWithConstants/View-python_side_input9-Analyze/CreateSavedModel[tf_v2_only]/ReplaceWithConstants
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:10:55.810Z: JOB_MESSAGE_BASIC: Finished operation Analyze/CreateSavedModel[tf_v2_only]/ReplaceWithConstants/View-python_side_input9-Analyze/CreateSavedModel[tf_v2_only]/ReplaceWithConstants
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:10:55.863Z: JOB_MESSAGE_DEBUG: Value "Analyze/CreateSavedModel[tf_v2_only]/ReplaceWithConstants/View-python_side_input9-Analyze/CreateSavedModel[tf_v2_only]/ReplaceWithConstants.out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:11:52.395Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 26 based on the rate of progress in the currently running stage(s).
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:11:59.586Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:11:59.609Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:12:04.637Z: JOB_MESSAGE_DETAILED: Autoscaling: Resizing worker pool from 30 to 1.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:15:56.691Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:25:54.559Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:35:55.815Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:41:54.130Z: JOB_MESSAGE_BASIC: Finished operation Analyze/VocabularyMerge[compute_and_apply_vocabulary_11#vocabulary]/MergeCountPerToken/GroupByKey/Read+Analyze/VocabularyMerge[compute_and_apply_vocabulary_11#vocabulary]/MergeCountPerToken/Combine+Analyze/VocabularyMerge[compute_and_apply_vocabulary_11#vocabulary]/MergeCountPerToken/Combine/Extract+Analyze/VocabularyMerge[compute_and_apply_vocabulary_11#vocabulary]/SwapTokensAndCounts/KvSwap+Analyze/VocabularyCountUnfiltered[compute_and_apply_vocabulary_11#vocabulary]/TotalVocabSize/CombineGlobally(CountCombineFn)/KeyWithVoid+Analyze/VocabularyPrune[compute_and_apply_vocabulary_11#vocabulary]/ApplyThresholdsAndTopK/FilterByThresholds(5.0)+Analyze/VocabularyCountUnfiltered[compute_and_apply_vocabulary_11#vocabulary]/TotalVocabSize/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+Analyze/VocabularyCountUnfiltered[compute_and_apply_vocabulary_11#vocabulary]/TotalVocabSize/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial+Analyze/VocabularyCountUnfiltered[compute_and_apply_vocabulary_11#vocabulary]/TotalVocabSize/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Write+Analyze/VocabularyPrune[compute_and_apply_vocabulary_11#vocabulary]/ApplyThresholdsAndTopK/FlattenToSingleMetric+Analyze/VocabularyCountFiltered[compute_and_apply_vocabulary_11#vocabulary]/TotalVocabSize/CombineGlobally(CountCombineFn)/KeyWithVoid+Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_11#vocabulary]/BatchAndPreSort/BatchVocabulary/ParDo(_GlobalWindowsBatchingDoFn)+Analyze/VocabularyCountFiltered[compute_and_apply_vocabulary_11#vocabulary]/TotalVocabSize/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+Analyze/VocabularyCountFiltered[compute_and_apply_vocabulary_11#vocabulary]/TotalVocabSize/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial+Analyze/VocabularyCountFiltered[compute_and_apply_vocabulary_11#vocabulary]/TotalVocabSize/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Write+Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_11#vocabulary]/BatchAndPreSort/SortBatches
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:41:54.236Z: JOB_MESSAGE_DEBUG: Executing failure step failure2106
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:41:54.258Z: JOB_MESSAGE_ERROR: Workflow failed.
Causes: S322:Analyze/VocabularyMerge[compute_and_apply_vocabulary_11#vocabulary]/MergeCountPerToken/GroupByKey/Read+Analyze/VocabularyMerge[compute_and_apply_vocabulary_11#vocabulary]/MergeCountPerToken/Combine+Analyze/VocabularyMerge[compute_and_apply_vocabulary_11#vocabulary]/MergeCountPerToken/Combine/Extract+Analyze/VocabularyMerge[compute_and_apply_vocabulary_11#vocabulary]/SwapTokensAndCounts/KvSwap+Analyze/VocabularyCountUnfiltered[compute_and_apply_vocabulary_11#vocabulary]/TotalVocabSize/CombineGlobally(CountCombineFn)/KeyWithVoid+Analyze/VocabularyPrune[compute_and_apply_vocabulary_11#vocabulary]/ApplyThresholdsAndTopK/FilterByThresholds(5.0)+Analyze/VocabularyCountUnfiltered[compute_and_apply_vocabulary_11#vocabulary]/TotalVocabSize/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+Analyze/VocabularyCountUnfiltered[compute_and_apply_vocabulary_11#vocabulary]/TotalVocabSize/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial+Analyze/VocabularyCountUnfiltered[compute_and_apply_vocabulary_11#vocabulary]/TotalVocabSize/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Write+Analyze/VocabularyPrune[compute_and_apply_vocabulary_11#vocabulary]/ApplyThresholdsAndTopK/FlattenToSingleMetric+Analyze/VocabularyCountFiltered[compute_and_apply_vocabulary_11#vocabulary]/TotalVocabSize/CombineGlobally(CountCombineFn)/KeyWithVoid+Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_11#vocabulary]/BatchAndPreSort/BatchVocabulary/ParDo(_GlobalWindowsBatchingDoFn)+Analyze/VocabularyCountFiltered[compute_and_apply_vocabulary_11#vocabulary]/TotalVocabSize/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+Analyze/VocabularyCountFiltered[compute_and_apply_vocabulary_11#vocabulary]/TotalVocabSize/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial+Analyze/VocabularyCountFiltered[compute_and_apply_vocabulary_11#vocabulary]/TotalVocabSize/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Write+Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_11#vocabulary]/BatchAndPreSort/SortBatches failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. If the logs only contain generic timeout errors related to accessing external resources, such as MongoDB, verify that the worker service account has permission to access the resource's subnetwork. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was attempted on these workers:
Root cause: Timed out waiting for an update from the worker. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors#worker-lost-contact. Worker ID: beamapp-jenkins-070100491-06301749-ocku-harness-k2td,
Root cause: Timed out waiting for an update from the worker. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors#worker-lost-contact. Worker ID: beamapp-jenkins-070100491-06301749-ocku-harness-fj0g,
Root cause: Timed out waiting for an update from the worker. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors#worker-lost-contact. Worker ID: beamapp-jenkins-070100491-06301749-ocku-harness-tddt,
Root cause: Timed out waiting for an update from the worker. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors#worker-lost-contact. Worker ID: beamapp-jenkins-070100491-06301749-ocku-harness-d29t
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:41:54.489Z: JOB_MESSAGE_WARNING: Unable to delete temp files: "gs://temp-storage-for-perf-tests/loadtests/beamapp-jenkins-0701004916-340916-xnthvafm.1688172556.341110/dax-tmp-2023-06-30_17_49_41-9996984402569670720-S652-0-24b0e8d8f1f0f7b8/[email protected]."
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:41:54.523Z: JOB_MESSAGE_WARNING: S652:Analyze/VocabularyMerge[compute_and_apply_vocabulary_20#vocabulary]/MergeCountPerToken/GroupByKey/Read+Analyze/VocabularyMerge[compute_and_apply_vocabulary_20#vocabulary]/MergeCountPerToken/Combine+Analyze/VocabularyMerge[compute_and_apply_vocabulary_20#vocabulary]/MergeCountPerToken/Combine/Extract+Analyze/VocabularyMerge[compute_and_apply_vocabulary_20#vocabulary]/SwapTokensAndCounts/KvSwap+Analyze/VocabularyCountUnfiltered[compute_and_apply_vocabulary_20#vocabulary]/TotalVocabSize/CombineGlobally(CountCombineFn)/KeyWithVoid+Analyze/VocabularyPrune[compute_and_apply_vocabulary_20#vocabulary]/ApplyThresholdsAndTopK/FilterByThresholds(5.0)+Analyze/VocabularyCountUnfiltered[compute_and_apply_vocabulary_20#vocabulary]/TotalVocabSize/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+Analyze/VocabularyCountUnfiltered[compute_and_apply_vocabulary_20#vocabulary]/TotalVocabSize/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial+Analyze/VocabularyCountUnfiltered[compute_and_apply_vocabulary_20#vocabulary]/TotalVocabSize/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Write+Analyze/VocabularyPrune[compute_and_apply_vocabulary_20#vocabulary]/ApplyThresholdsAndTopK/FlattenToSingleMetric+Analyze/VocabularyCountFiltered[compute_and_apply_vocabulary_20#vocabulary]/TotalVocabSize/CombineGlobally(CountCombineFn)/KeyWithVoid+Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_20#vocabulary]/BatchAndPreSort/BatchVocabulary/ParDo(_GlobalWindowsBatchingDoFn)+Analyze/VocabularyCountFiltered[compute_and_apply_vocabulary_20#vocabulary]/TotalVocabSize/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+Analyze/VocabularyCountFiltered[compute_and_apply_vocabulary_20#vocabulary]/TotalVocabSize/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial+Analyze/VocabularyCountFiltered[compute_and_apply_vocabulary_20#vocabulary]/TotalVocabSize/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Write+Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_20#vocabulary]/BatchAndPreSort/SortBatches failed.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:41:54.564Z: JOB_MESSAGE_BASIC: Finished operation Analyze/VocabularyMerge[compute_and_apply_vocabulary_20#vocabulary]/MergeCountPerToken/GroupByKey/Read+Analyze/VocabularyMerge[compute_and_apply_vocabulary_20#vocabulary]/MergeCountPerToken/Combine+Analyze/VocabularyMerge[compute_and_apply_vocabulary_20#vocabulary]/MergeCountPerToken/Combine/Extract+Analyze/VocabularyMerge[compute_and_apply_vocabulary_20#vocabulary]/SwapTokensAndCounts/KvSwap+Analyze/VocabularyCountUnfiltered[compute_and_apply_vocabulary_20#vocabulary]/TotalVocabSize/CombineGlobally(CountCombineFn)/KeyWithVoid+Analyze/VocabularyPrune[compute_and_apply_vocabulary_20#vocabulary]/ApplyThresholdsAndTopK/FilterByThresholds(5.0)+Analyze/VocabularyCountUnfiltered[compute_and_apply_vocabulary_20#vocabulary]/TotalVocabSize/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+Analyze/VocabularyCountUnfiltered[compute_and_apply_vocabulary_20#vocabulary]/TotalVocabSize/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial+Analyze/VocabularyCountUnfiltered[compute_and_apply_vocabulary_20#vocabulary]/TotalVocabSize/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Write+Analyze/VocabularyPrune[compute_and_apply_vocabulary_20#vocabulary]/ApplyThresholdsAndTopK/FlattenToSingleMetric+Analyze/VocabularyCountFiltered[compute_and_apply_vocabulary_20#vocabulary]/TotalVocabSize/CombineGlobally(CountCombineFn)/KeyWithVoid+Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_20#vocabulary]/BatchAndPreSort/BatchVocabulary/ParDo(_GlobalWindowsBatchingDoFn)+Analyze/VocabularyCountFiltered[compute_and_apply_vocabulary_20#vocabulary]/TotalVocabSize/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+Analyze/VocabularyCountFiltered[compute_and_apply_vocabulary_20#vocabulary]/TotalVocabSize/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial+Analyze/VocabularyCountFiltered[compute_and_apply_vocabulary_20#vocabulary]/TotalVocabSize/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Write+Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_20#vocabulary]/BatchAndPreSort/SortBatches
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:41:54.889Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:41:55.315Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:41:55.346Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:44:45.945Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:44:46.005Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-07-01T01:44:46.034Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:193 Job 2023-06-30_17_49_41-9996984402569670720 is in state JOB_STATE_FAILED
ERROR apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:1563 Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-06-30_17_49_41-9996984402569670720?project=<ProjectId>
=============================== warnings summary ===============================
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/hdfs/config.py:15
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/hdfs/config.py:15
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/hdfs/config.py:15
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/hdfs/config.py:15
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/hdfs/config.py:15
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/hdfs/config.py:15
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/hdfs/config.py:15
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/hdfs/config.py:15
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/hdfs/config.py:15
  <https://ci-beam.apache.org/job/beam_CloudML_Benchmarks_Dataflow/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/hdfs/config.py>:15: DeprecationWarning: the imp module is deprecated in favour of importlib; see the module's documentation for alternative uses
    from imp import load_source
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/google/rpc/__init__.py:18
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/google/rpc/__init__.py:18
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/google/rpc/__init__.py:18
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/google/rpc/__init__.py:18
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/google/rpc/__init__.py:18
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/google/rpc/__init__.py:18
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/google/rpc/__init__.py:18
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/google/rpc/__init__.py:18
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/google/rpc/__init__.py:18
  <https://ci-beam.apache.org/job/beam_CloudML_Benchmarks_Dataflow/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/google/rpc/__init__.py>:18: DeprecationWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html
    import pkg_resources
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/pkg_resources/__init__.py:2871: 180 warnings
  <https://ci-beam.apache.org/job/beam_CloudML_Benchmarks_Dataflow/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/pkg_resources/__init__.py>:2871: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google')`. Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(pkg)
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/pkg_resources/__init__.py:2871: 144 warnings
  <https://ci-beam.apache.org/job/beam_CloudML_Benchmarks_Dataflow/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/pkg_resources/__init__.py>:2871: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google.cloud')`. Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(pkg)
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/pkg_resources/__init__.py:2350: 36 warnings
  <https://ci-beam.apache.org/job/beam_CloudML_Benchmarks_Dataflow/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/pkg_resources/__init__.py>:2350: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google')`. Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(parent)
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/pkg_resources/__init__.py:2871
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/pkg_resources/__init__.py:2871
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/pkg_resources/__init__.py:2871
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/pkg_resources/__init__.py:2871
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/pkg_resources/__init__.py:2871
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/pkg_resources/__init__.py:2871
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/pkg_resources/__init__.py:2871
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/pkg_resources/__init__.py:2871
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/pkg_resources/__init__.py:2871
  <https://ci-beam.apache.org/job/beam_CloudML_Benchmarks_Dataflow/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/pkg_resources/__init__.py>:2871: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google.logging')`. Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(pkg)
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/pkg_resources/__init__.py:2871
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/pkg_resources/__init__.py:2871
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/pkg_resources/__init__.py:2871
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/pkg_resources/__init__.py:2871
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/pkg_resources/__init__.py:2871
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/pkg_resources/__init__.py:2871
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/pkg_resources/__init__.py:2871
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/pkg_resources/__init__.py:2871
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/pkg_resources/__init__.py:2871
  <https://ci-beam.apache.org/job/beam_CloudML_Benchmarks_Dataflow/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/pkg_resources/__init__.py>:2871: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google.iam')`. Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(pkg)
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/google/rpc/__init__.py:20
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/google/rpc/__init__.py:20
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/google/rpc/__init__.py:20
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/google/rpc/__init__.py:20
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/google/rpc/__init__.py:20
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/google/rpc/__init__.py:20
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/google/rpc/__init__.py:20
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/google/rpc/__init__.py:20
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/google/rpc/__init__.py:20
  <https://ci-beam.apache.org/job/beam_CloudML_Benchmarks_Dataflow/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/google/rpc/__init__.py>:20: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google.rpc')`. Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    pkg_resources.declare_namespace(__name__)
apache_beam/typehints/pandas_type_compatibility_test.py:67
apache_beam/typehints/pandas_type_compatibility_test.py:67
apache_beam/typehints/pandas_type_compatibility_test.py:67
apache_beam/typehints/pandas_type_compatibility_test.py:67
apache_beam/typehints/pandas_type_compatibility_test.py:67
apache_beam/typehints/pandas_type_compatibility_test.py:67
apache_beam/typehints/pandas_type_compatibility_test.py:67
apache_beam/typehints/pandas_type_compatibility_test.py:67
  <https://ci-beam.apache.org/job/beam_CloudML_Benchmarks_Dataflow/ws/src/sdks/python/apache_beam/typehints/pandas_type_compatibility_test.py>:67: FutureWarning: pandas.Int64Index is deprecated and will be removed from pandas in a future version. Use pandas.Index with the appropriate dtype instead.
    }).set_index(pd.Int64Index(range(123, 223), name='an_index')),
apache_beam/typehints/pandas_type_compatibility_test.py:90
apache_beam/typehints/pandas_type_compatibility_test.py:90
apache_beam/typehints/pandas_type_compatibility_test.py:90
apache_beam/typehints/pandas_type_compatibility_test.py:90
apache_beam/typehints/pandas_type_compatibility_test.py:90
apache_beam/typehints/pandas_type_compatibility_test.py:90
apache_beam/typehints/pandas_type_compatibility_test.py:90
apache_beam/typehints/pandas_type_compatibility_test.py:90
  <https://ci-beam.apache.org/job/beam_CloudML_Benchmarks_Dataflow/ws/src/sdks/python/apache_beam/typehints/pandas_type_compatibility_test.py>:90: FutureWarning: pandas.Int64Index is deprecated and will be removed from pandas in a future version. Use pandas.Index with the appropriate dtype instead.
    pd.Int64Index(range(123, 223), name='an_index'),
apache_beam/typehints/pandas_type_compatibility_test.py:91
apache_beam/typehints/pandas_type_compatibility_test.py:91
apache_beam/typehints/pandas_type_compatibility_test.py:91
apache_beam/typehints/pandas_type_compatibility_test.py:91
apache_beam/typehints/pandas_type_compatibility_test.py:91
apache_beam/typehints/pandas_type_compatibility_test.py:91
apache_beam/typehints/pandas_type_compatibility_test.py:91
apache_beam/typehints/pandas_type_compatibility_test.py:91
  <https://ci-beam.apache.org/job/beam_CloudML_Benchmarks_Dataflow/ws/src/sdks/python/apache_beam/typehints/pandas_type_compatibility_test.py>:91: FutureWarning: pandas.Int64Index is deprecated and will be removed from pandas in a future version. Use pandas.Index with the appropriate dtype instead.
    pd.Int64Index(range(475, 575), name='another_index'),
apache_beam/testing/benchmarks/cloudml/cloudml_benchmark_test.py::CloudMLTFTBenchmarkTest::test_cloudml_benchmark_criteo_10GB
apache_beam/testing/benchmarks/cloudml/cloudml_benchmark_test.py::CloudMLTFTBenchmarkTest::test_cloudml_benchmark_criteo_small
  <https://ci-beam.apache.org/job/beam_CloudML_Benchmarks_Dataflow/ws/src/sdks/python/apache_beam/testing/load_tests/load_test_metrics_utils.py>:484: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    bq_dataset_ref = self._client.dataset(dataset_name)
-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_CloudML_Benchmarks_Dataflow/ws/src/sdks/python/pytest_TFTransformTests-df-py39.xml> -
=========================== short test summary info ============================
FAILED apache_beam/testing/benchmarks/cloudml/cloudml_benchmark_test.py::CloudMLTFTBenchmarkTest::test_cloudml_benchmark_cirteo_no_shuffle_10GB - apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error: Workflow failed.
Causes: S322:Analyze/VocabularyMerge[compute_and_apply_vocabulary_11#vocabulary]/MergeCountPerToken/GroupByKey/Read+Analyze/VocabularyMerge[compute_and_apply_vocabulary_11#vocabulary]/MergeCountPerToken/Combine+Analyze/VocabularyMerge[compute_and_apply_vocabulary_11#vocabulary]/MergeCountPerToken/Combine/Extract+Analyze/VocabularyMerge[compute_and_apply_vocabulary_11#vocabulary]/SwapTokensAndCounts/KvSwap+Analyze/VocabularyCountUnfiltered[compute_and_apply_vocabulary_11#vocabulary]/TotalVocabSize/CombineGlobally(CountCombineFn)/KeyWithVoid+Analyze/VocabularyPrune[compute_and_apply_vocabulary_11#vocabulary]/ApplyThresholdsAndTopK/FilterByThresholds(5.0)+Analyze/VocabularyCountUnfiltered[compute_and_apply_vocabulary_11#vocabulary]/TotalVocabSize/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+Analyze/VocabularyCountUnfiltered[compute_and_apply_vocabulary_11#vocabulary]/TotalVocabSize/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial+Analyze/VocabularyCountUnfiltered[compute_and_apply_vocabulary_11#vocabulary]/TotalVocabSize/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Write+Analyze/VocabularyPrune[compute_and_apply_vocabulary_11#vocabulary]/ApplyThresholdsAndTopK/FlattenToSingleMetric+Analyze/VocabularyCountFiltered[compute_and_apply_vocabulary_11#vocabulary]/TotalVocabSize/CombineGlobally(CountCombineFn)/KeyWithVoid+Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_11#vocabulary]/BatchAndPreSort/BatchVocabulary/ParDo(_GlobalWindowsBatchingDoFn)+Analyze/VocabularyCountFiltered[compute_and_apply_vocabulary_11#vocabulary]/TotalVocabSize/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+Analyze/VocabularyCountFiltered[compute_and_apply_vocabulary_11#vocabulary]/TotalVocabSize/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial+Analyze/VocabularyCountFiltered[compute_and_apply_vocabulary_11#vocabulary]/TotalVocabSize/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Write+Analyze/VocabularyOrderAndWrite[compute_and_apply_vocabulary_11#vocabulary]/BatchAndPreSort/SortBatches failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. If the logs only contain generic timeout errors related to accessing external resources, such as MongoDB, verify that the worker service account has permission to access the resource's subnetwork. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was attempted on these workers:
Root cause: Timed out waiting for an update from the worker. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors#worker-lost-contact. Worker ID: beamapp-jenkins-070100491-06301749-ocku-harness-k2td,
Root cause: Timed out waiting for an update from the worker. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors#worker-lost-contact. Worker ID: beamapp-jenkins-070100491-06301749-ocku-harness-fj0g,
Root cause: Timed out waiting for an update from the worker. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors#worker-lost-contact. Worker ID: beamapp-jenkins-070100491-06301749-ocku-harness-tddt,
Root cause: Timed out waiting for an update from the worker. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors#worker-lost-contact. Worker ID: beamapp-jenkins-070100491-06301749-ocku-harness-d29t
====== 1 failed, 2 passed, 8 skipped, 431 warnings in 6914.14s (1:55:14) =======

> Task :sdks:python:test-suites:dataflow:py39:tftTests FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_CloudML_Benchmarks_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 428

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py39:tftTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.
You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.
See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 57m 46s
16 actionable tasks: 10 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://ge.apache.org/s/ujib3qgmejm6y

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
