See <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/1163/display/redirect>
Changes: ------------------------------------------ [...truncated 103.99 KB...] [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:49:27.924Z: JOB_MESSAGE_DETAILED: Fusing consumer EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write into EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:49:27.961Z: JOB_MESSAGE_DETAILED: Fusing consumer EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow into EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:49:27.994Z: JOB_MESSAGE_DETAILED: Fusing consumer EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:49:28.026Z: JOB_MESSAGE_DETAILED: Fusing consumer EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/RemoveRandomKeys into EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:49:28.061Z: JOB_MESSAGE_DETAILED: Fusing consumer EstimatePiTransform/Initialize/Map(decode) into EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/RemoveRandomKeys [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:49:28.093Z: JOB_MESSAGE_DETAILED: Fusing consumer EstimatePiTransform/Run trials into EstimatePiTransform/Initialize/Map(decode) [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 
2022-11-25T07:49:28.116Z: JOB_MESSAGE_DETAILED: Fusing consumer EstimatePiTransform/Sum/KeyWithVoid into EstimatePiTransform/Run trials [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:49:28.146Z: JOB_MESSAGE_DETAILED: Fusing consumer EstimatePiTransform/Sum/CombinePerKey/GroupByKey+EstimatePiTransform/Sum/CombinePerKey/Combine/Partial into EstimatePiTransform/Sum/KeyWithVoid [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:49:28.179Z: JOB_MESSAGE_DETAILED: Fusing consumer EstimatePiTransform/Sum/CombinePerKey/GroupByKey/Write into EstimatePiTransform/Sum/CombinePerKey/GroupByKey+EstimatePiTransform/Sum/CombinePerKey/Combine/Partial [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:49:28.216Z: JOB_MESSAGE_DETAILED: Fusing consumer EstimatePiTransform/Sum/CombinePerKey/Combine into EstimatePiTransform/Sum/CombinePerKey/GroupByKey/Read [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:49:28.248Z: JOB_MESSAGE_DETAILED: Fusing consumer EstimatePiTransform/Sum/CombinePerKey/Combine/Extract into EstimatePiTransform/Sum/CombinePerKey/Combine [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:49:28.276Z: JOB_MESSAGE_DETAILED: Fusing consumer EstimatePiTransform/Sum/UnKey into EstimatePiTransform/Sum/CombinePerKey/Combine/Extract [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:49:28.301Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToText/Write/WriteImpl/WindowInto(WindowIntoFn) into EstimatePiTransform/Sum/UnKey [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:49:28.337Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToText/Write/WriteImpl/WriteBundles into WriteToText/Write/WriteImpl/WindowInto(WindowIntoFn) [32mINFO [0m 
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:49:28.362Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToText/Write/WriteImpl/Pair into WriteToText/Write/WriteImpl/WriteBundles [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:49:28.385Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToText/Write/WriteImpl/GroupByKey/Write into WriteToText/Write/WriteImpl/Pair [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:49:28.415Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToText/Write/WriteImpl/Extract into WriteToText/Write/WriteImpl/GroupByKey/Read [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:49:28.461Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:49:28.499Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:49:28.524Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:49:28.551Z: JOB_MESSAGE_DEBUG: Assigning stage ids. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:49:28.756Z: JOB_MESSAGE_DEBUG: Executing wait step start37 [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:49:28.841Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/DoOnce/Impulse+WriteToText/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:3507>)+WriteToText/Write/WriteImpl/DoOnce/Map(decode)+WriteToText/Write/WriteImpl/InitializeWrite [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:49:28.891Z: JOB_MESSAGE_DEBUG: Starting worker pool setup. 
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:49:28.919Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b... [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:49:28.930Z: JOB_MESSAGE_BASIC: Executing operation EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:49:28.971Z: JOB_MESSAGE_BASIC: Executing operation EstimatePiTransform/Sum/CombinePerKey/GroupByKey/Create [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:49:28.991Z: JOB_MESSAGE_BASIC: Finished operation EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:49:29.008Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/GroupByKey/Create [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:49:29.027Z: JOB_MESSAGE_BASIC: Finished operation EstimatePiTransform/Sum/CombinePerKey/GroupByKey/Create [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:49:29.066Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/GroupByKey/Create [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:49:29.069Z: JOB_MESSAGE_DEBUG: Value "EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Session" materialized. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:49:29.107Z: JOB_MESSAGE_DEBUG: Value "EstimatePiTransform/Sum/CombinePerKey/GroupByKey/Session" materialized. 
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:49:29.146Z: JOB_MESSAGE_BASIC: Executing operation EstimatePiTransform/Initialize/Impulse+EstimatePiTransform/Initialize/FlatMap(<lambda at core.py:3507>)+EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/AddRandomKeys+EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:49:29.194Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/GroupByKey/Session" materialized. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:49:55.179Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:50:08.149Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s). [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:50:28.827Z: JOB_MESSAGE_DETAILED: Workers have started successfully. 
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:57:11.416Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:57:12.642Z: JOB_MESSAGE_BASIC: Finished operation EstimatePiTransform/Initialize/Impulse+EstimatePiTransform/Initialize/FlatMap(<lambda at core.py:3507>)+EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/AddRandomKeys+EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:57:12.705Z: JOB_MESSAGE_BASIC: Executing operation EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Close [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:57:12.759Z: JOB_MESSAGE_BASIC: Finished operation EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Close [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:57:12.823Z: JOB_MESSAGE_BASIC: Executing operation EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read+EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/RemoveRandomKeys+EstimatePiTransform/Initialize/Map(decode)+EstimatePiTransform/Run 
trials+EstimatePiTransform/Sum/KeyWithVoid+EstimatePiTransform/Sum/CombinePerKey/GroupByKey+EstimatePiTransform/Sum/CombinePerKey/Combine/Partial+EstimatePiTransform/Sum/CombinePerKey/GroupByKey/Write [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:57:13.719Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/DoOnce/Impulse+WriteToText/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:3507>)+WriteToText/Write/WriteImpl/DoOnce/Map(decode)+WriteToText/Write/WriteImpl/InitializeWrite [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:57:13.775Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/DoOnce/Map(decode).None" materialized. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:57:13.807Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/InitializeWrite.None" materialized. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:57:13.876Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/WriteBundles/View-python_side_input0-WriteToText/Write/WriteImpl/WriteBundles [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:57:13.901Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input0-WriteToText/Write/WriteImpl/FinalizeWrite [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:57:13.921Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/PreFinalize/View-python_side_input0-WriteToText/Write/WriteImpl/PreFinalize [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:57:13.930Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/WriteBundles/View-python_side_input0-WriteToText/Write/WriteImpl/WriteBundles [32mINFO [0m 
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:57:13.963Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input0-WriteToText/Write/WriteImpl/FinalizeWrite [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:57:13.974Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/PreFinalize/View-python_side_input0-WriteToText/Write/WriteImpl/PreFinalize [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:57:13.992Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/WriteBundles/View-python_side_input0-WriteToText/Write/WriteImpl/WriteBundles.out" materialized. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:57:14.025Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input0-WriteToText/Write/WriteImpl/FinalizeWrite.out" materialized. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:57:14.052Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/PreFinalize/View-python_side_input0-WriteToText/Write/WriteImpl/PreFinalize.out" materialized. 
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:57:22.907Z: JOB_MESSAGE_BASIC: Finished operation EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read+EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/RemoveRandomKeys+EstimatePiTransform/Initialize/Map(decode)+EstimatePiTransform/Run trials+EstimatePiTransform/Sum/KeyWithVoid+EstimatePiTransform/Sum/CombinePerKey/GroupByKey+EstimatePiTransform/Sum/CombinePerKey/Combine/Partial+EstimatePiTransform/Sum/CombinePerKey/GroupByKey/Write [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:57:22.964Z: JOB_MESSAGE_BASIC: Executing operation EstimatePiTransform/Sum/CombinePerKey/GroupByKey/Close [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:57:23.010Z: JOB_MESSAGE_BASIC: Finished operation EstimatePiTransform/Sum/CombinePerKey/GroupByKey/Close [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:57:23.070Z: JOB_MESSAGE_BASIC: Executing operation EstimatePiTransform/Sum/CombinePerKey/GroupByKey/Read+EstimatePiTransform/Sum/CombinePerKey/Combine+EstimatePiTransform/Sum/CombinePerKey/Combine/Extract+EstimatePiTransform/Sum/UnKey+WriteToText/Write/WriteImpl/WindowInto(WindowIntoFn)+WriteToText/Write/WriteImpl/WriteBundles+WriteToText/Write/WriteImpl/Pair+WriteToText/Write/WriteImpl/GroupByKey/Write [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:57:24.831Z: JOB_MESSAGE_BASIC: Finished operation 
EstimatePiTransform/Sum/CombinePerKey/GroupByKey/Read+EstimatePiTransform/Sum/CombinePerKey/Combine+EstimatePiTransform/Sum/CombinePerKey/Combine/Extract+EstimatePiTransform/Sum/UnKey+WriteToText/Write/WriteImpl/WindowInto(WindowIntoFn)+WriteToText/Write/WriteImpl/WriteBundles+WriteToText/Write/WriteImpl/Pair+WriteToText/Write/WriteImpl/GroupByKey/Write [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:57:24.899Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/GroupByKey/Close [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:57:24.943Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/GroupByKey/Close [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:57:25.015Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/GroupByKey/Read+WriteToText/Write/WriteImpl/Extract [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:57:27.558Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/GroupByKey/Read+WriteToText/Write/WriteImpl/Extract [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:57:27.628Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/Extract.None" materialized. 
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:57:27.699Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input1-WriteToText/Write/WriteImpl/FinalizeWrite [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:57:27.737Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/PreFinalize/View-python_side_input1-WriteToText/Write/WriteImpl/PreFinalize [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:57:27.766Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input1-WriteToText/Write/WriteImpl/FinalizeWrite [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:57:27.803Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/PreFinalize/View-python_side_input1-WriteToText/Write/WriteImpl/PreFinalize [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:57:27.838Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input1-WriteToText/Write/WriteImpl/FinalizeWrite.out" materialized. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:57:27.875Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/PreFinalize/View-python_side_input1-WriteToText/Write/WriteImpl/PreFinalize.out" materialized. 
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:57:27.944Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/PreFinalize [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:57:31.124Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/PreFinalize [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:57:31.201Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/PreFinalize.None" materialized. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:57:31.263Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input2-WriteToText/Write/WriteImpl/FinalizeWrite [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:57:31.325Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input2-WriteToText/Write/WriteImpl/FinalizeWrite [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:57:31.397Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input2-WriteToText/Write/WriteImpl/FinalizeWrite.out" materialized. 
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:57:31.463Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/FinalizeWrite [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:57:33.929Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/FinalizeWrite [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:57:33.990Z: JOB_MESSAGE_DEBUG: Executing success step success35 [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:57:34.056Z: JOB_MESSAGE_DETAILED: Cleaning up. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:57:34.220Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:57:34.255Z: JOB_MESSAGE_BASIC: Stopping worker pool... [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:59:48.668Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:59:48.726Z: JOB_MESSAGE_BASIC: Worker pool stopped. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T07:59:48.766Z: JOB_MESSAGE_DEBUG: Tearing down pending resources... [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:198 Job 2022-11-24_23_49_23-17642935499717746996 is in state JOB_STATE_DONE [32mINFO [0m apache_beam.io.gcp.gcsio:gcsio.py:615 Finished listing 1 files in 0.04657483100891113 seconds. 
[32mPASSED[0m apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input [1m-------------------------------- live log call ---------------------------------[0m [32mINFO [0m apache_beam.runners.portability.stager:stager.py:780 Executing command: ['/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python_Examples_Dataflow/src/build/gradleenv/2050596098/bin/python3.10', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', '/tmp/tmp9zkq1bn8/tmp_requirements.txt', '--exists-action', 'i', '--no-deps', '--implementation', 'cp', '--abi', 'cp310', '--platform', 'manylinux2014_x86_64'] [32mINFO [0m apache_beam.runners.portability.stager:stager.py:330 Copying Beam SDK "/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python_Examples_Dataflow/src/sdks/python/build/apache-beam.tar.gz" to staging location. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:484 Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. 
Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild [32mINFO [0m root:environments.py:376 Default Python SDK image for environment is apache/beam_python3.10_sdk:2.44.0.dev [32mINFO [0m root:environments.py:295 Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python310-fnapi:beam-master-20221122 [32mINFO [0m root:environments.py:302 Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python310-fnapi:beam-master-20221122" for Docker environment [32mINFO [0m apache_beam.runners.portability.fn_api_runner.translations:translations.py:710 ==================== <function pack_combiners at 0x7fe310f663b0> ==================== [32mINFO [0m apache_beam.runners.portability.fn_api_runner.translations:translations.py:710 ==================== <function sort_stages at 0x7fe310f66b90> ==================== [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1125080002-677598-49mrzpk0.1669363202.677758/requirements.txt... [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:748 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1125080002-677598-49mrzpk0.1669363202.677758/requirements.txt in 0 seconds. [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1125080002-677598-49mrzpk0.1669363202.677758/pickled_main_session... [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:748 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1125080002-677598-49mrzpk0.1669363202.677758/pickled_main_session in 0 seconds. 
[32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1125080002-677598-49mrzpk0.1669363202.677758/mock-2.0.0-py2.py3-none-any.whl... [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:748 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1125080002-677598-49mrzpk0.1669363202.677758/mock-2.0.0-py2.py3-none-any.whl in 0 seconds. [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1125080002-677598-49mrzpk0.1669363202.677758/PyHamcrest-1.10.1-py3-none-any.whl... [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:748 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1125080002-677598-49mrzpk0.1669363202.677758/PyHamcrest-1.10.1-py3-none-any.whl in 0 seconds. [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1125080002-677598-49mrzpk0.1669363202.677758/beautifulsoup4-4.11.1-py3-none-any.whl... [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:748 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1125080002-677598-49mrzpk0.1669363202.677758/beautifulsoup4-4.11.1-py3-none-any.whl in 0 seconds. [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1125080002-677598-49mrzpk0.1669363202.677758/parameterized-0.7.5-py2.py3-none-any.whl... 
[32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:748 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1125080002-677598-49mrzpk0.1669363202.677758/parameterized-0.7.5-py2.py3-none-any.whl in 0 seconds. [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1125080002-677598-49mrzpk0.1669363202.677758/dataflow_python_sdk.tar... [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:748 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1125080002-677598-49mrzpk0.1669363202.677758/dataflow_python_sdk.tar in 0 seconds. [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:732 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1125080002-677598-49mrzpk0.1669363202.677758/pipeline.pb... [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:748 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1125080002-677598-49mrzpk0.1669363202.677758/pipeline.pb in 0 seconds. 
[32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:911 Create job: <Job clientRequestId: '20221125080002678671-3550' createTime: '2022-11-25T08:00:05.238343Z' currentStateTime: '1970-01-01T00:00:00Z' id: '2022-11-25_00_00_04-15338778294887345924' location: 'us-central1' name: 'beamapp-jenkins-1125080002-677598-49mrzpk0' projectId: 'apache-beam-testing' stageStates: [] startTime: '2022-11-25T08:00:05.238343Z' steps: [] tempFiles: [] type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)> [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:913 Created job with id: [2022-11-25_00_00_04-15338778294887345924] [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:914 Submitted job: 2022-11-25_00_00_04-15338778294887345924 [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:915 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-11-25_00_00_04-15338778294887345924?project=apache-beam-testing Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-11-25_00_00_04-15338778294887345924?project=apache-beam-testing [32mINFO [0m apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:58 Console log: [32mINFO [0m apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:59 https://console.cloud.google.com/dataflow/jobs/us-central1/2022-11-25_00_00_04-15338778294887345924?project=apache-beam-testing [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:198 Job 2022-11-25_00_00_04-15338778294887345924 is in state JOB_STATE_RUNNING [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:05.861Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2022-11-25_00_00_04-15338778294887345924. The number of workers will be between 1 and 1000. 
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:06.145Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2022-11-25_00_00_04-15338778294887345924. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:08.685Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:11.169Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:11.225Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:11.426Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:11.470Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables: GroupByKey not followed by a combiner. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:11.522Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations: GroupByKey not followed by a combiner. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:11.592Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows: GroupByKey not followed by a combiner. 
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:11.748Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:11.795Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:11.841Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:11.886Z: JOB_MESSAGE_DEBUG: Inserted coder converter before flatten ref_AppliedPTransform_WriteTeamScoreSums-WriteToBigQuery-BigQueryBatchFileLoads-DestinationFilesUnio_46
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:11.931Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema) into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:11.975Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/CopyJobNamePrefix into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseSingleElementPC/Map(decode)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:12.021Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GenerateFilePrefix into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseSingleElementPC/Map(decode)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:12.066Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/LoadJobNamePrefix into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseSingleElementPC/Map(decode)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:12.113Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/SchemaModJobNamePrefix into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseSingleElementPC/Map(decode)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:12.137Z: JOB_MESSAGE_DETAILED: Unzipping flatten ref_AppliedPTransform_WriteTeamScoreSums-WriteToBigQuery-BigQueryBatchFileLoads-DestinationFilesUnio_46 for input ref_AppliedPTransform_WriteTeamScoreSums-WriteToBigQuery-BigQueryBatchFileLoads-ParDo-WriteRecordsTo_41.WrittenFiles
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:12.174Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/IdentityWorkaround, through flatten WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/DestinationFilesUnion, into producer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:12.208Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles) into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:12.245Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs) into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:12.278Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/ParDo(TriggerLoadJobs) into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:12.314Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:12.359Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames/Keys into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:12.404Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/Delete into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames/Keys
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:12.447Z: JOB_MESSAGE_DETAILED: Unzipping flatten ref_AppliedPTransform_WriteTeamScoreSums-WriteToBigQuery-BigQueryBatchFileLoads-DestinationFilesUnio_46-u46 for input ref_AppliedPTransform_WriteTeamScoreSums-WriteToBigQuery-BigQueryBatchFileLoads-IdentityWorkaround_47.None-c44
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:12.482Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write, through flatten WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/DestinationFilesUnion/Unzipped-1, into producer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/IdentityWorkaround
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:12.517Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/IdentityWorkaround into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/DestinationFilesUnion/InputIdentity
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:12.555Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/IdentityWorkaround
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:12.593Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseSingleElementPC/FlatMap(<lambda at core.py:3507>) into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseSingleElementPC/Impulse
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:12.628Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseSingleElementPC/Map(decode) into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseSingleElementPC/FlatMap(<lambda at core.py:3507>)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:12.665Z: JOB_MESSAGE_DETAILED: Fusing consumer ReadInputText/Read/Map(<lambda at iobase.py:908>) into ReadInputText/Read/Impulse
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:12.710Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_ReadInputText-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/PairWithRestriction into ReadInputText/Read/Map(<lambda at iobase.py:908>)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:12.741Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_ReadInputText-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/SplitWithSizing into ref_AppliedPTransform_ReadInputText-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/PairWithRestriction
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:12.774Z: JOB_MESSAGE_DETAILED: Fusing consumer HourlyTeamScore/ParseGameEventFn into ref_AppliedPTransform_ReadInputText-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/ProcessElementAndRestrictionWithSizing
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:12.808Z: JOB_MESSAGE_DETAILED: Fusing consumer HourlyTeamScore/FilterStartTime into HourlyTeamScore/ParseGameEventFn
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:12.850Z: JOB_MESSAGE_DETAILED: Fusing consumer HourlyTeamScore/FilterEndTime into HourlyTeamScore/FilterStartTime
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:12.886Z: JOB_MESSAGE_DETAILED: Fusing consumer HourlyTeamScore/AddEventTimestamps into HourlyTeamScore/FilterEndTime
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:12.932Z: JOB_MESSAGE_DETAILED: Fusing consumer HourlyTeamScore/FixedWindowsTeam into HourlyTeamScore/AddEventTimestamps
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:12.968Z: JOB_MESSAGE_DETAILED: Fusing consumer HourlyTeamScore/ExtractAndSumScore/Map(<lambda at hourly_team_score.py:142>) into HourlyTeamScore/FixedWindowsTeam
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:13.013Z: JOB_MESSAGE_DETAILED: Fusing consumer HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey+HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/Combine/Partial into HourlyTeamScore/ExtractAndSumScore/Map(<lambda at hourly_team_score.py:142>)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:13.059Z: JOB_MESSAGE_DETAILED: Fusing consumer HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Reify into HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey+HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/Combine/Partial
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:13.100Z: JOB_MESSAGE_DETAILED: Fusing consumer HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Write into HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Reify
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:13.133Z: JOB_MESSAGE_DETAILED: Fusing consumer HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/GroupByWindow into HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Read
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:13.168Z: JOB_MESSAGE_DETAILED: Fusing consumer HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/Combine into HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/GroupByWindow
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:13.212Z: JOB_MESSAGE_DETAILED: Fusing consumer HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/Combine/Extract into HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/Combine
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:13.246Z: JOB_MESSAGE_DETAILED: Fusing consumer TeamScoresDict into HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/Combine/Extract
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:13.290Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/ConvertToRow into TeamScoresDict
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:13.340Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RewindowIntoGlobal into WriteTeamScoreSums/ConvertToRow
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:13.377Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/AppendDestination into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RewindowIntoGlobal
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:13.410Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile) into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/AppendDestination
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:13.445Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(_ShardDestinations) into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:13.493Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/Write into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(_ShardDestinations)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:13.533Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/DropShardNumber into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/Read
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:13.568Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/WriteGroupedRecordsToFile into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/DropShardNumber
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:13.610Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/DestinationFilesUnion/InputIdentity into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/WriteGroupedRecordsToFile
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:13.655Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseEmptyPC/FlatMap(<lambda at core.py:3507>) into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseEmptyPC/Impulse
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:13.693Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseEmptyPC/Map(decode) into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseEmptyPC/FlatMap(<lambda at core.py:3507>)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:13.753Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:13.799Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:13.849Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:13.890Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:14.150Z: JOB_MESSAGE_DEBUG: Executing wait step start61
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:14.233Z: JOB_MESSAGE_BASIC: Executing operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseEmptyPC/Impulse+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseEmptyPC/FlatMap(<lambda at core.py:3507>)+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseEmptyPC/Map(decode)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:14.283Z: JOB_MESSAGE_BASIC: Executing operation ReadInputText/Read/Impulse+ReadInputText/Read/Map(<lambda at iobase.py:908>)+ref_AppliedPTransform_ReadInputText-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/PairWithRestriction+ref_AppliedPTransform_ReadInputText-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/SplitWithSizing
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:14.296Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:14.318Z: JOB_MESSAGE_BASIC: Executing operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseSingleElementPC/Impulse+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseSingleElementPC/FlatMap(<lambda at core.py:3507>)+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseSingleElementPC/Map(decode)+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/CopyJobNamePrefix+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GenerateFilePrefix+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/LoadJobNamePrefix+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/SchemaModJobNamePrefix
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:14.340Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b...
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:14.350Z: JOB_MESSAGE_BASIC: Executing operation HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Create
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:14.399Z: JOB_MESSAGE_BASIC: Executing operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Create
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:14.422Z: JOB_MESSAGE_BASIC: Finished operation HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Create
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:14.434Z: JOB_MESSAGE_BASIC: Executing operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/Create
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:14.452Z: JOB_MESSAGE_BASIC: Finished operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Create
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:14.495Z: JOB_MESSAGE_BASIC: Finished operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/Create
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:14.541Z: JOB_MESSAGE_BASIC: Executing operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:14.575Z: JOB_MESSAGE_DEBUG: Value "HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Session" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:14.596Z: JOB_MESSAGE_BASIC: Finished operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:14.612Z: JOB_MESSAGE_DEBUG: Value "WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Session" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:14.645Z: JOB_MESSAGE_DEBUG: Value "WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/Session" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:14.691Z: JOB_MESSAGE_DEBUG: Value "WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Session" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:00:23.865Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:01:07.387Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:243 2022-11-25T08:01:34.194Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
FATAL: command execution failed
java.io.IOException: Backing channel 'apache-beam-jenkins-9' is disconnected.
	at hudson.remoting.RemoteInvocationHandler.channelOrFail(RemoteInvocationHandler.java:215)
	at hudson.remoting.RemoteInvocationHandler.invoke(RemoteInvocationHandler.java:285)
	at com.sun.proxy.$Proxy138.isAlive(Unknown Source)
	at hudson.Launcher$RemoteLauncher$ProcImpl.isAlive(Launcher.java:1215)
	at hudson.Launcher$RemoteLauncher$ProcImpl.join(Launcher.java:1207)
	at hudson.Launcher$ProcStarter.join(Launcher.java:524)
	at hudson.plugins.gradle.Gradle.perform(Gradle.java:317)
	at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
	at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:814)
	at hudson.model.Build$BuildExecution.build(Build.java:199)
	at hudson.model.Build$BuildExecution.doRun(Build.java:164)
	at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:522)
	at hudson.model.Run.execute(Run.java:1896)
	at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:44)
	at hudson.model.ResourceController.execute(ResourceController.java:101)
	at hudson.model.Executor.run(Executor.java:442)
Caused by: java.io.IOException: Pipe closed after 0 cycles
	at org.apache.sshd.common.channel.ChannelPipedInputStream.read(ChannelPipedInputStream.java:126)
	at org.apache.sshd.common.channel.ChannelPipedInputStream.read(ChannelPipedInputStream.java:105)
	at hudson.remoting.FlightRecorderInputStream.read(FlightRecorderInputStream.java:94)
	at hudson.remoting.ChunkedInputStream.readHeader(ChunkedInputStream.java:75)
	at hudson.remoting.ChunkedInputStream.readUntilBreak(ChunkedInputStream.java:105)
	at hudson.remoting.ChunkedCommandTransport.readBlock(ChunkedCommandTransport.java:39)
	at hudson.remoting.AbstractSynchronousByteArrayCommandTransport.read(AbstractSynchronousByteArrayCommandTransport.java:34)
	at hudson.remoting.SynchronousCommandTransport$ReaderThread.run(SynchronousCommandTransport.java:61)
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
ERROR: apache-beam-jenkins-9 is offline; cannot locate jdk_1.8_latest
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
