See <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/1527/display/redirect>
Changes: ------------------------------------------ [...truncated 110.66 KB...] [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:24:45.973Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToText/Write/WriteImpl/DoOnce/Map(decode) into WriteToText/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:3507>) [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:24:46.009Z: JOB_MESSAGE_DETAILED: Fusing consumer EstimatePiTransform/Initialize/FlatMap(<lambda at core.py:3507>) into EstimatePiTransform/Initialize/Impulse [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:24:46.047Z: JOB_MESSAGE_DETAILED: Fusing consumer EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/AddRandomKeys into EstimatePiTransform/Initialize/FlatMap(<lambda at core.py:3507>) [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:24:46.080Z: JOB_MESSAGE_DETAILED: Fusing consumer EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/AddRandomKeys [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:24:46.116Z: JOB_MESSAGE_DETAILED: Fusing consumer EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify into EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:24:46.146Z: JOB_MESSAGE_DETAILED: Fusing consumer EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write into EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:24:46.172Z: JOB_MESSAGE_DETAILED: Fusing consumer EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow into EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:24:46.199Z: JOB_MESSAGE_DETAILED: Fusing consumer EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:24:46.245Z: JOB_MESSAGE_DETAILED: Fusing consumer EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/RemoveRandomKeys into EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:24:46.275Z: JOB_MESSAGE_DETAILED: Fusing consumer EstimatePiTransform/Initialize/Map(decode) into EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/RemoveRandomKeys [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:24:46.307Z: JOB_MESSAGE_DETAILED: Fusing consumer EstimatePiTransform/Run trials into EstimatePiTransform/Initialize/Map(decode) [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:24:46.334Z: JOB_MESSAGE_DETAILED: Fusing consumer EstimatePiTransform/Sum/KeyWithVoid into 
EstimatePiTransform/Run trials [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:24:46.390Z: JOB_MESSAGE_DETAILED: Fusing consumer EstimatePiTransform/Sum/CombinePerKey/GroupByKey+EstimatePiTransform/Sum/CombinePerKey/Combine/Partial into EstimatePiTransform/Sum/KeyWithVoid [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:24:46.420Z: JOB_MESSAGE_DETAILED: Fusing consumer EstimatePiTransform/Sum/CombinePerKey/GroupByKey/Write into EstimatePiTransform/Sum/CombinePerKey/GroupByKey+EstimatePiTransform/Sum/CombinePerKey/Combine/Partial [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:24:46.452Z: JOB_MESSAGE_DETAILED: Fusing consumer EstimatePiTransform/Sum/CombinePerKey/Combine into EstimatePiTransform/Sum/CombinePerKey/GroupByKey/Read [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:24:46.492Z: JOB_MESSAGE_DETAILED: Fusing consumer EstimatePiTransform/Sum/CombinePerKey/Combine/Extract into EstimatePiTransform/Sum/CombinePerKey/Combine [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:24:46.526Z: JOB_MESSAGE_DETAILED: Fusing consumer EstimatePiTransform/Sum/UnKey into EstimatePiTransform/Sum/CombinePerKey/Combine/Extract [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:24:46.550Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToText/Write/WriteImpl/WindowInto(WindowIntoFn) into EstimatePiTransform/Sum/UnKey [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:24:46.573Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToText/Write/WriteImpl/WriteBundles into WriteToText/Write/WriteImpl/WindowInto(WindowIntoFn) [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:24:46.619Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToText/Write/WriteImpl/Pair into WriteToText/Write/WriteImpl/WriteBundles [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:24:46.649Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToText/Write/WriteImpl/GroupByKey/Write into WriteToText/Write/WriteImpl/Pair [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:24:46.678Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToText/Write/WriteImpl/Extract into WriteToText/Write/WriteImpl/GroupByKey/Read [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:24:46.731Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:24:46.783Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:24:46.814Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:24:46.837Z: JOB_MESSAGE_DEBUG: Assigning stage ids. 
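For reference, the EstimatePiTransform pipeline whose stages are being fused above is Beam's Monte Carlo pi-estimation example. A minimal sketch of the same idea in the Beam Python SDK follows; it is illustrative only (the batch sizes, step names and output path are placeholders, not the actual apache_beam/examples/complete/estimate_pi.py code):

import random
import apache_beam as beam

NUM_BATCHES = 100          # placeholder trial counts, chosen for illustration
TRIALS_PER_BATCH = 10_000

def run_trials(num_samples):
    # Count random points in the unit square that land inside the quarter circle.
    return sum(
        1 for _ in range(num_samples)
        if random.random() ** 2 + random.random() ** 2 <= 1.0)

with beam.Pipeline() as p:  # DirectRunner by default; Dataflow needs extra options
    (p
     | 'Initialize' >> beam.Create([TRIALS_PER_BATCH] * NUM_BATCHES)
     | 'Run trials' >> beam.Map(run_trials)
     | 'Sum' >> beam.CombineGlobally(sum)
     | 'Estimate' >> beam.Map(lambda hits: 4.0 * hits / (NUM_BATCHES * TRIALS_PER_BATCH))
     | 'WriteToText' >> beam.io.WriteToText('/tmp/estimate_pi'))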
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:24:47.050Z: JOB_MESSAGE_DEBUG: Executing wait step start37 [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:24:47.122Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/DoOnce/Impulse+WriteToText/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:3507>)+WriteToText/Write/WriteImpl/DoOnce/Map(decode)+WriteToText/Write/WriteImpl/InitializeWrite [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:24:47.151Z: JOB_MESSAGE_BASIC: Executing operation EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:24:47.165Z: JOB_MESSAGE_DEBUG: Starting worker pool setup. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:24:47.194Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b... [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:24:47.664Z: JOB_MESSAGE_BASIC: Finished operation EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:24:47.736Z: JOB_MESSAGE_DEBUG: Value "EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Session" materialized. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:24:47.796Z: JOB_MESSAGE_BASIC: Executing operation EstimatePiTransform/Initialize/Impulse+EstimatePiTransform/Initialize/FlatMap(<lambda at core.py:3507>)+EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/AddRandomKeys+EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:25:15.910Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:25:25.525Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s). [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:26:05.065Z: JOB_MESSAGE_DETAILED: Workers have started successfully. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:33:12.354Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests. 
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:33:13.622Z: JOB_MESSAGE_BASIC: Finished operation EstimatePiTransform/Initialize/Impulse+EstimatePiTransform/Initialize/FlatMap(<lambda at core.py:3507>)+EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/AddRandomKeys+EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:33:13.686Z: JOB_MESSAGE_BASIC: Executing operation EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Close [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:33:13.743Z: JOB_MESSAGE_BASIC: Finished operation EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Close [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:33:13.809Z: JOB_MESSAGE_BASIC: Executing operation EstimatePiTransform/Sum/CombinePerKey/GroupByKey/Create [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:33:14.028Z: JOB_MESSAGE_BASIC: Finished operation EstimatePiTransform/Sum/CombinePerKey/GroupByKey/Create [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:33:14.082Z: JOB_MESSAGE_DEBUG: Value "EstimatePiTransform/Sum/CombinePerKey/GroupByKey/Session" materialized. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:33:14.142Z: JOB_MESSAGE_BASIC: Executing operation EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read+EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/RemoveRandomKeys+EstimatePiTransform/Initialize/Map(decode)+EstimatePiTransform/Run trials+EstimatePiTransform/Sum/KeyWithVoid+EstimatePiTransform/Sum/CombinePerKey/GroupByKey+EstimatePiTransform/Sum/CombinePerKey/Combine/Partial+EstimatePiTransform/Sum/CombinePerKey/GroupByKey/Write [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:33:14.617Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/DoOnce/Impulse+WriteToText/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:3507>)+WriteToText/Write/WriteImpl/DoOnce/Map(decode)+WriteToText/Write/WriteImpl/InitializeWrite [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:33:14.682Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/DoOnce/Map(decode).None" materialized. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:33:14.706Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/InitializeWrite.None" materialized. 
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:33:14.773Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/WriteBundles/View-python_side_input0-WriteToText/Write/WriteImpl/WriteBundles [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:33:14.805Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input0-WriteToText/Write/WriteImpl/FinalizeWrite [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:33:14.821Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/WriteBundles/View-python_side_input0-WriteToText/Write/WriteImpl/WriteBundles [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:33:14.839Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/PreFinalize/View-python_side_input0-WriteToText/Write/WriteImpl/PreFinalize [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:33:14.867Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input0-WriteToText/Write/WriteImpl/FinalizeWrite [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:33:14.879Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/WriteBundles/View-python_side_input0-WriteToText/Write/WriteImpl/WriteBundles.out" materialized. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:33:14.909Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/PreFinalize/View-python_side_input0-WriteToText/Write/WriteImpl/PreFinalize [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:33:14.925Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input0-WriteToText/Write/WriteImpl/FinalizeWrite.out" materialized. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:33:14.969Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/PreFinalize/View-python_side_input0-WriteToText/Write/WriteImpl/PreFinalize.out" materialized. 
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:33:23.455Z: JOB_MESSAGE_BASIC: Finished operation EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read+EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+EstimatePiTransform/Initialize/MaybeReshuffle/Reshuffle/RemoveRandomKeys+EstimatePiTransform/Initialize/Map(decode)+EstimatePiTransform/Run trials+EstimatePiTransform/Sum/KeyWithVoid+EstimatePiTransform/Sum/CombinePerKey/GroupByKey+EstimatePiTransform/Sum/CombinePerKey/Combine/Partial+EstimatePiTransform/Sum/CombinePerKey/GroupByKey/Write [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:33:23.511Z: JOB_MESSAGE_BASIC: Executing operation EstimatePiTransform/Sum/CombinePerKey/GroupByKey/Close [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:33:23.829Z: JOB_MESSAGE_BASIC: Finished operation EstimatePiTransform/Sum/CombinePerKey/GroupByKey/Close [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:33:23.881Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/GroupByKey/Create [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:33:24.093Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/GroupByKey/Create [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:33:24.158Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/GroupByKey/Session" materialized. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:33:24.228Z: JOB_MESSAGE_BASIC: Executing operation EstimatePiTransform/Sum/CombinePerKey/GroupByKey/Read+EstimatePiTransform/Sum/CombinePerKey/Combine+EstimatePiTransform/Sum/CombinePerKey/Combine/Extract+EstimatePiTransform/Sum/UnKey+WriteToText/Write/WriteImpl/WindowInto(WindowIntoFn)+WriteToText/Write/WriteImpl/WriteBundles+WriteToText/Write/WriteImpl/Pair+WriteToText/Write/WriteImpl/GroupByKey/Write [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:33:26Z: JOB_MESSAGE_BASIC: Finished operation EstimatePiTransform/Sum/CombinePerKey/GroupByKey/Read+EstimatePiTransform/Sum/CombinePerKey/Combine+EstimatePiTransform/Sum/CombinePerKey/Combine/Extract+EstimatePiTransform/Sum/UnKey+WriteToText/Write/WriteImpl/WindowInto(WindowIntoFn)+WriteToText/Write/WriteImpl/WriteBundles+WriteToText/Write/WriteImpl/Pair+WriteToText/Write/WriteImpl/GroupByKey/Write [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:33:26.065Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/GroupByKey/Close [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:33:26.125Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/GroupByKey/Close [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:33:26.196Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/GroupByKey/Read+WriteToText/Write/WriteImpl/Extract [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:33:28.669Z: JOB_MESSAGE_BASIC: Finished operation 
WriteToText/Write/WriteImpl/GroupByKey/Read+WriteToText/Write/WriteImpl/Extract [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:33:28.723Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/Extract.None" materialized. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:33:28.789Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input1-WriteToText/Write/WriteImpl/FinalizeWrite [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:33:28.819Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/PreFinalize/View-python_side_input1-WriteToText/Write/WriteImpl/PreFinalize [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:33:28.846Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input1-WriteToText/Write/WriteImpl/FinalizeWrite [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:33:28.885Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/PreFinalize/View-python_side_input1-WriteToText/Write/WriteImpl/PreFinalize [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:33:28.914Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input1-WriteToText/Write/WriteImpl/FinalizeWrite.out" materialized. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:33:28.947Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/PreFinalize/View-python_side_input1-WriteToText/Write/WriteImpl/PreFinalize.out" materialized. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:33:29.017Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/PreFinalize [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:33:32.033Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/PreFinalize [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:33:32.085Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/PreFinalize.None" materialized. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:33:32.152Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input2-WriteToText/Write/WriteImpl/FinalizeWrite [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:33:32.211Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input2-WriteToText/Write/WriteImpl/FinalizeWrite [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:33:32.267Z: JOB_MESSAGE_DEBUG: Value "WriteToText/Write/WriteImpl/FinalizeWrite/View-python_side_input2-WriteToText/Write/WriteImpl/FinalizeWrite.out" materialized. 
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:33:32.327Z: JOB_MESSAGE_BASIC: Executing operation WriteToText/Write/WriteImpl/FinalizeWrite
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:33:34.633Z: JOB_MESSAGE_BASIC: Finished operation WriteToText/Write/WriteImpl/FinalizeWrite
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:33:34.688Z: JOB_MESSAGE_DEBUG: Executing success step success35
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:33:34.760Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:33:35.051Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:33:35.078Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:35:45.905Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:35:45.947Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:35:45.984Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:193 Job 2023-02-24_06_24_38-11536141819926601820 is in state JOB_STATE_DONE
INFO apache_beam.io.gcp.gcsio:gcsio.py:644 Finished listing 1 files in 0.053618431091308594 seconds.
PASSED apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
-------------------------------- live log call ---------------------------------
INFO apache_beam.runners.portability.stager:stager.py:772 Executing command: ['/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python_Examples_Dataflow/src/build/gradleenv/2050596098/bin/python3.10', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', '/tmp/tmp4prcqaae/tmp_requirements.txt', '--exists-action', 'i', '--no-deps', '--implementation', 'cp', '--abi', 'cp310', '--platform', 'manylinux2014_x86_64']
INFO apache_beam.runners.portability.stager:stager.py:330 Copying Beam SDK "/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python_Examples_Dataflow/src/sdks/python/build/apache-beam.tar.gz" to staging location.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:443 Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations.
Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild [32mINFO [0m root:environments.py:376 Default Python SDK image for environment is apache/beam_python3.10_sdk:2.47.0.dev [32mINFO [0m root:environments.py:295 Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python310-fnapi:beam-master-20230126 [32mINFO [0m root:environments.py:302 Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python310-fnapi:beam-master-20230126" for Docker environment [32mINFO [0m apache_beam.runners.portability.fn_api_runner.translations:translations.py:710 ==================== <function pack_combiners at 0x7f4106018040> ==================== [32mINFO [0m apache_beam.runners.portability.fn_api_runner.translations:translations.py:710 ==================== <function sort_stages at 0x7f4106018820> ==================== [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:725 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0224143558-008021-erbmyffu.1677249358.008180/requirements.txt... [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:741 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0224143558-008021-erbmyffu.1677249358.008180/requirements.txt in 0 seconds. [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:725 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0224143558-008021-erbmyffu.1677249358.008180/pickled_main_session... [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:741 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0224143558-008021-erbmyffu.1677249358.008180/pickled_main_session in 0 seconds. [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:725 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0224143558-008021-erbmyffu.1677249358.008180/mock-2.0.0-py2.py3-none-any.whl... [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:741 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0224143558-008021-erbmyffu.1677249358.008180/mock-2.0.0-py2.py3-none-any.whl in 0 seconds. [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:725 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0224143558-008021-erbmyffu.1677249358.008180/seaborn-0.12.2-py3-none-any.whl... [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:741 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0224143558-008021-erbmyffu.1677249358.008180/seaborn-0.12.2-py3-none-any.whl in 0 seconds. [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:725 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0224143558-008021-erbmyffu.1677249358.008180/PyHamcrest-1.10.1-py3-none-any.whl... [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:741 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0224143558-008021-erbmyffu.1677249358.008180/PyHamcrest-1.10.1-py3-none-any.whl in 0 seconds. 
[32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:725 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0224143558-008021-erbmyffu.1677249358.008180/inflection-0.5.1-py2.py3-none-any.whl... [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:741 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0224143558-008021-erbmyffu.1677249358.008180/inflection-0.5.1-py2.py3-none-any.whl in 0 seconds. [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:725 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0224143558-008021-erbmyffu.1677249358.008180/beautifulsoup4-4.11.2-py3-none-any.whl... [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:741 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0224143558-008021-erbmyffu.1677249358.008180/beautifulsoup4-4.11.2-py3-none-any.whl in 0 seconds. [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:725 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0224143558-008021-erbmyffu.1677249358.008180/parameterized-0.7.5-py2.py3-none-any.whl... [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:741 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0224143558-008021-erbmyffu.1677249358.008180/parameterized-0.7.5-py2.py3-none-any.whl in 0 seconds. [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:725 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0224143558-008021-erbmyffu.1677249358.008180/matplotlib-3.7.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl... [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:741 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0224143558-008021-erbmyffu.1677249358.008180/matplotlib-3.7.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl in 0 seconds. [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:725 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0224143558-008021-erbmyffu.1677249358.008180/dataflow_python_sdk.tar... [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:741 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0224143558-008021-erbmyffu.1677249358.008180/dataflow_python_sdk.tar in 0 seconds. [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:725 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0224143558-008021-erbmyffu.1677249358.008180/pipeline.pb... [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:741 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0224143558-008021-erbmyffu.1677249358.008180/pipeline.pb in 0 seconds. 
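The uploads above are the staging step that precedes the job creation below: extra dependencies are staged as requirements.txt plus wheels, the SDK as dataflow_python_sdk.tar, and the serialized pipeline as pipeline.pb. A rough sketch of the pipeline options that drive this in the Python SDK, with placeholder project and bucket names rather than the values used by this Jenkins job:

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Placeholder project/bucket values for illustration.
options = PipelineOptions(
    runner='DataflowRunner',
    project='my-gcp-project',
    region='us-central1',
    temp_location='gs://my-bucket/temp',
    staging_location='gs://my-bucket/staging',
    requirements_file='requirements.txt',  # extra deps get staged to GCS, as in the log
)

with beam.Pipeline(options=options) as p:
    # Trivial body; the point here is the options that trigger staging and submission.
    p | beam.Create(['hello']) | beam.Map(print)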
[32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:900 Create job: <Job clientRequestId: '20230224143558009211-2424' createTime: '2023-02-24T14:36:01.361095Z' currentStateTime: '1970-01-01T00:00:00Z' id: '2023-02-24_06_36_00-15941021631000236110' location: 'us-central1' name: 'beamapp-jenkins-0224143558-008021-erbmyffu' projectId: 'apache-beam-testing' stageStates: [] startTime: '2023-02-24T14:36:01.361095Z' steps: [] tempFiles: [] type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)> [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:902 Created job with id: [2023-02-24_06_36_00-15941021631000236110] [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:903 Submitted job: 2023-02-24_06_36_00-15941021631000236110 [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:904 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-02-24_06_36_00-15941021631000236110?project=apache-beam-testing Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2023-02-24_06_36_00-15941021631000236110?project=apache-beam-testing [32mINFO [0m apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:58 Console log: [32mINFO [0m apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:59 https://console.cloud.google.com/dataflow/jobs/us-central1/2023-02-24_06_36_00-15941021631000236110?project=apache-beam-testing [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:193 Job 2023-02-24_06_36_00-15941021631000236110 is in state JOB_STATE_RUNNING [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:02.893Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2023-02-24_06_36_00-15941021631000236110. The number of workers will be between 1 and 1000. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:02.946Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2023-02-24_06_36_00-15941021631000236110. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:05.318Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-f. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:06.629Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:06.652Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:06.709Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:06.744Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables: GroupByKey not followed by a combiner. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:06.777Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations: GroupByKey not followed by a combiner. 
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:06.800Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows: GroupByKey not followed by a combiner. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:06.852Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:06.875Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:06.920Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:06.941Z: JOB_MESSAGE_DEBUG: Inserted coder converter before flatten ref_AppliedPTransform_WriteTeamScoreSums-WriteToBigQuery-BigQueryBatchFileLoads-DestinationFilesUnio_47 [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:06.977Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:1110>) into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs) [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:07.013Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema) into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs) [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:07.038Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/CopyJobNamePrefix into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseSingleElementPC/Map(decode) [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:07.061Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GenerateFilePrefix into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseSingleElementPC/Map(decode) [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:07.087Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/LoadJobNamePrefix into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseSingleElementPC/Map(decode) [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:07.111Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/SchemaModJobNamePrefix into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseSingleElementPC/Map(decode) [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:07.144Z: JOB_MESSAGE_DETAILED: Unzipping flatten ref_AppliedPTransform_WriteTeamScoreSums-WriteToBigQuery-BigQueryBatchFileLoads-DestinationFilesUnio_47 for input ref_AppliedPTransform_WriteTeamScoreSums-WriteToBigQuery-BigQueryBatchFileLoads-ParDo-WriteRecordsTo_42.WrittenFiles [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2023-02-24T14:36:07.180Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/IdentityWorkaround, through flatten WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/DestinationFilesUnion, into producer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile) [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:07.203Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles) into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:07.224Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs) into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles) [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:07.248Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/ParDo(TriggerLoadJobs) into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles) [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:07.281Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:07.312Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames/Keys into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:07.336Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/Delete into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames/Keys [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:07.369Z: JOB_MESSAGE_DETAILED: Unzipping flatten ref_AppliedPTransform_WriteTeamScoreSums-WriteToBigQuery-BigQueryBatchFileLoads-DestinationFilesUnio_47-u46 for input ref_AppliedPTransform_WriteTeamScoreSums-WriteToBigQuery-BigQueryBatchFileLoads-IdentityWorkaround_48.None-c44 [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:07.392Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write, through flatten WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/DestinationFilesUnion/Unzipped-1, into producer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/IdentityWorkaround [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:07.416Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/IdentityWorkaround into 
WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/DestinationFilesUnion/InputIdentity [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:07.437Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/IdentityWorkaround [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:07.469Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseSingleElementPC/FlatMap(<lambda at core.py:3507>) into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseSingleElementPC/Impulse [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:07.492Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseSingleElementPC/Map(decode) into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseSingleElementPC/FlatMap(<lambda at core.py:3507>) [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:07.524Z: JOB_MESSAGE_DETAILED: Fusing consumer ReadInputText/Read/Map(<lambda at iobase.py:908>) into ReadInputText/Read/Impulse [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:07.548Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_ReadInputText-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/PairWithRestriction into ReadInputText/Read/Map(<lambda at iobase.py:908>) [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:07.580Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_ReadInputText-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/SplitWithSizing into ref_AppliedPTransform_ReadInputText-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/PairWithRestriction [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:07.613Z: JOB_MESSAGE_DETAILED: Fusing consumer HourlyTeamScore/ParseGameEventFn into ref_AppliedPTransform_ReadInputText-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/ProcessElementAndRestrictionWithSizing [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:07.635Z: JOB_MESSAGE_DETAILED: Fusing consumer HourlyTeamScore/FilterStartTime into HourlyTeamScore/ParseGameEventFn [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:07.661Z: JOB_MESSAGE_DETAILED: Fusing consumer HourlyTeamScore/FilterEndTime into HourlyTeamScore/FilterStartTime [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:07.686Z: JOB_MESSAGE_DETAILED: Fusing consumer HourlyTeamScore/AddEventTimestamps into HourlyTeamScore/FilterEndTime [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:07.716Z: JOB_MESSAGE_DETAILED: Fusing consumer HourlyTeamScore/FixedWindowsTeam into HourlyTeamScore/AddEventTimestamps [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:07.739Z: JOB_MESSAGE_DETAILED: Fusing consumer HourlyTeamScore/ExtractAndSumScore/Map(<lambda at hourly_team_score.py:142>) into HourlyTeamScore/FixedWindowsTeam [32mINFO [0m 
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:07.767Z: JOB_MESSAGE_DETAILED: Fusing consumer HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey+HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/Combine/Partial into HourlyTeamScore/ExtractAndSumScore/Map(<lambda at hourly_team_score.py:142>) [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:07.796Z: JOB_MESSAGE_DETAILED: Fusing consumer HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Reify into HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey+HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/Combine/Partial [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:07.822Z: JOB_MESSAGE_DETAILED: Fusing consumer HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Write into HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Reify [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:07.850Z: JOB_MESSAGE_DETAILED: Fusing consumer HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/GroupByWindow into HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Read [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:07.880Z: JOB_MESSAGE_DETAILED: Fusing consumer HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/Combine into HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/GroupByWindow [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:07.906Z: JOB_MESSAGE_DETAILED: Fusing consumer HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/Combine/Extract into HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/Combine [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:07.936Z: JOB_MESSAGE_DETAILED: Fusing consumer TeamScoresDict into HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/Combine/Extract [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:07.961Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/ConvertToRow into TeamScoresDict [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:07.982Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RewindowIntoGlobal into WriteTeamScoreSums/ConvertToRow [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:08.007Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/AppendDestination into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RewindowIntoGlobal [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:08.035Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile) into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/AppendDestination [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:08.062Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(_ShardDestinations) into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile) [32mINFO 
[0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:08.089Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/Write into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(_ShardDestinations) [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:08.117Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/DropShardNumber into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/Read [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:08.148Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/WriteGroupedRecordsToFile into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/DropShardNumber [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:08.176Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/DestinationFilesUnion/InputIdentity into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/WriteGroupedRecordsToFile [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:08.210Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseEmptyPC/FlatMap(<lambda at core.py:3507>) into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseEmptyPC/Impulse [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:08.242Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseEmptyPC/Map(decode) into WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseEmptyPC/FlatMap(<lambda at core.py:3507>) [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:08.297Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:08.323Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:08.347Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:08.373Z: JOB_MESSAGE_DEBUG: Assigning stage ids. 
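The BigQueryBatchFileLoads stages being fused above are what Beam's WriteToBigQuery transform expands into when it writes via batch load jobs. A minimal sketch of such a write, with a placeholder table and schema rather than the ones hourly_team_score uses:

import apache_beam as beam

with beam.Pipeline() as p:  # needs GCP options and a real table to actually run
    (p
     | 'TeamScoresDict' >> beam.Create(
           [{'team': 'red', 'total_score': 42, 'window_start': '2023-02-24 14:00:00'}])
     | 'WriteToBigQuery' >> beam.io.WriteToBigQuery(
           'my-project:my_dataset.hourly_team_score',  # placeholder table spec
           schema='team:STRING,total_score:INTEGER,window_start:STRING',
           method=beam.io.WriteToBigQuery.Method.FILE_LOADS,
           create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
           write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))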
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:08.558Z: JOB_MESSAGE_DEBUG: Executing wait step start61
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:08.614Z: JOB_MESSAGE_BASIC: Executing operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseEmptyPC/Impulse+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseEmptyPC/FlatMap(<lambda at core.py:3507>)+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseEmptyPC/Map(decode)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:08.639Z: JOB_MESSAGE_BASIC: Executing operation ReadInputText/Read/Impulse+ReadInputText/Read/Map(<lambda at iobase.py:908>)+ref_AppliedPTransform_ReadInputText-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/PairWithRestriction+ref_AppliedPTransform_ReadInputText-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/SplitWithSizing
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:08.651Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:08.653Z: JOB_MESSAGE_BASIC: Executing operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseSingleElementPC/Impulse+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseSingleElementPC/FlatMap(<lambda at core.py:3507>)+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseSingleElementPC/Map(decode)+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/CopyJobNamePrefix+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GenerateFilePrefix+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/LoadJobNamePrefix+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/SchemaModJobNamePrefix
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:08.678Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:42.879Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:36:47.854Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-02-24T14:37:23.338Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
FATAL: command execution failed
java.io.IOException: Backing channel 'apache-beam-jenkins-14' is disconnected.
    at hudson.remoting.RemoteInvocationHandler.channelOrFail(RemoteInvocationHandler.java:215)
    at hudson.remoting.RemoteInvocationHandler.invoke(RemoteInvocationHandler.java:285)
    at com.sun.proxy.$Proxy141.isAlive(Unknown Source)
    at hudson.Launcher$RemoteLauncher$ProcImpl.isAlive(Launcher.java:1215)
    at hudson.Launcher$RemoteLauncher$ProcImpl.join(Launcher.java:1207)
    at hudson.Launcher$ProcStarter.join(Launcher.java:524)
    at hudson.plugins.gradle.Gradle.perform(Gradle.java:317)
    at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
    at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:814)
    at hudson.model.Build$BuildExecution.build(Build.java:199)
    at hudson.model.Build$BuildExecution.doRun(Build.java:164)
    at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:522)
    at hudson.model.Run.execute(Run.java:1896)
    at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:44)
    at hudson.model.ResourceController.execute(ResourceController.java:101)
    at hudson.model.Executor.run(Executor.java:442)
Caused by: java.io.IOException: Pipe closed after 0 cycles
    at org.apache.sshd.common.channel.ChannelPipedInputStream.read(ChannelPipedInputStream.java:126)
    at org.apache.sshd.common.channel.ChannelPipedInputStream.read(ChannelPipedInputStream.java:105)
    at hudson.remoting.FlightRecorderInputStream.read(FlightRecorderInputStream.java:94)
    at hudson.remoting.ChunkedInputStream.readHeader(ChunkedInputStream.java:75)
    at hudson.remoting.ChunkedInputStream.readUntilBreak(ChunkedInputStream.java:105)
    at hudson.remoting.ChunkedCommandTransport.readBlock(ChunkedCommandTransport.java:39)
    at hudson.remoting.AbstractSynchronousByteArrayCommandTransport.read(AbstractSynchronousByteArrayCommandTransport.java:34)
    at hudson.remoting.SynchronousCommandTransport$ReaderThread.run(SynchronousCommandTransport.java:61)
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
ERROR: apache-beam-jenkins-14 is offline; cannot locate jdk_1.8_latest
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
