See
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/8837/display/redirect?page=changes>
Changes:
[johnjcasey] [BEAM-12391] update avro sink to close the opened file handle,
instead
[Robert Bradshaw] [BEAM-13482] Python fully qualified name external transforms.
[Robert Bradshaw] Add Python expansion service entry point.
[Kyle Weaver] [BEAM-13569] Change Spark dependencies to implementation.
[Kyle Weaver] remove redundant dependency
[wuren] [BEAM-13591] Bump log4j2 version to 2.17.1
[relax] Add Flink runner support for OrderedListState. This version reads the
[zyichi] Fix sdk_container_builder too many values to unpack error
[noreply] [BEAM-13480] Sickbay PubSubIntegrationTest.test_streaming_data_only on
[Kyle Weaver] remove redundant testImplementation dependencies
[noreply] [BEAM-13430] Swap to use "mainClass" instead of "main" since it was
[noreply] [BEAM-13430] Replace deprecated "appendix" with "archiveAppendix"
[noreply] [BEAM-13015] Add jamm as a java agent to the Java SDK harness
container
[noreply] [BEAM-13430] Partially revert
[noreply] Merge pull request #15863 from [BEAM-13184] Autosharding for
[noreply] [BEAM-11936] Enable FloatingPointAssertionWithinEpsilon errorprone
check
[noreply] [BEAM-11936] Enable LockNotBeforeTry errorprone check (#16259)
[noreply] [BEAM-11936] Enable errorprone unused checks (#16262)
[noreply] Add Nexmark Query 14 (#16337)
[noreply] [BEAM-13015] Migrate all user state and side implementations to
support
------------------------------------------
[...truncated 236.04 KB...]
type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:881 Created job
with id: [2022-01-04_17_20_18-3804186475933556740]
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:882 Submitted job:
2022-01-04_17_20_18-3804186475933556740
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:883 To access the
Dataflow monitoring console, please navigate to
https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-04_17_20_18-3804186475933556740?project=apache-beam-testing
WARNING
apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:63
Waiting indefinitely for streaming job.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:191 Job
2022-01-04_17_20_18-3804186475933556740 is in state JOB_STATE_RUNNING
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:18.121Z: JOB_MESSAGE_BASIC: Streaming Engine auto-enabled. Use
--experiments=disable_streaming_engine to opt out.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:18.122Z: JOB_MESSAGE_BASIC: Dataflow Runner V2 auto-enabled.
Use --experiments=disable_runner_v2 to opt out.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:21.871Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for
Dataflow Streaming Engine. Workers will scale between 1 and 100 unless
maxNumWorkers is specified.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:22.119Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job
2022-01-04_17_20_18-3804186475933556740. The number of workers will be between
1 and 100.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:22.152Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically
enabled for job 2022-01-04_17_20_18-3804186475933556740.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:26.651Z: JOB_MESSAGE_BASIC: Worker configuration:
e2-standard-2 in us-central1-f.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:27.340Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo
operations into optimizable parts.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:27.370Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton
operations into optimizable parts.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:27.461Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey
operations into optimizable parts.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:27.496Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step
assert_that/Group/CoGroupByKeyImpl/GroupByKey: GroupByKey not followed by a
combiner.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:27.526Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step
start/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not
followed by a combiner.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:27.553Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step
compute/View-python_side_input0-compute/GroupByKey: GroupByKey not followed by
a combiner.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:27.581Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step
side/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not
followed by a combiner.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:27.618Z: JOB_MESSAGE_DETAILED: Expanding
SplittableProcessKeyed operations into optimizable parts.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:27.652Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations
into streaming Read/Write steps
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:27.804Z: JOB_MESSAGE_DETAILED: Lifting
ValueCombiningMappingFns into MergeBucketsMappingFns
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:27.971Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner
information.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:28.005Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read,
Write, and Flatten operations
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:28.036Z: JOB_MESSAGE_DEBUG: Inserted coder converter after
flatten ref_AppliedPTransform_assert_that-Group-CoGroupByKeyImpl-Flatten_39
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:28.069Z: JOB_MESSAGE_DETAILED: Unzipping flatten
ref_AppliedPTransform_assert_that-Group-CoGroupByKeyImpl-Flatten_39 for input
ref_AppliedPTransform_assert_that-Group-CoGroupByKeyImpl-Tag-0-_37.None
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:28.102Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of
assert_that/Group/CoGroupByKeyImpl/Flatten/OutputIdentity, through flatten
assert_that/Group/CoGroupByKeyImpl/Flatten, into producer
assert_that/Group/CoGroupByKeyImpl/Tag[0]
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:28.140Z: JOB_MESSAGE_DETAILED: Unzipping flatten
ref_AppliedPTransform_assert_that-Group-CoGroupByKeyImpl-Flatten_39-u36 for
input
ref_AppliedPTransform_assert_that-Group-CoGroupByKeyImpl-Flatten_39.None-c34
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:28.167Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of
assert_that/Group/CoGroupByKeyImpl/GroupByKey/WriteStream, through flatten
assert_that/Group/CoGroupByKeyImpl/Flatten/Unzipped-1, into producer
assert_that/Group/CoGroupByKeyImpl/Flatten/OutputIdentity
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:28.217Z: JOB_MESSAGE_DETAILED: Fusing consumer
assert_that/Group/CoGroupByKeyImpl/Flatten/OutputIdentity into
assert_that/Group/CoGroupByKeyImpl/Tag[1]
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:28.243Z: JOB_MESSAGE_DETAILED: Fusing consumer
assert_that/Group/CoGroupByKeyImpl/GroupByKey/WriteStream into
assert_that/Group/CoGroupByKeyImpl/Flatten/OutputIdentity
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:28.276Z: JOB_MESSAGE_DETAILED: Fusing consumer
side/FlatMap(<lambda at core.py:3228>) into side/Impulse
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:28.307Z: JOB_MESSAGE_DETAILED: Fusing consumer
side/MaybeReshuffle/Reshuffle/AddRandomKeys into side/FlatMap(<lambda at
core.py:3228>)
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:28.371Z: JOB_MESSAGE_DETAILED: Fusing consumer
side/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into
side/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:28.399Z: JOB_MESSAGE_DETAILED: Fusing consumer
side/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into
side/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:28.423Z: JOB_MESSAGE_DETAILED: Fusing consumer
side/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into
side/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:28.467Z: JOB_MESSAGE_DETAILED: Fusing consumer
side/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into
side/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:28.497Z: JOB_MESSAGE_DETAILED: Fusing consumer
side/MaybeReshuffle/Reshuffle/RemoveRandomKeys into
side/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:28.530Z: JOB_MESSAGE_DETAILED: Fusing consumer
side/Map(decode) into side/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:28.559Z: JOB_MESSAGE_DETAILED: Fusing consumer
compute/MapToVoidKey0 into side/Map(decode)
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:28.596Z: JOB_MESSAGE_DETAILED: Fusing consumer
compute/View-python_side_input0-compute/PairWithVoidKey into
compute/MapToVoidKey0
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:28.628Z: JOB_MESSAGE_DETAILED: Fusing consumer
compute/View-python_side_input0-compute/GroupByKey/WriteStream into
compute/View-python_side_input0-compute/PairWithVoidKey
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:28.664Z: JOB_MESSAGE_DETAILED: Fusing consumer
compute/View-python_side_input0-compute/GroupByKey/MergeBuckets into
compute/View-python_side_input0-compute/GroupByKey/ReadStream
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:28.696Z: JOB_MESSAGE_DETAILED: Fusing consumer
compute/View-python_side_input0-compute/Values into
compute/View-python_side_input0-compute/GroupByKey/MergeBuckets
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:28.728Z: JOB_MESSAGE_DETAILED: Fusing consumer
compute/View-python_side_input0-compute/StreamingPCollectionViewWriter into
compute/View-python_side_input0-compute/Values
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:28.759Z: JOB_MESSAGE_DETAILED: Fusing consumer
assert_that/Create/FlatMap(<lambda at core.py:3228>) into
assert_that/Create/Impulse
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:28.794Z: JOB_MESSAGE_DETAILED: Fusing consumer
assert_that/Create/Map(decode) into assert_that/Create/FlatMap(<lambda at
core.py:3228>)
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:28.828Z: JOB_MESSAGE_DETAILED: Fusing consumer
assert_that/Group/CoGroupByKeyImpl/Tag[0] into assert_that/Create/Map(decode)
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:28.862Z: JOB_MESSAGE_DETAILED: Fusing consumer
start/FlatMap(<lambda at core.py:3228>) into start/Impulse
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:28.897Z: JOB_MESSAGE_DETAILED: Fusing consumer
start/MaybeReshuffle/Reshuffle/AddRandomKeys into start/FlatMap(<lambda at
core.py:3228>)
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:28.932Z: JOB_MESSAGE_DETAILED: Fusing consumer
start/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into
start/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:28.966Z: JOB_MESSAGE_DETAILED: Fusing consumer
start/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into
start/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:28.989Z: JOB_MESSAGE_DETAILED: Fusing consumer
start/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into
start/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:29.022Z: JOB_MESSAGE_DETAILED: Fusing consumer
start/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into
start/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:29.054Z: JOB_MESSAGE_DETAILED: Fusing consumer
start/MaybeReshuffle/Reshuffle/RemoveRandomKeys into
start/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:29.089Z: JOB_MESSAGE_DETAILED: Fusing consumer
start/Map(decode) into start/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:29.124Z: JOB_MESSAGE_DETAILED: Fusing consumer compute into
start/Map(decode)
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:29.147Z: JOB_MESSAGE_DETAILED: Fusing consumer
assert_that/WindowInto(WindowIntoFn) into compute
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:29.179Z: JOB_MESSAGE_DETAILED: Fusing consumer
assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:29.218Z: JOB_MESSAGE_DETAILED: Fusing consumer
assert_that/Group/CoGroupByKeyImpl/Tag[1] into assert_that/ToVoidKey
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:29.239Z: JOB_MESSAGE_DETAILED: Fusing consumer
assert_that/Group/CoGroupByKeyImpl/GroupByKey/MergeBuckets into
assert_that/Group/CoGroupByKeyImpl/GroupByKey/ReadStream
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:29.271Z: JOB_MESSAGE_DETAILED: Fusing consumer
assert_that/Group/CoGroupByKeyImpl/MapTuple(collect_values) into
assert_that/Group/CoGroupByKeyImpl/GroupByKey/MergeBuckets
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:29.307Z: JOB_MESSAGE_DETAILED: Fusing consumer
assert_that/Group/RestoreTags into
assert_that/Group/CoGroupByKeyImpl/MapTuple(collect_values)
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:29.336Z: JOB_MESSAGE_DETAILED: Fusing consumer
assert_that/Unkey into assert_that/Group/RestoreTags
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:29.374Z: JOB_MESSAGE_DETAILED: Fusing consumer
assert_that/Match into assert_that/Unkey
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:29.421Z: JOB_MESSAGE_DEBUG: Workflow config is missing a
default resource spec.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:29.450Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and
teardown to workflow graph.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:29.478Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop
steps.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:29.512Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:29.570Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:29.601Z: JOB_MESSAGE_BASIC: Starting 1 workers in
us-central1-f...
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:29.628Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:20:41.721Z: JOB_MESSAGE_BASIC: Your project already contains 100
Dataflow-created metric descriptors, so new user metrics of the form
custom.googleapis.com/* will not be created. However, all user metrics are also
available in the metric dataflow.googleapis.com/job/user_counter. If you rely
on the custom metrics, you can delete old / unused metric descriptors. See
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
and
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:21:12.951Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number
of workers to 1 so that the pipeline can catch up with its backlog and keep up
with its input rate.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:21:38.555Z: JOB_MESSAGE_DETAILED: Workers have started
successfully.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:21:38.587Z: JOB_MESSAGE_DETAILED: Workers have started
successfully.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:26:34.361Z: JOB_MESSAGE_BASIC: Worker configuration:
e2-standard-2 in us-central1-f.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:26:45.774Z: JOB_MESSAGE_DETAILED: Workers have started
successfully.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T01:27:03.350Z: JOB_MESSAGE_BASIC: Your project already contains 100
Dataflow-created metric descriptors, so new user metrics of the form
custom.googleapis.com/* will not be created. However, all user metrics are also
available in the metric dataflow.googleapis.com/job/user_counter. If you rely
on the custom metrics, you can delete old / unused metric descriptors. See
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
and
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO oauth2client.transport:transport.py:183 Refreshing due to a 401 (attempt 1/2)
INFO oauth2client.transport:transport.py:183 Refreshing due to a 401 (attempt 1/2)
INFO oauth2client.transport:transport.py:183 Refreshing due to a 401 (attempt 1/2)
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T02:35:09.803Z: JOB_MESSAGE_BASIC: Cancel request is committed for
workflow job: 2022-01-04_17_20_18-3804186475933556740.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T02:35:09.923Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T02:35:09.990Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T02:35:10.011Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T02:35:10.042Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-01-05T02:35:10.069Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:191 Job
2022-01-04_17_20_18-3804186475933556740 is in state JOB_STATE_CANCELLING
WARNING apache_beam.utils.retry:retry.py:268 Retry with exponential
backoff: waiting for 4.066855117017169 seconds before retrying get_job because
we caught exception: apitools.base.py.exceptions.InvalidUserInputError: Request
missing required parameter jobId
Traceback for above exception (most recent call last):
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/utils/retry.py",>
line 253, in wrapper
return fun(*args, **kwargs)
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",>
line 950, in get_job
response =
self._client.projects_locations_jobs.Get(request)
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",>
line 928, in Get
return self._RunMethod(
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/apitools/base/py/base_api.py",>
line 701, in _RunMethod
http_request =
self.PrepareHttpRequest(
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/apitools/base/py/base_api.py",>
line 683, in PrepareHttpRequest
url_builder.relative_path =
self.__ConstructRelativePath(
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/apitools/base/py/base_api.py",>
line 583, in __ConstructRelativePath
return
util.ExpandRelativePath(method_config, params,
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/apitools/base/py/util.py",>
line 129, in ExpandRelativePath
raise
exceptions.InvalidUserInputError(
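The `get_job` retry waits logged below grow roughly geometrically with random jitter (about 4 s, 7 s, 20 s, 33 s, 43 s, 87 s, then 280 s). A minimal sketch of that pattern, purely illustrative and not Beam's actual `apache_beam.utils.retry` implementation (the names `backoff_delays`, `initial`, `factor`, and `jitter` are invented here):

```python
import random

def backoff_delays(initial=4.0, factor=2.0, jitter=0.5, retries=7):
    """Yield wait times that grow geometrically, each randomized by
    multiplicative jitter so concurrent clients do not retry in lockstep."""
    delay = initial
    for _ in range(retries):
        # Sample uniformly around the nominal delay (0.5x to 1.5x of it here).
        yield delay * random.uniform(1.0 - jitter, 1.0 + jitter)
        delay *= factor

delays = list(backoff_delays())
```

With these defaults the nominal delays are 4, 8, 16, 32, 64, 128, 256 seconds before jitter, which matches the shape (though not the exact values) of the waits in this log.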
WARNING apache_beam.utils.retry:retry.py:268 Retry with exponential
backoff: waiting for 6.877856731642726 seconds before retrying get_job because
we caught exception: apitools.base.py.exceptions.InvalidUserInputError: Request
missing required parameter jobId
Traceback for above exception (most recent call last):
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/utils/retry.py",>
line 253, in wrapper
return fun(*args, **kwargs)
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",>
line 950, in get_job
response =
self._client.projects_locations_jobs.Get(request)
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",>
line 928, in Get
return self._RunMethod(
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/apitools/base/py/base_api.py",>
line 701, in _RunMethod
http_request =
self.PrepareHttpRequest(
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/apitools/base/py/base_api.py",>
line 683, in PrepareHttpRequest
url_builder.relative_path =
self.__ConstructRelativePath(
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/apitools/base/py/base_api.py",>
line 583, in __ConstructRelativePath
return
util.ExpandRelativePath(method_config, params,
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/apitools/base/py/util.py",>
line 129, in ExpandRelativePath
raise
exceptions.InvalidUserInputError(
WARNING apache_beam.utils.retry:retry.py:268 Retry with exponential
backoff: waiting for 19.74560521479694 seconds before retrying get_job because
we caught exception: apitools.base.py.exceptions.InvalidUserInputError: Request
missing required parameter jobId
Traceback for above exception (most recent call last):
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/utils/retry.py",>
line 253, in wrapper
return fun(*args, **kwargs)
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",>
line 950, in get_job
response =
self._client.projects_locations_jobs.Get(request)
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",>
line 928, in Get
return self._RunMethod(
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/apitools/base/py/base_api.py",>
line 701, in _RunMethod
http_request =
self.PrepareHttpRequest(
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/apitools/base/py/base_api.py",>
line 683, in PrepareHttpRequest
url_builder.relative_path =
self.__ConstructRelativePath(
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/apitools/base/py/base_api.py",>
line 583, in __ConstructRelativePath
return
util.ExpandRelativePath(method_config, params,
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/apitools/base/py/util.py",>
line 129, in ExpandRelativePath
raise
exceptions.InvalidUserInputError(
WARNING apache_beam.utils.retry:retry.py:268 Retry with exponential
backoff: waiting for 32.77757035000946 seconds before retrying get_job because
we caught exception: apitools.base.py.exceptions.InvalidUserInputError: Request
missing required parameter jobId
Traceback for above exception (most recent call last):
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/utils/retry.py",>
line 253, in wrapper
return fun(*args, **kwargs)
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",>
line 950, in get_job
response =
self._client.projects_locations_jobs.Get(request)
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",>
line 928, in Get
return self._RunMethod(
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/apitools/base/py/base_api.py",>
line 701, in _RunMethod
http_request =
self.PrepareHttpRequest(
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/apitools/base/py/base_api.py",>
line 683, in PrepareHttpRequest
url_builder.relative_path =
self.__ConstructRelativePath(
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/apitools/base/py/base_api.py",>
line 583, in __ConstructRelativePath
return
util.ExpandRelativePath(method_config, params,
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/apitools/base/py/util.py",>
line 129, in ExpandRelativePath
raise
exceptions.InvalidUserInputError(
WARNING apache_beam.utils.retry:retry.py:268 Retry with exponential
backoff: waiting for 42.63338709109205 seconds before retrying get_job because
we caught exception: apitools.base.py.exceptions.InvalidUserInputError: Request
missing required parameter jobId
Traceback for above exception (most recent call last):
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/utils/retry.py",>
line 253, in wrapper
return fun(*args, **kwargs)
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",>
line 950, in get_job
response =
self._client.projects_locations_jobs.Get(request)
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",>
line 928, in Get
return self._RunMethod(
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/apitools/base/py/base_api.py",>
line 701, in _RunMethod
http_request =
self.PrepareHttpRequest(
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/apitools/base/py/base_api.py",>
line 683, in PrepareHttpRequest
url_builder.relative_path =
self.__ConstructRelativePath(
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/apitools/base/py/base_api.py",>
line 583, in __ConstructRelativePath
return
util.ExpandRelativePath(method_config, params,
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/apitools/base/py/util.py",>
line 129, in ExpandRelativePath
raise
exceptions.InvalidUserInputError(
WARNING apache_beam.utils.retry:retry.py:268 Retry with exponential
backoff: waiting for 86.77356552960889 seconds before retrying get_job because
we caught exception: apitools.base.py.exceptions.InvalidUserInputError: Request
missing required parameter jobId
Traceback for above exception (most recent call last):
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/utils/retry.py",>
line 253, in wrapper
return fun(*args, **kwargs)
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",>
line 950, in get_job
response =
self._client.projects_locations_jobs.Get(request)
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",>
line 928, in Get
return self._RunMethod(
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/apitools/base/py/base_api.py",>
line 701, in _RunMethod
http_request =
self.PrepareHttpRequest(
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/apitools/base/py/base_api.py",>
line 683, in PrepareHttpRequest
url_builder.relative_path =
self.__ConstructRelativePath(
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/apitools/base/py/base_api.py",>
line 583, in __ConstructRelativePath
return
util.ExpandRelativePath(method_config, params,
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/apitools/base/py/util.py",>
line 129, in ExpandRelativePath
raise
exceptions.InvalidUserInputError(
WARNING  apache_beam.utils.retry:retry.py:268 Retry with exponential backoff: waiting for 279.9791314852938 seconds before retrying get_job because we caught exception: apitools.base.py.exceptions.InvalidUserInputError: Request missing required parameter jobId
 Traceback for above exception (most recent call last):
  File "https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/utils/retry.py", line 253, in wrapper
    return fun(*args, **kwargs)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py", line 950, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py", line 928, in Get
    return self._RunMethod(
  File "https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/apitools/base/py/base_api.py", line 701, in _RunMethod
    http_request = self.PrepareHttpRequest(
  File "https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/apitools/base/py/base_api.py", line 683, in PrepareHttpRequest
    url_builder.relative_path = self.__ConstructRelativePath(
  File "https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/apitools/base/py/base_api.py", line 583, in __ConstructRelativePath
    return util.ExpandRelativePath(method_config, params,
  File "https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/apitools/base/py/util.py", line 129, in ExpandRelativePath
    raise exceptions.InvalidUserInputError(
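The two WARNING lines above come from Beam's exponential-backoff retry wrapper around get_job (visible in the trace as apache_beam/utils/retry.py). As a rough illustration only, the following sketch shows how such a backoff schedule behaves; the function, parameter names, and defaults here are illustrative assumptions, not Beam's actual implementation:

```python
import random

def exponential_backoff(initial_delay_s=5.0, factor=2.0, max_retries=10,
                        fuzz=0.5, seed=None):
    """Yield retry delays that grow geometrically, with optional jitter.

    Illustrative sketch only; not the real apache_beam.utils.retry API.
    """
    rng = random.Random(seed)
    delay = initial_delay_s
    for _ in range(max_retries):
        # Jitter spreads retries out so many clients that failed at the
        # same moment do not retry in lockstep against the same service.
        jitter = 1.0 + fuzz * (rng.random() - 0.5)
        yield delay * jitter
        delay *= factor

# With jitter disabled the schedule is a plain geometric progression.
delays = list(exponential_backoff(initial_delay_s=5.0, factor=2.0,
                                  max_retries=4, fuzz=0.0))
# -> [5.0, 10.0, 20.0, 40.0]
```

With jitter enabled (fuzz > 0), consecutive waits such as the ~86 s and ~280 s seen in this log need not form an exact geometric progression.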
=============================== warnings summary ===============================
https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py:42 (repeated 9 times)
  DeprecationWarning: "@coroutine" decorator is deprecated since Python 3.8, use "async def" instead
    def call(self, fn, *args, **kwargs):
-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/pytest_validatesRunnerStreamingTests-df-py38-xdist.xml -
======== 1 failed, 29 passed, 3 skipped, 8 warnings in 7173.33 seconds =========
> Task :sdks:python:test-suites:dataflow:py38:validatesRunnerStreamingTests FAILED
FAILURE: Build failed with an exception.
* Where:
Script 'https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle' line: 226
* What went wrong:
Execution failed for task
':sdks:python:test-suites:dataflow:py38:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 8.0.
You can use '--warning-mode all' to show the individual deprecation warnings
and determine if they come from your own scripts or plugins.
See
https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 2h 42m 44s
83 actionable tasks: 58 executed, 23 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/xiqjg25igddke
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]