See
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/2958/display/redirect?page=changes>
Changes:
[chamikaramj] Adds several example multi-language Python pipelines
[noreply] [BEAM-13399] Move service liveness polling to Runner type (#16487)
------------------------------------------
[...truncated 85.54 KB...]
    some_pairs = [('crouton', 17), ('supreme', None)]
    pipeline = self.create_pipeline()
    main_input = pipeline | 'main input' >> beam.Create([1])
    side_list = pipeline | 'side list' >> beam.Create(a_list)
    side_pairs = pipeline | 'side pairs' >> beam.Create(some_pairs)
    results = main_input | 'concatenate' >> beam.Map(
        lambda x,
        the_list,
        the_dict: [x, the_list, the_dict],
        beam.pvalue.AsList(side_list),
        beam.pvalue.AsDict(side_pairs))

    def matcher(expected_elem, expected_list, expected_pairs):
      def match(actual):
        [[actual_elem, actual_list, actual_dict]] = actual
        equal_to([expected_elem])([actual_elem])
        equal_to(expected_list)(actual_list)
        equal_to(expected_pairs)(actual_dict.items())

      return match

    assert_that(results, matcher(1, a_list, some_pairs))
>   pipeline.run()

apache_beam/transforms/sideinputs_test.py:247:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
apache_beam/testing/test_pipeline.py:112: in run
    result = super().run(
apache_beam/pipeline.py:546: in run
    return Pipeline.from_runner_api(
apache_beam/pipeline.py:573: in run
    return self.runner.run_pipeline(self, self._options)
apache_beam/runners/dataflow/test_dataflow_runner.py:64: in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <DataflowPipelineResult <Job
clientRequestId: '20220113070108308838-7016'
createTime: '2022-01-13T07:01:17.713976Z'
...01-13T07:01:17.713976Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)> at 0x7f6c66401550>
duration = None
    def wait_until_finish(self, duration=None):
      if not self.is_in_terminal_state():
        if not self.has_job:
          raise IOError('Failed to get the Dataflow job id.')

        thread = threading.Thread(
            target=DataflowRunner.poll_for_job_completion,
            args=(self._runner, self, duration))

        # Mark the thread as a daemon thread so a keyboard interrupt on the main
        # thread will terminate everything. This is also the reason we will not
        # use thread.join() to wait for the polling thread.
        thread.daemon = True
        thread.start()
        while thread.is_alive():
>         time.sleep(5.0)
E         Failed: Timeout >4500.0s

apache_beam/runners/dataflow/dataflow_runner.py:1625: Failed
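
For context, the failing test exercises Beam's AsList/AsDict side inputs. Below is a minimal sketch of the same pattern, runnable locally on the DirectRunner rather than Dataflow; the value of a_list is an assumption here, since the log above is truncated before its definition.

    import apache_beam as beam
    from apache_beam.testing.util import assert_that, equal_to

    # Hypothetical inputs: a_list is not shown in the truncated log, so a small list is assumed.
    a_list = [5, 4, 3]
    some_pairs = [('crouton', 17), ('supreme', None)]

    with beam.Pipeline() as pipeline:  # DirectRunner by default
      main_input = pipeline | 'main input' >> beam.Create([1])
      side_list = pipeline | 'side list' >> beam.Create(a_list)
      side_pairs = pipeline | 'side pairs' >> beam.Create(some_pairs)

      # Each side input is materialized (as a list and as a dict) and passed to
      # the lambda alongside every main-input element; sorting keeps the output
      # deterministic so it can be asserted on.
      results = main_input | 'concatenate' >> beam.Map(
          lambda x, the_list, the_dict:
              (x, sorted(the_list), sorted(the_dict.items(), key=str)),
          beam.pvalue.AsList(side_list),
          beam.pvalue.AsDict(side_pairs))

      assert_that(
          results,
          equal_to([(1, [3, 4, 5], [('crouton', 17), ('supreme', None)])]))
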
------------------------------ Captured log call -------------------------------
INFO     apache_beam.runners.portability.stager:stager.py:693 Executing command: ['https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/bin/python3.8', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', '/tmp/tmp7q5aypoe/tmp_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']
INFO     apache_beam.runners.portability.stager:stager.py:303 Copying Beam SDK "https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/build/apache-beam.tar.gz" to staging location.
WARNING  root:environments.py:371 Make sure that locally built Python SDK docker image has Python 3.8 interpreter.
INFO     root:environments.py:380 Default Python SDK image for environment is apache/beam_python3.8_sdk:2.37.0.dev
INFO     root:environments.py:295 Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python38-fnapi:beam-master-20211222
INFO     root:environments.py:302 Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python38-fnapi:beam-master-20211222" for Docker environment
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:699 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113070108-307607.1642057268.307774/requirements.txt...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:715 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113070108-307607.1642057268.307774/requirements.txt in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:699 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113070108-307607.1642057268.307774/pbr-5.8.0.tar.gz...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:715 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113070108-307607.1642057268.307774/pbr-5.8.0.tar.gz in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:699 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113070108-307607.1642057268.307774/mock-2.0.0.tar.gz...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:715 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113070108-307607.1642057268.307774/mock-2.0.0.tar.gz in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:699 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113070108-307607.1642057268.307774/six-1.16.0.tar.gz...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:715 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113070108-307607.1642057268.307774/six-1.16.0.tar.gz in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:699 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113070108-307607.1642057268.307774/soupsieve-2.3.1.tar.gz...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:715 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113070108-307607.1642057268.307774/soupsieve-2.3.1.tar.gz in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:699 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113070108-307607.1642057268.307774/PyHamcrest-1.10.1.tar.gz...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:715 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113070108-307607.1642057268.307774/PyHamcrest-1.10.1.tar.gz in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:699 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113070108-307607.1642057268.307774/parameterized-0.7.5.tar.gz...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:715 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113070108-307607.1642057268.307774/parameterized-0.7.5.tar.gz in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:699 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113070108-307607.1642057268.307774/beautifulsoup4-4.10.0.tar.gz...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:715 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113070108-307607.1642057268.307774/beautifulsoup4-4.10.0.tar.gz in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:699 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113070108-307607.1642057268.307774/dataflow_python_sdk.tar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:715 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113070108-307607.1642057268.307774/dataflow_python_sdk.tar in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:699 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113070108-307607.1642057268.307774/dataflow-worker.jar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:715 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113070108-307607.1642057268.307774/dataflow-worker.jar in 6 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:699 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113070108-307607.1642057268.307774/pipeline.pb...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:715 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113070108-307607.1642057268.307774/pipeline.pb in 0 seconds.
WARNING  apache_beam.options.pipeline_options:pipeline_options.py:309 Discarding unparseable args: ['--sleep_secs=20']
WARNING  apache_beam.options.pipeline_options:pipeline_options.py:309 Discarding unparseable args: ['--sleep_secs=20']
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:879 Create job: <Job
 clientRequestId: '20220113070108308838-7016'
 createTime: '2022-01-13T07:01:17.713976Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-01-12_23_01_16-12345785048189360045'
 location: 'us-central1'
 name: 'beamapp-jenkins-0113070108-307607'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-01-13T07:01:17.713976Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:881 Created job with id: [2022-01-12_23_01_16-12345785048189360045]
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:882 Submitted job: 2022-01-12_23_01_16-12345785048189360045
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:883 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-12_23_01_16-12345785048189360045?project=apache-beam-testing
WARNING  apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:63 Waiting indefinitely for streaming job.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:191 Job 2022-01-12_23_01_16-12345785048189360045 is in state JOB_STATE_RUNNING
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:20.408Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:22.644Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2022-01-12_23_01_16-12345785048189360045. The number of workers will be between 1 and 100.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:22.686Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2022-01-12_23_01_16-12345785048189360045.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:30.364Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:31.880Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:31.918Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:31.992Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:32.036Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/CoGroupByKeyImpl/GroupByKey: GroupByKey not followed by a combiner.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:32.071Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step concatenate/View-python_side_input1-concatenate/GroupByKey: GroupByKey not followed by a combiner.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:32.105Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step side pairs/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:32.169Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step concatenate/View-python_side_input0-concatenate/GroupByKey: GroupByKey not followed by a combiner.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:32.200Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step side list/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:32.263Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:32.296Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:32.515Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:32.724Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:32.780Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:32.818Z: JOB_MESSAGE_DEBUG: Inserted coder converter after flatten ref_AppliedPTransform_assert_that-Group-CoGroupByKeyImpl-Flatten_44
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:32.875Z: JOB_MESSAGE_DETAILED: Unzipping flatten ref_AppliedPTransform_assert_that-Group-CoGroupByKeyImpl-Flatten_44 for input ref_AppliedPTransform_assert_that-Group-CoGroupByKeyImpl-Tag-0-_42.None
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:32.917Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/CoGroupByKeyImpl/Flatten/OutputIdentity, through flatten assert_that/Group/CoGroupByKeyImpl/Flatten, into producer assert_that/Group/CoGroupByKeyImpl/Tag[0]
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:32.947Z: JOB_MESSAGE_DETAILED: Unzipping flatten ref_AppliedPTransform_assert_that-Group-CoGroupByKeyImpl-Flatten_44-u51 for input ref_AppliedPTransform_assert_that-Group-CoGroupByKeyImpl-Flatten_44.None-c49
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:32.979Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/CoGroupByKeyImpl/GroupByKey/WriteStream, through flatten assert_that/Group/CoGroupByKeyImpl/Flatten/Unzipped-1, into producer assert_that/Group/CoGroupByKeyImpl/Flatten/OutputIdentity
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:33.011Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/CoGroupByKeyImpl/Flatten/OutputIdentity into assert_that/Group/CoGroupByKeyImpl/Tag[1]
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:33.045Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/CoGroupByKeyImpl/GroupByKey/WriteStream into assert_that/Group/CoGroupByKeyImpl/Flatten/OutputIdentity
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:33.081Z: JOB_MESSAGE_DETAILED: Fusing consumer side list/FlatMap(<lambda at core.py:3228>) into side list/Impulse
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:33.110Z: JOB_MESSAGE_DETAILED: Fusing consumer side list/MaybeReshuffle/Reshuffle/AddRandomKeys into side list/FlatMap(<lambda at core.py:3228>)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:33.144Z: JOB_MESSAGE_DETAILED: Fusing consumer side list/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into side list/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:33.176Z: JOB_MESSAGE_DETAILED: Fusing consumer side list/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into side list/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:33.212Z: JOB_MESSAGE_DETAILED: Fusing consumer side list/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into side list/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:33.246Z: JOB_MESSAGE_DETAILED: Fusing consumer side list/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into side list/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:33.282Z: JOB_MESSAGE_DETAILED: Fusing consumer side list/MaybeReshuffle/Reshuffle/RemoveRandomKeys into side list/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:33.306Z: JOB_MESSAGE_DETAILED: Fusing consumer side list/Map(decode) into side list/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:33.338Z: JOB_MESSAGE_DETAILED: Fusing consumer concatenate/View-python_side_input0-concatenate/PairWithVoidKey into side list/Map(decode)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:33.394Z: JOB_MESSAGE_DETAILED: Fusing consumer concatenate/View-python_side_input0-concatenate/GroupByKey/WriteStream into concatenate/View-python_side_input0-concatenate/PairWithVoidKey
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:33.448Z: JOB_MESSAGE_DETAILED: Fusing consumer concatenate/View-python_side_input0-concatenate/GroupByKey/MergeBuckets into concatenate/View-python_side_input0-concatenate/GroupByKey/ReadStream
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:33.483Z: JOB_MESSAGE_DETAILED: Fusing consumer concatenate/View-python_side_input0-concatenate/Values into concatenate/View-python_side_input0-concatenate/GroupByKey/MergeBuckets
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:33.526Z: JOB_MESSAGE_DETAILED: Fusing consumer concatenate/View-python_side_input0-concatenate/StreamingPCollectionViewWriter into concatenate/View-python_side_input0-concatenate/Values
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:33.569Z: JOB_MESSAGE_DETAILED: Fusing consumer side pairs/FlatMap(<lambda at core.py:3228>) into side pairs/Impulse
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:33.608Z: JOB_MESSAGE_DETAILED: Fusing consumer side pairs/MaybeReshuffle/Reshuffle/AddRandomKeys into side pairs/FlatMap(<lambda at core.py:3228>)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:33.634Z: JOB_MESSAGE_DETAILED: Fusing consumer side pairs/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into side pairs/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:33.680Z: JOB_MESSAGE_DETAILED: Fusing consumer side pairs/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into side pairs/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:33.724Z: JOB_MESSAGE_DETAILED: Fusing consumer side pairs/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into side pairs/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:33.765Z: JOB_MESSAGE_DETAILED: Fusing consumer side pairs/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into side pairs/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:33.798Z: JOB_MESSAGE_DETAILED: Fusing consumer side pairs/MaybeReshuffle/Reshuffle/RemoveRandomKeys into side pairs/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:33.829Z: JOB_MESSAGE_DETAILED: Fusing consumer side pairs/Map(decode) into side pairs/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:33.885Z: JOB_MESSAGE_DETAILED: Fusing consumer concatenate/View-python_side_input1-concatenate/PairWithVoidKey into side pairs/Map(decode)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:33.944Z: JOB_MESSAGE_DETAILED: Fusing consumer concatenate/View-python_side_input1-concatenate/GroupByKey/WriteStream into concatenate/View-python_side_input1-concatenate/PairWithVoidKey
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:33.991Z: JOB_MESSAGE_DETAILED: Fusing consumer concatenate/View-python_side_input1-concatenate/GroupByKey/MergeBuckets into concatenate/View-python_side_input1-concatenate/GroupByKey/ReadStream
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:34.031Z: JOB_MESSAGE_DETAILED: Fusing consumer concatenate/View-python_side_input1-concatenate/Values into concatenate/View-python_side_input1-concatenate/GroupByKey/MergeBuckets
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:34.078Z: JOB_MESSAGE_DETAILED: Fusing consumer concatenate/View-python_side_input1-concatenate/StreamingPCollectionViewWriter into concatenate/View-python_side_input1-concatenate/Values
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:34.109Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/FlatMap(<lambda at core.py:3228>) into assert_that/Create/Impulse
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:34.139Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/Map(decode) into assert_that/Create/FlatMap(<lambda at core.py:3228>)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:34.172Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/CoGroupByKeyImpl/Tag[0] into assert_that/Create/Map(decode)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:34.215Z: JOB_MESSAGE_DETAILED: Fusing consumer main input/FlatMap(<lambda at core.py:3228>) into main input/Impulse
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:34.257Z: JOB_MESSAGE_DETAILED: Fusing consumer main input/Map(decode) into main input/FlatMap(<lambda at core.py:3228>)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:34.291Z: JOB_MESSAGE_DETAILED: Fusing consumer concatenate into main input/Map(decode)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:34.323Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into concatenate
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:34.361Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:34.383Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/CoGroupByKeyImpl/Tag[1] into assert_that/ToVoidKey
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:34.443Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/CoGroupByKeyImpl/GroupByKey/MergeBuckets into assert_that/Group/CoGroupByKeyImpl/GroupByKey/ReadStream
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:34.490Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/CoGroupByKeyImpl/MapTuple(collect_values) into assert_that/Group/CoGroupByKeyImpl/GroupByKey/MergeBuckets
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:34.528Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/RestoreTags into assert_that/Group/CoGroupByKeyImpl/MapTuple(collect_values)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:34.562Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/RestoreTags
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:34.604Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:34.689Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:34.727Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:34.751Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:34.785Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:34.849Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:34.880Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:36.933Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:01:40.701Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:02:20.818Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:02:49.871Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:02:49.896Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:09:34.442Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:09:52.178Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T07:10:00.328Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO     oauth2client.transport:transport.py:183 Refreshing due to a 401 (attempt 1/2)
INFO     oauth2client.transport:transport.py:183 Refreshing due to a 401 (attempt 1/2)
INFO     oauth2client.transport:transport.py:183 Refreshing due to a 401 (attempt 1/2)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:191 Job 2022-01-12_23_01_16-12345785048189360045 is in state JOB_STATE_CANCELLING
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T08:16:07.453Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2022-01-12_23_01_16-12345785048189360045.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T08:16:08.211Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T08:16:08.248Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T08:16:08.274Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T08:16:08.310Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-01-13T08:16:08.331Z: JOB_MESSAGE_BASIC: Stopping worker pool...
=============================== warnings summary ===============================
https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py:42
https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py:42
https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py:42
https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py:42
https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py:42
https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py:42
https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py:42
https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py:42
https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py:42: DeprecationWarning: "@coroutine" decorator is deprecated since Python 3.8, use "async def" instead
    def call(self, fn, *args, **kwargs):
-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/pytest_validatesRunnerStreamingTests-df-py38-xdist.xml -
======== 1 failed, 31 passed, 1 skipped, 8 warnings in 6131.95 seconds =========
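
The single failure above is the 4500.0 s pytest timeout from the traceback: wait_until_finish polls the streaming Dataflow job in a daemon thread with an unbounded "while thread.is_alive(): time.sleep(5.0)" loop, and the TestDataflowRunner logged "Waiting indefinitely for streaming job" before the job was finally cancelled. A generic sketch (not Beam's implementation) of bounding that kind of polling loop with an explicit deadline:

    import threading
    import time

    def wait_with_deadline(thread, deadline_secs, poll_secs=5.0):
        """Poll a daemon worker thread, but give up after deadline_secs.

        Returns True if the thread finished in time, False on timeout.
        Generic illustration only; names and parameters are assumptions.
        """
        deadline = time.monotonic() + deadline_secs
        while thread.is_alive():
            if time.monotonic() >= deadline:
                return False
            # Sleep in short increments so the deadline is honored promptly.
            time.sleep(max(0.0, min(poll_secs, deadline - time.monotonic())))
        return True

    # Example: a worker that takes 2 seconds, waited on with a 10-second budget.
    worker = threading.Thread(target=time.sleep, args=(2,), daemon=True)
    worker.start()
    print(wait_with_deadline(worker, deadline_secs=10))  # True
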
> Task :sdks:python:test-suites:dataflow:py38:validatesRunnerStreamingTests FAILED
FAILURE: Build failed with an exception.
* Where:
Script 'https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/common.gradle' line: 226
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 8.0.
You can use '--warning-mode all' to show the individual deprecation warnings
and determine if they come from your own scripts or plugins.
See
https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 2h 30m 20s
93 actionable tasks: 58 executed, 33 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/jdzl7n6wvsjli
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]