See 
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/2958/display/redirect?page=changes>

Changes:

[chamikaramj] Adds several example multi-language Python pipelines

[noreply] [BEAM-13399] Move service liveness polling to Runner type (#16487)


------------------------------------------
[...truncated 85.54 KB...]
      some_pairs = [('crouton', 17), ('supreme', None)]
      pipeline = self.create_pipeline()
      main_input = pipeline | 'main input' >> beam.Create([1])
      side_list = pipeline | 'side list' >> beam.Create(a_list)
      side_pairs = pipeline | 'side pairs' >> beam.Create(some_pairs)
      results = main_input | 'concatenate' >> beam.Map(
          lambda x,
          the_list,
          the_dict: [x, the_list, the_dict],
          beam.pvalue.AsList(side_list),
          beam.pvalue.AsDict(side_pairs))
    
      def matcher(expected_elem, expected_list, expected_pairs):
        def match(actual):
          [[actual_elem, actual_list, actual_dict]] = actual
          equal_to([expected_elem])([actual_elem])
          equal_to(expected_list)(actual_list)
          equal_to(expected_pairs)(actual_dict.items())
    
        return match
    
      assert_that(results, matcher(1, a_list, some_pairs))
>     pipeline.run()

apache_beam/transforms/sideinputs_test.py:247: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
apache_beam/testing/test_pipeline.py:112: in run
    result = super().run(
apache_beam/pipeline.py:546: in run
    return Pipeline.from_runner_api(
apache_beam/pipeline.py:573: in run
    return self.runner.run_pipeline(self, self._options)
apache_beam/runners/dataflow/test_dataflow_runner.py:64: in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <DataflowPipelineResult <Job
 clientRequestId: '20220113070108308838-7016'
 createTime: '2022-01-13T07:01:17.713976Z'
...01-13T07:01:17.713976Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)> at 0x7f6c66401550>
duration = None

    def wait_until_finish(self, duration=None):
      if not self.is_in_terminal_state():
        if not self.has_job:
          raise IOError('Failed to get the Dataflow job id.')
    
        thread = threading.Thread(
            target=DataflowRunner.poll_for_job_completion,
            args=(self._runner, self, duration))
    
        # Mark the thread as a daemon thread so a keyboard interrupt on the main
        # thread will terminate everything. This is also the reason we will not
        # use thread.join() to wait for the polling thread.
        thread.daemon = True
        thread.start()
        while thread.is_alive():
>         time.sleep(5.0)
E         Failed: Timeout >4500.0s

apache_beam/runners/dataflow/dataflow_runner.py:1625: Failed
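
For reference, the failing assertion in sideinputs_test.py exercises list- and
dict-valued side inputs. Below is a minimal sketch of the same pattern that runs
locally on the DirectRunner; it is illustrative only, and the a_list values are
invented because the original assignment was lost in the truncated log above.

    import apache_beam as beam
    from apache_beam.testing.util import assert_that, equal_to

    a_list = [5, 1, 3, 2, 9]  # invented; the real values were truncated above
    some_pairs = [('crouton', 17), ('supreme', None)]

    with beam.Pipeline() as pipeline:  # DirectRunner by default
      main_input = pipeline | 'main input' >> beam.Create([1])
      side_list = pipeline | 'side list' >> beam.Create(a_list)
      side_pairs = pipeline | 'side pairs' >> beam.Create(some_pairs)

      # Each main element sees the side inputs materialized as a list and a dict.
      results = main_input | 'concatenate' >> beam.Map(
          lambda x, the_list, the_dict:
              (x, sorted(the_list), sorted(the_dict.items())),
          beam.pvalue.AsList(side_list),
          beam.pvalue.AsDict(side_pairs))

      assert_that(results, equal_to([(1, sorted(a_list), sorted(some_pairs))]))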
------------------------------ Captured log call -------------------------------
INFO     apache_beam.runners.portability.stager:stager.py:693 Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/bin/python3.8>', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', '/tmp/tmp7q5aypoe/tmp_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']
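
The pip download command above is the stager populating a local requirements
cache before uploading source tarballs to the staging location. A minimal
sketch, not this job's actual invocation, of the pipeline options that trigger
that step (project and bucket names are placeholders):

    from apache_beam.options.pipeline_options import PipelineOptions

    # Passing --requirements_file is what makes the stager run
    # "pip download --dest <cache> -r <file> --no-binary :all:" and upload the
    # resulting source tarballs (pbr, mock, six, ...) to the staging location.
    options = PipelineOptions(
        runner='DataflowRunner',
        project='my-project',                  # placeholder
        region='us-central1',
        temp_location='gs://my-bucket/temp',   # placeholder bucket
        requirements_file='requirements.txt')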
INFO     apache_beam.runners.portability.stager:stager.py:303 Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
WARNING  root:environments.py:371 Make sure that locally built Python 
SDK docker image has Python 3.8 interpreter.
INFO     root:environments.py:380 Default Python SDK image for 
environment is apache/beam_python3.8_sdk:2.37.0.dev
INFO     root:environments.py:295 Using provided Python SDK container 
image: gcr.io/cloud-dataflow/v1beta3/python38-fnapi:beam-master-20211222
INFO     root:environments.py:302 Python SDK container image set to 
"gcr.io/cloud-dataflow/v1beta3/python38-fnapi:beam-master-20211222" for Docker 
environment
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:699 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113070108-307607.1642057268.307774/requirements.txt...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:715 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113070108-307607.1642057268.307774/requirements.txt
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:699 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113070108-307607.1642057268.307774/pbr-5.8.0.tar.gz...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:715 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113070108-307607.1642057268.307774/pbr-5.8.0.tar.gz
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:699 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113070108-307607.1642057268.307774/mock-2.0.0.tar.gz...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:715 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113070108-307607.1642057268.307774/mock-2.0.0.tar.gz
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:699 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113070108-307607.1642057268.307774/six-1.16.0.tar.gz...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:715 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113070108-307607.1642057268.307774/six-1.16.0.tar.gz
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:699 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113070108-307607.1642057268.307774/soupsieve-2.3.1.tar.gz...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:715 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113070108-307607.1642057268.307774/soupsieve-2.3.1.tar.gz
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:699 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113070108-307607.1642057268.307774/PyHamcrest-1.10.1.tar.gz...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:715 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113070108-307607.1642057268.307774/PyHamcrest-1.10.1.tar.gz
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:699 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113070108-307607.1642057268.307774/parameterized-0.7.5.tar.gz...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:715 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113070108-307607.1642057268.307774/parameterized-0.7.5.tar.gz
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:699 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113070108-307607.1642057268.307774/beautifulsoup4-4.10.0.tar.gz...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:715 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113070108-307607.1642057268.307774/beautifulsoup4-4.10.0.tar.gz
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:699 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113070108-307607.1642057268.307774/dataflow_python_sdk.tar...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:715 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113070108-307607.1642057268.307774/dataflow_python_sdk.tar
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:699 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113070108-307607.1642057268.307774/dataflow-worker.jar...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:715 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113070108-307607.1642057268.307774/dataflow-worker.jar
 in 6 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:699 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113070108-307607.1642057268.307774/pipeline.pb...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:715 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113070108-307607.1642057268.307774/pipeline.pb
 in 0 seconds.
WARNING  apache_beam.options.pipeline_options:pipeline_options.py:309 
Discarding unparseable args: ['--sleep_secs=20']
WARNING  apache_beam.options.pipeline_options:pipeline_options.py:309 
Discarding unparseable args: ['--sleep_secs=20']
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:879 Create job: <Job
 clientRequestId: '20220113070108308838-7016'
 createTime: '2022-01-13T07:01:17.713976Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-01-12_23_01_16-12345785048189360045'
 location: 'us-central1'
 name: 'beamapp-jenkins-0113070108-307607'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-01-13T07:01:17.713976Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:881 Created job 
with id: [2022-01-12_23_01_16-12345785048189360045]
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:882 Submitted job: 
2022-01-12_23_01_16-12345785048189360045
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:883 To access the 
Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2022-01-12_23_01_16-12345785048189360045?project=apache-beam-testing
WARNING  
apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:63 
Waiting indefinitely for streaming job.
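
The Failed: Timeout above comes from the polling loop shown in the traceback:
wait_until_finish spins in a daemon thread, and for streaming jobs the test
runner waits indefinitely until pytest's 4500 s timeout fires. A hedged sketch
of bounding the wait from user code instead (project, region and bucket are
placeholders; for the Dataflow runner the duration argument is a timeout in
milliseconds):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions
    from apache_beam.runners.runner import PipelineState

    # Placeholder options; a real run needs a valid project, region and bucket.
    options = PipelineOptions(
        runner='DataflowRunner',
        project='my-project',
        region='us-central1',
        temp_location='gs://my-bucket/temp',
        streaming=True)

    pipeline = beam.Pipeline(options=options)
    _ = pipeline | beam.Create([1]) | beam.Map(lambda x: x)

    result = pipeline.run()
    # Give up after 30 minutes instead of polling forever as the test above does.
    result.wait_until_finish(duration=30 * 60 * 1000)
    if result.state not in (PipelineState.DONE, PipelineState.CANCELLED):
      result.cancel()  # stop the streaming job rather than leaving it running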
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:191 Job 
2022-01-12_23_01_16-12345785048189360045 is in state JOB_STATE_RUNNING
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:20.408Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for 
Dataflow Streaming Engine. Workers will scale between 1 and 100 unless 
maxNumWorkers is specified.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:22.644Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 
2022-01-12_23_01_16-12345785048189360045. The number of workers will be between 
1 and 100.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:22.686Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically 
enabled for job 2022-01-12_23_01_16-12345785048189360045.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:30.364Z: JOB_MESSAGE_BASIC: Worker configuration: 
e2-standard-2 in us-central1-b.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:31.880Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo 
operations into optimizable parts.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:31.918Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton 
operations into optimizable parts.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:31.992Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey 
operations into optimizable parts.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:32.036Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step 
assert_that/Group/CoGroupByKeyImpl/GroupByKey: GroupByKey not followed by a 
combiner.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:32.071Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step 
concatenate/View-python_side_input1-concatenate/GroupByKey: GroupByKey not 
followed by a combiner.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:32.105Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step 
side pairs/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not 
followed by a combiner.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:32.169Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step 
concatenate/View-python_side_input0-concatenate/GroupByKey: GroupByKey not 
followed by a combiner.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:32.200Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step 
side list/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not 
followed by a combiner.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:32.263Z: JOB_MESSAGE_DETAILED: Expanding 
SplittableProcessKeyed operations into optimizable parts.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:32.296Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations 
into streaming Read/Write steps
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:32.515Z: JOB_MESSAGE_DETAILED: Lifting 
ValueCombiningMappingFns into MergeBucketsMappingFns
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:32.724Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner 
information.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:32.780Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, 
Write, and Flatten operations
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:32.818Z: JOB_MESSAGE_DEBUG: Inserted coder converter after 
flatten ref_AppliedPTransform_assert_that-Group-CoGroupByKeyImpl-Flatten_44
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:32.875Z: JOB_MESSAGE_DETAILED: Unzipping flatten 
ref_AppliedPTransform_assert_that-Group-CoGroupByKeyImpl-Flatten_44 for input 
ref_AppliedPTransform_assert_that-Group-CoGroupByKeyImpl-Tag-0-_42.None
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:32.917Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of 
assert_that/Group/CoGroupByKeyImpl/Flatten/OutputIdentity, through flatten 
assert_that/Group/CoGroupByKeyImpl/Flatten, into producer 
assert_that/Group/CoGroupByKeyImpl/Tag[0]
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:32.947Z: JOB_MESSAGE_DETAILED: Unzipping flatten 
ref_AppliedPTransform_assert_that-Group-CoGroupByKeyImpl-Flatten_44-u51 for 
input 
ref_AppliedPTransform_assert_that-Group-CoGroupByKeyImpl-Flatten_44.None-c49
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:32.979Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of 
assert_that/Group/CoGroupByKeyImpl/GroupByKey/WriteStream, through flatten 
assert_that/Group/CoGroupByKeyImpl/Flatten/Unzipped-1, into producer 
assert_that/Group/CoGroupByKeyImpl/Flatten/OutputIdentity
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:33.011Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/CoGroupByKeyImpl/Flatten/OutputIdentity into 
assert_that/Group/CoGroupByKeyImpl/Tag[1]
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:33.045Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/CoGroupByKeyImpl/GroupByKey/WriteStream into 
assert_that/Group/CoGroupByKeyImpl/Flatten/OutputIdentity
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:33.081Z: JOB_MESSAGE_DETAILED: Fusing consumer side 
list/FlatMap(<lambda at core.py:3228>) into side list/Impulse
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:33.110Z: JOB_MESSAGE_DETAILED: Fusing consumer side 
list/MaybeReshuffle/Reshuffle/AddRandomKeys into side list/FlatMap(<lambda at 
core.py:3228>)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:33.144Z: JOB_MESSAGE_DETAILED: Fusing consumer side 
list/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into side 
list/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:33.176Z: JOB_MESSAGE_DETAILED: Fusing consumer side 
list/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into side 
list/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:33.212Z: JOB_MESSAGE_DETAILED: Fusing consumer side 
list/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into side 
list/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:33.246Z: JOB_MESSAGE_DETAILED: Fusing consumer side 
list/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into 
side list/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:33.282Z: JOB_MESSAGE_DETAILED: Fusing consumer side 
list/MaybeReshuffle/Reshuffle/RemoveRandomKeys into side 
list/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:33.306Z: JOB_MESSAGE_DETAILED: Fusing consumer side 
list/Map(decode) into side list/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:33.338Z: JOB_MESSAGE_DETAILED: Fusing consumer 
concatenate/View-python_side_input0-concatenate/PairWithVoidKey into side 
list/Map(decode)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:33.394Z: JOB_MESSAGE_DETAILED: Fusing consumer 
concatenate/View-python_side_input0-concatenate/GroupByKey/WriteStream into 
concatenate/View-python_side_input0-concatenate/PairWithVoidKey
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:33.448Z: JOB_MESSAGE_DETAILED: Fusing consumer 
concatenate/View-python_side_input0-concatenate/GroupByKey/MergeBuckets into 
concatenate/View-python_side_input0-concatenate/GroupByKey/ReadStream
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:33.483Z: JOB_MESSAGE_DETAILED: Fusing consumer 
concatenate/View-python_side_input0-concatenate/Values into 
concatenate/View-python_side_input0-concatenate/GroupByKey/MergeBuckets
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:33.526Z: JOB_MESSAGE_DETAILED: Fusing consumer 
concatenate/View-python_side_input0-concatenate/StreamingPCollectionViewWriter 
into concatenate/View-python_side_input0-concatenate/Values
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:33.569Z: JOB_MESSAGE_DETAILED: Fusing consumer side 
pairs/FlatMap(<lambda at core.py:3228>) into side pairs/Impulse
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:33.608Z: JOB_MESSAGE_DETAILED: Fusing consumer side 
pairs/MaybeReshuffle/Reshuffle/AddRandomKeys into side pairs/FlatMap(<lambda at 
core.py:3228>)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:33.634Z: JOB_MESSAGE_DETAILED: Fusing consumer side 
pairs/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into side 
pairs/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:33.680Z: JOB_MESSAGE_DETAILED: Fusing consumer side 
pairs/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into side 
pairs/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:33.724Z: JOB_MESSAGE_DETAILED: Fusing consumer side 
pairs/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into 
side pairs/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:33.765Z: JOB_MESSAGE_DETAILED: Fusing consumer side 
pairs/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into 
side pairs/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:33.798Z: JOB_MESSAGE_DETAILED: Fusing consumer side 
pairs/MaybeReshuffle/Reshuffle/RemoveRandomKeys into side 
pairs/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:33.829Z: JOB_MESSAGE_DETAILED: Fusing consumer side 
pairs/Map(decode) into side pairs/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:33.885Z: JOB_MESSAGE_DETAILED: Fusing consumer 
concatenate/View-python_side_input1-concatenate/PairWithVoidKey into side 
pairs/Map(decode)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:33.944Z: JOB_MESSAGE_DETAILED: Fusing consumer 
concatenate/View-python_side_input1-concatenate/GroupByKey/WriteStream into 
concatenate/View-python_side_input1-concatenate/PairWithVoidKey
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:33.991Z: JOB_MESSAGE_DETAILED: Fusing consumer 
concatenate/View-python_side_input1-concatenate/GroupByKey/MergeBuckets into 
concatenate/View-python_side_input1-concatenate/GroupByKey/ReadStream
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:34.031Z: JOB_MESSAGE_DETAILED: Fusing consumer 
concatenate/View-python_side_input1-concatenate/Values into 
concatenate/View-python_side_input1-concatenate/GroupByKey/MergeBuckets
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:34.078Z: JOB_MESSAGE_DETAILED: Fusing consumer 
concatenate/View-python_side_input1-concatenate/StreamingPCollectionViewWriter 
into concatenate/View-python_side_input1-concatenate/Values
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:34.109Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Create/FlatMap(<lambda at core.py:3228>) into 
assert_that/Create/Impulse
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:34.139Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Create/Map(decode) into assert_that/Create/FlatMap(<lambda at 
core.py:3228>)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:34.172Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/CoGroupByKeyImpl/Tag[0] into assert_that/Create/Map(decode)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:34.215Z: JOB_MESSAGE_DETAILED: Fusing consumer main 
input/FlatMap(<lambda at core.py:3228>) into main input/Impulse
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:34.257Z: JOB_MESSAGE_DETAILED: Fusing consumer main 
input/Map(decode) into main input/FlatMap(<lambda at core.py:3228>)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:34.291Z: JOB_MESSAGE_DETAILED: Fusing consumer concatenate 
into main input/Map(decode)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:34.323Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/WindowInto(WindowIntoFn) into concatenate
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:34.361Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:34.383Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/CoGroupByKeyImpl/Tag[1] into assert_that/ToVoidKey
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:34.443Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/CoGroupByKeyImpl/GroupByKey/MergeBuckets into 
assert_that/Group/CoGroupByKeyImpl/GroupByKey/ReadStream
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:34.490Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/CoGroupByKeyImpl/MapTuple(collect_values) into 
assert_that/Group/CoGroupByKeyImpl/GroupByKey/MergeBuckets
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:34.528Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/RestoreTags into 
assert_that/Group/CoGroupByKeyImpl/MapTuple(collect_values)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:34.562Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Unkey into assert_that/Group/RestoreTags
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:34.604Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Match into assert_that/Unkey
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:34.689Z: JOB_MESSAGE_DEBUG: Workflow config is missing a 
default resource spec.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:34.727Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and 
teardown to workflow graph.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:34.751Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop 
steps.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:34.785Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:34.849Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:34.880Z: JOB_MESSAGE_BASIC: Starting 1 workers in 
us-central1-b...
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:36.933Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:01:40.701Z: JOB_MESSAGE_BASIC: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:02:20.818Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number 
of workers to 1 so that the pipeline can catch up with its backlog and keep up 
with its input rate.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:02:49.871Z: JOB_MESSAGE_DETAILED: Workers have started 
successfully.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:02:49.896Z: JOB_MESSAGE_DETAILED: Workers have started 
successfully.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:09:34.442Z: JOB_MESSAGE_BASIC: Worker configuration: 
e2-standard-2 in us-central1-b.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:09:52.178Z: JOB_MESSAGE_DETAILED: Workers have started 
successfully.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T07:10:00.328Z: JOB_MESSAGE_BASIC: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
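
If the metric-descriptor quota warning above needs acting on, stale custom
descriptors can be deleted through the Cloud Monitoring API. A hedged sketch
assuming the google-cloud-monitoring client library; the project name and
filter string are illustrative:

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project_name = 'projects/my-project'  # placeholder project

    # List the project's custom metric descriptors and delete the stale ones.
    descriptors = client.list_metric_descriptors(request={
        'name': project_name,
        'filter': 'metric.type = starts_with("custom.googleapis.com/")'})
    for descriptor in descriptors:
      client.delete_metric_descriptor(request={'name': descriptor.name})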
INFO     oauth2client.transport:transport.py:183 Refreshing due to a 
401 (attempt 1/2)
INFO     oauth2client.transport:transport.py:183 Refreshing due to a 
401 (attempt 1/2)
INFO     oauth2client.transport:transport.py:183 Refreshing due to a 
401 (attempt 1/2)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:191 Job 
2022-01-12_23_01_16-12345785048189360045 is in state JOB_STATE_CANCELLING
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T08:16:07.453Z: JOB_MESSAGE_BASIC: Cancel request is committed for 
workflow job: 2022-01-12_23_01_16-12345785048189360045.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T08:16:08.211Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T08:16:08.248Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T08:16:08.274Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T08:16:08.310Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-01-13T08:16:08.331Z: JOB_MESSAGE_BASIC: Stopping worker pool...
=============================== warnings summary ===============================
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42
  
  <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42: DeprecationWarning: "@coroutine" decorator is deprecated since Python 3.8, use "async def" instead
    def call(self, fn, *args, **kwargs):
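
The deprecation above refers to tenacity's generator-based coroutines, not to
Beam itself. An illustrative sketch of the migration the warning asks for:

    import asyncio

    # Deprecated generator-based style, as at tenacity/_asyncio.py:42:
    #
    #   @asyncio.coroutine
    #   def call(self, fn, *args, **kwargs):
    #       ...
    #
    # Native-coroutine equivalent, preferred since Python 3.8:
    async def call(fn, *args, **kwargs):
      return await fn(*args, **kwargs)

    async def _demo():
      async def add(a, b):
        return a + b
      print(await call(add, 1, 2))  # prints 3

    if __name__ == '__main__':
      asyncio.run(_demo())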

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/pytest_validatesRunnerStreamingTests-df-py38-xdist.xml> -
======== 1 failed, 31 passed, 1 skipped, 8 warnings in 6131.95 seconds =========

> Task :sdks:python:test-suites:dataflow:py38:validatesRunnerStreamingTests FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 226

* What went wrong:
Execution failed for task 
':sdks:python:test-suites:dataflow:py38:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings 
and determine if they come from your own scripts or plugins.

See 
https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2h 30m 20s
93 actionable tasks: 58 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/jdzl7n6wvsjli

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
