See 
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/57/display/redirect>

Changes:


------------------------------------------
[...truncated 521.06 KB...]
        result = result_file.read().strip()
    
      self.assertEqual(
>         sorted(self.EXPECTED_RESULT), sorted(self.format_result(result)))

apache_beam/examples/cookbook/coders_test.py:98: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
apache_beam/examples/cookbook/coders_test.py:60: in format_result
    result_list = list(
apache_beam/examples/cookbook/coders_test.py:62: in <lambda>
    lambda result_elem: format_tuple(result_elem.split(',')),
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

result_elem_list = ['']

    def format_tuple(result_elem_list):
>     [country, counter] = result_elem_list
E     ValueError: not enough values to unpack (expected 2, got 1)

apache_beam/examples/cookbook/coders_test.py:57: ValueError
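
The ValueError above is mechanical: format_result in coders_test.py appears to split the job output into lines and each line on ',', so if the result file read back by the test is empty (or contains a blank line), split() still yields a single '' element and the two-element unpack in format_tuple fails. A minimal sketch of the behaviour, with a hypothetical defensive parse (skipping blank lines is an illustration, not the test's actual fix):

    # Reproduce the unpack failure shown in the traceback.
    result = ''                      # e.g. read().strip() on an empty output file
    lines = result.split('\n')       # -> [''] -- split never returns an empty list
    elem_list = lines[0].split(',')  # -> [''] (the result_elem_list = [''] above)
    try:
        [country, counter] = elem_list   # expects two values, gets one
    except ValueError as err:
        print(err)                   # not enough values to unpack (expected 2, got 1)

    # Hypothetical defensive variant: ignore blank lines before unpacking.
    parsed = [line.split(',') for line in result.split('\n') if line]
    print(parsed)                    # []
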
------------------------------ Captured log call -------------------------------
INFO     root:coders_test.py:51 Creating temp file: 
/tmp/tmpd83f7spo/input.txt
INFO     apache_beam.runners.portability.stager:stager.py:754 
Executing command: 
['<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/build/gradleenv/-1734967051/bin/python3.8>',
 '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 
'/tmp/tmpkaq0h57a/tmp_requirements.txt', '--exists-action', 'i', '--no-deps', 
'--implementation', 'cp', '--abi', 'cp38', '--platform', 'manylinux2014_x86_64']
INFO     apache_beam.runners.portability.stager:stager.py:325 Copying 
Beam SDK 
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/build/apache-beam.tar.gz";>
 to staging location.
WARNING  root:environments.py:371 Make sure that locally built Python 
SDK docker image has Python 3.8 interpreter.
INFO     root:environments.py:380 Default Python SDK image for 
environment is apache/beam_python3.8_sdk:2.38.0.dev
INFO     root:environments.py:295 Using provided Python SDK container 
image: gcr.io/cloud-dataflow/v1beta3/python38-fnapi:beam-master-20220208
INFO     root:environments.py:302 Python SDK container image set to 
"gcr.io/cloud-dataflow/v1beta3/python38-fnapi:beam-master-20220208" for Docker 
environment
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 
==================== <function pack_combiners at 0x7fadaa807820> 
====================
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 
==================== <function sort_stages at 0x7fadaa80a040> 
====================
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0218194546-771037-i0m7rf6v.1645213546.771196/requirements.txt...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0218194546-771037-i0m7rf6v.1645213546.771196/requirements.txt
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0218194546-771037-i0m7rf6v.1645213546.771196/pickled_main_session...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0218194546-771037-i0m7rf6v.1645213546.771196/pickled_main_session
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0218194546-771037-i0m7rf6v.1645213546.771196/pbr-5.8.1.tar.gz...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0218194546-771037-i0m7rf6v.1645213546.771196/pbr-5.8.1.tar.gz
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0218194546-771037-i0m7rf6v.1645213546.771196/mock-2.0.0.tar.gz...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0218194546-771037-i0m7rf6v.1645213546.771196/mock-2.0.0.tar.gz
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0218194546-771037-i0m7rf6v.1645213546.771196/six-1.16.0.tar.gz...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0218194546-771037-i0m7rf6v.1645213546.771196/six-1.16.0.tar.gz
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0218194546-771037-i0m7rf6v.1645213546.771196/soupsieve-2.3.1.tar.gz...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0218194546-771037-i0m7rf6v.1645213546.771196/soupsieve-2.3.1.tar.gz
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0218194546-771037-i0m7rf6v.1645213546.771196/PyHamcrest-1.10.1.tar.gz...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0218194546-771037-i0m7rf6v.1645213546.771196/PyHamcrest-1.10.1.tar.gz
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0218194546-771037-i0m7rf6v.1645213546.771196/parameterized-0.7.5.tar.gz...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0218194546-771037-i0m7rf6v.1645213546.771196/parameterized-0.7.5.tar.gz
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0218194546-771037-i0m7rf6v.1645213546.771196/beautifulsoup4-4.10.0.tar.gz...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0218194546-771037-i0m7rf6v.1645213546.771196/beautifulsoup4-4.10.0.tar.gz
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0218194546-771037-i0m7rf6v.1645213546.771196/mock-2.0.0-py2.py3-none-any.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0218194546-771037-i0m7rf6v.1645213546.771196/mock-2.0.0-py2.py3-none-any.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0218194546-771037-i0m7rf6v.1645213546.771196/seaborn-0.11.2-py3-none-any.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0218194546-771037-i0m7rf6v.1645213546.771196/seaborn-0.11.2-py3-none-any.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0218194546-771037-i0m7rf6v.1645213546.771196/PyHamcrest-1.10.1-py3-none-any.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0218194546-771037-i0m7rf6v.1645213546.771196/PyHamcrest-1.10.1-py3-none-any.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0218194546-771037-i0m7rf6v.1645213546.771196/beautifulsoup4-4.10.0-py3-none-any.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0218194546-771037-i0m7rf6v.1645213546.771196/beautifulsoup4-4.10.0-py3-none-any.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0218194546-771037-i0m7rf6v.1645213546.771196/parameterized-0.7.5-py2.py3-none-any.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0218194546-771037-i0m7rf6v.1645213546.771196/parameterized-0.7.5-py2.py3-none-any.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0218194546-771037-i0m7rf6v.1645213546.771196/matplotlib-3.5.1-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0218194546-771037-i0m7rf6v.1645213546.771196/matplotlib-3.5.1-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0218194546-771037-i0m7rf6v.1645213546.771196/matplotlib-3.5.1-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0218194546-771037-i0m7rf6v.1645213546.771196/matplotlib-3.5.1-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0218194546-771037-i0m7rf6v.1645213546.771196/matplotlib-3.5.1-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0218194546-771037-i0m7rf6v.1645213546.771196/matplotlib-3.5.1-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0218194546-771037-i0m7rf6v.1645213546.771196/dataflow_python_sdk.tar...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0218194546-771037-i0m7rf6v.1645213546.771196/dataflow_python_sdk.tar
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0218194546-771037-i0m7rf6v.1645213546.771196/pipeline.pb...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0218194546-771037-i0m7rf6v.1645213546.771196/pipeline.pb
 in 0 seconds.
WARNING  apache_beam.options.pipeline_options:pipeline_options.py:309 
Discarding unparseable args: ['--sleep_secs=20', 
'--kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test']
WARNING  apache_beam.options.pipeline_options:pipeline_options.py:309 
Discarding unparseable args: ['--sleep_secs=20', 
'--kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test']
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:886 Create job: 
<Job
 clientRequestId: '20220218194546772073-2337'
 createTime: '2022-02-18T19:45:51.389009Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-02-18_11_45_50-12395888003596793215'
 location: 'us-central1'
 name: 'beamapp-jenkins-0218194546-771037-i0m7rf6v'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-02-18T19:45:51.389009Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:888 Created job 
with id: [2022-02-18_11_45_50-12395888003596793215]
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:889 Submitted job: 
2022-02-18_11_45_50-12395888003596793215
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:890 To access the 
Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-18_11_45_50-12395888003596793215?project=apache-beam-testing
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:191 Job 
2022-02-18_11_45_50-12395888003596793215 is in state JOB_STATE_RUNNING
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:45:57.372Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 
2022-02-18_11_45_50-12395888003596793215. The number of workers will be between 
1 and 1000.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:45:57.689Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically 
enabled for job 2022-02-18_11_45_50-12395888003596793215.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:46:00.258Z: JOB_MESSAGE_BASIC: Worker configuration: 
e2-standard-2 in us-central1-b.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:46:00.975Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo 
operations into optimizable parts.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:46:01.005Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton 
operations into optimizable parts.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:46:01.074Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey 
operations into optimizable parts.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:46:01.139Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step 
write/Write/WriteImpl/GroupByKey: GroupByKey not followed by a combiner.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:46:01.211Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations 
into optimizable parts.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:46:01.249Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner 
information.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:46:01.294Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, 
Write, and Flatten operations
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:46:01.319Z: JOB_MESSAGE_DETAILED: Fusing consumer 
write/Write/WriteImpl/InitializeWrite into 
write/Write/WriteImpl/DoOnce/Map(decode)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:46:01.355Z: JOB_MESSAGE_DETAILED: Fusing consumer 
read/Read/Map(<lambda at iobase.py:898>) into read/Read/Impulse
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:46:01.399Z: JOB_MESSAGE_DETAILED: Fusing consumer 
ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/PairWithRestriction
 into read/Read/Map(<lambda at iobase.py:898>)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:46:01.434Z: JOB_MESSAGE_DETAILED: Fusing consumer 
ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/SplitWithSizing
 into 
ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/PairWithRestriction
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:46:01.462Z: JOB_MESSAGE_DETAILED: Fusing consumer points into 
ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/ProcessElementAndRestrictionWithSizing
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:46:01.502Z: JOB_MESSAGE_DETAILED: Fusing consumer 
CombinePerKey(sum)/GroupByKey+CombinePerKey(sum)/Combine/Partial into points
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:46:01.534Z: JOB_MESSAGE_DETAILED: Fusing consumer 
CombinePerKey(sum)/GroupByKey/Write into 
CombinePerKey(sum)/GroupByKey+CombinePerKey(sum)/Combine/Partial
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:46:01.560Z: JOB_MESSAGE_DETAILED: Fusing consumer 
CombinePerKey(sum)/Combine into CombinePerKey(sum)/GroupByKey/Read
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:46:01.595Z: JOB_MESSAGE_DETAILED: Fusing consumer 
CombinePerKey(sum)/Combine/Extract into CombinePerKey(sum)/Combine
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:46:01.627Z: JOB_MESSAGE_DETAILED: Fusing consumer 
write/Write/WriteImpl/WindowInto(WindowIntoFn) into 
CombinePerKey(sum)/Combine/Extract
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:46:01.651Z: JOB_MESSAGE_DETAILED: Fusing consumer 
write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:3229>) into 
write/Write/WriteImpl/DoOnce/Impulse
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:46:01.684Z: JOB_MESSAGE_DETAILED: Fusing consumer 
write/Write/WriteImpl/DoOnce/Map(decode) into 
write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:3229>)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:46:01.745Z: JOB_MESSAGE_DETAILED: Fusing consumer 
write/Write/WriteImpl/WriteBundles into 
write/Write/WriteImpl/WindowInto(WindowIntoFn)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:46:01.779Z: JOB_MESSAGE_DETAILED: Fusing consumer 
write/Write/WriteImpl/Pair into write/Write/WriteImpl/WriteBundles
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:46:01.801Z: JOB_MESSAGE_DETAILED: Fusing consumer 
write/Write/WriteImpl/GroupByKey/Write into write/Write/WriteImpl/Pair
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:46:01.851Z: JOB_MESSAGE_DETAILED: Fusing consumer 
write/Write/WriteImpl/Extract into write/Write/WriteImpl/GroupByKey/Read
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:46:01.899Z: JOB_MESSAGE_DEBUG: Workflow config is missing a 
default resource spec.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:46:01.928Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and 
teardown to workflow graph.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:46:01.962Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop 
steps.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:46:01.997Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:46:02.149Z: JOB_MESSAGE_DEBUG: Executing wait step start34
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:46:02.222Z: JOB_MESSAGE_BASIC: Executing operation 
write/Write/WriteImpl/DoOnce/Impulse+write/Write/WriteImpl/DoOnce/FlatMap(<lambda
 at 
core.py:3229>)+write/Write/WriteImpl/DoOnce/Map(decode)+write/Write/WriteImpl/InitializeWrite
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:46:02.259Z: JOB_MESSAGE_BASIC: Executing operation 
read/Read/Impulse+read/Read/Map(<lambda at 
iobase.py:898>)+ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/PairWithRestriction+ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/SplitWithSizing
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:46:02.273Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:46:02.292Z: JOB_MESSAGE_BASIC: Executing operation 
CombinePerKey(sum)/GroupByKey/Create
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:46:02.311Z: JOB_MESSAGE_BASIC: Starting 1 workers in 
us-central1-b...
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:46:02.322Z: JOB_MESSAGE_BASIC: Executing operation 
write/Write/WriteImpl/GroupByKey/Create
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:46:02.373Z: JOB_MESSAGE_BASIC: Finished operation 
write/Write/WriteImpl/GroupByKey/Create
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:46:02.373Z: JOB_MESSAGE_BASIC: Finished operation 
CombinePerKey(sum)/GroupByKey/Create
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:46:02.432Z: JOB_MESSAGE_DEBUG: Value 
"CombinePerKey(sum)/GroupByKey/Session" materialized.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:46:02.464Z: JOB_MESSAGE_DEBUG: Value 
"write/Write/WriteImpl/GroupByKey/Session" materialized.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:46:14.962Z: JOB_MESSAGE_BASIC: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:46:46.344Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number 
of workers to 1 based on the rate of progress in the currently running stage(s).
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:47:05.712Z: JOB_MESSAGE_DETAILED: Workers have started 
successfully.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:47:05.746Z: JOB_MESSAGE_DETAILED: Workers have started 
successfully.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:54:17.398Z: JOB_MESSAGE_BASIC: Finished operation 
read/Read/Impulse+read/Read/Map(<lambda at 
iobase.py:898>)+ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/PairWithRestriction+ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/SplitWithSizing
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:54:17.458Z: JOB_MESSAGE_DEBUG: Value 
"ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7-split-with-sizing-out3"
 materialized.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:54:17.531Z: JOB_MESSAGE_BASIC: Executing operation 
ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/ProcessElementAndRestrictionWithSizing+points+CombinePerKey(sum)/GroupByKey+CombinePerKey(sum)/Combine/Partial+CombinePerKey(sum)/GroupByKey/Write
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:54:17.547Z: JOB_MESSAGE_BASIC: Finished operation 
write/Write/WriteImpl/DoOnce/Impulse+write/Write/WriteImpl/DoOnce/FlatMap(<lambda
 at 
core.py:3229>)+write/Write/WriteImpl/DoOnce/Map(decode)+write/Write/WriteImpl/InitializeWrite
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:54:17.606Z: JOB_MESSAGE_DEBUG: Value 
"write/Write/WriteImpl/DoOnce/Map(decode).None" materialized.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:54:17.634Z: JOB_MESSAGE_DEBUG: Value 
"write/Write/WriteImpl/InitializeWrite.None" materialized.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:54:17.691Z: JOB_MESSAGE_BASIC: Executing operation 
write/Write/WriteImpl/FinalizeWrite/View-python_side_input0-write/Write/WriteImpl/FinalizeWrite
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:54:17.715Z: JOB_MESSAGE_BASIC: Executing operation 
write/Write/WriteImpl/WriteBundles/View-python_side_input0-write/Write/WriteImpl/WriteBundles
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:54:17.743Z: JOB_MESSAGE_BASIC: Finished operation 
write/Write/WriteImpl/FinalizeWrite/View-python_side_input0-write/Write/WriteImpl/FinalizeWrite
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:54:17.751Z: JOB_MESSAGE_BASIC: Executing operation 
write/Write/WriteImpl/PreFinalize/View-python_side_input0-write/Write/WriteImpl/PreFinalize
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:54:17.763Z: JOB_MESSAGE_BASIC: Finished operation 
write/Write/WriteImpl/WriteBundles/View-python_side_input0-write/Write/WriteImpl/WriteBundles
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:54:17.797Z: JOB_MESSAGE_DEBUG: Value 
"write/Write/WriteImpl/FinalizeWrite/View-python_side_input0-write/Write/WriteImpl/FinalizeWrite.out"
 materialized.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:54:17.815Z: JOB_MESSAGE_BASIC: Finished operation 
write/Write/WriteImpl/PreFinalize/View-python_side_input0-write/Write/WriteImpl/PreFinalize
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:54:17.837Z: JOB_MESSAGE_DEBUG: Value 
"write/Write/WriteImpl/WriteBundles/View-python_side_input0-write/Write/WriteImpl/WriteBundles.out"
 materialized.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:54:17.911Z: JOB_MESSAGE_DEBUG: Value 
"write/Write/WriteImpl/PreFinalize/View-python_side_input0-write/Write/WriteImpl/PreFinalize.out"
 materialized.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:54:23.207Z: JOB_MESSAGE_BASIC: Finished operation 
ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/ProcessElementAndRestrictionWithSizing+points+CombinePerKey(sum)/GroupByKey+CombinePerKey(sum)/Combine/Partial+CombinePerKey(sum)/GroupByKey/Write
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:54:23.278Z: JOB_MESSAGE_BASIC: Executing operation 
CombinePerKey(sum)/GroupByKey/Close
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:54:23.328Z: JOB_MESSAGE_BASIC: Finished operation 
CombinePerKey(sum)/GroupByKey/Close
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:54:23.392Z: JOB_MESSAGE_BASIC: Executing operation 
CombinePerKey(sum)/GroupByKey/Read+CombinePerKey(sum)/Combine+CombinePerKey(sum)/Combine/Extract+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/GroupByKey/Write
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:54:24.542Z: JOB_MESSAGE_BASIC: Finished operation 
CombinePerKey(sum)/GroupByKey/Read+CombinePerKey(sum)/Combine+CombinePerKey(sum)/Combine/Extract+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/GroupByKey/Write
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:54:24.603Z: JOB_MESSAGE_BASIC: Executing operation 
write/Write/WriteImpl/GroupByKey/Close
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:54:24.650Z: JOB_MESSAGE_BASIC: Finished operation 
write/Write/WriteImpl/GroupByKey/Close
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:54:24.719Z: JOB_MESSAGE_BASIC: Executing operation 
write/Write/WriteImpl/GroupByKey/Read+write/Write/WriteImpl/Extract
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:54:26.987Z: JOB_MESSAGE_BASIC: Finished operation 
write/Write/WriteImpl/GroupByKey/Read+write/Write/WriteImpl/Extract
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:54:27.054Z: JOB_MESSAGE_DEBUG: Value 
"write/Write/WriteImpl/Extract.None" materialized.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:54:27.115Z: JOB_MESSAGE_BASIC: Executing operation 
write/Write/WriteImpl/FinalizeWrite/View-python_side_input1-write/Write/WriteImpl/FinalizeWrite
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:54:27.142Z: JOB_MESSAGE_BASIC: Executing operation 
write/Write/WriteImpl/PreFinalize/View-python_side_input1-write/Write/WriteImpl/PreFinalize
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:54:27.165Z: JOB_MESSAGE_BASIC: Finished operation 
write/Write/WriteImpl/FinalizeWrite/View-python_side_input1-write/Write/WriteImpl/FinalizeWrite
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:54:27.211Z: JOB_MESSAGE_BASIC: Finished operation 
write/Write/WriteImpl/PreFinalize/View-python_side_input1-write/Write/WriteImpl/PreFinalize
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:54:27.232Z: JOB_MESSAGE_DEBUG: Value 
"write/Write/WriteImpl/FinalizeWrite/View-python_side_input1-write/Write/WriteImpl/FinalizeWrite.out"
 materialized.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:54:27.280Z: JOB_MESSAGE_DEBUG: Value 
"write/Write/WriteImpl/PreFinalize/View-python_side_input1-write/Write/WriteImpl/PreFinalize.out"
 materialized.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:54:27.345Z: JOB_MESSAGE_BASIC: Executing operation 
write/Write/WriteImpl/PreFinalize
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:54:30.002Z: JOB_MESSAGE_BASIC: Finished operation 
write/Write/WriteImpl/PreFinalize
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:54:30.071Z: JOB_MESSAGE_DEBUG: Value 
"write/Write/WriteImpl/PreFinalize.None" materialized.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:54:30.133Z: JOB_MESSAGE_BASIC: Executing operation 
write/Write/WriteImpl/FinalizeWrite/View-python_side_input2-write/Write/WriteImpl/FinalizeWrite
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:54:30.193Z: JOB_MESSAGE_BASIC: Finished operation 
write/Write/WriteImpl/FinalizeWrite/View-python_side_input2-write/Write/WriteImpl/FinalizeWrite
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:54:30.264Z: JOB_MESSAGE_DEBUG: Value 
"write/Write/WriteImpl/FinalizeWrite/View-python_side_input2-write/Write/WriteImpl/FinalizeWrite.out"
 materialized.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:54:30.335Z: JOB_MESSAGE_BASIC: Executing operation 
write/Write/WriteImpl/FinalizeWrite
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:54:32.235Z: JOB_MESSAGE_BASIC: Finished operation 
write/Write/WriteImpl/FinalizeWrite
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:54:32.282Z: JOB_MESSAGE_DEBUG: Executing success step success32
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:54:32.359Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:54:32.461Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:54:32.499Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:57:04.304Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker 
pool from 1 to 0.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:57:04.350Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-18T19:57:04.375Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:191 Job 
2022-02-18_11_45_50-12395888003596793215 is in state JOB_STATE_DONE
=============================== warnings summary ===============================
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42:
 DeprecationWarning: "@coroutine" decorator is deprecated since Python 3.8, use 
"async def" instead
    def call(self, fn, *args, **kwargs):
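
  The DeprecationWarning above originates in the tenacity dependency, which still applies the legacy @coroutine decorator; the warning itself names the replacement, native async def syntax. A minimal sketch of that migration on Python 3.8-3.10, where the decorator still exists (the function names are illustrative, not tenacity's code):

    import asyncio

    # Legacy style that triggers the warning (deprecated since Python 3.8):
    @asyncio.coroutine
    def wait_old():
        yield from asyncio.sleep(0.1)

    # Replacement suggested by the warning: a native coroutine.
    async def wait_new():
        await asyncio.sleep(0.1)

    asyncio.run(wait_new())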

apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:63:
 PendingDeprecationWarning: Client.dataset is deprecated and will be removed in 
a future version. Use a string like 'my_project.my_dataset' or a 
cloud.google.bigquery.DatasetReference object, instead.
    dataset_ref = client.dataset(unique_dataset_name, project=project)
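
  The PendingDeprecationWarning above is raised from apache_beam/io/gcp/tests/utils.py and spells out both replacements: a fully qualified 'project.dataset' string or a DatasetReference object. A short sketch of the suggested forms with the google-cloud-bigquery client (project and dataset names are placeholders, and actually running it requires GCP credentials):

    from google.cloud import bigquery

    client = bigquery.Client(project='my-project')

    # Deprecated form flagged above:
    #   dataset_ref = client.dataset('my_dataset', project='my-project')

    # Replacement 1: pass a fully qualified dataset string.
    dataset = client.get_dataset('my-project.my_dataset')

    # Replacement 2: build an explicit DatasetReference.
    dataset_ref = bigquery.DatasetReference('my-project', 'my_dataset')
    dataset = client.get_dataset(dataset_ref)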

apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2138:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
    is_streaming_pipeline = p.options.view_as(StandardOptions).streaming

apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2144:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
    experiments = p.options.view_as(DebugOptions).experiments or []

apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1128:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location

apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1130:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')

apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2437:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
    temp_location = pcoll.pipeline.options.view_as(

apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2439:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2463:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
    pipeline_options=pcoll.pipeline.options,
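
  The repeated BeamDeprecationWarning entries above all flag reads of <pipeline>.options inside Beam's BigQuery IO. In user code the warned-against pattern is avoided by keeping a reference to the PipelineOptions you construct instead of reading them back off the pipeline; a minimal sketch of that pattern (DirectRunner and the trivial transform are placeholders):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

    # Keep the options object you built rather than reading <pipeline>.options.
    options = PipelineOptions(['--runner=DirectRunner'])
    is_streaming = options.view_as(StandardOptions).streaming

    with beam.Pipeline(options=options) as p:
        _ = p | beam.Create([1, 2, 3]) | beam.Map(print)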

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/pytest_postCommitIT-df-py38-no-xdist.xml> -
===== 4 failed, 3 passed, 5213 deselected, 12 warnings in 5087.46 seconds ======

> Task :sdks:python:test-suites:dataflow:py38:examples FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 182

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:examples'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1h 54m 28s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/3ozhbnrq6iw32

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

