See 
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/29/display/redirect?page=changes>

Changes:

[noreply] [BEAM-13679] playground - move quick start category to the top 
(#16808)

[noreply] Update license_script.sh (#16789)

[noreply] [BEAM-13908] [Coverage] Better testing coverage for gcpopts (#16816)

[noreply] Merge pull request #16809 from [BEAM-12164] Added integration test for


------------------------------------------
[...truncated 475.77 KB...]
      test_pipeline = TestPipeline(is_integration_test=True)
    
      # Setup the files with expected content.
      temp_folder = tempfile.mkdtemp()
      self.create_content_input_file(
          os.path.join(temp_folder, 'input.txt'),
          '\n'.join(map(json.dumps, self.SAMPLE_RECORDS)))
      extra_opts = {
          'input': '%s/input.txt' % temp_folder,
          'output': os.path.join(temp_folder, 'result')
      }
      coders.run(test_pipeline.get_full_options_as_args(**extra_opts))
    
      # Load result file and compare.
      with open_shards(os.path.join(temp_folder, 'result-*-of-*')) as result_file:
        result = result_file.read().strip()
    
      self.assertEqual(
>         sorted(self.EXPECTED_RESULT), sorted(self.format_result(result)))

apache_beam/examples/cookbook/coders_test.py:98: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
apache_beam/examples/cookbook/coders_test.py:60: in format_result
    result_list = list(
apache_beam/examples/cookbook/coders_test.py:62: in <lambda>
    lambda result_elem: format_tuple(result_elem.split(',')),
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

result_elem_list = ['']

    def format_tuple(result_elem_list):
>     [country, counter] = result_elem_list
E     ValueError: not enough values to unpack (expected 2, got 1)

apache_beam/examples/cookbook/coders_test.py:57: ValueError
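The failure above can be reproduced in isolation: splitting an empty result string on ',' yields [''], a one-element list that cannot unpack into (country, counter). A hypothetical defensive variant of the helpers (names borrowed from the traceback; the skip-empty filter is an assumption, not the test's actual code) would tolerate an empty result file:

```python
def format_tuple(result_elem_list):
    # Mirrors the helper in the traceback: expects exactly two fields.
    [country, counter] = result_elem_list  # ValueError on [''] (only 1 value)
    return country, int(counter)


def format_result(result):
    # Hypothetical defensive variant: drop empty elements before
    # unpacking, so an empty result file yields [] instead of raising.
    return [
        format_tuple(elem.split(','))
        for elem in result.split('\n')
        if elem.strip()
    ]
```

With this filter, `format_result('')` returns `[]` and the sorted-list comparison in the test would report a content mismatch rather than crash with ValueError.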
------------------------------ Captured log call -------------------------------
INFO     root:coders_test.py:51 Creating temp file: 
/tmp/tmph69sw7qd/input.txt
INFO     apache_beam.runners.portability.stager:stager.py:697 
Executing command: 
['<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/build/gradleenv/-1734967051/bin/python3.8>',
 '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 
'/tmp/tmp6wfludfu/tmp_requirements.txt', '--exists-action', 'i', '--no-binary', 
':all:']
INFO     apache_beam.runners.portability.stager:stager.py:305 Copying 
Beam SDK 
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/build/apache-beam.tar.gz>"
 to staging location.
WARNING  root:environments.py:371 Make sure that locally built Python 
SDK docker image has Python 3.8 interpreter.
INFO     root:environments.py:380 Default Python SDK image for 
environment is apache/beam_python3.8_sdk:2.38.0.dev
INFO     root:environments.py:295 Using provided Python SDK container 
image: gcr.io/cloud-dataflow/v1beta3/python38-fnapi:beam-master-20220208
INFO     root:environments.py:302 Python SDK container image set to 
"gcr.io/cloud-dataflow/v1beta3/python38-fnapi:beam-master-20220208" for Docker 
environment
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 
==================== <function pack_combiners at 0x7fa79ad465e0> 
====================
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 
==================== <function sort_stages at 0x7fa79ad46dc0> 
====================
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0211073719-876591-bd2svcp1.1644565039.876763/requirements.txt...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0211073719-876591-bd2svcp1.1644565039.876763/requirements.txt
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0211073719-876591-bd2svcp1.1644565039.876763/pickled_main_session...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0211073719-876591-bd2svcp1.1644565039.876763/pickled_main_session
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0211073719-876591-bd2svcp1.1644565039.876763/pbr-5.8.1.tar.gz...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0211073719-876591-bd2svcp1.1644565039.876763/pbr-5.8.1.tar.gz
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0211073719-876591-bd2svcp1.1644565039.876763/mock-2.0.0.tar.gz...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0211073719-876591-bd2svcp1.1644565039.876763/mock-2.0.0.tar.gz
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0211073719-876591-bd2svcp1.1644565039.876763/six-1.16.0.tar.gz...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0211073719-876591-bd2svcp1.1644565039.876763/six-1.16.0.tar.gz
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0211073719-876591-bd2svcp1.1644565039.876763/soupsieve-2.3.1.tar.gz...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0211073719-876591-bd2svcp1.1644565039.876763/soupsieve-2.3.1.tar.gz
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0211073719-876591-bd2svcp1.1644565039.876763/PyHamcrest-1.10.1.tar.gz...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0211073719-876591-bd2svcp1.1644565039.876763/PyHamcrest-1.10.1.tar.gz
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0211073719-876591-bd2svcp1.1644565039.876763/parameterized-0.7.5.tar.gz...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0211073719-876591-bd2svcp1.1644565039.876763/parameterized-0.7.5.tar.gz
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0211073719-876591-bd2svcp1.1644565039.876763/beautifulsoup4-4.10.0.tar.gz...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0211073719-876591-bd2svcp1.1644565039.876763/beautifulsoup4-4.10.0.tar.gz
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0211073719-876591-bd2svcp1.1644565039.876763/dataflow_python_sdk.tar...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0211073719-876591-bd2svcp1.1644565039.876763/dataflow_python_sdk.tar
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0211073719-876591-bd2svcp1.1644565039.876763/pipeline.pb...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0211073719-876591-bd2svcp1.1644565039.876763/pipeline.pb
 in 0 seconds.
WARNING  apache_beam.options.pipeline_options:pipeline_options.py:309 
Discarding unparseable args: ['--sleep_secs=20', 
'--kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test']
WARNING  apache_beam.options.pipeline_options:pipeline_options.py:309 
Discarding unparseable args: ['--sleep_secs=20', 
'--kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test']
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:886 Create job: 
<Job
 clientRequestId: '20220211073719877679-5745'
 createTime: '2022-02-11T07:37:22.432663Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-02-10_23_37_21-15099663679967344226'
 location: 'us-central1'
 name: 'beamapp-jenkins-0211073719-876591-bd2svcp1'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-02-11T07:37:22.432663Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:888 Created job 
with id: [2022-02-10_23_37_21-15099663679967344226]
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:889 Submitted job: 
2022-02-10_23_37_21-15099663679967344226
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:890 To access the 
Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-10_23_37_21-15099663679967344226?project=apache-beam-testing
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:191 Job 
2022-02-10_23_37_21-15099663679967344226 is in state JOB_STATE_RUNNING
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:37:25.361Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 
2022-02-10_23_37_21-15099663679967344226. The number of workers will be between 
1 and 1000.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:37:25.747Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically 
enabled for job 2022-02-10_23_37_21-15099663679967344226.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:37:28.193Z: JOB_MESSAGE_BASIC: Worker configuration: 
e2-standard-2 in us-central1-b.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:37:30.057Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo 
operations into optimizable parts.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:37:30.099Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton 
operations into optimizable parts.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:37:30.188Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey 
operations into optimizable parts.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:37:30.225Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step 
write/Write/WriteImpl/GroupByKey: GroupByKey not followed by a combiner.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:37:30.295Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations 
into optimizable parts.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:37:30.324Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner 
information.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:37:30.354Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, 
Write, and Flatten operations
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:37:30.417Z: JOB_MESSAGE_DETAILED: Fusing consumer 
write/Write/WriteImpl/InitializeWrite into 
write/Write/WriteImpl/DoOnce/Map(decode)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:37:30.450Z: JOB_MESSAGE_DETAILED: Fusing consumer 
read/Read/Map(<lambda at iobase.py:898>) into read/Read/Impulse
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:37:30.475Z: JOB_MESSAGE_DETAILED: Fusing consumer 
ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/PairWithRestriction
 into read/Read/Map(<lambda at iobase.py:898>)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:37:30.520Z: JOB_MESSAGE_DETAILED: Fusing consumer 
ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/SplitWithSizing
 into 
ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/PairWithRestriction
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:37:30.546Z: JOB_MESSAGE_DETAILED: Fusing consumer points into 
ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/ProcessElementAndRestrictionWithSizing
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:37:30.578Z: JOB_MESSAGE_DETAILED: Fusing consumer 
CombinePerKey(sum)/GroupByKey+CombinePerKey(sum)/Combine/Partial into points
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:37:30.609Z: JOB_MESSAGE_DETAILED: Fusing consumer 
CombinePerKey(sum)/GroupByKey/Write into 
CombinePerKey(sum)/GroupByKey+CombinePerKey(sum)/Combine/Partial
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:37:30.629Z: JOB_MESSAGE_DETAILED: Fusing consumer 
CombinePerKey(sum)/Combine into CombinePerKey(sum)/GroupByKey/Read
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:37:30.662Z: JOB_MESSAGE_DETAILED: Fusing consumer 
CombinePerKey(sum)/Combine/Extract into CombinePerKey(sum)/Combine
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:37:30.683Z: JOB_MESSAGE_DETAILED: Fusing consumer 
write/Write/WriteImpl/WindowInto(WindowIntoFn) into 
CombinePerKey(sum)/Combine/Extract
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:37:30.716Z: JOB_MESSAGE_DETAILED: Fusing consumer 
write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:3228>) into 
write/Write/WriteImpl/DoOnce/Impulse
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:37:30.750Z: JOB_MESSAGE_DETAILED: Fusing consumer 
write/Write/WriteImpl/DoOnce/Map(decode) into 
write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:3228>)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:37:30.773Z: JOB_MESSAGE_DETAILED: Fusing consumer 
write/Write/WriteImpl/WriteBundles into 
write/Write/WriteImpl/WindowInto(WindowIntoFn)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:37:30.808Z: JOB_MESSAGE_DETAILED: Fusing consumer 
write/Write/WriteImpl/Pair into write/Write/WriteImpl/WriteBundles
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:37:30.840Z: JOB_MESSAGE_DETAILED: Fusing consumer 
write/Write/WriteImpl/GroupByKey/Write into write/Write/WriteImpl/Pair
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:37:30.873Z: JOB_MESSAGE_DETAILED: Fusing consumer 
write/Write/WriteImpl/Extract into write/Write/WriteImpl/GroupByKey/Read
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:37:30.910Z: JOB_MESSAGE_DEBUG: Workflow config is missing a 
default resource spec.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:37:30.939Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and 
teardown to workflow graph.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:37:30.963Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop 
steps.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:37:30.997Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:37:31.147Z: JOB_MESSAGE_DEBUG: Executing wait step start34
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:37:31.224Z: JOB_MESSAGE_BASIC: Executing operation 
write/Write/WriteImpl/DoOnce/Impulse+write/Write/WriteImpl/DoOnce/FlatMap(<lambda
 at 
core.py:3228>)+write/Write/WriteImpl/DoOnce/Map(decode)+write/Write/WriteImpl/InitializeWrite
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:37:31.258Z: JOB_MESSAGE_BASIC: Executing operation 
read/Read/Impulse+read/Read/Map(<lambda at 
iobase.py:898>)+ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/PairWithRestriction+ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/SplitWithSizing
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:37:31.271Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:37:31.289Z: JOB_MESSAGE_BASIC: Executing operation 
CombinePerKey(sum)/GroupByKey/Create
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:37:31.300Z: JOB_MESSAGE_BASIC: Starting 1 workers in 
us-central1-b...
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:37:31.312Z: JOB_MESSAGE_BASIC: Executing operation 
write/Write/WriteImpl/GroupByKey/Create
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:37:31.361Z: JOB_MESSAGE_BASIC: Finished operation 
CombinePerKey(sum)/GroupByKey/Create
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:37:31.362Z: JOB_MESSAGE_BASIC: Finished operation 
write/Write/WriteImpl/GroupByKey/Create
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:37:31.444Z: JOB_MESSAGE_DEBUG: Value 
"write/Write/WriteImpl/GroupByKey/Session" materialized.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:37:31.476Z: JOB_MESSAGE_DEBUG: Value 
"CombinePerKey(sum)/GroupByKey/Session" materialized.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:37:58.631Z: JOB_MESSAGE_BASIC: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:38:17.372Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number 
of workers to 1 based on the rate of progress in the currently running stage(s).
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:38:36.760Z: JOB_MESSAGE_DETAILED: Workers have started 
successfully.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:38:36.789Z: JOB_MESSAGE_DETAILED: Workers have started 
successfully.
INFO     oauth2client.transport:transport.py:183 Refreshing due to a 
401 (attempt 1/2)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:45:04.302Z: JOB_MESSAGE_BASIC: Finished operation 
read/Read/Impulse+read/Read/Map(<lambda at 
iobase.py:898>)+ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/PairWithRestriction+ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/SplitWithSizing
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:45:04.381Z: JOB_MESSAGE_DEBUG: Value 
"ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7-split-with-sizing-out3"
 materialized.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:45:04.433Z: JOB_MESSAGE_BASIC: Executing operation 
ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/ProcessElementAndRestrictionWithSizing+points+CombinePerKey(sum)/GroupByKey+CombinePerKey(sum)/Combine/Partial+CombinePerKey(sum)/GroupByKey/Write
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:45:04.440Z: JOB_MESSAGE_BASIC: Finished operation 
write/Write/WriteImpl/DoOnce/Impulse+write/Write/WriteImpl/DoOnce/FlatMap(<lambda
 at 
core.py:3228>)+write/Write/WriteImpl/DoOnce/Map(decode)+write/Write/WriteImpl/InitializeWrite
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:45:04.540Z: JOB_MESSAGE_DEBUG: Value 
"write/Write/WriteImpl/DoOnce/Map(decode).None" materialized.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:45:04.572Z: JOB_MESSAGE_DEBUG: Value 
"write/Write/WriteImpl/InitializeWrite.None" materialized.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:45:04.661Z: JOB_MESSAGE_BASIC: Executing operation 
write/Write/WriteImpl/FinalizeWrite/View-python_side_input0-write/Write/WriteImpl/FinalizeWrite
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:45:04.684Z: JOB_MESSAGE_BASIC: Executing operation 
write/Write/WriteImpl/WriteBundles/View-python_side_input0-write/Write/WriteImpl/WriteBundles
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:45:04.715Z: JOB_MESSAGE_BASIC: Executing operation 
write/Write/WriteImpl/PreFinalize/View-python_side_input0-write/Write/WriteImpl/PreFinalize
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:45:04.720Z: JOB_MESSAGE_BASIC: Finished operation 
write/Write/WriteImpl/FinalizeWrite/View-python_side_input0-write/Write/WriteImpl/FinalizeWrite
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:45:04.729Z: JOB_MESSAGE_BASIC: Finished operation 
write/Write/WriteImpl/WriteBundles/View-python_side_input0-write/Write/WriteImpl/WriteBundles
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:45:04.755Z: JOB_MESSAGE_BASIC: Finished operation 
write/Write/WriteImpl/PreFinalize/View-python_side_input0-write/Write/WriteImpl/PreFinalize
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:45:04.779Z: JOB_MESSAGE_DEBUG: Value 
"write/Write/WriteImpl/FinalizeWrite/View-python_side_input0-write/Write/WriteImpl/FinalizeWrite.out"
 materialized.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:45:04.832Z: JOB_MESSAGE_DEBUG: Value 
"write/Write/WriteImpl/WriteBundles/View-python_side_input0-write/Write/WriteImpl/WriteBundles.out"
 materialized.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:45:04.856Z: JOB_MESSAGE_DEBUG: Value 
"write/Write/WriteImpl/PreFinalize/View-python_side_input0-write/Write/WriteImpl/PreFinalize.out"
 materialized.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:45:10.077Z: JOB_MESSAGE_BASIC: Finished operation 
ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/ProcessElementAndRestrictionWithSizing+points+CombinePerKey(sum)/GroupByKey+CombinePerKey(sum)/Combine/Partial+CombinePerKey(sum)/GroupByKey/Write
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:45:10.141Z: JOB_MESSAGE_BASIC: Executing operation 
CombinePerKey(sum)/GroupByKey/Close
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:45:10.185Z: JOB_MESSAGE_BASIC: Finished operation 
CombinePerKey(sum)/GroupByKey/Close
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:45:10.241Z: JOB_MESSAGE_BASIC: Executing operation 
CombinePerKey(sum)/GroupByKey/Read+CombinePerKey(sum)/Combine+CombinePerKey(sum)/Combine/Extract+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/GroupByKey/Write
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:45:11.430Z: JOB_MESSAGE_BASIC: Finished operation 
CombinePerKey(sum)/GroupByKey/Read+CombinePerKey(sum)/Combine+CombinePerKey(sum)/Combine/Extract+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/GroupByKey/Write
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:45:11.498Z: JOB_MESSAGE_BASIC: Executing operation 
write/Write/WriteImpl/GroupByKey/Close
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:45:11.550Z: JOB_MESSAGE_BASIC: Finished operation 
write/Write/WriteImpl/GroupByKey/Close
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:45:11.608Z: JOB_MESSAGE_BASIC: Executing operation 
write/Write/WriteImpl/GroupByKey/Read+write/Write/WriteImpl/Extract
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:45:14.012Z: JOB_MESSAGE_BASIC: Finished operation 
write/Write/WriteImpl/GroupByKey/Read+write/Write/WriteImpl/Extract
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:45:14.098Z: JOB_MESSAGE_DEBUG: Value 
"write/Write/WriteImpl/Extract.None" materialized.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:45:14.154Z: JOB_MESSAGE_BASIC: Executing operation 
write/Write/WriteImpl/FinalizeWrite/View-python_side_input1-write/Write/WriteImpl/FinalizeWrite
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:45:14.176Z: JOB_MESSAGE_BASIC: Executing operation 
write/Write/WriteImpl/PreFinalize/View-python_side_input1-write/Write/WriteImpl/PreFinalize
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:45:14.205Z: JOB_MESSAGE_BASIC: Finished operation 
write/Write/WriteImpl/FinalizeWrite/View-python_side_input1-write/Write/WriteImpl/FinalizeWrite
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:45:14.235Z: JOB_MESSAGE_BASIC: Finished operation 
write/Write/WriteImpl/PreFinalize/View-python_side_input1-write/Write/WriteImpl/PreFinalize
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:45:14.267Z: JOB_MESSAGE_DEBUG: Value 
"write/Write/WriteImpl/FinalizeWrite/View-python_side_input1-write/Write/WriteImpl/FinalizeWrite.out"
 materialized.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:45:14.301Z: JOB_MESSAGE_DEBUG: Value 
"write/Write/WriteImpl/PreFinalize/View-python_side_input1-write/Write/WriteImpl/PreFinalize.out"
 materialized.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:45:14.369Z: JOB_MESSAGE_BASIC: Executing operation 
write/Write/WriteImpl/PreFinalize
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:45:16.984Z: JOB_MESSAGE_BASIC: Finished operation 
write/Write/WriteImpl/PreFinalize
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 
2022-02-11T07:45:17.054Z: JOB_MESSAGE_DEBUG: Value 
"write/Write/WriteImpl/PreFinalize.None" materialized.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-11T07:45:17.121Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/View-python_side_input2-write/Write/WriteImpl/FinalizeWrite
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-11T07:45:17.175Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/FinalizeWrite/View-python_side_input2-write/Write/WriteImpl/FinalizeWrite
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-11T07:45:17.241Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/View-python_side_input2-write/Write/WriteImpl/FinalizeWrite.out" materialized.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-11T07:45:17.298Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-11T07:45:19.060Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/FinalizeWrite
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-11T07:45:19.141Z: JOB_MESSAGE_DEBUG: Executing success step success32
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-11T07:45:19.209Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-11T07:45:19.302Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-11T07:45:19.321Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-11T07:47:46.936Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-11T07:47:46.987Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-11T07:47:47.020Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:191 Job 2022-02-10_23_37_21-15099663679967344226 is in state JOB_STATE_DONE
=============================== warnings summary ===============================
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42: DeprecationWarning: "@coroutine" decorator is deprecated since Python 3.8, use "async def" instead
    def call(self, fn, *args, **kwargs):
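  [Editor's note: the tenacity warning above points at Python 3.8's deprecation of generator-based coroutines. A minimal, hedged sketch of the migration the warning suggests — the function names below are illustrative, not tenacity's actual internals:]

```python
import asyncio

# Old, deprecated style (the decorator was removed entirely in Python 3.11):
#
#   @asyncio.coroutine
#   def call(fn, *args, **kwargs):
#       result = yield from fn(*args, **kwargs)
#       return result

# Modern native-coroutine equivalent, per the warning's advice:
async def call(fn, *args, **kwargs):
    return await fn(*args, **kwargs)

async def add(a, b):
    return a + b

# Drive the coroutine to completion on a fresh event loop.
result = asyncio.run(call(add, 2, 3))
print(result)  # 5
```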

apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:63: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    dataset_ref = client.dataset(unique_dataset_name, project=project)
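  [Editor's note: the Client.dataset warning carries its own fix — pass a fully qualified dataset ID instead. A minimal sketch of the string form, using hypothetical project/dataset names; the actual client calls are left commented since they require credentials:]

```python
# Old, deprecated:
#   dataset_ref = client.dataset(unique_dataset_name, project=project)

# Preferred: build the 'project.dataset' string the warning asks for.
project = "my-project"              # hypothetical project id
unique_dataset_name = "my_dataset"  # hypothetical dataset name
dataset_id = f"{project}.{unique_dataset_name}"

# google-cloud-bigquery accepts this string directly, e.g.:
#   client.get_dataset(dataset_id)
# or an explicit reference object:
#   bigquery.DatasetReference(project, unique_dataset_name)
print(dataset_id)  # my-project.my_dataset
```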

apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2138: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    is_streaming_pipeline = p.options.view_as(StandardOptions).streaming

apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2144: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    experiments = p.options.view_as(DebugOptions).experiments or []

apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1128: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location

apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1130: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')

apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2437: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = pcoll.pipeline.options.view_as(

apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2439: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2463: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    pipeline_options=pcoll.pipeline.options,

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/pytest_postCommitIT-df-py38-no-xdist.xml> -
===== 4 failed, 3 passed, 5200 deselected, 12 warnings in 4814.35 seconds ======

> Task :sdks:python:test-suites:dataflow:py38:examples FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 182

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:examples'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1h 47m 56s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/qcm3o7hnrkhmo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
