See <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/10171/display/redirect>

Changes:


------------------------------------------
[...truncated 87.10 KB...]
    @pytest.mark.it_validatesrunner
    def test_as_list_and_as_dict_side_inputs(self):
      a_list = [5, 1, 3, 2, 9]
      some_pairs = [('crouton', 17), ('supreme', None)]
      pipeline = self.create_pipeline()
      main_input = pipeline | 'main input' >> beam.Create([1])
      side_list = pipeline | 'side list' >> beam.Create(a_list)
      side_pairs = pipeline | 'side pairs' >> beam.Create(some_pairs)
      results = main_input | 'concatenate' >> beam.Map(
          lambda x, the_list, the_dict: [x, the_list, the_dict],
          beam.pvalue.AsList(side_list),
          beam.pvalue.AsDict(side_pairs))
    
      def matcher(expected_elem, expected_list, expected_pairs):
        def match(actual):
          [[actual_elem, actual_list, actual_dict]] = actual
          equal_to([expected_elem])([actual_elem])
          equal_to(expected_list)(actual_list)
          equal_to(expected_pairs)(actual_dict.items())
    
        return match
    
      assert_that(results, matcher(1, a_list, some_pairs))
>     pipeline.run()

apache_beam/transforms/sideinputs_test.py:247: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
apache_beam/testing/test_pipeline.py:114: in run
    False if self.not_use_test_runner_api else test_runner_api))
apache_beam/pipeline.py:553: in run
    self._options).run(False)
apache_beam/pipeline.py:577: in run
    return self.runner.run_pipeline(self, self._options)
apache_beam/runners/dataflow/test_dataflow_runner.py:66: in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <DataflowPipelineResult <Job
 clientRequestId: '20221207144938560250-1790'
 createTime: '2022-12-07T14:49:42.659411Z'
...022-12-07T14:49:42.659411Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)> at 0x7fe32b91c550>
duration = None

    def wait_until_finish(self, duration=None):
      if not self.is_in_terminal_state():
        if not self.has_job:
          raise IOError('Failed to get the Dataflow job id.')
        consoleUrl = (
            "Console URL: https://console.cloud.google.com/"
            f"dataflow/jobs/<RegionId>/{self.job_id()}"
            "?project=<ProjectId>")
        thread = threading.Thread(
            target=DataflowRunner.poll_for_job_completion,
            args=(self._runner, self, duration))
    
        # Mark the thread as a daemon thread so a keyboard interrupt on the main
        # thread will terminate everything. This is also the reason we will not
        # use thread.join() to wait for the polling thread.
        thread.daemon = True
        thread.start()
        while thread.is_alive():
          time.sleep(5.0)
    
        # TODO: Merge the termination code in poll_for_job_completion and
        # is_in_terminal_state.
        terminated = self.is_in_terminal_state()
        assert duration or terminated, (
            'Job did not reach a terminal state after waiting indefinitely. '
            '{}'.format(consoleUrl))
    
        # TODO(https://github.com/apache/beam/issues/21695): Also run this check
        # if wait_until_finish was called after the pipeline completed.
        if terminated and self.state != PipelineState.DONE:
          # TODO(BEAM-1290): Consider converting this to an error log based on
          # the resolution of the issue.
          _LOGGER.error(consoleUrl)
          raise DataflowRuntimeException(
              'Dataflow pipeline failed. State: %s, Error:\n%s' %
              (self.state, getattr(self._runner, 'last_error_msg', None)),
>             self)
E         apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
E         Workflow failed.

apache_beam/runners/dataflow/dataflow_runner.py:1556: DataflowRuntimeException
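As an aside, the nested `matcher` closure in the failing test above can be exercised on its own. The sketch below mirrors its structure but substitutes plain `assert`s for hamcrest's `equal_to` checks, so the comparison details are illustrative assumptions rather than the test harness's exact behavior:

```python
# Standalone sketch of the test's matcher pattern: a factory that captures the
# expected values and returns a callback applied to the pipeline's output.
# Plain asserts stand in for hamcrest's equal_to (an assumption of this sketch).
def matcher(expected_elem, expected_list, expected_pairs):
    def match(actual):
        # The pipeline's output is a single [elem, list, dict] triple.
        [[actual_elem, actual_list, actual_dict]] = actual
        assert actual_elem == expected_elem
        # Order-insensitive comparisons, since PCollection ordering is not
        # guaranteed; repr-keyed sort tolerates the None value in the pairs.
        assert sorted(actual_list) == sorted(expected_list)
        assert sorted(actual_dict.items(), key=repr) == sorted(expected_pairs, key=repr)
    return match

check = matcher(1, [5, 1, 3, 2, 9], [('crouton', 17), ('supreme', None)])
check([[1, [5, 1, 3, 2, 9], {'crouton': 17, 'supreme': None}]])
```

The failure above is not in the matcher itself, though: the workers never processed any bundles before the workflow failed, so `assert_that` never ran.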
------------------------------ Captured log call -------------------------------
INFO     apache_beam.runners.portability.stager:stager.py:772 
Executing command: 
['<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967052/bin/python3.7>',
 '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 
'/tmp/tmpcnclhool/tmp_requirements.txt', '--exists-action', 'i', '--no-deps', 
'--implementation', 'cp', '--abi', 'cp37m', '--platform', 
'manylinux2014_x86_64']
INFO     apache_beam.runners.portability.stager:stager.py:330 Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO     root:environments.py:376 Default Python SDK image for 
environment is apache/beam_python3.7_sdk:2.45.0.dev
INFO     root:environments.py:296 Using provided Python SDK container 
image: gcr.io/cloud-dataflow/v1beta3/python37:beam-master-20221205
INFO     root:environments.py:304 Python SDK container image set to 
"gcr.io/cloud-dataflow/v1beta3/python37:beam-master-20221205" for Docker 
environment
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:710 
==================== <function pack_combiners at 0x7fe35d166170> 
====================
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:710 
==================== <function sort_stages at 0x7fe35d166950> 
====================
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:723 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1207144938-558952-f9kotkri.1670424578.559190/requirements.txt...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:742 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1207144938-558952-f9kotkri.1670424578.559190/requirements.txt
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:723 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1207144938-558952-f9kotkri.1670424578.559190/mock-2.0.0-py2.py3-none-any.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:742 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1207144938-558952-f9kotkri.1670424578.559190/mock-2.0.0-py2.py3-none-any.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:723 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1207144938-558952-f9kotkri.1670424578.559190/seaborn-0.12.1-py3-none-any.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:742 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1207144938-558952-f9kotkri.1670424578.559190/seaborn-0.12.1-py3-none-any.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:723 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1207144938-558952-f9kotkri.1670424578.559190/PyHamcrest-1.10.1-py3-none-any.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:742 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1207144938-558952-f9kotkri.1670424578.559190/PyHamcrest-1.10.1-py3-none-any.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:723 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1207144938-558952-f9kotkri.1670424578.559190/beautifulsoup4-4.11.1-py3-none-any.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:742 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1207144938-558952-f9kotkri.1670424578.559190/beautifulsoup4-4.11.1-py3-none-any.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:723 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1207144938-558952-f9kotkri.1670424578.559190/parameterized-0.7.5-py2.py3-none-any.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:742 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1207144938-558952-f9kotkri.1670424578.559190/parameterized-0.7.5-py2.py3-none-any.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:723 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1207144938-558952-f9kotkri.1670424578.559190/matplotlib-3.5.3-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:742 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1207144938-558952-f9kotkri.1670424578.559190/matplotlib-3.5.3-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:723 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1207144938-558952-f9kotkri.1670424578.559190/matplotlib-3.6.2-cp38-cp38-manylinux_2_12_x86_64.manylinux2010_x86_64.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:742 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1207144938-558952-f9kotkri.1670424578.559190/matplotlib-3.6.2-cp38-cp38-manylinux_2_12_x86_64.manylinux2010_x86_64.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:723 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1207144938-558952-f9kotkri.1670424578.559190/matplotlib-3.6.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:742 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1207144938-558952-f9kotkri.1670424578.559190/matplotlib-3.6.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:723 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1207144938-558952-f9kotkri.1670424578.559190/dataflow_python_sdk.tar...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:742 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1207144938-558952-f9kotkri.1670424578.559190/dataflow_python_sdk.tar
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:723 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1207144938-558952-f9kotkri.1670424578.559190/pipeline.pb...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:742 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1207144938-558952-f9kotkri.1670424578.559190/pipeline.pb
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:898 Create job: 
<Job
 clientRequestId: '20221207144938560250-1790'
 createTime: '2022-12-07T14:49:42.659411Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-12-07_06_49_42-5252315322962421303'
 location: 'us-central1'
 name: 'beamapp-jenkins-1207144938-558952-f9kotkri'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-12-07T14:49:42.659411Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:900 Created job 
with id: [2022-12-07_06_49_42-5252315322962421303]
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:901 Submitted job: 
2022-12-07_06_49_42-5252315322962421303
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:907 To access the 
Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2022-12-07_06_49_42-5252315322962421303?project=apache-beam-testing
INFO     
apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:58 
Console log: 
INFO     
apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:59 
https://console.cloud.google.com/dataflow/jobs/us-central1/2022-12-07_06_49_42-5252315322962421303?project=apache-beam-testing
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:193 Job 
2022-12-07_06_49_42-5252315322962421303 is in state JOB_STATE_RUNNING
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:43.163Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 
2022-12-07_06_49_42-5252315322962421303. The number of workers will be between 
1 and 1000.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:43.234Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically 
enabled for job 2022-12-07_06_49_42-5252315322962421303.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:45.366Z: JOB_MESSAGE_BASIC: Worker configuration: 
e2-standard-2 in us-central1-f.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:47.674Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey 
operations into optimizable parts.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:47.709Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step 
assert_that/Group/CoGroupByKeyImpl/GroupByKey: GroupByKey not followed by a 
combiner.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:47.746Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations 
into optimizable parts.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:47.785Z: JOB_MESSAGE_DETAILED: Lifting 
ValueCombiningMappingFns into MergeBucketsMappingFns
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:47.851Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner 
information.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:47.900Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, 
Write, and Flatten operations
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:47.940Z: JOB_MESSAGE_DETAILED: Unzipping flatten s12 for input 
s10.None
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:47.971Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of 
assert_that/Group/CoGroupByKeyImpl/GroupByKey/Reify, through flatten 
assert_that/Group/CoGroupByKeyImpl/Flatten, into producer 
assert_that/Group/CoGroupByKeyImpl/Tag[0]
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:48.005Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/CoGroupByKeyImpl/GroupByKey/GroupByWindow into 
assert_that/Group/CoGroupByKeyImpl/GroupByKey/Read
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:48.041Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/CoGroupByKeyImpl/MapTuple(collect_values) into 
assert_that/Group/CoGroupByKeyImpl/GroupByKey/GroupByWindow
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:48.068Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/RestoreTags into 
assert_that/Group/CoGroupByKeyImpl/MapTuple(collect_values)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:48.105Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Unkey into assert_that/Group/RestoreTags
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:48.126Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Match into assert_that/Unkey
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:48.159Z: JOB_MESSAGE_DETAILED: Unzipping flatten s12-u13 for 
input s13-reify-value0-c11
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:48.192Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of 
assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write, through flatten 
assert_that/Group/CoGroupByKeyImpl/Flatten/Unzipped-1, into producer 
assert_that/Group/CoGroupByKeyImpl/GroupByKey/Reify
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:48.226Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/CoGroupByKeyImpl/Tag[0] into assert_that/Create/Read
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:48.260Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/CoGroupByKeyImpl/GroupByKey/Reify into 
assert_that/Group/CoGroupByKeyImpl/Tag[1]
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:48.294Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write into 
assert_that/Group/CoGroupByKeyImpl/GroupByKey/Reify
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:48.331Z: JOB_MESSAGE_DETAILED: Fusing consumer 
concatenate/concatenate into main input/Read
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:48.364Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/WindowInto(WindowIntoFn) into concatenate/concatenate
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:48.399Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:48.434Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/CoGroupByKeyImpl/Tag[1] into assert_that/ToVoidKey
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:48.474Z: JOB_MESSAGE_DEBUG: Workflow config is missing a 
default resource spec.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:48.512Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and 
teardown to workflow graph.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:48.540Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop 
steps.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:48.572Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:48.772Z: JOB_MESSAGE_DEBUG: Executing wait step start21
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:48.844Z: JOB_MESSAGE_BASIC: Executing operation side list/Read
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:48.875Z: JOB_MESSAGE_BASIC: Executing operation side pairs/Read
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:48.878Z: JOB_MESSAGE_BASIC: Finished operation side list/Read
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:48.911Z: JOB_MESSAGE_BASIC: Finished operation side pairs/Read
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:48.943Z: JOB_MESSAGE_DEBUG: Value "side list/Read.out" 
materialized.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:48.997Z: JOB_MESSAGE_DEBUG: Value "side pairs/Read.out" 
materialized.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:49.035Z: JOB_MESSAGE_BASIC: Executing operation 
concatenate/_UnpickledSideInput(Read.out.0)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:49.072Z: JOB_MESSAGE_BASIC: Finished operation 
concatenate/_UnpickledSideInput(Read.out.0)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:49.080Z: JOB_MESSAGE_BASIC: Executing operation 
concatenate/_UnpickledSideInput(Read.out.1)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:49.117Z: JOB_MESSAGE_BASIC: Finished operation 
concatenate/_UnpickledSideInput(Read.out.1)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:49.150Z: JOB_MESSAGE_DEBUG: Value 
"concatenate/_UnpickledSideInput(Read.out.0).output" materialized.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:49.195Z: JOB_MESSAGE_DEBUG: Value 
"concatenate/_UnpickledSideInput(Read.out.1).output" materialized.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:49.275Z: JOB_MESSAGE_BASIC: Executing operation 
assert_that/Group/CoGroupByKeyImpl/GroupByKey/Create
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:49.325Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:49.363Z: JOB_MESSAGE_BASIC: Starting 1 workers in 
us-central1-f...
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:49.726Z: JOB_MESSAGE_BASIC: Finished operation 
assert_that/Group/CoGroupByKeyImpl/GroupByKey/Create
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:49.790Z: JOB_MESSAGE_DEBUG: Value 
"assert_that/Group/CoGroupByKeyImpl/GroupByKey/Session" materialized.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:49.877Z: JOB_MESSAGE_BASIC: Executing operation main 
input/Read+concatenate/concatenate+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/CoGroupByKeyImpl/Tag[1]+assert_that/Group/CoGroupByKeyImpl/GroupByKey/Reify+assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:49.906Z: JOB_MESSAGE_BASIC: Executing operation 
assert_that/Create/Read+assert_that/Group/CoGroupByKeyImpl/Tag[0]+assert_that/Group/CoGroupByKeyImpl/GroupByKey/Reify+assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:50.880Z: JOB_MESSAGE_ERROR: Workflow failed.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:50.962Z: JOB_MESSAGE_BASIC: Finished operation 
assert_that/Create/Read+assert_that/Group/CoGroupByKeyImpl/Tag[0]+assert_that/Group/CoGroupByKeyImpl/GroupByKey/Reify+assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:50.962Z: JOB_MESSAGE_BASIC: Finished operation main 
input/Read+concatenate/concatenate+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/CoGroupByKeyImpl/Tag[1]+assert_that/Group/CoGroupByKeyImpl/GroupByKey/Reify+assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:51.040Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:51.124Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:49:51.160Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:50:09.901Z: JOB_MESSAGE_BASIC: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:50:27.842Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 
2022-12-07T14:50:27.885Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:193 Job 
2022-12-07_06_49_42-5252315322962421303 is in state JOB_STATE_FAILED
ERROR    
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:1552 Console 
URL: 
https://console.cloud.google.com/dataflow/jobs/<RegionId>/2022-12-07_06_49_42-5252315322962421303?project=<ProjectId>
=============================== warnings summary ===============================
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/hdfs/config.py:15
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/hdfs/config.py:15
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/hdfs/config.py:15
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/hdfs/config.py:15
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/hdfs/config.py:15
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/hdfs/config.py:15
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/hdfs/config.py:15
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/hdfs/config.py:15
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/hdfs/config.py:15
  
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/hdfs/config.py>:15:
 DeprecationWarning: the imp module is deprecated in favour of importlib; see 
the module's documentation for alternative uses
    from imp import load_source

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/pytest_validatesRunnerBatchTests-df-py37.xml> -
=========================== short test summary info ============================
FAILED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_list_and_as_dict_side_inputs - apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
Workflow failed.
======= 1 failed, 32 passed, 8 skipped, 9 warnings in 2356.28s (0:39:16) =======

> Task :sdks:python:test-suites:dataflow:py37:validatesRunnerBatchTests FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 217

* What went wrong:
Execution failed for task 
':sdks:python:test-suites:dataflow:py37:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings 
and determine if they come from your own scripts or plugins.

See 
https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 43m 47s
18 actionable tasks: 12 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/fykztpybmpvpk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
