See 
<https://ci-beam.apache.org/job/beam_PostCommit_Py_ValCont/9759/display/redirect?page=changes>

Changes:

[Robert Bradshaw] Batch encoding and decoding of schema data.

[Robert Bradshaw] Add microbenchmark for batch row encoding.

[Robert Bradshaw] Add batch testing for standard row coders.

[noreply] replaced tabs with spaces in readme file (#23446)

[noreply] [Playground] [Backend] Adding the tags field to the example response

[noreply] [Playground] [Backend] Edited the function for getting executable name

[noreply] Fix type inference for set/delete attr. (#23242)

[noreply] Support VR test including TestStream for Spark runner in streaming mode


------------------------------------------
[...truncated 1.33 MB...]

    @pytest.mark.it_validatescontainer
    def test_wordcount_it_with_prebuilt_sdk_container_cloud_build(self):
>     self._run_wordcount_it(
          wordcount.run,
          experiment='beam_fn_api',
          prebuild_sdk_container_engine='cloud_build')

apache_beam/examples/wordcount_it_test.py:103: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
apache_beam/examples/wordcount_it_test.py:146: in _run_wordcount_it
    run_wordcount(
apache_beam/examples/wordcount.py:106: in run
    output | 'Write' >> WriteToText(known_args.output)
apache_beam/pipeline.py:597: in __exit__
    self.result = self.run()
apache_beam/pipeline.py:547: in run
    return Pipeline.from_runner_api(
apache_beam/pipeline.py:574: in run
    return self.runner.run_pipeline(self, self._options)
apache_beam/runners/dataflow/test_dataflow_runner.py:66: in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <DataflowPipelineResult <Job
 clientRequestId: '20220930231437709005-9986'
 createTime: '2022-09-30T23:14:39.316233Z'
...022-09-30T23:14:39.316233Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)> at 0x7fa6c9ad1400>
duration = None

    def wait_until_finish(self, duration=None):
      if not self.is_in_terminal_state():
        if not self.has_job:
          raise IOError('Failed to get the Dataflow job id.')
        consoleUrl = (
            "Console URL: https://console.cloud.google.com/";
            f"dataflow/jobs/<RegionId>/{self.job_id()}"
            "?project=<ProjectId>")
        thread = threading.Thread(
            target=DataflowRunner.poll_for_job_completion,
            args=(self._runner, self, duration))
    
        # Mark the thread as a daemon thread so a keyboard interrupt on the main
        # thread will terminate everything. This is also the reason we will not
        # use thread.join() to wait for the polling thread.
        thread.daemon = True
        thread.start()
        while thread.is_alive():
>         time.sleep(5.0)
E         Failed: Timeout >900.0s

apache_beam/runners/dataflow/dataflow_runner.py:1658: Failed
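
The failure above is pytest's 900 s timeout firing while wait_until_finish()
polled an unfinished Dataflow job with duration=None. A minimal sketch of the
bounded-wait pattern, assuming a trivial pipeline and an illustrative 15-minute
cap (neither is taken from this log):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions
    from apache_beam.runners.runner import PipelineState

    # Runner, project, region, temp_location, etc. are assumed to arrive via
    # command-line flags parsed by PipelineOptions(); the pipeline body is a
    # stand-in, not the wordcount example from the failing test.
    pipeline = beam.Pipeline(options=PipelineOptions())
    _ = pipeline | beam.Create([1, 2, 3]) | beam.Map(lambda x: x * x)

    result = pipeline.run()
    # duration is in milliseconds; None (the default) waits indefinitely,
    # which is what left this job polling until pytest's timeout hit.
    result.wait_until_finish(duration=15 * 60 * 1000)
    if result.state != PipelineState.DONE:
        result.cancel()  # stop the remote job instead of leaking it
        raise RuntimeError('Job did not reach DONE in time: %s' % result.state)
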
______________ ExerciseMetricsPipelineTest.test_metrics_fnapi_it _______________
[gw0] linux -- Python 3.9.10 
<https://ci-beam.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/build/gradleenv/-1734967050/bin/python3.9>

self = 
<apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest
 testMethod=test_metrics_fnapi_it>

    @pytest.mark.it_postcommit
    @pytest.mark.it_validatescontainer
    def test_metrics_fnapi_it(self):
>     result = self.run_pipeline(experiment='beam_fn_api')

apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py:57: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py:43: in 
run_pipeline
    return dataflow_exercise_metrics_pipeline.apply_and_run(p)
apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py:176: in 
apply_and_run
    result = pipeline.run()
apache_beam/pipeline.py:547: in run
    return Pipeline.from_runner_api(
apache_beam/pipeline.py:574: in run
    return self.runner.run_pipeline(self, self._options)
apache_beam/runners/dataflow/test_dataflow_runner.py:66: in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <DataflowPipelineResult <Job
 clientRequestId: '20220930233307138907-3550'
 createTime: '2022-09-30T23:33:08.226713Z'
...022-09-30T23:33:08.226713Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)> at 0x7fa6c75b0eb0>
duration = None

    def wait_until_finish(self, duration=None):
      if not self.is_in_terminal_state():
        if not self.has_job:
          raise IOError('Failed to get the Dataflow job id.')
        consoleUrl = (
            "Console URL: https://console.cloud.google.com/";
            f"dataflow/jobs/<RegionId>/{self.job_id()}"
            "?project=<ProjectId>")
        thread = threading.Thread(
            target=DataflowRunner.poll_for_job_completion,
            args=(self._runner, self, duration))
    
        # Mark the thread as a daemon thread so a keyboard interrupt on the main
        # thread will terminate everything. This is also the reason we will not
        # use thread.join() to wait for the polling thread.
        thread.daemon = True
        thread.start()
        while thread.is_alive():
          time.sleep(5.0)
    
        # TODO: Merge the termination code in poll_for_job_completion and
        # is_in_terminal_state.
        terminated = self.is_in_terminal_state()
        assert duration or terminated, (
            'Job did not reach to a terminal state after waiting indefinitely. '
            '{}'.format(consoleUrl))
    
        # TODO(https://github.com/apache/beam/issues/21695): Also run this check
        # if wait_until_finish was called after the pipeline completed.
        if terminated and self.state != PipelineState.DONE:
          # TODO(BEAM-1290): Consider converting this to an error log based on
          # the resolution of the issue.
          _LOGGER.error(consoleUrl)
>         raise DataflowRuntimeException(
              'Dataflow pipeline failed. State: %s, Error:\n%s' %
              (self.state, getattr(self._runner, 'last_error_msg', None)),
              self)
E         
apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow 
pipeline failed. State: FAILED, Error:
E         Workflow failed.

apache_beam/runners/dataflow/dataflow_runner.py:1673: DataflowRuntimeException
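
This second failure is not a timeout: the job reached a terminal state other
than DONE, so wait_until_finish() raised DataflowRuntimeException. A short
sketch of handling it, assuming `pipeline` is built as in the sketch after the
first failure:

    from apache_beam.runners.dataflow.dataflow_runner import (
        DataflowRuntimeException)

    try:
        result = pipeline.run()
        result.wait_until_finish()
    except DataflowRuntimeException as exc:
        # exc.result is the DataflowPipelineResult of the failed job; the
        # exception message already carries last_error_msg, as seen in the
        # traceback above.
        print('Dataflow job %s ended in state %s' %
              (exc.result.job_id(), exc.result.state))
        raise
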
=============================== warnings summary ===============================
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/hdfs/config.py:15
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/hdfs/config.py:15
  
<https://ci-beam.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/hdfs/config.py>:15:
 DeprecationWarning: the imp module is deprecated in favour of importlib; see 
the module's documentation for alternative uses
    from imp import load_source
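
(The warning above comes from the third-party hdfs package still using
imp.load_source. For reference, a self-contained sketch of the importlib
replacement the message points to; the module name and temp file are made up
for the example.)

    import importlib.util
    import pathlib
    import tempfile

    # Write a throwaway module so the example can actually run.
    path = pathlib.Path(tempfile.mkdtemp()) / 'example_config.py'
    path.write_text('VALUE = 42\n')

    # importlib equivalent of imp.load_source('example_config', str(path)).
    spec = importlib.util.spec_from_file_location('example_config', path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    print(module.VALUE)  # 42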

../../build/gradleenv/-1734967050/lib/python3.9/site-packages/google/api_core/grpc_helpers.py:37
../../build/gradleenv/-1734967050/lib/python3.9/site-packages/google/api_core/grpc_helpers.py:37
  
<https://ci-beam.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/google/api_core/grpc_helpers.py>:37:
 DeprecationWarning: Support for grpcio-gcp is deprecated. This feature will be
              removed from `google-api-core` after January 1, 2024. If you need to
              continue to use this feature, please pin to a specific version of
              `google-api-core`.
    warnings.warn(

../../build/gradleenv/-1734967050/lib/python3.9/site-packages/tenacity/_asyncio.py:42
  
<https://ci-beam.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/build/gradleenv/-1734967050/lib/python3.9/site-packages/tenacity/_asyncio.py>:42:
 DeprecationWarning: "@coroutine" decorator is deprecated since Python 3.8, use 
"async def" instead
    def call(self, fn, *args, **kwargs):

apache_beam/typehints/pandas_type_compatibility_test.py:66
  
<https://ci-beam.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/typehints/pandas_type_compatibility_test.py>:66:
 FutureWarning: pandas.Int64Index is deprecated and will be removed from pandas 
in a future version. Use pandas.Index with the appropriate dtype instead.
    }).set_index(pd.Int64Index(range(123, 223), name='an_index')),

apache_beam/typehints/pandas_type_compatibility_test.py:89
  
<https://ci-beam.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/typehints/pandas_type_compatibility_test.py>:89:
 FutureWarning: pandas.Int64Index is deprecated and will be removed from pandas 
in a future version. Use pandas.Index with the appropriate dtype instead.
    pd.Int64Index(range(123, 223), name='an_index'),

apache_beam/typehints/pandas_type_compatibility_test.py:90
  
<https://ci-beam.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/apache_beam/typehints/pandas_type_compatibility_test.py>:90:
 FutureWarning: pandas.Int64Index is deprecated and will be removed from pandas 
in a future version. Use pandas.Index with the appropriate dtype instead.
    pd.Int64Index(range(475, 575), name='another_index'),
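
(The three FutureWarnings above come from pandas_type_compatibility_test.py
constructing pd.Int64Index directly. A minimal sketch of the replacement the
warning recommends, with an illustrative DataFrame around it.)

    import pandas as pd

    # Deprecated:  pd.Int64Index(range(123, 223), name='an_index')
    # Suggested:   pd.Index with an explicit integer dtype.
    index = pd.Index(range(123, 223), dtype='int64', name='an_index')
    df = pd.DataFrame({'value': range(100)}, index=index)
    assert df.index.name == 'an_index'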

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: 
<https://ci-beam.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/pytest-beam_python3.9_sdk.xml>
 -
=========================== short test summary info ============================
FAILED 
apache_beam/examples/wordcount_it_test.py::WordCountIT::test_wordcount_it_with_prebuilt_sdk_container_cloud_build
FAILED 
apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py::ExerciseMetricsPipelineTest::test_metrics_fnapi_it
======= 2 failed, 2 passed, 3 skipped, 8 warnings in 2210.44s (0:36:50) ========
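
For reference, the two failed tests can be re-run locally through pytest's
Python entry point. The node IDs are copied from the summary above; everything
passed in --test-pipeline-options is a placeholder that has to be replaced with
real GCP settings, and the snippet assumes it is run from sdks/python so Beam's
pytest configuration registers that flag.

    import pytest

    pytest.main([
        'apache_beam/examples/wordcount_it_test.py::WordCountIT::'
        'test_wordcount_it_with_prebuilt_sdk_container_cloud_build',
        'apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py::'
        'ExerciseMetricsPipelineTest::test_metrics_fnapi_it',
        # Placeholders: substitute a real project, region, and GCS bucket.
        '--test-pipeline-options=--runner=TestDataflowRunner '
        '--project=<ProjectId> --region=<RegionId> '
        '--temp_location=gs://<bucket>/tmp',
    ])
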
cleanup_container
Untagged: 
us.gcr.io/apache-beam-testing/jenkins/beam_python3.9_sdk:20220930-225419099278790
Untagged: 
us.gcr.io/apache-beam-testing/jenkins/beam_python3.9_sdk@sha256:b0c89307224b68dd7cc7544e532db2f5b02ecbc958b9608f1965bc33f9d3e164
Untagged: 
us.gcr.io/apache-beam-testing/jenkins/prebuild_python39_sdk/beam_python_prebuilt_sdk:38d136f3-30e4-4685-a65c-54d178ec893d
Untagged: 
us.gcr.io/apache-beam-testing/jenkins/prebuild_python39_sdk/beam_python_prebuilt_sdk@sha256:f1343062964dbf37e15e71a22776e64ba0694fe607c3cee861e4a4300936cd86
Deleted: sha256:68ae11c96a28580e1fa2d450d4fd5df3017a6407c65c0d45788992e3af4ee310
Deleted: sha256:441bf5bb051f8739c05e4c0e80e17ee605db26c35cac0a446698732d149af13a
Deleted: sha256:b323bd569a2edbace3024d3941cf66fe52038020cbf440f99267e2d17fd499ff
Deleted: sha256:0c6be3cea6e7acca236405e0895cdd897ef12f93dd39dd6d67cd565dbcd988e1
Deleted: sha256:1a61d00b5770e9a1277b19d162a0ab3607c9e0b3bcf74802493ca0ca03cb45ec
Deleted: sha256:1459812fc8b1e945962aae29e2f66f7b618b7b4405879fe2a8557a1363e7ab34
Deleted: sha256:c07f6b57a69af962d2de2eb89182de30889b9d8b5057521b33ffa0859d439155
Deleted: sha256:5360c2331de0a3a0b4213aab7351f25507ea7502732c07c112f4b86ec02c0105
Deleted: sha256:e06e40cb086ad1604c6999d9c208e80c187bb2b9e4af9214370f13d660d227f0
Deleted: sha256:358485d14df244cda59747c4ddc61acedb2a52ce6c636077a8588b47de1c8033
Digests:
- 
us.gcr.io/apache-beam-testing/jenkins/beam_python3.9_sdk@sha256:b0c89307224b68dd7cc7544e532db2f5b02ecbc958b9608f1965bc33f9d3e164
  Associated tags:
 - 20220930-225419099278790
Tags:
- 
us.gcr.io/apache-beam-testing/jenkins/beam_python3.9_sdk:20220930-225419099278790
Deleted 
[us.gcr.io/apache-beam-testing/jenkins/beam_python3.9_sdk:20220930-225419099278790].

> Task :sdks:python:test-suites:dataflow:py38:validatesContainer
Digests:
- 
us.gcr.io/apache-beam-testing/jenkins/prebuild_python38_sdk/beam_python_prebuilt_sdk@sha256:d3b7c99c16cc98291a5682928ee097a91ffa4030dc5c4667deca04ba22f3d29e
  Associated tags:
 - b6841598-2910-4190-be54-4cf6cd575114
Deleted 
[us.gcr.io/apache-beam-testing/jenkins/prebuild_python38_sdk/beam_python_prebuilt_sdk:b6841598-2910-4190-be54-4cf6cd575114].
Deleted 
[us.gcr.io/apache-beam-testing/jenkins/prebuild_python38_sdk/beam_python_prebuilt_sdk@sha256:d3b7c99c16cc98291a5682928ee097a91ffa4030dc5c4667deca04ba22f3d29e].
Digests:
- 
us.gcr.io/apache-beam-testing/jenkins/prebuild_python38_sdk/beam_python_prebuilt_sdk@sha256:47612cf34e989291cfaf06f662080178887a78b7073ec1087d07d2716e1c6b8e
  Associated tags:
 - e5ccca92-02af-4f82-84b9-0eb00b397fdf
Deleted 
[us.gcr.io/apache-beam-testing/jenkins/prebuild_python38_sdk/beam_python_prebuilt_sdk:e5ccca92-02af-4f82-84b9-0eb00b397fdf].
Deleted 
[us.gcr.io/apache-beam-testing/jenkins/prebuild_python38_sdk/beam_python_prebuilt_sdk@sha256:47612cf34e989291cfaf06f662080178887a78b7073ec1087d07d2716e1c6b8e].
Removed the container

> Task :sdks:python:test-suites:dataflow:py39:validatesContainer
Deleted 
[us.gcr.io/apache-beam-testing/jenkins/beam_python3.9_sdk@sha256:b0c89307224b68dd7cc7544e532db2f5b02ecbc958b9608f1965bc33f9d3e164].
Digests:
- 
us.gcr.io/apache-beam-testing/jenkins/prebuild_python39_sdk/beam_python_prebuilt_sdk@sha256:f1343062964dbf37e15e71a22776e64ba0694fe607c3cee861e4a4300936cd86
  Associated tags:
 - 38d136f3-30e4-4685-a65c-54d178ec893d
Deleted 
[us.gcr.io/apache-beam-testing/jenkins/prebuild_python39_sdk/beam_python_prebuilt_sdk:38d136f3-30e4-4685-a65c-54d178ec893d].
Deleted 
[us.gcr.io/apache-beam-testing/jenkins/prebuild_python39_sdk/beam_python_prebuilt_sdk@sha256:f1343062964dbf37e15e71a22776e64ba0694fe607c3cee861e4a4300936cd86].
Digests:
- 
us.gcr.io/apache-beam-testing/jenkins/prebuild_python39_sdk/beam_python_prebuilt_sdk@sha256:8688c91451503d4eee5fc60d14c762b3a74012d9b00822a7a93252a5b4736e95
  Associated tags:
 - 58da1336-7f62-4873-a6df-2b5dc6f38899
Deleted 
[us.gcr.io/apache-beam-testing/jenkins/prebuild_python39_sdk/beam_python_prebuilt_sdk:58da1336-7f62-4873-a6df-2b5dc6f38899].
Deleted 
[us.gcr.io/apache-beam-testing/jenkins/prebuild_python39_sdk/beam_python_prebuilt_sdk@sha256:8688c91451503d4eee5fc60d14c762b3a74012d9b00822a7a93252a5b4736e95].

> Task :sdks:python:test-suites:dataflow:py39:validatesContainer FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_ValCont/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 359

* What went wrong:
Execution failed for task 
':sdks:python:test-suites:dataflow:py39:validatesContainer'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings 
and determine if they come from your own scripts or plugins.

See 
https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 54m 47s
51 actionable tasks: 41 executed, 4 from cache, 6 up-to-date

Publishing build scan...
https://gradle.com/s/qmvamfcojhlqc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
