See 
<https://ci-beam.apache.org/job/beam_PostCommit_Python310/1452/display/redirect>

Changes:


------------------------------------------
[...truncated 12.03 MB...]
___________________ SklearnInference.test_sklearn_regression ___________________
[gw2] linux -- Python 3.10.2 <https://ci-beam.apache.org/job/beam_PostCommit_Python310/ws/src/build/gradleenv/2050596098/bin/python3.10>

self = <apache_beam.ml.inference.sklearn_inference_it_test.SklearnInference testMethod=test_sklearn_regression>

    @unittest.skipIf(sys.version_info >= (3, 11, 0), "Beam#27151")
    def test_sklearn_regression(self):
      test_pipeline = TestPipeline(is_integration_test=True)
      input_file = 'gs://apache-beam-ml/testing/inputs/japanese_housing_test_data.csv'  # pylint: disable=line-too-long
      output_file_dir = 'gs://temp-storage-for-end-to-end-tests'
      output_file = '/'.join([output_file_dir, str(uuid.uuid4()), 'result.txt'])
      model_path = 'gs://apache-beam-ml/models/japanese_housing/'
      extra_opts = {
          'input': input_file,
          'output': output_file,
          'model_path': model_path,
      }
>     sklearn_japanese_housing_regression.run(
          test_pipeline.get_full_options_as_args(**extra_opts),
          save_main_session=False)

apache_beam/ml/inference/sklearn_inference_it_test.py:133: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
apache_beam/examples/inference/sklearn_japanese_housing_regression.py:179: in run
    result = pipeline.run()
apache_beam/pipeline.py:560: in run
    self._options).run(False)
apache_beam/pipeline.py:584: in run
    return self.runner.run_pipeline(self, self._options)
apache_beam/runners/dataflow/test_dataflow_runner.py:66: in run_pipeline
    self.result.wait_until_finish(duration=wait_duration)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <DataflowPipelineResult <Job
 clientRequestId: '20231017102233277644-2841'
 createTime: '2023-10-17T10:23:45.089516Z'
...023-10-17T10:23:45.089516Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)> at 0x7f11a7117df0>
duration = None

    def wait_until_finish(self, duration=None):
      if not self.is_in_terminal_state():
        if not self.has_job:
          raise IOError('Failed to get the Dataflow job id.')
        consoleUrl = (
            "Console URL: https://console.cloud.google.com/"
            f"dataflow/jobs/<RegionId>/{self.job_id()}"
            "?project=<ProjectId>")
        thread = threading.Thread(
            target=DataflowRunner.poll_for_job_completion,
            args=(self._runner, self, duration))
    
        # Mark the thread as a daemon thread so a keyboard interrupt on the main
        # thread will terminate everything. This is also the reason we will not
        # use thread.join() to wait for the polling thread.
        thread.daemon = True
        thread.start()
        while thread.is_alive():
          time.sleep(5.0)
    
        # TODO: Merge the termination code in poll_for_job_completion and
        # is_in_terminal_state.
        terminated = self.is_in_terminal_state()
        assert duration or terminated, (
            'Job did not reach to a terminal state after waiting indefinitely. '
            '{}'.format(consoleUrl))
    
        if terminated and self.state != PipelineState.DONE:
          # TODO(BEAM-1290): Consider converting this to an error log based on
          # the resolution of the issue.
          _LOGGER.error(consoleUrl)
>         raise DataflowRuntimeException(
              'Dataflow pipeline failed. State: %s, Error:\n%s' %
              (self.state, getattr(self._runner, 'last_error_msg', None)),
E             apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
E             Workflow failed.

apache_beam/runners/dataflow/dataflow_runner.py:771: DataflowRuntimeException
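
For reference, this failure reaches test code as a DataflowRuntimeException raised from wait_until_finish(). A minimal sketch of catching it and logging the runner's last error message (the Create/Map transforms below are placeholders, not the transforms of the failing test):

    import logging

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions
    from apache_beam.runners.dataflow.dataflow_runner import DataflowRuntimeException

    def run_and_report(pipeline_argv):
        # Build and run a pipeline, then block until it reaches a terminal state.
        options = PipelineOptions(pipeline_argv)
        pipeline = beam.Pipeline(options=options)
        _ = pipeline | beam.Create([1, 2, 3]) | beam.Map(lambda x: x * x)  # placeholder transforms
        result = pipeline.run()
        try:
            # Raises DataflowRuntimeException when the job ends in FAILED, as seen above.
            result.wait_until_finish()
        except DataflowRuntimeException as exc:
            # The exception message already embeds last_error_msg from the runner.
            logging.error('Job finished in state %s: %s', result.state, exc)
            raise
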
------------------------------ Captured log call -------------------------------
INFO     apache_beam.runners.portability.stager:stager.py:761 Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Python310/ws/src/build/gradleenv/2050596098/bin/python3.10>', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', '/tmp/tmpm1hdc8kf/tmp_requirements.txt', '--exists-action', 'i', '--no-deps', '--implementation', 'cp', '--abi', 'cp310', '--platform', 'manylinux2014_x86_64']
INFO     apache_beam.runners.portability.stager:stager.py:322 Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PostCommit_Python310/ws/src/sdks/python/build/apache_beam-2.52.0.dev0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl>" to staging location.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:395 Pipeline 
has additional dependencies to be installed in SDK worker container, consider 
using the SDK container image pre-building workflow to avoid repetitive 
installations. Learn more on 
https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
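
If the repeated dependency installation flagged above becomes a bottleneck, the pre-building workflow linked in that message can be enabled through pipeline options. A sketch, assuming the documented prebuild_sdk_container_engine and docker_registry_push_url options (the registry URL below is a placeholder):

    # Illustrative only: ask Beam to pre-build an SDK container with the staged
    # dependencies instead of installing them on every worker at startup.
    prebuild_args = [
        '--prebuild_sdk_container_engine=cloud_build',  # or 'local_docker'
        '--docker_registry_push_url=us-central1-docker.pkg.dev/<project>/<repo>',  # placeholder
    ]
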
INFO     root:environments.py:313 Using provided Python SDK container 
image: gcr.io/cloud-dataflow/v1beta3/beam_python3.10_sdk:beam-master-20231009
INFO     root:environments.py:320 Python SDK container image set to 
"gcr.io/cloud-dataflow/v1beta3/beam_python3.10_sdk:beam-master-20231009" for 
Docker environment
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:712 
==================== <function pack_combiners at 0x7f11d995fac0> 
====================
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:712 
==================== <function sort_stages at 0x7f11d996c310> 
====================
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1017102233-276574-7eanav4g.1697538153.276765/mock-2.0.0-py2.py3-none-any.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1017102233-276574-7eanav4g.1697538153.276765/mock-2.0.0-py2.py3-none-any.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1017102233-276574-7eanav4g.1697538153.276765/seaborn-0.12.2-py3-none-any.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1017102233-276574-7eanav4g.1697538153.276765/seaborn-0.12.2-py3-none-any.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1017102233-276574-7eanav4g.1697538153.276765/seaborn-0.13.0-py3-none-any.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1017102233-276574-7eanav4g.1697538153.276765/seaborn-0.13.0-py3-none-any.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1017102233-276574-7eanav4g.1697538153.276765/PyHamcrest-1.10.1-py3-none-any.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1017102233-276574-7eanav4g.1697538153.276765/PyHamcrest-1.10.1-py3-none-any.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1017102233-276574-7eanav4g.1697538153.276765/inflection-0.5.1-py2.py3-none-any.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1017102233-276574-7eanav4g.1697538153.276765/inflection-0.5.1-py2.py3-none-any.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1017102233-276574-7eanav4g.1697538153.276765/beautifulsoup4-4.12.2-py3-none-any.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1017102233-276574-7eanav4g.1697538153.276765/beautifulsoup4-4.12.2-py3-none-any.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1017102233-276574-7eanav4g.1697538153.276765/parameterized-0.7.5-py2.py3-none-any.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1017102233-276574-7eanav4g.1697538153.276765/parameterized-0.7.5-py2.py3-none-any.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1017102233-276574-7eanav4g.1697538153.276765/tensorflow_transform-1.14.0-py3-none-any.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1017102233-276574-7eanav4g.1697538153.276765/tensorflow_transform-1.14.0-py3-none-any.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1017102233-276574-7eanav4g.1697538153.276765/google_cloud_aiplatform-1.33.0-py2.py3-none-any.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1017102233-276574-7eanav4g.1697538153.276765/google_cloud_aiplatform-1.33.0-py2.py3-none-any.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1017102233-276574-7eanav4g.1697538153.276765/google_cloud_aiplatform-1.33.1-py2.py3-none-any.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1017102233-276574-7eanav4g.1697538153.276765/google_cloud_aiplatform-1.33.1-py2.py3-none-any.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1017102233-276574-7eanav4g.1697538153.276765/google_cloud_aiplatform-1.34.0-py2.py3-none-any.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1017102233-276574-7eanav4g.1697538153.276765/google_cloud_aiplatform-1.34.0-py2.py3-none-any.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1017102233-276574-7eanav4g.1697538153.276765/google_cloud_aiplatform-1.35.0-py2.py3-none-any.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1017102233-276574-7eanav4g.1697538153.276765/google_cloud_aiplatform-1.35.0-py2.py3-none-any.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1017102233-276574-7eanav4g.1697538153.276765/tfx_bsl-1.14.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1017102233-276574-7eanav4g.1697538153.276765/tfx_bsl-1.14.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
 in 1 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1017102233-276574-7eanav4g.1697538153.276765/matplotlib-3.7.3-cp38-cp38-manylinux_2_12_x86_64.manylinux2010_x86_64.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1017102233-276574-7eanav4g.1697538153.276765/matplotlib-3.7.3-cp38-cp38-manylinux_2_12_x86_64.manylinux2010_x86_64.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1017102233-276574-7eanav4g.1697538153.276765/matplotlib-3.8.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1017102233-276574-7eanav4g.1697538153.276765/matplotlib-3.8.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
 in 1 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1017102233-276574-7eanav4g.1697538153.276765/tensorflow-2.13.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1017102233-276574-7eanav4g.1697538153.276765/tensorflow-2.13.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
 in 29 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1017102233-276574-7eanav4g.1697538153.276765/tensorflow-2.13.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1017102233-276574-7eanav4g.1697538153.276765/tensorflow-2.13.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
 in 27 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1017102233-276574-7eanav4g.1697538153.276765/matplotlib-3.8.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1017102233-276574-7eanav4g.1697538153.276765/matplotlib-3.8.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1017102233-276574-7eanav4g.1697538153.276765/scikit_learn-1.0.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1017102233-276574-7eanav4g.1697538153.276765/scikit_learn-1.0.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
 in 1 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1017102233-276574-7eanav4g.1697538153.276765/scikit_learn-1.0.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1017102233-276574-7eanav4g.1697538153.276765/scikit_learn-1.0.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
 in 1 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1017102233-276574-7eanav4g.1697538153.276765/requirements.txt...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1017102233-276574-7eanav4g.1697538153.276765/requirements.txt
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1017102233-276574-7eanav4g.1697538153.276765/apache_beam-2.52.0.dev0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1017102233-276574-7eanav4g.1697538153.276765/apache_beam-2.52.0.dev0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
 in 1 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1017102233-276574-7eanav4g.1697538153.276765/pipeline.pb...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1017102233-276574-7eanav4g.1697538153.276765/pipeline.pb
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:843 Create job: 
<Job
 clientRequestId: '20231017102233277644-2841'
 createTime: '2023-10-17T10:23:45.089516Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-10-17_03_23_41-11162700211141854183'
 location: 'us-central1'
 name: 'beamapp-jenkins-1017102233-276574-7eanav4g'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-10-17T10:23:45.089516Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:845 Created job 
with id: [2023-10-17_03_23_41-11162700211141854183]
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:846 Submitted job: 
2023-10-17_03_23_41-11162700211141854183
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:847 To access the 
Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-10-17_03_23_41-11162700211141854183?project=apache-beam-testing
INFO     
apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:58 
Console log: 
INFO     
apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:59 
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-10-17_03_23_41-11162700211141854183?project=apache-beam-testing
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:150 Job 
2023-10-17_03_23_41-11162700211141854183 is in state JOB_STATE_RUNNING
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-17T10:23:49.072Z: JOB_MESSAGE_BASIC: Worker configuration: 
e2-standard-2 in us-central1-b.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-17T10:23:52.199Z: JOB_MESSAGE_BASIC: Executing operation 
WriteOutput/Write/WriteImpl/DoOnce/Impulse+WriteOutput/Write/WriteImpl/DoOnce/FlatMap(<lambda
 at 
core.py:3759>)+WriteOutput/Write/WriteImpl/DoOnce/Map(decode)+WriteOutput/Write/WriteImpl/InitializeWrite
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-17T10:23:52.224Z: JOB_MESSAGE_BASIC: Executing operation 
WriteOutput/Write/WriteImpl/GroupByKey/Create
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-17T10:23:52.307Z: JOB_MESSAGE_BASIC: Starting 1 workers in 
us-central1-b...
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-17T10:24:01.609Z: JOB_MESSAGE_BASIC: Finished operation 
WriteOutput/Write/WriteImpl/GroupByKey/Create
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-17T10:24:01.705Z: JOB_MESSAGE_BASIC: Executing operation 
FileNames/Impulse+FileNames/FlatMap(<lambda at 
core.py:3759>)+FileNames/Map(decode)+ParDo(LoadDataframe)+Partition/ParDo(ApplyPartitionFnFn)/ParDo(ApplyPartitionFnFn)+RunInference
 all_features/BatchElements/ParDo(_GlobalWindowsBatchingDoFn)+RunInference 
all_features/BeamML_RunInference+AllPredictions+WriteOutput/Write/WriteImpl/Map(<lambda
 at 
iobase.py:1144>)+WriteOutput/Write/WriteImpl/WindowInto(WindowIntoFn)+RunInference
 floor_area/BatchElements/ParDo(_GlobalWindowsBatchingDoFn)+RunInference 
floor_area/BeamML_RunInference+AllPredictions+WriteOutput/Write/WriteImpl/Map(<lambda
 at 
iobase.py:1144>)+WriteOutput/Write/WriteImpl/WindowInto(WindowIntoFn)+RunInference
 no_features/BatchElements/ParDo(_GlobalWindowsBatchingDoFn)+RunInference 
no_features/BeamML_RunInference+AllPredictions+WriteOutput/Write/WriteImpl/Map(<lambda
 at 
iobase.py:1144>)+WriteOutput/Write/WriteImpl/WindowInto(WindowIntoFn)+RunInference
 stations/BatchElements/ParDo(_GlobalWindowsBatchingDoFn)+RunInference 
stations/BeamML_RunInference+AllPredictions+WriteOutput/Write/WriteImpl/Map(<lambda
 at 
iobase.py:1144>)+WriteOutput/Write/WriteImpl/WindowInto(WindowIntoFn)+WriteOutput/Write/WriteImpl/GroupByKey/Session/Flatten+WriteOutput/Write/WriteImpl/GroupByKey/Write
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-17T10:24:27.107Z: JOB_MESSAGE_BASIC: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
ERROR    apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-10-17T10:46:41.091Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-b failed to bring up any of the desired 1 workers. Please refer to https://cloud.google.com/dataflow/docs/guides/common-errors#worker-pool-failure for help troubleshooting. INTERNAL_ERROR: Instance 'beamapp-jenkins-101710223-10170323-3kz0-harness-vbh4' creation failed: Internal error. Please try again or contact Google Support. (Code: '8589016946754060551')
ERROR    apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-10-17T10:46:41.117Z: JOB_MESSAGE_ERROR: Workflow failed.
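
The worker-pool error above is reported by the Compute Engine side of the job; when triaging it, the corresponding worker startup entries can be pulled from Cloud Logging. A rough sketch using google-cloud-logging (the job id comes from this log; treat the exact filter fields as an assumption):

    from google.cloud import logging as cloud_logging

    client = cloud_logging.Client(project='apache-beam-testing')
    job_id = '2023-10-17_03_23_41-11162700211141854183'
    log_filter = (
        'resource.type="dataflow_step" '
        f'AND resource.labels.job_id="{job_id}" '
        'AND severity>=ERROR')
    # Print the error entries recorded while the worker pool was starting up.
    for entry in client.list_entries(filter_=log_filter):
        print(entry.timestamp, entry.payload)
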
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-17T10:46:41.172Z: JOB_MESSAGE_BASIC: Finished operation 
FileNames/Impulse+FileNames/FlatMap(<lambda at 
core.py:3759>)+FileNames/Map(decode)+ParDo(LoadDataframe)+Partition/ParDo(ApplyPartitionFnFn)/ParDo(ApplyPartitionFnFn)+RunInference
 all_features/BatchElements/ParDo(_GlobalWindowsBatchingDoFn)+RunInference 
all_features/BeamML_RunInference+AllPredictions+WriteOutput/Write/WriteImpl/Map(<lambda
 at 
iobase.py:1144>)+WriteOutput/Write/WriteImpl/WindowInto(WindowIntoFn)+RunInference
 floor_area/BatchElements/ParDo(_GlobalWindowsBatchingDoFn)+RunInference 
floor_area/BeamML_RunInference+AllPredictions+WriteOutput/Write/WriteImpl/Map(<lambda
 at 
iobase.py:1144>)+WriteOutput/Write/WriteImpl/WindowInto(WindowIntoFn)+RunInference
 no_features/BatchElements/ParDo(_GlobalWindowsBatchingDoFn)+RunInference 
no_features/BeamML_RunInference+AllPredictions+WriteOutput/Write/WriteImpl/Map(<lambda
 at 
iobase.py:1144>)+WriteOutput/Write/WriteImpl/WindowInto(WindowIntoFn)+RunInference
 stations/BatchElements/ParDo(_GlobalWindowsBatchingDoFn)+RunInference 
stations/BeamML_RunInference+AllPredictions+WriteOutput/Write/WriteImpl/Map(<lambda
 at 
iobase.py:1144>)+WriteOutput/Write/WriteImpl/WindowInto(WindowIntoFn)+WriteOutput/Write/WriteImpl/GroupByKey/Session/Flatten+WriteOutput/Write/WriteImpl/GroupByKey/Write
WARNING  
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:202 
2023-10-17T10:46:41.245Z: JOB_MESSAGE_WARNING: Unable to delete temp files: 
"gs://temp-storage-for-end-to-end-tests/temp-it/beamapp-jenkins-1017102233-276574-7eanav4g.1697538153.276765/dax-tmp-2023-10-17_03_23_41-11162700211141854183-S01-0-b00aa9f9a6d933a6/[email protected]."
WARNING  
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:202 
2023-10-17T10:46:41.287Z: JOB_MESSAGE_WARNING: Unable to delete temp files: 
"gs://temp-storage-for-end-to-end-tests/temp-it/beamapp-jenkins-1017102233-276574-7eanav4g.1697538153.276765/dax-tmp-2023-10-17_03_23_41-11162700211141854183-S01-1-94be3161b34d5020/[email protected]."
WARNING  
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:202 
2023-10-17T10:46:41.311Z: JOB_MESSAGE_WARNING: 
S01:WriteOutput/Write/WriteImpl/DoOnce/Impulse+WriteOutput/Write/WriteImpl/DoOnce/FlatMap(<lambda
 at 
core.py:3759>)+WriteOutput/Write/WriteImpl/DoOnce/Map(decode)+WriteOutput/Write/WriteImpl/InitializeWrite
 failed.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-17T10:46:41.331Z: JOB_MESSAGE_BASIC: Finished operation 
WriteOutput/Write/WriteImpl/DoOnce/Impulse+WriteOutput/Write/WriteImpl/DoOnce/FlatMap(<lambda
 at 
core.py:3759>)+WriteOutput/Write/WriteImpl/DoOnce/Map(decode)+WriteOutput/Write/WriteImpl/InitializeWrite
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-17T10:46:41.451Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-17T10:46:57.499Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:150 Job 
2023-10-17_03_23_41-11162700211141854183 is in state JOB_STATE_FAILED
ERROR    apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:770 Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-10-17_03_23_41-11162700211141854183?project=<ProjectId>
=============================== warnings summary ===============================
../../build/gradleenv/2050596098/lib/python3.10/site-packages/google/api_core/operations_v1/abstract_operations_client.py:17
../../build/gradleenv/2050596098/lib/python3.10/site-packages/google/api_core/operations_v1/abstract_operations_client.py:17
../../build/gradleenv/2050596098/lib/python3.10/site-packages/google/api_core/operations_v1/abstract_operations_client.py:17
../../build/gradleenv/2050596098/lib/python3.10/site-packages/google/api_core/operations_v1/abstract_operations_client.py:17
../../build/gradleenv/2050596098/lib/python3.10/site-packages/google/api_core/operations_v1/abstract_operations_client.py:17
../../build/gradleenv/2050596098/lib/python3.10/site-packages/google/api_core/operations_v1/abstract_operations_client.py:17
../../build/gradleenv/2050596098/lib/python3.10/site-packages/google/api_core/operations_v1/abstract_operations_client.py:17
../../build/gradleenv/2050596098/lib/python3.10/site-packages/google/api_core/operations_v1/abstract_operations_client.py:17
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python310/ws/src/build/gradleenv/2050596098/lib/python3.10/site-packages/google/api_core/operations_v1/abstract_operations_client.py>:17:
 DeprecationWarning: The distutils package is deprecated and slated for removal 
in Python 3.12. Use setuptools or check PEP 632 for potential alternatives
    from distutils import util

apache_beam/examples/complete/game/game_stats_it_test.py::GameStatsIT::test_game_stats_it
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_it
apache_beam/examples/complete/game/leader_board_it_test.py::LeaderBoardIT::test_leader_board_it
apache_beam/io/gcp/bigquery_test.py::PubSubBigQueryIT::test_file_loads
apache_beam/io/gcp/bigquery_test.py::PubSubBigQueryIT::test_streaming_inserts
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python310/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:63:
 PendingDeprecationWarning: Client.dataset is deprecated and will be removed in 
a future version. Use a string like 'my_project.my_dataset' or a 
cloud.google.bigquery.DatasetReference object, instead.
    dataset_ref = client.dataset(unique_dataset_name, project=project)

apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python310/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100:
 PendingDeprecationWarning: Client.dataset is deprecated and will be removed in 
a future version. Use a string like 'my_project.my_dataset' or a 
cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)
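
The deprecation warning above names its own replacement; a sketch of the non-deprecated form (project, dataset and table names are placeholders):

    from google.cloud import bigquery

    client = bigquery.Client(project='my_project')
    # Instead of client.dataset(dataset_id).table(table_id):
    dataset_ref = bigquery.DatasetReference('my_project', 'my_dataset')
    table_ref = dataset_ref.table('my_table')
    # Or pass fully qualified string IDs straight to the client:
    table = client.get_table('my_project.my_dataset.my_table')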

apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python310/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:47:
 FutureWarning: The default value of numeric_only in DataFrame.mean is 
deprecated. In a future version, it will default to False. In addition, 
specifying 'numeric_only=None' is deprecated. Select only valid columns or 
specify the value of numeric_only to silence this warning.
    return airline_df[at_top_airports].mean()
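
The pandas FutureWarning above is silenced by passing numeric_only explicitly; a one-line sketch of the suggested fix (airline_df and at_top_airports are the names from the quoted example):

    # Keep the current behaviour explicit once the pandas default flips to False.
    mean_delays = airline_df[at_top_airports].mean(numeric_only=True)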

apache_beam/io/gcp/bigquery_read_it_test.py::ReadTests::test_native_source
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python310/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:170:
 BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use 
ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))

apache_beam/io/gcp/bigquery_read_it_test.py::ReadNewTypesTests::test_native_source
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python310/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:706:
 BeamDeprecationWarning: BigQuerySource is deprecated since 2.25.0. Use 
ReadFromBigQuery instead.
    beam.io.BigQuerySource(query=self.query, use_standard_sql=True)))
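
Both BeamDeprecationWarnings above point to the same replacement; a sketch of the ReadFromBigQuery form (the query string and GCS location are placeholders; a gcs_location or pipeline temp_location is needed for the BigQuery export):

    import apache_beam as beam

    with beam.Pipeline() as p:
        # Replacement for beam.io.BigQuerySource(query=..., use_standard_sql=True):
        rows = p | 'Read' >> beam.io.ReadFromBigQuery(
            query='SELECT 1 AS x',
            use_standard_sql=True,
            gcs_location='gs://my-temp-bucket/bq-export')  # placeholder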

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python310/ws/src/sdks/python/pytest_postCommitIT-df-py310.xml> -
=========================== short test summary info ============================
FAILED apache_beam/io/gcp/bigquery_test.py::PubSubBigQueryIT::test_streaming_inserts - RuntimeError: Timeout after 600 seconds while waiting for job 2023-10-17_03_11_20-156576803112257216 enters expected state CANCELLED. Current state is CANCELLING.
FAILED apache_beam/io/gcp/bigquery_read_it_test.py::ReadTests::test_table_schema_retrieve_specifying_only_table - apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
Workflow failed.
FAILED apache_beam/io/gcp/datastore/v1new/datastore_write_it_test.py::DatastoreWriteIT::test_datastore_write_limit - apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
Workflow failed.
FAILED apache_beam/ml/inference/sklearn_inference_it_test.py::SklearnInference::test_sklearn_regression - apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
Workflow failed.
===== 4 failed, 84 passed, 50 skipped, 17 warnings in 11558.21s (3:12:38) ======

> Task :sdks:python:test-suites:dataflow:py310:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python310/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 139

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py310:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Get more help at https://help.gradle.org.

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 9.0.

You can use '--warning-mode all' to show the individual deprecation warnings 
and determine if they come from your own scripts or plugins.

For more on this, please refer to 
https://docs.gradle.org/8.3/userguide/command_line_interface.html#sec:command_line_warnings
 in the Gradle documentation.

BUILD FAILED in 3h 18m 36s
214 actionable tasks: 150 executed, 60 from cache, 4 up-to-date

Publishing build scan...
https://ge.apache.org/s/merxfwluors5k

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]

