See <https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/1225/display/redirect?page=changes>

Changes:

[noreply] [Tour Of Beam] Playground Router GRPC API host (#24542)

[noreply] Bump golang.org/x/net from 0.3.0 to 0.4.0 in /sdks (#24587)

[noreply] Replaced finalize with DoFn Teardown in Neo4jIO (#24571)


------------------------------------------
[...truncated 455.39 KB...]
ERROR    apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:1552 Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2022-12-08_10_40_26-9982826075831059323?project=<ProjectId>
=============================== warnings summary ===============================
../../build/gradleenv/1922375555/lib/python3.10/site-packages/hdfs/config.py:15 (9 occurrences)
  <https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.10/site-packages/hdfs/config.py>:15: DeprecationWarning: the imp module is deprecated in favour of importlib and slated for removal in Python 3.12; see the module's documentation for alternative uses
    from imp import load_source
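
The warning above comes from the hdfs package still importing the deprecated imp module (slated for removal in Python 3.12). A minimal sketch of the importlib-based equivalent of imp.load_source(); the function below is illustrative, not hdfs's actual code:

    import importlib.util
    import sys

    def load_source(module_name, file_path):
        """Load a module from a file path, roughly as imp.load_source() did."""
        spec = importlib.util.spec_from_file_location(module_name, file_path)
        module = importlib.util.module_from_spec(spec)
        sys.modules[module_name] = module  # mirror imp's caching of the loaded module
        spec.loader.exec_module(module)
        return module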

../../build/gradleenv/1922375555/lib/python3.10/site-packages/google/api_core/operations_v1/abstract_operations_client.py:17 (8 occurrences)
  <https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.10/site-packages/google/api_core/operations_v1/abstract_operations_client.py>:17: DeprecationWarning: The distutils package is deprecated and slated for removal in Python 3.12. Use setuptools or check PEP 632 for potential alternatives
    from distutils import util
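
PEP 632 (referenced by the warning above) suggests vendoring the small pieces of distutils you still need. Assuming the typical distutils.util.strtobool use case, a hypothetical stand-in might look like this (not google-api-core's actual fix; it returns bool rather than distutils' 0/1):

    def strtobool(value: str) -> bool:
        """Hypothetical replacement for the deprecated distutils.util.strtobool."""
        value = value.lower()
        if value in ("y", "yes", "t", "true", "on", "1"):
            return True
        if value in ("n", "no", "f", "false", "off", "0"):
            return False
        raise ValueError(f"invalid truth value {value!r}")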

../../build/gradleenv/1922375555/lib/python3.10/site-packages/httplib2/__init__.py:147: 64 warnings
apache_beam/transforms/sql_test.py: 27 warnings
  <https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.10/site-packages/httplib2/__init__.py>:147: DeprecationWarning: ssl.PROTOCOL_TLS is deprecated
    context = ssl.SSLContext(DEFAULT_TLS_VERSION)
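
The usual migration away from the deprecated ssl.PROTOCOL_TLS is to pick an explicit client (or server) protocol, or to let the standard library choose its defaults. A sketch, not httplib2's actual code:

    import ssl

    # Deprecated pattern, roughly what triggers the warning above:
    #   context = ssl.SSLContext(ssl.PROTOCOL_TLS)

    # Explicit client-side replacement:
    context = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)

    # Or use the stdlib's recommended defaults:
    context = ssl.create_default_context()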

../../build/gradleenv/1922375555/lib/python3.10/site-packages/tenacity/_asyncio.py:42 (8 occurrences)
  <https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.10/site-packages/tenacity/_asyncio.py>:42: DeprecationWarning: "@coroutine" decorator is deprecated since Python 3.8, use "async def" instead
    def call(self, fn, *args, **kwargs):
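
The fix this warning asks for is replacing generator-based coroutines with native ones. A simplified stand-in, not tenacity's real retry logic:

    # Deprecated generator-based form (the pattern flagged above):
    #
    #   import asyncio
    #
    #   @asyncio.coroutine
    #   def call(fn, *args, **kwargs):
    #       result = yield from fn(*args, **kwargs)
    #       return result

    # Native-coroutine equivalent:
    async def call(fn, *args, **kwargs):
        return await fn(*args, **kwargs)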

apache_beam/typehints/pandas_type_compatibility_test.py:67 (8 occurrences)
  <https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/sdks/python/apache_beam/typehints/pandas_type_compatibility_test.py>:67: FutureWarning: pandas.Int64Index is deprecated and will be removed from pandas in a future version. Use pandas.Index with the appropriate dtype instead.
    }).set_index(pd.Int64Index(range(123, 223), name='an_index')),

apache_beam/typehints/pandas_type_compatibility_test.py:90 (8 occurrences)
  <https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/sdks/python/apache_beam/typehints/pandas_type_compatibility_test.py>:90: FutureWarning: pandas.Int64Index is deprecated and will be removed from pandas in a future version. Use pandas.Index with the appropriate dtype instead.
    pd.Int64Index(range(123, 223), name='an_index'),

apache_beam/typehints/pandas_type_compatibility_test.py:91 (8 occurrences)
  <https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/sdks/python/apache_beam/typehints/pandas_type_compatibility_test.py>:91: FutureWarning: pandas.Int64Index is deprecated and will be removed from pandas in a future version. Use pandas.Index with the appropriate dtype instead.
    pd.Int64Index(range(475, 575), name='another_index'),
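
The FutureWarning at lines 67, 90 and 91 of pandas_type_compatibility_test.py asks for the same mechanical change. A sketch of the before/after, using the test's range and index name as an example:

    import pandas as pd

    # Deprecated form used in the test:
    #   pd.Int64Index(range(123, 223), name='an_index')

    # Replacement suggested by the warning:
    an_index = pd.Index(range(123, 223), dtype='int64', name='an_index')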

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/sdks/python/pytest_xlangSqlValidateRunner.xml> -
=========================== short test summary info ============================
FAILED apache_beam/transforms/sql_test.py::SqlTransformTest::test_filter - apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/dataflow_worker/batchworker.py", line 648, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.10/site-packages/dataflow_worker/executor.py", line 208, in execute
    op.start()
  File "dataflow_worker/shuffle_operations.py", line 154, in dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "dataflow_worker/shuffle_operations.py", line 156, in dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "dataflow_worker/shuffle_operations.py", line 169, in dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "/usr/local/lib/python3.10/site-packages/dataflow_worker/shuffle.py", line 589, in __enter__
    raise RuntimeError(_PYTHON_310_SHUFFLE_ERROR_MESSAGE)
RuntimeError: This pipeline requires Dataflow Runner v2 in order to run with currently used version of Apache Beam on Python 3.10+. Please verify that the Dataflow Runner v2 is not disabled in the pipeline options or enable it explicitly via: --dataflow_service_option=use_runner_v2. Alternatively, downgrade to Python 3.9 to use Dataflow Runner v1.
FAILED apache_beam/transforms/sql_test.py::SqlTransformTest::test_agg - apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow pipeline failed. State: FAILED (identical Runner v2 traceback as test_filter above)
FAILED apache_beam/transforms/sql_test.py::SqlTransformTest::test_windowing_before_sql - apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow pipeline failed. State: FAILED (identical Runner v2 traceback as test_filter above)
FAILED apache_beam/transforms/sql_test.py::SqlTransformTest::test_tagged_join - apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow pipeline failed. State: FAILED (identical Runner v2 traceback as test_filter above)
FAILED apache_beam/transforms/sql_test.py::SqlTransformTest::test_map - apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow pipeline failed. State: FAILED (identical Runner v2 traceback as test_filter above)
FAILED apache_beam/transforms/sql_test.py::SqlTransformTest::test_generate_data - apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow pipeline failed. State: FAILED (identical Runner v2 traceback as test_filter above)
FAILED apache_beam/transforms/sql_test.py::SqlTransformTest::test_project - apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow pipeline failed. State: FAILED (identical Runner v2 traceback as test_filter above)
FAILED apache_beam/transforms/sql_test.py::SqlTransformTest::test_row - apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow pipeline failed. State: FAILED (identical Runner v2 traceback as test_filter above)
FAILED apache_beam/transforms/sql_test.py::SqlTransformTest::test_zetasql_generate_data - apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow pipeline failed. State: FAILED (identical Runner v2 traceback as test_filter above)
=========== 9 failed, 5 skipped, 140 warnings in 1119.73s (0:18:39) ============
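
All nine failures raise the same worker-side RuntimeError: with this Beam version on Python 3.10+, the Dataflow job must run on Runner v2. A minimal sketch of how a Python pipeline could opt in; the project, region and bucket values are placeholders. Note that the worker message prints --dataflow_service_option in the singular while the Python SDK flag is usually spelled --dataflow_service_options (--experiments=use_runner_v2 is an older spelling), so check the exact option name against the SDK version in use:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=my-project',                # placeholder
        '--region=us-central1',                # placeholder
        '--temp_location=gs://my-bucket/tmp',  # placeholder
        '--dataflow_service_options=use_runner_v2',
    ])

    with beam.Pipeline(options=options) as p:
        _ = p | beam.Create([1, 2, 3]) | beam.Map(print)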

> Task :runners:google-cloud-dataflow-java:validatesCrossLanguageRunnerPythonUsingSql FAILED
> Task :runners:google-cloud-dataflow-java:cleanupXVR UP-TO-DATE

> Task :runners:google-cloud-dataflow-java:cleanUpDockerPythonImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/python:20221208181752
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/python@sha256:14201eb4ca149b26141e64d58e1aca023735a8b201247e1589a67d1dab40157d
WARNING: Successfully resolved tag to sha256, but it is recommended to use sha256 directly.
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/python:20221208181752]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/python@sha256:14201eb4ca149b26141e64d58e1aca023735a8b201247e1589a67d1dab40157d]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/python:20221208181752] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/python@sha256:14201eb4ca149b26141e64d58e1aca023735a8b201247e1589a67d1dab40157d])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/python@sha256:14201eb4ca149b26141e64d58e1aca023735a8b201247e1589a67d1dab40157d
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/python@sha256:14201eb4ca149b26141e64d58e1aca023735a8b201247e1589a67d1dab40157d
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/python@sha256:14201eb4ca149b26141e64d58e1aca023735a8b201247e1589a67d1dab40157d].

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20221208181752
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:eb803821a1ea6d43f254e9ff1a11c3a6798713c3d8014557e5463807975e2d15
WARNING: Successfully resolved tag to sha256, but it is recommended to use sha256 directly.
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20221208181752]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:eb803821a1ea6d43f254e9ff1a11c3a6798713c3d8014557e5463807975e2d15]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20221208181752] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:eb803821a1ea6d43f254e9ff1a11c3a6798713c3d8014557e5463807975e2d15])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:eb803821a1ea6d43f254e9ff1a11c3a6798713c3d8014557e5463807975e2d15
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:eb803821a1ea6d43f254e9ff1a11c3a6798713c3d8014557e5463807975e2d15
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:eb803821a1ea6d43f254e9ff1a11c3a6798713c3d8014557e5463807975e2d15].

> Task :runners:google-cloud-dataflow-java:validatesCrossLanguageRunnerCleanup
Stopping expansion service pid: 1254286.
Stopping expansion service pid: 1254287.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:validatesCrossLanguageRunnerPythonUsingSql'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 31m 35s
168 actionable tasks: 114 executed, 48 from cache, 6 up-to-date

Publishing build scan...
https://gradle.com/s/gffibnvdjnpb6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
