See 
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/1221/display/redirect?page=changes>

Changes:

[noreply] Migrate testing subpackages from interface{} to any (#24570)

[noreply] fix go lints (#24566)


------------------------------------------
[...truncated 921.62 KB...]
ERROR    apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:1552 Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2022-12-07_10_41_06-3257873758910929866?project=<ProjectId>
=============================== warnings summary ===============================
../../build/gradleenv/1922375555/lib/python3.10/site-packages/hdfs/config.py:15
../../build/gradleenv/1922375555/lib/python3.10/site-packages/hdfs/config.py:15
../../build/gradleenv/1922375555/lib/python3.10/site-packages/hdfs/config.py:15
../../build/gradleenv/1922375555/lib/python3.10/site-packages/hdfs/config.py:15
../../build/gradleenv/1922375555/lib/python3.10/site-packages/hdfs/config.py:15
../../build/gradleenv/1922375555/lib/python3.10/site-packages/hdfs/config.py:15
../../build/gradleenv/1922375555/lib/python3.10/site-packages/hdfs/config.py:15
../../build/gradleenv/1922375555/lib/python3.10/site-packages/hdfs/config.py:15
../../build/gradleenv/1922375555/lib/python3.10/site-packages/hdfs/config.py:15
  <https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.10/site-packages/hdfs/config.py>:15: DeprecationWarning: the imp module is deprecated in favour of importlib and slated for removal in Python 3.12; see the module's documentation for alternative uses
    from imp import load_source
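
  Note: the importlib-based replacement that this warning points to could look roughly like the sketch below. The function name mirrors the deprecated imp.load_source; the body is a generic stand-in, not the actual hdfs/config.py code.

    # Rough importlib equivalent of imp.load_source(name, path) -- sketch only.
    import importlib.util

    def load_source(module_name, path):
        spec = importlib.util.spec_from_file_location(module_name, path)
        module = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(module)
        return module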

../../build/gradleenv/1922375555/lib/python3.10/site-packages/google/api_core/operations_v1/abstract_operations_client.py:17
../../build/gradleenv/1922375555/lib/python3.10/site-packages/google/api_core/operations_v1/abstract_operations_client.py:17
../../build/gradleenv/1922375555/lib/python3.10/site-packages/google/api_core/operations_v1/abstract_operations_client.py:17
../../build/gradleenv/1922375555/lib/python3.10/site-packages/google/api_core/operations_v1/abstract_operations_client.py:17
../../build/gradleenv/1922375555/lib/python3.10/site-packages/google/api_core/operations_v1/abstract_operations_client.py:17
../../build/gradleenv/1922375555/lib/python3.10/site-packages/google/api_core/operations_v1/abstract_operations_client.py:17
../../build/gradleenv/1922375555/lib/python3.10/site-packages/google/api_core/operations_v1/abstract_operations_client.py:17
../../build/gradleenv/1922375555/lib/python3.10/site-packages/google/api_core/operations_v1/abstract_operations_client.py:17
  <https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.10/site-packages/google/api_core/operations_v1/abstract_operations_client.py>:17: DeprecationWarning: The distutils package is deprecated and slated for removal in Python 3.12. Use setuptools or check PEP 632 for potential alternatives
    from distutils import util
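
  Note: the right alternative depends on what is used from distutils.util. If it is strtobool (an assumption, not confirmed by this log), PEP 632's advice is to inline a small local helper along these lines:

    # Sketch of a local stand-in for distutils.util.strtobool (assumed usage).
    def strtobool(value: str) -> bool:
        value = value.lower()
        if value in ('y', 'yes', 't', 'true', 'on', '1'):
            return True
        if value in ('n', 'no', 'f', 'false', 'off', '0'):
            return False
        raise ValueError(f'invalid truth value {value!r}')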

../../build/gradleenv/1922375555/lib/python3.10/site-packages/httplib2/__init__.py:147: 64 warnings
apache_beam/transforms/sql_test.py: 27 warnings
  <https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.10/site-packages/httplib2/__init__.py>:147: DeprecationWarning: ssl.PROTOCOL_TLS is deprecated
    context = ssl.SSLContext(DEFAULT_TLS_VERSION)
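
  Note: the deprecation is about the protocol constant itself. A minimal sketch of the non-deprecated forms (generic illustration, not httplib2's actual code; DEFAULT_TLS_VERSION above is httplib2's own constant):

    import ssl

    # Deprecated: ssl.SSLContext(ssl.PROTOCOL_TLS)
    # Preferred for client-side connections:
    context = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    # or, with sensible defaults and certificate verification enabled:
    context = ssl.create_default_context()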

../../build/gradleenv/1922375555/lib/python3.10/site-packages/tenacity/_asyncio.py:42
../../build/gradleenv/1922375555/lib/python3.10/site-packages/tenacity/_asyncio.py:42
../../build/gradleenv/1922375555/lib/python3.10/site-packages/tenacity/_asyncio.py:42
../../build/gradleenv/1922375555/lib/python3.10/site-packages/tenacity/_asyncio.py:42
../../build/gradleenv/1922375555/lib/python3.10/site-packages/tenacity/_asyncio.py:42
../../build/gradleenv/1922375555/lib/python3.10/site-packages/tenacity/_asyncio.py:42
../../build/gradleenv/1922375555/lib/python3.10/site-packages/tenacity/_asyncio.py:42
../../build/gradleenv/1922375555/lib/python3.10/site-packages/tenacity/_asyncio.py:42
  <https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.10/site-packages/tenacity/_asyncio.py>:42: DeprecationWarning: "@coroutine" decorator is deprecated since Python 3.8, use "async def" instead
    def call(self, fn, *args, **kwargs):
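
  Note: the change the warning asks for, in sketch form. This is a generic stand-in with a simplified signature, not tenacity's actual implementation:

    # Deprecated generator-based style (decorator removed in Python 3.11):
    # @asyncio.coroutine
    # def call(fn, *args, **kwargs):
    #     return (yield from fn(*args, **kwargs))

    # Native-coroutine style the warning suggests:
    async def call(fn, *args, **kwargs):
        return await fn(*args, **kwargs)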

apache_beam/typehints/pandas_type_compatibility_test.py:67
apache_beam/typehints/pandas_type_compatibility_test.py:67
apache_beam/typehints/pandas_type_compatibility_test.py:67
apache_beam/typehints/pandas_type_compatibility_test.py:67
apache_beam/typehints/pandas_type_compatibility_test.py:67
apache_beam/typehints/pandas_type_compatibility_test.py:67
apache_beam/typehints/pandas_type_compatibility_test.py:67
apache_beam/typehints/pandas_type_compatibility_test.py:67
  <https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/sdks/python/apache_beam/typehints/pandas_type_compatibility_test.py>:67: FutureWarning: pandas.Int64Index is deprecated and will be removed from pandas in a future version. Use pandas.Index with the appropriate dtype instead.
    }).set_index(pd.Int64Index(range(123, 223), name='an_index')),

apache_beam/typehints/pandas_type_compatibility_test.py:90
apache_beam/typehints/pandas_type_compatibility_test.py:90
apache_beam/typehints/pandas_type_compatibility_test.py:90
apache_beam/typehints/pandas_type_compatibility_test.py:90
apache_beam/typehints/pandas_type_compatibility_test.py:90
apache_beam/typehints/pandas_type_compatibility_test.py:90
apache_beam/typehints/pandas_type_compatibility_test.py:90
apache_beam/typehints/pandas_type_compatibility_test.py:90
  <https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/sdks/python/apache_beam/typehints/pandas_type_compatibility_test.py>:90: FutureWarning: pandas.Int64Index is deprecated and will be removed from pandas in a future version. Use pandas.Index with the appropriate dtype instead.
    pd.Int64Index(range(123, 223), name='an_index'),

apache_beam/typehints/pandas_type_compatibility_test.py:91
apache_beam/typehints/pandas_type_compatibility_test.py:91
apache_beam/typehints/pandas_type_compatibility_test.py:91
apache_beam/typehints/pandas_type_compatibility_test.py:91
apache_beam/typehints/pandas_type_compatibility_test.py:91
apache_beam/typehints/pandas_type_compatibility_test.py:91
apache_beam/typehints/pandas_type_compatibility_test.py:91
apache_beam/typehints/pandas_type_compatibility_test.py:91
  <https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/sdks/python/apache_beam/typehints/pandas_type_compatibility_test.py>:91: FutureWarning: pandas.Int64Index is deprecated and will be removed from pandas in a future version. Use pandas.Index with the appropriate dtype instead.
    pd.Int64Index(range(475, 575), name='another_index'),
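
  Note: the three FutureWarnings above all point at the same replacement. A sketch of the suggested form, mirroring the first fixture value:

    import pandas as pd

    # Deprecated form used by the test:
    # pd.Int64Index(range(123, 223), name='an_index')
    # Replacement the warning suggests (pandas.Index with an explicit integer dtype):
    idx = pd.Index(range(123, 223), dtype='int64', name='an_index')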

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/sdks/python/pytest_xlangSqlValidateRunner.xml> -
=========================== short test summary info ============================
FAILED apache_beam/transforms/sql_test.py::SqlTransformTest::test_filter - apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/dataflow_worker/batchworker.py", line 648, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.10/site-packages/dataflow_worker/executor.py", line 208, in execute
    op.start()
  File "dataflow_worker/shuffle_operations.py", line 154, in dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "dataflow_worker/shuffle_operations.py", line 156, in dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "dataflow_worker/shuffle_operations.py", line 169, in dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "/usr/local/lib/python3.10/site-packages/dataflow_worker/shuffle.py", line 589, in __enter__
    raise RuntimeError(_PYTHON_310_SHUFFLE_ERROR_MESSAGE)
RuntimeError: This pipeline requires Dataflow Runner v2 in order to run with currently used version of Apache Beam on Python 3.10+. Please verify that the Dataflow Runner v2 is not disabled in the pipeline options or enable it explicitly via: --dataflow_service_option=use_runner_v2. Alternatively, downgrade to Python 3.9 to use Dataflow Runner v1.
FAILED apache_beam/transforms/sql_test.py::SqlTransformTest::test_project - apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/dataflow_worker/batchworker.py", line 648, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.10/site-packages/dataflow_worker/executor.py", line 208, in execute
    op.start()
  File "dataflow_worker/shuffle_operations.py", line 154, in dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "dataflow_worker/shuffle_operations.py", line 156, in dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "dataflow_worker/shuffle_operations.py", line 169, in dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "/usr/local/lib/python3.10/site-packages/dataflow_worker/shuffle.py", line 589, in __enter__
    raise RuntimeError(_PYTHON_310_SHUFFLE_ERROR_MESSAGE)
RuntimeError: This pipeline requires Dataflow Runner v2 in order to run with currently used version of Apache Beam on Python 3.10+. Please verify that the Dataflow Runner v2 is not disabled in the pipeline options or enable it explicitly via: --dataflow_service_option=use_runner_v2. Alternatively, downgrade to Python 3.9 to use Dataflow Runner v1.
FAILED apache_beam/transforms/sql_test.py::SqlTransformTest::test_generate_data - apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/dataflow_worker/batchworker.py", line 648, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.10/site-packages/dataflow_worker/executor.py", line 208, in execute
    op.start()
  File "dataflow_worker/shuffle_operations.py", line 154, in dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "dataflow_worker/shuffle_operations.py", line 156, in dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "dataflow_worker/shuffle_operations.py", line 169, in dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "/usr/local/lib/python3.10/site-packages/dataflow_worker/shuffle.py", line 589, in __enter__
    raise RuntimeError(_PYTHON_310_SHUFFLE_ERROR_MESSAGE)
RuntimeError: This pipeline requires Dataflow Runner v2 in order to run with currently used version of Apache Beam on Python 3.10+. Please verify that the Dataflow Runner v2 is not disabled in the pipeline options or enable it explicitly via: --dataflow_service_option=use_runner_v2. Alternatively, downgrade to Python 3.9 to use Dataflow Runner v1.
FAILED apache_beam/transforms/sql_test.py::SqlTransformTest::test_tagged_join - apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/dataflow_worker/batchworker.py", line 648, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.10/site-packages/dataflow_worker/executor.py", line 208, in execute
    op.start()
  File "dataflow_worker/shuffle_operations.py", line 154, in dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "dataflow_worker/shuffle_operations.py", line 156, in dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "dataflow_worker/shuffle_operations.py", line 169, in dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "/usr/local/lib/python3.10/site-packages/dataflow_worker/shuffle.py", line 589, in __enter__
    raise RuntimeError(_PYTHON_310_SHUFFLE_ERROR_MESSAGE)
RuntimeError: This pipeline requires Dataflow Runner v2 in order to run with currently used version of Apache Beam on Python 3.10+. Please verify that the Dataflow Runner v2 is not disabled in the pipeline options or enable it explicitly via: --dataflow_service_option=use_runner_v2. Alternatively, downgrade to Python 3.9 to use Dataflow Runner v1.
FAILED apache_beam/transforms/sql_test.py::SqlTransformTest::test_row - apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/dataflow_worker/batchworker.py", line 648, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.10/site-packages/dataflow_worker/executor.py", line 208, in execute
    op.start()
  File "dataflow_worker/shuffle_operations.py", line 154, in dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "dataflow_worker/shuffle_operations.py", line 156, in dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "dataflow_worker/shuffle_operations.py", line 169, in dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "/usr/local/lib/python3.10/site-packages/dataflow_worker/shuffle.py", line 589, in __enter__
    raise RuntimeError(_PYTHON_310_SHUFFLE_ERROR_MESSAGE)
RuntimeError: This pipeline requires Dataflow Runner v2 in order to run with currently used version of Apache Beam on Python 3.10+. Please verify that the Dataflow Runner v2 is not disabled in the pipeline options or enable it explicitly via: --dataflow_service_option=use_runner_v2. Alternatively, downgrade to Python 3.9 to use Dataflow Runner v1.
FAILED apache_beam/transforms/sql_test.py::SqlTransformTest::test_windowing_before_sql - apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/dataflow_worker/batchworker.py", line 648, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.10/site-packages/dataflow_worker/executor.py", line 208, in execute
    op.start()
  File "dataflow_worker/shuffle_operations.py", line 154, in dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "dataflow_worker/shuffle_operations.py", line 156, in dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "dataflow_worker/shuffle_operations.py", line 169, in dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "/usr/local/lib/python3.10/site-packages/dataflow_worker/shuffle.py", line 589, in __enter__
    raise RuntimeError(_PYTHON_310_SHUFFLE_ERROR_MESSAGE)
RuntimeError: This pipeline requires Dataflow Runner v2 in order to run with currently used version of Apache Beam on Python 3.10+. Please verify that the Dataflow Runner v2 is not disabled in the pipeline options or enable it explicitly via: --dataflow_service_option=use_runner_v2. Alternatively, downgrade to Python 3.9 to use Dataflow Runner v1.
FAILED apache_beam/transforms/sql_test.py::SqlTransformTest::test_map - apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/dataflow_worker/batchworker.py", line 648, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.10/site-packages/dataflow_worker/executor.py", line 208, in execute
    op.start()
  File "dataflow_worker/shuffle_operations.py", line 154, in dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "dataflow_worker/shuffle_operations.py", line 156, in dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "dataflow_worker/shuffle_operations.py", line 169, in dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "/usr/local/lib/python3.10/site-packages/dataflow_worker/shuffle.py", line 589, in __enter__
    raise RuntimeError(_PYTHON_310_SHUFFLE_ERROR_MESSAGE)
RuntimeError: This pipeline requires Dataflow Runner v2 in order to run with currently used version of Apache Beam on Python 3.10+. Please verify that the Dataflow Runner v2 is not disabled in the pipeline options or enable it explicitly via: --dataflow_service_option=use_runner_v2. Alternatively, downgrade to Python 3.9 to use Dataflow Runner v1.
FAILED apache_beam/transforms/sql_test.py::SqlTransformTest::test_agg - apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/dataflow_worker/batchworker.py", line 648, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.10/site-packages/dataflow_worker/executor.py", line 208, in execute
    op.start()
  File "dataflow_worker/shuffle_operations.py", line 154, in dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "dataflow_worker/shuffle_operations.py", line 156, in dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "dataflow_worker/shuffle_operations.py", line 169, in dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "/usr/local/lib/python3.10/site-packages/dataflow_worker/shuffle.py", line 589, in __enter__
    raise RuntimeError(_PYTHON_310_SHUFFLE_ERROR_MESSAGE)
RuntimeError: This pipeline requires Dataflow Runner v2 in order to run with currently used version of Apache Beam on Python 3.10+. Please verify that the Dataflow Runner v2 is not disabled in the pipeline options or enable it explicitly via: --dataflow_service_option=use_runner_v2. Alternatively, downgrade to Python 3.9 to use Dataflow Runner v1.
FAILED apache_beam/transforms/sql_test.py::SqlTransformTest::test_zetasql_generate_data - apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/dataflow_worker/batchworker.py", line 648, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.10/site-packages/dataflow_worker/executor.py", line 208, in execute
    op.start()
  File "dataflow_worker/shuffle_operations.py", line 154, in dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "dataflow_worker/shuffle_operations.py", line 156, in dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "dataflow_worker/shuffle_operations.py", line 169, in dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "/usr/local/lib/python3.10/site-packages/dataflow_worker/shuffle.py", line 589, in __enter__
    raise RuntimeError(_PYTHON_310_SHUFFLE_ERROR_MESSAGE)
RuntimeError: This pipeline requires Dataflow Runner v2 in order to run with currently used version of Apache Beam on Python 3.10+. Please verify that the Dataflow Runner v2 is not disabled in the pipeline options or enable it explicitly via: --dataflow_service_option=use_runner_v2. Alternatively, downgrade to Python 3.9 to use Dataflow Runner v1.
=========== 9 failed, 5 skipped, 140 warnings in 1252.43s (0:20:52) ============
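
Note: all nine failures above name the same remedy, a pipeline option. A minimal sketch of passing it from the Python SDK side is below; the option string is copied verbatim from the error message, while the project, region, and bucket values are placeholders rather than values from this job.

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Enable Dataflow Runner v2 explicitly, as the error message suggests.
    # <ProjectId>, <RegionId>, and <bucket> are placeholders, not taken from this log.
    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=<ProjectId>',
        '--region=<RegionId>',
        '--temp_location=gs://<bucket>/tmp',
        '--dataflow_service_option=use_runner_v2',  # copied verbatim from the error text
    ])

    with beam.Pipeline(options=options) as pipeline:
        pipeline | beam.Create(['placeholder']) | beam.Map(print)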

> Task :runners:google-cloud-dataflow-java:validatesCrossLanguageRunnerPythonUsingSql FAILED
> Task :runners:google-cloud-dataflow-java:cleanupXVR UP-TO-DATE

> Task :runners:google-cloud-dataflow-java:cleanUpDockerPythonImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/python:20221207181749
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/python@sha256:72d410fa422e4c3cbabd54186da0de69cea0d4e6c60ca2520ca4aa067c286fc0
WARNING: Successfully resolved tag to sha256, but it is recommended to use sha256 directly.
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/python:20221207181749]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/python@sha256:72d410fa422e4c3cbabd54186da0de69cea0d4e6c60ca2520ca4aa067c286fc0]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/python:20221207181749] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/python@sha256:72d410fa422e4c3cbabd54186da0de69cea0d4e6c60ca2520ca4aa067c286fc0])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/python@sha256:72d410fa422e4c3cbabd54186da0de69cea0d4e6c60ca2520ca4aa067c286fc0
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/python@sha256:72d410fa422e4c3cbabd54186da0de69cea0d4e6c60ca2520ca4aa067c286fc0
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/python@sha256:72d410fa422e4c3cbabd54186da0de69cea0d4e6c60ca2520ca4aa067c286fc0].

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20221207181749
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f13493523b907c41c2c757afa99e275275d33bd5667075678616fa761340af8f
WARNING: Successfully resolved tag to sha256, but it is recommended to use sha256 directly.
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20221207181749]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f13493523b907c41c2c757afa99e275275d33bd5667075678616fa761340af8f]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20221207181749] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f13493523b907c41c2c757afa99e275275d33bd5667075678616fa761340af8f])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f13493523b907c41c2c757afa99e275275d33bd5667075678616fa761340af8f
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f13493523b907c41c2c757afa99e275275d33bd5667075678616fa761340af8f
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:f13493523b907c41c2c757afa99e275275d33bd5667075678616fa761340af8f].

> Task :runners:google-cloud-dataflow-java:validatesCrossLanguageRunnerCleanup
Stopping expansion service pid: 3459300.
Stopping expansion service pid: 3459301.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:validatesCrossLanguageRunnerPythonUsingSql'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 33m 38s
168 actionable tasks: 114 executed, 48 from cache, 6 up-to-date

Publishing build scan...
https://gradle.com/s/s65jiwcdyzcny

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
