See 
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/1222/display/redirect?page=changes>

Changes:

[noreply] pubsub: fix typo in grpc client factory

[Kenneth Knowles] Suppress keyfor warnings

[Kenneth Knowles] Suppress checker warnings that are confusing and difficult

[Kenneth Knowles] Add @Pure annotations to MongoDbIO autovalue fields

[Kenneth Knowles] Suppress checker in FnApiDoFnRunner due to crash

[Kenneth Knowles] Suppress checker framework in Dataflow

[Kenneth Knowles] Fix some nullness errors in Spark runner

[Kenneth Knowles] Upgrade checker framework to 3.27.0

[noreply] Samza runner support for non unique stateId across multiple ParDos

[noreply] Bump to Hadoop 3.3.4 for performance tests (#24550)

[noreply] regenerate python dependencies (#24582)


------------------------------------------
[...truncated 428.90 KB...]
../../build/gradleenv/1922375555/lib/python3.10/site-packages/hdfs/config.py:15
../../build/gradleenv/1922375555/lib/python3.10/site-packages/hdfs/config.py:15
../../build/gradleenv/1922375555/lib/python3.10/site-packages/hdfs/config.py:15
../../build/gradleenv/1922375555/lib/python3.10/site-packages/hdfs/config.py:15
../../build/gradleenv/1922375555/lib/python3.10/site-packages/hdfs/config.py:15
../../build/gradleenv/1922375555/lib/python3.10/site-packages/hdfs/config.py:15
../../build/gradleenv/1922375555/lib/python3.10/site-packages/hdfs/config.py:15
  
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.10/site-packages/hdfs/config.py>:15:
 DeprecationWarning: the imp module is deprecated in favour of importlib and 
slated for removal in Python 3.12; see the module's documentation for 
alternative uses
    from imp import load_source
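
The warning above comes from hdfs' use of the imp module, which Python removes in 3.12. A minimal sketch of the importlib-based equivalent of imp.load_source, assuming a standalone module file; the function name and example path are illustrative, not part of the hdfs package:

    import importlib.util
    import sys

    def load_source(module_name, file_path):
        # Build a spec from the file location, create the module object,
        # register it, and execute it -- roughly what imp.load_source did.
        spec = importlib.util.spec_from_file_location(module_name, file_path)
        module = importlib.util.module_from_spec(spec)
        sys.modules[module_name] = module
        spec.loader.exec_module(module)
        return module

    # config = load_source("custom_config", "/path/to/custom_config.py")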

../../build/gradleenv/1922375555/lib/python3.10/site-packages/google/api_core/operations_v1/abstract_operations_client.py:17
../../build/gradleenv/1922375555/lib/python3.10/site-packages/google/api_core/operations_v1/abstract_operations_client.py:17
../../build/gradleenv/1922375555/lib/python3.10/site-packages/google/api_core/operations_v1/abstract_operations_client.py:17
../../build/gradleenv/1922375555/lib/python3.10/site-packages/google/api_core/operations_v1/abstract_operations_client.py:17
../../build/gradleenv/1922375555/lib/python3.10/site-packages/google/api_core/operations_v1/abstract_operations_client.py:17
../../build/gradleenv/1922375555/lib/python3.10/site-packages/google/api_core/operations_v1/abstract_operations_client.py:17
../../build/gradleenv/1922375555/lib/python3.10/site-packages/google/api_core/operations_v1/abstract_operations_client.py:17
../../build/gradleenv/1922375555/lib/python3.10/site-packages/google/api_core/operations_v1/abstract_operations_client.py:17
  
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.10/site-packages/google/api_core/operations_v1/abstract_operations_client.py>:17:
 DeprecationWarning: The distutils package is deprecated and slated for removal 
in Python 3.12. Use setuptools or check PEP 632 for potential alternatives
    from distutils import util
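
The distutils warning points at google-api-core's "from distutils import util". PEP 632 suggests vendoring the small helpers you need; a minimal sketch of a local strtobool replacement, assuming strtobool is the piece of distutils.util being used (the log only shows the import, so that part is an assumption):

    def strtobool(value: str) -> bool:
        # Same truth table as the removed distutils.util.strtobool,
        # but returning a real bool instead of 0/1.
        value = value.strip().lower()
        if value in ("y", "yes", "t", "true", "on", "1"):
            return True
        if value in ("n", "no", "f", "false", "off", "0"):
            return False
        raise ValueError(f"invalid truth value {value!r}")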

../../build/gradleenv/1922375555/lib/python3.10/site-packages/httplib2/__init__.py:147:
 64 warnings
apache_beam/transforms/sql_test.py: 27 warnings
  
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.10/site-packages/httplib2/__init__.py>:147:
 DeprecationWarning: ssl.PROTOCOL_TLS is deprecated
    context = ssl.SSLContext(DEFAULT_TLS_VERSION)
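
The httplib2 warning is about building an SSLContext from the deprecated ssl.PROTOCOL_TLS constant. A minimal sketch of the non-deprecated client-side alternatives, assuming the context is used for outbound HTTPS requests:

    import ssl

    # Simplest: let the stdlib pick secure defaults (verification, protocol range).
    default_context = ssl.create_default_context()

    # Or construct the context explicitly with the non-deprecated client protocol
    # and pin a floor on the negotiated TLS version.
    explicit_context = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    explicit_context.minimum_version = ssl.TLSVersion.TLSv1_2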

../../build/gradleenv/1922375555/lib/python3.10/site-packages/tenacity/_asyncio.py:42
../../build/gradleenv/1922375555/lib/python3.10/site-packages/tenacity/_asyncio.py:42
../../build/gradleenv/1922375555/lib/python3.10/site-packages/tenacity/_asyncio.py:42
../../build/gradleenv/1922375555/lib/python3.10/site-packages/tenacity/_asyncio.py:42
../../build/gradleenv/1922375555/lib/python3.10/site-packages/tenacity/_asyncio.py:42
../../build/gradleenv/1922375555/lib/python3.10/site-packages/tenacity/_asyncio.py:42
../../build/gradleenv/1922375555/lib/python3.10/site-packages/tenacity/_asyncio.py:42
../../build/gradleenv/1922375555/lib/python3.10/site-packages/tenacity/_asyncio.py:42
  
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/build/gradleenv/1922375555/lib/python3.10/site-packages/tenacity/_asyncio.py>:42:
 DeprecationWarning: "@coroutine" decorator is deprecated since Python 3.8, use 
"async def" instead
    def call(self, fn, *args, **kwargs):
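
The tenacity warning flags a generator-based coroutine decorated with @asyncio.coroutine. A minimal sketch of the "async def" rewrite the warning asks for; the body here is illustrative, not tenacity's actual retry logic:

    # Old, deprecated style (what triggers the warning):
    # @asyncio.coroutine
    # def call(fn, *args, **kwargs):
    #     result = yield from fn(*args, **kwargs)
    #     return result

    # Modern equivalent:
    async def call(fn, *args, **kwargs):
        return await fn(*args, **kwargs)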

apache_beam/typehints/pandas_type_compatibility_test.py:67
apache_beam/typehints/pandas_type_compatibility_test.py:67
apache_beam/typehints/pandas_type_compatibility_test.py:67
apache_beam/typehints/pandas_type_compatibility_test.py:67
apache_beam/typehints/pandas_type_compatibility_test.py:67
apache_beam/typehints/pandas_type_compatibility_test.py:67
apache_beam/typehints/pandas_type_compatibility_test.py:67
apache_beam/typehints/pandas_type_compatibility_test.py:67
  
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/sdks/python/apache_beam/typehints/pandas_type_compatibility_test.py>:67:
 FutureWarning: pandas.Int64Index is deprecated and will be removed from pandas 
in a future version. Use pandas.Index with the appropriate dtype instead.
    }).set_index(pd.Int64Index(range(123, 223), name='an_index')),

apache_beam/typehints/pandas_type_compatibility_test.py:90
apache_beam/typehints/pandas_type_compatibility_test.py:90
apache_beam/typehints/pandas_type_compatibility_test.py:90
apache_beam/typehints/pandas_type_compatibility_test.py:90
apache_beam/typehints/pandas_type_compatibility_test.py:90
apache_beam/typehints/pandas_type_compatibility_test.py:90
apache_beam/typehints/pandas_type_compatibility_test.py:90
apache_beam/typehints/pandas_type_compatibility_test.py:90
  
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/sdks/python/apache_beam/typehints/pandas_type_compatibility_test.py>:90:
 FutureWarning: pandas.Int64Index is deprecated and will be removed from pandas 
in a future version. Use pandas.Index with the appropriate dtype instead.
    pd.Int64Index(range(123, 223), name='an_index'),

apache_beam/typehints/pandas_type_compatibility_test.py:91
apache_beam/typehints/pandas_type_compatibility_test.py:91
apache_beam/typehints/pandas_type_compatibility_test.py:91
apache_beam/typehints/pandas_type_compatibility_test.py:91
apache_beam/typehints/pandas_type_compatibility_test.py:91
apache_beam/typehints/pandas_type_compatibility_test.py:91
apache_beam/typehints/pandas_type_compatibility_test.py:91
apache_beam/typehints/pandas_type_compatibility_test.py:91
  
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/sdks/python/apache_beam/typehints/pandas_type_compatibility_test.py>:91:
 FutureWarning: pandas.Int64Index is deprecated and will be removed from pandas 
in a future version. Use pandas.Index with the appropriate dtype instead.
    pd.Int64Index(range(475, 575), name='another_index'),
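
The FutureWarnings above come from the test's use of pandas.Int64Index. A minimal sketch of the replacement pandas recommends, reusing the ranges and index names from the warned lines:

    import pandas as pd

    # pd.Index with an explicit integer dtype replaces the deprecated Int64Index.
    an_index = pd.Index(range(123, 223), dtype='int64', name='an_index')
    another_index = pd.Index(range(475, 575), dtype='int64', name='another_index')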

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_XVR_PythonUsingJavaSQL_Dataflow/ws/src/sdks/python/pytest_xlangSqlValidateRunner.xml> -
=========================== short test summary info ============================
FAILED apache_beam/transforms/sql_test.py::SqlTransformTest::test_agg - 
apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow 
pipeline failed. State: FAILED, Error:
Traceback (most recent call last):
  File 
"/usr/local/lib/python3.10/site-packages/dataflow_worker/batchworker.py", line 
648, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.10/site-packages/dataflow_worker/executor.py", 
line 208, in execute
    op.start()
  File "dataflow_worker/shuffle_operations.py", line 154, in 
dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "dataflow_worker/shuffle_operations.py", line 156, in 
dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "dataflow_worker/shuffle_operations.py", line 169, in 
dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "/usr/local/lib/python3.10/site-packages/dataflow_worker/shuffle.py", 
line 589, in __enter__
    raise RuntimeError(_PYTHON_310_SHUFFLE_ERROR_MESSAGE)
RuntimeError: This pipeline requires Dataflow Runner v2 in order to run with 
currently used version of Apache Beam on Python 3.10+. Please verify that the 
Dataflow Runner v2 is not disabled in the pipeline options or enable it 
explicitly via: --dataflow_service_option=use_runner_v2. Alternatively, 
downgrade to Python 3.9 to use Dataflow Runner v1.
FAILED apache_beam/transforms/sql_test.py::SqlTransformTest::test_row - 
apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow 
pipeline failed. State: FAILED, Error:
Traceback (most recent call last):
  File 
"/usr/local/lib/python3.10/site-packages/dataflow_worker/batchworker.py", line 
648, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.10/site-packages/dataflow_worker/executor.py", 
line 208, in execute
    op.start()
  File "dataflow_worker/shuffle_operations.py", line 154, in 
dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "dataflow_worker/shuffle_operations.py", line 156, in 
dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "dataflow_worker/shuffle_operations.py", line 169, in 
dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "/usr/local/lib/python3.10/site-packages/dataflow_worker/shuffle.py", 
line 589, in __enter__
    raise RuntimeError(_PYTHON_310_SHUFFLE_ERROR_MESSAGE)
RuntimeError: This pipeline requires Dataflow Runner v2 in order to run with 
currently used version of Apache Beam on Python 3.10+. Please verify that the 
Dataflow Runner v2 is not disabled in the pipeline options or enable it 
explicitly via: --dataflow_service_option=use_runner_v2. Alternatively, 
downgrade to Python 3.9 to use Dataflow Runner v1.
FAILED apache_beam/transforms/sql_test.py::SqlTransformTest::test_generate_data 
- apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: 
Dataflow pipeline failed. State: FAILED, Error:
Traceback (most recent call last):
  File 
"/usr/local/lib/python3.10/site-packages/dataflow_worker/batchworker.py", line 
648, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.10/site-packages/dataflow_worker/executor.py", 
line 208, in execute
    op.start()
  File "dataflow_worker/shuffle_operations.py", line 154, in 
dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "dataflow_worker/shuffle_operations.py", line 156, in 
dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "dataflow_worker/shuffle_operations.py", line 169, in 
dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "/usr/local/lib/python3.10/site-packages/dataflow_worker/shuffle.py", 
line 589, in __enter__
    raise RuntimeError(_PYTHON_310_SHUFFLE_ERROR_MESSAGE)
RuntimeError: This pipeline requires Dataflow Runner v2 in order to run with 
currently used version of Apache Beam on Python 3.10+. Please verify that the 
Dataflow Runner v2 is not disabled in the pipeline options or enable it 
explicitly via: --dataflow_service_option=use_runner_v2. Alternatively, 
downgrade to Python 3.9 to use Dataflow Runner v1.
FAILED apache_beam/transforms/sql_test.py::SqlTransformTest::test_filter - 
apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow 
pipeline failed. State: FAILED, Error:
Traceback (most recent call last):
  File 
"/usr/local/lib/python3.10/site-packages/dataflow_worker/batchworker.py", line 
648, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.10/site-packages/dataflow_worker/executor.py", 
line 208, in execute
    op.start()
  File "dataflow_worker/shuffle_operations.py", line 154, in 
dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "dataflow_worker/shuffle_operations.py", line 156, in 
dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "dataflow_worker/shuffle_operations.py", line 169, in 
dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "/usr/local/lib/python3.10/site-packages/dataflow_worker/shuffle.py", 
line 589, in __enter__
    raise RuntimeError(_PYTHON_310_SHUFFLE_ERROR_MESSAGE)
RuntimeError: This pipeline requires Dataflow Runner v2 in order to run with 
currently used version of Apache Beam on Python 3.10+. Please verify that the 
Dataflow Runner v2 is not disabled in the pipeline options or enable it 
explicitly via: --dataflow_service_option=use_runner_v2. Alternatively, 
downgrade to Python 3.9 to use Dataflow Runner v1.
FAILED apache_beam/transforms/sql_test.py::SqlTransformTest::test_tagged_join - 
apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow 
pipeline failed. State: FAILED, Error:
Traceback (most recent call last):
  File 
"/usr/local/lib/python3.10/site-packages/dataflow_worker/batchworker.py", line 
648, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.10/site-packages/dataflow_worker/executor.py", 
line 208, in execute
    op.start()
  File "dataflow_worker/shuffle_operations.py", line 154, in 
dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "dataflow_worker/shuffle_operations.py", line 156, in 
dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "dataflow_worker/shuffle_operations.py", line 169, in 
dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "/usr/local/lib/python3.10/site-packages/dataflow_worker/shuffle.py", 
line 589, in __enter__
    raise RuntimeError(_PYTHON_310_SHUFFLE_ERROR_MESSAGE)
RuntimeError: This pipeline requires Dataflow Runner v2 in order to run with 
currently used version of Apache Beam on Python 3.10+. Please verify that the 
Dataflow Runner v2 is not disabled in the pipeline options or enable it 
explicitly via: --dataflow_service_option=use_runner_v2. Alternatively, 
downgrade to Python 3.9 to use Dataflow Runner v1.
FAILED apache_beam/transforms/sql_test.py::SqlTransformTest::test_project - 
apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow 
pipeline failed. State: FAILED, Error:
Traceback (most recent call last):
  File 
"/usr/local/lib/python3.10/site-packages/dataflow_worker/batchworker.py", line 
648, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.10/site-packages/dataflow_worker/executor.py", 
line 208, in execute
    op.start()
  File "dataflow_worker/shuffle_operations.py", line 154, in 
dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "dataflow_worker/shuffle_operations.py", line 156, in 
dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "dataflow_worker/shuffle_operations.py", line 169, in 
dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "/usr/local/lib/python3.10/site-packages/dataflow_worker/shuffle.py", 
line 589, in __enter__
    raise RuntimeError(_PYTHON_310_SHUFFLE_ERROR_MESSAGE)
RuntimeError: This pipeline requires Dataflow Runner v2 in order to run with 
currently used version of Apache Beam on Python 3.10+. Please verify that the 
Dataflow Runner v2 is not disabled in the pipeline options or enable it 
explicitly via: --dataflow_service_option=use_runner_v2. Alternatively, 
downgrade to Python 3.9 to use Dataflow Runner v1.
FAILED apache_beam/transforms/sql_test.py::SqlTransformTest::test_map - 
apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow 
pipeline failed. State: FAILED, Error:
Traceback (most recent call last):
  File 
"/usr/local/lib/python3.10/site-packages/dataflow_worker/batchworker.py", line 
648, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.10/site-packages/dataflow_worker/executor.py", 
line 208, in execute
    op.start()
  File "dataflow_worker/shuffle_operations.py", line 154, in 
dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "dataflow_worker/shuffle_operations.py", line 156, in 
dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "dataflow_worker/shuffle_operations.py", line 169, in 
dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "/usr/local/lib/python3.10/site-packages/dataflow_worker/shuffle.py", 
line 589, in __enter__
    raise RuntimeError(_PYTHON_310_SHUFFLE_ERROR_MESSAGE)
RuntimeError: This pipeline requires Dataflow Runner v2 in order to run with 
currently used version of Apache Beam on Python 3.10+. Please verify that the 
Dataflow Runner v2 is not disabled in the pipeline options or enable it 
explicitly via: --dataflow_service_option=use_runner_v2. Alternatively, 
downgrade to Python 3.9 to use Dataflow Runner v1.
FAILED 
apache_beam/transforms/sql_test.py::SqlTransformTest::test_windowing_before_sql 
- apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: 
Dataflow pipeline failed. State: FAILED, Error:
Traceback (most recent call last):
  File 
"/usr/local/lib/python3.10/site-packages/dataflow_worker/batchworker.py", line 
648, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.10/site-packages/dataflow_worker/executor.py", 
line 208, in execute
    op.start()
  File "dataflow_worker/shuffle_operations.py", line 154, in 
dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "dataflow_worker/shuffle_operations.py", line 156, in 
dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "dataflow_worker/shuffle_operations.py", line 169, in 
dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "/usr/local/lib/python3.10/site-packages/dataflow_worker/shuffle.py", 
line 589, in __enter__
    raise RuntimeError(_PYTHON_310_SHUFFLE_ERROR_MESSAGE)
RuntimeError: This pipeline requires Dataflow Runner v2 in order to run with 
currently used version of Apache Beam on Python 3.10+. Please verify that the 
Dataflow Runner v2 is not disabled in the pipeline options or enable it 
explicitly via: --dataflow_service_option=use_runner_v2. Alternatively, 
downgrade to Python 3.9 to use Dataflow Runner v1.
FAILED 
apache_beam/transforms/sql_test.py::SqlTransformTest::test_zetasql_generate_data
 - apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: 
Dataflow pipeline failed. State: FAILED, Error:
Traceback (most recent call last):
  File 
"/usr/local/lib/python3.10/site-packages/dataflow_worker/batchworker.py", line 
648, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.10/site-packages/dataflow_worker/executor.py", 
line 208, in execute
    op.start()
  File "dataflow_worker/shuffle_operations.py", line 154, in 
dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "dataflow_worker/shuffle_operations.py", line 156, in 
dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "dataflow_worker/shuffle_operations.py", line 169, in 
dataflow_worker.shuffle_operations.ShuffleWriteOperation.start
  File "/usr/local/lib/python3.10/site-packages/dataflow_worker/shuffle.py", 
line 589, in __enter__
    raise RuntimeError(_PYTHON_310_SHUFFLE_ERROR_MESSAGE)
RuntimeError: This pipeline requires Dataflow Runner v2 in order to run with 
currently used version of Apache Beam on Python 3.10+. Please verify that the 
Dataflow Runner v2 is not disabled in the pipeline options or enable it 
explicitly via: --dataflow_service_option=use_runner_v2. Alternatively, 
downgrade to Python 3.9 to use Dataflow Runner v1.
============ 9 failed, 5 skipped, 140 warnings in 987.96s (0:16:27) ============
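
All nine failures share the same root cause: the job ran on the legacy Dataflow worker, whose shuffle path raises the Runner v2 error above for Python 3.10+ pipelines. A minimal sketch of passing the suggested service option when building PipelineOptions in Python; the project, region, and bucket values are placeholders, and the plural flag spelling (dataflow_service_options) is the form I believe the SDK accepts, so verify it against the Beam version in use:

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=my-gcp-project',             # placeholder project id
        '--region=us-central1',                 # placeholder region
        '--temp_location=gs://my-bucket/tmp',   # placeholder staging bucket
        # Enables Dataflow Runner v2, as the error message suggests.
        '--dataflow_service_options=use_runner_v2',
    ])

The other route the error mentions is running the suite on Python 3.9, where Dataflow Runner v1 is still supported.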

> Task :runners:google-cloud-dataflow-java:validatesCrossLanguageRunnerPythonUsingSql FAILED
> Task :runners:google-cloud-dataflow-java:cleanupXVR UP-TO-DATE

> Task :runners:google-cloud-dataflow-java:cleanUpDockerPythonImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/python:20221208001754
Untagged: 
us.gcr.io/apache-beam-testing/java-postcommit-it/python@sha256:da1b4c6443a384eef4a4ccd4cff46b860f8fdf07ae14c0c4eb0f99f05e1e11c4
Deleted: sha256:41a7383688d57c78d0479ea06391477d2dfbeda26e4b64f963bb41b096ba1e4b
Deleted: sha256:b84cd4dc1c3508935a5b8b4ab8d51f74d2a3c44d18a07db20462d4882349e886
Deleted: sha256:371c5f6311e2ca4da82243c95b098c6249ac313ecf5fadf12df2d8e3980417bc
Deleted: sha256:48353466620953c9d435c992d48c26c2e52c2be45dd3121ee552c27bd30156f7
WARNING: Successfully resolved tag to sha256, but it is recommended to use 
sha256 directly.
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/python:20221208001754]
- referencing digest: 
[us.gcr.io/apache-beam-testing/java-postcommit-it/python@sha256:da1b4c6443a384eef4a4ccd4cff46b860f8fdf07ae14c0c4eb0f99f05e1e11c4]

Deleted 
[[us.gcr.io/apache-beam-testing/java-postcommit-it/python:20221208001754] 
(referencing 
[us.gcr.io/apache-beam-testing/java-postcommit-it/python@sha256:da1b4c6443a384eef4a4ccd4cff46b860f8fdf07ae14c0c4eb0f99f05e1e11c4])].
Removing untagged image 
us.gcr.io/apache-beam-testing/java-postcommit-it/python@sha256:da1b4c6443a384eef4a4ccd4cff46b860f8fdf07ae14c0c4eb0f99f05e1e11c4
Digests:
- 
us.gcr.io/apache-beam-testing/java-postcommit-it/python@sha256:da1b4c6443a384eef4a4ccd4cff46b860f8fdf07ae14c0c4eb0f99f05e1e11c4
Deleted 
[us.gcr.io/apache-beam-testing/java-postcommit-it/python@sha256:da1b4c6443a384eef4a4ccd4cff46b860f8fdf07ae14c0c4eb0f99f05e1e11c4].

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20221208001754
Untagged: 
us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:4447b1c40b3f11c17c2f362713ffa97eb9a95410494c6948fc1ef2c11a7824f3
WARNING: Successfully resolved tag to sha256, but it is recommended to use 
sha256 directly.
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20221208001754]
- referencing digest: 
[us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:4447b1c40b3f11c17c2f362713ffa97eb9a95410494c6948fc1ef2c11a7824f3]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20221208001754] 
(referencing 
[us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:4447b1c40b3f11c17c2f362713ffa97eb9a95410494c6948fc1ef2c11a7824f3])].
Removing untagged image 
us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:4447b1c40b3f11c17c2f362713ffa97eb9a95410494c6948fc1ef2c11a7824f3
Digests:
- 
us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:4447b1c40b3f11c17c2f362713ffa97eb9a95410494c6948fc1ef2c11a7824f3
Deleted 
[us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:4447b1c40b3f11c17c2f362713ffa97eb9a95410494c6948fc1ef2c11a7824f3].

> Task :runners:google-cloud-dataflow-java:validatesCrossLanguageRunnerCleanup
Stopping expansion service pid: 919164.
Stopping expansion service pid: 919165.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task 
':runners:google-cloud-dataflow-java:validatesCrossLanguageRunnerPythonUsingSql'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings 
and determine if they come from your own scripts or plugins.

See 
https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during 
this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 30m 27s
168 actionable tasks: 115 executed, 47 from cache, 6 up-to-date

Publishing build scan...
https://gradle.com/s/5ldoapwdkunci

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
