See <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/804/display/redirect>

Changes:


------------------------------------------
[...truncated 641.41 KB...]

> Task :sdks:python:test-suites:dataflow:py38:gcpCrossLanguagePythonUsingJava
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-15T03:21:32.102Z: JOB_MESSAGE_BASIC: All workers have finished the 
startup processes and began to receive work requests.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-15T03:21:36.247Z: JOB_MESSAGE_BASIC: Finished operation 
Create/Impulse+Create/FlatMap(<lambda at 
core.py:3759>)+Create/Map(decode)+WriteToBigTable/ParDo(_DirectRowMutationsToBeamRow)+WriteToBigTable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_write:v1/MapElements/Map/ParMultiDo(Anonymous)+WriteToBigTable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_write:v1/BigtableIO.Write/BigtableIO.WriteWithResults/ParDo(BigtableWriter)/ParMultiDo(BigtableWriter)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-15T03:21:36.512Z: JOB_MESSAGE_BASIC: Stopping worker pool...

> Task :sdks:python:test-suites:dataflow:py311:gcpCrossLanguagePythonUsingJava
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-15T03:22:25.856Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:150 Job 
2023-10-14_20_16_46-2535700920278123126 is in state JOB_STATE_DONE
INFO     
apache_beam.io.gcp.bigtableio_it_test:bigtableio_it_test.py:190 Deleting table 
[test-table-1697339776-ead398]
PASSED
------------------------------ live log teardown -------------------------------
INFO     
apache_beam.io.gcp.bigtableio_it_test:bigtableio_it_test.py:198 Deleting 
instance [bt-write-xlang-1697338447-fd142b]


=============================== warnings summary ===============================
../../build/gradleenv/2050596099/lib/python3.11/site-packages/google/api_core/operations_v1/abstract_operations_client.py:17
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/build/gradleenv/2050596099/lib/python3.11/site-packages/google/api_core/operations_v1/abstract_operations_client.py>:17: DeprecationWarning: The distutils package is deprecated and slated for removal in Python 3.12. Use setuptools or check PEP 632 for potential alternatives
    from distutils import util
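
The warning above comes from google-api-core still importing distutils.util; PEP 632 recommends vendoring the small helpers rather than importing distutils. A minimal sketch of such a replacement, assuming the only symbol needed is a strtobool-style truth parser (not confirmed by this log):

    # Hypothetical PEP 632-style stand-in for distutils.util.strtobool;
    # returns a bool instead of distutils' 1/0 integers.
    def strtobool(value: str) -> bool:
        value = value.lower()
        if value in ("y", "yes", "t", "true", "on", "1"):
            return True
        if value in ("n", "no", "f", "false", "off", "0"):
            return False
        raise ValueError(f"invalid truth value {value!r}")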

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/pytest_gcpCrossLanguage.xml> -
=== 13 passed, 19 skipped, 7190 deselected, 1 warning in 4405.95s (1:13:25) ====

> Task :sdks:python:test-suites:dataflow:py311:gcpCrossLanguageCleanup
Stopping expansion service pid: 2830399.
Skipping invalid pid: 2830400.

> Task :sdks:python:test-suites:dataflow:py38:gcpCrossLanguagePythonUsingJava
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-15T03:23:41.720Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:150 Job 
2023-10-14_20_18_08-12222349131269263398 is in state JOB_STATE_DONE
INFO     
apache_beam.io.gcp.bigtableio_it_test:bigtableio_it_test.py:190 Deleting table 
[test-table-1697339874-e0fd54]
FAILED
apache_beam/io/gcp/bigtableio_it_test.py::TestWriteToBigtableXlangIT::test_set_mutation
-------------------------------- live log call ---------------------------------
INFO     
apache_beam.io.gcp.bigtableio_it_test:bigtableio_it_test.py:186 Created table 
[test-table-1697340290-c17d99]
INFO     apache_beam.runners.portability.stager:stager.py:323 Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/build/apache_beam-2.52.0.dev0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl>" to staging location.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:395 Pipeline 
has additional dependencies to be installed in SDK worker container, consider 
using the SDK container image pre-building workflow to avoid repetitive 
installations. Learn more on 
https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
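The pre-building workflow referenced above is driven by pipeline options; a minimal sketch, assuming the SetupOptions flag names from the linked custom-containers guide, with placeholder project, bucket, and registry values:

    # Sketch only: project, bucket, and registry below are placeholders.
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        "--runner=DataflowRunner",
        "--project=my-project",
        "--region=us-central1",
        "--temp_location=gs://my-bucket/tmp",
        "--prebuild_sdk_container_engine=cloud_build",
        "--docker_registry_push_url=us-central1-docker.pkg.dev/my-project/my-repo",
    ])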
INFO     root:environments.py:313 Using provided Python SDK container 
image: gcr.io/apache-beam-testing/beam-sdk/beam_python3.8_sdk:latest
INFO     root:environments.py:320 Python SDK container image set to 
"gcr.io/apache-beam-testing/beam-sdk/beam_python3.8_sdk:latest" for Docker 
environment
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:712 
==================== <function pack_combiners at 0x7f458a6d8280> 
====================
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:712 
==================== <function sort_stages at 0x7f458a6d8a60> 
====================
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1015032456-822784-2g2q1q2c.1697340296.822936/beam-sdks-java-io-google-cloud-platform-expansion-service-2.52.0-SNAPSHOT-EjI1Tn7CefDo704so8SRa6K27Y_VY9DkG6OSl6d19Ds.jar...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1015032456-822784-2g2q1q2c.1697340296.822936/beam-sdks-java-io-google-cloud-platform-expansion-service-2.52.0-SNAPSHOT-EjI1Tn7CefDo704so8SRa6K27Y_VY9DkG6OSl6d19Ds.jar
 in 5 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1015032456-822784-2g2q1q2c.1697340296.822936/apache_beam-2.52.0.dev0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1015032456-822784-2g2q1q2c.1697340296.822936/apache_beam-2.52.0.dev0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1015032456-822784-2g2q1q2c.1697340296.822936/pipeline.pb...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1015032456-822784-2g2q1q2c.1697340296.822936/pipeline.pb
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:843 Create job: 
<Job
 clientRequestId: '20231015032456823733-1731'
 createTime: '2023-10-15T03:25:04.434088Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-10-14_20_25_03-2669496976574743052'
 location: 'us-central1'
 name: 'beamapp-jenkins-1015032456-822784-2g2q1q2c'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-10-15T03:25:04.434088Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:845 Created job 
with id: [2023-10-14_20_25_03-2669496976574743052]
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:846 Submitted job: 
2023-10-14_20_25_03-2669496976574743052
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:847 To access the 
Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-10-14_20_25_03-2669496976574743052?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-10-14_20_25_03-2669496976574743052?project=apache-beam-testing
INFO     
apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:58 
Console log: 
INFO     
apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:59 
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-10-14_20_25_03-2669496976574743052?project=apache-beam-testing
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:150 Job 
2023-10-14_20_25_03-2669496976574743052 is in state JOB_STATE_RUNNING
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-15T03:25:07.540Z: JOB_MESSAGE_BASIC: Worker configuration: 
e2-standard-2 in us-central1-b.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-15T03:25:10.188Z: JOB_MESSAGE_BASIC: Executing operation 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-15T03:25:10.256Z: JOB_MESSAGE_BASIC: Starting 1 workers in 
us-central1-b...
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-15T03:25:10.480Z: JOB_MESSAGE_BASIC: Finished operation 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-15T03:25:19.489Z: JOB_MESSAGE_BASIC: Executing operation 
Create/Impulse+Create/FlatMap(<lambda at 
core.py:3759>)+Create/MaybeReshuffle/Reshuffle/AddRandomKeys+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-15T03:25:46.567Z: JOB_MESSAGE_BASIC: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
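Pruning old custom metric descriptors, as the message suggests, can be scripted with the Cloud Monitoring client; a hedged sketch with a placeholder project, leaving the actual delete commented out until descriptors are confirmed unused:

    # Sketch only: lists custom.googleapis.com descriptors and shows where a
    # delete would go once a descriptor is known to be unused.
    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    request = {
        "name": "projects/my-project",  # placeholder project
        "filter": 'metric.type = starts_with("custom.googleapis.com/")',
    }
    for descriptor in client.list_metric_descriptors(request=request):
        print("candidate for deletion:", descriptor.type)
        # client.delete_metric_descriptor(request={"name": descriptor.name})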
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-15T03:27:59.827Z: JOB_MESSAGE_BASIC: All workers have finished the 
startup processes and began to receive work requests.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-15T03:28:00.177Z: JOB_MESSAGE_BASIC: Finished operation 
Create/Impulse+Create/FlatMap(<lambda at 
core.py:3759>)+Create/MaybeReshuffle/Reshuffle/AddRandomKeys+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-15T03:28:00.230Z: JOB_MESSAGE_BASIC: Executing operation 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Close
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-15T03:28:00.893Z: JOB_MESSAGE_BASIC: Finished operation 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Close
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-15T03:28:00.954Z: JOB_MESSAGE_BASIC: Executing operation 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys+Create/Map(decode)+WriteToBigTable/ParDo(_DirectRowMutationsToBeamRow)+WriteToBigTable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_write:v1/MapElements/Map/ParMultiDo(Anonymous)+WriteToBigTable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_write:v1/BigtableIO.Write/BigtableIO.WriteWithResults/ParDo(BigtableWriter)/ParMultiDo(BigtableWriter)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-15T03:28:08.753Z: JOB_MESSAGE_BASIC: Finished operation 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys+Create/Map(decode)+WriteToBigTable/ParDo(_DirectRowMutationsToBeamRow)+WriteToBigTable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_write:v1/MapElements/Map/ParMultiDo(Anonymous)+WriteToBigTable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_write:v1/BigtableIO.Write/BigtableIO.WriteWithResults/ParDo(BigtableWriter)/ParMultiDo(BigtableWriter)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-15T03:28:08.928Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-15T03:30:30.132Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:150 Job 
2023-10-14_20_25_03-2669496976574743052 is in state JOB_STATE_DONE
INFO     
apache_beam.io.gcp.bigtableio_it_test:bigtableio_it_test.py:190 Deleting table 
[test-table-1697340290-c17d99]
PASSED
------------------------------ live log teardown -------------------------------
INFO     
apache_beam.io.gcp.bigtableio_it_test:bigtableio_it_test.py:198 Deleting 
instance [bt-write-xlang-1697338794-85b7fe]


=================================== FAILURES ===================================
_____________ TestWriteToBigtableXlangIT.test_delete_row_mutation ______________

args = (name: 
"projects/apache-beam-testing/instances/bt-write-xlang-1697338794-85b7fe/tables/test-table-1697339874-e0fd54"
,)
kwargs = {'metadata': [('x-goog-request-params', 
'name=projects/apache-beam-testing/instances/bt-write-xlang-1697338794-85b7fe/...e-1697339874-e0fd54'),
 ('x-goog-api-client', 'gl-python/3.8.10 grpc/1.59.0 gax/2.12.0 
gapic/2.21.0')], 'timeout': 60.0}

    @functools.wraps(callable_)
    def error_remapped_callable(*args, **kwargs):
        try:
>           return callable_(*args, **kwargs)

../../build/gradleenv/-1734967051/lib/python3.8/site-packages/google/api_core/grpc_helpers.py:75:
 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
../../build/gradleenv/-1734967051/lib/python3.8/site-packages/grpc/_channel.py:1161:
 in __call__
    return _end_unary_response_blocking(state, call, False, None)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

state = <grpc._channel._RPCState object at 0x7f45ba0778b0>
call = <grpc._cython.cygrpc.SegregatedCall object at 0x7f45b9b36f00>
with_call = False, deadline = None

    def _end_unary_response_blocking(
        state: _RPCState,
        call: cygrpc.SegregatedCall,
        with_call: bool,
        deadline: Optional[float],
    ) -> Union[ResponseType, Tuple[ResponseType, grpc.Call]]:
        if state.code is grpc.StatusCode.OK:
            if with_call:
                rendezvous = _MultiThreadedRendezvous(state, call, None, deadline)
                return state.response, rendezvous
            else:
                return state.response
        else:
>           raise _InactiveRpcError(state)  # pytype: disable=not-instantiable
E           grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC 
that terminated with:
E              status = StatusCode.DEADLINE_EXCEEDED
E              details = "Deadline Exceeded"
E              debug_error_string = "UNKNOWN:Deadline Exceeded 
{created_time:"2023-10-15T03:24:49.477232822+00:00", grpc_status:4}"
E           >

../../build/gradleenv/-1734967051/lib/python3.8/site-packages/grpc/_channel.py:1004:
 _InactiveRpcError

The above exception was the direct cause of the following exception:

self = <apache_beam.io.gcp.bigtableio_it_test.TestWriteToBigtableXlangIT 
testMethod=test_delete_row_mutation>

    def tearDown(self):
      try:
        _LOGGER.info("Deleting table [%s]", self.table.table_id)
>       self.table.delete()

apache_beam/io/gcp/bigtableio_it_test.py:191: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
../../build/gradleenv/-1734967051/lib/python3.8/site-packages/google/cloud/bigtable/table.py:452:
 in delete
    table_client.delete_table(request={"name": self.name})
../../build/gradleenv/-1734967051/lib/python3.8/site-packages/google/cloud/bigtable_admin_v2/services/bigtable_table_admin/client.py:1143:
 in delete_table
    rpc(
../../build/gradleenv/-1734967051/lib/python3.8/site-packages/google/api_core/gapic_v1/method.py:131:
 in __call__
    return wrapped_func(*args, **kwargs)
../../build/gradleenv/-1734967051/lib/python3.8/site-packages/google/api_core/timeout.py:120:
 in func_with_timeout
    return func(*args, **kwargs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

args = (name: 
"projects/apache-beam-testing/instances/bt-write-xlang-1697338794-85b7fe/tables/test-table-1697339874-e0fd54"
,)
kwargs = {'metadata': [('x-goog-request-params', 
'name=projects/apache-beam-testing/instances/bt-write-xlang-1697338794-85b7fe/...e-1697339874-e0fd54'),
 ('x-goog-api-client', 'gl-python/3.8.10 grpc/1.59.0 gax/2.12.0 
gapic/2.21.0')], 'timeout': 60.0}

    @functools.wraps(callable_)
    def error_remapped_callable(*args, **kwargs):
        try:
            return callable_(*args, **kwargs)
        except grpc.RpcError as exc:
>           raise exceptions.from_grpc_error(exc) from exc
E           google.api_core.exceptions.DeadlineExceeded: 504 Deadline 
Exceeded

../../build/gradleenv/-1734967051/lib/python3.8/site-packages/google/api_core/grpc_helpers.py:77:
 DeadlineExceeded
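
The failure is a 504 DeadlineExceeded from the Bigtable admin DeleteTable RPC during tearDown. If the flake is only transient admin-API latency, a retry around the delete would mask it; a minimal sketch, assuming google.api_core.retry and the Table object used by the test (not a change present in this build):

    # Sketch only: retry transient deadline/unavailable errors on teardown.
    from google.api_core import exceptions, retry

    delete_retry = retry.Retry(
        predicate=retry.if_exception_type(
            exceptions.DeadlineExceeded, exceptions.ServiceUnavailable),
        initial=1.0, maximum=30.0, timeout=300.0,
    )

    def delete_table_with_retry(table):
        # table: google.cloud.bigtable.table.Table, as in tearDown above.
        delete_retry(table.delete)()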
------------------------------ Captured log call -------------------------------
INFO     
apache_beam.io.gcp.bigtableio_it_test:bigtableio_it_test.py:186 Created table 
[test-table-1697339874-e0fd54]
INFO     apache_beam.runners.portability.stager:stager.py:323 Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/build/apache_beam-2.52.0.dev0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl>" to staging location.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:395 Pipeline 
has additional dependencies to be installed in SDK worker container, consider 
using the SDK container image pre-building workflow to avoid repetitive 
installations. Learn more on 
https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO     root:environments.py:313 Using provided Python SDK container 
image: gcr.io/apache-beam-testing/beam-sdk/beam_python3.8_sdk:latest
INFO     root:environments.py:320 Python SDK container image set to 
"gcr.io/apache-beam-testing/beam-sdk/beam_python3.8_sdk:latest" for Docker 
environment
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:712 
==================== <function pack_combiners at 0x7f458a6d8280> 
====================
INFO     
apache_beam.runners.portability.fn_api_runner.translations:translations.py:712 
==================== <function sort_stages at 0x7f458a6d8a60> 
====================
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1015031801-256042-zw7x8zk4.1697339881.256245/beam-sdks-java-io-google-cloud-platform-expansion-service-2.52.0-SNAPSHOT-EjI1Tn7CefDo704so8SRa6K27Y_VY9DkG6OSl6d19Ds.jar...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1015031801-256042-zw7x8zk4.1697339881.256245/beam-sdks-java-io-google-cloud-platform-expansion-service-2.52.0-SNAPSHOT-EjI1Tn7CefDo704so8SRa6K27Y_VY9DkG6OSl6d19Ds.jar
 in 6 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1015031801-256042-zw7x8zk4.1697339881.256245/apache_beam-2.52.0.dev0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1015031801-256042-zw7x8zk4.1697339881.256245/apache_beam-2.52.0.dev0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1015031801-256042-zw7x8zk4.1697339881.256245/pipeline.pb...
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS 
upload to 
gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-1015031801-256042-zw7x8zk4.1697339881.256245/pipeline.pb
 in 0 seconds.
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:843 Create job: 
<Job
 clientRequestId: '20231015031801257070-7351'
 createTime: '2023-10-15T03:18:08.887477Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-10-14_20_18_08-12222349131269263398'
 location: 'us-central1'
 name: 'beamapp-jenkins-1015031801-256042-zw7x8zk4'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-10-15T03:18:08.887477Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:845 Created job 
with id: [2023-10-14_20_18_08-12222349131269263398]
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:846 Submitted job: 
2023-10-14_20_18_08-12222349131269263398
INFO     
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:847 To access the 
Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-10-14_20_18_08-12222349131269263398?project=apache-beam-testing
INFO     
apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:58 
Console log: 
INFO     
apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:59 
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-10-14_20_18_08-12222349131269263398?project=apache-beam-testing
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:150 Job 
2023-10-14_20_18_08-12222349131269263398 is in state JOB_STATE_RUNNING
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-15T03:18:13.342Z: JOB_MESSAGE_BASIC: Worker configuration: 
e2-standard-2 in us-central1-b.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-15T03:18:15.433Z: JOB_MESSAGE_BASIC: Executing operation 
Create/Impulse+Create/FlatMap(<lambda at 
core.py:3759>)+Create/Map(decode)+WriteToBigTable/ParDo(_DirectRowMutationsToBeamRow)+WriteToBigTable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_write:v1/MapElements/Map/ParMultiDo(Anonymous)+WriteToBigTable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_write:v1/BigtableIO.Write/BigtableIO.WriteWithResults/ParDo(BigtableWriter)/ParMultiDo(BigtableWriter)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-15T03:18:15.501Z: JOB_MESSAGE_BASIC: Starting 1 workers in 
us-central1-b...
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-15T03:18:31.046Z: JOB_MESSAGE_BASIC: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-15T03:21:32.102Z: JOB_MESSAGE_BASIC: All workers have finished the 
startup processes and began to receive work requests.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-15T03:21:36.247Z: JOB_MESSAGE_BASIC: Finished operation 
Create/Impulse+Create/FlatMap(<lambda at 
core.py:3759>)+Create/Map(decode)+WriteToBigTable/ParDo(_DirectRowMutationsToBeamRow)+WriteToBigTable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_write:v1/MapElements/Map/ParMultiDo(Anonymous)+WriteToBigTable/SchemaAwareExternalTransform/beam:schematransform:org.apache.beam:bigtable_write:v1/BigtableIO.Write/BigtableIO.WriteWithResults/ParDo(BigtableWriter)/ParMultiDo(BigtableWriter)
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-15T03:21:36.512Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:200 
2023-10-15T03:23:41.720Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO     
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:150 Job 
2023-10-14_20_18_08-12222349131269263398 is in state JOB_STATE_DONE
INFO     
apache_beam.io.gcp.bigtableio_it_test:bigtableio_it_test.py:190 Deleting table 
[test-table-1697339874-e0fd54]
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_Gcp_Dataflow/ws/src/sdks/python/pytest_gcpCrossLanguage.xml> -
=========================== short test summary info ============================
FAILED apache_beam/io/gcp/bigtableio_it_test.py::TestWriteToBigtableXlangIT::test_delete_row_mutation - google.api_core.exceptions.DeadlineExceeded: 504 Deadline Exceeded
==== 1 failed, 12 passed, 19 skipped, 7190 deselected in 4603.84s (1:16:43) ====

> Task :sdks:python:test-suites:dataflow:py38:gcpCrossLanguagePythonUsingJava FAILED

> Task :sdks:python:test-suites:dataflow:py38:gcpCrossLanguageCleanup
Stopping expansion service pid: 2831285.
Skipping invalid pid: 2831286.

> Task :sdks:python:test-suites:xlang:fnApiJobServerCleanup
Killing process at 2829780

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:gcpCrossLanguagePythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Get more help at https://help.gradle.org.

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 9.0.

You can use '--warning-mode all' to show the individual deprecation warnings 
and determine if they come from your own scripts or plugins.

For more on this, please refer to 
https://docs.gradle.org/8.3/userguide/command_line_interface.html#sec:command_line_warnings
 in the Gradle documentation.

BUILD FAILED in 1h 32m 18s
124 actionable tasks: 89 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://ge.apache.org/s/e56mb4heaejh2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
