See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/5489/display/redirect?page=changes>

Changes:

[noreply] [#22319] Regenerate proto2_coder_test_messages_pb2.py manually (#22320)


------------------------------------------
[...truncated 281.24 KB...]
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py>", line 1008, in pull_responses
    for response in responses:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py>", line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py>", line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
        status = StatusCode.UNAVAILABLE
        details = "Socket closed"
        debug_error_string = "{"created":"@1658215882.937491511","description":"Error received from peer ipv6:[::1]:38481","file":"src/core/lib/surface/call.cc","file_line":966,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py>", line 671, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py>", line 654, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py>", line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py>", line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
        status = StatusCode.UNAVAILABLE
        details = "Socket closed"
        debug_error_string = "{"created":"@1658215882.937392396","description":"Error received from peer ipv6:[::1]:41831","file":"src/core/lib/surface/call.cc","file_line":966,"grpc_message":"Socket closed","grpc_status":14}"
>


> Task :sdks:python:test-suites:direct:py37:mongodbioIT
INFO:__main__:Writing 100000 documents to mongodb finished in 27.762 seconds
INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
INFO:__main__:================================================================================
INFO:__main__:Reading from mongodb beam_mongodbio_it_db:integration_test_1658215864
INFO:__main__:reader params   : {'projection': ['number']}
INFO:__main__:expected results: {'number_sum': 4999950000, 'docs_count': 100000}
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/mongodbio_it_test.py>:153: FutureWarning: ReadFromMongoDB is experimental.
  | 'Map' >> beam.Map(lambda doc: doc['number']))
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.41.0.dev
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function annotate_downstream_side_inputs at 0x7f3ae2120830> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function fix_side_input_pcoll_coders at 0x7f3ae2120950> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function pack_combiners at 0x7f3ae2120e60> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7f3ae2120ef0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function expand_sdf at 0x7f3ae21210e0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function expand_gbk at 0x7f3ae2121170> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sink_flattens at 0x7f3ae2121290> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function greedily_fuse at 0x7f3ae2121320> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function read_to_impulse at 0x7f3ae21213b0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function impulse_to_input at 0x7f3ae2121440> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7f3ae2121680> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function add_impulse_to_dangling_transforms at 0x7f3ae21217a0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function setup_timer_mapping at 0x7f3ae21215f0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function populate_data_channel_coders at 0x7f3ae2121710> ====================
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 100
INFO:apache_beam.runners.portability.fn_api_runner.worker_handlers:Created Worker handler <apache_beam.runners.portability.fn_api_runner.worker_handlers.EmbeddedWorkerHandler object at 0x7f3ae1f3f410> for environment ref_Environment_default_environment_1 (beam:env:embedded_python:v1, b'')
INFO:__main__:Reading documents from mongodb finished in 5.105 seconds
INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
INFO:__main__:================================================================================
INFO:__main__:Reading from mongodb beam_mongodbio_it_db:integration_test_1658215864
INFO:__main__:reader params   : {'filter': {'number_mod_3': 0}, 'projection': ['number']}
INFO:__main__:expected results: {'number_sum': 1666683333, 'docs_count': 33334}
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.41.0.dev
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function annotate_downstream_side_inputs at 0x7f3ae2120830> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function fix_side_input_pcoll_coders at 0x7f3ae2120950> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function pack_combiners at 0x7f3ae2120e60> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7f3ae2120ef0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function expand_sdf at 0x7f3ae21210e0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function expand_gbk at 0x7f3ae2121170> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sink_flattens at 0x7f3ae2121290> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function greedily_fuse at 0x7f3ae2121320> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function read_to_impulse at 0x7f3ae21213b0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function impulse_to_input at 0x7f3ae2121440> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7f3ae2121680> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function add_impulse_to_dangling_transforms at 0x7f3ae21217a0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function setup_timer_mapping at 0x7f3ae21215f0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function populate_data_channel_coders at 0x7f3ae2121710> ====================
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 100
INFO:apache_beam.runners.portability.fn_api_runner.worker_handlers:Created Worker handler <apache_beam.runners.portability.fn_api_runner.worker_handlers.EmbeddedWorkerHandler object at 0x7f3ae184bfd0> for environment ref_Environment_default_environment_1 (beam:env:embedded_python:v1, b'')
INFO:__main__:Reading documents from mongodb finished in 2.622 seconds
INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
INFO:__main__:================================================================================
INFO:__main__:Reading from mongodb beam_mongodbio_it_db:integration_test_1658215864
INFO:__main__:reader params   : {'projection': ['number'], 'bucket_auto': True}
INFO:__main__:expected results: {'number_sum': 4999950000, 'docs_count': 100000}
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.41.0.dev
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function annotate_downstream_side_inputs at 0x7f3ae2120830> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function fix_side_input_pcoll_coders at 0x7f3ae2120950> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function pack_combiners at 0x7f3ae2120e60> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7f3ae2120ef0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function expand_sdf at 0x7f3ae21210e0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function expand_gbk at 0x7f3ae2121170> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sink_flattens at 0x7f3ae2121290> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function greedily_fuse at 0x7f3ae2121320> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function read_to_impulse at 0x7f3ae21213b0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function impulse_to_input at 0x7f3ae2121440> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7f3ae2121680> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function add_impulse_to_dangling_transforms at 0x7f3ae21217a0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function setup_timer_mapping at 0x7f3ae21215f0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function populate_data_channel_coders at 0x7f3ae2121710> ====================
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 100
INFO:apache_beam.runners.portability.fn_api_runner.worker_handlers:Created Worker handler <apache_beam.runners.portability.fn_api_runner.worker_handlers.EmbeddedWorkerHandler object at 0x7f3ae2192910> for environment ref_Environment_default_environment_1 (beam:env:embedded_python:v1, b'')
INFO:__main__:Reading documents from mongodb finished in 8.174 seconds
INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
INFO:__main__:================================================================================
INFO:__main__:Reading from mongodb beam_mongodbio_it_db:integration_test_1658215864
INFO:__main__:reader params   : {'filter': {'number_mod_3': 0}, 'projection': ['number'], 'bucket_auto': True}
INFO:__main__:expected results: {'number_sum': 1666683333, 'docs_count': 33334}
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.41.0.dev
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function annotate_downstream_side_inputs at 0x7f3ae2120830> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function fix_side_input_pcoll_coders at 0x7f3ae2120950> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function pack_combiners at 0x7f3ae2120e60> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7f3ae2120ef0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function expand_sdf at 0x7f3ae21210e0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function expand_gbk at 0x7f3ae2121170> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sink_flattens at 0x7f3ae2121290> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function greedily_fuse at 0x7f3ae2121320> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function read_to_impulse at 0x7f3ae21213b0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function impulse_to_input at 0x7f3ae2121440> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7f3ae2121680> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function add_impulse_to_dangling_transforms at 0x7f3ae21217a0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function setup_timer_mapping at 0x7f3ae21215f0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function populate_data_channel_coders at 0x7f3ae2121710> ====================
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 100
INFO:apache_beam.runners.portability.fn_api_runner.worker_handlers:Created Worker handler <apache_beam.runners.portability.fn_api_runner.worker_handlers.EmbeddedWorkerHandler object at 0x7f3adbf7b150> for environment ref_Environment_default_environment_1 (beam:env:embedded_python:v1, b'')
INFO:__main__:Reading documents from mongodb finished in 2.831 seconds
mongoioit27338
mongoioit27338

> Task :sdks:python:test-suites:direct:py37:postCommitIT
> Task :sdks:python:test-suites:direct:py37:spannerioIT

> Task :sdks:python:test-suites:dataflow:py37:spannerioIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.41.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8 --dist=loadfile
>>>   collect markers: -m=spannerio_it
============================= test session starts ==============================
platform linux -- Python 3.7.12, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python>, inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.4.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw0] Python 3.7.12 (default, Jan 15 2022, 18:42:10)  -- [GCC 9.3.0]
[gw1] Python 3.7.12 (default, Jan 15 2022, 18:42:10)  -- [GCC 9.3.0]
[gw2] Python 3.7.12 (default, Jan 15 2022, 18:42:10)  -- [GCC 9.3.0]
[gw3] Python 3.7.12 (default, Jan 15 2022, 18:42:10)  -- [GCC 9.3.0]
[gw4] Python 3.7.12 (default, Jan 15 2022, 18:42:10)  -- [GCC 9.3.0]
[gw6] Python 3.7.12 (default, Jan 15 2022, 18:42:10)  -- [GCC 9.3.0]
[gw5] Python 3.7.12 (default, Jan 15 2022, 18:42:10)  -- [GCC 9.3.0]
[gw7] Python 3.7.12 (default, Jan 15 2022, 18:42:10)  -- [GCC 9.3.0]
gw0 [15] / gw1 [15] / gw2 [15] / gw3 [15] / gw4 [15] / gw5 [15] / gw6 [15] / gw7 [15]

scheduling tests via LoadFileScheduling

apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches

=============================== warnings summary ===============================
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:128
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:128: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    sql="select * from Users")

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:171
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:171: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:117
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:117: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    columns=["UserId", "Key"])

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    max_batch_size_bytes=250))

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============= 5 passed, 12 skipped, 5 warnings in 1232.04 seconds ==============

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:container:pullLicenses'.
> Process 'command './license_scripts/license_script.sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 2h 28m 1s
222 actionable tasks: 152 executed, 64 from cache, 6 up-to-date

Publishing build scan...
https://gradle.com/s/mx2oibk33clmy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
