See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4622/display/redirect>
Changes:
------------------------------------------
[...truncated 46.35 MB...]
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' "labels": {'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' "PTRANSFORM": "fn/read/pcollection_1:0"'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' }'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' }, {'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' "urn": "beam:metric:element_count:v1",'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' "type": "beam:metrics:sum_int64:v1",'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' "payload": "AQ==",'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' "labels": {'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' "PCOLLECTION": "pcollection_2"'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' }'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' }, {'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' "urn": "beam:metric:sampled_byte_size:v1",'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' "type": "beam:metrics:distribution_int64:v1",'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' "payload": "Ac0TzRPNEw==",'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' "labels": {'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' "PCOLLECTION": "pcollection_2"'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' }'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' }, {'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' "urn": "beam:metric:sampled_byte_size:v1",'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' "type": "beam:metrics:distribution_int64:v1",'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' "payload": "Ac4TzhPOEw==",'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' "labels": {'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' "PCOLLECTION": "external_7Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/CombineValues/Values/Values/Map/ParMultiDo(Anonymous).output"'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' }'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' }, {'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' "urn": "beam:metric:sampled_byte_size:v1",'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' "type": "beam:metrics:distribution_int64:v1",'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' "payload": "AdQT1BPUEw==",'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' "labels": {'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' "PCOLLECTION": "pcollection_1"'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' }'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' }, {'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' "urn": "beam:metric:sampled_byte_size:v1",'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' "type": "beam:metrics:distribution_int64:v1",'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' "payload": "AcwTzBPMEw==",'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' "labels": {'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' "PCOLLECTION": "external_7Write to Spanner/SpannerIO.Write/Write mutations to Cloud Spanner/Schema View/Combine.GloballyAsSingletonView/CombineValues/Combine.perKey(Singleton)/Combine.GroupedValues/ParDo(Anonymous)/ParMultiDo(Anonymous).output"'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' }'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' }]'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' }'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'}'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 11, 2021 6:17:52 AM org.apache.flink.runtime.webmonitor.WebMonitorEndpoint lambda$shutDownInternal$5'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Removing cache directory /tmp/flink-web-ui'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 11, 2021 6:17:52 AM org.apache.flink.runtime.taskexecutor.TaskExecutor$JobLeaderListenerImpl jobManagerLostLeadership'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: JobManager for job 05cdf3334a30287565c36edbef903b15 with leader id af381040a0aa8d64b637785d6d7a4dd7 lost leadership.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 11, 2021 6:17:52 AM org.apache.flink.runtime.rest.RestServerEndpoint lambda$closeAsync$1'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shut down complete.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 11, 2021 6:17:52 AM org.apache.flink.runtime.resourcemanager.ResourceManager deregisterApplication'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shut down cluster because application is in CANCELED, diagnostics DispatcherResourceManagerComponent has been closed..'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 11, 2021 6:17:52 AM org.apache.flink.runtime.entrypoint.component.DispatcherResourceManagerComponent closeAsyncInternal'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Closing components.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 11, 2021 6:17:52 AM org.apache.flink.runtime.dispatcher.runner.AbstractDispatcherLeaderProcess closeInternal'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopping SessionDispatcherLeaderProcess.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 11, 2021 6:17:52 AM org.apache.flink.runtime.dispatcher.Dispatcher onStop'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopping dispatcher akka://flink/user/rpc/dispatcher_2.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 11, 2021 6:17:52 AM org.apache.flink.runtime.dispatcher.Dispatcher terminateRunningJobs'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopping all currently running jobs of dispatcher akka://flink/user/rpc/dispatcher_2.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 11, 2021 6:17:52 AM org.apache.flink.runtime.resourcemanager.slotmanager.DeclarativeSlotManager close'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Closing the slot manager.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 11, 2021 6:17:52 AM org.apache.flink.runtime.taskexecutor.DefaultJobLeaderService stop'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stop job leader service.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 11, 2021 6:17:52 AM org.apache.flink.runtime.resourcemanager.slotmanager.DeclarativeSlotManager suspend'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Suspending the slot manager.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 11, 2021 6:17:52 AM org.apache.flink.runtime.state.TaskExecutorLocalStateStoresManager shutdown'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down TaskExecutorLocalStateStoresManager.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 11, 2021 6:17:52 AM org.apache.flink.runtime.dispatcher.Dispatcher lambda$onStop$0'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped dispatcher akka://flink/user/rpc/dispatcher_2.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 11, 2021 6:17:52 AM org.apache.flink.runtime.io.disk.FileChannelManagerImpl lambda$getFileCloser$0'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: FileChannelManager removed spill file directory /tmp/flink-io-7b40dbee-eabb-45b7-8a62-7ad3127f2f60'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 11, 2021 6:17:52 AM org.apache.flink.runtime.io.network.NettyShuffleEnvironment close'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down the network environment and its components.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 11, 2021 6:17:52 AM org.apache.flink.runtime.io.disk.FileChannelManagerImpl lambda$getFileCloser$0'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: FileChannelManager removed spill file directory /tmp/flink-netty-shuffle-fd4ab991-6275-482d-a111-7963612b6273'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 11, 2021 6:17:52 AM org.apache.flink.runtime.taskexecutor.KvStateService shutdown'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down the kvState service and its components.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 11, 2021 6:17:52 AM org.apache.flink.runtime.taskexecutor.DefaultJobLeaderService stop'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stop job leader service.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 11, 2021 6:17:52 AM org.apache.flink.runtime.filecache.FileCache shutdown'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: removed file cache directory /tmp/flink-dist-cache-f5323f26-aabd-4f80-826a-21864ef6bcca'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 11, 2021 6:17:52 AM org.apache.flink.runtime.taskexecutor.TaskExecutor handleOnStopException'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped TaskExecutor akka://flink/user/rpc/taskmanager_0.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 11, 2021 6:17:52 AM org.apache.flink.runtime.rpc.akka.AkkaRpcService stopService'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopping Akka RPC service.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 11, 2021 6:17:52 AM org.apache.flink.runtime.rpc.akka.AkkaRpcService stopService'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopping Akka RPC service.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 11, 2021 6:17:52 AM org.apache.flink.runtime.rpc.akka.AkkaRpcService lambda$stopService$7'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped Akka RPC service.'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 11, 2021 6:17:52 AM org.apache.flink.runtime.blob.AbstractBlobCache close'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down BLOB cache'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 11, 2021 6:17:52 AM org.apache.flink.runtime.blob.AbstractBlobCache close'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Shutting down BLOB cache'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 11, 2021 6:17:52 AM org.apache.flink.runtime.blob.BlobServer close'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped BLOB server at 0.0.0.0:40763'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Dec 11, 2021 6:17:52 AM org.apache.flink.runtime.rpc.akka.AkkaRpcService lambda$stopService$7'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'INFO: Stopped Akka RPC service.'
INFO apache_beam.runners.portability.portable_runner:portable_runner.py:576 Job state changed to DONE
PASSED [100%]
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-flink-py37.xml> -
========================== 7 passed in 228.66 seconds ==========================
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py>", line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py>", line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py>", line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py>", line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.CANCELLED
	details = "Multiplexer hanging up"
	debug_error_string = "{"created":"@1639203379.725566796","description":"Error received from peer ipv4:127.0.0.1:42161","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Multiplexer hanging up","grpc_status":1}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py>", line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py>", line 634, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py>", line 426, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py>", line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.CANCELLED
	details = "Multiplexer hanging up"
	debug_error_string = "{"created":"@1639203442.396045233","description":"Error received from peer ipv4:127.0.0.1:39515","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Multiplexer hanging up","grpc_status":1}"
>
> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED
> Task :sdks:python:test-suites:dataflow:py37:spannerioIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.36.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>> pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8 --dist=loadfile
>>> collect markers: -m=spannerio_it
============================= test session starts ==============================
platform linux -- Python 3.7.10, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python>, inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.4.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw0] Python 3.7.10 (default, Feb 20 2021, 21:21:24) -- [GCC 5.4.0 20160609]
[gw1] Python 3.7.10 (default, Feb 20 2021, 21:21:24) -- [GCC 5.4.0 20160609]
[gw2] Python 3.7.10 (default, Feb 20 2021, 21:21:24) -- [GCC 5.4.0 20160609]
[gw3] Python 3.7.10 (default, Feb 20 2021, 21:21:24) -- [GCC 5.4.0 20160609]
[gw4] Python 3.7.10 (default, Feb 20 2021, 21:21:24) -- [GCC 5.4.0 20160609]
[gw5] Python 3.7.10 (default, Feb 20 2021, 21:21:24) -- [GCC 5.4.0 20160609]
[gw6] Python 3.7.10 (default, Feb 20 2021, 21:21:24) -- [GCC 5.4.0 20160609]
[gw7] Python 3.7.10 (default, Feb 20 2021, 21:21:24) -- [GCC 5.4.0 20160609]
gw0 [15] / gw1 [15] / gw2 [15] / gw3 [15] / gw4 [15] / gw5 [15] / gw6 [15] / gw7 [15]
scheduling tests via LoadFileScheduling
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches
=============================== warnings summary ===============================
apache_beam/io/gcp/experimental/spannerio_read_it_test.py:128
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:128: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
sql="select * from Users")
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
database_id=self.TEST_DATABASE))
apache_beam/io/gcp/experimental/spannerio_read_it_test.py:117
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:117: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
columns=["UserId", "Key"])
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:171
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:171: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
database_id=self.TEST_DATABASE))
apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
max_batch_size_bytes=250))
-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============= 5 passed, 10 skipped, 5 warnings in 1601.30 seconds ==============
FAILURE: Build failed with an exception.
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 120
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 2h 18m 0s
217 actionable tasks: 165 executed, 48 from cache, 4 up-to-date
Publishing build scan...
https://gradle.com/s/uyuq6lejhvr66
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]