See
<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/6404/display/redirect?page=changes>
Changes:
[noreply] Handle single-precision float values in the standard coders tests
[noreply] [BEAM-13015, #21250] Remove looking up thread local metrics container
[noreply] [fixes #22731] Publish nightly snapshot of legacy Dataflow worker jar.
------------------------------------------
[...truncated 49.96 KB...]
timeout func_only: False
collected 65 items
apache_beam/runners/portability/spark_runner_test.py F.....s...s...ss... [ 29%]
......ss......ss..s............s.ssss..sssss.. [100%]
=================================== FAILURES ===================================
_______________________ SparkRunnerTest.test_assert_that _______________________
RuntimeError: Subprocess terminated unexpectedly with exit code 0.
During handling of the above exception, another exception occurred:
self = <apache_beam.runners.portability.spark_runner_test.SparkRunnerTest
testMethod=test_assert_that>
def test_assert_that(self):
# TODO: figure out a way for fn_api_runner to parse and raise the
# underlying exception.
with self.assertRaisesRegex(Exception, 'Failed assert'):
with self.create_pipeline() as p:
> assert_that(p | beam.Create(['a', 'b']), equal_to(['a']))
E AssertionError: "Failed assert" does not match "Subprocess terminated
unexpectedly with exit code 0."
apache_beam/runners/portability/fn_api_runner/fn_runner_test.py:113:
AssertionError
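For context, the failing test exercises Beam's assert_that/equal_to pattern with a deliberate mismatch; it passes only when the runner surfaces the underlying "Failed assert" error rather than a generic subprocess failure. A minimal sketch of that pattern, run on the default DirectRunner instead of the portable Spark runner used by this job:

    import apache_beam as beam
    from apache_beam.testing.util import assert_that, equal_to

    # Sketch only: Create() emits ['a', 'b'] but the expectation is ['a'],
    # so the pipeline is expected to fail with a "Failed assert" error.
    try:
        with beam.Pipeline() as p:
            assert_that(p | beam.Create(['a', 'b']), equal_to(['a']))
    except Exception as exc:
        print(type(exc).__name__, exc)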
----------------------------- Captured stderr call -----------------------------
22/08/16 18:38:09 WARN
software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to
retrieve the requested metadata.
22/08/16 18:38:10 INFO org.apache.beam.runners.jobsubmission.JobServerDriver:
ArtifactStagingService started on localhost:41583
22/08/16 18:38:10 INFO org.apache.beam.runners.jobsubmission.JobServerDriver:
Java ExpansionService started on localhost:45257
22/08/16 18:38:10 WARN org.apache.beam.runners.jobsubmission.JobServerDriver:
Exception during job server creation
java.io.IOException: Failed to bind to address 0.0.0.0/0.0.0.0:40845
at org.apache.beam.vendor.grpc.v1p48p1.io.grpc.netty.NettyServer.start(NettyServer.java:328)
at org.apache.beam.vendor.grpc.v1p48p1.io.grpc.internal.ServerImpl.start(ServerImpl.java:183)
at org.apache.beam.vendor.grpc.v1p48p1.io.grpc.internal.ServerImpl.start(ServerImpl.java:92)
at org.apache.beam.sdk.fn.server.ServerFactory$InetSocketAddressServerFactory.createServer(ServerFactory.java:162)
at org.apache.beam.sdk.fn.server.ServerFactory$InetSocketAddressServerFactory.create(ServerFactory.java:145)
at org.apache.beam.sdk.fn.server.GrpcFnServer.create(GrpcFnServer.java:110)
at org.apache.beam.runners.jobsubmission.JobServerDriver.createJobServer(JobServerDriver.java:238)
at org.apache.beam.runners.jobsubmission.JobServerDriver.run(JobServerDriver.java:176)
at org.apache.beam.runners.spark.SparkJobServerDriver.main(SparkJobServerDriver.java:55)
Caused by: org.apache.beam.vendor.grpc.v1p48p1.io.netty.channel.unix.Errors$NativeIoException: bind(..) failed: Address already in use
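The bind failure above points at a port collision: the job server tried to use a port another process on the CI worker still held. A hypothetical helper (not part of Beam) that asks the kernel for a currently free port, which reduces, though does not eliminate, this kind of collision:

    import socket

    # Hypothetical helper, not Beam code: bind to port 0 so the kernel
    # picks a free TCP port, then return it for a later server start.
    # A race remains between returning the port and actually using it.
    def pick_free_port(host='localhost'):
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.bind((host, 0))
            return s.getsockname()[1]

    print(pick_free_port())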
22/08/16 18:38:10 INFO org.apache.beam.runners.jobsubmission.JobServerDriver:
ArtifactStagingServer stopped on localhost:41583
22/08/16 18:38:10 INFO org.apache.beam.runners.jobsubmission.JobServerDriver:
Expansion stopped on localhost:45257
=============================== warnings summary ===============================
target/.tox-spark-runner-test/spark-runner-test/lib/python3.7/site-packages/hdfs/config.py:15
<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py37/build/srcs/sdks/python/target/.tox-spark-runner-test/spark-runner-test/lib/python3.7/site-packages/hdfs/config.py>:15:
DeprecationWarning: the imp module is deprecated in favour of importlib; see
the module's documentation for alternative uses
from imp import load_source
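The DeprecationWarning above comes from the hdfs package still importing imp.load_source. A rough sketch of the importlib-based replacement the warning points to (names are illustrative):

    import importlib.util

    # Illustrative equivalent of imp.load_source(name, path) using importlib.
    def load_source(name, path):
        spec = importlib.util.spec_from_file_location(name, path)
        module = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(module)
        return module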
-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py37/build/srcs/sdks/python/pytest_spark-runner-test.xml> -
=========================== short test summary info ============================
FAILED
apache_beam/runners/portability/spark_runner_test.py::SparkRunnerTest::test_assert_that
======== 1 failed, 45 passed, 19 skipped, 1 warning in 89.87s (0:01:29) ========
ERROR: InvocationError for command
<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py37/build/srcs/sdks/python/scripts/pytest_validates_runner.sh>
spark-runner-test apache_beam/runners/portability/spark_runner_test.py
'--spark_job_server_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/runners/spark/2/job-server/build/libs/beam-runners-spark-job-server-2.42.0-SNAPSHOT.jar>
--environment_type=LOOPBACK' (exited with code 1)
spark-runner-test run-test-post: commands[0] | bash
<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py37/build/srcs/sdks/python/scripts/run_tox_cleanup.sh>
___________________________________ summary ____________________________________
ERROR: spark-runner-test: commands failed
> Task :sdks:python:test-suites:portable:py37:sparkCompatibilityMatrixLOOPBACK FAILED
> Task :sdks:python:test-suites:portable:py38:sparkCompatibilityMatrixLOOPBACK
spark-runner-test create:
<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py38/build/srcs/sdks/python/target/.tox-spark-runner-test/spark-runner-test>
spark-runner-test installdeps: -rbuild-requirements.txt
spark-runner-test inst:
<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py38/build/srcs/sdks/python/target/.tox-spark-runner-test/.tmp/package/1/apache-beam.tar.gz>
spark-runner-test installed: apache-beam @
file://<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py38/build/srcs/sdks/python/target/.tox-spark-runner-test/.tmp/package/1/apache-beam.tar.gz,attrs==22.1.0,certifi==2022.6.15,cffi==1.15.1,charset-normalizer==2.1.0,cloudpickle==2.1.0,crcmod==1.7,cryptography==37.0.4,deprecation==2.1.0,dill==0.3.1.1,distlib==0.3.1,docker==5.0.3,docopt==0.6.2,execnet==1.9.0,fastavro==1.6.0,freezegun==1.2.2,greenlet==1.1.2,grpcio==1.47.0,grpcio-tools==1.37.0,hdfs==2.7.0,httplib2==0.20.4,idna==3.3,iniconfig==1.1.1,joblib==1.1.0,mock==2.0.0,mypy-protobuf==1.18,numpy==1.22.4,orjson==3.7.12,packaging==21.3,pandas==1.4.3,parameterized==0.8.1,pbr==5.10.0,pluggy==1.0.0,proto-plus==1.22.0,protobuf==3.20.1,psycopg2-binary==2.9.3,py==1.11.0,pyarrow==7.0.0,pycparser==2.21,pydot==1.4.2,PyHamcrest==1.10.1,pymongo==3.12.3,PyMySQL==1.0.2,pyparsing==3.0.9,pytest==7.1.2,pytest-forked==1.4.0,pytest-timeout==2.1.0,pytest-xdist==2.5.0,python-dateutil==2.8.2,pytz==2022.2.1,PyYAML==6.0,regex==2022.7.25,requests==2.28.1,requests-mock==1.9.3,scikit-learn==1.1.2,scipy==1.9.0,six==1.16.0,SQLAlchemy==1.4.40,tenacity==5.1.5,testcontainers==3.6.1,threadpoolctl==3.1.0,tomli==2.0.1,typing_extensions==4.3.0,urllib3==1.26.11,websocket-client==1.3.3,wrapt==1.14.1,zstandard==0.18.0>
spark-runner-test run-test-pre: PYTHONHASHSEED='3204242203'
spark-runner-test run-test-pre: commands[0] | python --version
Python 3.8.10
spark-runner-test run-test-pre: commands[1] | pip --version
pip 22.2.1 from
<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py38/build/srcs/sdks/python/target/.tox-spark-runner-test/spark-runner-test/lib/python3.8/site-packages/pip>
(python 3.8)
spark-runner-test run-test-pre: commands[2] | pip check
No broken requirements found.
spark-runner-test run-test-pre: commands[3] | bash
<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py38/build/srcs/sdks/python/scripts/run_tox_cleanup.sh>
spark-runner-test run-test: commands[0] |
<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py38/build/srcs/sdks/python/scripts/pytest_validates_runner.sh>
spark-runner-test
<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py38/build/srcs/sdks/python/apache_beam/runners/portability/spark_runner_test.py>
'--spark_job_server_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/runners/spark/2/job-server/build/libs/beam-runners-spark-job-server-2.42.0-SNAPSHOT.jar>
--environment_type=LOOPBACK'
============================= test session starts ==============================
platform linux -- Python 3.8.10, pytest-7.1.2, pluggy-1.0.0
cachedir: target/.tox-spark-runner-test/spark-runner-test/.pytest_cache
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py38/build/srcs/sdks/python>, configfile: pytest.ini
plugins: xdist-2.5.0, timeout-2.1.0, forked-1.4.0, requests-mock-1.9.3
timeout: 600.0s
timeout method: signal
timeout func_only: False
collected 65 items
apache_beam/runners/portability/spark_runner_test.py ......s...s...ss... [ 29%]
......ss......ss..s............s.ssss..sssss.. [100%]
=============================== warnings summary ===============================
target/.tox-spark-runner-test/spark-runner-test/lib/python3.8/site-packages/hdfs/config.py:15
<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py38/build/srcs/sdks/python/target/.tox-spark-runner-test/spark-runner-test/lib/python3.8/site-packages/hdfs/config.py>:15:
DeprecationWarning: the imp module is deprecated in favour of importlib; see
the module's documentation for alternative uses
from imp import load_source
target/.tox-spark-runner-test/spark-runner-test/lib/python3.8/site-packages/tenacity/_asyncio.py:42
<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py38/build/srcs/sdks/python/target/.tox-spark-runner-test/spark-runner-test/lib/python3.8/site-packages/tenacity/_asyncio.py>:42:
DeprecationWarning: "@coroutine" decorator is deprecated since Python 3.8, use
"async def" instead
def call(self, fn, *args, **kwargs):
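The tenacity warning above concerns generator-based coroutines; a small sketch of the change the warning recommends, replacing an @asyncio.coroutine-decorated function with a native coroutine (names are illustrative, not tenacity's actual code):

    import asyncio

    # Native coroutine ("async def") instead of the deprecated
    # @asyncio.coroutine decorator; illustrative only.
    async def call(fn, *args, **kwargs):
        return fn(*args, **kwargs)

    print(asyncio.run(call(sum, [1, 2, 3])))  # -> 6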
apache_beam/runners/portability/spark_runner_test.py::SparkRunnerTest::test_pardo_state_with_custom_key_coder
<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py38/build/srcs/sdks/python/target/.tox-spark-runner-test/spark-runner-test/lib/python3.8/site-packages/_pytest/threadexception.py>:73:
PytestUnhandledThreadExceptionWarning: Exception in thread
read_grpc_client_inputs
Traceback (most recent call last):
File "/usr/lib/python3.8/threading.py", line 932, in _bootstrap_inner
self.run()
File "/usr/lib/python3.8/threading.py", line 870, in run
self._target(*self._args, **self._kwargs)
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py38/build/srcs/sdks/python/apache_beam/runners/worker/data_plane.py",>
line 671, in <lambda>
target=lambda: self._read_inputs(elements_iterator),
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py38/build/srcs/sdks/python/apache_beam/runners/worker/data_plane.py",>
line 654, in _read_inputs
for elements in elements_iterator:
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py38/build/srcs/sdks/python/target/.tox-spark-runner-test/spark-runner-test/lib/python3.8/site-packages/grpc/_channel.py",>
line 426, in __next__
return self._next()
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py38/build/srcs/sdks/python/target/.tox-spark-runner-test/spark-runner-test/lib/python3.8/site-packages/grpc/_channel.py",>
line 826, in _next
raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that
terminated with:
status = StatusCode.CANCELLED
details = "Multiplexer hanging up"
debug_error_string =
"{"created":"@1660675301.876067162","description":"Error received from peer
ipv6:[::1]:35129","file":"src/core/lib/surface/call.cc","file_line":966,"grpc_message":"Multiplexer
hanging up","grpc_status":1}"
>
warnings.warn(pytest.PytestUnhandledThreadExceptionWarning(msg))
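The warning is raised because the gRPC read thread lets the CANCELLED error ("Multiplexer hanging up") escape during shutdown. A hypothetical read loop, not Beam's data_plane code, that treats CANCELLED as a normal end of stream:

    import grpc

    # Hypothetical reader: swallow only the expected CANCELLED status seen
    # when the multiplexer hangs up at shutdown; re-raise anything else.
    def read_inputs(elements_iterator):
        try:
            for elements in elements_iterator:
                yield elements
        except grpc.RpcError as err:
            if err.code() != grpc.StatusCode.CANCELLED:
                raise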
apache_beam/runners/portability/spark_runner_test.py::SparkRunnerTest::test_pardo_timers_clear
<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py38/build/srcs/sdks/python/target/.tox-spark-runner-test/spark-runner-test/lib/python3.8/site-packages/_pytest/threadexception.py>:73:
PytestUnhandledThreadExceptionWarning: Exception in thread
read_grpc_client_inputs
Traceback (most recent call last):
File "/usr/lib/python3.8/threading.py", line 932, in _bootstrap_inner
self.run()
File "/usr/lib/python3.8/threading.py", line 870, in run
self._target(*self._args, **self._kwargs)
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py38/build/srcs/sdks/python/apache_beam/runners/worker/data_plane.py",>
line 671, in <lambda>
target=lambda: self._read_inputs(elements_iterator),
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py38/build/srcs/sdks/python/apache_beam/runners/worker/data_plane.py",>
line 654, in _read_inputs
for elements in elements_iterator:
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py38/build/srcs/sdks/python/target/.tox-spark-runner-test/spark-runner-test/lib/python3.8/site-packages/grpc/_channel.py",>
line 426, in __next__
return self._next()
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py38/build/srcs/sdks/python/target/.tox-spark-runner-test/spark-runner-test/lib/python3.8/site-packages/grpc/_channel.py",>
line 826, in _next
raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that
terminated with:
status = StatusCode.CANCELLED
details = "Multiplexer hanging up"
debug_error_string =
"{"created":"@1660675305.019803263","description":"Error received from peer
ipv6:[::1]:44149","file":"src/core/lib/surface/call.cc","file_line":966,"grpc_message":"Multiplexer
hanging up","grpc_status":1}"
>
warnings.warn(pytest.PytestUnhandledThreadExceptionWarning(msg))
-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py38/build/srcs/sdks/python/pytest_spark-runner-test.xml> -
============ 46 passed, 19 skipped, 4 warnings in 83.50s (0:01:23) =============
spark-runner-test run-test-post: commands[0] | bash
<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py38/build/srcs/sdks/python/scripts/run_tox_cleanup.sh>
___________________________________ summary ____________________________________
  spark-runner-test: commands succeeded
  congratulations :)
> Task :sdks:python:test-suites:portable:py38:sparkValidatesRunner
> Task :sdks:python:test-suites:portable:py39:sparkCompatibilityMatrixLOOPBACK
spark-runner-test create:
<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py39/build/srcs/sdks/python/target/.tox-spark-runner-test/spark-runner-test>
spark-runner-test installdeps: -rbuild-requirements.txt
spark-runner-test inst:
<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py39/build/srcs/sdks/python/target/.tox-spark-runner-test/.tmp/package/1/apache-beam.tar.gz>
spark-runner-test installed: apache-beam @
file://<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py39/build/srcs/sdks/python/target/.tox-spark-runner-test/.tmp/package/1/apache-beam.tar.gz,attrs==22.1.0,certifi==2022.6.15,cffi==1.15.1,charset-normalizer==2.1.0,cloudpickle==2.1.0,crcmod==1.7,cryptography==37.0.4,deprecation==2.1.0,dill==0.3.1.1,distlib==0.3.1,docker==5.0.3,docopt==0.6.2,execnet==1.9.0,fastavro==1.6.0,freezegun==1.2.2,greenlet==1.1.2,grpcio==1.47.0,grpcio-tools==1.37.0,hdfs==2.7.0,httplib2==0.20.4,idna==3.3,iniconfig==1.1.1,joblib==1.1.0,mock==2.0.0,mypy-protobuf==1.18,numpy==1.22.4,orjson==3.7.12,packaging==21.3,pandas==1.4.3,parameterized==0.8.1,pbr==5.10.0,pluggy==1.0.0,proto-plus==1.22.0,protobuf==3.20.1,psycopg2-binary==2.9.3,py==1.11.0,pyarrow==7.0.0,pycparser==2.21,pydot==1.4.2,PyHamcrest==1.10.1,pymongo==3.12.3,PyMySQL==1.0.2,pyparsing==3.0.9,pytest==7.1.2,pytest-forked==1.4.0,pytest-timeout==2.1.0,pytest-xdist==2.5.0,python-dateutil==2.8.2,pytz==2022.2.1,PyYAML==6.0,regex==2022.7.25,requests==2.28.1,requests-mock==1.9.3,scikit-learn==1.1.2,scipy==1.9.0,six==1.16.0,SQLAlchemy==1.4.40,tenacity==5.1.5,testcontainers==3.6.1,threadpoolctl==3.1.0,tomli==2.0.1,typing_extensions==4.3.0,urllib3==1.26.11,websocket-client==1.3.3,wrapt==1.14.1,zstandard==0.18.0>
spark-runner-test run-test-pre: PYTHONHASHSEED='1257141193'
spark-runner-test run-test-pre: commands[0] | python --version
Python 3.9.10
spark-runner-test run-test-pre: commands[1] | pip --version
pip 22.2.1 from
<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py39/build/srcs/sdks/python/target/.tox-spark-runner-test/spark-runner-test/lib/python3.9/site-packages/pip>
(python 3.9)
spark-runner-test run-test-pre: commands[2] | pip check
No broken requirements found.
spark-runner-test run-test-pre: commands[3] | bash
<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py39/build/srcs/sdks/python/scripts/run_tox_cleanup.sh>
spark-runner-test run-test: commands[0] |
<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py39/build/srcs/sdks/python/scripts/pytest_validates_runner.sh>
spark-runner-test
<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py39/build/srcs/sdks/python/apache_beam/runners/portability/spark_runner_test.py>
'--spark_job_server_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/runners/spark/2/job-server/build/libs/beam-runners-spark-job-server-2.42.0-SNAPSHOT.jar>
--environment_type=LOOPBACK'
============================= test session starts ==============================
platform linux -- Python 3.9.10, pytest-7.1.2, pluggy-1.0.0
cachedir: target/.tox-spark-runner-test/spark-runner-test/.pytest_cache
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py39/build/srcs/sdks/python>, configfile: pytest.ini
plugins: xdist-2.5.0, timeout-2.1.0, forked-1.4.0, requests-mock-1.9.3
timeout: 600.0s
timeout method: signal
timeout func_only: False
collected 65 items
apache_beam/runners/portability/spark_runner_test.py ......s...s...ss... [ 29%]
......ss......ss..s............s.ssss..sssss.. [100%]
=============================== warnings summary ===============================
target/.tox-spark-runner-test/spark-runner-test/lib/python3.9/site-packages/hdfs/config.py:15
<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py39/build/srcs/sdks/python/target/.tox-spark-runner-test/spark-runner-test/lib/python3.9/site-packages/hdfs/config.py>:15:
DeprecationWarning: the imp module is deprecated in favour of importlib; see
the module's documentation for alternative uses
from imp import load_source
target/.tox-spark-runner-test/spark-runner-test/lib/python3.9/site-packages/tenacity/_asyncio.py:42
<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py39/build/srcs/sdks/python/target/.tox-spark-runner-test/spark-runner-test/lib/python3.9/site-packages/tenacity/_asyncio.py>:42:
DeprecationWarning: "@coroutine" decorator is deprecated since Python 3.8, use
"async def" instead
def call(self, fn, *args, **kwargs):
apache_beam/runners/portability/spark_runner_test.py::SparkRunnerTest::test_pardo_state_with_custom_key_coder
<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py39/build/srcs/sdks/python/target/.tox-spark-runner-test/spark-runner-test/lib/python3.9/site-packages/_pytest/threadexception.py>:73:
PytestUnhandledThreadExceptionWarning: Exception in thread
read_grpc_client_inputs
Traceback (most recent call last):
File "/usr/lib/python3.9/threading.py", line 973, in _bootstrap_inner
self.run()
File "/usr/lib/python3.9/threading.py", line 910, in run
self._target(*self._args, **self._kwargs)
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py39/build/srcs/sdks/python/apache_beam/runners/worker/data_plane.py",>
line 671, in <lambda>
target=lambda: self._read_inputs(elements_iterator),
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py39/build/srcs/sdks/python/apache_beam/runners/worker/data_plane.py",>
line 654, in _read_inputs
for elements in elements_iterator:
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py39/build/srcs/sdks/python/target/.tox-spark-runner-test/spark-runner-test/lib/python3.9/site-packages/grpc/_channel.py",>
line 426, in __next__
return self._next()
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py39/build/srcs/sdks/python/target/.tox-spark-runner-test/spark-runner-test/lib/python3.9/site-packages/grpc/_channel.py",>
line 826, in _next
raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that
terminated with:
status = StatusCode.CANCELLED
details = "Multiplexer hanging up"
debug_error_string =
"{"created":"@1660675424.244567202","description":"Error received from peer
ipv6:[::1]:36525","file":"src/core/lib/surface/call.cc","file_line":966,"grpc_message":"Multiplexer
hanging up","grpc_status":1}"
>
warnings.warn(pytest.PytestUnhandledThreadExceptionWarning(msg))
-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py39/build/srcs/sdks/python/pytest_spark-runner-test.xml> -
============ 46 passed, 19 skipped, 3 warnings in 75.44s (0:01:15) =============
spark-runner-test run-test-post: commands[0] | bash
<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py39/build/srcs/sdks/python/scripts/run_tox_cleanup.sh>
___________________________________ summary ____________________________________
  spark-runner-test: commands succeeded
  congratulations :)
> Task :sdks:python:test-suites:portable:py39:sparkValidatesRunner
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task
':sdks:python:test-suites:portable:py37:sparkCompatibilityMatrixLOOPBACK'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 8.0.
You can use '--warning-mode all' to show the individual deprecation warnings
and determine if they come from your own scripts or plugins.
See
https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 7m 33s
80 actionable tasks: 51 executed, 27 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/iauthntuc6nns
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]