See <https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/4649/display/redirect?page=changes>

Changes:

[heejong] [BEAM-12390] clickhouse test fails when test resource can't be read by

[Ismaël Mejía] [BEAM-12241] Update vendored bytebuddy to version 1.11.0

[noreply] [BEAM-10144] update pipeline options snippets (#14738)

[Kyle Weaver] Update Dataflow Python containers.

[Kyle Weaver] [BEAM-12394] Include UDF provider jar in Hadoop variant test.

[Kyle Weaver] [BEAM-11738] Fail JavaUdfLoaderTest instead of ignoring if system


------------------------------------------
[...truncated 50.33 KB...]
> Task :runners:spark:2:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:spark:2:classes
> Task :runners:spark:2:jar
> Task :runners:spark:2:job-server:compileJava NO-SOURCE
> Task :runners:spark:2:job-server:classes UP-TO-DATE
> Task :runners:spark:2:job-server:shadowJar

> Task :sdks:python:test-suites:portable:py36:sparkCompatibilityMatrixLOOPBACK
spark-runner-test create: <https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py36/build/srcs/sdks/python/target/.tox-spark-runner-test/spark-runner-test>
spark-runner-test installdeps: -rbuild-requirements.txt
spark-runner-test inst: <https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py36/build/srcs/sdks/python/target/.tox-spark-runner-test/.tmp/package/1/apache-beam.tar.gz>
spark-runner-test installed: apache-beam @ 
file://<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py36/build/srcs/sdks/python/target/.tox-spark-runner-test/.tmp/package/1/apache-beam.tar.gz,apipkg==1.5,atomicwrites==1.4.0,attrs==21.2.0,avro-python3==1.9.2.1,certifi==2020.12.5,chardet==4.0.0,crcmod==1.7,dataclasses==0.8,deprecation==2.1.0,dill==0.3.1.1,distlib==0.3.1,docker==5.0.0,docopt==0.6.2,execnet==1.8.0,fastavro==1.4.1,freezegun==1.1.0,future==0.18.2,greenlet==1.1.0,grpcio==1.38.0,grpcio-tools==1.37.0,hdfs==2.6.0,httplib2==0.19.1,idna==2.10,importlib-metadata==4.0.1,mock==2.0.0,more-itertools==8.8.0,mypy-protobuf==1.18,nose==1.3.7,nose-xunitmp==0.4.1,numpy==1.19.5,oauth2client==4.1.3,packaging==20.9,pandas==1.1.5,parameterized==0.7.5,pbr==5.6.0,pluggy==0.13.1,protobuf==3.17.1,psycopg2-binary==2.8.6,py==1.10.0,pyarrow==3.0.0,pyasn1==0.4.8,pyasn1-modules==0.2.8,pydot==1.4.2,PyHamcrest==1.10.1,pymongo==3.11.4,pyparsing==2.4.7,pytest==4.6.11,pytest-forked==1.3.0,pytest-timeout==1.4.2,pytest-xdist==1.34.0,python-dateutil==2.8.1,pytz==2021.1,PyYAML==5.4.1,requests==2.25.1,requests-mock==1.9.2,rsa==4.7.2,six==1.16.0,SQLAlchemy==1.4.15,tenacity==5.1.5,testcontainers==3.4.0,typing-extensions==3.7.4.3,urllib3==1.26.4,wcwidth==0.2.5,websocket-client==1.0.1,wrapt==1.12.1,zipp==3.4.1>
spark-runner-test run-test-pre: PYTHONHASHSEED='1424233599'
spark-runner-test run-test-pre: commands[0] | python --version
Python 3.6.8
spark-runner-test run-test-pre: commands[1] | pip --version
pip 21.0.1 from <https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py36/build/srcs/sdks/python/target/.tox-spark-runner-test/spark-runner-test/lib/python3.6/site-packages/pip> (python 3.6)
spark-runner-test run-test-pre: commands[2] | pip check
No broken requirements found.
spark-runner-test run-test-pre: commands[3] | bash <https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py36/build/srcs/sdks/python/scripts/run_tox_cleanup.sh>
spark-runner-test run-test: commands[0] | <https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py36/build/srcs/sdks/python/scripts/pytest_validates_runner.sh> spark-runner-test <https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py36/build/srcs/sdks/python/apache_beam/runners/portability/spark_runner_test.py> '--spark_job_server_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/runners/spark/2/job-server/build/libs/beam-runners-spark-job-server-2.31.0-SNAPSHOT.jar> --environment_type=LOOPBACK'
============================= test session starts ==============================
platform linux -- Python 3.6.8, pytest-4.6.11, py-1.10.0, pluggy-0.13.1
cachedir: target/.tox-spark-runner-test/spark-runner-test/.pytest_cache
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py36/build/srcs/sdks/python>, inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.3.0, requests-mock-1.9.2
timeout: 600.0s
timeout method: signal
timeout func_only: False
collected 50 items

apache_beam/runners/portability/spark_runner_test.py .s...ss.......ss... [ 38%]
.s.s............s.ssss..sssss..                                          [100%]

- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py36/build/srcs/sdks/python/pytest_spark-runner-test.xml> -
=================== 33 passed, 17 skipped in 593.90 seconds ====================
spark-runner-test run-test-post: commands[0] | bash <https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py36/build/srcs/sdks/python/scripts/run_tox_cleanup.sh>
___________________________________ summary ____________________________________
  spark-runner-test: commands succeeded
  congratulations :)

> Task :sdks:python:test-suites:portable:py36:sparkValidatesRunner

> Task :sdks:python:test-suites:portable:py37:sparkCompatibilityMatrixLOOPBACK
spark-runner-test create: <https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py37/build/srcs/sdks/python/target/.tox-spark-runner-test/spark-runner-test>
spark-runner-test installdeps: -rbuild-requirements.txt
spark-runner-test inst: <https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py37/build/srcs/sdks/python/target/.tox-spark-runner-test/.tmp/package/1/apache-beam.tar.gz>
spark-runner-test installed: apache-beam @ 
file://<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py37/build/srcs/sdks/python/target/.tox-spark-runner-test/.tmp/package/1/apache-beam.tar.gz,apipkg==1.5,atomicwrites==1.4.0,attrs==21.2.0,avro-python3==1.9.2.1,certifi==2020.12.5,chardet==4.0.0,crcmod==1.7,deprecation==2.1.0,dill==0.3.1.1,distlib==0.3.1,docker==5.0.0,docopt==0.6.2,execnet==1.8.0,fastavro==1.4.1,freezegun==1.1.0,future==0.18.2,greenlet==1.1.0,grpcio==1.38.0,grpcio-tools==1.37.0,hdfs==2.6.0,httplib2==0.19.1,idna==2.10,importlib-metadata==4.0.1,mock==2.0.0,more-itertools==8.8.0,mypy-protobuf==1.18,nose==1.3.7,nose-xunitmp==0.4.1,numpy==1.20.3,oauth2client==4.1.3,packaging==20.9,pandas==1.2.4,parameterized==0.7.5,pbr==5.6.0,pluggy==0.13.1,protobuf==3.17.1,psycopg2-binary==2.8.6,py==1.10.0,pyarrow==3.0.0,pyasn1==0.4.8,pyasn1-modules==0.2.8,pydot==1.4.2,PyHamcrest==1.10.1,pymongo==3.11.4,pyparsing==2.4.7,pytest==4.6.11,pytest-forked==1.3.0,pytest-timeout==1.4.2,pytest-xdist==1.34.0,python-dateutil==2.8.1,pytz==2021.1,PyYAML==5.4.1,requests==2.25.1,requests-mock==1.9.2,rsa==4.7.2,six==1.16.0,SQLAlchemy==1.4.15,tenacity==5.1.5,testcontainers==3.4.0,typing-extensions==3.7.4.3,urllib3==1.26.4,wcwidth==0.2.5,websocket-client==1.0.1,wrapt==1.12.1,zipp==3.4.1>
spark-runner-test run-test-pre: PYTHONHASHSEED='3812123000'
spark-runner-test run-test-pre: commands[0] | python --version
Python 3.7.3
spark-runner-test run-test-pre: commands[1] | pip --version
pip 21.0.1 from <https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py37/build/srcs/sdks/python/target/.tox-spark-runner-test/spark-runner-test/lib/python3.7/site-packages/pip> (python 3.7)
spark-runner-test run-test-pre: commands[2] | pip check
No broken requirements found.
spark-runner-test run-test-pre: commands[3] | bash <https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py37/build/srcs/sdks/python/scripts/run_tox_cleanup.sh>
spark-runner-test run-test: commands[0] | <https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py37/build/srcs/sdks/python/scripts/pytest_validates_runner.sh> spark-runner-test <https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py37/build/srcs/sdks/python/apache_beam/runners/portability/spark_runner_test.py> '--spark_job_server_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/runners/spark/2/job-server/build/libs/beam-runners-spark-job-server-2.31.0-SNAPSHOT.jar> --environment_type=LOOPBACK'
============================= test session starts ==============================
platform linux -- Python 3.7.3, pytest-4.6.11, py-1.10.0, pluggy-0.13.1
cachedir: target/.tox-spark-runner-test/spark-runner-test/.pytest_cache
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py37/build/srcs/sdks/python>, inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.3.0, requests-mock-1.9.2
timeout: 600.0s
timeout method: signal
timeout func_only: False
collected 50 items

apache_beam/runners/portability/spark_runner_test.py FsF..ss.......ss... [ 38%]
.s.s............s.ssss..sssss..                                          [100%]

=================================== FAILURES ===================================
_______________________ SparkRunnerTest.test_assert_that _______________________
RuntimeError: Pipeline timed out waiting for job service subprocess.

During handling of the above exception, another exception occurred:

self = <apache_beam.runners.portability.spark_runner_test.SparkRunnerTest testMethod=test_assert_that>

    def test_assert_that(self):
      # TODO: figure out a way for fn_api_runner to parse and raise the
      # underlying exception.
      with self.assertRaisesRegex(Exception, 'Failed assert'):
        with self.create_pipeline() as p:
>         assert_that(p | beam.Create(['a', 'b']), equal_to(['a']))
E         AssertionError: "Failed assert" does not match "Pipeline timed out waiting for job service subprocess."

apache_beam/runners/portability/fn_api_runner/fn_runner_test.py:107: AssertionError
_____________________ SparkRunnerTest.test_combine_per_key _____________________

self = <apache_beam.runners.portability.spark_runner_test.SparkRunnerTest testMethod=test_combine_per_key>

    def test_combine_per_key(self):
>     with self.create_pipeline() as p:

apache_beam/runners/portability/fn_api_runner/fn_runner_test.py:745: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
apache_beam/runners/portability/portable_runner_test.py:167: in create_pipeline
    return beam.Pipeline(self.get_runner(), self.create_options())
apache_beam/runners/portability/spark_runner_test.py:133: in create_options
    options = super(SparkRunnerTest, self).create_options()
apache_beam/runners/portability/portable_runner_test.py:155: in create_options
    options.view_as(PortableOptions).job_endpoint = self._get_job_endpoint()
apache_beam/runners/portability/portable_runner_test.py:120: in _get_job_endpoint
    cls._job_endpoint = cls._create_job_endpoint()
apache_beam/runners/portability/portable_runner_test.py:126: in _create_job_endpoint
    return cls._start_local_runner_subprocess_job_service()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

cls = <class 'apache_beam.runners.portability.spark_runner_test.SparkRunnerTest'>

    @classmethod
    def _start_local_runner_subprocess_job_service(cls):
      cls._maybe_kill_subprocess()
      # TODO(robertwb): Consider letting the subprocess pick one and
      # communicate it back...
      # pylint: disable=unbalanced-tuple-unpacking
      job_port, expansion_port = cls._pick_unused_ports(num_ports=2)
      _LOGGER.info('Starting server on port %d.', job_port)
      cls._subprocess = subprocess.Popen(
          cls._subprocess_command(job_port, expansion_port))
      address = 'localhost:%d' % job_port
      job_service = beam_job_api_pb2_grpc.JobServiceStub(
          GRPCChannelFactory.insecure_channel(address))
      _LOGGER.info('Waiting for server to be ready...')
      start = time.time()
      timeout = 30
      while True:
        time.sleep(0.1)
        if cls._subprocess.poll() is not None:
          raise RuntimeError(
              'Subprocess terminated unexpectedly with exit code %d.' %
              cls._subprocess.returncode)
        elif time.time() - start > timeout:
          raise RuntimeError(
>             'Pipeline timed out waiting for job service subprocess.')
E         RuntimeError: Pipeline timed out waiting for job service subprocess.

apache_beam/runners/portability/portable_runner_test.py:104: RuntimeError
----------------------------- Captured stderr call -----------------------------
21/05/25 00:36:37 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:40005
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py37/build/srcs/sdks/python/pytest_spark-runner-test.xml> -
============== 2 failed, 31 passed, 17 skipped in 607.00 seconds ===============
ERROR: InvocationError for command <https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py37/build/srcs/sdks/python/scripts/pytest_validates_runner.sh> spark-runner-test apache_beam/runners/portability/spark_runner_test.py '--spark_job_server_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/runners/spark/2/job-server/build/libs/beam-runners-spark-job-server-2.31.0-SNAPSHOT.jar> --environment_type=LOOPBACK' (exited with code 1)
spark-runner-test run-test-post: commands[0] | bash <https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py37/build/srcs/sdks/python/scripts/run_tox_cleanup.sh>
___________________________________ summary ____________________________________
ERROR:   spark-runner-test: commands failed

> Task :sdks:python:test-suites:portable:py37:sparkCompatibilityMatrixLOOPBACK FAILED
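
For context on the py37 (and later py38) failures above: the traceback shows the test harness launching the Spark job server jar as a subprocess and polling it, and after the 30-second deadline it raises "Pipeline timed out waiting for job service subprocess.", which is also why test_assert_that's expected "Failed assert" message never appears. The snippet below is a minimal, self-contained sketch of that start-and-poll pattern, not Beam's actual helper; the function name start_and_wait, the readiness check, the throwaway HTTP server command, and port 18099 are placeholders chosen purely for illustration.

import socket
import subprocess
import sys
import time

def start_and_wait(cmd, is_ready, timeout=30.0, poll_interval=0.1):
    """Start cmd as a subprocess and poll until is_ready() returns True.

    Mirrors the two failure modes visible in the traceback above: the
    subprocess exiting early, and the readiness deadline expiring.
    """
    proc = subprocess.Popen(cmd)
    deadline = time.time() + timeout
    while True:
        time.sleep(poll_interval)
        if proc.poll() is not None:
            raise RuntimeError(
                'Subprocess terminated unexpectedly with exit code %d.' %
                proc.returncode)
        if is_ready():
            return proc
        if time.time() > deadline:
            proc.kill()
            raise RuntimeError(
                'Pipeline timed out waiting for job service subprocess.')

def port_open(port):
    # Readiness check used only for this sketch: is anything listening locally?
    with socket.socket() as s:
        return s.connect_ex(('localhost', port)) == 0

if __name__ == '__main__':
    # Placeholder server: a stdlib HTTP server standing in for the job service.
    proc = start_and_wait(
        [sys.executable, '-m', 'http.server', '18099'],
        lambda: port_open(18099),
        timeout=10.0)
    print('ready, pid', proc.pid)
    proc.terminate()

Polling both the exit code and the deadline distinguishes "the job server crashed" from "the job server is still running but never became reachable"; the log above is consistent with the second case, since the subprocess stayed alive (ArtifactStagingService had started) but the job endpoint was not reachable within the timeout.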
> Task :sdks:python:test-suites:portable:py38:sparkCompatibilityMatrixLOOPBACK
spark-runner-test create: <https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py38/build/srcs/sdks/python/target/.tox-spark-runner-test/spark-runner-test>
spark-runner-test installdeps: -rbuild-requirements.txt
spark-runner-test inst: <https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py38/build/srcs/sdks/python/target/.tox-spark-runner-test/.tmp/package/1/apache-beam.tar.gz>
spark-runner-test installed: apache-beam @ 
file://<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py38/build/srcs/sdks/python/target/.tox-spark-runner-test/.tmp/package/1/apache-beam.tar.gz,apipkg==1.5,atomicwrites==1.4.0,attrs==21.2.0,avro-python3==1.9.2.1,certifi==2020.12.5,chardet==4.0.0,crcmod==1.7,deprecation==2.1.0,dill==0.3.1.1,distlib==0.3.1,docker==5.0.0,docopt==0.6.2,execnet==1.8.0,fastavro==1.4.1,freezegun==1.1.0,future==0.18.2,greenlet==1.1.0,grpcio==1.38.0,grpcio-tools==1.37.0,hdfs==2.6.0,httplib2==0.19.1,idna==2.10,mock==2.0.0,more-itertools==8.8.0,mypy-protobuf==1.18,nose==1.3.7,nose-xunitmp==0.4.1,numpy==1.20.3,oauth2client==4.1.3,packaging==20.9,pandas==1.2.4,parameterized==0.7.5,pbr==5.6.0,pluggy==0.13.1,protobuf==3.17.1,psycopg2-binary==2.8.6,py==1.10.0,pyarrow==3.0.0,pyasn1==0.4.8,pyasn1-modules==0.2.8,pydot==1.4.2,PyHamcrest==1.10.1,pymongo==3.11.4,pyparsing==2.4.7,pytest==4.6.11,pytest-forked==1.3.0,pytest-timeout==1.4.2,pytest-xdist==1.34.0,python-dateutil==2.8.1,pytz==2021.1,PyYAML==5.4.1,requests==2.25.1,requests-mock==1.9.2,rsa==4.7.2,six==1.16.0,SQLAlchemy==1.4.15,tenacity==5.1.5,testcontainers==3.4.0,typing-extensions==3.7.4.3,urllib3==1.26.4,wcwidth==0.2.5,websocket-client==1.0.1,wrapt==1.12.1>
spark-runner-test run-test-pre: PYTHONHASHSEED='275827904'
spark-runner-test run-test-pre: commands[0] | python --version
Python 3.8.5
spark-runner-test run-test-pre: commands[1] | pip --version
pip 21.0.1 from <https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py38/build/srcs/sdks/python/target/.tox-spark-runner-test/spark-runner-test/lib/python3.8/site-packages/pip> (python 3.8)
spark-runner-test run-test-pre: commands[2] | pip check
No broken requirements found.
spark-runner-test run-test-pre: commands[3] | bash <https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py38/build/srcs/sdks/python/scripts/run_tox_cleanup.sh>
spark-runner-test run-test: commands[0] | <https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py38/build/srcs/sdks/python/scripts/pytest_validates_runner.sh> spark-runner-test <https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py38/build/srcs/sdks/python/apache_beam/runners/portability/spark_runner_test.py> '--spark_job_server_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/runners/spark/2/job-server/build/libs/beam-runners-spark-job-server-2.31.0-SNAPSHOT.jar> --environment_type=LOOPBACK'
============================= test session starts ==============================
platform linux -- Python 3.8.5, pytest-4.6.11, py-1.10.0, pluggy-0.13.1
cachedir: target/.tox-spark-runner-test/spark-runner-test/.pytest_cache
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py38/build/srcs/sdks/python>, inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.3.0, requests-mock-1.9.2
timeout: 600.0s
timeout method: signal
timeout func_only: False
collected 50 items

apache_beam/runners/portability/spark_runner_test.py Fs...ss.......ss... [ 38%]
.s.s............s.ssss..sssss..                                          [100%]

=================================== FAILURES ===================================
_______________________ SparkRunnerTest.test_assert_that _______________________
RuntimeError: Pipeline timed out waiting for job service subprocess.

During handling of the above exception, another exception occurred:

self = <apache_beam.runners.portability.spark_runner_test.SparkRunnerTest testMethod=test_assert_that>

    def test_assert_that(self):
      # TODO: figure out a way for fn_api_runner to parse and raise the
      # underlying exception.
      with self.assertRaisesRegex(Exception, 'Failed assert'):
        with self.create_pipeline() as p:
>         assert_that(p | beam.Create(['a', 'b']), equal_to(['a']))
E         AssertionError: "Failed assert" does not match "Pipeline timed out waiting for job service subprocess."

apache_beam/runners/portability/fn_api_runner/fn_runner_test.py:107: AssertionError
=============================== warnings summary ===============================
target/.tox-spark-runner-test/spark-runner-test/lib/python3.8/site-packages/tenacity/_asyncio.py:42
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py38/build/srcs/sdks/python/target/.tox-spark-runner-test/spark-runner-test/lib/python3.8/site-packages/tenacity/_asyncio.py>:42: DeprecationWarning: "@coroutine" decorator is deprecated since Python 3.8, use "async def" instead
    def call(self, fn, *args, **kwargs):

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py38/build/srcs/sdks/python/pytest_spark-runner-test.xml> -
======== 1 failed, 32 passed, 17 skipped, 1 warnings in 580.21 seconds =========
ERROR: InvocationError for command <https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py38/build/srcs/sdks/python/scripts/pytest_validates_runner.sh> spark-runner-test apache_beam/runners/portability/spark_runner_test.py '--spark_job_server_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/runners/spark/2/job-server/build/libs/beam-runners-spark-job-server-2.31.0-SNAPSHOT.jar> --environment_type=LOOPBACK' (exited with code 1)
spark-runner-test run-test-post: commands[0] | bash <https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py38/build/srcs/sdks/python/scripts/run_tox_cleanup.sh>
___________________________________ summary ____________________________________
ERROR:   spark-runner-test: commands failed

> Task :sdks:python:test-suites:portable:py38:sparkCompatibilityMatrixLOOPBACK FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py37:sparkCompatibilityMatrixLOOPBACK'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py38:sparkCompatibilityMatrixLOOPBACK'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 54m 27s
67 actionable tasks: 46 executed, 21 from cache
Gradle was unable to watch the file system for changes. The inotify watches limit is too low.

Publishing build scan...
https://gradle.com/s/vawonyuvmqexs

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
