See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4684/display/redirect?page=changes>

Changes:

[noreply] Merge pull request #16285 from [BEAM-13492][Playground] Update backend


------------------------------------------
[...truncated 31.93 MB...]
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1640542563
  nanos: 214361667
}
message: "SDK Harness waiting for in-flight requests to complete"
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:245"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1640542563
  nanos: 214437723
}
message: "Closing all cached grpc data channels."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/data_plane.py:782"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1640542563
  nanos: 214507818
}
message: "Closing all cached gRPC state handlers."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:826"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1640542563
  nanos: 215016603
}
message: "Done consuming work."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker.py:257"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1640542563
  nanos: 215145587
}
message: "Python sdk harness exiting."
log_location: "/usr/local/lib/python3.7/site-packages/apache_beam/runners/worker/sdk_worker_main.py:156"
thread: "MainThread"

INFO:apache_beam.runners.portability.local_job_service:Successfully completed job in 5.022491216659546 seconds.
INFO:root:Successfully completed job in 5.022491216659546 seconds.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE

> Task :sdks:python:test-suites:portable:py37:portableWordCountSparkRunnerBatch
INFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at localhost:40979
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.36.0.dev
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function pack_combiners at 0x7f181ae06d40> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7f181ae06dd0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7f181ae08560> ====================
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar' '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/spark/2/job-server/build/libs/beam-runners-spark-job-server-2.36.0-SNAPSHOT.jar>' '--spark-master-url' 'local[4]' '--artifacts-dir' '/tmp/beam-temptfk6lcao/artifactstllkst5g' '--job-port' '36539' '--artifact-port' '0' '--expansion-port' '0']
INFO:apache_beam.utils.subprocess_server:b'SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".'
INFO:apache_beam.utils.subprocess_server:b'SLF4J: Defaulting to no-operation (NOP) logger implementation'
INFO:apache_beam.utils.subprocess_server:b'SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.'
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2']
INFO:apache_beam.utils.subprocess_server:b'Exception in thread "grpc-default-executor-0" java.lang.NoClassDefFoundError: org/apache/spark/streaming/api/java/JavaStreamingListener'
INFO:apache_beam.utils.subprocess_server:b'\tat org.apache.beam.runners.spark.SparkJobInvoker.createJobInvocation(SparkJobInvoker.java:101)'
INFO:apache_beam.utils.subprocess_server:b'\tat org.apache.beam.runners.spark.SparkJobInvoker.invokeWithExecutor(SparkJobInvoker.java:82)'
INFO:apache_beam.utils.subprocess_server:b'\tat org.apache.beam.runners.jobsubmission.JobInvoker.invoke(JobInvoker.java:48)'
INFO:apache_beam.utils.subprocess_server:b'\tat org.apache.beam.runners.jobsubmission.InMemoryJobService.run(InMemoryJobService.java:246)'
INFO:apache_beam.utils.subprocess_server:b'\tat org.apache.beam.model.jobmanagement.v1.JobServiceGrpc$MethodHandlers.invoke(JobServiceGrpc.java:948)'
INFO:apache_beam.utils.subprocess_server:b'\tat org.apache.beam.vendor.grpc.v1p36p0.io.grpc.stub.ServerCalls$UnaryServerCallHandler$UnaryServerCallListener.onHalfClose(ServerCalls.java:182)'
INFO:apache_beam.utils.subprocess_server:b'\tat org.apache.beam.vendor.grpc.v1p36p0.io.grpc.PartialForwardingServerCallListener.onHalfClose(PartialForwardingServerCallListener.java:35)'
INFO:apache_beam.utils.subprocess_server:b'\tat org.apache.beam.vendor.grpc.v1p36p0.io.grpc.ForwardingServerCallListener.onHalfClose(ForwardingServerCallListener.java:23)'
INFO:apache_beam.utils.subprocess_server:b'\tat org.apache.beam.vendor.grpc.v1p36p0.io.grpc.ForwardingServerCallListener$SimpleForwardingServerCallListener.onHalfClose(ForwardingServerCallListener.java:40)'
INFO:apache_beam.utils.subprocess_server:b'\tat org.apache.beam.vendor.grpc.v1p36p0.io.grpc.Contexts$ContextualizedServerCallListener.onHalfClose(Contexts.java:86)'
INFO:apache_beam.utils.subprocess_server:b'\tat org.apache.beam.vendor.grpc.v1p36p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.halfClosed(ServerCallImpl.java:331)'
INFO:apache_beam.utils.subprocess_server:b'\tat org.apache.beam.vendor.grpc.v1p36p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1HalfClosed.runInContext(ServerImpl.java:797)'
INFO:apache_beam.utils.subprocess_server:b'\tat org.apache.beam.vendor.grpc.v1p36p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)'
INFO:apache_beam.utils.subprocess_server:b'\tat org.apache.beam.vendor.grpc.v1p36p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)'
INFO:apache_beam.utils.subprocess_server:b'\tat java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)'
INFO:apache_beam.utils.subprocess_server:b'\tat java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)'
INFO:apache_beam.utils.subprocess_server:b'\tat java.lang.Thread.run(Thread.java:748)'
INFO:apache_beam.utils.subprocess_server:b'Caused by: java.lang.ClassNotFoundException: org.apache.spark.streaming.api.java.JavaStreamingListener'
INFO:apache_beam.utils.subprocess_server:b'\tat java.net.URLClassLoader.findClass(URLClassLoader.java:382)'
INFO:apache_beam.utils.subprocess_server:b'\tat java.lang.ClassLoader.loadClass(ClassLoader.java:418)'
INFO:apache_beam.utils.subprocess_server:b'\tat sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)'
INFO:apache_beam.utils.subprocess_server:b'\tat java.lang.ClassLoader.loadClass(ClassLoader.java:351)'
INFO:apache_beam.utils.subprocess_server:b'\t... 17 more'
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/examples/wordcount.py>", line 94, in <module>
    run()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/examples/wordcount.py>", line 89, in run
    output | 'Write' >> WriteToText(known_args.output)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/pipeline.py>", line 596, in __exit__
    self.result = self.run()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/pipeline.py>", line 573, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/portability/spark_runner.py>", line 47, in run_pipeline
    return super().run_pipeline(pipeline, options)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py>", line 440, in run_pipeline
    job_service_handle.submit(proto_pipeline)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py>", line 115, in submit
    return self.run(prepare_response.preparation_id)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py>", line 242, in run
    beam_job_api_pb2.RunJobRequest(preparation_id=preparation_id))
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py>", line 946, in __call__
    return _end_unary_response_blocking(state, call, False, None)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py>", line 849, in _end_unary_response_blocking
    raise _InactiveRpcError(state)
grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
        status = StatusCode.UNKNOWN
        details = ""
        debug_error_string = "{"created":"@1640542568.811001414","description":"Error received from peer ipv4:127.0.0.1:36539","file":"src/core/lib/surface/call.cc","file_line":1074,"grpc_message":"","grpc_status":2}"
>
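[Triage note, not part of the Jenkins output.] The java.lang.NoClassDefFoundError above suggests the shaded Spark job-server jar that subprocess_server launched does not bundle org.apache.spark.streaming.api.java.JavaStreamingListener, so the job server dies while creating the job invocation and the Python side only sees the empty StatusCode.UNKNOWN RPC error. A minimal local check, assuming the jar path printed by subprocess_server above (everything here is a hypothetical diagnostic sketch, not something run by this build):

    # Hypothetical sketch: verify whether the job-server jar from the log above
    # actually contains the class the error reports as missing. A jar is a zip,
    # so the standard-library zipfile module is enough.
    import zipfile

    JAR = ("runners/spark/2/job-server/build/libs/"
           "beam-runners-spark-job-server-2.36.0-SNAPSHOT.jar")  # path from the log, relative to the Beam checkout
    MISSING = "org/apache/spark/streaming/api/java/JavaStreamingListener.class"

    with zipfile.ZipFile(JAR) as jar:
        present = MISSING in set(jar.namelist())
    print(f"{MISSING}: {'present' if present else 'MISSING'} in {JAR}")

If the class is absent, the fix presumably lies in the Spark job-server's shading/dependency configuration rather than in the Python test suite itself.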

> Task :sdks:python:test-suites:portable:py37:portableWordCountSparkRunnerBatch FAILED
> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

> Task :sdks:python:test-suites:dataflow:py37:spannerioIT
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.36.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   pytest options: --capture=no --timeout=4500 --color=yes --log-cli-level=INFO --numprocesses=8 --dist=loadfile
>>>   collect markers: -m=spannerio_it
============================= test session starts ==============================
platform linux -- Python 3.7.10, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python>, inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.4.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw0] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw1] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw2] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw3] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw4] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw5] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw7] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
[gw6] Python 3.7.10 (default, Feb 20 2021, 21:21:24)  -- [GCC 5.4.0 20160609]
gw0 [15] / gw1 [15] / gw2 [15] / gw3 [15] / gw4 [15] / gw5 [15] / gw6 [15] / gw7 [15]

scheduling tests via LoadFileScheduling

apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call
[gw1] SKIPPED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_ok_call
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update
[gw0] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_error_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call
[gw0] SKIPPED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_update
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches
[gw1] PASSED apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_write_batches

=============================== warnings summary ===============================
apache_beam/io/gcp/experimental/spannerio_read_it_test.py:128
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:128: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    sql="select * from Users")

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:190
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:190: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_read_it_test.py:117
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_read_it_test.py>:117: FutureWarning: ReadFromSpanner is experimental. No backwards-compatibility guarantees.
    columns=["UserId", "Key"])

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:171
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:171: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    database_id=self.TEST_DATABASE))

apache_beam/io/gcp/experimental/spannerio_write_it_test.py:135
  <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py>:135: FutureWarning: WriteToSpanner is experimental. No backwards-compatibility guarantees.
    max_batch_size_bytes=250))

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/pytest_postCommitIT-df-py37.xml> -
============= 5 passed, 10 skipped, 5 warnings in 1709.03 seconds ==============

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py37:portableWordCountSparkRunnerBatch'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 3 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 2h 41m 56s
178 actionable tasks: 108 executed, 64 from cache, 6 up-to-date

Publishing build scan...
https://gradle.com/s/fk5skdxfzau4c

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
