See 
<https://ci-beam.apache.org/job/beam_PostCommit_Python38/2092/display/redirect>

Changes:


------------------------------------------
[...truncated 17.72 MB...]
  seconds: 1640520969
  nanos: 427249670
}
message: "Python sdk harness starting."
log_location: 
"/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker_main.py:154"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1640520969
  nanos: 432505607
}
message: "Creating insecure state channel for localhost:42337."
instruction_id: "bundle_1"
log_location: 
"/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py:807"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1640520969
  nanos: 432793855
}
message: "State channel established."
instruction_id: "bundle_1"
log_location: 
"/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py:814"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1640520969
  nanos: 434631347
}
message: "Creating client data channel for localhost:37805"
instruction_id: "bundle_1"
log_location: 
"/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/data_plane.py:750"
thread: "Thread-14"

INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running 
(((((ref_AppliedPTransform_add-points-Impulse_3)+(ref_AppliedPTransform_add-points-FlatMap-lambda-at-core-py-3228-_4))+(ref_AppliedPTransform_add-points-Map-decode-_6))+(ref_AppliedPTransform_Map-get_julia_set_point_color-_7))+(ref_AppliedPTransform_x-coord-key_8))+(x
 coord/Write)
INFO:root:Running 
(((((ref_AppliedPTransform_add-points-Impulse_3)+(ref_AppliedPTransform_add-points-FlatMap-lambda-at-core-py-3228-_4))+(ref_AppliedPTransform_add-points-Map-decode-_6))+(ref_AppliedPTransform_Map-get_julia_set_point_color-_7))+(ref_AppliedPTransform_x-coord-key_8))+(x
 coord/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (((((x 
coord/Read)+(ref_AppliedPTransform_format_10))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WindowInto-WindowIntoFn-_20))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WriteBundles_21))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Pair_22))+(WriteToText/Write/WriteImpl/GroupByKey/Write)
INFO:root:Running (((((x 
coord/Read)+(ref_AppliedPTransform_format_10))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WindowInto-WindowIntoFn-_20))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WriteBundles_21))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Pair_22))+(WriteToText/Write/WriteImpl/GroupByKey/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running 
((WriteToText/Write/WriteImpl/GroupByKey/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Extract_24))+(ref_PCollection_PCollection_16/Write)
INFO:root:Running 
((WriteToText/Write/WriteImpl/GroupByKey/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Extract_24))+(ref_PCollection_PCollection_16/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running 
((ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-PreFinalize_25))+(ref_PCollection_PCollection_17/Write)
INFO:root:Running 
((ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-PreFinalize_25))+(ref_PCollection_PCollection_17/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running 
(ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-FinalizeWrite_26)
INFO:root:Running 
(ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-FinalizeWrite_26)
INFO:root:severity: INFO
timestamp {
  seconds: 1640520969
  nanos: 628844499
}
message: "Starting finalize_write threads with num_shards: 1 (skipped: 0), 
batches: 1, num_threads: 1"
instruction_id: "bundle_6"
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location: 
"/usr/local/lib/python3.8/site-packages/apache_beam/io/filebasedsink.py:297"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1640520969
  nanos: 639694452
}
message: "Renamed 1 shards in 0.01 seconds."
instruction_id: "bundle_6"
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location: 
"/usr/local/lib/python3.8/site-packages/apache_beam/io/filebasedsink.py:345"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1640520969
  nanos: 647714376
}
message: "No more requests from control plane"
log_location: 
"/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py:244"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1640520969
  nanos: 647907018
}
message: "SDK Harness waiting for in-flight requests to complete"
log_location: 
"/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py:245"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1640520969
  nanos: 648015022
}
message: "Closing all cached grpc data channels."
log_location: 
"/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/data_plane.py:782"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1640520969
  nanos: 648114919
}
message: "Closing all cached gRPC state handlers."
log_location: 
"/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py:826"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1640520969
  nanos: 648587465
}
message: "Done consuming work."
log_location: 
"/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py:257"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1640520969
  nanos: 648735284
}
message: "Python sdk harness exiting."
log_location: 
"/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker_main.py:156"
thread: "MainThread"

INFO:apache_beam.runners.portability.local_job_service:Successfully completed 
job in 7.304678440093994 seconds.
INFO:root:Successfully completed job in 7.304678440093994 seconds.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE

> Task :sdks:python:test-suites:portable:py38:portableWordCountSparkRunnerBatch
INFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at 
localhost:33893
WARNING:root:Make sure that locally built Python SDK docker image has Python 
3.8 interpreter.
INFO:root:Default Python SDK image for environment is 
apache/beam_python3.8_sdk:2.36.0.dev
INFO:apache_beam.runners.portability.fn_api_runner.translations:====================
 <function pack_combiners at 0x7fd8c5328940> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:====================
 <function lift_combiners at 0x7fd8c53289d0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:====================
 <function sort_stages at 0x7fd8c5329160> ====================
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar' 
'<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/runners/spark/2/job-server/build/libs/beam-runners-spark-job-server-2.36.0-SNAPSHOT.jar>'
 '--spark-master-url' 'local[4]' '--artifacts-dir' 
'/tmp/beam-tempoyweks7d/artifactsy4h9r2gt' '--job-port' '54511' 
'--artifact-port' '0' '--expansion-port' '0']
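
The command above is the Spark job server that the
:sdks:python:test-suites:portable:py38:portableWordCountSparkRunnerBatch task
submits the wordcount pipeline to. As a minimal sketch of how a pipeline is
submitted to such a job server (the job port 54511 is taken from the launch
command above; the LOOPBACK environment and the Create/Count transforms are
illustrative assumptions, not the actual wordcount.py invocation):

    # Hypothetical minimal submission against the job server launched above.
    # Assumptions: job server on localhost:54511, LOOPBACK worker environment.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:54511',
        '--environment_type=LOOPBACK',
    ])

    with beam.Pipeline(options=options) as p:
        (p
         | beam.Create(['to be or not to be'])
         | beam.FlatMap(str.split)
         | beam.combiners.Count.PerElement()
         | beam.Map(print))
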
INFO:apache_beam.utils.subprocess_server:b'SLF4J: Failed to load class 
"org.slf4j.impl.StaticLoggerBinder".'
INFO:apache_beam.utils.subprocess_server:b'SLF4J: Defaulting to no-operation 
(NOP) logger implementation'
INFO:apache_beam.utils.subprocess_server:b'SLF4J: See 
http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.'
WARNING:root:Waiting for grpc channel to be ready at localhost:54511.
WARNING:root:Waiting for grpc channel to be ready at localhost:54511.
WARNING:root:Waiting for grpc channel to be ready at localhost:54511.
WARNING:root:Waiting for grpc channel to be ready at localhost:54511.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: 
['--parallelism=2']
INFO:apache_beam.utils.subprocess_server:b'Exception in thread "grpc-default-executor-0" java.lang.NoClassDefFoundError: org/apache/spark/streaming/api/java/JavaStreamingListener'
INFO:apache_beam.utils.subprocess_server:b'\tat org.apache.beam.runners.spark.SparkJobInvoker.createJobInvocation(SparkJobInvoker.java:101)'
INFO:apache_beam.utils.subprocess_server:b'\tat org.apache.beam.runners.spark.SparkJobInvoker.invokeWithExecutor(SparkJobInvoker.java:82)'
INFO:apache_beam.utils.subprocess_server:b'\tat org.apache.beam.runners.jobsubmission.JobInvoker.invoke(JobInvoker.java:48)'
INFO:apache_beam.utils.subprocess_server:b'\tat org.apache.beam.runners.jobsubmission.InMemoryJobService.run(InMemoryJobService.java:246)'
INFO:apache_beam.utils.subprocess_server:b'\tat org.apache.beam.model.jobmanagement.v1.JobServiceGrpc$MethodHandlers.invoke(JobServiceGrpc.java:948)'
INFO:apache_beam.utils.subprocess_server:b'\tat org.apache.beam.vendor.grpc.v1p36p0.io.grpc.stub.ServerCalls$UnaryServerCallHandler$UnaryServerCallListener.onHalfClose(ServerCalls.java:182)'
INFO:apache_beam.utils.subprocess_server:b'\tat org.apache.beam.vendor.grpc.v1p36p0.io.grpc.PartialForwardingServerCallListener.onHalfClose(PartialForwardingServerCallListener.java:35)'
INFO:apache_beam.utils.subprocess_server:b'\tat org.apache.beam.vendor.grpc.v1p36p0.io.grpc.ForwardingServerCallListener.onHalfClose(ForwardingServerCallListener.java:23)'
INFO:apache_beam.utils.subprocess_server:b'\tat org.apache.beam.vendor.grpc.v1p36p0.io.grpc.ForwardingServerCallListener$SimpleForwardingServerCallListener.onHalfClose(ForwardingServerCallListener.java:40)'
INFO:apache_beam.utils.subprocess_server:b'\tat org.apache.beam.vendor.grpc.v1p36p0.io.grpc.Contexts$ContextualizedServerCallListener.onHalfClose(Contexts.java:86)'
INFO:apache_beam.utils.subprocess_server:b'\tat org.apache.beam.vendor.grpc.v1p36p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.halfClosed(ServerCallImpl.java:331)'
INFO:apache_beam.utils.subprocess_server:b'\tat org.apache.beam.vendor.grpc.v1p36p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1HalfClosed.runInContext(ServerImpl.java:797)'
INFO:apache_beam.utils.subprocess_server:b'\tat org.apache.beam.vendor.grpc.v1p36p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)'
INFO:apache_beam.utils.subprocess_server:b'\tat org.apache.beam.vendor.grpc.v1p36p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)'
INFO:apache_beam.utils.subprocess_server:b'\tat java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)'
INFO:apache_beam.utils.subprocess_server:b'\tat java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)'
INFO:apache_beam.utils.subprocess_server:b'\tat java.lang.Thread.run(Thread.java:748)'
INFO:apache_beam.utils.subprocess_server:b'Caused by: java.lang.ClassNotFoundException: org.apache.spark.streaming.api.java.JavaStreamingListener'
INFO:apache_beam.utils.subprocess_server:b'\tat java.net.URLClassLoader.findClass(URLClassLoader.java:382)'
INFO:apache_beam.utils.subprocess_server:b'\tat java.lang.ClassLoader.loadClass(ClassLoader.java:418)'
INFO:apache_beam.utils.subprocess_server:b'\tat sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)'
INFO:apache_beam.utils.subprocess_server:b'\tat java.lang.ClassLoader.loadClass(ClassLoader.java:351)'
INFO:apache_beam.utils.subprocess_server:b'\t... 17 more'
Traceback (most recent call last):
  File "/usr/lib/python3.8/runpy.py", line 194, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/usr/lib/python3.8/runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/examples/wordcount.py>", line 94, in <module>
    run()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/examples/wordcount.py>", line 89, in run
    output | 'Write' >> WriteToText(known_args.output)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/pipeline.py>", line 596, in __exit__
    self.result = self.run()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/pipeline.py>", line 573, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/portability/spark_runner.py>", line 47, in run_pipeline
    return super().run_pipeline(pipeline, options)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py>", line 440, in run_pipeline
    job_service_handle.submit(proto_pipeline)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py>", line 115, in submit
    return self.run(prepare_response.preparation_id)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py>", line 241, in run
    run_response = self.job_service.Run(
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/2022703442/lib/python3.8/site-packages/grpc/_channel.py>", line 946, in __call__
    return _end_unary_response_blocking(state, call, False, None)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/2022703442/lib/python3.8/site-packages/grpc/_channel.py>", line 849, in _end_unary_response_blocking
    raise _InactiveRpcError(state)
grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
        status = StatusCode.UNKNOWN
        details = ""
        debug_error_string = 
"{"created":"@1640520983.588200417","description":"Error received from peer 
ipv4:127.0.0.1:54511","file":"src/core/lib/surface/call.cc","file_line":1074,"grpc_message":"","grpc_status":2}"
>
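
The StatusCode.UNKNOWN surfaced to the Python client here corresponds to the
java.lang.NoClassDefFoundError / ClassNotFoundException for
org.apache.spark.streaming.api.java.JavaStreamingListener in the job-server
output above, which points to the Spark streaming classes missing from the
job-server classpath. A minimal sketch for confirming whether that class is
packaged in the shaded jar (the jar path is the workspace-relative path from
the launch command above; treating it as the jar actually used is an
assumption):

    # Sketch: check whether the class that failed to load at runtime is
    # present in the shaded Spark job-server jar used by this test run.
    import zipfile

    JAR = ('runners/spark/2/job-server/build/libs/'
           'beam-runners-spark-job-server-2.36.0-SNAPSHOT.jar')
    MISSING = 'org/apache/spark/streaming/api/java/JavaStreamingListener.class'

    with zipfile.ZipFile(JAR) as jar:
        print(MISSING, 'present' if MISSING in jar.namelist() else 'missing')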

> Task :sdks:python:test-suites:portable:py38:portableWordCountSparkRunnerBatch FAILED
> Task :sdks:python:test-suites:dataflow:py38:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task 
':sdks:python:test-suites:portable:py38:portableWordCountSparkRunnerBatch'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings 
and determine if they come from your own scripts or plugins.

See 
https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 3 invalid unit(s) of work during 
this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 1h 52m 47s
173 actionable tasks: 103 executed, 64 from cache, 6 up-to-date

Publishing build scan...
https://gradle.com/s/zqvlc5464gayu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
