See
<https://ci-beam.apache.org/job/beam_PostCommit_Python38/2097/display/redirect?page=changes>
Changes:
[mmack] [adhoc] Forbid to import guava and others from org.testcontainers.shaded
------------------------------------------
[...truncated 19.94 MB...]
thread: "MainThread"
INFO:root:severity: INFO
timestamp {
seconds: 1640628791
nanos: 324365854
}
message: "Control channel established."
log_location:
"/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py:172"
thread: "MainThread"
INFO:root:severity: INFO
timestamp {
seconds: 1640628791
nanos: 325721740
}
message: "Initializing SDKHarness with unbounded number of workers."
log_location:
"/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py:215"
thread: "MainThread"
INFO:root:severity: INFO
timestamp {
seconds: 1640628791
nanos: 328099489
}
message: "Python sdk harness starting."
log_location:
"/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker_main.py:154"
thread: "MainThread"
INFO:root:severity: INFO
timestamp {
seconds: 1640628791
nanos: 332737684
}
message: "Creating insecure state channel for localhost:38605."
instruction_id: "bundle_1"
log_location:
"/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py:807"
thread: "Thread-14"
INFO:root:severity: INFO
timestamp {
seconds: 1640628791
nanos: 332963466
}
message: "State channel established."
instruction_id: "bundle_1"
log_location:
"/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py:814"
thread: "Thread-14"
INFO:root:severity: INFO
timestamp {
seconds: 1640628791
nanos: 334512233
}
message: "Creating client data channel for localhost:37509"
instruction_id: "bundle_1"
log_location:
"/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/data_plane.py:750"
thread: "Thread-14"
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running
(((((ref_AppliedPTransform_add-points-Impulse_3)+(ref_AppliedPTransform_add-points-FlatMap-lambda-at-core-py-3228-_4))+(ref_AppliedPTransform_add-points-Map-decode-_6))+(ref_AppliedPTransform_Map-get_julia_set_point_color-_7))+(ref_AppliedPTransform_x-coord-key_8))+(x
coord/Write)
INFO:root:Running
(((((ref_AppliedPTransform_add-points-Impulse_3)+(ref_AppliedPTransform_add-points-FlatMap-lambda-at-core-py-3228-_4))+(ref_AppliedPTransform_add-points-Map-decode-_6))+(ref_AppliedPTransform_Map-get_julia_set_point_color-_7))+(ref_AppliedPTransform_x-coord-key_8))+(x
coord/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (((((x
coord/Read)+(ref_AppliedPTransform_format_10))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WindowInto-WindowIntoFn-_20))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WriteBundles_21))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Pair_22))+(WriteToText/Write/WriteImpl/GroupByKey/Write)
INFO:root:Running (((((x
coord/Read)+(ref_AppliedPTransform_format_10))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WindowInto-WindowIntoFn-_20))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WriteBundles_21))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Pair_22))+(WriteToText/Write/WriteImpl/GroupByKey/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running
((WriteToText/Write/WriteImpl/GroupByKey/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Extract_24))+(ref_PCollection_PCollection_16/Write)
INFO:root:Running
((WriteToText/Write/WriteImpl/GroupByKey/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Extract_24))+(ref_PCollection_PCollection_16/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running
((ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-PreFinalize_25))+(ref_PCollection_PCollection_17/Write)
INFO:root:Running
((ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-PreFinalize_25))+(ref_PCollection_PCollection_17/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running
(ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-FinalizeWrite_26)
INFO:root:Running
(ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-FinalizeWrite_26)
INFO:root:severity: INFO
timestamp {
seconds: 1640628791
nanos: 508933544
}
message: "Starting finalize_write threads with num_shards: 1 (skipped: 0),
batches: 1, num_threads: 1"
instruction_id: "bundle_6"
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location:
"/usr/local/lib/python3.8/site-packages/apache_beam/io/filebasedsink.py:297"
thread: "Thread-14"
INFO:root:severity: INFO
timestamp {
seconds: 1640628791
nanos: 545362234
}
message: "Renamed 1 shards in 0.04 seconds."
instruction_id: "bundle_6"
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location:
"/usr/local/lib/python3.8/site-packages/apache_beam/io/filebasedsink.py:345"
thread: "Thread-14"
INFO:root:severity: INFO
timestamp {
seconds: 1640628791
nanos: 550040245
}
message: "No more requests from control plane"
log_location:
"/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py:244"
thread: "MainThread"
INFO:root:severity: INFO
timestamp {
seconds: 1640628791
nanos: 550195932
}
message: "SDK Harness waiting for in-flight requests to complete"
log_location:
"/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py:245"
thread: "MainThread"
INFO:root:severity: INFO
timestamp {
seconds: 1640628791
nanos: 550269126
}
message: "Closing all cached grpc data channels."
log_location:
"/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/data_plane.py:782"
thread: "MainThread"
INFO:root:severity: INFO
timestamp {
seconds: 1640628791
nanos: 550335645
}
message: "Closing all cached gRPC state handlers."
log_location:
"/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py:826"
thread: "MainThread"
INFO:root:severity: INFO
timestamp {
seconds: 1640628791
nanos: 550717592
}
message: "Done consuming work."
log_location:
"/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker.py:257"
thread: "MainThread"
INFO:root:severity: INFO
timestamp {
seconds: 1640628791
nanos: 550840139
}
message: "Python sdk harness exiting."
log_location:
"/usr/local/lib/python3.8/site-packages/apache_beam/runners/worker/sdk_worker_main.py:156"
thread: "MainThread"
INFO:apache_beam.runners.portability.local_job_service:Successfully completed
job in 5.558701276779175 seconds.
INFO:root:Successfully completed job in 5.558701276779175 seconds.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
> Task :sdks:python:test-suites:portable:py38:portableWordCountSparkRunnerBatch
INFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at
localhost:39507
WARNING:root:Make sure that locally built Python SDK docker image has Python
3.8 interpreter.
INFO:root:Default Python SDK image for environment is
apache/beam_python3.8_sdk:2.36.0.dev
INFO:apache_beam.runners.portability.fn_api_runner.translations:====================
<function pack_combiners at 0x7f6cf624e940> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:====================
<function lift_combiners at 0x7f6cf624e9d0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:====================
<function sort_stages at 0x7f6cf624f160> ====================
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar'
'https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/runners/spark/2/job-server/build/libs/beam-runners-spark-job-server-2.36.0-SNAPSHOT.jar'
'--spark-master-url' 'local[4]' '--artifacts-dir'
'/tmp/beam-temp1y9uf1ww/artifactsu4cwz3cn' '--job-port' '52597'
'--artifact-port' '0' '--expansion-port' '0']
INFO:apache_beam.utils.subprocess_server:b'SLF4J: Failed to load class
"org.slf4j.impl.StaticLoggerBinder".'
INFO:apache_beam.utils.subprocess_server:b'SLF4J: Defaulting to no-operation
(NOP) logger implementation'
INFO:apache_beam.utils.subprocess_server:b'SLF4J: See
http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.'
WARNING:root:Waiting for grpc channel to be ready at localhost:52597.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args:
['--parallelism=2']
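At this point the wordcount pipeline is handed to the job server listening on localhost:52597. For orientation, a minimal sketch of an equivalent portable submission from Python follows; it is only an illustration, not the exact Gradle invocation: the job endpoint comes from the log above, while the LOOPBACK environment and the tiny in-memory input are assumptions.

    # Illustrative sketch of a portable submission like the one above.
    # The job endpoint is the port reported by the Spark job server in this log;
    # everything else (environment type, input data) is assumed.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:52597',  # job server port from the log
        '--environment_type=LOOPBACK',     # run the SDK harness in the submitting process
    ])

    with beam.Pipeline(options=options) as pipeline:
        (pipeline
         | 'Create' >> beam.Create(['to be or not to be'])
         | 'Split' >> beam.FlatMap(str.split)
         | 'Count' >> beam.combiners.Count.PerElement()
         | 'Format' >> beam.MapTuple(lambda word, count: f'{word}: {count}')
         | 'Print' >> beam.Map(print))

With LOOPBACK the SDK harness runs inside the submitting process, which sidesteps the locally built Python SDK docker image mentioned in the warning above.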
INFO:apache_beam.utils.subprocess_server:b'Exception in thread
"grpc-default-executor-1" java.lang.NoClassDefFoundError:
org/apache/spark/streaming/api/java/JavaStreamingListener'
INFO:apache_beam.utils.subprocess_server:b'\tat
org.apache.beam.runners.spark.SparkJobInvoker.createJobInvocation(SparkJobInvoker.java:101)'
INFO:apache_beam.utils.subprocess_server:b'\tat
org.apache.beam.runners.spark.SparkJobInvoker.invokeWithExecutor(SparkJobInvoker.java:82)'
INFO:apache_beam.utils.subprocess_server:b'\tat
org.apache.beam.runners.jobsubmission.JobInvoker.invoke(JobInvoker.java:48)'
INFO:apache_beam.utils.subprocess_server:b'\tat
org.apache.beam.runners.jobsubmission.InMemoryJobService.run(InMemoryJobService.java:246)'
INFO:apache_beam.utils.subprocess_server:b'\tat
org.apache.beam.model.jobmanagement.v1.JobServiceGrpc$MethodHandlers.invoke(JobServiceGrpc.java:948)'
INFO:apache_beam.utils.subprocess_server:b'\tat
org.apache.beam.vendor.grpc.v1p36p0.io.grpc.stub.ServerCalls$UnaryServerCallHandler$UnaryServerCallListener.onHalfClose(ServerCalls.java:182)'
INFO:apache_beam.utils.subprocess_server:b'\tat
org.apache.beam.vendor.grpc.v1p36p0.io.grpc.PartialForwardingServerCallListener.onHalfClose(PartialForwardingServerCallListener.java:35)'
INFO:apache_beam.utils.subprocess_server:b'\tat
org.apache.beam.vendor.grpc.v1p36p0.io.grpc.ForwardingServerCallListener.onHalfClose(ForwardingServerCallListener.java:23)'
INFO:apache_beam.utils.subprocess_server:b'\tat
org.apache.beam.vendor.grpc.v1p36p0.io.grpc.ForwardingServerCallListener$SimpleForwardingServerCallListener.onHalfClose(ForwardingServerCallListener.java:40)'
INFO:apache_beam.utils.subprocess_server:b'\tat
org.apache.beam.vendor.grpc.v1p36p0.io.grpc.Contexts$ContextualizedServerCallListener.onHalfClose(Contexts.java:86)'
INFO:apache_beam.utils.subprocess_server:b'\tat
org.apache.beam.vendor.grpc.v1p36p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.halfClosed(ServerCallImpl.java:331)'
INFO:apache_beam.utils.subprocess_server:b'\tat
org.apache.beam.vendor.grpc.v1p36p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1HalfClosed.runInContext(ServerImpl.java:797)'
INFO:apache_beam.utils.subprocess_server:b'\tat
org.apache.beam.vendor.grpc.v1p36p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)'
INFO:apache_beam.utils.subprocess_server:b'\tat
org.apache.beam.vendor.grpc.v1p36p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)'
INFO:apache_beam.utils.subprocess_server:b'\tat
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)'
Traceback (most recent call last):
File "/usr/lib/python3.8/runpy.py", line 194, in _run_module_as_main
INFO:apache_beam.utils.subprocess_server:b'\tat
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)'
return _run_code(code, main_globals, None,
File "/usr/lib/python3.8/runpy.py", line 87, in _run_code
INFO:apache_beam.utils.subprocess_server:b'\tat
java.lang.Thread.run(Thread.java:748)'
INFO:apache_beam.utils.subprocess_server:b'Caused by:
java.lang.ClassNotFoundException:
org.apache.spark.streaming.api.java.JavaStreamingListener'
exec(code, run_globals)
File "https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/examples/wordcount.py", line 94, in <module>
INFO:apache_beam.utils.subprocess_server:b'\tat
java.net.URLClassLoader.findClass(URLClassLoader.java:382)'
run()
File "https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/examples/wordcount.py", line 89, in run
output | 'Write' >> WriteToText(known_args.output)
File "https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/pipeline.py", line 596, in __exit__
INFO:apache_beam.utils.subprocess_server:b'\tat
java.lang.ClassLoader.loadClass(ClassLoader.java:418)'
INFO:apache_beam.utils.subprocess_server:b'\tat
sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)'
INFO:apache_beam.utils.subprocess_server:b'\tat
java.lang.ClassLoader.loadClass(ClassLoader.java:351)'
self.result = self.run()
File "https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/pipeline.py", line 573, in run
return self.runner.run_pipeline(self, self._options)
File "https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/portability/spark_runner.py", line 47, in run_pipeline
INFO:apache_beam.utils.subprocess_server:b'\t... 17 more'
return super().run_pipeline(pipeline, options)
File "https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py", line 440, in run_pipeline
job_service_handle.submit(proto_pipeline)
File "https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py", line 115, in submit
return self.run(prepare_response.preparation_id)
File "https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py", line 241, in run
run_response = self.job_service.Run(
File "https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/2022703442/lib/python3.8/site-packages/grpc/_channel.py", line 946, in __call__
return _end_unary_response_blocking(state, call, False, None)
File "https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/2022703442/lib/python3.8/site-packages/grpc/_channel.py", line 849, in _end_unary_response_blocking
raise _InactiveRpcError(state)
grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
status = StatusCode.UNKNOWN
details = ""
debug_error_string =
"{"created":"@1640628800.694878994","description":"Error received from peer
ipv4:127.0.0.1:52597","file":"src/core/lib/surface/call.cc","file_line":1074,"grpc_message":"","grpc_status":2}"
>
> Task :sdks:python:test-suites:portable:py38:portableWordCountSparkRunnerBatch FAILED
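The Python-side grpc._channel._InactiveRpcError (StatusCode.UNKNOWN with empty details) is only the submitter's view of the failure; the underlying cause is the job server's java.lang.NoClassDefFoundError / ClassNotFoundException for org.apache.spark.streaming.api.java.JavaStreamingListener, which suggests the Spark streaming classes are missing from the job-server classpath. A hedged sketch for checking whether that class is actually bundled in the shaded jar launched above (standard library only; the local jar path is an assumption):

    # Hedged sketch: look for the missing Spark class inside the shaded
    # job-server jar that the log shows being launched.
    import zipfile

    JAR = 'beam-runners-spark-job-server-2.36.0-SNAPSHOT.jar'  # assumed local path to the jar
    MISSING = 'org/apache/spark/streaming/api/java/JavaStreamingListener.class'

    with zipfile.ZipFile(JAR) as jar:
        names = set(jar.namelist())
        print(MISSING in names)  # False would confirm the class is absent from the jar
        # List whatever Spark streaming classes are bundled, if any.
        for name in sorted(n for n in names if n.startswith('org/apache/spark/streaming/')):
            print(name)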
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task
':sdks:python:test-suites:portable:py38:portableWordCountSparkRunnerBatch'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 8.0.
You can use '--warning-mode all' to show the individual deprecation warnings
and determine if they come from your own scripts or plugins.
See
https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings
Execution optimizations have been disabled for 3 invalid unit(s) of work during
this build to ensure correctness.
Please consult deprecation warnings for more details.
BUILD FAILED in 2h 10m 31s
173 actionable tasks: 103 executed, 64 from cache, 6 up-to-date
Publishing build scan...
https://gradle.com/s/do5nharhnh3u6
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure