See
<https://ci-beam.apache.org/job/beam_PostCommit_Python36/4406/display/redirect>
Changes:
------------------------------------------
[...truncated 14.72 MB...]
thread: "Thread-14"
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running
(((((ref_AppliedPTransform_add-points-Impulse_3)+(ref_AppliedPTransform_add-points-FlatMap-lambda-at-core-py-2965-_4))+(ref_AppliedPTransform_add-points-Map-decode-_6))+(ref_AppliedPTransform_Map-get_julia_set_point_color-_7))+(ref_AppliedPTransform_x-coord-key_8))+(x coord/Write)
INFO:root:Running
(((((ref_AppliedPTransform_add-points-Impulse_3)+(ref_AppliedPTransform_add-points-FlatMap-lambda-at-core-py-2965-_4))+(ref_AppliedPTransform_add-points-Map-decode-_6))+(ref_AppliedPTransform_Map-get_julia_set_point_color-_7))+(ref_AppliedPTransform_x-coord-key_8))+(x coord/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (((((x coord/Read)+(ref_AppliedPTransform_format_10))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WindowInto-WindowIntoFn-_20))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WriteBundles_21))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Pair_22))+(WriteToText/Write/WriteImpl/GroupByKey/Write)
INFO:root:Running (((((x coord/Read)+(ref_AppliedPTransform_format_10))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WindowInto-WindowIntoFn-_20))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WriteBundles_21))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Pair_22))+(WriteToText/Write/WriteImpl/GroupByKey/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running
((WriteToText/Write/WriteImpl/GroupByKey/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Extract_24))+(ref_PCollection_PCollection_16/Write)
INFO:root:Running
((WriteToText/Write/WriteImpl/GroupByKey/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Extract_24))+(ref_PCollection_PCollection_16/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running
((ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-PreFinalize_25))+(ref_PCollection_PCollection_17/Write)
INFO:root:Running
((ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-PreFinalize_25))+(ref_PCollection_PCollection_17/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running
(ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-FinalizeWrite_26)
INFO:root:Running
(ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-FinalizeWrite_26)
INFO:root:severity: INFO
timestamp {
seconds: 1632680313
nanos: 478435993
}
message: "Starting finalize_write threads with num_shards: 1 (skipped: 0),
batches: 1, num_threads: 1"
instruction_id: "bundle_6"
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location:
"/usr/local/lib/python3.6/site-packages/apache_beam/io/filebasedsink.py:303"
thread: "Thread-14"
INFO:root:severity: INFO
timestamp {
seconds: 1632680313
nanos: 603111267
}
message: "Renamed 1 shards in 0.12 seconds."
instruction_id: "bundle_6"
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location:
"/usr/local/lib/python3.6/site-packages/apache_beam/io/filebasedsink.py:348"
thread: "Thread-14"
INFO:root:severity: INFO
timestamp {
seconds: 1632680313
nanos: 622994661
}
message: "No more requests from control plane"
log_location:
"/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py:256"
thread: "MainThread"
INFO:root:severity: INFO
timestamp {
seconds: 1632680313
nanos: 623170852
}
message: "SDK Harness waiting for in-flight requests to complete"
log_location:
"/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py:257"
thread: "MainThread"
INFO:root:severity: INFO
timestamp {
seconds: 1632680313
nanos: 623262405
}
message: "Closing all cached grpc data channels."
log_location:
"/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/data_plane.py:783"
thread: "MainThread"
INFO:root:severity: INFO
timestamp {
seconds: 1632680313
nanos: 623331069
}
message: "Closing all cached gRPC state handlers."
log_location:
"/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py:902"
thread: "MainThread"
INFO:root:severity: INFO
timestamp {
seconds: 1632680313
nanos: 623736143
}
message: "Done consuming work."
log_location:
"/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py:269"
thread: "MainThread"
INFO:root:severity: INFO
timestamp {
seconds: 1632680313
nanos: 623834371
}
message: "Python sdk harness exiting."
log_location:
"/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker_main.py:154"
thread: "MainThread"
INFO:apache_beam.runners.portability.local_job_service:Successfully completed
job in 6.344455003738403 seconds.
INFO:root:Successfully completed job in 6.344455003738403 seconds.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
> Task :sdks:python:test-suites:portable:py36:portableWordCountSparkRunnerBatch
INFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at
localhost:41321
WARNING:root:Make sure that locally built Python SDK docker image has Python
3.6 interpreter.
INFO:root:Default Python SDK image for environment is
apache/beam_python3.6_sdk:2.34.0.dev
INFO:apache_beam.runners.portability.fn_api_runner.translations:====================
<function pack_combiners at 0x7ff1bd4eb6a8> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:====================
<function lift_combiners at 0x7ff1bd4eb730> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:====================
<function sort_stages at 0x7ff1bd4ebe18> ====================
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar'
'https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/runners/spark/2/job-server/build/libs/beam-runners-spark-job-server-2.34.0-SNAPSHOT.jar'
'--spark-master-url' 'local[4]' '--artifacts-dir'
'/tmp/beam-temp9ttxyqt_/artifactscnk1qmbb' '--job-port' '53085'
'--artifact-port' '0' '--expansion-port' '0']
INFO:apache_beam.utils.subprocess_server:b'21/09/26 18:18:38 INFO
org.apache.beam.runners.jobsubmission.JobServerDriver: ArtifactStagingService
started on localhost:41253'
INFO:apache_beam.utils.subprocess_server:b'21/09/26 18:18:39 INFO
org.apache.beam.runners.jobsubmission.JobServerDriver: Java ExpansionService
started on localhost:37135'
INFO:apache_beam.utils.subprocess_server:b'21/09/26 18:18:39 INFO
org.apache.beam.runners.jobsubmission.JobServerDriver: JobService started on
localhost:53085'
INFO:apache_beam.utils.subprocess_server:b'21/09/26 18:18:39 INFO
org.apache.beam.runners.jobsubmission.JobServerDriver: Job server now running,
terminate with Ctrl+C'
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args:
['--parallelism=2']
INFO:apache_beam.utils.subprocess_server:b'21/09/26 18:18:40 INFO
org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Staging
artifacts for job_07c536ad-30d2-4fb1-b8d5-a5b06023f06f.'
INFO:apache_beam.utils.subprocess_server:b'21/09/26 18:18:40 INFO
org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Resolving
artifacts for
job_07c536ad-30d2-4fb1-b8d5-a5b06023f06f.ref_Environment_default_environment_1.'
INFO:apache_beam.utils.subprocess_server:b'21/09/26 18:18:40 INFO
org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Getting 1
artifacts for job_07c536ad-30d2-4fb1-b8d5-a5b06023f06f.null.'
INFO:apache_beam.utils.subprocess_server:b'21/09/26 18:18:40 INFO
org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Artifacts
fully staged for job_07c536ad-30d2-4fb1-b8d5-a5b06023f06f.'
INFO:apache_beam.utils.subprocess_server:b'21/09/26 18:18:40 INFO
org.apache.beam.runners.spark.SparkJobInvoker: Invoking job
BeamApp-jenkins-0926181840-682889e2_55a47245-1d33-47ec-b49d-dde025a60a5d'
INFO:apache_beam.utils.subprocess_server:b'21/09/26 18:18:40 INFO
org.apache.beam.runners.jobsubmission.JobInvocation: Starting job invocation
BeamApp-jenkins-0926181840-682889e2_55a47245-1d33-47ec-b49d-dde025a60a5d'
INFO:apache_beam.runners.portability.portable_runner:Environment "LOOPBACK" has
started a component necessary for the execution. Be sure to run the pipeline
using
  with Pipeline() as p:
    p.apply(..)
This ensures that the pipeline finishes before this program exits.
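The LOOPBACK warning above refers to Beam's context-manager pattern. As a rough illustration of why `with Pipeline() as p:` guarantees completion (SketchPipeline below is a hypothetical stand-in, not the real apache_beam.Pipeline API), the `with` block's exit hook is what runs the job and blocks until it finishes:

```python
class SketchPipeline:
    """Minimal stand-in sketching beam.Pipeline's context-manager behavior."""

    def __init__(self):
        self.finished = False

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc, tb):
        # On a clean exit from the `with` block, run the pipeline and block
        # until it completes, so worker components (e.g. the LOOPBACK worker
        # pool) are shut down before the process exits.
        if exc_type is None:
            self.run().wait_until_finish()

    def run(self):
        # Real Beam returns a PipelineResult; the sketch reuses self.
        return self

    def wait_until_finish(self):
        self.finished = True
        return "DONE"


with SketchPipeline() as p:
    pass  # transforms would be applied here

print(p.finished)  # → True: the job completed before control left the block
```

Without the `with` block (or an explicit `p.run().wait_until_finish()`), the program can exit while the job and its LOOPBACK components are still running, which is what this log message warns about.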
INFO:apache_beam.runners.portability.portable_runner:Job state changed to
STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to
STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to
RUNNING
INFO:apache_beam.utils.subprocess_server:b'21/09/26 18:18:40 INFO
org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand
new Spark Context.'
INFO:apache_beam.utils.subprocess_server:b'21/09/26 18:18:41 WARN
org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library
for your platform... using builtin-java classes where applicable'
INFO:apache_beam.utils.subprocess_server:b'21/09/26 18:18:41 INFO
org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated
aggregators accumulator:'
INFO:apache_beam.utils.subprocess_server:b'21/09/26 18:18:41 INFO
org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics
accumulator: MetricQueryResults()'
INFO:apache_beam.utils.subprocess_server:b'21/09/26 18:18:42 INFO
org.apache.beam.runners.spark.SparkPipelineRunner: Running job
BeamApp-jenkins-0926181840-682889e2_55a47245-1d33-47ec-b49d-dde025a60a5d on
Spark master local[4]'
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel
for localhost:38289.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with
unbounded number of workers.
INFO:apache_beam.utils.subprocess_server:b'21/09/26 18:18:43 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam
Fn Control client connected with id 1-1'
INFO:apache_beam.utils.subprocess_server:b'21/09/26 18:18:43 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 1-2'
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for
localhost:43749.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for
localhost:34773
INFO:apache_beam.utils.subprocess_server:b'21/09/26 18:18:43 INFO
org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client
connected.'
INFO:apache_beam.utils.subprocess_server:b'21/09/26 18:18:43 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 1-3'
INFO:apache_beam.utils.subprocess_server:b'21/09/26 18:18:43 WARN
org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions:
Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not
consistent with equals. That might cause issues on some runners.'
INFO:apache_beam.utils.subprocess_server:b'21/09/26 18:18:43 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 1-4'
INFO:apache_beam.utils.subprocess_server:b'21/09/26 18:18:43 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 1-5'
INFO:apache_beam.utils.subprocess_server:b'21/09/26 18:18:43 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 1-7'
INFO:apache_beam.utils.subprocess_server:b'21/09/26 18:18:43 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 1-9'
INFO:apache_beam.utils.subprocess_server:b'21/09/26 18:18:43 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 1-6'
INFO:apache_beam.utils.subprocess_server:b'21/09/26 18:18:43 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 1-8'
INFO:apache_beam.utils.subprocess_server:b'21/09/26 18:18:43 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 1-10'
INFO:apache_beam.utils.subprocess_server:b'21/09/26 18:18:43 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 1-11'
INFO:apache_beam.utils.subprocess_server:b'21/09/26 18:18:43 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 1-12'
INFO:apache_beam.utils.subprocess_server:b'21/09/26 18:18:43 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 1-13'
INFO:apache_beam.utils.subprocess_server:b'21/09/26 18:18:43 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 1-14'
INFO:apache_beam.utils.subprocess_server:b'21/09/26 18:18:43 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 1-15'
INFO:apache_beam.utils.subprocess_server:b'21/09/26 18:18:44 INFO
org.apache.beam.runners.spark.SparkPipelineRunner: Job
BeamApp-jenkins-0926181840-682889e2_55a47245-1d33-47ec-b49d-dde025a60a5d:
Pipeline translated successfully. Computing outputs'
INFO:apache_beam.utils.subprocess_server:b'21/09/26 18:18:44 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 1-16'
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with
num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.10 seconds.
INFO:apache_beam.utils.subprocess_server:b'21/09/26 18:18:44 INFO
org.apache.beam.runners.spark.SparkPipelineRunner: Job
BeamApp-jenkins-0926181840-682889e2_55a47245-1d33-47ec-b49d-dde025a60a5d
finished.'
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py", line 635, in _read_inputs
    for elements in elements_iterator:
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py", line 426, in __next__
    return self._next()
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py", line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1632680324.661319575","description":"Error received from peer ipv4:127.0.0.1:34773","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.6/threading.py", line 916, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.6/threading.py", line 864, in run
    self._target(*self._args, **self._kwargs)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py", line 652, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py", line 635, in _read_inputs
    for elements in elements_iterator:
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py", line 426, in __next__
    return self._next()
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py", line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1632680324.661319575","description":"Error received from peer ipv4:127.0.0.1:34773","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python3.6/threading.py", line 916, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.6/threading.py", line 864, in run
    self._target(*self._args, **self._kwargs)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py", line 246, in run
    for work_request in self._control_stub.Control(get_responses()):
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py", line 426, in __next__
    return self._next()
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py", line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1632680324.661346682","description":"Error received from peer ipv4:127.0.0.1:38289","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python3.6/threading.py", line 916, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.6/threading.py", line 864, in run
    self._target(*self._args, **self._kwargs)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py", line 1033, in pull_responses
    for response in responses:
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py", line 426, in __next__
    return self._next()
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py", line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1632680324.661354861","description":"Error received from peer ipv4:127.0.0.1:43749","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"Socket closed","grpc_status":14}"
>
FAILURE: Build failed with an exception.
* Where:
Script 'https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/test-suites/portable/common.gradle' line: 225
* What went wrong:
Execution failed for task
':sdks:python:test-suites:portable:py36:postCommitPy36IT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 1h 51m 15s
211 actionable tasks: 147 executed, 60 from cache, 4 up-to-date
Publishing build scan...
https://gradle.com/s/z3hwouz44k22y
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure