See
<https://ci-beam.apache.org/job/beam_PostCommit_Python311/554/display/redirect>
Changes:
------------------------------------------
[...truncated 11.31 MB...]
  nanos: 104956626
}
message: "Closing all cached grpc data channels."
log_location: "/usr/local/lib/python3.11/site-packages/apache_beam/runners/worker/data_plane.py:803"
thread: "MainThread"
INFO:root:severity: INFO
timestamp {
  seconds: 1692400198
  nanos: 105038404
}
message: "Closing all cached gRPC state handlers."
log_location: "/usr/local/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker.py:904"
thread: "MainThread"
INFO:root:severity: INFO
timestamp {
  seconds: 1692400198
  nanos: 106824398
}
message: "Done consuming work."
log_location: "/usr/local/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker.py:287"
thread: "MainThread"
INFO:root:severity: INFO
timestamp {
  seconds: 1692400198
  nanos: 106920003
}
message: "Python sdk harness exiting."
log_location: "/usr/local/lib/python3.11/site-packages/apache_beam/runners/worker/sdk_worker_main.py:213"
thread: "MainThread"
b1f40fb752aa8a2e6aceedaf299cebf0c7a8fae548444e9a02114d63094d7d47
INFO:apache_beam.runners.portability.local_job_service:Completed job in
24.519991397857666 seconds with state DONE.
INFO:root:Completed job in 24.519991397857666 seconds with state DONE.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
> Task :sdks:python:test-suites:portable:py311:portableWordCountSparkRunnerBatch
INFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at
localhost:35661
INFO:apache_beam.runners.portability.fn_api_runner.translations:====================
<function pack_combiners at 0x7f11d2ccc680> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:====================
<function lift_combiners at 0x7f11d2ccc720> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:====================
<function sort_stages at 0x7f11d2cccf40> ====================
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar'
'https://ci-beam.apache.org/job/beam_PostCommit_Python311/ws/src/runners/spark/3/job-server/build/libs/beam-runners-spark-3-job-server-2.51.0-SNAPSHOT.jar'
'--spark-master-url' 'local[4]' '--artifacts-dir'
'/tmp/beam-tempwznu6a1r/artifactsc7p33sdo' '--job-port' '57719'
'--artifact-port' '0' '--expansion-port' '0']
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:07 WARN
software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to
retrieve the requested metadata.
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:08 INFO
org.apache.beam.runners.jobsubmission.JobServerDriver: ArtifactStagingService
started on localhost:46699
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:08 INFO
org.apache.beam.runners.jobsubmission.JobServerDriver: Java ExpansionService
started on localhost:46199
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:08 INFO
org.apache.beam.runners.jobsubmission.JobServerDriver: JobService started on
localhost:57719
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:08 INFO
org.apache.beam.runners.jobsubmission.JobServerDriver: Job server now running,
terminate with Ctrl+C
WARNING:root:Waiting for grpc channel to be ready at localhost:57719.
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:10 INFO
org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Staging
artifacts for job_c657a0b8-66bb-414c-a2b8-c1a0414c6016.
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:10 INFO
org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Resolving
artifacts for
job_c657a0b8-66bb-414c-a2b8-c1a0414c6016.ref_Environment_default_environment_1.
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:10 INFO
org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Getting 1
artifacts for job_c657a0b8-66bb-414c-a2b8-c1a0414c6016.null.
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:10 INFO
org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Artifacts
fully staged for job_c657a0b8-66bb-414c-a2b8-c1a0414c6016.
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:10 INFO
org.apache.beam.runners.spark.SparkJobInvoker: Invoking job
BeamApp-jenkins-0818231010-62f3d3a1_dbeddaca-955b-4506-9696-199ae6014fde
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:10 INFO
org.apache.beam.runners.jobsubmission.JobInvocation: Starting job invocation
BeamApp-jenkins-0818231010-62f3d3a1_dbeddaca-955b-4506-9696-199ae6014fde
INFO:apache_beam.runners.portability.portable_runner:Environment "LOOPBACK" has
started a component necessary for the execution. Be sure to run the pipeline
using
with Pipeline() as p:
p.apply(..)
This ensures that the pipeline finishes before this program exits.
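
For reference, the LOOPBACK note above is asking for Beam's context-manager pattern, which blocks until the job completes. Below is a minimal, hypothetical sketch: the wordcount-style transforms and pipeline options are illustrative assumptions, and only the job-server port (localhost:57719) is taken from this run.

  # Minimal sketch (assumed options): run the pipeline as a context manager
  # so the program blocks until the portable runner reports completion.
  import apache_beam as beam
  from apache_beam.options.pipeline_options import PipelineOptions

  options = PipelineOptions([
      '--runner=PortableRunner',
      '--job_endpoint=localhost:57719',  # job server port seen in this log
      '--environment_type=LOOPBACK',
  ])

  with beam.Pipeline(options=options) as p:
      (p
       | 'Create' >> beam.Create(['to be or not to be'])
       | 'Split' >> beam.FlatMap(str.split)
       | 'Pair' >> beam.Map(lambda w: (w, 1))
       | 'Count' >> beam.CombinePerKey(sum)
       | 'Print' >> beam.Map(print))

Exiting the with-block waits for the job, so the LOOPBACK worker started inside this process is not torn down while the runner still needs it.
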
INFO:apache_beam.runners.portability.portable_runner:Job state changed to
STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to
STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to
RUNNING
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:11 INFO
org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand
new Spark Context.
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:11 WARN
org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library
for your platform... using builtin-java classes where applicable
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:13 INFO
org.sparkproject.jetty.util.log: Logging initialized @9034ms to
org.sparkproject.jetty.util.log.Slf4jLog
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:13 INFO
org.sparkproject.jetty.server.Server: jetty-9.4.44.v20210927; built:
2021-09-27T23:02:44.612Z; git: 8da83308eeca865e495e53ef315a249d63ba9332; jvm
1.8.0_382-8u382-ga-1~20.04.1-b05
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:13 INFO
org.sparkproject.jetty.server.Server: Started @9192ms
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:13 INFO
org.sparkproject.jetty.server.AbstractConnector: Started
ServerConnector@74614a14{HTTP/1.1, (http/1.1)}{127.0.0.1:4040}
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:13 INFO
org.sparkproject.jetty.server.handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@286c6c76{/jobs,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:13 INFO
org.sparkproject.jetty.server.handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@6035967{/jobs/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:13 INFO
org.sparkproject.jetty.server.handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@563ca74d{/jobs/job,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:13 INFO
org.sparkproject.jetty.server.handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@23e4440d{/jobs/job/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:13 INFO
org.sparkproject.jetty.server.handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@a0c1ea2{/stages,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:13 INFO
org.sparkproject.jetty.server.handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@2833707a{/stages/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:13 INFO
org.sparkproject.jetty.server.handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@5e228bb4{/stages/stage,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:13 INFO
org.sparkproject.jetty.server.handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@54e0c6e{/stages/stage/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:13 INFO
org.sparkproject.jetty.server.handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@2e23286e{/stages/pool,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:13 INFO
org.sparkproject.jetty.server.handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@4e1d396f{/stages/pool/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:13 INFO
org.sparkproject.jetty.server.handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@5006da14{/storage,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:13 INFO
org.sparkproject.jetty.server.handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@67da35fd{/storage/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:13 INFO
org.sparkproject.jetty.server.handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@69a50dec{/storage/rdd,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:13 INFO
org.sparkproject.jetty.server.handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@428641bc{/storage/rdd/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:13 INFO
org.sparkproject.jetty.server.handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@21b944ab{/environment,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:13 INFO
org.sparkproject.jetty.server.handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@a64df11{/environment/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:13 INFO
org.sparkproject.jetty.server.handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@16af85f2{/executors,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:13 INFO
org.sparkproject.jetty.server.handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@3f5791c0{/executors/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:13 INFO
org.sparkproject.jetty.server.handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@688fe695{/executors/threadDump,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:13 INFO
org.sparkproject.jetty.server.handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@42dc0397{/executors/threadDump/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:13 INFO
org.sparkproject.jetty.server.handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@7ca58ccb{/static,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:13 INFO
org.sparkproject.jetty.server.handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@11eaf34{/,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:13 INFO
org.sparkproject.jetty.server.handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@5166315d{/api,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:13 INFO
org.sparkproject.jetty.server.handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@227e73de{/jobs/job/kill,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:13 INFO
org.sparkproject.jetty.server.handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@6479ab72{/stages/stage/kill,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:13 INFO
org.sparkproject.jetty.server.handler.ContextHandler: Started
o.s.j.s.ServletContextHandler@775cec0e{/metrics/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:14 INFO
org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics
accumulator: MetricQueryResults()
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:14 WARN
software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to
retrieve the requested metadata.
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:14 INFO
org.apache.beam.runners.spark.SparkPipelineRunner: Running job
BeamApp-jenkins-0818231010-62f3d3a1_dbeddaca-955b-4506-9696-199ae6014fde on
Spark master local[4]
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel
for localhost:34449.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with
unbounded number of workers.
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:17 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam
Fn Control client connected with id 1-1
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:17 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 1-2
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for
localhost:34031.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for
localhost:36657
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:18 INFO
org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client
connected.
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:18 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 1-3
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:19 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 1-4
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:19 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 1-5
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:19 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 1-7
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:19 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 1-9
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:20 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 1-6
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:20 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 1-8
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:20 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 1-10
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:20 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 1-11
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:20 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 1-12
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:20 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 1-13
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:20 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 1-14
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:21 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 1-15
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:21 INFO
org.apache.beam.runners.spark.SparkPipelineRunner: Job
BeamApp-jenkins-0818231010-62f3d3a1_dbeddaca-955b-4506-9696-199ae6014fde:
Pipeline translated successfully. Computing outputs
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:21 INFO
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService:
getProcessBundleDescriptor request with id 1-16
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with
num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.01 seconds.
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:21 INFO
org.apache.beam.runners.spark.SparkPipelineRunner: Job
BeamApp-jenkins-0818231010-62f3d3a1_dbeddaca-955b-4506-9696-199ae6014fde
finished.
INFO:apache_beam.utils.subprocess_server:23/08/18 23:10:21 INFO
org.sparkproject.jetty.server.AbstractConnector: Stopped
Spark@74614a14{HTTP/1.1, (http/1.1)}{127.0.0.1:4040}
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
Exception in thread read_state:
Traceback (most recent call last):
File "/usr/lib/python3.11/threading.py", line 1038, in _bootstrap_inner
Exception in thread run_worker_1-1:
Traceback (most recent call last):
File "/usr/lib/python3.11/threading.py", line 1038, in _bootstrap_inner
self.run()
File "/usr/lib/python3.11/threading.py", line 975, in run
self.run()
self._target(*self._args, **self._kwargs)
File "/usr/lib/python3.11/threading.py", line 975, in run
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Python311/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",>
line 262, in run
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data
plane.
Traceback (most recent call last):
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Python311/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",>
line 652, in _read_inputs
for elements in elements_iterator:
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Python311/ws/src/build/gradleenv/-1720702906/lib/python3.11/site-packages/grpc/_channel.py",>
line 541, in __next__
return self._next()
^^^^^^^^^^^^
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Python311/ws/src/build/gradleenv/-1720702906/lib/python3.11/site-packages/grpc/_channel.py",>
line 967, in _next
raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that
terminated with:
status = StatusCode.UNAVAILABLE
details = "Socket closed"
debug_error_string = "UNKNOWN:Error received from peer
ipv6:%5B::1%5D:36657 {grpc_message:"Socket closed", grpc_status:14,
created_time:"2023-08-18T23:10:22.549065744+00:00"}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
File "/usr/lib/python3.11/threading.py", line 1038, in _bootstrap_inner
for work_request in self._control_stub.Control(get_responses()):
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Python311/ws/src/build/gradleenv/-1720702906/lib/python3.11/site-packages/grpc/_channel.py",>
line 541, in __next__
self._target(*self._args, **self._kwargs)
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Python311/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",>
line 1035, in pull_responses
self.run()
File "/usr/lib/python3.11/threading.py", line 975, in run
self._target(*self._args, **self._kwargs)
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Python311/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",>
line 669, in <lambda>
for response in responses:
return self._next()
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Python311/ws/src/build/gradleenv/-1720702906/lib/python3.11/site-packages/grpc/_channel.py",>
line 541, in __next__
^^^^^^^^^^^^
target=lambda: self._read_inputs(elements_iterator),
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Python311/ws/src/build/gradleenv/-1720702906/lib/python3.11/site-packages/grpc/_channel.py",>
line 967, in _next
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
return self._next()
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Python311/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",>
line 652, in _read_inputs
^^^^^^^^^^^^
for elements in elements_iterator:
raise self
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Python311/ws/src/build/gradleenv/-1720702906/lib/python3.11/site-packages/grpc/_channel.py",>
line 967, in _next
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Python311/ws/src/build/gradleenv/-1720702906/lib/python3.11/site-packages/grpc/_channel.py",>
line 541, in __next__
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that
terminated with:
status = StatusCode.UNAVAILABLE
details = "Socket closed"
debug_error_string = "UNKNOWN:Error received from peer
ipv6:%5B::1%5D:34449 {created_time:"2023-08-18T23:10:22.549144784+00:00",
grpc_status:14, grpc_message:"Socket closed"}"
>
return self._next()
^^^^^^^^^^^^
File
"<https://ci-beam.apache.org/job/beam_PostCommit_Python311/ws/src/build/gradleenv/-1720702906/lib/python3.11/site-packages/grpc/_channel.py",>
line 967, in _next
raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that
terminated with:
status = StatusCode.UNAVAILABLE
details = "Socket closed"
debug_error_string = "UNKNOWN:Error received from peer
ipv6:%5B::1%5D:34031 {created_time:"2023-08-18T23:10:22.54910244+00:00",
grpc_status:14, grpc_message:"Socket closed"}"
>
raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that
terminated with:
status = StatusCode.UNAVAILABLE
details = "Socket closed"
debug_error_string = "UNKNOWN:Error received from peer
ipv6:%5B::1%5D:36657 {grpc_message:"Socket closed", grpc_status:14,
created_time:"2023-08-18T23:10:22.549065744+00:00"}"
>
> Task :sdks:python:test-suites:portable:py311:postCommitPy311
FAILURE: Build failed with an exception.
* Where:
Build file
'https://ci-beam.apache.org/job/beam_PostCommit_Python311/ws/src/sdks/python/build.gradle'
line: 96
* What went wrong:
Execution failed for task ':sdks:python:bdistPy311linux'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 8.0.
You can use '--warning-mode all' to show the individual deprecation warnings
and determine if they come from your own scripts or plugins.
See
https://docs.gradle.org/7.6.2/userguide/command_line_interface.html#sec:command_line_warnings
Execution optimizations have been disabled for 1 invalid unit(s) of work during
this build to ensure correctness.
Please consult deprecation warnings for more details.
BUILD FAILED in 21m 58s
207 actionable tasks: 142 executed, 59 from cache, 6 up-to-date
Publishing build scan...
Publishing failed.
The response from https://ge.apache.org/scans/publish/gradle/3.13.2/token was
not from Gradle Enterprise.
The specified server address may be incorrect, or your network environment may
be interfering.
Please report this problem to your Gradle Enterprise administrator via
https://ge.apache.org/help and include the following via copy/paste:
----------
Gradle version: 7.6.2
Plugin version: 3.13.2
Request URL: https://ge.apache.org/scans/publish/gradle/3.13.2/token
Request ID: ca309c43-d233-40d4-bb52-bbbcf6241b41
Response status code: 502
Response content type: text/html
----------
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure