See <https://ci-beam.apache.org/job/beam_PostCommit_Python38/4585/display/redirect>
Changes:

------------------------------------------
[...truncated 11.94 MB...]
INFO:root:Completed job in 19.11351466178894 seconds with state DONE.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE

> Task :sdks:python:test-suites:portable:py38:portableWordCountSparkRunnerBatch
INFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at localhost:34669
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function pack_combiners at 0x7fcf3967e280> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7fcf3967e310> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7fcf3967ea60> ====================
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar' '<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/runners/spark/3/job-server/build/libs/beam-runners-spark-3-job-server-2.51.0-SNAPSHOT.jar>' '--spark-master-url' 'local[4]' '--artifacts-dir' '/tmp/beam-tempd0dwp69c/artifactsfa1mya3y' '--job-port' '49209' '--artifact-port' '0' '--expansion-port' '0']
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:49 WARN software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to retrieve the requested metadata.
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:50 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:40491
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:50 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:38145
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:50 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: JobService started on localhost:49209
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:50 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Job server now running, terminate with Ctrl+C
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:52 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Staging artifacts for job_37743532-202f-4b86-8d14-ca5964a4dc1e.
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:52 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Resolving artifacts for job_37743532-202f-4b86-8d14-ca5964a4dc1e.ref_Environment_default_environment_1.
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:52 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Getting 1 artifacts for job_37743532-202f-4b86-8d14-ca5964a4dc1e.null.
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:52 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Artifacts fully staged for job_37743532-202f-4b86-8d14-ca5964a4dc1e.
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:52 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-0913203552-3b75c03c_59aaa565-ef40-4183-87cf-a535764a3c5c
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:52 INFO org.apache.beam.runners.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-0913203552-3b75c03c_59aaa565-ef40-4183-87cf-a535764a3c5c
INFO:apache_beam.runners.portability.portable_runner:Environment "LOOPBACK" has started a component necessary for the execution. Be sure to run the pipeline using
  with Pipeline() as p:
    p.apply(..)
This ensures that the pipeline finishes before this program exits.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:53 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:53 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:54 INFO org.sparkproject.jetty.util.log: Logging initialized @7597ms to org.sparkproject.jetty.util.log.Slf4jLog
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:54 INFO org.sparkproject.jetty.server.Server: jetty-9.4.44.v20210927; built: 2021-09-27T23:02:44.612Z; git: 8da83308eeca865e495e53ef315a249d63ba9332; jvm 1.8.0_382-8u382-ga-1~20.04.1-b05
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:54 INFO org.sparkproject.jetty.server.Server: Started @7730ms
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:55 INFO org.sparkproject.jetty.server.AbstractConnector: Started ServerConnector@22858cf{HTTP/1.1, (http/1.1)}{127.0.0.1:4040}
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:55 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@41fa0196{/jobs,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:55 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3f2fd3c7{/jobs/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:55 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6f219ca{/jobs/job,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:55 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@50d0fff1{/jobs/job/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:55 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@c9afb82{/stages,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:55 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5223e18b{/stages/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:55 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1b55e717{/stages/stage,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:55 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@38f9e68d{/stages/stage/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:55 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@61b073c1{/stages/pool,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:55 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1c0ae69e{/stages/pool/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:55 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5c99d957{/storage,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:55 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@79756b17{/storage/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:55 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@79aaf4e6{/storage/rdd,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:55 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1bcb1fd3{/storage/rdd/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:55 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@eb455f{/environment,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:55 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2d86ff89{/environment/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:55 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1df2a2bb{/executors,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:55 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6b8edadd{/executors/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:55 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6ab46c74{/executors/threadDump,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:55 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@76504f18{/executors/threadDump/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:55 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2c77a62{/static,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:55 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@d6d911d{/,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:55 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@663c226f{/api,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:55 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4eb258f8{/jobs/job/kill,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:55 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2b96834b{/stages/stage/kill,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:55 INFO org.sparkproject.jetty.server.handler.ContextHandler: Started o.s.j.s.ServletContextHandler@17bdd075{/metrics/json,null,AVAILABLE,@Spark}
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:55 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:55 WARN software.amazon.awssdk.regions.internal.util.EC2MetadataUtils: Unable to retrieve the requested metadata.
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:55 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-0913203552-3b75c03c_59aaa565-ef40-4183-87cf-a535764a3c5c on Spark master local[4]
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:42889.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:57 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:57 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-2
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:46875.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:41709
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:57 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:57 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-3
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:57 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-4
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:57 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-5
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:58 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-6
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:58 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-9
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:58 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-8
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:58 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-7
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:58 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-10
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:58 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-11
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:58 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-12
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:58 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-13
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:58 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-14
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:58 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-15
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:58 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-0913203552-3b75c03c_59aaa565-ef40-4183-87cf-a535764a3c5c: Pipeline translated successfully. Computing outputs
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:58 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-16
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.01 seconds.
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:58 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-0913203552-3b75c03c_59aaa565-ef40-4183-87cf-a535764a3c5c finished.
INFO:apache_beam.utils.subprocess_server:23/09/13 20:35:58 INFO org.sparkproject.jetty.server.AbstractConnector: Stopped Spark@22858cf{HTTP/1.1, (http/1.1)}{127.0.0.1:4040}
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python3.8/threading.py", line 932, in _bootstrap_inner
Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python3.8/threading.py", line 932, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.8/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py>", line 262, in run
    self.run()
  File "/usr/lib/python3.8/threading.py", line 870, in run
    for work_request in self._control_stub.Control(get_responses()):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/2022703442/lib/python3.8/site-packages/grpc/_channel.py>", line 541, in __next__
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py>", line 1035, in pull_responses
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/2022703442/lib/python3.8/site-packages/grpc/_channel.py>", line 967, in _next
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py>", line 652, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/2022703442/lib/python3.8/site-packages/grpc/_channel.py>", line 541, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/2022703442/lib/python3.8/site-packages/grpc/_channel.py>", line 967, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "UNKNOWN:Error received from peer ipv6:%5B::1%5D:41709 {grpc_message:"Socket closed", grpc_status:14, created_time:"2023-09-13T20:35:59.11101997+00:00"}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.8/threading.py", line 932, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.8/threading.py", line 870, in run
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "recvmsg:Connection reset by peer"
	debug_error_string = "UNKNOWN:Error received from peer ipv6:%5B::1%5D:42889 {created_time:"2023-09-13T20:35:59.111019951+00:00", grpc_status:14, grpc_message:"recvmsg:Connection reset by peer"}"
>
    for response in responses:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/2022703442/lib/python3.8/site-packages/grpc/_channel.py>", line 541, in __next__
    self._target(*self._args, **self._kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py>", line 669, in <lambda>
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/2022703442/lib/python3.8/site-packages/grpc/_channel.py>", line 967, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "UNKNOWN:Error received from peer ipv6:%5B::1%5D:46875 {created_time:"2023-09-13T20:35:59.111008732+00:00", grpc_status:14, grpc_message:"Socket closed"}"
>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py>", line 652, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/2022703442/lib/python3.8/site-packages/grpc/_channel.py>", line 541, in __next__
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/2022703442/lib/python3.8/site-packages/grpc/_channel.py>", line 967, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "UNKNOWN:Error received from peer ipv6:%5B::1%5D:41709 {grpc_message:"Socket closed", grpc_status:14, created_time:"2023-09-13T20:35:59.11101997+00:00"}"
>

> Task :sdks:python:test-suites:portable:py38:postCommitPy38

> Task :sdks:python:test-suites:dataflow:py38:postCommitIT
warning: sdist: standard file not found: should have one of README, README.rst, README.txt, README.md
warning: check: missing required meta-data: url
warning: check: missing meta-data: either (author and author_email) or (maintainer and maintainer_email) must be supplied

> Task :sdks:python:test-suites:dataflow:py38:postCommitIT FAILED

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 397

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:tensorRTtests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 424

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:vertexAIInferenceTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 139

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.
You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.
See https://docs.gradle.org/7.6.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 2h 15m 46s
216 actionable tasks: 153 executed, 57 from cache, 6 up-to-date

Publishing build scan...
Publishing failed.

The response from https://ge.apache.org/scans/publish/gradle/3.13.2/token was not from Gradle Enterprise.
The specified server address may be incorrect, or your network environment may be interfering.
Please report this problem to your Gradle Enterprise administrator via https://ge.apache.org/help and include the following via copy/paste:

----------
Gradle version: 7.6.2
Plugin version: 3.13.2
Request URL: https://ge.apache.org/scans/publish/gradle/3.13.2/token
Request ID: dddb055f-0a22-456c-aa0d-166ba232fae5
Response status code: 502
Response content type: text/html
----------

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
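
P.S. The LOOPBACK advisory in the log above refers to Beam's context-manager pattern for running a pipeline. A minimal sketch of that pattern follows; the pipeline contents and options are illustrative only and are not taken from this job (which ran word count on the portable Spark runner with environment_type=LOOPBACK):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Placeholder options; pass runner/environment flags as needed,
    # e.g. --runner=PortableRunner --environment_type=LOOPBACK.
    options = PipelineOptions()

    # The "with" block calls run() and waits for the pipeline result
    # before the program exits, which is what the log message recommends.
    with beam.Pipeline(options=options) as p:
        (p
         | beam.Create(["to be or not to be"])
         | beam.FlatMap(str.split)
         | beam.combiners.Count.PerElement()
         | beam.Map(print))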
