See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1650/display/redirect>
Changes:
------------------------------------------
[...truncated 1.31 MB...]
19/11/29 00:15:08 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34953.
19/11/29 00:15:08 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 00:15:08 INFO data_plane.create_data_channel: Creating client data channel for localhost:35509
19/11/29 00:15:08 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 00:15:08 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1" payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"
19/11/29 00:15:08 INFO sdk_worker.run: No more requests from control plane
19/11/29 00:15:08 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 00:15:08 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 00:15:08 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 00:15:08 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 00:15:08 INFO sdk_worker.run: Done consuming work.
19/11/29 00:15:08 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 00:15:08 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 00:15:08 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 00:15:08 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/29 00:15:09 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/29 00:15:09 INFO sdk_worker_main.main: Logging handler created.
19/11/29 00:15:09 INFO sdk_worker_main.start: Status HTTP server running at localhost:42109
19/11/29 00:15:09 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 00:15:09 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail.
19/11/29 00:15:09 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574986506.55_ed89aacb-d6dd-465d-a448-52124f6bd9b4', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks']
19/11/29 00:15:09 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574986506.55', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:48649', 'job_port': u'0'}
19/11/29 00:15:09 INFO statecache.__init__: Creating state cache with size 0
19/11/29 00:15:09 INFO sdk_worker.__init__: Creating insecure control channel for localhost:41497.
19/11/29 00:15:09 INFO sdk_worker.__init__: Control channel established.
19/11/29 00:15:09 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 00:15:09 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/29 00:15:09 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44863.
19/11/29 00:15:09 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 00:15:09 INFO data_plane.create_data_channel: Creating client data channel for localhost:33501
19/11/29 00:15:09 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 00:15:09 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1" payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"
19/11/29 00:15:09 INFO sdk_worker.run: No more requests from control plane
19/11/29 00:15:09 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 00:15:09 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 00:15:09 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 00:15:09 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 00:15:09 INFO sdk_worker.run: Done consuming work.
19/11/29 00:15:09 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 00:15:09 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 00:15:09 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 00:15:09 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/29 00:15:09 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/29 00:15:09 INFO sdk_worker_main.main: Logging handler created.
19/11/29 00:15:09 INFO sdk_worker_main.start: Status HTTP server running at localhost:43469
19/11/29 00:15:09 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 00:15:09 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail.
19/11/29 00:15:09 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574986506.55_ed89aacb-d6dd-465d-a448-52124f6bd9b4', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks']
19/11/29 00:15:09 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574986506.55', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:48649', 'job_port': u'0'}
19/11/29 00:15:09 INFO statecache.__init__: Creating state cache with size 0
19/11/29 00:15:09 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46185.
19/11/29 00:15:09 INFO sdk_worker.__init__: Control channel established.
19/11/29 00:15:09 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 00:15:09 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/29 00:15:10 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42357.
19/11/29 00:15:10 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 00:15:10 INFO data_plane.create_data_channel: Creating client data channel for localhost:45441
19/11/29 00:15:10 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 00:15:10 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1" payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"
19/11/29 00:15:10 INFO sdk_worker.run: No more requests from control plane
19/11/29 00:15:10 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 00:15:10 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 00:15:10 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 00:15:10 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 00:15:10 INFO sdk_worker.run: Done consuming work.
19/11/29 00:15:10 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 00:15:10 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 00:15:10 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 00:15:10 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/29 00:15:10 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/29 00:15:10 INFO sdk_worker_main.main: Logging handler created.
19/11/29 00:15:10 INFO sdk_worker_main.start: Status HTTP server running at localhost:41001
19/11/29 00:15:10 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 00:15:10 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail.
19/11/29 00:15:10 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574986506.55_ed89aacb-d6dd-465d-a448-52124f6bd9b4', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks']
19/11/29 00:15:10 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574986506.55', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:48649', 'job_port': u'0'}
19/11/29 00:15:10 INFO statecache.__init__: Creating state cache with size 0
19/11/29 00:15:10 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36987.
19/11/29 00:15:10 INFO sdk_worker.__init__: Control channel established.
19/11/29 00:15:10 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/29 00:15:10 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 00:15:10 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36841.
19/11/29 00:15:10 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 00:15:10 INFO data_plane.create_data_channel: Creating client data channel for localhost:46021
19/11/29 00:15:10 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 00:15:10 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1" payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"
19/11/29 00:15:10 INFO sdk_worker.run: No more requests from control plane
19/11/29 00:15:10 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 00:15:10 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 00:15:10 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 00:15:10 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 00:15:10 INFO sdk_worker.run: Done consuming work.
19/11/29 00:15:10 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 00:15:10 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 00:15:10 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 00:15:11 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/29 00:15:11 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/29 00:15:11 INFO sdk_worker_main.main: Logging handler created.
19/11/29 00:15:11 INFO sdk_worker_main.start: Status HTTP server running at localhost:45481
19/11/29 00:15:11 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 00:15:11 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail.
19/11/29 00:15:11 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574986506.55_ed89aacb-d6dd-465d-a448-52124f6bd9b4', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks']
19/11/29 00:15:11 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574986506.55', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:48649', 'job_port': u'0'}
19/11/29 00:15:11 INFO statecache.__init__: Creating state cache with size 0
19/11/29 00:15:11 INFO sdk_worker.__init__: Creating insecure control channel for localhost:37199.
19/11/29 00:15:11 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/29 00:15:11 INFO sdk_worker.__init__: Control channel established.
19/11/29 00:15:11 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 00:15:11 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35381.
19/11/29 00:15:11 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 00:15:11 INFO data_plane.create_data_channel: Creating client data channel for localhost:36523
19/11/29 00:15:11 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 00:15:11 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1" payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"
19/11/29 00:15:11 INFO sdk_worker.run: No more requests from control plane
19/11/29 00:15:11 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 00:15:11 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 00:15:11 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 00:15:11 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 00:15:11 INFO sdk_worker.run: Done consuming work.
19/11/29 00:15:11 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 00:15:11 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 00:15:11 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 00:15:11 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574986506.55_ed89aacb-d6dd-465d-a448-52124f6bd9b4 finished.
19/11/29 00:15:11 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/29 00:15:11 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_4c87db9a-2362-43a4-8fc0-3bf93b5d00d7","basePath":"/tmp/sparktestfCu1th"}: {}
java.io.FileNotFoundException: /tmp/sparktestfCu1th/job_4c87db9a-2362-43a4-8fc0-3bf93b5d00d7/MANIFEST (No such file or directory)
    at java.io.FileInputStream.open0(Native Method)
    at java.io.FileInputStream.open(FileInputStream.java:195)
    at java.io.FileInputStream.<init>(FileInputStream.java:138)
    at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
    at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
    at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
    at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
    at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
    at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
    at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
    at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
    at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
    at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
    at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
    at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
==================== Timed out after 60 seconds. ====================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
# Thread: <Thread(wait_until_finish_read, started daemon 140161978164992)>
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
# Thread: <Thread(Thread-120, started daemon 140161986557696)>
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574986497.17_4451bb5c-06f7-4f54-9832-4c2cd5f4e1e5 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
# Thread: <_MainThread(MainThread, started 140162776860416)>

----------------------------------------------------------------------
Ran 38 tests in 299.835s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace.
Run with --info or --debug option to get more log output.
Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 35s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
Publishing failed.

The response from https://scans-in.gradle.com/in/5.2.1/2.3 was not from the build scan server.
Your network environment may be interfering, or the service may be unavailable.

If you believe this to be in error, please report this problem via https://gradle.com/scans/help/plugin and include the following via copy/paste:

----------
Gradle version: 5.2.1
Plugin version: 2.3
Request URL: https://scans-in.gradle.com/in/5.2.1/2.3
Request ID: 7392d0df-69d3-4bd2-9090-2d798b99440a
Response status code: 502
Response content type: text/html; charset=UTF-8
Response server type: cloudflare
----------

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
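
Note on the "Timed out after 60 seconds." failures above: the BaseException and the interleaved "# Thread: <...>" lines come from a per-test watchdog in portable_runner_test.py (the "handler" frame at line 75) that fires while wait_until_finish is still blocked on the gRPC state stream. A minimal sketch of that kind of alarm-based watchdog, for orientation only; the names below are illustrative and are not the actual Beam test helper:

    import signal
    import sys
    import threading
    import traceback

    def install_test_timeout(seconds=60):
        """Illustrative sketch: abort a hung test after `seconds` seconds."""
        def handler(signum, frame):
            msg = 'Timed out after %s seconds.' % seconds
            print('==================== %s ====================' % msg)
            # Dump the live threads so blocked wait_until_finish / gRPC calls are
            # visible; this is why "# Thread: <...>" lines appear inside the
            # tracebacks in the test output above.
            for t in threading.enumerate():
                print('# Thread: %s' % t)
            for stack in sys._current_frames().values():
                traceback.print_stack(stack)
            # Raise BaseException (not Exception) so ordinary try/except blocks
            # in the test or SDK code cannot swallow the timeout.
            raise BaseException(msg)
        signal.signal(signal.SIGALRM, handler)
        signal.alarm(seconds)

With that in mind, the substantive failure to chase from this run is the second error's root cause reported by the runner: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.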
