See <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/263/display/redirect>
Changes:
------------------------------------------
[...truncated 3.45 MB...]
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/bigquery_tools.py>", line 605, in _start_query_job'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' return self._start_job(request)'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/bigquery_tools.py>", line 551, in _start_job'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' response = self.client.jobs.Insert(request, upload=upload)'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/internal/clients/bigquery/bigquery_v2_client.py>", line 345, in Insert'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' upload=upload, upload_config=upload_config)'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/apitools/base/py/base_api.py>", line 731, in _RunMethod'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' return self.ProcessHttpResponse(method_config, http_response, request)'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/apitools/base/py/base_api.py>", line 737, in ProcessHttpResponse'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' self.__ProcessHttpResponse(method_config, http_response, request))'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/apitools/base/py/base_api.py>", line 604, in __ProcessHttpResponse'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' http_response, method_config=method_config, request=request)'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b"apitools.base.py.exceptions.HttpForbiddenError: HttpError accessing <https://bigquery.googleapis.com/bigquery/v2/projects/apache-beam-testing/jobs?alt=json>: response: <{'vary': 'Origin, X-Origin, Referer', 'content-type': 'application/json; charset=UTF-8', 'date': 'Thu, 14 Apr 2022 07:32:07 GMT', 'server': 'ESF', 'cache-control': 'private', 'x-xss-protection': '0', 'x-frame-options': 'SAMEORIGIN', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'status': '403', 'content-length': '528', '-content-encoding': 'gzip'}>, content <{"
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' "error": {'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' "code": 403,'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' "message": "Access Denied: Table bigquery-samples:airline_ontime_data.flights: User does not have permission to query table bigquery-samples:airline_ontime_data.flights.",'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' "errors": ['
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' {'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' "message": "Access Denied: Table bigquery-samples:airline_ontime_data.flights: User does not have permission to query table bigquery-samples:airline_ontime_data.flights.",'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' "domain": "global",'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' "reason": "accessDenied"'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' }'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' ],'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' "status": "PERMISSION_DENIED"'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' }'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'}'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'>'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b''
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'During handling of the above exception, another exception occurred:'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b''
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Traceback (most recent call last):'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py>", line 267, in _execute'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' response = task()'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py>", line 340, in <lambda>'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' lambda: self.create_worker().do_instruction(request), request)'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py>", line 581, in do_instruction'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' getattr(request, request_type), request.instruction_id)'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py>", line 618, in process_bundle'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' bundle_processor.process_bundle(instruction_id))'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/runners/worker/bundle_processor.py>", line 996, in process_bundle'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' element.data)'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/runners/worker/bundle_processor.py>", line 221, in process_encoded'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' self.output(decoded_value)'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/runners/worker/operations.py>", line 348, in output'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' cython.cast(Receiver, self.receivers[output_index]).receive(windowed_value)'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/runners/worker/operations.py>", line 215, in receive'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' self.consumer.process(windowed_value)'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/runners/worker/operations.py>", line 708, in process'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' delayed_applications = self.dofn_runner.process(o)'
INFO apache_beam.io.gcp.gcsio:gcsio.py:575 Finished listing 0 files in 0.033957481384277344 seconds.
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/runners/common.py>", line 1200, in process'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' self._reraise_augmented(exn)'
INFO apache_beam.utils.subprocess_server:subprocess_server.py:125 b' File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/runners/common.py>", line 1265, in _reraise_augmented'
--------------------------- Captured stdout teardown ---------------------------
FAILED [ 95%]
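The failing test is hitting a 403 from BigQuery's jobs.insert endpoint: the credentials running the Spark example suite cannot query bigquery-samples:airline_ontime_data.flights. A quick standalone way to confirm whether the active credentials can see that table is a dry-run query with the google-cloud-bigquery client. This is a minimal illustrative sketch, not part of the test suite; it assumes google-cloud-bigquery is installed and that the environment uses the same service-account credentials as the CI workers (the project name is the one from the request URL above):

    # Hedged sketch: check whether the active credentials can query the table
    # the examples read. A 403/Forbidden here reproduces the "Access Denied"
    # failure in the traceback above without running an actual job.
    from google.cloud import bigquery

    client = bigquery.Client(project="apache-beam-testing")
    job_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
    try:
        client.query(
            "SELECT 1 FROM `bigquery-samples.airline_ontime_data.flights` LIMIT 1",
            job_config=job_config)
        print("Credentials can query bigquery-samples.airline_ontime_data.flights")
    except Exception as exc:
        print("Access check failed:", exc)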
=============================== warnings summary ===============================
apache_beam/io/filesystems_test.py:54
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/filesystems_test.py>:54: DeprecationWarning: invalid escape sequence \c
    self.assertIsNone(FileSystems.get_scheme('c:\\abc\cdf')) # pylint: disable=anomalous-backslash-in-string

apache_beam/io/filesystems_test.py:62
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/filesystems_test.py>:62: DeprecationWarning: invalid escape sequence \d
    self.assertTrue(isinstance(FileSystems.get_filesystem('c:\\abc\def'), # pylint: disable=anomalous-backslash-in-string

apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:63: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    dataset_ref = client.dataset(unique_dataset_name, project=project)

apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2143: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    is_streaming_pipeline = p.options.view_as(StandardOptions).streaming

apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2149: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    experiments = p.options.view_as(DebugOptions).experiments or []

apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1128: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location

apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1130: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')

apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2443: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = pcoll.pipeline.options.view_as(

apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2445: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2476: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    | _PassThroughThenCleanup(files_to_remove_pcoll))

apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2139: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    self.table_reference.projectId = pcoll.pipeline.options.view_as(

apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)

apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:47: FutureWarning: Dropping of nuisance columns in DataFrame reductions (with 'numeric_only=None') is deprecated; in a future version this will raise TypeError. Select only valid columns before calling the reduction.
    return airline_df[at_top_airports].mean()

apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
apache_beam/examples/dataframe/wordcount_test.py::WordCountTest::test_basics
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/dataframe/io.py>:632: FutureWarning: WriteToFiles is experimental.
    sink=lambda _: _WriteToPandasFileSink(

apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
apache_beam/examples/dataframe/wordcount_test.py::WordCountTest::test_basics
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/fileio.py>:550: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).temp_location or

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/pytest_postCommitExamples-spark-py37.xml> -
= 1 failed, 21 passed, 1 skipped, 5293 deselected, 39 warnings in 264.96 seconds =
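Most of the warnings above are pre-existing deprecations rather than part of this failure, but three of them state an obvious replacement: the invalid escape sequences in filesystems_test.py, the Client.dataset calls in io/gcp/tests/utils.py, and the nuisance-column reduction flagged in examples/dataframe/flight_delays.py. The sketch below is illustrative only and is not a patch against those files; all values are made-up placeholders, and the pandas fragment uses a toy in-memory frame rather than the example's deferred DataFrame:

    # Hedged sketch of the replacements suggested by the warnings above.
    from google.cloud import bigquery
    import pandas as pd

    # Raw strings avoid the DeprecationWarning for invalid escape sequences
    # such as \c and \d in Windows-style paths:
    windows_path = r'c:\abc\cdf'  # instead of 'c:\\abc\cdf'

    # google-cloud-bigquery: build a DatasetReference directly instead of the
    # deprecated Client.dataset() helper (IDs below are placeholders):
    dataset_ref = bigquery.DatasetReference('apache-beam-testing', 'example_dataset')
    table_ref = dataset_ref.table('example_table')

    # pandas: restrict the reduction to numeric columns so the nuisance-column
    # FutureWarning no longer applies (mirrors the mean() call in flight_delays.py):
    airline_df = pd.DataFrame({'airline': ['AA', 'EV'], 'departure_delay': [3.5, 7.0]})
    at_top_airports = airline_df['airline'].isin(['AA', 'EV'])
    mean_delays = airline_df[at_top_airports].mean(numeric_only=True)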
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next raise self grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with: status = StatusCode.CANCELLED details = "Multiplexer hanging up" debug_error_string = "{"created":"@1649921461.389757139","description":"Error received from peer ipv6:[::1]:41915","file":"src/core/lib/surface/call.cc","file_line":903,"grpc_message":"Multiplexer hanging up","grpc_status":1}" > Exception in thread read_grpc_client_inputs: Traceback (most recent call last): File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner self.run() File "/usr/lib/python3.7/threading.py", line 870, in run self._target(*self._args, **self._kwargs) File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 651, in <lambda> target=lambda: self._read_inputs(elements_iterator), File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 634, in _read_inputs for elements in elements_iterator: File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 426, in __next__ return self._next() File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py",> line 826, in _next raise self grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with: status = StatusCode.CANCELLED details = "Multiplexer hanging up" debug_error_string = "{"created":"@1649921477.525501674","description":"Error received from peer ipv6:[::1]:46363","file":"src/core/lib/surface/call.cc","file_line":903,"grpc_message":"Multiplexer hanging up","grpc_status":1}" > > Task :sdks:python:test-suites:portable:py37:sparkExamples FAILED FAILURE: Build completed with 2 failures. 1: Task failed with an exception. ----------- * Where: Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/test-suites/portable/common.gradle'> line: 271 * What went wrong: Execution failed for task ':sdks:python:test-suites:portable:py39:sparkExamples'. > Process 'command 'sh'' finished with non-zero exit value 1 * Try: > Run with --stacktrace option to get the stack trace. > Run with --info or --debug option to get more log output. > Run with --scan to get full insights. ============================================================================== 2: Task failed with an exception. ----------- * Where: Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/test-suites/portable/common.gradle'> line: 271 * What went wrong: Execution failed for task ':sdks:python:test-suites:portable:py37:sparkExamples'. > Process 'command 'sh'' finished with non-zero exit value 1 * Try: > Run with --stacktrace option to get the stack trace. > Run with --info or --debug option to get more log output. > Run with --scan to get full insights. ============================================================================== * Get more help at https://help.gradle.org BUILD FAILED in 5m 52s 79 actionable tasks: 50 executed, 27 from cache, 2 up-to-date Publishing build scan... 
https://gradle.com/s/6bzd4dpiycd22 Build step 'Invoke Gradle script' changed build result to FAILURE Build step 'Invoke Gradle script' marked build as failure --------------------------------------------------------------------- To unsubscribe, e-mail: [email protected] For additional commands, e-mail: [email protected]
