See <https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/4327/display/redirect?page=changes>
Changes: [Robert Bradshaw] Test cleanup.

[ajamato] [BEAM-11994] Update BigQueryServicesImpl to capture API_REQUEST_COUNT

[noreply] [BEAM-3713] Move PerformanceTest and CrossLanguageValidateRunner from

[Andrew Pilloud] [BEAM-12508] Remove all gradle from release

[Andrew Pilloud] [BEAM-12507] Remove website from release

[noreply] Merge pull request #15012 from [BEAM-12068] Run Dataflow V2 performance


------------------------------------------
[...truncated 1.62 MB...]
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I
gw0 [28] / gw1 [28] / gw2 [28] / gw3 [28] / gw4 [28] / gw5 [28]
...........................

> Task :sdks:python:test-suites:dataflow:py37:preCommitIT_streaming_V2
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-18T01:20:43.744Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.

> Task :sdks:python:test-suites:dataflow:py36:preCommitIT_streaming_V2
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-18T01:20:31.037Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-18T01:20:31.191Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2021-06-17_18_20_26-5594080212772861728. The number of workers will be between 1 and 100.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-18T01:20:31.239Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2021-06-17_18_20_26-5594080212772861728.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-18T01:20:33.551Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-f.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-18T01:20:34.158Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-18T01:20:34.187Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-18T01:20:34.280Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-18T01:20:34.335Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step group: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-18T01:20:34.366Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-18T01:20:34.400Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-18T01:20:34.455Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-18T01:20:34.515Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-18T01:20:34.546Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-18T01:20:34.571Z: JOB_MESSAGE_DETAILED: Fusing consumer decode into ReadFromPubSub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-18T01:20:34.592Z: JOB_MESSAGE_DETAILED: Fusing consumer split into decode
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-18T01:20:34.622Z: JOB_MESSAGE_DETAILED: Fusing consumer pair_with_one into split
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-18T01:20:34.663Z: JOB_MESSAGE_DETAILED: Fusing consumer WindowInto(WindowIntoFn) into pair_with_one
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-18T01:20:34.687Z: JOB_MESSAGE_DETAILED: Fusing consumer group/WriteStream into WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-18T01:20:34.721Z: JOB_MESSAGE_DETAILED: Fusing consumer group/MergeBuckets into group/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-18T01:20:34.780Z: JOB_MESSAGE_DETAILED: Fusing consumer count into group/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-18T01:20:34.836Z: JOB_MESSAGE_DETAILED: Fusing consumer format into count
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-18T01:20:34.881Z: JOB_MESSAGE_DETAILED: Fusing consumer encode into format
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-18T01:20:34.931Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToPubSub/ToProtobuf into encode
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-18T01:20:35.030Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToPubSub/Write/NativeWrite into WriteToPubSub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-18T01:20:35.205Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-18T01:20:35.286Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-18T01:20:35.350Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-18T01:20:35.386Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-18T01:20:35.456Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-18T01:20:35.489Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-18T01:20:35.527Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-18T01:21:12.546Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete

> Task :sdks:python:test-suites:tox:py38:testPy38pyarrow-4
. [100%]
=============================== warnings summary ===============================
apache_beam/io/filesystems_test.py:54
apache_beam/io/filesystems_test.py:54
apache_beam/io/filesystems_test.py:54
apache_beam/io/filesystems_test.py:54
apache_beam/io/filesystems_test.py:54
<https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/test-suites/tox/py38/build/srcs/sdks/python/apache_beam/io/filesystems_test.py>:54: DeprecationWarning: invalid escape sequence \c
self.assertIsNone(FileSystems.get_scheme('c:\\abc\cdf')) # pylint: disable=anomalous-backslash-in-string
apache_beam/io/filesystems_test.py:62
apache_beam/io/filesystems_test.py:62
apache_beam/io/filesystems_test.py:62
apache_beam/io/filesystems_test.py:62
apache_beam/io/filesystems_test.py:62
<https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/test-suites/tox/py38/build/srcs/sdks/python/apache_beam/io/filesystems_test.py>:62: DeprecationWarning: invalid escape sequence \d
self.assertTrue(isinstance(FileSystems.get_filesystem('c:\\abc\def'), # pylint: disable=anomalous-backslash-in-string
target/.tox-py38-pyarrow-4/py38-pyarrow-4/lib/python3.8/site-packages/tenacity/_asyncio.py:42
target/.tox-py38-pyarrow-4/py38-pyarrow-4/lib/python3.8/site-packages/tenacity/_asyncio.py:42
target/.tox-py38-pyarrow-4/py38-pyarrow-4/lib/python3.8/site-packages/tenacity/_asyncio.py:42
target/.tox-py38-pyarrow-4/py38-pyarrow-4/lib/python3.8/site-packages/tenacity/_asyncio.py:42
target/.tox-py38-pyarrow-4/py38-pyarrow-4/lib/python3.8/site-packages/tenacity/_asyncio.py:42
target/.tox-py38-pyarrow-4/py38-pyarrow-4/lib/python3.8/site-packages/tenacity/_asyncio.py:42
<https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/test-suites/tox/py38/build/srcs/sdks/python/target/.tox-py38-pyarrow-4/py38-pyarrow-4/lib/python3.8/site-packages/tenacity/_asyncio.py>:42: DeprecationWarning: "@coroutine" decorator is deprecated since Python 3.8, use "async def" instead
def call(self, fn, *args, **kwargs):
apache_beam/io/parquetio_test.py:416
<https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/test-suites/tox/py38/build/srcs/sdks/python/apache_beam/io/parquetio_test.py>:416: FutureWarning: The parameter chunksize is deprecated for pyarrow.Table.to_batches as of 0.15, please use the parameter max_chunksize instead
for batch in orig.to_batches(chunksize=20)
apache_beam/io/parquetio_test.py:377
<https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/test-suites/tox/py38/build/srcs/sdks/python/apache_beam/io/parquetio_test.py>:377: FutureWarning: The parameter chunksize is deprecated for pyarrow.Table.to_batches as of 0.15, please use the parameter max_chunksize instead
pa.Table.from_batches([batch]) for batch in self._records_as_arrow(
apache_beam/io/parquetio_test.py:367
<https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/test-suites/tox/py38/build/srcs/sdks/python/apache_beam/io/parquetio_test.py>:367: FutureWarning: The parameter chunksize is deprecated for pyarrow.Table.to_batches as of 0.15, please use the parameter max_chunksize instead
pa.Table.from_batches([batch]) for batch in self._records_as_arrow(
apache_beam/dataframe/io.py:566
<https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/test-suites/tox/py38/build/srcs/sdks/python/apache_beam/dataframe/io.py>:566: FutureWarning: WriteToFiles is experimental.
return pcoll | fileio.WriteToFiles(
apache_beam/io/fileio.py:478
apache_beam/io/fileio.py:478
<https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/test-suites/tox/py38/build/srcs/sdks/python/apache_beam/io/fileio.py>:478: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
p.options.view_as(GoogleCloudOptions).temp_location or
-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/test-suites/tox/py38/build/srcs/sdks/python/pytest_py38-pyarrow-4.xml> -
============== 28 passed, 1 skipped, 22 warnings in 33.51 seconds ==============
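[Editor's note: the escape-sequence and pyarrow warnings above already name their replacements. Purely as illustration, a minimal standalone sketch of those migrations (not Beam code; assumes pyarrow >= 0.15, as the warning states) could look like this:

    import pyarrow as pa

    # A raw string avoids the "invalid escape sequence" DeprecationWarning that
    # a literal like 'c:\\abc\cdf' triggers in filesystems_test.py; the string
    # value itself is unchanged.
    windows_path = r'c:\abc\cdf'

    # pyarrow renamed the to_batches() keyword in 0.15; passing max_chunksize
    # instead of chunksize silences the FutureWarning seen in parquetio_test.py.
    table = pa.table({'n': list(range(100))})
    batches = table.to_batches(max_chunksize=20)  # was: to_batches(chunksize=20)
]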
============================= test session starts ==============================
platform linux -- Python 3.8.5, pytest-4.6.11, py-1.10.0, pluggy-0.13.1
cachedir: target/.tox-py38-pyarrow-4/py38-pyarrow-4/.pytest_cache
rootdir: <https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/test-suites/tox/py38/build/srcs/sdks/python>, inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.3.0, requests-mock-1.9.3
timeout: 600.0s
timeout method: signal
timeout func_only: False
collected 4679 items / 4651 deselected / 3 skipped / 25 selected

apache_beam/dataframe/io_test.py . [ 3%]
apache_beam/io/parquetio_test.py .................

> Task :sdks:python:test-suites:dataflow:py37:preCommitIT_streaming_V2
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-18T01:21:25.333Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-18T01:21:25.373Z: JOB_MESSAGE_DETAILED: Workers have started successfully.

> Task :sdks:python:test-suites:dataflow:py36:preCommitIT_streaming_V2
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-18T01:21:30.071Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.

> Task :sdks:python:test-suites:tox:py38:testPy38pyarrow-4
.......... [100%]
=============================== warnings summary ===============================
target/.tox-py38-pyarrow-4/py38-pyarrow-4/lib/python3.8/site-packages/tenacity/_asyncio.py:42
<https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/test-suites/tox/py38/build/srcs/sdks/python/target/.tox-py38-pyarrow-4/py38-pyarrow-4/lib/python3.8/site-packages/tenacity/_asyncio.py>:42: DeprecationWarning: "@coroutine" decorator is deprecated since Python 3.8, use "async def" instead
def call(self, fn, *args, **kwargs):
apache_beam/dataframe/io_test.py::IOTest::test_read_write_parquet
<https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/test-suites/tox/py38/build/srcs/sdks/python/apache_beam/dataframe/io.py>:566: FutureWarning: WriteToFiles is experimental.
return pcoll | fileio.WriteToFiles(
apache_beam/dataframe/io_test.py::IOTest::test_read_write_parquet
apache_beam/dataframe/io_test.py::IOTest::test_read_write_parquet
<https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/test-suites/tox/py38/build/srcs/sdks/python/apache_beam/io/fileio.py>:478: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
p.options.view_as(GoogleCloudOptions).temp_location or
apache_beam/io/parquetio_test.py::TestParquet::test_int96_type_conversion
<https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/test-suites/tox/py38/build/srcs/sdks/python/apache_beam/io/parquetio_test.py>:416: FutureWarning: The parameter chunksize is deprecated for pyarrow.Table.to_batches as of 0.15, please use the parameter max_chunksize instead
for batch in orig.to_batches(chunksize=20)
apache_beam/io/parquetio_test.py::TestParquet::test_read_with_splitting_multiple_row_group
<https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/test-suites/tox/py38/build/srcs/sdks/python/apache_beam/io/parquetio_test.py>:377: FutureWarning: The parameter chunksize is deprecated for pyarrow.Table.to_batches as of 0.15, please use the parameter max_chunksize instead
pa.Table.from_batches([batch]) for batch in self._records_as_arrow(
apache_beam/io/parquetio_test.py::TestParquet::test_read_without_splitting_multiple_row_group
<https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/test-suites/tox/py38/build/srcs/sdks/python/apache_beam/io/parquetio_test.py>:367: FutureWarning: The parameter chunksize is deprecated for pyarrow.Table.to_batches as of 0.15, please use the parameter max_chunksize instead
pa.Table.from_batches([batch]) for batch in self._records_as_arrow(
-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/test-suites/tox/py38/build/srcs/sdks/python/pytest_py38-pyarrow-4_no_xdist.xml> -
====== 28 passed, 3 skipped, 4651 deselected, 7 warnings in 45.28 seconds ======
py38-pyarrow-4 run-test-post: commands[0] | bash <https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/test-suites/tox/py38/build/srcs/sdks/python/scripts/run_tox_cleanup.sh>
___________________________________ summary ____________________________________
  py38-pyarrow-4: commands succeeded
  congratulations :)

> Task :sdks:python:test-suites:tox:py38:preCommitPy38

> Task :sdks:python:test-suites:dataflow:py36:preCommitIT_streaming_V2
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-18T01:22:12.766Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-06-18T01:22:12.821Z: JOB_MESSAGE_DETAILED: Workers have started successfully.

> Task :sdks:python:test-suites:dataflow:py37:preCommitIT_streaming_V2
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2021-06-17_18_19_46-10769130769176967853 after 360 seconds
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authentication via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fpubsub
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/[email protected]/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fpubsub HTTP/1.1" 200 244

> Task :sdks:python:test-suites:dataflow:py36:preCommitIT_streaming_V2
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2021-06-17_18_20_26-5594080212772861728 after 364 seconds
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authentication via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fpubsub
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/[email protected]/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fpubsub HTTP/1.1" 200 244

> Task :sdks:python:test-suites:dataflow:py37:preCommitIT_streaming_V2
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-17_18_19_46-10769130769176967853?project=apache-beam-testing
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
----------------------------------------------------------------------
XML: nosetests-preCommitIT-df-py37.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 611.377s

OK

> Task :sdks:python:test-suites:dataflow:py36:preCommitIT_streaming_V2
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2021-06-17_18_20_26-5594080212772861728?project=apache-beam-testing
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
----------------------------------------------------------------------
XML: nosetests-preCommitIT-df-py36.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 600.559s

OK

> Task :sdks:python:test-suites:dataflow:preCommitIT_V2

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:tox:py37:testPy37Cloud'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:tox:py36:testPy36Cloud'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 17m 2s

103 actionable tasks: 78 executed, 25 from cache

Publishing build scan...
https://gradle.com/s/bturkohlkfdmi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
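[Editor's note: the two failing tox tasks named above can presumably be reproduced from a Beam checkout with the flags Gradle suggests, assuming the repository's standard Gradle wrapper, e.g.:

    ./gradlew :sdks:python:test-suites:tox:py37:testPy37Cloud --stacktrace --info
    ./gradlew :sdks:python:test-suites:tox:py36:testPy36Cloud --stacktrace --info
]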
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]