See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/4635/display/redirect?page=changes>
Changes: [noreply] [BEAM-13015] Allow decoding a set of elements until we hit the block

------------------------------------------
[...truncated 314.42 KB...]
[gw5] Python 3.7.10 (default, Feb 20 2021, 21:21:24) -- [GCC 5.4.0 20160609]
[gw6] Python 3.7.10 (default, Feb 20 2021, 21:21:24) -- [GCC 5.4.0 20160609]
[gw7] Python 3.7.10 (default, Feb 20 2021, 21:21:24) -- [GCC 5.4.0 20160609]
gw0 [30] / gw1 [30] / gw2 [30] / gw3 [30] / gw4 [30] / gw5 [30] / gw6 [30] / gw7 [30]
scheduling tests via LoadScheduling

apache_beam/io/gcp/big_query_query_to_table_it_test.py::BigQueryQueryToTableIT::test_big_query_new_types
apache_beam/io/gcp/big_query_query_to_table_it_test.py::BigQueryQueryToTableIT::test_big_query_new_types_native
apache_beam/io/gcp/big_query_query_to_table_it_test.py::BigQueryQueryToTableIT::test_big_query_new_types_avro
apache_beam/io/gcp/big_query_query_to_table_it_test.py::BigQueryQueryToTableIT::test_big_query_legacy_sql
apache_beam/examples/wordcount_it_test.py::WordCountIT::test_wordcount_it
apache_beam/io/gcp/big_query_query_to_table_it_test.py::BigQueryQueryToTableIT::test_big_query_standard_sql
apache_beam/io/gcp/pubsub_integration_test.py::PubSubIntegrationTest::test_streaming_data_only
apache_beam/io/gcp/pubsub_integration_test.py::PubSubIntegrationTest::test_streaming_with_attributes

> Task :runners:flink:1.13:job-server:shadowJar

> Task :runners:google-cloud-dataflow-java:worker:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :runners:google-cloud-dataflow-java:worker:classes

> Task :sdks:python:test-suites:direct:py37:spannerioIT
>>> RUNNING integration tests with pipeline options: --runner=TestDirectRunner --project=apache-beam-testing --region=us-central1 --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=build/apache-beam.tar.gz --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>> pytest options: --capture=no --numprocesses=8 --timeout=4500 --color=yes --log-cli-level=INFO apache_beam/io/gcp/experimental/spannerio_read_it_test.py apache_beam/io/gcp/experimental/spannerio_write_it_test.py
>>> collect markers:
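For reference, integration tests in this suite typically consume the pipeline options echoed above through Beam's TestPipeline helper rather than parsing them themselves. A minimal sketch of that pattern follows; the test class and its body are hypothetical and not one of the tests in this run.

# Minimal sketch (hypothetical test) of how a Beam Python integration test
# picks up harness-supplied pipeline options such as --runner=TestDirectRunner
# and --project=apache-beam-testing via TestPipeline.
import unittest

import apache_beam as beam
from apache_beam.testing.test_pipeline import TestPipeline
from apache_beam.testing.util import assert_that, is_not_empty


class ExampleIT(unittest.TestCase):
  def test_example(self):
    # TestPipeline(is_integration_test=True) reads the options passed by the
    # test harness and skips the test if none are provided.
    with TestPipeline(is_integration_test=True) as p:
      rows = p | beam.Create(['placeholder'])  # stand-in for a real IO source
      assert_that(rows, is_not_empty())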
============================= test session starts ==============================
platform linux -- Python 3.7.10, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir: https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python, inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.4.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
gw0 I / gw1 I / gw2 I / gw3 I / gw4 I / gw5 I / gw6 I / gw7 I
[gw0] Python 3.7.10 (default, Feb 20 2021, 21:21:24) -- [GCC 5.4.0 20160609]
[gw2] Python 3.7.10 (default, Feb 20 2021, 21:21:24) -- [GCC 5.4.0 20160609]
[gw4] Python 3.7.10 (default, Feb 20 2021, 21:21:24) -- [GCC 5.4.0 20160609]
[gw1] Python 3.7.10 (default, Feb 20 2021, 21:21:24) -- [GCC 5.4.0 20160609]
[gw5] Python 3.7.10 (default, Feb 20 2021, 21:21:24) -- [GCC 5.4.0 20160609]
[gw3] Python 3.7.10 (default, Feb 20 2021, 21:21:24) -- [GCC 5.4.0 20160609]
[gw6] Python 3.7.10 (default, Feb 20 2021, 21:21:24) -- [GCC 5.4.0 20160609]
[gw7] Python 3.7.10 (default, Feb 20 2021, 21:21:24) -- [GCC 5.4.0 20160609]
gw0 [15] / gw1 [15] / gw2 [15] / gw3 [15] / gw4 [15] / gw5 [15] / gw6 [15] / gw7 [15]
scheduling tests via LoadScheduling

apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_ok_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_ok_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_sql
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_ok_call
[gw5] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_table_metrics_error_call
[gw2] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_read_via_table
apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_table_metrics_ok_call
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_spanner_error

> Task :runners:google-cloud-dataflow-java:worker:shadowJar

> Task :sdks:python:test-suites:direct:py37:spannerioIT
[gw4] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_sql_metrics_error_call
[gw6] PASSED apache_beam/io/gcp/experimental/spannerio_read_it_test.py::SpannerReadIntegrationTest::test_transaction_sql_metrics_error_call
apache_beam/io/gcp/experimental/spannerio_write_it_test.py::SpannerWriteIntegrationTest::test_metrics_error_call

> Task :sdks:python:test-suites:portable:py37:portableWordCountSparkRunnerBatch
INFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at localhost:39995
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.36.0.dev
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function pack_combiners at 0x7f827c060ef0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7f827c060f80> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7f827c05e710> ====================
INFO:apache_beam.utils.subprocess_server:Starting service with ['java' '-jar' 'https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/runners/spark/2/job-server/build/libs/beam-runners-spark-job-server-2.36.0-SNAPSHOT.jar' '--spark-master-url' 'local[4]' '--artifacts-dir' '/tmp/beam-tempc7n4e8nw/artifactskd9dslrc' '--job-port' '48459' '--artifact-port' '0' '--expansion-port' '0']
WARNING:root:Waiting for grpc channel to be ready at localhost:48459.
WARNING:root:Waiting for grpc channel to be ready at localhost:48459.
WARNING:root:Waiting for grpc channel to be ready at localhost:48459.
WARNING:root:Waiting for grpc channel to be ready at localhost:48459.
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:21 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:46511'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:21 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:44397'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:21 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: JobService started on localhost:48459'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:21 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Job server now running, terminate with Ctrl+C'
WARNING:root:Waiting for grpc channel to be ready at localhost:48459.
WARNING:root:Waiting for grpc channel to be ready at localhost:48459.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--parallelism=2']
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:26 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Staging artifacts for job_10d6bec1-e6cc-4b1c-bbcb-6941f180a4b1.'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:26 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Resolving artifacts for job_10d6bec1-e6cc-4b1c-bbcb-6941f180a4b1.ref_Environment_default_environment_1.'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:26 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Getting 1 artifacts for job_10d6bec1-e6cc-4b1c-bbcb-6941f180a4b1.null.'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:26 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Artifacts fully staged for job_10d6bec1-e6cc-4b1c-bbcb-6941f180a4b1.'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:27 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job BeamApp-jenkins-1214121827-aca2bcde_33aed254-85c6-4a85-8660-4bd97b757f37'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:27 INFO org.apache.beam.runners.jobsubmission.JobInvocation: Starting job invocation BeamApp-jenkins-1214121827-aca2bcde_33aed254-85c6-4a85-8660-4bd97b757f37'
INFO:apache_beam.runners.portability.portable_runner:Environment "LOOPBACK" has started a component necessary for the execution.
Be sure to run the pipeline using
  with Pipeline() as p:
    p.apply(..)
This ensures that the pipeline finishes before this program exits.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:29 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:30 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:33 INFO org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated aggregators accumulator:'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:33 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics accumulator: MetricQueryResults()'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:33 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job BeamApp-jenkins-1214121827-aca2bcde_33aed254-85c6-4a85-8660-4bd97b757f37 on Spark master local[4]'
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:46871.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:37 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 1-1'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:37 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-2'
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:44635.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:44501
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:37 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.'
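The LOOPBACK message and the "with Pipeline() as p:" hint above refer to the usual way of submitting a pipeline to the portable job server started earlier in this task. A minimal sketch of that pattern, under the assumption of a job server on localhost:48459 as in this run; the word-count-style pipeline body is illustrative only, not the actual portableWordCount job.

# Minimal sketch of submitting a pipeline to the portable Spark job server in
# LOOPBACK mode. The job endpoint reuses the --job-port from this run; the
# pipeline body below is illustrative.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions([
    '--runner=PortableRunner',
    '--job_endpoint=localhost:48459',  # JobService port started above
    '--environment_type=LOOPBACK',     # SDK harness runs inside this process
])

# Using the pipeline as a context manager ensures it is run and waited on
# before the program exits, as the warning above recommends.
with beam.Pipeline(options=options) as p:
  (p
   | beam.Create(['hello world', 'hello beam'])
   | beam.FlatMap(str.split)
   | beam.combiners.Count.PerElement()
   | beam.Map(print))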
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:37 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-3'

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT

> Task :sdks:python:test-suites:portable:py37:portableWordCountSparkRunnerBatch
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:38 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-4'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:38 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-5'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:39 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-8'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:39 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-7'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:39 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-6'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:39 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-9'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:39 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-10'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:39 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-11'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:39 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-13'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:39 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-12'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:39 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-14'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:40 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-15'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:40 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1214121827-aca2bcde_33aed254-85c6-4a85-8660-4bd97b757f37: Pipeline translated successfully. Computing outputs'
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:40 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 1-16'
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.16 seconds.
INFO:apache_beam.utils.subprocess_server:b'21/12/14 12:18:40 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job BeamApp-jenkins-1214121827-aca2bcde_33aed254-85c6-4a85-8660-4bd97b757f37 finished.'
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py", line 634, in _read_inputs
    for elements in elements_iterator:
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py", line 426, in __next__
    return self._next()
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py", line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1639484321.789422507","description":"Error received from peer ipv4:127.0.0.1:44501","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py", line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py", line 634, in _read_inputs
    for elements in elements_iterator:
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py", line 426, in __next__
    return self._next()
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py", line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1639484321.789422507","description":"Error received from peer ipv4:127.0.0.1:44501","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py", line 234, in run
    for work_request in self._control_stub.Control(get_responses()):
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py", line 426, in __next__
    return self._next()
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py", line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1639484321.790495211","description":"Error received from peer ipv4:127.0.0.1:46871","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py", line 957, in pull_responses
    for response in responses:
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py", line 426, in __next__
    return self._next()
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py", line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1639484321.790462071","description":"Error received from peer ipv4:127.0.0.1:44635","file":"src/core/lib/surface/call.cc","file_line":1063,"grpc_message":"Socket closed","grpc_status":14}"
>

> Task :sdks:python:test-suites:portable:py37:postCommitPy37IT

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT
warning: sdist: standard file not found: should have one of README, README.rst, README.txt, README.md
warning: check: missing required meta-data: url
warning: check: missing meta-data: either (author and author_email) or (maintainer and maintainer_email) must be supplied

> Task :sdks:python:test-suites:portable:py37:postCommitPy37
> Task :sdks:python:test-suites:portable:py37:xlangSpannerIOIT
> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED
> Task :sdks:python:test-suites:dataflow:py37:spannerioIT

FAILURE: Build failed with an exception.

* Where:
Script 'https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle' line: 120

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2h 50m 5s

217 actionable tasks: 179 executed, 34 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/pydxanh7uo63a

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
