See <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/295/display/redirect?page=changes>
Changes:
[Andrew Pilloud] [BEAM-14321] SQL passes Null for Null aggregates
[noreply] Create apache-hop-with-dataflow.md
[noreply] Add files via upload
[noreply] Delete website/www/site/content/en/blog/apache-hop-with-dataflow
[noreply] Add files via upload
[noreply] Update apache-hop-with-dataflow.md
[noreply] Update apache-hop-with-dataflow.md
[noreply] Update apache-hop-with-dataflow.md
[danielamartinmtz] Moved up get-credentials instruction for getting the kubeconfig file
[noreply] Merge pull request #17428: [BEAM-14326] Make sure BigQuery daemon thread
[noreply] [BEAM-14301] Add lint:ignore to noescape() func (#17355)
[noreply] [BEAM-14286] Remove unused vars in harness package (#17392)
[noreply] [BEAM-14327] Convert Results to QueryResults directly (#17398)
[noreply] [BEAM-14302] Simplify boolean check in fn.go (#17399)
[noreply] [BEAM-13983] Sklearn Loader for RunInference (#17368)
[noreply] Update authors.yml
------------------------------------------
[...truncated 83.44 MB...]
apache_beam/examples/cookbook/multiple_output_pardo_test.py::MultipleOutputParDo::test_multiple_output_pardo
-------------------------------- live log call ---------------------------------
INFO root:pipeline.py:185 Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
WARNING root:environments.py:371 Make sure that locally built Python SDK docker image has Python 3.9 interpreter.
INFO root:environments.py:380 Default Python SDK image for environment is apache/beam_python3.9_sdk:2.39.0.dev
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function annotate_downstream_side_inputs at 0x7ff6680888b0> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function fix_side_input_pcoll_coders at 0x7ff6680889d0> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function pack_combiners at 0x7ff668088ee0> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function lift_combiners at 0x7ff668088f70> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function expand_sdf at 0x7ff6680ac160> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function expand_gbk at 0x7ff6680ac1f0> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function sink_flattens at 0x7ff6680ac310> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function greedily_fuse at 0x7ff6680ac3a0> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function read_to_impulse at 0x7ff6680ac430> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function impulse_to_input at 0x7ff6680ac4c0> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function sort_stages at 0x7ff6680ac700> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function add_impulse_to_dangling_transforms at 0x7ff6680ac820> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function setup_timer_mapping at 0x7ff6680ac670> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function populate_data_channel_coders at 0x7ff6680ac790> ====================
INFO apache_beam.runners.worker.statecache:statecache.py:172 Creating state cache with size 100
INFO apache_beam.runners.portability.fn_api_runner.worker_handlers:worker_handlers.py:889 Created Worker handler <apache_beam.runners.portability.fn_api_runner.worker_handlers.EmbeddedWorkerHandler object at 0x7ff5e4179970> for environment ref_Environment_default_environment_1 (beam:env:embedded_python:v1, b'')
INFO apache_beam.io.filebasedsink:filebasedsink.py:297 Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1
INFO apache_beam.io.filebasedsink:filebasedsink.py:345 Renamed 1 shards in 0.00 seconds.
INFO apache_beam.io.filebasedsink:filebasedsink.py:297 Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1
INFO apache_beam.io.filebasedsink:filebasedsink.py:345 Renamed 1 shards in 0.00 seconds.
INFO apache_beam.io.filebasedsink:filebasedsink.py:297 Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1
INFO apache_beam.io.filebasedsink:filebasedsink.py:345 Renamed 1 shards in 0.00 seconds.
PASSED [ 91%]
apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
SKIPPED [ 95%]
apache_beam/examples/dataframe/wordcount_test.py::WordCountTest::test_basics
-------------------------------- live log call ---------------------------------
INFO root:pipeline.py:185 Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
INFO root:transforms.py:182 Computing dataframe stage <ComputeStage(PTransform) label=[[ComputedExpression[set_column_DataFrame_140695811456784], ComputedExpression[set_index_DataFrame_140695473841008], ComputedExpression[pre_combine_sum_DataFrame_140695811415248]]:140695473543200]> for Stage[inputs={PlaceholderExpression[placeholder_DataFrame_140695811265152]}, partitioning=Arbitrary, ops=[ComputedExpression[set_column_DataFrame_140695811456784], ComputedExpression[set_index_DataFrame_140695473841008], ComputedExpression[pre_combine_sum_DataFrame_140695811415248]], outputs={ComputedExpression[pre_combine_sum_DataFrame_140695811415248], PlaceholderExpression[placeholder_DataFrame_140695811265152]}]
INFO root:transforms.py:182 Computing dataframe stage <ComputeStage(PTransform) label=[[ComputedExpression[post_combine_sum_DataFrame_140695811413184]]:140695473542336]> for Stage[inputs={ComputedExpression[pre_combine_sum_DataFrame_140695811415248]}, partitioning=Index, ops=[ComputedExpression[post_combine_sum_DataFrame_140695811413184]], outputs={ComputedExpression[post_combine_sum_DataFrame_140695811413184]}]
INFO apache_beam.io.fileio:fileio.py:555 Added temporary directory /tmp/.temp2cff859d-26ed-45b0-8789-9697684c2956
WARNING root:environments.py:371 Make sure that locally built Python SDK docker image has Python 3.9 interpreter.
INFO root:environments.py:380 Default Python SDK image for environment is apache/beam_python3.9_sdk:2.39.0.dev
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function annotate_downstream_side_inputs at 0x7ff6680888b0> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function fix_side_input_pcoll_coders at 0x7ff6680889d0> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function pack_combiners at 0x7ff668088ee0> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function lift_combiners at 0x7ff668088f70> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function expand_sdf at 0x7ff6680ac160> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function expand_gbk at 0x7ff6680ac1f0> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function sink_flattens at 0x7ff6680ac310> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function greedily_fuse at 0x7ff6680ac3a0> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function read_to_impulse at 0x7ff6680ac430> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function impulse_to_input at 0x7ff6680ac4c0> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function sort_stages at 0x7ff6680ac700> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function add_impulse_to_dangling_transforms at 0x7ff6680ac820> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function setup_timer_mapping at 0x7ff6680ac670> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function populate_data_channel_coders at 0x7ff6680ac790> ====================
INFO apache_beam.runners.worker.statecache:statecache.py:172 Creating state cache with size 100
INFO apache_beam.runners.portability.fn_api_runner.worker_handlers:worker_handlers.py:889 Created Worker handler <apache_beam.runners.portability.fn_api_runner.worker_handlers.EmbeddedWorkerHandler object at 0x7ff673907f10> for environment ref_Environment_default_environment_1 (beam:env:embedded_python:v1, b'')
INFO apache_beam.io.fileio:fileio.py:637 Moving temporary file /tmp/.temp2cff859d-26ed-45b0-8789-9697684c2956/4804582983399409602_211d6ec8-ea13-4829-8ccd-0f63af17402c to dir: /tmp as tmppx21odg_.result-00000-of-00001. Res: FileResult(file_name='/tmp/.temp2cff859d-26ed-45b0-8789-9697684c2956/4804582983399409602_211d6ec8-ea13-4829-8ccd-0f63af17402c', shard_index=-1, total_shards=0, window=GlobalWindow, pane=None, destination=None)
INFO apache_beam.io.fileio:fileio.py:661 Checking orphaned temporary files for destination None and window GlobalWindow
INFO apache_beam.io.fileio:fileio.py:676 Some files may be left orphaned in the temporary folder: []
PASSED [100%]
=================================== FAILURES ===================================
_______________ CodersIT.test_coders_output_files_on_small_input _______________
self = <apache_beam.examples.cookbook.coders_it_test.CodersIT testMethod=test_coders_output_files_on_small_input>

    @pytest.mark.no_xdist
    @pytest.mark.examples_postcommit
    def test_coders_output_files_on_small_input(self):
      test_pipeline = TestPipeline(is_integration_test=True)
      # Setup the files with expected content.
      OUTPUT_FILE_DIR = \
          'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output'
      output = '/'.join([OUTPUT_FILE_DIR, str(uuid.uuid4()), 'result'])
      INPUT_FILE_DIR = \
          'gs://temp-storage-for-end-to-end-tests/py-it-cloud/input'
      input = '/'.join([INPUT_FILE_DIR, str(uuid.uuid4()), 'input.txt'])
      create_content_input_file(
          input, '\n'.join(map(json.dumps, self.SAMPLE_RECORDS)))
      extra_opts = {'input': input, 'output': output}
>     coders.run(test_pipeline.get_full_options_as_args(**extra_opts))
apache_beam/examples/cookbook/coders_it_test.py:93:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
apache_beam/examples/cookbook/coders.py:87: in run
p
apache_beam/transforms/ptransform.py:1092: in __ror__
return self.transform.__ror__(pvalueish, self.label)
apache_beam/transforms/ptransform.py:614: in __ror__
result = p.apply(self, pvalueish, label)
apache_beam/pipeline.py:662: in apply
return self.apply(transform, pvalueish)
apache_beam/pipeline.py:708: in apply
pvalueish_result = self.runner.apply(transform, pvalueish, self._options)
apache_beam/runners/runner.py:185: in apply
return m(transform, input, options)
apache_beam/runners/runner.py:215: in apply_PTransform
return transform.expand(input)
apache_beam/io/textio.py:690: in expand
self._source.output_type_hint())
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <apache_beam.io.textio._TextSource object at 0x7ff653658c40>
    def output_type_hint(self):
      try:
>       return self._coder.to_type_hint()
E       AttributeError: 'JsonCoder' object has no attribute 'to_type_hint'

apache_beam/io/textio.py:409: AttributeError
------------------------------ Captured log call -------------------------------
INFO root:coders_it_test.py:48 Creating file: gs://temp-storage-for-end-to-end-tests/py-it-cloud/input/5bad7c31-36e5-4fc3-b612-6a2a20f13ec9/input.txt
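The AttributeError above is the root cause of the build failure: the cookbook example's JsonCoder is a duck-typed coder that implements encode/decode but not the `to_type_hint()` hook that `_TextSource.output_type_hint()` calls. A minimal, self-contained sketch of the shape of a fix (the class below is an illustrative stand-in for the example's coder, not Beam's actual patch; returning `typing.Any` is one safe choice of hint):

```python
import json
from typing import Any


class JsonCoder:
    """Stand-in for the cookbook example's duck-typed JSON coder."""

    def encode(self, value):
        # Serialize a Python object to JSON-encoded UTF-8 bytes.
        return json.dumps(value).encode('utf-8')

    def decode(self, encoded):
        # Parse JSON-encoded bytes back into a Python object.
        return json.loads(encoded)

    def to_type_hint(self):
        # Hypothetical fix: textio asks the coder for its element type via
        # this hook; Any is a safe hint for arbitrary JSON values.
        return Any


coder = JsonCoder()
roundtrip = coder.decode(coder.encode({'guido': [1, 0]}))
print(roundtrip, coder.to_type_hint())
```

Subclassing `apache_beam.coders.Coder`, whose base class supplies a default `to_type_hint`, would be another way to avoid the error.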
=============================== warnings summary ===============================
apache_beam/io/filesystems_test.py:54
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/apache_beam/io/filesystems_test.py>:54: DeprecationWarning: invalid escape sequence \c
    self.assertIsNone(FileSystems.get_scheme('c:\\abc\cdf'))  # pylint: disable=anomalous-backslash-in-string
apache_beam/io/filesystems_test.py:62
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/apache_beam/io/filesystems_test.py>:62: DeprecationWarning: invalid escape sequence \d
    self.assertTrue(isinstance(FileSystems.get_filesystem('c:\\abc\def'),  # pylint: disable=anomalous-backslash-in-string
<unknown>:54
  <unknown>:54: DeprecationWarning: invalid escape sequence \c
<unknown>:62
  <unknown>:62: DeprecationWarning: invalid escape sequence \d
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/tenacity/_asyncio.py>:42
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/build/gradleenv/2022703443/lib/python3.9/site-packages/tenacity/_asyncio.py>:42: DeprecationWarning: "@coroutine" decorator is deprecated since Python 3.8, use "async def" instead
    def call(self, fn, *args, **kwargs):
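The invalid escape sequence DeprecationWarnings above (`\c`, `\d`) come from Windows-style paths written as plain string literals in filesystems_test.py. A small sketch of the usual fixes (raw strings or doubled backslashes; the variable names here are illustrative):

```python
# In 'c:\\abc\cdf' the trailing '\c' is not a recognized escape sequence,
# so CPython emits a DeprecationWarning (slated to become an error in a
# future release). Either of these spellings is warning-free:
raw = r'c:\abc\cdf'       # raw string: backslashes are taken literally
doubled = 'c:\\abc\\cdf'  # explicit escaping of each backslash
print(raw == doubled)     # the two literals denote the same path
```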
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:63: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    dataset_ref = client.dataset(unique_dataset_name, project=project)
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2154: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    is_streaming_pipeline = p.options.view_as(StandardOptions).streaming
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2160: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    experiments = p.options.view_as(DebugOptions).experiments or []
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1128: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1130: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2454: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = pcoll.pipeline.options.view_as(
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2456: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2480: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    pipeline_options=pcoll.pipeline.options,
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2150: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    self.table_reference.projectId = pcoll.pipeline.options.view_as(
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)
apache_beam/examples/dataframe/wordcount_test.py::WordCountTest::test_basics
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/apache_beam/dataframe/io.py>:629: FutureWarning: WriteToFiles is experimental.
    return pcoll | fileio.WriteToFiles(
apache_beam/examples/dataframe/wordcount_test.py::WordCountTest::test_basics
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/apache_beam/io/fileio.py>:550: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).temp_location or
-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/pytest_postCommitExamples-flink-py39.xml> -
= 1 failed, 20 passed, 3 skipped, 5354 deselected, 36 warnings in 427.55 seconds =
> Task :sdks:python:test-suites:portable:py39:flinkExamples FAILED
FAILURE: Build completed with 2 failures.
1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/test-suites/portable/common.gradle>' line: 240
* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py37:flinkExamples'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Flink/ws/src/sdks/python/test-suites/portable/common.gradle>' line: 240
* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py39:flinkExamples'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.
You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.
See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 8m 34s
133 actionable tasks: 87 executed, 44 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/j7bn57hzzint4
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]