See
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/30/display/redirect?page=changes>
Changes:
[Carl Yeksigian] Cache bucket matcher regex in GcsPath
[noreply] [BEAM-4032]Support staging binary distributions of dependency packages
------------------------------------------
[...truncated 491.84 KB...]
        os.path.join(temp_folder, 'input.txt'),
        '\n'.join(map(json.dumps, self.SAMPLE_RECORDS)))
    extra_opts = {
        'input': '%s/input.txt' % temp_folder,
        'output': os.path.join(temp_folder, 'result')
    }
    coders.run(test_pipeline.get_full_options_as_args(**extra_opts))

    # Load result file and compare.
    with open_shards(os.path.join(temp_folder, 'result-*-of-*')) as result_file:
      result = result_file.read().strip()

    self.assertEqual(
>       sorted(self.EXPECTED_RESULT), sorted(self.format_result(result)))

apache_beam/examples/cookbook/coders_test.py:98:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
apache_beam/examples/cookbook/coders_test.py:60: in format_result
    result_list = list(
apache_beam/examples/cookbook/coders_test.py:62: in <lambda>
    lambda result_elem: format_tuple(result_elem.split(',')),
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

result_elem_list = ['']

    def format_tuple(result_elem_list):
>       [country, counter] = result_elem_list
E       ValueError: not enough values to unpack (expected 2, got 1)

apache_beam/examples/cookbook/coders_test.py:57: ValueError
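The failure above can be reproduced in isolation: `str.split` on an empty string returns the one-element list `['']`, so unpacking it into two names raises exactly this ValueError. A minimal standalone sketch (the `format_tuple` below is re-declared from the traceback for illustration, not imported from the test module):

```python
# Splitting an empty result line yields [''] -- one element, not two --
# so a two-name unpack raises ValueError, as seen in the traceback above.
def format_tuple(result_elem_list):
    [country, counter] = result_elem_list
    return country, counter

parts = ''.split(',')
print(parts)  # ['']
try:
    format_tuple(parts)
except ValueError as err:
    print(err)  # not enough values to unpack (expected 2, got 1)
```

This points at the likely root cause: the result shards read by `open_shards` contained an empty line (or were empty), which the formatter does not guard against.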
------------------------------ Captured log call -------------------------------
INFO root:coders_test.py:51 Creating temp file:
/tmp/tmppb5r0ql4/input.txt
INFO apache_beam.runners.portability.stager:stager.py:754
Executing command:
['https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/build/gradleenv/-1734967051/bin/python3.8',
'-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r',
'/tmp/tmp0nh4ux64/tmp_requirements.txt', '--exists-action', 'i', '--no-deps',
'--implementation', 'cp', '--abi', 'cp38', '--platform', 'manylinux2014_x86_64']
INFO apache_beam.runners.portability.stager:stager.py:325 Copying
Beam SDK
"https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/build/apache-beam.tar.gz"
to staging location.
WARNING root:environments.py:371 Make sure that locally built Python
SDK docker image has Python 3.8 interpreter.
INFO root:environments.py:380 Default Python SDK image for
environment is apache/beam_python3.8_sdk:2.38.0.dev
INFO root:environments.py:295 Using provided Python SDK container
image: gcr.io/cloud-dataflow/v1beta3/python38-fnapi:beam-master-20220208
INFO root:environments.py:302 Python SDK container image set to
"gcr.io/cloud-dataflow/v1beta3/python38-fnapi:beam-master-20220208" for Docker
environment
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678
==================== <function pack_combiners at 0x7fda1383c5e0>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678
==================== <function sort_stages at 0x7fda1383cdc0>
====================
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS
upload to
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0211200859-114783-agsjc47k.1644610139.114969/requirements.txt...
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS
upload to
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0211200859-114783-agsjc47k.1644610139.114969/requirements.txt
in 0 seconds.
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS
upload to
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0211200859-114783-agsjc47k.1644610139.114969/pickled_main_session...
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS
upload to
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0211200859-114783-agsjc47k.1644610139.114969/pickled_main_session
in 0 seconds.
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS
upload to
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0211200859-114783-agsjc47k.1644610139.114969/pbr-5.8.1.tar.gz...
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS
upload to
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0211200859-114783-agsjc47k.1644610139.114969/pbr-5.8.1.tar.gz
in 0 seconds.
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS
upload to
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0211200859-114783-agsjc47k.1644610139.114969/mock-2.0.0.tar.gz...
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS
upload to
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0211200859-114783-agsjc47k.1644610139.114969/mock-2.0.0.tar.gz
in 0 seconds.
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS
upload to
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0211200859-114783-agsjc47k.1644610139.114969/six-1.16.0.tar.gz...
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS
upload to
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0211200859-114783-agsjc47k.1644610139.114969/six-1.16.0.tar.gz
in 0 seconds.
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS
upload to
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0211200859-114783-agsjc47k.1644610139.114969/soupsieve-2.3.1.tar.gz...
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS
upload to
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0211200859-114783-agsjc47k.1644610139.114969/soupsieve-2.3.1.tar.gz
in 0 seconds.
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS
upload to
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0211200859-114783-agsjc47k.1644610139.114969/PyHamcrest-1.10.1.tar.gz...
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS
upload to
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0211200859-114783-agsjc47k.1644610139.114969/PyHamcrest-1.10.1.tar.gz
in 0 seconds.
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS
upload to
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0211200859-114783-agsjc47k.1644610139.114969/parameterized-0.7.5.tar.gz...
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS
upload to
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0211200859-114783-agsjc47k.1644610139.114969/parameterized-0.7.5.tar.gz
in 0 seconds.
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS
upload to
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0211200859-114783-agsjc47k.1644610139.114969/beautifulsoup4-4.10.0.tar.gz...
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS
upload to
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0211200859-114783-agsjc47k.1644610139.114969/beautifulsoup4-4.10.0.tar.gz
in 0 seconds.
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS
upload to
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0211200859-114783-agsjc47k.1644610139.114969/mock-2.0.0-py2.py3-none-any.whl...
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS
upload to
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0211200859-114783-agsjc47k.1644610139.114969/mock-2.0.0-py2.py3-none-any.whl
in 0 seconds.
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS
upload to
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0211200859-114783-agsjc47k.1644610139.114969/PyHamcrest-1.10.1-py3-none-any.whl...
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS
upload to
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0211200859-114783-agsjc47k.1644610139.114969/PyHamcrest-1.10.1-py3-none-any.whl
in 0 seconds.
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS
upload to
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0211200859-114783-agsjc47k.1644610139.114969/parameterized-0.7.5-py2.py3-none-any.whl...
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS
upload to
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0211200859-114783-agsjc47k.1644610139.114969/parameterized-0.7.5-py2.py3-none-any.whl
in 0 seconds.
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS
upload to
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0211200859-114783-agsjc47k.1644610139.114969/dataflow_python_sdk.tar...
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS
upload to
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0211200859-114783-agsjc47k.1644610139.114969/dataflow_python_sdk.tar
in 0 seconds.
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS
upload to
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0211200859-114783-agsjc47k.1644610139.114969/pipeline.pb...
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS
upload to
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0211200859-114783-agsjc47k.1644610139.114969/pipeline.pb
in 0 seconds.
WARNING apache_beam.options.pipeline_options:pipeline_options.py:309
Discarding unparseable args: ['--sleep_secs=20',
'--kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test']
WARNING apache_beam.options.pipeline_options:pipeline_options.py:309
Discarding unparseable args: ['--sleep_secs=20',
'--kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test']
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:886 Create job:
<Job
clientRequestId: '20220211200859115921-7685'
createTime: '2022-02-11T20:09:01.836355Z'
currentStateTime: '1970-01-01T00:00:00Z'
id:
'2022-02-11_12_09_01-12786216560451028583'
location: 'us-central1'
name: 'beamapp-jenkins-0211200859-114783-agsjc47k'
projectId: 'apache-beam-testing'
stageStates: []
startTime: '2022-02-11T20:09:01.836355Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:888 Created job
with id: [2022-02-11_12_09_01-12786216560451028583]
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:889 Submitted job:
2022-02-11_12_09_01-12786216560451028583
INFO
apache_beam.runners.dataflow.internal.apiclient:apiclient.py:890 To access the
Dataflow monitoring console, please navigate to
https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-11_12_09_01-12786216560451028583?project=apache-beam-testing
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:191 Job
2022-02-11_12_09_01-12786216560451028583 is in state JOB_STATE_RUNNING
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:09:04.400Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job
2022-02-11_12_09_01-12786216560451028583. The number of workers will be between
1 and 1000.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:09:04.514Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically
enabled for job 2022-02-11_12_09_01-12786216560451028583.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:09:15.401Z: JOB_MESSAGE_BASIC: Worker configuration:
e2-standard-2 in us-central1-b.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:09:19.160Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo
operations into optimizable parts.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:09:19.198Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton
operations into optimizable parts.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:09:19.278Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey
operations into optimizable parts.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:09:19.335Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step
write/Write/WriteImpl/GroupByKey: GroupByKey not followed by a combiner.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:09:19.440Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations
into optimizable parts.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:09:19.478Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner
information.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:09:19.561Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read,
Write, and Flatten operations
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:09:19.604Z: JOB_MESSAGE_DETAILED: Fusing consumer
write/Write/WriteImpl/InitializeWrite into
write/Write/WriteImpl/DoOnce/Map(decode)
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:09:19.638Z: JOB_MESSAGE_DETAILED: Fusing consumer
read/Read/Map(<lambda at iobase.py:898>) into read/Read/Impulse
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:09:19.674Z: JOB_MESSAGE_DETAILED: Fusing consumer
ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/PairWithRestriction
into read/Read/Map(<lambda at iobase.py:898>)
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:09:19.714Z: JOB_MESSAGE_DETAILED: Fusing consumer
ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/SplitWithSizing
into
ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/PairWithRestriction
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:09:19.763Z: JOB_MESSAGE_DETAILED: Fusing consumer points into
ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/ProcessElementAndRestrictionWithSizing
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:09:19.806Z: JOB_MESSAGE_DETAILED: Fusing consumer
CombinePerKey(sum)/GroupByKey+CombinePerKey(sum)/Combine/Partial into points
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:09:19.841Z: JOB_MESSAGE_DETAILED: Fusing consumer
CombinePerKey(sum)/GroupByKey/Write into
CombinePerKey(sum)/GroupByKey+CombinePerKey(sum)/Combine/Partial
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:09:19.884Z: JOB_MESSAGE_DETAILED: Fusing consumer
CombinePerKey(sum)/Combine into CombinePerKey(sum)/GroupByKey/Read
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:09:19.925Z: JOB_MESSAGE_DETAILED: Fusing consumer
CombinePerKey(sum)/Combine/Extract into CombinePerKey(sum)/Combine
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:09:19.974Z: JOB_MESSAGE_DETAILED: Fusing consumer
write/Write/WriteImpl/WindowInto(WindowIntoFn) into
CombinePerKey(sum)/Combine/Extract
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:09:20.010Z: JOB_MESSAGE_DETAILED: Fusing consumer
write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:3228>) into
write/Write/WriteImpl/DoOnce/Impulse
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:09:20.045Z: JOB_MESSAGE_DETAILED: Fusing consumer
write/Write/WriteImpl/DoOnce/Map(decode) into
write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:3228>)
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:09:20.094Z: JOB_MESSAGE_DETAILED: Fusing consumer
write/Write/WriteImpl/WriteBundles into
write/Write/WriteImpl/WindowInto(WindowIntoFn)
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:09:20.130Z: JOB_MESSAGE_DETAILED: Fusing consumer
write/Write/WriteImpl/Pair into write/Write/WriteImpl/WriteBundles
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:09:20.163Z: JOB_MESSAGE_DETAILED: Fusing consumer
write/Write/WriteImpl/GroupByKey/Write into write/Write/WriteImpl/Pair
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:09:20.197Z: JOB_MESSAGE_DETAILED: Fusing consumer
write/Write/WriteImpl/Extract into write/Write/WriteImpl/GroupByKey/Read
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:09:20.270Z: JOB_MESSAGE_DEBUG: Workflow config is missing a
default resource spec.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:09:20.328Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and
teardown to workflow graph.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:09:20.364Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop
steps.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:09:20.427Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:09:20.697Z: JOB_MESSAGE_DEBUG: Executing wait step start34
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:09:20.794Z: JOB_MESSAGE_BASIC: Executing operation
write/Write/WriteImpl/DoOnce/Impulse+write/Write/WriteImpl/DoOnce/FlatMap(<lambda
at
core.py:3228>)+write/Write/WriteImpl/DoOnce/Map(decode)+write/Write/WriteImpl/InitializeWrite
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:09:20.841Z: JOB_MESSAGE_BASIC: Executing operation
read/Read/Impulse+read/Read/Map(<lambda at
iobase.py:898>)+ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/PairWithRestriction+ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/SplitWithSizing
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:09:20.846Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:09:20.875Z: JOB_MESSAGE_BASIC: Executing operation
CombinePerKey(sum)/GroupByKey/Create
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:09:20.886Z: JOB_MESSAGE_BASIC: Starting 1 workers in
us-central1-b...
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:09:20.914Z: JOB_MESSAGE_BASIC: Executing operation
write/Write/WriteImpl/GroupByKey/Create
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:09:20.979Z: JOB_MESSAGE_BASIC: Finished operation
CombinePerKey(sum)/GroupByKey/Create
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:09:21.009Z: JOB_MESSAGE_BASIC: Finished operation
write/Write/WriteImpl/GroupByKey/Create
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:09:21.050Z: JOB_MESSAGE_DEBUG: Value
"CombinePerKey(sum)/GroupByKey/Session" materialized.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:09:21.085Z: JOB_MESSAGE_DEBUG: Value
"write/Write/WriteImpl/GroupByKey/Session" materialized.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:09:21.967Z: JOB_MESSAGE_BASIC: Your project already contains 100
Dataflow-created metric descriptors, so new user metrics of the form
custom.googleapis.com/* will not be created. However, all user metrics are also
available in the metric dataflow.googleapis.com/job/user_counter. If you rely
on the custom metrics, you can delete old / unused metric descriptors. See
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
and
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:10:27.129Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number
of workers to 1 based on the rate of progress in the currently running stage(s).
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:10:45.397Z: JOB_MESSAGE_DETAILED: Workers have started
successfully.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:10:45.439Z: JOB_MESSAGE_DETAILED: Workers have started
successfully.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:17:59.116Z: JOB_MESSAGE_BASIC: Finished operation
read/Read/Impulse+read/Read/Map(<lambda at
iobase.py:898>)+ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/PairWithRestriction+ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/SplitWithSizing
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:17:59.209Z: JOB_MESSAGE_BASIC: Finished operation
write/Write/WriteImpl/DoOnce/Impulse+write/Write/WriteImpl/DoOnce/FlatMap(<lambda
at
core.py:3228>)+write/Write/WriteImpl/DoOnce/Map(decode)+write/Write/WriteImpl/InitializeWrite
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:17:59.212Z: JOB_MESSAGE_DEBUG: Value
"ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7-split-with-sizing-out3"
materialized.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:17:59.317Z: JOB_MESSAGE_BASIC: Executing operation
ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/ProcessElementAndRestrictionWithSizing+points+CombinePerKey(sum)/GroupByKey+CombinePerKey(sum)/Combine/Partial+CombinePerKey(sum)/GroupByKey/Write
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:17:59.355Z: JOB_MESSAGE_DEBUG: Value
"write/Write/WriteImpl/DoOnce/Map(decode).None" materialized.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:17:59.388Z: JOB_MESSAGE_DEBUG: Value
"write/Write/WriteImpl/InitializeWrite.None" materialized.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:17:59.459Z: JOB_MESSAGE_BASIC: Executing operation
write/Write/WriteImpl/FinalizeWrite/View-python_side_input0-write/Write/WriteImpl/FinalizeWrite
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:17:59.504Z: JOB_MESSAGE_BASIC: Executing operation
write/Write/WriteImpl/WriteBundles/View-python_side_input0-write/Write/WriteImpl/WriteBundles
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:17:59.515Z: JOB_MESSAGE_BASIC: Finished operation
write/Write/WriteImpl/FinalizeWrite/View-python_side_input0-write/Write/WriteImpl/FinalizeWrite
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:17:59.546Z: JOB_MESSAGE_BASIC: Executing operation
write/Write/WriteImpl/PreFinalize/View-python_side_input0-write/Write/WriteImpl/PreFinalize
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:17:59.551Z: JOB_MESSAGE_BASIC: Finished operation
write/Write/WriteImpl/WriteBundles/View-python_side_input0-write/Write/WriteImpl/WriteBundles
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:17:59.601Z: JOB_MESSAGE_DEBUG: Value
"write/Write/WriteImpl/FinalizeWrite/View-python_side_input0-write/Write/WriteImpl/FinalizeWrite.out"
materialized.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:17:59.612Z: JOB_MESSAGE_BASIC: Finished operation
write/Write/WriteImpl/PreFinalize/View-python_side_input0-write/Write/WriteImpl/PreFinalize
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:17:59.637Z: JOB_MESSAGE_DEBUG: Value
"write/Write/WriteImpl/WriteBundles/View-python_side_input0-write/Write/WriteImpl/WriteBundles.out"
materialized.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:17:59.690Z: JOB_MESSAGE_DEBUG: Value
"write/Write/WriteImpl/PreFinalize/View-python_side_input0-write/Write/WriteImpl/PreFinalize.out"
materialized.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:18:04.918Z: JOB_MESSAGE_BASIC: Finished operation
ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/ProcessElementAndRestrictionWithSizing+points+CombinePerKey(sum)/GroupByKey+CombinePerKey(sum)/Combine/Partial+CombinePerKey(sum)/GroupByKey/Write
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:18:05.028Z: JOB_MESSAGE_BASIC: Executing operation
CombinePerKey(sum)/GroupByKey/Close
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:18:05.093Z: JOB_MESSAGE_BASIC: Finished operation
CombinePerKey(sum)/GroupByKey/Close
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:18:05.168Z: JOB_MESSAGE_BASIC: Executing operation
CombinePerKey(sum)/GroupByKey/Read+CombinePerKey(sum)/Combine+CombinePerKey(sum)/Combine/Extract+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/GroupByKey/Write
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:18:06.340Z: JOB_MESSAGE_BASIC: Finished operation
CombinePerKey(sum)/GroupByKey/Read+CombinePerKey(sum)/Combine+CombinePerKey(sum)/Combine/Extract+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/GroupByKey/Write
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:18:06.418Z: JOB_MESSAGE_BASIC: Executing operation
write/Write/WriteImpl/GroupByKey/Close
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:18:06.486Z: JOB_MESSAGE_BASIC: Finished operation
write/Write/WriteImpl/GroupByKey/Close
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:18:06.573Z: JOB_MESSAGE_BASIC: Executing operation
write/Write/WriteImpl/GroupByKey/Read+write/Write/WriteImpl/Extract
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:18:08.834Z: JOB_MESSAGE_BASIC: Finished operation
write/Write/WriteImpl/GroupByKey/Read+write/Write/WriteImpl/Extract
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:18:08.946Z: JOB_MESSAGE_DEBUG: Value
"write/Write/WriteImpl/Extract.None" materialized.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:18:09.095Z: JOB_MESSAGE_BASIC: Executing operation
write/Write/WriteImpl/FinalizeWrite/View-python_side_input1-write/Write/WriteImpl/FinalizeWrite
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:18:09.128Z: JOB_MESSAGE_BASIC: Executing operation
write/Write/WriteImpl/PreFinalize/View-python_side_input1-write/Write/WriteImpl/PreFinalize
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:18:09.151Z: JOB_MESSAGE_BASIC: Finished operation
write/Write/WriteImpl/FinalizeWrite/View-python_side_input1-write/Write/WriteImpl/FinalizeWrite
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:18:09.204Z: JOB_MESSAGE_BASIC: Finished operation
write/Write/WriteImpl/PreFinalize/View-python_side_input1-write/Write/WriteImpl/PreFinalize
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:18:09.245Z: JOB_MESSAGE_DEBUG: Value
"write/Write/WriteImpl/FinalizeWrite/View-python_side_input1-write/Write/WriteImpl/FinalizeWrite.out"
materialized.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:18:09.285Z: JOB_MESSAGE_DEBUG: Value
"write/Write/WriteImpl/PreFinalize/View-python_side_input1-write/Write/WriteImpl/PreFinalize.out"
materialized.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:18:09.398Z: JOB_MESSAGE_BASIC: Executing operation
write/Write/WriteImpl/PreFinalize
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:18:11.954Z: JOB_MESSAGE_BASIC: Finished operation
write/Write/WriteImpl/PreFinalize
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:18:12.050Z: JOB_MESSAGE_DEBUG: Value
"write/Write/WriteImpl/PreFinalize.None" materialized.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:18:12.140Z: JOB_MESSAGE_BASIC: Executing operation
write/Write/WriteImpl/FinalizeWrite/View-python_side_input2-write/Write/WriteImpl/FinalizeWrite
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:18:12.196Z: JOB_MESSAGE_BASIC: Finished operation
write/Write/WriteImpl/FinalizeWrite/View-python_side_input2-write/Write/WriteImpl/FinalizeWrite
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:18:12.276Z: JOB_MESSAGE_DEBUG: Value
"write/Write/WriteImpl/FinalizeWrite/View-python_side_input2-write/Write/WriteImpl/FinalizeWrite.out"
materialized.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:18:12.378Z: JOB_MESSAGE_BASIC: Executing operation
write/Write/WriteImpl/FinalizeWrite
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:18:14.155Z: JOB_MESSAGE_BASIC: Finished operation
write/Write/WriteImpl/FinalizeWrite
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:18:14.235Z: JOB_MESSAGE_DEBUG: Executing success step success32
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:18:14.358Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:18:14.425Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:18:14.471Z: JOB_MESSAGE_BASIC: Stopping worker pool...
[32mINFO [0m
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:20:35.891Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker
pool from 1 to 0.
[32mINFO [0m
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:20:35.953Z: JOB_MESSAGE_BASIC: Worker pool stopped.
[32mINFO [0m
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-11T20:20:36.004Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
[32mINFO [0m
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:191 Job
2022-02-11_12_09_01-12786216560451028583 is in state JOB_STATE_DONE
=============================== warnings summary ===============================
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/tenacity/_asyncio.py>:42: DeprecationWarning: "@coroutine" decorator is deprecated since Python 3.8, use "async def" instead
    def call(self, fn, *args, **kwargs):
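The tenacity warning above is the standard Python migration away from generator-based coroutines. A minimal sketch of what that migration looks like (the coroutine below is a made-up example, not tenacity's actual `call`):

```python
import asyncio

# Deprecated since Python 3.8 (and removed in Python 3.11):
#
#     @asyncio.coroutine
#     def do_work():
#         yield from asyncio.sleep(0)
#         return 42
#
# The replacement is a native coroutine using `async def` / `await`:
async def do_work():
    await asyncio.sleep(0)
    return 42

result = asyncio.run(do_work())
```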
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:63: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    dataset_ref = client.dataset(unique_dataset_name, project=project)
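The `Client.dataset` warning points at the fully qualified string form. A hedged sketch of that migration (the project and dataset names are hypothetical, and the commented-out calls assume google-cloud-bigquery is installed):

```python
project = "my_project"         # hypothetical project id
dataset_name = "my_dataset"    # hypothetical dataset name

# Preferred: build a fully qualified "project.dataset" string.
dataset_id = f"{project}.{dataset_name}"

# With google-cloud-bigquery available, that string replaces the
# deprecated client.dataset(dataset_name, project=project) call:
#
#     from google.cloud import bigquery
#     ref = bigquery.DatasetReference.from_string(dataset_id)
#     dataset = bigquery.Client().get_dataset(dataset_id)
```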
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2138: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    is_streaming_pipeline = p.options.view_as(StandardOptions).streaming
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2144: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    experiments = p.options.view_as(DebugOptions).experiments or []
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1128: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1130: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2437: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = pcoll.pipeline.options.view_as(
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2439: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2463: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    pipeline_options=pcoll.pipeline.options,
-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/pytest_postCommitIT-df-py38-no-xdist.xml> -
===== 4 failed, 3 passed, 5204 deselected, 12 warnings in 4866.80 seconds ======
> Task :sdks:python:test-suites:dataflow:py38:examples FAILED
FAILURE: Build failed with an exception.
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 182
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:examples'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1h 46m 34s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/5wgw2sv6ocw7q
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]