See
<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/398/display/redirect?page=changes>
Changes:
[github] [BEAM-9674] Don't specify selected fields when fetching BigQuery table
------------------------------------------
[...truncated 5.55 MB...]
"type": "STRING",
"value": "apache_beam.transforms.core.CallableWrapperDoFn"
}
],
"non_parallel_inputs": {},
"output_info": [
{
"encoding": {
"@type": "kind:windowed_value",
"component_encodings": [
{
"@type":
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
"component_encodings": [
{
"@type":
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
"component_encodings": [],
"pipeline_proto_coder_id":
"ref_Coder_FastPrimitivesCoder_4"
},
{
"@type":
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
"component_encodings": [],
"pipeline_proto_coder_id":
"ref_Coder_FastPrimitivesCoder_4"
}
],
"is_pair_like": true,
"pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
},
{
"@type": "kind:global_window"
}
],
"is_wrapper": true
},
"output_name": "None",
"user_name": "assert_that/Unkey.out"
}
],
"parallel_input": {
"@type": "OutputReference",
"output_name": "None",
"step_name": "s19"
},
"serialized_fn": "ref_AppliedPTransform_assert_that/Unkey_29",
"user_name": "assert_that/Unkey"
}
},
{
"kind": "ParallelDo",
"name": "s21",
"properties": {
"display_data": [
{
"key": "fn",
"label": "Transform Function",
"namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
"type": "STRING",
"value": "_equal"
},
{
"key": "fn",
"label": "Transform Function",
"namespace": "apache_beam.transforms.core.ParDo",
"shortValue": "CallableWrapperDoFn",
"type": "STRING",
"value": "apache_beam.transforms.core.CallableWrapperDoFn"
}
],
"non_parallel_inputs": {},
"output_info": [
{
"encoding": {
"@type": "kind:windowed_value",
"component_encodings": [
{
"@type":
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
"component_encodings": [
{
"@type":
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
"component_encodings": [],
"pipeline_proto_coder_id":
"ref_Coder_FastPrimitivesCoder_4"
},
{
"@type":
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
"component_encodings": [],
"pipeline_proto_coder_id":
"ref_Coder_FastPrimitivesCoder_4"
}
],
"is_pair_like": true,
"pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
},
{
"@type": "kind:global_window"
}
],
"is_wrapper": true
},
"output_name": "None",
"user_name": "assert_that/Match.out"
}
],
"parallel_input": {
"@type": "OutputReference",
"output_name": "None",
"step_name": "s20"
},
"serialized_fn": "ref_AppliedPTransform_assert_that/Match_30",
"user_name": "assert_that/Match"
}
}
],
"type": "JOB_TYPE_STREAMING"
}
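
The assert_that/Unkey and assert_that/Match steps in the job description above come from Beam's pipeline-assertion helpers, and the Input/Impulse, Input/FlatMap and Input/Map(decode) steps are the expansion of a Create transform. As a rough sketch only (illustrative element values, not the failing test's actual data), a pipeline of this shape is built like:

    import apache_beam as beam
    from apache_beam.testing.test_pipeline import TestPipeline
    from apache_beam.testing.util import assert_that, equal_to

    # assert_that expands into the ToVoidKey, Group, Unkey and Match steps
    # seen in the job graph; Create expands into Impulse, FlatMap and
    # Map(decode). Values below are purely illustrative.
    with TestPipeline() as p:
        result = (p
                  | 'Input' >> beam.Create([1, 2, 3])
                  | 'ApplyPardo' >> beam.Map(lambda x: x * 2))
        assert_that(result, equal_to([2, 4, 6]))
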
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
createTime: u'2020-04-17T15:18:01.220846Z'
currentStateTime: u'1970-01-01T00:00:00Z'
id: u'2020-04-17_08_17_59-18242691682725659369'
location: u'us-central1'
name: u'beamapp-jenkins-0417151743-732349'
projectId: u'apache-beam-testing'
stageStates: []
startTime: u'2020-04-17T15:18:01.220846Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id:
[2020-04-17_08_17_59-18242691682725659369]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow
monitoring console, please navigate to
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_08_17_59-18242691682725659369?project=apache-beam-testing
apache_beam.runners.dataflow.test_dataflow_runner: WARNING: Waiting
indefinitely for streaming job.
apache_beam.runners.dataflow.dataflow_runner: INFO: Job
2020-04-17_08_17_59-18242691682725659369 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:17:59.579Z:
JOB_MESSAGE_DETAILED: Autoscaling is enabled for job
2020-04-17_08_17_59-18242691682725659369. The number of workers will be between
1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:17:59.579Z:
JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine.
Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:17:59.579Z:
JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job
2020-04-17_08_17_59-18242691682725659369.
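
The 1-to-100 range in the autoscaling messages above is the default applied when no worker cap is supplied. A minimal sketch of the relevant pipeline options, with illustrative values (this run did not set --max_num_workers):

    from apache_beam.options.pipeline_options import PipelineOptions

    # --max_num_workers maps to the service-side maxNumWorkers setting
    # mentioned in the warning; all values here are illustrative only.
    options = PipelineOptions([
        '--runner=TestDataflowRunner',
        '--project=apache-beam-testing',
        '--region=us-central1',
        '--streaming',
        '--max_num_workers=10',
    ])
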
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:18:07.601Z:
JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service
Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:18:08.900Z:
JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-f.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:18:09.550Z:
JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable
parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:18:09.587Z:
JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into
optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:18:09.656Z:
JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:18:09.697Z:
JOB_MESSAGE_DEBUG: Combiner lifting skipped for step
assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:18:09.739Z:
JOB_MESSAGE_DEBUG: Combiner lifting skipped for step
Input/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not
followed by a combiner.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:18:09.782Z:
JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into
optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:18:09.810Z:
JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write
steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:18:09.930Z:
JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into
MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:18:10.027Z:
JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:18:10.107Z:
JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:18:10.150Z:
JOB_MESSAGE_DETAILED: Unzipping flatten s17 for input s15.None
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:18:10.182Z:
JOB_MESSAGE_DETAILED: Fusing unzipped copy of
assert_that/Group/GroupByKey/WriteStream, through flatten
assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:18:10.217Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/WriteStream
into assert_that/Group/pair_with_1
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:18:10.258Z:
JOB_MESSAGE_DETAILED: Fusing consumer
Input/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into
Input/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:18:10.297Z:
JOB_MESSAGE_DETAILED: Fusing consumer
Input/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into
Input/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:18:10.332Z:
JOB_MESSAGE_DETAILED: Fusing consumer
Input/MaybeReshuffle/Reshuffle/RemoveRandomKeys into
Input/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:18:10.364Z:
JOB_MESSAGE_DETAILED: Fusing consumer Input/Map(decode) into
Input/MaybeReshuffle/Reshuffle/RemoveRandomKeys
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:18:10.406Z:
JOB_MESSAGE_DETAILED: Fusing consumer ApplyPardo into Input/Map(decode)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:18:10.448Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into
ApplyPardo
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:18:10.488Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into
assert_that/WindowInto(WindowIntoFn)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:18:10.516Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into
assert_that/ToVoidKey
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:18:10.542Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/MergeBuckets
into assert_that/Group/GroupByKey/ReadStream
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:18:10.581Z:
JOB_MESSAGE_DETAILED: Fusing consumer
assert_that/Group/Map(_merge_tagged_vals_under_key) into
assert_that/Group/GroupByKey/MergeBuckets
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:18:10.620Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into
assert_that/Group/Map(_merge_tagged_vals_under_key)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:18:10.665Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:18:10.707Z:
JOB_MESSAGE_DETAILED: Fusing consumer Input/FlatMap(<lambda at core.py:2714>)
into Input/Impulse
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:18:10.748Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/FlatMap(<lambda at
core.py:2714>) into assert_that/Create/Impulse
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:18:10.783Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/Map(decode) into
assert_that/Create/FlatMap(<lambda at core.py:2714>)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:18:10.823Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into
assert_that/Create/Map(decode)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:18:10.862Z:
JOB_MESSAGE_DETAILED: Fusing consumer
Input/MaybeReshuffle/Reshuffle/AddRandomKeys into Input/FlatMap(<lambda at
core.py:2714>)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:18:10.901Z:
JOB_MESSAGE_DETAILED: Fusing consumer
Input/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into
Input/MaybeReshuffle/Reshuffle/AddRandomKeys
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:18:10.934Z:
JOB_MESSAGE_DETAILED: Fusing consumer
Input/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into
Input/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:18:10.978Z:
JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:18:11.009Z:
JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:18:11.040Z:
JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:18:11.073Z:
JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:18:13.375Z:
JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:18:13.408Z:
JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:18:13.443Z:
JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:18:34.420Z:
JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric
descriptors and Stackdriver will not create new Dataflow custom metrics for
this job. Each unique user-defined metric name (independent of the DoFn in
which it is defined) produces a new metric descriptor. To delete old / unused
metric descriptors see
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
and
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
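
Stale descriptors can be removed through the Monitoring API endpoints the warning links to. A hedged sketch using the google-cloud-monitoring client; the 'custom.googleapis.com/dataflow' type prefix is an assumption, so inspect what the list call returns before deleting anything:

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project_name = 'projects/apache-beam-testing'
    for descriptor in client.list_metric_descriptors(name=project_name):
        # Assumed prefix for Dataflow-created custom metrics; confirm first.
        if descriptor.type.startswith('custom.googleapis.com/dataflow'):
            client.delete_metric_descriptor(name=descriptor.name)
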
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:18:44.340Z:
JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that
the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:19:18.019Z:
JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:19:18.046Z:
JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:24:12.314Z:
JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service
Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:24:56.842Z:
JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:24:56.894Z:
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:24:56.923Z:
JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:24:56.973Z:
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:24:57.005Z:
JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:26:35.285Z:
JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on
low average worker CPU utilization, and the pipeline having sufficiently low
backlog and keeping up with input rate.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:26:35.347Z:
JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-17T15:26:35.385Z:
JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job
2020-04-17_08_17_59-18242691682725659369 is in state JOB_STATE_DONE
--------------------- >> end captured logging << ---------------------
----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML:
<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 27 tests in 2266.500s
FAILED (failures=1)
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_08_17_59-17022663198432437248?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_08_28_15-9443217446843233364?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_08_37_23-17371675503949286321?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_08_47_58-16309549145951032231?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_08_17_59-18242691682725659369?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_08_27_06-13591432093857567791?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_08_37_06-14920666951280719510?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_08_18_00-17267669916059648518?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_08_26_14-13980108608683528277?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_08_36_02-3364966737283538344?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_08_17_59-4761261400762588376?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_08_27_30-15844489780896366293?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_08_37_04-7504431314979180106?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_08_18_00-14497749530535792014?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_08_27_16-8354560654425024147?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_08_36_00-7956444223674888705?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_08_17_57-17557004477132285255?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_08_27_33-671969122095322537?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_08_36_37-95676805106627666?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_08_18_00-3489604969984768793?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_08_28_14-4514199258017026517?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_08_17_58-3135298841292343542?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_08_27_08-11214474187002971043?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_08_36_37-8445014582316297010?project=apache-beam-testing
> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 142
* What went wrong:
Execution failed for task
':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 1h 17m 32s
64 actionable tasks: 49 executed, 15 from cache
Publishing build scan...
https://gradle.com/s/3ejr2bj3tqm6i
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]