See
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/2813/display/redirect>
Changes:
------------------------------------------
[...truncated 22.07 MB...]
    },
    {
      "kind": "Flatten",
      "name": "s42",
      "properties": {
        "display_data": [],
        "inputs": [
          {
            "@type": "OutputReference",
            "output_name": "None",
            "step_name": "s39"
          },
          {
            "@type": "OutputReference",
            "output_name": "None",
            "step_name": "s25"
          }
        ],
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$QlpoOTFBWSZTWYQR6NMAAEDXwH8QgCEJAEBAv279AmAAIABqEqnqGgaABpoyNAGVHpGgMgDQBk0oR6IeoECBiqqU5NY23ndshzT2UPUOGrg42YPi9VyA8lbwwPJgtghxs5Qq1aWwExCDeMa0RHC2QigTCdizz1nx+LuSKcKEhCCPRpg=",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$QlpoOTFBWSZTWYQR6NMAAEDXwH8QgCEJAEBAv279AmAAIABqEqnqGgaABpoyNAGVHpGgMgDQBk0oR6IeoECBiqqU5NY23ndshzT2UPUOGrg42YPi9VyA8lbwwPJgtghxs5Qq1aWwExCDeMa0RHC2QigTCdizz1nx+LuSKcKEhCCPRpg=",
                      "component_encodings": [],
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_5"
                    },
                    {
                      "@type": "FastPrimitivesCoder$QlpoOTFBWSZTWYQR6NMAAEDXwH8QgCEJAEBAv279AmAAIABqEqnqGgaABpoyNAGVHpGgMgDQBk0oR6IeoECBiqqU5NY23ndshzT2UPUOGrg42YPi9VyA8lbwwPJgtghxs5Qq1aWwExCDeMa0RHC2QigTCdizz1nx+LuSKcKEhCCPRpg=",
                      "component_encodings": [],
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_5"
                    }
                  ],
                  "is_pair_like": true,
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_5"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "write/BigQueryBatchFileLoads/Flatten.out"
          }
        ],
        "user_name": "write/BigQueryBatchFileLoads/Flatten"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
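For context, step names like read/_PassThroughThenCleanup and
write/BigQueryBatchFileLoads in this graph are what the Beam Python SDK
generates for ReadFromBigQuery and WriteToBigQuery using file loads. A minimal
sketch of a pipeline with that shape (the query, schema, table, and bucket
names here are invented, not what the postcommit suite actually uses):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(
        runner='DataflowRunner',
        project='apache-beam-testing',
        region='us-central1',
        temp_location='gs://example-bucket/temp')  # hypothetical bucket

    with beam.Pipeline(options=options) as p:
        rows = p | 'read' >> beam.io.ReadFromBigQuery(
            query='SELECT 1 AS f')  # placeholder query
        _ = rows | 'write' >> beam.io.WriteToBigQuery(
            'example_dataset.example_table',  # hypothetical destination
            schema='f:INTEGER',  # matches the placeholder query
            method=beam.io.WriteToBigQuery.Method.FILE_LOADS)
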
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
createTime: '2020-09-07T18:39:59.626714Z'
currentStateTime: '1970-01-01T00:00:00Z'
id: '2020-09-07_11_39_58-2223130536642369008'
location: 'us-central1'
name: 'beamapp-jenkins-0907183950-702496'
projectId: 'apache-beam-testing'
stageStates: []
startTime: '2020-09-07T18:39:59.626714Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id:
[2020-09-07_11_39_58-2223130536642369008]
apache_beam.runners.dataflow.internal.apiclient: INFO: Submitted job:
2020-09-07_11_39_58-2223130536642369008
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow
monitoring console, please navigate to
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-07_11_39_58-2223130536642369008?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job
2020-09-07_11_39_58-2223130536642369008 is in state JOB_STATE_RUNNING
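The job id and region in these messages are enough to poll the job from
outside the test harness; a sketch using the Dataflow v1b3 REST API through
google-api-python-client (assumes application-default credentials are set up):

    from googleapiclient.discovery import build

    dataflow = build('dataflow', 'v1b3')
    job = dataflow.projects().locations().jobs().get(
        projectId='apache-beam-testing',
        location='us-central1',
        jobId='2020-09-07_11_39_58-2223130536642369008').execute()
    print(job['currentState'])  # JOB_STATE_RUNNING while the job is active
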
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:39:58.086Z:
JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job
2020-09-07_11_39_58-2223130536642369008.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:39:58.086Z:
JOB_MESSAGE_DETAILED: Autoscaling is enabled for job
2020-09-07_11_39_58-2223130536642369008. The number of workers will be between
1 and 1000.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:03.044Z:
JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-a.
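The 1-to-1000 worker bound and the n1-standard-1 machine type reported above
are service defaults; they can be pinned explicitly through Beam's worker
options, as in this sketch (the values chosen are illustrative):

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(
        num_workers=1,                 # initial pool size
        max_num_workers=10,            # autoscaling cap (service default: 1000)
        machine_type='n1-standard-1',  # worker VM type
        autoscaling_algorithm='THROUGHPUT_BASED')
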
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:03.678Z:
JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:03.707Z:
JOB_MESSAGE_DEBUG: Combiner lifting skipped for step
write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables: GroupByKey not
followed by a combiner.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:03.747Z:
JOB_MESSAGE_DEBUG: Combiner lifting skipped for step
write/BigQueryBatchFileLoads/GroupFilesByTableDestinations: GroupByKey not
followed by a combiner.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:03.781Z:
JOB_MESSAGE_DEBUG: Combiner lifting skipped for step
write/BigQueryBatchFileLoads/GroupShardedRows: GroupByKey not followed by a
combiner.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:03.835Z:
JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:03.910Z:
JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into
MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:04.034Z:
JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:04.089Z:
JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:04.132Z:
JOB_MESSAGE_DETAILED: Unzipping flatten s20 for input s14.WrittenFiles
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:04.248Z:
JOB_MESSAGE_DETAILED: Fusing unzipped copy of
write/BigQueryBatchFileLoads/IdentityWorkaround, through flatten
write/BigQueryBatchFileLoads/DestinationFilesUnion, into producer
write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:04.294Z:
JOB_MESSAGE_DETAILED: Fusing consumer
write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write into
write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:04.329Z:
JOB_MESSAGE_DETAILED: Fusing consumer
write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow into
write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:04.361Z:
JOB_MESSAGE_DETAILED: Fusing consumer
write/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles) into
write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:04.395Z:
JOB_MESSAGE_DETAILED: Fusing consumer
write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)
into write/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:04.431Z:
JOB_MESSAGE_DETAILED: Fusing consumer
write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/TriggerLoadJobsWithoutTempTables
into write/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:04.467Z:
JOB_MESSAGE_DETAILED: Unzipping flatten s20-u32 for input s21.None-c30
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:04.503Z:
JOB_MESSAGE_DETAILED: Fusing unzipped copy of
write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify, through
flatten write/BigQueryBatchFileLoads/DestinationFilesUnion/Unzipped-1, into
producer write/BigQueryBatchFileLoads/IdentityWorkaround
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:04.535Z:
JOB_MESSAGE_DETAILED: Fusing consumer
write/BigQueryBatchFileLoads/IdentityWorkaround into
write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:04.569Z:
JOB_MESSAGE_DETAILED: Fusing consumer
write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify into
write/BigQueryBatchFileLoads/IdentityWorkaround
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:04.598Z:
JOB_MESSAGE_DETAILED: Fusing consumer
read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough) into
read/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:04.627Z:
JOB_MESSAGE_DETAILED: Fusing consumer
write/BigQueryBatchFileLoads/RewindowIntoGlobal into
read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:04.663Z:
JOB_MESSAGE_DETAILED: Fusing consumer
write/BigQueryBatchFileLoads/AppendDestination into
write/BigQueryBatchFileLoads/RewindowIntoGlobal
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:04.699Z:
JOB_MESSAGE_DETAILED: Fusing consumer
write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)
into write/BigQueryBatchFileLoads/AppendDestination
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:04.726Z:
JOB_MESSAGE_DETAILED: Fusing consumer
write/BigQueryBatchFileLoads/ParDo(_ShardDestinations) into
write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:04.760Z:
JOB_MESSAGE_DETAILED: Fusing consumer
write/BigQueryBatchFileLoads/GroupShardedRows/Reify into
write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:04.794Z:
JOB_MESSAGE_DETAILED: Fusing consumer
write/BigQueryBatchFileLoads/GroupShardedRows/Write into
write/BigQueryBatchFileLoads/GroupShardedRows/Reify
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:04.818Z:
JOB_MESSAGE_DETAILED: Fusing consumer
write/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow into
write/BigQueryBatchFileLoads/GroupShardedRows/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:04.851Z:
JOB_MESSAGE_DETAILED: Fusing consumer
write/BigQueryBatchFileLoads/DropShardNumber into
write/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:04.885Z:
JOB_MESSAGE_DETAILED: Fusing consumer
write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile
into write/BigQueryBatchFileLoads/DropShardNumber
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:04.921Z:
JOB_MESSAGE_DETAILED: Fusing consumer
write/BigQueryBatchFileLoads/GenerateFilePrefix into
write/BigQueryBatchFileLoads/ImpulseSingleElementPC/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:04.956Z:
JOB_MESSAGE_DETAILED: Fusing consumer
write/BigQueryBatchFileLoads/LoadJobNamePrefix into
write/BigQueryBatchFileLoads/ImpulseSingleElementPC/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:05.002Z:
JOB_MESSAGE_DETAILED: Fusing consumer
write/BigQueryBatchFileLoads/CopyJobNamePrefix into
write/BigQueryBatchFileLoads/ImpulseSingleElementPC/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:05.038Z:
JOB_MESSAGE_DETAILED: Fusing siblings
write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs
and
write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:05.075Z:
JOB_MESSAGE_DETAILED: Fusing consumer
write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs) into
write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:05.108Z:
JOB_MESSAGE_DETAILED: Fusing consumer
read/_PassThroughThenCleanup/ParDo(RemoveJsonFiles)/ParDo(RemoveJsonFiles) into
read/_PassThroughThenCleanup/Create/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:05.142Z:
JOB_MESSAGE_DETAILED: Fusing consumer
write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables into
write/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:05.175Z:
JOB_MESSAGE_DETAILED: Fusing consumer
write/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue into
write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:05.230Z:
JOB_MESSAGE_DETAILED: Fusing consumer
write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify into
write/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:05.266Z:
JOB_MESSAGE_DETAILED: Fusing consumer
write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write into
write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:05.301Z:
JOB_MESSAGE_DETAILED: Fusing consumer
write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow
into write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:05.335Z:
JOB_MESSAGE_DETAILED: Fusing consumer
write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames into
write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:05.375Z:
JOB_MESSAGE_DETAILED: Fusing consumer
write/BigQueryBatchFileLoads/RemoveTempTables/Delete into
write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:05.414Z:
JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:05.447Z:
JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:05.481Z:
JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:05.517Z:
JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:05.808Z:
JOB_MESSAGE_DEBUG: Executing wait step start47
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:05.883Z:
JOB_MESSAGE_BASIC: Executing operation
write/BigQueryBatchFileLoads/ImpulseSingleElementPC/Read+write/BigQueryBatchFileLoads/GenerateFilePrefix+write/BigQueryBatchFileLoads/LoadJobNamePrefix+write/BigQueryBatchFileLoads/CopyJobNamePrefix
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:05.916Z:
JOB_MESSAGE_BASIC: Executing operation
write/BigQueryBatchFileLoads/GroupShardedRows/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:05.939Z:
JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:05.942Z:
JOB_MESSAGE_BASIC: Executing operation
write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:05.977Z:
JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:05.977Z:
JOB_MESSAGE_BASIC: Executing operation
write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:06.014Z:
JOB_MESSAGE_BASIC: Executing operation
write/BigQueryBatchFileLoads/ImpulseEmptyPC/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:06.041Z:
JOB_MESSAGE_BASIC: Finished operation
write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:06.041Z:
JOB_MESSAGE_BASIC: Finished operation
write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:06.041Z:
JOB_MESSAGE_BASIC: Finished operation
write/BigQueryBatchFileLoads/GroupShardedRows/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:06.047Z:
JOB_MESSAGE_BASIC: Finished operation
write/BigQueryBatchFileLoads/ImpulseEmptyPC/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:06.102Z:
JOB_MESSAGE_DEBUG: Value
"write/BigQueryBatchFileLoads/GroupShardedRows/Session" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:06.136Z:
JOB_MESSAGE_DEBUG: Value
"write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Session"
materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:06.170Z:
JOB_MESSAGE_DEBUG: Value
"write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Session"
materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:06.203Z:
JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/ImpulseEmptyPC/Read.out"
materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:40:29.306Z:
JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric
descriptors and Stackdriver will not create new Dataflow custom metrics for
this job. Each unique user-defined metric name (independent of the DoFn in
which it is defined) produces a new metric descriptor. To delete old / unused
metric descriptors see
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
and
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
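One way to clear out stale descriptors is the Cloud Monitoring client
library; a sketch (the custom.googleapis.com/dataflow prefix is an assumption
about how Dataflow names the descriptors it creates, so verify the listing
before deleting anything):

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project_name = 'projects/apache-beam-testing'
    for descriptor in client.list_metric_descriptors(name=project_name):
        # Assumed prefix for Dataflow-created custom metrics; check first.
        if descriptor.type.startswith('custom.googleapis.com/dataflow'):
            client.delete_metric_descriptor(name=descriptor.name)
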
oauth2client.transport: INFO: Refreshing due to a 401 (attempt 1/2)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:49:50.257Z:
JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:50:22.845Z:
JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on
the rate of progress in the currently running stage(s).
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:50:52.337Z:
JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:53:49.419Z:
JOB_MESSAGE_BASIC: Finished operation
write/BigQueryBatchFileLoads/ImpulseSingleElementPC/Read+write/BigQueryBatchFileLoads/GenerateFilePrefix+write/BigQueryBatchFileLoads/LoadJobNamePrefix+write/BigQueryBatchFileLoads/CopyJobNamePrefix
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:53:49.703Z:
JOB_MESSAGE_DEBUG: Value
"write/BigQueryBatchFileLoads/ImpulseSingleElementPC/Read.out" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:53:49.737Z:
JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/GenerateFilePrefix.out"
materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:53:49.772Z:
JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/LoadJobNamePrefix.out"
materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:53:49.805Z:
JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/CopyJobNamePrefix.out"
materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:53:49.839Z:
JOB_MESSAGE_BASIC: Executing operation
write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(GenerateFilePrefix.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:53:49.866Z:
JOB_MESSAGE_BASIC: Executing operation
write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:53:49.901Z:
JOB_MESSAGE_BASIC: Finished operation
write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(GenerateFilePrefix.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:53:49.922Z:
JOB_MESSAGE_BASIC: Finished operation
write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:53:49.923Z:
JOB_MESSAGE_BASIC: Executing operation
write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(LoadJobNamePrefix.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:53:49.958Z:
JOB_MESSAGE_BASIC: Executing operation
write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(LoadJobNamePrefix.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:53:49.974Z:
JOB_MESSAGE_BASIC: Finished operation
write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(LoadJobNamePrefix.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:53:49.990Z:
JOB_MESSAGE_BASIC: Executing operation
write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(CopyJobNamePrefix.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:53:50.009Z:
JOB_MESSAGE_BASIC: Finished operation
write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(LoadJobNamePrefix.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:53:50.027Z:
JOB_MESSAGE_DEBUG: Value
"write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(GenerateFilePrefix.out.0).output"
materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:53:50.049Z:
JOB_MESSAGE_BASIC: Finished operation
write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(CopyJobNamePrefix.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:53:50.057Z:
JOB_MESSAGE_DEBUG: Value
"write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0).output"
materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:53:50.089Z:
JOB_MESSAGE_DEBUG: Value
"write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(LoadJobNamePrefix.out.0).output"
materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:53:50.113Z:
JOB_MESSAGE_DEBUG: Value
"write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(LoadJobNamePrefix.out.0).output"
materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:53:50.147Z:
JOB_MESSAGE_DEBUG: Value
"write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(CopyJobNamePrefix.out.0).output"
materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:53:50.192Z:
JOB_MESSAGE_BASIC: Executing operation
read/Read+read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)+write/BigQueryBatchFileLoads/RewindowIntoGlobal+write/BigQueryBatchFileLoads/AppendDestination+write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+write/BigQueryBatchFileLoads/IdentityWorkaround+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+write/BigQueryBatchFileLoads/GroupShardedRows/Reify+write/BigQueryBatchFileLoads/GroupShardedRows/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:54:10.793Z:
JOB_MESSAGE_BASIC: Finished operation
read/Read+read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)+write/BigQueryBatchFileLoads/RewindowIntoGlobal+write/BigQueryBatchFileLoads/AppendDestination+write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+write/BigQueryBatchFileLoads/IdentityWorkaround+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+write/BigQueryBatchFileLoads/GroupShardedRows/Reify+write/BigQueryBatchFileLoads/GroupShardedRows/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:54:10.861Z:
JOB_MESSAGE_DEBUG: Value
"read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough).cleanup_signal"
materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:54:10.894Z:
JOB_MESSAGE_BASIC: Executing operation
write/BigQueryBatchFileLoads/GroupShardedRows/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:54:10.940Z:
JOB_MESSAGE_BASIC: Finished operation
write/BigQueryBatchFileLoads/GroupShardedRows/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:54:11.025Z:
JOB_MESSAGE_BASIC: Executing operation
read/_PassThroughThenCleanup/ParDo(RemoveJsonFiles)/_UnpickledSideInput(ParDo(PassThrough).cleanup_signal.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:54:11.063Z:
JOB_MESSAGE_BASIC: Executing operation
write/BigQueryBatchFileLoads/GroupShardedRows/Read+write/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow+write/BigQueryBatchFileLoads/DropShardNumber+write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile+write/BigQueryBatchFileLoads/IdentityWorkaround+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:54:11.081Z:
JOB_MESSAGE_BASIC: Finished operation
read/_PassThroughThenCleanup/ParDo(RemoveJsonFiles)/_UnpickledSideInput(ParDo(PassThrough).cleanup_signal.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:54:11.148Z:
JOB_MESSAGE_DEBUG: Value
"read/_PassThroughThenCleanup/ParDo(RemoveJsonFiles)/_UnpickledSideInput(ParDo(PassThrough).cleanup_signal.0).output"
materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:54:11.204Z:
JOB_MESSAGE_BASIC: Executing operation
read/_PassThroughThenCleanup/Create/Read+read/_PassThroughThenCleanup/ParDo(RemoveJsonFiles)/ParDo(RemoveJsonFiles)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:54:13.359Z:
JOB_MESSAGE_BASIC: Finished operation
write/BigQueryBatchFileLoads/GroupShardedRows/Read+write/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow+write/BigQueryBatchFileLoads/DropShardNumber+write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile+write/BigQueryBatchFileLoads/IdentityWorkaround+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:54:13.430Z:
JOB_MESSAGE_BASIC: Executing operation
write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:54:13.483Z:
JOB_MESSAGE_BASIC: Finished operation
write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:54:13.555Z:
JOB_MESSAGE_BASIC: Executing operation
write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow+write/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles)+write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)+write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/TriggerLoadJobsWithoutTempTables
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:54:27.233Z:
JOB_MESSAGE_BASIC: Finished operation
read/_PassThroughThenCleanup/Create/Read+read/_PassThroughThenCleanup/ParDo(RemoveJsonFiles)/ParDo(RemoveJsonFiles)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:54:30.751Z:
JOB_MESSAGE_BASIC: Finished operation
write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow+write/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles)+write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)+write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/TriggerLoadJobsWithoutTempTables
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:54:30.823Z:
JOB_MESSAGE_DEBUG: Value
"write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs).out"
materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:54:30.854Z:
JOB_MESSAGE_DEBUG: Value
"write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs).TemporaryTables"
materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:54:30.888Z:
JOB_MESSAGE_DEBUG: Value
"write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables.out"
materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:54:30.936Z:
JOB_MESSAGE_BASIC: Executing operation
write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:54:30.970Z:
JOB_MESSAGE_BASIC: Executing operation
write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:54:30.987Z:
JOB_MESSAGE_BASIC: Finished operation
write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:54:31.001Z:
JOB_MESSAGE_BASIC: Executing operation
write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:54:31.027Z:
JOB_MESSAGE_BASIC: Finished operation
write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:54:31.032Z:
JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/Flatten
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:54:31.049Z:
JOB_MESSAGE_BASIC: Finished operation
write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:54:31.057Z:
JOB_MESSAGE_DEBUG: Value
"write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0).output"
materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:54:31.094Z:
JOB_MESSAGE_DEBUG: Value
"write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0).output"
materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:54:31.106Z:
JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/Flatten
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:54:31.116Z:
JOB_MESSAGE_DEBUG: Value
"write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0).output"
materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:54:31.169Z:
JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/Flatten.out"
materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:54:31.205Z:
JOB_MESSAGE_BASIC: Executing operation
write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs+write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:54:34.478Z:
JOB_MESSAGE_BASIC: Finished operation
write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs+write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:54:34.546Z:
JOB_MESSAGE_DEBUG: Value
"write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs).out" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:54:34.616Z:
JOB_MESSAGE_BASIC: Executing operation
write/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:54:34.669Z:
JOB_MESSAGE_BASIC: Finished operation
write/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:54:34.731Z:
JOB_MESSAGE_DEBUG: Value
"write/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0).output"
materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:54:34.798Z:
JOB_MESSAGE_BASIC: Executing operation
write/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs+write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+write/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:54:35.493Z:
JOB_MESSAGE_BASIC: Finished operation
write/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs+write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+write/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:54:35.558Z:
JOB_MESSAGE_BASIC: Executing operation
write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:54:35.604Z:
JOB_MESSAGE_BASIC: Finished operation
write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:54:35.668Z:
JOB_MESSAGE_BASIC: Executing operation
write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow+write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames+write/BigQueryBatchFileLoads/RemoveTempTables/Delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:54:37.603Z:
JOB_MESSAGE_BASIC: Finished operation
write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow+write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames+write/BigQueryBatchFileLoads/RemoveTempTables/Delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:54:37.660Z:
JOB_MESSAGE_DEBUG: Executing success step success45
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:54:37.729Z:
JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:54:37.785Z:
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-09-07T18:54:37.819Z:
JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting
for job 2020-09-07_11_39_58-2223130536642369008 after 901 seconds
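This warning is what DataflowRunner emits when wait_until_finish() is called
with a bounded duration and the job has not reached a terminal state in time;
the runner then returns the job's current state instead of a terminal one. The
pattern, sketched with a placeholder pipeline (duration is in milliseconds):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    p = beam.Pipeline(options=PipelineOptions(runner='DataflowRunner'))
    _ = p | beam.Create(['placeholder'])  # stand-in for the real transforms
    result = p.run()
    # After ~900s this logs "Timing out on waiting for job ..." (as above)
    # and returns whatever state the job is in at that moment.
    state = result.wait_until_finish(duration=15 * 60 * 1000)
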
--------------------- >> end captured logging << ---------------------
----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py37.xml
----------------------------------------------------------------------
XML:
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 66 tests in 4853.974s
FAILED (SKIP=7, failures=2)
> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED
FAILURE: Build failed with an exception.
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 118
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 1h 25m 46s
171 actionable tasks: 131 executed, 36 from cache, 4 up-to-date
Publishing build scan...
https://gradle.com/s/tki4ql7m2jybs
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure