See 
<https://ci-beam.apache.org/job/beam_PostCommit_Python38/38/display/redirect?page=changes>

Changes:

[Kamil Wasilewski] [BEAM-9889] Publish InfluxDB backup to public GCS bucket

[Kamil Wasilewski] [BEAM-9889] Populate local instance of InfluxDB with data


------------------------------------------
[...truncated 13.73 MB...]
apache_beam.runners.dataflow.internal.apiclient: INFO: Submitted job: 
2020-07-15_05_30_14-159220807272197476
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow 
monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_05_30_14-159220807272197476?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 
2020-07-15_05_30_14-159220807272197476 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:14.835Z: 
JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 
2020-07-15_05_30_14-159220807272197476. The number of workers will be between 1 
and 1000.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:14.835Z: 
JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 
2020-07-15_05_30_14-159220807272197476.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:18.565Z: 
JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-a.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:19.362Z: 
JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:19.392Z: 
JOB_MESSAGE_DEBUG: Combiner lifting skipped for step 
write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables: GroupByKey not 
followed by a combiner.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:19.420Z: 
JOB_MESSAGE_DEBUG: Combiner lifting skipped for step 
write/BigQueryBatchFileLoads/GroupFilesByTableDestinations: GroupByKey not 
followed by a combiner.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:19.452Z: 
JOB_MESSAGE_DEBUG: Combiner lifting skipped for step 
write/BigQueryBatchFileLoads/GroupShardedRows: GroupByKey not followed by a 
combiner.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:19.492Z: 
JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:19.515Z: 
JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into 
MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:19.633Z: 
JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:19.681Z: 
JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:19.716Z: 
JOB_MESSAGE_DETAILED: Unzipping flatten s19 for input s13.WrittenFiles
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:19.736Z: 
JOB_MESSAGE_DETAILED: Fusing unzipped copy of 
write/BigQueryBatchFileLoads/IdentityWorkaround, through flatten 
write/BigQueryBatchFileLoads/DestinationFilesUnion, into producer 
write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:19.759Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write into 
write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:19.790Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow into 
write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:19.812Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
write/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles) into 
write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:19.849Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)
 into write/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:19.885Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/TriggerLoadJobsWithoutTempTables
 into write/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:19.912Z: 
JOB_MESSAGE_DETAILED: Unzipping flatten s19-u32 for input s20.None-c30
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:19.959Z: 
JOB_MESSAGE_DETAILED: Fusing unzipped copy of 
write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify, through 
flatten write/BigQueryBatchFileLoads/DestinationFilesUnion/Unzipped-1, into 
producer write/BigQueryBatchFileLoads/IdentityWorkaround
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:19.992Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
write/BigQueryBatchFileLoads/IdentityWorkaround into 
write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:20.027Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify into 
write/BigQueryBatchFileLoads/IdentityWorkaround
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:20.075Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough) into 
read/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:20.110Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
write/BigQueryBatchFileLoads/RewindowIntoGlobal into 
read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:20.140Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
write/BigQueryBatchFileLoads/AppendDestination into 
write/BigQueryBatchFileLoads/RewindowIntoGlobal
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:20.174Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)
 into write/BigQueryBatchFileLoads/AppendDestination
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:20.206Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
write/BigQueryBatchFileLoads/ParDo(_ShardDestinations) into 
write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:20.237Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
write/BigQueryBatchFileLoads/GroupShardedRows/Reify into 
write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:20.269Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
write/BigQueryBatchFileLoads/GroupShardedRows/Write into 
write/BigQueryBatchFileLoads/GroupShardedRows/Reify
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:20.303Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
write/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow into 
write/BigQueryBatchFileLoads/GroupShardedRows/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:20.337Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
write/BigQueryBatchFileLoads/DropShardNumber into 
write/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:20.373Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile
 into write/BigQueryBatchFileLoads/DropShardNumber
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:20.406Z: 
JOB_MESSAGE_DETAILED: Fusing consumer write/BigQueryBatchFileLoads/Map(<lambda 
at bigquery_file_loads.py:893>) into 
write/BigQueryBatchFileLoads/ImpulseSingleElementPC/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:20.436Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
write/BigQueryBatchFileLoads/GenerateFilePrefix into 
write/BigQueryBatchFileLoads/ImpulseSingleElementPC/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:20.467Z: 
JOB_MESSAGE_DETAILED: Fusing siblings 
write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs
 and 
write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:20.504Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs) into 
write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:20.537Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
read/_PassThroughThenCleanup/ParDo(RemoveJsonFiles)/ParDo(RemoveJsonFiles) into 
read/_PassThroughThenCleanup/Create/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:20.574Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables into 
write/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:20.609Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
write/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue into 
write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:20.642Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify into 
write/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:20.678Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write into 
write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:20.714Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow 
into write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:20.745Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames into 
write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:20.780Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
write/BigQueryBatchFileLoads/RemoveTempTables/Delete into 
write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:20.809Z: 
JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:20.843Z: 
JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:20.880Z: 
JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:20.913Z: 
JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:21.110Z: 
JOB_MESSAGE_DEBUG: Executing wait step start47
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:21.180Z: 
JOB_MESSAGE_BASIC: Executing operation 
write/BigQueryBatchFileLoads/ImpulseSingleElementPC/Read+write/BigQueryBatchFileLoads/Map(<lambda
 at bigquery_file_loads.py:893>)+write/BigQueryBatchFileLoads/GenerateFilePrefix
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:21.217Z: 
JOB_MESSAGE_BASIC: Executing operation 
write/BigQueryBatchFileLoads/GroupShardedRows/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:21.228Z: 
JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:21.252Z: 
JOB_MESSAGE_BASIC: Executing operation 
write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:21.263Z: 
JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:21.284Z: 
JOB_MESSAGE_BASIC: Executing operation 
write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:21.307Z: 
JOB_MESSAGE_BASIC: Executing operation 
write/BigQueryBatchFileLoads/ImpulseEmptyPC/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:21.318Z: 
JOB_MESSAGE_BASIC: Finished operation 
write/BigQueryBatchFileLoads/GroupShardedRows/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:21.318Z: 
JOB_MESSAGE_BASIC: Finished operation 
write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:21.332Z: 
JOB_MESSAGE_BASIC: Finished operation 
write/BigQueryBatchFileLoads/ImpulseEmptyPC/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:21.336Z: 
JOB_MESSAGE_BASIC: Finished operation 
write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:21.372Z: 
JOB_MESSAGE_DEBUG: Value 
"write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Session" 
materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:21.393Z: 
JOB_MESSAGE_DEBUG: Value 
"write/BigQueryBatchFileLoads/GroupShardedRows/Session" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:21.418Z: 
JOB_MESSAGE_DEBUG: Value 
"write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Session" 
materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:21.440Z: 
JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/ImpulseEmptyPC/Read.out" 
materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:30:42.099Z: 
JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric 
descriptors and Stackdriver will not create new Dataflow custom metrics for 
this job. Each unique user-defined metric name (independent of the DoFn in 
which it is defined) produces a new metric descriptor. To delete old / unused 
metric descriptors see 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:36:27.087Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on 
the rate of progress in the currently running stage(s).
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:36:46.683Z: 
JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:36:46.718Z: 
JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:39:46.378Z: 
JOB_MESSAGE_BASIC: Finished operation 
write/BigQueryBatchFileLoads/ImpulseSingleElementPC/Read+write/BigQueryBatchFileLoads/Map(<lambda
 at bigquery_file_loads.py:893>)+write/BigQueryBatchFileLoads/GenerateFilePrefix
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:39:46.426Z: 
JOB_MESSAGE_DEBUG: Value 
"write/BigQueryBatchFileLoads/ImpulseSingleElementPC/Read.out" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:39:46.454Z: 
JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/Map(<lambda at 
bigquery_file_loads.py:893>).out" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:39:46.473Z: 
JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/GenerateFilePrefix.out" 
materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:39:46.507Z: 
JOB_MESSAGE_BASIC: Executing operation 
write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(Map(<lambda
 at bigquery_file_loads.py:893>).out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:39:46.532Z: 
JOB_MESSAGE_BASIC: Executing operation 
write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Map(<lambda
 at bigquery_file_loads.py:893>).out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:39:46.549Z: 
JOB_MESSAGE_BASIC: Finished operation 
write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(Map(<lambda
 at bigquery_file_loads.py:893>).out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:39:46.557Z: 
JOB_MESSAGE_BASIC: Executing operation 
write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(Map(<lambda
 at bigquery_file_loads.py:893>).out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:39:46.574Z: 
JOB_MESSAGE_BASIC: Finished operation 
write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Map(<lambda
 at bigquery_file_loads.py:893>).out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:39:46.583Z: 
JOB_MESSAGE_BASIC: Executing operation 
write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(GenerateFilePrefix.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:39:46.588Z: 
JOB_MESSAGE_BASIC: Finished operation 
write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(Map(<lambda
 at bigquery_file_loads.py:893>).out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:39:46.604Z: 
JOB_MESSAGE_BASIC: Executing operation 
write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:39:46.623Z: 
JOB_MESSAGE_BASIC: Finished operation 
write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(GenerateFilePrefix.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:39:46.627Z: 
JOB_MESSAGE_DEBUG: Value 
"write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(Map(<lambda
 at bigquery_file_loads.py:893>).out.0).output" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:39:46.637Z: 
JOB_MESSAGE_BASIC: Finished operation 
write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:39:46.644Z: 
JOB_MESSAGE_DEBUG: Value 
"write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Map(<lambda
 at bigquery_file_loads.py:893>).out.0).output" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:39:46.666Z: 
JOB_MESSAGE_DEBUG: Value 
"write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(Map(<lambda
 at bigquery_file_loads.py:893>).out.0).output" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:39:46.696Z: 
JOB_MESSAGE_DEBUG: Value 
"write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(GenerateFilePrefix.out.0).output"
 materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:39:46.722Z: 
JOB_MESSAGE_DEBUG: Value 
"write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0).output"
 materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:39:46.788Z: 
JOB_MESSAGE_BASIC: Executing operation 
read/Read+read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)+write/BigQueryBatchFileLoads/RewindowIntoGlobal+write/BigQueryBatchFileLoads/AppendDestination+write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+write/BigQueryBatchFileLoads/IdentityWorkaround+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+write/BigQueryBatchFileLoads/GroupShardedRows/Reify+write/BigQueryBatchFileLoads/GroupShardedRows/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:40:06.538Z: 
JOB_MESSAGE_BASIC: Finished operation 
read/Read+read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)+write/BigQueryBatchFileLoads/RewindowIntoGlobal+write/BigQueryBatchFileLoads/AppendDestination+write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+write/BigQueryBatchFileLoads/IdentityWorkaround+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+write/BigQueryBatchFileLoads/GroupShardedRows/Reify+write/BigQueryBatchFileLoads/GroupShardedRows/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:40:06.590Z: 
JOB_MESSAGE_DEBUG: Value 
"read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough).cleanup_signal"
 materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:40:06.617Z: 
JOB_MESSAGE_BASIC: Executing operation 
write/BigQueryBatchFileLoads/GroupShardedRows/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:40:06.645Z: 
JOB_MESSAGE_BASIC: Executing operation 
read/_PassThroughThenCleanup/ParDo(RemoveJsonFiles)/_UnpickledSideInput(ParDo(PassThrough).cleanup_signal.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:40:06.653Z: 
JOB_MESSAGE_BASIC: Finished operation 
write/BigQueryBatchFileLoads/GroupShardedRows/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:40:06.687Z: 
JOB_MESSAGE_BASIC: Finished operation 
read/_PassThroughThenCleanup/ParDo(RemoveJsonFiles)/_UnpickledSideInput(ParDo(PassThrough).cleanup_signal.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:40:06.699Z: 
JOB_MESSAGE_BASIC: Executing operation 
write/BigQueryBatchFileLoads/GroupShardedRows/Read+write/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow+write/BigQueryBatchFileLoads/DropShardNumber+write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile+write/BigQueryBatchFileLoads/IdentityWorkaround+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:40:06.736Z: 
JOB_MESSAGE_DEBUG: Value 
"read/_PassThroughThenCleanup/ParDo(RemoveJsonFiles)/_UnpickledSideInput(ParDo(PassThrough).cleanup_signal.0).output"
 materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:40:06.792Z: 
JOB_MESSAGE_BASIC: Executing operation 
read/_PassThroughThenCleanup/Create/Read+read/_PassThroughThenCleanup/ParDo(RemoveJsonFiles)/ParDo(RemoveJsonFiles)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:40:12.620Z: 
JOB_MESSAGE_BASIC: Finished operation 
read/_PassThroughThenCleanup/Create/Read+read/_PassThroughThenCleanup/ParDo(RemoveJsonFiles)/ParDo(RemoveJsonFiles)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:40:15.837Z: 
JOB_MESSAGE_BASIC: Finished operation 
write/BigQueryBatchFileLoads/GroupShardedRows/Read+write/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow+write/BigQueryBatchFileLoads/DropShardNumber+write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile+write/BigQueryBatchFileLoads/IdentityWorkaround+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:40:15.921Z: 
JOB_MESSAGE_BASIC: Executing operation 
write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:40:15.959Z: 
JOB_MESSAGE_BASIC: Finished operation 
write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:40:16.020Z: 
JOB_MESSAGE_BASIC: Executing operation 
write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow+write/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles)+write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)+write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/TriggerLoadJobsWithoutTempTables
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:40:27.496Z: 
JOB_MESSAGE_BASIC: Finished operation 
write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow+write/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles)+write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)+write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/TriggerLoadJobsWithoutTempTables
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:40:27.564Z: 
JOB_MESSAGE_DEBUG: Value 
"write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs).out"
 materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:40:27.592Z: 
JOB_MESSAGE_DEBUG: Value 
"write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs).TemporaryTables"
 materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:40:27.623Z: 
JOB_MESSAGE_DEBUG: Value 
"write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables.out" 
materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:40:27.653Z: 
JOB_MESSAGE_BASIC: Executing operation 
write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:40:27.677Z: 
JOB_MESSAGE_BASIC: Executing operation 
write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:40:27.703Z: 
JOB_MESSAGE_BASIC: Executing operation 
write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:40:27.709Z: 
JOB_MESSAGE_BASIC: Finished operation 
write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:40:27.721Z: 
JOB_MESSAGE_BASIC: Finished operation 
write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:40:27.732Z: 
JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/Flatten
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:40:27.752Z: 
JOB_MESSAGE_BASIC: Finished operation 
write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:40:27.763Z: 
JOB_MESSAGE_DEBUG: Value 
"write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0).output"
 materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:40:27.786Z: 
JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/Flatten
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:40:27.792Z: 
JOB_MESSAGE_DEBUG: Value 
"write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0).output"
 materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:40:27.828Z: 
JOB_MESSAGE_DEBUG: Value 
"write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0).output"
 materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:40:27.866Z: 
JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/Flatten.out" 
materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:40:27.899Z: 
JOB_MESSAGE_BASIC: Executing operation 
write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs+write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:40:31.829Z: 
JOB_MESSAGE_BASIC: Finished operation 
write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs+write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:40:31.922Z: 
JOB_MESSAGE_DEBUG: Value 
"write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs).out" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:40:31.987Z: 
JOB_MESSAGE_BASIC: Executing operation 
write/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:40:32.039Z: 
JOB_MESSAGE_BASIC: Finished operation 
write/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:40:32.101Z: 
JOB_MESSAGE_DEBUG: Value 
"write/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0).output"
 materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:40:32.170Z: 
JOB_MESSAGE_BASIC: Executing operation 
write/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs+write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+write/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:40:34.779Z: 
JOB_MESSAGE_BASIC: Finished operation 
write/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs+write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+write/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:40:34.849Z: 
JOB_MESSAGE_BASIC: Executing operation 
write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:40:34.900Z: 
JOB_MESSAGE_BASIC: Finished operation 
write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:40:34.967Z: 
JOB_MESSAGE_BASIC: Executing operation 
write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow+write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames+write/BigQueryBatchFileLoads/RemoveTempTables/Delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:40:36.872Z: 
JOB_MESSAGE_BASIC: Finished operation 
write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow+write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames+write/BigQueryBatchFileLoads/RemoveTempTables/Delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:40:36.936Z: 
JOB_MESSAGE_DEBUG: Executing success step success45
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:40:37.098Z: 
JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:40:37.140Z: 
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-07-15T12:40:37.181Z: 
JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting 
for job 2020-07-15_05_30_14-159220807272197476 after 901 seconds
--------------------- >> end captured logging << ---------------------
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_05_04_08-9439695238377145737?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_05_26_41-1378008081438555014?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_05_44_37-14388886898949333444?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_05_59_19-13429379347677310608?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_06_09_24-6368775906144605534?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_06_17_47-17821074942413105015?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_06_24_57-17391273832162243663?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_05_04_05-13702962645613274178?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_05_48_50-5636194064935382445?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_06_01_31-4384212337893214432?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_06_08_09-18306756154197477607?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_06_16_26-7170780260779463357?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_06_27_31-4170835165831709579?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_05_04_07-12890112711961901842?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_05_22_09-14985697245239889064?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_05_41_12-17665372174556308212?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_05_57_59-16448968140434242776?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_06_07_55-17721135192741342422?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_06_16_12-6520698609134531285?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_06_24_46-12152581707639393028?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_06_30_47-18088290167855956442?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_05_04_03-4907875762048541901?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_05_36_05-14392809164313414784?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_05_53_57-5078629334632717854?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_06_04_41-1503649938723675830?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_06_21_54-9297862948555498558?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_05_04_06-6575633971169372281?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_05_13_32-18272273524480780236?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_05_31_33-1285947447796594497?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_05_51_39-1903034655303585671?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_06_03_54-15489048915897387852?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_06_12_31-15871277725189524840?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_06_19_57-14699767830581950216?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_05_04_04-14689067995397103336?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_05_10_30-16366119593441363970?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_05_30_14-159220807272197476?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_05_49_30-3874537763378728286?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_06_03_34-6231945513123388229?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_06_12_36-8913597235976021590?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_06_19_49-3328129719458395307?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_06_26_30-4829865429824344227?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_05_04_06-8947712860458899413?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_05_11_43-1308027879434960211?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_05_29_37-14804547062256596025?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_05_47_24-17579914171250729409?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_06_00_40-474733820783883974?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_06_09_55-11809589285053299389?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_06_17_54-6719009866266566367?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_06_25_00-15084383454519706145?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_05_04_06-14288304746520870379?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_05_16_03-7179803684093875854?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_05_32_50-11117544159631216253?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_05_51_27-10971657038761191415?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_06_03_34-18112541299727745871?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_06_12_28-1808211806469853507?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-15_06_20_19-1824222537236267825?project=apache-beam-testing

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py38.xml
----------------------------------------------------------------------
XML: 
<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 64 tests in 5617.046s

FAILED (SKIP=7, errors=4, failures=1)

> Task :sdks:python:test-suites:dataflow:py38:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/test-suites/direct/common.gradle>' line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py38:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 116

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 36m 37s
146 actionable tasks: 113 executed, 31 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/vyezu6cp3nua4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

