See 
<https://builds.apache.org/job/beam_PostCommit_Python35/1242/display/redirect?page=changes>

Changes:

[mxm] [BEAM-8962] Add option to disable the metric container accumulator


------------------------------------------
[...truncated 924.65 KB...]
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:03:19.353Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ImpulseEmptyPC/Read+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/TriggerLoadJobsWithoutTempTables
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:03:21.984Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/ImpulseEmptyPC/Read+WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/TriggerLoadJobsWithoutTempTables
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:03:22.050Z: 
JOB_MESSAGE_DEBUG: Value 
"WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables.out"
 materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:03:22.131Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:03:22.204Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:03:22.276Z: 
JOB_MESSAGE_DEBUG: Value 
"WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0).output"
 materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:03:39.428Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ImpulseEmptyPC/Read+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/TriggerLoadJobsWithoutTempTables
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:03:39.496Z: 
JOB_MESSAGE_DEBUG: Value 
"WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables.out"
 materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:03:39.633Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:03:39.759Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:03:39.935Z: 
JOB_MESSAGE_DEBUG: Value 
"WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0).output"
 materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:03:44.912Z: 
JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service 
Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:03:45.765Z: 
JOB_MESSAGE_BASIC: Finished operation Create/Read+Map(<lambda at 
bigquery_file_loads_test.py:674>)+GroupByKey/Reify+GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:03:45.834Z: 
JOB_MESSAGE_BASIC: Executing operation GroupByKey/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:03:45.888Z: 
JOB_MESSAGE_BASIC: Finished operation GroupByKey/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:03:45.950Z: 
JOB_MESSAGE_BASIC: Executing operation 
GroupByKey/Read+GroupByKey/GroupByWindow+FlatMap(<lambda at 
bigquery_file_loads_test.py:676>)+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RewindowIntoGlobal+WriteWithMultipleDests/BigQueryBatchFileLoads/RewindowIntoGlobal+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/AppendDestination/AppendDestination+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/Reify+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/Write+WriteWithMultipleDests/BigQueryBatchFileLoads/AppendDestination+WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/Reify+WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:03:57.413Z: 
JOB_MESSAGE_BASIC: Finished operation 
GroupByKey/Read+GroupByKey/GroupByWindow+FlatMap(<lambda at 
bigquery_file_loads_test.py:676>)+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RewindowIntoGlobal+WriteWithMultipleDests/BigQueryBatchFileLoads/RewindowIntoGlobal+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/AppendDestination/AppendDestination+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/Reify+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/Write+WriteWithMultipleDests/BigQueryBatchFileLoads/AppendDestination+WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/Reify+WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:03:57.490Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:03:57.523Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:03:57.540Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:03:57.572Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:03:57.612Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/Read+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/DropShardNumber+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:03:57.649Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/Read+WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow+WriteWithMultipleDests/BigQueryBatchFileLoads/DropShardNumber+WriteWithMultipleDests/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile+WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:04.055Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/Read+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/DropShardNumber+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:04.140Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:04.191Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:04.281Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles)+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/FlattenPartitions+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:08.496Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/Read+WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow+WriteWithMultipleDests/BigQueryBatchFileLoads/DropShardNumber+WriteWithMultipleDests/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile+WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:08.547Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:08.589Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:08.643Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read+WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow+WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles)+WriteWithMultipleDests/BigQueryBatchFileLoads/FlattenPartitions+WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:16.900Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read+WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow+WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles)+WriteWithMultipleDests/BigQueryBatchFileLoads/FlattenPartitions+WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:16.947Z: 
JOB_MESSAGE_DEBUG: Value 
"WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs).out"
 materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:16.971Z: 
JOB_MESSAGE_DEBUG: Value 
"WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs).TemporaryTables"
 materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:17.007Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:17.024Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/Flatten
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:17.050Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:17.062Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:17.109Z: 
JOB_MESSAGE_DEBUG: Value 
"WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0).output"
 materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:17.119Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/Flatten
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:17.139Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:17.193Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs+WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:17.217Z: 
JOB_MESSAGE_DEBUG: Value 
"WriteWithMultipleDests/BigQueryBatchFileLoads/Flatten.out" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:17.239Z: 
JOB_MESSAGE_DEBUG: Value 
"WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0).output"
 materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:17.797Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles)+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/FlattenPartitions+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:17.852Z: 
JOB_MESSAGE_DEBUG: Value 
"WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs).out"
 materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:17.874Z: 
JOB_MESSAGE_DEBUG: Value 
"WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs).TemporaryTables"
 materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:17.899Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:17.924Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/Flatten
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:17.948Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:17.964Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:18.006Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:18.013Z: 
JOB_MESSAGE_DEBUG: Value 
"WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0).output"
 materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:18.025Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/Flatten
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:18.057Z: 
JOB_MESSAGE_DEBUG: Value 
"WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0).output"
 materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:18.080Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:18.106Z: 
JOB_MESSAGE_DEBUG: Value 
"WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/Flatten.out" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:21.665Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs+WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:21.712Z: 
JOB_MESSAGE_DEBUG: Value 
"WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs).out" 
materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:21.769Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:21.834Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:21.887Z: 
JOB_MESSAGE_DEBUG: Value 
"WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0).output"
 materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:21.943Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:24.337Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:24.394Z: 
JOB_MESSAGE_DEBUG: Value 
"WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs).out"
 materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:24.449Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:24.493Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:24.514Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:24.547Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:24.573Z: 
JOB_MESSAGE_DEBUG: Value 
"WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0).output"
 materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:24.581Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:24.630Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:24.652Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/Delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:26.194Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/Delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:27.313Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:27.367Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:27.408Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:27.471Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/Delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:29.167Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/Delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:29.222Z: 
JOB_MESSAGE_DEBUG: Executing success step success97
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:29.340Z: 
JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:29.384Z: 
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:04:29.403Z: 
JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:05:55.119Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:05:55.171Z: 
JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-16T17:05:55.205Z: 
JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 
2019-12-16_08_57_35-16431532638118655659 is in state JOB_STATE_DONE
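
For reference, the test harness reaches JOB_STATE_DONE by blocking on the
pipeline result; a minimal sketch of that pattern, with names illustrative:

  from apache_beam.runners.runner import PipelineState

  result = pipeline.run()      # a DataflowPipelineResult for jobs like the above
  result.wait_until_finish()   # blocks until the job hits a terminal state
  assert result.state == PipelineState.DONE
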
apache_beam.io.gcp.tests.bigquery_matcher: INFO: Attempting to perform query 
SELECT name, language FROM python_bq_file_loads_15765154208341.output_table1 to 
BQ
google.auth.transport._http_client: DEBUG: Making request: GET 
http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, 
connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): 
metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 
200 144
google.auth.transport.requests: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/[email protected]/token
 HTTP/1.1" 200 181
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): 
www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "POST 
/bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET 
/bigquery/v2/projects/apache-beam-testing/queries/9cd81a10-b26d-4a3c-81a8-3e178377f6cf?location=US&maxResults=0&timeoutMs=10000
 HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET 
/bigquery/v2/projects/apache-beam-testing/jobs/9cd81a10-b26d-4a3c-81a8-3e178377f6cf?location=US
 HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET 
/bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anonb035a1b79dced3c81ffd4ed33563a5d8f3421e34/data
 HTTP/1.1" 200 None
apache_beam.io.gcp.tests.bigquery_matcher: INFO: Result of query is: [('beam', 
'go'), ('beam', 'py'), ('spark', 'py'), ('beam', 'java'), ('flink', 'java'), 
('flink', 'scala'), ('spark', 'scala'), ('spark', 'scala')]
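
The check above is the matcher comparing actual query rows against expected
ones; a minimal sketch of how such a verifier is constructed (the dataset name
and expected rows are illustrative, and the matcher signature may differ
across Beam versions):

  from apache_beam.io.gcp.tests.bigquery_matcher import BigqueryFullResultMatcher

  matcher = BigqueryFullResultMatcher(
      project='apache-beam-testing',
      query='SELECT name, language FROM <dataset>.output_table1',
      data=[('beam', 'go'), ('beam', 'py'), ('spark', 'py')])
  # Integration tests typically pass this as the on_success_matcher option.
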
apache_beam.io.gcp.tests.bigquery_matcher: INFO: Attempting to perform query 
SELECT name, foundation FROM python_bq_file_loads_15765154208341.output_table2 
to BQ
google.auth.transport._http_client: DEBUG: Making request: GET 
http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, 
connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): 
metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 
200 144
google.auth.transport.requests: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/[email protected]/token
 HTTP/1.1" 200 181
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): 
www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "POST 
/bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET 
/bigquery/v2/projects/apache-beam-testing/queries/1ef7b3b4-e551-48ac-afd8-912ec04afc27?location=US&maxResults=0&timeoutMs=10000
 HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET 
/bigquery/v2/projects/apache-beam-testing/jobs/1ef7b3b4-e551-48ac-afd8-912ec04afc27?location=US
 HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET 
/bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anon1a2d2d486d10289fbf8b66647201ca0f93609c1e/data
 HTTP/1.1" 200 None
apache_beam.io.gcp.tests.bigquery_matcher: INFO: Result of query is: [('flink', 
'apache'), ('spark', 'apache'), ('beam', 'apache')]
apache_beam.io.gcp.tests.bigquery_matcher: INFO: Attempting to perform query 
SELECT name, language FROM python_bq_file_loads_15765154208341.output_table3 to 
BQ
google.auth.transport._http_client: DEBUG: Making request: GET 
http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, 
connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): 
metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 
200 144
google.auth.transport.requests: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/[email protected]/token
 HTTP/1.1" 500 41
apache_beam.io.gcp.bigquery_file_loads_test: INFO: Deleting dataset 
python_bq_file_loads_15765154208341 in project apache-beam-testing
--------------------- >> end captured logging << ---------------------
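
Note: the single error in this run (see FAILED (SKIP=6, errors=1) below)
appears to coincide with the HTTP 500 returned by the metadata token endpoint
during the third verification query; failures like that are usually
transient. A generic backoff wrapper, purely illustrative and not part of
Beam:

  import time

  def call_with_backoff(fn, attempts=3, base_delay=2.0):
      # Retry a flaky call (e.g. a token fetch that intermittently
      # returns HTTP 500), doubling the delay between attempts.
      for attempt in range(attempts):
          try:
              return fn()
          except Exception:
              if attempt == attempts - 1:
                  raise
              time.sleep(base_delay * (2 ** attempt))
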
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_08_48_30-7641843263311752478?project=apache-beam-testing
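
The warning above flags reads that go through <pipeline>.options; the
forward-compatible pattern is to consult the PipelineOptions object the
pipeline was constructed with. A minimal sketch (the experiment flag is
illustrative):

  from apache_beam.options.pipeline_options import DebugOptions, PipelineOptions

  options = PipelineOptions(['--experiments=use_beam_bq_sink'])
  # Read settings from the options object you own, not from <pipeline>.options:
  experiments = options.view_as(DebugOptions).experiments or []
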
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_02_54-12402984188437383708?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_11_03-10346193068111930885?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_18_52-14855560308527889599?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_26_46-8065880326512533025?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_08_48_27-10841978310686083319?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_11_08-1822898390673370864?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_18_49-6746597018035297137?project=apache-beam-testing
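
As the deprecation message says, BigQuerySink usages migrate to
WriteToBigQuery. A minimal sketch, with table spec, schema, and rows as
placeholders:

  import apache_beam as beam
  from apache_beam.io.gcp.bigquery import WriteToBigQuery

  with beam.Pipeline() as p:
      _ = (p
           | beam.Create([{'name': 'beam', 'language': 'py'}])  # sample rows
           | WriteToBigQuery(
               'my-project:my_dataset.my_table',  # placeholder table spec
               schema='name:STRING,language:STRING',
               write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))
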
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_26_24-12097256073263635517?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_08_48_29-2898709727297119427?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_00_59-7716113872540435380?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_09_11-11190988591581841933?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_08_48_29-4951319983933861549?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_06_32-11358914183468906080?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_14_46-3049400754209209299?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_21_54-15165139076481884284?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_08_48_28-11995068409813572882?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_08_57_27-14745860367975787355?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_05_18-18258086117852420901?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_13_19-14961112106732757971?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_21_08-9046011764904280652?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=kms_key))
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_08_48_27-4002864298042470890?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_08_57_07-17381217935192211980?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_04_53-3910875101144167027?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_13_05-8529733788718725963?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_21_12-4353359396527424064?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_29_02-17637622910986160462?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_08_48_28-4800474726120858576?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_08_57_03-14932265100921946189?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:652:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_06_09-10747823677640596640?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_15_17-1401150368988471040?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_24_21-9014470271674079067?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296:
 FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307:
 FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307:
 FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
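
The two experimental transforms being warned about compose as sketched below
(the file pattern is illustrative):

  import apache_beam as beam
  from apache_beam.io import fileio

  with beam.Pipeline() as p:
      contents = (p
                  | beam.Create(['gs://my-bucket/input-*.txt'])  # patterns
                  | fileio.MatchAll()       # each pattern -> FileMetadata
                  | fileio.ReadMatches()    # FileMetadata -> ReadableFile
                  | beam.Map(lambda f: f.read_utf8()))
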
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_08_48_29-2266820357684086143?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_08_57_35-16431532638118655659?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_06_52-7493597095340949664?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_15_32-14720057798393418957?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-16_09_25_32-2730805652803892376?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py35.xml
----------------------------------------------------------------------
XML: 
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 2939.354s

FAILED (SKIP=6, errors=1)

> Task :sdks:python:test-suites:dataflow:py35:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 
'<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle>'
 line: 56

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
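
To dig further locally (assuming a Beam source checkout), the failing task
can be rerun with the flags Gradle suggests:

  ./gradlew :sdks:python:test-suites:dataflow:py35:postCommitIT --stacktrace --info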

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 50m 1s
83 actionable tasks: 63 executed, 20 from cache

Publishing build scan...
https://gradle.com/s/l75rmchzg6jda

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
