See 
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/1107/display/redirect?page=changes>

Changes:

[kcweaver] [BEAM-7044] portable Spark: support stateful dofns

------------------------------------------
[...truncated 755.51 KB...]
root: INFO: 2019-06-11T21:25:04.285Z: JOB_MESSAGE_DETAILED: Lifting 
ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2019-06-11T21:25:04.678Z: JOB_MESSAGE_DEBUG: Annotating graph with 
Autotuner information.
root: INFO: 2019-06-11T21:25:04.733Z: JOB_MESSAGE_DETAILED: Fusing adjacent 
ParDo, Read, Write, and Flatten operations
root: INFO: 2019-06-11T21:25:04.775Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write
 into 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify
root: INFO: 2019-06-11T21:25:04.818Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow
 into 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read
root: INFO: 2019-06-11T21:25:04.864Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)
 into 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow
root: INFO: 2019-06-11T21:25:04.889Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow
 into 
WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read
root: INFO: 2019-06-11T21:25:04.931Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)
 into 
WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow
root: INFO: 2019-06-11T21:25:04.978Z: JOB_MESSAGE_DETAILED: Unzipping flatten 
s21 for input s15.out_WrittenFiles
root: INFO: 2019-06-11T21:25:05.027Z: JOB_MESSAGE_DETAILED: Fusing unzipped 
copy of 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify,
 through flatten 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/DestinationFilesUnion, into 
producer 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)
root: INFO: 2019-06-11T21:25:05.068Z: JOB_MESSAGE_DETAILED: Unzipping flatten 
s53 for input s47.out_WrittenFiles
root: INFO: 2019-06-11T21:25:05.121Z: JOB_MESSAGE_DETAILED: Fusing unzipped 
copy of 
WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify,
 through flatten 
WriteWithMultipleDests/BigQueryBatchFileLoads/DestinationFilesUnion, into 
producer 
WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)
root: INFO: 2019-06-11T21:25:05.163Z: JOB_MESSAGE_DETAILED: Unzipping flatten 
s53-u71 for input s54-reify-value45-c69
root: INFO: 2019-06-11T21:25:05.202Z: JOB_MESSAGE_DETAILED: Fusing unzipped 
copy of 
WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write,
 through flatten 
WriteWithMultipleDests/BigQueryBatchFileLoads/DestinationFilesUnion/Unzipped-1, 
into producer 
WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify
root: INFO: 2019-06-11T21:25:05.238Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ApplyGlobalWindow into 
FlatMap(<lambda at bigquery_file_loads_test.py:444>)
root: INFO: 2019-06-11T21:25:05.277Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDests/BigQueryBatchFileLoads/ApplyGlobalWindow into 
FlatMap(<lambda at bigquery_file_loads_test.py:444>)
root: INFO: 2019-06-11T21:25:05.324Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify
 into 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile
root: INFO: 2019-06-11T21:25:05.374Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify
 into 
WriteWithMultipleDests/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile
root: INFO: 2019-06-11T21:25:05.422Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write
 into 
WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify
root: INFO: 2019-06-11T21:25:05.459Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)
 into 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/AppendDestination/AppendDestination
root: INFO: 2019-06-11T21:25:05.512Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/AppendDestination/AppendDestination
 into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ApplyGlobalWindow
root: INFO: 2019-06-11T21:25:05.554Z: JOB_MESSAGE_DETAILED: Fusing consumer 
Map(<lambda at bigquery_file_loads_test.py:442>) into Create/Read
root: INFO: 2019-06-11T21:25:05.603Z: JOB_MESSAGE_DETAILED: Fusing consumer 
GroupByKey/Reify into Map(<lambda at bigquery_file_loads_test.py:442>)
root: INFO: 2019-06-11T21:25:05.642Z: JOB_MESSAGE_DETAILED: Fusing consumer 
GroupByKey/GroupByWindow into GroupByKey/Read
root: INFO: 2019-06-11T21:25:05.689Z: JOB_MESSAGE_DETAILED: Fusing consumer 
FlatMap(<lambda at bigquery_file_loads_test.py:444>) into 
GroupByKey/GroupByWindow
root: INFO: 2019-06-11T21:25:05.731Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDests/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile
 into WriteWithMultipleDests/BigQueryBatchFileLoads/DropShardNumber
root: INFO: 2019-06-11T21:25:05.788Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/DropShardNumber into 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow
root: INFO: 2019-06-11T21:25:05.835Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile
 into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/DropShardNumber
root: INFO: 2019-06-11T21:25:05.879Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDests/BigQueryBatchFileLoads/DropShardNumber into 
WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow
root: INFO: 2019-06-11T21:25:05.915Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow 
into WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/Read
root: INFO: 2019-06-11T21:25:05.953Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/Reify into 
WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(_ShardDestinations)
root: INFO: 2019-06-11T21:25:06.022Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/Write into 
WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/Reify
root: INFO: 2019-06-11T21:25:06.068Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow
 into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/Read
root: INFO: 2019-06-11T21:25:06.111Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(_ShardDestinations) into 
WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)
root: INFO: 2019-06-11T21:25:06.159Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDests/BigQueryBatchFileLoads/AppendDestination into 
WriteWithMultipleDests/BigQueryBatchFileLoads/ApplyGlobalWindow
root: INFO: 2019-06-11T21:25:06.203Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)
 into WriteWithMultipleDests/BigQueryBatchFileLoads/AppendDestination
root: INFO: 2019-06-11T21:25:06.238Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(_ShardDestinations) 
into 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)
root: INFO: 2019-06-11T21:25:06.285Z: JOB_MESSAGE_DETAILED: Fusing consumer 
GroupByKey/Write into GroupByKey/Reify
root: INFO: 2019-06-11T21:25:06.322Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/Reify into 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(_ShardDestinations)
root: INFO: 2019-06-11T21:25:06.360Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/Write into 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/Reify
root: INFO: 2019-06-11T21:25:06.408Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDests/BigQueryBatchFileLoads/GenerateFilePrefix into 
WriteWithMultipleDests/BigQueryBatchFileLoads/CreateFilePrefixView/Read
root: INFO: 2019-06-11T21:25:06.454Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GenerateFilePrefix into 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/CreateFilePrefixView/Read
root: INFO: 2019-06-11T21:25:06.500Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/Delete 
into 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames
root: INFO: 2019-06-11T21:25:06.536Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
 into 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify
root: INFO: 2019-06-11T21:25:06.578Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDests/BigQueryBatchFileLoads/Map(<lambda at 
bigquery_file_loads.py:550>) into 
WriteWithMultipleDests/BigQueryBatchFileLoads/ImpulseJobName/Read
root: INFO: 2019-06-11T21:25:06.621Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables
 into 
WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs
root: INFO: 2019-06-11T21:25:06.661Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify
 into 
WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue
root: INFO: 2019-06-11T21:25:06.707Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
 into 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForLoadJobs/WaitForLoadJobs
root: INFO: 2019-06-11T21:25:06.753Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForLoadJobs/WaitForLoadJobs
 into 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Read
root: INFO: 2019-06-11T21:25:06.802Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
 into 
WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForLoadJobs/WaitForLoadJobs
root: INFO: 2019-06-11T21:25:06.841Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify
 into 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue
root: INFO: 2019-06-11T21:25:06.891Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue 
into 
WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables
root: INFO: 2019-06-11T21:25:06.937Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
 into 
WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify
root: INFO: 2019-06-11T21:25:06.974Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/Delete into 
WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames
root: INFO: 2019-06-11T21:25:07.012Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames 
into 
WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow
root: INFO: 2019-06-11T21:25:07.045Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs 
into WriteWithMultipleDests/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Read
root: INFO: 2019-06-11T21:25:07.090Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForLoadJobs/WaitForLoadJobs 
into WriteWithMultipleDests/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Read
root: INFO: 2019-06-11T21:25:07.130Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue
 into 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables
root: INFO: 2019-06-11T21:25:07.175Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow
 into 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read
root: INFO: 2019-06-11T21:25:07.230Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables
 into 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs
root: INFO: 2019-06-11T21:25:07.274Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow
 into 
WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read
root: INFO: 2019-06-11T21:25:07.314Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/Map(<lambda at 
bigquery_file_loads.py:550>) into 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ImpulseJobName/Read
root: INFO: 2019-06-11T21:25:07.360Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs
 into 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Read
root: INFO: 2019-06-11T21:25:07.406Z: JOB_MESSAGE_DETAILED: Fusing consumer 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames
 into 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow
root: INFO: 2019-06-11T21:25:07.449Z: JOB_MESSAGE_DEBUG: Workflow config is 
missing a default resource spec.
root: INFO: 2019-06-11T21:25:07.497Z: JOB_MESSAGE_DEBUG: Adding StepResource 
setup and teardown to workflow graph.
root: INFO: 2019-06-11T21:25:07.543Z: JOB_MESSAGE_DEBUG: Adding workflow start 
and stop steps.
root: INFO: 2019-06-11T21:25:07.603Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-06-11T21:25:07.905Z: JOB_MESSAGE_DEBUG: Executing wait step 
start92
root: INFO: 2019-06-11T21:25:08.019Z: JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/ImpulseJobName/Read+WriteWithMultipleDests/BigQueryBatchFileLoads/Map(<lambda
 at bigquery_file_loads.py:550>)
root: INFO: 2019-06-11T21:25:08.056Z: JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/CreateFilePrefixView/Read+WriteWithMultipleDests/BigQueryBatchFileLoads/GenerateFilePrefix
root: INFO: 2019-06-11T21:25:08.068Z: JOB_MESSAGE_DEBUG: Starting worker pool 
setup.
root: INFO: 2019-06-11T21:25:08.108Z: JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ImpulseJobName/Read+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/Map(<lambda
 at bigquery_file_loads.py:550>)
root: INFO: 2019-06-11T21:25:08.122Z: JOB_MESSAGE_BASIC: Starting 1 workers in 
us-central1-a...
root: INFO: 2019-06-11T21:25:08.167Z: JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/CreateFilePrefixView/Read+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GenerateFilePrefix
root: INFO: 2019-06-11T21:25:08.223Z: JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/Create
root: INFO: 2019-06-11T21:25:08.265Z: JOB_MESSAGE_BASIC: Executing operation 
MakeSchemas/Read
root: INFO: 2019-06-11T21:25:08.318Z: JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Create
root: INFO: 2019-06-11T21:25:08.370Z: JOB_MESSAGE_BASIC: Executing operation 
MakeTables/Read
root: INFO: 2019-06-11T21:25:08.419Z: JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
root: INFO: 2019-06-11T21:25:08.451Z: JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/Create
root: INFO: 2019-06-11T21:25:08.503Z: JOB_MESSAGE_BASIC: Executing operation 
GroupByKey/Create
root: INFO: 2019-06-11T21:25:08.539Z: JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Create
root: INFO: 2019-06-11T21:25:08.588Z: JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
root: INFO: 2019-06-11T21:25:08.644Z: JOB_MESSAGE_DEBUG: Value 
"WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/Session" 
materialized.
root: INFO: 2019-06-11T21:25:08.689Z: JOB_MESSAGE_DEBUG: Value 
"MakeSchemas/Read.out" materialized.
root: INFO: 2019-06-11T21:25:08.736Z: JOB_MESSAGE_DEBUG: Value 
"WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Session"
 materialized.
root: INFO: 2019-06-11T21:25:08.790Z: JOB_MESSAGE_DEBUG: Value 
"MakeTables/Read.out" materialized.
root: INFO: 2019-06-11T21:25:08.832Z: JOB_MESSAGE_DEBUG: Value 
"WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Session"
 materialized.
root: INFO: 2019-06-11T21:25:08.883Z: JOB_MESSAGE_DEBUG: Value 
"WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/Session" 
materialized.
root: INFO: 2019-06-11T21:25:08.933Z: JOB_MESSAGE_DEBUG: Value 
"GroupByKey/Session" materialized.
root: INFO: 2019-06-11T21:25:08.976Z: JOB_MESSAGE_DEBUG: Value 
"WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Session"
 materialized.
root: INFO: 2019-06-11T21:25:09.026Z: JOB_MESSAGE_DEBUG: Value 
"WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Session"
 materialized.
root: INFO: 2019-06-11T21:25:09.073Z: JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Read.out.0)
root: INFO: 2019-06-11T21:25:09.119Z: JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Read.out.0)
root: INFO: 2019-06-11T21:25:09.166Z: JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/AppendDestination/_UnpickledSideInput(Read.out.0)
root: INFO: 2019-06-11T21:25:09.212Z: JOB_MESSAGE_BASIC: Executing operation 
Create/Read+Map(<lambda at 
bigquery_file_loads_test.py:442>)+GroupByKey/Reify+GroupByKey/Write
root: INFO: 2019-06-11T21:25:09.257Z: JOB_MESSAGE_DEBUG: Value 
"WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Read.out.0).output"
 materialized.
root: INFO: 2019-06-11T21:25:09.299Z: JOB_MESSAGE_DEBUG: Value 
"WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Read.out.0).output"
 materialized.
root: INFO: 2019-06-11T21:25:09.337Z: JOB_MESSAGE_DEBUG: Value 
"WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/AppendDestination/_UnpickledSideInput(Read.out.0).output"
 materialized.
root: INFO: 2019-06-11T21:26:17.146Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised 
the number of workers to 1 based on the rate of progress in the currently 
running step(s).
root: INFO: 2019-06-11T21:27:02.327Z: JOB_MESSAGE_DETAILED: Workers have 
started successfully.
root: INFO: 2019-06-11T21:27:02.380Z: JOB_MESSAGE_DETAILED: Workers have 
started successfully.
root: INFO: Deleting dataset python_bq_file_loads_15602882724560 in project 
apache-beam-testing
--------------------- >> end captured logging << ---------------------
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-11_14_14_49-14478105988589554615?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
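
Note on the BeamDeprecationWarning above: since the first stable release,
options are meant to be supplied to the pipeline at construction and read
back from that options object, not from <pipeline>.options. A minimal
sketch of the supported pattern (the temp_location bucket is hypothetical):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import (GoogleCloudOptions,
                                                      PipelineOptions)

    # Build the options object up front; the bucket name is illustrative.
    options = PipelineOptions(temp_location='gs://my-bucket/tmp')
    # Read settings from the options object itself, not from p.options.
    temp_location = options.view_as(GoogleCloudOptions).temp_location

    with beam.Pipeline(options=options) as p:
        _ = p | beam.Create([1, 2, 3]) | beam.Map(print)
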
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-11_14_31_27-7768916075736988951?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-11_14_39_51-13074942229589755802?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-11_14_49_24-414396806851580321?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-11_14_59_41-8119135558835814902?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-11_14_14_41-1684686555648870319?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-11_14_39_00-6121379184300885833?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-11_14_51_40-6386600650613694034?project=apache-beam-testing.
  kms_key=transform.kms_key))
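
Note on the BeamDeprecationWarning above: BigQuerySink is deprecated as of
2.11.0 in favor of WriteToBigQuery. A minimal sketch of the replacement;
the table spec and schema here are hypothetical:

    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (p
             | beam.Create([{'name': 'a', 'count': 1}])
             | beam.io.WriteToBigQuery(
                 'my-project:my_dataset.my_table',  # hypothetical table spec
                 schema='name:STRING,count:INTEGER',
                 create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                 write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))
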
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232:
 FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243:
 FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243:
 FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
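
Note on the FutureWarnings above: MatchAll and ReadMatches live in
apache_beam.io.fileio and are flagged experimental, so their APIs may
change between releases. A minimal sketch of the pattern the test
exercises, with a hypothetical file glob:

    import apache_beam as beam
    from apache_beam.io import fileio

    with beam.Pipeline() as p:
        _ = (p
             | fileio.MatchFiles('gs://my-bucket/files/*.txt')  # hypothetical glob
             | fileio.ReadMatches()
             | 'GetPath' >> beam.Map(lambda readable: readable.metadata.path)
             | beam.Map(print))
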
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-11_14_14_46-15279361734749562585?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-11_14_29_36-8463660812438577536?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-11_14_38_54-6519513546889538856?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-11_14_48_54-11937389649358719246?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-11_14_14_40-12894380162316921400?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-11_14_24_58-12742988343574231413?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Exception in thread Thread-2:
Traceback (most recent call last):
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-11_14_34_32-13905330505358570286?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-11_14_44_36-9534817377618026255?project=apache-beam-testing.
  File "/usr/lib/python3.5/threading.py", line 914, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.5/threading.py", line 862, in run
    self._target(*self._args, **self._kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 157, in poll_for_job_completion
    response = runner.dataflow_client.get_job(job_id)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/utils/retry.py>", line 197, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 661, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py>", line 689, in Get
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py>", line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py>", line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py>", line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
apitools.base.py.exceptions.HttpNotFoundError: HttpError accessing 
<https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-06-11_14_24_58-12742988343574231413?alt=json>:
 response: <{'cache-control': 'private', 'server': 'ESF', 'transfer-encoding': 
'chunked', 'content-type': 'application/json; charset=UTF-8', 'vary': 'Origin, 
X-Origin, Referer', '-content-encoding': 'gzip', 'x-content-type-options': 
'nosniff', 'status': '404', 'x-xss-protection': '0', 'content-length': '280', 
'x-frame-options': 'SAMEORIGIN', 'date': 'Tue, 11 Jun 2019 21:28:44 GMT'}>, 
content <{
  "error": {
    "code": 404,
    "message": "(8cd08550da99f815): Information about job 
2019-06-11_14_24_58-12742988343574231413 could not be found in our system. 
Please double check the id is correct. If it is please contact customer 
support.",
    "status": "NOT_FOUND"
  }
}
>

Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-11_14_14_40-6463629533206057602?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-11_14_35_33-10792244505161741670?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-11_14_43_26-11062300847743952853?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-11_14_51_26-5304004505513627529?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-11_14_14_39-12821998588556074995?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-11_14_22_28-5978236941452953125?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-11_14_32_21-12071516845440166211?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-11_14_41_53-5045389880910384619?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-11_14_14_42-465361070053109034?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-11_14_23_25-3579733370089590315?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-11_14_30_28-11077363674096399532?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-11_14_41_42-11218596066642861359?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-11_14_49_37-17702115365082300037?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-11_14_58_04-3794651887378523397?project=apache-beam-testing.
  kms_key=kms_key))
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-11_14_14_42-1960111790568477224?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-11_14_24_17-14839282237727424843?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-11_14_31_44-4500189495841796663?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-11_14_39_13-18255671691284996397?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-11_14_49_22-9765084377171124662?project=apache-beam-testing.

----------------------------------------------------------------------
XML: 
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 42 tests in 3295.347s

FAILED (SKIP=5, failures=1)

> Task :sdks:python:test-suites:dataflow:py35:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py37/build.gradle>' line: 78

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle>' line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 56m 4s
77 actionable tasks: 60 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/htwat4hkew6zy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]