See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/1085/display/redirect?page=changes>
Changes: [pabloem] BEAM-7475: Update programming guide with stateful and timer example

------------------------------------------
[...truncated 990.25 KB...]
root: INFO: 2019-06-08T22:23:53.566Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify, through flatten WriteWithMultipleDests/BigQueryBatchFileLoads/DestinationFilesUnion, into producer WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)
root: INFO: 2019-06-08T22:23:53.613Z: JOB_MESSAGE_DETAILED: Unzipping flatten s53-u71 for input s54-reify-value45-c69
root: INFO: 2019-06-08T22:23:53.661Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write, through flatten WriteWithMultipleDests/BigQueryBatchFileLoads/DestinationFilesUnion/Unzipped-1, into producer WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify
root: INFO: 2019-06-08T22:23:53.713Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ApplyGlobalWindow into FlatMap(<lambda at bigquery_file_loads_test.py:444>)
root: INFO: 2019-06-08T22:23:53.757Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/ApplyGlobalWindow into FlatMap(<lambda at bigquery_file_loads_test.py:444>)
root: INFO: 2019-06-08T22:23:53.804Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile
root: INFO: 2019-06-08T22:23:53.855Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify into WriteWithMultipleDests/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile
root: INFO: 2019-06-08T22:23:53.904Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write into WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify
root: INFO: 2019-06-08T22:23:53.954Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile) into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/AppendDestination/AppendDestination
root: INFO: 2019-06-08T22:23:53.987Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/AppendDestination/AppendDestination into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ApplyGlobalWindow
root: INFO: 2019-06-08T22:23:54.034Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at bigquery_file_loads_test.py:442>) into Create/Read
root: INFO: 2019-06-08T22:23:54.075Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Reify into Map(<lambda at bigquery_file_loads_test.py:442>)
root: INFO: 2019-06-08T22:23:54.124Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/GroupByWindow into GroupByKey/Read
root: INFO: 2019-06-08T22:23:54.162Z: JOB_MESSAGE_DETAILED: Fusing consumer FlatMap(<lambda at bigquery_file_loads_test.py:444>) into GroupByKey/GroupByWindow
root: INFO: 2019-06-08T22:23:54.190Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile into WriteWithMultipleDests/BigQueryBatchFileLoads/DropShardNumber
root: INFO: 2019-06-08T22:23:54.240Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/DropShardNumber into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow
root: INFO: 2019-06-08T22:23:54.278Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/DropShardNumber
root: INFO: 2019-06-08T22:23:54.324Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/DropShardNumber into WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow
root: INFO: 2019-06-08T22:23:54.360Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow into WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/Read
root: INFO: 2019-06-08T22:23:54.407Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/Reify into WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(_ShardDestinations)
root: INFO: 2019-06-08T22:23:54.456Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/Write into WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/Reify
root: INFO: 2019-06-08T22:23:54.505Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/Read
root: INFO: 2019-06-08T22:23:54.544Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(_ShardDestinations) into WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)
root: INFO: 2019-06-08T22:23:54.587Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/AppendDestination into WriteWithMultipleDests/BigQueryBatchFileLoads/ApplyGlobalWindow
root: INFO: 2019-06-08T22:23:54.629Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile) into WriteWithMultipleDests/BigQueryBatchFileLoads/AppendDestination
root: INFO: 2019-06-08T22:23:54.670Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(_ShardDestinations) into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)
root: INFO: 2019-06-08T22:23:54.720Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Write into GroupByKey/Reify
root: INFO: 2019-06-08T22:23:54.764Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/Reify into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(_ShardDestinations)
root: INFO: 2019-06-08T22:23:54.802Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/Write into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/Reify
root: INFO: 2019-06-08T22:23:54.848Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/GenerateFilePrefix into WriteWithMultipleDests/BigQueryBatchFileLoads/CreateFilePrefixView/Read
root: INFO: 2019-06-08T22:23:54.892Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GenerateFilePrefix into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/CreateFilePrefixView/Read
root: INFO: 2019-06-08T22:23:54.934Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/Delete into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames
root: INFO: 2019-06-08T22:23:54.980Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify
root: INFO: 2019-06-08T22:23:55.023Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:550>) into WriteWithMultipleDests/BigQueryBatchFileLoads/ImpulseJobName/Read
root: INFO: 2019-06-08T22:23:55.065Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables into WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs
root: INFO: 2019-06-08T22:23:55.111Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify into WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue
root: INFO: 2019-06-08T22:23:55.155Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs) into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForLoadJobs/WaitForLoadJobs
root: INFO: 2019-06-08T22:23:55.203Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForLoadJobs/WaitForLoadJobs into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Read
root: INFO: 2019-06-08T22:23:55.246Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs) into WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForLoadJobs/WaitForLoadJobs
root: INFO: 2019-06-08T22:23:55.283Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue
root: INFO: 2019-06-08T22:23:55.334Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue into WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables
root: INFO: 2019-06-08T22:23:55.377Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write into WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify
root: INFO: 2019-06-08T22:23:55.413Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/Delete into WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames
root: INFO: 2019-06-08T22:23:55.436Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames into WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow
root: INFO: 2019-06-08T22:23:55.469Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs into WriteWithMultipleDests/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Read
root: INFO: 2019-06-08T22:23:55.515Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForLoadJobs/WaitForLoadJobs into WriteWithMultipleDests/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Read
root: INFO: 2019-06-08T22:23:55.560Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables
root: INFO: 2019-06-08T22:23:55.606Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read
root: INFO: 2019-06-08T22:23:55.645Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs
root: INFO: 2019-06-08T22:23:55.692Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow into WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read
root: INFO: 2019-06-08T22:23:55.720Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:550>) into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ImpulseJobName/Read
root: INFO: 2019-06-08T22:23:55.758Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Read
root: INFO: 2019-06-08T22:23:55.806Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow
root: INFO: 2019-06-08T22:23:55.853Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2019-06-08T22:23:55.892Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-06-08T22:23:55.923Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-06-08T22:23:55.961Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-06-08T22:23:56.203Z: JOB_MESSAGE_DEBUG: Executing wait step start92
root: INFO: 2019-06-08T22:23:56.286Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/ImpulseJobName/Read+WriteWithMultipleDests/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:550>)
root: INFO: 2019-06-08T22:23:56.333Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/CreateFilePrefixView/Read+WriteWithMultipleDests/BigQueryBatchFileLoads/GenerateFilePrefix
root: INFO: 2019-06-08T22:23:56.345Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-06-08T22:23:56.378Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ImpulseJobName/Read+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:550>)
root: INFO: 2019-06-08T22:23:56.390Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
root: INFO: 2019-06-08T22:23:56.426Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/CreateFilePrefixView/Read+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GenerateFilePrefix
root: INFO: 2019-06-08T22:23:56.462Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/Create
root: INFO: 2019-06-08T22:23:56.509Z: JOB_MESSAGE_BASIC: Executing operation MakeSchemas/Read
root: INFO: 2019-06-08T22:23:56.557Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Create
root: INFO: 2019-06-08T22:23:56.597Z: JOB_MESSAGE_BASIC: Executing operation MakeTables/Read
root: INFO: 2019-06-08T22:23:56.635Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
root: INFO: 2019-06-08T22:23:56.676Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/Create
root: INFO: 2019-06-08T22:23:56.713Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Create
root: INFO: 2019-06-08T22:23:56.747Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Create
root: INFO: 2019-06-08T22:23:56.790Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
root: INFO: 2019-06-08T22:23:56.838Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/Session" materialized.
root: INFO: 2019-06-08T22:23:56.874Z: JOB_MESSAGE_DEBUG: Value "MakeSchemas/Read.out" materialized.
root: INFO: 2019-06-08T22:23:56.920Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Session" materialized.
root: INFO: 2019-06-08T22:23:56.958Z: JOB_MESSAGE_DEBUG: Value "MakeTables/Read.out" materialized.
root: INFO: 2019-06-08T22:23:57.006Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Session" materialized.
root: INFO: 2019-06-08T22:23:57.053Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/Session" materialized.
root: INFO: 2019-06-08T22:23:57.096Z: JOB_MESSAGE_DEBUG: Value "GroupByKey/Session" materialized.
root: INFO: 2019-06-08T22:23:57.140Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Session" materialized.
root: INFO: 2019-06-08T22:23:57.189Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Session" materialized.
root: INFO: 2019-06-08T22:23:57.227Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Read.out.0)
root: INFO: 2019-06-08T22:23:57.260Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Read.out.0)
root: INFO: 2019-06-08T22:23:57.301Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/AppendDestination/_UnpickledSideInput(Read.out.0)
root: INFO: 2019-06-08T22:23:57.342Z: JOB_MESSAGE_BASIC: Executing operation Create/Read+Map(<lambda at bigquery_file_loads_test.py:442>)+GroupByKey/Reify+GroupByKey/Write
root: INFO: 2019-06-08T22:23:57.380Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Read.out.0).output" materialized.
root: INFO: 2019-06-08T22:23:57.417Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Read.out.0).output" materialized.
root: INFO: 2019-06-08T22:23:57.461Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/AppendDestination/_UnpickledSideInput(Read.out.0).output" materialized.
root: INFO: 2019-06-08T22:26:07.768Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: Deleting dataset python_bq_file_loads_15600326007866 in project apache-beam-testing
--------------------- >> end captured logging << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-08_15_14_48-4893369334577595215?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-08_15_29_39-1656130379838301507?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-08_15_36_45-4552518933584727651?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-08_15_46_52-1276033176226536223?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-08_15_14_45-59068063367927486?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-08_15_38_55-16556259040783398941?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-08_15_14_50-18344057455035966503?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-08_15_28_13-7813932463942124870?project=apache-beam-testing.
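The BeamDeprecationWarning above ("options is deprecated since First stable release") is raised when code reads the deprecated <pipeline>.options attribute. In user pipelines the attribute can usually be avoided by building a PipelineOptions object once and handing it to the Pipeline constructor. A minimal sketch, assuming the DirectRunner and a placeholder temp_location bucket (neither is taken from this test run):

# Minimal sketch (assumptions: DirectRunner, placeholder GCS bucket): build the
# options object up front and pass it to the Pipeline, instead of reaching back
# into the deprecated <pipeline>.options attribute later.
import apache_beam as beam
from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

options = PipelineOptions(['--runner=DirectRunner'])
options.view_as(GoogleCloudOptions).temp_location = 'gs://example-bucket/temp'  # placeholder bucket

with beam.Pipeline(options=options) as p:
    _ = (p
         | 'Create' >> beam.Create(['a', 'b', 'c'])
         | 'Print' >> beam.Map(print))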
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-08_15_36_11-9431057798988259193?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-08_15_44_10-6677757323828724942?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-08_15_14_48-15209358125191180663?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-08_15_34_50-6738847629481091443?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-08_15_45_33-9813961695330582211?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-08_15_54_02-2493070751425121975?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-08_15_14_49-3819956772351531133?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Exception in thread Thread-2:
Traceback (most recent call last):
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-08_15_23_47-14421877663087209173?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-08_15_31_08-15669137607112196406?project=apache-beam-testing.
  File "/usr/lib/python3.5/threading.py", line 914, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.5/threading.py", line 862, in run
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-08_15_38_16-16899948917516533293?project=apache-beam-testing.
    self._target(*self._args, **self._kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 157, in poll_for_job_completion
    response = runner.dataflow_client.get_job(job_id)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/utils/retry.py>", line 197, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 668, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py>", line 689, in Get
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py>", line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py>", line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py>", line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
apitools.base.py.exceptions.HttpNotFoundError: HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-06-08_15_23_47-14421877663087209173?alt=json>: response: <{'server': 'ESF', 'status': '404', 'content-length': '280', 'vary': 'Origin, X-Origin, Referer', 'x-xss-protection': '0', '-content-encoding': 'gzip', 'x-content-type-options': 'nosniff', 'x-frame-options': 'SAMEORIGIN', 'date': 'Sat, 08 Jun 2019 22:26:53 GMT', 'content-type': 'application/json; charset=UTF-8', 'cache-control': 'private', 'transfer-encoding': 'chunked'}>, content <{
  "error": {
    "code": 404,
    "message": "(67982144318c3dca): Information about job 2019-06-08_15_23_47-14421877663087209173 could not be found in our system. Please double check the id is correct. If it is please contact customer support.",
    "status": "NOT_FOUND"
  }
}
>
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-08_15_14_46-2475170960122135298?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-08_15_22_53-7522363257222222709?project=apache-beam-testing.
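The FutureWarnings from fileio_test.py refer to the experimental fileio transforms (MatchAll and ReadMatches). A minimal sketch of that pattern, using a placeholder glob rather than the files the test suite actually creates:

# Minimal sketch of the experimental fileio transforms named in the FutureWarnings
# (MatchAll and ReadMatches); the glob below is a placeholder, not from the test.
import apache_beam as beam
from apache_beam.io import fileio

with beam.Pipeline() as p:
    matches = (p
               | 'CreatePatterns' >> beam.Create(['/tmp/input/*.txt'])  # placeholder glob
               | 'MatchAll' >> fileio.MatchAll())                       # emits FileMetadata records

    _ = (matches
         | 'GetPath' >> beam.Map(lambda metadata: metadata.path)
         | 'PrintPath' >> beam.Map(print))

    _ = (matches
         | 'ReadMatches' >> fileio.ReadMatches()                        # emits ReadableFile objects
         | 'ReadText' >> beam.Map(lambda readable_file: readable_file.read_utf8())
         | 'PrintText' >> beam.Map(print))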
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-08_15_31_06-7674213828338553685?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-08_15_38_51-3938394713822831148?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-08_15_46_40-11753871987487586137?project=apache-beam-testing.
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-08_15_53_51-12278758027520654165?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-08_15_14_47-14633594595139246403?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-08_15_23_21-13246377647066800057?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-08_15_32_25-16164900233140073758?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-08_15_39_42-13373638945899641344?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-08_15_49_04-98662495687384541?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-08_15_14_48-1660666179949974009?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-08_15_23_10-2896402844054927700?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-08_15_33_47-15675077060589781310?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-08_15_41_29-7472450836373482862?project=apache-beam-testing.
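The recurring BigQuerySink deprecation warning names its replacement directly: WriteToBigQuery. A minimal sketch of beam.io.WriteToBigQuery with placeholder project, dataset, table and schema values (none of them from this test run; actually running it needs a reachable BigQuery project and credentials):

# Minimal sketch of the replacement the deprecation warning points to
# (beam.io.WriteToBigQuery instead of the deprecated BigQuerySink). Project,
# dataset, table and schema are placeholders, not values from this test run.
import apache_beam as beam

rows = [
    {'word': 'beam', 'count': 3},
    {'word': 'dataflow', 'count': 1},
]

with beam.Pipeline() as p:
    _ = (p
         | 'CreateRows' >> beam.Create(rows)
         | 'WriteToBigQuery' >> beam.io.WriteToBigQuery(
             table='my-project:my_dataset.word_counts',  # placeholder table spec
             schema='word:STRING,count:INTEGER',
             create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
             write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))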
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-08_15_51_36-15192124721134313020?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-08_16_00_17-12253473558206869008?project=apache-beam-testing.
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 42 tests in 3319.930s

FAILED (SKIP=5, failures=1)

> Task :sdks:python:test-suites:dataflow:py35:postCommitIT FAILED

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py37/build.gradle>' line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py37/build.gradle>' line: 78

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle>' line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 56m 32s

77 actionable tasks: 60 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/pzap5zrirel3w

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
