See
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/3650/display/redirect>
Changes:
------------------------------------------
[...truncated 60.05 MB...]
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:21:32.302Z:
JOB_MESSAGE_BASIC: Executing operation
write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:21:32.329Z:
JOB_MESSAGE_BASIC: Finished operation
write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(GenerateFilePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:21:32.332Z:
JOB_MESSAGE_DEBUG: Value
"write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(LoadJobNamePrefix.out.0).output"
materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:21:32.347Z:
JOB_MESSAGE_BASIC: Finished operation
write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:21:32.360Z:
JOB_MESSAGE_DEBUG: Value
"write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(LoadJobNamePrefix.out.0).output"
materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:21:32.409Z:
JOB_MESSAGE_DEBUG: Value
"write/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema)/_UnpickledSideInput(SchemaModJobNamePrefix.out.0).output"
materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:21:32.435Z:
JOB_MESSAGE_DEBUG: Value
"write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(CopyJobNamePrefix.out.0).output"
materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:21:32.480Z:
JOB_MESSAGE_DEBUG: Value
"write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0).output"
materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:21:32.515Z:
JOB_MESSAGE_DEBUG: Value
"write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(GenerateFilePrefix.out.0).output"
materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:21:32.572Z:
JOB_MESSAGE_BASIC: Executing operation
write/BigQueryBatchFileLoads/GroupShardedRows/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:21:32.931Z:
JOB_MESSAGE_BASIC: Finished operation
write/BigQueryBatchFileLoads/GroupShardedRows/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:21:33.004Z:
JOB_MESSAGE_DEBUG: Value
"write/BigQueryBatchFileLoads/GroupShardedRows/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:21:33.091Z:
JOB_MESSAGE_BASIC: Executing operation
write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:21:33.231Z:
JOB_MESSAGE_BASIC: Finished operation
write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:21:33.310Z:
JOB_MESSAGE_DEBUG: Value
"write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Session"
materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:21:33.417Z:
JOB_MESSAGE_BASIC: Executing operation
create/Read+write/BigQueryBatchFileLoads/RewindowIntoGlobal+write/BigQueryBatchFileLoads/AppendDestination+write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+write/BigQueryBatchFileLoads/IdentityWorkaround+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+write/BigQueryBatchFileLoads/GroupShardedRows/Reify+write/BigQueryBatchFileLoads/GroupShardedRows/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:21:50.498Z:
JOB_MESSAGE_BASIC: Finished operation
create/Read+write/BigQueryBatchFileLoads/RewindowIntoGlobal+write/BigQueryBatchFileLoads/AppendDestination+write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+write/BigQueryBatchFileLoads/IdentityWorkaround+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+write/BigQueryBatchFileLoads/GroupShardedRows/Reify+write/BigQueryBatchFileLoads/GroupShardedRows/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:21:50.597Z:
JOB_MESSAGE_BASIC: Executing operation
write/BigQueryBatchFileLoads/GroupShardedRows/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:21:50.675Z:
JOB_MESSAGE_BASIC: Finished operation
write/BigQueryBatchFileLoads/GroupShardedRows/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:21:50.766Z:
JOB_MESSAGE_BASIC: Executing operation
write/BigQueryBatchFileLoads/GroupShardedRows/Read+write/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow+write/BigQueryBatchFileLoads/DropShardNumber+write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile+write/BigQueryBatchFileLoads/IdentityWorkaround+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:21:53.754Z:
JOB_MESSAGE_BASIC: Finished operation
write/BigQueryBatchFileLoads/GroupShardedRows/Read+write/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow+write/BigQueryBatchFileLoads/DropShardNumber+write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile+write/BigQueryBatchFileLoads/IdentityWorkaround+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:21:53.853Z:
JOB_MESSAGE_BASIC: Executing operation
write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:21:53.932Z:
JOB_MESSAGE_BASIC: Finished operation
write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:21:54.004Z:
JOB_MESSAGE_BASIC: Executing operation
write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow+write/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles)+write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)+write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/TriggerLoadJobsWithoutTempTables
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:05.828Z:
JOB_MESSAGE_BASIC: Finished operation
write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow+write/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles)+write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)+write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/TriggerLoadJobsWithoutTempTables
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:05.902Z:
JOB_MESSAGE_DEBUG: Value
"write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs).out"
materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:05.942Z:
JOB_MESSAGE_DEBUG: Value
"write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs).TemporaryTables"
materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:05.980Z:
JOB_MESSAGE_DEBUG: Value
"write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables.out"
materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:06.010Z:
JOB_MESSAGE_BASIC: Executing operation
write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:06.042Z:
JOB_MESSAGE_BASIC: Executing operation
write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:06.075Z:
JOB_MESSAGE_BASIC: Executing operation
write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:06.078Z:
JOB_MESSAGE_BASIC: Finished operation
write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:06.109Z:
JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/Flatten
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:06.124Z:
JOB_MESSAGE_BASIC: Finished operation
write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:06.124Z:
JOB_MESSAGE_BASIC: Finished operation
write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:06.142Z:
JOB_MESSAGE_DEBUG: Value
"write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0).output"
materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:06.169Z:
JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/Flatten
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:06.232Z:
JOB_MESSAGE_BASIC: Executing operation
write/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Read+write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+write/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema)/ParDo(UpdateDestinationSchema)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:06.261Z:
JOB_MESSAGE_DEBUG: Value
"write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0).output"
materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:06.294Z:
JOB_MESSAGE_DEBUG: Value
"write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0).output"
materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:06.328Z:
JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/Flatten.out"
materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:06.353Z:
JOB_MESSAGE_BASIC: Executing operation
write/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/Read+write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:15.060Z:
JOB_MESSAGE_BASIC: Finished operation
write/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/Read+write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:19.218Z:
JOB_MESSAGE_BASIC: Finished operation
write/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Read+write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+write/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema)/ParDo(UpdateDestinationSchema)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:19.277Z:
JOB_MESSAGE_DEBUG: Value
"write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:19.312Z:
JOB_MESSAGE_DEBUG: Value
"write/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema).out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:19.375Z:
JOB_MESSAGE_BASIC: Executing operation
write/BigQueryBatchFileLoads/WaitForSchemaModJobs/_UnpickledSideInput(ParDo(UpdateDestinationSchema).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:19.435Z:
JOB_MESSAGE_BASIC: Finished operation
write/BigQueryBatchFileLoads/WaitForSchemaModJobs/_UnpickledSideInput(ParDo(UpdateDestinationSchema).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:19.497Z:
JOB_MESSAGE_DEBUG: Value
"write/BigQueryBatchFileLoads/WaitForSchemaModJobs/_UnpickledSideInput(ParDo(UpdateDestinationSchema).out.0).output"
materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:19.609Z:
JOB_MESSAGE_BASIC: Executing operation
write/BigQueryBatchFileLoads/ImpulseMonitorSchemaModJobs/Read+write/BigQueryBatchFileLoads/WaitForSchemaModJobs/WaitForSchemaModJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:26.943Z:
JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/batchworker.py", line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/executor.py", line 181, in execute
    op.finish()
  File "dataflow_worker/native_operations.py", line 93, in dataflow_worker.native_operations.NativeWriteOperation.finish
  File "dataflow_worker/native_operations.py", line 94, in dataflow_worker.native_operations.NativeWriteOperation.finish
  File "dataflow_worker/native_operations.py", line 95, in dataflow_worker.native_operations.NativeWriteOperation.finish
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/nativeavroio.py", line 309, in __exit__
    self._data_file_writer.fo.close()
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/filesystemio.py", line 215, in close
    self._uploader.finish()
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/gcsio.py", line 675, in finish
    raise self._upload_thread.last_error # pylint: disable=raising-bad-type
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/gcsio.py", line 650, in _start_upload
    self._client.objects.Insert(self._insert_request, upload=self._upload)
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/internal/clients/storage/storage_v1_client.py", line 1156, in Insert
    upload=upload, upload_config=upload_config)
  File "/usr/local/lib/python3.7/site-packages/apitools/base/py/base_api.py", line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "/usr/local/lib/python3.7/site-packages/apitools/base/py/base_api.py", line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "/usr/local/lib/python3.7/site-packages/apitools/base/py/base_api.py", line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
apitools.base.py.exceptions.HttpError: HttpError accessing <https://www.googleapis.com/resumable/upload/storage/v1/b/temp-storage-for-end-to-end-tests/o?alt=json&name=temp-it%2Fbeamapp-jenkins-0405071501-353763.1617606901.353940%2Fdax-tmp-2021-04-05_00_15_09-15072730239384990400-S20-0-7edff0a583e58898%2Ftmp-7edff0a583e580b5-shard--try-3733fac6fa561526-endshard.avro&uploadType=resumable&upload_id=ABg5-UycHDs8NkxBP4-RpLrYFeu03uKVBC1_Cc2TM3O5g6L2RC6JQ6dCb3vPIDrLi2NVEuzTYIJN3MDZYJepSgSqlEYXkZJHog>: response: <{'content-type': 'text/plain; charset=utf-8', 'x-guploader-uploadid': 'ABg5-UycHDs8NkxBP4-RpLrYFeu03uKVBC1_Cc2TM3O5g6L2RC6JQ6dCb3vPIDrLi2NVEuzTYIJN3MDZYJepSgSqlEYXkZJHog', 'content-length': '0', 'date': 'Mon, 05 Apr 2021 07:22:24 GMT', 'server': 'UploadServer', 'status': '503'}>, content <>
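The traceback above is a transient GCS failure: the resumable-upload endpoint returned HTTP 503 while the worker closed an Avro temp file, and gcsio's background upload thread re-raised it as last_error on close. The affected job (2021-04-05_00_15_09-15072730239384990400, visible in the upload URL) still reaches JOB_STATE_DONE further down, so the service evidently retried the failed work item; the build failure reported at the end of this log comes from a different task. For local triage of this class of flake, a generic backoff wrapper along these lines can help (a minimal sketch with an assumed helper name and retry policy, not the retry path the Beam SDK actually uses):

```python
import random
import time

from apitools.base.py import exceptions


def call_with_backoff(fn, max_tries=5, base_delay=1.0):
    # Hypothetical helper, not part of the Beam SDK: retries fn() on
    # transient server-side HTTP errors such as the 503 seen above.
    for attempt in range(1, max_tries + 1):
        try:
            return fn()
        except exceptions.HttpError as e:
            status = int(e.response.get('status', 0))  # e.g. 'status': '503'
            if status not in (429, 500, 502, 503, 504) or attempt == max_tries:
                raise
            # Exponential backoff with jitter before the next attempt.
            time.sleep(base_delay * 2 ** (attempt - 1) + random.uniform(0, 1))
```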
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:31.068Z:
JOB_MESSAGE_BASIC: Finished operation
write/BigQueryBatchFileLoads/ImpulseMonitorSchemaModJobs/Read+write/BigQueryBatchFileLoads/WaitForSchemaModJobs/WaitForSchemaModJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:31.148Z:
JOB_MESSAGE_DEBUG: Value
"write/BigQueryBatchFileLoads/WaitForSchemaModJobs.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:31.240Z:
JOB_MESSAGE_BASIC: Executing operation
write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(WaitForSchemaModJobs.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:31.289Z:
JOB_MESSAGE_BASIC: Finished operation
write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(WaitForSchemaModJobs.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:31.352Z:
JOB_MESSAGE_DEBUG: Value
"write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(WaitForSchemaModJobs.out.0).output"
materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:31.451Z:
JOB_MESSAGE_BASIC: Executing operation
write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:33.009Z:
JOB_MESSAGE_BASIC: Finished operation
write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:33.079Z:
JOB_MESSAGE_DEBUG: Value
"write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs).out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:33.151Z:
JOB_MESSAGE_BASIC: Executing operation
write/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:33.202Z:
JOB_MESSAGE_BASIC: Finished operation
write/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:33.269Z:
JOB_MESSAGE_DEBUG: Value
"write/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0).output"
materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:33.337Z:
JOB_MESSAGE_BASIC: Executing operation
write/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Read+write/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:34.644Z:
JOB_MESSAGE_BASIC: Finished operation
write/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Read+write/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:34.710Z:
JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForCopyJobs.out"
materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:34.773Z:
JOB_MESSAGE_BASIC: Executing operation
write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(WaitForCopyJobs.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:34.830Z:
JOB_MESSAGE_BASIC: Finished operation
write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(WaitForCopyJobs.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:34.925Z:
JOB_MESSAGE_DEBUG: Value
"write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(WaitForCopyJobs.out.0).output"
materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:34.977Z:
JOB_MESSAGE_BASIC: Executing operation
write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:35.162Z:
JOB_MESSAGE_BASIC: Finished operation
write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:35.248Z:
JOB_MESSAGE_DEBUG: Value
"write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Session"
materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:35.304Z:
JOB_MESSAGE_BASIC: Executing operation
write/BigQueryBatchFileLoads/RemoveTempTables/Impulse/Read+write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+write/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:35.993Z:
JOB_MESSAGE_BASIC: Finished operation
write/BigQueryBatchFileLoads/RemoveTempTables/Impulse/Read+write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+write/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:36.074Z:
JOB_MESSAGE_BASIC: Executing operation
write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:36.125Z:
JOB_MESSAGE_BASIC: Finished operation
write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:36.190Z:
JOB_MESSAGE_BASIC: Executing operation
write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow+write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames/Keys+write/BigQueryBatchFileLoads/RemoveTempTables/Delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:37.438Z:
JOB_MESSAGE_BASIC: Finished operation
write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow+write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames/Keys+write/BigQueryBatchFileLoads/RemoveTempTables/Delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:37.517Z:
JOB_MESSAGE_DEBUG: Executing success step success48
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:37.637Z:
JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:37.691Z:
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:37.724Z:
JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:58.906Z:
JOB_MESSAGE_BASIC: Finished operation
Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:58.993Z:
JOB_MESSAGE_BASIC: Executing operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:59.140Z:
JOB_MESSAGE_BASIC: Finished operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:22:59.242Z:
JOB_MESSAGE_BASIC: Executing operation
GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:23:08.324Z:
JOB_MESSAGE_BASIC: Finished operation
GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:23:08.374Z:
JOB_MESSAGE_DEBUG: Executing success step success11
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:23:08.440Z:
JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:23:08.499Z:
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:23:08.522Z:
JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:23:21.673Z:
JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:23:21.717Z:
JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:23:21.745Z:
JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job
2021-04-05_00_15_09-15072730239384990400 is in state JOB_STATE_DONE
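The "Job ... is in state JOB_STATE_DONE" lines are emitted while the runner blocks on the job's terminal state. A minimal sketch of how a test pipeline typically drives this (assumed options for illustration; a real Dataflow run also needs temp_location, region, and so on):

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Illustrative options only; the dataflow_runner INFO lines above are logged
# while wait_until_finish() polls the service for job messages and state.
options = PipelineOptions(runner='DataflowRunner', project='apache-beam-testing')
p = beam.Pipeline(options=options)
_ = p | beam.Create(['a', 'b']) | beam.Map(print)
result = p.run()
result.wait_until_finish()  # blocks until e.g. JOB_STATE_DONE
```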
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query
SELECT bytes, date, time FROM
python_write_to_table_16176068959962.python_no_schema_table to BQ
DEBUG:google.auth._default:Checking None for explicit credentials as part of
auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth
process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using
them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth
process...
DEBUG:google.auth._default:No App Engine library was found so cannot
authenticate via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET
http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET
http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3,
connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1):
metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET
/computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1"
200 144
DEBUG:google.auth.transport.requests:Making request: GET
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET
/computeMetadata/v1/instance/service-accounts/[email protected]/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform
HTTP/1.1" 200 244
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1):
bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST
/bigquery/v2/projects/apache-beam-testing/jobs?prettyPrint=false HTTP/1.1" 200
None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET
/bigquery/v2/projects/apache-beam-testing/queries/9fe117e3-47fc-47d2-b3de-944f16d92975?maxResults=0&timeoutMs=10000&location=US&prettyPrint=false
HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET
/bigquery/v2/projects/apache-beam-testing/queries/9fe117e3-47fc-47d2-b3de-944f16d92975?fields=jobReference%2CtotalRows%2CpageToken%2Crows&location=US&formatOptions.useInt64Timestamp=True&prettyPrint=false
HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Result of query is:
[(b'\xab\xac\xad', datetime.date(2000, 1, 1), datetime.time(0, 0)), (b'abc',
datetime.date(2000, 1, 1), datetime.time(0, 0)), (b'xyw', datetime.date(2011,
1, 1), datetime.time(23, 59, 59, 999999)), (b'\xe4\xbd\xa0\xe5\xa5\xbd',
datetime.date(3000, 12, 31), datetime.time(23, 59, 59))]
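The matcher step queries the freshly written table and compares the rows against the expected values before the dataset is deleted below. A sketch of that kind of check using the google-cloud-bigquery client (an illustration only, not the implementation of Beam's bigquery_matcher; the helper is hypothetical, while the project, table, and expected rows are taken from the log above):

```python
import datetime

from google.cloud import bigquery


def verify_rows(project, dataset, table, expected):
    # Hypothetical helper mirroring the matcher's check: run a query and
    # compare its rows to the expected set, order-insensitively.
    client = bigquery.Client(project=project)
    query = 'SELECT bytes, date, time FROM `{}.{}`'.format(dataset, table)
    rows = [tuple(row.values()) for row in client.query(query).result()]
    assert sorted(rows) == sorted(expected), 'Table contents differ: %r' % rows


expected = [
    (b'\xab\xac\xad', datetime.date(2000, 1, 1), datetime.time(0, 0)),
    (b'abc', datetime.date(2000, 1, 1), datetime.time(0, 0)),
    (b'xyw', datetime.date(2011, 1, 1), datetime.time(23, 59, 59, 999999)),
    (b'\xe4\xbd\xa0\xe5\xa5\xbd', datetime.date(3000, 12, 31),
     datetime.time(23, 59, 59)),
]
verify_rows('apache-beam-testing', 'python_write_to_table_16176068959962',
            'python_no_schema_table', expected)
```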
INFO:apache_beam.io.gcp.bigquery_write_it_test:Deleting dataset
python_write_to_table_16176068959962 in project apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:23:59.052Z:
JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:23:59.100Z:
JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-04-05T07:23:59.135Z:
JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job
2021-04-05_00_16_22-13783445478536513180 is in state JOB_STATE_DONE
test_bigquery_tornadoes_it
(apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT)
... ok
test_datastore_wordcount_it
(apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT)
... ok
test_streaming_wordcount_debugging_it
(apache_beam.examples.streaming_wordcount_debugging_it_test.StreamingWordcountDebuggingIT)
... SKIP: Skipped due to [BEAM-3377]: assert_that not working for streaming
test_autocomplete_it
(apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_leader_board_it
(apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_run_example_with_setup_file
(apache_beam.examples.complete.juliaset.juliaset.juliaset_test_it.JuliaSetTestIT)
... ok
test_game_stats_it
(apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_streaming_wordcount_it
(apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_user_score_it
(apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_hourly_team_score_it
(apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT)
... ok
test_read_via_sql
(apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest)
... ok
test_read_via_table
(apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest)
... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_read_queries (apache_beam.io.gcp.bigquery_read_it_test.ReadAllBQTests) ...
ok
test_bigquery_read_1M_python
(apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bigquery_read_custom_1M_python
(apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bqfl_streaming
(apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP:
TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform
(apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail
(apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_avro_file_load
(apache_beam.io.gcp.bigquery_test.BigQueryFileLoadsIntegrationTests) ... ok
test_spanner_error
(apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest)
... ok
test_spanner_update
(apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest)
... ok
test_write_batches
(apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest)
... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests)
... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests)
... ok
test_multiple_destinations_transform
(apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ...
ok
test_copy_batch
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest)
... ok
test_copy_rewrite_token
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_value_provider_transform
(apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
... ok
test_datastore_write_limit
(apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT)
... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_streaming_data_only
(apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes
(apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_dicom_search_instances
(apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_dicom_store_instance_from_gcs
(apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_analyzing_syntax
(apache_beam.ml.gcp.naturallanguageml_test_it.NaturalLanguageMlTestIT) ... ok
test_label_detection_with_video_context
(apache_beam.ml.gcp.videointelligenceml_test_it.VideoIntelligenceMlTestIT) ...
ok
test_text_detection_with_language_hint
(apache_beam.ml.gcp.visionml_test_it.VisionMlTestIT) ... ok
test_basic_execution
(apache_beam.testing.test_stream_it_test.TestStreamIntegrationTests) ... SKIP:
The "TestDataflowRunner", does not support the TestStream transform. Supported
runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream supports emitting to multiple PCollections. ... SKIP:
The "TestDataflowRunner", does not support the TestStream transform. Supported
runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream can independently control output watermarks. ...
SKIP: The "TestDataflowRunner", does not support the TestStream transform.
Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
test_deidentification (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_inspection (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_big_query_legacy_sql
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT)
... ok
test_big_query_new_types
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT)
... ok
test_big_query_new_types_avro
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT)
... ok
test_big_query_new_types_native
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT)
... ok
test_big_query_standard_sql
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT)
... ok
test_big_query_standard_sql_kms_key_native
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT)
... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ...
ok
test_job_python_from_python_it
(apache_beam.transforms.external_it_test.ExternalTransformIT) ... ok
test_big_query_write
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ...
SKIP: DataflowRunner does not support schema autodetection
Test that schema update options are respected when appending to an existing ...
ok
test_big_query_write_without_schema
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_metrics_fnapi_it
(apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
... ok
test_metrics_it
(apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
... ok
----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py37.xml
----------------------------------------------------------------------
XML:
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 69 tests in 4753.499s
OK (SKIP=6)
FAILURE: Build failed with an exception.
* Where:
Script
'<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/portable/common.gradle>'
line: 198
* What went wrong:
Execution failed for task
':sdks:python:test-suites:portable:py37:postCommitPy37IT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/6.8/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 1h 23m 44s
210 actionable tasks: 150 executed, 56 from cache, 4 up-to-date
Gradle was unable to watch the file system for changes. The inotify watches
limit is too low.
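The inotify warning is unrelated to the test outcome: Gradle's file-system watching degrades when the kernel's watch limit is exhausted. A quick way to inspect the limit on a Linux CI worker (the usual remedy, raising fs.inotify.max_user_watches with sysctl, is shown in the comment and needs root):

```python
# Inspect the current inotify watch limit on Linux; Gradle's file-system
# watching consumes one watch per directory it observes.
with open('/proc/sys/fs/inotify/max_user_watches') as f:
    print('fs.inotify.max_user_watches =', int(f.read()))
# A common remedy on CI workers (requires root):
#   sudo sysctl -w fs.inotify.max_user_watches=524288
```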
Publishing build scan...
https://gradle.com/s/mnfjk7g7mcuem
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure