See 
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/3325/display/redirect?page=changes>

Changes:

[chamikaramj] Do not set empty specs in external transforms

[noreply] [BEAM-11614] Clarify Beam model for zero subtransform case (#13744)

[noreply] [BEAM-11531] Remove transform implementation (#13705)

[noreply] [BEAM-11531] Skip some DataFrame tests that are broken in pandas 1.2.0

[noreply] [BEAM-11357] Add annotations to PTransforms and enable them in Go SDK


------------------------------------------
[...truncated 52.28 MB...]
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:21:06.514Z: 
JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:25:54.947Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseSingleElementPC/Read+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/LoadJobNamePrefix+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/CopyJobNamePrefix+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GenerateFilePrefix
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:25:55.316Z: 
JOB_MESSAGE_DEBUG: Value 
"WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/LoadJobNamePrefix.out"
 materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:25:55.365Z: 
JOB_MESSAGE_DEBUG: Value 
"WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/CopyJobNamePrefix.out"
 materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:25:55.398Z: 
JOB_MESSAGE_DEBUG: Value 
"WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GenerateFilePrefix.out"
 materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:25:55.436Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(LoadJobNamePrefix.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:25:55.467Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(LoadJobNamePrefix.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:25:55.490Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(LoadJobNamePrefix.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:25:55.494Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(CopyJobNamePrefix.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:25:55.506Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(LoadJobNamePrefix.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:25:55.518Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(GenerateFilePrefix.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:25:55.544Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:25:55.547Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(CopyJobNamePrefix.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:25:55.571Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(GenerateFilePrefix.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:25:55.574Z: 
JOB_MESSAGE_DEBUG: Value 
"WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(LoadJobNamePrefix.out.0).output"
 materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:25:55.586Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:25:55.597Z: 
JOB_MESSAGE_DEBUG: Value 
"WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(LoadJobNamePrefix.out.0).output"
 materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:25:55.624Z: 
JOB_MESSAGE_DEBUG: Value 
"WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(CopyJobNamePrefix.out.0).output"
 materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:25:55.652Z: 
JOB_MESSAGE_DEBUG: Value 
"WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(GenerateFilePrefix.out.0).output"
 materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:25:55.687Z: 
JOB_MESSAGE_DEBUG: Value 
"WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0).output"
 materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:26:37.506Z: 
JOB_MESSAGE_BASIC: Autoscaling: Resizing worker pool from 5 to 14.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:26:53.259Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 14 based on 
the rate of progress in the currently running stage(s).
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:27:07.340Z: 
JOB_MESSAGE_BASIC: Autoscaling: Resizing worker pool from 14 to 36.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:27:22.904Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 36 based on 
the rate of progress in the currently running stage(s).
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:32:03.324Z: 
JOB_MESSAGE_BASIC: Autoscaling: Resizing worker pool from 36 to 40.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:32:18.939Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 40 based on 
the rate of progress in the currently running stage(s).
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:33:07.955Z: 
JOB_MESSAGE_BASIC: Autoscaling: Resizing worker pool from 40 to 48.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:33:23.516Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 48 based on 
the rate of progress in the currently running stage(s).
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:34:39.670Z: 
JOB_MESSAGE_BASIC: Autoscaling: Resizing worker pool from 48 to 53.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:34:45.898Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 50 based on 
the rate of progress in the currently running stage(s).
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:34:45.928Z: 
JOB_MESSAGE_DETAILED: Resized worker pool to 50, though goal was 53.  This 
could be a quota issue.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:34:56.281Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 53 based on 
the rate of progress in the currently running stage(s).
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:37:09.252Z: 
JOB_MESSAGE_BASIC: Autoscaling: Resizing worker pool from 53 to 60.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:37:24.811Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 60 based on 
the rate of progress in the currently running stage(s).
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:39:06.404Z: 
JOB_MESSAGE_BASIC: Autoscaling: Resizing worker pool from 60 to 74.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:39:11.805Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 68 based on 
the rate of progress in the currently running stage(s).
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:39:11.853Z: 
JOB_MESSAGE_DETAILED: Resized worker pool to 68, though goal was 74.  This 
could be a quota issue.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:39:22.250Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 74 based on 
the rate of progress in the currently running stage(s).
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:40:18.711Z: 
JOB_MESSAGE_BASIC: Finished operation 
ReadInputText/Read+HourlyTeamScore/ParseGameEventFn+HourlyTeamScore/FilterStartTime+HourlyTeamScore/FilterEndTime+HourlyTeamScore/AddEventTimestamps+HourlyTeamScore/FixedWindowsTeam+HourlyTeamScore/ExtractAndSumScore/Map(<lambda
 at 
hourly_team_score.py:146>)+HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey+HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/Combine/Partial+HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Reify+HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:40:18.960Z: 
JOB_MESSAGE_BASIC: Executing operation 
HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:40:19.039Z: 
JOB_MESSAGE_BASIC: Finished operation 
HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:40:19.138Z: 
JOB_MESSAGE_BASIC: Executing operation 
HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Read+HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/Combine+HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/Combine/Extract+TeamScoresDict+WriteTeamScoreSums/ConvertToRow+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RewindowIntoGlobal+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/AppendDestination+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/IdentityWorkaround+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/Reify+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:40:27.846Z: 
JOB_MESSAGE_BASIC: Finished operation 
HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Read+HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/Combine+HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/Combine/Extract+TeamScoresDict+WriteTeamScoreSums/ConvertToRow+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RewindowIntoGlobal+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/AppendDestination+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/IdentityWorkaround+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/Reify+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:40:27.966Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:40:28.059Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:40:28.180Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/Read+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/DropShardNumber+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/IdentityWorkaround+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:40:28.534Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/Read+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/DropShardNumber+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/IdentityWorkaround+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:40:28.645Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:40:28.727Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:40:28.845Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles)+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/TriggerLoadJobsWithoutTempTables
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:40:40.629Z: 
JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/batchworker.py", 
line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/executor.py", 
line 181, in execute
    op.finish()
  File "dataflow_worker/native_operations.py", line 93, in 
dataflow_worker.native_operations.NativeWriteOperation.finish
  File "dataflow_worker/native_operations.py", line 94, in 
dataflow_worker.native_operations.NativeWriteOperation.finish
  File "dataflow_worker/native_operations.py", line 95, in 
dataflow_worker.native_operations.NativeWriteOperation.finish
  File 
"/usr/local/lib/python3.7/site-packages/dataflow_worker/nativeavroio.py", line 
309, in __exit__
    self._data_file_writer.fo.close()
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/filesystemio.py", 
line 220, in close
    self._uploader.finish()
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/gcsio.py", 
line 676, in finish
    raise self._upload_thread.last_error  # pylint: disable=raising-bad-type
  File "/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/gcsio.py", 
line 651, in _start_upload
    self._client.objects.Insert(self._insert_request, upload=self._upload)
  File 
"/usr/local/lib/python3.7/site-packages/apache_beam/io/gcp/internal/clients/storage/storage_v1_client.py",
 line 1156, in Insert
    upload=upload, upload_config=upload_config)
  File "/usr/local/lib/python3.7/site-packages/apitools/base/py/base_api.py", 
line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "/usr/local/lib/python3.7/site-packages/apitools/base/py/base_api.py", 
line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "/usr/local/lib/python3.7/site-packages/apitools/base/py/base_api.py", 
line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
apitools.base.py.exceptions.HttpError: HttpError accessing 
<https://www.googleapis.com/resumable/upload/storage/v1/b/temp-storage-for-end-to-end-tests/o?alt=json&name=temp-it%2Fbeamapp-jenkins-0114001941-863304.1610583581.863478%2Fdax-tmp-2021-01-13_16_19_50-4698428908229314687-S15-0-6a4c3f42ba9c9596%2Ftmp-6a4c3f42ba9c9b4b-shard--try-6ba21ba459ef7999-endshard.avro&uploadType=resumable&upload_id=ABg5-UyDJ-sJnloGh-e_9zK8Z3KGgyomkQGgBfRq6Wi6bUqEMJy_kI5y7Pm-JDDZcb8FOrbytZHh41qFhy9B_Kk946Q>:
 response: <{'content-type': 'text/plain; charset=utf-8', 
'x-guploader-uploadid': 
'ABg5-UyDJ-sJnloGh-e_9zK8Z3KGgyomkQGgBfRq6Wi6bUqEMJy_kI5y7Pm-JDDZcb8FOrbytZHh41qFhy9B_Kk946Q',
 'content-length': '0', 'date': 'Thu, 14 Jan 2021 00:40:40 GMT', 'server': 
'UploadServer', 'status': '503'}>, content <>
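
The root cause above is a transient HTTP 503 from the GCS resumable-upload 
endpoint: gcsio runs the upload on a background thread and re-raises that 
thread's last error when the file is closed ("raise 
self._upload_thread.last_error" in the traceback), so a single Service 
Unavailable response fails the whole write. A minimal sketch of the kind of 
client-side retry that absorbs such transient errors (a hypothetical helper, 
not Beam's actual gcsio code):

    import random
    import time

    # Hypothetical retry wrapper -- a sketch, not Beam's gcsio implementation.
    # It retries a flaky operation (such as a GCS resumable upload) when the
    # raised error carries a transient HTTP status like the 503 seen above.
    TRANSIENT_STATUSES = {429, 500, 502, 503, 504}

    def call_with_retries(operation, max_attempts=5, base_delay=1.0):
        """Invoke operation(), retrying transient failures with backoff.

        `operation` is any zero-argument callable; this sketch assumes a
        failed call raises an exception exposing a numeric `status` attribute.
        """
        for attempt in range(1, max_attempts + 1):
            try:
                return operation()
            except Exception as error:  # narrow to the client's HTTP error type
                status = getattr(error, 'status', None)
                if status not in TRANSIENT_STATUSES or attempt == max_attempts:
                    raise
                # Jittered exponential backoff: ~1s, 2s, 4s, ... between tries.
                time.sleep(base_delay * 2 ** (attempt - 1) + random.uniform(0, 1))

Note that Dataflow retried the failed work item: the same operation is 
reported as Finished below, and the job ultimately reaches JOB_STATE_DONE.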

apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:40:53.894Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles)+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/TriggerLoadJobsWithoutTempTables
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:40:54Z: 
JOB_MESSAGE_DEBUG: Value 
"WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs).out"
 materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:40:54.042Z: 
JOB_MESSAGE_DEBUG: Value 
"WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs).TemporaryTables"
 materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:40:54.079Z: 
JOB_MESSAGE_DEBUG: Value 
"WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables.out"
 materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:40:54.131Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:40:54.160Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:40:54.205Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:40:54.206Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:40:54.220Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:40:54.277Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/Flatten
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:40:54.280Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:40:54.335Z: 
JOB_MESSAGE_DEBUG: Value 
"WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0).output"
 materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:40:54.344Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/Flatten
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:40:54.370Z: 
JOB_MESSAGE_DEBUG: Value 
"WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0).output"
 materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:40:54.412Z: 
JOB_MESSAGE_DEBUG: Value 
"WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0).output"
 materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:40:54.455Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Read+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:40:54.492Z: 
JOB_MESSAGE_DEBUG: Value 
"WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/Flatten.out" 
materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:40:54.528Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseMonitorDestLoadJobs/Read+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:40:55.150Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseMonitorDestLoadJobs/Read+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:41:08.017Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Read+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:41:08.145Z: 
JOB_MESSAGE_DEBUG: Value 
"WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs).out"
 materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:41:08.257Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:41:08.321Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:41:08.404Z: 
JOB_MESSAGE_DEBUG: Value 
"WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0).output"
 materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:41:08.484Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Read+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:41:09.081Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Read+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:41:09.172Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:41:09.250Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:41:09.352Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/Delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:41:09.578Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/Delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:41:09.678Z: 
JOB_MESSAGE_DEBUG: Executing success step success64
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:41:09.860Z: 
JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:41:09.930Z: 
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:41:09.962Z: 
JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:42:19.427Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 74 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:42:19.503Z: 
JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-01-14T00:42:19.536Z: 
JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 
2021-01-13_16_19_50-4698428908229314687 is in state JOB_STATE_DONE
apache_beam.io.gcp.tests.bigquery_matcher: INFO: Attempting to perform query 
SELECT COUNT(*) FROM 
`apache-beam-testing.hourly_team_score_it_dataset16105835734700.leader_board` 
to BQ
google.auth._default: DEBUG: Checking None for explicit credentials as part of 
auth process...
google.auth._default: DEBUG: Checking Cloud SDK credentials as part of auth 
process...
google.auth._default: DEBUG: Cloud SDK credentials not found on disk; not using 
them
google.auth._default: DEBUG: Checking for App Engine runtime as part of auth 
process...
google.auth._default: DEBUG: No App Engine library was found, so cannot 
authenticate via App Engine Identity Credentials.
google.auth.transport._http_client: DEBUG: Making request: GET 
http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, 
connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): 
metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 
200 144
google.auth.transport.requests: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/[email protected]/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform
 HTTP/1.1" 200 241
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): 
bigquery.googleapis.com:443
urllib3.connectionpool: DEBUG: https://bigquery.googleapis.com:443 "POST 
/bigquery/v2/projects/apache-beam-testing/jobs?prettyPrint=false HTTP/1.1" 200 
None
urllib3.connectionpool: DEBUG: https://bigquery.googleapis.com:443 "GET 
/bigquery/v2/projects/apache-beam-testing/queries/fb2c10ae-250b-4cd5-b617-7898d5c68cd5?maxResults=0&location=US&prettyPrint=false
 HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://bigquery.googleapis.com:443 "GET 
/bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anon83b96dda8b01700ba33308baaa924663805ef07e/data?prettyPrint=false
 HTTP/1.1" 200 None
apache_beam.io.gcp.tests.bigquery_matcher: INFO: Read from given query (SELECT 
COUNT(*) FROM 
`apache-beam-testing.hourly_team_score_it_dataset16105835734700.leader_board`), 
total rows 1
apache_beam.io.gcp.tests.bigquery_matcher: INFO: Generate checksum: 
06e708586121302b42a6cbeb9cc4df518c2a174d
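
For context, the matcher verifies pipeline output by running the configured 
query and hashing the returned rows; the 40-hex-character value above is 
consistent with a SHA-1 digest. A minimal sketch of that verification 
pattern, assuming the google-cloud-bigquery client (the table name and query 
are illustrative, not the test's actual values):

    import hashlib

    from google.cloud import bigquery

    # Sketch of a checksum-style verification, assuming google-cloud-bigquery.
    # The query below is illustrative; the integration test builds its own
    # query and expected digest.
    client = bigquery.Client(project='apache-beam-testing')
    query = 'SELECT COUNT(*) FROM `some_dataset.leader_board`'  # hypothetical
    rows = client.query(query).result()

    # Hash a stable string rendering of every row so identical results
    # always produce the same digest across runs.
    digest = hashlib.sha1()
    for row in rows:
        digest.update(','.join(str(v) for v in row.values()).encode('utf-8'))
    print(digest.hexdigest())  # compared against the expected checksum
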
google.auth._default: DEBUG: Checking None for explicit credentials as part of 
auth process...
google.auth._default: DEBUG: Checking Cloud SDK credentials as part of auth 
process...
google.auth._default: DEBUG: Cloud SDK credentials not found on disk; not using 
them
google.auth._default: DEBUG: Checking for App Engine runtime as part of auth 
process...
google.auth._default: DEBUG: No App Engine library was found, so cannot 
authenticate via App Engine Identity Credentials.
google.auth.transport._http_client: DEBUG: Making request: GET 
http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, 
connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): 
metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 
200 144
google.auth.transport.requests: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/[email protected]/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform
 HTTP/1.1" 200 241
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): 
bigquery.googleapis.com:443
urllib3.connectionpool: DEBUG: https://bigquery.googleapis.com:443 "DELETE 
/bigquery/v2/projects/apache-beam-testing/datasets/hourly_team_score_it_dataset16105835734700?deleteContents=true&prettyPrint=false
 HTTP/1.1" 200 None
--------------------- >> end captured logging << ---------------------
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_16_19_52-16550214075591652407?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_16_33_43-5459053136850076313?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_16_41_29-16728554303889268841?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_16_49_12-14026881096312293306?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_16_56_58-4767441978081838992?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_17_04_51-6575590814558421451?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_17_11_29-8060798548156751385?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_17_18_04-8982706087083871383?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_16_19_50-4698428908229314687?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_16_42_45-9988054847118658969?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_16_52_03-9697640948049357745?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_17_00_36-10817832205496882714?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_17_09_20-5547065477462899023?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_17_17_24-11029295195784674063?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_16_19_50-10308558945693555356?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_16_31_23-10031565630218442940?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_16_38_52-6794904016674791902?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_16_47_43-15139895552264247293?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_16_56_29-2876165075372043883?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_17_04_20-14584093132905517192?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_17_12_17-14795998018936839426?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_17_20_08-11601222254711126078?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_16_19_48-3126150236771020521?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_16_38_12-4333992840473756857?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_16_50_25-16699883345535675816?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_16_59_09-2271788112331630969?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_17_07_55-15873132768361996256?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_17_16_42-17707978576573666583?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_17_24_53-14371570052070316396?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_16_22_20-17209306008481408110?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_16_30_56-10531448258080717868?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_16_39_08-2413539483389507974?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_16_47_20-11929792667016472768?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_16_55_46-6914842376151797836?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_17_03_51-1007825701718273053?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_17_11_23-17153563366649820037?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_17_18_59-17988841863297529102?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_17_26_19-16997557853954025773?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_16_19_50-17452852036243868960?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_16_28_07-12942525154825875634?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_16_38_04-10902368299419127117?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_16_48_25-13378399408975217122?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_16_56_27-6077063934966915009?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_17_06_51-639594988832436790?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_17_14_40-15114308009908824932?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_16_19_51-2577217608961728582?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_16_29_24-6002622995008103513?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_16_37_52-702965172024449454?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_16_46_27-11876394330303755136?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_16_55_21-18036563070084103700?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_17_04_06-16984720891805295571?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_17_13_02-9798985176343590455?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_17_21_26-9710400171615963332?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_16_19_49-7997864473110231472?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_16_28_40-5139770729850422214?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_16_36_53-12808485325614818571?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_16_45_00-4638693088022331454?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_16_52_36-7446899923262457372?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_17_00_09-15699558716879658549?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-13_17_16_39-984691234686407582?project=apache-beam-testing

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py37.xml
----------------------------------------------------------------------
XML: 
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 68 tests in 4469.027s

FAILED (SKIP=6, errors=1, failures=1)

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Script 
'<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle>'
 line: 118

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/6.7.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 33m 33s
216 actionable tasks: 193 executed, 19 from cache, 4 up-to-date
Gradle was unable to watch the file system for changes. The inotify watches 
limit is too low.
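(On Linux hosts this limit is governed by the fs.inotify.max_user_watches 
sysctl; raising it on the CI agent would likely silence this warning.)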

Publishing build scan...
https://gradle.com/s/ruqg7trakzija

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
