See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/3202/display/redirect>
Changes:
------------------------------------------
[...truncated 50.65 MB...]
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:05:46.321Z: JOB_MESSAGE_BASIC: Finished operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:05:46.338Z: JOB_MESSAGE_BASIC: Finished operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:05:46.353Z: JOB_MESSAGE_DEBUG: Value "WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseEmptyPC/Read.out" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:05:46.371Z: JOB_MESSAGE_BASIC: Finished operation HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:05:46.376Z: JOB_MESSAGE_DEBUG: Value "WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Session" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:05:46.439Z: JOB_MESSAGE_DEBUG: Value "WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/Session" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:05:46.468Z: JOB_MESSAGE_DEBUG: Value "WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Session" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:05:46.499Z: JOB_MESSAGE_DEBUG: Value "HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Session" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:05:46.565Z: JOB_MESSAGE_BASIC: Executing operation ReadInputText/Read+HourlyTeamScore/ParseGameEventFn+HourlyTeamScore/FilterStartTime+HourlyTeamScore/FilterEndTime+HourlyTeamScore/AddEventTimestamps+HourlyTeamScore/FixedWindowsTeam+HourlyTeamScore/ExtractAndSumScore/Map(<lambda at hourly_team_score.py:146>)+HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey+HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/Combine/Partial+HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Reify+HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:06:06.425Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:06:11.179Z: JOB_MESSAGE_BASIC: Autoscaling: Resizing worker pool from 1 to 5.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:06:20.033Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:06:20.062Z: JOB_MESSAGE_DETAILED: Resized worker pool to 1, though goal was 5. This could be a quota issue.
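The metric-descriptor notice above is a project-level limit, not a job failure: once 100 Dataflow-created descriptors exist, new custom.googleapis.com/* metrics are silently skipped. A minimal cleanup sketch against the Cloud Monitoring API the message links to (assuming the google-cloud-monitoring Python client; the dry-run print is illustrative and the actual delete call is deliberately left commented out):

    # Sketch: list Dataflow-created custom metric descriptors and flag
    # candidates for deletion. Assumes `pip install google-cloud-monitoring`.
    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project_name = "projects/apache-beam-testing"  # project id taken from this log

    for descriptor in client.list_metric_descriptors(name=project_name):
        # Dataflow-created user metrics live under custom.googleapis.com/*.
        if descriptor.type.startswith("custom.googleapis.com/"):
            print("candidate for deletion:", descriptor.name)
            # client.delete_metric_descriptor(name=descriptor.name)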
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:06:30.292Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:06:30.317Z: JOB_MESSAGE_DETAILED: Resized worker pool to 3, though goal was 5. This could be a quota issue.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:06:40.563Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:07:48.739Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:07:48.761Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:11:30.467Z: JOB_MESSAGE_BASIC: Finished operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseSingleElementPC/Read+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/LoadJobNamePrefix+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/CopyJobNamePrefix+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GenerateFilePrefix
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:11:30.524Z: JOB_MESSAGE_DEBUG: Value "WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ImpulseSingleElementPC/Read.out" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:11:30.601Z: JOB_MESSAGE_DEBUG: Value "WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/LoadJobNamePrefix.out" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:11:30.648Z: JOB_MESSAGE_DEBUG: Value "WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/CopyJobNamePrefix.out" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:11:30.690Z: JOB_MESSAGE_DEBUG: Value "WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GenerateFilePrefix.out" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:11:30.746Z: JOB_MESSAGE_BASIC: Executing operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(LoadJobNamePrefix.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:11:30.782Z: JOB_MESSAGE_BASIC: Executing operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(LoadJobNamePrefix.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:11:30.797Z: JOB_MESSAGE_BASIC: Finished operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(LoadJobNamePrefix.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:11:30.810Z: JOB_MESSAGE_BASIC: Executing operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(CopyJobNamePrefix.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:11:30.827Z: JOB_MESSAGE_BASIC: Finished operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(LoadJobNamePrefix.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:11:30.837Z: JOB_MESSAGE_BASIC: Executing operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(GenerateFilePrefix.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:11:30.854Z: JOB_MESSAGE_BASIC: Finished operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(CopyJobNamePrefix.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:11:30.865Z: JOB_MESSAGE_BASIC: Executing operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:11:30.880Z: JOB_MESSAGE_BASIC: Finished operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(GenerateFilePrefix.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:11:30.888Z: JOB_MESSAGE_DEBUG: Value "WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(LoadJobNamePrefix.out.0).output" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:11:30.901Z: JOB_MESSAGE_BASIC: Finished operation WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:11:30.914Z: JOB_MESSAGE_DEBUG: Value "WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(LoadJobNamePrefix.out.0).output" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:11:30.940Z: JOB_MESSAGE_DEBUG: Value "WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(CopyJobNamePrefix.out.0).output" materialized.
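All of the LoadJobNamePrefix / TriggerLoadJobs / TriggerCopyJobs / WriteRecordsToFile steps above are the internal expansion of a single user-level transform: beam.io.WriteToBigQuery running in batch file-loads mode, as WriteTeamScoreSums does in this test. A minimal sketch of that user-facing shape (the table and schema here are illustrative placeholders, not the values this test uses):

    # Sketch of the transform whose expansion produces the
    # BigQueryBatchFileLoads steps seen in this log.
    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (
            p
            | "CreateRows" >> beam.Create([{"team": "red", "total_score": 42}])
            | "WriteTeamScoreSums" >> beam.io.WriteToBigQuery(
                table="my-project:my_dataset.team_scores",  # placeholder
                schema="team:STRING,total_score:INTEGER",
                method=beam.io.WriteToBigQuery.Method.FILE_LOADS,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )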
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:11:30.975Z: JOB_MESSAGE_DEBUG: Value "WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(GenerateFilePrefix.out.0).output" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:11:31.010Z: JOB_MESSAGE_DEBUG: Value "WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0).output" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:12:21.436Z: JOB_MESSAGE_BASIC: Autoscaling: Resizing worker pool from 5 to 67.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:12:26.734Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 10 based on the rate of progress in the currently running stage(s).
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:12:26.772Z: JOB_MESSAGE_DETAILED: Resized worker pool to 10, though goal was 67. This could be a quota issue.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:12:36.984Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 66 based on the rate of progress in the currently running stage(s).
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:12:37.013Z: JOB_MESSAGE_DETAILED: Resized worker pool to 66, though goal was 67. This could be a quota issue.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:12:57.496Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 67 based on the rate of progress in the currently running stage(s).
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:21:56.359Z: JOB_MESSAGE_BASIC: Finished operation ReadInputText/Read+HourlyTeamScore/ParseGameEventFn+HourlyTeamScore/FilterStartTime+HourlyTeamScore/FilterEndTime+HourlyTeamScore/AddEventTimestamps+HourlyTeamScore/FixedWindowsTeam+HourlyTeamScore/ExtractAndSumScore/Map(<lambda at hourly_team_score.py:146>)+HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey+HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/Combine/Partial+HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Reify+HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:21:56.623Z: JOB_MESSAGE_BASIC: Executing operation HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:21:56.674Z: JOB_MESSAGE_BASIC: Finished operation HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:21:56.738Z: JOB_MESSAGE_BASIC: Executing operation HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Read+HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/Combine+HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/Combine/Extract+TeamScoresDict+WriteTeamScoreSums/ConvertToRow+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RewindowIntoGlobal+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/AppendDestination+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/IdentityWorkaround+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/Reify+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:21:57.094Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/batchworker.py", line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.7/site-packages/dataflow_worker/executor.py", line 179, in execute
    op.start()
  File "dataflow_worker/shuffle_operations.py", line 63, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 64, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 79, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 80, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "dataflow_worker/shuffle_operations.py", line 84, in dataflow_worker.shuffle_operations.GroupedShuffleReadOperation.start
  File "apache_beam/runners/worker/operations.py", line 359, in apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 221, in apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "dataflow_worker/shuffle_operations.py", line 261, in dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
  File "dataflow_worker/shuffle_operations.py", line 272, in dataflow_worker.shuffle_operations.BatchGroupAlsoByWindowsOperation.process
ValueError: too many values to unpack (expected 3)
[...the identical JOB_MESSAGE_ERROR traceback was logged seven more times, at 06:21:57.116Z, 06:21:57.421Z, 06:21:57.494Z, 06:21:57.516Z, 06:21:57.721Z, 06:21:57.834Z, and 06:21:58.005Z...]
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:21:58.028Z: JOB_MESSAGE_BASIC: Finished operation HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Read+HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/Combine+HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/Combine/Extract+TeamScoresDict+WriteTeamScoreSums/ConvertToRow+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RewindowIntoGlobal+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/AppendDestination+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/IdentityWorkaround+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/Reify+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:21:58.116Z: JOB_MESSAGE_DEBUG: Executing failure step failure64
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:21:58.152Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S11:HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/GroupByKey/Read+HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/Combine+HourlyTeamScore/ExtractAndSumScore/CombinePerKey(sum)/Combine/Extract+TeamScoresDict+WriteTeamScoreSums/ConvertToRow+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/RewindowIntoGlobal+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/AppendDestination+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/IdentityWorkaround+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/Reify+WriteTeamScoreSums/WriteToBigQuery/BigQueryBatchFileLoads/GroupShardedRows/Write failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was attempted on these workers: beamapp-jenkins-121406053-12132205-upzb-harness-j9z0 Root cause: Work item failed., beamapp-jenkins-121406053-12132205-upzb-harness-gpgd Root cause: Work item failed., beamapp-jenkins-121406053-12132205-upzb-harness-p4kq Root cause: Work item failed., beamapp-jenkins-121406053-12132205-upzb-harness-lh1z Root cause: Work item failed.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:21:58.241Z: JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:21:58.428Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:21:58.465Z: JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:22:54.077Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 67 to 0.
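The repeated worker error reduces to a plain Python failure: BatchGroupAlsoByWindowsOperation.process unpacks each decoded shuffle entry into exactly three components, and the entries in this job evidently decoded into more. A standalone reproduction of the error class (the entry contents below are hypothetical; the real entries are Dataflow shuffle records):

    # Hypothetical reproduction of the worker failure mode: a shuffle entry
    # that decodes into more parts than the unpacking expects.
    entry = ("key", "window-info", "value", "unexpected-extra-part")

    try:
        key, windows, value = entry  # process() expects exactly 3 components
    except ValueError as err:
        print(err)  # -> too many values to unpack (expected 3)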
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:22:54.122Z: JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-14T06:22:54.165Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-12-13_22_05_38-10822446963966584676 is in state JOB_STATE_FAILED
google.auth._default: DEBUG: Checking None for explicit credentials as part of auth process...
google.auth._default: DEBUG: Checking Cloud SDK credentials as part of auth process...
google.auth._default: DEBUG: Cloud SDK credentials not found on disk; not using them
google.auth._default: DEBUG: Checking for App Engine runtime as part of auth process...
google.auth._default: DEBUG: No App Engine library was found so cannot authenticate via App Engine Identity Credentials.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/[email protected]/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform HTTP/1.1" 200 241
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): bigquery.googleapis.com:443
urllib3.connectionpool: DEBUG: https://bigquery.googleapis.com:443 "DELETE /bigquery/v2/projects/apache-beam-testing/datasets/hourly_team_score_it_dataset16079259219217?deleteContents=true&prettyPrint=false HTTP/1.1" 200 None
--------------------- >> end captured logging << ---------------------
----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py37.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 68 tests in 4259.287s

FAILED (SKIP=7, errors=1)

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 118

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.7/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 15m 44s

211 actionable tasks: 152 executed, 55 from cache, 4 up-to-date

Gradle was unable to watch the file system for changes. The inotify watches limit is too low.

Publishing build scan...
https://gradle.com/s/ghonbp4hxxvoi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
