See <https://ci-beam.apache.org/job/beam_PostCommit_Python2/2775/display/redirect?page=changes>
Changes:

[Etienne Chauchot] [BEAM-10471] change the test condition for testEstimatedSizeBytes to

[Colm O hEigeartaigh] BEAM-10668 - Replace toLowerCase().equals() with equalsIgnoreCase

[noreply] [BEAM-10361] upgrade Kotlin version in example (#12497)


------------------------------------------
[...truncated 23.97 MB...]
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:05:15.022Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(GenerateFilePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:05:15.048Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:05:15.056Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(CopyJobNamePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:05:15.082Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(LoadJobNamePrefix.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:05:15.085Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(GenerateFilePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:05:15.110Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:05:15.115Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(LoadJobNamePrefix.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:05:15.154Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(CopyJobNamePrefix.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:05:15.188Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(GenerateFilePrefix.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:05:15.224Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:05:15.299Z: JOB_MESSAGE_BASIC: Executing operation create/Read+write/BigQueryBatchFileLoads/RewindowIntoGlobal+write/BigQueryBatchFileLoads/AppendDestination+write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+write/BigQueryBatchFileLoads/IdentityWorkaround+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+write/BigQueryBatchFileLoads/GroupShardedRows/Reify+write/BigQueryBatchFileLoads/GroupShardedRows/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:05:32.693Z: JOB_MESSAGE_BASIC: Finished operation create/Read+write/BigQueryBatchFileLoads/RewindowIntoGlobal+write/BigQueryBatchFileLoads/AppendDestination+write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+write/BigQueryBatchFileLoads/IdentityWorkaround+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+write/BigQueryBatchFileLoads/GroupShardedRows/Reify+write/BigQueryBatchFileLoads/GroupShardedRows/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:05:32.760Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/GroupShardedRows/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:05:32.827Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/GroupShardedRows/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:05:32.888Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/GroupShardedRows/Read+write/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow+write/BigQueryBatchFileLoads/DropShardNumber+write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile+write/BigQueryBatchFileLoads/IdentityWorkaround+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:05:33.415Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:05:33.453Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:05:35.901Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/GroupShardedRows/Read+write/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow+write/BigQueryBatchFileLoads/DropShardNumber+write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile+write/BigQueryBatchFileLoads/IdentityWorkaround+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:05:35.970Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:05:36.041Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:05:36.143Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow+write/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles)+write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)+write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/TriggerLoadJobsWithoutTempTables
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:05:48.866Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow+write/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles)+write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)+write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/TriggerLoadJobsWithoutTempTables
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:05:48.938Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs).out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:05:48.969Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs).TemporaryTables" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:05:49Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:05:49.029Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:05:49.064Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:05:49.082Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:05:49.096Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:05:49.108Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:05:49.124Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/Flatten
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:05:49.146Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:05:49.157Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:05:49.193Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:05:49.220Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/Flatten
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:05:49.227Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:05:49.274Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/Flatten.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:05:49.311Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs+write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:05:53.479Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs+write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:05:53.533Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs).out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:05:53.606Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:05:53.666Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:05:53.730Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:05:53.831Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs+write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+write/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:05:56.543Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs+write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+write/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:05:56.617Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:05:56.669Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:05:56.735Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow+write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames+write/BigQueryBatchFileLoads/RemoveTempTables/Delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:05:59.694Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow+write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames+write/BigQueryBatchFileLoads/RemoveTempTables/Delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:05:59.767Z: JOB_MESSAGE_DEBUG: Executing success step success44
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:05:59.916Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:05:59.970Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:06:00.001Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:06:06.978Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:06:07.022Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:06:07.058Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-08-10_11_59_19-12533513157855706311 is in state JOB_STATE_DONE
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:06:53.887Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:06:53.934Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:06:53.970Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-08-10_11_59_41-5919289084164246147 is in state JOB_STATE_DONE
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query SELECT bytes, date, time FROM python_write_to_table_15970859682515.python_no_schema_table to BQ
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authentication via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/[email protected]/token HTTP/1.1" 200 221
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/2b37983c-f96e-4849-a6cb-be70c284b5f4?location=US&maxResults=0 HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anon342ba988df4023564a75a8e6d20e89b12495a176/data HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Result of query is: [('xyw', datetime.date(2011, 1, 1), datetime.time(23, 59, 59, 999999)), ('abc', datetime.date(2000, 1, 1), datetime.time(0, 0)), ('\xab\xac\xad', datetime.date(2000, 1, 1), datetime.time(0, 0)), ('\xe4\xbd\xa0\xe5\xa5\xbd', datetime.date(3000, 12, 31), datetime.time(23, 59, 59))]
INFO:apache_beam.io.gcp.bigquery_write_it_test:Deleting dataset python_write_to_table_15970859682515 in project apache-beam-testing
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:08:35.756Z: JOB_MESSAGE_BASIC: Executing BigQuery import job "dataflow_job_15921273653642875301". You can check its status with the bq tool: "bq show -j --project_id=apache-beam-testing dataflow_job_15921273653642875301".
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:08:46.755Z: JOB_MESSAGE_BASIC: BigQuery import job "dataflow_job_15921273653642875301" done.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:08:47.525Z: JOB_MESSAGE_BASIC: Finished operation read+write/NativeWrite
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:08:47.644Z: JOB_MESSAGE_DEBUG: Executing success step success1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:08:47.770Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:08:47.865Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:08:47.902Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:09:39.666Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:09:39.716Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-10T19:09:39.750Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-08-10_12_03_26-12553454531499720214 is in state JOB_STATE_DONE
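The JOB_MESSAGE above already gives the bq CLI invocation for inspecting the BigQuery import (load) job. A programmatic equivalent, as a minimal sketch assuming the google-cloud-bigquery client library rather than anything the test harness itself uses:

    # Minimal sketch: look up the BigQuery load job named in the log above.
    # Assumes google-cloud-bigquery; the "US" location mirrors the query URLs
    # elsewhere in this log and is an assumption, not taken from the message.
    from google.cloud import bigquery

    client = bigquery.Client(project="apache-beam-testing")
    job = client.get_job("dataflow_job_15921273653642875301", location="US")
    print(job.job_type, job.state)  # e.g. "load DONE"
    if job.error_result:
        print(job.error_result)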
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query SELECT fruit from `python_query_to_table_15970861956856.output_table`; to BQ
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authentication via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/[email protected]/token HTTP/1.1" 200 221
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/0c705718-aff3-4ac8-b62f-4567649e3c66?location=US&maxResults=0 HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anon2a97623c6f46d917bc188d64645db8b451d2ac00/data HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Read from given query (SELECT fruit from `python_query_to_table_15970861956856.output_table`;), total rows 2
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Generate checksum: 3b2cefe89863bf492d48f7d4da960f2999802a89
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_avro (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
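The bigquery_matcher lines above show the verification flow these tests use: run the check query, count the rows, and compare a checksum of the results against an expected value. A rough sketch of that flow, assuming the google-cloud-bigquery client; Beam's actual matcher goes through its own BigQuery wrapper and row-formatting helpers, so this digest would only match the logged one by coincidence:

    # Rough sketch of verify-by-checksum, simplified from what the log shows.
    import hashlib

    from google.cloud import bigquery

    client = bigquery.Client(project="apache-beam-testing")
    query = "SELECT fruit from `python_query_to_table_15970861956856.output_table`;"
    rows = [tuple(row.values()) for row in client.query(query).result()]
    # Hash a canonical (sorted) rendering so row order cannot affect the result.
    digest = hashlib.sha1(repr(sorted(rows)).encode("utf-8")).hexdigest()
    print("total rows %d, checksum %s" % (len(rows), digest))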
======================================================================
ERROR: test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads_test.py>", line 717, in test_multiple_destinations_transform
    max_files_per_bundle=-1))
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py>", line 555, in __exit__
    self.result = self.run()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py>", line 521, in run
    allow_proto_holders=True).run(False)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py>", line 534, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py>", line 57, in run_pipeline
    self).run_pipeline(pipeline, options)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 586, in run_pipeline
    self.dataflow_client.create_job(self.job), self)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py>", line 236, in wrapper
    return fun(*args, **kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 656, in create_job
    self.create_job_description(job)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 712, in create_job_description
    resources = self._stage_resources(job.proto_pipeline, job.options)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 609, in _stage_resources
    resources, staging_location=google_cloud_options.staging_location)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/portability/stager.py>", line 305, in stage_job_resources
    file_path, FileSystems.join(staging_location, staged_path))
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 975, in stage_artifact
    local_path_to_artifact, artifact_name)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py>", line 236, in wrapper
    return fun(*args, **kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 573, in _gcs_file_copy
    self.stage_file(to_folder, to_name, f, total_size=total_size)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 634, in stage_file
    response = self._storage_client.objects.Insert(request, upload=upload)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/internal/clients/storage/storage_v1_client.py>", line 1156, in Insert
    upload=upload, upload_config=upload_config)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py>", line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py>", line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py>", line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
HttpError: HttpError accessing <https://www.googleapis.com/resumable/upload/storage/v1/b/temp-storage-for-end-to-end-tests/o?uploadType=resumable&alt=json&upload_id=AAANsUnGTB0EtoRyDuYx3sd4WkfUsBImcoi7Kvxdi6D6NE_cIkhvjbO8hJkqjVnZkJRjlmitv6PzwaP1nJN0JXRONhrfMtVlpA&name=staging-it%2Fbeamapp-jenkins-0810181904-668832.1597083544.668959%2Fdataflow-worker.jar>: response: <{'status': '503', 'content-length': '0', 'server': 'UploadServer', 'x-guploader-uploadid': 'AAANsUnGTB0EtoRyDuYx3sd4WkfUsBImcoi7Kvxdi6D6NE_cIkhvjbO8hJkqjVnZkJRjlmitv6PzwaP1nJN0JXRONhrfMtVlpA', 'date': 'Mon, 10 Aug 2020 18:19:23 GMT', 'content-type': 'text/plain; charset=utf-8'}>, content <>
-------------------- >> begin captured logging << --------------------
apache_beam.io.gcp.bigquery_file_loads_test: INFO: Created dataset python_bq_file_loads_1597083537964 in project apache-beam-testing
apache_beam.runners.portability.stager: INFO: Executing command: ['<https://ci-beam.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/bin/python>', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']
apache_beam.runners.portability.stager: INFO: Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
root: WARNING: Make sure that locally built Python SDK docker image has Python 2.7 interpreter.
root: INFO: Using Python SDK docker image: apache/beam_python2.7_sdk:2.24.0.dev. If the image is not available at local, we will try to pull from hub.docker.com
apache_beam.runners.dataflow.dataflow_runner: WARNING: Typical end users should not use this worker jar feature. It can only be used when FnAPI is enabled.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0810181904-668832.1597083544.668959/pipeline.pb...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0810181904-668832.1597083544.668959/pipeline.pb in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0810181904-668832.1597083544.668959/requirements.txt...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0810181904-668832.1597083544.668959/requirements.txt in 0 seconds.
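The failure itself is a single transient HTTP 503 from the GCS resumable-upload endpoint while staging dataflow-worker.jar, and the repeated retry.py frames in the traceback show it surfaced through the wrapper that apache_beam.utils.retry.with_exponential_backoff applies. A minimal, hypothetical sketch of that pattern follows; Beam's real decorator additionally filters for retryable HTTP status codes and takes a logger and clock, and the names and defaults here are illustrative, not Beam's:

    # Hypothetical sketch of retry with exponential backoff plus jitter.
    import random
    import time

    def call_with_backoff(fn, num_retries=5, initial_delay_secs=1.0):
        for attempt in range(num_retries + 1):
            try:
                return fn()
            except Exception:  # Beam narrows this to retryable server errors
                if attempt == num_retries:
                    raise
                # Double the delay each attempt; jitter avoids thundering herds.
                time.sleep(initial_delay_secs * (2 ** attempt) + random.random())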
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0810181904-668832.1597083544.668959/parameterized-0.7.4.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0810181904-668832.1597083544.668959/parameterized-0.7.4.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0810181904-668832.1597083544.668959/mock-2.0.0.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0810181904-668832.1597083544.668959/mock-2.0.0.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0810181904-668832.1597083544.668959/six-1.15.0.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0810181904-668832.1597083544.668959/six-1.15.0.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0810181904-668832.1597083544.668959/funcsigs-1.0.2.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0810181904-668832.1597083544.668959/funcsigs-1.0.2.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0810181904-668832.1597083544.668959/pbr-5.4.5.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0810181904-668832.1597083544.668959/pbr-5.4.5.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0810181904-668832.1597083544.668959/PyHamcrest-1.10.1.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0810181904-668832.1597083544.668959/PyHamcrest-1.10.1.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0810181904-668832.1597083544.668959/dataflow_python_sdk.tar...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0810181904-668832.1597083544.668959/dataflow_python_sdk.tar in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0810181904-668832.1597083544.668959/dataflow-worker.jar...
--------------------- >> end captured logging << ---------------------
----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py27.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 66 tests in 4057.448s

FAILED (SKIP=7, errors=1)

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/portable/common.gradle>' line: 210

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:postCommitPy2IT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 116

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 9m 34s

158 actionable tasks: 121 executed, 35 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/hr7vfzjjr2xpa

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
