See <https://ci-beam.apache.org/job/beam_PostCommit_Python38/312/display/redirect?page=changes>
Changes:

[ryan.worley] [BEAM-10564] Support more Avro field name formats when mapping to Java
[lourens] Support for Kafka deserialization API with headers (since Kafka API
[lourens] Assert the deserializer method with a Headers argument exists and
[ryan.worley] Test new mappable field names
[lourens] Introduce a kafkaVersion210Test for testing KafkaIOTest against
[noreply] Fix broken link
[Ahmet Altay] Clarify Beam's use of semantic versioning.
[Alan Myrvold] [BEAM-9136] Add python dependency license CSV for license URL and type
[lourens] Let the kafkaVersion210 configuration use a resolution strategy to force
[noreply] [BEAM-9154] Disable Chicago Taxi Example on Jenkins (#12886)
[noreply] [BEAM-7372][BEAM-9372] Removes Python 2 and Python 3.5 Postcommit jobs.
[Kyle Weaver] Clean up CHANGES.md in preparation for 2.25.0 release.
[noreply] Update indexing skips for pandas 1.x (#12896)

------------------------------------------
[...truncated 22.06 MB...]
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:03:44.684Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:03:44.729Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:03:44.786Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:03:44.830Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:03:44.924Z: JOB_MESSAGE_DEBUG: Value "GroupByKey/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:03:45.004Z: JOB_MESSAGE_BASIC: Executing operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:03:51.621Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:03:55.138Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read+write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow+write/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles)+write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)+write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/TriggerLoadJobsWithoutTempTables
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:03:55.227Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs).out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:03:55.265Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs).TemporaryTables" materialized.
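The "metric descriptors" notice above points at the Cloud Monitoring API for cleaning up stale custom metrics. A minimal sketch of that cleanup with the google-cloud-monitoring client follows; the target project and the decision of which descriptors count as "unused" are assumptions here, not something the log specifies:

    # Sketch: list (and optionally delete) custom metric descriptors, per the
    # monitoring.projects.metricDescriptors.{list,delete} links in the log.
    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project_name = "projects/apache-beam-testing"  # assumed target project

    # Only custom.googleapis.com/* descriptors count against the limit above.
    descriptors = client.list_metric_descriptors(
        request={
            "name": project_name,
            "filter": 'metric.type = starts_with("custom.googleapis.com/")',
        }
    )

    for descriptor in descriptors:
        print(descriptor.type)
        # Uncomment only once you are sure the descriptor is unused:
        # client.delete_metric_descriptor(request={"name": descriptor.name})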
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:03:55.295Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:03:55.328Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:03:55.360Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:03:55.361Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:03:55.393Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:03:55.412Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:03:55.415Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/Flatten
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:03:55.440Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:03:55.442Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:03:55.470Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/Flatten
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:03:55.474Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:03:55.507Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:03:55.541Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/Flatten.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:03:55.585Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs+write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:03:59.694Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs+write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:03:59.760Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs).out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:03:59.823Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:03:59.877Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:03:59.964Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:04:00.046Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs+write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+write/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:03:58.119Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:04:02.663Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs+write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+write/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:04:02.733Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:04:02.812Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:04:02.881Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow+write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames+write/BigQueryBatchFileLoads/RemoveTempTables/Delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:04:05.770Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow+write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames+write/BigQueryBatchFileLoads/RemoveTempTables/Delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:04:05.839Z: JOB_MESSAGE_DEBUG: Executing success step success44
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:04:05.929Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:04:05.979Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:04:06.005Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:04:08.389Z: JOB_MESSAGE_DETAILED: BigQuery export job progress: "dataflow_job_7376902020001300392" observed total of 1 exported files thus far.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:04:08.463Z: JOB_MESSAGE_BASIC: BigQuery export job finished: "dataflow_job_7376902020001300392"
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:04:11.470Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:04:27.069Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:04:27.126Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:04:51.235Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:04:51.305Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:04:58.890Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:04:58.935Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:04:58.979Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-09-21_17_57_16-1884719828647388099 is in state JOB_STATE_DONE
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query SELECT bytes, date, time FROM python_write_to_table_1600736223626.python_no_schema_table to BQ
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authentication via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/[email protected]/token HTTP/1.1" 200 221
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/580c1ac4-1b36-4c54-9816-13a68232336f?maxResults=0&location=US HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anon3e2f65fe939cdfab1e668e3916dde1523ea80d27/data HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Result of query is: [(b'xyw', datetime.date(2011, 1, 1), datetime.time(23, 59, 59, 999999)), (b'\xab\xac\xad', datetime.date(2000, 1, 1), datetime.time(0, 0)), (b'abc', datetime.date(2000, 1, 1), datetime.time(0, 0)), (b'\xe4\xbd\xa0\xe5\xa5\xbd', datetime.date(3000, 12, 31), datetime.time(23, 59, 59))]
INFO:apache_beam.io.gcp.bigquery_write_it_test:Deleting dataset python_write_to_table_1600736223626 in project apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:05:16.549Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:05:16.596Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
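For reference, the bigquery_matcher check above boils down to resolving application-default credentials (the DEBUG chain: explicit -> Cloud SDK -> App Engine -> GCE metadata server) and then running a plain query. A minimal sketch with google-cloud-bigquery; the query text is copied from the log, while the client setup around it is an assumption:

    # Sketch: re-run the matcher's verification query by hand.
    import google.auth
    from google.cloud import bigquery

    # Same credential chain the DEBUG lines above walk through.
    credentials, project = google.auth.default()
    client = bigquery.Client(credentials=credentials, project=project)

    sql = ("SELECT bytes, date, time "
           "FROM python_write_to_table_1600736223626.python_no_schema_table")
    rows = [tuple(row.values()) for row in client.query(sql).result()]
    print(rows)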
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:06:00.657Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:06:00.758Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:08:12.882Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+InspectForDetails/ParDo(_InspectFn)+ParDo(CallableWrapperDoFn)/ParDo(CallableWrapperDoFn)+Type matches/WindowInto(WindowIntoFn)+Type matches/ToVoidKey+Type matches/Group/pair_with_1+Type matches/Group/GroupByKey/Reify+Type matches/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:08:16.066Z: JOB_MESSAGE_BASIC: Finished operation Type matches/Create/Read+Type matches/Group/pair_with_0+Type matches/Group/GroupByKey/Reify+Type matches/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:08:16.124Z: JOB_MESSAGE_BASIC: Executing operation Type matches/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:08:16.185Z: JOB_MESSAGE_BASIC: Finished operation Type matches/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:08:16.260Z: JOB_MESSAGE_BASIC: Executing operation Type matches/Group/GroupByKey/Read+Type matches/Group/GroupByKey/GroupByWindow+Type matches/Group/Map(_merge_tagged_vals_under_key)+Type matches/Unkey+Type matches/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:08:25.488Z: JOB_MESSAGE_BASIC: Finished operation Type matches/Group/GroupByKey/Read+Type matches/Group/GroupByKey/GroupByWindow+Type matches/Group/Map(_merge_tagged_vals_under_key)+Type matches/Unkey+Type matches/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:08:25.626Z: JOB_MESSAGE_DEBUG: Executing success step success19
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:08:25.698Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:08:25.749Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:08:25.790Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:08:52.925Z: JOB_MESSAGE_BASIC: Executing BigQuery import job "dataflow_job_2283117836682263357". You can check its status with the bq tool: "bq show -j --project_id=apache-beam-testing dataflow_job_2283117836682263357".
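The log suggests checking the import job with the bq CLI. The equivalent lookup through the Python BigQuery client, sketched under the assumption that you are authenticated against apache-beam-testing (the job location "US" is also an assumption; the log does not state it):

    # Sketch: programmatic equivalent of
    #   bq show -j --project_id=apache-beam-testing dataflow_job_2283117836682263357
    from google.cloud import bigquery

    client = bigquery.Client(project="apache-beam-testing")
    job = client.get_job("dataflow_job_2283117836682263357", location="US")
    # error_result is None for a job that completed cleanly.
    print(job.job_type, job.state, job.error_result)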
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:08:56.769Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:09:03.331Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+ExternalTransform(simple)/Map(<lambda at external_it_test.py:43>)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:09:03.405Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:09:03.445Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:09:03.514Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:09:03.546Z: JOB_MESSAGE_BASIC: BigQuery import job "dataflow_job_2283117836682263357" done.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:09:04.333Z: JOB_MESSAGE_BASIC: Finished operation read+write/NativeWrite
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:09:04.419Z: JOB_MESSAGE_DEBUG: Executing success step success1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:09:04.508Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:09:08.858Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:09:08.903Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:09:08.937Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:09:04.694Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:09:04.734Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:09:12.718Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:09:12.778Z: JOB_MESSAGE_DEBUG: Executing success step success19
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:09:12.859Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:09:12.904Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:09:12.928Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-09-21_18_02_06-8829513826135841473 is in state JOB_STATE_DONE
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:09:47.134Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:09:47.182Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:09:47.234Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-09-21_18_02_21-8459099648878759198 is in state JOB_STATE_DONE
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query SELECT fruit from `python_query_to_table_16007365303116.output_table`; to BQ
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authentication via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/[email protected]/token HTTP/1.1" 200 221
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/8d5609a7-bcb7-4cf6-9a0f-9c959af8bbcc?maxResults=0&location=US HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anonc8aa9a2ed7468481005474c017a2c2b0c441eed9/data HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Read from given query (SELECT fruit from `python_query_to_table_16007365303116.output_table`;), total rows 2
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Generate checksum: 158a8ea1c254fcf40d4ed3e7c0242c3ea0a29e72
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:09:56.479Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:09:56.521Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:09:56.556Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
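The "Generate checksum" line is the matcher reducing the query result to a single SHA-1 digest so it can be compared against an expected value. A hedged sketch of that idea; the exact canonicalization Beam's matcher applies (how rows are stringified and ordered before hashing) is an assumption here, not taken from the log:

    # Sketch: collapse query rows into an order-independent SHA-1 fingerprint.
    import hashlib

    def rows_checksum(rows):
        # Sorting the stringified rows is an assumed canonicalization step.
        canonical = "\n".join(sorted(str(row) for row in rows))
        return hashlib.sha1(canonical.encode("utf-8")).hexdigest()

    print(rows_checksum([("apple",), ("banana",)]))  # hypothetical fruit rows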
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:10:01.136Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:10:01.364Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:10:01.422Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:10:01.759Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-09-21_18_03_12-7131493819848479270 is in state JOB_STATE_DONE
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:10:10.508Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:10:10.573Z: JOB_MESSAGE_DEBUG: Executing success step success11
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:10:10.764Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:10:10.828Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:10:10.868Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:10:56.155Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:10:56.201Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-22T01:10:56.242Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-09-21_18_03_34-5890765473309010570 is in state JOB_STATE_DONE
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_streaming_wordcount_debugging_it (apache_beam.examples.streaming_wordcount_debugging_it_test.StreamingWordcountDebuggingIT) ... SKIP: Skipped due to [BEAM-3377]: assert_that not working for streaming
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_read_via_sql (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_read_via_table (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bigquery_read_custom_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_avro_file_load (apache_beam.io.gcp.bigquery_test.BigQueryFileLoadsIntegrationTests) ... ok
test_spanner_error (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_spanner_update (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_write_batches (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_dicom_search_instances (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_dicom_store_instance_from_gcs (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_analyzing_syntax (apache_beam.ml.gcp.naturallanguageml_test_it.NaturalLanguageMlTestIT) ... ok
test_text_detection_with_language_hint (apache_beam.ml.gcp.visionml_test_it.VisionMlTestIT) ... ok
test_basic_execution (apache_beam.testing.test_stream_it_test.TestStreamIntegrationTests) ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream supports emitting to multiple PCollections. ... SKIP: The "TestDataflowRunner", does not support the TestStream transform.
Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream can independently control output watermarks. ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
test_label_detection_with_video_context (apache_beam.ml.gcp.videointelligenceml_test_it.VideoIntelligenceMlTestIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_deidentification (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_inspection (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_avro (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_it_test.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py38.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 66 tests in 4021.218s

OK (SKIP=7)

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/test-suites/direct/common.gradle'> line: 55

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py38:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 10m 48s

172 actionable tasks: 122 executed, 46 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/54bmqpwnu5qhg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
