See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/3216/display/redirect?page=changes>
Changes:

[noreply] [Beam-11002] Fixes BufferOverflowException in XMLReader (#13513)

------------------------------------------
[...truncated 52.45 MB...]
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:06:13.571Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:06:13.771Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:06:14.171Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:06:14.869Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:06:15.692Z: JOB_MESSAGE_DETAILED: Fusing consumer metrics into Create/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:06:15.770Z: JOB_MESSAGE_DETAILED: Fusing consumer map_to_common_key into metrics
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:06:15.968Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Reify into map_to_common_key
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:06:16.368Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Write into GroupByKey/Reify
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:06:16.559Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/GroupByWindow into GroupByKey/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:06:16.667Z: JOB_MESSAGE_DETAILED: Fusing consumer m_out into GroupByKey/GroupByWindow
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:06:16.893Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:06:17.004Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:06:17.100Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:06:17.274Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:06:18.602Z: JOB_MESSAGE_DEBUG: Executing wait step start13
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:06:19.909Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:06:20.357Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:06:20.468Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:06:20.955Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:06:21.366Z: JOB_MESSAGE_DEBUG: Value "GroupByKey/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:06:21.863Z: JOB_MESSAGE_BASIC: Executing operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:06:39.611Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter.
If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:06:47.790Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:07:09.645Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:07:09.690Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:07:50.865Z: JOB_MESSAGE_BASIC: Finished operation Type matches/Create/Read+Type matches/Group/pair_with_0+Type matches/Group/GroupByKey/Reify+Type matches/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:07:54.975Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+InspectForDetails/ParDo(_InspectFn)+ParDo(CallableWrapperDoFn)/ParDo(CallableWrapperDoFn)+Type matches/WindowInto(WindowIntoFn)+Type matches/ToVoidKey+Type matches/Group/pair_with_1+Type matches/Group/GroupByKey/Reify+Type matches/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:07:55.087Z: JOB_MESSAGE_BASIC: Executing operation Type matches/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:07:55.138Z: JOB_MESSAGE_BASIC: Finished operation Type matches/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:07:55.204Z: JOB_MESSAGE_BASIC: Executing operation Type matches/Group/GroupByKey/Read+Type matches/Group/GroupByKey/GroupByWindow+Type matches/Group/Map(_merge_tagged_vals_under_key)+Type matches/Unkey+Type matches/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:08:04.480Z: JOB_MESSAGE_BASIC: Finished operation Type matches/Group/GroupByKey/Read+Type matches/Group/GroupByKey/GroupByWindow+Type matches/Group/Map(_merge_tagged_vals_under_key)+Type matches/Unkey+Type matches/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:08:04.582Z: JOB_MESSAGE_DEBUG: Executing success step success19
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:08:04.684Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:08:04.728Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:08:04.759Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:08:56.234Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:08:56.280Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:08:56.319Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
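An aside on the descriptor-cap warning above: the linked APIs Explorer pages correspond to the Monitoring v3 list/delete methods, so the cleanup the message suggests can be scripted. A minimal sketch with the google-cloud-monitoring client follows; the project ID comes from this log, while the filter and the delete-everything-custom policy are illustrative assumptions, not what the Beam test harness actually does.

    # Hedged sketch: list and delete old custom metric descriptors, per the
    # advice in the log above. The filter and delete-all policy are
    # assumptions; deleting a descriptor also removes its time series.
    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project_name = "projects/apache-beam-testing"  # project seen in this log

    descriptors = client.list_metric_descriptors(
        request={
            "name": project_name,
            "filter": 'metric.type = starts_with("custom.googleapis.com/")',
        }
    )
    for descriptor in descriptors:
        client.delete_metric_descriptor(request={"name": descriptor.name})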
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-12-17_11_01_51-11797350799447380909 is in state JOB_STATE_DONE
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:09:49.577Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:09:52.704Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+ExternalTransform(simple)/Map(<lambda at external_it_test.py:43>)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:09:52.773Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:09:52.914Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:09:52.989Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:10:02.138Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:10:02.284Z: JOB_MESSAGE_DEBUG: Executing success step success19
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:10:02.374Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:10:02.497Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:10:02.560Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:10:44.145Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:10:44.191Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:10:44.219Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-12-17_11_04_08-782595067158746677 is in state JOB_STATE_DONE
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:11:08.478Z: JOB_MESSAGE_BASIC: Executing BigQuery import job "dataflow_job_5265141052057138439". You can check its status with the bq tool: "bq show -j --project_id=apache-beam-testing dataflow_job_5265141052057138439".
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:11:12.321Z: JOB_MESSAGE_BASIC: Finished operation read/ReadFromBigQuery/FilesToRemoveImpulse/Read+read/ReadFromBigQuery/MapFilesToRemove
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:11:12.386Z: JOB_MESSAGE_DEBUG: Value "read/ReadFromBigQuery/MapFilesToRemove.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:11:12.461Z: JOB_MESSAGE_BASIC: Executing operation read/ReadFromBigQuery/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/_UnpickledSideInput(MapFilesToRemove.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:11:12.514Z: JOB_MESSAGE_BASIC: Finished operation read/ReadFromBigQuery/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/_UnpickledSideInput(MapFilesToRemove.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:11:12.631Z: JOB_MESSAGE_DEBUG: Value "read/ReadFromBigQuery/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/_UnpickledSideInput(MapFilesToRemove.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:11:19.580Z: JOB_MESSAGE_BASIC: BigQuery import job "dataflow_job_5265141052057138439" done.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:11:20.300Z: JOB_MESSAGE_BASIC: Finished operation read/ReadFromBigQuery/Read+read/ReadFromBigQuery/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)+write/Write/NativeWrite
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:11:20.376Z: JOB_MESSAGE_DEBUG: Value "read/ReadFromBigQuery/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough).cleanup_signal" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:11:20.483Z: JOB_MESSAGE_BASIC: Executing operation read/ReadFromBigQuery/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/_UnpickledSideInput(ParDo(PassThrough).cleanup_signal.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:11:20.533Z: JOB_MESSAGE_BASIC: Finished operation read/ReadFromBigQuery/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/_UnpickledSideInput(ParDo(PassThrough).cleanup_signal.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:11:20.612Z: JOB_MESSAGE_DEBUG: Value "read/ReadFromBigQuery/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/_UnpickledSideInput(ParDo(PassThrough).cleanup_signal.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:11:20.670Z: JOB_MESSAGE_BASIC: Executing operation read/ReadFromBigQuery/_PassThroughThenCleanup/Create/Read+read/ReadFromBigQuery/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/ParDo(RemoveExtractedFiles)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:11:26.572Z: JOB_MESSAGE_BASIC: Finished operation read/ReadFromBigQuery/_PassThroughThenCleanup/Create/Read+read/ReadFromBigQuery/_PassThroughThenCleanup/ParDo(RemoveExtractedFiles)/ParDo(RemoveExtractedFiles)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:11:26.660Z: JOB_MESSAGE_DEBUG: Executing success step success3
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:11:26.888Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:11:26.951Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:11:26.982Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:12:12.716Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:12:12.769Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:12:12.805Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
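Besides the "bq show -j" command quoted a few lines up, the same status check on the BigQuery import job can be done from Python with the google-cloud-bigquery client. The job ID and project are taken from the log; the US location is an assumption (it matches the query URLs later in this log).

    # Hedged sketch: inspect the BigQuery job the runner reported above.
    from google.cloud import bigquery

    client = bigquery.Client(project="apache-beam-testing")
    job = client.get_job("dataflow_job_5265141052057138439", location="US")
    print(job.job_type, job.state, job.error_result)  # e.g. "load DONE None"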
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:12:13.309Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:12:13.410Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:12:13.455Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:12:13.531Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-12-17_11_04_34-18017326569855053391 is in state JOB_STATE_DONE
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query SELECT fruit from `python_query_to_table_16082318596819.output_table`; to BQ
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authentication via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/[email protected]/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform HTTP/1.1" 200 241
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs?prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/40d09325-7c0c-4b6b-b72f-cfd77055d433?maxResults=0&location=US&prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anon6bb6e20215a705579adeb155fa2c4ce9877eb2d1/data?prettyPrint=false HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Read from given query (SELECT fruit from `python_query_to_table_16082318596819.output_table`;), total rows 2
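The bigquery_matcher lines above, together with the "Generate checksum" line that follows, verify the pipeline output by querying the destination table and reducing the returned rows to a single hash that is compared against an expected value. A toy, self-contained illustration of that style of check, not the actual apache_beam.io.gcp.tests.bigquery_matcher code:

    # Toy query-then-checksum verification; the hash is order-independent.
    import hashlib

    def rows_checksum(rows):
        # Sort the stringified rows so row order does not change the hash,
        # then SHA-1 the joined result.
        joined = ",".join(sorted(str(r) for r in rows))
        return hashlib.sha1(joined.encode("utf-8")).hexdigest()

    # Same rows in a different order produce the same checksum.
    assert rows_checksum([("apple",), ("orange",)]) == \
        rows_checksum([("orange",), ("apple",)])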
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Generate checksum: 158a8ea1c254fcf40d4ed3e7c0242c3ea0a29e72
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:12:23.195Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:12:23.426Z: JOB_MESSAGE_DEBUG: Executing success step success11
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:12:23.531Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:12:23.583Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:12:23.659Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:13:06.074Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:13:07.077Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-17T19:13:07.182Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-12-17_11_06_00-401040921907192190 is in state JOB_STATE_DONE
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_streaming_wordcount_debugging_it (apache_beam.examples.streaming_wordcount_debugging_it_test.StreamingWordcountDebuggingIT) ... SKIP: Skipped due to [BEAM-3377]: assert_that not working for streaming
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_run_example_with_setup_file (apache_beam.examples.complete.juliaset.juliaset.juliaset_test_it.JuliaSetTestIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_read_via_sql (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_read_via_table (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_read_queries (apache_beam.io.gcp.bigquery_read_it_test.ReadAllBQTests) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_spanner_error (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_spanner_update (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ERROR
test_write_batches (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bigquery_read_custom_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_avro_file_load (apache_beam.io.gcp.bigquery_test.BigQueryFileLoadsIntegrationTests) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_dicom_search_instances (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_dicom_store_instance_from_gcs (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_analyzing_syntax (apache_beam.ml.gcp.naturallanguageml_test_it.NaturalLanguageMlTestIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_text_detection_with_language_hint (apache_beam.ml.gcp.visionml_test_it.VisionMlTestIT) ... ok
test_basic_execution (apache_beam.testing.test_stream_it_test.TestStreamIntegrationTests) ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream supports emitting to multiple PCollections. ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream can independently control output watermarks. ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_label_detection_with_video_context (apache_beam.ml.gcp.videointelligenceml_test_it.VideoIntelligenceMlTestIT) ... ok
test_deidentification (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_inspection (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_it_test.ExternalTransformIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_avro (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok

======================================================================
ERROR: test_spanner_update (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/experimental/spannerio_write_it_test.py", line 146, in test_spanner_update
    with database.batch() as batch:
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/database.py", line 611, in __enter__
    session = self._session = self._database._pool.get()
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/pool.py", line 273, in get
    session.create()
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/session.py", line 121, in create
    self._database.name, metadata=metadata, **kw
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/google/cloud/spanner_v1/gapic/spanner_client.py", line 307, in create_session
    request, retry=retry, timeout=timeout, metadata=metadata
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/google/api_core/gapic_v1/method.py", line 145, in __call__
    return wrapped_func(*args, **kwargs)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/google/api_core/retry.py", line 286, in retry_wrapped_func
    on_error=on_error,
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/google/api_core/retry.py", line 184, in retry_target
    return target()
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/google/api_core/timeout.py", line 214, in func_with_timeout
    return func(*args, **kwargs)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/google/api_core/grpc_helpers.py", line 59, in error_remapped_callable
    six.raise_from(exceptions.from_grpc_error(exc), exc)
  File "<string>", line 3, in raise_from
google.api_core.exceptions.ResourceExhausted: 429 The instance has too many database splits to complete the operation. Please add more nodes and try again.
-------------------- >> begin captured logging << --------------------
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/[email protected]/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fspanner.data
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/[email protected]/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fspanner.data HTTP/1.1" 200 241
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py37.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 68 tests in 4143.535s

FAILED (SKIP=7, errors=1)

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Script 'https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle' line: 118

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.7/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 12m 54s

211 actionable tasks: 152 executed, 55 from cache, 4 up-to-date

Gradle was unable to watch the file system for changes. The inotify watches limit is too low.

Publishing build scan...
https://gradle.com/s/gjpmbmnw7mim2
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
