See <https://ci-beam.apache.org/job/beam_PostCommit_Python36/4105/display/redirect>
Changes:

------------------------------------------
[...truncated 47.41 MB...]
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:32:15.001Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/Flatten
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:32:11.600Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:32:11.629Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:32:15.017Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:32:15.036Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:32:15.046Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/Flatten
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:32:15.072Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:32:15.104Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:32:15.133Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Read+write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+write/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema)/ParDo(UpdateDestinationSchema)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:32:15.153Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/Flatten.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:32:15.200Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/Read+write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:32:16.874Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ImpulseMonitorDestinationLoadJobs/Read+write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-07-13_06_23_49-10628822499277826150 is in state JOB_STATE_DONE
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:32:21.093Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Read+write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+write/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema)/ParDo(UpdateDestinationSchema)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:32:21.176Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:32:21.208Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema).out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:32:21.257Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/WaitForSchemaModJobs/_UnpickledSideInput(ParDo(UpdateDestinationSchema).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:32:21.298Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/WaitForSchemaModJobs/_UnpickledSideInput(ParDo(UpdateDestinationSchema).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:32:21.366Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForSchemaModJobs/_UnpickledSideInput(ParDo(UpdateDestinationSchema).out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:32:21.441Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ImpulseMonitorSchemaModJobs/Read+write/BigQueryBatchFileLoads/WaitForSchemaModJobs/WaitForSchemaModJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:32:25.876Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ImpulseMonitorSchemaModJobs/Read+write/BigQueryBatchFileLoads/WaitForSchemaModJobs/WaitForSchemaModJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:32:25.922Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForSchemaModJobs.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:32:25.999Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(WaitForSchemaModJobs.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:32:26.034Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(WaitForSchemaModJobs.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:32:26.100Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(WaitForSchemaModJobs.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:32:26.200Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:32:29.699Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:32:29.767Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs).out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:32:29.833Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:32:29.876Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:32:29.942Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:32:30.010Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Read+write/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:32:33.528Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/ImpulseMonitorCopyJobs/Read+write/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:32:33.595Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/WaitForCopyJobs.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:32:33.646Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(WaitForCopyJobs.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:32:33.683Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(WaitForCopyJobs.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:32:33.743Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(WaitForCopyJobs.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:32:33.801Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:32:33.955Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:32:34.020Z: JOB_MESSAGE_DEBUG: Value "write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:32:34.100Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/Impulse/Read+write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+write/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:32:37.637Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/RemoveTempTables/Impulse/Read+write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+write/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:32:37.698Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:32:37.742Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:32:37.825Z: JOB_MESSAGE_BASIC: Executing operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow+write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames/Keys+write/BigQueryBatchFileLoads/RemoveTempTables/Delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:32:38.823Z: JOB_MESSAGE_BASIC: Finished operation write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow+write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames/Keys+write/BigQueryBatchFileLoads/RemoveTempTables/Delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:32:38.892Z: JOB_MESSAGE_DEBUG: Executing success step success48
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:32:38.981Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:32:39.025Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:32:39.055Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:33:21.948Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:33:21.987Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-07-13_06_24_28-2889318762639440042 is in state JOB_STATE_DONE
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query SELECT bytes, date, time FROM python_write_to_table_16261826543584.python_no_schema_table to BQ
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authentication via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/[email protected]/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform HTTP/1.1" 200 244
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs?prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/455d0220-c8d0-438c-a1da-4c17c248bef5?maxResults=0&timeoutMs=10000&location=US&prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/455d0220-c8d0-438c-a1da-4c17c248bef5?fields=jobReference%2CtotalRows%2CpageToken%2Crows&location=US&formatOptions.useInt64Timestamp=True&prettyPrint=false HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Result of query is: [(b'abc', datetime.date(2000, 1, 1), datetime.time(0, 0)), (b'\xab\xac\xad', datetime.date(2000, 1, 1), datetime.time(0, 0)), (b'\xe4\xbd\xa0\xe5\xa5\xbd', datetime.date(3000, 12, 31), datetime.time(23, 59, 59)), (b'xyw', datetime.date(2011, 1, 1), datetime.time(23, 59, 59, 999999))]
INFO:apache_beam.io.gcp.bigquery_write_it_test:Deleting dataset python_write_to_table_16261826543584 in project apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:36:36.404Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:36:36.463Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:36:36.529Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:36:36.613Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:36:45.952Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:36:46.026Z: JOB_MESSAGE_DEBUG: Executing success step success11
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:36:46.100Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:36:46.155Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:36:46.186Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:37:34.335Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:37:34.367Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-07-13_06_29_17-11703072940004958633 is in state JOB_STATE_DONE
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:38:15.854Z: JOB_MESSAGE_BASIC: Finished operation Create data/Read+Predict UserEvent/ParDo(_PredictUserEventFn)+ParDo(CallableWrapperDoFn)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/_CoGBKImpl/Tag[1]+assert_that/Group/_CoGBKImpl/GroupByKey/Reify+assert_that/Group/_CoGBKImpl/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:38:19.018Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Create/Read+assert_that/Group/_CoGBKImpl/Tag[0]+assert_that/Group/_CoGBKImpl/GroupByKey/Reify+assert_that/Group/_CoGBKImpl/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:38:19.086Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/_CoGBKImpl/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:38:19.144Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/_CoGBKImpl/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:38:19.206Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/_CoGBKImpl/GroupByKey/Read+assert_that/Group/_CoGBKImpl/GroupByKey/GroupByWindow+assert_that/Group/_CoGBKImpl/MapTuple(collect_values)+assert_that/Group/RestoreTags+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:38:28.668Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/_CoGBKImpl/GroupByKey/Read+assert_that/Group/_CoGBKImpl/GroupByKey/GroupByWindow+assert_that/Group/_CoGBKImpl/MapTuple(collect_values)+assert_that/Group/RestoreTags+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:38:28.737Z: JOB_MESSAGE_DEBUG: Executing success step success19
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:38:28.824Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:38:28.929Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:38:28.966Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:39:20.964Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-07-13T13:39:21.007Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2021-07-13_06_30_43-1572990108087919214 is in state JOB_STATE_DONE
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_streaming_wordcount_debugging_it (apache_beam.examples.streaming_wordcount_debugging_it_test.StreamingWordcountDebuggingIT) ... SKIP: Skipped due to [BEAM-3377]: assert_that not working for streaming
test_run_example_with_setup_file (apache_beam.examples.complete.juliaset.juliaset.juliaset_test_it.JuliaSetTestIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_flight_delays (apache_beam.examples.dataframe.flight_delays_it_test.FlightDelaysTest) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_read_via_sql (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_read_via_table (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_read_queries (apache_beam.io.gcp.bigquery_read_it_test.ReadAllBQTests) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bigquery_read_custom_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_spanner_error (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_spanner_update (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_write_batches (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_avro_file_load (apache_beam.io.gcp.bigquery_test.BigQueryFileLoadsIntegrationTests) ... ok
test_aggregation (apache_beam.examples.dataframe.taxiride_it_test.TaxirideIT) ... ok
test_enrich (apache_beam.examples.dataframe.taxiride_it_test.TaxirideIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... SKIP: BEAM-12352: enable once maxBytesRewrittenPerCall works again
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... SKIP: BEAM-12352: enable once maxBytesRewrittenPerCall works again
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_dicom_search_instances (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_dicom_store_instance_from_gcs (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_avro (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_deidentification (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_inspection (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_text_detection_with_language_hint (apache_beam.ml.gcp.visionml_test_it.VisionMlTestIT) ... ok
test_basic_execution (apache_beam.testing.test_stream_it_test.TestStreamIntegrationTests) ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream supports emitting to multiple PCollections. ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream can independently control output watermarks. ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
test_analyzing_syntax (apache_beam.ml.gcp.naturallanguageml_test_it.NaturalLanguageMlTestIT) ... ok
test_label_detection_with_video_context (apache_beam.ml.gcp.videointelligenceml_test_it.VideoIntelligenceMlTestIT) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_it_test.ExternalTransformIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
Test that schema update options are respected when appending to an existing ... ok
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_create_catalog_item (apache_beam.ml.gcp.recommendations_ai_test_it.RecommendationAIIT) ... ok
test_create_user_event (apache_beam.ml.gcp.recommendations_ai_test_it.RecommendationAIIT) ... ok
test_predict (apache_beam.ml.gcp.recommendations_ai_test_it.RecommendationAIIT) ... ok

======================================================================
ERROR: Failure: ModuleNotFoundError (No module named 'selenium')
----------------------------------------------------------------------
Traceback (most recent call last):
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/nose/failure.py", line 39, in runTest
    raise self.exc_val.with_traceback(self.tb)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/nose/loader.py", line 418, in loadTestsFromName
    addr.filename, addr.module)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/nose/importer.py", line 47, in importFromPath
    return self.importFromDir(dir_path, fqname)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/nose/importer.py", line 94, in importFromDir
    mod = load_module(part_fqname, fh, filename, desc)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/-1734967053/lib/python3.6/imp.py", line 235, in load_module
    return load_source(name, filename, file)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/-1734967053/lib/python3.6/imp.py", line 172, in load_source
    module = _load(spec)
  File "<frozen importlib._bootstrap>", line 684, in _load
  File "<frozen importlib._bootstrap>", line 665, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 678, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/interactive/testing/integration/tests/screen_diff_tests.py", line 26, in <module>
    from selenium.webdriver.common.by import By
ModuleNotFoundError: No module named 'selenium'
-------------------- >> begin captured logging << --------------------
avro.schema: Level 5: Register new name for 'org.apache.avro.file.Header'
avro.schema: Level 5: Register new name for 'org.apache.avro.file.magic'
avro.schema: Level 5: Register new name for 'org.apache.avro.file.sync'
azure.storage.blob._shared.avro.schema: Level 5: Register new name for 'org.apache.avro.file.Header'
azure.storage.blob._shared.avro.schema: Level 5: Register new name for 'org.apache.avro.file.magic'
azure.storage.blob._shared.avro.schema: Level 5: Register new name for 'org.apache.avro.file.sync'
apache_beam.typehints.native_type_compatibility: INFO: Using Any for unsupported type: typing.Sequence[~T]
root: WARNING: python-snappy is not installed; some tests will be skipped.
root: WARNING: Tensorflow is not installed, so skipping some tests.
apache_beam.runners.interactive.interactive_environment: WARNING: Dependencies required for Interactive Beam PCollection visualization are not available, please use: `pip install apache-beam[interactive]` to install necessary dependencies to enable all data visualization features.
apache_beam.runners.interactive.interactive_environment: WARNING: You cannot use Interactive Beam features when you are not in an interactive environment such as a Jupyter notebook or ipython terminal.
root: WARNING: Make sure that locally built Python SDK docker image has Python 3.6 interpreter.
root: INFO: Default Python SDK image for environment is apache/beam_python3.6_sdk:2.32.0.dev
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py36.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 75 tests in 5704.964s

FAILED (SKIP=8, errors=1)

> Task :sdks:python:test-suites:dataflow:py36:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Script 'https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/test-suites/dataflow/common.gradle' line: 126

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 37m 50s

216 actionable tasks: 155 executed, 57 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/nwe4zsmqczupk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
