See <https://ci-beam.apache.org/job/beam_PostCommit_Python38/534/display/redirect?page=changes>
Changes:

[piotr.szuberski] [BEAM-8876] Run hadoop tests with different versions and enable

[pawel.pasterz] Change hbase-shaded-client as provided dependency

[pawel.pasterz] Update change.md file

[piotr.szuberski] [BEAM-8615 BEAM-8569 BEAM-7937] Add hadoop 3 compatibility tests

[noreply] Add trigger commands for Direct, Spark XVR Postcommits. (#13337)

[noreply] [BEAM-9980] use constants of python versions in

------------------------------------------
[...truncated 24.60 MB...]
 id: '2020-11-16_11_08_11-17369218104329420760'
 location: 'us-central1'
 name: 'beamapp-jenkins-1116190802-053847'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2020-11-16T19:08:13.457345Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2020-11-16_11_08_11-17369218104329420760]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2020-11-16_11_08_11-17369218104329420760
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-11-16_11_08_11-17369218104329420760?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-11-16_11_08_11-17369218104329420760 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:08:11.404Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-11-16_11_08_11-17369218104329420760.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:08:11.404Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-11-16_11_08_11-17369218104329420760. The number of workers will be between 1 and 1000.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:08:18.527Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-f.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:08:22.167Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:08:22.204Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:08:22.232Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:08:22.272Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:08:22.340Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:08:22.382Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:08:22.416Z: JOB_MESSAGE_DETAILED: Fusing consumer metrics into Create/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:08:22.446Z: JOB_MESSAGE_DETAILED: Fusing consumer map_to_common_key into metrics
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:08:22.480Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Reify into map_to_common_key
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:08:22.516Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Write into GroupByKey/Reify
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:08:22.554Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/GroupByWindow into GroupByKey/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:08:22.587Z: JOB_MESSAGE_DETAILED: Fusing consumer m_out into GroupByKey/GroupByWindow
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:08:22.612Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:08:22.645Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:08:22.680Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:08:22.717Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:08:22.897Z: JOB_MESSAGE_DEBUG: Executing wait step start13
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:08:22.957Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:08:22.993Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:08:23.026Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:08:23.079Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:08:23.152Z: JOB_MESSAGE_DEBUG: Value "GroupByKey/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:08:23.221Z: JOB_MESSAGE_BASIC: Executing operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:08:35.374Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:08:35.410Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:08:52.143Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:08:52.172Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:08:51.218Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors.
See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:09:37.280Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:11:07.820Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:11:07.853Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:12:39.576Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:12:46.221Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+ExternalTransform(simple)/Map(<lambda at external_it_test.py:43>)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:12:46.292Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:12:46.342Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:12:46.454Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:12:55.613Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:12:55.679Z: JOB_MESSAGE_DEBUG: Executing success step success19
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:12:55.767Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:12:55.807Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:12:55.857Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:13:02.570Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+InspectForDetails/ParDo(_InspectFn)+ParDo(CallableWrapperDoFn)/ParDo(CallableWrapperDoFn)+Type matches/WindowInto(WindowIntoFn)+Type matches/ToVoidKey+Type matches/Group/pair_with_1+Type matches/Group/GroupByKey/Reify+Type matches/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:13:05.708Z: JOB_MESSAGE_BASIC: Finished operation Type matches/Create/Read+Type matches/Group/pair_with_0+Type matches/Group/GroupByKey/Reify+Type matches/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:13:05.806Z: JOB_MESSAGE_BASIC: Executing operation Type matches/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:13:05.873Z: JOB_MESSAGE_BASIC: Finished operation Type matches/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:13:05.950Z: JOB_MESSAGE_BASIC: Executing operation Type matches/Group/GroupByKey/Read+Type matches/Group/GroupByKey/GroupByWindow+Type matches/Group/Map(_merge_tagged_vals_under_key)+Type matches/Unkey+Type matches/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:13:15.131Z: JOB_MESSAGE_BASIC: Finished operation Type matches/Group/GroupByKey/Read+Type matches/Group/GroupByKey/GroupByWindow+Type matches/Group/Map(_merge_tagged_vals_under_key)+Type matches/Unkey+Type matches/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:13:15.205Z: JOB_MESSAGE_DEBUG: Executing success step success19
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:13:15.295Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:13:15.354Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:13:15.391Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:13:45.793Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:13:45.833Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:13:45.885Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-11-16_11_06_21-15063805673081852776 is in state JOB_STATE_DONE
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:14:11.119Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:14:11.167Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:14:11.215Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-11-16_11_06_38-14667721595831699082 is in state JOB_STATE_DONE
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:15:31.469Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:15:32.104Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:15:32.156Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:15:32.216Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:15:40.760Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:15:40.831Z: JOB_MESSAGE_DEBUG: Executing success step success11
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:15:40.892Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:15:40.946Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:15:40.962Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:16:30.954Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:16:30.989Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-11-16T19:16:31.018Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-11-16_11_08_11-17369218104329420760 is in state JOB_STATE_DONE
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_streaming_wordcount_debugging_it (apache_beam.examples.streaming_wordcount_debugging_it_test.StreamingWordcountDebuggingIT) ... SKIP: Skipped due to [BEAM-3377]: assert_that not working for streaming
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_run_example_with_setup_file (apache_beam.examples.complete.juliaset.juliaset.juliaset_test_it.JuliaSetTestIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_read_via_sql (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_read_via_table (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bigquery_read_custom_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_avro_file_load (apache_beam.io.gcp.bigquery_test.BigQueryFileLoadsIntegrationTests) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_spanner_error (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_spanner_update (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_write_batches (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_dicom_search_instances (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_dicom_store_instance_from_gcs (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_avro (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ERROR
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_analyzing_syntax (apache_beam.ml.gcp.naturallanguageml_test_it.NaturalLanguageMlTestIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_basic_execution (apache_beam.testing.test_stream_it_test.TestStreamIntegrationTests) ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream supports emitting to multiple PCollections. ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream can independently control output watermarks. ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
test_text_detection_with_language_hint (apache_beam.ml.gcp.visionml_test_it.VisionMlTestIT) ... ok
test_label_detection_with_video_context (apache_beam.ml.gcp.videointelligenceml_test_it.VideoIntelligenceMlTestIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_it_test.ExternalTransformIT) ... ok
test_deidentification (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_inspection (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
======================================================================
ERROR: test_big_query_new_types_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_it_test.py", line 314, in test_big_query_new_types_native
    big_query_query_to_table_pipeline.run_bq_pipeline(options)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py", line 105, in run_bq_pipeline
    result = p.run()
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/testing/test_pipeline.py", line 110, in run
    result = super(TestPipeline, self).run(
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/pipeline.py", line 514, in run
    return Pipeline.from_runner_api(
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/pipeline.py", line 547, in run
    return self.runner.run_pipeline(self, self._options)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py", line 56, in run_pipeline
    self.result = super(TestDataflowRunner,
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py", line 526, in run_pipeline
    artifacts=environments.python_sdk_dependencies(options)))
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/transforms/environments.py", line 738, in python_sdk_dependencies
    staged_name in stager.Stager.create_job_resources(
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/portability/stager.py", line 176, in create_job_resources
    (
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/utils/retry.py", line 260, in wrapper
    return fun(*args, **kwargs)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/runners/portability/stager.py", line 569, in _populate_requirements_cache
    processes.check_output(cmd_args, stderr=processes.STDOUT)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/utils/processes.py", line 96, in check_output
    raise RuntimeError(
RuntimeError: Full traceback: Traceback (most recent call last):
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/apache_beam/utils/processes.py", line 91, in check_output
    out = subprocess.check_output(*args, **kwargs)
  File "/usr/lib/python3.8/subprocess.py", line 411, in check_output
    return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
  File "/usr/lib/python3.8/subprocess.py", line 512, in run
    raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/-1734967051/bin/python', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']' returned non-zero exit status 1.
Pip install failed for package: -r
Output from execution of subprocess: b'Collecting pyhamcrest!=1.10.0,<2.0.0\n File was already downloaded /tmp/dataflow-requirements-cache/PyHamcrest-1.10.1.tar.gz\nCollecting mock<3.0.0\n File was already downloaded /tmp/dataflow-requirements-cache/mock-2.0.0.tar.gz\n ERROR: Command errored out with exit status 1:\n command: https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/-1734967051/bin/python -c \'import sys, setuptools, tokenize; sys.argv[0] = \'"\'"\'/tmp/pip-download-057054c_/mock/setup.py\'"\'"\'; __file__=\'"\'"\'/tmp/pip-download-057054c_/mock/setup.py\'"\'"\';f=getattr(tokenize, \'"\'"\'open\'"\'"\', open)(__file__);code=f.read().replace(\'"\'"\'\\r\\n\'"\'"\', \'"\'"\'\\n\'"\'"\');f.close();exec(compile(code, __file__, \'"\'"\'exec\'"\'"\'))\' egg_info --egg-base /tmp/pip-pip-egg-info-hfukqq_t\n cwd: /tmp/pip-download-057054c_/mock/\n Complete output (5 lines):\n Traceback (most recent call last):\n File "<string>", line 1, in <module>\n File "https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/-1734967051/lib/python3.8/tokenize.py", line 392, in open\n buffer = _builtin_open(filename, \'rb\')\n FileNotFoundError: [Errno 2] No such file or directory: \'/tmp/pip-download-057054c_/mock/setup.py\'\n ----------------------------------------\nERROR: Command errored out with exit status 1: python setup.py egg_info Check the logs for full command output.\n'
-------------------- >> begin captured logging << --------------------
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Any
root: DEBUG: Unhandled type_constraint: Any
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Any
root: DEBUG: Unhandled type_constraint: Any
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Any
root: DEBUG: Unhandled type_constraint: Any
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
apache_beam.runners.portability.stager: INFO: Executing command: ['https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/build/gradleenv/-1734967051/bin/python', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']
--------------------- >> end captured logging << ---------------------
----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py38.xml
----------------------------------------------------------------------
XML: https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/nosetests.xml
----------------------------------------------------------------------
Ran 67 tests in 4211.373s

FAILED (SKIP=7, errors=1)
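Note that the error is not in the BigQuery test itself but in the staging step, where apache_beam.runners.portability.stager populates a local requirements cache by shelling out to pip. The sketch below re-issues that pip command outside Jenkins to reproduce the failure; the flags, requirements file, and cache directory are copied from the captured logging above, while the use of sys.executable is an assumption standing in for the Jenkins virtualenv interpreter:

    import subprocess
    import sys

    # Re-run the pip command from the stager's captured logging above.
    # sys.executable stands in for the Jenkins virtualenv python (assumption).
    cmd = [
        sys.executable, '-m', 'pip', 'download',
        '--dest', '/tmp/dataflow-requirements-cache',  # cache dir from the log
        '-r', 'postcommit_requirements.txt',           # requirements file from the log
        '--exists-action', 'i',                        # ignore files already downloaded
        '--no-binary', ':all:',                        # force source distributions
    ]
    try:
        subprocess.check_output(cmd, stderr=subprocess.STDOUT)
    except subprocess.CalledProcessError as exc:
        # The failure mode seen above: pip exits non-zero after a
        # FileNotFoundError on the cached mock sdist's setup.py.
        print(exc.output.decode())
        raise

If the FileNotFoundError on mock's setup.py reproduces, clearing /tmp/dataflow-requirements-cache before retrying is one plausible first step, since pip skipped the already-downloaded mock-2.0.0.tar.gz ('--exists-action', 'i') and then failed while inspecting it.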
> Task :sdks:python:test-suites:dataflow:py38:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Script 'https://ci-beam.apache.org/job/beam_PostCommit_Python38/ws/src/sdks/python/test-suites/dataflow/common.gradle' line: 118

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py38:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.7/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 16m 22s

205 actionable tasks: 171 executed, 30 from cache, 4 up-to-date

Gradle was unable to watch the file system for changes. The inotify watches limit is too low.
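The inotify warning is unrelated to the test failure but is actionable on the build host. A quick way to check the current limit on a Linux machine, as a sketch (the 524288 value in the comment is a commonly suggested setting, not a Gradle requirement):

    # Read the current inotify watch limit; Gradle's file-system watching
    # needs enough watches to cover the workspace.
    with open('/proc/sys/fs/inotify/max_user_watches') as f:
        print('fs.inotify.max_user_watches =', f.read().strip())
    # Raising it requires root, e.g.:
    #   sudo sysctl -w fs.inotify.max_user_watches=524288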
Publishing build scan...
https://gradle.com/s/mfovdw353eemy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]