See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/3248/display/redirect>
Changes:
------------------------------------------
[...truncated 52.29 MB...]
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-25T19:07:38.918Z: JOB_MESSAGE_BASIC: Finished operation Type matches/Group/GroupByKey/Read+Type matches/Group/GroupByKey/GroupByWindow+Type matches/Group/Map(_merge_tagged_vals_under_key)+Type matches/Unkey+Type matches/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-25T19:07:38.986Z: JOB_MESSAGE_DEBUG: Executing success step success19
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-25T19:07:39.063Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-25T19:07:39.106Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-25T19:07:39.171Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-25T19:07:51.413Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-25T19:07:51.942Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-25T19:08:20.105Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-25T19:08:20.476Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-25T19:08:20.527Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-25T19:08:22.559Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-25T19:08:22.585Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-25T19:08:20.145Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-25T19:08:20.183Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-25T19:08:20.568Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-12-25_11_01_47-14786617851269019436 is in state JOB_STATE_DONE
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-12-25_11_01_07-15304431052855149656 is in state JOB_STATE_DONE
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query SELECT fruit from `python_query_to_table_16089228543194.output_table`; to BQ
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authentication via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/[email protected]/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform HTTP/1.1" 200 241
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs?prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/a505ff4b-8764-4c47-9f40-c02efed50055?maxResults=0&location=US&prettyPrint=false HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anon7cb85f6c6dfb2050b951af8608e76c57f0a115b3/data?prettyPrint=false HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Read from given query (SELECT fruit from `python_query_to_table_16089228543194.output_table`;), total rows 2
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Generate checksum: 158a8ea1c254fcf40d4ed3e7c0242c3ea0a29e72
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-25T19:12:47.138Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-25T19:12:47.208Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-25T19:12:47.254Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-25T19:12:47.328Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-25T19:12:56.390Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-25T19:12:56.436Z: JOB_MESSAGE_DEBUG: Executing success step success11
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-25T19:12:56.511Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-25T19:12:56.542Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-25T19:12:56.568Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-25T19:13:45.463Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-25T19:13:45.500Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-25T19:13:45.526Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-12-25_11_07_21-3182975521816413800 is in state JOB_STATE_DONE
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_streaming_wordcount_debugging_it (apache_beam.examples.streaming_wordcount_debugging_it_test.StreamingWordcountDebuggingIT) ... SKIP: Skipped due to [BEAM-3377]: assert_that not working for streaming
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_run_example_with_setup_file (apache_beam.examples.complete.juliaset.juliaset.juliaset_test_it.JuliaSetTestIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_read_via_sql (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_read_via_table (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_read_queries (apache_beam.io.gcp.bigquery_read_it_test.ReadAllBQTests) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bigquery_read_custom_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_avro_file_load (apache_beam.io.gcp.bigquery_test.BigQueryFileLoadsIntegrationTests) ... ok
test_spanner_error (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_spanner_update (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_write_batches (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_dicom_search_instances (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_dicom_store_instance_from_gcs (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_analyzing_syntax (apache_beam.ml.gcp.naturallanguageml_test_it.NaturalLanguageMlTestIT) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_it_test.ExternalTransformIT) ... ERROR
test_basic_execution (apache_beam.testing.test_stream_it_test.TestStreamIntegrationTests) ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream supports emitting to multiple PCollections. ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream can independently control output watermarks. ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
test_label_detection_with_video_context (apache_beam.ml.gcp.videointelligenceml_test_it.VideoIntelligenceMlTestIT) ... ok
test_text_detection_with_language_hint (apache_beam.ml.gcp.visionml_test_it.VisionMlTestIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_deidentification (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_inspection (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_avro (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok

======================================================================
ERROR: test_job_python_from_python_it (apache_beam.transforms.external_it_test.ExternalTransformIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/transforms/external_it_test.py", line 64, in test_job_python_from_python_it
    pipeline_from_proto.run().wait_until_finish()
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/pipeline.py", line 532, in run
    self._options).run(False)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/pipeline.py", line 561, in run
    return self.runner.run_pipeline(self, self._options)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py", line 57, in run_pipeline
    self).run_pipeline(pipeline, options)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py", line 621, in run_pipeline
    self.dataflow_client.create_job(self.job), self)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/utils/retry.py", line 260, in wrapper
    return fun(*args, **kwargs)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py", line 655, in create_job
    self.create_job_description(job)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py", line 711, in create_job_description
    resources = self._stage_resources(job.proto_pipeline, job.options)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py", line 608, in _stage_resources
    resources, staging_location=google_cloud_options.staging_location)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/portability/stager.py", line 316, in stage_job_resources
    file_path, FileSystems.join(staging_location, staged_path))
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py", line 976, in stage_artifact
    local_path_to_artifact, artifact_name)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/utils/retry.py", line 260, in wrapper
    return fun(*args, **kwargs)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py", line 572, in _gcs_file_copy
    self.stage_file(to_folder, to_name, f, total_size=total_size)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py", line 633, in stage_file
    response = self._storage_client.objects.Insert(request, upload=upload)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/internal/clients/storage/storage_v1_client.py", line 1156, in Insert
    upload=upload, upload_config=upload_config)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/apitools/base/py/base_api.py", line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/apitools/base/py/base_api.py", line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/apitools/base/py/base_api.py", line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
apitools.base.py.exceptions.HttpError: HttpError accessing <https://www.googleapis.com/resumable/upload/storage/v1/b/temp-storage-for-end-to-end-tests/o?alt=json&name=staging-it%2Fbeamapp-jenkins-1225190306-991489.1608922986.991665%2Fdataflow-worker.jar&uploadType=resumable&upload_id=ABg5-UyEhQU4Nsc3W6Dmy5hKwixfVcuQzOf_WfC_GayIs83QBBOVsrZeKsDwNofOR7fYaxreVQInZxd1IP2S7-wZ3LtOLsbZoQ>: response: <{'content-type': 'text/plain; charset=utf-8', 'x-guploader-uploadid': 'ABg5-UyEhQU4Nsc3W6Dmy5hKwixfVcuQzOf_WfC_GayIs83QBBOVsrZeKsDwNofOR7fYaxreVQInZxd1IP2S7-wZ3LtOLsbZoQ', 'content-length': '0', 'date': 'Fri, 25 Dec 2020 19:03:32 GMT', 'server': 'UploadServer', 'status': '503'}>, content <>
-------------------- >> begin captured logging << --------------------
root: INFO: Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
root: WARNING: Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
root: INFO: Default Python SDK image for environment is apache/beam_python3.7_sdk:2.28.0.dev
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
apache_beam.runners.portability.stager: INFO: Executing command: ['https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/bin/python', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']
apache_beam.runners.portability.stager: INFO: Copying Beam SDK "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz" to staging location.
root: WARNING: Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
root: INFO: Default Python SDK image for environment is apache/beam_python3.7_sdk:2.28.0.dev
root: INFO: Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37:beam-master-20201214
root: INFO: Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37:beam-master-20201214" for Docker environment
apache_beam.runners.portability.fn_api_runner.translations: INFO: ==================== <function eliminate_common_key_with_none at 0x7ff0c6028400> ====================
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: 21 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: Stages: ['ref_AppliedPTransform_Create/Impulse_3\n Create/Impulse:beam:transform:impulse:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2957>)_4\n Create/FlatMap(<lambda at core.py:2957>):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/AddRandomKeys_7\n Create/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9\n Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey_10\n Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_11\n Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys_12\n Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/Map(decode)_13\n Create/Map(decode):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_ExternalTransform(simple)/Map(<lambda at external_it_test.py:43>)_15\n ExternalTransform(simple)/Map(<lambda at external_it_test.py:43>):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Impulse_18\n assert_that/Create/Impulse:beam:transform:impulse:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2957>)_19\n assert_that/Create/FlatMap(<lambda at core.py:2957>):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Map(decode)_21\n assert_that/Create/Map(decode):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_22\n assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/ToVoidKey_23\n assert_that/ToVoidKey:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_0_25\n assert_that/Group/pair_with_0:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_1_26\n assert_that/Group/pair_with_1:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Flatten_27\n assert_that/Group/Flatten:beam:transform:flatten:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/GroupByKey_28\n assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_29\n assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Unkey_30\n assert_that/Unkey:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Match_31\n assert_that/Match:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>']
apache_beam.runners.portability.fn_api_runner.translations: INFO: ==================== <function pack_combiners at 0x7ff0c6028488> ====================
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: 21 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: Stages: ['ref_AppliedPTransform_Create/Impulse_3\n Create/Impulse:beam:transform:impulse:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2957>)_4\n Create/FlatMap(<lambda at core.py:2957>):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/AddRandomKeys_7\n Create/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9\n Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey_10\n Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_11\n Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys_12\n Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/Map(decode)_13\n Create/Map(decode):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_ExternalTransform(simple)/Map(<lambda at external_it_test.py:43>)_15\n ExternalTransform(simple)/Map(<lambda at external_it_test.py:43>):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Impulse_18\n assert_that/Create/Impulse:beam:transform:impulse:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2957>)_19\n assert_that/Create/FlatMap(<lambda at core.py:2957>):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Map(decode)_21\n assert_that/Create/Map(decode):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_22\n assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/ToVoidKey_23\n assert_that/ToVoidKey:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_0_25\n assert_that/Group/pair_with_0:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_1_26\n assert_that/Group/pair_with_1:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Flatten_27\n assert_that/Group/Flatten:beam:transform:flatten:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/GroupByKey_28\n assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_29\n assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Unkey_30\n assert_that/Unkey:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Match_31\n assert_that/Match:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>']
apache_beam.runners.portability.fn_api_runner.translations: INFO: ==================== <function sort_stages at 0x7ff0c6028bf8> ====================
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: 21 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: Stages: ['ref_AppliedPTransform_Create/Impulse_3\n Create/Impulse:beam:transform:impulse:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2957>)_4\n Create/FlatMap(<lambda at core.py:2957>):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/AddRandomKeys_7\n Create/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9\n Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey_10\n Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_11\n Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys_12\n Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/Map(decode)_13\n Create/Map(decode):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_ExternalTransform(simple)/Map(<lambda at external_it_test.py:43>)_15\n ExternalTransform(simple)/Map(<lambda at external_it_test.py:43>):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Impulse_18\n assert_that/Create/Impulse:beam:transform:impulse:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2957>)_19\n assert_that/Create/FlatMap(<lambda at core.py:2957>):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Map(decode)_21\n assert_that/Create/Map(decode):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_22\n assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/ToVoidKey_23\n assert_that/ToVoidKey:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_0_25\n assert_that/Group/pair_with_0:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_1_26\n assert_that/Group/pair_with_1:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Flatten_27\n assert_that/Group/Flatten:beam:transform:flatten:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/GroupByKey_28\n assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_29\n assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Unkey_30\n assert_that/Unkey:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Match_31\n assert_that/Match:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>']
apache_beam.runners.dataflow.dataflow_runner: WARNING: Typical end users should not use this worker jar feature. It can only be used when FnAPI is enabled.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1225190306-991489.1608922986.991665/pipeline.pb...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1225190306-991489.1608922986.991665/pipeline.pb in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1225190306-991489.1608922986.991665/requirements.txt...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1225190306-991489.1608922986.991665/requirements.txt in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1225190306-991489.1608922986.991665/parameterized-0.7.4.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1225190306-991489.1608922986.991665/parameterized-0.7.4.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1225190306-991489.1608922986.991665/pbr-5.5.1.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1225190306-991489.1608922986.991665/pbr-5.5.1.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1225190306-991489.1608922986.991665/mock-2.0.0.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1225190306-991489.1608922986.991665/mock-2.0.0.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1225190306-991489.1608922986.991665/six-1.15.0.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1225190306-991489.1608922986.991665/six-1.15.0.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1225190306-991489.1608922986.991665/funcsigs-1.0.2.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1225190306-991489.1608922986.991665/funcsigs-1.0.2.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1225190306-991489.1608922986.991665/pbr-5.4.5.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1225190306-991489.1608922986.991665/pbr-5.4.5.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1225190306-991489.1608922986.991665/pbr-5.5.0.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1225190306-991489.1608922986.991665/pbr-5.5.0.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1225190306-991489.1608922986.991665/PyHamcrest-1.10.1.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1225190306-991489.1608922986.991665/PyHamcrest-1.10.1.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1225190306-991489.1608922986.991665/dataflow_python_sdk.tar...
root: DEBUG: Response returned status 500, retrying
root: DEBUG: Retrying request to url https://www.googleapis.com/upload/storage/v1/b/temp-storage-for-end-to-end-tests/o?alt=json&name=staging-it%2Fbeamapp-jenkins-1225190306-991489.1608922986.991665%2Fdataflow_python_sdk.tar&uploadType=multipart after exception HttpError accessing <https://www.googleapis.com/upload/storage/v1/b/temp-storage-for-end-to-end-tests/o?alt=json&name=staging-it%2Fbeamapp-jenkins-1225190306-991489.1608922986.991665%2Fdataflow_python_sdk.tar&uploadType=multipart>: response: <{'x-guploader-uploadid': 'ABg5-UyL1lX4YLfVmUyQnRU9eUhfojHBUMmDB_V9NthHJRoIb6unluk9LtXpI1yMhvr9sYK-Wd-kFzybk_7mwrRBLDQ', 'date': 'Fri, 25 Dec 2020 19:03:15 GMT', 'vary': 'Origin, X-Origin', 'cache-control': 'no-cache, no-store, max-age=0, must-revalidate', 'expires': 'Mon, 01 Jan 1990 00:00:00 GMT', 'pragma': 'no-cache', 'content-length': '0', 'server': 'UploadServer', 'content-type': 'text/html; charset=UTF-8', 'status': '500'}>, content <>
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1225190306-991489.1608922986.991665/dataflow_python_sdk.tar in 9 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1225190306-991489.1608922986.991665/dataflow-worker.jar...
--------------------- >> end captured logging << ---------------------
----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py37.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 68 tests in 4177.618s

FAILED (SKIP=7, errors=1)

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Script 'https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle' line: 118

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.7.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 13m 37s

214 actionable tasks: 154 executed, 56 from cache, 4 up-to-date

Gradle was unable to watch the file system for changes. The inotify watches limit is too low.

Publishing build scan...
https://gradle.com/s/dcyazq65eelvy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
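
Note on the failure: the traceback ends in apitools' HttpError with status 503 from the GCS resumable-upload endpoint while staging dataflow-worker.jar, i.e., a transient server-side error, and the captured logging shows an earlier 500 on dataflow_python_sdk.tar that was absorbed by a retry (apache_beam/utils/retry.py appears twice in the stack). As a rough sketch of the retry-with-exponential-backoff pattern involved, all names below are invented for illustration and this is not Beam's actual implementation:

    # Illustrative sketch only: generic retry with exponential backoff and
    # jitter for transient HTTP 5xx errors. TransientHttpError, with_backoff
    # and the upload example are hypothetical names, not Beam APIs.
    import random
    import time

    TRANSIENT = {500, 502, 503, 504}  # server-side statuses worth retrying

    class TransientHttpError(Exception):
        def __init__(self, status):
            super().__init__('HTTP %d' % status)
            self.status = status

    def with_backoff(fn, num_retries=5, initial_delay_s=1.0):
        """Call fn(); on a transient error, sleep with jitter and retry."""
        for attempt in range(num_retries + 1):
            try:
                return fn()
            except TransientHttpError as e:
                if e.status not in TRANSIENT or attempt == num_retries:
                    raise  # permanent error, or retries exhausted
                delay = initial_delay_s * (2 ** attempt)
                time.sleep(delay * random.uniform(0.5, 1.5))

    # e.g. with_backoff(lambda: upload('dataflow-worker.jar')) would ride out
    # a one-off 503 like the one captured above, but fails once retries run out.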

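On the earlier JOB_MESSAGE_BASIC about the project already containing 100 Dataflow-created metric descriptors: the message points at the Monitoring API's metricDescriptors.list and metricDescriptors.delete methods for pruning old descriptors. A minimal sketch, assuming the google-cloud-monitoring client library is installed; 'apache-beam-testing' is the project seen in this log, and the delete call is deliberately left commented out:

    # Sketch: enumerate custom metric descriptors so stale ones can be
    # reviewed and deleted, freeing room under the per-project cap.
    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project_name = 'projects/apache-beam-testing'

    descriptors = client.list_metric_descriptors(
        request={
            'name': project_name,
            'filter': 'metric.type = starts_with("custom.googleapis.com/")',
        })
    for d in descriptors:
        print(d.type)  # inspect before deleting anything
        # client.delete_metric_descriptor(request={'name': d.name})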