See <https://ci-beam.apache.org/job/beam_PostCommit_Python37/2783/display/redirect>
Changes:
------------------------------------------
[...truncated 15.32 MB...]
DEBUG:google.auth._default:No App Engine library was found so cannot authentication via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/[email protected]/token HTTP/1.1" 200 221
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/fc735dc5-41f1-4fec-b08d-f52f3d468215?maxResults=0&location=US HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anon7a0898538de8ba626a82d3c4cbd5ba8bc23260fb/data HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Result of query is: [(b'xyw', datetime.date(2011, 1, 1), datetime.time(23, 59, 59, 999999)), (b'abc', datetime.date(2000, 1, 1), datetime.time(0, 0)), (b'\xab\xac\xad', datetime.date(2000, 1, 1), datetime.time(0, 0)), (b'\xe4\xbd\xa0\xe5\xa5\xbd', datetime.date(3000, 12, 31), datetime.time(23, 59, 59))]
INFO:apache_beam.io.gcp.bigquery_write_it_test:Deleting dataset python_write_to_table_15988569021486 in project apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-31T07:02:48.992Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-31T07:02:49.010Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-31T07:03:59.104Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-31T07:03:59.146Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-31T07:05:15.376Z: JOB_MESSAGE_BASIC: Executing BigQuery import job "dataflow_job_15783375864840713933". You can check its status with the bq tool: "bq show -j --project_id=apache-beam-testing dataflow_job_15783375864840713933".
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-31T07:05:25.868Z: JOB_MESSAGE_BASIC: BigQuery import job "dataflow_job_15783375864840713933" done.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-31T07:05:26.642Z: JOB_MESSAGE_BASIC: Finished operation read+write/NativeWrite
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-31T07:05:26.738Z: JOB_MESSAGE_DEBUG: Executing success step success1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-31T07:05:26.807Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-31T07:05:27.070Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-31T07:05:27.100Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-31T07:06:07.879Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-31T07:06:07.916Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-31T07:06:07.984Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-08-30_23_59_15-13061921572624371339 is in state JOB_STATE_DONE
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query SELECT fruit from `python_query_to_table_15988571459561.output_table`; to BQ
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authentication via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/[email protected]/token HTTP/1.1" 200 221
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/669e2e72-e657-4735-bad4-38e9ed025893?maxResults=0&location=US HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anonea5f6b75ccfe496f8a76238e0d24174f9a000c3e/data HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Read from given query (SELECT fruit from `python_query_to_table_15988571459561.output_table`;), total rows 2
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Generate checksum: 158a8ea1c254fcf40d4ed3e7c0242c3ea0a29e72
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-31T07:06:26.267Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-31T07:06:32.760Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+ExternalTransform(simple)/Map(<lambda at external_it_test.py:43>)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-31T07:06:32.830Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-31T07:06:32.885Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-31T07:06:32.951Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-31T07:06:42.169Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-31T07:06:42.236Z: JOB_MESSAGE_DEBUG: Executing success step success19
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-31T07:06:42.310Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-31T07:06:42.361Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-31T07:06:42.394Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-31T07:07:36.173Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-31T07:07:36.250Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-31T07:07:36.303Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-31T07:07:36.371Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-31T07:07:37.166Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-31T07:07:37.211Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-31T07:07:37.239Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-08-31_00_00_24-2218075691715471291 is in state JOB_STATE_DONE
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-31T07:07:45.504Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-31T07:07:45.564Z: JOB_MESSAGE_DEBUG: Executing success step success11
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-31T07:07:45.654Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-31T07:07:45.704Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-31T07:07:45.738Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-31T07:08:40.847Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-31T07:08:40.893Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-08-31T07:08:40.936Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-08-31_00_01_43-9195403923424740542 is in state JOB_STATE_DONE
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_streaming_wordcount_debugging_it (apache_beam.examples.streaming_wordcount_debugging_it_test.StreamingWordcountDebuggingIT) ... SKIP: Skipped due to [BEAM-3377]: assert_that not working for streaming
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_read_via_sql (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_read_via_table (apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bigquery_read_custom_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_avro_file_load (apache_beam.io.gcp.bigquery_test.BigQueryFileLoadsIntegrationTests) ... ok
test_spanner_error (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_spanner_update (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_write_batches (apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ERROR
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok
test_dicom_search_instances (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_dicom_store_instance_from_gcs (apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_deidentification (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_inspection (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_analyzing_syntax (apache_beam.ml.gcp.naturallanguageml_test_it.NaturalLanguageMlTestIT) ... ok
test_basic_execution (apache_beam.testing.test_stream_it_test.TestStreamIntegrationTests) ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream supports emitting to multiple PCollections. ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream can independently control output watermarks. ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
test_text_detection_with_language_hint (apache_beam.ml.gcp.visionml_test_it.VisionMlTestIT) ... ok
test_label_detection_with_video_context (apache_beam.ml.gcp.videointelligenceml_test_it.VideoIntelligenceMlTestIT) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_avro (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_it_test.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok

======================================================================
ERROR: test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/parquetio_it_test.py", line 72, in test_parquetio_it
    self._verify_data(pcol, init_size, data_size)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/pipeline.py", line 571, in __exit__
    self.result = self.run()
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/testing/test_pipeline.py", line 112, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/pipeline.py", line 521, in run
    allow_proto_holders=True).run(False)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/pipeline.py", line 550, in run
    return self.runner.run_pipeline(self, self._options)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py", line 57, in run_pipeline
    self).run_pipeline(pipeline, options)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py", line 479, in run_pipeline
    artifacts=environments.python_sdk_dependencies(options)))
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/transforms/environments.py", line 614, in python_sdk_dependencies
    staged_name in stager.Stager.create_job_resources(options, tmp_dir))
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/portability/stager.py", line 172, in create_job_resources
    setup_options.requirements_file, requirements_cache_path)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/utils/retry.py", line 236, in wrapper
    return fun(*args, **kwargs)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/portability/stager.py", line 558, in _populate_requirements_cache
    processes.check_output(cmd_args, stderr=processes.STDOUT)
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/utils/processes.py", line 99, in check_output
    .format(traceback.format_exc(), args[0][6], error.output))
RuntimeError: Full traceback: Traceback (most recent call last):
  File "https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/utils/processes.py", line 91, in check_output
    out = subprocess.check_output(*args, **kwargs)
  File "/usr/lib/python3.7/subprocess.py", line 395, in check_output
    **kwargs).stdout
  File "/usr/lib/python3.7/subprocess.py", line 487, in run
    output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/bin/python', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']' returned non-zero exit status 1.
 Pip install failed for package: -r
 Output from execution of subprocess: b'Collecting pyhamcrest!=1.10.0,<2.0.0\n File was already downloaded /tmp/dataflow-requirements-cache/PyHamcrest-1.10.1.tar.gz\nCollecting mock<3.0.0\n File was already downloaded /tmp/dataflow-requirements-cache/mock-2.0.0.tar.gz\nCollecting parameterized<0.8.0,>=0.7.1\n File was already downloaded /tmp/dataflow-requirements-cache/parameterized-0.7.4.tar.gz\nCollecting six\n File was already downloaded /tmp/dataflow-requirements-cache/six-1.15.0.tar.gz\nCollecting pbr>=0.11\n File was already downloaded /tmp/dataflow-requirements-cache/pbr-5.4.5.tar.gz\n ERROR: Command errored out with exit status 1:\n command: https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/bin/python -c \'import sys, setuptools, tokenize; sys.argv[0] = \'"\'"\'/tmp/pip-download-a_155ggj/pbr/setup.py\'"\'"\'; __file__=\'"\'"\'/tmp/pip-download-a_155ggj/pbr/setup.py\'"\'"\';f=getattr(tokenize, \'"\'"\'open\'"\'"\', open)(__file__);code=f.read().replace(\'"\'"\'\\r\\n\'"\'"\', \'"\'"\'\\n\'"\'"\');f.close();exec(compile(code, __file__, \'"\'"\'exec\'"\'"\'))\' egg_info --egg-base /tmp/pip-pip-egg-info-earvn245\n cwd: /tmp/pip-download-a_155ggj/pbr/\n Complete output (11 lines):\n https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/nose/plugins/manager.py:395: RuntimeWarning: Unable to load plugin beam_test_plugin = test_config:BeamTestPlugin: No module named \'test_config\'\n RuntimeWarning)\n Traceback (most recent call last):\n File "<string>", line 1, in <module>\n File "/tmp/pip-download-a_155ggj/pbr/setup.py", line 18, in <module>\n from pbr import util\n File "/tmp/pip-download-a_155ggj/pbr/pbr/util.py", line 85, in <module>\n import pbr.hooks\n File "/tmp/pip-download-a_155ggj/pbr/pbr/hooks/__init__.py", line 17, in <module>\n from pbr.hooks import commands\n ImportError: cannot import name \'commands\' from \'pbr.hooks\' (/tmp/pip-download-a_155ggj/pbr/pbr/hooks/__init__.py)\n ----------------------------------------\nERROR: Command errored out with exit status 1: python setup.py egg_info Check the logs for full command output.\n'
-------------------- >> begin captured logging << --------------------
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
apache_beam.runners.portability.stager: INFO: Executing command: ['https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/bin/python', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py37.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 66 tests in 3898.917s

FAILED (SKIP=7, errors=1)

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED
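Note that the error above is not a Dataflow-side failure: the pipeline never submitted because pip failed while the stager populated its requirements cache, and the exact command is quoted in the CalledProcessError. A minimal local repro sketch in Python, mirroring the stager's own subprocess call; the generic python3.7 interpreter and a working directory containing postcommit_requirements.txt are assumptions standing in for the gradleenv paths in the log ('--no-binary :all:' forces sdist downloads, which is why pbr's setup.py runs at all):

    # Repro sketch for the _populate_requirements_cache failure above.
    # Assumptions: any Python 3.7 interpreter stands in for the gradleenv one,
    # and postcommit_requirements.txt is present in the working directory.
    import subprocess

    cmd = [
        "python3.7", "-m", "pip", "download",
        "--dest", "/tmp/dataflow-requirements-cache",
        "-r", "postcommit_requirements.txt",
        "--exists-action", "i",
        "--no-binary", ":all:",
    ]

    # apache_beam.runners.portability.stager._populate_requirements_cache runs
    # this same command via processes.check_output(). The pip output above shows
    # it reusing an already-downloaded pbr-5.4.5.tar.gz whose "setup.py egg_info"
    # step fails with ImportError: cannot import name 'commands' from 'pbr.hooks',
    # consistent with a stale or corrupted sdist in the cache directory; clearing
    # /tmp/dataflow-requirements-cache before retrying is one way to test that.
    print(subprocess.check_output(cmd, stderr=subprocess.STDOUT).decode())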
FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':release:go-licenses:java:dockerRun'.
> Process 'command 'docker'' finished with non-zero exit value 125

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':release:go-licenses:py:dockerRun'.
> Process 'command 'docker'' finished with non-zero exit value 125

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Script 'https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/common.gradle' line: 118

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 8m 26s

159 actionable tasks: 120 executed, 35 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/euhqzre742z3k

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
