See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/822/display/redirect>
------------------------------------------
[...truncated 344.62 KB...]
Ran 35 tests in 2216.699s

OK (SKIP=11)

> Task :beam-sdks-python-test-suites-dataflow-py35:postCommitIT
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-10_23_02_34-10120969471008498464?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-10_23_17_57-9398476425821839752?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-10_23_25_44-6228352441150263529?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-10_23_02_29-16380086563678021639?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:665: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-10_23_22_21-412742590031108098?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-10_23_02_32-7795460752071704874?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-10_23_15_52-14095142147169085724?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-10_23_23_46-4459559417819123048?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-10_23_30_50-6499556042610975006?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-10_23_02_22-5227358905781442699?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:218: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-10_23_20_52-5256427966174394685?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:229: FutureWarning: MatchAll is experimental.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-10_23_28_46-13193241068429924822?project=apache-beam-testing.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:229: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-10_23_02_22-5529019474291790552?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-10_23_10_57-15718391025986543826?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-10_23_18_13-7503440813324744639?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-10_23_25_53-11859613599231784378?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-10_23_34_13-7819854416147841399?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-10_23_41_02-12617465174806835617?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-10_23_02_26-14600519598530972494?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:665: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-10_23_09_56-15475120664656592773?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-10_23_19_55-14978428519001764330?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-10_23_27_00-9803974653113639655?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-10_23_02_30-5239917858039049148?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-10_23_10_59-13235227876415639259?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:665: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-10_23_19_00-9300014213588524243?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-10_23_26_45-16595766754763335998?project=apache-beam-testing.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-10_23_02_28-11303620262501376415?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-10_23_11_34-15243432701266508766?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-10_23_21_43-7667201999081099313?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... SKIP: Due to a known issue in the avro-python3 package, this test is skipped until BEAM-6522 is addressed.
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-6769
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 35 tests in 2799.508s

OK (SKIP=4)

> Task :beam-sdks-python-test-suites-dataflow-py35:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options:
>>>   --runner=TestDataflowRunner
>>>   --project=apache-beam-testing
>>>   --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it
>>>   --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it
>>>   --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output
>>>   --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py35/build/apache-beam.tar.gz>
>>>   --requirements_file=postcommit_requirements.txt
>>>   --num_workers=1
>>>   --sleep_secs=20
>>>   --dataflow_worker_jar=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.14.0-SNAPSHOT.jar>
>>>   --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>> test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing entry points to apache_beam.egg-info/entry_points.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing apache_beam.egg-info/PKG-INFO
writing requirements to apache_beam.egg-info/requires.txt
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:176: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1709362674/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.14.0.dev' to '2.14.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:59: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-10_23_48_49-12635008090240496377?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-10_23_56_12-3864229389485238656?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-10_23_48_49-16175449588793719752?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-10_23_56_41-4728151896090137099?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-10_23_48_50-15532025906854774109?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-10_23_56_47-1636335687686326034?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-10_23_48_49-915124874354397064?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-10_23_57_07-9969373132215192312?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-10_23_48_48-6708891499316760667?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-10_23_56_17-955360388467318603?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-10_23_48_50-423071473965826494?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-10_23_48_52-18304335860849664761?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-10_23_56_35-12722883049073011888?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-10_23_48_49-9327346315942247532?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-10_23_56_47-8903050626768783299?project=apache-beam-testing.
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ERROR
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

======================================================================
ERROR: test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/transforms/sideinputs_test.py>", line 296, in test_as_dict_twice
    pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/testing/test_pipeline.py>", line 107, in run
    else test_runner_api))
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/pipeline.py>", line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/pipeline.py>", line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py>", line 53, in run_pipeline
    pipeline, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 460, in run_pipeline
    self.dataflow_client.create_job(self.job), self)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/utils/retry.py>", line 197, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 521, in create_job
    self.create_job_description(job)
"<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 551, in create_job_description resources = self._stage_resources(job.options) File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 481, in _stage_resources staging_location=google_cloud_options.staging_location) File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/portability/stager.py",> line 168, in stage_job_resources requirements_cache_path) File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/utils/retry.py",> line 197, in wrapper return fun(*args, **kwargs) File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/portability/stager.py",> line 487, in _populate_requirements_cache processes.check_output(cmd_args, stderr=processes.STDOUT) File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/utils/processes.py",> line 91, in check_output .format(traceback.format_exc(), args[0][6], error.output)) RuntimeError: Full traceback: Traceback (most recent call last): File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/utils/processes.py",> line 83, in check_output out = subprocess.check_output(*args, **kwargs) File "/usr/lib/python3.5/subprocess.py", line 626, in check_output **kwargs).stdout File "/usr/lib/python3.5/subprocess.py", line 708, in run output=stdout, stderr=stderr) subprocess.CalledProcessError: Command '['<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1709362674/bin/python',> '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']' returned non-zero exit status 1 Pip install failed for package: -r Output from execution of subprocess: b'Collecting pyhamcrest (from -r postcommit_requirements.txt (line 1))\n File was already downloaded /tmp/dataflow-requirements-cache/PyHamcrest-1.9.0.tar.gz\nCollecting mock (from -r postcommit_requirements.txt (line 2))\n ERROR: Could not find a version that satisfies the requirement mock (from -r postcommit_requirements.txt (line 2)) (from versions: none)\nERROR: No matching distribution found for mock (from -r postcommit_requirements.txt (line 2))\n' -------------------- >> begin captured logging << -------------------- root: WARNING: Typical end users should not use this worker jar feature. It can only be used when FnAPI is enabled. root: DEBUG: Connecting using Google Application Default Credentials. root: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0511064823-091164.1557557303.091388/pipeline.pb... oauth2client.transport: INFO: Attempting refresh to obtain initial access_token oauth2client.transport: INFO: Attempting refresh to obtain initial access_token root: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0511064823-091164.1557557303.091388/pipeline.pb in 0 seconds. root: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0511064823-091164.1557557303.091388/requirements.txt... 
root: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0511064823-091164.1557557303.091388/requirements.txt in 0 seconds.
root: INFO: Executing command: ['<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1709362674/bin/python>', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 968.044s

FAILED (errors=1)

> Task :beam-sdks-python-test-suites-dataflow-py35:validatesRunnerBatchTests FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle>' line: 67

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py35:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 4m 21s

71 actionable tasks: 54 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/xuquz62mlelt2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
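A few notes for anyone triaging this build. First, the recurring "BeamDeprecationWarning: options is deprecated since First stable release" lines come from code that reads options back off the pipeline object (<pipeline>.options). The supported pattern is to construct a PipelineOptions yourself, keep a reference to it, and pass it to the Pipeline. A minimal sketch under that assumption; the temp_location bucket here is a made-up placeholder, not a value from this build:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

    # Build the options up front and keep a reference, instead of reading
    # them back later via pipeline.options (which triggers the warning).
    options = PipelineOptions(['--temp_location', 'gs://my-bucket/temp'])
    temp_location = options.view_as(GoogleCloudOptions).temp_location

    with beam.Pipeline(options=options) as p:
        _ = p | beam.Create([1, 2, 3]) | beam.Map(print)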
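Second, for the repeated "BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead." warnings, a hedged sketch of the replacement transform follows. The project, dataset, table, and schema are hypothetical examples, not taken from these tests:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions()  # add --project, --temp_location, etc. for a real run

    with beam.Pipeline(options=options) as p:
        (p
         | beam.Create([{'word': 'beam', 'count': 1}])
         | beam.io.WriteToBigQuery(
             'my-project:my_dataset.my_table',  # hypothetical table spec
             schema='word:STRING,count:INTEGER',
             create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
             write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))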
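Third, the FutureWarnings from fileio_test.py only note that MatchAll and ReadMatches are still experimental in this 2.14.0.dev snapshot; their API may change. A small usage sketch for reference, with a placeholder file pattern:

    import apache_beam as beam
    from apache_beam.io import fileio

    with beam.Pipeline() as p:
        (p
         | fileio.MatchFiles('/tmp/data/*.txt')   # placeholder pattern
         | fileio.ReadMatches()
         # Each element is a ReadableFile; its metadata carries the path,
         # as in the test code quoted in the warnings above.
         | beam.Map(lambda readable_file: readable_file.metadata.path)
         | beam.Map(print))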
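Finally, the actual failure is not in test_as_dict_twice itself: staging aborted because pip could not fetch mock ("from versions: none") while populating the Dataflow requirements cache. Since --no-binary :all: restricts pip to source distributions, any requirement without a reachable sdist on the index fails this step. The sketch below reproduces that staging command outside the build, using the exact arguments from the captured logging above; the error handling mirrors (but is not) apache_beam.utils.processes.check_output:

    import subprocess
    import sys

    # Same command as in the captured logging; --no-binary :all: forces
    # sdists, so a missing/unreachable sdist makes pip exit non-zero.
    cmd = [
        sys.executable, '-m', 'pip', 'download',
        '--dest', '/tmp/dataflow-requirements-cache',
        '-r', 'postcommit_requirements.txt',
        '--exists-action', 'i',
        '--no-binary', ':all:',
    ]
    try:
        print(subprocess.check_output(cmd, stderr=subprocess.STDOUT).decode())
    except subprocess.CalledProcessError as error:
        # The SDK wraps this CalledProcessError in the RuntimeError seen
        # in the traceback above.
        print('pip download failed:\n' + error.output.decode())

Running this against the same index at the time of the failure should show whether the problem was a transient PyPI/index issue or a persistent packaging problem for mock.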
