See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/989/display/redirect?page=changes>
Changes:

[pabloem] [BEAM-6769][BEAM-7327][BEAM-7382] add it test for writing and reading

------------------------------------------
[...truncated 963.27 KB...]
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-29_13_23_21-1225450161827738394?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-29_13_46_00-8992727838186199707?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:680: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-29_13_55_16-14138213385751645172?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-29_13_23_20-8240574757660342207?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:218: FutureWarning: MatchAll is experimental.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-29_13_38_20-18107377270274934585?project=apache-beam-testing.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-29_13_46_27-16172664164370988286?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:229: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-29_13_54_39-13683080168417198589?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:229: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Exception in thread Thread-9:
Traceback (most recent call last):
  File "/usr/lib/python3.5/threading.py", line 914, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.5/threading.py", line 862, in run
    self._target(*self._args, **self._kwargs)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-29_14_02_04-7666070445230219124?project=apache-beam-testing.
File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 152, in poll_for_job_completion response = runner.dataflow_client.get_job(job_id) File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/utils/retry.py",> line 197, in wrapper return fun(*args, **kwargs) File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 661, in get_job response = self._client.projects_locations_jobs.Get(request) File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 689, in Get config, request, global_params=global_params) File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py",> line 731, in _RunMethod return self.ProcessHttpResponse(method_config, http_response, request) File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py",> line 737, in ProcessHttpResponse self.__ProcessHttpResponse(method_config, http_response, request)) File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py",> line 604, in __ProcessHttpResponse http_response, method_config=method_config, request=request) apitools.base.py.exceptions.HttpNotFoundError: HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-05-29_13_54_39-13683080168417198589?alt=json>: response: <{'content-length': '280', 'x-content-type-options': 'nosniff', 'date': 'Wed, 29 May 2019 21:00:45 GMT', 'content-type': 'application/json; charset=UTF-8', 'transfer-encoding': 'chunked', 'vary': 'Origin, X-Origin, Referer', 'x-xss-protection': '0', 'server': 'ESF', 'cache-control': 'private', 'x-frame-options': 'SAMEORIGIN', '-content-encoding': 'gzip', 'status': '404'}>, content <{ "error": { "code": 404, "message": "(844f52954e986839): Information about job 2019-05-29_13_54_39-13683080168417198589 could not be found in our system. Please double check the id is correct. If it is please contact customer support.", "status": "NOT_FOUND" } } > Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-29_13_23_18-6125407000896505937?project=apache-beam-testing. Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-29_13_44_32-7261000997311640672?project=apache-beam-testing. <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:680: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead. kms_key=transform.kms_key)) Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-29_13_53_57-7461165336017253683?project=apache-beam-testing. Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-29_14_02_11-14316310635361441363?project=apache-beam-testing. Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-29_13_23_19-17075949993416226115?project=apache-beam-testing. 
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:680: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-29_13_32_56-2909044857007255082?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
Exception in thread Thread-5:
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-29_13_40_36-9324524120401181080?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-29_13_48_04-7494489581513218968?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-29_13_56_33-7011114261448605830?project=apache-beam-testing.
Traceback (most recent call last):
  File "/usr/lib/python3.5/threading.py", line 914, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.5/threading.py", line 862, in run
    self._target(*self._args, **self._kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 152, in poll_for_job_completion
    response = runner.dataflow_client.get_job(job_id)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/utils/retry.py>", line 197, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 661, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py>", line 689, in Get
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py>", line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py>", line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py>", line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
apitools.base.py.exceptions.HttpNotFoundError: HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-05-29_13_56_33-7011114261448605830?alt=json>: response: <{'content-length': '279', 'x-content-type-options': 'nosniff', 'date': 'Wed, 29 May 2019 21:00:37 GMT', 'content-type': 'application/json; charset=UTF-8', 'transfer-encoding': 'chunked', 'vary': 'Origin, X-Origin, Referer', 'x-xss-protection': '0', 'server': 'ESF', 'cache-control': 'private', 'x-frame-options': 'SAMEORIGIN', '-content-encoding': 'gzip', 'status': '404'}>, content <{
  "error": {
    "code": 404,
    "message": "(fe32895ac975b1e4): Information about job 2019-05-29_13_56_33-7011114261448605830 could not be found in our system. Please double check the id is correct. If it is please contact customer support.",
    "status": "NOT_FOUND"
  }
}
>
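[Editor's note: the Thread-5 and Thread-9 exceptions above are job-status polling threads receiving a 404 from the Dataflow API for a job ID that the service does not (or no longer does) report. A minimal sketch of tolerating that race when polling is below; `try_get_job` is a hypothetical helper, not the Beam-internal fix.]

    from apitools.base.py import exceptions as apitools_exceptions

    def try_get_job(dataflow_client, job_id):
        """Return the job, or None if the service reports 404 (hypothetical helper)."""
        try:
            return dataflow_client.get_job(job_id)  # same call the traceback shows
        except apitools_exceptions.HttpNotFoundError:
            # Job not visible (yet, or anymore); the caller can retry with backoff.
            return None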
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-29_13_23_17-3665661721341862062?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-29_13_31_12-6010490132550934566?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:680: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-29_13_40_45-5601553216259859423?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-29_13_48_41-9100491685633777195?project=apache-beam-testing.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-29_13_23_21-2622490669419985033?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-29_13_32_59-10781338769942981562?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-29_13_41_27-7283314527501269354?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-29_13_51_52-18240003431786232177?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-29_13_23_24-1988363773204348524?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-29_13_32_37-12455510920644468614?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-29_13_42_10-4151522935975783343?project=apache-beam-testing.
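[Editor's note on the recurring "options is deprecated since First stable release" warnings: the warned-about code reads <pipeline>.options, e.g. p.options.view_as(GoogleCloudOptions).temp_location. A minimal sketch of the explicit-options style is below, with placeholder project and bucket values.]

    from apache_beam import Pipeline
    from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

    # Placeholder flags; real runs also pass --runner, --region, etc.
    options = PipelineOptions([
        '--project=my-project',
        '--temp_location=gs://my-bucket/temp',
    ])
    print(options.view_as(GoogleCloudOptions).temp_location)  # gs://my-bucket/temp

    with Pipeline(options=options) as p:
        pass  # transforms would be attached to p here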
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-29_13_51_58-9304248964685476123?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-29_14_01_57-6964511550855121402?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-29_14_10_25-5618427574054198473?project=apache-beam-testing.
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 41 tests in 3342.856s

FAILED (SKIP=4, failures=3)

> Task :sdks:python:test-suites:dataflow:py35:postCommitIT FAILED

> Task :sdks:python:test-suites:dataflow:py35:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py35/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.14.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing entry points to apache_beam.egg-info/entry_points.txt
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:177: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.14.0.dev' to '2.14.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:59: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-29_14_18_52-17558408330324924194?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-29_14_26_35-11461205284937066665?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-29_14_18_56-7129115509617842004?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-29_14_27_19-13216816646480284995?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-29_14_18_53-12124859306350816484?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-29_14_27_30-18414741619873053519?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-29_14_18_53-17440884290780531927?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-29_14_28_01-13939541306255083161?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-29_14_18_53-15026458664720468493?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-29_14_28_16-16905261627796937354?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-29_14_19_05-2532889078710896174?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-29_14_26_52-10243667949973537285?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-29_14_34_13-472077029399164816?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-29_14_18_54-12104141917921799404?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-29_14_28_22-12803502430055255131?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-29_14_18_54-7919404102742552594?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-29_14_26_32-10774284864505030748?project=apache-beam-testing.
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_dofn_lifecycle (apache_beam.transforms.dofn_lifecycle_test.DoFnLifecycleTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 17 tests in 1386.469s

OK

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py36/build.gradle>' line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py37/build.gradle>' line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle>' line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 20m 52s
78 actionable tasks: 62 executed, 16 from cache

Publishing build scan...
https://gradle.com/s/lwgrzzlqdcv2q

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
