See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/1342/display/redirect>
------------------------------------------
[...truncated 1.05 MB...]
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-07-12T06:40:11.980457Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-07-11_23_40_11-1191261000212408852]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-11_23_40_11-1191261000212408852?project=apache-beam-testing
root: INFO: Job 2019-07-11_23_40_11-1191261000212408852 is in state JOB_STATE_RUNNING
root: INFO: 2019-07-12T06:40:11.152Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2019-07-11_23_40_11-1191261000212408852. The number of workers will be between 1 and 1000.
root: INFO: 2019-07-12T06:40:11.253Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2019-07-11_23_40_11-1191261000212408852.
root: INFO: 2019-07-12T06:40:13.890Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-07-12T06:40:14.428Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-b.
root: INFO: 2019-07-12T06:40:14.998Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-07-12T06:40:15.042Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2019-07-12T06:40:15.091Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2019-07-12T06:40:15.139Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-07-12T06:40:15.197Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-07-12T06:40:15.236Z: JOB_MESSAGE_DETAILED: Fusing consumer write/WriteToBigQuery/NativeWrite into create/Read
root: INFO: 2019-07-12T06:40:15.282Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2019-07-12T06:40:15.321Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-07-12T06:40:15.367Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-07-12T06:40:15.412Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-07-12T06:40:15.609Z: JOB_MESSAGE_DEBUG: Executing wait step start3
root: INFO: 2019-07-12T06:40:15.704Z: JOB_MESSAGE_BASIC: Executing operation create/Read+write/WriteToBigQuery/NativeWrite
root: INFO: 2019-07-12T06:40:15.763Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-07-12T06:40:15.807Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b...
root: INFO: 2019-07-12T06:40:16.056Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-b.
root: INFO: 2019-07-12T06:41:09.534Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2019-07-12T06:42:16.453Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-07-12T06:42:16.483Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-07-12T06:43:16.025Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-b.
root: INFO: Deleting dataset python_write_to_table_15629135871645 in project apache-beam-testing
--------------------- >> end captured logging << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-11_23_02_34-17294466275199212589?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1140: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-11_23_18_28-7757643293936997929?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-11_23_26_35-17314734967805679305?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-11_23_34_46-15836799519169863129?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-11_23_44_28-12496521232369522802?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-11_23_02_32-111143196285111110?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-11_23_25_32-2580726001151973497?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:686: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-11_23_33_16-690333367401968055?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-11_23_02_29-2683198069503212272?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1140: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-11_23_15_55-5859413234428202815?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-11_23_25_46-97867751126401411?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-11_23_34_51-12291013847153424416?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-11_23_02_31-5206298545638663205?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-11_23_23_37-14816156054317192006?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:686: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-11_23_32_08-3152812264112974102?project=apache-beam-testing.
Exception in thread Thread-4:
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-11_23_40_11-1191261000212408852?project=apache-beam-testing.
Traceback (most recent call last):
  File "/usr/lib/python3.5/threading.py", line 914, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.5/threading.py", line 862, in run
    self._target(*self._args, **self._kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 190, in poll_for_job_completion
    job_id, page_token=page_token, start_time=last_message_time)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/utils/retry.py>", line 197, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 748, in list_messages
    response = self._client.projects_locations_jobs_messages.List(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py>", line 553, in List
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py>", line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py>", line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py>", line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
apitools.base.py.exceptions.HttpNotFoundError: HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-07-11_23_40_11-1191261000212408852/messages?alt=json&startTime=2019-07-12T06%3A43%3A16.025Z>: response: <{'content-length': '279', 'transfer-encoding': 'chunked', '-content-encoding': 'gzip', 'date': 'Fri, 12 Jul 2019 06:45:41 GMT', 'cache-control': 'private', 'server': 'ESF', 'status': '404', 'x-xss-protection': '0', 'x-content-type-options': 'nosniff', 'content-type': 'application/json; charset=UTF-8', 'vary': 'Origin, X-Origin, Referer', 'x-frame-options': 'SAMEORIGIN'}>, content <{
  "error": {
    "code": 404,
    "message": "(c9d51b7d38d70590): Information about job 2019-07-11_23_40_11-1191261000212408852 could not be found in our system. Please double check the id is correct. If it is please contact customer support.",
    "status": "NOT_FOUND"
  }
}
>
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-11_23_02_30-16707432393307897119?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-11_23_12_27-6339028658017969576?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:686: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Exception in thread Thread-2:
Traceback (most recent call last):
  File "/usr/lib/python3.5/threading.py", line 914, in _bootstrap_inner
    self.run()
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-11_23_21_17-10395555851683024986?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-11_23_29_26-7561410011597723347?project=apache-beam-testing.
  File "/usr/lib/python3.5/threading.py", line 862, in run
    self._target(*self._args, **self._kwargs)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-11_23_37_09-15003514892607763709?project=apache-beam-testing.
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 190, in poll_for_job_completion
    job_id, page_token=page_token, start_time=last_message_time)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/utils/retry.py>", line 197, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 748, in list_messages
    response = self._client.projects_locations_jobs_messages.List(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py>", line 553, in List
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py>", line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py>", line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py>", line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
apitools.base.py.exceptions.HttpNotFoundError: HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-07-11_23_12_27-6339028658017969576/messages?alt=json&startTime=2019-07-12T06%3A17%3A18.832Z>: response: <{'content-length': '279', 'transfer-encoding': 'chunked', '-content-encoding': 'gzip', 'date': 'Fri, 12 Jul 2019 06:17:55 GMT', 'cache-control': 'private', 'server': 'ESF', 'status': '404', 'x-xss-protection': '0', 'x-content-type-options': 'nosniff', 'content-type': 'application/json; charset=UTF-8', 'vary': 'Origin, X-Origin, Referer', 'x-frame-options': 'SAMEORIGIN'}>, content <{
  "error": {
    "code": 404,
    "message": "(2f54182bb9144767): Information about job 2019-07-11_23_12_27-6339028658017969576 could not be found in our system. Please double check the id is correct. If it is please contact customer support.",
    "status": "NOT_FOUND"
  }
}
>
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-11_23_02_31-4576728565057439430?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-11_23_11_05-4464446222040779526?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-11_23_21_20-9020280438189138964?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:686: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-11_23_30_11-12555418353994972774?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1140: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:557: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-11_23_02_32-5912634838364680133?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-11_23_12_25-1528024823446447563?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-11_23_22_06-13803280702370484659?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-11_23_32_09-6689845661054057308?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-11_23_41_14-486534141905650435?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-11_23_49_34-207575568032859997?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1140: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:557: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Exception in thread Thread-2:
Traceback (most recent call last):
  File "/usr/lib/python3.5/threading.py", line 914, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.5/threading.py", line 862, in run
    self._target(*self._args, **self._kwargs)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-11_23_02_31-728891665139189759?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-11_23_12_51-17012282243903216946?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-11_23_21_03-5747413560794396145?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-11_23_29_54-17902490143459160277?project=apache-beam-testing.
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 157, in poll_for_job_completion
    response = runner.dataflow_client.get_job(job_id)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/utils/retry.py>", line 197, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 663, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py>", line 689, in Get
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py>", line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py>", line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py>", line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
apitools.base.py.exceptions.HttpNotFoundError: HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-07-11_23_12_51-17012282243903216946?alt=json>: response: <{'content-length': '280', 'transfer-encoding': 'chunked', '-content-encoding': 'gzip', 'date': 'Fri, 12 Jul 2019 06:17:59 GMT', 'cache-control': 'private', 'server': 'ESF', 'status': '404', 'x-xss-protection': '0', 'x-content-type-options': 'nosniff', 'content-type': 'application/json; charset=UTF-8', 'vary': 'Origin, X-Origin, Referer', 'x-frame-options': 'SAMEORIGIN'}>, content <{
  "error": {
    "code": 404,
    "message": "(15bba895cdf5a7d0): Information about job 2019-07-11_23_12_51-17012282243903216946 could not be found in our system. Please double check the id is correct. If it is please contact customer support.",
    "status": "NOT_FOUND"
  }
}
>
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 42 tests in 3380.337s

FAILED (SKIP=5, failures=3)

> Task :sdks:python:test-suites:dataflow:py35:postCommitIT FAILED

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py36/build.gradle>' line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py37/build.gradle>' line: 78

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle>' line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 57m 54s

77 actionable tasks: 63 executed, 14 from cache

Publishing build scan...
https://gradle.com/s/pqndflmfbgew4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
