See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/916/display/redirect?page=changes>
Changes:

[migryz] Update stability critical jobs status

------------------------------------------
[...truncated 725.80 KB...]
root: INFO: Create job: <Job
 createTime: '2019-05-21T17:49:54.239348Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-05-21_10_49_53-12270823172710169910'
 location: 'us-central1'
 name: 'beamapp-jenkins-0521174920-952482'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-05-21T17:49:54.239348Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
root: INFO: Created job with id: [2019-05-21_10_49_53-12270823172710169910]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-21_10_49_53-12270823172710169910?project=apache-beam-testing
root: INFO: Job 2019-05-21_10_49_53-12270823172710169910 is in state JOB_STATE_RUNNING
root: INFO: 2019-05-21T17:49:56.853Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-05-21T17:49:57.666Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-4 in us-central1-a.
root: INFO: 2019-05-21T17:49:58.241Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
root: INFO: 2019-05-21T17:49:58.243Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
root: INFO: 2019-05-21T17:49:58.250Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-05-21T17:49:58.258Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
root: INFO: 2019-05-21T17:49:58.260Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
root: INFO: 2019-05-21T17:49:58.266Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-05-21T17:49:58.279Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-05-21T17:49:58.281Z: JOB_MESSAGE_DETAILED: Fusing consumer decode into ReadFromPubSub/Read
root: INFO: 2019-05-21T17:49:58.283Z: JOB_MESSAGE_DETAILED: Fusing consumer pair_with_one into split
root: INFO: 2019-05-21T17:49:58.286Z: JOB_MESSAGE_DETAILED: Fusing consumer count into group/MergeBuckets
root: INFO: 2019-05-21T17:49:58.288Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToPubSub/Write/NativeWrite into encode
root: INFO: 2019-05-21T17:49:58.290Z: JOB_MESSAGE_DETAILED: Fusing consumer encode into format
root: INFO: 2019-05-21T17:49:58.292Z: JOB_MESSAGE_DETAILED: Fusing consumer group/MergeBuckets into group/ReadStream
root: INFO: 2019-05-21T17:49:58.294Z: JOB_MESSAGE_DETAILED: Fusing consumer format into count
root: INFO: 2019-05-21T17:49:58.296Z: JOB_MESSAGE_DETAILED: Fusing consumer group/WriteStream into WindowInto(WindowIntoFn)
root: INFO: 2019-05-21T17:49:58.298Z: JOB_MESSAGE_DETAILED: Fusing consumer split into decode
root: INFO: 2019-05-21T17:49:58.300Z: JOB_MESSAGE_DETAILED: Fusing consumer WindowInto(WindowIntoFn) into pair_with_one
root: INFO: 2019-05-21T17:49:58.310Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-05-21T17:49:58.321Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-05-21T17:49:58.331Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-05-21T17:49:58.450Z: JOB_MESSAGE_DEBUG: Executing wait step start2
root: INFO: 2019-05-21T17:49:58.462Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-05-21T17:49:58.466Z: JOB_MESSAGE_BASIC: Starting 1 workers...
root: INFO: 2019-05-21T17:50:00.393Z: JOB_MESSAGE_BASIC: Executing operation group/ReadStream+group/MergeBuckets+count+format+encode+WriteToPubSub/Write/NativeWrite
root: INFO: 2019-05-21T17:50:00.393Z: JOB_MESSAGE_BASIC: Executing operation ReadFromPubSub/Read+decode+split+pair_with_one+WindowInto(WindowIntoFn)+group/WriteStream
root: WARNING: Timing out on waiting for job 2019-05-21_10_49_53-12270823172710169910 after 182 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/[email protected]/token HTTP/1.1" 200 176
root: ERROR: Timeout after 400 sec. Received 0 messages from projects/apache-beam-testing/subscriptions/wc_subscription_outputacb11087-da13-4d22-ae49-be3f612308fc.
--------------------- >> end captured logging << ---------------------
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-21_10_49_56-8503844973212011866?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-21_11_07_13-462040528737665311?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-21_11_17_11-5249809968938924834?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-21_10_49_56-18431711030549965610?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:665: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-21_11_21_09-16135401951155805266?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-21_11_34_15-4841056845788682306?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-21_10_49_53-15238357689816272534?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-21_11_03_32-14927890656439086803?project=apache-beam-testing.
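The `Timeout after 400 sec. Received 0 messages` error above comes from a verifier that repeatedly pulls the output Pub/Sub subscription until a deadline expires. A minimal, stdlib-only sketch of that poll-with-deadline pattern (the function and parameter names here are hypothetical, not the Beam test utility's actual API):

```python
import time

def poll_until(fetch, timeout_secs, sleep_secs=0.0):
    """Poll fetch() until it returns messages or the deadline passes.

    An empty result after the deadline mirrors the 'Received 0 messages'
    failure in the log above.
    """
    deadline = time.time() + timeout_secs
    received = []
    while time.time() < deadline and not received:
        received.extend(fetch())
        if not received and sleep_secs:
            time.sleep(sleep_secs)
    return received
```

The real test's 400-second budget was exhausted with zero messages delivered, so the suite reported a failure rather than hanging forever.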
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-21_11_13_52-7000070189330843902?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:218: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-21_11_24_25-15104273396829332193?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:229: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:229: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-21_10_49_56-10151028594135420232?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-21_11_18_59-12693769540300591367?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-21_10_49_54-3004406772633938281?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-21_11_03_26-13974453829156165774?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-21_11_16_41-8943066764131073934?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-21_11_26_26-12476486373119268566?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-21_11_35_49-8313705500718851958?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-21_11_46_10-12327889343653972053?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-21_10_49_53-5731850525535960067?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:665: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-21_11_03_30-14787414839076589343?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-21_11_17_28-11293434996232980448?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-21_11_26_39-18372940871586230542?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-21_10_49_53-12270823172710169910?project=apache-beam-testing.
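The BeamDeprecationWarning above points from the deprecated `BigQuerySink` to the `WriteToBigQuery` transform. A minimal migration sketch, with hypothetical table and schema names (this fragment requires the Beam SDK and a GCP project, so it is shown for shape only, not as the test code's actual sink):

```python
import apache_beam as beam

# Hypothetical fragment: the deprecated beam.io.BigQuerySink(...) sink
# becomes the beam.io.WriteToBigQuery transform the warning recommends.
with beam.Pipeline() as p:
    (p
     | 'Create' >> beam.Create([{'word': 'beam', 'count': 1}])
     | 'Write' >> beam.io.WriteToBigQuery(
         table='my_dataset.my_table',          # hypothetical table
         schema='word:STRING,count:INTEGER',
         write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))
```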
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-21_11_03_08-17772265245881510257?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-21_11_13_09-9525863446959201757?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-21_11_24_44-945328118361004679?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-21_10_50_00-8000192807430971150?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-21_11_03_33-15247980979387349074?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-21_11_17_23-9943052555553646955?project=apache-beam-testing.
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 35 tests in 4008.908s

FAILED (SKIP=4, failures=1)

> Task :sdks:python:test-suites:dataflow:py35:postCommitIT FAILED
> Task :sdks:python:test-suites:dataflow:py35:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options:
>>> --runner=TestDataflowRunner
>>> --project=apache-beam-testing
>>> --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it
>>> --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it
>>> --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output
>>> --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py35/build/apache-beam.tar.gz>
>>> --requirements_file=postcommit_requirements.txt
>>> --num_workers=1
>>> --sleep_secs=20
>>> --dataflow_worker_jar=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.14.0-SNAPSHOT.jar>
>>> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>> --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>> test options: --nocapture --processes=8 --process-timeout=4500 --attr=ValidatesRunner
running nosetests
running egg_info
writing entry points to apache_beam.egg-info/entry_points.txt
writing top-level names to apache_beam.egg-info/top_level.txt
writing apache_beam.egg-info/PKG-INFO
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing requirements to apache_beam.egg-info/requires.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:176: UserWarning: Python 3 support for the
Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.14.0.dev' to '2.14.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:59: UserWarning: Datastore IO will support Python 3 after replacing googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47: UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-21_11_56_33-10622949927368938587?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-21_12_04_46-10133660931954814653?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-21_11_56_34-18027125333271382984?project=apache-beam-testing.
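The UserWarnings above are ordinary `warnings.warn` calls emitted at import time by the SDK modules. A stdlib-only sketch of how such a warning is raised and can be captured in a test (the function name and message text here are illustrative, not the SDK's own):

```python
import warnings

def warn_py3_support():
    # Mimics the import-time caveat the SDK modules above emit.
    warnings.warn(
        'Running the Apache Beam SDK on Python 3 is not yet fully supported.',
        UserWarning)

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter('always')  # record every warning, even repeats
    warn_py3_support()
# caught[0].category is UserWarning; caught[0].message holds the text
```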
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-21_12_05_12-2786704001714899481?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-21_11_56_33-13136202770669889717?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-21_12_05_07-1549522488754799335?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-21_11_56_34-15419320219656131645?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-21_12_04_58-946173597246698254?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-21_11_56_34-11126461702678905384?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-21_12_05_12-16977534696084723684?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-21_11_56_32-809988853603964097?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-21_12_03_55-9651030851556497239?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-21_11_56_35-445619463040021628?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-21_12_06_38-10763096845471805647?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-21_11_56_33-6827941896967725977?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-21_12_03_28-4839664115314382376?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-21_12_11_56-1824331578713016579?project=apache-beam-testing.
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_dofn_lifecycle (apache_beam.transforms.dofn_lifecycle_test.DoFnLifecycleTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ...
ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 17 tests in 1409.049s

OK

FAILURE: Build completed with 4 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py35/build.gradle'> line: 46

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py37/build.gradle'> line: 47

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

==============================================================================

3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py36/build.gradle'> line: 47

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output.
Run with --scan to get full insights.

==============================================================================

4: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle'> line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 31m 54s

78 actionable tasks: 61 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/ds3ggzzzwgrxy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
