See <https://builds.apache.org/job/beam_PostCommit_Python37/478/display/redirect>
------------------------------------------
[...truncated 262.27 KB...]
Stopping hdfs_it-jenkins-beam_postcommit_python37-478_namenode_1 ... done
Aborting on container exit...

real    1m33.847s
user    0m1.135s
sys     0m0.181s
+ finally
+ docker-compose -p hdfs_IT-jenkins-beam_PostCommit_Python37-478 --no-ansi down
Removing hdfs_it-jenkins-beam_postcommit_python37-478_test_1 ...
Removing hdfs_it-jenkins-beam_postcommit_python37-478_datanode_1 ...
Removing hdfs_it-jenkins-beam_postcommit_python37-478_namenode_1 ...
Removing hdfs_it-jenkins-beam_postcommit_python37-478_namenode_1 ... done
Removing hdfs_it-jenkins-beam_postcommit_python37-478_test_1 ... done
Removing hdfs_it-jenkins-beam_postcommit_python37-478_datanode_1 ... done
Removing network hdfs_it-jenkins-beam_postcommit_python37-478_test_net

real    0m0.796s
user    0m0.605s
sys     0m0.086s

> Task :sdks:python:test-suites:direct:py37:postCommitIT
[WARNING] Could not find SDK tarball in SDK_LOCATION: build/apache-beam.tar.gz.
>>> RUNNING integration tests with pipeline options: --runner=TestDirectRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=build/apache-beam.tar.gz --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>> test options: --tests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it,apache_beam.io.gcp.pubsub_integration_test:PubSubIntegrationTest,apache_beam.io.gcp.big_query_query_to_table_it_test:BigQueryQueryToTableIT,apache_beam.io.gcp.bigquery_io_read_it_test,apache_beam.io.gcp.bigquery_read_it_test,apache_beam.io.gcp.bigquery_write_it_test,apache_beam.io.gcp.datastore.v1new.datastore_write_it_test --nocapture --processes=8 --process-timeout=4500
running nosetests
running egg_info
writing apache_beam.egg-info/PKG-INFO
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
writing requirements to apache_beam.egg-info/requires.txt
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/1398941891/lib/python3.7/site-packages/setuptools/dist.py>:474: UserWarning: Normalizing '2.17.0.dev' to '2.17.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
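As an aside for anyone reproducing this locally: the flags in the ">>> RUNNING integration tests" block above are ordinary Beam PipelineOptions flags, so the same configuration can be built directly in Python. A minimal sketch, assuming a gcloud-authenticated environment; the values are copied from the log and the snippet is illustrative, not part of the test harness:

    from apache_beam.options.pipeline_options import (
        GoogleCloudOptions, PipelineOptions)

    # The same flags the postCommitIT harness passes on the command line.
    options = PipelineOptions([
        '--runner=TestDirectRunner',
        '--project=apache-beam-testing',
        '--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it',
        '--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it',
    ])

    # Typed views expose the flags to pipeline code.
    assert options.view_as(GoogleCloudOptions).project == 'apache-beam-testing'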
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... FAIL
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... SKIP: This test doesn't work on DirectRunner.

======================================================================
FAIL: test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_write_it_test.py>", line 134, in test_big_query_write
    write_disposition=beam.io.BigQueryDisposition.WRITE_EMPTY))
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/pipeline.py>", line 427, in __exit__
    self.run().wait_until_finish()
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/pipeline.py>", line 407, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/pipeline.py>", line 420, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/direct/test_direct_runner.py>", line 51, in run_pipeline
    hc_assert_that(self.result, pickler.loads(on_success_matcher))
AssertionError: 
Expected: (Expected data is [(1, 'abc'), (2, 'def'), (3, '你好'), (4, 'привет')])
     but: Expected data is [(1, 'abc'), (2, 'def'), (3, '你好'), (4, 'привет')] Actual data is []
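The failure mode here is that the pipeline itself completed, but the post-run verifier found no rows. As the last traceback frame shows, TestDirectRunner unpickles an on_success_matcher from the test options and asserts it against the pipeline result; for this test the matcher re-queries the output table and compares against the expected tuples. A sketch of that pattern, using the BigqueryFullResultMatcher these BigQuery ITs build on (query and values taken from the log; the pipeline body is elided):

    import apache_beam as beam
    from apache_beam.io.gcp.tests.bigquery_matcher import BigqueryFullResultMatcher
    from apache_beam.testing.test_pipeline import TestPipeline

    # Verifier that queries BigQuery after the run and compares the rows
    # against the expected tuples shown in the assertion above.
    verifier = BigqueryFullResultMatcher(
        project='apache-beam-testing',
        query='SELECT number, str FROM '
              'python_write_to_table_15686800602989.python_write_table',
        data=[(1, 'abc'), (2, 'def'), (3, '你好'), (4, 'привет')])

    # TestPipeline pickles the matcher into the on_success_matcher option;
    # TestDirectRunner.run_pipeline() unpickles and asserts it, which is
    # the hc_assert_that call in the last traceback frame.
    test_pipeline = TestPipeline(is_integration_test=True)
    args = test_pipeline.get_full_options_as_args(on_success_matcher=verifier)
    with beam.Pipeline(argv=args) as p:
        ...  # WriteToBigQuery with WRITE_EMPTY, per bigquery_write_it_test.py line 134

Note in the captured logging below that the streaming insert reports "Passed: True. Errors are []" for 4 rows, yet the verification query run immediately afterwards returns []. That is consistent with BigQuery's streaming buffer not yet being visible to queries at verification time, though the log alone cannot confirm that reading.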
-------------------- >> begin captured logging << --------------------
root: INFO: Setting socket default timeout to 60 seconds.
root: INFO: socket default timeout is 60.0 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Created dataset python_write_to_table_15686800602989 in project apache-beam-testing
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Any
root: DEBUG: Unhandled type_constraint: Any
root: INFO: ==================== <function annotate_downstream_side_inputs at 0x7fc25cbf9488> ====================
root: DEBUG: 3 [1, 1, 1]
root: DEBUG: Stages: ['ref_AppliedPTransform_create/Read_3\n create/Read:beam:transform:read:v1\n must follow: \n downstream_side_inputs: ', 'ref_AppliedPTransform_write/AppendDestination_5\n write/AppendDestination:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: ', 'ref_AppliedPTransform_write/StreamInsertRows/ParDo(BigQueryWriteFn)_7\n write/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: ']
root: INFO: ==================== <function fix_side_input_pcoll_coders at 0x7fc25cbf9598> ====================
root: DEBUG: 3 [1, 1, 1]
root: DEBUG: Stages: ['ref_AppliedPTransform_create/Read_3\n create/Read:beam:transform:read:v1\n must follow: \n downstream_side_inputs: ', 'ref_AppliedPTransform_write/AppendDestination_5\n write/AppendDestination:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: ', 'ref_AppliedPTransform_write/StreamInsertRows/ParDo(BigQueryWriteFn)_7\n write/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: ']
root: INFO: ==================== <function lift_combiners at 0x7fc25cbf9620> ====================
root: DEBUG: 3 [1, 1, 1]
root: DEBUG: Stages: ['ref_AppliedPTransform_create/Read_3\n create/Read:beam:transform:read:v1\n must follow: \n downstream_side_inputs: ', 'ref_AppliedPTransform_write/AppendDestination_5\n write/AppendDestination:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: ', 'ref_AppliedPTransform_write/StreamInsertRows/ParDo(BigQueryWriteFn)_7\n write/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: ']
root: INFO: ==================== <function expand_sdf at 0x7fc25cbf96a8> ====================
root: DEBUG: 3 [1, 1, 1]
root: DEBUG: Stages: ['ref_AppliedPTransform_create/Read_3\n create/Read:beam:transform:read:v1\n must follow: \n downstream_side_inputs: ', 'ref_AppliedPTransform_write/AppendDestination_5\n write/AppendDestination:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: ', 'ref_AppliedPTransform_write/StreamInsertRows/ParDo(BigQueryWriteFn)_7\n write/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: ']
root: INFO: ==================== <function expand_gbk at 0x7fc25cbf9730> ====================
root: DEBUG: 3 [1, 1, 1]
root: DEBUG: Stages: ['ref_AppliedPTransform_create/Read_3\n create/Read:beam:transform:read:v1\n must follow: \n downstream_side_inputs: ', 'ref_AppliedPTransform_write/AppendDestination_5\n write/AppendDestination:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: ', 'ref_AppliedPTransform_write/StreamInsertRows/ParDo(BigQueryWriteFn)_7\n write/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: ']
root: INFO: ==================== <function sink_flattens at 0x7fc25cbf9840> ====================
root: DEBUG: 3 [1, 1, 1]
root: DEBUG: Stages: ['ref_AppliedPTransform_create/Read_3\n create/Read:beam:transform:read:v1\n must follow: \n downstream_side_inputs: ', 'ref_AppliedPTransform_write/AppendDestination_5\n write/AppendDestination:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: ', 'ref_AppliedPTransform_write/StreamInsertRows/ParDo(BigQueryWriteFn)_7\n write/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: ']
root: INFO: ==================== <function greedily_fuse at 0x7fc25cbf98c8> ====================
root: DEBUG: 1 [3]
root: DEBUG: Stages: ['((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/AppendDestination_5))+(ref_AppliedPTransform_write/StreamInsertRows/ParDo(BigQueryWriteFn)_7)\n create/Read:beam:transform:read:v1\nwrite/AppendDestination:beam:transform:pardo:v1\nwrite/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: ']
root: INFO: ==================== <function read_to_impulse at 0x7fc25cbf9950> ====================
root: DEBUG: 1 [4]
root: DEBUG: Stages: ['((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/AppendDestination_5))+(ref_AppliedPTransform_write/StreamInsertRows/ParDo(BigQueryWriteFn)_7)\n write/AppendDestination:beam:transform:pardo:v1\nwrite/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\ncreate/Read/Impulse:beam:transform:impulse:v1\ncreate/Read:beam:transform:read_from_impulse_python:v1\n must follow: \n downstream_side_inputs: ']
root: INFO: ==================== <function impulse_to_input at 0x7fc25cbf99d8> ====================
root: DEBUG: 1 [4]
root: DEBUG: Stages: ['((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/AppendDestination_5))+(ref_AppliedPTransform_write/StreamInsertRows/ParDo(BigQueryWriteFn)_7)\n write/AppendDestination:beam:transform:pardo:v1\nwrite/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\ncreate/Read:beam:transform:read_from_impulse_python:v1\ncreate/Read/Impulse:beam:source:runner:0.1\n must follow: \n downstream_side_inputs: ']
root: INFO: ==================== <function inject_timer_pcollections at 0x7fc25cbf9b70> ====================
root: DEBUG: 1 [4]
root: DEBUG: Stages: ['((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/AppendDestination_5))+(ref_AppliedPTransform_write/StreamInsertRows/ParDo(BigQueryWriteFn)_7)\n write/AppendDestination:beam:transform:pardo:v1\nwrite/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\ncreate/Read:beam:transform:read_from_impulse_python:v1\ncreate/Read/Impulse:beam:source:runner:0.1\n must follow: \n downstream_side_inputs: ']
root: INFO: ==================== <function sort_stages at 0x7fc25cbf9bf8> ====================
root: DEBUG: 1 [4]
root: DEBUG: Stages: ['((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/AppendDestination_5))+(ref_AppliedPTransform_write/StreamInsertRows/ParDo(BigQueryWriteFn)_7)\n write/AppendDestination:beam:transform:pardo:v1\nwrite/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\ncreate/Read:beam:transform:read_from_impulse_python:v1\ncreate/Read/Impulse:beam:source:runner:0.1\n must follow: \n downstream_side_inputs: ']
root: INFO: ==================== <function window_pcollection_coders at 0x7fc25cbf9c80> ====================
root: DEBUG: 1 [4]
root: DEBUG: Stages: ['((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/AppendDestination_5))+(ref_AppliedPTransform_write/StreamInsertRows/ParDo(BigQueryWriteFn)_7)\n write/AppendDestination:beam:transform:pardo:v1\nwrite/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\ncreate/Read:beam:transform:read_from_impulse_python:v1\ncreate/Read/Impulse:beam:source:runner:0.1\n must follow: \n downstream_side_inputs: ']
root: INFO: Running ((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/AppendDestination_5))+(ref_AppliedPTransform_write/StreamInsertRows/ParDo(BigQueryWriteFn)_7)
root: DEBUG: start <DoOperation write/StreamInsertRows/ParDo(BigQueryWriteFn) output_tags=['out_FailedRows', 'out'], receivers=[ConsumerSet[write/StreamInsertRows/ParDo(BigQueryWriteFn).out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0], ConsumerSet[write/StreamInsertRows/ParDo(BigQueryWriteFn).out1, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0]]>
root: DEBUG: start <DoOperation write/AppendDestination output_tags=['out'], receivers=[SingletonConsumerSet[write/AppendDestination.out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]], len(consumers)=1]]>
root: DEBUG: start <ImpulseReadOperation receivers=[SingletonConsumerSet[create/Read.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
root: DEBUG: start <DataInputOperation receivers=[SingletonConsumerSet[create/Read/Impulse.out0, coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
root: DEBUG: Creating or getting table <TableReference datasetId: 'python_write_to_table_15686800602989' projectId: 'apache-beam-testing' tableId: 'python_write_table'> with schema {'fields': [{'name': 'number', 'type': 'INTEGER'}, {'name': 'str', 'type': 'STRING'}]}.
root: DEBUG: Created the table with id python_write_table
root: INFO: Created table apache-beam-testing.python_write_to_table_15686800602989.python_write_table with schema <TableSchema fields: [<TableFieldSchema fields: [] mode: 'NULLABLE' name: 'number' type: 'INTEGER'>, <TableFieldSchema fields: [] mode: 'NULLABLE' name: 'str' type: 'STRING'>]>. Result: <Table creationTime: 1568680061660 etag: 'e8wceWFXQ1hIZAM+YXMIrA==' id: 'apache-beam-testing:python_write_to_table_15686800602989.python_write_table' kind: 'bigquery#table' lastModifiedTime: 1568680061735 location: 'US' numBytes: 0 numLongTermBytes: 0 numRows: 0 schema: <TableSchema fields: [<TableFieldSchema fields: [] mode: 'NULLABLE' name: 'number' type: 'INTEGER'>, <TableFieldSchema fields: [] mode: 'NULLABLE' name: 'str' type: 'STRING'>]> selfLink: 'https://www.googleapis.com/bigquery/v2/projects/apache-beam-testing/datasets/python_write_to_table_15686800602989/tables/python_write_table' tableReference: <TableReference datasetId: 'python_write_to_table_15686800602989' projectId: 'apache-beam-testing' tableId: 'python_write_table'> type: 'TABLE'>.
root: DEBUG: finish <DataInputOperation receivers=[SingletonConsumerSet[create/Read/Impulse.out0, coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
root: DEBUG: finish <ImpulseReadOperation receivers=[SingletonConsumerSet[create/Read.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
root: DEBUG: finish <DoOperation write/AppendDestination output_tags=['out'], receivers=[SingletonConsumerSet[write/AppendDestination.out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]], len(consumers)=1]]>
root: DEBUG: finish <DoOperation write/StreamInsertRows/ParDo(BigQueryWriteFn) output_tags=['out_FailedRows', 'out'], receivers=[ConsumerSet[write/StreamInsertRows/ParDo(BigQueryWriteFn).out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0], ConsumerSet[write/StreamInsertRows/ParDo(BigQueryWriteFn).out1, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0]]>
root: DEBUG: Attempting to flush to all destinations. Total buffered: 4
root: DEBUG: Flushing data to apache-beam-testing:python_write_to_table_15686800602989.python_write_table. Total 4 rows.
root: DEBUG: Passed: True. Errors are []
root: DEBUG: Wait for the bundle bundle_1 to finish.
root: INFO: Attempting to perform query SELECT number, str FROM python_write_to_table_15686800602989.python_write_table to BQ
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-comp...@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-comp...@developer.gserviceaccount.com/token HTTP/1.1" 200 181
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/9f9331bc-44e0-4770-b43f-2169d21f0291?maxResults=0&timeoutMs=10000&location=US HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/jobs/9f9331bc-44e0-4770-b43f-2169d21f0291?location=US HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anon20fbfc844d7bc91e19bd8b226a4191b2a8647c3c/data HTTP/1.1" 200 None
root: INFO: Result of query is: []
root: INFO: Deleting dataset python_write_to_table_15686800602989 in project apache-beam-testing
--------------------- >> end captured logging << ---------------------

<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1142: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1142: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:793: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
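Separate from the failure, the BeamDeprecationWarning above recurs on every run of this suite: since the first stable release, reaching back through <pipeline>.options is deprecated, and pipeline code is expected to keep hold of the PipelineOptions it was constructed with. A short sketch of the preferred pattern (the bucket name is a hypothetical placeholder):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import (
        GoogleCloudOptions, PipelineOptions)

    options = PipelineOptions(['--temp_location=gs://my-bucket/temp'])

    # Preferred: read settings from the options object you already hold,
    # rather than via <pipeline>.options as bigquery.py does at line 1142.
    temp_location = options.view_as(GoogleCloudOptions).temp_location

    with beam.Pipeline(options=options) as p:
        p | beam.Create([1, 2, 3]) | beam.Map(print)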
----------------------------------------------------------------------
XML: nosetests-postCommitIT-direct-py37.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 15 tests in 75.504s

FAILED (SKIP=1, failures=1)

> Task :sdks:python:test-suites:direct:py37:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build.gradle>' line: 46

* What went wrong:
Execution failed for task ':sdks:python:sdist'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/direct/py37/build.gradle>' line: 51

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 40s

64 actionable tasks: 47 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/o5zihvpiib6ie

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org