See 
<https://builds.apache.org/job/beam_PostCommit_Python36/724/display/redirect?page=changes>

Changes:

[github] Showing the Java environment used by Dataflow in the UI (#8680)


------------------------------------------
[...truncated 83.20 KB...]
======================================================================
FAIL: test_big_query_legacy_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_it_test.py>", line 149, in test_big_query_legacy_sql
    big_query_query_to_table_pipeline.run_bq_pipeline(options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>", line 82, in run_bq_pipeline
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/testing/test_pipeline.py>", line 112, in run
    else test_runner_api))
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/pipeline.py>", line 407, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/pipeline.py>", line 420, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/direct/test_direct_runner.py>", line 51, in run_pipeline
    hc_assert_that(self.result, pickler.loads(on_success_matcher))
AssertionError: 
Expected: (Test pipeline expected terminated in state: DONE and Expected checksum is 158a8ea1c254fcf40d4ed3e7c0242c3ea0a29e72)
     but: Expected checksum is 158a8ea1c254fcf40d4ed3e7c0242c3ea0a29e72 Actual checksum is da39a3ee5e6b4b0d3255bfef95601890afd80709

-------------------- >> begin captured logging << --------------------
root: INFO: Setting socket default timeout to 60 seconds.
root: INFO: socket default timeout is 60.0 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Any
root: INFO: Running pipeline with DirectRunner.
root: DEBUG: Query SELECT * FROM (SELECT "apple" as fruit), (SELECT "orange" as 
fruit), does not reference any tables.
root: WARNING: Dataset 
apache-beam-testing:temp_dataset_41ab4bc2c8ae4b8c8d4a14578baad188 does not 
exist so we will create it as temporary with location=None
root: DEBUG: Creating or getting table <TableReference
 datasetId: 'python_query_to_table_15711665812468'
 projectId: 'apache-beam-testing'
 tableId: 'output_table'> with schema {'fields': [{'name': 'fruit', 'type': 
'STRING', 'mode': 'NULLABLE'}]}.
root: DEBUG: Created the table with id output_table
root: INFO: Created table 
apache-beam-testing.python_query_to_table_15711665812468.output_table with 
schema <TableSchema
 fields: [<TableFieldSchema
 fields: []
 mode: 'NULLABLE'
 name: 'fruit'
 type: 'STRING'>]>. Result: <Table
 creationTime: 1571166586391
 etag: 'A6ToBZ7KOkTw+CK4mh+DMA=='
 id: 'apache-beam-testing:python_query_to_table_15711665812468.output_table'
 kind: 'bigquery#table'
 lastModifiedTime: 1571166586443
 location: 'US'
 numBytes: 0
 numLongTermBytes: 0
 numRows: 0
 schema: <TableSchema
 fields: [<TableFieldSchema
 fields: []
 mode: 'NULLABLE'
 name: 'fruit'
 type: 'STRING'>]>
 selfLink: 
'https://www.googleapis.com/bigquery/v2/projects/apache-beam-testing/datasets/python_query_to_table_15711665812468/tables/output_table'
 tableReference: <TableReference
 datasetId: 'python_query_to_table_15711665812468'
 projectId: 'apache-beam-testing'
 tableId: 'output_table'>
 type: 'TABLE'>.
root: DEBUG: Attempting to flush to all destinations. Total buffered: 2
root: DEBUG: Flushing data to 
apache-beam-testing:python_query_to_table_15711665812468.output_table. Total 2 
rows.
root: DEBUG: Passed: True. Errors are []
root: INFO: Attempting to perform query SELECT fruit from 
`python_query_to_table_15711665812468.output_table`; to BQ
google.auth.transport._http_client: DEBUG: Making request: GET 
http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, 
connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): 
metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 
200 144
google.auth.transport.requests: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-comp...@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/844138762903-comp...@developer.gserviceaccount.com/token
 HTTP/1.1" 200 181
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): 
www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "POST 
/bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET 
/bigquery/v2/projects/apache-beam-testing/queries/e2ae52dc-1032-4233-adba-d827e4381360?maxResults=0&location=US
 HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET 
/bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anoned2c202f56a174152452af6daafa85c4ad66ad41/data
 HTTP/1.1" 200 None
root: INFO: Read from given query (SELECT fruit from 
`python_query_to_table_15711665812468.output_table`;), total rows 0
root: INFO: Generate checksum: da39a3ee5e6b4b0d3255bfef95601890afd80709
--------------------- >> end captured logging << ---------------------
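
Note on the failure above: the actual checksum da39a3ee5e6b4b0d3255bfef95601890afd80709 is the SHA-1 of empty input, and the captured log shows the verification query returned "total rows 0" even though 2 rows had just been flushed to the output table. That suggests the written rows were not yet visible when the on-success matcher ran, rather than wrong data being written. A minimal sketch of the two digests, assuming the matcher SHA-1-hashes the query results (the exact row formatting used by the matcher is not shown in this log):

import hashlib

# SHA-1 of empty input -- equal to the "Actual checksum" reported above,
# i.e. the matcher hashed zero rows.
assert hashlib.sha1(b'').hexdigest() == 'da39a3ee5e6b4b0d3255bfef95601890afd80709'

# Hypothetical illustration only: a non-empty result set ("apple", "orange")
# yields a non-empty digest; the expected value 158a8ea1... depends on the
# exact per-row string formatting the matcher applies.
rows = sorted(['apple', 'orange'])
print(hashlib.sha1(''.join(rows).encode('utf-8')).hexdigest())
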

----------------------------------------------------------------------
XML: nosetests-postCommitIT-direct-py36.xml
----------------------------------------------------------------------
XML: 
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 15 tests in 31.260s
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1142:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1142:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:793:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
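
The BeamDeprecationWarning lines above are raised inside the SDK's own BigQuery transforms (bigquery.py, bigquery_file_loads.py), which still read options back off the pipeline object. In user code, the non-deprecated pattern is to build a PipelineOptions instance once and read typed views from it directly, instead of going through <pipeline>.options. A minimal sketch, with placeholder project and bucket names:

import apache_beam as beam
from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

# Build the options once and keep a reference to them, instead of reading
# them back later via pipeline.options (the deprecated access path).
options = PipelineOptions(
    project='my-project',                # placeholder project id
    temp_location='gs://my-bucket/tmp',  # placeholder temp bucket
)

# Read typed views from the options object you already hold.
temp_location = options.view_as(GoogleCloudOptions).temp_location

with beam.Pipeline(options=options) as p:
    (p
     | 'Create' >> beam.Create(['apple', 'orange'])
     | 'Print' >> beam.Map(print))
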


FAILED (SKIP=1, failures=1)

> Task :sdks:python:test-suites:direct:py36:postCommitIT FAILED

> Task :sdks:python:test-suites:dataflow:py36:postCommitIT
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-15_12_10_02-15865981577782974009?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-15_12_26_27-8732892758506628914?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-15_12_38_28-10842805536789019983?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-15_12_48_52-7607391507329408837?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-15_13_03_19-13825485086297214674?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-15_12_09_58-14530549290543377536?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:695:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
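
For the "BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead." warnings, a minimal sketch of the suggested replacement follows; the table, dataset, project and schema values are placeholders, not the ones used by these test pipelines:

import apache_beam as beam

# Replacement for the deprecated BigQuerySink: write rows (dicts) with
# WriteToBigQuery. All identifiers below are placeholders.
write = beam.io.WriteToBigQuery(
    table='output_table',
    dataset='my_dataset',
    project='my-project',
    schema='fruit:STRING',
    create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
    write_disposition=beam.io.BigQueryDisposition.WRITE_TRUNCATE,
)

with beam.Pipeline() as p:
    (p
     | beam.Create([{'fruit': 'apple'}, {'fruit': 'orange'}])
     | write)
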
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-15_12_39_08-8902258855678340412?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-15_12_50_15-11975473773813151363?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-15_13_00_52-16857035251267514130?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-15_12_10_02-7372455630758543545?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-15_12_24_26-684630463837106073?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-15_12_37_00-13745019452050679134?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-15_12_55_10-14901957151886631232?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-15_12_09_57-10435407285462078138?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-15_12_36_32-12505807968042981771?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-15_12_48_23-9309706034655705522?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:793:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-15_13_01_07-5346672121831655173?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-15_12_09_57-12853011055235992514?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-15_12_21_25-731142299582317319?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:695:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-15_12_32_36-4682587060276747115?project=apache-beam-testing
  kms_key=transform.kms_key))
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-15_12_42_58-3460378638454248886?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-15_12_54_18-6983134927820009250?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=kms_key))
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-15_12_09_57-4117602056227056421?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:695:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-15_12_20_51-8753286597667120994?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-15_12_31_16-246307005813934936?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-15_12_42_14-842921366099346208?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-15_12_55_08-15771149210618595704?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-15_13_05_46-2861382029351070020?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-15_12_09_58-12729360916623171774?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-15_12_22_46-7922707020886318980?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-15_12_36_11-13193770202278821614?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-15_12_47_42-14147635578927444740?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-15_12_10_03-15090452517972695870?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:793:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-15_12_20_50-7361201120201396218?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-15_12_32_09-1695652304776636604?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-15_12_43_31-12382490655243818054?project=apache-beam-testing
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-15_12_53_48-13912317318110048170?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232:
 FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243:
 FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243:
 FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:577:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_datastore_wordcount_it 
(apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT)
 ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... SKIP: Due 
to a known issue in avro-python3 package, this test is skipped until BEAM-6522 
is addressed. 
test_streaming_wordcount_it 
(apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_bigquery_tornadoes_it 
(apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) 
... ok
test_autocomplete_it 
(apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) 
... ok
test_leader_board_it 
(apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it 
(apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_user_score_it 
(apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_bigquery_read_1M_python 
(apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_hourly_team_score_it 
(apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT)
 ... ok
test_multiple_destinations_transform 
(apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
 ... ok
test_datastore_write_limit 
(apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... SKIP: This 
test still needs to be fixed on Python 3. TODO: BEAM-4543
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... 
ok
test_copy_batch 
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms 
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token 
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) 
... ok
test_copy_rewrite_token 
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_bqfl_streaming 
(apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: 
TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform 
(apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail 
(apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_value_provider_transform 
(apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
 ... ok
test_big_query_read 
(apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types 
(apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: 
https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... 
ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_big_query_legacy_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... ok
test_big_query_new_types 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... ok
test_big_query_standard_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... ok
test_big_query_standard_sql_kms_key_native 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_streaming_data_only 
(apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes 
(apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_write 
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types 
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect 
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... 
SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema 
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_job_python_from_python_it 
(apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it 
(apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
 ... ok
test_metrics_it 
(apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
 ... ok
test_datastore_write_limit 
(apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) 
... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py36.xml
----------------------------------------------------------------------
XML: 
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3976.617s

OK (SKIP=6)

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/test-suites/direct/py36/build.gradle>' line: 51

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 7m 13s
64 actionable tasks: 47 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/x733xoqtskqxe

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org
