See <https://builds.apache.org/job/beam_PostCommit_Python37/37/display/redirect>

------------------------------------------
[...truncated 252.17 KB...]
root: DEBUG: Stages: ['ref_AppliedPTransform_create/Read_3\n  
create/Read:beam:transform:read:v1\n  must follow: \n  downstream_side_inputs: 
', 'ref_AppliedPTransform_write/AppendDestination_5\n  
write/AppendDestination:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: ', 
'ref_AppliedPTransform_write/StreamInsertRows/ParDo(BigQueryWriteFn)_7\n  
write/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n  must 
follow: \n  downstream_side_inputs: ']
root: INFO: ==================== <function lift_combiners at 0x7f44644667b8> 
====================
root: DEBUG: 3 [1, 1, 1]
root: DEBUG: Stages: ['ref_AppliedPTransform_create/Read_3\n  
create/Read:beam:transform:read:v1\n  must follow: \n  downstream_side_inputs: 
', 'ref_AppliedPTransform_write/AppendDestination_5\n  
write/AppendDestination:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: ', 
'ref_AppliedPTransform_write/StreamInsertRows/ParDo(BigQueryWriteFn)_7\n  
write/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n  must 
follow: \n  downstream_side_inputs: ']
root: INFO: ==================== <function expand_sdf at 0x7f4464466840> 
====================
root: DEBUG: 3 [1, 1, 1]
root: DEBUG: Stages: ['ref_AppliedPTransform_create/Read_3\n  
create/Read:beam:transform:read:v1\n  must follow: \n  downstream_side_inputs: 
', 'ref_AppliedPTransform_write/AppendDestination_5\n  
write/AppendDestination:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: ', 
'ref_AppliedPTransform_write/StreamInsertRows/ParDo(BigQueryWriteFn)_7\n  
write/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n  must 
follow: \n  downstream_side_inputs: ']
root: INFO: ==================== <function expand_gbk at 0x7f44644668c8> 
====================
root: DEBUG: 3 [1, 1, 1]
root: DEBUG: Stages: ['ref_AppliedPTransform_create/Read_3\n  
create/Read:beam:transform:read:v1\n  must follow: \n  downstream_side_inputs: 
', 'ref_AppliedPTransform_write/AppendDestination_5\n  
write/AppendDestination:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: ', 
'ref_AppliedPTransform_write/StreamInsertRows/ParDo(BigQueryWriteFn)_7\n  
write/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n  must 
follow: \n  downstream_side_inputs: ']
root: INFO: ==================== <function sink_flattens at 0x7f44644669d8> 
====================
root: DEBUG: 3 [1, 1, 1]
root: DEBUG: Stages: ['ref_AppliedPTransform_create/Read_3\n  
create/Read:beam:transform:read:v1\n  must follow: \n  downstream_side_inputs: 
', 'ref_AppliedPTransform_write/AppendDestination_5\n  
write/AppendDestination:beam:transform:pardo:v1\n  must follow: \n  
downstream_side_inputs: ', 
'ref_AppliedPTransform_write/StreamInsertRows/ParDo(BigQueryWriteFn)_7\n  
write/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n  must 
follow: \n  downstream_side_inputs: ']
root: INFO: ==================== <function greedily_fuse at 0x7f4464466a60> 
====================
root: DEBUG: 1 [3]
root: DEBUG: Stages: 
['((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/AppendDestination_5))+(ref_AppliedPTransform_write/StreamInsertRows/ParDo(BigQueryWriteFn)_7)\n
  
create/Read:beam:transform:read:v1\nwrite/AppendDestination:beam:transform:pardo:v1\nwrite/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: ']
root: INFO: ==================== <function read_to_impulse at 0x7f4464466ae8> 
====================
root: DEBUG: 1 [4]
root: DEBUG: Stages: 
['((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/AppendDestination_5))+(ref_AppliedPTransform_write/StreamInsertRows/ParDo(BigQueryWriteFn)_7)\n
  
write/AppendDestination:beam:transform:pardo:v1\nwrite/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\ncreate/Read/Impulse:beam:transform:impulse:v1\ncreate/Read:beam:transform:read_from_impulse_python:v1\n
  must follow: \n  downstream_side_inputs: ']
root: INFO: ==================== <function impulse_to_input at 0x7f4464466b70> 
====================
root: DEBUG: 1 [4]
root: DEBUG: Stages: 
['((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/AppendDestination_5))+(ref_AppliedPTransform_write/StreamInsertRows/ParDo(BigQueryWriteFn)_7)\n
  
write/AppendDestination:beam:transform:pardo:v1\nwrite/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\ncreate/Read:beam:transform:read_from_impulse_python:v1\ncreate/Read/Impulse:beam:source:runner:0.1\n
  must follow: \n  downstream_side_inputs: ']
root: INFO: ==================== <function inject_timer_pcollections at 
0x7f4464466d08> ====================
root: DEBUG: 1 [4]
root: DEBUG: Stages: 
['((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/AppendDestination_5))+(ref_AppliedPTransform_write/StreamInsertRows/ParDo(BigQueryWriteFn)_7)\n
  
write/AppendDestination:beam:transform:pardo:v1\nwrite/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\ncreate/Read:beam:transform:read_from_impulse_python:v1\ncreate/Read/Impulse:beam:source:runner:0.1\n
  must follow: \n  downstream_side_inputs: ']
root: INFO: ==================== <function sort_stages at 0x7f4464466d90> 
====================
root: DEBUG: 1 [4]
root: DEBUG: Stages: 
['((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/AppendDestination_5))+(ref_AppliedPTransform_write/StreamInsertRows/ParDo(BigQueryWriteFn)_7)\n
  
write/AppendDestination:beam:transform:pardo:v1\nwrite/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\ncreate/Read:beam:transform:read_from_impulse_python:v1\ncreate/Read/Impulse:beam:source:runner:0.1\n
  must follow: \n  downstream_side_inputs: ']
root: INFO: ==================== <function window_pcollection_coders at 
0x7f4464466e18> ====================
root: DEBUG: 1 [4]
root: DEBUG: Stages: 
['((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/AppendDestination_5))+(ref_AppliedPTransform_write/StreamInsertRows/ParDo(BigQueryWriteFn)_7)\n
  
write/AppendDestination:beam:transform:pardo:v1\nwrite/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\ncreate/Read:beam:transform:read_from_impulse_python:v1\ncreate/Read/Impulse:beam:source:runner:0.1\n
  must follow: \n  downstream_side_inputs: ']
root: INFO: Running 
((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/AppendDestination_5))+(ref_AppliedPTransform_write/StreamInsertRows/ParDo(BigQueryWriteFn)_7)
root: DEBUG: start <DoOperation write/StreamInsertRows/ParDo(BigQueryWriteFn) 
output_tags=['out_FailedRows', 'out'], 
receivers=[ConsumerSet[write/StreamInsertRows/ParDo(BigQueryWriteFn).out0, 
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0], 
ConsumerSet[write/StreamInsertRows/ParDo(BigQueryWriteFn).out1, 
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0]]>
root: DEBUG: Connecting using Google Application Default Credentials.
root: DEBUG: start <DoOperation write/AppendDestination output_tags=['out'], 
receivers=[SingletonConsumerSet[write/AppendDestination.out0, 
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
root: DEBUG: start <ImpulseReadOperation 
receivers=[SingletonConsumerSet[create/Read.out0, 
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
root: DEBUG: start <DataInputOperation 
receivers=[SingletonConsumerSet[create/Read/Impulse.out0, 
coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
root: DEBUG: Creating or getting table <TableReference
 datasetId: 'python_write_to_table_15639482669274'
 projectId: 'apache-beam-testing'
 tableId: 'python_no_schema_table'> with schema None.
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: DEBUG: finish <DataInputOperation 
receivers=[SingletonConsumerSet[create/Read/Impulse.out0, 
coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
root: DEBUG: finish <ImpulseReadOperation 
receivers=[SingletonConsumerSet[create/Read.out0, 
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
root: DEBUG: finish <DoOperation write/AppendDestination output_tags=['out'], 
receivers=[SingletonConsumerSet[write/AppendDestination.out0, 
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
root: DEBUG: finish <DoOperation write/StreamInsertRows/ParDo(BigQueryWriteFn) 
output_tags=['out_FailedRows', 'out'], 
receivers=[ConsumerSet[write/StreamInsertRows/ParDo(BigQueryWriteFn).out0, 
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0], 
ConsumerSet[write/StreamInsertRows/ParDo(BigQueryWriteFn).out1, 
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0]]>
root: DEBUG: Attempting to flush to all destinations. Total buffered: 4
root: DEBUG: Flushing data to 
apache-beam-testing:python_write_to_table_15639482669274.python_no_schema_table.
 Total 4 rows.
root: DEBUG: Passed: True. Errors are []
root: DEBUG: Wait for the bundle bundle_12 to finish.
root: INFO: Start verify Bigquery data.
google.auth.transport._http_client: DEBUG: Making request: GET 
http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/project/project-id
root: INFO: Attempting to perform query SELECT bytes, date, time FROM 
python_write_to_table_15639482669274.python_no_schema_table to BQ
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, 
connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): 
metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 
200 144
google.auth.transport.requests: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/[email protected]/token
 HTTP/1.1" 200 176
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): 
www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "POST 
/bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
root: INFO: Result of query is: <google.cloud.bigquery.job.QueryJob object at 
0x7f4463fe0400>
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET 
/bigquery/v2/projects/apache-beam-testing/queries/3f626c38-744f-42ba-96cf-b42157b9c0e6?maxResults=0&location=US
 HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET 
/bigquery/v2/projects/apache-beam-testing/jobs/3f626c38-744f-42ba-96cf-b42157b9c0e6?location=US
 HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET 
/bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anonb0663bd07b4431781ff3b047457d394dc6701d16/data
 HTTP/1.1" 200 None
root: INFO: Read from given query (SELECT bytes, date, time FROM 
python_write_to_table_15639482669274.python_no_schema_table), total rows 0
root: INFO: Response from BigQuery is []
root: INFO: Start verify Bigquery data.
google.auth.transport._http_client: DEBUG: Making request: GET 
http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/project/project-id
root: INFO: Attempting to perform query SELECT bytes, date, time FROM 
python_write_to_table_15639482669274.python_no_schema_table to BQ
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, 
connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): 
metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 
200 144
google.auth.transport.requests: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/[email protected]/token
 HTTP/1.1" 200 176
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): 
www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "POST 
/bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
root: INFO: Result of query is: <google.cloud.bigquery.job.QueryJob object at 
0x7f446403fda0>
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET 
/bigquery/v2/projects/apache-beam-testing/queries/8ffdcdb4-2c5d-4d18-86bf-48f6a34d8afe?maxResults=0&location=US
 HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET 
/bigquery/v2/projects/apache-beam-testing/jobs/8ffdcdb4-2c5d-4d18-86bf-48f6a34d8afe?location=US
 HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET 
/bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anona40fe08a_aa0e_4aa2_b681_54a0d23b15c4/data
 HTTP/1.1" 200 None
root: INFO: Read from given query (SELECT bytes, date, time FROM 
python_write_to_table_15639482669274.python_no_schema_table), total rows 4
root: INFO: Response from BigQuery is [(b'xyw', datetime.date(2011, 1, 1), 
datetime.time(23, 59, 59, 999999)), (b'abc', datetime.date(2000, 1, 1), 
datetime.time(0, 0)), (b'\xe4\xbd\xa0\xe5\xa5\xbd', datetime.date(3000, 12, 
31), datetime.time(23, 59, 59)), (b'\xab\xac\xad', datetime.date(2000, 1, 1), 
datetime.time(0, 0))]
root: INFO: Deleting dataset python_write_to_table_15639482669274 in project 
apache-beam-testing
--------------------- >> end captured logging << ---------------------
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:557:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
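The repeated BeamDeprecationWarning above flags code that reads options back
off an already constructed pipeline via <pipeline>.options. A minimal sketch
of the pattern the warning steers toward, assuming a hypothetical project and
bucket: build a PipelineOptions object up front, pass it to the pipeline, and
read settings from that saved object directly.

    import apache_beam as beam
    from apache_beam.options.pipeline_options import (
        GoogleCloudOptions, PipelineOptions)

    # 'my-project' and 'gs://my-bucket/tmp' are placeholder values.
    options = PipelineOptions(
        project='my-project', temp_location='gs://my-bucket/tmp')

    # Read settings from the saved options object, not from pipeline.options.
    temp_location = options.view_as(GoogleCloudOptions).temp_location

    with beam.Pipeline(options=options) as p:
        _ = p | 'create' >> beam.Create([1, 2, 3])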

----------------------------------------------------------------------
XML: nosetests-postCommitIT-direct-py37.xml
----------------------------------------------------------------------
XML: 
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 15 tests in 24.287s

FAILED (SKIP=1, failures=1)

> Task :sdks:python:test-suites:direct:py37:postCommitIT FAILED

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1140:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-23_23_01_30-15456306723897659520?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-23_23_16_55-12293341533480278250?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-23_23_25_53-2019556875072106271?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-23_23_34_08-10057533972047159200?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-23_23_42_24-13873349330850876512?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-23_23_01_28-3920042482530121516?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-23_23_28_06-11522121694619209749?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:686:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-23_23_36_21-10351603273795608500?project=apache-beam-testing.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232:
 FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243:
 FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243:
 FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-23_23_01_29-10777004570555538217?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-23_23_15_02-14645513650829926714?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1140:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-23_23_23_43-17526927961116802239?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-23_23_31_42-16479953850614222242?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-23_23_01_28-12366006556782846504?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-23_23_22_29-4059008932044571142?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:686:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-23_23_31_41-1980574777269392879?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-23_23_39_33-11097616274119152063?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-23_23_01_28-1347041152238626277?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1140:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-23_23_11_45-16127409912491362968?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:557:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-23_23_23_01-2104279162255429705?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-23_23_32_15-15082774722133334634?project=apache-beam-testing.
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-23_23_01_27-13872846210726794130?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-23_23_09_54-5234221403114962776?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-23_23_19_41-9623535042335029592?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-23_23_29_15-4681281052783234521?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:686:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:565:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1140:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:557:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-23_23_01_30-509391355991476479?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-23_23_11_39-5393800817215018303?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-23_23_20_56-4737181843524750596?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-23_23_28_43-16229974820227353693?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-23_23_37_30-186057467301426499?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:686:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=kms_key))
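The BigQuerySink deprecation warnings above recommend WriteToBigQuery. A
minimal sketch of the replacement transform, assuming a hypothetical table
spec and schema:

    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (p
             | beam.Create([{'name': 'abc'}])
             # The table spec and schema below are placeholders.
             | beam.io.WriteToBigQuery(
                 'my-project:my_dataset.my_table',
                 schema='name:STRING',
                 create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                 write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))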
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-23_23_01_28-3390936919464140808?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-23_23_11_25-16719876878713712109?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-23_23_20_06-11377342166328977852?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-23_23_29_36-7709893491348655426?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-23_23_38_52-11629780250451931411?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-23_23_46_53-3350797549874473583?project=apache-beam-testing.
test_datastore_wordcount_it 
(apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT)
 ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... SKIP: Due 
to a known issue in the avro-python3 package, this test is skipped until 
BEAM-6522 is addressed. 
test_bigquery_tornadoes_it 
(apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) 
... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) 
... ok
test_streaming_wordcount_it 
(apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_autocomplete_it 
(apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_leader_board_it 
(apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it 
(apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_user_score_it 
(apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_bigquery_read_1M_python 
(apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_datastore_write_limit 
(apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... SKIP: This 
test still needs to be fixed on Python 3. TODO: BEAM-4543
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... 
ok
test_copy_batch 
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms 
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token 
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) 
... ok
test_copy_rewrite_token 
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_hourly_team_score_it 
(apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT)
 ... ok
test_multiple_destinations_transform 
(apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail 
(apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_big_query_read 
(apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types 
(apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_multiple_destinations_transform 
(apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
 ... ok
test_value_provider_transform 
(apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
 ... ok
test_streaming_data_only 
(apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes 
(apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_job_python_from_python_it 
(apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_big_query_legacy_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... ok
test_big_query_new_types 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... ok
test_big_query_standard_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... ok
test_big_query_standard_sql_kms_key_native 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... ok
test_big_query_write 
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types 
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect 
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... 
SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema 
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_metrics_fnapi_it 
(apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
 ... ok
test_metrics_it 
(apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
 ... ok
test_datastore_write_limit 
(apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) 
... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py37.xml
----------------------------------------------------------------------
XML: 
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 41 tests in 3214.622s

OK (SKIP=4)

FAILURE: Build failed with an exception.

* Where:
Build file 
'<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/direct/py37/build.gradle>'
 line: 49

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 54m 27s
64 actionable tasks: 47 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/37ljoz2ecvstq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

