See 
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/1184/display/redirect>

------------------------------------------
[...truncated 983.56 KB...]
root: INFO: 2019-06-20T06:54:04.881Z: JOB_MESSAGE_DETAILED: Autoscaling is 
enabled for job 2019-06-19_23_54_04-10787052062606720290. The number of workers 
will be between 1 and 1000.
root: INFO: 2019-06-20T06:54:04.938Z: JOB_MESSAGE_DETAILED: Autoscaling was 
automatically enabled for job 2019-06-19_23_54_04-10787052062606720290.
root: INFO: 2019-06-20T06:54:07.770Z: JOB_MESSAGE_DETAILED: Checking 
permissions granted to controller Service Account.
root: INFO: 2019-06-20T06:54:08.338Z: JOB_MESSAGE_BASIC: Worker configuration: 
n1-standard-1 in us-central1-a.
root: INFO: 2019-06-20T06:54:09.111Z: JOB_MESSAGE_DETAILED: Expanding 
CoGroupByKey operations into optimizable parts.
root: INFO: 2019-06-20T06:54:09.160Z: JOB_MESSAGE_DETAILED: Expanding 
GroupByKey operations into optimizable parts.
root: INFO: 2019-06-20T06:54:09.205Z: JOB_MESSAGE_DETAILED: Lifting 
ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2019-06-20T06:54:09.253Z: JOB_MESSAGE_DEBUG: Annotating graph with 
Autotuner information.
root: INFO: 2019-06-20T06:54:09.449Z: JOB_MESSAGE_DETAILED: Fusing adjacent 
ParDo, Read, Write, and Flatten operations
root: INFO: 2019-06-20T06:54:09.492Z: JOB_MESSAGE_DETAILED: Fusing consumer 
write/WriteToBigQuery/NativeWrite into read
root: INFO: 2019-06-20T06:54:09.530Z: JOB_MESSAGE_DEBUG: Workflow config is 
missing a default resource spec.
root: INFO: 2019-06-20T06:54:09.566Z: JOB_MESSAGE_DEBUG: Adding StepResource 
setup and teardown to workflow graph.
root: INFO: 2019-06-20T06:54:09.613Z: JOB_MESSAGE_DEBUG: Adding workflow start 
and stop steps.
root: INFO: 2019-06-20T06:54:09.661Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-06-20T06:54:09.823Z: JOB_MESSAGE_DEBUG: Executing wait step 
start3
root: INFO: 2019-06-20T06:54:09.911Z: JOB_MESSAGE_BASIC: Executing operation 
read+write/WriteToBigQuery/NativeWrite
root: INFO: 2019-06-20T06:54:09.974Z: JOB_MESSAGE_DEBUG: Starting worker pool 
setup.
root: INFO: 2019-06-20T06:54:10.025Z: JOB_MESSAGE_BASIC: Starting 1 workers in 
us-central1-a...
root: INFO: 2019-06-20T06:54:13.228Z: JOB_MESSAGE_BASIC: BigQuery query issued 
as job: "dataflow_job_15188431681004758894". You can check its status with the 
bq tool: "bq show -j --project_id=apache-beam-testing 
dataflow_job_15188431681004758894".
root: INFO: 2019-06-20T06:54:59.475Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised 
the number of workers to 1 based on the rate of progress in the currently 
running step(s).
root: INFO: 2019-06-20T06:56:03.269Z: JOB_MESSAGE_BASIC: BigQuery query 
completed, job : "dataflow_job_15188431681004758894"
root: INFO: 2019-06-20T06:56:03.539Z: JOB_MESSAGE_BASIC: BigQuery export job 
"dataflow_job_17587624861661537223" started. You can check its status with the 
bq tool: "bq show -j --project_id=apache-beam-testing 
dataflow_job_17587624861661537223".
root: INFO: 2019-06-20T06:56:06.633Z: JOB_MESSAGE_DETAILED: Workers have 
started successfully.
root: INFO: 2019-06-20T06:56:06.672Z: JOB_MESSAGE_DETAILED: Workers have 
started successfully.
root: INFO: 2019-06-20T06:56:34.074Z: JOB_MESSAGE_DETAILED: BigQuery export job 
progress: "dataflow_job_17587624861661537223" observed total of 1 exported 
files thus far.
root: INFO: 2019-06-20T06:56:34.117Z: JOB_MESSAGE_BASIC: BigQuery export job 
finished: "dataflow_job_17587624861661537223"
root: INFO: 2019-06-20T06:58:12.166Z: JOB_MESSAGE_BASIC: Executing BigQuery 
import job "dataflow_job_15188431681004757036". You can check its status with 
the bq tool: "bq show -j --project_id=apache-beam-testing 
dataflow_job_15188431681004757036".
root: INFO: 2019-06-20T06:58:22.710Z: JOB_MESSAGE_BASIC: BigQuery import job 
"dataflow_job_15188431681004757036" done.
root: INFO: 2019-06-20T06:58:23.778Z: JOB_MESSAGE_BASIC: Finished operation 
read+write/WriteToBigQuery/NativeWrite
root: INFO: 2019-06-20T06:58:23.865Z: JOB_MESSAGE_DEBUG: Executing success step 
success1
root: INFO: 2019-06-20T06:58:24.006Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-06-20T06:58:24.169Z: JOB_MESSAGE_DEBUG: Starting worker pool 
teardown.
root: INFO: 2019-06-20T06:58:24.216Z: JOB_MESSAGE_BASIC: Stopping worker pool...
--------------------- >> end captured logging << ---------------------
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-19_23_27_14-7303297581115830079?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-19_23_43_01-9922612380191710699?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-19_23_52_18-4474883915936633028?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-20_00_00_09-6594133427237706818?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-19_23_27_12-6875557371329838319?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-19_23_51_13-10346774991177757331?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-19_23_59_28-17767692292970537471?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-20_00_09_37-15667622207208895704?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232:
 FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243:
 FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243:
 FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
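
The FutureWarnings above come from the experimental fileio transforms used by fileio_test.py. A minimal sketch, assuming a hypothetical file glob rather than the test's own inputs, of how MatchAll and ReadMatches fit together:

import apache_beam as beam
from apache_beam.io import fileio

with beam.Pipeline() as p:
    readable_files = (
        p
        | 'Globs' >> beam.Create(['/tmp/data/*.txt'])      # hypothetical file pattern
        | 'MatchAll' >> fileio.MatchAll()                   # experimental: one FileMetadata per match
        | 'ReadMatches' >> fileio.ReadMatches())            # experimental: one ReadableFile per match
    (readable_files
     | 'GetPath' >> beam.Map(lambda readable_file: readable_file.metadata.path)
     | 'Print' >> beam.Map(print))
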
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-19_23_27_11-16268864785602547842?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-19_23_40_11-9047109937327198975?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-19_23_47_51-4275277587178258376?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-19_23_55_33-4476531729564212870?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-20_00_03_06-8288964777650546748?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
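
The BeamDeprecationWarning above recommends WriteToBigQuery over the old BigQuerySink. A minimal sketch, with a hypothetical table, schema, and rows (kms_key shown only to mirror the kms_key argument in the warning's source line):

import apache_beam as beam

with beam.Pipeline() as p:
    (p
     | 'MakeRows' >> beam.Create([{'name': 'beam', 'year': 2016}])   # hypothetical rows
     | 'WriteToBQ' >> beam.io.WriteToBigQuery(
         'my-project:my_dataset.my_table',                           # hypothetical table
         schema='name:STRING,year:INTEGER',
         create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
         write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
         kms_key=None))   # optionally a Cloud KMS key name, as in the warning's kms_key argument
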
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-19_23_27_07-11910172833436648350?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-19_23_49_52-5721803678804115271?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-19_23_57_34-8073703081381956171?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-19_23_27_08-15455674215705002528?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
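
This warning flags reading options back from the pipeline object. A minimal sketch, assuming a hypothetical temp_location bucket, of keeping your own PipelineOptions reference instead of touching <pipeline>.options:

import apache_beam as beam
from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

options = PipelineOptions(temp_location='gs://my-bucket/tmp')        # hypothetical bucket
temp_location = options.view_as(GoogleCloudOptions).temp_location    # read from your own options object,
with beam.Pipeline(options=options) as p:                            # not from p.options
    p | 'Make' >> beam.Create([1, 2, 3]) | 'Print' >> beam.Map(print)
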
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-19_23_37_51-8428574649856870517?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-19_23_47_59-14638347052503264038?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-19_23_56_17-17524304589558550283?project=apache-beam-testing.
Exception in thread Thread-4:
Traceback (most recent call last):
  File "/usr/lib/python3.5/threading.py", line 914, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.5/threading.py", line 862, in run
    self._target(*self._args, **self._kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 157, in poll_for_job_completion
    response = runner.dataflow_client.get_job(job_id)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/utils/retry.py>", line 197, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 663, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py>", line 689, in Get
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py>", line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py>", line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py>", line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
apitools.base.py.exceptions.HttpNotFoundError: HttpError accessing 
<https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-06-19_23_56_17-17524304589558550283?alt=json>:
 response: <{'cache-control': 'private', 'x-xss-protection': '0', 
'content-length': '280', 'x-content-type-options': 'nosniff', 
'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json; 
charset=UTF-8', 'vary': 'Origin, X-Origin, Referer', '-content-encoding': 
'gzip', 'server': 'ESF', 'date': 'Thu, 20 Jun 2019 06:59:25 GMT', 
'transfer-encoding': 'chunked', 'status': '404'}>, content <{
  "error": {
    "code": 404,
    "message": "(dae872f14e92e487): Information about job 
2019-06-19_23_56_17-17524304589558550283 could not be found in our system. 
Please double check the id is correct. If it is please contact customer 
support.",
    "status": "NOT_FOUND"
  }
}
>
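
The 404 above comes from the monitoring thread asking Dataflow about a job the service no longer reports. A purely hypothetical sketch, not Beam's own handling, of tolerating that lookup failure around the get_job call seen in the traceback:

from apitools.base.py import exceptions as apitools_exceptions

def try_get_job(dataflow_client, job_id):
    # dataflow_client.get_job() is the call that raises HttpNotFoundError in
    # the traceback above; this hypothetical wrapper treats a vanished job as
    # "no information" instead of letting the polling thread die.
    try:
        return dataflow_client.get_job(job_id)
    except apitools_exceptions.HttpNotFoundError:
        return None
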

Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-19_23_27_07-8146065701468305213?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-19_23_34_24-6521228903858971001?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-19_23_44_33-18433852630659048185?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-19_23_54_33-5451092228422624608?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Exception in thread Thread-4:
Traceback (most recent call last):
  File "/usr/lib/python3.5/threading.py", line 914, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.5/threading.py", line 862, in run
    self._target(*self._args, **self._kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 157, in poll_for_job_completion
    response = runner.dataflow_client.get_job(job_id)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/utils/retry.py>", line 197, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 663, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py>", line 689, in Get
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py>", line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py>", line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py>", line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
apitools.base.py.exceptions.HttpNotFoundError: HttpError accessing 
<https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-06-19_23_54_33-5451092228422624608?alt=json>:
 response: <{'cache-control': 'private', 'x-xss-protection': '0', 
'content-length': '279', 'x-content-type-options': 'nosniff', 
'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json; 
charset=UTF-8', 'vary': 'Origin, X-Origin, Referer', '-content-encoding': 
'gzip', 'server': 'ESF', 'date': 'Thu, 20 Jun 2019 06:59:15 GMT', 
'transfer-encoding': 'chunked', 'status': '404'}>, content <{
  "error": {
    "code": 404,
    "message": "(6b7256532b503eab): Information about job 
2019-06-19_23_54_33-5451092228422624608 could not be found in our system. 
Please double check the id is correct. If it is please contact customer 
support.",
    "status": "NOT_FOUND"
  }
}
>

Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-19_23_27_10-5459756898272265150?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-19_23_37_01-6620308815437853371?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-19_23_46_28-10580454114033522407?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-19_23_54_04-10787052062606720290?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-20_00_01_15-7020748263840625027?project=apache-beam-testing.
Exception in thread Thread-39:
Traceback (most recent call last):
  File "/usr/lib/python3.5/threading.py", line 914, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.5/threading.py", line 862, in run
    self._target(*self._args, **self._kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 190, in poll_for_job_completion
    job_id, page_token=page_token, start_time=last_message_time)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/utils/retry.py>", line 197, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 748, in list_messages
    response = self._client.projects_locations_jobs_messages.List(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py>", line 553, in List
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py>", line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py>", line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py>", line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
apitools.base.py.exceptions.HttpNotFoundError: HttpError accessing 
<https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-06-19_23_54_04-10787052062606720290/messages?alt=json&startTime=2019-06-20T06%3A58%3A24.216Z>:
 response: <{'cache-control': 'private', 'x-xss-protection': '0', 
'content-length': '280', 'x-content-type-options': 'nosniff', 
'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json; 
charset=UTF-8', 'vary': 'Origin, X-Origin, Referer', '-content-encoding': 
'gzip', 'server': 'ESF', 'date': 'Thu, 20 Jun 2019 06:59:24 GMT', 
'transfer-encoding': 'chunked', 'status': '404'}>, content <{
  "error": {
    "code": 404,
    "message": "(132d23c2fed3078a): Information about job 
2019-06-19_23_54_04-10787052062606720290 could not be found in our system. 
Please double check the id is correct. If it is please contact customer 
support.",
    "status": "NOT_FOUND"
  }
}
>

<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=kms_key))
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-19_23_27_10-5243926924781847545?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-19_23_35_36-10362711653401722247?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-19_23_43_17-9478301273968356740?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-19_23_52_13-5039619879107339607?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-20_00_00_13-11038403418056631953?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-20_00_08_04-7600460837257055828?project=apache-beam-testing.

----------------------------------------------------------------------
XML: 
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 42 tests in 3047.992s

FAILED (SKIP=5, failures=3)

> Task :sdks:python:test-suites:dataflow:py35:postCommitIT FAILED

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py36/build.gradle>' line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py37/build.gradle>' line: 78

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle>' line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 52m 2s
77 actionable tasks: 60 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/tmg27poxhniji

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
