See 
<https://builds.apache.org/job/beam_PostCommit_Python35/380/display/redirect?page=changes>

Changes:

[lostluck] Makes subnetwork configurable

------------------------------------------
[...truncated 460.67 KB...]
  File "/usr/local/lib/python3.5/site-packages/apitools/base/py/base_api.py", 
line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/py/base_api.py", 
line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "/usr/local/lib/python3.5/site-packages/apitools/base/py/base_api.py", 
line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
RuntimeError: apitools.base.py.exceptions.HttpConflictError: HttpError 
accessing 
<https://www.googleapis.com/bigquery/v2/projects/apache-beam-testing/jobs?alt=json>:
 response: <{'content-length': '502', 'date': 'Tue, 03 Sep 2019 23:09:12 GMT', 
'x-frame-options': 'SAMEORIGIN', 'x-content-type-options': 'nosniff', 
'cache-control': 'private', 'status': '409', 'transfer-encoding': 'chunked', 
'server': 'ESF', 'vary': 'Origin, X-Origin, Referer', 'content-type': 
'application/json; charset=UTF-8', '-content-encoding': 'gzip', 
'x-xss-protection': '0'}>, content <{
  "error": {
    "code": 409,
    "message": "Already Exists: Job 
apache-beam-testing:US.beam_load_2019_09_03_230429_18_copy_5d3c413c1366433e4f3065051acbb89a_to_33e6600193655cac1752910b12516138",
    "errors": [
      {
        "message": "Already Exists: Job 
apache-beam-testing:US.beam_load_2019_09_03_230429_18_copy_5d3c413c1366433e4f3065051acbb89a_to_33e6600193655cac1752910b12516138",
        "domain": "global",
        "reason": "duplicate"
      }
    ],
    "status": "ALREADY_EXISTS"
  }
}
> [while running 'WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)']
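The 409 above comes from a worker retry re-inserting a copy job whose deterministic job id was already created by an earlier attempt. A hedged sketch (not Beam's actual implementation) of making such an insert idempotent is below; `HttpConflictError` stands in for `apitools.base.py.exceptions.HttpConflictError`, and `FakeClient`, `insert_copy_job`, and the job id are illustrative names.

```python
class HttpConflictError(Exception):
    """Stand-in for apitools.base.py.exceptions.HttpConflictError (HTTP 409)."""


def insert_copy_job(client, job_id):
    """Insert a copy job; on a 409, reuse the job from a prior attempt."""
    try:
        return client.insert(job_id)
    except HttpConflictError:
        # Deterministic job ids mean a duplicate insert implies an earlier
        # attempt already created this job, so fetch it instead of failing.
        return client.get(job_id)


class FakeClient:
    """Minimal in-memory jobs API used only to exercise the sketch."""

    def __init__(self):
        self.jobs = {}

    def insert(self, job_id):
        if job_id in self.jobs:
            raise HttpConflictError(job_id)
        self.jobs[job_id] = {'id': job_id, 'state': 'DONE'}
        return self.jobs[job_id]

    def get(self, job_id):
        return self.jobs[job_id]


client = FakeClient()
first = insert_copy_job(client, 'beam_load_copy_example')
second = insert_copy_job(client, 'beam_load_copy_example')  # simulated retry
```

With this pattern the retried bundle completes instead of raising, which is one way the `ParDo(TriggerCopyJobs)` failure above could be avoided.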

root: INFO: 2019-09-03T23:09:19.220Z: JOB_MESSAGE_ERROR: Traceback (most recent 
call last):
  File "apache_beam/runners/common.py", line 803, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 610, in 
apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 682, in 
apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "apache_beam/runners/common.py", line 903, in 
apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/common.py", line 918, in 
apache_beam.runners.common._OutputProcessor.process_outputs
  File 
"/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_file_loads.py",
 line 331, in process
    write_disposition=self.write_disposition)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/utils/retry.py", 
line 206, in wrapper
    return fun(*args, **kwargs)
  File 
"/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_tools.py", 
line 313, in _insert_copy_job
    response = self.client.jobs.Insert(request)
  File 
"/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/internal/clients/bigquery/bigquery_v2_client.py",
 line 342, in Insert
    upload=upload, upload_config=upload_config)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/py/base_api.py", 
line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/py/base_api.py", 
line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "/usr/local/lib/python3.5/site-packages/apitools/base/py/base_api.py", 
line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
apitools.base.py.exceptions.HttpConflictError: HttpError accessing 
<https://www.googleapis.com/bigquery/v2/projects/apache-beam-testing/jobs?alt=json>:
 response: <{'content-length': '502', 'date': 'Tue, 03 Sep 2019 23:09:17 GMT', 
'x-frame-options': 'SAMEORIGIN', 'x-content-type-options': 'nosniff', 
'cache-control': 'private', 'status': '409', 'transfer-encoding': 'chunked', 
'server': 'ESF', 'vary': 'Origin, X-Origin, Referer', 'content-type': 
'application/json; charset=UTF-8', '-content-encoding': 'gzip', 
'x-xss-protection': '0'}>, content <{
  "error": {
    "code": 409,
    "message": "Already Exists: Job 
apache-beam-testing:US.beam_load_2019_09_03_230429_18_copy_5d3c413c1366433e4f3065051acbb89a_to_33e6600193655cac1752910b12516138",
    "errors": [
      {
        "message": "Already Exists: Job 
apache-beam-testing:US.beam_load_2019_09_03_230429_18_copy_5d3c413c1366433e4f3065051acbb89a_to_33e6600193655cac1752910b12516138",
        "domain": "global",
        "reason": "duplicate"
      }
    ],
    "status": "ALREADY_EXISTS"
  }
}
>

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/batchworker.py", 
line 649, in do_work
    work_executor.execute()
  File "/usr/local/lib/python3.5/site-packages/dataflow_worker/executor.py", 
line 176, in execute
    op.start()
  File "dataflow_worker/native_operations.py", line 38, in 
dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 39, in 
dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 44, in 
dataflow_worker.native_operations.NativeReadOperation.start
  File "dataflow_worker/native_operations.py", line 54, in 
dataflow_worker.native_operations.NativeReadOperation.start
  File "apache_beam/runners/worker/operations.py", line 256, in 
apache_beam.runners.worker.operations.Operation.output
  File "apache_beam/runners/worker/operations.py", line 100, in 
apache_beam.runners.worker.operations.ConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 593, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 594, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 799, in 
apache_beam.runners.common.DoFnRunner.receive
  File "apache_beam/runners/common.py", line 805, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 857, in 
apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "apache_beam/runners/common.py", line 803, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 610, in 
apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 682, in 
apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "apache_beam/runners/common.py", line 903, in 
apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/common.py", line 942, in 
apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/worker/operations.py", line 143, in 
apache_beam.runners.worker.operations.SingletonConsumerSet.receive
  File "apache_beam/runners/worker/operations.py", line 593, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/worker/operations.py", line 594, in 
apache_beam.runners.worker.operations.DoOperation.process
  File "apache_beam/runners/common.py", line 799, in 
apache_beam.runners.common.DoFnRunner.receive
  File "apache_beam/runners/common.py", line 805, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 872, in 
apache_beam.runners.common.DoFnRunner._reraise_augmented
  File "/usr/local/lib/python3.5/site-packages/future/utils/__init__.py", line 
421, in raise_with_traceback
    raise exc.with_traceback(traceback)
  File "apache_beam/runners/common.py", line 803, in 
apache_beam.runners.common.DoFnRunner.process
  File "apache_beam/runners/common.py", line 610, in 
apache_beam.runners.common.PerWindowInvoker.invoke_process
  File "apache_beam/runners/common.py", line 682, in 
apache_beam.runners.common.PerWindowInvoker._invoke_process_per_window
  File "apache_beam/runners/common.py", line 903, in 
apache_beam.runners.common._OutputProcessor.process_outputs
  File "apache_beam/runners/common.py", line 918, in 
apache_beam.runners.common._OutputProcessor.process_outputs
  File 
"/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_file_loads.py",
 line 331, in process
    write_disposition=self.write_disposition)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/utils/retry.py", 
line 206, in wrapper
    return fun(*args, **kwargs)
  File 
"/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/bigquery_tools.py", 
line 313, in _insert_copy_job
    response = self.client.jobs.Insert(request)
  File 
"/usr/local/lib/python3.5/site-packages/apache_beam/io/gcp/internal/clients/bigquery/bigquery_v2_client.py",
 line 342, in Insert
    upload=upload, upload_config=upload_config)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/py/base_api.py", 
line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "/usr/local/lib/python3.5/site-packages/apitools/base/py/base_api.py", 
line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "/usr/local/lib/python3.5/site-packages/apitools/base/py/base_api.py", 
line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
RuntimeError: apitools.base.py.exceptions.HttpConflictError: HttpError 
accessing 
<https://www.googleapis.com/bigquery/v2/projects/apache-beam-testing/jobs?alt=json>:
 response: <{'content-length': '502', 'date': 'Tue, 03 Sep 2019 23:09:17 GMT', 
'x-frame-options': 'SAMEORIGIN', 'x-content-type-options': 'nosniff', 
'cache-control': 'private', 'status': '409', 'transfer-encoding': 'chunked', 
'server': 'ESF', 'vary': 'Origin, X-Origin, Referer', 'content-type': 
'application/json; charset=UTF-8', '-content-encoding': 'gzip', 
'x-xss-protection': '0'}>, content <{
  "error": {
    "code": 409,
    "message": "Already Exists: Job 
apache-beam-testing:US.beam_load_2019_09_03_230429_18_copy_5d3c413c1366433e4f3065051acbb89a_to_33e6600193655cac1752910b12516138",
    "errors": [
      {
        "message": "Already Exists: Job 
apache-beam-testing:US.beam_load_2019_09_03_230429_18_copy_5d3c413c1366433e4f3065051acbb89a_to_33e6600193655cac1752910b12516138",
        "domain": "global",
        "reason": "duplicate"
      }
    ],
    "status": "ALREADY_EXISTS"
  }
}
> [while running 'WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)']

root: INFO: 2019-09-03T23:09:19.802Z: JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs+WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
root: INFO: 2019-09-03T23:09:19.892Z: JOB_MESSAGE_DEBUG: Executing failure step 
failure98
root: INFO: 2019-09-03T23:09:19.943Z: JOB_MESSAGE_ERROR: Workflow failed. 
Causes: 
S29:WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs+WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
 failed., The job failed because a work item has failed 4 times. Look in 
previous log entries for the cause of each one of the 4 failures. For more 
information, see https://cloud.google.com/dataflow/docs/guides/common-errors. 
The work item was attempted on these workers: 
  beamapp-jenkins-090322582-09031558-xu9d-harness-rtn9,
  beamapp-jenkins-090322582-09031558-xu9d-harness-rtn9,
  beamapp-jenkins-090322582-09031558-xu9d-harness-rtn9,
  beamapp-jenkins-090322582-09031558-xu9d-harness-rtn9
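The "failed 4 times" policy above can be sketched as a bounded retry loop: the runner re-attempts a failing work item a fixed number of times, then fails the whole job. This is an illustrative sketch only; `run_work_item` and `flaky_bundle` are hypothetical names, not Dataflow internals.

```python
MAX_ATTEMPTS = 4  # Dataflow's documented batch work-item retry limit


def run_work_item(execute, max_attempts=MAX_ATTEMPTS):
    """Run a work item, retrying on failure up to max_attempts times."""
    last_error = None
    for attempt in range(1, max_attempts + 1):
        try:
            return execute()
        except Exception as exc:  # each failure would be logged, then retried
            last_error = exc
    raise RuntimeError(
        'work item failed %d times: %s' % (max_attempts, last_error))


attempts = []


def flaky_bundle():
    """A bundle that always fails, like the 409 conflict in this log."""
    attempts.append(len(attempts) + 1)
    raise ValueError('409 Already Exists: duplicate copy job')


try:
    run_work_item(flaky_bundle)
    outcome = 'succeeded'
except RuntimeError as err:
    outcome = str(err)
```

Because every attempt hits the same deterministic conflict, all four retries land on the same worker and fail identically, matching the repeated harness name in the log.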
root: INFO: 2019-09-03T23:09:20.101Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-09-03T23:09:20.388Z: JOB_MESSAGE_DEBUG: Starting worker pool 
teardown.
root: INFO: 2019-09-03T23:09:20.423Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-09-03T23:13:01.883Z: JOB_MESSAGE_DETAILED: Autoscaling: 
Reduced the number of workers to 0 based on the rate of progress in the 
currently running step(s).
root: INFO: 2019-09-03T23:13:01.941Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-09-03T23:13:01.992Z: JOB_MESSAGE_DEBUG: Tearing down pending 
resources...
root: INFO: Job 2019-09-03_15_58_49-379106670029973862 is in state 
JOB_STATE_FAILED
root: INFO: Deleting dataset python_bq_file_loads_15675515004185 in project 
apache-beam-testing
--------------------- >> end captured logging << ---------------------
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-03_15_47_04-10231134830865035256?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-03_16_03_29-13385722211975597419?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-03_16_13_41-6187281644726589764?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-03_16_23_45-18165983246415396497?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:695:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-03_15_47_00-1041952298316311418?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1142:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-03_16_12_09-746018342410462564?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-03_16_30_45-5894403732173077815?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-03_15_47_02-14178607218583284440?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-03_16_01_37-14046724520412299099?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-03_16_12_26-6221681166542026309?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:695:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-03_16_21_01-12623952293363490580?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-03_16_30_15-14899099070641612324?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-03_15_47_00-2057091988156908874?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-03_16_07_45-6726613412076351961?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-03_16_18_49-13764669268338495629?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-03_16_28_07-17667592804653268398?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:577:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-03_15_46_59-707721722247794440?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-03_15_57_28-2696557588738424472?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-03_16_08_09-4068925341740239786?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:793:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-03_16_19_56-3148166816802617510?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-03_16_29_49-272803039573000383?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232:
 FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243:
 FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243:
 FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-03_15_46_58-15940935935815275280?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:695:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-03_15_57_38-11131674147710009065?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-03_16_07_47-7544542920618953928?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-03_16_18_10-12870935398116943603?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-03_16_28_40-11627666782793418709?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-03_16_37_35-5472719843898011261?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-03_15_47_06-4586268288689846037?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-03_15_58_13-12137507712374998561?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:695:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-03_16_07_14-8276680960594489473?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=kms_key))
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-03_16_16_39-10300460254395206649?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-03_16_25_59-7910512939710044292?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-03_15_47_01-8505536112410722221?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-03_15_58_49-379106670029973862?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-03_16_13_39-9677199781516471548?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-03_16_24_55-5327611756059658370?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-03_16_34_26-571265836330487952?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:793:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py35.xml
----------------------------------------------------------------------
XML: 
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3612.881s

FAILED (SKIP=6, errors=1)

> Task :sdks:python:test-suites:dataflow:py35:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 
'<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle>'
 line: 56

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 1m 3s
64 actionable tasks: 47 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/kerseempcyiq4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
