See 
<https://builds.apache.org/job/beam_PostCommit_Python37/1467/display/redirect?page=changes>

Changes:

[github] Fixing Lint

[github] [BEAM-9201] Release scripts fixes: run_rc_validation.sh,


------------------------------------------
[...truncated 2.78 MB...]
        },
        "serialized_fn": 
"eNq1lG2T1EQQx7O7HA8BDjh84ERRVDSrkoCIiCKoe/LgSjhzpxcfrqYmydxO3EkmPTPhuCq2SqVyxYeQ72pP9s5jVXjni+xmuqd/0/l39/zW81Ja0ZQzkjBa+EbRUm9IVWg/lYq5AyoETQRbU7SqmFqSN0sXnP7v0JlA14t7juOQjRJ6aZYL4RP765JUMWoY2ajL1OQSA/Z5M34haUbMVsVcmIsPImIgM7aKa9jfwIEIDnrDztDBpzs8OXAfOw+dR51RZ8WBQ2EDbj/uYMgDONzAkbjA14DLggW/snKcl3r3/7wW9D4LNqUaa/w+FtjPI8tSm4EsityQ5S3DZXnpSqBVGuhsrIOqtQRPyRHsyRFYOfxqC462GV8TtEgyeh3m7z6ZGzhwLO6iFZU43sCJvoGFCE7OfPOIGUKNUS680AKSOhcG84QX4wO4RLf1wkvb8HIEp2ZC86KSypBCZrVAyRbj0xjwnKLBKw2cjuDV9hyCkNQQAq9tw5kIXufz4X/VKmW4gDf4Po//rX5vuDJYeOhknUWsQNZddLLen12swlkvtP54DvGFLA2HN+Oj+G6kKmkmSSrr0sBb2/C2gXP9tkfG5D6808C78R//b73YA1pUgtlqyXGCT5DkI6iZ2iI72TFti+jx+eENfqyPlepH8B4/xRfjC/9QdZfl77L8f7Pg/QY+iOA8R3X9CAJUN5zAhfiwVd52OOF5aTRcnB0ydLR2P2NYMIo47d65ZyfgtjW78CFO2CUkfeS1qLysatPyNFwO4yNokrXZs30c1ttwJdGhgU8iuNrApxF81sC1CXzu8Yvc0q4j7YbHL4e83fxFMs2RqpGuWGpn+Et+tUbAVxEM2hNwfnH2p35YCts2rZRMmdbwNR/scG8i99Ye93ZSJ+twZwLfrMPwuTfLWl5mcjMvRy58i5y7Ewi9tmM3WwcmdO9Z8dMd7i0hEyqmHNRsGSnfxYdsI6p8NGIKEdGzEDtb3CW2QWthVneWsIKQ1fiEbYU0rYtaUHuD2dlj8D32/HHrEUJusoygj5VWjR9mTqlNLvCAvGDaYAe5S7Wi02twDeHxBH6M99vJyVMlNfw0dHQSL9isdyNwgIokL5mCn/HEtky5Jtk0U/jlcZ0YWPf/Al7c7IM=",
        "user_name": "format"
      }
    },
    {
      "kind": "ParallelWrite",
      "name": "s6",
      "properties": {
        "create_disposition": "CREATE_IF_NEEDED",
        "dataset": "BigQueryTornadoesIT",
        "display_data": [],
        "encoding": {
          "@type": "kind:windowed_value",
          "component_encodings": [
            {
              "@type": 
"RowAsDictJsonCoder$eNprYE5OLEhMzkiNT0pNzNXLzNdLTy7QS8pMLyxNLaqML8nPzynmCsovdyx2yUwu8SrOz3POT0kt4ipk0GwsZKwtZErSAwBK5xfp",
              "component_encodings": [],
              "pipeline_proto_coder_id": "ref_Coder_RowAsDictJsonCoder_7"
            },
            {
              "@type": "kind:global_window"
            }
          ],
          "is_wrapper": true
        },
        "format": "bigquery",
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s5"
        },
        "schema": "{\"fields\": [{\"name\": \"month\", \"type\": \"INTEGER\", 
\"mode\": \"NULLABLE\"}, {\"name\": \"tornado_count\", \"type\": \"INTEGER\", 
\"mode\": \"NULLABLE\"}]}",
        "table": "monthly_tornadoes_1580176184209",
        "user_name": "Write/WriteToBigQuery/NativeWrite",
        "write_disposition": "WRITE_TRUNCATE"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: '2020-01-28T01:50:04.555258Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2020-01-27_17_50_03-12425213445153295309'
 location: 'us-central1'
 name: 'beamapp-jenkins-0128014944-829551'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2020-01-28T01:50:04.555258Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: 
[2020-01-27_17_50_03-12425213445153295309]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow 
monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-27_17_50_03-12425213445153295309?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 
2020-01-27_17_50_03-12425213445153295309 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-28T01:50:03.417Z: 
JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 
2020-01-27_17_50_03-12425213445153295309.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-28T01:50:03.417Z: 
JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 
2020-01-27_17_50_03-12425213445153295309. The number of workers will be between 
1 and 1000.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-28T01:50:06.883Z: 
JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service 
Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-28T01:50:07.848Z: 
JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-f.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-28T01:50:08.534Z: 
JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-28T01:50:08.611Z: 
JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-28T01:50:08.639Z: 
JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into 
MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-28T01:50:08.712Z: 
JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-28T01:50:08.854Z: 
JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-28T01:50:08.902Z: 
JOB_MESSAGE_DETAILED: Fusing consumer months with tornadoes into read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-28T01:50:08.937Z: 
JOB_MESSAGE_DETAILED: Fusing consumer monthly count/GroupByKey+monthly 
count/Combine/Partial into months with tornadoes
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-28T01:50:08.974Z: 
JOB_MESSAGE_DETAILED: Fusing consumer monthly count/GroupByKey/Reify into 
monthly count/GroupByKey+monthly count/Combine/Partial
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-28T01:50:09.011Z: 
JOB_MESSAGE_DETAILED: Fusing consumer monthly count/GroupByKey/Write into 
monthly count/GroupByKey/Reify
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-28T01:50:09.043Z: 
JOB_MESSAGE_DETAILED: Fusing consumer monthly count/Combine into monthly 
count/GroupByKey/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-28T01:50:09.077Z: 
JOB_MESSAGE_DETAILED: Fusing consumer monthly count/Combine/Extract into 
monthly count/Combine
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-28T01:50:09.116Z: 
JOB_MESSAGE_DETAILED: Fusing consumer format into monthly count/Combine/Extract
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-28T01:50:09.151Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Write/WriteToBigQuery/NativeWrite into 
format
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-28T01:50:09.185Z: 
JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-28T01:50:09.219Z: 
JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-28T01:50:09.255Z: 
JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-28T01:50:09.289Z: 
JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-28T01:50:09.460Z: 
JOB_MESSAGE_DEBUG: Executing wait step start22
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-28T01:50:09.529Z: 
JOB_MESSAGE_BASIC: Executing operation monthly count/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-28T01:50:09.577Z: 
JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-28T01:50:09.609Z: 
JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-28T01:50:09.658Z: 
JOB_MESSAGE_BASIC: Finished operation monthly count/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-28T01:50:09.746Z: 
JOB_MESSAGE_DEBUG: Value "monthly count/GroupByKey/Session" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-28T01:50:09.826Z: 
JOB_MESSAGE_BASIC: Executing operation read+months with tornadoes+monthly 
count/GroupByKey+monthly count/Combine/Partial+monthly 
count/GroupByKey/Reify+monthly count/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-28T01:50:10.460Z: 
JOB_MESSAGE_BASIC: BigQuery export job "dataflow_job_737037546341136705" 
started. You can check its status with the bq tool: "bq show -j 
--project_id=clouddataflow-readonly dataflow_job_737037546341136705".
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-28T01:50:39.166Z: 
JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric 
descriptors and Stackdriver will not create new Dataflow custom metrics for 
this job. Each unique user-defined metric name (independent of the DoFn in 
which it is defined) produces a new metric descriptor. To delete old / unused 
metric descriptors see 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
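
The metric-descriptor warning above points at the Monitoring API's metricDescriptors.list and metricDescriptors.delete methods. A minimal sketch of pruning unused custom descriptors with the google-cloud-monitoring client follows; the project id and filter are illustrative, and the exact client method signatures vary between library versions:

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project_name = 'projects/apache-beam-testing'

    # List custom metric descriptors (the 100-descriptor limit mentioned above
    # applies to these) and decide which ones are no longer needed.
    descriptors = client.list_metric_descriptors(
        name=project_name,
        filter_='metric.type = starts_with("custom.googleapis.com/")')
    for descriptor in descriptors:
        print('candidate for deletion:', descriptor.type)
        # Uncomment to actually delete the descriptor:
        # client.delete_metric_descriptor(name=descriptor.name)
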
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-28T01:50:40.824Z: 
JOB_MESSAGE_DETAILED: BigQuery export job progress: 
"dataflow_job_737037546341136705" observed total of 1 exported files thus far.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-28T01:50:40.846Z: 
JOB_MESSAGE_BASIC: BigQuery export job finished: 
"dataflow_job_737037546341136705"
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-28T01:51:23.580Z: 
JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-f failed to 
bring up any of the desired 1 workers. RESOURCE_NOT_FOUND: The resource 
'projects/apache-beam-testing' was not found
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-28T01:51:23.600Z: 
JOB_MESSAGE_ERROR: Workflow failed.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-28T01:51:23.649Z: 
JOB_MESSAGE_BASIC: Finished operation read+months with tornadoes+monthly 
count/GroupByKey+monthly count/Combine/Partial+monthly 
count/GroupByKey/Reify+monthly count/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-28T01:51:23.772Z: 
JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-28T01:51:23.934Z: 
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-28T01:51:23.960Z: 
JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-28T01:51:39.804Z: 
JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-28T01:51:39.852Z: 
JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 
2020-01-27_17_50_03-12425213445153295309 is in state JOB_STATE_FAILED
apache_beam.io.gcp.tests.utils: INFO: Clean up a BigQuery table with project: 
apache-beam-testing, dataset: BigQueryTornadoesIT, table: 
monthly_tornadoes_1580176184209.
google.auth.transport._http_client: DEBUG: Making request: GET 
http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, 
connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): 
metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 
200 144
google.auth.transport.requests: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/[email protected]/token
 HTTP/1.1" 200 192
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): 
www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "DELETE 
/bigquery/v2/projects/apache-beam-testing/datasets/BigQueryTornadoesIT/tables/monthly_tornadoes_1580176184209
 HTTP/1.1" 404 None
--------------------- >> end captured logging << ---------------------
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-27_17_50_11-13540741752373080760?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1421:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
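
The options deprecation warning above means pipeline options should be constructed explicitly and consulted directly, rather than read back through <pipeline>.options. A minimal sketch, with illustrative option values:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions, DebugOptions

    # Build the options up front and keep a reference to them, instead of
    # reading them back later via pipeline.options.
    options = PipelineOptions(['--runner=DirectRunner'])
    experiments = options.view_as(DebugOptions).experiments or []

    with beam.Pipeline(options=options) as p:
        pass  # construct the pipeline against `p` here
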
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-27_18_04_34-2241011755992706250?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-27_18_13_00-13184268837169143408?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:155:
 FutureWarning: _ReadFromBigQuery is experimental.
  query=self.query, use_standard_sql=True, project=self.project))
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-27_18_21_02-3751883346958888789?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1605:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  temp_location = pcoll.pipeline.options.view_as(
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:753:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
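
The BigQuerySink deprecation above names WriteToBigQuery as the replacement. A minimal sketch of the equivalent write, reusing the table, dataset, schema, and dispositions from the tornadoes job definition earlier in this log (values are illustrative):

    import apache_beam as beam

    # Rough WriteToBigQuery equivalent of the deprecated BigQuerySink usage.
    write = beam.io.WriteToBigQuery(
        table='monthly_tornadoes_1580176184209',
        dataset='BigQueryTornadoesIT',
        project='apache-beam-testing',
        schema='month:INTEGER,tornado_count:INTEGER',
        create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        write_disposition=beam.io.BigQueryDisposition.WRITE_TRUNCATE)
    # Applied in a pipeline as: formatted_rows | 'Write' >> write
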
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-27_17_50_03-144041553244830852?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1421:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-27_18_13_13-10456799077957421227?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:771:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-27_18_22_09-18303113137170568248?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-27_18_30_55-1608742765698256715?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-27_17_50_08-7468664141227543463?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1421:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-27_18_02_09-12848057614506400657?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:259:
 FutureWarning: _ReadFromBigQuery is experimental.
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-27_18_10_49-9487637410300913478?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-27_18_19_06-14681145689246988077?project=apache-beam-testing
  query=self.query, use_standard_sql=True, project=self.project))
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-27_18_27_13-4592602980943844337?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1605:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  temp_location = pcoll.pipeline.options.view_as(
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-27_18_35_10-14412834273696138754?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:753:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:753:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:753:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-27_17_50_03-2227624052600326496?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:757:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-27_18_09_13-8959815574423356073?project=apache-beam-testing
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-27_18_16_59-2141240511328152012?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1421:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1418:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-27_17_50_04-4106557186712690672?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-27_17_59_09-12515492398983815116?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:753:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-27_18_07_05-9477313053636225985?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:753:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:753:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-27_18_14_46-699203422909700987?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:75:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=kms_key))
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-27_18_22_48-11344070093034580322?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-27_18_30_46-2016234636401285626?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-27_17_50_03-12425213445153295309?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:753:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-27_17_52_26-7027393272705815136?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_io_read_pipeline.py>:75:
 FutureWarning: _ReadFromBigQuery is experimental.
  known_args.input_table))
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-27_18_02_06-5115641032308817984?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1605:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  temp_location = pcoll.pipeline.options.view_as(
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-27_18_11_03-465297589900546635?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_io_read_pipeline.py>:75:
 FutureWarning: _ReadFromBigQuery is experimental.
  known_args.input_table))
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-27_18_20_05-1784675228898077506?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1605:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  temp_location = pcoll.pipeline.options.view_as(
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-27_18_29_11-18412636817069888338?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio_test.py>:298:
 FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio_test.py>:309:
 FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio_test.py>:309:
 FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-27_17_50_07-18234784714528180107?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-27_17_58_41-18168954700510232934?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-27_18_06_56-8596872417542853439?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-27_18_15_37-5611060321681125240?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-27_18_25_16-17838064511213084073?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-27_18_33_03-13881766683590646879?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-27_17_50_06-175574107986896900?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-27_17_59_16-18423950559043216001?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-27_18_09_16-17534196917918043213?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1421:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:771:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-27_18_19_44-10014289563699081272?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1421:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-27_18_26_58-17974547656304455364?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:771:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py37.xml
----------------------------------------------------------------------
XML: 
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 51 tests in 3189.264s

FAILED (SKIP=9, errors=2)

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 
'<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/py37/build.gradle>'
 line: 89

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 54m 51s
85 actionable tasks: 64 executed, 21 from cache

Publishing build scan...
https://gradle.com/s/6hsro5wywbeow

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
