See 
<https://builds.apache.org/job/beam_PostCommit_Python35/1663/display/redirect?page=changes>

Changes:

[robinyqiu] Support all ZetaSQL TIMESTAMP functions


------------------------------------------
[...truncated 2.74 MB...]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow 
monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_19_51-16378077519596805406?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 
2020-02-07_09_19_51-16378077519596805406 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:51.021Z: 
JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 
2020-02-07_09_19_51-16378077519596805406. The number of workers will be between 
1 and 1000.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:51.021Z: 
JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 
2020-02-07_09_19_51-16378077519596805406.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:53.999Z: 
JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service 
Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:54.966Z: 
JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-f.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:56.588Z: 
JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:56.626Z: 
JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Assert 
Checksums/Group/GroupByKey: GroupByKey not followed by a combiner.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:56.659Z: 
JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Matched 
Files/Group/GroupByKey: GroupByKey not followed by a combiner.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:56.691Z: 
JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:56.724Z: 
JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into 
MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:56.820Z: 
JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:56.866Z: 
JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:56.899Z: 
JOB_MESSAGE_DETAILED: Unzipping flatten s9 for input s7.out
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:56.941Z: 
JOB_MESSAGE_DETAILED: Fusing unzipped copy of Matched 
Files/Group/GroupByKey/Reify, through flatten Matched Files/Group/Flatten, into 
producer Matched Files/Group/pair_with_0
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:56.976Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Matched Files/Group/GroupByKey/Reify into 
Matched Files/Group/pair_with_1
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:57.015Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Matched 
Files/Group/GroupByKey/GroupByWindow into Matched Files/Group/GroupByKey/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:57.048Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Matched 
Files/Group/Map(_merge_tagged_vals_under_key) into Matched 
Files/Group/GroupByKey/GroupByWindow
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:57.086Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Matched Files/Unkey into Matched 
Files/Group/Map(_merge_tagged_vals_under_key)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:57.115Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Matched Files/Match into Matched 
Files/Unkey
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:57.144Z: 
JOB_MESSAGE_DETAILED: Unzipping flatten s9-u22 for input s10-reify-value0-c20
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:57.176Z: 
JOB_MESSAGE_DETAILED: Fusing unzipped copy of Matched 
Files/Group/GroupByKey/Write, through flatten Matched 
Files/Group/Flatten/Unzipped-1, into producer Matched 
Files/Group/GroupByKey/Reify
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:57.216Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Matched Files/Group/GroupByKey/Write into 
Matched Files/Group/GroupByKey/Reify
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:57.255Z: 
JOB_MESSAGE_DETAILED: Fusing consumer MatchAll/ParDo(_MatchAllFn) into 
Create/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:57.290Z: 
JOB_MESSAGE_DETAILED: Fusing consumer GetPath into MatchAll/ParDo(_MatchAllFn)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:57.313Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Matched Files/WindowInto(WindowIntoFn) 
into GetPath
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:57.347Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Matched Files/ToVoidKey into Matched 
Files/WindowInto(WindowIntoFn)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:57.371Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Matched Files/Group/pair_with_1 into 
Matched Files/ToVoidKey
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:57.404Z: 
JOB_MESSAGE_DETAILED: Unzipping flatten s24 for input s22.out
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:57.438Z: 
JOB_MESSAGE_DETAILED: Fusing unzipped copy of Assert 
Checksums/Group/GroupByKey/Reify, through flatten Assert 
Checksums/Group/Flatten, into producer Assert Checksums/Group/pair_with_0
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:57.464Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Assert Checksums/Group/GroupByKey/Reify 
into Assert Checksums/Group/pair_with_1
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:57.497Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Assert 
Checksums/Group/GroupByKey/GroupByWindow into Assert 
Checksums/Group/GroupByKey/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:57.537Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Assert 
Checksums/Group/Map(_merge_tagged_vals_under_key) into Assert 
Checksums/Group/GroupByKey/GroupByWindow
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:57.569Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Assert Checksums/Unkey into Assert 
Checksums/Group/Map(_merge_tagged_vals_under_key)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:57.602Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Assert Checksums/Match into Assert 
Checksums/Unkey
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:57.639Z: 
JOB_MESSAGE_DETAILED: Unzipping flatten s24-u29 for input s25-reify-value9-c27
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:57.677Z: 
JOB_MESSAGE_DETAILED: Fusing unzipped copy of Assert 
Checksums/Group/GroupByKey/Write, through flatten Assert 
Checksums/Group/Flatten/Unzipped-1, into producer Assert 
Checksums/Group/GroupByKey/Reify
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:57.713Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Assert Checksums/Group/GroupByKey/Write 
into Assert Checksums/Group/GroupByKey/Reify
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:57.754Z: 
JOB_MESSAGE_DETAILED: Fusing consumer MatchOneAll/ParDo(_MatchAllFn) into 
SingleFile/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:57.789Z: 
JOB_MESSAGE_DETAILED: Fusing consumer ReadMatches/ParDo(_ReadMatchesFn) into 
MatchOneAll/ParDo(_MatchAllFn)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:57.822Z: 
JOB_MESSAGE_DETAILED: Fusing consumer ReadIn into 
ReadMatches/ParDo(_ReadMatchesFn)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:57.856Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Checksums into ReadIn
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:57.893Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Assert Checksums/WindowInto(WindowIntoFn) 
into Checksums
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:57.933Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Assert Checksums/ToVoidKey into Assert 
Checksums/WindowInto(WindowIntoFn)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:57.966Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Assert Checksums/Group/pair_with_1 into 
Assert Checksums/ToVoidKey
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:58Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Matched Files/Group/pair_with_0 into 
Matched Files/Create/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:58.029Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Assert Checksums/Group/pair_with_0 into 
Assert Checksums/Create/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:58.068Z: 
JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:58.101Z: 
JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:58.128Z: 
JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:58.161Z: 
JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:58.345Z: 
JOB_MESSAGE_DEBUG: Executing wait step start40
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:58.409Z: 
JOB_MESSAGE_BASIC: Executing operation Matched Files/Group/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:58.449Z: 
JOB_MESSAGE_BASIC: Executing operation Assert Checksums/Group/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:58.462Z: 
JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:58.505Z: 
JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:58.549Z: 
JOB_MESSAGE_BASIC: Finished operation Matched Files/Group/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:58.562Z: 
JOB_MESSAGE_BASIC: Finished operation Assert Checksums/Group/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:58.615Z: 
JOB_MESSAGE_DEBUG: Value "Matched Files/Group/GroupByKey/Session" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:58.652Z: 
JOB_MESSAGE_DEBUG: Value "Assert Checksums/Group/GroupByKey/Session" 
materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:58.682Z: 
JOB_MESSAGE_BASIC: Executing operation 
Create/Read+MatchAll/ParDo(_MatchAllFn)+GetPath+Matched 
Files/WindowInto(WindowIntoFn)+Matched Files/ToVoidKey+Matched 
Files/Group/pair_with_1+Matched Files/Group/GroupByKey/Reify+Matched 
Files/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:58.703Z: 
JOB_MESSAGE_BASIC: Executing operation Matched Files/Create/Read+Matched 
Files/Group/pair_with_0+Matched Files/Group/GroupByKey/Reify+Matched 
Files/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:58.736Z: 
JOB_MESSAGE_BASIC: Executing operation 
SingleFile/Read+MatchOneAll/ParDo(_MatchAllFn)+ReadMatches/ParDo(_ReadMatchesFn)+ReadIn+Checksums+Assert
 Checksums/WindowInto(WindowIntoFn)+Assert Checksums/ToVoidKey+Assert 
Checksums/Group/pair_with_1+Assert Checksums/Group/GroupByKey/Reify+Assert 
Checksums/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:19:58.767Z: 
JOB_MESSAGE_BASIC: Executing operation Assert Checksums/Create/Read+Assert 
Checksums/Group/pair_with_0+Assert Checksums/Group/GroupByKey/Reify+Assert 
Checksums/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:20:25.404Z: 
JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric 
descriptors and Stackdriver will not create new Dataflow custom metrics for 
this job. Each unique user-defined metric name (independent of the DoFn in 
which it is defined) produces a new metric descriptor. To delete old / unused 
metric descriptors see 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
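
For the warning above about the project already holding 100 Dataflow-created metric descriptors: a minimal sketch, assuming the google-api-python-client library and a placeholder project id, of listing and deleting old custom metric descriptors through the Monitoring v3 methods the message links to. The 'custom.googleapis.com/dataflow' type prefix in the filter is an assumption about how Dataflow names its custom metrics; verify what the list call returns before deleting anything.

  from googleapiclient import discovery

  # Sketch only: walk the metricDescriptors.list pages and delete the
  # Dataflow custom metric descriptors they return. Placeholder project id
  # and an assumed 'custom.googleapis.com/dataflow' type prefix.
  monitoring = discovery.build('monitoring', 'v3')
  parent = 'projects/your-project-id'
  request = monitoring.projects().metricDescriptors().list(
      name=parent,
      filter='metric.type = starts_with("custom.googleapis.com/dataflow")')
  while request is not None:
      response = request.execute()
      for descriptor in response.get('metricDescriptors', []):
          monitoring.projects().metricDescriptors().delete(
              name=descriptor['name']).execute()
      request = monitoring.projects().metricDescriptors().list_next(
          request, response)
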
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:20:29.441Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on 
the rate of progress in the currently running step(s).
oauth2client.transport: INFO: Refreshing due to a 401 (attempt 1/2)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:25:58.315Z: 
JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service 
Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:31:58.316Z: 
JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service 
Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:37:58.315Z: 
JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service 
Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:43:58.315Z: 
JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service 
Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:49:58.315Z: 
JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service 
Account.
oauth2client.transport: INFO: Refreshing due to a 401 (attempt 1/2)
oauth2client.transport: INFO: Refreshing due to a 401 (attempt 2/2)
oauth2client.transport: INFO: Refreshing due to a 401 (attempt 1/2)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T17:55:58.317Z: 
JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service 
Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T18:01:58.316Z: 
JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service 
Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T18:07:58.316Z: 
JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service 
Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T18:13:58.316Z: 
JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service 
Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T18:19:58.316Z: 
JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service 
Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T18:19:58.683Z: 
JOB_MESSAGE_ERROR: Workflow failed. Causes: The Dataflow job appears to be 
stuck because no worker activity has been seen in the last 1h. Please check the 
worker logs in Stackdriver Logging. You can also get help with Cloud Dataflow 
at https://cloud.google.com/dataflow/support.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T18:19:58.800Z: 
JOB_MESSAGE_BASIC: Finished operation Matched Files/Create/Read+Matched 
Files/Group/pair_with_0+Matched Files/Group/GroupByKey/Reify+Matched 
Files/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T18:19:58.800Z: 
JOB_MESSAGE_BASIC: Finished operation 
SingleFile/Read+MatchOneAll/ParDo(_MatchAllFn)+ReadMatches/ParDo(_ReadMatchesFn)+ReadIn+Checksums+Assert
 Checksums/WindowInto(WindowIntoFn)+Assert Checksums/ToVoidKey+Assert 
Checksums/Group/pair_with_1+Assert Checksums/Group/GroupByKey/Reify+Assert 
Checksums/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T18:19:58.800Z: 
JOB_MESSAGE_BASIC: Finished operation 
Create/Read+MatchAll/ParDo(_MatchAllFn)+GetPath+Matched 
Files/WindowInto(WindowIntoFn)+Matched Files/ToVoidKey+Matched 
Files/Group/pair_with_1+Matched Files/Group/GroupByKey/Reify+Matched 
Files/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T18:19:58.800Z: 
JOB_MESSAGE_BASIC: Finished operation Assert Checksums/Create/Read+Assert 
Checksums/Group/pair_with_0+Assert Checksums/Group/GroupByKey/Reify+Assert 
Checksums/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T18:19:58.953Z: 
JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 
2020-02-07_09_19_51-16378077519596805406.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T18:19:59.033Z: 
JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T18:19:59.121Z: 
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T18:19:59.144Z: 
JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T18:23:25.570Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T18:23:25.609Z: 
JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-07T18:23:25.641Z: 
JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 
2020-02-07_09_19_51-16378077519596805406 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_08_49_15-13725854007423462094?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1466:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_09_28-1872646614444742023?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_19_51-16378077519596805406?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1466:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
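
A minimal sketch of the pattern these BeamDeprecationWarnings point away from: keep a reference to the PipelineOptions object used to construct the pipeline instead of reading it back through <pipeline>.options. The experiment flag value below is purely illustrative, not taken from the failing suites.

  import apache_beam as beam
  from apache_beam.options.pipeline_options import DebugOptions, PipelineOptions

  # Hold on to the options object rather than going through p.options,
  # which the warning says is deprecated. The flag value is illustrative.
  options = PipelineOptions(['--experiments=beam_fn_api'])
  experiments = options.view_as(DebugOptions).experiments or []

  with beam.Pipeline(options=options) as p:
      p | 'Create' >> beam.Create([1, 2, 3])
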
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:793:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/fileio_test.py>:303:
 FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/fileio_test.py>:316:
 FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/fileio_test.py>:316:
 FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_08_49_14-6208560405594800238?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:766:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
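
A minimal sketch of the replacement the BigQuerySink deprecation warnings ask for, using WriteToBigQuery; the table spec, schema, and rows are hypothetical placeholders, not taken from the failing tests.

  import apache_beam as beam

  # Hypothetical table and schema, shown only to illustrate the
  # WriteToBigQuery transform the warning recommends over BigQuerySink.
  with beam.Pipeline() as p:
      (p
       | 'CreateRows' >> beam.Create([{'name': 'a', 'value': 1}])
       | 'WriteRows' >> beam.io.WriteToBigQuery(
           'my-project:my_dataset.my_table',
           schema='name:STRING,value:INTEGER',
           create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
           write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))
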
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:823:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_08_53-4472021922518986149?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_17_45-4594306996154873115?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_26_09-10128515845839614538?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1466:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_34_10-8119403580741998079?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:766:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:766:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:766:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_08_49_16-4879894920616635703?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_01_36-17200920209437515305?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_10_38-7779242066727098613?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_19_30-12755053197539120729?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_26_58-15470410346143478912?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1466:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_io_read_pipeline.py>:84:
 FutureWarning: _ReadFromBigQuery is experimental.
  (options.view_as(GoogleCloudOptions).project, known_args.input_table))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1655:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  temp_location = pcoll.pipeline.options.view_as(
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_io_read_pipeline.py>:84:
 FutureWarning: _ReadFromBigQuery is experimental.
  (options.view_as(GoogleCloudOptions).project, known_args.input_table))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1655:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  temp_location = pcoll.pipeline.options.view_as(
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_08_49_14-17269314039435302872?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:162:
 FutureWarning: _ReadFromBigQuery is experimental.
  query=self.query, use_standard_sql=True, project=self.project))
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_08_17-3607227443109649838?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1655:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_16_37-14013837017550737621?project=apache-beam-testing
  temp_location = pcoll.pipeline.options.view_as(
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_24_16-13890275007382824935?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_31_45-3147016241846520198?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_08_49_14-1421989793967264404?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_08_58_39-9564154724466079026?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_08_06-9232558537648497676?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_16_46-4420847516105651507?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1466:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_33_58-8212450790968431783?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:793:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1466:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:793:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1463:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_08_49_13-9030977208042973245?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:766:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_08_56_55-18013249984140787232?project=apache-beam-testing
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:275:
 FutureWarning: _ReadFromBigQuery is experimental.
  query=self.query, use_standard_sql=True, project=self.project))
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_05_28-2273186467578633690?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1655:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  temp_location = pcoll.pipeline.options.view_as(
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_14_24-1324242307317662816?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_22_10-7181582658343638902?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_31_51-309424516277193108?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_08_49_15-15370975410966976742?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_08_58_12-12971590148497985047?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_06_49-6963421792405585282?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_15_20-6405970833360034169?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:87:
 FutureWarning: _ReadFromBigQuery is experimental.
  kms_key=kms_key)
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_23_25-9634685747697184313?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1655:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  temp_location = pcoll.pipeline.options.view_as(
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_31_36-12982831341075839126?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:766:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:87:
 FutureWarning: _ReadFromBigQuery is experimental.
  kms_key=kms_key)
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1655:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  temp_location = pcoll.pipeline.options.view_as(
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:766:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:95:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:87:
 FutureWarning: _ReadFromBigQuery is experimental.
  kms_key=kms_key)
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1655:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  temp_location = pcoll.pipeline.options.view_as(
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:766:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:95:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=kms_key))
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_08_49_14-16560547256660427599?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_08_58_02-7004957761847039177?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_05_48-16667686843872145478?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_14_53-14619929221408219656?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_23_27-17094358371543681444?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_32_46-16124726978295281142?project=apache-beam-testing

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py35.xml
----------------------------------------------------------------------
XML: 
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 52 tests in 5710.246s

FAILED (SKIP=9, errors=1)

> Task :sdks:python:test-suites:dataflow:py35:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 
'<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle>'
 line: 56

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 37m 3s
84 actionable tasks: 63 executed, 21 from cache

Publishing build scan...
https://gradle.com/s/wlvixghcbk2r4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
