See <https://builds.apache.org/job/beam_PostCommit_Python36/1273/display/redirect?page=changes>

Changes:

[lukecwik] [BEAM-9004] Migrate org.mockito.Matchers#anyString to


------------------------------------------
[...truncated 3.67 MB...]
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:39:38.830Z: JOB_MESSAGE_DEBUG: Value "MakeSchemas/Read.out" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:39:38.850Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:39:38.868Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/Session" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:39:38.904Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Session" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:39:38.931Z: JOB_MESSAGE_DEBUG: Value "MakeTables/Read.out" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:39:38.967Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Session" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:39:39.001Z: JOB_MESSAGE_DEBUG: Value "GroupByKey/Session" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:39:39.038Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/Session" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:39:39.068Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Session" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:39:39.101Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Read.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:39:39.127Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(Read.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:39:39.130Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Read.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:39:39.158Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(Read.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:39:39.167Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Read.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:39:39.204Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(Read.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:39:39.207Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Read.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:39:39.241Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Session" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:39:39.244Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(Read.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:39:39.278Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/AppendDestination/_UnpickledSideInput(Read.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:39:39.298Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/AppendDestination/_UnpickledSideInput(Read.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:39:39.314Z: JOB_MESSAGE_BASIC: Executing operation Create/Read+Map(<lambda at bigquery_file_loads_test.py:674>)+GroupByKey/Reify+GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:39:39.343Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Read.out.0).output" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:39:39.380Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(Read.out.0).output" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:39:39.412Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Read.out.0).output" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:39:39.449Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(Read.out.0).output" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:39:39.476Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/AppendDestination/_UnpickledSideInput(Read.out.0).output" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:39:49.611Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
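
For anyone hitting the descriptor quota above, a minimal cleanup sketch with the Cloud Monitoring client, instead of clicking through the APIs Explorer links. The request-dict call shape is an assumption (google-cloud-monitoring 2.x; older releases take positional arguments):

    # Delete stale Dataflow-created custom metric descriptors (sketch).
    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project_name = "projects/apache-beam-testing"

    # Dataflow custom metrics live under the custom.googleapis.com/dataflow prefix.
    descriptors = client.list_metric_descriptors(
        request={
            "name": project_name,
            "filter": 'metric.type = starts_with("custom.googleapis.com/dataflow")',
        })
    for descriptor in descriptors:
        client.delete_metric_descriptor(request={"name": descriptor.name})
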
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:40:09.141Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:41:34.019Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:41:34.051Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:45:38.313Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:45:42.870Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests/BigQueryBatchFileLoads/ImpulseSingleElementPC/Read+WriteWithMultipleDests/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:776>)+WriteWithMultipleDests/BigQueryBatchFileLoads/GenerateFilePrefix
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:45:42.940Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/ImpulseSingleElementPC/Read.out" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:45:42.968Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:776>).out" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:45:43.006Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/GenerateFilePrefix.out" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:45:43.050Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Map(<lambda at bigquery_file_loads.py:776>).out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:45:43.084Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(Map(<lambda at bigquery_file_loads.py:776>).out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:45:43.102Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Map(<lambda at bigquery_file_loads.py:776>).out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:45:43.116Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(Map(<lambda at bigquery_file_loads.py:776>).out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:45:43.135Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(Map(<lambda at bigquery_file_loads.py:776>).out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:45:43.148Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(GenerateFilePrefix.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:45:43.166Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(Map(<lambda at bigquery_file_loads.py:776>).out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:45:43.178Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:45:43.197Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(GenerateFilePrefix.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:45:43.212Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Map(<lambda at bigquery_file_loads.py:776>).out.0).output" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:45:43.234Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:45:43.256Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(Map(<lambda at bigquery_file_loads.py:776>).out.0).output" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:45:43.291Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(Map(<lambda at bigquery_file_loads.py:776>).out.0).output" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:45:43.325Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(GenerateFilePrefix.out.0).output" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:45:43.362Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0).output" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:45:43.400Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/ImpulseEmptyPC/Read+WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/TriggerLoadJobsWithoutTempTables
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:45:50.008Z: JOB_MESSAGE_ERROR: Workflow failed.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:45:50.079Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+Map(<lambda at bigquery_file_loads_test.py:674>)+GroupByKey/Reify+GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:45:50.407Z: JOB_MESSAGE_WARNING: S27:WriteWithMultipleDests/BigQueryBatchFileLoads/ImpulseEmptyPC/Read+WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/TriggerLoadJobsWithoutTempTables failed.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:45:50.430Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests/BigQueryBatchFileLoads/ImpulseEmptyPC/Read+WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/TriggerLoadJobsWithoutTempTables
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:45:50.735Z: JOB_MESSAGE_WARNING: S08:WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ImpulseSingleElementPC/Read+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GenerateFilePrefix+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:776>) failed.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:45:50.764Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ImpulseSingleElementPC/Read+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GenerateFilePrefix+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:776>)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:45:50.871Z: JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:45:51.115Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:45:51.141Z: JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:47:24.780Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:47:24.818Z: JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-20T20:47:24.840Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2019-12-20_12_39_30-13169992645443920923 is in state JOB_STATE_FAILED
apache_beam.io.gcp.bigquery_file_loads_test: INFO: Deleting dataset python_bq_file_loads_1576874354500 in project apache-beam-testing
--------------------- >> end captured logging << ---------------------
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-20_12_31_41-13893935183974522278?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:261: FutureWarning: _ReadFromBigQuery is experimental.
  query=self.query, use_standard_sql=True, project=self.project))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-20_12_40_08-11649948026892601498?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1593: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = pcoll.pipeline.options.view_as(
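
For reference, the pattern these BeamDeprecationWarnings point toward is to build PipelineOptions up front and keep a reference to it, rather than reading <pipeline>.options back off the constructed pipeline. A minimal sketch with placeholder values (bucket and region are assumptions, not the test's settings):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

    options = PipelineOptions()
    gcp = options.view_as(GoogleCloudOptions)
    gcp.temp_location = 'gs://my-bucket/temp'  # hypothetical bucket
    gcp.region = 'us-central1'  # also avoids the --region default warning seen below

    with beam.Pipeline(options=options) as p:
        _ = p | beam.Create([1, 2, 3]) | beam.Map(print)
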
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-20_12_48_22-18074488082656438913?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-20_12_56_39-6769770641884051792?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-20_12_31_39-15291204757023484212?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-20_12_50_44-4483094223205591389?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-20_12_58_35-8636826268808420168?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1409: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:769: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-20_12_31_42-2219442967732537500?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-20_12_39_34-885474676602038457?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-20_12_47_04-16936267749578211335?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
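
The repeated BigQuerySink deprecation warnings above all recommend WriteToBigQuery. An illustrative migration sketch; the table, schema, and rows are placeholders, not the values these tests use:

    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (p
             | 'MakeRows' >> beam.Create([{'name': 'beam', 'score': 1}])
             | 'WriteToBQ' >> beam.io.WriteToBigQuery(
                 table='my-project:my_dataset.my_table',  # hypothetical destination
                 schema='name:STRING,score:INTEGER',
                 create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                 write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))
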
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-20_12_54_47-8796500102199469303?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-20_12_31_42-7520895836878420500?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-20_12_51_09-4461235147012748565?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1406: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-20_12_31_41-9798536976155396690?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-20_12_40_28-11138923680780154037?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:155: FutureWarning: _ReadFromBigQuery is experimental.
  query=self.query, use_standard_sql=True, project=self.project))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-20_12_48_18-3359070756809992645?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-20_12_55_33-10228401276631247573?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1593: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = pcoll.pipeline.options.view_as(
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-20_13_03_10-374109255389539312?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-20_13_11_02-3023582819907601964?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-20_12_31_39-16451430479475070745?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-20_12_39_30-13169992645443920923?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-20_12_48_13-7628664812617588137?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-20_12_57_14-3559457536459849366?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-20_13_05_13-5590221373242815118?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:738: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1409: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:769: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1409: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:769: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-20_12_31_42-12776155992264469078?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-20_12_39_37-5548772597923757601?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-20_12_47_34-11035400513266674121?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-20_12_55_03-2966657742912633950?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-20_13_02_33-6069228264309756400?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-20_12_31_41-11441286448112737854?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-20_12_40_02-11621240841463356357?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:755: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-20_12_48_13-16675707296939348309?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1409: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-20_12_57_11-14342624867603301547?project=apache-beam-testing

======================================================================
ERROR: Runs streaming Dataflow job and verifies that user metrics are reported
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_exercise_streaming_metrics_pipeline_test.py>", line 65, in setUp
    self.pub_client.topic_path(self.project, self.input_topic_name))
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/google/cloud/pubsub_v1/_gapic.py>", line 40, in <lambda>
    fx = lambda self, *a, **kw: wrapped_fx(self.api, *a, **kw)  # noqa
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/google/cloud/pubsub_v1/gapic/publisher_client.py>", line 332, in create_topic
    request, retry=retry, timeout=timeout, metadata=metadata
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/google/api_core/gapic_v1/method.py>", line 143, in __call__
    return wrapped_func(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/google/api_core/retry.py>", line 286, in retry_wrapped_func
    on_error=on_error,
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/google/api_core/retry.py>", line 184, in retry_target
    return target()
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/google/api_core/timeout.py>", line 214, in func_with_timeout
    return func(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/google/api_core/grpc_helpers.py>", line 59, in error_remapped_callable
    six.raise_from(exceptions.from_grpc_error(exc), exc)
  File "<string>", line 3, in raise_from
google.api_core.exceptions.ResourceExhausted: 429 Your project has exceeded a limit: (type="topics-per-project", current=10000, maximum=10000).
-------------------- >> begin captured logging << --------------------
apache_beam.options.pipeline_options: WARNING: --region not set; will default to us-central1. Future releases of Beam will require the user to set --region explicitly, or else have a default set via the gcloud tool. https://cloud.google.com/compute/docs/regions-zones
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/[email protected]/token HTTP/1.1" 200 181
--------------------- >> end captured logging << ---------------------
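
The 429 above means the apache-beam-testing project is at the 10000 topics-per-project Pub/Sub quota, most likely from topics leaked by earlier test runs, since this test creates a topic in setUp. A hypothetical cleanup sketch; the name filter is a guess at what test topics look like, and newer google-cloud-pubsub releases take request={...} instead of the positional arguments shown:

    from google.cloud import pubsub_v1

    publisher = pubsub_v1.PublisherClient()
    project_path = "projects/apache-beam-testing"

    # Only delete topics that look like leaked test artifacts; the substring
    # below is an assumption, not the actual name the test generates.
    for topic in publisher.list_topics(project_path):
        if "exercise_streaming_metrics" in topic.name:
            publisher.delete_topic(topic.name)
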

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py36.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 47 tests in 2820.979s

FAILED (SKIP=6, errors=7)

> Task :sdks:python:test-suites:dataflow:py36:postCommitIT FAILED

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/test-suites/direct/py36/build.gradle>' line: 51

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/test-suites/portable/py36/build.gradle>' line: 62

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py36:postCommitPy36IT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/test-suites/dataflow/py36/build.gradle>' line: 56

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 48m 6s
84 actionable tasks: 63 executed, 21 from cache

Publishing build scan...
https://gradle.com/s/hyj2pykprk2tg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
