See 
<https://builds.apache.org/job/beam_PostCommit_Python2/377/display/redirect?page=changes>

Changes:

[markliu] [BEAM-7993] Run Portable PreCommit tests sequentially

[yifanzou] [BEAM-8117] add notes when generating the gpg key.

[yifanzou] [BEAM-8097] update the release doc

[lostluck] Makes subnetwork configurable

------------------------------------------
[...truncated 1.06 MB...]
root: INFO: 2019-09-03T23:02:53.949Z: JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Close
root: INFO: 2019-09-03T23:02:53.990Z: JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Close
root: INFO: 2019-09-03T23:02:54.067Z: JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles)+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/FlattenPartitions+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)
root: INFO: 2019-09-03T23:03:12.316Z: JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/Read+WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow+WriteWithMultipleDests/BigQueryBatchFileLoads/DropShardNumber+WriteWithMultipleDests/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile+WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write
root: INFO: 2019-09-03T23:03:12.376Z: JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Close
root: INFO: 2019-09-03T23:03:12.441Z: JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Close
root: INFO: 2019-09-03T23:03:12.524Z: JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read+WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow+WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles)+WriteWithMultipleDests/BigQueryBatchFileLoads/FlattenPartitions+WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)
root: INFO: 2019-09-03T23:03:26.911Z: JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles)+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/FlattenPartitions+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)
root: INFO: 2019-09-03T23:03:26.982Z: JOB_MESSAGE_DEBUG: Value 
"WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs).out"
 materialized.
root: INFO: 2019-09-03T23:03:27.018Z: JOB_MESSAGE_DEBUG: Value 
"WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs).TemporaryTables"
 materialized.
root: INFO: 2019-09-03T23:03:27.053Z: JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0)
root: INFO: 2019-09-03T23:03:27.080Z: JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/Flatten
root: INFO: 2019-09-03T23:03:27.103Z: JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0)
root: INFO: 2019-09-03T23:03:27.113Z: JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0)
root: INFO: 2019-09-03T23:03:27.148Z: JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/Flatten
root: INFO: 2019-09-03T23:03:27.157Z: JOB_MESSAGE_DEBUG: Value 
"WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0).output"
 materialized.
root: INFO: 2019-09-03T23:03:27.162Z: JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0)
root: INFO: 2019-09-03T23:03:27.232Z: JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
root: INFO: 2019-09-03T23:03:27.260Z: JOB_MESSAGE_DEBUG: Value 
"WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/Flatten.out" materialized.
root: INFO: 2019-09-03T23:03:27.283Z: JOB_MESSAGE_DEBUG: Value 
"WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0).output"
 materialized.
root: INFO: 2019-09-03T23:03:31.098Z: JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read+WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow+WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles)+WriteWithMultipleDests/BigQueryBatchFileLoads/FlattenPartitions+WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)
root: INFO: 2019-09-03T23:03:31.170Z: JOB_MESSAGE_DEBUG: Value 
"WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs).out"
 materialized.
root: INFO: 2019-09-03T23:03:31.216Z: JOB_MESSAGE_DEBUG: Value 
"WriteWithMultipleDests/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs).TemporaryTables"
 materialized.
root: INFO: 2019-09-03T23:03:31.263Z: JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0)
root: INFO: 2019-09-03T23:03:31.300Z: JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/Flatten
root: INFO: 2019-09-03T23:03:31.328Z: JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0)
root: INFO: 2019-09-03T23:03:31.346Z: JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0)
root: INFO: 2019-09-03T23:03:31.372Z: JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/Flatten
root: INFO: 2019-09-03T23:03:31.413Z: JOB_MESSAGE_DEBUG: Value 
"WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0).output"
 materialized.
root: INFO: 2019-09-03T23:03:31.415Z: JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0)
root: INFO: 2019-09-03T23:03:31.459Z: JOB_MESSAGE_DEBUG: Value 
"WriteWithMultipleDests/BigQueryBatchFileLoads/Flatten.out" materialized.
root: INFO: 2019-09-03T23:03:31.498Z: JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs+WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
root: INFO: 2019-09-03T23:03:31.543Z: JOB_MESSAGE_DEBUG: Value 
"WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0).output"
 materialized.
root: INFO: 2019-09-03T23:03:36.099Z: JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
root: INFO: 2019-09-03T23:03:36.177Z: JOB_MESSAGE_DEBUG: Value 
"WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs).out"
 materialized.
root: INFO: 2019-09-03T23:03:36.263Z: JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0)
root: INFO: 2019-09-03T23:03:36.338Z: JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0)
root: INFO: 2019-09-03T23:03:36.427Z: JOB_MESSAGE_DEBUG: Value 
"WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0).output"
 materialized.
root: INFO: 2019-09-03T23:03:36.518Z: JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
root: INFO: 2019-09-03T23:03:40.946Z: JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs+WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
root: INFO: 2019-09-03T23:03:41.034Z: JOB_MESSAGE_DEBUG: Value 
"WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs).out" 
materialized.
root: INFO: 2019-09-03T23:03:41.123Z: JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0)
root: INFO: 2019-09-03T23:03:41.179Z: JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0)
root: INFO: 2019-09-03T23:03:41.240Z: JOB_MESSAGE_DEBUG: Value 
"WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0).output"
 materialized.
root: INFO: 2019-09-03T23:03:41.326Z: JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
root: INFO: 2019-09-03T23:03:44.308Z: JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
root: INFO: 2019-09-03T23:03:44.397Z: JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
root: INFO: 2019-09-03T23:03:44.498Z: JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
root: INFO: 2019-09-03T23:03:44.613Z: JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/Delete
root: INFO: 2019-09-03T23:03:51.550Z: JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
root: INFO: 2019-09-03T23:03:51.638Z: JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
root: INFO: 2019-09-03T23:03:51.700Z: JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
root: INFO: 2019-09-03T23:03:51.789Z: JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/Delete
root: INFO: 2019-09-03T23:04:02.072Z: JOB_MESSAGE_DETAILED: Checking 
permissions granted to controller Service Account.
root: INFO: 2019-09-03T23:04:03.469Z: JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames+WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/Delete
root: INFO: 2019-09-03T23:04:06.796Z: JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/Delete
root: INFO: 2019-09-03T23:04:06.882Z: JOB_MESSAGE_DEBUG: Executing success step 
success97
root: INFO: 2019-09-03T23:04:07.048Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-09-03T23:04:07.110Z: JOB_MESSAGE_DEBUG: Starting worker pool 
teardown.
root: INFO: 2019-09-03T23:04:07.144Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-09-03T23:07:07.943Z: JOB_MESSAGE_DETAILED: Autoscaling: 
Resized worker pool from 1 to 0.
root: INFO: 2019-09-03T23:07:07.981Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-09-03T23:07:08.018Z: JOB_MESSAGE_DEBUG: Tearing down pending 
resources...
root: INFO: Job 2019-09-03_15_57_54-17015438919014593906 is in state 
JOB_STATE_DONE
root: INFO: Attempting to perform query SELECT name, language FROM 
python_bq_file_loads_15675514632281.output_table1 to BQ
google.auth.transport._http_client: DEBUG: Making request: GET 
http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, 
connect=None, read=None, redirect=None, status=None)
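For context: the DEBUG line above is google-auth converting its integer 
retry count into a urllib3 Retry policy. A minimal equivalent sketch, with 
the values copied from the log line:

    from urllib3.util.retry import Retry

    # "Converted retries value: 3 -> Retry(total=3, ...)": up to 3 retries
    # in total, with no separate per-category (connect/read/...) limits.
    retry_policy = Retry(total=3, connect=None, read=None,
                         redirect=None, status=None)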
google.auth.transport.requests: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): 
metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 
200 144
google.auth.transport.requests: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-comp...@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/844138762903-comp...@developer.gserviceaccount.com/token
 HTTP/1.1" 200 176
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): 
www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "POST 
/bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET 
/bigquery/v2/projects/apache-beam-testing/queries/2f414d7b-1e91-4600-88ab-078f784f44c9?timeoutMs=10000&location=US&maxResults=0
 HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET 
/bigquery/v2/projects/apache-beam-testing/jobs/2f414d7b-1e91-4600-88ab-078f784f44c9?location=US
 HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET 
/bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anon71a1d4c0cb1bbf47a6b0b8b1c3ffd3f6897b114b/data
 HTTP/1.1" 200 None
root: INFO: Result of query is: [(u'beam', u'go'), (u'beam', u'py'), (u'spark', 
u'py'), (u'beam', u'java'), (u'flink', u'java'), (u'flink', u'scala'), 
(u'spark', u'scala'), (u'spark', u'scala')]
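The successful check above is the verification pattern these post-commit 
tests use: issue a SELECT against the output table and compare the returned 
rows with the expected data. A minimal sketch using BigqueryFullResultMatcher 
from apache_beam.io.gcp.tests.bigquery_matcher (the query and rows are taken 
from this log; the surrounding test wiring is illustrative):

    from apache_beam.io.gcp.tests.bigquery_matcher import (
        BigqueryFullResultMatcher)

    # Expected rows copied from the "Result of query is" line above.
    matcher = BigqueryFullResultMatcher(
        project='apache-beam-testing',
        query='SELECT name, language FROM '
              'python_bq_file_loads_15675514632281.output_table1',
        data=[('beam', 'go'), ('beam', 'py'), ('spark', 'py'),
              ('beam', 'java'), ('flink', 'java'), ('flink', 'scala'),
              ('spark', 'scala'), ('spark', 'scala')])
    # The test typically attaches this matcher through its pipeline test
    # options (on_success_matcher); matching runs the query, retrying
    # until the table is readable, then compares the returned rows.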
root: INFO: Attempting to perform query SELECT name, foundation FROM 
python_bq_file_loads_15675514632281.output_table2 to BQ
google.auth.transport._http_client: DEBUG: Making request: GET 
http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, 
connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): 
metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 
200 144
google.auth.transport.requests: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-comp...@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/844138762903-comp...@developer.gserviceaccount.com/token
 HTTP/1.1" 200 176
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): 
www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "POST 
/bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
root: WARNING: Retry with exponential backoff: waiting for 3.89273153133 
seconds before retrying _query_with_retry because we caught exception: 
NotFound: 404 Not found: Table 
apache-beam-testing:python_bq_file_loads_15675514632281.output_table2 was not 
found in location US
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py>", line 206, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/tests/bigquery_matcher.py>", line 102, in _query_with_retry
    rows = query_job.result(timeout=60)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/google/cloud/bigquery/job.py>", line 2877, in result
    super(QueryJob, self).result(timeout=timeout)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/google/cloud/bigquery/job.py>", line 733, in result
    return super(_AsyncJob, self).result(timeout=timeout)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/google/api_core/future/polling.py>", line 127, in result
    raise self._exception
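
What the WARNING above reflects: _query_with_retry (bigquery_matcher.py, 
line 102 in the traceback) is wrapped in Beam's exponential-backoff retry 
decorator, so a 404 on a table that is not yet visible causes a sleep and 
a re-query rather than an immediate failure. A minimal sketch of the 
pattern, assuming apache_beam.utils.retry.with_exponential_backoff; the 
retry count and filter below are illustrative, not the test's actual ones:

    from apache_beam.utils import retry

    def _retry_on_not_found(exn):
        # Illustrative filter: keep retrying while BigQuery still
        # reports the table as missing (the NotFound 404 in the log).
        return 'Not found' in str(exn)

    @retry.with_exponential_backoff(
        num_retries=4, retry_filter=_retry_on_not_found)
    def _query_with_retry(bigquery_client, query):
        """Run the verification query; raising lets the decorator retry."""
        query_job = bigquery_client.query(query)
        return [row.values() for row in query_job.result(timeout=60)]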

root: INFO: Attempting to perform query SELECT name, foundation FROM 
python_bq_file_loads_15675514632281.output_table2 to BQ
google.auth.transport._http_client: DEBUG: Making request: GET 
http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, 
connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): 
metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 
200 144
google.auth.transport.requests: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-comp...@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/844138762903-comp...@developer.gserviceaccount.com/token
 HTTP/1.1" 200 176
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): 
www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "POST 
/bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
root: WARNING: Retry with exponential backoff: waiting for 8.97032038682 
seconds before retrying _query_with_retry because we caught exception: 
NotFound: 404 Not found: Table 
apache-beam-testing:python_bq_file_loads_15675514632281.output_table2 was not 
found in location US
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py>", line 206, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/tests/bigquery_matcher.py>", line 102, in _query_with_retry
    rows = query_job.result(timeout=60)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/google/cloud/bigquery/job.py>", line 2877, in result
    super(QueryJob, self).result(timeout=timeout)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/google/cloud/bigquery/job.py>", line 733, in result
    return super(_AsyncJob, self).result(timeout=timeout)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/google/api_core/future/polling.py>", line 127, in result
    raise self._exception

root: INFO: Attempting to perform query SELECT name, foundation FROM 
python_bq_file_loads_15675514632281.output_table2 to BQ
google.auth.transport._http_client: DEBUG: Making request: GET 
http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, 
connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): 
metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 
200 144
google.auth.transport.requests: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-comp...@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/844138762903-comp...@developer.gserviceaccount.com/token
 HTTP/1.1" 200 176
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): 
www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "POST 
/bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
root: WARNING: Retry with exponential backoff: waiting for 18.8546179903 
seconds before retrying _query_with_retry because we caught exception: 
NotFound: 404 Not found: Table 
apache-beam-testing:python_bq_file_loads_15675514632281.output_table2 was not 
found in location US
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py>", line 206, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/tests/bigquery_matcher.py>", line 102, in _query_with_retry
    rows = query_job.result(timeout=60)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/google/cloud/bigquery/job.py>", line 2877, in result
    super(QueryJob, self).result(timeout=timeout)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/google/cloud/bigquery/job.py>", line 733, in result
    return super(_AsyncJob, self).result(timeout=timeout)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/google/api_core/future/polling.py>", line 127, in result
    raise self._exception

root: INFO: Attempting to perform query SELECT name, foundation FROM 
python_bq_file_loads_15675514632281.output_table2 to BQ
google.auth.transport._http_client: DEBUG: Making request: GET 
http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, 
connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): 
metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 
200 144
google.auth.transport.requests: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-comp...@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/844138762903-comp...@developer.gserviceaccount.com/token
 HTTP/1.1" 200 176
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): 
www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "POST 
/bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
root: WARNING: Retry with exponential backoff: waiting for 22.5874567021 
seconds before retrying _query_with_retry because we caught exception: 
NotFound: 404 Not found: Table 
apache-beam-testing:python_bq_file_loads_15675514632281.output_table2 was not 
found in location US
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py>", line 206, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/tests/bigquery_matcher.py>", line 102, in _query_with_retry
    rows = query_job.result(timeout=60)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/google/cloud/bigquery/job.py>", line 2877, in result
    super(QueryJob, self).result(timeout=timeout)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/google/cloud/bigquery/job.py>", line 733, in result
    return super(_AsyncJob, self).result(timeout=timeout)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/google/api_core/future/polling.py>", line 127, in result
    raise self._exception

root: INFO: Attempting to perform query SELECT name, foundation FROM 
python_bq_file_loads_15675514632281.output_table2 to BQ
google.auth.transport._http_client: DEBUG: Making request: GET 
http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, 
connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): 
metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 
200 144
google.auth.transport.requests: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-comp...@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/844138762903-comp...@developer.gserviceaccount.com/token
 HTTP/1.1" 200 176
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): 
www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "POST 
/bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
root: WARNING: Retry with exponential backoff: waiting for 70.6235069476 
seconds before retrying _query_with_retry because we caught exception: 
NotFound: 404 Not found: Table 
apache-beam-testing:python_bq_file_loads_15675514632281.output_table2 was not 
found in location US
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py>", line 206, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/tests/bigquery_matcher.py>", line 102, in _query_with_retry
    rows = query_job.result(timeout=60)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/google/cloud/bigquery/job.py>", line 2877, in result
    super(QueryJob, self).result(timeout=timeout)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/google/cloud/bigquery/job.py>", line 733, in result
    return super(_AsyncJob, self).result(timeout=timeout)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/google/api_core/future/polling.py>", line 127, in result
    raise self._exception

root: INFO: Attempting to perform query SELECT name, foundation FROM 
python_bq_file_loads_15675514632281.output_table2 to BQ
google.auth.transport._http_client: DEBUG: Making request: GET 
http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, 
connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): 
metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 
200 144
google.auth.transport.requests: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-comp...@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/844138762903-comp...@developer.gserviceaccount.com/token
 HTTP/1.1" 200 176
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): 
www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "POST 
/bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: 
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 4193.388s

FAILED (SKIP=4, errors=1)

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 
'<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>'
 line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 10m 48s
110 actionable tasks: 85 executed, 22 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/s2yqpdff6ygjq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org
