See 
<https://builds.apache.org/job/beam_PostCommit_Python2/2105/display/redirect?page=changes>

Changes:

[robertwb] [BEAM-9577] Rename the Artifact{Staging,Retrieval}Service.

[robertwb] [BEAM-9577] Define the new Artifact{Staging,Retrieval}Service.

[robertwb] [BEAM-9577] Regenerate protos.

[robertwb] [BEAM-9577] Implement the new Artifact{Staging,Retrieval}Services in


------------------------------------------
[...truncated 9.49 MB...]
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:33:23.031Z: 
JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric 
descriptors and Stackdriver will not create new Dataflow custom metrics for 
this job. Each unique user-defined metric name (independent of the DoFn in 
which it is defined) produces a new metric descriptor. To delete old / unused 
metric descriptors see 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
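
For reference, clearing those stale descriptors can be scripted against the
same Monitoring API the two links point to. A hedged sketch, not something
this job runs: it assumes google-cloud-monitoring >= 2.0, and the helper name
and filter string are assumptions about how Dataflow custom metrics are typed.

    from google.cloud import monitoring_v3

    # Hypothetical cleanup helper: list and (optionally) delete old
    # custom Dataflow metric descriptors in a project.
    def delete_stale_dataflow_metric_descriptors(project_id, dry_run=True):
        client = monitoring_v3.MetricServiceClient()
        descriptors = client.list_metric_descriptors(
            request={
                "name": "projects/" + project_id,
                # Assumed filter; adjust to match your descriptor types.
                "filter": 'metric.type = starts_with("custom.googleapis.com/dataflow")',
            })
        for descriptor in descriptors:
            print("would delete" if dry_run else "deleting", descriptor.type)
            if not dry_run:
                client.delete_metric_descriptor(request={"name": descriptor.name})
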
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:33:23.126Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on 
the rate of progress in the currently running stage(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:33:46.208Z: 
JOB_MESSAGE_BASIC: BigQuery query completed, job : 
"dataflow_job_12469200986644758768"
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:33:46.933Z: 
JOB_MESSAGE_BASIC: BigQuery export job "dataflow_job_7288149929909672885" 
started. You can check its status with the bq tool: "bq show -j 
--project_id=apache-beam-testing dataflow_job_7288149929909672885".
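
The same status check can be done programmatically; a minimal sketch with
google-cloud-bigquery, using the job ID reported above (location="US" is an
assumption based on the query URLs later in this log):

    from google.cloud import bigquery

    # Rough programmatic equivalent of `bq show -j`.
    client = bigquery.Client(project="apache-beam-testing")
    job = client.get_job("dataflow_job_7288149929909672885", location="US")
    print(job.job_type, job.state)  # e.g. "extract", "DONE"
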
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:33:59.881Z: 
JOB_MESSAGE_BASIC: Finished operation 
assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:34:03.180Z: 
JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service 
Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:34:06.369Z: 
JOB_MESSAGE_BASIC: Finished operation Broken 
record/Read+WriteWithMultipleDests/_StreamToBigQuery/AppendDestination/AppendDestination+WriteWithMultipleDests/_StreamToBigQuery/AddInsertIds+WriteWithMultipleDests/_StreamToBigQuery/CommitInsertIds/AddRandomKeys+WriteWithMultipleDests/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)+WriteWithMultipleDests/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Reify+WriteWithMultipleDests/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:34:09.601Z: 
JOB_MESSAGE_BASIC: Finished operation 
Create/Read+WriteWithMultipleDests/_StreamToBigQuery/AppendDestination/AppendDestination+WriteWithMultipleDests/_StreamToBigQuery/AddInsertIds+WriteWithMultipleDests/_StreamToBigQuery/CommitInsertIds/AddRandomKeys+WriteWithMultipleDests/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)+WriteWithMultipleDests/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Reify+WriteWithMultipleDests/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:34:09.672Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDests/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:34:09.728Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDests/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:34:09.810Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDests/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read+WriteWithMultipleDests/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/GroupByWindow+WriteWithMultipleDests/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)+WriteWithMultipleDests/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys+WriteWithMultipleDests/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)/ParDo(BigQueryWriteFn)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:34:17.235Z: 
JOB_MESSAGE_DETAILED: BigQuery export job progress: 
"dataflow_job_7288149929909672885" observed total of 1 exported files thus far.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:34:17.258Z: 
JOB_MESSAGE_BASIC: BigQuery export job finished: 
"dataflow_job_7288149929909672885"
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:34:18.927Z: 
JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:34:18.959Z: 
JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:34:22.601Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDests/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read+WriteWithMultipleDests/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/GroupByWindow+WriteWithMultipleDests/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)+WriteWithMultipleDests/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys+WriteWithMultipleDests/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)/ParDo(BigQueryWriteFn)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:34:22.688Z: 
JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:34:22.744Z: 
JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:34:22.810Z: 
JOB_MESSAGE_BASIC: Executing operation 
assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:34:32.412Z: 
JOB_MESSAGE_BASIC: Finished operation 
assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:34:32.494Z: 
JOB_MESSAGE_DEBUG: Executing success step success45
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:34:32.629Z: 
JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:34:32.687Z: 
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:34:32.722Z: 
JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:34:37.525Z: 
JOB_MESSAGE_BASIC: BigQuery query completed, job : 
"dataflow_job_4083025060748391689"
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:34:37.852Z: 
JOB_MESSAGE_BASIC: BigQuery export job "dataflow_job_6415118601923758829" 
started. You can check its status with the bq tool: "bq show -j 
--project_id=apache-beam-testing dataflow_job_6415118601923758829".
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:34:41.032Z: 
JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:34:41.065Z: 
JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:34:48.472Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:34:48.518Z: 
JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:34:48.562Z: 
JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 
2020-03-31_16_26_15-8984662444257171101 is in state JOB_STATE_DONE
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query 
SELECT name, value, timestamp FROM 
python_bq_file_loads_15856971583375.output_table WHERE value<0 to BQ
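
That verification boils down to issuing the SQL and inspecting the returned
rows. A hedged standalone sketch with google-cloud-bigquery (the matcher
itself goes through Beam's own BigQuery wrapper; the table name is taken from
the log line above):

    from google.cloud import bigquery

    client = bigquery.Client(project="apache-beam-testing")
    rows = list(client.query(
        "SELECT name, value, timestamp "
        "FROM python_bq_file_loads_15856971583375.output_table "
        "WHERE value < 0").result())
    print(rows)  # the log below reports one matching row: u'Negative infinity'
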
DEBUG:google.auth.transport._http_client:Making request: GET 
http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET 
http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, 
connect=None, read=None, redirect=None, status=None)
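
That line is urllib3 normalizing a plain integer retry count into a Retry
policy (via Retry.from_int); the equivalent object, for reference:

    from urllib3.util.retry import Retry

    retry = Retry(total=3)  # connect/read/redirect/status are left as None
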
DEBUG:google.auth.transport.requests:Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): 
metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 
200 144
DEBUG:google.auth.transport.requests:Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/[email protected]/token
 HTTP/1.1" 200 192
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): 
bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST 
/bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET 
/bigquery/v2/projects/apache-beam-testing/queries/22d8dc0d-a84f-4438-9b69-3bae8fd06481?location=US&maxResults=0
 HTTP/1.1" 200 None
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:34:55.241Z: 
JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:34:55.275Z: 
JOB_MESSAGE_DETAILED: Workers have started successfully.
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET 
/bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anon56cc3cd72efc98e040f049a191d1785d62917258/data
 HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Result of query is: [(u'Negative 
infinity', -inf, datetime.datetime(1970, 1, 1, 0, 0, tzinfo=<UTC>))]
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query 
SELECT name, timestamp FROM python_bq_file_loads_15856971583375.output_table to 
BQ
DEBUG:google.auth.transport._http_client:Making request: GET 
http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET 
http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, 
connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): 
metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 
200 144
DEBUG:google.auth.transport.requests:Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/[email protected]/token
 HTTP/1.1" 200 192
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): 
bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST 
/bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET 
/bigquery/v2/projects/apache-beam-testing/queries/ef7862a4-8c0b-410c-9919-cf2871b9b14b?location=US&maxResults=0
 HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET 
/bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anon39ab3ffb741f566d59685fb1974a7204ef91ac62/data
 HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Result of query is: [(u'Negative 
infinity', datetime.datetime(1970, 1, 1, 0, 0, tzinfo=<UTC>)), (u'Not a 
number', datetime.datetime(2930, 12, 9, 0, 0, tzinfo=<UTC>))]
INFO:apache_beam.io.gcp.bigquery_test:Deleting dataset 
python_bq_file_loads_15856971583375 in project apache-beam-testing
test_avro_file_load 
(apache_beam.io.gcp.bigquery_test.BigQueryFileLoadsIntegrationTests) ... ok
WARNING:apache_beam.options.pipeline_options:--region not set; will default to 
us-central1. Future releases of Beam will require the user to set --region 
explicitly, or else have a default set via the gcloud tool. 
https://cloud.google.com/compute/docs/regions-zones
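
The explicit setting this warning asks for is a one-line pipeline option; a
minimal sketch with a placeholder project ID:

    from apache_beam.options.pipeline_options import (
        GoogleCloudOptions, PipelineOptions)

    # Pass --region explicitly instead of relying on the us-central1 default.
    options = PipelineOptions(["--project=my-project", "--region=us-central1"])
    # Equivalently, set it on the typed options view:
    options.view_as(GoogleCloudOptions).region = "us-central1"
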
DEBUG:google.auth.transport._http_client:Making request: GET 
http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET 
http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:google.auth.transport.requests:Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): 
metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 
200 144
DEBUG:google.auth.transport.requests:Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/[email protected]/token
 HTTP/1.1" 200 192
DEBUG:google.auth.transport._http_client:Making request: GET 
http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET 
http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:google.auth.transport.requests:Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): 
metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 
200 144
DEBUG:google.auth.transport.requests:Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/[email protected]/token
 HTTP/1.1" 200 192
DEBUG:google.auth.transport._http_client:Making request: GET 
http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET 
http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, 
connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): 
metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 
200 144
DEBUG:google.auth.transport.requests:Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/[email protected]/token
 HTTP/1.1" 200 192
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): 
bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST 
/bigquery/v2/projects/apache-beam-testing/datasets HTTP/1.1" 200 None
DEBUG:google.auth.transport._http_client:Making request: GET 
http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET 
http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, 
connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): 
metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 
200 144
DEBUG:google.auth.transport.requests:Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/[email protected]/token
 HTTP/1.1" 200 192
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): 
bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "DELETE 
/bigquery/v2/projects/apache-beam-testing/datasets/python_pubsub_bq_15856977002852?deleteContents=true
 HTTP/1.1" 204 0
WARNING:apache_beam.options.pipeline_options:--region not set; will default to 
us-central1. Future releases of Beam will require the user to set --region 
explicitly, or else have a default set via the gcloud tool. 
https://cloud.google.com/compute/docs/regions-zones
DEBUG:google.auth.transport._http_client:Making request: GET 
http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET 
http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:google.auth.transport.requests:Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): 
metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 
200 144
DEBUG:google.auth.transport.requests:Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/[email protected]/token
 HTTP/1.1" 200 192
DEBUG:google.auth.transport._http_client:Making request: GET 
http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET 
http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:google.auth.transport.requests:Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): 
metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 
200 144
DEBUG:google.auth.transport.requests:Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/[email protected]/token
 HTTP/1.1" 200 192
DEBUG:google.auth.transport._http_client:Making request: GET 
http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET 
http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, 
connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): 
metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 
200 144
DEBUG:google.auth.transport.requests:Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/[email protected]/token
 HTTP/1.1" 200 192
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): 
bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST 
/bigquery/v2/projects/apache-beam-testing/datasets HTTP/1.1" 200 None
DEBUG:google.cloud.pubsub_v1.publisher._batch.thread:Monitor is waking up
DEBUG:google.cloud.pubsub_v1.publisher._batch.thread:gRPC Publish took 
0.171725988388 seconds.
WARNING:apache_beam.options.pipeline_options:--region not set; will default to 
us-central1. Future releases of Beam will require the user to set --region 
explicitly, or else have a default set via the gcloud tool. 
https://cloud.google.com/compute/docs/regions-zones
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0331233505-267188.1585697705.267332/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0331233505-267188.1585697705.267332/pipeline.pb
 in 0 seconds.
INFO:apache_beam.runners.portability.stager:Executing command: 
['/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python2/src/build/gradleenv/-194514014/bin/python',
 '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 
'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']
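
That pip invocation is the stager populating its requirements cache at
/tmp/dataflow-requirements-cache; it is driven by the --requirements_file
pipeline option, as in this hedged sketch:

    from apache_beam.options.pipeline_options import (
        PipelineOptions, SetupOptions)

    # With this option set, the stager runs `pip download` for the listed
    # requirements and stages the downloaded tarballs alongside the job.
    options = PipelineOptions(["--requirements_file=postcommit_requirements.txt"])
    print(options.view_as(SetupOptions).requirements_file)
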
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:35:06.042Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDests2/BigQueryBatchFileLoads/ImpulseSingleElementPC/Read+WriteWithMultipleDests2/BigQueryBatchFileLoads/Map(<lambda
 at 
bigquery_file_loads.py:870>)+WriteWithMultipleDests2/BigQueryBatchFileLoads/GenerateFilePrefix
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:35:06.114Z: 
JOB_MESSAGE_DEBUG: Value 
"WriteWithMultipleDests2/BigQueryBatchFileLoads/ImpulseSingleElementPC/Read.out"
 materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:35:06.150Z: 
JOB_MESSAGE_DEBUG: Value 
"WriteWithMultipleDests2/BigQueryBatchFileLoads/Map(<lambda at 
bigquery_file_loads.py:870>).out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:35:06.192Z: 
JOB_MESSAGE_DEBUG: Value 
"WriteWithMultipleDests2/BigQueryBatchFileLoads/GenerateFilePrefix.out" 
materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:35:06.226Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDests2/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(Map(<lambda
 at bigquery_file_loads.py:870>).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:35:06.266Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDests2/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Map(<lambda
 at bigquery_file_loads.py:870>).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:35:06.271Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDests2/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(Map(<lambda
 at bigquery_file_loads.py:870>).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:35:06.299Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(Map(<lambda
 at bigquery_file_loads.py:870>).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:35:06.319Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDests2/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Map(<lambda
 at bigquery_file_loads.py:870>).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:35:06.327Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:35:06.359Z: 
JOB_MESSAGE_BASIC: Executing operation 
WriteWithMultipleDests2/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(GenerateFilePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:35:06.363Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(Map(<lambda
 at bigquery_file_loads.py:870>).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:35:06.372Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:35:06.383Z: 
JOB_MESSAGE_DEBUG: Value 
"WriteWithMultipleDests2/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(Map(<lambda
 at bigquery_file_loads.py:870>).out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:35:06.409Z: 
JOB_MESSAGE_BASIC: Finished operation 
WriteWithMultipleDests2/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(GenerateFilePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:35:06.415Z: 
JOB_MESSAGE_DEBUG: Value 
"WriteWithMultipleDests2/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Map(<lambda
 at bigquery_file_loads.py:870>).out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:35:06.451Z: 
JOB_MESSAGE_DEBUG: Value 
"WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(Map(<lambda
 at bigquery_file_loads.py:870>).out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:35:06.486Z: 
JOB_MESSAGE_DEBUG: Value 
"WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0).output"
 materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:35:06.522Z: 
JOB_MESSAGE_DEBUG: Value 
"WriteWithMultipleDests2/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(GenerateFilePrefix.out.0).output"
 materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:35:06.560Z: 
JOB_MESSAGE_BASIC: Executing operation 
Create/Read+WriteWithMultipleDests2/BigQueryBatchFileLoads/RewindowIntoGlobal+WriteWithMultipleDests/_StreamToBigQuery/AppendDestination+WriteWithMultipleDests2/BigQueryBatchFileLoads/AppendDestination+WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+WriteWithMultipleDests2/BigQueryBatchFileLoads/IdentityWorkaround+WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+WriteWithMultipleDests2/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupShardedRows/Reify+WriteWithMultipleDests2/BigQueryBatchFileLoads/GroupShardedRows/Write+WriteWithMultipleDests/_StreamToBigQuery/AddInsertIds+WriteWithMultipleDests/_StreamToBigQuery/CommitInsertIds/AddRandomKeys+WriteWithMultipleDests/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)+WriteWithMultipleDests/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Reify+WriteWithMultipleDests/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write
INFO:apache_beam.runners.portability.stager:Copying Beam SDK 
"/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python2/src/sdks/python/build/apache-beam.tar.gz"
 to staging location.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0331233505-267188.1585697705.267332/requirements.txt...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0331233505-267188.1585697705.267332/requirements.txt
 in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0331233505-267188.1585697705.267332/six-1.14.0.tar.gz...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0331233505-267188.1585697705.267332/six-1.14.0.tar.gz
 in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0331233505-267188.1585697705.267332/parameterized-0.7.1.tar.gz...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0331233505-267188.1585697705.267332/parameterized-0.7.1.tar.gz
 in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0331233505-267188.1585697705.267332/mock-2.0.0.tar.gz...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0331233505-267188.1585697705.267332/mock-2.0.0.tar.gz
 in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0331233505-267188.1585697705.267332/pbr-5.4.4.tar.gz...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0331233505-267188.1585697705.267332/pbr-5.4.4.tar.gz
 in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0331233505-267188.1585697705.267332/funcsigs-1.0.2.tar.gz...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0331233505-267188.1585697705.267332/funcsigs-1.0.2.tar.gz
 in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0331233505-267188.1585697705.267332/PyHamcrest-1.10.1.tar.gz...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0331233505-267188.1585697705.267332/PyHamcrest-1.10.1.tar.gz
 in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0331233505-267188.1585697705.267332/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0331233505-267188.1585697705.267332/dataflow_python_sdk.tar
 in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0331233505-267188.1585697705.267332/dataflow-worker.jar...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:35:08.161Z: 
JOB_MESSAGE_DETAILED: BigQuery export job progress: 
"dataflow_job_6415118601923758829" observed total of 1 exported files thus far.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:35:08.190Z: 
JOB_MESSAGE_BASIC: BigQuery export job finished: 
"dataflow_job_6415118601923758829"
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:35:12.209Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:35:12.262Z: 
JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-31T23:35:12.306Z: 
JOB_MESSAGE_DEBUG: Tearing down pending resources...
FATAL: command execution failed
java.io.IOException: Backing channel 'JNLP4-connect connection from 
250.128.224.35.bc.googleusercontent.com/35.224.128.250:48520' is disconnected.
        at 
hudson.remoting.RemoteInvocationHandler.channelOrFail(RemoteInvocationHandler.java:214)
        at 
hudson.remoting.RemoteInvocationHandler.invoke(RemoteInvocationHandler.java:283)
        at com.sun.proxy.$Proxy135.isAlive(Unknown Source)
        at hudson.Launcher$RemoteLauncher$ProcImpl.isAlive(Launcher.java:1150)
        at hudson.Launcher$RemoteLauncher$ProcImpl.join(Launcher.java:1142)
        at hudson.Launcher$ProcStarter.join(Launcher.java:470)
        at hudson.plugins.gradle.Gradle.perform(Gradle.java:317)
        at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
        at 
hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:741)
        at hudson.model.Build$BuildExecution.build(Build.java:206)
        at hudson.model.Build$BuildExecution.doRun(Build.java:163)
        at 
hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:504)
        at hudson.model.Run.execute(Run.java:1815)
        at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
        at hudson.model.ResourceController.execute(ResourceController.java:97)
        at hudson.model.Executor.run(Executor.java:429)
Caused by: java.nio.channels.ClosedChannelException
        at 
org.jenkinsci.remoting.protocol.impl.ChannelApplicationLayer.onReadClosed(ChannelApplicationLayer.java:209)
        at 
org.jenkinsci.remoting.protocol.ApplicationLayer.onRecvClosed(ApplicationLayer.java:222)
        at 
org.jenkinsci.remoting.protocol.ProtocolStack$Ptr.onRecvClosed(ProtocolStack.java:816)
        at 
org.jenkinsci.remoting.protocol.FilterLayer.onRecvClosed(FilterLayer.java:287)
        at 
org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.onRecvClosed(SSLEngineFilterLayer.java:181)
        at 
org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.switchToNoSecure(SSLEngineFilterLayer.java:283)
        at 
org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.processWrite(SSLEngineFilterLayer.java:503)
        at 
org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.processQueuedWrites(SSLEngineFilterLayer.java:248)
        at 
org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.doSend(SSLEngineFilterLayer.java:200)
        at 
org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.doCloseSend(SSLEngineFilterLayer.java:213)
        at 
org.jenkinsci.remoting.protocol.ProtocolStack$Ptr.doCloseSend(ProtocolStack.java:784)
        at 
org.jenkinsci.remoting.protocol.ApplicationLayer.doCloseWrite(ApplicationLayer.java:173)
        at 
org.jenkinsci.remoting.protocol.impl.ChannelApplicationLayer$ByteBufferCommandTransport.closeWrite(ChannelApplicationLayer.java:314)
        at hudson.remoting.Channel.close(Channel.java:1452)
        at hudson.remoting.Channel.close(Channel.java:1405)
        at hudson.slaves.SlaveComputer.closeChannel(SlaveComputer.java:847)
        at hudson.slaves.SlaveComputer.kill(SlaveComputer.java:814)
        at hudson.model.AbstractCIBase.killComputer(AbstractCIBase.java:89)
        at jenkins.model.Jenkins.access$2100(Jenkins.java:312)
        at jenkins.model.Jenkins$19.run(Jenkins.java:3464)
        at hudson.model.Queue._withLock(Queue.java:1379)
        at hudson.model.Queue.withLock(Queue.java:1256)
        at jenkins.model.Jenkins._cleanUpDisconnectComputers(Jenkins.java:3458)
        at jenkins.model.Jenkins.cleanUp(Jenkins.java:3336)
        at hudson.WebAppMain.contextDestroyed(WebAppMain.java:379)
        at 
org.apache.catalina.core.StandardContext.listenerStop(StandardContext.java:4732)
        at 
org.apache.catalina.core.StandardContext.stopInternal(StandardContext.java:5396)
        at org.apache.catalina.util.LifecycleBase.stop(LifecycleBase.java:257)
        at 
org.apache.catalina.core.ContainerBase$StopChild.call(ContainerBase.java:1400)
        at 
org.apache.catalina.core.ContainerBase$StopChild.call(ContainerBase.java:1389)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at 
org.apache.tomcat.util.threads.InlineExecutorService.execute(InlineExecutorService.java:75)
        at 
java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:134)
        at 
org.apache.catalina.core.ContainerBase.stopInternal(ContainerBase.java:976)
        at org.apache.catalina.util.LifecycleBase.stop(LifecycleBase.java:257)
        at 
org.apache.catalina.core.ContainerBase$StopChild.call(ContainerBase.java:1400)
        at 
org.apache.catalina.core.ContainerBase$StopChild.call(ContainerBase.java:1389)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at 
org.apache.tomcat.util.threads.InlineExecutorService.execute(InlineExecutorService.java:75)
        at 
java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:134)
        at 
org.apache.catalina.core.ContainerBase.stopInternal(ContainerBase.java:976)
        at org.apache.catalina.util.LifecycleBase.stop(LifecycleBase.java:257)
        at 
org.apache.catalina.core.StandardService.stopInternal(StandardService.java:473)
        at org.apache.catalina.util.LifecycleBase.stop(LifecycleBase.java:257)
        at 
org.apache.catalina.core.StandardServer.stopInternal(StandardServer.java:994)
        at org.apache.catalina.util.LifecycleBase.stop(LifecycleBase.java:257)
        at org.apache.catalina.startup.Catalina.stop(Catalina.java:706)
        at org.apache.catalina.startup.Catalina.start(Catalina.java:668)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:344)
        at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:475)
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
ERROR: apache-beam-jenkins-6 is offline; cannot locate JDK 1.8 (latest)
