See 
<https://builds.apache.org/job/beam_PostCommit_Python2/2414/display/redirect?page=changes>

Changes:

[github] Merge pull request #11567: [BEAM-8132] Report Python metrics to 
InfluxDB

[kamil.wasilewski] Fix InfluxDB measurement names to match those in Grafana 
dashboards

[github] [BEAM-9945] Report data channel progress via a designated counter.

[github] [BEAM-9577] Update Java Runners to handle dependency-based artifact


------------------------------------------
[...truncated 11.25 MB...]
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): 
metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 
200 144
DEBUG:google.auth.transport.requests:Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/[email protected]/token
 HTTP/1.1" 200 192
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): 
bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST 
/bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET 
/bigquery/v2/projects/apache-beam-testing/queries/cbc17d3d-88e5-439c-84cd-e18869713ec0?location=US&maxResults=0
 HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET 
/bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anon8153299bb9e12cb8a16a4f38bf7846dd804cbe6c/data
 HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Result of query is: [('xyw', 
datetime.date(2011, 1, 1), datetime.time(23, 59, 59, 999999)), ('\xab\xac\xad', 
datetime.date(2000, 1, 1), datetime.time(0, 0)), ('abc', datetime.date(2000, 1, 
1), datetime.time(0, 0)), ('\xe4\xbd\xa0\xe5\xa5\xbd', datetime.date(3000, 12, 
31), datetime.time(23, 59, 59))]
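(Aside, not part of the test output: the escaped strings in the result above are Python 2 byte strings holding UTF-8 data; a one-line decode, runnable on Python 2 or 3, recovers the original text.)

    # Illustration only: these bytes are the UTF-8 encoding of u'你好'.
    value = b'\xe4\xbd\xa0\xe5\xa5\xbd'
    print(value.decode('utf-8'))  # prints: 你好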
INFO:apache_beam.io.gcp.bigquery_write_it_test:Deleting dataset 
python_write_to_table_15893097365315 in project apache-beam-testing
test_big_query_write 
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types 
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect 
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... 
SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema 
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-12T19:03:45.583Z: 
JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric 
descriptors and Stackdriver will not create new Dataflow custom metrics for 
this job. Each unique user-defined metric name (independent of the DoFn in 
which it is defined) produces a new metric descriptor. To delete old / unused 
metric descriptors see 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
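(Aside, not part of the build output: a minimal cleanup sketch for the warning above, assuming a recent google-cloud-monitoring client; the project name and metric-type filter are illustrative assumptions, and the delete call is destructive, so inspect the listing before enabling it.)

    # Hypothetical sketch: list custom metric descriptors and delete unused ones.
    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    request = {
        "name": "projects/apache-beam-testing",  # assumed project, for illustration
        # Assumed prefix for Dataflow user-defined metrics; adjust as needed.
        "filter": 'metric.type = starts_with("custom.googleapis.com/dataflow")',
    }
    for descriptor in client.list_metric_descriptors(request=request):
        print("Would delete", descriptor.type)
        # client.delete_metric_descriptor(request={"name": descriptor.name})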
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-12T19:03:56.698Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on 
the rate of progress in the currently running stage(s).
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-12T19:05:33.207Z: 
JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-12T19:05:33.243Z: 
JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-12T19:06:18.564Z: 
JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service 
Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-12T19:06:21.331Z: 
JOB_MESSAGE_BASIC: Finished operation 
assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-12T19:06:24.586Z: 
JOB_MESSAGE_BASIC: Finished operation 
Create/Read+ExternalTransform(simple)/Map(<lambda at 
external_it_test.py:43>)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-12T19:06:24.680Z: 
JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-12T19:06:24.760Z: 
JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-12T19:06:24.855Z: 
JOB_MESSAGE_BASIC: Executing operation 
assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-12T19:06:34.279Z: 
JOB_MESSAGE_BASIC: Finished operation 
assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-12T19:06:34.359Z: 
JOB_MESSAGE_DEBUG: Executing success step success19
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-12T19:06:34.476Z: 
JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-12T19:06:34.535Z: 
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-12T19:06:34.578Z: 
JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-12T19:08:09.797Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-12T19:08:09.855Z: 
JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-12T19:08:09.896Z: 
JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 
2020-05-12_12_00_11-1156092441918660021 is in state JOB_STATE_DONE
test_job_python_from_python_it 
(apache_beam.transforms.external_it_test.ExternalTransformIT) ... ok
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-12T19:09:08.503Z: 
JOB_MESSAGE_BASIC: Finished operation 
Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-12T19:09:08.619Z: 
JOB_MESSAGE_BASIC: Executing operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-12T19:09:08.692Z: 
JOB_MESSAGE_BASIC: Finished operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-12T19:09:08.798Z: 
JOB_MESSAGE_BASIC: Executing operation 
GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-12T19:09:09.492Z: 
JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service 
Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-12T19:09:17.861Z: 
JOB_MESSAGE_BASIC: Finished operation 
GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-12T19:09:17.968Z: 
JOB_MESSAGE_DEBUG: Executing success step success11
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-12T19:09:18.179Z: 
JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-12T19:09:18.290Z: 
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-12T19:09:18.340Z: 
JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-12T19:10:34.571Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-12T19:10:34.621Z: 
JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-12T19:10:34.652Z: 
JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 
2020-05-12_12_03_01-6894206549926008263 is in state JOB_STATE_DONE
test_metrics_fnapi_it 
(apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
 ... ok
test_metrics_it 
(apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
 ... ok

======================================================================
ERROR: test_streaming_data_only 
(apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/pubsub_integration_test.py>", line 211, in test_streaming_data_only
    self._test_streaming(with_attributes=False)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/pubsub_integration_test.py>", line 207, in _test_streaming
    timestamp_attribute=self.TIMESTAMP_ATTRIBUTE)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/pubsub_it_pipeline.py>", line 95, in run_pipeline
    result = p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py>", line 529, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py>", line 57, in run_pipeline
    self).run_pipeline(pipeline, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 581, in run_pipeline
    self.dataflow_client.create_job(self.job), self)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py>", line 236, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 635, in create_job
    self.create_job_description(job)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 691, in create_job_description
    resources = self._stage_resources(job.options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 588, in _stage_resources
    staging_location=google_cloud_options.staging_location)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/portability/stager.py>", line 351, in create_and_stage_job_resources
    staged_resources = self.stage_job_resources(resources, staging_location)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/portability/stager.py>", line 305, in stage_job_resources
    file_path, FileSystems.join(staging_location, staged_path))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 954, in stage_artifact
    local_path_to_artifact, artifact_name)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py>", line 236, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 575, in _gcs_file_copy
    self.stage_file(to_folder, to_name, f, total_size=total_size)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 613, in stage_file
    response = self._storage_client.objects.Insert(request, upload=upload)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/internal/clients/storage/storage_v1_client.py>", line 1156, in Insert
    upload=upload, upload_config=upload_config)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py>", line 715, in _RunMethod
    http_request, client=self.client)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/transfer.py>", line 908, in InitializeUpload
    return self.StreamInChunks()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/transfer.py>", line 1020, in StreamInChunks
    additional_headers=additional_headers)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/transfer.py>", line 971, in __StreamMedia
    self.RefreshResumableUploadState()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/transfer.py>", line 875, in RefreshResumableUploadState
    raise exceptions.HttpError.FromResponse(refresh_response)
HttpError: HttpError accessing 
<https://www.googleapis.com/resumable/upload/storage/v1/b/temp-storage-for-end-to-end-tests/o?uploadType=resumable&alt=json&upload_id=AAANsUmPbMC6sCCwCath1M7st0ZWkG9K3iawRkI6XOea9mv453t9w4q1Q9NP5pvS2YjKadxMijtbB2XekMNPN_nXvumqsSD-7g&name=staging-it%2Fbeamapp-jenkins-0512184136-644910.1589308896.645043%2Fdataflow-worker.jar>:
 response: <{'status': '410', 'content-length': '205', 'expires': 'Mon, 01 Jan 
1990 00:00:00 GMT', 'vary': 'Origin, X-Origin', 'x-guploader-uploadid': 
'AAANsUmPbMC6sCCwCath1M7st0ZWkG9K3iawRkI6XOea9mv453t9w4q1Q9NP5pvS2YjKadxMijtbB2XekMNPN_nXvumqsSD-7g',
 'pragma': 'no-cache', 'cache-control': 'no-cache, no-store, max-age=0, 
must-revalidate', 'date': 'Tue, 12 May 2020 18:42:02 GMT', 'server': 
'UploadServer', 'content-type': 'application/json; charset=UTF-8'}>, content <{
  "error": {
    "code": 503,
    "message": "Backend Error",
    "errors": [
      {
        "message": "Backend Error",
        "domain": "global",
        "reason": "backendError"
      }
    ]
  }
}
>
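(Aside, not part of the build output: the stack above passes twice through apache_beam.utils.retry, whose with_exponential_backoff decorator re-runs the wrapped callable when its retry filter treats the raised exception as transient. A minimal sketch of that pattern follows; upload_artifact and both paths are hypothetical and only illustrate the decorator.)

    # Sketch only: retry an idempotent GCS copy with exponential backoff.
    from apache_beam.utils import retry
    from apache_beam.io.gcp import gcsio  # requires apache-beam[gcp]

    @retry.with_exponential_backoff(num_retries=5)
    def upload_artifact(local_path, gcs_path):
        """Copies a local file to GCS; re-invoked on errors the retry filter accepts."""
        with open(local_path, 'rb') as src, gcsio.GcsIO().open(gcs_path, 'wb') as dst:
            dst.write(src.read())

    upload_artifact(
        '/tmp/dataflow-worker.jar',
        'gs://temp-storage-for-end-to-end-tests/staging-it/dataflow-worker.jar')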
-------------------- >> begin captured logging << --------------------
apache_beam.io.gcp.bigquery_read_it_test: INFO: Deleting dataset 
python_read_table_15893079388915 in project apache-beam-testing
google.auth.transport._http_client: DEBUG: Making request: GET 
http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): 
metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 
200 144
google.auth.transport.requests: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/[email protected]/token
 HTTP/1.1" 200 192
google.auth.transport._http_client: DEBUG: Making request: GET 
http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): 
metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 
200 144
google.auth.transport.requests: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/[email protected]/token
 HTTP/1.1" 200 192
google.cloud.pubsub_v1.publisher._batch.thread: DEBUG: Monitor is waking up
root: WARNING: Make sure that locally built Python SDK docker image has Python 
2.7 interpreter.
root: INFO: Using Python SDK docker image: 
apache/beam_python2.7_sdk:2.22.0.dev. If the image is not available at local, 
we will try to pull from hub.docker.com
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0512184136-644910.1589308896.645043/pipeline.pb...
google.cloud.pubsub_v1.publisher._batch.thread: DEBUG: gRPC Publish took 
0.247656822205 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0512184136-644910.1589308896.645043/pipeline.pb
 in 0 seconds.
apache_beam.runners.portability.stager: INFO: Executing command: 
['<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/bin/python>',
 '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 
'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']
apache_beam.runners.portability.stager: INFO: Copying Beam SDK 
"<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/build/apache-beam.tar.gz";>
 to staging location.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0512184136-644910.1589308896.645043/requirements.txt...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0512184136-644910.1589308896.645043/requirements.txt
 in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0512184136-644910.1589308896.645043/parameterized-0.7.4.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0512184136-644910.1589308896.645043/parameterized-0.7.4.tar.gz
 in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0512184136-644910.1589308896.645043/six-1.14.0.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0512184136-644910.1589308896.645043/six-1.14.0.tar.gz
 in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0512184136-644910.1589308896.645043/parameterized-0.7.3.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0512184136-644910.1589308896.645043/parameterized-0.7.3.tar.gz
 in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0512184136-644910.1589308896.645043/parameterized-0.7.1.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0512184136-644910.1589308896.645043/parameterized-0.7.1.tar.gz
 in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0512184136-644910.1589308896.645043/mock-2.0.0.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0512184136-644910.1589308896.645043/mock-2.0.0.tar.gz
 in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0512184136-644910.1589308896.645043/pbr-5.4.4.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0512184136-644910.1589308896.645043/pbr-5.4.4.tar.gz
 in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0512184136-644910.1589308896.645043/funcsigs-1.0.2.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0512184136-644910.1589308896.645043/funcsigs-1.0.2.tar.gz
 in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0512184136-644910.1589308896.645043/pbr-5.4.5.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0512184136-644910.1589308896.645043/pbr-5.4.5.tar.gz
 in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0512184136-644910.1589308896.645043/PyHamcrest-1.10.1.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0512184136-644910.1589308896.645043/PyHamcrest-1.10.1.tar.gz
 in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0512184136-644910.1589308896.645043/dataflow_python_sdk.tar...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0512184136-644910.1589308896.645043/dataflow_python_sdk.tar
 in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to 
gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0512184136-644910.1589308896.645043/dataflow-worker.jar...
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: 
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 58 tests in 3750.462s

FAILED (SKIP=7, errors=1)
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_11_08_43-6277423345840922781?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_11_18_34-6950195374039764650?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_11_28_41-10464765116453563629?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_11_37_10-3234791834495929618?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_11_54_47-8362780138316819153?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_12_03_01-6894206549926008263?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_11_08_47-3539011327683238036?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_11_23_14-4330343964817507251?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_11_30_39-12876910448948399733?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_11_38_12-828035381799665153?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_11_46_13-8853503123438171395?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_11_54_26-582980106973133306?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_11_08_40-4483298912418461279?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_11_27_33-789587486636684956?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_11_35_51-10732816533600274226?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_11_44_41-14588685871041382215?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_11_52_04-13390187199758674271?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_11_08_51-641783958430249436?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_11_20_55-4349124329384675033?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_11_28_58-5092102584534386168?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_11_37_10-14053503361410919983?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_11_45_36-1851724145234017517?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_11_52_53-10115400436847773315?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_11_08_41-14946556000892517894?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_11_25_57-17336190364112881719?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_11_34_21-5607948943024228002?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_11_42_21-8903097759856221328?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_11_50_01-15717015767917879319?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_11_58_15-6215391655283192025?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_11_08_41-1502503249569320866?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_11_16_59-8423201026376097246?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_11_25_29-2736048116877781643?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_11_33_38-14220176076896660316?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_11_42_27-9054866551844449000?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_11_51_01-14736863557387190631?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_12_00_11-1156092441918660021?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_11_08_40-3959933980284783233?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_11_16_43-12824138231661377211?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_11_25_31-12585785202544932239?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_11_33_02-18088378123496277753?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_11_40_36-12086018510382664810?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_11_48_12-14705983584918991344?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_11_56_00-956212657436290347?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_11_08_42-2324031417870749521?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_11_17_18-12355714597719460697?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_11_25_32-1611207643101354?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_11_34_56-6412610171336552246?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_11_43_33-6074946709757706646?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-12_11_52_48-10011896893801771609?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 
'<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>'
 line: 116

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 5m 22s
123 actionable tasks: 106 executed, 15 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/fxpu4x6ro6f7u

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
