See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/6222/display/redirect?page=changes>

Changes:

[ehudm] [BEAM-5422] Document DynamicDestinations.getTable uniqueness requirement


------------------------------------------
[...truncated 19.68 MB...]
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_element_param (apache_beam.pipeline_test.DoFnTest) ... ok
test_key_param (apache_beam.pipeline_test.DoFnTest) ... ok

======================================================================
ERROR: test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/transforms/sideinputs_test.py>", line 196, in test_iterable_side_input
    pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/testing/test_pipeline.py>", line 112, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/pipeline.py>", line 501, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/pipeline.py>", line 514, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py>", line 57, in run_pipeline
    self).run_pipeline(pipeline, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 561, in run_pipeline
    self.dataflow_client.create_job(self.job), self)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/utils/retry.py>", line 236, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 635, in create_job
    self.create_job_description(job)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 691, in create_job_description
    resources = self._stage_resources(job.options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 588, in _stage_resources
    staging_location=google_cloud_options.staging_location)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/runners/portability/stager.py>", line 351, in create_and_stage_job_resources
    staged_resources = self.stage_job_resources(resources, staging_location)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/runners/portability/stager.py>", line 305, in stage_job_resources
    file_path, FileSystems.join(staging_location, staged_path))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 953, in stage_artifact
    local_path_to_artifact, artifact_name)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/utils/retry.py>", line 236, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 575, in _gcs_file_copy
    self.stage_file(to_folder, to_name, f, total_size=total_size)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 613, in stage_file
    response = self._storage_client.objects.Insert(request, upload=upload)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/io/gcp/internal/clients/storage/storage_v1_client.py>", line 1156, in Insert
    upload=upload, upload_config=upload_config)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/apitools/base/py/base_api.py>", line 715, in _RunMethod
    http_request, client=self.client)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/apitools/base/py/transfer.py>", line 908, in InitializeUpload
    return self.StreamInChunks()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/apitools/base/py/transfer.py>", line 1020, in StreamInChunks
    additional_headers=additional_headers)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/apitools/base/py/transfer.py>", line 957, in __StreamMedia
    response = send_func(self.stream.tell())
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/apitools/base/py/transfer.py>", line 943, in CallSendChunk
    start, additional_headers=additional_headers)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/apitools/base/py/transfer.py>", line 1120, in __SendChunk
    return self.__SendMediaRequest(request, end)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/apitools/base/py/transfer.py>", line 1033, in __SendMediaRequest
    retries=self.num_retries, check_response_func=CheckResponse)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/apitools/base/py/http_wrapper.py>", line 346, in MakeRequest
    check_response_func=check_response_func)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/apitools/base/py/http_wrapper.py>", line 396, in _MakeRequestNoRetry
    redirections=redirections, connection_type=connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/oauth2client/transport.py>", line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/oauth2client/transport.py>", line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/httplib2/__init__.py>", line 1924, in request
    cachekey,
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/httplib2/__init__.py>", line 1595, in _request
    conn, request_uri, method, body, headers
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967053/lib/python3.6/site-packages/httplib2/__init__.py>", line 1533, in _conn_request
    response = conn.getresponse()
  File "/usr/lib/python3.6/http/client.py", line 1331, in getresponse
    response.begin()
  File "/usr/lib/python3.6/http/client.py", line 297, in begin
    version, status, reason = self._read_status()
  File "/usr/lib/python3.6/http/client.py", line 258, in _read_status
    line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
  File "/usr/lib/python3.6/socket.py", line 586, in readinto
    return self._sock.recv_into(b)
  File "/usr/lib/python3.6/ssl.py", line 1012, in recv_into
    return self.read(nbytes, buffer)
  File "/usr/lib/python3.6/ssl.py", line 874, in read
    return self._sslobj.read(len, buffer)
  File "/usr/lib/python3.6/ssl.py", line 631, in read
    v = self._sslobj.read(len, buffer)
socket.timeout: The read operation timed out
-------------------- >> begin captured logging << --------------------
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0408045758-713733.1586321878.713886/pipeline.pb...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0408045758-713733.1586321878.713886/pipeline.pb in 0 seconds.
apache_beam.runners.portability.stager: INFO: Executing command: ['<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967053/bin/python>', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']
apache_beam.runners.portability.stager: INFO: Copying Beam SDK "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0408045758-713733.1586321878.713886/requirements.txt...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0408045758-713733.1586321878.713886/requirements.txt in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0408045758-713733.1586321878.713886/six-1.14.0.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0408045758-713733.1586321878.713886/six-1.14.0.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0408045758-713733.1586321878.713886/parameterized-0.7.1.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0408045758-713733.1586321878.713886/parameterized-0.7.1.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0408045758-713733.1586321878.713886/mock-2.0.0.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0408045758-713733.1586321878.713886/mock-2.0.0.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0408045758-713733.1586321878.713886/pbr-5.4.4.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0408045758-713733.1586321878.713886/pbr-5.4.4.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0408045758-713733.1586321878.713886/funcsigs-1.0.2.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0408045758-713733.1586321878.713886/funcsigs-1.0.2.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0408045758-713733.1586321878.713886/pbr-5.4.5.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0408045758-713733.1586321878.713886/pbr-5.4.5.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0408045758-713733.1586321878.713886/PyHamcrest-1.10.1.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0408045758-713733.1586321878.713886/PyHamcrest-1.10.1.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0408045758-713733.1586321878.713886/dataflow_python_sdk.tar...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0408045758-713733.1586321878.713886/dataflow_python_sdk.tar in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0408045758-713733.1586321878.713886/dataflow-worker.jar...
root: DEBUG: Caught socket error, retrying: [Errno 32] Broken pipe
root: DEBUG: Retrying request to url https://www.googleapis.com/resumable/upload/storage/v1/b/temp-storage-for-end-to-end-tests/o?alt=json&name=staging-it%2Fbeamapp-jenkins-0408045758-713733.1586321878.713886%2Fdataflow-worker.jar&uploadType=resumable&upload_id=AEnB2UqwV6elbdVR4iT-UewUM25G8l3RkedQWX6jUBjjkyOpVWH0ONmRsbHUJH4Qltc8Qvc0lDEZfGWUUe7WQgfF7i5gDkR3iG3P6RozC0lT_8a96HhB7M4 after exception [Errno 32] Broken pipe
root: DEBUG: Caught socket error, retrying: The read operation timed out
root: DEBUG: Retrying request to url https://www.googleapis.com/resumable/upload/storage/v1/b/temp-storage-for-end-to-end-tests/o?alt=json&name=staging-it%2Fbeamapp-jenkins-0408045758-713733.1586321878.713886%2Fdataflow-worker.jar&uploadType=resumable&upload_id=AEnB2UqwV6elbdVR4iT-UewUM25G8l3RkedQWX6jUBjjkyOpVWH0ONmRsbHUJH4Qltc8Qvc0lDEZfGWUUe7WQgfF7i5gDkR3iG3P6RozC0lT_8a96HhB7M4 after exception The read operation timed out
root: DEBUG: Caught socket error, retrying: The read operation timed out
root: DEBUG: Retrying request to url https://www.googleapis.com/resumable/upload/storage/v1/b/temp-storage-for-end-to-end-tests/o?alt=json&name=staging-it%2Fbeamapp-jenkins-0408045758-713733.1586321878.713886%2Fdataflow-worker.jar&uploadType=resumable&upload_id=AEnB2UqwV6elbdVR4iT-UewUM25G8l3RkedQWX6jUBjjkyOpVWH0ONmRsbHUJH4Qltc8Qvc0lDEZfGWUUe7WQgfF7i5gDkR3iG3P6RozC0lT_8a96HhB7M4 after exception The read operation timed out
root: DEBUG: Caught socket error, retrying: The read operation timed out
root: DEBUG: Retrying request to url https://www.googleapis.com/resumable/upload/storage/v1/b/temp-storage-for-end-to-end-tests/o?alt=json&name=staging-it%2Fbeamapp-jenkins-0408045758-713733.1586321878.713886%2Fdataflow-worker.jar&uploadType=resumable&upload_id=AEnB2UqwV6elbdVR4iT-UewUM25G8l3RkedQWX6jUBjjkyOpVWH0ONmRsbHUJH4Qltc8Qvc0lDEZfGWUUe7WQgfF7i5gDkR3iG3P6RozC0lT_8a96HhB7M4 after exception The read operation timed out
--------------------- >> end captured logging << ---------------------
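
Editor's note: the error above is client-side, not a worker failure. The resumable upload of dataflow-worker.jar to GCS first hits a broken pipe, then repeated read timeouts inside httplib2's SSL socket read, and the last timeout propagates out of the apitools chunked upload as socket.timeout. For illustration only, the minimal sketch below shows the general backoff-and-retry pattern that the "root: DEBUG: Caught socket error, retrying" lines correspond to; retry_on_socket_errors, upload_chunk, the attempt count, and the delays are hypothetical placeholders, not Beam's actual retry code (which lives in apache_beam/utils/retry.py, as seen in the traceback).

import random
import socket
import time

def retry_on_socket_errors(upload_chunk, attempts=4, base_delay=1.0):
    """Retry a flaky network call on broken pipes and read timeouts.

    `upload_chunk` is a hypothetical zero-argument callable standing in
    for one chunk of a resumable upload; this sketches the retry pattern
    visible in the DEBUG log above, not Beam's own implementation.
    """
    for attempt in range(1, attempts + 1):
        try:
            return upload_chunk()
        except (socket.timeout, BrokenPipeError):
            if attempt == attempts:
                # Out of retries: the last error surfaces to the caller,
                # like the socket.timeout in the traceback above.
                raise
            # Exponential backoff with jitter before the next attempt.
            time.sleep(base_delay * (2 ** (attempt - 1)) * (0.5 + random.random()))

In this run the retry budget was exhausted, so staging failed and only the py36 streaming suite reports the error, while the py37 suite below completed.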
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-07_21_42_17-2901838650442796476?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-07_21_50_24-14670485604091519357?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-07_21_58_18-252874662158331045?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-07_21_42_18-17345248363438389755?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-07_21_50_21-6157538584871098473?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-07_21_58_08-16616690117009564930?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-07_21_42_19-1257006314004406768?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-07_21_50_03-12969752877752356726?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-07_21_57_52-14929832342606547178?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-07_21_42_18-17916212619402502706?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-07_21_50_52-17692926089881954004?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-07_21_42_16-7949341641023929840?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-07_21_50_20-6064494926618427644?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-07_21_58_18-6276701194172230241?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-07_22_06_04-8486075913056566902?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-07_21_42_18-2707708974129291717?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-07_21_50_20-9639764787453075392?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-07_21_42_19-9835773782164667781?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-07_21_50_13-6803248902292196892?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-07_21_58_06-17322561602562978130?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-07_21_42_18-4027338508478109625?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-07_21_49_42-986279059493091242?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-07_21_57_33-13190933171153597493?project=apache-beam-testing

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df-py36.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 27 tests in 1901.583s

FAILED (errors=1)

> Task :sdks:python:test-suites:dataflow:py36:validatesRunnerStreamingTests FAILED

> Task :sdks:python:test-suites:dataflow:py37:validatesRunnerStreamingTests
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-08T05:14:03.227Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-04-07_22_06_26-1301946633442483571 is in state JOB_STATE_DONE
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-07_21_42_00-14513817836858111908?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-07_21_50_02-12689498862433944943?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-07_21_57_52-3963933564676365148?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-07_21_42_01-10814509492729174308?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-07_21_49_19-15742961004084454171?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-07_21_57_15-17368655596327520104?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-07_21_42_01-18308783491259241826?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-07_21_50_28-2716960681929097826?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-07_21_42_01-15401296360179457946?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-07_21_49_40-12477789034366336247?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-07_21_57_32-4279580716339259860?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-07_21_42_00-14071644359237756565?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-07_21_49_58-2157194163511908559?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-07_21_58_01-13488904919378835822?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-07_21_41_59-17601442307091247046?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-07_21_49_46-11677245960313050944?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-07_21_57_29-9119107869290994276?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-07_21_42_01-12221068069471905866?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-07_21_49_54-18294770901941596659?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-07_21_57_41-16584046607014175684?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-07_21_42_00-16357556772740974389?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-07_21_50_01-8550311404401774740?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-07_21_58_19-1269114812156250084?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-07_22_06_26-1301946633442483571?project=apache-beam-testing
test_gbk_many_values (apache_beam.runners.portability.fn_api_runner.fn_runner_test.FnApiBasedStateBackedCoderTest) ... ok
Test TimestampCombiner with EARLIEST. ... ok
Test TimestampCombiner with LATEST. ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_impulse (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_dofn_lifecycle (apache_beam.transforms.dofn_lifecycle_test.DoFnLifecycleTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_a_flattened_pcollection (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_one_single_pcollection (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_user_counter_using_pardo (apache_beam.metrics.metric_test.MetricsTest) ... ok
test_flatten_pcollections (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_reshuffle_preserves_timestamps (apache_beam.transforms.util_test.ReshuffleTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_element_param (apache_beam.pipeline_test.DoFnTest) ... ok
test_key_param (apache_beam.pipeline_test.DoFnTest) ... ok

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df-py37.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 27 tests in 1958.947s

OK

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/py36/build.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 12m 58s
76 actionable tasks: 58 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/wtvpqakjhxsps

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
