See 
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/7365/display/redirect>

Changes:


------------------------------------------
[...truncated 17.56 MB...]
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/transforms/environments.py>", line 741, in python_sdk_dependencies
    skip_prestaged_dependencies=skip_prestaged_dependencies))
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/runners/portability/stager.py>", line 179, in create_job_resources
    setup_options.requirements_file, requirements_cache_path)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/utils/retry.py>", line 260, in wrapper
    return fun(*args, **kwargs)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/runners/portability/stager.py>", line 569, in _populate_requirements_cache
    processes.check_output(cmd_args, stderr=processes.STDOUT)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/utils/processes.py>", line 99, in check_output
    .format(traceback.format_exc(), args[0][6], error.output))
RuntimeError: Full traceback: Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/utils/processes.py>", line 91, in check_output
    out = subprocess.check_output(*args, **kwargs)
  File "/usr/lib/python3.6/subprocess.py", line 356, in check_output
    **kwargs).stdout
  File "/usr/lib/python3.6/subprocess.py", line 438, in run
    output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command 
'['<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967053/bin/python>',
 '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 
'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']' 
returned non-zero exit status 1.
 
 Pip install failed for package: -r           
 Output from execution of subprocess: b'Collecting mock<3.0.0\n  File was 
already downloaded 
/tmp/dataflow-requirements-cache/mock-2.0.0.tar.gz\nCollecting 
parameterized<0.8.0,>=0.7.1\n  File was already downloaded 
/tmp/dataflow-requirements-cache/parameterized-0.7.4.tar.gz\nCollecting 
pyhamcrest!=1.10.0,<2.0.0\n  File was already downloaded 
/tmp/dataflow-requirements-cache/PyHamcrest-1.10.1.tar.gz\nCollecting 
pbr>=0.11\n  File was already downloaded 
/tmp/dataflow-requirements-cache/pbr-5.5.1.tar.gz\nCollecting six>=1.9\n  File 
was already downloaded /tmp/dataflow-requirements-cache/six-1.15.0.tar.gz\n    
ERROR: Command errored out with exit status 1:\n     command: 
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967053/bin/python>
 -c \'import sys, setuptools, tokenize; sys.argv[0] = 
\'"\'"\'/tmp/pip-download-nh108tc5/six_35f487e32ee6469894980e27001621e0/setup.py\'"\'"\';
 
__file__=\'"\'"\'/tmp/pip-download-nh108tc5/six_35f487e32ee6469894980e27001621e0/setup.py\'"\'"\';f=getattr(tokenize,
 \'"\'"\'open\'"\'"\', 
open)(__file__);code=f.read().replace(\'"\'"\'\\r\\n\'"\'"\', 
\'"\'"\'\\n\'"\'"\');f.close();exec(compile(code, __file__, 
\'"\'"\'exec\'"\'"\'))\' egg_info --egg-base /tmp/pip-pip-egg-info-c_ea61xz\n   
      cwd: /tmp/pip-download-nh108tc5/six_35f487e32ee6469894980e27001621e0/\n   
 Complete output (5 lines):\n    Traceback (most recent call last):\n      File 
"<string>", line 1, in <module>\n      File 
"<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967053/lib/python3.6/tokenize.py";,>
 line 452, in open\n        buffer = _builtin_open(filename, \'rb\')\n    
FileNotFoundError: [Errno 2] No such file or directory: 
\'/tmp/pip-download-nh108tc5/six_35f487e32ee6469894980e27001621e0/setup.py\'\n  
  ----------------------------------------\nERROR: Command errored out with 
exit status 1: python setup.py egg_info Check the logs for full command 
output.\n'
-------------------- >> begin captured logging << --------------------
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
apache_beam.runners.portability.stager: INFO: Executing command: 
['<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967053/bin/python>',
 '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 
'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']
--------------------- >> end captured logging << ---------------------
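
The failing step is the stager's requirements-cache population: _populate_requirements_cache shells out to pip with the arguments captured above, and apache_beam.utils.processes.check_output wraps the resulting CalledProcessError in the RuntimeError shown (args[0][6] in that command list is the '-r' flag, which is why the message reads "Pip install failed for package: -r"). Below is a minimal sketch for replaying the logged pip download command locally; using sys.executable and a local postcommit_requirements.txt are assumptions, the remaining arguments mirror the logged command.

# Reproduction sketch (not the Beam stager itself): replays the logged
# 'pip download' command. Assumes postcommit_requirements.txt exists in the
# current directory and uses sys.executable rather than the Jenkins
# gradleenv interpreter.
import subprocess
import sys

cmd = [
    sys.executable, '-m', 'pip', 'download',
    '--dest', '/tmp/dataflow-requirements-cache',
    '-r', 'postcommit_requirements.txt',
    '--exists-action', 'i',
    '--no-binary', ':all:',
]
try:
    subprocess.check_output(cmd, stderr=subprocess.STDOUT)
except subprocess.CalledProcessError as error:
    # processes.check_output re-raises this as the RuntimeError seen above.
    print('pip download failed (%d):\n%s'
          % (error.returncode, error.output.decode()))
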
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_39_12-13488081208286825811?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_46_18-6852839736708339341?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_53_17-9134896670883639007?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_39_14-16355298447783801778?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_45_57-10903250759465724655?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_53_07-15949683228310022382?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_39_14-13300596500179011869?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_46_14-14218362441032708379?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_53_24-8620256233757522287?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_11_00_42-7400701993957263196?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_39_14-15583485888772444684?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_46_18-10168800288932508348?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_39_13-14657763211227015455?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_46_28-3620849362591884048?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_53_19-17768735257581113010?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_39_11-13290781488431023613?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_46_02-11792774142905433845?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_53_16-17875936786789221050?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_39_14-7654844825444074689?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_46_23-3631542905714916668?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_39_13-2758363075243283405?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_46_09-6121376987403411262?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_52_58-14618103686977046337?project=apache-beam-testing

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df-py36.xml
----------------------------------------------------------------------
XML: 
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 30 tests in 1751.681s

FAILED (SKIP=3, errors=1)

> Task :sdks:python:test-suites:dataflow:py36:validatesRunnerStreamingTests FAILED

> Task :sdks:python:test-suites:dataflow:py37:validatesRunnerStreamingTests
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-25T19:08:51.024Z: 
JOB_MESSAGE_BASIC: Finished operation Create/Impulse+Create/FlatMap(<lambda at 
core.py:2957>)+Create/MaybeReshuffle/Reshuffle/AddRandomKeys+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-25T19:08:51.024Z: 
JOB_MESSAGE_BASIC: Finished operation 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys+Create/Map(decode)+Key param+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/WriteStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-25T19:08:51.024Z: 
JOB_MESSAGE_BASIC: Finished operation 
assert_that/Create/Impulse+assert_that/Create/FlatMap(<lambda at 
core.py:2957>)+assert_that/Create/Map(decode)+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/WriteStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-25T19:08:51.024Z: 
JOB_MESSAGE_BASIC: Finished operation 
assert_that/Group/GroupByKey/ReadStream+assert_that/Group/GroupByKey/MergeBuckets+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-25T19:08:51.156Z: 
JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-25T19:08:51.184Z: 
JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-25T19:08:51.189Z: 
JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-25T19:08:51.191Z: 
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-25T19:08:51.196Z: 
JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-25T19:09:35.283Z: 
JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 
2020-12-25_11_03_00-15667665242995794330 is in state JOB_STATE_DONE
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_39_23-6645729685859220930?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_46_57-7823026288036242783?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_54_11-16547485244462165111?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_39_24-9184646880744806094?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_46_05-16840636563572477319?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_52_54-10183995237875398465?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_39_24-607762297043867215?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_46_33-8088560554688790812?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_53_32-14653462185704780498?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_39_23-18288625769466209475?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_45_54-2983866492150676287?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_54_39-11233563969908920595?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_39_24-9779367813093363323?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_46_19-6917243110560672970?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_53_03-1965703272242122482?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_39_25-6232447123493309022?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_47_34-11832198193640110994?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_54_43-11689442630262429847?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_11_03_00-15667665242995794330?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_39_24-10483845677918896476?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_46_25-15974092194182051505?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_39_24-7005971735176864975?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_46_23-9109850962010692531?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_53_28-4728033115753675322?project=apache-beam-testing
test_combine 
(apache_beam.transforms.combinefn_lifecycle_test.CombineFnLifecycleTest) ... 
SKIP: CombineFn.setup and CombineFn.teardown are not supported. Please use 
Dataflow Runner V2.
test_combining_value_state 
(apache_beam.transforms.combinefn_lifecycle_test.CombineFnLifecycleTest) ... 
SKIP: CombineFn.setup and CombineFn.teardown are not supported. Please use 
Dataflow Runner V2.
test_non_liftable_combine 
(apache_beam.transforms.combinefn_lifecycle_test.CombineFnLifecycleTest) ... 
SKIP: CombineFn.setup and CombineFn.teardown are not supported. Please use 
Dataflow Runner V2.
test_gbk_many_values 
(apache_beam.runners.portability.fn_api_runner.fn_runner_test.FnApiBasedStateBackedCoderTest)
 ... ok
Test TimestampCombiner with EARLIEST. ... ok
Test TimestampCombiner with LATEST. ... ok
test_dofn_lifecycle 
(apache_beam.transforms.dofn_lifecycle_test.DoFnLifecycleTest) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_flatten_a_flattened_pcollection 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_one_single_pcollection 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_pcollections 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_user_counter_using_pardo (apache_beam.metrics.metric_test.MetricsTest) ... 
ok
test_impulse (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... 
ok
test_as_list_and_as_dict_side_inputs 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... 
ok
test_multiple_empty_outputs 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_singleton_with_different_defaults 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) 
... ok
test_as_singleton_without_unique_labels 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_reshuffle_preserves_timestamps 
(apache_beam.transforms.util_test.ReshuffleTest) ... ok
test_empty_singleton_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_element_param (apache_beam.pipeline_test.DoFnTest) ... ok
test_key_param (apache_beam.pipeline_test.DoFnTest) ... ok

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df-py37.xml
----------------------------------------------------------------------
XML: 
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 30 tests in 1843.346s

OK (SKIP=3)

> Task :sdks:python:test-suites:dataflow:py38:validatesRunnerStreamingTests
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-25T19:10:20.352Z: 
JOB_MESSAGE_BASIC: Finished operation 
assert_that/Create/Impulse+assert_that/Create/FlatMap(<lambda at 
core.py:2957>)+assert_that/Create/Map(decode)+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/WriteStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-25T19:10:20.352Z: 
JOB_MESSAGE_BASIC: Finished operation Create/Impulse+Create/FlatMap(<lambda at 
core.py:2957>)+Create/MaybeReshuffle/Reshuffle/AddRandomKeys+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-25T19:10:20.352Z: 
JOB_MESSAGE_BASIC: Finished operation 
assert_that/Group/GroupByKey/ReadStream+assert_that/Group/GroupByKey/MergeBuckets+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-25T19:10:20.352Z: 
JOB_MESSAGE_BASIC: Finished operation 
Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys+Create/Map(decode)+Key param+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/WriteStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-25T19:10:20.471Z: 
JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-25T19:10:20.524Z: 
JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-25T19:10:20.531Z: 
JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-25T19:10:20.534Z: 
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-25T19:10:20.540Z: 
JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-12-25T19:11:13.658Z: 
JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 
2020-12-25_11_04_04-11437339836569342849 is in state JOB_STATE_DONE
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_40_04-1800151589161815767?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_47_38-13139716292461397479?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_40_05-13086786469803582675?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_46_30-6288819408972981647?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_53_39-16010409725327739242?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_40_05-2409365232833695130?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_47_44-2918661906310158224?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_55_03-14412880244652471366?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_40_08-12963059655686764581?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_47_24-10993715389059978615?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_54_37-3280389824646297296?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_40_09-18112568953376010806?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_49_02-17403609803977053536?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_56_36-14119027708258170034?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_11_04_04-11437339836569342849?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_40_05-172317078998984354?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_47_15-2179257173304687332?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_54_14-18206377985218192793?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_40_06-18008515791512316040?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_47_12-16394647885830085946?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_54_31-9574675756893698698?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_40_05-3559527193508380994?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_47_24-16159636572317797519?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-25_10_54_48-11131099778083093767?project=apache-beam-testing
test_combine 
(apache_beam.transforms.combinefn_lifecycle_test.CombineFnLifecycleTest) ... 
SKIP: CombineFn.setup and CombineFn.teardown are not supported. Please use 
Dataflow Runner V2.
test_combining_value_state 
(apache_beam.transforms.combinefn_lifecycle_test.CombineFnLifecycleTest) ... 
SKIP: CombineFn.setup and CombineFn.teardown are not supported. Please use 
Dataflow Runner V2.
test_non_liftable_combine 
(apache_beam.transforms.combinefn_lifecycle_test.CombineFnLifecycleTest) ... 
SKIP: CombineFn.setup and CombineFn.teardown are not supported. Please use 
Dataflow Runner V2.
test_gbk_many_values 
(apache_beam.runners.portability.fn_api_runner.fn_runner_test.FnApiBasedStateBackedCoderTest)
 ... ok
Test TimestampCombiner with EARLIEST. ... ok
Test TimestampCombiner with LATEST. ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_flatten_multiple_pcollections_having_multiple_consumers 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_impulse (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_a_flattened_pcollection 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_one_single_pcollection 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_user_counter_using_pardo (apache_beam.metrics.metric_test.MetricsTest) ... 
ok
test_flatten_pcollections 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_dofn_lifecycle 
(apache_beam.transforms.dofn_lifecycle_test.DoFnLifecycleTest) ... ok
test_multiple_empty_outputs 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) 
... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... 
ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... 
ok
test_as_singleton_with_different_defaults 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_and_as_dict_side_inputs 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_reshuffle_preserves_timestamps 
(apache_beam.transforms.util_test.ReshuffleTest) ... ok
test_element_param (apache_beam.pipeline_test.DoFnTest) ... ok
test_key_param (apache_beam.pipeline_test.DoFnTest) ... ok

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df-py38.xml
----------------------------------------------------------------------
XML: 
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 30 tests in 1898.091s

OK (SKIP=3)

FAILURE: Build failed with an exception.

* Where:
Script 
'<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>'
 line: 175

* What went wrong:
Execution failed for task 
':sdks:python:test-suites:dataflow:py36:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/6.7.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 11m 3s
89 actionable tasks: 60 executed, 29 from cache

Publishing build scan...
https://gradle.com/s/g462duwos6mrw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
