See <https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/583/display/redirect>
------------------------------------------
[...truncated 74.53 KB...]
Collecting six (from pyhamcrest->-r postcommit_requirements.txt (line 1))
  File was already downloaded /tmp/dataflow-requirements-cache/six-1.11.0.tar.gz
Collecting funcsigs>=1 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/funcsigs-1.0.2.tar.gz
Collecting pbr>=0.11 (from mock->-r postcommit_requirements.txt (line 2))
  File was already downloaded /tmp/dataflow-requirements-cache/pbr-3.1.1.tar.gz
Successfully downloaded pyhamcrest mock setuptools six funcsigs pbr
[...an identical requirements-cache block ("Collecting pyhamcrest / mock / setuptools / six / funcsigs / pbr ... File was already downloaded ... Successfully downloaded pyhamcrest mock setuptools six funcsigs pbr") repeats before each test below; repeats elided...]
<https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/ws/src/sdks/python/apache_beam/coders/typecoders.py>:133: UserWarning: Using fallback coder for typehint: Union[Tuple[str, NoneType], Tuple[str, int]].
  warnings.warn('Using fallback coder for typehint: %r.' % typehint)
test_par_do_with_multiple_outputs_and_using_yield (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/ws/src/sdks/python/apache_beam/coders/typecoders.py>:133: UserWarning: Using fallback coder for typehint: Union[Tuple[NoneType, Tuple[Any, List[Any]]], Tuple[NoneType, Tuple[Any, NoneType]]].
  warnings.warn('Using fallback coder for typehint: %r.' % typehint)
<https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/ws/src/sdks/python/apache_beam/coders/typecoders.py>:133: UserWarning: Using fallback coder for typehint: List[Any].
  warnings.warn('Using fallback coder for typehint: %r.' % typehint)
<https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/ws/src/sdks/python/apache_beam/coders/typecoders.py>:133: UserWarning: Using fallback coder for typehint: Union[Tuple[Any, List[Any]], Tuple[Any, NoneType]].
  warnings.warn('Using fallback coder for typehint: %r.' % typehint)
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
[...the same three fallback-coder warnings repeat verbatim; elided...]
test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/ws/src/sdks/python/apache_beam/coders/typecoders.py>:133: UserWarning: Using fallback coder for typehint: Union[Tuple[NoneType, Tuple[Any, Any]], Tuple[NoneType, Tuple[Any, NoneType]]].
  warnings.warn('Using fallback coder for typehint: %r.' % typehint)
test_as_list_and_as_dict_side_inputs (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/ws/src/sdks/python/apache_beam/coders/typecoders.py>:133: UserWarning: Using fallback coder for typehint: Union[Tuple[Any, Any], Tuple[Any, NoneType]].
  warnings.warn('Using fallback coder for typehint: %r.' % typehint)
test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/ws/src/sdks/python/apache_beam/coders/typecoders.py>:133: UserWarning: Using fallback coder for typehint: Union[Tuple[NoneType, Tuple[Any, NoneType]], Tuple[NoneType, Tuple[Any, Tuple[int, str]]]].
  warnings.warn('Using fallback coder for typehint: %r.' % typehint)
<https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/ws/src/sdks/python/apache_beam/coders/typecoders.py>:133: UserWarning: Using fallback coder for typehint: Union[Tuple[Any, NoneType], Tuple[Any, Tuple[int, str]]].
  warnings.warn('Using fallback coder for typehint: %r.' % typehint)
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ...
ok

======================================================================
ERROR: test_par_do_with_multiple_outputs_and_using_return (apache_beam.transforms.ptransform_test.PTransformTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/ws/src/sdks/python/apache_beam/transforms/ptransform_test.py>", line 240, in test_par_do_with_multiple_outputs_and_using_return
    pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/ws/src/sdks/python/apache_beam/testing/test_pipeline.py>", line 102, in run
    result = super(TestPipeline, self).run()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/ws/src/sdks/python/apache_beam/pipeline.py>", line 330, in run
    self.to_runner_api(), self.runner, self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/ws/src/sdks/python/apache_beam/pipeline.py>", line 339, in run
    return self.runner.run_pipeline(self)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 315, in run_pipeline
    self.dataflow_client.create_job(self.job), self)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/ws/src/sdks/python/apache_beam/utils/retry.py>", line 175, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 461, in create_job
    self.create_job_description(job)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 491, in create_job_description
    job.options, file_copy=self._gcs_file_copy)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/ws/src/sdks/python/apache_beam/runners/dataflow/internal/dependency.py>", line 328, in stage_job_resources
    setup_options.requirements_file, requirements_cache_path)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/ws/src/sdks/python/apache_beam/runners/dataflow/internal/dependency.py>", line 262, in _populate_requirements_cache
    processes.check_call(cmd_args)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/ws/src/sdks/python/apache_beam/utils/processes.py>", line 44, in check_call
    return subprocess.check_call(*args, **kwargs)
  File "/usr/lib/python2.7/subprocess.py", line 540, in check_call
    raise CalledProcessError(retcode, cmd)
CalledProcessError: Command '['<https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/ws/src/sdks/python/bin/python>', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--no-binary', ':all:']' returned non-zero exit status 2
-------------------- >> begin captured logging << --------------------
root: DEBUG: PValue computed by Some Numbers/Read (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by ClassifyNumbers/FlatMap(some_fn) (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert_that/WindowInto(WindowIntoFn) (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert_that/Create/Read (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert_that/ToVoidKey (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert_that/Group/pair_with_0 (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert_that/Group/pair_with_1 (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert_that/Group/Flatten (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert_that/Group/GroupByKey (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert_that/Group/Map(_merge_tagged_vals_under_key) (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert_that/Unkey (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by ClassifyNumbers/FlatMap(some_fn) (tag odd): refcount: 1 => 0
root: DEBUG: PValue computed by assert:odd/WindowInto(WindowIntoFn) (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert:odd/Create/Read (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert:odd/ToVoidKey (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert:odd/Group/pair_with_0 (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert:odd/Group/pair_with_1 (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert:odd/Group/Flatten (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert:odd/Group/GroupByKey (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert:odd/Group/Map(_merge_tagged_vals_under_key) (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert:odd/Unkey (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by ClassifyNumbers/FlatMap(some_fn) (tag even): refcount: 1 => 0
root: DEBUG: PValue computed by assert:even/WindowInto(WindowIntoFn) (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert:even/Create/Read (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert:even/ToVoidKey (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert:even/Group/pair_with_0 (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert:even/Group/pair_with_1 (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert:even/Group/Flatten (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert:even/Group/GroupByKey (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert:even/Group/Map(_merge_tagged_vals_under_key) (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by assert:even/Unkey (tag None): refcount: 1 => 0
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-validatesrunner-test/beamapp-jenkins-0105030248-271365.1515121368.271720/pipeline.pb...
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-validatesrunner-test/beamapp-jenkins-0105030248-271365.1515121368.271720/pipeline.pb
root: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-validatesrunner-test/beamapp-jenkins-0105030248-271365.1515121368.271720/requirements.txt...
root: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-validatesrunner-test/beamapp-jenkins-0105030248-271365.1515121368.271720/requirements.txt
root: INFO: Executing command: ['<https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/ws/src/sdks/python/bin/python>', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--no-binary', ':all:']
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
Ran 15 tests in 1528.188s

FAILED (errors=1)
Build step 'Execute shell' marked build as failure
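Editor's note, not part of the Jenkins output: the traceback above bottoms out in the standard library's `subprocess.check_call`, which raises `CalledProcessError` carrying the command and its exit code whenever the child process exits non-zero. That is exactly the shape of the failure here: the `pip download` warming `/tmp/dataflow-requirements-cache` exited with status 2, and the wrapper re-raised it as the log line "Command '[...]' returned non-zero exit status 2". A minimal sketch of that behavior follows; `populate_cache` is a hypothetical stand-in for Beam's wrapper, and the child commands are stand-ins for the real pip invocation:

```python
import subprocess
import sys

def populate_cache(cmd_args):
    # Hypothetical stand-in for a check_call-style wrapper: return None
    # on success, or surface the child's exit code on failure.
    try:
        subprocess.check_call(cmd_args)
        return None
    except subprocess.CalledProcessError as exc:
        # exc.cmd and exc.returncode are what check_call's error message
        # ("Command '...' returned non-zero exit status N") is built from.
        return exc.returncode

# Stand-in children instead of the real 'pip download' command:
print(populate_cache([sys.executable, "-c", "pass"]))                  # succeeds
print(populate_cache([sys.executable, "-c", "import sys; sys.exit(2)"]))  # fails with code 2
```

Under this reading, the error is environmental (the pip subprocess itself failed before any Dataflow job was submitted) rather than a defect in the test's pipeline code.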
