Mark Liu created BEAM-5785:
------------------------------

             Summary: A ValidatesRunner test failed on Python 3
                 Key: BEAM-5785
                 URL: https://issues.apache.org/jira/browse/BEAM-5785
             Project: Beam
          Issue Type: Bug
          Components: test-failures
            Reporter: Mark Liu
            Assignee: Valentyn Tymofieiev


I ran a randomly chosen ValidatesRunner test on Python 3 with TestDataflowRunner. The 
test failed before submitting the job to the service.

More details about my environment and test:
Python version: 3.5.3
Test: apache_beam.transforms.ptransform_test:PTransformTest.test_multiple_empty_outputs
Command:
{code}
python setup.py nosetests \
  --tests apache_beam.transforms.ptransform_test:PTransformTest.test_multiple_empty_outputs \
  --nocapture \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=<my_project> \
    --staging_location=<my_staging> \
    --temp_location=<my_temp> \
    --output=<my_output> \
    --sdk_location=.../beam/sdks/python/dist/apache-beam-2.9.0.dev0.tar.gz \
    --num_workers=1"
{code}

Here is the stack trace from my console:
{code}
======================================================================
ERROR: test_multiple_empty_outputs (apache_beam.transforms.ptransform_test.PTransformTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/google/home/markliu/beam/sdks/python/apache_beam/transforms/ptransform_test.py", line 284, in test_multiple_empty_outputs
    pipeline.run()
  File "/usr/local/google/home/markliu/beam/sdks/python/apache_beam/testing/test_pipeline.py", line 107, in run
    else test_runner_api))
  File "/usr/local/google/home/markliu/beam/sdks/python/apache_beam/pipeline.py", line 403, in run
    self.to_runner_api(), self.runner, self._options).run(False)
  File "/usr/local/google/home/markliu/beam/sdks/python/apache_beam/pipeline.py", line 416, in run
    return self.runner.run_pipeline(self)
  File "/usr/local/google/home/markliu/beam/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py", line 50, in run_pipeline
    self.result = super(TestDataflowRunner, self).run_pipeline(pipeline)
  File "/usr/local/google/home/markliu/beam/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py", line 374, in run_pipeline
    super(DataflowRunner, self).run_pipeline(pipeline)
  File "/usr/local/google/home/markliu/beam/sdks/python/apache_beam/runners/runner.py", line 176, in run_pipeline
    pipeline.visit(RunVisitor(self))
  File "/usr/local/google/home/markliu/beam/sdks/python/apache_beam/pipeline.py", line 444, in visit
    self._root_transform().visit(visitor, self, visited)
  File "/usr/local/google/home/markliu/beam/sdks/python/apache_beam/pipeline.py", line 780, in visit
    part.visit(visitor, pipeline, visited)
  File "/usr/local/google/home/markliu/beam/sdks/python/apache_beam/pipeline.py", line 780, in visit
    part.visit(visitor, pipeline, visited)
  File "/usr/local/google/home/markliu/beam/sdks/python/apache_beam/pipeline.py", line 783, in visit
    visitor.visit_transform(self)
  File "/usr/local/google/home/markliu/beam/sdks/python/apache_beam/runners/runner.py", line 171, in visit_transform
    self.runner.run_transform(transform_node)
  File "/usr/local/google/home/markliu/beam/sdks/python/apache_beam/runners/runner.py", line 214, in run_transform
    return m(transform_node)
  File "/usr/local/google/home/markliu/beam/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py", line 846, in run_Read
    source_dict)
  File "/usr/local/google/home/markliu/beam/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py", line 85, in add_property
    key=name, value=to_json_value(value, with_type=with_type)))
  File "/usr/local/google/home/markliu/beam/sdks/python/apache_beam/internal/gcp/json_value.py", line 104, in to_json_value
    key=k, value=to_json_value(v, with_type=with_type)))
  File "/usr/local/google/home/markliu/beam/sdks/python/apache_beam/internal/gcp/json_value.py", line 104, in to_json_value
    key=k, value=to_json_value(v, with_type=with_type)))
  File "/usr/local/google/home/markliu/beam/sdks/python/apache_beam/internal/gcp/json_value.py", line 124, in to_json_value
    raise TypeError('Cannot convert %s to a JSON value.' % repr(obj))
TypeError: Cannot convert b'eNpVjrEKwjAURRNjq6ZOfkVd8hOZxFGQLhJe0ycW2saXpA5CQf/ctro4Xc7hwr0vYeEO9oamRGhV9NCFq/NtUNYjRDTB9d6iNHrG05eI7d/EB1rkRcYYM9FFaEyon0jiKIrd5AL6GppRVeYBTY+BlhdKcs05pZoLWmme0BqLdCpbV6Gnzd+X2YVfyDP4Qxf1BJLkOJ8NtC37Un0AfYQ+kQ==' to a JSON value.
-------------------- >> begin captured logging << --------------------
avro.schema: Level 5: Register new name for 'org.apache.avro.file.Header'
avro.schema: Level 5: Register new name for 'org.apache.avro.file.magic'
avro.schema: Level 5: Register new name for 'org.apache.avro.file.sync'
avro.schema: Level 5: Register new name for 'example.avro.User'
root: WARNING: snappy is not installed; some tests will be skipped.
avro.schema: Level 5: Register new name for 'example.avro.User'
root: WARNING: Tensorflow is not installed, so skipping some tests.
root: ERROR: Error while visiting Some Numbers/Read
--------------------- >> end captured logging << ---------------------
Ran 1 test in 1.319s
{code}
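This looks like a Python 2/3 str-vs-bytes mismatch: the value in the error message carries a {{b'...'}} prefix, so the serialized source payload is a {{bytes}} object on Python 3, which the JSON-conversion path rejects. A minimal sketch of the same mismatch using only the standard library (the {{base64}}/{{zlib}} framing here is an assumption about how the payload was produced, and the stdlib {{json}} module stands in for Beam's {{to_json_value}}):

{code}
import base64
import json
import zlib

# Simulate a serialized-source payload: on Python 3 the result of
# base64-encoding stays a `bytes` object, not a `str`.
payload = base64.b64encode(zlib.compress(b"serialized source"))

print(isinstance(payload, str))  # False on Python 3 (True on Python 2)

# A bytes value is not accepted as a JSON string on Python 3.
try:
    json.dumps({"serialized_source": payload})
except TypeError as e:
    print("rejected:", e)

# Decoding the base64 text to str makes it serializable again.
print(json.dumps({"serialized_source": payload.decode("ascii")}))
{code}

If the fix follows this pattern, the serialized source would need to be decoded to text (or handled explicitly as bytes) before it reaches the JSON conversion.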

How to reproduce:
1. Create a virtualenv with Python 3.
2. Install the gcp and test dependencies.
3. Build the Python SDK tarball.
4. Run the test.
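The steps above can be sketched as shell commands. The paths and the bracketed option values are placeholders, and the commands assume a checkout of the Beam repo; this is not runnable as-is without a GCP project:

{code}
# 1. Create a virtualenv with Python 3 and activate it.
python3 -m venv beam-py3-env
source beam-py3-env/bin/activate

# 2. Install the gcp and test extras from the Python SDK directory.
cd beam/sdks/python
pip install -e ".[gcp,test]"

# 3. Build the source tarball referenced by --sdk_location.
python setup.py sdist

# 4. Run the failing test (fill in the placeholders).
python setup.py nosetests \
  --tests apache_beam.transforms.ptransform_test:PTransformTest.test_multiple_empty_outputs \
  --nocapture \
  --test-pipeline-options=" \
    --runner=TestDataflowRunner \
    --project=<my_project> \
    --staging_location=<my_staging> \
    --temp_location=<my_temp> \
    --output=<my_output> \
    --sdk_location=dist/apache-beam-2.9.0.dev0.tar.gz \
    --num_workers=1"
{code}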



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
