Lisandro Dalcin wrote:
> No time right now to look at this... All doctests fail on Py3.1rc1,
> with errors like the traceback below. Are we needing a (trivial?) fix
> or is something broken on Py3.1?
>
> ======================================================================
> ERROR: compiling (cpp) and running withstat
> ----------------------------------------------------------------------
> Traceback (most recent call last):
>   File "runtests.py", line 345, in run
>     doctest.DocTestSuite(self.module).run(result)
>   File "/usr/local/python/3.1/lib/python3.1/doctest.py", line 2226, in DocTestSuite
>     tests = test_finder.find(module, globs=globs, extraglobs=extraglobs)
>   File "/usr/local/python/3.1/lib/python3.1/doctest.py", line 820, in find
>     source_lines = linecache.getlines(file, module.__dict__)
>   File "/usr/local/python/3.1/lib/python3.1/linecache.py", line 41, in getlines
>     return updatecache(filename, module_globals)
>   File "/usr/local/python/3.1/lib/python3.1/linecache.py", line 130, in updatecache
>     lines = fp.readlines()
>   File "/usr/local/python/3.1/lib/python3.1/codecs.py", line 300, in decode
>     (result, consumed) = self._buffer_decode(data, self.errors, final)
> UnicodeDecodeError: 'utf8' codec can't decode byte 0x94 in position 100: unexpected code byte

If all doctests fail like this, I doubt there's much we can do about it on
our side. I wonder why the UTF-8 codec is involved here at all; we use
Unicode strings for "__doc__" everywhere in the Py3-compatible tests. And
why does it look like linecache is reading from a file (it even gets a
filename!)? I can't look at the sources right now, but this feels awfully
wrong to me. I hope it's not trying to read the binary extension modules
as text files...
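For the bug report it may help to attach a minimal reproduction. This sketch (the file contents and the 0x94 byte are just illustrative, taken from the traceback above) triggers the same error by reading a non-UTF-8 source file as UTF-8 text, the way linecache's fp.readlines() call does:

```python
import os
import tempfile

# Hypothetical reproduction, not Cython code: byte 0x94 is a cp1252
# "smart quote" and is never valid as a UTF-8 start byte, so reading
# such a file back as UTF-8 text must raise UnicodeDecodeError.
with tempfile.NamedTemporaryFile(mode="wb", suffix=".py", delete=False) as f:
    f.write(b"# comment with a cp1252 quote: \x94\n")
    path = f.name

failed = False
try:
    with open(path, encoding="utf-8") as fp:
        fp.readlines()  # same kind of call that fails in updatecache()
except UnicodeDecodeError as exc:
    failed = True
    print("decode failed:", exc.reason)
finally:
    os.remove(path)
```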

I think we should file a bug report against Py3.1 and even bring this to
the attention of python-dev before it's too late.
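If linecache really does open source files as UTF-8 regardless of their declared encoding, one direction we could suggest in the report is decoding via the PEP 263 coding cookie, which tokenize.detect_encoding() already handles in Py3. A sketch, with hypothetical file contents:

```python
import os
import tempfile
import tokenize

# Illustrative file with a coding cookie and a cp1252 byte (0x94)
# that is invalid as UTF-8 but fine under the declared encoding.
with tempfile.NamedTemporaryFile(mode="wb", suffix=".py", delete=False) as f:
    f.write(b"# -*- coding: cp1252 -*-\n# quote: \x94\n")
    path = f.name

# Detect the encoding from the cookie instead of assuming UTF-8,
# then decode the source with it -- this is what linecache could do.
with open(path, "rb") as fb:
    encoding, _ = tokenize.detect_encoding(fb.readline)

with open(path, encoding=encoding) as fp:
    lines = fp.readlines()  # 0x94 decodes cleanly as cp1252

print(encoding)
os.remove(path)
```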

Stefan

_______________________________________________
Cython-dev mailing list
[email protected]
http://codespeak.net/mailman/listinfo/cython-dev
