Dan Stromberg, 01.09.2011 19:56:
On Tue, Aug 30, 2011 at 10:05 AM, Guido van Rossum wrote:
  The problem lies with the PyPy backend -- there it generates ctypes
code, which means that the signature you declare to Cython/Pyrex must
match the *linker* level API, not the C compiler level API. Thus, if
in a system header a certain function is really a macro that invokes
another function with a permuted or augmented argument list, you'd
have to know what that macro does. I also don't see how this would
work for #defined constants: where does Cython/Pyrex get their value?
ctypes doesn't have their values.

So, for PyPy, a solution based on Cython/Pyrex has many of the same
downsides as one based on ctypes when it comes to complying with an
API defined by a .h file.
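To make the ctypes side of that concrete, here is a minimal sketch of the constants problem. It assumes a POSIX system where `ctypes.CDLL(None)` loads the process's C library, and it hard-codes `O_RDONLY = 0` (the Linux value) precisely because ctypes never reads `<fcntl.h>` and so cannot supply it:

```python
import ctypes

# Load the C library; passing None works on Linux/macOS, elsewhere you
# may need ctypes.util.find_library("c").
libc = ctypes.CDLL(None, use_errno=True)

# The header declares open(2) taking an O_* flag, but ctypes never sees
# <fcntl.h>: the value of O_RDONLY has to be duplicated by hand (0 on
# Linux -- an assumption baked into this snippet, not read from the header).
O_RDONLY = 0

libc.open.argtypes = [ctypes.c_char_p, ctypes.c_int]
libc.open.restype = ctypes.c_int

fd = libc.open(b"/dev/null", O_RDONLY)
print("open returned fd", fd)
libc.close(fd)
```

The call succeeds, but only because the constant was copied out of the header by hand; if the platform defined it differently, nothing in ctypes would notice.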

It's certainly a harder problem.

For most simple constants, Cython/Pyrex might be able to generate a series
of tiny C programs with which to find CPP symbol values:

#include <stdio.h>
#include "file1.h"
...
#include "filen.h"

int main(void)
{
    printf("%d", POSSIBLE_CPP_SYMBOL1);
    return 0;
}

...and again with %f, %s, etc. The typing is quite a mess

The user will commonly declare #defined values as typed external variables and callable macros as functions in .pxd files. These manually typed "macro" functions let users tell Cython what it should know about how the macros will be used, which would allow it to generate C/C++ glue code that uses the declared types as a real function signature and calls the macro underneath.

and code fragments would probably be impractical.

Not necessarily at the C level, but certainly for a ctypes backend, yes.
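The probe-program idea above can be sketched as a small generator. The function name `probe_source` and its arguments are invented here for illustration; compiling and running the emitted program (once per candidate format, %d, %f, %s, ...) is left to a real C toolchain:

```python
def probe_source(headers, symbol, fmt="%d"):
    """Build a tiny C program that prints the value of one CPP symbol.

    headers -- header file names to #include (e.g. ["file1.h", "filen.h"])
    symbol  -- the preprocessor symbol to probe
    fmt     -- printf format to try: "%d" first, then "%f", "%s", ...
    """
    includes = "\n".join(f'#include "{h}"' for h in headers)
    return (
        "#include <stdio.h>\n"
        f"{includes}\n"
        "int main(void)\n"
        "{\n"
        f'    printf("{fmt}", {symbol});\n'
        "    return 0;\n"
        "}\n"
    )

src = probe_source(["file1.h"], "POSSIBLE_CPP_SYMBOL1")
print(src)
```

A driver would compile each variant and keep the first one that builds cleanly, which is exactly where the typing mess shows up: a symbol that happens to compile under several formats gives no hint which type the header author intended.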


But hopefully clang has something that'd make this easier.

For figuring these things out, maybe. Not so much for solving the problems they introduce.
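For the "figuring things out" part, one hook that already exists is the preprocessor's `-dM -E` mode, which both clang and gcc support: it dumps every macro definition visible after preprocessing as `#define NAME VALUE` lines. A hedged sketch (the helper names are invented, and actually running the command requires a compiler on PATH):

```python
import shlex

def macro_dump_command(header, compiler="clang"):
    # "-dM -E" makes clang (and gcc) preprocess the file and print every
    # visible macro as "#define NAME VALUE" instead of the normal output.
    return [compiler, "-dM", "-E", header]

def parse_macros(dump):
    # Turn "#define NAME VALUE" lines into a {NAME: VALUE} dict.
    macros = {}
    for line in dump.splitlines():
        parts = line.split(None, 2)
        if len(parts) >= 2 and parts[0] == "#define":
            macros[parts[1]] = parts[2] if len(parts) == 3 else ""
    return macros

cmd = macro_dump_command("file1.h")
print(shlex.join(cmd))

sample = "#define POSSIBLE_CPP_SYMBOL1 42\n#define EMPTY\n"
print(parse_macros(sample))
```

This recovers the textual expansion of each macro, which is the easy half; deciding what C type that text denotes, or what a function-like macro does to its arguments, is the part no dump solves.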

Stefan

_______________________________________________
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev