I've recently been working on generating C functions on the fly that inline the C code necessary to implement the bytecode in a given Python function. For example, this bytecode:
    >>> dis.dis(f)
      2           0 LOAD_FAST                0 (a)
                  3 LOAD_CONST               1 (1)
                  6 BINARY_ADD
                  7 RETURN_VALUE

is transformed into this rather boring bit of C code:

    #include "Python.h"
    #include "code.h"
    #include "frameobject.h"
    #include "eval.h"
    #include "opcode.h"
    #include "structmember.h"
    #include "opcode_mini.h"

    PyObject *
    _PyEval_EvalMiniFrameEx(PyFrameObject *f, int throwflag)
    {
        static int jitting = 1;
        PyEval_EvalFrameEx_PROLOG1();
        co = f->f_code;
        PyEval_EvalFrameEx_PROLOG2();
        oparg = 0; LOAD_FAST_IMPL(oparg);
        oparg = 1; LOAD_CONST_IMPL(oparg);
        BINARY_ADD_IMPL();
        RETURN_VALUE_IMPL();
        PyEval_EvalFrameEx_EPILOG();
    }

The PROLOG1, PROLOG2 and EPILOG macros are just chunks of code from PyEval_EvalFrameEx. I have the code compiling and linking, and dlopen and dlsym seem to work, returning apparently valid pointers, but when I try to call the function I get:

    Program received signal EXC_BAD_ACCESS, Could not access memory.
    Reason: KERN_PROTECTION_FAILURE at address: 0x0000000c
    0x0058066d in _PyEval_EvalMiniFrameEx (f=0x230d30, throwflag=0)
        at MwDLSf.c:17

Line 17 is the PROLOG1 macro, so I presume it barfed on the very first instruction. (This is all on an Intel Mac running Leopard, BTW.) Here are the commands generated to compile and link the C code:

    gcc -fno-strict-aliasing -DNDEBUG -g -fwrapv -O3 -Wall \
        -Wstrict-prototypes -g -DPy_BUILD_CORE -DNDEBUG \
        -I/Users/skip/src/python/py3k-t/Include \
        -I/Users/skip/src/python/py3k-t -c dTd5cl.c \
        -o /tmp/MwDLSf.o
    gcc -L/opt/local/lib -bundle -undefined dynamic_lookup -g \
        /tmp/dTd5cl.o -L/Users/skip/src/python/py3k-t -lpython3.1 \
        -o /tmp/MwDLSf.so

(It just uses the distutils compiler module to build .so files.) The .so file looks more-or-less OK:

    % otool -L /tmp/MwDLSf.so
    /tmp/MwDLSf.so:
            /usr/lib/libSystem.B.dylib (compatibility version 1.0.0, current version 111.1.3)

though nm doesn't show any undefined _Py* symbols, so I suspect I'm not linking it correctly. The Python executable was built without --enable-shared.
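For what it's worth, the bytecode-to-C transformation above can be sketched in a few lines with the `dis` module. This is a minimal illustration only: it assumes a hypothetical `<OPNAME>_IMPL` macro exists for every opcode, and the opcode names it emits depend on the running Python version, so the output won't match the example body byte for byte.

```python
import dis

def emit_c(func):
    """Sketch: emit one C statement per bytecode instruction,
    assuming a hypothetical <OPNAME>_IMPL macro per opcode."""
    lines = []
    for ins in dis.get_instructions(func):
        if ins.arg is not None:
            # Mirror the "oparg = N; OP_IMPL(oparg);" pattern
            # from the generated function above.
            lines.append(f"oparg = {ins.arg}; {ins.opname}_IMPL(oparg);")
        else:
            lines.append(f"{ins.opname}_IMPL();")
    return "\n".join(lines)

def f(a):
    return a + 1

print(emit_c(f))
```

On any recent Python this prints a body containing LOAD_FAST and RETURN_VALUE statements, with the prolog/epilog still to be wrapped around it by the caller.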
I've tried building with that config flag, but that just gives me fits during debugging, because it always wants to find libpython in the installation directory even when I'm running python.exe from the build directory. Installing is a little tedious because it relies on a properly functioning interpreter.

dlopen is called very simply:

    handle = dlopen(shared, RTLD_NOW);

I used RTLD_NOW because that's what sys.getdlopenflags() returns. I'm not calling dlclose for the time being.

I'm not exactly sure where I should go from here. I'd be more than happy to open an item in the issue tracker; I was hoping to get something a bit closer to working before doing that, though. The failure to properly load the compiled function makes it pretty much impossible to debug the generated code beyond what the compiler can tell me.

Any suggestions?

Skip
_______________________________________________
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev