Forgot reply-all... didn't we have this discussion before about making that the default for this list, as it is by far the most commonly desired behavior?
On Thu, Feb 17, 2011 at 3:53 PM, Robert Bradshaw
<rober...@math.washington.edu> wrote:
> On Thu, Feb 17, 2011 at 3:12 PM, W. Trevor King <wk...@drexel.edu> wrote:
>> On Thu, Feb 17, 2011 at 01:25:10PM -0800, Robert Bradshaw wrote:
>>> On Thu, Feb 17, 2011 at 5:29 AM, W. Trevor King <wk...@drexel.edu> wrote:
>>> > On Wed, Feb 16, 2011 at 03:55:19PM -0800, Robert Bradshaw wrote:
>>> >> On Wed, Feb 16, 2011 at 8:17 AM, W. Trevor King <wk...@drexel.edu> wrote:
>>> >> > What I'm missing is a way to bind the ModuleScope namespace to a name
>>> >> > in expose.pyx so that commands like `dir(mylib)` and `getattr(mylib,
>>> >> > name)` will work in expose.pyx.
>>> >>
>>> >> You have also hit on the thorny issue that .pxd files are used for
>>> >> many things. They may be pure C library declarations with no Python
>>> >> module backing, they may be declarations of (externally implemented)
>>> >> Python modules (such as numpy.pxd), or they may be declarations for
>>> >> Cython-implemented modules.
>>> >>
>>> >> Here's another idea: what if extern blocks could contain cpdef
>>> >> declarations, which would automatically generate Python-level
>>> >> wrappers for the declared members (if possible, otherwise an error)?
>>> >
>>> > Ah, this sounds good! Of the three .pxd roles you list above,
>>> > external Python modules (e.g. numpy) and Cython-implemented modules
>>> > (e.g. matched .pxd/.pyx) both already have a presence in Python-space.
>>> > What's missing is a way to give (where possible) declarations of
>>> > external C libraries a Python presence. cpdef fills this hole nicely,
>>> > since its whole purpose is to expose Python interfaces to
>>> > C-based elements.
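[Editor's note: a sketch of the syntax being proposed above. This is hypothetical — `cpdef` inside `cdef extern` blocks was not legal Cython at the time of this thread, and `mylib.h`, `mylib_version`, and `Point` are invented names for illustration.]

```cython
# Hypothetical: cpdef inside a cdef extern block would ask Cython to
# generate Python-visible wrappers alongside the C declarations.
cdef extern from "mylib.h":
    # Plain cdef: visible to Cython code only, as today.
    cdef void internal_helper()

    # Proposed: also generate a Python-callable wrapper, so Python
    # code could do `from mylib import mylib_version`.
    cpdef int mylib_version()

    # Proposed: generate a Python wrapper class with coercing
    # accessors for the struct members.
    cpdef struct Point:
        int x
        int y
```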
>>>
>>> In the case of external Python modules, I'm not so sure we want to
>>> monkey-patch our stuff in
>>
>> I don't think any of the changes we are suggesting would require
>> changes to existing code, so .pxd-s with external implementations
>> wouldn't be affected unless they brought the changes upon themselves.
>
> Say, in numpy.pxd, I have
>
>     cdef extern from "...":
>         cpdef struct obscure_internal_struct:
>             ...
>
> Do we add an "obscure_internal_struct" onto the (global) numpy module?
> What if it conflicts with a (runtime) name? This is the issue I'm
> bringing up.
>
>>> (and where would we do it--on the first import of a cimporting
>>> module?)
>>
>> Compilation is an issue. I think that .pxd files should be able to be
>> cythoned directly, since then Cython can build any wrappers they
>> request. If the file has a matching .pyx file, cythoning either one
>> should compile both together, since they'll produce a single Python
>> .so module.
>
> In this case, I think it may make more sense to consider cimporting
> stuff from .pyx files if no .pxd file is present. In any case, this is
> a big change. I don't like the idea of always creating such a module
> (it may be empty, or have name conflicts), nor the idea of
> conditionally compiling it (does one have to look at the contents and
> really understand Cython to see if a Python shadow will be created?).
>
>>> > A side effect of this cpdef change would be that now even bare .pxd
>>> > files (no matching .pyx) would have a Python presence,
>>>
>>> Where would it live? Would we just create this module (in essence,
>>> acting as if there was an empty .pyx file sitting there as well)? On
>>> this note, it may be worth pursuing the idea of a "cython helper"
>>> module where common code and objects could live.
>>
>> I'm not sure exactly what you mean by "cython helper", but this sounds
>> like my "bare .pxd can create a Python .so module" idea above.
>
> I'm thinking of a place to put, e.g.,
> the generator and bind-able
> function classes, which are now re-implemented in every module that
> uses them. I think there will be more cases like this in the future
> rather than fewer. C-level code could be #included and linked from
> "global" stores as well. However, that's somewhat tangential.
>
>>> > so you could do
>>> > something like
>>> >
>>> >     cimport mylib as mylib_c
>>> >     import mylib as mylib_py
>>> >     import sys
>>> >
>>> >     # Access through Python
>>> >     for name in dir(mylib_py):
>>> >         setattr(sys.modules[__name__], name, getattr(mylib_py, name))
>>>
>>> I think this smells worse than "import *".
>>
>> Aha, thanks ;). I was stuck in my old
>> .pxd-files-don't-create-modules-by-themselves mindset. Obviously,
>> once they do, any Python code can access the contents directly and I
>> can throw out all this indirection.
>>
>>> > However, from Parsing.py:2369:
>>> >
>>> >     error(pos, "C struct/union/enum cannot be declared cpdef")
>>> >
>>> > From pyrex_differences.rst:
>>> >
>>> >     If a function is declared :keyword:`cpdef` it can be called from
>>> >     and overridden by both extension and normal python subclasses.
>>> >
>>> > I believe the reason that cpdef-ed enums and similar are currently
>>> > illegal is confusion between "can be called from Python" and "can be
>>> > overridden from Python".
>>>
>>> The reason that error statement is there is because it had no meaning,
>>> so an error was better than just ignoring it.
>>
>> Why does it have no meaning? I understand that it's not implemented
>> yet, but a cpdef-ed enum or struct seems just as valid an idea as a
>> cpdef-ed method.
>
> We're only deciding what the meaning is now. I agree it's a valid
> idea, it's just that no one had even considered it yet. (I'll also
> concede that it's a much more natural idea for someone who came to the
> language once cpdef was already implemented.)
>
>>> > Unions don't really have a Python parallel,
>>>
>>> They can be a cdef class wrapping the union type.
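[Editor's note: the setattr re-export loop quoted above is plain Python and runs as-is against any module object; here is a self-contained version using the stdlib `math` module as a stand-in for the compiled `mylib` shadow module (`mylib_py` in the thread).]

```python
import math as mylib_py  # stand-in for `import mylib as mylib_py`
import sys

# Re-bind every public name from mylib_py as a global of this module --
# exactly the indirection Trevor sketches (and Robert dislikes).
_this_module = sys.modules[__name__]
for name in dir(mylib_py):
    if not name.startswith("_"):
        setattr(_this_module, name, getattr(mylib_py, name))

# The copied names are now module-level globals of this module:
print(sqrt(16.0))  # 4.0
```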
>>
>> But I would think coercion would be difficult. Unions are usually (in
>> my limited experience) for "don't worry about the type, just make sure
>> it fits in X bytes". How would union->Python conversion work?
>
> There would be a wrapping type, e.g.
>
>     cdef class MyUnion:
>         cdef union_type value
>
> with a bunch of setters/getters for the values, just like there are
> for structs. (In fact, the same code would handle structs and unions.)
>
> This is getting into wrapper-generator territory, but I'm starting
> to think for simple things it might be worth it.
>
>>> >     cpdef struct Foo:
>>> >         cpdef public int intA
>>> >         cpdef readonly int intB
>>> >         cdef void *ptr
>>> >
>>> > We would both declare the important members of the C struct (as we can
>>> > already do in Cython) and also have Cython automatically generate a
>>> > Python class wrapping the struct (because of `cpdef struct`). The
>>> > Python class would have:
>>> >
>>> > * A Cython-generated getter/setter for intA (because of `cpdef public`)
>>> >   using the standard Python<->int coercion.
>>> > * A similar Cython-generated getter for intB (because of `cpdef
>>> >   readonly`).
>>> > * No Python access to ptr (standard C access still possible through
>>> >   Cython).
>>> >
>>> > Doing something crazy like `cdef public void *ptr` would raise a
>>> > compile-time error.
>>>
>>> Yes, all of the above was exactly what I was proposing.
>>>
>>> > I'm definitely willing to help out with this (if someone will point me
>>> > in the right direction),
>>>
>>> That would be great.
>>
>> OK, I think we're pretty much agreed ;). I think that the next step
>> is to start working on implementations of:
>>
>> * Stand-alone .pxd -> Python module
>
> I'm not sure we're agreed on this one.
>
>> * Extending class cdef/cpdef/public/readonly handling to cover enums,
>>   structs, and possibly unions.
>
> This seems like the best first step.
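[Editor's note: the wrapper that `cpdef struct Foo` would auto-generate can already be written by hand in Cython. A sketch, using the thread's `Foo` layout and old-style `property` blocks (illustrative only, not generated code):]

```cython
cdef struct Foo:
    int intA
    int intB
    void *ptr

# Hand-written equivalent of the wrapper Cython would emit for
# `cpdef struct Foo` with `cpdef public intA` / `cpdef readonly intB`.
cdef class PyFoo:
    cdef Foo value

    property intA:              # `cpdef public`: getter and setter
        def __get__(self):
            return self.value.intA
        def __set__(self, int v):
            self.value.intA = v

    property intB:              # `cpdef readonly`: getter only
        def __get__(self):
            return self.value.intB

    # `ptr` is cdef-only: no Python-level accessor is generated.
```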
>
>> Problems with me getting started now:
>>
>> * I don't know where I should start mucking about in the source.
>
> I would start by (1) creating/declaring cdef classes for cdef
> structs/unions/enums (not textually; within, e.g.,
> CStructOrUnionDefNode.analyse_declarations(...) in Nodes.py) and then
> (2) making sure they get added to the module's namespace (look at
> ModuleNode.py, where the classes get added). The other files to look at
> would be Parsing.py, PyrexTypes.py, and Symtab.py. You could try to
> patch up parsing first, but I think it'd be more effective to make
> this the default during development, as then all the tests would
> exercise your code, and it's probably both easier to dive in there and
> would give you a better idea of how best to handle the parsing.
>
> Also, the creation of cpdef functions is a bit hackish (for historical
> reasons) and something we want to clean up, so I wouldn't go to great
> lengths to try to emulate that.
>
>> * I don't know how to handle things like dummy enums (perhaps by
>>   requiring all cdef-ed enums to be named).
>
> All enums in C are named.
>
>> * I'm going to go watch a movie ;).
>
> :)
>
>> It's good to be moving forward!
>
> Yep. You might want to consider creating a GitHub fork.
>
> - Robert
>
_______________________________________________
cython-devel mailing list
cython-devel@python.org
http://mail.python.org/mailman/listinfo/cython-devel