Responding to my own post in case the outcome is useful to anyone. I have been perplexed for a while by the Python pyext (C++) implementation of protobuf, and I think I finally figured out how it was intended to work:
python_protobuf.h declares the GetCProtoInsidePyProto and MutableCProtoInsidePyProto methods. python_protobuf.cc defines these methods as wrappers around a pair of global function pointers (GetCProtoInsidePyProtoPtr and MutableCProtoInsidePyProtoPtr). These pointers are initialized to dummy 'stub' functions which simply return NULL (GetCProtoInsidePyProtoStub and MutableCProtoInsidePyProtoStub). python-proto2.cc contains the 'real' implementations of these methods; when the Python init function for that module runs, it repoints the global function pointers at its concrete implementations.

So far, this all made sense... except for one problem: both python_protobuf.cc and python-proto2.cc get compiled into the Python protobuf extension (_net_proto2___python.so). Because of this, all the fancy pointer magic doesn't really make sense. If you're linking against the concrete implementation directly, then you shouldn't need to go through the function pointers; you could just call the implementations directly. And you can't get access to the function pointers without linking directly against the _net_proto2___python.so library (or doing some dlopen voodoo). This is what led me to my previous post.

I have since noticed that python_protobuf.cc looks like it was written with the idea that it would live in a separate library from the actual pyext code (probably the main protobuf library itself?). That arrangement would provide access to GetCProtoInsidePyProto and MutableCProtoInsidePyProto simply by linking against the main protobuf library.

For the moment, I have patched my local protobuf build to stop compiling python_protobuf.cc into the pyext extension (_net_proto2___python.so) and to compile it into a separate library (libpython_protobuf.so) instead. The pyext library is then linked against libpython_protobuf.so. By linking my own Python extension module against libpython_protobuf.so as well, I can now make use of these methods.
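For anyone who wants to see the shape of the indirection described above, here is a minimal self-contained sketch in C++. It is not the actual protobuf source: the PyObject/Message stand-in types, the backing message, and the RegisterConcreteImpl function are placeholders I invented so the pattern compiles on its own; only the wrapper/pointer/stub naming follows the description above.

```cpp
#include <cassert>
#include <cstddef>

// Hypothetical stand-ins for PyObject and proto2::Message, just so the
// sketch is self-contained. The real code uses the Python and protobuf types.
struct PyObject {};
struct Message {};

// --- "python_protobuf.cc" side: stub + global function pointer + wrapper ---

// Stub used until a real implementation registers itself.
static Message* MutableCProtoInsidePyProtoStub(PyObject* /*msg*/) {
  return NULL;  // nothing registered yet
}

// Global function pointer, initialized to point at the stub.
static Message* (*MutableCProtoInsidePyProtoPtr)(PyObject*) =
    MutableCProtoInsidePyProtoStub;

// Public wrapper (declared in the header); this is what external code links to.
Message* MutableCProtoInsidePyProto(PyObject* msg) {
  return MutableCProtoInsidePyProtoPtr(msg);
}

// --- "python-proto2.cc" side: the concrete implementation ---

static Message g_backing_message;  // placeholder for the C++ message backing
                                   // a Python message object

static Message* MutableCProtoInsidePyProtoImpl(PyObject* /*msg*/) {
  return &g_backing_message;
}

// Stand-in for what the module's Python init function does: repoint the
// global function pointer at the concrete implementation.
void RegisterConcreteImpl() {
  MutableCProtoInsidePyProtoPtr = MutableCProtoInsidePyProtoImpl;
}
```

The point of the pattern is that callers link only against the wrapper, which works (returning NULL) even when the concrete implementation is absent, and starts returning real messages once the implementing module has loaded and registered itself. That only pays off if the wrapper lives in a different library than the implementation, which is exactly the problem described above.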
Hopefully this is all going to be made irrelevant by having this code migrate to another Google-sanctioned library at a later time.

Moral of the story: when code is marked 'EXPERIMENTAL', it might not be 'done' yet =)

Alex

On Jul 6, 11:32 am, A Richardson <[email protected]> wrote:
> I am writing a python extension (in C++) which needs to work with
> protobufs. I would also like to make use of the experimental C++/
> Python extension in my application. (i.e. I would like my extension to
> handle python protobuf messages which are actually backed by c++
> messages.)
>
> Is there a reasonable way to do this right now? It appears that the
> secret to doing this would lie in gaining access to the
> MutableCProtoInsidePyProto method which exists down inside the
> protobuf pyext code. However, this method is only implemented in the
> internal _net_proto2___python.so library, which is a python extension
> itself, and not suitable for 'general' linking to another library
> (e.g. it has no soname defined).
>
> It seems like maybe this could be solved by creating a separate
> 'implementation' library which would contain the implementation of
> MutableCProtoInsidePyProto (and its sibling method). This would also
> require moving the CMessage struct out of the python-proto2.cc file
> and into a header file so it could be used for both the python
> extension, and the implementation library.
>
> Thoughts?
>
> Alex

--
You received this message because you are subscribed to the Google Groups "Protocol Buffers" group.
To post to this group, send email to [email protected].
To unsubscribe from this group, send email to [email protected].
For more options, visit this group at http://groups.google.com/group/protobuf?hl=en.
