Well, in the case of wrapping large, legacy C++ libraries, the choice
is not easy; it depends on many factors. I'll try to elaborate.

For something like MPI and PETSc, as in the case of mpi4py and
petsc4py, which are both being converted to Cython, the choice is
really clear to me: just use Cython; do not bother with SWIG.

Why use Cython? Their native APIs are just C (well, MPI has a
standard C++ API, but it is not much more featured than the C one).
Wrapping a C API and exposing nothing but C-ish functions at the
Python level is unpythonic and as painful to use as the plain C. You
really need a truly OO Python API; otherwise, your Python-level API is
little better than the C one and hard to use, and perhaps the only
thing you gain is error checking. Building an OO API for Python with
Cython, calling your legacy library's C API directly, is easier,
faster, and more convenient than using SWIG and then layering a
better-looking, high-level, pythonic API on top of the SWIG-generated
code.
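As a rough sketch of what I mean (every name here, from mylib.h to
MyLib_Create, is a hypothetical stand-in for a legacy C library, and
the error codes are assumed to follow the usual 0-on-success
convention):

```cython
# Hypothetical flat C API, declared once for Cython's benefit.
cdef extern from "mylib.h":
    ctypedef struct MyHandle:
        pass
    int MyLib_Create(MyHandle** h)
    int MyLib_SetValue(MyHandle* h, double v)
    int MyLib_Destroy(MyHandle** h)

cdef class Thing:
    """Object-oriented, pythonic facade over the flat C calls."""
    cdef MyHandle* h

    def __cinit__(self):
        if MyLib_Create(&self.h) != 0:
            raise RuntimeError("MyLib_Create failed")

    def __dealloc__(self):
        if self.h != NULL:
            MyLib_Destroy(&self.h)

    def set_value(self, double v):
        # Error checking happens once, here, not at every call site.
        if MyLib_SetValue(self.h, v) != 0:
            raise RuntimeError("MyLib_SetValue failed")
```

The raw C calls stay hidden: users only ever see the Thing class,
lifetime management is automatic, and error checking lives in exactly
one place.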

In the case of a large C++ API, it depends on how much of that API it
makes sense to expose to Python. Suppose we have an API with many
classes and many methods (say 10 to 30 classes, with 10 to 30 methods
each). Wrapping the full API with Cython in its current state is going
to be a real pain: too much manual work. BUT if you are going to use
only a few of your classes, and only expose a few of their methods, or
even want to expose new methods that are more convenient to use on the
Python side, then, again, I would definitely choose Cython.
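Since Cython cannot yet call C++ directly, the usual pattern for such
a partial wrap is a thin extern "C" shim around the few C++ methods
you care about, with a Cython class on top. A sketch, with all names
(detector_c.h, Detector_*) invented for illustration:

```cython
# Hypothetical C shim: detector_c.h declares plain-C functions,
# implemented in a small .cpp file, around one C++ class.
cdef extern from "detector_c.h":
    ctypedef void* DetectorRef
    DetectorRef Detector_New()
    void Detector_Delete(DetectorRef d)
    int Detector_Run(DetectorRef d, double threshold)

cdef class Detector:
    cdef DetectorRef d

    def __cinit__(self):
        self.d = Detector_New()

    def __dealloc__(self):
        if self.d != NULL:
            Detector_Delete(self.d)

    def run(self, double threshold=0.5):
        # One convenient entry point instead of the full C++ surface;
        # the dozens of other C++ methods simply never get a shim.
        return Detector_Run(self.d, threshold)
```

The cost is one shim function per exposed method, which is exactly why
a full wrap is painful but a partial wrap is cheap.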

My personal experience in going from SWIG to Cython is worth taking
into account:

Implementing really good, robust, full-featured, pythonic access to
the full PETSc functionality for petsc4py took me about two years of
careful writing of typemaps, implementing Python extension types BY
HAND, and writing a lot of C code inside custom typemaps, with Python
and NumPy C API calls plus PETSc calls.
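For contrast, a single one of those hand-written typemaps might look
roughly like this (illustrative only, not taken from petsc4py; it
converts a NumPy array argument into a raw pointer/length pair, and a
real version would also need a matching %typemap(freearg) to release
the temporary array):

```swig
/* Hypothetical SWIG typemap: map a NumPy array from Python onto the
   (double* x, int n) pair expected by the underlying C call. */
%typemap(in) (double* x, int n) {
    PyArrayObject* arr = (PyArrayObject*)
        PyArray_ContiguousFromObject($input, NPY_DOUBLE, 1, 1);
    if (arr == NULL) SWIG_fail;
    $1 = (double*) PyArray_DATA(arr);
    $2 = (int) PyArray_DIM(arr, 0);
}
```

Multiply fragments like this by every argument pattern in a large API
and the two years start to make sense.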

Then, after the success with Cython in the mpi4py case, I started
rewriting petsc4py from scratch, working at night, say from 9:00 PM
until 1:00 AM or 2:00 AM. In about 10 days, yes, JUST 10 DAYS of night
work, I am close to having implemented all the features of the
previous implementation, and the new implementation is actually better
and more featured.


In short, if you really, really, really want to wrap the FULL C++
library, then the Cython way will be harder than the SWIG way. But I
really doubt that you need access to the FULL C++ API. One thing you
should take for granted, though: the Cython way will let you write
better-looking, more easily maintainable source code.

That's all, folks!

Regards,



On 5/30/08, Fernando Perez <[EMAIL PROTECTED]> wrote:
> [ Lisandro, I'm cc'ing you here because we've just had some
>  discussions with folks from Los Alamos NL, Berkeley and UC Santa Cruz
>  on the issue of SWIG vs. other options for wrapping a large C++
>  library (a high performance computer vision library, in this case).  I
>  suggested SWIG due to its maturity for automatic wrapping (despite the
>  pain of typemaps) but your recent experience in transitioning away
>  from it is in my view a critically useful piece of information at this
>  point]
>
>  Hi folks,
>
>
>  just to give you a counterpoint from what I said yesterday.  While
>  SWIG has the advantage of allowing automatic wrapping of large
>  libraries, it is also true that it is unwieldy in some ways, and the
>  two-layer overhead is impossible to escape.  Cython continues to
>  improve for this type of task, and I think the message below is worth
>  reading carefully before you embark on any long SWIG exercise.
>  Lisandro (CC'd here)  is an extremely knowledgeable developer and both
>  Petsc and MPI are large, complex projects (Petsc comes from Argonne
>  NL). The fact that his recent experiences with Cython wrapping have
>  been so good is in my view an important data point.
>
>  If you begin looking in this direction, the Cython mailing list is a
>  very helpful resource.  You can also download the mpi4py sources to
>  have a look at what Lisandro has done.
>
>  Best,
>
>  f
>
>  ---------- Forwarded message ----------
>  From: Lisandro Dalcin <[EMAIL PROTECTED]>
>  Date: Fri, May 30, 2008 at 11:58 AM
>  Subject: [Cython] about SWIG versus Cython/Pyrex
>  To: Daniele Pianu <[EMAIL PROTECTED]>
>  Cc: Cython-dev <[email protected]>
>
>
>  I saw a mail from you, forwarded by W. Stein to the Cython-dev list,
>  saying that you are trying to measure performance for SWIG vs.
>  Cython/Pyrex.
>
>  Well, I'm a rather experienced SWIG user, and now, after about a
>  month, I believe I'm a rather good Cython user.
>
>  My main interest is parallel distributed computing with Python. For
>  some time, I've been developing two very important (at least to me!)
>  packages, mpi4py and petsc4py (just google for the links).
>
>  For your purposes, I believe petsc4py could be interesting. The C
>  API of PETSc is wrapped with SWIG, but then that wrapper is used from
>  Python code to implement a fully OO, pythonic API. I've heavily
>  hacked the way SWIG implements its infamous 'this': I do not rely
>  on that mechanism for passing objects between the Python and C
>  layers; instead, I've implemented a full base type object in C. All
>  of this was necessary because the normal SWIG way was not enough
>  for me.
>
>  But now, after falling in love with Cython, I've started to port (in
>  fact, rewrite from scratch) all my work to use Cython. The petsc4py
>  port is not yet ready for the public, but it is in good shape for
>  testing.
>
>  And now, the interesting part. I've written some tests for my thesis
>  presentation (it will be in about a month), crafting numerical
>  benchmarks that stress the overhead of passing objects back and
>  forth between the C layer and the Python layer. All the actual
>  numerical computing is done in C or Fortran 90, and all runs are
>  sequential, not parallel.
>
>  This testing is done, with code and results available, but only for
>  the former SWIG-based version of petsc4py. The problem is to solve,
>  with Krylov iterative methods, a linear system of equations arising
>  from a finite-difference discretization of the 3D Poisson problem.
>  BUT note that I never build a sparse matrix; instead I use a
>  'matrix-free' formulation which implements the matrix-vector product
>  A*x -> y as a function F(x) -> y, so it is raw computing with
>  Fortran 90 arrays.
>
>  If you are interested in taking a look at this, I can send you some
>  PDF pages describing all of these tests, as well as the code
>  implementing them. And if you want to go further, I can perhaps
>  find some spare time to help you implement all of this testing with
>  the new Cython-based version of petsc4py that I'm writing.
>
>  I expect that my SWIG implementation will run faster than a normal
>  SWIG wrapper (as I've hacked it in many ways to get it faster). And I
>  expect that the new Cython-based version will be faster still.
>
>  I really believe this is a very good test case for comparing SWIG
>  vs. Cython: as all the numerical computing is actually done in C, you
>  end up measuring just the overhead of passing objects around.
>
>
>  --
>  Lisandro Dalcín
>  ---------------
>  Centro Internacional de Métodos Computacionales en Ingeniería (CIMEC)
>  Instituto de Desarrollo Tecnológico para la Industria Química (INTEC)
>  Consejo Nacional de Investigaciones Científicas y Técnicas (CONICET)
>  PTLC - Güemes 3450, (3000) Santa Fe, Argentina
>  Tel/Fax: +54-(0)342-451.1594
>  _______________________________________________
>  Cython-dev mailing list
>  [email protected]
>  http://codespeak.net/mailman/listinfo/cython-dev
>


-- 
Lisandro Dalcín
