On Mon, Dec 3, 2012 at 11:59 AM, Michael Droettboom <md...@stsci.edu> wrote:

>>> but some of that complexity could be reduced by using Numpy arrays in place 
>>> of the
>>> image buffer types that each of them contain
>> OR Cython arrays and/or memoryviews -- this is indeed a real strength of 
>> Cython.
>
> Sure, but when we return to Python, they should be Numpy arrays which
> have more methods etc. -- or am I missing something?

Cython makes it really easy to switch between ndarrays and
memoryviews, etc. -- it's a question of what you want to work with in
your code, so you can write a function that takes numpy arrays and
returns numpy arrays, but uses a memoryview internally (and passes
the data to C that way). But I'm not an expert on this; I've found that
I'm either doing simple stuff where using numpy arrays directly works
fine, or passing the pointer to the data array off to C:

cimport numpy as cnp

def a_function_to_call_C( cnp.ndarray[double, ndim=2, mode="c"] in_array ):
    """
    calls the_c_function (declared elsewhere via a cdef extern block),
    altering the array in-place
    """
    cdef int m, n
    m = in_array.shape[0]
    n = in_array.shape[1]
    the_c_function( &in_array[0, 0], m, n )
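The memoryview style mentioned above would look something like this --
an untested sketch, where "scale.h" and scale_in_place are just
placeholders for illustration, not a real library:

import numpy as np

# "scale.h" / scale_in_place are placeholders, not a real library
cdef extern from "scale.h":
    void scale_in_place(double* data, int m, int n, double factor)

def scale_array(double[:, ::1] in_view, double factor):
    """
    Accepts anything exposing a contiguous 2-D double buffer (a numpy
    array, typically), hands the raw pointer off to C, and returns a
    numpy array viewing the same data.
    """
    cdef int m = in_view.shape[0]
    cdef int n = in_view.shape[1]
    scale_in_place(&in_view[0, 0], m, n, factor)
    return np.asarray(in_view)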

>> It does support the C99 fixed-width integer types:
>> from libc.stdint cimport int16_t, int32_t
>>
> The problem is that Cython can't actually read the C header,

yeah, this is a pity. There has been some work on auto-generating
Cython from C headers, though nothing mature yet.  For my work, I've
been considering writing some simple .pxd-generating code, just to make
sure my data types stay in line with the C++ as it changes.

> so there
> are types in libpng, for example, that we don't actually know the size
> of.  They are different on different platforms.  In C, you just include
> the header.  In Cython, I'd have to determine the size of the types in a
> pre-compilation step, or manually determine their sizes and hard code
> them for the platforms we care about.

yeah -- this is a tricky problem. However, I think you can follow what
you'd do in C -- i.e. presumably the headers define their own data
types: png_uint_16 or whatever, and the actual definition is filled in
by the pre-processor. So I wonder if you can declare those types in
Cython, then have it write C code that uses those type names, and it
all gets resolved at compile time -- maybe. The key is that when you
declare stuff in Cython, the declaration is only used to determine how
to write the C code; I don't think the declarations themselves are
translated.
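
Something like this, if I'm remembering the libpng type names right
(untested sketch):

cdef extern from "png.h":
    # the "unsigned short" here is only a hint for Cython's type
    # checking -- the generated C code uses the name png_uint_16, so
    # the real size is whatever png.h says on that platform
    ctypedef unsigned short png_uint_16

def png_uint_16_size():
    # evaluated by the C compiler, after png.h has been included
    return sizeof(png_uint_16)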

> It would at least make this a more fair comparison to have the Cython
> code as Cythonic as possible.  However, I couldn't find any ways around
> using these particular APIs -- other than the Numpy stuff which probably
> does have a more elegant solution in the form of Cython arrays and
> memory views.

yup -- that's what I noticed right away -- I'm not sure if there is
easier handling of file handles.

> True.  We do have two categories of stuff using PyCXX in matplotlib:
> things that (primarily) wrap third-party C/C++ libraries, and things
> that are actually doing algorithmic heavy lifting.  It's quite possible
> we don't want the same solution for all.

And I'm not sure the wrappers all need to be written the same way, either.

-Chris
-- 

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115       (206) 526-6317   main reception

chris.bar...@noaa.gov
