Currently, wherever a C type is used in Cython, it is enough to specify 
roughly what class of type it is; for instance, the following works 
fine:

cdef extern ...:
     ctypedef int int8
     ctypedef int int16
     ctypedef int int32
     ctypedef int int64


However, when types are used in buffers:

cdef object[int16, 3] x

one must translate the type (dtype) into a character code (as listed in 
http://docs.python.org/lib/module-struct.html). For instance, signed 
short is "h" and signed int is "i", and int16 could be either of these; 
which one is decided by the included header file. (NumPy comes with a 
long list of headers defining intXX for a lot of cases, and you can 
choose between a C-dependent type like "npy_longlong" and a fixed-width 
type like "npy_int64".)
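
To make the ambiguity concrete, here is a compileable sketch in the 
spirit of what such headers do (the selection logic and the name 
"my_int16" are illustrative, not actual NumPy source):

#include <limits.h>

/* The same fixed-width alias can resolve to different C types,
 * depending on what the platform's headers decide at compile time.
 * Which format char is "right" for my_int16 follows from that
 * decision, which is invisible to Cython. */
#if USHRT_MAX == 0xffff
typedef short my_int16;   /* buffer/struct format char would be "h" */
#elif UINT_MAX == 0xffff
typedef int my_int16;     /* buffer/struct format char would be "i" */
#endif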

I see two options:
a) Start requiring exact ctypedefs in Cython, at least to get correct 
buffer behaviour. However, for NumPy this would require a lot of 
"#ifdef"s on the Cython side, since NumPy comes with two options for 
each type (a C-dependent one and a fixed-width one).

b) "Somehow" get the C compiler to map the C-compiler-resolved type to 
the char. With C++ this would be easy:

#include <iostream>

template <typename T> struct TypeChar {};
template <> struct TypeChar<short> {
    static const char value = 'h';
};
template <> struct TypeChar<int> {
    static const char value = 'i';
};

int main() {
    int t = 0;
    /* typeof is a GCC extension; decltype(t) is the C++11 spelling */
    std::cout << TypeChar<typeof(t)>::value;
    return 0;
}

However, I'm not sure whether any tricks are possible in C to get 
something like this?
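
One trick that might work, assuming a compiler supporting C11's 
_Generic selection (a strong assumption), is a minimal sketch like 
this:

#include <stdio.h>

/* Sketch only: _Generic dispatches on the compiler-resolved type of
 * its controlling expression, much like the template specializations
 * above. The controlling expression is never evaluated. */
#define TYPE_CHAR(x) _Generic((x), \
    short: 'h',                    \
    int:   'i',                    \
    long:  'l',                    \
    default: '?')

int main(void) {
    int t = 0;
    printf("%c\n", TYPE_CHAR(t));  /* prints "i" */
    return 0;
}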

-- 
Dag Sverre