[Numpy-discussion] Numpy Generalized Ufuncs: Pointer Arithmetic and Segmentation Faults (Debugging?)

2015-10-25 Thread eleanore.young
Dear Numpy maintainers and developers,

Thanks for providing such a great numerical library!

I’m currently trying to implement the Dynamic Time Warping metric as a set of 
generalised numpy ufuncs, but unfortunately, I have persistent issues with pointer 
arithmetic and segmentation faults. Is there any way that I can
use GDB or some such to debug a python/numpy extension? Furthermore: is it 
necessary to use pointer arithmetic to access the function arguments (as seen 
on http://docs.scipy.org/doc/numpy/user/c-info.ufunc-tutorial.html)
or is element access (operator[]) also permissible?

To break it down quickly, I need to have a fast DTW distance function 
dist_dtw() with two vector inputs (broadcasting should be possible), two scalar 
parameters and one scalar output (signature: (i), (j), (), () -> ()) usable in 
python for a 1-Nearest Neighbor classification algorithm. The extension also 
implements two functions compute_envelope() and piecewise_mean_reduction() 
which are used for lower-bounding based on Keogh and Ratanamahatana, 2005. The 
source code is available at http://pastebin.com/MunNaP7V and the prominent 
segmentation fault happens somewhere in the chain dist_dtw() -> meta_dtw_dist() 
-> slow_dtw_dist(), but I fail to pin it down.
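
To make the intended call pattern concrete, here is a rough sketch of the 1-NN step 
I am aiming for; a plain Euclidean distance stands in for dist_dtw(), and the shapes, 
labels and parameters are made up:

import numpy as np

rng = np.random.RandomState(0)
train = rng.rand(20, 50)               # 20 reference series of length 50
test = rng.rand(5, 50)                 # 5 query series of length 50
train_labels = rng.randint(0, 2, 20)   # class label per reference series

# Stand-in for: D = dist_dtw(test[:, None, :], train[None, :, :], window, penalty)
diff = test[:, None, :] - train[None, :, :]
D = np.sqrt((diff ** 2).sum(axis=-1))  # pairwise distances, shape (5, 20)

predicted = train_labels[D.argmin(axis=1)]   # 1-NN label for each query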

Aside from my primary questions, I wonder how to approach errors/exceptions and 
unit testing when developing numpy ufuncs. Are there any examples apart from 
the numpy manual that I could use as reference implementations of generalised 
numpy ufuncs?

I would greatly appreciate some insight into properly developing generalised 
ufuncs.

Best,
Eleanore

___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
https://mail.scipy.org/mailman/listinfo/numpy-discussion


[Numpy-discussion] ANN: Scipy 0.16.1 release

2015-10-25 Thread Ralf Gommers
Hi all,

I'm happy to announce the availability of the Scipy 0.16.1 release. This is
a bugfix-only release; it contains no new features compared to 0.16.0.

The sources and binary installers can be found at:

- Source tarballs: at https://github.com/scipy/scipy/releases and on PyPI.
- OS X: there are wheels on PyPI, so simply install with pip.
- Windows: .exe installers can be found on
https://github.com/scipy/scipy/releases

Cheers,
Ralf



==========================
SciPy 0.16.1 Release Notes
==========================

SciPy 0.16.1 is a bug-fix release with no new features compared to 0.16.0.


Issues closed for 0.16.1
------------------------

- `#5077 <https://github.com/scipy/scipy/issues/5077>`__: cKDTree not indexing properly for arrays with too many elements
- `#5127 <https://github.com/scipy/scipy/issues/5127>`__: Regression in 0.16.0: solve_banded errors out in patsy test suite
- `#5149 <https://github.com/scipy/scipy/issues/5149>`__: linalg tests apparently cause python to crash with numpy 1.10.0b1
- `#5154 <https://github.com/scipy/scipy/issues/5154>`__: 0.16.0 fails to build on OS X; can't find Python.h
- `#5173 <https://github.com/scipy/scipy/issues/5173>`__: failing stats.histogram test with numpy 1.10
- `#5191 <https://github.com/scipy/scipy/issues/5191>`__: Scipy 0.16.x - TypeError: _asarray_validated() got an unexpected...
- `#5195 <https://github.com/scipy/scipy/issues/5195>`__: tarballs missing documentation source
- `#5363 <https://github.com/scipy/scipy/issues/5363>`__: FAIL: test_orthogonal.test_j_roots, test_orthogonal.test_js_roots


Pull requests for 0.16.1
------------------------

- `#5088 <https://github.com/scipy/scipy/pull/5088>`__: BUG: fix logic error in cKDTree.sparse_distance_matrix
- `#5089 <https://github.com/scipy/scipy/pull/5089>`__: BUG: Don't overwrite b in lfilter's FIR path
- `#5128 <https://github.com/scipy/scipy/pull/5128>`__: BUG: solve_banded failed when solving 1x1 systems
- `#5155 <https://github.com/scipy/scipy/pull/5155>`__: BLD: fix missing Python include for Homebrew builds.
- `#5192 <https://github.com/scipy/scipy/pull/5192>`__: BUG: backport as_inexact kwarg to _asarray_validated
- `#5203 <https://github.com/scipy/scipy/pull/5203>`__: BUG: fix uninitialized use in lartg 0.16 backport
- `#5204 <https://github.com/scipy/scipy/pull/5204>`__: BUG: properly return error to fortran from ode_jacobian_function...
- `#5207 <https://github.com/scipy/scipy/pull/5207>`__: TST: Fix TestCtypesQuad failure on Python 3.5 for Windows
- `#5352 <https://github.com/scipy/scipy/pull/5352>`__: TST: sparse: silence warnings about boolean indexing
- `#5355 <https://github.com/scipy/scipy/pull/5355>`__: MAINT: backports for 0.16.1 release
- `#5356 <https://github.com/scipy/scipy/pull/5356>`__: REL: update Paver file to ensure sdist contents are OK for releases.
- `#5382 <https://github.com/scipy/scipy/pull/5382>`__: 0.16.x backport: MAINT: work around a possible numpy ufunc loop...
- `#5393 <https://github.com/scipy/scipy/pull/5393>`__: TST:special: bump tolerance levels for test_j_roots and test_js_roots
- `#5417 

Re: [Numpy-discussion] Numpy Generalized Ufuncs: Pointer Arithmetic and Segmentation Faults (Debugging?)

2015-10-25 Thread Travis Oliphant
Two things that might help you create generalized ufuncs:

1) Look at Numba --- it makes it very easy to write generalized ufuncs in
simple Python code.  Numba will compile to machine code so it can be as
fast as writing in C.   Here is the documentation for that specific
feature:
http://numba.pydata.org/numba-doc/0.21.0/user/vectorize.html#the-guvectorize-decorator.
One wart of the interface is that scalars need to be treated as
1-element 1-d arrays (but still use '()' in the signature); a rough sketch
of this follows after point 2 below.

2) Look at the linear algebra module in NumPy which now wraps a bunch of
linear-algebra based generalized ufuncs (all written in C):
https://github.com/numpy/numpy/blob/master/numpy/linalg/umath_linalg.c.src
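
To make point 1 concrete, here is a rough, untested sketch of a gufunc with
your signature written with guvectorize. The body is only a placeholder (a
plain squared distance, not a real DTW), and it follows the 0.21-era
convention above of declaring the scalar parameters and the scalar output as
1-element 1-d arrays; later Numba versions pass scalar inputs differently, so
check the docs for the version you use.

from numba import guvectorize, float64

@guvectorize([(float64[:], float64[:], float64[:], float64[:], float64[:])],
             '(i),(j),(),()->()')
def dist_sketch(x, y, window, penalty, out):
    # 'window' and 'penalty' arrive as 1-element arrays; a real DTW kernel
    # would read them as window[0] and penalty[0].
    acc = 0.0
    for k in range(min(x.shape[0], y.shape[0])):
        d = x[k] - y[k]
        acc += d * d
    # The scalar output is likewise written through a 1-element array.
    out[0] = acc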
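
For point 2, you can also see from the Python side what those gufuncs buy you:
the numpy.linalg routines broadcast over stacks of core dimensions, which is
the same behaviour you want from dist_dtw(). A minimal illustration:

import numpy as np

stack = np.random.rand(5, 3, 3)   # a "stack" of five 3x3 matrices
dets = np.linalg.det(stack)       # the det gufunc loops over the stack
print(dets.shape)                 # (5,): one determinant per matrix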

-Travis



On Sun, Oct 25, 2015 at 7:06 AM,  wrote:

> [Eleanore's original message quoted in full; snipped]


-- 

*Travis Oliphant*
*Co-founder and CEO*


@teoliphant
512-222-5440
http://www.continuum.io


Re: [Numpy-discussion] Numpy Generalized Ufuncs: Pointer Arithmetic and Segmentation Faults (Debugging?)

2015-10-25 Thread Jaime Fernández del Río
Hi Eleanore,

Thanks for the kind words, you are very welcome!

As for your issues, I think they are coming from the handling of the
strides you are doing in the slow_dtw_dist function.  The strides are the
number of bytes you have to advance your pointer to get to the next item.
In your code, you end up doing something akin to:

dtype *v_i = v0;          /* v0 points at the first element of the input */
...
for (...) {
    ...
    v_i += stride_v;      /* advances by stride_v * sizeof(dtype) bytes  */
}

This, rather than increasing the v_i pointer by stride_v bytes, increases it
by stride_v * sizeof(dtype) bytes, and with the npy_double you seem to be
using as dtype, it sends you out of your allocated memory 8x too fast.

The pointer you advance by stride_v has to be of char* type, so one simple
solution would be to do something like:

char *v_ptr = (char *)v0;             /* walk the array as raw bytes          */
...
for (...) {
    dtype v_val = *(dtype *)v_ptr;    /* reinterpret the bytes at this offset */
    ...
    v_ptr += stride_v;                /* stride_v is already a byte count     */
}

And use v_val directly wherever you were dereferencing v_i before.
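
If you want to convince yourself that strides are byte counts, you can check
it directly from Python; nothing here is specific to your extension:

import numpy as np

v = np.arange(10, dtype=np.float64)
print(v.strides)        # (8,)  -- bytes, not elements, between items
print(v[::2].strides)   # (16,) -- a strided view steps 16 bytes at a time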

Jaime


On Sun, Oct 25, 2015 at 5:06 AM,  wrote:

> [Eleanore's original message quoted in full; snipped]


-- 
(\__/)
( O.o)
( > <) This is Bunny. Copy Bunny into your signature and help him with his
plans for world domination.