Re: [Numpy-discussion] new mingw-w64 based numpy and scipy wheel (still experimental)

2015-02-09 Thread Olivier Grisel
Hi Carl,

Could you please provide some details on how you used your
mingw-static toolchain to build OpenBLAS, numpy & scipy? I would like
to replicate, but apparently the default Makefile in the openblas
project expects Unix commands such as `uname` and `perl` that are not
part of your archive. Did you compile those utilities from source or
did you use another distribution of mingw with additional tools such
as MSYS?

For numpy and scipy, besides applying your patches, did you configure
anything in site.cfg? I understand that you put the libopenblas.dll in
the numpy/core folder but where do you put the BLAS / LAPACK header
files?

I would like to help automate that build in some CI environment
(with either Windows or Linux + wine) but I am afraid that I am not
familiar enough with the Windows build of numpy & scipy to get it
working all by myself.

-- 
Olivier
___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] new mingw-w64 based numpy and scipy wheel (still experimental)

2015-02-09 Thread Sturla Molden
Two quick comments:
- You need MSYS or Cygwin to build OpenBLAS. MSYS has uname and perl. Carl
probably used MSYS.
- BLAS and LAPACK are Fortran libs, hence there are no header files. NumPy
and SciPy include their own cblas headers.
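(For reference, NumPy records its build-time BLAS/LAPACK configuration, which
is the quickest way to check what a given build was linked against -- a
minimal sketch:

# Check the BLAS/LAPACK configuration a NumPy build was compiled against;
# np.__config__ is generated at build time from site.cfg.
import numpy as np

np.__config__.show()   # prints e.g. the [openblas] library_dirs / include_dirs used
)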

Sturla


Olivier Grisel olivier.gri...@ensta.org wrote:
 Hi Carl,
 
 Could you please provide some details on how you used your
 mingw-static toolchain to build OpenBLAS, numpy & scipy? I would like
 to replicate but apparently the default Makefile in the openblas
 project expects Unix commands such as `uname` and `perl` that are not
 part of your archive. Did you compile those utilities from source or
 did you use another distribution of mingw with additional tools such
 as MSYS?
 
 For numpy and scipy, besides applying your patches, did you configure
 anything in site.cfg? I understand that you put the libopenblas.dll in
 the numpy/core folder but where do you put the BLAS / LAPACK header
 files?
 
 I would like to help automate that build in some CI environment
 (with either Windows or Linux + wine) but I am afraid that I am not
 familiar enough with the Windows build of numpy & scipy to get it
 working all by myself.

___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] new mingw-w64 based numpy and scipy wheel (still experimental)

2015-02-09 Thread Carl Kleffner
Basically you need:

(1) site.cfg or %HOME%\.numpy-site.cfg with the following content: (change
the paths according to your installation)

[openblas]
libraries = openblas
library_dirs = D:/devel/packages/openblas/amd64/lib
include_dirs = D:/devel/packages/openblas/amd64/include

OpenBLAS was built with the help of MSYS2, the successor of MSYS.

(2) I created an import library for python##.dll in python\libs\

I copied python##.dll into a temporary folder and executed the following
(example for Python 2.7):

 gendef python27.dll
 dlltool --dllname python27.dll --def python27.def --output-lib
libpython27.dll.a
 copy libpython27.dll.a python_root\libs\libpython27.dll.a

(3) before starting the numpy build I copied libopenblas.dll to numpy\core\

I am currently reworking the overall procedure to allow installation of the
toolchain as a wheel, with some post-processing to handle all these
intermediate steps.
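
(Roughly, steps (1)-(3) above could be scripted as follows -- a sketch with
assumed paths and Python 2.7, not the exact procedure used here; the bin/
location of libopenblas.dll is also an assumption to adapt:

# Sketch of steps (1)-(3); adjust OPENBLAS and PYTHON_ROOT to your setup.
import os, shutil, subprocess

OPENBLAS = "D:/devel/packages/openblas/amd64"
PYTHON_ROOT = r"C:\Python27"

# (1) point numpy.distutils at OpenBLAS via %HOME%\.numpy-site.cfg
site_cfg = (
    "[openblas]\n"
    "libraries = openblas\n"
    "library_dirs = %s/lib\n"
    "include_dirs = %s/include\n"
) % (OPENBLAS, OPENBLAS)
with open(os.path.join(os.path.expanduser("~"), ".numpy-site.cfg"), "w") as f:
    f.write(site_cfg)

# (2) create a mingw import library for python27.dll in python_root\libs
subprocess.check_call(["gendef", os.path.join(PYTHON_ROOT, "python27.dll")])
subprocess.check_call(["dlltool", "--dllname", "python27.dll",
                       "--def", "python27.def",
                       "--output-lib", "libpython27.dll.a"])
shutil.copy("libpython27.dll.a", os.path.join(PYTHON_ROOT, "libs"))

# (3) copy libopenblas.dll into the numpy source tree before building
shutil.copy(os.path.join(OPENBLAS, "bin", "libopenblas.dll"),
            os.path.join("numpy", "core"))
)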

Cheers,

Carl


2015-02-09 22:22 GMT+01:00 Sturla Molden sturla.mol...@gmail.com:

 Two quick comments:
 - You need MSYS or Cygwin to build OpenBLAS. MSYS has uname and perl. Carl
 probably used MSYS.
 - BLAS and LAPACK are Fortran libs, hence there are no header files. NumPy
 and SciPy include their own cblas headers.

 Sturla


 Olivier Grisel olivier.gri...@ensta.org wrote:
  Hi Carl,
 
  Could you please provide some details on how you used your
  mingw-static toolchain to build OpenBLAS, numpy & scipy? I would like
  to replicate but apparently the default Makefile in the openblas
  project expects Unix commands such as `uname` and `perl` that are not
  part of your archive. Did you compile those utilities from source or
  did you use another distribution of mingw with additional tools such
  as MSYS?
 
  For numpy and scipy, besides applying your patches, did you configure
  anything in site.cfg? I understand that you put the libopenblas.dll in
  the numpy/core folder but where do you put the BLAS / LAPACK header
  files?
 
  I would like to help automate that build in some CI environment
  (with either Windows or Linux + wine) but I am afraid that I am not
  familiar enough with the Windows build of numpy & scipy to get it
  working all by myself.

 ___
 NumPy-Discussion mailing list
 NumPy-Discussion@scipy.org
 http://mail.scipy.org/mailman/listinfo/numpy-discussion

___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] new mingw-w64 based numpy and scipy wheel (still experimental)

2015-01-27 Thread Nathaniel Smith
On Tue, Jan 27, 2015 at 8:53 PM, Ralf Gommers ralf.gomm...@gmail.com wrote:

 On Mon, Jan 26, 2015 at 4:30 PM, Carl Kleffner cmkleff...@gmail.com wrote:

 Thanks for all your ideas. The next version will contain an augmented
 libopenblas.dll in both numpy and scipy. In the long term I would prefer an
 external openblas wheel package, if there is agreement about this among
 numpy-dev.


 Sounds fine in principle, but reliable dependency handling will be hard to
 support in setup.py. You'd want the dependency on Openblas when installing a
 complete set of wheels, but not make it impossible to use:

   - building against ATLAS/MKL/... from source with pip or distutils
   - allowing use of a local wheelhouse which uses ATLAS/MKL/... wheels
   - pip install numpy --no-use-wheel
   - etc.

 Static bundling is a lot easier to get right.

In principle I think this should be easy: when installing a .whl, pip
or whatever looks at the dependencies declared in the distribution
metadata file inside the wheel. When installing via setup.py, pip or
whatever uses the dependencies declared by setup.py. We just have to
make sure that the wheels we distribute have the right metadata inside
them and everything should work.

Accomplishing this may be somewhat awkward with existing tools, but as
a worst-case/proof-of-concept approach we could just have a step in
the wheel build that opens up the .whl and edits it to add the
dependency. Ugly, but it'd work.
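
(As a proof of concept, that patching step might look roughly like this -- a
sketch only; the wheel file name and the 'openblas' requirement are invented,
and a real tool would also have to refresh the METADATA hash recorded in
RECORD:

# Sketch: add a Requires-Dist line to the METADATA file inside a wheel.
import shutil
import zipfile

def add_requires_dist(wheel_path, requirement):
    patched = wheel_path + ".patched"
    with zipfile.ZipFile(wheel_path) as src, \
         zipfile.ZipFile(patched, "w", zipfile.ZIP_DEFLATED) as dst:
        for item in src.infolist():
            data = src.read(item.filename)
            if item.filename.endswith(".dist-info/METADATA"):
                lines = data.decode("utf-8").splitlines(True)
                # keep the new field inside the header block, right after
                # the Metadata-Version line
                lines.insert(1, "Requires-Dist: %s\n" % requirement)
                data = "".join(lines).encode("utf-8")
            dst.writestr(item, data)
    shutil.move(patched, wheel_path)

add_requires_dist("scipy-0.15.1-cp27-none-win_amd64.whl", "openblas")
)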

-n

-- 
Nathaniel J. Smith
Postdoctoral researcher - Informatics - University of Edinburgh
http://vorpus.org
___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] new mingw-w64 based numpy and scipy wheel (still experimental)

2015-01-27 Thread Ralf Gommers
On Mon, Jan 26, 2015 at 4:30 PM, Carl Kleffner cmkleff...@gmail.com wrote:

 Thanks for all your ideas. The next version will contain an augmented
 libopenblas.dll in both numpy and scipy. In the long term I would prefer
 an external openblas wheel package, if there is agreement about this
 among numpy-dev.


Sounds fine in principle, but reliable dependency handling will be hard to
support in setup.py. You'd want the dependency on Openblas when installing
a complete set of wheels, but not make it impossible to use:

  - building against ATLAS/MKL/... from source with pip or distutils
  - allowing use of a local wheelhouse which uses ATLAS/MKL/... wheels
  - pip install numpy --no-use-wheel
  - etc.

Static bundling is a lot easier to get right.


 Another idea for the future is to conditionally load a debug version of
 libopenblas instead. Together with the backtrace.dll (part of mingwstatic,
 but undocumented right now) a meaningful stack trace in case of segfaults
 inside the code compiled with mingwstatic will be given.


 2015-01-26 2:16 GMT+01:00 Sturla Molden sturla.mol...@gmail.com:

 On 25/01/15 22:15, Matthew Brett wrote:

  I agree, that shipping openblas with both numpy and scipy seems
  perfectly reasonable to me - I don't think anyone will much care about
  the 30M, and I think our job is to make something that works with the
  least complexity and likelihood of error.

 Yes. Make something that works first, optimize for space later.


+1

Ralf


   It would be good to rename the dll according to the package and
  version though, to avoid a scipy binary using a pre-loaded but
  incompatible 'libopenblas.dll'.   Say something like
  openblas-scipy-0.15.1.dll - on the basis that there can only be one
  copy of scipy loaded at a time.

 That is a good idea and we should do this for NumPy too I think.



 Sturla


 ___
 NumPy-Discussion mailing list
 NumPy-Discussion@scipy.org
 http://mail.scipy.org/mailman/listinfo/numpy-discussion



 ___
 NumPy-Discussion mailing list
 NumPy-Discussion@scipy.org
 http://mail.scipy.org/mailman/listinfo/numpy-discussion


___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] new mingw-w64 based numpy and scipy wheel (still experimental)

2015-01-27 Thread Ralf Gommers
On Tue, Jan 27, 2015 at 10:13 PM, Nathaniel Smith n...@pobox.com wrote:

 On Tue, Jan 27, 2015 at 8:53 PM, Ralf Gommers ralf.gomm...@gmail.com
 wrote:
 
  On Mon, Jan 26, 2015 at 4:30 PM, Carl Kleffner cmkleff...@gmail.com
 wrote:
 
  Thanks for all your ideas. The next version will contain an augmented
  libopenblas.dll in both numpy and scipy. In the long term I would prefer an
  external openblas wheel package, if there is agreement about this among
  numpy-dev.
 
 
  Sounds fine in principle, but reliable dependency handling will be hard
 to
  support in setup.py. You'd want the dependency on Openblas when
 installing a
  complete set of wheels, but not make it impossible to use:
 
- building against ATLAS/MKL/... from source with pip or distutils
- allowing use of a local wheelhouse which uses ATLAS/MKL/... wheels
- pip install numpy --no-use-wheel
- etc.
 
  Static bundling is a lot easier to get right.

 In principle I think this should be easy: when installing a .whl, pip
 or whatever looks at the dependencies declared in the distribution
 metadata file inside the wheel. When installing via setup.py, pip or
 whatever uses the dependencies declared by setup.py. We just have to
 make sure that the wheels we distribute have the right metadata inside
 them and everything should work.

 Accomplishing this may be somewhat awkward with existing tools, but as
 a worst-case/proof-of-concept approach we could just have a step in
 the wheel build that opens up the .whl and edits it to add the
 dependency. Ugly, but it'd work.


Good point, that should work. Not all that much uglier than some of the
other stuff we do in release scripts for Windows binaries.

Ralf
___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] new mingw-w64 based numpy and scipy wheel (still experimental)

2015-01-27 Thread Matthew Brett
Hi,

On Tue, Jan 27, 2015 at 1:37 PM, Carl Kleffner cmkleff...@gmail.com wrote:


 2015-01-27 22:13 GMT+01:00 Nathaniel Smith n...@pobox.com:

 On Tue, Jan 27, 2015 at 8:53 PM, Ralf Gommers ralf.gomm...@gmail.com
 wrote:
 
  On Mon, Jan 26, 2015 at 4:30 PM, Carl Kleffner cmkleff...@gmail.com
  wrote:
 
  Thanks for all your ideas. The next version will contain an augmented
  libopenblas.dll in both numpy and scipy. In the long term I would prefer an
  external openblas wheel package, if there is agreement about this among
  numpy-dev.
 
 
  Sounds fine in principle, but reliable dependency handling will be hard
  to
  support in setup.py. You'd want the dependency on Openblas when
  installing a
  complete set of wheels, but not make it impossible to use:
 
- building against ATLAS/MKL/... from source with pip or distutils
- allowing use of a local wheelhouse which uses ATLAS/MKL/... wheels
- pip install numpy --no-use-wheel
- etc.
 
  Static bundling is a lot easier to get right.

 In principle I think this should be easy: when installing a .whl, pip
 or whatever looks at the dependencies declared in the distribution
 metadata file inside the wheel. When installing via setup.py, pip or
 whatever uses the dependencies declared by setup.py. We just have to
 make sure that the wheels we distribute have the right metadata inside
 them and everything should work.

 Accomplishing this may be somewhat awkward with existing tools, but as
 a worst-case/proof-of-concept approach we could just have a step in
 the wheel build that opens up the .whl and edits it to add the
 dependency. Ugly, but it'd work.

My 'delocate' utility has a routine for patching wheels:

pip install delocate
delocate-patch --help

Cheers,

Matthew
___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] new mingw-w64 based numpy and scipy wheel (still experimental)

2015-01-27 Thread Sturla Molden
On 27/01/15 11:32, Carl Kleffner wrote:

 OpenBLAS in the test wheels is built with DYNAMIC_ARCH, that is, all
 assembler-based kernels are included and are chosen at runtime.

Ok, I wasn't aware of that option. Last time I built OpenBLAS I think I 
had to specify the target CPU.

  Non
 optimized parts of Lapack have been build with -march=sse2.

Since LAPACK delegates almost all of its heavy lifting to BLAS, there is 
probably not a lot to gain from SSE3, SSE4 or AVX here.


Sturla

___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] new mingw-w64 based numpy and scipy wheel (still experimental)

2015-01-27 Thread Carl Kleffner
2015-01-27 0:16 GMT+01:00 Sturla Molden sturla.mol...@gmail.com:

 On 26/01/15 16:30, Carl Kleffner wrote:

  Thanks for all your ideas. The next version will contain an augmented
  libopenblas.dll in both numpy and scipy. In the long term I would prefer
  an external openblas wheel package, if there is agreement about this
  among numpy-dev.


 Thanks for all your great work on this.

 An OpenBLAS wheel might be a good idea. Probably we should have some
 sort of instruction on the website how to install the binary wheel. And
 then we could include the OpenBLAS wheel in the instruction. Or we could
 have the OpenBLAS wheel as a part of the scipy stack.

 But make the bloated SciPy and NumPy wheels work first, then we can
 worry about a dedicated OpenBLAS wheel later :-)


  Another idea for the future is to conditionally load a debug version of
  libopenblas instead. Together with the backtrace.dll (part of
  mingwstatic, but undocumented right now) a meaningful stack trace in
  case of segfaults inside the code compiled with mingwstatic will be given.

 An OpenBLAS wheel could also include multiple architectures. We can
 compile OpenBLAS for any kind of CPU and install the one that fits
 best with the computer.


OpenBLAS in the test wheels is built with DYNAMIC_ARCH, that is, all
assembler-based kernels are included and are chosen at runtime. Non-optimized
parts of LAPACK have been built with -march=sse2.


 Also note that an OpenBLAS wheel could be useful on Linux. It is clearly
 superior to the ATLAS libraries that most distros ship. If we make a
 binary wheel that works for Windows, we are almost there for Linux too :-)


My understanding is that binary wheels are not supported for Linux. Maybe this
could be done as a conda package for Anaconda/Miniconda as an OSS alternative
to MKL.


 For Apple we don't need OpenBLAS anymore. On OSX 10.9 and 10.10
 Accelerate Framework is actually faster than MKL under many
 circumstances. DGEMM is about the same, but e.g. DAXPY and DDOT are
 faster in Accelerate.


 Sturla

 ___
 NumPy-Discussion mailing list
 NumPy-Discussion@scipy.org
 http://mail.scipy.org/mailman/listinfo/numpy-discussion

___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] new mingw-w64 based numpy and scipy wheel (still experimental)

2015-01-26 Thread Carl Kleffner
Thanks for all your ideas. The next version will contain an augmented
libopenblas.dll in both numpy and scipy. In the long term I would prefer
an external openblas wheel package, if there is agreement about this
among numpy-dev.

Another idea for the future is to conditionally load a debug version of
libopenblas instead. Together with backtrace.dll (part of mingwstatic,
but undocumented right now) this would give a meaningful stack trace in
case of segfaults inside code compiled with mingwstatic.

2015-01-26 2:16 GMT+01:00 Sturla Molden sturla.mol...@gmail.com:

 On 25/01/15 22:15, Matthew Brett wrote:

  I agree, that shipping openblas with both numpy and scipy seems
  perfectly reasonable to me - I don't think anyone will much care about
  the 30M, and I think our job is to make something that works with the
  least complexity and likelihood of error.

 Yes. Make something that works first, optimize for space later.


  It would be good to rename the dll according to the package and
  version though, to avoid a scipy binary using a pre-loaded but
  incompatible 'libopenblas.dll'.   Say something like
  openblas-scipy-0.15.1.dll - on the basis that there can only be one
  copy of scipy loaded at a time.

 That is a good idea and we should do this for NumPy too I think.



 Sturla


 ___
 NumPy-Discussion mailing list
 NumPy-Discussion@scipy.org
 http://mail.scipy.org/mailman/listinfo/numpy-discussion

___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] new mingw-w64 based numpy and scipy wheel (still experimental)

2015-01-25 Thread cjw

On 24-Jan-15 12:14 PM, Carl Kleffner wrote:
 Just a wild guess:

 (1) update your pip and try again
Thanks.  My pip version was 1.5.6; it is now 6.0.6.

 (2) use the bitbucket wheels with:
 pip install --no-index -f
 https://bitbucket.org/carlkl/mingw-w64-for-python/downloads numpy
Successfully installed numpy-1.9.1
 pip install --no-index -f
 https://bitbucket.org/carlkl/mingw-w64-for-python/downloads scipy
Successfully installed scipy-0.15.1

 (3) check if there is something left in site-packages\numpy in case you
 have uninstalled another numpy distribution before.
Could you be more specific please?

C:\Python27\Lib\site-packages>python
Python 2.7.9 (default, Dec 10 2014, 12:28:03) [MSC v.1500 64 bit (AMD64)] on win32

C:\Python27\Lib>cd site-packages

C:\Python27\Lib\site-packages>dir
  Volume in drive C has no label.
  Volume Serial Number is 9691-2C8F

  Directory of C:\Python27\Lib\site-packages

25-Jan-15  06:11 AM    <DIR>           .
25-Jan-15  06:11 AM    <DIR>           ..
26-Nov-14  06:30 PM               909  apsw-3.8.7-py2.7.egg-info
26-Nov-14  06:30 PM           990,208  apsw.pyd
11-Dec-14  06:35 PM    <DIR>           astroid
11-Dec-14  06:35 PM    <DIR>           astroid-1.3.2.dist-info
11-Dec-14  06:35 PM    <DIR>           colorama
11-Dec-14  06:35 PM    <DIR>           colorama-0.3.2-py2.7.egg-info
09-Sep-14  09:38 AM    <DIR>           dateutil
29-Dec-14  10:16 PM               126  easy_install.py
29-Dec-14  10:16 PM               343  easy_install.pyc
27-Dec-14  09:18 AM    <DIR>           epydoc
21-Jan-13  03:19 PM               297  epydoc-3.0.1-py2.7.egg-info
11-Dec-14  06:35 PM    <DIR>           logilab
11-Dec-14  06:35 PM               309  logilab_common-0.63.2-py2.7-nspkg.pth
11-Dec-14  06:35 PM    <DIR>           logilab_common-0.63.2-py2.7.egg-info
16-Nov-14  04:02 PM    <DIR>           matplotlib
22-Oct-14  03:11 PM               324  matplotlib-1.4.2-py2.7-nspkg.pth
16-Nov-14  04:02 PM    <DIR>           matplotlib-1.4.2-py2.7.egg-info
16-Nov-14  04:02 PM    <DIR>           mpl_toolkits
25-Jan-15  06:07 AM    <DIR>           numpy
25-Jan-15  06:07 AM    <DIR>           numpy-1.9.1.dist-info
25-Jan-15  06:01 AM    <DIR>           pip
25-Jan-15  06:01 AM    <DIR>           pip-6.0.6.dist-info
29-Dec-14  10:16 PM           101,530  pkg_resources.py
29-Dec-14  10:16 PM           115,360  pkg_resources.pyc
10-Sep-14  12:30 PM    <DIR>           pycparser
10-Sep-14  12:30 PM    <DIR>           pycparser-2.10-py2.7.egg-info
16-Dec-14  08:21 AM    <DIR>           pygame
25-Mar-14  11:03 AM               543  pygame-1.9.2a0-py2.7.egg-info
24-Nov-14  06:55 AM    <DIR>           pygit2
24-Nov-14  06:55 AM    <DIR>           pygit2-0.21.3-py2.7.egg-info
26-Mar-14  01:23 PM                90  pylab.py
16-Nov-14  04:02 PM               237  pylab.pyc
16-Nov-14  04:02 PM               237  pylab.pyo
11-Dec-14  06:35 PM    <DIR>           pylint
11-Dec-14  06:35 PM    <DIR>           pylint-1.4.0.dist-info
11-Sep-14  08:26 PM    <DIR>           pyparsing-2.0.2-py2.7.egg-info
24-Nov-14  06:27 AM           157,300  pyparsing.py
30-Nov-14  08:51 AM           154,996  pyparsing.pyc
09-Sep-14  09:38 AM    <DIR>           python_dateutil-2.2-py2.7.egg-info
09-Sep-14  10:03 AM    <DIR>           pytz
09-Sep-14  10:03 AM    <DIR>           pytz-2014.7-py2.7.egg-info
30-Apr-14  08:54 AM               119  README.txt
25-Jan-15  06:11 AM    <DIR>           scipy
25-Jan-15  06:11 AM    <DIR>           scipy-0.15.1.dist-info
29-Dec-14  10:16 PM    <DIR>           setuptools
29-Dec-14  10:16 PM    <DIR>           setuptools-7.0.dist-info
09-Sep-14  09:50 AM    <DIR>           six-1.7.3.dist-info
09-Sep-14  09:50 AM            26,518  six.py
09-Sep-14  09:50 AM            28,288  six.pyc
21-Dec-14  07:56 PM    <DIR>           System
21-Dec-14  07:55 PM    <DIR>           User
21-Sep-14  06:00 PM           878,592  _cffi__xf1819144xd61e91d9.pyd
29-Dec-14  10:16 PM    <DIR>           _markerlib
21-Sep-14  06:00 PM           890,368  _pygit2.pyd
              20 File(s)      3,346,694 bytes
              36 Dir(s)   9,810,276,352 bytes free

C:\Python27\Lib\site-packages>

 I tried and failed, even after adding --pre.

 My log file is here:

 
 C:\Python27\Scripts\pip run on 01/24/15 07:51:10
 Downloading/unpacking https://pypi.binstar.org/carlkl/simple
Downloading simple
Downloading from URL https://pypi.binstar.org/carlkl/simple
 Cleaning up...
 Exception:
 Traceback (most recent call last):
File C:\Python27\lib\site-packages\pip\basecommand.py, line 122, in
 main
  status = self.run(options, args)
File C:\Python27\lib\site-packages\pip\commands\install.py, line 278,
 in run
  requirement_set.prepare_files(finder, force_root_egg_info=self.bundle,
 bundle=self.bundle)
File C:\Python27\lib\site-packages\pip\req.py, line 1197, in
 prepare_files
  do_download,
File C:\Python27\lib\site-packages\pip\req.py, line 1375, in
 unpack_url
  self.session,
File C:\Python27\lib\site-packages\pip\download.py, line 582, in
 unpack_http_url
  unpack_file(temp_location, location, content_type, link)

Re: [Numpy-discussion] new mingw-w64 based numpy and scipy wheel (still experimental)

2015-01-25 Thread Nathaniel Smith
On Sat, Jan 24, 2015 at 5:29 PM, Carl Kleffner cmkleff...@gmail.com wrote:

 2015-01-23 0:23 GMT+01:00 Nathaniel Smith n...@pobox.com:

 On Thu, Jan 22, 2015 at 9:29 PM, Carl Kleffner cmkleff...@gmail.com
 wrote:
  OpenBLAS is deployed as part of the numpy wheel. That said, the scipy
  wheels
  mentioned above are dependant on the installation of the OpenBLAS based
  numpy and won't work i.e. with an installed  numpy-MKL.

 This sounds like it probably needs to be fixed before we can recommend
 the scipy wheels for anyone? OTOH it might be fine to start
 distributing numpy wheels first.


 I very much prefer dynamic linking to numpy\core\libopenblas.dll instead of
 static linking to avoid bloat. This matters, because libopenblas.dll is a
 heavy library (around 30Mb for amd64). As a consequence all packages with
 dynamic linkage to OpenBLAS depend on numpy-openblas. This is not different
 to scipy-MKL that has a dependancy to numpy-MKL - see C. Gohlke's site.

The difference is that if we upload this as the standard scipy wheel,
and then someone goes hey, look, a new scipy release just got
announced, 'pip upgrade scipy', then the result will often be that
they just get random unexplained crashes. I think we should try to
avoid that kind of outcome, even if it means making some technical
compromises. The whole idea of having the wheels is to make fetching
particular versions seamless and robust, and the other kinds of builds
will still be available for those willing to invest more effort.

One solution would be for the scipy wheel to explicitly depend on a
numpy+openblas wheel, so that someone doing 'pip install scipy' also
forced a numpy upgrade. But I think we should forget about trying this
given the current state of python packaging tools: pip/setuptools/etc.
are not really sophisticated enough to let us do this without a lot of
kluges and compromises, and anyway it is nicer to allow scipy and
numpy to be upgraded separately.

Another solution would be to just include openblas in both. This
bloats downloads, but I'd rather waste 30 MiB than waste users' time
fighting with random library incompatibility nonsense that they don't
care about.

Another solution would be to split the openblas library off into its
own python package, that just dropped the binary somewhere where it
could be found later, and then have both the numpy and scipy wheels
depend on this package.

We could start with the brute force solution (just including openblas
in both) for the first release, and then upgrade to the fancier
solution (both depend on a separate package) later.

  For the numpy 32bit builds there are 3 failures for special FP value
  tests,
  due to a bug in mingw-w64 that is still present. All scipy versions show
  up
  7 failures with some numerical noise, that could be ignored (or
  corrected
  with relaxed asserts in the test code).
 
  PR's for numpy and scipy are in preparation. The mingw-w64 compiler used
  for
  building can be found at
  https://bitbucket.org/carlkl/mingw-w64-for-python/downloads.

 Correct me if I'm wrong, but it looks like there isn't any details on
 how exactly the compiler was set up? Which is fine, I know you've been
 doing a ton of work on this and it's much appreciated :-). But
 eventually I do think a prerequisite for us adopting these as official
 builds is that we'll need a text document (or an executable script!)
 that walks through all the steps in setting up the toolchain etc., so
 that someone starting from scratch could get it all up and running.
 Otherwise we run the risk of eventually ending up back where we are
 today, with a creaky old mingw binary snapshot that no-one knows how
 it works or how to reproduce...


 This has to be done and is in preparation, but not ready for consumption
 right now. Some preliminary information is given here:
 https://bitbucket.org/carlkl/mingw-w64-for-python/downloads/mingwstatic-2014-11-readme.md

Right, I read that :-). There's no way that I could sit down with that
document and a clean windows install and replicate your mingw-w64
toolchain, though :-). Which, like I said, is totally fine at this
stage in the process, I just wanted to make sure that this step is on
the radar, b/c it will eventually become crucial.

-n

-- 
Nathaniel J. Smith
Postdoctoral researcher - Informatics - University of Edinburgh
http://vorpus.org
___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] new mingw-w64 based numpy and scipy wheel (still experimental)

2015-01-25 Thread Carl Kleffner
2015-01-25 16:46 GMT+01:00 Nathaniel Smith n...@pobox.com:

 On Sat, Jan 24, 2015 at 5:29 PM, Carl Kleffner cmkleff...@gmail.com
 wrote:
 
  2015-01-23 0:23 GMT+01:00 Nathaniel Smith n...@pobox.com:
 
  On Thu, Jan 22, 2015 at 9:29 PM, Carl Kleffner cmkleff...@gmail.com
  wrote:
   OpenBLAS is deployed as part of the numpy wheel. That said, the scipy
   wheels
   mentioned above are dependant on the installation of the OpenBLAS
 based
   numpy and won't work i.e. with an installed  numpy-MKL.
 
  This sounds like it probably needs to be fixed before we can recommend
  the scipy wheels for anyone? OTOH it might be fine to start
  distributing numpy wheels first.
 
 
  I very much prefer dynamic linking to numpy\core\libopenblas.dll instead
 of
  static linking to avoid bloat. This matters, because libopenblas.dll is a
  heavy library (around 30Mb for amd64). As a consequence all packages with
  dynamic linkage to OpenBLAS depend on numpy-openblas. This is not
 different
  to scipy-MKL that has a dependancy to numpy-MKL - see C. Gohlke's site.

 The difference is that if we upload this as the standard scipy wheel,
 and then someone goes hey, look, a new scipy release just got
 announced, 'pip upgrade scipy', then the result will often be that
 they just get random unexplained crashes. I think we should try to
 avoid that kind of outcome, even if it means making some technical
 compromises. The whole idea of having the wheels is to make fetching
 particular versions seamless and robust, and the other kinds of builds
 will still be available for those willing to invest more effort.

 One solution would be for the scipy wheel to explicitly depend on a
 numpy+openblas wheel, so that someone doing 'pip install scipy' also
 forced a numpy upgrade. But I think we should forget about trying this
 given the current state of python packaging tools: pip/setuptools/etc.
 are not really sophisticated enough to let us do this without a lot of
 kluges and compromises, and anyway it is nicer to allow scipy and
 numpy to be upgraded separately.


I've learned that marking numpy with something like numpy+openblas is called
a local version identifier:
https://www.python.org/dev/peps/pep-0440/#local-version-identifiers
These identifiers are not allowed on PyPI, however.
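
(For illustration, the 'packaging' library, vendored by recent pip and
setuptools, parses such versions as follows -- a minimal sketch:

# Minimal illustration of a PEP 440 local version identifier.
from packaging.version import Version

v = Version("1.9.1+openblas")
print(v.public)  # '1.9.1'
print(v.local)   # 'openblas'
)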


 Another solution would be to just include openblas in both. This
 bloats downloads, but I'd rather waste 30 MiB than waste users' time
 fighting with random library incompatibility nonsense that they don't
 care about.

 Another solution would be to split the openblas library off into its
 own python package, that just dropped the binary somewhere where it
 could be found later, and then have both the numpy and scipy wheels
 depend on this package.


Creating a dedicated OpenBLAS package and adding this package as a
dependency to numpy/scipy would also allow independent upgrade paths to
OpenBLAS, numpy and scipy. The API of OpenBLAS seems to be stable enough to
allow for that. Having an additional package dependency is a minor problem,
as pip can handle this automatically for the user.
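
(Concretely, the dependency could then be declared like any other Python
dependency -- a sketch, with the package name 'openblas' invented for the
example:

# Hypothetical sketch: scipy declaring a stand-alone OpenBLAS package
# (name 'openblas' is made up) as an ordinary dependency, so pip
# resolves and installs it automatically.
from setuptools import setup

setup(
    name="scipy",
    version="0.15.1",
    install_requires=["openblas"],
    # ... the real build configuration would go here ...
)
)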


 We could start with the brute force solution (just including openblas
 in both) for the first release, and then upgrade to the fancier
 solution (both depend on a separate package) later.

   For the numpy 32bit builds there are 3 failures for special FP value
   tests,
   due to a bug in mingw-w64 that is still present. All scipy versions
 show
   up
   7 failures with some numerical noise, that could be ignored (or
   corrected
   with relaxed asserts in the test code).
  
   PR's for numpy and scipy are in preparation. The mingw-w64 compiler
 used
   for
   building can be found at
   https://bitbucket.org/carlkl/mingw-w64-for-python/downloads.
 
  Correct me if I'm wrong, but it looks like there isn't any details on
  how exactly the compiler was set up? Which is fine, I know you've been
  doing a ton of work on this and it's much appreciated :-). But
  eventually I do think a prerequisite for us adopting these as official
  builds is that we'll need a text document (or an executable script!)
  that walks through all the steps in setting up the toolchain etc., so
  that someone starting from scratch could get it all up and running.
  Otherwise we run the risk of eventually ending up back where we are
  today, with a creaky old mingw binary snapshot that no-one knows how
  it works or how to reproduce...
 
 
  This has to be done and is in preparation, but not ready for consumption
  right now. Some preliminary information is given here:
 
 https://bitbucket.org/carlkl/mingw-w64-for-python/downloads/mingwstatic-2014-11-readme.md

 Right, I read that :-). There's no way that I could sit down with that
 document and a clean windows install and replicate your mingw-w64
 toolchain, though :-). Which, like I said, is totally fine at this
 stage in the process, I just wanted to make sure that this step is on
 the radar, b/c it will eventually become crucial.

 -n

 --
 Nathaniel 

Re: [Numpy-discussion] new mingw-w64 based numpy and scipy wheel (still experimental)

2015-01-25 Thread Olivier Grisel
+1 for bundling OpenBLAS both in scipy and numpy in the short term.
Introducing a new dependency project for OpenBLAS sounds like a good
idea but this is probably more work.

-- 
Olivier
___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] new mingw-w64 based numpy and scipy wheel (still experimental)

2015-01-25 Thread Sturla Molden
On 25/01/15 22:15, Matthew Brett wrote:

 I agree, that shipping openblas with both numpy and scipy seems
 perfectly reasonable to me - I don't think anyone will much care about
 the 30M, and I think our job is to make something that works with the
 least complexity and likelihood of error.

Yes. Make something that works first, optimize for space later.


 It would be good to rename the dll according to the package and
 version though, to avoid a scipy binary using a pre-loaded but
 incompatible 'libopenblas.dll'.   Say something like
 openblas-scipy-0.15.1.dll - on the basis that there can only be one
 copy of scipy loaded at a time.

That is a good idea and we should do this for NumPy too I think.



Sturla


___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] new mingw-w64 based numpy and scipy wheel (still experimental)

2015-01-25 Thread Nathaniel Smith
On 25 Jan 2015 18:46, Carl Kleffner cmkleff...@gmail.com wrote:

 2015-01-25 16:46 GMT+01:00 Nathaniel Smith n...@pobox.com:

 On Sat, Jan 24, 2015 at 5:29 PM, Carl Kleffner cmkleff...@gmail.com
wrote:
 
  2015-01-23 0:23 GMT+01:00 Nathaniel Smith n...@pobox.com:
 
  On Thu, Jan 22, 2015 at 9:29 PM, Carl Kleffner cmkleff...@gmail.com
  wrote:
   OpenBLAS is deployed as part of the numpy wheel. That said, the
scipy
   wheels
   mentioned above are dependant on the installation of the OpenBLAS
based
   numpy and won't work i.e. with an installed  numpy-MKL.
 
  This sounds like it probably needs to be fixed before we can recommend
  the scipy wheels for anyone? OTOH it might be fine to start
  distributing numpy wheels first.
 
 
  I very much prefer dynamic linking to numpy\core\libopenblas.dll
instead of
  static linking to avoid bloat. This matters, because libopenblas.dll
is a
  heavy library (around 30Mb for amd64). As a consequence all packages
with
  dynamic linkage to OpenBLAS depend on numpy-openblas. This is not
different
  to scipy-MKL that has a dependancy to numpy-MKL - see C. Gohlke's site.

 The difference is that if we upload this as the standard scipy wheel,
 and then someone goes hey, look, a new scipy release just got
 announced, 'pip upgrade scipy', then the result will often be that
 they just get random unexplained crashes. I think we should try to
 avoid that kind of outcome, even if it means making some technical
 compromises. The whole idea of having the wheels is to make fetching
 particular versions seamless and robust, and the other kinds of builds
 will still be available for those willing to invest more effort.

 One solution would be for the scipy wheel to explicitly depend on a
 numpy+openblas wheel, so that someone doing 'pip install scipy' also
 forced a numpy upgrade. But I think we should forget about trying this
 given the current state of python packaging tools: pip/setuptools/etc.
 are not really sophisticated enough to let us do this without a lot of
 kluges and compromises, and anyway it is nicer to allow scipy and
 numpy to be upgraded separately.


 I've learned, that mark numpy with something like numpy+openblas is
called local version identifier:
https://www.python.org/dev/peps/pep-0440/#local-version-identifiers
 These identifieres are not allowed for Pypi however.

Right, it's fine for the testing wheels, but even if it were allowed on
PyPI then it still wouldn't let us specify the correct dependency -- we'd
have to say that scipy build X depends on exactly numpy 1.9.1+openblas, not
numpy anything+openblas. So then when a new version of numpy was uploaded
it'd be impossible to upgrade without also rebuilding numpy.

Alternatively pip would be within its rights to simply ignore the local
version part, because "local version identifiers are used to denote fully
API (and, if applicable, ABI) compatible patched versions of upstream
projects". Here the +openblas is exactly designed to communicate ABI
incompatibility.

Soo yeah this is ugly all around. Pip and friends are getting better
but they're just not up to this kind of thing.

 Another solution would be to just include openblas in both. This
 bloats downloads, but I'd rather waste 30 MiB than waste users' time
 fighting with random library incompatibility nonsense that they don't
 care about.

 Another solution would be to split the openblas library off into its
 own python package, that just dropped the binary somewhere where it
 could be found later, and then have both the numpy and scipy wheels
 depend on this package.


 Creating a dedicated OpenBLAS package and adding this package as an
dependancy to numpy/scipy would also allow independant upgrade paths to
OpenBLAS, numpy and scipy. The API of OpenBLAS seems to be stable enough to
allow for that. Having an additional package dependancy is a minor problem,
as pip can handle this automatically for the user.

Exactly.

We might even want to give it a tiny python wrapper, e.g. you do
  import openblas
  openblas.add_to_library_path()
and that would be a little function that modifies LD_LIBRARY_PATH or calls
AddDllDirectory etc. as appropriate, so that code linking to openblas can
ignore all these details.
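
(A sketch of what such a wrapper might look like -- the package name and the
DLL location inside the package are assumptions:

# openblas/__init__.py -- hypothetical wrapper package sketched above.
import ctypes
import os
import sys

_pkg_dir = os.path.dirname(os.path.abspath(__file__))

def add_to_library_path():
    """Make the bundled OpenBLAS shared library findable at load time."""
    if sys.platform == "win32":
        # prepend the package dir so the Windows loader finds libopenblas.dll
        os.environ["PATH"] = _pkg_dir + os.pathsep + os.environ.get("PATH", "")
        # also register it as a DLL search directory (kernel32 API)
        ctypes.windll.kernel32.SetDllDirectoryW(_pkg_dir)
    else:
        # on Unix this mainly helps subprocesses; the current process's
        # loader reads LD_LIBRARY_PATH only at startup
        os.environ["LD_LIBRARY_PATH"] = (
            _pkg_dir + os.pathsep + os.environ.get("LD_LIBRARY_PATH", ""))
)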

-n
___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] new mingw-w64 based numpy and scipy wheel (still experimental)

2015-01-25 Thread Matthew Brett
Hi,

On Sun, Jan 25, 2015 at 12:23 PM, Nathaniel Smith n...@pobox.com wrote:
 On 25 Jan 2015 18:46, Carl Kleffner cmkleff...@gmail.com wrote:

 2015-01-25 16:46 GMT+01:00 Nathaniel Smith n...@pobox.com:

 On Sat, Jan 24, 2015 at 5:29 PM, Carl Kleffner cmkleff...@gmail.com
 wrote:
 
  2015-01-23 0:23 GMT+01:00 Nathaniel Smith n...@pobox.com:
 
  On Thu, Jan 22, 2015 at 9:29 PM, Carl Kleffner cmkleff...@gmail.com
  wrote:
   OpenBLAS is deployed as part of the numpy wheel. That said, the
   scipy
   wheels
   mentioned above are dependant on the installation of the OpenBLAS
   based
   numpy and won't work i.e. with an installed  numpy-MKL.
 
  This sounds like it probably needs to be fixed before we can recommend
  the scipy wheels for anyone? OTOH it might be fine to start
  distributing numpy wheels first.
 
 
  I very much prefer dynamic linking to numpy\core\libopenblas.dll
  instead of
  static linking to avoid bloat. This matters, because libopenblas.dll is
  a
  heavy library (around 30Mb for amd64). As a consequence all packages
  with
  dynamic linkage to OpenBLAS depend on numpy-openblas. This is not
  different
  to scipy-MKL that has a dependancy to numpy-MKL - see C. Gohlke's site.

 The difference is that if we upload this as the standard scipy wheel,
 and then someone goes hey, look, a new scipy release just got
 announced, 'pip upgrade scipy', then the result will often be that
 they just get random unexplained crashes. I think we should try to
 avoid that kind of outcome, even if it means making some technical
 compromises. The whole idea of having the wheels is to make fetching
 particular versions seamless and robust, and the other kinds of builds
 will still be available for those willing to invest more effort.

 One solution would be for the scipy wheel to explicitly depend on a
 numpy+openblas wheel, so that someone doing 'pip install scipy' also
 forced a numpy upgrade. But I think we should forget about trying this
 given the current state of python packaging tools: pip/setuptools/etc.
 are not really sophisticated enough to let us do this without a lot of
 kluges and compromises, and anyway it is nicer to allow scipy and
 numpy to be upgraded separately.


 I've learned, that mark numpy with something like numpy+openblas is called
 local version identifier:
 https://www.python.org/dev/peps/pep-0440/#local-version-identifiers
 These identifieres are not allowed for Pypi however.

 Right, it's fine for the testing wheels, but even if it were allowed on pypi
 then it still wouldn't let us specify the correct dependency -- we'd have to
 say that scipy build X depends on exactly numpy 1.9.1+openblas, not numpy
 anything+openblas. So then when a new version of numpy was uploaded it'd
 be impossible to upgrade without also rebuilding numpy.

 Alternatively pip would be within its rights to simply ignore the local
 version part, because Local version identifiers are used to denote fully
 API (and, if applicable, ABI) compatible patched versions of upstream
 projects. Here the +openblas is exactly designed to communicate ABI
 incompatibility.

 Soo yeah this is ugly all around. Pip and friends are getting better but
 they're just not up to this kind of thing.

I agree that shipping openblas with both numpy and scipy seems
perfectly reasonable to me - I don't think anyone will much care about
the 30M, and I think our job is to make something that works with the
least complexity and likelihood of error.

It would be good to rename the dll according to the package and
version though, to avoid a scipy binary using a pre-loaded but
incompatible 'libopenblas.dll'. Say something like
openblas-scipy-0.15.1.dll - on the basis that there can only be one
copy of scipy loaded at a time.
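
(A sketch of how a package could then pin its own copy at import time -- the
pre-loading mechanism is an assumption; only the file name comes from the
suggestion above:

# Sketch for e.g. scipy/__init__.py: pre-load the package's own renamed
# copy of OpenBLAS so extension modules bind to exactly this file rather
# than to whatever 'libopenblas.dll' happens to be loaded already.
import ctypes
import os

_dll_path = os.path.join(os.path.dirname(__file__), "openblas-scipy-0.15.1.dll")
_openblas = ctypes.CDLL(_dll_path)  # keep a reference so it stays loaded
)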

Cheers,

Matthew
___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] new mingw-w64 based numpy and scipy wheel (still experimental)

2015-01-24 Thread Carl Kleffner
Just a wild guess:

(1) update your pip and try again

(2) use the bitbucket wheels with:
pip install --no-index -f
https://bitbucket.org/carlkl/mingw-w64-for-python/downloads numpy
pip install --no-index -f
https://bitbucket.org/carlkl/mingw-w64-for-python/downloads scipy

(3) check if there is something left in site-packages\numpy in case you
have uninstalled another numpy distribution before.

Carl

2015-01-24 15:48 GMT+01:00 cjw c...@ncf.ca:

 On 22-Jan-15 6:23 PM, Nathaniel Smith wrote:

 On Thu, Jan 22, 2015 at 9:29 PM, Carl Kleffner cmkleff...@gmail.com
 wrote:

 I took time to create mingw-w64 based wheels of numpy-1.9.1 and
 scipy-0.15.1
 source distributions and put them on
 https://bitbucket.org/carlkl/mingw-w64-for-python/downloads as well as
 on
 binstar.org. The test matrix is python-2.7 and 3.4 for both 32bit and
 64bit.

 Feedback is welcome.

 The wheels can be pip installed with:

 pip install -i https://pypi.binstar.org/carlkl/simple numpy
 pip install -i https://pypi.binstar.org/carlkl/simple scipy

 Some technical details: the binaries are build upon OpenBLAS as
 accelerated
 BLAS/Lapack. OpenBLAS itself is build with dynamic kernels (similar to
 MKL)
 and automatic runtime selection depending on the CPU. The minimal
 requested
 feature supplied by the CPU is SSE2. SSE1 and non-SSE CPUs are not
 supported
 with this builds. This is the default for 64bit binaries anyway.

 According to the steam hardware survey, 99.98% of windows computers
 have SSE2. (http://store.steampowered.com/hwsurvey , click on other
 settings at the bottom). So this is probably OK :-).

  OpenBLAS is deployed as part of the numpy wheel. That said, the scipy
 wheels
 mentioned above are dependant on the installation of the OpenBLAS based
 numpy and won't work i.e. with an installed  numpy-MKL.

 This sounds like it probably needs to be fixed before we can recommend
 the scipy wheels for anyone? OTOH it might be fine to start
 distributing numpy wheels first.

  For the numpy 32bit builds there are 3 failures for special FP value
 tests,
 due to a bug in mingw-w64 that is still present. All scipy versions show
 up
 7 failures with some numerical noise, that could be ignored (or corrected
 with relaxed asserts in the test code).

 PR's for numpy and scipy are in preparation. The mingw-w64 compiler used
 for
 building can be found at
 https://bitbucket.org/carlkl/mingw-w64-for-python/downloads.

 Correct me if I'm wrong, but it looks like there isn't any details on
 how exactly the compiler was set up? Which is fine, I know you've been
 doing a ton of work on this and it's much appreciated :-). But
 eventually I do think a prerequisite for us adopting these as official
 builds is that we'll need a text document (or an executable script!)
 that walks through all the steps in setting up the toolchain etc., so
 that someone starting from scratch could get it all up and running.
 Otherwise we run the risk of eventually ending up back where we are
 today, with a creaky old mingw binary snapshot that no-one knows how
 it works or how to reproduce...

 -n

 Carl,

 I tried and failed, even after adding --pre.

 My log file is here:

 
 C:\Python27\Scripts\pip run on 01/24/15 07:51:10
 Downloading/unpacking https://pypi.binstar.org/carlkl/simple
   Downloading simple
   Downloading from URL https://pypi.binstar.org/carlkl/simple
 Cleaning up...
 Exception:
 Traceback (most recent call last):
   File C:\Python27\lib\site-packages\pip\basecommand.py, line 122, in
 main
 status = self.run(options, args)
   File C:\Python27\lib\site-packages\pip\commands\install.py, line 278,
 in run
 requirement_set.prepare_files(finder, force_root_egg_info=self.bundle,
 bundle=self.bundle)
   File C:\Python27\lib\site-packages\pip\req.py, line 1197, in
 prepare_files
 do_download,
   File C:\Python27\lib\site-packages\pip\req.py, line 1375, in
 unpack_url
 self.session,
   File C:\Python27\lib\site-packages\pip\download.py, line 582, in
 unpack_http_url
 unpack_file(temp_location, location, content_type, link)
   File C:\Python27\lib\site-packages\pip\util.py, line 627, in
 unpack_file
 and is_svn_page(file_contents(filename))):
   File C:\Python27\lib\site-packages\pip\util.py, line 210, in
 file_contents
 return fp.read().decode('utf-8')
   File C:\Python27\lib\encodings\utf_8.py, line 16, in decode
 return codecs.utf_8_decode(input, errors, True)
 UnicodeDecodeError: 'utf8' codec can't decode byte 0x8b in position 1:
 invalid start byte

 Do you have any suggestions?

 Colin W.

___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] new mingw-w64 based numpy and scipy wheel (still experimental)

2015-01-24 Thread cjw
On 22-Jan-15 6:23 PM, Nathaniel Smith wrote:
 On Thu, Jan 22, 2015 at 9:29 PM, Carl Kleffner cmkleff...@gmail.com wrote:
 I took time to create mingw-w64 based wheels of numpy-1.9.1 and scipy-0.15.1
 source distributions and put them on
 https://bitbucket.org/carlkl/mingw-w64-for-python/downloads as well as on
 binstar.org. The test matrix is python-2.7 and 3.4 for both 32bit and 64bit.

 Feedback is welcome.

 The wheels can be pip installed with:

 pip install -i https://pypi.binstar.org/carlkl/simple numpy
 pip install -i https://pypi.binstar.org/carlkl/simple scipy

 Some technical details: the binaries are build upon OpenBLAS as accelerated
 BLAS/Lapack. OpenBLAS itself is build with dynamic kernels (similar to MKL)
 and automatic runtime selection depending on the CPU. The minimal requested
 feature supplied by the CPU is SSE2. SSE1 and non-SSE CPUs are not supported
 with this builds. This is the default for 64bit binaries anyway.
 According to the steam hardware survey, 99.98% of windows computers
 have SSE2. (http://store.steampowered.com/hwsurvey , click on other
 settings at the bottom). So this is probably OK :-).

 OpenBLAS is deployed as part of the numpy wheel. That said, the scipy wheels
 mentioned above are dependant on the installation of the OpenBLAS based
 numpy and won't work i.e. with an installed  numpy-MKL.
 This sounds like it probably needs to be fixed before we can recommend
 the scipy wheels for anyone? OTOH it might be fine to start
 distributing numpy wheels first.

 For the numpy 32bit builds there are 3 failures for special FP value tests,
 due to a bug in mingw-w64 that is still present. All scipy versions show up
 7 failures with some numerical noise, that could be ignored (or corrected
 with relaxed asserts in the test code).

 PR's for numpy and scipy are in preparation. The mingw-w64 compiler used for
 building can be found at
 https://bitbucket.org/carlkl/mingw-w64-for-python/downloads.
 Correct me if I'm wrong, but it looks like there isn't any details on
 how exactly the compiler was set up? Which is fine, I know you've been
 doing a ton of work on this and it's much appreciated :-). But
 eventually I do think a prerequisite for us adopting these as official
 builds is that we'll need a text document (or an executable script!)
 that walks through all the steps in setting up the toolchain etc., so
 that someone starting from scratch could get it all up and running.
 Otherwise we run the risk of eventually ending up back where we are
 today, with a creaky old mingw binary snapshot that no-one knows how
 it works or how to reproduce...

 -n

Carl,

I tried and failed, even after adding --pre.

My log file is here:


C:\Python27\Scripts\pip run on 01/24/15 07:51:10
Downloading/unpacking https://pypi.binstar.org/carlkl/simple
  Downloading simple
  Downloading from URL https://pypi.binstar.org/carlkl/simple
Cleaning up...
Exception:
Traceback (most recent call last):
  File "C:\Python27\lib\site-packages\pip\basecommand.py", line 122, in main
    status = self.run(options, args)
  File "C:\Python27\lib\site-packages\pip\commands\install.py", line 278, in run
    requirement_set.prepare_files(finder, force_root_egg_info=self.bundle, bundle=self.bundle)
  File "C:\Python27\lib\site-packages\pip\req.py", line 1197, in prepare_files
    do_download,
  File "C:\Python27\lib\site-packages\pip\req.py", line 1375, in unpack_url
    self.session,
  File "C:\Python27\lib\site-packages\pip\download.py", line 582, in unpack_http_url
    unpack_file(temp_location, location, content_type, link)
  File "C:\Python27\lib\site-packages\pip\util.py", line 627, in unpack_file
    and is_svn_page(file_contents(filename))):
  File "C:\Python27\lib\site-packages\pip\util.py", line 210, in file_contents
    return fp.read().decode('utf-8')
  File "C:\Python27\lib\encodings\utf_8.py", line 16, in decode
    return codecs.utf_8_decode(input, errors, True)
UnicodeDecodeError: 'utf8' codec can't decode byte 0x8b in position 1: invalid start byte

Do you have any suggestions?

Colin W.
___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] new mingw-w64 based numpy and scipy wheel (still experimental)

2015-01-24 Thread Carl Kleffner
2015-01-23 0:23 GMT+01:00 Nathaniel Smith n...@pobox.com:

 On Thu, Jan 22, 2015 at 9:29 PM, Carl Kleffner cmkleff...@gmail.com
 wrote:
  I took time to create mingw-w64 based wheels of numpy-1.9.1 and
 scipy-0.15.1
  source distributions and put them on
  https://bitbucket.org/carlkl/mingw-w64-for-python/downloads as well as
 on
  binstar.org. The test matrix is python-2.7 and 3.4 for both 32bit and
 64bit.
 
  Feedback is welcome.
 
  The wheels can be pip installed with:
 
  pip install -i https://pypi.binstar.org/carlkl/simple numpy
  pip install -i https://pypi.binstar.org/carlkl/simple scipy
 
  Some technical details: the binaries are build upon OpenBLAS as
 accelerated
  BLAS/Lapack. OpenBLAS itself is build with dynamic kernels (similar to
 MKL)
  and automatic runtime selection depending on the CPU. The minimal
 requested
  feature supplied by the CPU is SSE2. SSE1 and non-SSE CPUs are not
 supported
  with this builds. This is the default for 64bit binaries anyway.

 According to the steam hardware survey, 99.98% of windows computers
 have SSE2. (http://store.steampowered.com/hwsurvey , click on other
 settings at the bottom). So this is probably OK :-).

  OpenBLAS is deployed as part of the numpy wheel. That said, the scipy
 wheels
  mentioned above are dependant on the installation of the OpenBLAS based
  numpy and won't work i.e. with an installed  numpy-MKL.

 This sounds like it probably needs to be fixed before we can recommend
 the scipy wheels for anyone? OTOH it might be fine to start
 distributing numpy wheels first.


I very much prefer dynamic linking to numpy\core\libopenblas.dll instead of
static linking to avoid bloat. This matters because libopenblas.dll is a
heavy library (around 30 MB for amd64). As a consequence, all packages with
dynamic linkage to OpenBLAS depend on numpy-openblas. This is no different
from scipy-MKL, which has a dependency on numpy-MKL - see C. Gohlke's site.

 For the numpy 32bit builds there are 3 failures for special FP value
 tests,
  due to a bug in mingw-w64 that is still present. All scipy versions show
 up
  7 failures with some numerical noise, that could be ignored (or corrected
  with relaxed asserts in the test code).
 
  PR's for numpy and scipy are in preparation. The mingw-w64 compiler used
 for
  building can be found at
  https://bitbucket.org/carlkl/mingw-w64-for-python/downloads.

 Correct me if I'm wrong, but it looks like there isn't any details on
 how exactly the compiler was set up? Which is fine, I know you've been
 doing a ton of work on this and it's much appreciated :-). But
 eventually I do think a prerequisite for us adopting these as official
 builds is that we'll need a text document (or an executable script!)
 that walks through all the steps in setting up the toolchain etc., so
 that someone starting from scratch could get it all up and running.
 Otherwise we run the risk of eventually ending up back where we are
 today, with a creaky old mingw binary snapshot that no-one knows how
 it works or how to reproduce...


This has to be done and is in preparation, but not ready for consumption
right now. Some preliminary information is given here:
https://bitbucket.org/carlkl/mingw-w64-for-python/downloads/mingwstatic-2014-11-readme.md


 -n

 --
 Nathaniel J. Smith
 Postdoctoral researcher - Informatics - University of Edinburgh
 http://vorpus.org
 ___
 NumPy-Discussion mailing list
 NumPy-Discussion@scipy.org
 http://mail.scipy.org/mailman/listinfo/numpy-discussion

___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] new mingw-w64 based numpy and scipy wheel (still experimental)

2015-01-23 Thread Carl Kleffner
All tests for the 64bit builds passed.

Carl

2015-01-23 0:11 GMT+01:00 Sturla Molden sturla.mol...@gmail.com:

 Were there any failures with the 64 bit build, or did all tests pass?

 Sturla


 On 22/01/15 22:29, Carl Kleffner wrote:
  I took time to create mingw-w64 based wheels of numpy-1.9.1 and
  scipy-0.15.1 source distributions and put them on
  https://bitbucket.org/carlkl/mingw-w64-for-python/downloads as well as
  on binstar.org http://binstar.org. The test matrix is python-2.7 and
  3.4 for both 32bit and 64bit.
 
  Feedback is welcome.
 
  The wheels can be pip installed with:
 
  pip install -i https://pypi.binstar.org/carlkl/simple numpy
  pip install -i https://pypi.binstar.org/carlkl/simple scipy
 
  Some technical details: the binaries are build upon OpenBLAS as
  accelerated BLAS/Lapack. OpenBLAS itself is build with dynamic kernels
  (similar to MKL) and automatic runtime selection depending on the CPU.
  The minimal requested feature supplied by the CPU is SSE2. SSE1 and
  non-SSE CPUs are not supported with this builds. This is the default for
  64bit binaries anyway.
 
  OpenBLAS is deployed as part of the numpy wheel. That said, the scipy
  wheels mentioned above are dependant on the installation of the OpenBLAS
  based numpy and won't work i.e. with an installed  numpy-MKL.
 
  For the numpy 32bit builds there are 3 failures for special FP value
  tests, due to a bug in mingw-w64 that is still present. All scipy
  versions show up 7 failures with some numerical noise, that could be
  ignored (or corrected with relaxed asserts in the test code).
 
  PR's for numpy and scipy are in preparation. The mingw-w64 compiler used
  for building can be found at
  https://bitbucket.org/carlkl/mingw-w64-for-python/downloads.
 
 
 
  ___
  NumPy-Discussion mailing list
  NumPy-Discussion@scipy.org
  http://mail.scipy.org/mailman/listinfo/numpy-discussion
 


 ___
 NumPy-Discussion mailing list
 NumPy-Discussion@scipy.org
 http://mail.scipy.org/mailman/listinfo/numpy-discussion

___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] new mingw-w64 based numpy and scipy wheel (still experimental)

2015-01-23 Thread Olivier Grisel
2015-01-23 9:25 GMT+01:00 Carl Kleffner cmkleff...@gmail.com:
 All tests for the 64bit builds passed.

Thanks very much, Carl. Did you have to patch the numpy / distutils
source to build those wheels, or is this using the source code from
the official releases?

-- 
Olivier
http://twitter.com/ogrisel - http://github.com/ogrisel
___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion


[Numpy-discussion] new mingw-w64 based numpy and scipy wheel (still experimental)

2015-01-22 Thread Carl Kleffner
I took time to create mingw-w64 based wheels of numpy-1.9.1 and
scipy-0.15.1 source distributions and put them on
https://bitbucket.org/carlkl/mingw-w64-for-python/downloads as well as on
binstar.org. The test matrix is python-2.7 and 3.4 for both 32bit and
64bit.
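
As a side note: whether a given Python is a 32bit or a 64bit build, and
therefore which wheel applies, can be checked with something like

  python -c "import platform; print(platform.architecture()[0])"

which prints either 32bit or 64bit.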

Feedback is welcome.

The wheels can be pip installed with:

pip install -i https://pypi.binstar.org/carlkl/simple numpy
pip install -i https://pypi.binstar.org/carlkl/simple scipy

Some technical details: the binaries are built upon OpenBLAS as accelerated
BLAS/LAPACK. OpenBLAS itself is built with dynamic kernels (similar to MKL)
and automatic runtime selection depending on the CPU. The minimal CPU feature
required is SSE2; SSE1 and non-SSE CPUs are not supported by these builds.
This is the default for 64bit binaries anyway.
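
For the curious, a dynamic-kernel OpenBLAS build of this kind boils down to
something like the following, run from a unix-like shell such as msys2; the
prefix path and thread count are placeholders, not necessarily the settings
used for these wheels:

  # DYNAMIC_ARCH=1 builds the per-CPU kernels and the runtime dispatcher
  make BINARY=64 DYNAMIC_ARCH=1 USE_THREAD=1 NUM_THREADS=8
  make PREFIX=/c/opt/openblas64 install

(BINARY=32 for the win32 variant.) A DYNAMIC_ARCH build can also be pinned
to a particular kernel at runtime via the OPENBLAS_CORETYPE environment
variable, which is handy when debugging CPU-specific issues.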

OpenBLAS is deployed as part of the numpy wheel. That said, the scipy
wheels mentioned above are dependent on the installation of the
OpenBLAS-based numpy and won't work, e.g., with an installed numpy-MKL.
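
If in doubt which BLAS a numpy installation was built against, its build
configuration can be inspected with e.g.

  python -c "import numpy; numpy.__config__.show()"

which for the wheels above should mention openblas, whereas a numpy-MKL
install will list the MKL libraries instead.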

For the numpy 32bit builds there are 3 failures for special FP value tests,
due to a bug in mingw-w64 that is still present. All scipy versions show
7 failures with some numerical noise, which could be ignored (or corrected
with relaxed asserts in the test code).
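
To reproduce these numbers, the test suites can be run after installing the
wheels with something along the lines of

  python -c "import numpy; numpy.test('full')"
  python -c "import scipy; scipy.test('full')"

(nose needs to be installed, since the test runners rely on it).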

PRs for numpy and scipy are in preparation. The mingw-w64 compiler used
for building can be found at
https://bitbucket.org/carlkl/mingw-w64-for-python/downloads.
___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] new mingw-w64 based numpy and scipy wheel (still experimental)

2015-01-22 Thread cjw

Thanks Carl,

This is good to hear.  I presume that the AMD64 is covered.

Colin W.

On 22-Jan-15 4:29 PM, Carl Kleffner wrote:

I took time to create mingw-w64 based wheels of numpy-1.9.1 and
scipy-0.15.1 source distributions and put them on
https://bitbucket.org/carlkl/mingw-w64-for-python/downloads as well as on
binstar.org. The test matrix is python-2.7 and 3.4 for both 32bit and
64bit.

Feedback is welcome.

The wheels can be pip installed with:

pip install -i https://pypi.binstar.org/carlkl/simple numpy
pip install -i https://pypi.binstar.org/carlkl/simple scipy

Some technical details: the binaries are built upon OpenBLAS as accelerated
BLAS/LAPACK. OpenBLAS itself is built with dynamic kernels (similar to MKL)
and automatic runtime selection depending on the CPU. The minimal CPU feature
required is SSE2; SSE1 and non-SSE CPUs are not supported by these builds.
This is the default for 64bit binaries anyway.

OpenBLAS is deployed as part of the numpy wheel. That said, the scipy
wheels mentioned above are dependent on the installation of the
OpenBLAS-based numpy and won't work, e.g., with an installed numpy-MKL.

For the numpy 32bit builds there are 3 failures for special FP value tests,
due to a bug in mingw-w64 that is still present. All scipy versions show
7 failures with some numerical noise, which could be ignored (or corrected
with relaxed asserts in the test code).

PRs for numpy and scipy are in preparation. The mingw-w64 compiler used
for building can be found at
https://bitbucket.org/carlkl/mingw-w64-for-python/downloads.


___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion



  

___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] new mingw-w64 based numpy and scipy wheel (still experimental)

2015-01-22 Thread Sturla Molden
Were there any failures with the 64 bit build, or did all tests pass?

Sturla


On 22/01/15 22:29, Carl Kleffner wrote:
 I took time to create mingw-w64 based wheels of numpy-1.9.1 and
 scipy-0.15.1 source distributions and put them on
 https://bitbucket.org/carlkl/mingw-w64-for-python/downloads as well as
 on binstar.org. The test matrix is python-2.7 and
 3.4 for both 32bit and 64bit.

 Feedback is welcome.

 The wheels can be pip installed with:

 pip install -i https://pypi.binstar.org/carlkl/simple numpy
 pip install -i https://pypi.binstar.org/carlkl/simple scipy

 Some technical details: the binaries are built upon OpenBLAS as
 accelerated BLAS/LAPACK. OpenBLAS itself is built with dynamic kernels
 (similar to MKL) and automatic runtime selection depending on the CPU.
 The minimal CPU feature required is SSE2; SSE1 and non-SSE CPUs are not
 supported by these builds. This is the default for 64bit binaries anyway.

 OpenBLAS is deployed as part of the numpy wheel. That said, the scipy
 wheels mentioned above are dependent on the installation of the
 OpenBLAS-based numpy and won't work, e.g., with an installed numpy-MKL.

 For the numpy 32bit builds there are 3 failures for special FP value
 tests, due to a bug in mingw-w64 that is still present. All scipy
 versions show 7 failures with some numerical noise, which could be
 ignored (or corrected with relaxed asserts in the test code).

 PRs for numpy and scipy are in preparation. The mingw-w64 compiler used
 for building can be found at
 https://bitbucket.org/carlkl/mingw-w64-for-python/downloads.



 ___
 NumPy-Discussion mailing list
 NumPy-Discussion@scipy.org
 http://mail.scipy.org/mailman/listinfo/numpy-discussion



___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] new mingw-w64 based numpy and scipy wheel (still experimental)

2015-01-22 Thread Nathaniel Smith
On Thu, Jan 22, 2015 at 9:29 PM, Carl Kleffner cmkleff...@gmail.com wrote:
 I took time to create mingw-w64 based wheels of numpy-1.9.1 and scipy-0.15.1
 source distributions and put them on
 https://bitbucket.org/carlkl/mingw-w64-for-python/downloads as well as on
 binstar.org. The test matrix is python-2.7 and 3.4 for both 32bit and 64bit.

 Feedback is welcome.

 The wheels can be pip installed with:

 pip install -i https://pypi.binstar.org/carlkl/simple numpy
 pip install -i https://pypi.binstar.org/carlkl/simple scipy

 Some technical details: the binaries are built upon OpenBLAS as accelerated
 BLAS/LAPACK. OpenBLAS itself is built with dynamic kernels (similar to MKL)
 and automatic runtime selection depending on the CPU. The minimal CPU feature
 required is SSE2; SSE1 and non-SSE CPUs are not supported by these builds.
 This is the default for 64bit binaries anyway.

According to the Steam hardware survey, 99.98% of Windows computers
have SSE2 (http://store.steampowered.com/hwsurvey, click on "other
settings" at the bottom). So this is probably OK :-).

 OpenBLAS is deployed as part of the numpy wheel. That said, the scipy wheels
 mentioned above are dependent on the installation of the OpenBLAS-based
 numpy and won't work, e.g., with an installed numpy-MKL.

This sounds like it probably needs to be fixed before we can recommend
the scipy wheels for anyone? OTOH it might be fine to start
distributing numpy wheels first.

 For the numpy 32bit builds there are 3 failures for special FP value tests,
 due to a bug in mingw-w64 that is still present. All scipy versions show
 7 failures with some numerical noise, which could be ignored (or corrected
 with relaxed asserts in the test code).

 PRs for numpy and scipy are in preparation. The mingw-w64 compiler used for
 building can be found at
 https://bitbucket.org/carlkl/mingw-w64-for-python/downloads.

Correct me if I'm wrong, but it looks like there aren't any details on
how exactly the compiler was set up? Which is fine, I know you've been
doing a ton of work on this and it's much appreciated :-). But
eventually I do think a prerequisite for us adopting these as official
builds is that we'll need a text document (or an executable script!)
that walks through all the steps in setting up the toolchain etc., so
that someone starting from scratch could get it all up and running.
Otherwise we run the risk of eventually ending up back where we are
today, with a creaky old mingw binary snapshot that no-one understands
or knows how to reproduce...
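
Even a rough skeleton at the level of the following would help a lot;
the archive name and paths here are completely made up, it's just to show
the granularity I have in mind:

  REM illustrative only, not a working script
  curl -LO https://bitbucket.org/carlkl/mingw-w64-for-python/downloads/<toolchain-archive>.7z
  7z x <toolchain-archive>.7z -oC:\mingwstatic
  set PATH=C:\mingwstatic\bin;%PATH%
  REM point distutils at mingw and at the OpenBLAS install, then:
  pip wheel --no-deps numpy==1.9.1

Something at that level, checked into a repo, would already make the build
much easier to reproduce.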

-n

-- 
Nathaniel J. Smith
Postdoctoral researcher - Informatics - University of Edinburgh
http://vorpus.org
___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] new mingw-w64 based numpy and scipy wheel (still experimental)

2015-01-22 Thread Carl Kleffner
Yes,

I built win32 as well as amd64 binaries.

Carlkl

2015-01-22 23:06 GMT+01:00 cjw c...@ncf.ca:

  Thanks Carl,

 This is good to hear.  I presume that the AMD64 is covered.

 Colin W.

 On 22-Jan-15 4:29 PM, Carl Kleffner wrote:

 I took time to create mingw-w64 based wheels of numpy-1.9.1 and
 scipy-0.15.1 source distributions and put them on
 https://bitbucket.org/carlkl/mingw-w64-for-python/downloads as well as on
 binstar.org. The test matrix is python-2.7 and 3.4 for both 32bit and
 64bit.

 Feedback is welcome.

 The wheels can be pip installed with:

 pip install -i https://pypi.binstar.org/carlkl/simple numpy
 pip install -i https://pypi.binstar.org/carlkl/simple scipy

 Some technical details: the binaries are built upon OpenBLAS as accelerated
 BLAS/LAPACK. OpenBLAS itself is built with dynamic kernels (similar to MKL)
 and automatic runtime selection depending on the CPU. The minimal CPU feature
 required is SSE2; SSE1 and non-SSE CPUs are not supported by these builds.
 This is the default for 64bit binaries anyway.

 OpenBLAS is deployed as part of the numpy wheel. That said, the scipy
 wheels mentioned above are dependent on the installation of the
 OpenBLAS-based numpy and won't work, e.g., with an installed numpy-MKL.

 For the numpy 32bit builds there are 3 failures for special FP value tests,
 due to a bug in mingw-w64 that is still present. All scipy versions show
 7 failures with some numerical noise, which could be ignored (or corrected
 with relaxed asserts in the test code).

 PRs for numpy and scipy are in preparation. The mingw-w64 compiler used
 for building can be found at
 https://bitbucket.org/carlkl/mingw-w64-for-python/downloads.




 ___
 NumPy-Discussion mailing list
 NumPy-Discussion@scipy.org
 http://mail.scipy.org/mailman/listinfo/numpy-discussion



 ___
 NumPy-Discussion mailing list
 NumPy-Discussion@scipy.org
 http://mail.scipy.org/mailman/listinfo/numpy-discussion


___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion