Re: [Numpy-discussion] 2D phase unwrapping
On Wednesday 26 November 2008 at 09:17 +0200, Nadav Horesh wrote:
> Is there a 2D phase unwrapping for python? I read a presentation by GERI
> (http://www.ljmu.ac.uk/GERI) that their code is implemented in scipy, but
> I could not find it.

I had the same problem a couple of days ago! Playing with the unwrap
function and the axis argument, I still did not manage to get rid of these
*** lines! The kind of results I had are available at
http://fsilva.perso.ec-marseille.fr/visible/tmp/ :
- tmp00.png : no unwrapping at all
- tmp10.png : unwrapping along the vertical axis
- tmp11.png : unwrapping along the vertical axis, then unwrapping the first
  line and applying the 2pi gaps to all lines...
- tmp20.png : unwrapping along the horizontal axis
--
Fabrice Silva

___
Numpy-discussion mailing list
Numpy-discussion@scipy.org
http://projects.scipy.org/mailman/listinfo/numpy-discussion
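For readers trying the same thing, here is a minimal sketch of the axis-by-axis approach described above, on a synthetic phase ramp (the test data is invented for the example). It only works because the ramp is smooth and noise-free by construction; real fringe images are much harder, which is where the *** lines come from.

```python
import numpy as np

# Synthetic smooth phase ramp, wrapped into (-pi, pi].
y, x = np.mgrid[0:64, 0:64]
true_phase = 0.3 * x + 0.2 * y
wrapped = np.angle(np.exp(1j * true_phase))

# Unwrap along the vertical axis, then along the horizontal axis.
unwrapped = np.unwrap(np.unwrap(wrapped, axis=0), axis=1)

# On smooth data this recovers the phase up to a constant 2pi multiple.
offset = unwrapped[0, 0] - true_phase[0, 0]
print(np.allclose(unwrapped - offset, true_phase))
```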
Re: [Numpy-discussion] More loadtxt() changes
Ryan May wrote:

Hi,

I have a couple more changes to loadtxt() that I'd like to code up in time
for 1.3, but I thought I should run them by the list before doing too much
work. These are already implemented in some fashion in
matplotlib.mlab.csv2rec(), but the code bases are different enough that
pretty much only the idea can be lifted. All of these changes would be done
in a manner that is backwards compatible with the current API.

1) Support for setting the names of fields in the returned structured
array without using dtype. This can be a passed-in list of names, or the
names can be read from the first line of the file. Many files have a header
line that gives a name for each column. Adding this would obviously make
loadtxt much more general and allow for more generic code, IMO. My current
thinking is to add a *names* keyword parameter that defaults to None, for
no support for reading names. Setting it to True would tell loadtxt() to
read the names from the first line (after skiprows). The other option would
be to set names to a list of strings.

2) Support for automatic dtype inference. Instead of assuming all values
are floats, this would try a list of options until one worked. For strings,
this would keep track of the longest string within a given field before
setting the dtype. This would allow reading of files containing a mixture
of types much more easily, without having to go to the trouble of
constructing a full dtype by hand. This would work alongside any custom
converters one passes in. My current thinking of API would just be to add
the option of passing the string 'auto' as the dtype parameter.

3) Better support for missing values. The docstring mentions a way of
handling missing values by passing in a converter. The problem with this is
that you have to pass in a converter for *every column* that will contain
missing values. If you have a text file with 50 columns, writing this
dictionary of converters seems like ugly and needless boilerplate.
> I'm unsure of how best to pass in both what values indicate missing
> values and what values to fill in their place. I'd love suggestions.

Hi Ryan,
this would be a great feature to have !!!

One question: I have a datafile in ASCII format that uses a fixed width for
each column. If no data is present, the space is left empty (see second
row). What is the default behavior of the StringConverter class in this
case? Does it ignore the empty entry by default? If so, what is the value
in the array in this case? Is it nan?

Example file:
1| 123.4| -123.4| 00.0
2|      |  234.7| 12.2

Manuel

> Here's an example of my use case (without 50 columns):
>
> ID,First Name,Last Name,Homework1,Homework2,Quiz1,Homework3,Final
> 1234,Joe,Smith,85,90,,76,
> 5678,Jane,Doe,65,99,,78,
> 9123,Joe,Plumber,45,90,,92,
>
> Currently reading in this file requires a bit of boilerplate (declaring
> dtypes, converters). While it's nothing I can't write, it still would be
> easier to write it once within loadtxt and have it for everyone. Any
> support for *any* of these ideas? Any suggestions on how the user should
> pass in the information?
>
> Thanks, Ryan
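For reference, here is roughly what the boilerplate for Ryan's grade file looks like today (Python 3 and io.StringIO used so the snippet runs standalone; the field names are invented for the example): a full dtype plus one converter repeated for every column that may be empty.

```python
import numpy as np
from io import StringIO

data = StringIO(
    "1234,Joe,Smith,85,90,,76,\n"
    "5678,Jane,Doe,65,99,,78,\n"
    "9123,Joe,Plumber,45,90,,92,\n"
)

# The converter must be repeated for every column that can be empty --
# exactly the boilerplate the proposal above would remove.
miss = lambda s: float(s.strip() or 'nan')
dtype = [('id', int), ('first', 'U16'), ('last', 'U16'),
         ('hw1', float), ('hw2', float), ('quiz1', float),
         ('hw3', float), ('final', float)]
grades = np.loadtxt(data, delimiter=',', dtype=dtype,
                    converters={3: miss, 4: miss, 5: miss, 6: miss, 7: miss})
print(grades['hw1'])  # the Homework1 column as floats
```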
Re: [Numpy-discussion] More loadtxt() changes
On Tue, Nov 25, 2008 at 11:23 PM, Ryan May [EMAIL PROTECTED] wrote:
> Updated patch attached. This includes:
> * Updated docstring
> * New tests
> * Fixes for previous issues
> * Fixes to make new tests actually work
> I appreciate any and all feedback.

I'm having trouble applying your patch, so I haven't tested yet, but do you
(and do you want to) handle a case like this::

    from StringIO import StringIO
    import matplotlib.mlab as mlab

    f1 = StringIO("""\
    name   age  weight
    John   23   145.
    Harry  43   180.""")

    for line in f1:
        print line.split(' ')

Ie, space delimited but using an irregular number of spaces? One place this
comes up a lot is when the output files are actually fixed-width, using
spaces to line up the columns. One could count the columns to figure out
the fixed widths and work with that, but it is much easier to simply assume
space delimiting and handle the irregular number of spaces, treating one or
more spaces as the delimiter. In csv2rec, we write a custom file object to
handle this case. Apologies if you are already handling this and I missed
it...

JDH
Re: [Numpy-discussion] Problems building numpy on solaris 10 x86
On Tue, Nov 25, 2008 at 11:28 PM, David Cournapeau [EMAIL PROTECTED] wrote:
> Charles R Harris wrote:
>> What happens if you go the usual python setup.py {build,install} route?
>
> Won't go far since it does not handle sunperf.
>
> David

Even though the regular build process appears to complete, it seems to be
doing the wrong thing. It seems, for instance, that lapack_lite.so is being
built as an executable:

[EMAIL PROTECTED] 11:14 ~ $ gnu file /usr/local/python-2.5.1/lib/python2.5/site-packages/numpy/linalg/lapack_lite.so
/usr/local/python-2.5.1/lib/python2.5/site-packages/numpy/linalg/lapack_lite.so: ELF 32-bit LSB executable, Intel 80386, version 1 (SYSV), dynamically linked (uses shared libs), not stripped

???

-Peter
[Numpy-discussion] ANNOUNCE: EPD with Py2.5 version 4.0.30002 RC2 available for testing
Hello,

We've recently posted the beta1 build of EPD (the Enthought Python
Distribution) with Python 2.5, version 4.1.30001, to the EPD website. You
may download the beta from here:
http://www.enthought.com/products/epdearlyaccess.php

You can check out the release notes here:
https://svn.enthought.com/epd/wiki/Python2.5.2/4.1.300/Beta1

Please help us test it out and provide feedback on the EPD Trac instance:
https://svn.enthought.com/epd
or via e-mail to [EMAIL PROTECTED] If everything goes well, we are planning
a final release for December.

About EPD
---------
The Enthought Python Distribution (EPD) is a kitchen-sink-included
distribution of the Python™ Programming Language, including over 60
additional tools and libraries. The EPD bundle includes NumPy, SciPy,
IPython, 2D and 3D visualization, database adapters, GUI building
libraries, and a lot of other tools right out of the box.
http://www.enthought.com/products/epd.php

It is currently available as a single-click installer for Windows XP (x86),
Mac OS X (a universal binary for OS X 10.4 and above), and RedHat 3 and 4
(x86 and amd64).

EPD is free for academic use. An annual subscription and installation
support are available for individual commercial use. Enterprise
subscriptions with support for particular deployment environments are also
available for commercial purchase.

Enthought Build Team
Re: [Numpy-discussion] ANNOUNCE: EPD with Py2.5 version 4.0.30002 RC2 available for testing
On Wed, Nov 26, 2008 at 12:05:07PM -0600, Travis E. Oliphant wrote:
> We've recently posted the beta1 build of EPD (the Enthought Python
> Distribution) with Python 2.5 version 4.1.30001 to the EPD website. You
> may download the beta from here:
> http://www.enthought.com/products/epdearlyaccess.php

Congratulations on a quicker pace of releases.

Gaël
Re: [Numpy-discussion] ANNOUNCE: EPD with Py2.5 version 4.0.30002 RC2 available for testing
Travis E. Oliphant wrote:
> Hello,
SNIP

Hi Travis,

> It is currently available as a single-click installer for Windows XP
> (x86), Mac OS X (a universal binary for OS X 10.4 and above), and RedHat
> 3 and 4 (x86 and amd64).

I am sure you mean RHEL 3 and 4? "RedHat 3 and 4" always strikes me as
vague :)

SNIP
> Enthought Build Team

Cheers,
Michael
Re: [Numpy-discussion] 2D phase unwrapping
On Wed, Nov 26, 2008 at 04:19, Fabrice Silva [EMAIL PROTECTED] wrote:
> On Wednesday 26 November 2008 at 09:17 +0200, Nadav Horesh wrote:
>> Is there a 2D phase unwrapping for python? I read a presentation by GERI
>> (http://www.ljmu.ac.uk/GERI) that their code is implemented in scipy,
>> but I could not find it.
>
> I had the same problem a couple of days ago! Playing with the unwrap
> function and the axis argument, I still did not manage to get rid of
> these *** lines!

2D phase unwrapping is a very tricky problem, particularly if you have
noise. I don't expect that you will have much success just applying the 1D
unwrap in various ways. The algorithms are fairly sophisticated. Links to
the GERI C++ code appear to be here:

http://www.ljmu.ac.uk/GERI/90207.htm

You do have to click through a restrictive license, though.

--
Robert Kern

"I have come to believe that the whole world is an enigma, a harmless
enigma that is made terrible by our own mad attempt to interpret it as
though it had an underlying truth." -- Umberto Eco
Re: [Numpy-discussion] More loadtxt() changes
John Hunter wrote:
> I'm having trouble applying your patch, so I haven't tested yet, but do
> you (and do you want to) handle a case like this::
>
>     from StringIO import StringIO
>     import matplotlib.mlab as mlab
>
>     f1 = StringIO("""\
>     name   age  weight
>     John   23   145.
>     Harry  43   180.""")
>
>     for line in f1:
>         print line.split(' ')
>
> Ie, space delimited but using an irregular number of spaces? [...] In
> csv2rec, we write a custom file object to handle this case. Apologies if
> you are already handling this and I missed it...

I think line.split(None) handles this case, so *in theory* passing
delimiter=None would do it. I *am* interested in this case, so I'll have to
give it a try when I get a chance. (I sense this is the same case as Manuel
just asked about.)

Ryan

--
Ryan May
Graduate Research Assistant
School of Meteorology
University of Oklahoma
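Ryan's line.split(None) suggestion is indeed the behaviour needed here; a quick illustration (Python 3 print syntax used so the snippet runs standalone):

```python
# split(' ') treats every single space as a delimiter, producing empty
# fields for runs of spaces; split(None) -- the default -- collapses any
# whitespace run into one delimiter, which is what space-aligned columns
# need. np.loadtxt's delimiter=None maps to the latter behaviour.
line = "Harry   43   180."
print(line.split(' '))   # ['Harry', '', '', '43', '', '', '180.']
print(line.split(None))  # ['Harry', '43', '180.']
```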
Re: [Numpy-discussion] More loadtxt() changes
Manuel Metz wrote:
> Ryan May wrote:
>> 3) Better support for missing values. The docstring mentions a way of
>> handling missing values by passing in a converter. The problem with this
>> is that you have to pass in a converter for *every column* that will
>> contain missing values. If you have a text file with 50 columns, writing
>> this dictionary of converters seems like ugly and needless boilerplate.
>> I'm unsure of how best to pass in both what values indicate missing
>> values and what values to fill in their place. I'd love suggestions.
>
> Hi Ryan, this would be a great feature to have !!!

Thanks for the support!

> One question: I have a datafile in ASCII format that uses a fixed width
> for each column. If no data is present, the space is left empty (see
> second row). What is the default behavior of the StringConverter class in
> this case? Does it ignore the empty entry by default? If so, what is the
> value in the array in this case? Is it nan?
>
> Example file:
> 1| 123.4| -123.4| 00.0
> 2|      |  234.7| 12.2

I don't think this has so much to do with StringConverter as with how to
split lines. Maybe we should add an option that, instead of simply
specifying characters that delimit the fields, allows one to pass a custom
function to split lines. That could either be done by overriding
`delimiter` or by adding a new option like `splitter`. I'll have to give
that some thought.

Ryan

--
Ryan May
Graduate Research Assistant
School of Meteorology
University of Oklahoma
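If such a `splitter` option existed, a sketch of one for fixed-width fields might look like this (the field widths, separator handling, and function name below are assumptions invented for Manuel's example file, not an actual proposal or API):

```python
def make_splitter(widths, sep_width=1):
    """Return a callable that slices fixed-width fields out of a line,
    so empty fields survive as empty strings instead of vanishing."""
    def split(line):
        fields, pos = [], 0
        for w in widths:
            fields.append(line[pos:pos + w].strip())
            pos += w + sep_width  # skip past the field and its separator
        return fields
    return split

split = make_splitter([1, 6, 7, 5])
print(split("1| 123.4| -123.4| 00.0"))  # ['1', '123.4', '-123.4', '00.0']
print(split("2|      |  234.7| 12.2"))  # ['2', '', '234.7', '12.2']
```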
Re: [Numpy-discussion] More loadtxt() changes
On Nov 26, 2008, at 5:55 PM, Ryan May wrote:
> [...]

About missing values:

* I don't think missing values should be supported in np.loadtxt. That
should go into a specific np.ma.io.loadtxt function, a preview of which I
posted earlier. I'll modify it taking Ryan's new function into account, and
Christopher's suggestion (defining a dictionary {column name : missing
values}).

* StringConverter already defines some default filling values for each
dtype. In np.ma.io.loadtxt, these values can be overwritten. Note that you
should also be able to define a filling value by specifying a converter
(think float(x or 0), for example).

* Missing values in space-separated fields are very tricky to handle: take
a line like "a,,,d". With a comma as separator, it's clear that the 2nd and
3rd fields are missing. Now, imagine that the commas are actually spaces
("a   d"): 'd' is now seen as the 2nd field of a 2-field record, not as the
4th field of a 4-field record with 2 missing values. I thought about it,
and kicked it into touch.

* That said, there should be a way to deal with fixed-length fields,
probably by taking consecutive slices of the initial string. That way, we
should be able to keep track of missing data...
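Pierre's point about space-separated missing fields can be made concrete in two lines of Python:

```python
# With an explicit delimiter, empty fields survive the split; with
# whitespace splitting they silently disappear, so the field count is lost.
print("a,,,d".split(','))   # ['a', '', '', 'd']  -> 4 fields, 2 missing
print("a   d".split(None))  # ['a', 'd']          -> looks like 2 fields
```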
[Numpy-discussion] numpy errors when importing in Picalo
Hi,

I get numpy errors after I install Picalo (www.picalo.org) on Mac OS X
10.4.11 Tiger. I have tried to import numpy in Picalo using the
instructions in the Picalo Cookbook, p. 101. I get this error message which
I don't understand. Per the Picalo author (see below for his reply to my
email to the Picalo discussion forum), I am trying it here. I use numpy
v. 1.0.4 distributed with the Scipy superpack
(http://macinscience.org/?page_id=6). Could anyone please help?

Thanks, and cheers
Igor

sys.path.append('/Library/Frameworks/Python.framework/Versions/2.5/lib/python2.5/site-packages/')
import numpy

Traceback (most recent call last):
  File "<input>", line 1, in <module>
  File "/Applications/sage/local/lib/python2.5/site-packages/numpy/__init__.py", line 93, in <module>
  File "/Applications/sage/local/lib/python2.5/site-packages/numpy/add_newdocs.py", line 9, in <module>
  File "/Applications/sage/local/lib/python2.5/site-packages/numpy/lib/__init__.py", line 4, in <module>
  File "/Applications/sage/local/lib/python2.5/site-packages/numpy/lib/type_check.py", line 8, in <module>
  File "/Applications/sage/local/lib/python2.5/site-packages/numpy/core/__init__.py", line 5, in <module>
ImportError: dlopen(/Applications/sage/local/lib/python2.5/site-packages/numpy/core/multiarray.so, 2): Symbol not found: _PyUnicodeUCS4_FromUnicode
  Referenced from: /Applications/sage/local/lib/python2.5/site-packages/numpy/core/multiarray.so
  Expected in: dynamic lookup

Conan C. Albrecht wrote (Nov 23):
> You're doing everything right from my perspective. It looks like a
> problem with NumPy. The stack trace goes to multiarray.so in their core
> toolkit. I think you should hit their forums and see if they can help.
> One idea is that Picalo uses unicode for all data values. Perhaps numpy
> can't handle unicode?
Re: [Numpy-discussion] numpy errors when importing in Picalo
igor Halperin wrote: Hi, Hi I get numpy errors after I install Picalo (www.picalo.org http://www.picalo.org) on Mac OS X 10.4.11 Tiger. I have tried to import numpy in Picalo using the instructions in PicaloCookBook, p.101. I get this error message which I don't understand. Per Picalo author (see below for his reply to my email to Picalo discussion forum), I try it here. I use numpy v. 1.0.4. http://1.0.4. distributed with Scipy superpack (http://macinscience.org/?page_id=6) Could anyone please help? The problem is that numpy was build using a python that was build with ucs4 (it is a unicode thing) while the python you run (I assume the Apple one) is ucs2. To fix this either build your own numpy or get a binary one that is ucs2, but I have no clue where one would get such a thing. Thanks, and cheers Igor Cheers, Michael sys.path.append('/Library/Frameworks/Python.framework/Versions/2.5/lib/python2.5/site-packages/') import numpy Traceback (most recent call last): File input, line 1, in module File /Applications/sage/local/lib/python2.5/site-packages/numpy/__init__.py, line 93, in module File /Applications/sage/local/lib/python2.5/site-packages/numpy/add_newdocs.py, line 9, in module File /Applications/sage/local/lib/python2.5/site-packages/numpy/lib/__init__.py, line 4, in module File /Applications/sage/local/lib/python2.5/site-packages/numpy/lib/type_check.py, line 8, in module File /Applications/sage/local/lib/python2.5/site-packages/numpy/core/__init__.py, line 5, in module ImportError: dlopen(/Applications/sage/local/lib/python2.5/site-packages/numpy/core/multiarray.so, 2): Symbol not found: _PyUnicodeUCS4_FromUnicode Referenced from: /Applications/sage/local/lib/python2.5/site-packages/numpy/core/multiarray.so Expected in: dynamic lookup Reply Forward Conan C. Albrecht to me, users show details Nov 23 (3 days ago) [smime.p7s] Reply You're doing everything right from my perspective. It looks like a problem with NumPy. 
The stack trace goes to multiarray.so in their core toolkit. I think you should hit their forums and see if they can help. One idea is that Picalo uses unicode for all data values. Perhaps numpy can't handle unicode? ___ Numpy-discussion mailing list Numpy-discussion@scipy.org http://projects.scipy.org/mailman/listinfo/numpy-discussion ___ Numpy-discussion mailing list Numpy-discussion@scipy.org http://projects.scipy.org/mailman/listinfo/numpy-discussion
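As an aside, one can check which Unicode build a given interpreter is (the mismatch behind the _PyUnicodeUCS4_FromUnicode error) from sys.maxunicode; this is a generic check, not specific to the Sage python above. Note that Python 3.3+ abandoned the narrow/wide split, so modern interpreters always report the wide value.

```python
import sys

# Narrow (ucs2) Python builds report 0xFFFF; wide (ucs4) builds report
# 0x10FFFF. A numpy compiled against one kind will fail to import on the
# other with an unresolved _PyUnicodeUCS{2,4}_* symbol.
if sys.maxunicode == 0xFFFF:
    print("narrow (ucs2) build")
else:
    print("wide (ucs4) build")
```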
Re: [Numpy-discussion] 2D phase unwrapping
On Tue, Nov 25, 2008 at 11:17 PM, Nadav Horesh [EMAIL PROTECTED] wrote:
> I read a presentation by GERI (http://www.ljmu.ac.uk/GERI) that their
> code is implemented in scipy, but I could not find it.

One of my colleagues has been using 2D and 3D phase unwrapping code from
Munther Gdeisat from GERI:
https://cirl.berkeley.edu/trac/browser/bic/trunk/recon-tools/src
https://cirl.berkeley.edu/trac/browser/bic/trunk/recon-tools/root/recon/punwrap

This code is very high quality, and replicating it from scratch would be a
fairly daunting task. I was hoping to get this code integrated into SciPy,
but no one in my group has had time to do this. Munther Gdeisat and I spoke
on the phone and had an email exchange about relicensing his code and
integrating it into SciPy. Munther was very interested in having this
happen and had some discussions with the Institute Director to get
permission for relicensing the code. I have appended our email exchange
below. If anyone is interested in picking this up and going through the
effort of incorporating this code in scipy, I would be happy to help
resolve any remaining licensing issues. I also may be able to devote some
programming resources to helping out, if someone else volunteers to do the
majority of the work.

Thanks,

---------- Forwarded message ----------
From: Gdeisat, Munther [EMAIL PROTECTED]
Date: Fri, Sep 28, 2007 at 1:07 PM
Subject: RE: 3D phase unwrap
To: Jarrod Millman [EMAIL PROTECTED]
Cc: Daniel Sheltraw [EMAIL PROTECTED], Travis E. Oliphant [EMAIL PROTECTED]

Dear Jarrod,

On behalf of the General Engineering Research Institute (GERI), Liverpool
John Moores University, UK, I am very happy to license our 2D and 3D phase
unwrappers for use in your NumPy and SciPy libraries. I spoke about this
matter with the director of our institute (GERI), Prof. Burton, and he is
also happy to license the code for both libraries mentioned above. But
myself and Prof. Burton would like to stress the following issues:

1- We disclaim all responsibility for the use which is made of the
Software. We further disclaim any liability for the outcomes arising from
using the Software.

2- We are not obliged to update the software or give any support to the
users of the software. We generally help researchers around the world but
we are not obliged to do that.

Following our phone call, you mentioned to me that you already have these
two points mentioned in the license of both libraries. So, I can confirm
that you can include our software in your library.

Yours Truly,
Dr. Munther Gdeisat
The General Engineering Research Institute (GERI)
Liverpool John Moores University, UK

From: [EMAIL PROTECTED] on behalf of Jarrod Millman
Sent: Fri 9/28/2007 9:54 PM
To: Gdeisat, Munther
Cc: Daniel Sheltraw; Travis E. Oliphant
Subject: Re: 3D phase unwrap

Hello Munther,

It was good to speak to you on the phone. I am happy that you will be able
to relicense your code for us. Here is the license we use:
http://projects.scipy.org/scipy/scipy/browser/trunk/LICENSE.txt
It should address all your concerns. Feel free to let me know if you have
any questions about it.

Thanks,
--
Jarrod Millman
Computational Infrastructure for Research Labs
10 Giannini Hall, UC Berkeley
phone: 510.643.4014
http://cirl.berkeley.edu/

From: [EMAIL PROTECTED] on behalf of Jarrod Millman
Sent: Fri 9/28/2007 3:02 AM
To: Gdeisat, Munther
Cc: Daniel Sheltraw; Travis E. Oliphant
Subject: Re: 3D phase unwrap

On 9/26/07, Gdeisat, Munther [EMAIL PROTECTED] wrote:
> Firstly, I would like to thank Daniel for bringing us together. I am
> happy to include the 2D and 3D phase unwrappers in the NumPy/SciPy
> project. If you need any help regarding this matter such as
> documentation, I am happy to do so. Kind regards.

Hello Munther,

I am very excited about the possibility of getting your 2D and 3D phase
unwrappers incorporated into SciPy (http://www.scipy.org/). Travis Oliphant
(the main author of NumPy and a major contributor to SciPy) and I spoke
about where your phase unwrapping code would best fit, and we both agreed
that they belong in SciPy. NumPy and SciPy are both part of the same
technology stack. We try to keep NumPy as lean as possible, leaving SciPy
to provide a more comprehensive set of tools. Here is an article about
NumPy/SciPy written by Travis from a recent special issue of IEEE's
Computing in Science and Engineering, which was devoted to Python for
scientific programming:
http://www.computer.org/portal/cms_docs_cise/cise/2007/n3/10-20.pdf

Anyway, I am the current release manager of SciPy and am eager to get your
phase unwrappers incorporated ASAP. Phase unwrapping is currently missing
from SciPy and Daniel has spoken very highly of your algorithms and code.
The only potential issue I see involves the licensing. Both SciPy and
NumPy are
Re: [Numpy-discussion] Problems building numpy on solaris 10 x86
David Cournapeau wrote:
> On Thu, Nov 27, 2008 at 1:16 AM, Peter Norton [EMAIL PROTECTED] wrote:
>> Even though the regular build process appears to complete, it seems to
>> be doing the wrong thing. It seems, for instance, that lapack_lite.so is
>> being built as an executable: [...]
>
> I think this is expected if python was built with one compiler and numpy
> with another (python with Forte and numpy with gcc). Distutils takes the
> options from python itself, whereas it is optional in numscons (in
> theory, you can set it up to use python options or known configurations).

Hmm, I have recently built numpy 1.2.1 on FreeBSD 7 and had trouble with
lapack_lite.so. The fix was to add a -shared flag. I needed the same fix
for Cygwin.

> I don't think you will have much hope with distutils, unless you are
> ready to add code by yourself (sunperf will be very difficult to support,
> though).

Why? What do you think makes sunperf problematic? [Not that I want to do
the work, just curious :)]

> The numscons error has nothing to do with solaris, the scons scripts
> should be there. Could you give me the full output of python
> setupscons.py scons ?
>
> David

Cheers,
Michael
Re: [Numpy-discussion] Problems building numpy on solaris 10 x86
On Thu, Nov 27, 2008 at 1:38 PM, Michael Abshoff [EMAIL PROTECTED] wrote:
> Why? What do you think makes sunperf problematic? [Not that I want to do
> the work, just curious :)]

I *know* it will be difficult :) The problem with sunperf is that you
cannot just link a few libraries to make it work; you need to use
compiler-specific options like -xlic_lib=sunperf plus some compiler options
like align and co. Worse, in at least the versions I tried, the option does
not work for shared libraries (when using the -G option), so using it with
gcc is complicated. The only reason why it works in numscons is that there
is a workaround a la autoconf which links sunperf to a dummy main, and I
added a small link parser which parses the output of the verbose link step
to get the options dynamically:

http://bazaar.launchpad.net/%7Edavid-ar/numpy.scons.support/0.9/annotate/314?file_id=misc.py-20080116113453-hssst2gc3fs30vre-1

In theory, it could be added to distutils. Not that I will do it myself
either, though.

David
[Numpy-discussion] What happened to numpy-docs ?
All,

I'd like to update routines.ma.rst on the numpy/numpy-docs/trunk SVN, but
the whole trunk seems to be MIA... Where has it gone? How can I (where
should I) commit changes?

Thx in advance.
P.
Re: [Numpy-discussion] What happened to numpy-docs ?
On Wed, Nov 26, 2008 at 23:27, Pierre GM [EMAIL PROTECTED] wrote:
> All, I'd like to update routines.ma.rst on the numpy/numpy-docs/trunk
> SVN, but the whole trunk seems to be MIA... Where has it gone? How can I
> (where should I) commit changes?

It got moved into the numpy trunk under docs/.

--
Robert Kern

"I have come to believe that the whole world is an enigma, a harmless
enigma that is made terrible by our own mad attempt to interpret it as
though it had an underlying truth." -- Umberto Eco
Re: [Numpy-discussion] What happened to numpy-docs ?
On Thu, Nov 27, 2008 at 2:32 PM, Robert Kern [EMAIL PROTECTED] wrote:
> On Wed, Nov 26, 2008 at 23:27, Pierre GM [EMAIL PROTECTED] wrote:
>> I'd like to update routines.ma.rst on the numpy/numpy-docs/trunk SVN,
>> but the whole trunk seems to be MIA...
>
> It got moved into the numpy trunk under docs/.

While we are speaking about the moved docs: has it been decided how we will
distribute them? For now, they are not included in the generated tarball,
and I was wondering how we should distribute them (before, they went into
.../site-packages/numpy/doc). Distutils does not have the notion of
installed docs outside the package itself, right?

cheers,
David
Re: [Numpy-discussion] What happened to numpy-docs ?
On Nov 27, 2008, at 12:32 AM, Robert Kern wrote:
> It got moved into the numpy trunk under docs/.

Duh... Guess I fell right at the time of the change. Robert, thx a lot!

Pauli, do you think you could put your numpyext in the doc/ directory as
well?

Cheers,
P.
Re: [Numpy-discussion] 2D phase unwrapping
My problem is how to calculate the power flow (free-space Poynting vector) given an image of a complex scalar electric-field image. This requires to calculate the *derivative* of the phase, and I think I found a way to do it directly bypassing phase unwrapping. 1. I may return to unwrapping if I'll have to do so. I downloaded the code from GERI, it looks like a pure C code, so it might be an easy task to bind it to python. 2. Fabrice's problem involves a smooth image. It may be not to hard to make it. I'll try to code it in the next week, and port it here if I'll succeed. 3. Does anyone know an existing python code to solve the problem presented above? I'll post my code here (when it'll be ready) if some of you are interested. Nadav. -הודעה מקורית- מאת: [EMAIL PROTECTED] בשם Jarrod Millman נשלח: ה 27-נובמבר-08 06:15 אל: Discussion of Numerical Python נושא: Re: [Numpy-discussion] 2D phase unwrapping On Tue, Nov 25, 2008 at 11:17 PM, Nadav Horesh [EMAIL PROTECTED] wrote: I read a presentation by GERI (http://www.ljmu.ac.uk/GERI) that their code is implemented in scipy, but I could not find it. One of my colleagues has been using 2D and 3D phase unwrapping code from Munther Gdeisat from GERI: https://cirl.berkeley.edu/trac/browser/bic/trunk/recon-tools/src https://cirl.berkeley.edu/trac/browser/bic/trunk/recon-tools/root/recon/punwrap This code is very high quality and replicating it from scratch would be a fairly daunting task. I was hoping to get this code integrated into SciPy, but no one in my group has had time to do this. Munther Gdeisat and I spoke on the phone and had an email exchange about relicensing his code and integrating it into SciPy. Munther was very interested in having this happen and had some discussions with the Institute Director to get permission for relicencing the code. I have appended our email exchange below. 
If anyone is interested in picking this up and going through the effort of incorporating this code into scipy, I would be happy to help resolve any remaining licensing issues. I also may be able to devote some programming resources to helping out, if someone else volunteers to do the majority of the work. Thanks, -- Forwarded message -- From: Gdeisat, Munther [EMAIL PROTECTED] Date: Fri, Sep 28, 2007 at 1:07 PM Subject: RE: 3D phase unwrap To: Jarrod Millman [EMAIL PROTECTED] Cc: Daniel Sheltraw [EMAIL PROTECTED], Travis E. Oliphant [EMAIL PROTECTED] Dear Jarrod, On behalf of the General Engineering Research Institute (GERI), Liverpool John Moores University, UK, I am very happy to license our 2D and 3D phase unwrappers for use in your NumPy and SciPy libraries. I spoke about this matter with the director of our institute (GERI), Prof. Burton, and he is also happy to license the code for both libraries mentioned above. However, Prof. Burton and I would like to stress the following issues: 1. We disclaim all responsibility for the use which is made of the Software. We further disclaim any liability for the outcomes arising from using the Software. 2. We are not obliged to update the software or give any support to its users. We generally help researchers around the world, but we are not obliged to do that. Following our phone call, you mentioned to me that you already have these two points covered in the license of both libraries. So, I can confirm that you can include our software in your library. Yours Truly, Dr. Munther Gdeisat The General Engineering Research Institute (GERI) Liverpool John Moores University, UK From: [EMAIL PROTECTED] on behalf of Jarrod Millman Sent: Fri 9/28/2007 9:54 PM To: Gdeisat, Munther Cc: Daniel Sheltraw; Travis E. Oliphant Subject: Re: 3D phase unwrap Hello Munther, It was good to speak to you on the phone. I am happy that you will be able to relicense your code for us.
Here is the license we use: http://projects.scipy.org/scipy/scipy/browser/trunk/LICENSE.txt It should address all your concerns. Feel free to let me know if you have any questions about it. Thanks, -- Jarrod Millman Computational Infrastructure for Research Labs 10 Giannini Hall, UC Berkeley phone: 510.643.4014 http://cirl.berkeley.edu/ From: [EMAIL PROTECTED] on behalf of Jarrod Millman Sent: Fri 9/28/2007 3:02 AM To: Gdeisat, Munther Cc: Daniel Sheltraw; Travis E. Oliphant Subject: Re: 3D phase unwrap On 9/26/07, Gdeisat, Munther [EMAIL PROTECTED] wrote: Firstly, I would like to thank Daniel for bringing us together. I am happy to include the 2D and 3D phase unwrappers in the NumPy/SciPy project. If you need any help regarding this matter, such as documentation, I am happy to provide it. Kind regards. Hello Munther, I am very excited about the possibility of getting your 2D and 3D phase unwrappers incorporated into SciPy (http://www.scipy.org/). Travis Oliphant (the main author of NumPy and a
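Since the thread above trails off, it may help to sketch the two numerical ideas Nadav raises: differentiating the phase directly from the complex field (which sidesteps unwrapping entirely, because Im(E* dE) = |E|^2 dphi is insensitive to 2*pi jumps), and naive row-then-column unwrapping for smooth images. This is only a sketch under those assumptions; the function names are mine, and this is emphatically not GERI's algorithm, which is far more robust:

```python
import numpy as np

def phase_gradient(E):
    """Phase gradient of a complex 2D field E, without unwrapping.

    Uses d(phi) = Im(conj(E) * dE) / |E|**2, which never sees the
    2*pi jumps of the wrapped phase.  The transverse Poynting flow
    is proportional to |E|**2 * grad(phi).
    """
    dE_y, dE_x = np.gradient(E)   # finite differences along axis 0, axis 1
    mag2 = np.abs(E) ** 2
    return (np.imag(np.conj(E) * dE_y) / mag2,
            np.imag(np.conj(E) * dE_x) / mag2)

def unwrap2d_naive(phase):
    """Naive 2D unwrapping: rows first, then align rows via the first column.

    Only adequate for smooth, low-noise phase maps.
    """
    out = np.unwrap(phase, axis=1)           # unwrap each row independently
    first_col = np.unwrap(out[:, 0])         # unwrap down the first column
    out += (first_col - out[:, 0])[:, None]  # shift rows by the 2*pi offsets
    return out
```

On a noise-free synthetic plane wave both behave as expected; the streak artifacts in Fabrice's tmp*.png images are what happens when noise pushes a pixel-to-pixel difference across the +/-pi threshold, which is exactly the failure mode GERI's quality-guided unwrapper is designed to handle.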
Re: [Numpy-discussion] What happened to numpy-docs ?
2008/11/27 Pierre GM [EMAIL PROTECTED]: I'd like to update routines.ma.rst on the numpy/numpy-docs/trunk SVN, but the whole trunk seems to be MIA... Where has it gone? How can I (where should I) commit changes? Hi Pierre, I've done a little bit of that at http://docs.scipy.org/numpy/docs/numpy-docs/reference/routines.ma.rst Which brings up the question of duplicated effort... I have been under the impression that the documentation on the doc wiki http://docs.scipy.org/numpy/Front%20Page/ immediately (or at least very quickly) reflects changes in SVN, and that changes to the docs in the wiki need to be manually checked in to SVN. Admittedly, I have no good reason to make this assumption. Looking at some recent changes made to docstrings in SVN by Pierre (r6110, r6111), these are not yet reflected in the doc wiki. I guess my question is aimed at Pauli - how frequently does the doc wiki's version of SVN get updated, and is this automatic or does it require manual intervention? Thanks, Scott
Re: [Numpy-discussion] What happened to numpy-docs ?
On Wed, Nov 26, 2008 at 23:51, David Cournapeau [EMAIL PROTECTED] wrote: On Thu, Nov 27, 2008 at 2:32 PM, Robert Kern [EMAIL PROTECTED] wrote: On Wed, Nov 26, 2008 at 23:27, Pierre GM [EMAIL PROTECTED] wrote: All, I'd like to update routines.ma.rst on the numpy/numpy-docs/trunk SVN, but the whole trunk seems to be MIA... Where has it gone? How can I (where should I) commit changes? It got moved into the numpy trunk under docs/. While we are speaking about the moved docs: has it been decided how we will distribute them? For now, they are not included in the generated tarball, but I was wondering how we should distribute them (before, they went into .../site-packages/numpy/doc). I recommend a numpy-doc-1.x.zip file on the download site. Distutils does not have the notion of installed docs outside the package itself, right? Nope. -- Robert Kern I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth. -- Umberto Eco
Re: [Numpy-discussion] What happened to numpy-docs ?
On Nov 27, 2008, at 1:39 AM, Scott Sinclair wrote: Looking at some recent changes made to docstrings in SVN by Pierre (r6110 r6111), these are not yet reflected in the doc wiki. Well, I haven't committed my version yet. I'm polishing a couple of issues with functions that are not recognized as such by inspect (because they're actually instances of a factory class).