Hi Steven,
This sounds like the library I was looking for. Would you mind reading my post
[SciPy-User] Global Curve Fitting of 2 functions to 2 sets of data-curves
http://mail.scipy.org/pipermail/scipy-user/2010-June/025674.html ?
I got many interesting answers, where apparently the agreement ...
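A minimal sketch of the global-fit idea from that thread: stack the residuals of both curves into a single scipy.optimize.leastsq call, so a shared parameter is fitted against both data sets at once. The linear models, parameter names, and synthetic data below are illustrative assumptions, not the thread's actual code.

import numpy as np
from scipy.optimize import leastsq

def residuals(p, x1, y1, x2, y2):
    a, b1, b2 = p                      # `a` is shared, b1/b2 are per-curve
    r1 = y1 - (a * x1 + b1)            # hypothetical model for data set 1
    r2 = y2 - (a * x2 + b2)            # hypothetical model for data set 2
    return np.concatenate([r1, r2])    # one residual vector -> one global fit

x1 = np.linspace(0.0, 1.0, 50); y1 = 2.0 * x1 + 1.0
x2 = np.linspace(0.0, 1.0, 50); y2 = 2.0 * x2 - 0.5
p_fit, ier = leastsq(residuals, [1.0, 0.0, 0.0], args=(x1, y1, x2, y2))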
Hi list,
I am new to this list, so forgive me if this is a trivial problem,
but I would appreciate any help.
I am using numpy to work with large amounts of data - sometimes too much
to fit into memory. Therefore I want to be able to store data in binary
files and use numpy to read ...
On Thu, Jun 17, 2010 at 4:21 AM, Simon Lyngby Kokkendorff
sil...@gmail.com wrote:
memory errors. Is there a way to get numpy to do what I want, using an
internal platform independent numpy-format like .npy, or do I have to wrap a
custom file reader with something like ctypes?
You might give ...
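One frequently suggested answer to this kind of question, sketched below: the platform-independent .npy format can be memory-mapped, so the file never has to fit in RAM at once. The file name is a placeholder.

import numpy as np

np.save('big.npy', np.zeros(1000000, dtype=np.float64))  # write once
a = np.load('big.npy', mmap_mode='r')  # memory-map: pages load on demand
print(a[:5])                           # only this slice is actually read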
You may have a look at the nice python-h5py module, which gives an OO
interface to the underlying HDF5 file format. I'm using it for storing
large amounts (~10 GB) of experimental data. Very fast, very convenient.
Ciao
Davide
On Thu, 2010-06-17 at 08:33 -0400, greg whittier wrote:
On Thu, Jun ...
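A hedged sketch of basic h5py usage along the lines Davide describes; the file and dataset names are made up.

import numpy as np
import h5py

with h5py.File('data.h5', 'w') as f:   # hypothetical file name
    f.create_dataset('experiment', data=np.arange(1000, dtype='f4'))

with h5py.File('data.h5', 'r') as f:
    chunk = f['experiment'][100:200]   # reads just this slice from disk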
On Fri, Jun 11, 2010 at 8:00 AM, Geoffrey Irving irv...@naml.us wrote:
Hello,
If I create an mmap'ed array, and then generate another array
referencing only its base, destruction of the original mmap'ed array
closes the mmap. The second array is then prone to segfaults.
I think the best ...
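A minimal sketch of the pattern being described, assuming some raw file exists; the comments mark where the affected versions misbehaved.

import numpy as np

m = np.memmap('file.raw', dtype=np.uint8, mode='r')  # hypothetical file
view = m[10:20]    # second array referencing only m's buffer via .base
del m              # in the affected versions this closed the underlying
                   # mmap, so touching `view` below could segfault
print(view[0])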
On Tue, Jun 15, 2010 at 11:06 AM, Chris LeBlanc crlebl...@gmail.com wrote:
Hi,
Firstly, thanks to everyone who has helped bring NumPy to the point it
is today. It's an excellent piece of software.
I've recently managed to get NumPy to compile on a 64-bit Solaris 10
(SPARC) machine. We've ...
I have files (from an external source) that contain ~10 GB of
big-endian uint16s that I need to read into a series of arrays. What
I'm doing now is

import numpy as np
import struct

fd = open('file.raw', 'rb')
count = 1024 * 1024
for n in range(1):
    # read one chunk and unpack the big-endian values one at a time (slow)
    data = fd.read(2 * count)
    a = np.array([struct.unpack('>H', data[2*i:2*i+2])[0]
                  for i in range(count)])
On Thu, Jun 17, 2010 at 09:29, greg whittier gre...@gmail.com wrote:
I have files (from an external source) that contain ~10 GB of
big-endian uint16s that I need to read into a series of arrays.

np.fromfile(filename, dtype='>u2')
--
Robert Kern
I have come to believe that the whole world is an enigma, a harmless
enigma that is made terrible by our own mad attempt to interpret it as
though it had an underlying truth. -- Umberto Eco
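Spelled out, a sketch of the suggestion above: '>u2' means big-endian ('>') unsigned 2-byte integers, matching the data described. The file name is a placeholder.

import numpy as np

a = np.fromfile('file.raw', dtype='>u2')  # one call reads the whole file
a = a.astype(np.uint16)                   # optional: native byte order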
On Thursday 17 June 2010 16:29:54, greg whittier wrote:
I have files (from an external source) that contain ~10 GB of
big-endian uint16s that I need to read into a series of arrays. What
I'm doing now is
import numpy as np
import struct
fd = open('file.raw', 'rb')
for n in ...
On Thu, Jun 17, 2010 at 09:46, Francesc Alted fal...@pytables.org wrote:
On Thursday 17 June 2010 16:29:54, greg whittier wrote:
I have files (from an external source) that contain ~10 GB of
big-endian uint16s that I need to read into a series of arrays. What
I'm doing now is
import numpy ...
Friedrich Romstedt wrote:
2010/6/13 Alan Bromborsky abro...@verizon.net:
Friedrich Romstedt wrote:
I am writing a symbolic tensor package for general relativity. In making
symbolic tensors concrete,
I generate numpy arrays stuffed with sympy functions and symbols.
That ...
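A minimal sketch of what an object array stuffed with sympy symbols can look like; the symbols and expressions are made up for illustration.

import numpy as np
import sympy

x, y = sympy.symbols('x y')
T = np.array([[x * y, sympy.sin(x)],
              [sympy.cos(y), x + y]], dtype=object)  # symbolic entries
print(T + T)  # arithmetic dispatches elementwise to sympy's operators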
On Thu, Jun 17, 2010 at 10:41 AM, Robert Kern robert.k...@gmail.com wrote:
On Thu, Jun 17, 2010 at 09:29, greg whittier gre...@gmail.com wrote:
I have files (from an external source) that contain ~10 GB of
big-endian uint16s that I need to read into a series of arrays.
np.fromfile(filename, ...
On Thu, Jun 17, 2010 at 3:29 PM, greg whittier gre...@gmail.com wrote:
I have files (from an external source) that contain ~10 GB of
big-endian uint16s that I need to read into a series of arrays. What
I'm doing now is
import numpy as np
import struct
fd = open('file.raw', 'rb')
for n ...
On Thu, Jun 17, 2010 at 12:11 PM, Peter
numpy-discuss...@maubp.freeserve.co.uk wrote:
On Thu, Jun 17, 2010 at 3:29 PM, greg whittier gre...@gmail.com wrote:
I'm unclear if you want a numpy array or a standard library array,
but can you exploit the fact that struct.unpack returns a tuple? e.g. ...
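A sketch of the trick being hinted at: struct.unpack with a repeat count in the format string returns one tuple holding every value, which feeds straight into np.array. The file name and chunk size are assumptions.

import struct
import numpy as np

count = 1024 * 1024
with open('file.raw', 'rb') as fd:         # hypothetical file
    data = fd.read(2 * count)
# e.g. '>1048576H' unpacks all big-endian uint16s in one call
values = struct.unpack('>%dH' % (len(data) // 2), data)
a = np.array(values, dtype=np.uint16)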
Hello,
I'm trying to find a way to compute the normals of a mesh (vertices + indices)
using only vectorized computations, and I wonder if anyone has already done that.
Here is my code so far:

# Mesh generation + indices for triangles
n = 24
vertices = numpy.zeros((n * n, 3), dtype=numpy.float32)
...
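Since the code is cut off above, here is a hedged sketch of one fully vectorized way to get per-triangle normals, assuming a (V, 3) vertices array and an integer (F, 3) faces array; the function name is made up.

import numpy as np

def triangle_normals(vertices, faces):
    p0 = vertices[faces[:, 0]]             # gather all triangle corners
    p1 = vertices[faces[:, 1]]
    p2 = vertices[faces[:, 2]]
    n = np.cross(p1 - p0, p2 - p0)         # (F, 3) face normals
    n /= np.sqrt((n * n).sum(axis=1))[:, np.newaxis]  # normalize to unit length
    return n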
Hi,
I noticed that when using a Framework-based Python under Mac OS X,
numpy nicely gets installed into the Python.framework directory by default.
But then if I use:
from numpy.distutils.core import setup, Extension
it installs under the regular:
${prefix}/lib/python-2.6/site-packages
which ...
Thanks for the references to these libraries - they seem to fix my problem!
Cheers,
Simon
On Thu, Jun 17, 2010 at 2:58 PM, davide lasagna dav...@gmail.com wrote:
You may have a look at the nice python-h5py module, which gives an OO
interface to the underlying HDF5 file format. I'm using it ...
I have a 1D array with 100k samples that I would like to reduce by
computing the min/max of each chunk of n samples. Right now, my
code is as follows:
n = 100
offset = array.size % n
array_min = array[offset:].reshape((-1, n)).min(-1)
array_max = array[offset:].reshape((-1, n)).max(-1)
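The same idea wrapped as a small function with a usage line; the names and random test data are illustrative.

import numpy as np

def chunked_min_max(a, n):
    offset = a.size % n                 # drop leading samples so size % n == 0
    chunks = a[offset:].reshape(-1, n)
    return chunks.min(axis=1), chunks.max(axis=1)

a = np.random.rand(100000)
lo, hi = chunked_min_max(a, 100)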
2010/6/17 Charles سمير Doutriaux doutria...@llnl.gov:
Hi,
I noticed that when using a Framework-based Python under Mac OS X,
numpy nicely gets installed into the Python.framework directory by default.
But then if I use:
from numpy.distutils.core import setup, Extension
it installs under the ...