Hi all,
How can I convert a list of arrays into one array ?
Nils
data
[array([ 40. , 285.6, 45. , 285.3, 50. , 285.1,
55. , 284.8]), array([ 60. , 284.5, 65. , 282.8,
70. , 281.1, 75. , 280. ]), array([ 80. , 278.8,
85. , 278.1, 90. , 277.4, 95. , 276.9]),
On 5/11/2009 6:28 AM Nils Wagner apparently wrote:
How can I convert a list of arrays into one array ?
Do you mean one long array, so that ``concatenate``
is appropriate, or a 2d array, in which case you
can just use ``array``.
But your example looks like you should preallocate the
larger array
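Alan's two options can be sketched with small sample arrays (a minimal illustration; the data below is made up in the style of the original post):

```python
import numpy as np

# a list of 1-D arrays, as in the original question
data = [np.array([40.0, 285.6, 45.0, 285.3]),
        np.array([50.0, 285.1, 55.0, 284.8])]

flat = np.concatenate(data)   # one long 1-D array
stacked = np.array(data)      # a 2-D array; the rows must have equal length

print(flat.shape)     # (8,)
print(stacked.shape)  # (2, 4)
```

If the sub-arrays have different lengths, only ``concatenate`` applies; ``np.array`` needs equal-length rows to build a 2-D result.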
On Mon, 11 May 2009 06:54:45 -0400
Alan G Isaac ais...@american.edu wrote:
On 5/11/2009 6:28 AM Nils Wagner apparently wrote:
How can I convert a list of arrays into one array ?
Do you mean one long array, so that ``concatenate``
is appropriate, or a 2d array, in which case you
can just
On Monday 11 May 2009, Nils Wagner wrote:
Hi all,
How can I convert a list of arrays into one array ?
Nils
data
[array([ 40. , 285.6, 45. , 285.3, 50. , 285.1,
55. , 284.8]), array([ 60. , 284.5, 65. , 282.8,
70. , 281.1, 75. , 280. ]), array([ 80. , 278.8,
Hi all,
Please consider two strings
line_a
'12345678abcdefgh12345678'
line_b
'12345678 abcdefgh 12345678'
line_b.split()
['12345678', 'abcdefgh', '12345678']
Is it possible to split line_a such that the output
is
['12345678', 'abcdefgh', '12345678']
Nils
Hi all,
Can someone reproduce the following failure ?
I am using
numpy.__version__
'1.4.0.dev6983'
======================================================================
FAIL: Test bug in reduceat with structured arrays copied
for speed.
On Monday 11 May 2009, Nils Wagner wrote:
Hi all,
Please consider two strings
line_a
'12345678abcdefgh12345678'
line_b
'12345678 abcdefgh 12345678'
line_b.split()
['12345678', 'abcdefgh', '12345678']
Is it possible to split line_a such that the output
is
['12345678',
On Monday 11 May 2009, Francesc Alted wrote:
Although regular expressions seems a bit thought to learn, they will
^^^ -- tough :-\
--
Francesc Alted
One would expect people to feel threatened by the 'giant
brains or machines that think'. In
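The regular-expression route Francesc hints at is short for fixed-width fields (a sketch, assuming every field is exactly 8 characters):

```python
import re

line_a = '12345678abcdefgh12345678'
fields = re.findall('.{8}', line_a)  # every consecutive run of 8 characters
print(fields)  # ['12345678', 'abcdefgh', '12345678']
```

Any trailing partial field (fewer than 8 characters) is silently dropped by this pattern, which matches the original all-8-character example.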
On Mon, 11 May 2009 14:25:46 +0200
Francesc Alted fal...@pytables.org wrote:
On Monday 11 May 2009, Nils Wagner wrote:
Hi all,
Please consider two strings
line_a
'12345678abcdefgh12345678'
line_b
'12345678 abcdefgh 12345678'
line_b.split()
['12345678', 'abcdefgh',
On Monday 11 May 2009 14:36:17 Nils Wagner wrote:
On Mon, 11 May 2009 14:25:46 +0200
Francesc Alted fal...@pytables.org wrote:
On Monday 11 May 2009, Nils Wagner wrote:
Hi all,
Please consider two strings
line_a
'12345678abcdefgh12345678'
line_b
'12345678 abcdefgh
Mon, 11 May 2009 14:06:07 +0200, Nils Wagner wrote:
Can someone reproduce the following failure ? I am using
numpy.__version__
'1.4.0.dev6983'
======================================================================
FAIL: Test bug in reduceat with structured arrays copied for speed.
hi,
here is my workaround.
from numpy import arange

line_a = '11.122.233.3'    # without separator
line_b = '11.1 22.2 33.3'  # including space as a delimiter

width = 4  # each field is 4 characters wide in this example (8 in the original question)
div, mod = divmod(len(line_a), width)
liste = []
for j in arange(0, div):
    liste.append(line_a[j * width:(j + 1) * width])
# liste == ['11.1', '22.2', '33.3']
On Mon, 11 May 2009 14:05:13 + (UTC)
Pauli Virtanen p...@iki.fi wrote:
Mon, 11 May 2009 14:06:07 +0200, Nils Wagner wrote:
Can someone reproduce the following failure ? I am using
numpy.__version__
'1.4.0.dev6983'
On 5/11/2009 8:03 AM Nils Wagner apparently wrote:
line_a
'12345678abcdefgh12345678'
Is it possible to split line_a such that the output
is
['12345678', 'abcdefgh', '12345678']
More of a comp.lang.python question, I think:

from itertools import groupby
out = list()
for k, g in groupby('123abc456', lambda x: x.isdigit()):
    out.append(''.join(g))
# out == ['123', 'abc', '456']
Mon, 11 May 2009 16:22:37 +0200, Nils Wagner wrote:
On Mon, 11 May 2009 14:05:13 + (UTC)
Pauli Virtanen p...@iki.fi wrote:
Mon, 11 May 2009 14:06:07 +0200, Nils Wagner wrote:
Can someone reproduce the following failure ? I am using
numpy.__version__
'1.4.0.dev6983'
On 5/11/2009 8:36 AM Nils Wagner apparently wrote:
I would like to split strings made of digits after eight
characters each.
[l[i*8:(i+1)*8] for i in range(len(l)/8)]
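Spelled out with the sample string from earlier in the thread (using integer division ``//`` so the comprehension also runs under Python 3):

```python
line_a = '12345678abcdefgh12345678'

# split into consecutive 8-character fields
fields = [line_a[i * 8:(i + 1) * 8] for i in range(len(line_a) // 8)]
print(fields)  # ['12345678', 'abcdefgh', '12345678']
```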
Alan Isaac
___
Numpy-discussion mailing list
Numpy-discussion@scipy.org
at least I think this is strange behavior.
When convolving an image with a large kernel, it's known that it's faster to
perform the operation as multiplication in the frequency domain. The below
code example shows that the results of my 2d filtering are shifted from the
expected value a distance 1/2
David Cournapeau wrote:
On Wed, May 6, 2009 at 3:03 PM, Christopher Barker
The binary for OS-X on sourceforge is called:
numpy-1.3.0-py2.5-macosx10.5.dmg
However, as far as I can tell, it works just fine on OS-X 10.4, and
maybe even 10.3.9.
I have to confess I don't understand mac os x
On Mon, May 11, 2009 at 9:40 AM, Chris Colbert sccolb...@gmail.com wrote:
at least I think this is strange behavior.
When convolving an image with a large kernel, it's known that it's faster to
perform the operation as multiplication in the frequency domain. The below
code example shows that
Hi Chris
2009/5/11 Chris Colbert sccolb...@gmail.com:
When convolving an image with a large kernel, it's known that it's faster to
perform the operation as multiplication in the frequency domain. The below
code example shows that the results of my 2d filtering are shifted from the
expected value
Stefan,
Did I pad my example incorrectly? Both images were upped to the larger
nearest power of 2 (256)...
Does the scipy implementation do this differently? I thought that since FFTW
support has been dropped, that scipy and numpy use the same routines...
Thanks!
Chris
2009/5/11 Stéfan van
Hi Chris,
If you have MxN and PxQ signals, you must pad them to shape M+P-1 x
N+Q-1, in order to prevent circular convolution (i.e. values on the
one end sliding back in at the other).
Regards
Stéfan
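Stéfan's padding rule can be sketched like this (a minimal illustration, not the scipy implementation; the function name is mine):

```python
import numpy as np

def fft_convolve2d(a, b):
    # pad both signals to (M+P-1, N+Q-1) so the circular convolution
    # that the FFT computes coincides with the linear one
    shape = (a.shape[0] + b.shape[0] - 1,
             a.shape[1] + b.shape[1] - 1)
    A = np.fft.rfft2(a, shape)
    B = np.fft.rfft2(b, shape)
    return np.fft.irfft2(A * B, shape)

# tiny sanity check: a 2x2 box convolved with itself
out = fft_convolve2d(np.ones((2, 2)), np.ones((2, 2)))
# out is approximately [[1, 2, 1], [2, 4, 2], [1, 2, 1]]
```

Padding both inputs only up to the nearest power of two of the larger image (as in the original example) leaves the kernel's tail wrapping around, which shows up as the half-kernel shift Chris observed.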
2009/5/11 Chris Colbert sccolb...@gmail.com:
Stefan,
Did I pad my example incorrectly? Both
2009/5/11 Chris Colbert sccolb...@gmail.com:
Does the scipy implementation do this differently? I thought that since FFTW
support has been dropped, that scipy and numpy use the same routines...
Just to be clear, I was referring to scipy.signal.fftconvolve, not
scipy's FFT (which is the same as
Thanks Stefan.
2009/5/11 Stéfan van der Walt ste...@sun.ac.za
2009/5/11 Chris Colbert sccolb...@gmail.com:
Does the scipy implementation do this differently? I thought that since
FFTW
support has been dropped, that scipy and numpy use the same routines...
Just to be clear, I was
Hi, Francesc:
The codes do not work. Guess you forgot something there.
Thanks.
Wei Su
--- On Mon, 5/11/09, Francesc Alted fal...@pytables.org wrote:
From: Francesc Alted fal...@pytables.org
Subject: Re: [Numpy-discussion] List of arrays
To: Discussion of Numerical Python
Hi, All,
Coming from SAS and R, this is probably the first thing I want to do now that I
can convert my data into record arrays. But I could not find any clues after
googling for a while. Any hint or suggestions will be great!
Thanks a lot.
Wei Su
Wei Su wrote:
The codes do not work. Guess you forgot something there.
l wasn't defined:
In [16]: a = np.arange(10)
In [17]: b = np.arange(5)
In [20]: l = [a,b]
In [21]: l
Out[21]: [array([0, 1, 2, 3, 4, 5, 6, 7, 8, 9]), array([0, 1, 2, 3, 4])]
In [22]: np.concatenate(l)
Out[22]: array([0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 0, 1, 2, 3, 4])
On May 11, 2009, at 5:44 PM, Wei Su wrote:
Coming from SAS and R, this is probably the first thing I want to do
now that I can convert my data into record arrays. But I could not
find any clues after googling for a while. Any hint or suggestions
will be great!
That depends what you
Hi, Pierre:
Thanks for the reply. I can now actually turn a big list into a record array.
My question is actually how to join related record arrays in Python. This is
done in SAS by MERGE and PROC SQL and by merge() in R. But I have no idea how
to do it in Python.
Thanks.
Wei Su
--- On
for use in binary distribution where I need only basics and fast
startup/low memory footprint, I try to isolate the minimal ndarray
type and what I need..
with import numpy or import numpy.core.multiarray almost the
whole numpy package tree is imported, _dotblas etc.
cxFreeze produces some
On May 11, 2009, at 6:18 PM, Wei Su wrote:
Thanks for the reply. I can now actually turn a big list into a
record array. My question is actually how to join related record
arrays in Python. This is done in SAS by MERGE and PROC SQL and by
merge() in R. But I have no idea how to do it
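``join_by`` from ``numpy.lib.recfunctions`` offers an inner/outer join on a key column, roughly analogous to SAS MERGE or R's ``merge()``; a small sketch with made-up data:

```python
import numpy as np
from numpy.lib import recfunctions as rfn

# two record arrays sharing an 'id' key column (hypothetical sample data)
a = np.array([(1, 10.0), (2, 20.0), (3, 30.0)],
             dtype=[('id', int), ('x', float)])
b = np.array([(1, 'a'), (3, 'c')],
             dtype=[('id', int), ('tag', 'U1')])

# inner join on 'id': keeps only ids present in both arrays
joined = rfn.join_by('id', a, b, jointype='inner', usemask=False)
print(joined['id'])  # [1 3]
```

``jointype`` also accepts ``'outer'`` and ``'leftouter'``; note that ``join_by`` expects the key values to be unique within each input.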
On Mon, May 11, 2009 at 6:18 PM, Wei Su taste_o...@yahoo.com wrote:
Hi, Pierre:
Thanks for the reply. I can now actually turn a big list into a record
array. My question is actually how to join related record arrays in Python.
This is done in SAS by MERGE and PROC SQL and by merge() in R.
Hey guys,
I've got a small C extension that uses isnan() and (in numpy 1.1) had
been importing it from ufuncobject.h. I see that it has now moved
into npy_math.h in 1.3.
What is the best way to ensure that I can reliably include this
function across versions 1.1, 1.2, and 1.3? (Checking
On May 11, 2009, at 6:36 PM, Skipper Seabold wrote:
On Mon, May 11, 2009 at 6:18 PM, Wei Su taste_o...@yahoo.com wrote:
Hi, Pierre:
Thanks for the reply. I can now actually turn a big list into a
record
array. My question is actually how to join related record arrays in
Python.
On Mon, May 11, 2009 at 4:49 PM, Peter Wang pw...@enthought.com wrote:
Hey guys,
I've got a small C extension that uses isnan() and (in numpy 1.1) had
been importing it from ufuncobject.h. I see that it has now moved
into npy_math.h in 1.3.
What is the best way to ensure that I can
Charles R Harris wrote:
On Mon, May 11, 2009 at 4:49 PM, Peter Wang pw...@enthought.com
mailto:pw...@enthought.com wrote:
Hey guys,
I've got a small C extension that uses isnan() and (in numpy 1.1) had
been importing it from ufuncobject.h. I see that it has now moved
Hi Robert,
Robert wrote:
for use in binary distribution where I need only basics and fast
startup/low memory footprint, I try to isolate the minimal ndarray
type and what I need..
with import numpy or import numpy.core.multiarray almost the
whole numpy package tree is imported, _dotblas