I was trying to see if I could reproduce this problem, but your code fails
with numpy 1.6.1 with:
AttributeError: 'numpy.ndarray' object has no attribute 'H'
Is X supposed to be a regular ndarray with dtype = 'complex128', or
something else?
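For readers following along, a minimal sketch of the `.H` issue Olivier hits (my own illustration, not code from the thread): `.H` is defined on `np.matrix`, not on plain ndarrays, where `X.conj().T` is the equivalent.

```python
import numpy as np

# Plain ndarrays have no .H attribute; only np.matrix defines .H
# as the conjugate (Hermitian) transpose.
X = np.array([[1 + 2j, 3 - 1j]], dtype='complex128')
print(hasattr(X, 'H'))  # False: this is the AttributeError above

# On a matrix, .H works; on an ndarray, .conj().T is equivalent.
Xm = np.asmatrix(X)
same = np.array_equal(np.asarray(Xm.H), X.conj().T)
print(same)  # True
```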
-=- Olivier
2011/12/5 kneil magnetotellur...@gmail.com
Hi Nathaniel,
The results of running memtest was a pass with no errors.
-Karl
Nathaniel Smith wrote:
(You should still run memtest. It's very easy - just install it with your
package manager, then reboot. Hold down the shift key while booting, and
you'll get a boot menu. Choose memtest,
Hi Nathaniel,
Thanks for the suggestion. I more or less implemented it:
np.save('X', X)
X2 = np.load('X.npy')
X2 = np.asmatrix(X2)
diffy = (X != X2)
if diffy.any():
    print X[diffy]
    print X2[diffy]
    print
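One caveat with this elementwise check (an aside of my own, not from the thread): NaN compares unequal to itself, so `X != X2` will flag every NaN position even when save/load preserved it exactly. Masking out positions that are NaN in both copies separates real changes from that artifact:

```python
import numpy as np
import os
import tempfile

# NaN != NaN is True, so an elementwise != flags NaN entries even
# when they round-tripped through save/load unchanged.
X = np.array([1.0 + 0j, np.nan + 0j], dtype='complex128')
path = os.path.join(tempfile.mkdtemp(), 'X.npy')
np.save(path, X)
X2 = np.load(path)

naive = (X != X2)                      # flags the NaN slot
both_nan = np.isnan(X) & np.isnan(X2)  # NaN in both copies
real_diff = naive & ~both_nan          # changes that are not just NaN

print(naive.any())      # True, but misleading
print(real_diff.any())  # False: nothing actually changed
```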
OK - here is the status update on this problem:
1. To test for bad RAM, I saved the code plus data onto a usb drive and
manually transferred it to a colleague's desktop machine with same specs as
mine. The NaN values continued to appear at random, so it is unlikely to be
bad RAM - unless its
If save/load actually makes a reliable difference, then it would be useful
to do something like this, and see what you see:
np.save('X', X)
X2 = np.load('X.npy')
diff = (X != X2)
# did save/load change anything?
diff.any()
# if so, then what changed?
X[diff]
X2[diff]
# any subtle differences in floating
On 01/12/2011 02:44, Karl Kappler wrote:
Also note that I have had a similar problem with much smaller arrays,
say 24 x 3076
Hi Karl,
Could you post a self-contained code sample with such a small array (or even
smaller; the smaller, the better...) so that we can run it and play with
it?
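A skeleton of the kind of self-contained snippet Pierre is asking for (the array size, seed, and operation are placeholders of my own; a real repro would use Karl's data and computation):

```python
import numpy as np

# Small complex array standing in for Karl's 24 x 3076 data.
rng = np.random.RandomState(0)
Y = (rng.randn(4, 8) + 1j * rng.randn(4, 8)).astype('complex128')

# Placeholder operation; any computation suspected of producing
# spurious NaNs would go here (e.g. a cross-spectral product).
S = np.dot(Y, Y.conj().T)

# On a healthy machine no NaNs should appear.
print(np.isnan(S).any())  # False
```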
--
Hi Olivier, indeed that was a typo; I should have used cut and paste. I was
using .transpose()
Olivier Delalleau-2 wrote:
I guess it's just a typo on your part, but just to make sure, you are
using
.transpose(), not .transpose, correct?
-=- Olivier
2011/11/30 Karl Kappler
Hi Pierre,
I was thinking about uploading some examples but strangely, when I store the
array using for example: np.save('Y',Y)
and then reload it in a new workspace, I find that the problem does not
reproduce. It would seem somehow to be
associated with the 'overhead' of the workspace I am
Hi Pierre,
I confirmed with the guy who put together the machine that it is non-ECC
RAM. You know, now that I think about it, this machine seems to crash a
fair amount more often than its identical twin which sits on a desk near me.
I researched memtest a bit... downloaded and compiled it, but
Hello,
I am somewhat new to scipy/numpy so please point me in the right direction
if I am posting to an incorrect forum.
The experience which has prompted my post is the following:
I have a numpy array Y where the elements of Y are
type(Y[0,0])
Out[709]: <type 'numpy.complex128'>
The absolute