Hi,
I'm using Sage scripts to run tasks in batch. They look like:
def dostuff(X):
    result = [X]  # plus irrelevant calculations
    return result

print dostuff([1, 2, 3, 4, n])
This worked fine for all recent data sets, with input arrays of 1.0 -
2.4M records. However, Sage crashes with segmentation faults when larger data
sets (3.5M - 10.0M records) are used. Am I using Sage in a terribly wrong way,
or have I hit some hard limit that makes the crashes expected? Or does this
look like a bug? Thanks for your help.
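For what it's worth, a stripped-down version of the same pattern in plain Python (no Sage-specific calculations; the list wrapping is just a placeholder for the real work) handles lists of the problematic size without issue on this machine, which is what makes me suspect the crash is Sage-related rather than a general memory limit:

```python
# Minimal check that the bare batch pattern copes with 10M records,
# independent of any Sage functionality (placeholder calculation only).
def dostuff(X):
    result = [X]  # placeholder for the real, irrelevant calculations
    return result

data = list(range(10 * 10**6))  # 10M records, the size that crashes in Sage
out = dostuff(data)
print(len(out[0]))  # 10000000
```

(This sketch uses `print(...)` as a function call so it runs under both Python 2 and 3; the actual scripts use Sage 6.1's Python 2 `print` statement.)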
Cheers,
Jeroen
Ubuntu 12.04 LTS x64 on Intel Xeon E3 with 24GB RAM
Sage Version 6.1, Release Date: 2014-01-30 (GIT)
--
You received this message because you are subscribed to the Google Groups
"sage-support" group.
To unsubscribe from this group and stop receiving emails from it, send an email
to [email protected].
To post to this group, send email to [email protected].
Visit this group at http://groups.google.com/group/sage-support.
For more options, visit https://groups.google.com/groups/opt_out.