Hi Jenny!
So if I understand your data file correctly, you have 960 cases per year, and
you have 43 years. Yes?
I'm not sure you should use correlation in this situation, because of the
autocorrelation in the data. There is strong autocorrelation in spatial data
like yours, and there are also
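Not from the original mail, just a quick illustration: in R you can inspect
serial autocorrelation with acf(); here x is a hypothetical stand-in for one
year of your series.

x <- arima.sim(list(ar = 0.8), n = 960)  # simulated series with strong autocorrelation
acf(x, lag.max = 50)                     # slowly decaying spikes show the dependence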
Hi Milton!
I don't know why, but this has happened to me too, quite a few times. A
useful workaround is to convert the value type like this:
*as.numeric(as.character(min(...)))*
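For example (made-up values, not Milton's data), converting a factor straight
to numeric yields the internal level codes instead of the numbers:

f <- factor(c("10", "2", "30"))
as.numeric(f)                      # 1 2 3  -- level codes, not the values
as.numeric(as.character(f))       # 10 2 30 -- the actual numbers
min(as.numeric(as.character(f)))  # 2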
Zoltan
2007/1/9, Milton Cezar Ribeiro [EMAIL PROTECTED]:
Hi R-friends
I don't know why the min() function below
Hi Benjamin!
## Try this: it may be much faster, because it only works with the important
## rows - hope there is no error in it :D
puffer1 <- as.matrix(sdata$value)
# each row's next value, with a 0 padded onto the end
puffer2 <- rbind(as.matrix(puffer1[2:nrow(puffer1), 1]), 0)
# TRUE where the current value is smaller than the next one
speedy <- puffer1 < puffer2
# indices of the rows that follow those positions
speedy <- as.matrix(which(speedy == TRUE)) + 1
sdata$ddtd[speedy] <- 0
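The same shift-and-compare idea can be written without the temporary
matrices (a sketch, under the same assumption that sdata$value is numeric):

v <- sdata$value
idx <- which(v[-length(v)] < v[-1]) + 1  # rows whose value exceeds the previous row's
sdata$ddtd[idx] <- 0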
#Try:
write.table(...)
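A sketch with placeholder names (mydata and the file name are assumptions):

write.table(mydata, file = "mydata.txt", sep = "\t",
            row.names = FALSE, quote = FALSE)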
#Zoltan
2007/1/8, Benjamin Dickgiesser [EMAIL PROTECTED]:
Hi all,
Is there a function to export a dataframe to a text file?
I want to store a large set of data which I have saved in a dataframe
in my workspace, and copy and paste doesn't cut it.
Thank you,
Benjamin
Hi!
I had some memory problems with R - I hope somebody can tell me a solution.
I work with very large datasets, but R cannot allocate enough memory to
handle them.
I want to work with a matrix with rows = 100,000,000 and columns = 10.
I know this is 1 billion cases, but I thought R could handle
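For scale (my arithmetic, not from the thread): a numeric matrix stores
8 bytes per cell, so this one would need roughly

1e8 * 10 * 8 / 1024^3   # ~7.45 -- about 7.5 GiB of RAM for the data alone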