Hello list,

I have 120,000 geocoded observations, for which I'm trying to create a distance-based spatial weighting matrix so that I can perform a Moran test.
Each observation has Lat and Lon. Unfortunately, when I run

    dists <- as.matrix(dist(cbind(Lon, Lat)))

I get the message:

    Error in vector("double", length) : vector size specified is too large

Now I realize that 120,000^2 / 2 is on the order of 7 billion entries, which comes to roughly 60 GB stored as doubles. However, I seem to be running into R's limit on vector length before I hit RAM limitations. Also, in principle, it should be possible (though slow) to use hard disk space to store this matrix.

Does anyone have any ideas on how to do this in R?

Thanks,

------------------------
Aleksandr Andreev
Graduate Student - Department of Economics
University of North Carolina at Chapel Hill
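One way to sidestep the dense matrix, sketched below, is to build a distance-band neighbour list with spdep instead of calling dist() on all pairs. This is only a sketch: the 10 km band, and the objects Lon, Lat and y, are placeholders for the poster's own data and threshold, not values taken from the post.

    library(spdep)

    ## Coordinates as a two-column matrix (lon, lat); Lon, Lat and y are
    ## assumed to be the poster's own vectors.
    coords <- cbind(Lon, Lat)

    ## Neighbours within a 0-10 km band, using great-circle distances
    ## (longlat = TRUE interprets d1 and d2 in kilometres).
    nb <- dnearneigh(coords, d1 = 0, d2 = 10, longlat = TRUE)

    ## Row-standardised weights; zero.policy = TRUE tolerates points that
    ## end up with no neighbours inside the band.
    lw <- nb2listw(nb, style = "W", zero.policy = TRUE)

    ## Moran's I test on the variable of interest.
    moran.test(y, lw, zero.policy = TRUE)

Because only pairs within the band are stored, memory grows with the number of neighbour links rather than with 120,000^2; a k-nearest-neighbour list via knearneigh() and knn2nb() would be an analogous alternative.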