Well, I wouldn't recommend R for anything more than a few GBs, but it
depends on what you are trying to do. Just reading that amount of data
into memory takes a lot of time, but then again, it depends on how
much time you have.

As a general suggestion, you might want to move away from the regular
read.csv() approach if you have huge delimited data; try using
data.table instead.
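
Something like this, for example (just a sketch; the file name is made
up, and it assumes a data.table version that has fread()):

    library(data.table)

    # fread() is usually much faster than read.csv() on large delimited files
    dt <- fread("big.csv")

    # roughly equivalent base-R call, for comparison:
    # df <- read.csv("big.csv", stringsAsFactors = FALSE)

Once the data is in a data.table, keyed subsetting and by-group
operations are also a lot faster than the equivalent data.frame code.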

On 8 January 2013 11:27, Gaurav Paliwal <[email protected]> wrote:
> Hi all,
>
> Is there anyone who has used R for statistical programming? If yes, then
> what was your experience with R for handling datasets > 1 TB?
>
> SAS seems to be good at handling datasets of a couple of terabytes, but
> it's closed source.
>
> --
> With regards,
> Gaurav Paliwal
>
> --
> Mailing list guidelines and other related articles:
> http://lug-iitd.org/Footer



-- 
Chirag Anand
http://atvariance.in

