Hello to the list, I'm afraid this is a non-specific question.
I've been using data.table for only a couple of weeks, on various spatial, economic and agro-ecological data, with many individual sets reaching over 3 million records of numeric data -- but I'm working to very tight deadlines and right now don't have the luxury of testing any R library extensively. I see data.table was first released in March 2010 and is not heavily cited when searching through Google, and I'm very worried about its overall robustness.

I'm essentially using R and data.table as a more flexible and elegant alternative to SAS and SQL to join, slice and dice datasets according to sometimes complex aggregation formulas. I've been very impressed by the run times, but some initial code produced very puzzling results, and I'm now at the point of comparing the output against a purely SQL-based approach (much more convoluted, but a good way to test nonetheless).

My question is: is data.table ready for production? Would you rely on it for sensitive publications?

Thanks for sharing,
--Mel.
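
P.S. In case it clarifies what I mean by cross-checking: here is a minimal sketch of the kind of test I have in mind (the column names region/year/value and the random data are made up for illustration, not from my actual datasets). The idea is to compute the same grouped sum once with data.table and once with base R aggregate(), then compare the results:

library(data.table)

## made-up example data, purely for illustration
set.seed(1)
dt <- data.table(region = sample(letters[1:5], 1e5, replace = TRUE),
                 year   = sample(2000:2009, 1e5, replace = TRUE),
                 value  = rnorm(1e5))

## grouped sum with data.table
res_dt <- dt[, list(total = sum(value)), by = list(region, year)]

## the same aggregation with base R, as an independent reference
res_base <- aggregate(value ~ region + year, data = as.data.frame(dt), FUN = sum)

## put both results in the same row order, then compare the totals
setkey(res_dt, region, year)
res_base <- res_base[order(res_base$region, res_base$year), ]
all.equal(res_dt$total, res_base$value)   # should be TRUE if the two agree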
