I believe in patches and working code. You're proposing to compete with the likes of SQLite and Berkeley DB -- no small competition, and both have excellent performance characteristics when used properly.
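To make the comparison concrete: SQLite already gives embedded, in-process keyed storage with fast seek, insert, and delete -- the feature set the proposal describes. A minimal sketch, in Python only because its stdlib `sqlite3` module ships SQLite with no setup (the table name and values here are illustrative, not from the proposal):

```python
import sqlite3

# Open an in-memory database: SQLite runs embedded, in-process --
# no server, which is the deployment model the proposal describes.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE points (id INTEGER PRIMARY KEY, value REAL)")

# Bulk insert; executemany batches the rows efficiently.
conn.executemany(
    "INSERT INTO points (id, value) VALUES (?, ?)",
    ((i, i * 0.5) for i in range(10_000)),
)
conn.commit()

# Fast seek: a primary-key lookup is a B-tree search.
row = conn.execute("SELECT value FROM points WHERE id = ?", (1234,)).fetchone()
print(row[0])  # 617.0

# Fast delete by key.
conn.execute("DELETE FROM points WHERE id = ?", (1234,))
conn.commit()
count = conn.execute("SELECT COUNT(*) FROM points").fetchone()[0]
print(count)  # 9999
```

Any new engine would have to beat this baseline, properly indexed and inside a transaction, to justify itself.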
You also used the 'b'illions word in reference to data sets -- really?

best,
--e

On Tue, Dec 23, 2014 at 11:31 AM, joanv <joan.igles...@live.com> wrote:
> Dear all,
>
> I'm developing a new database with the ability to perform very fast seek,
> insert, and delete operations. It is also able to perform very fast
> comparisons of datasets. It has been designed to work embedded in other
> programs written in R, Fortran, C++, etc.
>
> It can efficiently manage billions of numeric datasets on a single machine.
>
> Right now I do not know in which fields of the R community such a database
> could be helpful, or whether there is a need for such a capability in the R
> community.
>
> Could someone help me with this topic? Partners for the project are also
> wanted, especially R experts, or experts on other kinds of calculation
> programs (VASP, Gaussian, etc.).
>
> Regards and thank you.
>
> --
> View this message in context:
> http://r.789695.n4.nabble.com/how-useful-could-be-a-fast-and-embedded-database-for-the-R-community-tp4701051.html
> Sent from the R devel mailing list archive at Nabble.com.
>
> ______________________________________________
> R-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-devel