This doesn't sound like a question we can answer for you, as the answer likely depends on your RAM, virtual memory, OS, and other running processes. I would be willing to bet that you can load one record at a time, and you already know that ten million won't work. Why don't you read http://en.m.wikipedia.org/wiki/Binary_search_algorithm?
---------------------------------------------------------------------------
Jeff Newmiller                        The     .....       .....  Go Live...
DCN:<jdnew...@dcn.davis.ca.us>        Basics: ##.#.       ##.#.  Live Go...
                                      Live:   OO#.. Dead: OO#..  Playing
Research Engineer (Solar/Batteries            O.O#.       #.O#.  with
/Software/Embedded Controllers)               .OO#.       .OO#.  rocks...1k
---------------------------------------------------------------------------
Sent from my phone. Please excuse my brevity.
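[Editor's note: to make the batching (and binary-search-the-chunk-size) idea above concrete, here is a hedged sketch using DBI with an in-memory RSQLite connection purely for illustration; `big_df`, `chunk_size`, and the table name `big_table` are made-up stand-ins, and you would substitute your own connection and data. The point is that `dbWriteTable()` has no documented row limit, so you write the data in chunks with `append = TRUE` rather than in one call.]

```r
## Sketch: batched loading via dbWriteTable(), assuming DBI + RSQLite.
library(DBI)
library(RSQLite)

con <- dbConnect(RSQLite::SQLite(), ":memory:")   # substitute your own DB

big_df <- data.frame(id = 1:100000, x = rnorm(100000))  # stand-in data

chunk_size <- 25000   # tune this (binary-search style) to what fits in RAM
starts <- seq(1, nrow(big_df), by = chunk_size)

for (s in starts) {
  e <- min(s + chunk_size - 1, nrow(big_df))
  dbWriteTable(con, "big_table",
               big_df[s:e, ],
               overwrite = (s == 1),  # first chunk creates the table
               append    = (s > 1))   # later chunks append to it
}

res <- dbGetQuery(con, "SELECT COUNT(*) AS n FROM big_table")
dbDisconnect(con)
```

If a given `chunk_size` still fails, halve it and retry; if it succeeds comfortably, you can try a larger one -- that is the binary-search suggestion in practice.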
arunkumar1111 <akpbond...@gmail.com> wrote:
>Hi,
>
>I'm using dbWriteTable to insert a large dataset of about 10 million
>rows, but I'm not able to load the records.
>Can anyone tell me the maximum number of records that dbWriteTable can
>load, or a way to load the data in batches?
>
>-----
>Thanks in advance,
>Arun
>--
>View this message in context:
>http://r.789695.n4.nabble.com/what-is-the-maximum-number-of-records-that-we-can-load-using-dbwrite-tp4572849p4572849.html
>Sent from the R help mailing list archive at Nabble.com.

______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.