Olaf Schmidt wrote:
> "Alok Singh" <[email protected]> wrote
> in message
> news:[email protected]...
>
>> Yeah, that's correct Simon, it's 0.6 sec to insert
>> 10.5K rows with 20 columns (2 files, both
>> having 10.5K rows)
>
> That's the timing I would expect if you used
> Garry's recommendation (to read the whole file
> into a String first, then split the string into an
> Array "InMemory", finally followed by an Insert-
> Transaction which makes use of this 2D-Array).
>
> That's memory-intensive - but "Ok" (and fast) for testfiles
> with that RowCount (filesize of your 10.5K-row
> testfiles around 4-6MB, I'd guess).

<FWIW>
I just checked a 21,000 line x 30 column delimited file and it is 
817KB. I draw the line (for performance and/or convenience working with 
the data) at about 50K lines before I'd use ADO to load the entire file 
into a recordset, OR criteria-specific recordsets depending on how I 
want to work with it.

What I was hoping to learn here is whether we can dump an entire 
recordset into a SQLite table. Is that doable?

>
> Are you sure that your replies address the person
> you have in mind? Your previous reply was going to
> Garry - and your last reply here was going to me,
> and "both of us" are not Simon (who is a very helpful
> person on this list, no doubt about that... :-).
>
> Olaf
>
>
>
> _______________________________________________
> sqlite-users mailing list
> [email protected]
> http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users

-- 
Garry

Free usenet access at http://www.eternal-september.org
ClassicVB Users Regroup! comp.lang.basic.visual.misc


