Hi all,
I'm stuck with this again; kindly give your suggestions.
Inserting around 65k rows is taking about 20 minutes.
Please have a look at my basic code below and advise. I hope it gives you
a clear picture of the problem.
Something seems to be consuming a lot of memory, or the work grows as it
runs, because each batch gets progressively slower:

  rows     1 to 10000   in a few seconds
  rows 10001 to 20000   in about 1 min
  rows 20001 to 30000   in more than 1.5 min
  rows 30001 to 40000   in more than 3 min
  rows 40001 to 50000   in more than 6 min
  rows 50001 to 60000   in more than 8 min
  rows 60001 to 65042   in about 3 min
------------------------------------------------------------------------
So rows 1 to 65042 from 2 files take 20 minutes in total, which is far
too slow.
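As an aside, two standard SQLite PRAGMAs often speed up bulk loads like this considerably. They trade durability for speed (a power failure mid-load can leave the database incomplete), so they should be set once on the connection before the first transaction begins:

```vbnet
' Run these once before starting the insert batches.
' PRAGMA synchronous = OFF: SQLite no longer waits for the OS to flush
'   each commit to disk.
' PRAGMA journal_mode = MEMORY: the rollback journal is kept in RAM
'   instead of a file on disk.
cmd.CommandText = "PRAGMA synchronous = OFF"
cmd.ExecuteNonQuery()
cmd.CommandText = "PRAGMA journal_mode = MEMORY"
cmd.ExecuteNonQuery()
```

These affect per-commit overhead, so they help most when commits are frequent; they do not by themselves explain the progressive slowdown described above.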
Earlier, when I was inserting row by row in one go, it also took about
20 minutes, so I tried batching the inserts 10k rows per transaction,
hoping that would be faster, but it wasn't. Kindly have a look at my
basic code below; something is eating memory and getting slower as it
runs. Please advise:
' Batched insert: 10,000 rows per transaction, assuming j starts at the
' first row index and Frow1 is the last.
' Note: the original code called cmd.Parameters.Add(myparam) on every
' iteration, so the parameter collection grew with every row; that is the
' most likely cause of the progressive slowdown. The parameter was never
' actually used, since the value is concatenated into the SQL text, so it
' has been removed here. The GoTo loop and the duplicated tail loop are
' replaced by a single Do While.
Dim j As Integer = 0
Do While j <= Frow1
    Dim upper As Integer = Math.Min(j + 9999, Frow1)
    Dim tx = cons.BeginTransaction()
    cmd.Transaction = tx
    For y As Integer = j To upper
        Dim strval As String
        strval = Replace(yarray(y) & vbTab & yarray1(y), "'", "''")
        strval = Replace(strval, vbTab, "','")
        cmd.CommandText = "INSERT INTO " & ticket & " VALUES('" & strval & "')"
        cmd.ExecuteNonQuery()
    Next
    tx.Commit()
    j = upper + 1
Loop
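For what it's worth, here is a sketch of how the same loop could use a real parameterized command instead of concatenating values into the SQL text. This is illustrative only: it assumes a System.Data.SQLite-style provider and, for brevity, a single-column table; with 20 columns you would add 20 parameters once, split each row on vbTab, and assign one value per parameter:

```vbnet
' Prepare the statement once; only the parameter value changes per row,
' so there is no per-row quoting and no re-parsing of the SQL.
Dim tx = cons.BeginTransaction()
cmd.Transaction = tx
cmd.CommandText = "INSERT INTO " & ticket & " VALUES (?)"
Dim p = cmd.CreateParameter()
cmd.Parameters.Add(p)   ' add the parameter ONCE, outside the loop
For y As Integer = 0 To Frow1
    p.Value = yarray(y) & vbTab & yarray1(y)
    cmd.ExecuteNonQuery()
Next
tx.Commit()
```

This also removes the need for the Replace() calls that double up single quotes, since the driver handles escaping.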
On 6 January 2011 21:21, Olaf Schmidt <[email protected]> wrote:
>
> "Alok Singh" <[email protected]> schrieb
> im Newsbeitrag
> news:[email protected]...
>
> > yeah that's correct Simon, its in 0.6 sec to insert
> > for 10.5K rows with 20 columns (2 files both
> > having 10.5k rows)
>
> That's the timing I would expect, if you'd used
> Garry's recommendation (to read the whole file
> into a String first, and then split the string into an
> Array "InMemory", finally followed by an Insert-
> Transaction, which makes use of this 2D-Array).
>
> That's memory-intensive - but "Ok" (and fast) for Testfiles
> with that RowCount (filesize of your 10.5K-Rows
> testfiles around 4-6MB I'd guess).
>
> Are you sure that your replies address the person
> you have in mind? Your previous reply went to
> Garry, and your last reply here went to me,
> and neither of us is Simon (who is a very helpful
> person on this list, no doubt about that... :-).
>
> Olaf
>
>
>
> _______________________________________________
> sqlite-users mailing list
> [email protected]
> http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users
>
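For reference, the in-memory approach Olaf describes above (read the whole file into a string, split it into rows, insert inside one transaction) could look roughly like this; the file name and the split details are illustrative, not from the original code:

```vbnet
' Read the entire file at once, split into rows, then insert all rows
' inside a single transaction.
Dim allText As String = System.IO.File.ReadAllText("data.txt")
Dim rows() As String = allText.Split(New String() {vbCrLf},
                                     StringSplitOptions.RemoveEmptyEntries)
Dim tx = cons.BeginTransaction()
cmd.Transaction = tx
For Each row As String In rows
    Dim escaped As String = Replace(row, "'", "''")
    cmd.CommandText = "INSERT INTO " & ticket & " VALUES('" &
                      Replace(escaped, vbTab, "','") & "')"
    cmd.ExecuteNonQuery()
Next
tx.Commit()
```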
--
Thanks
Alok Kumar Singh