Hi Simon,
Thanks a lot for your kind response.
I am already inserting line by line inside a transaction, but it is still very
slow, even though my system configuration is fairly powerful.
Please have a look at my code; as far as I can tell, it does what you suggested:
Public Function insert_sqlite(ByVal c1 As ComboBox, ByVal T1 As TextBox)
    'cons, cmd and ticket are declared at form level.
    Dim fileStream As IO.FileStream
    Dim streamReader As IO.StreamReader

    '"Extended Properties" is a Jet/OLEDB text-driver setting, not SQLite, so it is dropped here.
    cons.ConnectionString = "Data Source=C:\sqlite\LFBDB\SQLITE_LFBDB.sqlite;Version=3;"
    cons.Open()
    Dim tx = cons.BeginTransaction()

    'Open the physical stream
    fileStream = New IO.FileStream(T1.Text, IO.FileMode.Open)
    'Open the logical stream
    streamReader = New IO.StreamReader(fileStream)
    Dim filetext As String = streamReader.ReadToEnd()

    'Split the file at line breaks: one line per array element
    Dim yarray() As String = filetext.Split(New String() {Environment.NewLine}, StringSplitOptions.None)
    Dim xarray() As String

    Dim Frow1 As Integer = yarray.GetUpperBound(0)           'last row index
    Dim fcol As Integer = yarray(0).Split(vbTab).Length - 1  'last column index, taken from the header line

    Dim FileArray(fcol, Frow1) As String
    Dim Filecol(fcol, Frow1) As String
    Dim x, y As Integer

    For y = 0 To Frow1
        'Split the current line into fields at each tab
        xarray = yarray(y).Split(vbTab)
        For x = 0 To fcol
            FileArray(x, y) = xarray(x)
            Filecol(x, y) = FileArray(x, y)

            Dim myparam As New SQLite.SQLiteParameter()
            myparam.Value = Filecol(x, 0)
            Dim myparam2 As New SQLite.SQLiteParameter()
            myparam2.Value = FileArray(x, y)
            Dim myparam3 As New SQLite.SQLiteParameter()
            myparam3.Value = Filecol(0, y)

            If FileArray(x, y) = FileArray(x, 0) Then
                'Header row: nothing to insert
            ElseIf FileArray(x, y) = FileArray(0, y) Then
                'First column of a data row: insert a new record
                cmd.CommandText = "INSERT INTO " & ticket & " (" & Filecol(x, 0) & ") VALUES (" & FileArray(x, y) & ")"
                cmd.ExecuteNonQuery()
                'These parameters are added after ExecuteNonQuery and are never
                'referenced in the SQL text, so they currently have no effect.
                cmd.Parameters.Add(myparam)
                cmd.Parameters.Add(myparam2)
                'How to set all parameters? myparam.Value
                'tx.Commit()
            Else
                'Any other column: update the record identified by DOCID
                Dim strval As String = Replace(FileArray(x, y), "'", "''")
                cmd.CommandText = "UPDATE " & ticket & " SET " & Filecol(x, 0) & " = '" & strval & "' WHERE DOCID = " & Filecol(0, y)
                cmd.ExecuteNonQuery()
                cmd.Parameters.Add(myparam)
                cmd.Parameters.Add(myparam2)
                cmd.Parameters.Add(myparam3)
            End If
        Next x
    Next y

    streamReader.Close()
    fileStream.Close()
    tx.Commit()
    cons.Close()
End Function
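
Just to check my understanding, is something like the sketch below closer to
what you mean? It is only a rough attempt, not working code from my project:
it assumes System.Data.SQLite and a made-up table docs(DOCID, COL1, COL2)
whose columns match my file header, and it reuses one prepared, parameterized
INSERT for every line, committing once at the end.

Imports System.Data.SQLite
Imports System.IO

Module BulkInsertSketch
    'Rough sketch only: "docs", COL1/COL2 and the two path arguments are
    'made-up names standing in for my real table and files.
    Sub ImportTabFile(ByVal dbPath As String, ByVal filePath As String)
        Using cons As New SQLiteConnection("Data Source=" & dbPath & ";Version=3;")
            cons.Open()
            Using tx As SQLiteTransaction = cons.BeginTransaction()
                'One command with named parameters, reused for every row.
                Using cmd As New SQLiteCommand("INSERT INTO docs (DOCID, COL1, COL2) VALUES (@docid, @col1, @col2)", cons, tx)
                    cmd.Parameters.Add(New SQLiteParameter("@docid"))
                    cmd.Parameters.Add(New SQLiteParameter("@col1"))
                    cmd.Parameters.Add(New SQLiteParameter("@col2"))
                    Using reader As New StreamReader(filePath)
                        reader.ReadLine() 'skip the header row
                        Dim line As String = reader.ReadLine()
                        Do While line IsNot Nothing
                            Dim fields() As String = line.Split(ControlChars.Tab)
                            'Bind this row's values: no string concatenation, no quoting problems.
                            cmd.Parameters("@docid").Value = fields(0)
                            cmd.Parameters("@col1").Value = fields(1)
                            cmd.Parameters("@col2").Value = fields(2)
                            cmd.ExecuteNonQuery() 'one INSERT per line, all inside the transaction
                            line = reader.ReadLine()
                        Loop
                    End Using
                End Using
                tx.Commit() 'single commit at the very end
            End Using
            cons.Close()
        End Using
    End Sub
End Module

If I am reading your earlier mail correctly, the important parts are keeping
one parameterized statement instead of building a new SQL string for every
value, and committing only once after all the rows are in. Please correct me
if I have that wrong.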
Please also advise me if there is a better approach altogether. My tab-delimited
files will normally have around 1 million records, so this method is hopeless:
a 6,000-record test file already takes about 30 minutes to insert, while
importing the same file through the SQLite user interface in Firefox finishes
in a few seconds.
Please help me with this; I really need a solution.
Regards,
Alok
On 3 January 2011 03:51, Simon Slavin <[email protected]> wrote:
>
> On 2 Jan 2011, at 10:11pm, Simon Slavin wrote:
>
> > It will take less memory and be faster if you do it line by line: read
> one line of the .csv file, then write one row to your SQLite table, then
> read the next line.
>
> Sorry, I forgot: SQLite will make changes faster if you bundle them all
> together in one TRANSACTION:
>
> CREATE TABLE ...
> BEGIN TRANSACTION
> INSERT ...
> INSERT ...
> INSERT ...
> END TRANSACTION
>
> Simon.
>
--
Thanks
Alok Kumar Singh
_______________________________________________
sqlite-users mailing list
[email protected]
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users