Alok Singh wrote on 1/7/2011:
> Hi guys,
>
> I am stuck with this again. Kindly give your suggestions.
> It is taking around 20 minutes to insert about 65k rows.
> Please have a look at my basic code below and help me with the problem.
> I think something is using a lot of memory or getting slower as it goes,
> because the timings degrade like this:
> rows 1 to 10000 in a few seconds
> rows 10001 to 20000 in 1 min
> rows 20001 to 30000 in more than 1.5 min
> rows 30001 to 40000 in more than 3 min
> rows 40001 to 50000 in more than 6 min
> rows 50001 to 60000 in more than 8 min
> rows 60001 to 65042 in 3 min
> ------------------------------------------------------------------------
> So rows 1 to 65042 from 2 files take 20 minutes, which is far too slow.
> Earlier, when I was simply inserting everything in one go, it also took
> 20 minutes, so I tried committing in batches of 10k because that seemed
> faster, but it did not help.
> Kindly have a look at my code below; something is taking a lot of memory
> and getting slower and slower. Please advise me soon:
>
> insert:
>         Dim tx = cons.BeginTransaction()
>         For y = j To x
>             Dim strval
>             strval = Replace(yarray(y) + vbTab + yarray1(y), "'", "''")
>             strval = Replace(strval, vbTab, "','")
>             myparam.Value = strval
>             cmd.CommandText = "INSERT into " & ticket & " VALUES('" & strval
> & "')"
>             cmd.Parameters.Add(myparam)
>             cmd.ExecuteNonQuery()
>
>         Next
>         tx.Commit()
>         j = x + 1
>         x = x + 10000
>         If x < Frow1 Then
>             GoTo insert
>         Else
>
>             Dim m As Integer
>             m = x - 10000 + 1
>             For y = m To Frow1
>
>                 Dim strval
>                 strval = Replace(yarray(y) + vbTab + yarray1(y), "'", "''")
>                 strval = Replace(strval, vbTab, "','")
>                 myparam.Value = strval
>                 cmd.CommandText = "INSERT into " & ticket & " VALUES('" &
> strval & "')"
>                 cmd.Parameters.Add(myparam)
>                 cmd.ExecuteNonQuery()
>             Next
>             tx.Commit()
>         End If
>
>
>
>
> On 6 January 2011 21:21, Olaf Schmidt 
> <[email protected]> wrote:
>
>> 
>> "Alok Singh" <[email protected]> schrieb
>> im Newsbeitrag
>> news:[email protected]...
>> 
>>> Yeah, that's correct Simon, it takes 0.6 sec to insert
>>> 10.5K rows with 20 columns (2 files, both
>>> having 10.5K rows)
>> 
>> That's the timing I would expect if you had used
>> Garry's recommendation (to read the whole file
>> into a String first, then split the string into an
>> array in memory, and finally run one insert
>> transaction that makes use of this 2D array).
>> 
>> That's memory-intensive - but "OK" (and fast) for test files
>> with that row count (the file size of your 10.5K-row
>> test files is around 4-6 MB, I'd guess).
>> 
>> Are you sure that your replies address the person
>> you have in mind? Your previous reply went to
>> Garry, and your last reply here went to me,
>> and "both of us" are not Simon (who is a very helpful
>> person on this list, no doubt about that... :-).
>> 
>> Olaf
>> 

I agree with Simon here about the coding (it looks like VB.Net) not being 
very efficient, and about using a 2nd array. Put your file contents 
into one array. Loop through that array one record at a time and use a 
temp array to process each record. This should look similar to the 
example code I posted earlier (just revised for use as VB.Net).

What I don't understand is why you put strval into myparam and pass that to 
cmd.Parameters.Add while also putting the same value into cmd.CommandText.
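For what it's worth, here is roughly how the parameter route is meant to 
work: placeholders in the SQL text, parameters added once, and only the 
values changed inside the loop. This is just a sketch, not your exact code. 
It assumes an ADO.NET SQLite provider that accepts @-style named parameters 
(System.Data.SQLite does), it reuses your cons/ticket/yarray/yarray1/Frow1 
names, and it assumes each tab-separated field maps to one table column. 
Done this way, the apostrophe-doubling Replace isn't needed at all:

  ' Sketch only - assumes cons is an open SQLite connection, ticket is the
  ' table name, and yarray/yarray1 hold the two files' lines as in your code.
  Dim sample() As String = Split(yarray(1) & vbTab & yarray1(1), vbTab)
  Dim nCols As Integer = sample.Length

  Dim placeholders As String = "@p0"
  For i As Integer = 1 To nCols - 1
      placeholders &= ", @p" & i
  Next

  Using tx = cons.BeginTransaction()
      Using cmd = cons.CreateCommand()
          cmd.Transaction = tx
          cmd.CommandText = "INSERT INTO " & ticket & " VALUES (" & placeholders & ")"

          ' Add the parameters once, up front.
          For i As Integer = 0 To nCols - 1
              Dim prm = cmd.CreateParameter()
              prm.ParameterName = "@p" & i
              cmd.Parameters.Add(prm)
          Next

          ' Re-use the same command for every row; only the values change.
          For y As Integer = 1 To Frow1
              Dim fields() As String = Split(yarray(y) & vbTab & yarray1(y), vbTab)
              For i As Integer = 0 To nCols - 1
                  cmd.Parameters(i).Value = fields(i)   ' no '' escaping needed
              Next
              cmd.ExecuteNonQuery()
          Next
      End Using
      tx.Commit()   ' one commit at the end
  End Using
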

Also, it appears that you're replacing vbTab with a comma wrapped in single 
quotes. What's the purpose of that? (I'm guessing the VALUES list requires 
string literals.) My understanding (from reading elsewhere) is that SQLite 
recognizes the tab. Pardon my lack of knowledge about using 
SQLite, but that seems like a place to save some time.

Try revising your code to something like:

  For y = 1 To UBound(FileArray)
    RecordArray = Split(FileArray(y), vbTab)
    'Process this record from here and move on
  Next 'y

**It also appears that you're passing a delimited string containing the 
entire record rather than each field of the record. If that is doable, 
then efficient coding gets even easier...

  For y = 1 To UBound(asFileArray)

    'what you're doing now
    sRecord = Replace(asFileArray(y), vbTab, "','")
    cmd.CommandText = "INSERT into " & ticket _
                    & " VALUES('" & sRecord & "')"
    cmd.Parameters.Add(sRecord) '//what purpose?

    =========================================================
    'OR...If SQLite accepts the tab

    cmd.CommandText = "INSERT into " & ticket & " VALUES('" _
                    & asFileArray(y) & "')"
    cmd.Parameters.Add(asFileArray(y)) '//what purpose?
    =========================================================

    cmd.ExecuteNonQuery()
  Next 'y
  tx.Commit()

Since you appear to be using VB.Net, you might want to switch to 
using ADO for these larger files. See Olaf's comments and 
recommendation for committing an ADO recordset to SQLite. It's quite fast, 
but it basically does the same thing as reading from an array, in that it 
inserts each record from the ADO recordset using MoveNext. Same principle - 
different approach.
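
If you do go the ADO route, the inner loop boils down to something like the 
sketch below. Again, this is only rough: rsSource is a hypothetical, 
already-opened classic ADODB.Recordset over your source data (however you 
choose to open it - text/ODBC driver, Olaf's wrapper, etc.), and cmd is the 
same kind of prepared, parameterized INSERT shown earlier, running inside 
one transaction tx.

  ' Rough sketch - rsSource is a hypothetical, already-opened ADODB.Recordset;
  ' cmd is a prepared parameterized INSERT like the one sketched above, and
  ' the whole loop runs inside one transaction.
  Do While Not rsSource.EOF
      ' Copy each source field into the matching SQLite parameter.
      For i As Integer = 0 To rsSource.Fields.Count - 1
          cmd.Parameters(i).Value = rsSource.Fields(i).Value
      Next
      cmd.ExecuteNonQuery()
      rsSource.MoveNext()   ' advance to the next source record
  Loop
  tx.Commit()               ' commit once at the end, as before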

Personally, I find using arrays much faster than ADO, but ADO is more 
efficient in that the data can be filtered into easier-to-manage sizes. 
Searching is also better if your text file[s] aren't set up to simulate 
tables in a single db. (Each file is a table in ADO.)

-- 
Garry

Free usenet access at http://www.eternal-september.org
ClassicVB Users Regroup! comp.lang.basic.visual.misc


