Jian Lin wrote:
> ok, the following code with 6000 records insert will take 13.3 seconds 
> to finish (just for the db portion).  if i change 6000 to 30000 then it 
> will take 67 seconds.
> 

OK, I created the table and the index and ran your code for 30000 
records, but I wrapped the database part in a Benchmark.measure{}
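For reference, the wrapping looks something like this (the insert loop is just a stand-in for whatever creates your records):

```ruby
require 'benchmark'

# Measure only the database portion of the script.
time = Benchmark.measure do
  records = []
  30000.times { |i| records << "record #{i}" }  # replace with your inserts
end

# Benchmark::Tms prints user, system, total and (real) times,
# which are the four columns in the results below.
puts time
```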

Using MySQL:

For 30000 inserts with no indexes:

"Starting inserting records"
  31.996000   0.639000  32.635000 ( 35.356000)
30000

For 30000 inserts with all indexes:

"Starting inserting records"
  32.795000   0.982000  33.777000 ( 37.103000)
30000

That's 33 seconds of CPU time with 37 seconds elapsed on a quad-core 2.4 
GHz machine with a Seagate Barracuda SATA drive with a 32 MB cache.

As you can see, it's pretty much all CPU time!


The result was essentially the same in Postgres and MS SQL Server!

So you can forget about the database itself. None of the database 
engines were unduly taxed by the test.

Just for fun I changed the program to output the data as SQL INSERT 
statements and then ran that (with all 30000 inserts wrapped in a 
single transaction) against MySQL.
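A sketch of that approach (table and column names here are made up; substitute your own schema):

```ruby
# Emit the same data as raw SQL wrapped in one transaction,
# then feed the file to the mysql client.
File.open("inserts.sql", "w") do |f|
  f.puts "BEGIN;"
  30000.times do |i|
    f.puts "INSERT INTO people (name, email) VALUES ('name#{i}', 'email#{i}@example.com');"
  end
  f.puts "COMMIT;"
end
# Then: mysql -u user -p mydb < inserts.sql
```

The single transaction matters: without it MySQL commits (and flushes to disk) after every INSERT.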

Imported in 1.2 seconds!!

I don't know whether it's the hash lookup code or ActiveRecord that is 
gobbling up the time, but it certainly isn't the database.

You'll need to tinker with your code, or better yet profile it, to find 
out what is sucking up the time.
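Short of a full profiler like ruby-prof, Benchmark.bm is a quick way to compare suspects side by side (the reports below are illustrative stand-ins for the pieces of your own loop):

```ruby
require 'benchmark'

# Time candidate hot spots separately to see which dominates.
Benchmark.bm(16) do |b|
  b.report("hash lookups:")    { 30000.times { |i| { :key => i }[:key] } }
  b.report("string building:") { 30000.times { |i| "name#{i}" } }
end
```

Whichever report dwarfs the others is where to focus.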

Cheers,
Gary.



You received this message because you are subscribed to the Google Groups "Ruby 
on Rails: Talk" group.