Arfon - you might check this out:

http://archives.postgresql.org/pgsql-sql/2002-04/msg00318.php

It's somewhat dated and deals with PostgreSQL rather than MySQL, but:

"We currently have about 100GB of data and will soon grow to a multi-terabyte
system.  We have tables of up to 1 billion rows and have been able to get
~1 million row queries to run in about 5 min."
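If you do end up rolling your own application-level partitioning, one rough
sketch is to derive the physical table name from the value being written.
This is plain Ruby, not an ActiveRecord feature; the table-naming scheme and
ROWS_PER_PARTITION below are made-up examples, not anything Rails provides:

```ruby
# Application-level partitioning sketch (hypothetical scheme):
# route each row to one of several physical tables based on its id.
ROWS_PER_PARTITION = 100_000_000

def partition_table_for(id)
  # Integer division buckets ids 0..99_999_999 into "events_0",
  # 100_000_000..199_999_999 into "events_1", and so on.
  "events_#{id / ROWS_PER_PARTITION}"
end
```

From there you could point a model at the right table with set_table_name
before writing, though switching tables per-row gets ugly fast, so most
people wrap it in a lookup/factory of per-partition model classes.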

Rick

On Nov 27, 5:15 am, Arfon Smith <[EMAIL PROTECTED]>
wrote:
> I'm looking at building a Rails application which will have some pretty
> large tables with upwards of 500 million rows.  To keep things snappy
> I'm currently looking into how a large table can be split into more
> manageable chunks.  I see that as of MySQL 5.1 there is a partitioning
> option, and that's a possibility, but I don't like the way the column
> that determines the partitioning has to be part of the primary key on
> the table.
>
> What I'd really like to do is split the table that an AR model writes
> to based upon the values written, but as far as I am aware there is no
> way to do this - does anyone have any suggestions as to how I might
> implement this, or any alternative strategies?
>
> Thanks
>
> Arfon
> --
> Posted via http://www.ruby-forum.com/.
--
You received this message because you are subscribed to the Google Groups "Ruby on Rails: Talk" group.