Pete,

I would suggest you start here:

http://dev.mysql.com/doc/refman/5.0/en/insert-speed.html

When working with that much data, I usually try to get as close to the
database as possible: cut out the ORM layer and talk to the DB
natively.
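For the batching itself, one simple approach is to buffer rows in memory
and flush them with a single executemany() call every N rows. A rough
sketch below, using sqlite3 as a stand-in for MySQLdb (both follow the
DB-API 2.0, though MySQLdb uses %s placeholders instead of ?); the
stats table and column names are made up for illustration:

```python
import sqlite3  # stand-in for MySQLdb; both implement DB-API 2.0


class StatBuffer:
    """Buffers rows and flushes them in one batched INSERT."""

    def __init__(self, conn, batch_size=1000):
        self.conn = conn
        self.batch_size = batch_size
        self.rows = []

    def add(self, site, hits):
        self.rows.append((site, hits))
        if len(self.rows) >= self.batch_size:
            self.flush()

    def flush(self):
        if not self.rows:
            return
        # One executemany() instead of one round trip per row.
        # With MySQLdb this becomes a multi-row INSERT, which is
        # what the insert-speed page above recommends.
        self.conn.executemany(
            "INSERT INTO stats (site, hits) VALUES (?, ?)", self.rows)
        self.conn.commit()
        self.rows = []


conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stats (site TEXT, hits INTEGER)")

buf = StatBuffer(conn, batch_size=3)
for i in range(7):
    buf.add("site%d" % i, i)
buf.flush()  # flush whatever is left over

count = conn.execute("SELECT COUNT(*) FROM stats").fetchone()[0]
```

The one wrinkle in a web app is that the buffer has to live somewhere
shared across requests (and you lose up to batch_size rows on a crash),
so size the batch accordingly.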

Just my $.02.

On Jun 11, 3:16 pm, Pete Wright <[EMAIL PROTECTED]> wrote:
> Hey all,
>
> I have an application that basically collects stats from a bunch of
> websites. Actually, that's an understatement. It collects stats from a
> few hundred thousand websites.
>
> Each time controller actions get hit they post information into tables
> in a very large database. Is there any mechanism by which I can batch
> up these inserts so that instead of doing a single insert for every
> action call I'm basically doing one bulk insert for every 1000 or even
> 10,000 action calls? The backend database here is MySql.
>
> Any ideas or pointers on this would be greatly appreciated.
>
> Thanks
>
> Pete
--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups 
"pylons-discuss" group.
To post to this group, send email to pylons-discuss@googlegroups.com
To unsubscribe from this group, send email to [EMAIL PROTECTED]
For more options, visit this group at 
http://groups.google.com/group/pylons-discuss?hl=en
-~----------~----~----~----~------~----~------~--~---
