On Wednesday, 8 October 2014 09:04:29 UTC-4, LZ Olem wrote:
>
> I'm developing a polling application that will deal with an average of 
> 1000-2000 votes per second coming from different users. In other words, 
> it'll receive 1k to 2k requests per second with each request making a DB 
> insert into the table that stores the voting data.
>
> I'm using RoR 4 with MySQL and planning to push it to Heroku or AWS.
>
> What performance issues related to the database and the application itself 
> should I be aware of?
>
> How can I handle this volume of inserts per second into the database?
>
> EDIT
>
> I was thinking of not inserting into the DB for each request, but instead 
> writing the insert data to an in-memory buffer. A scheduled job running 
> every second would then read from that buffer and generate a bulk insert, 
> instead of issuing a separate insert per vote. But I can't think of a 
> clean way to implement this.
>
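
For reference, the buffered approach described above might look roughly like 
this. The names are made up, it assumes the activerecord-import gem for the 
multi-row INSERT, and anything still sitting in memory is lost if the 
process dies:

require 'thread'

VOTE_BUFFER = Queue.new

# In the controller action, enqueue instead of calling Vote.create:
#   VOTE_BUFFER << [params[:poll_id], params[:option_id]]

# A single background thread drains the queue once a second and writes
# everything it finds as one multi-row INSERT.
Thread.new do
  loop do
    sleep 1
    rows = []
    rows << VOTE_BUFFER.pop until VOTE_BUFFER.empty?
    # activerecord-import builds INSERT ... VALUES (...), (...), ... for us
    Vote.import [:poll_id, :option_id], rows unless rows.empty?
  end
end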

Don't implement that, certainly not as a first thing. Build something 
straightforward that does what you intend (collecting poll results), then 
load-test it. Building a hyper-scalable DB layer isn't going to do much 
good until you're sure the layers in front of it (app servers, etc.) can 
handle the load.
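
The straightforward version is small enough to stand up and measure quickly; 
something like this (hypothetical Vote model and params, one plain 
ActiveRecord insert per request):

class VotesController < ApplicationController
  # One plain INSERT per request -- nothing clever yet.
  def create
    Vote.create!(poll_id: params[:poll_id], option_id: params[:option_id])
    head :created
  end
end

If that holds up under a load test at 1-2k requests per second, the 
bulk-insert machinery may not be needed at all.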

You may also want to consider what part of these results needs to be 
durable. For instance, if an exact "user -> vote" mapping isn't needed, you 
could hack something together with Redis and its INCR command.
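
As a rough sketch of that (redis-rb gem, made-up key names), the whole write 
path becomes one INCR per vote:

require 'redis'

REDIS = Redis.new

# One INCR per vote; only the per-option tally is kept, not who voted.
def record_vote(poll_id, option_id)
  REDIS.incr("poll:#{poll_id}:option:#{option_id}")
end

# Reading a tally back later:
#   REDIS.get("poll:42:option:7")  # => "1234"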

--Matt Jones
