On Mon, Oct 24, 2011 at 6:50 PM, Jason Pruim <li...@pruimphotography.com> wrote:
> Now that I've managed to list 3 separate programming languages and somewhat
> tie it back into PHP, here's the question...
> I have about 89 million records in MySQL... the initial load of the page
> takes 2 to 3 minutes. I am using pagination, so I have LIMITs on the SQL
> queries... but they just aren't going fast enough...
> What I would like to do, is pull the data out of MySQL and store it in the
> HTML files, and then update the HTML files once a day/week/month... I can
> figure most of it out... BUT... How do I automatically link to the
> individual pages?
> I have the site working when you pull it from MySQL... Just the load time
> sucks... Any suggestions on where I can pull some more info from? :)
> Thanks in advance!
Dial in the DB schema (think keys/indexes) and queries first; then investigate
a reverse proxy like Varnish to cache the generated HTML.
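On the query side: with 89 million rows, deep LIMIT/OFFSET pagination is
usually the killer, because MySQL has to scan and discard every row before the
offset. One common fix is keyset ("seek") pagination on an indexed column.
A rough sketch, with hypothetical table/column names:

```sql
-- Assumes a `records` table with an indexed auto-increment `id`.

-- Slow on deep pages: scans and throws away 50,000,000 rows first.
SELECT * FROM records ORDER BY id LIMIT 50 OFFSET 50000000;

-- Fast: remember the last id shown on the previous page and seek past it,
-- so the index jumps straight to the 50 rows you need.
SELECT * FROM records WHERE id > 123456 ORDER BY id LIMIT 50;
```

The trade-off is that you link pages by "next/previous from id X" rather than
by arbitrary page number, but for a browse-style listing that's usually fine.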
You'll be able to handle a couple thousand requests per second against the
proxy in no time.
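For the proxy piece, a minimal Varnish sketch looks something like this --
the backend address and TTL are assumptions you'd tune for your site:

```vcl
vcl 4.0;

# Hypothetical setup: PHP/Apache listening on 8080, Varnish in front on 80.
backend default {
    .host = "127.0.0.1";
    .port = "8080";
}

sub vcl_backend_response {
    # Cache generated pages for a day, matching a once-a-day refresh cycle.
    set beresp.ttl = 24h;
}
```

With that in place, only the first request for each page ever hits PHP/MySQL;
everything after it is served from the cache until the TTL expires.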
It might be worth pre-generating some of the pages if they are still really
slow after DB optimization.
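As for "how do I automatically link to the individual pages": if you know the
total row count, you know the page count, so the generator can emit the links
itself. A minimal sketch in PHP -- the `$rows` array stands in for data you'd
actually fetch from MySQL, and all file/variable names are made up:

```php
<?php
// Pre-generate static paginated HTML pages from a result set.
// $rows is a stand-in for rows fetched from MySQL.
$rows    = range(1, 250);
$perPage = 100;
$pages   = (int) ceil(count($rows) / $perPage);

for ($p = 1; $p <= $pages; $p++) {
    $slice = array_slice($rows, ($p - 1) * $perPage, $perPage);
    $html  = "<html><body><ul>\n";
    foreach ($slice as $r) {
        $html .= "<li>Record $r</li>\n";
    }
    // Emit a pager linking every page to every other page, so the
    // static files stay navigable without any runtime code.
    $html .= "</ul><p>";
    for ($i = 1; $i <= $pages; $i++) {
        $html .= ($i === $p) ? "$i " : "<a href=\"page_$i.html\">$i</a> ";
    }
    $html .= "</p></body></html>";
    file_put_contents("page_$p.html", $html);
}
```

Run that from cron once a day/week/month and point the web server (or the
Varnish backend) at the generated files.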