Re: [PHP-DB] mySQL max connections

2002-07-31 Thread John Lim

Hello Rasmus,

Sorry to bother you, but lerdorf.com is currently down.

Regards, John

PS: I have been reading your new Programming PHP book. I love it!

"Rasmus Lerdorf" <[EMAIL PROTECTED]> wrote in message
[EMAIL PROTECTED]">news:[EMAIL PROTECTED]...
> I would suggest looking into MySQL's replication support.  Split reads and
> writes so they go to separate servers.  That is, create a master server
> where you send all database writes.  And do all reads on the replicated
> slave servers.
>
> Have a look at this presentation I gave last week on this stuff:
>
> http://pres.lerdorf.com/show/osconmysql
>
> The last couple of slides should be interesting to you.
>
> (works best with Mozilla, click on the yellow text at the top to change
> slides, or use cursor-right/left)
>
> -Rasmus
>
> On Tue, 30 Jul 2002, Shane Wright wrote:
>
> > Hi
> >
> > I have a database that's taking a bit of a hammering - enough so that the
> > number of connections spirals up and out of control.
> >
> > max_connections was originally at the default of 100 - but rising above
> > 50 or so meant actual throughput dropped so the db never got a chance to
> > keep up (meaning manually restarting the db).  I've lowered
> > max_connections to 40 which at least keeps the db alive.
> >
> > But the number of connections keeps rising to and bouncing off this
> > limit - and the users that hit it get a 'Too many connections' error.
> >
> > Now, I've optimised everything as much as is humanly possible - and the
> > only way out I can see so far is to have some kind of connection queue to
> > keep people waiting for the 1/2 second or so until the load spike drops
> > off (I'd rather have a few slow pages than errors any day).
> >
> > Is there any way of doing this - I've looked at back_log (the listen()
> > backlog), but that doesn't really apply.
> >
> > Short of writing a 'hide-warning-wait-a-bit-and-try-again' chunk in PHP
> > (sucky!) I'm stuck!
> >
> > Using persistent connections doesn't stunningly help either - it only
> > gives a small performance increase.
> >
> > Any help appreciated, thanks.
> >
> > --
> > Shane
> > http://www.shanewright.co.uk/
> > Public key: http://www.shanewright.co.uk/files/public_key.asc
> >
>



-- 
PHP Database Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php




Re: [PHP-DB] mySQL max connections

2002-07-30 Thread Shane Wright


Hi 

Thanks for the info, looks pretty interesting - unfortunately we don't have the
resources to add more servers, so I guess I'm just reaching capacity :(

Ah well, time to start looking at alternatives, hmmm shm caches sound nice.
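
A minimal sketch of what such an shm cache could look like, using PHP's SysV
shared-memory functions (needs --enable-sysvshm; the segment size, TTL and
crc32() keying below are made-up examples, not a tested setup):

<?php
// Tiny shared-memory cache for query results.  Sketch only: crc32() keys
// can collide and there is no locking - it just shows the shape of the idea.

define('CACHE_SHM_SIZE', 512 * 1024);   // 512 KB segment
define('CACHE_TTL',      30);           // entries live for 30 seconds

function cache_open()
{
    // Any stable integer key works; ftok() derives one from this file.
    return shm_attach(ftok(__FILE__, 'c'), CACHE_SHM_SIZE, 0600);
}

function cache_get($shm, $sql)
{
    $entry = @shm_get_var($shm, crc32($sql));
    if (is_array($entry) && $entry['expires'] > time()) {
        return $entry['rows'];
    }
    return false;                        // miss or expired
}

function cache_put($shm, $sql, $rows)
{
    shm_put_var($shm, crc32($sql), array(
        'expires' => time() + CACHE_TTL,
        'rows'    => $rows,
    ));
}
?>

Wrapping just the heaviest SELECTs in a cache_get()/cache_put() pair would at
least stop repeat page hits opening connections to re-run the same queries.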

Thanks

Shane

On Tuesday 30 July 2002 5:33 pm, Rasmus Lerdorf wrote:
> I would suggest looking into MySQL's replication support.  Split reads and
> writes so they go to separate servers.  That is, create a master server
> where you send all database writes.  And do all reads on the replicated
> slave servers.
>
> Have a look at this presentation I gave last week on this stuff:
>
> http://pres.lerdorf.com/show/osconmysql
>
> The last couple of slides should be interesting to you.
>
> (works best with Mozilla, click on the yellow text at the top to change
> slides, or use cursor-right/left)
>
> -Rasmus
>
> On Tue, 30 Jul 2002, Shane Wright wrote:
> > Hi
> >
> > I have a database that's taking a bit of a hammering - enough so that the
> > number of connections spirals up and out of control.
> >
> > max_connections was originally at the default of 100 - but rising above
> > 50 or so meant actual throughput dropped so the db never got a chance to
> > keep up (meaning manually restarting the db).  I've lowered
> > max_connections to 40 which at least keeps the db alive.
> >
> > But the number of connections keeps rising to and bouncing off this
> > limit - and the users that hit it get a 'Too many connections' error.
> >
> > Now, I've optimised everything as much as is humanly possible - and the
> > only way out I can see so far is to have some kind of connection queue to
> > keep people waiting for the 1/2 second or so until the load spike drops
> > off (I'd rather have a few slow pages than errors any day).
> >
> > Is there any way of doing this - I've looked at back_log (the listen()
> > backlog), but that doesn't really apply.
> >
> > Short of writing a 'hide-warning-wait-a-bit-and-try-again' chunk in PHP
> > (sucky!) I'm stuck!
> >
> > Using persistent connections doesn't stunningly help either - it only
> > gives a small performance increase.
> >
> > Any help appreciated, thanks.
> >
> > --
> > Shane
> > http://www.shanewright.co.uk/
> > Public key: http://www.shanewright.co.uk/files/public_key.asc
> >

-- 
Shane
http://www.shanewright.co.uk/
Public key: http://www.shanewright.co.uk/files/public_key.asc


--
PHP Database Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php




Re: [PHP-DB] mySQL max connections

2002-07-30 Thread Rasmus Lerdorf

I would suggest looking into MySQL's replication support.  Split reads and
writes so they go to separate servers.  That is, create a master server
where you send all database writes.  And do all reads on the replicated
slave servers.
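
In PHP terms this can be as simple as holding two connection handles and
routing each query by type; a rough sketch (the host names, user name and
password below are placeholders):

<?php
// Sketch: one link for writes (the master), one for reads (a slave).
// With several slaves you would pick one at random or round-robin here.

$master = mysql_connect('db-master', 'webuser', 'secret');
$slave  = mysql_connect('db-slave1', 'webuser', 'secret');
mysql_select_db('mydb', $master);
mysql_select_db('mydb', $slave);

function db_write($sql)
{
    global $master;
    return mysql_query($sql, $master);   // INSERTs, UPDATEs, DELETEs
}

function db_read($sql)
{
    global $slave;
    return mysql_query($sql, $slave);    // SELECTs only
}
?>

Reads are usually the bulk of the load, so they spread across as many slaves
as you add while the master only sees the writes.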

Have a look at this presentation I gave last week on this stuff:

http://pres.lerdorf.com/show/osconmysql

The last couple of slides should be interesting to you.

(works best with Mozilla, click on the yellow text at the top to change
slides, or use cursor-right/left)

-Rasmus

On Tue, 30 Jul 2002, Shane Wright wrote:

> Hi
>
> I have a database that's taking a bit of a hammering - enough so that the
> number of connections spirals up and out of control.
>
> max_connections was originally at the default of 100 - but rising above 50 or
> so meant actual throughput dropped so the db never got a chance to keep up
> (meaning manually restarting the db).  I've lowered max_connections to 40
> which at least keeps the db alive.
>
> But the number of connections keeps rising to and bouncing off this limit -
> and the users that hit it get a 'Too many connections' error.
>
> Now, I've optimised everything as much as is humanly possible - and the only
> way out I can see so far is to have some kind of connection queue to keep
> people waiting for the 1/2 second or so until the load spike drops off (I'd
> rather have a few slow pages than errors any day).
>
> Is there any way of doing this - I've looked at back_log (the listen()
> backlog), but that doesn't really apply.
>
> Short of writing a 'hide-warning-wait-a-bit-and-try-again' chunk in PHP
> (sucky!) I'm stuck!
>
> Using persistent connections doesn't stunningly help either - it only gives a
> small performance increase.
>
> Any help appreciated, thanks.
>
> --
> Shane
> http://www.shanewright.co.uk/
> Public key: http://www.shanewright.co.uk/files/public_key.asc
>
>
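
PS: if you do fall back on the 'wait a bit and try again' wrapper mentioned
above as a stopgap, it doesn't need much code.  A rough sketch against the
mysql_* functions (the retry count and delay are arbitrary; 1040 is MySQL's
"Too many connections" error number):

<?php
// Retry the connection a few times when MySQL is bouncing us off
// max_connections, instead of handing the user an error straight away.

function db_connect_retry($host, $user, $pass, $tries = 5)
{
    for ($i = 0; $i < $tries; $i++) {
        $link = @mysql_connect($host, $user, $pass);
        if ($link) {
            return $link;
        }
        if (mysql_errno() != 1040) {     // not "Too many connections"
            break;                       // some other failure - don't retry
        }
        usleep(500000);                  // back off for half a second
    }
    return false;                        // still no luck - show an error page
}
?>

You could use mysql_pconnect() there instead, though as you noted persistent
connections only buy a small amount.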


-- 
PHP Database Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php