Thanks, I'm looking into Linode. I've been recommended it on this list once
before. Does it handle traffic spikes well, or would a spike trigger a CPU
alert? I've gotten those alerts from my present hosting company and they
stress me out. That's how I was kicked out: too many CPU alerts. My setup at
that time had no speed/cache optimization, though, so maybe a new Linode
setup would be fast enough that traffic spikes wouldn't be a problem.
I just don't want any kind of resource usage alerts, and yes, I know that
once I get more traffic, like you have, I'll have to upgrade. I could even
move to the 1GB Linode later.
Once, a link to the wiki was posted on the front page of Reddit; we got a
flood of traffic and I had to take that page down (redirect it) for a short
while. Another time, a link was posted on a Facebook page with 20K or 60K
likes and we were getting clicks like crazy, so I had to take that page down
too.

About this:
>> My recommendation is NGINX, PHP-FPM with APC and the built-in mediawiki file
>> cache. If you're not getting the performance you want you could also run
>> Varnish or set up a separate Linode for memcached. You could also place the
>> wiki behind Cloudflare if you're serving a lot of media files on page, if
>> not I don't think it would be beneficial.

I'm not serving media files. Most of the content is regular wiki pages (most
of them served to logged-out users, as is often the case), and even graphics
are rarely used. I've heard the NGINX recommendation before but have no
experience with it. The price of the 512MB package is definitely attractive,
and I can get it and play with the setup.
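For anyone else following along, here's a minimal sketch of what turning on
the built-in file cache and APC might look like in LocalSettings.php. This is
my guess at the setup Chris describes, not a tested config; the cache path is
a placeholder and it assumes the PHP APC extension is installed:

```php
# Hypothetical LocalSettings.php fragment -- adjust paths to your install.

# Use APC for the main object cache and parser cache
# (CACHE_ACCEL picks the installed opcode/object cache, e.g. APC):
$wgMainCacheType   = CACHE_ACCEL;
$wgParserCacheType = CACHE_ACCEL;

# Enable MediaWiki's built-in file cache, which serves whole rendered
# pages to logged-out readers from static files:
$wgUseFileCache       = true;
$wgFileCacheDirectory = "$IP/cache";  # must be writable by the web server
$wgShowIPinHeader     = false;        # per-user headers defeat the file cache
```

Since almost all of my traffic is logged-out readers of plain wiki pages,
the file cache part alone could cover most requests.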


thanks




On Wed, Nov 28, 2012 at 1:25 AM, Serrano <[email protected]> wrote:

> I run a wiki an order of magnitude larger than yours on a 2GB Linode. You
> should have no issue on a $20/mo 512MB Linode provided you're running a
> modern PHP stack.
>
> My recommendation is NGINX, PHP-FPM with APC and the built-in mediawiki
> file cache. If you're not getting the performance you want you could also
> run Varnish or set up a separate Linode for memcached. You could also place
> the wiki behind Cloudflare if you're serving a lot of media files on page,
> if not I don't think it would be beneficial.
>
> Best,
> Chris
>
>
> On Tue, Nov 27, 2012 at 9:40 PM, Dan Fisher <[email protected]>
> wrote:
>
> > I was running MediaWiki on a shared host with traffic of around 10K views
> > a day (a small-to-moderate wiki). I was forced to leave that setup because
> > of high CPU usage; I was not able to install Squid there or do anything to
> > speed things up. I talked about that before on this list and I'm thankful
> > for the recommendations.
> > Now I'm on a VPS where Squid is running, and currently I don't have CPU
> > issues except when there's a traffic spike. So I've decided to look for a
> > dedicated server. I've seen on web hosting forums that (low-end?)
> > dedicated servers are available pretty cheap ($100). Currently I'm paying
> > $70 for the VPS.
> > My key issue is that the web host has to be willing to let me remain
> > anonymous, and because of this my options are limited. For example, they
> > have to accept PayPal. I haven't looked around yet at what options are
> > available, but I will look into that next, after this discussion.
> > To be prepared for the future, I want the server to support 30K views a
> > day (3 times the current traffic) and display pages with no
> > noticeable/serious delays. I hope a $100 server with Squid can do this
> > for me.
> > Are there any server specs I should look for? The first one would be RAM.
> > What's the minimum RAM I should have? Other desirable specs?
> >
> > My second issue is the hit ratio for Squid: according to Squid's cache
> > manager, the cache hit ratio is about 40% and the byte hit ratio is 20%.
> > The average time to serve a "missed" request is 0.7 seconds, while for a
> > hit it's only 0.02 seconds (35 times faster). So a higher hit ratio would
> > be really nice.
> > Looking at Squid's access logs, I also noticed that calls to load.php are
> > always "misses". Can anything be done to fix that?
> > What can be done to optimize Squid for MediaWiki and increase the hit
> > ratio? I have 1.3GB of RAM available; I told Squid it can use 130MB, but
> > it goes over, and total RAM usage usually stays around 40%. I know 1.3GB
> > may be small. I've heard we need to leave some RAM free to ensure system
> > stability. I may have more RAM in the dedicated server when I get it.
> > If anyone has a high hit ratio, I would really be thankful if you could
> > email me your squid.conf (with any sensitive information removed) so I can
> > compare it with my setup. Or you could tell me the settings I should
> > change or add.
> >
> > thanks!
> > Dan
> > _______________________________________________
> > MediaWiki-l mailing list
> > [email protected]
> > https://lists.wikimedia.org/mailman/listinfo/mediawiki-l
> >