On Fri, 2009-08-14 at 16:49 +1200, Daniel Hill wrote:
> Kent Fredric wrote:
> > For minimal pain, don't unmask the ~ ( testing ) versions of things
> > during stage 1. If you do, you'll find a fun gcc cyclic
> > dependency :)  ( that is, don't set ACCEPT_KEYWORDS to "~amd64" or
> > "~x86" - leave it at "amd64" or "x86" )
> >
> > Once you get to stage 3 of the build /then/ you /might/ want to switch
> > on that, but don't do it earlier.
> 
> I had a friend advise me to do this:
> * starting with stage 3, updating all the settings and then running
> "emerge world" gets you the same result as starting from stage 1
> 
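( For reference, the keyword setting Kent mentions lives in /etc/make.conf.
A minimal sketch, assuming an amd64 box - the values here are only
illustrative:

    # /etc/make.conf
    # keep the stage 1 build on the stable keyword...
    ACCEPT_KEYWORDS="amd64"
    # ...and only consider switching to testing once you're at stage 3:
    # ACCEPT_KEYWORDS="~amd64"
)
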
Daniel,

What are you wanting to learn from all of this? Whilst you may
conceivably need to roll your own kernel, maintaining that system from
then on is going to be needlessly difficult.

I agree there's far more of a case for being able to build your own
applications from scratch ( especially internet-facing ones ), but it
takes an extreme use case to require a specific kernel - ultra-high
performance databases and embedded hardware are the only ones that
readily spring to mind.

Managing a server really isn't about being able to build it from
scratch. That's great if you want to learn about Linux internals and
philosophy, but it's not really relevant for day-to-day server use, where
it's all about minimising risk, ensuring availability and understanding
system load. Oh, and monitoring it.

With risk in mind, it's best to use software "certified"* for a specific
OS, and the simplest way to do that is to stay in the mainline, which
really means RH/CentOS 5.3 or Debian lenny. You'll be shocked at the
versions used... for example, kernel 2.6.18, PHP 5.1.6, MySQL 5.0.45,
etc. ( lenny is newer, but only because it's just been released - and
don't go for RH 4 unless you want real heartache! ). Understanding why
this is the case and working with it, then developing your own stuff in
the same manner ( and testing it to death before releasing! ) is what
it's all about. So is deciding whether it's better to keep a stable
platform or to patch it to the hilt, weighing the risks again.

That, and the use of the command line and scripting ( because you've got
to have a record of what went wrong in order to put it right ), become
far more important than on the desktop.
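
As a trivial sketch of what I mean by keeping a record ( the log location
is just a suggestion, adjust to taste ):

    #!/bin/sh
    # Wrap an admin session in script(1) so there's a transcript to look
    # back over when something breaks.
    LOG=/var/log/admin-sessions/$(date +%Y%m%d-%H%M%S)-$USER.log
    mkdir -p "$(dirname "$LOG")"
    exec script -a "$LOG"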

Personally, I stand by my recommendations. You'll learn plenty of Linux
just by throwing X away (:

Oh, and most importantly, you've got to become a BOFH like me (:

Cheers,


Steve

*This is a very loose definition of the word, where package releases are
considered "certified". Often the source release from the author is
better, but then you've got the extra headache of completely testing it
yourself. At least if Debian/RH release a package, it's been pretty
thoroughly tested. I know it's the wrong word, but I couldn't think of
the right one (:


-- 
Steve Holdoway <[email protected]>
http://www.greengecko.co.nz
MSN: [email protected]
GPG Fingerprint = B337 828D 03E1 4F11 CB90  853C C8AB AF04 EF68 52E0
