> I do read what you write, I just feel there should be
> some way to FTP install on a low ram computer. I also
> understand that certain sacrifices occur during the
> install (This is what I would like to learn more
> about).

I'm not really sure how the install works, but from
pieces I've picked up here and there, and a little
consideration of my own, I think I have a little bit of
an idea as to why the install takes so much RAM.  I'll
try and explain what I've come up with so far, and if I'm
way off, I'm sure someone will correct me.


First off, you have the boot disk.  It's 1.4MB, and that
has the kernel, common hardware and other modules, and
gets the whole thing started.  Can't remember if that's
compressed or not, we'll assume so.  We'll double it.

+1.4MB X 2
=2.8MB

Then, you have the second stage installer, which is
downloaded once the boot disk gets up and running.  If
you take a look at your CD-ROM/FTP site, you'll notice
that "mdkinst_stage2.bz2" is 9.6MB, compressed.
Uncompress that and you're looking at a big chunk of
change.  I haven't tried it, but I'll go with 2x
compression for everything from here on.

+9.6MB X 2
=22MB
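
If you want to check that 2x guess instead of assuming
it, a few lines of Python will tell you the real
uncompressed size.  This streams the decompression so the
script itself doesn't eat a pile of RAM (the filename is
just whatever your mirror calls it):

    import bz2

    # Stream-decompress mdkinst_stage2.bz2 and count the output
    # bytes, without ever holding the whole image in memory.
    decomp = bz2.BZ2Decompressor()
    total = 0
    with open("mdkinst_stage2.bz2", "rb") as f:
        while True:
            chunk = f.read(64 * 1024)
            if not chunk:
                break
            total += len(decomp.decompress(chunk))
    print("uncompressed size: %.1f MB" % (total / (1024.0 * 1024.0)))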

Once the second stage starts up, you have the hardware
detection, partitioning, etc.  Then it gets to the
package selection.  Well, first off, you have a couple
files that it needs.  I'm guessing one is the horrible
hdlist.cz file that tops out at an astounding 8.6MB.
Not sure what kind of file this is, compressed or not,
but it's big.

+8.6MB
=30.6MB
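
If you're curious what hdlist.cz actually is, you can
peek at its first few bytes and compare them against the
gzip and bzip2 magic numbers.  Just a quick sketch, the
filename is whatever is sitting on your mirror:

    # gzip files start with \x1f\x8b, bzip2 files with "BZh".
    with open("hdlist.cz", "rb") as f:
        magic = f.read(3)
    if magic[:2] == b"\x1f\x8b":
        print("looks gzip-compressed")
    elif magic == b"BZh":
        print("looks bzip2-compressed")
    else:
        print("not gzip or bzip2; maybe uncompressed, maybe something else")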


Then there's the pkglist.cooker.bz2 file.  Is this
needed?  No clue.  I'll assume it is, otherwise, why
include a 1.4MB file?  It's obviously compressed, so
we'll double it as usual.

+1.4MB X 2
=33.4MB


Then you throw in the fun things, like depslist (130k,
has dependencies I'm guessing?), provides (140k, similar
to deps?), and well, you're sitting close to a pretty
hefty 34MB.
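
To put the whole running total in one place, here's the
back-of-the-envelope math as a little Python sketch.
Every size and the 2x ratio are the guesses from above,
not measured numbers:

    # Rough RAM budget for an FTP install.  Sizes in MB, using the
    # guessed 2x expansion for anything compressed.
    RATIO = 2.0

    budget = [
        ("boot disk, expanded",          1.4 * RATIO),
        ("mdkinst_stage2.bz2, expanded", 9.6 * RATIO),
        ("hdlist.cz, as-is",             8.6),
        ("pkglist.cooker.bz2, expanded", 1.4 * RATIO),
        ("depslist",                     0.13),
        ("provides",                     0.14),
    ]

    total = 0.0
    for name, size in budget:
        total += size
        print("%-30s +%5.2f MB  (running total: %5.1f MB)"
              % (name, size, total))

Run that and the last line lands right around 33.7MB.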


Now, that's just for the files that it uses to perform
the actual installation.  That doesn't include the
memory footprint of the installation process itself,
which I'm sure isn't huge, but can't be too small either.

If my compression guesses are off, those sizes could go
up or down.  I know you can get 4x or 5x compression on a
plain text file, so some of those files might uncompress
to something much bigger, taking up even more space.
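
That ratio is easy enough to measure instead of guessing.
This compresses a plain text file with bzip2 and prints
the ratio; /etc/termcap is just an example of a big text
file, use whatever you have handy:

    import bz2

    # Measure the real bzip2 ratio on some plain text.
    data = open("/etc/termcap", "rb").read()
    packed = bz2.compress(data)
    print("raw: %d bytes, compressed: %d bytes, ratio: %.1fx"
          % (len(data), len(packed), len(data) / float(len(packed))))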


Now, down to the real kicker, and why FTP and HTTP are
worse than NFS, CD-ROM, and local HD installs.


With NFS, CD-ROM, and local HD, you're reading those
files straight off mounted media.  You don't have to copy
them or download them or anything like that.

With FTP and HTTP, you have to download those files and
hold them in memory.  Neither protocol gives you a
mountable filesystem to read from, so the installer has
to create a ramdisk and dump the files there, which of
course chews up memory.
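
To make the contrast concrete, here's a little Python
sketch of the two cases.  The paths and the FTP host are
made up; the point is just where the bytes end up:

    import ftplib
    import io

    # NFS/CD-ROM/HD case: the file sits on a mounted filesystem,
    # so you can read it one small chunk at a time.  Nothing stays
    # in RAM beyond the current chunk.
    with open("/mnt/cdrom/Mandrake/base/hdlist.cz", "rb") as f:
        while f.read(64 * 1024):
            pass  # handle a chunk, then let it go

    # FTP case: no mounted filesystem to read from, so the whole
    # file has to be pulled down first.  With no disk to put it on
    # yet, the only place it can land is RAM (a ramdisk or buffer).
    buf = io.BytesIO()
    ftp = ftplib.FTP("ftp.example.com")   # made-up mirror
    ftp.login()
    ftp.retrbinary("RETR /pub/hdlist.cz", buf.write)
    ftp.quit()
    print("holding %d bytes in RAM" % len(buf.getvalue()))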


Make sense?

Am I far off?  Please, let me know.

This kind of stuff intrigues me.

(Mandrake, if you'd like to use any of this explanation
in a document, feel free, just let me know and stick my
name/e-mail address in there somewhere.  Edit it to your
heart's content, I know I can't be right on about the
whole thing, but if it's somewhat accurate, I'm sure you
can touch up what I didn't get right.  If this is already
explained somewhere, please let me know as well!)


Don Head
SAIR LCA, CIW-P, i-Net+, Network+, A+

Systems Administrator      [ [EMAIL PROTECTED] ]
Web Designer                            [ 1 314 650-4056 ]
[ AIM - Don Wave ] [ ICQ - 18804935 ] [ Yahoo - Don_Wave ]
