On Sat, Dec 29, 2012 at 1:36 PM, David Kerber wrote:
> On 12/29/2012 1:31 PM, dmccunney wrote:
>>> We sell an industrial data collection machine based on XP that runs in
>>> about 80MB of allocated memory. We turn off the server service, themes
>>> and a couple others, along with unneeded devices, and have only tcpip v4
>>> networking enabled. Doing a warm reboot takes about 20 sec IIRC from
>>> the time I click shutdown to the time it's back up taking data again.
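For anyone wanting to try the same pruning on their own XP box, services like that can be shut off from the command line as well as from the Services snap-in. A sketch (the internal service names are assumptions from a default XP install; the Server service is "lanmanserver"):

```
rem Disable the Server service (file/print sharing) and Themes
rem so they stay off across reboots, then stop them now.
sc config lanmanserver start= disabled
sc config Themes start= disabled
net stop lanmanserver
net stop Themes
```

Note the space after "start=" -- sc's parser requires it.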
>> Sweet. I've done a fair bit of optimizing memory usage in 2K and XP
>> by pruning stuff run on startup and closing down unneeded services,
>> but I've never gotten RAM usage that low because I was configuring a
>> general purpose machine, not a dedicated one. (The XP box I'm posting
>> from at the moment takes about 270MB for XP itself from a standing
>> start. I could prune that more if I had to, but it would mean
>> compromises I'd rather not make, and since the box has 1.5GB RAM, I
>> don't have to.)
> Yep, that's about as low as I've gotten a general purpose XP desktop as
> well, ~250MB or so, including an antivirus.
I don't run A/V. I thought about it and realized I didn't need it.
The only things A/V had actually caught in years, back when I *did*
run it, were "false positives".
Viruses are infections, and infections have vectors by which they
enter the host. Ward the vector and block the infection.
The main vector for viruses is email. I use GMail as my primary email
account, and read mail via the web interface. I have no need for a
local copy, so I don't download via POP. My mail, including
attachments, lives on Google's servers and never reaches my machine.
Google has viewers for the majority of file types used as attachments,
so I don't need to download them to see them.
Other downloads are all from known-good sources that scan on their end
(and most are open source as well).
I stopped running A/V on Windows a while back and have had no cause to
reconsider the decision. I don't run A/V because I don't do things
that are likely to infect me with a virus.
>> Along those lines, a chap on the Puppy Linux forums got a working
>> Puppy installation in 16MB RAM. To do so, he had to take out
>> everything that *could* be removed and still have a working bootable
>> Linux image, and he had to actually build the image on a more powerful
>> machine, then transfer the drive to the ancient target system. The
>> end result was a dedicated media server that performed the intended
>> function on a box with 16MB RAM that he had lying around and wanted to
>> put to use.
> Now that's cool. I've never tried puppy linux, but I have heard it's
> good for that kind of application.
I found Puppy because I was given an old Fujitsu Lifebook p2110 by a
friend who had upgraded but loved the old box and wanted it to go to a
good home, not a dumpster.
It has an 867MHz Transmeta Crusoe CPU and a whopping 256MB of RAM, of
which 16MB is grabbed off the top for code morphing by the CPU. It
came to me with XP SP2 installed and was frozen-snail slow, requiring
8 minutes simply to boot. Once up, it did a good imitation of
mainframe "death by thrashing", spending more time swapping than working.
I went looking for a Linux distro suitable for slow, low-RAM hardware,
and Puppy was one of the candidates.
What I wound up doing was swapping in a larger 40GB HD from my SO's
dead laptop, re-partitioning, and setting up a multi-boot system, with
Win2K Pro SP4, Puppy Linux, Ubuntu Linux, and FreeDOS.
Win2K is on a 20GB NTFS slice. (I got its memory requirements down to
about 180MB.) Puppy and Ubuntu are on 8GB ext4 slices, with a shared
512MB swap partition. FreeDOS is in a 2GB FAT32 volume. Puppy and
Ubuntu are configured to mount each other's slices, and I did some
fiddling to have one copy of large apps living on one side or the
other and shared between them. I found an open source driver that
lets me read and write the ext4 slices from Win2K, and the Linux
systems both grok NTFS and can see the Windows slice. All can read
and write the FAT32 partition. FreeDOS can't see anything else, but
I don't care.
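For reference, the cross-mounting boils down to a couple of /etc/fstab entries on each Linux side. A sketch of the Puppy side (the device names and mount points are assumptions for a layout like mine; ntfs-3g provides the NTFS read/write support):

```
# Mount the sibling slices at boot.
# <device>   <mountpoint>  <fstype>  <options>           <dump> <pass>
/dev/sda3    /mnt/ubuntu   ext4      defaults             0      2
/dev/sda1    /mnt/win2k    ntfs-3g   defaults,umask=022   0      0
/dev/sda5    /mnt/freedos  vfat      defaults,umask=000   0      0
```

The Ubuntu side gets the mirror-image entries pointing at the Puppy slice.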
Puppy was a straight install from an ISO. Ubuntu was trickier. I
started using Xubuntu, a version intended for lower end gear, but it
was snail slow. Posters on the Ubuntu forums indicated too much Gnome
had crept into Xubuntu, and that Ubuntu had a steadily increasing idea
of what "low end" was. They recommended what I did, which was to
install from the Minimal ISO, which produces a working CLI
installation, then use apt-get to install only the bits I needed.
That produced an Ubuntu installation that was usable and almost as
quick as Puppy.
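For anyone wanting to do the same, the sequence after booting from the Minimal ISO looks roughly like this (the package choices are my assumptions, not gospel; substitute your own window manager and apps):

```
# The Minimal ISO leaves you at a working CLI. From there:
sudo apt-get update

# A bare X setup instead of the full ubuntu-desktop metapackage;
# --no-install-recommends keeps apt from dragging in extras.
sudo apt-get install --no-install-recommends xorg

# A lightweight window manager and file manager
sudo apt-get install icewm rox-filer

# Then add only the apps you actually need, one at a time
sudo apt-get install firefox
```

The key is --no-install-recommends: without it, pulling in one package can cascade into much of the desktop you were trying to avoid.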
Puppy is fun but quirky. Among other things, it is explicitly single
user, and you *always* run as root. (The code that lets you have
other users was ripped out - you *must* be root.) I started using
Unix before Linux was even a gleam in Linus Torvalds' eye, and the
notion of always being root gave me hives. It gets away with it
because Puppy installations essentially *are* single-user machines,
and the fundamental situation isn't much different from the old
MS-DOS/Windows setups where the user at the keyboard was assumed to be
administrator with all powers. (There are some Linux apps that flatly
refuse to run as root, which Puppy users must simply pass on.)
The biggest limitation I found in Puppy was package management.
Handling of dependencies essentially isn't done, and you may have fun
finding out why an app won't run and what libraries you need to get it
working.
A feature of Puppy is that it's easy to create a custom Puppy version
(called a puplet) customized for a specific purpose, and a number of
such puplets are available.
I actually spend more time on the Ubuntu install than the Puppy one.
It's much closer to standard Linux, and Ubuntu package management is
the best I've seen.
Freedos-user mailing list