Linux-Advocacy Digest #596, Volume #31 Sat, 20 Jan 01 00:13:05 EST
Contents:
Re: Linux is crude and inconsistant ("Kyle Jacobs")
Re: Linux is crude and inconsistant ("Kyle Jacobs")
Re: New Microsoft Ad :-) (T. Max Devlin)
Re: Windows curses fast computers ("Erik Funkenbusch")
Re: Linux is crude and inconsistant ("Kyle Jacobs")
Re: Linux is crude and inconsistant ("Kyle Jacobs")
Re: Linux is crude and inconsistant. (Cliff Wagner)
Re: NT is Most Vulnerable Server Software (T. Max Devlin)
Re: Dumping Novell for Linux (almost).. ("Les Mikesell")
Re: New Microsoft Ad :-) ("Erik Funkenbusch")
Re: NTFS Limitations (Was: RE: Red hat becoming illegal?) (T. Max Devlin)
Re: "The Linux Desktop", by T. Max Devlin (T. Max Devlin)
Re: New Microsoft Ad :-) ("JS PL")
Re: "The Linux Desktop", by T. Max Devlin (Charlie Ebert)
Re: New Microsoft Ad :-) ("Aaron R. Kulkis")
----------------------------------------------------------------------------
From: "Kyle Jacobs" <[EMAIL PROTECTED]>
Crossposted-To: alt.linux.sux
Subject: Re: Linux is crude and inconsistant
Date: Sat, 20 Jan 2001 04:01:10 GMT
<[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
> >"Kyle Jacobs" <[EMAIL PROTECTED]> wrote in message
> >> If you have to reinstall Windows NT or 2000 more than a few times when
> >> "something" goes wrong, you are an incompetent administrator.
> >
> >If he had to reinstall 2000 he is an incompetent administrator.
> >
> >Yes, if you have a problem, and don't want to learn how to fix it,
> >reinstalling (over what you already have) is the most hassle-free option
> >you have. There are other, better ways to do this.
>
> This only overlooks the fact that Windows, even as a server product,
> is marketed towards the people that don't want to put any effort
> into debugging the system.
Windows is a horrible server product, except for its clone of ISO/SMB (err,
Windows file sharing), which Windows has an exclusive hold on. I wouldn't
use Windows (probably not even 2000) for a mission-critical, pure data
service when UNIX systems have been providing years of stability, in a
situation where UNIX has proven itself a dedicated, entrenched dinosaur,
albeit a functional one.
> If Win2k requires the end user (or even admin) to become some sort
> of guru, then it has failed as a product and you might as well just
> run Unix or VMS.
Windows NT may require more knowledge about the computer, but that's really
only because it ignored technological revolutions that made computing
simpler. 2000 is excellent for a desktop, but Me supports more "home
centric" hardware, and functions with less user intervention.
And if I had to compare the quite complex Windows NT with Linux, and had to
have someone ELSE choose between them, they would choose NT in a
heartbeat. Why? Because as a desktop platform, Windows has Linux beat
hands, feet, and anything else down.
------------------------------
From: "Kyle Jacobs" <[EMAIL PROTECTED]>
Crossposted-To: alt.linux.sux
Subject: Re: Linux is crude and inconsistant
Date: Sat, 20 Jan 2001 04:02:49 GMT
You're one of these people who thought that Windows NT was a capable server
platform, weren't you?
You know better now.
I knew better then.
I looked, I saw, I thought "it would work better on a workstation." I was
right.
"Kevin Ford" <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
> I think you mean incompetent system architect for choosing NT.
>
> Kyle Jacobs once wrote:
> >If you have to reinstall Windows NT or 2000 more than a few times when
> >"something" goes wrong, you are an incompetent administrator.
> >
> >
> >
> >"Charlie Ebert" <[EMAIL PROTECTED]> wrote in message
> >news:[EMAIL PROTECTED]...
> >> In article <9481dp$8c0$[EMAIL PROTECTED]>,
> >> Lewis Miller wrote:
> >> >>
> >> >>We're talking about workstations here. I wouldn't trust IIS on
> >> >>ANYTHING that even closely classified as "enterprise". Or do you have
> >> >>another definition of "workstation"?
> >> >
> >> >Right, but we're talking about Linux. Linux is a server OS. Through
> >> >and through. So that's why we keep coming back to this point.
> >> >Workstations be damned. You have a problem with a workstation, you
> >> >don't even try to fix it. You grab the image file off the server and
> >> >reimage the machine. Bam, it's just like new. That's how to fix a
> >> >Windows workstation.
> >> >
> >> >
> >>
> >> The point made here for the CLUELESS is that Windows spent the
> >> time to make their REINSTALL effortless. WHY you ASK?
> >>
> >> Because you HAVE to re-install Windows MANY times over the life
> >> of your MACHINE. That's because Windows is a piece of shit!
> >>
> >> Linux is always, install it once, use it, upgrade it when
> >> upgrades are available, but you never have to re-install it
> >> unless you've just lost your hardware.
> >>
> >> Or your mind.
> >>
> >> Charlie
> >>
> >
> >
>
>
> --
>
> ---
>
------------------------------
From: T. Max Devlin <[EMAIL PROTECTED]>
Crossposted-To: alt.destroy.microsoft,comp.os.ms-windows.nt.advocacy
Subject: Re: New Microsoft Ad :-)
Reply-To: [EMAIL PROTECTED]
Date: Sat, 20 Jan 2001 04:06:30 GMT
Said T. Max Devlin in alt.destroy.microsoft on Sat, 20 Jan 2001 00:08:09
GMT;
>Said Edward Rosten in alt.destroy.microsoft on Fri, 19 Jan 2001 22:53:08
>>J Sloan wrote:
>>>
>>> JS PL wrote:
>>>
>>> > Easily. I just built a system last week. And it played an mp3 perfectly
>>> > while simultaneously copying 600mb worth of other mp3's from the cd drive to
>>> > a folder AND installing office 2000 from the other cd drive. Didn't skip a
>>> > beat. It was probably "accessing" the internet too, I forget.
>>>
>>> Sure, and I'll bet it cured your cancer too...
>>>
>>> Meanwhile, back in the real world, my friend just mentioned
>>> that he clicked on the icq button the other day and windows
>>> 2000 spontaneously rebooted.
>>
>>Tsk. That's obviously the fault of the mouse drivers. It's not Win2K's
>>fault that it can't supply decent drivers.
>
>Why not? Get your head out of your ass! ('Tsk', indeed.) Whether it
>was the mouse driver that *failed* or not, you haven't any way
>whatsoever of knowing that it was the mouse driver's "fault". And
>considering how much more often, and much less explicably, this "bad
>driver" crap happens on Windows, I'd say it makes perfect sense to blame
>Windows, indeed, Microsoft's lousy monopoly crapware design, for the
>fact that a mouse driver glitch, regardless of whose "fault" it is, can
>cause a spontaneous reboot.
>
>>Besides if it was Linux you
>>would have spent 8 months just getting your keyboard to work, never mind
>>the mouse.
>
>Well, that's a fabrication, if not an outright lie, and the fact that
>you can't see why W2K spontaneously rebooting *NO MATTER WHAT* is W2K's
>fault, and nobody else's, indicates you don't really know what you're
>talking about, at all. Frankly, it's a ludicrous suggestion, and
>provides evidence you don't know what an "operating system" is, or what
>it does.
>
> [...]
Oops. A more complete reading of the thread indicates I went a little
overboard and missed some rather important points in the matter. My
apologies.
--
T. Max Devlin
*** The best way to convince another is
to state your case moderately and
accurately. - Benjamin Franklin ***
------------------------------
From: "Erik Funkenbusch" <[EMAIL PROTECTED]>
Subject: Re: Windows curses fast computers
Date: Fri, 19 Jan 2001 22:14:00 -0600
"mlw" <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
> Erik Funkenbusch wrote:
> >
> > <[EMAIL PROTECTED]> wrote in message
> > news:94agkc$amc$[EMAIL PROTECTED]...
> > > > Let me ask you a question. How long is Windows supposed to wait?
> > > > Suppose IBM introduces a new drive with a 10GB buffer in it. It
> > > > takes 10 minutes to flush the buffer to disk. How long is Windows
> > > > supposed to wait before shutting down? The drive provides no way
> > > > for the OS to know when the buffer is fully flushed, so what is
> > > > the OS supposed to do?
> > >
> > > Windows is supposed to wait long enough for the buffer to be safely
> > > written to the disk. And yes, the drive can tell you if that has
> > > happened.
> >
> > Fine, then please provide the ATA spec reference that shows how the
> > drive does this. I can't find it in the spec. Since the spec isn't
> > available publicly, you can just give me a reference and I'll look it
> > up in mine. The spec is called NCITS 340-2000 and is available from
> > ANSI for $18.
>
> If they fixed it once on NT4, why didn't they make sure that procedure
> was in ME?
>
> Why? because MS does not know how to make decent software. Hence a 38 day
> MTTF for NT 4.0.
They didn't "fix" it. They just delayed the shutdown some extra time. This
too will fail eventually when some other disk comes out with a larger cache.
------------------------------
From: "Kyle Jacobs" <[EMAIL PROTECTED]>
Crossposted-To: alt.linux.sux
Subject: Re: Linux is crude and inconsistant
Date: Sat, 20 Jan 2001 04:03:55 GMT
"Aaron R. Kulkis" <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
> > I think you mean incompetent system architect for choosing NT.
>
>
> Incompetent NT LoseDOS admins and incompetent system architects who
> choose LoseDOS products....go hand in hand.
Of course, then there are idiots like you who would choose a Linux
workstation platform even when you knew what would suffer.
UNIX on the desktop isn't pretty. If it were, Microsoft wouldn't be in
business.
------------------------------
From: "Kyle Jacobs" <[EMAIL PROTECTED]>
Crossposted-To: alt.linux.sux
Subject: Re: Linux is crude and inconsistant
Date: Sat, 20 Jan 2001 04:08:13 GMT
"Aaron R. Kulkis" <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
> > Right, but we're talking about Linux. Linux is a server OS. Through and
> > through. So that's why we keep coming back to this point. Workstations
> > be damned. You have a problem with a workstation, you don't even try to
> > fix it. You grab the image file off the server and reimage the machine.
> > Bam, it's just like new. That's how to fix a Windows workstation.
>
> That's only because...with LoseDOS, you have no alternative.
>
> Conversely, with UNIX desktop machines, the machine is almost
> always kept up and running ( > 98% of typical problems).
Oh please. Your specialty environment is perfect for UNIX because your
environment requires engineering software. UNIX workstations don't work in
80% of the REST OF THE WORLD. I don't even want to think about UNIX in a
workstation capacity in a corporate situation. I can just see NOTHING
getting done. No words processed, no print jobs printed, because no one can
figure out how ass-backward UNIX GUIs are.
The rest of the world depends on Windows NT & 2000 for their workstation
environments. The GOOD productivity is for Windows. The GOOD data services
are for UNIX. Some of us understand how that works, then there is you.
------------------------------
From: [EMAIL PROTECTED] (Cliff Wagner)
Crossposted-To: alt.linux.sux
Subject: Re: Linux is crude and inconsistant.
Date: 20 Jan 2001 04:11:28 GMT
Reply-To: [EMAIL PROTECTED]
On Sat, 20 Jan 2001 00:26:02 -0000, [EMAIL PROTECTED] typed something like:
>On Thu, 18 Jan 2001 07:53:50 GMT, T. Max Devlin <[EMAIL PROTECTED]> wrote:
>>Said Charlie Ebert in comp.os.linux.advocacy on Wed, 17 Jan 2001
>> [...]
>>>I'm not a RedHat fan. I don't think RedHat is worse than Windows.
>>>But RedHat and Debian are at extreme opposite ends of the spectrum sir.
>>
>>How would you characterize the difference, Charlie?
>
> Conservative versus bleeding edge. Debian seems to be much
> more conservative about what it includes. This includes
> licensing philosophy. They also tend to package the older,
> more stable version of a component. They also seem to
> concentrate first on getting particular core functionality
> (like packaging) right before going after flash and market
> appeal.
>
> They're kind of like Slackware in that they are relatively
> not market driven, but with more of a usability focus (like
> package management).
>
>--
>
> Ease of use should be associated with things like "human engineering"
> and "use the right tool for the right job". And of course,
> "reliability", since stopping to fix a problem or starting over due
> to lost work are the very antithesis of "ease of use".
>
> Bobby Bryant - COLA
> |||
> / | \
But by the same token, you are also ALWAYS working with the latest
versions and software. Apt-get is such a godsend. I configured
my debian box to only look in the UNSTABLE hierarchy tree.
In a year of doing this, it bit me in the ass exactly once by
getting an unstable configuration (an upgrade to libc that
caused sendmail to have a conniption fit for some reason).
That's what I love about debian. For a server, it's one of
the most rock-solid stable distributions (especially if you
crontab an apt-get update to make sure you always have
the up-to-date STABLE patches), while also giving you the ability
to run more bleeding edge (which is nearly as stable
in my experience).
Hats off to debian (and in my opinion Stormix for a
good debian-based distro).
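[The crontab'd apt-get refresh described above can be sketched as a root
crontab entry. This is purely illustrative: the 3 a.m. schedule, the log
path, and the unattended `-y` upgrade are assumptions, not the poster's
actual setup.]

```shell
# Illustrative root crontab entry (edit with `crontab -e`):
# refresh package lists and apply current stable updates nightly at 03:00.
# `-y` answers prompts so the job can run unattended; output is logged.
0 3 * * * apt-get update && apt-get -y upgrade >> /var/log/apt-nightly.log 2>&1
```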
--
Cliff Wagner ([EMAIL PROTECTED])
Visit The Edge Zone: http://www.edge-zone.net
"Man will Occasionally stumble over the truth, but most
of the time he will pick himself up and continue on."
-- Winston Churchill
------------------------------
From: T. Max Devlin <[EMAIL PROTECTED]>
Crossposted-To: alt.destroy.microsoft
Subject: Re: NT is Most Vulnerable Server Software
Reply-To: [EMAIL PROTECTED]
Date: Sat, 20 Jan 2001 04:14:17 GMT
Said [EMAIL PROTECTED] () in alt.destroy.microsoft on Sat, 20 Jan
[...]
>>To play the Microsoft apologist for a moment, the fact is that a
>>'typical user' is not going to know right away when they've been hacked.
>>This would be a *major* nightmare for RedHat, and people *would*
>>potentially turn away from not just RedHat, but Linux as a whole, if
>>they get burned this way.
>
> This is why Redhat needs to be rightfully flamed whenever
> one has the opportunity. Linux is quite capable of running
> on different subnets concurrently,even on one physical
> network. Certain services simply should not be exposed to
> routable subnets.
I'm going to jump out with one of my "you don't quite understand how
this network thing works" rants, I'm afraid. A service is either
exposed or it is not; there is a port listener or daemon or there is
not. There is no such thing as an 'unroutable' subnet, therefore
there's no way (save potentially removing all advantages of modern
networking and henceforth requiring programmers to again twiddle bits on
the wire) to differentiate between whether a port is exposed on a
routable subnet or not.
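[The claim above, that a service is exposed exactly when something is
listening on a port regardless of subnet, can be illustrated with a minimal
sketch; the loopback address and OS-chosen ephemeral port are arbitrary.]

```python
import socket

# A service is "exposed" when a process is listening on a port; nothing
# about the subnet it sits on changes that. Bind a listener on loopback
# and let the OS pick a free port.
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("127.0.0.1", 0))
srv.listen(1)
host, port = srv.getsockname()

# While the listener exists, any client that can route a packet to the
# address can connect; once it is closed, the port is no longer exposed.
cli = socket.create_connection((host, port))
conn, peer = srv.accept()
conn.close()
cli.close()
srv.close()
```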
Having a firewall configured and running by default might very well be
far more problematic, believe it or not, certainly for Redhat, if not
their users. Still, this seems like a reason to bitch at Redhat users,
not Redhat. But, like you, I wonder if Mandrake was also vulnerable.
--
T. Max Devlin
*** The best way to convince another is
to state your case moderately and
accurately. - Benjamin Franklin ***
------------------------------
From: "Les Mikesell" <[EMAIL PROTECTED]>
Subject: Re: Dumping Novell for Linux (almost)..
Date: Sat, 20 Jan 2001 04:15:18 GMT
"Joel Barnett" <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
> Thanks indeed for the post. Your smb.conf file looks like you are using
> Samba as a domain controller. I was under the impression that Samba can't
be
> a domain controller for my Windows 2000 clients.
>
Yet another reason to avoid upgrading to win2000.
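[For context, the domain-logon portion of a Samba configuration of that era
looked roughly like this. This is a sketch only: the workgroup name and
netlogon path are made up, and it shows the NT4-style PDC setup under
discussion, which Windows 2000 clients reportedly could not reliably join
at the time.]

```
[global]
   workgroup = EXAMPLEDOM      ; hypothetical NT-style domain name
   security = user
   domain logons = yes         ; offer NT4-style domain logon service
   domain master = yes
   local master = yes
   preferred master = yes
   os level = 65               ; outrank NT boxes in browser elections

[netlogon]                     ; share required for domain logons
   path = /home/samba/netlogon
   writable = no
```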
Les Mikesell
[EMAIL PROTECTED]
------------------------------
From: "Erik Funkenbusch" <[EMAIL PROTECTED]>
Crossposted-To: alt.destroy.microsoft,comp.os.ms-windows.nt.advocacy
Subject: Re: New Microsoft Ad :-)
Date: Fri, 19 Jan 2001 22:27:51 -0600
"T. Max Devlin" <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
> Said Erik Funkenbusch in alt.destroy.microsoft on Fri, 19 Jan 2001
> >"T. Max Devlin" <[EMAIL PROTECTED]> wrote in message
> [...]
> >> Actually, and not surprisingly, you're mistaken, Erik. Although your
> >> confusion is understandable, a "mean time to failure" metric is not a
> >> simple "average" of times systems were up. It is the projected average
> >> time before *any* system, statistically, *will* fail. It is *possible*
> >> a system can be up longer. It is *probable* it will fail earlier,
given
> >> anything but idealistic circumstances.
> >
> >In a true MTBF statistic, yes. That's not how the study worked though.
It
> >simply took the number of hours monitored and divided by the number of
> >failures.
>
> I don't know where you got that idea.
From the study, which states specifically how they calculated MTTF.
> That's not what a 'mean time to failure' is.
That's how they calculated it.
> And it is indeed MTTF that they "computed".
No, it's not.
> I was
> questioning myself the validity of the metric for software, but that
> doesn't mean it isn't actually MTTF. You don't think they actually had
> W2K systems up for longer than 72 forty hour weeks without a crash, did
> you? Again, you illustrate your lack of awareness of how such
> statistics work. No, this wasn't an average; you just imagined that.
From the report at:
http://www.nstl.com/downloads/Win2000Reliability.pdf
"MTTF is calculated as the average of session times between unplanned
reboots. In other words, the mean time to failure Tf is given by (graphic
depicting Tf = T/f) where T is the duration that the operating system
was running and f is the number of failures or unplanned reboots."
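[That definition is just total running time divided by the failure count,
which a few lines make concrete; the session lengths below are invented for
illustration.]

```python
def mttf(session_hours):
    """NSTL-style MTTF: total uptime T divided by the number of
    failures f, where each session ends in an unplanned reboot."""
    total_time = sum(session_hours)   # T: hours the OS was running
    failures = len(session_hours)     # f: unplanned reboots observed
    return total_time / failures

# Three hypothetical sessions between unplanned reboots (hours):
sessions = [100.0, 250.0, 50.0]
print(mttf(sessions))   # 400 hours of uptime over 3 failures
```

[Note that this is an arithmetic average of observed session times, not a
statistical projection of time-to-failure, which is exactly the distinction
the two posters are arguing over.]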
------------------------------
From: T. Max Devlin <[EMAIL PROTECTED]>
Crossposted-To:
alt.destroy.microsoft,comp.os.ms-windows.advocacy,comp.os.ms-windows.nt.advocacy
Subject: Re: NTFS Limitations (Was: RE: Red hat becoming illegal?)
Reply-To: [EMAIL PROTECTED]
Date: Sat, 20 Jan 2001 04:25:52 GMT
Said [EMAIL PROTECTED] () in alt.destroy.microsoft on Sat, 20 Jan
>On Fri, 19 Jan 2001 15:25:22 GMT, T. Max Devlin <[EMAIL PROTECTED]> wrote:
>>Said Ayende Rahien in alt.destroy.microsoft on Fri, 19 Jan 2001 06:58:01
[...]
>>>I'm not sure exactly *what* you can put into a file to get into that size.
>>
>>Precisely what they said about the 2 Gigabyte limit. ;-)
>
> Databases.
A 'database' is not, by definition or even by convention, a single file.
> Then again, databases grew to that size long before there
> were file systems to handle such file sizes. Good software
> adapts to some degree to route around other 'faults' in the
> system.
Databases started out larger than a single file. The contrary idea
didn't even occur to anyone, I would wager, until the advent of PC
desktop applications.
>>And they were really sure *they* were right, too. ;-)
>[deletia]
>
> The real question is how much trouble is it to "route around"
> such limitations. Considering the successes of databases in
> this regard as well as mp3 players and DVD consoles, I don't
> think this issue is such a tragedy.
>
> Compared to some of Microsoft's past mistakes, a 2G limitation
> in an ext2 file is downright trivial.
The real issue is how trivially correctable it is. There are already
three alternative file systems, including both a near-term and long-term
development with "official" support and a good deal of success. In
contrast, the half dozen times that I've run into such "storage
out-limits OS" situations with DOS and Windows (all the way back to 'no
drive bigger than 40Meg', IIRC, and most recently itself needing
specific versions of something to break 2Gig *for a single drive* <and
here we're talking for a single file>) have always been simply waiting
around for the serial upgrade path of the monopoly crapware.
--
T. Max Devlin
*** The best way to convince another is
to state your case moderately and
accurately. - Benjamin Franklin ***
------------------------------
From: T. Max Devlin <[EMAIL PROTECTED]>
Crossposted-To: alt.destroy.microsoft
Subject: Re: "The Linux Desktop", by T. Max Devlin
Reply-To: [EMAIL PROTECTED]
Date: Sat, 20 Jan 2001 04:29:04 GMT
Said Gary Hallock in alt.destroy.microsoft on Fri, 19 Jan 2001 19:40:46
>In article <FP4a6.61645$[EMAIL PROTECTED]>, "J J Sloan"
><[EMAIL PROTECTED]> wrote:
>>
>> gcc in Red Hat 7 works just fine, thank you.
>>
>> Yes, there was some silly uproar from the Red Hat bashers, and I'm still
>> not sure I understand what it was all supposed to be about - gcc 2.96,
>> while not yet 3.0, is a solid compiler, especially the c++ stuff - and
>> was needed for some enterprise customers. I have been compiling the 2.4
>> kernel with gcc-2.96 on several boxes, and it's been completely solid.
>
>The problem as I understand it is that the name mangling for C++ in 2.96
>is different than previous versions and also different than what will be
>in 3.0. So everything should work fine until you upgrade. Then a
>recompile may be needed. But I have heard many people say that the whole
>thing was blown way out of proportion. I'll probably keep 6.2 up on my
>production systems until 7.1 comes out anyway. I have tried 7.0 and had
>no problems with it.
Thanks, Gary. I still hope to avoid having to use gcc or any other
compiler, but I appreciate the conversation.
--
T. Max Devlin
*** The best way to convince another is
to state your case moderately and
accurately. - Benjamin Franklin ***
------------------------------
From: "JS PL" <jim@wauseon_com>
Crossposted-To: alt.destroy.microsoft,comp.os.ms-windows.nt.advocacy
Subject: Re: New Microsoft Ad :-)
Date: Fri, 19 Jan 2001 23:38:06 -0500
"Chris Ahlstrom" <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
> JS PL wrote:
> >
> > That story kind of reminds me about how my mp3 player in Linux plays
> > exactly 1 mp3 per system boot. I try to make it a good choice since I
> > get to only play one until I reboot though.
>
> You sure fucked up your configuration then.
> Or you're absolutely lying. What a wienie.
What do you want me to do, film it happening?
I didn't do anything to the configuration. It's the default install. It
plays an MP3 ONCE. Among other things. Sometimes it won't run ANY programs.
Sometimes it runs some of them. It just does whatever it wants to do, I
guess. I click an icon and say to myself "I wonder if this will run today?"
Funny thing is, Windows 2000 on the exact same hardware runs perfectly all
the time. Go figure...
------------------------------
From: [EMAIL PROTECTED] (Charlie Ebert)
Crossposted-To: alt.destroy.microsoft
Subject: Re: "The Linux Desktop", by T. Max Devlin
Reply-To: Charlie Ebert:<[EMAIL PROTECTED]>
Date: Sat, 20 Jan 2001 04:55:04 GMT
In article <FP4a6.61645$[EMAIL PROTECTED]>,
J J Sloan wrote:
>In comp.os.linux.advocacy Ayende Rahien <Please@don't.spam> wrote:
>
>> "T. Max Devlin" <[EMAIL PROTECTED]> wrote in message
>> news:[EMAIL PROTECTED]...
>>> Well, here we go.
>>>
>>> I've got the "Linux Desktop" on order, from a company listed on
>>> linux.org. Its an 850MHz Athlon with 128 Meg of ram and a 40G ATA 100
>>> drive. CD-writer, printer, Logitech wheel mouse, PCI modem and a cheap
>>> Ethernet card; 19 inch monitor. RedHat 7.0, and I paid the extra bucks
>>> for the Deluxe box.
>
>Excellent -
>
>> RH 7.0 ?
>
>What's this, Linux advice from a wintroll?
>
>> On general, you should stay away from RH, and especially from .0 releases.
>
>Red Hat is by far the most popular distro, for many reasons.
>
>> RH tend to put all sorts of bleeding edge stuff in those things, stuff that
>> will make you bleed.
>
>I find RH 7.0 to be quite solid, and other reviewers agree.
>
>> Most notable example is gcc in RH 7, I remember that there was some problem
>> with 5.0, can't recall if there was something of the like in 6.0
>
>gcc in Red Hat 7 works just fine, thank you.
>
>Yes, there was some silly uproar from the Red Hat bashers,
>and I'm still not sure I understand what it was all supposed
>to be about - gcc 2.96, while not yet 3.0, is a solid compiler,
>especially the c++ stuff - and was needed for some enterprise
>customers. I have been compiling the 2.4 kernel with gcc-2.96
>on several boxes, and it's been completely solid.
It wasn't a silly uproar. Linus said RedHat 7.0 was useless as
a development station, and he was right. You can't compile many
applications with the beta release version of GCC they provided.
2.96 is not a release version; it will compile a kernel, but damn little
else.
FYI
>
>> Be sure to have a LILO boot disk around, you'll need it to reinstall LILO
>> (or your boot manager of choice) on the MBR after you install Windows.
>
>Unlike Linux, windows simply wipes out the boot record for whatever
>OSes might be installed - a rather typical brain dead microsoft
>move - so that particular bit of advice is not inaccurate.
>
>jjs
>
------------------------------
From: "Aaron R. Kulkis" <[EMAIL PROTECTED]>
Crossposted-To: alt.destroy.microsoft,comp.os.ms-windows.nt.advocacy
Subject: Re: New Microsoft Ad :-)
Date: Fri, 19 Jan 2001 23:54:15 -0500
[EMAIL PROTECTED] wrote:
>
> In article <gK2a6.927$[EMAIL PROTECTED]>,
> "Erik Funkenbusch" <[EMAIL PROTECTED]> wrote:
> > <[EMAIL PROTECTED]> wrote in message
> > news:949quf$ljt$[EMAIL PROTECTED]...
> > > In article <kvl96.136$[EMAIL PROTECTED]>,
> > > "Erik Funkenbusch" <[EMAIL PROTECTED]> wrote:
> > > > The test covers desktop environments, not servers. The average
> > > > desktop *IS* shutdown at night.
> > >
> > > This is an artifact of the historical unreliability of MS operating
> > > systems. Unix/Linux workstations are never shutdown at night.
> >
> > Tell that to your average "save the world" do-gooder that insists on
> > turning everything off to save the ecology. So-called "green PC's"
> > were invented to help shut these people up.
>
> And if your workstation is doing nothing, go ahead. Many
> scientist/engineers have big jobs running overnight on any available
> CPU. Depends on your environment I guess. Stability for multi-day
> computational runs was why I switched to Linux. That and a better
> development environment.
>
> >
> > > > > Well, there you have it, plain and simple. A study, funded by
> > > > > Microsoft, that proves that while 2K is better than NT, it still
> > > > > sucks.
> > > >
> > > > The way they count failure is "unplanned reboot". Also note that
> > > > they used beta versions of 2000 for the study (they also used the
> > > > released version, but betas were also used).
> > >
> > > NO
> > > And I repeat NO NO NO
> > > They were not counting "unplanned reboot"; they were counting
> > > "abnormal shutdown". Read the study (which is woefully short on
> > > details). So if the whole system has gone to hell (barely responsive,
> > > short on resources, etc.) and you reboot "voluntarily" before it
> > > completely freezes/bsods on you, this counts as a "normal shutdown"
> > > and doesn't count against the reliability numbers.
> >
> > And you're still ignoring the fact that they used *BETA* versions of
> > the OS. Several beta versions, some of which were known to be unstable.
>
> I don't care what version of the OS was used. The methodology of the
> study is fundamentally flawed in two ways:
>
> 1. Power outages count as abnormal shutdowns to MS detriment
> 2. "Voluntary" shutdowns to avert imminent crashes are counted as
> normal shutdowns to MS benefit.
>
> In my experience, item 2 amounts to a significant fraction of the total
> reboots even if you are shutting down every night. The impact of item
> 1 is probably much smaller. In either case, it's a big enough problem
> to seriously question the usefulness of this study. I consider it at
> best incompetent, and at worst, dishonest that they did not publish the
> total number of reboots for each OS over the study period.
I believe the word is "devious"
>
> >
> >
>
> Sent via Deja.com
> http://www.deja.com/
--
Aaron R. Kulkis
Unix Systems Engineer
DNRC Minister of all I survey
ICQ # 3056642
H: "Having found not one single carbon monoxide leak on the entire
premises, it is my belief, and Willard concurs, that the reason
you folks feel listless and disoriented is simply because
you are lazy, stupid people"
I: Loren Petrich's 2-week stubborn refusal to respond to the
challenge to describe even one philosophical difference
between himself and the communists demonstrates that, in fact,
Loren Petrich is a COMMUNIST ***hole
J: Other knee_jerk reactionaries: billh, david casey, redc1c4,
The retarded sisters: Raunchy (rauni) and Anencephielle (Enielle),
also known as old hags who've hit the wall....
A: The wise man is mocked by fools.
B: Jet Silverman plays the fool and spews out nonsense as a
method of sidetracking discussions which are headed in a
direction that she doesn't like.
C: Jet Silverman claims to have killfiled me.
D: Jet Silverman now follows me from newgroup to newsgroup
...despite (C) above.
E: Jet is not worthy of the time to compose a response until
her behavior improves.
F: Unit_4's "Kook hunt" reminds me of "Jimmy Baker's" harangues against
adultery while concurrently committing adultery with Tammy Hahn.
G: Knackos...you're a retard.
------------------------------
** FOR YOUR REFERENCE **
The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:
Internet: [EMAIL PROTECTED]
You can send mail to the entire list by posting to comp.os.linux.advocacy.
Linux may be obtained via one of these FTP sites:
ftp.funet.fi pub/Linux
tsx-11.mit.edu pub/linux
sunsite.unc.edu pub/Linux
End of Linux-Advocacy Digest
******************************