I've seen this behaviour, too. I once tried to move a large number of
mp3 files from one physical drive to another with rsync, and the machine
locked up, destroyed the reiserfs file systems on both drives, and I
lost a bunch of files. That's the only time I've had a near-catastrophic
failure in ten years of running Linux.
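
For what it's worth, the slowdown John describes below (RAM filling up
with file data during a big copy, and staying sluggish afterwards) looks
like the page cache at work. On a kernel and glibc new enough to have
posix_fadvise(2), one workaround is to tell the kernel to drop the
copied data as you go. A rough sketch in C follows; the fadvise_copy
name, the 1 MiB buffer, and the bare-bones error handling are my own
choices, not anything from this thread.

/* fadvise_copy.c - copy SRC to DST while hinting the kernel to drop
 * the copied data from the page cache, so a large copy does not push
 * everything else out of RAM.  Sketch only; build with:
 *   cc -o fadvise_copy fadvise_copy.c
 */
#define _XOPEN_SOURCE 600
#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>

int main(int argc, char **argv)
{
    char buf[1 << 20];          /* copy in 1 MiB chunks */
    ssize_t n;
    off_t done = 0;
    int in, out;

    if (argc != 3) {
        fprintf(stderr, "usage: %s SRC DST\n", argv[0]);
        return 1;
    }

    in = open(argv[1], O_RDONLY);
    out = open(argv[2], O_WRONLY | O_CREAT | O_TRUNC, 0644);
    if (in < 0 || out < 0) {
        perror("open");
        return 1;
    }

    while ((n = read(in, buf, sizeof buf)) > 0) {
        if (write(out, buf, n) != n) {     /* short writes not retried */
            perror("write");
            return 1;
        }
        done += n;
        /* Flush the output so its pages are clean, then ask the kernel
         * to forget everything copied so far, on both files. */
        fdatasync(out);
        posix_fadvise(in, 0, done, POSIX_FADV_DONTNEED);
        posix_fadvise(out, 0, done, POSIX_FADV_DONTNEED);
    }

    close(in);
    close(out);
    return 0;
}

The fdatasync() before the DONTNEED hint matters: the kernel won't drop
dirty pages that haven't been written out yet, so without it the hint on
the destination file is mostly a no-op.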


On Tue, 2003-07-29 at 13:02, John Summerfield wrote:
> On Tue, 29 Jul 2003, Tom Duerbusch wrote:
>
> > My take on multiple images is twofold.
> >
> > But first, the disclaimer:
> > This assumes you have sufficient resources in the first place to do
> > this (normally real memory).
> >
> > 1.  I don't know this to be true with Linux, but the Unix types have
> > always been leery of having multiple applications running on the same
> > box.  First, they say that they can't guarantee performance, then they
> > start talking about an application corrupting the memory of another
> > application.  So, one application per box if you want reliability.  I
> > haven't had the experience of memory problems in Linux yet, so I still
> > tend to believe this.
>
> Linux doesn't handle memory very well. My Athlon has 512 Mbytes of RAM,
> and most of the time it works really well. However, I sometimes copy
> large files (600 Mbytes or more), either from disk to disk or across
> the LAN. When that happens, RAM gets filled with the file data and
> performance is really bad for a while, even after the copying is
> over.
>
>
>
> >
> > 2.  Once an application is running, and running well, it should
> > continue to run correctly until something external happens, like putting
> > on maintenance.  So, why put on maintenance, other than security
> > patches?  A new application may need a different gcc library or such.
> > The original application, if not fully tested with the new changes, may
> > fail in production.
>
> Third-party software aside, this tends not to happen with Linux. At
> least with commercial distros, people are paid to fix things without
> causing such problems.
>
> When you do full distro upgrades, you upgrade everything and go through
> your QA routine.
>
> As we've seen in the past few hours, there can be problems with
> third-party software requiring specific versions of, potentially old,
> libraries. In most cases, compatibility libraries are included to let
> such software keep working.
>
> If you go the "I'll just create this little symlink and see if it
> works" route, then you really are on your own. If it breaks, guess who
> did it? The good news is the pieces are all yours.
>
> Sometimes it's happened in RHL that you needed compatibility libraries
> from a prior release.
>
>
> If vendor certifications are important to you, then life becomes more
> difficult.
>
>
> >
> > At least VM makes it a whole lot easier to define, maintain and control
> > multiple machines.
>
> Even then, I can imagine that over time the number of "standard
> configurations" will grow.
>
> I discovered recently that people are still using Red Hat Linux 6.2.
> Their applications all work on it, it's a good, stable release, and it
> works on their hardware.
>
> Unfortunately, Red Hat's not shipping fixes for it any more, and it is
> in need of fixes. Still, if it's properly shielded it's probably no
> worse than MSWare.
>
> Come to think of it, I installed RHL 6.2 myself a week or so ago.
>
> I'm sure people here will be in that position wrt SLES7 or RHL 7.x in
> time: the cost of disrupting it will be too much to contemplate.
>
>
>
> --
>
>
> Cheers
> John.
>
> Join the "Linux Support by Small Businesses" list at
> http://mail.computerdatasafe.com.au/mailman/listinfo/lssb
> Copyright John Summerfield. Reproduction prohibited.
--
-------------------------------------------------------------
Michael Martin
[EMAIL PROTECTED]
(713) 918-2631
------------------------------------------------------------
