On Saturday 01 February 2003 05:06 pm, Sundance wrote:
> > If you get a seg fault while compiling qt or libkde because of bad
> > memory or a cpu overheat, I think you would get it when compiling
> > anything of any size.
>
> This is an often made and legitimate assumption, but it almost always
> ends up getting proven wrong.
>
> See this thread:
> http://marc.theaimsgroup.com/?l=gentoo-user&m=104334984122667&w=2
Not to be pedantic, but this thread doesn't really _prove_ anything. It simply
demonstrates the only reliable thing about hardware sig11 problems: randomness.
The only other correlation is with heat, and even that isn't always true. I had
sig11 problems compiling GCC/glibc on an LFS box, and eventually had to buy a
huge fan to make it compile. No C++ involved.

C++ compilation is not "more intensive": it simply requires less work. The
processor does (roughly) the same amount of work in each (say) millisecond; it
just does this work for fewer milliseconds. Now, what might be (and almost
certainly is) true is that the compiler uses a different algorithm, resulting
in different patterns of memory access, shifts, arithmetic and interrupts. The
C++ compiler's pattern apparently annoys the processor more.

-- 
Bruce J.A. Nourish <[EMAIL PROTECTED]>
-- 
[EMAIL PROTECTED] mailing list
