Re: Skeptical about claim that stamp creation burns out modern CPUs

2004-01-04 Thread Riad S. Wahby
Tim May [EMAIL PROTECTED] wrote:
 Now I grant you that I haven't tested CPUs in this way in many years. 
 But I am skeptical that recent CPUs are substantially different than 
 past CPUs. I would like to see some actual reports of burned 
 literally CPUs.

I've never seen a "burned literally" CPU, but I have tracked the
demise of an AMD K6 (or K6-2, can't remember now) from hot carrier
effects.  If all processors were made like that one, you would see a
lot more load-induced failures.

-- 
Riad Wahby
[EMAIL PROTECTED]
MIT VI-2 M.Eng



Re: Skeptical about claim that stamp creation burns out modern CPUs

2004-01-04 Thread James A. Donald
--
On 1 Jan 2004 at 10:44, Tim May wrote:
 Further, junction-to-case temperature in a ceramic package
 has a time constant of tens of seconds, meaning, the case
 temperature reaches something like 98% of its equilibrium
 value (as wattage reaches, say, 60 watts, or whatever), in
 tens of seconds.

The time constant for the CPU plus cooling system is a good
deal longer, and in modern CPUs the large mass of the cooling
system can result in quite long periods, for example a quarter
of an hour, before CPU load results in heat-related shutoff.
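The arithmetic behind these two time scales can be sketched with a first-order model per thermal mass; the time constants below are illustrative guesses chosen to match the figures in the thread, not measurements:

```python
import math

# Fraction of the equilibrium temperature rise reached after time t,
# for a thermal mass with time constant tau (first-order model).
def frac_of_equilibrium(t, tau):
    return 1 - math.exp(-t / tau)

tau_die = 10.0    # seconds: junction-to-case, per the "tens of seconds" figure
tau_sink = 600.0  # seconds: a massive heatsink (illustrative guess)

# The die is essentially at equilibrium within a minute, while the
# heatsink is still warming a quarter of an hour later.
die_after_minute = frac_of_equilibrium(60, tau_die)     # ~0.998
sink_after_15min = frac_of_equilibrium(900, tau_sink)   # ~0.78
```

This is why a heavy cooler can delay a thermal shutoff by many minutes even though the silicon itself heats in seconds.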

 We also used to run CPUs at 125 C ambient

Today's CPUs will generally fail a bit above seventy
centigrade.  They frequently fail in ways that cause them to
draw increased current, eventually incinerating the
motherboard.

To prevent this, always look for the BIOS option to shut down
the motherboard in the event of CPU overheating. 

--digsig
 James A. Donald
 6YeGpsZR+nOTh/cGwvITnSR3TdzclVpR0+pr3YYQdkG
 Uw0lUnQOu8bBc6kOrcDpYZKS0DjzIgrXM9AJSVh2
 49rBlWsHg9Teys0ELS5pT26g56P8tEMtp/mQ3eihl



Skeptical about claim that stamp creation burns out modern CPUs

2004-01-04 Thread Tim May
On Jan 1, 2004, at 8:13 AM, Eric S. Johansson wrote:
actually, we mean burned, literally.  the stamp creation process raises 
the temperature of the CPU.  Most systems are not built for full-tilt 
computational load.  They do not have the ventilation necessary for 
reliable operation.  So, they may get by with the first 8-12 hours of 
stamp generation (i.e. roughly 2000-3000 stamps per machine), but the 
machine reliability after that time will degrade as the heat builds 
up.  Feel free to run this experiment yourself.  Take a cheap machine 
from your local chop shop, run hashcash in an infinite loop, and wait 
for the smoke detector to go off.

there is nothing quite like waking up to the smell of freshly roasted 
Intel.
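(For concreteness: the "infinite loop" here is a brute-force search for a counter whose hash has enough leading zero bits, which is what pegs the CPU at 100%. A minimal hashcash-style sketch; the stamp field layout is illustrative rather than the exact hashcash v1 format:)

```python
import hashlib
from itertools import count

def mint_stamp(resource, bits=20):
    """Brute-force search: find a counter such that the SHA-1 digest of
    the stamp string has `bits` leading zero bits.  Expected work is
    about 2**bits hashes per stamp."""
    for counter in count():
        stamp = "1:%d:040101:%s::abcdef:%x" % (bits, resource, counter)
        digest = hashlib.sha1(stamp.encode()).digest()
        # Top `bits` bits of the 160-bit digest must be zero.
        if int.from_bytes(digest, "big") >> (160 - bits) == 0:
            return stamp

# bits=12 for a quick demo; 20 bits means roughly a million hashes per stamp.
print(mint_stamp("user@example.com", bits=12))
```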



I'm skeptical of this claim. A lot of Intel and AMD and similar 
machines are running full-tilt, 24/7. To wit, Beowulf-type clusters, 
the Macintosh G5 cluster that is now rated third fastest in the world, 
and so on. None of these machines is reported to be "burning up 
literally." Likewise, a lot of home and corporate users are running 
background tasks which are at 100% CPU utilization.

(Examples abound, from render farms to financial modeling to... Friends 
of mine run a bunch of 2 and 3 GHz Pentium 4 machines in CPU-bound 
apps, and they run them 24/7. Their company, Invest by Agents, 
analyzes tens of thousands of stocks. They use ordinary Dells and have 
had no catastrophic "burned literally" failures.)

Further, junction-to-case temperature in a ceramic package has a time 
constant of tens of seconds, meaning, the case temperature reaches 
something like 98% of its equilibrium value (as wattage reaches, say, 
60 watts, or whatever), in tens of seconds. (For basic material and 
physics reasons...I used to make many of these measurements when I was 
at Intel, and nothing in the recent packaging has changed the physics 
of heat flow much.)
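(The "98%" figure is just the step response of a first-order thermal system: after four time constants the temperature rise reaches 1 - e^-4, about 98% of its equilibrium value. A one-line check:)

```python
import math

# First-order step response: fraction of the equilibrium rise reached
# after n time constants is 1 - e**(-n).  Four time constants gives ~98%,
# so a tens-of-seconds time constant means near-equilibrium in under a minute.
frac_at_4_tau = 1 - math.exp(-4)
print(round(frac_at_4_tau, 4))  # 0.9817
```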

We also used to run CPUs at 125 C ambient, under operating conditions, 
for weeks at a time. Here the junction temperature was upwards of 185 
C. Failures occurred in various ways, usually due to electromigration 
and things like that. Almost never was there any kind of fire. Just 
burnout, which is a generic name but has nothing of course to do with 
burning in the chemical sense.

Now I grant you that I haven't tested CPUs in this way in many years. 
But I am skeptical that recent CPUs are substantially different than 
past CPUs. I would like to see some actual reports of "burned 
literally" CPUs.

By the way, I have run some apps on my Macintosh 1 GHz CPU which are 
CPU-bound. No burn ups.

I'd like to see some support for the claim that running a stamp 
creation process is more likely to burn up a modern machine than all of 
these apps running financial modeling, render farms, and supercomputer 
clusters are doing.

Until then, render me skeptical.

--Tim May



Re: Skeptical about claim that stamp creation burns out modern CPUs

2004-01-04 Thread Tim May
On Jan 1, 2004, at 2:35 PM, Eric S. Johansson wrote:

 Tim May wrote:

  I'm skeptical of this claim. A lot of Intel and AMD and similar 
  machines are running full-tilt, 24/7. To wit, Beowulf-type 
  clusters, the Macintosh G5 cluster that is now rated third fastest in 
  the world, and so on. None of these machines is reported to be 
  "burning up literally." Likewise, a lot of home and corporate users 
  are running background tasks which are at 100% CPU utilization.

 I will admit to a degree of skepticism myself even though I am 
 describing overheating as a likely outcome.

But what is your actual evidence, as opposed to your belief that 
overheating is a likely outcome? I have said that I know of many 
machines (tens of thousands of CPUs, and probably many more I don't 
know about directly) which are running CPU-bound applications 24/7. I 
have heard of no "burning up literally" cases with the many Beowulf 
clusters, supercomputers, and 24/7 home or business screensavers and 
crunching apps, so I suspect they are not common.

If you have actual evidence, as opposed to "likely outcome" 
speculations, please present the evidence.

 First, if you lose a fan on an Intel CPU of at least Pentium III 
 generation or an AMD equivalent, you will lose your CPU to thermal 
 overload.  This is a well-known and well-documented problem.  One 
 question is whether stamp work can thermally overload and damage a 
 CPU.  A second question is how much stamp work you can do without 
 thermally overloading the CPU.

This is true whether one is running Office or a stamp program. You are 
just repeating a general point about losing a fan, not about stamp 
generation per se. Boxer fan lifetimes are usually about comparable to 
hard drive lifetimes, which also kill a particular machine. You are not 
presenting anything new here, and the association with stamp generation 
is nonexistent.

 Large clusters have more careful thermal engineering applied to them 
 than probably most of the zombies out there.  I have seen one Beowulf 
 cluster constructed out of standard 1U chassis, motherboards, fans, 
 etc., and frequently 10 percent of the systems are down at any one 
 time.  The vast majority of the failures have been due to thermal 
 problems.

Most clusters use exactly the same air-cooled machines as are available 
from Dell, Sun, Apple, etc. In fact, the blades and rackmount systems 
are precisely those available from Dell, Sun, Apple, etc.

You are presenting no evidence, just hypothesizing that your stamp 
protocol somehow burns out more CPUs than render farms do, than 
Mersenne prime apps do, than financial simulations do, etc. Yet you 
present no actual numbers.

 so, will we see a Pentium IV spontaneously ignite like a third-tier 
 heavy-metal group in a Rhode Island nightclub?  No, you're right, we 
 won't.  I think it's safe to say we will see increasing unreliability, 
 power supply failures, and failures of microelectronics due to 
 increased thermal load.  Which is good enough for my purposes.

Evidence is desirable; belief is just belief.



--Tim May
That government is best which governs not at all. --Henry David 
Thoreau



Re: Skeptical about claim that stamp creation burns out modern CPUs

2004-01-04 Thread Tim May
On Jan 1, 2004, at 11:56 AM, Riad S. Wahby wrote:

 Tim May [EMAIL PROTECTED] wrote:

  Now I grant you that I haven't tested CPUs in this way in many years.
  But I am skeptical that recent CPUs are substantially different than
  past CPUs. I would like to see some actual reports of "burned
  literally" CPUs.

 I've never seen a "burned literally" CPU, but I have tracked the
 demise of an AMD K6 (or K6-2, can't remember now) from hot carrier
 effects.  If all processors were made like that one, you would see a
 lot more load-induced failures.

Just so. A lot of games are close to being CPU-bound, plus the 
screensavers used as Mersenne prime finders and the like, and there are 
few reports of house fires caused by the CPU being smoked.

When I did reliability stuff for Intel, CPUs failed, but mostly not in 
ways that had them catching on fire, as the stamp guy is suggesting is 
common for stamp generation.

--Tim May







