On Jan 1, 2004, at 8:13 AM, Eric S. Johansson wrote:

Actually, we mean burned literally. The stamp creation process raises the temperature of the CPU. Most systems are not built for a full-tilt computational load: they do not have the ventilation necessary for reliable operation. So they may get by for the first 8-12 hours of stamp generation (i.e., roughly 2000-3000 stamps per machine), but machine reliability will degrade after that as the heat builds up. Feel free to run this experiment yourself: take a cheap machine from your local chop shop, run hashcash in an infinite loop, and wait for the smoke detector to go off.
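For concreteness, the kind of CPU-bound loop being described looks roughly like the Python sketch below. The stamp format and the 20-bit difficulty are illustrative assumptions, not the actual hashcash spec; the point is simply that minting keeps a core pegged at 100% for as long as you let it run.

    # Illustrative hashcash-style minting loop: search for a stamp whose
    # SHA-1 digest begins with a run of zero bits.  Stamp format and
    # difficulty here are assumptions for illustration only.
    import hashlib
    from itertools import count

    def mint_stamp(resource, bits=20):
        prefix = "0" * (bits // 4)            # compare hex nibbles for simplicity
        for counter in count():
            stamp = "%s:%d" % (resource, counter)
            if hashlib.sha1(stamp.encode()).hexdigest().startswith(prefix):
                return stamp

    while True:                               # "run hashcash in an infinite loop"
        print(mint_stamp("recipient@example.com"))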


There is nothing quite like waking up to the smell of freshly roasted Intel.



I'm skeptical of this claim. A lot of Intel, AMD, and similar machines are running full tilt, 24/7: Beowulf-type clusters, the Macintosh G5 cluster that is now rated third fastest in the world, and so on. None of these machines is reported to be literally burning up. Likewise, a lot of home and corporate users run background tasks that sit at 100% CPU utilization.

(Examples abound, from render farms to financial modeling. Friends of mine run a bunch of 2 and 3 GHz Pentium 4 machines on CPU-bound apps, and they run them 24/7. Their company, Invest by Agents, analyzes tens of thousands of stocks; they use ordinary Dells and have had no catastrophic "burned literally" failures.)

Further, junction-to-case temperature in a ceramic package has a time constant of tens of seconds, meaning the case temperature reaches something like 98% of its equilibrium value (as dissipation ramps to, say, 60 watts, or whatever) within tens of seconds. (This is for basic materials-and-physics reasons: I used to make many of these measurements when I was at Intel, and nothing in recent packaging has changed the physics of heat flow much.)
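That "98% in tens of seconds" figure falls straight out of a first-order exponential. Here is a small sketch, with an assumed time constant of 10 seconds (an illustrative number, not a measured one):

    import math

    def fraction_of_equilibrium(t_seconds, tau_seconds=10.0):
        # Fraction of the steady-state temperature rise reached after
        # t seconds, for a single-time-constant (first-order) thermal model.
        return 1.0 - math.exp(-t_seconds / tau_seconds)

    # 98% of equilibrium takes about 3.9 time constants:
    # 1 - exp(-3.9) ~= 0.98, i.e. roughly 40 seconds if tau is ~10 s.
    print(fraction_of_equilibrium(39.0))      # ~0.98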

We also used to run CPUs at 125 C ambient, under operating conditions, for weeks at a time. Here the junction temperature was upwards of 185 C. Failures occurred in various ways, usually due to electromigration and the like. Almost never was there any kind of "fire," just "burnout," which is a generic term and of course has nothing to do with "burning" in the chemical sense.
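The steady-state part is just the usual thermal-resistance relation, Tj = Ta + P * theta_JA. The 30 W and 2 C/W values below are illustrative assumptions picked to reproduce a roughly 60 C rise, not measurements of any particular part:

    def junction_temp_c(ambient_c, power_w, theta_ja_c_per_w):
        # Steady-state junction temperature: ambient plus power times
        # junction-to-ambient thermal resistance.
        return ambient_c + power_w * theta_ja_c_per_w

    print(junction_temp_c(125.0, 30.0, 2.0))  # 185.0 C, matching the figure above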

Now, I grant you that I haven't tested CPUs this way in many years. But I am skeptical that recent CPUs are substantially different from past CPUs, and I would like to see some actual reports of "burned literally" CPUs.

By the way, I have run some CPU-bound apps on my 1 GHz Macintosh. No burn-ups.

I'd like to see some support for the claim that running a stamp creation process is more likely to burn up a modern machine than all of these other workloads: financial modeling, render farms, and supercomputer clusters.

Until then, render me skeptical.

--Tim May
