On Sun, 07 Oct 2007 23:56:11 -0400 "Thomas A. McGonagle" <[EMAIL PROTECTED]> wrote:

Hi Thomas:
Essentially, our collective means of evaluating the processing power of CPUs needs to become more precise as we move from single-core to multicore systems. Whether such a determination can even be made depends on what processing is actually happening versus what we believe is happening; this is not as straightforward as it was with single-core systems.

Here are some figures to flesh the problem out. The Folding@Home project is a reasonably intensive test for any system. It takes the PS3 about 8 hours to process one work unit for the project; note the early (March 2007) results here:

http://www.imminst.org/forum/index.php?act=ST&f=217&t=15169&s=

However, if we consider what Stanford is reporting currently (Oct 2007), the collective PS3 contribution is approaching 1000 Teraflops. The availability of such processing power is unprecedented: the project hopes that if 50,000 PS3s participate, the combined processing power can approach the Petaflop scale. Other participating systems are not even close to the PS3's per-machine rates. Here are the current figures:

http://fah-web.stanford.edu/cgi-bin/main.py?qtype=osstats

We are in a really fascinating time, in the sense that the capabilities of the various multicore systems are so new that there are as yet no commonly familiar, well-understood measurements for them. In the meantime, the numbers Stanford reports are stunning in that one can clearly see the force of all the various systems pitching in and doing their bit. My perspective is that, with the enormous presence of Intel and other systems out there, I would have expected the scale of processing to be quite different. However, Stanford's figures are compelling in that the admittedly tiny -- minuscule -- number of PS3s contributing to the project is doing an enormous amount of work.
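To put Stanford's hoped-for numbers in perspective, here's a quick back-of-the-envelope sketch. The 50,000-machine count and the petaflop target are the figures quoted above; the per-PS3 sustained rate is just the average those two numbers imply, not anything Stanford has measured:

```python
# Back-of-the-envelope arithmetic for the Folding@Home figures above.
# 1 Petaflop = 1000 Teraflops = 1,000,000 Gigaflops.
PETAFLOP_IN_GFLOPS = 1_000_000

target_machines = 50_000            # PS3s Stanford hopes will participate
target_gflops = PETAFLOP_IN_GFLOPS  # aggregate goal: ~1 PFlop

# The implied average sustained rate per machine:
implied_per_ps3 = target_gflops / target_machines
print(f"Implied sustained rate per PS3: {implied_per_ps3:.0f} GFlops")
```

That works out to roughly 20 GFlops sustained per console, which is a useful reminder that sustained, real-workload throughput sits well below the peak numbers quoted in the press.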
Of course, programming the PS3 to do something as intense for one's own interests is not a trivial weekend pursuit ... however, the potential that is there to be used for personal research, learning, and professional-level contributions is unprecedented.

> Hello All,
>
> Earlier this year Wired Magazine published a one-pager stating the
> computational power of various computers. If I remember correctly (I
> can't find the one-pager on wired.com), they said that:
>
> the PS3 could operate at 1.5 Teraflops
> the XBox360 could operate at 1 Teraflop
> 430 Pentium 4 computers could operate at 1 Teraflop.
>
> Since reading the "article" I have repeated this to anyone who would
> listen.
>
> Upon visiting the Terrasoft website, I read that 1 TFlop can
> theoretically be reached by an 8 PS3 cluster.
>
> Would it be correct to say that one PS3 is .125 (1T / 8 PS3s) TFlops?
>
> Also does anyone know what was wrong with that "article", or does
> anyone have it handy?
>
> Thanks a lot.
> -Tom

===============
"If I were not a physicist, I would probably be a musician. I often think in music. I live my daydreams in music. I see my life in terms of music. ... I get most joy in life out of music."

"What Life Means to Einstein: An Interview by George Sylvester Viereck," for the October 26, 1929 issue of The Saturday Evening Post.
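P.S. To answer Tom's arithmetic question directly: yes, 1 TFlop across an 8-PS3 cluster implies 0.125 TFlops per machine. The apparent conflict with Wired's 1.5 TFlops figure is, I'd suggest, the usual gap between a theoretical peak and a sustained cluster figure (that interpretation is my assumption, not something either source states). A minimal sketch of the arithmetic:

```python
# Reconciling the two figures from Tom's question (numbers from the post).
wired_peak_tflops = 1.5   # Wired's single-PS3 claim (likely a theoretical peak)
cluster_tflops = 1.0      # Terrasoft: an 8-PS3 cluster can reach ~1 TFlop
cluster_size = 8

# Per-machine rate implied by the cluster figure:
per_ps3_sustained = cluster_tflops / cluster_size
print(per_ps3_sustained)  # 0.125 TFlops, exactly as Tom computes

# Ratio of the peak claim to the sustained figure:
ratio = wired_peak_tflops / per_ps3_sustained
print(ratio)              # the peak claim is 12x the sustained figure
```

Both numbers can be "right" at once: peak figures assume every execution unit is busy every cycle, while sustained figures reflect real workloads with memory and interconnect overheads.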
_______________________________________________
yellowdog-general mailing list
[email protected]
http://lists.terrasoftsolutions.com/mailman/listinfo/yellowdog-general
HINT: to Google archives, try '<keywords> site:terrasoftsolutions.com'
