--- Tom McCabe <[EMAIL PROTECTED]> wrote:
> You cannot get large amounts of computing power simply by hooking up a
> hundred thousand PCs for problems that are not easily parallelized, because
> you very quickly run into bandwidth limitations even with gigabit Ethernet.
> Parts of the brain are constantly communicating with one another; I would be
> very surprised if you could split up the brain effectively enough to be able
> to both run one tiny piece on a PC and have the PCs communicate effectively
> in realtime.
>
>  - Tom
It is not that hard, really. Each of the 10^5 PCs simulates about 10 mm^3 of
brain tissue. Axon diameter varies but is typically 1-2 microns, and a 10 mm^3
block of tissue has a surface area of a few tens of mm^2, so each block has at
most on the order of 10^7 axons crossing its boundary, each carrying about 10
bits per second of information, or roughly 100 Mb/s of I/O per PC. This was
barely within Google's network capacity in 2000, and is probably well within
it now.
http://en.wikipedia.org/wiki/Google_platform
(A back-of-envelope check of these numbers is sketched at the end of this
message.)

I think individuals and small groups trying to build AGI will have a hard time
competing with Google due to the cost of hardware. It costs $2 million/month
just for electricity for their server farms. Google is building a
supercomputer in Oregon that will have cooling towers four stories high.
http://en.wikipedia.org/wiki/Project_02

> --- Matt Mahoney <[EMAIL PROTECTED]> wrote:
>
> > --- Eugen Leitl <[EMAIL PROTECTED]> wrote:
> > >
> > > > Google already have enough computing power to do a crude simulation
> > > > of a [human brain]
> > >
> > > Um, no. It takes 64 kNodes of Blue Gene/L to do about eight 1/10th-speed
> > > crudely-simulated mice, or about one realtime cartoon mouse (assuming
> > > the code would scale, which it wouldn't).
> >
> > The Blue Gene/L simulation is at a lower level than is needed to do
> > useful AI. You don't need millisecond resolution. In most neural models,
> > the important signal is the rate of firing, not the individual pulses. I
> > realize there are exceptions, such as the transmission of phase
> > information for stereoscopic sound perception up to 1500 Hz. But for most
> > purposes, neurons have an information rate of about 10 bits per second.
> > This was measured in tactile sensation in the finger. (Sorry, I don't
> > have the references.) In any case, 0.1 seconds is about the smallest
> > perceptible time unit in humans.
> >
> > The human brain has about 10^11 neurons with 10^4 synapses each. Each
> > synapse represents about 1 bit of memory (Hopfield model). Therefore you
> > need 10^15 bits of memory and 10^16 operations per second.
> >
> > Google has about 10^5 PCs with 2-4 GB of memory each, connected by
> > high-speed Ethernet. Therefore they have enough memory. They have a mix
> > of different processors, but a modern processor can execute 10^10 to
> > 10^11 16-bit multiply-add instructions per second using SIMD (SSE2)
> > instructions (more if you add a GPU). Therefore they have enough
> > computing power, or at least close enough to do useful experiments.
> >
> > -- Matt Mahoney, [EMAIL PROTECTED]

-- Matt Mahoney, [EMAIL PROTECTED]
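P.S. Here is a rough back-of-envelope check of the arithmetic, in Python. The
constants are just the order-of-magnitude estimates from this thread (10^5
PCs, about 10 mm^3 of tissue per PC, 1-2 micron axons, 10 bits/s per neuron,
10^11 neurons with 10^4 synapses each, 2-4 GB and 10^10-10^11 multiply-adds
per second per PC), not measurements, so treat the output as a sanity check
and nothing more.

# Order-of-magnitude sanity check for the estimates in this thread.
# Every constant below is an assumption taken from the discussion above.

# --- Inter-node bandwidth ---
pcs           = 1e5          # assumed size of the cluster
tissue_per_pc = 10.0         # mm^3 of brain tissue simulated per PC
side_mm       = tissue_per_pc ** (1.0 / 3)    # edge of a cubic block, mm
surface_um2   = 6 * side_mm ** 2 * 1e6        # block surface, square microns
axon_area_um2 = 2.0                           # cross-section of a 1-2 micron axon
axons         = surface_um2 / axon_area_um2   # axons crossing the block boundary
bits_per_axon = 10                            # ~10 bits/s per neuron/axon
io_mbps       = axons * bits_per_axon / 1e6

print("axons crossing each block: %.1e" % axons)
print("I/O per PC:                %.0f Mb/s" % io_mbps)

# --- Whole-brain memory and compute ---
neurons     = 1e11
synapses    = 1e4                    # per neuron
mem_bits    = neurons * synapses     # 1 bit per synapse (Hopfield model)
ops_per_sec = mem_bits * 10          # each synapse touched ~10 times per second

ram_bits_per_pc = 3e9 * 8            # 2-4 GB per PC, in bits
ops_per_pc      = 1e11               # upper end of the SIMD estimate

print("memory: need %.0e bits, have %.0e" % (mem_bits, pcs * ram_bits_per_pc))
print("ops/s:  need %.0e, have %.0e" % (ops_per_sec, pcs * ops_per_pc))

On these assumptions each PC needs on the order of 100 Mb/s of I/O, and the
cluster as a whole has roughly the 10^15 bits and 10^16 operations per second
the estimate calls for, which is the point of the message above.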
