RE: [Vo]:doubling speed every 2 years for decades more, Intel silicon photonics now revolutionizing data centers, Michael Kassner: Rich Murray 2015.01.26

2015-01-27 Thread MarkI-ZeroPoint
I think we’re 95% in agreement, and even though they choose not to believe, that 
is OK; at least they don’t lop off people’s heads because of it!  Even a purely 
atheistic society would perish if it didn’t have some basic moral principles.  
Changes take time… perhaps by our next lifetime, Steven!  -mark

 

From: Orionworks - Steven Vincent Johnson [mailto:orionwo...@charter.net] 
Sent: Tuesday, January 27, 2015 4:04 PM
To: vortex-l@eskimo.com
Subject: RE: [Vo]:doubling speed every 2 years for decades more, Intel silicon 
photonics now revolutionizing data centers, Michael Kassner: Rich Murray 
2015.01.26

 

Hi Mark,

 

I suspect no religion can escape having an embarrassing moment or two. That's 
because humans run them all, not any prescribed deity. I didn't mean to single 
out Christianity any more than any other religion. IMHO, if we humans could just 
accept the fact that we all occasionally f#$k up, then perhaps learning how to 
forgive each other for our faults might not be a bad idea. And if we could get 
over the fear that it's the end of civilization if a black man marries a 
white man. I mean, jeez, haven't we gotten over interracial marriage yet?

 

PS: We and the catz are doing fine in the Midwest. Thanks for asking. We dodged 
the east coast blizzard bullet. 

 

Regards,

Steven Vincent Johnson

svjart.orionworks.com

zazzle.com/orionworks



RE: [Vo]:doubling speed every 2 years for decades more, Intel silicon photonics now revolutionizing data centers, Michael Kassner: Rich Murray 2015.01.26

2015-01-27 Thread Orionworks - Steven Vincent Johnson
Hi Mark,

 

I suspect no religion can escape having an embarrassing moment or two. That's 
because humans run them all, not any prescribed deity. I didn't mean to single 
out Christianity any more than any other religion. IMHO, if we humans could just 
accept the fact that we all occasionally f#$k up, then perhaps learning how to 
forgive each other for our faults might not be a bad idea. And if we could get 
over the fear that it's the end of civilization if a black man marries a 
white man. I mean, jeez, haven't we gotten over interracial marriage yet?

 

PS: We and the catz are doing fine in the Midwest. Thanks for asking. We dodged 
the east coast blizzard bullet. 

 

Regards,

Steven Vincent Johnson

svjart.orionworks.com

zazzle.com/orionworks



Re: [Vo]:doubling speed every 2 years for decades more, Intel silicon photonics now revolutionizing data centers, Michael Kassner: Rich Murray 2015.01.26

2015-01-27 Thread John Berry
Programming could surely be improved a great deal, since some languages
were designed to be bad in order to give programmers lots of work.
And the x86 architecture is baroque from an assembly-language perspective.

While I am on the topic, let me go slightly off topic: there is a lot
that I think is rather silly, for instance how AI and robots have been approached.

Mostly these have been conceived of as being created by either some MacGyvered
solution or some mimicry of an insect or an overly basic natural process.

I disagree. For the robot: if you give it all the sensors (cameras, 3D
laser scanners) needed to make a proper 3D model of the world around it, it
does not need tricks to navigate the world, any more than the AI player in
computer games, which from the start of computer games has been very able to
challenge human players on a more level playing field.

And once a computer is able to create a high quality 3D model of the world
around it, having it recognize what things likely are, categorized by type,
becomes pretty simple.

And it can learn the general form and material characteristics of a
plate (it could also access the internet, where plates are sold with images
that match the appearance to the word).

And text to speech is old hat...

So there is very little stopping a barely programmed robot from being
asked to get a plate and working out that the sounds relate to words, that the
form of the sentence appears to be a command to the robot, and that plates seem
to be flat things often eaten off, often white and often made of porcelain
(it could check Wikipedia).
It could likely also learn that plates are often in kitchens,
recognize which room seems likely to be the kitchen, explore, and bring a
medium sized plate if one can be found.

But where is such a robot?

Oh, last I heard they couldn't recognize a key ring!

But a key ring is easy to describe: it is metal, and metal should be detected
by thermal readings and colour/reflection analysis, and it is a circle within
a limited range of sizes, most often with keys attached.
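
(A minimal sketch, in Python, of the attribute matching described above: match a
detected object's shape, material and size against a small dictionary of
prototypes. The attribute names, values and thresholds are invented for
illustration, not taken from any real robot.)

    # Toy categorizer: match measured attributes against simple prototypes.
    PROTOTYPES = {
        "plate":    {"shape": "disc", "material": "porcelain", "diameter_cm": (15, 30)},
        "key ring": {"shape": "ring", "material": "metal",     "diameter_cm": (2, 5)},
    }

    def categorize(detected):
        """Return the best-matching prototype name, or None if nothing fits."""
        best, best_score = None, 0
        for name, proto in PROTOTYPES.items():
            score = 0
            score += detected["shape"] == proto["shape"]
            score += detected["material"] == proto["material"]
            lo, hi = proto["diameter_cm"]
            score += lo <= detected["diameter_cm"] <= hi
            if score > best_score:
                best, best_score = name, score
        return best if best_score >= 2 else None

    # Something round, metallic, about 3 cm across:
    print(categorize({"shape": "ring", "material": "metal", "diameter_cm": 3}))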

So where is the problem?
I am pretty sure such robots simply have not been given the technology needed
to see what is in front of them, to see the world as we do.

I think artificial intelligence is much the same: it has been tried on the
cheap, in the hope that a gimmick will be found.
Note I am not talking about artificial consciousness.

AI that could pass the Turing test is simple.
But I said simple, not easy.

A human being that is only a year old, yet somehow mature, and that has been
force-fed details is not going to act like a normal person.

For a computer to act like a human, it first needs a state of being, and
states of being are based on hunger, boredom (wanting unique or familiar
input), wanting to connect to another, emotions, etc...

And so a computer needs to be told it needs to breathe, and it needs to need
food, and to need security and friendship.
These different modules give it something perhaps a bit dangerous: an
agenda!

It needs to grow up either as a robot in the real world or in a simulated
reality so it has a history, memories, movies it has watched.

Now a robot 'without consciousness' watching a movie might seem odd, but
like us it would need the power to imagine it were the characters, to
experience their pain, joy, adventure and fear.

What Myers Briggs type would the AI have?

The point I am making is that there is not really anything truly complex,
unknown or mysterious about how to create such an AI, but it is a lot of
work to create and 'raise' a complete being.

Without having all these qualities, drives, history and characteristics, and
many more besides, no computer will be convincing as a normal human
intelligence.

One without a language or social problem, anyway; after all, it is very easy to
simulate a real human who has shut a door and refuses to speak or let you
in, but that hardly counts as a valid Turing test.

John


On Wed, Jan 28, 2015 at 11:31 AM, Jed Rothwell 
wrote:

> James Bowery  wrote:
>
>
>> For many day-to-day operations the responsiveness of systems like MS
>> Windows has actually decreased.
>>
>
> That is because they keep adding bells and whistles. It is feature-itis
> run amok. When I installed recent versions of Windows I went through and
> turned off a bunch of features and it went back to working quickly again.
> The features that slow down my computer the most are ones that display all
> kinds of useless clutter on the screen. I think Windows was going slowly
> because of a bottleneck between the CPU and the screen display, rather than
> bad programming *per se*. It may be a problem on my computer in
> particular, because it has an old, 2-port screen display card attached to
> two large screens.
>
>
>
>> There are real advances in software but they're generally buried in the
>> noise.
>>
>
> I think there must have been some astounding advances in software
> recently, judging by the results at places like Google and IBM. I mean, for
> example:
>
> * Self driving cars. Many experts predicted th

Re: [Vo]:doubling speed every 2 years for decades more, Intel silicon photonics now revolutionizing data centers, Michael Kassner: Rich Murray 2015.01.26

2015-01-27 Thread James Bowery
Yes, the advances in neural machine learning are real, but in substance they
have been around for decades and are now emerging, having been submerged in
the noise.  For instance, read Jürgen Schmidhuber's timeline of "deep
learning" going back to 1991.  Although I had the highest performing neural
image processor as of 1990 (3e9 connections per second multisource image
segmentation, Neural Engines Corp of La Jolla, presented at the IJCNN in San
Diego), my important contribution was circa 2005: setting up an AI competition
based on Schmidhuber's colleague Marcus Hutter's notion of universal
artificial intelligence driven by Ockham's Razor (Kolmogorov Complexity).
Hutter's criterion has now been adopted by the "singularity" folks.

Having said that, there are still important advances that are submerged in
the noise -- not the least of which is Relation Arithmetic (which, btw,
subsumes quantum theory, hence quantum information systems) -- and even in
neural systems, there is insufficient attention being paid to Hecht-Nielsen's
Confabulation Theory.

On Tue, Jan 27, 2015 at 4:31 PM, Jed Rothwell  wrote:

> James Bowery  wrote:
>
>
>> For many day-to-day operations the responsiveness of systems like MS
>> Windows has actually decreased.
>>
>
> That is because they keep adding bells and whistles. It is feature-itis
> run amok. When I installed recent versions of Windows I went through and
> turned off a bunch of features and it went back to working quickly again.
> The features that slow down my computer the most are ones that display all
> kinds of useless clutter on the screen. I think Windows was going slowly
> because of a bottleneck between the CPU and the screen display, rather than
> bad programming *per se*. It may be a problem on my computer in
> particular, because it has an old, 2-port screen display card attached to
> two large screens.
>
>
>
>> There are real advances in software but they're generally buried in the
>> noise.
>>
>
> I think there must have been some astounding advances in software
> recently, judging by the results at places like Google and IBM. I mean, for
> example:
>
> * Self driving cars. Many experts predicted this would take another 20
> years, but here they are, and they are reportedly safer than human drivers.
>
> * Google's uncanny ability to recognize faces, which is beginning to
> exceed the human ability.
>
> * Google's ability to translate documents. This is still way behind human
> abilities, but it is far ahead of where the technology was 10 years ago.
>
> * The Watson computer and its superhuman ability to win at Jeopardy and
> diagnose diseases.
>
> They could not have done these things with hardware alone. Nor do I think
> they could do them by brute force methods. Watson and the Google-Plex are
> MPP computers, so however difficult it is to write MPP software, apparently
> they are making progress in doing it.
>
> Google has published papers describing its MPP techniques.
>
> - Jed
>
>


Re: [Vo]:doubling speed every 2 years for decades more, Intel silicon photonics now revolutionizing data centers, Michael Kassner: Rich Murray 2015.01.26

2015-01-27 Thread Jed Rothwell
James Bowery  wrote:


> For many day-to-day operations the responsiveness of systems like MS
> Windows has actually decreased.
>

That is because they keep adding bells and whistles. It is feature-itis run
amok. When I installed recent versions of Windows I went through and turned
off a bunch of features and it went back to working quickly again. The
features that slow down my computer the most are ones that display all
kinds of useless clutter on the screen. I think Windows was going slowly
because of a bottleneck between the CPU and the screen display, rather than
bad programming *per se*. It may be a problem on my computer in particular,
because it has an old, 2-port screen display card attached to two large
screens.



> There are real advances in software but they're generally buried in the
> noise.
>

I think there must have been some astounding advances in software recently,
judging by the results at places like Google and IBM. I mean, for example:

* Self driving cars. Many experts predicted this would take another 20
years, but here they are, and they are reportedly safer than human drivers.

* Google's uncanny ability to recognize faces, which is beginning to exceed
the human ability.

* Google's ability to translate documents. This is still way behind human
abilities, but it is far ahead of where the technology was 10 years ago.

* The Watson computer and its superhuman ability to win at Jeopardy and
diagnose diseases.

They could not have done these things with hardware alone. Nor do I think
they could do them by brute force methods. Watson and the Google-Plex are
MPP computers, so however difficult it is to write MPP software, apparently
they are making progress in doing it.

Google has published papers describing its MPP techniques.

- Jed


Re: [Vo]:doubling speed every 2 years for decades more, Intel silicon photonics now revolutionizing data centers, Michael Kassner: Rich Murray 2015.01.26

2015-01-27 Thread James Bowery
Yeah, well I worked at the facility that developed the first supercomputer
and the software was the pre-Internet social networking system called PLATO
after which I was hired to architect the first mass market electronic
newspaper and, from that position, developed an unpartitioned 64-bit version
of network addressing that was unfortunately crowded out by the partitioned
IP address.  In that position my colleagues included David P. Reed (UDP
inventor) as well as John Backus whose Turing Award Lecture was on the need
for what he called Formal Functional Programming as a means of formalizing
parallel processing.  My effort at that time and since has been a
relational, rather than functional, approach to programming since functions
are degenerate relations -- and that led to my final days as a computer
professional at HP's "Internet Chapter 2" project circa 2000 when I hired
perhaps the only guy in the world with the requisite background with
Principia Mathematica to take Russell's Relation Arithmetic a step closer
to resolving an enormous range of problems with the foundations of computer
science.  That one guy was nearly shut out of being hired to do that work
because he was a US citizen, and therefore did not qualify for H-1b status
-- which was what I was told I had to hire from.

I was such a racist bigot that I told them I would resign if I couldn't
hire this US citizen.

The only thing more nauseating than a Church Lady is a morally vain
"anti-racist" in a world where software technology is regressing so that
Silicon Valley can be turned into a foreign invasion force.

More than 62% of programmers in Silicon Valley are now foreign born, and
they imposed the atavistic software technology known as Java on the
Fortune 500, along with nepotistic hiring.

On Tue, Jan 27, 2015 at 3:28 PM, MarkI-ZeroPoint 
wrote:

> Hi James,
>
> There are several Vorts who make a living doing software, and I as one
> (mostly embedded stuff) agree for the most part with your comments.
> Hardware designers (and I lump together logic / CPU / circuit designers)
> have much better tools (which are software!) than software designers… a
> modern CPU with tens/hundreds of millions of transistors can be
> designed/simulated/validated with excellent accuracy.  Software tools are
> not nearly as advanced, although they are moving in that direction.
> Another is that the pressure of very short development cycles prevents
> software teams from taking considerable time to come up to speed on the
> more sophisticated software design tools, like model-driven development.
>
> -mark
>
>
>
> *From:* James Bowery [mailto:jabow...@gmail.com]
> *Sent:* Tuesday, January 27, 2015 12:42 PM
> *To:* vortex-l
> *Subject:* Re: [Vo]:doubling speed every 2 years for decades more, Intel
> silicon photonics now revolutionizing data centers, Michael Kassner: Rich
> Murray 2015.01.26
>
>
>
> People are conflating advances in hardware with advances in software.
> Software has been stuck in the dark ages for decades and as a result has
> metastasized to fill whatever capacity Moore's Law has provided with
> linear, at best, advance in utility.  For many day-to-day operations the
> responsiveness of systems like MS Windows has actually decreased.
>
>
>
> There are real advances in software but they're generally buried in the
> noise.
>
>
>
> On Tue, Jan 27, 2015 at 2:10 PM, MarkI-ZeroPoint 
> wrote:
>
> Hi Steven,
> Hope you and the catz are staying warm and dry...
>
> I guess my point was more of a general observation... I have long thought
> it interesting that Darwinian theory ala 'survival of the fittest' could be
> applied equally well to a localized population of animals and to something
> as large as an entire human civilization.  Fully agree with your comment
> about ISIS/ISIL, and I will add that probably all religions have had their
> 'embarrassing' eras of fanatical followers, and that by the time that era
> ends, much pain and suffering will have occurred.  The cycle will likely
> continue until the consciousness of the majority of the human population
> gets raised considerably... I think you ought to cut the Christians a bit
> of slack since they were sounding the warning about the fanatical side of
> Islam long ago, and the liberals used every opportunity to label them as
> racists/bigots.
>
> -mark
>
>
> -----Original Message-----
> From: Orionworks - Steven Vincent Johnson [mailto:orionwo...@charter.net]
> Sent: Tuesday, January 27, 2015 10:18 AM
> To: vortex-l@eskimo.com
> Subject: RE: [Vo]:doubling speed every 2 years for decades more, Intel
> silicon photonics now revolutionizing data centers, Michael Kassner: Rich
> Murray 2015.01.26
>
> > Steven:
> > Societies withou

RE: [Vo]:doubling speed every 2 years for decades more, Intel silicon photonics now revolutionizing data centers, Michael Kassner: Rich Murray 2015.01.26

2015-01-27 Thread MarkI-ZeroPoint
Hi James,

There are several Vorts who make a living doing software, and I as one (mostly 
embedded stuff) agree for the most part with your comments.  Hardware designers 
(and I lump together logic / CPU / circuit designers) have much better tools 
(which are software!) than software designers… a modern CPU with tens/hundreds 
of millions of transistors can be designed/simulated/validated with excellent 
accuracy.  Software tools are not nearly as advanced, although they are moving 
in that direction.  Another is that the pressure of very short development 
cycles prevents software teams from taking considerable time to come up to 
speed on the more sophisticated software design tools, like model-driven 
development.

-mark

 

From: James Bowery [mailto:jabow...@gmail.com] 
Sent: Tuesday, January 27, 2015 12:42 PM
To: vortex-l
Subject: Re: [Vo]:doubling speed every 2 years for decades more, Intel silicon 
photonics now revolutionizing data centers, Michael Kassner: Rich Murray 
2015.01.26

 

People are conflating advances in hardware with advances in software.  Software 
has been stuck in the dark ages for decades and as a result has metastasized to 
fill whatever capacity Moore's Law has provided with linear, at best, advance 
in utility.  For many day-to-day operations the responsiveness of systems like 
MS Windows has actually decreased.

 

There are real advances in software but they're generally buried in the noise.

 

On Tue, Jan 27, 2015 at 2:10 PM, MarkI-ZeroPoint  wrote:

Hi Steven,
Hope you and the catz are staying warm and dry...

I guess my point was more of a general observation... I have long thought it 
interesting that Darwinian theory ala 'survival of the fittest' could be 
applied equally well to a localized population of animals and to something as 
large as an entire human civilization.  Fully agree with your comment about 
ISIS/ISIL, and I will add that probably all religions have had their 
'embarrassing' eras of fanatical followers, and that by the time that era ends, 
much pain and suffering will have occurred.  The cycle will likely continue 
until the consciousness of the majority of the human population gets raised 
considerably... I think you ought to cut the Christians a bit of slack since 
they were sounding the warning about the fanatical side of Islam long ago, and 
the liberals used every opportunity to label them as racists/bigots.

-mark


-----Original Message-----
From: Orionworks - Steven Vincent Johnson [mailto:orionwo...@charter.net]
Sent: Tuesday, January 27, 2015 10:18 AM
To: vortex-l@eskimo.com
Subject: RE: [Vo]:doubling speed every 2 years for decades more, Intel silicon 
photonics now revolutionizing data centers, Michael Kassner: Rich Murray 
2015.01.26

> Steven:
> Societies without some form of moral code, a shared sense of right and
> wrong, usually don’t last long…

Hi Mark,

Agreed. But in the meantime, they can do a lot of damage and cause much pain 
and suffering before they implode. The real irony is that most believe they are 
truly following the highest moral code of all. I'm thinking of ISIS as an 
example.

Again, I don't disagree with your point. However, is the point you are making 
in regards to Kurzweil's belief systems or is that my belief system you are 
referring to?

Or something else?

Regards,
Steven Vincent Johnson
svjart.orionworks.com
zazzle.com/orionworks

 



Re: [Vo]:doubling speed every 2 years for decades more, Intel silicon photonics now revolutionizing data centers, Michael Kassner: Rich Murray 2015.01.26

2015-01-27 Thread James Bowery
People are conflating advances in hardware with advances in software.
Software has been stuck in the dark ages for decades and as a result has
metastasized to fill whatever capacity Moore's Law has provided with
linear, at best, advance in utility.  For many day-to-day operations the
responsiveness of systems like MS Windows has actually decreased.

There are real advances in software but they're generally buried in the
noise.

On Tue, Jan 27, 2015 at 2:10 PM, MarkI-ZeroPoint 
wrote:

> Hi Steven,
> Hope you and the catz are staying warm and dry...
>
> I guess my point was more of a general observation... I have long thought
> it interesting that Darwinian theory ala 'survival of the fittest' could be
> applied equally well to a localized population of animals and to something
> as large as an entire human civilization.  Fully agree with your comment
> about ISIS/ISIL, and I will add that probably all religions have had their
> 'embarrassing' eras of fanatical followers, and that by the time that era
> ends, much pain and suffering will have occurred.  The cycle will likely
> continue until the consciousness of the majority of the human population
> gets raised considerably... I think you ought to cut the Christians a bit
> of slack since they were sounding the warning about the fanatical side of
> Islam long ago, and the liberals used every opportunity to label them as
> racists/bigots.
>
> -mark
>
>
> -----Original Message-----
> From: Orionworks - Steven Vincent Johnson [mailto:orionwo...@charter.net]
> Sent: Tuesday, January 27, 2015 10:18 AM
> To: vortex-l@eskimo.com
> Subject: RE: [Vo]:doubling speed every 2 years for decades more, Intel
> silicon photonics now revolutionizing data centers, Michael Kassner: Rich
> Murray 2015.01.26
>
> > Steven:
> > Societies without some form of moral code, a shared sense of right and
> > wrong, usually don’t last long…
>
> Hi Mark,
>
> Agreed. But in the meantime, they can do a lot of damage and cause much
> pain and suffering before they implode. The real irony is that most believe
> they are truly following the highest moral code of all. I'm thinking of
> ISIS as an example.
>
> Again, I don't disagree with your point. However, is the point you are
> making in regards to Kurzweil's belief systems or is that my belief system
> you are referring to?
>
> Or something else?
>
> Regards,
> Steven Vincent Johnson
> svjart.orionworks.com
> zazzle.com/orionworks
>
>


RE: [Vo]:doubling speed every 2 years for decades more, Intel silicon photonics now revolutionizing data centers, Michael Kassner: Rich Murray 2015.01.26

2015-01-27 Thread MarkI-ZeroPoint
Hi Steven,
Hope you and the catz are staying warm and dry...

I guess my point was more of a general observation... I have long thought it 
interesting that Darwinian theory ala 'survival of the fittest' could be 
applied equally well to a localized population of animals and to something as 
large as an entire human civilization.  Fully agree with your comment about 
ISIS/ISIL, and I will add that probably all religions have had their 
'embarrassing' eras of fanatical followers, and that by the time that era ends, 
much pain and suffering will have occurred.  The cycle will likely continue 
until the consciousness of the majority of the human population gets raised 
considerably... I think you ought to cut the Christians a bit of slack since 
they were sounding the warning about the fanatical side of Islam long ago, and 
the liberals used every opportunity to label them as racists/bigots.

-mark 
 

-----Original Message-----
From: Orionworks - Steven Vincent Johnson [mailto:orionwo...@charter.net] 
Sent: Tuesday, January 27, 2015 10:18 AM
To: vortex-l@eskimo.com
Subject: RE: [Vo]:doubling speed every 2 years for decades more, Intel silicon 
photonics now revolutionizing data centers, Michael Kassner: Rich Murray 
2015.01.26

> Steven:
> Societies without some form of moral code, a shared sense of right and 
> wrong, usually don’t last long…

Hi Mark,

Agreed. But in the meantime, they can do a lot of damage and cause much pain 
and suffering before they implode. The real irony is that most believe they are 
truly following the highest moral code of all. I'm thinking of ISIS as an 
example.

Again, I don't disagree with your point. However, is the point you are making 
in regards to Kurzweil's belief systems or is that my belief system you are 
referring to?

Or something else?

Regards,
Steven Vincent Johnson
svjart.orionworks.com
zazzle.com/orionworks



RE: [Vo]:doubling speed every 2 years for decades more, Intel silicon photonics now revolutionizing data centers, Michael Kassner: Rich Murray 2015.01.26

2015-01-27 Thread Orionworks - Steven Vincent Johnson
> Steven:
> Societies without some form of moral code, a shared sense of right and wrong, 
> usually don’t last long…

Hi Mark,

Agreed. But in the meantime, they can do a lot of damage and cause much pain 
and suffering before they implode. The real irony is that most believe they are 
truly following the highest moral code of all. I'm thinking of ISIS as an 
example.

Again, I don't disagree with your point. However, is the point you are making 
in regards to Kurzweil's belief systems or is that my belief system you are 
referring to?

Or something else?

Regards,
Steven Vincent Johnson
svjart.orionworks.com
zazzle.com/orionworks



RE: [Vo]:doubling speed every 2 years for decades more, Intel silicon photonics now revolutionizing data centers, Michael Kassner: Rich Murray 2015.01.26

2015-01-27 Thread MarkI-ZeroPoint
Steven:

Societies without some form of moral code, a shared sense of right and wrong, 
usually don’t last long…

-mark 

 

From: Orionworks - Steven Vincent Johnson [mailto:orionwo...@charter.net] 
Sent: Tuesday, January 27, 2015 8:02 AM
To: vortex-l@eskimo.com
Subject: RE: [Vo]:doubling speed every 2 years for decades more, Intel silicon 
photonics now revolutionizing data centers, Michael Kassner: Rich Murray 
2015.01.26

 

From Lewan,

 

> I believe that a good explanation for doubling speed is provided by Kurzweil’s

> suggestion that he calls The Law of Accelerating Returns

> (http://www.kurzweilai.net/the-law-of-accelerating-returns), basically 

> meaning that whatever is invented/evolved in a system is fed back into the

> system and increases the over-all speed of invention/evolution, leading

> mathematically to exponential growth of speed. 

 

I've read a few  of Kurzweil's books. Last one I read, I believe, was "The Age 
of Spiritual Machines."

 

His books are fun to read. It would seem that Ray's belief system involves an 
eventual "singularity event" which suggests the advancement of computer 
technology and AI will either save us, or transform our species into something 
very different than what we are now. That said, Ray's concept is not all that 
different, IMHO, than those among us who believe that Jesus' pending 2nd coming 
will save us, or that the benevolent Space Brothers from Arcturus (or is that 
Sirius... I can't keep the star systems straight) will either save us or at 
least transform us as a species.

 

Despite all of these pending predictions coming from Ray Kurzweil, or from 
Fundamentalist Christians or from other religious doctrine, or for that matter 
from the Brotherhood of Benevolent Space Brothers, I think it would be wise of 
us to never ever underestimate the collective power of stupidity, ignorance, 
and Luddism.

 

I'm more inclined to speculate that Kurzweil might possibly get some of his 
predictions right, but only if interstellar space travel becomes a practical 
reality. That would allow groups of like-minded humans to migrate to their very 
own habitable planet where they can then set up their own governing rules which 
would give them carte blanche to dabble with their genomes and infuse them with 
all the AI technology they see fit.

 

In the meantime I suspect a very large group of Luddites will stay on Earth and 
maintain the status quo. That's probably a good idea anyway. Who knows. For all 
I know perhaps that is "by design." It's probably a good idea to maintain a 
diverse gene pool such as what we have on our planet, allowing interplanetary 
anthropologists and scientists from god knows what civilization to occasionally 
stop by and sample.

 

Regards,

Steven Vincent Johnson

svjart.orionworks.com

zazzle.com/orionworks



RE: [Vo]:doubling speed every 2 years for decades more, Intel silicon photonics now revolutionizing data centers, Michael Kassner: Rich Murray 2015.01.26

2015-01-27 Thread Orionworks - Steven Vincent Johnson
From Lewan,

 

> I believe that a good explanation for doubling speed is provided by Kurzweil’s

> suggestion that he calls The Law of Accelerating Returns

> (http://www.kurzweilai.net/the-law-of-accelerating-returns), basically 

> meaning that whatever is invented/evolved in a system is fed back into the

> system and increases the over-all speed of invention/evolution, leading

> mathematically to exponential growth of speed. 

 

I've read a few  of Kurzweil's books. Last one I read, I believe, was "The Age 
of Spiritual Machines."

 

His books are fun to read. It would seem that Ray's belief system involves an 
eventual "singularity event" which suggests the advancement of computer 
technology and AI will either save us, or transform our species into something 
very different than what we are now. That said, Ray's concept is not all that 
different, IMHO, than those among us who believe that Jesus' pending 2nd coming 
will save us, or that the benevolent Space Brothers from Arcturus (or is that 
Sirius... I can't keep the star systems straight) will either save us or at 
least transform us as a species.

 

Despite all of these pending predictions coming from Ray Kurzweil, or from 
Fundamentalist Christians or from other religious doctrine, or for that matter 
from the Brotherhood of Benevolent Space Brothers, I think it would be wise of 
us to never ever underestimate the collective power of stupidity, ignorance, 
and Luddism.

 

I'm more inclined to speculate that Kurzweil might possibly get some of his 
predictions right, but only if interstellar space travel becomes a practical 
reality. That would allow groups of like-minded humans to migrate to their very 
own habitable planet where they can then set up their own governing rules which 
would give them carte blanche to dabble with their genomes and infuse them with 
all the AI technology they see fit.

 

In the meantime I suspect a very large group of Luddites will stay on Earth and 
maintain the status quo. That's probably a good idea anyway. Who knows. For all 
I know perhaps that is "by design." It's probably a good idea to maintain a 
diverse gene pool such as what we have on our planet, allowing interplanetary 
anthropologists and scientists from god knows what civilization to occasionally 
stop by and sample.

 

Regards,

Steven Vincent Johnson

svjart.orionworks.com

zazzle.com/orionworks



Re: [Vo]:doubling speed every 2 years for decades more, Intel silicon photonics now revolutionizing data centers, Michael Kassner: Rich Murray 2015.01.26

2015-01-27 Thread Jed Rothwell
John Berry  wrote:

> BTW AFAIK Moore's Law isn't that speeds increase but that transistor
> densities increase.
>

Correct. It was originally a simple statement about two-dimensional
geometry. It happens that speed followed roughly the same curve for a long
time, although in recent years microprocessor speed has not increased.

It is not really a "law." It is just an informal rule of thumb.
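
(A quick back-of-envelope check of the rule of thumb, in Python. The 1971
starting point of roughly 2,300 transistors is the Intel 4004; treating the
rule as a strict two-year doubling is of course only an order-of-magnitude
sanity check.)

    # "Density doubles about every two years" -- rough sanity check.
    transistors_1971 = 2300               # Intel 4004, approximate count
    doublings = (2015 - 1971) / 2.0       # 22 doublings
    projected = transistors_1971 * 2 ** doublings
    print(f"{projected:.1e}")             # ~1e10, the ballpark of the largest 2015-era chips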

This also goes back to a talk given by Feynman in 1959, "There's Plenty of
Room at the Bottom."

http://www.zyvex.com/nanotech/feynman.html

I recall another statement by someone describing vacuum tubes as, "using a
railroad car to bring a pat of butter." Even before the invention of the
transistor, people understood that computing and signal processing could be
done with far less power and with smaller components.

- Jed


Re: [Vo]:doubling speed every 2 years for decades more, Intel silicon photonics now revolutionizing data centers, Michael Kassner: Rich Murray 2015.01.26

2015-01-27 Thread Alain Sepeda
Yes, Moore's law speaks of transistor size, but that is linked to speed,
complexity, RAM size...

There are many different Moore's laws, and when they interact with real-world
dimensions they create big problems.
For example, disk and RAM capacity grow faster than the I/O bandwidth available
to talk to the disk. This requires caches of ever greater efficiency, and
processors that speculate more and more (superscalar/vector), or that pipeline
hundreds of cycles (the MTA architecture), with deeper and deeper memory
hierarchies.
The same goes for the processor, which grows faster than its I/O and its heat
dissipation capacity...
One of the problems, as said here, is the radius of synchronicity...
This is why processors get more and more asynchronous, parallel, and pipelined,
and why programmers have to change their algorithms to use parallelism, even at
the cost of more instructions executed (Monte Carlo methods, diagonalizations,
gradient solvers...).

There is also a Moore's law for the size of silicon wafers.

One limit on exponential laws is the financial requirement, which follows a
similar Moore's law.

Anyway, as Mats explains, there are breakthroughs that allow us to get out of
those limits by changing the rules.

For example, in companies there is an interest in concentrating resources, but
doing so creates problems. New styles of management allow loosely coupled
actors to work together, benefiting from size effects while not being hindered
by big corporate culture...
See for example Uber, Blablacar or Airbnb, or simply the internet compared to
NBC news.

You can see this in real problems: Moore's law is only a detail in most progress.
In a course, someone talked about the speed of solving elliptic curves.
Moore's law is a minor effect in the huge increase in speed, which was driven
by huge improvements in mathematics.
The same goes for cryptographic problems like hashes, ciphers, primes...
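
(A tiny illustration, in Python, of the point about different Moore's laws
interacting: two quantities that both improve exponentially but at different
rates open up a gap that itself grows exponentially. The two rates below are
placeholders chosen for illustration, not measured figures.)

    # Two exponential trends with different improvement rates: the ratio
    # between them (the "gap") grows exponentially too.
    fast_rate, slow_rate = 0.5, 0.07      # illustrative annual improvement rates
    for year in range(0, 21, 5):
        gap = ((1 + fast_rate) / (1 + slow_rate)) ** year
        print(f"year {year:2d}: gap ~{gap:,.0f}x")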




2015-01-27 13:48 GMT+01:00 John Berry :

> BTW AFAIK Moore's Law isn't that speeds increase but that transistor
> densities increase.
>
> On Wed, Jan 28, 2015 at 1:33 AM, Bob Higgins 
> wrote:
>
>> What James says is true about the radius of connection.  However, two
>> things have been driving that radius smaller - smaller gate size and chip
>> stacking.  We all recognize that making the transistors and the gates
>> smaller decreases this radius, but what is not widely recognized is chip
>> stacking technology is becoming more common.  The issue with chip stacking
>> is the heat dissipation.  This was addressed by IBM, for example, using
>> liquid cooled systems and stacking years ago.  However, once the IC power
>> is reduced, chip stacking becomes very practical.  It is currently used to
>> stack memories on top of processors in a lot of consumer devices.
>>
>> Going up provides a lot of opportunity for increased performance by
>> adding complexity without substantially increasing the radius of
>> connection.  There is presently a lot of headroom in this technology for
>> additional Moore's law advance.
>>
>
>


Re: [Vo]:doubling speed every 2 years for decades more, Intel silicon photonics now revolutionizing data centers, Michael Kassner: Rich Murray 2015.01.26

2015-01-27 Thread John Berry
BTW AFAIK Moore's Law isn't that speeds increase but that transistor
densities increase.

On Wed, Jan 28, 2015 at 1:33 AM, Bob Higgins 
wrote:

> What James says is true about the radius of connection.  However, two
> things have been driving that radius smaller - smaller gate size and chip
> stacking.  We all recognize that making the transistors and the gates
> smaller decreases this radius, but what is not widely recognized is chip
> stacking technology is becoming more common.  The issue with chip stacking
> is the heat dissipation.  This was addressed by IBM, for example, using
> liquid cooled systems and stacking years ago.  However, once the IC power
> is reduced, chip stacking becomes very practical.  It is currently used to
> stack memories on top of processors in a lot of consumer devices.
>
> Going up provides a lot of opportunity for increased performance by adding
> complexity without substantially increasing the radius of connection.
> There is presently a lot of headroom in this technology for additional
> Moore's law advance.
>


Re: [Vo]:doubling speed every 2 years for decades more, Intel silicon photonics now revolutionizing data centers, Michael Kassner: Rich Murray 2015.01.26

2015-01-27 Thread Bob Higgins
What James says is true about the radius of connection.  However, two
things have been driving that radius smaller - smaller gate size and chip
stacking.  We all recognize that making the transistors and the gates
smaller decreases this radius, but what is not widely recognized is chip
stacking technology is becoming more common.  The issue with chip stacking
is the heat dissipation.  This was addressed by IBM, for example, using
liquid cooled systems and stacking years ago.  However, once the IC power
is reduced, chip stacking becomes very practical.  It is currently used to
stack memories on top of processors in a lot of consumer devices.

Going up provides a lot of opportunity for increased performance by adding
complexity without substantially increasing the radius of connection.
There is presently a lot of headroom in this technology for additional
Moore's law advance.
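
(A rough sketch, in Python, of why this radius matters: how far a signal can
travel in one clock cycle. The on-chip propagation speed is a placeholder
fraction of the speed of light, used only to show the scale.)

    # How far can a signal travel within one clock cycle?
    c = 3.0e8                  # speed of light, m/s
    clock_hz = 3.0e9           # a 3 GHz clock
    prop_fraction = 0.5        # assumed on-chip propagation speed, as a fraction of c
    radius_cm = c * prop_fraction / clock_hz * 100
    print(f"~{radius_cm:.0f} cm per cycle")   # ~5 cm: everything that must stay in
                                              # lock-step has to fit within this radius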


SV: [Vo]:doubling speed every 2 years for decades more, Intel silicon photonics now revolutionizing data centers, Michael Kassner: Rich Murray 2015.01.26

2015-01-27 Thread Lewan Mats
I believe that a good explanation for doubling speed is provided by Kurzweil’s 
suggestion that he calls The Law of Accelerating Returns 
(http://www.kurzweilai.net/the-law-of-accelerating-returns), basically meaning 
that whatever is invented/evolved in a system is fed back into the system and 
increases the over-all speed of invention/evolution, leading mathematically to 
exponential growth of speed.

Another basic principle in the universe seems to be self-organization, which 
has led to evolution of life and intelligence. And from a broad outside 
perspective, human intelligence is now part of a self-organizing system where 
technology is evolved.

At every phase, the speed of evolution follows an s-curve with a slow start, an 
exponential increase in speed and a slow-down due to some natural limitation. 
The overall exponential growth consists of several such s-curves put together. 
Biological life is at the end of its s-curve. What we know as technology is 
taking over (both those phenomena consist in turn of lots of s-curves put 
together).
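
(A small numerical sketch of this idea, in Python: several logistic s-curves,
each saturating at a higher ceiling than the last, sum to a curve that keeps
growing roughly exponentially. The midpoints and ceilings are arbitrary, chosen
only to show the shape.)

    import math

    def s_curve(t, midpoint, ceiling, steepness=1.0):
        """One logistic s-curve: slow start, rapid rise, saturation."""
        return ceiling / (1 + math.exp(-steepness * (t - midpoint)))

    # Successive "technologies", each with a ~10x higher ceiling than the last.
    curves = [(10, 1), (20, 10), (30, 100), (40, 1000)]   # (midpoint, ceiling)
    for t in range(0, 51, 10):
        total = sum(s_curve(t, m, c) for m, c in curves)
        print(f"t={t:2d}  combined capability ~{total:10.4f}")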

I think that it would be a little bit presumptuous to believe that:
- human intelligence is the highest possible form of intelligence.
- humans could possibly understand the absolute limits of computation or future 
similar phenomena.

I can see no reason to believe that doubling speed would cease in an imaginable 
future. We’re just a piece of the puzzle.

Mats

---
Mats Lewan, Senior staff writer Ny Teknik, Digital Teknik. Managing Editor Next Magasin.
Phone: +46-8-796 64 10, Cellular: +46-70-590 72 52, Twitter: matslew



Jed Rothwell  wrote:

The fastest data processing in the known universe, by a wide margin, is 
biological cell reproduction. The entire genome is copied by every cell that 
splits. This is a parallel process. The moment a strand of DNA is exposed to 
solution, all of the new bases begin to match up simultaneously. DNA is also by far 
the most compact form of data storage in the known universe, and I predict is 
the most compact that will ever be found. I do not think subatomic data storage 
will ever be possible. All the human data now existing can be stored in about 7 
ml of DNA.

- Jed
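
(A back-of-envelope version of this estimate, in Python. The physical constants
are standard textbook values; the figure for "all the human data now existing"
is an assumed mid-2010s estimate of a few zettabytes, so the result is
order-of-magnitude only.)

    # Rough storage capacity of DNA.
    avogadro = 6.022e23
    nt_mass_g = 330.0 / avogadro     # ~330 g/mol per nucleotide (single strand)
    bits_per_nt = 2                  # A/C/G/T encodes 2 bits
    density_g_per_ml = 1.7           # approximate density of DNA

    bytes_per_ml = (bits_per_nt / 8) / nt_mass_g * density_g_per_ml
    all_human_data_bytes = 5e21      # assumed: a few zettabytes (mid-2010s estimate)
    print(f"~{bytes_per_ml:.1e} bytes/ml")
    print(f"volume needed: ~{all_human_data_bytes / bytes_per_ml:.1f} ml")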



Re: [Vo]:doubling speed every 2 years for decades more, Intel silicon photonics now revolutionizing data centers, Michael Kassner: Rich Murray 2015.01.26

2015-01-26 Thread Axil Axil
The Nanor is an example of a quantum-based solid state LENR photonic device
that operates in a state of quantum entanglement. A quantum computer could
well be based on the Nanor.

On Mon, Jan 26, 2015 at 11:35 PM, Axil Axil  wrote:

> It is my contention that the Ni/H reactor is a proof of principle for the
> quantum computer. In the Ni/H reactor energy is shared instantaneously
> between all the plasmonic components of the reactor because there exists a
> condition of global BEC maintained throughout the reactor.
>
> On Mon, Jan 26, 2015 at 11:27 PM, Axil Axil  wrote:
>
>> The mechanism that underpins the quantum computer is entanglement and the
>> speed of entanglement is instantaneous. Computing components will be
>> connected through long-range entanglement so data will be shared
>> instantaneously.
>>
>> On Mon, Jan 26, 2015 at 9:15 PM, James Bowery  wrote:
>>
>>> All boolean functions (meaning all programs) can be parallelized to only
>>> 2 gate delays.  The problem is your computer ends up with more gates than
>>> there are elementary particles in the universe.
>>>
>>> A good deal of real computation consists of, in essence, decompressing a
>>> compressed form of the answer.  The difficulty of writing MPP software
>>> is essentially attempting to decompress the compressed form of the answer
>>> (ie: the program and its inputs) prior to run time so it maps on to your
>>> parallel architecture.
>>>
>>> To make software maintainable, you start out with the minimal
>>> description -- the Ockham's Razor version -- so that you don't introduce
>>> extraneous complexity to the program specification.  The rest, as they say,
>>> is expansion of the Kolmogorov Complexity and there is just no getting
>>> around the fact that you have a _lot_ of serial work in that process.
>>>
>>> On Mon, Jan 26, 2015 at 8:00 PM, Jed Rothwell 
>>> wrote:
>>>
 James Bowery  wrote:


> Architectures that attempt to hide this problem with lots of
> processors accessing local stores in parallel are drunks looking for their
> keys under the lamp post.
>

 I disagree. The purpose of a computer is to solve problems. To process
 data. Not to crunch numbers as quickly as possible. The human brain is many
 orders of magnitude slower than any computer, and yet we can recognize
 faces faster than just about any computer, because the brain is a massively
 parallel processor (MPP). Many neurons compare the image to stored images
 simultaneously, and the neurons that find the closest match "come to mind."
 Many data processing functions can be performed in parallel. Sorting and
 searching arrays has been done in parallel since the 1950s. Polyphase sort
 methods with multiple processors and mag tape decks were wonderfully fast.

 It is difficult to write MPP software, but once we master the
 techniques the job will be done, and it will be much easier to update.
 Already, Microsoft Windows works better on multi-processor computers than
 single processor models. Multiprocessors also run voice input programs much
 faster than single processors.

 A generation from now we may have personal computers with millions of
 processors. Even if every processor were much slower than today's
 processors, the overall speed for many classes of problems will be similar
 to today's supercomputers -- which can solve problems hundreds of thousands
 to millions of times faster than a PC or Mac. They will have the power of
 today's Watson computer, which is to say, they will be able to play
 Jeopardy or diagnose disease far better than any person. I expect they will
 also recognize faces and do voice input better than any person.

 There may be a few esoteric problems that are inherently serial in
 nature and that can only be solved by a single processor, but I expect most
 real-world problems can be broken down into procedures run in parallel. Of course
 the breaking down will be done automatically. It is already.

 Before computers were invented, all large real world problems were
 broken down and solved in parallel by large groups of people, usually
 organized in a hierarchy. I mean, for example, the design of large
 buildings or the management of corporations, nations or armies.

 The fastest data processing in the known universe, by a wide margin, is
 biological cell reproduction. The entire genome is copied by every cell
 that splits. This is a parallel process. The moment a strand of DNA is
 exposed to solution, all of the new bases begin to match up simultaneously. DNA is
 also by far the most compact form of data storage in the known universe,
 and I predict is the most compact that will ever be found. I do not think
 subatomic data storage will ever be possible. All the human data now
 existing can be stored in about 7 ml of DNA.

 - Jed


>>>
>>
>


Re: [Vo]:doubling speed every 2 years for decades more, Intel silicon photonics now revolutionizing data centers, Michael Kassner: Rich Murray 2015.01.26

2015-01-26 Thread Eric Walker
On Mon, Jan 26, 2015 at 6:00 PM, Jed Rothwell  wrote:

They will have the power of today's Watson computer, which is to say, they
> will be able to play Jeopardy or diagnose disease far better than any
> person. I expect they will also recognize faces and do voice input better
> than any person.
>

This prediction seems very attainable.  I think developments along these
lines will happen in the next ten to twenty years.  Your smartphone may be
able to do these things.  (And your smartphone will be a tiny little
thing.)  A related development -- people are predisposed to
anthropomorphizing technology.  In the same time period, I suppose there
will be robots and computer systems that far surpass Siri in mimicking
sentient life.  There was a highly entertaining movie that came out
recently, "Her," that riffed on this idea.

 All the human data now existing can be stored in about 7 ml of DNA.
>

Note that much of this storage is not in the molecule itself but in how
it's arranged -- epigenetics.  This is a fascinating area of biology that
focuses on which genes are expressed and to what extent they are.  It could
hold the key to understanding things like cancer, which studies that focus
solely on genetics might not uncover.

Eric


Re: [Vo]:doubling speed every 2 years for decades more, Intel silicon photonics now revolutionizing data centers, Michael Kassner: Rich Murray 2015.01.26

2015-01-26 Thread Axil Axil
It is my contention that the Ni/H reactor is a proof of principle for the
quantum computer. In the Ni/H reactor energy is shared instantaneously
between all the plasmonic components of the reactor because there exists a
condition of global BEC maintained throughout the reactor.

On Mon, Jan 26, 2015 at 11:27 PM, Axil Axil  wrote:

> The mechanism that underpins the quantum computer is entanglement and the
> speed of entanglement is instantaneous. Computing components will be
> connected through long-range entanglement so data will be shared
> instantaneously.
>
> On Mon, Jan 26, 2015 at 9:15 PM, James Bowery  wrote:
>
>> All boolean functions (meaning all programs) can be parallelized to only
>> 2 gate delays.  The problem is your computer ends up with more gates than
>> there are elementary particles in the universe.
>>
>> A good deal of real computation consists of, in essence, decompressing a
>> compressed form of the answer.  The difficulty of writing MPP software
>> is essentially attempting to decompress the compressed form of the answer
>> (ie: the program and its inputs) prior to run time so it maps on to your
>> parallel architecture.
>>
>> To make software maintainable, you start out with the minimal description
>> -- the Ockham's Razor version -- so that you don't introduce extraneous
>> complexity to the program specification.  The rest, as they say, is
>> expansion of the Kolmogorov Complexity and there is just no getting around
>> the fact that you have a _lot_ of serial work in that process.
>>
>> On Mon, Jan 26, 2015 at 8:00 PM, Jed Rothwell 
>> wrote:
>>
>>> James Bowery  wrote:
>>>
>>>
 Architectures that attempt to hide this problem with lots of processors
 accessing local stores in parallel are drunks looking for their keys under
 the lamp post.

>>>
>>> I disagree. The purpose of a computer is to solve problems. To process
>>> data. Not to crunch numbers as quickly as possible. The human brain is many
>>> orders of magnitude slower than any computer, and yet we can recognize
>>> faces faster than just about any computer, because the brain is a massively
>>> parallel processor (MPP). Many neurons compare the image to stored images
>>> simultaneously, and the neurons that find the closest match "come to mind."
>>> Many data processing functions can be performed in parallel. Sorting and
>>> searching arrays has been done in parallel since the 1950s. Polyphase sort
>>> methods with multiple processors and mag tape decks were wonderfully fast.
>>>
>>> It is difficult to write MPP software, but once we master the techniques
>>> the job will be done, and it will be much easier to update. Already,
>>> Microsoft Windows works better on multi-processor computers than single
>>> processor models. Multiprocessors also run voice input programs much faster
>>> than single processors.
>>>
>>> A generation from now we may have personal computers with millions of
>>> processors. Even if every processor were much slower than today's
>>> processors, the overall speed for many classes of problems will be similar
>>> to today's supercomputers -- which can solve problems hundreds of thousands
>>> to millions of times faster than a PC or Mac. They will have the power of
>>> today's Watson computer, which is to say, they will be able to play
>>> Jeopardy or diagnose disease far better than any person. I expect they will
>>> also recognize faces and do voice input better than any person.
>>>
>>> There may be a few esoteric problems that are inherently serial in
>>> nature and that can only be solved by a single processor, but I expect most
>>> real-world problems can be broken down into procedures run in parallel. Of course
>>> the breaking down will be done automatically. It is already.
>>>
>>> Before computers were invented, all large real world problems were
>>> broken down and solved in parallel by large groups of people, usually
>>> organized in a hierarchy. I mean, for example, the design of large
>>> buildings or the management of corporations, nations or armies.
>>>
>>> The fastest data processing in the known universe, by a wide margin, is
>>> biological cell reproduction. The entire genome is copied by every cell
>>> that splits. This is a parallel process. The moment a strand of DNA is
>>> exposed to solution, all of the new bases begin to match up simultaneously. DNA is
>>> also by far the most compact form of data storage in the known universe,
>>> and I predict is the most compact that will ever be found. I do not think
>>> subatomic data storage will ever be possible. All the human data now
>>> existing can be stored in about 7 ml of DNA.
>>>
>>> - Jed
>>>
>>>
>>
>


Re: [Vo]:doubling speed every 2 years for decades more, Intel silicon photonics now revolutionizing data centers, Michael Kassner: Rich Murray 2015.01.26

2015-01-26 Thread Axil Axil
The mechanism that underpins the quantum computer is entanglement and the
speed of entanglement is instantaneous. Computing components will be
connected through long-range entanglement so data will be shared
instantaneously.

On Mon, Jan 26, 2015 at 9:15 PM, James Bowery  wrote:

> All boolean functions (meaning all programs) can be parallelized to only 2
> gate delays.  The problem is your computer ends up with more gates than
> there are elementary particles in the universe.
>
> A good deal of real computation consists of, in essence, decompressing a
> compressed form of the answer.  The difficulty of writing MPP software
> is essentially attempting to decompress the compressed form of the answer
> (ie: the program and its inputs) prior to run time so it maps on to your
> parallel architecture.
>
> To make software maintainable, you start out with the minimal description
> -- the Ockham's Razor version -- so that you don't introduce extraneous
> complexity to the program specification.  The rest, as they say, is
> expansion of the Kolmogorov Complexity and there is just no getting around
> the fact that you have a _lot_ of serial work in that process.
>
> On Mon, Jan 26, 2015 at 8:00 PM, Jed Rothwell 
> wrote:
>
>> James Bowery  wrote:
>>
>>
>>> Architectures that attempt to hide this problem with lots of processors
>>> accessing local stores in parallel are drunks looking for their keys under
>>> the lamp post.
>>>
>>
>> I disagree. The purpose of a computer is to solve problems. To process data.
>> Not to crunch numbers as quickly as possible. The human brain is many
>> orders of magnitude slower than any computer, and yet we can recognize
>> faces faster than just about any computer, because the brain is a massively
>> parallel processor (MPP). Many neurons compare the image to stored images
>> simultaneously, and the neurons that find the closest match "come to mind."
>> Many data processing functions can be performed in parallel. Sorting and
>> searching arrays has been done in parallel since the 1950s. Polyphase sort
>> methods with multiple processors and mag tape decks were wonderfully fast.
>>
>> It is difficult to write MPP software, but once we master the techniques
>> the job will be done, and it will be much easier to update. Already,
>> Microsoft Windows works better on multi-processor computers than single
>> processor models. Multiprocessors also run voice input programs much faster
>> than single processors.
>>
>> A generation from now we may have personal computers with millions of
>> processors. Even if every processor were much slower than today's
>> processors, the overall speed for many classes of problems will be similar
>> to today's supercomputers -- which can solve problems hundreds of thousands
>> to millions of times faster than a PC or Mac. They will have the power of
>> today's Watson computer, which is to say, they will be able to play
>> Jeopardy or diagnose disease far better than any person. I expect they will
>> also recognize faces and do voice input better than any person.
>>
>> There may be a few esoteric problems that are inherently serial in nature
>> and that can only be solved by a single processor, but I expect most
>> real-world problems can be broken down into procedures run in parallel. Of course the
>> breaking down will be done automatically. It is already.
>>
>> Before computers were invented, all large real world problems were broken
>> down and solved in parallel by large groups of people, usually organized in
>> a hierarchy. I mean, for example, the design of large buildings or the
>> management of corporations, nations or armies.
>>
>> The fastest data processing in the known universe, by a wide margin, is
>> biological cell reproduction. The entire genome is copied by every cell
>> that splits. This is a parallel process. The moment a strand of DNA is
>> exposed to solution, all of the new bases begin to match up simultaneously. DNA is
>> also by far the most compact form of data storage in the known universe,
>> and I predict is the most compact that will ever be found. I do not think
>> subatomic data storage will ever be possible. All the human data now
>> existing can be stored in about 7 ml of DNA.
>>
>> - Jed
>>
>>
>


Re: [Vo]:doubling speed every 2 years for decades more, Intel silicon photonics now revolutionizing data centers, Michael Kassner: Rich Murray 2015.01.26

2015-01-26 Thread James Bowery
All Boolean functions (meaning all programs) can be parallelized to only 2
gate delays.  The problem is that your computer ends up with more gates than
there are elementary particles in the universe.
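
The blow-up is easy to see with a brute-force two-level (sum-of-products)
construction; a small sketch of that arithmetic follows, using the commonly
cited order-of-magnitude figure of ~1e80 particles in the observable universe
(illustrative only, not a synthesis tool):

# Rough worst-case gate count for a two-level (AND-OR) realization of an
# n-input Boolean function: up to 2**n product terms feeding one OR gate.
ATOMS_IN_UNIVERSE = 10 ** 80   # common order-of-magnitude estimate

def two_level_gate_count(n_inputs):
    return 2 ** n_inputs + 1   # every minterm present, plus the final OR

for n in (32, 64, 128, 300):
    gates = two_level_gate_count(n)
    note = "exceeds" if gates > ATOMS_IN_UNIVERSE else "still below"
    print(f"{n:4d} inputs -> ~{gates:.2e} gates ({note} ~1e80 particles)")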

A good deal of real computation consists of, in essence, decompressing a
compressed form of the answer.  The difficulty of writing MPP software
is essentially attempting to decompress the compressed form of the answer
(i.e., the program and its inputs) prior to run time so it maps onto your
parallel architecture.

To make software maintainable, you start out with the minimal description
-- the Ockham's Razor version -- so that you don't introduce extraneous
complexity to the program specification.  The rest, as they say, is
expansion of the Kolmogorov Complexity and there is just no getting around
the fact that you have a _lot_ of serial work in that process.
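
One way to put numbers on that serial residue is Amdahl's law: if a fraction s
of the run is irreducibly serial, p processors can speed it up by at most
1 / (s + (1 - s) / p). A minimal sketch, with made-up serial fractions used
purely as illustrations:

# Amdahl's law: best speedup on p processors given serial fraction s.
def amdahl_speedup(s, p):
    return 1.0 / (s + (1.0 - s) / p)

for s in (0.01, 0.05, 0.25):           # hypothetical serial fractions
    for p in (8, 1_000, 1_000_000):    # processor counts
        print(f"serial fraction {s:>4}: {p:>9} processors -> "
              f"speedup ~ {amdahl_speedup(s, p):8.1f}")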

On Mon, Jan 26, 2015 at 8:00 PM, Jed Rothwell  wrote:

> James Bowery  wrote:
>
>
>> Architectures that attempt to hide this problem with lots of processors
>> accessing local stores in parallel are drunks looking for their keys under
>> the lamp post.
>>
>
> I disagree. The purpose of a computer is solve problems. To process data.
> Not to crunch numbers as quickly as possible. The human brain is many
> orders of magnitude slower than any computer, and yet we can recognize
> faces faster than just about any computer, because the brain is a massively
> parallel processor (MPP). Many neurons compare the image to stored images
> simultaneously, and the neurons that find the closest match "come to mind."
> Many data processing functions can be performed in parallel. Sorting and
> searching arrays has been done in parallel since the 1950s. Polyphase sort
> methods with multiple processors and mag tape decks were wonderfully fast.
>
> It is difficult to write MPP software, but once we master the techniques
> the job will be done, and it will be much easier to update. Already,
> Microsoft Windows works better on multi-processor computers than single
> processor models. Multiprocessor also run voice input programs much faster
> than single processors.
>
> A generation from now we may have personal computers with millions of
> processors. Even if every processor were much slower than today's
> processors, the overall speed for many classes of problems will be similar
> to today's supercomputers -- which can solve problems hundreds of thousands
> to millions of times faster than a PC or Mac. They will have the power of
> today's Watson computer, which is to say, they will be able to play
> Jeopardy or diagnose disease far better than any person. I expect they will
> also recognize faces and do voice input better than any person.
>
> There may be a few esoteric problems that are inherently serial in nature
> and that can only be solved by a single processor, but I expect most real
> world can be broken down into procedures run in parallel. Of course the
> breaking down will be done automatically. It is already.
>
> Before computers were invented, all large real world problems were broken
> down and solved in parallel by large groups of people, usually organized in
> a hierarchy. I mean, for example, the design of large buildings or the
> management of corporations, nations or armies.
>
> The fastest data processing in the known universe, by a wide margin, is
> biological cell reproduction. The entire genome is copied by every cell
> that splits. This is a parallel process. The moment a strand of DNA is
> exposed to solution, all of new bases begin match up simultaneously. DNA is
> also by far the most compact form of data storage in the known universe,
> and I predict is the most compact that will ever be found. I do not think
> subatomic data storage will ever be possible. All the human data now
> existing can be stored in about 7 ml of DNA.
>
> - Jed
>
>


Re: [Vo]:doubling speed every 2 years for decades more, Intel silicon photonics now revolutionizing data centers, Michael Kassner: Rich Murray 2015.01.26

2015-01-26 Thread Rich Murray
I suppose there will evolve a molecular scale device of about 1,000 atoms,
interacting by light speed signals, with local memory modules directly
adjacent in six directions -- what would be the cycle time for this?
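
A back-of-envelope answer to that question, assuming the device spans roughly
5 nm and the cycle is limited only by a light-speed round trip to the adjacent
memory (a deliberately optimistic assumption; real switching would be far
slower):

# Hypothetical cycle time for a ~1,000-atom device a few nanometres across,
# limited only by light-speed signalling to directly adjacent memory.
C = 299_792_458.0        # speed of light, m/s
span = 5e-9              # ~5 nm, roughly a 1,000-atom cluster

round_trip = 2 * span / C
print(f"round trip ~ {round_trip:.1e} s  ->  ~ {1 / round_trip:.1e} Hz")
# ~3e-17 s per cycle, i.e. tens of attoseconds, or roughly 3e16 Hz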

Since 1660 the growth of all science has been exponential -- that's the
actual empirical evidence -- if the single hyperinfinity is, well,
infinite, then exponential growth of "science" goes forever, especially
accepting a fractal multi-universe reality... our universe bubble must be
somewhere in the "middle" of all this...

within the fellowship of service,  Rich

On Mon, Jan 26, 2015 at 5:24 PM, James Bowery  wrote:

> This is nonsense.
>
> In microcomputer architecture there is something known as the radius of
> control, which is bounded by the distance that can be traversed by a signal
> from a processing unit to memory and back.  That feedback time is, even in
> some hypothetical all-optical computer, limited by the speed of light.
> Light travels one foot per nanosecond or thereabouts.  So if you had
> wafer-scale optical computing you could support radius of control at a
> cycle time of about 1GHz.  This is a hard limit -- very hard.
>
> I've attacked this computation limit as directly as just about anyone with an
> analog mutex crossbar circuit that keeps main memory access on chip.
> This is critical because as soon as you go off chip you suffer orders of
> magnitude slowdown in your primary control cycle.
>
> Architectures that attempt to hide this problem with lots of processors
> accessing local stores in parallel are drunks looking for their keys under
> the lamp post.
>
>
> On Mon, Jan 26, 2015 at 6:22 PM, Rich Murray  wrote:
>
>> doubling speed every 2 years for decades more, Intel silicon photonics
>> now revolutionizing data centers, Michael Kassner: Rich Murray 2015.01.26
>> http://rmforall.blogspot.com/2015/01/doubling-speed-every-2-years-for.html
>>
>>
>> [ See also:
>>
>> exponential information technology 1890-2014 10exp17 more MIPS per
>> constant 2004 dollar in 124 years, Luke Muehlhauser, Machine Intelligence
>> Research Institute 2014.05.12: Rich Murray 2014.12.27
>>
>> http://rmforall.blogspot.com/2014/12/exponential-information-technology-1890.html
>>
>>
>> since 1890, increase by 10 times every 7.3 years --
>>
>> since 1950 -- 2014 = 64 years, with about 10exp13  times more =
>> 10,000,000,000,000 times more per device, from vacuum tubes to multicore
>> processors -- increase by 10 times every 5 years per constant 2004 dollar.
>>
>>
>> CSICON -- Murray's Law -- Eternal Exponential Expansion of Science: Rich
>> Murray 1997.04.05, 2001.06.22, 2011.01.03
>>
>> http://rmforall.blogspot.com/2011/01/csicon-murrays-law-eternal-exponential.html
>> http://groups.yahoo.com/group/rmforall/message/102]
>>
>>
>>
>> http://www.techrepublic.com/article/silicon-photonics-will-revolutionize-data-centers-in-2015/
>>
>>
>> Silicon photonics will revolutionize data centers in 2015
>>
>> By Michael Kassner, January 23, 2015, 11:23 AM PST
>>
>>
>> Data centers are morphing into computing singularities, albeit large
>> ones. Silicon photonics will hasten that process. The reason why begins
>> with Moore's Law.
>>
>> [Image courtesy of Intel]
>>
>> Gordon Moore's prediction known as Moore's Law -- "The number of transistors
>> incorporated in a chip will approximately double every 24 months." -- has
>> been uncanny in its accuracy since he made it in April 1965. That didn't
>> stop pundits from saying Moore's Law had a nice run, but like all good
>> things, it was coming to an end. The pundits' prediction was erroneous,
>> thanks to Intel (the company Moore co-founded). The reason is light, or more
>> accurately photons.
>> The problem photons overcome
>>
>> [Image: Gordon Moore, courtesy of Intel]
>> Moore's Law requires scientists and engineers to continually figure out
>> how to pack larger quantities of transistors and support circuitry into
>> chips. It's a challenge, but not as difficult as figuring out what to do
>> about the by-produ

RE: [Vo]:doubling speed every 2 years for decades more, Intel silicon photonics now revolutionizing data centers, Michael Kassner: Rich Murray 2015.01.26

2015-01-26 Thread Hoyt A. Stearns Jr.
Well said!





From: Jed Rothwell [mailto:jedrothw...@gmail.com]
Sent: Monday, January 26, 2015 7:00 PM
To: vortex-l@eskimo.com
Subject: Re: [Vo]:doubling speed every 2 years for decades more, Intel silicon 
photonics now revolutionizing data centers, Michael Kassner: Rich Murray 
2015.01.26



James Bowery  wrote:



Architectures that attempt to hide this problem with lots of processors 
accessing local stores in parallel are drunks looking for their keys under the 
lamp post.



I disagree. The purpose of a computer is to solve problems. To process data. Not 
to crunch numbers as quickly as possible. The human brain is many orders of 
magnitude slower than any computer, and yet we can recognize faces faster than 
just about any computer, because the brain is a massively parallel processor 
(MPP). Many neurons compare the image to stored images simultaneously, and the 
neurons that find the closest match "come to mind." Many data processing 
functions can be performed in parallel. Sorting and searching arrays has been 
done in parallel since the 1950s. Polyphase sort methods with multiple 
processors and mag tape decks were wonderfully fast.



It is difficult to write MPP software, but once we master the techniques the 
job will be done, and it will be much easier to update. Already, Microsoft 
Windows works better on multi-processor computers than single processor models. 
Multiprocessors also run voice input programs much faster than single processors.



A generation from now we may have personal computers with millions of 
processors. Even if every processor were much slower than today's processors, 
the overall speed for many classes of problems will be similar to today's 
supercomputers -- which can solve problems hundreds of thousands to millions of 
times faster than a PC or Mac. They will have the power of today's Watson 
computer, which is to say, they will be able to play Jeopardy or diagnose 
disease far better than any person. I expect they will also recognize faces and 
do voice input better than any person.



There may be a few esoteric problems that are inherently serial in nature and 
that can only be solved by a single processor, but I expect most real-world
problems can be broken down into procedures run in parallel. Of course the breaking down 
will be done automatically. It is already.



Before computers were invented, all large real world problems were broken down 
and solved in parallel by large groups of people, usually organized in a 
hierarchy. I mean, for example, the design of large buildings or the management 
of corporations, nations or armies.



The fastest data processing in the known universe, by a wide margin, is 
biological cell reproduction. The entire genome is copied by every cell that 
splits. This is a parallel process. The moment a strand of DNA is exposed to 
solution, all of the new bases begin to match up simultaneously. DNA is also by far 
the most compact form of data storage in the known universe, and I predict it is 
the most compact that will ever be found. I do not think subatomic data storage 
will ever be possible. All the human data now existing can be stored in about 7 
ml of DNA.



- Jed







Re: [Vo]:doubling speed every 2 years for decades more, Intel silicon photonics now revolutionizing data centers, Michael Kassner: Rich Murray 2015.01.26

2015-01-26 Thread Jed Rothwell
James Bowery  wrote:


> Architectures that attempt to hide this problem with lots of processors
> accessing local stores in parallel are drunks looking for their keys under
> the lamp post.
>

I disagree. The purpose of a computer is to solve problems. To process data.
Not to crunch numbers as quickly as possible. The human brain is many
orders of magnitude slower than any computer, and yet we can recognize
faces faster than just about any computer, because the brain is a massively
parallel processor (MPP). Many neurons compare the image to stored images
simultaneously, and the neurons that find the closest match "come to mind."
Many data processing functions can be performed in parallel. Sorting and
searching arrays has been done in parallel since the 1950s. Polyphase sort
methods with multiple processors and mag tape decks were wonderfully fast.
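
A minimal sketch of that divide-sort-merge idea using Python's standard
library -- not a polyphase tape sort, just the same parallel principle of
sorting runs concurrently and merging them afterwards:

# Sort chunks in separate processes, then do a serial k-way merge of the runs.
import heapq
import random
from concurrent.futures import ProcessPoolExecutor

def parallel_sort(data, workers=4):
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        runs = list(pool.map(sorted, chunks))   # parallel phase
    return list(heapq.merge(*runs))             # serial merge phase

if __name__ == "__main__":
    data = [random.random() for _ in range(100_000)]
    assert parallel_sort(data) == sorted(data)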

It is difficult to write MPP software, but once we master the techniques
the job will be done, and it will be much easier to update. Already,
Microsoft Windows works better on multi-processor computers than single
processor models. Multiprocessors also run voice input programs much faster
than single processors.

A generation from now we may have personal computers with millions of
processors. Even if every processor were much slower than today's
processors, the overall speed for many classes of problems will be similar
to today's supercomputers -- which can solve problems hundreds of thousands
to millions of times faster than a PC or Mac. They will have the power of
today's Watson computer, which is to say, they will be able to play
Jeopardy or diagnose disease far better than any person. I expect they will
also recognize faces and do voice input better than any person.

There may be a few esoteric problems that are inherently serial in nature
and that can only be solved by a single processor, but I expect most
real-world problems can be broken down into procedures run in parallel. Of course the
breaking down will be done automatically. It is already.

Before computers were invented, all large real world problems were broken
down and solved in parallel by large groups of people, usually organized in
a hierarchy. I mean, for example, the design of large buildings or the
management of corporations, nations or armies.

The fastest data processing in the known universe, by a wide margin, is
biological cell reproduction. The entire genome is copied by every cell
that splits. This is a parallel process. The moment a strand of DNA is
exposed to solution, all of the new bases begin to match up simultaneously. DNA is
also by far the most compact form of data storage in the known universe,
and I predict it is the most compact that will ever be found. I do not think
subatomic data storage will ever be possible. All the human data now
existing can be stored in about 7 ml of DNA.
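
A rough plausibility check of that figure, assuming about 2 bits per base
pair, ~650 daltons per base pair, and typical DNA density; what counts as
"all the human data" is of course open to interpretation, so treat the result
as an order-of-magnitude estimate only:

# Order-of-magnitude DNA storage density from standard physical constants.
AVOGADRO = 6.022e23
BP_MASS_G = 650.0 / AVOGADRO     # grams per base pair
BITS_PER_BP = 2.0
DENSITY_G_PER_ML = 1.7           # approximate density of DNA, g/mL

bytes_per_ml = (BITS_PER_BP / BP_MASS_G) * DENSITY_G_PER_ML / 8
print(f"~{bytes_per_ml:.1e} bytes per mL of DNA")           # ~4e20 bytes/mL
print(f"7 mL ~ {7 * bytes_per_ml / 1e21:.1f} zettabytes")    # a few ZB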

- Jed


Re: [Vo]:doubling speed every 2 years for decades more, Intel silicon photonics now revolutionizing data centers, Michael Kassner: Rich Murray 2015.01.26

2015-01-26 Thread James Bowery
This is nonsense.

In microcomputer architecture there is something known as the radius of
control, which is bounded by the distance that can be traversed by a signal
from a processing unit to memory and back.  That feedback time is, even in
some hypothetical all-optical computer, limited by the speed of light.
Light travels one foot per nanosecond or thereabouts.  So if you had
wafer-scale optical computing you could support radius of control at a
cycle time of about 1GHz.  This is a hard limit -- very hard.
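
A quick sketch of that bound, treating the clock period as a light-speed round
trip over the radius of control; the distances are illustrative, and real
chips are slower still because of gate delays and wiring:

# Radius-of-control bound: the clock period cannot be shorter than the
# round-trip light time from the processing unit to memory and back.
C = 299_792_458.0    # speed of light, m/s

def max_clock_hz(radius_m):
    return C / (2.0 * radius_m)    # one out-and-back traversal per cycle

for label, r in [("wafer-scale, ~15 cm radius", 0.15),
                 ("single die, ~1 cm", 0.01),
                 ("off-chip memory, ~10 cm of trace", 0.10)]:
    print(f"{label:35s} -> at most ~{max_clock_hz(r) / 1e9:5.1f} GHz")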

I've attacked this computation limit as directly as just about anyone with an
analog mutex crossbar circuit that keeps main memory access on chip.
This is critical because as soon as you go off chip you suffer orders of
magnitude slowdown in your primary control cycle.

Architectures that attempt to hide this problem with lots of processors
accessing local stores in parallel are drunks looking for their keys under
the lamp post.


On Mon, Jan 26, 2015 at 6:22 PM, Rich Murray  wrote:

> doubling speed every 2 years for decades more, Intel silicon photonics now
> revolutionizing data centers, Michael Kassner: Rich Murray 2015.01.26
> http://rmforall.blogspot.com/2015/01/doubling-speed-every-2-years-for.html
>
>
> [ See also:
>
> exponential information technology 1890-2014 10exp17 more MIPS per
> constant 2004 dollar in 124 years, Luke Muehlhauser, Machine Intelligence
> Research Institute 2014.05.12: Rich Murray 2014.12.27
>
> http://rmforall.blogspot.com/2014/12/exponential-information-technology-1890.html
>
>
> since 1890, increase by 10 times every 7.3 years --
>
> since 1950 -- 2014 = 64 years, with about 10exp13  times more =
> 10,000,000,000,000 times more per device, from vacuum tubes to multicore
> processors -- increase by 10 times every 5 years per constant 2004 dollar.
>
>
> CSICON -- Murray's Law -- Eternal Exponential Expansion of Science: Rich
> Murray 1997.04.05, 2001.06.22, 2011.01.03
>
> http://rmforall.blogspot.com/2011/01/csicon-murrays-law-eternal-exponential.html
> http://groups.yahoo.com/group/rmforall/message/102]
>
>
>
> http://www.techrepublic.com/article/silicon-photonics-will-revolutionize-data-centers-in-2015/
>
>
> Silicon photonics will revolutionize data centers in 2015
>
> By Michael Kassner, January 23, 2015, 11:23 AM PST
>
>
> Data centers are morphing into computing singularities, albeit large ones.
> Silicon photonics will hasten that process. The reason why begins with
> Moore's Law.
>
> [Image courtesy of Intel]
>
> Gordon Moore's prediction known as Moore's Law -- "The number of transistors
> incorporated in a chip will approximately double every 24 months." -- has
> been uncanny in its accuracy since he made it in April 1965. That didn't
> stop pundits from saying Moore's Law had a nice run, but like all good
> things, it was coming to an end. The pundits' prediction was erroneous,
> thanks to Intel (the company Moore co-founded). The reason is light, or more
> accurately photons.
> The problem photons overcome
>
> [Image: Gordon Moore, courtesy of Intel]
> Moore's Law requires scientists and engineers to continually figure out
> how to pack larger quantities of transistors and support circuitry into
> chips. It's a challenge, but not as difficult as figuring out what to do
> about the by-products of shoving electricity through an ever-more dense
> population of chips: heat buildup, current leakage, and crosstalk between
> adjacent wire traces.
>
> Multi-core technology breathed new life into Moore's Law, but only for a
> short time. Using copper wires to
> transmit the digital information becomes the limiting factor. This MIT
> Technology Review 2005 article explains
> why copper wires were no longer good enough. "The problem is that
> electrical pulses traveling through a copper wire encounter electrical
> resistance, which degrades the information

[Vo]:doubling speed every 2 years for decades more, Intel silicon photonics now revolutionizing data centers, Michael Kassner: Rich Murray 2015.01.26

2015-01-26 Thread Rich Murray
doubling speed every 2 years for decades more, Intel silicon photonics now
revolutionizing data centers, Michael Kassner: Rich Murray 2015.01.26
http://rmforall.blogspot.com/2015/01/doubling-speed-every-2-years-for.html


[ See also:

exponential information technology 1890-2014 10exp17 more MIPS per constant
2004 dollar in 124 years, Luke Muehlhauser, Machine Intelligence Research
Institute 2014.05.12: Rich Murray 2014.12.27
http://rmforall.blogspot.com/2014/12/exponential-information-technology-1890.html


since 1890, increase by 10 times every 7.3 years --

since 1950 -- 2014 = 64 years, with about 10exp13  times more =
10,000,000,000,000 times more per device, from vacuum tubes to multicore
processors -- increase by 10 times every 5 years per constant 2004 dollar.
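
A quick consistency check of those rates, just compounding arithmetic on the
figures quoted above:

# 10^17 over 124 years and 10^13 over 64 years correspond to roughly
# 7.3 and 4.9 years per tenfold increase.
import math

def years_per_tenfold(total_factor, years):
    return years / math.log10(total_factor)

print(f"1890-2014: {years_per_tenfold(1e17, 124):.1f} years per 10x")  # ~7.3
print(f"1950-2014: {years_per_tenfold(1e13, 64):.1f} years per 10x")   # ~4.9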


CSICON -- Murray's Law -- Eternal Exponential Expansion of Science: Rich
Murray 1997.04.05, 2001.06.22, 2011.01.03
http://rmforall.blogspot.com/2011/01/csicon-murrays-law-eternal-exponential.html
http://groups.yahoo.com/group/rmforall/message/102]


http://www.techrepublic.com/article/silicon-photonics-will-revolutionize-data-centers-in-2015/


Silicon photonics will revolutionize data centers in 2015

By Michael Kassner, January 23, 2015, 11:23 AM PST


Data centers are morphing into computing singularities, albeit large ones.
Silicon photonics will hasten that process. The reason why begins with
Moore's Law.

[Image courtesy of Intel]

Gordon Moore's prediction known as Moore's Law -- "The number of transistors
incorporated in a chip will approximately double every 24 months." -- has been
uncanny in its accuracy since he made it in April 1965. That didn't stop
pundits from saying Moore's Law had a nice run, but like all good things, it
was coming to an end. The pundits' prediction was erroneous, thanks to Intel
(the company Moore co-founded). The reason is light, or more accurately photons.
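
Taken at face value, the 24-month doubling rule compounds enormously over the
fifty years since 1965; a one-line illustration of the implied growth
(arithmetic only, not actual transistor counts):

# Compound the "double every 24 months" rule from 1965 to 2015.
years = 2015 - 1965
doublings = years / 2
print(f"{years} years -> {doublings:.0f} doublings -> ~{2 ** doublings:.1e}x "
      f"more transistors per chip")    # ~3.4e7x
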
The problem photons overcome

[Image: Gordon Moore, courtesy of Intel]
Moore's Law requires scientists and engineers to continually figure out how
to pack larger quantities of transistors and support circuitry into chips.
It's a challenge, but not as difficult as figuring out what to do about the
by-products of shoving electricity through an ever-more dense population of
chips: heat buildup, current leakage, and crosstalk between adjacent wire
traces.

Multi-core technology breathed new life into Moore's Law, but only for a short
time. Using copper wires to
transmit the digital information becomes the limiting factor. This MIT
Technology Review 2005 article explains
why copper wires were no longer good enough. "The problem is that
electrical pulses traveling through a copper wire encounter electrical
resistance, which degrades the information they carry," states author
Robert Service. "As a result, data bits traveling through copper must be
spaced far enough apart and move slowly enough that devices on the other
end of the wire can pick them up."

That challenge becomes evident when walking through a data center, because
most, if not all, copper-based Ethernet runs have been replaced with fiber
optics. Using *existing* fiber-optic technology will not help Moore's Law
-- that requires a new technology called the silicon laser.
Fast forward to 2009

Intel's Photonics Technology Laboratory in 2009 mastered the silicon laser.
"We have done all the things that skeptics said we could not," mentions Intel
Fellow Mario Paniccia in this SPIE article. "We have got beyond
the proof-of-principle stage. Now we're putting it all together so that
Moore's Law can extend for decades into the future."

The article goes on to explain how Paniccia and his team created high-