On 07/11/2014 22:16, Matt Mahoney wrote:
> On Fri, Nov 7, 2014 at 8:18 PM, Tim Tyler via AGI <[email protected]> wrote:
>>> in fact, it is impossible to predict the consequences of above-human-level AI.
>>> As Vernor Vinge (1993) wrote, the technological singularity (i.e., the advent
>>> of far-above-human-level AI) is an "opaque wall across the future."
>> I think that this is nonsense. [...]
> I agree with Vinge. From an information theoretic viewpoint, if the
> singularity represents vastly greater knowledge, then we only have a
> tiny fraction of it now.
Unless you call all the rest of it "details". Besides, we don't yet know
whether there will even be a singularity.
It's not that there's no mountain of information in the future - rather, we
can already make useful predictions with what we have. Science picks the
low-hanging fruit first - so we already possess many of the laws of
mechanics, gravity, electromagnetism, thermodynamics and evolution.
Some of this fundamental material illuminates the future of our descendants.
There's no "opaque wall". It's more like a curtain - and as we get closer to
it, we will see more and more. The idea of a "singularity" that acts as a
barrier to predictions about our descendants - like an "event horizon" that
cannot be seen beyond - is unhelpful, unscientific, and not justified by the
facts.
--
__________
|im |yler http://timtyler.org/ [email protected] Remove lock to reply.