Algorithmic information has nothing to do with my argument.
I'm talking about time complexity.

There are limits to how fast a computer can run its clock: for
example, the energy-time uncertainty relation requires delta E times
delta t to be at least of order hbar, so if you try to make delta t
too small, the energy you need blows up.
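
To put a rough number on that: the cleaner version of the delta E /
delta t argument is the Margolus-Levitin bound, which says a system
with average energy E above its ground state can make at most
2E/(pi*hbar) orthogonal state transitions per second. Here is a
back-of-the-envelope sketch; the one-kilogram figure and the
assumption that all of the mass-energy is available for computation
are purely illustrative, nothing like a real machine:

import math

HBAR = 1.054571817e-34   # reduced Planck constant, J*s
C = 2.99792458e8         # speed of light, m/s

def max_ops_per_second(mass_kg):
    """Margolus-Levitin bound: at most 2E/(pi*hbar) orthogonal state
    transitions per second, taking E = m*c^2 as the available energy."""
    energy_joules = mass_kg * C ** 2
    return 2.0 * energy_joules / (math.pi * HBAR)

# Illustrative 1 kg "computer": roughly 5.4e50 operations per second,
# a ceiling that no clock-speed trick can get past.
print("%.2e ops/s" % max_ops_per_second(1.0))

However you set the constants, the ceiling is finite, which is why a
test based on time complexity makes sense at all.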


Ben> It's not solved by shielding, because the hypothetical
Ben> "computable source whose algorithmic information is too high for
Ben> me to grok it" could be within the molecules of the brain, just
Ben> where the hypothetical "uncomputable source" is hypothesized to
Ben> be by Penrose and Hameroff and so forth.

Ben> You can never do any experiment to distinguish directly between

Ben> A = "X is uncomputable"

Ben> and

Ben> B = "X is a computable but has an algorithmic information far
Ben> higher than my brain."

Ben> You can distinguish between them indirectly via inference
Ben> according to some theory, but then the extension of theory to
Ben> deal with A and B is going to be speculative and unsupported,
Ben> etc.



Ben> -- Ben G
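
I don't dispute that narrow point, by the way: any finite transcript
of queries and answers can trivially be reproduced by a computable
program, namely a lookup table, so the finite record by itself never
rules out a computable source. A toy sketch of that triviality (the
example queries are made up, obviously):

def oracle_from_transcript(transcript):
    """A perfectly computable function that agrees with every
    observation in the finite record -- it just hard-codes them."""
    def oracle(query):
        return transcript[query]
    return oracle

observed = {"halts(P1)?": "yes", "halts(P2)?": "no"}   # hypothetical finite record
fake = oracle_from_transcript(observed)
assert all(fake(q) == a for q, a in observed.items())

The escape hatch in my argument is timing, not description length:
the table has to be produced within the time budget, which is exactly
what the sequential-computation test quoted further down leans on.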

Ben> On Sun, Oct 26, 2008 at 9:19 AM, Eric Baum <[EMAIL PROTECTED]>
Ben> wrote:

>> I don't think this is reasonable. For the experiment, we would
>> isolate you with various shielding. It is a question of the design
>> of an experiment, like any other physics experiment. At some point,
>> Occam's Razor tells you that the best theory is a non-computational
>> system.
>> 
>> And, I hate to be defending people who make this kind of claim,
>> because their claims are wrong -- since what they are claiming to
>> have observed the mind do could easily be done by a computer.  And
>> I don't believe people could actually do the kind of stuff I am
>> saying you would use to test it.
>> 
>> But the point is only that one could perform experiments that would
>> test the hypothesis. The claim that such experiments would have to
>> be infinitely long to be convincing is not valid, I don't believe.
>> 
>> 
>> 
Ben> Eric, According to your argument, there are some cases in which
Ben> you could demonstrate that I was producing outputs that could not
Ben> be generated by the specific computer that is **my brain**
Ben> according to our current understanding of my brain.

Ben> However, this would not demonstrate that the source is
Ben> noncomputational.  There are other possible explanations, such as
Ben> the explanation that there is some more powerful computer
Ben> somewhere generating the outputs, in a way that we don't
Ben> currently understand.

Ben> So the question then becomes how would you distinguish between
Ben> the hypothesis of a hidden noncomputational source, and a hidden
Ben> more-powerful-computer source?  Again, you need to make this
Ben> distinction using a finite set of finite-precision
Ben> observations....  And so my argument then applies again to this
Ben> additional set of observations....

Ben> So I don't see that you have really provided a counterexample.
Ben> However, I can see the value of formalizing my argument
Ben> mathematically so as to avoid the appearance of such loopholes...

Ben> ben g

Ben> On Fri, Oct 24, 2008 at 7:01 PM, Eric Baum <[EMAIL PROTECTED]>
Ben> wrote:
>> >> You have not convinced me that you can do anything a computer
>> >> can't do.  And, using language or math, you never will --
>> >> because any finite set of symbols you can utter, could also be
>> >> uttered by some computational system.
>> >> -- Ben G
>>
>> I have the sense that this argument is not air tight, because I can
>> imagine a zero-knowledge proof that you can do something a computer
>> can't do.
>>
>> Any finite set of symbols you utter *could*, of course, be utterable
>> by some computational system, but if they are generated in response
>> to queries that are not known in advance, it might be arbitrarily
>> unlikely that they *would* be uttered by any particular
>> computational system.
>>
>> For example, to make this concrete and airtight, I can add a time
>> element.  Say I compute offline the answers to a large number of
>> problems that, if one were to solve them with a computation,
>> provably could only be solved by extremely long sequential
>> computations, each longer than any sequential computation that a
>> computer that could possibly be built out of the matter in your
>> brain could compute in an hour, and I present you these problems
>> and you answer 10000 of them in half an hour. At this point, I am
>> going, I think, to be persuaded that you are doing something that
>> cannot be captured by a Turing machine.
>>
>> Not that I believe, of course, that you can do anything a computer
>> can't do. I'm just saying, the above argument is not a proof that,
>> if you could, it could not be demonstrated.
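
To make that kind of test concrete, here is a rough sketch using the
Rivest-Shamir-Wagner time-lock puzzle (iterated squaring mod N) as a
stand-in for a "provably long sequential computation". The specific
construction and parameters are just my illustration, and its
sequentiality is a standard cryptographic conjecture rather than a
proof, but it shows how the tester can prepare the answers offline
cheaply while the subject, as far as anyone knows, has to do the
squarings one after another:

import random
import time

def make_puzzle(p, q, t, seed):
    """Tester side: knowing the factors p and q, the answer
    x^(2^t) mod N is cheap, because 2^t can be reduced mod phi(N)."""
    n = p * q
    phi = (p - 1) * (q - 1)
    x = seed % n
    answer = pow(x, pow(2, t, phi), n)
    return n, x, answer

def solve_sequentially(n, x, t):
    """Subject side, without the factors: t squarings in a row."""
    y = x
    for _ in range(t):
        y = (y * y) % n
    return y

p, q = 1000003, 1000033      # toy primes; a serious test needs ~1024-bit ones
t = 200_000                  # number of forced sequential steps
n, x, answer = make_puzzle(p, q, t, seed=random.randrange(2, p))
start = time.time()
reply = solve_sequentially(n, x, t)
print("correct" if reply == answer else "wrong",
      "after %.2f seconds" % (time.time() - start))

In the imagined experiment you would hand over thousands of such
instances, sized so that t sequential steps cannot plausibly fit in
the allotted half hour on any computer built from the matter at hand,
and see whether the answers come back anyway.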



Ben> --
Ben> Ben Goertzel, PhD
Ben> CEO, Novamente LLC and Biomind LLC
Ben> Director of Research, SIAI
Ben> [EMAIL PROTECTED]

Ben> "A human being should be able to change a diaper, plan an
Ben> invasion, butcher a hog, conn a ship, design a building, write a
Ben> sonnet, balance accounts, build a wall, set a bone, comfort the
Ben> dying, take orders, give orders, cooperate, act alone, solve
Ben> equations, analyze a new problem, pitch manure, program a
Ben> computer, cook a tasty meal, fight efficiently, die gallantly.
Ben> Specialization is for insects."  -- Robert Heinlein




