Sergio,
I am not sure why you are still writing to me.  I don't assume that I will
create an actual AGI program in the near future.  My point is that no one
really knows how the brain works, and no one really knows how to create
genuine AGI.  (I don't know how to get this across to you, but I am not the
guy participating in this thread who thinks that he has found the secret
that everyone has been longing for, or that the scientific sect he belongs
to should be revered by suspending doubt that its conjectures will ever go
much beyond being conjectures.)  On the other hand, I do think that AGI
will be feasible some day.

I don't think that the current popular theories based on probability have
taken the field any further than could be explained by the amazing advances
in hardware.  Probability methods are useful, I am not denying that, but it
takes elaborate schemes to design AGI programs or to develop conjectures
about how the brain works in neuroscience, and inside these elaborations
any validity that the probability methods might introduce is squelched.
Well, the human mind is not perfect either, right?  Then why make the
pretense that the science of probability is what drives the human mind?
Probability methods, like Bayesian reasoning, may be useful in narrow
situations, but once you take them to extremes, what comes out is mostly
fluff.
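For what it's worth, the kind of narrow situation where Bayesian reasoning does earn its keep can be sketched in a few lines.  (The function name and the numbers below are mine, purely illustrative; the point is only that the update rule is decisive when the problem is this well specified.)

```python
# A minimal sketch of Bayesian updating in a narrow, well-specified
# situation: revising a belief after a noisy diagnostic test.
# Prior, sensitivity, and false-positive rate are hypothetical numbers.

def posterior(prior, sensitivity, false_positive_rate):
    """P(condition | positive test) via Bayes' rule."""
    # Total probability of a positive result, from both causes.
    p_positive = sensitivity * prior + false_positive_rate * (1.0 - prior)
    return sensitivity * prior / p_positive

# Rare condition (1% prior) with a fairly good test: a positive
# result still leaves the condition far from certain.
p = posterior(prior=0.01, sensitivity=0.95, false_positive_rate=0.05)
print(round(p, 3))  # about 0.161
```

In a case like this the arithmetic settles the question; the argument above is that this tidiness does not survive once the method is stretched over something as underspecified as general intelligence.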

If you were sincere in expressing your wish to explain EI, you would have
focused on doing just that with almost humorous single-mindedness.
Jim Bromer




On Mon, Aug 20, 2012 at 3:29 PM, Sergio Pissanetzky
<[email protected]>wrote:

> Jim,
>
> JIM> … it is a projection of a dim awareness of an inner sense about
> yourself that is personally too threatening to the maintenance of your
> exaggerated level of self-esteem for you to accept.
>
> SERGIO> Thanks for the psychoanalysis. Really, I am not being satirical. I
> can tell you what my fears are. I found EI, a tiny detail in the immensity
> of observational science. It becomes important because so many people have
> been looking for it for so long, and because a number of major technologies
> are stuck without EI. So my top priority is to communicate my experience to
> young people while I can. Which is not a very long time. And this is my
> fear.
>
> I started three parallel efforts, in Complexity, Computer Engineering, and
> AGI. Complexity is going well. CE is going very well. AGI has been going
> nowhere. I have been in academia all my life. I am accustomed to scientists
> being very happy when a new observation is made and trying to fit it into
> their theories. Not in AGI. From some people (not all), I get the feeling
> that they would prefer not to hear about EI. I see them going to exaggerated
> extremes ("the pomposity of science", or "Friston's ideas will not make
> much of a dent in producing AGI" in the same sentence where you admit your
> wish to understand more about it). Is it perhaps possible that the fears
> you say I am projecting are actually your fears that you are projecting?
> Is it possible that your sense of self-esteem is no less exaggerated than
> mine? Maybe you are projecting a dim awareness of your own inner fear. You
> seem to believe that you (and other AGIers) will produce AGI, not Friston.
> And you jump at any suggestion I make that you may be wrong.
>
> There is a lot to be gained by trying to be objective, as much as
> possible. That's why Physics relies on observation that is reproducible and
> observer-agnostic. That's why I try to use principles and laws of nature. I
> know that people from other disciplines can't understand the laws, but the
> laws can't be ignored even if they don't understand why.
>
> JIM> nobody really knows how the brain works
>
> SERGIO> I agree. But we can still consider what is known. Are you
> ignoring that too? Just wondering.
>
> Sergio
>
> *From:* Jim Bromer [mailto:[email protected]]
> *Sent:* Sunday, August 19, 2012 8:27 PM
>
> *To:* AGI
> *Subject:* Re: [agi] Uncertainty, causality, entropy, self-organization,
> and Schroedinger's cat.
>
> Sergio: Reading your posts, it seems as if you are the only person in the
> world who knows what he is doing.
>
> Sergio, you are projecting. How am I sure of that? Contrary to your
> comment, I am the only one in this discussion who says that nobody really
> knows how the brain works. You have certainly sounded like you are
> claiming to have the underlying principles all figured out, and Adam sounds
> like he was arguing that Friston was well on his way. So the claim that I
> seem to you as if I were "the only person in the world who knows what he is
> doing" is not only an exaggeration but absurdly inaccurate. So why would
> you say it? Because it is a projection of a dim awareness of an inner
> sense about yourself that is personally too threatening to the maintenance
> of your exaggerated level of self-esteem for you to accept.
>
> I think Friston is pretty interesting just because it looks like fun. I
> wish I understood more about it. However, his ideas will not make much of
> a dent in producing artificial general intelligence. There is nothing in
> his ideas that advances the field enough, and Bayesian networks have not
> proven themselves strong enough to produce AGI. So I am not too worried
> about being proven wrong about this.
>
> Jim Bromer
>
> *AGI* | Archives <https://www.listbox.com/member/archive/303/=now> |
> Modify Your Subscription <https://www.listbox.com/member/archive/rss/303/10561250-164650b2>
>



-------------------------------------------
AGI
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/21088071-c97d2393
Modify Your Subscription: 
https://www.listbox.com/member/?member_id=21088071&id_secret=21088071-2484a968
Powered by Listbox: http://www.listbox.com
