On Thu, Oct 2, 2014 at 4:42 PM, Bruno Marchal <[email protected]> wrote:

>
> On 02 Oct 2014, at 14:41, Telmo Menezes wrote:
>
>
>
> On Thu, Oct 2, 2014 at 3:25 AM, meekerdb <[email protected]> wrote:
>
>>
>>
>>> This is why I find protein folding intriguing. I see the following
>>> possibilities:
>>>
>>> -> Molecular interactions entail an immense computational power;
>>> -> P = NP;
>>> -> We are constantly winning at quantum suicide.
>>> Am I missing something?
>>>
>>
>> P=/=NP doesn't mean that NP problems require "immense computational
>> power" beyond what biochemistry can provide.  Being NP is just a
>> statement about how a problem scales with the size of the input.  But for
>> some given finite size it might be quickly solved.
>>
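Brent's scaling point can be made concrete with a toy sketch (a hypothetical example, not anything from the thread): a brute-force solver for subset sum, an NP-complete problem, takes time exponential in the input size, yet a small instance is dispatched almost instantly.

```python
# Brute-force subset sum: exponential in len(nums), since it may try
# all 2^n subsets -- that scaling is what NP-hardness is about.
# Small instances are nonetheless solved in well under a second.
from itertools import combinations

def subset_sum(nums, target):
    """Return a tuple of elements of nums summing to target, or None."""
    for r in range(len(nums) + 1):
        for combo in combinations(nums, r):
            if sum(combo) == target:
                return combo
    return None

# A 20-element instance has ~1 million subsets, yet this finishes fast.
print(subset_sum([3, 34, 4, 12, 5, 2, 7, 8, 1, 9,
                  11, 13, 6, 10, 14, 15, 16, 17, 18, 19], 50))
```

The hardness only bites as the instance grows: each added element doubles the worst-case search space.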
>
> Right, but with protein folding we have an interesting comparison. Given
> the practical applications, many bioinformatics labs are working on the
> problem. Finding the lowest energy state for many interesting proteins
> seems to be beyond the capabilities of very powerful clusters using current
> algorithms.
>
> I agree with you that the most likely answer is that biochemistry can
> indeed provide this computational power, but this is not an observation
> devoid of interesting implications. One is technological: imagine what we
> could do with biochemical computers. The other is related to the brain: if
> biochemistry can unleash this computational power, then surely the brain
> uses it. In which case our estimates of how far away we are from AGI may
> be optimistic.
>
>
> But what is an AGI? A machine which decapitates other machines so it can
> run another program?
>

We already have something a bit like this with Tom Ray's Tierra
environment. A lot of interesting things happen there, like parasitism and
viruses, all in a machine-code ecology.

In any case, I mean AGI in the engineering sense: the sort of technology
that can free us from unwanted work. Or, put another way, a level of
automation where no task requires a human, so that humans are free to
pursue poetry, space exploration, sex orgies or whatever else they please.


>
> I think that Löbian universal machines are already intelligent, and that
> we can accelerate the growth of competence, although this is ultimately
> what intelligence is for on this plane, but this might limit their
> intelligence, and indeed they might become as delusional and 'stupid' as
> us and get sleepy in the samsara.
>

I like your intelligence/competence dichotomy. It's usually artists who
are more aware of this. Our culture is not really paying attention at the
moment; the momentum is too strong in the competence direction, I think.

Telmo.


>
>
>
>
> Sorry for the tangents in any case.
>
>
> Sorry for mine too.
>
> Best,
>
> Bruno
>
>
>
> Cheers
> Telmo.
>
>
>>
>> Brent
>>
>>
>> --
>> You received this message because you are subscribed to the Google Groups
>> "Everything List" group.
>> To unsubscribe from this group and stop receiving emails from it, send an
>> email to [email protected].
>> To post to this group, send email to [email protected].
>> Visit this group at http://groups.google.com/group/everything-list.
>> For more options, visit https://groups.google.com/d/optout.
>>
>
>
>
>
> http://iridia.ulb.ac.be/~marchal/
>
>
>
>  --
> You received this message because you are subscribed to the Google Groups
> "Everything List" group.
> To unsubscribe from this group and stop receiving emails from it, send an
> email to [email protected].
> To post to this group, send email to [email protected].
> Visit this group at http://groups.google.com/group/everything-list.
> For more options, visit https://groups.google.com/d/optout.
>
