As I started reading I thought to myself "I told you 1000 times, it
depends on the criteria". Reading on, I saw that it is precisely the
criteria you use as a parameter. Well, I'd like to find out the
programming language that makes the most money while giving
immortality :) A little more seriously, if the criteria are cognitive,
as they often are in the real world, you'd be digging yourself a hole
too deep to get out of. On the other hand, if the criteria are
domain-specific, relating to well-behaved domains, I am afraid we are
heading towards tautologies and trivialities. Something like
Mathematica would be optimal for algebra, analysis, gravity, mechanics,
etc. (though why not measure a real parachute drop instead of
calculating one?); for economics, psychology, or necromancy most
things would do equally badly; and for AGI all options have so far
been worse than bad. Mind you, I am in the process of
defining an AGI architecture not as a compression problem but as a
distributed computation problem, and I would challenge you to answer
the question:

Which programming language/mechanism would be ideal for calculating X
as quickly as possible?

where X, for the sake of argument, is just any "heavy calculation"
without necessarily any of the anomalies of chaotic behavior, pi's
infinite series, etc. It is not that I expect intelligence to arise
out of PDEs and integrals; rather, I am asking which is the "perfect"
distributed system for calculus, and I expect your answer to take
the form of multipliers and other exotic units all converging in an
addition pipeline. I still can't help thinking that the fastest way
to do parallel computation is the actual experiment: after all, we
have the 3-/n-body problem and a ton of mathematics, OR just an
experiment with n bodies in a field.
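To make the "multipliers converging in an addition pipeline" picture concrete, here is a toy sketch in Python (the function names and worker count are purely illustrative, not any real hardware design): independent "multiplier units" each compute one product term, and the terms then converge through a pairwise, log-depth addition stage, as in a hardware dot-product pipeline.

```python
# Toy sketch: independent multiplier units feeding a tree-shaped
# addition pipeline. Purely illustrative names and structure.

from concurrent.futures import ThreadPoolExecutor
from operator import mul


def multiply_stage(xs, ys, workers=4):
    """Each 'multiplier unit' computes one product term independently."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(mul, xs, ys))


def addition_pipeline(terms):
    """Pairwise (tree) reduction: log-depth, like a pipelined adder tree."""
    while len(terms) > 1:
        paired = [terms[i] + terms[i + 1] for i in range(0, len(terms) - 1, 2)]
        if len(terms) % 2:  # an odd leftover term carries over to the next level
            paired.append(terms[-1])
        terms = paired
    return terms[0]


def dot(xs, ys):
    """Dot product = multiply stage followed by the addition pipeline."""
    return addition_pipeline(multiply_stage(xs, ys))
```

The point of the tree shape is that the additions take O(log n) steps rather than n-1 sequential ones, which is exactly the kind of structural win a representation can or cannot expose.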

With regard to a possible language for AGI, I don't see how you can
do much better than a human language. Never mind Turing completeness;
we have GI completeness here (except for that part of human language,
perhaps 100% of it, that gets its meaning from its grounding, its
grounding from its embodiment, and its embodiment from - god?).

AT

On Mon, Aug 27, 2012 at 10:44 PM, Russell Wallace
<[email protected]> wrote:
> On Mon, Aug 27, 2012 at 9:12 PM, Ben Goertzel <[email protected]> wrote:
>> For domains in which one is concerned with recognizing large ensembles
>> of weak patterns, the language one uses to represent patterns can make
>> a big difference...
>>
>> Image analysis, genetic data analysis and financial prediction are
>> contexts in which I've found this to be the case
>>
>> In these settings, if one does pattern recognition via automated
>> program learning with an Occam bias,
>> the underlying language relative to which the Occam bias is expressed
>> makes a big difference...
>
> Absolutely, but these overheads are not constants - the computational
> cost of a poor choice of representation language is typically
> exponential.
>
>> From a different direction, consider Hutter's proof that AIXI-tl is as
>> good as any other reinforcement learning system ... up to an arbitrary
>> constant.
>
> Well, much violence is being done to the word 'constant' in this case.
> Sure, f(N) is constant for a given N, but... :)
>
>
> -------------------------------------------
> AGI
> Archives: https://www.listbox.com/member/archive/303/=now
> RSS Feed: https://www.listbox.com/member/archive/rss/303/14050631-7d925eb1
> Modify Your Subscription: https://www.listbox.com/member/?&;
> Powered by Listbox: http://www.listbox.com
