Hi,

On Mon, May 14, 2007 4:57 pm, David Clark wrote:
> Some people take Mathematics and their so called "proofs" as the gospel
> when it comes to programming and AGI.  Even though I have a Math minor
> from University, I have used next to no Mathematics in my 30 year
> programming/design career.  I have never been impressed by complicated
> formulas and I have met many slick (Math) talking people who couldn't
> produce anything that worked in the real world.
I don't think the question is whether the program itself should rely on
math or not. The point is that math and computing are deeply linked, and
math theory does say that some things are possible and some are not.

It's a fact that a computer with 1GB of memory will have a lot of trouble
simulating another computer with 1GB of memory. Maybe a computer with
1GB + 1 bit could do the job of simulating that same 1GB computer, but I
imagine it would still need a little more than that. For other reasons -
but still math related - you'll never find an algorithm which compresses
*every* file. Of course you'll find an algorithm which compresses most
"real world" files (we use them every day), but there's a slight
difference between "all" files and "most" files.
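
To make the compression point concrete, here is a quick counting sketch
(my own illustration, not anything from David's mail): there are 2^n bit
strings of length n but only 2^n - 1 strings that are strictly shorter,
so a lossless compressor cannot shrink every length-n input.

# Pigeonhole argument behind "no algorithm compresses *every* file":
# there are 2**n bit strings of length n, but only 2**n - 1 strings of
# length strictly less than n, so at least two length-n inputs would have
# to share the same shorter output, and decompression would break.

def count_strings_of_length(n):
    return 2 ** n

def count_strings_shorter_than(n):
    # 2**0 + 2**1 + ... + 2**(n-1) == 2**n - 1
    return 2 ** n - 1

for n in (1, 8, 16):
    print(n, count_strings_of_length(n), count_strings_shorter_than(n))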

Of course real world programs that do useful stuff are what we need, but
when theory says "this kind of program / computer / foo / bar" cannot be
built, it's wise to pay attention. I'm happy Gödel discovered that some
questions cannot be answered, or else I might still be searching for the
answer.

Now if a computer with 1,1GB can simulate a computer with 1GB this is
probably, from your "real world" point of view, pretty much enough, since
someone capable of building a 1GB computer can probably build a 1,1GB
computer for that purpose. Scaling being much easier on computers than on
a human brain, this property might by itself justify the interest in AGI.
But a computer with 1,1GB is not the same than a computer with 1GB. Or
else I could proove you no matter how much memory you put in a computer,
they are all the same. The conclusion is that a real world computer, as we
know them today (Turing machines) can't simulate "itself". But it can
simulate something that is so close to "itself" that in most cases it
won't make any difference. "Most". Not "all".
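
To illustrate the overhead point (again my own sketch, nothing David or I
actually measured): a naive emulator has to hold the guest's entire
memory plus its own bookkeeping, so the host always needs strictly more
memory than the machine it simulates.

# Toy emulator sketch: the host stores the guest's RAM *plus* its own
# bookkeeping (emulated registers, program counter, ...), so simulating a
# machine with N bytes of memory always costs a bit more than N bytes.

class ToyEmulator:
    def __init__(self, guest_ram_bytes):
        self.ram = bytearray(guest_ram_bytes)  # the guest's entire memory
        self.registers = [0] * 16              # overhead: emulated CPU state
        self.program_counter = 0               # more overhead

# Small guest for the example; a 1GB guest would push the host above 1GB.
emu = ToyEmulator(1024)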

I suspect you consider "math related" only the activities linked to
formulas, calculus, probability, numbers, and such things. But knowing
that ((not a) and (not b)) is equivalent to (not (a or b)) is just plain
math too. And that's the kind of code you find everywhere: in web
servers, arcade games, cryptographic algorithms, regexp engines, well,
anywhere.
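
For what it's worth, that equivalence (De Morgan's law) is easy to check
exhaustively; a tiny sketch of my own:

# Exhaustive check of De Morgan's law: (not a) and (not b) == not (a or b)
for a in (False, True):
    for b in (False, True):
        assert ((not a) and (not b)) == (not (a or b))
print("De Morgan's law holds for all boolean inputs")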

Of course, we can debate what "math" is, but IMHO most computer-related
concepts and skills are derived from math.

Have a nice day,

Christian.

-- 
Christian Mauduit <[EMAIL PROTECTED]>     __/\__ ___
                                        \~/ ~/(`_ \   ___
http://www.ufoot.org/                   /_o _\   \ \_/ _ \_
http://www.ufoot.org/ufoot.pub (GnuPG)    \/      \___/ \__)
