Logan: have you ever programmed a robot? You have to measure the distance to 
the wall so you don't walk into it



Logan,

Think carefully about your assumptions here.

You’re assuming that a robot must be programmed as robots have always been.

And if a normal robot is programmed to walk to a given goal, the programmer may 
indeed measure or plot the distance and route to the goal.

That is the normal practice. And reasonable practice.

IF you want to keep producing NARROW AI robots.

You are actually basing everything on **narrow AI** assumptions (just as Ben’s and Jim’s concurrent thread is).

But we want an AGI ROBOT that can conduct activities as animals and humans do – that can walk down a field or street just as YOU do – something that no robot has ever done before.

Now consider how you actually walk down a new field or a new street.

Do you first “measure the distance to the end of the field/street”?   
**Before** you walk down the field?

That’s physically impossible, isn’t it? (In a normal situation.)

And in a sense it’s physically impossible for a narrow AI robot too. It wasn’t actually the robot that measured the distance to the wall or goal – it was the PROGRAMMER.

AGI is about creating courses of action – new courses of action – walking down a new field of whatever description, physical or metaphorical, that *can’t* be measured or plotted in advance.

And for that, maths/measurement simply doesn’t apply – at least not in any necessary way. Programs without maths are not only possible, they are essential here.

Any program here can only, essentially, tell the robot to head for the goal, put one foot in front of the other, and hope for the best. Because you can’t know for sure what lies ahead in a new field – let alone measure it or the steps that must be taken.
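That “head for the goal, put one foot in front of the other, and hope for the best” program can be sketched in a few lines – a purely reactive walker that never knows the length of the field in advance, only what it senses one step ahead (the 1-D field and its cell labels are invented for illustration):

```python
# A reactive walker: it never measures the total distance to the goal
# in advance; it only ever senses the next cell and reacts to it.
def walk_to_goal(field):
    """Walk a 1-D field one step at a time.

    `field` is a list of cells encountered in order: 'open',
    'obstacle', or 'goal'. Returns (steps_taken, outcome).
    """
    steps = 0
    for cell in field:              # the robot only sees what's directly ahead
        if cell == 'obstacle':
            return steps, 'blocked'  # "hope for the best" didn't pan out
        steps += 1
        if cell == 'goal':
            return steps, 'arrived'
    return steps, 'field ended'

# A "new field" nobody plotted in advance:
print(walk_to_goal(['open', 'open', 'open', 'goal']))   # -> (4, 'arrived')
print(walk_to_goal(['open', 'obstacle']))               # -> (1, 'blocked')
```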

Narrow AI is about getting machines to take old journeys in old fields, that the programmer has already taken on behalf of the machine – before it moves a metal muscle – and that the programmer knows exactly how to take.

AGI is about getting machines to take new journeys in new fields, that robot and programmer alike *haven’t* already taken – and *don’t know exactly how to take* (or measure).

Nobody in AGI gets the distinction. 


  From: Logan Streondj 
Sent: Tuesday, January 01, 2013 7:45 PM
To: AGI 
Subject: Re: [agi] Why Logic & Maths Have Sweet FA to do with Real world 
reasoning





On Tue, Jan 1, 2013 at 5:15 AM, Mike Tintner <[email protected]> wrote:

  Logan: I simply said that math was necessary for programming to work

  Really? You are saying that a robot can’t take steps to a goal – walk across 
a room or field – without some kind of counting or numbers being involved? 

Certainly! Have you ever programmed a robot?


You have to measure the distance to the wall so you don't walk into it. Also, assuming it has legs, it has to calculate step length so it doesn't exceed the amount of space available. Sure, when you walk you don't explicitly count it in mm or whatever, but you do implicitly, based on measuring the amount of visible space; much of this is of course done by lower brain regions which are out of the way of conscious thinking.
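The wall-and-step arithmetic described here can be sketched roughly like this (the sensor reading, safety margin, and stride limit are all invented for illustration):

```python
# Given a (hypothetical) range-sensor reading, pick a step length
# that cannot carry the robot into the wall.
SAFETY_MARGIN_MM = 50    # stop this far short of the wall
MAX_STEP_MM = 300        # the robot's longest comfortable stride

def next_step_length(distance_to_wall_mm):
    """Return a step length (mm) that won't overshoot the wall."""
    usable = distance_to_wall_mm - SAFETY_MARGIN_MM
    return max(0, min(MAX_STEP_MM, usable))

print(next_step_length(1000))  # -> 300 (plenty of room: full stride)
print(next_step_length(200))   # -> 150 (shorten the step)
print(next_step_length(40))    # -> 0   (too close: don't step at all)
```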


 
  That – wh. is more or less what David talks about -  a robot “taking steps to 
a goal” – is a good v. general way to think about both the final function of 
programming and AGI. Why do those steps have to involve maths?  

Even version increments involve counting. I use a hexadecimal increment system in my roadmap. Git uses SHA hashes for versioning, which is a more complicated numbering system that uses more advanced math functionality.
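Both numbering systems mentioned can be sketched briefly – a hexadecimal version increment, and a Git-style SHA-1 content hash (the version-string format here is invented for illustration):

```python
import hashlib

def bump_hex_version(version_hex):
    """Increment a hexadecimal version string, e.g. '0x1f' -> '0x20'."""
    return hex(int(version_hex, 16) + 1)

print(bump_hex_version('0x1f'))   # -> 0x20

# Git names objects by the SHA-1 hash of their content; a file ("blob")
# is hashed with a "blob <size>\0" header prepended to the bytes:
content = b"hello\n"
blob = b"blob %d\x00" % len(content) + content
print(hashlib.sha1(blob).hexdigest())   # 40 hex digits identifying the content
```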


  (There does have to be some sense of quantities – for example, of putting 
more or less effort into those steps – but again why does that quantitative 
sense have to be precisely mathematical rather than crudely emotional? 

Emotions are for making the actual decisions, whereas math helps quantify the 
options, allowing for smarter decisions, which may lead to more positive 
emotions.


  When you do pressups,  do you think your system is performing mathematical 
calculations of effort – or is your sense of pain rather something very crudely 
and imperfectly fluidly quantitative? After all, your system doesn’t actually 
know its precise limits – how can they be quantified?)

Sure they can be quantified, with kg and things like that. A healthy vertebrate can on average safely lift and carry about 25% of its body weight for prolonged periods of time.

Though potentially 100% or more for short intervals.


If getting groceries from the store, I often at least make rough calculations of how many kg I'm getting, as I carry the food in my backpack, and if I'm walking it could be half an hour of carrying or more. It can be very grueling to carry too much, so I like to be able to estimate in kg and know how much is safe.
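That grocery arithmetic is simple enough to write down (the body weight and per-bag weights are invented; the 25% figure is the one quoted above):

```python
# Rough safe-load estimate: ~25% of body weight for prolonged carrying.
BODY_WEIGHT_KG = 70
SAFE_FRACTION = 0.25

safe_load_kg = BODY_WEIGHT_KG * SAFE_FRACTION
groceries_kg = [2.0, 1.5, 4.0, 5.5, 3.0]   # rough per-item estimates
total_kg = sum(groceries_kg)

print(safe_load_kg)              # -> 17.5
print(total_kg)                  # -> 16.0
print(total_kg <= safe_load_kg)  # -> True (fine for the half-hour walk)
```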






-------------------------------------------
AGI
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/21088071-f452e424
Modify Your Subscription: 
https://www.listbox.com/member/?member_id=21088071&id_secret=21088071-58d57657
Powered by Listbox: http://www.listbox.com
