I tried the robot test in my kitchen to see if it could make me a coffee or a hotdog, or go to another room. It thinks the exit of the kitchen might be there, maybe, and sometimes it didn't mention the exit at all even though it is clearly there. It was still incredible, though: it noticed it went way too close to the three blinds, it saw the three stone decorations hanging on the wall and a vase on a wall shelf with white tiles, and when it reached the kitchen island with a laptop and two chairs around it, it said, in effect, "ah, here is the kitchen finally" and described it like that, just in proper grammar obviously.

It can see like us and talk about it like us, but to put it simply, its main issue is that it thinks its hand (mine, in this case) is already on the fridge handle when it really isn't yet, or it gets simple things like that wrong. That fails the test, because it cannot make a hotdog or anything else if it has trouble like this. GPT-5 should be human level, robot ready.

But I think they have watered down GPT-4 vision now.

It also seemed to wander randomly through my kitchen, giving estimates in feet for its location and rotation. Maybe if it had a plan, or the rest of the images in a collage, it would better know how and where to move.