You can get depth information from single-camera motion (e.g. Andrew Davison's MonoSLAM), but this requires an initial size calibration and continuous tracking.  If the tracking is lost at any time you need to recalibrate, which makes single-camera systems less practical.  With a stereo camera, the known baseline provides the scale calibration.
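
To make the baseline point concrete, here's a minimal sketch of the standard depth-from-disparity relation Z = f * B / d.  The focal length, baseline and disparity values are made up for illustration, not taken from any particular camera:

# Minimal sketch: depth from stereo disparity, Z = f * B / d.
# All numbers below are illustrative placeholders.

def stereo_depth(focal_length_px, baseline_m, disparity_px):
    """Depth in metres of a point seen with the given pixel disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

if __name__ == "__main__":
    f = 700.0   # focal length in pixels (assumed)
    B = 0.12    # baseline in metres (assumed)
    for d in (5.0, 20.0, 80.0):
        print("disparity %5.1f px -> depth %.2f m" % (d, stereo_depth(f, B, d)))

The point is simply that with a fixed, known baseline the metric scale falls out of the geometry, whereas a single moving camera only recovers depth up to an unknown scale factor.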

My inside sources tell me that there's little or no software development going on at Evolution Robotics, and that longstanding issues and bugs remain unfixed.  They did license their stuff to WowWee, and also to Whitebox Robotics, so it's likely we'll see more SIFT-enabled robots in the not too distant future.  I think it was also licensed to Sony for use on the AIBO, before Sony axed its robotics products.

Grinding my own axe, I also think that stereo vision systems will bring significant improvements to robotics over the next few years.  Building videogame-like 3D models of the environment in real time is now a feasible proposition, and I think it will happen before the decade is out.  With a good model of the environment the robot can rehearse possible scenarios before actually running them, and find important features such as desk or table surfaces (roughly sketched below).
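
To show what I mean by finding table surfaces, here's a very rough sketch (my own illustration, not anyone's shipping code) of picking out a dominant horizontal plane from a stereo-derived point cloud with a simple RANSAC fit.  It assumes numpy, coordinates in metres, and a z-up convention:

# Rough sketch: find a dominant near-horizontal plane (e.g. a table top)
# in an Nx3 point cloud via a simple RANSAC fit.  Assumes z is "up".
import numpy as np

def fit_horizontal_plane(points, iters=200, tol=0.02, seed=None):
    """Return (plane_height_m, inlier_mask) for the best horizontal plane found."""
    rng = np.random.default_rng(seed)
    best_inliers, best_count = None, 0
    for _ in range(iters):
        sample = points[rng.choice(len(points), 3, replace=False)]
        # Plane normal from the three sampled points.
        n = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(n)
        if norm < 1e-9:
            continue
        n /= norm
        # Only accept planes whose normal is close to vertical.
        if abs(n[2]) < 0.95:
            continue
        dist = np.abs((points - sample[0]) @ n)
        inliers = dist < tol
        if inliers.sum() > best_count:
            best_count, best_inliers = int(inliers.sum()), inliers
    if best_inliers is None:
        return None, None
    return float(points[best_inliers][:, 2].mean()), best_inliers

# Usage (points = stereo depth map back-projected into 3D, in metres):
#   height, mask = fit_horizontal_plane(points)

Nothing clever, but once you have metric 3D points from stereo, this sort of cheap geometric reasoning about work surfaces becomes straightforward.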

On 23/10/06, Neil H. <[EMAIL PROTECTED]> wrote:
Is a stereo camera system really necessary if you can move the camera
around to get shape-from-motion?

Last I heard Evolution was partnering up with WowWee, to load their
software onto the next generation of their RoboSapien toy:
http://www.pcmag.com/article2/0,1895,1941233,00.asp

I find a SIFT-equipped robot in the $200 range to be quite exciting.
Hopefully the new generation will be as hack-friendly as the older
generations.

-- Neil
