Interestingly, this sort of thing could be massively helpful for AI research as
well. A metaverse-sized dataset of objects with reasonably accurate metadata
tags would be a resource with a huge impact on that space.
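Just to make the idea concrete, here's a minimal sketch of what per-object accessibility metadata might look like. All the field names here are invented for illustration; no existing realXtend or Second Life schema is implied.

```python
# Hypothetical per-object metadata for a virtual world.
# Field names are illustrative only -- not an existing standard.

from dataclasses import dataclass, field

@dataclass
class ObjectMetadata:
    label: str             # short human-readable name, e.g. "wooden bench"
    object_type: str       # coarse category: "furniture", "door", "avatar", ...
    is_obstacle: bool      # should a navigation aid warn about this object?
    description: str = ""  # longer text a screen reader could speak
    tags: list = field(default_factory=list)

bench = ObjectMetadata(
    label="wooden bench",
    object_type="furniture",
    is_obstacle=True,
    description="A two-seat bench beside the garden path.",
    tags=["seating", "outdoor"],
)

# A client-side aid (or a dataset builder) could then filter a scene
# for objects that matter to a blind user:
scene = [bench]
obstacle_labels = [o.label for o in scene if o.is_obstacle]
```

The same records that drive an in-world assistive attachment would double as training labels for the AI-research use case above.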

Just a thought..

Daniel B. Miller
aka danx0r
life is a simulation


--- On Thu, 9/18/08, Peter_Quirk <[EMAIL PROTECTED]> wrote:

> From: Peter_Quirk <[EMAIL PROTECTED]>
> Subject: Accessibility for the blind
> To: "realXtend" <[email protected]>
> Date: Thursday, September 18, 2008, 9:27 AM
> I've always thought that virtual worlds presented great difficulties
> for the blind. This post
> (http://coolcatteacher.blogspot.com/2008/09/how-can-second-life-be-used-by-visually.html)
> changed my mind. The idea of attachments that report on obstacles and
> content, combined with spatial sound and perhaps some special haptics,
> could make virtual worlds even more navigable than the web for the
> blind.
> 
> We owe it to that community to work on some standards for consistent
> labeling and typing of objects, plus 3D equivalents of the web tools
> for checking accessibility. I'll see if I can get one of the people
> behind the mobility cane involved in this group.
> 

http://groups.google.com/group/realxtend