When we've finally perfected a touchscreen that can incorporate the user's
ability to "reach in" and "pull out," I think we're in business.  One of the
most compelling things I found about the Aurora experience was the (for lack
of a better term) 'intuitive' incorporation of the 'z axis,' which the user
can apparently control with an obvious, physically expected dimensional
gesture.

Is the robo-arm-ball thing a perfect control peripheral for long-term (think
8-hour office day) use?  Probably not.  But looking past the possibly
cumbersome "mouse" and just focusing on the possibilities of the on-screen
interaction, I think this could really spawn a lot of new innovations, and
hopefully help finance the companies developing more robust touch-sensitive
interfaces.

We're already allowing access to the z axis through a series of multi-touch
gestures which are *somewhat* easily learned (unless you delve into what
will only be referenced here as the "Gesture Patent":
http://www.engadget.com/2007/08/02/apple-patent-attack-the-multi-touch-gesture-dictionary/).
 The IxD community is already running with the idea and producing some
pretty stellar apps.  Maybe I'm all star-struck and woozy as I
bow-down-not-worthy to Aurora, but I'm willing to accept that this type of
control interface is all we've got right now that allows a connection to
such an application.

Let's not whip the horse over an uncomfortable saddle.

If there are any engineering firms out there keeping up on these things:
is it possible to have some sort of touch-sensitive interface that has, say,
a 2- or 3-inch field in front of the panel and is able to differentiate
distance from the screen?  Infrared?  Laser?
Proximity and speed of hand movements in relation to the touch overlay, in
combination with pressure of touch when actually making contact with the
overlay?
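To make the question concrete, here's a rough sketch of what the software
side of such a panel might report.  Everything here is hypothetical: the
3-inch (~75 mm) hover field, the raw pressure range, and the `fuse` function
are assumptions for illustration, not any real panel's API.

```python
from dataclasses import dataclass

# Hypothetical sensor ranges -- purely illustrative, not a real panel spec.
MAX_HOVER_MM = 75.0   # ~3-inch sensing field in front of the panel
MAX_PRESSURE = 255    # raw pressure reading at full contact

@dataclass
class PointerEvent:
    x: float          # normalized 0..1 across the overlay
    y: float          # normalized 0..1 down the overlay
    z: float          # 0.0 = touching the panel, 1.0 = edge of hover field
    pressing: bool    # True once contact is made
    pressure: float   # normalized 0..1, only meaningful while pressing

def fuse(x_px, y_px, width, height, hover_mm, raw_pressure):
    """Combine a proximity reading (z) with the familiar x-y position
    and, on contact, pressure, into a single 3-D pointer event."""
    touching = hover_mm <= 0.0
    return PointerEvent(
        x=x_px / width,
        y=y_px / height,
        z=0.0 if touching else min(hover_mm / MAX_HOVER_MM, 1.0),
        pressing=touching,
        pressure=(min(raw_pressure, MAX_PRESSURE) / MAX_PRESSURE)
                 if touching else 0.0,
    )

# A hand hovering ~1 inch (25 mm) above the middle of a 1024x768 overlay:
hovering = fuse(512, 384, 1024, 768, hover_mm=25.0, raw_pressure=0)
# The same hand after making contact at roughly half pressure:
touching = fuse(512, 384, 1024, 768, hover_mm=0.0, raw_pressure=128)
```

The point of the sketch is that the z axis comes for free once the panel
can report distance at all; how speed of approach is used (e.g. to
anticipate a press) would be up to the application layer.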

I agree that we're not built to hold our arms in a forward floating
position for an 8-hour day at the office; the shoulder injuries, cramping,
and headaches would probably be a deterrent.

Another thought:  Would it be as acceptable to offer the same x-y-z control
from a flat surface, with the user's forearm parallel to the floor and the
elbow at 90 degrees, adding vertical 'z-sensitive' proximity sensors to our
already understood x-y control of a mouse?  Like jigging an ice-fishing
hook.

Thoughts?

- Shaun



On Wed, Aug 6, 2008 at 9:08 AM, ryan devenish <
[EMAIL PROTECTED]> wrote:

> re: aurora
> i'm saddened that this application relies on some crazy
> unconventional mouse that no person would ever have in their home.
> how about touchscreen considering gestures are on the way in... not
> huge industrial mouse-like controllers
> was any of that even considered for this?
>
> re: touchsmart
> well it's pretty un-impressive, so i can't imagine this non-touch
> interface would do all that well on a not-very-good touch interface.
>
>
> . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .
> Posted from the new ixda.org
> http://www.ixda.org/discuss?post=31824
>
>
> ________________________________________________________________
> Welcome to the Interaction Design Association (IxDA)!
> To post to this list ....... [EMAIL PROTECTED]
> Unsubscribe ................ http://www.ixda.org/unsubscribe
> List Guidelines ............ http://www.ixda.org/guidelines
> List Help .................. http://www.ixda.org/help
>