Hi everyone,

Just as an addendum to this conversation, I was "meditating" of sorts
about some game theory and the controls of a game app which could be
ported to a GPS/navigation application. Here goes:

1. 3D positional audio used on two vectors of sound propagation.

I will give an example of an RTS game for this concept.

Since I think relating the true distance of a sound from the back or
the front could be quite daunting, or at the least have a high learning
curve, I was thinking that spatial references could be laid out in a
bird's-eye-view perspective, i.e. front and back become top and bottom,
while left and right stay the same.
If I imagined this as a physical boundary, the x and y axes would be
"flattened" onto the y and z axes. So for an object at one's back, at
the limit of the set spatial boundary, the sound would be heard at the
bottom, around the chin area of the wearer, whilst the opposite would
be heard at the top of the forehead. Left and right would retain their
relative positions, but the difference would be felt when an object is
ahead in a diagonal position: that would translate to a top/left audio
cue, and so forth.
I think of this flattening of sound this way because, most of the time
when we are walking around, we take much less notice of height: unless
we are as tall as basketball players, we don't often have to crouch to
walk somewhere.
So, going back to the RTS game: if we have this flat, two-axis
representation of space over a map, the channel which would be the
back can be used for, well... background or ambient sounds. Cues would
be relevant to such things as the type of terrain one is in, etc., and
the front audio, within a central boundary, would represent unit
responses and general game command voice responses.
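The flattening above can be sketched in a few lines of code. This is a minimal, hypothetical illustration of the idea, not tied to any real audio API: the function name, parameters, and the -1.0 to 1.0 panning range are all my own assumptions for the example.

```python
# Sketch of the "flattened" spatial mapping described above: a bird's-eye
# (x, y) offset on the map becomes a (left/right, up/down) audio cue, so
# front/back distance is heard as forehead/chin height instead of depth.
# All names and the panning range are hypothetical, for illustration only.

def flatten_position(x, y, boundary):
    """Clamp a map offset to +/-boundary and normalize it to -1.0 .. 1.0.

    x: left (-) / right (+) offset -- this axis is kept as-is.
    y: back (-) / front (+) offset -- remapped to bottom (chin) / top
       (forehead) of the listener.
    """
    clamp = lambda v: max(-boundary, min(boundary, v))
    left_right = clamp(x) / boundary   # unchanged axis
    up_down = clamp(y) / boundary      # front -> forehead, back -> chin
    return (left_right, up_down)

# An object behind and to the left, at the edge of the boundary,
# becomes a bottom-left audio cue:
print(flatten_position(-10.0, -10.0, 10.0))  # -> (-1.0, -1.0)

# An object ahead on a diagonal becomes a top/right cue:
print(flatten_position(5.0, 10.0, 10.0))     # -> (0.5, 1.0)
```

The two normalized values could then drive whatever panning and elevation controls the eventual audio engine exposes.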

2. Controls and gestures.
Bringing the RTS element to the table, the main advantage posed for
controls which can be easily accessed without cluttering the actual
space of the screen would be using the rotor for various menus and
functions. For instance:
1-finger drag = feel around the game space, with audio cues on hitting
an object or contextual item.
2-finger drag around the screen = displace the map in the opposite
direction of the fingers being dragged, to explore more of the map.
3-finger drag = faster scroll of the map area.
2-finger rotor, once tapped on an object = choice of actions: patrol,
attack, build, etc. Each instance of a double-tap would go one level
deeper into the contextual menu, for choosing, for example, which
building to build and the like.
3-finger rotor = other behavioral actions or "special" abilities.
1-finger flick, once double-tapped on an object/soldier = selecting
the next or previous unit.
2-finger flick = selection of the next building.

and the list goes on.
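A scheme like this maps naturally onto a dispatch table keyed on finger count and gesture type. The sketch below is purely illustrative; the table entries paraphrase the list above, and every name is invented, since the real implementation would hook into whatever touch and rotor callbacks the platform SDK provides.

```python
# Hypothetical gesture dispatch table for the control scheme above.
# Each (finger_count, gesture_type) pair resolves to a game action;
# all names are invented for illustration.

GESTURE_ACTIONS = {
    (1, "drag"):  "explore map with audio cues",
    (2, "drag"):  "pan map opposite to drag direction",
    (3, "drag"):  "fast scroll of map area",
    (2, "rotor"): "contextual action menu (patrol, attack, build, ...)",
    (3, "rotor"): "special abilities menu",
    (1, "flick"): "select next/previous unit",
    (2, "flick"): "select next building",
}

def handle_gesture(fingers, gesture):
    """Look up the action for a gesture; unknown combos fall through."""
    return GESTURE_ACTIONS.get((fingers, gesture), "unrecognized gesture")

print(handle_gesture(2, "rotor"))
# -> contextual action menu (patrol, attack, build, ...)
```

One nice property of a table like this is that the whole control scheme stays in one place, so adding or remapping a gesture later doesn't touch the rest of the game logic.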

I will continue to add to this as I think through the possibilities,
sort of a diary to myself as well, so I can gather all the notes for
eventually starting the programming part once I get my hands on that
SDK :)

Hope I'm not a total bore...

--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups 
"MacVisionaries" group.
To post to this group, send email to macvisionaries@googlegroups.com
To unsubscribe from this group, send email to 
macvisionaries+unsubscr...@googlegroups.com
For more options, visit this group at 
http://groups.google.com/group/macvisionaries?hl=en
-~----------~----~----~----~------~----~------~--~---