Usually this is solved by having ray-object intersection code for
all of your geometry.  If your 3D objects are all polygon meshes,
then you need ray-polygon intersection code.  You generate a ray
from the eye point through the screen point that the user is
touching, then test that ray against all of your geometry to find
the closest hit.  That tells you which object was hit, as well as
where on the object.
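
For example, on Android you can unproject the touch point on the
near and far clip planes and take the difference to get the ray.
Here is a minimal sketch using android.opengl.Matrix; it assumes
you keep CPU-side copies of your modelview and projection matrices
(OpenGL ES won't read them back for you) and that viewport is
{0, 0, width, height}:

    import android.opengl.Matrix;

    // Build a world-space pick ray from a touch point.
    static void pickRay(float touchX, float touchY,
                        float[] modelview, float[] projection,
                        int[] viewport,
                        float[] rayOrigin, float[] rayDir) {
        float[] pm = new float[16];
        float[] inv = new float[16];
        Matrix.multiplyMM(pm, 0, projection, 0, modelview, 0);
        Matrix.invertM(inv, 0, pm, 0);

        // Touch Y grows downward, GL window Y grows upward.
        float ndcX = 2f * (touchX - viewport[0]) / viewport[2] - 1f;
        float ndcY = 1f - 2f * (touchY - viewport[1]) / viewport[3];

        // Unproject points on the near (z=-1) and far (z=+1) planes.
        float[] near = {ndcX, ndcY, -1f, 1f};
        float[] far  = {ndcX, ndcY,  1f, 1f};
        float[] nearW = new float[4];
        float[] farW  = new float[4];
        Matrix.multiplyMV(nearW, 0, inv, 0, near, 0);
        Matrix.multiplyMV(farW,  0, inv, 0, far,  0);

        // Perspective divide to get actual 3D points.
        for (int i = 0; i < 3; i++) {
            nearW[i] /= nearW[3];
            farW[i]  /= farW[3];
        }

        float dx = farW[0] - nearW[0];
        float dy = farW[1] - nearW[1];
        float dz = farW[2] - nearW[2];
        float len = (float) Math.sqrt(dx*dx + dy*dy + dz*dz);

        rayOrigin[0] = nearW[0];
        rayOrigin[1] = nearW[1];
        rayOrigin[2] = nearW[2];
        rayDir[0] = dx / len;
        rayDir[1] = dy / len;
        rayDir[2] = dz / len;
    }
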
Of course, if your scene has many objects with lots of polygons to
test, you will want to use some sort of spatial data structure to
speed up the hit testing.  However, since this only needs to run at
"human speeds" (once per touch, not once per frame), you can get
away with doing a lot of hit tests before you have to resort to
acceleration structures.  One simple optimization is to hit test
the objects' bounding volumes first (spheres are particularly
easy).  That lets you skip the individual polygons of any object
whose bounding volume the ray misses.
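
The ray-sphere test reduces to solving a quadratic.  A minimal
sketch, assuming rayDir is normalized:

    // Distance along the ray to the nearest intersection with
    // the sphere, or -1 if the ray misses it entirely.
    static float raySphere(float[] rayOrigin, float[] rayDir,
                           float[] center, float radius) {
        float ox = rayOrigin[0] - center[0];
        float oy = rayOrigin[1] - center[1];
        float oz = rayOrigin[2] - center[2];
        // t^2 + 2bt + c = 0 (a = 1 since rayDir is unit length)
        float b = ox*rayDir[0] + oy*rayDir[1] + oz*rayDir[2];
        float c = ox*ox + oy*oy + oz*oz - radius*radius;
        float disc = b*b - c;
        if (disc < 0f) return -1f;       // ray misses the sphere
        float sqrtD = (float) Math.sqrt(disc);
        float t = -b - sqrtD;            // nearer of the two roots
        if (t < 0f) t = -b + sqrtD;      // ray starts inside
        return (t < 0f) ? -1f : t;
    }
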
If your objects have scripts that need to receive touch events, you
can pass the event along to the object once you've determined which
one was hit.
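
Putting the pieces together, the per-touch loop might look
something like the sketch below.  GameObject, its bounding-sphere
fields, hitTestPolygons(), and onTouched() are hypothetical names
standing in for whatever your scene actually uses:

    import java.util.List;

    // Find the closest object under the touch and deliver the
    // event to it.  Returns null if the ray hits nothing.
    static GameObject pick(List<GameObject> scene,
                           float[] rayOrigin, float[] rayDir) {
        GameObject closest = null;
        float closestT = Float.MAX_VALUE;
        for (GameObject obj : scene) {
            // Cheap rejection: skip objects whose bounding
            // sphere the ray misses.
            if (raySphere(rayOrigin, rayDir, obj.boundingCenter,
                          obj.boundingRadius) < 0f)
                continue;
            // Exact test against the object's own polygons.
            float t = obj.hitTestPolygons(rayOrigin, rayDir);
            if (t >= 0f && t < closestT) {
                closestT = t;
                closest = obj;
            }
        }
        if (closest != null)
            closest.onTouched(rayOrigin, rayDir, closestT);
        return closest;
    }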

    -Anton

On May 31, 2:24 am, quill <[email protected]> wrote:
> Hi all,
> I am writing a game using OpenGL; there are several 3D objects in my
> SurfaceView.
> My question is:
> Should each 3D object implement View in order to get touch events?
> Or is there another way to do this?
>
> Thank you!