I rewrote an export script for Blender that gives you your
object's vertices and texture coordinates (from a UV unwrap) as a
float array. You can find it here: http://halmi.sk/uploads/android.zip
Unzip it and put it into Blender's scripts folder. Then select your
object, and from the menu select
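On the Android side, a float array like the one the script exports typically has to be wrapped in a direct, native-order buffer before it can be handed to OpenGL ES calls such as glVertexPointer or glTexCoordPointer. A minimal sketch of that step follows; the helper name makeFloatBuffer is my own, not part of the script:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

public class MeshData {
    // Wrap an exported float array in a direct, native-order buffer,
    // which is the form glVertexPointer/glTexCoordPointer expect.
    public static FloatBuffer makeFloatBuffer(float[] data) {
        FloatBuffer fb = ByteBuffer.allocateDirect(data.length * 4) // 4 bytes per float
                .order(ByteOrder.nativeOrder())
                .asFloatBuffer();
        fb.put(data);
        fb.position(0); // rewind so GL reads from the start
        return fb;
    }
}
```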
1) Initially I won't need it, but how do you handle touch
interactions with a given 3D object on screen? I've heard of doing
things like mapping screen regions to objects, so that when you touch
region x, you are touching object x. Is there an easier way?
Android uses OpenGL ES. See
Hi,
let me start by answering your questions:
1) What you refer to is usually called picking, and it involves a bit
of math. Your initial goal is to get a ray (defined by a starting
point and a unit-length direction) from your touch coordinates. This
can be done via GLU.gluUnProject.
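For illustration, here is a self-contained sketch of the math GLU.gluUnProject performs, used to build a picking ray from a touch point. It assumes the caller already has the inverse of projection * modelview as a column-major float[16] (on Android this could come from android.opengl.Matrix.multiplyMM followed by invertM); the names rayFromTouch, invVP, and mulMV are mine:

```java
public class Picking {
    // Multiply a 4x4 column-major matrix by a 4-vector (m * v).
    static float[] mulMV(float[] m, float[] v) {
        float[] r = new float[4];
        for (int i = 0; i < 4; i++) {
            r[i] = m[i] * v[0] + m[4 + i] * v[1] + m[8 + i] * v[2] + m[12 + i] * v[3];
        }
        return r;
    }

    // Map window coordinates (winZ in [0,1]) back to world space, given
    // the inverse of projection*modelview and the viewport {x, y, w, h}.
    static float[] unProject(float winX, float winY, float winZ,
                             float[] invVP, int[] viewport) {
        float[] ndc = {
            2f * (winX - viewport[0]) / viewport[2] - 1f,
            2f * (winY - viewport[1]) / viewport[3] - 1f,
            2f * winZ - 1f,
            1f
        };
        float[] w = mulMV(invVP, ndc);
        return new float[]{ w[0] / w[3], w[1] / w[3], w[2] / w[3] }; // perspective divide
    }

    // Build a picking ray: origin on the near plane, unit direction toward far plane.
    // Returns { origin, direction }.
    public static float[][] rayFromTouch(float touchX, float touchY,
                                         float[] invVP, int[] viewport) {
        // OpenGL's window Y runs bottom-up; Android touch Y runs top-down.
        float winY = viewport[3] - touchY;
        float[] near = unProject(touchX, winY, 0f, invVP, viewport);
        float[] far  = unProject(touchX, winY, 1f, invVP, viewport);
        float dx = far[0] - near[0], dy = far[1] - near[1], dz = far[2] - near[2];
        float len = (float) Math.sqrt(dx * dx + dy * dy + dz * dz);
        return new float[][]{ near, { dx / len, dy / len, dz / len } };
    }
}
```

In real code you would call GLU.gluUnProject twice (at winZ = 0 and winZ = 1) instead of unProject; the sketch just makes the underlying steps visible.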
Ronnyek,
There are a few methods of picking a 3D object from a 2D coordinate.
The ones I know of are (in order of popularity):
1) gluUnProject to get a ray, then use a collision-detection
algorithm to check it against your object's bounds
2) Use the color/depth buffer
-- This involves a special
This is exactly the sort of discussion I was hoping to have.
I didn't actually expect to find all the tools working for me out of
the box, but rather hoped to piece together bits that anyone
could use to help expedite it.
I appreciate your time... and if I come up with some tools that help