I had posted this query before, but I was unable to solve my problem.

Here's exactly what I'm trying to do:

When the program is run, e.g.:

    java BasicTerrain1_2 2002.jpg

it reads 2002.jpg (pictures larger than around 40x40 seem to cause an
overflow, but I'm not worried about that right now) and interprets it as a
contour map: the intensity at a point in the bitmap is proportional to the
height.
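
For reference, here's roughly how I intend to read the bitmap once that code
is in place (the attached code uses a random bitmap for now, so this is only
a sketch; the HeightMapReader/readHeights names are made up for
illustration):

    import java.awt.image.BufferedImage;
    import java.io.File;
    import java.io.IOException;
    import javax.imageio.ImageIO;

    public class HeightMapReader {
        // Reads the image and returns one height value per pixel (0-255),
        // using the average of the R, G and B channels as the intensity.
        public static int[][] readHeights(String filename) throws IOException {
            BufferedImage img = ImageIO.read(new File(filename));
            int[][] heights = new int[img.getWidth()][img.getHeight()];
            for (int x = 0; x < img.getWidth(); x++) {
                for (int y = 0; y < img.getHeight(); y++) {
                    int rgb = img.getRGB(x, y);
                    int r = (rgb >> 16) & 0xff;
                    int g = (rgb >> 8) & 0xff;
                    int b = rgb & 0xff;
                    heights[x][y] = (r + g + b) / 3;  // intensity ~ height
                }
            }
            return heights;
        }
    }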

Next, it divides this bitmap into triangles and converts it into a terrain
by replacing each 2D triangle with a 3D triangle using
            z = intensity of point at (x,y) in the bitmap.
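
Schematically, the triangulation does something like this for each grid cell
(a rough sketch only; buildCell and the class name are made up for
illustration, not lifted from the attached code):

    import javax.vecmath.Point3f;

    public class CellTriangulator {
        // Turns one grid cell (x, y) of the height map into the six
        // vertices of two 3D triangles, with z taken from the intensity
        // at each corner.
        public static Point3f[] buildCell(int[][] heights, int x, int y) {
            Point3f p00 = new Point3f(x,     y,     heights[x][y]);
            Point3f p10 = new Point3f(x + 1, y,     heights[x + 1][y]);
            Point3f p01 = new Point3f(x,     y + 1, heights[x][y + 1]);
            Point3f p11 = new Point3f(x + 1, y + 1, heights[x + 1][y + 1]);
            return new Point3f[] { p00, p10, p11,   // lower triangle
                                   p00, p11, p01 }; // upper triangle
        }
    }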

1) My problem is that for each 3D triangle, I want the *color* of the
    triangle to be the average of the colors at the three vertices of the
    corresponding 2D triangle in the bitmap, so that when the terrain is
    viewed from above, from a distance, a blurred version of the bitmap
    itself would be visible.
    The code for setting the color of the triangles isn't working (see the
    sketch after this list).

2) Right now, the code to read the bitmap isn't there; instead, a random
    bitmap is used. Also, the *code for setting the color is commented out*
    (average_color1(...) and average_color2(...)) since it isn't working.
    All the triangles have been set to a constant color instead.
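
For what it's worth, the averaging I'm after is along these lines (this is a
cleaned-up sketch, not the actual average_color1/average_color2 from the
attachment; averageColor and the class name are made up):

    import java.awt.image.BufferedImage;
    import javax.vecmath.Color3f;

    public class TriangleColorer {
        // Averages the bitmap colors at the three 2D vertices of a
        // triangle. Each channel is separated before averaging; averaging
        // the packed ARGB ints directly would mix the channels together.
        public static Color3f averageColor(BufferedImage img,
                                           int x1, int y1,
                                           int x2, int y2,
                                           int x3, int y3) {
            int[] pixels = { img.getRGB(x1, y1),
                             img.getRGB(x2, y2),
                             img.getRGB(x3, y3) };
            int r = 0, g = 0, b = 0;
            for (int i = 0; i < pixels.length; i++) {
                r += (pixels[i] >> 16) & 0xff;
                g += (pixels[i] >> 8) & 0xff;
                b += pixels[i] & 0xff;
            }
            // Color3f components are floats in [0, 1]
            return new Color3f(r / (3 * 255f), g / (3 * 255f), b / (3 * 255f));
        }
    }

If I understand the Java 3D API right, the resulting Color3f can then be
assigned to all three vertices of the triangle with GeometryArray.setColor,
provided the TriangleArray was created with the COLOR_3 vertex format.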

What's wrong with the color-setting code?

Thanks in advance,
vamshi

Attachment: BasicTerrain1_2.java
