Thank you for your response, robertos. I've been playing around with the code
the last few days, and it seems the filters are doing something, just not what
I expected. I'm wondering if I misinterpreted the meaning of "NEAREST".
As I understand it, setting both the min and mag filters to NEAREST means
that, when sampling a location on your texture, it will return the color of
the nearest texel. In other words: you will always get back a value that is
present in the pixel data of the image.
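To make my understanding concrete, here is a minimal sketch (plain C++ with hypothetical names, not the OSG API) of what I expect NEAREST to do along one axis of a texture:

```cpp
#include <cstdint>

// Hypothetical one-dimensional texel row; names are illustrative.
struct RGB { std::uint8_t r, g, b; };

// NEAREST semantics as I understand them: map the coordinate u in [0,1)
// to the texel that contains it and return that texel verbatim -- no
// blending, so the result is always a value present in the source data.
RGB sampleNearest(const RGB* texels, int width, float u)
{
    int i = static_cast<int>(u * width);  // texel index containing u
    if (i < 0) i = 0;                     // clamp at the edges
    if (i >= width) i = width - 1;
    return texels[i];
}
```

Under this reading, sampling anywhere in the red half should return exactly (255,0,0) and anywhere in the black half exactly (0,0,0), with no dark-red values in between.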
However, as far as I can see from my test, this is not the case: I created an
image with red pixels on one half (255,0,0) and black pixels on the other half
(0,0,0). When sampling the texture to which this image was set and simply
drawing the sampled values onto a bigger texture for debugging purposes, I see
a red half, a black half, and a line of dark red "pixels" separating them, as
if the values had been interpolated when sampling.
If I turn the "nearest" filters off, this line becomes a gradient, so the
filtering is clearly doing something.
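For comparison, here is a sketch of the textbook LINEAR blend (again hypothetical names, plain C++, not OSG code), which is the behavior that would explain both the gradient and the intermediate dark-red values:

```cpp
#include <cmath>
#include <cstdint>

// Hypothetical texel type; names are illustrative.
struct RGB { std::uint8_t r, g, b; };

// LINEAR semantics along one axis: blend the two texels whose centers
// straddle u. Across a red/black boundary this produces intermediate
// "dark red" values that never appear in the source image.
RGB sampleLinear(const RGB* texels, int width, float u)
{
    float x = u * width - 0.5f;            // texel centers sit at integer x
    int i0 = static_cast<int>(std::floor(x));
    float t = x - i0;                      // blend weight between neighbours
    int i1 = i0 + 1;
    if (i0 < 0) i0 = 0;                    // clamp at the edges
    if (i1 >= width) i1 = width - 1;
    auto mix = [t](int a, int b) {
        return static_cast<std::uint8_t>(a + (b - a) * t + 0.5f);
    };
    return { mix(texels[i0].r, texels[i1].r),
             mix(texels[i0].g, texels[i1].g),
             mix(texels[i0].b, texels[i1].b) };
}
```

Sampling exactly between a red and a black texel with this formula yields roughly (128,0,0), which matches the dark-red line I'm seeing even with NEAREST supposedly enabled.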
Am I just completely misinterpreting the intended behavior of the "NEAREST"
filters, or is there something wrong in the implementation?
Read this topic online here:
osg-users mailing list