josselin wrote:
> Hi Luca,
> The way I found is :
> 1. compute the log10 values of the entire texture in a first shader
> 2. then perform the mipmap with:
> osgPPU::UnitInMipmapOut* sceneLuminance = new osgPPU::UnitInMipmapOut();
> sceneLuminance->setName("ComputeSceneLuminance");
> sceneLuminance->generateMipmapForInputTexture(0);
> 3. get the 5th or 6th level of the mipmapped texture, and compute the
> mean of this texture in a shader. You'll get your average luminance.
> 


Hi Josselin,
I don't think your solution is correct: the log should be computed only on the 
first mipmap level. And in general I think the whole example is fine from an 
algorithmic standpoint (it basically matches 1:1 the HDRLighting sample you get 
with the DirectX SDK, with some minor changes).
The problem is really in accessing the last mipmap level of that luminance 
texture. I have no idea whether this goes wrong because the mipmaps are actually 
not there or for some other reason.
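To make the intent concrete, here is a small CPU sketch (my own illustration, not code from the example) of why taking the log first and then letting the mipmap chain average works: the 1x1 level ends up holding the mean of the logs, so 10^mean is the geometric-mean luminance. The helper names, the log bias, and the standalone form are assumptions for illustration:

```cpp
#include <cmath>
#include <vector>

// Illustrative only: simulates the log10 pass followed by the mipmap
// box filter reducing to a single value. The 1x1 mip level then holds
// the mean of the logs; 10^mean recovers the geometric-mean ("average")
// luminance used for tone mapping.
double averageLuminanceViaLog(const std::vector<double>& luminance)
{
    double sumLog = 0.0;
    for (double lum : luminance)
        sumLog += std::log10(lum + 1e-6); // tiny bias guards log10(0)
    double meanLog = sumLog / static_cast<double>(luminance.size());
    return std::pow(10.0, meanLog);
}

// The last (1x1) mipmap level index for a given texture size -- the
// level a tone-mapping shader would sample for the averaged value.
int lastMipLevel(int width, int height)
{
    int size = width > height ? width : height;
    int level = 0;
    while (size > 1) { size >>= 1; ++level; }
    return level;
}
```

For a 1024x768 texture the last level is 10, which also shows why grabbing only "the 5th or 6th level" still leaves a texture that has to be averaged further.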

Also, for Allen: the error is not so evident thanks to the temporal adaptation, 
which tends to mask it. But if you disable it, the problem will be clearly 
visible, I think.
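For context, the temporal adaptation I mean is the usual exponential tracking of the measured luminance, as in the DirectX HDRLighting sample the example follows; the rate constant and function shape below are assumptions for illustration:

```cpp
#include <cmath>

// Exponential luminance adaptation: the adapted value chases the
// measured one over several frames, so an error in the measured average
// luminance is smoothed out and masked. 'rate' is an assumed constant.
double adaptLuminance(double adapted, double measured, double dt,
                      double rate = 1.25)
{
    return adapted + (measured - adapted) * (1.0 - std::exp(-dt * rate));
}
```

Disabling adaptation amounts to using `measured` directly each frame, which is why any error in the computed average luminance then shows up immediately.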

    Luca

------------------
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=27251#27251

_______________________________________________
osg-users mailing list
[email protected]
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org
