Hi Mathias,

> From: Mathias Fröhlich
> 
> Fred,
> 
> On Sunday, December 18, 2011 10:18:39 Frederic Bouvier wrote:
> > Your patronage will be welcome.
> 
> Ok.
> My problem is that I have too many open projects currently, so
> promising to help here is something I cannot do today. But sure,
> if you have questions, feel free to ask; I currently cannot
> invest much concrete work into this.

The problem I currently have to solve is how to feed the G-buffer
to the Effect system: the textures used to store it are
camera-dependent (in a multi-screen context), but the pass (which
is a stateset) is built once and for all.


> I will in any case come back to you about these changes for fgviewer.
> That was the code where I wanted to introduce a different fog/sky
> algorithm that would decouple the atmospheric computations from the
> actual models. This is also something that requires rendering the
> actual scene into a set of FBOs, and it was intended as preliminary
> work for what you are doing now. But I am completely happy with
> another approach. I just see that we will need this kind of stuff.

Currently, the fog is computed in a post-process pass using the depth
buffer. Any smarter computation (like atmospheric scattering) is just
a matter of writing a single shader that replaces the default fog
computation.
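
For reference, here is a minimal sketch of what such a post-process
fog shader could look like. All of the names (depth_tex, color_tex,
fog_color, fog_density, near/far) are hypothetical, not the actual
Rembrandt uniforms:

    // Hedged sketch of a post-process fog pass (GLSL 1.20 style).
    uniform sampler2D depth_tex;   // depth attachment of the G-buffer
    uniform sampler2D color_tex;   // lit scene color
    uniform vec4  fog_color;
    uniform float fog_density;
    uniform float near, far;       // camera planes, used to linearize depth

    varying vec2 tex_coord;

    void main()
    {
        float z = texture2D(depth_tex, tex_coord).r;   // window depth in [0,1]
        // invert the perspective depth mapping to get linear eye-space depth
        float eye_depth = (near * far) / (far - z * (far - near));
        float f = exp(-fog_density * eye_depth);       // exponential fog factor
        gl_FragColor = mix(fog_color, texture2D(color_tex, tex_coord), f);
    }

An atmospheric scattering pass would keep the same inputs and only
replace the body of main().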

> So, maybe just one question about what you have done already, again
> without looking into any code:
> 
> You do not require float textures?
> As far as I can see, there is a patent issue on this extension, and
> usually it is not strictly required.
> Using a fixed-point representation that makes use of the usual
> depth buffer - one that scales differently than the usual
> perspective depth - could be used instead, and I think we should
> use this in the end. It really gives even better
> accuracy than a float representation, since floats waste some
> bits on the exponent, where a fixed-point representation can
> just use all the bits for accuracy.

I followed Policarpo's tutorial on deferred shading, so I don't store
the absolute world position in a float texture. As described in the
tutorial, I use the view direction and the depth value to compute
the eye-space position of the fragment.
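
A minimal sketch of that reconstruction, assuming the vertex stage
passes an eye-space ray scaled so its z component is -1 (the names
are illustrative, not the actual code):

    // Hedged sketch of Policarpo-style position reconstruction.
    uniform sampler2D depth_tex;
    uniform float near, far;
    varying vec3 view_ray;   // eye-space ray through the fragment, z scaled to -1
    varying vec2 tex_coord;

    vec3 eye_position()
    {
        float z = texture2D(depth_tex, tex_coord).r;               // window depth
        float eye_depth = (near * far) / (far - z * (far - near)); // linearize
        return view_ray * eye_depth;  // position along the ray at that depth
    }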

Nevertheless, I use a float texture to store the normals. I tried
setting up a normal buffer with just luminance and alpha in
half-float while the color buffers were RGB8, but it was very slow.
Instead I use RGBA16F for all textures (except the depth buffer).
As I use additive blending to accumulate the lights, I thought it
would be possible to get some kind of HDR without the risk of
saturating the 8-bit color components. I could try using two color
channels to store the x and y components of the normal and
reconstructing the z part as sqrt(1 - (x^2 + y^2)), but that won't
solve the saturation issue. Some people use the LUV color space to
store higher intensities in RGB8, but AFAIK it doesn't support
additive blending.
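
For what it's worth, that two-channel packing would look roughly
like this (a sketch only; it assumes eye-space normals with a
non-negative z component, which is the usual caveat of this trick):

    // Sketch: store only the x/y components of an eye-space normal
    // and rebuild z on decode. Helper names are hypothetical.
    vec2 encode_normal(vec3 n)
    {
        return n.xy;                  // two channels instead of three
    }

    vec3 decode_normal(vec2 enc)
    {
        // the clamp guards against round-off pushing 1 - dot() below zero
        float z = sqrt(max(0.0, 1.0 - dot(enc, enc)));
        return vec3(enc, z);
    }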


> 
> Apart from that:
> 
> Merry Christmas!

Happy New Year,
-Fred
http://wiki.flightgear.org/Project_Rembrandt
