Hi Curt,

Thanks for your comments and explanations.

> We always recognized potential difficulties with blending the sky into
> the terrain.  The original design used the fog color as the opengl clear
> color (the color that the display buffer is cleared to before starting
> to draw any
> of the frame).  We blended the bottom of the skydome into the fog color.
> And then made sure we drew enough tiles to extend at least to the fully
> fogged range in all directions.  This allowed us to "hide" the seam
> between the sky and the ground in the fog/haze at the horizon.

Yes, as far as I can see that's the only way this can be done, and it's
why the scattering skydome shader never worked with the default terrain
shading.

As a side note: I've observed that we're currently fading into fog with

exp(-d^2/vis^2)

where d is the distance and vis the visibility. I believe the physically
correct behaviour would be

exp(-d/vis)

Now, the only reason I can think of for the square fading is that you
really get a hard cutoff beyond d = vis and so minimize the number of
terrain tiles you need for seamless blending. On the other hand, the
square fading gives less fog for d < vis than there should be, and many
optical integrals rely on the factorization transmission(layer a) *
transmission(layer b) = transmission(layer a + layer b), which only holds
for the linear exponent. So I would prefer linear exponential fading. If
the hard cutoff is the only rationale for the square fading, then I'd
propose to use

exp(-d/vis - 0.1 (d/vis)^6)

instead, which for any distance d < vis gives essentially linear fading
and then a cutoff as efficient as the square fading. Please let me know if
there's any other reason why the current code is the way it is!
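To make the comparison concrete, here's a minimal GLSL sketch of the three
fading laws (the function names are mine, purely for illustration - this is
not the actual terrain shader code):

// d   - eye-to-vertex distance in m
// vis - visibility in m (e.g. /environment/visibility-m)

// current behaviour: quadratic exponent, hard cutoff just beyond d = vis
float fogCurrent(float d, float vis)
{
    return exp(-(d * d) / (vis * vis));
}

// Beer-Lambert behaviour: transmissions of stacked layers multiply
float fogLinear(float d, float vis)
{
    return exp(-d / vis);
}

// proposed compromise: essentially linear for d < vis, steep cutoff beyond
float fogProposed(float d, float vis)
{
    float x = d / vis;
    return exp(-x - 0.1 * pow(x, 6.0));
}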


> One problem: tall
> mountains in the distance would be fog colored and extend visually up
> into the blue part of the skydome and look weird -- so the original
> design worked pretty well, but wasn't perfect.

I would guess it is visually acceptable to let mountaintops never blend
into the horizon fog. In my current design it would be possible to do so,
provided the visibility passed to the tile manager is a clever combination
of ground and aloft visibility.

Maybe that's actually the general solution - pass a function of the
different visibility properties used by the shader to the tile manager.
The simplest choice is to pass the maximum of all visibilities, but that
may load more tiles than one ever needs, so an angle-weighted version
dependent on the current altitude would probably be more useful. I can
work the math out... as long as it's possible to go for a solution like
that.
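Just to illustrate the direction I have in mind (this is not the worked-out
math, only a hypothetical GLSL-style sketch with made-up names):

// Illustrative only - the actual weighting still needs to be worked out.
// visGround, visAloft - visibility inside and above the ground haze layer (m)
// altitude            - current eye altitude (m)
// layerTop            - top of the ground haze layer (m)
float effectiveVisibility(float visGround, float visAloft,
                          float altitude, float layerTop)
{
    // weight of the ground layer: 1.0 inside it, fading out smoothly above
    float w = clamp(1.0 - (altitude - layerTop) / layerTop, 0.0, 1.0);
    return mix(visAloft, visGround, w);
}

Something like this would hand the tile manager a single number that is
close to the ground visibility at low altitude and approaches the aloft
visibility as one climbs, instead of always taking the maximum.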

> Oh, and we also had some additional code that would change the fog color
> depending on your view angle relative to the sun ... so at sun set/rise
> the fog color would be oranger/redder looking at the sun versus
> looking away from it.

I'm currently using gl_LightSource[0].diffuse as the fog color - to the
degree that the sunlight color changes, my version inherits that (it seems
to be working fine so far).
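In (hypothetical, much simplified) fragment shader terms that amounts to
something like:

// Sketch only - not the actual terrain fragment shader.
// fragColor - lit terrain color, fogFactor - transmission from the fading law
vec4 applyFog(vec4 fragColor, float fogFactor)
{
    // using the sun's diffuse term as fog color means sunrise/sunset
    // tinting is inherited automatically
    vec4 fogColor = gl_LightSource[0].diffuse;
    return mix(fogColor, fragColor, fogFactor);
}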

> So definitely we should try to figure out how to make your sky model
> blend with the terrain some how.

It *seems* to be doing fine now after I introduced the harder fogging
cutoff. Above the ground layer it still uses /environment/visibility-m to
keep track of this part of the optical integrals, so the number of tiles
loaded is actually sufficient - but at 50,000 ft or so I routinely have
visibility values of 100 km. For me, FlightGear still renders that stably
and reasonably fast (I tried Custom France scenery, Hawaii and the Pacific
Northwest as test cases), and only above 140 km do I get unstable
behaviour (some of my Blackbird screenshots were rather difficult). On
slow machines this would probably be a show-stopper, but that's not a
shader problem; it's related to the weather code, which can be instructed
to enforce a visibility cutoff that is never exceeded.

> One more thought while I'm writing.  The current scattering effect
> skydome has a glitch/seam at the azimuth.  Depending on the time of day
> it is more or less visible.  It can be really ugly, I'd love to get this
> fixed if you happen to stumble on what's causing it as you play with all
> this code.

I know...

I'm really not a shader person; I have just one idea as to what the cause
might be.

I've often observed that the edges of scenery tiles with the water
reflection shader have different colors, and I have on occasion seen fog
artefacts at just the same positions. My theory is that the vertices in
this case (= at the tile edges) are rather far apart, so when the vertex
shader is asked to compute an angle for reflection, light scattering or
fogging, the result has to be interpolated over a large area. If the
interpolation is linear but the quantity being interpolated is (as in this
case) non-linear, then in a certain angular regime there must be
artefacts.

In that case there'd be nothing wrong with the shader itself; the vertices
would just be too far apart to get it right.

Now, I have no idea how many vertices the skydome has or if that is indeed
the problem - it's just my working hypothesis based on what I see.
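If that hypothesis is right, the fix would be to pass the raw vectors from
the vertex shader and do the non-linear evaluation per fragment. Roughly (a
sketch with made-up names, not the actual skydome shader):

// --- vertex shader ---
// pass the raw eye-space position instead of a finished angle or fog
// value: interpolating a vector is harmless, interpolating a non-linear
// function of it is not when the vertices are far apart
varying vec3 relPos;

void main()
{
    relPos        = (gl_ModelViewMatrix * gl_Vertex).xyz;
    gl_FrontColor = gl_Color;
    gl_Position   = ftransform();
}

// --- fragment shader ---
// evaluate the non-linear quantity per pixel
varying vec3 relPos;

void main()
{
    float dist      = length(relPos);
    float fogFactor = exp(-dist / 10000.0);   // placeholder 10 km visibility
    gl_FragColor    = mix(gl_LightSource[0].diffuse, gl_Color, fogFactor);
}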

ThorstenB wrote:

> And it'd be rather complicated to implement any other tile loading
> method instead of the current concept of loading all tiles within a
> certain range. The tile loader lives in a simple 2D world. It knows
> nothing about elevation of certain tiles etc.

But can it be made to use a function of all the visibility parameters the
shader uses? For zero elevation the problem is not difficult to solve; the
terrain shader basically does it for every vertex in the scene each frame
anyway. Solving it just once every couple of frames for the tile manager
shouldn't be an issue - even Local Weather's Nasal code could do it.

But we probably have something like min_elevation and max_elevation for
the scenery tiles that are already loaded, so these could be used to
further refine the guess as to when the next tile might become visible.
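One way the max elevation could enter such a guess (purely hypothetical
names, not existing tile manager code):

// Sketch only: a tile whose terrain reaches above the haze layer can become
// visible at the aloft visibility, otherwise the ground visibility suffices.
// maxElevation - highest terrain elevation in the tile (m)
// layerTop     - top of the ground haze layer (m)
float tileLoadRange(float maxElevation, float layerTop,
                    float visGround, float visAloft)
{
    return (maxElevation > layerTop) ? visAloft : visGround;
}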

It'd be so cool to be able to load only the mountains sticking out of
haze... But I think we can do well enough without that gimmick.

On a completely different matter:

Does anyone know of a short writeup of how the shader coordinate systems
are defined? I waste most of my time debugging code only to find out that
a vector never was what I thought it should be because I started in the
wrong coordinate system, and that's starting to annoy me. Once I have the
vectors right, everything works much faster...

Thanks in advance!

* Thorsten

