Thor recently pushed an implementation of MSAA for those cases where the 
feature is supported by the card and where a Scene (or SubScene) is created 
with the antiAliasing flag set to true. MSAA stands for "Multi-Sample 
Anti-Aliasing", which means that the graphics card, when configured in this 
mode, will sample each fragment multiple times. The upshot is that 3D 
doesn't look as jaggy.

However, this has an impact on performance: usually an extra buffer copy, 
and at the very least each pixel is sampled multiple times, so if you are 
doing something graphically intense it might push you over the edge where 
you start to see performance degradation. Multi-sampling can be 2x, 4x, 
etc. The higher the multi-sampling value, the better the quality, and the 
lower the performance.

I'm also bothered by the name "antiAliasing" because there are many forms of 
anti-aliasing in the world and it isn't clear which one this is. I think 
perhaps we should instead have an enum. The idea is that we can add to the 
enum over time with more options for how to perform the scene anti-aliasing.

public enum SceneAntiAliasing {
    DISABLED,  // no anti-aliasing at all
    DEFAULT,   // platform picks the best mode it supports
    MSAA_2X,   // multi-sampling with 2 samples per pixel
    MSAA_4X    // multi-sampling with 4 samples per pixel
}

And then grow it over time to include potentially other techniques. My 
thought here is that the implementation is going to matter to folks. They're 
going to want to be able to make the performance / quality tradeoff, and 
perhaps even the implementation tradeoff (since different implementations 
may produce somewhat different results). DISABLED turns it off, obviously. 
DEFAULT allows us to pick what we think is the best, which might differ by 
platform: desktop might go with MSAA_16X or equivalent, while iOS might go 
with MSAA_2X. Then some standard options.
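To make the DEFAULT idea concrete, here's a rough sketch of how the runtime 
might resolve DEFAULT to a real mode per platform. Everything apart from the 
proposed enum constants (the class name, the resolve method, the isMobile 
flag) is invented for illustration, not an actual API:

```java
// Hypothetical sketch of resolving SceneAntiAliasing.DEFAULT to a
// concrete mode. Only the enum constants come from the proposal above;
// the rest is illustrative.
public class AntiAliasingDemo {
    public enum SceneAntiAliasing {
        DISABLED,
        DEFAULT,
        MSAA_2X,
        MSAA_4X
    }

    // Map DEFAULT to a platform-appropriate mode; pass anything
    // else through unchanged.
    public static SceneAntiAliasing resolve(SceneAntiAliasing requested,
                                            boolean isMobile) {
        if (requested != SceneAntiAliasing.DEFAULT) {
            return requested;
        }
        // Mobile GPUs pay more for high sample counts, so a cheaper
        // mode is the better default there.
        return isMobile ? SceneAntiAliasing.MSAA_2X
                        : SceneAntiAliasing.MSAA_4X;
    }

    public static void main(String[] args) {
        System.out.println(resolve(SceneAntiAliasing.DEFAULT, false));
        System.out.println(resolve(SceneAntiAliasing.DEFAULT, true));
        System.out.println(resolve(SceneAntiAliasing.DISABLED, false));
    }
}
```

The point being that user code only ever names DISABLED, DEFAULT, or an 
explicit MSAA_nX constant, and the platform-specific choice stays hidden 
behind DEFAULT.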

Thoughts?
Richard
