On Sun, May 26, 2019 at 4:10 AM NaN via Digitalmars-d-announce <[email protected]> wrote:
>
> On Saturday, 25 May 2019 at 23:23:31 UTC, Ethan wrote:
> > On Sunday, 19 May 2019 at 21:01:33 UTC, Robert M. Münch wrote:
> >>
> >> Browsers are actually doing quite well with simple 2D graphics
> >> today.
> >
> > Browsers have been rendering that on GPU for years.
>
> Just because (for example) Chrome supports GPU rendering doesn't
> mean every device it runs on does too. For example...
>
> Open an SVG in your browser, take a screenshot and zoom in on an
> almost vertical / horizontal edge, EG..
>
> https://upload.wikimedia.org/wikipedia/commons/f/fd/Ghostscript_Tiger.svg
>
> If you look for an almost vertical or almost horizontal line and
> check whether the antialiasing is stepped or smooth. GPU
> typically maxes out at 16x for path rendering, CPU you generally
> get 256x analytical.
What? ... This thread is bizarre. Why would a high-quality SVG renderer decide to limit itself to 16x AA? Are you suggesting that they use hardware super-sampling to render the SVG? Why would you use SSAA to render an SVG that way?

I can't speak for their implementation, which you can only speculate about unless you read the source code... but I would, for each pixel, calculate the distance from the line and use that as the falloff value relative to the line-weighting property (rough sketch at the end of this post).

How is the web browser's SVG renderer even relevant? I have absolutely no idea how this 'example' (or almost anything in this thread) ties back to the point I made at the start, before it went way off the rails. Just stop, it's killing me.
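
For what it's worth, here's a minimal sketch of that distance-based idea in D. This is not anyone's actual implementation; the function name, parameters, and the linear half-pixel falloff are all just illustrative assumptions:

// Illustrative sketch only: per-pixel coverage for an infinite line,
// computed analytically from the distance to the line rather than by
// super-sampling. Names and the linear half-pixel falloff are assumptions.
import std.math : fabs;
import std.algorithm.comparison : clamp;

/// Coverage (0..1) of the pixel centred at (px, py) for a line through
/// (x0, y0) with unit direction (dx, dy) and the given stroke width.
double lineCoverage(double px, double py,
                    double x0, double y0,
                    double dx, double dy,
                    double width)
{
    // Perpendicular distance from the pixel centre to the line.
    immutable dist = fabs((px - x0) * dy - (py - y0) * dx);

    // Fall off over roughly half a pixel around the stroke edge, so a
    // nearly-horizontal or nearly-vertical edge blends smoothly instead
    // of stepping the way a fixed sample pattern would.
    return clamp(width * 0.5 + 0.5 - dist, 0.0, 1.0);
}

unittest
{
    // A pixel centred on the line is fully covered; a distant one is not.
    assert(lineCoverage(0, 0, 0, 0, 1, 0, 1.0) == 1.0);
    assert(lineCoverage(0, 5, 0, 0, 1, 0, 1.0) == 0.0);
}

Compositing that coverage against the stroke colour gives you a smooth analytical edge with no sample pattern involved at all, which is the sort of result the "256x analytical" comparison above is pointing at.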
