On Wednesday, 9 July 2014 at 05:30:21 UTC, Ola Fosheim Grøstad wrote:
That's true, but OpenGL is being left behind now that there is a push to match the low level of how GPU drivers work.

As I said, ALL APIs are converging on low-level access, and this includes OpenGL. All the major APIs are moving to a buffer+shader model because that is what the hardware likes (there are also some more interesting things happening with command buffers).
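To make the "buffer+shader model" concrete, here is a minimal sketch in C using standard desktop OpenGL 3.x calls: vertex data goes into a GPU buffer, processing is described by a shader program, and a draw call just pairs the two. It assumes a GL context and a function loader (e.g. GLEW) are already set up; the shader source and vertex layout are placeholders.

/* Minimal sketch of the buffer+shader model (desktop OpenGL 3.x core).
 * Assumes a GL context and loader (e.g. GLEW/GLAD) are already initialized. */
#include <GL/glew.h>

static const char *vs_src =
    "#version 330 core\n"
    "layout(location = 0) in vec2 pos;\n"
    "void main() { gl_Position = vec4(pos, 0.0, 1.0); }\n";

static const char *fs_src =
    "#version 330 core\n"
    "out vec4 color;\n"
    "void main() { color = vec4(1.0, 0.5, 0.2, 1.0); }\n";

void draw_triangle(void)
{
    /* 1. Vertex data lives in a GPU buffer object. */
    const float verts[] = { -0.5f, -0.5f,  0.5f, -0.5f,  0.0f, 0.5f };
    GLuint vao, vbo;
    glGenVertexArrays(1, &vao);
    glBindVertexArray(vao);
    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, sizeof verts, verts, GL_STATIC_DRAW);
    glVertexAttribPointer(0, 2, GL_FLOAT, GL_FALSE, 0, (void *)0);
    glEnableVertexAttribArray(0);

    /* 2. Processing is described by shaders compiled into a program. */
    GLuint vs = glCreateShader(GL_VERTEX_SHADER);
    glShaderSource(vs, 1, &vs_src, NULL);
    glCompileShader(vs);
    GLuint fs = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(fs, 1, &fs_src, NULL);
    glCompileShader(fs);
    GLuint prog = glCreateProgram();
    glAttachShader(prog, vs);
    glAttachShader(prog, fs);
    glLinkProgram(prog);

    /* 3. A draw call just pairs the buffer with the program. */
    glUseProgram(prog);
    glDrawArrays(GL_TRIANGLES, 0, 3);
}

Every one of the APIs being discussed (OpenGL, Direct3D, Metal, Mantle) ultimately boils down to this same shape: fill buffers, bind shaders, issue draws.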

Apple's Metal is oriented towards the tiled PowerVR and scenegraphs,

I am not exactly sure where you got that idea; Metal is the same, buffers+shaders. The major difference is that the command buffer is explicitly exposed, which is actually what is meant when they say the API is getting closer to the hardware. In the current APIs (DX/OGL) the command buffers are hidden from the user and constructed behind the scenes; in DX this is done by Microsoft and in OGL it is done by the driver (NVIDIA/AMD/Intel). There has been a push recently for this to be exposed to the user in some form, and this is what Metal does. I believe Mantle does something similar, but I can't be sure because they have not released any documentation.
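To illustrate the difference, here is a rough C sketch of the two submission models. None of these types or functions are a real API (Metal's actual interface is Objective-C); the cmd_buffer/gpu_queue names are hypothetical and only mirror the shape of an explicit command-buffer model.

/* Hypothetical sketch only: these declarations are not a real API. */
typedef struct cmd_buffer cmd_buffer;
typedef struct gpu_queue  gpu_queue;
typedef struct pipeline   pipeline;
typedef struct mesh       mesh;

void bind_pipeline(pipeline *);
void draw(mesh *);
void swap_buffers(void);
cmd_buffer *cmd_buffer_begin(gpu_queue *);
void cmd_bind_pipeline(cmd_buffer *, pipeline *);
void cmd_draw(cmd_buffer *, mesh *);
void cmd_buffer_commit(gpu_queue *, cmd_buffer *);

/* Implicit model (GL / D3D11 style): the driver records commands into a
 * buffer the application never sees and submits it at its own discretion. */
void gl_style_frame(pipeline *state, mesh *m)
{
    bind_pipeline(state);   /* recorded internally by the driver       */
    draw(m);                /* appended to a hidden command buffer     */
    swap_buffers();         /* driver flushes/submits when it chooses  */
}

/* Explicit model (Metal / Mantle style): the application creates the
 * command buffer, records into it, and submits it to a queue itself. */
void metal_style_frame(gpu_queue *q, pipeline *state, mesh *m)
{
    cmd_buffer *cb = cmd_buffer_begin(q);
    cmd_bind_pipeline(cb, state);
    cmd_draw(cb, m);
    cmd_buffer_commit(q, cb);   /* the app controls when work hits the GPU */
}

The point is that in the explicit model the application decides when and on which thread command recording and submission happen, which is most of what "closer to the hardware" means here.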


probably also with some expectations of supporting the upcoming raytracing accelerators.

I doubt it.

AMD is in talks with Intel (rumour) with the intent of cooperating on Mantle.

I don't know anything about that, but I also doubt it.

Direct-X is going lower level… So, there is really no stability in the API at the lower level.

On the contrary, all this movement towards low-level APIs is actually causing the APIs to look very similar.


But yes, OpenGL is not particularly suitable for rendering a scene graph without an optimizing engine to reduce context switches.

I was not talking explicitly about OpenGL; I am just talking about video cards in general.

Actually, modern 2D APIs like Apple's Quartz are backend "independent" and render to PDF. Native PDF support is important if you want to have an advantage in the web space and in the application space in general.

This does not really have anything to do with what I am talking about. I am talking about hardware-accelerated graphics; once it gets into the hardware (the GPU), there is no real difference between 2D and 3D.
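For example, a "2D" rectangle on the GPU is just two triangles at z = 0 pushed through the same vertex-buffer + shader pipeline, with an orthographic projection instead of a perspective one. A rough sketch (the matrix is column-major, GL convention; the pixel-coordinate mapping is my own choice for illustration):

/* A "2D" rectangle is just two triangles fed to the same pipeline as any
 * 3D mesh, with an orthographic projection mapping pixels to clip space. */
#include <string.h>

/* Orthographic projection: maps x in [0, w], y in [0, h] to [-1, 1]. */
void ortho_2d(float m[16], float w, float h)
{
    memset(m, 0, 16 * sizeof(float));
    m[0]  =  2.0f / w;   /* scale x */
    m[5]  = -2.0f / h;   /* scale y, flipped so y grows downward as in 2D UIs */
    m[10] = -1.0f;
    m[12] = -1.0f;       /* translate x */
    m[13] =  1.0f;       /* translate y */
    m[15] =  1.0f;
}

/* A rectangle in pixel coordinates, expressed as two triangles. */
void rect_vertices(float v[12], float x, float y, float w, float h)
{
    const float quad[12] = {
        x,     y,      x + w, y,      x + w, y + h,   /* first triangle  */
        x,     y,      x + w, y + h,  x,     y + h,   /* second triangle */
    };
    memcpy(v, quad, sizeof quad);
}

As far as the GPU is concerned, that is exactly the same work as drawing a 3D scene: buffers of vertices, a projection matrix, shaders.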

There is almost no chance anyone wanting to do 3D would use something like Aurora… If you can handle 3D math you also can do OpenGL, Mantle, Metal?

As it stands now, that may be the case, but I honestly don't see a reason it must be so.

But then again, the official status for Aurora is kind of unclear.

This is true.
