On Jan 25, 2010, at 11:53 AM, Casey Duncan wrote:
> That said, OpenGL 3.2 is pretty exciting because it finally breaks
> away from the old fixed-function pipeline to a fully programmable
> one. But as mentioned, that means you'll need to pull yourself up by
> your bootstraps a bit more to get off the ground, since there isn't
> much preprogrammed functionality. On the other hand, it also means
> there aren't a bunch of different, now-obsolete APIs that do the same
> thing to distract you.

I can deal with "harder to learn," as long as it's incremental. If I can build things up bit by bit and see what each additional layer of complexity does, I should be able to figure it out. But it sounds like OpenGL 3 is not widely supported on video cards? I'm developing on a MacBook Pro, if that makes a difference.
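
In the meantime I figured I could at least check what my machine reports. Something like this should print it (just a rough sketch using pyglet's gl_info module as I understand it; I haven't double-checked the exact names):

    import pyglet
    from pyglet.gl import gl_info

    # gl_info needs a live GL context, so create a (hidden) window first.
    window = pyglet.window.Window(visible=False)

    print('Vendor:   %s' % gl_info.get_vendor())
    print('Renderer: %s' % gl_info.get_renderer())
    print('Version:  %s' % gl_info.get_version())
    print('GL 3.0+:  %s' % gl_info.have_version(3, 0))

    window.close()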


> If you decide to go with OpenGL 1 or 2 instead, do yourself a favor
> and don't bother with wholly obsolete parts of the API like immediate
> mode. At a minimum, use vertex arrays (or VBOs). Pyglet has an API
> for defining them that insulates you a bit from the bare metal, and
> it should work automagically with any graphics card that supports
> OpenGL 1.1-2.0.

Is there a way for me (someone new to OpenGL) to tell what parts are deprecated/discouraged? Or perhaps a page of OpenGL samples showing the "new, improved" way?
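
For what it's worth, here's my rough understanding of the vertex-array style you describe, using pyglet's graphics module (a sketch pieced together from the docs, so the format strings and arguments may not be exactly right):

    import pyglet
    from pyglet.gl import GL_TRIANGLES

    window = pyglet.window.Window(640, 480)

    # One triangle as a vertex list: 2D float positions ('v2f') and
    # per-vertex byte colours ('c3B'), with no glBegin/glEnd anywhere.
    triangle = pyglet.graphics.vertex_list(
        3,
        ('v2f', (100.0, 100.0,   540.0, 100.0,   320.0, 380.0)),
        ('c3B', (255, 0, 0,   0, 255, 0,   0, 0, 255)))

    @window.event
    def on_draw():
        window.clear()
        triangle.draw(GL_TRIANGLES)

    pyglet.app.run()

Is that roughly the direction you mean?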
