On oglbase-discuss, Brian Paul writes:

> 1. In glext_proto.h, glTexImage3D() has the wrong type for the internal
> format parameter. It should be GLint, not GLenum.

A bit of history. Originally, in OpenGL 1.0, glTexImage2D and glTexImage1D took a 3rd parameter called "components", declared as a "GLint". This could be either 1, 2, 3, or 4.

OpenGL 1.1 renamed the "components" parameter to "internalFormat" and allowed various GLenum values to be passed. This added support for sized internal formats (e.g., RGBA8) and also new 1-component base formats (GL_INTENSITY and GL_ALPHA). For compatibility reasons, the renamed parameter was left as type GLint.

The EXT_texture3D extension came after OpenGL 1.1 and added glTexImage3DEXT with its 3rd parameter called "internalFormat", in line with OpenGL 1.1; but since the entry point was new (read: no compatibility issue), the type is GLenum. However, the OpenGL 1.2 specification stuck with the "GLint internalformat" prototype for glTexImage3D.

I actually believe this is an oversight. The OpenGL SI *does* have its internal prototype for glTexImage3D as "GLenum internalformat", both in glcore/s_tex.c and in the gl.spec and enum.spec files. Moreover, the 3rd edition of the OpenGL Reference Manual, covering OpenGL 1.2, documents glTexImage3D as taking "GLenum internalformat".

We could take the view that the OpenGL 1.2.1 spec is definitive, but it is inconsistent with the EXT extension, the sample implementation, and the Blue Book. My guess is that the OpenGL 1.2.1 spec's prototype is just a cut-and-paste of glTexImage2D's, which has the "GLint" version of the parameter. The intent was for the type to be "GLenum internalformat".

There is another oddity in this area. If the "internalformat" parameter is not supported, should GL_INVALID_VALUE or GL_INVALID_ENUM be returned? If the type is "GLenum", presumably GL_INVALID_ENUM makes sense. The OpenGL 1.2.1 specification is quite explicit that GL_INVALID_VALUE is returned, though.
The Blue Book is also explicit about this. I'm satisfied to leave well enough alone and have GL_INVALID_VALUE be returned (it doesn't really matter). It fits the way the specification is written, where glTexImage2D and glTexImage1D are defined in terms of glTexImage3D.

For what it is worth, the EXT_texture3D spec *explicitly* did not support 1, 2, 3, or 4 as internalformat values for the glTexImage3DEXT call (though the OpenGL 1.2.1 spec explicitly supports them for glTexImage3D). The EXT_texture3D spec also explicitly says that the error is GL_INVALID_ENUM.

My belief is that this is simply a specification bug, and we should fix the specification to have the prototype be "GLenum internalformat". It is easy and painless to fix, and it makes the spec consistent with the other printed documentation, the implementations, and the intent. The fact that the error for calling glTexImage3D with a bad internalformat is still GL_INVALID_VALUE would simply be an OpenGL trivia question.

In case anyone is too attached to the wording in the specification, this is hardly the only 1.2.1 specification bug that exists. The OpenGL 1.2.1 specification's discussion of LOD clamping is quite incorrect if you take the formulas at face value.

- Mark
