On Wednesday, 14 September 2016 at 16:49:51 UTC, Darren wrote:


While googling, the idea seemed to be to create an SDL_Surface* and pass that (or surface.pixels) as the last argument for glTexImage2D. Didn't work for me, however.

Does anyone have any tips?

I'm flying blind here without seeing any of your code, but here's some general advice.

The last three arguments to glTexImage2D [1], `format`, `type`, and `data`, describe the image data you are sending (as do 'width' and 'height', which also set the size of the texture being created).

`format` needs to match the format of your image. For most simple things you do with OpenGL, that's going to be GL_RGB or GL_RGBA. If you don't know the format of your image, you can get it from the SDL_PixelFormatEnum value [2] in surface.format.format. Then you'll just need to send the corresponding GL enum to glTexImage2D. Though, be aware that the way SDL and OGL treat color formats may not always correspond in the way you think they should [3].
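
If you'd rather not maintain a mapping from every SDL_PixelFormatEnum value to a GL enum, one way around the mismatch in [3] is to convert whatever SDL_image hands you into a single known format before uploading. A rough sketch in D, assuming Derelict-style bindings that keep the C names (adjust for whatever bindings you actually use):

// Convert the loaded surface to a format whose in-memory byte order
// matches what GL_RGBA with GL_UNSIGNED_BYTE expects, so the format
// question answers itself. SDL_PIXELFORMAT_ABGR8888 is stored as
// R,G,B,A byte-by-byte on little-endian machines.
SDL_Surface* toRGBA(SDL_Surface* loaded)
{
    auto converted = SDL_ConvertSurfaceFormat(loaded, SDL_PIXELFORMAT_ABGR8888, 0);
    SDL_FreeSurface(loaded);    // done with the original
    return converted;           // null if the conversion failed
}

After that you can always pass GL_RGBA for `format` and stop caring what layout the file on disk happened to use.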

`type` is almost always going to be GL_UNSIGNED_BYTE. I don't know that SDL_image supports anything else, like floating point formats, anyway.

`data` needs to be a pointer to the pixel data and nothing else (never the SDL_Surface itself!), so in this case surface.pixels is correct.
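
Putting those three together, a minimal upload might look something like this (again D with Derelict-style bindings, and assuming the surface has already been converted to the RGBA-ordered format from the sketch above):

GLuint textureFromSurface(SDL_Surface* surface)
{
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);

    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA,
                 surface.w, surface.h, 0,
                 GL_RGBA,           // format of the data you're sending
                 GL_UNSIGNED_BYTE,  // type of each channel
                 surface.pixels);   // the pixel data itself, not the surface

    // The default minification filter wants mipmaps; without these the
    // texture is incomplete and samples as black.
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    return tex;
}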

If you are getting a crash, the first thing I'd check is the return value of every image loading function you call, to make sure each load actually succeeded.
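
Something along these lines is what I mean (the file name is just a placeholder). IMG_Load returns null on failure, and reading surface.pixels or surface.w through a null pointer is a classic crash:

SDL_Surface* surface = IMG_Load("mytexture.png");  // hypothetical file name
if (surface is null)
{
    import std.stdio : writefln;
    import std.string : fromStringz;
    // SDL_GetError says why the load failed (missing file, unsupported
    // format, SDL_image not initialized for that format, etc.).
    writefln("Image load failed: %s", SDL_GetError().fromStringz);
    // bail out here instead of handing a null pointer to OpenGL
}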

[1] https://www.opengl.org/sdk/docs/man/html/glTexImage2D.xhtml
[2] https://wiki.libsdl.org/SDL_PixelFormatEnum
[3] https://bugzilla.libsdl.org/show_bug.cgi?id=2111
