Thanks, I will try that. I was wondering about:
bufPtr[3] = 1;
It looks like this is for when the pixel should be visible. Isn't
this the alpha channel? Shouldn't that be 255?
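For reference (just my reading of the OpenGL docs, not tested against
your code): with GL_UNSIGNED_BYTE data each channel runs 0-255, so I
would have expected something like:

bufPtr[3] = 255; // fully opaque under standard blending
// bufPtr[3] = 1 would be 1/255, i.e. almost completely transparent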
Thanks Again
-- Rick
On 5/3/07, Garrett Potts <[EMAIL PROTECTED]> wrote:
Hello:
I typically use the setImage method and this seems to work fine. I
have never tried going the allocateImage route. Here is some code (only
the unsigned 8-bit part) that converts our own internal image data
format for use as an osg::Image:
void ossimPlanetImage::fromOssimImage(ossimRefPtr<ossimImageData> data,
                                      bool reassignNullFlag,
                                      double nullValue)
{
   ossim_uint32 w = 0;
   ossim_uint32 h = 0;
   GLint  internalFormat = GL_LUMINANCE;
   GLenum pixelFormat    = GL_LUMINANCE;
   GLenum type           = GL_FLOAT;
   osg::Image::AllocationMode allocMode = osg::Image::USE_NEW_DELETE;
   unsigned char* buf = 0;
   if(data.valid())
   {
      w = data->getWidth();
      h = data->getHeight();
      switch(data->getScalarType())
      {
         case OSSIM_UINT8: // only supports 1 to 3 bands; converts to an RGBA gl type
         {
            if(data->getNumberOfBands() > 0)
            {
               ossim_uint8 nullPix = (ossim_uint8)data->getNullPix(0);
               ossim_uint32 sizeInBytes = data->getSizePerBandInBytes()*4;
               buf = new unsigned char[sizeInBytes];
               unsigned char* bufPtr = buf;
               memset(buf, 0, sizeInBytes); // start out fully transparent
               if(data->getBuf() &&
                  (data->getDataObjectStatus() != OSSIM_EMPTY))
               {
                  // point at each source band, replicating band 0 for
                  // any missing bands
                  unsigned char* dataBuf[3];
                  dataBuf[0] = (unsigned char*)data->getBuf(0);
                  if(data->getNumberOfBands() > 1)
                  {
                     dataBuf[1] = (unsigned char*)data->getBuf(1);
                  }
                  else
                  {
                     dataBuf[1] = (unsigned char*)data->getBuf(0);
                  }
                  if(data->getNumberOfBands() > 2)
                  {
                     dataBuf[2] = (unsigned char*)data->getBuf(2);
                  }
                  else
                  {
                     dataBuf[2] = (unsigned char*)data->getBuf(0);
                  }
                  ossim_uint32 area = data->getWidth()*data->getHeight();
                  ossim_uint32 idx = 0;
                  for(; idx < area; ++idx)
                  {
                     if((*dataBuf[0] != nullPix)||
                        (*dataBuf[1] != nullPix)||
                        (*dataBuf[2] != nullPix))
                     {
                        bufPtr[0] = *dataBuf[0];
                        bufPtr[1] = *dataBuf[1];
                        bufPtr[2] = *dataBuf[2];
                        bufPtr[3] = 1; // mark the pixel as visible
                     }
                     else
                     {
                        bufPtr[3] = 0; // null pixel stays transparent
                     }
                     bufPtr += 4;
                     ++dataBuf[0];
                     ++dataBuf[1];
                     ++dataBuf[2];
                  }
                  internalFormat = GL_RGBA;
                  pixelFormat    = GL_RGBA;
                  type           = GL_UNSIGNED_BYTE;
               }
            }
            break;
         }
         default:
         {
            // not supported yet
            break;
         }
      }
   }
   if(buf)
   {
      setImage(w, h, 1, internalFormat, pixelFormat, type, buf, allocMode);
   }
}
Note: ossimPlanetImage just derives from osg::Image. Calling
setImage seems to work fine. The result is then used as a GL texture.
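In case it helps, a rough sketch of how the result gets hooked up (the
tileData variable and the default constructor here are illustrative,
not lifted from the actual ossimPlanet code):

osg::ref_ptr<ossimPlanetImage> image = new ossimPlanetImage;
image->fromOssimImage(tileData, false, 0.0); // tileData: an ossimRefPtr<ossimImageData>
osg::ref_ptr<osg::Texture2D> texture = new osg::Texture2D;
texture->setImage(image.get()); // works because ossimPlanetImage is an osg::Image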
Hope this helps some
Take care
Garrett
On May 3, 2007, at 3:14 PM, [EMAIL PROTECTED] wrote:
Thanks, but no dice
On 5/3/07, Michael Henheffer <[EMAIL PROTECTED] > wrote:
Hi Rick,
I'm not sure if this will fix your problem, but try calling dirty() on
the image after you modify the data.
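Something like this (a minimal sketch, assuming image is the osg::Image
you already created):

unsigned char* p = image->data(col, row);
p[3] = 128;      // tweak a pixel
image->dirty();  // bumps the modified count so any texture using the image re-uploads it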
Mike
[EMAIL PROTECTED] wrote:
>
> Hello All,
>
>
>
> I am stuck trying to work out how to programmatically create an
> image for a texture.
>
> Here is what I have so far:
>
>
>
> osg::ref_ptr<osg::Image> image = new osg::Image();
> int s, t; s = t = 256;
> image->allocateImage(s, t, 1, GL_BGRA, GL_UNSIGNED_BYTE);
>
> for (int row = 0; row < t; ++row)
> {
>     for (int col = 0; col < s; ++col)
>     {
>         unsigned char* newImageData = image->data(col, row);
>         newImageData[0] = row;            // blue (GL_BGRA ordering)
>         newImageData[1] = col;            // green
>         newImageData[2] = (row + col)/2;  // red
>         newImageData[3] = 255 - col;      // alpha ('b' was undefined in the original; col is a guessed stand-in)
>     }
> }
>
> // It does not seem to work if I do not write the image out and
> // read it back in
> osgDB::writeImageFile(*image, "temp.bmp");
> image = osgDB::readImageFile("temp.bmp");
>
> osg::ref_ptr<osg::Texture2D> texture = new osg::Texture2D;
> texture->setImage( image.get());
> ...
>
>
>
> In this code, I am writing the image to a temp file and then reading
> it back out again. It seems that if I do not do this, it does not
> work. I am guessing I am not creating the image properly, but the
> .bmp writer is smart enough to figure it out, whereas the Texture2D
> does not like it. I tried looking at the more verbose output and did
> not see anything different. I wrote both images out to .osg files,
> but they were identical (neither showing a texture; where are
> textures stored when writing osg files anyway?).
>
> This is actually all in my effort to get the osgDB_lwo reader using
> the transparency map from the .lwo file itself (I did not imagine I
> would have such a tough time of it :( ). Even by writing the .bmp out
> and reading it back in, I am not getting the alpha channel for the
> blending. I have also tried allocating the image with GL_BGR, with no
> more success.
>
> Like I said, I am sure I am doing something wrong. Someone please
> help :O Perhaps there are examples somewhere of manually creating an
> image as a texture?
>
> Your groveling servant
>
> -- Rick
_______________________________________________
osg-users mailing list
[email protected]
http://openscenegraph.net/mailman/listinfo/osg-users
http://www.openscenegraph.org/