Hello Sebastian,
Thank you for your reply!
In my program the camera is set to pre-render, the render target
implementation is FRAME_BUFFER_OBJECT, and I attach the camera to an
osg::Image so that I can read the buffer data back through the image.
(1) The fragment shader is as follows:

out vec4 Frag_Color;
void main()
{
    Frag_Color = vec4(a, b, c, d);
}

Here a, b, c and d can be any values between 0 and 1.
Examples of the inaccuracy in the output data
(value set in the fragment shader -> value read back via the image):
0.55555 -> 0.556863
0.174 -> 0.172549
0.17 -> 0.168627
0.99 -> 0.988235
Frag_Color = vec4(0.2,0.3,0.4,0.5) ->
cur[0]=0.2,cur[1]=0.298039,cur[2]=0.4,cur[3]=0.498039
Frag_Color = vec4(0.1,0.6,0.7,0.8) ->
cur[0]=0.0980392,cur[1]=0.6,cur[2]=0.698039,cur[3]=0.8
Frag_Color = vec4(0.9,0.66,0.77,0.88) ->
cur[0]=0.898039,cur[1]=0.658824,cur[2]=0.768627,cur[3]=0.878431
(2) The camera setup details are listed below.
// render target
osg::Camera::RenderTargetImplementation renderTargetImplementation =
    osg::CameraNode::FRAME_BUFFER_OBJECT;

// shader program
osg::Program* program = new osg::Program();
program->addShader(osg::Shader::readShaderFile(osg::Shader::FRAGMENT,
                                               fragmentShaderPath));
program->addShader(osg::Shader::readShaderFile(osg::Shader::VERTEX,
                                               vertexShaderPath));

// image
osg::Image* image = new osg::Image();
image->allocateImage((int)XRes, (int)YRes, 1, GL_RGBA, GL_FLOAT);

// camera
osg::ref_ptr<osg::CameraNode> camera(new osg::CameraNode());
camera->setProjectionMatrixAsFrustum(-tan(YawView * math::D2R * 0.5),
                                     tan(YawView * math::D2R * 0.5),
                                     -tan(PitchView * math::D2R * 0.5),
                                     tan(PitchView * math::D2R * 0.5),
                                     zNear, zFar);
camera->setComputeNearFarMode(osg::CullSettings::DO_NOT_COMPUTE_NEAR_FAR);
camera->setViewport(0, 0, XRes, YRes);
camera->setClearColor(osg::Vec4(1000.0f, 1000.0f, 1000.0f, 1000.0f));
camera->setClearMask(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
camera->setReferenceFrame(osg::Transform::ABSOLUTE_RF);
camera->setViewMatrix(osg::Matrix::lookAt(osg::Vec3d(0, 0, 1.0f),
                                          osg::Vec3d(-10.0f, 0.0f, 0),
                                          osg::Vec3d(0, 0, 1)));
camera->setRenderOrder(osg::CameraNode::PRE_RENDER);
camera->setRenderTargetImplementation(renderTargetImplementation);
camera->attach(osg::CameraNode::COLOR_BUFFER, image);
// read out the data (note: XRes/YRes must match the allocated image size)
osg::Vec4f* rgbaData = (osg::Vec4f*) image->data();
for (int h = 0; h < YRes; h++) {
    for (int w = 0; w < XRes; w++) {
        osg::Vec4f cur = rgbaData[XRes * h + w];
        cout << "cur[0]=" << cur[0] << ",cur[1]=" << cur[1]
             << ",cur[2]=" << cur[2] << ",cur[3]=" << cur[3] << endl;
    }
}
Thank you very much for any advice!
Shuiying
On 01/18/2012 08:11 AM, Sebastian Messerschmidt wrote:
I suppose you are using a normal framebuffer.
In that case it is quite normal that the original value in the image
(you unfortunately didn't tell us whether the value comes from a
texture) doesn't exactly match the value in the framebuffer. Usually the
framebuffer has 8 bits per colour channel, which makes a value such as
0.174977 unrepresentable.
If you need the "exact" value to be in the framebuffer, you'll have to
render into a floating-point buffer via RTT.
Also, you didn't tell us how you obtained the result value. There is
also the possibility that you read an interpolated/filtered value.
If you provide a bit more context, someone might be able to help you
with your problem.
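A floating-point RTT attachment could look roughly like this (only a
sketch: width, height and camera stand for your own resolution and RTT
camera, and GL_RGBA32F_ARB requires GL_ARB_texture_float support):

```cpp
// Allocate a float image and attach it as the camera's colour buffer.
osg::ref_ptr<osg::Image> image = new osg::Image();
image->allocateImage(width, height, 1, GL_RGBA, GL_FLOAT);
// Ask for a genuine float internal format; without this the FBO may
// silently fall back to a fixed-point RGBA8 attachment, which quantizes
// every channel to 8 bits (n/255).
image->setInternalTextureFormat(GL_RGBA32F_ARB);
camera->attach(osg::Camera::COLOR_BUFFER, image);
```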
Hello,
when I write gl_FragColor = vec4(0.174977, 0, 0, 1) in the fragment
shader, I then read back, through the attached image, the float Vec4
(0.176471, 0, 0, 1).
So why has 0.174977 changed into 0.176471?
I think there should be no stage after the per-fragment operations in
the rendering pipeline that could change this value. Blending, alpha
test and so on are all disabled.
Thank you in advance!
Shuiying
_______________________________________________
osg-users mailing list
[email protected]
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org