Re: [osg-users] I get errors when trying to render to a luminance buffer...

2008-06-02 Thread J.P. Delport

Hi,

Viggo Løvli wrote:

Ok, that makes sense.
It happens every time I render with a floating point texture.

The fall in framerate does not happen when I run the OSG multi render
target example using HDR. My code does however use additive blending
toward the floating point texture, so I bet that is what causes it to
fall into a software rendering mode.


We have also experienced this slowdown when blending was enabled with 
float textures. Testing on newer cards is pending as we cannot figure 
out from docs on the internet if this is actually supported in current 
hardware at all. Seems like DX10.1 mandates float blending, but we will 
test to make sure. Please let me know if you know if it is supported in 
hardware.




Do you know of any texture surface format with more than 8 bits per
channel (unsigned integer)?


I don't know of any. I've only seen a paper once where people were using 
three channels to simulate one large channel. The RGB channels were 
partially overlapped to create a higher dynamic range channel.


jp



Viggo


  Date: Fri, 30 May 2008 16:29:05 +0100
  From: [EMAIL PROTECTED]
  To: osg-users@lists.openscenegraph.org
  Subject: Re: [osg-users] I get errors when trying to render to a luminance buffer...

 
  Hi Viggo,
 
  When performance drops like this it's because you've dropped onto a
  software fallback path in the OpenGL driver. Exactly what formats are
  software vs hardware depends upon the hardware and OpenGL drivers.
  You'll need to check with your hardware vendors specs to see what will
  be hardware accelerated.
 
  Robert.
 
  On Fri, May 30, 2008 at 2:16 PM, Viggo Løvli [EMAIL PROTECTED] wrote:
   Hi Robert,
  
   I modified my code as you suggested.
   The warning is gone :-)
  
    The framerate is now 10 seconds per frame instead of 30 frames per second.
    It does something.
    The texture I render to remains black (cleared to black).
   
    If I change the setInternalFormat to GL_RGBA then the framerate is up again,
    and the texture gets colors. This works, but then I only have 8 bits in the
    red channel. What I need is as many bits as possible in the red channel,
    preferably 32. And I do not need the GBA channels.
    Do you have a suggestion for me on this one?
  
   Viggo
  
    Date: Fri, 30 May 2008 13:25:24 +0100
    From: [EMAIL PROTECTED]
    To: osg-users@lists.openscenegraph.org
    Subject: Re: [osg-users] I get errors when trying to render to a luminance buffer...
  
   Hi Viggo,
  
    The warning is exactly right, pbuffers don't support multiple render
    targets, only FrameBufferObjects do.
   
    Perhaps what you intend is not to use multiple render targets, in
    which case you should set the Camera attachment to COLOR_BUFFER rather
    than COLOR_BUFFER0; the latter tells the OSG that you want MRT and
    will be using gl_FragData[] in your shaders.
   
    Also the Camera::setDrawBuffer(GL_COLOR_ATTACHMENT0_EXT) is
    inappropriate for pbuffers.
  
   Robert.
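
A minimal sketch of the single-render-target setup Robert describes above, in C++ against the OSG 2.x API. It is an illustration, not code from the thread: the function name and the 1024x1024 size are hypothetical, and osg::TextureRectangle is chosen to match the texture2DRect() lookup in the shaders quoted below.

#include <osg/Camera>
#include <osg/TextureRectangle>
#include <osg/ref_ptr>

// Attach one color target as COLOR_BUFFER (single target, no MRT), so
// neither COLOR_BUFFER0 nor setDrawBuffer(GL_COLOR_ATTACHMENT0_EXT) is used.
osg::ref_ptr<osg::Camera> createLuminanceTarget( osg::TextureRectangle* tex )
{
    tex->setTextureSize( 1024, 1024 );               // assumed target size
    tex->setInternalFormat( GL_LUMINANCE16F_ARB );   // needs ARB_texture_float
    tex->setSourceFormat( GL_RED );
    tex->setSourceType( GL_FLOAT );

    osg::ref_ptr<osg::Camera> camera = new osg::Camera;
    camera->setRenderTargetImplementation( osg::Camera::FRAME_BUFFER_OBJECT );
    camera->attach( osg::Camera::COLOR_BUFFER, tex );   // not COLOR_BUFFER0
    return camera;
}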
  
    On Fri, May 30, 2008 at 1:18 PM, Viggo Løvli [EMAIL PROTECTED] wrote:

Hi,
   
I want to render to a floating point buffer, and I set things up like this:

tex->setInternalFormat( GL_LUMINANCE16F_ARB );
tex->setSourceFormat( GL_RED );
tex->setSourceType( GL_FLOAT );

camera->setRenderTargetImplementation( osg::Camera::FRAME_BUFFER_OBJECT );
camera->attach( osg::Camera::BufferComponent( osg::Camera::COLOR_BUFFER0 ), tex );
camera->setDrawBuffer( GL_COLOR_ATTACHMENT0_EXT );

My fragment shader writes the value to the surface this way:
gl_FragData[0].r = 1.0;

Another fragment shader reads the surface this way:
value = texture2DRect( id, gl_FragCoord.xy ).r;

I get the following output when I try to run my app:
Warning: RenderStage::runCameraSetUp(state) Pbuffer does not support multiple color outputs.

My app runs, but nothing is written to the texture.

Is it possible to set up a surface that holds one channel (GL_RED) which is an unsigned int of 32 bit resolution? I'd rather use that than a float :-)

Viggo
   


Re: [osg-users] I get errors when trying to render to a luminance buffer...

2008-06-02 Thread Viggo Løvli

Hi again :-)
 
I abandoned the idea of using a floating point buffer.
I went into the thinking box and came to the same conclusion you wrote about at the end of your comment.
 
I needed quite a high resolution, so I decided to use all 4 channels (RGBA). The numbers I want to accumulate are seldom large, so I needed the most resolution at the low end.
I thus decided to use RGBA where R is most significant and A is least significant.
Bit usage is:
   R = 6 bits
   G = 5 bits
   B = 4 bits
   A = 3 bits
All unused bits overlap with the next channel.
 
I am using this to generate a pixel weight when rendering many volumetric clouds on top of each other on screen. Naturally, most overlapping clouds will be further away from the camera, so I needed as many overlap bits as I could get on the least significant channels. My usage accepts, in the worst case, 32 overlapping pixels with maximum weight (in the distance). I think that is something I can live with :-)
 
Anyhow, I got a range (18 bit) that was good enough for my usage and it gives a 
decent cloud sorting for clouds seen up to 40 kilometers away.
 
I must stress one point:
- If you ever try using the alpha channel for this, remember to turn off alpha-clipping (set the alpha-func to always).
 
Viggo
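
A CPU-side sketch of the packing arithmetic described above, in C++ for illustration. The helper names (encodeWeight, decodeSum) are hypothetical, and the sketch assumes each 8-bit channel sum stays below 256, which is what the headroom ("overlap") bits guarantee:

#include <cstdio>

struct RGBA { unsigned r, g, b, a; };       // per-channel sums, 0..255 each

// Split an 18-bit weight across the low bits of the four 8-bit channels.
RGBA encodeWeight( unsigned w )             // w in 0..262143 (18 bits)
{
    RGBA c;
    c.r = (w >> 12) & 0x3F;                 // top 6 bits, 2 bits headroom
    c.g = (w >>  7) & 0x1F;                 // next 5 bits, 3 bits headroom
    c.b = (w >>  3) & 0x0F;                 // next 4 bits, 4 bits headroom
    c.a =  w        & 0x07;                 // low 3 bits, 5 bits headroom
    return c;
}

// After additive blending, weight each channel sum by its place value;
// carries out of a channel's payload land in its headroom bits and are
// recovered here (the "overlap with the next channel").
unsigned decodeSum( const RGBA& s )
{
    return s.r * 4096u + s.g * 128u + s.b * 8u + s.a;
}

int main()
{
    RGBA sum = { 0, 0, 0, 0 };
    for( int i = 0; i < 32; ++i )           // 32 distant clouds, max weight
    {                                       // that fits entirely in A
        RGBA c = encodeWeight( 7 );
        sum.r += c.r; sum.g += c.g; sum.b += c.b; sum.a += c.a;
    }
    std::printf( "accumulated weight = %u\n", decodeSum( sum ) );   // 224
    return 0;
}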

Re: [osg-users] I get errors when trying to render to a luminance buffer...

2008-06-02 Thread Guy
Hello,

 

Category: FBO, blending, H/W

 

I don't understand exactly what you mean by "H/W blending with float is not supported".

I know for sure that we render particle effects with transparency to a float (16-bit) FBO and get the correct results. We also get that with a decent frame rate (100-200 Hz, depending on the number of particles).

 

So I've the feeling that I didn't understand what you mean by blend.

 

Our H/W is an nVIDIA 8800, driver 6.14.11.6921, OSG 2.0 I think :-)

 

I'd appreciate it if you could clarify your problem.

Thanks,

Guy.

 



From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of Viggo Løvli
Sent: Monday, June 02, 2008 11:29 AM
To: OpenSceneGraph Users
Subject: Re: [osg-users] I get errors when trying to render to a luminance buffer...

 


Re: [osg-users] I get errors when trying to render to a luminance buffer...

2008-06-02 Thread J.P. Delport

Hi,

(I'm not the guy with the original problem, just interested...)

Guy wrote:

Hello,

 


Category: FBO, blending, H/W

 

I don't understand exactly what you mean by "H/W blending with float is not supported".


I know for sure that we render particle effects with transparency to a float (16-bit) FBO and get the correct results. We also get that with a decent frame rate (100-200 Hz, depending on the number of particles).


I'm interested in transparency with 32-bit floats (alpha blending). Have 
you tested that? I would like to know if it is supported in hardware.




 


So I've the feeling that I didn't understand what you mean by blend.

 


Our H/W is an nVIDIA 8800, driver 6.14.11.6921, OSG 2.0 I think :-)

 


I'd appreciate it if you could clarify your problem.

Thanks,

Guy.


jp



 





Re: [osg-users] I get errors when trying to render to a luminance buffer...

2008-06-02 Thread Guy
Category: FBO, blending, H/W

Well, I'm running a different setup from the one I mentioned previously.
I use osg 1.2 (yeah yeah, I promise to upgrade soon). I haven't integrated the particle system into my application, but I use one big billboard with transparency.
Until now my FBO cameras used GL_RGBA16F_ARB textures, and the texture for the billboard was also GL_RGBA16F_ARB.
I just changed all of them to GL_RGBA32F_ARB and the scene renders correctly (with transparency).
I haven't tried to save the frames to files and check the precision, so I have no way to know whether the driver fell back to half-float, but I don't believe it would do that anyhow. I'd expect that if there were troubles they would be the same as yours. But I get a solid frame rate of 80+ Hz (just as before I added the billboard with the blending mode to the scene).
Again, I work with an nVIDIA 8800 GT, driver 6.14.11.6921, on Windows XP.

Do you want to send me .osg scenes, or code, that I'll test on my machine?
Maybe you meant the GL_LUMINANCE_ALPHA32F_ARB format? If so, tell me how to set it up instead of the RGBA format and I'll try it.

Guy. 
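
A minimal sketch of this kind of setup, assuming the OSG 2.x-style API (Guy's code is osg 1.2, where names may differ); the function name and texture size are illustrative, not Guy's actual code:

#include <osg/Billboard>
#include <osg/BlendFunc>
#include <osg/Camera>
#include <osg/StateSet>
#include <osg/Texture2D>

// Render into a 32-bit float color target with standard alpha blending
// on a transparent billboard, roughly as described above.
void setupFloatTargetWithBlending( osg::Camera* camera, osg::Billboard* billboard )
{
    osg::ref_ptr<osg::Texture2D> tex = new osg::Texture2D;
    tex->setTextureSize( 1024, 1024 );
    tex->setInternalFormat( GL_RGBA32F_ARB );
    tex->setSourceFormat( GL_RGBA );
    tex->setSourceType( GL_FLOAT );

    camera->setRenderTargetImplementation( osg::Camera::FRAME_BUFFER_OBJECT );
    camera->attach( osg::Camera::COLOR_BUFFER, tex.get() );

    osg::StateSet* ss = billboard->getOrCreateStateSet();
    ss->setAttributeAndModes( new osg::BlendFunc( GL_SRC_ALPHA,
                                                  GL_ONE_MINUS_SRC_ALPHA ) );
    ss->setRenderingHint( osg::StateSet::TRANSPARENT_BIN );
}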

-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of J.P. Delport
Sent: Monday, June 02, 2008 2:22 PM
To: OpenSceneGraph Users
Subject: Re: [osg-users] I get errors when trying to render to a luminance buffer...


Re: [osg-users] I get errors when trying to render to a luminance buffer...

2008-06-02 Thread Viggo Løvli

Hi,

When one float mode did not work, I tried all the modes I could see in the gl.h file. The result on my end was the same every time: 10 SPF (seconds per frame) instead of the 30 FPS I had when testing with GL_RGBA. The first one I tried was GL_LUMINANCE16F_ARB; I figured I'd start with a simple one (not 32 bit). My blend function settings are source = ONE, destination = ONE.

The solution I have now, where I use 18 bits with 14 overlapping bits in an unsigned byte RGBA (GL_RGBA mode with additive blend), works, but it would work much better if I had 32 bits to play with. So I will have to research this further in the future. I have to move on now though; the research code done so far has proven that the technology I am researching works.

 Viggo
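
For illustration, the accumulation state described above (additive blend, alpha function forced to always) might look like this in OSG; a sketch with a hypothetical helper name, not the actual research code:

#include <osg/AlphaFunc>
#include <osg/BlendFunc>
#include <osg/Node>
#include <osg/StateSet>

void setAdditiveAccumulationState( osg::Node* cloudNode )
{
    osg::StateSet* ss = cloudNode->getOrCreateStateSet();
    // Additive accumulation: source = ONE, destination = ONE.
    ss->setAttributeAndModes( new osg::BlendFunc( GL_ONE, GL_ONE ) );
    // Force the alpha test to always pass so small A-channel weights are
    // not clipped away (the alpha-clipping pitfall mentioned earlier in
    // the thread).
    ss->setAttributeAndModes( new osg::AlphaFunc( osg::AlphaFunc::ALWAYS, 0.0f ) );
}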




[osg-users] I get errors when trying to render to a luminance buffer...

2008-05-30 Thread Viggo Løvli

Hi,
 
I want to render to a floating point buffer, and I set things up like this:
 
tex->setInternalFormat( GL_LUMINANCE16F_ARB );
tex->setSourceFormat( GL_RED );
tex->setSourceType( GL_FLOAT );
 
camera->setRenderTargetImplementation( osg::Camera::FRAME_BUFFER_OBJECT );
camera->attach( osg::Camera::BufferComponent( osg::Camera::COLOR_BUFFER0 ), tex );
camera->setDrawBuffer( GL_COLOR_ATTACHMENT0_EXT );
 
My fragment shader writes the value to the surface this way:
gl_FragData[0].r = 1.0;
 
Another fragment shader reads the surface this way:
value = texture2DRect( id, gl_FragCoord.xy ).r;
 
I get the following output when I try to run my app:
Warning: RenderStage::runCameraSetUp(state) Pbuffer does not support multiple color outputs.
 
My app runs, but nothing is written to the texture.
 
Is it possible to set up a surface that holds one channel (GL_RED) which is an unsigned int of 32 bit resolution? I'd rather use that than a float :-)
 
Viggo


Re: [osg-users] I get errors when trying to render to a luminance buffer...

2008-05-30 Thread Robert Osfield
Hi Viggo,

When performance drops like this it's because you've dropped onto a
software fallback path in the OpenGL driver.  Exactly what formats are
software vs hardware depends upon the hardware and OpenGL drivers.
You'll need to check with your hardware vendors specs to see what will
be hardware accelerated.

Robert.

On Fri, May 30, 2008 at 2:16 PM, Viggo Løvli [EMAIL PROTECTED] wrote:
 Hi Robert,

 I modified my code as you suggested.
 The warning is gone :-)

 The framerate is now 10 seconds per frame instead of 30 frames per second.
 It does something.
 The texture I render to remains black (cleared to black).

 If I change the setInternalFormat to GL_RGBA then the framerate is up again,
 and the texture gets colors. This works, but then I only have 8 bit in the
 red channel. What I need is as many bits as possible in the red channel,
 preferably 32. And I do not need GBA channels.
 Do you have a suggestion for me on this one?

 Viggo

 Date: Fri, 30 May 2008 13:25:24 +0100
 From: [EMAIL PROTECTED]
 To: osg-users@lists.openscenegraph.org
 Subject: Re: [osg-users] I get errors when trying to render to a luminance
 buffer...

 Hi Viggo,

 The warning is exactly right, pbuffers don't suport multiple render
 targets, only FrameBufferObjects do.

 Perhaps what you intend it not to use multiple render targets, in
 which case you should set the Camera attachment to COLOR_BUFFER rather
 than COLOR_BUFFER0, that later tells the OSG that you want MRT and
 will be using glFragColor[] in your shaders.

 Also the Camera::setDrawBuffer(GL_COLOR_ATTACHMENT0_EXT) is
 inappropriate for pbuffers.

 Robert.

 On Fri, May 30, 2008 at 1:18 PM, Viggo Løvli [EMAIL PROTECTED] wrote:
  Hi,
 
  I want to render to a floating point buffer, and I set things up like
  this:
 
  tex-setInternalFormat( GL_LUMINANCE16F_ARB );
  tex-setSourceFormat( GL_RED );
  tex-setSourceType( GL_FLOAT );
 
  camera-setRenderTargetImplementation( osg::Camera::FRAME_BUFFER_OBJECT
  );
  camera-attach( osg::Camera::BufferComponent( osg::Camera::COLOR_BUFFER0
  ),
  tex );
  camera-setDrawBuffer( GL_COLOR_ATTACHMENT0_EXT );
 
  My fragment-shader that write to the surface output the value this way:
  gl_FragData[0].r = 1.0;
 
  Another fragment-shader reads the surface this way:
  value = texture2DRect( id, gl_FragCoord.xy ).r;
 
  I get the following output when I try to run my app:
  Warning: RenderStage::runCameraSetUp(state) Pbuffer does not support
  multiple color outputs.
 
  My app runs, but nothing is written to the texture.
 
  Is it possible to set up a surface that holds one channel (GL_RED) which
  is
  an unsigned int of 32 bit resolution? I'd rather use that than a float
  :-)
 
  Viggo
 
  
  Få Hotmail du også. Windows Live Hotmail nå med 5000 MB gratis
  lagringsplass.
  ___
  osg-users mailing list
  osg-users@lists.openscenegraph.org
 
  http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org
 
 
 ___
 osg-users mailing list
 osg-users@lists.openscenegraph.org
 http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org


 
 SkyDrive er her. Glem minnepinnen!
 ___
 osg-users mailing list
 osg-users@lists.openscenegraph.org
 http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org


___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org


Re: [osg-users] I get errors when trying to render to a luminance buffer...

2008-05-30 Thread Viggo Løvli

Ok, that makes sense.
It happens every time I render with a floating point texture.

The fall in framerate does not happen when I run the OSG multi render target example using HDR. My code does however use additive blending toward the floating point texture, so I bet that is what causes it to fall into a software rendering mode.

Do you know of any texture surface format with more than 8 bits per channel (unsigned integer)?

Viggo

