[osg-users] Sharing an OpenGL texture context with another API

2014-07-08 Thread Harash Sharma
Dear Mr. Robert,

I have a frame grabber card. The accompanying APIs allow me to grab image 
frames. I had been using these APIs to grab the video frames and then display 
the video using a textured quad. The display works fine, but the procedure 
consumes CPU and is a bit slow. One of the provided APIs also allows a frame to 
be rendered directly to an OpenGL texture with a given ID, which would be very 
helpful in offloading the processor. I have seen a couple of earlier posts on 
the topic. I tried to get the texture ID from the TextureObject and draw the 
grabbed image to it, but it does not work.
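
For reference, a minimal sketch of pulling the GL texture id out of an 
osg::Texture2D from a camera pre-draw callback and handing it to an external 
API. This is an assumption about the approach, not the original code; 
grabFrameToGLTexture() is a hypothetical stand-in for the frame grabber call:

#include <osg/Camera>
#include <osg/Texture2D>

extern void grabFrameToGLTexture(GLuint textureId);   // hypothetical grabber API

struct GrabberCallback : public osg::Camera::DrawCallback
{
    GrabberCallback(osg::Texture2D* tex) : _tex(tex) {}

    virtual void operator()(osg::RenderInfo& renderInfo) const
    {
        unsigned int contextID = renderInfo.getState()->getContextID();
        osg::Texture::TextureObject* to = _tex->getTextureObject(contextID);
        if (!to)
        {
            // force creation of the GL texture object for this context
            _tex->apply(*renderInfo.getState());
            to = _tex->getTextureObject(contextID);
        }
        if (to)
            grabFrameToGLTexture(to->id());   // raw GL id of the osg texture
    }

    osg::ref_ptr<osg::Texture2D> _tex;
};

// usage: camera->setPreDrawCallback(new GrabberCallback(texture.get()));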

I would be very thankful if you can suggest a suitable solution.

Regards

Harash Kumar Sharma
___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org


Re: [osg-users] Accessing Geometry user Data in Geometry / Vertex / Fragment Shader

2011-03-23 Thread Harash Sharma
Dear Mr. Hanson,
Thanks for the quick reply. I was able to define the data as vertex attributes 
and access these attributes in the vertex shader. Things are up and running. 

Thanks once again.

Regards

Harash




From: Chris 'Xenon' Hanson 
To: OpenSceneGraph Users 
Sent: Wed, March 23, 2011 1:39:18 AM
Subject: Re: [osg-users] Accessing Geometry user Data in Geometry / Vertex / 
Fragment Shader

On 3/22/2011 1:14 PM, Harash Sharma wrote:
> For one of the applications I am working on, I need to attach some data to all
> the geometry nodes of the scenegraph. The data may be different for all the
> nodes. I need to use this data for some calculations during rendering. Is
> there a way to access this data in the geometry / vertex / fragment shaders?

  If the data is per-node, look at using Uniforms.

  If it is per-vertex, use vertex attributes.
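
For reference, a minimal sketch of both options (OSG 2.x style; geode, geometry,
program, vertices and computeValue() are placeholders, not from the original
post):

// per-node data: one value for the whole node, read as "uniform float nodeValue;"
osg::StateSet* ss = geode->getOrCreateStateSet();
ss->addUniform(new osg::Uniform("nodeValue", 0.25f));

// per-vertex data: one value per vertex, read as "attribute float vertexValue;"
osg::ref_ptr<osg::FloatArray> data = new osg::FloatArray;
for (unsigned int i = 0; i < vertices->size(); ++i)
    data->push_back(computeValue(i));
geometry->setVertexAttribArray(6, data.get());
geometry->setVertexAttribBinding(6, osg::Geometry::BIND_PER_VERTEX);
program->addBindAttribLocation("vertexValue", 6);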

> Harash Kumar Sharma

-- 
Chris 'Xenon' Hanson, omo sanza lettere. 
Xenon@AlphaPixel.com  http://www.alphapixel.com/
  Digital Imaging. OpenGL. Scene Graphs. GIS. GPS. Training. Consulting. 
Contracting.
"There is no Truth. There is only Perception. To Perceive is to Exist." - 
Xen
___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org





[osg-users] Accessing Geometry user Data in Geometry / Vertex / Fragment Shader

2011-03-22 Thread Harash Sharma
Hi All,

For one of the applications I am working on, I need to attach some data to all 
the geometry nodes of the scenegraph. The data may be different for each node. 
I need to use this data for some calculations during rendering. Is there a way 
to access this data in the geometry / vertex / fragment shaders?

Thanks in advance

Regards

Harash Kumar Sharma


  ___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org


Re: [osg-users] osgPPU CUDA Example - slower than expected?

2011-01-09 Thread Harash Sharma
Dear Mr. Art,

I too am noticing a problem similar to what Mr. Thorsten pointed out. Curious 
about how well OpenGL and CUDA work together, I downloaded osg-2.9.10 and the 
osgCompute nodekit. I have CUDA 3.2 installed on my machine (a Core2Duo with a 
GeForce card). The osgGeometryDemo sample code for warping cow.osg gives a 
reasonably high frame rate. I thought I should share this in case it is of any 
help.

Regards

Harash




From: J.P. Delport 
To: OpenSceneGraph Users 
Sent: Mon, January 3, 2011 3:30:34 PM
Subject: Re: [osg-users] osgPPU CUDA Example - slower than expected?

Hi,

I don't have any other suggestions than to use a GL debugger to make 
sure nothing is going to CPU or to try the new CUDA functions in osgPPU 
or your own code. I remember something in the GL to CUDA stuff bugging 
me, but cannot remember the details. AFAIR something was converting from 
texture to PBO and then to CUDA mem.

jp
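
For reference, a minimal sketch of the CUDA 3.x graphics-interop path for
mapping a GL pixel buffer object into CUDA memory, which replaced the older
cudaGLRegisterBufferObject()/cudaGLMapBufferObject() pair; pboId is assumed to
be a PBO already created on the GL side, and this is only an assumption about
what the newer code path looks like:

#include <cuda_runtime.h>
#include <cuda_gl_interop.h>

cudaGraphicsResource* resource = 0;

// register once, after the PBO exists in the GL context
cudaGraphicsGLRegisterBuffer(&resource, pboId, cudaGraphicsRegisterFlagsNone);

// each frame: map, get a device pointer, run kernels, unmap
cudaGraphicsMapResources(1, &resource, 0);
void*  devPtr = 0;
size_t size   = 0;
cudaGraphicsResourceGetMappedPointer(&devPtr, &size, resource);
// ... launch CUDA kernels on devPtr ...
cudaGraphicsUnmapResources(1, &resource, 0);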

On 16/12/10 13:25, Thorsten Roth wrote:
> Hi,
>
> as I explained in some other mail to this list, I am currently working
> on a graph based image processing framework using CUDA. Basically, this
> is independent from OSG, but I am using OSG for my example application :-)
>
> For my first implemented postprocessing algorithm I need color and depth
> data. As I want the depth to be linearized between 0 and 1, I used a
> shader for that and also I render it in a separate pass to the color.
> This stuff is then fetched from the GPU to the CPU by directly attaching
> osg::Images to the cameras. This works perfectly, but is quite a bit
> slow, as you might already have suspected, because the data is also
> processed in CUDA kernels later, which is quite a back and forth ;-)
>
> In fact, my application with three filter kernels based on CUDA (one
> gauss blur with radius 21, one image subtract and one image "pseudo-add"
> (about as elaborate as a simple add ;-)) yields about 15 fps with a
> resolution of 1024 x 1024 (images for normal and absolute position
> information are also rendered and transferred from GPU to CPU here).
>
> So with these 15 frames, I thought it should perform FAR better when
> avoiding that GPU <-> CPU copying stuff. That's when I came across the
> osgPPU-cuda example. As far as I am aware, this uses direct mapping of
> PixelBuferObjects to cuda memory space. This should be fast! At least
> that's what I thought, but running it at a resolution of 1024 x 1024
> with a StatsHandler attached shows that it runs at just ~21 fps, not
> getting too much better when the cuda kernel execution is completely
> disabled.
>
> Now my question is: Is that a general (known) problem which cannot be
> avoided? Does it have anything to do with the memory mapping functions?
> How can it be optimized? I know that, while osgPPU uses older CUDA
> memory mapping functions, there are new ones as of CUDA 3. Is there a
> difference in performance?
>
> Any information on this is appreciated, because it will really help me
> to decide whether I should integrate buffer mapping or just keep the 
> copying stuff going :-)
>
> Best Regards
> -Thorsten
> ___
> osg-users mailing list
> osg-users@lists.openscenegraph.org
> http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org
>

-- 
This message is subject to the CSIR's copyright terms and conditions, e-mail 
legal notice, and implemented Open Document Format (ODF) standard. 

The full disclaimer details can be found at 
http://www.csir.co.za/disclaimer.html.


___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org





[osg-users] OSG website down ?

2011-01-08 Thread Harash Sharma
Hi all,

I was trying to access the OSG website and I am getting the following message:

Traceback (most recent call last):
  File "/usr/lib/python2.5/site-packages/trac/web/api.py", line 339, in send_error
    'text/html')
  File "/usr/lib/python2.5/site-packages/trac/web/chrome.py", line 684, in render_template
    data = self.populate_data(req, data)
  File "/usr/lib/python2.5/site-packages/trac/web/chrome.py", line 592, in populate_data
    d['chrome'].update(req.chrome)
  File "/usr/lib/python2.5/site-packages/trac/web/api.py", line 168, in __getattr__
    value = self.callbacks[name](self)
  File "/usr/lib/python2.5/site-packages/trac/web/chrome.py", line 460, in prepare_request
    for category, name, text in contributor.get_navigation_items(req):
  File "/usr/lib/python2.5/site-packages/trac/versioncontrol/web_ui/browser.py", line 295, in get_navigation_items
    if 'BROWSER_VIEW' in req.perm:
  File "/usr/lib/python2.5/site-packages/trac/perm.py", line 523, in has_permission
    return self._has_permission(action, resource)
  File "/usr/lib/python2.5/site-packages/trac/perm.py", line 537, in _has_permission
    check_permission(action, perm.username, resource, perm)
  File "/usr/lib/python2.5/site-packages/trac/perm.py", line 424, in check_permission
    perm)
  File "/usr/lib/python2.5/site-packages/trac/perm.py", line 282, in check_permission
    get_user_permissions(username)
  File "/usr/lib/python2.5/site-packages/trac/perm.py", line 357, in get_user_permissions
    for perm in self.store.get_user_permissions(username):
  File "/usr/lib/python2.5/site-packages/trac/perm.py", line 175, in get_user_permissions
    cursor.execute("SELECT username,action FROM permission")
  File "/usr/lib/python2.5/site-packages/trac/db/util.py", line 51, in execute
    return self.cursor.execute(sql)
  File "/usr/lib/python2.5/site-packages/trac/db/sqlite_backend.py", line 58, in execute
    args or [])
  File "/usr/lib/python2.5/site-packages/trac/db/sqlite_backend.py", line 50, in _rollback_on_error
    return function(self, *args, **kwargs)
OperationalError: database is locked

By any chance, is the website down? 

Regards
Harash


  ___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org


Re: [osg-users] [osgPPU] dynamic of the pipeline

2010-04-08 Thread Harash Sharma
Hi Allen,

One reasonably simple technique would be to handle this in the shader itself: 
keep a flag (uniform) in the shader. If the flag is set, perform the convolution 
to compute the output pixel color; otherwise simply copy the output pixel color 
from the incoming texture pixel (a small sketch of this follows below).

Another way, close to the one you are using: keep a UnitInOut parallel to your 
blurring channel. In the shader for this unit, just pass the input through to 
the output. Connect the output of both to the UnitOut and enable the node mask 
of the preferred channel as required. 
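
For reference, a minimal sketch of the flag-in-the-shader option, written in the
same inline-string style as the shader in Allen's quoted message below. The
osgppu_ViewportWidth/Height uniforms are the ones osgPPU already supplies to its
units, and the 3x3 box blur only stands in for whatever convolution is actually
used:

const char* toggleBlurSource =
    "uniform sampler2D textureNameInShader;\n"
    "uniform int doBlur;\n"
    "uniform float osgppu_ViewportWidth;\n"
    "uniform float osgppu_ViewportHeight;\n"
    "void main()\n"
    "{\n"
    "  vec2 uv = gl_TexCoord[0].st;\n"
    "  if (doBlur == 0)\n"
    "  {\n"
    "    gl_FragColor = texture2D(textureNameInShader, uv);\n"
    "    return;\n"
    "  }\n"
    "  vec2 texel = vec2(1.0/osgppu_ViewportWidth, 1.0/osgppu_ViewportHeight);\n"
    "  vec4 sum = vec4(0.0);\n"
    "  for (int i = -1; i <= 1; i++)\n"
    "    for (int j = -1; j <= 1; j++)\n"
    "      sum += texture2D(textureNameInShader, uv + vec2(float(i), float(j)) * texel);\n"
    "  gl_FragColor = sum / 9.0;\n"
    "}";

// toggled from the application whenever the user switches modes:
shaderAttribute->add("doBlur", osg::Uniform::INT);
shaderAttribute->set("doBlur", 1);   // 1 = blur on, 0 = pass-through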

Regards 

Harash




From: Allen Saucier 
To: osg-users@lists.openscenegraph.org
Sent: Fri, April 9, 2010 12:46:43 AM
Subject: [osg-users] [osgPPU] dynamic of the pipeline

Hi,

does anyone know if it is possible to setup a pipeline such that I may turn on 
and turn off a blurring effect during runtime?

For example:

camera->setViewport(new osg::Viewport(0,0,windowWidth,windowHeight));
camera->attach(osg::Camera::COLOR_BUFFER0, textureView);
camera->setRenderTargetImplementation(renderImplementation);

// setup osgPPU pipeline processor, which will use the main camera
osg::ref_ptr<osgPPU::Processor> processor = new osgPPU::Processor();
p_ppuProcessor = processor;
processor->setCamera(camera);

// setup unit which will bring the output of the MAIN camera,
//  w/in the Processor unit into the pipeline
osgPPU::UnitCameraAttachmentBypass* ucaByPass = new 
osgPPU::UnitCameraAttachmentBypass();
ucaByPass->setBufferComponent(osg::Camera::COLOR_BUFFER0);
ucaByPass->setName("mainCamOutputTexUCAB");
processor->addChild(ucaByPass);

// bypass With Blurring
//
osgPPU::UnitBypass* bypassWithBlurring = new osgPPU::UnitBypass();
bypassWithBlurring->setName("BypassWBlurring");
ucaByPass->addChild(bypassWithBlurring);

// bypass withOUT blurring
osgPPU::UnitBypass* bypassWOBlurring = new osgPPU::UnitBypass();
bypassWOBlurring->setName("BypassWOBlurring");
ucaByPass->addChild(bypassWOBlurring);


blurring ppu code...


   osgPPU::UnitOut* unitOut2= new osgPPU::UnitOut(); 
   osgPPU::ShaderAttribute* shaderAttribute= new osgPPU::ShaderAttribute(); 
   { 
  osg::Shader* shader= new osg::Shader(osg::Shader::FRAGMENT); 
  const char* shaderSource= 
 "uniform sampler2D textureNameInShader;\n" 
 "void main()\n" 
 "{\n" 
 "  gl_FragColor=texture2D(textureNameInShader,gl_TexCoord[0].st);\n" 
 "}"; 
  shader->setShaderSource(shaderSource); 
  shaderAttribute->addShader(shader); 
  shaderAttribute->setName("nomShaderAttribute"); 
  shaderAttribute->add("textureNameInShader", osg::Uniform::SAMPLER_2D); 
  shaderAttribute->set("textureNameInShader", 0); 

  unitOut2->setName("finalOutputUnit"); 
  unitOut2->setViewport(new osg::Viewport(0,0, windowWidth, windowHeight) );
  unitOut2->getOrCreateStateSet()->setAttributeAndModes(shaderAttribute); 
   } 


   //exp 1 for blurring whole scene;
   bypassWithBlurring->addChild(blurx);
   blurx->addChild(blury);
   blury->addChild(unitOut2);

   bypassWOBlurring->addChild(unitOut2);


How may I turn on and off my blur effect during runtime?  I have tried using 
setActive but unfortunately, it only appears to work - within my pipeline - 
when applied to the last ppu, blury, and I get a black screen.  Using 
removeChild() in any situation causes a runtime crash.

I am not sure if removing a unit from processor is the right path to go either, 
because my units are children of other units and that would force me to 
re-construct my whole pipeline dynamically during runtime.  I can do that but 
it is more work.

Any help would be appreciated. 


Thank you!

Cheers,
Allen

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=26595#26595





___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org





Re: [osg-users] bluring model edges

2010-03-17 Thread Harash Sharma
Hi Allen,

    Yes, this principle does work in real time. I do a more intense computation 
and am still able to get 60+ fps. For this you need to use osgPPU in your 
project. This library allows you to define your post-processing pipeline. Mr. 
Art Tevs has provided some very good examples with the library. 

    Your processing pipeline consists of a bypass unit (to bring the rendered 
frame into the pipeline), followed by an arrangement of I/O units whose 
processing can be defined by means of shader programs (in your case the filters 
will have to be implemented as shaders), and finally an Out unit to display the 
result. 

    I suggest you download and go through the osgPPU examples (especially 
osgPPUGlow), as that one works more or less the way you require.


Regards

Harash. 





From: Allen Saucier 
To: osg-users@lists.openscenegraph.org
Sent: Wed, March 17, 2010 8:46:50 PM
Subject: Re: [osg-users] bluring model edges

Hi Harash.  Well, I truly do not know what you're talking about but I kinda 
understand in principle.  Would your methodology work in a real time simulation 
- ie can this methodology be performed in milliseconds?

2nd question: where would I start?
I've no idea how to render a scene to a texture let alone where to even find a 
high pass filter.  Would you explain how to do this in osg?

Thanks Harash. :)


Harash Sharma wrote:
> Hi Julia, Allen,
> 
> 
> I have one solution that most probably is what you are looking for. I have
> been using OSG for doing some image processing, thanks to Mr. Art Tevs'
> osgPPU.
> 
> Render the scene to a texture T1, high-pass filter this image using a shader
> to extract the edges (store to T2), then Gaussian-filter T2. Overlay this
> processed image over the original with a weight factor:
> 
> Scene --> T1 --> [HighPass] --> T2 --> [Gaussian(LowPass)] --> T3 --+
>             |                                                       |
>             +-------------------------------------------------------+--> T4 --> Output to framebuffer
> 
> 
> Regards
> 
> Harash
> 
> From: Allen Saucier <>
> To: 
> Sent: Tue, March 16, 2010 8:27:51 PM
> Subject: Re:  bluring model edges
> 
> Hi,
> 
> I am in need of this type of feature, too. I need to be able to take a model 
> and "fuzz" its edges, though I must admit that I don't fully understand the 
> code given, but I'll try this.
> 
> Thanks Guy for the ideas.
> 
> Cheers,
> Allen
> 
> --
> Read this topic online here:
> http://forum.openscenegraph.org/viewtopic.php?p=25726#25726
> 
> 
> 
> 
> 
> ___
> osg-users mailing list
>  ()
> http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org
> 
>  --
> Post generated by Mail2Forum


--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=25761#25761





___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org





Re: [osg-users] bluring model edges

2010-03-16 Thread Harash Sharma
Hi Julia, Allen,
 
    I have one solution that most probably is what you are looking for. I have 
been using OSG for doing some image processing, thanks to Mr. Art Tevs' osgPPU.
 
    Render the scene to a texture T1, high-pass filter this image using a shader 
to extract the edges (store to T2), then Gaussian-filter T2. Overlay this 
processed image over the original with a weight factor:
 
    Scene --> T1 --> [HighPass] --> T2 --> [Gaussian(LowPass)] --> T3 --+
                |                                                       |
                +-------------------------------------------------------+--> T4 --> Output to framebuffer
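
For reference, a minimal sketch of how that pipeline shape maps onto osgPPU
units. The processor and the three ShaderAttribute objects (highpassShader,
gaussShader, overlayShader) are placeholders for whatever the application sets
up; this is only one way the diagram above could be wired:

osgPPU::UnitCameraAttachmentBypass* t1 = new osgPPU::UnitCameraAttachmentBypass();
t1->setBufferComponent(osg::Camera::COLOR_BUFFER0);        // T1: the rendered scene
processor->addChild(t1);

osgPPU::UnitInOut* highpass = new osgPPU::UnitInOut();     // T2: extracted edges
highpass->getOrCreateStateSet()->setAttributeAndModes(highpassShader);
t1->addChild(highpass);

osgPPU::UnitInOut* gauss = new osgPPU::UnitInOut();        // T3: blurred edges
gauss->getOrCreateStateSet()->setAttributeAndModes(gaussShader);
highpass->addChild(gauss);

osgPPU::UnitInOut* overlay = new osgPPU::UnitInOut();      // T4: weighted overlay of T1 and T3
overlay->getOrCreateStateSet()->setAttributeAndModes(overlayShader);
t1->addChild(overlay);                                     // input 0: original scene
gauss->addChild(overlay);                                  // input 1: blurred edges

osgPPU::UnitOut* out = new osgPPU::UnitOut();              // show the result
overlay->addChild(out);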


Regards

Harash



From: Allen Saucier 
To: osg-users@lists.openscenegraph.org
Sent: Tue, March 16, 2010 8:27:51 PM
Subject: Re: [osg-users] bluring model edges

Hi,

I am in need of this type of feature, too.  I need to be able to take a model 
and "fuzz" its edges, though I must admit that I don't fully understand the 
code given, but I'll try this.

Thanks Guy for the ideas.

Cheers,
Allen

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=25726#25726





___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org





Re: [osg-users] [osgPPU] Problem Passing Lookup Table to Processing Unit

2009-12-03 Thread Harash Sharma
Hi Mr. J.P. and Mr. Art Tevs,

Thanks for the quick reply. Mr. J.P., the pointers you gave on image reduction 
are very interesting. I am going through them since they will help me in the 
long run, but it will take some time to get through them. 
Below, I am attaching the fragment shader code for the unit that accesses the 
mipmap:

// The input image
uniform sampler2D Image;

// Mean and MeanSquared values
uniform sampler2D MeanStdImage;

vec3 floatToRGB(float Val)
{
float D = Val * 16777215.0;
float R = floor(D / 65536.0);
D = D - R * 65536.0;
float G = floor(D/256.0);
D = D - G*256.0;
float B = floor(D);
return vec3(R/255.0, G/255.0, B/255.0);
}

void main (void)
{
// get the image gray value -- C.r = C.g = C.b;
vec4 C = texture2D(Image, gl_TexCoord[0].st);
float Gray = C.r; 

// grab the image mean and std dev;
vec4 Avg = texture2D(MeanStdImage, vec2(0.5,0.5), 100.0);
float M = Avg.r;// Mean 
float M2 = Avg.g;// Mean of squares

// Compute the variance & Std. Dev.
float Var = clamp(M2 - M*M, 0.0, 1.0);
float Std = sqrt(Var);
float Factor = 0.5/Std;// Compute the scale factor
// grab the pixel color
Gray = (Gray - M) * Factor + 0.5;
Gray = clamp(Gray, 0.0, 1.0);
gl_FragColor = vec4(Gray, Gray, Gray, 1.0);
}

I have separately tried converting the floating-point values M, M2, Std and the 
unprocessed Gray into RGB (using floatToRGB) and grabbing them with UnitCapture. 
Importing the result into Matlab revealed that the image has the following 
parameters: Min = 0.09, Max = 0.12, Mean ~ 0.11, std. dev. ~ 0.01. 
However, the mean and std. dev. as computed by the mipmapping are 0.115 and ~0.12 
respectively. So it looks like the mean is more or less correct, but the std. 
dev. is not. 
One doubt that I have is whether placing the mean in the red component and the 
squared mean in the green component is the problem.

Regards

Harash



From: Art Tevs 
To: osg-users@lists.openscenegraph.org
Sent: Thu, December 3, 2009 5:55:24 PM
Subject: Re: [osg-users] [osgPPU] Problem Passing Lookup Table to Processing 
Unit

Hi Harash, J.P.


J.P. Delport wrote:
> 
> I'm not sure your shader is doing a proper mean. For the GPU to do 
> mean/sum you need to do what is called a reduction. Search for reduction 
> on www.gpgpu.org. This is different from doing per pixel operations like 
> changing luminance e.g.
> 


The posted code seems to be similar to the luminance computation in the HDR 
example (from a quick look over the code). Using mipmaps one can compute an 
average / arithmetic mean value; I am not sure a reduction is really needed 
here. Maybe reduction means exactly the same thing. To compute it I would do it 
the way you described: using UnitInMipmapOut to create a mipmap with the average 
value in the highest level.

What do you mean by the values not being the same? In order to get the computed 
value out of the mipmap you have to access the last mipmap level in your shader. 
Take a look into the brightpass/tonemap shader, where the last level with the 
scene luminance is accessed. Also make sure that in the last level you do not 
sample at the (0,0) position but at (0.5,0.5), because otherwise you will get 
interpolated values. It is actually better to disable GL_LINEAR filtering for 
the textures used by the unit.
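
For reference, a minimal sketch of those two points; getOrCreateOutputTexture()
is the accessor mentioned later in these threads, and treating index 0 as the
mipmap unit's output is an assumption:

// switch the mipmap texture to non-interpolating filters so the shader reads
// exact texel values rather than GL_LINEAR blends
osg::Texture* meanStdTex = mipmapUnit->getOrCreateOutputTexture(0);
meanStdTex->setFilter(osg::Texture::MIN_FILTER, osg::Texture::NEAREST_MIPMAP_NEAREST);
meanStdTex->setFilter(osg::Texture::MAG_FILTER, osg::Texture::NEAREST);

// in the consuming shader, sample the centre of the 1x1 level with a large LOD
// bias, exactly as the shader earlier in this thread already does:
//   vec4 Avg = texture2D(MeanStdImage, vec2(0.5, 0.5), 100.0);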

art

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=20803#20803





___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org





Re: [osg-users] [osgPPU] Problem Passing Lookup Table to Processing Unit

2009-12-01 Thread Harash Sharma
Hi All,

    First of all, thanks Mr. Art Tevs for your suggestions and pointers for 
passing lookup tables, and thanks again for such a fantastic library, osgPPU. 
My whole application is now ported to the osgPPU pipeline and it works very well. 

    The only worry that remains is the linear contrast enhancement. I had posed 
this problem earlier, but in a different way. Motivated by Mr. Art's luminance 
shader, I wanted to implement the following algorithm:

1. Use a UnitInMipmapOut to compute the mean (M) of the image gray values and 
the mean of the squared gray values.
2. Read these values in the child unit's shader, compute the image std. 
deviation (S), and from the image gray value (G) compute the new gray level as 
(G-M) * (0.5/S) + 0.5.
3. However, this doesn't give any contrast improvement.
4. I have tried to capture the mean and std. dev. by converting them to RGB and 
storing them using UnitCapture.
5. The rendered image statistics and the GPU-computed statistics (std. dev.) 
don't match. 

I have used the following fragment shader, attached to the UnitInMipmapOut, to 
compute the mean and the mean of the squared image:

// inputs provided by osgPPU (declared here so the shader is self-contained)
uniform sampler2D texUnit0;
uniform float osgppu_ViewportWidth;
uniform float osgppu_ViewportHeight;
uniform float osgppu_MipmapLevel;

void main(void)
{
    // get texture sizes of the previous level
    vec2 size = vec2(osgppu_ViewportWidth, osgppu_ViewportHeight) * 2.0;
    // this is our starting sampling coordinate
    vec2 iCoord = gl_TexCoord[0].st;
    // this represents the step size to sample the pixels from the previous mipmap level
    vec2 texel = vec2(1.0, 1.0) / (size);
    vec2 halftexel = vec2(0.5, 0.5) / size;
    // create offsets for the texel sampling (TODO check why -1 seems to be correct)
    vec2 st[4];
    st[0] = iCoord - halftexel + vec2(0,0);
    st[1] = iCoord - halftexel + vec2(texel.x,0);
    st[2] = iCoord - halftexel + vec2(0,texel.y);
    st[3] = iCoord - halftexel + vec2(texel.x,texel.y);
    // retrieve 4 texels from the previous mipmap level
    float Avg = 0.0;
    float Avg2 = 0.0;
    for (int i=0; i < 4; i++)
    {
        // map texel coordinates, such that they stay in the defined space
        st[i] = clamp(st[i], vec2(0,0), vec2(1,1));
        // get texel from the previous mipmap level
        vec4 V = texture2D(texUnit0, st[i], osgppu_MipmapLevel - 1.0);

        Avg = Avg + V.r;
        // for the first mipmap level, just compute the squared
        // gray values from the red component
        if (abs(osgppu_MipmapLevel - 1.0) < 0.1)
            Avg2 = Avg2 + V.r*V.r;
        // for the rest of the levels, simply add the green component
        else
            Avg2 = Avg2 + V.g;
    }
    Avg *= 0.25;
    Avg2 *= 0.25;
    // place the computed mean value in the R component
    // place the computed mean-of-squares value in the G component
    gl_FragData[0].rgba = vec4(Avg, Avg2, 0.0, 1.0);
}
 
    Am I making some conceptual mistake in the calculation of the std. 
deviation? Please suggest. Thanks in advance.
 
regards
 
Harash
From: Art Tevs 
To: osg-users@lists.openscenegraph.org
Sent: Tue, November 17, 2009 8:03:17 PM
Subject: Re: [osg-users] [osgPPU] Problem Passing Lookup Table to Processing Unit

Hi Harash,

First take a look into the documentation of osgPPU. osgPPU is very well 
documented; look here (http://projects.tevs.eu/osgppu and 
http://www.tevs.eu/doc/osgPPU/namespaceosgPPU.html).
If this does not help you, then take a look into the examples. Every example 
has tons of comments to help new users understand how certain things work.



Harash Sharma wrote:
> 
> ...However when I attach the shader to the processing unit, the results are 
> unexpected and I see the openGL warning. 
> 
> Do I have to use UnitTexture to pass the lookup table? I am totally stuck in 
> dark. Please help.
> 


Attaching a shader to a unit does work fine. You can see this behaviour in 
every example, so it might be that you are doing something wrong in the setup 
of your shader. 

If you want to have a texture as input to a unit, then you have to use 
UnitTexture. You just put a UnitTexture under the processor and call 
UnitTexture::setTexture(_myTexture) with your texture. Then any unit which is a 
child of that UnitTexture will have this texture as input. 
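
For reference, a minimal sketch of that setup; lookupTexture, processor and
myProcessingUnit are placeholders for what the application already has:

osgPPU::UnitTexture* lutUnit = new osgPPU::UnitTexture();
lutUnit->setTexture(lookupTexture);        // the GL_RGB lookup table
processor->addChild(lutUnit);

// any unit that also has lutUnit as a parent gets the LUT as an additional input
lutUnit->addChild(myProcessingUnit);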

Regards,
art

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=19734#19734





___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org






[osg-users] [osgPPU] Problem Passing Lookup Table to Processing Unit

2009-11-17 Thread Harash Sharma
Dear Mr. Art Tevs,

        I am trying to pass a GL_RGB texture to one of the processing units 
as a lookup table. I attach a shader to the processing unit to compute the 
final color. The scheme works when I attach the shader to the scenegraph and 
access the lookup texture therein. However, when I attach the shader to the 
processing unit, the results are unexpected and I see an OpenGL warning. 

        Do I have to use UnitTexture to pass the lookup table? I am totally 
stuck in the dark. Please help.

        Thanks

Regards

Harash 



  ___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org


[osg-users] [osgPPU] Problem Passing Lookup Table to Shader

2009-11-16 Thread Harash Sharma
Hi all,

 I am reasonably new to shaders. I intend to implement a floating-point 
lookup table. The aim is to write a fragment shader that takes an incoming RGB 
color, looks up a floating-point value from the table and writes it to the 
framebuffer pixel. I tried using a GL_FLOAT type 1-D LUMINANCE texture and 
accessing it through a sampler, followed by a GL_FLOAT type 1-D RGB texture, 
but both of them produce a blank screen. 
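
For reference, a minimal sketch of the float-texture route, assuming a
256-entry table held in a 256x1 2D texture; the GL_LUMINANCE32F_ARB internal
format (ARB_texture_float) is usually what keeps the values from being clamped
to 8 bits, and "table" and "stateSet" are placeholders:

osg::ref_ptr<osg::Image> img = new osg::Image;
img->allocateImage(256, 1, 1, GL_LUMINANCE, GL_FLOAT);
float* dst = reinterpret_cast<float*>(img->data());
for (int i = 0; i < 256; ++i)
    dst[i] = table[i];                                        // the application's float values

osg::ref_ptr<osg::Texture2D> lut = new osg::Texture2D(img.get());
lut->setInternalFormat(GL_LUMINANCE32F_ARB);
lut->setFilter(osg::Texture::MIN_FILTER, osg::Texture::NEAREST);
lut->setFilter(osg::Texture::MAG_FILTER, osg::Texture::NEAREST);

stateSet->setTextureAttributeAndModes(1, lut.get());
stateSet->addUniform(new osg::Uniform("lut", 1));             // "uniform sampler2D lut;" in the shader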

    Thereafter, I tried declaring a varying float array in the vertex shader, 
computing it there and using the values in the fragment shader. But now I get 
the following error:
    ** error C5041: cannot locate suitable resource to bind parameter "" **

    Please help.

Regards

Harash


  ___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org


Re: [osg-users] [osgPPU] Please test current SVN (release v0.5.6)

2009-10-25 Thread Harash Sharma
Dear Mr. Art Tevs,

    A belated thanks for your earlier help. I have now downloaded the svn trunk 
for testing. The osg version is osg-2.9.5, the compiler Visual Studio 2005, the 
OS Windows 7 RC1. I am encountering some compilation errors. I wanted to delay 
the migration to osg-2.9.6 for a few days due to some official presentations, 
so if these errors have something to do with osg version differences, please 
ignore them; I will update and retry. The errors are:

3>Unit.cpp
3>.\Unit.cpp(509) : warning C4018: '<' : signed/unsigned mismatch
3>.\Unit.cpp(647) : error C2039: 'getOrCreateGLBufferObject' : is not a member 
of 'osg::PixelDataBufferObject'
3> D:\ExternalAPIs\OpenSceneGraph\OpenSceneGraph\include\osg/BufferObject(383) 
: see declaration of 'osg::PixelDataBufferObject'
3>.\Unit.cpp(647) : error C2227: left of '->isDirty' must point to 
class/struct/union/generic type
3>.\Unit.cpp(649) : error C2039: 'getOrCreateGLBufferObject' : is not a member 
of 'osg::PixelDataBufferObject'
3> D:\ExternalAPIs\OpenSceneGraph\OpenSceneGraph\include\osg/BufferObject(383) 
: see declaration of 'osg::PixelDataBufferObject'
3>.\Unit.cpp(649) : error C2227: left of '->isDirty' must point to 
class/struct/union/generic type
3>Generating Code...
3>Build log was saved at 
"file://d:\ExternalAPIs\OpenSceneGraph\osgPPU\src\osgPPU\osgPPU.dir\Debug\BuildLog.htm"
3>osgPPU - 4 error(s), 7 warning(s)
Regards,
 
Harash





From: Art Tevs 
To: osg-users@lists.openscenegraph.org
Sent: Sat, October 24, 2009 1:11:40 AM
Subject: [osg-users] [osgPPU] Please test current SVN (release v0.5.6)

Hi folks,

In the last couple of days I have made big changes to the code of osgPPU. I 
would like to ask you to test the current svn before I go for an official 
release (which will happen as soon as the new stable OSG version 2.10 is 
released). The current code is built against osg v2.9.6. 

I've added a new unit, UnitBypassRepeat, which provides the possibility of 
iterative rendering of a subgraph. A new example, osgppu_diffusion, shows a 
simple mean curvature flow diffusion filter. It is implemented as a PDE 
(partial differential equation) and runs for several iterations. The number of 
iterations can be changed at runtime. Changing the filter to a sharpening 
filter or something else is very easy.

I've also made changes to the way the unit graph is traversed and gathered by 
the CullVisitor. It is now more conservative; lazy placement of units in the 
graph will be punished ;) The advantage is that every unit is executed only 
after its parents have already been executed. In previous releases this was not 
guaranteed for very complex unit graphs.

MRTs (multiple render targets) are also now supported in a very easy way. Take 
a look into the diffusion example, where I am using up to 3 MRTs to compute the 
derivatives.

Please let me know of any bugs if you will find them.

Thank you all in advance.
Art

P.S. For those who are not familiar with osgPPU, go here:
http://projects.tevs.eu/osgPPU/

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=18639#18639





___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org




Re: [osg-users] [osgPPU] osgPPU for image processing

2009-09-29 Thread Harash Sharma
Dear Mr. Art Tevs,

 I have been able to sort out the problem I posed yesterday, i.e., making 
the texture image available for analysis / parameter determination. Similar to 
your osgPPU::UnitOutCapture, a class called UnitTextureTap is derived from 
osgPPU::UnitOut. It overrides noticeFinishRendering, where I store a copy of 
the last rendered texture into an image and provide an interface to access this 
image. I have tested it and it seems to work fine for my application. In case 
this class is useful to anyone else, I would be more than happy to contribute it.
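
For reference, a minimal sketch of such a class; the real implementation is the
poster's own, and the noticeFinishRendering(osg::RenderInfo&, const
osg::Drawable*) hook and getInputTexture() accessor are assumed to be the same
ones osgPPU's UnitOutCapture-style units use:

class UnitTextureTap : public osgPPU::UnitOut
{
public:
    UnitTextureTap() : osgPPU::UnitOut(), _image(new osg::Image) {}

    // the application reads the last captured frame through this
    const osg::Image* getImage() const { return _image.get(); }

    virtual void noticeFinishRendering(osg::RenderInfo& renderInfo, const osg::Drawable* drawable)
    {
        osgPPU::UnitOut::noticeFinishRendering(renderInfo, drawable);

        // bind this unit's input texture and copy it back into the image
        unsigned int contextID = renderInfo.getState()->getContextID();
        osg::Texture* tex = getInputTexture(0);
        if (tex)
        {
            tex->apply(*renderInfo.getState());
            _image->readImageFromCurrentTexture(contextID, false, GL_FLOAT);
        }
    }

protected:
    osg::ref_ptr<osg::Image> _image;
};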

    Thanks for the kind support.

Regards


Harash.  





From: Art Tevs 
To: osg-users@lists.openscenegraph.org
Sent: Monday, September 28, 2009 3:04:19 PM
Subject: Re: [osg-users] [osgPPU] osgPPU for image processing

Hello Harash,


Harash Sharma wrote:
> 
> I have been able to incorporate osgPPU into my application. Image filtering 
> and color space conversions are working fine for a desired resolution that I 
> set at the beginning of the osgPPU Pipeline. 
> 

Nice to hear that the library is also used not only for rendering, but also for 
some online/offline computations/processing. 


> 
> 1. We require to carry out filtering at a higher resolution followed by sub 
> sampling. Can you suggest a method to sub-sample the image to a lower 
> resolution.
> 

Huh, I suppose there are multiple works in the field of image processing that 
deal with sub-sampling. The straightforward implementation, which is enabled by 
default in osgPPU, uses GL_LINEAR for filtering. Although it is a very simple 
approach and is supported by the hardware, so it is very fast, it suffers from 
not handling high-frequency details very well. It may even create some kind of 
aliasing artifacts when subsampling high-frequency data. Using a Gaussian 
kernel, one can produce smoother subsampled images.
If you haven't done it yet, then take a look into the lecture slides about 
image resampling from Princeton University here:
http://www.cs.princeton.edu/courses/archive/fall99/cs426/lectures/sampling/index.htm


> 
> 2. We need to stretch the contrast of the image. Earlier we were doing this 
> on CPU. We build a histogram of the image, identify the lower and higher gray 
> levels, followed by the pixel by pixel transformation of gray levels.
> 

Computing any kind of histogram on a GPU is not efficient, because this 
operation cannot be parallelized. However, if you just want to find the minimum 
and maximum value for your contrast, then I would propose using custom-built 
mipmaps for that. You first transform your image into a contrast 
representation, computing the contrast for every pixel. Then you do the same 
thing as in the HDR example of osgPPU, where you build the mipmap of your image 
up to the 1x1 level. The mipmap will consist of two channels, the minimum and 
the maximum value. 
At the end you read out the last level and use those two values to transform 
each pixel of the original image correspondingly. This is almost the same thing 
as in the HDR example, so take a look there.

I hope, I was able to help.

regards,
art

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=17681#17681





___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org





Re: [osg-users] [osgPPU] osgPPU for image processing

2009-09-29 Thread Harash Sharma
Dear Mr. Art Tevs,

 Thanks for the informative reply. I was going through your example code and 
came across the "UnitInResampleOut" class. This happens to be exactly what I 
wanted. I can attach it to a unit that does all the prefiltering / 
high-resolution processing and resample the image to a lower resolution. I 
have incorporated this unit and it is working just fine. 
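
For reference, a minimal sketch of how that unit is typically wired up in the
osgPPU examples (the 0.5 factors and highResUnit are only illustrative):

osgPPU::UnitInResampleOut* resample = new osgPPU::UnitInResampleOut();
resample->setFactorX(0.5);        // half the width
resample->setFactorY(0.5);        // half the height
highResUnit->addChild(resample);  // placed after the high-resolution processing unit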
 
 Now I have another doubt. I want to access the texture attached to any 
unit, do a pixel-by-pixel analysis and use the results in a shader while 
rendering the next frame. I have used the getOrCreateOutputTexture() function 
for the purpose. The problem is that even though I am able to access the 
texture, the attached image is null, so I am unable to do a pixel-wise 
computation. Can you please suggest a way out? I am planning to use this 
approach for methods like histogram analysis of frame N, using the results in 
frame N+1.
 
Regards
 
Harash





From: Art Tevs 
To: osg-users@lists.openscenegraph.org
Sent: Monday, September 28, 2009 7:34:19 PM
Subject: Re: [osg-users] [osgPPU] osgPPU for image processing

Hello Harash,


Harash Sharma wrote:
> 
> I have been able to incorporate osgPPU into my application. Image filtering 
> and color space conversions are working fine for a desired resolution that I 
> set at the beginning of the osgPPU Pipeline. 
> 

Nice to hear that the library is also used not only for rendering, but also for 
some online/offline computations/processing. 


> 
> 1. We require to carry out filtering at a higher resolution followed by sub 
> sampling. Can you suggest a method to sub-sample the image to a lower 
> resolution.
> 

Huh, I suppose there are multiple works in the field of image processing that 
deal with sub-sampling. The straightforward implementation, which is enabled by 
default in osgPPU, uses GL_LINEAR for filtering. Although it is a very simple 
approach and is supported by the hardware, so it is very fast, it suffers from 
not handling high-frequency details very well. It may even create some kind of 
aliasing artifacts when subsampling high-frequency data. Using a Gaussian 
kernel, one can produce smoother subsampled images.
If you haven't done it yet, then take a look into the lecture slides about 
image resampling from Princeton University here:
http://www.cs.princeton.edu/courses/archive/fall99/cs426/lectures/sampling/index.htm


> 
> 2. We need to stretch the contrast of the image. Earlier we were doing this 
> on CPU. We build a histogram of the image, identify the lower and higher gray 
> levels, followed by the pixel by pixel transformation of gray levels.
> 

Computing any kind of histogram on a GPU is not efficient, because this 
operation cannot be parallelized. However, if you just want to find the minimum 
and maximum value for your contrast, then I would propose using custom-built 
mipmaps for that. You first transform your image into a contrast 
representation, computing the contrast for every pixel. Then you do the same 
thing as in the HDR example of osgPPU, where you build the mipmap of your image 
up to the 1x1 level. The mipmap will consist of two channels, the minimum and 
the maximum value. 
At the end you read out the last level and use those two values to transform 
each pixel of the original image correspondingly. This is almost the same thing 
as in the HDR example, so take a look there.

I hope, I was able to help.

regards,
art

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=17681#17681





___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org





[osg-users] osgPPU for image processing

2009-09-27 Thread Harash Sharma
Dear Mr. Art Tevs,

    I have been able to incorporate osgPPU into my application. Image filtering 
and color space conversions are working fine for the resolution that I set at 
the beginning of the osgPPU pipeline. However, I am facing a few problems on 
which I would like your kind suggestions:

1. We need to carry out filtering at a higher resolution followed by 
sub-sampling. Can you suggest a method to sub-sample the image to a lower 
resolution? 
2. We need to stretch the contrast of the image. Earlier we were doing this on 
the CPU: we build a histogram of the image, identify the lower and higher gray 
levels, and then do a pixel-by-pixel transformation of the gray levels.

    It would be very kind of you if you could suggest appropriate methods, or 
else indicate suitable pointers.

Regards

Harash


  ___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org


Re: [osg-users] offscreen rendering

2009-08-10 Thread Harash Sharma
Hi Rabbi,

This is my first response and I myself am a newbie, so the more experienced may 
kindly correct me.
You have not set a camera manipulator for the viewer. Calling viewer.run() 
automatically creates a default trackball manipulator object and sets it as the 
manipulator; the same does not happen in your code. Just try setting an 
appropriate manipulator.
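
For reference, a minimal sketch of that suggestion (essentially what
viewer.run() would otherwise set up implicitly):

viewer.setCameraManipulator(new osgGA::TrackballManipulator());
viewer.realize();
while (!viewer.done())
    viewer.frame();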

Bye 

Harash




From: Rabbi Robinson 
To: osg-users@lists.openscenegraph.org
Sent: Monday, August 10, 2009 11:55:24 PM
Subject: Re: [osg-users] offscreen rendering

Hi,

Here is what I got so far.

osgViewer::Viewer viewer;
//graphics context
osg::ref_ptr<osg::GraphicsContext::Traits> traits = new osg::GraphicsContext::Traits;
traits->x = 0;
traits->y = 0;
traits->width = image->s();
traits->height = image->t();
traits->red = 8;
traits->green = 8;
traits->blue = 8;
traits->alpha = 8;
traits->depth = 24;
traits->windowDecoration = false;
traits->pbuffer = true;
traits->doubleBuffer = true;
traits->sharedContext = 0;

osg::ref_ptr<osg::GraphicsContext> pbuffer = 
osg::GraphicsContext::createGraphicsContext(traits.get());
if (pbuffer.valid())
{
osg::ref_ptr<osg::Camera> camera = viewer.getCamera();
camera->setGraphicsContext(pbuffer.get());
camera->setViewport(new osg::Viewport(0,0,image->s(),image->t()));
camera->setClearColor(osg::Vec4(0.0f,0.0f,0.0f,0.0f));
camera->setDrawBuffer(GL_BACK);
camera->setReadBuffer(GL_BACK);
viewer.realize();
}
else
{
std::cout << "Error configuring pbuffer\n";
exit(0);
}

//draw
viewer.setSceneData(geode);
viewer.frame();

It crashes at viewer.frame(), complaining some resource already in use.



Thank you!

Cheers,
Rabbi

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=16051#16051





___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org





[osg-users] Accessing floating point textures in Fragment Shader

2008-10-17 Thread Harash Sharma
Hi all,
 
    I am facing a problem accessing a floating-point texture within a fragment 
shader. I have a 2D floating-point array that I wish to access within the 
fragment shader. For this purpose, 
1. I have created an image with the pixel format GL_LUMINANCE and data type 
GL_FLOAT. 
2. The array data is copied to the image data array.
3. The image is bound to a texture and this texture is passed as a uniform to 
the fragment shader.
 
The output is a totally white screen.
 
However, if I change the pixel format to GL_RGBA and the data type to 
GL_UNSIGNED_BYTE, convert the values to integers and split the 4 bytes of each 
unsigned int into R, G, B, A, I am able to reconstruct the values within the 
shader and process the fragment accordingly. 
 
I am using an nVidia Quadro FX 4500 graphics card on an Intel Xeon 5160 
processor. 
 
Please Guide
 
Regards
 
Harash.


  ___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org


Re: [osg-users] performance issues with RTT

2008-09-08 Thread Harash Sharma
Hi, 
 
    I am rendering to Texture and accessing the image to do some preprocessing. 
So the sequence of operations is -- 
1. Render the scene to an image.
2. Preprocess this image
3. Use a textured quad to display the resulting image
 
-- Even if we remove the preprocessing part completely, no speedup is achieved.
 
Regards
 
Harash

- Original Message 
From: Steffen Kim <[EMAIL PROTECTED]>
To: osg-users@lists.openscenegraph.org
Sent: Sunday, September 7, 2008 12:57:03 PM
Subject: Re: [osg-users] performance issues with RTT

Good morning,

I'm just rendering to texture, no images or pixel reading involved.

Regards,
Steffen


___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org





Re: [osg-users] performance issues with RTT

2008-09-05 Thread Harash Sharma
Hi Steffen, Guay and Robert,

   In the same regard I would like to mention something. We have also developed 
an application that uses multiple RTT cameras. However only one camera is 
active at a time ( the rest are masked using node mask). All cameras differ 
only in the resolution of the texture, they are viewing exactly the same scene 
and have identical fields of view. We too have found that the frame rate has a 
lot of dependence on the texture size. A 256x256 texture gives a fairly high 
frame rate (30+ frames/second), 512x512 slows it down considerably (to around 8 
frames/second) and 1024x1024 slows it even further (~1 fps). All these 
figures are in release mode build.

My machine specifications
Processor:     Xeon 5160
RAM:            4 GB Fully Buffered DDR2
Graphics:    nVidia Quadro FX4500 (512MB)

    I would be highly obliged if a way out can be suggested.

Regards

Harash  


- Original Message 
From: Steffen Kim <[EMAIL PROTECTED]>
To: osg-users@lists.openscenegraph.org
Sent: Friday, September 5, 2008 10:04:25 PM
Subject: Re: [osg-users] performance issues with RTT

Thanks for the hints so far...

I gave you wrong information since I forgot that I take care of the NPOT-stuff 
myself.
The API I use later on needs POT-textures so I make sure that only POT-textures 
are rendered. So in fact my textures are normally 2048x1024.

I know that the performance issues are most certainly a problem of the texture 
rendering, since we have parts of the application where even larger numbers of 
(normal render-to-view) cameras are involved without slowing everything down as 
much. The scene is pretty simple too and runs smoothly without the RTT cameras.
Back on Monday I will try to get some FPS-values for exactly the same 
camera-setup with and without RTT for different scene complexities (here in 
Germany the weekend begins now ;)).

What I just found out is that the texture sizes make a huge difference in 
rendering performance. With 256x256-textures my cameras hardly slow down the 
viewer at all. That's also evidence that the problem has to be somewhere in 
the texture-creation part. 
If this cannot be enhanced then I probably have to find a good balance between 
quality and speed.
But I still have hopes that there are better ways to improve my performance 
without having to sacrifice too much resolution-wise.

BTW: I'm running my stuff on a Xeon 5160 at 3GHz and 2GB RAM with a Radeon 
X1900 card.

Have a nice weekend,
Steffen



___

___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org





[osg-users] Getting a list of all textures (including the ones used by particle systems) in a scenegraph

2008-06-02 Thread Harash Sharma
Hi all,
   I have a requirement to build a list of all textures used in a scene graph. 
For this purpose, I have used a NodeVisitor derived class with the apply 
function given below. 
Problem:  I am not able to get the details of textures used by the attached 
Particle systems. I would be very grateful if someone can help with some hints 
/ pointers.
Regards 
Harash 
Apply Function:
void FindTextureVisitor::apply(osg::Node& searchNode)
{
    osg::Node* Ptr = &searchNode;
    // get the state set for the node
    osg::StateSet* state = Ptr->getStateSet();
    if (state != NULL)
    {
        osg::StateSet::TextureAttributeList Attr = state->getTextureAttributeList();
        int Sz = Attr.size();
        // for all the texture units, find the attributes related to texture
        for (int j = 0; j < Sz; j++)
        {
            osg::ref_ptr<osg::Texture> texture = dynamic_cast<osg::Texture*>(
                state->getTextureAttribute(j, osg::StateAttribute::TEXTURE));
            // if a texture is found
            if (texture.valid())
            {
                // this is a valuable node containing a texture; add it to the node list
                foundNodeList.push_back(&searchNode);
                // also store the Texture object related to this node so we
                // don't have to repeat the call to get the texture attribute list
                foundTextureList.push_back(texture);
                // now check whether the texture has already been used by an earlier node:
                // get the image related to the texture
                osg::ref_ptr<osg::Image> texImage = texture->getImage(0);
                // check if it is already in the list
                std::string filename = texImage->getFileName();
                int TexSz = foundImageList.size();
                bool flag = true;
                for (int k = 0; k < TexSz; k++)
                {
                    if (foundImageList[k]->getFileName() == filename)
                    {
                        flag = false;
                        break;
                    }
                }
                if (flag)
                {
                    // no, it is not in the list; add it
                    foundImageList.push_back(texImage);
                }
            }
        }
    }
    traverse(searchNode);
}


  ___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org


[osg-users] Multiple shaders

2008-03-13 Thread Harash Sharma
Hi all,

I have an application that consists of a terrain, a few other natural / 
man-made features, and vehicles populated on it. I had earlier asked a query 
related to the computation of normals at each pixel. Robert was kind enough to 
suggest the use of shaders for the purpose. The problem was thereafter solved 
by using an RTT camera associated with a shader to compute the normal at each 
pixel. 

   I now want to associate a shader with the terrain to provide some special 
effects. I am sorry if I am asking too naive a question, but can we associate 
two shaders with a fragment? Or otherwise, can anyone suggest how I should 
solve this problem?

   Thanks in advance.


Regards

Harash


  

___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org


Re: [osg-users] Finding Normals at all pixel positions

2008-02-27 Thread Harash Sharma
Hi Robert and Guy.

   Thanks for your valuable advices. I will implement these and come back to 
you.

Regards

Harash



- Original Message 
From: Robert Osfield <[EMAIL PROTECTED]>
To: OpenSceneGraph Users 
Sent: Monday, February 25, 2008 9:02:47 PM
Subject: Re: [osg-users] Finding Normals at all pixel positions

Hi Harash,

Render your scene to a depth texture and then compute the normals from
this in a second pass in the shader.

Robert.
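
For reference, a minimal GLSL sketch of that second pass, assuming the first
pass wrote linearized depth into depthTex and that texelSize and frustumScale
(the tangents of the half fields of view) are supplied by the application; it
rebuilds view-space positions of neighbouring pixels and crosses the
differences:

uniform sampler2D depthTex;
uniform vec2 texelSize;      // 1/width, 1/height
uniform vec2 frustumScale;   // tan(fovx/2), tan(fovy/2)

vec3 viewPos(vec2 uv)
{
    float z = texture2D(depthTex, uv).r;      // linearized depth
    vec2 ndc = uv * 2.0 - 1.0;
    return vec3(ndc * frustumScale * z, -z);
}

void main()
{
    vec3 p  = viewPos(gl_TexCoord[0].st);
    vec3 px = viewPos(gl_TexCoord[0].st + vec2(texelSize.x, 0.0));
    vec3 py = viewPos(gl_TexCoord[0].st + vec2(0.0, texelSize.y));
    vec3 n  = normalize(cross(px - p, py - p));
    gl_FragColor = vec4(n * 0.5 + 0.5, 1.0);  // pack the normal into [0,1]
}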

On Mon, Feb 25, 2008 at 3:20 PM, Harash Sharma <[EMAIL PROTECTED]> wrote:
>
>
>
> Hi All,
>
>
>
>In my OSG application, I want to find the normal vector corresponding to
> each pixel position. It means that after rendering if an image pixel (i,j)
> comes from an object  plane P, I need to find the Normal vector to P. One of
> the ways would be to use LineSegmentIntersector to find the intersecting
> plane for pixel (i,j), given the Line of Sight. But doing this for all
> pixels in the field of view would be a very slow process.
>
>
>
>It would be very kind if anyone could tell if a faster mechanism to do
> this is there in OSG and give a pointer to it. Thanks in advance.
>
>
>
> Regards
>
>
>
> Harash.
>
>
>
>  
> ___
>  osg-users mailing list
>  osg-users@lists.openscenegraph.org
>  http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org
>
>
___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org


  

___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org


[osg-users] Finding Normals at all pixel positions

2008-02-25 Thread Harash Sharma
Hi All,

   In my OSG application, I want to find the normal vector corresponding to 
each pixel position. It means that after rendering if an image pixel (i,j) 
comes from an object  plane P, I need to find the Normal vector to P. One of 
the ways would be to use LineSegmentIntersector to find the intersecting plane 
for pixel (i,j), given the Line of Sight. But doing this for all pixels in the 
field of view would be a very slow process.

   It would be very kind if anyone could tell if a faster mechanism to do this 
is there in OSG and give a pointer to it. Thanks in advance.

Regards

Harash.


  

___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org


Re: [osg-users] osgViewer::Viewer vs osgViewer::CompositeViewer

2008-01-17 Thread Harash Sharma
Hi Jean, 

   I have developed an application that uses a CompositeViewer with two views. 
One of the views is used as a user interface: it displays a few buttons drawn 
using osg::Geometry, etc. The task of this view is to tell the application 
which image processing modules should be applied to the captured image before 
it is displayed. The second view handles the scene and has an associated 
manipulator to move the camera. The scene is rendered to a texture, the 
required image processing modules are applied, and the result is displayed in 
the scene. I think it may be difficult to handle such a case using 
osgViewer::Viewer.
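
For reference, a minimal sketch of that two-view layout (window placement is
illustrative and the commented-out handler is a placeholder for the
application's own UI logic):

osgViewer::CompositeViewer viewer;

osgViewer::View* uiView = new osgViewer::View;
uiView->setUpViewInWindow(0, 0, 300, 600);
uiView->setSceneData(uiRoot.get());                    // the osg::Geometry buttons
// uiView->addEventHandler(new UiHandler);             // app-specific handler (placeholder)
viewer.addView(uiView);

osgViewer::View* sceneView = new osgViewer::View;
sceneView->setUpViewInWindow(320, 0, 800, 600);
sceneView->setSceneData(sceneRoot.get());              // the RTT + image processing scene
sceneView->setCameraManipulator(new osgGA::TrackballManipulator);
viewer.addView(sceneView);

viewer.run();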

Regards


Harash


- Original Message 
From: Jean-Sebastien Guay <[EMAIL PROTECTED]>
To: osg-users@lists.openscenegraph.org
Sent: Thursday, January 17, 2008 12:17:50 AM
Subject: [osg-users] osgViewer::Viewer vs osgViewer::CompositeViewer

Hello,

I was wondering if someone could give me some sample use cases for both
osgViewer::Viewer and osgViewer::CompositeViewer, at a high level.

I understand that Viewer "is a" View whereas CompositeViewer "has a" list of
Views, so in general, if we only have one view, CompositeViewer will only add
unnecessary complexity, is that right?

Say we need a viewer that will be able to view one scene, either in a window or
full screen or across multiple screens. osgViewer::Viewer is sufficient for
that, is that right? And osgViewer::CompositeViewer will add complexity, like
event handlers needing to be assigned to each view, and such, is that right?
But beyond this example, I'd like some more general use cases. That would be a
big help.

Where I work, we are using CompositeViewer, but I think for the wrong reasons.
I'd like to know more so I can see if it's the right decision.

Thanks,

J-S
--
__
Jean-Sebastien Guay[EMAIL PROTECTED]
http://whitestar02.webhop.org/
___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org


  

___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org


Re: [osg-users] Tracking an object from a moving vehicle

2007-10-03 Thread Harash Sharma
Hi Robert,
   
 Thanks Robert. I don't need line-by-line teaching; I only need pointers. 
Thanks for your help. That much should be adequate for me. 
   
  Regards
   
  Harash

Robert Osfield <[EMAIL PROTECTED]> wrote:
  Hi Harash,

Moving the vehicle can be done via osg::AnimationPathCallback attached
to a MatrixTransform.

Tracking the vehicle will require something along the lines of
NodeTrackerManipulator - you'll need to accumulate the world transform
on every frame and compute the view matrix accordingly. How best to
compute this is your problem, I could point you in the right direction
but it's not my role to teach you how to write your application.

Robert.
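
For reference, a minimal sketch of that accumulation, run once per frame,
assuming the camera is driven directly rather than through a manipulator;
observerNode, targetNode and viewer are placeholders:

osg::Matrixd observerWorld = observerNode->getWorldMatrices().front();   // vehicle/observer
osg::Matrixd targetWorld   = targetNode->getWorldMatrices().front();     // object being watched

osg::Vec3d eye    = observerWorld.getTrans();
osg::Vec3d center = targetWorld.getTrans();
osg::Vec3d up(0.0, 0.0, 1.0);                                            // assumes a z-up world

viewer.getCamera()->setViewMatrix(osg::Matrixd::lookAt(eye, center, up));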

On 10/3/07, Harash Sharma wrote:
> Hi Robert,
>
> I am trying to simulate observation of an object from a vehicle. The
> task is as follows:
>
> 1. The observer is placed in a moving vehicle that is following a
> pre-programmed path.
> 2. The observer is constantly looking at a fixed (or moving) object from
> within this vehicle.
>
> How do I achieve this effect?
>
> I have tried using the NodeTrackerManipulator. I have derived a class from
> this. The node to be tracked is placed within this vehicle. Override the
> handle function.
>
> But how to compute the view matrices? And is the procedure proper or is
> there any better mechanism? Thanks in advance.
>
> Regards
>
> Harash
>
> 
>
>
> ___
> osg-users mailing list
> osg-users@lists.openscenegraph.org
> http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org
>
>
___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org


   
___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org


[osg-users] Tracking an object from a moving vehicle

2007-10-03 Thread Harash Sharma
Hi Robert,
   
   I am trying to simulate observation of an object from a vehicle. The 
task is as follows:
   
  1. The observer is placed in a moving vehicle that is following a 
pre-programmed path.
  2. The observer is constantly looking at a fixed (or moving) object from 
within this vehicle.
   
  How do I achieve this effect?
   
  I have tried using the NodeTrackerManipulator: I have derived a class from 
it, placed the node to be tracked within the vehicle, and overridden the 
handle function. 
   
  But how do I compute the view matrices? And is this procedure proper, or is 
there a better mechanism? Thanks in advance.
   
  Regards
   
  Harash

   
___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org


Re: [osg-users] Computing Image Co-ordinates

2007-09-06 Thread Harash Sharma
Hi Robert,
   
   A sincere thanks to you and Zach for your support. I will try putting in 
the modelMatrix tomorrow morning. I think I may have escaped any problems, since 
the node in question is a direct child of the root.  
   
   I am at the completion stage of my OSG-based project, and I am especially 
thankful to you for your support throughout its lifetime. Without that support 
it would have been literally impossible to reach this stage, let alone far 
ahead of schedule. THANKS A LOT.
   
   
  regards
   
  Harash
  

Robert Osfield <[EMAIL PROTECTED]> wrote:
  Hi Harash,

Like Zach I've found it difficult working out what you are trying to do.

W.r.t. converting world coordinates into window coordinates (I presume this
is what you mean by image coordinates), the way to do it is:

Vec3d windowCoord =
    objectCoord * (modelMatrix * viewMatrix * projectionMatrix * windowMatrix);

You won't need to do the division by w yourself, as when using Vec3's
in matrix maths the OSG will automatically do the required division.

The only bit missing in your own example below is the modelMatrix,
which you'll need to compute for the nodes in question. Each node has
a getWorldMatrices() method that you can use to go from local object
coords into world coordinates (this is the modelMatrix in OpenGL
speak). Please note that this method can return a number of matrices,
as would be the case when a node has multiple parents.

Robert.
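
For illustration, a small helper that puts the above recipe together; the
function name worldToWindow is an assumption for this sketch and is not part
of the OSG API:

// Illustrative sketch of the recipe above: project a world-space point into
// the window (image) coordinates of a camera whose viewport matches the RTT
// image size.
#include <osg/Camera>
#include <osg/Matrixd>
#include <osg/Vec3d>
#include <osg/Viewport>

osg::Vec3d worldToWindow(const osg::Camera* camera, const osg::Vec3d& worldPoint)
{
    const osg::Matrixd mvpw =
        camera->getViewMatrix() *
        camera->getProjectionMatrix() *
        camera->getViewport()->computeWindowMatrix();

    // Multiplying a Vec3d by a Matrixd performs the divide by w automatically.
    return worldPoint * mvpw;
}

// If the point is given in a node's local coordinates, bring it into world
// coordinates first via one of the node's world matrices:
//   osg::MatrixList worldMatrices = node->getWorldMatrices();
//   osg::Vec3d worldPoint = localPoint * worldMatrices.front();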


On 9/6/07, Harash Sharma wrote:
> Hi
> Thanks to all. I now know the problem was probably too simplistic for
> anyone to waste their precious time. I was able to figure out the way to
> compute the image co-ordinates from world object co-ordinates. I am writing
> here the method in case some newbie like me needs it.
> If the World Co-ordinate of the object Point is P, Then we can obtain the
> Image co-ordinate C as :
>
> 1. C1 = P * MatView * MatProj * MatWind
> 2. C = C1 / C1.w()
>
> where
>
> MatView:- View Matrix obtained through osg::Camera::getViewMatrix()
> MatProj:- Projection Matrix obtained through
> osg::Camera::getProjectionMatrix()
> MatWind:- Window Matrix obtained through
> osg::Viewport::computeWindowMatrix()
>
> Here the viewport is set to the size of the image to which the scene is
> rendered through RTT.
>
> Regards
>
> Harash
>
>
>
>
> Harash Sharma wrote:
>
> Hi All,
>
> I want some help regarding computation of projected model co-ordinates.
> The problem is like this. I have a cuboidal model (car). I have been able to
> compute its bounding box -- so I know the world co-ordinates of the car
> bound corners. The scene is being rendered to a texture with an image size
> of MxN Pixels. I would like to calculate the image co-ordinates
> corresponding to the 8 bounding box corner co-ordinates. Is there any
> function available in OSG which I can use to achieve this. If not, it would
> be very kind of you to indicate some pointers on how to do it. I thought it
> would be as simple as multiplying the model vectors by the
> osg::View::getCamera()->getViewMatrix() *
> osg::View::getCamera()->getProjectionMatrix(), but I think
> I am missing something.
>
> Please Help. Thanks in advance.
>
> Regards
>
> Harash
> 
> ___
> osg-users mailing list
> osg-users@lists.openscenegraph.org
> http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org
>
>
>
> 
>
>
> ___
> osg-users mailing list
> osg-users@lists.openscenegraph.org
> http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org
>
>
___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org


   
___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org


Re: [osg-users] Computing Image Co-ordinates

2007-09-06 Thread Harash Sharma
Hi 
 Thanks to all. I now realize the problem was probably too simple for anyone 
to spend their precious time on. I was able to figure out how to compute the 
image co-ordinates from world object co-ordinates. I am writing the method here 
in case some newbie like me needs it.
 If the world co-ordinate of the object point is P, then we can obtain the 
image co-ordinate C as:
   
  1. C1 = P * MatView * MatProj * MatWind
  2. C = C1 / C1.w()
   
  where 
   
  MatView: View matrix obtained through osg::Camera::getViewMatrix()
  MatProj: Projection matrix obtained through osg::Camera::getProjectionMatrix()
  MatWind: Window matrix obtained through osg::Viewport::computeWindowMatrix()
   
Here the viewport is set to the size of the image to which the scene is 
rendered through RTT.
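
Written out explicitly in homogeneous co-ordinates (this is just a restatement 
of the two steps above, not anything new):

\tilde{C} = \begin{pmatrix} P_x & P_y & P_z & 1 \end{pmatrix}
            M_{\mathrm{view}} \, M_{\mathrm{proj}} \, M_{\mathrm{wind}},
\qquad
C = \left( \frac{\tilde{C}_x}{\tilde{C}_w},\;
           \frac{\tilde{C}_y}{\tilde{C}_w},\;
           \frac{\tilde{C}_z}{\tilde{C}_w} \right)

The division by w is the perspective divide; as Robert noted, OSG performs it 
automatically when a Vec3 is multiplied by a Matrix, so step 2 is only needed 
when working with Vec4.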
   
  Regards 
   
  Harash
   

   
  Harash Sharma <[EMAIL PROTECTED]> wrote:
Hi All,
   
  I want some help regarding computation of projected model co-ordinates. 
The problem is like this. I have a cuboidal model (car). I have been able to 
compute its bounding box -- so I know the world co-ordinates of the car bound 
corners. The scene is being rendered to a texture with an image size of MxN 
Pixels. I would like to calculate the image co-ordinates corresponding to the 8 
bounding box corner co-ordinates. Is there any function available in OSG which 
I can use to achieve this. If not, it would be very kind of you to indicate 
some  pointers on how to do it. I thought it would be as simple as multiplying 
the model vectors by the osg::View::getCamera()->getViewMatrix() * 
osg::View::getCamera()->getProjectionMatrix(), but I think I am missing 
something. 
   
 Please Help. Thanks in advance.
   
  Regards
   
  Harash

___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org


   
___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org


[osg-users] Computing Image Co-ordinates

2007-09-05 Thread Harash Sharma
Hi All,
   
  I want some help regarding computation of projected model co-ordinates. 
The problem is like this. I have a cuboidal model (a car), and I have been able 
to compute its bounding box -- so I know the world co-ordinates of the car's 
bounding box corners. The scene is being rendered to a texture with an image 
size of MxN pixels. I would like to calculate the image co-ordinates 
corresponding to the 8 bounding box corner co-ordinates. Is there any function 
available in OSG which I can use to achieve this? If not, it would be very kind 
of you to indicate some pointers on how to do it. I thought it would be as 
simple as multiplying the model vectors by 
osg::View::getCamera()->getViewMatrix() * 
osg::View::getCamera()->getProjectionMatrix(), but I think I am missing 
something. 
   
 Please Help. Thanks in advance.
   
  Regards
   
  Harash

   
___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org


[osg-users] Projection of objects with no lighting, shading etc.

2007-08-07 Thread Harash Sharma
Hi all,
   
  I am facing a strange problem. I have a few textured objects. The 
textures are not coloured but 16-bit grayscale. In my application I am required 
to project the scene built from these objects. It is important for me to retain 
the exact gray values (the object texture gray values should be identical to 
the gray values obtained after projection). But the gray values are lost; in 
fact, some of the bright objects become dark while some darker ones become 
brighter. 
   
 I would be thankful if you could tell me which state values / settings I 
need to change to achieve this goal. 
   
 I have already disabled lighting and set the shade model to FLAT.
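
For reference, a minimal sketch of that state setup, with a GL_REPLACE texture 
environment added on top; the REPLACE part is a guess at what may keep the 
texel values untouched, not something established in this thread:

// Sketch of the state described above: lighting off and flat shading, plus a
// REPLACE texture environment so the fragment colour is taken straight from
// the texel (the REPLACE part is an assumption, not from the original post).
#include <osg/Node>
#include <osg/StateSet>
#include <osg/ShadeModel>
#include <osg/TexEnv>

void applyUnlitReplaceState(osg::Node* node)
{
    osg::StateSet* ss = node->getOrCreateStateSet();

    ss->setMode(GL_LIGHTING, osg::StateAttribute::OFF |
                             osg::StateAttribute::OVERRIDE);

    ss->setAttributeAndModes(new osg::ShadeModel(osg::ShadeModel::FLAT),
                             osg::StateAttribute::ON);

    // Texture unit 0: output the texel value unmodified by the base colour.
    ss->setTextureAttributeAndModes(0, new osg::TexEnv(osg::TexEnv::REPLACE),
                                    osg::StateAttribute::ON);
}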
   
 Thanks in Advance.
   
  Regards
   
  Harash

   
-
Sick sense of humor? Visit Yahoo! TV's Comedy with an Edge to see what's on, 
when. ___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org