Re: [osg-users] getting correct light position

2012-03-28 Thread Sergey Polischuk
Hi
check that you treat directional lights in the right way (i mean use the light
direction instead of the position, and so on)
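
In fixed-function OpenGL the distinction is carried by the w component of the light position: w == 0 means directional (xyz is a direction toward the light), w == 1 means positional. A ray tracer fetching osg::Light::getPosition() has to branch on that w. A minimal sketch of the branch, using stand-in vector types rather than the real osg::Vec4/osg::Vec3 so it stays self-contained:

```cpp
#include <cmath>

// Stand-ins for osg::Vec4/osg::Vec3 so the sketch compiles without OSG.
struct Vec4 { double x, y, z, w; };
struct Vec3 { double x, y, z; };

static Vec3 normalize(Vec3 v) {
    double len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return Vec3{v.x / len, v.y / len, v.z / len};
}

// Direction from a surface point toward the light, as a ray tracer needs it.
// For w == 0 the "position" is itself a direction toward the light and the
// surface point is irrelevant; for w != 0 it is a point in space.
Vec3 dirToLight(const Vec4& lightPos, const Vec3& surfacePoint) {
    if (lightPos.w == 0.0) {
        return normalize(Vec3{lightPos.x, lightPos.y, lightPos.z});
    }
    return normalize(Vec3{lightPos.x - surfacePoint.x,
                          lightPos.y - surfacePoint.y,
                          lightPos.z - surfacePoint.z});
}
```

Feeding a directional light's position through the positional branch places the light at a finite point, which can flip the apparent light side exactly as described in the quoted message.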

28.03.2012, 13:29, Andrey Ibe xry...@gmail.com:
 Hi,

 i am doing off-line ray-tracing. at the beginning of the computation i
 fetch all the lights and then use them.

 the problem is, i am not getting the correct light position, obviously. i 
 attached two pictures showing the difference between the position the 
 osg::viewer is getting and the one i'm getting for the ray-tracing. the 
 pictures feature a solo directional light. in the viewer picture you can 
 guess the position of the light by the specular reflection on the pink 
 object. it seems to be exactly on the opposite side compared to the 
 ray-traced image.

 so far this only happened with the directional light for this model, but i
 am starting to suspect that the other lights (in different models) are not
 positioned correctly either.

 i fetch the lights using a LightSource visitor that collects the light
 sources, then i use getWorldMatrices() and take the first transformation
 matrix, and finally i multiply the light's position and direction by it.
 this is the code:
 Code:

 void RayManager::collectLightSources(WorldLightSourceContainer& lc) const {
     osg::ref_ptr<LightSourceVisitor> lVisitor = new LightSourceVisitor(lc.sources);
     _scene->accept(*(lVisitor.get()));
     for (LightSources::iterator lsIt = lc.sources.begin();
          lsIt != lc.sources.end(); lsIt++) {
         osg::MatrixList worldMatrices = (*lsIt)->getWorldMatrices();
         // take the first world matrix
         osg::Matrix m = worldMatrices.front();
         lc.positions.push_back((*lsIt)->getLight()->getPosition() * m);
         osg::Vec3d direction = (*lsIt)->getLight()->getDirection() * m;
         direction.normalize();
         lc.directions.push_back(direction);
     }
 }
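
A guess worth checking in the code above: multiplying the direction by the full world matrix also applies the matrix's translation, which corrupts directions whenever the light's parent transforms are not at the origin. Positions want the full 4x4 transform; directions want the rotation block only (OSG provides osg::Matrix::transform3x3 for this). A self-contained sketch of the difference, with minimal stand-in types in place of the osg classes:

```cpp
// Stand-in 4x4 row-major matrix and 3-vector so the sketch compiles without
// OSG; like OSG, vectors are row vectors (v * M) and the translation sits in
// the last row. transformPoint applies rotation + translation;
// transformDirection (the analogue of osg::Matrix::transform3x3) applies the
// upper-left 3x3 block only.
struct Vec3 { double v[3]; };
struct Mat4 { double m[4][4]; };

Vec3 transformPoint(const Vec3& p, const Mat4& M) {
    Vec3 r{};
    for (int c = 0; c < 3; ++c)
        r.v[c] = p.v[0]*M.m[0][c] + p.v[1]*M.m[1][c] + p.v[2]*M.m[2][c]
                 + M.m[3][c];  // translation row included
    return r;
}

Vec3 transformDirection(const Vec3& d, const Mat4& M) {
    Vec3 r{};
    for (int c = 0; c < 3; ++c)
        r.v[c] = d.v[0]*M.m[0][c] + d.v[1]*M.m[1][c] + d.v[2]*M.m[2][c];  // no translation
    return r;
}
```

With a pure translation matrix, transformPoint shifts the point while transformDirection returns the input unchanged, which is the behaviour a light direction needs.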

 and this is the code for my lightSource visitor
 Code:

 LightSourceVisitor::LightSourceVisitor(LightSources& lightSources) :
     osg::NodeVisitor(TRAVERSE_ALL_CHILDREN),
     _lightSources(lightSources) {
 }

 void LightSourceVisitor::apply(osg::LightSource& node) {
     _lightSources.push_back(&node);
     osg::NodeVisitor::apply((osg::Group&)node);
 }

 am i doing something wrong? the code seems to work for the two other lights
 (which are turned off in the attached pictures), and that's what worries me.
 Thank you!

 Cheers,
 Andrey

 --
 Read this topic online here:
 http://forum.openscenegraph.org/viewtopic.php?p=46645#46645

 ___
 osg-users mailing list
 osg-users@lists.openscenegraph.org
 http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org


Re: [osg-users] VDSM and directional light problem

2012-03-28 Thread Robert Osfield
Hi Mike,

I don't know the root of the problems you've seen yet, but as a
general comment, the ViewDependentShadowMap implementation currently
only implements perspective shadow maps for directional lights, and
falls back to using an orthographic shadow map for positional lights.
So for you it would seem that there is an error when perspective
shadow maps are being computed.

Robert.

On 27 March 2012 13:51, Mike Connell michael.conn...@gmail.com wrote:
 Hello again

 Just in case anyone else gets bitten by this: I didn't find the source of
 the error and instead switched to a positional light when shadowing.

 When I was delving in the VDSM code I saw inside
 computeLightViewFrustumPolytope that you start with the same plane equations
 for both positional and directional light, but during the edge check the
 boundary edges found will be treated in the opposite fashion in the
 directional and positional cases - ie those planes that are inverted for a
 positional light are exactly those which are not inverted for directional,
 and vice versa. That struck me as a little odd, but switching the plane
 orientation so that the directional case behaved as positional didn't appear
 to improve anything for my test case.

 best wishes

 Mike


 On 19 March 2012 14:02, Mike Connell michael.conn...@gmail.com wrote:

 Hi!

 I've got a problem with shadow clipping in VDSM. I was interested to test
 the patches from Wang Rui last week, but unfortunately they don't fix my
 issue (this is with trunk from earlier today)

 If my memory serves, directional lights in OpenGL normally have a w=0.0,
 whereas positional lights typically have w=1.0. The problem in our model is
 that our shadows get clipped (at very close range) when we use a directional
 light. I've made a testcase from osgshadow.cpp which illustrates the
 problem, and there if you use a positional light instead the clipping is
 correct.

 To reproduce the problem:
 1. Compile modified osgshadow.cpp
 2. Run with --vdsm --mapres 2048 --noUpdate
 3. Hit space to get the default view, which I've modified to show the
 problem; if you zoom out a little you'll see the shadow recover, zoom back
 in and it will be stripped away again.
 4. Change the light position from (1,1,1,0) to (1,1,1,1) and it will work
 fine.

 Am I missing something obvious? I know I could just use a second
 positional light, but that seems like it shouldn't be necessary?

 best wishes

 Mike





Re: [osg-users] simplifier tristripifier

2012-03-28 Thread Robert Osfield
Hi Cedric,

The TriStripVisitor is really something that was valuable 5+ years
ago; these days modern graphics cards prefer larger primitive batches
using indexed triangle meshes, with the index ordering optimized to make
the best use of the vertex cache on the GPU.  The problem with
tri-stripping is that it creates a large number of PrimitiveSets, which
results in more OpenGL calls than a smaller number of larger
PrimitiveSets.
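
The batching point can be made concrete: each strip is its own PrimitiveSet and hence its own draw call, whereas the same triangles expressed as one indexed GL_TRIANGLES set cost a single call. A hedged sketch of the core of that conversion, expanding one strip's index list into an independent-triangle index list with the usual even/odd winding flip (the function name is ours, not an OSG API):

```cpp
#include <cstddef>
#include <vector>

// Expand one triangle-strip index list into independent triangles, flipping
// the winding of every odd triangle so all faces keep the same orientation,
// and skipping degenerate triangles used to join strips. Several strips
// expanded this way can be concatenated into one indexed GL_TRIANGLES set.
std::vector<unsigned> stripToTriangles(const std::vector<unsigned>& strip) {
    std::vector<unsigned> tris;
    if (strip.size() < 3) return tris;
    for (std::size_t i = 0; i + 2 < strip.size(); ++i) {
        unsigned a = strip[i], b = strip[i + 1], c = strip[i + 2];
        if (a == b || b == c || a == c) continue;  // degenerate join
        if (i % 2 == 0) { tris.push_back(a); tris.push_back(b); tris.push_back(c); }
        else            { tris.push_back(b); tris.push_back(a); tris.push_back(c); }
    }
    return tris;
}
```

The concatenated result is the kind of single large index array that one osg::DrawElementsUInt(GL_TRIANGLES, ...) primitive set can carry.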

Use of the MeshOptimizer would probably be the best default for
the Simplifier now.  Right now you can use the
Simplifier::setDoTriStrip(bool) method to switch off the use of
tri-stripping, but I'm now inclined to think we should have an enum that
tells the Simplifier what post-processing it should do on the data.

Thoughts?
Robert.

On 27 March 2012 00:57, Cedric Pinson cedric.pin...@plopbyte.com wrote:
 Hi Robert and all users,

 I was wondering if it still makes sense to use the stripifier like in
 osgUtil::Simplifier; in the end it generates a lot of draw calls and makes
 the geometry slow. To fix this I disabled the stripifier in my osgconv
 binary when using the simplifier.
 I was curious if it makes sense for some other usage? In osgjs I use it,
 but at the end I generate dummy vertices to make one big strip; without
 that it just drains performance.

 I mean, if everybody feels the same we could start to submit a patch to
 avoid it, or make it a standalone operation.
 Any other thoughts about it ?


 Cedric Pinson
 Provide OpenGL, WebGL services
 +33 659 598 614 -
 http://cedricpinson.com - http://osgjs.org - http://showwebgl.com




Re: [osg-users] Image::readPixels (3.0.1)

2012-03-28 Thread Robert Osfield
Hi Andrew,

Calling setPacking() after you've called setData() or allocateImage() will
break the packing of the image and is not something you should do.
There will be circumstances where it would be safe, but in general it
won't be.

What to do about this flexibility of osg::Image, which provides the
setPacking() method without reallocating the image data, is a question we
should ask.  One could just remove the set method completely to avoid
the issue, or perhaps just add a comment that setPacking() should only
be called when the data already allocated and assigned to the osg::Image
is compatible with the change.  The latter option would not change the
API, but also wouldn't close off the potential usage error.  The OSG is an
advanced API, so I'm more inclined to retain the flexibility that power
users can take advantage of rather than try to catch all the
possible usage errors that end users could come up with.

Robert.

On 27 March 2012 18:53, Andrew Cunningham andr...@mac.com wrote:
 I think there is an issue in  Image::readPixels that I noted in 2.8.x

 Assume the Image packing has been set to, say, 4 via
 image->setPacking(4) before this call.
 glReadPixels might try to store 'too much' data into _data (overwriting
 memory), because glReadPixels will be expecting the pixel data to use a
 packing of 4 while allocateImage was passed a packing of 1 via the default
 parameter.

 It certainly caused my code to crash, depending on the variations of width
 and height.

 void Image::readPixels(int x,int y,int width,int height,
                       GLenum format,GLenum type)
 {
    allocateImage(width,height,1,format,type);

    glPixelStorei(GL_PACK_ALIGNMENT,_packing);

    glReadPixels(x,y,width,height,format,type,_data);
 }

 changing code to

  allocateImage(width,height,1,format,type,_packing);

 seems to fix the issue.
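
The overflow is plain row arithmetic: with GL_PACK_ALIGNMENT set to _packing, glReadPixels pads every row up to a multiple of that alignment, while allocateImage() called with the default packing of 1 allocates unpadded rows. A sketch of the padding rule (the helper name is hypothetical, not an OSG API):

```cpp
// Bytes per row that glReadPixels will write for a given width,
// bytes-per-pixel and GL_PACK_ALIGNMENT value: each row is padded up to the
// next multiple of the alignment (1, 2, 4 or 8 in OpenGL).
unsigned rowSizeInBytes(unsigned width, unsigned bytesPerPixel, unsigned packing) {
    unsigned raw = width * bytesPerPixel;
    return ((raw + packing - 1) / packing) * packing;
}
```

For a 5-pixel-wide GL_RGB row (3 bytes per pixel) the unpadded size is 15 bytes but the 4-aligned size is 16, so glReadPixels writes one extra byte per row into a buffer allocated for 15; multiplied by the image height, that is the overwrite described above.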
 It's pretty subtle, and may only affect some people who need a particular
 format for the image data, and maybe calling setPacking is an abuse of the
 API, so this can be seen as informational only rather than a 'bug' report.

 --
 Read this topic online here:
 http://forum.openscenegraph.org/viewtopic.php?p=46637#46637







[osg-users] osgShadow private members of PSSM

2012-03-28 Thread Daniel Schmid
Hi there

Is there a reason why in ParallelSplitShadowMap lots of the interesting
members are private? I would like to derive my own PSSM and implement
multitexturing, but for this I do not have access to _textureUnitOffset, which
is a private member and doesn't even have a setter method...

Regards
Daniel




Re: [osg-users] getting correct light position

2012-03-28 Thread Andrey Ibe
Hi and thank you for the quick suggestion.

i double checked. i treat all three types of light equally, in terms of
vector mathematics. the only difference is the attenuation stuff, but those
values are scalar.

Cheers,
Andrey

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=46651#46651







Re: [osg-users] osgShadow private members of PSSM

2012-03-28 Thread Robert Osfield
Hi Daniel,

I would recommend using ViewDependentShadowMap over
ParallelSplitShadowMap. This does mean using a recent dev release rather
than a stable release, but the new shadow technique can do parallel
split shadow maps as well as perspective shadow maps and makes it easy
to switch between or combine them.  It also performs much better.

Robert.

On 28 March 2012 14:26, Daniel Schmid daniel.sch...@swiss-simtec.ch wrote:
 Hi there



 Is there a reason why in ParallelSplitShadowMap, lots of the interesting
 members are private? I would like to derive my own PSSM, and implement
 multitexturing, but for this I do not have access to _textureUnitOffset,
 which is a private member and doesn't even have a setter method...



 Regards

 Daniel








[osg-users] Any one has updated osgVrpn from Mike Weiblen to OSG-3.xx ?

2012-03-28 Thread Pierre Bourdin
Dear all,
I just wanted to do a short test using osg & vrpn...
I managed to compile osgVRPN & osgVRPNviewer...
It runs, but there's something wrong: it looks like the camera matrix is
not updated, or more probably it is squashed before the repaint.
The application is blinking, but there is no update of the view...

The code was done for osg 2.8.

 I just changed :

osgGA/MatrixManipulator

for

osgGA/CameraManipulator

is there anything else I should have changed ?

Has anybody already updated osgVRPN for the 3.xx branch?

Regards,
Pierre


[osg-users] GraphicsWindowingSystem ::DestroyWindow() crash

2012-03-28 Thread Steven Powers
I'm having an issue with window initialization. The crash occurs on the 
GraphicsWindowWin32::setPixelFormat() return. 

Specifically within the ~OpenGLContext() destructor on this line:

Line 558-   ::DestroyWindow(_hwnd);

The crash seems to be caused by some sort of race condition, but I can't seem
to identify the cause. The _hwnd value always seems to be valid
and matches the value assigned during the CreateWindowEx() call. Commenting
DestroyWindow() out works perfectly.

I'm running OSG 3.0.1 and the crash occurs within the osgviewer application as 
well.

My machine is a laptop with both Intel HD Integrated graphics and an Nvidia 
Quadro 4000M. The hard drive has full disk encryption if that makes a 
difference.

Any advice?

Cheers,
Steven

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=46658#46658







Re: [osg-users] Image::readPixels (3.0.1)

2012-03-28 Thread Andrew Cunningham
Just imagine this scenario of a DrawCallback:

struct SnapImage : public osg::Camera::DrawCallback
{
    SnapImage(unsigned int format):
        _snapImage(false),
        _format(format)
    {
        _image = new osg::Image;
        _image->setPacking(4);
    }

    ~SnapImage() {}

    virtual void operator () (osg::RenderInfo& renderInfo) const
    {
        if (!_snapImage) return;

        osg::notify(osg::NOTICE)<<"Camera callback"<<std::endl;

        osg::Camera* camera = renderInfo.getCurrentCamera();
        osg::Viewport* viewport = camera ? camera->getViewport() : 0;

        if (viewport && _image.valid())
        {
            _image->readPixels(int(viewport->x()), int(viewport->y()),
                               int(viewport->width()), int(viewport->height()),
                               _format,
                               GL_UNSIGNED_BYTE);

            osg::notify(osg::NOTICE)<<"Taken screenshot.."<<std::endl;
        }

        _snapImage = false;
    }

    mutable bool                     _snapImage;
    mutable unsigned int             _format;
    mutable osg::ref_ptr<osg::Image> _image;
};



This will likely crash when the width is not a multiple of 4. Note that I am
not calling setPacking() after the image has been allocated.
I wanted a packing of 4 as this matches the row packing of BMP.

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=46659#46659







Re: [osg-users] Frame Rate Decay w/ Silver Lining Integration

2012-03-28 Thread Christiansen, Brad
Hi,

Sorry it has taken so long to respond. The joy of releases!

The issue was resolved by updating to the 300 series drivers (I had to use a
modified inf file to be able to install drivers more recent than 276 on my
laptop; that is why I didn't try this straight away).
Turning off threaded optimisations didn't seem to help.

Cheers,
Brad

From: osg-users-boun...@lists.openscenegraph.org 
[mailto:osg-users-boun...@lists.openscenegraph.org] On Behalf Of Wojciech 
Lewandowski
Sent: Saturday, 17 March 2012 10:33 PM
To: OpenSceneGraph Users
Subject: Re: [osg-users] Frame Rate Decay w/ SilverLining Integration

Hi Brad,

Thank you for the report. I personally investigated and reported another issue
with VBOs on Nvidia drivers a few months ago, and that issue was fixed in the
290 driver series. It may be a long shot, but I am just curious whether these
two problems could be related. While investigating my issue
(http://forum.openscenegraph.org/viewtopic.php?t=9258&postdays=0&postorder=asc&start=0)
I got the following post from Ruben Smelik:

[..]
This mail reminded me of an issue I had a couple of years ago with VBOs on a
particular Windows PC with a 9800GX2. I thought it was an issue with that PC,
as it was quite unstable, so I didn't report the problem at that time. The
solution I accidentally found back then was to turn off Threaded Optimization
in the NVidia Control Panel (Auto per default).

But now I'm getting the bad result of your test on a GTX 480 (266.58 driver), 
and that fix works again. After turning off Threaded Optimization, I see the 
proper gradient displayed.

Could you try this as well?
[..]


Your drivers are 276.21, so pretty close to the 266.58 Ruben used. So I am now
also curious whether you could try turning off Threaded Optimization and/or
newer drivers, and see if the problem still exists.

Cheers,
Wojtek


2012/3/16 Christiansen, Brad brad.christian...@thalesgroup.com.au
Hi Wojtek,

Thanks for your offer to help out, but I have managed to track it down enough
to have a good enough solution for now.
For anyone else who stumbles across this issue, my workaround is to disable
VBOs in SilverLining. When I did this by using the SILVERLINING_NO_VBO
environment variable it crashed, so I simply hard-coded them to off in
SilverLiningOpenGL.cpp. I narrowed down the source of the issue to calls to
AllocateVertBuffer in the same file.  Even if the buffers are never used,
simply allocating them for the 6 faces of the sky box is enough to cause
things to go wrong.

I am using version 2.35 of SilverLining.
VS2010 SP1
OSG trunk as of a month or two ago
Windows 7
Nvidia GTX460M Driver Version 267.21

The same problem was also occurring on another machine. I think that had a 
450GT in it, but otherwise the same.

Cheers,

Brad

From: osg-users-boun...@lists.openscenegraph.org
[mailto:osg-users-boun...@lists.openscenegraph.org] On Behalf Of Wojciech Lewandowski
Sent: Saturday, 17 March 2012 1:59 AM

To: OpenSceneGraph Users
Subject: Re: [osg-users] Frame Rate Decay w/ SilverLining Integration

Hi, Brad,

We have a SilverLining source code license. I may find a few hours next week
to look at the problem, if the issue can be reproduced on one of my machines
(i.e. Nvidia GF580/GF9400 or GF540M). I would like to have as much info as
possible to replicate the issue, though. I would like to know:

- System version
- OSG version
- Graphics board and driver version (dual monitor setup? GPU panel tweaks?)
- Compiler/linker VS studio version
- SilverLining version. If not yet tested, I would be grateful if you could
test it with the latest trial SilverLining SDK to be sure it's not fixed
already.

What exactly is done with SilverLining? What cloud types / wind settings /
lightning etc. are used? Each type of SilverLining cloud entity has its own
specific parameters and can be drawn with a different algorithm and use
different graphics resources. So it may be important to know what SilverLining
resources are in use. Probably the best would be if you could send the sample
source you are testing.

Cheers,
Wojtek Lewandowski


2012/3/16 Christiansen, Brad brad.christian...@thalesgroup.com.au
Hi,

Thanks for the response. I have a few more details of the problem but am
still completely stumped.

This is my test:
Start my application and leave it running for a while.  Frame rate, memory
use, etc. are all stable.
Enable SilverLining.
As reported by gDebugger, after the initial expected increase, the number of
reported OpenGL calls, vertices, texture objects (and every other counter they
have) stays completely stable, except for the frame rate, which reduces at a
steady rate of a few frames each second.

In the earlier thread, it was noted that changing the threading model seemed
to 'reset' the frame rate. I looked into this some more and it