Re: [osg-users] Crash from loading assets in worker thread

2013-11-22 Thread Bradley Baker Searles
Hi Robert-

I agree the osgDB::readRef*File() functions are safe. I was only noting that 
the osgDB::read*File() functions that return raw pointers are used in more 
than just Input.cpp and the deprecated wrappers, as you'd mentioned.

As for whether using the take methods invalidates the reference count, I think 
we're arguing semantics. I do not consider a reference-counted object to be 
properly reference counted if raw pointers to it are being kept and used 
(performance-critical situations excepted).

I'm looking forward to a less intrusive tweak, that'd be great.

Thanks for the responses!

Baker

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=57402#57402







Re: [osg-users] Crash from loading assets in worker thread

2013-11-21 Thread Bradley Baker Searles
Hi Robert,

The DatabasePager calls using ReadResult all use a ref_ptr<osg::Object> to 
store the loaded item, unlike the code in 
src/osgWrappers/deprecated-dotosg/osg/Texture2D.cpp :: 
Texture2D_readLocalData(Object, Input), which uses Input::readImage(filename) 
to load the image.

Input::readImage() calls osgDB::readImageFile(), which loads the image via the 
Registry (which keeps a reference in its cache) and then defeats the ref_ptr by 
returning a raw pointer. This code is not thread safe: if a render happens 
before that raw pointer is safely tucked into a ref_ptr, the asset is deleted 
as described previously, leaving the raw pointer dangling.
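To make the difference concrete, here's a minimal sketch (my own illustration, not OSG source; the filename is a placeholder) contrasting the raw-pointer path with the ref_ptr path:

Code:

#include <osg/Image>
#include <osgDB/ReadFile>

void illustrateOwnership()
{
    // Unsafe pattern (what the deprecated path effectively does): the only strong
    // reference lives in the Registry's cache while we hold a raw pointer, so a
    // cache flush on another thread can delete the image out from under us.
    osg::Image* raw = osgDB::readImageFile("legacy_texture.png");

    // Safe pattern: share ownership with the cache from the moment of loading.
    osg::ref_ptr<osg::Image> owned = osgDB::readRefImageFile("legacy_texture.png");

    (void)raw; (void)owned; // only here to silence unused-variable warnings
}
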

This crash is happening deep within OSG code paths on both threads. Our app 
isn't affecting anything but the timing here.

Is there any reason I should not submit a patch to the OSG that uses 
thread-safe ref_ptr's for any load path that uses the Registry Object Cache 
to load assets?

Thanks,
Baker

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=57370#57370







[osg-users] Crash from loading assets in worker thread

2013-11-19 Thread Bradley Baker Searles
Hello,

Summary:

Since upgrading to OSG 3.2, our application will crash during rendering while 
loading a legacy .OSG file on a worker thread. The legacy/deprecated OSG code 
loads objects into the object cache, but then uses takeImage() to get a raw 
pointer to a ref-counted object. If the threads are serviced in a particular 
order, the main thread does its cache cleanup while the load thread is 
pre-empted and deletes this object, which looks like it has no references.

More Detail:

In this use case, the rendered scene graph is empty, and the loaded assets are 
not introduced into the scene graph until the worker thread returns. Let me map 
out the basic flow of the error case (note that freezing the worker thread 
while it holds the raw pointer triggers this crash every time):


Code:

WORKER THREAD:
* Texture2D_readLocalData (src/osgWrappers/deprecated-dotosg/osg/Texture2D.cpp) calls:
* Input::readImage(), which calls 
* osgDB::readImageFile(), which uses rr.takeImage() to return a raw pointer.
* NOTE: the only reference to the Image at this point is the one held by the 
cache object.

MAIN THREAD:
* Registry::updateTimeStampOfObjectsInCacheWithExternalReferences()
*** timestamp is not updated because there are no counted external references!
* Registry::removeExpiredObjectsInCache()
*** this deletes the object because the timestamp was not updated above.

WORKER THREAD:
* raw Image pointer is added to the texture, but it's not valid at this point. 
Crash ensues.


  

There are several fixes I can think of offhand; some seem more proper than 
others. I'm not sure why the raw-pointer usage is there, as it seems to violate 
the principle of the smart pointers.

POSSIBLE FIXES:

 1. Fix the code that only partially uses ref counting so it holds proper 
references, or remove ref counting along entire speed-critical paths. This 
seems most correct at a glance (see the sketch after this list).
 2. Do not render while loading. We've done this as a temporary fix, but the 
drawbacks are obvious. Heck, maybe this is not a supported feature and I have 
just somehow missed this along the way.
 3. Introduce an UNDEFINED TIME to initialize cache objects to when they're 
added (instead of the 0.0 that the parameter currently defaults to). This only 
masks the problem, and only as long as the objects are re-ref-counted within 
the time frame allotted to the object cache. I implemented this as well, and it 
works in our situation, although if I freeze the load thread long enough in the 
right spots I can still make it fail, which tells me it isn't a proper fix.
 4. Don't use old-style .OSG files. This is not really a good option for us; 
we've been using OpenSceneGraph for years, and we have a lot of customers with 
legacy OSG data out there. Our typical use is a mixture of OSG and OBJ files, 
and the OBJ load paths seem to use ref-counted pointers throughout.
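Here is a rough sketch of what option 1 might look like at the wrapper level (the helper name is mine and this is not a proposed patch; it assumes the osgDB::readRefImageFile() entry point):

Code:

#include <string>
#include <osg/Texture2D>
#include <osgDB/ReadFile>

// Hypothetical helper, not the actual wrapper code: hold the loaded image in a
// ref_ptr so a cache flush on the main thread cannot delete it before the
// texture takes its own reference.
void assignImageToTexture(osg::Texture2D& texture, const std::string& filename)
{
    osg::ref_ptr<osg::Image> image = osgDB::readRefImageFile(filename);
    if (image.valid())
    {
        texture.setImage(image.get()); // the Texture2D now holds its own reference
    }
}
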
  
I would have supplied a patch with this post, but I don't understand the 
intent of this code well enough to create something I'm sure is correct.

Build/Test Environment:
OSG 3.2, using wxWidgets 2.9.5 for windowing
Windows 8.1, 64-bit
Visual Studio 2010

Input, insight, or better solutions would be greatly appreciated.

Thanks
Baker

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=57327#57327







Re: [osg-users] nVidia bug with multiple windows

2013-04-30 Thread Bradley Baker Searles
FYI - This bug is fixed as of the 320.00 Beta drivers.

-Baker

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=53868#53868







Re: [osg-users] nVidia bug with multiple windows

2012-11-28 Thread Bradley Baker Searles
Just as an update, I submitted this issue to NVIDIA and they were able to 
reproduce it on their end. They are currently working on it, although they do 
not have an ETA at this point.

In addition to our workaround of enabling VBOs, they offered up this possible 
fix (I have not tried it; we're very busy and just using VBOs with NVIDIA at 
the moment):

“A possible WAR for the developer would be to call 
glClientActiveTexture(GL_TEXTURE0) explicitly where texture unit 0 is used.”
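For anyone who wants to try it, here's my guess at how that workaround could be wired into an OSG app (a sketch only; neither NVIDIA nor I have verified this exact form). osg::State::setClientActiveTextureUnit() wraps glClientActiveTexture():

Code:

#include <osg/Camera>
#include <osg/RenderInfo>
#include <osg/State>

// Hypothetical workaround sketch: force client texture unit 0 before each draw.
struct ForceClientTextureUnit0 : public osg::Camera::DrawCallback
{
    virtual void operator()(osg::RenderInfo& renderInfo) const
    {
        // equivalent to glClientActiveTexture(GL_TEXTURE0), with OSG's state tracking
        renderInfo.getState()->setClientActiveTextureUnit(0);
    }
};

// usage, per camera:
//   camera->setPreDrawCallback(new ForceClientTextureUnit0);
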

If I'm notified they have a driver fix I'll update this thread.

Regards-
Baker

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=51271#51271







Re: [osg-users] nVidia bug with multiple windows

2012-10-22 Thread Bradley Baker Searles
Hi Leigh,

Thanks for the pointer to that revision; however, our issue persists at 
revision 13170.

I posted that repro case to their dev support email address, and they let me 
know they're taking a look at it. I'll update this thread with any progress 
made.

Thanks-
Baker Searles

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=50720#50720







Re: [osg-users] nVidia bug with multiple windows

2012-10-17 Thread Bradley Baker Searles
Hi J-S,

That's a good idea, and I just submitted the example to them. I'll go ahead and 
post the zip file here too just for posterity, as it's self-contained other 
than the Visual Studio 2010 runtimes.

Actually, I posted it via their website and not directly to that e-mail 
address, perhaps I'll do that as well. My initial plan was to put it on their 
dev forums, but apparently they're still down from the big security breach a 
few months ago!

Thanks-
Baker

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=50650#50650




Attachments: 
http://forum.openscenegraph.org//files/20121017_nvidia_osg_multiplewindowissue_180.zip




Re: [osg-users] Memory issues when using uniform shader variables on Intel HD graphics

2012-10-16 Thread Bradley Baker Searles
Hi Clemens-

What version of the OSG are you using?

There was a bug fixed in OSG r13015 that would cause out of memory issues for 
our application. It still seems like the Intel driver uses more memory than the 
other drivers (AMD/nVidia), but it's workable now.

http://www.openscenegraph.org/projects/osg/changeset/13015

As an aside, the other fix I needed for the Intel drivers was to manually 
compute the half-vector, as gl_LightSource[].halfVector was not being provided 
with the latest drivers.
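For reference, the manual computation looks roughly like this (an illustrative fragment shader of my own, not our production code; it assumes a directional light, an infinite viewer, and a matching vertex shader that writes the 'normal' varying):

Code:

#include <osg/Shader>

// Illustration only: compute the half-vector manually instead of relying on
// gl_LightSource[0].halfVector. For a directional light and an infinite viewer,
// the fixed-function half-vector is normalize(L + (0,0,1)) in eye space.
osg::Shader* createSpecularFragmentShader()
{
    return new osg::Shader( osg::Shader::FRAGMENT,
        "varying vec3 normal;                                                   \n"
        "void main(void)                                                        \n"
        "{                                                                      \n"
        "    vec3 N = normalize(normal);                                        \n"
        "    vec3 L = normalize(gl_LightSource[0].position.xyz);                \n"
        "    vec3 H = normalize(L + vec3(0.0, 0.0, 1.0)); // manual half-vector \n"
        "    float spec = pow(max(dot(N, H), 0.0), gl_FrontMaterial.shininess); \n"
        "    gl_FragColor = vec4(vec3(spec), 1.0);                              \n"
        "}                                                                      \n" );
}
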

With those two fixes we're running quite nicely on the HD 4000 graphics we have 
here.

Take care,
Baker

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=50633#50633







[osg-users] nVidia bug with multiple windows

2012-10-16 Thread Bradley Baker Searles
Hi,

Short Version

Run the attached files with the osgCamera example on your nVidia GeForce GPU 
(we're on Windows), using the parameters -s -3 [FILENAME]. You'll notice the 
file 1_UV.OSG renders its single triangle fine, while 2_UV.OSG (which only adds 
a second UV set) does not render correctly across the windows.

Has anyone else seen this, and does anyone have other solutions or workarounds 
besides enabling VBOs and/or the multi-threaded viewer?

Long Version

We've identified an issue with nVidia Geforce hardware and geometry with 
multiple UV sets when being displayed on multiple windows. This may be a driver 
bug, as we don't see it on Intel or AMD GPUs (or interestingly, an nVidia 
Quadro in a laptop here), but there are some interesting details.

I've narrowed the problem down to an OSG file containing a single triangle, and 
I can reproduce the issue in the osgCamera example. I will attach the minimal 
example files.

The issue only seems to appear in the single threaded viewer model. When 
running with the parameters -s -3 in osgCamera, we get the following behavior:

1_UV.OSG - Single triangle with one texture, one UV set. Works fine.

2_UV.OSG - Same as above, but with one additional UV set. Initially another 
texture was added as well, but that is unnecessary to reproduce the problem. 
The triangle renders in only one of the spawned windows.

2_UV_VBO.OSG - Same as 2_UV.OSG, but with VBO turned on. Renders fine across 
all windows.

So the two workarounds we've found so far are to enable VBOs on all geometry 
or to enable the multi-threaded viewer.
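For completeness, the VBO workaround amounts to running something like the following visitor over the loaded scene (my own sketch; the class name is hypothetical):

Code:

#include <osg/Geode>
#include <osg/Geometry>
#include <osg/NodeVisitor>

// Sketch of the "enable VBOs on all geometry" workaround described above.
class EnableVBOVisitor : public osg::NodeVisitor
{
public:
    EnableVBOVisitor() : osg::NodeVisitor(TRAVERSE_ALL_CHILDREN) {}

    virtual void apply(osg::Geode& geode)
    {
        for (unsigned int i = 0; i < geode.getNumDrawables(); ++i)
        {
            osg::Geometry* geom = geode.getDrawable(i)->asGeometry();
            if (geom)
            {
                geom->setUseDisplayList(false);        // display lists and VBOs are mutually exclusive
                geom->setUseVertexBufferObjects(true);
            }
        }
        traverse(geode);
    }
};

// usage:
//   EnableVBOVisitor enableVBOs;
//   loadedModel->accept(enableVBOs);
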

Using VBO across the board does give us a performance hit in our application, 
from 5% to 35% depending on the scene being rendered and the GPU/Driver combo 
(including Intel and AMD).

When we first enabled it on our OSG CompositeViewer, the multi-threaded mode 
was set, but startThreading was never called (we're creating our own windows, 
so realize() is not called). Interestingly, this seems to have made the 
geometry corruption issue go away even though the additional threads were never 
spawned... Enabling the multi-threaded viewer properly has caused some other 
misbehavior in our app, but we may be able to work around those problems.

Platform:
Windows 7 x64
16 GB RAM
nVidia 680, driver 306.97 (we've tried back to 275.33 with a 460 board)
OSG 3.0.0 via VS 2010 SP1
Two monitors (so my osgCamera example spawns across them both, although we've 
reproduced the issues on a single monitor machine).

Any help, suggestions, or commiseration would be appreciated :)

Thanks!
Baker Searles

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=50634#50634




Attachments: 
http://forum.openscenegraph.org//files/20121016_nvidia_multi_context_render_corruption_191.zip




Re: [osg-users] osg::Image pixel format conversion

2011-11-10 Thread Bradley Baker Searles
Hi Robert-

Yeah, no worries. I just ended up mucking with the bits directly; I only 
wanted to use built-in functions if they were available.

For anyone searching with a similar issue (this was on Windows, by the way), I 
ended up just setting all of the alpha elements to 0xFF within my GL_BGRA 
image. I was going to convert it to GL_BGR, but the CF_DIB clipboard format 
wants a 4-byte-aligned width, so instead of padding or cropping the original 
captured image, I just kept it in a 4-byte-per-element format.

If you keep it in BGR and the width isn't properly aligned, the image will be 
sheared and look very wrong.
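In code, the alpha fix was essentially this (a simplified sketch, not our exact capture code; the function name is mine, and it assumes the tightly packed 4-byte-per-pixel layout described above):

Code:

#include <osg/Image>

#ifndef GL_BGRA
#define GL_BGRA 0x80E1
#endif

// Force every alpha byte to 0xFF in a GL_BGRA / GL_UNSIGNED_BYTE image.
void forceOpaqueAlphaBGRA(osg::Image* image)
{
    if (!image || image->getPixelFormat() != GL_BGRA || image->getDataType() != GL_UNSIGNED_BYTE)
        return;

    unsigned char* data = image->data();
    const unsigned int pixelCount = static_cast<unsigned int>(image->s() * image->t());
    for (unsigned int i = 0; i < pixelCount; ++i)
    {
        data[i*4 + 3] = 0xFF; // bytes are ordered B, G, R, A; alpha is the fourth
    }
}
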

Thanks--
Baker



robertosfield wrote:
 Hi Baker,
 
 On 7 November 2011 19:32, Bradley Baker Searles wrote:
  How would you use gluScaleImage to convert formats? It only takes
  one format parameter, and internally it uses that one format to
  compute the size of the before and after image? It seems suited only
  to scale the image, not change formats... Am I missing something?
  
 
 
 You are right, there is only one format parameter; I was thinking
 about the GLenum typeIn, typeOut parameters, which are for the data
 type, so it won't be suitable for your purpose.
 
 The include/osg/ImageUtils header contains a number of helper functions for
 processing image data, so you could probably use these.  However, if you
 know that you have GL_RGBA and GL_UNSIGNED_BYTE then it would be
 easier to write a quick converter function that flips the 0 and 2
 bytes to give you GL_BGRA.
 
 Robert.


--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=43818#43818







[osg-users] ViewDependentShadowMap - Everything shadowed in subsequent panes.

2011-11-10 Thread Bradley Baker Searles
Hi Robert-

I built and tested the shiny new ViewDependentShadowMap code from trunk 
(r12877) yesterday, in the hope that it would alleviate some issues we are 
seeing in a large scene with the LispSM technique (which works great most of 
the time, but from some camera angles the light-source camera's mapping into 
the shadow texture goes haywire).

The new method works much better in our use case, as far as having a stable 
usage of the shadow texture is concerned.

One issue I'm seeing: we can hop back and forth between a single window and a 
multi-screen setup. The first time I hop into multi-screen it's fine, but on 
subsequent attempts the second window shows everything within the shadow volume 
as being in shadow.

This issue does not happen with LispSM.

We're using wxWidgets for our window management, along with wxGLCanvas. A 
slave camera renders to the additional window (just one extra in this case). We 
use custom shaders, and I have VDSM using texture channel 7 for the shadow map, 
as sketched below.
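For context, the texture-channel setup amounts to something like this (a sketch of the relevant calls only, assuming the ShadowSettings API in trunk; not our full viewer code):

Code:

#include <osgShadow/ShadowedScene>
#include <osgShadow/ViewDependentShadowMap>

osg::ref_ptr<osgShadow::ShadowedScene> createShadowedScene()
{
    osg::ref_ptr<osgShadow::ShadowedScene> shadowedScene = new osgShadow::ShadowedScene;
    shadowedScene->setShadowTechnique( new osgShadow::ViewDependentShadowMap );

    // keep the shadow map on texture unit 7 so it doesn't collide with our own shaders
    shadowedScene->getShadowSettings()->setBaseShadowTextureUnit( 7 );

    // scene content is then added with shadowedScene->addChild(...)
    return shadowedScene;
}
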

I'd imagine there is an example that can bring up multiple windows (to see 
whether it's a general issue or ours), but one doesn't come to mind 
immediately. I did two windows with osgViewer, but it doesn't appear you can 
take one down and then bring it back up again, as we're doing in our app.

I apologize that I don't have concise repro steps here. I may be able to spend 
more time on this later, and I'll update this thread if I have more information.

Thanks-
Baker

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=43819#43819







Re: [osg-users] OSGExp specular

2011-06-28 Thread Bradley Baker Searles
Hi Guys,

We're running into this issue as well, and I agree that to synchronize what 
the artist sees in MAX with what is rendered in OSG, the scale by 9.99 should 
not happen. Currently a MAX specular level of 999 is required to get an OSG 
specular material value of 1.0, which means the object is glowing hot in MAX 
and normal looking in our OSG app.

This change would also allow the artist the range of 0.0 to 9.99 in the OSG 
material, so they could oversaturate it if desired.

I'm also of the opinion that the exponent shouldn't be scaled from the MAX 
range (0-100) to the OSG range (0-128); it's just the exponent for the specular 
contribution. I'd personally give up the exponent values of 101+ (which we 
aren't using anyway) in exchange for consistency between the authoring 
environment and our OSG app.

So I'd propose that this code in MtlKeeper::createMaterial(...) remove the 
divide by 9.99 from the setSpecular() call, and the setShininess() call 
multiply by 100.0 instead of 128.0.

Thanks-
Baker Searles

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=41001#41001







Re: [osg-users] OBJ reader and Ns / shininess

2011-04-08 Thread Bradley Baker Searles
Hi David-

Thank you for your thoughtful reply.

The more I look at this, the more sure I am that the OBJ importer should not 
be scaling the Ns value into the OSG shininess. I've attached some screenshots 
showing the (tiny) change I made, which behaves as you'd expect (unlike the 
scaling). When we run in fixed-function mode the behavior is the same.

3DS Max allows you to set the lighting model, it defaults to Blinn, which isn't 
much different from Phong (what we use in our shaders, as well as (I think) 
what the fixed function mode uses).

So anyway, this line in src/osgPlugins/obj/ReaderWriterOBJ.cpp:


Code:

osg_material->setShininess(osg::Material::FRONT_AND_BACK, (material.Ns/1000.0f)*128.0f ); // note OBJ shininess is 0..1000.




changed to this:


Code:

float shininess = (material.Ns < 128.0f) ? material.Ns : 128.0f;
osg_material->setShininess(osg::Material::FRONT_AND_BACK, shininess); // note OBJ shininess is 0..1000, clamp.




And all is well. If no one objects I'll suggest this change get submitted into 
trunk.

I'm attaching a few files to illustrate the problem of converting an Ns value 
of 20 to a shininess of ~2.5.

Thanks-
Baker

P.S. Yeah, every place I've worked professionally has already had a Brad, so I 
just started going by my middle name, it's so much easier that way :) Sorry for 
the confusion.

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=38339#38339




Attachments: 
http://forum.openscenegraph.org//files/max_obj_shininess_173.jpg
http://forum.openscenegraph.org//files/max_obj_shininess_default_importer_214.jpg




[osg-users] OBJ reader and Ns / shininess

2011-04-07 Thread Bradley Baker Searles
Hi,

We have a fairly large array of legacy shapes that we're loading with our OSG 
application in the OBJ/MTL format, and I think the OSG reader may not be 
handling the shininess value correctly.

It has been noticed that the built-in OBJ exporter (v0.97b by guruware) in 3ds 
Max 2011 outputs its shininess value in its native range (0-100), and this 
value is then scaled (by a large amount) within the OSG OBJ reader. The reader 
is expecting the OBJ (well, actually the accompanying MTL file) to have an Ns 
(shininess) range of [0 ... 1000], so it scales the value to the [0 ... 128] 
range.
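To make the scale of the distortion concrete, here's the arithmetic for a typical value (my numbers, following the scaling described above):

Code:

// Worked example: a 3ds Max shininess of 20 is exported as "Ns 20".
void shininessExample()
{
    float Ns = 20.0f;
    float current  = (Ns / 1000.0f) * 128.0f;     // = 2.56: a much broader, duller highlight
    float proposed = (Ns < 128.0f) ? Ns : 128.0f; // = 20.0: the exponent the artist actually set
    (void)current; (void)proposed;
}
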

I think the OSG OBJ reader should just clamp the Ns value to [0 ... 128], 
since it's simply an exponent in the specular lighting equation.

I also think the OBJ exporter in Max should probably support the full [0 ... 
1000] range of the OBJ format, and we've filed a support ticket with Autodesk, 
who have confirmed this as a bug (along with another weird behavior or two with 
the OBJ exporter).

Sources:

The OBJ/MTL formats are old Wavefront technologies, and there does not appear 
to be a current definitive format specification. The best sources I've found 
are:

Material Library File (.mtl) specification from wotsit.org (an actual 
Alias|Wavefront document from 1995). Ns is simply called a specular exponent 
here. I haven't found further detail in other documents I've found:
http://www.wotsit.org/list.asp?fc=2

The web page listed as a source in src/osgPlugins/obj/ReaderWriterOBJ.cpp 
(where the conversion takes place) doesn't exist anymore, but it can be dug up 
via the Wayback Machine. This page says Ns is *clamped* to [0 ... 128]:
http://replay.waybackmachine.org/20071214191634/http://java3d.j3d.org/utilities/loaders/obj/sun.html
(Wayback Machine version from Dec 17th, 2007, nearest to the Dec 10th commit 
date of revision 7651, where the code was introduced)

If anyone is familiar with this issue, or could illuminate my thinking, it 
would be greatly appreciated.

Thanks,
Baker Searles

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=38298#38298







Re: [osg-users] StandardShadowMap on ATI

2010-08-04 Thread Bradley Baker Searles
Hi Wojtek,

Ahh, good catch with the improper derivation!  :)  Thank you. I'm attaching the 
corrected st.h file.

I just tried to get the osgShadow example working (-4 --lispsm), and it didn't 
work with any of the workarounds I'd used in my own code (not using variables 
to index light sources, not using derived light values such as 
gl_FrontLightModelProduct, skipping the empty program add, etc.).

Now, the standard LispSM does its lighting in the vertex shader, and all of my 
Programs do their lighting in the fragment shaders... so there may be 
additional tricks for vertex shaders that I'm not yet aware of?

I haven't tried any of the other techniques yet, but perhaps I'll see if those 
can be patched up.  But... even if we could patch them, the fixes are ugly 
enough that I don't think they'd belong in the osg source.  Perhaps I should 
post some of these issues on the ATI forums.

Thank you!
Baker

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=30533#30533




Attachments: 
http://forum.openscenegraph.org//files/st_191.h




Re: [osg-users] StandardShadowMap on ATI

2010-08-03 Thread Bradley Baker Searles
Alright, I just wanted to post the code for the overrides I did, in case 
anyone stumbles upon this forum entry and wants to see precisely what I did. 
It's a bit different from the example that Wojtek posted in this thread (as 
referenced above):

http://forum.openscenegraph.org/viewtopic.php?t=4178   (Wojtek's 
MyViewDependentShadowMap example)

 :D 

Seems to be working very well.

Thanks again-
Baker Searles

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=30504#30504




Attachments: 
http://forum.openscenegraph.org//files/osgforum_ati_107_drivers_266.jpg
http://forum.openscenegraph.org//files/st_174.cpp
http://forum.openscenegraph.org//files/st_482.h




Re: [osg-users] StandardShadowMap on ATI

2010-07-29 Thread Bradley Baker Searles
Hi Wojtek-

  Thanks again for the thoughtful response.

  I believe I have enough information to work around this issue now.  I just 
wanted to post my last set of findings for anyone else who might be helped by 
the thread.

  I installed Catalyst 10.4 and 10.2, and I never quite got to a point where 
everything worked as expected. I will post some screenshots below. I am 
thinking I will probably just go with the latest drivers and work around the 
NULL-program driver bug, either by hacking our local OSG copy or by doing the 
proper ViewData override as you suggested (I haven't looked at those details 
yet).

  Ahh, yes, the texture matrix issue in the GLSL. I had indeed already done 
that (just omitted the matrix altogether in the shader, as you referenced). I 
didn't think to attach an osg::TexMat identity matrix at the root, so I was 
confused by your reference to the osg TexMat class  :)  But the TexMat solution 
is probably the more appropriate workaround.
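For reference, that TexMat workaround would look something like this (a sketch only; the texture unit is a placeholder for whichever unit the shadow map uses):

Code:

#include <osg/Node>
#include <osg/StateSet>
#include <osg/TexMat>

// Sketch: attach an identity osg::TexMat at the root for the shadow texture unit,
// so gl_TextureMatrix[unit] has a well-defined identity value in the shaders.
void attachIdentityTexMat(osg::Node* root, unsigned int shadowTextureUnit /* placeholder */)
{
    root->getOrCreateStateSet()->setTextureAttributeAndModes(
        shadowTextureUnit,
        new osg::TexMat(osg::Matrix::identity()),
        osg::StateAttribute::ON );
}
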

Take care-
Baker Searles

P.S. Yes I too had a dual-gpu ATI setup at home last year (dual 4850 in 
crossfire) and they were crazy fast, but I had render issues in Fallout 3 (HDR 
effects would blink on and off) so I had to drop down to using just one of the 
boards.  I guess you can't do that though with the 5970 having both chips on 
one board  :)

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=30405#30405




Attachments: 
http://forum.openscenegraph.org//files/ati_102_osgshadow_4_sm_211.jpg
http://forum.openscenegraph.org//files/ati_102_osgshadow_4_lispsm_299.jpg
http://forum.openscenegraph.org//files/ati_102_osgshadow_4_stsm_112.jpg
http://forum.openscenegraph.org//files/ati_102_osgviewer_coupe_106.jpg
http://forum.openscenegraph.org//files/ati_102_coupe_143.jpg




Re: [osg-users] StandardShadowMap on ATI

2010-07-28 Thread Bradley Baker Searles
Hi Wojtek,

  Thank you so much for the response.

  Just for reference, the modifications I've made in my glsl shaders to get it 
running on ATI were mainly the following:

  
  - only index texture coordinates with constants; variables (even const ones) 
don't seem to work.
  - ensure all variables are initialized (sloppy on my part).
  - hardcode to light 0; the accumulation loop did not work.
  - don't use gl_FrontMaterial / gl_BackMaterial, as the values didn't behave 
as they did with nVidia and produced unexpected results.
  - don't use gl_FrontLightModelProduct, for the same reason as the previous 
item.


  I'm not using alpha in a nonstandard way (at least not intentionally), and I 
don't use discard for anything.  That's good information though, and I will 
fiddle around with my shaders to see if they have any impact (although with the 
NULL program, they shouldn't be used, eh?).

  So having said that, the use of a NULL program does indeed seem like it 
should be valid, as specified in the OpenGL 2.1 spec (December 1, 2006; p. 75), 
and OSG does call glUseProgram(0), which should revert to fixed function. So 
this is a driver bug, right?

  Still, I agree it might be useful to let the user provide a Program to use, 
or whether to attach the NULL program.

  I ran osgshadow on the ATI 4670 with the 10.7 drivers, and I'll post the 
results below.  I'm also attaching a screenshot of the vehicle in osgviewerWX, 
which seems to be having some texture coordinate issues (again, fine on 
nVidia).  Hmm, even the osg cow seems wrong.

  I don't know precisely what the Texture Matrix fix is that you're speaking 
about, but I can poke around the forums and see if I find the issue.

  And by the way, thanks for the lispsm submission, it's really well done!

Thanks
Baker Searles

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=30355#30355




Attachments: 
http://forum.openscenegraph.org//files/osgcow_504.jpg
http://forum.openscenegraph.org//files/osgviewerwx_coupe_325.jpg
http://forum.openscenegraph.org//files/osgshadow_4_stsm_369.jpg
http://forum.openscenegraph.org//files/osgshadow_4_lispsm_133.jpg




[osg-users] StandardShadowMap on ATI

2010-07-27 Thread Bradley Baker Searles
Hi,

  I have been working for a while on compatibility issues with nVidia and ATI 
in our OpenSceneGraph application.  Aside from a fair number of GLSL shader 
source issues, one of the last remaining problems was that textured objects 
were not casting shadows (please see first attached screenshot).

  After a little detective work, it seems that the empty osg::Program attached 
to the StateSet belonging to the _camera in 
StandardShadowMap::ViewData::init() is causing the problem:


 
 // optimization attributes
 osg::Program* program = new osg::Program;
 stateset->setAttribute( program, osg::StateAttribute::OVERRIDE |
     osg::StateAttribute::ON );
 stateset->setMode( GL_LIGHTING, osg::StateAttribute::OVERRIDE |
     osg::StateAttribute::OFF );
 stateset->setMode( GL_BLEND, osg::StateAttribute::OVERRIDE |
     osg::StateAttribute::OFF );
 
 #if 0 // fixed pipeline seems faster (at least on my 7800)
 program->addShader( new osg::Shader( osg::Shader::FRAGMENT,
     "uniform sampler2D texture;                                  \n"
     "void main(void)                                             \n"
     "{                                                           \n"
     "    gl_FragColor = texture2D( texture, gl_TexCoord[0].xy ); \n"
     "}                                                           \n"
 ) ); // program->addShader Fragment
 
 program->addShader( new osg::Shader( osg::Shader::VERTEX,
     "void main(void)                                             \n"
     "{                                                           \n"
     "    gl_Position = ftransform();                             \n"
     "    gl_TexCoord[0] = gl_MultiTexCoord0;                     \n"
     "}                                                           \n"
 ) ); // program->addShader Vertex
 #endif
 


  The fragment and vertex shader additions are #if 0'd out, but the empty 
osg::Program is still attached to the stateset.

  Anyway, if I skip adding the empty program, things seem fine. Oddly, if I 
keep the empty program but skip the AlphaFunc above it (which allows texture 
alpha to affect the depth-buffer render), it casts shadows properly (well, 
without the alpha penetration, of course).

  Everything works fine with nVidia drivers, even with the empty program.

  My scene has a mixture of shader programs depending on how the objects are 
setup, but they all include the shadow shader source (except the flat shaded 
objects like the force arrows shown below).

  So it seems to be an easy fix, but I was curious if anyone else had run into 
this problem?  I did many searches on the forum and came up empty.

  And if anyone has deeper insight into how the empty program could affect the 
statesets (it seems like a driver bug to me, since nVidia works fine), or if 
there is a more appropriate fix I'm not thinking of, I'd appreciate the 
feedback.

Thanks-
Baker Searles


Hardware Versions:
nVidia GTS 250
ATI Radeon HD4670

Software Versions:
OSG @ 2.8.3
ATI Display Drivers @ 10.7
nVidia Display Drivers @ 258.96

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=30329#30329




Attachments: 
http://forum.openscenegraph.org//files/ati_emptyglslprogram_noalphafunc_163.jpg
http://forum.openscenegraph.org//files/ati_noemptyglslprogram_192.jpg
http://forum.openscenegraph.org//files/ati_emptyglslprogram_debughud_103.jpg
http://forum.openscenegraph.org//files/ati_emptyglslprogram_167.jpg

