Hi everyone!
I've just upgraded OSG from version 2.4 to latest SVN. As a result, I've lost
multitexturing on all objects imported via the LWO plugin.
According to the SVN log, a change in osgPlugins/lwo/Surface.cpp was committed
on behalf of Bob Kuehne on 18th June 2008. Reverting that change re
Hi Wojtek,
It's really weird. I'm pretty sure we have tested this with Vista x64,
but maybe not with the latest drivers. May I send you the simplified
OpenGL-only repro I sent to NVIDIA as a bug report? I'm curious whether
the crash will happen with it as well.
Sure, I'm glad to help.
Marco
Hi,
on a fresh build of latest OSG with default configuration options, all
examples and applications based on osgViewer crash if any of the
multi-threading modes are enabled (including the default one). This happens
on Vista x64 with an nVIDIA GeForce 8800 GT card and latest drivers
(178.24).
Hi Wojtek,
your test case behaves exactly as osgViewer does. The app crashes when
wglMakeCurrent() gets called for the second time. Setting repeatMakeCurrent
to 0 fixes the crash.
One thing I forgot to say is that it doesn't always crash: the
success/failure ratio is about 1 to 15. This appli
I'm trying to add multisampling capabilities to my floating-point FBOs, but
I've found what I think is a bug in osgUtil/RenderStage.cpp. Basically, when
I specify a number of samples greater than 0, I actually get a
non-floating-point framebuffer.
See line 380 in RenderStage.cpp:
GLenum intern
Hi Robert,
[...], or perhaps if it's 0 then the
RenderStage.cpp code could look for an alternative as a fallback - first
it could check for a texture, then finally drop back to RGBA.
Yes, that's basically what my fix does.
Send me the changed file and I can pontificate on this. Could you
also modi
Hi everybody, long time no see!
I'm in the process of upgrading my development environment to new tools and
libraries. All my OSG-based applications still use OSG 1.2, so I'm now
switching them to the latest version of OSG in SVN. Some of them use my own
viewer library (which in turn uses Scene
Hi,
> setenv OSG_WINDOW '0 0 1280 1024' (or equivalent)
or:
osgviewer --window 0 0 1280 1024 cow.osg
Cheers,
Marco
___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegrap
Hi Robert,
> Another possibility I missed is the use of doubles in osg::Plane by
> default, whereas 1.2 used float by default in osg::Plane.
thanks for the hint. However, I think the problem could be related to the
compile/apply of GL objects instead. If I disable the GLObjectsVisitor that
my a
Hi Robert,
> To avoid
> threading problems apparent in 1.2, the OSG-2.x versions initialize
> the GL objects arrays to a larger default; if your app isn't setting
> the number of contexts down to the number you are using, then perhaps
> this is where the discrepancy is occurring.
My viewer is alr
Hi J-S,
> Err, are you sure about that last one? SVN OSG cannot use the same 3rd
> party libs as 1.2 (at least, if you used the 3rdParty libs zip that was
> downloadable from the OSG site). In particular, there were some changes
> in osgText that made it incompatible with the version of freetype t
Hi Wojtek,
> Have you tested with the --SingleThreaded option? In my configuration
> multithreaded configs usually eat more RAM. However, it's not twice that of
> SingleThreaded.
I'm using my own single-threaded viewer, so I don't even have to choose. :-)
> I have read somewhere that NVidia OpenGL dirv
Hi Robert,
> Could you try running the latest 2.3.x dev release or SVN version's
> osgviewer on the dataset.
this is going to be difficult, as the scene is assembled by the application
itself at run time (scene management is quite elaborate in this
application). I could try to export a static s
> For now I'm just putting some logging code in OSG 1.2 and 2.3 to show the
> number of compiled GL objects, which I suppose should be the same in both
> versions. I'll let you know if I find any discrepancies.
Here are the results: upon firing the GLObjectsVisitor, the app built with
OSG 1.2 cal
Hi Robert,
> It's a bit of long shot, but you could have a look at what the
> osgUtil::Optimizer is doing with the data, perhaps OSG-2.x is being
> more conservative with collapsing "duplicate" state.
Thanks, precious hint. I've just finished diff'ing the log messages emitted
by osgUtil::Optimiz
> if (itr->second.first->getDataVariance()==UNSPECIFIED &&
>     (itr->second.first->getUpdateCallback() ||
>      itr->second.first->getEventCallback()))
> {
>     itr->second.first->setDataVariance(DYNAMIC);
> }
> else // <- ADDED this block
> {
>
Hi Robert,
> So yes, UNSPECIFIED helps solve the problem of deciding what data variance a
> StateAttribute should have; if the user has explicitly set STATIC or DYNAMIC
> then this is a formal decision made by the user and it's inappropriate to
> override this with automatic codes that try to wo
Hi Robert,
> I'm looking for feedback from users who have worked on clusters that
> implement some forms of swap ready synchronization. In particular I'm
> looking at any hooks into osgViewer to allow users to implement their own
> swap ready implementation, also a software based swap ready co