Re: [osg-users] latest NVIDIA drivers

2010-09-14 Thread Fred Smith

robertosfield wrote:
 
 The desktop market is already rather stagnant in comparison to the growth of 
 mobile devices;
 mindshare and market share are moving away from the desktop to mobile,
 and with this sea change Microsoft, and hence DirectX, will be losing
 ground.


I am appalled to see that GLSL development tools are almost non-existent.
Microsoft historically succeeded by attracting developers, and I think this 
might also be true for DirectX.

Look at shader debugging tools. I'm not talking about small tools used to play 
for a few minutes with your shader, but real debuggers with breakpoints.

Apart from glslDevil and its very limited capabilities there is just no 
development tool available.

HLSL has plenty of support from Microsoft (PIX), nVidia (FX Composer and now 
Parallel Nsight) and ATI (GPU PerfStudio).

I was told ATI does not plan to support GLSL debugging in the near term, and 
nVidia doesn't look much more interested. Also, their latest Optimus technology 
isn't compatible with Linux yet, which doesn't help...

That's a pity.

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=31587#31587





___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org


Re: [osg-users] latest NVIDIA drivers

2010-09-14 Thread Trajce (Nick) Nikolov
our GLSL code is bug free ... no need of debugging .. *smile*
-Nick


On Tue, Sep 14, 2010 at 5:06 PM, Fred Smith osgfo...@tevs.eu wrote:


 robertosfield wrote:
 
  The desktop market is already rather stagnant in comparison to the growth
 of mobile devices;
  mindshare and market share are moving away from the desktop to mobile,
  and with this sea change Microsoft, and hence DirectX, will be losing
  ground.


 I am appalled to see that GLSL development tools are almost non-existent.
 Microsoft historically succeeded by attracting developers, and I think this
 might also be true for DirectX.

 Look at shader debugging tools. I'm not talking about small tools used to
 play for a few minutes with your shader, but real debuggers with
 breakpoints.

 Apart from glslDevil and its very limited capabilities there is just no
 development tool available.

 HLSL has plenty of support from Microsoft (PIX), nVidia (FX Composer and
 now Parallel Nsight) and ATI (GPU PerfStudio).

 I was told ATI does not plan to support GLSL debugging in the near term, and
 nVidia doesn't look much more interested. Also, their latest Optimus
 technology isn't compatible with Linux yet, which doesn't help...

 That's a pity.




Re: [osg-users] latest NVIDIA drivers

2010-09-06 Thread Robert Osfield
Hi Nick,

On Fri, Sep 3, 2010 at 6:47 PM, Trajce (Nick) Nikolov
nikolov.tra...@gmail.com wrote:
 nice reading ... :) .. I agree about the DirectX part .. Let's start talking
 to Robert to make OSG DirectX compatible :)

We do have the .x plugin ;-)

As a general note, my views on the practicability of supporting Direct3D
haven't changed: it's not really possible without massively affecting
the cleanness and maintainability of the OSG code base.

As a general note for the future, Microsoft look to have missed the
boat on mobile devices; all the most widespread mobile devices
supporting 3D are OpenGL ES all the way.  The desktop market is
already rather stagnant in comparison to the growth of mobile devices;
mindshare and market share are moving away from the desktop to mobile,
and with this sea change Microsoft, and hence DirectX, will be losing
ground.  It may take a few years, but it looks likely to me that
Direct3D won't be seen as a compelling API even on the desktop, and
not just by me, but by the majority of the industry.  Those stuck on
Direct3D will struggle to adapt and take advantage of the new
mobile/desktop cross-over, and will eventually wither on the vine and
become irrelevant.

Of course NVidia, AMD and Intel all have the ability to slow down
progress by releasing crappy OpenGL and OpenGL ES drivers...

Robert.


Re: [osg-users] latest NVIDIA drivers

2010-09-06 Thread Wojciech Lewandowski

Hi J-S & All,

Going back to the subject, I just tested Catalyst 10.8 today and was 
pleasantly surprised that some bugs were not present anymore. So, driven by 
this small success, I went further and also installed the newest non-WHQL NVidia 
259.31 drivers (Windows 7 64 bit). They are available through the NVidia OpenGL 
Developer Relations portal. And these drivers also seem to work correctly, as 
the pre-256 driver series did. At least my pixel lighting shaders compile again. 
So it looks like the NVidia guys fixed the bug while we were complaining ;-). 
You may check if these drivers work for you.


Cheers,
Wojtek





Re: [osg-users] latest NVIDIA drivers

2010-09-06 Thread Jean-Sébastien Guay

Hi Wojtek,


Going back to the subject, I just tested Catalyst 10.8 today and was
pleasantly surprised that some bugs were not present anymore. So, driven
by this small success, I went further and also installed the newest non-WHQL
NVidia 259.31 drivers (Windows 7 64 bit). They are available through the
NVidia OpenGL Developer Relations portal. And these drivers also seem to
work correctly, as the pre-256 driver series did. At least my pixel lighting
shaders compile again. So it looks like the NVidia guys fixed the bug while
we were complaining ;-). You may check if these drivers work for you.


Great, thanks for the update! That's good news on both fronts.

A co-worker of mine who had asked about problems with ATI drivers and 
shadows lately also reported that 10.8 fixed it for him.


But I think I'll keep the workaround I have for the nvidia issue, as 
unrolling the loop and using the preprocessor instead of a for loop can 
potentially be faster on older cards... even if it's ugly and harder to 
read...


J-S
--
______________________________________________________
Jean-Sebastien Guay    jean-sebastien.g...@cm-labs.com
                       http://www.cm-labs.com/
                       http://whitestar02.webhop.org/


Re: [osg-users] latest NVIDIA drivers

2010-09-03 Thread Wojciech Lewandowski

Thanks J-S,

Interesting. I will keep it in mind. I remember that the case I reported to 
NVidia was also doing fine as long as the number of lights was less than 4. I 
guess the compiler was implicitly unrolling the loop, but when the number got 
above 3, it could not unroll and the problem started to show.


In the meantime I changed the code to use my own (non gl_) uniforms, and this 
also seems to work. However, instead of an array of LightStructs I use a set 
of arrays, each containing a single light attribute (for example diffuse) for 
all lights. I did it because it allows for better use of uniform memory.
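
For illustration, here is a minimal sketch of what such a "one array per light
attribute" layout could look like; the uniform names and the three-light count
are assumptions made up for this example, not Wojtek's actual code:

```glsl
// Sketch: per-attribute uniform arrays instead of an array of light structs.
// All names below are illustrative assumptions.
#define NUM_LIGHTS 3

uniform vec4 lightPosition[NUM_LIGHTS];    // xyz + w (w == 0.0 means directional)
uniform vec4 lightAmbient[NUM_LIGHTS];
uniform vec3 lightAttenuation[NUM_LIGHTS]; // x = constant, y = linear, z = quadratic

varying vec3 position;
varying vec3 normal;

void main()
{
    vec4 ambient = vec4(0.0);
    vec3 n = normalize(normal); // would feed the diffuse/specular terms

    for (int i = 0; i < NUM_LIGHTS; ++i)
    {
        vec3 lightVector = lightPosition[i].xyz - position;
        float d = length(lightVector);
        float attenuation = 1.0 / (lightAttenuation[i].x +
                                   lightAttenuation[i].y * d +
                                   lightAttenuation[i].z * d * d);
        ambient += attenuation * lightAmbient[i];
        // ... diffuse/specular terms using n, as in the original shader ...
    }

    gl_FragColor = ambient;
}
```

Because each attribute lives in its own tightly packed array, attributes a
shader never reads cost no uniform storage, which is the memory advantage
described above.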


Cheers,
Wojtek Lewandowski

--
From: Jean-Sébastien Guay jean-sebastien.g...@cm-labs.com
Sent: Thursday, September 02, 2010 4:27 PM
To: OpenSceneGraph Users osg-users@lists.openscenegraph.org
Subject: Re: [osg-users] latest NVIDIA drivers


Hello Wojtek,


I have noticed issues with Shader compilation on 256 (and above) series
on Windows 7. Some fragment shaders using gl_LightSource fields were
generating internal compiler errors. In fact I also posted a bug report
to NVidia. These errors were normally reported by OSG with compilation
log showing assembly cg output that was causing trouble for compiler.
You don't see any compilation errors even with
OSG_NOTIFY_LEVEL=DEBUG_INFO ?
I have attached the bug report I posted to NVidia; you can check if it
could be related.


We have put off updating our simulators' drivers for this reason. Our 
lighting shaders did not compile, with that same error (binding in 
multiple relative-addressed arrays). I looked it up last night to try and 
find a workaround, and found one.


If you simply unroll the loop that iterates over lights, it compiles fine. 
So what you can do is:


  // Don't forget the backslash at the end of each line except
  // the last in the define...
  #define APPLY_LIGHT(i)  \
// the code that you had in your loop before  \
// that uses gl_LightSource[i]\
// for example:   \
// ...\
ambient += attenuation * gl_LightSource[i].ambient;   \
// ...

  APPLY_LIGHT(0)
  APPLY_LIGHT(1)
  APPLY_LIGHT(2)
  APPLY_LIGHT(3)
  // Up to however many lights you want to support

Sure, it's just a workaround; I think the behavior you reported to nVidia 
is still a bug they should fix, but at least this allows us to keep on 
working even if some clients or users update their drivers. I always hate 
telling people "don't update your drivers to a version newer than x.y", 
because that shows some incompatibility in our own software, and also I may 
forget to tell them that it's OK to update once the driver bug is fixed, 
and then they may run into other issues in the future because they have 
old drivers.
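
Putting the pieces together, a complete fragment shader using this workaround
might look like the sketch below (stripped down to the ambient term, and
assuming three lights). After macro expansion every gl_LightSource index is a
literal constant, so the compiler never sees a relative-addressed array:

```glsl
// Sketch of the unrolled-loop workaround (ambient term only, three lights).
varying vec3 position;
varying vec3 normal;

// No comments inside the macro body: a // comment would swallow the
// backslash continuation.
#define APPLY_LIGHT(i)                                                 \
    {                                                                  \
        vec3 lightVector = gl_LightSource[i].position.xyz - position;  \
        float d = length(lightVector);                                 \
        float attenuation = 1.0 /                                      \
            (gl_LightSource[i].constantAttenuation +                   \
             gl_LightSource[i].linearAttenuation * d +                 \
             gl_LightSource[i].quadraticAttenuation * d * d);          \
        ambient += attenuation * gl_LightSource[i].ambient;            \
    }

void main()
{
    vec4 ambient = vec4(0.0);
    APPLY_LIGHT(0)
    APPLY_LIGHT(1)
    APPLY_LIGHT(2)
    gl_FragColor = ambient * gl_FrontMaterial.ambient;
}
```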


Also, I don't have any other loops currently in my code, so I can't say if 
this same compiler error might affect looping over other variables.


See this thread on OpenGL.org for some discussion of this issue. I got the 
inspiration for the #define above there.


http://www.opengl.org/discussion_boards/ubbthreads.php?ubb=showflat&Number=281429

Someone mentioned in that thread: "In the GLSL spec, if you use some of 
the built in array types (like gl_ClipDistance) you need to either declare 
the size of the array or make the source code so that the GLSL compiler 
can figure out the maximum index you access." It might be possible to 
remove the above workaround if we did what he suggests. But I haven't yet 
found out how to do that. What I've tried (uniform 
gl_LightSourceParameters gl_LightSource[gl_MaxLights]; or uniform 
gl_LightSourceParameters gl_LightSource[8]; at the top of the shader) 
didn't change anything.


I really hope nVidia fixes the bug...

Hope this helps,

J-S


Re: [osg-users] latest NVIDIA drivers

2010-09-03 Thread Jean-Sébastien Guay

Hi Wojtek,


In the meantime I changed the code to use my own (non gl_) uniforms, and
this also seems to work. However, instead of an array of LightStructs I
use a set of arrays, each containing a single light attribute (for
example diffuse) for all lights. I did it because it allows for better
use of uniform memory.


Yes, that strategy was also suggested on one of the forum threads I read 
about the problem.


http://www.gamedev.net/community/forums/topic.asp?topic_id=578784&whichpage=1#3688129

Actually, someone on that thread said that bugs in deprecated 
functionality were likely to appear often. To me this suggests that 
nVidia might never fix this bug: it relates to built-in uniforms, which 
are deprecated, and just using your own uniforms instead of 
gl_LightSource[] works fine, so why should they fix a deprecated feature?


I actually wonder how true that is, based on this text that can be found 
on nVidia's site (http://developer.nvidia.com/object/opengl_driver.html) :


-

4) Is NVIDIA going to remove functionality from OpenGL in the future?

NVIDIA has no interest in removing any feature from OpenGL that our ISVs 
rely on. NVIDIA believes in providing maximum functionality with minimal 
churn to developers. Hence, NVIDIA fully supports the ARB_compatibility 
extension and Compatibility profile, and is shipping OpenGL drivers 
without any functionality removed, including any functionality that is 
marked deprecated.


5) Will existing applications still work on current and future shipping 
hardware?


NVIDIA has no plans for dropping support for any version of OpenGL on 
our existing and future shipping hardware. As a result, all currently 
shipping applications will continue to work on NVIDIA's existing and 
future hardware.


-

But then again, that text might just be PR speak and wishful thinking. 
If some feature is deprecated (OpenGL 2.x, built-in uniforms, etc.), and 
fewer developers are using it over time, how many resources are they 
likely to devote to fixing bugs that appear in that feature?


Of course, from the version number jump, we might assume that nVidia did 
some big work on their drivers lately, maybe even a rewrite of some or 
all of them. If that's the case, then they might have had to rewrite the 
deprecated parts too, and since they most likely tested these parts less 
than the others, it could explain why we see some bugs in it at this 
point. This is all conjecture on my part of course, but this kind of 
thing happens pretty often in development projects...


What do you think? I don't know what to think at this point, but since 
we have an acceptable workaround I'm not too concerned. I just hope the 
situation doesn't go downhill from here (at least not before OSG has a 
good transition path to OpenGL 3+ that we can use).


In any case, let us know if you ever get news from the bug report you 
sent. In the past when I've reported bugs they've been rather quick to 
respond, but maybe that has changed too...


J-S


Re: [osg-users] latest NVIDIA drivers

2010-09-03 Thread Wojciech Lewandowski

Hi J-S,

Responses below:

[...]
Actually someone on that thread said that bugs in deprecated functionality 
were likely to appear often, which to me suggests that nVidia might never 
fix this bug because it relates to built-in uniforms which are deprecated, 
and just using your own uniforms instead of gl_LightSource[] works fine, 
so why should they fix a deprecated feature?


I think that breaking gl_LightSource usage in fragment shaders is actually a 
major problem. On this forum there are three of us who admitted it affected 
them; probably a few more did not mention it. How many OpenGL developers 
outside the OSG community do pixel lighting? I bet there are thousands, if not 
tens of thousands, who were or can be affected in the future.
It's not just a minor issue, so I guess NVidia will do something about this 
sooner or later. I hope they will, despite the fact that they did not respond 
to my bug report at all ;-(. I am telling myself they probably did not because 
they already knew about it.


I actually wonder how true that is, based on this text that can be found 
on nVidia's site (http://developer.nvidia.com/object/opengl_driver.html) :


-

4) Is NVIDIA going to remove functionality from OpenGL in the future?

NVIDIA has no interest in removing any feature from OpenGL that our ISVs 
rely on. NVIDIA believes in providing maximum functionality with minimal 
churn to developers. Hence, NVIDIA fully supports the ARB_compatibility 
extension and Compatibility profile, and is shipping OpenGL drivers 
without any functionality removed, including any functionality that is 
marked deprecated.


5) Will existing applications still work on current and future shipping 
hardware?


NVIDIA has no plans for dropping support for any version of OpenGL on our 
existing and future shipping hardware. As a result, all currently shipping 
applications will continue to work on NVIDIA's existing and future 
hardware.


-


Yeah, I thought about the same ;-). Are NVidia's statements about continued 
legacy OpenGL support still valid?


But then again, that text might just be PR speak and wishful thinking. If 
some feature is deprecated (OpenGL 2.x, built-in uniforms, etc.), and fewer 
developers are using it over time, how many resources are they likely to 
devote to fixing bugs that appear in that feature?


Of course, from the version number jump, we might assume that nVidia did 
some big work on their drivers lately, maybe even a rewrite of some or all 
of them. If that's the case, then they might have had to rewrite the 
deprecated parts too, and since they most likely tested these parts less 
than the others, it could explain why we see some bugs in it at this 
point. This is all conjecture on my part of course, but this kind of thing 
happens pretty often in development projects...




I think NVidia was adding support for OpenGL 4.0 & 4.1 for Fermi-based GPUs 
and they screwed something up in the shader compilers. If this were a minor 
issue they could ignore it, but I think it's a huge problem for many 
developers, and NVidia should be aware of its importance. So I really think 
they will fix it. If they don't, and continue with such an attitude, then one 
day ATI will start to have better-quality drivers, and it won't be because ATI 
drivers improved ;-) Btw, I would love to see ATI/AMD OpenGL drivers improve, 
so we have real competition in OpenGL.


What do you think? I don't know what to think at this point, but since we 
have an acceptable workaround I'm not too concerned. I just hope the 
situation doesn't go downhill from here (at least not before OSG has a 
good transition path to OpenGL 3+ that we can use).




Since, as I said before, I think they will fix it, I can now play devil's 
advocate a little ;-). I actually think that such a legacy support policy 
prevents faster progress in OpenGL. I think that DirectX now has an edge over 
OpenGL and dictates the pace of 3D graphics. This success was partially 
achieved by Microsoft's policy of making a revolution with every major DirectX 
release. They redefined the whole API and removed all the stuff that did not 
fit anymore. With such an attitude developers were forced to adapt, but they 
also gained a lot.
With compatibility profiles OpenGL cannot progress that quickly. And the 
number of combinations of new and older OpenGL calls certainly makes building 
fast, well-behaving drivers more difficult. So I would rather see some 
revolution in OpenGL, and adapt my code to pure OpenGL 4.0 profiles, than 
deal with unexpected driver errors.


In any case, let us know if you ever get news from the bug report you 
sent. In the past when I've reported bugs they've been rather quick to 
respond, but maybe that has changed too...


As I said, I have not heard from them since the bug report. But I hope it's a 
good sign, meaning they are working on the issue.


Wojtek



Re: [osg-users] latest NVIDIA drivers

2010-09-03 Thread Jean-Sébastien Guay

Hi Wojtek,


I think that breaking gl_LightSource usage in fragment shaders is
actually a major problem. On this forum there are three of us who
admitted it affected them; probably a few more did not mention it. How
many OpenGL developers outside the OSG community do pixel lighting? I
bet there are thousands, if not tens of thousands, who were or can be
affected in the future.


I agree, and in fact in my case the error happened in a vertex shader 
when doing a loop over light sources to do per-vertex lighting, so it's 
not just limited to per-pixel lighting shaders. I bet any code that 
loops over gl_LightSource[] in any shader will cause this error.
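
For illustration, a stripped-down vertex shader of the kind described, which
indexes gl_LightSource[] with a runtime loop variable, might look like this
(a hypothetical minimal example, not the actual production shader):

```glsl
// Sketch: per-vertex lighting loop of the sort that triggers the
// "binding in multiple relative-addressed arrays" compiler error,
// because the loop index forces relative addressing of gl_LightSource[].
varying vec4 color;

void main()
{
    gl_Position = ftransform();

    vec4 ambient = vec4(0.0);
    for (int i = 0; i < 3; ++i)               // runtime index into a built-in
        ambient += gl_LightSource[i].ambient; // uniform array
    color = ambient;
}
```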


I also agree with your other points. Hopefully soon OSG will allow us to 
choose which path we use, OpenGL 2.x or OpenGL 3.x+/4.x, and will help 
ease our transition.


J-S


Re: [osg-users] latest NVIDIA drivers

2010-09-03 Thread Trajce (Nick) Nikolov
nice reading ... :) .. I agree about the DirectX part .. Let's start talking
to Robert to make OSG DirectX compatible :)
-Nick


On Fri, Sep 3, 2010 at 7:29 PM, Jean-Sébastien Guay 
jean-sebastien.g...@cm-labs.com wrote:

 Hi Wojtek,


  I think that breaking gl_LightSource usage in fragment shaders is
 actually a major problem. On this forum there are three of us who
 admitted it affected them; probably a few more did not mention it. How
 many OpenGL developers outside the OSG community do pixel lighting? I bet
 there are thousands, if not tens of thousands, who were or can be affected
 in the future.


 I agree, and in fact in my case the error happened in a vertex shader when
 doing a loop over light sources to do per-vertex lighting, so it's not just
 limited to per-pixel lighting shaders. I bet any code that loops over
 gl_LightSource[] in any shader will cause this error.

 I also agree with your other points. Hopefully soon OSG will allow us to
 choose which path we use, OpenGL 2.x or OpenGL 3.x+/4.x, and will help ease
 our transition.


 J-S


Re: [osg-users] latest NVIDIA drivers

2010-09-03 Thread Jean-Sébastien Guay

Hi Nick,


nice reading ... :) .. I agree about the DirectX part .. Let's start
talking to Robert to make OSG DirectX compatible :)
-Nick


Haven't you been here for a while now? You should know by now that's not 
going to happen :-) Search the archives if you're interested, this has 
been discussed in the past.


But let's not let this thread go into API wars, please. We're using one 
API, and this thread just discussed bugs in parts of that API which are 
deprecated but should still be supported by the vendor in question.


J-S


Re: [osg-users] latest NVIDIA drivers

2010-09-03 Thread Trajce (Nick) Nikolov
hi J-S,

I was just kidding :) .. I think I know Robert's view on Microsoft. My
opinion is, even given that DirectX is in some ways further along than OpenGL,
what osg is on top of opengl is my favorite, and not only osg but all the rest
of the open-source projects around it that are built on top of it (vpb,
osgEarth, osgOcean ...) as well ...

-Nick


On Fri, Sep 3, 2010 at 10:23 PM, Jean-Sébastien Guay 
jean-sebastien.g...@cm-labs.com wrote:

 Hi Nick,


  nice reading ... :) .. I agree about the DirectX part .. Let's start
 talking to Robert to make OSG DirectX compatible :)
 -Nick


 Haven't you been here for a while now? You should know by now that's not
 going to happen :-) Search the archives if you're interested, this has been
 discussed in the past.

 But let's not let this thread go into API wars, please. We're using one
 API, and this thread just discussed bugs in parts of that API which are
 deprecated but should still be supported by the vendor in question.


 J-S


Re: [osg-users] latest NVIDIA drivers

2010-09-02 Thread Jean-Sébastien Guay

Hello Wojtek,


I have noticed issues with Shader compilation on 256 (and above) series
on Windows 7. Some fragment shaders using gl_LightSource fields were
generating internal compiler errors. In fact I also posted a bug report
to NVidia. These errors were normally reported by OSG with compilation
log showing assembly cg output that was causing trouble for compiler.
You don't see any compilation errors even with
OSG_NOTIFY_LEVEL=DEBUG_INFO ?
I have attached the bug report I posted to NVidia; you can check if it
could be related.


We have put off updating our simulators' drivers for this reason. Our 
lighting shaders did not compile, with that same error (binding in 
multiple relative-addressed arrays). I looked it up last night to try 
and find a workaround, and found one.


If you simply unroll the loop that iterates over lights, it compiles 
fine. So what you can do is:


  // Don't forget the backslash at the end of each line except
  // the last in the define...
  #define APPLY_LIGHT(i)  \
// the code that you had in your loop before  \
// that uses gl_LightSource[i]\
// for example:   \
// ...\
ambient += attenuation * gl_LightSource[i].ambient;   \
// ...

  APPLY_LIGHT(0)
  APPLY_LIGHT(1)
  APPLY_LIGHT(2)
  APPLY_LIGHT(3)
  // Up to however many lights you want to support

Sure, it's just a workaround; I think the behavior you reported to nVidia 
is still a bug they should fix, but at least this allows us to keep on 
working even if some clients or users update their drivers. I always 
hate telling people "don't update your drivers to a version newer than 
x.y", because that shows some incompatibility in our own software, and 
also I may forget to tell them that it's OK to update once the driver 
bug is fixed, and then they may run into other issues in the future 
because they have old drivers.


Also, I don't have any other loops currently in my code, so I can't say 
if this same compiler error might affect looping over other variables.


See this thread on OpenGL.org for some discussion of this issue. I got 
the inspiration for the #define above there.


http://www.opengl.org/discussion_boards/ubbthreads.php?ubb=showflat&Number=281429

Someone mentioned in that thread: "In the GLSL spec, if you use some of 
the built in array types (like gl_ClipDistance) you need to either 
declare the size of the array or make the source code so that the GLSL 
compiler can figure out the maximum index you access." It might be 
possible to remove the above workaround if we did what he suggests. But 
I haven't yet found out how to do that. What I've tried (uniform 
gl_LightSourceParameters gl_LightSource[gl_MaxLights]; or uniform 
gl_LightSourceParameters gl_LightSource[8]; at the top of the shader) 
didn't change anything.


I really hope nVidia fixes the bug...

Hope this helps,

J-S


[osg-users] latest NVIDIA drivers

2010-08-05 Thread Trajce (Nick) Nikolov
Hi community,

Has anyone experienced some weirdness with the latest drivers from NVIDIA?
My shaders just stopped working with them, without any warning/error from OSG
...

-Nick


Re: [osg-users] latest NVIDIA drivers

2010-08-05 Thread Wojciech Lewandowski
Hi Trajce,

I have noticed issues with shader compilation on the 256 (and above) series on 
Windows 7. Some fragment shaders using gl_LightSource fields were generating 
internal compiler errors. In fact I also posted a bug report to NVidia. These 
errors were normally reported by OSG, with the compilation log showing the 
assembly cg output that was causing trouble for the compiler. You don't see 
any compilation errors even with OSG_NOTIFY_LEVEL=DEBUG_INFO ?

I have attached the bug report I posted to NVidia; you can check if it could 
be related.

Cheers,
Wojtek




From: Trajce (Nick) Nikolov 
Sent: Thursday, August 05, 2010 8:47 AM
To: OpenSceneGraph Users 
Subject: [osg-users] latest NVIDIA drivers


Hi community, 


anyone has experienced some weirdness with the latest drivers from NVIDIA? My 
shaders just stopped working with them without any warning/error from OSG ...

-Nick






___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org
Any OpenGL program doing pixel shader lighting using standard 
OpenGL state variables may encounter this problem.
Compilation of pretty standard fragment shaders doing pixel lighting math 
using gl_LightSource / gl_FrontMaterial uniforms 
fail with internal GLSL compiler error: 

Internal error: assembly compile error for fragment shader at offset 
13132:
-- error message --
line 254, column 1:  error: binding in multiple relative-addressed arrays

After examination, it looks like the gl_LightSource attenuation factors are 
generated multiple times into intermediate shader constants. This problem was 
not present in Windows drivers before the 256 series. I observed the error 
with driver versions 257.15 and the recent WHQL 257.21 on GeForce GTX 280 / 
GeForce 8800 GTS and Quadro 5800 (Quadro Plex D2).
I have not tested other boards; it was happening on every board I tested.

Below are simplified example shaders and generated GLSL compiler error output 
showing the error. 
My program uses 3 lights. Using 1 or 2 lights does not generate error:

Compiling VERTEX source:
  1: varying vec3 position;
  2: varying vec3 normal;
  3:
  4: void main(void)
  5: {
  6:  gl_Position = ftransform();
  7:  position = vec3(gl_ModelViewMatrix * gl_Vertex);
  8:  normal = normalize(gl_NormalMatrix * gl_Normal.xyz);
  9: }


Compiling FRAGMENT source:
  1: varying vec3 position;
  2: varying vec3 normal;
  3:
  4: void main( )
  5: {
  6:  vec4 ambient = vec4(0.0, 0.0, 0.0, 0.0);
  7:  vec4 diffuse = vec4(0.0, 0.0, 0.0, 0.0);
  8:  vec4 specular = vec4(0.0, 0.0, 0.0, 0.0);
  9:
 10:  vec3 normalizedNormal = normalize(normal);
 11:
 12:  for(int lightId = 0; lightId < 3; lightId++)
 13:  {
 14:vec3 lightVector = gl_LightSource[lightId].position.xyz;
 15:
 16:float attenuation = 1.0;
 17:
 18:if( gl_LightSource[lightId].position.w != 0.0 )
 19:{
 20:   lightVector -= position;
 21:   float distance = length(lightVector);
 22:attenuation = 1.0 / (gl_LightSource[lightId].constantAttenuation +
 23:  gl_LightSource[lightId].linearAttenuation * distance +
 24:  gl_LightSource[lightId].quadraticAttenuation * distance * distance);
 25:}
 26:
 27:ambient += attenuation * gl_LightSource[lightId].ambient;
 28:
 29:vec3 normalizedLightVector = normalize(lightVector);
 30:
 31:float nDotL = dot(normalizedNormal, normalizedLightVector);
 32:
 33:if( nDotL > 0.0 )
 34:{
 35:  diffuse += attenuation * nDotL * gl_LightSource[lightId].diffuse;
 36:
 37:  if(gl_FrontMaterial.shininess > 0.0) {
 38:float nDotHV = 0.0001;
 39:float pf = pow( nDotHV, gl_FrontMaterial.shininess );
 40:specular += attenuation * pf * gl_LightSource[lightId].specular;
 41:  }
 42:}
 43:  }
 44:
 45:  gl_FragColor =
 46: gl_FrontLightModelProduct.sceneColor +
 47: gl_FrontMaterial.emission +
 48: ambient * gl_FrontMaterial.ambient +
 49: diffuse * gl_FrontMaterial.diffuse +
 50: specular * gl_FrontMaterial.specular;
 51: }
 52:

Linking osg::Program  id=1 contextID=0
glLinkProgram  FAILED
Program  infolog:
Fragment info
-
Internal error: assembly compile error for fragment shader at offset 13132:
-- error message --
line 254, column 1:  error: binding in multiple relative-addressed arrays
-- internal assembly text --
!!NVfp4.0
OPTION NV_parameter_buffer_object2;
# cgc version 3.0.0001, build date Jun  7 2010
# command line args:
#vendor NVIDIA Corporation
#version 3.0.0.1
#profile gp4fp
#program main
#semantic gl_LightSource : state.light
#semantic gl_FrontMaterial : state.material.front
#semantic gl_FrontLightModelProduct : state.lightmodel.front
#var float4 gl_LightSource[0].ambient : state.light[0].ambient : c[0] : -1 : 1
#var float4 gl_LightSource[0].diffuse : state.light[0].diffuse : c

Re: [osg-users] latest NVIDIA drivers

2010-08-05 Thread Trajce (Nick) Nikolov
Hi Wojtek,

looks like that is it. The shader failing is the lighting shader ... Thanks!

-Nick


On Thu, Aug 5, 2010 at 11:51 AM, Wojciech Lewandowski lewandow...@ai.com.pl
 wrote:

  Hi Trajce,

 I have noticed issues with Shader compilation on 256 (and above) series on
 Windows 7. Some fragment shaders using gl_LightSource fields were generating
 internal compiler errors. In fact I also posted a bug report to
 NVidia. These errors were normally reported by OSG with compilation log
 showing assembly cg output that was causing trouble for compiler. You don't
 see any compilation errors even with OSG_NOTIFY_LEVEL=DEBUG_INFO ?

 I have attached the bug report I posted to NVidia; you can check if it could
 be related.

 Cheers,
 Wojtek



  *From:* Trajce (Nick) Nikolov nikolov.tra...@gmail.com
 *Sent:* Thursday, August 05, 2010 8:47 AM
 *To:* OpenSceneGraph Users osg-users@lists.openscenegraph.org
 *Subject:* [osg-users] latest NVIDIA drivers

 Hi community,

 anyone has experienced some weirdness with the latest drivers from NVIDIA?
 My shaders just stopped working with them without any warning/error from OSG
 ...

 -Nick
