FYI, the nVidia documentation on this feature is here:
http://developer.download.nvidia.com/devzone/devcenter/gamegraphics/files/OptimusRenderingPolicies.pdf
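For reference, the relevant snippet from that document looks roughly like
this (a sketch only - the symbol name and value are the ones given in the
PDF, and as far as I know the export has to live in the .exe itself, not
in a DLL, for the driver to see it):

    // Exporting this symbol with the value 1 asks the Optimus driver to
    // prefer the discrete nVidia GPU over the integrated Intel one.
    // The document declares it as DWORD; unsigned long is the same
    // underlying type and avoids pulling in windows.h.
    #if defined(_WIN32)
    extern "C" {
        __declspec(dllexport) unsigned long NvOptimusEnablement = 0x00000001;
    }
    #endif

On a system without the nVidia driver the export is simply ignored, which
matches Robert's point below about it being safe on non-NVidia systems.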



2014-11-26 15:42 GMT+01:00 Robert Osfield <[email protected]>:

> Hi Christian,
>
> On 26 November 2014 at 14:34, Christian Buchner <
> [email protected]> wrote:
>
>>
>> The way this works is that the nVidia driver (OpenGL or DirectX) looks
>> for this export symbol in the binary it is linked against. Whenever the
>> symbol is found, the driver prefers the nVidia graphics card over the
>> integrated Intel one.
>>
>> I do not think that the binary in question (i.e. the OSG library, or the
>> program using OSG) can control at run time whether or not this export
>> symbol is visible to the driver.
>>
>
> If it's our application/library providing the global that the driver
> queries for, rather than us getting/setting a global that the driver
> provides, then it simplifies things significantly for us - we just have to
> create the global variable via an extern and appropriate export symbol, and
> the driver will do the rest.  This will also mean it's safe to implement on
> non-NVidia systems, and it should be possible to place control for it in
> osg::DisplaySettings.
>
> Robert.
>
>