Jean-Sébastien Guay wrote:
> Hello,
> 
>> Interesting topic for graphics freaks :-) 
>> http://blogs.intel.com/research/2007/10/real_time_raytracing_the_end_o.html
>> 
> 
> Well, considering that research in real-time raytracing has been
> ongoing since about 2000 but has relied on distributed computing
> (sometimes 40 machines or more in parallel), that we're seeing more
> and more cores in CPUs these days, and that raytracing is easy and
> efficient to parallelize (which, as we know, is not always the case
> for GPU-based rendering), it's only logical that it's coming.
> 
> The question is when it will start being useful to the general
> population, as the mainstream use of GPUs is now.
> 
> It's a very powerful and intuitive technique that I wish were more in
>  use today. Hopefully it will be in the not so distant future.
> 
> J-S
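The parallelization claim above can be sketched concretely: every pixel's primary ray is independent, so image rows can be farmed out to a pool of workers with no shared mutable state. A minimal sketch, assuming a hypothetical scene of one unit sphere at the origin viewed by an orthographic camera (the scene, names, and resolution are illustrative, not from any real renderer):

```python
import math
from concurrent.futures import ThreadPoolExecutor

WIDTH, HEIGHT = 64, 64  # illustrative resolution

def trace_pixel(x, y):
    # Map pixel (x, y) to [-1, 1] x [-1, 1]; the orthographic ray
    # travels straight along -z through that point.
    px = 2.0 * x / (WIDTH - 1) - 1.0
    py = 2.0 * y / (HEIGHT - 1) - 1.0
    # Ray-sphere test against a unit sphere at the origin: for an
    # orthographic ray along -z, it hits iff px^2 + py^2 <= 1.
    d2 = px * px + py * py
    if d2 > 1.0:
        return 0.0                    # background: ray misses the sphere
    nz = math.sqrt(1.0 - d2)          # z component of the surface normal
    return nz                         # shade by facing ratio (0..1)

def trace_row(y):
    # One row of the image; depends only on its own y, so rows
    # can run concurrently without locking.
    return [trace_pixel(x, y) for x in range(WIDTH)]

def render():
    # Each worker traces whole rows independently; the same split
    # is what lets raytracers scale across cores or machines.
    with ThreadPoolExecutor() as pool:
        return list(pool.map(trace_row, range(HEIGHT)))

image = render()
```

The point of the sketch is the structure, not the speed: because `trace_row` touches no shared state, swapping `ThreadPoolExecutor` for a process pool, or for nodes in a cluster, changes nothing in the per-pixel code.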

http://blogs.intel.com/research/2007/10/more_on_the_future_of_raytraci.php

> David Kirk, Nvidia’s chief scientist, was on a panel called “When
> Will Ray-Tracing Replace Rasterization” at SIGGRAPH 02. There he
> said,
> 
> “I’ll be interested in discussing a bigger question, though: ‘When
> will hardware graphics pipelines become sufficiently programmable to
> efficiently implement ray tracing and other global illumination
> techniques?’. I believe that the answer is now, and more so from now
> on! As GPUs become increasingly programmable, the variety of
> algorithms that can be mapped onto the computing substrate of a GPU
> becomes ever broader. As part of this quest, I routinely ask artists
> and programmers at movie and special effects studios what features
> and flexibility they will need to do their rendering on GPUs, and
> they say that they could never render on hardware! What do they use
> now: crayons? Actually, they use hardware now, in the form of
> programmable general-purpose CPUs. I believe that the future
> convergence of realistic and real-time rendering lies in highly
> programmable special-purpose GPUs.” - David Kirk, Nvidia.
> 
> This was five years ago!


Hopefully people are going to implement it with ATI and open-source drivers. 
Maybe they'll add their GPU as a coprocessor target to GCC; that would make it 
possible to compile a renderer with ATI chip support.

http://www.realityprime.com/articles/scenegraphs-past-present-and-future#tomorrow

Let's see what the future brings and where OSG is heading.

_______________________________________________
osg-users mailing list
[email protected]
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org