Since I didn't get any replies the first time around, I'd like to offer a possible solution.
What if the main visualization app sent queries (over TCP/IP, a pipe, or some other IPC mechanism) to a second process that performed all of the intersection tests? Admittedly this isn't high-performance, but running the intersection tests in another process most certainly wouldn't drop frames in the viewer. For my particular problem I can live with the latency of these lazy intersection tests.

One question I have is whether I can share the same paged database between two different apps. Thoughts...comments?
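To make the idea concrete, here's a rough sketch of what such a second process might look like: it loads its own copy of the paged database and answers "lat lon" queries on stdin with a height on stdout. The database path, the line-based stdin/stdout protocol, and the 20 km / -1 km segment endpoints are all assumptions for illustration, not a tested implementation.

```cpp
// Hypothetical stand-alone "intersection server".
// Reads "lat lon" (degrees) pairs from stdin and writes the terrain height
// above the ellipsoid (or "no-hit") to stdout, one line per query.
#include <osg/CoordinateSystemNode>
#include <osg/Math>
#include <osgDB/ReadFile>
#include <osgSim/LineOfSight>
#include <iostream>

int main(int argc, char** argv)
{
    // Load the same paged database the visualization app uses (path is an assumption).
    osg::ref_ptr<osg::Node> terrain =
        osgDB::readNodeFile(argc > 1 ? argv[1] : "terrain.ive");
    if (!terrain.valid()) return 1;

    osg::EllipsoidModel ellipsoid;

    osgSim::LineOfSight los;
    // Let the intersection traversal pull in PagedLOD tiles as it needs them.
    los.setDatabaseCacheReadCallback(new osgSim::DatabaseCacheReadCallback);

    double latDeg, lonDeg;
    while (std::cin >> latDeg >> lonDeg)
    {
        // Build a vertical segment through the ellipsoid at this lat/lon,
        // from 20 km above down to 1 km below the ellipsoid surface.
        double lat = osg::DegreesToRadians(latDeg);
        double lon = osg::DegreesToRadians(lonDeg);
        osg::Vec3d start, end;
        ellipsoid.convertLatLongHeightToXYZ(lat, lon, 20000.0, start.x(), start.y(), start.z());
        ellipsoid.convertLatLongHeightToXYZ(lat, lon, -1000.0, end.x(),   end.y(),   end.z());

        los.clear();
        unsigned int index = los.addLOS(start, end);
        los.computeIntersections(terrain.get());

        const osgSim::LineOfSight::Positions& hits = los.getIntersections(index);
        if (!hits.empty())
        {
            // Convert the first (topmost) hit back to lat/lon/height and report the height.
            double hitLat, hitLon, height;
            ellipsoid.convertXYZToLatLongHeight(hits.front().x(), hits.front().y(), hits.front().z(),
                                                hitLat, hitLon, height);
            std::cout << height << std::endl;
        }
        else
        {
            std::cout << "no-hit" << std::endl;
        }
    }
    return 0;
}
```

The main app would just keep the pipe/socket open, post lat/lon queries as points arrive, and apply the heights whenever the answers come back. Note that the worker loads its own copy of the database through osgDB, so nothing is shared with the viewer's scene graph; both processes simply read the same files from disk.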
On 5/16/07, sherman wilcox <[EMAIL PROTECTED]> wrote:

Hello everyone. I'm presently working on performing accurate runtime intersection tests against paged databases, and I'm trying to perform these tests in a manner that doesn't result in dropped frames. My app uses a paged ellipsoid model of the earth; the terrain model itself is several GB in size and was built with osgdem.

I have point data that I need to place just above the terrain at runtime. I do not know the latitude/longitude of the points ahead of time, and I can get as many as 20 per second at peak times. I wrote a small sample app that uses osgSim::LineOfSight to get the correct height. This works, but I'm told that running it on a background thread that isn't synced with the scene graph is not safe, and if I run it from the main rendering loop the intersection test can sometimes block for several seconds. Given the quantity of points coming in, that performance hit could make the app unusable.

My question for the community is this: how do you guys perform accurate intersection tests against paged databases without losing frames?
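For reference, once a ground height has come back for a point, placing it just above the terrain might look roughly like the sketch below. The 10 m offset, the sphere marker, and the createMarker name are purely illustrative, and groundHeight is assumed to be whatever your intersection test (LineOfSight or otherwise) returned.

```cpp
// Rough sketch: build a transform that places a marker a little above the
// terrain at a given lat/lon. Add the result under the scene graph root from
// the main/update thread so the viewer stays in sync.
#include <osg/CoordinateSystemNode>
#include <osg/MatrixTransform>
#include <osg/Shape>
#include <osg/ShapeDrawable>
#include <osg/Geode>

osg::MatrixTransform* createMarker(const osg::EllipsoidModel& ellipsoid,
                                   double latRadians, double lonRadians,
                                   double groundHeight)
{
    // Local-to-world matrix at the given lat/lon, 10 m above the computed ground height.
    osg::Matrixd localToWorld;
    ellipsoid.computeLocalToWorldTransformFromLatLongHeight(
        latRadians, lonRadians, groundHeight + 10.0, localToWorld);

    // Simple sphere as a stand-in for the real point geometry.
    osg::Geode* geode = new osg::Geode;
    geode->addDrawable(new osg::ShapeDrawable(new osg::Sphere(osg::Vec3(0.0f, 0.0f, 0.0f), 25.0f)));

    osg::MatrixTransform* mt = new osg::MatrixTransform(localToWorld);
    mt->addChild(geode);
    return mt;
}
```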