Hi Stephan,

Paul's suggestion of batching would certainly help.

Another route would be to instance a single cube, but use a vertex program to position and set the colours for each individual cube.  The osgforest example has a GLSL path that does just this.
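To make the instancing idea concrete, here's a minimal sketch in plain C++ of the per-instance tables such a vertex program would consume.  This is not the osgforest code itself; the layout (400 rings of 400 samples, unit-radius wall, grey placeholder colours) is an assumption for illustration.  The single template cube would live in one shared osg::Geometry, and these tables would be fed to the shader as attributes or uniforms.

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

// One template cube, reused for every reading.  A vertex program (as in the
// osgforest GLSL path) fetches a per-instance position, scale and colour and
// applies them to the template vertices on the GPU; here we just build the
// per-instance tables it would read from.
struct Instance {
    float x, y, z;      // position of this cube on the pipe wall
    float radialScale;  // wall thickness: outer reading minus inner reading
    float r, g, b, a;   // colour encoding the reading
};

// Hypothetical layout: 'rings' revolutions, 'samplesPerRing' readings each.
std::vector<Instance> buildInstances(std::size_t rings,
                                     std::size_t samplesPerRing,
                                     float ringSpacing)
{
    std::vector<Instance> out;
    out.reserve(rings * samplesPerRing);
    const float kPi = 3.14159265358979f;
    for (std::size_t ring = 0; ring < rings; ++ring) {
        for (std::size_t s = 0; s < samplesPerRing; ++s) {
            float angle = 2.0f * kPi * float(s) / float(samplesPerRing);
            Instance inst;
            inst.x = std::cos(angle);           // unit-radius pipe wall
            inst.y = std::sin(angle);
            inst.z = ringSpacing * float(ring); // depth along the pipe
            inst.radialScale = 1.0f;            // stand-in for a real reading
            inst.r = inst.g = inst.b = inst.a = 1.0f;
            out.push_back(inst);
        }
    }
    return out;
}
```

With 400 revolutions of 400 readings this yields the full 160,000 instances, but only one cube's worth of geometry ever crosses the bus.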

Alternatively you could render just one continuous tube of data as a single geometry object, and break this tube into separate segments by applying a 1D texture along its depth.  The 1D texture would be where you encode all the readings.  This does limit you to 2k or 4k samples per geometry though, so you'd have to do it in segments.
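The segment bookkeeping is just integer division.  A small sketch, assuming a 4k texel limit (the exact limit is hardware-dependent, so treat 4096 as a placeholder):

```cpp
#include <cassert>
#include <cstddef>

// Split N readings across 1D textures of at most 'maxTexels' texels (2k or
// 4k depending on hardware).  Each tube segment gets its own 1D texture; a
// reading maps to (segment, texel), and the texel index becomes the 1D
// texture coordinate applied along that segment's depth.
struct TexelAddress {
    std::size_t segment;  // which tube segment / 1D texture
    std::size_t texel;    // texel within that texture
    float texcoord;       // centre of the texel in [0, 1]
};

TexelAddress addressOf(std::size_t reading, std::size_t maxTexels)
{
    TexelAddress a;
    a.segment  = reading / maxTexels;
    a.texel    = reading % maxTexels;
    a.texcoord = (float(a.texel) + 0.5f) / float(maxTexels);
    return a;
}

std::size_t segmentsNeeded(std::size_t readings, std::size_t maxTexels)
{
    return (readings + maxTexels - 1) / maxTexels;  // ceiling division
}
```

At 4096 texels per texture, the 160,000 readings in a 1 metre length would need 40 tube segments.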

Another variation on this theme would be to create the tube of data with point sprites and colour each one individually using a colour array.  This would get you over the 4k hurdle of texture size.  The osgpointsprite example will give you a guide to doing a bit of this.
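A rough sketch of the vertex and colour arrays this approach needs, again in plain C++ for illustration.  In OSG terms the two vectors would become a Vec3Array and a per-vertex Vec4Array on an osg::Geometry drawing GL_POINTS, with an osg::PointSprite state attribute, much as the osgpointsprite example sets up.  The grey-scale transfer function is a placeholder, not part of any example.

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

// One point sprite per reading, coloured individually from a colour array,
// so there is no texture-size ceiling on the number of samples.
struct Vec3 { float x, y, z; };
struct Vec4 { float r, g, b, a; };

void buildSprites(const std::vector<float>& readings,  // e.g. wall thickness
                  std::size_t samplesPerRing,
                  float ringSpacing,
                  std::vector<Vec3>& vertices,
                  std::vector<Vec4>& colours)
{
    const float kPi = 3.14159265358979f;
    vertices.clear();
    colours.clear();
    vertices.reserve(readings.size());
    colours.reserve(readings.size());
    for (std::size_t i = 0; i < readings.size(); ++i) {
        float angle = 2.0f * kPi * float(i % samplesPerRing)
                      / float(samplesPerRing);
        float depth = ringSpacing * float(i / samplesPerRing);
        vertices.push_back(Vec3{ std::cos(angle), std::sin(angle), depth });
        // Hypothetical transfer function: grey-scale by reading value.
        float v = readings[i];
        colours.push_back(Vec4{ v, v, v, 1.0f });
    }
}
```

Because the colour array is just per-vertex data, 160,000 (or millions of) readings stay in a handful of drawables rather than 160,000 nodes, which is where the cull and draw cost was going.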

My favourite for simplicity and efficiency is the point sprite approach; you could scale it up to millions of points too and still get near to 60Hz out of it.

Robert.

On 9/23/06, Stephen Northcott <[EMAIL PROTECTED]> wrote:
Dear all,

I wonder if a few people who are more familiar with getting the best
performance out of OSG can make some suggestions..

Some background on my problem..
I am visualizing a pipeline using ultrasonic readings from the inside
of the pipeline.
There are two items of data: the distance from the inside wall and
the outside wall back to the center, repeated many times around the
pipeline's circumference.

To get any kind of decent resolution out of this pipe data we rotate
our sensor around 400 times in a 1 meter length, and take 400
readings for each revolution. That makes a total of 160,000
individual readings I am representing in OSG to give us that 1 meter
length of pipe.

If I simply take those 160,000 lengths of data and render them as
unit cubes, stretched to the length of the distance between the
inside of the pipe and the outside and then make a scene graph with
them all arranged into the form of the pipe.. it makes a great
visualization, but makes a massive scene graph which has very high
overheads for culling and actually rendering also. Not unexpectedly
as it's not very efficient..

Now, I do plan to work on optimizations by pre-processing this data,
having 'stretched unit cube' model data in look up tables so that I
don't have to make 160,000 individual unit cubes, and looking for
repeat patterns and so on in the data that can be modeled as
something other than unit cubes... But in theory we could end up with
a length of pipe with 160,000 truly individual readings in each 1
meter length, and as such I want to be able to render that as quickly
as possible for the worst case scenario.

Can anyone suggest some areas in OSG I should be looking to get this
high object scene to cull and render any quicker?

Thanks for any pointers to material that you feel is relevant.

Kind regards,
Stephen.



_______________________________________________
osg-users mailing list
[email protected]
http://openscenegraph.net/mailman/listinfo/osg-users
http://www.openscenegraph.org/
