On Wed, 2009-08-12 at 19:38 +0200, Kassen wrote:
> > that's another issue. as far as i know, fluxus depth sorts on
> > object level, which means that the camera distance for each
> > object is calculated then the objects are rendered in order
> > from back to front. although the particles are depth sorted
> > when the particle object is rendered, the whole particle
> > primitive will be rendered at a time specified by its distance
> > from the camera. this can interfere with the rest of the
> > scene. polygon level depth sorting could be a solution, but
> > fluxus does not do that as far as i know. dave will probably
> > correct me if i'm wrong.
>
> So this would become an issue when we'd have something like Tetris
> shapes fitted together and looked at from an odd angle? In that case
> we'd have to treat all parts of the object as separate objects and
> potentially chop faces into bits like I did to my plane?
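For anyone following along, the object-level sorting described above can be sketched roughly like this (a minimal Python sketch for illustration only - the function and field names are made up, not the fluxus API). Each primitive gets a single camera distance, e.g. from its centroid, and whole primitives are drawn back to front:

```python
# Sketch of object-level depth sorting: one distance per primitive,
# whole primitives rendered farthest-first. Names are hypothetical.
import math

def centroid(vertices):
    # Average of the vertex positions, used as the primitive's
    # single representative point for sorting.
    n = len(vertices)
    return tuple(sum(v[i] for v in vertices) / n for i in range(3))

def distance(a, b):
    return math.sqrt(sum((a[i] - b[i]) ** 2 for i in range(3)))

def render_order(primitives, camera_pos):
    # Sort whole primitives back to front. Polygons inside each
    # primitive keep their own order, which is exactly where the
    # artefacts come from when primitives interpenetrate.
    return sorted(primitives,
                  key=lambda p: distance(centroid(p["vertices"]), camera_pos),
                  reverse=True)
```

The point being that the sort key is per-primitive, so two interlocking shapes (the Tetris case) can never be drawn in the right order no matter how you sort them.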
Yep. This is a general problem with the way that hardware rendering works, unfortunately. The proper solution is to sort individual polygons, and chop them up as required as you say - which is obviously really slow to do all the time. We could add this, although it would get complicated splitting up things like triangle/quad lists and strips, or indexed polygons - so we'd only want it to happen to a subset of the primitives in the scene.

One additional thing I've been thinking of is giving you more control over rendering order by assigning primitives to groups or layers, which you can push back or forward like layers in the gimp.

cheers,

dave
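The layers idea could look something like this (again a hypothetical Python sketch, not anything fluxus actually does - the "layer" field and the helper are assumptions). Ordering becomes layer first, then camera distance within a layer, so pushing a primitive to a lower layer forces it behind everything in higher layers regardless of depth:

```python
# Sketch of layer-based render ordering: sort by (layer, distance),
# lower layers drawn first, back-to-front within each layer.
# All names here are illustrative.
def render_order(primitives, camera_distance):
    # camera_distance maps a primitive to its distance from the
    # camera; negating it gives farthest-first within a layer.
    return sorted(primitives,
                  key=lambda p: (p["layer"], -camera_distance(p)))
```

That would sidestep the polygon-splitting cost for the common cases, at the price of making the user say which things go in front.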
