Hello Johannes,

On 2017-03-09 00:13, Johannes wrote:
> first off, my understanding is that the TransparencyForceTransparent,
> ... give you the possibility to govern the process of transparency
> detection of a material. If you have a standard material this can be
> done automatically and therefore the default for the attribute is
> TransparencyAutoDetection. However, if you have a fancy material as in
> my case that does not have any known transparency characteristic you
> need to have a way to tell the system that your material is either
> transparent or opaque. At least that is my interpretation :-)

ah, I see, that makes sense.
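
In code that boils down to forcing the mode on the material; a minimal
sketch (the accessor name and the enum scoping are written from memory,
so please double-check them against OSGMaterial.h):

    #include <OpenSG/OSGSimpleMaterial.h>

    // TransparencyAutoDetection is the default; for a material whose
    // transparency cannot be detected automatically, force the mode:
    OSG::SimpleMaterialUnrecPtr mat = OSG::SimpleMaterial::create();

    mat->setDiffuse(OSG::Color3f(0.8f, 0.2f, 0.2f));

    // accessor/enum names from memory - check OSGMaterial.h:
    mat->setTransparencyMode(OSG::Material::TransparencyForceTransparent);
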
> Next, I know that you haven't written the DepthPeelingStage. What I
> hoped to learn is the 'correct' way of writing any Stage with respect
> to transparency. I have debugged a little in the stage and see that in
> a mixed transparent/opaque scene the rooted stage's renderEnter(Action*
> action) method is called exactly once. That means (correct me if I'm
> wrong) that the stage is responsible for discriminating the transparent
> from the opaque pass. For instance in the case of shadowing, the
> transparent geometry should not be part of the shadow map generating
> objects at all, because they do not cast any shadows. So part of a
> shadow stage must be able to constrain itself to opaque geometries. Or
> in the depth peeling case the opaque geometry must be rendered with
> other materials and possibly another shader configuration. So there
> must be a way to detect the render state in the shader.

Hmm, I'm not sure if there is a general way. I believe some stages play
tricks with the traversal mask - which isn't ideal, as those bits should
be available to the application.

The separation of opaque and transparent objects happens at the DrawTree
level. DrawTrees are constructed by the RenderPartition to decide the
order in which objects are actually drawn - allowing it to sort by
material (for opaque objects, to reduce state changes) or by depth (for
transparent ones).
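
To make the traversal mask idea a bit more concrete: a shadow stage that
wants to keep transparent geometry out of its shadow map pass could
reserve a bit for "casts shadows" and filter on it, roughly like this
(only a sketch; the bit value is arbitrary, and whether RenderAction
exposes its traversal mask exactly as {get,set}TravMask() is an
assumption you would have to verify):

    #include <OpenSG/OSGNode.h>
    #include <OpenSG/OSGRenderAction.h>

    // Application-level convention: this bit marks shadow casters.
    static const OSG::UInt32 ShadowCasterMask = 0x00010000;

    // Transparent objects do not cast shadows, so clear the bit on them.
    void excludeFromShadowPass(OSG::Node *node)
    {
        node->setTravMask(node->getTravMask() & ~ShadowCasterMask);
    }

    // Restrict one traversal (e.g. the shadow map pass) to shadow
    // casters only, then restore the mask for the remaining passes.
    void traverseShadowCasters(OSG::RenderAction *a, OSG::Node *root)
    {
        OSG::UInt32 oldMask = a->getTravMask();

        a->setTravMask(oldMask & ShadowCasterMask);
        a->apply(root);          // or whatever triggers the pass
        a->setTravMask(oldMask); // restore for the remaining passes
    }

As said, this eats into mask bits that really belong to the application,
so it is more of a workaround than a clean solution.
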
> I'm looking for an explanation of how these things are to be set up
> correctly in any Stage implementation. I think that this is a central
> point and I would like to learn these details in order to do my things
> correctly.

Those are valid questions; the thing is, I suspect there may not be a
straightforward answer to them. Especially with the more complex stages,
which essentially implement entire "rendering algorithms", compositing
their results is in general not a problem with an obvious solution. Some
of these methods were not even invented (or at least not widely used)
when the Stage abstraction was first implemented; its initial use was to
have a way to render to texture. If you have an algorithm that requires
finer-grained control over the rendering, you may have to go into the
bowels of the Render{Action,Partition} and extend what they expose.

> On 08.03.2017 17:02, Carsten Neumann wrote:
>>
>>> Is it even possible to mix opaque and transparent geometry with the
>>> DepthPeelingStage core or is the stage incomplete with respect to
>>> that task?
>>
>> I guess that is a possibility.
>>
> In that case it needs correction to be usable in the common case.

True, but see above: compositing arbitrary rendering algorithms in all
combinations automatically seems to me like it could turn into a tricky
problem.

>> Transparent objects are rendered after opaque ones in back to front
>> order (IIRC using the bounding box center).
>>
> Yes, but that is not enough in my understanding. There has to be a
> pattern of how to write Stages with respect to transparency in the
> case that different rendering setups are necessary for transparent
> and opaque geometries.
>
>>> Could you please take a look into the example and give me some hint
>>> what I'm doing wrong here?
>>
>> Not specifically, sorry. In general I would suspect it has something
>> to do with the FBOs/Attachments the stages are rendering into and how
>> they perform clearing.
>
> I will have a look into the details.
>
> I really need more explanations for the RenderAction, RenderPartition,
> Stage, transparency mix. I have searched the mailing list but did not
> get enough information for sorting the issues in my head.

The RenderAction (RA) traverses the scene tree, visiting the NodeCores
along the way. Drawable objects are collected into the active
RenderPartition (RP; there is a default one that renders to the window's
back buffer), which stores them in its DrawTree. The DrawTree is
responsible for organizing objects in the "optimal" drawing order, by
default separating opaque and transparent objects and ordering them
differently (see above). IIRC the DrawTree is processed (i.e. the actual
drawing happens) when its owning RP is finalized. Stages use API on the
RA to create additional RPs that can then target custom FBOs, traverse
the scene below them multiple times, post-process the FBO attachments,
etc. (see the P.S. below for a rough skeleton of that pattern).

I believe this is how these things fit together. I'll try to answer more
specific questions, but please keep in mind that I'm not working with
the code on a daily basis these days...

Cheers,
    Carsten
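
P.S. In case a concrete skeleton helps: a Stage subclass (MyStage is
just a placeholder name here) typically creates its extra partition in
renderEnter(), roughly like this. Written from memory and not
compile-tested; the authoritative reference is OSGSimpleStage.cpp /
OSGStage.cpp, and the helper names pushPartition/popPartition,
getActivePartition and recurseFromThis are the ones I remember, so
please verify them against the headers.

    #include <OpenSG/OSGRenderAction.h>
    #include <OpenSG/OSGRenderPartition.h>

    OSG::Action::ResultE MyStage::renderEnter(OSG::Action *action)
    {
        OSG::RenderAction *a = dynamic_cast<OSG::RenderAction *>(action);

        if(a == NULL)
            return OSG::Action::Continue;

        // Create an additional RenderPartition; it gets its own DrawTree
        // and therefore its own opaque/transparent separation.
        this->pushPartition(a);
        {
            OSG::RenderPartition *pPart = a->getActivePartition();

            // Point the partition at the stage's FBO; a real stage also
            // sets up viewport dimensions, background and camera
            // matrices here (see OSGSimpleStage.cpp for the full
            // version).
            pPart->setRenderTarget(this->getRenderTarget());

            // Traverse the subtree below the stage into this partition.
            this->recurseFromThis(a);
        }
        this->popPartition(a);

        // The children were already traversed above.
        return OSG::Action::Skip;
    }

The drawing itself still happens when the partition is finalized, i.e.
the DrawTree ordering described above applies per partition.
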