On 2009-11-28 16:10-0800 Alan W. Irwin wrote:

> On 2009-11-27 11:19-0800 Alan W. Irwin wrote:
>
>> If you run
>>
>> examples/c/x16c -dev xwin -exclude -nx 20 -ny 20
>>
>> a smooth donut is excluded from the 5th page (as expected, since years ago
>> Rafael modified the defined algorithm for plshade(s) to give smooth edges
>> using some sort of iterative technique). However, if you use
>> coarser resolutions, e.g.,
>>
>> examples/c/x16c -dev xwin -exclude -nx 10 -ny 10
>>
>> Rafael's iteration sometimes still gives reasonably good results (some
>> edges of the donut are smooth) at coarser resolutions but also sometimes
>> gives horrible results (the edges of the donut are extremely misplaced or
>> missing altogether). I interpret these results as a numerical robustness
>> issue (a starting solution that is so bad that it cannot converge, or a
>> convergence criterion that stops the iteration before it has converged).
>
> I have looked into this further. Rafael's iteration technique (routine
> bisect in plshade.c) for finding the edge of defined regions is the
> bisection method. According to Numerical Recipes, Chapter 9.1, bracketing
> issues can be important for the bisection method (especially for our
> gradient use case where multiple solutions for finding the edge of
> complicated polygons can exist).
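For anyone following along who has not looked at the code, here is a minimal
sketch (not the actual plshade.c routine; the names, callback signature, and
fixed iteration count are illustrative only) of the kind of bisection I am
talking about: given one point known to be inside the defined region and one
known to be outside, it homes in on the boundary between them.

    /* Sketch of edge-finding by bisection between a point known to be inside
     * the defined region (xd, yd) and a point known to be outside (xu, yu).
     * defined_cb plays the role of the user "defined" callback of plshade(s):
     * it returns 1 inside the defined region and 0 outside. */
    typedef int (*defined_func)(double x, double y);

    static void
    bisect_edge(defined_func defined_cb,
                double xd, double yd,   /* defined end of the bracket */
                double xu, double yu,   /* undefined end of the bracket */
                double *xb, double *yb) /* returned boundary estimate */
    {
        int i;
        for (i = 0; i < 20; i++)
        {
            double xm = 0.5 * (xd + xu);
            double ym = 0.5 * (yd + yu);
            if (defined_cb(xm, ym))
            {
                /* Midpoint is inside: move the defined end of the bracket. */
                xd = xm;
                yd = ym;
            }
            else
            {
                /* Midpoint is outside: move the undefined end. */
                xu = xm;
                yu = ym;
            }
        }
        *xb = 0.5 * (xd + xu);
        *yb = 0.5 * (yd + yu);
    }

Like any bisection, this converges to a boundary crossing only if the initial
segment actually brackets exactly one transition; that bracketing assumption
is exactly what fails in the cases discussed below.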
I have now (revision 10681) cleaned up and commented exfill and made the logic more robust, but I have not yet made any fundamental changes to the exfill algorithm. Currently that algorithm simply drops all undefined vertices from the polygon and inserts new vertex points on each side of the undefined vertex (or run of undefined vertices), located with a bisection method right at the edges of the undefined regions that intersect the polygon. There are two fundamental bracketing issues with the current algorithm.

(1) For two consecutive vertices of the same type (say undefined), it is assumed that the entire edge between them is also of that type. This is the principal issue for the software-fallback gradients, where the polygons tend to be long skinny rectangles when the gradient is defined with 2x2 points. Suppose the defined region is some shape in the middle of that 2x2 area: all vertices of those polygons then lie in undefined regions, so the whole polygon is assumed to be undefined and you get no gradient result at all! To deal with this issue you need some criterion (probably based on the distance between the vertices) for deciding to search between two vertices of the same type for two transitions (one to the opposite type and one back again); see the sketch below. This is analogous to attempting to bracket two roots of a function between two values of the independent variable where the function has the same sign at both: you may be wasting your time because there are no roots there, or there may be two (or more) roots there; you just never know.

(2) The algorithm must also deal with the situation where two or more separate regions of the polygon boundary are undefined. Right now those regions are simply spanned (as you can tell from the description above, where undefined vertices are simply dropped from the polygon). That may be the best approximation if the two undefined regions are unrelated, but if they are the two entrances of a "river" of undefined area flowing through the polygon, then the best approximation would be to split the polygon into its two defined pieces. This turns out to be the issue when attempting to use the -exclude option of example 16 with small nx, ny values. Some rectangles completely span the excluded donut, so they are treated as completely defined because of issue (1). But even if issue (1) were fixed, you would still have to deal with the fact that the undefined regions on opposite sides of the rectangle are correlated, corresponding to part of the donut cutting through the rectangle. Of course, it rapidly gets even more complicated (with more chances of making a completely incorrect interpretation) the more undefined regions of the polygon you have. Probably the only way to deal with this issue directly is to split the polygon into tiny pieces so there is no ambiguity about how the undefined regions on the polygon boundary are connected in its interior.

Note that both issues (1) and (2) can be solved externally by using sufficient resolution to determine, with no ambiguity, where the defined regions are located. So unless somebody here can come up with good internal algorithms to deal with issues (1) and (2), I think I will give up on any internal solutions for these issues and recommend only the external one; that is, when using the defined capability of plshade(s), always be sure to use enough z points that there is no ambiguity about finding the defined region.
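To make issue (1) concrete, here is an illustrative sketch (not exfill itself;
all names, and the distance-based sampling criterion, are hypothetical) of the
kind of search that would be required: given an edge whose two endpoints are
both undefined, sample intermediate points at a spacing eps to look for a
defined point in between. If one is found, bisection (as in the previous
sketch) can then locate the two bounding transitions.

    #include <math.h>

    typedef int (*defined_func)(double x, double y);

    /* Returns 1 and reports one interior defined point if a transition pair
     * is detected between two same-type (undefined) endpoints; returns 0 if
     * the sampling finds no defined point (which, of course, proves nothing). */
    static int
    find_interior_defined(defined_func defined_cb,
                          double x0, double y0, double x1, double y1,
                          double eps, double *xin, double *yin)
    {
        double length = hypot(x1 - x0, y1 - y0);
        int    n      = (int) ceil(length / eps); /* number of sample intervals */
        int    i;

        for (i = 1; i < n; i++)
        {
            double t = (double) i / n;
            double x = x0 + t * (x1 - x0);
            double y = y0 + t * (y1 - y0);
            if (defined_cb(x, y))
            {
                /* Found a defined point between two undefined endpoints, so
                 * the edge crosses the boundary (at least) twice. */
                *xin = x;
                *yin = y;
                return 1;
            }
        }
        return 0;
    }

The fundamental limitation is visible right in the sketch: whatever eps is
chosen, a defined feature narrower than eps can still slip through undetected,
which is why I do not see a robust internal criterion and lean toward the
external recommendation above.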
As far as I know, that recommendation affects just two use cases. I believe the defined capability of plshade(s) is only accessible from C and not from any of our other languages, so direct use of the capability must be fairly rare, and such users have probably already figured out that they need to use lots of z points. That leaves the second use case, which is the software fallback for plgradient. For that case I intend to take my own recommendation and stick with using lots of z points (currently 20x20), and abandon my dream of using just 2x2. After all, our best-supported devices have native support for plgradient, so they do not use this code path at all. In any case, I plan to issue a warning whenever the software fallback for plgradient must be used for drivers that have no native gradient; for users who need this capability a lot, that warning should nudge them toward a device that natively supports plgradient.

Alan

__________________________
Alan W. Irwin

Astronomical research affiliation with Department of Physics and Astronomy,
University of Victoria (astrowww.phys.uvic.ca).

Programming affiliations with the FreeEOS equation-of-state implementation
for stellar interiors (freeeos.sf.net); PLplot scientific plotting software
package (plplot.org); the libLASi project (unifont.org/lasi); the Loads of
Linux Links project (loll.sf.net); and the Linux Brochure Project
(lbproject.sf.net).
__________________________

Linux-powered Science
__________________________