[EMAIL PROTECTED] wrote:
> Hi Eric,
>
> I think I understand the two approaches you mention; I have about the
> same mental distinction, although with a different spin, maybe because
> I come from finite elements/data visualization rather than image
> manipulation. I see basically two types of 2D data that can be
> visualized as a colormapped image:
>
> a) Point data, where you provide a collection of values with
> associated coordinates. The algorithm is then fully responsible for
> computing the "best" value to associate with the other coordinates
> (not provided by the user, but nonetheless needed to build a full
> image, because there was a change of resolution or some data are
> simply missing). One can distinguish sub-categories: are the points
> given on
>   - a regular rectangular grid, *
>   - a regular hexagonal/triangular grid,
>   - an irregular rectangular grid, *
>   - another pattern, like a mapped rectangular grid, ...
>   - or at arbitrary locations, without any pattern?
> For now, only the *-marked sub-categories are supported in matplotlib,
> AFAIK.
>
> b) Cell data, where you provide a collection of cells (polygons), with
> vertex coordinates and color-mapped data. Again we can distinguish
> sub-categories:
>   - cell-wise data, no interpolation needed *
>   - vertex-wise data, interpolation needed:
>       - averaging over the cell
>       - proximity-based interpolation
>       - (bi)linear interpolation
>       - higher-order interpolation
> It can also be subdivided using cell-geometry criteria:
>   - cells forming a regular rectangular grid *
>   - cells forming an irregular rectangular grid *
>   - cells forming a mapped rectangular grid *
>   - arbitrary cells, but paving the plane (vertex coordinates +
>     connectivity table)
>   - fully arbitrary cells, not necessarily continuously connected
> Again, AFAIK only the *-marked categories are currently supported in
> matplotlib.
>
> In addition to these categories, one more thing must be considered
> when any kind of interpolation/extrapolation is performed: does the
> interpolation happen on the data, with the interpolated data then
> mapped to color (Phong mapping/interpolation/shading in matlab speak),
> or is the interpolation performed on colors after the initial
> user-provided data have been mapped (Gouraud
> mapping/interpolation/shading in matlab speak)?
>
> Some equivalences exist between cell-based and point-based images; in
> particular my patch can be seen either as irregular rectangular grid
> patches with bilinear interpolation, or as point data with bilinear
> interpolation. It is more the second one: for example, with cell data
> one can expect to be able to set line styles for cell boundaries,
> while for point data this would make no sense.
>
> I do not see the fact that for point data the interpolation is chosen
> by the algorithm as a bad thing: in fact, if one uses point data, it
> is because there is no cell information as such, so the fact that the
> algorithm chooses the boundaries automatically is an advantage.
>
> For example, your NonUniformImage is a point-data image generator
> which works for non-uniform rectangular grids. It associates with each
> pixel the data of the closest available point, using a non-Cartesian
> distance: the "manhattan distance" given by abs(x-x0) + abs(y-y0).
> My patch adds bilinear interpolation, using x and y bilinear
> interpolation between the four closest points. Other types of
> interpolation are possible, but more complex to implement.
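A minimal numpy sketch of the nearest-point lookup described above; it is
an illustration only, not the actual C++ code in _image.cpp, and the grid
values below are made up. On a rectangular grid, minimizing
abs(x-x0) + abs(y-y0) separates into an x part and a y part, so it amounts
to picking the closest column and the closest row independently, which is
also what extends boundary values outside the grid:

    import numpy as np

    def nearest_lookup(xg, yg, z, xp, yp):
        # z[j, i] holds the value at (xg[i], yg[j]); the spacing of xg and
        # yg may be non-uniform.  The "manhattan" distance splits into an
        # x term and a y term, so the nearest node is found by choosing
        # the closest column and the closest row separately.
        i = np.argmin(np.abs(xg - xp))
        j = np.argmin(np.abs(yg - yp))
        return z[j, i]

    # Made-up non-uniform grid.
    xg = np.array([0.0, 0.5, 2.0, 5.0])
    yg = np.array([0.0, 1.0, 4.0])
    z = np.arange(12.0).reshape(3, 4)

    print(nearest_lookup(xg, yg, z, 1.7, 0.4))  # point inside the grid
    print(nearest_lookup(xg, yg, z, 9.0, 9.0))  # outside: boundary value extended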
>
> The context in which we want to use NonUniformImage is not image
> processing, but data visualization: we have measurement data that
> depend on two variables (RPM and frequency), and a classic way to
> present such data is what is known in the field (acoustic radiation)
> as a "waterfall diagram". However, the samples are not necessarily
> taken on a regular grid. They are taken at best on an irregular grid
> (especially for the RPM), and at worst using a non-obvious pattern.
> There is no cell data here; boundaries are not really relevant. The
> only thing that matters is to provide an attractive and
> visually-easy-to-parse image of 2D data that was sampled (or
> simulated) on a set of points. In this context both interpolations
> are interesting for our users: nearest (useful because only measured
> data are shown, extended in the best way so that they cover the whole
> figure, without any actual interpolation) and bilinear (shows smoother
> variations, making it easier to visually detect patterns and
> interesting RPM/frequency regions).
>
> A small example, as you requested (a is within the data, b is outside,
> c is completely outside):
>
>                      c
>    11   12   13
>              a          b
>    21   22   23
>    31   32   33
>
> Nearest interpolation (already present, patched for a small error):
>   color at a = color at 12, 13, 22 or 23, whichever is closest to a
>   color at b = color at 13 or 23, whichever is closest to b
>   color at c = color at 13
> Bilinear interpolation (new):
>   S = (x23-x12)(y12-y23)
>   color at a = color(12) * (x23-xa)(ya-y23)/S
>              + color(13) * (xa-x22)(ya-y22)/S
>              + color(22) * (x13-xa)(y13-ya)/S
>              + color(23) * (xa-x12)(y12-ya)/S
>   color at b = color(13) * (yb-y23)/(y13-y23)
>              + color(23) * (y13-yb)/(y13-y23)
>   color at c = color at 13
>
> As I mentioned in my first message, completing the full array of image
> generation (point data, cell data, various interpolation schemes and
> geometric distributions of points or cell geometries, interpolation
> before or after the color mapping) is a tremendous amount of work. Not
> even matlab has it completed, although it is far more complete than
> matplotlib for now and can be considered a worthy goal... But I think
> this linear interpolation after color mapping on irregular rectangular
> point data is already a useful addition ;-)
>
> Regards,
>
> Greg.
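A small numpy sketch of the worked example above, evaluating both the
nearest rule and the bilinear weights at a, b and c. The coordinates are
made up for illustration and the sketch interpolates the raw values for
brevity, whereas the patch (per the Gouraud discussion above) applies the
same weights to the already-mapped colors; this is not the _image.cpp
implementation:

    import numpy as np

    def nearest(xg, yg, z, xp, yp):
        # "Manhattan" nearest node: closest column and closest row.
        return z[np.argmin(np.abs(yg - yp)), np.argmin(np.abs(xg - xp))]

    def bilinear(xg, yg, z, xp, yp):
        # Clamp to the grid, then interpolate linearly in x and in y
        # between the four surrounding nodes; outside the grid this
        # degenerates to interpolation along one edge (point b) or to the
        # corner value (point c), as in the formulas above.
        i = np.clip(np.searchsorted(xg, xp), 1, len(xg) - 1)
        j = np.clip(np.searchsorted(yg, yp), 1, len(yg) - 1)
        tx = np.clip((xp - xg[i - 1]) / (xg[i] - xg[i - 1]), 0.0, 1.0)
        ty = np.clip((yp - yg[j - 1]) / (yg[j] - yg[j - 1]), 0.0, 1.0)
        return ((1 - tx) * (1 - ty) * z[j - 1, i - 1]
                + tx * (1 - ty) * z[j - 1, i]
                + (1 - tx) * ty * z[j, i - 1]
                + tx * ty * z[j, i])

    # The 3x3 grid of the example: made-up coordinates, values named after
    # the labels (row 1 is drawn on top, so it gets the largest y).
    xg = np.array([0.0, 1.0, 3.0])
    yg = np.array([0.0, 2.0, 5.0])
    z = np.array([[31.0, 32.0, 33.0],
                  [21.0, 22.0, 23.0],
                  [11.0, 12.0, 13.0]])

    for name, (xp, yp) in [("a", (2.0, 4.0)), ("b", (4.0, 4.0)), ("c", (4.0, 6.0))]:
        print(name, nearest(xg, yg, z, xp, yp), bilinear(xg, yg, z, xp, yp))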
Greg,

Thank you for your very clear and complete explanation.

I have committed your patch with only a few modifications:

0) I fixed a bug with non-agg backends by setting im.is_grayscale.
1) I changed the handling of the interpolation kwarg. The default is
still 'nearest'.
2) I took Mike's suggestion to allocate acols and arows only if needed.
3) I renamed pcolor_nonuniform to image_nonuniform, modified it to show
both types of interpolation, and added it to the set run by
backend_driver.py. This is the most minimal test; one could write a
whole test suite to exercise various inputs and options.

Among the questions that occur to me:

1) Should the functionality be exposed as an Axes method, and from
there as a pyplot function? This could be done by integrating it into
imshow, either via the argument signature or via kwargs for X and Y, or
it could be a separate method.

2) If X and Y are in fact uniform, should the displayed image be the
same as with the present imshow? The big difference now is the
boundary: NonUniformImage extends boundary values indefinitely, while
Image uses the background color for anything outside the image. I find
the NonUniformImage behavior disconcerting.

3) Should caching be added to NonUniformImage? Every other class in the
image.py module uses it.

4) Can we do anything with the internal function naming, at least, to
make the distinction between point-data functions and cell-data
functions clearer? My thought was to use "image" for point data and
"pcolor" for cell data, but this may reflect my ignorance and
particular usage patterns from matlab days. What I would do is rename
the _image.cpp pcolor function to nonuniform_image, or something like
that, to distinguish it more clearly from pcolor2, which works with
cell data.

Eric
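A rough sketch of what the "separate method" option from question 1 might
look like from the user side. The name imshow_nonuniform is purely
hypothetical and this is not how anything was actually committed; the
NonUniformImage calls mirror the existing class, and ax.add_image is used
here to attach the artist (the older idiom was ax.images.append(im)):

    import numpy as np
    import matplotlib.pyplot as plt
    from matplotlib.image import NonUniformImage

    def imshow_nonuniform(ax, x, y, z, interpolation='nearest', **kwargs):
        """Hypothetical wrapper: display point data sampled on a
        non-uniform rectangular grid (x, y) as a NonUniformImage."""
        im = NonUniformImage(ax, interpolation=interpolation,
                             extent=(x.min(), x.max(), y.min(), y.max()),
                             **kwargs)
        im.set_data(x, y, z)
        ax.add_image(im)             # older matplotlib: ax.images.append(im)
        ax.set_xlim(x.min(), x.max())
        ax.set_ylim(y.min(), y.max())
        return im

    # Made-up non-uniform sampling in x and y, e.g. frequency vs. RPM.
    x = np.array([0.0, 0.5, 2.0, 3.0, 7.0])
    y = np.array([0.0, 1.0, 4.0, 5.0])
    z = np.random.rand(len(y), len(x))   # z[j, i] is the value at (x[i], y[j])

    fig, (ax1, ax2) = plt.subplots(1, 2)
    imshow_nonuniform(ax1, x, y, z, interpolation='nearest')
    imshow_nonuniform(ax2, x, y, z, interpolation='bilinear')
    plt.show()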