On Tue, Mar 11, 2003 at 11:41:30AM -0800, Daniel Rogers wrote:
Weight the pixel value by the alpha value, just like you do with any other operation on pixels. This makes sense when alpha is defined to be coverage. If a pixel is only half covered, it should contribute only half its weight to the convolution kernel's response.
And so, if you're blurring with some transparent area, it's equivalent to blurring with black? Doesn't make sense to me -- or am I missing something?
/* Steinar */
Not quite the same. Black is not the same as no information. A little coverage is some information, while no coverage is no information.
It is the same problem you have with blurring near the edges of an image. I think the best way to treat the problem is to declare that there is no data there, and determine the best way to pad your blur (presumably you would use the same padding strategy you used around the edges of the image). You might even go to the trouble of padding partial pixels (e.g., blending the padding pixel with the partially covered pixel). This breaks down, though, when you start to treat coverage as transparency again.
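To make the distinction concrete, here is a minimal sketch (not from the thread itself) of an alpha-weighted 1-D box blur: each sample is weighted by its alpha before convolution, and the result is renormalized by the convolved alpha, so a fully transparent neighbor contributes nothing rather than contributing black. The function name and the fallback for zero total coverage are my own choices.

```python
# Sketch: alpha-weighted 1-D box blur, assuming values in [0, 1] and
# alpha meaning coverage. Transparent samples are "no information",
# not black: they are excluded by the alpha weighting and the
# renormalization step.

def alpha_weighted_blur(values, alphas, radius):
    """Box-blur `values`, weighting each sample by its alpha."""
    n = len(values)
    out = []
    for i in range(n):
        weighted_sum = 0.0
        alpha_sum = 0.0
        for j in range(max(0, i - radius), min(n, i + radius + 1)):
            weighted_sum += values[j] * alphas[j]
            alpha_sum += alphas[j]
        # Renormalize by total coverage in the window; if nothing in
        # the window is covered, there is no data, so keep the
        # original value (one possible padding strategy among several).
        out.append(weighted_sum / alpha_sum if alpha_sum > 0 else values[i])
    return out
```

For example, blurring a fully white row with a transparent hole in the middle leaves it white, whereas a naive blur that treats the transparent pixel as black would darken the neighbors toward 0.5.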
_______________________________________________
Gimp-developer mailing list
[EMAIL PROTECTED]
http://lists.xcf.berkeley.edu/mailman/listinfo/gimp-developer