On Aug 5, 2005, at 4:43 AM, Kenneth Waller wrote:
What I was really asking was along the lines of: when making a final determination about the sharpness of a digital image on the monitor, at what magnification do people here go to make that determination?
I do the sharpening with the image at 100-200% on screen for its *final output resolution*. In other words, if I'm going to print, I look at it at 100-200% of its actual original resolution, and if it's going to the web, I look at it at 100-200% of its downsampled web resolution. This lets me accurately see the growth of haloes, highlight degradation, and other artifacts introduced by the image processing, through USM and deblur convolutions.
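To make that concrete, here's a rough sketch in Python with Pillow (my own illustration, not anything from this thread; the file name, web size, and USM values below are placeholders to experiment with):

    from PIL import Image, ImageFilter

    img = Image.open("capture.tif")  # placeholder file name

    # Downsample to the intended web size *before* sharpening, so haloes
    # and other artifacts are judged at the final output resolution.
    web = img.resize((800, 533), Image.LANCZOS)

    # Unsharp mask: radius in pixels, percent = amount, threshold protects
    # smooth areas. These numbers are illustrative starting points only.
    web = web.filter(ImageFilter.UnsharpMask(radius=1.0, percent=120, threshold=2))

    # Inspect at 100% (actual pixels) and 200% to watch halo growth along
    # high-contrast edges and any highlight clipping.
    web.show()
    web.resize((web.width * 2, web.height * 2), Image.NEAREST).show()

The NEAREST upscale at the end is deliberate pixel-doubling for inspection; a smooth resampler there would hide exactly the artifacts you're looking for.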
As others have said, the sharpening required is, in general, pretty much a constant, although there is a bit of scene dependency: "high frequency" scenes need different sharpening techniques than "low frequency" scenes, scanned film requires different sharpening techniques than digital captures, and so on.
I evaluate the sharpness of the image for printing *after sharpening* by scaling it on my screen to be close to the print resolution. It's not perfect, but it's generally a good exercise that allows a reasonable amount of pre-visualization without wasting a sheet of paper. On my screen it works best for larger prints, in the 8x10 to 11x17 range.
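The zoom that approximates print size is just the ratio of monitor ppi to print ppi. A quick sketch, with ppi numbers that are mine rather than from this thread:

    def print_preview_zoom(monitor_ppi, print_ppi):
        """On-screen zoom % at which one printed inch spans about one screen inch."""
        return monitor_ppi / print_ppi * 100

    # e.g. a ~96 ppi monitor previewing a 300 ppi print:
    print(round(print_preview_zoom(96, 300), 1))  # -> 32.0, so view near 33%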
With a scanned image you always have the original image (slide, neg, print) to view.
It's somewhat useful as a reference, but I rarely bother looking at the film image past selecting it for scanning. After it's scanned, I use the screen image to evaluate sharpness ... scanning and then displaying the image at 1:1 resolution gives a much more accurate view of the sharpness, to my eye.
Godfrey

