GSR - FR wrote:

[EMAIL PROTECTED] (2006-03-19 at 2339.24 +0100):

... and is there any difference in picture if I set:

100x100 in 400dpi
200x200 in 100dpi

and make a print from it (in same size - 10cm x 10cm for example)?

Then it is not 400 or 100 DPI, but (rounding 1 inch = 2.5 cm) a 4
inch print, so you have printed 100 pixels over 4 inches and 200
pixels over 4 inches, which works out to 25 and 50 DPI respectively.
Maybe you are confusing this with the printer's DPI (300, 720,
1440...), but those are not pixels, they are ink dots. The print
system has to convert the file/screen pixels (think of them as
squares or rectangles with different levels of intensity) into ink
dots, which are either on or off (a single level of intensity), so
the printer creates patterns of dots to simulate the intensity levels
when viewed from "far away".
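The arithmetic above can be sketched in a few lines of Python; the helper name is my own, not part of any printing API:

```python
# Hypothetical helper: effective print resolution (PPI) when a given
# number of pixels is spread over a given physical print width.
def effective_ppi(pixels, print_inches):
    """Pixels per inch for `pixels` printed across `print_inches`."""
    return pixels / print_inches

# 10 cm is roughly 4 inches (rounding 1 inch = 2.5 cm, as above)
print(effective_ppi(100, 4))  # 25.0 -- the 100x100 image
print(effective_ppi(200, 4))  # 50.0 -- the 200x200 image
```

The point is that the effective resolution depends only on the pixel count and the physical print size; the DPI number stored in the file has no effect once you force both prints to 10 cm.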

I have been trying to understand this better myself, and I couldn't
find any good help on it. My goal was to determine the maximum image
size in pixels that would print exactly on letter-size paper with no
scaling. I built a resolution test image and found that the highest
resolution at which I could still see a 1-pixel line on my printer
was about 150 pixels/inch. This is on Windows XP.

I see that in at least some file formats, the resolution setting is
saved in the metadata as DPI. But when I print the image using
different applications, I get different results; the apps seem to use
or package the data differently for the driver. For example, one app
has print-dialog options for "fit pixels", "fit resolution", and "fit
to page". I haven't figured out yet exactly what these do (other than
"fit to page", which obviously scales the data up or down as
required).
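For what it's worth, the letter-size calculation implied above is straightforward; this sketch uses the 150 PPI figure as an empirical limit for one particular printer, not a universal constant:

```python
# Maximum pixel dimensions that fill US letter paper with no scaling,
# given a target print resolution. 150 PPI is the measured limit from
# the test described above; substitute your own printer's value.
PPI = 150
letter_w_in, letter_h_in = 8.5, 11.0   # US letter size, inches

max_w = int(letter_w_in * PPI)
max_h = int(letter_h_in * PPI)
print(max_w, max_h)  # 1275 1650
```

In practice most printers also impose unprintable margins, so the usable area (and hence the pixel budget) is somewhat smaller than the full sheet.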

My printer also has a poster mode (2x2, 3x3, 4x4).  As best I can
determine, all this does is first scale the image to a single page in
the app, and then the printer just zooms it (I don't know if it does
any sort of interpolation, but I doubt it).

scott s.

Gimp-user mailing list
