On 20/10/06, Mark Roberts <[EMAIL PROTECTED]> wrote:
> Brendan MacRae wrote:
> In the case of a scanner you're pretty safe using either term, DPI or
> PPI: A scanner looks at one tiny area (a dot) on the physical medium
> and generates one digital picture element (pixel) from it. So at this
> particular boundary between the digital and analog realms, one dot =
> one pixel, hence PPI and DPI are equivalent.

All my scanners' literature quotes dpi ("6400 dpi film scanning",
"Optical resolution 5400 dpi", "4,000 dpi true optical resolution"),
and as you suggest, PPI and DPI are generally treated as interchangeable
at the sampling point.

> But once you have a digital file, DPI makes no sense because there are
> no "inches" to a digital file.

This I don't agree with. Most image formats have provisions for scaling
information; just because most people don't use or understand it doesn't
mean it makes no sense. If the true scan resolution is embedded in the
file, you can manage scaling between images to ensure uniformity, and
correct scaling is very important in the print industry (a rough sketch
of the idea follows below my sig).

> And at the final digital/analog boundary, the printer, DPI and PPI are
> still not equivalent because inkjet printers make "dots" in separate,
> individual colors and at DPI resolutions (1440 DPI - 2880 DPI) far
> beyond what's sensible for PPI output res of a digital file.

Yes, this is the point of confusion for those new to digital workflow.

--
Rob Studdert
HURSTVILLE AUSTRALIA
Tel +61-2-9554-4110 UTC(GMT) +10 Hours
[EMAIL PROTECTED]
http://home.swiftdsl.com.au/~distudio//publications/
Pentax user since 1986, PDMLer since 1998
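
PS: To make the file-scaling point concrete, here is a minimal sketch of
the idea in Python using the Pillow library (my choice purely for
illustration, not something anyone in the thread uses); the file names,
the 4000 dpi fallback and the 300 PPI target are placeholders:

  from PIL import Image

  TARGET_PPI = 300  # hypothetical house print resolution

  img = Image.open("scan.tif")             # e.g. a film scan
  dpi = img.info.get("dpi", (4000, 4000))  # fallback if no tag is embedded

  # Physical size of the original, recovered from the embedded resolution
  width_in = img.width / float(dpi[0])
  height_in = img.height / float(dpi[1])

  # Resample so every image prints at the same PPI, i.e. at its true size
  resized = img.resize((round(width_in * TARGET_PPI),
                        round(height_in * TARGET_PPI)), Image.LANCZOS)
  resized.save("print_ready.tif", dpi=(TARGET_PPI, TARGET_PPI))

  print(f"{width_in:.2f} x {height_in:.2f} inches at {TARGET_PPI} PPI")

The printer's own DPI is a separate matter: a 1440 dpi inkjet lays down
roughly 1440 / 300 = 4.8 ink dots per image pixel along each axis, which
is exactly why the two numbers shouldn't be compared directly.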

