"Mighty Chimp" wrote on 2003-11-09 17:32 UTC:
> In a recent posting between myself and Han of the Netherlands, we
> discussed the use of dpi in computer printers.
> 
> I made the assumption that the use of dpi is not part of a printer's
> hardware, but of its software, either through its drivers or the word
> processing programmes.  Could you tell us if this is actually true or not?

The dpi resolution is a property of both the printer's hardware and
firmware (= software permanently stored in chips). If you have a 1200
dpi laser printer, then the time in which the drum surface and paper
move by 25.4 mm is identical to the time that the laser needs to scan
1200 times across the drum.

The exact ratio between the laser scan frequency and the drum velocity
is controlled by the microprocessor that monitors and controls the speed
of both motors. Its firmware could in principle be changed by the
manufacturer very easily to increase the resolution to 1270 dpi, that
is a dot pitch of exactly 20 micrometers. Perhaps one or two components
(e.g., a crystal oscillator) might have to be replaced as well, but the
change would be very easy to make in a newly designed model.
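The arithmetic behind those two figures can be checked in a couple of lines (a minimal sketch; the function name is mine, not from the original posting):

```python
MM_PER_INCH = 25.4

def dot_pitch_um(dpi):
    """Centre-to-centre dot spacing, in micrometres, for a given dpi."""
    return MM_PER_INCH / dpi * 1000.0

# 1200 dpi gives an awkward pitch of about 21.17 micrometres,
# while 1270 dpi gives a round 20 micrometres exactly.
print(dot_pitch_um(1200))
print(dot_pitch_um(1270))
```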

The primary reason why manufacturers haven't done that can only be
inertia. That might be amplified by the fact that the language used to
control most professional printers, PostScript, uses by default a
coordinate system in which 72 length units ("points") make up one inch.
The coordinate system can of course be changed trivially with
PostScript's "scale" command to any units you like, so for the
author of application software and printer drivers there is no strong
reason to stick to inch-related units in any way.
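To switch PostScript's user space to millimetres one issues "s s scale" with s = 72/25.4, the number of points per millimetre; the little computation below shows the factor involved (a sketch of the arithmetic only, variable names are mine):

```python
# PostScript's default user space has 72 units ("points") per inch.
# The idiom "72 25.4 div dup scale" rescales it so that one user
# unit corresponds to exactly one millimetre.
POINTS_PER_INCH = 72
MM_PER_INCH = 25.4

points_per_mm = POINTS_PER_INCH / MM_PER_INCH  # about 2.8346
print(points_per_mm)
```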

http://www.cl.cam.ac.uk/~mgk25/metric-typo/

Markus

-- 
Markus Kuhn, Computer Lab, Univ of Cambridge, GB
http://www.cl.cam.ac.uk/~mgk25/ | __oo_O..O_oo__
