Adrian Jadic wrote: "The first and simplest example is the computer screen resolution. Although the pixel size can be easily expressed in mm (0.15..etc.) as it is in Europe the computer manufacturers have invented the dpi. Then they invented the bps (baud per second) then the mips, the flops, the ppm (pages per minute) and God knows how many are there that I don't know about."
First, a pixel or a dot is an arbitrary element, not a unit of measure. In expressing resolution in dots per inch, the industry is not creating a new unit. Obviously, we would rather the measurement were dot/cm. In defining the resolution of a display, the significant measurement is dot pitch (i.e., the distance from the center of one pixel to the center of the adjacent pixel). That is specified only in millimeters (0.28 mm being a common value). Monitor advertisements specifying "dpi" are written by people who don't know what they're talking about, not by the manufacturers of the monitors. (Most ads do get it right, however, and say "dot pitch: 0.28 mm".)

Again, the bit is an arbitrary element (the binary digit), not a unit of measure. The term bps (bits per second -- a baud is already a rate, defined as one change of state per second) is atrocious, of course, but the standards bodies express it, correctly, as bit/s.

Again, instructions (mips being "million instructions per second" and flops being "floating-point operations per second") are not units of measure.

Still again, pages are arbitrary. If one is buying a laser printer, it's certainly necessary to know its performance in pages per minute.

To reiterate, none of these terms introduces a new unit of measure. They simply use existing units of measure in conjunction with arbitrary, but necessary, elements or characteristics.

Bill Potts, FBCS, CMS
Roseville, CA
http://metric1.org [SI Navigator]
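P.S. The relationship between "dpi" and dot pitch is a simple reciprocal, since the inch is defined as exactly 25.4 mm. A small sketch (the function names are mine, and it assumes one dot per pixel):

```python
# 1 inch is defined as exactly 25.4 mm.
MM_PER_INCH = 25.4

def dpi_to_dot_pitch_mm(dpi: float) -> float:
    """Center-to-center dot pitch in mm for a given dots-per-inch figure."""
    return MM_PER_INCH / dpi

def dot_pitch_mm_to_dpi(pitch_mm: float) -> float:
    """Dots-per-inch figure for a given dot pitch in mm."""
    return MM_PER_INCH / pitch_mm

# A common 0.28 mm dot pitch works out to a little over 90 dots per inch.
print(dot_pitch_mm_to_dpi(0.28))
```

Note that the "unit" in each function is the millimeter (or the inch); the dot itself is just the counted element.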
