> If anyone was worried about terminology being corrupted by digital imaging,
> then the term "Gamma" (in digital processes) should have come under assault
> years ago.  I have ~never~ seen a change of gamma of a monitor or of image
> attributes in a picture editor even remotely resemble a change of developed
> gamma of a film.  Somehow I think the propellerheaded software developers
> had no idea what gamma meant in photography, and reinvented it for their own
> purposes.

I don't think you can really blame software developers for this.  The use of
gamma in digital image processing corresponds pretty closely to the meaning
of gamma as used by electronics engineers to describe (or approximate) the
variation in output brightness of a CRT as a function of the input voltage:
brightness varies roughly as the voltage raised to the power gamma, which is
around 2.2 for a typical CRT.
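To make the electronics-engineering usage concrete, here is a minimal sketch (in Python, with the function names my own invention) of the CRT power-law model and the inverse "gamma correction" curve that image software applies:

```python
# Sketch of the engineers' gamma model: a CRT's output luminance is
# approximately the input voltage raised to a power, L = V ** gamma,
# with V normalized to 0..1 and gamma around 2.2 for a typical tube.
# An image editor's "gamma" adjustment typically applies the inverse curve.

def crt_response(v, gamma=2.2):
    """Approximate normalized CRT luminance for normalized input voltage v."""
    return v ** gamma

def gamma_correct(v, gamma=2.2):
    """Pre-distort a linear value so the display's response cancels out."""
    return v ** (1.0 / gamma)

# Pre-distorting and then displaying recovers the original linear value:
v = 0.5
roundtrip = crt_response(gamma_correct(v))
```

This power-law curve is a tone-reproduction exponent, which is a rather different object from the slope of a film's characteristic (density vs. log exposure) curve -- hence the terminological collision.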

If this doesn't match the use of gamma in film-based photography then the
confusion was spawned long before the days of digital imaging packages.
