On Wed, Dec 15, 2010 at 3:52 AM, Larry Colen <[email protected]> wrote:
> You're taking the five bits off the top, not the five bits off the bottom.
> But you've got a point there.
OK, thinking about it a bit more, you're asking about additive light,
such as a veil of light scattered by fog. The original scene had some
modest dynamic range (e.g. 5 stops) but once you add the veil of
light, the light intensity seen by the camera has a much smaller
dynamic range when expressed in stops (e.g. log2(2080/2048) in your
example).
The issue with additive light is basically the same as "Why can't I
see the stars in the daytime?" There's a bright star in the sky that
you can see easily at night. In the daytime, you're still getting all
the photons from that star, but you can't see it because of the sky.
But why? The answer is not trivial, and it has to do with the physics
of light, rather than the way sensors or eyeballs work.
If you wanted to model additive light naively in Photoshop or Matlab,
you could add, say, 2000 counts to every pixel. Then, assuming you
didn't saturate any pixels, you could go back through, and subtract
2000 counts from every pixel, and recover the original scene. I think
this is what you propose the camera do for you, more or less.
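As a quick illustration (my own sketch, not from the original discussion), the naive model really is perfectly reversible when the arithmetic is exact, which is why it seems like the camera should be able to undo the veil:

```python
# Naive "veil of light" model: add a constant to every pixel, then
# subtract it back. With exact, noiseless arithmetic this is fully
# reversible -- which is precisely why it fails to describe real
# scattered light, as explained below.
scene = [10, 250, 1900, 64]          # made-up pixel values in ADU
veiled = [p + 2000 for p in scene]   # add a 2000-count veil
recovered = [p - 2000 for p in veiled]
print(recovered == scene)  # -> True
```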
The problem is that the "add 2000 counts to every pixel" model isn't
how the real world works. Light consists of discrete photons, which
obey photon statistics. If you have a source that emits, on average,
100 photons/sec, that doesn't mean that photons come evenly spaced in
time, 0.01 seconds apart, like cars on a train. Instead, during a
small interval of time, like a microsecond, you have a certain
probability of observing a photon. In the example of 100 photon/sec,
in one microsecond you have a p=100*1e-6 = 1e-4 probability of
observing a photon, and this is independent of what happened in the
previous microsecond, or how long it's been since the last photon. In
these short windows of time, photon arrival is a binomial process
("coin flip" with a biased coin), which leads to
exponentially-distributed intervals between photon arrivals, and
Poisson-distributed counts over longer periods of time. For larger
photon counts, the Poisson distribution is well approximated by a
normal (Gaussian) distribution, with a standard deviation equal to the
square root of the mean. For example, in 1 second, you collect an
average of 100 photons, with a standard deviation of 10 photons. So
in a sequence of five 1-second exposures, or in five adjacent pixels
of an array, your recorded counts might be 107, 89, 101, 106, 91. (I
used an online RNG to create those example counts.) This variation is
usually called "photon shot noise."
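If you want to play with this yourself, here's a small Python simulation (mine, not part of the original example) that generates photon counts the same way: draw exponentially-distributed waiting times between arrivals and count how many land in each exposure window. The spread of the counts comes out close to sqrt(mean), as described above.

```python
import random
import statistics

random.seed(1)  # arbitrary seed so the run is repeatable
RATE = 100.0    # mean photons per second

def count_photons(duration=1.0, rate=RATE):
    """Count photon arrivals in one exposure of the given duration,
    using exponentially-distributed inter-arrival times."""
    t, n = 0.0, 0
    while True:
        t += random.expovariate(rate)  # waiting time to next photon
        if t > duration:
            return n
        n += 1

# Five 1-second exposures, each averaging ~100 counts:
counts = [count_photons() for _ in range(5)]
print(counts)

# Over many exposures, the standard deviation approaches sqrt(100) = 10:
many = [count_photons() for _ in range(10000)]
print(round(statistics.mean(many)), round(statistics.stdev(many)))
```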
Now let's work an example with additive light. Since this is a
thought experiment and we want the math to be easy, we'll use a
Gedankendetektor 16, which has a 16-bit ADC, perfect quantum
efficiency, and unity electron gain. That is, every photon hitting a
pixel excites one electron, and every electron reads out as 1 ADU (A/D
unit, also called DN "digital number" or just "count"). All the way
from 1 photon = 1 e- = 1 ADU up to 65535 photons = 65535 e- = 65535
ADU. The Gedankendetektor 16 has a read noise of 5 e-/pixel = 5
ADU/pixel.
Now we observe a star. The light from the star falls on a single
pixel. The "sky" is perfectly dark--there is no additive (scattered)
light. During our exposure, say 1 minute, we record 100 photons = 100
ADU from the star. Our read noise is 5 ADU, so our signal-to-noise is
100/5 = 20. The star is well-detected and clearly visible.
Now we add scattered light--sky brightness. During the day, we have a
lot of scattered light from the sun, giving us a blue sky. During a
moonlit night, we have less scattered light than in the daytime.
During a moonless night, we have less than that, but still some from
atomic and molecular transitions in the atmosphere and light
pollution.
During the same 1 minute exposure, we still get 100 ADU from the star,
but our sky brightness, which is added to that, averages 1,000
ADU/pixel. We can measure and subtract off the mean level of 1,000
ADU, but the photon shot noise, sqrt(1000) ADU/pixel = 32 ADU/pixel,
remains. Our signal to noise is now 100/32 = 3. The star is 3-sigma
above the background noise. This is not very good, and would not
usually be considered a detection. (Remember, our star is actually on
top of noise; if it happens to be on a high point of the noise, we
might measure that pixel 4- or 5-sigma above background; but it's just
as likely to fall on a low point and measure 1- or 2-sigma above
background, which is deep in the noise.)  (Note: We still have
the 5 ADU read noise, but it's small compared to the photon shot
noise, and when you add them in quadrature, they still round to 32
ADU.)
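The arithmetic in that example is easy to check. This little Python snippet (mine, using the numbers above) computes the sky shot noise, adds the read noise in quadrature, and gets the same S/N of 3:

```python
import math

# Worked example from the text: 100 ADU of star signal on a sky of
# 1000 ADU/pixel, with 5 ADU/pixel read noise. The mean sky level can
# be measured and subtracted, but its shot noise cannot.
star = 100.0       # ADU from the star
sky = 1000.0       # ADU/pixel of sky background
read_noise = 5.0   # ADU/pixel

sky_shot = math.sqrt(sky)                       # ~31.6 ADU
total_noise = math.hypot(sky_shot, read_noise)  # add in quadrature
print(round(total_noise))         # -> 32 (read noise barely matters)
print(round(star / total_noise))  # S/N -> 3
```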
I want to stress that this dramatic reduction in signal-to-noise from
scattered light is a physical reality of the light incident on our
sensor. I've made up a sensor to give us some numbers, but nothing in
this result depends on the sensor characteristics. The sensor could
make things worse, but on the whole you're being screwed over by
physics, not by your sensor engineering or signal processing
techniques.
So how do you do better than S/N = 3? You collect more photons, by
exposing longer or combining multiple exposures. Let's say we double
the exposure. Now we have 200 ADU of signal, and 2000 ADU/pixel of
sky. The photon shot noise from the sky is sqrt(2000) = 45, and our
signal-to-noise has improved to 4.5. This is a general result.
Doubling your exposure results in a sqrt(2) improvement in
signal-to-noise. It's a long hill to climb. Astronomers are often
studying objects that are hundreds or thousands of times fainter than
the sky brightness in front of them, even under the darkest skies on Earth.
This is why we need big telescopes and very long exposures, and why we
build space telescopes (where the sky is much darker).
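The sqrt(2)-per-doubling scaling is easy to see numerically. Extending the running example (my sketch, with the same made-up rates of 100 ADU/min from the star and 1000 ADU/min/pixel of sky):

```python
import math

# Sky-limited S/N scales as sqrt(exposure time): signal grows linearly
# with time, while sky shot noise grows only as the square root.
star_rate, sky_rate = 100.0, 1000.0  # ADU per minute, per the example

for minutes in (1, 2, 4, 16):
    signal = star_rate * minutes
    noise = math.sqrt(sky_rate * minutes)  # read noise neglected here
    print(minutes, round(signal / noise, 1))
# -> 1 3.2 / 2 4.5 / 4 6.3 / 16 12.6
```

Note that going from S/N = 3 to a comfortable S/N ~ 12 cost a factor of 16 in exposure time, which is why faint-object astronomy eats telescope time so quickly.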
In the example, I have neglected photon shot noise in the object of
interest; I assumed we always record 100 photons from it, but this is
not true. Its counts will vary from exposure to exposure and
contribute additional photon shot noise.
So, tl;dr version: Poor image quality in the presence of scattered
light (fog, dirty lens) is mostly due to photon statistics, rather
than sensor characteristics.
--
PDML Pentax-Discuss Mail List
[email protected]
http://pdml.net/mailman/listinfo/pdml_pdml.net
to UNSUBSCRIBE from the PDML, please visit the link directly above and follow
the directions.