This does actually look like it's usable for general photography, not
just microscopes.

For some time I've been thinking that digital cameras must have
unrealized potential.  For instance, the human eye has aberrations that
increase the depth of field, and it jerks around a lot even when fixed
on one thing, yet what we perceive is a processed image.  I thought
that if the aberrations, contrast, and other characteristics of a lens
were known and communicated to the camera, then barrel distortion could
be straightened out in software, and so on.  Then maybe lenses with new
capabilities could be made, like 20x zooms that still produce nice
pictures, or at least much cheaper lenses.
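
Just to make that concrete, here's a rough sketch of what
"straightening out barrel distortion in software" might look like
(Python with OpenCV; the filename and the lens profile numbers are made
up -- the point is that a smart lens would report its own coefficients):

    import cv2
    import numpy as np

    img = cv2.imread("shot.jpg")          # hypothetical input frame
    h, w = img.shape[:2]

    # Made-up intrinsics; imagine the lens reporting these itself.
    K = np.array([[0.8 * w, 0.0,     w / 2.0],
                  [0.0,     0.8 * w, h / 2.0],
                  [0.0,     0.0,     1.0]])

    # Brown-Conrady coefficients (k1, k2, p1, p2, k3); a negative k1
    # is barrel distortion.
    dist = np.array([-0.15, 0.02, 0.0, 0.0, 0.0])

    cv2.imwrite("shot_corrected.jpg", cv2.undistort(img, K, dist))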

Or a different kind of image stabilization: take a 1/4 second video
clip of a moving object, identify all the edges (the legs, say), and
warp the frames in software so the edges all fall onto the midpoint.
It might give screwy looking results sometimes, but it might also give
well exposed images of things that would never be more than a blur
when taken by traditional methods.
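
A toy version of the align-and-stack half of that idea, assuming the
burst is already a list of same-sized float32 grayscale frames
(translation-only alignment by phase correlation, which is much cruder
than warping edges onto a midpoint, but the flavor is the same):

    import cv2
    import numpy as np

    def stack_burst(frames):
        # Align every frame to the middle one, then average.
        ref = frames[len(frames) // 2]
        acc = np.zeros_like(ref)
        for f in frames:
            # Sub-pixel shift of f relative to ref (OpenCV's sign
            # convention), undone with a translation warp.
            (dx, dy), _ = cv2.phaseCorrelate(ref, f)
            m = np.float32([[1, 0, -dx], [0, 1, -dy]])
            acc += cv2.warpAffine(f, m, (f.shape[1], f.shape[0]))
        return acc / len(frames)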

Or if not that, it seemed digital cameras must have possibilities that
are being completely overlooked, because people still think of them as
basically film cameras with a CCD instead of film.

Wavefront coding is a new one for me, but it looks like the same
concept.  It seems you can get greatly increased depth of field without
stopping down the aperture, by matching the lens to the whole system
and processing the image.
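
If I understand it right, the trick is that a phase mask makes the blur
roughly the same at every depth, so one fixed deconvolution step can
sharpen the whole range at once.  The processing half is essentially a
Wiener filter; a minimal sketch, assuming the system's PSF is known:

    import numpy as np

    def wiener_deconvolve(blurred, psf, snr=100.0):
        # Embed the PSF in a full-size array, centered on the origin.
        kernel = np.zeros_like(blurred, dtype=np.float64)
        ph, pw = psf.shape
        kernel[:ph, :pw] = psf
        kernel = np.roll(kernel, (-(ph // 2), -(pw // 2)), axis=(0, 1))

        H = np.fft.fft2(kernel)
        G = np.fft.fft2(blurred)
        # H* / (|H|^2 + 1/SNR) rather than dividing by H outright, so
        # noise doesn't blow up where H is near zero.
        F = (np.conj(H) / (np.abs(H) ** 2 + 1.0 / snr)) * G
        return np.real(np.fft.ifft2(F))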
