I don't think you need to apologize. The issue here is that what
you're describing is a characteristic of an individual lens, not of the
lens and format. I don't know of any term for the out-of-focus (OOF)
point of light on a sensor or film other than CoC.
True, in real life the blur of light is usually brighter in the center
with a falloff near the edges, but with certain lens designs, and those
designs don't have to be mirror designs either, the reverse can be true,
with the center dimmer than the edges, which leads to, among other
things, donut bokeh and ring bokeh.
However, a generalized tool would have to assume a "perfect lens"
(which, as an aside, would probably be quite boring to use). A perfect
lens would have no light falloff from center to edge in the circle of
confusion; as the circle grew larger, the light would simply become
uniformly dimmer. All the formulas I've seen assume a perfect lens.
Sure, you could write a function that approximates this falloff for a
particular lens. It might even come close to matching the behavior of
another lens, especially one of a closely related design and
construction, but it wouldn't closely match all lenses, or even most.
I don't think anyone could design such a tool, because it would be
nothing but exception coding based on empirical observations or on
analysis of particular lens designs and their execution.
On 9/16/2019 1:23 AM, Larry Colen wrote:
On Sep 15, 2019, at 9:33 PM, P. J. Alling <[email protected]> wrote:
The "standard" circle of confusion for any format was calculated based
on "acceptable sharpness" for a given print size (which is kind of
arbitrary) viewed at a standard distance, but these values are published.
I think that there is a misunderstanding here. What I was referring to was if
you have a point source of light at the focal distance, it will register on the
sensor/film as a point (in theory). If you move it towards the camera the
light it projects on the sensor will get larger. As long as it is smaller than
the CoC (in the final image) by your definition then it is in focus. Once you
come closer than that, it is no longer defined as within the depth of field.
The same works for moving the point source away from the focal distance as well.
That basically uses a threshold on an analog function to give a boolean (in
focus/out of focus) output. What I’m interested in seeing is the shape of the
curve for the radius of that blur of light. Granted, in order to do that, we
need to come up with some definition of the outer boundary for that blur of
light, since it won’t be a perfect circle of uniform brightness. My
understanding is that it would end up being something like a Gaussian
distribution, brighter in the center and tapering off asymptotically to zero.
I suspect that the curve of the size of the blur versus distance from
the focal distance would look, to a zeroth-order approximation,
something like a parabola. I suspect that the steepness of the sides of
the curve will depend on both focal length and distance, in such a way
that even at nominally the same depth of field, the way things progress
to more out of focus as you move away from the focal plane will differ
with different sensor sizes/focal lengths.
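The curve being described can be sketched with the standard thin-lens model (this is a hedged sketch, not anything from the thread; the function names and the 0.03 mm CoC value are my own choices). The blur-disc diameter on the sensor for a point at distance x, with the lens of focal length f at f-number N focused at distance s, is f² |x − s| / (N · x · (s − f)); thresholding that against a chosen CoC gives the boolean in/out-of-focus answer.

```python
# Hedged thin-lens sketch of blur-disc diameter vs. subject distance.
# All distances in millimetres; parameter names are assumptions.

def blur_diameter_mm(x, s, f, N):
    """Blur-disc diameter on the sensor for a point source at distance
    x, lens focused at s, focal length f, f-number N (thin-lens model)."""
    return (f * f * abs(x - s)) / (N * x * (s - f))

def in_focus(x, s, f, N, coc=0.03):
    """Boolean DoF test: blur no larger than the chosen CoC (mm)."""
    return blur_diameter_mm(x, s, f, N) <= coc

# Example: 50 mm lens at f/2 focused at 3 m. At x == s the blur is
# zero; it grows steeply toward the camera, more gently behind.
for x in (1500, 2000, 2500, 3000, 4000, 6000):
    print(x, round(blur_diameter_mm(x, s=3000, f=50, N=2), 4))
```

Note the asymmetry the model predicts: in front of the focal plane the blur diverges as x shrinks, while behind it the curve flattens toward an asymptote, which bears on the intuition that the far side of the DoF is "softer" than the near side.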
If the word for the size of the out of focus blur projected by that point
source of light onto the sensor is not circle of confusion, I apologize.
--
Larry Colen
[email protected]
--
America wasn't founded so that we could all be better.
America was founded so we could all be anything we damn well please.
- P.J. O'Rourke
--
PDML Pentax-Discuss Mail List
[email protected]
http://pdml.net/mailman/listinfo/pdml_pdml.net
to UNSUBSCRIBE from the PDML, please visit the link directly above and follow
the directions.