From a comment thread on e-catworld:

http://www.e-catworld.com/2013/05/rossi-no-longer-controls-e-cat-business/

Pekka Janhunen on May 27, 2013 at 8:01 pm

Off-topic for the thread: the question of whether assuming emissivity equal
to one indeed leads to an underestimation of the radiated power. Now I think
yes, but to deduce it one has to know something about the Optris PI-160
thermal camera. The specs of the camera say that its wavelength range is
7.5-13 microns. For the relevant HotCat temperatures, this lies on the long
wavelength side of the Planck emission function (i.e. to the right of the
peak in wavelength space). If the true emissivity is below unity, say 0.8,
and one inputs 1.0 into the camera (as the testers did), the camera
basically integrates the emission over its wavelength band and then finds a
source temperature whose blackbody band integral, multiplied by the assumed
emissivity 1.0, reproduces the measured one. Because the real emissivity is
smaller, this yields an underestimation of the temperature, as the testers
said.
The total radiative emission (the integral of the Planck function over all
wavelengths) is emissivity*sigma*T^4. The real emissivity is less than
unity, which tends to make the true emitted power smaller than the
estimated one. But the real temperature is higher, which works in the
opposite direction. Which effect wins is not immediately clear. I tested it
numerically for the temperatures 300-400 C used in the test: I numerically
integrated the Planck function over the range 7.5-13 um at the true
temperature and emissivity, then adjusted an apparent source temperature
until its band integral (computed with emissivity 1.0) matched that value
(the full story is a bit more complicated, but this is the general idea).
It is indeed so that with these numbers, the net effect is an
underestimation of the emitted power.
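
[A minimal numerical sketch of the procedure described above, not part of
the original comment. It assumes a gray body and treats the camera as
reporting the temperature whose blackbody radiance, integrated over the
7.5-13 um band, matches the measured band radiance with the emissivity
setting at 1.0:]

import numpy as np

H = 6.62607015e-34    # Planck constant, J s
C = 2.99792458e8      # speed of light, m/s
KB = 1.380649e-23     # Boltzmann constant, J/K

def planck(lam, T):
    """Spectral radiance B(lam, T) in W / (m^2 sr m)."""
    return (2.0 * H * C**2 / lam**5) / np.expm1(H * C / (lam * KB * T))

def band_integral(T, lam_lo=7.5e-6, lam_hi=13.0e-6, n=4000):
    """Planck function integrated over the camera band (trapezoid rule)."""
    lam = np.linspace(lam_lo, lam_hi, n)
    y = planck(lam, T)
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(lam)))

def apparent_temperature(T_true, eps, T_lo=200.0, T_hi=1500.0):
    """Temperature the camera reports with its emissivity set to 1.0,
    found by bisection: the blackbody band integral at T_app must equal
    eps times the band integral at the true temperature."""
    target = eps * band_integral(T_true)
    for _ in range(60):
        T_mid = 0.5 * (T_lo + T_hi)
        if band_integral(T_mid) < target:
            T_lo = T_mid
        else:
            T_hi = T_mid
    return 0.5 * (T_lo + T_hi)

eps = 0.8
T_true = 400.0 + 273.15                 # 400 C in kelvin
T_app = apparent_temperature(T_true, eps)
# Estimated total power is sigma*T_app^4 (with eps=1); the true power is
# eps*sigma*T_true^4, so sigma cancels in the ratio.
ratio = T_app**4 / (eps * T_true**4)
print(f"apparent temperature: {T_app - 273.15:.0f} C")
print(f"estimated/true emitted power: {ratio:.3f}")  # should land near 0.889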

The result is understandable, I think, in the following way. When the camera
measures an integral of the emission function above the peak, the real
temperature being higher than the reported one shifts the real Planck
function towards shorter wavelengths, i.e. further away from the camera's
wavelength range. Thus the camera sees a smaller fraction of the radiated
infrared energy than it assumes based on its own idea of the source
temperature, which rests on taking epsilon=1 instead of the correct value.
The numerical integration also shows that if the camera's wavelength range
were below the peak, an overestimation of the emitted power would result.
But that regime is never entered in this case, because it would correspond
to roughly room temperature.
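
[Another small sketch, also not part of the original comment, checking
where the Planck peak sits relative to the 7.5-13 um band via Wien's
displacement law, taking b ~ 2898 um K:]

B_WIEN_UM_K = 2898.0  # Wien displacement constant, um K

for T_celsius in (25, 300, 400):
    lam_peak_um = B_WIEN_UM_K / (T_celsius + 273.15)
    print(f"T = {T_celsius:3d} C -> Planck peak at {lam_peak_um:.1f} um")

# ~9.7 um at 25 C (inside the 7.5-13 um band), but ~5.1 um at 300 C and
# ~4.3 um at 400 C, well below the band's 7.5 um lower edge.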

If the true epsilon is 0.8 and the true temperature 400 C, I got an
underestimation factor of 0.889 for the emitted power. That is, the
underestimation is not likely to be large, but it is at least of the
correct sign, as the testers asserted.
