I brought this up in an internal MFMP discussion and I would be interested
in getting input from the Vorts on the topic.

I have been thinking about thermocouple measurements with a reactor tube or
a glow stick.  I now believe it is a bad idea to use the core temperature
inside the tube as the metric of performance.  The reason is that this
temperature may be only loosely related to what you are actually trying to
gauge - the output heat of the device.  We know from analysis of the Lugano
experiments that the emitted heat is determined by the surface temperature,
both for radiation and convection.  The same cannot be said for internal
temperature measurements.
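
To make that concrete, here is a rough Python sketch of how a surface
temperature translates into output power through radiation plus convection.
The emissivity, tube dimensions and convection coefficient are illustrative
guesses on my part, not values from any particular run:

import math

SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/m^2K^4

def output_power(t_surface_c, t_ambient_c=25.0,
                 emissivity=0.8,       # assumed alumina emissivity
                 diameter_m=0.025,     # assumed tube OD
                 length_m=0.3,         # assumed radiating length
                 h_conv=12.0):         # assumed natural convection coeff, W/m^2K
    """Radiated plus convected power from a cylindrical tube surface."""
    area = math.pi * diameter_m * length_m
    ts = t_surface_c + 273.15
    ta = t_ambient_c + 273.15
    p_rad = emissivity * SIGMA * area * (ts**4 - ta**4)
    p_conv = h_conv * area * (ts - ta)
    return p_rad + p_conv

print(round(output_power(900.0)), "W for a 900 C surface under these assumptions")

The point is that once you know the surface temperature, the output power
follows directly; no such direct relation exists for the core temperature.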
Imagine for the moment that you had two tubes, one with a 2 mm thick wall
and one with a 3 mm thick wall.  The one with the thicker wall provides
more insulation to the outside, so its core temperature will be higher for
the same excess heat.  This is an extreme example, but each of the
typically used tubes has a different wall thickness because of different
shrinkage during firing.
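
Here is a rough sketch of the size of that effect, treating the wall as
simple radial conduction through a cylinder, dT = Q*ln(ro/ri)/(2*pi*k*L).
The bore radius, alumina conductivity and power level are assumed values,
chosen only to show that the same output gives different core temperatures:

import math

def core_minus_surface(power_w, wall_mm,
                       bore_radius_mm=5.0,  # assumed bore radius
                       k_alumina=6.0,       # assumed conductivity at temperature, W/mK
                       length_m=0.3):       # assumed heated length
    """Temperature drop across a cylindrical wall carrying power_w radially."""
    ri = bore_radius_mm / 1000.0
    ro = ri + wall_mm / 1000.0
    return power_w * math.log(ro / ri) / (2 * math.pi * k_alumina * length_m)

for wall in (2.0, 3.0):
    print(wall, "mm wall:", round(core_minus_surface(500.0, wall)),
          "K above the surface for the same 500 W")

So two tubes putting out identical power read different core temperatures
simply because their walls shrank differently.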
Another concern is that the dummy run and the fueled run will not be
equivalent.  Suppose you load the dummy with just the Ni powder or with
nothing at all.  The thermal contact with the thermocouple will be poor.
Now imagine adding the Ni + LiAlH4.  When the Li melts, it will begin
conducting heat to the thermocouple at a much greater rate, and this
conductance will change with each of the decomposition stages of the
LiAlH4.  This will create a difference in thermometry between the dummy
run and the fully fueled run.
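
A toy lumped model shows how far the reading can move from contact alone.
Here I assume the junction exchanges heat with the core through a contact
conductance and leaks some heat down its own sheath toward the cold end;
both conductances are made-up numbers, and only their ratio matters:

def tc_reading(t_core_c, t_cold_end_c=100.0, g_contact=1.0, g_sheath=0.05):
    """Steady-state junction temperature as a conductance-weighted average."""
    return (g_contact * t_core_c + g_sheath * t_cold_end_c) / (g_contact + g_sheath)

# Same 1000 C core, poor contact (dry powder) vs good contact (molten Li wetting the TC)
print(round(tc_reading(1000.0, g_contact=0.2)), "C indicated with poor contact")
print(round(tc_reading(1000.0, g_contact=5.0)), "C indicated with good contact")

With these numbers the indicated temperature shifts by well over 100 K
without any change in the actual heat being produced.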
I think measuring the temperature at the outside of the tube would be a
more accurate metric, since it is far more representative of the output
heat.  You can still measure the core temperature, but its interpretation
may be a little obscure, so don't use it as a metric of output heat, even
for comparisons.
