Stephen A. Lawrence <[email protected]> wrote:

> In particular, the input water
> was tap water, with no temperature control, with a very low initial
> temperature, and the temp was measured only at the start and end of the
> run, allowing the possibility that the input temperature varied by some
> amount during the run.


No, the inlet and outlet temperatures were measured at regular intervals
during the run. I do not know how often. They were not measured only once.



> I seem to recall the flow rate was also measured at the start, and then
> assumed constant during the run.


It was measured manually at the start and again later on, and continuously
with a flowmeter that shows cumulative totals -- like the kind in your house
that the utility reads to bill you for water.
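For what it is worth, the arithmetic behind this kind of flow calorimetry is
trivial. Here is a sketch with invented numbers -- nothing below comes from
the actual test, since no data has been published:

```python
# Hypothetical illustration: with an inlet temperature, an outlet
# temperature, and a flow rate, output power is just mass flow rate
# times specific heat times the temperature rise.

C_WATER = 4186.0  # specific heat of liquid water, J/(kg*K)

def output_power_watts(flow_kg_per_s, t_in_c, t_out_c):
    """Sensible-heat output power of a flow calorimeter, in watts."""
    return flow_kg_per_s * C_WATER * (t_out_c - t_in_c)

# Made-up values: 0.1 kg/s of water heated from 10 C to 30 C.
print(output_power_watts(0.1, 10.0, 30.0))  # 8372.0 W
```

(If the outlet is steam, you would add the latent heat of vaporization, but
the point stands: the method needs only temperatures and a flow total.)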

These details have not been made public. No report has been published. They
were related to me by phone and e-mail from the people who did the
experiment, and also to Mats Lewan. Granted, that is an informal way to
communicate with the public. With such an important experiment, I wish they
would publish something. You can fault their presentation, but the actual
methods and instruments used in the test are industry standard for working
with boilers of this size. This is how it is done. You might invent a more
sophisticated, complicated method that is much more precise, but it will not
be more accurate. It will not be more credible, or believable, because the
most credible method is the one everyone uses routinely for this purpose, by
definition.

For example, if you are measuring the floodwater level in a river, you look
at the rulers painted on the side of bridges. You might take a photo with a
cell phone camera. You could invent a highly sophisticated instrument that
pings the top of the water with sonar or radar to get a more accurate
reading, but no one will say it is more credible. Reading the ruler is
simple and foolproof, since the water level in a river does not vary much
from one side to the other even in a flood, whereas a unique fancy gadget
of your own invention might have unknown problems.

In no sense are these methods or instruments "sloppy." A $20 off-the-shelf
watt meter and a clamp-on ammeter may not be as sophisticated as a $16,000
power meter, but there is absolutely no reason to believe they are
inadequate to the task. A modern digital $20 watt meter is more reliable and
accurate than expensive instruments were a generation ago. It is physically
impossible for this to be an artifact of input power in any case.

- Jed
