Jones Beene <jone...@pacbell.net> wrote:

> I’m sorry but that is not what Miles seems to be saying now.
>

That's what he told me. I consulted with him at length when I wrote the
paper about him. He & I went over it many times.



> You are putting words in his mouth. In any event, the rate measured is
> incredibly low – well below any confidence level and well below atmospheric
> levels . . .
>

It is far above the confidence level and far below atmospheric levels.
Right there in the sweet spot between them. The confidence level is
background. That is, how much helium leaks in during the course of an
experiment. Which you measure by looking at a null experiment; i.e., one
with no heat, or one where you do nothing but let the equipment sit
there for some length of time.
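
To put rough numbers on that, here is a toy version of the test in
Python. Every figure is invented for illustration; these are not Miles'
data.

    from statistics import mean, stdev

    # Helium (ppb) found in null runs -- no heat, equipment just
    # sitting there. All values assumed for illustration.
    null_runs_ppb = [4.1, 5.3, 3.8, 4.9, 4.4]
    live_ppb = 62.0  # assumed helium from a run with excess heat

    bg_mean, bg_std = mean(null_runs_ppb), stdev(null_runs_ppb)
    sigma = (live_ppb - bg_mean) / bg_std
    print(f"background = {bg_mean:.1f} +/- {bg_std:.1f} ppb")
    print(f"live run is {sigma:.0f} sigma above background")

The point is only that the comparison is against the cell background
measured this way, not against zero.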



> - so it is of negligible value. It is milliwatt level, in a world begging
> for kilowatts.
>

About 500 mW in one experiment, as I recall. That was the best he could
achieve, so it will have to do. "Negligible value" has no meaning in this
context. It was high enough to measure the heat with great confidence, and
high enough to measure the helium with good confidence, albeit with large
error bars. There is no chance the helium was leaking in from the outside,
for the reasons Miles gave (and I reiterated). Miles was giving a lecture
about this once and pointed to the graph displayed in the slide. Then he
aimed the laser at the ceiling and said, "if this were leaking in from the
atmosphere, we would have to display this slide 10 floors high."
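
For a sense of scale: if you assume the heat comes from a process
yielding ~23.8 MeV per helium-4 atom (the D+D -> 4He energy budget),
the implied production rate follows from the power alone. A quick
sketch; the power levels are illustrative, not Miles' measurements:

    EV_TO_J = 1.602176634e-19   # joules per electron-volt
    Q_MEV = 23.8                # MeV per helium-4 atom (assumption)

    def he4_rate(power_watts):
        """Helium-4 atoms per second implied by a given excess power."""
        return power_watts / (Q_MEV * 1e6 * EV_TO_J)

    print(f"0.5 W -> {he4_rate(0.5):.1e} atoms/s")   # ~1.3e11
    print(f"40 W  -> {he4_rate(40.0):.1e} atoms/s")  # ~1.0e13

Even half a watt implies over 10^11 atoms per second, which is
measurable with a good mass spectrometer.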



> IMO - the only result that matters to most of Science, going forward, will
> be the result of experiments of greater than 10 watts, and hopefully 100
> watts or more.
>

That makes no sense. Many previous scientific breakthroughs began with
barely detectable effects, such as the heat from small, impure samples of
radioactive materials. These discoveries mattered. If you can learn the
nature of an effect by studying something barely detectable, and you can
then learn to control and scale it up, that is as good as studying
something easily detected. The initial magnitude of the effect is
irrelevant.

As long as you can be sure you are measuring a real effect, the fact that
it is small has no bearing on the scientific importance of the phenomenon.
It has no bearing on what you might learn from the experiment.

Granted, if Miles could have achieved 10 W it would have been better. He
wished he could. As he noted, his calorimeter could not have supported a
~40 W reaction. You cannot make a calorimeter capable of measuring every
level of heat. You have to design it for a particular target range, and
40 W would have exceeded the range of that instrument.

If the heat had been ~40 W it would have produced just about as much helium
as atmospheric concentration, and then people would say "it was
contamination." So, ironically, this would have been less convincing,
because it would have been out of the sweet spot between the cell
background and atmospheric concentration. With his technique he could not
have increased the concentration by allowing the collection to continue
longer. The duration of the collection period was fixed. He would have had
to get more than 40 W to go above atmospheric concentration.
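
The arithmetic behind that: with a fixed collection period, the
collected concentration scales linearly with the power, and atmospheric
helium is about 5.24 ppm. So a run 80 times more powerful than 0.5 W
lands right on top of atmospheric. A sketch, using an assumed (purely
illustrative) baseline of 65 ppb for a 0.5 W run:

    ATMOSPHERIC_HE_PPB = 5240.0   # ~5.24 ppm helium in air

    def scaled_ppb(power_w, ref_power_w=0.5, ref_ppb=65.0):
        """Concentration scaled linearly from an assumed reference run."""
        return ref_ppb * (power_w / ref_power_w)

    for p in (0.5, 10.0, 40.0):
        c = scaled_ppb(p)
        note = "  <- about atmospheric" if abs(c - ATMOSPHERIC_HE_PPB) < 500 else ""
        print(f"{p:5.1f} W -> {c:7.0f} ppb{note}")

That is why the sub-atmospheric result is, perversely, the more
convincing one.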

Other people, such as McKubre, had instruments that did allow them to
continue collection for as long as they wanted. They did achieve
concentrations above atmospheric levels.

- Jed
