I do not like to be argumentative. Perhaps I misunderstand Abd's argument
here. But it seems to me he repeatedly claimed that in order to measure a
mysterious source of heat from an unknown phenomenon, you must have
detailed, time-sequenced data. It is not enough to have one number. Even if
the people watching the meters or screen display can see that the
temperature is stable, it is not enough to say "5°C for 18 hours." You have
to have data points from 1080 minutes all showing more or less the same
temperature.

Lomax claims that a single number would suffice for a
well-established phenomenon, such as the heat measured from a combustion
furnace. But it cannot be relied upon for something no one has ever seen
before, such as a sample of radium in 1898 or a cold fusion reactor today.
If Madame Curie had reported only that the "sample remains warm," that claim
should have been rejected. She would have had to say "it remains 1.3°C warmer than
ambient in a well-insulated box, and here are 1000 data points to prove it."
One data point, or the temperature measured once a day for a week, would not
prove the point.

I don't get this. As I said, perhaps I misunderstand. But I think that if
the method is well established, and the data is manifestly 1 or 2
dimensional, it makes no sense to say it must be presented in more detail
because the phenomenon being measured cannot be explained. The instruments
prove that radium and the Rossi reactor produce stable, unvarying heat. That
much we know. Even if we have no earthly idea what causes the heat, we can still
measure it, quantify it, and characterize it as stable. Or in the case of
the Mizuno glow discharge effect, we can be sure that it is extremely unstable.

Let me give a dramatic imaginary example of how established methods of
measuring things can give us absolute proof even when the cause of
the phenomenon is a complete mystery. Suppose a UFO the size of Manhattan
descends from space over Atlanta, hovers 500 m in the air, and then moves
north toward Washington DC at a leisurely pace of 50 km/h. Television, radio
and Internet news worldwide is interrupted to report this, and millions of
people go outside to watch the object go by. When it reaches Washington DC,
it stops, and then shoots off at 5000 km/h north, out into space, and out of
the solar system. Here are some things we could be sure of:

1. The speed and path of the object from Atlanta to DC could be measured
with great assurance. Radar would detect it and measure it with high
precision. Observers on the ground some distance away could estimate it by
watching the object eclipse one landmark after the next, or by measuring with
a stopwatch how long it took the object to move its own length past a
building. This is how Wilbur Wright confirmed that Count Zeppelin's latest
zeppelin really flew as fast as claimed, in 1909.

2. The size of the object could be determined with photographs or visual
observations and trigonometry.

3. Obviously the source of propulsion would be completely unknown to
science. However, we would know some aspects of its performance: that it is
capable of a steady speed of 50 km/h, and that the top speed is much higher. This
would be established using simple methods (a stopwatch) and complex methods
(radar). The conclusion would be indisputable. We would not know the upper
limit of performance. We don't know the upper limit for the Rossi device
either. For all anyone knows it might be capable of producing a
kiloton-scale explosion.

4. We would know with absolute certainty that this is a completely unknown,
highly advanced technology. Nothing like this can be made by any person or
corporation. Even if the object weighed little, it could not be something
like a giant helium balloon, since it moved off later at high speed yet it
was not torn to pieces. It cannot be an optical illusion since it showed up
on radar. It cannot be a mental delusion since millions of people saw it,
and measured the speed, and it is recorded close up by millions of cell
phone cameras, and by radar and so on.
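
The stopwatch and trigonometry estimates in points 1 and 2 above can be sketched in a few lines. All the numbers below are illustrative, not measurements from the scenario:

```python
import math

def speed_kmh(length_m: float, seconds: float) -> float:
    """Speed from a stopwatch: time the object's own length past a landmark."""
    return (length_m / seconds) * 3.6  # m/s -> km/h

def size_m(range_m: float, angle_deg: float) -> float:
    """Size from angular extent and range: 2 * R * tan(theta / 2)."""
    return 2.0 * range_m * math.tan(math.radians(angle_deg) / 2.0)

# A hypothetical 4 km long object taking 288 s to pass a building:
print(round(speed_kmh(4000.0, 288.0), 1))  # 50.0 km/h

# A hypothetical object subtending 15 degrees at 20 km slant range:
print(round(size_m(20_000.0, 15.0)))  # 5266 m
```

The point is that both reduce to one multiplication; the method is centuries old even when the object is not.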

In short, even though this object, its source, behavior and everything about
it would be a mystery, the performance and some basic capabilities
and characteristics would be firmly established by conventional means. Even
crude methods yielding only a few data points would suffice. This does not call
for modern methods, or even photography. If the event had occurred in 1800,
people would have noted it, noted the time in Atlanta, the time it passed
various points along the way, and the time when it arrived in Washington 17
hours later, and they would have been able to determine speed. They would
have estimated the size of the object with trigonometry.
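
The 1800 version of the speed estimate is just total distance over elapsed time. The 850 km figure below is an illustrative Atlanta-to-Washington distance, not a surveyed value:

```python
# Total distance over elapsed time, as an observer in 1800 could compute it.
distance_km = 850.0  # illustrative Atlanta-to-Washington distance
hours = 17.0         # elapsed time from the example above
print(distance_km / hours)  # 50.0 km/h
```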

During the Trinity atom bomb test in 1945, Enrico Fermi dropped bits of
paper (confetti) and observed how far the paper was moved by the shock wave.
This is a crude, low-tech, first-principles method of course, but it worked.
It gave a "ballpark" reasonable estimate of the power of the bomb. It would
have worked even if people had no knowledge of nuclear energy and no idea
what caused the bomb. I don't know when the compressibility of air and
atmospheric science were established . . . Torricelli discovered barometric
pressure in 1643. I suppose by 1750 or so scientists would have understood
the general principles of Fermi's technique. If you told scientists in 1750
that you were going to trigger a tremendous explosion in the desert, and
invited them to measure the strength of the blast from a safe distance away,
I expect they could have come up with reasonably accurate methods such as
traces on rapidly moving paper, even though they did not know that atoms
exist and obviously they would not have a clue what caused the explosion.
Certainly they could have done this by 1850, when they were capturing sound
vibrations on rapidly moving traces, with the phonautograph.
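
The kind of reasoning behind crude yield estimates like Fermi's can be sketched with standard cube-root (Hopkinson-Cranz) blast scaling: equal blast effects occur at equal scaled distance Z = R / W^(1/3). This is not Fermi's actual calculation, and the reference charge and distances below are invented for illustration:

```python
# Cube-root blast scaling: the same blast effect at range_m implies a yield
# larger by the cube of the range ratio. Reference numbers are illustrative.
def yield_tons(ref_yield_t: float, ref_range_m: float, range_m: float) -> float:
    """Yield giving the same blast effect at range_m as ref_yield_t at ref_range_m."""
    return ref_yield_t * (range_m / ref_range_m) ** 3

# If 1 t of TNT displaces the confetti a given distance at 230 m, seeing the
# same displacement at 5 km implies a yield on the order of ten kilotons:
print(round(yield_tons(1.0, 230.0, 5000.0)))  # 10274 t, i.e. roughly 10 kt
```

A scientist who knew nothing about the bomb's energy source could still apply this scaling, which is the point of the paragraph above.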

In short, you can measure the effect of a phenomenon even if you have no
idea what causes it. If the data happens to be 2-dimensional, with one
unvarying value over time (to within some range of error), you can be
certain it is stable, and you can record the result very simply; i.e. "the
UFO moved at 50 km/h for 17 hours" or "the cell produced a 5°C temperature
difference (Delta T) for 18 hours." Those statements are accurate and complete. No
additional information is needed to qualify them, except the range of
errors; i.e. the speed was 50.25 +/- 0.03 km/h. Time-sequenced data would
not increase the amount of information or the level of confidence in it.
More sophisticated methods of collecting the data, and higher-precision
instruments, would not increase the level of confidence. There is no
necessary correlation between the level of sophistication or precision of
the instruments and the confidence of the results. That is a modern
misunderstanding. Scientific instruments and techniques hundreds of years
old can measure temperature, the speed of an object, the speed of light and
many other important values with as much confidence as the best technology
can today; albeit with far less precision.
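
The claim that a stable reading compresses to one number plus an error band can be demonstrated directly. The data below is synthetic (1080 one-minute samples generated around 50.25 km/h with 0.03 km/h of noise), standing in for the kind of time series discussed above:

```python
import random
import statistics

# Synthetic stand-in for 18 hours of one-minute readings of a stable value.
random.seed(42)
readings = [random.gauss(50.25, 0.03) for _ in range(1080)]

# The entire time series reduces to a mean and an error band with no loss,
# because the value does not vary beyond the noise.
mean = statistics.fmean(readings)
sigma = statistics.stdev(readings)
print(f"{mean:.2f} +/- {sigma:.2f} km/h")
```

The 1080 points add nothing beyond confirming the spread; reporting "50.25 +/- 0.03 km/h" carries the same information.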

Note that the speed of light was measured by Roemer in 1676 by observing the
eclipses of the moons of Jupiter. He measured 2.14 × 10^8 m/s. The modern
value is 3.00 × 10^8 m/s. Many other constants such as Avogadro's number
were known with pretty good precision long ago. Ancient techniques worked
well. Look at all those ancient buildings still standing.

- Jed
