Dear All,

Was the metre, as the universal standard of measurement, invented in
England?

Recently, I discovered a web blog at
http://blog.plover.com/physics/meter.html that suggests that the metre was
invented in England 110 years before the French development of the metric
system.

It seems that John Wilkins saw the need for a truly universal
measurement standard and considered basing it on the circumference of
the earth. However, he ultimately plumped for making the standard
length the length of a pendulum with a known period.

By the way, John Wilkins chaired the founding meeting of the Royal Society
in 1660 and later served as one of its secretaries.

The details from Mark Dominus' blog are below.

Cheers,

Pat Naughtin
PO Box 305 Belmont 3216
Geelong, Australia
61 3 5241 2008

Pat Naughtin is manager of http://www.metricationmatters.com, a website
that primarily focuses on the many issues, methods and processes
that individuals, groups, companies, and nations use when upgrading to the
metric system. You can contact Pat Naughtin at
[EMAIL PROTECTED]



Fri, 03 Mar 2006
John Wilkins invents the meter

An Essay Towards a Real Character and a Philosophical Language

I'm continuing to read An Essay Towards a Real Character and a Philosophical
Language, the Right Reverend John Wilkins' 1668 book that attempted to lay
out a rational universal language.

In skimming over it, I noticed that Wilkins' language contained words for
units of measure: "line", "inch", "foot", "standard", "pearch", "furlong",
"mile", "league", and "degree". I thought oh, this was another example of a
foolish Englishman mistaking his own provincial notions for universals.
Wilkins' language has words for Judaism, Christianity, Islam; everything
else is under the category of paganism and false gods, and I thought that
the introduction of words for inches and feet was another case like that
one. But when I read the details, I realized that Wilkins had been smarter
than that.

Wilkins recognizes that what is needed is a truly universal measurement
standard. He discusses a number of ways of doing this and rejects them. One
of these is the idea of basing the standard on the circumference of the
earth, but he thinks this is too difficult and inconvenient to be practical.

But he settles on a method that he says was suggested by Christopher Wren,
which is to base the length standard on the time standard (as is done today)
and let the standard length be the length of a pendulum with a known period.
Pendulums are extremely reliable time standards, and their period depends
only on their length and on the local effect of gravity. Gravity varies only a
very little bit over the surface of the earth. So it was a reasonable thing
to try.
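
For a rough sense of why a pendulum beating seconds comes out so close to a
meter, here is the small-swing formula T = 2*pi*sqrt(L/g) turned around in a
short Python sketch. It assumes that "a period of one second" is meant in
the usual seconds-pendulum sense of one second per swing, i.e. a full
back-and-forth of two seconds:

    import math

    g = 9.81                         # m/s^2, roughly the mid-latitude value
    T = 2.0                          # s, full period of a seconds pendulum
    L = g * T**2 / (4 * math.pi**2)  # small-swing pendulum length

    print(f"L = {L:.3f} m")          # about 0.994 m, i.e. roughly 39.1 inches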

Wilkins directed that a pendulum be set up with the heaviest, densest
possible spherical bob at the end of the lightest, most flexible possible
cord, and the length of the cord be adjusted until the period of the pendulum
was as close to one second as possible. So far so good. But here is where I
am stumped. Wilkins did not simply take the standard length as the length
from the fulcrum to the center of the bob. Instead:


...which being done, there are given these two Lengths, viz. of the String,
and of the Radius of the Ball, to which a third Proportional must be found
out; which must be as the length of the String from the point of Suspension
to the Centre of the Ball is to the Radius of the Ball, so must the said
Radius be to this third which being so found, let two fifths of this third
Proportional be set off from the Centre downwards, and that will give the
Measure desired.

Wilkins is saying, effectively: let d be the distance from the point of
suspension to the center of the bob, and r be the radius of the bob, and let
x be such that d/r = r/x. Then d + (0.4)x is the standard unit of measurement.
Huh? Why 0.4? Why does r come into it? Why not just use d? Huh?

These guys weren't stupid, and there must be something going on here that I
don't understand. Can any of the physics experts out there help me figure
out what is going on here?
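
For concreteness, here is that recipe worked through numerically in a short
Python sketch. The string length and bob radius below are made up purely for
illustration; Wilkins gives the construction, not the measurements:

    d = 39.2    # inches, point of suspension to center of bob (made-up value)
    r = 2.0     # inches, radius of the bob (made-up value)

    x = r**2 / d                  # the "third Proportional": d/r = r/x
    standard_in = d + 0.4 * x     # d plus two fifths of x
    standard_mm = standard_in * 25.4

    print(f"x = {x:.4f} in")
    print(f"Standard = {standard_in:.2f} in = {standard_mm:.1f} mm")
    # with these made-up inputs: about 39.24 in, just under a meter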

Anyway, the main point of this note is to point out an extraordinary
coincidence. Wilkins says that if you follow his instructions above, the
standard unit of measurement "will prove to be . . . 39 Inches and a
quarter". In other words, almost exactly one meter.

I bet someone out there is thinking that this explains the oddity of the 0.4
and the other stuff I don't understand: Wilkins was adjusting his definition
to make his standard unit come out to exactly one meter, just as we do
today. (The modern meter is defined as the distance traveled by light in
1/299,792,458 of a second. Why 299,792,458? Because 1/299,792,458 of a
second is how long it happens to take light to travel one meter.) But no,
that isn't it. Remember,
Wilkins is writing this in 1668. The meter wasn't invented for another 110
years.

Having defined the meter, which he called the "Standard", Wilkins then went
on to define smaller and larger units, each differing from the standard by a
factor that was a power of 10. So when Wilkins puts words for "inch" and
"foot" into his universal language, he isn't putting in words for the common
inch and foot, but rather the units that are respectively 1/100 and 1/10 the
size of the Standard. His "inch" is actually a centimeter, and his "mile" is
a kilometer, to within a fraction of a percent.
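
Here is a quick Python check of that "fraction of a percent" claim, taking
the Standard as the 39.25 inches quoted above and assuming, as the decimal
scheme implies, that his "mile" is 1000 Standards:

    STANDARD_M = 39.25 * 0.0254          # the Standard in meters, ~0.997 m

    wilkins_inch = STANDARD_M / 100      # his "inch" = 1/100 Standard
    wilkins_foot = STANDARD_M / 10       # his "foot" = 1/10 Standard
    wilkins_mile = STANDARD_M * 1000     # his "mile" = 1000 Standards

    print(f"inch: {wilkins_inch * 100:.3f} cm")   # ~0.997 cm
    print(f"foot: {wilkins_foot * 100:.2f} cm")   # ~9.97 cm
    print(f"mile: {wilkins_mile:.0f} m")          # ~997 m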

Wilkins also defined units of volume and weight measure. A cubic Standard
was called a "bushel", and he had a "quart" (1/100 bushel, approximately 10
liters) and a "pint" (approximately one liter). For weight he defined the
"hundred" as the weight of a bushel of distilled rainwater; this almost
precisely the same as the original definition of the gram. A "pound" is then
1/100 hundred, or about ten kilograms. I don't understand why Wilkins' names
are all off by a factor of ten; you'd think he would have wanted to make the
quart be a millibushel, which would have been very close to a common quart,
and the pound be the weight of a cubic foot of water (about a kilogram)
instead of ten cubic feet of water (ten kilograms). But I've read this
section over several times, and I'm pretty sure I didn't misunderstand.
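
The same arithmetic for the volume and weight units, again taking the
Standard as 39.25 inches and distilled water at 1000 kg per cubic meter:

    STANDARD_M = 39.25 * 0.0254

    bushel_l = STANDARD_M**3 * 1000      # a cubic Standard, ~991 liters
    quart_l  = bushel_l / 100            # 1/100 bushel,     ~9.9 liters
    pint_l   = bushel_l / 1000           # 1/1000 bushel,    ~0.99 liter

    hundred_kg = STANDARD_M**3 * 1000    # a bushel of water, ~991 kg
    pound_kg   = hundred_kg / 100        # 1/100 hundred,     ~9.9 kg

    print(f"bushel: {bushel_l:.0f} L, quart: {quart_l:.1f} L, pint: {pint_l:.2f} L")
    print(f"hundred: {hundred_kg:.0f} kg, pound: {pound_kg:.1f} kg")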

Wilkins also based a decimal currency on his units of volume: a "talent" of
gold or silver was a cubic standard. Talents were then divided by tens into
hundreds, pounds, angels, shillings, pennies, and farthings. A silver penny
was therefore 10^-5 cubic Standard of silver. Once again, his scale seems
off. A cubic Standard of silver weighs about 10.4 metric tonnes. Wilkins'
silver penny is nearly ten cubic centimeters of metal, weighing 104
grams (about 3.3 troy ounces), and his farthing is 10.4 grams. A gold penny
is about 191 grams, or more than six ounces of gold. For all its flaws,
however, this is the earliest proposal I am aware of for a fully decimal
system of weights and measures, predating the metric system, as I said, by
about 110 years.
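
Finally, a back-of-the-envelope check of the coin figures above, using
ordinary handbook densities (silver about 10.5 g/cm^3, gold about 19.3
g/cm^3) and a troy ounce of 31.1 grams:

    STANDARD_M = 39.25 * 0.0254
    TROY_OZ_G = 31.1035

    penny_cm3 = STANDARD_M**3 * 1e-5 * 1e6   # 10^-5 cubic Standard, in cm^3

    silver_penny_g = penny_cm3 * 10.49       # ~104 g
    gold_penny_g   = penny_cm3 * 19.3        # ~191 g
    farthing_g     = silver_penny_g / 10     # ~10.4 g

    print(f"penny volume: {penny_cm3:.1f} cm^3")
    print(f"silver penny: {silver_penny_g:.0f} g ({silver_penny_g / TROY_OZ_G:.1f} ozt)")
    print(f"gold penny:   {gold_penny_g:.0f} g ({gold_penny_g / TROY_OZ_G:.1f} ozt)")
    print(f"farthing:     {farthing_g:.1f} g")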
