Allan Pratt <[EMAIL PROTECTED]> writes:

> According to a source I read, Hipparchus, a 2nd C BC astronomer
> calculated the length of the year to within six minutes of accuracy.
> Considering that at best he had a sundial and a water clock, how did he
> do this?

I hope a historian will answer this, but I am willing to speculate.

H's minutes were surely defined not with respect to a cesium clock but
as a fraction of a day.  The year is defined by the seasons, i.e., the
declination of the sun.  The declination is most sensitive to the date
around the equinoxes.  Since the equinox is one of the most
fundamental and easily observed astronomical events, it is plausible
that the equinox had been determined and recorded, at least to the
nearest day, for hundreds if not thousands of years before Hipparchus.
If he had available an uninterrupted calendar and a record of an
equinox 240 years earlier, then, by counting the number of days
between that and a contemporaneous observation of an equinox and
dividing by the number of years, he could calculate the length of the
year to the claimed accuracy:  (1 dy) / (240 yrs X 365 dys/yr) =
1/87,600 = (6 min / 1 yr) / (60 min/hr X 24 hrs/dy X 365 dys/yr).
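The arithmetic above is easy to verify with a quick sketch (the 240-year baseline and 365-day year are just the round numbers used in the text):

```python
# Check: a 1-day uncertainty spread over a 240-year baseline gives
# the claimed ~6-minute accuracy in the length of the year.
baseline_years = 240
days_per_year = 365            # round figure, as in the text
minutes_per_day = 60 * 24

# Fractional error: 1 day divided by the whole baseline in days
fractional_error = 1 / (baseline_years * days_per_year)

# Convert that fraction of a year into minutes
error_minutes = fractional_error * days_per_year * minutes_per_day

print(fractional_error)   # 1/87,600
print(error_minutes)      # 6.0 minutes
```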

Alternatively, if he knew what he was about, he could by careful
naked-eye observation determine the time of the equinox to within a
fraction of a day.  If his observations had an accuracy of 0.1 day,
then he would only need observations 24 years apart, easily within a
professional lifetime even in those days.
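The trade-off between observation accuracy and baseline length works out as follows (an illustrative calculation, using the same round numbers as above):

```python
# How long a baseline is needed to pin down the year to ~6 minutes,
# given the accuracy (in days) of a single equinox timing?
# Error in the year length = (observation error) / (baseline in years),
# so: baseline = observation error / target error.
def baseline_years(obs_accuracy_days, target_minutes=6.0):
    target_days = target_minutes / (60 * 24)   # 6 min = 1/240 day
    return obs_accuracy_days / target_days

print(baseline_years(1.0))   # 240 years: nearest-day records
print(baseline_years(0.1))   # 24 years: careful naked-eye timing
```

So improving the timing of a single equinox by a factor of ten shortens the required baseline by the same factor, which is what brings it within one astronomer's working lifetime.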

The observation need not be of the equinox.  One could use
solar eclipses in a similar way, or simply the date in spring on which
the sun first becomes visible in a notch between two mountains.

Note that you don't even need a sundial or a water clock for any of
this!

--Art
