I think Nat hit a bullseye. To the U.S. way of thinking, a centimeter behaves 
like an inch, i.e., it is a larger unit that seemingly gets divided up. Working 
with millimeters is actually easy, but some people might think it is difficult 
because they cannot divide millimeters fractionally. There is nothing to 
divide. Working with millimeters is more algebraic: all you have are integers. 
With centimeters, one might look for the quarter and eighth lines even though 
they are not there, but the terrain feels more familiar. 
  ----- Original Message ----- 
  From: Nat Hager III 
  To: U.S. Metric Association 
  Sent: 07 Feb 13, Tuesday 10:44
  Subject: [USMA:37944] RE: Metric at the hardware store

  So the question I have is this: how did the centimeter become so dominant in 
American metrication? 

  I think it's more compatible with "inch-type thinking", since the basic unit 
is some fraction of an inch and it still allows you to think in terms of "one" 
of something. 

  Millimeters are a more radical departure, where the basic unit is 100 mm and 
you think of either "percentages" of 100 mm or "modules" of 100 mm stacked 
together. A little more of a learning curve, but far easier once you get used 
to it.

  My usual class handout attached.

  Nat  
