# Measuring the Meter


We're spatial creatures. We live in a three-dimensional world, with our fourth "dimension" — time — conceptualized in terms of space. That is, we refer to time as if it were space: We turn clocks "forward" or "back" twice a year; we agree to "move up" a meeting; we wonder "where the time went." In short, we live embodied lives in three dimensions of real space and one of pseudo space. This column is about how we measure that space. It all comes down to the meter.

Here in the U.S. (along with Liberia and Myanmar) we pretend we haven't adopted that new-fangled metric system (tell that to STEM folks). More to the point, our standard units of length — inches, feet, yards, miles — are defined in metric terms. The inch, for instance, is exactly 2.54 centimeters, which is an improvement over when it was legally "three grains of barley, dry and round, placed end to end, lengthwise."
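Because the inch is defined as exactly 2.54 centimeters, converting between the systems is a single exact multiplication. A minimal sketch in Python (the function names here are mine, for illustration only):

```python
# The inch has been legally defined as exactly 2.54 cm since 1959.
CM_PER_INCH = 2.54

def inches_to_cm(inches):
    """Convert inches to centimeters using the exact definition."""
    return inches * CM_PER_INCH

def cm_to_inches(cm):
    """Convert centimeters to inches."""
    return cm / CM_PER_INCH

print(inches_to_cm(1))    # 2.54
print(cm_to_inches(100))  # a meter is about 39.37 inches
```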

How the meter came about is another story, with its origins in the French Revolution. Along with the short-lived 10-hour day (100 minutes per hour, 100 seconds per minute), the utopian visionaries of revolutionary Paris decreed in 1791 that the meter (from the Greek μέτρον, or metron, meaning "measure") would be one 10-millionth of the distance from the equator to the North Pole. Previous attempts to tie the meter to the length of a pendulum with a half-period of one second had faltered when it was realized that the period changes from place to place due to variations in local gravity.
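The pendulum problem is easy to see with the standard formula T = 2π√(L/g): a half-period of one second means T = 2 s, so the required length is L = g/π², which shifts wherever g shifts. A quick sketch with approximate gravity values (the figures below are rough modern numbers, not historical measurements):

```python
import math

def seconds_pendulum_length(g):
    """Length (m) of a pendulum whose half-period is one second.
    From T = 2*pi*sqrt(L/g) with T = 2 s, so L = g / pi**2."""
    return g / math.pi ** 2

# Local gravity varies with latitude (approximate values, m/s^2):
for place, g in [("equator", 9.780), ("Paris", 9.809), ("poles", 9.832)]:
    print(f"{place}: {seconds_pendulum_length(g):.4f} m")
```

The candidate "meter" comes out a few millimeters different at the equator than at the poles, which is exactly why the pendulum definition was abandoned.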

You don't, of course, have to measure all the way from the North Pole to the equator to use this new definition. You can extrapolate by measuring the difference in latitude (obtained from the height of the sun at solar noon) between two known points on the same meridian of longitude. So, in 1791, the French Academy of Sciences commissioned astronomers Jean Baptiste Joseph Delambre and Pierre Méchain to measure the exact distance from Dunkirk to Barcelona to determine the length of the meridian arc that runs roughly through Paris.
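The extrapolation is simple proportion: if two points on the same meridian are separated by Δφ degrees of latitude, the full 90-degree quadrant is the measured arc scaled by 90/Δφ. A sketch with rough modern figures (the latitudes and arc length below are my approximations, not Delambre and Méchain's survey values):

```python
# Approximate latitudes of the survey's endpoints (degrees north).
LAT_DUNKIRK = 51.04
LAT_BARCELONA = 41.38
ARC_LENGTH_KM = 1073  # roughly the meridian distance between them

def quadrant_from_arc(arc_km, lat_high, lat_low):
    """Extrapolate the equator-to-pole distance from a measured
    meridian arc spanning (lat_high - lat_low) degrees of latitude."""
    return arc_km * 90.0 / (lat_high - lat_low)

quadrant_km = quadrant_from_arc(ARC_LENGTH_KM, LAT_DUNKIRK, LAT_BARCELONA)
meter_m = quadrant_km * 1000 / 10_000_000  # one ten-millionth of the quadrant
print(f"quadrant ~ {quadrant_km:.0f} km, meter ~ {meter_m:.4f} m")
```

Even with these crude inputs the quadrant lands near 10,000 km and the derived meter within a fraction of a percent of the modern one, which shows how a 1,000-kilometer survey could stand in for the whole arc.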

Two years later, based on the astronomers' provisional results, France — quickly followed by other European countries — adopted the standard meter in the form of an etched iron bar kept in a Paris vault. No matter that Méchain was later found to have fudged his results, and that the prototype meter bar was actually 0.02 percent short because of a miscalculation of the Earth's oblateness (the "flattening" at the poles). You can read about it in Ken Alder's riveting 2002 book *The Measure of All Things: The Seven-Year Odyssey and Hidden Error That Transformed the World*. As it happened, the error didn't really matter. Humankind being the measure of all things (according to a man), so long as everyone was now on the same system based on this new "standard meter," national and international commerce could function smoothly.

In 1889, the newly constituted International Bureau of Weights and Measures (created by the 1875 Metre Convention) replaced the single bar with some 30 platinum-iridium bars distributed around the world. Then in 1960, the meter was redefined as 1,650,763.73 wavelengths of a particular orange-red line emitted by krypton-86 in a vacuum. The modern definition of the meter, adopted in 1983, is the distance light travels in a vacuum in 1/299,792,458 of a second. (Sharp-witted readers will be wondering how defining a universal unit of length in terms of time is an improvement on the old pendulum measurement. Fortunately, modern cesium clocks come to the rescue: they routinely measure time to one part in 10¹³.)
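Under the 1983 definition the arithmetic runs in reverse: the speed of light is fixed at exactly 299,792,458 m/s, and the meter falls out of a time measurement. A quick check in Python (using exact rational arithmetic to sidestep floating-point rounding):

```python
from fractions import Fraction

C = 299_792_458  # speed of light in m/s, exact by definition since 1983

# The meter: the distance light covers in 1/299,792,458 of a second.
one_meter = C * Fraction(1, C)  # exactly 1, no rounding error
print(one_meter)  # 1

# A cesium clock good to one part in 10**13 limits the length
# measurement to roughly the same relative uncertainty:
uncertainty_m = float(one_meter) * 1e-13
print(f"about {uncertainty_m:.0e} m per meter")  # ~1e-13 m
```

That 10⁻¹³ m per meter is about a tenth of a picometer, vastly tighter than the millimeter-scale wobble of the old seconds pendulum.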

It will probably take another revolution to adopt the metric system here, so meanwhile we're stuck with inches, feet, yards and miles. Which, I guess, is why God gave us Siri and Alexa.

Barry Evans (barryevans9@yahoo.com) appreciates the French for setting the circumference of the Earth to within a few metric whiskers of 40,000 kilometers/25,000 miles.