Why is a second equal to the duration of exactly 9,192,631,770 periods of radiation?

That number, 9,192,631,770, was chosen to make the new definition of the second as close as possible to the older, less precise definition. This meant that, except for very precise measurements, instruments calibrated before the new second was defined would not have to be recalibrated.


It's a definition of a unit, which is an arbitrary choice. In the past we defined a second as 1⁄86,400 of a solar day, and later as "the fraction 1/31,556,925.9747 of the tropical year for 1900 January 0 at 12 hours ephemeris time", but both are fairly poor ways of measuring time: the Earth's motion in the solar system is subject to perturbations, and changes in the planet's mass distribution (caused by winds in the atmosphere, ocean currents, and even large earthquakes) alter the length of a day, even though those changes are small compared to the "noise" from the former.

When we invented atomic clocks we gained better ways of defining the basic unit of time. The currently accepted definition is "9,192,631,770 cycles of the radiation corresponding to the transition between the two hyperfine levels of the ground state of cesium 133". This definition, too, has shortcomings. We now have better atomic clocks than those one can build with cesium atoms, so it can be expected that the definition will change as soon as the national and international bodies responsible for these definitions act on the availability of better clocks.

In the past we also defined the meter by a wavelength of red-orange light from an optical line of krypton 86. That made the speed of light a measured quantity. On the other hand, one of our best-tested physical facts is that the speed of light is a constant, so we should treat it as one in the way we define our units. Hence we now define the speed of light as a simple numerical constant and the meter as the distance light traverses in a given time. The definitions of the meter and the second are therefore linked for the future by one constant factor.
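To make that linkage concrete, here is a small illustrative sketch in Python (my own aside, not part of the original answer) using the exact defined value of the speed of light:

```python
# The speed of light is now a defined constant, not a measured quantity.
c = 299_792_458  # meters per second, exact by definition

# The meter is the distance light traverses in 1/299,792,458 of a second,
# so one meter corresponds to a light travel time of:
one_meter_time = 1 / c  # about 3.34 nanoseconds
print(f"light travel time for 1 m: {one_meter_time:.3e} s")

# Any distance can equivalently be stated as a light travel time:
def light_travel_time(distance_m: float) -> float:
    """Time in seconds for light in vacuum to cross distance_m meters."""
    return distance_m / c

print(f"light travel time for 1 km: {light_travel_time(1000.0):.3e} s")
```

This is why the speed of light can never again be "measured" in SI units: any apparent change would instead redefine the length of the meter.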

If we could measure distances with higher precision than we can measure time (we can't, and it is unlikely that we ever will), then we would make a new physical definition of the meter and use the constant value of the speed of light to derive one second as the time light takes to traverse a certain distance.

If relativity holds strictly, then the two ways of defining distance/time are equivalent and we can always choose the definition that is the most precise and reproducible.


Most physical units have to be defined in terms of something measurable, and a good definition of a physical unit is one in which the measurement of the unit is very precisely repeatable.

Since prehistory, a day was a very natural way of measuring time, and was highly repeatable, in that the procedure of measuring a day can be performed anywhere on Earth with essentially the same result. The day as a unit then got split into two, for (roughly) sunrise to sunset versus sunset to sunrise, and each half of a day got split into 12 hours. Splitting something into 12 parts was a natural choice, due to the widespread use of a duodecimal (base 12) numbering system in ancient Sumer and India at the time. The later splitting of an hour into 60 minutes, and still later of a minute into 60 seconds, was likewise natural, due to the use of sexagesimal (base 60) numbering systems in other cultures. That definition of a second as $\frac{1}{24 \times 60 \times 60}$ of a mean solar day was in use from the time it was defined by the Persian scholar al-Biruni 1,016 years ago, until 1967.
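The fraction above is just the product of those historical subdivisions; as a quick illustrative check (my own aside, not part of the original answer):

```python
from fractions import Fraction

# 24 hours per day, 60 minutes per hour, 60 seconds per minute,
# as described above.
seconds_per_day = 24 * 60 * 60
print(seconds_per_day)  # 86400

# The pre-1967 second was this fraction of a mean solar day:
old_second = Fraction(1, seconds_per_day)
print(old_second)  # 1/86400
```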

However, although measuring time based on the length of a mean solar day was as precisely repeatable a definition of time units as one could hope for, for centuries, astronomical observations in the 1800s and 1900s showed that the duration of the mean solar day wasn't precisely constant, but instead was very gradually getting longer, making the mean solar day a less desirable basis for defining time units. In 1967, measuring the period of the radiation corresponding to the transition between the two hyperfine levels of the ground state of cesium 133 was about the most precisely repeatable measurement of time that was technologically possible, and certainly more precisely repeatable than measuring the duration of a mean solar day, so in 1967 the definition of the second was changed to be based on cesium 133.

However, redefining the second to be something like 10,000,000,000 of those cesium periods, just because 10,000,000,000 is a "nice" number in modern base 10 numbering, would have been a hugely disruptive change for all those people (everybody) who had been using the second as it had been defined for the previous 967 years. To minimize that disruption, the new definition of the second was made as close as possible to the same amount of time as the ancient definition.

It's helpful for precise calculations to define the second as an integer number of those cesium 133 periods. A duration of 9,192,631,770 periods was within the range of how long the old second was, i.e., within the experimental error of comparing the two durations as precisely as was technologically possible, so the second was defined to be precisely 9,192,631,770 of those periods.
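As a numeric sanity check (my own illustration, not from the original answer), fixing that integer count pins down both the cesium frequency and the length of one period:

```python
from fractions import Fraction

# The definition makes the cesium-133 hyperfine transition frequency
# exactly 9,192,631,770 periods per second.
periods_per_second = 9_192_631_770

# Duration of a single period, as an exact fraction of a second:
one_period = Fraction(1, periods_per_second)

# Counting exactly that many periods reproduces exactly one second:
assert periods_per_second * one_period == 1

# Numerically, one period is a little over a tenth of a nanosecond:
print(f"one cesium period ≈ {float(one_period):.4e} s")  # ~1.0878e-10 s
```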

The above history of the definition of the second is slightly oversimplified; see Wikipedia's article on the second for a more detailed account.