<P> The idea of using atomic transitions to measure time was first suggested by Lord Kelvin in 1879, although a practical method only became possible in the 1930s with the development of magnetic resonance. A prototype ammonia maser device was built in 1949 at the U.S. National Bureau of Standards (NBS, now NIST). Although it was less accurate than existing quartz clocks, it served to demonstrate the concept. </P>

<P> The first accurate atomic clock, a caesium standard based on a particular transition of the caesium-133 atom, was built by Louis Essen in 1955 at the National Physical Laboratory in the UK. The caesium standard was calibrated against the astronomical time scale ephemeris time (ET). </P>

<P> The International System of Units standardized its unit of time, the second, on the properties of caesium in 1967. The SI defines the second as 9,192,631,770 cycles of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the caesium-133 atom. The caesium atomic clock maintained by the National Institute of Standards and Technology is accurate to about 30 billionths of a second per year. Atomic clocks have also employed other elements, such as hydrogen and rubidium vapour: hydrogen masers offer greater stability, while rubidium vapour clocks offer smaller size, lower power consumption, and thus lower cost. </P>

<P> The first professional clockmakers came from the guilds of locksmiths and jewellers. Clockmaking developed from a specialized craft into a mass-production industry over many years. </P>
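To put the quoted accuracy figure in perspective, a back-of-the-envelope sketch can convert "30 billionths of a second per year" into a fractional error. The 30 ns figure comes from the text above; everything else here is simple arithmetic:

```python
# Convert the quoted drift (30 ns accumulated per year, from the text)
# into a dimensionless fractional error.
SECONDS_PER_YEAR = 365.25 * 24 * 3600      # ~3.156e7 s in a Julian year
drift_per_year = 30e-9                     # 30 billionths of a second

fractional_error = drift_per_year / SECONDS_PER_YEAR
print(f"{fractional_error:.1e}")           # ≈ 9.5e-16, about one part in 10^15
```

In other words, such a clock would take on the order of thirty million years to gain or lose a single second.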

Who knew what time it was when the first clock was made?