Friday, January 18, 2008

Coordinated Universal Time

Coordinated Universal Time (UTC) is a high-precision atomic time standard. UTC has uniform seconds defined by International Atomic Time (TAI), with leap seconds announced at irregular intervals to compensate for the Earth's slowing rotation and other discrepancies. Leap seconds allow UTC to closely track Universal Time (UT), a time standard based not on the uniform passage of seconds, but on the Earth's angular rotation.
Time zones around the world are expressed as positive or negative offsets from UTC. Local time is UTC plus the time zone offset for that location, plus an offset (typically +1) for daylight saving time, if in effect. UTC replaced Greenwich Mean Time on 1 January 1972 as the basis for the main reference time scale or civil time in various regions.[1]
As the zero-point reference, UTC is also referred to by the military and civil aviation as Zulu time (Z).[2][3]
Example: Monday, 2007-12-17T06:19 UTC
Abbreviation
    Source        Abbreviation
    English       CUT (coordinated universal time)
    French        TUC (temps universel coordonné)
    Compromise    UTC (universal time, coordinated)
The International Telecommunication Union wanted Coordinated Universal Time to have a single abbreviation for all languages. English speakers and French speakers each wanted the initials of their respective languages' terms to be used internationally: "CUT" for "coordinated universal time" and "TUC" for "temps universel coordonné". This resulted in the final compromise of using UTC.[4]
"UTC" also has the benefit that it fits in with the pattern for the abbreviations of variants of Universal Time. "UT0", "UT1", "UT1R", and others exist, so appending "C" for "coordinated" to the base "UT" is very satisfactory for those who are familiar with the other types of UT.
"UTC" has been erroneously expanded into "Universal Time Code" or "Universal Time Convention".[5]
Mechanism
As a time scale, UTC divides up time into days, hours, minutes, and seconds. Days are conventionally identified using the Gregorian calendar, but Julian Day Numbers can also be used. Each day contains 24 hours and each hour contains 60 minutes, but the number of seconds in a minute is slightly variable.
Most UTC days contain exactly 86,400 SI seconds, with exactly 60 seconds in each minute. However, since the mean solar day is slightly longer than 86,400 SI seconds, occasionally the last minute of a UTC day has 61 seconds. The extra second is called a leap second. It accounts for the accumulated excess length (about 2 milliseconds each) of all the mean solar days since the previous leap second. The last minute of a UTC day is also allowed to contain 59 seconds, to cover the remote possibility of the Earth rotating faster, but that has never been necessary. The irregular day lengths mean that fractional Julian days do not work properly with UTC.
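The variable minute length described above can be sketched in code. The following Python sketch uses a small, hypothetical excerpt of the leap-second table (the real, complete table is published by the IERS); the function names are illustrative:

```python
# Sketch: length of a UTC day, given a hypothetical, abbreviated set of
# dates whose final minute contains a positive leap second.
LEAP_SECOND_DAYS = {"1972-06-30", "1972-12-31", "2005-12-31"}

def seconds_in_utc_day(date_iso: str) -> int:
    """Most UTC days have 86400 SI seconds; a day ending in a positive
    leap second has 86401 (a negative one would have 86399)."""
    return 86401 if date_iso in LEAP_SECOND_DAYS else 86400

def seconds_in_final_minute(date_iso: str) -> int:
    """The last minute of a leap-second day runs 23:59:00 to 23:59:60."""
    return 61 if date_iso in LEAP_SECOND_DAYS else 60
```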
UTC is derived from International Atomic Time (TAI), which is a time scale tracking proper time on the rotating surface of the Earth (the geoid). At any particular time, UTC proceeds as a linear function of TAI. From 1972 onwards UTC ticks at the same rate as TAI, but earlier (back to the 1961 start of UTC) UTC ticked at a different rate from TAI. In order to remain a close approximation of UT1 (equivalent to GMT before 1960), UTC occasionally has discontinuities where it changes from one linear function of TAI to another. These discontinuities take the form of leaps implemented by a UTC day of irregular length, and (prior to 1972) changes to the rate at which UTC ticks relative to TAI. Discontinuities in UTC have only ever occurred at the end of a Gregorian month.[6]
The International Earth Rotation and Reference Systems Service (IERS) tracks and publishes the difference between UTC and Universal Time, DUT1 = UT1 - UTC, and introduces discontinuities into UTC to keep DUT1 in the range -0.9 s < DUT1 < +0.9 s. Since 1972 the discontinuities have consisted only of a leap of one second at the end of June 30 or December 31. The IERS publishes its decision on whether to have a leap second on each of these dates a few months in advance, in Bulletin C.[7] In principle leap seconds can also occur on March 31 or September 30, but the IERS has never found this necessary.
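The scheduling rule can be illustrated with a minimal sketch. The ±0.9 s band is from the text; the function names and sample values are assumptions for illustration only:

```python
# Sketch of the rule above: a leap second is scheduled to keep
# DUT1 = UT1 - UTC within the band -0.9 s < DUT1 < +0.9 s.
def needs_leap_second(predicted_dut1: float, threshold: float = 0.9) -> bool:
    """True if DUT1 is predicted to leave the +/-0.9 s band, so a leap
    second should be scheduled at the next opportunity."""
    return abs(predicted_dut1) >= threshold

def apply_positive_leap_second(dut1: float) -> float:
    """A positive leap second retards UTC by 1 s, raising DUT1 by 1 s."""
    return dut1 + 1.0
```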
As with TAI, UTC is only known with the highest precision in retrospect. The International Bureau of Weights and Measures (BIPM) publishes monthly tables of differences between canonical TAI/UTC and TAI/UTC as estimated in real time by participating laboratories. (See the article on International Atomic Time for details.)
History
Originally, the local time at the Royal Observatory, Greenwich, England was chosen as standard at the 1884 International Meridian Conference, leading to the widespread use of Greenwich Mean Time (GMT) in order to set local clocks. This location was chosen because by 1884 two-thirds of all charts and maps already used it as their Prime Meridian. In 1929, the term Universal Time (UT) was introduced to refer to GMT with the day starting at midnight. Until the 1950s, broadcast time signals were based on UT, and hence on the rotation of the Earth.
In 1955, the caesium atomic clock was invented. This provided a form of timekeeping that was both more stable and more convenient than astronomical observations. In 1956 the US National Bureau of Standards started to use atomic frequency standards in generating the WWV time signals. In a controversial decision, the frequency of the signals was initially set to match the rate of UT, but then kept at the same frequency by the use of atomic clocks and deliberately allowed to drift away from UT. When the divergence grew significantly, the signal was phase shifted (stepped) by 20 ms to bring it back into agreement with UT. Many such steps were used. The signal frequency was changed less often.
In 1958, the International Atomic Time (TAI) service started. It was based on the frequency for the caesium transition, newly established,[8] that was later (1967) used to redefine the second. The WWV time signal's frequency was set to a simple offset from the TAI frequency: initially an offset of 1.0×10⁻⁸, so that WWV ticked exactly one second for every 1.00000001 s of TAI. Many 20 ms time steps were used.
Despite the initial controversy, it became clear that basing time signals on atomic clocks was an improvement over the prior system. However, it was widely desired to keep civil time synchronised with the Earth's rotation, and many uses of time signals (such as for navigation) relied on their closely matching Universal Time. WWV's compromise approach was copied by other agencies worldwide, such as the Royal Greenwich Observatory. It now became a concern that time signals should be synchronised with each other, rather than independently determining their own frequency offsets and phase shifts.
In 1960 an international agreement was made on atomic-based time signals. A frequency offset of 1.5×10⁻⁸ was adopted by all the participating institutions, matching the then-current rate of UT2. Ad hoc phase shifts were used to synchronise the time signals as far as possible. It was determined that the Bureau International de l'Heure should henceforth choose the frequency offsets and coordinate the time steps. It was also decided to use larger jumps, of 50 ms instead of 20 ms.
UTC was officially initiated at the start of 1961. The TAI instant 1961-01-01T00:00:01.422818 exactly was identified as UTC instant 1961-01-01T00:00:00.000000 exactly, and UTC ticked exactly one second for every 1.000000015 s of TAI. Time steps occurred every few months thereafter, and frequency changes at the end of each year. The jumps increased in size to 100 ms, with only one 50 ms jump having ever occurred. This UTC was intended to permit a very close approximation of UT2, within around 0.1 s.
In 1967, the SI second was redefined in terms of the frequency supplied by a caesium atomic clock. This was the frequency that had been provisionally used in TAI since 1958. It was soon recognised that having two types of second with different lengths, namely the UTC second and the SI second used in TAI, was a bad idea. It was thought that it would be better for time signals to maintain a consistent frequency, and that that frequency should match the SI second. Thus it would be necessary to rely on time steps alone to maintain the approximation of UT. This was tried experimentally in a service known as "Stepped Atomic Time" (SAT), which ticked at the same rate as TAI and used jumps of 200 ms to stay synchronised with UT2.
There was also dissatisfaction with the frequent jumps in UTC (and SAT). In 1968, Louis Essen, the inventor of the caesium atomic clock, and G. M. R. Winkler both independently proposed that steps should be of 1 s only.[9] This system was eventually approved, along with the idea of maintaining the UTC second equal to the TAI second. At the end of 1971 there was a final irregular jump of 0.107758 TAI seconds exactly, so that 1972-01-01T00:00:00 UTC was 1972-01-01T00:00:10 TAI exactly, making the difference between UTC and TAI an integer number of seconds. At the same time the tick rate of UTC was changed to exactly match TAI. UTC also started to track UT1 rather than UT2. Some time signals started to broadcast the DUT1 correction (UT1 - UTC), for applications which required a closer approximation of UT1 than UTC now provided.
The first leap second occurred on 1972-06-30. Since then leap seconds have occurred on average once every 18 months, always on June 30 or December 31. As of 2006 there have been 23 leap seconds in total, all positive, putting UTC 33 seconds behind TAI. It seems unlikely that a negative leap second will ever occur, but there is a small chance of one due to the acceleration of the Earth's crust in the 2000s. This acceleration has already led to the longest ever period without a leap second, from 1999-01-01 to 2005-12-31.
Rationale


Graph showing the difference DUT1 between UT1 and UTC. Vertical segments correspond to leap seconds.
The Earth's rotational speed is very slowly decreasing due to tidal braking, causing the mean solar day to increase in length. The length of the SI second was based on the mean solar day observed between 1750 and 1892, analysed by Simon Newcomb. As a result, the SI second was exactly 1/86400 mean solar day in around 1820. In earlier centuries the mean solar day was shorter than 86400 SI seconds, and in later centuries it has been longer. At the end of the twentieth century the length of the mean solar day (also known simply as "length of day" or "LOD") was approximately 86400.002 s. For this reason, UT is now 'slower' than TAI.
The excess of the LOD over the nominal 86400 s accumulates over time, causing the UTC day, initially synchronised with the mean sun, to become desynchronised and run ahead of it. At the end of the twentieth century, with the LOD at 2 ms above the nominal value, UTC ran faster than UT by 2 ms per day, getting a second ahead roughly every 500 days. Thus leap seconds were inserted at approximately this interval, retarding UTC to keep it synchronised in the long term. Note that the actual rotational period varies with unpredictable factors such as tectonic motion and has to be observed rather than computed.
The insertion of a leap second every 500 days does not mean that the mean solar day is getting longer by a second every 500 days: it will take approximately 50,000 years for the mean solar day to lengthen by one second. The correct reason for leap seconds is not the current difference between actual and nominal LOD, but rather the accumulation of this difference over a period of time: in the late twentieth century, this difference was about 1/500 of a second, so it accumulated to 1 second after about 500 days.
For example, assume you start counting the seconds from the Unix epoch of 1970-01-01T00:00:00 UTC with an atomic clock. At midnight on that day (as measured in UTC), your counter registers 0 s. After the Earth has made one full rotation with respect to the mean Sun, your counter will register approximately 86400.002 s (the precise value varies with conditions such as tectonic motion). Based on your counter, you can calculate that the date is 1970-01-02T00:00:00 UT1. After 500 rotations, your counter will register 43 200 001 s. Since 86400 s × 500 is 43 200 000 s, you will calculate that the date is 1971-05-16T00:00:01 UTC, while it is only 1971-05-16T00:00:00 UT1. If you had added a leap second on December 31, 1970, retarding your counter by 1 s, the counter would read 43 200 000 s at 1971-05-16T00:00:00 UT1, allowing you to calculate the correct date.
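The arithmetic of this example can be checked directly. The figures below are the illustrative values from the text (a constant 2 ms LOD excess), not measured data:

```python
# Reproducing the arithmetic of the example above.
LOD = 86400.002      # length of the mean solar day, late twentieth century (s)
NOMINAL = 86400      # nominal UTC day containing no leap second (s)

rotations = 500
atomic_elapsed = rotations * LOD              # counter reading after 500 rotations
drift = atomic_elapsed - rotations * NOMINAL  # how far UTC has run ahead of UT1

# drift is about 1 s: exactly the error that one leap second, inserted
# during the interval, would have cancelled.
```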
In the graph of DUT1 above, the excess of LOD above the nominal 86400 s corresponds to the downward slope of the graph between vertical segments. (Note that the slope became shallower in the 2000s, due to a slight acceleration of the Earth's crust temporarily shortening the day.) Vertical position on the graph corresponds to the accumulation of this difference over time, and the vertical segments correspond to leap seconds introduced to match this accumulated difference. Leap seconds are timed to keep DUT1 within the vertical range depicted by this graph. The frequency of leap seconds therefore corresponds to the slope of the diagonal graph segments, and thus to the excess LOD.
Future
As the Earth's rotation continues to slow, positive leap seconds will be required more frequently. The long-term rate of change of LOD is approximately +1.7 ms per century. At the end of the twenty-first century LOD will be roughly 86400.004 s, requiring leap seconds every 250 days. Over several centuries, the frequency of leap seconds will become problematic.
Sometime in the 22nd century, two leap seconds will be required every year. The current use of only the leap second opportunities in June and December will be insufficient, and the March and September options will have to be used. In the 25th century, four leap seconds will be required every year, so the current quarterly options will be insufficient. Thereafter there will need to be the possibility of leap seconds at the end of any month. In about two thousand years even that will become insufficient, and there will have to be leap seconds that are not at the end of a month.[10]
In a few tens of thousands of years (the timing is very uncertain) LOD will exceed 86401 s, causing the current form of UTC to break down due to requiring more than one leap second per day. It would be possible to then continue with double leaps, but this becomes increasingly untenable.
Both the one-leap-second-per-month and one-leap-second-per-day milestones are considered (by different theorists) to mark the theoretical limit of the applicability of UTC. The actual number of leap seconds needed to keep track of time would become unwieldy by current standards well before these milestones are reached, but presumably if UTC were to continue then horological systems would be redesigned to cope with regular leap seconds much better than current systems do.
There is a proposal to redefine UTC and abolish leap seconds, such that sundials would slowly get further out-of-sync with civil time.[11] The resulting gradual shift of the sun's movements relative to civil time is analogous to the shift of seasons relative to the yearly calendar that results from the calendar year not precisely matching the tropical year length. This would be a major practical change in civil timekeeping, but would take effect slowly over several centuries. An ITU study group is to vote on this possibility during 2008, possibly leading to official approval by the World Radio Conference in 2011 and the cessation of leap seconds in 2013.
There is also a proposal that the present form of UTC could be improved to track UT1 more closely, by allowing greater freedom in scheduling leap seconds.[12]
Uses
UTC is the time system used for many Internet and World Wide Web standards. In particular, the Network Time Protocol, which is designed to synchronise the clocks of many computers over the Internet (usually to that of a known accurate atomic clock), uses UTC.
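As a minimal illustration of working in UTC in application code (a tooling sketch, not part of any standard named here), Python's standard library can represent and format UTC timestamps directly:

```python
# Obtaining and formatting the current time in UTC, as network-time-
# synchronised systems exchange it.
from datetime import datetime, timezone

now_utc = datetime.now(timezone.utc)             # current UTC time, tz-aware
stamp = now_utc.strftime("%Y-%m-%dT%H:%M:%SZ")   # ISO 8601; "Z" denotes UTC
```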
Those who transmit on the amateur radio bands often log the time of their radio contacts in UTC, as transmissions can go worldwide on some frequencies. In the past, the FCC required all amateur radio operators in the United States to log their radio conversations.
Some watches include a second 12-hour display for UTC (such as Pontos Grand Guichet GMT).
UTC is also the time system used in aviation, referred to as Zulu.[13] Weather reports, flight plans, air traffic control clearances, and maps all use UTC to avoid confusion about time zones and daylight saving time.
Because of time dilation, a standard clock not on the geoid, or in rapid motion, will not maintain synchrony with UTC. Therefore, telemetry from clocks with a known relation to the geoid is used to provide UTC, when required, in locations such as spacecraft.
UTC is a discontinuous timescale, so it is not possible to compute the exact time interval elapsed between two UTC timestamps without consulting a table of the leap seconds that occurred during that interval. Therefore, many scientific applications that require precise measurement of long (multi-year) intervals use TAI instead. TAI is also commonly used by systems that cannot handle leap seconds. GPS time is likewise at a fixed 19-second offset from TAI.
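These fixed relationships can be sketched as simple conversions. The 33 s TAI−UTC offset is the value current as of this article (it grows with each positive leap second), while TAI−GPS is fixed by definition; the function names are illustrative:

```python
# Offsets between the timescales discussed above, in whole seconds.
TAI_MINUS_UTC = 33   # as of 2006; increases by 1 with each positive leap second
TAI_MINUS_GPS = 19   # fixed by the definition of GPS time

def utc_to_tai(utc_seconds: float) -> float:
    """Convert a count of seconds on the UTC scale to TAI."""
    return utc_seconds + TAI_MINUS_UTC

def utc_to_gps(utc_seconds: float) -> float:
    """Convert a count of seconds on the UTC scale to GPS time."""
    return utc_seconds + TAI_MINUS_UTC - TAI_MINUS_GPS
```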
For most common and legal-trade purposes, the fractional second difference between UTC and UT (GMT) is inconsequentially small, so UTC is often called GMT, for example by the BBC, although that usage is not technically correct.[14]
Time zones
Time zones are expressed as an offset of an integral number of hours (and sometimes minutes) from UTC. At every single instant, the number of seconds is the same in all time zones. This is important when leap seconds occur. When a positive leap second occurs, it is denoted as "23:59:60" in UTC, and (for example) "20:29:60" in the Newfoundland time zone at UTC−03:30. If a negative leap second occurs, such that "23:59:59" does not occur in UTC, it is "20:29:59" that is missing in the UTC−03:30 time zone.
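A short sketch of this point: the leap second occurs at the same instant everywhere, and only the local label differs. The helper below (a hypothetical name, for illustration) maps a UTC offset in minutes to the local wall-clock label of the positive leap second 23:59:60 UTC:

```python
# Local label of the positive leap second 23:59:60 UTC in a given zone.
def leap_second_label(utc_offset_minutes: int) -> str:
    """Return the local HH:MM:60 label for the UTC leap second 23:59:60,
    for a time zone at the given offset from UTC in minutes."""
    total = (23 * 60 + 59 + utc_offset_minutes) % (24 * 60)
    return f"{total // 60:02d}:{total % 60:02d}:60"
```

For Newfoundland at UTC−03:30, the offset is −210 minutes, giving the "20:29:60" label mentioned above.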
The UTC time zone is sometimes denoted by the letter Z – a reference to the equivalent nautical time zone (GMT), which has been denoted by a Z since about 1950. The letter also refers to the "zone description" of zero hours, which has been used since 1920. See time zone history. Since the NATO phonetic alphabet and amateur radio word for Z is "Zulu", UTC is sometimes known as Zulu time. This is especially true in aviation, where Zulu is the universal standard.[15] This ensures all pilots regardless of location are using the same 24-hour clock, thus avoiding confusion when flying between time zones.[16][17]
