Last Sunday was the end of our Summer-time period. We all set our watches back one hour and were thus allowed to spend 60 minutes more in bed. I completely forgot all about it, and it wasn't until 4pm (when a friend pointed it out to me) that I realised I had just won an extra hour of Sunday. My watch is now back in GMT, Greenwich Mean Time.
I used to think GMT was, by definition, the official time in the UK. It wasn't until recently that I learnt that the two notions are only equivalent in Winter, and that during the period in which Western European Summer Time is used, the official time is technically GMT+1. In other words, while the official time makes two jumps of one hour every year, GMT doesn't. So strictly speaking, if you want to meet up with someone at noon in England in the middle of July, and you want to be a pedant, you should say "11am GMT". Since I learnt this, I have met at least two people who used GMT in the wrong sense, so I take it a lot of people in fact misunderstand the notion of GMT and see it more as a way of distinguishing between time zones than as a technical term defining a time independent of the season. Greenwich Mean Time is, to be precise, the mean solar time at the Royal Observatory in Greenwich. That doesn't mean the Sun will be at its highest point exactly at noon GMT every day; it means that, averaged over the year, the Sun reaches its highest point in the sky at noon GMT -- or something like that. Hence the word "Mean".
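If you have Python 3.9 or later to hand (with time zone data installed), the standard-library `zoneinfo` module makes the distinction easy to check: the `Europe/London` zone sits at UTC+0 (i.e. GMT) in January but UTC+1 (British Summer Time) in July. A minimal sketch:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # standard library since Python 3.9

london = ZoneInfo("Europe/London")

# Noon local time on a winter day and a summer day
winter = datetime(2023, 1, 15, 12, 0, tzinfo=london)
summer = datetime(2023, 7, 15, 12, 0, tzinfo=london)

print(winter.utcoffset())  # 0:00:00 -> official time is GMT
print(summer.utcoffset())  # 1:00:00 -> official time is GMT+1
```

So noon in London in mid-July really is 11am GMT, as promised.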
I dug deeper into this and made some peculiar discoveries. It turns out that there exists a plethora of other time standards, the most important one being Coordinated Universal Time, or UTC (more on that abbreviation later). It is based on the international scientific definition of a second:
The second is the duration of 9,192,631,770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the Caesium 133 atom.
This is a direct quote from the Bureau International des Poids et Mesures, the organisation in charge of international standard units, or SI units. Don't worry if you don't understand it; neither do I. But here's the important bit: the advantage of this definition is that it is constant, unlike previous definitions of the second. For example, the time it takes for the Earth to complete one rotation varies slightly over time, so a definition of the second based on that would yield a unit that wasn't constant.
To get back to our time standards: UTC uses this definition of the second and the assumption that one day lasts for 86,400 seconds (or equivalently, 24 hours, each consisting of 60 minutes, each consisting of 60 seconds) to create a precise time standard. However, it still needs a way to somehow relate to the time indicated by the position of the Sun in the sky. There is no point in having an extremely precise and unambiguous time standard if it isn't linked to the relative positions of the Earth and the Sun. How do we make sure that the noon we observe here on Earth is always exactly 12:00:00 in UTC? This is where another version of Universal Time comes into play, namely UT1.
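The arithmetic of the 86,400-second day, and the fact that most programming environments hand out UTC directly, can both be seen with nothing more than Python's standard `datetime` module (a minimal sketch):

```python
from datetime import datetime, timezone

# UTC assumes a day of exactly 24 * 60 * 60 = 86,400 SI seconds.
seconds_per_day = 24 * 60 * 60
print(seconds_per_day)  # 86400

# A timezone-aware "now" in UTC; its offset from UTC is zero by definition.
now_utc = datetime.now(timezone.utc)
print(now_utc.isoformat())
```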
UT1 is another time standard, and hasn't got anything to do with the above definition of a second. In technical terms, it is proportional to the true rotation angle of the Earth with respect to a fixed frame of reference. What does that mean? Essentially, it means that UT1 measures how far the Earth has turned, and sets the time accordingly. To put it very crudely, it is the "actual time" on Earth. It is in that sense a "Universal Time", since it only depends on the Earth's position in space. I won't go into the details of how UT1 is worked out; suffice it to say that it is of high precision, and can measure time down to the nearest millisecond (well, almost).
Now, it turns out that the UT1 time standard is slightly slower than the UTC standard. The rotation of the Earth isn't nice and constant, so the "actual time" on Earth will gradually lag behind the scientific UTC standard. This doesn't mean the rotation of the Earth is slowing down (actually it is, but that's a different matter); it just means that the definition of the second given above is a little too "slow" for this specific purpose. The solution to the problem is to regularly "adjust" UTC so that it follows UT1. What this means in practice is that every now and again, an extra second is added to UTC, giving UT1 the time to "catch up". These seconds, appropriately named "leap seconds", happen on average every 19 months, every time the difference between UTC and UT1 becomes too great. The concept of "leap seconds" is analogous to the idea of "leap years": an extra day is added roughly every 4 years, because a "gap" has appeared between the conventional calendar we use and the actual position of the Earth in its orbit around the Sun. Using this analogy, UTC corresponds to our calendar, and UT1 to the Earth's position. There have been 24 leap seconds in total between their adoption in 1972 and today.
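The bookkeeping behind this can be sketched in a few lines. The actual decision is made by the IERS, which announces a leap second whenever the difference UT1 − UTC is in danger of exceeding 0.9 seconds; the function and threshold below are just a toy illustration of that rule, not anyone's real implementation:

```python
def needs_leap_second(dut1: float, threshold: float = 0.9) -> bool:
    """Toy rule: dut1 is the difference UT1 - UTC in seconds.

    Leap seconds keep |UT1 - UTC| below roughly 0.9 s, so once the
    difference drifts close to that threshold, an extra second is
    inserted into UTC, letting UT1 "catch up".
    """
    return abs(dut1) >= threshold

print(needs_leap_second(-0.3))   # False: UT1 and UTC are still close enough
print(needs_leap_second(-0.95))  # True: time to add a leap second
```

Note that because the Earth lags behind, the difference drifts negative: UT1 falls behind UTC until the inserted second pauses UTC for a moment and closes the gap.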
UT1 was introduced in 1928, and when the caesium atomic clock was invented in 1955, various time standards began cropping up, resulting eventually in UTC (officially initiated in 1961) and the subsequent leap-second adjustments. However, there was an issue about the name: The English wanted it to be called CUT (Coordinated Universal Time) while the French wanted it to be called TUC (Temps Universel Coordonné). In the end a compromise was reached, and the name UTC was agreed upon.
Apparently, UTC replaced GMT in most contexts on January 1, 1972. The time used on international news channels like BBC or CNN is in fact Coordinated Universal Time, and most Internet applications use it as well. Makes you wonder why no-one seems to know about it.
I think from now on, I'll use the UTC acronym instead of GMT, just to confuse the hell out of everyone.