The key reason is the Coordinated part. By letting the Moon handle its own timekeeping, we reduce the hassle of getting regular updates from Earth clocks. Earth-based atomic clocks will continue to measure their own precise second lengths and take measurements of quasars to get accurate wall-clock time. Moon-based atomic clocks can form their own separate network and measure the same quasars to get the same wall-clock time.
Maybe not just that. We generally need to establish the time of events on another celestial body and process that time locally. Having separate clocks is trivial, but correctly recording the sequence of events that happened a few light-seconds away is interesting once you take all the effects into account. Should we record the local time of the event? The observer's time (= local time + distance + gravitational effects)?
Looks like "no" but a few years ago they launched / tested the "deep space atomic clock" which seems like it'll be the basis for future space-based clocks.
If we can't use UTC for whole universe, or even for the nearby solar system, should we rename it Coordinated Global or Earth Time (GTC or ETC) instead?
What you are describing is not just impractical, it is actually impossible. Someone linked to "falsehoods programmers believe about time" but I think maybe "falsehoods non-physicists believe about time" might be more apt.
The somewhat depressing outcome of Einstein's theories is that there is no such thing as a universal frame of reference.
I don't think the GP is asking for a universal reference frame in the sense physics uses the term. I think he's asking for a universal programmatic reference frame to describe the offsets that satellites and lunar stations have.
Even though time runs faster on satellites, and the signal delay differs based on position, the trajectories of the satellites and the Moon are known, as is the Moon's gravity, so you could calculate what those offsets should be and synchronise clocks anyway.
To put it another way: you normalise out the clock drift and the delay, and what's left over is the same kind of clock drift you already account for between the quartz clock in your computer and a time server running an atomic clock.
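That normalisation is the same trick NTP uses on Earth. As a sketch, here is the classic four-timestamp offset estimate (the function name and the timestamps are made up for illustration; real NTP also filters many samples):

```python
def clock_offset(t0, t1, t2, t3):
    """t0: request sent (local), t1: request received (server),
    t2: reply sent (server), t3: reply received (local).
    Returns the estimated offset of the local clock from the server clock."""
    return ((t1 - t0) + (t2 - t3)) / 2

# Symmetric 0.1 s one-way delay, local clock 2 s behind the server:
print(clock_offset(t0=100.0, t1=102.1, t2=102.1, t3=100.2))  # 2.0 (up to rounding)
```

With a symmetric delay the path terms cancel and only the clock offset remains; asymmetric, varying delay is exactly the part that needs modelling for satellite and lunar links.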
The point being, clock drift can become a huge issue unless you define a specification that allows any equipment to sync up with any other clock in Time And Relative Dimensions In Space.
You can select a rest frame based on the velocity of all observable galaxies adjusted for the expansion of the universe and the impact of local sources of gravity.
As far as we can tell, any observer would come to similar conclusions about what that reference frame would be. It's just not a particularly useful reference.
There is no objective concept of 'now' even at distance scales of Alpha Centauri with reasonable curvature and speeds.
If you want to stick to special relativity, the Andromeda paradox will show the problem with simultaneity at distance, and Einstein's train thought experiment will show the problem with simultaneity at high speed.
The order of events, timing of events, and even causal structure of events are not fixed.
Even the Conventional Celestial Reference System is only quasi-inertial, i.e. only approximately inertial over small periods of time.
The fact that the surface of the Earth so smoothly approximates a single co-rotating frame, within a generous error tolerance, is the only reason we can treat time as absolute there.
But deriving the new definition of the kilogram requires lots of corrections, as do modern atomic clocks.
Don't confuse the ability to choose a reference frame with the Newtonian concept of universal reference frames.
When making measurements, SR is a framework that allows you to pick any reference frame that is convenient, but the speed of causality (light) prevents the existence of a universal reference frame.
> The order of events, timing of events, and even causal structure of events are not fixed.
You can calculate the observed order of events in any reference frame from any other reference frame. Thus they are fixed once everyone agrees to use a given reference frame.
It's not much more complicated than seismic detectors recording the waves from an earthquake at different times based on their locations. Sure, detector X sees earthquakes in order A, B, C and detector Y sees them in order C, B, A, but they would both agree that detector Z should see them in order A, C, B based on its location, the timing of the events, and the wave propagation speeds.
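The seismic analogy is easy to make concrete. A hypothetical sketch (made-up coordinates and times, one uniform wave speed):

```python
import math

def arrival_order(detector, events, wave_speed):
    """Order in which a detector at (x, y) observes events {name: (x, y, t)}."""
    arrivals = []
    for name, (x, y, t) in events.items():
        dist = math.hypot(detector[0] - x, detector[1] - y)
        arrivals.append((t + dist / wave_speed, name))
    return [name for _, name in sorted(arrivals)]

# Three events along a line, each one second apart:
events = {"A": (0, 0, 0.0), "B": (50, 0, 1.0), "C": (100, 0, 2.0)}
print(arrival_order((0, 0), events, wave_speed=10.0))    # near A: ['A', 'B', 'C']
print(arrival_order((100, 0), events, wave_speed=10.0))  # near C: ['C', 'B', 'A']
```

Every detector disagrees about the observed order, yet all of them can predict what any other detector should see from the shared event data.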
It only seems unintuitive because you aren’t used to accounting for such issues.
In the case of Δτ² = 0, that is equivalent to an inertial observer that is present at both events at the same time, which will seem to be simultaneous.
Δτ and Δt are practically the same in situations we commonly experience and an assumption of constant velocity motion in flat spacetime is usually accurate enough.
And with an assumption of constant velocity motion in flat spacetime, you can CHOOSE a Δτ that is 0.
There is no universal reference frame where Δτ =0.
More importantly, your observations of distant observers are of the past, and there is no universal concept of 'now' outside of the local case.
You just completely missed my point. GR and curved spacetime again aren't an issue as long as everyone knows what spacetime looks like.
Mission Control on Earth, a moon base, and Mars would all experience time at different rates, and those rates would vary based on the orbits of each body. However, assuming we needed extreme precision, such as for telescope interferometry, people can pick one of them arbitrarily, or perhaps, due to politics, some arbitrary fourth option. Whatever the choice, everyone's timelines based on it end up being absolutely identical.
The math simply works without any ambiguity. And because there's an obvious choice based on the actual universe we live in, i.e. the maximum amount of time since the Big Bang, in theory trillions of civilizations across many different galaxies might all use the exact same reference even if no particle ever experienced it.
Edit: well, not exactly the same; there's uncertainty in such calculations, but effectively the same.
I don’t get it, can’t you just set off neutron bomb every second on Earth and use it as a monotonic counter signal?
Isn’t that like, speed of light or wavefront of event propagation in the universe isn’t consistent but unavoidably potato shaped - but if so, wouldn’t it be at least difficult to have multiple potatoes intersecting, or potato being shaped like Klein bottle? Self intersecting potato shaped space-time would be cool, though.
The problem is that they aren't 'potato shaped' rotations; they are better thought of as hyperbolic rotations. Simultaneous distant events under Lorentz transformations all lie on a shared hyperbola, not a distorted circle.
Really this is beyond typical human intuition and requires math, but here is a quick minutephysics video that demonstrates it fairly well without me typing out a bunch of math that no one wants anyway.
I wasn't suggesting we should have a true universal time; I was suggesting that we rename the mislabelled Coordinated Universal Time that we already have to something else... :)
Why can't Earth time be the universal frame of reference, and everywhere else use a factor of earth time? This article describes the rate difference already, why can't we use that?
For the Moon that potentially works, because we're close enough to the Moon that the drift is dominated by local gravitation rather than relative motion. But for any other context, say Mars, the problem is that the clock drift will be non-uniform. An hour on Earth (as measured by a precise clock on Earth) vs. an hour on Mars (as measured by a precise clock on Mars) will not be a fixed ratio, but will depend on the relative positions and velocities at the time you start measuring the hour.
Aren't those relative positions and velocities predictable enough? I believe NASA regularly makes predictions somewhat like this, such as when specific large asteroids are predicted to come near earth.
Sure, just like the relativistic effects on GPS clocks are predictable. But it makes clock keeping on Mars much more complicated: you're always making corrections, the corrections are always changing, and syncing time is hard because the motions are always changing. The simplest solution is to use local atomic clocks, and then calculate Earth time when needed.
All for zero benefit, because Earth is a variable number of light-minutes from Mars, so there will always be variation. A Mars day is different from an Earth day, so you have to calculate Earth time anyway.
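To make the non-uniform drift concrete, here is a hedged sketch of the first-order clock-rate factor, combining gravitational potential and speed. The helper name and the rounded values (solar potential at each orbit plus the planet's surface potential, mean orbital speeds) are illustrative assumptions; the real ratio wanders as positions and speeds change along the orbits:

```python
C = 299_792_458.0  # speed of light, m/s

def rate_factor(potential, speed):
    """Instantaneous dtau/dt relative to a distant, at-rest reference clock,
    to first order in potential/c^2 and (v/c)^2."""
    return 1.0 - potential / C**2 - speed**2 / (2 * C**2)

# Sun's potential at each orbit + the planet's surface potential, mean orbital speed:
earth = rate_factor(potential=8.87e8 + 6.25e7, speed=29_780.0)
mars = rate_factor(potential=5.82e8 + 1.26e7, speed=24_070.0)
print(mars / earth)  # slightly above 1: a Mars clock runs a hair fast vs Earth's
```

These are single-instant numbers; plugging in the actual time-varying positions and velocities is exactly the ever-changing correction the comment above describes.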
How is it depressing? Relativity of simultaneity implies the universe is basically a giant crystal, future and past both actually exist. They almost have to for relativity of simultaneity to be the way it is.
That means everything happening is happening forever, all at once.
I guess if you live a shitty existence, then it's infinitely shittier, but if not, it's infinitely meh.
You should not, because on Earth, where gravity is relatively strong compared to the Moon, time runs slightly slower due to gravitational time dilation. This means that clocks on Earth tick slightly slower than identical clocks in a region of weaker gravity, such as on the Moon's surface. So UTC is from the Earth's perspective.
From rough calculations, the time experienced on the surface of the Moon is 0.99999999999848 times that experienced on Earth.
I have seen references to this but I am constantly confused by how this is actually a thing.
There was a video I watched stating that, in some situations, two identical clocks in perfect working order can end up disagreeing: if one of them was on a ship doing certain things (purposefully vague, since I don't remember), then when that ship came back to Earth it would show a different time.
It is one of those things that really boggles my mind, even though it fascinates the hell out of me. If you have any resources to read or watch on this (the video I watched was Space Time, I think), I would love them.
The way I've understood it is that everything which can 'move' in spacetime has to have a momentum vector with constant magnitude C (when expressed in the right units, C = 1)
The faster you 'move' through space, the more you have to 'borrow' from the time component of the vector to maintain a magnitude of C
That means your 'position in time' moves slower; and so for people who aren't moving as fast through space as you are, they appear to 'experience more time'
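That picture can be checked numerically: the four-velocity (γc, γv) always has Minkowski magnitude c, no matter the speed. A small sketch (the function name is just for illustration):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def four_velocity_norm(v):
    """Minkowski magnitude of the four-velocity (gamma*c, gamma*v)."""
    gamma = 1.0 / math.sqrt(1.0 - (v / C) ** 2)
    ut, ux = gamma * C, gamma * v        # time and space components
    return math.sqrt(ut**2 - ux**2)      # invariant magnitude

for v in (0.0, 0.5 * C, 0.99 * C):
    print(four_velocity_norm(v) / C)  # 1.0 every time, up to rounding
```

The faster you move through space (bigger ux), the bigger ut must be to keep the invariant fixed, which is the 'borrowing from the time component' in the comment above.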
Therein lies the rub. To "move" proves nothing about who is "faster" because there is no absolute frame of reference. You may think that I am moving at 0.8c, but maybe it is just me slowing down to a standstill while you are still receding at 0.8c. This might be a valid interpretation of your observation of me "moving at 0.8c" if... (and only if...) there were an absolute frame of reference. But there ain't.
AIUI it is the flavors of acceleration - including accelerating, decelerating, and gravity - that tinker with time. Which is why I still can't quite wrap my mind around the Twin Paradox, because it is usually explained in terms of speed, not periods of acceleration.
What is the frame of reference, or is there even one, in this case? If you have one spacecraft moving in one direction at 0.5c flying by Earth, is the craft at 0.5c and experiencing "slower" time, or is Earth experiencing slower time? Or do they both experience the same thing as they're not accelerating wrt each other?
If they just pass by each other without changing speeds, each sees the other as experiencing slower time. For the case where one leaves and comes back, see the Twin paradox[1] which is resolved by the simple fact that to change speeds one must accelerate (or that the one who returns must have experienced at a minimum two different frames of references; one to leave and one to return).
There is no independent frame of reference by which we can tell whose time is slower.
From the spacecraft's frame of reference, Earth's time is slower. From Earth's frame of reference, the spacecraft's time is slower. Both are right.
When we say that time is slower or faster on a spacecraft, the Moon, or an exoplanet of Christopher Nolan's, we are implicitly prefixing the statement with "From Earth's frame of reference..."
There isn’t really a “why”, other than the need to match observed reality.
We know from observations that light moves at a constant speed, even when the observer is moving near the speed of light, and we know that this observation is true regardless of your frame of reference.
In order for physics to remain consistent while accounting for the constant speed of light, other things need to flex between the two reference frames: namely, time (time dilation) and length (Lorentz contraction).
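Both flexes come from the same Lorentz factor, γ = 1/√(1 − (v/c)²). A minimal sketch:

```python
import math

def gamma(beta):
    """Lorentz factor for beta = v/c."""
    return 1.0 / math.sqrt(1.0 - beta**2)

g = gamma(0.8)   # v = 0.8c
print(g)         # about 1.667
print(1.0 * g)   # 1 s of proper time looks like ~1.667 s from the other frame
print(1.0 / g)   # a 1 m rod measured from the other frame: 0.6 m
```

Time intervals stretch by γ (dilation) while lengths along the motion shrink by 1/γ (contraction), and the two effects conspire so both frames measure the same c.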
The speed of light is a universal constant in a vacuum, like the vacuum of space. However, light can slow down when it passes through a medium, like water (225,000 kilometers per second = 140,000 miles per second) or glass (200,000 kilometers per second = 124,000 miles per second).
Light propagation in a medium is a quite different thing from light in vacuum.
For example, the speed of light in a medium is not the "speed limit" of things in the same medium, and particles in it can actually move faster than light: https://en.wikipedia.org/wiki/Cherenkov_radiation
In other words, the speed of light in vacuum plays a special role in both a vacuum and a medium.
In other words, "to make the math work out". That's kinda what I was poking at, trying to understand if that is some fundamental truth or if it is the result of some underlying mechanism that is more fundamental.
But a century of experimental and observational data proves that it is.
At this point it's generally just taken as a fact that the speed of light is constant for all observers. The explanation given above falls out as a direct mathematical consequence.
All of those things fall out from the speed of light being independent of the speed you are moving at (i.e. regardless of how fast or slow or what direction you are going, you will always get the same answer when you measure the speed of light in a vacuum).
The easiest one to explain is probably the most mind-bending: whether or not an event is "simultaneous" depends on the frame of reference.
You are sitting in a spaceship that has a large empty cargo bay. There is a lightbulb in the exact center of the cargo bay. If you turn the bulb on, the light will hit the front and the rear of the cargo bay at the exact same time. This is true regardless of the speed the spaceship is traveling at (because the speed of light will always be measured as the same value).
Now think of what happens from the frame of reference of someone on a planet as the spaceship is passing by. Since the spaceship is moving forwards, and the light moves in both directions at the same speed, the light will hit the back of the cargo bay slightly before the light hits the front of the cargo bay. The faster the spaceship is moving, the bigger this difference.
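In units where c = 1, the planet-frame arrival times are easy to compute: the bay is length-contracted, the back wall runs into the flash, and the front wall runs away from it. A sketch (the function and the numbers are illustrative):

```python
def arrival_times(v, half_length=1.0):
    """Planet-frame times for light from the bay's center to reach the back
    and front walls; v is the ship speed as a fraction of c, half_length is
    the rest-frame distance from bulb to wall in light-seconds."""
    gamma = (1.0 - v**2) ** -0.5
    contracted = half_length / gamma   # length-contracted half-length
    t_back = contracted / (1.0 + v)    # back wall rushes toward the flash
    t_front = contracted / (1.0 - v)   # front wall runs away from it
    return t_back, t_front

print(arrival_times(0.0))  # equal: simultaneous for the planet too
print(arrival_times(0.5))  # back strictly before front
print(arrival_times(0.9))  # the gap grows with speed
```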
It will forever boggle me but the way I sort of rationalize it is that speed basically borrows energy from time. Something like that. So the faster you go, the slower the time goes. So if you go the speed of light, you're taking so much time away that when you make a roundtrip, you're like 50 years into the future.
That's a totally wrong explanation but it helps me sort out the effects and what to expect. If you fly quickly around the Earth, your watch is a tiny bit different than the stationary clock.
AFAIK one can also consider the 4D velocity vector to have a constant length. Thus the faster you move in the spatial dimensions, the slower you move in the time dimension, to maintain the constant length.
As the sibling comment alludes to, it's that otherwise they wouldn't behave according to what we measure.
For a deeper "why", see Feynman[1].
edit: it's related to how spacetime with a finite speed of light is represented mathematically. There's some discussion here[2] which might shed some light.
This is close to the main intuitions of special relativity as a geometric theory. I would phrase it more as "speed borrows space from time". In general relativity (and special relativity if you look at it geometrically) every reference frame moves at 1 second / second in its own coordinates, but an observer in a different frame will see you move in their coordinates: the 4-vector of your time / position keeps the same magnitude, so since you are moving faster in space, your time is moving slower.
This sounds cool, but I don't think it has explanatory value. It's true that moving close to the speed of light will mess with your intuitions about the passage of time and give you things like the Twin Paradox. But there's nothing special about time there, it also makes things shorter by Fitzgerald contraction.
You can't be in the future, for example. The point of relativity is the equivalence of reference frames, it's not like the space traveller's clock is wrong and the stayhome's clock is correct, it's that humans who never travel at speeds close to the speed of light don't expect that two good clocks could ever disagree.
The underlying reason is causality. Causality is the concept that an event can only be caused by another event in its past (not its future), and events can only impact their own future. Causality cannot propagate faster than the speed of light, ergo for a given event you can express the area of space that could have caused that event as a function of time (it's a circle with radius equal to the amount of time in the past you're looking times the speed of light; e.g. 10s before an event, that event could have been caused by another event anywhere within 10s*the speed of light).
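That "area of space that could have caused the event" is the past light cone, and checking membership is a one-liner. A sketch in units of seconds and light-seconds so c = 1 (the function name and events are made up):

```python
def could_have_caused(b, a):
    """True if event b = (t, x, y, z) lies in the past light cone of event a,
    i.e. b is earlier and close enough for light to bridge the gap."""
    (tb, xb, yb, zb), (ta, xa, ya, za) = b, a
    dt = ta - tb
    dist = ((xa - xb) ** 2 + (ya - yb) ** 2 + (za - zb) ** 2) ** 0.5
    return dt > 0 and dist <= dt

a = (10.0, 0.0, 0.0, 0.0)
print(could_have_caused((0.0, 5.0, 0.0, 0.0), a))   # True: 5 ls away, 10 s earlier
print(could_have_caused((0.0, 20.0, 0.0, 0.0), a))  # False: outside the cone
```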
If time is constant, objects moving at a significant fraction of the speed of light break causality. They're moving fast enough that causality propagating in the opposite direction of their velocity (things happening in the future) reaches them faster than it should, and causality propagating in the same direction as them reaches them slower than it should (because they're moving away from it at a significant fraction of the speed causality is approaching them at).
E.g. let's say there are two massive celestial bodies, A and B. A is not moving at all; B is moving at 50% the speed of light. Let's say they pass close enough to gravitationally interact, but are half a light-year away from each other. Once they pass each other, causality from B will propagate to A at the speed of light, like normal. Causality seems fine.
But causality will propagate from A to B much, much slower because of the relative velocity. If B is 1 light-year away, it would actually take something like 1.3 years for causality to reach it. In other words, A is within B's causality radius (B can cause effects on A), but B is not within A's causality radius (or rather, it is when the event happens, but it won't be there by the time causality gets there). That's a problem for something like gravity. B's gravity can influence A, but A's gravity can't be the cause of events on B, because of the speed of causality. Thus the only valid event is B's gravity on A, accelerating A without decelerating B, meaning we would have actually created energy (at least until causality catches up).
To maintain causality, and preservation of energy, something has to happen to B such that A interacts with B at the same time B interacts with A. The answer is to make movement through time and movement through space inversely correlated. If B is moving fast enough that light takes 30% longer to get there, B's time has to slow down by the same amount so that causality can be simultaneous and not create energy.
Causality essentially requires movement through space and movement through time to add up to some constant. As movement through space increases, movement through time decreases and vice versa. It's basically a formula like (current speed/speed of light) + rate of passage of time = 1.
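One caveat on the formula: the exact relation is quadratic rather than linear, like Pythagoras. Since dτ/dt = √(1 − (v/c)²), it is the squares of the two terms that sum to 1. A quick check:

```python
import math

def time_rate(beta):
    """Proper-time rate dtau/dt at speed beta = v/c."""
    return math.sqrt(1.0 - beta**2)

for beta in (0.0, 0.6, 0.99):
    print(beta**2 + time_rate(beta)**2)  # 1.0 every time
```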
That's the underpinnings of the idea that FTL travel will allow time travel. If (current speed/speed of light) is greater than 1, the passage of time has to be negative or flowing backwards to maintain causality. I.e. you are moving so quickly that you can catch up to and interact with causality propagating through the universe.
Somewhere out on the edges of the universe, the causality of the meteor that killed the dinos is still propagating outwards. Perhaps if we could move fast enough to reach that wave of causality, we would be able to interact with it somehow. It's all theoretical, and hand-wavy, and trippy, but interesting in concept.
There is also time dilation from motion through space (motion through space + motion through time is equal to the speed of light, so as you speed up, your motion through time slows down) as well as time dilation due to gravity (like when near a black hole I think).
I'm also not a physicist, but read some popsci books in my youth like "A brief history of time" from Stephen Hawking (it's written for normal people). That's probably the best book you'd want. He later wrote a more modernized version called "a briefer history of time" as well. They both cover time dilation from a high level iirc. Of course, to truly understand you'd need to learn the math. I'm looking forward to doing that during retirement in a few decades.
A Brief History of Time deals with this subject very well. It's a very easy and quick read, meant for a wide audience, not a technical one. It's readable in an afternoon.
The concept that time is relative to the observer is where the theory of relativity gets its name from.
It's a bit easy to understand that in the context of gravity, because gravity bends light down. If you shine a flashlight on Earth, that light is in free-fall, bending down. Time follows the same path, because the speed of light is fixed.
It's pretty simple. In places with high gravity, time goes slower. In fast moving containers (car, plane, spaceship), mass+gravity are also increased. This is the reason why shining a light forward on a moving train doesn't make the photons go at the speed of light + the speed of the train. The speed of the light emitted from the flashlight is slowed down by the mass/gravitation increase through the movement of the train.
This way, travelling to the future is pretty easy, by the way. Just travel in a vehicle at almost the speed of light and the outside world's time will move faster (relatively), since you are slowed down.
This is actually the guy that finally helped me break through into understanding (at least a little) what's going on. Specifically, about how movement through space lengthens the distances between objects (including atoms) and causes information (including light) to travel further, along the hypotenuse of a triangle, "in order to do stuff".
"Doing stuff" includes observing things because of the light travel distance, but also biological processes, which involve eg. electrons moving from one atom to another.
I didn't watch the linked video here (more relevant, perhaps), but this was the one for me: https://youtu.be/Vitf8YaVXhc
The concept of time dilation in general is actually pretty easy to explain, with the use of a relativistic starship! Imagine we create a ship that can constantly accelerate at 1g. What happens if you're on that ship and start to approach the speed of light (relative to an observer back on Earth)? Well, some quite weird stuff, but you would not actually slow down! The speed of light is not a speed limit in the way most people think of it. A human could easily travel billions of light years in a single lifetime.
But it is true that nothing can ever be perceived as traveling faster than the speed of light. So how can you travel billions of light years in a few decades, yet never be perceived as going faster than the speed of light, one light year per year? Simple: the universe, like a simulation filled with spaghetti code, starts to cheat and changes the rate at which things move through time. An observer back on Earth would see your ship accelerate towards the speed of light, but then hit an insurmountable asymptote just before it.
So if you traveled a million light years, they would see your trip taking a million years. But by contrast, only about 26 years would pass for you. So if you traveled a million light years out in our 1g accelerating ship, and then a million light years back, it would take you 52 years, but 2 million years would have passed on Earth. There's a calculator for such trips here. [1] This whole effect is called time dilation. Gravitational time dilation is just a special case of general time dilation, and is essentially the time dilation factor driven by the velocity needed to escape the gravitational well created by a body. So - more massive objects result in greater time dilation. It leads to interesting things like the core of the Earth actually being younger than its surface!
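Those numbers can be sanity-checked with the standard relativistic-rocket formula for a trip that accelerates to the midpoint and decelerates to rest: τ = 2(c/a)·acosh(1 + ad/2c²). A sketch with rounded constants:

```python
import math

C = 299_792_458.0   # speed of light, m/s
G = 9.81            # 1g, m/s^2
LY = 9.4607e15      # metres per light year
YEAR = 3.156e7      # seconds per year

def proper_time_years(distance_ly):
    """Ship time for an accelerate-to-midpoint, decelerate-to-rest 1g trip."""
    d = distance_ly * LY
    return 2 * (C / G) * math.acosh(1 + G * d / (2 * C**2)) / YEAR

print(proper_time_years(1e6))  # about 27 ship-years for a million light years
```

That lands within a year or so of the figure quoted above; the small gap comes from rounding the constants and from which flight profile you assume.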
---
A clear way this can be seen in real life (and to also emphasize this is in no way whatsoever an optical illusion) is with particle accelerators. Many emergent or unstable particles tend to decay rapidly. Yet when we accelerate them to speeds near light, we end up being able to observe them for orders of magnitude longer than their decay when at rest. It's because of time dilation. From an at rest observer, time starts to move more slowly for something moving rapidly.
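The classic worked example of this is the muon, with a rest-frame mean lifetime of about 2.2 µs. A sketch (the chosen β is just an illustrative accelerator-ish speed):

```python
import math

TAU_MUON = 2.2e-6  # seconds, muon mean lifetime at rest

def lab_lifetime(beta):
    """Mean lifetime seen in the lab for a particle moving at beta = v/c."""
    gamma = 1.0 / math.sqrt(1.0 - beta**2)
    return gamma * TAU_MUON

print(lab_lifetime(0.9994) / TAU_MUON)  # gamma of roughly 29: lives ~29x longer
```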
All of this should also be taken with a general 'for illustrative purposes' asterisk. I'm leaving out lots of things, like how as you approached the speed of light you'd start to experience length contraction. It's essentially another way that the universe cheats to ensure that everybody always perceives the speed of light as a constant.
---
This has amusing and interesting social implications for Earth as well, if and when we become capable of developing such technology: the rich and powerful, seeking immortality, will undoubtedly seek to thrust themselves into the future. Great setting for some sort of sci-fi series, not only for those on Earth, but also for those setting out into a future that may not be exactly what they were hoping for.
Accelerating 1g constantly is much more difficult than it sounds. You'd need a practically infinite fuel source or some way to generate fuel while traveling. That's why the EM-drive was so tantalizing. If we could [near] perpetually generate even the smallest amount of thrust, the implications would be unimaginable. But for now this seems impossible.
The length of a second varies at different points on earth too, and varies far more when in low earth orbit. UTC is still good enough though, even with the number of seconds in a year varying by a non-predictable amount.
> on Earth . . . time runs slightly slower due to the gravitational time dilation effect
When people say "time runs slower" in a location, that means that time spent there subjectively feels longer than it really is in the outside world. "Time runs slower at the cabin, a weekend there makes the city and office life seem like a long-ago time and place."
That's the opposite of what happens in gravitational time dilation. You spend a subjective two hours in Gargantua and decades pass on the outside.
Then, as GP implied, the U in UTC is a misnomer. Considering a replacement with a better name is logical given the development of technology and our improving ability to go out and explore.
The second is defined by taking the fixed numerical value of the caesium frequency ∆ν_Cs, the unperturbed ground-state hyperfine transition frequency of the caesium-133 atom, to be 9 192 631 770 when expressed in the unit Hz, which is equal to s⁻¹.
But with the Moon's lower gravity, time flows very slightly faster (a moon-second is 0.99999999999848 of an earth second according to a sibling post). I'm no physicist, but my assumption is that the SI definition of a second would still be true on the moon (at least relative to the moon's reference frame). So it would make sense to me to leave a second defined as a SI second, regardless of your reference frame because that's how humans, science & technology measure time passing.
I think the bigger calendar problem with space travel is the difference in the length of a day (or sol): as we know it's 24 h on Earth, while Mars is 24 h 37 m (not sure about the Moon; since it's tidally locked, its day is roughly 27 × 24 hrs). The drift of days between Mars and Earth is going to be far more noticeable than a tiny fraction of a second due to relativity.
That might be convenient for the purpose of timekeeping, but then you would also have to redefine all sorts of physical constants, which could be problematic. Oops, our lander crashed because we were doing calculations using Moon seconds, but the gravitational constant G was in Earth seconds. Okay, the difference probably isn't enough to matter for that case, but still any physics calculations where that precision is needed (say interferometry?) you'd have to keep track of which measurement system you are using. In some ways it would be worse than US -> SI because they would be close enough that it wouldn't be immediately obvious you made a mistake.
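The hazard described above is just a unit mismatch, and it is easy to sketch with an exaggerated rate difference so the error is visible (the `moon_rate` value is deliberately unrealistic):

```python
moon_rate = 0.999  # 1 moon-second = 0.999 earth-seconds: deliberately exaggerated

g_earth_units = 9.81     # m / earth-second^2
t_moon_seconds = 100.0   # a fall timed with a moon clock

# Wrong: feeding moon-clock time straight into an earth-second constant.
wrong = 0.5 * g_earth_units * t_moon_seconds**2
# Right: convert the measured time into earth seconds first.
right = 0.5 * g_earth_units * (t_moon_seconds * moon_rate) ** 2
print(wrong - right)  # roughly 98 m of silent position error
```

With the real 1e-12-scale rate difference the error would be microscopic, which is exactly why it would be so hard to notice when it finally mattered.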
I'd much rather have SI be universal than our calendar.
The space shuttle had a four-second window to start its entry burn to reach the runway back on Earth. I don't think the discrepancy between a second on Earth and a second on the Moon is going to mean the difference between a crash and a success. No space-faring vessel has had any tolerance as low as .0000001.
GTC is easily confused with Galactic Time, which will likely cause mass havoc and rewrites in dead languages like Imperial C and Scala sometime after the year 48,000 AD.
A true visionary must plan for job security for millennia to come. Next on the todo list: code up a Kwisatz Haderach to do all your data modeling work while you slack off.
On a more serious note, Galactic Time probably won't be useful because relativity introduces too much irreconcilable error at that scale. We'll be back in the early days of the railroad when every station had their own clock, adjusted to local noon.
You can't use any time for observers in different reference frames as the length of a second will be different. UTC is generally fine as far as normal time goes as we're all in roughly the same reference frame on Earth, and indeed anywhere in the Solar System. It would still drift by noticeable milliseconds per year, but we can cope with leap seconds just fine with UTC and that's a larger drift.
You'd probably want to avoid leap seconds and use TAI though, but we should avoid leap seconds anyway IMO.
> TT is indirectly the basis of UTC, via International Atomic Time (TAI).
Nobody is going to bother renaming UTC at this point, misnomer or not. Besides, if they stop adding leap seconds to UTC (as is planned) the whole point of UTC goes away anyways.
We can use (Earth based) UTC on the moon, or on the whole universe for that matter.
The problem is that on the moon, due to relativistic effects, a UTC second is not really a second, it is slightly longer, and that makes it inconvenient for conducting experiments on the moon. So for moon-based operations, moon time will be used, it can then be converted to UTC for "universal" operations.
It is not the first time we've used different clocks. In fact, we already have plenty: UT0, UT1, TAI, etc. That's because the Earth's rotation is not that great at timekeeping; atomic clocks do better, but we still want our days aligned with the sun. So depending on whether you are more interested in the Earth's rotation or in precisely counting seconds, you will use different clocks, with UTC being a compromise you can always convert to. GPS time is one such example: it is a few seconds off UTC because it doesn't account for leap seconds.
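The GPS example is concrete enough to code. GPS time runs ahead of UTC by the leap seconds inserted since the GPS epoch (1980-01-06), which has been 18 s since the end-2016 leap second; consult a current table before relying on the constant:

```python
from datetime import datetime, timedelta

GPS_UTC_OFFSET = 18  # leap seconds since the GPS epoch; valid 2017 onward (so far)

def gps_to_utc(gps_time):
    """Convert a GPS-time datetime to UTC by removing accumulated leap seconds."""
    return gps_time - timedelta(seconds=GPS_UTC_OFFSET)

print(gps_to_utc(datetime(2024, 1, 1, 0, 0, 18)))  # 2024-01-01 00:00:00
```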
All that is not even accounting for local (earth) time zones.
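To make the "you can always convert" point concrete, here's a minimal sketch of the fixed offsets involved. TAI - UTC is 37 s as of the last leap second (inserted at the end of 2016) and changes with each new one, while GPS - TAI is constant by construction:

```python
from datetime import datetime, timedelta

# Fixed relationships between the time scales mentioned above.
# TAI - UTC changes whenever a leap second is inserted; GPS time is
# permanently 19 s behind TAI because GPS matched UTC at its 1980 epoch.
TAI_MINUS_UTC = timedelta(seconds=37)  # valid since 2017-01-01
TAI_MINUS_GPS = timedelta(seconds=19)  # constant by definition

def utc_to_tai(utc: datetime) -> datetime:
    return utc + TAI_MINUS_UTC

def utc_to_gps(utc: datetime) -> datetime:
    return utc_to_tai(utc) - TAI_MINUS_GPS

t = datetime(2024, 4, 1, 12, 0, 0)
print(utc_to_tai(t))  # 2024-04-01 12:00:37
print(utc_to_gps(t))  # 2024-04-01 12:00:18
```

A future LTC would presumably slot into the same picture, except its offset from UTC grows continuously instead of jumping by whole seconds.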
Aren’t seconds different based on latitude, as well as elevation? Should we perhaps have many more time zones than the 30 or so we have now, all requiring occasional leap-ms to sync up to UTC?
Fun fact / TIL: UTC stands for Coordinated Universal Time, but the abbreviation isn't CUT, which would be the "logical" choice; it's UTC.
"It came about as a compromise between English and French speakers.
- Coordinated Universal Time in English would normally be abbreviated CUT.
- Temps Universel Coordonné in French would normally be abbreviated TUC.
The International Telecommunication Union (ITU) and the International Astronomical Union wished to minimize confusion and designated one single abbreviation for use in all languages.
UTC does not favor any particular language. In addition, the advantage of choosing UTC is that it is consistent with the abbreviation for Universal Time, UT"
So the reason it's called UTC is because the French and British couldn't agree, who could have guessed? :)
Is "time zone" even the correct term when we're looking at a different frame of reference? "Welcome to Hawaii, each day here is 23.4 hours." I thought the article would be about how the lunar day is 29.5 Terran days.
An interesting question is what the frame of reference for this time is - if the Moon's orbit changes significantly and the contraction factor changes, then this timezone shouldn't be adjusted - almost "de-pegging" it from Earth's time, right?
I know some bores so good that 100 seconds passes, and I shake my watch ~ 1 second has passed, so there is a subjective localised field around some people..
To a local observer yes, but to someone moving at a different speed no! FloatHeadPhysics does a good job explaining some of this https://youtu.be/OpOER8Eec2A
Only if you disregard observers in different frames of reference interacting with each other, which you shouldn't when you need high precision and it comes to projects spanning Earth, Earth orbit, and the Moon.
GPS wouldn't work without accounting for relativity, for example.
Yeah, I guess I don't see what's funny about that statement.
Unlike "sometimes a second is longer than a second" (not literally true but it makes some sense in the context of relativity), this one just seems like a tautology to me.
Yeah, it might not be funny, but the tautology draws attention to the fact that there is no privileged frame of reference.
In other words, the only thing we can say without qualification is that a second is just a second in the same frame of reference. All other statements must be heavily qualified.
Even things like "A's second is longer than B's" are only valid in some frames of reference and not others.
But you can observe somebody or something in an accelerating frame having experienced time at a different rate.
In the twin paradox thought experiment, one of the twins really has aged slower than the other (or, from their point of view, the entire earth has aged faster than themselves).
In that sense, relativity has effects more tangible than distortion of observations across a large distance.
This was actually discussed in Heinlein's _The Moon is a Harsh Mistress_ --- and dismissed as idiocy, since being in synch with earth for communications was more important, and anyone who needed to would check a table to determine if it was lunar day or night on the surface.
There's a typo in the headline here on HN: the initialism for Coordinated Lunar Time is LCT, not CLT.
(this isn't my opinion on whether we'll end up following suit with UTC and so on, the article itself calls it "LCT", and "CLT" does not appear on the page at all)
If we will standardize time on solar system level, we’ll probably have another component of time, rather than a flat list of time zones. E.g. Mars should have its own set of zones. Maybe that means having an identifier of coordinate system, which can be omitted in local use cases.
Why? Just use UTC, there is zero reason for anything else.
ETA: There is a minor relativistic issue due to the reduced gravity. If the figures I found online are correct, we are talking about 0.02 seconds per year. Surely such a small difference can just be "smeared" away periodically?
I'm not sure about the larger issues surrounding the lunar time standard, but 55 microseconds per day is about 0.64 ppb. Even good OCXOs drift more than that. A standard crystal oscillator like the kind used in computers drifts by multiple seconds per day. Unless your oscillator costs more than your computer, that kind of accuracy is going to be beneath the noise floor.
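The arithmetic here is easy to check. Assuming the 55 us/day figure, and a typical ~20 ppm quartz crystal for comparison:

```python
# Convert the quoted lunar drift to a fractional rate, and compare it with
# an ordinary computer crystal oscillator (~20 ppm is a typical spec).
SECONDS_PER_DAY = 86_400

drift_seconds = 55e-6                           # lunar drift per day
fractional = drift_seconds / SECONDS_PER_DAY    # dimensionless rate error
ppb = fractional * 1e9
print(f"lunar effect: {ppb:.2f} ppb")           # ~0.64 ppb

quartz_ppm = 20
quartz_s_per_day = quartz_ppm * 1e-6 * SECONDS_PER_DAY
print(f"quartz drift: {quartz_s_per_day:.2f} s/day")  # ~1.7 s/day
```

Which supports the parent's point: for consumer hardware the relativistic term is buried far below the oscillator's own error.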
The issues aren't arising from cellphones and personal computers. Precise time may not be necessary for building web services, but it is the basis of precise navigation systems and other scientific endeavors such as astronomy and quantum computing. Since navigation and science will probably be way up on the todo list above Twitter, I think nanosecond-accuracy lunar time is clearly justified.
You're right that CPUs probably weren't the best example. My point was more that there are everyday things that operate at timescales where nanoseconds matter. 55 us per day is larger than the effects of either special or general relativity on GPS, but both need to be accounted for in order for the system to be useful.
No, you would have to calibrate the clocks to run slower to match earth seconds when they are in the moon's weaker gravity. Or constantly sync them to an Earth time source with over two seconds latency.
But, using your own frame of reference, you now need to convert back and forth on each cross-access; I don't really see the advantage. Notably, there is no such thing as an ontological second; it's an abstract measurement unit (loosely) bound to the rotation and revolution of the Earth. Moreover, no Date/Time system currently available has any provision for such a cross-reference of the basic progression of time. Wouldn't adding a constant coefficient be easier? (Notably, we have modified basic constants several times already.)
It would be even simpler to use a standard without leap seconds (or at least just without negative leap seconds). The drift happens slowly enough that nobody will notice the qualitative elements of 8am changing slightly over their lifetime.
I'm optimistic we'll have better synchronization mechanisms in the distant future than "8am == sunrise, +/- a few hours".
That said, interestingly, the quadratic nature of that problem (the Earth slowing down progressively faster over time) means that the gap between when we'd have to care about my solution (because we accumulate too many leap seconds per lifetime) and when the leap-second solution stops working (because you can only insert a little over 365 per year with the current protocol) is much smaller than the gap between now and when we'd first have to care about my solution.
If you're not too picky, the two solutions "expire" around the same time. The max rate of leap second insertion only admits an hour of drift per decade, which is slow enough that I still think most people wouldn't care, and they might like if our time handling is simple enough that we can't repeat the recent xz supply chain issue in hundreds of thousands of lines of increasingly complex datetime nonsense.
That was my thought. Who cares what time it feels like on the moon, if the people making plans and decisions about it are at, say, Cape Canaveral?
Once we get humans on the moon, they'll probably want their own time zones -- but in that case this solution still won't be enough, because they'll probably want the time to depend on their lunar longitude, just like it does on Earth.
No human has walked on the moon in my lifetime (last human left December 1972). I'm somewhat against going back because I like to tease old people about this (which isn't a good reason I know, but it is fun).
Even if/when humans again walk on the Moon, it will just be a short trip and they will head back to Earth. (Likely China, but India is realistic; other countries that could realistically send someone to the Moon don't seem very interested in trying.) Their missions will be planned from Earth, and they will otherwise want to match Earth time so they can contact people back on Earth.
If in the distant future we put a colony on the Moon, the people will drift apart from Earth, so I expect kids born on the Moon will not care about Earth time so much and may eventually demand a switch to Moon time. OTOH, I think what they'd really want out of Moon time (light cycles there are too long to be useful for humans) probably isn't what we think today.
I would think it is more like going from North America to Australia. Possible, and you are likely to know someone who has done it, but it isn't common because between the cost and the travel time it just isn't worth it.
Who can throw rocks better, a tribe with a giant pile of boulders at the bottom of a valley, or a single man with a pile of fist-sized stones at the top of a mountain?
It's not about how anyone feels, but about making sure that we have an accurate time reference on the Moon for things like positioning and networking, the same way GPS is used on Earth and also relies on precision timekeeping.
TAI doesn't help. Observers on Earth and the Moon will forever disagree about the other's duration of a second (unless they account for the relativity corrections that make them appear to be different).
It would have been easy to avoid the redefinitions of all other units by defining the second based on the frequency of an atomic clock that works in a null gravity field.
This would have required applying a relativistic correction to the measured frequency of any atomic clock, but it would have provided unit definitions independent of the position in the Universe.
However, before contemplating the idea of time keeping on other celestial bodies it was decided to define the second based on an atomic clock that works on the surface of the geoid.
I believe that this was a big mistake, because it ties all the SI units to the Earth and especially because it does not really avoid the use of relativistic corrections. Now the precision of the atomic clocks is so great that for most of them it is necessary to apply relativistic corrections depending on the altitude of the laboratory.
The meter is defined as how far light travels in a vacuum in 1/299792458th of a second. The speed of light in a vacuum is fixed, so unless you have identical seconds, the length of the meter changes. If you have identical seconds in lower gravity, then you get fewer of them.
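As an illustration, taking the ~55 us/day figure quoted elsewhere in the thread as the rate difference, the knock-on effect on a light-defined meter is tiny but nonzero:

```python
# If a "lunar second" differed from the SI second by the ~55 us/day rate
# mentioned in this thread, a meter realized from it via the speed-of-light
# definition would be off by the same fraction.
C = 299_792_458                  # m/s, exact by definition
rate_offset = 55e-6 / 86_400     # ~6.4e-10, dimensionless

meter_error_nm = rate_offset * 1e9       # error per meter, in nanometers
light_second_error_m = rate_offset * C   # error per light-second, in meters
print(f"{meter_error_nm:.2f} nm per meter")
print(f"{light_second_error_m:.3f} m per light-second")
```

Sub-nanometer per meter, but almost 20 cm per light-second, which is exactly the scale that matters for ranging and navigation.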
Good luck landing a craft built by multiple teams with possibly differing definitions of Newtons.
Kinda feels like agreement on the definition of the Newton would be the difference between "guidance and propulsion systems worked perfectly" and "debris was scattered across the landing zone".[1]
[1] Not an astrodynamicist, but have done enough problems with inclined planes, pulleys, springs etc to know it's quite important to get the magnitude of the forces right.
The size of football fields might need to change as well to accommodate the change in play style. You probably don't want quarterbacks dunking a touchdown in one flying leap from the 50-yard line.
The actual policy has a pretty clear explanation of why LTC is needed:
> Additionally, the navigation accuracy a system can achieve with signals from multiple space-based assets, such as a person navigating on Earth with signals from Global Positioning System satellites, depends on the synchronization of those assets with each other. At the Moon, synchronizing each lunar asset with an Earth-based time standard is difficult — due to relativistic effects, events that appear simultaneous at the Earth (e.g., the start of a broadcast signal) are not simultaneous to an observer at the Moon.
...
> Precision applications such as spacecraft docking or landing will require greater accuracy than current methods allow
...
> Beyond these operational challenges, the direct use of UTC at the Moon (i.e., without correction) as the local time scale would have cascading effects for applications that require precise metrology. International System of Units (SI) core unit definitions, including the meter and kilogram, rely on the SI definition of time. Due to relativistic effects, a non-SI unit would introduce uncertainty in core unit definitions. These types of errors will have undesired impacts, such as reducing the accuracy of mapping and inertial navigation products
>due to relativistic effects, events that appear simultaneous at the Earth (e.g., the start of a broadcast signal) are not simultaneous to an observer at the Moon.
Isn't this true on Earth as well, just to a slightly lesser degree? Why are the synchronization mechanisms used to correct for drift in LEO suddenly unable to cope when used on Luna?
- TAI: i.e. atomic time, which is basically the aggregate of a bunch of major atomic clocks to get as close as possible to "true time" on Earth.
- UT: i.e. Universal Time on earth with the different UT0, UT1, etc providing different levels of correction based on where you are on earth.
- UTC: i.e. coordinated time that is the same anywhere on earth. This is derived from TAI but receives leap seconds when it is sufficiently out of sync from UT1.
- GPS time: i.e. the specific time standard kept on GPS satellites based on their orbit. This time is derived from UTC(USNO), which is a specific UTC clock at the US Naval Observatory, and the offsets are recorded by the satellites.
So you have your GPS time and your UTC time. What this proposal is doing is effectively the same thing, i.e. creating a new coordinated time standard for the Moon (LTC) that tracks an offset of how far it's drifted from UTC so you can effectively coordinate. And eventually when the Moon gets its own GPS (which it will eventually), you'll have GPS(LTC), i.e. GPS time relative to LTC.
However, wouldn't the inclusion of a coefficient solve this more easily, and in a much more general and flexible way, than defining a frame of reference for each individual case?
My thought exactly - the fact that time runs slightly faster on the Moon than on Earth is fascinating, but the pragmatic approach would be to just compensate for the deviation.
How would you? Periodic leap seconds, or redefine the second (which would have impacts on many other units of measurements derived from it – are you prepared to also introduce lunar meters, for example)?
The compensation for the relativistic effects just makes sure that the second on the Moon is the same as the second on Earth. According to the article another commenter linked (https://www.ipses.com/eng/in-depth-analysis/standard-of-time...) this is already routinely done on GPS satellites:
> the only corrections made on atomic clocks located on satellites are very small adjustments to ensure that they remain perfectly synchronized with atomic clocks installed on the Earth (usually to correct drifts due to relativistic effects).
This takes care of the effect mentioned in the article. However, the more relevant (and debatable) question is whether "Moon time" should also observe the leap seconds (which are introduced to account for variations in Earth's rotation, so have nothing to do with the Moon).
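The GPS corrections being referenced can be reproduced to first order from textbook constants. This sketch ignores Earth's rotation and orbital eccentricity, so it lands near, not exactly on, the standard ~38 us/day net figure:

```python
# The two relativistic effects on a GPS satellite clock, to first order.
# Gravitational blueshift (weaker potential at altitude) makes the clock run
# fast; special-relativistic time dilation (orbital speed) makes it run slow.
C2 = 299_792_458.0 ** 2
GM = 3.986004e14     # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6    # mean Earth radius, m
R_GPS = 2.6561e7     # GPS orbital radius, m

grav_fast = (GM / R_EARTH - GM / R_GPS) / C2  # satellite clock gains
v_squared = GM / R_GPS                        # circular-orbit speed squared
vel_slow = v_squared / (2 * C2)               # satellite clock loses

net_us_per_day = (grav_fast - vel_slow) * 86_400 * 1e6
print(f"{net_us_per_day:.1f} us/day")  # ~38 us/day fast
```

The onboard clocks are deliberately detuned by roughly this fraction before launch so they tick at the SI rate as seen from the geoid, which is the "very small adjustments" the quoted article describes.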
This only works on the GPS satellites because they provide the GPS time for Earth's benefit, not their own.
For all internal operations requiring high precision, they'd actually have to keep their own time reference (or calculate a dynamic offset from "GPS time") or they'd get unexpected results since the atomic clocks they carry run fast with regards to the SI definition of a second.
(It could be the other way around, i.e. the clocks running correctly with regards to the satellite's frame of reference and an offset being applied to the signal, but I suspect skewing clocks to fit the Earth-based frame is easier.)
Go back a couple hundred years, before there were standard measures. A bolt from one foundry was a different size than the bolts from another foundry. It wasn't until the early 1900s that serious attempts at standardization were made. Even into the '50s/'60s (and '70s in some countries), you'd often have to go to your local blacksmith to get them to make something as simple as a bolt.
If memory serves, DST is mostly a historical thing for farming and other daylight-centric work, due to the Earth's axial tilt in relation to the sun, right? Unless we have farmers on the Moon where this is a consideration, I would hope not, given that it's a source of confusion and complexity in timekeeping software.
>It is a common myth in the United States that DST was first implemented for the benefit of farmers.[38][39][40] In reality, farmers have been one of the strongest lobbying groups against DST since it was first implemented.[38][39][40] The factors that influence farming schedules, such as morning dew and dairy cattle's readiness to be milked, are ultimately dictated by the sun, so the clock change introduces unnecessary challenges.[38][40][41]
>DST was first implemented in the US with the Standard Time Act of 1918, a wartime measure for seven months during World War I in the interest of adding more daylight hours to conserve energy resources.[42][41] Year-round DST, or "War Time", was implemented again during World War II.[42] After the war, local jurisdictions were free to choose if and when to observe DST until the Uniform Time Act which standardized DST in 1966.[42][43] Permanent daylight saving time was enacted for the winter of 1974, but there were complaints of children going to school in the dark and working people commuting and starting their work day in pitch darkness during the winter, and it was repealed a year later.
> Locating and directing this mission requires extreme precision down to the nanosecond, errors in navigation which could risk spacecraft entering the wrong orbits.
I don't see how time on the lunar surface helps with that any more than time on Earth surface does...
Might be an excellent opportunity to re-think how we handle timekeeping in general, especially as Local Solar Time will vary wildly by dint of the moon's orbital patterns.
Is it just me, or is the headline terrible? They don't want the moon to have its own timezone; they want it to have its own time standard.
i.e. CLT (which will probably end up being LTC for the same reason that UTC is neither TUC or CUT) replaces UTC for lunar operations and any putative lunar timezones would be offsets from that.
No one is landing on the moon in 2026. SpaceX is two years behind schedule building the lander, and there's not enough room in the calendar to develop and then operationalize the necessary refueling technology. The earliest feasible landing date is probably in 2028.
> is two years behind schedule building the ladder
I initially read it as that, and didn't even blink. Elon's promises are so ridiculous and inept that being two years behind on building a ladder seems entirely plausible.
Just another symbolic gesture from the White House to try to pretend that they're actually serious about achieving Artemis' goals as they continue to cut NASA's budget while raising the allocation to the waste of material that is SLS, doing nothing more than accomplishing their goal of killing science programs and purchasing votes and future 'advisor' roles at Boing.
Whatever time standard we settle on will need to be re-synchronized periodically, since, according to general relativity, time measured in Earth's gravity well will be running slightly slower than clocks on the moon with its lower gravity.
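Assuming the ~56 us/day rate difference quoted for the Moon holds, the accumulated offset between never-resynchronized clocks is easy to tabulate:

```python
# Accumulated lunar-vs-Earth clock offset with no resynchronization,
# assuming a constant ~56 us/day rate difference (the commonly cited figure).
RATE_US_PER_DAY = 56.0

for years in (1, 10, 50):
    offset_s = RATE_US_PER_DAY * 1e-6 * 365.25 * years
    print(f"after {years:2d} years: {offset_s:.3f} s")
# after  1 years: 0.020 s
# after 10 years: 0.205 s
# after 50 years: 1.023 s
```

So a drift of a full second takes roughly half a century, which is why periodic resynchronization (or a defined, continuously growing offset) is enough.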
Odd to frame it as the "White House wants..." I know the White House is a metonym for the US government, but it usually refers to something at least slightly in the political wheelhouse, like foreign policy or taxes. Why not just say NASA here?
It's literally a directive from the White House Office of Science and Technology Policy; "White House" is not being used in a figurative sense for "the US government broadly" (which, incidentally, I've never seen any major Western media do; "Washington", sure, but not "the White House") but in a literal, institutional sense.
> But isn't everything NASA does (by extension of being in the executive branch) what the "white house wants"
In some abstract philosophical sense, perhaps, but that's different than a directive from an actual White House office, which this is.
> Unless this was triggered by an executive order, which is kind of exceptional to the normal process.
Policy directives from the White House Office of Science and Technology Policy are also an exception to the normal NASA-internal decision-making process.
The first line absolutely doesn't say that it's a directive from the White House Office of Science and Technology Policy. It just says "The White House wants..." Again, "The White House" is often a metonym for the US government. [1]
I'm not sure why it didn't bother mentioning the Office of Science and Technology Policy. If I had realized that's where it came from, instead of from NASA, I wouldn't have mentioned it.
Only relevant if you simply ignore how the American government works. Yes, "the White House" is used as a metonym for the government as a whole, but with SOME critical thinking it's obvious this is coming from the executive branch, since the legislative and judicial branches are irrelevant for the purposes of directives: they pass legislation and issue judicial judgments, they don't issue directives.
"The White House" is used as a metonym for the Presidency, not the entire US government, or even for the executive branch: it would be incorrect to describe something from the various departments as originating from the White House.
The US government has something called the Executive Office of the President; that's what's included in "the White House". This is one of those offices; they're all hosted at whitehouse.gov.