Hypothetical: What happens if Time.time overflows?

Okay, so I don’t really have a specific problem to solve here, but I’m curious (and this could potentially cause problems in obscure circumstances). Time.time is a 32-bit float that increases continuously. Like any floating point value, there is a maximum value it can represent. My question is simple: what happens when Time.time reaches this value? Does it throw an OverflowException? Wrap around? Saturate to infinity? In most cases this isn’t knowledge you need, but I can see a few circumstances in which it could be extremely useful to know, just so it can be handled correctly.

Can’t happen. Floats easily go up to 10^30 and beyond (the maximum is about 3.4 × 10^38), and even a billion years is only about 3 × 10^16 seconds.

But if it did, which it will never do, sure, it would do something from that list; with standard IEEE 754 arithmetic, an addition that overflows the maximum value rounds to infinity. There are often a few things in computers/programming where you decide X isn’t going to happen, and if it does, you just let it crash.
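For scale, here is a quick sanity check of those numbers in C# (just the built-in constants, nothing Unity-specific; the class name is made up for illustration):

```csharp
using UnityEngine;

// Back-of-the-envelope check: the largest float vs. a billion years expressed in seconds.
public class OverflowSanityCheck : MonoBehaviour
{
    void Start()
    {
        Debug.Log(float.MaxValue);              // ~3.4E+38
        Debug.Log(1e9 * 365.0 * 24.0 * 3600.0); // a billion years: ~3.15E+16 seconds
    }
}
```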

Well, the single precision IEEE 754 format has 24 significant binary digits. This count is constant. So all you have to do is look at the different ranges and see how many digits sit before the “binary point” and how many are left after it.

“One year” is equal to 31,536,000 seconds. Just have a look at this table:

    Range   |    binary representation   | min diff (ms, approx)| min diff (seconds, exact)
  #----------------------------------------------------------------------------------------
   0 -    1 | 0.bbbbbbbbbbbbbbbbbbbbbbb0 |   (varies heavily)   |                          
   1 -    2 | 1.bbbbbbbbbbbbbbbbbbbbbbb0 |          0.0001      | 0.00000011920928955078125
   2 -    4 | 1a.bbbbbbbbbbbbbbbbbbbbbb0 |          0.0002      | 0.0000002384185791015625
   4 -    8 | 1aa.bbbbbbbbbbbbbbbbbbbbb0 |          0.0005      | 0.000000476837158203125
   8 -   16 | 1aaa.bbbbbbbbbbbbbbbbbbbb0 |          0.001       | 0.00000095367431640625
  16 -   32 | 1aaaa.bbbbbbbbbbbbbbbbbbb0 |          0.002       | 0.0000019073486328125
  32 -   64 | 1aaaaa.bbbbbbbbbbbbbbbbbb0 |          0.004       | 0.000003814697265625
  64 -  128 | 1aaaaaa.bbbbbbbbbbbbbbbbb0 |          0.01        | 0.00000762939453125
 128 -  256 | 1aaaaaaa.bbbbbbbbbbbbbbbb0 |          0.02        | 0.0000152587890625
 256 -  512 | 1aaaaaaaa.bbbbbbbbbbbbbbb0 |          0.03        | 0.000030517578125
 512 -   1k | 1aaaaaaaaa.bbbbbbbbbbbbbb0 |          0.06        | 0.00006103515625
  1k -   2k | 1aaaaaaaaaa.bbbbbbbbbbbbb0 |          0.1         | 0.0001220703125
  2k -   4k | 1aaaaaaaaaaa.bbbbbbbbbbbb0 |          0.2         | 0.000244140625
  4k -   8k | 1aaaaaaaaaaaa.bbbbbbbbbbb0 |          0.5         | 0.00048828125
  8k -  16k | 1aaaaaaaaaaaaa.bbbbbbbbbb0 |          1.0         | 0.0009765625
 16k -  32k | 1aaaaaaaaaaaaaa.bbbbbbbbb0 |          2.0         | 0.001953125
 32k -  64k | 1aaaaaaaaaaaaaaa.bbbbbbbb0 |          4.0         | 0.00390625
 64k - 128k | 1aaaaaaaaaaaaaaaa.bbbbbbb0 |         10.0         | 0.0078125
128k - 256k | 1aaaaaaaaaaaaaaaaa.bbbbbb0 |         20.0         | 0.015625
256k - 512k | 1aaaaaaaaaaaaaaaaaa.bbbbb0 |         30.0         | 0.03125
512k -   1M | 1aaaaaaaaaaaaaaaaaaa.bbbb0 |         60.0         | 0.0625
  1M -   2M | 1aaaaaaaaaaaaaaaaaaaa.bbb0 |        100.0         | 0.125
  2M -   4M | 1aaaaaaaaaaaaaaaaaaaaa.bb0 |        200.0         | 0.25
  4M -   8M | 1aaaaaaaaaaaaaaaaaaaaaa.b0 |        500.0         | 0.5
  8M -  16M | 1aaaaaaaaaaaaaaaaaaaaaaa.0 |       1000.0         | 1.0
 16M -  32M | 1aaaaaaaaaaaaaaaaaaaaaaa0. |       2000.0         | 2.0
 32M -  64M | 1aaaaaaaaaaaaaaaaaaaaaaa00 |       4000.0         | 4.0

So after one year you have a resolution of about 2 seconds. The game will run into precision trouble way earlier, though. It should work fine for about 36 hours; at that point the smallest step is about 8 ms, soon growing to 16 ms.
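If you want to check those step sizes yourself, here is a small sketch (the type and method names are invented for this example) that computes the distance from a positive float to the next representable one, i.e. the smallest amount Time.time could change by at that point:

```csharp
using System;
using UnityEngine;

// Sketch: measure the gap between a positive float and the next larger representable float.
public static class FloatStep
{
    public static float NextAfter(float t)
    {
        // Reinterpret the float's bits as an int, add one, convert back.
        // For positive, finite floats this yields the next larger float.
        int bits = BitConverter.ToInt32(BitConverter.GetBytes(t), 0);
        return BitConverter.ToSingle(BitConverter.GetBytes(bits + 1), 0);
    }

    public static void Print(float t)
    {
        Debug.Log("t = " + t + ", smallest step = " + (NextAfter(t) - t));
    }
}
```

For example, FloatStep.Print(31536000f) (one year) reports a step of 2, and FloatStep.Print(129600f) (36 hours) reports roughly 0.0078, matching the table.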

You shouldn’t worry about the maximum value, as it’s way beyond the age of the universe. So it’s unlikely that a time value (no matter when it starts) will ever “saturate to infinity”. However, as the precision decreases, the smallest possible step gets larger and larger. Once the step is “too large”, adding an amount smaller than it simply doesn’t change the value anymore.
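As a concrete illustration of that last point (the class name is just for the example):

```csharp
using UnityEngine;

// Once the float's step size exceeds the increment, adding the increment changes nothing.
public class RoundingDemo : MonoBehaviour
{
    void Start()
    {
        float big = 33554432f;       // 2^25 seconds; the float step size here is 4 s
        float next = big + 0.0166f;  // a typical 60 FPS deltaTime
        Debug.Log(next == big);      // logs "True": the addition is rounded away
    }
}
```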

However, we simply don’t know how Unity actually tracks time internally. They may well use an integer / int64 value, maybe the elapsed ticks or something else. If they track the time in a more reliable representation and only convert it to a floating point number at the end, the number will still keep increasing. Keep in mind, though, that the floating point value would increase in larger and larger steps over time: if the precision is down to, say, 10 seconds, the value would only change about every 10 seconds.
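Purely as an illustration of that idea (this is not Unity’s actual implementation, just a guess at its general shape), a 64-bit tick counter such as System.Diagnostics.Stopwatch never loses precision, and a float derived from it keeps increasing, only in coarser and coarser steps:

```csharp
using System.Diagnostics;
using UnityEngine;

// Hypothetical tick-based clock: exact internally, coarse when exposed as a float.
public class TickClock : MonoBehaviour
{
    readonly Stopwatch watch = new Stopwatch();

    void Awake()
    {
        watch.Start();
    }

    // Exact elapsed time, backed by a 64-bit tick count.
    public double ElapsedSeconds
    {
        get { return watch.Elapsed.TotalSeconds; }
    }

    // The same value squeezed into a float, analogous to what Time.time exposes.
    public float ElapsedSecondsAsFloat
    {
        get { return (float)watch.Elapsed.TotalSeconds; }
    }
}
```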

If the deltaTime value keeps its precision, you can simply track your own time by using a double variable and adding Time.deltaTime to it every frame.
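A minimal sketch of that approach (the class and property names are made up):

```csharp
using UnityEngine;

// Accumulate elapsed time in a double instead of reading the 32-bit float Time.time.
public class DoubleTimeTracker : MonoBehaviour
{
    // A double keeps about 15-16 significant decimal digits, so its step size
    // stays far below a frame's length for any realistic uptime.
    public static double ElapsedTime { get; private set; }

    void Update()
    {
        // Time.deltaTime is small, so it is still precise as a float; summing it
        // into a double sidesteps Time.time's growing step size.
        ElapsedTime += Time.deltaTime;
    }
}
```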

Well, Time.time increases by Time.deltaTime every frame, and the accuracy of Time.time degrades as it grows. After some period, two different frames may end up with the same Time.time value, because Time.time will be very big. Say you run at 60 FPS; Time.deltaTime will be around 0.017. @Bunny83 posted a table which shows that we can only rely on roughly the first 128,000 seconds to track frame-to-frame differences. That is about 35 hours. By the way, some years ago I did an experiment with AnimationCurve. I created a curve and animated an object by calling Evaluate() on the curve, passing in Time.time, and left my PC running all night. When I left the office the animation was smooth. When I came back the next day, the animation had become ugly and torn because of the big values of Time.time.
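For reference, the setup was roughly like the sketch below (the names are invented; this is just a reconstruction). Feeding Time.time into Evaluate() looks fine at first, but once Time.time’s step size approaches the frame time, the sampled value moves in visible jumps:

```csharp
using UnityEngine;

// Rough reconstruction of the overnight experiment: drive a transform from an
// AnimationCurve sampled with Time.time.
public class CurveDriven : MonoBehaviour
{
    public AnimationCurve curve = AnimationCurve.EaseInOut(0f, 0f, 1f, 1f);

    void Update()
    {
        // Wrapping keeps the curve input in [0, 1), but the wrapped value still
        // inherits the coarse resolution of the large Time.time it came from.
        float t = Mathf.Repeat(Time.time, 1f);
        transform.localPosition = new Vector3(0f, curve.Evaluate(t), 0f);
    }
}
```

Tracking the time in a double and wrapping it before casting to float avoids the problem, since the wrapped value stays small.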

Accuracy is obviously lost. So any calculation you do based on time, assuming you’re interested in how many milliseconds have passed since the last calculation, will give you dubious values.

Yes. You can’t build a server-hosted game if you’re not sure it can run uninterrupted for at least a year without losing accuracy. It would be good to know how accurate Time.time is after a year of running.