Timer is incorrectly affected by Time.timeScale

Hello! I’m working on a timer for my game that’s affected by Time.timeScale:

float diff = Time.deltaTime * Time.timeScale;
if (countDirection == CountDirection.Down) currentTime -= diff;
else currentTime += diff;

Further, for testing, I have Input code for setting the timeScale to different values:

if (Input.GetKeyUp(KeyCode.O)) Time.timeScale = 2f;
else if (Input.GetKeyUp(KeyCode.P)) Time.timeScale = 0.5f;
else if (Input.GetKeyUp(KeyCode.I)) Time.timeScale = 1f;

However, when Time.timeScale is set to 2f or 0.5f, the timer is affected by an extra factor of 2 beyond what I expect.

e.g. at 2f it runs 4 times faster, and at 0.5f it runs 4 times slower.

I confirmed this with manual testing on my phone’s stopwatch.

I’m not quite sure why this is happening. Does anyone have any thoughts?

The thing is that you are multiplying by Time.timeScale for every addition to currentTime.

What should be multiplied by the scale is your currentTime, not every addition or subtraction to it.

Remove the Time.timeScale multiplication from

float diff = Time.deltaTime * Time.timeScale;

then add something like this and use the value where you need it.

float valueYouWantToUse = currentTime * Time.timeScale;

There must have been some other code obfuscating what was happening… subtracting Time.deltaTime is in fact already affected by the timeScale. I can’t pinpoint what was throwing the results off, but after cleaning up my code it turns out nothing extra was needed after all…
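
For anyone who lands here later, a minimal sketch of the update without the extra multiplication, assuming the same currentTime and countDirection fields from the original post. Time.deltaTime is already scaled by Time.timeScale, so multiplying by the scale again applies it twice (2 × 2 = 4, 0.5 × 0.5 = 0.25), which matches the factor-of-4 behaviour described above:

// Time.deltaTime already includes Time.timeScale, so no extra scaling is needed
// for a timer that should speed up and slow down with the game.
float diff = Time.deltaTime;
if (countDirection == CountDirection.Down) currentTime -= diff;
else currentTime += diff;

// If the timer should ignore timeScale entirely (e.g. keep running while paused),
// Time.unscaledDeltaTime can be used instead of Time.deltaTime.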