Built-in ShaderLab "_Time.y" and "Time.time" are not equal?

Hi!

I am trying to make an animation in a shader using the built-in "_Time" property. Let's say the animation is a fade-out of the output color (white to black) based on a start time T0 = Time.time and an end time T1 = Time.time + 10. In the shader I do a simple linear interpolation:

fixed f = saturate( (_Time.y - T0) / (T1 - T0) );  // HLSL clamp() needs min/max args; saturate() clamps to [0,1]
fixed4 color = fixed4(1,1,1,1) * (1-f) + fixed4(0,0,0,0) * f;  // equivalent to lerp(white, black, f)

At first it all seemed to work, but some time ago I noticed that _Time.y is not equal to Time.time. The animation always plays with a delay, and the longer the scene runs, the longer the delay seems to get.

So, the question is: how does the "_Time" value correspond to Time.time?

I would not mix Time.time and _Time.y: the docs don't say they are the same, so this feels unreliable to me - they may be independent variables and accumulate some difference over time. I would pass Time.time to a shader variable in Update() with material.SetFloat(), as you are probably already doing with the variables T0 and T1, or use _Time.y to set T0 and T1 (not always possible, depending on your logic).

_Time.y = Time.timeSinceLevelLoad

Solution:

material.SetFloat( "_Time0", Time.timeSinceLevelLoad );
material.SetFloat( "_Time1", Time.timeSinceLevelLoad + FadeTime );
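On the shader side, the matching half of this solution might look like the following sketch. The fragment function and the v2f struct are assumptions for illustration; only the _Time0/_Time1 names come from the SetFloat calls above.

```
// Declared in the CGPROGRAM section, matching the SetFloat calls above
float _Time0;
float _Time1;

fixed4 frag (v2f i) : SV_Target
{
    // saturate() clamps to [0,1]: 0 before _Time0, 1 after _Time1
    fixed f = saturate((_Time.y - _Time0) / (_Time1 - _Time0));
    // fade from white to black as f goes 0 -> 1
    return lerp(fixed4(1,1,1,1), fixed4(0,0,0,0), f);
}
```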

A slightly stronger solution to this issue is to run one script that passes Time.time (or whichever time value you need) into a GLOBAL shader variable. This solution enables you to access game time from any shader or material without having to worry about individually setting time variables on any of those materials.


For example:

   Shader.SetGlobalFloat("_GameTime", Time.time);

Then, in any shader, you can declare and use

float _GameTime;

Just remember: global shader variables won't be updated properly if you also declare them in the "Properties { }" block of a shader (per-material property values override the global), so make sure you only put this declaration in the SubShader { } / CGPROGRAM section.
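Putting that together, a minimal driver script could look like this sketch. The class name "GlobalShaderTime" is just a placeholder; "_GameTime" matches the global discussed above.

```
using UnityEngine;

// Sketch: updates a global shader clock once per frame so any
// shader declaring "float _GameTime" can read it.
public class GlobalShaderTime : MonoBehaviour
{
    // Cache the property ID instead of passing the string every frame
    static readonly int GameTimeId = Shader.PropertyToID("_GameTime");

    void Update()
    {
        Shader.SetGlobalFloat(GameTimeId, Time.time);
    }
}
```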

You can also get the shader’s _Time value from a C# script by calling this:

    Shader.GetGlobalVector("_Time")

This way, you don’t have to modify your shaders to use a custom time variable. You can just use the shader time in your C# script.
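As a quick sketch, you can log the shader clock next to the script-side clocks to see the drift the question describes (run this from any script's Update):

```
// Compare the shader's clock against the script-side clocks.
Vector4 shaderTime = Shader.GetGlobalVector("_Time");
Debug.Log($"_Time.y = {shaderTime.y}, Time.time = {Time.time}, " +
          $"timeSinceLevelLoad = {Time.timeSinceLevelLoad}");
```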

Rather than using a time variable, I would put that logic into a coroutine which drives the animation property of your shader. I go into more detail in this answer, but I think the same logic should apply: http://answers.unity.com/answers/1801454/view.html