Unity's default shaders' blending is messed up?


When you have to deal with rendering transparency correctly, and especially when using a RenderTexture, it is crucial to perform correct blending based on premultiplying the RGB with the alpha channel. More on that here.

As a general rule, the blend mode for the alpha channel should ALWAYS be One OneMinusSrcAlpha. The blending used for RGB should be either One OneMinusSrcAlpha or SrcAlpha OneMinusSrcAlpha, depending on whether the source RGB is alpha-premultiplied or not.
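For reference, here is what those two setups look like in ShaderLab's Blend syntax, using the separate-alpha-factor form (a minimal sketch, not taken from any particular built-in shader):

```
// Source RGB is NOT premultiplied: scale RGB by alpha at blend time,
// but still accumulate alpha as One OneMinusSrcAlpha.
Blend SrcAlpha OneMinusSrcAlpha, One OneMinusSrcAlpha

// Source RGB IS premultiplied: RGB already carries the alpha factor,
// so one blend statement covers both RGB and alpha.
Blend One OneMinusSrcAlpha
```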

Let’s see what Unity does. The standard sprite shader uses One OneMinusSrcAlpha for both RGB and alpha, and premultiplies the RGB with the alpha in the fragment shader. This produces correct RGB and alpha values.
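The relevant part of that shader's fragment function looks roughly like this (paraphrased, not the exact built-in source):

```
fixed4 frag(v2f IN) : SV_Target
{
    fixed4 c = tex2D(_MainTex, IN.texcoord) * IN.color;
    c.rgb *= c.a;   // premultiply, so Blend One OneMinusSrcAlpha is correct
    return c;
}
```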

However, the Unlit-Transparent shader uses SrcAlpha OneMinusSrcAlpha for both, which results in wrong alpha values. If you render onto a solid surface and simply don’t care about the resulting alpha, you won’t notice the problem. If you render into a RenderTexture and then render that texture on screen, you will notice the problem.
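To make the error concrete, consider rendering a 50%-opaque pixel onto a render texture cleared to alpha = 0:

```
// destination starts at dst.a = 0, source has src.a = 0.5
// SrcAlpha OneMinusSrcAlpha:  out.a = 0.5 * 0.5 + 0 * 0.5 = 0.25  (wrong)
// One OneMinusSrcAlpha:       out.a = 0.5 * 1.0 + 0 * 0.5 = 0.50  (correct)
```

The 0.25 then makes the render texture composite as if it were more transparent than it actually is.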

I have created a simple Unity project that demonstrates the issue by rendering two semi-transparent grey sprites onto a render texture. You can find it here; go through the readme file…

Also, when you render the RenderTexture itself, its RGB values are already premultiplied, so you want One OneMinusSrcAlpha for RGB there as well. I had to create a “premultiplied” variant of the standard sprite shader to make this work (see the sketch below).
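A minimal sketch of what such a variant could look like; the shader name and comments are my own, and the only functional difference from a plain unlit transparent shader is the blend factor and the absence of any premultiply in the fragment function:

```
Shader "Custom/SpritePremultiplied"
{
    Properties
    {
        _MainTex ("Texture (premultiplied RGB)", 2D) = "white" {}
    }
    SubShader
    {
        Tags { "Queue"="Transparent" "RenderType"="Transparent" }
        Cull Off ZWrite Off
        Blend One OneMinusSrcAlpha   // RGB is already premultiplied

        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _MainTex;
            float4 _MainTex_ST;

            struct v2f
            {
                float4 pos : SV_POSITION;
                float2 uv  : TEXCOORD0;
            };

            v2f vert(appdata_img v)
            {
                v2f o;
                o.pos = UnityObjectToClipPos(v.vertex);
                o.uv  = TRANSFORM_TEX(v.texcoord, _MainTex);
                return o;
            }

            fixed4 frag(v2f i) : SV_Target
            {
                // No premultiply here: the render texture already
                // contains alpha-premultiplied RGB.
                return tex2D(_MainTex, i.uv);
            }
            ENDCG
        }
    }
}
```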

So, my question is: am I missing something and Unity provides a proper way to deal with this, or do you have to be really careful when dealing with transparency and adjust the shaders yourself?

I’m not sure what the problem is. The built-in shaders provide a range of examples… if they don’t behave the way you want (and it sounds like you have already identified the behavior you prefer), then you just make your own shader.