I have a UI canvas containing a child object with a transparent crosshair sprite. Below that is a mask using a white circle texture shaped to match the crosshair sprite above it, and below the mask is a Raw Image that displays a render texture. That render texture is rendered to by a second camera in my scene, set to only view objects on the "minimap" layer. I have other GameObjects in the scene with round colored textures assigned to the "minimap" layer.
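For reference, here is a rough sketch of the setup described above. In my project this is all configured in the Inspector, so the component and field names here are illustrative only, not actual code from the scene:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Illustrative sketch of the minimap wiring described above.
// In the real scene these references are assigned in the Inspector.
public class MinimapSetup : MonoBehaviour
{
    public Camera minimapCamera;   // the second camera
    public RawImage minimapImage;  // the Raw Image beneath the mask

    void Start()
    {
        // Render texture the second camera draws into. ARGB32 keeps an
        // alpha channel, so transparent areas of the minimap stay transparent.
        var rt = new RenderTexture(256, 256, 16, RenderTextureFormat.ARGB32);

        minimapCamera.targetTexture = rt;
        minimapCamera.cullingMask = LayerMask.GetMask("minimap");
        minimapCamera.clearFlags = CameraClearFlags.SolidColor;
        minimapCamera.backgroundColor = new Color(0f, 0f, 0f, 0f); // clear to transparent

        minimapImage.texture = rt;
    }
}
```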
In the editor, it works exactly as it should when I press Play: the red icons for the enemies in the scene show up inside the crosshair in the corner, and you can see through the transparent areas to the background behind it. When I build and run, however, everything that was previously transparent renders as solid black instead.
I've tried fiddling with a few different settings, but I can't figure out what's causing it to behave differently in the build than in the editor. Any ideas?