My HUD UI events are being intercepted

I am building a 2D game with a HUD. On the HUD I have a button (for pause). In the game, I detect taps on the screen to fire a gun, using a GameObject that spans the screen. I reviewed EventSystems for UI events from here: Unity UI - Blocking clicks - YouTube

Screen taps are caught by implementing OnPointerClick on a GameObject that spans the screen. I am successfully generating click events when the screen is touched. However, this object is somehow blocking the UI of my HUD. My HUD is on a canvas with its render mode set to Screen Space - Overlay. It is my understanding that with Overlay the canvas is the top-most object, so clicking the pause button should not be intercepted.
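For reference, a minimal sketch of what the screen-spanning tap catcher looks like (class and log message are illustrative; the real project's code may differ). It assumes the object has a Collider and the camera has a Physics Raycaster, which is what lets the EventSystem deliver pointer events to a non-UI object:

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Screen-spanning object that receives pointer clicks via the EventSystem.
// Requires: a Collider on this object and a PhysicsRaycaster on the camera.
public class ScreenTouchManager : MonoBehaviour, IPointerClickHandler
{
    public void OnPointerClick(PointerEventData eventData)
    {
        // Fire the gun toward the tapped screen position.
        Debug.Log("Tap at screen position " + eventData.position);
    }
}
```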

My object hierarchy is:

Parent

->ScreenTouchManager

->HUD Canvas (Screen Space - Overlay)

with all elements at Z = 0. So I'm guessing this is not a z-order issue but an order-of-priority issue in the event handling? Somehow the Physics Raycaster on my camera is beating out the Graphic Raycaster on the canvas. I'm guessing there is a small feature or gotcha here that is easy to solve, possibly to do with layers and blocking? I'm too new to Unity to figure this out.

EDIT:
Just an extra note: I was able to solve this by moving my ScreenTouchManager onto the canvas. But I still need an answer as to why I can't get it to work the other way. I feel it is a Unity thing that I should understand.

#newbie response

When I do raycasting and click-sorting across different layers, like between the HUD and the gameplay, I literally assign the objects to separate layers.

For example, in my games I will use a layer called HudLayer, and depending on the game I may simply add a MainGameLayer (like in your example).

So when you're doing the Physics.Raycast:

CODE HERE:

Physics.Raycast(originVector, directionVector, floatDistance, layerMask);

That int layerMask parameter is how the raycast filters which layers it can hit. It is NOT a simple integer layer number: it is a bitmask, where bit n corresponds to layer n, so you cannot just put in the layer # and hope it will work. The easiest approach is to expose the mask as a field and set it in the Editor. In your particular example, I would set the raycast mask to only MainGameLayer, leaving HudLayer unchecked. HERE is the code to make it happen:
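To make the "not a simple integer" point concrete, here is a hedged sketch of building the mask in code instead of the Editor. The layer names HudLayer and MainGameLayer are the examples from this thread, not Unity defaults; they must exist in the project's Tags & Layers settings:

```csharp
using UnityEngine;

// Sketch: constructing a layer mask by hand rather than via the Inspector.
public class RaycastMaskExample : MonoBehaviour
{
    void Update()
    {
        // A layer mask is a bitmask: bit n corresponds to layer n.
        // If MainGameLayer is layer 8, the mask is 1 << 8 (256), not 8.
        int mainGameMask = 1 << LayerMask.NameToLayer("MainGameLayer");

        // This raycast ignores everything on HudLayer (and every other
        // layer whose bit is not set in mainGameMask).
        if (Physics.Raycast(transform.position, transform.forward, 100f, mainGameMask))
        {
            Debug.Log("Hit something on MainGameLayer");
        }
    }
}
```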

//At the top of your script, when initializing:
public LayerMask maskToUse; // set this in the editor; it is a bitmask, not an array

And then back in your Update function, or wherever your raycast call is:

//and then just pass the mask you set in the editor to the raycast

    Physics.Raycast(originVector, directionVector, floatDistance, maskToUse);
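Putting the two snippets together, a possible (illustrative, not from the thread) version of the gun script using the Inspector-assigned mask could look like this:

```csharp
using UnityEngine;

// Sketch: firing a raycast from a screen tap, filtered by an Inspector mask.
public class GunRaycaster : MonoBehaviour
{
    public LayerMask maskToUse; // tick only MainGameLayer in the Inspector

    void Update()
    {
        if (Input.GetMouseButtonDown(0))
        {
            Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);

            // Only hits colliders on layers ticked in maskToUse,
            // so HUD objects on an unticked layer are ignored.
            if (Physics.Raycast(ray, out RaycastHit hit, 100f, maskToUse))
            {
                Debug.Log("Hit " + hit.collider.name);
            }
        }
    }
}
```

As a side note, `~maskToUse` inverts the bitmask, which is a common way to hit everything *except* the ticked layers.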

EDIT: Unity’s API Definitions are your best friend =)

Additionally, if you are using multiple cameras, be sure you are using each one's DEPTH value appropriately. Negative-depth cameras are drawn before positive-depth cameras.
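A small sketch of that depth rule, assuming a hypothetical two-camera setup (the field names are placeholders, assigned in the Inspector):

```csharp
using UnityEngine;

// Sketch: ordering two cameras by depth. Lower depth renders first,
// so the higher-depth camera draws on top of the other one.
public class CameraDepthSetup : MonoBehaviour
{
    public Camera gameCamera; // hypothetical references, set in the Inspector
    public Camera hudCamera;

    void Start()
    {
        gameCamera.depth = -1f; // drawn first (background)
        hudCamera.depth = 1f;   // drawn last (on top of the game view)
    }
}
```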