
# Compute Shader performance is dropping from a single boolean.

Hello, I am trying to make a custom 3D renderer with projection and rasterization, but I have a question about compute shaders. I have a compute shader that checks whether a pixel is inside a triangle and, if so, shades it. My compute shader code:

```hlsl
// Assumed resource declarations (not shown in the original post):
RWTexture2D<float4> Result;
StructuredBuffer<float3> vertices;
StructuredBuffer<int> triangles;
int numTriangles; // set to mesh.triangles.Length, i.e. the index count

[numthreads(256, 4, 1)]
void CSMain (int3 id : SV_DispatchThreadID)
{
    float2 coords = float2(id.x, id.y);
    float minValue = 1; // unused in this snippet
    bool shadeIt = false;
    // Walk the index buffer three indices (one triangle) at a time.
    for (int i = 0; i < numTriangles; i += 3)
    {
        float3 v0 = vertices[triangles[i + 0]];
        float3 v1 = vertices[triangles[i + 1]];
        float3 v2 = vertices[triangles[i + 2]];
        float pointInside = isInside(
            v0.x, v0.y,
            v1.x, v1.y,
            v2.x, v2.y,
            coords.x, coords.y
        );
        if (pointInside < 0.1f) {
            shadeIt = true;
        }
    }
    Result[id.xy] = (shadeIt ? float4(2, 1, 0, 1) : float4(0, 1, 2, 1));
}
```

My C# code that runs the shader:

```csharp
vertexBuffer = new ComputeBuffer(mesh.vertices.Length, sizeof(float) * 3);
triangleBuffer = new ComputeBuffer(mesh.triangles.Length, sizeof(int));
Vector3[] projectedPoints = new Vector3[mesh.vertices.Length];
int i = 0;
foreach (Vector3 vec in mesh.vertices)
{
    projectedPoints[i] = (ProjectPoint(vec) * height) + new Vector3(width / 2, height / 2, 0);
    projectedPoints[i].z = vec.z;
    i++;
}
vertexBuffer.SetData(projectedPoints);
triangleBuffer.SetData(mesh.triangles);
renderer.SetTexture(0, "Result", output);
renderer.SetBuffer(0, "vertices", vertexBuffer);
renderer.SetBuffer(0, "triangles", triangleBuffer);
renderer.SetInt("width", width);
renderer.SetInt("height", height);
renderer.SetInt("numVertices", mesh.vertices.Length);
renderer.SetInt("numTriangles", mesh.triangles.Length);
renderer.SetVector("cameraPosition", camera.position);
renderer.SetVector("cameraAngles", camera.angles * (3.141f / 180));
renderer.Dispatch(0, threadGroups.x, threadGroups.y, threadGroups.z);
vertexBuffer.Dispose();
triangleBuffer.Dispose();
```

And this works: we can plug in any mesh and it renders. But as you can see, it runs at 1 FPS. Now, if I take out this bit of code (which is important, lol): `shadeIt = true;`

Then, obviously, the mesh disappears, but the FPS goes up to 60.

Why is this happening and how can I fix it?


**Answer** by Tex5000 · Jan 20, 2022 at 10:58 PM

Hello,

You're not supposed to put `if` conditions in shader code. It runs on the GPU, which is there to crunch numbers (math). You are supposed to find a way to express this `if` as a mathematical operation.

I haven't written shader code in ages, but something like this should work:

```hlsl
// Truncation to int acts as the "if": colorA is 1 when pointInside
// is near zero (inside the triangle), 0 otherwise.
int colorA = (int)(1.1f - pointInside);
int colorB = 1 - colorA;
Result[id.xy] = colorA * float4(2, 1, 0, 1) + colorB * float4(0, 1, 2, 1);
```

Note that `?:` is also a conditional operator, so don't use it either.

Tried it; it did not work. Thanks for trying, though. It is a real problem and it is messing with me.

**Answer** by esgnn · Jan 21, 2022 at 06:59 AM

I think the second `if` in line 31 is redundant. Create a `float4` outside the loop:

```
float4 resultVar = float4(0, 1, 2, 1);
```

then,

```hlsl
if (pointInside < 0.1f) {
    resultVar = float4(2, 1, 0, 1);
}
```

then,

```
Result[id.xy] = resultVar;
```

Also, I'd definitely move the C# calculations in lines 5-11 into the compute shader as well. I am assuming you are converting points to screen space.

One other thing: check whether unrolling the for loop increases performance (I am not an expert on compute shaders). I'd also play with the `numthreads` numbers to see if there is a performance change.
