Hello all, I am using the following equation from Wikipedia (the "angle required to hit coordinate (x, y)" formula from the projectile-trajectory article):

θ = atan( (v² ± √(v⁴ − g(gx² + 2yv²))) / (gx) )

to calculate the angle my projectile needs in order to reach a point projected into 3D space from the cursor. I'm fairly sure this would work perfectly if Unity weren't seemingly messing up the math somewhere.
The following is an example script, the values it produces, and the values I get by hand (checked on a TI-83 calculator):
var x = 5;                  // x = 5: distance to target
var v = grenadeForce;       // v = 500: projectile speed
var g = Physics.gravity.y;  // g = -9.81: gravity
//Debug.Log(g);
var gx2 = g * (x*x);        // Unity and I both get -245.25
//Debug.Log(gx2);
var yv2 = 2 * 0 * (v*v);    // Unity and I both get 0 (target height y = 0)
//Debug.Log(yv2);
var inPar = gx2 + yv2;      // Unity and I both get -245.25
//Debug.Log(inPar);
var minus = g * inPar;      // Unity and I both get 2405.903
//Debug.Log(minus);
var sqrt = (v*v*v*v) - minus; // I get 6.249999759e10
Debug.Log(sqrt);              // Unity gets -1.924512e9
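For reference, here is the same arithmetic redone step by step in plain double-precision floats (a sketch in JavaScript rather than UnityScript, with grenadeForce assumed to be 500 as in the comments above). It reproduces the value I get by hand:

```javascript
// Re-deriving each step of the script above with plain floating-point math.
// Values are taken from the comments; grenadeForce is assumed to be 500.
const x = 5;      // distance to target
const v = 500;    // projectile speed (grenadeForce)
const g = -9.81;  // Physics.gravity.y
const y = 0;      // target height offset

const gx2 = g * (x * x);            // -245.25
const yv2 = 2 * y * (v * v);        // 0
const inPar = gx2 + yv2;            // -245.25
const minus = g * inPar;            // 2405.9025
const sqrt = (v * v * v * v) - minus;
console.log(sqrt);                  // ~6.2499998e10, matching my hand calculation
```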
Why do I get 6.24…e10 for the value under the square root, while Unity gets something completely different? In fact, it's so far off that the complete function can't even return a real angle to fire with.