I’ve been exploring the possibility of using Unity to build a 2D game. I’m looking at a number of 2D helper libraries, including ex2D and Orthello. The problem is: When I run the example projects that come with these libraries, the motion of the sprites is jerky. (The problem exists with both of the libraries, and actually just with Unity in general - more on that below.)
Let me explain what I mean: Every 1-2 seconds, there’s a very subtle but noticeable “blip” in the motion of the graphics. The hiccup is like a metronome: the motion is perfectly smooth for about a second, and then for a single instant (probably one frame), the sprite visibly jerks. Then it’s smooth again for a second, then another blip, and so on, ad infinitum. The jerk is probably only a matter of a pixel or two; you have to watch carefully to see it.
I dug a little deeper and created a simple Unity game from scratch with an orthographic camera and a cube (and nothing else – no 2D libraries, no physics, nothing), and sure enough, the motion is still jerky. Every 1-2 seconds, like clockwork, a hiccup occurs in the motion.
I come from a Microsoft XNA background, and I experienced a problem just like this with XNA once upon a time. The solution was to disable “FixedTimeStep” for the game, and then the game ran perfectly smoothly. Is there a similar setting in Unity? I found the Time settings, but there’s seemingly no way to disable fixed timestep.
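For what it’s worth, here’s a sketch of the Time-related values I was able to poke at from script (these are Unity’s documented Time properties, mirroring Edit > Project Settings > Time; as far as I can tell, none of them actually turns the fixed timestep off):

```csharp
using UnityEngine;

// Sketch only: logging the timestep-related settings I could find.
// None of these appears to disable the fixed timestep outright.
public class TimeSettingsProbe : MonoBehaviour
{
    void Start()
    {
        Debug.Log("fixedDeltaTime: " + Time.fixedDeltaTime);     // physics step, default 0.02s
        Debug.Log("maximumDeltaTime: " + Time.maximumDeltaTime); // cap on a single frame's step
        Debug.Log("timeScale: " + Time.timeScale);               // global time multiplier
    }
}
```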
For those curious, here are the objects/code for my simple project that exhibits the jerky animation behavior:
- Main Camera: Orthographic, size 2, position(0,0,-1)
- Cube: position(0,0,0)
And then this C# script on the Cube:
using UnityEngine;
using System.Collections;

public class CubeScript : MonoBehaviour
{
    protected float min;
    protected float max;
    protected float speed;

    void Start()
    {
        speed = -0.01f;                   // units per second
        min = transform.position.x - 2f;  // left bound
        max = transform.position.x + 2f;  // right bound
    }

    void Update()
    {
        // Move horizontally, scaled by frame time.
        transform.position = new Vector3(
            transform.position.x + (Time.deltaTime * speed),
            transform.position.y,
            transform.position.z);

        // Reverse direction at the bounds.
        if (transform.position.x < min || transform.position.x > max)
            speed *= -1;
    }
}
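To check whether the visual blip actually coincides with a long frame, I also tried attaching a small diagnostic script (my own addition, not part of the repro) that logs any frame whose deltaTime is well above the recent average:

```csharp
using UnityEngine;

// Diagnostic sketch: log frames whose deltaTime spikes above a
// running average, to see if spikes line up with the visual blips.
public class FrameSpikeLogger : MonoBehaviour
{
    float avg = 1f / 60f;  // seed with a nominal 60 fps frame time

    void Update()
    {
        float dt = Time.deltaTime;
        if (dt > avg * 1.5f)
            Debug.Log("Frame " + Time.frameCount + " spike: " + (dt * 1000f) + " ms");
        avg = Mathf.Lerp(avg, dt, 0.05f);  // exponential moving average
    }
}
```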
EDIT:
As the initial responses correctly pointed out, in my original example code I completely forgot to multiply by Time.deltaTime. But here’s the weird thing: multiplying by Time.deltaTime actually makes the problem more pronounced! I’ve updated the code above to include Time.deltaTime, since its omission doesn’t seem to be what’s causing my specific problem. (Regardless, thanks for your responses Eric5h5 and malraux.)
More information: I just discovered that building the game (instead of playing it in the Unity editor) and running it at “Graphics quality: Fastest” seems to eliminate the motion jitter. The problem still exists if I run it at “Good” quality or higher. My video card is an Nvidia 8800GT with up-to-date drivers. I know it’s not the greatest card in the world, but it runs plenty of other 3D games at decent quality with no frame rate hiccups. And my example project is about as simple as you can get: a single untextured cube moving back and forth.
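Since “Fastest” quality is (I believe) the level that turns vsync off, my working hypothesis is that the jitter is vsync-related. A sketch of how I could test that without changing quality levels, using Unity’s documented QualitySettings.vSyncCount and Application.targetFrameRate:

```csharp
using UnityEngine;

// Hypothesis test (sketch): if forcing vsync off at "Good" quality
// also removes the jitter, the problem is vsync-related rather than
// a quality-level rendering feature.
public class VSyncToggle : MonoBehaviour
{
    void Start()
    {
        QualitySettings.vSyncCount = 0;    // 0 = vsync off, 1 = sync every vblank
        Application.targetFrameRate = -1;  // leave the frame rate uncapped
    }
}
```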
Any other thoughts?