Is there a way to detect a retina display?

There are circumstances where it's nice to know if you are running on a retina display or not.

For example, GUI code that distinguishes between taps and drags. There is typically a threshold, and if the touch moves beyond that threshold, the touch turns into a drag. If the threshold is too small, it becomes difficult to tap the object: the slightest movement turns into a drag. If it is too large, the start of the drag is delayed and feels abrupt.

Is there a clean way to determine this in Unity?

Since Unity 3.5, Screen has a property dpi that you can use for this purpose:

Screen.dpi

This works on Android and iOS. iPhone Retina displays have about 326 ppi, and the iPad 3 has 264 ppi. dpi and ppi mean the same thing here: ppi (pixels per inch) is Apple-speak, dpi (dots per inch) is Android-speak. IMHO, ppi makes a little more sense because on screens we have pixels, not dots as in print :wink:
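
Since the original question is about tap-versus-drag thresholds, here is a minimal sketch of how Screen.dpi could feed into one. The 0.2-inch threshold and the 160 dpi fallback are assumptions of mine, not values from the docs; note that Screen.dpi can report 0 when the density cannot be determined.

using UnityEngine;

public class DragThreshold : MonoBehaviour {
	// Hypothetical threshold: a touch that moves more than ~0.2 inches
	// (an assumed value) is treated as a drag instead of a tap.
	const float thresholdInches = 0.2f;

	public static float ThresholdInPixels() {
		float dpi = Screen.dpi;
		if (dpi <= 0f)
			dpi = 160f;   // assumed fallback when the density is unknown
		return thresholdInches * dpi;
	}
}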

Here is a little plugin I wrote for iOS screen scale detection.

DeviceDisplay.scaleFactor returns an int value from 1 to 3, according to the device's screen scale factor.

using UnityEngine;

public class UIScale : MonoBehaviour {

	void Start () {
		// Logs 1, 2, or 3 depending on the device's screen scale factor
		Debug.Log(DeviceDisplay.scaleFactor);
	}
}

You could use:

if (iPhoneSettings.generation == iPhoneGeneration.iPhone4) {
	// then the device uses a retina display
}

I assume this class will get updated as new devices get released. If, for any reason, a future device had multiple resolutions, you could use this with Screen.currentResolution, iPhoneSettings.model, and/or the hardware specs to try to pin down the pixel density.

http://unity3d.com/support/documentation/ScriptReference/iPhoneSettings.html
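
As a rough illustration of combining the generation check with Screen.currentResolution, here is a minimal sketch. The helper name IsRetina, the 960-pixel cutoff (the iPhone 4's 960x640 long side), and the fallback logic are my assumptions, not part of the Unity API.

using UnityEngine;

public class RetinaDetect : MonoBehaviour {
	public static bool IsRetina() {
		// Known retina generation first
		if (iPhoneSettings.generation == iPhoneGeneration.iPhone4)
			return true;

		// Fall back to the reported resolution; 960 is the iPhone 4's
		// long side (assumed cutoff). Note that an iPad 1/2 at 1024x768
		// would also pass this, so iPads need their own check.
		Resolution res = Screen.currentResolution;
		int longSide = Mathf.Max(res.width, res.height);
		return longSide >= 960;
	}
}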

Try Screen.currentResolution. The iPhone 4 will have a much higher one than the normal iPhone.

It’s a hack, but I just get the world-to-screen coordinates of the bounds of a known box collider and use the values (which will vary based on pixel density) as a variable.
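
Here is a minimal sketch of that hack; the field referenceBox and the method MeasurePixelWidth are hypothetical names, and it assumes the collider is rendered by Camera.main.

using UnityEngine;

public class PixelSizeProbe : MonoBehaviour {
	public BoxCollider referenceBox; // a collider of known world size

	// Projects the collider's bounds to screen space and measures how
	// many pixels it spans; a denser display yields a larger value for
	// the same world-space size and camera setup.
	float MeasurePixelWidth() {
		Bounds b = referenceBox.bounds;
		Vector3 min = Camera.main.WorldToScreenPoint(b.min);
		Vector3 max = Camera.main.WorldToScreenPoint(b.max);
		return Mathf.Abs(max.x - min.x);
	}
}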

Check the resolution using Screen.currentResolution; with that value you can detect whether it is a normal or retina display.