
# How to convert coordinates in Unity?

I have a JSON file with coordinates where (0,0) is the top-left corner (first picture). In Unity, (0,0) is at the center of the screen (second picture). I am looking for a way to convert my file's coordinates to Unity coordinates.

So my question is: how can I convert the coordinates from the first picture to Unity coordinates?

For example, at position (2,3) I have the letter 'A'. How do I convert that to match Unity?

I have already tried Camera.main.ScreenToWorldPoint, but I always get the same result for all inputs.

EDIT: Screenshot of what I get so far.

**Answer** by Bunny83
·
Aug 02, 2019 at 10:25 AM

Unity does not have one set of coordinates but many. The question is which coordinates you actually want.

Every Transform component (and therefore every gameobject) has its own local coordinate space. Nested Transform components form a coordinate hierarchy. The Transform component of each object is responsible for translating local space coordinates into worldspace coordinates.

Next is the conversion from worldspace into camera space or view space. That's just the inverse transformation from worldspace into the localspace of the camera object. This is also done by the Transform component of the camera. So in the end (inside the shader) every object will be relative to the camera.

The camera space coordinates are then projected into clipspace, which goes from -1 to +1 on each axis. Everything outside this range will be "clipped" since it's essentially outside the camera view. There are two fundamentally different projection methods: perspective projection (usually the default) and orthographic projection.

After clip space there are a few other coordinate spaces which are usually less important. In the end, however, we have viewport coordinates: normalized coordinates between 0 and 1 which go from the bottom-left to the top-right corner. Finally those coordinates are converted into screen space, which is the same as viewport space but scaled by the actual pixel width / height of the screen.

Camera.ScreenToWorldPoint will convert a point in screen space into worldspace (as the name suggests). However, many people forget that there's a projection going on in between and that a screenspace point also has three components, not just two. If you pass a point to ScreenToWorldPoint that has no z coordinate (so it's 0), you will actually get the point at distance 0 from the camera. For a perspective camera that means you always get the camera position, since that's the point where all perspective lines meet. You need to pass the wanted distance from the camera in the z coordinate to get a meaningful position for a perspective camera. For orthographic cameras the z coordinate does not influence the x and y coordinates, though it will still determine the z coordinate of the converted position relative to the camera.
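A minimal sketch of passing the distance in the z component (the 10f is just an example distance, not a required value):

```
using UnityEngine;

public class ScreenPointExample : MonoBehaviour
{
    void Update()
    {
        // The z component is the distance from the camera, not part of
        // the 2D screen position. With z = 0 a perspective camera just
        // returns its own position, so pass the distance you want.
        Vector3 screenPos = new Vector3(Input.mousePosition.x,
                                        Input.mousePosition.y,
                                        10f); // 10 units in front of the camera
        Vector3 worldPos = Camera.main.ScreenToWorldPoint(screenPos);
    }
}
```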

So it's not entirely clear what coordinates you have and into which space you want them converted. If the coordinates from your file do not belong to any space in Unity, you have to specify the conversion yourself. If the two grids you have shown represent the same area / volume in space, all you have to do in this particular case is subtract (5, 5) from your incoming coordinates (or, more generally, half the screen / area size). In your case there doesn't seem to be any stretching or rotation going on.
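A minimal sketch of that offset, assuming the grid size is known (the class and method names are made up for illustration):

```
using UnityEngine;

// Shift incoming file coordinates by half the grid size so the
// origin moves from the corner to the center.
public static class GridOffset
{
    public static Vector2 ToCentered(Vector2 fileCoord, Vector2 gridSize)
    {
        // For a 10x10 grid this subtracts (5, 5), as described above.
        // If your file's y axis grows downward you would additionally
        // mirror it (gridSize.y - fileCoord.y) before the shift.
        return fileCoord - gridSize * 0.5f;
    }
}
```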

In general, when you want to convert coordinates, the easiest way is usually to set up an empty gameobject and let its local space represent your input coordinate space. You can then move / rotate / scale the Transform the way you need to map the local space into your wanted worldspace. So in your case, when looking at the right grid as Unity's worldspace, you could place your gameobject at (-5, -5). A coordinate of (0, 0) inside the local coordinate space of your gameobject would then end up at (-5, -5).

To use a Transform to convert a certain point from local space to worldspace, you would simply use:

```
Vector3 p = coordTransform.TransformPoint(yourCoordinate);
```

If you're struggling with linear algebra, I recommend the 3Blue1Brown video series "Essence of Linear Algebra".

@Bunny83 Thank you for such a detailed answer.

From the JSON file I am getting coordinates like the left grid image I attached (A at (2,3), B at (3,3), C at (3,4), etc.). The board size from the JSON file can vary, e.g. 10x10 or 8x12.

Now in Unity, from what I can tell, the parent GameObject under which I am instantiating those GameObjects (the letters) uses world coordinates, because when I drop a GameObject into it at (0,0) it is placed in the center of the screen.

My camera has the default values (Orthographic).

Also, the board size I get from the JSON might be 10x10 like the left grid in the picture, but in Unity this has to be scaled to the size of a mobile screen (4.65x10). I got those values using the following code:

```
// World-space size of the area visible to an orthographic camera
cameraHeight = Camera.main.orthographicSize * 2.0f;
cameraWidth = cameraHeight * Screen.width / Screen.height;
```
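Combining your offset idea with those camera values, this is roughly what I think the conversion should look like (BoardToWorld is just a name I made up, and I'm not sure about the y flip):

```
using UnityEngine;

public class BoardMapper : MonoBehaviour
{
    public Vector2 boardSize = new Vector2(10f, 10f); // read from the JSON

    // Maps a board cell like (2, 3) into worldspace so the whole
    // board fills the visible orthographic camera area.
    public Vector3 BoardToWorld(Vector2 cell)
    {
        float cameraHeight = Camera.main.orthographicSize * 2.0f;
        float cameraWidth = cameraHeight * Screen.width / Screen.height;

        // Normalize to 0..1, flip y because the file's origin is the
        // top-left corner, then scale to camera units and re-center.
        float nx = cell.x / boardSize.x;
        float ny = 1f - cell.y / boardSize.y;
        return new Vector3(nx * cameraWidth - cameraWidth * 0.5f,
                           ny * cameraHeight - cameraHeight * 0.5f,
                           0f);
    }
}
```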

Can you help me now that I provided more details?

I attached a screenshot of how it looks so far.
