
# UV mapping a hexagon, starting from a 3D space

I have an irregular hexagon in an arbitrary 3D space, and I want to UV map it. The hexagon is part of a mesh (an irregular hexagonal prism), but I'm only trying to UV map the top face of the prism.

Since I have the mesh, I know the vertices in the 3D space, but I can't figure out how to map this to a (0, 0) to (1, 1) grid for UV mapping. How can I manipulate these vertices to UV map them?

I have multiple of these irregular hexagonal prism meshes (procedurally generated at runtime), all different from each other, so I need this to be a general solution. Thanks for any help!

**Answer** by IgorAherne · Sep 19, 2017 at 05:45 AM

Your vertices are part of the mesh, and are specified relative to the mesh's pivot;

You need to project the relevant vertices onto a bounded rectangle; the resulting coordinates, expressed in units along the rectangle's horizontal and vertical axes, will be your u and v coordinates respectively. If the rectangle is 10x10 meters and your projected point ends up at, say, (2, 5), then its UV coordinates are [2/10, 5/10] == [0.2, 0.5].
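The normalize-by-rectangle-size step can be sketched in plain Python (a language-agnostic stand-in for the engine code; the function name is hypothetical, and the 10x10 rectangle is the example from the answer):

```python
def point_to_uv(px, py, rect_width, rect_height):
    """Convert a point, already expressed in the rectangle's own 2D
    coordinates, into [0, 1] UV space by dividing by the size."""
    return (px / rect_width, py / rect_height)

# The 10x10 m rectangle from the example: the projected point (2, 5)
# becomes UV (0.2, 0.5).
print(point_to_uv(2, 5, 10, 10))  # → (0.2, 0.5)
```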

To perform the projection you need some "warp mechanism" that takes the points and re-specifies them relative to the rectangle. For that we use matrices.

To create such a matrix you will need to know the rectangle's position relative to the world's zero coordinate, the rectangle's orientation and scale.

You first take your prism vertex out of local space by multiplying by the prism's `transform.localToWorldMatrix` - this specifies the prism's vertex relative to the world origin, no longer relative to the prism's pivot.

Then you need to take it from world space into the rectangle's space (the rectangle onto which you will project the point). To do this, use the `myRectangleGameObject.transform.worldToLocalMatrix` matrix.

Now the point is relative to the axes of the rectangle. For simplicity, you can imagine the rectangle as flat ground, with its three axes sticking out of it. Its y axis sticks straight out of the rectangle and can act as the rectangle's normal.

Now you have a rectangle that you can legitimately think of as flat ground, and a vertex specified relative to its three axes. The vertex is like a star in the sky.

Take that star and project it onto the ground using the plane equation: the normal of the rectangle and the distance from your vertex to the *current* origin. (After the above transformation, the origin is the rectangle's pivot - our vertex is now a vector from that pivot, no longer relative to the world origin or its mesh's pivot.)
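The star-onto-ground projection boils down to subtracting the vertex's signed distance along the plane normal. A minimal sketch in plain Python (function names hypothetical; the plane passes through the origin because, after the transforms above, the origin is the rectangle's pivot):

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def project_onto_plane(vertex, unit_normal):
    """Project a point onto the plane through the origin with the
    given unit normal: subtract the signed distance along the normal."""
    d = dot(vertex, unit_normal)  # signed distance from the plane
    return tuple(v - d * n for v, n in zip(vertex, unit_normal))

# A "star" 3 units above the ground plane (normal = +y) lands at the
# same x/z, with the height zeroed out.
print(project_onto_plane((2.0, 3.0, 5.0), (0.0, 1.0, 0.0)))  # → (2.0, 0.0, 5.0)
```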

Thanks for the response! That mostly makes sense to me, although I'm confused about re-specifying the points relative to the rectangle.

My mesh is part of a GameObject, and the GameObject is located at (0, 0, 0). The mesh vertices are at some arbitrary (x, y, z) location. If I use `gameObject.transform.TransformPoint(meshVertex)` to get the world position, then the meshVertex doesn't change location.

I then created a plane primitive with `GameObject.CreatePrimitive(PrimitiveType.Plane)`, which is located at (0, 0, 0). Using `plane.transform.InverseTransformPoint(meshVertex)` doesn't change the location of the meshVertex either.

Am I doing the transformations incorrectly? I don't think I can use the matrices (worldToLocal, localToWorld) in this case, since the mesh vertices are separate from the gameObject.

> My mesh is part of a GameObject, and the GameObject is located at (0, 0, 0). The mesh vertices are at some arbitrary (x, y, z) location. If I use `gameObject.transform.TransformPoint(meshVertex)` to get the world position, then the meshVertex doesn't change location.
>
> I then created a plane primitive with `GameObject.CreatePrimitive(PrimitiveType.Plane)`, which is located at (0, 0, 0). Using `plane.transform.InverseTransformPoint(meshVertex)` doesn't change the location of the meshVertex either.

Correct, they don't change location since, as you've said, your GameObjects' pivots are placed at world zero. However, these matrices become useful as soon as your objects are not aligned to the world's zero coordinate, or are not sitting on it.

`worldToLocalMatrix` and `localToWorldMatrix` are exactly what the mesh uses, so you should use them - in fact, `TransformPoint` uses `localToWorldMatrix` under the hood, and `InverseTransformPoint` uses `worldToLocalMatrix` anyway.
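To see why a pivot at world zero leaves points unchanged, here is a sketch of what a `TransformPoint`-style call does under the hood: multiply the point (as a homogeneous 4-vector) by the 4x4 local-to-world matrix. Plain Python, names hypothetical; with an identity transform the point stays put, with a translated pivot it moves:

```python
def transform_point(matrix, p):
    """Apply a 4x4 row-major transform to a 3D point, the way a
    local-to-world matrix multiplication works."""
    x, y, z = p
    v = (x, y, z, 1.0)  # homogeneous coordinate so translation applies
    return tuple(sum(matrix[r][c] * v[c] for c in range(4)) for r in range(3))

def translation(tx, ty, tz):
    """Build a 4x4 matrix that only translates (no rotation/scale)."""
    return [[1, 0, 0, tx],
            [0, 1, 0, ty],
            [0, 0, 1, tz],
            [0, 0, 0, 1]]

# Pivot at world zero, no rotation/scale: the point does not move.
print(transform_point(translation(0, 0, 0), (3, 1, 2)))  # → (3.0, 1.0, 2.0)
# Pivot translated to (5, 0, 0): the same point now does move.
print(transform_point(translation(5, 0, 0), (3, 1, 2)))  # → (8.0, 1.0, 2.0)
```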

What I meant is that you can define a plane in code, not as a Unity primitive. A plane is a geometrical concept that can be expressed as a point in space plus a normal perpendicular to the plane.

You could then do a couple of mathematical operations to compute the projected point's location on that "virtual plane", and use OnDrawGizmos to visualize its position.
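A "virtual plane" defined purely in code is just a (point, normal) pair, and projecting onto it takes only a few lines. A sketch in plain Python under that assumption (function names hypothetical):

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def project_onto_plane(point, plane_point, unit_normal):
    """Project a point onto the plane defined in code by a point on
    the plane and a unit normal -- no GameObject primitive needed."""
    offset = tuple(p - q for p, q in zip(point, plane_point))
    d = dot(offset, unit_normal)  # signed distance from the plane
    return tuple(p - d * n for p, n in zip(point, unit_normal))

# Plane through (0, 2, 0) facing up: a point at height 5 drops to height 2.
print(project_onto_plane((1.0, 5.0, 4.0), (0.0, 2.0, 0.0), (0.0, 1.0, 0.0)))
# → (1.0, 2.0, 4.0)
```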

Anyhow, I'd advise YouTube tutorials on matrices and plane projection; after that it will be clearer what I meant.
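Putting the whole thread together, here is a hedged end-to-end sketch for the original question: given the coplanar top-face vertices and the face normal, build two axes spanning the face's plane, project each vertex onto them, and rescale by the face's 2D bounding box to land in the [0, 1] UV square. Plain Python, math only, all names hypothetical; in Unity the same steps would use Vector3 and the transform matrices discussed above:

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def normalize(v):
    length = math.sqrt(dot(v, v))
    return tuple(x / length for x in v)

def planar_uvs(vertices, normal):
    """Map coplanar 3D vertices (e.g. a hexagonal top face) to [0,1]^2.
    Builds a u/v axis pair perpendicular to the face normal, projects
    every vertex onto those axes, then normalizes by the bounding box
    so any irregular face fills the UV square."""
    n = normalize(normal)
    # Pick any helper axis that is not parallel to the normal.
    helper = (0.0, 1.0, 0.0) if abs(n[1]) < 0.9 else (1.0, 0.0, 0.0)
    u_axis = normalize(cross(helper, n))
    v_axis = cross(n, u_axis)
    flat = [(dot(p, u_axis), dot(p, v_axis)) for p in vertices]
    us = [u for u, _ in flat]
    vs = [v for _, v in flat]
    du = (max(us) - min(us)) or 1.0  # avoid divide-by-zero on degenerate faces
    dv = (max(vs) - min(vs)) or 1.0
    return [((u - min(us)) / du, (v - min(vs)) / dv) for u, v in flat]

# Sanity check: a unit square in the xz-plane (normal +y) maps to the
# four corners of the UV square.
print(planar_uvs([(0, 0, 0), (1, 0, 0), (1, 0, 1), (0, 0, 1)], (0, 1, 0)))
```

Because the axes come from the face's own normal, this works for every procedurally generated prism regardless of its orientation; the bounding-box normalization is what makes it a general solution for irregular faces.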

