
# 3D Meshes: how to minimize error during reduction of order?

I have a set of vertices defined by the following code:

```
int xMax; // size of the mesh in the x dimension
int zMax; // size of the mesh in the z dimension
Vector3[] vertices = new Vector3[xMax * zMax];
for (int z = 0; z < zMax; z++) {
    for (int x = 0; x < xMax; x++) {
        int i = z + zMax * x; // index into the vertices array
        vertices[i] = new Vector3(x, f(x, z), z); // f(x, z) might return a value from a heightmap
    }
}
```

How do I find a subset of n of these vertices and then triangulate it in such a way that either the standard deviation of the triangle areas is minimal, or the three-dimensional correlation coefficient is as close to 1 as possible (preferably the latter)? EDIT: any approach that keeps as much detail as possible is fine.
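As a baseline before anything error-driven: since the vertices form a regular grid, the simplest reduction is to keep every k-th vertex in each direction and re-triangulate the coarser grid. This loses detail uniformly, but it gives identical triangle areas by construction, which trivially minimizes their standard deviation. A minimal sketch in Python (the function name and list-based height grid are my own, not from the question):

```python
def subsample_grid(heights, x_max, z_max, step):
    """Keep every `step`-th vertex of a regular height grid and
    re-triangulate the coarser grid (two triangles per cell).
    `heights[z][x]` plays the role of f(x, z) from the question.
    Note: the boundary row/column is dropped unless (x_max - 1)
    is a multiple of `step`."""
    xs = list(range(0, x_max, step))
    zs = list(range(0, z_max, step))
    # Vertices of the reduced mesh, row-major over (zs, xs),
    # stored as (x, height, z) tuples like Vector3(x, f(x, z), z).
    vertices = [(x, heights[z][x], z) for z in zs for x in xs]
    nx = len(xs)
    triangles = []
    for zi in range(len(zs) - 1):
        for xi in range(nx - 1):
            i = zi * nx + xi
            # Split each coarse grid cell into two triangles.
            triangles.append((i, i + nx, i + 1))
            triangles.append((i + 1, i + nx, i + nx + 1))
    return vertices, triangles
```

For a flat 4×4 grid with `step=2`, this keeps 4 vertices and produces 2 triangles. In Unity you would feed the equivalent arrays into `Mesh.vertices` and `Mesh.triangles`, minding the winding order.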

This is an extremely challenging problem and any help would be appreciated.

Minimizing error during reduction of order is a fair question. This paper presents a poly-count reduction scheme that might be what you're looking for.
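Published poly-count reduction schemes typically rank vertices by how much removing them would change the surface. As a rough, hedged illustration of that idea (this is my own sketch, not the paper's algorithm): on a height grid you can score each interior vertex by the magnitude of its discrete Laplacian, a cheap curvature proxy, and keep the n highest-scoring vertices, so flat regions are thinned first and detail survives. The function name is hypothetical:

```python
def pick_detail_vertices(heights, x_max, z_max, n):
    """Rank interior vertices of a height grid by discrete Laplacian
    magnitude (a cheap curvature proxy) and keep up to n of the most
    'detailed' ones, always including the four corners."""
    scored = []
    for z in range(1, z_max - 1):
        for x in range(1, x_max - 1):
            # 5-point Laplacian: large magnitude = sharp local feature.
            lap = (heights[z][x - 1] + heights[z][x + 1] +
                   heights[z - 1][x] + heights[z + 1][x] -
                   4 * heights[z][x])
            scored.append((abs(lap), x, z))
    scored.sort(reverse=True)
    keep = {(0, 0), (x_max - 1, 0), (0, z_max - 1), (x_max - 1, z_max - 1)}
    for _, x, z in scored:
        if len(keep) >= n:
            break
        keep.add((x, z))
    return keep
```

The kept subset is no longer a regular grid, so triangulating it needs a general method such as 2D Delaunay triangulation over the (x, z) coordinates (e.g. `scipy.spatial.Delaunay`). For a single spike in an otherwise flat 5×5 grid, the spike vertex is selected first.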

Thank you for your fast response. I don't know yet whether it is something I can work with, but I will have a look.

