Augmented Reality Reverse Engineering

I have recently been working with the Vuforia AR SDK on object detection, and I was wondering how this presentation was made:

In my experience, achieving such precision in object tracking is nearly impossible. There is always model shaking, detection problems, etc.

Is it possible that additional gyroscopes are built into these objects to improve tracking?

Any clues?

Well, this is a custom solution built only for presentation purposes. As you can see, the table has a grid of points for reference. There might be an additional camera above the table to track the general orientation of the objects. Also note that there are only a few tracked objects, and all of them have high contrast against the background.

Since it's a presentation, the actual steps are known in advance, so the task is not to identify arbitrary objects; the system can simply expect a certain object at a certain position. Tracing the edges and measuring their relative lengths and orientations should be enough to determine the exact location within camera space. For example, the "house" model is clearly known to the system, and they have a prepared cutout overlay for it. This is (most likely) far from a general AR solution. I doubt there are additional gyros in those objects.
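To make the idea concrete: once you know the 3D layout of a few reference points (like that table grid) and where they appear in the image, recovering the pose is a standard perspective-n-point problem. Here is a minimal sketch using OpenCV's `solvePnP`; the grid spacing, camera intrinsics, and detected pixel coordinates are all made-up illustrative values, not anything from the video.

```python
# Sketch: estimating pose from a known reference grid with OpenCV.
# All numbers (grid spacing, intrinsics, detected corners) are illustrative.
import numpy as np
import cv2

# Known 3D positions of four grid points on the table plane (z = 0),
# assuming a 10 cm spacing -- the real grid layout is an assumption here.
object_points = np.array([
    [0.0, 0.0, 0.0],
    [0.1, 0.0, 0.0],
    [0.1, 0.1, 0.0],
    [0.0, 0.1, 0.0],
], dtype=np.float64)

# Where those points were detected in the camera image (pixels).
image_points = np.array([
    [320.0, 240.0],
    [420.0, 238.0],
    [424.0, 338.0],
    [322.0, 342.0],
], dtype=np.float64)

# Calibrated camera intrinsics (focal length, principal point).
camera_matrix = np.array([
    [800.0,   0.0, 320.0],
    [  0.0, 800.0, 240.0],
    [  0.0,   0.0,   1.0],
])
dist_coeffs = np.zeros(5)  # assume negligible lens distortion

ok, rvec, tvec = cv2.solvePnP(object_points, image_points,
                              camera_matrix, dist_coeffs)
if ok:
    # rvec/tvec map table coordinates into camera space; with them you can
    # project a prepared overlay (like the "house" cutout) onto the frame.
    overlay_point, _ = cv2.projectPoints(
        np.array([[0.05, 0.05, 0.0]]), rvec, tvec,
        camera_matrix, dist_coeffs)
    print("overlay lands at pixel", overlay_point.ravel())
```

With a rigid grid like that, the pose estimate is very stable from frame to frame, which would explain the lack of shaking in the presentation.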

To get a perfect-fit AR overlay you can't rely on the gyro alone, especially with a moving camera. To create a perfect match you have to analyze the image and track certain "features" of the objects. This page gives you an overview of the tracking features that Vuforia provides.
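For a rough idea of what image-based feature tracking involves (this is a generic sketch with OpenCV's ORB detector, not Vuforia's internal implementation): detect features in a reference image of the object, match them against the current frame, and fit a transform for the overlay.

```python
# Sketch of generic natural-feature tracking (not Vuforia's internals):
# match ORB features between a reference image of the object and the
# current camera frame, then estimate a homography for the overlay.
import numpy as np
import cv2

def locate_object(reference_img, frame):
    """Return the homography mapping the reference image into the frame,
    or None if too few feature matches were found."""
    orb = cv2.ORB_create(nfeatures=1000)
    kp_ref, des_ref = orb.detectAndCompute(reference_img, None)
    kp_frm, des_frm = orb.detectAndCompute(frame, None)
    if des_ref is None or des_frm is None:
        return None

    # Hamming distance with cross-checking suits ORB's binary descriptors.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des_ref, des_frm)
    if len(matches) < 10:  # threshold is an arbitrary sanity check
        return None

    src = np.float32([kp_ref[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_frm[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    # RANSAC discards outlier matches, which is what keeps the overlay
    # from jumping around on individual bad frames.
    H, mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    return H
```

Note how much this depends on the object having distinctive, high-contrast texture, which matches the earlier observation that all the tracked objects in the presentation stand out strongly from the background.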
