Posted by Vikram Sharma, Software Engineering Intern, Jianing Wei, Staff Software Engineer, and Tyler Mullen, Senior Software Engineer

Augmented Reality (AR) technology creates fun, engaging, and immersive user experiences. The ability to perform AR tracking across devices and platforms, without initialization, remains important for powering AR applications at scale.

Today, we are excited to release the Instant Motion Tracking solution in MediaPipe. It is built upon the MediaPipe Box Tracking solution we released previously. With Instant Motion Tracking, you can easily place fun virtual 2D and 3D content on static or moving surfaces, allowing it to seamlessly interact with the real world. This technology also powered MotionStills AR. Along with the library, we are releasing an open-source Android application to showcase its capabilities. In this application, a user simply taps the camera viewfinder to place virtual 3D objects and GIF animations, augmenting the real-world environment.

[Figure: Instant Motion Tracking in MediaPipe]

Instant Motion Tracking

The Instant Motion Tracking solution provides the capability to seamlessly place virtual content on static or moving surfaces in the real world. To achieve this, we provide six-degrees-of-freedom tracking with relative scale in the form of rotation and translation matrices. This tracking information is then used in the rendering system to overlay virtual content on camera streams to create immersive AR experiences. The core concept behind Instant Motion Tracking is to decouple the camera's translation and rotation estimation, treating them instead as independent optimization problems. We do this by first finding the 3D camera translation using only the visual signals from the camera, which involves estimating the target region's apparent 2D translation and relative scale across frames. This approach enables AR tracking across devices and platforms without initialization or calibration.
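The post describes the 6DoF tracking result as a pair of rotation and translation matrices that a renderer consumes. As a minimal illustration (not MediaPipe's actual API; the function name and NumPy usage are assumptions), a renderer would typically combine the two into a single 4x4 homogeneous pose matrix before overlaying virtual content:

```python
import numpy as np

def pose_matrix(rotation, translation):
    """Compose a 4x4 homogeneous pose matrix from a 3x3 rotation
    matrix and a 3-vector translation. This is the standard way a
    rendering system consumes a rotation + translation pair; the
    helper itself is illustrative, not part of MediaPipe."""
    pose = np.eye(4)
    pose[:3, :3] = rotation  # upper-left 3x3 block: orientation
    pose[:3, 3] = translation  # last column: position
    return pose

# Example: 90-degree rotation about the y axis, one unit along z.
theta = np.pi / 2
rot_y = np.array([
    [np.cos(theta), 0.0, np.sin(theta)],
    [0.0,           1.0, 0.0          ],
    [-np.sin(theta), 0.0, np.cos(theta)],
])
pose = pose_matrix(rot_y, np.array([0.0, 0.0, 1.0]))
```

Keeping rotation and translation separate until this final composition mirrors the decoupling the post describes: each part can be estimated (and debugged) independently, then merged only for rendering.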
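The translation estimate rests on tracking a target region's apparent 2D shift and relative scale across frames. A sketch of that idea, assuming the region is reported as an axis-aligned box in normalized image coordinates (the box representation and function name are assumptions for illustration, not the Box Tracking API):

```python
import numpy as np

def region_motion(box_prev, box_curr):
    """Estimate the apparent 2D translation and relative scale of a
    tracked region between two frames. Boxes are (x_min, y_min,
    x_max, y_max) in normalized image coordinates."""
    def center_and_size(box):
        x0, y0, x1, y1 = box
        center = np.array([(x0 + x1) / 2.0, (y0 + y1) / 2.0])
        size = np.array([x1 - x0, y1 - y0])
        return center, size

    c_prev, s_prev = center_and_size(box_prev)
    c_curr, s_curr = center_and_size(box_curr)
    translation = c_curr - c_prev            # apparent 2D shift on screen
    scale = float(np.mean(s_curr / s_prev))  # >1 means region grew (moved closer)
    return translation, scale

# Region shifts right by 0.1 and grows by 25% between frames.
t, s = region_motion((0.4, 0.4, 0.6, 0.6), (0.475, 0.375, 0.725, 0.625))
```

A growing region with little lateral shift suggests camera motion toward the target, while a pure shift suggests sideways motion, which is how 2D observations like these can constrain the 3D camera translation.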