Our future AR devices must be more perceptive in order to be genuinely useful to us. For devices to understand where they are in relation to people and other objects, and to make sense of any given situation, they need a virtual 3D map of their surroundings. But scanning and reconstructing a space in real time from scratch is far too power-intensive, so AR glasses will need to tap into an existing 3D map we call LiveMaps.
LiveMaps uses computer vision to construct a virtual representation of the parts of the world that are relevant to you. With these 3D maps, our future devices will be able to efficiently see, analyze, and understand the world around them and better serve the people who use them. These devices will also keep track of changes, like new street names, and update the map in real time. The Project Aria device is testing how this can work in practice.
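The core idea, localizing against an existing map instead of reconstructing the scene from scratch, can be sketched in a few lines. Everything below is a hypothetical toy, not LiveMaps code: the landmark names and the `localize` and `detect_changes` functions are illustrative inventions, assuming a map of recognizable landmarks with known 3D world positions.

```python
import math

# Toy stand-in for a prebuilt 3D map: landmark IDs -> known world positions.
# (A real system would store far richer geometry; this only illustrates
# the localize-against-an-existing-map idea.)
world_map = {
    "lamp_post": (2.0, 0.0, 5.0),
    "door": (0.0, 1.0, 3.0),
    "sign": (-1.0, 0.5, 4.0),
}

def localize(observations):
    """Estimate the device's position from landmarks it recognizes.

    `observations` maps landmark IDs to positions relative to the device.
    Since the map already stores world positions, the device position is
    simply world_pos - relative_pos, averaged over matched landmarks --
    no costly full reconstruction required.
    """
    estimates = [
        tuple(w - r for w, r in zip(world_map[lid], rel))
        for lid, rel in observations.items()
        if lid in world_map
    ]
    n = len(estimates)
    return tuple(sum(axis) / n for axis in zip(*estimates))

def detect_changes(observations, device_pos, tol=0.1):
    """Flag landmarks whose observed world position no longer matches the
    stored map -- candidates for a real-time map update."""
    changed = []
    for lid, rel in observations.items():
        if lid not in world_map:
            continue
        observed_world = tuple(d + r for d, r in zip(device_pos, rel))
        if math.dist(observed_world, world_map[lid]) > tol:
            changed.append(lid)
    return changed

# The device sees two known landmarks at these device-relative offsets:
obs = {"lamp_post": (1.0, 0.0, 4.0), "door": (-1.0, 1.0, 2.0)}
pos = localize(obs)               # -> (1.0, 0.0, 1.0)
stale = detect_changes(obs, pos)  # -> [] (map still matches reality)
```

The split mirrors the division of labor described above: the heavy mapping work happens once and is shared, while each device does only the cheap matching and reports any discrepancies back so the shared map stays current.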