There’s a lidar scanner on Apple’s newest iPad, but it isn’t really for the iPad at all.

But it's even more appropriate when you leave your apartment and step out into the world, which introduces complications like walking around. "This isn't your run-of-the-mill stand-still-and-point-your-phone-at-something AR experience," says Jessica Brillhart, director of the Mixed Reality Lab at the University of Southern California's Institute for Creative Technologies. "Now you can walk and AR in a much more seamless way. Virtual objects will now be easily occluded, meaning assets will be much more baked into the world you're moving through, plus they'll continually map to your perspective and position within space relative to their placement."
That becomes even more important when you offload AR sensors to eyeglasses rather than a tablet or phone. Rather than having to reference narrow slices of a virtual world through a smartphone screen, a lidar scanner will enable experiences that truly envelop you. Apple's lidar component also doesn't take up much space.
"The technology is ideally suited for all small-scale devices," says Wetzstein. "It's low power, it's lightweight, it's small-scale, it gives you high-quality depth. You can probably combine a number of things together so they wouldn't interfere with one another," because lidar works by identifying individual points in space rather than mapping an entire room all at once.
So why is it on an iPad Pro again, the least mobile of Apple's mobile computers? Think of it as a headset head start, for both the supply chain that provides the component and the developers who need time to figure out what to do with it. "iPad feels like an especially odd place for it, but it may just be that suppliers aren't ready to meet iPhone demand yet," says Troughton-Smith. It also gives Apple a chance to get developers to build experiences now that it can eventually show off whenever the next iPhone is introduced.
That last bit will be especially important. While Apple has spent the last several years pushing its ARKit framework, devoting big chunks of stage time to it at its annual Worldwide Developers Conference, the tech has yet to go mainstream outside of a few specific cases, most notably Pokémon Go. If and when Apple does finally push into a new product category, it needs fully baked experiences to show off along with it. Getting lidar in front of developers through the iPad Pro will help familiarize them now, helping to stave off a potential chicken-and-egg problem later.
"Immersive media has a content problem more so than a hardware problem," says Brillhart. "Most immersive technologies fail if engaging content is lacking, which is practically every piece of hardware right now. If I were Apple, I would be trying to seed this ecosystem now so that when I did release hardware, I would be avoiding a similar fate."
Whatever form Apple's head-worn AR takes, and when it might arrive, remains unclear, especially given the disruptive long-term effects of Covid-19. But when it does come, you can expect lidar to play a critical role. The iPad Pro is its dress rehearsal.
