Snapchat has unveiled Lens Studio 3.2, which lets AR creators and developers build LiDAR-powered Lenses for the new iPhone 12 Pro, bringing AR experiences closer than ever to the real world.
The LiDAR Scanner on iPhone 12 Pro and iPhone 12 Pro Max enables immersive AR experiences that overlay more seamlessly onto the real world: it lets Snapchat’s camera see a metric-scale mesh of the scene, understanding the geometry and meaning of surfaces and objects.
This new level of scene understanding allows Lenses to interact realistically with the surrounding world. Thanks to the power of A14 Bionic and ARKit, creators can render thousands of AR objects in real time, building immersive, magical environments for the whole Snapchat community to explore.
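For developers curious what this scene understanding looks like at the API level, the sketch below shows how a generic ARKit app might enable LiDAR-based scene reconstruction. This is a minimal illustration of Apple's public ARKit API, not Snapchat's or Lens Studio's internal implementation.

```swift
import ARKit

// A minimal ARKit sketch (not Snapchat's implementation): request a
// metric-scale scene mesh from the LiDAR Scanner, with per-face
// classification (floor, wall, table, ...) — the "geometry and meaning
// of surfaces and objects" described above.
func makeLiDARConfiguration() -> ARWorldTrackingConfiguration {
    let config = ARWorldTrackingConfiguration()
    // Scene reconstruction is only supported on LiDAR-equipped devices,
    // such as iPhone 12 Pro or the latest iPad Pro.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.meshWithClassification) {
        config.sceneReconstruction = .meshWithClassification
    }
    return config
}

// Usage: pass the configuration to an ARSession; the session then
// delivers ARMeshAnchor objects describing the reconstructed scene.
// session.run(makeLiDARConfiguration())
```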
“The addition of the LiDAR Scanner to iPhone 12 Pro models enables a new level of creativity for augmented reality,” said Eitan Pilipski, Snap’s SVP of Camera Platform. “We’re excited to collaborate with Apple to bring this sophisticated technology to our Lens Creator community.”
Through a new interactive preview mode in Lens Studio 3.2, developers can create LiDAR-powered Lenses and preview how they will appear in the world, even before the new iPhone 12 Pro arrives in stores. These Lenses can also be used on Apple’s latest iPad Pro.