Multimedia messaging company Snapchat says it plans to be among the first third-party developers to capitalize on the new LiDAR sensor in the iPhone 12 Pro models.

Snapchat, the popular multimedia messaging company, confirmed on Tuesday that it plans to take advantage of the LiDAR sensor in the new iPhone 12 Pro. The LiDAR sensor, first seen on the iPad Pro family, brings a host of new augmented reality (AR) and photographic capabilities to Apple's iPhone 12 Pro and iPhone 12 Pro Max. According to Apple, the sensor allows for 6x faster autofocus, particularly in low-light conditions.

If Snapchat's plan goes forward, it will be among the first third-party app developers to capitalize on the iPhone 12 Pro's new LiDAR sensor. During its "Hi, Speed" event, Apple said the sensor can potentially improve AR experiences from third-party developers.

On Tuesday, the same day as the "Hi, Speed" event, Snapchat announced plans to launch a Lens designed specifically for iPhone 12 Pro devices, and revealed that the Lens will capitalize on the LiDAR sensor in the newest members of the iPhone family.

During its keynote, Apple showed a preview of the new Snapchat capabilities. The new Lens filter, which strongly suggests what the multimedia messaging company has in store, appears in Apple's keynote video at around the 59-minute, 41-second mark. This was before Snapchat announced its plans to capitalize on the iPhone 12 Pro's LiDAR sensor.

In a later interview with TechCrunch, Snapchat confirmed that the Lens shown in Apple's keynote is the same one it plans to launch in late 2020. The iPhone 12 Pro's LiDAR sensor is a time-of-flight system that uses lasers to precisely generate a depth map of its surroundings, resulting in faster and more accurate AR.
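To illustrate the time-of-flight principle behind a sensor like this (this is a conceptual sketch, not Snapchat's or Apple's actual implementation), the distance to a surface follows from how long an emitted light pulse takes to bounce back:

```python
# Illustrative sketch of the time-of-flight principle used by LiDAR:
# an emitted pulse travels to a surface and back, so
# distance = (speed of light * round-trip time) / 2

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Return the one-way distance in meters for a measured round-trip time."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse that returns after about 20 nanoseconds came from roughly 3 meters away.
print(round(distance_from_round_trip(20e-9), 2))
```

Repeating this measurement across thousands of points per frame is what yields the depth map the sensor produces.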

With the LiDAR sensor, third-party app developers like Snapchat can perform functions such as room and object scanning, which may lead to much-improved AR shopping apps, games, and home design tools. It can also produce photo and video effects with more precise placement of AR objects, since the iPhone 12 Pro can now "see" a depth map of the room.
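As a rough sketch of how a depth map enables that precise placement (using a standard pinhole-camera model with made-up intrinsics, not Apple's ARKit API), an app can convert a pixel and its measured depth into a 3D point at which to anchor an AR object:

```python
def unproject(u: float, v: float, depth: float,
              fx: float, fy: float, cx: float, cy: float) -> tuple:
    """Convert pixel (u, v) with depth in meters to a 3D camera-space point,
    using a pinhole camera model with focal lengths fx/fy and center (cx, cy)."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

# Hypothetical intrinsics for a 640x480 depth image.
fx = fy = 500.0
cx, cy = 320.0, 240.0

# The center pixel at 2 m depth maps to a point straight ahead of the camera.
print(unproject(320, 240, 2.0, fx, fy, cx, cy))  # (0.0, 0.0, 2.0)
```

With a 3D point for every pixel, an effect can pin virtual grass to the floor or flowers to a tabletop instead of guessing at scene geometry from a flat image.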

The capabilities brought by the iPhone 12 Pro's LiDAR sensor can lead to better AR experiences like the one Snapchat plans to introduce. In the preview, Snapchat's AR Lens places elements like grass and flowers on the table and floor, while birds fly toward the user's face. Although the Lens shown in the preview is the one Snapchat is currently working on, the company did not provide further details on exactly how it uses the iPhone 12 Pro's LiDAR sensor to give users a new AR experience.