Apple is looking into possible applications for real-time LiDAR surface tracking technology, as well as ways of efficiently recording touch sensations.

The future of computing is near, and it may turn out to be just as the movies have always portrayed it. Apple is rumored to be exploring applications that could benefit from real-time LiDAR surface tracking, as well as different ways to record touch sensations. The same reports indicate that the company is working on an augmented reality (AR) or virtual reality (VR) headset that uses light emitted from a display to track motion on practically any surface, while finger-worn devices tell the system what kind of object the user is currently touching.

Apple's work on an AR or VR headset has been the subject of speculation for some time now. Leaked patent filings and reports show that this work could take the company in several directions. For instance, two patents granted by the US Patent and Trademark Office describe a number of ways the system could function, with real-time LiDAR surface tracking and touch sensation monitoring making all of them possible.

The company's inclusion of cameras in its AR headset setup offers several advantages. While cameras are typically used to capture an image of a scene and to recognize objects, the Cupertino tech giant also uses them to track objects, monitoring how an object's position shifts in real time relative to the position of the headset.
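To make that idea concrete, the sketch below shows one way headset-relative tracking can be expressed: an object's world-space position is re-expressed in the headset's own coordinate frame each frame. This is a minimal illustration, not Apple's implementation; the `Pose` type and function names are assumptions.

```swift
import simd

// Hypothetical sketch: expressing a tracked object's position relative
// to the headset. `Pose` and the function name are illustrative only.
struct Pose {
    var rotation: simd_quatf      // headset orientation in world space
    var translation: SIMD3<Float> // headset position in world space
}

/// Converts an object's world-space position into headset-relative
/// coordinates, the quantity a tracker would update every frame.
func objectPosition(relativeTo headset: Pose,
                    worldPosition p: SIMD3<Float>) -> SIMD3<Float> {
    // Undo the headset's translation, then its rotation.
    headset.rotation.inverse.act(p - headset.translation)
}

// Example: an object one meter in front of a headset at the origin.
let headset = Pose(rotation: simd_quatf(angle: 0, axis: SIMD3<Float>(0, 1, 0)),
                   translation: SIMD3<Float>(0, 0, 0))
let relative = objectPosition(relativeTo: headset,
                              worldPosition: SIMD3<Float>(0, 0, -1))
print(relative) // SIMD3<Float>(0.0, 0.0, -1.0)
```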

Apple is also considering other object-generated light sources, such as LED indicators or patterns shown on a device's display. If display patterns are used, they can supply the system with adequate orientation and positioning data for as long as the screen remains visible. These light-based waypoints can also be emitted at wavelengths outside the visible spectrum, which keeps the system unobtrusive: it neither distracts users nor impedes their field of view.
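One way to picture such light-based waypoints is sketched below: each waypoint carries a wavelength and a visibility flag, and the system only attempts a position fix while enough waypoints remain in view. The `Waypoint` type, the ~750 nm visible-light cutoff, and the centroid-based fix are illustrative assumptions, not details from the patent.

```swift
// Hypothetical sketch of light-based waypoints with non-visible wavelengths.
struct Waypoint {
    var position: SIMD3<Float> // known location on the emitting device
    var wavelengthNm: Double   // emitted wavelength in nanometers
    var isInView: Bool         // currently detected by the headset camera
}

extension Waypoint {
    /// Infrared waypoints (above ~750 nm) are invisible to the user,
    /// so they can anchor tracking without cluttering the field of view.
    var isUnobtrusive: Bool { wavelengthNm > 750 }
}

/// Returns a rough position estimate (the centroid of visible waypoints),
/// or nil when too few waypoints are in view to trust the fix.
func positionFix(from waypoints: [Waypoint],
                 minimumVisible: Int = 3) -> SIMD3<Float>? {
    let visible = waypoints.filter { $0.isInView }
    guard visible.count >= minimumVisible else { return nil }
    let sum = visible.reduce(SIMD3<Float>(0, 0, 0)) { $0 + $1.position }
    return sum / Float(visible.count)
}
```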

Since the system also relies on non-visible light, LiDAR is a natural fit, and Apple is already very familiar with depth-sensing hardware: it ships infrared projectors in its TrueDepth cameras and a LiDAR scanner on the back of the iPhone 12 Pro and iPhone 12 Pro Max. LiDAR emits pulses of infrared light that reflect off surfaces and back to an imaging sensor; the round-trip time of each pulse is then used to map the depth of objects in the scene. Apple's patent, filed on January 31, 2019, lists Peter Meier as the inventor.
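The time-of-flight principle behind that depth mapping is simple enough to show directly: a pulse travels to the surface and back, so depth is half the round-trip distance. The sketch below is a minimal illustration of that arithmetic; the function name is a hypothetical, not Apple's API.

```swift
// Time-of-flight depth: a pulse covers the distance twice (out and back),
// so depth = (speed of light × round-trip time) / 2.
let speedOfLight = 299_792_458.0 // meters per second

/// Converts a measured round-trip time in seconds into depth in meters.
func depth(fromRoundTripTime t: Double) -> Double {
    (speedOfLight * t) / 2
}

// Example: a reflection arriving after about 6.67 nanoseconds
// corresponds to a surface roughly one meter away.
print(depth(fromRoundTripTime: 6.67e-9)) // ≈ 1.0
```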