If you own an iPhone 12 Pro or 12 Pro Max, you’ve probably heard a lot about LiDAR. Apple credits the large black dot alongside the triple camera system for much of the device’s camera performance. Apple first shipped the LiDAR sensor on the iPad Pro, before bringing it to the iPhone 12 Pro series.
The company proudly shows off advanced AR (Augmented Reality) apps that use the world as a canvas at every major presentation. Going by Apple’s claims, LiDAR has a wide range of applications in mobile devices.
Furthermore, with reports claiming that LiDAR will be included in all iPhone 13 models this year, it appears we’ll have to get used to LiDAR sensors in general. Android phones are also expected to receive LiDAR sensors in the near future.
So, what exactly is LiDAR and why is it such a big deal?
Let’s take a closer look.
What is Apple’s LiDAR Scanner?
LiDAR stands for light detection and ranging.
A LiDAR scanner measures the time it takes for a pulse of light (usually a laser) to bounce back to determine the distance between itself and an object. It works in the same way as radar, but instead of radio waves, it uses infrared light.
Because of the way light is absorbed by objects in its path, LiDAR works on a smaller scale than radar. By firing hundreds of thousands of light pulses per second, LiDAR scanners can measure distances and object sizes with reasonable accuracy at short range.
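The ranging principle is simple enough to sketch in a few lines: the pulse travels to the object and back, so the distance is half the round-trip time multiplied by the speed of light. This is an illustrative calculation only, not Apple’s implementation:

```python
# Sketch of the time-of-flight principle behind LiDAR ranging.
# Illustrative only; the actual scanner does this in hardware.

SPEED_OF_LIGHT = 299_792_458  # meters per second

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Distance to an object given the round-trip time of a light pulse.

    The pulse travels out and back, so the one-way distance is
    half the total path length.
    """
    return (SPEED_OF_LIGHT * round_trip_seconds) / 2

# A pulse returning after roughly 33.4 nanoseconds hit something
# about 5 meters away -- the quoted range of Apple's scanner.
print(round(distance_from_round_trip(33.356e-9), 2))  # ~5.0 meters
```

The tiny time scales involved are why time-of-flight sensing needs dedicated hardware: resolving a few centimeters of depth means timing light pulses to fractions of a nanosecond.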
This information can then be used to create 3D models, which is one of LiDAR’s primary applications in construction and engineering. So if you have heard of 3D laser scans being used to create building plans then that is LiDAR.
What is Special About Apple’s LiDAR Scanner?
Many Android phones have time-of-flight (ToF) sensors, which help them sense scene depth and mimic the bokeh effects of larger cameras. But the LiDAR system present in the iPhone 12 Pro and iPad Pro 2020 promises to go even further. That’s because it’s a LiDAR scanner, as opposed to the ‘scanner-less’ systems that have previously been seen on smartphones.
- A scanning LiDAR system fires a train of laser pulses at different parts of a scene over a short period of time, whereas the ‘scanner-less’ systems use a single pulse of infrared light to create their 3D maps.
- This has two main advantages: an increased range of up to five meters and improved object ‘occlusion,’ which is the appearance of virtual objects disappearing behind real-world objects such as trees.
- The data from the LiDAR scanner is combined with data from cameras and a motion sensor, then enhanced by computer vision algorithms on the A12Z Bionic for a more detailed understanding of the scene.
To put it another way, a lot is going on to make it appear seamless.
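The occlusion effect in particular comes down to a per-pixel depth comparison: a virtual object is only drawn where it is nearer to the camera than the real-world surface the scanner measured. A minimal sketch using plain depth lists, assuming nothing about ARKit’s actual pipeline:

```python
# Hedged sketch of depth-based occlusion: a virtual pixel is visible
# only where the virtual object is closer than the measured real depth.
# Illustrative only; ARKit performs this internally on the GPU.

def occlusion_mask(real_depth, virtual_depth):
    """Return True for each pixel where the virtual object should be drawn.

    real_depth: per-pixel distances from the LiDAR depth map (meters)
    virtual_depth: per-pixel distances of the rendered virtual object
    """
    return [v < r for v, r in zip(virtual_depth, real_depth)]

# A virtual object 1.5 m away, partially hidden behind a real tree at 1.0 m:
real = [3.0, 1.0, 1.0, 3.0]       # the tree occupies the middle pixels
virtual = [1.5, 1.5, 1.5, 1.5]
print(occlusion_mask(real, virtual))  # [True, False, False, True]
```

The middle pixels come back `False` because the tree is closer than the virtual object there, so the object appears to slip behind it, which is exactly the effect described above.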
What Could Apple’s LiDAR Scanner Be Used For?
Apple has previewed a few LiDAR-specific applications in its demos, some of which may launch this year. One of the more interesting takeaways from these demos is a game called Hot Lava.
Hot Lava, a first-person adventure game for iOS and PC, will get a new ‘AR mode’ that uses Apple’s LiDAR sensor to bring its molten rivers into your living room.
So far, the demo isn’t quite as impressive as people had hoped – most of the objects your character leaps around are in-game renders rather than real-life furniture – but knowing Apple, things will likely improve in the future.
This addition of LiDAR to the iPhone 12 Pro will vastly expand the number of apps that support the technology, which could be a game-changer for the iPhone camera.
So what about non-gaming LiDAR sensor experiences?
Currently, the focus of non-gaming LiDAR applications is interior design.
The IKEA Place app, for example, allows you to move virtual furniture around in your living room as if you were in a real-life version of The Sims.
App developers could use the LiDAR sensor to create new creative forms that go beyond traditional photography and video.
LiDAR for Vehicular Applications
While LiDAR is a new feature on Apple’s handheld devices, the tech giant has been using it in other applications for years.
Apple vehicles equipped with LiDAR sensors have been seen in California since 2015. This technology is also seen as a critical component in the development of autonomous vehicles, particularly in terms of allowing them to accurately analyze their surroundings.
The company appears to be steadily investing in LiDAR and related research for vehicular applications, as evidenced by its patent filings.
In 2017, Apple released a research paper describing LiDAR-based 3D object recognition systems for self-driving cars. Fundamentally, the system combines LiDAR depth mapping with neural networks to greatly improve the self-driving car’s ability to “see” its environment and potential hazards.
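A common preprocessing step in this kind of system is projecting the raw LiDAR point cloud into a bird’s-eye occupancy grid that a neural network can consume. The sketch below is a hypothetical illustration of that idea, not the pipeline from Apple’s paper:

```python
# Illustrative sketch: project LiDAR returns onto a ground-plane
# occupancy grid, a typical input format for 3D detection networks.
# Hypothetical example, not Apple's published method.

def occupancy_grid(points, cell_size=1.0, grid_dim=4):
    """Mark which ground cells ahead of the car contain a LiDAR return.

    points: (x, y) coordinates in meters (x forward, y to the left)
    Returns a grid_dim x grid_dim grid of 0/1 cell occupancies.
    """
    grid = [[0] * grid_dim for _ in range(grid_dim)]
    for x, y in points:
        col = int(x // cell_size)   # forward distance -> column
        row = int(y // cell_size)   # lateral offset   -> row
        if 0 <= row < grid_dim and 0 <= col < grid_dim:
            grid[row][col] = 1      # at least one return in this cell
    return grid

# Two returns from an obstacle about 2.5 m ahead, slightly to the left:
print(occupancy_grid([(2.5, 1.2), (2.6, 1.3)]))
```

Collapsing millions of points into a fixed-size grid like this is what lets a conventional convolutional network “see” the scene the way it would see an image, with occupied cells standing in for pixels.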
Could LiDAR become a necessity for phones in the future?
It’s unlikely that this will happen. Apple has been pushing Augmented Reality hard in its ecosystem, but the benefits have been limited so far.
- Apart from AR-based games and a few scanning apps, AR as a concept has not yet reached its full potential. In the Apple world, augmented reality is currently limited to casual games, education, and basic scanning applications.
- Apple’s strategy is likely to include LiDAR sensors in a growing number of devices over time, while developers work on apps that take advantage of the improved performance. Apple appears to be betting big on augmented reality, given the company’s renewed interest in the technology over the last few software releases.
- Google, on the other hand, has demonstrated that similar results can be achieved without dedicated hardware. On almost all modern devices, including Android phones, you can view AR figures in virtual space. Google’s algorithms can build decent AR applications from a single camera sensor alone.
- Pokemon Go is a great example of a game that works just as well on a low-cost Android device as it does on an iPhone with a LiDAR sensor. Of course, having a LiDAR sensor could aid developers in better mapping the virtual world of Pokemon in low-light situations.
- Similarly, if you enable AR in Google Maps, it will show directions in the AR space. So while LiDAR appears to be useful, Android device makers could simply rely on Google’s smart algorithms to handle all of the necessary AR tasks.
According to analysts, Apple may also be planning to extend the functionality of LiDAR beyond phones and tablets. The company is currently rumored to be working on AR glasses.
If such a project exists, then it makes sense that accurate AR would be at the heart of the experience.
Is LiDAR Worth an Upgrade?
LiDAR is unlikely to influence your decision between the iPhone 12 and the iPhone 12 Pro. You won’t see any benefits in the short term unless you use a lot of AR-enabled apps or take a lot of photos at night. And even if you’re a die-hard AR gamer or a flat-pack junkie, AR on non-LiDAR iPhones has vastly improved in just a few generations. LiDAR improves things further, but probably not enough to justify the $300 premium Apple is asking for the iPhone 12 Pro.
Is LiDAR an improvement/upgrade for Face ID?
The LiDAR sensor isn’t there to improve Face ID’s face-scanning login; rather, it exists for Augmented Reality (AR) applications. The LiDAR scanner is included in the iPhone 12 Pro series and the latest iPad Pro models to improve distance and depth measurement accuracy, which is notably poor with camera sensors alone.
Is the LiDAR coming to other devices?
Since its big reveal on the 11″ and 12.9″ iPad Pro, Apple has brought LiDAR to the iPhone 12 Pro and iPhone 12 Pro Max. Here the LiDAR scanner improves ARKit applications and photography. Combined with Ultra Wideband technology, the LiDAR scanner could also power indoor navigation and item tracking.
LiDAR has limited applications in the world of smartphones at this time. Few apps require LiDAR hardware beyond those that scan 3D spaces or place game elements.
Apple’s widespread adoption of LiDAR, on the other hand, may provide a reason for developers to make their apps rely more on this advanced hardware to improve the end-user experience.