iPhone 12 Pro lets people who are blind 'see' others around them



The lidar scanner on Apple’s new iPhone 12 Pro and 12 Pro Max enables new AR features — and the ability for people who are blind or low vision to detect others around them.

James Martin/CNET

Apple’s iPhone 12 Pro and 12 Pro Max have a new feature for users who are blind or low vision — the ability to essentially see other people coming. 

The devices make use of the new lidar sensor on the back of the phones to detect how close other people are to the user, something Apple has named People Detection. Lidar is a type of depth sensor that helps with augmented reality apps and serves as the eyes of self-driving cars. Now, Apple is applying it to accessibility in an effort to help people who have vision problems better navigate the world around them. 

When someone who is blind is grocery shopping, for instance, they’ll be able to turn on People Detection on their iPhone 12 Pro to let them know when they need to move up in the checkout line. Or someone walking down a sidewalk will get alerts about how close other people are as they pass by. People who are blind or low vision can use the feature to figure out if a seat is available at a table or on public transit, and they’ll be able to maintain proper social distance when going through health screening or security lines at an airport.

People Detection will be able to tell the person’s distance from the user in feet or meters, and it works up to 15 feet/5 meters away. Anyone in the iPhone 12 Pro’s wide-angle camera view can be detected by the feature. If there are multiple people nearby, People Detection will give the distance from the one closest to the iPhone user.
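The reporting behavior described above — announcing only the person closest to the user, and only within the sensor's roughly 15-foot range — can be sketched as a small helper. This is an illustrative sketch, not Apple's actual API; the function and constant names are hypothetical:

```python
# Hypothetical sketch of the closest-person reporting logic.
# Names and structure are illustrative, not Apple's implementation.

MAX_RANGE_FEET = 15.0  # People Detection works up to about 15 feet / 5 meters


def closest_person_feet(distances_feet):
    """Return the distance to the nearest detected person,
    or None if nobody is within the sensor's working range."""
    in_range = [d for d in distances_feet if d <= MAX_RANGE_FEET]
    return min(in_range) if in_range else None


print(closest_person_feet([12.0, 4.5, 9.0]))  # -> 4.5
print(closest_person_feet([20.0]))            # -> None
```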

The technology will be available as part of iOS 14.2 over the coming weeks. Apple released a beta version of the software for developers on Friday. 

Globally, at least 2.2 billion people have a vision impairment or blindness, according to a World Health Organization report from last year. In the US, over 1 million people over the age of 40 are blind, according to the Centers for Disease Control and Prevention. By 2050, that number could skyrocket to about 9 million because of the "increasing epidemics of diabetes and other chronic diseases and our rapidly aging US population," the CDC said. 

Apple has made accessibility a focus for decades. It builds features into its technology that allow people with motor impairments to virtually tap on interface icons. Four years ago, Apple showed off a new, dedicated accessibility site.

"Technology should be accessible to everyone," Apple CEO Tim Cook said at the time. 

Apple, in particular, has long made features to help people who are blind or low vision. Its new People Detection feature takes that a step further.

Lidar sensing

The technology makes use of the new lidar scanner built into the camera array of the iPhone 12 Pro and 12 Pro Max. It's also on the newest iPad Pro and is likely to come to other devices in the future. The scanner itself is a tiny black dot near the camera lens on the back of the new, highest-end iPhones. 

People Detection won't work on older iPhones, or on the iPhone 12, 12 Mini or even the new iPad Air. None of those devices comes with a lidar scanner, which is essential for the people-sensing technology. 



People Detection uses Apple's ARKit People Occlusion feature to detect whether someone is in the camera's field of view and to estimate how far away the person is. The lidar scanner makes that estimate more accurate: it sends out a short burst of light and measures how long the light takes to bounce back to the scanner. The new feature doesn't work in the dark or in low-light environments. 
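The round-trip timing described above reduces to a simple formula: distance is the speed of light multiplied by half the round-trip time, since the pulse travels out and back. A minimal sketch, with illustrative numbers:

```python
SPEED_OF_LIGHT_M_S = 299_792_458  # meters per second


def tof_distance_meters(round_trip_seconds):
    """Estimate distance from a lidar pulse's round-trip time.
    The light travels to the surface and back, so halve the path."""
    return SPEED_OF_LIGHT_M_S * round_trip_seconds / 2


# A surface about 3 meters away reflects the pulse after
# roughly 20 nanoseconds.
print(round(tof_distance_meters(20e-9), 2))  # -> 3.0
```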

All of the sensing happens in real time to give feedback on how far away a person is from the iPhone 12 Pro user. 

The user gets feedback from People Detection in four possible ways, and they can be used in any combination. All can be customized in settings. One way to get information about a person's closeness is through an audible readout. The phone will say out loud, "15, 14, 13" and so on, when it comes to feet. It gives the distance in half meters for people who choose that unit of measurement. 
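The readout behavior described above — counting down in whole feet, or in half-meter steps for metric users — might look like this hypothetical formatter (not Apple's implementation):

```python
import math


def readout(distance, unit="feet"):
    """Format a spoken distance announcement:
    whole feet, or half-meter steps for metric users.
    Illustrative only; not Apple's actual logic."""
    if unit == "feet":
        return str(int(distance))  # "15", "14", "13", ...
    # Round down to the nearest half meter.
    return str(math.floor(distance * 2) / 2)  # "4.5", "4.0", ...


print(readout(14.7))                # -> "14"
print(readout(4.8, unit="meters"))  # -> "4.5"
```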

iPhone 12 Pro users can also set a threshold distance with two distinct audio tones: one for when people are outside that distance, and another for when people are closer to the user. The default threshold setting is 6 feet, or 2 meters.
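The two-tone threshold behavior described above can be sketched as a tiny selector. The tone names and function are hypothetical, chosen only to illustrate the idea:

```python
DEFAULT_THRESHOLD_FEET = 6.0  # default threshold: 6 feet / 2 meters


def alert_tone(distance_feet, threshold_feet=DEFAULT_THRESHOLD_FEET):
    """Pick one of two distinct tones: one for people beyond the
    threshold distance, another for people closer than it.
    Tone names are illustrative, not Apple's."""
    return "near_tone" if distance_feet <= threshold_feet else "far_tone"


print(alert_tone(10.0))  # -> "far_tone"
print(alert_tone(4.0))   # -> "near_tone"
```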