Judy Dixon

"Nine, eight, seven, six..."—my iPhone cheerfully counted down the number of feet between me and the person ahead as I moved up in line at the COVID-19 vaccination site. Of all places, I was so happy to be able to independently socially distance here. What made this electronic assessment of the distance between me and the next person possible is the latest amazing feature to come to an iPhone, LiDAR.

What is LiDAR?

In October 2020, Apple released four new iPhone models. The two higher-end models, the iPhone 12 Pro and the iPhone 12 Pro Max, included a new, highly touted feature called LiDAR, which had debuted earlier that year on the 2020 iPad Pro. It was a bit of a mystery at first. We knew it had something to do with the camera, but Apple's description, "AR at the speed of light," sounded pretty sensational and didn't do a great deal to tell us how it was going to change our lives.

But LiDAR is definitely life-changing. LiDAR stands for “light detection and ranging,” and works by bouncing lasers off objects to very accurately measure their distance based on how long it takes for light to return to the receiver. It's like radar except that instead of radio waves, it uses infrared light.
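To make the arithmetic concrete, here is a toy worked example in Swift. The 33-nanosecond figure is purely illustrative, not a number from Apple's hardware; it is roughly the round-trip time for light reflecting off an object five meters away.

```swift
import Foundation

// Toy illustration of the time-of-flight arithmetic behind LiDAR:
// distance = (speed of light x round-trip time) / 2.
let speedOfLight = 299_792_458.0   // meters per second
let roundTripTime = 33.0e-9        // 33 nanoseconds (illustrative)

let distance = speedOfLight * roundTripTime / 2.0
print(String(format: "Distance: %.2f meters", distance))
// Prints: Distance: 4.95 meters
```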

LiDAR technology has been around since the 1960s. It has been used in industries such as robotics, surveying, meteorology, and even in the space program. LiDAR is often used to make high-resolution maps. In fact, LiDAR came to the attention of the public in 1971 when it was used by astronauts to map the surface of the moon. It is also used in self-driving cars to judge the distance to cyclists and pedestrians, and by the guidance systems of robot vacuums.

There are many types of LiDAR sensors. The scanning LiDAR systems used by iPhones and iPads fire thousands of laser pulses at different parts of a scene over a short period of time. This type of scanner has a range of up to five meters (about 16 feet). The LiDAR scanner's data is analyzed along with data from the cameras and motion sensor, then enhanced by computer vision algorithms for a more detailed understanding of the entire scene. 
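For readers who are curious how apps get at this data, below is a minimal sketch using Apple's ARKit framework. It is not the code inside any app discussed in this article (the class name DepthReader is invented for illustration), but the ARKit calls are the real ones a developer would use to receive the fused depth data.

```swift
import ARKit

// A minimal sketch: read the LiDAR scanner's per-pixel depth map
// through ARKit. Requires a LiDAR-equipped iPhone or iPad.
class DepthReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let configuration = ARWorldTrackingConfiguration()
        // .sceneDepth is only available on devices with a LiDAR scanner.
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
            configuration.frameSemantics.insert(.sceneDepth)
        }
        session.delegate = self
        session.run(configuration)
    }

    // ARKit delivers a fresh frame many times per second; each one can
    // carry a depth map fused from the LiDAR, cameras, and motion sensor.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let sceneDepth = frame.sceneDepth else { return }
        // depthMap is a CVPixelBuffer of distances in meters.
        let width = CVPixelBufferGetWidth(sceneDepth.depthMap)
        let height = CVPixelBufferGetHeight(sceneDepth.depthMap)
        print("Received a \(width) x \(height) depth map")
    }
}
```

Computer vision models can then be run over each frame, which is broadly how apps layer object and people recognition on top of the raw distances.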

In practice, the iPhone's LiDAR scanner is designed more for room-scale applications, like games or IKEA's Place app, which lets customers move prospective furniture purchases around in their own homes so they can see how they will look. The LiDAR on iPhones is currently not accurate enough to create 3D scans of individual objects.

The iPhone's LiDAR scanner can also help the phone take better pictures, especially at night when combined with Night Mode. By sensing the distance between your iPhone and the subject you’re taking a picture of, the camera can figure out at what distance it should focus to get the best result.

But even with distance and other limitations, the LiDAR on iPhones can have huge benefits for blind users. Let's have a look at Apple's People Detection feature and three apps that are using LiDAR to provide information to blind users.

People Detection

One of the first practical uses for LiDAR came along in November 2020, when Apple added a People Detection feature in iOS 14.2. This is the one I used at the vaccination center. People Detection is part of the Magnifier app, which is on the phone by default. It uses the rear-facing camera to alert you to people nearby, and it can help you maintain a physical distance from the people around you by keeping you informed of how far away they are.

You can invoke People Detection in several ways:

  • Open the Control Center. By default, Magnifier is in the Control Center near the very bottom. Double tap to open it and double tap the People Detection button near the bottom right corner of the screen.
  • Perform a VoiceOver gesture. By default, the four-finger triple-tap gesture turns People Detection on. To assign a different gesture, go to Settings > Accessibility > VoiceOver > Commands > Touch Gestures. When People Detection is on, this same gesture takes the focus to the End button at the top left of the screen.
  • Add People Detection to the Accessibility Shortcut menu. Do this by going to Settings > Accessibility > Accessibility Shortcut. If you already use the Accessibility Shortcut to turn VoiceOver on and off, adding another option to this menu will make things a bit more complicated. Using the menu to start People Detection works well, but when you use it to turn VoiceOver off you will have no access to the menu when you try to turn VoiceOver back on. The only option here is to use Siri to turn VoiceOver on.
  • Add People Detection to Back Tap. This lets you turn it on with a double or triple tap on the back of your phone. Do this by going to Settings > Accessibility > Touch > Back Tap. If you do enable Back Tap to turn on People Detection, you will need to be very careful whenever you set your phone down on a hard surface.

If you prefer to have an actual Magnifier app, go to Settings > Accessibility. You will find Magnifier near the top of the list; it is off by default. Double tap it to bring up a screen where you can turn it on: double tap Magnifier Off and it will become Magnifier On. Turning Magnifier on in Settings adds the app to your App Library.

If you would like to have the Magnifier app on one of your home screens, go to the App Library and use the Search function to locate Magnifier. Once found, bring focus to Magnifier, be sure your rotor is on Actions, swipe down once and you will hear "Drag Magnifier." Double tap it. Now, do a three-finger swipe right to move to your home screens. Do this as many times as necessary to get to the screen where you would like to place Magnifier. Swipe down until you hear an instruction such as "Drop Magnifier before Notes" (or something similar), and double tap to place Magnifier in the location described.

If you do this, you have two additional ways to open Magnifier and get to People Detection. They are:

  • Launch Magnifier and double tap the People Detection button near the bottom right corner of the screen
  • Tell Siri "Open Magnifier," and double tap People Detection near the bottom right corner of the screen

The People Detection screen has a large Viewfinder in the center and an End button in the top left. When no one is visible, "no people detected" appears near the bottom of the screen. As soon as the app detects a person, it begins beeping and vibrating, and the distance in feet is spoken and displayed near the bottom of the screen. As you get closer to the person, the number of feet spoken decreases and the beeps and vibrations speed up. At times, the feature does seem to get confused by mirrors and photographs of people. When you press the End button, you are returned to the Magnifier app.
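As an illustration of how such feedback might be scaled, here is a simple linear mapping from distance to beep interval in Swift. This is my own sketch, not Apple's implementation; the function name, range, and timing values are all invented.

```swift
import Foundation

// Hypothetical feedback logic: the closer the person, the shorter the
// pause between beeps. All constants are invented for illustration;
// Apple has not published how People Detection scales its feedback.
func beepInterval(forDistanceInFeet distance: Double,
                  maxRange: Double = 16.0) -> TimeInterval {
    let clamped = min(max(distance, 1.0), maxRange)
    let fraction = clamped / maxRange   // near 0 when close, 1 at max range
    return 0.1 + fraction * 0.9         // 0.1 s (very close) to 1.0 s (far)
}

print(beepInterval(forDistanceInFeet: 6.0))   // about 0.44 seconds
print(beepInterval(forDistanceInFeet: 15.0))  // about 0.94 seconds
```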

You can change settings for People Detection. Double tap the Settings button in the lower left corner of the main Magnifier screen. The first section is Customize Controls. It is divided into Primary Controls (always visible) and Secondary Controls. Here you can move People Detection to another location in the rows of buttons on the lower part of the screen.

Near the bottom of the screen is a heading for Other Controls. If you double tap the People Detection button under this heading, you will have the following options:

  • Units: Choose meters or feet.
  • Sound pitch distance: Swipe up or down on the adjustable control to adjust the distance. When people are detected within this distance, the pitch of the sound feedback increases. The default is six feet.
  • Feedback: Turn on any combination of Sounds, Speech, and Haptics. If you turn on Speech, iPhone speaks the distance between you and another person.

When you have adjusted the People Detection settings to your liking, double tap the Back button in the upper left corner, then the Done button in the upper right corner.

Other Apps Using LiDAR

There are at least three other apps specifically made for blind users that feature LiDAR. Let's have a look at Seeing AI, Super LiDAR, and LiDAR Sense.

Seeing AI

Seeing AI is a free app from Microsoft, which describes it as an app that "narrates the world around you." It uses multiple channels to perform many different tasks, such as reading short text and documents, scanning barcodes, describing people and scenes, and detecting color and light; it even takes a stab at reading handwritten text. It is available in Czech, Danish, Dutch, English, Finnish, French, German, Greek, Hungarian, Italian, Japanese, Polish, Spanish, Swedish, and Turkish.

In December 2020, Seeing AI added a World channel. This channel is only visible on devices equipped with a LiDAR scanner. As you scan the camera slowly around a room, the app will detect things in your environment such as doors, windows, furniture, cups, bottles, books, and even people. It will speak the distance to an item when it first sees it, but it doesn't continuously update the distance as you move around in the way that People Detection does.

The World channel's main screen has the following controls: Menu, Proximity Sensor, and Quick Help buttons across the top, Spatial Summary and Place Beacon buttons near the bottom, and the adjustable Channel Selector across the bottom.

The Menu button brings up a list of general items for the app such as Help, Settings, Feedback, and About.

The Proximity Sensor causes the phone to vibrate as it detects items in your environment. The vibration becomes more intense as you get closer to the object.
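For the technically curious, a vibration that grows stronger as an object gets closer can be sketched with Apple's Core Haptics framework. The linear intensity scaling below is my own assumption, not Microsoft's; only the Core Haptics calls are real API.

```swift
import CoreHaptics

// Sketch: play one short vibration whose strength reflects proximity.
// Assumes the engine was already created with CHHapticEngine() and
// started with engine.start(). The scaling is an assumption.
func playProximityTap(on engine: CHHapticEngine,
                      distanceMeters: Float,
                      maxRange: Float = 5.0) throws {
    // Full intensity when touching, fading to zero at maximum range.
    let intensity = max(0, 1 - distanceMeters / maxRange)
    let event = CHHapticEvent(
        eventType: .hapticContinuous,
        parameters: [CHHapticEventParameter(parameterID: .hapticIntensity,
                                            value: intensity)],
        relativeTime: 0,
        duration: 0.2)
    let pattern = try CHHapticPattern(events: [event], parameters: [])
    try engine.makePlayer(with: pattern).start(atTime: 0)
}
```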

Quick Help provides a summary of the functions of the app and reminds us that this feature is experimental, so we must be cautious. At the bottom of the Quick Help screen is a Check Headphones button. Double tapping this causes the app to speak the words "left ear" then "right ear," letting you confirm that your headphones are set up properly.

Spatial Summary causes the app to speak the names of all the objects it has detected in the room. A special feature of this app is its ability to provide a spatial view using headphones. So, if you are wearing headphones when you request a spatial summary, each object's name will be spoken from its location in the room.
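This kind of spatialized speech rests on placing a sound at a 3D position relative to the listener. Below is a hedged sketch of that idea using Apple's AVAudioEnvironmentNode; the position is made up, and this is not Seeing AI's actual code, just the standard way a sound is positioned in space on iOS.

```swift
import AVFoundation

// Sketch: position a mono sound source in 3D space around the listener.
// Whatever is scheduled on the player (a tone, or a spoken object name)
// will appear to come from that direction over headphones.
let engine = AVAudioEngine()
let environment = AVAudioEnvironmentNode()
let player = AVAudioPlayerNode()

engine.attach(environment)
engine.attach(player)

// Mono sources can be spatialized; connect player -> environment -> mixer.
let monoFormat = AVAudioFormat(standardFormatWithSampleRate: 44_100,
                               channels: 1)
engine.connect(player, to: environment, format: monoFormat)
engine.connect(environment, to: engine.mainMixerNode,
               format: engine.mainMixerNode.outputFormat(forBus: 0))

// Place the sound one meter to the listener's right and two meters
// ahead (positions are in meters, with -z pointing forward).
player.position = AVAudio3DPoint(x: 1, y: 0, z: -2)

do {
    try engine.start()
    // player.scheduleBuffer(...) and player.play() would then sound
    // the object's name from that location.
} catch {
    print("Audio engine failed to start: \(error)")
}
```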

Place Beacon lets you create a beacon to guide you to a specific object. It brings up a screen that lets you choose a detected object from a picker item. When you have selected your object, swipe right to a Start button and double tap it. A tone will sound. Turn to face the tone and walk in that direction. If the tone moves to one side or the other, you must turn to keep it in front of you. A different tone will sound when you have reached the object.

I tried this. I walked into my office and slowly scanned the room. I heard the app say "bottle." I double tapped Place Beacon, selected bottle from the list, and double tapped Start. The tone sounded in my right ear. I turned to face it, followed the tone, and walked right to my water bottle. Curiously, it detects my clear water bottle when there is water in it but not when it is empty.

Super LiDAR

Super LiDAR is an app from Mediate, the developer of Super Sense. It uses LiDAR to detect objects and people and verbally indicates the distance to them, while also providing feedback with tones and vibration.

When you first launch the app, you will be greeted with a Welcome screen that contains the descriptive help text for the app. You are required to enter an email address before you can proceed. Once that is done, double tap the Get Started button and you will be presented with another block of text that describes the main screen of the app with an OK button at the bottom.

When you get into the app itself, it begins sounding a tone and vibrating. Both the tone and the vibration vary as you scan your environment, based on how close you are to anything in its view. The higher the pitch, the farther away the object the app is seeing; the vibration becomes more intense the closer you are to an object.
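A plausible version of that distance-to-pitch mapping is sketched below in Swift. The frequency range and function name are invented for illustration; the developer has not published the actual formula.

```swift
import Foundation

// Hypothetical mapping that mimics the behavior described above:
// a higher tone for farther objects. All constants are illustrative.
func toneFrequency(forDistanceMeters distance: Double,
                   maxRange: Double = 5.0) -> Double {
    let fraction = min(max(distance / maxRange, 0), 1)  // 0 near, 1 far
    return 220.0 + fraction * (880.0 - 220.0)           // 220 Hz to 880 Hz
}

print(toneFrequency(forDistanceMeters: 1.0))  // 352 Hz, a fairly low tone
print(toneFrequency(forDistanceMeters: 4.5))  // 814 Hz, near the top
```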

The app verbally identifies specific objects that it knows about such as window, door, ceiling, table, seat, and so forth. At the moment, Super LiDAR can detect about fifteen different objects.

As soon as the app sees any part of a person, it starts saying "a person." When it sees the person's face, it will tell you whether the person is wearing a mask. This app does keep updating the distance to an object or person as you move around, but it does so a bit erratically.

There are only two buttons on the screen. Open Menu is in the top left and Stop Detecting is in the bottom center. When you open the menu, you will find buttons to select the detection type. The options are Detect People, Detect Environment, or Detect All. Detect All is selected by default. There are also buttons for Help, Request a Call, Rate Super LiDAR, About, and Open Super Sense. The latter is Mediate’s companion app that can scan text, detect barcodes, identify currency, and much more.

If you stop detecting before you close this app, it will be off when you re-open it.

LiDAR Sense

LiDAR Sense is yet another app that can detect objects and people in your environment, but this one has very few features. Like the others, it sounds tones and vibrates.

When you first launch the app, you will get a brief description of how to use it. Double tapping the Continue button will bring up the camera permission screen, and then the app will be live.

The main screen has three buttons: Start/Stop in the center, with Settings and Information Menu below.

When you press Start, the app begins sounding tones and vibrating. It does not speak at all. It simply emits tones and vibrations of various strengths to indicate your proximity to nearby objects. The more intense the tone and vibration, the closer you are to an object.

Settings lets you disable vibration, disable sound, turn spatial audio on or off, and change the maximum range for vibration and sound. The default maximum range for vibration is 2.1 meters, and the default maximum range for sound is 5 meters. Spatial audio is only enabled when you are using headphones that support this feature. When I used this app with my AirPods, spatial audio was turned on automatically.

The Information Menu lets you replay the instructions, and it provides an email address for contacting the developer along with links to informational websites.

The Bottom Line

I have found that the LiDAR feature on the iPhone works better indoors than out. Sixteen feet is really not that far. In general, the apps are optimized for indoor use because that is where the objects they know can be found, although Seeing AI does recognize a car. Its Proximity Sensor also does a nice job finding our rubbish bins after the collection people have tossed them about a bit. Both Super LiDAR and Seeing AI recognize my front door as a door, if I remember to angle the phone up quite a bit because the door is at the top of my front steps.

While LiDAR is far from perfect, it does represent another step for blind people toward getting accurate information about our surroundings. We can then use that information for many different practical and useful purposes, from finding misplaced items to following a line of people at the bank, airport, or vaccination center.

This article is made possible in part by generous funding from the James H. and Alice Teubert Charitable Trust, Huntington, West Virginia.
