Full Issue: AccessWorld September 2017

More Home Appliance Accessibility and Home Automation Articles Coming to AccessWorld

Lee Huffman

Dear AccessWorld readers,

As our regular readers know, AccessWorld has covered home appliance accessibility in the past, and when we speak of technology, home appliances and home automation are becoming more and more a part of that conversation. Readers regularly comment on these articles in AccessWorld and ask for more on the topic. In direct response to those comments, later this year and next year, AccessWorld will be looking more frequently at home appliance accessibility and home automation from the perspectives of people who are blind and people who have low vision. We will cover features such as tactilely discernible controls, audible tones, font size and style of control labeling, color contrast, glare, and the positioning of controls. We will also report on app-controlled home environment and appliance options as well as emerging uses for the Amazon Echo and Google Home. AccessWorld staff hopes this will provide usable information for our readers and guidance when purchasing home appliances. So, stay tuned.

As I'm sure you have all noticed, the days are now growing noticeably shorter. Students have returned to school, and it's now a logical time to begin thinking about work and careers. October is National Disability Employment Awareness Month, and next month AccessWorld will recognize its observance by taking a closer look at employment resources for people with vision loss as well as by revisiting tried and true job search strategies. Of course, we will also be looking at technology to support and enhance your career and work life. As employment is such a critical issue for people with vision loss, the October issue will be an expanded issue with more articles and information than usual.

The AccessWorld team hopes you will read each article in this and every issue to gain as much access information as possible. As technology is always advancing, we encourage you to stay proactive in seeking out new access strategies that may better suit your situation.

Sincerely,

Lee Huffman

AccessWorld Editor-in-Chief

American Foundation for the Blind

Letters to the Editor

Dear AccessWorld Editor,

I am very excited about new technology. I have pre-ordered the Victor Reader Trek and use JAWS, OpenBook, and the KNFB Reader app. I sometimes worry about a large-scale cyber-attack that could render useless all the great technological advances I rely on every day. Have you received similar concerns from others?

Sincerely,

Gregory Hinote

Dear AccessWorld Editor,

It is unacceptable that in 2017, with all the attention given to the subject, accessibility remains an afterthought in the minds of many major corporations.

My name is Mike Horn, and I approved this message!

Respectfully,

Mike Horn

Dear AccessWorld Editor,

This letter is in response to Janet Ingber's August article, Obtaining Accessible Cable Television: A Frustrating Experience.

Janet Ingber wrote a great article in the August issue of AccessWorld. I left Verizon FiOS last June to get Apple TV, because Verizon had no accessibility service. I called Verizon on August 5th, and the customer rep said that they do not have an accessible cable box. In her article, Ms. Ingber said Verizon had one. My son lives in Maryland, and he has an accessible cable box from Comcast; he is not blind.

Respectfully,

Larry McMillan

Dear AccessWorld Editor,

Bill Holton did a good job on his overview of the Microsoft Seeing AI app.

In Short Text mode, try pointing it out a car window while moving. It is astounding that it reads signs along the road.

It can even read street signs and signs in front of the car, such as the name or type of a bridge. Living in Portland, Oregon, we have a lot of bridges, and some are drawbridges. It read "draw bridge" when we were waiting because of a bridge lift…

This is truly an amazing app, and more amazing because it is free, and likely to remain so.

Best,

Richard Turner

Dear AccessWorld Editor,

This message is in response to Bill Holton's August article, An Accessibility Check of the Rachio Smart Wi-Fi Sprinkler Controller.

I know Amazon sells the Rachio, because I got it during a promotion and it was much cheaper than the $199 list price.

Thanks,

Gwen Givens

Dear AccessWorld Editor,

Janet Ingber's August article on obtaining accessible cable programming sounded like déjà vu. I went through a lot of this with Spectrum. Since my rates were getting higher and higher, I actually did cut my cable and went the Apple TV route. I am currently using Netflix and Hulu. Netflix, of course, does have a good bit of accessible programming, but Hulu does not. I'm trying to work with them. There is Hulu Live, which gives you access to all of your local channels. I'm going to try to work my way up through the various layers and see if I can get to somebody I can actually deal with.

Best regards,

Gwen Givens

Dear AccessWorld Editor,

This message is in response to Bill Holton's August article, Microsoft Seeing AI: A Quick First Look at this Groundbreaking iOS App.

I totally agree with everything said so far. This is an app that sent me scurrying around my office taking pictures of everything! I haven't been this impressed with versatility for a long, long time.

Sincerely,

James Puchta

Dear AccessWorld Editor,

Could you talk about the braille Kindle in the next AccessWorld issue?

Thank you,

Paige Melhberger

Dear AccessWorld Editor,

This message is in response to Bill Holton's August article, Microsoft Seeing AI: A Quick First Look at this Groundbreaking iOS App.

I have had the Microsoft Seeing AI app for a week, and I love it! It can take a bit of searching to locate a bar code, but it then does its thing very well. It is now so easy to get cooking directions off a package. I have a terrible time picking shirts out of my closet so I am going to try using the facial recognition mode in labeling them. If it works, it will be fantastic! As far as accessibility goes, my PC falls way short of VoiceOver on my iPad, so I was surprised when I used this app the first time. Way to go Microsoft!

Sincerely,

Chris Hoffman

Dear AccessWorld Editor,

I recently downloaded the Microsoft Seeing AI app, and I found the app's controls very difficult to use. Changing between the different channels was extremely hard to figure out. I opened the app the other day and it just started automatically taking one picture after another until I closed it. I aimed it at my wife and it gave no description at all. I don't understand how to control the lights. I was able to read some short text. I even went to the website, and it described many of the features Mr. Holton lists in his article, but it did not give directions on changing functions there either. Tapping on the quick help still gave very little info.

It is a great tool, but the documentation was terrible from my standpoint. I am not an advanced user.

Sincerely,

Wayne Smith

Dear AccessWorld Editor,

This message is in response to Janet Ingber's August article on obtaining accessible cable programming.

My husband and I had the same experience when we tried to get an accessible cable box through Spectrum. We did not keep the laptop or the Roku.

The Spectrum app for iOS is accessible, and it is possible to access audio description for programs that have it. I have also tried the Android version, but I have found no way to access audio description using it.

Thanks,

Kathy Blackburn

Dear AccessWorld Editor,

At times, I enjoy perusing the back issues of AccessWorld. This letter is in response to J.J. Meddaugh's December 2013 article, Audio Description in Theaters: Making Theaters More Accessible.

I am thrilled by this option at the theater. I did try a different theater yesterday, and the over-the-ear headphones played only the audio description. It was very awkward to keep one side on and the other side off in order to hear dialogue and sound effects. My local theater has the pager-type device that allows me to bring my own headphones and only put one in. That seems to work much better. I wonder when the iOS apps will work so we will not have to monkey with devices that the theater loans us. Thanks for the article!

Sincerely,

David Life

Dear AccessWorld Editor,

This message is in response to Jamie Pauls' January 2016 article, Making the Transition from English Braille to UEB.

This is the best article on the subject I have read. It is definitely worthy of an award.

Thank you for publishing.

Sincerely,

Diane Scalzi

AccessWorld News

Global Blind Population Set to "Triple by 2050": Study

"Researchers in a study have warned that the global blind population will triple by 2050 due to a growing population of aging individuals." Read more about this study.

American Printing House for the Blind (APH) and HumanWare Announce the MATT Connect, a Combined Video Magnifier, Distance Viewer, and Accessible Educational Tablet

The American Printing House for the Blind (APH) and HumanWare have announced the MATT Connect, a combined video magnifier, distance viewer, and accessible Android tablet with included software. The device consists of the magnifier/tablet, a folding stand, and a wireless camera for distance viewing. The video magnifier provides up to 24X magnification in live view; magnification levels up to 80X are available for captured images. OCR is also possible. The wireless camera for distance viewing can provide magnification up to a maximum of 40X. The device retails from APH for $2,995 and, if a customer purchases the MATT Connect from APH before September 29, the purchase will include an extended care plan from HumanWare at no additional charge.

NV Access Releases NonVisual Desktop Access (NVDA) 2017.3

NV Access, the developers of the free and open source NonVisual Desktop Access (NVDA) screen reader, have announced the release of NVDA 2017.3. This update includes some significant new features, including the ability to use Microsoft's OneCore Text-To-Speech voices, the ability to write in contracted braille, and the integration of Windows Optical Character Recognition (OCR) for those using the Windows 10 operating system.

Freedom Scientific Releases New Focus 40 Blue Braille Display

Freedom Scientific has announced the release of a new generation of their Focus 40 Blue braille display. The new device has been engineered to be resistant to falls and impacts. The exterior casing, keyboard, braille cells, and ports have all been engineered to contribute to the overall ruggedness of the device. The new display also uses a USB 3.1 Type-C connector; because this connector is reversible, it is always in the proper orientation when plugged into the device. Freedom Scientific has provided a press release describing the features of the Focus 40 Blue.

HumanWare Announces the Release of the explorē 8 Handheld Electronic Magnifier

HumanWare has announced the release of the explorē 8 handheld video magnifier. This magnifier contains an 8-inch display with magnification up to 30X. The magnifier has two cameras, one for close viewing and a second for distance viewing. The magnifier is equipped with a touchscreen as well as physical buttons for changing contrast and magnification. The explorē 8 can be purchased from HumanWare for $1,099.

AFB Press Announces Publication of Orientation and Mobility Techniques: A Guide for the Practitioner, Second Edition by Diane L. Fazzi, PhD and Janet M. Barlow, MEd

The very first techniques book in orientation and mobility has been completely revised and updated for today's fast-changing world, while remaining true to Hill and Ponder's simple organizational principles that generations have known and loved. A new, easy-to-read color format, accompanying photographs, updated information on street crossings at complex intersections, and a new chapter on O&M for people with low vision make this revised edition a must-have in your O&M library.

Orientation and Mobility Techniques, Second Edition is 290 pages long and is now available in paperback for $54.95. Electronic formats are coming soon: e-book editions priced at $38.95 and an online subscription priced at $32.95.

Book Review: Across Two Novembers: A Year in the Life of a Blind Bibliophile, by David Faucheux

Browse any virtual or brick-and-mortar bookstore or library, and you will undoubtedly find dozens of inspiring tales of blind people who have excelled. Extend that search to a collection whose target audience is blind people and that number will be in the hundreds.

I have binged on those books at least once a decade since the late 1970s: memoirs and biographies highlighting the real or metaphorical scaling of heights that has, ultimately, positioned blind people at the pinnacle of achievement in chosen fields of music, science, law, drama, sports, adventure, and more.

David Faucheux wants us to know (indeed, methinks he doth protest too much on the subject) that he is not that kind of blind person. He is not a "super achiever" or super hero. He is not wealthy or well connected. And yet, as one who has overindulged on blindness memoirs more than once in the past few decades, Across Two Novembers struck me as the most genuine depiction of life from the blind side that I have read.

The book is a journal, spanning 13 months from November 2013 through November 2014. And, in keeping with the title, it is clear throughout these pages that the author is a blind man and that he loves books.

A Look Into A Life

Because it is a journal, we walk with David Faucheux day after day as we read this book. His life is small by some standards: he lives in an apartment in Lafayette, Louisiana, and does not have a job or school expecting him to attend every day. He has the occasional meal with a friend, goes to church and medical appointments, gets groceries or a takeout meal, and shares what wealth he has, books and food, with others in a way that is both commendable and touching. Because he has Fibromyalgia Syndrome (FMS) as well as a possible sleep disorder, his schedule is sometimes truncated and his energy easily depleted. These elements are woven into his daily entries, as are the more mundane minutiae of any real person's day: the weather, household tasks, what was for dinner, and private fears. He has two college degrees and once hoped to have a career in the world of libraries, but those plans didn't work out.

There are quirks in his routines that might prompt some head scratching. He marinates laundry in the refrigerator and/or bathtub, for example, and has a remarkable reserve of information regarding supplements and healthy food. He irons his clothes and is creative in the kitchen. The pieces of each day that comprise life for David Faucheux are, in other words, simple and unremarkable. And yet, his recounting of these parts is far from prosaic.

We wait with him (and feel his anxiety) when the paratransit is late or the street too wide for comfort. We travel with him and understand his choices to spend some quiet time alone one day and venture into the unknown the next.

The single common thread that runs brilliantly throughout the book is the author's love of books and of knowledge. Just when it seems you have hit a mundane plain in Across Two Novembers, Faucheux sprinkles in another delicious quote from an author, summary of a book that you want immediately to add to your own reading list, or a fascinating factoid from history.

Sometimes, the bonus he slides in so casually is information about body or brain health or the highlights of a famous person who has died that day or even, sometimes, instructions for preparing something delicious in the kitchen.

All About that Tech

Readers who are anywhere on the blindness and low vision continuum will find some common ground in Faucheux's relationship with technology. From beginning to end of this book, the integral part played by technology in the life of any competent blind person is readily apparent. As is the case with everything Faucheux does, the references to access technology are casually woven into other bits of his day. The death of a movie star or politician, the anxiety over catching a paratransit ride, and the sheer helplessness he experiences when a computer stops speaking—all of these might appear on the same page. This, of course, is exactly as it should be, as this is how a real life unfolds.

We learn about his 20-year-old trusted braille notetaker and his special Dvorak keyboard (purchased in the hope that typing on it would increase speed and reduce fatigue). He marvels more than once at the miracle of having so many books to read (thousands on his computer, he tells us), after having grown up in the 1970s when braille books were, by today's standards, precious and rare.

With the same candor that he writes of the frustration of looking for hours for a plastic container, he recounts his disappointment as it becomes increasingly clear that a software package he needs for a job he has worked hard to pursue just might not be within the grasp of a blind computer user.

We learn about online sources of entertainment and education, some specifically for blind people, and we learn how central email communication is to this man's quiet life. We learn about captchas and programs to interpret them. We learn about computer companies and publications. And, of course, AccessWorld is mentioned more than once.

Since books are so intrinsic to his relationship with the world, we learn about the many sources and formats for accessing reading materials for people who are blind or have low vision. From early records to cassettes to digital books and downloadable material, he clearly explains for the uninitiated and stirs memories for those who have sung in the same choir. On organizations relevant to every aspect of blindness, from access technology to civil rights to fun and games online or on chat lines, he provides abundant information without ever being tedious.

Just when an entry is bordering on trite, Faucheux's sense of timing comes into play and he throws in another bit of history or trivia or a book review.

I would be negligent not to mention Faucheux's vivid descriptions of all things food-related. He quips that perhaps he should get an MFA in Gastronomy, but I, for one, would encourage him to think seriously about such a pursuit. Whether describing food he has prepared, a dish in a restaurant, or just some musings about combining particular ingredients, his words about food will make you hungry!

Resources

Across Two Novembers is not the kind of book many of us will read in one sitting. You might read a month of journal entries at a time, for example, put it down, reflect, and come back several days later for more.

The resources the author has compiled are outstanding. In addition to a detailed bibliography of books referenced, he also includes a wealth of blindness resources, sources for book lovers, and websites related to his various interests.

David Faucheux has not broken any world records or even yet fulfilled his dream of appearing on Jeopardy! He has, however, done a wonderful job of capturing the essence of an ordinary blind person's daily life, where no tool and no emotion takes center stage, but all of it together makes a whole. If you want to introduce a sighted person to blindness, this book would be an excellent place to start.

Book Information

Across Two Novembers: A Year in the Life of a Blind Bibliophile, by David Faucheux

Available at: Amazon (Kindle and epub), $4.99; and at Bookshare.

Comment on this article.

SEE3D: Teenagers Expand the Visual World for People Who Are Blind

As an adult, one of my favorite shopping venues has long been toy stores, or, to be more precise, the section in toy stores where the plush and/or plastic animals and characters reside. Like most blind people, I see with my hands. While using the tactile to translate the visual delivers fabulous images to the brain, there are definite limitations on the range of "sight" when touch is required.

I can't touch a rat, a fox, or a crocodile. Nor would I want to touch them. But three-dimensional replicas fill the void.

SEE3D at the Tech Olympics

Caroline Karbowski, a bright and talented senior at Summit Country Day School in Cincinnati, Ohio, did not yet have any blind friends when she started thinking about the power of 3D printing to deliver visual images into the hands of people who could not see in the conventional way. As a high school junior, she attended a college open house at Xavier University and happened to meet Cassandra Jones, a disability services professional who is blind.

She asked Cassandra the probing question, "What images would you like to see?"

The answer, more delightful than profound, was "Mickey Mouse and a Disney castle."

Caroline found a 3D printer and an online image and the amazing palm-sized model was soon in Cassandra's hands.

People suggested to Caroline that she meet Haley Thurston, the daughter of the Spanish teacher at Caroline's school. The same age as Caroline, Haley was immediately enthusiastic. She was also ready with ideas of images she would like to see.

Haley, it turned out, longed to see a map of the world, various geometric shapes, and insects.

Caroline, with her 3D printing, was off and running.

Speaking of running, ideas seem to spark in Caroline's imagination almost faster than she can catch them, but one idea she caught and tackled was to find some collaborators with more tech experience than her own. She wanted to build a project worthy of competition in the Tech Olympics.

She is not a techie herself, she says, just a person with an idea of how to help blind people see the world. Her tech teacher at Summit and the school Tech Club jumped on board. A website was launched, a plan developed, and a project called See3D won second place in Cincinnati's Tech Olympics. That was February 2017—history, you might say, but Caroline and See3D have just begun.

Consulting the Experts

Caroline Karbowski has no shortage of intellect or creativity. She told me that she learned the braille alphabet when she had an hour of boredom to fill as a sixth grader, accomplishing the task with a pencil point and encyclopedic image of the six-dot code. She is wise enough, in other words, to realize that to make See3D a truly successful venture, she needs to gather information from experts. To that end, she has traveled with her parents to sources in Ohio, Indiana, Chicago, and New York, and is still gathering information. She has shared information with teachers of the visually impaired, and the information is clearly flowing in both directions.

In Indiana, she was thrilled to see the project of a teacher there who is assembling, piece by piece, a 3D replica of the Indiana School for the Blind and Visually Impaired.

In Chicago, she delighted teacher and students alike with her models of minions!

She gave them minions and butterflies and they gave her some clear plastic labeling sheets, so she could begin making braille labels for all her images.

While Caroline and the half dozen students working with her have created plenty of images for fun, she sees the greatest future for the project in making 3D images with a purpose. In particular, she is focusing on STEM (science, technology, engineering, and mathematics) images. Touring thingiverse.com, a website dedicated to sharing images created for 3D printing, she has downloaded and produced 3D representations of DNA, molecules, myriad shapes, topographical maps, a chameleon, a cell, and more. Besides visiting schools for the blind and talking to professors and teachers involved with the education of blind children, she is contacting blind people one by one as well.

Growing See3D

While Caroline maintains that using 3D printers is dramatically less expensive than purchasing science-related kits designed for the blind, producing images still costs money. Her school has a half dozen 3D printers and she has enlisted collaboration from students at other area high schools, but after the success at the Tech Olympics, she launched a GoFundMe page to help purchase supplies. So far, she has used the money to buy filament and plastic labeling paper. Her hope is to purchase a Perkins Brailler, which her friend, Haley, has been teaching her to use.

See3D received a grant of $250 (the Jane Goodall Roots & Shoots grant), and a 3D printing package through the GE Additive Education Program. The latter includes two Polar Cloud-enabled polymer printers, one Polar 3D printer and one XYZprinting printer. It also includes Polar 3D's STEAMtrax curriculum with a two year license, six rolls of filament for each printer and one of the STEAMtrax module kits, "Tinkering with Turbines."

Producing the Models

At this point, Caroline and her collaborators are not creating new images, but searching for existing images that will work well for blind people. Her primary source is the Thingiverse website, where she invites schools and individuals to browse and request images of interest found there. The filament used to create the images comes in rolls that resemble dinner plates, and is available in a variety of colors and textures. The filament is threaded, something like spaghetti, into the machine, which, as Caroline describes it, functions more like a hot glue gun than a printer.

At present, See3D students are not particular about the colors used for a given model. Rather, to be economical, they are inclined to simply use a color until the roll of filament is used up.

The goal of See3D is to build a collaborative network of high school students who can produce the 3D models for teachers of the visually impaired and blind individuals themselves who request them.

The Bottom Line

Caroline had been given my name by other blind people in Cincinnati and contacted me to ask if I'd like to receive any of the 3D models. After our first conversation, she sent me a Cinderella castle and a butterfly.

The professional nature of the packaging surpassed that of some for-profit companies shipping products to blind and low vision individuals.

On the outside of the package were braille labels, so that I immediately knew it was the package sent by Caroline of See3D. Inside, the models were protectively wrapped. There was a braille letter (Caroline's grade 2 braille was not perfect, but absolutely clear) along with business cards that bore both print and braille contact information.

The models themselves are delightful. There is something mesmerizing about the butterfly in particular. Each time I pick it up, I find running my fingers over its wings and antennae somewhat irresistible.

If you are blind or have significant low vision, reflect for me on a few questions: Can you confidently describe a Disney castle? A butterfly? Shrek? Or a minion? How about the inner layers of the earth? Or a particular constellation of stars? Do you have a mental image of the face of Abraham Lincoln or Barack Obama?

The sense of touch (or, more precisely in this context, touch translating for sight) can deliver powerful visual images to the brain, but there are countless images all around us that are well beyond the typical three-foot reach of a human being's arm.

If you would like to request a 3D model for yourself or your students, send your request via email.

Caroline is about to begin her senior year, and she is involved in academics, theater, music, and golf, not to mention all of the rigorous planning involved in choosing and getting ready for college, but she is passionate about this project and is recruiting more collaborators to help.

As the project grows, you can read more at the SEE3D website.

Comment on this article.

Audio Description Comes to Amazon Prime

In the July 2015 issue of AccessWorld, we took a look at the audio description feature that had just recently been added to many shows in the Netflix lineup. Audio description is an additional audio track that describes visual and unspoken aspects of a movie or TV show. Audio description is provided primarily so that people with visual impairments can gain a better understanding of what is going on onscreen.

When it first launched, some subscribers had difficulty getting the audio description feature to work on their various devices, but Netflix quickly worked to solve all existing problems. When that July 2015 AccessWorld article was written, there were 87 shows on Netflix containing audio description.

One year later, we took a look at how far Netflix had come with implementation of audio description on their network. The number of programs containing an audio description track had jumped from 87 to over 150, and accessibility to the service had improved dramatically across all devices used by the blind community. Today, according to the list available from the American Council of the Blind's Audio Description Project, there are currently around 445 audio-described programs on Netflix.

Shortly after Netflix began offering audio-described content, Apple started offering movies with audio description in the iTunes Store. Along with the ability to filter search results in order to only see content with audio descriptions, iTunes makes it quite easy to determine whether a movie includes an audio description track by simply looking at the details of the movie provided in the Store, just like you would to see if a show had closed captioning.

With Netflix and Apple both providing audio-described content with their movies, the blind community began to ask other providers when they planned to do likewise. The most recent content provider to step up to the plate has been Amazon.

On June 9 of this year, the American Council of the Blind and Amazon announced that Amazon Prime was offering 117 movies and 10 TV series with audio description. The ACB's Audio Description Project (ADP) now links to a page on Amazon that shows all content with audio description. As of this writing, there are 133 titles available. Many are free with your Prime membership, while others must be rented or purchased. Titles are sorted by heading for easy navigation with a screen reader, and links are provided to watch programs, rent or purchase them, or add them to a watch list for later viewing. The ADP also offers its own page with an alphabetized listing of Amazon programs with audio description.

How to Access Audio-Described Content On Amazon Prime

For this article, I tested one documentary, 13 Hours: The Secret Soldiers of Benghazi, available to watch for free with my Amazon Prime subscription. I tested Amazon Prime's audio description feature using a Windows 10 computer running JAWS version 18 and the latest version of the Firefox browser, a Mac running Safari, and an iPhone 6 running the latest version of iOS 10. I did not test the feature using an Amazon tablet or an Android device, and audio description is not yet available on the Apple TV, according to the information found on ADP's Amazon page.

Accessing Amazon Prime's Audio-Described Content On a Windows 10 PC

After pressing Enter on the link found in the title of my documentary, I was easily able to find controls to resume watching the program (I had been watching it earlier) or start from the beginning. Try as I might, after I began playing the program, I was unable to get to the screen I needed to enable audio description. ADP's Amazon page states that a sighted person must enable this feature once, but that it will stay enabled thereafter. I did not ask my sighted wife to enable the feature for me. I will wait until I can turn on audio description independently before I watch audio-described Amazon content on my PC.

Accessing Amazon Prime's Audio-Described Content On a Mac

Unfortunately, I had no success accessing audio description on my Mac, either. After I pressed Enter on the title of the documentary, I repeatedly got stuck in a dialog box that kept popping up wanting me to give the program a star rating. I could not seem to get past this dialog in order to start playing the program.

Accessing Amazon Prime's Audio-Described Content On iOS 10

Using the free Amazon Prime Video app on my iPhone, I did a search for the title of my documentary. After opening the details page of the program, I was again given the chance to resume watching the program or start from the beginning. Unfortunately, as was the case with the two other devices I tested, I was unable to determine whether the program was audio-described from reading any of the detailed information available on this page. Because I had enabled audio description on my iPhone previously, the show, complete with audio description track, began playing as soon as I activated the control to begin watching the program from the beginning. When the program began, my phone went into landscape mode. Since I did not quickly begin examining the screen, the video controls disappeared. I had to double-tap the "video" label to get them back. I was then able to swipe right to the "audio and subtitles" option, which stopped playback of the program. "English [Audio Description]" was already selected for me, but I could easily have selected this option had it not already been enabled. Closing this menu or making a selection resumes playback of the program.

I found the process of enabling audio description and watching content on my iPhone to be quite straightforward. This is how I will watch Amazon Prime's audio-described content for the foreseeable future.

The Bottom Line

Two years ago, there were no mainstream content providers of television programs and movies that offered audio-described content for people with visual impairments through their online streaming services. Today, Netflix, iTunes, and now Amazon Prime offer this feature. Currently, iTunes is the only service that makes it easy to determine whether or not an audio description track is available in a program simply by looking at the details page of the movie or TV show in question. All of the services, however, do provide access to their audio-described content, either by allowing search results to be filtered to show only described programs or by providing a list of that content somewhere on their site.

Although I was unable to access audio description on my PC or Mac, I was able to obtain the content on my iPhone. I would certainly like to see better accessibility across all of my devices, but I consider this a really good start.

If Amazon continues to add audio-described content to their list of programs on a regular basis, and improve accessibility across all devices, blind people will have yet another excellent source of television programs and movies with audio description tracks readily available and easily accessible.

We can only hope that services like Hulu will feel increasing pressure to add audio description to their program lineups. People with visual impairments should be encouraged, as much as they can, to subscribe to services that offer audio-described content and to encourage others to do so. Also, positive comments on social media sites such as Facebook and Twitter will encourage companies like Amazon to continue offering the content that many in the visual impairment community have been requesting for so long.

Product Information

Amazon Prime costs $99 per year. You can also pay $10.99 per month, or subscribe to the video-only plan for $8.99 per month. Many TV shows and movies are available for free with your membership; other programs can be rented or purchased.

The Amazon Prime Video app for iOS is free, and requires that you sign into your Amazon Prime account.

Audio description must be enabled initially when any new device is used to play Amazon Prime's video content, but the setting is remembered on each device the next time you play content.

Comment on this article.

International Association of Accessibility Professionals: Promoting and Improving Digital Accessibility Across the Globe

It's becoming increasingly difficult to imagine a world without some degree of access to digital content. Digital content is present in virtually every aspect of people's lives these days, including education, employment, news, entertainment, health, banking, social media, and the purchasing of essential goods and services. For people with disabilities, accessible digital content can mean the difference between full, equal access and the inability to access even the most basic of services and resources.

Accessibility and the Law

Webster's Dictionary defines "accessible" as: "easy to approach, reach, enter, speak with, or use." This definition sums up what people with disabilities hope to experience with digital content. There is certainly a compelling argument to be made for why any organization should strive toward greater digital content accessibility. Some of the benefits of accessibility include better search engine optimization (SEO), improved usability and accessibility for everyone, and an expansion of an organization's target audience. The most common driving factors that motivate organizations to achieve greater accessibility are compliance and risk mitigation, which reduce the potential for costly settlements, lawsuits, and negative public relations that can have an unwanted impact on an organization's bottom line.

The legal repercussions of inaccessible digital content are taken very seriously by many organizations. As technology has changed, so too have the laws related to the accessibility of digital content, also known as information and communication technology (ICT) accessibility. On January 18, 2017, the United States Access Board published a final rule updating the requirements for information and communication technology covered by Section 508 of the Rehabilitation Act and Section 255 of the Communications Act, often referred to as "508 Refresh" or "ICT Refresh." As a result of this update, American ICT accessibility standards are more consistent with the standards of many countries across the globe that adhere to the Web Content Accessibility Guidelines (WCAG) 2.0 of the World Wide Web Consortium (W3C).

On July 26 of this year, the Americans with Disabilities Act (ADA) celebrated its 27th anniversary. Although far from perfect, the ADA represents one of America's most comprehensive pieces of civil rights legislation. It prohibits discrimination and guarantees that people with disabilities have the same opportunities as everyone else to participate in the mainstream of American life—to enjoy employment opportunities, to purchase goods and services, and to participate in state and local government programs and services. The ADA is considered an "equal opportunity" law for people with disabilities.

The ADA was signed into law six months prior to the creation of the World Wide Web. The impact that technology has had on people's lives in the years since the ADA could not have been forecast. Because of the ubiquity and importance of ICT in virtually every aspect of our lives today, over the last several years the Department of Justice has taken the position that ICT accessibility represents a critical civil rights issue for people with disabilities, and therefore falls under the ADA.

Although laws and standards such as Section 508 Refresh, WCAG 2.0 and the ADA, among others, are of paramount importance to ICT accessibility, they do not in and of themselves provide the tools, resources, professional development, sharing, and networking needed by individuals and organizations to make widespread ICT accessibility a reality.

International Association of Accessibility Professionals

The International Association of Accessibility Professionals (IAAP) is the only organization of its kind in the world. A non-profit, membership-based division of the United Nations' Global Initiative for Inclusive Information and Communication Technologies (G3ict), the IAAP has as its mission "to define, promote and improve the accessibility profession globally through networking, education and certification in order to enable the creation of accessible products, content and services." For more information, visit the About IAAP page.

Overview of IAAP Benefits

IAAP provides a number of offerings through its membership and services, including professional and organizational development tools and resources, certifications, and networking for individuals and organizations.

IAAP Webinars

IAAP hosts several webinars throughout the calendar year that feature industry leaders from across the globe. Topics for presentations include current trends, key technical issues related to achieving ICT accessibility and how to address them, and best practices. Live webinars allow attendees to ask questions and make comments, and are also archived for future retrieval. Complimentary access to the live and archived webinars is available to IAAP members based upon member level.

IAAP Certification

In addition to membership benefits, education, and resources, the IAAP currently offers two internationally recognized certifications, with additional certifications expected in the future that will include Procurement Specialist, Document Content Specialist, and Mobile Accessibility. The technical and academic aspects of the IAAP certification program, along with the logistics, are managed in partnership with AMAC Accessibility at the Georgia Institute of Technology.

CPACC Certification

The IAAP Certified Professional in Accessibility Core Competencies (CPACC) credential requires demonstration of knowledge and understanding of disabilities, accessible and universal design, and accessibility-related laws, standards, and management strategies.

The CPACC is an ideal certification for those who manage and support accessibility and require high-level conceptual knowledge and general understanding of accessibility. IAAP offers an outline of resources and information needed to prepare for the CPACC certification exam.

WAS Certification

The IAAP Web Accessibility Specialist (WAS) certification is intended for accessibility professionals who are required to evaluate the accessibility of existing content or objects in accordance with published technical standards and guidelines such as WCAG 2.0, and who provide detailed remediation recommendations. These professionals are expected to know and use the relevant technologies, not merely be aware of them. IAAP also provides an outline of resources and information needed to prepare for the WAS certification exam.

IAAP Connections

Connections is a member-only collaboration tool, facilitating member-to-member support on a number of topics, including resource sharing, technical questions, and guidance on best practices. IAAP Connections also contains an integrated member directory for one-on-one connections, as well as the establishment of sub-communities to foster discussions on specific areas of interest.

Nordic and UK Chapters

The IAAP serves as a center for accessibility professionals from across the globe, allowing professionals and those learning about accessibility to network, share, and develop their skills. With members from over 38 countries, IAAP currently has approximately 1,200 individual members and 67 organizational members committed to accessibility. In June of 2017, the Nordic chapter was officially formed. This chapter is led by Funka, a leading accessibility company based in Sweden. The Nordic chapter represents Denmark, Finland, Norway, and Sweden. In July of 2017, the United Kingdom chapter was formed. It is being led by AbilityNet, a non-profit out of the UK that focuses on digital technology for people with disabilities. The objective of these and other emerging chapters is to promote the accessibility profession in their respective geographical regions by conducting awareness-raising events, promoting IAAP membership services including the IAAP training and certification programs, and providing input to IAAP on the needs of each region.

The Bottom Line

With society's ever-increasing dependency on digital content, leveling the playing field so that everyone has full and equal access has never been more urgent. Achieving a higher level of ICT accessibility, which can mean anything from more accessible PDF forms to websites, mobile apps, and desktop applications, also adds tremendous value to organizations' bottom lines. Many milestones have been reached when it comes to accessibility, but there is still a lot of work to be done. IAAP has quickly become a global leader for individuals and organizations pursuing a greater understanding and awareness of ICT accessibility and its implementation.

Comment on this article.

Vision Tech: Earlier Diagnosis and Intervention Can Help Stanch Increasing Rates of Blindness

According to a recent study led by Anglia Ruskin University in the UK, the global blind population is on course to triple by the year 2050. In percentage terms the numbers are improving: global blindness declined from 0.75 percent of the population in 1990 to 0.48 percent in 2015, while the rate of moderate to severe vision impairment fell from 3.83 percent to 2.90 percent. Unfortunately, the raw numbers tell a much different story. Researchers predict an increase in the number of blind individuals around the world from 36 million to 115 million by 2050, primarily due to growing senior populations worldwide. Since the majority of vision impairments are in some way age-related, as the population "grays," the raw number of people affected will similarly continue to increase globally.

Early detection and intervention are among our most effective tools to prevent blindness, particularly in the developing world where medical services and qualified professionals are limited. This article introduces a pair of recent developments: the first, EyeLogic, is a new way of testing for the early signs of diabetic retinopathy. The second is a new eyedrop that may soon take the place of injected VEGF inhibitors to treat advanced macular degeneration. Both of these breakthroughs have the added advantage of not requiring highly trained ophthalmologists to administer, which means they may be deployed in areas where these professionals are at a premium.

An EyeLogic Approach to Retinopathy

With new and improved treatments for river blindness and more available cataract surgery, diabetic retinopathy is quickly becoming one of the leading causes of preventable blindness globally. There are nearly 30 million individuals with diabetes in the US alone, and another 86 million who are prediabetic. Globally the World Health Organization estimates the number of diabetics at over 420 million, or one in every eleven adults.

At least 40 percent of people with diabetes will develop diabetic retinopathy at some point in their life—their chances increase with each passing year they have the disease. Unfortunately, here in the US only half of diagnosed diabetics will consult an eye doctor before experiencing permanent vision loss. In the developing world the numbers are even worse, due to a lack of medical resources and trained diagnosticians.

Early diagnosis can go a long way toward staving off the effects of diabetic eye disease with medication, surgery, and, in the earliest stages, diet and lifestyle changes. The exam is straightforward. The individual's retina is viewed through a slit lamp, and the examiner checks for abnormalities in the delicate capillaries that supply the tissue with blood. Certain cholesterol formations known as hard exudates may also signal the presence of the disease process, as can, in later stages, retinal hemorrhages.

At some institutions, such as the VA, diabetes patients have their retinas routinely photographed, and these images are forwarded to remote specialists for screening. The images are also banked for training and research. Recently, a team of researchers led by Theodore Leng, MD, Assistant Professor of Ophthalmology at the Stanford University Medical Center, used this extensive database to develop EyeLogic: a new and less resource-intensive way to screen for and diagnose this vision-stealing disease.

You may already be familiar with the way Google and others have improved language translation by developing algorithms and then feeding them massive amounts of data to sort, match, and perform the "deep learning" which makes it possible for you to obtain nearly instant translations of webpages, text and even speech. To put it very simply, Dr. Leng and his team have done the same with retinal images. "After creating our algorithms we fed the software over 80,000 retinal images, ranging from healthy to severely damaged," Leng explains. "The images were marked with the appropriate level of disease, if any was present, and we then employed a method of computer learning using a convolutional neural network until the software could detect the difference."

The next step was to test the software using a different set of retinal images from a different collection. "This time we didn't include the diagnoses. We allowed the trained machine to perform the screening," says Leng.

The results? "The algorithm had a 97 percent success rate, which matched or exceeded the average rate of disease detection by a retinal specialist," says Leng. "The false negatives, which would encourage individuals to not seek treatment, were also equally minimal."

Google has also performed some "machine learning" on retinal imaging, but, Leng explains, "in addition to identifying diseased eyes, our algorithm marked the pathologic regions on the images to aid in identifying the areas of concern. They also indicate a severity level from zero to four." Level zero indicates no presence of retinopathy; levels one through four indicate mild, moderate, severe, or proliferative retinopathy, respectively.

Since the images were taken using differing resolutions, light levels, and pixel sizes, considerable pre-processing is required to standardize the images for screening. Nonetheless, Leng and his team have been able to load the entire EyeLogic package onto a desktop computer, and even an iPhone, using a $300 attachment that can image retinae. The plan is to have the database stored on secure and HIPAA-compliant EyeLogic servers. Once an image is provided, the software takes only three or four seconds to process it and highlight any possible pathologies.
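For readers curious about what such a screening pipeline looks like under the hood, here is a minimal sketch, written in Python with the PyTorch and torchvision libraries, of a five-level image grader of the general kind described above. It is an illustration only; the network, image size, and function names are assumptions made for the example, not details of the EyeLogic software.

```python
# Illustrative sketch only (not the EyeLogic code): a minimal five-class
# retinal-image grader of the general kind described in the article.
import torch
import torch.nn as nn
from torchvision import models, transforms

NUM_CLASSES = 5  # 0 = no retinopathy; 1-4 = mild, moderate, severe, proliferative

# Pre-processing: fundus photographs arrive at differing resolutions and
# exposures, so each one is standardized to a single size and normalized.
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# A stock convolutional backbone with its final layer replaced by a
# five-way severity classifier. A real system would train this network
# on tens of thousands of expert-labeled images, as the article describes.
model = models.resnet18()
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

def grade_image(pil_image):
    """Return a predicted severity level (0-4) for one retinal photograph."""
    model.eval()
    with torch.no_grad():
        batch = preprocess(pil_image).unsqueeze(0)  # shape: (1, 3, 224, 224)
        logits = model(batch)
        return int(logits.argmax(dim=1).item())
```

In practice, the difficult work lies in gathering the labeled training data and in highlighting the pathologic regions on each image, neither of which this toy example attempts.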

Leng and his team hope to have a pilot program up and running soon. "We'll start with systems where they are already forwarding their images to specialists for screening," he explains. "We'll screen a duplicate copy, judge how well the system works, and add even more images to our database."

According to Leng, FDA approval may be years off. "The agency has not yet approved any algorithm-based diagnostic tool for eye care, so we will need to provide considerable study data," he says.

The team hopes to soon begin rolling out screening systems in developing countries where there is a lack of facilities and vision experts. "Eventually we would like to see EyeLogic in the offices of every physician who treats diabetes," he says. "If the doctor can not only tell you that you have the early stages of retinopathy, but actually show you the affected area on the scan, we feel a lot more people will seek the early treatment which is critical."

After getting the EyeLogic system up and running, Leng says, the technology also holds promise for early detection of age-related macular degeneration. In the meantime…

Eyedrops to Treat AMD

Age-related macular degeneration (AMD) is the leading cause of blindness in people age 65 and older. Currently, an estimated 13 million people in the US suffer from AMD, and the problem is expected to grow as the population ages. One large study found that people in middle age have about a 2 percent risk of developing AMD. By age 75 this risk increases to nearly 30 percent.

A new class of medications, known as vascular endothelial growth factor (VEGF) inhibitors, can slow or sometimes even halt the progression of the disease. VEGF is the protein that promotes blood vessel growth, also called angiogenesis. Angiogenesis is necessary for body growth and wound healing, but when it misfires in overstressed retinal cells it can lead to excessive vessel formation in the macula, causing the permanent damage known as AMD. VEGF inhibitors work by suppressing angiogenesis. Unfortunately, VEGF inhibitors are large molecules, "Much too large for an eyedrop to penetrate the cornea and other outer structures of the eye," says biochemist Felicity de Cogan, Ph.D., Research Fellow at the Institute of Microbiology and Infection, University of Birmingham, and leader of their Ocular Drug Delivery Team. "Between the tear film and the cornea and the vitreous, basically, the eye is designed to keep things out."

The active compounds in glaucoma eyedrops tend to be small molecules that are able to penetrate the eye. Currently, however, VEGF inhibitors can only be effective if they are injected directly into the vitreous, the fluid that fills the eyeball and keeps it inflated, where they can make contact with the damaged retina. Patients often require between 8 and 12 injections in each treated eye every year for at least three years to stem the tide of damaging vessel growth. Needless to say, the procedure is rather uncomfortable at best. "Apart from being an unpleasant procedure for patients to undergo, the injections can cause tearing and infections inside the eye, and an increased risk of blindness," de Cogan notes. "There is also the issue of calculating dosages. With an injection you're giving the retina a large dose each month, only to have it decrease over the time until the next injection. With VEGF inhibitor eyedrops the eye would be able to receive the same dose every day," says de Cogan.

In a recent laboratory breakthrough, de Cogan and her Ocular Drug Delivery Team were able to do just that—successfully administer a VEGF inhibitor into animal eyes using an eyedrop. The work began with the study of certain peptides used to battle bacteria. "One particular peptide used a positive charge to create an electrostatic interaction with the negatively charged cell wall and penetrated the bacteria," de Cogan explains. "We discovered the same peptide would briefly open a tiny pore in corneas and other eye structures. We then employed the same electrostatic effect to bind the VEGF inhibitor to the end of the peptide chain, and when the peptide opened a pore and passed through the eye tissue the VEGF inhibitor was dragged along for the ride, and subsequently released into the vitreous."

In animal testing the VEGF inhibitors were still present in the vitreous up to 72 hours after administration. More testing is currently underway, but de Cogan is cautiously optimistic. "Without the need for expensive and painful injections more people will be able to undergo treatment earlier," she says. "We will also be able to more effectively raise and lower dosages, depending on how the medication works on any individual. Since the medication may eventually be administered by optometrists instead of more highly trained specialists, the treatment will be much less expensive and more widely available around the world. Patients will also be more involved in and in control of their own treatment, which can only enhance compliance."

Comment on this article.

Instant Access to Information with Aira: An Introduction (Part 1 of 2)

Imagine needing sighted assistance and just tapping a button to be connected quickly with someone who has been trained to help you. The agent both hears and sees, either through your smart glasses or your smartphone, which you point wherever you want. He or she can describe your surroundings, describe an intersection, guide you through a busy airport, read your mail, or help you find a dropped object; the uses for this kind of service are nearly endless. This is what Aira does. Aira (pronounced: eye-rah) provides "instant access to information" for people who are blind or visually impaired.

As stated on the Aira website: "Our name is derived from two interesting sources: the emerging field of Artificial Intelligence (Ai), and the ancient Egyptian mythological being and symbol known as the Eye of Ra (ra). Steeped in the mysteries of Egyptian mythology, the Eye of Ra has symbolized protection, healing, and the power to perceive and interpret both the seen and unseen in the universe."

After more than two thousand beta trials, the service became available in late 2016. Aira does not replace a white cane or guide dog. Amy Bernal, Aira's Vice President for User Experience, explained, "We are never going to compete with or replace a user's own skills. We're not going to replace a mobility aid. We really are there to be a benefit of information that a sighted person would have access to. That's the kind of information we're making instantly available to our user base." Aira is currently only available in the US, but they hope to be in other countries within a year.

Bernal added, "There are great tools out there, but you have to use a lot of them to accomplish what Aira can accomplish. Our agent team distinguishes us from the combination of all those apps." Aira is available seven days a week from 7 a.m. to 1 a.m., Eastern Standard Time.

Using Aira

When you sign up with Aira, you will receive a pair of smart glasses and a small MiFi unit. The MiFi unit is a personal hotspot; Aira uses the AT&T network. There is no additional charge for this equipment. Aira users are called "explorers" and the people providing assistance are called "agents." An explorer's visual acuity can range from low vision to total blindness. As of this writing, Aira's explorers range in age from late teens into the 70s.

There are four monthly subscription options:

  1. 100 minutes for $89
  2. 200 minutes for $129
  3. 400 minutes for $199
  4. Unlimited minutes for $329

Bernal said that most users choose either the 200- or 400-minute plan. She added that Aira is trying to get its service paid for by various state and federal agencies. For example, the company is working with the Veterans Administration so Aira can be covered as a device issued to a veteran.
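For comparison, those prices work out to roughly 89 cents per minute on the 100-minute plan, about 65 cents per minute on the 200-minute plan, and just under 50 cents per minute on the 400-minute plan, so the larger plans cost noticeably less per minute of agent time.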

The explorer downloads the Aira app, available for both iOS and Android, onto their smart phone. Through the app, the explorer activates a "Call Aira" button on the home screen, and within a few seconds an agent is speaking with them.

The Equipment

Once you have signed up with Aira, you will receive your smart glasses and a MiFi unit. Both pieces of equipment arrive in one shipment: the larger box holds the glasses and the smaller box holds the MiFi unit.

The Smart Glasses

Aira ships two different models of smart glasses. Bernal said, "It depends on when a user signed up in terms of which hardware we were shipping. Today we do ship both. [Which one you get] depends on preference for shades, head shape, and other considerations the user may have."

One pair of glasses feels like a regular pair of glasses, except that the right frame is thicker than the left. A small rectangular camera unit with two buttons on top is attached to the right frame. When you are wearing the glasses, the button furthest back is the power button; the one in front is no longer used. There is a charging port on the bottom of the right frame. The glasses and camera unit together weigh very little, slightly less than two ounces.

Under the box insert are a charger, a wall plug, a cloth for cleaning the glasses, and a fabric case. The charger can work on AC or USB.

The second type of glasses looks like Google Glass. They come in a zipper case with mesh compartments for the glasses, charger, and AC adaptor. The glasses weigh 1.46 ounces. They do not feel as sturdy as the other pair, but they fit me much better.

The left frame of the glasses is metal and the right frame is plastic. The pair I received did not have any lenses. The camera is at the front of the right frame, and the power button is at the end of the frame, by your ear. On the bottom of the right frame is the charging port; the charger attaches magnetically, and the glasses can also be charged by USB.

When the frames are unfolded, the glasses will speak the battery level and announce that they are searching for WiFi. Once connected, the glasses will announce that they are ready for service.

The Mifi Unit

Aira mostly ships AT&T MiFi units. This unit measures approximately 4-1/2 by 2-1/2 by 5/8 inches. There are two buttons; the one closest to the side of the unit is the on-off switch. On the opposite side is the charging port.

The charging cable and AC adaptor are also in the box, underneath a cardboard lid. The charger can work on AC or USB.

Aira sometimes ships a Verizon MiFi device instead. It is oval shaped, weighs 2.9 ounces, and measures 2.25 by 3.4 by 0.55 inches. There is only one button on the unit, and the charging port is on the opposite side. It can be charged by AC or USB.

The Aira App

The Aira app has five tabs on the bottom of the screen: Home, My Glass, WiFi, Usage, and More.

At the top of the home screen are two buttons labeled Glass and Phone. There are two ways you can contact Aira: from your glasses using the MiFi or from your phone using a cellular connection. Some data is used when contacting Aira from a cell phone. Below these buttons is a button labeled "Call Aira." Depending on which call method is selected, the button will say "Call Aira from Glass" or "Call Aira from Phone." Select the appropriate button and in a few seconds, an agent will respond.

The My Glass tab opens a screen with information about your glasses, including whether they are connected to the Internet, the battery level, and the estimated usage time. If you are using the AT&T MiFi, information about the MiFi unit, including battery level and signal strength, is also listed. There is also an option to shut down the MiFi when you are done.

This MiFi information is not available when using the Verizon unit, although the screen will show the MiFi's name. According to Aira tech support, the company is aware of the issue and hopes to resolve it soon.

The WiFi tab lets the user add a WiFi network. This is useful since the MiFi unit isn't necessary when on WiFi. There are edit boxes to add the network name and password. Once the WiFi network is saved, it can be set as your default network. Just remember to switch back to MiFi when not in range.

The Usage tab has information about your subscription and time used. Activating the "View My Usage Detail" button loads a screen with the date, time, length of time with the agent, and the agent's name. Activating an individual listing opens a screen for leaving feedback about the agent.

The More tab has information about your account, your profile, an inbox, your photos, an option to contact Aira, and other options.

Agents

Agents are not allowed to tell you when it is safe to cross the street; you must decide that on your own. They may, however, tell you that the light is green. Agents also do not express their opinions; they are trained to provide information. Bernal said, "Agents are there to serve as a set of eyes, not as a brain." As of this writing (August 2017) there are approximately 60 agents, a number expected to grow to around 200 by the end of the year.

Agents are located throughout the United States; since they can answer Aira calls from anywhere, it doesn't matter where an agent is physically. Agents go through a rigorous training process. Erin Cater, Agent Operations Manager, said, "We are recruiting top talent."

Aira looks for people who have at least two years of customer service experience. Candidates then take a pre-screening test that measures customer service and map-reading skills. Cater said this test eliminates around 40 to 50 percent of applicants.

Those who pass the pre-screening test are offered an interview; because candidates are located throughout the country, interviews are conducted over the Zoom conferencing program. Applicants are asked about their work history along with some situational questions. Cater added, "As a surprise, the interviewees aren't aware of this in advance, we throw them on the dashboard." The dashboard is where an agent interacts with an explorer.

She continued, "Being an Aira agent is very unique so you never know until you do it whether it's something you're going to enjoy." Some candidates have dropped out after the dashboard session.

Successful candidates move on to the training phase, where they receive 20 to 30 hours of training before they are put on the dashboard under supervision. Cater explained, "It starts with an agent workshop which is about four hours long. That goes over Aira as a whole, company policies, where we came from, where we are going, and we get into the nitty gritty of the agent details. A good example is working with a user who is using a white cane versus a guide dog. As an agent, you assist them slightly differently. After the workshop, we do more trainings. One of the trainings is going over the details of our dashboard. Then we break out into individual one-on-one training where our new agents are working with our blind and visually impaired staff and doing hands-on training where they're sitting with an agent analyst. The analyst is shadowing them and watching their behavior on the dashboard throughout this training. Toward the end of the training, our new agents shadow our veteran agents." Before an agent can work alone on the dashboard, they must pass a background check.

Aira's dashboard is built around Google Maps, where the agent can use satellite, map, and street view. Cater especially likes satellite view because it lets her zoom in on details. The dashboard also contains the user's profile, including name, address, emergency contact, whether they use a white cane or a guide dog, how much detail the user likes, and whatever additional information the explorer wants to add. The dashboard shows the user's GPS location, and the agent can take a photograph of whatever the camera is facing and then manipulate the photo for a better view.

Cater recommends using the glasses for travel since they give the agent a better view. For tasks such as reading the mail, the phone is fine.

She added that Aira is working on having glasses in three different sizes.

Cater started out as an agent with Aira. She told an unusual story about helping an explorer who was in a weight loss competition read the scale while he was undressed. Cater said, "Weird and strange tasks are few and far between. Our explorers are opening our eyes to possibilities we never imagined." She also talked about a woman who had an Aira agent help her independently locate her father's grave in Arlington National Cemetery.

First Aira Session

The first Aira session is called "On-Boarding." Once you get your glasses and MiFi unit, an agent will call to go over how to work the glasses and the MiFi, navigate the Aira app, and set up your profile. The agent will also explain what agents can see, do at least one "exploration" with you, and answer questions. The agent will ask whether there is a particular way you like to be guided: some people like a lot of detail while others do not. You might want to record this session, since a lot of information is given.

I did two explorations as part of my on-boarding call. On the first, I asked the agent to read the label on a bottle of salad dressing; on the second, we took a walk near my house. I used the glasses for both tasks, though I later found out that the phone camera is sometimes better. The agent gave clear directions about how to hold my head so he could read the bottle. Outside, he read street signs, described the neighborhood, and described the sidewalk.

After the session, the agent said to call if I had any questions or problems.

Aira Keynote at the National Federation of the Blind Conference and Convention

At the 2017 National Federation of the Blind annual convention, Aira CEO Suman Kanuganti gave the keynote address and announced several new Aira features. For households with more than one family member who is visually impaired, Aira now offers a Family Share Plan, which lets family members share the glasses and subscription minutes.

Aira will also be launching Chloe, its AI agent, who will work with explorers through their smart glasses. Her first task will be reading, which can eliminate the need for a live agent for that task. More capabilities will be announced in the coming months.

BlindSquare, a popular GPS app for the blind, will integrate with Aira. This will provide even more information for explorers.

Priority Connectivity on the AT&T Network

Aira and AT&T have been working to improve MiFi connectivity between the explorer wearing smart glasses and the Aira agent. The two companies designed and are rolling out "Dynamic Traffic Management," a network feature intended to ensure that explorers get immediate access to Aira agents.

You can watch the entire keynote address here.

Part 2 of this article will feature experiences from Aira Explorers.

Resources

Aira website

Aira was featured in "Wired."

Comment on this article.