Highlights from The 2018 American Foundation for the Blind Leadership Conference
Each year, the American Foundation for the Blind Leadership Conference (AFBLC) brings together individuals from across the vision loss field. In addition to educational sessions, the conference serves as a networking space and includes the presentation of many of AFB's annual awards. The information presented at the conference covers a range of topics, from aging with vision loss to public policy. For this article, I will focus primarily on the technology content presented at the conference.
AccessWorld Technology Summit
The AccessWorld Technology Summit occurs each year and is a full-day session that showcases new technologies and technological research of interest to people with vision loss. Presenters include mainstream and assistive technology companies as well as other organizations active in the technology arena.
New Developments in Accessibility for Google Products
The first session of the AccessWorld summit was presented by members of the Google accessibility team. Members included Kyndra LoCoco, Accessibility Community Manager, Victor Tsaran, Technical Program Manager for Android, Laura Palmaro, Program Manager for Chrome, and Roger Benz, Program Manager for G Suite. Each presenter discussed the accessibility features of their respective products with a focus on what accessibility features have been included recently.
Various accessibility improvements are included in Android Oreo (Android 8.0). Over the past several years, the team has focused on bringing productivity improvements to TalkBack, the Android screen reader. One aspect of this is the inclusion of extensive keyboard shortcuts for TalkBack that provide access to most features of the screen reader and allow TalkBack to be used much like a desktop screen reader. Other improvements include an "Accessibility" button on the navigation bar, where the "Back," "Home," and "Overview" buttons are located, which lets the user launch a specific accessibility tool, and the ability to assign a TalkBack gesture to the fingerprint sensor, if present.
The redesigned ChromeVox screen reader for the ChromeOS operating system is now titled simply ChromeVox, and the previous version has been retired. This version of the screen reader has redesigned keyboard shortcuts, new earcons (audio cues used to identify on-screen elements), and consistent behavior across the operating system. Another key feature recently introduced is support for USB-capable braille displays on ChromeOS; support for Bluetooth displays is in development.
Several accessibility advances are in development for G Suite apps (Docs, Drive, Sheets, etc.), with some already available on certain combinations of platforms and screen readers. Magnification has been built into Google Docs; this feature is available on ChromeOS and Mac, with Windows support via ZoomText coming soon. Braille support has also been added to Google Sheets, currently available on ChromeOS.
Google has introduced a new video conferencing system titled Hangouts Meet. In addition to software, specially designed hardware has been produced for use with the system so that it can be used in physical meeting rooms. The hardware includes a touch panel for controlling the system that is accessible using the ChromeVox screen reader.
An Introduction to Oath
Oath is a Verizon company that incorporates various Web properties including Yahoo, AOL, HuffPost, TechCrunch, and Flickr. The presentation was given by Darren Burton, Accessibility Specialist for Oath's Yahoo property. He explained that, much as with other parent companies (such as VFO and Alphabet), the properties under the Oath umbrella will mostly remain distinct and separate. The accessibility team works with development teams at Oath to promote app accessibility, and new hires are provided with information on accessibility processes during initial orientation. Oath has also prepared a set of learning modules explaining how to implement the most common forms of accessibility, giving developers guidance when the accessibility team is not available. Recently, Oath has been developing a second usability experience lab in the New York City area.
Samsung Home Appliance Accessibility
Youngsun Shin, Principal Designer for user experience for digital appliances at Samsung, discussed recent efforts to bring accessibility to Samsung appliances. One recent enhancement is the introduction of rising and falling tones to indicate temperature changes on Samsung refrigerators. Refrigerators with speakers will also announce the temperature; the tones were added so that refrigerators that only have a buzzer still have a way to provide the information in an accessible form. Samsung is also beginning to add braille to certain appliances; where this is not possible, the company will generally have tactile stickers produced to label the controls.
The team has also recently designed a voice-activated washer for the Korean market with the assistance of an employee with a visual impairment. The technology is in development for a global release. Samsung also provided the voice-enabled washer that was used in the athletes' village for the Paralympics. This allowed the team to receive further feedback from those with visual impairments on the accessible washer.
Audio Description Research
David Vialard of Illinois State University discussed research he is conducting on audio description. More audio description is available than ever before for those who are visually impaired, though research into best practices is scarce. The International Collection of Child Art at Illinois State University served as a pilot project for providing audio description of images. Because a variety of individuals submitted descriptions for the images, researchers can see which items are commonly described in an image as well as differences among describers. Vialard also uses eye-tracking technology to determine where individuals focus when looking at images. Data on focus patterns and the duration of a viewer's focus on specific areas are collected for analysis. This information can be combined and displayed to show focus patterns for multiple individuals, so that similarities and differences can be more easily detected and studied.
Bluetooth Beacons for Indoor Navigation
Mike May of Envision, Inc. discussed the current state of indoor wayfinding for those with vision loss, current concerns, and future developments in this space. The most common form of indoor navigation uses Bluetooth beacons: mainstream devices that can be placed in a space to communicate information to a smartphone app when a user comes within a certain distance of the beacon. Many companies now deploy beacons in public spaces such as schools, malls, and airports. At the moment, most beacons are associated with a single app, which means a user must have multiple indoor wayfinding apps on their device and know which one to use in a given venue. To alleviate this issue, several indoor wayfinding providers have formed a coalition to share beacon data. Participants include Sendero (developer of the Seeing Eye GPS app), BlindSquare, and Indoo.rs, along with Radius Networks, which makes beacons and maintains a wiki-style database of beacon locations. The goal of the coalition is to share beacon data across apps so that everyone can benefit from the beacons, regardless of who placed them.
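For readers curious how an app judges whether a user is "within a certain distance" of a beacon, here is a minimal sketch (not from the presentation) of the commonly used log-distance path-loss model, which converts a beacon's received signal strength into a rough distance. The calibration value and attenuation factor below are illustrative assumptions; real deployments calibrate them per beacon and per environment.

```python
def estimate_distance(rssi: float, tx_power: float = -59.0, n: float = 2.0) -> float:
    """Estimate distance in meters from a beacon's received signal strength.

    rssi: the signal strength observed by the phone, in dBm.
    tx_power: the calibrated RSSI expected at exactly 1 meter (assumed value).
    n: environmental attenuation factor (2.0 approximates free space).
    """
    return 10 ** ((tx_power - rssi) / (10 * n))

# A reading matching the 1-meter calibration value yields 1 meter.
print(round(estimate_distance(-59.0), 1))  # 1.0
# A weaker signal implies the user is farther from the beacon.
print(round(estimate_distance(-75.0), 1))  # roughly 6 meters
```

Because signal strength fluctuates with obstacles and interference, apps typically smooth several readings and bucket the result into coarse zones (immediate, near, far) rather than trusting a single estimate.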
One disadvantage of beacon technology is its limited accuracy, though a new technique may provide a solution. Wi-Fi fingerprinting takes into account all signals available to a device at a specific location, along with their strengths. The distinct combination of signals and signal strengths provides a virtual fingerprint for that location. This could provide accuracy to within a foot and could enable accurate turn-by-turn directions for indoor spaces.
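The matching step behind Wi-Fi fingerprinting can be sketched in a few lines: compare the signal strengths a device currently sees against a database of previously surveyed locations and pick the closest match. The access point names, signal values, and locations below are entirely hypothetical, and real systems use far richer models, but the nearest-fingerprint idea is the core of the technique.

```python
import math

# Hypothetical survey database: location -> {access point: signal strength in dBm}
FINGERPRINTS = {
    "lobby":    {"ap1": -40, "ap2": -70, "ap3": -85},
    "cafe":     {"ap1": -65, "ap2": -45, "ap3": -80},
    "gate_b12": {"ap1": -85, "ap2": -75, "ap3": -50},
}

def locate(scan: dict) -> str:
    """Return the surveyed location whose fingerprint is closest to the scan,
    using Euclidean distance over the access points both have observed."""
    def distance(fingerprint: dict) -> float:
        shared = set(fingerprint) & set(scan)
        return math.sqrt(sum((fingerprint[ap] - scan[ap]) ** 2 for ap in shared))
    return min(FINGERPRINTS, key=lambda loc: distance(FINGERPRINTS[loc]))

# A scan with a strong ap1 and weak ap3 best matches the lobby's fingerprint.
print(locate({"ap1": -42, "ap2": -68, "ap3": -88}))  # lobby
```

In practice the survey database holds many readings per location, and statistical matching smooths out the moment-to-moment fluctuation in signal strength; that density of reference points is what makes foot-level accuracy plausible.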
Verizon and the Development of 5G Wireless Technology
Zachary Bastian of Verizon discussed Verizon's research into 5G wireless technology, its benefits over current technologies, and deployment challenges. 5G technology has demonstrated speeds of 1 gigabit per second for both upload and download, while current 4G and 4G LTE technology achieves speeds of around 12 megabits per second. The technology is still in the development stages, as the way 5G waves interact with objects is still being studied.
5G wireless will require many more cells than current wireless data technologies: where 4G can cover a wide area with a single tower, 5G may need hundreds of smaller towers. This means 5G will most likely appear in urban areas before being deployed in more rural locations. In addition, the antennas used with the technology are larger than those currently in use, so 5G will most likely appear first as a replacement for home internet before being used elsewhere.
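To put the speed difference in perspective, here is a back-of-the-envelope comparison of how long a 1 GB download would take at the two speeds cited above, assuming the rates are sustained (real-world throughput varies considerably):

```python
# A 1 GB file expressed in bits: 1,000 MB x 8 bits per byte x 1,000,000
FILE_SIZE_BITS = 1_000 * 8 * 1_000_000

mbps_4g = 12      # typical 4G/LTE speed cited in the presentation
mbps_5g = 1_000   # 1 gigabit per second

minutes_on_4g = FILE_SIZE_BITS / (mbps_4g * 1_000_000) / 60
seconds_on_5g = FILE_SIZE_BITS / (mbps_5g * 1_000_000)

print(round(minutes_on_4g, 1))  # roughly 11 minutes on 4G
print(round(seconds_on_5g, 1))  # 8.0 seconds on 5G
```

The same file that occupies a 4G connection for over ten minutes arrives in seconds over 5G, which is why the technology is discussed as a potential replacement for wired home internet.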
HumanWare Braille Devices
Peter Tucic of HumanWare spoke about new developments in the company's braille products. The Brailliant 14 is a 14-cell braille display that can connect to five Bluetooth-capable devices at one time. The device contains a basic notetaker, and notes can be synced from the Brailliant to an iOS device using the Brailliant Sync app. The app is currently only available on iOS, but there are plans to develop an Android version in the future. Instead of traditional cursor routing keys, the Brailliant has touch sensors above each cell that perform the same function, a design choice intended in part to improve the durability of these controls.
KeySoft version 5 has been released for HumanWare's BrailleNote devices. Key additions in this version include one-handed operation and improvements to math content. It is now possible to review a visual graph using the braille display: the user can scroll through the graph, with the X and Y axes represented by specific braille symbols while the equation being graphed appears as a mass of braille dots forming the line. It is also possible to emboss a hardcopy tactile graphic of the equation being graphed, giving the user an overview of the graph.
Aira
Paul Schroeder of Aira discussed the Aira technology, demonstrated the service, and detailed recent developments at the company. The Aira service operates by connecting visually impaired customers (Explorers) with trained sighted individuals (Agents) who can provide assistance with a variety of tasks. Though an Explorer can use a smartphone's camera to provide video for an Agent, the Aira service generally relies on a set of camera-equipped smart glasses. The recently released Horizon smart glasses have up to 7 hours of battery life and provide a wider field of view for the Aira Agent than earlier models. In addition, the picture quality transmitted by the camera has improved, which makes it faster for Agents to discern small details, such as when reading small print.
Aira also announced that Explorers who use Aira for job-seeking tasks, such as editing a resume or picking out an outfit for an interview, will receive those minutes free. Aira is also partnering with airports to provide travelers free access inside terminals, and is reaching out to other public sites about providing access to visitors with visual impairments.
General Sessions
Each day at AFBLC begins with a general session discussing pressing issues in the field of vision loss. This year, the issues discussed ranged from technology to employment. The first general session brought together representatives from mainstream technology firms who discussed how their companies improved product accessibility, supported employees with vision loss, and improved opportunities for people with visual impairments. The panel consisted of Sarah Herrlinger of Apple, Mark Lapole of eBay, Megan Lawrence of Microsoft, and Jeffrey Wieland of Facebook. Highlights included braille display support for Apple TV and emoji accessibility on braille displays from Apple; the Facebook Navigation Assistant, which makes Facebook easier to navigate and use for blind or low vision users; the recently released Soundscape wayfinding app from Microsoft; and eBay MIND Patterns, a compendium of accessibility examples that can be applied to various situations when developing for accessibility. All the companies also described the importance of accessibility knowledge among students and their efforts in this area.
The second panel featured representatives of mainstream technology companies as well as individuals with visual impairments who have successful careers. The goal of the panel was to determine what challenges currently face those with vision loss when seeking employment and what methods companies are employing to increase opportunities for people with visual impairments. The panel consisted of Jennison Asuncion of LinkedIn, Dina Grilo of JPMorgan Chase, Jen Guadagno of Microsoft, and Megan Mauney of Florida Blue. The companies represented have accessible hiring processes as well as supports for individuals with vision loss after hiring. Interestingly, the panelists explained that their companies find it useful if an applicant discloses their vision loss early in the application process, as it allows the company to make sure all aspects of hiring are accessible. Disclosing a disability early also allows companies to be better prepared with accommodations, such as assistive technologies, once an applicant is hired.
The final general session was presented by Bryan Bashin, CEO of the San Francisco Lighthouse. Using available data and extrapolation, he detailed how people with visual impairments are using technology to learn new skills, complete tasks, and network in greater numbers than ever before. He compared the time spent using these technologies and learning from one another with the time spent learning from or receiving services through traditional visual impairment organizations, with the former far outnumbering the latter. In addition, he described how the general public's understanding of and experience with visual impairment is now shaped much more by individuals than by organizations. For example, YouTube videos about visual impairment created by individuals unaffiliated with any organization have been viewed in far greater numbers than those produced by visual impairment organizations.
AFB Awards
AFBLC also serves as a forum where AFB can present various awards to those who have achieved much in the field of visual impairment. For the first time, the AFB Helen Keller Achievement Awards, which are usually hosted at their own venue, were presented at the conference. This year marked the 22nd Helen Keller Achievement Awards, with Microsoft and Facebook receiving awards for improving the accessibility of their existing products and developing new accessible products and services. Haben Girma, a disability rights lawyer who was also the first deafblind individual to graduate from Harvard Law School, also received an award for her dedication to improving opportunities for people with visual impairments.
The Corinne Kirchner Research Award is named for former AFB staff member Corinne Kirchner, a leader in research in the field of visual impairment. Recipients are honored for their contributions to the field through valuable research that reveals ways of improving the lives of those with visual impairments. This year's recipient was the Envision Research Institute, which conducts research on a range of topics related to vision loss, from improving rehabilitation to accessible wayfinding. Mike May, formerly CEO of the Sendero Group, received the award on behalf of the organization.
The Stephen Garff Marriott Award is named for Stephen Garff Marriott, who after losing his vision rose through the ranks at Marriott and served as a role model for others with vision loss. The award honors those who have been extraordinary mentors or have been extremely successful in their careers. This year's award was presented to Jennison Asuncion, Engineering Manager, Accessibility, at LinkedIn.
The Migel Medal is named for AFB's first chairperson, M.C. Migel, who established the award in 1937 to recognize those whose work has significantly contributed to improvements in the lives of individuals with vision loss. The first recipient was Larry Campbell, former president of the International Council for Education of People with Visual Impairment (ICEVI), who has long served in the field of education of people with visual impairments. The second recipient was Ted Henter, the key developer of the JAWS screen reader.
The Bottom Line
This year's AFB Leadership Conference was filled with information from across the field of blindness and visual impairment. Key mainstream and assistive technology companies took part, providing details not only on their new products but also on possible future developments in technology as a whole that will benefit individuals who are blind or have low vision. Next year, AFBLC returns to Arlington, Virginia, with the conference being held February 28-March 1, 2019.
Copyright © 2018 American Foundation for the Blind. All rights reserved. AccessWorld is a trademark of the American Foundation for the Blind.