If you are a Twitter user with vision loss, this has undoubtedly happened to you: A tweet comes through on your timeline with some provocative text. There is an attached image you suspect will amuse, bewilder, or enrage. If only you knew what was in that image file…
Take heart and read on.
In this article we will introduce you to an exciting new mobile app that will do just that—describe a Twitter pic and recognize any embedded text. First, however, we want to introduce you to the developer competition that is making this and other smartphone, tablet, Web, desktop, and wearable access solutions possible. It's called the AT&T NYU Connect Ability Challenge, and it's going on right now, with teams and individuals vying for both recognition and a share of $100,000 in total prize money.
The Connect Ability Challenge
AT&T frequently partners with educational institutions and government organizations on a variety of initiatives. During the past several years, the company has invested in numerous developer challenges offering cash prizes for new and creative technology solutions that improve the lives of New Yorkers and others around the world. For example, last year the New York Metropolitan Transit Authority teamed with AT&T, Transit Wireless, and the New York University Center for Urban Science and Progress (CUSP) to solicit new mobile solutions designed to improve commutes for millions of subway, bus, and rail riders across the five boroughs. The $10,000 grand-prize winner was YoTrain, an app that automatically notifies NYC subway riders of departure times and destinations as they near the station.
Other 2014 winners included NYC Accessible, an app that lets riders know which stops are ADA accessible and the current working status of elevators and escalators, and Departures NYC, an app that offers transit information displayed using augmented reality.
"This year, in honor of the 25th Anniversary of the American's with Disabilities Act, AT&T is proud to be partnering with New York University's Ability Lab to offer the first ever 'Connect Ability Tech Innovation Challenge,'" says Marissa Shorenstein, AT&T's New York State President. "The Americans with Disabilities Act was landmark legislation aimed at enabling people with disabilities to participate more fully in our society. Twenty-five years later, as advances in technology have allowed us all to connect in ways we never imagined, we believe we can further remove barriers the disability community continues to encounter."
The NYU Ability Lab is an interdisciplinary research center dedicated to the development of various adaptive technologies. The AT&T NYU Connect Ability Challenge combines the resources and expertise available at the lab with those of a quartet of Exemplars, a term Merriam-Webster defines as "an admired person or thing that is considered an example that deserves to be copied."
Throughout the competition, Challenge participants can consult with contest Exemplars to gauge the usefulness of their ideas and brainstorm new features that might make their apps and wearables even more productive.
The Challenge website features short audio-described videos from each of the four Exemplars, who are:
Gus Chalkias: An assistive technology specialist, career counselor, and college student from Queens who is blind.
Xian Horn: A teacher, speaker, and writer from Manhattan who has cerebral palsy, which has an impact on her mobility.
Paul Kotler: A student, blogger, lecturer, and advocate from Philadelphia who has autism. Kotler communicates using computer-assisted technology and struggles with anxiety and impulse control.
Jason DaSilva: A filmmaker from Brooklyn who has multiple sclerosis. DaSilva uses a powered wheelchair and has limited upper- and lower-limb use.
Developer teams and individuals can submit their work in any of five categories:
- People with sensory disabilities
- People in need of mobility solutions
- Social and emotional solutions
- Solutions for people with communicative and cognitive disabilities
- Solutions affecting policy and society
As part of the Connect Ability Challenge, this past April, AT&T and NYU sponsored a two-day Hackathon in downtown Brooklyn. All four of the Exemplars were on hand at the event.
"We each gave a presentation, then we spent the rest of the time answering questions and offering advice," says Gus Chalkias. "Participants ran their ideas past us. We told them whether they were on target, if their projects would benefit the disabled community, and why or why not."
According to Chalkias, "Most of the Hackathon participants seemed genuinely interested in learning about accessibility and possibly making a difference; the potential of a cash prize was secondary."
Brooklyn developer Cameron Cundiff began his career with an internship working with Adobe's Group Product Manager for Accessibility, Andrew Kirkpatrick. "I've always been interested in structured content and information architecture. That led me to Web standards, and from there it was a short jump to Web accessibility."
Cundiff first heard about the Connect Ability Challenge on Twitter. "I began thinking about Twitter memes, and how so many of them involve images," he recalls. "Almost none of these images have alt tags describing what's going on, and I thought, 'Maybe I can find a way to provide them.'"
When Cundiff spoke with Gus Chalkias about his idea he worried it might seem a bit frivolous. "Most of the other projects involved productivity or navigation," he says. "But Gus agreed—the blind should have the opportunity to enjoy social networks the same as anyone else."
Cundiff began by setting up a Twitter account for his alt_text_bot service: @alt_text_bot. He then wrote a Ruby application that would scan Twitter for posts that mentioned @alt_text_bot and contained an attached image. The images were forwarded to CloudSight, one of the same image recognition APIs used by TapTapSee. When the recognition results were returned, the Alt Text Bot program replied to the tweet with the descriptive text.
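The flow described above—watch for mentions with images, get a description back, and tweet it as a reply—can be sketched in a few lines of Ruby. This is a hypothetical illustration, not Cundiff's actual code; the `format_reply` helper and its names are invented here, and the Twitter/recognition API calls are represented only as comments. One real constraint worth noting: at the time, a reply had to fit Twitter's 140-character limit, mention prefix included, so the description may need truncating.

```ruby
# Illustrative sketch of the Alt Text Bot reply step (names are
# hypothetical, not from the actual alt_text_bot source).

TWEET_LIMIT = 140  # Twitter's character limit at the time

# Compose the reply: mention the original poster, then append the
# recognition result, truncating with an ellipsis if it won't fit.
def format_reply(screen_name, description)
  prefix = "@#{screen_name} "
  room = TWEET_LIMIT - prefix.length
  body = description.length > room ? description[0, room - 1] + "…" : description
  prefix + body
end

# In the real service, a streaming or polling loop would:
#   1. find tweets mentioning @alt_text_bot that carry an image,
#   2. POST the image URL to the recognition API,
#   3. reply with format_reply(author, result).
puts format_reply("some_user", "woman in black red white blue cycling suit smiling")
```

The truncation step matters because recognition engines can return long descriptions, and a reply that exceeds the limit would simply be rejected by the API.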
One of the first images Cundiff forwarded to alt_text_bot showed a woman riding a bike. The image recognition engine returned: "alt=woman in black red white blue cycling suit smiling."
"I was really excited and a little surprised at how well it worked right off the bat," he recalls.
Alt_text_bot went on to win the Hackathon's $5,000 First Prize. The $3,500 Second Prize was taken by StenoSpeak, a mobile app that improves upon open-source stenography technology to speed up text translation to a conversational pace for those who cannot use their voices to communicate. The $1,500 Third Prize went to Tranquil Tracker, a bio-sensing system that can predict and prevent anxiety attacks.
Cundiff is using part of his Hackathon prize money to support the alt_text_bot service and pay for the image recognition API. He's also working on improvements, such as a browser plugin to offer real-time image description, and he is still eligible for Connect Ability Challenge prizes. "I plan to use a large portion of any prize money to help support the service," he says.
Individuals and teams have until June 24 to submit their completed Connect Ability Challenge entries. Winners will be selected by a panel of judges including the four Exemplars, members of the NYU Ability Lab, and representatives from the Rehabilitation Engineering and Accessibility Society of North America (RESNA).
Cash prizes totaling $100,000 will be awarded at a banquet in New York City on July 26, the 25th Anniversary of the date President George H. W. Bush signed the Americans with Disabilities Act into law.
AccessWorld will report on all the winners in an upcoming issue. In the meantime, you can read about many of the projects at the Challenge website, which includes GitHub links to the various applications and several invitations to get an early peek and even help beta test.
You can also follow the latest AT&T NYU Connect Ability Challenge news on Twitter by searching for the hashtag #ConnectAbility.