The Accessibility of the User Interface
Persons with disabilities have a range of accessible devices to choose from. However, with an increasing number of gadgets adopting touch interfaces, are some products becoming less accessible to a section of the community? Lucy Greco has the answer.
Very early in the evolution of computers, their usefulness to people who were blind and visually impaired became apparent. With the creation of synthetic speech and the ability to output text as synthetic speech, the assistive technology industry was born. Command line interfaces could soon output synthetic speech for blind users, and talking terminals were available in a few universities so that blind students could participate in computer science. As terminals were replaced by personal computers, screen-reader technology was developed. I've spoken in previous posts about my first computer and how I was able to interact with it. Because of my age, I never experienced talking terminals, the predecessors of personal computers. When I switched from an Apple to a personal computer running DOS, my entire life changed. Had I never experienced the difference between the two systems, I might never have become an assistive technology advocate.
I used MS-DOS from version 2.0 through version 5, and each time I upgraded, my screen reader worked a little better. Before I stopped using DOS, I was able to use mainstream tools such as spell checkers effectively. I have always been an early adopter: long before most other blind people moved to a graphical user interface, I was one of the very first users of OutSpoken for the Mac. In fact, I was such an early user that my serial number consisted of only two digits. This screen reader was the first of its kind, giving a blind person access to a graphical user interface for the first time.
I remember learning how to use it with the very well done tactile graphics included in the package. These tactile graphics gave a blind person reference points for the different elements of the interface. I used the system for many years, until well after it had stopped working with the operating system. As Apple upgraded from Mac OS 8 through to OS X, my faithful screen reader slowly died. Once again I jumped ship to something new as quickly as I could. Today, I use a Windows screen reader as my primary interface. I can use a Mac, and I often do, but not as much as I could.
In the IT marketplace I have many more choices than ever before. Computing is no longer done only on a PC; computing today takes place in the palm of your hand, and in many other ways we could never have dreamt of barely a few years ago. As different devices enter the market, smaller and smaller new interfaces are introduced. Today, almost every device has some form of touchscreen. The laptop I'm using to write this article, a MacBook Pro running Windows under Boot Camp, is no exception. It has a multi-touch trackpad that I can use to give various commands to the screen reader whenever I boot into the Mac OS.
Recently I've been shopping for new appliances for my home. The touchscreen, or in the case of appliances the touch interface, seems to be all I could find. Even the refrigerator I wanted had only a touch interface for the temperature and humidity controls. Only commercial ovens had dials that I could use to set the temperature. More and more appliances are moving to touchscreens and complex user interfaces. I just purchased a crock pot for the first time and have already learned that I should have spent more time researching how to use one. The one I ended up with has only two buttons, and one of the two is the off button. The interface requires you to push the main button repeatedly to change which setting you are in. As you toggle through the different modes, a little light indicates which mode you will be using. This means that I often don't set it up right, because I forget how many times I have pushed.
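The single-button cycle described above is essentially a tiny state machine, and the eyes-free fix is straightforward: announce the new state on every press instead of relying on an indicator light alone. The sketch below is purely illustrative; the mode names and class are my own invention, not the actual appliance's design.

```python
# Hypothetical sketch of a one-button mode cycle that announces each
# new state, so a blind user gets the feedback a light-only indicator
# cannot provide. Mode names are assumed for illustration.

MODES = ["off", "low", "high", "warm"]

class ModeButton:
    def __init__(self, modes=MODES):
        self.modes = modes
        self.index = 0  # start in the first mode ("off")

    def press(self):
        """Advance to the next mode and return an announcement string."""
        self.index = (self.index + 1) % len(self.modes)
        return f"Mode: {self.modes[self.index]}"

button = ModeButton()
print(button.press())  # "Mode: low"
print(button.press())  # "Mode: high"
```

The announcement could be spoken by a small speech chip or shown on a connected app; the point is that the device's state is exposed, not merely lit.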
It's very hard to find any household appliance that is accessible. My washer and dryer have a bunch of buttons and a dial, but there is no way to know which setting the dial is on. The dial turns around and around and has no definitive beginning or end. A sighted person can tell from the screen which setting will be used, but I have to guess. Well, I don't like doing laundry anyway. Sometimes the new interfaces can be accessible. For example, there is a new television from Samsung that has voice control: you can say channel up and down, volume up and down, and more. There is also a gesture-based remote mode.
As the world changes, the way we use devices has also changed. Who would have thought even ten years ago that a new father could send a picture of his one-minute-old baby out to everyone, and that everyone could receive that picture in their preferred mode of viewing? It used to be that we would wait days for things like that. Now even a blind person is able to do it. I got just such a text message the other day from a blind friend. The phone my friend was using was an iPhone. At first thought, a touchscreen phone seems like the unfriendliest possible interface for blind users, but it is actually an example of how, if you try to make something accessible, anything can work. Apple, riding the wave of its iPhone, has become the most favored brand among blind users in a long time.
Since June 2009, the iPhone has included a very clever screen-reader interface, VoiceOver, designed for blind users. It is so specialized that sighted friends can't easily use a blind person's phone without knowing the gestures needed.
We are already seeing the next generation of interfaces, and it is coming as a response to safety issues, not access. However, access will ride on the back of safety if it has to. With the introduction of Siri to the Apple line of products, we find the iPhone becoming more accessible still. A person only needs to push one little button to make things happen. Siri was first thought of as a way to keep people from texting while driving, or at least from keying in a text message while driving. But now it is a way for disabled users to take notes on their phone and drive the interface just by speaking. Not every disability is addressed by innovations such as this, but many more people can do much more than ever before. I worked with a student the other day who has almost no use of her hands and who uses an iPhone for everything. She dictates her e-mails, takes voice notes with Siri, and records her classes, all on one device.
So now we have to start thinking differently about how we see interfaces. I used to put touchscreen devices aside every time I saw them, but now I always think, "Wait, how does it really work? Is there a way I can use it? Or, even if this may not work for me, can it help someone else?"
When I first got my job at Berkeley, I needed to start looking at everything in this new way. Maybe a touchscreen did not work for me, but it worked for many of my students. A student with severe hand pain or very little dexterity may be better able to use a touchscreen than any other type of interface.
I have a student who has the use of only one finger, and he flies on an iPhone. That one little movement lets him access everything he needs. Previously, he needed the help of an attendant to set him up with a computer: a mic, a single switch, and a few other wires just to write a simple e-mail. Today, he can send e-mail, make calendar appointments, and browse the web, all on a phone mounted right next to the joystick of his wheelchair. When the same student wants to use a computer, he can do so on campus via a Bluetooth connection. Yes, he still needs someone to start the connection on the computer, but then, without moving his hand at all, he can pair the joystick on his chair, turn on the on-screen keyboard, and fly on any computer you show him. This liberates him to be the programmer he wants to be.
I have really changed the way I think about technology, but we need to keep a close eye on it. When new products are created, post-secondary institutions need to teach students to think about access. If the engineering students of today do not understand the needs of all their users, they may simply keep producing inaccessible products. The more we teach the public about disability, the more access can be built in. Students of architecture are taught how to put ramps and elevators into buildings; why can't we insist that computer science and electrical engineering students learn how bad access can be if they don't build it in from the start? Why does a stove need a touchscreen that cannot be used without sight, even with tactile markers added? Every piece of electronics must be certified safe. What about requiring "usable by all" as well as the mandatory "safe"?
Lucy is a blind advocate for accessible technology. An Assistive Technology Specialist at UC Berkeley in the San Francisco Bay Area, Greco has been a user of various assistive technologies since the early 1980s. She is passionate about the ways technology makes the world more accessible to everyone, but especially to individuals with disabilities.
Related Blog: Taking a DEEPer Look at Accessibility by Robert Pearson. Read the article.
Related Event: G3ict M-Enabling Global Briefing Tour Inaugural Session, "New Milestones for Mobile Accessibility", at FCC, Washington, D.C. on June 4, 2012. Read event agenda and proceedings.
Related Publication: Benefits and Costs of e-Accessibility (Business Case White Paper) | March 2012. Order your copy.