A Brief History of Android Accessibility

In the beginning (okay, 2009), there was Android. It was called Cupcake (1.5). It had no accessibility, and that was not good.

Then came the others:

  • 1.6 – Donut
  • 2.0 & 2.1 – Éclair
  • 2.2 – Frozen Yogurt or Froyo
  • 2.3 – Gingerbread
  • 3.0, 3.1, & 3.2 – Honeycomb
  • 4.0 – Ice Cream Sandwich
  • 4.1, 4.2, & 4.3 – Jelly Bean

All had accessibility, and that was good.

Enough already!

This is a rough outline of changes to Android accessibility. If anyone has corrections or additions to make, feel free to post them to the comments.

The Early Years

Basic accessibility was introduced in Donut (1.6), expanding gradually through Gingerbread (2.3). The first level of accessibility was delivered by screen readers:

  • TalkBack (2009) by Google’s Eyes-Free Project.
  • Spiel (2009) by Nolan Darilek.
  • MobileAccessibility (2011) by Code Factory, which combines a screen reader with a suite of simplified apps.

All three screen readers offered roughly the same level of accessibility. With them, blind and low-vision Androidites could use their D-pads or trackballs to navigate to all focusable items and activate buttons, checkboxes, and other objects. They could also enter and review text. Initially (1.6), text input was possible only with a hardware keyboard, and text review involved moving focus away from and then back to an edit field, listening to the entire block of text, and counting arrow-key presses or deletions to reach a desired point. Over time (by 2.2, as Darilek, Code Factory, and Google developed virtual keyboards), however, on-screen typing and voice input became available. So did aural text review in edit fields and use of a virtual D-pad.

Though devices were quite usable in Donut through Gingerbread, creativity and workarounds were part of the eyes-free Android adventure. They were needed to deal with unlabeled buttons and edit fields, unfocusable objects, inaccessible web views, and the disruptions caused by device manufacturer overlays. Despite general improvements partially brought about through apps that took advantage of accessibility APIs, blind and low-vision users continued to choose devices that sported a hardware keyboard (strongly recommended) and a joystick (required) that could be located by touch (e.g., four arrow keys, a D-pad, a trackball, or a trackpad).
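For developers wondering what fixing an "unlabeled button" actually involves, the usual remedy is a content description on the view, which gives screen readers like TalkBack something to announce. The layout fragment below is an illustrative sketch only; the button ID and drawable name are made-up examples, not from this post:

```xml
<!-- Illustrative layout fragment (hypothetical names). An ImageButton has no
     visible text, so without android:contentDescription a screen reader can
     only announce it as an unlabeled button. -->
<ImageButton
    android:id="@+id/play_button"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:src="@drawable/ic_play"
    android:contentDescription="Play" />
```

In practice the description would normally come from a string resource so it can be translated, but the one-line attribute is the whole fix from a screen reader's point of view.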

The Middle Years

A significant accessibility improvement was released with 3.0 (Honeycomb), the OS version for tablets: Enhanced Web Access, the injection of self-voicing JavaScript that makes web views accessible. In Gingerbread and earlier, developers needed to make their web views accessible through lots of manual coding, an undertaking that must have seemed above and beyond the call of duty, assuming more than a few developers even knew the accessibility hooks existed. With Enhanced Web Access, many apps became screen-reader friendly without any developer intervention, email readers and web browsers being common and notable exceptions.

More exciting was the introduction of touch exploration in Ice Cream Sandwich (4.0). For the first time, eyes-free Android users could touch the screen and get spoken feedback about what they were touching without inadvertently phoning home or starting up the music player. They could simply slide a finger over the screen, and when they heard the screen reader announce the right button or checkbox, they could lift their finger and tap the object directly. Hardware keyboards and physical navigational controllers were no longer needed, but an accessible virtual D-pad was still required for text review in edit fields.

Ice Cream Sandwich also included system-wide font adjustment for low-vision users. In earlier Android versions, font size could be increased in some apps, not all or even most, and it could only be resized to levels set by developers, most of whom had little experience with people who read really large text.

An additional feature introduced in Ice Cream Sandwich, which sounds minor but isn’t, was the ability to set up the device and turn on accessibility without sighted assistance. End users could now draw a clockwise rectangle on the activation screen to start TalkBack before setup. Though welcomed by most eyes-free users, the gesture caused a stir, as many had trouble getting it to work.

The Present

Each version of Jelly Bean (4.1, 4.2, and 4.3) has included important improvements to eyes-free accessibility.

4.1 had lots of new goodies.

Accessibility focus was easily the biggest and flashiest addition. Eyes-free users could swipe vertically or horizontally to hear the next or previous focusable object, then double-tap anywhere on the screen to activate it, an approach many found easier than locating one button in a crowd.

Other screen-reader gestures became possible as well. Gestures could be used to go back or home and to open the notifications shade and recent apps. Gestures could also be used for reviewing text in and out of edit fields, reading continuously, and quickly jumping to part of the screen.

Braille support became available to eyes-free users. They could read on-screen content, except in web views and a few other environments such as Play Books, and they could input text using braille displays. Earlier versions of Android did include braille through Code Factory’s MobileAccessibility product, but the option became available to all Android users in Jelly Bean, first through an accessibility service called BrailleBack, developed by Google, and later through the BRLTTY project, developed by Dave Mielke.

Screen magnification, which made a brief appearance in an earlier TalkBack beta, was cooked into the operating system in 4.1. Low-vision users could seriously zoom in on parts of the screen, pan, and enjoy other features of magnification software, whether a screen reader was being used or not.

Finally, a new gesture for turning on accessibility at the setup screen was introduced in response to the hubbub over Google’s first offering. Now eyes-free users had the option of drawing the infamous rectangle or holding two fingers down on the screen after pressing the power button. While there were a few hitches—such as problems produced by inadvertently placing the fingers on the language item or disruptions caused by manufacturer overlays—the new gesture was well received.

4.2 had more goodies.

The ability to restart accessibility/TalkBack was the most important of these. If the Accessibility Shortcut gesture was enabled in Settings>Accessibility, eyes-free users could restart TalkBack by waking up their devices and holding two fingers on the screen. The significance of this feature is huge, especially when sharing a device with a sighted family member, friend, or main squeeze.

4.3 has lots of beautiful tweaks. Since they’re not listed in one tidy place, eyes-free users are comparing notes to get a full list. Two are worth mentioning at this time:

Accessible text selection has been long awaited. Now eyes-free users working in edit fields can select chunks of text for copying, cutting, and pasting. Previously, text selection was an all-or-nothing deal.

Enhanced Web Access has been the other interesting development. Since their introduction in Honeycomb, the self-voicing JavaScripts have been an option in Settings>Accessibility, which end users have needed to activate. In 4.3, the option has disappeared from the Accessibility screen, suggesting that the scripts are now part of the operating system itself.

Who was your first?

What is Android?

Android is a touch-screen-based operating system (OS) for cell phones and tablets. It includes off-the-shelf accessibility for blind and low-vision users via features like

  • TalkBack, which speaks the content of the screen.
  • Explore by Touch, which provides an accessibility layer so eyes-free users can run one or more fingers over the screen to interact with their devices.
  • BrailleBack, which enables input and output with a Bluetooth braille display.
  • Large fonts, which adjusts font size system wide.
  • Screen magnification, which zooms in on parts of the screen for better viewing.

Though these features also benefit end users with other accessibility needs, this blog focuses on eyes-free and low-vision use.

Eyes-free users shopping for a new device should keep in mind that all devices running pure Android are accessible, but accessibility varies on devices with custom UIs, like Samsung TouchWiz, Motorola Motoblur, and HTC Sense. Of these manufacturer overlays, TouchWiz (in its newer versions) is the most accessible, having features some eyes-free users prefer over stock Android. Motoblur is also relatively unintrusive, becoming less of an issue with each version of Android. Sense is now and has always been the most troublesome, requiring that blind and low-vision users find alternative apps for such basics as the dialer and contacts. Other manufacturers tend to be less disruptive with their custom skins, often doing little more than changing the launcher, which can easily be replaced. So when researching devices, it is important to find out whether a manufacturer UI is part of the installation.

Another thing to take into account is that newer is better when it comes to Android versions. Android accessibility, though very good, still has a few areas that are works in progress. Text selection is one example. Through 4.2, the only way to select chunks of text while editing is to use a physical keyboard. In 4.3, however, it is possible to select text with the touch screen alone. Hence, for the best accessibility experience, eyes-free users should invest in the newest version of Android (4.3 at this time).

This blog covers Jelly Bean, with occasional references to earlier versions of Android. Most posts provide step-by-step instructions on how to perform tasks on a stock/pure Android device. Users of customized phones and tablets will be able to follow along as differences are relatively small.

For information about the accessibility of devices running Gingerbread (2.3) and earlier, visit the Accessible Android blog on Blogger, which is the previous incarnation of this site.