The JetRuby team cares about the users of our products. That’s why we pay close attention to the accessibility of our interfaces. This article is dedicated to the tools we use to ensure a comfortable user experience when developing Android apps.
Android Accessibility Scanner
Google has developed a tool to make Android application interfaces more accessible to users with disabilities. It’s called Accessibility Scanner: it scans the graphical user interface, displays a description of the accessibility issues it finds, and gives recommendations for fixing them. For instance, it may suggest making controls bigger, adding text labels to them, increasing image contrast, or changing a font. All of that improves the usability and accessibility of the interface.
Specifically, the scanner checks the following components:
- Content labels;
- Touch target size;
- Clickable elements;
- Text and image contrast.
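The last item on the list, contrast, is defined by the WCAG 2.x formula that tools like Accessibility Scanner rely on. Below is a minimal, pure-Java sketch of that math; the class and method names are our own, not part of any Android API.

```java
// Sketch of the WCAG 2.x contrast-ratio calculation. Class and
// method names are illustrative, not an Android or Google API.
public class ContrastCheck {
    // Linearize an 8-bit sRGB channel per the WCAG definition.
    static double linearize(int channel) {
        double c = channel / 255.0;
        return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
    }

    // Relative luminance of an RGB color.
    static double luminance(int r, int g, int b) {
        return 0.2126 * linearize(r) + 0.7152 * linearize(g) + 0.0722 * linearize(b);
    }

    // Contrast ratio between two colors: (L1 + 0.05) / (L2 + 0.05), where L1 >= L2.
    static double contrastRatio(int[] fg, int[] bg) {
        double l1 = luminance(fg[0], fg[1], fg[2]);
        double l2 = luminance(bg[0], bg[1], bg[2]);
        double lighter = Math.max(l1, l2), darker = Math.min(l1, l2);
        return (lighter + 0.05) / (darker + 0.05);
    }

    public static void main(String[] args) {
        // Black text on a white background: the maximum possible ratio, 21:1.
        double ratio = contrastRatio(new int[]{0, 0, 0}, new int[]{255, 255, 255});
        System.out.printf("contrast = %.1f:1 (WCAG AA needs 4.5:1 for normal text)%n", ratio);
    }
}
```

WCAG AA requires at least 4.5:1 for normal text and 3:1 for large text, which is why mid-gray text on white is a common scanner finding: #777777 on white comes out just under 4.5:1.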
You can use Accessibility Scanner on smartphones and tablets running Android 6.0 or higher, and it’s available for download on Google Play.
How it works
Accessibility Scanner requires no special technical skills and is recommended, among other things, for ordinary users, who can generate a report on a problematic interface and send it to the developer. Our developers, though, usually run the tool themselves, since its results may look unintelligible to someone unfamiliar with app development.
From a technical point of view, Accessibility Scanner is a so-called accessibility service, or an application that runs in the background and interacts with the Android OS accessibility API in order to provide additional functionality for users with disabilities.
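As a rough illustration of what such a service looks like, an accessibility service is declared to Android through an XML configuration resource (the attribute names below are real Android ones; the file path and string resource are hypothetical):

```xml
<!-- res/xml/accessibility_service_config.xml (hypothetical file name).
     The service declares which events it wants to receive and how it gives
     feedback; canRetrieveWindowContent lets it inspect the UI tree, which
     is what a scanner-style tool needs. -->
<accessibility-service xmlns:android="http://schemas.android.com/apk/res/android"
    android:accessibilityEventTypes="typeAllMask"
    android:accessibilityFeedbackType="feedbackGeneric"
    android:canRetrieveWindowContent="true"
    android:description="@string/accessibility_service_description" />
```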
Once you open the interface that needs testing, the scanner sequentially describes all the issues it finds and offers options for fixing them. You can also display all the issues in a single list and send it as a report by email.
Accessibility Scanner is a great tool for basic testing, but it has a significant drawback: it only detects obvious issues, such as small text, undersized touch targets, missing image labels, and so on. In more complex cases it’s not as effective, which is why it can’t replace real manual testing.
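The touch-target check mentioned above boils down to the Material Design recommendation of a minimum 48x48dp target. Here is a pure-Java sketch of that rule; the class, the helper names, and the check itself are illustrative, not part of the Android SDK.

```java
// Minimal sketch of the 48x48dp touch-target rule that Accessibility
// Scanner warns about. Names and the helper are illustrative only.
public class TouchTargetCheck {
    static final int MIN_TARGET_DP = 48; // Material Design recommendation

    // dp -> px conversion: px = dp * screen density factor.
    static int dpToPx(int dp, float density) {
        return Math.round(dp * density);
    }

    // A touch target passes if both sides are at least 48dp.
    static boolean isTargetLargeEnough(int widthPx, int heightPx, float density) {
        int minPx = dpToPx(MIN_TARGET_DP, density);
        return widthPx >= minPx && heightPx >= minPx;
    }

    public static void main(String[] args) {
        float density = 2.0f; // e.g. an xhdpi screen, 2 px per dp
        System.out.println(isTargetLargeEnough(96, 96, density)); // 48dp x 48dp -> true
        System.out.println(isTargetLargeEnough(96, 64, density)); // only 32dp tall -> false
    }
}
```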
TalkBack
TalkBack is Android’s built-in screen reader. When TalkBack is on, users can interact with their Android-powered device without seeing the screen. Users with visual impairments might rely on TalkBack when using our applications, so it’s important to make sure there aren’t any issues.
How it works
People with vision impairments use their fingers to “explore” the interface, and TalkBack describes whatever they touch. For text (including things like the time and notifications), the screen reader reads out exactly what’s written on the screen. For clickable elements, TalkBack explains what the button is and lets people act on it with a double tap, or move to the next element without triggering anything. It’s well thought out, letting people with visual impairments rely on audible prompts to do practically anything on their smartphone.
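The behavior above can be sketched as a toy model: prefer an explicit content description, fall back to the visible text, and flag unlabeled elements. This is a deliberate simplification we wrote for illustration; real TalkBack logic works over the accessibility node tree and is far richer.

```java
// Highly simplified model of how a screen reader picks what to speak
// for a UI element. Illustrative only, not actual TalkBack behavior.
public class SpokenLabel {
    static String announcementFor(String contentDescription, String text, boolean clickable) {
        String label = (contentDescription != null && !contentDescription.isEmpty())
                ? contentDescription
                : (text != null && !text.isEmpty()) ? text : "unlabeled";
        // Clickable elements get a role hint so users know a double tap acts on them.
        return clickable ? label + ", button, double tap to activate" : label;
    }

    public static void main(String[] args) {
        System.out.println(announcementFor("Search", null, true));
        System.out.println(announcementFor(null, "12:30", false));
        System.out.println(announcementFor(null, null, true)); // the case a scanner flags
    }
}
```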
What we test for
- Are all the elements properly labeled, allowing TalkBack to read them to the user?
- Are notifications or popup windows being read to the user?
- Can users swipe through a page to navigate and explore every element on the page?
- Are users able to use the double-tap feature to randomly explore the application or pick specific elements to explore?
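The first two checklist items come down to labeling. A hypothetical layout fragment shows how that is supplied in Android (the attribute names are real Android ones; the IDs, drawables, and strings are made up for this example):

```xml
<!-- Hypothetical layout fragment: the ImageButton has no visible text,
     so contentDescription gives TalkBack something to announce. -->
<ImageButton
    android:id="@+id/share_button"
    android:layout_width="48dp"
    android:layout_height="48dp"
    android:src="@drawable/ic_share"
    android:contentDescription="@string/share" />

<!-- Purely decorative images should instead be hidden from TalkBack. -->
<ImageView
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:src="@drawable/divider"
    android:importantForAccessibility="no" />
```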
Testing the accessibility of an app reveals how easy it is to navigate, access, and comprehend its content.
Keeping the Web Content Accessibility Guidelines (WCAG) in mind, we perform both manual and automated accessibility testing. To avoid pitfalls, we usually incorporate accessibility testing in the early stages of the Software Development Life Cycle.
Interested in learning more development tips? Follow our blog and don’t miss our new articles!