Apps Android
[Narrator:] Native Android apps are typically written in Java or Kotlin. Google’s development environment, Android Studio, is available for Windows, macOS and Linux. Even though other companies offer alternative solutions, the majority of native apps are built with these languages and tools, as they offer the best integration with Android.
In general, a developer does not need to do much to make an app accessible. Many features work out of the box, without changing a single line of program code, because they are implemented at the operating-system level. These include Magnification, Display accommodations such as colour correction and colour inversion, Select to Speak, Voice Access, and Audio and media settings. If these features are new to you, please consult the Assistive technology – Android chapter.
There are two exceptions: TalkBack and Switch Access.
Switch Access
Switch Access lets you interact with your Android device using one or more switches instead of the touchscreen. It can be helpful for people who have impaired dexterity that prevents them from interacting directly with the Android device.
Switch Access scans the items on your screen, highlighting each one in turn until you make a selection. Switch Access requires that every element intended for user interaction is focusable; otherwise, a user cannot operate it. Make sure views that require gestures can also be operated without performing the gesture. If an app implements a book reader whose pages can only be turned by swiping, a Switch Access user will not be able to turn the pages. The problem can be solved by implementing an additional click handler, as in the sketch below.
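A minimal sketch of such a fallback in Kotlin; the view and the page-turning function are hypothetical names, not part of any real app:

```kotlin
// Hypothetical book-reader view whose pages normally turn via swipe gestures.
// A plain click handler gives Switch Access users an alternative way to
// trigger the same action.
pageView.setOnClickListener {
    turnToNextPage() // assumed app-specific function that performs the page turn
}
```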
The same requirements apply to TalkBack. As TalkBack demands more attention from the developer, the following sections focus on how to improve accessibility for TalkBack.
TalkBack
TalkBack is a screen reader built into Android. A screen reader is designed for users with visual impairments. If you don’t know what a screen reader is, you can learn more about it in the Screen readers – Android chapter.
A screen reader reads text aloud to the user, but what if the text on the screen is not enough to make the app usable? There may also be buttons or images that require alternative descriptions. In this case, the screen reader reads the content descriptions of the graphical user-interface elements. In the next section, we will see a brief example of how to make an app accessible.
We are using API level 31 for Android 12, which was the latest API level at the time this course was created.
Please note that this chapter is not a substitute for a complete programming course for Android apps. Please refer to Google’s documentation to see how to make an app accessible.
Content labels
The most basic application scenario for an accessibility attribute is the labelling of an image. A few lines of XML layout code generate an ImageView to be loaded with the content of an image file. If the TalkBack cursor hits this image, TalkBack will try to figure out what to tell the user; as there is no text to describe the image, TalkBack will ignore it. By adding an android:contentDescription line, as in the sketch below, TalkBack can offer a description of the image. In our example, the actual text is a localised string, fetched from the strings.xml file to support multiple languages.
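A minimal sketch of such a layout; the ID, drawable and string names are illustrative, and the last attribute is the added description:

```xml
<!-- The android:contentDescription line gives TalkBack something to
     announce; the string is resolved from strings.xml. -->
<ImageView
    android:id="@+id/finger_image"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:src="@drawable/finger"
    android:contentDescription="@string/finger_description" />
```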
Please note that you do not have to mention the element type in the description; for example, “Photo of a finger”. TalkBack will always tell the user what kind of user-interface element has been found. The same principle applies to an ImageView created programmatically.
In the example below, we create an ImageView in Kotlin code. By setting the contentDescription property, we attach a label that enables TalkBack to describe the image. Adding a contentDescription works fine for ImageViews, ImageButtons, CheckBoxes and other views that convey information graphically.
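A minimal sketch of the programmatic variant, assumed to run inside an Activity; the resource names are illustrative:

```kotlin
// Create the ImageView in code and attach the label TalkBack will read.
val imageView = ImageView(this).apply {
    setImageResource(R.drawable.finger)
    // Fetched from strings.xml so the description can be localised.
    contentDescription = context.getString(R.string.finger_description)
}
```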
For EditText or TextView elements, the text itself serves as the contentDescription. For text fields, use an android:hint attribute to indicate the purpose of the input. Often, an EditText element is placed next to a TextView describing the required input; in this case, give the TextView an android:labelFor attribute to indicate that it acts as the content label for the EditText view, as in the sketch below.
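A minimal sketch of this pairing; the IDs and string names are illustrative:

```xml
<!-- The TextView labels the EditText below it via android:labelFor. -->
<TextView
    android:id="@+id/name_label"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:text="@string/name_label"
    android:labelFor="@+id/name_input" />

<!-- The hint describes the purpose of the text field. -->
<EditText
    android:id="@+id/name_input"
    android:layout_width="match_parent"
    android:layout_height="wrap_content"
    android:inputType="text"
    android:hint="@string/name_hint" />
```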
Ignore elements
Decorative images, and images that convey no meaningful information, do not require content labels. In such cases, set the android:contentDescription attribute to “@null”, as in the XML sketch below; in Kotlin code, the equivalent is assigning null to the view’s contentDescription property.
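A minimal sketch; the drawable name is illustrative:

```xml
<!-- A purely decorative image that TalkBack should skip. -->
<ImageView
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:src="@drawable/ornament"
    android:contentDescription="@null" />
```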
As we have seen before, the contentDescription attribute cannot be used for TextViews. For TextView elements, set the android:importantForAccessibility attribute to a value of “no”. Please note that, even though our examples show both XML layout and Kotlin code, it is sufficient to use only one of them to achieve the desired function.
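A minimal sketch for a decorative TextView; the string name is illustrative:

```xml
<!-- TalkBack will skip this purely decorative text. -->
<TextView
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:text="@string/decorative_text"
    android:importantForAccessibility="no" />
```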
Android offers many more possibilities for improving accessibility. For a better understanding, we will use an example.
A basic example
Here we have a little demo app. It shows a scrolling list of recipes. Each item shows an image, a title and an evaluation indicator drawn as a row of hearts. When we touch an item, a detail screen presents the recipe’s title and a slider that we can use to mark how much we like the recipe.
In Android Studio we have created three layouts. The first one defines the layout of the main activity, presenting the scrolling menu. This layout contains the RecyclerView, which is filled with the data from all recipes. Every item in the RecyclerView uses a layout of its own; here we define the positions of the image, the recipe title and the evaluation hearts. The third file defines the layout of the detail activity, presenting the data for a selected recipe.
Here we find the placeholders for the ImageView and a TextView for the question. Below we find a horizontal layout containing the thumb icons and the slider. Finally, we have the list of ingredients.
Let’s look at the Kotlin code. The Recipe class offers members for the image, the title and the evaluation. As the evaluation is an Integer, we create an evaluation string containing as many hearts as the evaluation indicates. We inherit from the Application class to create a companion object, which gives us a place to store global values.
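A minimal sketch of what the Recipe class might look like; only the members mentioned here are assumed, and the Application subclass holding the global data is omitted:

```kotlin
// Recipe data: an image resource, a title and an integer evaluation.
class Recipe(val image: Int, val title: String, var evaluation: Int) {
    // Builds a string with as many hearts as the evaluation indicates.
    val evaluationString: String
        get() = "🧡".repeat(evaluation)
}
```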
For demonstration purposes, we store all the recipe example data here, in a list of Recipe objects, along with a function for changing the evaluation of a recipe. The MainActivity file is very basic: it just loads the RecyclerView. The RecipeListViewAdapter loads the layout of a list entry.
For every data entry, a list item is filled with the image, the title and the evaluation string. If the user touches an item in the list, a click listener navigates to the DetailActivity. By passing the list index position to that activity, the DetailView can detect which list item the user has selected.
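A minimal sketch of such a click listener; DetailActivity comes from the demo, while the extra key is illustrative:

```kotlin
// Navigate to the detail screen, passing the index of the touched item.
holder.itemView.setOnClickListener { view ->
    val intent = Intent(view.context, DetailActivity::class.java)
    intent.putExtra("position", position) // list index of the selected recipe
    view.context.startActivity(intent)
}
```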
The RecipeListViewHolder implements the interface between the layout of a list item and the actual data. Finally, we come to the DetailView. This code fetches the layout, searches for the correct recipe and assigns its data to the different views.
The slider uses a listener, which is triggered whenever the user changes the slider value. In this case, we update the evaluation value of the global recipe object.
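A minimal sketch of the listener, assuming the slider is a SeekBar; the update function and index variable are illustrative:

```kotlin
// React to user-driven slider changes and store the new evaluation.
slider.setOnSeekBarChangeListener(object : SeekBar.OnSeekBarChangeListener {
    override fun onProgressChanged(bar: SeekBar, value: Int, fromUser: Boolean) {
        if (fromUser) updateEvaluation(recipeIndex, value)
    }
    override fun onStartTrackingTouch(bar: SeekBar) {}
    override fun onStopTrackingTouch(bar: SeekBar) {}
})
```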
We start TalkBack to see how the app works for a visually impaired user.
[Screen reader:] Curry rice. Orange heart. Orange heart. Orange heart. 1 of 12, in list 12 items. Tap to activate. Hamburger, Orange heart. Orange heart. 2 of 12. Hotdog. Orange heart. Orange heart. Orange heart. Orange heart. Orange heart. 3 of 12. Hamburger, Orange heart. Orange heart. 2 of 12. Tap to activate.
[Narrator:] The number of hearts indicates how much we like a recipe, but TalkBack does not present its meaning. TalkBack does not detect the image. We do not like the order in which TalkBack speaks these elements. We would like the title to be spoken first, then the image description and finally, the evaluation indicator.
[Screen reader:] Hamburger. Navigator button. Out of list. Tap to activate. Hamburger. How do you like this recipe? Thumbs down. 25%, slider. Swipe up or swipe down to adjust. 50%, slider. 75%, slider. Thumbs up.
[Narrator:] There is no description for the large image. The functionality of the slider is not easy to understand. We need to fix this.
[Screen reader:] Navigator button, window Hamburger. Demo. Curry rice. Orange heart. Orange heart. Orange…
[Narrator:] We open the ActivityDetail layout. The thumb icons do not provide any added value. We want TalkBack to ignore those symbols. Therefore, we add an importantForAccessibility attribute with the value “no”.
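A minimal sketch of the change; the ID and drawable are illustrative:

```xml
<!-- Decorative thumb icon, now hidden from TalkBack. -->
<ImageView
    android:id="@+id/thumb_down"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:src="@drawable/thumb_down"
    android:importantForAccessibility="no" />
```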
We want a better description of the slider’s functionality, which we achieve by adding a contentDescription. The question above the slider already describes what it does, so we also give that TextView a labelFor attribute referencing the ID of the slider.
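A minimal sketch of both additions; the IDs and string names are illustrative, and the slider is assumed to be a SeekBar:

```xml
<!-- The question labels the slider below it. -->
<TextView
    android:id="@+id/question"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:text="@string/like_question"
    android:labelFor="@+id/evaluation_slider" />

<SeekBar
    android:id="@+id/evaluation_slider"
    android:layout_width="match_parent"
    android:layout_height="wrap_content"
    android:contentDescription="@string/slider_description" />
```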
In the Recipe class, we add a label for the image, to be read once the TalkBack cursor reaches the image. The evaluation also needs to be converted into understandable text, which we store in the evaluationDescription variable; a simple function converts the integer value into a text string. To activate these descriptions, we open the RecipeListViewAdapter code.
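A minimal sketch of the conversion; the thresholds are guesses based on the phrases TalkBack announces later (“bad”, “so-so”, “excellent”):

```kotlin
// Turn the integer evaluation into text a screen-reader user can understand.
fun evaluationDescription(evaluation: Int): String = when {
    evaluation <= 2 -> "bad"
    evaluation <= 4 -> "so-so"
    else -> "excellent"
}
```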
We set the label as the contentDescription of the image. As we cannot set a contentDescription for a TextView, we use the “hint” attribute instead. In the DetailActivity view, we reuse the image label as the contentDescription of the large recipe image.
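A minimal sketch of these adapter changes; the view holder property names are illustrative:

```kotlin
// Image views take a contentDescription; TextViews take a hint instead.
holder.image.contentDescription = recipe.label
holder.evaluation.hint = evaluationDescription(recipe.evaluation)
```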
Let’s try it with TalkBack.
[Screen reader:] A plate of curry rice with carrots and peas. Curry rice. Orange heart. Orange heart. Orange heart. So-so. 1 of 12, in list 12 items. Tap to activate. A hamburger with tomato, cheese, ham and salad. Hamburger. Orange heart. Orange heart. Bad. 2 of 12. A hotdog with bread, sausage and mustard. Hotdog. Orange heart. Orange heart. Orange heart. Orange heart. Orange heart. Excellent. 3 of 12. A hamburger with tomato, cheese, ham and salad. Hamburger. Orange heart. Orange heart. Bad. 2 of 12. Hamburger. Navigator button. Out of list. Tap to activate. Hamburger. A hamburger with tomato, cheese, ham and salad, image. How do you like this recipe? Indicate how much you like this recipe by increasing or decreasing the value. Slider for How do you like this recipe? Tap to activate. 25%. Indicate how much you like this recipe by increasing or decreasing the value, slider. Swipe up or swipe down to adjust. 50%. Indicate how much you like this recipe by increasing or decreasing the value, slider. 75%. Indicate how much you like this recipe by increasing or decreasing the value, slider. Navigator button. A plate of curry rice with carrots and peas. Curry rice. Orange heart. Orange heart. Orange heart.
[Narrator:] That’s much better, but still not good. In the main menu, TalkBack treats the text hint as additional information rather than a replacement for the list of hearts. In addition, TalkBack does not speak the list elements in the desired order.
We make one final modification. In the RecipeListViewAdapter, we remove the previous modifications and instead treat the complete list item view as one single element, giving it a contentDescription of its own. In this way, we can assemble the elements in any order we want.
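A minimal sketch of the final approach; the property names are illustrative, and the wording matches what TalkBack announces in the next test:

```kotlin
// One description for the whole list item, assembled in the desired order:
// title first, then the image description, then the evaluation.
holder.itemView.contentDescription =
    "${recipe.title}. ${recipe.label}, image. " +
        "Your evaluation: ${evaluationDescription(recipe.evaluation)}."
```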
One more test.
[Screen reader:] Curry rice. A plate of curry rice with carrots and peas, image. Your evaluation: so-so. 1 of 12, in list 12 items. Hamburger. A hamburger with tomato, cheese, ham and salad, image. Your evaluation: bad. 2 of 12. Hotdog. A hotdog with bread, sausage and mustard, image. Your evaluation: excellent. 3 of 12. Tap to activate.
[Narrator:] That’s much better.
Accessibility testing
The Android development environment offers many ways to test your program for accessibility. You can, of course, do manual testing, which is what we have done so far: we started TalkBack and navigated through our program. Can we identify all of the visual elements acoustically? Can we navigate to all screens? Can we operate all of the functions? Try the same with other accessibility features, such as Switch Access.
Android Studio offers built-in accessibility checks, which can help detect accessibility issues. Here we have opened the DetailView layout, in the program version before our code corrections. When switching from code view to design view, we notice several warnings relating to different user-interface elements. One hint notifies us that the slider has no content description; we can unfold the hint to get an explanation of the problem, including a proposed fix. The next remark indicates insufficient colour contrast; as we do not have real text here, just symbols for illustration purposes, we ignore this warning. Below this, the editor informs us about a missing content description for the image. We also ignore the warnings about hard-coded texts, as this is just a demo program; texts should of course be made available in multiple languages, but we skip this for our demonstration.
Accessibility Scanner is a tool created by Google that suggests accessibility improvements for Android apps, such as enlarging small touch targets, increasing contrast and providing content descriptions, so that individuals with accessibility needs can use your app more easily. You can download Accessibility Scanner from the Google Play Store. After you have installed Accessibility Scanner, navigate to Settings and then Accessibility. Then turn on Accessibility Scanner. The first time you turn it on, the system will ask for several permissions. You need to allow Accessibility Scanner to have full control of your device. In addition, you need to allow Accessibility Scanner to display its user-interface elements over the screens of other apps.
Once this is done, we can switch to our app. Notice that Accessibility Scanner creates a floating blue action button, which is overlaid on top of any content on the screen. To move the button to another area of the screen, long-press it and drag it. Tap the button to launch a scan. Accessibility Scanner then examines the user interface of your screen, performs a quick accessibility audit and prepares suggestions for accessibility-related improvements: it highlights the views that may have accessibility issues and offers suggestions on how you can fix them.
The Android platform supports several testing frameworks, including Espresso, which allow you to create and run automated tests that evaluate the accessibility of your app. Espresso is an Android testing library targeted at developers who make automated testing an integral part of the development life cycle. As Espresso can do much more than accessibility tests, and because Espresso tests can be complex, we will not demonstrate a complete test here; a brief sketch of merely enabling its accessibility checks appears below. Last but not least, why not invite potential users to test your app? Along with the other testing methods, user testing can provide specific and valuable insights about the usability of your app.
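A minimal sketch of enabling Espresso’s accessibility checks, assuming the androidx.test espresso-accessibility artifact is on the test classpath; the test class name is illustrative:

```kotlin
import androidx.test.espresso.accessibility.AccessibilityChecks
import org.junit.BeforeClass

class RecipeAccessibilityTest {
    companion object {
        @BeforeClass
        @JvmStatic
        fun enableAccessibilityChecks() {
            // Every view interaction performed by Espresso will now also run
            // the Accessibility Test Framework checks on the view hierarchy.
            AccessibilityChecks.enable().setRunChecksFromRootView(true)
        }
    }
}
```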
Testing using a device
As the final goal is for the app to run on a real device, we should always test on real devices. There we can evaluate all the settings and operations our app offers, running them just as users will experience them. It is also the only way to test whether your app can access and use device-specific functions such as the accelerometer or geolocation.
Be sure to test the app on more than one device. The screen size or the operating-system version may make a difference. For practical tests, you need to know how the accessibility features of the operating system work. If you are not sure, please refer to the Assistive technology – Android chapter. If you do not know how to operate the screen reader, please consider studying the Screen readers – Android chapter.
Where to continue?
You have now been introduced to ways to improve the accessibility of Android apps. In this chapter, we have just scratched the surface of Android accessibility programming. Please refer to Google’s documentation for a more detailed explanation.
Depending on your personal interests, you could continue with one of the following chapters:
- Apps iOS
- Apps cross-platform
[Automated voice:] Accessibility. For more information visit: op.europa.eu/web/accessibility.