Testing as a sighted person with iOS VoiceOver Screen Reader
Testing websites and apps for screen reader compatibility is not easy for sighted people. This article offers a short guide. Strictly speaking, only iOS apps can be tested this way, but VoiceOver can of course also be used for web applications or in editorial work. There is a separate guide to Android TalkBack for sighted users.
- What is VoiceOver?
- What does VoiceOver change?
- App testing
- Testing the content
- More on screen readers
What is VoiceOver?
VoiceOver is a screen reader built into all iPads and all iPhones from the iPhone 3GS onward. It comes with speech output of sufficient quality. The following features make it an ideal test environment for sighted people:
- As already mentioned, it is available on all devices and requires no additional installation.
- It is comparatively intuitive to use. Desktop screen readers require a considerable amount of learning, especially for sighted people, since you have to remember countless key combinations.
- A particular benefit of VoiceOver is the rectangle used to highlight the focused object. This lets a sighted tester see which object has focus and check whether what is spoken matches what they see and what should be there.
From this point of view, VoiceOver can be used for both content and development purposes.
Another feature, which funnily enough I have only just discovered, makes it even more interesting: VoiceOver can also be controlled largely from a keyboard, for example an external Bluetooth keyboard. This gives you access to the classic workflow of a desktop screen reader as well.
What does VoiceOver change?
When VoiceOver is turned on, the way you use the iPhone changes. First, each focused object is spoken aloud. Second, a double tap rather than a single tap is required to start an application or trigger an action: a single tap announces the object or function, a double tap activates it. Scrolling is done with three fingers instead of one. To operate a slider, you first focus it, then hold it for a moment until you hear audio feedback, and then move it. To move from one object to the next, you swipe a finger from left to right or from right to left.
An important feature is the rotor, which controls what the vertical swipe (top to bottom or bottom to top) does. The rotor is operated by placing two fingers on the display and rotating them clockwise or counterclockwise. This lets you choose, for example, whether the vertical swipe changes the speaking rate or jumps from heading to heading, and so on.
So much for the basic operation. VoiceOver can be found under Settings -> General -> Accessibility.
App testing
I am assuming that your developers have followed Apple's accessibility guidelines; otherwise you can skip the testing altogether. Testing apps is relatively simple. Essentially you have to check:
- whether all elements can be reached with the swipe gesture, and in the correct order, i.e. from left to right and from top to bottom
- whether all elements are announced with the correct role: input fields, buttons, sliders, text, graphics with descriptions for blind users, and so on
- whether all elements reachable with VoiceOver are operable and, where applicable, fillable.
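Whether an element is announced correctly depends on the accessibility attributes the developers set. As a rough sketch, a correctly prepared UIKit screen might look like this (the class and element names are illustrative; only the UIAccessibility API calls are real):

```swift
import UIKit

final class SettingsViewController: UIViewController {
    // Hypothetical elements for illustration.
    let saveButton = UIButton(type: .system)
    let volumeSlider = UISlider()
    let logo = UIImageView()

    override func viewDidLoad() {
        super.viewDidLoad()

        // Announced as "Save, button" instead of an unlabeled element.
        // UIButton already carries the button trait by default.
        saveButton.setTitle("Save", for: .normal)
        saveButton.accessibilityLabel = "Save"

        // Announced as adjustable; once focused, VoiceOver users
        // change its value with vertical swipes.
        volumeSlider.accessibilityLabel = "Volume"

        // A description for blind users. A purely decorative image
        // should instead set isAccessibilityElement = false.
        logo.isAccessibilityElement = true
        logo.accessibilityLabel = "Company logo"
        logo.accessibilityTraits = .image

        // Force the left-to-right, top-to-bottom reading order
        // if the automatic order does not match the visual layout.
        view.accessibilityElements = [logo, saveButton, volumeSlider]
    }
}
```

If swiping through the app skips an element, announces it without a role, or reads the elements in the wrong order, one of these attributes is usually missing or wrong.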
Testing the content
It is of course necessary that your content is accessible via the web, or that the content management system displays it almost fully rendered.
Here you can check whether all graphics have alternative text by touching each graphic with a finger, use the swipe gesture to check whether all headings are marked up correctly, and so on. You may need the rotor mentioned above for this. Alternatively, you can work with an external keyboard; a list of keyboard shortcuts is available.