The Role of Assistive Technology in Accessibility

The discussion about accessibility focuses heavily on accessible development, but overlooks two equally important factors: the client and, of course, the assistive technology.

Client

The client is the layer that lies between the user and the content. In a broader sense, it is the computer or other end device such as a smartphone, tablet or kiosk terminal. In a narrower sense, the platform matters less than the software running on it: the browser, programme or app in use usually plays the key role.

Assistive technology

If the client supports the necessary accessibility APIs and the content provides the necessary information, assistive technology takes on the key role in making content accessible. If the assistive technology cannot process the information provided by the API, accessibility breaks down.
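This chain - content provides semantics, the platform API exposes them, the assistive technology voices them - can be illustrated with a deliberately simplified sketch. The functions below are illustrative stand-ins, not any real accessibility API; they model only a tiny fragment of how an accessible name reaches a screen reader.

```python
# Much-simplified model of the chain described above: the content provides
# semantics, the accessibility API exposes them, and the assistive
# technology turns them into output. Purely illustrative, not a real API.

def accessible_name(element):
    """Return the name an accessibility API could expose for an element."""
    # An author-provided label wins (roughly like aria-label in HTML).
    if element.get("aria-label"):
        return element["aria-label"]
    # Otherwise fall back to the element's visible text content.
    if element.get("text"):
        return element["text"]
    # No information provided: the API has nothing to expose.
    return ""

def screen_reader_announce(element):
    """What a screen reader could announce from the exposed name and role."""
    name = accessible_name(element)
    role = element.get("role", "generic")
    if not name:
        # The assistive technology cannot invent missing information.
        return f"unlabeled {role}"
    return f"{name}, {role}"

print(screen_reader_announce({"role": "button", "text": "Save"}))
print(screen_reader_announce({"role": "button", "aria-label": "Close dialog"}))
print(screen_reader_announce({"role": "button"}))
```

The point of the sketch is the last case: if the content supplies nothing, neither the API nor the screen reader can repair it.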

For example, should you push screen reader users to use a more up-to-date browser and a more up-to-date screen reader? The answer to both questions is yes. The simple reason: if you don't push them, they won't do it. As long as everything I want to do works with my ancient configuration, I see no reason to touch anything.

Technical conservatism among the blind

In an older article, I wrote that accessibility often fails not because of the websites, but because of the assistive technology used. Around 2009 I was still using a notebook running Jaws 6.4 and Internet Explorer 6. For the non-experts: that was totally outdated even then, but for me it was crucial that the software worked. The laptop itself was ancient and would have struggled with an update. My current laptop runs Jaws 9, which is no longer fresh either. The current version of NVDA runs in parallel. Another truly ancient laptop runs only NVDA.

Since hardly any blind person can afford new versions of commercial Windows screen readers, many blind people use outdated technology. Of course, this is also because Windows XP has been on the market for so long that a general screen reader update was never strictly necessary. Windows XP was launched in 2001, and every screen reader released since then supports it. So in the worst case, a screen reader in use today could be ten years old. And some people are definitely still using even older versions of Windows. I don't need to tell you what that means in terms of security.

Five or ten years ago, it might have been OK to update your system once and then use it until it broke and you had to buy a new one anyway. But most blind people don't seem to realise that they cannot participate in the innovations of the web - however accessible or not - if they don't use up-to-date browsers and screen readers.

Affordable technology

The most important reason not to update has disappeared with cheap or free screen readers like VoiceOver, Orca or NVDA. If someone seriously tells me that a user can't fill in a form because their screen reader is too old, I'd like to reply that nobody running at least Windows XP needs to use an outdated screen reader today. The alternative is: take it or leave it - update, or do without new features and improvements. There is no ARIA and no HTML5-based YouTube for Jaws 6.4 and Internet Explorer 6; you can be annoyed as long as you like, it won't change anything.

Of course, those affected can use their old screen reader as long as they like. But then it is no longer legitimate to complain about websites not working. The guidelines and regulations point out that older assistive technology should also be supported, but that cannot mean supporting ten-year-old assistive technology on a cutting-edge site. Otherwise the whole thing degenerates into a Sisyphean task: how many versions of Jaws, Window-Eyes, COBRA and so on are to be supported, and who wants to remember which version supports which features? Do we seriously expect a developer not only to be familiar with screen readers, but also to know the various bugs of the different products and their individual versions over the last five years?

The effort required to switch does not seem to me to be a good counter-argument either. I have been working with Word for about 15 years and with various versions of Jaws for about 10 years, and I can safely say that I do not know most of the functions or shortcuts by heart; many functions I do not know at all. For the everyday operation of a computer, a handful of keyboard shortcuts and the programme's documentation are enough.

If it's important to you, you can customise NVDA to respond to roughly the same keyboard shortcuts as Jaws. Besides, you had to learn your screen reader once anyway, and you don't have to switch completely right away - you can do it gradually and use both programmes in parallel at first.

So if you want to participate in the new technical or digital possibilities, you have no alternative but to update your programmes. It certainly won't get easier if you wait longer.

Incidentally, I don't believe that this reluctance to adapt has anything to do with disability. Usually it's convenience; after all, there are enough non-disabled people who are also still running ancient configurations.

Karl Groves rightly noted that assistive technology does not make a website accessible. But outdated assistive technology can very well prevent an accessible website from being used comfortably.

Windows 8 could also be very interesting for blind people, although the Metro interface is very visually oriented.

If reasonable touch support has been or will be implemented in a similar way to iOS, completely new application scenarios for blind people become conceivable. Apps on iOS are nice, but a real desktop program is sometimes indispensable. Drag and drop, for example, would make it much easier for blind people to use Web 2.0 applications; tables in Excel would become more accessible, presentations could be created more easily, and so on. A large display allows exploration and control with both hands - there is great potential for innovation here, not only for the blind.

Assistive technology as a limiting factor

We talk a lot about accessibility, but rarely about the tools available. On the one hand, a lot has happened thanks to mobile computers and smartphones and the constant availability of the internet. On the other hand, we still have to spend a lot of money on the simplest things because the health insurance companies have more or less withdrawn from providing aids. It is not only the environment that holds barriers; the assistive technology itself is often a barrier.

Even ten years ago, it wasn't easy to get around with outdated programs and ancient screen readers. Today it is not only inconvenient, it actively hinders us. Often enough, it is not the internet technology that is the blocking element, but the assistive technology that cannot do this or that and thus makes an accessible solution inaccessible.

The innovations take place elsewhere

Many innovations in assistive technology do not come from classic aid manufacturers. The market leaders for screen readers and screen magnifiers, for example, consider it innovative to launch a version for a new operating system. The situation is similar with hearing aids.

Apple's accessibility features have made using the internet and the environment much easier for many blind people - and other disability groups. Microsoft has made a contribution with Kinect without actually having intended it for accessibility. Unless I'm mistaken, there are still no real electronic aids for people with learning disabilities - feel free to correct me.

Accessibility and assistive technology will become more and more important in the future. On the one hand, people are getting older, and disabilities inevitably increase with age. On the other hand, it is becoming increasingly difficult to get funding for assistive technology. In addition, only a few people are willing or able to familiarize themselves with complex technology.

Promoting a culture of innovation

Many innovations today do not come from large companies, but from young startups. Internet startups in particular get a lot of attention.

A relatively new trend is startups in the social sector. They are based on the social business idea.

Companies in the social business field operate according to the usual rules of the market, but are dedicated to solving social or ecological problems. The classic example is the Grameen Bank microcredit scheme in Bangladesh. A number of startups are now also addressing such problems.

Many startups emerge from university research. Government-funded research is often criticized when it is turned into commercial products. But that is still better than usable solutions gathering dust in university drawers because nobody ever hears about them.

The market for assistive technology is larger than it appears at first glance. Voice input, for example, is not primarily aimed at people with motor disabilities - otherwise it would cost an estimated 20 times more. The fact that it helps them anyway is a nice side effect. A Swedish company is working on eye-tracking systems for everyone - another technology that will be interesting, and probably affordable, for people with limited mobility.

Above all, it is important that the communities start articulating their interests loudly. Nobody develops a solution for a problem they don't know exists.

Is open source the future?

In no way do I want to belittle the work of the open source movement. The screen reader NVDA, for example, is one of the best things that has happened to blind people in recent times, along with Apple's VoiceOver. We tend to forget how much better off we are compared to people in poor countries who can't even afford their own computer. For them, there is hardly anything other than open source solutions, because the commercial solutions of the West are unaffordable.

But open source always struggles with two problems: too little money and too little manpower. We certainly need open source, but we also need commercial solutions where no open source product has become established. And services, which we also need, are difficult to implement as open source.

Take VerbaVoice, for example, a service provider specializing in converting spoken language into text for deaf and hard of hearing people. Serotek offers its System Access screen reader for $400 instead of the roughly $1,000 charged by the market leader, Freedom Scientific. Incidentally, the German version of Jaws costs 2,600 euros - nobody knows why.

Another project wants to build a £300 braille display. Dirt cheap when you consider that hardly any braille display costs less than 5,000 euros.

Aids must be affordable for everyone

The Jaipur Foot was developed in India 40 years ago - a prosthesis for people who have lost part of a leg, costing a mere $40. Unfortunately, I can't say what the high-tech prostheses used by athletes with disabilities cost, but they won't be cheap.

In the blind sector, aid manufacturers have become accustomed to selling their products at high prices: either the blind customers had the money, or the health insurance paid for the products. With the rise of smartphones, they have come under pressure. A smartphone or tablet can serve, for example, as a mobile magnification system for the visually impaired, as a navigation system for the blind, as a mobile scanner or as a device for reading prices. At some point, blind people will no longer see why they should spend thousands of euros on aids when a smartphone can provide the same service for a few hundred euros.

Will assistive technology compensate for shortcomings in accessibility?

Assistive technologies are tools that help people with disabilities use digital content. When it comes to digital accessibility, we usually mean software or a combination of software and hardware, such as a screen reader plus braille display, or eye control plus camera.

The trend towards integration

In general, a shift can be observed: individual functions or entire systems are migrating into the operating systems. While the assistive functions were less than rudimentary up to Windows 7, things have improved from Windows 8 onwards. Windows 11 has a basic screen reader, a high-performance zoom function, eye control and voice control (whose quality I cannot judge) and a few other functions. There is hardly any external assistive software for the Mac, iOS and Android, but many basic functions are integrated. Outside of Windows, it is rare for separate assistive technology to be installed at all. This applies to software; such a development would also be desirable for hardware.

The price of assistive technologies

As mentioned above, assistive technology usually no longer needs to be purchased. But when it does, it becomes very expensive: a braille display that can show 40 characters costs around €4,000. The Jaws screen reader costs around €2,600 (as of my last check). Why is that?

The aid market is not particularly attractive: high development costs meet relatively low sales figures. And blind customers, at least in Germany, are very critical and, despite the blindness allowance, not always willing to pay for things that actually make sense.

In rich countries you only have two options:

Either you try to keep the price low enough that private individuals will buy. The range is still wide: the various Bluetooth keyboards made specifically for blind people cost many times what even high-priced conventional Bluetooth keyboards, from Apple for example, cost. The prices for special smartphones are comparatively reasonable - still cheaper than the current Apple models or the Android top models - but the special devices offer far fewer usage options than conventional smartphones.

Or you try (in Germany) to get an entry in the health insurers' list of medical aids. This takes time, but is certainly possible. For high-priced aids there is no alternative anyway; e-wheelchairs, for example, can cost as much as a mid-range car, as can guide dogs for the blind. However, the health insurance companies already pay for fewer aids, and will most likely pay for even fewer in the future, with reference to the integrated or free aids mentioned above. At €2,600 for Jaws versus €0 for NVDA with a roughly comparable range of features, this is somewhat understandable.

Many companies in this field were actually founded by affected people or their relatives. You don't get rich from it; the German companies are all medium-sized businesses. Apart from Vispero (Jaws, ZoomText etc.), which bought up everything it could, there are hardly any players who operate globally and have reached a critical size. But even Vispero does not reduce its prices for Jaws outside the USA, even though the software can be considered fully developed and a comparatively large number of licenses can be sold. For whatever reason, Westerners continue to pay for it.

However, the majority of disabled people, who live in poorer countries and even there tend to belong to the poorer classes, are left behind. Even the relatively inexpensive braille displays like the Orbit Reader would cost more than a year's salary. They benefit from integrated assistive technologies as long as they have access to a computer or smartphone and their disability is not so severe that they cannot manage this technology on their own.

What role does assistive technology play in digital accessibility?

While assistive technology used to be the bridge, it is gradually becoming a compensator for lacking accessibility. Of course, a completely blind person cannot use a digital interface without a screen reader. And of course, the prerequisite is that the software has been developed correctly so that the screen reader can work with it.

But there are new developments that are gradually making us less dependent on the good will of the providers. Apple iOS has long had a function - since iOS 16, I believe - that can recognize unlabeled UI elements and their values. Android has automatic image description. NVDA has the AI Describer extension, which can also describe controls and images. Google Bard is reportedly able to describe the GUI in a screenshot uploaded there.

Don't get me wrong: it works somewhere between well and not at all. For example, even if a UI element is recognized, this does not mean that the user and the screen reader can interact with it. And a blind person cannot know whether the description is adequate or whether the algorithm is talking nonsense. For the time being, we will continue to rely on accessible GUIs.
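The logic behind these recognition features can be sketched roughly as follows: trust the accessibility API first, and only fall back to an AI-generated description, which should be flagged because the user cannot verify it. This is a minimal, assumed sketch; describe_ai is a hypothetical stand-in for services like the NVDA AI Describer extension, not a real API.

```python
# Sketch of the fallback idea described above. describe_ai is a
# hypothetical placeholder for an image-recognition service; the real
# tools mentioned in the text have their own (different) interfaces.

def describe_ai(screenshot_region):
    # Placeholder for an AI model describing a region of the screen.
    return "possibly a settings button"

def describe_element(api_name, screenshot_region):
    """Prefer the name exposed by the accessibility API; fall back to AI."""
    if api_name:
        # Properly exposed by the accessibility API: trust it.
        return api_name
    guess = describe_ai(screenshot_region)
    # Flag the guess, since the user cannot check whether it is nonsense.
    return f"{guess} (AI-recognized, unverified)"

print(describe_element("Save", None))
print(describe_element("", b"...pixels..."))
```

The flag in the fallback branch matters: it is the only way a blind user can tell a guaranteed label from an algorithm's guess.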

However, in my opinion, the role of assistive technologies is likely to grow in the near future, and the quality of the GUI will no longer be as decisive as it used to be. With ever better training data, recognition quality increases. The next step would be to make recognized UI elements operable. If it's just a click, that's no longer a problem. But there are other UI elements, such as sliders, or elements that do not behave predictably. And the values of elements may not always be recognized, for example with a toggle button.

And the good thing is that these capabilities are no longer reserved for the exclusive Western club, but are also available to poorer people.
