The Curb-Cut Effect: How the Non-Disabled Benefit from Inventions Made for the Disabled
A number of inventions and developments have been inspired or anticipated by disabled persons. There is even a term for this: the curb-cut effect.
- SMS (Short Message Service)
- Web and email
- Audio books
- Speech output
- Voice control
- 3D printing
- Writing texts faster
- The multi-touch screen
- Eye- and touch-free technologies
- Prosthetics and plastic surgery
- Home automation
- Dark mode
- And it goes on
- Social factors
In this post I would like to present some of them.
Of course, some caution is needed here: hardly any major development came about through a single impulse or a single person. Most inventions would have been made at a different time, by a different person, for a different reason. And some of these stories may be urban legends that can no longer be verified.
Alexander Graham Bell is credited with inventing the telephone. His path to this invention led through his mother, who was hard of hearing. Apparently she inspired him with the idea of converting sounds into electrical impulses and transmitting them over a distance.
In addition to Bell's undoubted scientific merits, he also played a less than creditable role in the suppression of sign language. But that is another topic.
SMS (Short Message Service)
The Finn Matti Makkonen developed SMS in the 1990s, among other things so that deaf persons could communicate with each other and with the hearing, probably unaware that it would become almost as important as the phone call itself.
Web and email
The hard of hearing and the deaf have also played an important role in the development of the Internet and email.
Vinton Cerf, born in 1943, was actively involved in the development of the first Internet and email protocols. He himself is hard of hearing. Email made text-based communication with other persons possible, which certainly motivated his involvement with the Internet; among other things, he wanted to be able to communicate with his deaf wife by email.
Audio books
Audio books were already old hat for us blind persons 20 years ago. Today they are an important segment of the book market.
In 1931, the American Foundation for the Blind and the Library of Congress launched the first audio book program for the blind. Interestingly, these were not called audio books but talking books.
Today there are around 60,000 German-language audio books especially for the blind, over 100,000 commercial German audio books and who knows how many English audio books.
The typewriter shapes our everyday lives to this day. The PC keyboard on the desk and the keyboard on the smartphone are both inspired by the typewriter in the arrangement of their letters. And, fittingly, behind its invention lies a love story.
In 1808, the Italian Pellegrino Turri built the first working typewriter so that his lover, the blind Countess Carolina Fantoni da Fivizzano, could write him love letters on her own. One of these letters survives to this day, which is probably more than can be said for most love letters.
Teletext, too, refuses to die. Videotext, or teletext, was developed by BBC staff in 1974. They were looking for a way to show and hide subtitles for the deaf and hard of hearing on the television. It quickly became apparent that not only subtitles but entire pages of text could be transmitted.
Speech output
Thanks to Alexa, Siri and telephone dialog systems, everyday life is hard to imagine without speech output. But blind persons have been working with it for decades.
In 1986, the American Jim Thatcher developed a system for his blind friend Dr. Jesse Wright that made the computer accessible to him: the IBM Screen Reader. To this day, screen readers are the software blind persons use to operate technology. They output content as speech or Braille.
Many persons complain about the synthetic sound of popular programs. But on closer inspection, the synthetic voices are sometimes superior to the natural-sounding ones. The decisive factor is prosody, or speech melody: the intonation of certain syllables that makes a sentence sound natural. Anyone who listens to typical train announcements or has a text read aloud by ReadSpeaker will immediately notice what I mean. The words are read out incoherently and the pauses between them sound wrong, which quickly makes the speech sound unnatural.
Voice control
Automatic speech recognition is also almost old hat. Many disabled persons have long used it to control their computers and dictate longer texts.
Siri and other systems have only simplified the process: with classic systems such as Dragon NaturallySpeaking, command sets had to be memorized and the software had to be trained. Siri and co. send the data to the Internet and, as far as I know, are not trained for an individual voice. That is why most virtual assistants also have problems with dialects and do not improve for the individual user over time. And without an Internet connection they are usually useless, not to mention that you cannot control the entire device with them.
3D printing
Making prostheses and other tools with 3D printing has only been widespread for a few years outside specialized production areas.
Tactile models have played an important role in schools for the blind for decades. They are used in physics, chemistry and biology, for example: tactile models of atoms or organs. Of course, 3D printing has made things much easier and cheaper here too, but it has been possible for a long time. Incidentally, many dedicated teachers of the 80s and 90s were very active in this area and sometimes built teaching aids in their free time to teach their students better.
Writing texts faster
As a result of his ALS, Stephen Hawking gradually lost his physical abilities and, above all, his ability to speak. Walter Woltosz, managing director of Words Plus, had developed a system for his mother-in-law, who had ALS like Hawking, to enable her to communicate. This system was adapted and further developed for Hawking.
Hawking used a mixture of augmentative communication, as we would call it today, and speech synthesis. He could select words from a list of terms, combine them into sentences and have them spoken by his voice synthesizer.
Many persons use a similar technology on their smartphones today: auto-correct, which often writes nonsense, but also word suggestions, which significantly speed up typing on the smartphone.
Hawking and his illness have inspired many developments that were of course primarily meant to help Hawking himself, but then also helped persons in similar situations.
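The word-suggestion idea behind Hawking's system and today's smartphone keyboards can be sketched as a simple frequency-ranked prefix lookup. This is an illustrative toy, not any real keyboard's algorithm; the word list and frequencies below are invented:

```python
def suggest(prefix, frequencies, limit=3):
    """Return the most frequent words starting with the typed prefix."""
    matches = [word for word in frequencies if word.startswith(prefix)]
    return sorted(matches, key=lambda w: -frequencies[w])[:limit]

# Hypothetical usage counts; real keyboards learn these from the user.
freqs = {"the": 100, "there": 40, "them": 30, "theory": 5, "cat": 50}
print(suggest("the", freqs))  # -> ['the', 'there', 'them']
```

Real systems go further, weighting suggestions by the preceding words and adapting the counts as you type, which is why suggestions improve over time.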
The multi-touch screen
Touch screens have been around for a long time. The first touchscreen that understood gestures was developed by Wayne Westerman and John Elias. It was aimed at persons whose mobility was restricted by conditions such as repetitive strain injury ("mouse arm"). The two men's company was eventually bought by Apple, and you can guess which product Apple equipped with the technology.
Eye- and touch-free technologies
Eye- and touch-free technologies are operating concepts that require neither a display nor physical contact. Input and output mostly take the form of speech. Such technologies are useful in cars, for example, and have become widespread thanks to Alexa speakers.
Since blind persons cannot use a display, they have to rely on information being given to them in verbal form or in Braille. Other disabled persons cannot operate a display because of motor disabilities and therefore use other input methods such as speech. Interface designers could learn a lot from these groups.
Prosthetics and plastic surgery
It is true that plastic and cosmetic surgery has a longer history. However, it experienced a real boom after the First World War.
Many soldiers and civilians lost parts of their bodies or suffered extensive burns. Surgery was used to help them. Prosthetics also developed during this period to replace body parts.
It is easy to forget that the transition from cosmetic surgery, with its sometimes dubious procedures, to plastic surgery is often fluid. Spina bifida (the "open back"), for example, can now be treated during pregnancy or immediately after birth thanks to plastic surgery.
Home automation
For 20 years now, home automation has been on the verge of its final breakthrough: heating systems that can be controlled remotely, shutters that respond to your voice, warning systems for a pot forgotten on the stove. For various reasons, these systems have not yet become widely accepted.
Disabled persons have had such systems for a long time. They are of particular interest to persons with quadriplegia, who are paralyzed from the neck down. They cannot move anything apart from their head, and without such systems they are permanently dependent on help from others. Many systems were and are controlled entirely by voice via a central computer. In principle, they have been doing since the 1990s what home automation promises today.
Unfortunately, these systems are primarily available to persons with the necessary money, and that is not many of those affected. Those who want to automate their home can find cheaper solutions today, but a full setup can still cost a five-figure sum. Alexa and co. are useful in certain situations, but large-scale automation requires significantly more expensive systems and structural modifications.
Emojis have become indispensable in everyday communication. But symbols have played an important role in communication for much longer. There are persons who cannot communicate verbally or in sign language because of a disability. They use methods of augmentative and alternative communication, one branch of which deals with symbol-based communication. The person uses individual symbols, or combines several, to communicate with others. The symbols are then often spoken aloud by a special device or mobile app. Many of these symbols and symbol systems can be recognized in today's emojis.
Dark mode
Dark mode is old hat for the visually impaired and for programmers. Screen magnifiers have allowed the colors of the screen to be inverted for decades. The catch is that the colors of images are inverted as well, so that you can only recognize what they show with some practice. Special screen settings remedy this: the high-contrast mode, available since Windows XP at the latest, inverts only the colors of the operating system and spares the images.
Programmers have also long preferred a dark background, which can still be seen in command-line tools such as PowerShell on Windows.
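A minimal sketch of what a full-screen color inverter does, and why photos end up looking like negatives (plain Python for illustration, no real screen API involved):

```python
def invert(rgb):
    """Invert one RGB color channel-wise, as a full-screen inverter does."""
    r, g, b = rgb
    return (255 - r, 255 - g, 255 - b)

# Text colors benefit: black on white becomes white on black.
print(invert((0, 0, 0)))        # black -> white
print(invert((255, 255, 255)))  # white -> black

# But photo pixels are inverted too, turning images into negatives,
# which is why high-contrast modes spare images entirely.
print(invert((200, 120, 40)))   # a warm tone becomes a cold one
```

This is exactly the trade-off the high-contrast mode resolves: it recolors only interface elements instead of transforming every pixel on screen.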
And it goes on
The source of inspiration is far from dried up. There are experiments with motion and eye control, brain-computer interfaces and other input methods. Some of them have been used by disabled persons for years or decades, but they can also be of interest to the non-disabled. The gaming industry alone is already demanding new ways of interacting with computer games.