Why you should be careful with studies in digital accessibility

In recent years, the number of junk studies has increased massively. These are pseudo-scientific or entirely unscientific studies that, accordingly, produce only nonsense. They are playing an increasing role in accessibility as well.

Junk studies are investigations in various formats that are supposed to confirm or refute a particular claim. You should always imagine quotation marks around the word "study": the word is meant to lend the whole thing a scientific, or at least objective, appearance.


What junk studies have in common

These studies have certain features in common:

  • The claim to be confirmed or refuted is fixed in advance. Science is open-ended; if the desired result does not emerge, you do not keep recalculating until it does.
  • The study design is flawed. If I ask website operators whether they consider accessibility, I will get a different result than if I examine their websites.
  • The methodology is flawed: non-representative samples, respondents selected according to the author's own criteria, leading questions, and so on.
  • The results are misrepresented or distorted.

Surveys and studies are created for PR

Practically all such studies and surveys are about PR, not about gaining new insights. The authors either want to generate a few headlines or to sell an offering. You can find junk like this on press-release portals almost every hour.


Let's look at a few examples. The most recent comes from the company Sapera Studios. Never heard of it? Neither had I. They investigated the biggest web shops in Germany and "found" all kinds of nonsense. Here are some of the errors in their text, with my comments:

  • Font sizes and contrast are not meant to be set via the website but by the user.
  • Video content should not play automatically; autoplay disrupts screen-reader output and can trigger seizures in people with seizure disorders.
  • Comprehensible language is not yet regulated in the guidelines, and a provider like Conrad is aimed at a specialist audience anyway.
  • The methodology of the analysis is not documented anywhere. If anyone finds it, I would be happy to hear about it.

Conclusion: Nobody claims that web shops are anywhere near accessible. The Sapera study, however, does not help us, and we can only hope that nobody follows its recommendations. I would advise against taking advice from this agency.

The German Überwachungsstelle für Barrierefreiheit

Even federal institutions are not immune to such nonsense. Last year, the federal monitoring agency published a study on gender-equitable language. In essence, one person, in my opinion not a very experienced screen-reader user, tested various character combinations with screen readers, and officials from disability associations were interviewed. They could just as well have guessed; that would have taken less time. The study design yields no useful findings, which is unfortunately true of many studies in this field.

WebAIM Million

I have already taken the WebAIM study apart elsewhere. In a nutshell: it derives the accessibility of a large number of websites from the output of an automated testing tool, without weighting the problems it finds. Whether a site has 1 error or 100, whether an error is a small quirk or a massive barrier, for WebAIM it is all the same. In my opinion, this does not help the discourse on accessibility.
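The weighting problem described above can be made concrete with a short sketch. The error categories and severity weights below are invented purely for illustration; they are not WebAIM's methodology or any official severity scale:

```python
# Illustrative sketch: why unweighted error counts can mislead.
# The categories and weights are invented for this example only.

SEVERITY = {
    "missing_alt_text": 3,  # blocks screen-reader users entirely
    "low_contrast": 2,      # hinders low-vision users
    "redundant_title": 1,   # minor quirk, rarely blocks anyone
}

def unweighted_score(errors):
    """Plain error count, as a purely automated report might give it."""
    return len(errors)

def weighted_score(errors):
    """Sum of severity weights: distinguishes quirks from real barriers."""
    return sum(SEVERITY[e] for e in errors)

site_a = ["redundant_title"] * 8   # eight trivial quirks
site_b = ["missing_alt_text"] * 3  # three severe barriers

print(unweighted_score(site_a), unweighted_score(site_b))  # prints: 8 3
print(weighted_score(site_a), weighted_score(site_b))      # prints: 8 9
```

By raw count, the first site looks far worse; once severity is taken into account, the second site is the bigger barrier. A purely unweighted metric cannot distinguish the two cases.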

Headlines and summaries are not enough

The problem is that hardly anyone takes the trouble to look at these studies more closely. Most people read only the headlines and perhaps the summaries. The nonsense is then shared on social media with a single click, which leaves truly no room for deep discourse.

You do not have to be deeply versed in empirical research to question the appropriateness of the methods, the sample, or the premises. My recommendation is therefore to ignore such research altogether if you do not have the time to examine it more closely.
