Search Engine Optimization (SEO) and Digital Accessibility
Search engine optimization (SEO) is almost as old as the World Wide Web. There are many cross-connections between accessibility and SEO. Find out why SEO managers should also care about the accessibility of their web projects. In this post I will focus only on Google, because Google is almost synonymous with search engines and its competitors use similar algorithms.
Article Content
- Semantics/Machine Readability
- Alternative texts for SEO
- Clean and lightweight code
- Mobile first
- Information architecture and usability
- Comprehensible language
- Trustworthiness and authority
- Device-independent inputs
- Subtitles and transcripts make video and audio content accessible
- Could accessibility become an SEO factor?
- Conclusion
- Online Editing and Content Accessibility
Semantics/Machine Readability
Despite all the talk about artificial intelligence, search engines and other technologies still rely on machine-readable information. Like a screen reader, Google cannot tell from appearance alone whether a section of text is a heading or a paragraph; it relies on the markup. Headings are important for both humans and machines: they help weight the relevance of a text for a specific topic. The more important keywords appear in a heading, the more important the text is for the respective topic - at least according to Google. Because of this, text is weighted more heavily when it sits in HTML heading tags than when it sits in body text.
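A minimal illustration of the difference (the heading text and styling here are just placeholders):

```html
<!-- Marked up as a real heading: screen readers and Google can both
     recognize it and weight its keywords more heavily. -->
<h2>Digital accessibility and SEO</h2>
<p>Ordinary body text with the same keywords carries less weight.</p>

<!-- Looks like a heading, but is only styled body text: for machines
     this is just another paragraph. -->
<p style="font-size: 1.4em; font-weight: bold">Digital accessibility and SEO</p>
```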
Machine readability is becoming more and more important: the better Google can recognize data such as addresses, event dates and the like, the better the website's chances of a good ranking.
When it comes to machine readability, tables are often forgotten. They are an ideal data source for machines.
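A simple sketch of what such a machine-friendly table can look like (the data is invented):

```html
<table>
  <caption>Opening hours</caption>
  <thead>
    <tr><th scope="col">Day</th><th scope="col">Hours</th></tr>
  </thead>
  <tbody>
    <!-- Explicit header cells let machines and screen readers associate
         every value with its row and column header. -->
    <tr><th scope="row">Monday</th><td>9:00 – 17:00</td></tr>
    <tr><th scope="row">Saturday</th><td>closed</td></tr>
  </tbody>
</table>
```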
Strangely enough, hardly any of the SEO gurus talk about using semantic markup as promoted by Schema.org. Assistive technologies (AT) would also benefit from this, although admittedly I'm not yet aware of any AT that processes schema data.
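For illustration, a small Schema.org sketch using microdata attributes for an event; the name, date and place are invented, and JSON-LD would work just as well:

```html
<div itemscope itemtype="https://schema.org/Event">
  <!-- The itemprop attributes turn the visible text into structured data
       that Google can read without guessing. -->
  <h2 itemprop="name">Accessibility Meetup</h2>
  <time itemprop="startDate" datetime="2025-05-04T19:00">4 May, 7 p.m.</time>
  <span itemprop="location" itemscope itemtype="https://schema.org/Place">
    <span itemprop="name">Community Center</span>
  </span>
</div>
```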
ARIA and HTML5 structures such as nav, footer, main and so on, on the other hand, do not seem to play any role in Google's weighting so far - at least that is what Google says, but this could change.
Another advantage of clean and consistently used HTML is that information units can be differentiated automatically more easily. At the macro level, semantics distinguishes navigation, content and footer; at the micro level, units such as paragraphs contribute to better recognition of related information.
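A skeleton of such a macro structure might look like this (the content is of course just a placeholder):

```html
<header>Site name and logo</header>
<nav aria-label="Main">Menu</nav>
<main>
  <article>
    <h1>Topic of the page</h1>
    <p>The actual content, split into paragraphs, headings and lists.</p>
  </article>
</main>
<footer>Imprint, contact, privacy</footer>
```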
Alternative texts for SEO
Google now provides automatically generated image descriptions in its own Chrome browser for images that lack an alternative text for blind users. I assume this is a by-product of crawling: Google is of course very interested in the content of images and uses it to train its image recognition. It can match the alternative texts written by humans against what the algorithm recognized - classic machine learning.
One German SEO specialist concentrated for a while on optimizing images for the search engine. Because Google today often displays not only plain text results but also images, alternative texts play an important role in SEO.
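As a simple sketch (file names and descriptions are invented):

```html
<!-- A human-written alternative text serves blind users and gives Google
     indexable information about the image. -->
<img src="ramp.jpg" alt="Wooden ramp leading to the entrance of a small bookshop">

<!-- Purely decorative images get an empty alt so screen readers skip them. -->
<img src="divider.png" alt="">
```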
Clean and lightweight code
Websites have grown considerably in size over the past few years. Some pages put several megabytes on the virtual scales - by the way, this is also an ecological problem, because an unnecessarily large amount of storage space and bandwidth is used. An optimized website can weigh in at around 200 kilobytes including HTML, CSS, JavaScript and a few optimized images. This reduces the loading time for both Google and assistive technologies. Valid and modern code is also important so that browsers can render the page cleanly.
The consensus in the SEO scene is that page speed has become one of the most important on-site factors for Google. Of course, this has to do with the dominance of smartphones: nobody wants to wait several seconds for the first content to load. An important factor for loading speed is leaner and cleaner code. This means, among other things, that CSS and JavaScript are kept in their own files rather than repeated in the code of every subpage. This is also a basic requirement of accessibility - the separation of behavior, structure and design. It not only reduces the loading time for AT, but also allows the content to be displayed more flexibly.
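In practice this separation can be as simple as linking one shared stylesheet and one shared script; the file names here are placeholders:

```html
<head>
  <!-- One cacheable stylesheet and one script for the whole site instead of
       repeating styles and scripts in the code of every subpage. -->
  <link rel="stylesheet" href="styles.css">
  <script src="app.js" defer></script>
</head>
```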
Incidentally, Google also has an economic interest in lean websites: anyone who has to crawl several hundred million websites every day is happy about every kilobyte saved. Although Google has plenty of computing power available, they are of course also happy if they can save a few large data centers or use them for other paid services.
The two best-known portals for web accessibility in Germany are Einfach-fuer-alle.de and accessible-webdesign.de. Both sites rank high for many accessibility search terms and perform excellently in speed tests, each scoring 91 out of 100 in Google's PageSpeed Insights. Even if digital accessibility is not one of the mega-topics, there are competing portals.
Mobile first
Mobile first - i.e. designing websites primarily from the aspect of smartphone friendliness - is popular among web developers today. No self-respecting designer creates separate versions for smartphones and desktops these days. One size fits all is the order of the day: the website should display well on screens of different sizes. Google now prefers to crawl the mobile version of a website, and its quality is one of the most important factors in weighting search results. This, too, is old hat when it comes to accessibility. For decades it has been demanded that websites can be presented on different displays without problems, and zooming and text enlargement are among the core requirements of accessibility. WCAG 2.1 added the requirement that a website should be usable in both portrait and landscape orientation.
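A minimal mobile-first sketch (the breakpoint and class name are arbitrary): the default layout is a single column, and a media query adds the wider desktop layout on top.

```html
<meta name="viewport" content="width=device-width, initial-scale=1">
<style>
  /* Default: one column, works on small screens and when zoomed. */
  .layout { display: block; }

  /* Wider screens get a two-column grid on top of the mobile layout. */
  @media (min-width: 60em) {
    .layout { display: grid; grid-template-columns: 1fr 3fr; }
  }
</style>
```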
Information architecture and usability
What was described above is old hat in a way: Google has only shifted the emphasis because many websites have become structurally very similar. The last step was switching the weighting to Mobile First, but that was a few years ago, too. The next step will be to analyze the information architecture and usability of websites automatically. There are human quality raters who check websites manually, but they are few in number and can only review a small fraction of all websites. In the freely accessible Google Playbooks you can read how Google envisions this. It is likely that factors such as semantic segmentation, readability, comprehensibility of the content, clean structuring, good machine translatability and a lean website will also play a major role here.
Comprehensible language
As far as I know, Google and Co. are still nowhere near able to "understand" a text the way a human would. This means they still rely on statistics and computing power. But even with these, a lot can be achieved. You don't have to be a stylist to realize that an eight-word sentence tends to be more understandable than a 30-word sentence with five punctuation marks. A text that is structured in HTML with paragraphs, headings, tables and lists is more likely to come from a professional than a text that consists only of visually formatted blocks. The comprehensibility of language could become increasingly important: easily understandable texts increase the time visitors spend on a page, and they are also easier to translate automatically.
A very simple factor is separating words with hyphens. A challenge in the German language is its compounds, i.e. long, compound words. These create very long, new words that Google may not recognize. Hyphenating such words improves both human and machine readability.
Trustworthiness and authority
Another factor to consider is the trustworthiness and authority of a website. This is particularly important for sensitive topics such as health and finances; the SEO scene speaks of YMYL - Your Money or Your Life. Pretty much all of the major algorithm updates of the last few years have hit health-related sites, sometimes with fluctuations of around 30 percent in visibility and visitor numbers.
With current tools, the trustworthiness and authority of a website are difficult to analyze automatically - at least today. Google will probably look primarily at established brands like Amazon, Wikipedia, Mercedes and so on. I am also pretty sure that Google gives more weight to public authority websites on a topic than to corporate or NGO websites. You can read all about it in Google's Quality Rater Guidelines.
Trustworthiness is a soft factor of accessibility, as I put it in my book. Internet newcomers in particular are at risk of falling for fraudulent websites. In this respect, the fact that questionable or fraudulent websites are weighted lower or removed from the index also has an indirect positive effect on accessibility. This is unfortunate for smaller web portals, which have less power to build a brand and the corresponding awareness. It will be exciting to see how Google's analytical capabilities regarding information architecture, usability, trustworthiness and authority develop - and how all this will affect the SEO scene and ultimately the quality of websites.
Device-independent inputs
Google is increasingly pushing webmasters to optimize their sites for mobile devices. Those who use the Search Console are increasingly getting warnings about poor usability, for example that buttons are too small or too close together.
Here, too, the connection to accessibility is obvious. A website should be independent of a specific input device: It shouldn't matter whether a website is controlled by mouse, keyboard or touch. Anyone who has optimized their website for mouse clicks will sooner or later have problems with smartphone users.
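A small sketch of the difference (the handler name is made up):

```html
<!-- A native button reacts to mouse, keyboard and touch alike. -->
<button type="submit">Send enquiry</button>

<!-- A div with a mouse-only handler excludes keyboard and many touch users
     unless focus handling and key events are bolted on by hand. -->
<div onmousedown="submitForm()">Send enquiry</div>
```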
Subtitles and transcripts make video and audio content accessible
Podcasts and videos are a challenge for search engines. Automatic speech recognition has made great strides. But it's far from optimal.
However, those who provide closed captions or transcripts write down what is said and thus ensure that search engines can weight videos and podcasts more easily. It would be interesting to see whether videos with similar content rank better on YouTube if they contain validated subtitles.
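A sketch of what this can look like for a video (the file names are placeholders):

```html
<video controls>
  <source src="interview.mp4" type="video/mp4">
  <!-- Closed captions as a WebVTT track: the spoken content becomes text. -->
  <track kind="captions" src="interview.en.vtt" srclang="en" label="English">
</video>
<!-- A linked transcript makes the same content indexable and quotable. -->
<p><a href="interview-transcript.html">Read the full transcript</a></p>
```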
Could accessibility become an SEO factor?
Some elements of accessibility contribute to SEO. However, accessibility as such is not a factor in the weighting of a website, as Google itself has stated.
However, the idea is not so far-fetched: the gap between usability, information architecture and accessibility is not that big.
Also, since there are stricter accessibility rules in Google's home country, Google could rank accessible websites higher for prestige reasons - prestige for itself, of course. In addition to the factors mentioned above, there are a few others, such as the existence of an accessibility statement, sign language videos, texts in plain language and many more, that could easily be determined and weighted algorithmically.
Conclusion
This article is by no means intended as adulation of Google. Such is the power of this private company that all website operators who depend on traffic must submit to its rules. Google can thus rebuild the web according to its own ideas. In addition, Google's logic entails discriminating against small website operators in favor of large brands. Google and YouTube also play an inglorious role in spreading hate speech, bullying and conspiracy theories. I don't even want to open the topic of data protection. Still, there's no denying that Google's notions of a good website could contribute positively to web accessibility. Google may eventually consider accessibility as a ranking factor as well.