Google’s Martin Splitt questioned the usefulness of specific suggestions made by SEO auditing tools, noting that while some advice may be valid, much of it has little to no impact on SEO. He acknowledged that these audits can be valuable for other purposes, but their direct influence on SEO is limited.
Automated SEO Audits
There were two hosts of this month’s Google SEO Office Hours, John Mueller and Martin Splitt. The voice answering the question sounded like Martin Splitt, and the technical level of the answer seems to confirm it.
The person asking the question wanted to know whether they should proceed with suggestions made by automated SEO tools that recommend changes that don’t match anything in Google’s documentation.
The person asked:
“I run several free website audits, some of them suggested me things that were never mentioned in the search central documentation. Do these things matter for SEO?”
Martin Splitt On Automated SEO Audits
Martin’s answer acknowledged that some of the suggestions made by SEO audit tools aren’t relevant to SEO.
He answered:
“A lot of these audits don’t specifically focus on SEO and those that don’t still mention a bunch of outdated or downright irrelevant things, unfortunately.
I’ll give you some examples. The text to code ratio, for instance, is not a thing. Google search doesn’t care about it.”
Text to code ratio is an analysis of how much code there is in comparison to how much text is on the page. I believe there was a Microsoft research paper in the early 2000s about statistical analysis of spam sites and one of the qualities of spammy sites that was noted was that there was more text on a typical spam page than code. That might be where that idea came from.
But back in the day (before WordPress) I used to create PHP templates that weighed mere kilobytes, a fraction of what a typical featured image weighs, and it never stopped my pages from ranking, so I knew first-hand that text to code ratio was not a thing.
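To make concrete what audit tools are measuring, here is a minimal sketch of how a text-to-code ratio might be computed: compare the length of a page’s visible text against the length of its full HTML source. This is purely illustrative of the metric itself; nothing here reflects anything Google actually uses.

```python
from html.parser import HTMLParser


class TextExtractor(HTMLParser):
    """Collects visible text, skipping script and style blocks."""

    def __init__(self):
        super().__init__()
        self.parts = []
        self.skip_depth = 0  # > 0 while inside <script> or <style>

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.skip_depth:
            self.skip_depth -= 1

    def handle_data(self, data):
        if self.skip_depth == 0:
            self.parts.append(data)


def text_to_code_ratio(html: str) -> float:
    """Ratio of visible-text characters to total HTML characters."""
    parser = TextExtractor()
    parser.feed(html)
    text = "".join(parser.parts).strip()
    return len(text) / len(html) if html else 0.0


page = ("<html><head><style>body{margin:0}</style></head>"
        "<body><p>Hello world</p></body></html>")
print(round(text_to_code_ratio(page), 2))
```

A page heavy with inline scripts and markup scores low; a text-heavy page scores high. As Martin notes, neither number matters to Google Search.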
Next, Martin mentioned minification of CSS and JavaScript. Minification condenses code by removing unnecessary spaces and line breaks, resulting in a smaller file.
He continued his answer:
“CSS, JavaScript, not minified that you got apparently as well is suboptimal for your users because you’re shipping more data over the wire, but it doesn’t have direct implications on your SEO. It is a good practice though.”
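To show what minification does to a file, here is a deliberately naive CSS minifier in Python. Real tools (cssnano, terser, and the like) are far more careful, for example around whitespace inside strings; this sketch only strips comments and collapses whitespace to illustrate the size savings Martin is describing.

```python
import re


def naive_minify_css(css: str) -> str:
    """Very rough CSS minifier, for illustration only:
    strips /* comments */ and collapses whitespace."""
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)   # remove comments
    css = re.sub(r"\s+", " ", css)                    # collapse whitespace runs
    css = re.sub(r"\s*([{};:,])\s*", r"\1", css)      # drop spaces around punctuation
    return css.strip()


original = """
/* page styles */
body {
    margin: 0;
    font-family: sans-serif;
}
"""
minified = naive_minify_css(original)
print(minified)
print(len(original), "->", len(minified))
```

The minified output renders identically but ships fewer bytes over the wire, which, as Martin says, benefits users rather than rankings directly.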
SEO Is Subjective
Some people believe that SEO practices are an objective set of clearly defined, black-and-white rules about how to “properly” SEO a site. The reality is that, except for what Google has published in official documentation, SEO is largely a matter of opinion.
The word “canonical” means a known standard that is accepted and recognized as authoritative. Google’s Search Central documentation sets a useful baseline for what can be considered canonical SEO: the practices that can be agreed upon as verified to be true.
The word “orthodox” refers to beliefs and practices that are considered traditional and conventional. A large part of what SEOs consider best practices is orthodox in that it is based on beliefs and traditions; it’s what everyone says is the right way to do it.
The problem with orthodox SEO is that it doesn’t evolve. People do it a certain way because it’s always been done that way. A great example is keyword research, an SEO practice that’s literally older than Google but practiced largely the same way it’s always been done.
Other examples of decades-old SEO orthodoxy are:
- Meta descriptions should be under 164 characters
- Belief that keywords are mandatory in titles, headings, meta description and alt tags
- Belief that titles should be “compelling” and “click-worthy”
- Belief that H1 is a strong SEO signal
Those are the things that were important twenty years ago and became part of the orthodox SEO belief system, but they no longer impact how Google ranks websites (and some of those never did) because Google has long moved beyond those signals.
Limitations Of Google’s Documentation
Martin Splitt encouraged cross-referencing official Google documentation with advice given by SEO auditing tools to be certain that the recommendations align with Google’s best practices, which is a good suggestion that I agree with 100%.
However, Google’s official documentation is purposely limited in scope because Google doesn’t tell SEOs how to influence ranking algorithms. The documentation only shows best practices for optimizing a site so that search engines can understand it, it is easily indexed, and it is useful for site visitors.
Google has never shown how to manipulate their algorithms, which is why relatively noob SEOs who analyzed Google’s Search Quality Raters guidelines fell short and eventually had to retract their recommendations for creating “authorship signals,” “expertise signals” and so on.
SEJ Has Your Back On SEO
I’ve been in this business long enough to have experienced firsthand that Google is scrupulous about not divulging algorithm signals, not in their raters guidelines, not in their search operators, not in their official documentation. To this day, despite the so-called leaks, nobody knows what “helpfulness signals” are. Google only shares the general outlines of what they expect and it’s up to SEOs to figure out what’s canonical, what’s outdated orthodoxy and what’s flat out making things up out of thin air.
One of the things I like about Search Engine Journal’s SEO advice is that the editors make an effort to put out the best information, even if it conflicts with what many might assume. It’s SEJ’s opinion but it’s an informed opinion.
Listen to the question and answer at the 11:56 mark:
[embedded content]
Featured Image by Shutterstock/Ljupco Smokovski