Google Search Advocate John Mueller said, “search is never guaranteed,” in response to a site owner asking why their content isn’t indexed.
In the r/TechSEO forum on Reddit, a user is frustrated that a website republishing identical content is getting indexed faster than the original.
The Reddit user asks in a thread whether they can use the Indexing API to brute-force their way into Google's index.
Mueller shoots that idea down, saying the Indexing API is reserved for specific types of content, such as livestreams and job postings.
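For context, the Indexing API lets site owners notify Google directly when a page is added or updated, but Google restricts it to pages carrying job posting or livestream (BroadcastEvent) structured data. Below is a rough sketch of what such a notification looks like in Python; the service account file and URL are placeholders, and this is not a way to force arbitrary pages into the index.

```python
# Rough sketch of an Indexing API notification, for context only.
# Google supports this API solely for pages with JobPosting or
# BroadcastEvent (livestream) structured data.
from google.oauth2 import service_account
from google.auth.transport.requests import AuthorizedSession

SCOPES = ["https://www.googleapis.com/auth/indexing"]
ENDPOINT = "https://indexing.googleapis.com/v3/urlNotifications:publish"

# Placeholder path to a Google Cloud service account key file.
credentials = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES
)
session = AuthorizedSession(credentials)

# Placeholder URL; in a supported use case this would point to a
# job posting or livestream page that was just published or updated.
response = session.post(
    ENDPOINT,
    json={"url": "https://example.com/jobs/opening-123", "type": "URL_UPDATED"},
)
print(response.status_code, response.json())
```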
He follows up by reiterating that search isn't guaranteed and offering insight into what Google prioritizes when deciding what to index.
Mueller On What Makes Content Worth Indexing
Mueller has previously stated sites need to be worthwhile to be indexed, and he shares that sentiment again on Reddit.
Further, Mueller goes over several criteria Google’s algorithm looks for when considering whether the content is worth indexing:
“Search is never guaranteed, and there are tons of sites that are trying to push their updates into Google. I think what ultimately works best is that you prove to Google (and users) that the updates you’re providing are valuable: unique, compelling, high-quality, and not something that’s already published elsewhere.”
For context, the Reddit user runs a website that publishes obituaries.
In the next part of his response, Mueller acknowledges it can be challenging to ensure content contributed by other people meets Google’s criteria.
That point is especially true of obituaries, where the people submitting content are unlikely to have Google Search in mind.
However, the whole site is within the Reddit user’s control, and they’re not limited to publishing contributed content.
Mueller tells the site owner to think of ways to make their website more valuable, increasing the likelihood that Google indexes pages quickly.
“I realize that’s hard when it comes to user-generated content (which I assume some of this will be), but ultimately your site is what you publish, regardless of where it initially comes from. So the more you can do to make sure the indexable content on your site is easily findable and significantly valuable to the web, the more likely Google will be able to pick it up quickly (and that can mean that you block content that you determine is less-valuable from being indexed, for example).”
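Mueller's parenthetical about blocking less-valuable content is typically done with a noindex directive. As a minimal sketch (using Flask, with a hypothetical is_low_value check standing in for whatever quality criteria a site applies), a page can be served with an X-Robots-Tag: noindex header, which Google documents as a way to keep a URL out of the index while still serving it to users:

```python
from flask import Flask

app = Flask(__name__)

# Hypothetical check for thin or low-value pages; the real logic
# would depend on the site's own quality criteria.
def is_low_value(slug: str) -> bool:
    return slug.startswith("draft-")

@app.route("/obituaries/<slug>")
def obituary(slug):
    body = f"<h1>Obituary: {slug}</h1>"  # placeholder page body
    headers = {}
    if is_low_value(slug):
        # X-Robots-Tag: noindex tells Google not to index this URL
        # while still serving the page normally to visitors.
        headers["X-Robots-Tag"] = "noindex"
    return body, 200, headers
```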
When Google indexes republished content faster than the original version, it suggests Google perceives the source website as less valuable than the site copying its content.
There's no shortcut around that issue, such as Google's Indexing API. The Reddit user will have to make changes to their website to prove it's worth crawling and indexing faster.
That said, Mueller suggests a simple change any site can make to signal to Google that new content is available.
Linking to new content from the homepage can get it on Google's radar faster, Mueller says:
“One of the things even smaller, newer sites can do is to mention and link to updates on their homepages. Google usually refreshes homepages more frequently, so if there’s something important & new, make sure you have it there. Many sites do this intuitively, with a sidebar or a section for updates, mentioning the new headlines & linking to the content.”
While that will help get content noticed faster, there’s still no guarantee Google will index it more quickly.
Source: Reddit
Featured Image: JHVEPhoto/Shutterstock