
The web – and the way in which humans interact with it – has definitely changed since the early days of SEO and the emergence of search engines in the early to mid-90s. In that time, we’ve witnessed the internet turn from something that nobody understood to something most of the world cannot operate without. This interview between Bill Gates and David Letterman puts this 30-year phenomenon into perspective:
[embedded content]
Thirty years ago, the internet was barely understood at all, and neither was its potential influence. Today, the concept of AI entering our daily lives is taken much more seriously – to the point that many look upon it with fear, perhaps because we think we now have an accurate outlook on how it may progress.
This transformation isn’t so much about the skills we’ve developed over time as it is about the evolution of the technology and channels that surround them. Those technologies and channels are evolving at a fast pace, causing some to panic over whether their existing technical skills will still apply within today’s Search ecosystem.
The Technological Rat Race
Right now, it may feel like there’s something new to learn or a new product to experiment with every day, and it can be difficult to decide where to focus your attention and priorities. This is, unfortunately, a phase that I believe will continue for a good couple of years as the dust settles over this wild west of change.
Because these changes impact nearly everything an SEO is responsible for as part of organic visibility, it may feel overwhelming to digest them all – all while we take on the challenge of communicating these changes to our clients, stakeholders, or board members.
But change does not equal the end of days. This “change” relates to the technology around what we’ve been doing for over a generation, and not the foundation of the discipline itself.
Old Hat Is New Hat
The major search engines, including Google and Bing, have been actively telling us that core SEO principles should still be at the forefront of what we do moving forward. Danny Sullivan, former Search Liaison at Google, also made this clear during his recent keynote at WordCamp US:
[embedded content]
The consistent messages are clear:
- Produce well-optimized sites that perform well.
- Populate solid structured data and entity knowledge graphs (see the sketch after this list).
- Reinforce brand sentiment and perspective.
- Offer unique, valuable content for people.
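To make the structured data point concrete, here is a minimal, purely illustrative Python sketch that prints JSON-LD Organization markup linking a brand to its known entity profiles. The brand name, URLs, and Wikidata ID are hypothetical placeholders, not recommendations from the platforms themselves.

```python
# A minimal, illustrative sketch: building JSON-LD Organization markup that ties
# a brand to its entity references. All brand details below are hypothetical.
import json

organization = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Example Brand",                       # hypothetical brand name
    "url": "https://www.example.com",
    "logo": "https://www.example.com/logo.png",
    "sameAs": [                                    # entity signals: link to known profiles
        "https://en.wikipedia.org/wiki/Example",
        "https://www.wikidata.org/wiki/Q0000000",  # placeholder Wikidata ID
    ],
}

# The <script> block you would place in the page <head>.
print('<script type="application/ld+json">')
print(json.dumps(organization, indent=2))
print("</script>")
```

The sameAs references are one common way to tie your brand to established entities that knowledge graphs, and the LLMs trained on them, already recognize.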
The problem some may have is that the content we produce is now more for agents than for people, and if that is true, what impact does it have?
The Web Is Splitting Into Two
The open web has been disrupted most of all, with some business models uprooted as AI platforms take solved knowledge and serve it within their own interfaces, appropriating the human visitors those businesses rely on for income.
This has split the once-complete open web into two – the “human” web and the “agentic” web – two audiences that are both major considerations and whose weight will differ from site to site. SEOs will have to consider both sides of the web and how to serve each – which is where an SEO’s skill set becomes more valuable than it was before.
One example can be seen in the way agents now take charge of ecommerce transactions: OpenAI announced “Buy it in ChatGPT,” making the buying experience even more seamless with instant checkout. It also open-sourced the technology behind it, the Agentic Commerce Protocol (ACP), which is already being adopted by content management and commerce platforms, including Shopify. This split between agentic and human engagement will still require optimization to ensure maximum discoverability.
When it comes to content, ensure everything is concise and avoid fluff – what I refer to as “tokenization spam.” Content isn’t just crawled; it’s processed, chunked, and tokenized, and agents give preference to well-structured, well-formatted text.
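As a rough illustration of why fluff costs you at this stage, here is a minimal Python sketch – assuming the tiktoken package is installed – that compares the token count of a padded sentence against a concise one carrying the same fact. Both sample strings are made up.

```python
# A minimal sketch comparing the token cost of "fluffy" vs. concise copy.
# Assumes tiktoken is installed (pip install tiktoken); the sample strings are hypothetical.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")  # tokenizer used by many OpenAI models

fluffy = (
    "In today's fast-paced, ever-evolving digital landscape, it is absolutely "
    "essential to understand that our product ships within two business days."
)
concise = "Our product ships within two business days."

for label, text in [("fluffy", fluffy), ("concise", concise)]:
    print(f"{label}: {len(enc.encode(text))} tokens")
```

The concise version conveys the same fact in a fraction of the tokens – exactly the kind of signal-to-filler ratio that makes content easier for agents to process.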
“Short-Term Wins” Sounds Like Black Hat
Of course, during any technological shift, there will be bad actors who tell you about a brand-new tactic that is guaranteed to help you “rank in AI.” Remember that the dust has not yet settled on the maturity of these assistance engines; compare this to the pre-Panda/Penguin era of SEO, when black hat techniques were far easier to get away with.
These algorithm updates closed those loopholes, and the same will happen again as these platforms improve – only faster, as agents learn to recognize what is genuinely honest with increasing precision.
Success Metrics Will Change, Not The Execution To Influence Them
In reality, core SEO principles and foundations remain the same, as they have through most changes in the past – including “the end of desktop” when mobile became dominant, and “the end of typing” when voice search started to grow with products such as Alexa, Google Home, and even Google Glass.
Is the emergence of AI going to render what I do as an SEO obsolete? No.
Technical SEO remains the same, and the attributes that agents look at are not dissimilar to what we would be optimizing if large language models (LLMs) weren’t around. Brand marketing remains the same. While “brand sentiment” is used more widely as a term nowadays, it is something that should always have been part of our online marketing strategies when it comes to authority, relevance, and perspective.
That being said, our native metrics have been devalued within the space of two years, and they will continue to shift alongside the changes yet to come as these platforms gain stability. This has already skewed year-over-year data and will continue to do so for the year ahead as more LLMs evolve. It is, however, comparable to past events such as the replacement of granular organic keyword data with a single (not provided) metric in Google Analytics, the deprecation of Yahoo! Site Explorer, or the devaluation of benchmark data such as Alexa Rank and Google PageRank.
Revise Your Success Metric Considerations
Success metrics now have to go beyond the SERP into visibility and discoverability as a whole, across multiple channels. There are now several tools and platforms that can analyze and report on AI-focused visibility metrics, such as Yoast AI Brand Insights, and provide better insight into how your brand is interpreted by LLMs.
If you’re more technical, make use of MCPs (Model Context Protocol) to explore your data through natural language dialogs. MCP is an open-source standard that lets AI applications connect to external systems like databases, tools, or workflows (you can visualize it as a USB-C port for AI) so they can access information and perform tasks over a simple, unified connection. Several SEO and analytics tool providers already offer MCP servers you can work with.
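To give a sense of what this looks like in practice, below is a minimal sketch of an MCP server built with the official Python SDK (assuming the mcp package is installed). The “seo-helper” name and the fetch_robots_txt tool are hypothetical examples for illustration, not an existing product.

```python
# A minimal MCP server sketch using the official Python SDK (assumed installed via pip install "mcp[cli]").
# The server name and tool below are hypothetical examples for illustration only.
from urllib.request import urlopen

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("seo-helper")  # hypothetical server name


@mcp.tool()
def fetch_robots_txt(domain: str) -> str:
    """Return a domain's robots.txt so an AI client can reason about its crawl rules."""
    with urlopen(f"https://{domain}/robots.txt", timeout=10) as response:
        return response.read().decode("utf-8", errors="replace")


if __name__ == "__main__":
    mcp.run()  # defaults to stdio, so MCP-aware clients can connect locally
```

Once registered in an MCP-aware client, you could simply ask in natural language what a site’s robots.txt allows, and the client would call the tool on your behalf.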
You can take this a step further by coupling these MCP servers with a vibe coding tool such as Claude Code, using it to create a reporting app that combines several of them to extract the best data and build visuals and interactive charts for you and your clients or stakeholders.
The Same But Different … But Still The Same
While the divergence between human and agentic experiences is increasing, the methods by which we, as SEOs, optimize for them are not too dissimilar. Leverage both within your strategy, just as you did when mobile gained traction.