Making Sense of Google Updates, Being a Quality Rater & the Future of SEO
While the traditional approach to SEO may survive for a while, marketers need to branch out and create content that can't easily be replicated by AI.
No website hit by Google's Helpful Content Update (HCU) last year has recovered so far. It also now appears that the distinct HCU classifier is gone and has become part of Google's Core Update.
There's lots of advice circulating about what marketers should do, notwithstanding Google's repeated (but vague) statements about creating content for users and not trying to impress or "show Google" that you're doing what you think it wants.
In our 150th podcast we spoke with well-known SEO Cyrus Shepard about his recent on-page ranking-correlation study, his experience as a Google Quality Rater and his take on the future of SEO. You can listen to the full recording or read on. The following are edited excerpts of Shepard's remarks.
On-Page Ranking Correlation Study
Google says to create helpful content, but nobody understands what's going on. And to this day we've seen shockingly few recoveries. People don't understand what Google means when they say "helpful content," and Google themselves are very opaque about it, so I just started collecting sites and trying to correlate on-page factors with what Google seemed to be rewarding and demoting.
We've seen these traditional ranking factor studies over the years. They all do word count, how many incoming links, what's the domain authority of the site. And those are all what we'd call [ranking] factors. But with Google moving into more machine learning, I wanted to look at things that quality raters would actually be looking at, things that are hard to quantify. A lot of them were just human value judgments.
One of the obvious things was the use of ads – really heavy ad use. That was one of the largest negatively correlated factors. Anything with a fixed ad that covered the content when you scrolled increased your chances of getting hit by the HCU by like 50% to 60% in some cases.
I [also] looked for first-person pronouns and first-hand experience. Are they describing something that they did themselves? Do they have unique pictures? Are they explaining a process? How they made a recipe or something like that. And those were the highest correlated factors; first-person pronouns I think was the highest correlated factor.
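(For illustration only, here's a minimal sketch of how a signal like first-person pronoun use could be approximated programmatically. The pronoun list, density metric, and sample text are hypothetical assumptions; Shepard's study relied on manual human judgments, not a script like this.)

```python
import re

# Hypothetical proxy for the manual "first-person pronouns" check described above.
# The pronoun list and the per-100-words metric are illustrative choices.
FIRST_PERSON = re.compile(r"\b(I|me|my|mine|we|us|our|ours)\b", re.IGNORECASE)

def first_person_density(page_text: str) -> float:
    """Return the number of first-person pronouns per 100 words of page text."""
    words = page_text.split()
    if not words:
        return 0.0
    hits = len(FIRST_PERSON.findall(page_text))
    return 100.0 * hits / len(words)

sample = "I tested this recipe in my own kitchen, and here is what we found."
print(round(first_person_density(sample), 1))  # 21.4 pronouns per 100 words
```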
A lot of contact information [was correlated with better visibility]. Google and their quality raters talk about looking at who is responsible for the website, who created the content, does this person have expertise? Do the authors list an email address, things like that? So contact information was highly correlated. (I don't think these are ranking factors. It's kind of like firsthand experience just trumped everything else.)
Use of stock images [was negatively correlated with visibility]. When I say stock images, I mean any images that were not original, even if they were licensed from a stock image site or taken from somebody's Instagram page. Images only qualified as non-stock if they were completely original.
Does this page demonstrate firsthand experience or does it not? That's something that I couldn't feed into a computer and get a yes/no answer. So I just had to check a box here, just like a Google Quality Rater would. A lot of it was like, what would I be looking at if I was trying to judge these sites on experience?
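(To make that "check a box like a quality rater" idea concrete, here's a rough sketch of how hand-labeled yes/no signals could be correlated with an outcome such as losing visibility after the HCU. The column names, toy data, and choice of a Pearson correlation – equivalent to the phi coefficient for two binary variables – are illustrative assumptions, not the study's actual dataset or method.)

```python
import pandas as pd

# Hypothetical illustration: a handful of manually rated sites.
# Each quality signal is a human yes/no judgment (1 = present), and
# hcu_hit marks whether the site lost visibility after the update.
ratings = pd.DataFrame({
    "first_person_pronouns": [1, 1, 0, 0, 1, 0],
    "original_images":       [1, 0, 0, 1, 1, 0],
    "intrusive_ads":         [0, 1, 1, 1, 0, 1],
    "contact_info":          [1, 1, 0, 0, 1, 0],
    "hcu_hit":               [0, 1, 1, 1, 0, 1],
})

# Correlate each binary signal with the HCU outcome; negative values suggest
# the signal is associated with keeping visibility, positive with losing it.
correlations = ratings.drop(columns="hcu_hit").corrwith(ratings["hcu_hit"])
print(correlations.sort_values())
```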
Observations and conclusions (more here):
- Firsthand experience: you want to create content that nobody else can create just by doing a lot of research on the internet.
- I think they were responding to all the SEO tools that teach you how to pump out content that looks like everybody else's. And they're trying to reward human-based signals. So that's the big one: human-based content.
- I'm advising my clients these days, just get rid of all your stock images, replace them with something else, do something unique.
Being a Quality Rater
I wanted to become a quality rater because I suspect that Google is using quality rater data much more extensively than they let on. They currently have 17,000 quality raters around the world. They're cutting the workforce by 3,500 in April. They sort of downplay it if you listen to their public statements. But I think they're probably using it much more extensively than that.
With machine learning, they're undoubtedly feeding some of these example websites that quality raters rate into their deep learning algorithms to reward this, demote that. And I wanted to learn more about what that process was like.
When I started the job, I was thinking like an SEO. I would be like, "Whoa, this page has keywords." When I stopped thinking like an SEO and started thinking, "What sites would an elderly parent or grandparent who's not internet savvy trust?", that's how you want to approach working as a quality rater. I think that's why Google hires humans to do this job, because those very human signals are really hard to capture with machine learning or traditional ranking signals.
We have to consider that Google probably is looking at user interaction data in the Helpful Content Update: seeing if people are clicking on a result, the pogo-sticking, bounce rate, whatever you want to call it. Are people satisfied with what they're searching for online? Is it actually helpful?
In the [Google default search] antitrust trial, I think they testified that RankBrain uses 13 months of click data. And if they're using months of click data for the Helpful Content Update, it may explain why we've seen shockingly few recoveries.
I [also] think if more people realized sites were being rated on mobile, they might pay a little more attention to that experience.
The Future of SEO
Over the last 10 or 15 years, we've seen this shift: Google used to be the gateway to the internet. Now it's a gateway to a product. You're looking at a product that Google is delivering to you – and the web is somewhere else.
I think when we say the internet is broken we're talking about the gateways to the internet: the TikToks, Instagrams, Facebook, Google. These gateways are all ruled by algorithms and all send us to a particular experience and a particular product. But I think creators are doing better than they ever have been.
Because Google has productized the experience, that product is more easily replicable by AI. They've created something that can be replaced by a different experience in a few years. And we're going to see a big shift; I don't know what's gonna happen, but it's being reshaped.
I think the model that we've had over the last few years – where you do your research on Ahrefs or Moz or Semrush, find some keywords, then use another tool to optimize your content so it looks like all the other content on the web – that sort of model has an expiration date on it. It can't last forever, because that process can be easily replicated by AI.
But I still think 99% of people can continue the traditional SEO model that we've seen work and keep investing in those channels, at least for a while, but start to think about branching out.
We need more people creating original content with original perspectives. Try to think about the things that can't be replicated by AI, such as unique video content, unique graphics, original research. Maybe shift a little bit towards original research and expertise if you're not doing that already. But otherwise, keep doing what you're doing for now.