Google Confirms You Can’t Add EEAT To Your Web Pages via @sejournal, @martinibuster

Google’s John Mueller offered an overview of EEAT (Experience, Expertise, Authoritativeness, and Trustworthiness) at the Search Central Live NYC event and affirmed why it matters for some sites and why it’s not something SEOs can add to a website.

EEAT’s Relation To Quality Raters And YMYL Websites

John Mueller started this part of his discussion by explicitly tying the concept to its use as a way for third-party quality raters to provide a more objective judgment about the quality of the search results. He did not say that EEAT was created for SEOs to use as a ranking factor guide; in fact, he expressly said that’s not how it works.

What is especially notable is that Mueller says that EEAT comes into play algorithmically for sites that are in topics that affect health or finance, what Google terms Your Money Or Your Life (YMYL) topics.

This is what he said, according to my notes, which contain some paraphrasing:

“EEAT is one of the ways that we look at page quality. EEAT is experience, expertise, authoritativeness and trustworthiness. And this is something that we tell the third party quality raters to watch out for when they’re doing page quality evaluation and something that we take into account when we think the query or a set of pages is on a specific topic where it’s more critical, where we call them your money, your life pages. Where we think that the user actually needs to have something that they can rely on and some signs that they can rely on the content that is present.”

EEAT Is Not Something You Add To Web Pages

In his follow-up statements he dismissed the idea that an SEO can add EEAT to their web pages. EEAT is not something you can add to a website. That’s not how it works. So if adding EEAT is part of what you do for SEO, stop. That’s not SEO.

This is what Mueller said:

“Sometimes SEOs come to us or like mention that they’ve added EEAT to their web pages. That’s not how it works. Sorry, you can’t sprinkle some experiences on your web pages. It’s like, that’s that doesn’t make any sense.”

Photo From Google Search Central Live NYC

EEAT Is Not Needed On Non-YMYL Pages

Lastly, Mueller repeated the point that EEAT is not something that they’re looking for in run-of-the-mill websites. Obviously it’s great if the content has expertise and trustworthiness and so on. But he said it’s not something they’re algorithmically alert for on those kinds of sites, specifically naming recipe sites.

This is what he said:

“From a practical point of view, it’s important to look at this, especially if you’re publishing things on these critical topics, and to look at how you can highlight what it is that you’re already doing so that it’s clear for users.

But if you’re creating a recipe for cookies, you don’t need to have the sidebar with like, ‘this author has created cookies for 27 years.’ I think most people will be able to understand.”

Takeaways

EEAT’s Purpose and Scope

EEAT is used by third-party quality raters to assess search result quality. It was not created by Google as a list of ranking factors for an SEO checklist.

EEAT’s Role in YMYL Topics

Google algorithmically considers EEAT for pages that affect users’ finances or health, which the Quality Raters Guidelines refer to as Your Money or Your Life (YMYL) topics. These are the topic areas where reliability and trust are critical for user safety and confidence, and where Google is especially concerned that those qualities are expressed in some way, internally and/or externally, about those sites. Google doesn’t say what those signals are.

Misconceptions About EEAT in SEO

John Mueller emphasized that EEAT is not something SEOs can “add” to a website the way they might add keywords or internal links. Attempting to “add EEAT” is a misunderstanding of how the concept works within search.

EEAT and Non-YMYL Websites

EEAT is not something that is required in an algorithmic context for non-YMYL sites, such as recipe blogs or other kinds of non-critical content. While it’s useful in a general or marketing sense to reflect expertise and trust, it’s not a ranking focus for most topics.

EEAT was explicitly created for the third-party quality raters to use as a more objective benchmark. That fact gets lost in the conversations SEOs have about the topic. EEAT is also not particularly important for sites outside of YMYL topics. Lastly, EEAT is not something an SEO can add to a page. Creating a bio with an AI-generated image, linking it to a fake LinkedIn profile, and then calling it EEAT is not a thing. Trustworthiness, for example, is earned and shows up when people make recommendations (which doesn’t mean that SEOs should create fake social media profiles and start talking up an author at a website). Nobody really knows what the EEAT signals are.

Featured Image by Shutterstock/RealPeopleStudio

Google Updates Unfair Advantage Policy, Advertisers React via @sejournal, @brookeosmundson

On Friday, Google sent out a subtle but impactful policy update to advertisers, confirming changes to its long-standing “Unfair Advantage Policy”.

While the official enforcement date is April 14, 2025, the conversation has already started — and it’s anything but quiet.

The PPC community is buzzing with opinions, questions, and concerns. But this update didn’t come out of nowhere.

About a month ago, Google quietly laid the groundwork for this change without most people noticing.

Let’s unpack exactly what’s happening, why it matters, and how advertisers are reacting.

What Did Google Change?

The core of the update is about limiting how many ads a business, app, or site can show in a single ad location. Here’s Google’s new language:

Google email to advertisers about Unfair Advantage policy update.

The new language is crucial to understand.

The focus isn’t on restricting brands from showing multiple ads across different placements—it’s about stopping advertisers from stacking multiple ads in the same slot, which would effectively block competition and inflate dominance.

It’s not a total ban on multiple ads from the same advertiser showing on a single page, but rather a limit within a specific ad location.

However, as with many Google Ads policies, the phrase “single ad location” is doing a lot of heavy lifting—and advertisers are left wondering how Google will interpret and enforce it in practice.

One notable detail: Google says violations won’t lead to instant account suspensions. Advertisers will receive a warning and at least seven days to address any violations before facing suspension.

This is important. Google seems to be trying to strike a balance between tightening policy and giving advertisers room to adapt.

The Breadcrumb Many Missed – February Auction Documentation Update

Interestingly, this isn’t the first time Google has hinted at this shift.

Back in February 2025, advertisers noticed that Google updated its documentation on “How the Google Ads Auction Works”.

The update clarified that Google runs separate auctions for each ad location, meaning that the auction for the first position is distinct from the auction for the second, third, and so on.

Ginny Marvin, Google Ads Liaison, even acknowledged the change directly in LinkedIn discussions. This detail flew under the radar for many but now seems like a foundational piece for this official Unfair Advantage update.

Effectively, Google was setting the table a month ago. This policy update simply formalizes how those auctions will now prevent advertisers from “double-serving” or stacking ads in the same position.

Why Google Is Doing This, And Why Now

Google’s goal here appears twofold:

  1. Auction Fairness — Google wants to prevent scenarios where advertisers, affiliates, or large multi-account setups game the system by occupying multiple positions within a single auction.

  2. Affiliate Abuse Control — This rule directly calls out affiliates who break affiliate program rules, a growing concern in Google’s search ecosystem.

Of course, some advertisers suspect there’s a third goal: protecting the user experience and, more directly, protecting Google’s own long-term revenue by encouraging more advertisers to compete rather than allowing the largest players to squeeze others out.

Advertisers Give Mixed Reactions to Google Update

Although the update was emailed to advertisers on a Friday afternoon, marketers didn’t waste time sharing their takes.

Andrea Atzori, who also received the email from Google, took to LinkedIn to provide his take on the update.

Atzori highlighted that this change is more about clarification than transformation, as he’d seen the same advertiser in multiple locations previously.

Navah Hopkins also took to LinkedIn with a briefer post, eager to hear thoughts from fellow marketers on the Unfair Advantage policy.

Hopkins and others noted that while the update may sound fair in theory, the proof will come in how it affects impression share, Auction Insights, and real-world campaign performance.

From the comments on Hopkins’ post, early reactions lean toward skepticism and questions:

Chris Chambers commented:

This is going to be wild from a metric reporting standpoint since it seems like right now it counts as 2 impressions and also affects your impression share and position in Auction Insights (same with competitors).

But it also seems like now the advertisers with the most to spend in each niche will get even more real estate and be able to show twice, potentially cutting out smaller competitors completely from the first page.

Steve Gerencser had a similar take to Chambers:

I wonder how they are going to count people that pogo from one ad right back to the next and then back to something else? I can see a lot of wasted ad spend, or an opportunity for someone with deep pockets to dominate.

Some worry that well-funded advertisers will still find ways to dominate, while smaller brands hope this levels the playing field.

What Advertisers Should Watch For

While the policy may not seem earth-shattering at first glance, it does come with a few things advertisers should actively monitor.

First, smaller and mid-sized advertisers may stand to benefit, at least in theory. By limiting how many ads a single business can show in one location, Google could slightly reduce the dominance of big-budget brands that have historically owned the top of the page through multiple placements.

This could open up space for other players to get visibility where previously they were pushed out.

But, as several PPC pros pointed out on LinkedIn, the big question is how Google defines and enforces a single ad location in practice.

Google clarified last month that each ad location runs its own auction, meaning it’s technically possible for a brand to show up in multiple places on the same page—just not in the exact same slot.

So, while the policy aims to limit dominance, it doesn’t necessarily mean fewer total appearances for advertisers with deep pockets.

This also has potential ripple effects on Auction Insights reports. If Google starts filtering or limiting how often multiple ads from the same business appear in a given location, expect impression share metrics and overlap rates to behave differently—maybe even unexpectedly.

Advertisers will need to watch Auction Insights and Impression Share trends closely post-April to see if any patterns emerge.

Additionally, affiliate marketers and businesses using aggressive multi-account or multi-site strategies should be especially careful. The updated policy makes it clear that affiliates must play by their program’s rules and can no longer try to sneak multiple ads for the same offer into the same auction.

While Google says you’ll get a warning before any suspension, it’s probably wise to get ahead of this now, rather than risk a compliance issue later.

And finally, there’s still some ambiguity about multi-brand or franchise setups. If you’re managing a brand with multiple sub-brands, sister companies, or franchisees, the question remains: will Google treat you as one business under this policy or multiple?

That detail could make a big difference, especially for large organizations or verticals like automotive, real estate, or hospitality.

Final Thoughts: Is This Really a Game-Changer?

Honestly? It’s hard to call this a monumental shift yet. The update feels more like a formalization of existing enforcement patterns than a radical new rulebook.

That said, the PPC community is right to question what this will look like in Auction Insights and daily performance reports. Whether this is a minor tweak or the start of stricter anti-duplication policing will become clearer as advertisers see real-world data throughout Q2 and beyond.

Either way, it’s worth watching—especially if you’ve ever benefitted from, or competed against, someone taking up too much SERP real estate.

Google Says They Launch Thousands Of Updates Every Year via @sejournal, @martinibuster

Google’s John Mueller explained during a session of the Search Central Live NYC event that they do over 700,000 tests per year in order to keep up with user expectations. His explanation of why Google performs so many tests and launches thousands of changes should give SEOs an idea of the pace of change going on at Google and should inspire publishers and SEOs to consider ways that they too can take steps to anticipate user expectations and roll out changes to satisfy them.

Updates Are Not Done In Isolation

The first thing that Mueller said about updates is that they’re not done in isolation but rather they use third-party raters, a fresh pair of eyes, to evaluate their tests and new updates to their algorithms.

Mueller explained:

“So there is a lot of activity happening on the web and we kind of have to keep up with that as well.

How we look at things when it comes to updates, I think this is maybe a bit of a jarring transition here, but essentially when we work on changes with regards to search, one of the things that is core to all of the changes that we do is that we don’t do them in isolation just because we think they’re good, but rather that we find ways to actually test to make sure that they are actually good.

And one of the ways that we do that is we work together with so-called quality raters.”

Number Of Tests And Launched Updates

Google conducts a staggering number of tests every year and launches thousands of changes (updates).

Photo Showing Number of Google Updates Per Year

John Mueller said (includes paraphrasing):

“When it comes to changes that we do this number is from 2023. I imagine the number from last year is similar. We’ve made over 4,700 launches. And these launches come from over 700,000 tests and we make tests all the time. You can try to calculate like how many tests are running every day. If you assume that a test maybe runs for two weeks over the course of the day, like there are lots of tests that are happening in parallel.”
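
Taking those figures at face value, a rough back-of-the-envelope calculation (assuming the tests are spread evenly across the year): 700,000 tests works out to roughly 1,900 new tests starting every day, and if each test runs for about two weeks, that implies somewhere in the neighborhood of 27,000 tests running in parallel at any given moment.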

Google Says It’s All About User Expectations

Mueller offered an explanation of what motivates Google to do so many tests and launch thousands of updates to the search results. He said it’s all about meeting user expectations.

This is what he said:

“And that also means that when people look at the search results, they see things that are sometimes a bit different.

From our point of view it’s not so much that we’re doing all of this work to keep making changes to make it hard for people to keep up, but rather because we see that users have very high expectations of the web and we want to make sure that whatever expectations they have tomorrow we can still kind of fulfill.”

Takeaway For Publishers And SEOs

Google’s not running hundreds of thousands of tests a year to confuse SEOs and publishers. They’re doing it to stay ahead of what users want before users even know they want it.

SEO has historically been reactive, which means that search marketers and publishers wait until Google announces an update and then run back to their websites to “fix” whatever they think is broken. SEO eyes are always on Google when they should really be on their own audiences, thinking ahead about how consumers or site visitors are aging out, reading fewer blogs, and changing their habits. Do you have to wait until Google announces an accessibility update before you test whether your site is usable for visitors on screen readers? Are client sites usable for people who are color blind, or are you going to wait for an update? That’s reactive.

One of the reasons Google is number one in many things is because they didn’t wait for someone else to do it first. Before Gmail, all email providers gave their users email space measured in megabytes. Google killed their competition because they were offering users gigabytes of free space.

So maybe SEOs and publishers should scroll up and re-read the reasons that John Mueller gave to explain why Google does hundreds of thousands of tests and launches thousands of updates every year. If you’re not already being proactive then I really think that this is the year you start thinking about ways to do that.

Takeaways:

Google’s Testing Volume and Frequency

  • Google performs over 700,000 tests annually.
  • In 2023 alone, these tests led to over 4,700 changes to Search.
  • Tests often run in parallel, with many active at the same time.
  • This volume reflects a continuous, high-speed development cycle.

Why Google Runs So Many Tests

  • Google’s motivation for running so many tests is to anticipate user expectations.
  • Despite their setbacks with AI, the number of tests and changes is the reason why Google remains a formidable competitor.

Implications for SEOs and Publishers

  • Search marketers and publishers who want to keep up with Google should consider emulating Google’s approach to users and look for ways to anticipate user behavior, expectations, and trends.
  • Start testing and improving now rather than waiting for a Google update before accounting for shortcomings.
  • Consider a site audit by a fresh pair of eyes.

Google Shares Valuable SEO Takeaway About Quality Raters Guidelines via @sejournal, @martinibuster

At the recent Search Central Live NYC, Google’s John Mueller discussed the third-party quality raters they use to evaluate changes to Google’s search algorithms. Although it wasn’t stated explicitly, the nuance was implied: keeping a human in the loop remains an important factor in fine-tuning your SEO decisions.

Third Party Quality Raters

Hopefully by now everyone knows that Google employs third-party quality raters to review algorithm changes and provide feedback that can be used to judge the success of various algorithm updates and tests. The raters don’t actually affect the rankings of individual websites; their judgments are about the effectiveness of the algorithms, which themselves affect hundreds of thousands, even millions, of sites across the Internet.

Ordinarily such judgment calls of whether a site is useful or not are highly subjective (a matter of personal opinion). That’s why Google created a set of guidelines for the quality raters to use so as to standardize the criteria the raters use and make their judgments more objective (like considering facts that are either true or false).

Here is, according to my notes, how John Mueller explained it:

“And one of the ways that we do that is we work together with so-called quality raters. These are external people who review the quality of search results, who review the quality of web pages to let us know, are we in a good place? Are we going in the right direction? Are the changes that we are working on actually making sense and acceptable for you?”

What’s notable about that exchange is that the whole point of judging the algorithms is whether or not they are acceptable to humans.

Mueller next introduced the topic of the quality raters guidelines and why it’s important for SEOs and publishers to read them. In fact, he calls the document important and encourages anyone concerned about ranking better to at least give it a scan for topics that may be relevant to them.

He continued:

“So we have a set of guidelines that we published for these quality raters, which I think is actually surprisingly important. It’s a gigantic book, something I don’t know, 180 pages long. But it’s a lot of guidelines where we kind of draw out what we think makes sense for quality raters to review with regards to the content. And this is publicly available. You can look at it yourself as well.

I think for most websites it makes sense at least have gone through it, or maybe control F and search through it for keywords that you care about just so that you have a sense of what Google is thinking when they’re making changes.”

The Quality Raters Guidelines Is Not A Handbook Of Ranking Factors

There are many SEOs who have spread the misinformation that the quality raters guidelines offer a peek into what Google uses to rank websites. That’s false.

Mueller continues (my paraphrase):

“Obviously quality rater guidelines is not a document that says like, this is how we do ranking, but more just, this is how we review things on the web when we ask for input from these quality raters.

They do a number of different tasks for us and so one of them is page quality where they tell us like, is this a high quality page or not? Another one is to evaluate whether the pages that we show in the search results meet the needs of a user. Which is highly subjective sometimes, but we give them information on what they can do there and the other one is A/B testing, side-by-side testing where we present quality raters with a set of pages before and a set of pages afterwards, and they tell us which one of these is actually better.”

Humans In The Loop

The important takeaway from Mueller’s discussion about the quality raters and the guidelines they use is that how humans react to the search results is at the heart of what Google is doing with its algorithms. Some people tend to think of Google’s algorithms as mechanical machines cranking out search results, and that’s pretty much what they are, but they’re also emulating human judgment about what is and is not spam, and what is and is not a high-quality search result.

Rote SEO is highly focused on feeding the machine but the machine itself is emulating human judgment. SEO today is more than ever about considering how every choice made about a site affects humans and less about worrying about whether you’ve got enough entity keywords on a page and if the H1 heading is missing.

Human Judgment Is Core to Google’s Algorithm Development

Quality raters are used to judge whether algorithm changes make search results better for people. Algorithms are adjusted based on human reactions, not by machine metrics.

Quality Raters Guidelines Reflect Google’s Values

Google’s Quality Rater Guidelines are not a ranking manual. They define what Google considers useful, high-quality content. They can serve as a mirror that business owners and SEOs can hold up to their own content to see how it aligns with Google’s criteria for high quality.

SEO Today Is About Human Experience

The deeper message buried in what Mueller was talking about is that Google’s algorithms are trying to emulate human judgment, so SEOs should focus on user experience and usefulness, not checklists or busy-work like adding author bios with superfluous information that does nothing for site visitors.

Google Revisits 15% Unseen Queries Statistic In Context Of AI Search via @sejournal, @martinibuster

At Search Central Live NYC, Google’s John Mueller revisited the statistic that 15% of search queries Google sees are completely brand new and have never been encountered. He also addressed what impact the advent of AI has had on the number of novel search queries users are making today.

Understanding Language

Understanding language is important to serving relevant results for search queries. That means Google needs to understand the nuances of what people mean when they search, which could include unconventional use of specific words or complete misuse of words. BERT is an example of a technology that Google uses to understand user queries.

Google’s introduction to BERT explains:

“We see billions of searches every day, and 15 percent of those queries are ones we haven’t seen before–so we’ve built ways to return results for queries we can’t anticipate.

…Particularly for longer, more conversational queries, or searches where prepositions like “for” and “to” matter a lot to the meaning, Search will be able to understand the context of the words in your query. You can search in a way that feels natural for you.”

Are There More Unknown Search Queries?

In the context of an overview of Google Search, John Mueller briefly discussed the statistic about new queries Google Search sees and whether LLMs have made any impact.

This is, according to my notes, what he said:

“15% of all queries are new every day. This is something that I’m surprised is still the case. I would have thought at some point most of the searches would have been made; people just ask the same thing over and over again.

But when we recalculate these metrics, it’s always around 15%. I imagined maybe with LLMs and the AI systems that maybe it would be a bit higher in recent years but it’s still hovering around that number.”

He then speculated about why that 15% number remains the same, attributing it to the fact that things are always changing and that life is not static.

My paraphrase of what Mueller observed:

“It’s fantastic to see because it means to me that people keep going to search and looking for something new… and if people would stop going to search or stop searching new things, and to me that would be a sign that maybe something is wrong here. So this is a great number.”

Curious Outcome

It’s amazing that something as groundbreaking as AI search and the ability to search visually would be expected to add more complex searches that Google has never seen before, and yet that 15% number keeps holding steady.

Google’s SEO Tips For Better Rankings – Search Central Live NYC via @sejournal, @martinibuster

Google’s Search Liaison answered a question at Google Search Central Live NYC about whether Google prefers brands. Danny Sullivan took that as an opportunity to affirm that Google is working to show more independent sites and also offered insights into how independent sites can improve their search performance.

Google Wants Good Independent Sites To Rank

Someone at the Search Central Live NYC event submitted a question asking whether Google was focusing on just showing a smaller set of sites from the Internet that’s limited to big brand sites. Danny Sullivan, aka Google Search Liaison, immediately responded, no. He acknowledged that he understands there’s a sense that big brands always rank well on Google and that many people say Google only wants to show big brands.

Google Search Central Live New York City

Photo of Google Search Liaison Danny Sullivan taken at Search Central Live NYC Event

Sullivan acknowledged that this is a valid concern from small independent sites because there are many who are doing good work who aren’t ranking as well as they should be and explained that they were working on it.

The following is a paraphrase based on my notes:

“And we’ve been spending a lot of time (and we’re going to continue to spend a lot of time) to understand how can we do a better job on better understanding and perhaps guiding some of the smaller creators and small independent sites so they can be successful. It has been like a huge chunk of my time over the past year. And I’m not alone in it.

We were just in Zurich last week. We were just out there and we were looking at a bunch of real queries from small creators, independent sites and sitting with the ranking team and going through them and what’s happening here and …we made a note that you know, we have done some changes that we think help and we have done some changes that have helped. We also anticipate working through the whole rest of the year.”

Why Changes Are Incremental

Danny explained that independent sites and their topic areas vary widely, which complicates applying a single algorithmic solution to help them all. That explains why Google keeps saying they’re making incremental changes.

According to my notes, he said:

“One of the things I would say is I don’t expect you’re going to suddenly see one day we do a big huge, ‘And here is the independent small site update’ type of thing. I think it’s going be these incremental things that we do, in part because these kinds of sites are not monolithic.”

That Thing You Need To Know About Brands

Danny discussed how serious they are about finding solutions for independent publishers and eventually began speaking of more tangible things that publishers can do to help themselves, specifically about becoming memorable to site visitors.

This is something that I’ve been doing for over twenty years. I never rolled out an affiliate or AdSense site that didn’t have a carefully planned domain name, logo and mascot in place. That mascot is super important because it helps make the site memorable to site users. They’ll forget the domain name but they’ll remember that mascot and the site.

Danny said that Google’s systems are not tuned to identify big brands and rank them well. He acknowledged that sites with a lot of branded searches might rank well and this is the point where it felt like okay, am I really hearing this? It’s the kind of information you come to these events for.

This is a paraphrase from my notes of what Danny said:

“And I’ve seen where people do research and say, ‘I’ve figured out that if you have a lot of branded searches…’ That’s kind of valid in some sense.

But it’s not like you have a lot of big branded searchers or small branded searchers or whatever and you’re finding that correlates to your traffic. What it’s saying is that people have recognized you as a brand, which is a good thing. We like brands. Some brands we don’t like, but at least we recognize them, right?

So if you’re trying to be found in the sea of content and you have the 150,000th fried chicken recipe, it’s very difficult to understand which ones of those are necessarily better than anybody else’s out there.

But if you are recognized as a brand in your field, big, small, whatever, just a brand, then that’s important.

That correlates with a lot of signals of perhaps success with search. Not that you’re a brand but that people are recognizing you. People may be coming to you directly, people may be referring to you in lots of different ways… You’re not just sort of this anonymous type of thing.

So, one thing I would encourage anybody, but especially to smaller and independent ones that are kind of feeling like the big brands are kind of getting it all is, are you making sure that people understand who you are?”

Differentiate Yourself. A Lot.

Danny Sullivan said that users submitted over 13,000 sites with feedback about Google’s algorithm and claimed that he’s confident he’s looked at more sites than any SEO in the audience has. He acknowledged that many of the submissions had valid concerns, but he also said he noticed that some sites that were high quality lacked that extra bit that made them different and better.

What he was referring to, in my words, not Danny’s, was a clear narrative on the page that lets site visitors know who is behind the site. He wasn’t talking about the sidebar with the bio and a photo that travel and recipe sites all have. He was talking about something that goes beyond the generic narrative that many bloggers use.

This is a paraphrase from my notes about what Danny said:

“I can land on a site and have no idea who runs the site, what the site is about. Who’s behind it? That’s not to say that if you put an ‘about us’ link on your site that now you’ll rank better. But people come to websites from Search and they don’t know what they’re getting into.”

He then contrasted social media to search to show how a forum or a social media site offers a carefully curated experience where you know where everything is at, where expectations are managed. He then said that Search is completely different. While Danny didn’t explicitly say this, I believe what he meant to communicate was that the randomness of sites that Google sends people to can be jarring to users who consequently aren’t sure whether to trust a site. It’s a different experience than the carefully curated experience of a forum or social media site and for that reason it’s important to be able to give a sense of who is behind the site.

This is a paraphrase of what Danny said:

“Search is nothing like that. Search is a grab bag. It’s weird. You don’t know what you’re going to get. It’s like I’m feeling lucky. You don’t know what you’re going to end up with.

And please, I beg you, especially those of you that said that Google wants everything to be the same. That’s not what we want. We don’t want every website to be a cookie cutter site.

We want you to build websites that you think makes sense for your readers.

Anytime you ever have a question about what you should be doing to be successful in Google search and your answer is to ask if it’s a good thing for your readers, if you do that, you are aligning with the things we’re trying to do because we’re trying to send people to satisfying content so that they go, ‘This was great! This is wonderful, I loved it!’

So when they wind up on your website, probably for the first time and they don’t know you from anything and they’re coming from this crazy world where they don’t even know where the profiling for the author is, make it easy for them. Make it easy for them to come into the site and know exactly what you’re about.

I know the travel bloggers, you all have the thing on the side that says, ‘we love travelling the world…’ It’s like, OK, that’s fine and at least people know to expect that from travel bloggers and you’ve got it there.

But help them understand what’s unique or different about you, that makes you a brand. And that is a really good thing.”

Insights From Search Central Live NYC

Google Actively Supports Independent Sites

Danny Sullivan said multiple times that Google is spending a significant amount of time improving the algorithm so that more independent publishers will attain visibility in search. However, these improvements are incremental because the wide variety of sites and topics means that one change won’t affect all sites equally.

Brand Recognition Drives Search Success

Being recognized as a brand by site visitors is a quality that highly successful sites tend to have. It’s not that cultivating a brand is a ranking factor, but rather that cultivating recognition among site users leads to stronger search signals.

Differentiation Is Important

Some high-quality sites fail to stand out because they do what everyone else is doing. Site visitors may appreciate more effort to make it clear who is behind the site. Consider avoiding things like rote, generic bios in favor of providing a real sense of who runs the site and why it matters.

Clarity Builds Trust

Recognize that the web has an element of randomness that makes some site visitors wary about visiting a site for the first time. Design with this understanding in mind.

Design for the Reader, Not the Algorithm

One of the most common mistakes I see publishers make is that they can list all of the things they did for SEO but very little, if anything, that they did for their site visitors. Danny Sullivan recommends basing decisions on whether a change is good for the site visitors, because that aligns the site with the kinds of sites Google wants to rank.

Google Completes March 2025 Core Update Rollout via @sejournal, @MattGSouthern

Google officially completed the rollout of its March 2025 Core Update today at 5:34 AM PDT, ending two weeks of significant volatility in search rankings.

This update began on March 13 and has created notable shifts in search visibility across various sectors and website types.

Widespread Impact Observed

Data collected during the update’s rollout period revealed some of the most volatile search engine results pages (SERPs) in the past 12 months, according to tracking from Local SEO Guide.

Their system, which monitors 100,000 home services keywords, showed unprecedented movement beginning the week of March 10th.

SISTRIX’s Google Update Radar confirmed these findings, detecting substantial changes across UK and US markets starting March 16th.

Forum Content Recalibration

One of the most significant trends emerging from this update is a recalibration of how Google values forum content.

After approximately 18 months of heightened visibility for forum websites following Google’s mid-2023 “hidden gems” update, many forum sites are now experiencing substantial drops in visibility.

SEO strategist Lily Ray highlighted this trend, noting steep visibility declines for platforms like proboards.com, which hosts numerous forum websites.

Ray pointed out that while Reddit continues gaining visibility, many other forum sites that benefited from the 2023 algorithm changes are now seeing their rankings diminish.

“The SEO glory days of ‘just be a forum and you’ll rank’ might be coming to an end,” Ray observed.

Additional Patterns Identified

Andrew Shotland, CEO of Local SEO Guide, identified several other potential patterns in this update:

  1. Forum Content Devaluation: While Reddit remains strong, other forums are seeing their previously gained visibility disappear.
  2. Programmatic Content Penalties: Sites creating large volumes of programmatic pages, particularly those designed specifically for SEO rather than user value, are experiencing significant declines.
  3. Cross-Sector Impact: Unlike some updates that target specific industries, this core update has affected sites across retail, government, forums, and content publishers.

Industry professionals commenting on the update have noted the potential connection to Google’s broader efforts to improve search result diversity and combat low-value content.

This recalibration may also relate to the ongoing integration of AI-generated content in search results.

What This Means for SEO

With the update now complete, SEO professionals can begin to assess the full impact on their sites and implement appropriate strategies.

For those managing forum content, this update signals the importance of quality over quantity and suggests that simply having forum content is no longer sufficient for strong rankings.

Sites negatively impacted by the update should focus on improving content quality, removing programmatic or low-value pages, and ensuring their content genuinely addresses user needs rather than being created primarily for search engines.

Search Engine Journal will continue to monitor the aftermath of this core update and provide additional analysis as more data becomes available.

Google Rolls Out AI-Powered Travel Updates For Search & Maps via @sejournal, @MattGSouthern

Google has released its annual summer travel trends report alongside several AI-powered updates to its travel planning tools.

The announcement reveals shifting travel preferences while introducing enhancements to Search, Maps, Lens, and Gemini functionality.

New AI Search and Planning Features

Google announced five major updates to its travel planning ecosystem.

Expanded AI Overviews

Google has enhanced its AI Overviews in Search to generate travel recommendations for entire countries and regions, not just cities.

You can now request specialized itineraries by entering queries like “create an itinerary for Costa Rica with a focus on nature.”

The feature includes visual elements and the ability to export recommendations to various Google products.

Image Credit: Google

Price Monitoring for Hotels

Following its flight price tracking implementation, Google has extended similar functionality to accommodations.

When browsing google.com/hotels, you can now toggle price tracking to receive alerts when hotel rates decrease for selected dates and destinations.

The system factors in applied filters, including amenity preferences and star ratings.

Image Credit: Google

Screenshot Recognition in Maps

A new Google Maps feature can help organize travel plans by automatically identifying places mentioned in screenshots.

Using Gemini AI capabilities, the system recognizes venues from saved images and allows users to add them to dedicated lists.

The feature is launching first on iOS in English, with Android rollout planned.

Gemini Travel Assistance

Google’s Gemini AI assistant now offers enhanced travel planning support, allowing users to create “Gems” – customized AI assistants for specific travel needs.

Now available at no cost, these specialized assistants can help with destination selection, local recommendations, and trip logistics.

Expanded Lens Capabilities

Google Lens continues evolving, offering enhanced AI-powered information delivery when pointing your camera at landmarks or objects.

The feature is expanding beyond English to include Hindi, Indonesian, Japanese, Korean, Portuguese, and Spanish, complementing its existing translation capabilities.

Image Credit: Google

Travel Search Trends

According to Google’s Flights and Search data analysis, travelers are increasingly drawn to coastal destinations for the Summer of 2025.

Caribbean islands, including Puerto Rico, Curacao, and St. Lucia, are seeing significant search growth, along with other beach destinations like Rio de Janeiro, Maui, and Nantucket.

The data also reveals continued momentum for outdoor adventure travel within the U.S.:

  • Cities with proximity to nature experiences (Billings, Montana; Juneau, Alaska; and Bangor, Maine) are experiencing higher search volume
  • “Cabins” has emerged as the top accommodation search for romantic getaways
  • Family travelers are increasingly searching for “dude ranch” vacations
  • Weekend getaway searches concentrate on natural destinations, including upstate New York, Joshua Tree National Park, and Sedona.

An unexpected trend in luggage preferences was also noted, with “checked bags” queries now exceeding historically dominant “carry on” searches.

Supporting this shift, space-saving solutions like vacuum bags and compression packing cubes have become top trending travel accessory searches.

Implications for SEO and Travel Content

These updates signal Google’s continued investment in controlling the travel research journey within its own ecosystem.

The expansion of AI-generated itineraries and information potentially reduces the need for users to visit traditional travel content sites during the planning phase.

Travel brands and publishers may need to adapt their SEO and content strategies to account for these changes, focusing more on unique experiences and in-depth content beyond what Google’s AI tools can generate.

The trend data also provides valuable insights for travel-related keyword targeting and content development as summer vacation planning begins for many consumers.

Top SEO Shares How To Win In The Era Of Google AI via @sejournal, @martinibuster

Jono Alderson, former head of SEO at Yoast and now at Meta, spoke on the Majestic Podcast about the state of SEO, offering insights on the decline of traditional content strategies and the rise of AI-driven search. He shared what SEOs should be doing now to succeed in 2025.

Decline Of Generic SEO Content

Generic keyword-focused SEO content, like every SEO tactic, has long been on a slow decline, arguably beginning with statistical analysis, then machine learning, and now the age of AI-powered search. Citations from Google are now more precise and multimodal.

Alderson makes the following points:

  • Writing content for the sake of ranking is becoming obsolete because AI-driven search results provide those answers.
  • Many industries and topics, such as dentist and recipe sites, are oversaturated with nearly identical content that doesn’t add value.

According to Alderson:

“…every single dentist site I looked at had a tedious blog that was quite clearly outsourced to a local agency that had an article about Top 8 tips for cosmetic dentistry, etc.

Maybe you zoom out how many dentists are there in every city in the world, across how many countries, right? Every single one of those websites has the same mediocre article that somebody has done some keyword research. Spotted a gap they think they can write one that’s slightly better than their competitors. And yet in aggregate, we’ve created 10 million pages that none of which show the purpose, all of which are fundamentally the same, none of which are very good, none of which add new value to the corpse of the Internet.

All of that stops working because Google can just answer those kinds of queries in situ.”

Google Is Deprioritizing Redundant Content

Another good point he makes is that the days when redundant pages had a chance are going away. For example, Danny Sullivan explained at Search Central Live New York that many of the links shown in some of the AI Overviews aren’t related to the keyword phrase but are related to the topic, providing access to the next kind of information that a user would be interested in after they’d ingested the answer to their question. So, rather than show five or eight links to pages that essentially say the same thing, Google is now showing links to a variety of topics. This is an important thing publishers and SEOs need to wrap their minds around, which you can read more about here: Google Search Central Live NYC: Insights On SEO For AI Overviews.

Alderson explained:

“I think we need to stop assuming that producing content is a kind of fundamental or even necessary part of modern SEO. I think we all need to take a look at what our content marketing strategies and playbooks look like and really ask the questions of what is the role of content and articles in a world of ChatGPT and AI results and where Google can synthesize answers without needing our inputs.

…And in fact, one of the things that Google is definitely looking for, and one of the things which will be safe to a degree from this AI revolution, is if you can publish, if you can move quickly, if you can produce stuff at a better depth than Google can just synthesize, if you can identify, discover, create new information and new value.

There is always space for that kind of content, but there’s definitely no value if what you’re doing is saying, ‘every month we will produce four articles focusing on a given keyword’ when all 10,000 of our competitors employ somebody who looks like us to produce the same article.”

How To Use AI For Content

Alderson discouraged the use of AI for producing content, saying that it tends to produce a “word soup” in which original ideas get lost in the noise. He’s right; we all know what AI-generated content looks like when we see it. But I think what many people don’t notice is the extra garbage-y words and phrases AI uses that have lost their impact from overuse. Impactful writing is what supports engagement, and original ideas are what make content stand apart. These are the two things AI is absolutely rubbish at.

Alderson notes that Google may have anticipated the onslaught of AI-generated content by emphasizing EEAT (Experience, Expertise, Authoritativeness and Trustworthiness) and argues that AI can be helpful.

He observed:

“And a lot of the changes we’re seeing in Google might well be anticipating that future. All of the EEAT stuff, all of the product review stuff, is designed to combat a world where there’s an infinite amount of recursive nonsense.

So definitely avoid the temptation to be using the tools just to produce. Use them as assistance and muses to bounce ideas around with and then do the heavy thinking yourself.”

The Shift from Content Production to Content Publishing

Jono encouraged content publishers to focus on creating original research and expert insights, and on showing things that have gone unnoticed. He suggested that successful publishers are the ones who get out in the world and experience what they’re writing about through original research. He also encouraged focusing on authoritative voices rather than settling for generic content.

He explained:

“I think there’s definitely room to publish good content and publish. 2015-ish everyone started saying become a publisher and the whole industry misinterpreted that to mean write lots of articles. When actually you look at successful publishers, what they do is original research, by experts, they break news, they visit the places, they interact with things. A lot of what Google’s looking for in those kind of EEAT criteria, it describes the act of publishing. Yet very little of SEO actually publishes. They just produce. And I think if you …close that gap there is definitely value.

And in fact, one of the things that Google is definitely looking for, and one of the things which will be safe to a degree from this AI revolution, is if you can publish, if you can move quickly, if you can produce stuff at a better depth than Google can just synthesize.”

What does that mean in terms of a content strategy? One of the things that bothers me is the lack of originality in content. Things like concluding paragraphs with headings like “Why We Care” drive me crazy because to me it indicates a rote approach to content.

I was researching how to flavor shrimp for sautéing and every recipe site says to sprinkle seasonings on the shrimp prior to a quick sauté at a medium high heat, which burns the seasonings. Out of the thousands of recipe sites out there, not one can figure out that you can sauté the shrimp, add some garlic, then when it’s done add the seasoning just after turning off the flame? And if you ask AI how to do it the AI will tell you to burn your seasonings because that’s what everyone else says.

What that all means is that publishers and SEOs should focus on hands-on original research and unique insights instead of regurgitating what everyone else is saying. If you follow directions and it comes out poorly maybe the directions are wrong and that’s an opportunity to do something original.

SEO’s Role in Brand-Building & Audience Engagement

When asked what the role of content is in a world where AI is producing summaries, Alderson suggested that publishers and SEOs need to get ahead of the point where consumers are asking questions and go back to before they ask those questions.

He answered:

“Yeah, it’s really tricky because the kind of content that we’re producing there is going to change. It’s not going to be the ‘8 Tips For X’ in the hope the 2% of that audience convert. It’s not going to work anymore.

You’re going to need to go much higher up the funnel and much earlier into the research cycle. And the role of content will need to change to not try and convert people who are at the point of purchase or ready to make a decision, but to influence what happens next for the people who are at the very start of those journeys.

So what you can do is, for example, I know this is radical, but audience research, find out what kind of questions people in your sector had six months before they purchased or the kind of frustrations and challenges- what do they wish they’d known when they’d started to engage upon those processes?”

Turning that into a strategy may mean that SEOs and publishers want to shift away from focusing solely on transactional keywords and toward developing content that builds brand trust early. As Jono recommends, conduct audience research to identify what potential customers are thinking about months before they are ready to buy, and then create content that builds long-term familiarity.

The Changing Nature of SEO Metrics & Attribution

Alderson goes on to offer a critique of the overreliance on conversion-based metrics like last-click attribution. He suggests that the focus on proving success by showing that a user didn’t return to the search results page is outdated because SEO should be influencing earlier stages of the customer journey.

“You look at the the kind of there’s increasing belief that attribution as a whole is a bit of a pseudoscience and that as the technology gets harder to track all the pieces together, it becomes increasingly impossible to produce an overarching picture of what are the influences of all these pieces.

You’ve got to go back to conventional marketing …You’ve got to look at actually, does this influence what people think and feel about our brand and our logo and our recall rather than going, ‘how many clicks did we get out of, how many impressions and how many sales?’ Because if you’re competing there, you’re probably too late.

You need to be influencing people much higher the funnel. So, yeah… All, everything we’ve ever learned in the nineteen fifties and sixties about marketing, that is how we measure what good SEO looks like. Yeah, it looks like maybe we need to step back from some of the more conventional measures.”

Turning that into a strategy means that maybe it’s a good exercise to rethink traditional success metrics and start looking at customer sentiment rather than just search rankings.

Radical Ideas For A Turning Point In History

Jono Alderson prefaced his recommendation for doing audience research with the phrase, “I know this is radical…” and what he proposes is indeed radical but not in the sense that he’s proposing something extreme. His suggestions are radical in the sense that he’s pointing out that what used to be common sense in SEO (like keyword research, volume-driven content production, last-click attribution) is increasingly losing relevance to how people seek out information today. The takeaway is that adapting means rethinking SEO to the point that it goes back to its roots in marketing.

Watch Jono Alderson speak on the Majestic SEO podcast:

Stop assuming that ‘producing content’ is a necessary component of modern SEO – Jono Alderson

Featured Image/Screenshot of Majestic Podcast

AI Crawlers Are Reportedly Draining Site Resources & Skewing Analytics via @sejournal, @MattGSouthern

Website operators across the web are reporting increased activity from AI web crawlers. This surge raises concerns about site performance, analytics, and server resources.

These bots consume significant bandwidth to collect data for large language models, which could impact performance metrics relevant to search rankings.

Here’s what you need to know.

How AI Crawlers May Affect Site Performance

SEO professionals regularly optimize for traditional search engine crawlers, but the growing presence of AI crawlers from companies like OpenAI, Anthropic, and Amazon presents new technical considerations.

Several site operators have reported performance issues and increased server loads directly attributable to AI crawler activity.

“SourceHut continues to face disruptions due to aggressive LLM crawlers,” reported the git-hosting service on its status page.

In response, SourceHut has “unilaterally blocked several cloud providers, including GCP [Google Cloud] and [Microsoft] Azure, for the high volumes of bot traffic originating from their networks.”

Data from cloud hosting service Vercel shows the scale of this traffic: OpenAI’s GPTBot generated 569 million requests in a single month, while Anthropic’s Claude accounted for 370 million.

These AI crawlers represented about 20 percent of Google’s search crawler volume during the same period.

The Potential Impact On Analytics Data

Significant bot traffic can affect analytics data.

According to DoubleVerify, an ad metrics firm, “general invalid traffic – aka GIVT, bots that should not be counted as ad views – rose by 86 percent in the second half of 2024 due to AI crawlers.”

The firm noted that “a record 16 percent of GIVT from known-bot impressions in 2024 were generated by those that are associated with AI scrapers, such as GPTBot, ClaudeBot and AppleBot.”

The Read the Docs project found that blocking AI crawlers decreased their traffic by 75 percent, from 800GB to 200GB daily, saving approximately $1,500 per month in bandwidth costs.

Identifying AI Crawler Patterns

Understanding AI crawler behavior can help with traffic analysis.

What makes AI crawlers different from traditional bots is their frequency and depth of access. While search engine crawlers typically follow predictable patterns, AI crawlers exhibit more aggressive behaviors.

Dennis Schubert, who maintains infrastructure for the Diaspora social network, observed that AI crawlers “don’t just crawl a page once and then move on. Oh, no, they come back every 6 hours because lol why not.”

This repeated crawling multiplies the resource consumption, as the same pages are accessed repeatedly without a clear rationale.

Beyond frequency, AI crawlers are more thorough, exploring more content than typical visitors.

Drew DeVault, founder of SourceHut, noted that crawlers access “every page of every git log, and every commit in your repository,” which can be particularly resource-intensive for content-heavy sites.

While the high traffic volume is concerning, identifying and managing these crawlers presents additional challenges.

As crawler technology evolves, traditional blocking methods prove increasingly ineffective.

Software developer Xe Iaso noted, “It’s futile to block AI crawler bots because they lie, change their user agent, use residential IP addresses as proxies, and more.”

Balancing Visibility With Resource Management

Website owners and SEO professionals face a practical consideration: managing resource-intensive crawlers while maintaining visibility for legitimate search engines.

To determine if AI crawlers are significantly impacting your site:

  • Review server logs for unusual traffic patterns, especially from cloud provider IP ranges
  • Look for spikes in bandwidth usage that don’t correspond with user activity
  • Check for high traffic to resource-intensive pages like archives or API endpoints
  • Monitor for unusual patterns in your Core Web Vitals metrics
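
As a starting point for the log review above, here is a minimal Python sketch of the idea: it tallies requests whose user agent contains one of a few commonly reported AI crawler tokens. The log file path and the token list are assumptions you would adapt to your own server, and the approach only catches bots that identify themselves honestly, which, as noted above, not all of them do.

# Minimal sketch: count requests from known AI crawler user-agent tokens
# in a plain-text access log (one request per line).
from collections import Counter

LOG_PATH = "access.log"  # hypothetical path; point this at your server's log file
AI_CRAWLERS = ["GPTBot", "ClaudeBot", "Amazonbot", "CCBot", "Bytespider"]

hits = Counter()
total = 0
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        total += 1
        for bot in AI_CRAWLERS:
            if bot in line:
                hits[bot] += 1
                break  # count each request once

print(f"Total requests: {total}")
for bot, count in hits.most_common():
    share = (100 * count / total) if total else 0.0
    print(f"{bot}: {count} requests ({share:.1f}% of all requests)")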

Several options are available for those impacted by excessive AI crawler traffic.

Google introduced a solution called Google-Extended in the robots.txt file. This allows websites to stop having their content used to train Google’s Gemini and Vertex AI services while still allowing those sites to show up in search results.
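
For reference, here is a minimal sketch of what that robots.txt entry looks like. Google-Extended is Google’s documented token; the GPTBot and ClaudeBot entries are optional additions for the OpenAI and Anthropic crawlers mentioned earlier, and like all robots.txt rules they only work for bots that choose to honor them.

# Opt out of Gemini/Vertex AI training while remaining in Google Search
User-agent: Google-Extended
Disallow: /

# Optional: disallow other AI crawlers that respect robots.txt
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /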

Cloudflare recently announced “AI Labyrinth,” explaining, “When we detect unauthorized crawling, rather than blocking the request, we will link to a series of AI-generated pages that are convincing enough to entice a crawler to traverse them.”

Looking Ahead

As AI integrates into search and discovery, SEO professionals should manage crawlers carefully.

Here are some practical next steps:

  1. Audit server logs to assess AI crawler impact on your specific sites
  2. Consider implementing Google-Extended in robots.txt to maintain search visibility while limiting AI training access
  3. Adjust analytics filters to separate bot traffic for more accurate reporting
  4. For severely affected sites, investigate more advanced mitigation options

Most websites will do fine with standard robots.txt files and monitoring. However, high-traffic sites may benefit from more advanced solutions.


Featured Image: Lightspring/Shutterstock