Google’s Updated Raters Guidelines Target Fake EEAT Content via @sejournal, @martinibuster

A major update to Google’s Search Quality Raters Guidelines (QRG) clarifies and expands on multiple forms of deception that Google wants its quality raters to identify. This change continues the trend of refining the guidelines so that quality raters become better at spotting increasingly granular forms of quality issues.

TL;DR

Authenticity should be the core principle for any SEO and content strategy.

Quality Guidelines Section 4.5.3

Section 4.5.3 has essentially been rewritten to be clearer and easier to understand, but most importantly it has been expanded to cover more kinds of deception. One can speculate that quality raters were overlooking certain kinds of website deception and that these changes address that shortcoming. The changes could also signal that Google’s algorithms may, in the near future, become more adept at spotting the described kinds of deception.

The change to the heading of section 4.5.3 reflects the scope of the changes, providing greater detail than the original version.

The section title changed from this:

4.5.3 Deceptive Page Purpose and Deceptive MC Design

To this:

4.5.3 Deceptive Page Purpose, Deceptive Information about the Website, Deceptive Design

The entire section was lightly rewritten and reorganized for greater clarity. It’s not a new policy so much as a more detailed and nuanced version of the existing one, with a few parts that are brand new.

Deceptive Purpose

The following is a new paragraph about deceptive purpose:

“Deceptive purpose:

● A webpage with deliberately inaccurate information to promote products in order to make money from clicks on monetized links. Examples include a product recommendation page on a website falsely impersonating a celebrity blog, or a product recommendation based on a false claim of personal, independent testing when no such testing was conducted.”

Google very likely has algorithmic signals and processes to detect and remove sites with these kinds of deceptive content. While one wouldn’t expect a little faking to be enough to cause a sudden drop in rankings, why take the chance? Focusing on authenticity is always the safest approach.

To be clear, this section isn’t just about putting fake information on a website; it’s about deceptive purpose. The opposite of a deceptive purpose is a purpose rooted in authenticity, with authentic intent.

Deceptive EEAT Content

There is now a brand-new section about fake EEAT (Expertise, Experience, Authoritativeness, and Trustworthiness) content on a website. A lot of SEOs talk about adding EEAT to their web pages, but EEAT is not something that one adds to a website. EEAT is a quality inherent in the overall experience of researching a site, learning about it, and consuming its content, an experience that can result in signals that site visitors generate about the website.

Here’s the guidance about fake EEAT content:

“● A webpage or website with deceptive business information. For example, a website may claim to have a physical “brick and mortar” store but in fact only exists online. While there is nothing wrong with being an online business, claiming to have a physical “brick and mortar” (e.g. fake photo, fake physical store address) is deceptive.

● A webpage or website with “fake” owner or content creator profiles. For example, AI generated content with made up “author” profiles (AI generated images or deceptive creator descriptions) in order to make it appear that the content is written by people.

● Factually inaccurate and deceptive information about creator expertise. For example, an author or creator profile inaccurately claims to have credentials or expertise (e.g. the content creator claims falsely to be a medical professional) to make the content appear more trustworthy than it is.”

Deceptive Content, Buttons, And Links

The new quality raters guidelines also go after sites that use deceptive practices to get users to take actions they didn’t intend. This is an extreme level of deception that shouldn’t be a concern for any normal site.

The following are additions to the section about deceptive design:

“● Pages with deceptively designed buttons or links. For example, buttons or links on pop ups, interstitials or on the page are designed to look like they do one thing (such as close a pop up) but in fact have a different result which most people would not expect, e.g. download an app.

● Pages with a misleading title or a title that has nothing to do with the content on the page. People who come to the page expecting content related to the title will feel tricked or deceived.”

Takeaways

There are three important takeaways from the updates to section 4.5.3 of Google’s Search Quality Raters Guidelines:

1. Expanded Definition Of Deceptive Purpose

  • Section 4.5.3 now explicitly includes new examples of deceptive page intent, such as fake endorsements or falsified product testing.
  • The revision emphasizes that deceptive purpose goes beyond misinformation—it includes misleading motivations behind the content.

2. Focus On Deceptive EEAT Content

A new subsection addresses deceptive representations of EEAT, including:

  • Fake business details (e.g. pretending to have a physical store).
  • Made-up author profiles or AI-generated personas.
  • False claims of creator expertise, such as unearned professional credentials.

3. Deceptive Design and UI Practices

The raters guidelines call attention to manipulative interface elements, such as:

  • Buttons that pretend to close popups but trigger downloads instead.
  • Misleading page titles that don’t match the content.

Google’s January 2025 update to the Search Quality Raters Guidelines significantly expands how raters should identify deceptive web content. The update clarifies deceptive practices involving page purpose, false EEAT (Expertise, Experience, Authoritativeness, Trustworthiness) content, and misleading design elements. Its purpose is to help raters better recognize manipulation that could mislead users or inflate rankings, and it may indicate the kinds of low quality that Google is focusing on.

Featured Image by Shutterstock/ArtFamily

Google Shares Insight About Time-Based Search Operators via @sejournal, @martinibuster

Google’s Search Liaison explained limitations in Google’s ability to return web pages from a prior date, also noting that date-based advanced search operators are still in beta. He provided one method for doing this but omitted an older, simpler method that accomplishes almost the same thing.

How To Find Articles By Older Published Date

The person asking the question understood how to find articles published within the previous year, month, or twenty-four hours, but didn’t know how to find articles published before a specific date.

The post on Bluesky asked:

“Is there a way to search for articles OLDER than a certain date?

I know advanced search can guarantee in the past year, month, 24h, but I want to specifically be able to find articles published BEFORE X historical event happened, and I can’t find a way to filter. Help?”

Search Liaison posted a way to do it, though it can be difficult to memorize if you’re a busy person:

“We have before: and after: operators that are still in beta. You must provide year-month-day dates or only a year. You can combine both. For example:

[avengers endgame before:2019]
[avengers endgame after:2019-04-01]
[avengers endgame after:2019-03-01 before:2019-03-05]”
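
If you use these operators often, they’re easy to script. Below is a minimal sketch in Python that composes time-bounded Google search URLs from the operators described above; the helper function and its name are my own illustration, not anything published by Google.

```python
from urllib.parse import quote_plus

def time_bounded_query(terms, before=None, after=None):
    """Build a Google search URL using the beta before:/after: operators.

    Dates must be a bare year ("2019") or year-month-day ("2019-04-01"),
    per Search Liaison's description above.
    """
    parts = [terms]
    if after:
        parts.append(f"after:{after}")
    if before:
        parts.append(f"before:{before}")
    return "https://www.google.com/search?q=" + quote_plus(" ".join(parts))

# The three examples from Search Liaison's post:
print(time_bounded_query("avengers endgame", before="2019"))
print(time_bounded_query("avengers endgame", after="2019-04-01"))
print(time_bounded_query("avengers endgame", after="2019-03-01", before="2019-03-05"))
```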

Another Way To Do Time-Based Search

In my opinion it’s a lot easier to just use Google’s search tools:

Tools > Any Time > Custom Range

From there you just set whatever time range you want, with nothing to memorize. However, you can’t search only before a certain date; you have to set both the starting and ending dates of the range you’re searching.

Caveat About Time-Based Search

Search Liaison shared an interesting insight about how the advanced search operators for time work:

“Just keep in mind it can be difficult for us to know the exact date of a document for a variety of reasons. There’s no standard way that all site owners use to indicate a publishing or republishing date. Some provide no dates at all on web pages. Some might not indicate if an older page is updated.”

Takeaways:

The time-based advanced search operators are still in beta, which means Google is testing to see how many people find them useful. Google might remove the operators at some point in the future if they aren’t popular or useful.

The other takeaway is that it’s hard for Google to know the exact date that a document is published.

Read the discussion on Bluesky.

Yelp Vs. Google Antitrust Case Survives First Big Test

Yelp’s antitrust case against Google just survived a critical test.

While a judge tossed several claims, she allowed Yelp’s core challenge – focused on Google’s alleged monopoly in local search – to move forward. The broader battle over Big Tech’s power is far from over.

The central claims that Google unfairly used its dominance in the local search and local ad markets are now headed toward a jury trial.

If Yelp prevails, the case could significantly reshape the structure of local search in the U.S. and beyond.

Even if it doesn’t, the discovery process alone may offer an unprecedented window into Google’s internal thinking around local search.

Last week, Judge Susan Van Keulen of the U.S. District Court for Northern California denied in part Google’s motion to dismiss the case.

She ruled that Yelp’s key allegations – that Google monopolized and abused its control over local search and local advertising – were plausible enough to proceed.

Statute Of Limitations Doesn’t Apply

Google argued that its conduct, such as changes to search results design and integration of local content, dated back to at least 2007 (e.g., Universal Search) and that Yelp’s claims were therefore filed too late.

The judge disagreed, holding that for claims under Section 2 of the Sherman Act, the clock starts ticking not when the conduct occurred, but when the defendant both:

  1. Possessed monopoly power in the relevant market.
  2. Engaged in exclusionary conduct that injured the plaintiff.

The judge noted that “Because the governing statute here concerns an exercise of monopoly power, the qualifying ‘act’ must involve Google’s alleged monopoly; exclusionary conduct on its own, in the absence of market power, is insufficient.”

She also noted that Yelp had not definitively alleged when Google obtained monopoly power in local search or local search advertising, and that the timeline wasn’t clear enough from the face of the complaint to justify dismissal, going on to say that “The timing of when Google crossed the 65% threshold of market power is simply not clear from the face of the Complaint.”

Where Google’s Motion To Dismiss Succeeded

The judge did, however, dismiss Yelp’s claims that Google used its general search monopoly to force users into its local search services (tying) and to gain dominance in local search advertising (monopoly leveraging).

She agreed with Google that the tying claim was time-barred and that Yelp hadn’t properly shown unfair expansion into new markets. However, the court granted Yelp the opportunity to amend its tying claim.

Yelp appeared pleased with the results, stating to Near Media that the ruling “marks an important step forward in Yelp’s case against Google. As we argued in our opposition to Google’s motion, and as the Court recognized, Google’s anticompetitive behavior deserves to be examined.”

The judge also agreed that Yelp’s argument – that local search and local ads are distinct markets – is a valid and plausible claim.

She allowed the case to proceed to determine whether these markets are entitled to antitrust protection and whether Yelp is entitled to damages.

Additionally, the court accepted Yelp’s allegations that Google has a long history of exclusionary acts.

Key questions remain: when Google achieved monopoly status in local search and advertising, whether the four-year statute of limitations applies, and – more critically – whether Google has committed new, unique exclusionary acts that could restart the statute of limitations clock.

These cases unfold on their own timeline. Next steps include whether Yelp amends its tying claims, a discovery phase, and an attempt at alternative dispute resolution.

This process is likely to take about 18 months. Given that Google is unlikely to dramatically change its local search behavior – and that Jeremy Stoppelman, Yelp’s CEO, is unlikely to settle for anything less – resolution outside of court seems improbable. A trial could occur around the end of 2026.

Why This Case Matters

This is the first case to examine Google’s behavior in local search, including how it won its monopoly and the impact that has had on local search and local ads, both within Google and across the broader local ecosystem.

If the court finds Google’s practices unlawful, it could force changes:

  • Local search, as we know it, could dramatically change. It could spell the end of the Local Pack.
  • Local ads, particularly Local Service Ads (LSAs), exemplify what Yelp argues is the kind of new, exclusionary behavior Google has leveraged to dramatically reduce organic and local opportunities. These ads could become a target of any settlement.
  • The outcome could also influence the EU’s interpretation of Google’s monopoly under the newly implemented DMA regulations.
  • Any decision could also impact Google’s ability to introduce new AI features to local search, particularly features like local AI Overviews or “Learn something specific” that could be construed as a form of exclusionary local behavior.

What We May Discover

As with the U.S. government cases against Google, discovery is likely to uncover fascinating details about how Google positions local search – and, hopefully, some algorithmic insights into how local search and ads actually function.

Even if Yelp ultimately loses, the discovery process could still offer an unprecedented look inside Google Local and LSAs, giving us the first definitive glimpse into the world in which we live, work, and breathe.

Eighteen months may seem like a lifetime in internet marketing and Google local developments, but this is one that is worth keeping an eye on.

You can read a full timeline of the events here.


Featured Image: Phanphen Kaewwannarat/Shutterstock

Reddit Q1 Report: What It Means For Digital Marketing And SEO via @sejournal, @martinibuster

Reddit’s first-quarter 2025 earnings report shows strong growth in traffic and user engagement, particularly among logged-out users. Although the announcement and shareholder letter did not mention search engines or the temporary decrease in traffic in February, the increase in logged-out traffic suggests year-over-year growth in external referrals from search engines and social media.

The shareholder letter indirectly referenced search traffic:

“For seekers, Reddit’s open nature is essential—it allows our content to surface across the open web and be easily found in search. We remain one of the last major platforms that doesn’t require you to sign in to learn something because we believe that by giving everyone access to knowledge, we are helping fulfill the purpose of the internet. This openness broadens visibility, drives awareness, and brings us new users—but it also means that some of our traffic from external sources is variable.”

TL;DR

  • Over 400 million people visit Reddit weekly, and total revenue reached $392.4 million for the quarter, representing a 61% year-over-year increase.
  • More users are logged-in, improving Reddit’s value for ad targeting and audience engagement, which is good news for digital marketers.
  • Growth in logged-out users suggests increased external referrals from search and social, a trend that may concern publishers and SEOs.

Platform Growth and User Activity: Daily Active Unique Visitors

Reddit refers to their daily active uniques as DAUq. The Reddit quarterly report defines Daily Active Uniques (DAUq) as:

“We define a daily active unique (“DAUq”) as a user whom we can identify with a unique identifier who has visited a page on the Reddit website, www.reddit.com, or opened a Reddit application at least once during a 24-hour period.”

Reddit reported 108.1 million Daily Active Uniques (DAUq), a 31% increase compared to the same quarter last year. Weekly Active Uniques (WAUq) reached 401.3 million globally, also up 31% year-over-year.

Daily Active Uniques: U.S. Versus International

Daily Active Uniques for both the U.S. and international markets saw significant increases compared to the same period last year, with international visits posting nearly double the gains of the United States.

  • U.S. DAUq was 50.1 million, up 21%.
  • International DAUq rose to 58.0 million, up 41%.

This contrast highlights that Reddit’s fastest user growth is occurring outside the U.S., signaling expanding global reach and rising visibility in international markets. This trend may indicate growing discovery opportunities through non-U.S. referral sources and increased relevance in regions where Reddit has historically had lower reach.

Logged-in Daily Active Uniques

Logged-in users can comment, moderate, start discussions, and vote. The bottom-line significance is that being logged in allows for better behavioral tracking and higher ad-targeting value. Thus, logged-in users are an important metric of the viability of the Reddit community.

Logged-in Daily Active Uniques (DAUq) were up in both the U.S. and internationally, with international growth outpacing the U.S. on a year-over-year basis:

  • Logged-in U.S. DAUq: 23.0 million (up 19% year-over-year)
  • Logged-in International DAUq: 25.8 million (up 27% year-over-year)

Logged-Out Daily Active Uniques (DAUq)

The rise in overall Daily Active Uniques (DAUq) extended to logged-out users, with that segment experiencing strong year-over-year growth. This suggests that Reddit continues to be a popular destination for reading opinions and reviews from real people. It may also suggest that search engines and social media are sending more visitors to Reddit, as those users are more likely to arrive via external referrals, although the quarterly report did not mention search traffic or referral sources.

Logged-out users increased in both the U.S. and internationally, with international growth more than double that of the United States.

  • Logged-out U.S. DAUq: 27.1 million (up 22% YoY)
  • Logged-out International DAUq: 32.2 million (up 54% YoY)

Revenue and Monetization

Reddit’s total revenue reached $392.4 million, an increase of 61% compared to Q1 2024. Advertising revenue accounted for $358.6 million, also up 61%. Other revenue totaled $33.7 million, a 66% increase year-over-year.

Average Revenue Per Unique (ARPU) also increased:

  • U.S. ARPU: $6.27, up 31%
  • International ARPU: $1.34, up 22%

These increases in Average Revenue Per Unique (ARPU) suggest that Reddit is improving its monetization of user activity.
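
As a rough sanity check, the reported figures reconcile if ARPU is read as quarterly revenue divided by daily active uniques; that interpretation is my assumption for illustration, not Reddit’s published formula.

```python
# Hedged sanity check: assume ARPU ≈ quarterly revenue / average DAUq.
# (This interpretation is an assumption, not Reddit's stated definition.)
us_dauq, intl_dauq = 50.1e6, 58.0e6   # reported daily active uniques
us_arpu, intl_arpu = 6.27, 1.34       # reported ARPU in USD

implied_revenue = us_dauq * us_arpu + intl_dauq * intl_arpu
print(f"Implied quarterly revenue: ${implied_revenue / 1e6:.1f}M")
# Prints roughly $391.8M, in line with the reported $392.4M total,
# so the user and revenue figures are internally consistent.
```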

Reddit also reported $115.3 million in adjusted EBITDA. EBITDA stands for earnings before interest, taxes, depreciation, and amortization. The adjusted version excludes certain non-recurring or non-cash expenses. This figure is often used to show how profitable a business is from its core operations.

  • Net income: $26.2 million
  • Adjusted EBITDA: $115.3 million
  • Operating cash flow: $127.6 million
  • Free Cash Flow: $126.6 million
  • Gross margin: 90.5%

Takeaways

Platform Usage Growth

  • Total Daily Active Uniques (DAUq) rose 31% year-over-year to 108.1 million.
  • Weekly Active Uniques reached 401.3 million globally, also up 31%.
  • Growth among international users (41%) outpaced U.S. growth (21%).
  • Logged-out users grew faster than logged-in users, especially internationally (+54% YoY).

Logged-In vs. Logged-Out Behavior

  • Logged-in users are critical for Reddit’s ad targeting and engagement features.
  • Logged-in DAUq rose in both U.S. (+19%) and international (+27%) markets.
  • Logged-out DAUq showed steeper growth: +22% U.S. and +54% internationally.
  • The rise in logged-out traffic suggests Reddit may be benefiting from increased exposure via search engines and social media, despite the report not directly mentioning search.

Revenue and Monetization

  • Total revenue grew 61% YoY to $392.4 million.
  • Advertising revenue: $358.6 million, also up 61%.
  • Other revenue grew 66% to $33.7 million.

ARPU (Average Revenue Per Unique)

  • U.S. ARPU: $6.27 (up 31%)
  • International ARPU: $1.34 (up 22%)
  • ARPU growth suggests improved monetization per user, especially through ad impressions in international markets.

Profitability and Financial Health

  • Net income: $26.2 million
  • Adjusted EBITDA: $115.3 million
    (Reflects earnings before interest, taxes, depreciation, and amortization, excluding non-cash or one-time costs)
  • Operating cash flow: $127.6 million
  • Free Cash Flow: $126.6 million
  • Gross margin: 90.5%

Reddit’s Q1 2025 earnings report highlights strong year-over-year growth in both logged-in and logged-out user activity, with usage rising significantly in the U.S. and even higher internationally. The company also reported a 61% increase in total revenue and positive cash flow, showing that Reddit is becoming more effective at monetizing its growing user base, which is useful information for digital marketers.

The report reflects growth in usage, revenue, and Average Revenue Per Unique (ARPU). Reddit’s expanding reach and monetization suggest it remains a relevant platform for users and remains a destination for referral traffic.

Featured Image by Shutterstock/gguy

Google Expands AIO Coverage In Select Industries via @sejournal, @martinibuster

BrightEdge Generative Parser™ detected an expansion of AI Overviews beginning April 25th, covering a larger number of entertainment and travel search queries, with noteworthy growth in insurance, B2B technology, and education queries.

Expanded AIO Coverage

Expansion of AIO coverage for actor filmographies represented the largest growth area for the entertainment sector, with 76.34% of new query coverage focused on these kinds of queries. In total, the entertainment sector experienced approximately 175% expansion of AI Overview coverage.

Geography-specific travel queries experienced substantial coverage growth of approximately 108%, showing up in greater numbers for people searching for activities in specific travel destinations within specific time periods. These are complex search queries that are difficult to get right with normal organic search.

B2B Technology

The technology space continues to experience steady growth of approximately 7%, while the Insurance topic shows a slightly greater expansion of nearly 8%. These two sectors bear closer examination because they suggest that publishers should rely less on keyword search performance and instead focus on growing mindshare with the audiences likely to be interested in these topics. Doing so may also help generate the external relevance signals that Google may look for when determining which topics a website is authoritative and expert in.

According to BrightEdge:

“Technical implementation queries for containerization (Docker) and data management technologies are gaining significant traction, with AIOs expanding to address specific coding challenges.”

That suggests that Google is stepping up coverage of how-to queries to help people understand the blizzard of new technologies, services, and products that become available every month.

Education Queries

The Education sector also continues to see steady growth, with a nearly 5% expansion of AIO keyword coverage. Nearly 32% of that growth comes from keywords associated with online learning, with particular focus on specialized degree programs and professional certifications in new and emerging fields.

BrightEdge commented on the data:

“Industry-specific expansion rates directly impact visibility potential. Intent patterns are unique to each vertical – success requires understanding the specific query types gaining AI Overviews in YOUR industry, not just high-volume terms. Google is building distinct AI Overview patterns for each sector.”

Jim Yu, CEO of BrightEdge, observes:

“The data is clear, Google is reshaping search with AI-first results in highly specific ways across different verticals. What works in one industry won’t translate to another.”

Takeaways

Entertainment Sector Sees Largest AIO Growth

  • Actor filmographies dominate expanded coverage, making up over 76% of entertainment-related expansions.
  • Entertainment queries in AIO expanded by about 175%.

Travel AIO Coverage Grows For Location-Specific Queries

  • Geographic and time-specific activity searches expanded by roughly 108%.
  • AIO is increasingly surfacing for complex trip planning queries.

Steady AIO Expansion In B2B Technology

  • About 7% growth, with increasing coverage of technical topics.
  • Google appears to target how-to queries in fast-growing technology sectors.

Insurance Sector Expansion Signals Broader Intent Targeting

  • Insurance topics coverage by AIO grew by nearly 8%.

Education Sector Growth Is Focused On Online Learning

  • 5% increase overall, with nearly one-third of new AIO coverage tied to online programs and professional certifications in emerging fields.

Sector-Specific AIO Patterns Require Tailored SEO Strategies

Success depends on understanding AIO triggers within your vertical rather than relying solely on high-volume keywords, which means considering a more nuanced approach to topics. Google’s AI-first indexing is reshaping how publishers need to think about search visibility.

Featured Image by Shutterstock/Sergey Nivens

33% of Google Users Stuck with Bing After a Two-Week Trial: Study via @sejournal, @MattGSouthern

A study found that 33% of Google users continued to use Bing after trying it for two weeks. This challenges the prevailing notion about search engine preferences and Google’s market dominance.

The research, published by the National Bureau of Economic Research, suggests Google’s market share isn’t just the product of quality; many users simply haven’t tried alternatives.

The study was initially published in January but flew under our radar at the time. Hat tip to Windows Central for surfacing it again recently.

After reviewing the study, I felt it deserved a closer examination. Here are the findings that stand out.

Google’s Market Power: More Than Just Quality

Researchers from Stanford, MIT, and the University of Pennsylvania tested 2,354 desktop internet users to understand why Google holds about 90% of the global search market.

They looked at several possible reasons for Google’s dominance:

  • Better quality
  • Wrong ideas about competitors
  • Default browser settings
  • Hassle of switching
  • User inattention
  • Data advantages

While many think Google wins purely on quality, the research shows it’s not that simple.

The paper summarizes Google’s position, which the researchers’ findings challenge:

“Google, however, maintains that its success is driven by its high quality, that competition is ‘only a click away’ given the ease of switching, and that increasing returns to data are small over the relevant range.”

The “Try Before You Buy” Effect

One key finding stands out: after being paid to use Bing for two weeks, one-third of Google users continued to use Bing even after the payments stopped.

The researchers found:

“64 percent of participants who kept using Bing said it was better than they expected, and 59 percent said they got used to it.”

The study further explains:

“Exposure to Bing increased users’ self-reported perceptions of its quality by 0.6 standard deviations.

This represents “a third of the initial gap between Google and Bing and more than half a standard deviation.”

This suggests people avoid Bing not because it’s worse, but because they haven’t given it a fair shot.

Challenging Common Beliefs

When Google users were asked to choose their search engine (making switching simple), Bing’s share grew by only 1.1 percentage points.

This suggests that default settings affect market share mainly by preventing users from trying alternatives.

The authors state:

“Our results suggest that their perceptions about Bing improved after exposure. Default Change group participants who keep using Bing do so for two reasons. First, like Switch Bonus group participants, their valuation of Bing increases due to experience. Second, some participants may continue to prefer Google but not switch back due to persistent inattention.”

Analysis of Bing’s search data showed that even if Microsoft had access to Google’s search data, it wouldn’t dramatically improve results.

The researchers concluded:

“We estimate that if Bing had access to Google’s data, click-through rates would increase from 23.5 percent to 24.8 percent.”

The EU requires Google to present users with a choice of search engines, but this study suggests that such measures won’t be effective unless users actually try the alternatives.

The researchers add:

“Driven by the limited effects of our Active Choice intervention, our model predicts that choice screens would increase Bing’s market share by only 1.3 percentage points.”

How They Did the Research

Unlike studies that ask people questions, this one used a browser extension to track real search behavior over time.

The researchers split users into groups:

  • A control group that changed nothing
  • An “active choice” group that picked their preferred search engine
  • A “default change” group paid to switch to Bing for two days
  • A “switch bonus” group paid to use Bing for two weeks

They also measured how users’ opinions changed after trying different search engines. Many users rated Bing higher after using it.

What This Means

These findings suggest Google’s advantage comes from exposure, not just from being technically superior.

Current legal cases against Google may not have a significant impact unless they encourage more people to try alternatives.

The researchers conclude:

“Our results suggest that regulators and antitrust authorities can increase market efficiency by considering search engines as experience goods and designing remedies that induce learning.”

This research comes as Google faces legal challenges in both the US and the EU, with courts considering ways to increase competition in the search market.

For search marketers, the study suggests that Google’s position may not be as secure as many think, although competitors still face the challenge of persuading users to give them a try.


Featured Image: gguy/Shutterstock

Google Is Launching Search Central Deep Dive Events via @sejournal, @martinibuster

Google announced via its Search Off the Record podcast that it is launching a multi-day conference series to enable in-depth workshops on the SEO topics that matter. The series is launching as a pilot in the Asia-Pacific region, then expanding from there.

Google’s Gary Illyes said that he has been thinking about doing a multi-day event for the past year because he believes the one-day format allows only relatively shallow coverage of important topics. He said presenters are constrained to 25 minutes per topic, which means they end up speeding through the discussion without being able to “contextualize” it, to show how it’s relevant for people.

Gary explained:

“One of my pet peeves with Search Central Live is that we have these well-rehearsed talks that speed through one topic, and then you do, with that information, whatever you want. Basically, we don’t have time, like we have 25 minutes, maybe, for a talk. …how do you link the topic that you talked about to something tangible? Like, for example, if you are talking about crawling, then how do you show people how that looks like in Search Console or in server logs or whatever, if you don’t have the time, if you only have 25 minutes or even less?”

A Googler named Cherry commented:

“With longer time, of course, we can talk about more things, deeper things. We can have more time for networking, interactive… or even practical things that usually we might not have.”

Topics To Be Covered

The Googlers indicated that they haven’t settled on the topics they’ll cover, or whether the event will focus on the technical side of SEO, the marketing side, or both. User feedback during the signup process may influence the sessions they choose to present, so they can keep the content relevant to what users in any particular geographic area are most concerned about.

Location Of Deep Dive Events

In a sign that this event is still in the planning stage, the Googlers said that they haven’t chosen where the first event will be held, only that they’re looking to kick things off in the Asia-Pacific (APAC) region, mentioning that Bali is on the list of locations under consideration. Budget is one of the factors.

Search Central Live Global

Lastly, they announced that they will still present Search Central Live but will expand it to more locations globally, possibly including the Baltics.

Listen To Search Off The Record Episode 90

Featured Image by Shutterstock/fongleon356

Reimagining EEAT To Drive Higher Sales And Search Visibility via @sejournal, @martinibuster

The SEO Charity podcast recently discussed a different way to think about EEAT, one that focuses on activities that lead to external signals Google may associate with the underlying concepts of EEAT (expertise, experience, authoritativeness, and trustworthiness). Google’s John Mueller recently said that EEAT is not something you can add to a site, and most of what was discussed on the show lines up perfectly with that reality.

The podcast, hosted by Olesia Korobka and Anton Shulke, featured Amanda Walls (LinkedIn profile), founder of Cedarwood Digital in Manchester, UK.

Aristotle And SEO

Amanda introduced the concept of applying Aristotle’s principles of ethos, pathos, and logos to SEO strategy. These principles are three ways to persuade site visitors and potential customers:

  1. Credibility (ethos)
  2. Emotional appeal (pathos)
  3. Logical reasoning (logos)

Amanda explains these concepts in more depth but those three principles form the basis for her approach to creating the circumstances that lead to positive external signals that can be correlated to concepts like expertise, experience, authoritativeness, and trustworthiness.

Why It Matters for SEO

Amanda says that SEO is ultimately about driving leads and conversions, not just rankings, and I agree with that 100%. The history of SEO is littered with gurus crowing about all the traffic they gained for clients, but they never talk about the part that really matters: sales and leads.

Link building historically falls into that trap where both the client and the link builder focus on how many links are acquired each month and look to traffic as evidence of success. But really, as Amanda points out, everything that a good SEO does should be focused on increasing sales. Nothing else matters.

Amanda explained:

“SEO is more than just rankings, it’s about conversion. It’s about business return. It’s about getting that success, those leads, those sales that we need… Bringing people to a website ….means nothing if they don’t convert. …we don’t just want to bring people to the website, we want them to engage and love your brand and have a really, really good reason to go through and fulfill the conversion journey.”

Reputation Management

Amanda recommends focusing on managing the business’s reputation, such as in reviews, interviews, and what’s written online about the brand.

She cites the following statistics:

  • 87% of consumers will back out of a purchase decision if they read something negative about the brand.
  • 81% of consumers do extensive research before a purchase, taking as long as 79 days.

Amanda prescribes findability, credibility, and persuasion as the ingredients for successful search optimization:

“We’re working on SEO to help people find us, and then most importantly, we are convincing them or we’re persuading them to actually go and purchase our product…”

Monitor Off-Site Signals

Amanda recommends regularly researching your brand to uncover potential issues, monitor online user sentiment, and assess media coverage, because poor off-site sentiment can remove users from the conversion funnel.

Manage On-Site Signals

Amanda also recommends using the About Us page to share relatable stories so that users can develop genuine positive feelings for the brand, using the phrase “emotional appeal” to describe the experience users should get from an About Us page. She says that this can be as simple as telling potential customers about the business.

User-Generated Content And Authenticity

Many of the fastest-growing businesses on the Internet cultivate high-quality user-generated content. Encouraging customers to post reviews and images helps build confidence in products.

Amanda explains:

“And then also from a pathos perspective, you know, really getting that kind of user generated content, getting people to connect… because fundamentally humans, they buy from humans and the more human and the more emotional that we can be in our sales process, the more likely that we are to get that buy-in and that connection that we need to actually get across to our audience.”

Pitching To Journalists

This last part, pitching story ideas to journalists, is something that link building companies consistently get wrong. I know because they approach me all the time, and their approach is always the same mistake: focusing too much on links and not enough on understanding my audience.

I specialized in link building back in the early days of SEO (early 2000s). I was even the moderator of the link building forum at WebmasterWorld. Although I don’t do link building anymore, I have a vast, vast amount of experience persuading publishers to give my clients a link.

My opinion is that PR to journalists should be approached strictly for brand exposure. Don’t make links the goal.

Focus instead on building positive stories with journalists and let them write those articles with or without adding a link, let them decide. What will happen is that the consumers will go out and type your business’s name into Google and that’s a strong, strong signal. I prefer thousands of consumers typing my website’s name on Google over a handful of links, every time, all day long.

I strongly agree with what Amanda says about understanding a journalist’s audience:

“92% of journalists say that understanding their audience is crucial for them to consider a story pitch.”

Understanding the audience is super important. I’ll go even deeper and recommend understanding what motivates the audience. Focus on the reasons why a journalist’s readers will click an article title that’s displayed on Google News. Once you understand that part, I can practically guarantee that PR outreach approval rates will skyrocket.

Takeaway

The SEO Charity podcast episode featuring Amanda Walls introduces a novel way to build signals associated with Google’s EEAT (expertise, experience, authoritativeness, trustworthiness) by focusing on credibility, emotion, and logic in content strategy. Walls emphasizes using Aristotle’s persuasive principles to influence reputation, brand perception, and conversion, encouraging SEO strategies focused on meaningful business outcomes like leads and sales, with better search visibility that supports those ends.

Watch the SEO Charity episode on EEAT:

Reimagining E-E-A-T with Amanda Walls

Featured Image by Shutterstock/Ollyy

Google’s Updated Raters Guidelines Refines Concept Of Low Quality via @sejournal, @martinibuster

Google’s Search Quality Rater Guidelines were updated a few months ago, and several of the changes closely track the talking points shared by Googlers at the 2025 Search Central Live events. Among the most consequential updates are those to the sections defining the lowest quality pages, which more clearly reflect the kinds of sites Google wants to exclude from the search results.

Section 4.0 Lowest Quality Pages

Google added a new definition of the Lowest rating to the Lowest Quality Pages section. While Google has always been concerned with removing low-quality sites from the search results, this change to the raters guidelines likely reflects an emphasis on weeding out a specific kind of low-quality website.

The new guideline focuses on identifying the publisher’s motives for publishing the content.

The previous definition said:

“The Lowest rating is required if the page has a harmful purpose, or if it is designed to deceive people about its true purpose or who is responsible for the content on the page.”

The new version keeps that sentence but adds a new sentence that encourages the quality rater to consider the underlying motives of the publisher responsible for the web page. The focus of this guidance is to encourage the quality raters to consider how the page benefits a site visitor and to judge whether the purpose of the page is entirely for benefiting the publisher.

The addition to this section reads:

“The Lowest rating is required if the page is created to benefit the owner of the website (e.g. to make money) with very little or no attempt to benefit website visitors or otherwise serve a beneficial purpose.”

There’s nothing wrong with being motivated to earn an income from a website. What Google is looking at is if the content only serves that purpose or if there is also a benefit for the user.

Focus On Effort

The next change focuses on identifying how much effort was put into creating the site. This doesn’t mean that publishers must now document how much time and effort went into creating the content. This section is simply about looking for evidence that the content is indistinguishable from content on other sites and offers no clear advantages over content found elsewhere on the Internet.

This part about the main content (MC) was essentially rewritten:

“● The MC is copied, auto-generated, or otherwise created without adequate effort.”

The new version has more nuance about the main content (MC):

“● The MC is created with little to no effort, has little to no originality and the MC adds no value compared to similar pages on the web”

Three things to unpack there:

  1. Content created with little to no effort
  2. Contains little to no originality
  3. Main content adds no additional value

Publishers who focus on keeping up with competitors should be careful that they’re not simply creating the same thing as their competitors. Saying that it’s not the same thing because it’s the same topic only better doesn’t change the fact that it’s the same thing. Even if the content is “ten times better” the fact remains that it’s still basically the same thing as the competitor’s content, only ten times more of it.

A Word About Content Gap Analysis

Some people are going to lose their minds about what I’m going to say about this, but keep an open mind.

There is a popular SEO process called Content Gap Analysis. It’s about reviewing competitors to identify topics that the competitors are writing about that are missing on the client’s site then copying those topics to fill the content gap.

That is precisely the kind of thing that leads to unoriginality and content that is indistinguishable from everything else on the Internet. It’s the number one reason I would never use a software program that scrapes top-ranked sites and suggests topics based on what competitors are publishing. It results in virtually indistinguishable content and pure unoriginality.

Who wants to jump from one site to another and read the same exact recipes, even if they have more images, graphs, and videos? Copying a competitor’s content “but doing it better” is not original.

Scraping Google’s PAAs (People Also Asked) just like everyone else does not result in original content. It results in content that’s exactly the same as everyone else that’s scraping PAAs.

While the practice of content gap analysis is about writing about the same thing only better, it’s still unoriginal. Saying it’s better doesn’t change the fact that it’s the same thing.

Lack of originality is a huge issue with Internet content and it’s something that Google’s Danny Sullivan discussed extensively at the recent Google Search Central Live in New York City.

Instead of looking for information gaps, it’s better to review your competitor’s weaknesses. Then look at their strengths. Then compare that to your own weaknesses and strengths.

A competitor’s weakness can become your strength. This is especially valuable information when competing against a bigger and more powerful competitor.

Takeaways

1. Google’s Emphasis on Motive-Based Quality Judgments

  • Quality raters are now encouraged to judge not just content, but the intent behind it.
  • Pages created purely for monetization, with no benefit to users, should be rated lowest.
  • This may signal Google’s intent to refine its ability to weed out low-quality content based on the user experience.

2. Effort and Originality Are Now Central Quality Signals

  • Low-effort or unoriginal content is explicitly called out as justification for the lowest rating.
  • This may signal that Google’s algorithms may increasingly focus on surfacing content with higher levels of originality.
  • Content that doesn’t add distinctive value over competitors may struggle in the search results.

3. Google’s Raters Guidelines Reflect Public Messaging

  • Changes to the Guidelines mirror talking points in recent Search Central Live events.
  • This suggests that Google’s algorithms may become more precise on things like originality, added value, and effort put into creating the content.
  • This means publishers should (in my opinion) consider ways to make their sites more original than other sites, to compete by differentiation.

Google updated its Quality Rater Guidelines to draw a sharper line between content that helps users and content that only helps publishers. Pages created with little effort, no originality, or no user benefit are now listed as examples of the lowest quality, even if they seem more complete than competing pages.

Google’s Danny Sullivan pointed to travel sites that all have a sidebar introducing the smiling site author, along with other hallmarks of travel sites, as an example of how sites become indistinguishable from each other.

The reason why publishers do that is that they see what Google is ranking and assume that’s what Google wants. In my experience, that’s not the case. In my opinion it may be useful to think about what you can do to make a site more original.

Download the latest version of Google’s Search Quality Raters Guidelines here (PDF).

Featured Image by Shutterstock/Kues

Google Answers Why Landing Page Ranks For An E-Commerce Query via @sejournal, @martinibuster

Google’s John Mueller answered a question on Bluesky about why an e-commerce page with minimal content is ranking, illustrating that sometimes optimized content isn’t enough.

E-Commerce Search Results

A person posted their concerns about an e-commerce site that was ranking in the search results with barely any content. In fact, the domain that was ranking redirects to another domain. On the face of it, something seems off: why would Google rank what is essentially a landing page with virtually zero content for a redirected domain?

Why A Landing Page Ranks

The company with the landing page had acquired another company and subsequently joined the two domains. There was nothing wrong or spammy going on; one business bought another, which happens every day.

The person asking the question dropped a URL and a screenshot of the landing page and asked:

“How does Google think this would be the best result and also, do you think this is a relevant result for users?”

Google’s John Mueller answered:

“It looks like a normal ecommerce site to me. They could have handled the site-migration a bit more gracefully (and are probably losing a lot of “SEO value” by doing this instead of a real migration), but it doesn’t seem terrible for users.”

Site Migration

Mueller expanded further on his comment about the site migration.

He posted:

“Our guidance for site migrations is at https://developers.google.com/search/docs/crawling-indexing/site-move-with-url-changes . What they’re doing is a “soft or crypto redirect”, and they’re doing it “N:1″ (meaning all old pages go there). Both of these make transfering information about the old site hard / impossible.”
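
For contrast, a “real migration” of the kind Google’s site-move guidance describes maps each old URL to its closest new equivalent with a server-side 301 redirect, rather than funneling every old page to a single landing page. Here is a minimal, hypothetical sketch of a 1:1 redirect map; the domain and paths are invented for illustration:

```python
from wsgiref.simple_server import make_server

# Hypothetical 1:1 redirect map: each old URL points to its equivalent
# on the new domain, instead of an "N:1" redirect to one landing page.
REDIRECT_MAP = {
    "/products/widget-a": "https://new-domain.example/products/widget-a",
    "/about": "https://new-domain.example/about",
}

def app(environ, start_response):
    """Issue a permanent (301) redirect for each mapped old URL."""
    target = REDIRECT_MAP.get(environ.get("PATH_INFO", ""))
    if target:
        start_response("301 Moved Permanently", [("Location", target)])
        return [b""]
    start_response("404 Not Found", [("Content-Type", "text/plain")])
    return [b"Not found"]

if __name__ == "__main__":
    make_server("", 8000, app).serve_forever()
```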

Sometimes Google ranks pages that seem like they don’t belong. But sometimes the rankings make sense when viewed from a different perspective, particularly the perspective of what’s good and makes sense for the user. Rankings change all the time, and the rankings for that page could fade after a certain amount of time, but waiting for a competitor to drop away isn’t really a good SEO strategy. Google’s Danny Sullivan had some good advice about differentiating a site for better rankings.