ChatGPT Leads AI Search Race While Google & Others Slip, Data Shows via @sejournal, @MattGSouthern

ChatGPT leads the AI search race with an 80.1% market share, according to fresh data from Similarweb.

Over the last six months, OpenAI’s tool has maintained a strong lead despite ups and downs.

Meanwhile, traditional search engines are struggling to grow as AI tools reshape how people find information online.

AI Search Market Share: Today’s Picture

The latest numbers show ChatGPT’s market share rebounding to 80.1%, up from 77.6% a month ago.

Here’s how the competition stacks up:

  • DeepSeek: 6.5% (down from 7.6% last month)
  • Google’s AI tools: 5.6% (up slightly from 5.5% last month)
  • Grok: 2.6% (down from 3.2% last month)
  • Perplexity: 1.5% (down from 1.9% last month)

These numbers are part of Similarweb’s bigger “AI Global” report (PDF link).

Traditional Search Engines Losing Ground

The most important finding may be that traditional search engines aren’t growing:

  • Google: -2% year-over-year
  • Bing: -18% year-over-year (a big drop from +18% in January)
  • Yahoo: -11% year-over-year
  • DuckDuckGo: -6% year-over-year
  • Baidu: -12% year-over-year

Traditional search overall shows a steady decline of 1% to 2% compared to last year. It’s important to note, however, that Google has seven times the user base of ChatGPT.

Which AI Categories Are Growing Fastest

While AI is changing search, some AI categories are growing faster than others:

  • DevOps & Code Completion: +103% (over 12 weeks)
  • General AI tools: +34%
  • Music Generation: +12%
  • Voice Generation: +8%

On the other hand, some AI areas are shrinking, including Writing and Content Generation (-12%), Customer Support (-11%), and Legal AI (-70%).

Beyond Search: Other Affected Industries

AI’s impact goes beyond just search engines. Other digital sectors facing big changes include:

  • EdTech: -28% year-over-year (with Chegg down 66% and CourseHero down 69%)
  • Website Builders: -13% year-over-year
  • Freelance Platforms: -19% year-over-year

Design platforms are still growing at +10% year over year, suggesting that AI might be helping rather than replacing these services.

What This Means

Traditional SEO still matters, but it isn’t enough. As traditional search traffic drops, you need to branch out.

Similarweb’s data shows consistent negative growth for traditional search engines alongside ChatGPT’s dominant market position, indicating a significant shift in information discovery patterns.

The takeaway for search marketers is to adapt to AI-driven search while keeping up with practices that work in traditional search. This balanced approach will be key to success in 2025 and beyond.


Featured Image: Fajri Mulia Hidayat/Shutterstock

WordPress WooCommerce Bug Causing Sites To Crash via @sejournal, @martinibuster

A WordPress bug is causing WooCommerce sites to display a fatal error, crashing ecommerce sites. The problem originates from a single line of code. A workaround has been created. The WooCommerce team is aware of the issue and is working on issuing a permanent fix in the form of a patch.

WooCommerce Sites Crashing

Someone posted about the error in the WordPress.org support forums, and others replied that they were experiencing the same problem. Most of those responding said they had not recently changed anything on their sites and that the sites had crashed all of a sudden.

The person who initially reported the bug offered a workaround for getting websites back up and running, an edit of a single line of code in the BlockPatterns.php file, which is a WooCommerce file.

The file is located here:

wp-content/plugins/woocommerce/src/Blocks/BlockPatterns.php

Others reported receiving the same fatal error message:

“Uncaught Error: strpos(): Argument #1 ($haystack) must be of type string, null given in /var/www/site/data/www/site.com.br/wp-content/plugins/woocommerce/src/Blocks/BlockPatterns.php on line 251”
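The forum thread describes the workaround only as an edit to the single line (line 251) where strpos() is being passed a null value, without sharing the exact code. Purely as a hypothetical illustration of that kind of defensive edit, and not the official patch, a null guard might look like the sketch below. The variable names are placeholders for illustration, not WooCommerce’s actual code.

<?php
// Hypothetical sketch only, not the official WooCommerce fix.
// The reported fatal error happens when strpos() receives null as its first
// argument, so the defensive idea is to fall back to an empty string first.
// $possibly_null_source and $needle are placeholder names for illustration.
$possibly_null_source = null;   // stands in for whatever value came back as null
$needle = 'example';

$position = strpos( $possibly_null_source ?? '', $needle ); // returns false instead of crashing

The safer course is to apply only the temporary workaround shared in the GitHub issue and wait for the official patch.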

One of the commenters on the discussion posted:

“Same issue here.

It occurred in version 9.8.2, and upgrading to 9.8.3 didn’t resolve it. Downgrading to 9.7.1 didn’t help either.

The problem happened without any interaction with plugins or recent updates. Replacing the code at line 251 worked as a temporary workaround.

We’ll need to find a more stable solution until the WooCommerce team releases an official patch.”

Others reported that they received the error after updating their plugins but that rolling back the update didn’t solve the problem, while others reported that they hadn’t done anything prior to experiencing the crash.

Someone from WooCommerce support responded to say that the WooCommerce team is aware of the problem and is working to address it:

“Thank you for reporting this. It’s a known issue, and a temporary workaround has been shared here: https://github.com/woocommerce/woocommerce/issues/57760#issuecomment-2854510504

You can track progress and updates on the GitHub thread: https://github.com/woocommerce/woocommerce/issues/57760, as the team is aware and actively addressing it.”

Discussion On GitHub

The official WooCommerce GitHub repository has this note:

“Some sites might see a fatal error around class BlockPatterns.php, with the website not loading. This was due a bad response from Woo pattern repository. A fix was deployed to the repository but certain sites might still have a bad cache value.”

They also wrote:

“The issue has been fixed from the cache source side but certain sites were left with a bad cache value, we will be releasing patch updates to fix that.”

Featured Image by Shutterstock/Kues

It’s Official: Google Launches AI Max for Search Campaigns via @sejournal, @brookeosmundson

Google Ads has announced a major update to Search campaigns. The new AI Max campaign setting will roll out globally in beta starting later this month.

Per Google’s announcement, advertisers who enable AI Max in their Search campaigns can expect stronger performance through improved query matching, dynamic creative, and better control features.

According to Google, early testing shows advertisers see an average 14% more conversions or conversion value at a similar CPA or ROAS. Campaigns still using mostly exact or phrase match keywords see even greater uplifts, around 27%.

This update follows months of closed beta testing with large brands already reporting positive results.

Let’s take a deeper look at what AI Max brings and why it matters to paid search marketers.

What is AI Max for Search Campaigns?

If you’ve been hearing the term “Search Max” in the wild lately, the official name for it is AI Max for Search.

AI Max is not a new campaign type. Instead, it’s a one-click upgrade available within existing Search campaign settings.

Once activated, it layers in three core enhancements:

  • Search term matching: Uses AI to extend keyword matching into relevant, high-performing queries your current keywords might miss.

  • Text customization: Rebrands the former Automatically Created Assets (ACA) tool. Dynamically generates new headlines and descriptions based on your landing pages, existing ads, and keywords.

  • Final URL expansion: Sends users to the most relevant pages on your site based on query intent.

Advertisers can opt out of text customization or final URL expansion at the campaign level, and opt out of search term matching at the ad group level. However, Google recommends using all three together for maximum performance.

AI Max is designed to complement, not replace, keyword match types. If a user’s search exactly matches a keyword in your campaign, that will always take priority.

Why is Google Introducing AI Max?

Search behavior is changing fast. As Google integrates more AI-powered experiences like AI Overviews and Google Lens into Search, people are using more complex, conversational, and even visual queries.

Advertisers have also voiced concerns about losing transparency and control as campaign automation expands.

AI Max aims to address both.

  • Advertisers keep access to existing Search reports and controls while layering in new targeting and creative tools.
  • More granular reporting is rolling out, including search terms by asset and improved URL parameters for detailed tracking.

Essentially, it’s Google’s answer to increasing demand for flexible automation, but with guardrails in place for marketers.

Are There Controls For Brand Safety?

Google added several controls to address a frequent advertiser concern: automation overreaching into irrelevant or risky placements.

Here’s what’s included with the AI Max for Search rollout:

  • Brand controls: Choose which brands your ads appear alongside (or exclude specific brands).
  • Location of interest controls: Target based on user geo intent at the ad group level (great for multi-location businesses).
  • Creative asset controls: Remove generated assets or block them entirely if they don’t meet brand guidelines.

One note of caution: as of now, AI-generated assets will go live before advertisers have the chance to review them.

Advertisers will need to monitor and react quickly to any compliance issues.

Are There Updates Coming to Reporting?

While AI Max integrates into existing Search reporting, the functionality is bringing new insights:

  • Search terms reporting will now show associated headlines and URLs.
  • Asset reports will measure performance not just by impressions, but by spend and conversions.
  • A new URL parameter will offer deeper visibility into search queries and performance across match types.

These reporting improvements will start in the Google Ads online interface as the feature rolls out.

Support for API, Report Editor, and Desktop Editor access is slated for later in 2025.

How Does AI Max Compare to Performance Max or Dynamic Search Ads?

Many marketers are asking how AI Max fits alongside other Google campaign types.

Here’s how it currently differs from, and overlaps with, other campaign types:

  • Performance Max and AI Max for Search may be eligible for the same Search auctions. However, if a user’s search query exactly matches a keyword in your Search campaign, Search will always take priority.
  • Dynamic Search Ads (DSA) remain available. AI Max is not a direct replacement, though it does overlap in some areas like final URL expansion and keywordless matching.
  • Optimized Targeting for audiences could be seen as a similar concept to AI Max’s query expansion, but applied to audiences rather than keywords.

Additionally, AI Max for Search can be A/B tested against traditional Search setups using drafts and experiments. More customized testing tools are in development.

Who is AI Max Not Ideal For?

While AI Max offers clear benefits worth testing, the new setting may not suit every advertiser or vertical.

If any of the following scenarios apply to you as an advertiser or brand, I’d recommend using caution when testing AI Max for Search.

  • Advertisers with strict creative guidelines or sensitive content policies.
  • Brands needing pinning for ad assets (since final URL expansion does not support pinning).
  • Businesses with websites that change frequently, making automated creative risky or inaccurate.

For industries like legal or healthcare, where lead quality and content compliance are crucial, AI Max may require careful testing before wide adoption.

What This Means for Search Marketers

AI Max represents a significant shift in how Google Search campaigns can scale.

It brings the adaptive reach and creative flexibility of Performance Max without requiring a new campaign type or sacrificing keyword control.

For advertisers already embracing broad match and automated bidding, AI Max may feel like a natural progression.

For those still relying on exact and phrase match keywords, it offers an opportunity to expand cautiously while maintaining key controls.

The rollout also signals Google’s direction: automation will continue to evolve, but advertiser input and oversight remain essential.

Marketers who test AI Max thoughtfully by balancing automation with strategy are likely to gain a competitive edge as search behavior grows more complex.

Google’s Walled Garden: Users Make 10 Clicks Before Leaving via @sejournal, @MattGSouthern

New data shows Google keeps users on its site longer. Visitors now make 10 clicks on Google’s site before leaving for another website.

This finding comes from a 13-month study comparing Google and ChatGPT traffic patterns.

Google Keeps Users In, ChatGPT Sends Them Out

Tyler Einberger of Momentic analyzed Similarweb data showing that Google’s “pages per visit” metric has climbed to 10 as of March, a big jump from before.

Image Credit: Momentic.

What does this mean? Users spend more clicks on Google’s search results than on other websites.

The report explains:

“Increasing ‘Pages per Visit’ for Google.com is an indicator that users are spending more clicks within Google’s search results (SERPs). Since most SERP interactions—like interacting with SERP features, paging, refining searches, or clicking images—change the URL but keep visitors on Google’s domain.”

Google still sends the most overall traffic to external websites.

Google generated 175.5 million outgoing visits in March compared to ChatGPT’s 57.6 million. This represents a 66.4% increase for Google compared to last year.

The Efficiency Gap

ChatGPT is more efficient at sending people to other websites.

The numbers tell the story:

  • ChatGPT generates 1.4 external website visits per user
  • Google produces just 0.6 visits per user

This means ChatGPT users are 2.3 times more likely to visit external websites than Google users, even though Google’s audience is about 6.8 times larger.

The SERP Retention Strategy

Google’s increasing in-platform clicks match its strategy of expanding search features. These features provide immediate answers without requiring users to visit other websites.

Google is succeeding at two goals:

  1. Remaining the web’s primary traffic source
  2. Keeping users on Google’s properties longer

While Google sent more outgoing traffic in early 2025, its audience barely grew. This shows a complex relationship between keeping users and referring them elsewhere.

What This Means

For SEO pros and marketers, this trend creates new challenges and opportunities:

  • With users spending more time on Google’s interfaces, capturing attention in the first screen view matters more than ever.
  • Focus on appearing in featured snippets, knowledge panels, and other SERP elements to maintain visibility as traditional organic clicks become harder to get.
  • Consider ChatGPT and other AI platforms as additional traffic sources since they refer more visitors per user.
  • Users now interact with multiple SERP features before clicking a website, requiring better attribution models and content strategies.

The Broader AI Search Market

While Google and ChatGPT lead the conversation, other AI search platforms are growing fast.

Perplexity grew 110.7% month-over-month in March. Grok grew 48.1% and Claude grew 23%.

These newer platforms could change current traffic patterns as they gain users, though the report doesn’t analyze their referral efficiency in detail.

Google remains the biggest traffic source overall. However, its growing “walled garden” approach means marketers should watch these trends and diversify where their traffic comes from.


Featured Image: Here Now/Shutterstock

Google’s Updated Raters Guidelines Target Fake EEAT Content via @sejournal, @martinibuster

A major update to Google’s Search Quality Raters Guidelines (QRG) clarifies and expands on multiple forms of deception that Google wants its quality raters to identify. This change continues the trend of refining the guidelines so that quality raters become better at spotting increasingly granular forms of quality issues.

TL/DR

Authenticity should be the core principle for any SEO and content strategy.

Quality Guidelines Section 4.5.3

Section 4.5.3 has essentially been rewritten to be clearer and easier to understand, but most importantly it has been expanded to cover more kinds of deception. One can speculate that quality raters were overlooking certain kinds of website deception and that these changes address that shortcoming. This could also signal that Google’s algorithms may in the near future become more adept at spotting the described kinds of deception.

The change in the heading of section 4.5.3 reflects the scope of the changes, with greater detail than the original version.

The section title changed from this:

4.5.3 Deceptive Page Purpose and Deceptive MC Design

To this:

“4.5.3 Deceptive Page Purpose, Deceptive Information about the Website, Deceptive Design”

The entire section was lightly rewritten and reorganized for greater clarity. It’s not necessarily a new policy but rather a more detailed and nuanced version of it, with a few parts that are brand new.

Deceptive Purpose

The following is a new paragraph about deceptive purpose:

“Deceptive purpose:

● A webpage with deliberately inaccurate information to promote products in order to make money from clicks on monetized links. Examples include a product recommendation page on a website falsely impersonating a celebrity blog, or a product recommendation based on a false claim of personal, independent testing when no such testing was conducted.”

Google very likely has algorithmic signals and processes to detect and remove sites with these kinds of deceptive content. While one wouldn’t expect that a little faking would be enough to result in a sudden drop in rankings, why take the chance? It’s always the safest approach to focus on authenticity.

To be clear, the focus of this section isn’t just about putting fake information on a website but rather it’s about deceptive purpose. The opposite of a deceptive purpose is a purpose rooted in authenticity, with authentic intent.

Deceptive EEAT Content

There is now a brand new section about fake EEAT (Expertise, Experience, Authoritativeness, and Trustworthiness) content on a website. A lot of SEOs talk about adding EEAT to their web pages, but EEAT is not something that one adds to a website. EEAT is a quality inherent in the overall experience of researching a site, learning about it, and consuming its content, and that experience can generate signals from site visitors about the website.

Here’s the guidance about fake EEAT content:

“● A webpage or website with deceptive business information. For example, a website may claim to have a physical “brick and mortar” store but in fact only exists online. While there is nothing wrong with being an online business, claiming to have a physical “brick and mortar” (e.g. fake photo, fake physical store address) is deceptive.

● A webpage or website with “fake” owner or content creator profiles. For example, AI generated content with made up “author” profiles (AI generated images or deceptive creator descriptions) in order to make it appear that the content is written by people.

● Factually inaccurate and deceptive information about creator expertise. For example, an author or creator profile inaccurately claims to have credentials or expertise (e.g. the content creator claims falsely to be a medical professional) to make the content appear more trustworthy than it is.”

Deceptive Content, Buttons, And Links

The new quality raters guidelines also go after sites that use deceptive practices to get users to take actions they didn’t intend to. This is an extreme level of deception that shouldn’t be a concern for any normal site.

The following are additions to the section about deceptive design:

“● Pages with deceptively designed buttons or links. For example, buttons or links on pop ups, interstitials or on the page are designed to look like they do one thing (such as close a pop up) but in fact have a different result which most people would not expect, e.g. download an app.

● Pages with a misleading title or a title that has nothing to do with the content on the page. People who come to the page expecting content related to the title will feel tricked or deceived.”

Takeaways

There are three important takeaways from the updates to section 4.5.3 of Google’s Search Quality Raters Guidelines:

1. Expanded Definition Of Deceptive Purpose

  • Section 4.5.3 now explicitly includes new examples of deceptive page intent, such as fake endorsements or falsified product testing.
  • The revision emphasizes that deceptive purpose goes beyond misinformation—it includes misleading motivations behind the content.

2. Focus On Deceptive EEAT Content

A new subsection addresses deceptive representations of EEAT, including:

  • Fake business details (e.g. pretending to have a physical store).
  • Made-up author profiles or AI-generated personas.
  • False claims of creator expertise, such as unearned professional credentials.

3. Deceptive Design and UI Practices

The raters guidelines call attention to manipulative interface elements, such as:

  • Buttons that pretend to close popups but trigger downloads instead.
  • Misleading page titles that don’t match the content.

Google’s January 2025 update to the Search Quality Raters Guidelines significantly expands how raters should identify deceptive web content. The update clarifies deceptive practices involving page purpose, false EEAT (Expertise, Experience, Authoritativeness, Trustworthiness) content, and misleading design elements. The purpose of the update is to help raters better recognize manipulation that could mislead users or inflate rankings, and it may indicate the kinds of low quality that Google is focusing on.

Featured Image by Shutterstock/ArtFamily

Google Shares Insight About Time-Based Search Operators via @sejournal, @martinibuster

Google’s Search Liaison explained limitations in Google’s ability to return web pages from a prior date, also noting that date-based advanced search operators are still in beta. He provided one method for doing this but omitted discussing an older, simpler method that almost accomplishes the same thing.

How To Find Articles By Older Published Date

The person asking the question knew how to find articles published within the past year, month, or 24 hours, but didn’t know how to find articles published before a specific date.

The post on Bluesky asked:

“Is there a way to search for articles OLDER than a certain date?

I know advanced search can guarantee in the past year, month, 24h, but I want to specifically be able to find articles published BEFORE X historical event happened, and I can’t find a way to filter. Help?”

Search Liaison shared a way to do it, though it can be difficult to memorize if you’re a busy person:

“We have before: and after: operators that are still in beta. You must provide year-month-day dates or only a year. You can combine both. For example:

[avengers endgame before:2019]
[avengers endgame after:2019-04-01]
[avengers endgame after:2019-03-01 before:2019-03-05]”

Another Way To Do Time-Based Search

In my opinion it’s a lot easier to just use Google’s search tools:

Tools > Any Time > Custom Range

From there you just set whatever time range you want; there’s nothing to memorize. However, you can’t search for everything before a certain date; you have to set both a starting date and an ending date.

Caveat About Time-Based Search

Search Liaison shared an interesting insight about how the advanced search operators for time work:

“Just keep in mind it can be difficult for us to know the exact date of a document for a variety of reasons. There’s no standard way that all site owners use to indicate a publishing or republishing date. Some provide no dates at all on web pages. Some might not indicate if an older page is updated.”

Takeaways:

The time-based advanced search operators are still in beta, which means Google is testing to see how many people find them useful. Google might remove the operators at some point in the future if they’re not popular or useful.

The other takeaway is that it’s hard for Google to know the exact date that a document is published.

Read the discussion on Bluesky.

Server-Side vs. Client-Side Rendering: What Google Recommends via @sejournal, @MattGSouthern

In an interview with Kenichi Suzuki from Faber Company Inc., Google Developer Advocate Martin Splitt recently shared key information about JavaScript rendering, server-side vs. client-side rendering, and structured data.

The talk cleared up common SEO confusion and offered practical tips for developers and marketers working with Google’s changing search systems.

Google’s AI Crawler & JavaScript Rendering

When asked how AI systems handle JavaScript content, Splitt revealed that Google’s AI crawler (used by Gemini) processes JavaScript well through a shared service.

Splitt explained:

“We don’t share what Googlebot sees for web search, but Google’s AI crawler that Gemini uses also renders. It uses WRS [Web Rendering Service], but it’s basically like we have a service Googlebot uses, and Gemini uses the service as well.”

This gives Google’s AI tools an edge over competitors that have trouble with JavaScript.

While one study mentioned in the interview claimed rendering sometimes takes weeks, Splitt explained that it usually happens much faster.

“The 99th percentile is within minutes,” Splitt noted, suggesting that long delays are rare and might be due to measurement errors.

Server-Side vs. Client-Side Rendering: Which is Better?

Part of the discussion covered the debate between server-side rendering (SSR) and client-side rendering (CSR).

Instead of saying one is always better, Splitt stressed that the right choice depends on what your website does.

Splitt stated:

“If you have a website that is a classical website that is basically just presenting information to the user, then requiring JavaScript is a drawback. It can break. It can cause problems. It will make things slower. It will need more battery on your phone.”

Splitt suggests SSR or even pre-rendering static HTML for websites focused on content. But CSR works better for interactive tools like CAD programs or video editors.

Splitt clarified:

“It’s not one or the other. It is two tools. Do you need a hammer or do you need a screwdriver? That depends on what you’re trying to do.”

See also: Understand the Difference Between Client-Side and Server-Side Rendering.

Structured Data’s Role in AI Understanding

The talk then moved to structured data, which is becoming more important as AI systems grow in search.

When asked if structured data helps Google’s AI understand content better, like Microsoft claims about Bing, Splitt confirmed it helps.

He stated:

“Structured data gives us more information and gives us more confidence in information. So it makes sense to have structured data.”

However, Splitt clarified that while structured data adds context, it “does not push rankings” directly. This is an important difference for SEO professionals who might think it directly boosts search positions.
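For readers who want a concrete picture of what “structured data” means here: the standard approach is to embed schema.org markup in the page, most commonly as JSON-LD. The following is a rough, hypothetical sketch of what that could look like for a content page rendered on the server; the field values are placeholders, and this is illustrative rather than a Google requirement or a specific CMS API.

<?php
// Minimal illustrative sketch: emit a schema.org Article block as JSON-LD.
// All values are placeholders; a real site would populate them from its CMS.
$schema = [
    '@context'      => 'https://schema.org',
    '@type'         => 'Article',
    'headline'      => 'Example headline',
    'author'        => [ '@type' => 'Person', 'name' => 'Example Author' ],
    'datePublished' => '2025-01-01',
];

echo '<script type="application/ld+json">' . json_encode( $schema ) . '</script>';

Markup like this describes the page to machines in an unambiguous way, which is the “more information and more confidence” Splitt refers to, without acting as a ranking boost.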

What This Means

Here are the key things we learned from this interview:

  1. Google’s rendering usually happens within minutes, so the old fear of JavaScript-heavy sites being at a disadvantage is less of an issue now.
  2. Non-Google AI tools may still have trouble with JavaScript, making SSR possibly more critical for visibility across all AI systems.
  3. Use SSR for content sites and CSR for interactive tools. Don’t use one solution for everything.
  4. Though not a ranking factor, structured data helps Google understand your content better. This matters more as AI becomes a bigger part of search.

In his final advice to SEO professionals, Splitt highlighted basic principles over technical tricks:

“Think about your users. Figure out what is your business goal, how to make users happy, and then just create great content.”

As AI changes search technology, understanding these technical details becomes more important for marketers who want to optimize content for people and search algorithms.

Hear the full discussion in the video below:

Reddit Q1 Report: What It Means For Digital Marketing And SEO via @sejournal, @martinibuster

Reddit’s first-quarter 2025 earnings report shows strong growth in traffic and user engagement, particularly among logged-out users. Although the announcement and shareholder letter did not mention search engines or the temporary decrease in traffic in February, the increase in logged-out traffic suggests year-over-year growth from external referrals from search engines and social media.

The shareholder letter indirectly referenced search traffic:

“For seekers, Reddit’s open nature is essential—it allows our content to surface across the open web and be easily found in search. We remain one of the last major platforms that doesn’t require you to sign in to learn something because we believe that by giving everyone access to knowledge, we are helping fulfill the purpose of the internet. This openness broadens visibility, drives awareness, and brings us new users—but it also means that some of our traffic from external sources is variable.”

TL/DR

  • Over 400 million people visit Reddit weekly, and total revenue reached $392.4 million for the quarter, representing a 61% year-over-year increase.
  • More users are logged-in, improving Reddit’s value for ad targeting and audience engagement, which is good news for digital marketers.
  • Growth in logged-out users suggests increased external referrals from search and social, a trend that may concern publishers and SEOs.

Platform Growth and User Activity: Daily Active Unique Visitors

Reddit refers to its daily active uniques as DAUq. The Reddit quarterly report defines Daily Active Uniques (DAUq) as:

“We define a daily active unique (“DAUq”) as a user whom we can identify with a unique identifier who has visited a page on the Reddit website, www.reddit.com, or opened a Reddit application at least once during a 24-hour period.”

Reddit reported 108.1 million Daily Active Uniques (DAUq), a 31% increase compared to the same quarter last year. Weekly Active Uniques (WAUq) reached 401.3 million globally, also up 31% year-over-year.

Daily Active Uniques: U.S. Versus International

Daily Active Uniques for both the U.S. and international markets saw significant increases compared to the same period last year, with international visits posting nearly double the gains of the United States.

  • U.S. DAUq was 50.1 million, up 21%.
  • International DAUq rose to 58.0 million, up 41%.

This contrast highlights that Reddit’s fastest user growth is occurring outside the U.S., signaling expanding global reach and rising visibility in international markets. This trend may indicate growing discovery opportunities through non-U.S. referral sources and increased relevance in regions where Reddit has historically had lower reach.

Logged-in Daily Active Uniques

Logged-in users can comment, moderate, start discussions, and vote. The bottom-line significance is that being logged in allows for better behavioral tracking and higher ad-targeting value. Thus, logged-in users are an important metric of the viability of the Reddit community.

Logged-in Daily Active Uniques (DAUq) were up in both the U.S. and internationally, with international growth outpacing the U.S. on a year-over-year basis:

  • Logged-in U.S. DAUq: 23.0 million (up 19% year-over-year)
  • Logged-in International DAUq: 25.8 million (up 27% year-over-year)

Logged-Out Daily Active Uniques (DAUq)

The rise in overall Daily Active Uniques (DAUq) extended to logged-out users, with that segment experiencing strong year-over-year growth. This suggests that Reddit continues to be a popular destination for reading opinions and reviews from real people. It may also suggest that search engines and social media are sending more visitors to Reddit, as those users are more likely to arrive via external referrals, although the quarterly report did not mention search traffic or referral sources.

Logged-out users increased in both the U.S. and internationally, with international growth more than double that of the United States.

  • Logged-out U.S. DAUq: 27.1 million (up 22% YoY)
  • Logged-out International DAUq: 32.2 million (up 54% YoY)

Revenue and Monetization

Reddit’s total revenue reached $392.4 million, an increase of 61% compared to Q1 2024. Advertising revenue accounted for $358.6 million, also up 61%. Other revenue totaled $33.7 million, a 66% increase year-over-year.

Average Revenue Per Unique (ARPU) also increased:

  • U.S. ARPU: $6.27, up 31%
  • International ARPU: $1.34, up 22%

These increases in Average Revenue Per Unique (ARPU) suggest that Reddit is improving its monetization of user activity.

It also reported $115.3 million in adjusted EBITDA. EBITDA stands for earnings before interest, taxes, depreciation, and amortization. The adjusted version excludes certain non-recurring or non-cash expenses. This figure is often used to show how profitable a business is from its core operations.

  • Net income: $26.2 million
  • Adjusted EBITDA: $115.3 million
  • Operating cash flow: $127.6 million
  • Free Cash Flow: $126.6 million
  • Gross margin: 90.5%

Takeaways

Platform Usage Growth

  • Total Daily Active Uniques (DAUq) rose 31% year-over-year to 108.1 million.
  • Weekly Active Uniques reached 401.3 million globally, also up 31%.
  • Growth among international users (41%) outpaced U.S. growth (21%).
  • Logged-out users grew faster than logged-in users, especially internationally (+54% YoY).

Logged-In vs. Logged-Out Behavior

  • Logged-in users are critical for Reddit’s ad targeting and engagement features.
  • Logged-in DAUq rose in both U.S. (+19%) and international (+27%) markets.
  • Logged-out DAUq showed steeper growth: +22% U.S. and +54% internationally.
  • The rise in logged-out traffic suggests Reddit may be benefiting from increased exposure via search engines and social media, despite the report not directly mentioning search.

Revenue and Monetization

  • Total revenue grew 61% YoY to $392.4 million.
  • Advertising revenue: $358.6 million, also up 61%.
  • Other revenue grew 66% to $33.7 million.

ARPU (Average Revenue Per Unique)

  • U.S. ARPU: $6.27 (up 31%)
  • International ARPU: $1.34 (up 22%)
  • ARPU growth suggests improved monetization per user, especially through ad impressions in international markets.

Profitability and Financial Health

  • Net income: $26.2 million
  • Adjusted EBITDA: $115.3 million
    (Reflects earnings before interest, taxes, depreciation, and amortization, excluding non-cash or one-time costs)
  • Operating cash flow: $127.6 million
  • Free Cash Flow: $126.6 million
  • Gross margin: 90.5%

Reddit’s Q1 2025 earnings report highlights strong year-over-year growth in both logged-in and logged-out user activity, with usage rising significantly in the U.S. and even higher internationally. The company also reported a 61% increase in total revenue and positive cash flow, showing that Reddit is becoming more effective at monetizing its growing user base, which is useful information for digital marketers.

The report reflects growth in usage, revenue, and Average Revenue Per Unique (ARPU). Reddit’s expanding reach and monetization suggest it remains a relevant platform for users and a destination for referral traffic.

Featured Image by Shutterstock/gguy

Google Expands AIO Coverage In Select Industries via @sejournal, @martinibuster

BrightEdge Generative Parser™ detected an expansion of AI Overviews beginning April 25th, covering a larger number of entertainment and travel search queries, with noteworthy growth in insurance, B2B technology, and education queries.

Expanded AIO Coverage

Expansion of AIO coverage for actor filmographies represented the largest growth area for the entertainment sector, with 76.34% of new query coverage focused on these kinds of queries. In total, the entertainment sector experienced approximately 175% expansion of AI Overview coverage.

Geography-specific travel queries experienced substantial coverage growth of approximately 108%, showing up in greater numbers for people searching for activities in specific travel destinations within specific time periods. These are complex search queries that are difficult to get right with normal organic search.

B2B Technology

The technology space continues to experience steady growth of approximately 7%, while the Insurance topic has a slightly greater expansion of nearly 8%. These two sectors bear a little more examination because they suggest that publishers should rely less on keyword search performance and instead focus on growing mindshare among the audiences likely to be interested in these topics. Doing so may also help generate the external signals of relevance that Google may look for when determining which topics a website is authoritative and expert in.

According to BrightEdge:

“Technical implementation queries for containerization (Docker) and data management technologies are gaining significant traction, with AIOs expanding to address specific coding challenges.”

That suggests that Google is stepping up on how-to type queries to help people understand the blizzard of new technologies, services and products that are available every month.

Education Queries

The Education sector also continues to see steady growth, with a nearly 5% expansion of AIO keyword coverage. Nearly 32% of that growth comes from keywords associated with online learning, with particular focus on specialized degree programs and professional certifications in new and emerging fields.

BrightEdge commented on the data:

“Industry-specific expansion rates directly impact visibility potential. Intent patterns are unique to each vertical – success requires understanding the specific query types gaining AI Overviews in YOUR industry, not just high-volume terms. Google is building distinct AI Overview patterns for each sector.”

Jim Yu, CEO of BrightEdge, observes:

“The data is clear, Google is reshaping search with AI-first results in highly specific ways across different verticals. What works in one industry won’t translate to another.”

Takeaways

Entertainment Sector Sees Largest AIO Growth

  • Actor filmographies dominate expanded coverage, making up over 76% of entertainment-related expansions.
  • Entertainment queries in AIO expanded by about 175%.

Travel AIO Coverage Grows For Location-Specific Queries

  • Geographic and time-specific activity searches expanded by roughly 108%.
  • AIO is increasingly surfacing for complex trip planning queries.

Steady AIO Expansion In B2B Technology

  • About 7% growth, with increasing coverage of technical topics.
  • Google appears to target how-to queries in fast-growing technology sectors.

Insurance Sector Expansion Signals Broader Intent Targeting

  • Insurance topics coverage by AIO grew by nearly 8%.

Education Sector Growth Is Focused On Online Learning

  • 5% increase overall, with nearly one-third of new AIO coverage tied to online programs and professional certifications in emerging fields.

Sector-Specific AIO Patterns Require Tailored SEO Strategies

Success depends on understanding AIO triggers within your vertical and not relying solely on high-volume keywords, which means considering a more nuanced approach to topics. Google’s AI-first indexing is reshaping how publishers need to think about search visibility.

Featured Image by Shutterstock/Sergey Nivens

33% of Google Users Stuck with Bing After a Two-Week Trial: Study via @sejournal, @MattGSouthern

A study found that 33% of Google users continued to use Bing after trying it for two weeks. This challenges the prevailing notion about search engine preferences and Google’s market dominance.

The research, published by the National Bureau of Economic Research, suggests Google’s market share isn’t just because it’s better. Many users haven’t tried alternatives.

The study was initially published in January but flew under our radar at the time. Hat tip to Windows Central for surfacing it again recently.

After reviewing the study, I felt it deserved a closer examination. Here are the findings that stand out.

Google’s Market Power: More Than Just Quality

Researchers from Stanford, MIT, and the University of Pennsylvania tested 2,354 desktop internet users to understand why Google holds about 90% of the global search market.

They looked at several possible reasons for Google’s dominance:

  • Better quality
  • Wrong ideas about competitors
  • Default browser settings
  • Hassle of switching
  • User inattention
  • Data advantages

While many think Google wins purely on quality, the research shows it’s not that simple.

The researchers summarize Google’s position, which their findings challenge:

“Google, however, maintains that its success is driven by its high quality, that competition is ‘only a click away’ given the ease of switching, and that increasing returns to data are small over the relevant range.”

The “Try Before You Buy” Effect

One key finding stands out: after being paid to use Bing for two weeks, one-third of Google users continued to use Bing even after the payments stopped.

The researchers found:

“64 percent of participants who kept using Bing said it was better than they expected, and 59 percent said they got used to it.”

The study further explains:

“Exposure to Bing increased users’ self-reported perceptions of its quality by 0.6 standard deviations.”

This represents “a third of the initial gap between Google and Bing and more than half a standard deviation.”

This suggests people avoid Bing not because it’s worse, but because they haven’t given it a fair shot.

Challenging Common Beliefs

When Google users were asked to choose their search engine (making switching simple), Bing’s share grew by only 1.1 percentage points.

This suggests that default settings affect market share mainly by preventing users from trying alternatives.

The authors state:

“Our results suggest that their perceptions about Bing improved after exposure. Default Change group participants who keep using Bing do so for two reasons. First, like Switch Bonus group participants, their valuation of Bing increases due to experience. Second, some participants may continue to prefer Google but not switch back due to persistent inattention.”

Analysis of Bing’s search data showed that even if Microsoft had access to Google’s search data, it wouldn’t dramatically improve results.

The researchers concluded:

“We estimate that if Bing had access to Google’s data, click-through rates would increase from 23.5 percent to 24.8 percent.”

The EU requires Google to present users with a choice of search engines, but this study suggests that such measures won’t be effective unless users actually try the alternatives.

The researchers add:

“Driven by the limited effects of our Active Choice intervention, our model predicts that choice screens would increase Bing’s market share by only 1.3 percentage points.”

How They Did the Research

Unlike studies that ask people questions, this one used a browser extension to track real search behavior over time.

The researchers split users into groups:

  • A control group that changed nothing
  • An “active choice” group that picked their preferred search engine
  • A “default change” group paid to switch to Bing for two days
  • A “switch bonus” group paid to use Bing for two weeks

They also measured how users’ opinions changed after trying different search engines. Many users rated Bing higher after using it.

What This Means

These findings suggest Google’s advantage comes from exposure, not just from being technically superior.

Current legal cases against Google may not have a significant impact unless they encourage more people to try alternatives.

The researchers conclude:

“Our results suggest that regulators and antitrust authorities can increase market efficiency by considering search engines as experience goods and designing remedies that induce learning.”

This research comes as Google faces legal challenges in both the US and the EU, with courts considering ways to increase competition in the search market.

For search marketers, the study suggests that Google’s position may not be as secure as many think, although competitors still face the challenge of persuading users to give them a try.


Featured Image: gguy/Shutterstock