Google Revisits 15% Unseen Queries Statistic In Context Of AI Search via @sejournal, @martinibuster

At Search Central Live NYC, Google’s John Mueller revisited the statistic that 15% of search queries Google sees are completely brand new and have never been encountered. He also addressed what impact the advent of AI has had on the number of novel search queries users are making today.

Understanding Language

Understanding language is essential to serving relevant results for search queries. That means Google needs to understand the nuances of what people mean when they search, which can include unconventional uses of specific words or their outright misuse. BERT is an example of a technology that Google uses to understand user queries.

Google’s introduction to BERT explains:

“We see billions of searches every day, and 15 percent of those queries are ones we haven’t seen before–so we’ve built ways to return results for queries we can’t anticipate.

…Particularly for longer, more conversational queries, or searches where prepositions like “for” and “to” matter a lot to the meaning, Search will be able to understand the context of the words in your query. You can search in a way that feels natural for you.”
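
To put that statistic in scale: with billions of searches per day, 15 percent works out to hundreds of millions of never-before-seen queries every single day.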

Are There More Unknown Search Queries?

During an overview of Google Search, John Mueller briefly discussed the statistic about how many new queries Google Search sees and whether LLMs have had any impact on it.

This is, according to my notes, what he said:

“15% of all queries are new every day. This is something that I’m surprised is still the case. I would have thought at some point most of the searches would have been made; people just ask the same thing over and over again.

But when we recalculate these metrics, it’s always around 15%. I imagined maybe with LLMs and the AI systems that maybe it would be a bit higher in recent years, but it’s still hovering around that number.”

He then speculated about why that 15% number remains the same, attributing it to the fact that things are always changing and that life is not static.

My paraphrase of what Mueller observed:

“It’s fantastic to see because it means to me that people keep going to search and looking for something new… and if people would stop going to search or stop searching for new things, to me that would be a sign that maybe something is wrong here. So this is a great number.”

Curious Outcome

It’s remarkable that something as groundbreaking as AI search, along with the ability to search visually, would be expected to add more complex searches that Google has never seen before, yet that 15% number keeps holding steady.

Google’s SEO Tips For Better Rankings – Search Central Live NYC via @sejournal, @martinibuster

Google’s Search Liaison, Danny Sullivan, answered a question at Google Search Central Live NYC about whether Google prefers brands. He took that as an opportunity to affirm that Google is working to show more independent sites and offered insights into how independent sites can improve their search performance.

Google Wants Good Independent Sites To Rank

Someone at the Search Central Live NYC event submitted a question asking whether Google was focusing on showing only a smaller set of sites from the Internet, limited to big-brand sites. Danny Sullivan, aka Google Search Liaison, immediately responded no. He said he understands that there’s a sense that big brands always rank well on Google and that many people say Google only wants to show big brands.

Photo of Google Search Liaison Danny Sullivan taken at the Search Central Live NYC event

Sullivan acknowledged that this is a valid concern for small independent sites, because many that are doing good work aren’t ranking as well as they should be, and he explained that Google is working on it.

The following is a paraphrase based on my notes:

“And we’ve been spending a lot of time (and we’re going to continue to spend a lot of time) to understand how can we do a better job on better understanding and perhaps guiding some of the smaller creators and small independent sites so they can be successful. It has been like a huge chunk of my time over the past year. And I’m not alone in it.

We were just in Zurich last week. We were just out there and we were looking at a bunch of real queries from small creators, independent sites and sitting with the ranking team and going through them and what’s happening here and …we made a note that you know, we have done some changes that we think help and we have done some changes that have helped. We also anticipate working through the whole rest of the year.”

Why Changes Are Incremental

Danny explained that independent sites and their topic areas vary widely, which complicates applying a single algorithmic solution to help them all. That explains why Google keeps saying it’s making incremental changes.

According to my notes, he said:

“One of the things I would say is I don’t expect you’re going to suddenly see one day we do a big huge, ‘And here is the independent small site update’ type of thing. I think it’s going to be these incremental things that we do, in part because these kinds of sites are not monolithic.”

That Thing You Need To Know About Brands

Danny discussed how serious they are about finding solutions for independent publishers and eventually began speaking of more tangible things that publishers can do to help themselves, specifically about becoming memorable to site visitors.

This is something that I’ve been doing for over twenty years. I never rolled out an affiliate or AdSense site that didn’t have a carefully planned domain name, logo and mascot in place. That mascot is super important because it helps make the site memorable to site users. They’ll forget the domain name but they’ll remember that mascot and the site.

Danny said that Google’s systems are not tuned to identify big brands and rank them well. He acknowledged that sites with a lot of branded searches might rank well, and this is the point where it felt like, okay, am I really hearing this? It’s the kind of information you come to these events for.

This is a paraphrase from my notes of what Danny said:

“And I’ve seen where people do research and say, ‘I’ve figured out that if you have a lot of branded searches…’ That’s kind of valid in some sense.

But it’s not like you have a lot of big branded searches or small branded searches or whatever, and you’re finding that correlates to your traffic. What it’s saying is that people have recognized you as a brand, which is a good thing. We like brands. Some brands we don’t like, but at least we recognize them, right?

So if you’re trying to be found in the sea of content and you have the 150,000th fried chicken recipe, it’s very difficult to understand which ones of those are necessarily better than anybody else’s out there.

But if you are recognized as a brand in your field, big, small, whatever, just a brand, then that’s important.

That correlates with a lot of signals of perhaps success with search. Not that you’re a brand but that people are recognizing you. People may be coming to you directly, people may be referring to you in lots of different ways… You’re not just sort of this anonymous type of thing.

So, one thing I would encourage anybody, but especially to smaller and independent ones that are kind of feeling like the big brands are kind of getting it all is, are you making sure that people understand who you are?”

Differentiate Yourself. A Lot.

Danny Sullivan said that users submitted over 13,000 sites with feedback about Google’s algorithm and claimed he’s confident he has looked at more sites than any SEO in the audience. He acknowledged that many of the submissions had valid concerns, but he also noticed that some high-quality sites lacked that extra bit that would make them different and better.

What he was referring to, in my words, not Danny’s, was a clear narrative on the page that lets site visitors know who is behind the site. He wasn’t talking about the sidebar with the bio and a photo that travel and recipe sites all have. He was talking about something that goes beyond the generic narrative that many bloggers use.

This is a paraphrase from my notes about what Danny said:

“I can land on a site and have no idea who runs the site, what the site is about. Who’s behind it? That’s not to say that if you put an ‘about us’ link on your site that now you’ll rank better. But people come to websites from Search and they don’t know what they’re getting into.”

He then contrasted social media with search to show how a forum or a social media site offers a carefully curated experience where you know where everything is and expectations are managed. He then said that Search is completely different. While Danny didn’t explicitly say this, I believe what he meant to communicate was that the randomness of the sites Google sends people to can be jarring to users, who consequently aren’t sure whether to trust a site. It’s a different experience from the carefully curated experience of a forum or social media site, and for that reason it’s important to give a sense of who is behind the site.

This is a paraphrase of what Danny said:

“Search is nothing like that. Search is a grab bag. It’s weird. You don’t know what you’re going to get. It’s like I’m feeling lucky. You don’t know what you’re going to end up with.

And please, I beg you, especially those of you that said that Google wants everything to be the same. That’s not what we want. We don’t want every website to be a cookie cutter site.

We want you to build websites that you think makes sense for your readers.

Anytime you ever have a question about what you should be doing to be successful in Google search and your answer is to ask if it’s a good thing for your readers, if you do that, you are aligning with the things we’re trying to do because we’re trying to send people to satisfying content so that they go, ‘This was great! This is wonderful, I loved it!’

So when they wind up on your website, probably for the first time and they don’t know you from anything and they’re coming from this crazy world where they don’t even know where the profiling for the author is, make it easy for them. Make it easy for them to come into the site and know exactly what you’re about.

I know the travel bloggers, you all have the thing on the side that says, ‘we love travelling the world…’ It’s like, OK, that’s fine and at least people know to expect that from travel bloggers and you’ve got it there.

But help them understand what’s unique or different about you, that makes you a brand. And that is a really good thing.”

Insights From Search Central Live NYC

Google Actively Supports Independent Sites

Danny Sullivan said multiple times that Google is spending a significant amount of time improving the algorithm so that more independent publishers will attain visibility in search. However, these improvements are incremental because the wide variety of sites and topics means that one change won’t affect all sites equally.

Brand Recognition Drives Search Success

Being recognized as a brand by site visitors is a quality that highly successful sites tend to have. It’s not that cultivating a brand is a ranking factor, but rather that cultivating recognition among site users leads to stronger search signals.

Differentiation Is Important

Some high-quality sites fail to stand out because they do what everyone else is doing. Site visitors may appreciate more effort to make it clear who is behind the site. For example, consider replacing rote, generic bios with a real sense of why the site matters.

Clarity Builds Trust

Recognize that the web has an element of randomness that makes some site visitors wary about visiting a site for the first time. Design with this understanding in mind.

Design for the Reader, Not the Algorithm

One of the most common mistakes I see from publishers is that they can list all of the things they did for SEO but little, if anything, that they did for their site visitors. Danny Sullivan recommends basing decisions on whether a change is good for site visitors, because that aligns a site with the kinds of sites Google wants to rank.

Google Completes March 2025 Core Update Rollout via @sejournal, @MattGSouthern

Google officially completed the rollout of its March 2025 Core Update today at 5:34 AM PDT, ending two weeks of significant volatility in search rankings.

The update began on March 13 and created notable shifts in search visibility across various sectors and website types.

Widespread Impact Observed

Data collected during the update’s rollout period revealed some of the most volatile search engine results pages (SERPs) in the past 12 months, according to tracking from Local SEO Guide.

Their system, which monitors 100,000 home services keywords, showed unprecedented movement beginning the week of March 10th.

SISTRIX’s Google Update Radar confirmed these findings, detecting substantial changes across UK and US markets starting March 16th.

Forum Content Recalibration

One of the most significant trends emerging from this update is a recalibration of how Google values forum content.

After approximately 18 months of heightened visibility for forum websites following Google’s mid-2023 “hidden gems” update, many forum sites are now experiencing substantial drops in visibility.

SEO strategist Lily Ray highlighted this trend, noting steep visibility declines for platforms like proboards.com, which hosts numerous forum websites.

Ray pointed out that while Reddit continues gaining visibility, many other forum sites that benefited from the 2023 algorithm changes are now losing rankings.

“The SEO glory days of ‘just be a forum and you’ll rank’ might be coming to an end,” Ray observed.

Additional Patterns Identified

Andrew Shotland, CEO of Local SEO Guide, identified several other potential patterns in this update:

  1. Forum Content Devaluation: While Reddit remains strong, other forums are seeing their previously gained visibility disappear.
  2. Programmatic Content Penalties: Sites creating large volumes of programmatic pages, particularly those designed specifically for SEO rather than user value, are experiencing significant declines.
  3. Cross-Sector Impact: Unlike some updates that target specific industries, this core update has affected sites across retail, government, forums, and content publishers.

Industry professionals commenting on the update have noted the potential connection to Google’s broader efforts to improve search result diversity and combat low-value content.

This recalibration may also relate to the ongoing integration of AI-generated content in search results.

What This Means for SEO

With the update now complete, SEO professionals can begin to assess the full impact on their sites and implement appropriate strategies.

For those managing forum content, this update signals the importance of quality over quantity and suggests that simply having forum content is no longer sufficient for strong rankings.

Sites negatively impacted by the update should focus on improving content quality, removing programmatic or low-value pages, and ensuring their content genuinely addresses user needs rather than being created primarily for search engines.

Search Engine Journal will continue to monitor the aftermath of this core update and provide additional analysis as more data becomes available.

Google Rolls Out AI-Powered Travel Updates For Search & Maps via @sejournal, @MattGSouthern

Google has released its annual summer travel trends report alongside several AI-powered updates to its travel planning tools.

The announcement reveals shifting travel preferences while introducing enhancements to Search, Maps, Lens, and Gemini functionality.

New AI Search and Planning Features

Google announced five major updates to its travel planning ecosystem.

Expanded AI Overviews

Google has enhanced its AI Overviews in Search to generate travel recommendations for entire countries and regions, not just cities.

You can now request specialized itineraries by entering queries like “create an itinerary for Costa Rica with a focus on nature.”

The feature includes visual elements and the ability to export recommendations to various Google products.

Image Credit: Google

Price Monitoring for Hotels

Following its flight price tracking implementation, Google has extended similar functionality to accommodations.

When browsing google.com/hotels, you can now toggle price tracking to receive alerts when hotel rates decrease for selected dates and destinations.

The system factors in applied filters, including amenity preferences and star ratings.

Image Credit: Google

Screenshot Recognition in Maps

A new Google Maps feature can help organize travel plans by automatically identifying places mentioned in screenshots.

Using Gemini AI capabilities, the system recognizes venues from saved images and allows users to add them to dedicated lists.

The feature is launching first on iOS in English, with Android rollout planned.

Gemini Travel Assistance

Google’s Gemini AI assistant now offers enhanced travel planning support, allowing users to create “Gems” – customized AI assistants for specific travel needs.

Now available at no cost, these specialized assistants can help with destination selection, local recommendations, and trip logistics.

Expanded Lens Capabilities

Google Lens continues evolving, offering enhanced AI-powered information delivery when pointing your camera at landmarks or objects.

The feature is expanding beyond English to include Hindi, Indonesian, Japanese, Korean, Portuguese, and Spanish, complementing its existing translation capabilities.

Image Credit: Google

Travel Search Trends

According to Google’s Flights and Search data analysis, travelers are increasingly drawn to coastal destinations for the Summer of 2025.

Caribbean islands, including Puerto Rico, Curacao, and St. Lucia, are seeing significant search growth, along with other beach destinations like Rio de Janeiro, Maui, and Nantucket.

The data also reveals continued momentum for outdoor adventure travel within the U.S.:

  • Cities with proximity to nature experiences (Billings, Montana; Juneau, Alaska; and Bangor, Maine) are experiencing higher search volume
  • “Cabins” has emerged as the top accommodation search for romantic getaways
  • Family travelers are increasingly searching for “dude ranch” vacations
  • Weekend getaway searches concentrate on natural destinations, including upstate New York, Joshua Tree National Park, and Sedona.

An unexpected trend in luggage preferences was also noted, with “checked bags” queries now exceeding historically dominant “carry on” searches.

Supporting this shift, space-saving solutions like vacuum bags and compression packing cubes have become top trending travel accessory searches.

Implications for SEO and Travel Content

These updates signal Google’s continued investment in controlling the travel research journey within its own ecosystem.

The expansion of AI-generated itineraries and information potentially reduces the need for users to visit traditional travel content sites during the planning phase.

Travel brands and publishers may need to adapt their SEO and content strategies to account for these changes, focusing more on unique experiences and in-depth content beyond what Google’s AI tools can generate.

The trend data also provides valuable insights for travel-related keyword targeting and content development as summer vacation planning begins for many consumers.

Top SEO Shares How To Win In The Era Of Google AI via @sejournal, @martinibuster

Jono Alderson, former head of SEO at Yoast and now at Meta, spoke on the Majestic Podcast about the state of SEO, offering insights on the decline of traditional content strategies and the rise of AI-driven search. He shared what SEOs should be doing now to succeed in 2025.

Decline Of Generic SEO Content

Generic keyword-focused SEO content, like every SEO tactic, has been on a slow decline, arguably beginning with statistical analysis, then machine learning, and now the age of AI-powered search. Citations from Google are now more precise and multimodal.

Alderson makes the following points:

  • Writing content for the sake of ranking is becoming obsolete because AI-driven search results provide those answers.
  • Many industries and topics, such as dentist and recipe sites, are oversaturated with nearly identical content that doesn’t add value.

According to Alderson:

“…every single dentist site I looked at had a tedious blog that was quite clearly outsourced to a local agency that had an article about Top 8 tips for cosmetic dentistry, etc.

Maybe you zoom out: how many dentists are there in every city in the world, across how many countries, right? Every single one of those websites has the same mediocre article where somebody has done some keyword research, spotted a gap, and thinks they can write one that’s slightly better than their competitors’. And yet in aggregate, we’ve created 10 million pages, none of which serve the purpose, all of which are fundamentally the same, none of which are very good, none of which add new value to the corpus of the Internet.

All of that stops working because Google can just answer those kinds of queries in situ.”

Google Is Deprioritizing Redundant Content

Another good point he makes is that the days when redundant pages have a chance are going away. For example, Danny Sullivan explained at Search Central Live New York that many of the links shown in some AI Overviews aren’t related to the keyword phrase but are related to the topic, providing access to the next kind of information a user would be interested in after they’d ingested the answer to their question. So, rather than show five or eight links to pages that essentially say the same thing, Google is now showing links to a variety of topics. This is an important thing publishers and SEOs need to wrap their minds around, which you can read more about here: Google Search Central Live NYC: Insights On SEO For AI Overviews.

Alderson explained:

“I think we need to stop assuming that producing content is a kind of fundamental or even necessary part of modern SEO. I think we all need to take a look at what our content marketing strategies and playbooks look like and really ask the questions of what is the role of content and articles in a world of ChatGPT and AI results and where Google can synthesize answers without needing our inputs.

…And in fact, one of the things that Google is definitely looking for, and one of the things which will be safe to a degree from this AI revolution, is if you can publish, if you can move quickly, if you can produce stuff at a better depth than Google can just synthesize, if you can identify, discover, create new information and new value.

There is always space for that kind of content, but there’s definitely no value if what you’re doing is saying, ‘every month we will produce four articles focusing on a given keyword’ when all 10,000 of our competitors employ somebody who looks like us to produce the same article.”

How To Use AI For Content

Alderson discouraged the use of AI for producing content, saying that it tends to produce a “word soup” in which original ideas get lost in the noise. He’s right; we all know AI-generated content when we see it. But I think what many people don’t notice are the extra garbage-y words and phrases AI uses that have lost their impact from overuse. Impactful writing is what supports engagement, and original ideas are what make content stand apart. These are the two things AI is absolutely rubbish at.

Alderson notes that Google may have anticipated the onslaught of AI-generated content by emphasizing EEAT (Experience, Expertise, Authoritativeness, and Trustworthiness), and he argues that AI can still be helpful as an assistant.

He observed:

“And a lot of the changes we’re seeing in Google might well be anticipating that future. All of the EEAT stuff, all of the product review stuff, is designed to combat a world where there’s an infinite amount of recursive nonsense.

So definitely avoid the temptation to be using the tools just to produce. Use them as assistance and muses to bounce ideas around with and then do the heavy thinking yourself.”

The Shift from Content Production to Content Publishing

Jono encouraged content publishers to focus on creating original research and expert insights, and to show things that have gone unnoticed. He suggested that successful publishers are the ones who get out in the world and experience what they’re writing about through original research. He also encouraged focusing on authoritative voices rather than settling for generic content.

He explained:

“I think there’s definitely room to publish good content and publish. 2015-ish everyone started saying become a publisher, and the whole industry misinterpreted that to mean write lots of articles. When actually you look at successful publishers, what they do is original research, by experts; they break news, they visit the places, they interact with things. A lot of what Google’s looking for in those kinds of EEAT criteria describes the act of publishing. Yet very little of SEO actually publishes. They just produce. And I think if you …close that gap there is definitely value.

And in fact, one of the things that Google is definitely looking for, and one of the things which will be safe to a degree from this AI revolution, is if you can publish, if you can move quickly, if you can produce stuff at a better depth than Google can just synthesize.”

What does that mean in terms of a content strategy? One of the things that bothers me is the lack of originality in content. Things like concluding paragraphs with headings like “Why We Care” drive me crazy because to me they indicate a rote approach to content.

I was researching how to flavor shrimp for sautéing, and every recipe site says to sprinkle seasonings on the shrimp prior to a quick sauté at medium-high heat, which burns the seasonings. Out of the thousands of recipe sites out there, not one can figure out that you can sauté the shrimp, add some garlic, and then, when it’s done, add the seasoning just after turning off the flame? And if you ask AI how to do it, the AI will tell you to burn your seasonings because that’s what everyone else says.

What that all means is that publishers and SEOs should focus on hands-on original research and unique insights instead of regurgitating what everyone else is saying. If you follow directions and it comes out poorly, maybe the directions are wrong, and that’s an opportunity to do something original.

SEO’s Role in Brand-Building & Audience Engagement

When asked what the role of content is in a world where AI is producing summaries, Alderson suggested that publishers and SEOs need to get ahead of the point where consumers are asking questions and reach them before those questions are asked.

He answered:

“Yeah, it’s really tricky because the kind of content that we’re producing there is going to change. It’s not going to be the “8 Tips For X” in the hope that 2% of that audience converts. It’s not going to work anymore.

You’re going to need to go much higher up the funnel and much earlier into the research cycle. And the role of content will need to change to not try and convert people who are at the point of purchase or ready to make a decision, but to influence what happens next for the people who are at the very start of those journeys.

So what you can do is, for example, I know this is radical, but audience research: find out what kind of questions people in your sector had six months before they purchased, or the kind of frustrations and challenges. What do they wish they’d known when they started to engage upon those processes?”

Turning that into a strategy may mean that SEOs and publishers want to shift away from focusing solely on transactional keywords and toward developing content that builds brand trust early. As Jono recommends, conduct audience research to identify what potential customers are thinking about months before they are ready to buy, and then create content that builds long-term familiarity.

The Changing Nature of SEO Metrics & Attribution

Alderson goes on to offer a critique of the overreliance on conversion-based metrics like last-click attribution. He suggests that the focus on proving success by showing that a user didn’t return to the search results page is outdated because SEO should be influencing earlier stages of the customer journey.

“You look at the… there’s increasing belief that attribution as a whole is a bit of a pseudoscience and that, as the technology gets harder to track all the pieces together, it becomes increasingly impossible to produce an overarching picture of what the influences of all these pieces are.

You’ve got to go back to conventional marketing …You’ve got to look at actually, does this influence what people think and feel about our brand and our logo and our recall rather than going, ‘how many clicks did we get out of, how many impressions and how many sales?’ Because if you’re competing there, you’re probably too late.

You need to be influencing people much higher up the funnel. So, yeah… everything we’ve ever learned in the nineteen fifties and sixties about marketing, that is how we measure what good SEO looks like. Yeah, it looks like maybe we need to step back from some of the more conventional measures.”

Turning that into a strategy means that maybe it’s a good exercise to rethink traditional success metrics and start looking at customer sentiment rather than just search rankings.

Radical Ideas For A Turning Point In History

Jono Alderson prefaced his recommendation for doing audience research with the phrase, “I know this is radical…” and what he proposes is indeed radical but not in the sense that he’s proposing something extreme. His suggestions are radical in the sense that he’s pointing out that what used to be common sense in SEO (like keyword research, volume-driven content production, last-click attribution) is increasingly losing relevance to how people seek out information today. The takeaway is that adapting means rethinking SEO to the point that it goes back to its roots in marketing.

Watch Jono Alderson speak on the Majestic SEO podcast:

Stop assuming that ‘producing content’ is a necessary component of modern SEO – Jono Alderson

Featured Image/Screenshot of Majestic Podcast

AI Crawlers Are Reportedly Draining Site Resources & Skewing Analytics via @sejournal, @MattGSouthern

Website operators across the web are reporting increased activity from AI web crawlers. This surge raises concerns about site performance, analytics, and server resources.

These bots consume significant bandwidth to collect data for large language models, which could impact performance metrics relevant to search rankings.

Here’s what you need to know.

How AI Crawlers May Affect Site Performance

SEO professionals regularly optimize for traditional search engine crawlers, but the growing presence of AI crawlers from companies like OpenAI, Anthropic, and Amazon presents new technical considerations.

Several site operators have reported performance issues and increased server loads directly attributable to AI crawler activity.

“SourceHut continues to face disruptions due to aggressive LLM crawlers,” reported the git-hosting service on its status page.

In response, SourceHut has “unilaterally blocked several cloud providers, including GCP [Google Cloud] and [Microsoft] Azure, for the high volumes of bot traffic originating from their networks.”

Data from cloud hosting service Vercel shows the scale of this traffic: OpenAI’s GPTBot generated 569 million requests in a single month, while Anthropic’s Claude accounted for 370 million.

These AI crawlers represented about 20 percent of Google’s search crawler volume during the same period.
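
Some back-of-the-envelope math on those figures: the two bots together account for roughly 939 million requests, and if that is about 20 percent of Google’s search crawler volume, Googlebot would have made somewhere around 4.5 to 5 billion requests over the same period.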

The Potential Impact On Analytics Data

Significant bot traffic can affect analytics data.

According to DoubleVerify, an ad metrics firm, “general invalid traffic – aka GIVT, bots that should not be counted as ad views – rose by 86 percent in the second half of 2024 due to AI crawlers.”

The firm noted that “a record 16 percent of GIVT from known-bot impressions in 2024 were generated by those that are associated with AI scrapers, such as GPTBot, ClaudeBot and AppleBot.”

The Read the Docs project found that blocking AI crawlers decreased their traffic by 75 percent, from 800GB to 200GB daily, saving approximately $1,500 per month in bandwidth costs.
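
Working backward from those numbers: blocking saved roughly 600GB per day, or about 18TB per month, so the stated $1,500 in monthly savings implies a bandwidth price in the ballpark of $0.08 per GB.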

Identifying AI Crawler Patterns

Understanding AI crawler behavior can help with traffic analysis.

What makes AI crawlers different from traditional bots is their frequency and depth of access. While search engine crawlers typically follow predictable patterns, AI crawlers exhibit more aggressive behaviors.

Dennis Schubert, who maintains infrastructure for the Diaspora social network, observed that AI crawlers “don’t just crawl a page once and then move on. Oh, no, they come back every 6 hours because lol why not.”

This repeated crawling multiplies the resource consumption, as the same pages are accessed repeatedly without a clear rationale.

Beyond frequency, AI crawlers are more thorough, exploring more content than typical visitors.

Drew DeVault, founder of SourceHut, noted that crawlers access “every page of every git log, and every commit in your repository,” which can be particularly resource-intensive for content-heavy sites.

While the high traffic volume is concerning, identifying and managing these crawlers presents additional challenges.

As crawler technology evolves, traditional blocking methods prove increasingly ineffective.

Software developer Xe Iaso noted, “It’s futile to block AI crawler bots because they lie, change their user agent, use residential IP addresses as proxies, and more.”

Balancing Visibility With Resource Management

Website owners and SEO professionals face a practical consideration: managing resource-intensive crawlers while maintaining visibility for legitimate search engines.

To determine whether AI crawlers are significantly impacting your site, start with checks like these (a minimal log-audit sketch follows this list):

  • Review server logs for unusual traffic patterns, especially from cloud provider IP ranges
  • Look for spikes in bandwidth usage that don’t correspond with user activity
  • Check for high traffic to resource-intensive pages like archives or API endpoints
  • Monitor for unusual patterns in your Core Web Vitals metrics
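
As a starting point, here is a minimal sketch of that log review in Python, assuming a standard Nginx/Apache “combined” log format; the log path and the user-agent tokens are assumptions to adjust for your own stack and for whichever crawlers actually appear in your logs:

    import re
    from collections import Counter

    # Assumptions: "combined" log format and this path; adjust both,
    # plus the user-agent tokens, for your own setup.
    LOG_PATH = "/var/log/nginx/access.log"
    AI_BOT_TOKENS = ["GPTBot", "ClaudeBot", "CCBot", "Amazonbot",
                     "Bytespider", "PerplexityBot"]

    # Combined format ends with: "request" status bytes "referer" "user-agent"
    LINE_RE = re.compile(r'" (\d{3}) (\d+|-) "[^"]*" "([^"]*)"\s*$')

    requests, bytes_served = Counter(), Counter()

    with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
        for line in log:
            match = LINE_RE.search(line)
            if not match:
                continue
            _status, size, user_agent = match.groups()
            for token in AI_BOT_TOKENS:
                if token.lower() in user_agent.lower():
                    requests[token] += 1
                    bytes_served[token] += 0 if size == "-" else int(size)
                    break

    for token, count in requests.most_common():
        print(f"{token}: {count:,} requests, {bytes_served[token] / 1e9:.2f} GB")

Request counts from a single token that dwarf your human traffic, or bandwidth totals approaching the Read the Docs figures above, are a sign the checks in the list deserve a closer look.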

Several options are available for those impacted by excessive AI crawler traffic.

Google introduced a solution called Google-Extended, a token recognized in robots.txt files. It allows websites to stop having their content used to train Google’s Gemini and Vertex AI services while still showing up in search results.
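
As a rough sketch, a robots.txt using that token might look like this (the Googlebot block is included only to make the contrast explicit; no Allow rule is required for normal crawling to continue):

    # Opt out of Gemini/Vertex AI training without affecting Search
    User-agent: Google-Extended
    Disallow: /

    # Normal search indexing continues as before
    User-agent: Googlebot
    Allow: /

Note that Google-Extended controls AI training use only; other companies’ crawlers have their own tokens (GPTBot, for example, publishes its own user agent for robots.txt purposes).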

Cloudflare recently announced “AI Labyrinth,” explaining, “When we detect unauthorized crawling, rather than blocking the request, we will link to a series of AI-generated pages that are convincing enough to entice a crawler to traverse them.”

Looking Ahead

As AI integrates into search and discovery, SEO professionals should manage crawlers carefully.

Here are some practical next steps:

  1. Audit server logs to assess AI crawler impact on your specific sites
  2. Consider implementing Google-Extended in robots.txt to maintain search visibility while limiting AI training access
  3. Adjust analytics filters to separate bot traffic for more accurate reporting
  4. For severely affected sites, investigate more advanced mitigation options

Most websites will do fine with standard robots.txt files and monitoring. However, high-traffic sites may benefit from more advanced solutions.


Featured Image: Lightspring/Shutterstock

YouTube Changes Shorts View Counts, No Change To Monetization via @sejournal, @MattGSouthern

YouTube’s updated Shorts view count now captures every play, but this change won’t affect earnings or monetization eligibility.

  • YouTube will begin counting Shorts views without minimum watch time starting March 31.
  • Earnings and YPP eligibility remain tied to the old way of counting views.
  • Both total views and “engaged views” will be available in YouTube Analytics.

TikTok Ban Support Down As Trump’s Plans Face Hurdles via @sejournal, @MattGSouthern

Recent data shows that fewer Americans support banning TikTok.

At the same time, Democratic lawmakers warn that President Donald Trump’s current plans may not be enough to keep the platform online after the April 5 deadline.

Public Support For TikTok Ban Weakens

A Pew Research Center survey found that 34% of U.S. adults support banning TikTok, down from 50% in March 2023.

Fewer Americans now view TikTok as a national security threat: 49%, down from 59% in May 2023.

Opposition to the ban has risen from 22% to 32%, with one-third of Americans undecided. Support for a ban is higher among Republicans (39%) than among Democrats (30%).

Only 12% of TikTok users want a ban, compared to 45% of non-users.

Those in favor cite data security (83%) and Chinese ownership (75%), while opponents often point to free speech concerns (74%).

Democrats Challenge Trump’s Approach

On March 24, three Democratic senators—Ed Markey (D-MA), Chris Van Hollen (D-MD), and Cory Booker (D-NJ)—wrote to President Trump to criticize how his administration handled the TikTok situation.

They don’t support the ban, but they believe Trump’s order to extend the deadline for selling TikTok by 75 days is “unlawful.” They say this decision creates uncertainty about the platform’s future.

The senators wrote:

“To the extent that you continue trying to delay the divestment deadline through executive orders, any further extensions of the TikTok deadline will require Oracle, Apple, Google, and other companies to continue risking ruinous legal liability.”

Proposed Solutions & Path Forward

Reports say the Trump administration is considering a partnership with Oracle. In this arrangement, Oracle would buy a small share of TikTok and ensure the security of U.S. user data.

However, critics, including John Moolenaar, the Republican Chair of the House China Select Committee, warn that this plan might not fulfill the law’s requirements for a “qualified divestiture.”

Democrats are asking Trump to work with Congress instead of acting alone.

They have put forward two proposed solutions:

  1. The “Extend the TikTok Deadline Act” would move the deadline for selling TikTok to October 16, giving more time to find a solution that meets the law.
  2. Changes to the original law by Congress if Trump wants to go ahead with a deal with Oracle.

What’s Next?

The Democratic senators have requested that Trump respond to their questions by March 28.

They want to know whether his administration is considering further extending the deadline, details about the potential Oracle deal, and whether he believes additional legislative action is necessary.

As the April 5 deadline approaches, the future of one of the most influential social media platforms remains uncertain.


Featured Image: RKY Photo/Shutterstock

Ex-Googler: Google Sees Publisher Traffic As A Necessary Evil via @sejournal, @martinibuster

Google says it values the open web, and a current Googler confirmed in a private conversation at the recent Search Central Live in New York that the company, including CEO Sundar Pichai, cares about the web ecosystem. But that message is contradicted by an ex-Googler, who said Google internally regards sending traffic to publishers as “a necessary evil.”

Constant Evolution Of Google Search

Elizabeth Reid, VP of Search, is profiled in Bloomberg as the one responsible for major changes at Google search beginning in 2021, particularly AI Overviews. She was previously involved in Google Maps and is the one who revealed the existence of core topicality systems at Google.

Her statements about search show how it’s changing and give an idea of how publishers and SEOs should realign their perspectives. The main takeaway is that technology enables users to interact with information in different ways, and search has to evolve to keep up with them. In her view, what’s happening is not a top-down approach in which Google imposes changes on users, but rather Google being responsive to users.

Her approach to search was said to be informed by her experience at Google Maps, where Sergey Brin pushed the team to release Maps before they felt comfortable doing so, teaching her that shipping early enabled them to understand what users really wanted faster than if they had waited.

According to Bloomberg:

“Reid refers to her approach as a “constant evolution” rather than a complete overhaul. Her team is still struggling to define the purpose of Google Search in this new era, according to interviews with 21 current and former search executives and employees…”

AI And Traditional Google Search

Google Search lost 20% of its search engineers, who went over to focus on rolling out generative AI, so perhaps it’s not surprising that she believes the search bar will lose prominence. According to the report:

“Reid predicts that the traditional Google search bar will become less prominent over time. Voice queries will continue to rise, she says, and Google is planning for expanded use of visual search, too.”

But she also said that the search bar isn’t going away:

“The search bar isn’t going away anytime soon, Reid says, but the company is moving toward a future in which Google is always hovering in the background. ‘The world will just expand,’ she says. ‘It’s as if you can ask Google as easily as you could ask a friend, only the friend is all-knowing, right?’”

Sending Traffic To Publishers Is A Necessary Evil

The article offers seemingly contradictory statements about how Google sees its relationship with the web ecosystem. An unnamed former Googler is quoted as saying that “giving” traffic to publishers is a necessary evil.

“Giving traffic to publisher sites is kind of a necessary evil. The main thing they’re trying to do is get people to consume Google services,” the former executive says. “So there’s a natural tendency to want to have people stay on Google pages, but it does diminish the sort of deal between the publishers and Google itself.”

What Current Googlers Say

At the Google Search Central Live event in New York City, I had the opportunity to have a private conversation with a Googler about Google CEO Sundar Pichai’s inability to articulate what Google does to support the web ecosystem. The Googler told me that they’ve heard Sundar Pichai express a profound recognition of Google’s relationship with publishers and said that it’s something he reflects on seriously.

That statement by the Googler was echoed in the article by something that Liz Reid and Sundar Pichai said:

“Reid says that Google cares deeply about publishers and that AI Overviews is a jumping-off point for users to conduct further research on the open web. Pichai, for his part, stresses the need to send ‘high-quality’ traffic to websites, instead of making users click around on sites that may not be relevant to them.

‘We are in the phase of making sure through this moment that we are improving the product, but in a way that prioritizes sending traffic to the ecosystem,’ he says, adding, ‘That’s been the most important goal.’”

Takeaways

  • Google is reshaping Search based on user behavior, not top-down mandates. But the fact that OpenAI’s ChatGPT pushed Google into rolling out their answer shows that other forces aside from user behaviors are in play as well.
  • The traditional search bar is becoming less central, replaced by voice (likely on mobile devices) and visual search (also mobile). Google is multimodal, meaning it operates across multiple senses, like audio and visual. Publishers should think hard about how that affects their business and how to become multimodal themselves, evolving along with users so that their content is already there when Google evolves to meet them.
  • AI Overviews and possibly the Gemini Personal AI Assistant could signal a shift toward Google acting as an ambient presence, not a destination.
  • Google’s relationship with publishers has never been more strained. The disconnect between the public-facing statements and those of anonymous ex-Googlers sends a signal that Google needs to be more out front about its relationship with publishers. For example, Google’s Search Central videos used to be interactive sessions with publishers, then gradually dried up into scripted questions and answers, and now they’re gone entirely. Although I believe what the Googler told me about Pichai’s regard for publishers, because I know them to be truthful, the appearance that the search relations team has retreated behind closed doors sends a louder signal.
  • Google leadership emphasizes its commitment to sending “high-quality traffic” to websites. But SEOs and publishers are freaking out that traffic is lower, and the sentiment may be that Google should consider a little more give and a lot less take.

Hat tip to Glenn Gabe for calling attention to this article.

Featured Image by Shutterstock/photoschmidt

OpenAI Rolls Out GPT-4o Image Creation To Everyone via @sejournal, @MattGSouthern

OpenAI has rolled out a new image generation system directly integrated with GPT-4o. This system allows the AI to access its knowledge base and conversation context when creating images.

This integration is said to enable more contextually relevant and accurate visual outputs.

OpenAI’s announcement reads:

“GPT‑4o image generation excels at accurately rendering text, precisely following prompts, and leveraging 4o’s inherent knowledge base and chat context—including transforming uploaded images or using them as visual inspiration. These capabilities make it easier to create exactly the image you envision, helping you communicate more effectively through visuals and advancing image generation into a practical tool with precision and power.”

Here’s everything else you need to know.

Technical Capabilities

OpenAI highlights the following capabilities of its new image generation system:

  1. It accurately renders text within images.
  2. It allows users to refine images through conversation while keeping a consistent style.
  3. It supports complex prompts with up to 20 different objects.
  4. It can generate images based on uploaded references.
  5. It creates visuals using information from GPT-4o’s training data.

OpenAI states in its announcement:

“Because image generation is now native to GPT‑4o, you can refine images through natural conversation. GPT‑4o can build upon images and text in chat context, ensuring consistency throughout. For example, if you’re designing a video game character, the character’s appearance remains coherent across multiple iterations as you refine and experiment.”

Examples

To demonstrate character consistency, here’s an example showing a cat and then that same cat with a hat and monocle.

Screenshot from: openai.com/index/introducing-4o-image-generation/, March 2025.

Here’s a more practical example for marketers, demonstrating text generation: a full restaurant menu generated with a detailed prompt.

Screenshot from: openai.com/index/introducing-4o-image-generation/, March 2025.

There are dozens more examples in OpenAI’s announcement post, many of which contain several prompts and follow-ups.

Limitations

OpenAI admits:

“Our model isn’t perfect. We’re aware of multiple limitations at the moment which we will work to address through model improvements after the initial launch.”

The company notes the following limitations of its new image generation system:

  • Cropping: GPT-4o sometimes crops long images, like posters, too closely at the bottom.
  • Hallucinations: This model can create false information, especially with vague prompts.
  • High Blending Problems: It struggles to accurately depict more than 10 to 20 concepts at once, like a complete periodic table.
  • Multilingual Text: The model can have issues showing non-Latin characters, leading to errors.
  • Editing: Requests to edit specific image parts may change other areas or create new mistakes. It also struggles to keep faces consistent in uploaded images.
  • Information Density: The model has difficulty showing detailed information at small sizes.

Search Implications

This update changes AI image generation from mainly decorative uses to more practical functions in business and communication.

Websites can use AI-generated images but with important considerations.

Google’s guidelines do not prohibit AI-generated visuals, focusing instead on whether content provides value regardless of how it’s produced.

Following these best practices is recommended (an illustrative markup sketch follows this list):

  • Using C2PA metadata (which GPT-4o adds automatically) to maintain transparency
  • Adding proper alt text for accessibility and indexing
  • Ensuring images serve user intent rather than just filling space
  • Creating unique visuals rather than generic AI templates
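
For illustration, a minimal example of the alt-text point might look like the following (the file name and description are invented for this sketch; C2PA provenance data is embedded in the image file itself, so it needs no extra markup):

    <!-- Hypothetical AI-generated hero image with descriptive alt text -->
    <img
      src="/images/spring-menu-hero.png"
      alt="AI-generated illustration of a cafe menu board listing spring specials"
      width="1200"
      height="630"
      loading="lazy">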

Google Search Advocate John Mueller has expressed a negative opinion regarding AI-generated images. While his personal preferences don’t influence Google’s algorithms, they may indicate how others feel about AI images.

Screenshot from: bsky.app/profile/johnmu.com, March 2025.

Note that Google is implementing measures to label AI-generated images in search results.

Availability

The feature is now available to ChatGPT users with Plus, Pro, Team, or Free plans. Access for Enterprise and Edu users will be available soon.

Developers can expect API access in the coming weeks. Because of higher processing needs, image generation takes about one minute on average.


Featured Image: PatrickAssale/Shutterstock