Google CTR Trends In Q3: Branded Clicks Fan Out, Longer Queries Hold via @sejournal, @MattGSouthern

Advanced Web Ranking released its Q3 Google organic clickthrough report, tracking CTR changes by ranking position across query types and industries.

The company compared July through September against April through June. The dataset is international, so the patterns reflect broad search behavior rather than a single region.

Here’s what stands out in this quarter’s report.

Branded Desktop Searches Shift Clicks Down-Page

The clearest movement this quarter shows up in branded queries on desktop.

For searches containing a brand or business name, position 1 lost 1.52 percentage points of CTR. Positions 2 through 6 gained a combined 8.71 points.

Unbranded queries were mostly unchanged, so this shift appears specific to how people navigate brand SERPs on desktop.

Commercial & Location Queries Lose Top CTR

When AWR sorted results by intent, commercial and location searches posted the clearest top-position declines.

Commercial queries, defined as searches including terms like “buy” or “price,” saw positions 1 and 2 on desktop drop a combined 4.20 points. Position 1 accounted for most of that loss at 3.01 points.

Location searches also weakened at the top. Position 1 fell 2.52 points on desktop and 2.13 points on mobile.

AWR doesn’t attribute cause, but these are the SERPs where rich results and other modules can crowd the page.

The takeaway is that top organic placements in commercial and local contexts captured a smaller share of clicks in Q3 than they did in Q2.

Longer Queries Hold Steady

Query length shows another split that matters for forecasting traffic.

On desktop, position-1 CTR fell for shorter multi-word searches. Two-word queries dropped 1.22 points and three-word queries dropped 1.24 points at the top spot.

AWR notes that 4+ word queries were the only group with steady CTR this quarter.

On mobile, the movement went the other way for the shortest queries. One-word searches gained 1.52 points at position 1.

The takeaway here is that short, generic desktop searches remain the most volatile category of CTR performance, while longer searches looked more stable in Q3.

Industry Winners And Losers

AWR tracked CTR shifts across 18 verticals and tied those changes to demand trends.

The report highlighted several large moves:

  • Arts & Entertainment had the steepest single-position decline, with position 1 on desktop down 5.13 points.
  • Travel showed the strongest gain, with position 2 on desktop up 2.46 points.
  • Shopping saw a redistribution near the top. Position 1 on desktop fell 2.10 points, while positions 2 and 3 gained a combined 2.83 points.

The takeaway is that CTR isn’t shifting evenly across verticals. Some categories are seeing a top-spot squeeze, while others are seeing clicks spread across more of the upper results.

Why This Matters For You

Q3 adds another data point for explaining CTR changes when rankings stay flat.

For branded desktop searches, position 1 is still dominant, but it’s no longer absorbing as much of the click share as it did last quarter.

If you track brand terms, it’s worth watching whether traffic is distributing across multiple listings on those SERPs.

And if your traffic depends on short, high-volume desktop queries, this report suggests those segments are still the most exposed to quarter-over-quarter click shifts. Longer searches were the only length group that held steady at the top in Q3.

Looking Ahead

AWR’s report reflects an international dataset and doesn’t isolate a single driver behind the CTR movement. Still, the direction in Q3 is clear in a few places.

Branded desktop clicks are spreading beyond position 1, and commercial and local SERPs continue to pressure the top organic slot.



LLMs.txt Shows No Clear Effect On AI Citations, Based On 300k Domains via @sejournal, @MattGSouthern

A new analysis from SE Ranking suggests the llms.txt file isn’t delivering measurable benefits yet.

After examining roughly 300,000 domains, the company found no relationship between having llms.txt and how often a domain is cited in major LLM answers.

What The Data Says

Adoption Is Thin

SE Ranking’s crawl found llms.txt on 10.13% of domains. In other words, nearly nine out of ten sites they measured haven’t implemented it.

That low usage matters because the format is sometimes described as an emerging baseline for AI visibility. The data instead shows scattered experimentation. SE Ranking says adoption is fairly even across traffic tiers and not concentrated among the biggest brands.

High-traffic sites were slightly less likely to use the file than mid-tier websites in their dataset.
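SE Ranking hasn’t published its crawler, but a presence check of the kind behind these adoption figures can be sketched in a few lines of Python (the sample domain list and the non-HTML-body heuristic are assumptions for illustration, not SE Ranking’s method):

```python
import requests

def has_llms_txt(domain: str, timeout: float = 5.0) -> bool:
    """Return True if the domain appears to serve an llms.txt file."""
    try:
        resp = requests.get(f"https://{domain}/llms.txt", timeout=timeout)
    except requests.RequestException:
        return False
    # Many servers answer unknown paths with a 200 HTML error page, so a
    # status check alone would overcount adoption; require a non-HTML body.
    content_type = resp.headers.get("Content-Type", "").lower()
    return resp.status_code == 200 and "html" not in content_type

# Hypothetical sample; SE Ranking's crawl covered roughly 300,000 domains.
domains = ["example.com", "example.org", "example.net"]
found = sum(has_llms_txt(d) for d in domains)
print(f"llms.txt adoption: {found / len(domains):.2%}")
```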

No Measurable Link To LLM Citations

To assess whether the llms.txt file affects AI visibility, SE Ranking analyzed domain-level citation frequency across responses from prominent LLMs. They employed statistical correlation tests and an XGBoost model to determine the extent to which each factor contributed to citations.

The main finding was that removing llms.txt as a feature actually improved the model’s accuracy. SE Ranking concludes that llms.txt “doesn’t seem to directly impact AI citation frequency. At least not yet.”

Additionally, they found no significant correlation between citations and the file using simpler statistical methods.
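SE Ranking hasn’t released its analysis code, but the general shape of that approach, a rank correlation test plus feature importances from a gradient-boosted model, might look something like this sketch on synthetic data (the feature names and dataset are invented for illustration):

```python
import numpy as np
import pandas as pd
from scipy.stats import spearmanr
from xgboost import XGBRegressor

# Synthetic stand-in for a domain-level dataset: a binary llms.txt flag,
# a couple of visibility features, and a citation count per domain.
rng = np.random.default_rng(42)
n = 1000
df = pd.DataFrame({
    "has_llms_txt": rng.integers(0, 2, n),
    "organic_traffic": rng.lognormal(10, 2, n),
    "referring_domains": rng.lognormal(6, 2, n),
    "citations": rng.poisson(3, n),
})

# Simple correlation test: is llms.txt presence associated with citations?
rho, p = spearmanr(df["has_llms_txt"], df["citations"])
print(f"Spearman rho={rho:.3f}, p={p:.3f}")

# Model-based check: train a regressor and inspect feature importances.
features = ["has_llms_txt", "organic_traffic", "referring_domains"]
model = XGBRegressor(n_estimators=200, max_depth=4)
model.fit(df[features], df["citations"])
for name, importance in zip(features, model.feature_importances_):
    print(f"{name}: {importance:.3f}")
```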

How This Squares With Platform Guidance

SE Ranking notes that its results align with public platform guidance. But it’s important to be precise about what is confirmed.

Google hasn’t indicated that llms.txt is used as a signal in AI Overviews or AI Mode. In its AI search guidance, Google frames it as an evolution of Search that continues to rely on its existing Search systems and signals, without mentioning llms.txt as an input.

OpenAI’s crawler documentation similarly focuses on robots.txt controls. OpenAI recommends allowing OAI-SearchBot in robots.txt to support discovery for its search features, but does not say llms.txt affects ranking or citations.

SE Ranking also notes that some SEO logs show GPTBot occasionally fetching llms.txt files, though they say it doesn’t happen often and does not appear tied to citation outcomes.

Taken together, the dataset suggests that even if some models retrieve the file, it’s not influencing citation behavior at scale right now.

What This Means For You

If you want a clean, low-risk way to prepare for possible future adoption, adding llms.txt is easy and unlikely to cause technical harm.

But if the goal is a near-term visibility bump in AI answers, the data says you shouldn’t expect one.

That puts llms.txt in the same category as other early AI-visibility tactics. Reasonable to test if it fits your workflow, but not something to sell internally as a proven lever.
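If you do decide to test it, the community proposal at llmstxt.org describes llms.txt as a plain Markdown file served from the site root, with an H1 title, a short blockquote summary, and sections of annotated links. A minimal, entirely hypothetical example:

```markdown
# Example Co.

> Example Co. makes project management software. The links below point to
> LLM-friendly documentation.

## Docs

- [Product overview](https://example.com/docs/overview.md): What the product does
- [API reference](https://example.com/docs/api.md): Endpoints, parameters, errors

## Optional

- [Company background](https://example.com/about.md)
```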



SEO Community Reacts To Adobe’s Semrush Acquisition via @sejournal, @martinibuster

The SEO community is excited by Adobe’s acquisition of Semrush. The consensus is that it’s a milestone in the continuing evolution of SEO in the age of generative AI. Adobe’s purchase comes at a time of AI-driven uncertainty and may signal how important data has become for businesses and marketers who are still trying to find a new way forward.

Cyrus Shepard tweeted that he believes the Semrush sale creates an opportunity for Ahrefs, reasoning that Adobe’s scale and emphasis on the enterprise market will leave room for Ahrefs to move fast and respond to the rapidly changing needs of the marketing industry.

He tweeted:

“Adobe’s marketing tools lean towards ENTERPRISE (AEM, Adobe Analytics). If Adobe leans this way with Semrush, it may be a less attractive solution to smaller operators.

With this acquisition, @ahrefs remains the only large, independent SEO tool suite on the market. Ahrefs is able to move fast and innovate – I suspect this creates an opportunity for Ahrefs – not a problem.”

Shepard is right that some of Adobe’s products (like Adobe Analytics) lean toward enterprise users, but there’s a significant small and medium-sized business user base for its design-related tools, with pricing in the $99/month range that keeps them relatively affordable. Nevertheless, that’s a significant ongoing cost compared to the roughly $600 Adobe used to charge for standalone versions on Windows and Mac.

I agree that Ahrefs is quite likely the best-positioned tool to serve the needs of the SMB end of the SEO industry should Semrush increase its focus on the enterprise market. But there are also smaller tools, like SERPrecon, that are tightly focused on helping businesses deliver results and may benefit from the vacuum left by Semrush.

Validates SEO Platforms

Seth Besmertnik, CEO of the enterprise SEO platform Conductor, sees the acquisition as validating SEO platforms, a fair observation considering the all-cash price Semrush commanded.

Besmertnik wrote:

“I’m feeling a lot this morning. HUGE news today. Adobe will be acquiring Semrush…our partner, competitor, and an ally in the broader SEO and AEO/GEO world for over a decade.

For a long time, big tech ignored SEO. It drove half of the internet’s traffic, yet somehow never cleared the bar as something to own. I always believed the day would come when major platforms took this category seriously. Today is that day.”


Besmertnik also made the point that the industry is entering a transitional phase where platforms that are adapted to AI will be the leaders of tomorrow.

He added:

“This next era won’t be led by legacy architectures. It will be led by platforms that built their foundations for AI…and by companies engineered for the data-first, enterprise-grade world that’s now taking shape.”

Validates SEO

Duane Forrester, formerly of Bing, shared the insight that the acquisition shows how important SEO is, especially as the industry is evolving to meet the challenges of AI search.

Duane shared:

“It’s an exciting moment! We’re starting to see some consolidation and this represents huge recognition of how important the work of SEOs is. From traditional SEO through optimizing for AI platforms, the work is important. Clearly Adobe is thinking this way on behalf of their clientele, which means great things ahead.”

Online Reactions Were Mostly Positive

A few comments published in response to Adobe’s announcement on X (formerly Twitter) carried negative sentiment, with some using the post to vent about pricing and other grudges, but many others from the SEO community offered congratulations to Semrush.

What It All Means

As multiple people have said, the sale of Semrush is a landmark moment for SEO and for SEO platforms because it puts a dollar figure on the importance of digital marketing at a time when the search marketing industry is struggling to reach consensus on how SEO should evolve to meet the many changes introduced by AI search.

Many Questions Remain Unanswered

What Will Adobe Actually Do With Semrush’s Product?

Will Semrush remain a standalone product, will it be offered in multiple versions for enterprise users and SMBs, or will it be folded into one of Adobe’s cloud offerings?

Pricing

A common concern is about pricing and whether the cost of Semrush will go up. Is it possible that the price could actually come down?

Semrush Is A Good Fit For Adobe

Adobe started as a software company focused on graphic design products, but around the turn of the millennium it began acquiring companies directly related to digital marketing and web design, increasingly focusing on the enterprise market. Data is useful for planning content and for better understanding what’s happening at search engines and in AI-based search and chat. Semrush is a good fit for Adobe.


New Data Finds Gap Between Google Rankings And LLM Citations via @sejournal, @MattGSouthern

Large language models cite sources differently than Google ranks them.

Search Atlas, an SEO software company, compared citations from OpenAI’s GPT, Google’s Gemini, and Perplexity against Google search results.

The analysis of 18,377 matched queries finds a gap between traditional search visibility and AI platform citations.

Here’s an overview of the key differences Search Atlas found.

Perplexity Is Closest To Search

Perplexity performs live web retrieval, so you would expect its citations to look more like search results. The study supports that.

Across the dataset, Perplexity showed a median domain overlap of around 25–30% with Google results. Median URL overlap was close to 20%. In total, Perplexity shared 18,549 domains with Google, representing about 43% of the domains it cited.

ChatGPT And Gemini Are More Selective

ChatGPT showed much lower overlap with Google. Its median domain overlap stayed around 10–15%. The model shared 1,503 domains with Google, accounting for about 21% of its cited domains. URL matches typically remained below 10%.

Gemini behaved less consistently. Some responses had almost no overlap with search results. Others lined up more closely. Overall, Gemini shared just 160 domains with Google, representing about 4% of the domains that appeared in Google’s results, even though those domains made up 28% of Gemini’s citations.
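Search Atlas doesn’t publish its exact formula, but a per-query domain-overlap metric like the ones reported above can be sketched as a simple set intersection (normalizing against the LLM’s cited domains is an assumption about the denominator):

```python
from urllib.parse import urlparse

def domain_of(url: str) -> str:
    """Normalize a URL down to its hostname, ignoring a www. prefix."""
    return urlparse(url).netloc.lower().removeprefix("www.")

def domain_overlap_pct(llm_citations: list[str], google_results: list[str]) -> float:
    """Percent of the LLM's cited domains that also appear in Google's results."""
    cited = {domain_of(u) for u in llm_citations}
    ranked = {domain_of(u) for u in google_results}
    return 100 * len(cited & ranked) / len(cited) if cited else 0.0

# Hypothetical query pair:
citations = ["https://www.example.com/a", "https://docs.python.org/3/"]
serp = ["https://example.com/b", "https://stackoverflow.com/q/1"]
print(f"{domain_overlap_pct(citations, serp):.1f}%")  # 50.0%
```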

What The Numbers Mean For Visibility

Ranking in Google doesn’t guarantee LLM citations. This report suggests the systems draw from the web in different ways.

Perplexity’s architecture actively searches the web and its citation patterns more closely track traditional search rankings. If your site already ranks well in Google, you are more likely to see similar visibility in Perplexity answers.

ChatGPT and Gemini rely more on pre-trained knowledge and selective retrieval. They cite a narrower set of sources and are less tied to current rankings. URL-level matches with Google are low for both.

Study Limitations

The dataset heavily favored Perplexity. It accounted for 89% of matched queries, with OpenAI at 8% and Gemini at 3%.

Researchers matched queries using semantic similarity scoring. Paired queries expressed similar information needs but were not identical user searches. The threshold was 82% similarity using OpenAI’s embedding model.
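For context, a matching step along those lines, embedding two queries and accepting the pair when cosine similarity clears the threshold, might look like this sketch (the specific embedding model is an assumption, since the study doesn’t name one):

```python
import numpy as np
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def queries_match(query_a: str, query_b: str, threshold: float = 0.82) -> bool:
    # The report says "OpenAI's embedding model" without naming one;
    # text-embedding-3-small here is an assumption.
    resp = client.embeddings.create(
        model="text-embedding-3-small",
        input=[query_a, query_b],
    )
    a, b = (np.array(d.embedding) for d in resp.data)
    return cosine_similarity(a, b) >= threshold

print(queries_match("best running shoes 2024", "top rated running shoes"))
```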

The two-month window provides a recent snapshot only. Longer timeframes would be needed to see whether the same overlap patterns hold over time.

Looking Ahead

For retrieval-based systems like Perplexity, traditional SEO signals and overall domain strength are likely to matter more for visibility.

For reasoning-focused models like ChatGPT and Gemini, those signals may have less direct influence on which sources appear in answers.



Adobe To Acquire Semrush In $1.9 Billion Cash Deal via @sejournal, @MattGSouthern

Adobe and Semrush announced today that they have entered into a definitive agreement for Adobe to acquire Semrush in an all-cash transaction valued at approximately $1.9 billion. Adobe will pay $12.00 per share, describing Semrush as a “leading brand visibility platform.”

The acquisition brings a widely used SEO platform under Adobe’s Digital Experience umbrella.

The deal is expected to close in the first half of 2026, subject to regulatory approvals and the approval of Semrush stockholders.

What Adobe Is Buying

Semrush is a Boston-based SaaS platform best known in search marketing for keyword research, site audits, competitive intelligence, and online visibility tracking.

Over the past two years, Semrush has added enterprise products focused on AI-driven visibility, including tools that monitor how brands are referenced in responses from large language models such as ChatGPT and Gemini, alongside traditional search results.

Semrush has also been an active acquirer. Recent deals have included SEO education and community assets like Backlinko and Traffic Think Tank, as well as technology and media acquisitions such as Third Door Media, the publisher of Search Engine Land.

For Adobe, this gives the Experience Cloud portfolio a direct line into the SEO workflow that many in-house teams and agencies already use daily.

How Semrush Fits Adobe’s AI Marketing Stack

Adobe positions the deal as part of a broader strategy to support “brand visibility” in what it describes as an agentic AI era.

In the announcement, Anil Chakravarthy, president of Adobe’s Digital Experience business, says:

“Brand visibility is being reshaped by generative AI, and brands that don’t embrace this new opportunity risk losing relevance and revenue.”

Semrush’s “generative engine optimization” positioning aligns with that narrative. The company has been pitching GEO as a counterpart to traditional SEO, focused on keeping brands discoverable inside AI-generated answers, not just organic listings.

Adobe plans to integrate Semrush with products like Adobe Experience Manager, Adobe Analytics, and its newer Brand Concierge offering.

Deal Terms And Timeline

Under the terms of the agreement, Adobe will acquire Semrush for $12.00 per share in cash, representing a total equity value of roughly $1.9 billion.

Coverage from financial outlets notes that the price reflects a premium of around 77 percent over Semrush’s prior closing share price and that Semrush stock jumped more than 70 percent in early trading following the announcement.

According to the companies, the transaction has already been approved by both boards. An associated SEC filing shows the merger agreement was signed on November 18.

Closing is targeted for the first half of 2026, pending customary regulatory reviews and the approval of Semrush shareholders. Until then, Adobe and Semrush say they will continue to operate as separate companies.

Why This Matters

This deal continues a broader trend: core search and visibility tools are moving deeper into large enterprise suites.

If you already rely on Semrush, you can expect tighter integration with Adobe’s analytics and customer experience products over time.

It also raises practical questions:

  • How will Semrush be packaged and priced once it sits inside Adobe’s enterprise stack?
  • Can agencies and smaller teams keep using Semrush as a relatively independent tool?
  • How will Adobe choose to handle Semrush’s media holdings, including Search Engine Land and related properties?

For now, both companies are presenting the acquisition as a way to give marketers a more complete view of brand visibility across search results and AI-generated answers, rather than as a change to Semrush’s current product line.

Looking Ahead

In the near term, there are two things to watch.

First, regulators will review the transaction, particularly given Adobe’s history with large acquisitions in the digital experience space. That process will shape the closing timeline.

Second, Adobe will need to decide how quickly to integrate Semrush into Experience Cloud and how much to preserve the existing product and brand. Those choices will influence how disruptive this feels for your current workflows.

Watch for changes to Semrush’s API access, plan structure, and reporting integrations once the deal moves closer to completion.



Google Brings Gemini 3 To Search’s AI Mode via @sejournal, @MattGSouthern

Google has integrated Gemini 3 into Search’s AI Mode. This marks the first time Google has shipped a Gemini model to Search on its release date.

Google AI Pro and Ultra subscribers in the U.S. can access Gemini 3 Pro by selecting “Thinking” from the model dropdown in AI Mode.

Robby Stein, VP and GM of Google Search, wrote on X:

“Gemini 3, our most intelligent model, is landing in Google Search today – starting with AI Mode. Excited that this is the first time we’re shipping a new Gemini model in Search on day one.”

Google plans to expand Gemini 3 in AI Mode to all U.S. users soon, with higher usage limits for Pro and Ultra subscribers.

What’s New

Search Updates

Google describes Gemini 3 as a model with state-of-the-art reasoning and deep multimodal understanding.

In the context of Search, it’s designed to explain advanced concepts, work through complex questions, and support interactive visuals that run directly inside AI Mode responses.

With Gemini 3 in place, Google says AI Mode has effectively re-architected what a “helpful response” looks like.

Stein explains:

“Gemini 3 is also making Search smarter by re-architecting what a helpful response looks like. With new generative UI capabilities, Gemini 3 in AI Mode can now dynamically create the overall response layout when it responds to your query – completely on the fly.”

Instead of only returning a block of text, AI Mode can design a response layout tailored to your query. That includes deciding when to surface images, tables, or other structured elements so the answer is clearer and easier to work with.

In the coming weeks, Google will add automatic model selection, Stein continues:

“Search will intelligently route tough questions in AI Mode and AI Overviews to our frontier model, while continuing to use faster models for simpler tasks.”

Enhanced Query Fan-Out

Gemini 3 upgrades Google’s query fan-out technique.

According to Stein, Search can now issue more related searches in parallel and better interpret what you’re trying to do.

A potential benefit, Stein adds, is that Google may find content it previously missed:

“It now performs more and much smarter searches because Gemini 3 better understands you. That means Search can now surface even more relevant web content for your specific question.”
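Google hasn’t published how fan-out works internally, but the concept itself, issuing the original query alongside machine-generated related queries in parallel and merging the results, can be illustrated with a toy sketch (everything here, including the stand-in search function, is hypothetical):

```python
import asyncio

async def search(query: str) -> list[str]:
    """Stand-in for a single search request; returns result URLs."""
    await asyncio.sleep(0.1)  # simulate network latency
    return [f"https://example.com/{query.replace(' ', '-')}"]

async def fan_out(user_query: str, related: list[str]) -> list[str]:
    # Run the original query plus related queries concurrently,
    # then merge the result lists while preserving order and de-duplicating.
    batches = await asyncio.gather(*(search(q) for q in [user_query, *related]))
    seen, merged = set(), []
    for urls in batches:
        for url in urls:
            if url not in seen:
                seen.add(url)
                merged.append(url)
    return merged

results = asyncio.run(fan_out(
    "plan a 3 day trip to kyoto",
    ["kyoto itinerary 3 days", "kyoto temples opening hours"],
))
print(results)
```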

Generative UI

Gemini 3 in AI Mode introduces generative UI features that build dynamic visual layouts around your query.

The model analyzes your question and constructs a custom response using visual elements such as images, tables, and grids. When an interactive tool would help, Gemini 3 can generate a small app in real time and embed it directly in the answer.

Examples from Google’s announcement include:

  • An interactive physics simulation for exploring the three-body problem
  • A custom mortgage loan calculator that lets you compare different options and estimate long-term savings

All of these responses include prominent links to high-quality content across the web so you can click through to source material.


Why This Matters

Gemini 3 changes how your content is discovered and used in AI Mode. With deeper query fan-out, Google can access more pages per question, which might influence which sites are cited or linked during long, complex searches.

The updated layouts and interactive features change how links appear on your screen. On-page tools, explainers, and visualizations could now compete directly with Google’s own interface.

As Gemini 3 becomes available to more people, it will be important to watch how your content is shown or referenced in AI responses, in addition to traditional search rankings.

Looking Ahead

Google says it will continue refining these updates based on feedback as more people try the new tools. Automatic model selection is set to arrive in the coming weeks for Google AI Pro and Ultra subscribers in the U.S., with broader U.S. access to Gemini 3 in AI Mode planned but not yet scheduled.

Cloudflare Outage Triggers 5xx Spikes: What It Means For SEO via @sejournal, @MattGSouthern

A Cloudflare incident is returning 5xx responses for many sites and apps that sit behind its network, which means users and crawlers may be running into the same errors.

From an SEO point of view, this kind of outage often looks worse than it is. Short bursts of 5xx errors usually affect crawl behavior before they touch long-term rankings, but there are some details worth paying attention to.

What You’re Likely Seeing

Sites that rely on Cloudflare as a CDN or reverse proxy may currently be serving generic “500 internal server error” pages or failing to load at all. In practice, everything in that family of responses is treated as a server error.

If Googlebot happens to crawl while the incident is ongoing, it will record the same 5xx responses that users see. You may not notice anything inside Search Console immediately, but over the next few days you could see a spike in server errors, a dip in crawl activity, or both.

Keep in mind that Search Console data is rarely real-time and often lags by roughly 48 hours. A flat line in GSC today could mean the report hasn’t caught up yet. If you need to confirm that Googlebot is encountering errors right now, you will need to check your raw server access logs.
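For example, a quick pass over a combined-format access log can surface 5xx responses served to Googlebot, bucketed by hour (the log path is a placeholder, and user-agent matching is only a rough filter, since verified Googlebot detection requires a reverse-DNS check on the requesting IP):

```python
import re
from collections import Counter

LOG_PATH = "/var/log/nginx/access.log"  # placeholder; use your real log
STATUS_RE = re.compile(r'" (\d{3}) ')  # status code after the quoted request

errors_by_hour = Counter()
with open(LOG_PATH) as log:
    for line in log:
        if "Googlebot" not in line:
            continue
        m = STATUS_RE.search(line)
        if m and m.group(1).startswith("5"):
            # Combined log format timestamp: [18/Nov/2025:13:05:21 +0000];
            # the first 14 characters give us day plus hour.
            hour = line.split("[", 1)[1][:14] if "[" in line else "unknown"
            errors_by_hour[hour] += 1

for hour, count in sorted(errors_by_hour.items()):
    print(f"{hour}  {count} 5xx responses to Googlebot")
```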

This can feel like a ranking emergency. It helps to understand how Google has described its handling of temporary server problems in the past, and what Google representatives are saying today.

How Google Handles Short 5xx Spikes

Google groups 5xx responses as signs that a server is overloaded or unavailable. According to Google’s Search Central documentation on HTTP status codes, 5xx and 429 errors prompt crawlers to temporarily slow down, and URLs that continue to return server errors can eventually be dropped from the index if the issue remains unresolved.

Google’s “How To Deal With Planned Site Downtime” blog post gives similar guidance for maintenance windows, recommending a 503 status code for temporary downtime and noting that long-lasting 503 responses can be treated as a sign that content is no longer available.
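As a minimal illustration of that guidance, a maintenance server that answers every request with a 503 and a Retry-After hint might look like this (a sketch, not a production setup):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class MaintenanceHandler(BaseHTTPRequestHandler):
    """Answer every request with a 503 while the site is down for maintenance."""

    def do_GET(self):
        self.send_response(503)
        # Retry-After tells well-behaved clients when to check back (seconds).
        self.send_header("Retry-After", "3600")
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.end_headers()
        self.wfile.write(b"<h1>Down for maintenance, back soon.</h1>")

if __name__ == "__main__":
    HTTPServer(("", 8080), MaintenanceHandler).serve_forever()
```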

In a recent Bluesky post, Google Search Advocate John Mueller reinforced the same message in plainer language. Mueller wrote:

“Yeah. 5xx = Google crawling slows down, but it’ll ramp back up.”

He added:

“If it stays at 5xx for multiple days, then things may start to drop out, but even then, those will pop back in fairly quickly.”

Taken together, the documentation and Mueller’s comments draw a fairly clear line.

Short downtime is usually not a major ranking problem. Already indexed pages tend to stay in the index for a while, even if they briefly return errors. When availability returns to normal, crawling ramps back up and search results generally settle.

The picture changes when server errors become a pattern. If Googlebot sees 5xx responses for an extended period, it can start treating URLs as effectively gone. At that point, pages may drop from the index until crawlers see stable, successful responses again, and recovery can take longer.

The practical takeaway is that a one-off infrastructure incident is mostly a crawl and reliability concern. Lasting SEO issues tend to appear when errors linger well beyond the initial outage window.


Analytics & PPC Reporting Gaps

For many sites, Cloudflare sits in front of more than just HTML pages. Consent banners, tag managers, and third-party scripts used for analytics and advertising may all depend on services that run through Cloudflare.

If your consent management platform or tag manager was slow or unavailable during the outage, that can show up later as gaps in GA4 and ad platform reporting. Consent events may not have fired, tags may have timed out, and some sessions or conversions may not have been recorded at all.

When you review performance, you might see a short cliff in GA4 traffic, a drop in reported conversions in Google Ads or other platforms, or both. In many cases, that will reflect missing data rather than a real collapse in demand.

It’s safer to annotate today’s incident in your analytics and media reports and treat it as a tracking gap before you start reacting with bid changes or budget shifts based on a few hours of noisy numbers.

What To Do If You Were Hit

If you believe you’re affected by today’s outage, start by confirming that the problem is really tied to Cloudflare and not to your origin server or application code. Check your own uptime monitoring and any status messages from Cloudflare or your host so you know where to direct engineering effort.

Next, record the timing. Note when you first saw 5xx errors and when things returned to normal. Adding an annotation in your analytics, Search Console, and media reporting makes it much easier to explain any traffic or conversion drops when you review performance later.

Over the coming days, keep an eye on the Crawl Stats Report and index coverage in Search Console, along with your own server logs. You’re looking for confirmation that crawl activity returns to its usual pattern once the incident is over, and that server error rates drop back to baseline. If the graphs settle, you can treat the outage as a contained event.

If, instead, you continue to see elevated 5xx responses after Cloudflare reports the issue as resolved, it’s safer to treat the situation as a site-specific problem.

What you generally do not need to do is change content, internal linking, or on-page SEO purely in response to a short Cloudflare outage. Restoring stability is the priority.

Finally, resist the urge to hit ‘Validate Fix’ in Search Console the moment the site comes back online. If you trigger validation while the connection is still intermittent, the check will fail, and you will have to wait for the cycle to reset. It is safer to wait until the status page says ‘Resolved’ for a full 24 hours before validating.

Why This Matters

Incidents like this one are a reminder that search visibility is tied to reliability as much as relevance. When a provider in the middle of your stack has trouble, it can quickly look like a sudden drop, even when the root cause is outside your site.

Knowing how Google handles temporary 5xx spikes and how they influence analytics and PPC reports can help you communicate better with your clients and stakeholders. It allows you to set realistic expectations and recognize when an outage has persisted long enough to warrant serious attention.

Looking Ahead

Once Cloudflare closes out its investigation, the main thing to watch is whether your crawl, error, and conversion metrics return to normal. If they do, this morning’s 5xx spike is likely to be a footnote in your reporting rather than a turning point in your organic or paid performance.

Google On Generic Top Level Domains For SEO via @sejournal, @martinibuster

Google’s John Mueller answered a question about whether a generic Top Level Domain (gTLD) with a keyword in it offered any SEO advantage. His answer was in the context of a specific keyword TLD, but the topic involves broader questions about how Google evaluates TLDs in general.

Generic Top Level Domains (gTLDs)

gTLDs are top-level domains that have a theme relating to a topic or a purpose. The most commonly known are .com (generally used for commercial purposes) and .org (typically used by non-profit organizations).

The availability of unique keyword-based gTLDs exploded in 2013. Now there are hundreds of gTLDs with which a website can brand itself and stand out.

Is There SEO Value In gTLDs?

The person asking the question on Reddit wanted to know if there’s an SEO value to registering a .music gTLD. The regular .com version of the domain name they wanted was not available but the .music version was.

The question they asked was:

“Noticed .music domains available and curious if it is relevant, growing etc or does the industry not care about it whatsoever? Is it worth reserving yours anyways just so someone else can’t have it, in case it becomes a thing?”

Are gTLDs Useful For SEO Purposes?

Google’s John Mueller limited his response to whether gTLDs offer SEO value, and his answer was no.

He answered:

“There’s absolutely no SEO advantage from using a .music domain.”

The funny thing about SEO is that Google’s standard of relevance is based on humans, while SEOs think of relevance in terms of what Google thinks is relevant.

This sets up a huge disconnect: SEOs on one side are creating websites that are keyword-optimized for Google, while Google itself is analyzing billions of user behavior signals because it’s optimizing search results for humans.

Optimizing For Humans With gTLDs

The thing about SEO is that it’s search engine optimization. When venturing out on the web, it’s easy to forget that every website must be optimized for humans, too. Aside from spammy TLDs, which can be problematic for SEO, the choice of a TLD isn’t important for SEO, but it could be important for human optimization.

Optimizing for humans is a good idea because human interactions with search engines and websites generate signals that Google uses at scale to better understand what users mean by their queries and what kinds of sites they expect to see for those queries. Some user-generated signals, like searching by brand name, can tell Google that a particular brand is popular and is associated with a particular service, product, or keyword phrase (read about Google’s patent on branded search).

Circling back to optimizing for humans, if a particular gTLD is something that humans may associate with a brand, product, or service then there is something there that can be useful for making a site attractive to users.

I have experimented in the past with various gTLDs and found that I was able to build links more easily to .org domains than to the .com or .net versions. That’s an example of how a gTLD can be optimized for humans and lead to success.

I discovered that overtly commercial affiliate sites on .org domains ranked and converted well. They didn’t rank because they were .org, though. The sites were top-ranked because humans responded well to the sites I created with that gTLD. It was easier to build links to them, for example. I have no doubt that people trusted my affiliate sites a little more because they were created on .org domains.

Optimizing for humans is conversion optimization. It’s super important.

Optimizing For Humans With Keyword-Based gTLDs

I haven’t played around with keyword gTLDs, but I suspect that what I experienced with .org domains could happen with a keyword-based gTLD, because a meaningful gTLD may communicate positive feelings or relevance to humans. You can call it branding, but I think the word “branding” is too abstract. I prefer the phrase optimizing for humans because, in the end, that’s what branding is really about.

So maybe it’s time we ditched bla, bla, bla-ing about branding and started talking about optimizing for humans. If that person had considered the question from the perspective of human optimization, they might have been able to answer the question themselves.

When SEOs talk about relevance it seems like they’re generally referring to how relevant something is to Google. Relevance to Google is what was top of mind to the person asking the question about the .music gTLD and it might be why you’re reading this article.

Heck, relevance to search engines is what all that “entity” optimization hand waving is all about, right? Focusing on being relevant to search engines is a limited way to chase after success. For example, I cracked the code with the .org domains by focusing on humans.

At a certain point, if you’re trying to be successful online, it may be useful to take a step back and start thinking more about how relevant the content, colors, and gTLDs are to humans and you might discover that being relevant to humans makes it easier to be relevant to search engines.


Google Search Console Adds Custom Annotations To Reports via @sejournal, @MattGSouthern

Google launched custom annotations in Search Console performance reports, giving you a way to add contextual notes directly to traffic data charts.

The feature lets you mark specific dates with notes explaining site changes or external events that affected search performance.

What The Feature Does

Custom annotations appear as markers on Search Console charts. Google’s announcement highlights several common use cases, including infrastructure changes, SEO work, content strategy shifts, and external events that affect business performance such as holidays.

All annotations are visible to everyone with access to a Search Console property. Google recommends avoiding sensitive personal information in notes due to the shared visibility.

Why This Matters

Connecting traffic changes with specific actions taken weeks or months earlier usually means maintaining separate documentation outside Search Console.

Annotations create a change log inside the performance reports you already use.

If you manage multiple properties or work with a larger team, annotations can give everyone a shared record of releases, migrations, and campaigns without relying on external spreadsheets or project tools.

How To Use It

You can add an annotation by right-clicking on a performance chart, selecting “Add annotation,” choosing a date, and entering up to 120 characters of text. The note then appears directly on the chart as a visual reference point alongside clicks, impressions, or other metrics.

Custom annotations are now part of Search Console performance reports and available through the chart context menu.

Google Extends AI Travel Planning And Agentic Booking In Search via @sejournal, @MattGSouthern

Google announced three AI-powered updates to Search that extend how users plan and book travel within AI Mode.

The company is launching Canvas for travel planning on desktop, expanding Flight Deals globally, and rolling out agentic booking capabilities that connect users directly to reservation partners.

The announcement continues Google’s push to handle complete user journeys inside Search rather than directing traffic to publisher sites and booking platforms.

What’s New

Canvas Travel Planning

Canvas creates travel itineraries inside AI Mode’s side panel interface. You describe your trip requirements, select “Create with Canvas,” and receive plans combining flight and hotel data, Google Maps information, and web content.

Canvas travel planning is available on desktop in the US for users opted into the AI Mode experiment in Google Labs.

Flight Deals Global Expansion

Flight Deals uses AI to match flexible travelers with affordable destinations based on natural language descriptions of travel preferences.

The tool launched previously in the US, Canada, and India. The feature has started rolling out to more than 200 countries and territories.

Agentic Booking Expansion

AI Mode now searches across multiple reservation platforms to find real-time availability for restaurants, events, and local appointments. The system presents curated options with direct booking links to partner sites.

Restaurant booking launches this week in the US without requiring Labs access. Event tickets and local appointment booking remain available to US Labs users.

Why This Matters

Canvas and agentic booking capabilities represent Google handling trip research, planning, and reservations inside its own interface.

People who would previously visit multiple publisher sites to research destinations and compare options can now complete those tasks in AI Mode.

The updates fit Google’s established pattern of verticalizing high-value query types. Rather than presenting traditional search results that send users to external sites, AI Mode guides users through multi-step processes from research to transaction completion.

Looking Ahead

Google provided no timeline for direct flight and hotel booking in AI Mode beyond confirming active development with industry partners.

Watch for whether Google provides analytics or attribution tools that let businesses track bookings initiated through AI Mode. Without visibility into these flows, measuring the impact of AI Mode on travel and local business traffic will be difficult.