Primer on ChatGPT’s 3 Bots

ChatGPT uses separate bots for training, searching, and taking action:

  • GPTBot provides training data.
  • OAI-SearchBot gathers data to respond to specific prompts.
  • ChatGPT-User accesses pages when requested by users.

Knowing which bot is responsible for which task is essential before attempting to disallow it.

GPTBot

GPTBot locates information to build and update training data, ChatGPT’s knowledge base for providing answers.

ChatGPT doesn’t store training URLs or track where the info comes from. Disallowing this bot will prevent the platform from using your content for training, but it won’t impact your traffic. It may affect what ChatGPT understands about your company, though external sources likely provide that information, too.

Some publishers disallow the bot to prevent ChatGPT from learning from their content and to reduce costs, as AI bots can increase hosting needs and slow down servers, especially for large sites.

I typically suggest allowing access to GPTBot to provide first-hand information about a business and thus control the context.

ChatGPT updates training data regularly, usually with each release.

OAI-SearchBot

OAI-SearchBot searches the web for current information, user reviews, product details, and more.

Opinions differ as to whether the platform indexes the URLs from these searches. (ChatGPT states it “uses a hybrid system that includes limited indexing, plus on-demand retrieval, rather than a single, exhaustive web index.”)

OAI-SearchBot searches Google, Bing, Reddit, and others for info, much like humans, and may independently crawl sites, too.

Disallowing this bot prevents it from visiting your site, but it may still cite your pages via external links. Google does this, too, incidentally. A robots.txt file can prevent Google’s bot from crawling a site, but the search giant can still index and rank its pages.

Still, disallowing OAI-SearchBot will likely reduce or eliminate citations (and traffic), which is why I don’t usually advise it.

ChatGPT-User

ChatGPT-User performs actions as requested by users. For example, a user can prompt ChatGPT to visit a page and summarize its content.

ChatGPT-User does not provide training data or citations. If your server logs include this bot, a human instructed ChatGPT to visit your site. There’s no way to block this bot because it’s user-initiated, per ChatGPT.
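
For publishers who do decide to disallow a bot, the mechanics live in robots.txt. Here is a minimal sketch using the user-agent tokens OpenAI publishes; this version blocks GPTBot from training on the site while leaving OAI-SearchBot free to crawl, one way to weigh the trade-offs described above. Adapt the directives to your own site.

  # Block GPTBot sitewide (content won't be used for training)
  User-agent: GPTBot
  Disallow: /

  # Allow OAI-SearchBot everywhere (preserves citations and referral traffic)
  User-agent: OAI-SearchBot
  Disallow:

  # ChatGPT-User is user-initiated; per the article, it can't be blocked this way.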

Google Explains Why Staggered Site Migrations Impact SEO Outcome via @sejournal, @martinibuster

Google’s John Mueller recently answered a question about how Google responds to staggered site moves where a site is partially moved from one domain to another. He said a standard site move is generally fine, but clarified his position when it came to partial site moves.

Straight Ahead Site Move?

Someone asked about doing a site move, initially giving the impression that they were moving the entire site. The question was in the context of using Google Search Console’s change of address feature.

They asked:

“Do you have any thoughts on this GSC Change of Address question?

Can we submit the new domain if a few old URLs still get traffic and aren’t redirected yet, or should we wait until all redirects are live?”

Mueller initially answered that it should be fine:

“It’s generally fine (for example, some site moves keep the robots.txt on the old domain with “allow: /” so that all URLs can be followed). The tool does check for the homepage redirect though.”

Google Explains Why Partial Site Moves Are Problematic

His opinion changed, however, after the OP responded with additional information: the home page had been moved, while many of the product and category pages would stay on the old domain for now. In other words, they wanted to move parts of the site now and other parts later, keeping one foot on the new domain and the other firmly planted on the old one.

That’s a different scenario entirely. Unsurprisingly, Mueller changed his opinion.

He responded:

“Practically speaking, it’s not going to be seen as a full site move. You can still use the change of address tool, but it will be a messy situation until you’ve really moved it all over. If you need to do this (sometimes it’s not easy, I get it :)), just know that it won’t be a clean slate.

…You’ll have a hard time tracking things & Google will have a hard time understanding your sites. My recommendation would be to clean it up properly as soon as you can. Even properly planned & executed site migrations can be hard, and this makes it much more challenging.”
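
For a full move, the redirect piece referenced above is usually a blanket server rule. A minimal sketch, assuming an Nginx front end and hypothetical domains old-example.com and new-example.com, that 301-redirects every old URL to the same path on the new domain:

  server {
      listen 80;
      server_name old-example.com www.old-example.com;

      # Permanently redirect every path to its counterpart on the new domain
      return 301 https://new-example.com$request_uri;
  }

A blanket rule like this is part of what makes a move “clean” in Mueller’s sense; a partial move can’t use one, because some paths must keep resolving on the old domain, which is exactly the messiness he describes.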

Google’s Site Understanding

Something I find intriguing is Mueller’s occasional reference to Google’s understanding of a website. He has mentioned this factor in other contexts in the past, where it seems to be a catchall for things related to quality, but also for something he has elsewhere described as relevance: understanding where a site fits in the Internet.

In this context, Mueller appears to be using the phrase to mean understanding the site relative to the domain name.

Featured Image by Shutterstock/Here

Cloudflare Report: Googlebot Tops AI Crawler Traffic via @sejournal, @MattGSouthern

Cloudflare published its sixth annual Year in Review, offering a comprehensive look at Internet traffic, security, and AI crawler activity across 2025.

The report draws on data from Cloudflare’s network, which spans more than 330 cities across 125 countries and handles over 81 million HTTP requests per second on average.

The AI crawler findings stand out. Googlebot crawled far more web pages than any other AI bot, reflecting Google’s dual-purpose approach to crawling for both search indexing and AI training.

Googlebot Tops AI Crawler Traffic

Cloudflare analyzed successful requests for HTML content from leading AI crawlers during October and November 2025. The results showed Googlebot reached 11.6% of unique web pages in the sample.

That’s more than 3 times the pages seen by OpenAI’s GPTBot at 3.6%. It’s nearly 200 times more than PerplexityBot, which crawled just 0.06% of pages.

Bingbot came in third at 2.6%, followed by Meta-ExternalAgent and ClaudeBot at 2.4% each.

The report noted that because Googlebot crawls for both search indexing and AI model training, web publishers face a difficult choice. Blocking Googlebot’s AI training means risking search discoverability.

Cloudflare wrote:

“Because Googlebot is used to crawl content for both search indexing and AI model training, and because of Google’s long-established dominance in search, Web site operators are essentially unable to block Googlebot’s AI training without risking search discoverability.”

AI Bots Now Account For 4.2% of HTML Requests

Throughout 2025, AI bots (excluding Googlebot) averaged 4.2% of HTML requests across Cloudflare’s customer base. The share fluctuated between 2.4% in early April and 6.4% in late June.

Googlebot alone accounted for 4.5% of HTML requests, slightly more than all other AI bots combined.

The share of human-generated HTML traffic started 2025 at seven percentage points below non-AI bot traffic. By September, human traffic began exceeding non-AI bot traffic on some days. As of December 2, humans generated 47% of HTML requests while non-AI bots generated 44%.

Crawl-to-Refer Ratios Show Wide Variation

Cloudflare tracks how often AI and search platforms send traffic to sites relative to how often they crawl. A high ratio means heavy crawling without sending users back to source sites.

Anthropic had the highest ratios among AI platforms, ranging from approximately 25,000:1 to 100,000:1 during the second half of the year after stabilizing from earlier volatility.

OpenAI’s ratios reached as high as 3,700:1 in March. Perplexity maintained the lowest ratios among leading AI platforms, generally below 400:1 and under 200:1 from September onward.

For comparison, Google’s search crawl-to-refer ratio stayed much lower, generally between 3:1 and 30:1 throughout the year.
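
The ratio itself is simple arithmetic: crawl requests divided by referred visits over the same period. A short Python sketch with hypothetical counts (chosen to mirror the figures above) shows how you might compute it from your own logs:

  # Hypothetical counts pulled from server logs or analytics for the same window
  crawl_requests = {"anthropic": 2_500_000, "openai": 740_000, "perplexity": 38_000}
  referred_visits = {"anthropic": 50, "openai": 200, "perplexity": 190}

  for platform, crawls in crawl_requests.items():
      refers = referred_visits[platform]
      # A high ratio means heavy crawling with little traffic sent back
      print(f"{platform}: {crawls // refers:,}:1")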

User-Action Crawling Grew More Than 15X

Not all AI crawling is for model training. “User action” crawling occurs when bots visit sites in response to user questions posed to chatbots.

This category saw the fastest growth in 2025. User-action crawling volume increased more than 15 times from January through early December. The trend closely matched the traffic pattern for OpenAI’s ChatGPT-User bot, which visits pages when users ask ChatGPT questions.

The growth showed a weekly usage pattern starting in mid-February, suggesting increased use in schools and workplaces. Activity dropped during June through August when students were on break and professionals took vacations.

AI Crawlers Most Blocked In Robots.txt

Cloudflare analyzed robots.txt files across nearly 3,900 of the top 10,000 domains. AI crawlers were the most frequently blocked user agents.

GPTBot, ClaudeBot, and CCBot had the highest number of full disallow directives. These directives tell crawlers to stay away from entire sites.

Googlebot and Bingbot showed a different pattern. Their disallow directives leaned heavily toward partial blocks, likely focused on login endpoints and non-content areas rather than full site blocking.
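
The distinction comes down to the Disallow path. A sketch of both patterns, with /login/ standing in for the kind of non-content area the report mentions:

  # Full block: the pattern most common for AI-only crawlers
  User-agent: CCBot
  Disallow: /

  # Partial block: the pattern seen for major search crawlers
  User-agent: Googlebot
  Disallow: /login/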

Civil Society Became Most-Attacked Sector

For the first time, organizations in the “People and Society” vertical were the most targeted by attacks. This category includes religious institutions, nonprofits, civic organizations, and libraries.

The sector received 4.4% of global mitigated traffic, up from under 2% at the start of the year. Attack share jumped to over 17% in late March and peaked at 23.2% in early July.

Many of these organizations are protected by Cloudflare’s Project Galileo.

Gambling and games, the most-attacked vertical in 2024, saw its share drop by more than half to 2.6%.

Other Key Findings

Cloudflare’s report included several additional findings across traffic, security, and connectivity.

Global Internet traffic grew 19% year-over-year. Growth stayed relatively flat through mid-April, then accelerated after mid-August.

Post-quantum encryption now secures 52% of human traffic to Cloudflare, nearly double the 29% share at the start of the year.

ChatGPT remained the top generative AI service globally. Google Gemini, Windsurf AI, Grok/xAI, and DeepSeek were new entrants to the top 10.

Starlink traffic doubled in 2025, with service launching in more than 20 new countries.

Nearly half of the 174 major Internet outages observed globally were caused by government-directed shutdowns. Cable cut outages dropped nearly 50%, while power failure outages doubled.

European countries dominated Internet quality metrics. Spain topped the list for overall Internet quality, with average download speeds above 300 Mbps.

Why This Matters

The AI crawler data should affect how you think about bot access and traffic.

Google’s dual-purpose crawler creates a competitive advantage. You can block other AI crawlers while keeping Googlebot access for search visibility, but you can’t separate Google’s search crawling from its AI training crawling.

The crawl-to-refer ratios help quantify what publishers already suspected. AI platforms crawl heavily but send little traffic back. The gap between crawling and referring varies widely by platform.

The civil society attack data matters if you work with nonprofits or advocacy organizations. These groups now face the highest rate of attacks.

Looking Ahead

Cloudflare expects AI metrics to change as the space continues to evolve. The company added several new AI-related datasets to this year’s report that weren’t available in previous editions.

The crawl-to-refer ratios may change as AI platforms adjust their search features and referral behavior. OpenAI’s ratios already showed some decline through the year as ChatGPT search usage grew.

For robots.txt management, the data shows most publishers are choosing partial blocks for major search crawlers while fully blocking AI-only crawlers. The year-end state of these directives provides a baseline for tracking how publisher policies evolve in 2026.


Featured Image: Mamun_Sheikh/Shutterstock

20 SEO Experts Offer Their Advice For 2026 via @sejournal, @theshelleywalsh

This year has been a continuation of learning and understanding about how AI impacts our industry. It’s been less about the chaos of the initial disruption and more about “how do we leverage this?”

My belief is that SEO is a practice that needs to be adaptable to the end goal and not fixed to any predetermined notions centered around Google, ranking, or keywords. The foundation of SEO is about making yourself visible online wherever your audience can find you.

The “S” is for “search engine,” but one of my favorite phrases from the year is from Ashley Liddell, who said “search everywhere optimization,” and that is the perfect mindset for the next level of SEO.

It might be TikTok, YouTube, Google, ChatGPT, or Reddit. Most likely, it’s a combination of all of these.

For the technical side of SEO, it’s fundamental that your pages can be accessed by all search engines and machines. For the content side of SEO, you need to be creating content experiences that can be cited by search engines and machines. And everyone should be thinking about the bottom line: Does this align with the defined business outcome for my client/brand/company? Show me the money.

One critical area I wouldn’t overlook is agentic AI and the development of closed systems completing actions for users. Think booking a holiday, personal shopping for a styled wardrobe, or buying your food shop based on a specific diet. When that happens, you need to ensure you are in the game and included in those closed spaces. Start learning about this now.

The AI future is coming fast, so get ready and go with it rather than resisting it. 2026 is the watershed year where you need to get on board to stay in the game.

At Search Engine Journal, we showcase some of the best SEO minds in the industry, and in our usual tradition, we asked 20 of the best practicing SEO experts, including many of our contributors, “In 2026, what should SEOs focus on to maintain visibility and achieve measurable results?”

(Editor’s Note: The following are not in any order of preference and are displayed in the order of who responded first.)

How To Maintain Visibility Online In 2026

1. Be Mentioned In The Right Places

Kevin Indig, Growth Advisor

In 2026, visibility is the result of having the right content, engaging on the right channels, and being mentioned in the right places.

The right content is a mix of hyperlong-tail articles/landing pages tailored to your audience(s) and based on your unique positioning and data stories. People prompt 5x longer than they search on Google, so you want to be the best result for their specific context. LLMs also love fresh, unique datapoints, so you want to create research-driven content.

The right channels are Google, ChatGPT, Reddit, Quora, review sites, LinkedIn, and niche forums. Those are not just the most cited platforms in LLMs but also in Google Search. But being present here takes an engagement strategy rather than an SEO approach.

The right places to be mentioned are authoritative publishers and review sites in your industry. LLMs seem to rely heavily on mentions from other (relevant) sites, so you have to be present in context (surrounding words) that reflect your positioning and market position.

→ Read More: The Alpha Is Not LLM Monitoring

2. We Have To Do More Than Just Appease Google

Cindy Krum, CEO & Founder, MobileMoxie

We have to do more than just appease Google.

Now, to get visibility in all the places where it is needed, having a good website, with high-quality, indexable content, is table stakes; it is the bare minimum, and likely not enough.

For years, Google’s algorithm focused on using content and links to a site to evaluate that particular site, and rank it. AI search utilities and LLMs work very differently. They were designed to find a consensus and synthesize it, and they are looking across all the information that they have access to, to do it.

This means, if you are just relying on your website to create your visibility online, it will not be enough. There is no consensus and minimal synthesis from just one site.

Your branding needs to be widely distributed across the web to create a consistent but discernibly unique brand message.

→ Read More: Google’s AI Search Journeys Are Reshaping SEO With Cindy Krum

3. Optimize For Systems That Read Like Machines

Duane Forrester, Founder and CEO, UnboundAnswers.com

In 2026, SEOs need to treat visibility as something earned through retrieval, not ranking.

Focus on how content is chunked, cited, and most importantly, trusted by AI systems.

Audit what gets surfaced inside chatbots and answer engines, not just in SERPs.

Build authority signals machines can verify: structured data, consistent sourcing, and entity clarity.

Use embeddings, vector search, and retrieval testing to understand how meaning (not keywords) drives exposure.

Replace “optimize for Google” with “optimize for systems that read like machines.” Your goal isn’t a blue link anymore. It’s being the trusted source those systems turn to when humans ask questions. Trust, in 2026, is paramount.

→ Read More: Ex-Microsoft SEO Pioneer On Why AI’s Biggest Threat To SEO Isn’t What You Think

4. Be Retrieved, Cited, And Trusted Wherever Users Search

Carolyn Shelby, Founder, CSHEL Search Strategies

In 2026, SEOs need to refocus on clarity, consistency, and comprehension.

Every channel that describes your brand – your site, feeds, listings, and profiles – must tell the same story, in the same words, in a way both humans *and machines* can understand. That means cleaning up fragmented site structures, removing “hidden” or toggle-buried information, and ensuring the important facts live on the page in visible text. (Note, I did not say Schema doesn’t matter, but I am saying that there are situations where the Schema that is in the JSON-LD is NOT being read, and for those times, it is important that you have valuable product specs and data ON the page, in visible text, and not hidden behind a tab or in a toggle.)

You won’t be penalized or hurt yourself in Google or Bing by *also* optimizing for the lowest-common-denominator crawlers – but you will lose out on that extra visibility if you ignore them. Build pages that are fast (LLMs have a short attention span), crawlable, and semantically clear. Make sure your product, pricing, and positioning statements are consistent across every surface.

The goal isn’t *just* to rank anymore (though ranking is still a necessary first step in most cases). It’s to be retrieved, cited, and trusted wherever users search – whether that’s Google, Bing, or an LLM.

→ Read More: Why AI Search Isn’t Overhyped & What To Focus On Right Now

5. Visibility Will Depend On Agentic Readiness

Andrea Volpini, Co-Founder and CEO, WordLift

In 2026, we are finally designing for the Reasoning Web, where agents will read, decide, and act on our behalf, and SEO becomes the discipline of making these systems effective. Visibility will depend on agentic readiness: clean structured data, stable identifiers, precise ontologies, and knowledge graphs that let agents resolve entities, compare offers, execute tasks, and learn from results.

This is a semantic shift: not simply about being “mentioned” in AI Overviews or ChatGPT, but about exposing products, content, and services as machine-operable assets through feeds, APIs, and tools that make agents smarter every time they interact with us.

The brands that let agents run the show, safely and verifiably, will own the next chapter of search.

→ Read More: How Structured Data Shapes AI Snippets And Extends Your Visibility Quota

6. Search And Product Are Intimately Connected

Ray Grieselhuber, Founder & CEO, DemandSphere

The most important thing, in our view, is understanding that AI search is ubiquitous now across three core experiences: SERPs, LLMs, and agentic experiences.

For the first two, SERPs and LLMs, there is a lot of overlap because they rely on a shared search index (Google in most cases), but the way in which the retrieval process works across these two experiences varies widely. This is why we are hearing that everyone’s No. 1 problem is getting good data, so spend time to make sure your monitoring and data pipelines are accurate and fine-tuned.

For the agentic experience, it’s still early but you should be thinking about how your product strategy will intersect with feeds and APIs (and new, related protocols like MCP). Search and product are intimately connected going forward, and the real ones will know that they always have been.

→ Read More: AI Platform Founder Explains Why We Need To Focus On Human Behavior, Not LLMs

7. Have A Relentless Focus On Being The Best

Barry Adams, Polemic Digital

Whatever you do, don’t lose your mind to the AI hype and try to radically reinvent your SEO efforts. Yes, it will be tougher to grow traffic and revenue from search, but too many SEOs have been coasting along and relying on Google’s own growth to fuel their figures. Now that clicks from Google have stagnated, you’ll need to be smarter about your SEO.

Spend less time and effort on “busywork,” those minor little things that don’t bring any measurable improvement to your traffic. Do the stuff that actually works. Don’t compromise on quality, have a relentless focus on being the best, and make sure you capitalize on your site’s strengths and eradicate its weaknesses.

Sites that are significantly suboptimal, either technically or editorially, will simply not succeed. You have to be all-in on search, without cutting corners and “that will do” concessions. Anything less than that and you will end up on the wrong side of the zero-sum game that Google search has become.

→ Read More: AI Survival Strategies For Publishers

8. Focus On Quality And Conversion Over The Quantity Of Content

Lily Ray, Vice President, SEO Strategy & Research, Amsive

For many years, I’ve answered this question with some version of “focusing on E-E-A-T,” and believe it or not, I think this answer *still* applies in 2026 with the rise of AI search.

Why? Because being mentioned in AI search is all about reputability, experience, and trust. The more your brand is well-known and well-respected in your industry, the more likely LLMs will be to cite you as a trusted and recommended brand. This requires earning mentions and positive reviews in all the places where it matters; having a well-known and well-respected team of individuals who contribute authentic, expert insights into the brand’s content, etc.

As homogenous, AI-generated content floods the internet, users will continue to want to follow real human creators engaged in honest and authentic conversations. Also, focus on the quality and conversion potential of content over the quantity of content, as the latter can cause major SEO headaches over time.

→ Read More: The Role Of E-E-A-T In AI Narratives: Building Brand Authority For Search Success

9. Maintain A Strong Focus On Retrieval Systems And Search Overall

Pedro Dias, Technical SEO/AI Discoverability Consultant, Visively

I believe that, in the current scenario, where a significant number of new (AI) technologies have been introduced between users and the web, and are currently seen through a disruptive lens, it’s more important than ever to maintain objectivity and pragmatism in our approach to organic visibility as a whole, and search in particular. As professionals, we need to understand in depth the changes we’re faced with, both from a technical point of view and (maybe more importantly) from a behavioral point of view.

It’s tempting to cling to old habits and metrics to chase around, instead of assessing if and how we need to rework our strategies and tactics. We’re currently being bombarded with an insane amount of tools claiming to “give you insights into AI answers” and promising that they can give you directional “data” – and in some cases even bold claims of outcomes – but we haven’t even started to understand if any kind of optimization can be performed on AI, or even if inference can be influenced in any controlled and desirable way. So far, everyone is mostly just poking around, guessing, and hoping.

So, that said, in 2026, I believe SEOs should maintain a strong focus on retrieval systems and search overall. Make sure your SEO strategy didn’t get stuck in 2005 and that you’re considering all areas that contribute to consistency in visibility, be it content, branding, technical, etc.

Above all, make sure your share-of-voice strategy is omnichannel and isn’t siloed. All this while keeping your curiosity sharp and your critical thinking aimed at questioning the inconsistencies, while being cautious with a dive-head-first approach.

Watch out for overpromising claims, outdated methodologies sitting on top of baseless assumptions, and vanity metrics.

→ Read More: AI Overviews – How Will They Impact The Industry? We Ask Pedro Dias

10. Remain Focused On What Drives Impact

Montserrat Cano, MC. International SEO & Digital Strategy

In 2026, SEOs and digital marketers need to combine a deep understanding of how AI platforms work with a strong knowledge of their user base across every market.

As search becomes more personalized, AI-driven, and fragmented, visibility may also depend on understanding local search behaviors, expectations, cultural nuances, and how audiences interact with SERP features and LLMs along the purchase path, often in different ways.

The real value comes from embedding this research into ongoing internal processes such as content planning, prioritization, and testing. This ensures teams remain focused on what drives impact, e.g., the queries and content formats that matter, and the AI experiences users actually engage with.

Grounding strategies in first-party data, current market insights, and continuous learning may protect visibility and help build sustainable growth. In 2026, this becomes a core capability for effective SEO and marketing strategy.

→ Read More: Why The Build Process Of Custom GPTs Matters More Than The Technology Itself

11. Review How Content Is Organized, Linked, And Surfaced

Alex Moss, Principal SEO, Yoast

Site speed, UX, and IA are obvious and constant, but structure is something that needs to be audited and improved in the coming months, as we now need to accommodate both agents and humans. Review how content is organized, linked, and surfaced.

Schema is essential; in 2026, it will be utilized more to understand entities and their relationships, which in turn reduces hallucinated responses from agents.

Also concentrate on IA, query grouping, and internal linking. These strategies have existed for some time, but also need to be revisited if you haven’t done so recently.

For brand and offsite, shift from old-hat link acquisition and instead focus on brand sentiment through third-party perspectives, including native digital PR (unlinked brand mentions are welcome).

Finally, take advantage of multi-modal content – invest in imagery, video, and platforms beyond traditional search to increase discoverability.

→ Read More: The Same But Different: Evolving Your Strategy For AI-Driven Discovery

12. Focusing On Evaluating The Revenue Impact Of Your Strategies

Helen Pollitt, Head of SEO, Getty Images

In 2026, SEOs should be focusing on evaluating the revenue impact of their strategies. Too often, SEOs fall into the trap of trying to optimize for traffic or following the newest advice or fancy tactic.

In reality, the most effective SEO strategies are those that are constantly driving towards revenue or other commercial goals. Keeping this premise front and center in your SEO strategies in 2026 will ensure you don’t get sidetracked by the latest SEO fad rather than working on a plan that drives genuine value to your business.

This means setting out your priorities based on their likelihood of success and their revenue-generating potential. This simple calculation can help you identify which projects or activities are worth focusing on in 2026. You will be able to identify if the latest “reverse-meta-optimization-deindexing” fad, or whatever it ends up being, is really worth your budget and resources to pursue.

→ Read More: Ask An SEO: How Can You Distinguish Yourself In This Era Of AI Search Engines?

13. Treat The Website Like An Enterprise System

Bill Hunt, Global Strategist with Bisan Digital

In 2026, SEOs must stop optimizing solely for pages and singular phrases and start optimizing for topical understanding.

AI-driven search systems are no longer ranking documents but evaluating entities, synthesizing answers, and choosing which brands they trust enough to cite. Visibility now depends on three things: clean, authoritative data; deep topical coverage; and systems that make your content easy to retrieve, understand, and reuse. If your site architecture, structured data, and feeds aren’t aligned to these eligibility gates, you’re invisible before the ranking discussion even begins.

The SEOs who will win in 2026 are the ones who treat the website like an enterprise system, not a collection of pages. That means building durable information architecture, improving data reliability, collaborating with product and engineering teams, and creating content designed for synthesis across formats – not just the blue link.

If you’re not strengthening your site’s underlying information integrity and cross-functional alignment, you’re not competing in the new search environment; you’re just publishing.

→ Read More: Industry Pioneer Reveals Why SEO Isn’t Working & What To Refocus On

14. Develop A Distributed Revenue Strategy

John Shehata, CEO & Founder, NewzDash

In 2026, Brand Authority takes the front seat, replacing traffic volume as the primary metric. AI platforms prioritize trusted entities, so you must prove you are one. SEOs need a dual-speed strategy: a short-term strategy that maximizes today’s Google reality, and a long-term plan for a world where traffic and attention are more fragmented.

In the short term, Google is still the primary traffic driver, so optimize for multi-surface and multi-modal visibility. That means targeting AI Overviews, Discover, Top Stories, video, and short-form reels, not just traditional text results.

Convert every visitor into a direct connection through email, apps, and owned communities. At the same time, double down on entity and topic authority, publish useful and unique content that is hard for AI to replicate, such as strong opinion, investigative work, and proprietary data, and strengthen technical SEO, structured data, and answer-ready formatting.

Long-term: Prepare for a post-click reality. Develop a distributed revenue strategy driven by a creator network that monetizes attention directly on social platforms and AI interfaces, accepting that success means revenue generated off-site, not just on your domain.

→ Read More: Google Discover, AI Mode, And What It Means For Publishers: Interview With John Shehata

15. Really Focus On Your Audience

Harry Clarkson-Bennett, SEO Director, The Telegraph

This is very brand and customer-dependent. My best advice is to really focus on your audience. Speak to them. Understand the impact SEO should have vs the impact it currently has. There may still be easy wins on the table. Don’t neglect it.

If you use a last click attribution system, I suspect SEO is over-valued. Work with your analytics team to trial multi-touch attribution and try to figure out the value of each channel. Then work with your PPC, social, and newsletter teams to create a proper marketing and acquisition strategy. Build your owned channels. Improve your blended CPA and solve real business problems.

This is the year you manage up more effectively and stop silo-ing channels and people. Make SEO Great Again.

→ Read More: The Impact AI Is Having On The Marketing Ecosystem

16. Transform Metrics Into Strategic Levers

Motoko Hunt, International SEO Consultant, AJPR

Audit and evolve your measurement framework. Many organizations track extensive data points without translating insights into actionable optimization strategies. The key differentiation lies not in data collection, but in strategic application.

Adapt your metrics architecture for the fragmented SERP landscape. With AI Overviews, featured snippets, and expanding SERP features fragmenting traditional organic visibility, implement granular tracking that isolates performance by SERP element. This segmentation reveals where you’re capturing attention and, more critically, where competitors are intercepting traffic before users reach your listings.

Balance emerging channels with revenue-driving fundamentals. AI search warrants monitoring – track share of voice in AI-generated responses and assess brand mention quality. However, at current adoption rates, AI search primarily serves upper-funnel awareness objectives. Your core optimization efforts should remain anchored to proven conversion pathways: traditional organic search, site experience optimization, and technical excellence that drives qualified traffic and revenue.

Transform metrics into strategic levers. Don’t just report CTR decline from position 3 to 5 – quantify the revenue impact, and identify the ranking factors at play. Connect performance gaps directly to business outcomes, then prioritize initiatives that close those gaps with the highest ROI potential.

→ Read More: Effective SEO Organizational Structure For A Global Company

17. Be Aware Of Falsehoods Which Will Continue To Circulate

Dawn Anderson, International SEO Consultant, Bertey

In 2026, SEOs should accept that we continue to have a steeper-than-ordinary SEO learning curve ahead of us. How AI is going to fully impact our industry over time continues to be largely an educated guessing game.

LLMs and agentic search provide a considerable opportunity, but it is important to not simply presume producing copy and paste AI LLM slop will make the cut for performative SEO in 2026, since this is a degenerative downward quality spiral. Instead, we must prioritize adding more authentic value beyond the norm, standing head and shoulders above competitors, and using AI predominantly for efficiency and ideation kick starting, along with prototype generation and concept testing.

Building increasingly robust reputation and authority through quality and connections should remain firmly a key priority. Particularly as the general consensus of opinion in verticals will continue to build via accumulative LLM extractions, shaping competitive narratives.

We should also be aware of falsehoods, which will continue to circulate in the vacuum of genuine knowledge that these severe industry changes create. Don’t end up going down the wrong paths, which may be very difficult to return from in the short to medium term.

→ Read More: Building Trust In The AI Era: Content Marketing Ethics And Transparency

18. Understand The User And How They Make Decisions

Giulia Panozzo, Founder, Neuroscientive

I believe that our key to achieving measurable results in 2026 is looking beyond the tactics and the new shiny tools: we need to get back to basics and really understand the user, their motivations, their frustrations, and mostly how they make decisions.

When customers decide to engage with a brand, a product, or a service, they do so by leveraging a number of micro-decisions that have very little to do with our marketing tactics and a lot more to do with their expectations and needs, their personal experiences, and the perception they have about us. A lot of those choices are made subconsciously, before they are even aware of them – and as a result, they are not visible by looking at traditional metrics.

So, focus on the bigger picture by working cross-functionally to understand not only how people get to your site, but what underlying needs and expectations they have by leveraging social listening, CX logs, and on-site behavioral metrics that will inform what they need to see and engage with before they even click on your result on the SERP.

→ Read More: The Behavioral Data You Need To Improve Your Users’ Search Journey

19. Find Ways To Differentiate Yourself From The Noise

Alli Berry, SEO Director, Marketwise

Looking into 2026 and beyond, I think SEOs need to be focused heavily on digital PR efforts and getting brand mentions and links from influential sites and people. I think we’re going to hit a point where what others say about your brand is going to have more weight than what you say about your own brand.

We’re already starting to see that with Reddit and forums, and as LLMs gain more traction, that is only becoming a more important factor in gaining visibility.

I’d also be focused on finding unique content angles that can’t be easily replicated by AI. Whether it’s telling customer stories or doing primary research, you’re going to need to find ways to differentiate yourself from the noise.

→ Read More: How To Get Brand Mentions In Generative AI

20. Have Influence Where Your Audience Is

Shelley Walsh, Managing Editor, Search Engine Journal & IMHO

During times of significant flux, go to the fundamentals and hold on: Know where your audience is finding its trusted information and have influence in those spaces.

If you embrace this core maxim, it will guide you through all the changes that Google, discovery engines, LLMs, and whatever comes next can throw at you.

However, don’t overlook the significant changes happening with technology that do influence the channels through which audiences can find us. Also, pay attention to how agentic SEO is developing so that you can consider now how you could apply it to your niche.

Don’t get caught up in pointless arguments over nomenclature or caught up in hype cycles chasing distractions. Keep focusing on what a user wants and applying your brand presence and message where they can see it. Everyone is running around like the sky is falling, but it’s all just SEO.

→ Read More: Google’s Old Search Era Is Over – Here’s What 2026 SEO Will Really Look Like

SEO In 2026

What most of our experts are saying is that what is changing is not so much the how, but the where.

Search is happening everywhere, and you need to ensure your brand narrative is accessible and consistent across all the channels where your audience is.

However, that means being mentioned in the right places, and constantly asking: “Does this move the needle for revenue, or is it just more noise?”

The future of search is being built in real time, so make sure you’re not just watching it happen, but actively shaping how your brand shows up in it.

Featured Image: Paulo Bobita/Search Engine Journal

Why Your AI Agent Keeps ‘Hallucinating’ (Hint: It’s Your Data, Not The AI) via @sejournal, @purnavirji

If it looks like an AI hallucination problem, and sounds like an AI hallucination problem, it’s probably a data hygiene problem.

I’ve sat through dozens of demos this year where marketing leaders show me their shiny new AI agent, ask it a basic question, and watch it confidently spit out information that’s either outdated, conflicting, or flat-out wrong.

The immediate reaction is to blame the AI: “Oh, sorry the AI hallucinated. Let’s try something different.”

But was it really the AI hallucinating?

Don’t shoot the messenger, as the saying goes. While the AI is the messenger bringing you what looks like inaccurate data or hallucination, it’s really sending a deeper message: Your data is a mess.

The AI is simply reflecting that mess back to you at scale.

The Data Crisis Hiding Behind “AI Hallucinations”

An Adverity study found that 45% of marketing data is inaccurate.

Almost half of the data feeding your AI systems, your reporting dashboards, and your strategic decisions is wrong. And we wonder why AI agents give vague answers, contradict themselves, or pull messaging that no one’s used since 2022.

Here’s what I see in nearly every enterprise:

  • Three teams operating with three different definitions of ideal customer profile (ICP).
  • Marketing defines “conversion” one way, sales defines it another.
  • Buyer data scattered across six systems that barely acknowledge each other’s existence.
  • A battlecard last updated in 2019 still floating around, treated like gospel by your AI agent.

When your foundational data argues with itself, AI doesn’t know which version to believe. So it picks one. Sometimes correctly. Often not.

Why Clean Data Matters More Than Smart AI

AI isn’t magic. It reflects whatever you feed it: the good, the bad, and the three-years-outdated.

Everyone wants the “build an agent” sexy moment. The product demo that has everyone applauding. The efficiency gains that guarantee a great review, heck, maybe even a raise.

But the thing that makes AI useful is the boring, unsexy, foundational work of data discipline.

I’ve watched companies spend six figures on AI infrastructure while their product catalog still has duplicate entries from a 2021 migration. I’ve seen sales teams adopt AI coaching tools while their CRM defines “qualified lead” three different ways depending on which region you ask.

The AI works exactly as designed. The problem is what it’s designed to work with.

If your system is messy, AI can’t clean it up (at least, not yet). It amplifies the mess at scale, across every interaction. As much as we would like for it to, even the sexiest AI model in the world won’t save you if your data foundation is broken.

The Real Cost Of Bad Data Hygiene

When your data is inaccurate, inconsistent, or outdated, mistakes are inevitable. These can get risky quickly, especially if they negatively impact customer experience or revenue.

Here’s what that looks like in practice:

Your sales agent gives prospects pricing that changed six months ago because nobody updated the product sheet it’s trained on.

Your content generation tool pulls brand messaging from 2020 because the 2026 messaging framework lives in a deck on someone’s desktop.

Your lead scoring AI uses ICP criteria that marketing and sales never agreed on, so you’re nurturing the wrong prospects while ignoring the right ones.

Your sales enablement agent recommends a case study for a product you discontinued last quarter because nobody archived the old collateral.

This is happening every single week in enterprises that have invested millions in AI transformation. And most teams don’t even realize it until a customer or prospect points it out.

Where To Start: 5 Steps To Fix Your Data Foundation

The good news: You don’t need a massive transformation initiative to fix this. You need discipline and ownership.

1. Audit What Your AI Can Actually See

Before you can fix your data problem, you need to understand its scope.

Pull every document, spreadsheet, presentation, and database your AI systems have access to. Don’t assume. Actually look.

You’ll more than likely find:

  • Conflicting ICP definitions across departments.
  • Outdated pricing from previous years.
  • Messaging from three rebrand cycles ago.
  • Competitive intel that no longer reflects market reality.
  • Case studies for products you no longer sell.

Retire what’s wrong. Update what’s salvageable. Be ruthless about what stays and what goes.

2. Create One Source Of Truth

This is non-negotiable. Pick one system for every definition that matters to your business:

  • ICP criteria.
  • Conversion stage definitions.
  • Territory assignments.
  • Product positioning.
  • Competitive differentiators.

Everyone pulls from it. No exceptions. No “but our team does it differently.”

When marketing and sales use different definitions, your AI can’t arbitrate. It picks one randomly. Sometimes it picks both and contradicts itself across interactions.

One source of truth eliminates that chaos.

3. Set Expiration Dates For Everything

Every asset your AI can access should have a “valid until” date.

Battlecards. Case studies. Competitive intelligence. Messaging frameworks. Product specs.

When it expires, it automatically disappears from AI access. No manual cleanup required. No hoping someone remembers to archive old content.

Stale data is worse than no data. At least with no data, your AI admits it doesn’t know. With stale data, it confidently delivers wrong information.
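
As a sketch of how automatic expiry can work, assuming each asset carries a hypothetical valid_until field in its metadata, a Python filter like the following keeps expired content out of whatever index your AI retrieves from:

  from datetime import date

  # Hypothetical asset metadata; in practice this comes from your CMS or DAM
  assets = [
      {"name": "battlecard-competitor-x", "valid_until": date(2026, 6, 30)},
      {"name": "pricing-sheet-2024", "valid_until": date(2024, 12, 31)},
  ]

  def active_assets(items, today=None):
      """Return only assets whose expiration date hasn't passed."""
      today = today or date.today()
      return [a for a in items if a["valid_until"] >= today]

  # Only unexpired assets are exposed to the AI's retrieval layer
  for asset in active_assets(assets):
      print("AI can access:", asset["name"])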

4. Test What Your AI Actually Knows

Don’t assume your AI is working correctly. Test it.

Ask basic questions:

  • “What’s our ICP?”
  • “How do we define a qualified lead?”
  • “What’s our current pricing for [product]?”
  • “What differentiates us from [competitor]?”

If the answers conflict with what you know is true, you just found your data hygiene problem.

Run these tests monthly. Your business changes. Your data should change with it.
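
One way to make the monthly test repeatable is to script the questions against your own stack. In this Python sketch, ask_ai() is a hypothetical stand-in for your agent’s query API, and the expected answers (also hypothetical here) come from your source of truth:

  # ask_ai() is a hypothetical stand-in; wire it to your agent's actual API
  def ask_ai(question: str) -> str:
      raise NotImplementedError("connect this to your AI agent")

  # Expected answers maintained in your single source of truth
  source_of_truth = {
      "What's our ICP?": "B2B SaaS companies, 200-2,000 employees",
      "How do we define a qualified lead?": "demo requested and budget confirmed",
  }

  def run_audit():
      for question, expected in source_of_truth.items():
          answer = ask_ai(question)
          # Flag drift from the agreed definition for human review
          if expected.lower() not in answer.lower():
              print(f"MISMATCH: {question!r} -> {answer!r}")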

5. Assign Someone To Own It

Data discipline without ownership is a Slack thread that goes nowhere.

One person needs to be explicitly responsible for maintaining your source of truth. Not as an “additional responsibility.” As a core part of their role.

This person:

  • Reviews and approves all updates to the source of truth.
  • Sets and enforces expiration dates for assets.
  • Runs monthly audits of what AI can access.
  • Coordinates with teams to retire outdated content.
  • Reports on data quality metrics.

Without ownership, your data hygiene initiative dies in three months when everyone gets busy with other priorities.

The Bottom Line: Foundation Before Flash

If you don’t fix the mess, AI will scale the mess.

Deploying powerful AI on top of chaotic data is at best inefficient, but at worst, it can actively damage your brand, your customer relationships, and your competitive position.

You can have the most sophisticated AI model in the world. The best prompts. The most expensive infrastructure. None of it matters if you’re feeding it garbage. It takes a disciplined foundation to make it work.

It’s like seeing someone with perfectly white teeth and thinking they just got lucky. What you don’t see is the daily flossing, the regular dental cleanings, the discipline of avoiding sugar and brushing twice a day for years.

Or watching an Olympic athlete make a performance look effortless. You’re not seeing the 5 a.m. training sessions, the strict diet, the thousands of hours of practice that nobody applauds.

The same applies to AI.

To get real value and ROI from AI, start with setting it up for success with the right data foundation. Yes, it might not be the most glamorous or exciting work. But it is what makes the glamorous and exciting possible.

Remember, your AI isn’t hallucinating. It’s telling you exactly what your data looks like.

The question is: Are you ready to fix it?

Featured Image: BestForBest/Shutterstock

Google Updates Search Live With Gemini Model Upgrade via @sejournal, @martinibuster

Google has updated Search Live with Gemini 2.5 Flash Native Audio, upgrading how voice functions inside Search while also extending the model’s use across translation and live voice agents. The update introduces more natural spoken responses in Search Live and reflects Google’s effort to treat voice as a core interface: users get everything they can get from regular search, plus the ability to ask questions about the physical world around them and receive immediate voice translations between two people speaking different languages.

The updated voice capabilities, rolling out this week in the United States, will make Google’s voice responses sound more natural, and responses can even be slowed down for instructional content.

According to Google:

“When you go Live with Search, you can have a back-and-forth voice conversation in AI Mode to get real-time help and quickly find relevant sites across the web. And now, thanks to our latest Gemini model for native audio, the responses on Search Live will be more fluid and expressive than ever before.”

Broader Gemini Native Audio Rollout

This Search upgrade is part of a broader update to Gemini 2.5 Flash Native Audio rolling out across Google’s ecosystem, including Gemini Live (in the Gemini App), Google AI Studio, and Vertex AI. The model processes spoken audio in real time and produces fluid spoken responses, reducing friction in live interactions. Although Google’s announcement didn’t say whether the model is a speech-to-speech model (as opposed to speech-to-text followed by text-to-speech), this update follows Google’s October announcement of Speech-to-Retrieval (S2R), which it described as “a neural network-based machine-learning model trained on large datasets of paired audio queries.”

These changes show Google treating native audio as a core capability across consumer-facing products, making it easier for users to ask and receive information about the physical world around them in a natural manner that wasn’t previously possible.

Improvements For Voice-Based Systems

For developers and enterprises building voice-based systems, Google says the updated model improves reliability in several areas. Gemini 2.5 Flash Native Audio more consistently triggers external functions during conversations, follows complex instructions, and maintains context across multiple turns. These improvements make live voice agents more dependable in real-world workflows, where misinterpreted instructions or broken conversational flow reduce usability.

Smooth Conversational Translation

Beyond Search and voice agents, the update introduces native support for “live speech-to-speech translation.” Gemini translates spoken language in real time, either by continuously translating ambient speech into a target language or by handling conversations between speakers of different languages in both directions. The system preserves vocal characteristics such as speech rhythm and emphasis, supporting translation that sounds smoother and more conversational.

Google highlights several capabilities supporting this translation feature, including broad language coverage, automatic language detection, multilingual input handling, and noise filtering for everyday environments. These features reduce setup friction and allow translation to occur passively during conversation rather than through manual controls. The result is a translation experience that behaves much like an actual person in the middle translating between two people.

Voice Search Realizing Google’s Aspirations

The update reflects Google’s continued iteration of voice search toward an ideal that was originally inspired by the science fiction voice interactions between humans and computers in the popular Star Trek television and movie series.

Read More:

Google Announces A New Era For Voice Search

You can now have more fluid and expressive conversations when you go Live with Search.

Improved Gemini audio models for powerful voice interactions

Gemini Live

5 ways to get real-time help by going Live with Search

Featured Image by Shutterstock/Jackbin

SEO Pulse: December Core Update, Preferred Sources & Social Data via @sejournal, @MattGSouthern

The December 2025 core update is the main story this week.

Google confirmed a new broad ranking update, clarified how often core changes happen, expanded Preferred Sources in Top Stories, and started testing social performance data in Search Console Insights.

Here’s what matters for your work.

Google Releases December 2025 Core Update

Google has released the December 2025 core update, its third core update of the year.

Key Facts

The rollout started on December 11, and Google says it may take up to three weeks to complete. This follows the March and June core updates and comes two days after Google refreshed its core updates documentation to explain smaller, ongoing changes.

Why SEOs Should Pay Attention

If you see big swings in rankings or traffic over the next few weeks, this update is probably the cause.

Core updates are broad changes to how Google evaluates content. Pages can move up or down even if you haven’t changed anything on the site, because Google is reassessing your content against everything else in the index.

The timing matters. Earlier in the week, Google reminded everyone that smaller core updates happen all the time. The December core update sits on top of that layer. You’re dealing with both a visible event and quieter, continuous adjustments running underneath.

Right now, the best move is to watch your data rather than panic. Mark the rollout dates in your reporting. Track when things start to move for your key sections. Compare this behavior with what you saw during the March and June updates. That helps you separate core-update effects from seasonality, technical issues, or campaign changes.

Over the longer term, this is another nudge toward content that shows clear expertise, purpose, and useful detail. The documentation change earlier in the week suggests those improvements can be recognized over time, not only when Google names a new core update.

What SEO Professionals Are Saying

Reactions on X focused on timing, expectations, and the kind of content that might come out ahead.

Some SEO professionals leaned into the holiday angle, joking that Google’s “Christmas update” could either deliver a gift or push sites “off a cliff” right before peak season. Others used the announcement to talk about human-written work, saying they hope this is the update where stronger, human-generated content gets more visibility.

There were also practical reads. A few people tied the update to recent delays in Search Console data, saying the backlog now makes more sense. Others pointed out that this is the third broad update in a year where Google is also investing heavily in AI systems, and that core updates now sit inside a bigger stack of changes rather than defining everything on their own.

Read our full coverage: Google Releases December 2025 Core Update

Google Confirms Smaller Core Updates Happen Continuously

Earlier in the week, Google updated its core updates documentation to spell out that ranking changes can happen between the named core updates.

Key Facts

The documentation now says Google makes smaller core updates on an ongoing basis, alongside the larger core updates it announces a few times a year. Google explained that this change is meant to clarify that sites can see ranking gains after making improvements without waiting for the next big announcement.

Smaller core updates were mentioned in a 2019 blog post, but this is the first time the concept appears directly in the core updates documentation.

Why SEOs Should Pay Attention

This answers a question that has been hanging over SEO for years. Recovery isn’t limited to moments when Google announces a core update. The new wording confirms that Google can reward improvements at any time as smaller updates roll out in the background.

If you’ve been holding back on site fixes or content work until “the next core update,” this is a good time to drop that pattern. You can ship improvements now, knowing there’s more than one window where Google might reassess your content.

The timing is interesting given this year’s release pattern. Until this week, the only named core updates in 2025 were the March and June releases, with several months between them. For sites hit early in the year, those gaps made it hard to know when changes might start to pay off. The December update adds another obvious checkpoint, but the documentation makes it clear that it isn’t the only one.

For reporting and communication, this supports a change from “wait for the next update” to “improve steadily and monitor continuously.” You still don’t need to chase every drop, but you can be more confident that sustained work has more than one chance to show up in the data.

What SEO Professionals Are Saying

Former Google search team member Pedro Dias summed up one common read, saying he thinks Google has finally reached a place where it doesn’t need to announce every core update separately. Others have connected the change to Google’s move toward layered ranking systems, where visible events are only one part of an ongoing stream of tweaks.

For you, that supports a slower, steadier approach. Instead of waiting for one moment to “fix” everything, you can keep tuning content and UX, and treat named core updates as checkpoints rather than the only chance to move.

Read our full coverage: Google Confirms Smaller Core Updates Happen Continuously

Google Expands Preferred Sources In Top Stories

Google is expanding Preferred Sources globally for English-language users, giving people more control over which outlets show up in Top Stories and similar news surfaces.

Key Facts

Preferred Sources lets people pick specific outlets they want to see more often when they browse news in Google Search. The feature is now rolling out to English-language users worldwide, with other supported languages planned for early next year. Google says people have already selected close to 90,000 different sources, from local blogs to large international publishers, and that users who mark a site as preferred tend to click through to it about twice as often.

Why SEOs Should Pay Attention

Preferred Sources gives you a direct way to turn casual readers into regulars inside Google’s own interfaces. If your site publishes timely coverage, you can now build a segment of people who have chosen to see more of your work in Top Stories.

That makes “choose us as a preferred source” another call to action you can test alongside email sign-ups and follow buttons. Some publishers are already creating simple guides that show readers how to add them and what changes once they do. You can take a similar approach, especially if you already have a loyal audience on site or through newsletters.

It’s also a signal that Google wants users to have more say in which outlets they see. For you, that means brand perception, clarity of coverage, and consistency matter a bit more, because people are deciding which sources they want in their feed instead of relying on a default mix.

What SEO Professionals Are Saying

On LinkedIn, several SEO professionals and content strategists pointed out that Preferred Sources mostly reinforces behavior that already exists.

Garrett Sussman noted that people tend to stick with outlets they trust. This feature simply makes that choice more visible and gives publishers another growth lever inside Google’s ecosystem.

If you work on news or frequently updated content, you can start treating Preferred Sources selection as its own metric. Watch how often people choose you, which articles tend to drive that choice, and how those readers behave over time.

Read our full coverage: Google Expands Preferred Sources & Publisher AI Partnerships

Google Tests Social Channel Insights In Search Console

Search Console is testing a feature that shows how your social channels perform in Google Search results.

Key Facts

Google announced a new experimental feature in Search Console that adds social performance data to the Search Console Insights report. It covers social profiles that Google has automatically associated with your site. For each connected profile, you can see clicks, impressions, top queries, trending content, and audience location.

The experiment is limited to a small set of properties, and you can’t manually add profiles. The feature only appears if Search Console detects your channels and prompts you to link them.

Why SEOs Should Pay Attention

Up to now, you’ve probably watched search performance for your site and your social channels in separate tools. This experiment pulls both into one place, which can save time and make it easier to see how people move between your website and your social profiles.

The new data shows which queries lead people to your social profiles, which posts tend to surface in search, and which markets use Google to find you on social platforms. That’s useful if you run campaigns where organic search, social content, and creator work all overlap.

The main limitation is access. If you don’t see a prompt in Search Console Insights asking you to connect detected social channels, your site isn’t in the initial test group. Still, it’s worth logging as a feature to watch, especially if you already spend time explaining how social content shows up for branded and navigational queries.

What SEO Professionals Are Saying

Reactions on LinkedIn focused on two main points. People liked the idea of a single view of website and social performance, and they quickly started asking when similar data might be available for AI Overviews, AI Mode, and other search experiences.

Others raised questions about coverage. Some practitioners want to know whether this data will stay limited to Google-owned properties or expand to platforms like Instagram, LinkedIn, and X. There’s also curiosity about how Google detects and links social profiles in the first place, and whether structured data or Knowledge Graph entities play a role.

Read our full coverage: Google Tests Social Channel Insights In Search Console

Theme Of The Week: Core Updates At Two Speeds

The common thread this week is movement at two speeds.

At one speed, you have the December 2025 core update. It’s a visible event with a clear start date, a multi-week rollout, and a lot of attention. At the other speed, you have the quieter changes around it.

Google has now said directly that smaller core updates happen all the time. Preferred Sources gives users more control over which outlets they see. Social insights start to connect website and social performance in one view.

For you, this means there’s no single moment when everything gets decided. Core updates still matter and can cause sharp movements, but they sit inside an environment where improvements can pay off gradually and where readers are making more explicit choices about who they want to hear from.

The practical response is to treat this as an ongoing feedback loop. Keep improving content and UX. Watch how those changes behave during calm periods and during core updates. Encourage your most engaged readers to mark you as a preferred source where they can. Keep an eye on how search and social interact for your brand. That way, you’re ready for both speeds.


Featured Image: Pixel-Shot/Shutterstock

Google Releases December 2025 Core Update via @sejournal, @MattGSouthern

Google has released the December 2025 core update, the company confirmed through its Search Status Dashboard.

The rollout began at 9:25 a.m. Pacific Time on December 11, 2025.

This marks Google’s third core update of 2025, following the March and June core updates earlier this year.

What’s New

Google lists the update as an “incident affecting ranking” on its status dashboard.

The company states the rollout “may take up to three weeks to complete.”

Core updates are broad changes to Google’s ranking systems designed to improve search results overall. Unlike specific updates targeting spam or particular ranking factors, core updates affect how Google’s systems assess content across the web.

2025 Core Update Timeline

The December update follows two previous core updates this year.

The March 2025 core update rolled out from March 13-27, taking 14 days to complete. Data from SEO tracking providers suggested volatility similar to the December 2024 core update.

The June 2025 core update ran from June 30 to July 17, lasting about 16 days. SEO data providers indicated it was one of the larger core updates in recent memory. Some sites previously hit by the September 2023 Helpful Content Update saw partial recoveries during this rollout.

Documentation Update On Continuous Changes

Two days before this core update, Google updated its core updates documentation with new language about ongoing algorithm changes.

The updated documentation now states:

“However, you don’t necessarily have to wait for a major core update to see the effect of your improvements. We’re continually making updates to our search algorithms, including smaller core updates. These updates are not announced because they aren’t widely noticeable, but they are another way that your content can see a rise in position (if you’ve made improvements).”

Google explained that the addition was meant to clarify that content improvements can lead to ranking changes without waiting for the next announced update.

Why This Matters

If you notice ranking fluctuations over the coming weeks, this update is likely a major factor.

Core updates can shift rankings for pages that weren’t doing anything wrong. Google has consistently stated that pages losing visibility after a core update don’t necessarily have problems to fix. The systems are reassessing content relative to what else is available.

The documentation update is a reminder that rankings can change between major updates as Google rolls out smaller core changes behind the scenes.

Looking Ahead

Google will update the Search Status Dashboard when the rollout is complete.

Monitor your rankings and traffic over the next three weeks. If you see changes, document when they occurred relative to the rollout timeline.

Based on 2025’s previous updates, completion typically takes two to three weeks. Google will confirm completion through the dashboard and its Search Central social accounts.

14 Things Executives And SEOs Need To Focus On In 2026 via @sejournal, @DuaneForrester

So many people spent 2025 arguing about whether SEO was dying. It was never dying. It was shifting into a new layer. Discovery continues to move from search boxes to AI systems. Answers now come from models that rewrite your work, summarize competitors, blend sources, and shape decisions before a browser window loads. In 2026, this shift becomes visible enough that executives and SEOs can no longer treat it like an edge case; the share of discovery each source drives will shift. The search stack that supported the last 20 years is now only one of several layers that shape customer decisions. (I talk about all this in my new book, “The Machine Layer” (non-affiliate link).)

This matters because the companies that win in 2026 will be the ones treating AI systems as new distribution channels. The companies that lose will be the ones waiting for their analytics dashboards to catch up. You no longer optimize for a single front door. You now optimize for many. Each one is powered by models that decide what to show, who to show it to, and how to describe you.

Here are 14 things that will define competitive advantage in 2026. Each one is already visible in real data. Together, they point to a year where discovery becomes more ambient, more conversational, and more dependent on how well a machine can parse and trust you. And at the end of this list is one heck of a prediction that I bet you didn’t see coming for next year! If I’m being honest, I’m sure a few of you did, but to this depth? And did you realize it was all this close?

Grab a coffee or tea, find your favorite spot to read, and let’s get started!

Image Credit: Duane Forrester

1. AI Answer Surfaces Become The New Front Door

ChatGPT, Claude, Gemini, Meta AI, Perplexity, Copilot, and Apple Intelligence now sit between customers and your website. More and more users ask questions inside these systems before they ever search. And the answers they get are inconsistent. BrightEdge’s analysis showed that AI engines disagree with each other 62% of the time. When engines disagree this much, brand visibility becomes unstable. Executives need reporting that reveals how often their brand appears inside these systems. SEOs need workflows that evaluate chunk retrieval, embedding strength, and citation presence across multiple answer engines.

2. Content Must Be Designed For Machine Retrieval

Microsoft’s 2025 Copilot study analyzed more than 200,000 work sessions. The most common AI-assisted tasks were gathering information, explaining information, and rewriting information. These are the core tasks modern content must support. AI models choose content that is structured, predictable, and easy to embed. If your content lacks clear sectioning, consistent patterns, or explicit definitions, it becomes harder for models to use. This impacts whether you appear in answers. In 2026, your formatting choices become ranking signals for machines.
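
To make that concrete, here is a minimal sketch of heading-based chunking, the kind of segmentation a retrieval pipeline typically performs before embedding content. The heading heuristic and word cap are illustrative assumptions, not any platform’s actual logic:

```python
# A minimal sketch of heading-based chunking for machine retrieval.
# Assumption: plain-text input where short title-case lines are headings;
# real pipelines usually key off HTML tags instead.

def chunk_by_headings(text: str, max_words: int = 200) -> list[dict]:
    """Split text into heading-scoped chunks of at most max_words words."""
    chunks: list[dict] = []
    current_heading = "Untitled"
    buffer: list[str] = []

    def flush() -> None:
        if buffer:
            chunks.append({"heading": current_heading, "text": " ".join(buffer)})
            buffer.clear()

    for line in text.splitlines():
        line = line.strip()
        if not line:
            continue
        # Crude heading heuristic: a short line already in title case.
        if len(line.split()) <= 8 and line == line.title():
            flush()
            current_heading = line
        else:
            buffer.extend(line.split())
            if len(buffer) >= max_words:
                flush()  # cap chunk size so each piece embeds cleanly
    flush()
    return chunks

sample = "Care Instructions\nWash cold. Hang dry.\nReturns Policy\nReturns accepted for 30 days."
for chunk in chunk_by_headings(sample):
    print(chunk)
```

The point of the sketch: content with clear sections produces clean, self-contained chunks, while an unstructured wall of text gets split arbitrarily, which is exactly the noise that hurts retrieval.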

3. On-Device LLMs Change How People Search

Apple Intelligence runs many tasks locally. It also rewrites queries in more natural conversational patterns. This pushes search activity away from browsers and deeper into the operating system. People will ask their device short, private questions that never hit the web. They will ask follow-up questions inside the OS. They will make decisions without ever visiting a page. This shifts both volume and structure. SEOs will need content designed for lightweight, edge device retrieval.

4. Wearables Start Steering The Discovery Funnel

Meta Ray-Bans already support visual queries. The user points at something and asks what it is. Voice and camera replace typing. This increases micro queries tied to real-world context. Expect to see more “identify this,” “what does this do,” and “how do I fix that” queries. Wearables compress the distance between stimulus and search. Executives should invest in image quality, product clarity, and structured metadata. SEOs should treat visual search signals as core inputs.

5. Short-Form Video Becomes A Training Input For AI

Video is now a core training signal for modern multimodal models. V-JEPA 2 from Meta AI is trained on raw video and images at an undisclosed scale, which still shows that large-scale video learning is becoming foundational for motion understanding, physical prediction, and video question answering. Gemini 2.5 from Google DeepMind explicitly supports video understanding, allowing the model to interpret video clips, extract visual and audio context, and reason over sequences. OpenAI’s Sora research demonstrates that state-of-the-art generative video models learn from diverse video inputs to understand motion, physical interactions, transitions, and real-world dynamics. In 2026, your short-form video becomes part of your broader signal footprint. Not only the transcript. The visuals, pacing, motion, and structure become vectors the model can interpret. When your video output and written content diverge, the model will default to whichever medium communicates more clearly and consistently.

6. Organic Search Signals Shift Toward Trust And Provenance

Traditional algorithms relied on links, keywords, and click patterns. AI systems shift that weight toward provenance and verification. Perplexity describes its model as retrieval-augmented, pulling from authoritative sources like articles, websites, and journals and surfacing citations to show where information comes from. Independent audits support this direction. A 2023 evaluation of generative search engines found that systems like Perplexity favored content that is factual, well-structured, and supported by external evidence when assembling cited answers. This remains true today as well. SEO industry analysis also shows that pages with clear metadata, consistent topical organization, and visible author identity are more likely to be cited. Naturally, all of this changes what trust looks like. Machines prioritize consistency, clarity, and verifiable sourcing. Executives should focus on data governance and content stability. SEOs should focus on structured citations, author attribution, and semantic coherence across their content ecosystem.
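
For illustration, here is a minimal sketch of what that provenance can look like in practice: a schema.org Article object with explicit author attribution and cited sources, built as a Python dictionary and emitted as JSON-LD. All names and URLs are placeholders, not a prescribed markup recipe:

```python
import json

# Placeholder provenance markup: visible author identity, publication
# dates, and cited sources expressed as schema.org Article JSON-LD.
article_jsonld = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "How Widgets Work",  # hypothetical page title
    "datePublished": "2026-01-15",
    "dateModified": "2026-02-01",
    "author": {
        "@type": "Person",
        "name": "Jane Example",  # placeholder author
        "url": "https://example.com/authors/jane-example",
        "sameAs": ["https://www.linkedin.com/in/jane-example"],
    },
    "publisher": {
        "@type": "Organization",
        "name": "Example Co",
        "url": "https://example.com",
    },
    # Sources the article relies on, made machine-verifiable.
    "citation": ["https://example.org/original-study"],
}

print(f'<script type="application/ld+json">{json.dumps(article_jsonld)}</script>')
```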

7. Real-Time Cohort Creation Replaces Static Personas

LLMs build temporary cohorts by clustering people with similar intent patterns. These clusters can form in seconds and dissolve just as fast. They are not tied to demographics or personas. They are based on what someone is trying to do right now. This is the basis of the experiential cohort concept. Marketers have not caught up yet. In 2026, cohort-based targeting will shift toward intent embeddings and away from persona documents. SEOs should tune content for intent patterns, not identity attributes.
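
Here is a minimal sketch of what intent-based clustering looks like, assuming the open-source sentence-transformers and scikit-learn libraries as stand-ins for the proprietary embedding pipelines platforms actually run. The queries are invented:

```python
# Cluster queries by intent embedding rather than by user demographics.
from sentence_transformers import SentenceTransformer
from sklearn.cluster import KMeans

queries = [
    "cancel my subscription today",
    "how do I stop recurring billing",
    "best running shoes for flat feet",
    "running shoe recommendations overpronation",
]

model = SentenceTransformer("all-MiniLM-L6-v2")
embeddings = model.encode(queries)  # one vector per query

# Two intent cohorts emerge here: cancellation and shoe research.
labels = KMeans(n_clusters=2, n_init=10).fit_predict(embeddings)
for query, label in zip(queries, labels):
    print(label, query)
```

Note what the clusters key on: the cancellation queries group together despite sharing no keywords, which is exactly why intent embeddings outgrow persona documents.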

8. Agent-To-Agent Commerce Becomes Real

Agents will schedule appointments, book travel, reorder supplies, compare providers, and negotiate simple agreements. Your content becomes instructions for another machine. To support that, it must be unambiguous. It must be explicit about requirements, constraints, availability, pricing rules, and exceptions. If you want an agent to pick your business, you need a content model that feeds the agent’s decision tree. Executives should map the top 10 agent-mediated tasks in their industry. SEOs should design content that makes those tasks easy for a machine to interpret.
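
As a sketch of what agent-readable content can look like, here is a hypothetical service specification where availability, pricing rules, constraints, and exceptions are explicit fields rather than prose. The format and field names are invented for illustration; no standard is implied:

```python
# A hypothetical machine-readable service spec. The point is that every
# rule an agent needs is explicit data, not buried in marketing copy.
service_spec = {
    "service": "Same-day plumbing repair",
    "availability": {
        "days": ["Mon", "Tue", "Wed", "Thu", "Fri"],
        "hours": "08:00-18:00",
        "timezone": "America/New_York",
    },
    "pricing": {
        "call_out_fee_usd": 75,
        "hourly_rate_usd": 120,
        "call_out_fee_waived_if_repair_booked": True,
    },
    "constraints": ["Residential properties only", "No gas line work"],
    "exceptions": ["Holiday surcharge applies on federal holidays"],
}
```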

9. Hardware Acceleration Pushes AI Into Every Routine

NVIDIA, Apple, and Qualcomm are all building hardware optimized for on-device and low-latency AI inference. These chips reduce friction, which increases the number of everyday questions people ask without ever opening a browser. NVIDIA’s data center inference platforms show how much compute is moving toward real-time model execution. Qualcomm’s AI Hub highlights how modern phones can run complex models locally, shrinking the gap between thought and action. Apple’s M-series chips include Neural Engines that support local model execution inside Apple Intelligence. Lower friction means people will ask more small, immediate questions as they move through their day instead of grouping everything into one session. SEOs should plan for discovery happening across many short, assistant-driven interactions rather than a single focused search moment.

10. Query Volume Expands As Voice And Camera Take Over

Voice input grows the long tail. Camera input grows contextual queries. The Microsoft Work Trend Index shows rising AI usage across everyday task categories, including personal information gathering. People ask more questions because speaking is easier than typing. The shape of demand widens, which increases ambiguity. SEOs need stronger intent classification workflows and a better understanding of how retrieval models cluster similar questions.

11. Brand Authority Becomes Machine Measurable

Models determine authority by measuring consistency across your content. They look for stable terminology, clear entity relationships, and patterns in how third parties reference you. They look for alignment between what you publish and how the rest of the web describes your work. This is not the old human quality framework. It is a statistical confidence score. Executives should invest in knowledge graphs. SEOs should map their entity network and tune the language around each entity for stability.

12. Zero-Click Environments Become Your Primary Competitor

Answer engines pull from multiple sources and give the user a single synthesized answer. This reduces visits but increases influence. In 2026, the dominant competitors for organic attention are ChatGPT, Perplexity, Gemini, Copilot, Meta AI, and Apple Intelligence. You do not win by resisting zero-click. You win by being the source the engine prefers. Executives must adopt new performance metrics that reflect answer presence. SEOs should run monthly audits of brand visibility across all major platforms, tracking citations, mentions, paraphrases, and omissions.
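
Here is a minimal sketch of such an audit, assuming you have already collected answer text from each platform by whatever access you have. The brand, domain, and answers are invented, and no official citation API is implied:

```python
import re

BRAND = "Acme Analytics"  # hypothetical brand name

# engine -> answer text captured during the audit (illustrative samples)
answers = {
    "ChatGPT": "Top tools include Acme Analytics and RivalTool...",
    "Perplexity": "RivalTool leads the category; see sources [1][2].",
    "Gemini": "Acme Analytics (acme.example) offers dashboards...",
}

def audit(answers: dict[str, str], brand: str) -> dict[str, str]:
    """Classify each engine's answer as citation, mention, or omission."""
    results = {}
    pattern = re.compile(re.escape(brand), re.IGNORECASE)
    for engine, text in answers.items():
        if pattern.search(text):
            # A domain alongside the name suggests a citation, not just a mention.
            cited = "acme.example" in text.lower()
            results[engine] = "citation" if cited else "mention"
        else:
            results[engine] = "omission"
    return results

print(audit(answers, BRAND))
# e.g. {'ChatGPT': 'mention', 'Perplexity': 'omission', 'Gemini': 'citation'}
```

Run month over month, even a crude tally like this turns answer presence into a trackable metric rather than an anecdote.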

13. Competitive Intelligence Shifts Into Prompt Space

Your competitors now live inside AI answers, whether they want to or not. Their content becomes part of the same retrieval memory that models use to answer your queries. In 2026, SEOs will evaluate competitor visibility by studying how platforms describe them. You will ask models to summarize competitors, benchmark capabilities, and compare offerings. The insights you get will shape strategy. This becomes a new research channel that executives can use for positioning and differentiation.

14. Your Website Becomes A Training Corpus

AI systems will digest your content many times before a human does. That means your site is now a data repository. It must be structured, stable, and consistent. Publishing sloppy structure or unaligned phrasing creates noise inside retrieval models. Executives should treat their content like a data pipeline. SEOs should think like information architects. The question shifts from “how do we rank?” to “how do we become the preferred reference source for a model?”

The companies that succeed in 2026 will be the ones that understand this shift early. Visibility now lives in many places at once. Authority is measured by machines, not just people. Trust is earned through structure, clarity, and consistency. The winners will build for a world where discovery is ambient, and answers are synthesized. The losers will cling to dashboards built for a past that is not coming back.

Now, if you’ve read this far, thank you, and I have a surprise – an actual prediction for 2026! I think it’s a big, important one, so buckle up!

I’m calling this Latent Choice Signals, or these, I suppose, as it’s a grouping of signals that paint a picture for the platforms. From the consumer’s POV, this is the essential mental map they’re following: “I saw it, I felt something about it, and I decided not to continue.” This is the core. The user’s mind is making a choice, even if they never articulate it or click anything. That behavior generates meaning. And the system can interpret that meaning at scale. Let’s dig in…

The Prediction No One Sees Coming

By the end of 2026, AI systems will begin optimizing decisions around the patterns users never articulate. Not the queries they type. Not the questions they ask. But the choices they avoid.

This is the shift almost everyone misses, and you can see the edges of it forming across three different fields. When you pull them together, the picture becomes clearer.

First, operating system-level AI is already learning from behavior that is not explicitly expressed. Apple Intelligence is described as a personal intelligence layer that blends generative models with on-device personal context to prioritize messages, summarize notifications, and suggest actions across apps. Apple built this for convenience and privacy, but it created something more important. The system must learn over time which suggestions people accept and which they quietly ignore. It sees which notifications get swiped away, which app actions never get used, and which prompts are abandoned. It does not need to read your mind. It only needs to see which proposed actions never earn a tap. Those patterns are already part of how it ranks what to surface next.

Second, recommender systems already treat non-actions as meaningful signals. You see it every time you skip a YouTube video, swipe past a TikTok in under a second, or close Netflix when the row of suggestions feels wrong. These platforms do not publish their exact mechanics, but implicit feedback is a well-established concept in the research world. Classic work on collaborative filtering for implicit feedback datasets shows how systems use viewing, skipping, and browsing behavior to model preference, even when users never rate anything directly. Newer work continues to refine how clicks, views, and avoidance patterns feed recommendation models at scale. It is reasonable to expect LLM-driven assistants to borrow from the same logic. The pattern is too useful to ignore. When you close an assistant, rephrase a question to avoid a certain brand, or scroll past a suggestion without engaging, that is data about what you did not want.
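
The cited implicit-feedback work formalizes this with a simple idea: the strength of an interaction sets your confidence in a preference, so a skip is weak negative evidence rather than missing data. A minimal numpy sketch, with an illustrative alpha and made-up watch data:

```python
import numpy as np

alpha = 40.0  # illustrative weighting from the implicit-feedback literature

# rows = users, cols = items; values = minutes watched (0 = skipped instantly)
watch_minutes = np.array([
    [2.0, 0.01, 0.0],
    [0.0, 5.0, 0.03],
])

# Binary preference: did the user engage beyond a near-instant skip?
preference = (watch_minutes > 0.1).astype(float)

# Confidence grows with engagement: c = 1 + alpha * r, as in the classic
# collaborative filtering formulation (Hu, Koren & Volinsky).
confidence = 1.0 + alpha * watch_minutes

# A skip is not missing data: preference 0 at confidence ~1 is itself a
# weak negative signal a recommender can learn from.
print(preference)
print(confidence)
```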

Third, alignment research already trains models to follow what humans prefer, not just what text predicts. OpenAI’s “Learning to summarize with human feedback” work shows how models can be tuned using human comparisons between outputs, with a reward model that learns which responses people think are better. This has been in play for years now. This kind of reinforcement learning from human feedback was built for tasks like summarization and style, but the underlying principle matters here. Models can be optimized around patterns of acceptance and rejection. Over time, conversational systems can extend this to live settings, where corrections, rewrites, and abandonments are treated as signals about what the user did not want, even when they never spell that out.
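
The core of that reward modeling is a pairwise comparison loss: the model scores two responses, and training pushes the human-preferred one higher. A minimal sketch with made-up scores:

```python
import numpy as np

def pairwise_loss(score_preferred: float, score_rejected: float) -> float:
    """Negative log-sigmoid of the score gap, the standard comparison loss."""
    gap = score_preferred - score_rejected
    return float(-np.log(1.0 / (1.0 + np.exp(-gap))))

# Made-up reward scores for two candidate responses.
print(pairwise_loss(2.1, 0.3))  # ~0.15: model already prefers the right one
print(pairwise_loss(0.3, 2.1))  # ~1.95: large loss forces an update
```

The same mechanism extends naturally to the avoidance signals described above: an accepted suggestion and an ignored one form exactly the kind of preference pair this loss consumes.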

Put these three domains together, and a larger pattern emerges. As AI systems move into glasses, phones, laptops, cars, and operating systems, they will gain precise visibility into the choices people avoid. These avoidance patterns will become signals that inform how assistants rank options, choose providers, and recommend products.

This will not feel like surveillance. The model is not peeking into your private life. It is watching your interaction patterns with the system itself. It sees where you hesitate, which suggestions you skip, which tasks you hand off, which providers create follow-up questions, which prices cause users to pause, which explanations reduce confidence, and which interfaces break the chain of intent. These are all first-party behavioral signals the assistant is already allowed to use. And the platforms see these signals on a global scale.

In 2026, these Latent Choice Signals will become strong enough that they form a new optimization layer. A silent ranking system built around friction. If your brand generates hesitation, the assistant will reduce your visibility long before your analytics flag a problem. If your content creates confusion during synthesis, it will be bypassed during retrieval. If your policies trigger too many follow-up questions, the model will favor a competitor with clearer flows. The user will never know why. All they will see is the assistant presenting a different option.

This is the layer that will blindside executives. Dashboards will look normal. Rankings may appear stable. Traffic may hold steady. Yet conversions inside AI-mediated decisions will drift. Customers will stop choosing you, not because you lost traditional ranking signals, but because you introduced cognitive friction the model can detect and optimize against.

The winners will be the companies that treat avoidance as a measurable signal. They will analyze which parts of their product and content cause hesitation. They will refine policies to reduce ambiguity. They will simplify offerings. They will align explanations with how models process uncertainty. They will build experiences that reduce agent-level friction and improve confidence inside a retrieval sequence.

By late 2026, negative intent signals may become one of the strongest competitive filters in digital business. Not because users say anything, but because their silence now has structure the model can learn from. Anyone watching today’s data can see this shift forming, but almost no one is naming it. Yet the early indicators are already here, hiding between the interactions users never get far enough to complete.

This is the prediction that will define the next phase of AI-driven discovery. And the companies that understand it early will be the ones the assistants prefer.


This post was originally published on Duane Forrester Decodes.


Featured Image: Collagery/Shutterstock

Well-Known SEO Explains Why AI Agents Are Coming For You & What To Do Now via @sejournal, @theshelleywalsh

I’m carefully watching the development of agentic SEO, as I believe over the next few years, as capabilities improve, agents will have a significant impact on the industry. I’m not suggesting this will be a seamless replacement of talent with a highly capable machine intelligence. There is going to be a lot of trial and error, but I do think we are going to see radical shifts in how the online space operates. Not unlike how automation transformed manufacturing.

Marie Haynes has long been a well-known expert in the industry who shared her learnings on E-E-A-T and Google’s algorithm through her popular Search News You Can Use newsletter.

A few years ago, Marie made the decision to retire her SEO agency and went all in on learning AI systems, as she believes we’re at the beginning of a profound transformation.

Marie wrote a recent article, “Hype or not, should you be investing in AI agents?” about what SEOs need to understand about this rapidly developing space. So, I invited her to IMHO to dive more into this topic.

Marie believes AI will radically change our world for the better, and she believes every business will have AI agents.

You can watch the full interview with Marie on the IMHO recording at the end, or continue reading the article summary.

“The idea that we optimize for appearing as one of the 10 blue links on Google is already gone.”

Experimenting With Gemini Gems

Marie’s practical advice for anyone wanting to understand agents is to start with Gems:

“If you take one thing from this conversation, it’s to try to create some Gemini Gems,” Marie emphasized. “Eventually I’m fairly certain that these gems will morph into agentic workflows.”

To illustrate, she shared a process she called her “originality Gem,” which contains a 500+ word prompt that captures how she evaluates content, along with examples of truly original content in its knowledge base.

“We’re not far from the day where all of my processes that I do for SEO can be handled by agentic workflows that occasionally pull on me for some advice,” Marie said.

The Power Of Chaining Agents

The next progression and real potential come from chaining agents together to create agentic workflows.

The opportunity this creates is that we can use our knowledge and experience to teach AI, like a team of assistants, to do the work that can be automated.

We would then orchestrate the process and, like a conductor, sit and guide the agents to perform the work as we become the human-in-the-loop to review the output.

Once we have downloaded our knowledge to the agents, and the systems work, we can scale ourselves to handle exponentially more clients.

“Instead of me handling just a small handful of clients, all of a sudden I could have a hundred clients and do the same work because it’s all going through my workflow,” Marie said.

The challenge here is the skill in prompting the agents and constructing them to achieve the desired output.

“The future of our industry is not about optimizing for an engine, but about acting as the interface between businesses and technology, and we will be the human experts who teach, guide, and implement AI agents.”

Why Gemini Over ChatGPT

I asked Marie why she focuses on Gemini over ChatGPT, and her response was based on futureproofing: “The main reason why I use Gemini is not to accomplish things today, but to grow my skills in what’s coming tomorrow.”

Marie went on to explain that “Google’s got a whole ecosystem that you can see it coming together like right now,” and she believes that Google will be the winner in the AI race.

“I think that Google is going to win the game. I think it’s always been their game to win. So I make it a point to use Gemini as much as I can.”

Transformations Will Follow The Money

Marie’s prediction for the next few years is for workflows to become embedded. “Sundar Pichai, CEO of Google, said this way back in March, that, in two to four years, every agentic workflow will be deeply embedded into our day-to-day work.”

However, she thinks the real transformations will come when businesses start making money from agentic workflows.

“It’s wild how many trillions of dollars are being spent on developing AI, yet there’s not a whole lot of financial output at this point,” Marie noted, referencing a McKinsey study showing 95% of businesses using AI aren’t making money from it yet [Editor’s note: McKinsey was 80%; MIT said 95%].

“It’s very similar to SEO. There was a day where there were just a small handful of people who figured out how to improve on Google. Once people started making good money from understanding SEO, there was a lot of attention. Tools were created and a whole industry popped up. I think that’s going to happen again. Will it be within the next 12 months? I don’t know. I feel like it might be a little bit longer.”

What SEOs Should Do Now

Overwhelm is a real issue to be aware of, and with developments moving so quickly, there is a huge learning curve to essentially retrain, even for those working on this full-time.

Marie made a commitment when she went all in on AI research. “I made it my full-time job to stay on top of what’s happening, and even I get overwhelmed with all the stuff that’s happening with AI,” she explained.

Marie’s advice is to keep learning, keep trying things, and experiment with writing prompts.

“The next time you go to do a task, try to create an agent that would do this for you,” she suggested. Even if you don’t finish, you’ll learn skills for the next attempt.

Also, persevere instead of stopping at the first failure. “Try to figure out what they can do, instead of just telling everybody, ‘Oh, it can’t do this.’ Find ways you can use it.”

For development teams, she recommends vibe coding with tools like Google’s Antigravity or AI Studio. “You can deploy a whole website without even knowing any HTML,” Marie said.

She also advocates for deep research reports using either Gemini or ChatGPT to analyze how competitors are using AI, providing immediate value to clients while building skills.

The Future Of SEO

Marie referenced Sundar Pichai calling AI technology more profound than fire or electricity in its impact on society. Despite acknowledging her bias after investing significant time in understanding AI, she maintains there’s going to be societal disruption.

“Being able to understand what’s happening in the world and distill it down to what’s important to your clients will be a superpower,” she said. Although, she does admit, there is still a lot of learning and grey areas to move through as we navigate the edge of technology.

“If you’re feeling lost, you’re not alone because imagine right now we’re sort of at the forefront of all of these changes happening.”

For those who do persevere, there will be significant rewards. Eventually, business owners will be clamoring for people who can explain AI and implement it. The professionals who develop these skills now will be extremely valuable in the future.

“The people who know how to use AI, know how to create agents, and know how to make money from AI are going to be extremely valuable in the future.”

Watch the full video interview with Marie Haynes here:

Thank you to Marie Haynes for offering her insights and being my guest on IMHO.


Featured Image: Shelley Walsh/Search Engine Journal