Evidence That Google Detects AI-Generated Content via @sejournal, @martinibuster

A sharp-eyed Australian SEO spotted indirect confirmation of Google’s use of AI detection in search rankings that had been hiding in plain sight for years. Although Google is fairly transparent about its content policies, a Googler’s LinkedIn profile adds a little more detail.

Gagan Ghotra tweeted:

“Important FYI Googler Chris Nelson from Search Quality team his LinkedIn says He manages global team that build ranking solutions as part of Google Search ‘detection and treatment of AI generated content’.”

Googler And AI Content Policy

The Googler, Chris Nelson, works in Google’s Search Ranking department and is listed as a co-author of Google’s guidance on AI-generated content, which makes knowing a little bit about him worthwhile.

The relevant work experience at Google is listed as:

“I manage a large, global team that builds ranking solutions as part of Google Search and direct the following areas:

-Prevent manipulation of ranking signals (e.g., anti-abuse, spam, harm)
-Provide qualitative and quantitative understanding of quality issues (e.g., user interactions, insights)
-Address novel content issues (e.g., detection and treatment of AI-generated content)
-Reward satisfying, helpful content”

There are no search ranking related research papers or patents listed under his name but that’s probably because his educational background is in business administration and economics.

What may be of special interest to publishers and digital marketers are the following two sections:

1. He lists addressing “detection and treatment of AI-generated content”

2. He provides “qualitative and quantitative understanding of quality issues (e.g., user interactions, insights)”

While user interactions and insights might seem unrelated to the detection and treatment of AI-generated content, that work serves the understanding of search quality issues, which is related.

His role is defined as evaluation and analysis of quality issues in Google’s Search Ranking department. “Quantitative understanding” refers to analyzing data and “qualitative understanding” is a more subjective part of his job that may be about insights, understanding the “why” and “how” of observed data.

Co-Author Of Google’s AI-Generated Content Policy

Chris Nelson is listed as a co-author of Google’s guidance on AI-generated content. The guidance doesn’t prohibit the use of AI for published content; it says only that AI shouldn’t be used to create content that violates Google’s spam policies. That may sound contradictory, because AI is virtually synonymous with scaled automated content, which Google has historically considered spam.

The answer lies in the nuance of Google’s policy, which encourages publishers to prioritize user-first content over a search-engine-first approach. In my opinion, putting a strong focus on writing about the most popular search queries in a topic, instead of writing about the topic itself, can lead to search-engine-first content. That was a common approach among sites I’ve audited that contained relatively high-quality content but lost rankings in the 2024 Google updates.

Google (and presumably Chris Nelson’s advice) for those considering AI-generated content is:

“…however content is produced, those seeking success in Google Search should be looking to produce original, high-quality, people-first content demonstrating qualities E-E-A-T.”

Why Doesn’t Google Ban AI-Generated Content Outright?

Google’s documentation that Chris Nelson co-authored states that automation has always been a part of publishing, such as dynamically inserting sports scores, weather forecasts, scaled meta descriptions and date-dependent content and products related to entertainment.

The documentation states:

“…For example, about 10 years ago, there were understandable concerns about a rise in mass-produced yet human-generated content. No one would have thought it reasonable for us to declare a ban on all human-generated content in response. Instead, it made more sense to improve our systems to reward quality content, as we did.

…Automation has long been used to generate helpful content, such as sports scores, weather forecasts, and transcripts. …Automation has long been used in publishing to create useful content. AI can assist with and generate useful content in exciting new ways.”

Why Does Google Detect AI-Generated Content?

The documentation that Nelson co-authored states that Google doesn’t differentiate between how low-quality content is generated, which seemingly contradicts his LinkedIn profile, where “detection and treatment of AI-generated content” is listed as part of his job.

The AI-generated content guidance states:

“Poor quality content isn’t a new challenge for Google Search to deal with. We’ve been tackling poor quality content created both by humans and automation for years. We have existing systems to determine the helpfulness of content. …Our systems continue to be regularly improved.”

How do we reconcile the fact that part of his job is detecting AI-generated content with Google’s policy, which states that it doesn’t matter how low-quality content is generated?

The answer is context. Here’s the context of his work profile:

“Address novel content issues (e.g., detection and treatment of AI-generated content)”

The phrase “novel content issues” means content quality issues that Google hasn’t previously encountered. This refers to new types of AI-generated content, presumably spam, and how to detect and “treat” it. Given that the context is “detection and treatment,” the implied subject may well be low-quality content. That likely wasn’t expressly stated because he probably didn’t expect his LinkedIn profile to be parsed by SEOs for a better understanding of how Google detects and treats AI-generated content (meta!).

Guidance Authored By Chris Nelson Of Google

A list of articles published by Chris Nelson shows that he may have played a role in many of the most important updates of the past five years, from the Helpful Content update and site reputation abuse policy to detecting search-engine-first AI-generated content.

List of Articles Authored By Chris Nelson (LinkedIn Profile)

Updating our site reputation abuse policy

What web creators should know about our March 2024 core update and new spam policies

Google Search’s guidance about AI-generated content

What creators should know about Google’s August 2022 helpful content update

Featured Image by Shutterstock/3rdtimeluckystudio

Google Rejects EU’s Call For Fact-Checking In Search & YouTube via @sejournal, @MattGSouthern

Google has reportedly told the EU it won’t add fact-checking to search results or YouTube videos, nor will it use fact-checks to influence rankings or remove content.

This decision defies new EU rules aimed at tackling disinformation.

Google Says No to EU’s Disinformation Code

In a letter to Renate Nikolay of the European Commission, Google’s global affairs president, Kent Walker, said fact-checking “isn’t appropriate or effective” for Google’s services.

The EU’s updated Disinformation Code, part of the Digital Services Act (DSA), would require platforms to include fact-checks alongside search results and YouTube videos and to bake them into their ranking systems.

Walker argued Google’s current moderation tools—like SynthID watermarking and AI disclosures on YouTube—are already effective.

He pointed to last year’s elections as proof Google can manage misinformation without fact-checking.

Google also confirmed it plans to fully exit all fact-checking commitments in the EU’s voluntary Disinformation Code before it becomes mandatory under the DSA.

Context: Major Elections Ahead

This refusal from Google comes ahead of several key European elections, including:

  • Germany’s Federal Election (Feb. 23)
  • Romania’s Presidential Election (May 4)
  • Poland’s Presidential Election (May 18)
  • Czech Republic’s Parliamentary Elections (Sept.)
  • Norway’s Parliamentary Elections (Sept. 8)

These elections will likely test how well tech platforms handle misinformation without stricter rules.

Tech Giants Backing Away from Fact-Checking

Google’s decision follows a larger trend in the industry.

Last week, Meta announced it would end its fact-checking program on Facebook, Instagram, and Threads and shift to a crowdsourced model like X’s (formerly Twitter) Community Notes.

Elon Musk has drastically reduced moderation efforts on X since buying the platform in 2022.

What It Means

As platforms like Google and Meta move away from active fact-checking, concerns are growing about how misinformation will spread—especially during elections.

While tech companies say transparency tools and user-driven features are enough, critics argue they’re not doing enough to combat disinformation.

Google’s pushback signals a growing divide between regulators and platforms over how to manage harmful content.


Featured Image: Wasan Tita/Shutterstock

.AI Domain Migrated To A More Secure Platform via @sejournal, @martinibuster

The Dot AI domain has migrated to a new domain name registry, giving all registrants of .AI domains stronger security and more stability, with greater protection against outages.

Dot AI Domain

.AI is a country-code top-level domain (ccTLD), which is distinct from a generic top-level domain (gTLD). A ccTLD is a two-letter domain reserved for a specific country; for example, .US is reserved for the United States of America. .AI is reserved for Anguilla, a British Overseas Territory in the Caribbean.

.AI Is Now Handled By Identity Digital

The .AI domain was previously handled by a small local business named DataHaven.net but has now fully migrated to the Identity Digital platform, making .AI domains available through over 90% of registrars worldwide with a 100% availability guarantee. The migration also provides distribution of .AI domain updates in milliseconds and greater resistance to denial-of-service attacks.

According to the announcement:

“Beginning today, .AI is exclusively being served on the Identity Digital platform, and we couldn’t be more thrilled for what this means for Anguilla.

The quick migration brings important enhancements to the .AI TLD like 24/7 global support, and a growing list of features that will benefit registrars, businesses and entrepreneurs today and in the years to come.”

Read the full announcement:

.ai Completes a Historic Migration to the Identity Digital Platform

Featured Image by Shutterstock/garagestock

Google Shopping Rankings: Key Factors For Retailers via @sejournal, @MattGSouthern

A new study analyzing 5,000 Google Shopping keywords sheds light on the factors that correlate with higher rankings.

The research, conducted by Jeff Oxford, Founder of 180 Marketing, reveals trends that could help ecommerce stores improve their visibility in Google’s free Shopping listings.

Amazon Dominates Google Shopping

Amazon ranks in the #1 position for 52% of Google Shopping searches, outpacing Walmart (6%) and Home Depot (3%).

Beyond Amazon’s dominance, the study found a strong correlation between website authority and rankings, with higher-ranking sites often belonging to well-established brands.

Takeaway: Building your brand and earning trust is vital to ranking well on Google Shopping.

Backlinks, Reviews, & Pricing

The study identified several trends that separate higher-ranking pages from the rest:

  • Referring Domains: Product pages in the top two positions had more backlinks than lower-ranking pages. Interestingly, most product pages analyzed (98%) had no backlinks at all.
  • Customer Reviews: Product pages with customer reviews ranked higher, and stores with star ratings below 3.5 struggled to rank well.
  • Pricing: Lower-priced products tended to rank higher, with top-performing listings often featuring prices below the category average.

Takeaway: Building backlinks, collecting customer reviews, and offering competitive pricing can make a difference.

Meta Descriptions A Top Signal

Among on-page factors, meta descriptions had the strongest correlation with rankings.

Pages that included exact-match keywords in their meta descriptions consistently ranked higher.

While keyword usage in title tags and H1 headers showed some correlation, the impact was much smaller.

Takeaway: Optimize meta descriptions and product copy with target keywords to improve rankings.

Structured Data Findings

Structured data showed mixed results in the study.

Product structured data had little to no correlation with rankings, and Amazon, despite dominating the top spots, doesn’t use structured data on its product pages.

However, pages using review structured data performed better.

Takeaway: Focus on collecting customer reviews and using review structured data, which appears more impactful than product structured data.

Shipping & Returns Scores

Google Shopping evaluates stores on shipping, returns, and website quality metrics.

The study found that stores with “Exceptional” or “Great” scores for shipping and returns were more likely to rank higher, especially in the top 10 positions.

Takeaway: Prioritize fast shipping and clear return policies to boost your Google Shopping scores.

What Does This Mean?

According to these findings, success in Google Shopping correlates with strong customer reviews, competitive pricing, and fast service.

Optimizing for traditional SEO—like backlinks and well-written metadata—can benefit both organic search and Shopping rankings.

Retailers should prioritize the customer experience, as Google’s scoring for shipping, returns, and website quality affects visibility.

Lastly, remember that correlation doesn’t equal causation—test changes thoughtfully and focus on delivering value to your customers.

Mullenweg’s Grip On WordPress Challenged In New Court Filing via @sejournal, @martinibuster

A Motion to Intervene has been filed in the WP Engine lawsuit against Automattic and Matt Mullenweg, alleging fifteen claims and seeking monetary awards along with changes to WordPress.org’s governance structure.

A motion to intervene is a legal request by a third party seeking to join an ongoing lawsuit, the success of which hinges on proving a significant interest in the outcome of the case.

Legal Filing Seeks To Take Control Of WordPress

Among the requests made in the legal filing is one that compels Matt Mullenweg to create a WordPress Oversight Board to oversee the governance of the WordPress Foundation, WordPress.org, and other related entities.

“D. Order Defendant Matt Mullenweg to establish a Governance Oversight Board as defined in the Proposed Order For Contempt filed by Michael Willman;”

Moderator Of WPDrama Subreddit

The person filing the court motion is Michael Willman, a WordPress web developer, moderator of the r/WPDrama subreddit, and CEO of Redev, a WordPress development and SEO company. He alleges that Mullenweg banned him, causing him to lose two clients and a significant amount of earnings.

Michael explained what happened in a message to Search Engine Journal:

“Near the start of this dispute, I lost a large ($14,500) contract as a direct result of being banned by Matt along with everyone else loosely associated. We had just closed the contract mere days before and the client is just seeing all these stories, and they back out. Losing that revenue would eventually make us unable to serve our largest client at the time, and we lost them too.

I took this all personally, and I tried to take to his #ranting channel on Slack to respond to his inane blog posts and share how his actions had damaged me and got me to the point of being ready to sue him as well.

He then banned me in retaliation for that and afterwards claimed a message saying I was going to go to Houston to file other legal documents was a “physical threat.”

He has a long history of inconsistent application of the Code of Conduct and I don’t think he can show that his actions here were justified, my own reading of the Code of Conduct implies that some type of warning in private is the first step. “

That last part about the allegedly false claim that he made a physical threat against Matt Mullenweg is now a part of the new motion.

Post On Reddit

Mr. Willman posted about his motion on Reddit, saying that he will donate 5% of any monetary awards to WordPress.

Members of the Reddit WordPress community were supportive, with one member named JonOlds posting:

“A client backing out of a signed contract ($14,500) because you being banned created a significant change is the most clear-cut example of harm from the WPE bans that I’ve seen so far. Fuck MM, and I really hope this is granted.”

Another person wrote:

“Dude you’re my hero ❤

And I’m sorry for all this stuff that’s happened to you, it’s awful. I genuinely admire how well you’ve handled all this, while moderating this sub too.”

Claims For Relief

Section D of the filing lists fifteen claims. Among them, he cites that Mullenweg’s retaliatory actions disrupted existing client contracts and his ability to cultivate new clients. The filing also describes attempted extortion, libel, and trade libel, among many other claims.

Three of the claims made in the motion:

“1. Intentional Interference with Contractual Relations
Defendant Matt Mullenweg’s actions, including banning Michael Willman from the Make.WordPress.org Slack workspace and retaliating against him, disrupted existing contractual relationships. Some specific examples are the $14,500 website development contract that was canceled due to Michael Willman being banned from WordPress.org, the remainder of another contract with Trellis that was lost valued at $5,526.35, and an ongoing relationship with Trellis that included active retainers valued at $4,700 per month in addition to regular ad-hoc work, the combination of which generated $77,638.65 in invoices in 2024.

2. Intentional Interference with Prospective Economic Relations
By targeting and banning Michael Willman from essential WordPress platforms, Defendants interfered with potential business opportunities. The absence of new website development projects, loss of existing relationships and the unease expressed by clients about the WordPress ecosystem are direct results of these retaliatory actions.

4. Attempted Extortion
During discussions, Matt Mullenweg offered to refer clients to Michael Willman’s business on the condition that he cease working with WP Engine and join Automattic’s affiliate program. This constitutes coercive conduct aimed at disrupting Michael Willman’s business relationships.

6. Libel
Matt Mullenweg publicly claimed that Michael Willman made threats of physical violence, a statement that is objectively false and defamatory. This damaged Michael Willman’s reputation within the WordPress community and beyond.

7. Trade Libel
Public statements by Matt Mullenweg disparaged Michael Willman’s professional services and integrity, causing harm to his business relationships and reputation.”

Possible Outcome Of New Court Motion

The motion to intervene contains serious allegations of abuse of authority by the single most influential person in the open-source WordPress project, a worldwide ecosystem of developers, business users, publishers, plugin and theme developers, and thousands of volunteers who contribute to the development of the WordPress content management software.

The filing not only seeks restitution; it also asks the court to change WordPress governance to remove Matt Mullenweg from his position of power at WordPress.

Read The Reddit Post And Legal Document

A link to the legal document is posted on a Reddit discussion about the filing:

Motion to Intervene & Motion for Contempt Filed in WPEngine, Inc. v. Automattic Inc.

Featured Image by Shutterstock/Rose Tamani

Google Retires Web Vitals Extension, Moves Everything to DevTools via @sejournal, @MattGSouthern

Google has officially shut down its Web Vitals Chrome extension with the release of Chrome 132.

All its key features are now fully integrated into DevTools’ Performance panel, making it the go-to tool for measuring Core Web Vitals.

Although nearly 200,000 users had installed the extension, the Chrome team decided to focus solely on DevTools, which offers a more powerful and centralized platform for debugging site performance.

Why the Extension Was Retired

The Web Vitals extension was great for early Core Web Vitals monitoring, but DevTools now offers the same functionality—and more.

By moving everything into DevTools, Google provides developers with a more seamless performance optimization workflow.

What’s New in DevTools?

The Performance panel in DevTools now replicates and expands on the extension’s capabilities:

  • Live Metrics: Real-time Core Web Vitals data for your local tests.
  • Field Data: Compare local metrics to CrUX data for URLs and origins, including desktop and mobile views.
  • Largest Contentful Paint (LCP) Details: Find the specific element behind your LCP score and see phase breakdowns such as Time to First Byte (TTFB) and render delay.
  • Interaction To Next Paint (INP) Interaction Log: Track interactions contributing to INP with detailed timing for input delay, processing, and presentation.
  • Cumulative Layout Shift (CLS) Log: See grouped layout shifts contributing to your CLS score.
  • Diagnostic Metrics: Includes TTFB and First Contentful Paint (FCP).

DevTools provides everything the extension did, plus advanced debugging tools, all in one place.

What Developers Should Do Next

If you’re still using the Web Vitals extension, it’s time to switch to DevTools.

Google has even created a migration guide to make the transition easier.

For those who can’t migrate, Google has shared instructions for maintaining a local copy of the extension.

However, the CrUX API key tied to the extension will soon be revoked, so field data integration may break unless you generate a new key through the CrUX API docs.
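As a rough illustration of what a direct CrUX API call looks like (the endpoint is the one documented in Google's CrUX API docs; the API key and origin below are placeholders you would replace with your own):

```python
import json
import urllib.request

API_KEY = "YOUR_CRUX_API_KEY"  # placeholder: generate your own via the CrUX API docs
ENDPOINT = (
    "https://chromeuxreport.googleapis.com/v1/"
    f"records:queryRecord?key={API_KEY}"
)

def build_query(origin: str, form_factor: str = "PHONE") -> dict:
    """Build the JSON body for a records:queryRecord request."""
    return {"origin": origin, "formFactor": form_factor}

payload = build_query("https://example.com")

# Sending the request requires a valid key:
# req = urllib.request.Request(
#     ENDPOINT,
#     data=json.dumps(payload).encode(),
#     headers={"Content-Type": "application/json"},
# )
# with urllib.request.urlopen(req) as resp:
#     metrics = json.load(resp)["record"]["metrics"]

print(payload)
```

This is a sketch, not Google's code; consult the CrUX API docs for the full request and response schema.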

Looking Ahead

This move signals Google’s commitment to making DevTools the best performance monitoring tool for developers. The Performance panel covers everything from Core Web Vitals to advanced diagnostics, and more updates are coming.

The Web Vitals extension was a helpful tool, but its best features now live in DevTools, making it easier for developers to monitor and optimize site performance from one place.

For more details, check out the official announcement or the GitHub repository.


Featured Image: William Potter/Shutterstock

Google Adds Data Collection Period To PageSpeed Insights (PSI) via @sejournal, @MattGSouthern

Google has updated PageSpeed Insights (PSI) to display the data collection period for Chrome User Experience Report (CrUX) metrics, addressing a common frustration among developers.

Barry Pollard, Web Performance Developer Advocate at Google Chrome, announced the change on X:

“Good post. But this thing bugged us: ‘Unfortunately, PageSpeed Insights does not show the data collection period in their UI.’ You know what—they’re right! We thought we should fix that. So we did. Available now on PSI.”

The comment was in response to a DebugBear blog post explaining how to interpret CrUX data and pointing out PSI’s lack of clarity around the time range covered by its metrics.

What Changed In PSI

CrUX data in PSI is based on the 75th percentile of real user visits over a rolling 28-day period, with a two-day delay.

For example, a test run on January 5 would show data from December 7 to January 3.
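The window arithmetic above can be sketched in a few lines (an illustration of the stated 28-day/two-day-delay rule, not Google code):

```python
from datetime import date, timedelta

def crux_collection_period(test_date: date) -> tuple[date, date]:
    """Return the (start, end) dates of the CrUX collection window:
    a 28-day period ending two days before the test date."""
    end = test_date - timedelta(days=2)    # two-day processing delay
    start = end - timedelta(days=27)       # 28 days, inclusive of both ends
    return start, end

start, end = crux_collection_period(date(2025, 1, 5))
print(start, end)  # 2024-12-07 2025-01-03
```

For a test run on January 5, this reproduces the December 7 to January 3 range described above.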

Previously, PSI didn’t show this date range, forcing developers to dig into Chrome DevTools to find it.

With the update, the data collection period is now displayed directly in the PSI interface, making it easier for developers to understand the context of the metrics.

Why It Matters

CrUX data is critical for measuring real-world user experience and is even used as a ranking factor for Google search results.

Knowing the data’s time frame helps developers track changes and improvements after optimizing their sites.

CrUX Data Across Tools

CrUX data shows up in multiple Google tools, but each handles it a bit differently:

  • PageSpeed Insights (PSI): Reports data for specific URLs or the whole site (origin-level), covering 28 days with a two-day delay.
  • Google Search Console: Groups CrUX data by related pages (page groups) rather than individual URLs, which can sometimes lead to confusion.
  • BigQuery: Offers monthly CrUX data dumps, including extra details like histograms and geographic breakdowns. This origin-level data updates about 10 days after the end of each month.

Looking Ahead

Google’s update to PSI makes CrUX data easier to interpret and more transparent.

This small but impactful change simplifies the analysis of real-world user data for developers working to optimize performance and improve search rankings.


Featured Image: salarko/Shutterstock

39% Of Skills May Be Obsolete By 2030, WEF Jobs Report Warns via @sejournal, @MattGSouthern

A new report shows the most in-demand jobs as AI and automation change industries worldwide.

The World Economic Forum’s (WEF) Future of Jobs report (PDF link) lists the jobs expected to grow the most in the next five years.

Here’s what you need to know.

AI’s Impact On Job Market

The report surveyed over 1,000 global executives, representing over 14 million workers in 55 economies.

Most executives—86%—believe AI and related technologies will significantly change their businesses by 2030.

Key points include:

  • AI & Information Processing: This technology is expected to create about 11 million new jobs while displacing around 9 million, leading to net job growth in AI fields.
  • Robotics and Autonomous Systems: While some jobs may be replaced, many positions will support robotic tasks.
  • Broadening Digital Access: 60% of businesses see this as essential to their operations.

Despite advances in AI, human workers are still crucial. New job opportunities will emerge in big data, cybersecurity, and human-focused roles such as talent management and customer service.

The Fastest-Growing Jobs

According to the report, technology-related roles are expected to grow most by 2030.

Leading the pack are positions like:

  1. Big Data Specialists
  2. FinTech Engineers
  3. AI and Machine Learning Specialists
  4. Software and Applications Developers
  5. Security Management Specialists
  6. Data Warehousing Specialists
  7. Autonomous and Electric Vehicle Specialists
  8. UI and UX Designers
  9. Light Truck or Delivery Services Drivers
  10. Internet of Things Specialists
  11. Data Analysts and Scientists
  12. Environmental Engineers
  13. Information Security Analysts
  14. DevOps Engineer
  15. Renewable Energy Engineers

The demand for tech workers is increasing as businesses adopt AI, information processing technologies, and robotics.

The report notes that “AI and big data are the fastest-growing skills,” followed by networks, cybersecurity, and technology literacy.

Green jobs, like Electric Vehicle Specialists and Environmental Engineers, are also among the fastest-growing roles due to efforts to reduce carbon emissions.

While tech jobs grow the fastest in percentage terms, the largest increase in actual job numbers is expected in traditional frontline roles.

These include:

  1. Farmworkers, Labourers, and Other Agricultural Workers
  2. Light Truck or Delivery Services Drivers
  3. Software and Applications Developers
  4. Building Framers, Finishers, and Related Trades Workers
  5. Shop Salespersons
  6. Food Processing and Related Trades Workers
  7. Car, Van and Motorcycle Drivers
  8. Nursing Professionals
  9. Food and Beverage Serving Workers
  10. General and Operations Managers
  11. Social Work and Counselling Professionals
  12. Project Managers
  13. University and Higher Education Teachers
  14. Secondary Education Teachers
  15. Personal Care Aides

Care economy jobs, such as nursing professionals, social workers, counselors, and personal care aides, are also expected to grow significantly.

The Most In Demand Skills

As job roles transform, so do the skills required to perform them successfully.

The Future of Jobs Report finds that, on average, workers can expect 39% of their core skills to become outdated over the next five years.

However, this “skill instability” has slowed compared to the predictions in previous editions of the report, potentially due to increasing employee reskilling and upskilling rates.

Employers surveyed identified the following as the top skills workers will need in 2025 and beyond:

  • Analytical thinking
  • Resilience, flexibility, and agility
  • Leadership and social influence
  • AI and big data
  • Networks and cybersecurity
  • Technological literacy
  • Creative thinking
  • Curiosity and lifelong learning
  • Environmental stewardship
  • Systems thinking

Manual dexterity, endurance, and precision, along with basic skills such as reading, writing, and math, are expected to be in less demand.

The report notes:

“Manual dexterity, endurance, and precision stand out with notable net declines in skills demand, with 24% of respondents foreseeing a decrease in their importance.”

Preparing The Workforce

The report highlights the need to upskill and reskill workers due to upcoming skill changes. Employers can upskill 29% of their staff and redeploy 19%, but 11% may not receive the necessary training.

The report states:

“If the world’s workforce was made up of 100 people, 59 would need training by 2030.”

To address these challenges, 85% of employers plan to focus on upskilling current workers, 70% will hire new staff with needed skills, and 50% aim to move workers from declining jobs to growing ones.

Saadia Zahidi, the Managing Director at the World Economic Forum, emphasized the need for collective action:

“The disruptions of recent years have underscored the importance of foresight and collective action. We hope this report will inspire an ambitious, multistakeholder agenda—one that equips workers, businesses, governments, educators, and civil society to navigate the complex transitions ahead.”

What Does This Mean?

The rise of AI and data-driven marketing is reshaping SEO roles.

Here’s what matters:

  1. SEO pros need AI basics. Understanding machine learning (ML), natural language processing (NLP), and analytics tools is becoming essential for managing automated systems and content optimization.
  2. While AI helps create content, success needs human insight. Focus on storytelling and brand strategy that connects with users and satisfies search intent.
  3. Better tools mean more data. Winners will be those who can turn metrics into effective campaigns and prove ROI.
  4. Privacy and data protection knowledge sets you apart. Expect more overlap with security teams.
  5. SEO isn’t solo work anymore. Success means working well with devs, AI teams, and product managers.

Bottom line: Blend AI and analytics skills with human creativity and strategy to stay competitive.


Featured Image: Lightspring/Shutterstock

Reuters: Publishers Pivot To Video As AI Disrupts Search Traffic via @sejournal, @MattGSouthern

A new report from the Reuters Institute examines the influence of AI overviews and Google Discover, which have changed how people access information.

Additionally, the report finds publishers relying more on video and social platforms like YouTube and TikTok to reach audiences.

These trends suggest the need to refine strategies and embrace new technologies to remain competitive.

Here are all the need-to-know highlights from the report.

AI Disruption & Zero-Click Search

A major threat to publishers is AI-driven search.

Platforms like Google and OpenAI provide direct answers to user questions, often making it unnecessary for users to click on links. This creates a “zero-click” search environment.

74% of publishers are concerned about losing traffic, prompting many to seek new strategies.

Larger publishers have made licensing deals with AI aggregators like ChatGPT or Perplexity, while smaller ones are still finding ways to gain visibility.

Building audience relationships through newsletters, subscriptions, or apps can help publishers withstand disruption from AI search.

Google Discover Traffic Grows

As social media referral traffic from platforms like Facebook and X continues to decline—67% and 50% drops over the past two years—publishers are increasingly turning to Google Discover.

The Reuters Institute notes that Discover grew by 12% year over year, and many publishers now rely on it as their primary referral source.

Its personalized recommendations have made it a focus for publishers looking to replace lost traffic from other platforms.

For SEOs, technical optimizations like structured data and engaging visuals are key to maximizing Discover’s potential.

However, the feed’s algorithmic nature means results can be unpredictable, requiring constant monitoring.

Video & Social Media

Video platforms like YouTube, TikTok, and Instagram are essential for publishers who want to connect with younger audiences.

The Reuters Institute reports that publishers plan to invest more in these platforms, with YouTube (+52%), TikTok (+48%), and Instagram (+43%) showing the biggest increases in focus.

Short-form videos are effective for engagement, but they have challenges. Making quality videos requires resources, and earning money on platforms like TikTok is hard.

For publishers, this means creating strategies optimized for each platform’s algorithm while driving traffic back to their own websites or apps.

Cross-Team Collaboration

The Reuters Institute stresses the need for cross-team collaboration. As newsrooms adopt more AI tools, teams will need to work together to streamline content creation.

For instance, AI tools like automated headlines and fact-checking can enhance workflows. However, they depend on support from editorial teams, which many publishers find challenging.

Fostering good relationships between different departments will be necessary for continued success.

Broader Context

The Reuters Institute’s findings match those in the NewzDash 2025 News SEO Survey. They both highlight AI disruption, Google Discover, and a lack of resources as major challenges.

Together, these reports show an industry facing rapid change.

The key takeaways for publishers and SEO professionals are: embrace AI-driven search, make the most of Google Discover, and focus on video and social media platforms.


Featured Image: Inside Creative House/Shutterstock