AI Search Engines Often Cite Third-Party Content, Study Finds via @sejournal, @MattGSouthern

A recent analysis by xfunnel.ai examines citation patterns across major AI search engines.

The findings provide new insight into how these tools reference web content in their responses.

Here are the must-know highlights from the report.

Citation Frequency Differs By Platform

Researchers submitted questions across different buyer journey stages and tracked how the AI platforms responded.

The study analyzed 40,000 responses containing 250,000 citations and found differences in citation frequency:

  • Perplexity: 6.61 citations per response
  • Google Gemini: 6.1 citations per response
  • ChatGPT: 2.62 citations per response

ChatGPT was tested in its standard mode, not with explicitly activated search features, which may explain its lower citation count.

Third-Party Content Leads Citation Types

The research categorized citations into four groups:

  • Owned (company domains)
  • Competitor domains
  • Earned (third-party/affiliate sites)
  • UGC (user-generated content)

Across all platforms, earned content represents the largest percentage of citations, with UGC showing increasing representation.

Affiliate sites and independent blogs hold weight in AI-generated responses as well.

Citations Change Throughout Customer Journey

The data shows differences in citation patterns based on query types:

  • During the problem exploration and education stages, there is a higher percentage of citations from third-party editorial content.
  • UGC citations from review sites and forums increase in the comparison stages.
  • In the final research and evaluation phase, citations tend to come directly from brand websites and competitors.

Source Quality Distribution

When examining the quality distribution of cited sources, the data showed:

  • High-quality sources: ~31.5% of citations
  • Upper-mid quality sources: ~15.3% of citations
  • Mid-quality sources: ~26.3% of citations
  • Lower-mid quality sources: ~22.1% of citations
  • Low-quality sources: ~4.8% of citations

This indicates AI search engines prefer higher-quality sources but regularly cite content from middle-tier sources.

Platform-Specific UGC Preferences

Each AI search engine shows preferences for different UGC sources:

  • Perplexity: Favors YouTube and PeerSpot
  • Google Gemini: Frequently cites Medium, Reddit, and YouTube
  • ChatGPT: Often references LinkedIn, G2, and Gartner Peer Reviews

The Third-Party Citation Opportunity

The data exposes a key area that many SEO professionals might be overlooking.

While the industry often focuses on technical changes to owned content for AI search optimization, this research suggests a different approach may be more effective.

Since earned media (content from third parties) is the biggest citation source on AI search platforms, it’s important to focus on:

  • Building relationships with industry publications
  • Creating content that others want to cover
  • Contributing guest articles to trusted websites
  • Developing strategies for the user-generated content (UGC) platforms that each AI engine prefers

This is a return to basics: create valuable content that others will want to reference instead of just modifying existing content for AI.

Why This Matters

As AI search becomes more widely used, understanding these citation patterns can help you stay visible.

The findings show the need to use different content strategies across various platforms.

However, maintaining quality and authority is essential. So don’t neglect SEO fundamentals in pursuit of broader content distribution.

Top Takeaway

Invest in a mix of owned content, third-party coverage, and presence on relevant UGC platforms to increase the likelihood of your content being cited by AI search engines.

The data suggests that earning mentions on trusted third-party sites may be even more valuable than optimizing your domain content.


Featured Image: Tada Images/Shutterstock

73% Of Marketers Use Generative AI, Consumer Acceptance Up via @sejournal, @MattGSouthern

Recent studies by Gartner and Adobe show that generative AI is becoming a key tool in marketing.

Almost three-quarters of marketing teams now use GenAI, and most consumers are comfortable with AI in advertising.

AI Adoption In Marketing

A survey by Gartner of 418 marketing leaders found that 73% of marketing teams use generative AI.

However, 27% of CMOs say their organizations have limited or no use of GenAI in their marketing campaigns.

Correlation With Top Performers

Marketing teams that consistently exceed targets and meet customer acquisition goals are adopting AI faster than competitors.

Greg Carlucci, Senior Director Analyst in the Gartner Marketing Practice, states:

“The most successful marketing organizations are leading the way when it comes to GenAI adoption.”

Most marketers are using GenAI for:

  • Creative development (77%)
  • Strategy work (48%)
  • Campaign evaluation (47% reporting benefits)

Challenges With Generative AI

Despite spending almost half their budgets on campaigns, 87% of CMOs faced performance problems last year, and nearly half had to end underperforming campaigns early.

The Gartner study found:

“On average, 87% of CMOs report they experienced campaign performance issues in the last 12 months, with 45% reporting that they sometimes, often, or always had occasion to terminate campaigns early in the last year due to poor performance.”

CMOs identified several departments as barriers to their success:

  • Finance (31%)
  • Executive leadership (26%)
  • Sales (26%)

Opportunities With Generative AI

Adobe’s research highlights personalization as the primary AI opportunity for marketers.

Heather Freeland, Chief Brand Officer at Adobe, notes:

“Across all industries, there is an insatiable demand for content as customers expect every encounter with a brand to be personalized.”

She adds:

“Just when this challenge seemed insurmountable, the emergence of generative AI is presenting creative and marketing teams with a new way to keep pace with customer demands while also breaking through with their brands.”

The study finds that 97% of marketers believe mass personalization is achievable with AI, but most find it challenging without appropriate tools.

AI Acceptance Among Consumers

Most consumers say that knowing content was created by AI either increases their engagement or doesn't change it at all.

Adobe’s study found:

Three in four consumers surveyed agree that knowing content was AI-produced would either improve or not impact their likelihood of engaging with it.

Consumers are even willing to share their data for a better AI-driven experience.

Adobe’s study finds the top data points consumers are willing to share include:

“… past purchases (56%), products they’ve viewed (52%), their gender (47%), age (41%), and language (35%).”

Generational Differences

Different age groups prefer personalization in different channels.

According to Adobe’s research:

“Gen Z respondents show a higher affinity for personalized content from the consumer electronics industry, particularly music (45%) and video games (43%)…

This contrasts with Baby Boomers, who prefer personalization in retail industry content, specifically from grocery stores (46%).”

The study also found:

“Millennials prefer personalized email campaigns (45%) and website content (40%), while Gen Z values social media personalization (51%).”

Measurable Results

Adobe reports that the implementation of GenAI tools delivered performance improvements.

Its report states:

“… in one of our first generative AI-powered email tests, we used the tool to quickly build and test five versions of an Adobe Photoshop email. It delivered a more than 10% increase in click-through rates, and a subsequent test reported a 57% increase in click rates for an Adobe Illustrator email.”

Additionally:

“Testing scale and speed transformed our approach to content optimization, significantly enhancing our marketing performance and efficiency.”

What This Means

Generative AI is shifting from a novel technology to a standard practice within marketing.

Marketing departments are facing tighter budgets while consumer demand for personalized content grows. Generative AI offers a potential solution to create personalized content at scale.

Further, using AI to personalize marketing messages is unlikely to negatively impact consumer perception of your brand. Some marketers believe it may even improve retention.

Adobe’s research suggests:

“Over one in four (26%) marketer respondents agree that AI-powered personalization will increase consumer brand loyalty.”

If you want to incorporate AI into your advertising strategy but are unsure where to start, data suggests that the best approach is to enhance personalization.


Featured Image: Frame Stock Footage/Shutterstock

The State Of AI Chatbots And SEO via @sejournal, @Kevin_Indig

Last week, I published a meta-analysis of AI Overviews and their impact on SEO.

Today, I publish an analysis of the research on AI chatbots and their potential impact on customer acquisition and purchase decisions.

Image Credit: Lyna ™

I’ve analyzed 14 studies and research papers to answer five key questions:

    1. How valuable is AI chatbot visibility?
    2. How can you grow your AI chatbot visibility?
    3. How are people searching on AI chatbots?
    4. What challenges are associated with AI chatbots?
    5. Where are AI chatbots headed?

This analysis is perfect for you if you:

  • Are unsure about whether to invest in AI chatbot visibility.
  • Want an overview of the state of AI chatbots.
  • Look for ways to optimize for AI chatbots.

I don’t include AI Overviews in this analysis since I’ve covered them in depth in last week’s Memo.

Sources:

Image Credit: Kevin Indig

Get the spreadsheet.

How Valuable Is AI Chatbot Visibility?

While AI chatbot traffic currently represents a tiny percentage of overall traffic, the data shows early evidence for the value of citations and mentions.

AI chatbot adoption is skyrocketing, referral traffic to websites is growing, and traffic quality is high.

Adoption

ChatGPT has over 400 million weekly users as of January 2025.1

Semrush, 12/24: Most ChatGPT users are from the U.S. (25%) or India (12%), followed by Brazil, the UK, and Germany. 70% are male, and over 50% are between 18 and 34 years old.

Higher Visibility, 02/25: 71.5% of consumers use ChatGPT for searching, but as a complement to Google, not a replacement.

Ahrefs, 02/25: 63% of websites receive at least some traffic from AI sources. Only 0.17% of total visits came from AI Chatbots, with top sites achieving up to 6%.

  • 98% of AI traffic comes from three AI chatbots: ChatGPT (> 50%), Perplexity (30.7%), and Gemini (17.6%).
  • Smaller sites get proportionally more visits from AI.

Semrush, 02/25: The generative AI market was valued at $67 billion in 2024 and is expected to grow annually by 24.4% through 2030.

Referral Traffic

Semrush, 12/24: ChatGPT referrals to websites grew by 60% between June and October.

Semrush, 02/25: ChatGPT’s reach has expanded dramatically, sending traffic to over 30,000 unique domains daily in November 2024, up from less than 10,000 in July.

  • Online services, education, and mass media are getting the most referral traffic from ChatGPT after filtering out authentication URLs. Retail, finance, and healthcare show lower volumes.

Growth Memo, 02/25: The quality of AI chatbot traffic is superior in several key metrics:

  • The average session duration is 10.4 minutes for AI chatbot referrals versus 8.1 minutes for Google traffic.
  • Users view more pages: 12.4 pages on average for AI chatbot referrals compared to 11.8 for Google traffic.

Impact On Purchase Decisions

Adobe, 10/24: 25% of Britons use AI while shopping online.

  • AI usage rose 10x between July and September to 10 billion visits to UK retail websites and ~100 million products.
  • Most shoppers are looking for deals:

In an Adobe survey of 5,000 U.S. consumers, 7 in 10 respondents who have used generative AI for shopping believe it enhances their experience. Additionally, 20% of respondents turn to generative AI to find the best deals, followed by quickly finding specific items online (19%) and getting brand recommendations (15%).

Semrush, 02/25: 46% of ChatGPT queries use the Search feature.

The research paper “A comparative study on the effect of ChatGPT recommendation and AI recommender systems on the formation of a consideration set” by Chang et al. looked at 471 consumers to understand:

  • Whether ChatGPT impacts consumer choices.
  • The process that impacts choices.
  • The impact on products with low-brand awareness vs. high-brand awareness.

Results:

  • ChatGPT does influence the consumer purchase journey and products recommended by ChatGPT are more likely to be adopted.
  • Products with low brand awareness see higher trust after a recommendation from ChatGPT.

My Take

  • ChatGPT had 560 million unique worldwide visitors in December 2024, compared to Google’s 6.5 billion. For comparison, that’s still small but about the size of X/Twitter today.
  • ChatGPT sending more referral traffic to a diverse list of domains is probably a strategic move to win the web over and establish itself more as an alternative to Google. I don’t think OpenAI has to do that. I think they strategically chose to.
  • So far, it seems young men in the U.S., BRIC, and Europe are the major users of ChatGPT. If that’s your target audience, optimizing for AI chatbot visibility should be a higher priority.
  • To be crystal clear, I don’t think anybody has to optimize for AI chatbot visibility. I’m confident that most industries will be fine doing classic SEO for years to come. Some will even be fine in a decade. However, you can’t unsee the rapid adoption, which leads us to a situation where two things are true: classic SEO still works, and there is a first-mover advantage on AI chatbots.

How Can You Grow Your AI Chatbot Visibility?

Improving AI chatbot visibility is a mix of known and new levers.

Crawlability

Being visible on AI chatbots starts with being visible to their crawlers. Crystal Carter, Head of SEO Communications at Wix, calls this “retrievability.”

Groomed XML sitemaps, strong internal linking, fast server response, and clean HTML are a good start.

LLM crawlers are less forgiving than Google when it comes to JavaScript and client-side rendering for critical SEO components. Avoid relying on client-side rendering for critical content at all costs!
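
As a concrete starting point, here's a minimal robots.txt sketch that explicitly allows the major AI crawlers. The user-agent tokens below (GPTBot, OAI-SearchBot, PerplexityBot, Google-Extended) are ones the vendors have documented, but treat the list as illustrative and confirm it against each vendor's current documentation.

```
# Hedged example: allow documented AI crawlers alongside your normal rules.
User-agent: GPTBot
Allow: /

User-agent: OAI-SearchBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: Google-Extended
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```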

Brand Strength

Ziff Davis, 11/24: A Ziff Davis study compares Domain Authority in curated (OpenWebText, OpenWebText2) with uncurated public web indices (Common Crawl, C4) to investigate how major AI companies like OpenAI, Google, and Meta trained their large language models. The unsurprising conclusion is that AI developers prefer curated text to train their models, naturally giving commercial publishers more visibility.

Semrush, 12/24: Google tends to show larger domains, ChatGPT smaller ones. The opposite is true for transactional searches: Search GPT prefers larger domains, Google smaller ones.

Seer, 01/25: Backlinks showed no correlation with AI chatbot visibility.

Organic Ranks

Seer, 01/25: Brands ranking on page 1 of Google showed a strong correlation (~0.65) with LLM mentions. Bing rankings also mattered, but a little less (~0.5–0.6).

Semrush, 02/25: The overlap between Google, Perplexity, and ChatGPT search is low (25-35% on average). However, the overlap between ChatGPT search and Bing is much higher (average = 7 domains) than with Google (4 domains).

Go Off-Google

Semrush, 02/25: YouTube is the third largest domain by referral traffic from ChatGPT. Facebook, LinkedIn, and GitHub are in the top 10.

Growth Memo, 02/25: Amazon, eBay, and Walmart dominate in Google Search just as much as in AI chatbots.

My Take

  • There is a big question of how important backlinks are for AI chatbot visibility. I think there is a trap to think they have a direct impact. The way I understand the data is that they help with Google/Bing visibility, which passively translates to AI chatbot visibility. They might also help with LLM crawler discoverability. So, they’re still important but not as much as the content itself.
  • The biggest lever seems to be citable content on and off of Google: Industry reports with exclusive research and data, original surveys and case studies, and thought leadership content from recognized experts.
  • I wouldn’t restrict myself from optimizing for AI chatbot visibility as a small business with little to no visibility on classic search engines.
  • Ecommerce is an outlier because the journey is so much more transactional than for B2B or media. On one hand, the strong visibility of big ecommerce platforms like Amazon provides a direct path for AI chatbot visibility for merchants. On the other hand, integrating with programs like Perplexity’s Buy With Pro seems worth trying out.

How Are People Searching On AI Chatbots?

Consumers use AI chatbots differently than Google unless they turn on search features.

Semrush, 02/25: 70% of ChatGPT queries represent entirely new types of intent that don’t fit traditional search categories (navigational, informational, commercial, transactional).

  • Users are asking longer, more complex questions, with non-search-enabled ChatGPT prompts averaging 23 words compared to 4.2 words when search is enabled.

Higher Visibility, 02/25: People use different AI chatbots for different user intents, e.g., Google for initial product research, ChatGPT for product comparison, and Instagram for discovering new products. However, almost 80% stick to traditional search engines for informational searches.

Growth Memo, 02/25: AI chatbots send significantly more traffic to homepages (22% on average) compared to Google (10%) yet still maintain higher engagement metrics. This trend suggests that AI chatbots are effectively preparing users for brand interactions.

My Take

  • It’s fascinating to see that when people turn on Search in ChatGPT, they use shorter queries and emulate their behavior on Google. I wonder if this behavior sticks over the long term or not. If so, we can assume a stronger carryover from players who dominate classic search engines today to AI chatbots. If not, it might open the field to new players.
  • I’ve long been dissatisfied with our broad classification of user intents (information, navigational, etc.). We had this wrong for a long time. It’s too coarse. 70% of use cases are likely task-related and don’t fit our model for classic search engines. AI chatbots are more than search engines but solve the same problems, just with different means. That’s also where I see Google lagging behind: Consumers already associate AI chatbots with tasks rather than finding information.

What Challenges Are Associated With AI Chatbots?

AI chatbots make for a compelling marketing channel but put marketers in front of tracking and bias problems.

Tracking

We can track the referral source for almost all AI chatbots, but some traffic can still fall into the direct traffic bucket.

Citations in ChatGPT typically include a “utm_source=chatgpt.com” parameter, but links in search results don’t have the parameter.2

Ahrefs, 02/25: AI traffic is likely underreported because traffic from AI chatbots like Copilot gets clustered into direct when it's actually referral traffic.
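
As a rough illustration of how that parameter can be used, here's a hedged Python sketch that flags a hit as AI-referred from either the utm_source value mentioned above or the referrer hostname. The hostname list is my own illustrative assumption, not an official registry.

```python
from urllib.parse import urlparse, parse_qs

# Illustrative set of chatbot referrer hostnames (assumption; extend as needed).
AI_REFERRER_HOSTS = {"chatgpt.com", "chat.openai.com", "perplexity.ai",
                     "gemini.google.com", "copilot.microsoft.com"}

def is_ai_referral(landing_url: str, referrer: str) -> bool:
    """Flag a pageview as AI-driven if the landing URL carries
    utm_source=chatgpt.com or the referrer is a known chatbot domain."""
    utm = parse_qs(urlparse(landing_url).query).get("utm_source", [""])[0]
    if utm == "chatgpt.com":
        return True
    host = (urlparse(referrer).hostname or "").lower()
    return any(host == h or host.endswith("." + h) for h in AI_REFERRER_HOSTS)

print(is_ai_referral("https://example.com/post?utm_source=chatgpt.com", ""))     # True
print(is_ai_referral("https://example.com/post", "https://www.perplexity.ai/"))  # True
```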

Brand Bias

Semrush, 12/24: Consumers are skeptical about AI output; 50% say they trust it more when it's been reviewed by a human.

In the paper “Global is Good, Local is Bad?” Kamruzzaman et al. conducted experiments with fill-in-the-blank questions across four product categories and 15 countries (English only). The researchers studied the effect of:

  • Brand attribute bias: global vs. local brands.
  • Socio-economic bias: luxury vs non-luxury brands.
  • Geo bias: local brands when the domestic country is specified.

Results:

  • LLMs across multiple models (GPT-4o, Llama-3, Gemma-7B, Mistral-7B) consistently associate global brands with positive and local brands with negative attributes.
  • LLMs tend to recommend luxury brands to people from high-income countries. In contrast, non-luxury brands are more commonly suggested for people from low-income countries, even when models were given the flexibility to suggest the same brands for both groups.

The underlying reasons are that local brand names are underrepresented in LLM training data, and large companies can afford larger marketing campaigns and, therefore, create more bias.

In the paper “Generative AI Search Engines as Arbiters of Public Knowledge: An Audit of Bias and Authority” by Li et al., researchers tested how ChatGPT, Bing Chat, and Perplexity answer questions about four major topics: climate change, vaccination, alternative energy, and trust in media. They wanted to see if the AI showed bias in its answers and how it tried to appear trustworthy.

The results:

  • The AI tends to match the emotion of the question. If you ask a negative question, you get a negative answer.
  • Different topics got different emotional treatment, e.g., vaccination and alternative energy got more positive responses than climate change and media trust.
  • Bing Chat and Perplexity heavily cite news media and businesses.
  • Heavy reliance on U.S. sources (65% of sources), even when used in other countries.
  • Too many commercial/business sources, especially for topics like alternative energy.
  • Some models mix unreliable sources with good ones.
  • Answers often include uncertain language and hedging to avoid taking strong positions.

My Take

  • We’re used to significant tracking gaps from Google and Bing, so unless AI chatbots try to persuade site owners with more data, we’ll have to continue to operate with aggregate data, as I mentioned in Death of the Keyword.
  • AI chatbot bias is serious. User trust is key to winning, so I assume AI developers are aware and try to solve the problem. Until then, we have to factor bias in with our optimization strategies and do our best to clearly indicate the target audience for our product in our content.

Conclusion: Where It’s All Going

The data we have today shows that AI chatbots are developing into a significant customer acquisition channel with many familiar mechanics.

However, their task-based nature, bias, and demographics suggest we should be cautious when using the same approach as classic search engines.

Don’t forget – Search is just a means to an end. Ultimately, people search to solve problems, i.e., do tasks.

The fact that AI chatbots can skip the search part and do tasks on the spot means they’re superior to classic search engines. For this reason, I expect Google to add more agentic capabilities to AI Overviews or launch a new Gemini-based product in Search.

The underlying technology allows AI chatbots to fork off search engine ranks and develop their own signals. And it evolves rapidly.

The evolution so far went from machine learning in the pre-2022 era to early LLMs and now inference models (think: reasoning).

Better reasoning allows LLMs to recognize user intent even better than classic search engines, making it easier to train models on better sources to mention or cite.

This brings me to the question of whether Google/Bing incumbents will also dominate AI chatbots down the road. Right now, the answer is yes. But for how long?

Generational preferences could be the biggest driver of new platforms. The easiest way for Google to become irrelevant is to lose young people.

  • Semrush, 02/25: Searchers over 35 use Google more often than ChatGPT. People between 18 and 24 use ChatGPT 46.7% of the time, compared to Google at 24.7%.
  • Higher Visibility, 02/25: 82% of Gen Z occasionally use AI chatbots, compared to 42% of Baby Boomers.

There is a chance that multimodality will quickly play a more prominent role in AI chatbot adoption. So far, text interfaces dominate.

But Google already reports 10 billion monthly searches with Google Lens, and Meta's Ray-Ban smart glasses are very successful. Unlike Google Search, the LLM answer format is easy to transport to other devices and modalities, which could transform AI.3


1 ChatGPT now has 400 million weekly users — and a lot of competition

2 Deep Dive: Tracking How ChatGPT + Search & Others Send Users To Your Site

3 Google Lens Reaches 10 Billion Monthly Searches


Featured Image: Paulo Bobita/Search Engine Journal

Data Shows Google AI Overviews Changing Faster Than Organic Search via @sejournal, @martinibuster

New research on AI Overviews and organic search results presents a fresh view of how AIO is evolving and suggests how to approach it for SEO.

Among the findings:

  • The AI Overviews they tracked showed more volatility than the organic search results, changing at a faster rate.
  • AIO volatility doesn't correlate with organic search volatility.
  • They conclude that AIO is replacing featured snippets or “enhancing search results.”
  • For the purposes of SEO, AIO should be considered separate from organic search.
  • The generative text changed for every query they examined.

That last finding was really interesting and here is what they said about that:

“As far as I can tell, the generative text changed for every single query. However, our measure was looking for meaningful changes in the generative text which might reflect that Google had shifted the intent of the original query slightly to return different generative ranking pages.”

Another interesting insight was a caveat about search volatility: it shouldn't be taken as a sign of a Google update, because it could be the influence of current events temporarily changing the meaning of a search query, which is related to Google's freshness algorithm. I don't know who the Authoritas people are, but hats off to them; that's a reasonable take on search volatility.
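
To make “meaningful change” a bit more concrete, here's a simple Python sketch of one way to score how much an AIO answer shifted between two snapshots. This word-overlap proxy is my own illustration, not the measure Authoritas actually used.

```python
def aio_change_score(snapshot_a: str, snapshot_b: str) -> float:
    """Return 0.0 for identical word sets and 1.0 for completely different ones."""
    words_a = set(snapshot_a.lower().split())
    words_b = set(snapshot_b.lower().split())
    if not words_a and not words_b:
        return 0.0
    jaccard = len(words_a & words_b) / len(words_a | words_b)
    return 1.0 - jaccard

# Scores above some threshold (say 0.3) could be treated as a meaningful rewrite
# rather than cosmetic rewording.
print(aio_change_score("trench coats and bomber jackets",
                       "trench coats cargo pants and camouflage"))
```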

You can read the AIO research report here; it's very long, so set aside at least 20 minutes to read it.

AIO Independence From Organic SERPs

That research published by Authoritas got me thinking about AIO, particularly the part about the AIO independence from the search results.

My thought on that finding is that there may be two reasons why AIO and organic SERPs are somewhat decoupled:

  1. AIO is tuned to summarize answers to complex queries with data from multiple websites, stitching them together from disparate sources to create a precise long-form answer.
  2. Organic search results offer answers that are topically relevant but not precise, not in the same way that AIO is precise.

Those are important distinctions. They explain why organic and AIO search results change independently. They are on independent parallel paths.

Those insights are helpful for making sense of how AIO fits into overall marketing and SEO strategies. Wrap your head around the insight that AIO and Organic Search do different and complementary things and AIO will seem less scary and become easier to focus on.

A complex query is something AIO can answer better than the regular organic search results. An example of a complex question is asking “how” a general concept like men's fashion is influenced by an unrelated factor like military clothing. Organic search falls short because Google's organic ranking algorithm generally identifies a topically relevant answer, while this kind of question demands a precise answer that may not exist on any single website.

What Is A Complex Query?

If complex queries trigger AI Overviews, where is the line? It's hard to say, because the line is moving. Google's AIO are constantly changing. A short TL;DR answer could arguably be that adding a word like “what” or “how” can make a query trigger an AIO.

Example Of A Complex Query

Here’s the query:

“How is men’s fashion influenced by military style?”

Here's the AIO answer, a summary based on information combined from multiple websites:

“Men’s fashion is significantly influenced by military style through the adoption of practical and functional design elements like sharp lines, structured silhouettes, specific garments like trench coats, bomber jackets, cargo pants, and camouflage patterns, which originated in military uniforms and were later incorporated into civilian clothing, often with a more stylish aesthetic; this trend is largely attributed to returning veterans wearing their military attire in civilian life after wars, contributing to a more casual clothing culture.”

Here are the completely different websites and topics that AIO pulled that answer from:

Screenshot Of AIO Citations

The organic search results are relevant to the topic (topically relevant) but don't necessarily answer the question.

Information Gain Example

An interesting feature of AI Overviews is explained in a Google patent on information gain. The patent is explicitly in the context of AI assistants and AI search. It's about anticipating the need for additional information beyond the answer to a question. So, in the example of “how is men's fashion influenced by military style,” there is a feature to show more information.

Screenshot Showing Information Gain Feature

The information gain section contains follow-up topics about:

  • Clean lines and structured fit
  • Functional design
  • Iconic examples of military clothing
  • Camouflage patterns
  • Post-war impact (how wars influenced what men wore after they returned home)

How To SEO For AIO?

I think it’s somewhat pointless to try to rank for information gain because what’s a main keyword and what’s a follow up question? They’re going to switch back and forth. Like, someone may query Google about the influence of camouflage patterns and one of the information gain follow-up questions may be about the influence of military clothing on camouflage.

The better way to think about AIO, which was suggested by the Authoritas study, is to treat AIO as a search feature (which is what it literally is) and optimize for it the same way one optimizes for featured snippets, which in a nutshell means creating content that is concise and precise.

Featured Image by Shutterstock/Sozina Kseniia

ChatGPT Referral Traffic To Publishers Remains Minimal via @sejournal, @MattGSouthern

ChatGPT referrals to publishers increased with the introduction of web search, but remain a minor share of overall traffic.

  • ChatGPT referrals to publishers grew eightfold but remain under 0.1% of total traffic.
  • The New York Post, The Guardian, and Forbes saw the most ChatGPT-driven visits.
  • Traditional search engines still drive the majority of publisher traffic.

Hostinger Horizons Enables Anyone To Build Web Apps With AI via @sejournal, @martinibuster

Hostinger announced a new service called Hostinger Horizons that allows anyone to build interactive online apps (like an AI-based website builder) without having to code or hire programmers. The new service allows users to turn their ideas into web applications by prompting an AI to create it.

AI Democratizes Entrepreneurship

In the early days of the Internet, it seemed like only people with backgrounds from Stanford University and Harvard Business School had access to the resources and connections necessary to turn ideas into functioning web apps. Over time, platforms like WordPress lowered the barrier to entry for starting and running online businesses, enabling virtually anyone to compete toe to toe with bigger brands.

But there was still one last barrier: the ability to create web apps, the functionalities that power the biggest ideas on the Internet. Hostinger Horizons lowers that barrier, enabling anyone to turn their idea into a working app and putting entrepreneurial success within reach of anyone with a good idea. The significance of this cannot be overstated.

AI Powered Web App Builder

Hostinger Horizons is an AI-powered no-code platform created specifically for individuals and small businesses that enables them to create and publish interactive web applications without having to use third-party integrations or requiring programming knowledge.

The new platform works through an AI chat interface that creates what users are asking for while also showing a preview of the web app. A user basically prompts what they want, makes feature requests, tells it what to change, and previews the results in real time.

Hostinger Horizons speeds up the time it takes to create and deploy a functioning web app. Hosting and all other necessary services are integrated, which simplifies creating web apps because there's no need for third-party services and APIs. Once an app is created and online, a user can still return to it and edit and improve it in minutes. It promises to be a solution for fast prototyping without the technical and investment barriers typically associated with taking a good idea from concept to deployment on the web.

The Hostinger announcement noted that simple web apps take only minutes to create:

“Early access trials show that simple web apps, such as a personal calorie tracker, a language-learning card game, or a time management tool, can be built and published in minutes.”​

How Hostinger Horizons Works

The new service combines AI-powered chat with real-time previews and the ability to instantly publish the app to the web.

Hostinger provides all the necessary elements to get the work done:

  • Domain name registration
  • Email services
  • Multilingual support (80+ languages)
  • Image uploads
  • User-provided sketches and screenshots
  • Voice prompting
  • Web hosting

Giedrius Zakaitis, Hostinger Chief Product and Technology Officer, offered these insights:

“Web apps have turned ideas into million-dollar startups, but building one always required coding or hiring a developer. We believe it is time to change the game. Just like Hostinger AI Website Builder introduced a new kind of site-building experience, Hostinger Horizons will democratize web apps so that anyone can bring their unique and exciting ideas online…”


Hostinger Horizons promises to dramatically simplify the process of turning an idea into a working business by bundling hosting, domain registration, and email services into one solution.

Four reasons that make this a breakthrough service:

  1. Rapid Prototyping: Create, modify, and deploy interactive apps in real-time, including rapid revisions after the app is published.
  2. Integrated Services: Hosting and other essential tools are built in, eliminating reliance on third-party providers.
  3. Democratized Development: Hostinger Horizons enables anyone to turn their ideas into an online business without technical barriers.
  4. Multilingual Support: Works in more than 80 languages.

Creating Complex Websites With AI

What can you do with Hostinger Horizons? It seems like the right question to ask is what can't you do with it. I asked Hostinger whether the following applications of the technology were possible, and they affirmed that the short answer is yes; some of the ideas I suggested may not be 100% straightforward to implement, but they are indeed possible to create.

Money makes the web run, and I think the applications many would be interested in are ways to interactively engage users: enabling them to accomplish goals, capturing leads, comparing products, improving shopping experiences, and sending follow-up emails.

Since Hostinger Horizons handles hosting, domain registration, and email in a single platform, entrepreneurs and businesses can build these kinds of web pages by describing them to the AI chat interface, iteratively improving them, and then publishing the finished project when it's ready.

This could be useful to a restaurant, a law office, or a product review site, for example. Here are examples of the kinds of things I’d like to see it do.

Restaurant:

  • Reservation & Loyalty App
    Allows users to sign up, reserve tables, and receive follow-up reminders and offers.
  • Interactive Menu Explorer
    Can enable users to browse a menu according to dietary preferences and capture contact information for special offers.

Legal Office

Could be used to generate questionnaires and streamline the intake.

Product Reviews

  • Can encourage users to provide their requirements and preferences and then generate a summary of product reviews with quick links to where to purchase them.
  • Interactive Comparison Tools with links to where to purchase

Read more:

Prompt, refine, go live: We are set to disrupt the web app market with a fully integrated no-code solution — Hostinger Horizons

DeepSeek And Its Impact On The Generative AI Global Race via @sejournal, @AlliBerry3

Since launching to the public on Jan. 20, 2025, Chinese startup DeepSeek’s open-source AI-powered chatbot has taken the tech world by storm.

DeepSeek has been the top free app by downloads in the U.S. Apple App Store since Jan. 26 – with 16 million app downloads in its first 18 days (ChatGPT had 9 million in the same timeframe) – and its performance and accompanying search feature are at least on par with OpenAI's ChatGPT for a fraction of the cost.

Its launch sent U.S.-based AI technology company Nvidia to the greatest drop in market value for a U.S. company in stock market history. That's quite an entrance!

U.S. tech analysts and investors all seem to fear that the U.S. is falling behind in the generative AI global race.

This may be warranted considering how quickly and cost-effectively DeepSeek was able to get R1 developed and out the door.

DeepSeek utilizes reinforcement learning, meaning the model learns complex reasoning behaviors through reinforcement rather than supervised fine-tuning, which allows it to save significant computational resources.

But, is DeepSeek really going to emerge as the leader in AI? And what are the implications for this development for the future of search? Let’s dive in.

What Has Happened Since DeepSeek Launched?

While U.S. tech companies were humbled by the speed and claimed cost efficiency of this launch, DeepSeek’s arrival has not been without controversy.

A lot of questions lurk, ranging from suspected intellectual property violations to security, data privacy, Chinese censorship, and the true cost of its technology.

Legal Issues For Copyright And Data Protection

OpenAI and Microsoft are investigating whether DeepSeek used OpenAI’s API to integrate their AI models into DeepSeek’s own models.

OpenAI claims it has evidence that DeepSeek distilled the outputs of OpenAI's models to build a rival model, which is against OpenAI's terms of service but likely not against the law.

Distillation allows for the transfer of knowledge of a large pre-trained model into a smaller model, which enables the smaller model to achieve comparable performance to the large one while reducing costs.
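
For readers unfamiliar with the mechanics, here's a minimal PyTorch sketch of a standard distillation objective. It illustrates the general technique only; it isn't anything DeepSeek has published about its own pipeline.

```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature: float = 2.0):
    """KL divergence between temperature-softened teacher and student outputs.
    Minimizing this pushes the small (student) model to mimic the large (teacher) one."""
    student_log_probs = F.log_softmax(student_logits / temperature, dim=-1)
    teacher_probs = F.softmax(teacher_logits / temperature, dim=-1)
    # The T^2 factor keeps gradient magnitudes comparable across temperatures.
    return F.kl_div(student_log_probs, teacher_probs,
                    reduction="batchmean") * (temperature ** 2)
```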

This is more than a little ironic given the lawsuits against OpenAI for ignoring other sites' terms of service and using their copyrighted internet data to train its systems.

There are also questions about where user data is stored and how it is processed, given that DeepSeek is a Chinese-based startup.

For anyone handling customer information and payment details, integrating a tool like DeepSeek that stores data in a foreign jurisdiction could violate data protection laws and expose sensitive information to unauthorized access.

Given that DeepSeek has yet to provide its privacy policies, industry experts and security researchers advise using extreme caution with sensitive information in DeepSeek.

DeepSeek Security Breach

Wiz Research, a company specializing in cloud security, announced it was able to hack DeepSeek and expose security risks with relative ease on Jan. 29.

It found a publicly accessible database belonging to DeepSeek, which allowed it full control over database operations and access to user data and API keys.

Wiz alerted the DeepSeek team, and they took immediate action to secure the data. However, it is unclear who else accessed or downloaded the data before it was secured.

While it’s not uncommon for startups to move fast and make mistakes, this is a particularly large mistake and shows DeepSeek’s lack of focus on cybersecurity so far.

National Security Concerns Similar To TikTok

There are national security concerns about DeepSeek’s data collection policies reminiscent of fears about TikTok, which saw a similar rise in global prominence out of Chinese-based company ByteDance.

The U.S. government briefly banned TikTok in January 2025 out of concerns about how the company was collecting data about users. There were also fears that the Chinese government could use the platform to influence the public in the U.S.

A few incidents over the last several years fueled that fear, including TikTok employees using location data from the app to track reporters in an effort to find the source of leaked information, and reports that TikTok employees planned to track specific U.S. citizens.

While TikTok is active in the U.S. right now, its future remains uncertain.

For similar reasons to the TikTok concerns, a number of governments around the world, including Australia and Italy, are already working to ban DeepSeek from government systems and devices. The U.S. is also considering a ban on DeepSeek.

Chinese Censorship

Regardless of whether you run DeepSeek locally or in its app, DeepSeek’s censorship is present for queries deemed sensitive by the Chinese government, according to a Wired investigation.

However, because it is open source, there are ways of getting around the censorship, but it’s difficult.

Doing so would require running on your own servers using modified versions of the publicly available DeepSeek code, which means you’d need access to several highly advanced GPUs to run the most powerful version of R1.

Questions About Cost

Much has been written about the cost of building DeepSeek. Initial claims by DeepSeek were that it took under $6 million to build based on the rental price of Nvidia’s GPUs.

However, a report from SemiAnalysis, a semiconductor research and consulting firm, has since argued that DeepSeek’s hardware spend was higher than $500 million, along with additional R&D costs.

For context, OpenAI lost about $5 billion in 2024 and anticipates it will lose more than $11 billion in 2025. Even if DeepSeek did cost $500 million or more, it still cut costs compared to what leading competitors are spending.

So, how did they cut costs?

Before DeepSeek came along, the leading AI technologies were built on neural networks, which are mathematical systems that learn skills by analyzing huge amounts of data. This requires large amounts of computing power.

Specialized computer chips called graphics processing units (GPUs) are an effective way to do this kind of data analysis. This is how chipmaker Nvidia grew to prominence (and also had a huge fall in market value on the day DeepSeek launched).

GPUs cost around $40,000 and require considerable electricity, which is why leading AI technologies like OpenAI’s ChatGPT were so expensive to build.

Sending data between chips can also require more energy than running the chips themselves.

DeepSeek was able to reduce costs, most notably by using a method called “mixture of experts.”

Instead of creating one neural network that learned data patterns on the internet, they split the system into many neural networks and launched smaller “expert” systems paired with a “generalist” system, reducing the amount of data needed to travel between GPU chips.
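
Here's a toy PyTorch sketch of that routing idea: a small gate picks one “expert” per token, so only a fraction of the parameters run for any given input. It's a minimal illustration of top-1 routing, not DeepSeek's actual architecture.

```python
import torch
import torch.nn as nn

class TinyMoELayer(nn.Module):
    """Toy mixture-of-experts layer: a gate routes each token to its top-1 expert."""
    def __init__(self, dim: int = 64, num_experts: int = 4):
        super().__init__()
        self.gate = nn.Linear(dim, num_experts)  # the "generalist" router
        self.experts = nn.ModuleList(
            [nn.Linear(dim, dim) for _ in range(num_experts)]  # the "expert" subnetworks
        )

    def forward(self, x):                       # x: (tokens, dim)
        scores = self.gate(x).softmax(dim=-1)   # routing probabilities
        best = scores.argmax(dim=-1)            # chosen expert per token
        out = torch.zeros_like(x)
        for i, expert in enumerate(self.experts):
            mask = best == i
            if mask.any():
                # Run only the routed tokens through this expert, weighted by the gate.
                out[mask] = expert(x[mask]) * scores[mask, i].unsqueeze(-1)
        return out

# Usage: y = TinyMoELayer()(torch.randn(8, 64))  # each token activates one expert
```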

The Implications Of Being Open Source

DeepSeek-R1 is as “open-source” as any LLM has been thus far, which means anyone can download, use, or modify its code.

Similar to Meta’s Llama, the code and technical explanations are shared, enabling developers and organizations to utilize the model for their own business needs, but the training data is not fully disclosed.

Many believe DeepSeek is a big step toward democratizing AI, allowing smaller companies and developers to build on DeepSeek-R1 and achieve greater AI feats faster.

This could lead to more innovation in places with more limited access to the tech needed to build AI solutions.

But critics fear that open-source models can expose security vulnerabilities that could be exploited, which we've already seen in DeepSeek's first weeks in public.

DeepSeek And The Future of SEO

So, what does this all really mean for search professionals? The way I see it, DeepSeek is just the next splashy AI chatbot with search capabilities in the rapidly changing world of SEO.

It’s important to understand that while tools like DeepSeek and ChatGPT use advanced natural language processing (NLP) and machine learning, they still simply provide answers to real questions that real people ask.

Their responses heavily focus on semantic understanding, intent matching, and contextual analysis, but they ultimately serve the same core user need.

While we have years of experience testing optimization tactics on more established search engines like Google, we’re still at the beginning stages of understanding optimization for generative AI chatbots.

Final Thoughts

Whether DeepSeek will stick and grow in prominence remains to be seen.

Obviously, if other governments follow Australia, Italy, and potentially the U.S. in banning DeepSeek, that would limit its potential for growth.

And much as DeepSeek rose to prominence rapidly by providing a blueprint for others and significantly lowering costs, a new market-moving AI could always be just around the corner.

Regardless of what happens with DeepSeek, we are at the beginning of a very rapid period of innovation in AI technology.

As SEO professionals, we need to be prepared to test a surge of new platforms and reverse engineer how they arrive at their responses to user queries.



Featured Image: Phonlamai Photo/Shutterstock

AI Chatbots Fail News Accuracy Test, BBC Study Reveals via @sejournal, @MattGSouthern

BBC study finds leading AI chatbots consistently distort news content, raising concerns about information accuracy and trust.

  • AI chatbots are getting news wrong more often than right.
  • Trusted brands like the BBC are losing control of their content.
  • The problem is industry-wide, affecting all major AI platforms.

Building Trust In The AI Era: Content Marketing Ethics And Transparency via @sejournal, @rio_seo

We’re officially entering a new era: the content overload era.

Content is no longer seen as a nice to have but a must for the majority of businesses.

The sheer volume of content being created and published daily across the web is astounding, to say the least – WordPress alone sees about 70 million new posts each month.

Knowing this could send content marketers into a frenzy, scrambling to crank out more content to keep up with demand. But quantity isn't the only marker of content success.

The content overload era has done more than spark the need for more content; it has prompted a reliance on tools and technology.

One such tool that has made its way into nearly every content marketer’s toolkit is artificial intelligence (AI). Its ability to streamline mundane processes quickly and with minimal effort has made it a crowd favorite for many content marketing professionals.

As more content marketers turn to AI to help with the content brainstorming, development, and distribution process, it raises one poignant question: Are we sacrificing quality for speed?

While unclear at first, it’s now more evident than ever that AI is here to stay and holds the potential to become an ally for content marketers when used right – a tool used for support rather than as a standalone solution.

In this post, we’ll explore how to amplify your content marketing efforts the right way, strengthening your trust with your audience.

You’ll learn how to cut through the noise to reach your target audience amid emerging tools, tactics, and technology.

You’ll walk away feeling confident in how to effectively reach and engage with your audience without relying solely on AI for your content marketing efforts.

Understanding The Challenges Of Content Saturation

Every second of the day, an influx of content is published across myriad platforms such as email, social media, websites, and more.

Consumers are inundated with content, having to sift through the mountains of information to find what is most relevant to their needs.

Vying for their time and attention can be difficult, especially when your competitors, and even those in different verticals, are attempting to do the same.

The rise of AI technology presents another challenge. Some content marketers and businesses are turning to AI to draft and publish content quickly.

Given its accessibility and capabilities, AI is becoming an easy way to churn out content, although a study has pointed to decreases in search engine visibility with AI-generated content.

The consistent, steady stream of content options can lead to what many refer to as “information overload,” where consumers become overwhelmed with the endless content options at their fingertips.

Information overload makes it increasingly difficult for brands to stand out. Additionally, algorithms are becoming attuned to understanding consumer preferences, surfacing and prioritizing content based on relevance and engagement.

Generic content marketing strategies no longer suffice. Smart and savvy strategies are required in the ultra-fierce race for audience attention.

Breaking Through The Noise: Strategies That Work To Build Trust

Content isn’t being served in one single location. Long gone are the days of direct mail, email, and blogs being the main content forces to reckon with. The battle for attention is more arduous than ever.

With the emergence of social media platforms like TikTok, Instagram, and YouTube, content creation is no longer just in written form but rather through captivating photos, audio, and video formats.

Innovative content marketing approaches are necessary to truly build trust and differentiate your content from others.

Hyper-targeted content, personalization, and strategic AI usage are among those approaches that lead to the path to content marketing mastery.

1. Hyper-Targeted Content: Reaching The Right Audience

Imagine shouting into a void, one so massive and wide-reaching that your voice barely penetrates the surface.

The effort exerted to scream your message wouldn’t be worthwhile as no one would hear a word you say. Unfortunately, this example is all too common in the world of digital marketing.

Despite the most earnest efforts, marketers don’t effectively reach their audience due to poor segmentation or not understanding the audience at a granular level.

By analyzing key data points – like demographic, psychographic, and behavior data – brands can tap into what motivates their target audience most.

Content can then be delivered more effectively to the right audience at the right time with the right message.

Tools like Google Analytics, Google Business Profile, and email and social media marketing platforms are becoming more intelligent, enabling businesses to gain a deeper understanding of their audience.

These insights may reveal the best time of day to send a message, what locations are receiving the most traffic, the top-performing email nurture sequence to send new customers, and much more.

Takeaway: Craft Content Tailored To Niche Interests

Generic content no longer works. Instead, successful content marketers focus on niche markets, delivering highly relevant content that addresses a specific pain point.

For example, a popular pet retailer offers numerous specialty services to their customers. Bundling all this information on one landing page can cause confusion, leading to lower click-through rates and, in turn, less revenue.

By adding specialty landing pages with unique content for each of their services offered, such as vaccinations, aquatics, grooming, and more, the pet retailer saw dramatic increases in organic search traffic.

Understanding your audience is imperative, and content must match the needs of the individual.

Additionally, this level of segmentation can help customers build trust with your business, perceiving you as a trusted resource that truly understands their needs.

They no longer feel like just another email contact on your massive send list.

Hyper-targeted content requires more than cranking out AI-generated content. It requires human oversight to ensure segmentation is correct, the message isn’t generic, and your content matches the audience’s unique needs.

AI can be great for helping you brainstorm content ideas for your niche audience; however, a human copywriter is necessary to truly get the message over the line.

2. Effective Use Of Personalization

Addressing a prospect by their first name isn’t personalization.

Content personalization extends far beyond simply knowing the names of your customers. Modern content consumers expect more out of businesses in order to trust them enough to purchase.

They expect content that aligns with their unique needs, such as surfacing previously frequent purchases or highlighting a book that’s similar in style to the last book a customer read.

Customers are savvy, and if they’re presented with options that don’t align with their preferences, they’ll look elsewhere.

Think of Amazon, for example. Amazon’s algorithms are intelligent enough to highlight a product within a certain time period based on the buyer’s purchase history.

For example, a customer might buy Vitamin D supplements every three months. Amazon will likely show this product to the consumer around the time a refill is needed, streamlining and optimizing the path to purchase.
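
A back-of-the-envelope Python sketch of that kind of replenishment logic, using made-up dates, might look like this:

```python
from datetime import date, timedelta

def next_replenishment_date(purchase_dates):
    """Estimate when a repeat purchase is due from past order dates."""
    ordered = sorted(purchase_dates)
    if len(ordered) < 2:
        return None  # not enough history to infer a cycle
    gaps = [(later - earlier).days for earlier, later in zip(ordered, ordered[1:])]
    avg_cycle = sum(gaps) / len(gaps)  # e.g., ~90 days for a quarterly refill
    return ordered[-1] + timedelta(days=round(avg_cycle))

# A customer who reorders vitamin D roughly every three months:
history = [date(2024, 1, 5), date(2024, 4, 2), date(2024, 7, 1)]
print(next_replenishment_date(history))  # about three months after the last order
```

Real recommendation systems weigh far more signals than this, but the timing principle is the same: surface the product when the data suggests the customer is likely to need it.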

Revenue can be strongly tied to personalization: a HubSpot report found that segmented emails can boost opens by 30% and click-throughs by 50%.

Takeaway: Personalization – A Powerful Differentiator That Requires Balance

Personalization walks a fine line. It shows you care about your customers by sharing more relevant content that matches their needs; however, privacy must be considered.

Algorithms are becoming more intelligent by analyzing and refining their content distribution strategies. This requires customer data, a subject that breeds concern and calls ethics into question.

It's crucial for businesses to share how, when, and where customer data is collected. Disclose this on your website and in your content in a clearly visible, easy-to-find location.

Transparency is key to winning trust and credibility.

3. Responsible AI Usage In Content Creation

Many marketers have jumped aboard the AI bandwagon – 64% are already using it. Despite its prominent adoption, AI is seen as both a blessing and a curse.

On one hand, it has significantly impacted the way we work, streamlining tasks and delivering quick results.

On the other, it leads to duplicated content, information bias, irrelevant content, and an abundance of content that all sounds the same.

In fact, over half (60%) are concerned AI will harm their brand’s reputation through bias, plagiarism, or misalignment with brand values.

AI, when used responsibly, can enhance content marketing. However, the tool itself can’t mitigate concerns associated with its usage for content marketing specifically.

Only humans hold the power to truly transform the content experience and eliminate the over-reliance on AI for content creation.

Use cases for AI for content marketers:

  • Data analysis.
  • Improving drafts.
  • Keyword research.
  • Content optimization.
  • Technical SEO fixes.
  • Grammar and clarity.
  • Outline creation.

Takeaway: Use AI To Complement Human Efforts

Relying solely on AI for content creation comes with inherent risks.

AI-generated content often lacks authenticity and loses the author’s unique tone of voice. It can sound the same, reading too crisp and polished.

It loses the human element: the emotion and spark that human writers bring and AI simply can't.

Successful brands recognize AI can enhance human creativity, but it is not meant as a replacement.

Human ingenuity helps to build trust and shines your business in a more positive light.

Integrating All 3 Strategies For Maximum Impact

Content marketing strategies work best when used in tandem.

For example, a retailer might use AI to extract common themes in customer feedback, hyper-targeted content to promote relevant content based on customer feedback within a specific region, and personalize outreach with product recommendations based on the buyer’s behavior.

This all-encompassing approach not only improves customer experience but holds the potential to improve return on investment (ROI) as well.

As with any marketing strategy, measurement is a must. Keep a pulse on your wins as well as your opportunities for enhancement.

A firm understanding of metrics such as click-through rates, conversion rates, and engagement metrics across all platforms helps you spot what’s working and what isn’t.
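
If it helps, a tiny helper like this hypothetical one keeps those definitions consistent across channels:

```python
def click_through_rate(clicks: int, impressions: int) -> float:
    """CTR = clicks / impressions, returned as a percentage."""
    return 100 * clicks / impressions if impressions else 0.0

def conversion_rate(conversions: int, clicks: int) -> float:
    """Conversion rate = conversions / clicks, returned as a percentage."""
    return 100 * conversions / clicks if clicks else 0.0

# Example: 48,000 impressions, 1,200 clicks, 60 conversions.
print(click_through_rate(1200, 48000))  # 2.5
print(conversion_rate(60, 1200))        # 5.0
```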

The dual content overload and AI era has just begun, and the way content marketers used to reach customers will no longer suffice.

Instead, as marketers, we must work diligently to bridge the trust gap that exists between customers and brands.

This has become an increasingly tough task given the advancement of AI technology, where it can be hard to discern who's behind the messaging – a human or a machine.

Marketers must focus on ethics and transparency to ensure every message they craft is meaningful, useful, and relevant.

By using AI as a supportive tool, adopting hyper-targeted campaigns, and leveraging personalization strategies, brands will create customer experiences that resonate with their audience.

Content will continue to grow at an astounding pace, but the brands that prioritize top-notch content and connection will continue to stand out.



Featured Image: Andrey_Popov/Shutterstock