How Brands Are Increasing AI Visibility By Up To 2,000% [Webinar] via @sejournal, @hethr_campbell

The answer is Reddit, and yes, this 90-day strategy is worth your time.

Most brands treat Reddit as an afterthought.

However, Reddit is where buyers finalize their purchase decisions.

Reddit is where human trust gets built.

Therefore, Reddit serves as a trust signal for how AI search tools determine which brands are worth recommending.

AI Mentions & Cites Brands Based On Trust Signals Across Channels

When ChatGPT, Perplexity, or Google AIO recommends a brand, it’s drawing on a web of signals that indicate the brand is credible, relevant, and mentioned by real people in real contexts.

Reddit is one of the most authentic of those signals.

Your opportunity: not Reddit instead of other channels, but Reddit as a meaningful addition to the multi-channel trust footprint AI reasons from.

One brand OGS Media worked with saw 2,000% AI visibility growth in 90 days after building a genuine Reddit presence. That’s the strategy Bartosz and Brent are unpacking on May 5.

What You’ll Learn In This AI Search Webinar

  • How Reddit community content contributes to the multi-channel trust signals AI uses to evaluate and surface brands
  • The 5-stage framework behind OGS Media’s 2,000% AI visibility result
  • The 7 most common Reddit mistakes brands make
  • What authentic subreddit engagement looks like when it’s actually working
  • How to find and engage in Reddit conversations that influence both buyers and AI

About the Speakers

Bartosz Goralewicz is the CEO of OGS Media and one of the most experienced Reddit marketing practitioners in SEO. Brent Csutoras is a Reddit Official Advisor and the Owner of Search Engine Journal, with nearly two decades of hands-on Reddit strategy for brands across every major vertical.

AEO In 2026: Which Content Formats Earn AI Citations & How to Produce More [Webinar] via @sejournal, @hethr_campbell

AI-generated answers are capturing intent before the click, and that changes where to invest, what to measure, and which formats to prioritize. The question isn’t whether to adapt, but what to do first.

Answer Engine Optimization (AEO) Is A Core Discipline

AEO sits alongside SEO as a primary driver of how brands get discovered in 2026. The content formats, authority signals, and workflows that earn citations in ChatGPT, Claude, and Gemini are distinct from what drives traditional rankings.

What You’ll Learn

  • Which AEO and content marketing trends will have the most impact on AI citation rate and organic visibility in 2026.
  • How to reframe your success metrics when AI answers replace the click, and what to optimize for instead.
  • Which content formats generate the highest likelihood of AI citation, and how to build more of them into your editorial workflow.
  • How to integrate agentic workflows into your content operation to scale authority-building without losing quality.

About the Speakers

Shannon Vize is Sr. Content Marketing Manager at Conductor, focused on the intersection of AI and content strategy. Pat Reinhart is VP of Services & Thought Leadership at Conductor, with deep experience helping digital teams adapt their search strategies to emerging discovery behaviors.

This session delivers a practical, prioritized framework for operationalizing AEO and building AI search visibility in 2026.

Why Great Content Is No Longer Enough & What Beats It In AI Search via @sejournal, @TaylorDanRW

The assumption has been that producing something more detailed, more original, and more useful would naturally lead to stronger results. That approach worked in a search ecosystem where discovery (and success) depended on rankings, clicks, and users actively choosing what to read.

That ecosystem rewarded the most compelling, scannable, or comprehensive option on the page, which made craftsmanship feel like the primary lever for success.

It is no longer the ecosystem we are working in, and continuing to apply that same logic without adjusting is exactly where many teams are starting to fall behind. We’ve seen this with the gamification of listicles already, and how large language models (and Google) are having to “patch” exploits as they’re found.

AI has not reduced the importance of content, but it has shifted where value is created and how that value is realized, which now revolves around who gets surfaced, cited, and reused within systems that sit between users and the web.

Content quality still matters, but it is no longer the deciding factor, and treating it as such creates a blind spot that is becoming increasingly difficult to ignore.

The Shift From Authorship To Retrieval

In traditional search, authorship carried clear weight because you created a page, earned visibility through rankings, and relied on users to click through and engage directly with what you had produced.

Success was closely tied to ownership and placement within a list of results, which made the relationship between effort and outcome feel transactional, and easily reportable to stakeholders.

Authorship still matters, and it still influences whether content is trusted, referenced, and reused, but its role has shifted toward how it supports retrieval rather than how it drives direct consumption.

Content now needs to function not only as a complete piece for human readers but also as a collection of ideas that can be extracted and reused across different contexts. This creates pressure on structure, clarity, and alignment with recognizable entities, since an author is no longer just a name attached to a page but an entity that exists across a broader ecosystem of signals, references, and mentions.

When those connections are strong, authorship reinforces retrieval and increases the likelihood that content will be selected and reused. When they are weak or absent, even high-quality content can struggle to gain traction.

AI systems don’t ignore authorship, but the way we’ve thought about authorship signals for Google is adapting. LLMs compress authorship into signals of credibility and consistency, then express that trust through what they retrieve and include in generated responses.

This changes the unit of competition from pages to fragments and shifts the focus from ownership to accessibility, while still anchoring value in who created the content and how that creator is understood elsewhere. Strong writing and clear expertise improve the chances of being retrieved, but they do not guarantee it, which means success depends on combining credible authorship with high retrievability.

Does Being Cited Matter More Than Being Read?

For the past two decades, content strategies have been built around generating clicks, with teams refining headlines, descriptions, and formats to encourage users to visit their pages and engage directly with their work.

The visit itself served as the primary measure of success, which made traffic a reliable proxy for impact. In AI-driven experiences, that step is often removed because answers are formed within the interface before a user considers visiting a website, which fundamentally changes what visibility looks like.

Being read becomes less important than being cited, since citations now act as the mechanism through which influence is established. When content is consistently used to construct answers, it shapes user decisions even without a measurable visit, which makes its impact harder to track but no less significant.

Content that is not used in this way becomes effectively invisible, regardless of how much effort was invested in creating it.

This shift disrupts the feedback loop that marketers have relied on for years, since traffic is no longer a reliable indicator of presence or influence, even though many teams continue to optimize for it.

Distribution Wins

Challenging the idea that better work leads to better outcomes is uncomfortable because it runs counter to a belief that has been widely accepted for a long time. The ability to write excellent content still plays a role, but it is no longer the primary driver of success, and overinvesting in it while neglecting other factors is becoming a strategic risk (depending on how strong your brand and distribution mechanisms are).

Distribution has taken on a more important role, although it needs to be understood in a broader sense than traditional concepts like social reach or link building. In an AI-driven search ecosystem, distribution refers to how information exists across a network of sources that inform and validate what systems retrieve and use.

This includes being referenced across multiple trusted platforms, appearing in formats that are easy for machines to interpret, reinforcing consistent narratives about your brand, and showing up in places where systems look for confirmation.

The goal is to create alignment between what you publish and how systems evaluate credibility, relevance, and usefulness. It is entirely possible to produce an exceptional piece of content and still underperform if it exists in isolation, while a network of average content that is widely distributed and consistently reinforced can outperform it.

Content Needs To Do More Than ‘Be Read’

Great content that is not surfaced has no meaningful impact, which highlights a shift that many teams are still coming to terms with.

Quality continues to matter because weak content cannot sustain visibility over time, but the threshold for what qualifies as good enough is lower than many assume, especially when compared to the level of effort being invested.

Once that threshold is met, positioning becomes the factor that determines whether content is retrieved, cited, and embedded into answers or ignored entirely.

This reflects a broader change in how outcomes are determined, since effort no longer has a clear or direct relationship with results.

Alignment with systems on the platforms where content exists now plays a larger role, which requires a different way of thinking about strategy.

What This Means In Practice

A strategy that focuses only on improving content quality addresses only part of the challenge and leaves a significant opportunity untapped, particularly as AI continues to shape more of the user journey.

It becomes essential to consider how easily content can be extracted and reused, where ideas are reinforced outside of owned platforms, whether structure supports both human understanding and machine interpretation, and how consistently narratives appear across the broader ecosystem.

This shift also requires rethinking how success is measured, since influence can increase without a corresponding rise in traffic, which can feel uncomfortable for teams that are used to clear attribution models.

The goal is not to abandon quality but to recognize that it is no longer sufficient on its own, and that positioning needs to be treated as a core component of strategy.


How Zero-Party & First-Party Data Can Fuel Your Intent-Based SEO Strategy via @sejournal, @rio_seo

There’s an interesting paradox occurring in marketing right now: Marketers have more tools and data at their fingertips than ever, yet marketing leaders somehow have less clarity than ever before.

Over the past decade, Google’s algorithms and privacy regulations have significantly shifted traditional SEO best practices. SEO has evolved from a precise science to more of a trust discipline, where marketers must infuse credibility and authority into their content to improve visibility.

The new opportunity at hand isn’t scraping more consumer behavior but rather listening to it in a new way. By diving deeper into zero-party data (information customers willingly share) and first-party data (behavior observed directly on your own channels), chief marketing officers can shape their SEO strategies around real human intent.

Search success will be contingent on whether brands understand their audience well enough to create relevant, authentic, and trustworthy content at every step of the customer journey, not just when an algorithm prompts them to.

The Connection Between Zero-Party Data And SEO

Zero-party data is marketing’s cleanest and clearest source of truth. It uncovers the information customers want you to have. It unveils their preferences, motivations, and needs through methods like surveys, quizzes, chatbots, and more.

First-party data shows what users do. Zero-party data shows you why they did what they did. When paired together, both forms of data bridge the gap between analytics and empathy.

For example, a retail brand might ask site visitors in a post-purchase survey, “What is most likely to motivate you to make a purchase?” with answer options of price, sustainability, or convenience. Now, consider if nearly half of those respondents chose “sustainability.”

This insight shouldn’t fall into a void, but rather should be acted upon quickly. It’s not a trend but rather a clear signal. The content and SEO teams can now focus on creating content around “eco-friendly shopping” and other relevant sustainability topics, while communications teams can align messaging around the same topic. In turn, seamless collaboration and alignment take place.

Moving Beyond Keywords To Conversations

Traditional SEO homed in on what people typed into the search bar. Zero-party data reveals what people mean when they’re searching for a business, product, or service. Algorithms increasingly reward intent satisfaction when evaluating content. When your content addresses and is built on declared motivations, like why someone is looking for your specific solution, you’re aligned with the future of search.

How To Turn Customer Data Into Search Strategy

The issue isn’t that CMOs aren’t collecting data; it’s the struggle with turning it into action that drives meaningful change.

An intent-based SEO strategy has three phases: capture, interpret, and activate.

Phase 1: Capture

Customers aren’t going to hand over information if they don’t see a clear value in doing so. To encourage this, marketers must highlight a mutual benefit in the information exchange. A few methods include:

  • Gated research studies.
  • Short post-purchase surveys.
  • Interactive quizzes or calculators.
  • Preference centers so customers only receive communication around specified topics that matter most to them.
  • Incentives such as coupons and exclusive promotions for newsletter subscribers.

Each of these information exchanges becomes a declared-intent breadcrumb. Users have granted your business permission to act on their feedback, and these signals are far more actionable than cookie trails alone.

Phase 2: Interpret

Collecting information from myriad channels can make it difficult for teams to determine where to focus their attention first. To dissect and pull out the insights that matter most from structured and unstructured feedback, CMOs should invest in qualitative analysis tools. Text analytics tools, for example, make it easy for CMOs and CX teams alike to mine for common themes.
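As a rough illustration of what that theme mining can look like, here is a minimal Python sketch that tallies motivation themes in free-text feedback using a hand-built keyword lexicon. The responses and theme keywords are hypothetical; a real implementation would use a proper text analytics tool rather than substring matching.

```python
from collections import Counter

# Hypothetical feedback; in practice these would come from surveys,
# chatbot transcripts, or review exports.
responses = [
    "I bought because the packaging is sustainable",
    "Fast shipping, and I like the eco-friendly materials",
    "Good price, but sustainability sealed the deal",
    "Checkout was convenient and quick",
]

# Simple (illustrative) lexicon mapping keywords to a motivation theme.
themes = {
    "sustainability": ["sustainable", "eco-friendly", "sustainability"],
    "price": ["price", "cheap", "deal"],
    "convenience": ["convenient", "fast", "quick", "easy"],
}

counts = Counter()
for text in responses:
    lowered = text.lower()
    for theme, keywords in themes.items():
        # Count each theme at most once per response.
        if any(k in lowered for k in keywords):
            counts[theme] += 1

for theme, n in counts.most_common():
    print(f"{theme}: mentioned in {n} of {len(responses)} responses")
```

Even a crude tally like this surfaces the dominant theme (here, sustainability) that content and SEO teams can act on.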

Customer Data Platforms (CDPs) can also help you create audiences and segments to deliver more personalized content that resonates with customers. This might look like a retail marketing manager only receiving newsletters, ebooks, or blogs related to the retail industry and its trends.

These types of thematic content pillars can help inform supporting search queries, schema markup, content priorities, and more.

Phase 3: Activate

In this phase, you’ll set your plans into action. First, connect declared intent to keyword intent. For example, if customers talk about “security peace of mind,” this gives you clear insight into what they’re interested in learning more about and how your company can help. You could create content that explicitly speaks to “how we secure your personal data.”

On the other hand, if they’re talking about “easy to implement,” it may be beneficial for you to provide explainer-type content, such as a short video or an FAQ page (with FAQ schema), to address “how to integrate [product name]” searches.
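For reference, FAQ schema is expressed as JSON-LD embedded in the page. A minimal sketch with hypothetical questions and answers, generated here with Python’s json module (the structure follows schema.org’s FAQPage type):

```python
import json

# Hypothetical FAQ entries; the questions and answers are illustrative.
faq = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": "How do I integrate the product with my CMS?",
            "acceptedAnswer": {
                "@type": "Answer",
                "text": "Install the plugin, paste your API key, and publish.",
            },
        },
        {
            "@type": "Question",
            "name": "How do you secure my personal data?",
            "acceptedAnswer": {
                "@type": "Answer",
                "text": "Data is encrypted in transit and at rest.",
            },
        },
    ],
}

# Emit the JSON-LD block to embed in a <script type="application/ld+json"> tag.
print(json.dumps(faq, indent=2))
```

The resulting markup makes each question-and-answer pair machine-readable, which is what allows it to surface for “how to integrate” style queries.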

Zero-party data moves SEO from a guessing game to an action engine, producing content that satisfies not just search algorithms but also the people behind the searches.

Leadership Enablement: Aligning Teams, Culture, And Technology

To build an insight-to-action culture, CMOs should encourage teams to share qualitative learnings regularly, whether through a cadence of weekly meetings, via email, or a combination of the two. Customer experience teams should make Voice of Customer insights loud and clear to help inform SEO and content briefs.

It’s also important to highlight and reward cross-functional wins to showcase how working together helps drive growth. This might look like an SEO strategy that was informed by CX feedback or a case study that solves a pressing challenge clients typically face, informed by online reputation feedback.

Operationalize The Feedback Loop

CMOs can establish a regular “intent feedback loop” to operationalize the data your company receives and act on it. This might look like:

  • Gather declared data (surveys, chatbot transcripts, online reviews, call center logs).
  • Identify what motivates consumers most (customers often talk about time savings, value for money, trust issues, emotions).
  • Update content briefs and keyword maps (primary and secondary keywords, content requirements, search intent to ensure you’re staying up to speed).
  • Measure whether your content is landing with your intended audience on an emotional and intellectual level. Engagement, recall, and action are key determinants of content success, not just how it ranks.

This type of feedback framework helps organizations embed customers’ preferences and desires directly into the content published, helping your business create the content that actually connects with your target audience.

The Metrics To Add

Measuring what matters most is integral to assessing the impact of zero-party data analysis efforts. Alongside other SEO metrics, the following can help you gain a holistic view of your SEO performance:

Resonance Metrics

Engagement quality is a truer measure of attention. Volume, while great to have, means little if it comes with an abundance of unqualified leads. Instead, look at:

  • Average engagement time: How long people stick around to view your content.
  • Return visits: People who come back to consume more of your content.
  • Scroll depth: How far visitors scroll; readers who find your content genuinely interesting read it to the end.
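These resonance metrics are simple to compute once you export session-level data. A minimal sketch with made-up analytics rows (visitor ID, engaged seconds, max scroll percentage); the field layout and numbers are illustrative, not tied to any specific analytics product:

```python
# Hypothetical per-session rows: (visitor_id, engaged_seconds, max_scroll_pct)
sessions = [
    ("a", 95, 100),
    ("a", 40, 60),
    ("b", 120, 100),
    ("c", 10, 25),
]

# Average engagement time across all sessions.
avg_engagement = sum(s[1] for s in sessions) / len(sessions)

# Return-visit rate: share of visitors with more than one session.
visitor_counts = {}
for visitor, _, _ in sessions:
    visitor_counts[visitor] = visitor_counts.get(visitor, 0) + 1
returning = sum(1 for n in visitor_counts.values() if n > 1)
return_rate = returning / len(visitor_counts)

# Average scroll depth as a percentage of page length.
avg_scroll = sum(s[2] for s in sessions) / len(sessions)

print(f"avg engagement: {avg_engagement:.1f}s")
print(f"return-visit rate: {return_rate:.0%}")
print(f"avg scroll depth: {avg_scroll:.0f}%")
```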

Relevance Metrics

Marketers must track growth in high-intent and branded queries, as these are most often the terms someone on the verge of buying will use when searching for your business. If you’re showing up for phrases customers typically use at the decision-making stage, such as “State Farm vs. Geico car insurance,” this indicates deeper resonance.

Relationship Metrics

Loyalty metrics, while not something SEOs typically track, can correlate with how well your SEO program is working. Reframing SEO performance as a reflection of customer understanding helps CMOs dig a layer deeper, past tactics alone, to understand the deeper-rooted customer emotions that could be preventing your business from scaling. Look at:

  • Zero-party response rate: The percentage of users who are willing to share their personal information and experiences.
  • Repeat engagement: Consumers who continue to engage with your business and see value in doing so.
  • Customer lifetime value: How valuable a customer is to your business over time (how much they purchase and whether they churn quickly).
  • Retention rate: The share of hard-won customers who continue to do business with you.

The Future Belongs To Human-Declared Intent

We may be in the age of AI, but the future is human. Yes, AI can generate a keyword-optimized blog in a matter of seconds, but the human touch is where the real value lies. And human-informed data will be your business’s ultimate differentiator.

Zero- and first-party data reveal pertinent insights that elevate organizations when acted upon. They unlock insights into why people search, not just what they search for, and uncover where in the sales journey customers get stuck and what blocks them from purchasing.

Moving forward, to fuel your SEO efforts:

  • Ask customers what matters most to them.
  • Listen to what they have to say.
  • Create content that addresses those asks.
  • Optimize it for human needs, not just engagement and clicks.
  • Measure customer experience metrics, not just SEO.

When marketing leaders take consumer feedback to heart, they bridge the gap between traffic and trust, building stronger relationships that lead to more purchases, repeat customers, and improved brand experiences.


The Content Moat Is Dead. The Context Moat Is What Survives via @sejournal, @DuaneForrester

So, let’s say you spent six months building a resource library: guides, explainers, comparison pages, all well-researched and clearly written, structured for humans who are trying to make decisions. Your analytics show strong engagement, and your team is proud of the work.

Then someone asks ChatGPT a question your library answers perfectly, and the response cites a competitor. Not because the competitor was more accurate or more thorough, but because they published original benchmark data that the model could not find anywhere else. Your content was correct; theirs was irreplaceable. That distinction now helps decide who gets cited and who gets omitted.


The Summarization Problem Is Now The Content Strategy Problem

Any major AI platform can condense a 3,000-word guide into three sentences in under two seconds, today. That capability has a direct consequence for how content creates value. If your content can be fully replaced by a summary, it has no moat. The summary becomes the product, and your page becomes the raw material that someone else’s system processes and discards.

This is already happening across multiple surfaces. Gmail’s Gemini-powered summary cards condense marketing emails before recipients see the original content. Google AI Overviews synthesize answers from your pages and present them above your link. Microsoft’s Copilot can now handle purchasing without visiting retailer websites, compressing the entire discovery-to-transaction journey into a single assistant interaction. Samsung plans to double its Galaxy AI devices to 800 million in 2026, pushing AI-mediated discovery and summarization into everyday consumer interactions at a scale that dwarfs what we are seeing today.

The layer between your content and your audience is getting thicker and more capable every quarter. When that layer can reproduce the value of your page without sending anyone to it, the page itself stops being the asset. The asset becomes whatever the layer cannot reproduce.

What Commodity Content Actually Is

Most teams will not like this definition, but it needs to be precise. Commodity content is information available from multiple public sources, repackaged without original data, methodology, or first-person insight. That covers a lot of ground. Most how-to guides, most of what passes for “thought leadership,” and any page where the core information could be assembled by a competent person with access to the same public sources you used.

The uncomfortable reality is that much of what marketing teams call “high-quality content” qualifies as commodity. Clean writing, accurate information, and helpful structure are necessary, but they are no longer sufficient. They are table stakes in the same way that having a mobile-responsive site became table stakes a decade ago. When AI can produce a competent synthesis of public knowledge on any topic, the bar for defensible content moves above “correct and well-written.”

The Content Marketing Institute’s 2026 B2B research surveyed over 1,000 B2B marketers, and the top challenges they reported remain identical to prior years: not enough quality content, difficulty differentiating from competitors, and resource constraints. Those challenges are not new. What is new is that AI makes the consequences of undifferentiated content dramatically worse, because when your guide and your competitor’s guide both say the same thing, the AI picks one and ignores the other, or it picks neither and synthesizes from both without citing either.

The Context Moat Defined

A context moat is content that requires proprietary access, original research, unique datasets, or domain-specific experience to produce. AI can summarize it, AI can reference it, but AI cannot replicate the source material because the source material does not exist anywhere else.

The categories are specific and worth naming clearly:

  • Original benchmarks and proprietary data. This means your customer data (anonymized and aggregated), your internal performance metrics, your survey results. When HubSpot publishes its State of Marketing report, AI must cite HubSpot. When Salesforce publishes State of Sales, AI must cite Salesforce. That “must” is the moat, as the model has no alternative source for those specific numbers.
  • First-person methodology and case studies with specifics. Not “a SaaS company improved retention.” Instead: “We reduced churn from 8.2% to 4.1% over six months by restructuring onboarding around three specific interventions, and here is exactly what we did.” The specificity is the moat because nobody else was in the room when those decisions were made.
  • Expert commentary that models cannot fabricate. Named humans with verifiable credentials offering professional judgment, not just information. Models can synthesize facts from public sources all day long, but they struggle to replicate the judgment of someone who has spent twenty years in a specific domain and can tell you what the data means in context.
  • Original testing and experimentation. You ran the test, you controlled the variables, you measured the outcome. Nobody else has that data unless you choose to publish it, which means the model has to come to you or go without.

This is not an abstract framework. Research is already showing that AI systems disproportionately cite content with original data. The peer-reviewed GEO study from Princeton and Georgia Tech, presented at KDD 2024, found that adding statistics to content improved AI visibility by 41%, making it the single most effective optimization technique tested. Separate analysis from Yext found that data-rich websites earn 4.3 times more citation occurrences per URL than directory-style listings. The mechanism is straightforward: AI systems are risk-minimizing, and when a model needs to support a claim, it looks for a source it can confidently attribute. Original data with clear provenance is safer to cite than a synthesis of public information.

Why This Is An AI Visibility Play, Not Just A Content Strategy Play

If you have been reading this publication, you already know that AI retrieval works differently from traditional search ranking. I have written about how answer engines pick winners, about the gap between human relevance and model utility, and about why being right is not enough for visibility. The context moat connects all those threads into a single strategic argument.

Context-moat content becomes the authoritative node in the retrieval graph. When multiple sources say the same thing, the model has choices and your page is fungible: It can pull from you, your competitor, or a third party and produce an equivalent answer. When only one source has the data, the model has a dependency, and dependencies get cited while fungible sources get compressed.

Evertune.ai’s analysis of 75,000 brands found that brand recognition is the strongest single predictor of AI citations, with a 0.334 correlation coefficient. But brand recognition does not appear from nowhere. It compounds from being the origin point for data, research, and insights that other sources then reference, creating what the researchers describe as a citation authority flywheel: You publish original research, the research generates press coverage and industry mentions, those mentions increase brand recognition signals in AI training and retrieval systems, and the higher recognition makes your content safer for the model to cite.

This is why first-party data is not just a personalization play or an advertising play. It is an AI visibility play. The organizations sitting on proprietary datasets, customer behavior patterns, and operational benchmarks have a structural advantage in the AI retrieval layer, if they publish it. Most do not, and that gap between what companies know and what they make available to the machine layer is where the real opportunity sits right now.

The Investment Reallocation

The CMO Survey, drawing from over 11,000 marketing executives, reports that companies allocate an average of 11.2% of digital marketing budgets to first-party data initiatives, expected to reach 15.8% by 2026. Content marketing overall claims 25% to 30% of total marketing budgets, with enterprise teams investing heavily in experiential marketing, video, and distribution.

Here is the question nobody is asking loudly enough: What percentage of that content budget produces commodity content versus context-moat content?

Run the audit on your own library. Take your top 50 pages by traffic or strategic importance, and for each one, ask a single question: Could a competent competitor produce substantially the same page using only public information? If the answer is yes, that page is commodity content. It may still serve a purpose, and it may still drive traffic today, but its defensibility against AI summarization is zero. When the AI can reproduce its value without sending anyone to your page, the page’s strategic contribution collapses.

Now count. If 80% of your library is commodity and 20% is context-moat, your content investment is structurally misaligned with where AI visibility is heading.
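The count itself reduces to a simple tally. A minimal sketch, assuming you have already answered the “could a competitor reproduce this from public sources?” question for each page (all URLs and verdicts below are hypothetical):

```python
# Hypothetical audit results: page URL -> could a competitor reproduce it
# from public sources alone? (True = commodity, False = context moat)
audit = {
    "/blog/what-is-churn": True,
    "/blog/churn-benchmarks-2025": False,   # built on proprietary customer data
    "/guides/onboarding-checklist": True,
    "/research/annual-retention-survey": False,  # original survey results
    "/blog/industry-trends-roundup": True,
}

commodity = sum(audit.values())          # True counts as 1
moat = len(audit) - commodity

print(f"commodity: {commodity}/{len(audit)} ({commodity / len(audit):.0%})")
print(f"context moat: {moat}/{len(audit)} ({moat / len(audit):.0%})")
```

The ratio, not any individual verdict, is what tells you whether new investment needs to shift.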

The reallocation does not require burning down what exists. It requires shifting new investment toward the content only you can produce, and in most organizations, that shift looks like four concrete changes:

  • Publishing internal data that already exists but is not being shared. Most organizations collect far more proprietary data than they ever publish. Customer behavior benchmarks, operational metrics, industry-specific performance data, etc. The research team has it, the product team has it, and marketing has not yet turned it into published content that AI systems can discover and cite.
  • Investing in original research as a recurring editorial commitment. Annual surveys, quarterly benchmarks, longitudinal studies. These are expensive to produce and impossible for competitors to replicate, which is exactly the point. They create ongoing citation dependencies that compound over time.
  • Shifting editorial resources from synthesis to analysis. A writer summarizing industry trends produces commodity content because anyone can summarize the same trends from the same public sources. A writer analyzing your proprietary data and explaining what it means produces context-moat content. Same writer, different assignment, fundamentally different value to the business.
  • Treating subject matter experts as content assets, not interview sources. An SME quoted in a blog post adds a sentence of value. An SME who authors a detailed methodology breakdown or publishes professional judgment under their own name and credentials creates an AI-citable authority signal that compounds over time. The difference between “we talked to an expert” and “our expert published their analysis” is the difference between commodity and context moat.

The Existing Content Is Not Worthless

I want to be direct about this because the title of this article is deliberately provocative. Commodity content is not garbage. It still serves real functions: it helps humans find what they need, it drives traffic and supports some conversions, and it forms the baseline of how your brand shows up across the web.

But it is no longer the moat. It is the foundation, and foundations do not differentiate because every competitor has one.

The shift I am describing is not “stop producing commodity content.” It is “stop treating commodity content as your competitive advantage.” Those are different statements: The first is impractical for any real business, while the second is a strategic reorientation that changes how you allocate budget and editorial attention.

This aligns with a pattern I see across the AI search transition more broadly. New practices layer onto existing ones rather than replacing them. SEO is no longer a single discipline, but the old disciplines did not disappear. Technical SEO still matters, on-page fundamentals still matter, and the content you already have still contributes. What changed is that those practices are necessary but insufficient. The context moat is the new sufficiency layer.

Where This Leaves You

The competitive landscape for content is splitting into two tiers, and the split is accelerating as AI systems become the primary mediators of discovery.

Tier one consists of organizations that publish original data, proprietary research, and experience-based insight that AI systems must cite because no alternative source exists. These organizations become origin points in the AI retrieval layer, and their content compounds in value as models train on it, reference it, and build answers around it.

Tier two consists of organizations that publish well-written, accurate, helpful content that could be reproduced by any sufficiently motivated team with access to the same public information. These organizations contribute to the training data, but they do not control how they appear in answers. Their content is raw material, not product.

The question for your next budget cycle is not “Are we producing enough content?” It is “Are we producing content that only we can produce?”

If the answer is no, the moat is already gone. The good news is that most organizations are sitting on first-party data they have never published – the research exists, the benchmarks exist, the operational knowledge exists. Turning that into published, structured, citable content is an editorial decision and a prioritization choice, not a capability gap (though you really should check with legal, too). Start with one proprietary metric or benchmark published quarterly with a branded name that AI can reference, and build from there. Every month of original data published is a month of context-moat content that no competitor can replicate, and no AI system can synthesize from public sources.

That is the new defensibility. Not having information, but having context that only you can provide.

This post was originally published on Duane Forrester Decodes.


Featured Image: Gabriela Flores Espinosa/Shutterstock; Paulo Bobita/Search Engine Journal

AI-Generated Content Isn’t The Problem, Your Strategy Is

“If AI can write, why are we still paying writers?” If you’re a CMO or senior manager watching a budget, you’ve probably already had a version of this conversation. It’s a seductive idea. After all, humans are expensive and can take hours or even days to write a single article. So, why not replace them with clever machines and watch costs go down while productivity goes up?

It’s understandable. Buffeted by years of high inflation, high interest rates, and disrupted supply chains, organizations around the world are cutting costs wherever they can. These days, instead of “cost cutting,” CFOs and executive teams prefer the term “cost transformation”: new jargon for the same old problem.

Whatever you call it, marketing is one department that is definitely feeling the impact. According to Gartner, in 2020, the average marketing budget was 11% of overall company revenue. By 2023, this had fallen to 9.1%. Today, the average budget is 7.7%.

Of course, some organizations will have made these cuts under the assumption that AI makes larger teams and larger budgets unnecessary. I’ve already seen some companies slash their content teams to the bone, no doubt believing that all you need is a few people capable of crafting a decent prompt. Yet a different Gartner study found that 59% of CMOs say they lack the budget to execute their 2025 strategy. I guess they didn’t get the memo.

Meanwhile, some other organizations refuse to let AI near their content at all, for a variety of reasons. They might have concerns over quality control, data privacy, complexity, and so on. Or perhaps they’re hanging onto the belief that this AI thing is a fad or a bubble, and they don’t want to implement something that might come crashing down at any moment.

Both camps likely believe they’ve adopted the correct, rational, financially prudent approach to AI. Both are dangerously wrong. AI might not be the solution, but it’s also not the problem.

Beeching’s Axe

The Spanish-born philosopher George Santayana once wrote: “Those who cannot remember the past are condemned to repeat it.” With that in mind, let me share a cautionary tale.

In the 1960s, British Railways (later British Rail) made one of the most short-sighted decisions in transport history. With the railway network hemorrhaging money, the Conservative Government appointed Dr. Richard Beeching, a physicist from ICI with no transport experience, as the new chairman of the British Transport Commission, tasked with cutting costs and making the railways profitable.

Beeching’s solution was simple: do away with all unprofitable routes, identified by assessing the passenger numbers and operational costs of each route in isolation. Between 1963 and 1970, Beeching’s cost-cutting axe led to the closure of 2,363 stations and over 5,000 miles of track (roughly 30% of the rail network), with the loss of 67,700 jobs.

Decades later, the country is spending billions rebuilding some of those same routes. As it turned out, many of those “unprofitable” routes were vital not only to the health of the wider rail network, but also to the communities in those regions in ways that Beeching’s team of bean counters simply didn’t have the imagination to value.

I’m telling you this because, right now, a lot of businesses are carrying out their own version of the Beeching cuts.

The Data-Led Trap

There’s a crucial distinction between being data-led and data-informed. Understanding this could be the difference between implementing a sound content production strategy and repeating Beeching’s catastrophe.

Data-led thinking treats the available data as the complete picture. It looks for a pattern and adopts it as an undeniable truth that points towards a clear course of action. “AI generates content for a fraction of our current costs. Therefore, we should replace the writers.”

Data-informed thinking sets out to understand what might be behind the pattern, extrapolate what’s missing from the picture, and stress-test the conclusions. The data becomes a starting point for inquiry, not an endpoint for decisions. “What value isn’t captured in this data? What would replacing our writers with AI actually mean for the effectiveness of our content when our competitors can do the exact same thing with the exact same tools?”

That last question is the real challenge facing companies considering AI-generated content, but the answer won’t be found in a spreadsheet. If you can use AI to generate your content with minimal human input, so can everyone else. Very soon, everyone is generating similar content on similar topics to target the same audiences, with recycled information and reheated “insights” drawn from the same online sources.

Why would ChatGPT somehow generate a better blog post for you than for anyone else asking for 1,200 words on the same topic? It wouldn’t. You need to add your own secret sauce.

There is no competitive advantage to be gained by relying on AI-generated content alone. None.

AI-generated content is not a silver bullet. It’s the minimum benchmark your content needs to significantly exceed if your brand and your content are to have any chance of standing out in today’s noisy online marketplace.

Unfortunately, while organizations know they need to have content, far too many senior decision-makers don’t fully understand why, never mind all the things an effective content strategy needs to accomplish.

Content Isn’t A Cost, It’s An Infrastructure

Marketing content is often looked down upon as somehow easier or less worthy than other forms of writing. Yet it arguably has the hardest job of all. Every article, ebook, LinkedIn post, brochure, and landing page has to tick off a veritable to-do list of strategic requirements.

Of course, your content needs to have something to say. It must work on an informational level, backed by solid research and journalism. However, each asset or article also has a strategic role to play: attracting audiences, nurturing prospects, or converting customers, while aligning with the brand’s carefully mapped out messaging at every stage.

Your content must build authority, earn trust, and demonstrate expertise. It must be memorable enough to aid brand awareness and recall, and distinctive enough to differentiate the brand from its competitors. It must be structured for search engines with the right entities, topics, and relationships, without losing the attention of busy humans who can click away at any second. Ideally, it should also include a couple of quote-worthy lines or interesting stats capable of attracting attention when the content is distributed on social media.

ChatGPT or Claude can certainly string a bunch of convincing sentences together. But if you think they can spin all those other plates for you at the same time, and to the same standard as a skilled content creator, you’re going to be disappointed. No matter how detailed and nuanced your prompt, something will always be missing. You’re still asking AI to synthesize something brilliant by recycling what’s already out there.

Which brings me to the most ironic part of this discussion. With the rapid adoption of AI-mediated search, your content now needs to become a source that large language models will confidently cite in responses to relevant queries.

Expecting AI to create content likely to be cited by AI is like watching a dog chase its tail: futile and frustrating. If AI could have generated the information and insights contained in your content, it already has better, more authoritative sources. Why would it cite content that contains little, if any, fresh information or insight?

If your goal is to increase your brand’s visibility in AI responses, then your content needs to offer what can’t easily be found elsewhere.

The Limitations Of Online Knowledge

Despite appearances, AI cannot think. It cannot understand, in the sense we usually mean it. As it currently stands, it cannot reason. It certainly cannot imagine. Words like these have emerged as common euphemisms for how AI generates responses, but they also set the wrong expectations.

AI also cannot use information that isn’t already available and crawlable online. While we like to think that somehow the internet is a massive store of the entirety of human knowledge, the reality is that it’s not even close.

So much of the world we live in simply cannot be captured as structured, digitized information. While AI can tell you when and where the next local collectables market is on, it can’t tell you which dealer has that hard-to-find comic you’ve been chasing for years. That’s the kind of information you can only find out by digging through lots of comic boxes on the day.

And then there are cultural histories and localized experiences that exist more in verbal traditions than in history books. AI can tell me plenty of stuff about the First World War. But if I ask it about the Iranian famine during WW1, it’s going to struggle because it’s not that well documented outside of Iranian history books. Most of my knowledge of the famine comes almost entirely from stories my great grandma told my mother, who then passed them on to me, like how she had to survive on just one almond per day. But you won’t find her stories in any book.

How can AI draw upon the wealth of personal experience and memories we all have? The greatest source of knowledge is human. It’s us. It’s always us.

But while AI can’t do your thinking for you, it can still help in many other ways.

→ Read More: Can You Use AI To Write For YMYL Sites? (Read The Evidence Before You Do)

You Still Need A Brain Behind The Bot

Let me be clear: I use AI every day. My team uses AI every day. You should, too. The problem isn’t the tool. The problem is treating the tool as a strategy, and an efficiency or cost reduction strategy at that. Of course, it isn’t only marketing teams hoping to reduce costs and boost productivity with generative AI. Another industry has already discovered that AI doesn’t actually replace anything.

A recent survey conducted by the Australian Financial Review (AFR) found that most law firms reported using AI tools. However, far from reducing headcount, 70% of surveyed firms increased their hiring of lawyers to vet, review, and sign off on AI-generated outputs.

This isn’t a failure in their AI strategy, because the strategy was never about reducing headcount. They’re using AI tools as digital assistants (research, drafting, document handling, etc.) to free up more time and headspace for the kinds of strategic and insightful thinking that generates real business value.

Similarly, AI isn’t a like-for-like replacement for your writers, designers, and other content creators. It’s a force multiplier for them, helping your team reduce the drudgery that can so often get in the way of the real work.

  • Summarizing complex information.
  • Transcribing interviews.
  • Creating outlines.
  • Drafting related content like social media posts.
  • Checking your content against the brand style guide to catch inconsistencies.

Some writers might even use AI to generate a very rough first draft of an article to get past that blank page. The key is to treat that copy as a starting point, not the finished article.

All these tasks are massive time-savers for content creators, freeing up more of their mental bandwidth for the high-value work AI simply can’t do as well.

AI can only synthesize content from existing information. It cannot create new knowledge or come up with fresh ideas. It cannot interview subject matter experts within your business to draw out hidden wisdom and insights. It cannot draw upon personal experiences or perspectives to make your content truly yours.

AI is also riddled with algorithmic biases, potentially skewing your content and your messaging without you even realizing it. For example, the majority of AI training data is in the English language, creating a huge linguistic and cultural bias. It can take an experienced and knowledgeable eye to spot the subtle hallucinations or distortions.

While AI can certainly accelerate execution, you still need skilled, experienced creatives to do the real thinking and crafting.

You Don’t Know What You Have, Until It’s Gone

Until Beeching closed the line in 1969, the route between Edinburgh and Carlisle was a vital transport artery for the Scottish Borders. On paper, the line was unprofitable, at least according to Beeching’s simplistic methodology. However, the closure had massive knock-on effects, reducing access to jobs, education, and social services, as well as impacting tourism. Meanwhile, forcing people onto buses or into cars placed greater strain on other transport infrastructures.

While Beeching might have solved one narrowly defined problem, he had undermined the broader purpose of British Railways: the mobility of people in all parts of Great Britain. In effect, Beeching had shifted the consequences and cost pressures elsewhere.

The route was partially reopened in 2015 as The Borders Railway, costing an estimated £300 million to reinstate just 30 miles of line with seven stations.

Beeching’s cuts illustrate the folly of evaluating infrastructure (or content strategy) purely on narrow, short-term financial metrics.

Organizations that cut their teams in favor of AI are likely to find it isn’t so easy to reverse course and undo the damage a few years from now. Replacing your writers with AI risks eroding the connective tissue that characterizes your content ecosystem and anchors long-term performance: authority, context, nuance, trust, and brand identity.

Experienced content creators aren’t going to wait around for organizations to realize their true value. If enough of them leave the industry, and with fewer opportunities available for the next generation of creators to gain the necessary skills and experience, the talent pool is likely to shrink massively.

As with the Beeching cuts, rebuilding your content team is likely to cost you far more in the long term than you saved in the short term, particularly when you factor in the months or years of low-performing content in the meantime.

Know What You’re Cutting Before You Wield The Axe

According to your spreadsheet, AI-generated content may well be cheaper to produce. But the effectiveness of your content strategy doesn’t hinge on whether you can publish more for less. This isn’t a case of “any old content will do.”

So, beware of falling into the Beeching trap. Your content workflows might only seem “loss-making” on paper because the metrics you’re looking at don’t adequately capture all the ways your content delivers strategic value to your business.

Content is not a cost center. It never was. Content is the infrastructure of your brand’s discoverability, which makes it more important than ever in the AI era.

This isn’t a debate about “human vs. AI content.” It’s about equipping skilled people with the tools to help them create work worthy of being found, cited, and trusted.

So, before you start swinging the axe, ask yourself: Are you cutting waste, or are you dismantling the very system that makes your brand visible and credible in the first place?

Featured Image: IM Imagery/Shutterstock

Five Ways To Boost Traffic To Informational Sites via @sejournal, @martinibuster

Informational sites can easily slide into a search visibility crisis. No site is immune. Here are five ways to manage content that maintain steady traffic, improve your ability to adapt to changing audiences, and support confident choices that keep the site growing over time.

1. Create A Mix Of Content Types

Publishers are in a constant race to publish what’s latest because being first to publish can be a source of massive traffic. The main problem with these kinds of sites is that relying on current events carries risks that call the publication’s sustainability into question.

  • Current events quickly become stale and no longer relevant to an audience.
  • Unforeseen events like an industry strike, accidents, world events, and pandemics can disrupt interest in a topic.

The focus then is to identify content topics that are reliably relevant to the website’s current audience. This kind of content is called evergreen content, and it can form a safety net of reliable traffic that can sustain the business during slow cycles.

An example of the mixed approach to content that comes to mind is how the New York Times has a standalone recipes section on a subdomain of the main website. It also has a category-based section dedicated to gadget reviews called The Wirecutter.

Another example is the entertainment niche, which in addition to industry news also publishes interviews with stars and essays about popular movies. Music websites publish the latest news but also content built on interview snippets in which famous musicians make interesting statements about songs, inspirations, and cultural observations.

Rolling Stone magazine publishes content about music but also about current events like politics that align with their reader interests.

All three of those examples expand into adjacent topics in order to strengthen their ability to attract steady, reliable traffic.

2. Evergreen Content Also Needs Current Event Topics

Conversely, evergreen topics can generate new audience reach and growth by expanding to cover current events. Content sites about recipes, gardening, home repairs, DIY, crafts, parenting, personal finance, and fitness are all examples of topics that feature evergreen content and can also expand to cover current events. The flow of traffic derived from trending topics is an excellent source of devoted readers who return to read evergreen content and end up recommending the site to friends for both current events and evergreen topics.

Current events can be related to products and even to statements by famous people. If you enjoy creating content or making discoveries, then you’ll enjoy the challenge of discovering new sources of trending topics.

If you don’t already have a mix of evergreen and ephemeral content, then I would encourage you to seek opportunities to focus on those kinds of articles. They can help sustain traffic levels while feeding growth and life into the website.

3. Beware Of Old Content

Google evaluates the total content of a website in order to generate a quality score. Google is vague about these whole-site evaluations. We only know that they do it and that a good evaluation can have a positive effect on traffic.

However, what happens when the site becomes top-heavy with old, stale content that’s no longer relevant to site visitors? This can become a drag on a website. There are multiple ways of handling this situation.

Content that is completely out of date, of no interest to anyone, and therefore no longer useful should be removed. The criterion for judging content is usefulness, not age. The reason to prune this content is that a whole-site evaluation may otherwise conclude that most of the website is composed of unhelpful, outdated web pages, which could drag down site performance.

There’s nothing inherently wrong with old content as long as it’s useful. For example, the New York Times keeps old movie reviews in archives that are organized by year, month, day, category, and article title.

The URL slug for the movie review of E.T. looks like this:  /1982/06/11/movies/et-fantasy-from-spielberg.html

Screenshot Of Archived Article

Take Decisive Steps

  • Useful historical content can be archived.
  • Older content that is out of date can be rehabilitated.
  • Content that’s out of date and has been superseded by new content can be redirected with a 301 response code to the new content.
  • Content that is out of date and objectively useless should be removed from the website and allowed to show a 404 response code.
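As a rough sketch, the four actions above can be expressed as decision rules. Everything here (the field names, the `triage` helper, and the action labels) is an illustrative assumption, not a framework from the article:

```python
# Hypothetical sketch of the content-triage rules listed above.
# Field names, action labels, and precedence are illustrative assumptions.
from typing import Optional, Tuple

def triage(useful: bool, out_of_date: bool,
           superseded_by: Optional[str] = None) -> Tuple[str, Optional[str]]:
    """Map a page's state to one of the four actions in the list above."""
    if useful and not out_of_date:
        return ("archive", None)                # useful historical content
    if out_of_date and superseded_by:
        return ("301-redirect", superseded_by)  # send visitors to the new page
    if out_of_date and useful:
        return ("rehabilitate", None)           # update the content in place
    return ("remove-404", None)                 # objectively useless: let it 404

# Example: an out-of-date guide replaced by a newer one gets a 301.
action, target = triage(useful=False, out_of_date=True,
                        superseded_by="/guides/new-version")
```

One deliberate choice in this sketch: the redirect rule takes precedence over rehabilitation, since a 301 to superseding content preserves link equity with the least editorial effort.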

4. Topic Interest

Something that can cause traffic to decline on an informational site is waning interest. Technological innovation can cause the popularity of another product to decline, dragging website traffic along with it. For example, I consulted for a website that reported its traffic was declining. The site still ranked for its keywords, but a quick look at Google Trends showed that interest in the website topic was declining. This was several months after the introduction of the iPhone, which negatively impacted a broad category of products that the website was centered on.

Always keep track of how interested your audience is in your topic. Follow influencers in your niche topic on social media to gauge what they are talking about and whether there are any shifts in the conversation that indicate waning interest or growing interest in a related topic.

Always try out new subcategories of your topic that cross over with your readership to see if there is an audience there that can be cultivated.

Another nuance to consider is the difference between temporary dips in interest and long-term structural decline. Some topics experience predictable cycles driven by seasons, economic conditions, or news coverage, while others face permanent erosion as user needs change or alternatives emerge. Misreading a cyclical slowdown as a permanent decline can lead to unnecessary pivots, while ignoring structural shifts can leave a site over-invested in content that no longer aligns with how people search, buy, or learn.

Monitoring topic interest is less about reacting to short-term fluctuations and more about staying aware of longer-term trends. By watching audience behavior, tracking broader shifts, and experimenting at the edges of the core topic, an informational site can adjust gradually rather than being forced into abrupt changes after traffic has already declined. This ongoing attention helps ensure that content decisions remain grounded in how interest evolves over time.

5. Differentiate

Something that happens to a lot of informational websites is that competitors in a topic tend to cover the exact same stories and even have similar styles of photos, about pages, and bios.

B2B software sites have images of people around a laptop, images of a serious professional, and people gesturing at a computer or a whiteboard.

Recipe sites feature the Flat Lay (food photographed from above), the Ingredient Still Life portrait, and action shots of ingredients being grated, sprinkled, or caught in midair.

Websites tend to converge into homogeneity in the images they use and the kind of content that’s shared, based on the idea that if it’s working for competitors, then it may be a good approach. But sometimes it’s best to step out of the pack and do things differently.

Evolve your images so that they stand out or catch the eye, try a different way of communicating your content, identify the common concept that everyone uses, and see if there’s an alternate approach that makes your site more authentic.

For example, a recipe site can show photographic bloopers or discuss what can go wrong and how to fix or avoid it. Being real is authentic. So why not show what underbaked looks like? Instagram and Pinterest are traffic drivers, but does that mean all images must be impossibly perfect? People might respond to the opposite of homogeneity and fake perfection.

The thing that’s almost always missing from product reviews is photos of the testers actually using the products. Is it because the reviews are fake? Hm… Show images of the products with proof that they’ve been used.

Takeaways

  • Sustainable traffic can be cultivated with a mix of evergreen and timely content. Find the balance that works for your website.
  • Evergreen content performs best when it is periodically refreshed with up-to-date details.
  • Outdated content that lacks utility or meaning in people’s lives can quietly grow to suppress site-wide performance. Old pages should be reviewed for usefulness and then archived, updated, redirected, or removed.
  • Audience interest in a topic can decline even if rankings remain stable. Monitoring search demand and cultural shifts helps publishers know when it’s time to expand into adjacent topics before traffic erosion becomes severe.
  • Differentiation matters as much as coverage. Sites that mirror competitors in visuals, formats, and voice risk blending into sameness, while original presentation and proof of authentic experience build trust and attention.

Search visibility declines are rarely caused by a single technical flaw or isolated content mistake, but by gradual misalignment between what a site publishes and what audiences continue to value. Sites that rely too heavily on fleeting interest, allow outdated material to accumulate, or follow competitors into visual and editorial homogeneity risk signaling mediocrity rather than relevance. Sustained performance depends on actively managing content: balancing evergreen coverage with current events, pruning what’s no longer useful, and making deliberate choices that distinguish the site as useful, authentic, and credible.

Featured Image by Shutterstock/Sergey Nivens

Signal Vs. Noise: Predicting Future Impact Of Content Marketing

This edited excerpt is from “B2B Content Marketing Strategy” by Devin Bramhall ©2025, and is reproduced and adapted with permission from Kogan Page Ltd.

Marketing can contribute to company growth in many different ways: net new sales, customer retention, reduced risk from competitors, new revenue streams (like events) that impact more than one company goal, bringing a product to market successfully, and feature adoption/upsells, to name a few.

The challenge marketers have is convincing multiple stakeholders that their work did, in fact, contribute to any of these areas. Even if you have goals and agreed-upon metrics to measure success, reporting on marketing ends up being fraught with all kinds of complications, from the political and interpersonal to gaps in stakeholders’ knowledge of marketing and of what counts as “impact” and “value” to the business.

The opportunity for marketers in this situation is that the people who need to be convinced don’t know what “the answer” to marketing attribution is either. They argue with each other about it behind closed doors and change their minds a lot, but they honestly can’t really prove anything better than you can. They just bought into some corollary model or made one up and have spent a ton of time campaigning internally and out in the world to make other people believe their way is correct, and eventually some of them do.

Predicting Future Impact

Most reporting focuses on what’s already happened – last month’s lead generation, last quarter’s revenue, or last year’s customer acquisition costs.

While historical data is crucial to making future decisions, it also keeps marketing leaders in a reactive position. By the time you identify a problem, it’s already affected your results. Leading indicators give you time to adjust course when needed, rather than explaining missed targets after the fact.

That’s why monitoring the signals along the way is also useful, if executed thoughtfully.

A few caveats:

  • Monitor quietly. You don’t have to share what you observe with your executives 1) at all, or 2) until you’re ready. They’ll either get confused or too excited, and neither leads to a good place for you.
  • Work with your data team. Whatever job title they’ve been given at your company, find the people who have access to the raw data and ask them questions. Be specific about what you want to know. You don’t have to know the exact data types, time periods, or segments. They just need a detailed question to get you what you need.
  • Talk it through. Since data contains multiple realities depending on how you slice it, I’ve always found it helpful to run any conclusions or stories by my data team and, where possible, my boss (see first bullet!). Basically, I look for two different analytical perspectives:
    • Someone whose job it is to ensure our data is accurate.
    • Someone whose job it is to analyze data for reporting on the business.

Remember: Reporting isn’t a single use-case activity. Reflecting on the past to measure impact is just one way to leverage reporting. Use it to inspire new ideas, optimizations, and experiments, too.

Read more: How To Write SEO Reports That Get Attention From Your CMO

A Few Potentially Useful Signals You Can Monitor

Ultimately, it’s up to you to determine which signals provide valuable insights into the performance of your marketing initiatives. And regardless of your role, whether it’s producer, manager, or team lead, as your boss, I’d expect you to know how to determine what those are.

Also, the exact signals you monitor will continue to change as technology and the internet evolve. However, there are a few informative signals that have stood the test of time (thus far) for me.

Resonance

When it comes to resonance, unprompted action on even a semiregular basis is a huge signal that something you’re doing is working. Even if your data is statistically insignificant, I’d lean in and, at the very least, conduct further experiments.

One example is folks sharing and referencing a topic or idea you’ve shared publicly in their own content on a semi-consistent basis (and their followers reacting to it). This indicates you’re at least on the right track with content direction.

In my experience, search volume for a keyword or phrase is minimally helpful in determining resonance in the beginning. As in, just because no one is searching for a topic doesn’t mean it’s not a common problem. A more useful exercise in search monitoring to me is whether your campaign corresponds with an increase in search volume in that time period.

Activity

The same principle applies to other actions as well. Are folks commenting on posts asking for your opinion on specific problems they are experiencing? Are you receiving anecdotal feedback semi-consistently on specific marketing initiatives or topics you’re investing in?

Do folks engage with your content even when you’re inconsistent? One client I worked with saw 60-70% open rates even on major holidays or when the newsletter was sent off-schedule on a Saturday or Monday.

Are you seeing an increase in time-on-page or pages per session from certain topics or even specific pieces?

Copycats

While not a perfect signal, if your competitors start copying your content, it’s either a sign you could be onto something or an indication that they don’t have a strategy of their own or that theirs isn’t working. In either case, it’s a signal worth paying attention to, and perhaps worth doing some recon to find out if there are any weaknesses you can exploit.

Ultimately, your goal is to explore these signals to establish whether there are correlations between these leading indicators and your ultimate business outcomes. This isn’t just theoretical – it requires analyzing your data to identify patterns that predict success for your business.
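As a rough sketch of that analysis, here is one way to test whether a leading indicator (say, unprompted shares per month) correlates with an outcome it should foreshadow (leads a month later). The numbers and the one-month lag are hypothetical; the point is the shape of the check, not the data.

```python
from statistics import mean

def lagged_correlation(leading, outcome, lag=1):
    """Pearson correlation between a leading indicator and a business
    outcome that follows `lag` periods later."""
    x = leading[:-lag] if lag else leading  # indicator values
    y = outcome[lag:]                       # outcomes shifted forward by `lag`
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical monthly data: unprompted shares (leading) vs. leads (outcome).
shares = [4, 6, 9, 7, 12, 15]
leads = [20, 22, 27, 35, 31, 44]

# A value near 1.0 suggests shares this month foreshadow leads next month.
print(round(lagged_correlation(shares, leads, lag=1), 2))
```

A strong correlation on its own doesn’t prove causation, but it tells you which signals are worth watching and which to ignore.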

Turning Measurement Into Mastery

Effective reporting isn’t the end of your marketing journey – it’s the bridge to your next phase of growth. Measuring the impact of content marketing isn’t just about proving its value; it’s about creating the leverage you need to execute strategies that genuinely move your business forward.

Remember these essential principles as you develop your measurement approach:

  • Numbers don’t tell stories – people do. Your data provides ingredients, but you create the meal. The most powerful reports transform complex metrics into clear narratives that inspire action and build confidence in your strategy.
  • Measurement serves strategy, not the other way around. When you begin with clear objectives and understand what truly influences behavior, metrics become tools for insight rather than constraints on creativity.
  • Reporting is campaigning. The most successful marketers recognize that performance reporting is ultimately a persuasion exercise – one that requires understanding audience motivations, building relationships, and consistently communicating value.
  • Both measurable and unmeasurable impacts matter. While focusing on quantifiable metrics, never lose sight of the equally valuable but harder-to-measure effects of brand building, relationship development, and community growth.

By developing measurement systems that capture both immediate impacts and leading indicators, you transform reporting from a dreaded obligation into a strategic advantage.

Summary: Practice And Persistence

As you apply these principles to your own marketing, remember that mastery comes through practice and persistence. You’ll make mistakes, discover unexpected insights, and continuously refine your approach. That’s not just normal – it’s the path to excellence.

To read the full book, SEJ readers have an exclusive 25% discount code and free shipping to the US and UK. Use promo code “SEJ25” at koganpage.com.

Featured Image: Igor Link/Shutterstock

From Organic Search To AI Answers: How To Redesign SEO Content Workflows via @sejournal, @rio_seo

It’s officially the end of organic search as we know it. A recent survey reveals that 83% of consumers believe AI-powered search tools are more efficient than traditional search engines.

The days of simple search are long gone, and a profound transformation continues to sweep the search engine results pages (SERPs). The rise of AI-powered answer engines, from ChatGPT to Perplexity to Google’s AI Overviews, is rewriting the rules of online visibility.

Instead of returning traditional blue links or images, AI systems are returning immediate results. For marketing leaders, the question is no longer “How do we rank number one?” but rather “How do we become the top answer?”

This shift has eliminated the distance between the search and the solution. No longer do customers need to click through to find the information they’re seeking. And while zero-click searches are more prevalent and old metrics like keyword rankings are fading fast, the change also creates a massive opportunity for chief marketing officers to redefine SEO as a strategic growth function.

Yes, content remains king, but it must be rooted in a foundation that fuels authority, brand trust, and authenticity to serve the systems that are shaping what appears when a search is conducted. This isn’t just a new channel; it’s a new way of creating, structuring, and validating content.

In this post, we’ll dissect how to redesign content workflows for generative engines to ensure your content reigns supreme in an AI-first era.

What Generative Engines Changed And Why “Traditional SEO” Won’t Recover

When users ask generative search engines a question, they aren’t presented with a list of websites to click through to learn more; instead, they’re given a quick, synthesized answer. The sources of the answer are cited, allowing users to click to learn more if they so choose. These citations are the new “rankings,” and they’re the links most likely to be clicked.

In fact, research shows 60% of consumers click through at least sometimes after seeing an AI-generated overview in Google Search. A separate study found that 91% of frequent AI users turn to popular large language models (LLMs) such as ChatGPT for their searching needs.

While keyword optimization still holds importance in content marketing, generative engines are favoring expertise, brand authority, and structured data. For CMOs, the old metrics no longer necessarily equate to success. Visibility and impressions are no longer tied to website traffic, and success is now contingent upon citations, mentions, and verifiable authority signals.

The AI era signals a serious identity shift, one in which traditional SEO collides with AI-driven search. SEO can no longer be a mechanical, straightforward checklist that sits under demand generation. It must integrate with a broader strategy to manage brand knowledge, ensuring that when AI pulls data to form an answer, your content is what the model trusts most out of all the options out there.

In this new search era, improving visibility can be measured in three diverse ways:

  • Appearing in results or answers.
  • Being seen as a thought leader in your space by being cited or trusted as a credible source.
  • Driving influence, affinity, or conversions from your digital presence.

Traditional SEO is now only one piece of the content visibility puzzle. Generative SEO demands fluency across all three.

The CMO’s New Dilemma: AI As Both Channel And Competitor

Consumers have questions. Generative engines have the answers. With over half (56%) of consumers trusting the use of Gen AI as an education resource, generative engines are now mediators between your brand and your customers. They can influence purchases or sway customers toward your competition, depending on whether your content earns their trust.

For example, if a customer asks, “What’s the best CRM for enterprise brands?” and an AI engine suggests HubSpot’s content over your brand, the damage isn’t just a lost click but a missed opportunity to garner interest and trust with that motivated searcher. The hard truth is the Gen AI model didn’t see your content as relevant or reliable enough to deliver in its answer.

Generative engines are trained on content that already exists, meaning your competitors’ content, user reviews, forum discussions, and your own material are all fair game. That means AI is both a discovery channel and a competitor for audience attention. CMOs must recognize this duality and invest in structuring, amplifying, and revamping content workflows to match Gen AI’s expectations. The goal isn’t to chase algorithms; it’s to shape your content in a meaningful way so those algorithms trust and view it as the single source of truth.

Think of it this way: Traditional SEO practices taught you to optimize content for crawlers. With Generative SEO, you’re optimizing for the model’s memory.

How To Redesign SEO Content Workflows For The Generative Era

To win citations and influence AI-generated answers, it’s time to throw out your old playbooks and overhaul previous workflows. That may mean ditching how you used to plan content and how you measured performance. Out with the old and in with the new (and more successful).

From Keyword Targeting To Knowledge Modeling

Generative models go beyond understanding just keywords. They understand entities and relationships, too. To show up in coveted AI answers and to be the top choice, your content must reflect structured, interconnected knowledge.

Start by building a brand knowledge graph that maps people, products, and topics that define your expertise. Schema markup is also a must to show how these entities connect. Additionally, every piece of content you produce should reinforce your position within that network.

Long-tail keywords may be easier to target and rank for in traditional SEO; however, optimizing for AI search requires a shift in content workflows, one that targets “entity clusters” instead. Here’s what this might look like in practice: A software company wouldn’t only optimize content around the focus keyword phrase “best CRM integrations.” The writer should also define that phrase’s relationship to related concepts such as “CRM,” “workflow automation,” and “customer data.”
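To make the entity-cluster idea concrete, here is a minimal sketch of the JSON-LD schema markup such an article might carry, generated with Python for readability. All names, titles, and URLs are illustrative placeholders; schema.org’s real `about` and `mentions` properties are used to tie the piece to its related entities.

```python
import json

# A minimal sketch of JSON-LD linking one article to an entity cluster.
# Every name, title, and URL here is an illustrative placeholder.
article_jsonld = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Best CRM Integrations for Enterprise Teams",
    "about": {  # the primary entity the piece is about
        "@type": "Thing",
        "name": "CRM",
        "sameAs": "https://en.wikipedia.org/wiki/Customer_relationship_management",
    },
    "mentions": [  # related entities that anchor the cluster
        {"@type": "Thing", "name": "workflow automation"},
        {"@type": "Thing", "name": "customer data"},
    ],
    "author": {  # credentials reinforce the authority signal
        "@type": "Person",
        "name": "Example Author",
        "jobTitle": "Head of CRM Strategy",
    },
}

print(json.dumps(article_jsonld, indent=2))
```

The exact properties you use will depend on your content type, but the principle holds: the markup should make the relationships between entities explicit rather than leaving them implied in prose.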

From Content Volume To Verifiable Authority

It was once thought that the more content, the better. That’s no longer the case, as AI systems prioritize content that’s well-sourced, attributable, and authoritative. The end game is no longer content velocity but producing stronger, more evidence-backed pieces.

Marketing leaders should create an AI-readiness checklist for their content marketing team to ensure every piece of content is optimized for generative engines. Every article should include author credentials (job title, advanced degrees, and certifications), clear citations (where the statistics or research came from), and verifiable claims. Reference independent studies and owned research where possible; AI models cross-validate multiple sources to determine what’s credible and reliable.

In short: Don’t publish faster. Publish smarter.

From Static Publishing To Dynamic Feedback

If one thing is certain, it’s that generative engines are continuing to evolve, much like traditional search. What ranks well today may change entirely tomorrow. That’s why successful SEO teams are adopting an agile publishing cycle to stay on top of what’s working best. SEO teams are actively and consistently:

  • Testing which questions their audience asks in generative engines.
  • Tracking whether their content appears in those answers.
  • Refreshing content based on what’s being cited, summarized, or ignored.

Several tools are emerging to help you track your brand’s presence across ChatGPT, Perplexity, AI Overviews, and more, including SE Ranking, Peec AI, Profound, and Conductor. If you choose to forgo tools, you can also run regular AI audits on your own to see how your brand is represented across engines by following the aforementioned framework. Treat that data like Search Console metrics and think of it as your new visibility report.

How To Measure SEO Success In An Answer-Driven World

Measuring SEO success across generative engines looks different than how we used to measure traditional SEO. Traffic will always matter, but it’s no longer the sole proof of impact. For CMOs, understanding how to measure marketing’s impact is essential to demonstrate the value your team delivers to the organization’s mission.

Here’s how progressive CMOs are redefining SEO success:

  • AI Citations: How often your content is referenced within AI-generated responses.
  • Answer Visibility Share: The percentage of relevant queries where your content appears in an AI answer.
  • Zero-Click Exposure: Instances where your brand is visible in AI responses, even if users don’t visit your site.
  • Answer Referral Traffic: The new “clicks”; visits that originate directly from AI-generated links.
  • Semantic Coverage: The breadth of related entities and subtopics your brand consistently appears for.

These metrics move SEO reporting from vanity numbers to visibility intelligence and are a more accurate representation of brand authority in the machine age.
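As a simple illustration of the first two metrics, here is how you might compute citation counts and Answer Visibility Share from a hand-collected audit log (queries you ran manually across engines, noting whether your brand was cited). The log entries below are invented, and no engine API is assumed.

```python
# Hypothetical audit log collected by hand from manual queries:
# (query, engine, was our brand cited in the answer?)
audit_log = [
    ("best CRM for enterprise", "ChatGPT", True),
    ("best CRM for enterprise", "Perplexity", False),
    ("CRM workflow automation", "ChatGPT", True),
    ("CRM data migration tips", "AI Overviews", False),
]

# AI Citations: how often our content was referenced in answers.
citations = sum(1 for _, _, cited in audit_log if cited)

# Answer Visibility Share: cited answers as a share of relevant queries audited.
visibility_share = citations / len(audit_log)

print(f"AI citations: {citations}")
print(f"Answer visibility share: {visibility_share:.0%}")
```

Run the same audit on a fixed query set each month and the share becomes a trend line you can report, much like you once reported average rank.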

Future-Proof Your SEO For Generative Search

Generative search is just as volatile as traditional search, but volatility is fertile ground for innovation. Instead of resisting it, CMOs should treat SEO as an experimental function: a sandbox for continuously testing new ways to be discovered and trusted. SEO isn’t a set-it-and-forget-it discipline; it must change with time and testing.

CMOs should encourage their teams to A/B test content formats, schema implementations, and even phrasing to see what appears in AI-generated responses. Cross-pollinate SEO insights with PR, product, and customer experience. When your organization learns how AI represents your brand, it becomes a feedback loop that strengthens everything from messaging to market positioning.

In the near future, the term “organic search” will become something broader to encompass the fast-growing ecosystem of machine-mediated discovery. The brands that succeed won’t just optimize for keywords. They’ll build long-lasting trust.

The Next Evolution Of Search

The notion that AI is killing SEO is false. AI isn’t eliminating SEO but rather redefining what it means today. What used to be a tactical discipline is shifting to become a more strategic approach that requires understanding how your brand exists within digital knowledge systems. It’s straying from what’s comfortable and moving into largely uncharted territory.

The opportunity for marketing leaders is clear: It’s time to move past the known and venture into the somewhat elusive realm of generative answer engines. After all, Forrester predicts AI-powered search will drive 20% of all organic traffic by the end of 2025. At the end of the day, many of the traditional SEO best practices still apply: create content that’s verifiable, well-structured, and context-rich. The main mindset shift lies in how to measure generative engine success, not by rankings but by relevance in conversation.

In the age of AI answers, your brand doesn’t need to just be searchable; it needs to be knowable.

Featured Image: Roman Samborskyi/Shutterstock

The Founder-Led Growth Loop: How To Amplify And Measure Executive Voice For Real ROI via @sejournal, @purnavirji

In the first two installments of this series, I’ve covered why founder-led marketing works and the systems you need to stay consistent, based on the playbook I co-authored for LinkedIn (my employer).

You’ve built the content engine and the operational frameworks to avoid burnout. Now comes the final, most critical part: proving it works.

Your founder provides the authentic voice. Your job as the marketer is to amplify that voice to the entire market and build the measurement framework that proves to the board, “This is working.”

This is how you turn a content strategy into a scalable, predictable, full-funnel growth loop.

Part 1: Amplify What’s Already Working

Your founder’s organic content is resonating, but it’s only reaching their first-degree network. Why guess what might work when you can use data to amplify what’s already working?

This is the most efficient paid strategy you can run, because paid works better when it’s built on trust. Our playbook data shows that startups whose directors post actively already generate 33% more leads through their paid campaigns.

Your secret weapon is Thought Leader Ads (TLAs).

TLAs are a LinkedIn ad format that lets you promote posts from individuals – founders, employees, even customers – rather than just your company page. They look and feel like organic posts: authentic, human, and scroll-stopping.

In general, TLAs are a high-performing format, resulting in 1.5x higher click-through rates (CTRs), 30% more efficient cost per click (CPC), and 2x follower growth.

Apply them to startups and the impact is even bigger:

  • 7.6x more engagement than any other paid ad format.
  • 5x higher video engagement with video TLAs than regular sponsored video ads.

This isn’t just a top-of-funnel awareness play. You can use TLAs to build a full-funnel machine:

  • Top-of-Funnel: Amplify your founder’s best “scar story” or “contrarian take” post to your entire Ideal Customer Profile.
  • Mid-Funnel: Retarget everyone who engaged with that TLA with a more direct offer, like a Conversation Ad or a Lead Gen Form for a webinar.
  • Bottom-of-Funnel: Add this engaged audience to your nurture sequences and track them as they become sales-qualified leads.

The foundation is your founder’s best organic posts. From there, you can plug them into a full-funnel paid strategy.

Part 2: Build The Measurement Framework

This strategy feels right, but you have to prove it.

The biggest challenge in founder-led marketing is that the most important metrics – trust, reputation, resonance – don’t show up on a simple dashboard. They show up in your deal velocity, your DMs, and the way people talk about you when you’re not in the room.

There are ways you can start to track these on LinkedIn. Let’s break it down.

First 90 Days: Track Leading Indicators

Validate whether your content is resonating before it drives pipeline:

  • Engagement quality: Comments from ideal customer profiles (ICPs), DMs received, reposts by peers.
  • Audience growth: Follower count, especially from target segments.
  • Conversation starters: Number of inbound messages or replies sparked by content.
  • Profile metrics: Track who’s viewing your profile after seeing your posts.

LinkedIn recently expanded its analytics for individual members, giving you more visibility into how your content performs. Under the “Analytics” tab, you can now track:

  • Profile views from a post.
  • Followers gained from a post.
  • Audience demographics (job title, industry, location).
  • Premium button clicks (if you have a custom CTA).

These metrics help you move beyond vanity metrics to start measuring resonance – what’s landing, with whom, and why.

What not to do: Obsess over engagement metrics, delete underperforming posts, or let your founder compare themself to established thought leaders. These habits will drain motivation before your systems are strong enough to carry them through the dip.

Next 90 Days: Track Momentum

Track how your content is influencing relationships and reputation:

  • Prospect mentions: Train your sales team to log every time a prospect mentions your founder’s content during calls.
  • Dark social mentions: Track when your content gets shared in private peer networks like Slack groups or email threads.
  • Content-influenced deals: Create a CRM field to tag every prospect who mentions your posts.

Scott Albro, TOPO founder, does this in Salesforce by creating a “content-influenced” deal stage and tagging every prospect who mentions posts, comments, or competitor reactions. Then he measures deal velocity and pipeline.
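A hedged sketch of what that analysis could look like once deals are tagged: the records below are invented, and the `content_influenced` field is a stand-in for whatever custom CRM field your team creates.

```python
from datetime import date

# Hypothetical CRM export. Each deal carries a rep-applied
# content_influenced flag, per the tagging approach described above.
deals = [
    {"opened": date(2025, 1, 5), "closed": date(2025, 2, 10), "content_influenced": True},
    {"opened": date(2025, 1, 12), "closed": date(2025, 3, 30), "content_influenced": False},
    {"opened": date(2025, 2, 1), "closed": date(2025, 2, 25), "content_influenced": True},
    {"opened": date(2025, 2, 9), "closed": date(2025, 4, 20), "content_influenced": False},
]

def avg_cycle_days(subset):
    """Average days from open to close across a group of deals."""
    return sum((d["closed"] - d["opened"]).days for d in subset) / len(subset)

influenced = [d for d in deals if d["content_influenced"]]
others = [d for d in deals if not d["content_influenced"]]

# Share of pipeline that mentioned content, and deal velocity by group.
share = len(influenced) / len(deals)
print(f"Content-influenced share of pipeline: {share:.0%}")
print(f"Avg cycle (influenced): {avg_cycle_days(influenced):.0f} days")
print(f"Avg cycle (other): {avg_cycle_days(others):.0f} days")
```

Comparing the two cycle averages is the velocity signal; the share figure is what you would hold up against a benchmark like the 20% rule mentioned later in this piece.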

Irina Novoselsky, CEO of Hootsuite, shared her results in the playbook: “I just did the math on my daily LinkedIn commitment over the last 3 months—10M+ impressions generated. But most importantly, 37% of our monthly leads are influenced by my social presence.”

Her team saw measurable business impact:

  • Executive presence was mentioned more frequently in sales calls in Q1 2025 than in all of 2024.
  • Deals closed faster when buyers referenced her content.
  • Enterprise opportunities influenced by her social presence had higher ACV.

Kacie Jenkins, former SVP of Marketing at Sendoso, found that when a prospect followed one of their Director+ executives on LinkedIn, they saw 11% higher win rates and 120% larger closed-won deal sizes.

Peep Laja, CEO of Wynter, tracks self-reported attribution: “About 80% of people signing up for Wynter or scheduling a demo say they found me on LinkedIn.”

6 Months Onwards: Business Impact Metrics

Track your lagging indicators:

  • Increasing inbound pipeline: Gal Aga’s rule is “if 20%+ of your pipeline mentions your content, you’ve won.”
  • Increasing deal velocity: Deals with content-influenced leads close faster due to pre-established trust.
  • Attracting talent: Job applicants cite your posts.
  • Owning your category: You’re increasingly referenced in industry conversations.

Connect The Paid Loop

This final step connects amplification and measurement. How do you prove your TLA spend is driving revenue?

Use LinkedIn’s Conversions API (CAPI) to connect your CRM and website data directly to LinkedIn. This gives you visibility into offline actions and helps you attribute pipeline.

LinkedIn’s revenue attribution tools let you measure impact at the business, campaign, and company level. One tech company using revenue attribution found 36% higher win rates and 37% shorter deal cycles.

Startup advisor Canberk Beker sums it up: “When founders connect their organic presence to paid strategy – and measure both direct and influenced pipeline – they see outsized ROI. We’ve proven that TLAs lift demo requests and drive cross-channel conversions.”

Your Role As The Growth Multiplier

A founder-led strategy is a game-changer for sales and marketing.

Your founder’s job is to be the authentic voice. Your job as the marketer is to build the machine around them.

By connecting an authentic organic strategy with a high-powered amplification lever and a sophisticated measurement framework, you create a complete growth loop.

This is the modern marketing engine, one that builds trust at scale and proves its impact on the bottom line.

All data, quotes, and examples cited above without a source link are taken from the “Founder-Led Sales and Marketing Never Ends” playbook.

Featured Image: eamesBot/Shutterstock