A security vulnerability was discovered in the popular All in One SEO (AIOSEO) WordPress plugin that made it possible for low-privileged users to access a site’s global AI access token, potentially allowing attackers to generate content or consume credits through the affected site’s AIOSEO AI features. The plugin is installed on more than 3 million WordPress websites, making the exposure significant.
All in One SEO WordPress Plugin (AIOSEO)
All in One SEO is one of the most widely used WordPress SEO plugins, installed on over 3 million websites. It helps site owners manage search engine optimization tasks such as generating metadata, creating XML sitemaps, adding structured data, and providing AI-powered tools that assist with writing titles, descriptions, blog posts, FAQs, and social media posts, and with generating images.
Those AI features rely on a site-wide AI access token that allows the plugin to communicate with AIOSEO’s external AI services.
Missing Capability Check
According to Wordfence, the vulnerability was caused by a missing permission check on a specific REST API endpoint used by the plugin, which enabled users with Contributor-level access to view the global AI access token.
In the context of a WordPress website, an API (Application Programming Interface) is like a bridge between the WordPress website and other software applications (including external services like AIOSEO’s AI content generator), enabling them to securely communicate and share data with one another. A REST endpoint is a URL that exposes an interface to functionality or data.
The flaw was in the following REST API endpoint:
/aioseo/v1/ai/credits
That endpoint is meant to return information about a site’s AI usage and remaining credits. However, the plugin failed to perform a capability check to verify whether the user making the request, such as someone logged in with Contributor-level access, was actually allowed to see that data.
Because of that, any logged-in user with Contributor-level access or higher could call the endpoint and retrieve the site’s global AI access token.
“This makes it possible for authenticated attackers, with Contributor-level access and above, to disclose the global AI access token.”
In WordPress, REST API routes are supposed to include capability checks that ensure only authorized users can access them. In this case, that check was missing, so the plugin treated Contributors the same as administrators when returning the AI token.
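In WordPress, a route’s authorization is normally enforced through a permission callback supplied when the route is registered; when that callback is missing, every authenticated user passes. The sketch below models that pattern in Python rather than actual WordPress or AIOSEO code, so the role names, capabilities, and handler are illustrative assumptions:

```python
# Minimal model of a REST route registry with per-route permission checks.
# Illustrative sketch only -- not WordPress or AIOSEO source code.

ROLE_CAPS = {
    "contributor": {"edit_posts"},
    "administrator": {"edit_posts", "manage_options"},
}

def user_can(role, capability):
    """Return True if the given role has the given capability."""
    return capability in ROLE_CAPS.get(role, set())

routes = {}

def register_route(path, handler, permission_callback=None):
    routes[path] = (handler, permission_callback)

def dispatch(path, role):
    handler, perm = routes[path]
    # A missing permission_callback means every authenticated user
    # gets through -- the class of flaw described above.
    if perm is not None and not perm(role):
        return {"status": 403}
    return {"status": 200, "body": handler()}

def credits_handler():
    return {"ai_token": "site-wide-secret", "credits_remaining": 120}

# Vulnerable registration: no permission check, so a Contributor sees the token.
register_route("/aioseo/v1/ai/credits", credits_handler)
print(dispatch("/aioseo/v1/ai/credits", "contributor")["status"])  # 200

# Hardened registration: require an admin-level capability.
register_route("/aioseo/v1/ai/credits", credits_handler,
               permission_callback=lambda role: user_can(role, "manage_options"))
print(dispatch("/aioseo/v1/ai/credits", "contributor")["status"])  # 403
```

The fix shipped in 4.9.3 amounts to the second registration pattern: the endpoint stays available, but only to users whose capabilities justify access.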
Why The Vulnerability Is Problematic
In WordPress, the Contributor role is one of the lowest privilege levels. Many sites grant Contributor access to multiple people so that they can submit article drafts for review and publication.
By exposing the global AI token to those users, the plugin may have effectively handed out a site-wide credential that controls access to its AI features. That token could be used to:
1. Unauthorized AI usage: The token functions as a site-wide credential that authorizes AI requests. If an attacker obtains it, they could potentially use it to generate AI content through the affected site’s account, consuming whatever credits or usage limits are associated with that token.
2. Service depletion: An attacker could automate requests using the exposed token to exhaust the site’s available AI quota. That would prevent site administrators from using the AI features they rely on, effectively creating a denial of service for the plugin’s AI tools.
Even though the vulnerability does not allow direct code execution, leaking a site-wide API token still represents a possible billing risk.
Part Of A Broader Pattern Of Vulnerabilities
This is not the first time All In One SEO has shipped with vulnerabilities related to missing authorization or low-privilege access. According to Wordfence, the plugin has had six vulnerabilities disclosed in 2025 alone, many of which allowed Contributor or Subscriber level users to access or modify data they should not have been able to access.
Those issues included SQL injection, information disclosure, arbitrary media deletion, missing authorization checks, sensitive data exposure, and stored cross-site scripting. The recurring theme across those reports is improper permission enforcement for low-privilege users, the same underlying class of flaw that led to the AI token exposure in this case.
Six vulnerabilities in one year is a high number for an SEO plugin. By comparison, the Yoast SEO plugin had zero disclosed vulnerabilities in 2025, RankMath had four, and Squirrly SEO had three.
Screenshot Of Six AIOSEO Vulnerabilities In 2025
How The Vulnerability Was Fixed
The vulnerability affects all versions of All in One SEO up to and including 4.9.2. It was addressed in version 4.9.3, which included a security update described in the official plugin changelog as:
“Hardened API routes to prevent AI access token from being exposed.”
That change corresponds directly to the REST API flaw identified by Wordfence.
What Site Owners Should Do
Anyone running All in One SEO should update to version 4.9.3 or newer as soon as possible. Sites that allow multiple external contributors are especially exposed since low-privilege accounts could access the site’s AI token on vulnerable versions.
Featured Image by Shutterstock/Shutterstock AI Generator
The Reuters Institute for the Study of Journalism has published its annual predictions report based on a survey of 280 senior media leaders across 51 countries and territories.
The report suggests publishers are preparing for two potential threats: generative AI tools, and creators who attract audiences with personality-led formats.
Note that the Reuters Institute survey reflects a strategic group of senior leaders. It’s not a representative sample of the entire industry.
What The Report Found
Search Traffic Is The Biggest Near-Term Concern
Survey respondents expect search engine traffic to decline by more than 40% over the next three years as AI-driven answers expand.
The report cites Chartbeat data showing aggregate Google Search traffic to hundreds of news sites has already started to dip. Lifestyle-focused publishers say they’ve been hit especially hard by Google’s AI Overviews rollout.
That comes on top of longer-running platform declines. The report notes referral traffic to news sites from Facebook fell 43% over the last three years, while referrals from X fell 46% over the same period.
Publishers Plan To Invest In Differentiation
In response to traffic pressure and AI summarization, publishers say they’ll invest more in original investigations, on-the-ground reporting, contextual analysis, and human stories.
Leaders surveyed say they plan to scale back service journalism and evergreen content, which many expect AI chatbots to commoditize.
Video & Off-Platform Distribution Rising
Publishers expect to invest more in video, including “watch tabs,” and more in audio formats such as podcasts. Text output is less of a priority.
On distribution, YouTube is the main off-platform channel cited in the report, alongside TikTok and Instagram.
Publishers are also trying to work out how to navigate distribution through AI platforms such as OpenAI’s ChatGPT, Google’s Gemini, and Perplexity.
Subscriptions Lead, Licensing Is Growing
For commercial publishers, paid models such as subscriptions and memberships are the top focus. There’s also renewed interest in native advertising and face-to-face events as publishers look for revenue beyond traditional display ads.
Publishers are also looking at licensing and other platform payments. The report notes interest in platform funding has nearly doubled over the last two years as AI companies began offering large deals.
Why This Matters
I’ve watched publishers cycle through traffic crises before. When Facebook’s algorithm changes hit in 2018, the industry scrambled, and eventually most publishers adjusted by leaning harder into search. Search was supposed to be the stable channel.
That assumption is what this report challenges. A projected decline of 40%+ over three years has become a planning number, affecting budgets, headcount, and content strategy.
The content mix change warrants attention. When 280 senior media leaders say they’re scaling back service journalism and evergreen content, it signals which pages they think will still drive traffic in an AI-summarized environment. Original reporting and analysis survive because chatbots can’t replicate them. Commodity information doesn’t, because it can be synthesized without a click.
The doubling of interest in licensing deals over two years is the other number that jumped out to me. When AI companies started writing checks, the conversation changed from “should we license” to “what’s our leverage.”
This report is useful as a benchmark for where the industry’s head is at, even if individual outcomes vary.
Looking Ahead
Traffic from search and AI aggregators is unlikely to disappear, but the terms of trade are still being negotiated.
That includes how citations work, what licensing looks like at scale, and whether revenue-sharing becomes a standard arrangement.
Most people have a favorite coffee mug. You reach for it without thinking. It fits your hand. It does its job. For a long time, SEO felt like that mug. A defined craft, a repeatable routine, a discipline you could explain in a sentence. Crawl the site. Optimize the pages. Earn visibility. Somewhere along the way, that single mug turned into a cabinet full of cups. Each one different. Each one required – none of them optional anymore.
That shift did not happen because SEO got bloated or unfocused. It happened because discovery changed shape.
SEO did not become complex on its own. The environment around it fractured, multiplied, and layered itself. SEO stretched to meet it.
Image Credit: Duane Forrester
The SEO Core Still Exists
Despite everything that has changed, SEO still has a core. It is smaller than many people remember, but it is still essential.
This core is about access, clarity, and measurement. Search engines must be able to crawl content, understand it, and present it in a usable way. Google’s own SEO Starter Guide still frames these fundamentals clearly.
Crawl and indexing remain foundational. If content cannot be accessed or stored, nothing else matters. Robots.txt governance follows a formal standard, RFC 9309, which defines how crawlers interpret exclusion rules. This matters because robots.txt is guidance, not enforcement. Misuse can create accidental invisibility.
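One way to see why robots.txt is guidance rather than enforcement: the rules are parsed and honored entirely on the client side, so a misbehaving crawler can simply ignore them. Python’s standard-library parser makes the point; the rules, crawler name, and URLs below are made up for illustration:

```python
from urllib import robotparser

# A small robots.txt, parsed in memory. The parser only *interprets* the
# exclusion rules -- nothing prevents a client from fetching disallowed
# paths anyway, which is why robots.txt cannot enforce invisibility.
rules = """
User-agent: *
Disallow: /private/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# A compliant crawler checks before requesting:
print(rp.can_fetch("MyCrawler", "https://example.com/private/page"))  # False
print(rp.can_fetch("MyCrawler", "https://example.com/public/page"))   # True
```

The same parsing logic, applied to a rule that was written too broadly, is how sites end up accidentally invisible: the crawler obediently skips content the owner never meant to block.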
Page experience is no longer optional. Core Web Vitals represent measurable user experience signals that Google incorporates into Search. The broader framework and measurement approach are documented on Web.dev.
Content architecture still matters. Pages must map cleanly to intent. Headings must signal structure. Internal links must express relationships. Structured data still plays a role in helping machines interpret content and enabling eligible rich results today.
Measurement and diagnostics remain part of the job. Search Console, analytics, and validation tools still anchor decision-making for traditional search.
That is the SEO core. It is real work, and it is not shrinking. It is, however, no longer sufficient on its own.
This first ring out from the core is where SEO stops being a single lane.
Once the core is in place, modern SEO immediately runs into systems it does not fully control. This is where the real complexity starts to expand.
AI Search And Answer Engines
AI systems now sit between content and audience. They do not behave like traditional search engines. They summarize, recommend, and sometimes cite. Critically, they do not agree with each other.
In mid-2025, BrightEdge analyzed brand recommendations across ChatGPT, Google AI experiences, and other AI-driven interfaces and found that they disagreed on brand recommendations for 62% of queries. Search Engine Land covered the same analysis and framed it as a warning for marketers assuming consistency across AI search experiences.
This introduces a new kind of SEO work. Rankings alone no longer describe visibility. Practitioners now track whether their brand appears in answers, which pages are cited when citations exist, and how often competitors are recommended instead.
This is not arbitrary. Retrieval-augmented generation exists precisely to ground AI responses in external sources and improve factual reliability. The original RAG paper outlines this architecture clearly.
That architectural choice pulls SEO into new territory. Content must be written so it can be extracted without losing meaning. Ambiguity becomes a liability. Sections must stand alone.
Chunk-Level Content Architecture
Pages are no longer the smallest competitive unit. Passages are. And while practitioners have been told not to focus on chunks for traditional search, traditional search is no longer the only game in town; outside of it, you need to understand the role chunks play.
Modern retrieval systems often pull fragments of content, not entire documents. That forces SEOs to think in chunks. Each section needs a single job. Each answer needs to survive without surrounding context.
This changes how long-form content is written. It does not eliminate depth. It demands structure. We now live in a hybrid world where both layers of the system must be served. It means more work, but selecting one over the other? That’s a mistake at this point.
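As a rough sketch of what chunk-level thinking means in practice, the hypothetical splitter below treats each heading-delimited section as a standalone retrieval unit. The function and sample document are illustrative assumptions, not any particular retrieval system’s behavior:

```python
import re

def chunk_by_headings(markdown_text):
    """Split a markdown document into heading-delimited chunks.

    Each chunk keeps its own heading so it can be retrieved and
    understood without the surrounding document -- the property
    retrieval systems reward.
    """
    # Split immediately before each line that starts with '#' heading markers.
    parts = re.split(r"(?m)^(?=#{1,6} )", markdown_text)
    return [p.strip() for p in parts if p.strip()]

doc = """# Widgets Guide
Intro text.

## Installing widgets
Run the installer.

## Removing widgets
Use the uninstaller.
"""

chunks = chunk_by_headings(doc)
print(len(chunks))                      # 3
print(chunks[1].splitlines()[0])        # ## Installing widgets
```

Writing so that each chunk "needs a single job" means every one of those sections should still make sense if it is the only thing a retrieval system pulls.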
Visual Search
Discovery increasingly starts with cameras. Google Lens allows users to search what they see, using images as queries. Pinterest Lens and other visual tools follow the same model.
This forces new responsibilities. Image libraries become strategic assets. Alt text stops being a compliance task and becomes a retrieval signal. Product imagery must support recognition, not just aesthetics.
Voice Search
Voice changes how people ask questions and what kind of answers they accept. Queries become more conversational, more situational, and more task-focused.
Industry research compiled by Marketing LTB shows that a meaningful portion of users now rely on voice input, with multiple surveys indicating that roughly one in four to one in three people use voice search, particularly on mobile devices and smart assistants.
That matters less as a headline number and more for what it does to query shape. Spoken queries tend to be longer, more natural, and framed as requests rather than keywords. Users expect direct, complete answers, not a list of links.
And the biggest search platform is reinforcing this behavior. Google has begun rolling out conversational voice experiences directly inside Search, allowing users to ask follow-up questions in real time using speech. The Verge covered Google’s launch of Search Live, which turns search into an ongoing dialogue rather than a single query-response interaction.
For SEO practitioners, this expands the work. It pulls them into spoken-language modeling, answer-first content construction, and situational phrasing that works when read aloud. Pages that perform well in voice and conversational contexts tend to be clear, concise, and structurally explicit, because ambiguity collapses quickly when an answer is spoken rather than scanned. Still think traditional SEO approaches are all you need?
Personalization And Context
There is no single SERP. Google explains that search results vary based on factors including personalization, language, and location.
For practitioners, this means rankings become samples, not truths. Monitoring shifts toward trends, segments, and outcome-based signals rather than position reports.
Image Credit: Duane Forrester
The third ring is where complexity becomes really visible.
These are not just SEO tasks; this layer contains entire disciplines that SEO now interfaces with.
Brand Protection And Retrieval In An LLM World
Brand protection used to be a communications problem. Today, it is also a retrieval problem.
Large language models do not simply repeat press releases or corporate messaging. They retrieve information from a mixture of training data, indexed content, and real-time sources, then synthesize an answer that feels authoritative, whether it is accurate or not.
This creates a new class of risk. A brand can be well-known, well-funded, and well-covered by media, yet still be misrepresented, outdated, or absent in AI-generated answers.
Unlike traditional search, there is no single ranking to defend. Different AI systems can surface different descriptions, different competitors, or different recommendations for the same intent. That BrightEdge analysis showing 62% disagreement in brand recommendations across AI platforms illustrates how unstable this layer can be.
This is where SEO is pulled into brand protection work.
SEO practitioners already operate at the intersection of machine interpretation and human intent. In an LLM environment, that skill set extends naturally into brand retrieval monitoring. This includes tracking whether a brand appears in AI answers, how it is described, which sources are cited when citations exist, and whether outdated or incorrect narratives persist.
PR and brand teams are not historically equipped to do this work. Media monitoring tools track mentions, sentiment, and coverage. They do not track how an AI model synthesizes a brand narrative, nor how retrieval changes over time.
As a result, SEO increasingly becomes the connective tissue between brand, PR, and the machine layer.
This does not mean SEO owns brand. It means SEO helps ensure that the content machines retrieve about a brand is accurate, current, and structured in ways retrieval systems can use. It means working with brand teams to align authoritative sources, consistent terminology, and verifiable claims. It means working with PR teams to understand which coverage reinforces trust signals that machines recognize, not just headlines humans read.
In practice, brand protection in AI search becomes a shared responsibility, with SEO providing the technical and retrieval lens that brand and PR teams lack, and brand and PR providing the narrative discipline SEO cannot manufacture alone.
This is not optional work. As AI systems increasingly act as intermediaries between brands and audiences, the question is no longer “how do we rank?” It is “how are we being represented when no one clicks at all?”
Branding And Narrative Systems
Branding is not a subset of SEO. It is a discipline that includes voice, identity, reputation, executive presence, and crisis response.
SEO intersects with branding because AI systems increasingly behave like advisors, recommending, summarizing, and implicitly judging.
Trust matters more in that environment. The Edelman Trust Barometer documents declining trust across institutions and brands, reinforcing why authority can no longer be assumed. Trust diminishes, and consumer behavior changes. The equation is no longer brand = X, therefore X = brand.
SEO practitioners now care about sourcing, claims, and consistency because brand perception can now influence whether content is surfaced or ignored.
UX And Task Completion
Clicks are no longer the win. Completion is.
Though the research is older, it remains applicable. Nielsen Norman Group defines success rate as a core usability metric, measuring whether users can complete tasks. They also outline usability metrics tied directly to task efficiency and error reduction.
When AI and zero-click experiences compress opportunities, the pages that do earn attention must deliver. SEO now has a stake in friction reduction, clarity, and task flow. CRO (conversion rate optimization) has never been more important, but how you define “conversion” has also never been broader.
Paid Media, Lifecycle, And Attribution
Discovery spans organic, AI answers, video feeds, and paid placements. Measurement follows the same fragmentation.
Google Analytics defines attribution as assigning credit across touchpoints in the path to conversion.
SEO practitioners are pulled into cross-channel conversations not because they want to own them, but because outcomes are shared. Organic assists paid. Email creates branded demand. Paid fills gaps while organic matures.
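As a toy illustration of the attribution logic described above, a linear model simply splits conversion credit evenly across every touchpoint in the path. The channel names and values are hypothetical, and real platforms offer more sophisticated models (data-driven, position-based, and so on):

```python
def linear_attribution(touchpoints, conversion_value):
    """Spread conversion credit evenly across every touchpoint in
    the path -- one of the simplest attribution models. Assumes
    each channel appears at most once in the path."""
    share = conversion_value / len(touchpoints)
    return {channel: round(share, 2) for channel in touchpoints}

# Hypothetical path: AI answer -> organic search -> email -> paid click
path = ["ai_answer", "organic", "email", "paid"]
print(linear_attribution(path, 100.0))
# {'ai_answer': 25.0, 'organic': 25.0, 'email': 25.0, 'paid': 25.0}
```

Even this crude model makes the cross-channel point: organic's contribution only shows up when credit is shared across the path, not when the last click takes everything.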
Generational And Situational Behavior
Audience behavior is not uniform. Pew Research Center’s 2025 research on teens, social media, and AI chatbots shows how discovery and engagement increasingly differ across age groups, platforms, and interaction modes, including traditional search, social feeds, and AI interfaces.
This shapes format expectations. Discovery may happen in video-first environments. Conversion may happen on the web. Sometimes the web is skipped entirely.
What This Means For SEO Practitioners
SEO did not become more complex because practitioners lost discipline or focus; it became more complex because discovery fractured. The work expanded because the interfaces expanded. The inputs multiplied. The outputs stopped behaving consistently.
In that environment, SEO stopped being a function you execute and became a role you play inside a system you do not fully control, and that distinction matters.
Much of the anxiety practitioners feel right now comes from being evaluated as if SEO were still a closed loop. Rankings up or down. Traffic in or out. Conversions attributed cleanly. Those models assume a world where discovery happens in one place and outcomes follow a predictable path.
That is no longer the world we’re operating in.
Today, a user might encounter a brand inside an AI answer, validate it through a video platform, compare it through reviews surfaced in search, and convert days later through a branded query or a direct visit. In many cases, no single click tells the story. In others, there is no click at all.
This is why SEO keeps getting pulled into UX conversations, brand discussions, PR alignment, attribution debates, and content format decisions. Not because SEO owns those disciplines, but because SEO sits closest to the fault lines where discovery breaks or holds.
This is also why trying to “draw a box” around SEO keeps failing.
You can still define an SEO core, and you should. Crawlability, performance, content architecture, structured data, and measurement remain non-negotiable. But pretending the job ends there creates a gap between responsibility and reality. When visibility drops, or when AI answers misrepresent a brand, or when traffic declines despite strong fundamentals, that gap becomes painfully visible.
What’s changed is not the importance of SEO, but the nature of its influence.
Modern SEO operates as an integration discipline. It connects systems that were never designed to work together. It translates between machines and humans, between intent and interface, between brand narrative and retrieval logic. It absorbs volatility from platforms so organizations don’t have to feel it all at once.
That does not mean every SEO must take on every cup in the cabinet. It does mean understanding what those cups contain, which ones you own, which ones you influence, and which ones you simply need to account for when explaining outcomes.
The cabinet is already there, and you can choose to keep reaching for a single familiar mug and accept increasing unpredictability. Or you can open the cabinet deliberately, understand what’s inside, and decide how much of the expanded role you’re willing to take on.
Either choice is valid, but pretending everything still fits in one cup is no longer an option.
“Build a brand” has become one of the most repeated phrases in SEO over the past year. It is offered as both diagnosis and cure. If traffic is declining, build a brand. If large language models are not citing you, build a brand. If organic performance is unstable, build a brand.
The problem is not that this advice is wrong. The problem is that it is incomplete, and for many SEOs, it is not actionable.
A large proportion of people working in SEO today have developed in an environment that rewarded channel depth rather than marketing breadth. They understand crawling, indexing, content templates, internal linking, and ranking systems extremely well. What they have often not been trained in is how demand is created, how brands are formed in the mind, or how different marketing channels reinforce one another over time.
So, when the instruction becomes “build a brand,” the obvious question follows. What does that actually mean in practice, and what happens after you say the words?
SEO Is Not A Direct Demand Generator
Search has always been a demand capture channel rather than a demand creation channel. SEO does not usually make someone want something they did not already want. It places a brand in front of existing intent and attempts to win preference at the moment of consideration.
What SEO can do very effectively is increase mental availability. By being visible across a wide range of non-branded queries, a website creates repeated brand touchpoints. Over time, those touchpoints can contribute to familiarity, preference, and eventually loyalty.
The important part of that sentence is “over time.”
Affinity and loyalty are not short-term outcomes. They are built through repeated exposure, consistency of messaging, and relevance across different contexts. SEO can support this process, but it cannot compress it. No amount of optimization can turn visibility into trust overnight.
AI Has Changed The Pressure, Not The Fundamentals
AI has introduced new technical and behavioral challenges, but it has also created urgency at the executive level. Boards and leadership teams see both risk and opportunity, and the result is pressure. Pressure to act quickly, to be visible in new surfaces, and to avoid being left behind.
In reality, this is one of the most significant visibility opportunities since the mass adoption of social media. But like social media, it rewards those who understand distribution, reinforcement, and timing, not just production.
Where Content And Digital PR Actually Fit
Content and digital PR are often positioned as the vehicles for brand building in search. That framing is not wrong, but it is frequently too vague to be useful.
Google has been clear, including in recent Search Central discussions, that strong technical foundations still matter. Good SEO is a prerequisite to performance, not a nice-to-have. Content and digital PR sit within that system because they create the signals that justify deeper crawling, more frequent discovery, and sustained visibility. Both content and digital PR can be dissected further based on tactical objectives, but at the core, the objective is the same.
Search demand does not appear out of nowhere. It grows when topics are discussed, linked, cited, and repeated across the web. Digital PR contributes to this by placing ideas and assets into wider ecosystems. Content supports it by giving those ideas a constant home that search engines can understand and return to users.
This is not brand building in the abstract sense; it is visibility building.
Strong Visibility Content Accelerates Brand Building
Well-executed SEO content plays a critical role in brand building precisely because it operates at the point of repeated exposure. When a brand consistently appears for high-intent, non-branded queries, it earns familiarity before it ever earns loyalty.
Visibility-led content does not need to be overtly promotional to do this work. In many cases, its impact is stronger when it is practical, authoritative, and clearly written for the user rather than for the brand. Over time, this consistency creates an association between the problem space and the brand itself.
This is where many brand discussions lose precision. Brand is not only shaped by creative campaigns or opinion pieces. It is shaped by whether a brand reliably shows up with useful answers when someone is trying to understand a topic, solve a problem, or make a decision.
Strong SEO content compounds over time, and each ranking page reinforces the others. An example is some work I did with Cloudflare back in mid-2017: a content hub, positioned as a “learning center,” that we developed and rolled out a section at a time has compounded over the years to millions of organic visits and more than 30,000 backlinks.
Image from author, January 2026
Each impression adds to mental availability, and each return visit subtly shifts perception from unfamiliar to known. This is slow work, but it is measurable and durable, and it builds signals over time, through Chrome among other sources, that in turn begin to feed its own growth.
In this sense, SEO content is not separate from brand building. It is one of the few channels where brand perception can be shaped at scale, repeatedly, and in moments of genuine user need.
Thought Leadership Without Readership Is A Vanity Project
Thought leadership content has real value, but only under specific conditions. It needs an audience, a distribution strategy, and a feedback loop.
One of the most common patterns seen over the years is organizations investing heavily in senior-led opinion pieces, vision statements, or industry commentary, and then assuming impact by default.
When performance is examined properly, using analytics platforms or marketing automation data, it often becomes clear that very few people are actually reading the content.
If nobody is consuming it, it is not thought leadership. It is publishing for internal reassurance.
This is not an argument against opinion-led content. It is an argument for accountability. Content should earn its place by contributing to visibility, engagement, or downstream commercial outcomes, even if those outcomes sit higher in the funnel.
That requires measurement beyond pageviews. It requires understanding how content is discovered, how it is referenced elsewhere, how it supports other assets, and whether it creates repeat exposure over time.
Balancing Brand And Search Visibility
The current challenge for SEOs is not choosing between brand building and visibility building. It is learning how to balance the two without confusing them.
Brand is the outcome of repeated, coherent experiences. Visibility is the mechanism that makes those experiences possible at scale. You cannot shortcut one with the other, and you cannot treat them as interchangeable.
For practitioners who have grown up inside SEO, this means expanding beyond the channel without abandoning its discipline. It means understanding distribution as well as creation, signals as well as stories, and measurement as well as messaging.
The future does not belong to those who simply declare themselves a brand. It belongs to those who understand how visibility compounds, how trust is earned gradually, and how SEO fits into a much wider system of influence.
Building a brand is not the answer. It is the work that begins once the question has finally been asked properly.
Massive structures, with thousands of specialized computer chips running in parallel to perform the complex calculations required by advanced AI models. A single facility can cover millions of square feet; be built with millions of pounds of steel, aluminum, and concrete; feature hundreds of miles of wiring connecting hundreds of thousands of high-end GPU chips; and chew through hundreds of megawatts of electricity. These facilities run so hot from all that computing power that their cooling systems are triumphs of engineering complexity in themselves. But the stars of the show are those chips with their advanced processors. A single chip in these vast arrays can cost upwards of $30,000. Racked together and working in concert, they process hundreds of thousands of tokens, the basic building blocks of an AI model, per second. Ooooomph.
Given the incredible amounts of capital that the world’s biggest companies have been pouring into building data centers, you can make the case (and many people have) that their construction is single-handedly propping up the US stock market and the economy.
So important are they to our way of life that none other than the President of the United States himself, on his very first full day in office, stood side by side with the CEO of OpenAI to announce a $500 billion private investment in data center construction.
Truly, hyperscale data centers are a marvel of our age. A masterstroke of engineering across multiple disciplines. They are nothing short of a technological wonder.
So, let’s go to Georgia. The purplest of purple states. A state with both woke liberal cities and MAGA magnified suburbs and rural areas. The state of Stacey Abrams and Newt Gingrich. If there is one thing just about everyone there seemingly agrees on, it’s that they’ve had it with data centers.
Last year, the state’s Public Service Commission election became unexpectedly tight, and wound up delivering a stunning upset to incumbent Republican commissioners. Although there were likely shades of national politics at play (voters favored Democrats in an election cycle where many things went that party’s way), the central issue was skyrocketing power bills. And that power bill inflation was oft-attributed to a data center building boom rivaled only by Virginia’s.
This boom did not come out of the blue. At one point, Georgia wanted data centers. Or at least, its political leadership did. In 2018 the state’s General Assembly passed legislation that provided data centers with tax breaks for their computer systems and cooling infrastructure, more tax breaks for job creation, and even more tax breaks for property taxes. And then… boom!
But things have not played out the way the Assembly and other elected officials may have expected.
Journey with me now to Bolingbroke, Georgia. Not far outside of Atlanta, in Monroe County (population 27,954), county commissioners were considering rezoning 900 acres of land to make room for a new data center near the town of Bolingbroke (population 492). Data centers have been popping up all across the state, but especially in areas close to Atlanta. Public opinion is, often enough, irrelevant. In nearby Twiggs County, despite strong and organized opposition, officials decided to allow a 300-acre data center to move forward. But at a packed meeting to discuss the Bolingbroke plans, some 900 people showed up to voice near unanimous opposition to the proposed data center, according to Macon, Georgia’s The Telegraph. Seeing which way the wind had blown, the Monroe county commission shot it down in August last year.
The would-be developers of the proposed site had claimed it would bring in millions of dollars for the county. That it would be hidden from view. That it would “uphold the highest environmental standards.” That it would bring jobs and prosperity. Yet still, people came gunning for it.
Why!? Data centers have been around for years. So why does everyone hate them all of a sudden?
What is it about these engineering marvels, which will supposedly let us build AI that cures all diseases, brings unprecedented prosperity, and even cheats death (if you believe what the AI sellers are selling), that so infuriates their prospective neighbors?
There are some obvious reasons. First is just the speed and scale of their construction, which has had effects on power grids. No one likes to see their power bills go up. The rate hikes that so incensed Georgians come as monthly reminders that the eyesore in your backyard profits California billionaires at your expense, on your grid. In Wyoming, for example, a planned Meta data center will require more electricity than every household in the state, combined. To meet demand for power-hungry data centers, utilities are adding capacity to the grid. But although that added capacity may benefit tech companies, the cost is shared by local consumers.
Similarly, there are environmental concerns. To meet their electricity needs, data centers often turn to dirty forms of energy. xAI, for example, famously threw a bunch of polluting methane-powered generators at its data center in Memphis. While nuclear energy is oft-bandied about as a greener solution, traditional plants can take a decade or more to build; even new and more nimble reactors will take years to come online. In addition, data centers often require massive amounts of water. But the amount can vary widely depending on the facility, and is often shrouded in secrecy. (A number of states are attempting to require facilities to disclose water usage.)
A different type of environmental consequence of data centers is that they are noisy. A low, constant, machine hum. Not just sometimes, but always. 24 hours a day. 365 days a year. “A highway that never stops.”
And as to the jobs they bring to communities. Well, I have some bad news there too. Once construction ends, they tend to employ very few people, especially for such resource-intensive facilities.
These are all logical reasons to oppose data centers. But I suspect there is an additional, emotional one. And it echoes one we’ve heard before.
More than a decade ago, the large tech firms of Silicon Valley began operating buses to ferry workers to their campuses from San Francisco and other Bay Area cities. Like data centers, these buses used shared resources such as public roads without, people felt, paying their fair share. Protests erupted. But while the protests were certainly about shared resource use, they were also about something much bigger.
Tech companies, big and small, were transforming San Francisco. The early 2010s were a time of rapid gentrification in the city. And what’s more, the tech industry itself was transforming society. Smartphones were newly ubiquitous. The way we interacted with the world was fundamentally changing, and people were, for the most part, powerless to do anything about it. You couldn’t stop Google.
But you could stop a Google bus.
You could stand in front of it and block its path. You could yell at the people getting on it. You could yell at your elected officials and tell them to do something. And in San Francisco, people did. The buses were eventually regulated.
The data center pushback has a similar vibe. AI, we are told, is transforming society. It is suddenly everywhere. Even if you opt not to use ChatGPT or Claude or Gemini, generative AI is increasingly built into just about every app and service you likely use. People are worried AI will take their jobs in the coming years. Or even kill us all. And for what? So far, the returns have certainly not lived up to the hype.
You can’t stop Google. But maybe, just maybe, you can stop a Google data center.
Then again, maybe not. The tech buses in San Francisco, though regulated, remain commonplace. And the city is more gentrified than ever. Meanwhile, in Monroe County, life goes on. In October, Google confirmed it had purchased 950 acres of land just off the interstate. It plans to build a data center there.
This is today’s edition of The Download, our weekday newsletter that provides a daily dose of what’s going on in the world of technology.
How next-generation nuclear reactors break out of the 20th-century blueprint
The popularity of commercial nuclear reactors has surged in recent years as worries about climate change and energy independence drowned out concerns about meltdowns and radioactive waste. The problem is, building nuclear power plants is expensive and slow.
A new generation of nuclear power technology could reinvent what a reactor looks like—and how it works. Advocates hope that new tech can refresh the industry and help replace fossil fuels without emitting greenhouse gases. Here’s what that might look like.
—Casey Crownhart
Next-gen nuclear is one of our 10 Breakthrough Technologies this year. If you want to learn more about why it made the list, sign up to receive The Spark, our weekly newsletter all about energy and climate change, tomorrow. You can also check out the rest of the technologies on the list here.
Data centers are amazing. Everyone hates them.
Hyperscale data centers are a marvel of our age. A masterstroke of engineering across multiple disciplines. They are nothing short of a technological wonder. People hate them.
People hate them in Virginia, which leads the nation in their construction. They hate them in Nevada, where they slurp up the state’s precious water. They hate them in Michigan, and Arizona, and South Dakota. They hate them all around the world, it’s true. But they really hate them in Georgia. Read our story about why they’re provoking so much fury.
—Mat Honan
This story first featured in The Debrief with Mat Honan, a weekly newsletter about the biggest stories in tech from our editor in chief. Sign up here to get the next one in your inbox on Friday.
The must-reads
I’ve combed the internet to find you today’s most fun/important/scary/fascinating stories about technology.
1 Iran is systematically crippling Starlink
The satellite internet service is meant to be impossible to jam—but the Iranian authorities are doing just that. (Rest of World)
+ Messages getting around Iran’s internet block suggest that thousands of people have been killed. (NYT $)
+ On the ground in Ukraine’s largest Starlink repair shop. (MIT Technology Review)

2 Studies claiming microplastics harm us are being called into question
Some scientists say the discoveries are probably the result of contamination and false positives. (The Guardian)

3 Trump is trying to temper the data center backlash
He hopes cajoling tech companies to pay more and thus reduce people’s energy bills will do the trick. (WP $)
+ Microsoft has just become the first tech company to promise it will do just that. (NYT $)
+ We know AI is power hungry. But just how big is the scale of the problem? (MIT Technology Review)

4 US emissions jumped last year
Thanks to a combination of rising electricity demand and more coal being burned to meet it. (NYT $)
+ But it’s not all bad news: coal power generation in India and China finally started to decline. (The Guardian)
+ Four bright spots in climate news in 2025. (MIT Technology Review)

5 Elon Musk needs to face consequences for his actions
If we tolerate him unleashing a flood of harassment of women and children, what will come next? (The Atlantic $)
+ The US Senate has passed a bill that could give non-consensual deepfake victims a new way to fight back. (The Verge $)

6 Why the US is set to lose the race back to the moon
Cuts to NASA aren’t helping, but they’re not the only problem. (Wired $)

7 Google’s Veo AI model can now turn portrait images into vertical videos
Really slick ones, too. (The Verge $)
+ AI-generated influencers are sharing fake images of themselves in bed with celebrities on Instagram. (404 Media $)

8 Former NYC mayor Eric Adams has been accused of a crypto ‘pump and dump’
He promoted a token that saw its market cap briefly soar to $580 million before plummeting. (CoinDesk)

9 Are you a middle manager? Here’s some good news for you
Your skills are not being replaced by AI any time soon. (Quartz)

10 Even minuscule lifestyle tweaks can extend your lifespan
A study of 60,000 adults found just a little bit more sleep and exercise makes a huge difference. (New Scientist $)
+ Aging hits us in our 40s and 60s. But well-being doesn’t have to fall off a cliff. (MIT Technology Review)
Quote of the day
“What I’m hopeful for in ’26 is for more people speaking up. Speaking truth to power is the point of freedom of speech, is the point of American society.”
—LinkedIn cofounder Reid Hoffman tells Wired he wants more people in Silicon Valley to start pushing back against the Trump administration this year.
One more thing
What Africa needs to do to become a major AI player
Africa is still early in the process of adopting AI technologies. But researchers say the continent is uniquely hospitable to it for several reasons, including a relatively young and increasingly well-educated population, a rapidly growing ecosystem of AI startups, and lots of potential consumers.
However, ambitious efforts to develop AI tools that answer the needs of Africans face numerous hurdles. Taken together, researchers worry, they could hold Africa’s AI sector back and hamper its efforts to pave its own pathway in the global AI race. Read the full story.
—Abdullahi Tsanni
We can still have nice things
A place for comfort, fun and distraction to brighten up your day. (Got any ideas? Drop me a line or skeet ’em at me.)
+ Still keen to do a bit of reflecting on the year behind and the one ahead? This free guide might help!
+ Turns out British comedian Rik Mayall had some pretty solid life advice.
+ I want to stay in this house in São Paulo.
+ If you want to stop doomscrolling, it’s worth looking at your sleep habits. ($)
Our rundown this week of new products and services for ecommerce merchants includes updates on agentic commerce, product reviews, A/B testing, post-purchase experiences, cryptocurrency payments, fulfillment, analytics, personalization, and packaging.
Got an ecommerce product release? Email updates@practicalecommerce.com.
PayPal powers Microsoft’s launch of Copilot Checkout. PayPal is partnering with Microsoft in support of Checkout, enabling shoppers to discover, decide, and pay without leaving the Copilot experience. PayPal will surface merchant inventory, branded checkout, guest checkout, and credit card payments, starting with Copilot.com. Copilot uses AI to bring context and intent into the shopping journey. Users can now browse curated, shoppable results and complete their purchase with PayPal.
Amazon to limit reviews across product variations. Amazon is changing how reviews are shared across products. Until now, Amazon has shared reviews across all variations of a product, even when they differ significantly. Now, to improve accuracy and help shoppers make more informed purchasing decisions, Amazon will share reviews only between variations with minor differences that don’t affect functionality.
Kibo Commerce announces Connect Hub and MCP. Kibo Commerce, a platform for composable commerce, has launched two product offerings. The new Connect Hub helps scale pre-built integrations to platforms across various product categories, including ecommerce and marketplaces. Merchants gain access to a network of 3,300 trading partners and hundreds of payment and shipping adapters. The new Kibo MCP integrates enterprise commerce logic and generative AI tools.
Fluent Commerce launches order sourcing logic with A/B testing. Fluent Commerce, an order management system, has announced the launch of AI-powered order sourcing logic with A/B testing. Users can compare the outcomes of two sets of order sourcing logic run in parallel to see the impact on net margin, fulfillment and delivery costs, split shipment rate, order-to-door time, and average delivery distance, and to calculate carbon impact. The capability enables retailers to continuously learn from their fulfillment network, according to Fluent Commerce.
Route acquires Frate Returns for ecommerce post-purchase experiences. Route, a post-purchase platform for ecommerce brands, has acquired Frate Returns, a returns-and-exchanges platform. By integrating Frate’s returns-and-exchanges software, Route now offers merchants a single integrated platform to manage the customer journey after checkout. According to Route, Frate’s capabilities (exchange-first optimization, AI image verification, and flexible shipping, refund, and payment options) allow brands to reduce refund rates and operational costs while retaining revenue and increasing loyalty.
Crypto.com partners with Stripe. Crypto.com, a global cryptocurrency platform, is partnering with Stripe to expand payment options. The collaboration will allow Crypto.com users to pay for everyday goods and services using their crypto balances at Stripe-powered merchants across the U.S. The integration will appear as a new payment option on the checkout pages of participating merchants that use Stripe’s Optimized Checkout Suite.
UCanPack launches tall ecommerce boxes. UCanPack, a provider of packaging and shipping supplies, has introduced a line of engineered tall boxes for ecommerce merchants. According to UCanPack, the line features impact resistance and crush protection and is right-sized for elongated and narrow goods, such as lamps, tripods, sports gear, decor, and rolled materials. UCanPack aims to help brands reduce transit damage and streamline pack bench workflows.
Shoplazza launches fulfillment option. Shoplazza, a global commerce platform serving direct-to-consumer brands, has launched Fulfillment by Shoplazza to help merchants navigate global logistics. Shoplazza says its new fulfillment service integrates global warehousing, last-mile delivery, financial automation, and real-time risk controls to provide merchants with a predictable and scalable logistics engine. Key capabilities include zero-prepayment logistics, revenue-aligned billing, embedded financial tools, global and localized fulfillment options, and a compliant logistics network, per Shoplazza.
Stackline unveils analytics for AI-powered commerce platforms. Stackline, a provider of retail analytics and connected commerce, has launched AI Visibility, offering insights into how shoppers discover and interact with products via conversational and agentic shopping platforms. According to Stackline, participating merchants can (i) measure the volume of real shopping questions across leading AI platforms, (ii) bring results from ChatGPT, Amazon Rufus, and more into a unified analytical environment, (iii) view detailed product competitive insights, and (iv) analyze which products frequently appear in recommendations together.
Lightspeed Commerce launches AI assistants. Lightspeed Commerce, an omnichannel platform powering businesses in over 100 countries, has launched Lightspeed AI for agent-driven workflows, including conversational assistants for retail and restaurants. The assistants help merchants ask questions, get answers quickly, and make smarter decisions without navigating dashboards or reports.
YouTube is updating its Advertiser-friendly content guidelines to allow more videos about certain “controversial issues” to earn full ad revenue, as long as the content is non-graphic and presented in a dramatized or discussion-based context.
YouTube is loosening monetization restrictions for videos focused on controversial issues that advertisers may define as sensitive, including abortion, self-harm, suicide, and domestic and sexual abuse, when the content is “dramatized or discussed in a non-graphic manner.”
YouTube’s Help Center update describes the change, stating that content focused on “Controversial issues” is now eligible to earn ad revenue when it’s non-graphic and dramatized, and that this replaces a previous policy that limited monetization regardless of whether the content was graphic or fictional.
The current “Controversial issues” policy section also explicitly includes “non-graphic but descriptive or dramatized content” related to domestic abuse, self-harm, suicide, adult sexual abuse, abortion, and sexual harassment under the category that “can earn ad revenue.”
How YouTube Defines “Controversial Issues”
YouTube defines “Controversial issues” as topics associated with trauma or abuse, and notes the policy may apply even if the content is purely commentary.
The Help Center list includes child abuse, adult sexual abuse, sexual harassment, self-harm, suicide, eating disorders, domestic abuse, and abortion.
It also distinguishes between content that is “focal” versus “fleeting.” A passing reference is not considered a focus, whereas a sustained segment or a full-video discussion is.
Why This Matters
This update can change whether videos qualify for full ad revenue.
YouTube is drawing a clearer line between non-graphic dramatization or discussion (more likely to be eligible) and content that includes graphic depictions or very explicit detail (still likely to be restricted).
As with past advertiser-friendly updates, real-world outcomes can depend on how a specific upload is categorized during review, including signals from the video itself plus title and thumbnail.
Looking Ahead
It’s unknown whether previously limited videos will be re-reviewed automatically, or only on appeal.
Regardless, you shouldn’t wait for YouTube to do the work. Now is a great time to submit an appeal if your videos were affected by YouTube’s controversial issues policy.
At this point, AI-driven search products such as Google’s and Microsoft’s Performance Max (PMax) campaigns (and now Google’s AI Max) have firmly woven themselves into the toolkits of search marketers around the globe. But as many search marketers rush to not only test new products but also scale paid search activity, there is an increasing tendency to neglect the core elements of what makes a successful paid search account: audience, structure, and intent.
Within this article, I’ll aim to throw paid search foundations back into the limelight, placing an emphasis on why core concepts remain important and highlighting that AI products don’t replace those foundations; they build on them.
How We Got Here
It is, of course, important to stress that the shift towards AI prevalence has been a gradual one. From the early days of obsessing over match types and manual cost-per-click (CPC) bids to Smart Bidding playing a greater role in finding customers across varying points of the user journey, we’ve come a long way to get here. With products such as PMax claiming to “do it all for you,” the “hands-on” approach of yesteryear has faded.
With every step taken towards the current climate, we have handed a little more control to the machine. While this has allowed us to scale campaigns on a much greater level, the day-to-day tasks of a PPC manager now look drastically different from those of 10 years ago.
But as automation has increased, so has the machine’s reliance upon clean and consistent foundations. AI features can only optimize based on what we feed them. If structure, signals, or audiences aren’t clear, the machine has no concept of what “good” looks like. Because of this, AI hasn’t removed the need for fundamentals; it’s made them more important.
Structure Is Still Integral To Success
Automated systems and products such as PMax encourage greater levels of consolidation: feed insights to the algorithm and let it decide what works best. In practice, however, structure remains one of the biggest determinants of whether AI drives success.
PMax is not psychic. It doesn’t have a full understanding of your specific product margins, your product development lines, or your business’s full commercial realities (yet!). The only way to give it that context is to make those distinctions clear. This is where structure comes in! A well-structured account provides boundaries for the machine to work within. It helps by:
Providing clean learning environments: Grouping products and services in a logical manner helps to ensure that products such as PMax aren’t trying to learn everything all at once. Through clear separation, you increase the likelihood of more accurate outcomes.
Maintaining budget control: If everything is thrown into one campaign, it becomes much harder to prevent under-performing products from cannibalizing budget.
Reducing conflicting intent: When campaigns mix differing intents (e.g., providing varying conversion actions that are contradictory from a user journey standpoint), the machine receives much greater volumes of noise. Through clear separation and delineation within a well-structured account, advertisers can reduce skewed data and improve performance.
Clear structure isn’t something to be ignored. It’s the backbone to improving AI performance. (Image from author, November 2025)
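To make "clean separation" concrete, here is a minimal, hypothetical sketch of how you might script campaign grouping from a product feed before building anything in an ad platform. The SKUs, categories, margin threshold, and campaign naming scheme are all invented for illustration; nothing here calls a real ads API.

```python
# Illustrative sketch only: grouping a product feed into separate campaign
# "learning environments" by margin and category. All values are hypothetical.

def assign_campaign(product: dict, margin_floor: float = 0.30) -> str:
    """Route a product into a campaign bucket by margin and category,
    so high- and low-margin items don't share one learning environment."""
    if product["margin"] < margin_floor:
        return f"pmax-clearance-{product['category']}"
    return f"pmax-core-{product['category']}"

feed = [
    {"sku": "A1", "category": "lamps", "margin": 0.45},
    {"sku": "B2", "category": "lamps", "margin": 0.12},
    {"sku": "C3", "category": "decor", "margin": 0.38},
]

campaigns: dict[str, list[str]] = {}
for item in feed:
    campaigns.setdefault(assign_campaign(item), []).append(item["sku"])

print(campaigns)
# {'pmax-core-lamps': ['A1'], 'pmax-clearance-lamps': ['B2'], 'pmax-core-decor': ['C3']}
```

The point isn't the code itself but the principle: each bucket gives the algorithm one coherent set of products to learn from, with its own budget boundary.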
Audience Insight Remains AI’s Compass
When it comes to understanding people, human marketers will always hold a competitive advantage. Knowing why people convert, what motivates them, and ultimately, the understanding of human nature will always mean that human marketers have an intrinsic intuition that search features such as PMax will never have. Acknowledging this, it’s key that humans feed quality customer insights into these platforms to ensure that the machine can gain a better understanding of what makes us tick.
As an example, a family car buyer and a luxury SUV buyer may both search for [SUV cars], but their motivations and expectations differ dramatically. AI can easily cluster this behavior, but it takes human insight to translate that behavior into effective positioning.
Taking this into account, understanding a) what makes a solid audience grouping and b) how to implement that audience is again where foundations come into play. The strongest performing PMax campaigns are the ones filled with the richest insights. CRM data, loyalty information, and higher-intent user signals often significantly improve PMax’s ability to drive performance. AI products can only feed off the information you provide, and those signals must be rooted in real audience understanding.
When you understand your audience deeply, AI has a stronger foundation to optimize from. When you don’t, you leave the machine to guess.
Intent (And Keywords) Still Drive Everything
It could be argued that automation has accelerated the death of keywords, but what it hasn’t done is decrease the importance of intent. Search has always been (and remains!) an intent-driven channel. PMax might automate placements and assets, but it still requires queries and signals to understand what someone wants.
We might now be seeing fewer search queries (much to my annoyance!), but the system is still learning from billions of intent signals. Taking this into account, having a core, foundational understanding of intent enables you to:
Identify and prevent wasted spend. ALWAYS my ace card. Negatives and keyword exclusions remain critical in helping to guide AI products. Advertisers who refine intent signals almost always outperform those who automatically assume that ‘leaving it to the machine’ is the best approach.
Match creative with motivation. Understanding customer intent will help to ensure that you avoid over-generic ad copy and craft content that customers actually engage with.
Align landing pages with behavior. AI can send traffic to your pages, but if the content doesn’t match user intent, account efficiency will be impacted.
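As a concrete illustration of "identify and prevent wasted spend," here is a minimal sketch of flagging negative-keyword candidates from a search-term report export. The report rows and thresholds are hypothetical; real data would come from your ad platform's search terms report, and the cutoffs should reflect your own account's economics.

```python
# Illustrative sketch only: flagging negative-keyword candidates.
# Rows and thresholds are hypothetical, not from any real report.

def negative_candidates(rows, min_clicks=30, max_conv_rate=0.005):
    """Return terms that soaked up clicks but almost never converted,
    i.e. intent the account is paying for but not serving."""
    flagged = []
    for row in rows:
        conv_rate = row["conversions"] / row["clicks"] if row["clicks"] else 0.0
        if row["clicks"] >= min_clicks and conv_rate <= max_conv_rate:
            flagged.append(row["term"])
    return flagged

report = [
    {"term": "free suv rental", "clicks": 120, "conversions": 0},
    {"term": "luxury suv lease deals", "clicks": 80, "conversions": 6},
    {"term": "suv wallpaper hd", "clicks": 45, "conversions": 0},
]

print(negative_candidates(report))  # ['free suv rental', 'suv wallpaper hd']
```

A human still reviews the flagged terms before excluding them; the script only narrows the list, exactly the kind of intent refinement the machine can't do unprompted.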
A Whole New World
To quote the 1992 Disney classic “Aladdin,” it really is a whole new world (pretty sure they had PMax in mind when writing that song…). However, while the further acceleration of AI products may have changed the mechanics of search advertising, what it hasn’t done is make the fundamentals less important.
Audience insight still guides strategy. Intent still shapes relevance of content. Structure still shapes accuracy. These are not only essentials that have stood the test of time but will also provide a clear advantage to advertisers who can recognize their benefit.
The future of paid search truly isn’t a case of fighting the machine; it’s about ensuring we influence the algorithms by providing richer context and insight, in turn utilizing their ability to scale to further drive results.
Social media is an integral part of life in the U.S., with usage growing among adults. Pew Research Center reports that YouTube (84%) and Facebook (71%) remain the most widely used platforms for U.S. adults, followed by Instagram, used by 50% of adults.
What we should pay attention to is that people’s trust levels have changed. Americans turn to social media for local news and product research, but no longer trust every voice equally. According to Pew Research, trust in national and local news organizations is declining across all age groups. According to Ipsos Global Trustworthiness Monitor, only 22% of the public trusts social media companies.
With bots, fake reviews, and undisclosed AI-generated content flooding channels, authenticity becomes more essential than ever.
As trust in platforms’ ability to moderate or surface credible information has weakened, many users now look to individual creators they already trust to help interpret or contextualize information, rather than relying solely on platform-level controls or algorithms. This is an opportunity for creators who can provide credibility.
This article takes a closer look at the factors that erode people’s trust, what rebuilds it, and what marketers can do to ensure they stay trusted.
What’s Not Working
Perfection and over-engineered AI-generated content are not what resonates with audiences and users. Companies that forget to be human-first could find their campaigns backfire, as in the case of the AI firm that commissioned a billboard featuring an AI employee and the tagline “Stop Hiring Humans,” or Coca-Cola with its AI-generated holiday ad.
To keep building human-focused content and to engage with your audience through authenticity, avoid the following:
Overly Polished Content & Using AI Models
Audiences are quick to recognize overly “clean” content and AI models. While Ryanair is usually a very engaging brand, its recent AI-generated TikTok didn’t land with people (pun intended):
If you read some of the negative comments, they focused on how disappointed people are with the airline service. One commenter remarked, “Why is Ryanair using AI?” and another user added, “Saves money. That’s all they care about.”
For a brand that is usually so self-deprecating and quick to respond, this time it looks like Ryanair missed the opportunity to be itself and connect.
While it is true that AI can save money (according to a survey conducted by Ahrefs, human-written content costs 4.7x more than AI-generated content), it reduces believability.
While not a social media example, Apple’s “Crush” iPad Pro Ad is an example of a big brand that usually gets it right, getting it very wrong. Apple had to withdraw the ad after negative reactions, even though the production was high-quality and polished. Viewers called it “soul crushing.”
The ad depicted pianos, guitars, cameras, and paintings being crushed into a single iPad, suggesting that the thinnest iPad can replace the things that people have cherished for hundreds and even thousands of years. But, inadvertently, it dismissed human creativity.
The concept was considered tone-deaf and disappointed Apple fans, who are usually such ardent supporters of the company’s creativity.
The example highlights how the right tone is essential to get your audience to connect with your messaging. Users will respond less to generic brand messaging that feels like it was scripted by ChatGPT. Invoking positive emotions in your content that tie in with your brand values should be your focus.
Undisclosed Synthetic Content & The Risk Of Misinformation
According to a study conducted at Rutgers University, public views on AI are mixed: While AI can assist with automation and data analysis, people prefer human storytelling for content. Users can often spot AI-crafted texts and scripts because of tells like overused em-dashes and formulaic punctuation.
Suspecting undisclosed AI-generated content could drive your users away: Over half (52%) of social media users are concerned about brands posting AI-generated content without disclosing it.
And according to Ahrefs, 62% of respondents cited the biggest perceived risk of using AI as sharing misinformation. In the same study, most people (65%) regard human-written content as higher quality than AI-generated content.
Replacing The Human Touch
While AI can be cost-effective, as mentioned above, LLMs erode trust when they replace writing grounded in real, lived experience. As a case in point, there was immediate backlash to Google’s Gemini commercial “Dear Sydney,” in which a father asks Gemini to help him write a letter to Olympian Sydney McLaughlin-Levrone about how inspiring she is.
Viewers called it sad and disturbing, with one commenting that it “completely negates why someone would write a letter to an athlete or anyone for that matter.”
How Marketers Can Demonstrate Trust
Authenticity will always prevail in marketing. When you're honest with your users about your product and what you can offer them, your campaigns will resonate more. Beyond avoiding the pitfalls above, here are tips we recommend for a proactive approach:
Collaborate With Micro-Creators And Subject-Matter Voices
According to a Deloitte survey, “roughly 50% of Gen Zs and millennials surveyed say they feel a stronger personal connection to social media creators than they do with TV personalities or actors.”
This means good influencer marketing in your niche, with creators demonstrating real product use, pros and cons, and their reasoning, will help rather than hurt your brand.
Choose Credible Partners Over Famous Ones
Three in 10 U.S. adult social media users say they have purchased something after seeing an influencer or content creator post about it. However, they don't rely on follower counts to establish trust. Studies find that "authenticity's direct effect on the number of followers is not statistically significant."
Choosing a creator to partner with is less about reach and more about how well they align with your brand values and the expertise they demonstrate. Select a creator with the expertise to effectively review your product and craft compelling content about it.
According to Influencer Marketing Hub’s Influencer Marketing Benchmark Report 2025, brands are shifting towards nano and micro-influencers for targeted, cost-effective collaborations. It works because smaller creators are widely seen to be more relatable and authentic, showing real product use and reasoning in their testimonials, and are trusted within their communities.
Being transparent about partnerships and brand deals will help increase trust with your consumers. According to Sprout Social, 86% of survey respondents said that they would be more likely to give the company a second chance after a bad experience if it has a history of transparency.
As Google’s Search Quality Rater Guidelines have reiterated, trust is the crucial component of E-E-A-T, so keep demonstrating that you’re transparent about who is behind your content, maintain a positive reputation, and follow ethical content practices.
Monitor Your Brand Mentions Alongside AI
If you think you’re safe because your brand doesn’t use AI, think again. Users who are disenfranchised with AI may have a negative perception of your brand if they see unofficial ads on AI-generated, low-quality content.
In an email to Marketing Brew, Google spokesperson Nate Funkhouser said, “YouTube doesn’t currently offer advertisers the ability to opt out of appearing next to AI-generated content.”
It’s always good practice to monitor where your brand shows up and disassociate from fake content to remain trustworthy.
Tap Into UGC And Human-Led Brand Presence
Brands still need a human voice and faces to represent them, craft compelling narratives, and convey their messages professionally.
UGC gives your customers a sense of participation and is more trustworthy than content from a third party, since it isn't commissioned. Not to mention, Google also surfaces more video, forums, and UGC in response to users' behavioral shift toward seeking authentic, quality content.
Support Community-Led Content
Invest in creators and groups with genuine influence in your category.
Lastly, participate where the conversations happen. Niche communities on Reddit, Discord, and Threads are great places to truly connect with your audience in a transparent way, while customers benefit from honest feedback. What works in these communities is engaging in conversations, rather than having people passively watch streams of your content.
What's likely to work: hosting AMAs and helping solve problems on Reddit, or selling with an AI-generated avatar that lacks any emotional connection to your audience? We recommend the former.
Trust Is Being Rebuilt Through People, Not Platforms
Instead of solely investing in AI to generate more creative assets, try partnering with nano-influencers, adding more context and disclosures to your campaigns, and facilitating lively UGC campaigns in communities on platforms like Reddit.
People aren't inspired by a bot or by someone who hasn't actually lived the experience. Your brand earns your audience's trust through relatable creators and thoughtfully crafted content.
The brands that win are ultimately the brands that feel more human, and the decision to lean into human-led content is yours.