Shaping the future with adaptive production

Adaptive production is more than a technological upgrade: it is a paradigm shift. This new frontier enables the integration of cutting-edge technologies to create an increasingly autonomous environment, where interconnected manufacturing plants go beyond the limits of traditional automation. Artificial intelligence, digital twins, and robotics are among the powerful tools manufacturers are using to create dynamic, intelligent systems that not only perform tasks, but also learn, make decisions, and evolve in real-time.

Taking this kind of adaptive approach can transform a manufacturer’s productivity, efficiency, and innovation. But beyond the factory, it also has the potential to deliver society-wide benefits, by bolstering economic growth locally, creating more attractive and accessible employment opportunities, and supporting a sustainability agenda.

As efforts to revive and modernize local manufacturing accelerate in regions around the world, including North America and Europe, adaptive production could help manufacturers overcome some of their biggest obstacles—firstly, attracting and retaining talent. Nearly 60% of manufacturers cited this as their top challenge in a 2024 US-based survey. Highly automated, technology-led adaptive production methods hold new promise for attracting talent to roles that are safer, less repetitive, and better paid. “The ideal scenario is one where AI enhances human capabilities, leads to new task creation, and empowers the people who are most at risk from automation’s impact on certain jobs, particularly those without college degrees,” says Simon Johnson, co-director of MIT’s Shaping the Future of Work Initiative.

Secondly, the digitalization of manufacturing—embedded in the very foundation of adaptive production technologies—allows companies to better address complex sustainability challenges through process and resource optimization and a better understanding of data. “By integrating these advanced technologies, we gain a more comprehensive picture across the entire production process and product lifecycle,” explains Jelena Mitic, head of technology for the Future of Automation at Siemens. “This will provide a much faster and more efficient way to optimize operations and ensure that all the necessary safety and sustainability requirements are met during quality control.”

Download the full report.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

This content was researched, designed, and written entirely by human writers, editors, analysts, and illustrators. This includes the writing of surveys and collection of data for surveys. AI tools that may have been used were limited to secondary production processes that passed thorough human review.

Finding value with AI automation

In June 2023, technology leaders and IT services executives had a lightning bolt headed their way when McKinsey published its report “The economic potential of generative AI: The next productivity frontier.” It echoed a moment from the 2010s when Amazon Web Services launched an advertising campaign aimed at Main Street’s C-suite: Why would any fiscally responsible exec allow their IT teams to spend capex on servers and software when AWS cost only 10 cents per virtual machine?

Vendors understand that reports like these, and aggressive advertising about competitive risks in an industry sector, drive calls from boards to the C-suite, which roll from the C-suite down to staff, all asking, “What are we doing with AI?” When asked to “do something with AI,” technical leaders and their organizations promptly responded, sometimes begrudgingly and sometimes excitedly, to work-sanctioned opportunities to get their hands on a new technology. At that point, there was no time to sort actual business returns from applying AI from “AI novelty” use cases that were more Rube Goldberg machines than tangible breakthroughs.

Today’s opportunity: Significant automation gains 

When leaders respond to immediate panic, new business risks and mitigations often emerge. Two recent examples highlight the consequences of rushing to implement AI and publish positive results. The Wall Street Journal reported in April 2025 on companies struggling to realize returns on AI. Just weeks later, it covered MIT’s retraction of a technical paper on AI whose published results could not be substantiated.

While these reports demonstrate the pitfalls of over-reliance on AI without common-sense guardrails, not all is off track in the land of enterprise AI adoption. Incredible results are being achieved through judicious use of AI and related technologies to automate processes across industries. Now that we are through the “fear of missing out” stage and can get down to business, where are the best places to look for value when applying AI to automation of your business?

While chatbots are now almost as pervasive as new mobile app downloads, the AI applications that realize automation and productivity gains line up with the unique purpose and architecture of the underlying AI system they are built on. The dominant patterns where AI gains are realized currently boil down to two things: language (translation and patterns) and data (new format creation and data search).

Example one: Natural language processing  

Manufacturing automation challenge: Failure Mode and Effects Analysis (FMEA) is both critical and labor intensive. It is not always performed before manufacturing equipment fails, so FMEA very often occurs in a stressful lines-down scenario. In Intel’s case, a global footprint of manufacturing facilities separated by large distances, time zones, and preferred languages makes finding the root cause of a problem even more difficult. Weeks of engineering effort are spent on each FMEA analysis, repeated across large fleets of tools spread among these facilities.

Solution: Leverage already deployed CPU compute servers for natural language processing (NLP) across the manufacturing tool logs, where observations about the tools’ operations are maintained by the local manufacturing technicians. The analysis also applied sentiment analysis to classify words as positive, negative, or neutral. The new system performed FMEA on six months of data in under one minute, saving weeks of engineering time and allowing the manufacturing line to proactively service equipment on a pre-emptive schedule rather than incurring unexpected downtime.  
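As a rough illustration of the language side of this pattern, the sketch below tags tool-log lines with a tiny hand-built sentiment lexicon. The lexicons and log lines are invented for the example; a production FMEA pipeline like Intel’s would use trained NLP models over real tool logs.

```python
# Minimal sketch of lexicon-based sentiment tagging over equipment log
# lines. The word lists and logs are illustrative, not Intel's actual data.
NEGATIVE = {"fault", "error", "drift", "leak", "overheat", "failed"}
POSITIVE = {"nominal", "stable", "passed", "recovered"}

def classify(line: str) -> str:
    # Tag a log line as positive, negative, or neutral by lexicon overlap.
    words = set(line.lower().split())
    if words & NEGATIVE:
        return "negative"
    if words & POSITIVE:
        return "positive"
    return "neutral"

logs = [
    "Chamber pressure stable after maintenance",
    "Pump drift detected, vibration fault logged",
    "Routine check complete",
]
labels = [classify(line) for line in logs]
print(labels)  # ['positive', 'negative', 'neutral']
```

Even a crude classifier like this, run across months of logs, surfaces which tools accumulate negative observations and should be serviced first; the real system replaces the lexicon with a trained model.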

Financial institution challenge: Programming languages commonly used by software engineers have evolved. Mature bellwether institutions were often formed through a series of mergers and acquisitions over the years, and they continue to rely on critical systems that are based on 30-year-old programming languages that current-day software engineers are not familiar with. 

Solution: Use NLP to translate between the old and new programming languages, giving software engineers a needed boost to improve the serviceability of critical operational systems. Use the power of AI rather than doing a risky rewrite or massive upgrade. 

Example two: Company product specifications and generative AI models 

Sales automation challenge: The time it takes to reformat a company’s product data into a specific customer RFP format has been an ongoing challenge across industries. Teams of sales and technical leads spend weeks across different accounts reformatting the same root data into the preferred PowerPoint or Word document formats. Customer response times are measured in weeks, especially if the RFPs require legal review.

Solution: By using generative AI combined with a data extraction and prompting technique called retrieval augmented generation (RAG), companies can rapidly reformat product information between different customer required RFP response formats. The time spent moving data between different documents and different document types only to find an unforced error in the move is reduced to hours instead of weeks.  
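The retrieval half of RAG can be sketched in a few lines: embed the product-spec chunks, pick the chunk most similar to the RFP question, and splice it into a prompt for the generative model. The bag-of-words `embed()` below is a toy stand-in for a real embedding model, and the spec chunks and question are invented for illustration.

```python
import math

# Toy retrieval step for RAG: bag-of-words vectors plus cosine similarity.
# A real pipeline would use a learned embedding model and a vector store.
def embed(text: str) -> dict:
    vec = {}
    for word in text.lower().split():
        vec[word] = vec.get(word, 0) + 1
    return vec

def cosine(a: dict, b: dict) -> float:
    dot = sum(a[w] * b.get(w, 0) for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

chunks = [
    "Model X supports 48 ports and 400G throughput",
    "Warranty covers three years of on-site service",
]
question = "What throughput does Model X support?"

# Retrieve the most relevant chunk, then ground the generation prompt in it.
best = max(chunks, key=lambda c: cosine(embed(question), embed(c)))
prompt = f"Answer using only this context:\n{best}\n\nQuestion: {question}"
print(best)
```

Grounding the prompt in retrieved source data is what keeps the reformatted RFP answers tied to the company’s actual product specifications rather than the model’s guesses.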

HR policy automation challenge: Navigating internal processes can be time consuming and confusing for both HR and employees. The consequences of misinterpretation, access outages, and personal information or private data being exposed are massively important to the company and the individual. 

Solution: Combine generative AI, RAG, and an interactive chatbot that uses employee-assigned assets to determine identity and access rights, giving employees an interactive, query-based chat format that answers their questions in real time.

Finding your best use cases for AI 

In a world where 80% to 90% of all AI proofs of concept fail to scale, now is the time to develop a framework based on caution. Consider starting with a data strategy and governance assessment. Then compare notes on successful AI-based automation efforts with peer companies through peer discussions. Clear, rules-based policies and processes offer the best opportunities to begin a successful AI automation journey in your enterprise. Where you encounter disparate data sources (e.g., unstructured, video, structured databases) or unclear processes, maintain tighter human-in-the-loop decision controls to avoid unexpected data or token exposure and cost overruns.

As the AI hype cycle cools and business pressure mounts, now is the time to become practical. Apply AI to well-defined use cases and begin unlocking the automation benefits that will matter not just in 2025, but for years to come.

This content was produced by Intel. It was not written by MIT Technology Review’s editorial staff.

Charts: Global Job Trends from AI

Despite worries of impending job loss, artificial intelligence is making employees more valuable, not less.

That’s according to PwC’s just-released “2025 Global AI Jobs Barometer” report, which shows that AI has the potential to enhance workers’ value, even in roles that are highly automatable. The report, based on an analysis of nearly 1 billion job postings across six continents, examines the global impact of AI on employment, skills, wages, and productivity.

According to the PwC report, since 2022, industries best positioned to adopt AI have seen their revenue growth almost quadruple.

In addition, wages are increasing twice as fast in industries with the highest AI exposure compared to those with the least.

Moreover, the requirements are shifting for AI-exposed jobs. Employer demand for college degrees is decreasing across the board, but it’s declining most rapidly in roles affected by AI.

WordPress Malware Scanner Plugin Contains Vulnerability via @sejournal, @martinibuster

Wordfence published an advisory on the WordPress Malcure Malware Scanner plugin, which was discovered to have a vulnerability rated at a severity level of 8.1. At the time of publishing, there is no patch to fix the problem.

Screenshot Showing 8.1 Severity Rating

Malcure Malware Scanner Vulnerability

The Malcure Malware Scanner plugin, installed on over 10,000 WordPress websites, is vulnerable to “Arbitrary File Deletion due to a missing capability check on the wpmr_delete_file() function” by authenticated attackers. The need for authentication makes exploitation somewhat less likely, but not by much, because the attack only requires subscriber-level authentication, the lowest level. The “subscriber” role is the default registration level on a WordPress website (if registration is allowed).

According to Wordfence:

“This makes it possible for authenticated attackers, with Subscriber-level access and above, to delete arbitrary files making remote code execution possible. This is only exploitable when advanced mode is enabled on the site.”

There is no known patch available for the plugin and users are cautioned to take necessary actions such as uninstalling the plugin to mitigate risk.

The plugin is currently unavailable for download with a notice showing that it is under review.

Screenshot Of Malcure Plugin At WordPress Repository

Read More WordPress News

WordPress Update 6.8.2 – Ends Security Support For 0.9% of Sites

Featured Image by Shutterstock/Kues

Anthropic’s New Financial Tool Signals Shift To Offering Specialized Services via @sejournal, @martinibuster

Anthropic announced a new Financial Analysis Solution powered by its Claude 4 and Claude Code models. This is Anthropic’s first foray into a major vertical-focused platform, signaling a shift toward AI providers building tools that directly address common pain points in business workflows and productivity.

Claude For Financial Services

Anthropic’s new service is an AI-powered financial analysis tool targeted at financial professionals. It offers data integration via MCP (Model Context Protocol), secure data handling, and total privacy. No user data is used to train Claude’s generative models.

According to the announcement:

“Claude has real-time access to comprehensive financial information including:

  • Box enables secure document management and data room analysis
  • Daloopa supplies high-quality fundamentals and KPIs from SEC filings
  • Databricks offers unified analytics for big data and AI workloads
  • FactSet provides comprehensive equity prices, fundamentals, and consensus estimates
  • Morningstar contributes valuation data and research analytics
  • PitchBook delivers industry-leading private capital market data and research, empowering users to source investment and fundraising opportunities, conduct due diligence and benchmark performance, faster and with greater confidence
  • S&P Global enables access to Capital IQ Financials, earnings call transcripts, and more–essentially your entire research workflow”

Takeaway:

This launch may signal a shift among AI providers toward building industry-specific tools that solve problems for professionals, rather than offering only general-purpose models that others use to provide the same solutions. Generative AI companies have the ability to stitch together solutions from big data providers in ways that smaller companies can’t.

Read more at Anthropic:

Transform financial services with Claude

Featured Image by Shutterstock/gguy

WordPress Update 6.8.2 – Ends Security Support For 0.9% of Sites via @sejournal, @martinibuster

WordPress released a maintenance update that contains twenty changes to the core and fixes fifteen issues in the Gutenberg block editor. WordPress also announced that it is dropping security support for WordPress versions 4.1 to 4.6.

Short-Cycle Maintenance Release

This is a maintenance release that incrementally makes WordPress a smoother experience.

The release contains a range of small fixes to the WordPress core and the Gutenberg block editor.

Dropping Security Support

WordPress announced that it is dropping support for versions 4.1 through 4.6. According to the official WordPress stats, only 0.9% of websites are using those versions of WordPress.

Statement on release page:

“Dropping security updates for WordPress versions 4.1 through 4.6
This is not directly related to the 6.8.2 maintenance release, but branches 4.1 to 4.6 had their final release today. These branches won’t receive any security update anymore.”

Another WordPress page provides more information:

“As of July 2025, the WordPress Security Team will no longer provide security updates for WordPress versions 4.1 through 4.6.

These versions were first released nine or more years ago and over 99% of WordPress installations run a more recent version. The chances this will affect your site, or sites, is very small.”

Read the official WordPress 6.8.2 announcement:

WordPress 6.8.2 Maintenance Release

Read More WordPress News

Malware Discovered In Gravity Forms WordPress Plugin

Featured Image by Shutterstock/Praew stock

Google Updates Search Analytics API To Clarify Data Freshness via @sejournal, @MattGSouthern

Google has added a new metadata field to the Search Analytics API, making it easier for developers and SEO professionals to identify when they’re working with incomplete or still-processing data.

The update introduces new transparency into the freshness of query results, an improvement for marketers who rely on up-to-date metrics to inform real-time decisions.

What’s New In The API

The metadata field appears when requests include the dataState parameter set to all or hourly_all, enabling access to data that may still be in the process of being collected.

Two metadata values are now available:

  • first_incomplete_date: Indicates the earliest date for which data is still incomplete. Only appears when data is grouped by date.
  • first_incomplete_hour: Indicates the first hour where data remains incomplete. Only appears when data is grouped by hour.

Both values help clarify whether recent metrics can be considered stable or if they may still change as Google finalizes its processing.
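To make the metadata handling concrete, here is a small sketch that filters still-volatile rows out of a hypothetical API response. The response shape and snake_case field name follow the description above; verify the exact casing against Google’s API reference before relying on it.

```python
# Hypothetical Search Analytics API response, shaped after the article's
# description. The dates, clicks, and field spelling are illustrative.
response = {
    "rows": [
        {"keys": ["2025-07-10"], "clicks": 120},
        {"keys": ["2025-07-12"], "clicks": 95},
    ],
    "metadata": {"first_incomplete_date": "2025-07-12"},
}

# metadata is optional, so guard the lookup with defaults.
cutoff = response.get("metadata", {}).get("first_incomplete_date")

def is_stable(row) -> bool:
    # Rows dated on or after first_incomplete_date may still change as
    # Google finalizes processing; ISO dates compare lexicographically.
    return cutoff is None or row["keys"][0] < cutoff

stable_rows = [r for r in response["rows"] if is_stable(r)]
print(len(stable_rows))  # 1
```

A reporting pipeline could flag or exclude the unstable rows, then re-fetch them once the incomplete window has passed.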

Why It Matters For SEO Reporting

This enhancement allows you to better distinguish between legitimate changes in search performance and temporary gaps caused by incomplete data.

To help reduce the risk of misinterpreting short-term fluctuations, Google’s documentation states:

“All values after the first_incomplete_date may still change noticeably.”

For those running automated reports, the new metadata enables smarter logic, such as flagging or excluding fresh but incomplete data to avoid misleading stakeholders.

Time Zone Consistency

All timestamps provided in the metadata field use the America/Los_Angeles time zone, regardless of the request origin or property location. Developers may need to account for this when integrating the data into local systems.

Backward-Compatible Implementation

The new metadata is returned as an optional object and doesn’t alter existing API responses unless requested. This means no breaking changes for current implementations, and developers can begin using the feature as needed.

Best Practices For Implementation

To take full advantage of this update:

  • Include logic to check for the metadata object when requesting recent data.
  • Consider displaying warnings or footnotes in reports when metadata indicates incomplete periods.
  • Schedule data refreshes after the incomplete window has passed to ensure accuracy.

Google also reminds users that the Search Analytics API continues to return only top rows, not a complete dataset, due to system limitations.

Looking Ahead

This small but meaningful addition gives SEO teams more clarity around data freshness, a frequent pain point when working with hourly or near-real-time performance metrics.

It’s a welcome improvement for anyone building tools or dashboards on top of the Search Console API.

The metadata field is available now through standard API requests. Full implementation details are available in the Search Analytics API documentation.


Featured Image: Roman Samborskyi/Shutterstock

Topic-First SEO: The Smarter Way To Scale Authority via @sejournal, @Kevin_Indig

Over the past few months, I’ve deeply analyzed how Google’s AI Overviews handle long-tail queries, dug into what makes brands visible in large language models (LLMs), and worked with brands trying to future-proof their SEO strategies.

Today’s Memo is the first in a two-part series where I’m covering a tactical deep dive into one of the most overlooked mindset shifts in SEO: optimizing for topics (not just keywords).

In this issue, I’m breaking down:

  • Why keyword-first SEO creates surface-level content and cannibalization.
  • What the actual differences are. (Isn’t this just a pillar-cluster approach? Nope.)
  • Thoughts from other pros across the web.
  • How to talk through these issues with your stakeholders, i.e., clients, the C-suite, and your teams (for premium subscribers).

And next week, I’ll cover how to build a topic map, and operationalize a topic-first approach to SEO across your team.

If you’ve ever struggled to convince stakeholders to think beyond search volume or wondered how to grow authority, this memo’s for you.


At some point over the last year, it’s likely you’ve heard the guidance that keywords are out and topics are in.

The SEO pendulum has swung. If you haven’t already been optimizing for topics instead of keywords (and you really should have), now’s the time to finally start.

But what does that actually mean? How do we do it?

And how are we supposed to monitor topical performance?

With all this talk about LLM visibility, AI Overviews, AI Mode, query fan-out, entities, and semantics, when we optimize for topics, are we optimizing for humans, algorithms, or language models?

Personally, I think we’re making this more difficult than it has to be. I’ll walk you through how to optimize for (and measure/monitor) topics vs. keywords.

Why Optimizing For Topics > Keywords In 2025 (And Beyond)

If your team is still focused on keywords over topics, it’s time to explain the importance of this concept to the group.

Let’s start here: The traditional keyword-first approach worked when Google primarily ranked pages based on string matching. But in today’s search landscape, keywords are no longer the atomic unit of SEO.

But topics are.

In fact, we’re living through what you (and Kevin) might call the death of the keyword.

Think of it like this:

  • Topics are the foundation and framing of your site’s organic authority and visibility, like the blueprint and structure of a house.
  • Individual keywords are the bricks and nails that help build it; optimizing for individual queries on their own, without topics to anchor them, doesn’t pull much weight.

If you focus only on keywords, it’s like obsessing over picking the right brick color without realizing the blueprint is incomplete.

But when you plan around (and optimize for) topics, you’re designing a structure that’s built to last – one that search engines and LLMs can understand as authoritative and comprehensive.

Google no longer sees a good search result as a direct match between a user’s query and a keyword on your page. That is some old SEO thinking that we all need to let go of completely.

Instead, search engines interpret intent and context, and then use language models to expand that single query into dozens of variations, a.k.a. query fan-out.

That’s why a piecemeal approach to targeting SEO keywords based on search volume, stage of the search journey, or even bottom-of-funnel (BOF) or pain-point intent can be wasted time.

And don’t get me wrong: Targeting queries that are BOF and solve core pain points of your audience is a wise approach – and you should be doing it.

But own the topics, and you can see your brand’s organic visibility outlast big algorithm changes.

Keyword-Only Thinking Limits Growth

And after all that, if it’s still a challenge convincing your stakeholders, clients, or team to pivot to topic-forward thinking, explain how it limits growth.

Teams stuck in keyword-first mode often run into three problems:

  1. Surface-level content: Articles become thin, narrowly scoped, and easy to outcompete.
  2. Cannibalization: Content overlap happens often; articles compete with each other (and lose).
  3. Blind spots: You miss related subtopics, opportunities to tailor content to personas, and problems within the topic that your audience actually cares about.

On the other hand, a topic-first approach allows you to build deeper, more useful content ecosystems.

Your goal is not to just answer one query well; it’s to become a go-to resource for the entire subject area.

Understanding The Topic Maturity Path: Old Way Vs. New Way

Let’s take a closer look at how these two approaches are different from one another.

Image Credit: Kevin Indig

Old Way: Keyword-First SEO

The classic approach to SEO centered around picking individual keywords, assigning each one a page, and publishing content that aimed to rank for that phrase.

This model worked well when Google’s ranking signals were more literal (think string matching, backlink anchor text, and on-page optimization carrying most of the weight).

But in 2025 and beyond, this approach is showing its age.

Keyword-first SEO often looks like this:

  • Minimal internal cohesion across pages; articles aren’t working together to build topic depth or reinforce semantic signals.
  • Content decisions are often driven by average monthly search and tool-based keyword difficulty scores, rather than intent or persona-specific needs.
  • A high-effort, low-durability content-first SEO strategy; posts may rank initially and hold for a while, but they rarely stick or scale.
  • Monitoring performance is often focused on traffic projections and done by page type, SEO-tool-informed intent type, query rankings, and (yes) even sometimes topic groups.

But even when teams adopt a topic cluster-first model (like grouping related keywords into topic clusters or deploying a topic-focused pillar + cluster strategy), they often stay tethered to outdated keyword logic.

The result? Surface-level coverage for single keywords, frequent content cannibalization, and a site structure that might seem organized but still lacks strategic topic optimization.

Without persona insights, or a clear content hierarchy built around core topics, you’re building with bricks, but no real authority blueprint.

Wait a second. Is optimizing for topics any different from the classic pillar + topic cluster approach?

Yes and no.

A pillar + cluster model (a.k.a. hub and spoke) is a framework that can organize a topical approach.

But strategists should shift from matching pages to exact keywords → covering concepts deeply instead.

This classic framework can support topic optimization, but only if it’s implemented with a topic-first mindset.

Here are the primary differences:

  • Keyword-driven pillar + cluster model: Pillar = covers seed keyword target(s); clusters = cover long-tail variations of seed keywords.
  • Topic-driven pillar + cluster model: Pillar = offers a comprehensive guide to the topic; clusters = provide in-depth support for key concepts, different personas, related problems, and unique angles.

Simply selecting high-volume keywords to optimize for in your pillar + cluster strategy plan doesn’t work like it used to.

So, a pillar + cluster plan can help you organize your approach, but you’ll need to cover your core topics with depth and from a variety of perspectives, and for each persona in your target audience.

New Way: Topic-First SEO

Your future-proof SEO strategy doesn’t start with a focus on keywords; it starts with focusing on your target people, their problems, and the topics they care about.

Topic-first SEO approaches content through the lens of the real-world solutions your brand provides through your products and services.

You build authority by exploring a topic (one that you can directly speak to with authority) from all relevant angles: different personas, intent types, pain points, industry sectors, and contexts of use.

But keep in mind: Topic-first SEO is not exactly a page volume game, although the breadth and depth of your topic coverage are crucial.

Topic-first SEO involves:

  • Covering your core, targeted topics across personas.
  • Investing in “zero-volume” content based on actual questions and needs your target audience has.
  • Producing content within your topic that offers different perspectives and hot takes.
  • Building authority with information gain: i.e., new, fresh data that offers unique insights within your core targeted topics.

And guess what? This approach aligns with how Google now understands and ranks content:

  1. Entities > keywords: Google doesn’t just match “search strings” anymore. It understands concepts and audiences (and how they’re related) through the knowledge graph.
  2. Content built around people, problems, and questions: You’re not answering one query when you optimize for a topic as a whole; you’re solving layered, real-world challenges for your audience.
  3. Content journeys, not isolated posts: Topic-first strategies map content to different user types and their stage in the journey (from learning to buying to advocating).
  4. More durable visibility + stronger links: When your site deeply reflects a topic and tackles it from all angles, it attracts both organic queries and natural backlinks from people referencing real insight and utility.
  5. That E-E-A-T we’re all supposed to focus on: Kevin discusses this a bit more when he digs into Google Quality Rater Guidelines in building and measuring brand authority. But this is an absolute no-brainer: Taking a topic-first approach actively works toward establishing Experience, Expertise, Authoritativeness, and Trustworthiness.

I wanted to know how others are doing this, so over on LinkedIn, I asked for your thoughts and questions.

Here are some that stuck out to me that I think we can all benefit from considering:

Lily Grozeva asks: “Is covering a topic and establishing a brand as an authority on it still a volume game?”

My answer: No. I think Backlinko is a good example. The site built incredible visibility with just a few, but very deep guides.

Image Credit: Kevin Indig

Diego Gallo asks: “Any tips on how to decide if a question should belong to a page or be its own page?”

My answer: “In my experience, one way to determine that is cosine similarity between the (tokenized, embedded) question and the main topics / intents of the pages that you can pick from.”
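That heuristic can be sketched with toy vectors. The embeddings and the 0.8 threshold below are invented for illustration; a real pipeline would embed the question and each page’s main topic with a sentence-embedding model.

```python
import math

# Sketch of the assignment heuristic: compare a question's embedding to
# each page's topic embedding; below a similarity threshold, the question
# warrants its own page. All vectors and the threshold are illustrative.
def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

page_topics = {
    "pricing-guide": [0.9, 0.1, 0.0],
    "setup-tutorial": [0.1, 0.9, 0.2],
}
question_vec = [0.85, 0.15, 0.05]  # e.g., "How much does the pro plan cost?"

best_page, best_score = max(
    ((page, cosine(question_vec, vec)) for page, vec in page_topics.items()),
    key=lambda pair: pair[1],
)
THRESHOLD = 0.8  # below this, spin the question out into its own page
target = best_page if best_score >= THRESHOLD else "new page"
print(target)  # pricing-guide
```

The threshold is the judgment call: set it too low and pages bloat with loosely related questions; too high and you fragment the topic into thin pages.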

Diego also left a good tip for covering all relevant intents: Build an “intent template” for each page (e.g., product landing page, blog article, etc.). Base the template on what works well on Google.

Image Credit: Kevin Indig

Matthew Mellinger called out that you can use Google’s People Also Asked questions to get clarity on which questions to answer on a page.

Image Credit: Kevin Indig

By the way, you can also use the intent classifier I built for premium subscribers for this task!

Gianluca Fiorelli put a cool analogy on the table:

Image Credit: Kevin Indig

Not Strategizing With A Topic-First Mindset? You’re Outdated

Next week, we’re going to take a deep look at operationalizing a topic-first SEO strategy, but here are some final thoughts.

While there are so many unknowns in the current search landscape, there are a few truths we can ground ourselves in, whether optimizing for search engines or LLMs:

1. Your brand can still own a topic in the AI era.

As shown in the data from the UX study of AIOs, brand/authority is now the first gate users walk through when considering a click off the SERP, and search intent relevance is the second; snippet wording only matters once trust is secured.

If people are going to click, they’re going to click on the familiar and authoritative. Be the topical authority in your areas of expertise and offerings.

Have your brand show up again, and again, and again in search results across the topic. It’s simple, but it’s hard work.

2. I don’t think focusing on a topic-first mindset could backfire in any way (in 2025 or beyond).

Demonstrating to your core ICPs – whether they find you via paid ads, organic search, LLM chats, socials, or word-of-mouth – that you understand the topics they care about and the questions they have, and that you provide solutions for their specific needs through authoritative, branded website content, only builds trust … no matter how your brand is found.

3. Build topic systems, not just articles or pages.

Integrators need to take a page out of the aggregator’s product-led SEO playbook: Create a comprehensive system (similar to TripAdvisor’s millions of programmatic pages supported by user-generated content (UGC) reviews, but you don’t need millions 😅) built around your topics of expertise that tackle perspectives, solutions, and questions around each persona type for each sector you serve.

Build the organizational structure within your site that makes these topics and personas easy to navigate for users (and easy to crawl and understand for bots/agents).
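A topic system like this can be prototyped before any pages are built. The sketch below is a minimal, hypothetical example (the topic, persona, and sector names are placeholders, not from the article) of generating a hub-and-spoke URL structure: one hub page per topic, one spoke page per persona/sector pairing, so both users and crawlers can navigate the topic cleanly.

```python
from itertools import product

# Hypothetical taxonomy -- replace with your own topics, personas, and sectors.
topics = ["contract-automation", "e-signatures"]
personas = ["legal-ops", "procurement"]
sectors = ["healthcare", "fintech"]

def hub_urls(topics, personas, sectors):
    """Generate a crawlable hub-and-spoke URL structure:
    one hub page per topic, one spoke per sector/persona pairing."""
    urls = []
    for topic in topics:
        urls.append(f"/topics/{topic}/")  # hub page
        for persona, sector in product(personas, sectors):
            urls.append(f"/topics/{topic}/{sector}/{persona}/")  # spoke page
    return urls

for url in hub_urls(topics, personas, sectors):
    print(url)
```

With two topics, two personas, and two sectors, this yields ten pages; the point is that the structure is systematic, so internal linking and navigation fall out of the taxonomy instead of being bolted on page by page.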

4. Persona or ICP-based content is more useful, less generic, and built for the next era of personalized search results.

‘Nuff said. If you strategize topic optimization through the lens of personas (even to the point of including real interviews, surveys, comments, and tips from these persona types), you’re adding to the conversation with depth and unique data.

If you’re not building audience-first content, does optimizing for LLMs and search bots even matter? You’ll gain visibility, but will you gain trust once you finally earn that click?


Featured Image: Paulo Bobita/Search Engine Journal

Ask An SEO: Why Your Content Is No Longer Good Or Helpful via @sejournal, @rollerblader

This week’s Ask An SEO question is from someone who would like to know why their content is no longer “good” or “helpful.”

The person is curious why they no longer rank for “their keywords” when other pages don’t have original photos or content that proves they have real experience.

The person would like to remain anonymous, so I’m respecting their request.

This is a long post. The top section is how we think about content and why it should rank. The second section shows how to implement these ideas across multiple niches, from travel to food, roofing, and more.

How We Go About Creating Content

Their argument for why their content should rank is based on a concept called E-E-A-T.

Experience, Expertise, Authoritativeness, and Trustworthiness are not ranking factors or signals; E-E-A-T is a trust builder for readers.

When done well, it can cause your content to get citations and backlinks naturally.

Your goal as a creator should be to show expertise and experience through original thoughts that only a person with first-hand knowledge knows.

That is how E-E-A-T works for SEO. There is no score or metric, and E-E-A-T is not a factor for ranking a page or website; it is an SEO concept.

I reviewed the website from the question submission, and it was similar to the slew of sites submitted for audits when the helpful content update killed niche sites. My feedback was the same.

The content is not original or unique, and it is not helpful. The person was just lucky they had the traffic for as long as they did.

Yes, the information given was from their personal experience, but it could have been generated by a large language model (LLM). Although the images were unique, they were not something unique to the topic or entity, and they did not help the user with a complete solution.

I live in Washington, D.C., and take photos when I go running most days, but I’m not a professional, and I do not sell my shots. It is a hobby. I could technically write a DC photography blog post or guide and try to rank it.

In order to do so, I not only need to share original photos, but I need to share original thoughts and things that will help someone wanting to take photos of DC, a complete solution. “The complete solution” part is what a lot of the sites I audited that got wiped out were missing.

The first step in providing this solution is to look for what people are asking and find a way to present it.

A question I get from friends on Facebook and sometimes on Instagram when I post a photo is “How do you get the lighting?” or “What do you use to edit people out?”

These become two topics that can be blog posts or YouTube videos on their own, and tips that only I would know because I’m the person taking these photos over and over with the same results.

This is where I can display E-E-A-T, provide the same information as everyone else, and then provide the information only I would know as the creator.

Instead of saying “take these photos at golden hour” and showing myself at golden hour, I should show what it looks like before, during, and after.

This helps the person know what to look for light-wise, so they can snap their shot around the same time I do pre-processing. But that isn’t enough. I need to go five steps further. I’ll use taking a photo of the monuments on the mall as the example.

First, I’d share that I don’t edit the people out, then write a section of the article dedicated to “taking photos of XYZ monument without people.”

In this section, I’ll give the instructions on how I calculate when fewer people will be around, the light is still diffused, and you get the glow of golden hour.

If you’re curious, the trick for this is showing up about 10 minutes after peak golden hour and facing east in the morning. People start to leave, and you get a clean shot with the sunrise.

Second, I need to add a tip that only someone with a lot of experience and time spent on the craft would know. This could be looking for rainy mornings: there are few to no people out, and when you combine the weather factor with being just past golden hour, you are likely to get a photo with few or no people in it.

Third is to add a tip only I would know. This can be “this does not work for blue hour or sundown because people crowd before the sun sets, and it is dark afterward, eliminating your light source.”

But that doesn’t fully provide a solution. I need to give an alternative, as they still want a people-free photo. The opportunity here is to give three or four other monuments with examples of why they are better than the XYZ monument for sunset, including getting a photo without anyone in it.

Sounds like a lot of work, right?  It is, but only if you’re not an expert in your field. Sites that did not go this far got wiped out. For our clients, we go even further. Here’s how.

Something I have not seen on photography sites is specific time tracking that tells someone exactly when to show up.

For the DC monuments, I tracked the time it takes for people to leave after the sun rises during cherry blossoms, so I can get a photo as the sky changes and the trees are in bloom.

After keeping my spreadsheet for a few years, I knew how many days before and after peak bloom people show up and leave. I also learned when they’d be dispersed enough from my favorite shots and angles.

It wasn’t perfect, but the spreadsheet did the job, and I get my favorite photos each year.
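The spreadsheet logic above is simple enough to sketch in code. This is a hypothetical reconstruction (the numbers are made up for illustration, not from my actual log): each row records a morning’s offset from peak bloom and how many minutes after sunrise my favorite angles were clear, and the function averages the wait per day offset.

```python
from statistics import mean

# Hypothetical log: days relative to peak bloom, and minutes after sunrise
# until my favorite angles were clear of crowds. Illustrative values only.
observations = [
    {"days_from_peak": -2, "minutes_until_clear": 55},
    {"days_from_peak": 0,  "minutes_until_clear": 80},
    {"days_from_peak": 1,  "minutes_until_clear": 70},
    {"days_from_peak": 3,  "minutes_until_clear": 35},
    {"days_from_peak": 4,  "minutes_until_clear": 30},
]

def clear_time_by_day(observations):
    """Average minutes-after-sunrise until the shot is clear, per day offset."""
    by_day = {}
    for obs in observations:
        by_day.setdefault(obs["days_from_peak"], []).append(obs["minutes_until_clear"])
    return {day: mean(vals) for day, vals in sorted(by_day.items())}

print(clear_time_by_day(observations))
```

A few seasons of rows like this is all it takes to know that, say, three days after peak bloom the wait drops by half. The data itself is the unique asset.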

For this theoretical article, I could post the spreadsheet to help others, and that is something unique to my site and may get backlinks from travel guides, photographers, and DC tourism companies.

I do this for other locations as well. If you’re a creator in the hobbyist, travel (bloggers and nomads), or food space, you should be going into this level of detail.

For New Orleans, and when I was in St. Maarten last January, I used live-streaming cameras at specific locations to track when to show up to take my photos.

For sunrise, I looked at when people left the beach by tracking movement. The beach cams showed sunrise, but the Bourbon Street cams did not, so I used the time when fewer people passed per minute.
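The “fewer people passing per minute” approach boils down to finding the quietest stretch in a series of counts. Here is a minimal sketch of that idea, with hypothetical per-minute counts standing in for what you would tally off a live cam:

```python
# Hypothetical per-minute people counts from a live cam, starting at sunrise.
counts = [12, 10, 9, 7, 6, 6, 5, 3, 2, 2, 4, 7, 11, 15]

def quietest_window(counts, window=3):
    """Return (start_minute, avg_count) of the window with the fewest
    people passing per minute."""
    best_start, best_avg = 0, float("inf")
    for start in range(len(counts) - window + 1):
        avg = sum(counts[start:start + window]) / window
        if avg < best_avg:
            best_start, best_avg = start, avg
    return best_start, best_avg

start, avg = quietest_window(counts)
print(f"Quietest 3-minute stretch starts {start} min after sunrise "
      f"(~{avg:.1f} people/min)")
```

Nothing fancy: a sliding-window minimum over tally marks. The work is in watching the cam and logging the counts, not in the math.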

Applying This In The Real World And To Multiple Niches

By being able to see sunrise and sunset, I knew which beaches and angles to go to.

I was also able to figure out when I’d still have the right lighting with fewer chances of people being there by tracking when people show up and leave, and how many people are in each spot by day.

For my photos in the French Quarter, I used the Bourbon Street live cams.

You can see when the streets are less full, when lights go on and off, and capture the mood and setting you want, whether it is Jackson Square, Bourbon St., Canal Street, etc.

I personally like street lamps being lit in my photos, so that’s why I tracked their on and off times.

Now it is time to present the content. Written text is only a portion; the instructions can be presented in:

  • An ordered list.
  • A spreadsheet that can be downloaded or accessed online with instructions on it.
  • Videos sharing the steps visually so the person can follow along.
  • A table that lists the steps and what to do with notes, featuring alternatives and examples.
  • Infographics that walk through the steps and include reminders and visuals.

This is pretty specific, but it applies to the work we do for clients.

If you write about outdoor sports like hiking or snowboarding, you can do this for specific slopes or trails that are always heavily trafficked, for readers who want to enjoy them without congestion.

I was able to apply this to the Velocicoaster at Universal Orlando and not have to wait in a huge line.

The same goes for fashion and food sites: when are the slower times, and how can shoppers verify them?

Google Business Profiles sometimes list heavy and slow traffic hours for stores, restaurants, and entertainment venues by day and hour.

In the case studies I share on my blog, I say we don’t build backlinks anymore, and that is true.

By thinking about how and why our data, skill sets, products, services, etc. are unique and how we can apply our knowledge, the content starts to rank and people cite and source us.

We get the backlinks naturally, and the clients grow.

If you’re a contract attorney, you know the trends that could signal shifts in the markets and how businesses are growing, what their concerns are, and the direction things are heading.

Publish the number of each contract type as data points, without revealing any client information, and share how it correlates with or goes against what traditional and social media are saying.

Home builders, contractors, and interior designers know what is about to be popular because demand starts to spike, if people are downsizing or looking for more space and luxury, and how it compares to previous years. This can get B2B and B2C traffic and backlinks.

  • B2C comes from potential customers that want to know what to buy or what is on trend, or to see what was popular five years ago and if it is making a comeback.
  • B2B wants to know what materials, colors, and other items they should plan to order and stock as the demand will be coming.

By creating renderings and solutions for both, you can collect leads and hopefully convert them, whether they’re brides, families planning religious events like a Bat Mitzvah or Quinceañera, or homeowners looking for kitchen renovations and roof repairs.

If you’re a retailer or affiliate, optimize your product pages for these before the demand starts vs. having to optimize and compete with the companies and vendors already ranking.

You have the advantage, as you know what will be in demand months in advance.

Travel sites can go five steps further than saying “here is a wheelchair-friendly entrance.”

Share where the nearest bathrooms to that entrance are and the easiest pathway through the museum that does not require stairs.

You can also share when they’re likely to be less crowded, so you don’t have to fight through crowds to see the exhibits or wait for elevators.

And make sure to post photos of how to find them, not just you at the location. This is how you help the reader and create rank-worthy content.

The same goes for castles in Europe, and beaches or temples in Asia. Help people with more than just saying a place is accessible; give them the resources that only a person with real experience would know.

Show images of what to look for, not just the signature photo from the space. If you don’t share where to take that photo from, the person has to do more searching.

Anyone can show a photo saying they’ve been somewhere. That does not show E-E-A-T, and neither does saying it is “reviewed by” if the content does not have unique and original thoughts by the experts.

Take your content five to 10 times further and make sure the person does not have to do another search after or while performing a task.

This is how you create content that ranks and gets backlinks.

The content in these cases is helpful, as long as you use proper formatting with it so users can thumb through with ease.


Google Says AI Won’t Replace The Need For SEO via @sejournal, @martinibuster

Google’s John Mueller and Martin Splitt discussed the question of whether AI will replace the need for SEO. Mueller expressed a common-sense opinion about the reality of the web ecosystem and AI chatbots as they exist today.

Context Of Discussion

The context of the discussion was about SEO basics that a business needs to know. Mueller then mentioned that businesses might want to consider hiring an SEO who can help navigate the site through its SEO journey.

Mueller observed:

“…you also need someone like an SEO as a partner to give you updates along the way and say, ‘Okay, we did all of these things,’ and they can list them out and tell you exactly what they did, ‘These things are going to take a while, and I can show you when Google crawls, we can follow along to see like what is happening there.’”

Is There Value In Learning SEO?

It was at this point that Martin Splitt asked if generative AI will make having to learn SEO obsolete or whether entering a prompt will give all the answers a business person needs to know. Mueller’s answer was tethered to how things are right now and avoided speculating about how things will change in a year or more.

Splitt asked:

“Okay, I think that’s pretty good. Last but not least, with generative AI and chatbot AI things happening. Do you think there’s still a value in learning these kind of things? Or can I just enter a prompt and it’ll figure things out for me?”

Mueller affirmed that knowing SEO will still be needed as long as there are websites, because search engines and chatbots need the information that exists on websites. He offered examples of local businesses and ecommerce sites that still need to be found, regardless of whether that’s through an AI chatbot or search.

He answered:

“Absolutely value in learning these things and in making a good website. I think there are lots of things that all of these chatbots and other ways to get information, they don’t replace a website, especially for local search and ecommerce.

So, especially if you’re a local business, maybe it’s fine if a chatbot mentions your business name and tells people how to get there. Maybe that’s perfectly fine, but oftentimes, they do that based on web content that they found.

Having a website is the basis for being visible in all of these systems, and for a lot of other things where you offer a service or something, some other kind of functionality on a website where you have products to sell, where you have subscriptions or anything, a chat response can’t replace that.

If you want a t shirt, you don’t want a description of how to make your own t-shirt. You want a link to a store where it’s like, ‘Oh, here’s t-shirt designs,’ maybe t-shirt designs in that specific style that you like, but you go to this website and buy those t-shirts there.”

Martin acknowledged the common sense of that answer and they joked around a bit about Mueller hoping that an AI will be able to do his job once he retires.

That’s the context for this part of their conversation:

“Okay. That’s very fair. Yeah, that makes sense. Okay, so you think AI is not going to take it all away from us?”

And Mueller answers with the comment about AI replacing him after he retires:

“Well, we’ll see. I can’t make any promises. I think, at some point, I would like to retire, and then maybe AI takes over my work then. But, like, there’s lots of stuff to be done until then. There are lots of things that I imagine AI is not going to just replace.”

What About CMS Platforms With AI?

Something that wasn’t discussed is the trend of AI within content management systems. Many web hosts and WordPress plugins are already integrating AI into the workflow of creating and optimizing websites. Wix has already integrated AI into its workflow, and it won’t be long before AI has a stronger presence within WordPress, which is what the new WordPress AI team is working on.

Screenshot Of ChatGPT Choosing Number 27

Will AI ever replace the need for SEO? Many easy things that can be scaled are already automated. However, many of the best ideas for marketing and communicating with humans are still best handled by humans, not AI. The nature of generative AI, which is to generate the most likely answer or series of words in a sentence, precludes it from ever having an original idea. AI is so locked into being average that if you ask it to pick a number between one and fifty, it will choose the number 27 because the AI training binds it to picking the likeliest number, even when instructed to randomize the choice.
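The “pick 27” behavior can be illustrated with a toy model. This is a hedged sketch, not how any specific chatbot is implemented: the probabilities below are invented for illustration, but they show why greedy (or low-temperature) decoding collapses to the modal answer every time, while only a higher sampling temperature lets non-modal answers through.

```python
import random

# Toy next-token distribution for "pick a number between 1 and 50".
# Illustrative probabilities only -- not measured from any real model.
probs = {"27": 0.18, "37": 0.12, "17": 0.10, "7": 0.08, "42": 0.07}

def greedy_pick(probs):
    """Greedy decoding: always emit the single likeliest token."""
    return max(probs, key=probs.get)

def sampled_pick(probs, temperature=1.0):
    """Temperature sampling: higher temperature flattens the distribution,
    making non-modal answers more likely."""
    scaled = {tok: p ** (1 / temperature) for tok, p in probs.items()}
    total = sum(scaled.values())
    r = random.uniform(0, total)
    upto = 0.0
    for tok, weight in scaled.items():
        upto += weight
        if r <= upto:
            return tok
    return tok  # guard against float rounding

print(greedy_pick(probs))  # always the modal token
```

Under greedy decoding, the model’s “random” number is deterministic: whatever token has the highest probability wins, every single time.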

Listen to Search Off The Record at about the 24-minute mark.

Featured Image by Shutterstock/Roman Samborskyi