Scaling content creation without compromising quality (with template)

SEO is largely about getting the right content in front of the right audience. When you’ve been doing that for a while, there comes a time when you want to scale content production. Scaling content creation means aiming to produce more content to reach new audiences. While that’s a good idea, you need to find a way to scale while keeping the level of quality you’ve always had. Let’s go over how to scale your content production step by step, looking at common problems and their solutions.


What is content scaling?

Content scaling is about making your content process more efficient. The goal should be to make more content without lowering the quality. First, you must examine every step of your content creation process — from brainstorming to research, editing, publishing, and reporting. Once you have the process detailed, you can find ways to do those tasks faster and predictably. 

A well-scaled process helps you create a lot of content. This approach helps you build a solid system rather than simply adding more articles. For instance, your content team could develop a checklist for reviewing articles, introduce a content calendar to improve planning, and set up clear tone-of-voice guidelines. These steps help you stay consistent and true to your brand — whether you produce one weekly article or dozens.

Why scaling content matters

Scaling content production can directly help your business. If you consistently publish high-quality content on your site, search engines will see that your site is active and reliable, and they are likelier to treat you as trustworthy. By targeting the right audience with the right search intent and message, you can improve your search visibility and generate more traffic for your content.

In addition, producing content more consistently and following a plan can help you reach a bigger audience. More articles mean more opportunities to write about topics that interest your different audience groups. In the end, this will broaden your brand’s presence. You’ll have a bigger chance of people seeing you as a trusted source if you offer helpful insights and solutions to their problems.

All your content can help potential customers make decisions. It gives you another way to address their concerns and answer their questions. By doing this strategically, you can keep your audience engaged and nudge them closer to that final decision, whether that’s a sale, an information request, or a newsletter signup.

Scaling your content production also supports your branding. When you create well-organized content over a longer period, you can support your brand voice and recognition. That reliability helps build trust and strengthens your reputation. 

The biggest challenges in scaling content

If you want to scale your content production, you must overcome several hurdles; if you don’t address them, they will impact the quality and consistency of your content.

Quality control and consistency

When you produce more content, you need to make sure that every piece represents your brand well. However, catching errors or maintaining the proper tone becomes harder because you have more content to review. If you don’t do this well, there’s a risk that your articles will vary in tone or style. Without proper guidelines or a good editorial process, your content quality may suffer when you publish more and more.

For example, you can miss issues like tone, formatting, or factual errors without a standard editing checklist. If you do this for a while and people start to notice, they can form a different view of your brand. It would almost look like you don’t care about these issues. You need to set clear quality benchmarks and a solid review process. Consistent editing with fixed content rules helps everything you publish meet the same standards.

Handling different audience needs

Ideally, you write for different audience groups rather than targeting just one. Every segment has its own interests, problems, and questions. But when you scale your output, you risk producing mainly generic articles that appeal to no one.

If you haven’t yet segmented your audience, do so and focus your content on those specific groups. Your content will be more useful to the people in each group as a result.

Process difficulty and extra management work

More content means more moving parts to manage. Each article needs research, writing, review, fact-checking, and publishing. This is fine when you publish a few posts a month, because you can handle these steps by hand. But growing your output complicates things once you juggle many deadlines, writers, and quality checks.

Complexity leads to bottlenecks. If you struggle with one thing, that might eventually slow down everything. Think of it like this: when you don’t scale your editorial process, you will eventually have a pile of articles that need approval. This grinds your publication flow to a halt. Develop a system that divides tasks into repeatable steps. Use content calendars and checklists to track progress and make managing projects easier. 

Balancing speed and thoughtfulness

Scaling content production can lead to pressure to cut corners to meet deadlines. When the speed of publication comes into play, there’s a high chance that content will become less developed. This shouldn’t happen. Every piece of content should be carefully planned and produced. Rushing only leads to content that lacks depth, accuracy, or clarity. 

Of course, this is easier said than done. You have to find ways to increase efficiency without sacrificing the quality of your content. Start by streamlining your process, breaking it up into smaller tasks. Set up a system that monitors quality while giving you enough room to be flexible.  

Building a repeatable content creation process

Scaling your content production reliably requires setting up a solid content process. That process should be easily repeatable and have clear tasks, which will help keep your team on track. 

Map the entire content workflow

Describe each content task and work your way through the list of what has to be done. Write down all phases, from conception through publication. This will help you understand where delays or errors creep in. Consider drawing a flow diagram or another visual. This overview will act as your guide.

Create a content calendar

Use a content calendar to plan your publishing schedule. Proper planning helps you keep track of deadlines, even if they are for different outlets. Thanks to your content plan, your team can write content in advance and, hopefully, without stressing out about deadlines too much.

Develop detailed briefs and outlines

Content briefs are a great way to align writers — see below for an example. A brief should, at the very least, include the subject, target audience, key messages, and keywords the writer should target. Once the brief is approved, create an outline for the content and fill in the structure. A good content brief speeds up the writing process while ensuring that the content is well targeted.

Implement a style guide

A style guide helps ground every piece of content in a consistent tone of voice and formatting. It should include rules for tone, punctuation, formatting, and whatever else makes sense to share. You can easily share this guide with anyone on your team, including freelancers.

Use checklists for each stage

You’ll find it easier to manage once you break the process down into small tasks. Make a checklist for tasks such as researching, writing, and editing. Having a proper checklist helps you make sure that you don’t forget anything. This could be checking facts, improving readability, or using proper SEO tactics. Your lists will help you scale your content production while maintaining quality output.

Standardize tools and platforms

Use well-known tools to manage tasks in your team. Think of project management tools like Jira or Asana, shared calendars in CoSchedule, Canva for visual designs, and document templates in Microsoft Office. Many companies use Google Docs to collaborate on documents. In that case, standardize on a shared set of Google Docs extensions, which makes the workflow easier to scale.

Write a good manual or checklist for these tools so that anyone — from in-house writers to external freelancers — follows the same steps. Standardization makes this work and helps apply important SEO best practices properly.

All of these things help your team routinely produce quality content. Making the process repeatable reduces the chance of errors and wasted time, so you can scale without losing what makes your content awesome. 

Strategies to scale without losing quality

Careful planning is one of the best ways to scale your content without lowering its quality. Another great option is to use clear methods to make your work more effective. 

Develop a strong content strategy and workflow 

As always, start with a solid plan that includes your goals, topics, and the audience you want to reach. Creating content for your audience is much easier when everyone truly understands who those people are. A good workflow avoids delays and helps people move from one task to another.

Use a detailed content calendar

We’ve discussed the importance of content calendars, and you really have to see these as your roadmap. A calendar shows all upcoming publications, deadlines, and the status of various projects. A good calendar keeps everyone up to date at all times and makes sure the work is nicely spread out. Good planning prevents missed deadlines.

Use template structures

Templates help you standardize your work, as they offer a reusable structure for common types of content. Each type of content can have its own structure to fill in. These templates help writers speed up their work while maintaining consistency across articles. 

Repurpose content thoughtfully

Look at what you already have and see how it can be adapted into a different form. For example, you can split a long-form article into several videos or a series of shorter posts. This strategy saves time while also delivering fresh material in new formats. Make sure to adapt the new content to the correct audience. 

Assign clear roles within your team 

Find out your team members’ strengths and have them do what they do best. A writer should handle the initial draft, while an editor reviews the work and a trusted subject matter expert checks the content for accuracy. Clear roles let people focus on what they do best, which helps preserve content quality.

Maintaining high-quality content at scale

It isn’t easy to maintain content quality when scaling content production. To make the process more manageable, you should establish habits and use tools that help you make sure that every piece of content meets your standards. 

Follow your style guide

Setting up a good style guide keeps your writing consistent. Your style guide should include information on your content’s tone of voice, the terminology you can and can’t use, and how you structure and format it. Share this guide with your team.

Schedule periodic audits

Similarly, regularly review your existing content to see if it’s outdated or needs to adapt to changes in your brand messaging. This helps keep your older content relevant and accurate. 

Use tools when appropriate

Tools can help you scale your content production. Even a tool like our Yoast SEO plugin can support your content work. Good content tools help with formatting, readability, and keyword placement, and some even help with on-page SEO.

Using Generative AI for scaling content output

Using AI to scale content production might seem like an easy win, but please be careful. Generative AI can definitely be a valuable part of a content process. However, it is not without issues and needs oversight from real people.

Human oversight makes sure that the output aligns with your brand’s voice and content standards. You can use generative AI as a starting point or a helpful assistant, but not as a complete replacement for your real writers. Your use of AI should have a clear process to bring the content up to your desired quality level.

Conclusion to scaling content production

Scaling up content production shouldn’t mean lower quality. Mostly, it’s about knowing the content process inside out. Once you have that, you can lay out the steps for everyone to follow. With a good process, you can meet your goals and still maintain the quality of the content. Be sure to set up content templates, calendars, and clear roles for your team. Make the adjustments and see how this can lead to better results. 

Bonus: Content brief template for SEO

Are you looking for a basic content brief template that helps scale your content production? Check out the one below:

  • Title/headline suggestion: [Insert title]
  • Primary keyword: [Main keyword]
  • Secondary keywords: [Keyword 1], [Keyword 2]
  • Search intent: [Informational, commercial, transactional, etc.]
  • Audience persona: [If needed, description of audience persona]
  • Content objective: [What is the content meant to achieve]
  • Benchmark content: [URLs of best-in-class content about this topic]
  • Word count range: [Word count]
  • Tone and style guidelines: [Tone and style]
  • Outline/sections: Introduction; Main points/headings; Subheadings; Conclusion
  • SEO requirements: Meta title: [Title]; Meta description: [Description]; Header tags: H1, H2, H3; URL: [Proposed URL for content]
  • Call to action: [What do you want people to do/click on?]
  • Internal and external links: Internal: [Links]; External: [Links]
  • Visuals and multimedia: [List of visuals]
  • Examples/references: [Links to examples/references]
  • Deadline and submission details: [Deadline and submission instructions]
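If your team tracks briefs in a tool rather than a document, the same template can live as structured data. Below is a minimal sketch in Python: the field names mirror the template above, but the specific required-fields rule and the `validate_brief` helper are illustrative assumptions, not part of the template itself.

```python
# Sketch: a content brief as structured data, mirroring the template above.
# Which fields count as "required" is an illustrative assumption.

REQUIRED_FIELDS = [
    "title", "primary_keyword", "secondary_keywords", "search_intent",
    "content_objective", "word_count_range", "outline", "deadline",
]

def validate_brief(brief):
    """Return the list of required fields that are missing or empty."""
    return [field for field in REQUIRED_FIELDS if not brief.get(field)]

if __name__ == "__main__":
    brief = {
        "title": "How to scale content production",
        "primary_keyword": "scaling content creation",
        "secondary_keywords": ["content scaling", "content process"],
        "search_intent": "informational",
        "content_objective": "Explain how to scale without losing quality",
        "word_count_range": "1500-2000",
        "outline": ["Introduction", "Main points/headings", "Conclusion"],
        # "deadline" intentionally left out
    }
    print("Missing fields:", validate_brief(brief))
```

A check like this makes a brief reviewable before it reaches a writer, which is one way to keep quality consistent as volume grows.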
Analysis Forecasts More Vulnerabilities In 2025 via @sejournal, @martinibuster

A new analysis predicts that the number of reported vulnerabilities will reach record highs in 2025, continuing the trend of rising cybersecurity risks and increased vulnerability disclosures.

Analysis By FIRST

The analysis was published by the Forum of Incident Response and Security Teams (FIRST), a global organization that helps coordinate cybersecurity responses. It forecasts almost 50,000 vulnerabilities in 2025, an increase of 11% over 2024 and a 470% increase from 2023. The report suggests that organizations need to shift from reactive security measures to a more strategic approach: one that prioritizes vulnerabilities based on risk, plans patching efforts efficiently, and prepares for surges in disclosures rather than struggling to keep up after the fact.

Why Are Vulnerabilities Increasing?

There are three trends driving the increase in vulnerabilities.

1. AI-Driven Discovery And Open-Source Expansion Are Accelerating CVE Disclosures

AI-driven vulnerability discovery, including machine learning and automated tools, is making it easier to detect flaws in software, which in turn leads to more CVE (Common Vulnerabilities and Exposures) reports. AI allows security researchers to scan larger amounts of code and quickly identify flaws that would have gone unnoticed using traditional methods.

The press release highlights the role of AI:

“More software, more vulnerabilities: The rapid adoption of open-source software and AI-driven vulnerability discovery has made it easier to identify and report flaws.”

2. Cyber Warfare And State-Sponsored Attacks

State-sponsored attacks are increasing, which in turn leads to more of these kinds of vulnerabilities being discovered.

The press release explains:

“State-sponsored cyber activity: Governments and nation-state actors are increasingly engaging in cyber operations, leading to more security weaknesses being exposed.”

3. Shifts In CVE Ecosystem

Patchstack, a WordPress security company, identifies and patches vulnerabilities, offering vulnerability detection and virtual patches. Its participation in the CVE ecosystem is adding to the number of vulnerabilities discovered every year, particularly those affecting WordPress.

The press release provided to Search Engine Journal states:

“New contributors to the CVE ecosystem, including Linux and Patchstack, are influencing disclosure patterns and increasing the number of reported vulnerabilities. Patchstack, which focuses on WordPress security, is playing a role in surfacing vulnerabilities that might have previously gone unnoticed. As the CVE ecosystem expands, organizations must adapt their risk assessment strategies to account for this evolving landscape.”

Eireann Leverett, FIRST liaison and lead member of FIRST’s Vulnerability Forecasting Team, highlighted the accelerating growth of reported vulnerabilities and the need for proactive risk management, stating:

“For a small to medium-sized ecommerce site, patching vulnerabilities typically means hiring external partners under an SLA to manage patches and minimize downtime. These companies usually don’t analyze each CVE individually, but they should anticipate increased demands on their third-party IT suppliers for both planned and unplanned maintenance. While they might not conduct detailed risk assessments internally, they can inquire about the risk management processes their IT teams or external partners have in place. In cases where third parties, such as SOCs or MSSPs, are involved, reviewing SLAs in contracts becomes especially important.

For enterprise companies, the situation is similar, though many have in-house teams that perform more rigorous, quantitative risk assessments across a broad (and sometimes incomplete) asset register. These teams need to be equipped to carry out emergency assessments and triage individual vulnerabilities, often differentiating between mission-critical and non-critical systems. Tools like the SSVC (https://www.cisa.gov/ssvc-calculator) and EPSS (https://www.first.org/epss/) can be used to inform patch prioritization by factoring in bandwidth, file storage, and the human element in maintenance and downtime risks.

Our forecasts are designed to help organizations strategically plan resources a year or more in advance, while SSVC and EPSS provide a tactical view of what’s critical today. In this sense, vulnerability forecasting is like an almanac that helps you plan your garden months ahead, whereas a weather report (via EPSS and SSVC) guides your daily outfit choices. Ultimately, it comes down to how far ahead you want to plan your vulnerability management strategy.

We’ve found that Boards of Directors, in particular, appreciate understanding that the tide of vulnerabilities is rising. A clearly defined risk tolerance is essential to prevent costs from becoming unmanageable, and these forecasts help illustrate the workload and cost implications of setting various risk thresholds for the business.”
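Leverett points to SSVC and EPSS as tactical tools for deciding what to patch first. As an illustration only, here is a minimal sketch of ranking a patch backlog by EPSS score (EPSS estimates the probability a vulnerability is exploited in the next 30 days). The CVE IDs, scores, and the 0.1 threshold below are invented for the example; real scores come from FIRST’s public EPSS data.

```python
# Sketch: rank a patch backlog by EPSS score.
# CVE IDs, scores, and the default threshold are illustrative only.

def prioritize_by_epss(pending, threshold=0.1):
    """Return (cve, score) pairs at or above `threshold`, highest score first."""
    urgent = [(cve, score) for cve, score in pending.items() if score >= threshold]
    return sorted(urgent, key=lambda item: item[1], reverse=True)

if __name__ == "__main__":
    # Hypothetical backlog: CVE ID -> EPSS score
    backlog = {
        "CVE-2025-0001": 0.02,  # unlikely to be exploited soon
        "CVE-2025-0002": 0.89,  # very likely to be exploited
        "CVE-2025-0003": 0.15,
    }
    for cve, score in prioritize_by_epss(backlog):
        print(f"{cve}: {score:.2f}")
```

In practice a team would combine a score like this with SSVC-style decisions about which systems are mission-critical, rather than patching on probability alone.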

Looking Ahead to 2026 and Beyond

The FIRST forecast predicts that over 51,000 vulnerabilities will be disclosed in 2026, signaling that cybersecurity risks will continue to increase. This underscores the growing need for proactive risk management rather than relying on reactive security measures.

For users of software like WordPress, there are multiple ways to mitigate cybersecurity threats. Patchstack, Wordfence, and Sucuri each offer different approaches to strengthening security through proactive defense strategies.

The main takeaways are:

  • Vulnerabilities are increasing – FIRST predicts up to 50,000 CVEs in 2025, an 11% rise from 2024 and a 470% increase from 2023.
  • AI and open-source adoption are driving more vulnerability disclosures.
  • State-sponsored cyber activity is exposing more security weaknesses.
  • Shifting from reactive to proactive security is essential for managing risks.

Read the 2025 Vulnerability Forecast:

Vulnerability Forecast for 2025


The AI Hype Index: Falling in love with chatbots, understanding babies, and the Pentagon’s “kill list”

Separating AI reality from hyped-up fiction isn’t always easy. That’s why we’ve created the AI Hype Index—a simple, at-a-glance summary of everything you need to know about the state of the industry.

The past few months have demonstrated how AI can bring us together. Meta released a model that can translate speech from more than 100 languages, and people across the world are finding solace, assistance, and even romance with chatbots. However, it’s also abundantly clear how the technology is dividing us—for example, the Pentagon is using AI to detect humans on its “kill list.” Elsewhere, the changes Mark Zuckerberg has made to his social media company’s guidelines mean that hate speech is likely to become far more prevalent on our timelines.

Technology shapes relationships. Relationships shape technology.

Greetings from a cold winter day.

As I write this letter, we are in the early stages of President Donald Trump’s second term. The inauguration was exactly one week ago, and already an image from that day has become an indelible symbol of presidential power: a photo of the tech industry’s great data barons seated front and center at the swearing-in ceremony.

Elon Musk, Sundar Pichai, Jeff Bezos, and Mark Zuckerberg all sat shoulder to shoulder, almost as if on display, in front of some of the most important figures of the new administration. They were not the only tech leaders in Washington, DC, that week. Tim Cook, Sam Altman, and TikTok CEO Shou Zi Chew also put in appearances during the president’s first days back in action. 

These are tycoons who lead trillion-dollar companies, set the direction of entire industries, and shape the lives of billions of people all over the world. They are among the richest and most powerful people who have ever lived. And yet, just like you and me, they need relationships to get things done. In this case, with President Trump. 

Those tech barons showed up because they need relationships more than personal status, more than access to capital, and sometimes even more than ideas. Some of those same people—most notably Zuckerberg—had to make profound breaks with their own pasts in order to forge or preserve a relationship with the incoming president. 

Relationships are the stories of people and systems working together. Sometimes by choice. Sometimes for practicality. Sometimes by force. Too often, for purely transactional reasons. 

That’s why we’re exploring relationships in this issue. Relationships connect us to one another, but also to the machines, platforms, technologies, and systems that mediate modern life. They’re behind the partnerships that make breakthroughs possible, the networks that help ideas spread, and the bonds that build trust—or at least access. In this issue, you’ll find stories about the relationships we forge with each other, with our past, with our children (or not-quite-children, as the case may be), and with technology itself. 

Rhiannon Williams explores the relationships people have formed with AI chatbots. Some of these are purely professional, others more complicated. This kind of relationship may be novel now, but it’s something we will all take for granted in just a few years. 

Also in this issue, Antonio Regalado delves into our relationship with the ecological past and the way ancient DNA is being used not only to learn new truths about who we are and where we came from but also, potentially, to address modern challenges of climate and disease.

In an extremely thought-provoking piece, Jessica Hamzelou examines people’s relationships with the millions of IVF embryos in storage. Held in cryopreservation tanks around the world, these embryos wait in limbo, in ever-growing numbers, as we attempt to answer complicated ethical and legal questions about their existence and preservation.

Turning to the workplace, Rebecca Ackermann explores how our relationships with our employers are often mediated through monitoring systems. As she writes, what may be more important than the privacy implications is how the data they collect is “shifting the relationships between workers and managers” as algorithms “determine hiring and firing, promotion and ‘deactivation.’” Good luck with that.

Thank you for reading. As always, I value your feedback. So please, reach out and let me know what you think. I really don’t want this to be a transactional relationship. 

Warmly,

Mat Honan
Editor in Chief
mat.honan@technologyreview.com

Welcome to robot city

Tourists to Odense, Denmark, come for the city’s rich history and culture: It’s where King Canute, Denmark’s last Viking king, was murdered during the 11th century, and the renowned fairy tale writer Hans Christian Andersen was born there some 700 years later. But today, Odense (with a population just over 210,000) is also home to more than 150 robotics, automation, and drone companies. It’s particularly renowned for collaborative robots, or cobots—those designed to work alongside humans, often in an industrial setting. Robotics is a “darling industry” for the city, says Mayor Peter Rahbæk Juel, and one its citizens are proud of.

Odense’s robotics success has its roots in the more traditional industry of shipbuilding. In the 1980s, the Lindø shipyard, owned by the Mærsk Group, faced increasing competition from Asia and approached the nearby University of Southern Denmark for help developing welding robots to improve the efficiency of the shipbuilding process. Niels Jul Jacobsen, then a student, recalls jumping at the chance to join the project; he’d wanted to work with robots ever since seeing Star Wars as a teenager. But “in Denmark [it] didn’t seem like a possibility,” he says. “There was no sort of activity going on.”

That began to change with the partnership between the shipyard and the university. In the ’90s, that relationship got a big boost when the foundation behind the Mærsk shipping company funded the creation of the Mærsk Mc-Kinney Møller Institute (MMMI), a center dedicated to studying autonomous systems. The Lindø shipyard eventually wound down its robotics program, but research continued at the MMMI. Students flocked to the institute to study robotics. And it was there that three researchers had the idea for a more lightweight, flexible, and easy-to-use industrial robot arm. That idea would become a startup called Universal Robots, Odense’s first big robotics success story. In 2015, the US semiconductor testing giant Teradyne acquired Universal Robots for $285 million. That was a significant turning point for robotics in the city. It was proof, says cofounder Kristian Kassow, that an Odense robotics company could make it without being tied to a specific project, like the previous shipyard work. It was a signal of legitimacy that attracted more recognition, talent, and investment to the local robotics scene.

Kim Povlsen, president and CEO of Universal Robots, says it was critical that Teradyne kept the company’s main base in Odense and maintained the Danish work culture, which he describes as nonhierarchical and highly collaborative. This extends beyond company walls, with workers generally happy to share their expertise with others in the local industry. “It’s like this symbiotic thing, and it works really well,” he says. Universal Robots positions itself as a platform company rather than just a manufacturer, inviting others to work with its tech to create robotic solutions for different sectors; the company’s robot arms can be found in car-part factories, on construction sites, in pharmaceutical laboratories, and on wine-bottling lines. It’s a growth play for the company, but it also offers opportunities to startups in the vicinity.

In 2018 Teradyne bought a second Odense robotics startup, Mobile Industrial Robots, which was founded by Jacobsen, the Star Wars fan who worked on the ship-welding robots in his university days. The company makes robots for internal transportation—for example, to carry pallets or tow carts in a warehouse. The sale has allowed Jacobsen to invest in other robotics projects, including Capra, a maker of outdoor mobile robots, where he is now CEO.

The success of these two large robotics companies, which together employ around 800 people in Odense, created a ripple effect, bringing both funding and business acumen into the robotics cluster, says Søren Elmer Kristensen, CEO of the government-funded organization Odense Robotics.

There are challenges to being based in a city that, though the third-largest in Denmark, is undeniably small on the global scale. Attracting funding is one issue. Most investment still comes from within the country’s borders. Sourcing talent is another; demand outstrips supply for highly qualified tech workers. Kasper Hallenborg, director of the MMMI, says the institute feels an obligation to produce enough graduates to support the local industry’s needs. Even now, too few women and girls enter STEM fields, he adds; the MMMI supports programs aimed at primary schoolers to try to strengthen the pipeline. As the Odense robotics cluster expands, however, it has become easier to attract international talent. It’s less of a risk for people to move, because plenty of companies are hiring if one job doesn’t work out. 

And Odense’s small size can have advantages. Juel, the mayor, points to drone-testing facilities established at the nearby Hans Christian Andersen Airport, which, thanks to relatively low air traffic, is able to offer plenty of flying time. The airport is one of the few that allow drones to fly beyond the visual line of sight.

The shipyard, once the city’s main employer, closed down completely shortly after the 2007–2008 financial crisis but has recently become an industrial park aimed at manufacturing particularly large structures like massive steel monopiles. The university is currently building a center to develop automation and robotics for use in such work. Visit today and you may see not ships but gigantic offshore wind turbines—assembled, of course, with the help of robots.

Victoria Turk is a technology journalist based in London.

Job titles of the future: Pharmaceutical-grade mushroom grower

Studies have indicated that psychedelic drugs, such as psilocybin and MDMA, have swift-acting and enduring antidepressant effects. Though the US Food and Drug Administration denied the first application for medical treatments involving psychedelics (an MDMA-based therapy) last August, these drugs appear to be on the road to mainstream medicine. Research into psilocybin led by the biotech company Compass Pathways has been slowed in part by the complexity of the trials, but the data already shows promise for the psychedelic compound within so-called magic mushrooms. Eventually, the FDA will decide whether to approve it to treat depression. If and when it does—a move that would open up a vast legal medical market—who will grow the mushrooms?

Scott Marshall already is. The head of mycology at the drug manufacturer Optimi Health in British Columbia, Canada, he is one of a very small number of licensed psilocybin mushroom cultivators in North America. Growers and manufacturers would need to do plenty of groundwork to be able to produce pharmaceutical psilocybin on an industrial, FDA-approved scale. That’s why Optimi is keen to get a head start.

A nascent industry

Marshall is at the cutting edge of the nascent psychedelics industry. Psilocybin mushroom production was not legally permitted in Canada until 2022, when the country established its limited compassionate-access program. “Our work is pioneering large-scale, legal cultivation of psilocybin mushrooms, ensuring the highest standards of safety, quality, and consistency,” he says.

Backed by more than $22 million in investment, Optimi received a drug establishment license in 2024 from Canadian regulators to export pharmaceutical-grade psilocybin to psychiatrists abroad in the limited number of places that have legal avenues for its use. Oregon has legalized supervised mushroom journeys, Australia has approved psilocybin therapy for PTSD and depression, and an increasing number of governments—national, state, and local—are considering removing legal barriers to psychedelic mushrooms on a medical basis as the amount of research supporting their use grows. There are also suggestions that the Trump administration may be more likely to support federal reform in the US.

But the legal market, medical or otherwise, remains tiny. So for now, almost all of Marshall’s mushrooms—he has grown more than 500 pounds since joining Optimi in 2022—stay in the company’s vault. “By setting the bar for production and [compliance with] regulation,” he says, “we’re helping to expand scientific understanding and accessibility of psychedelics for therapeutic use.”

Learning the craft

Before Marshall, 40, began cultivating mushrooms, he was working in property management. But that changed in 2014, when a friend who was an experienced grower gave him a copy of the book Mushroom Cultivator: A Practical Guide to Growing Mushrooms at Home (1983). That friend also gave him a spore print, effectively the “seeds” of a mushroom, from which Marshall grew three Psilocybe cubensis mushrooms of the golden teacher variety, his first foray into the field. “I kept growing and growing and growing—for my own health and well-being—and then got to a point where I wanted to help other people,” he says.

In 2018, he established his own company, Ra Mushrooms, selling cultivation kits for several varieties, including illegal psilocybin, and he was regularly posting photos on Instagram of mushrooms he had grown. In 2022, he was hired by Optimi, marking his journey from underground grower to legal market cultivator—“an unbelievable dream of mine.” 

Mattha Busby is a journalist specializing in drug policy and psychedelic culture.

The Download: Introducing the Relationships issue

This is today’s edition of The Download, our weekday newsletter that provides a daily dose of what’s going on in the world of technology.

Introducing: the Relationships issue

Relationships are the stories of people and systems working together. Sometimes by choice. Sometimes for practicality. Sometimes by force. Too often, for purely transactional reasons.

That’s why we’re exploring relationships in this issue. Relationships connect us to one another, but also to the machines, platforms, technologies, and systems that mediate modern life.

They’re behind the partnerships that make breakthroughs possible, the networks that help ideas spread, and the bonds that build trust—or at least access. In this issue, you’ll find stories about the relationships we forge with each other, with our past, with our children, and with technology itself.

Here’s just a taste of what you can expect:

+ People are forming relationships with AI chatbots. Some of these are purely professional, others more complicated. This kind of relationship may be novel now, but it’s something we will all take for granted in just a few years. 

+ Adventures in the genetic time machine. Ancient DNA is telling us more and more about humans and environments long past. Could it also help rescue the future?

+ Frozen embryos are filling storage banks around the world. It’s a struggle to know what to do with them. Read the full story.

+ Our relationships with our employers are often mediated through monitoring systems. And while it’s increasing the power imbalance between companies and workers, protections are lagging far behind. Read the full story.

MIT Technology Review Narrated: The messy quest to replace drugs with electricity

“Electroceuticals” promised the post-pharma future for medicine. But their exclusive focus on the nervous system is seeming less and less warranted.

This is our latest story to be turned into an MIT Technology Review Narrated podcast, which we’re publishing each week on Spotify and Apple Podcasts. Just navigate to MIT Technology Review Narrated on either platform, and follow us to get all our new content as it’s released.

The must-reads

I’ve combed the internet to find you today’s most fun/important/scary/fascinating stories about technology.

1 DOGE is working on software to automate firing workers
It builds on an existing program previously used by the US Department of Defense. (Wired $)
+ DOGE workers are already resigning from the department. (Fast Company $)
+ Can AI help DOGE slash government budgets? It’s complex. (MIT Technology Review)

2 American workers are generally pessimistic about AI
Whereas Silicon Valley can’t get enough of it. (WP $)
+ How to fine-tune AI for prosperity. (MIT Technology Review)

3 iPhones are autocorrecting the term ‘racist’ to ‘Trump’
The company is blaming what it calls a ‘phonetic overlap.’ (NYT $)
+ It’s promised to fix the bug as soon as possible. (FT $)

4 Amy Gleason is the head of DOGE, apparently
The former Digital Service senior advisor is the acting administrator. (NY Mag $)
+ But Elon Musk is still ultimately in charge. (NBC News)

5 Grok’s new unhinged mode can simulate phone sex
If that’s what you’re into. (Ars Technica)

6 More data centers don’t necessarily mean more jobs
The massive facilities don’t actually need many humans to run them. (WSJ $)
+ Not that that’s putting Meta off building a gigantic data center campus. (The Information $)

7 China is keen for tech companies to monetize their data
But not everyone is buying in. (Rest of World)

8 The slow death of the combustion engine
Pistons are out, and electrons are in. (IEEE Spectrum)
+ Why EVs are (mostly) set for solid growth in 2025. (MIT Technology Review)

9 The US is in love with cheap clothing
And established brands are the ones paying the price. (Insider $)

10 What frozen mummies can tell us about the ancient world
From wolf pups to mammoths. (New Scientist $)

Quote of the day

“I felt nothing but utter disgust. I no longer enjoyed sitting in my Tesla.”

—Mike Schwede, an entrepreneur living in Switzerland, tells the Guardian he’s turned his back on the electric car company after Elon Musk’s Nazi-linked salutes during Trump’s inauguration.

The big story

Think that your plastic is being recycled? Think again.

October 2023

The problem of plastic waste hides in plain sight, a ubiquitous part of our lives we rarely question. But a closer examination of the situation is shocking. To date, humans have created around 11 billion metric tons of plastic. Some 72% of the plastic we make ends up in landfills or the environment, and only 9% of the plastic ever produced has been recycled.

To make matters worse, plastic production is growing dramatically; in fact, half of all plastics in existence have been produced in just the last two decades. Production is projected to continue growing, at about 5% annually.

So what do we do? Sadly, solutions such as recycling and reuse aren’t equal to the scale of the task. The only answer is drastic cuts in production in the first place. Read the full story.

—Douglas Main

We can still have nice things

A place for comfort, fun and distraction to brighten up your day. (Got any ideas? Drop me a line or skeet ’em at me.)

+ Look up to the sky over the next few nights: seven planets will be aligned, and won’t do so again until 2040.
+ Jeremy Strong probably won’t win an Oscar next week, but he definitely deserves to.
+ Why English is such a strange language.
+ 1985 produced some truly anthemic songs—and some absolute bilge.

The Rebirth of ‘Marketing Mix Modeling’

Lingering privacy challenges and ever-improving cloud and artificial intelligence technology are driving a marketing model renaissance.

Marketing mix modeling (MMM), a data-driven technique launched in 1949, had long helped marketers understand how variables such as advertising, promotion, and prices impact revenue. It fell out of favor in the early 2000s when digital advertising took off.

Yet compared to tracking cookies and last-touch attribution models, MMM seemed complex and expensive.
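At its core, MMM is a regression problem. The sketch below is purely illustrative, with made-up weekly numbers; it is not how Meridian or Robyn work internally. It fits an ordinary least squares model that attributes revenue to per-channel spend plus a baseline:

```python
import numpy as np

# Hypothetical weekly spend on three channels: TV, search, social.
spend = np.array([
    [10.0, 5.0, 2.0],
    [12.0, 4.0, 3.0],
    [ 8.0, 6.0, 2.5],
    [15.0, 5.5, 4.0],
    [11.0, 7.0, 3.5],
])
revenue = np.array([120.0, 130.0, 110.0, 155.0, 135.0])

# Add an intercept column and solve ordinary least squares:
# revenue ≈ baseline + sum(coef_i * spend_i)
X = np.column_stack([np.ones(len(spend)), spend])
coefs, *_ = np.linalg.lstsq(X, revenue, rcond=None)

baseline, channel_coefs = coefs[0], coefs[1:]
print("baseline sales:", round(baseline, 2))
print("per-dollar channel effects:", np.round(channel_coefs, 2))
```

Real MMM tools layer adstock (carryover), saturation curves, seasonality, and Bayesian priors on top of this basic regression idea.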

Renaissance

In 2025, however, MMM has enjoyed renewed attention.

Meta and Google have released free, open-source MMM tools in the past couple of years — Google’s Meridian on January 29, 2025, and Meta’s Robyn in 2023.

Why does MMM interest two of the largest digital advertising platforms? I see three probable factors: tracking cookies, AI, and cloud computing.


Google launched Meridian, an open-source marketing-mix model, last month.

Cookie-less Advertising

Controversies surrounding tracking cookies are the first driver. Cookies are a foundational and useful technology. A first-party cookie on a browser keeps users logged into a website and retains their preferences.

However, third-party tracking cookies that catalog an individual’s behavior across web properties are a privacy pariah. Laws such as Europe’s General Data Protection Regulation and the California Consumer Privacy Act limit such cookies, and many browser companies have stopped supporting them entirely.

The potential of cookie-less ad targeting makes MMM attractive to large-scale advertisers and platforms.

Advertising performance. Third-party cookies, despite privacy concerns, drive ad targeting and thus performance. MMM should help advertisers identify which marketing channels and creatives produce the best returns. Coupled with new ad targeting techniques, MMM will almost certainly improve performance.

Meta’s Robyn, for example, helps advertisers analyze the performance of campaigns across Facebook, Instagram, and other channels. It gauges channel effectiveness and optimizes ad spend based on results.

The era of cookie-less targeting encourages the use of MMM. Some of the most forward-looking, high-budget advertisers are considering alternative targeting methods and new promotional channels. Monitoring those experiments requires complicated multi-touch attribution or MMM.

For example, Google’s Meridian MMM moved beyond standard regression models to a theory called “Bayesian causal inference,” which captures the impact of imprecise marketing actions, such as a social media post.

Personal privacy is yet another reason why MMM appeals to Google, Meta, and many advertisers. The model aggregates data and generally avoids personally identifiable information.

Artificial Intelligence

AI makes MMM faster, more adaptive, and easier to scale.

The speed gains come first from model training: starting from the now-available foundational models is a massive head start compared to building from scratch.

Second, AI helps process and clean large, complex datasets from multiple sources such as digital ads, TV, print, and online and in-store sales. The associated algorithms detect seasonality, outliers, and data anomalies, reducing manual work but still requiring data scientists to fine-tune the models.

Meanwhile, the AI behind Meta’s Robyn dynamically adjusts model variables, improving accuracy automatically and making the model more adaptable and scalable.

Cloud Computing

Twenty years ago, the rise of web-based marketing produced massive datasets and hefty processing loads. Analyzing that info typically required custom-built infrastructure and expensive data warehouses.

These limitations no longer exist thanks to cloud computing advances and affordability. Instead of spending $500,000 or more on MMM software and servers, a company can run Google Meridian in the cloud for a fraction of the amount, perhaps as little as $10,000 a year.

Accurate modeling, however, requires some scale — businesses investing at least $500,000 per year in advertising likely benefit the most. But that could change if MMM becomes available as a service.

Data Suggests Google Indexing Rates Are Improving via @sejournal, @martinibuster

New research covering over 16 million webpages shows that Google indexing rates have improved, but many pages in the dataset were never indexed and over 20% of indexed pages were eventually deindexed. The findings likely reflect trends and challenges specific to sites whose owners care about SEO and indexing.

Research By IndexCheckr Tool

IndexCheckr is a Google indexing tracking tool that alerts subscribers when content is indexed, monitors currently indexed pages, and tracks the indexing status of external pages that host backlinks to subscriber web pages.

The research may not statistically correlate to Internet-wide Google indexing trends, but it is probably a close match for sites whose owners care enough about indexing and backlink monitoring to subscribe to a tool that tracks them.

About Indexing

In web indexing, search engines crawl the internet, filter content (such as removing duplicates or low-quality pages), and store the remaining pages in a structured database called a Search Index. This search index is stored on a distributed file system. Google originally used the Google File System (GFS) but later upgraded to Colossus, which is optimized for handling massive amounts of search data across thousands of servers.
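The crawl-filter-store pipeline can be pictured as a toy inverted index. This is a deliberate simplification with hypothetical pages; Google’s Colossus-backed index is vastly more elaborate:

```python
from collections import defaultdict

# Hypothetical crawled pages (URL -> text), after duplicate/low-quality filtering.
pages = {
    "example.com/a": "marketing mix modeling basics",
    "example.com/b": "google indexing basics",
    "example.com/c": "marketing attribution guide",
}

# Build the inverted index: term -> set of URLs containing it.
index = defaultdict(set)
for url, text in pages.items():
    for term in text.split():
        index[term].add(url)

def search(term):
    """Return the URLs whose content contains the term."""
    return sorted(index.get(term, set()))

print(search("basics"))  # ['example.com/a', 'example.com/b']
```

A page that is crawled but filtered out never makes it into `pages`, and so can never be returned; that is, in miniature, what it means to be unindexed.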

Indexing Success Rates

The research shows that most pages in the dataset were not indexed, though indexing rates improved from 2022 to 2025. Of the pages Google did index, most were indexed within six months.

  • Most pages in the dataset were not indexed (61.94%).
  • Indexing rates have improved from 2022 to 2025.
  • Of the pages that do get indexed, most (93.2%) are indexed within six months.

Deindexing Trends

The deindexing trends are especially interesting for how quickly Google removes pages from the index. Of all the indexed pages in the dataset, 13.7% were deindexed within three months of being indexed. The overall deindexing rate is 21.29%. A sunnier way of reading that data: 78.71% of pages remained firmly indexed by Google.

Deindexing is generally related to Google quality factors, but it could also reflect publishers and SEOs who deliberately request deindexing through noindex directives such as the meta robots element.

Here are the time-based cumulative deindexing percentages:

  • 1.97% of indexed pages are deindexed within 7 days.
  • 7.97% are deindexed within 30 days.
  • 13.70% are deindexed within 90 days.
  • 21.29% are deindexed in total, including beyond 90 days.
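As a sanity check, the cumulative figures above can be reproduced from per-window counts. The counts below are invented to match the reported percentages, assuming 10,000 indexed pages:

```python
# Hypothetical deindexing counts per window out of 10,000 indexed pages,
# chosen to reproduce the cumulative percentages in the study.
total_indexed = 10_000
deindexed_by_window = {
    "within 7 days": 197,
    "8-30 days": 600,
    "31-90 days": 573,
    "after 90 days": 759,
}

cumulative = 0
for window, count in deindexed_by_window.items():
    cumulative += count
    print(f"{window}: {cumulative / total_indexed:.2%} cumulative")

# Overall deindexing rate is 21.29%, so 78.71% of pages remain indexed.
still_indexed = 1 - cumulative / total_indexed
print(f"still indexed: {still_indexed:.2%}")
```

The steep early windows are why the study recommends monitoring pages closely in their first three months after indexing.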

The research paper that I was provided offers this observation:

“This timeline highlights the importance of early monitoring and optimization to address potential issues that could lead to deindexing. Beyond three months, the risk of deindexing diminishes but persists, making periodic audits essential for long-term content visibility.”

Impact Of Indexing Services

The next part of the research highlights the effectiveness of tools designed to get web pages indexed. URLs submitted to indexing tools had a success rate of just 29.37%, meaning 70.63% of submitted web pages remained unindexed, possibly highlighting the limitations of manual submission strategies.

High Percentage Of Pages Not Indexed

Less than 1% of the tracked websites were entirely unindexed. The majority of unindexed URLs came from websites that Google had otherwise indexed, and only 37.08% of all the tracked pages were fully indexed.

These numbers may not reflect the state of the Internet because the data is pulled from a set of sites that are subscribers to an indexing tool. That slants the data being measured and makes it different from what the state of the entire Internet may be.

Google Indexing Has Improved Since 2022

Although there are some grim statistics in the data, a bright spot is the steady increase in indexing rates from 2022 to 2025, suggesting that Google’s ability to process and include pages may have improved.

According to IndexCheckr:

“The data from 2022 to 2025 shows a steady increase in Google’s indexing rate, suggesting that the search engine may be catching up after previously reported indexing struggles.”

Summary Of Findings

Complete deindexing at the website level is rare for this dataset. Google’s indexing speed varies, and more than half of the web pages in this dataset struggled to get indexed, possibly due to site quality.

What kinds of site quality issues would impact indexing? In my opinion, part of the cause could be commercial product pages with content that’s bulked up for the purpose of feeding the bot. I’ve reviewed a few ecommerce sites doing that which either struggled to get indexed or to rank. Google’s organic search results (SERPs) for ecommerce are increasingly precise, and those SERPs don’t make sense when reviewed through the lens of SEO. Strategies based on feeding the bot entities, keywords, and topical maps tend to produce search-engine-first websites, and that approach misses the ranking factors that really count: the ones related to how users react to content.

Read the indexing study at IndexCheckr.com:

Google Indexing Study: Insights from 16 Million Pages

Featured Image by Shutterstock/Shutterstock AI Generator

AI Search Engines Often Cite Third-Party Content, Study Finds via @sejournal, @MattGSouthern

A recent analysis by xfunnel.ai examines citation patterns across major AI search engines.

The findings provide new insight into how these tools reference web content in their responses.

Here are the must-know highlights from the report.

Citation Frequency Differs By Platform

Researchers submitted questions across different buyer journey stages and tracked how the AI platforms responded.

The study analyzed 40,000 responses containing 250,000 citations and found differences in citation frequency:

  • Perplexity: 6.61 citations per response
  • Google Gemini: 6.1 citations per response
  • ChatGPT: 2.62 citations per response

ChatGPT was tested in its standard mode, not with explicitly activated search features, which may explain its lower citation count.

Third-Party Content Leads Citation Types

The research categorized citations into four groups:

  • Owned (company domains)
  • Competitor domains
  • Earned (third-party/affiliate sites)
  • UGC (user-generated content)

Across all platforms, earned content represents the largest percentage of citations, with UGC showing increasing representation.

Affiliate sites and independent blogs hold weight in AI-generated responses as well.

Citations Change Throughout Customer Journey

The data shows differences in citation patterns based on query types:

  • During the problem exploration and education stages, there is a higher percentage of citations from third-party editorial content.
  • UGC citations from review sites and forums increase in the comparison stages.
  • In the final research and evaluation phase, citations tend to come directly from brand websites and competitors.

Source Quality Distribution

When examining the quality distribution of cited sources, the data showed:

  • High-quality sources: ~31.5% of citations
  • Upper-mid quality sources: ~15.3% of citations
  • Mid-quality sources: ~26.3% of citations
  • Lower-mid quality sources: ~22.1% of citations
  • Low-quality sources: ~4.8% of citations

This indicates AI search engines prefer higher-quality sources but regularly cite content from middle-tier sources.

Platform-Specific UGC Preferences

Each AI search engine shows preferences for different UGC sources:

  • Perplexity: Favors YouTube and PeerSpot
  • Google Gemini: Frequently cites Medium, Reddit, and YouTube
  • ChatGPT: Often references LinkedIn, G2, and Gartner Peer Reviews

The Third-Party Citation Opportunity

The data exposes a key area that many SEO professionals might be overlooking.

While the industry often focuses on technical changes to owned content for AI search optimization, this research suggests a different approach may be more effective.

Since earned media (content from third parties) is the biggest citation source on AI search platforms, it’s important to focus on:

  • Building relationships with industry publications
  • Creating content that others want to cover
  • Contributing guest articles to trusted websites
  • Developing strategies for the user-generated content (UGC) platforms that each AI engine prefers

This is a return to basics: create valuable content that others will want to reference instead of just modifying existing content for AI.

Why This Matters

As AI search becomes more widely used, understanding these citation patterns can help you stay visible.

The findings show the need to use different content strategies across various platforms.

However, maintaining quality and authority is essential. So don’t neglect SEO fundamentals in pursuit of broader content distribution.

Top Takeaway

Invest in a mix of owned content, third-party coverage, and presence on relevant UGC platforms to increase the likelihood of your content being cited by AI search engines.

The data suggests that earning mentions on trusted third-party sites may be even more valuable than optimizing your domain content.


Featured Image: Tada Images/Shutterstock