OpenAI: Internal Experiment Caused Elevated Errors via @sejournal, @martinibuster

OpenAI published a new write-up about elevated errors in ChatGPT that significantly increased failed conversation attempts. The issue was caused by a misconfigured internal experiment.

According to OpenAI:

“On February 19, 2025, from 9:48 AM to 11:19 AM PT, ChatGPT experienced a service degradation, leading to a significant increase in failed conversation attempts. This resulted in blank responses for many users.

The root cause was a misconfigured internal experiment that unintentionally triggered a surge in traffic, overwhelming our inference infrastructure. This increase in load led to saturation of compute resources, causing failures in generating responses.

After identifying the root cause, we took immediate action by temporarily shedding load from free-tier users to stabilize the system. As capacity was restored, paid users gradually recovered, and the full service was restored by 11:19 AM PT.”

OpenAI Continues To Work On Solutions

The incident report goes on to note that OpenAI continues to work on changes intended to prevent similar outages, writing:

“Stronger Safeguards: Building better protections around experiment changes and configurations by moving from a uniform approval process to a risk-based model to ensure safer rollouts of experiments.

Faster Root Cause Identification: Automating notifications for relevant changes and experiments to more quickly identify root causes of increased failures.”

Read the incident report:

Elevated errors for ChatGPT

Google On Search Console Noindex Detected Errors via @sejournal, @martinibuster

Google’s John Mueller answered a question on Reddit about a seemingly false ‘noindex detected in X-Robots-Tag HTTP header’ error reported in Google Search Console for pages that do not have that specific X-Robots-Tag or any other related directive or block. Mueller suggested some possible reasons, and multiple Redditors provided reasonable explanations and solutions.

Noindex Detected

The person who started the Reddit discussion described a scenario that may be familiar to many. Google Search Console reports that it couldn’t index a page because the page was blocked from indexing (which is different from being blocked from crawling). Checking the page reveals no noindex meta element, and there is no robots.txt rule blocking the crawl.

Here is how they described their situation:

  • “GSC shows “noindex detected in X-Robots-Tag http header” for a large part of my URLs. However:
  • Can’t find any noindex in HTML source
  • No noindex in robots.txt
  • No noindex visible in response headers when testing
  • Live Test in GSC shows page as indexable
  • Site is behind Cloudflare (We have checked page rules/WAF etc)”

They also reported that they tried spoofing Googlebot and tested various IP addresses and request headers, and still found no clue as to the source of the X-Robots-Tag header.

Cloudflare Suspected

One of the Redditors commented in that discussion to suggest troubleshooting whether the problem originated from Cloudflare.

They offered comprehensive step-by-step instructions on how to diagnose whether Cloudflare or anything else was preventing Google from indexing the page:

“First, compare Live Test vs. Crawled Page in GSC to check if Google is seeing an outdated response. Next, inspect Cloudflare’s Transform Rules, Response Headers, and Workers for modifications. Use curl with the Googlebot user-agent and cache bypass (Cache-Control: no-cache) to check server responses. If using WordPress, disable SEO plugins to rule out dynamic headers. Also, log Googlebot requests on the server and check if X-Robots-Tag appears. If all fails, bypass Cloudflare by pointing DNS directly to your server and retest.”

The OP (original poster, the one who started the discussion) responded that they had tested all those solutions but were unable to test a cache of the site via GSC, only the live site (from the actual server, not Cloudflare).
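For readers who want to reproduce that header check themselves, here is a minimal sketch in Python (the URL is a placeholder, and spoofing the Googlebot user agent only shows what the server returns to a self-declared Googlebot, not what it serves to requests coming from Google’s own IP ranges):

    import urllib.request

    # Placeholder URL: replace with the page flagged in Search Console.
    URL = "https://example.com/some-page/"

    # Request the page the way the Reddit advice suggests: a Googlebot
    # user agent plus Cache-Control: no-cache to bypass cached copies.
    request = urllib.request.Request(
        URL,
        headers={
            "User-Agent": (
                "Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)"
            ),
            "Cache-Control": "no-cache",
        },
    )

    with urllib.request.urlopen(request) as response:
        # Print every response header so an unexpected X-Robots-Tag
        # (added by the origin server or a CDN) is easy to spot.
        for name, value in response.getheaders():
            print(f"{name}: {value}")
        print("X-Robots-Tag present:", "X-Robots-Tag" in response.headers)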

How To Test With An Actual Googlebot

Interestingly, the OP stated that they were unable to test their site using Googlebot, but there is actually a way to do that.

Google’s Rich Results Tester uses the Googlebot user agent, and its requests originate from a Google IP address. This tool is useful for verifying what Google sees. If an exploit is causing the site to display a cloaked page, the Rich Results Tester will reveal exactly what Google is indexing.

Google’s Rich Results support page confirms:

“This tool accesses the page as Googlebot (that is, not using your credentials, but as Google).”

401 Error Response?

The following probably wasn’t the solution but it’s an interesting bit of technical SEO knowledge.

Another user shared the experience of a server responding with a 401 error. A 401 response means “unauthorized,” and it happens when a request for a resource is missing authentication credentials or the provided credentials are not the right ones. Their solution for clearing the indexing-blocked messages in Google Search Console was to add a rule to robots.txt that blocks crawling of the login page URLs.
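As an illustration only, with hypothetical paths, a robots.txt rule like the following keeps Googlebot from requesting login URLs at all, so the 401 responses never reach Google’s indexing systems:

    User-agent: *
    Disallow: /login/
    Disallow: /account/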

Google’s John Mueller On GSC Error

John Mueller dropped into the discussion to offer his help diagnosing the issue. He said that he has seen this issue arise in relation to CDNs (Content Delivery Networks). An interesting thing he said was that he’s also seen this happen with very old URLs. He didn’t elaborate on that last one but it seems to imply some kind of indexing bug related to old indexed URLs.

Here’s what he said:

“Happy to take a look if you want to ping me some samples. I’ve seen it with CDNs, I’ve seen it with really-old crawls (when the issue was there long ago and a site just has a lot of ancient URLs indexed), maybe there’s something new here…”

Key Takeaways: Google Search Console Noindex Detected

  • Google Search Console (GSC) may report “noindex detected in X-Robots-Tag http header” even when that header is not present.
  • CDNs, such as Cloudflare, may interfere with indexing. Steps were shared to check if Cloudflare’s Transform Rules, Response Headers, or cache are affecting how Googlebot sees the page.
  • Outdated indexing data on Google’s side may also be a factor.
  • Google’s Rich Results Tester can verify what Googlebot sees because it uses Googlebot’s user agent and IP, revealing discrepancies that might not be visible from spoofing a user agent.
  • 401 Unauthorized responses can prevent indexing. A user shared that their issue involved login pages that needed to be blocked via robots.txt.
  • John Mueller suggested CDNs and historically crawled URLs as possible causes.

Analysis Forecasts More Vulnerabilities In 2025 via @sejournal, @martinibuster

A new analysis predicts that the number of reported vulnerabilities will reach record highs in 2025, continuing the trend of rising cybersecurity risks and increased vulnerability disclosures.

Analysis By FIRST

The analysis was published by the Forum of Incident Response and Security Teams (FIRST), a global organization that helps coordinate cybersecurity responses. It forecasts almost 50,000 vulnerabilities in 2025, an increase of 11% over 2024 and a 470% increase from 2023. The report suggests that organizations need to shift from reactive security measures to a more strategic approach that prioritizes vulnerabilities based on risk, plans patching efforts efficiently, and prepares for surges in disclosures rather than struggling to keep up after the fact.

Why Are Vulnerabilities Increasing?

There are three trends driving the increase in vulnerabilities.

1. AI-driven discovery and open-source expansion are accelerating CVE disclosures.

AI-driven vulnerability discovery, including machine learning and automated tools, is making it easier to detect vulnerabilities in software, which in turn leads to more CVE (Common Vulnerabilities and Exposures) reports. AI allows security researchers to scan larger amounts of code and quickly identify flaws that would have gone unnoticed using traditional methods.

The press release highlights the role of AI:

“More software, more vulnerabilities: The rapid adoption of open-source software and AI-driven vulnerability discovery has made it easier to identify and report flaws.”

2. Cyber Warfare And State-Sponsored Attacks

State-sponsored attacks are increasing, which in turn leads to more of these kinds of vulnerabilities being discovered.

The press release explains:

“State-sponsored cyber activity: Governments and nation-state actors are increasingly engaging in cyber operations, leading to more security weaknesses being exposed.”

3. Shifts In CVE Ecosystem

Patchstack, a WordPress security company that offers vulnerability detection and virtual patches, identifies and patches vulnerabilities, and its work is adding to the number of vulnerabilities discovered every year. Patchstack’s participation in the CVE ecosystem is helping expose more vulnerabilities, particularly those affecting WordPress.

The press release provided to Search Engine Journal states:

“New contributors to the CVE ecosystem, including Linux and Patchstack, are influencing disclosure patterns and increasing the number of reported vulnerabilities. Patchstack, which focuses on WordPress security, is playing a role in surfacing vulnerabilities that might have previously gone unnoticed. As the CVE ecosystem expands, organizations must adapt their risk assessment strategies to account for this evolving landscape.”

Eireann Leverett, FIRST liaison and lead member of FIRST’s Vulnerability Forecasting Team, highlighted the accelerating growth of reported vulnerabilities and the need for proactive risk management, stating:

“For a small to medium-sized ecommerce site, patching vulnerabilities typically means hiring external partners under an SLA to manage patches and minimize downtime. These companies usually don’t analyze each CVE individually, but they should anticipate increased demands on their third-party IT suppliers for both planned and unplanned maintenance. While they might not conduct detailed risk assessments internally, they can inquire about the risk management processes their IT teams or external partners have in place. In cases where third parties, such as SOCs or MSSPs, are involved, reviewing SLAs in contracts becomes especially important.

For enterprise companies, the situation is similar, though many have in-house teams that perform more rigorous, quantitative risk assessments across a broad (and sometimes incomplete) asset register. These teams need to be equipped to carry out emergency assessments and triage individual vulnerabilities, often differentiating between mission-critical and non-critical systems. Tools like the SSVC (https://www.cisa.gov/ssvc-calculator) and EPSS (https://www.first.org/epss/) can be used to inform patch prioritization by factoring in bandwidth, file storage, and the human element in maintenance and downtime risks.

Our forecasts are designed to help organizations strategically plan resources a year or more in advance, while SSVC and EPSS provide a tactical view of what’s critical today. In this sense, vulnerability forecasting is like an almanac that helps you plan your garden months ahead, whereas a weather report (via EPSS and SSVC) guides your daily outfit choices. Ultimately, it comes down to how far ahead you want to plan your vulnerability management strategy.

We’ve found that Boards of Directors, in particular, appreciate understanding that the tide of vulnerabilities is rising. A clearly defined risk tolerance is essential to prevent costs from becoming unmanageable, and these forecasts help illustrate the workload and cost implications of setting various risk thresholds for the business.”
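The EPSS scores that Leverett references can be queried programmatically. Below is a minimal sketch, assuming the public EPSS API that FIRST documents at first.org/epss (the CVE identifier is only an example):

    import json
    import urllib.request

    # Example CVE identifier; substitute the CVEs relevant to your own stack.
    cve_id = "CVE-2021-44228"

    # Public EPSS API (see https://www.first.org/epss/). The response
    # includes an exploitation probability ("epss") and a percentile rank.
    url = f"https://api.first.org/data/v1/epss?cve={cve_id}"

    with urllib.request.urlopen(url) as response:
        payload = json.load(response)

    for entry in payload.get("data", []):
        print(entry["cve"], "EPSS:", entry["epss"], "percentile:", entry["percentile"])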

Looking Ahead to 2026 and Beyond

The FIRST forecast predicts that over 51,000 vulnerabilities will be disclosed in 2026, signaling that cybersecurity risks will continue to increase. This underscores the growing need for proactive risk management rather than relying on reactive security measures.

For users of software like WordPress, there are multiple ways to mitigate cybersecurity threats. Patchstack, Wordfence, and Sucuri each offer different approaches to strengthening security through proactive defense strategies.

The main takeaways are:

  • Vulnerabilities are increasing – FIRST predicts up to 50,000 CVEs in 2025, an 11% rise from 2024 and a 470% increase from 2023.
  • AI and open-source adoption are driving more vulnerability disclosures.
  • State-sponsored cyber activity is exposing more security weaknesses.
  • Shifting from reactive to proactive security is essential for managing risks.

Read the 2025 Vulnerability Forecast:

Vulnerability Forecast for 2025

Featured Image by Shutterstock/Gorodenkoff

Data Suggests Google Indexing Rates Are Improving via @sejournal, @martinibuster

New research of over 16 million webpages shows that Google indexing rates have improved but that many pages in the dataset were not indexed and over 20% of the pages were eventually deindexed. The findings may be representative of trends and challenges that are specific to sites that are concerned about SEO and indexing.

Research By IndexCheckr Tool

IndexCheckr is a Google indexing tracking tool that enables subscribers to be alerted when content is indexed, to monitor currently indexed pages, and to monitor the indexing status of external pages that host backlinks to subscriber web pages.

The research may not statistically correlate with Internet-wide Google indexing trends, but it likely correlates closely with sites whose owners care enough about indexing and backlink monitoring to subscribe to a tool that tracks those things.

About Indexing

In web indexing, search engines crawl the internet, filter content (such as removing duplicates or low-quality pages), and store the remaining pages in a structured database called a Search Index. This search index is stored on a distributed file system. Google originally used the Google File System (GFS) but later upgraded to Colossus, which is optimized for handling massive amounts of search data across thousands of servers.

Indexing Success Rates

The research shows that most pages in their dataset were not indexed but that indexing rates have improved from 2022 to 2025. Most pages that Google indexed are indexed within six months.

  • Most pages in the dataset were not indexed (61.94%).
  • Indexing rates have improved from 2022 to 2025.
  • Google indexes most pages that do get indexed within six months (93.2%).

Deindexing Trends

The deindexing trends are interesting, especially regarding how quickly Google deindexes pages. Of all the indexed pages in the dataset, 13.7% were deindexed within three months of being indexed. The overall rate of deindexing is 21.29%. A sunnier way of interpreting that data is that 78.71% of indexed pages remained firmly indexed by Google.

Deindexing is generally related to Google quality factors, but it could also reflect website publishers and SEOs who purposely request web page deindexing through noindex directives such as the meta robots element.
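For reference, a deliberate noindex request can be expressed either in the page’s HTML or as an HTTP response header, for example:

    In the HTML <head>:
        <meta name="robots" content="noindex">

    As an HTTP response header:
        X-Robots-Tag: noindex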

Here are the time-based cumulative percentages of deindexing:

  • 1.97% of indexed pages were deindexed within 7 days.
  • 7.97% were deindexed within 30 days.
  • 13.70% were deindexed within 90 days.
  • 21.29% were deindexed in total, including pages deindexed after 90 days.

The research paper that I was provided offers this observation:

“This timeline highlights the importance of early monitoring and optimization to address potential issues that could lead to deindexing. Beyond three months, the risk of deindexing diminishes but persists, making periodic audits essential for long-term content visibility.”

Impact Of Indexing Services

The next part of the research highlights the effectiveness of tools designed to get web pages indexed. They found that URLs submitted to indexing tools had a low 29.37% success rate. That means that 70.63% of submitted web pages remained unindexed, possibly highlighting limitations in manual submission strategies.

High Percentage Of Pages Not Indexed

Less than 1% of the tracked websites were entirely unindexed. The majority of unindexed URLs were from websites that were indexed by Google. 37.08% of all the tracked pages were fully indexed.

These numbers may not reflect the state of the Internet because the data is pulled from a set of sites that are subscribers to an indexing tool. That slants the data being measured and makes it different from what the state of the entire Internet may be.

Google Indexing Has Improved Since 2022

Although there are some grim statistics in the data, a bright spot is that there’s been a steady increase in indexing rates from 2022 to 2025, suggesting that Google’s ability to process and include pages may have improved.

According to IndexCheckr:

“The data from 2022 to 2025 shows a steady increase in Google’s indexing rate, suggesting that the search engine may be catching up after previously reported indexing struggles.”

Summary Of Findings

Complete deindexing at the website level is rare for this dataset. Google’s indexing speed varies, and more than half of the web pages in this dataset struggle to get indexed, possibly for reasons related to site quality.

What kinds of site quality issues would impact indexing? In my opinion, some of what is causing this could include commercial product pages with content that’s bulked up for the purpose of feeding the bot. I’ve reviewed a few ecommerce sites doing that which either struggled to get indexed or to rank. Google’s organic search results (SERPs) for ecommerce are increasingly precise. Those kinds of SERPs don’t make sense when reviewed through the lens of SEO, because strategies based on feeding the bot entities, keywords, and topical maps tend to result in search-engine-first websites, and that does nothing for the ranking factors that really count, the ones related to how users respond to content.

Read the indexing study at IndexCheckr.com:

Google Indexing Study: Insights from 16 Million Pages

Featured Image by Shutterstock/Shutterstock AI Generator

AI Search Engines Often Cite Third-Party Content, Study Finds via @sejournal, @MattGSouthern

A recent analysis by xfunnel.ai examines citation patterns across major AI search engines.

The findings provide new insight into how these tools reference web content in their responses.

Here are the must-know highlights from the report.

Citation Frequency Differs By Platform

Researchers submitted questions across different buyer journey stages and tracked how the AI platforms responded.

The study analyzed 40,000 responses containing 250,000 citations and found differences in citation frequency:

  • Perplexity: 6.61 citations per response
  • Google Gemini: 6.1 citations per response
  • ChatGPT: 2.62 citations per response

ChatGPT was tested in its standard mode, not with explicitly activated search features, which may explain its lower citation count.

Third-Party Content Leads Citation Types

The research categorized citations into four groups:

  • Owned (company domains)
  • Competitor domains
  • Earned (third-party/affiliate sites)
  • UGC (user-generated content)

Across all platforms, earned content represents the largest percentage of citations, with UGC showing increasing representation.

Affiliate sites and independent blogs hold weight in AI-generated responses as well.

Citations Change Throughout Customer Journey

The data shows differences in citation patterns based on query types:

  • During the problem exploration and education stages, there is a higher percentage of citations from third-party editorial content.
  • UGC citations from review sites and forums increase in the comparison stages.
  • In the final research and evaluation phase, citations tend to come directly from brand websites and competitors.

Source Quality Distribution

When examining the quality distribution of cited sources, the data showed:

  • High-quality sources: ~31.5% of citations
  • Upper-mid quality sources: ~15.3% of citations
  • Mid-quality sources: ~26.3% of citations
  • Lower-mid quality sources: ~22.1% of citations
  • Low-quality sources: ~4.8% of citations

This indicates AI search engines prefer higher-quality sources but regularly cite content from middle-tier sources.

Platform-Specific UGC Preferences

Each AI search engine shows preferences for different UGC sources:

  • Perplexity: Favors YouTube and PeerSpot
  • Google Gemini: Frequently cites Medium, Reddit, and YouTube
  • ChatGPT: Often references LinkedIn, G2, and Gartner Peer Reviews

The Third-Party Citation Opportunity

The data exposes a key area that many SEO professionals might be overlooking.

While the industry often focuses on technical changes to owned content for AI search optimization, this research suggests a different approach may be more effective.

Since earned media (content from third parties) is the biggest citation source on AI search platforms, it’s important to focus on:

  • Building relationships with industry publications
  • Creating content that others want to cover
  • Contributing guest articles to trusted websites
  • Developing strategies for the user-generated content (UGC) platforms that each AI engine prefers

This is a return to basics: create valuable content that others will want to reference instead of just modifying existing content for AI.

Why This Matters

As AI search becomes more widely used, understanding these citation patterns can help you stay visible.

The findings show the need to use different content strategies across various platforms.

However, maintaining quality and authority is essential. So don’t neglect SEO fundamentals in pursuit of broader content distribution.

Top Takeaway

Invest in a mix of owned content, third-party coverage, and presence on relevant UGC platforms to increase the likelihood of your content being cited by AI search engines.

The data suggests that earning mentions on trusted third-party sites may be even more valuable than optimizing your domain content.


Featured Image: Tada Images/Shutterstock

TikTok Beats Competitors by 2X with $6B In-App Revenue via @sejournal, @MattGSouthern

TikTok, including Douyin in China, earned $6 billion in in-app purchases last year, more than double any competitor, according to Sensor Tower’s Q4 Digital Market Index.

This marked a 36% increase from the previous year, showcasing TikTok’s growing financial dominance in mobile commerce.

Mobile App Market Growth

Global in-app purchase revenue reached $39.4 billion in Q4 2024, up 13.5% year over year. Total 2024 revenue reached $150 billion (+12.5% from 2023).

Non-gaming apps grew faster, with revenue up 28.2% to $19.2 billion. The revenue gap between apps and games narrowed to just $1 billion, down from $5 billion a year earlier.

TikTok’s Commerce Transformation

Recent Ipsos research commissioned by TikTok reveals how the platform reshapes commerce behavior.

The study of nearly 4,000 US consumers found that 73% of TikTok shoppers value the platform’s personalized recommendations, with three-quarters agreeing it’s their go-to place for discovering new brands and products.

Discovery Engine Drives Purchases

TikTok’s approach to discovery helps drive in-app sales.

Two main components of product discovery on TikTok include:

  1. Personalized For You feed: 68% of TikTok shoppers say the personalized content allows for greater product discovery
  2. Intent-based search: Nearly 1 in 4 users search for something within 30 seconds of opening the app

This discovery system translates to sales. 70% of TikTok shoppers report purchasing after seeing an ad or shoppable content on the platform.

The Authenticity Factor

TikTok’s commerce success is built on trust.

Ipsos found that 74% of users believe TikTok’s creator content feels authentic, which is higher than that of other platforms.

Aaron Jones, Liquid I.V. VP of E-commerce & Media, explained how this authenticity drove results:

“An affiliate creator created an honest review that took off, resulting in a sales lift across omnichannel and a full sell out of the flavor with over 59K total orders on TikTok Shop. Of the purchasers, 88% were new customers.”

Actionable Strategies for Marketers

The Ipsos research identifies three key strategies for brands:

  1. Capture immediate purchases with in-app commerce: Use TikTok Shop for shoppable videos, LIVE Shopping, and affiliate partnerships
  2. Maximize e-commerce with always-on tactics: Create full-funnel experiences between TikTok engagement and external purchases
  3. Drive commerce everywhere with hybrid strategies: Facilitate seamless journeys across physical and online environments

Platform Context

iOS accounts for 70% of in-app revenue ($27.5B), while Google Play leads in downloads with 73.6% market share despite hitting its lowest download count (25.1B) since Q1 2020.

TikTok ROI

A recent study by Dentsu showed that TikTok gives advertisers the best short-term ROI in Nordic markets.

The study found that TikTok produced an ROI of 11.8. This means that brands earned almost 12 times their initial investment in sales revenue within six weeks of advertising on the platform.

Brands that consistently used TikTok as an always-on channel instead of running occasional campaigns saw better sales results and higher returns.

Looking Ahead

In 2025, TikTok is becoming an essential platform for digital marketers due to its solid monetization strategies, noteworthy ROI metrics, and expanding role in commerce.

Google Simplifies Removing Personal Info From Search Results via @sejournal, @MattGSouthern

Google is introducing new features that streamline removing personal information from search results.

These updates include:

  • A redesigned “Results about you” hub
  • A simplified removal request process
  • An option to refresh outdated search results

Redesigned “Results About You” Page

Google has updated its Results About You tool.

Now, it proactively searches for personal information and alerts you if it finds any.

When you get this alert, you can ask Google to remove the information or contact the website directly.

The new interface is designed to make it easier for users to sign up for and manage alerts about their personal data.

Simplified Removal Process

Google is introducing a streamlined removal process that simplifies the steps needed to file a takedown request.

When you find a search result that contains your personal information, you can click on the three-dot menu next to that result to access an updated panel.

This panel clarifies the types of content that qualify for removal and guides you through the request process.

Easier Refreshes For Outdated Results

Google is rolling out an update that addresses outdated search results.

Sometimes, a webpage’s content may no longer match what appears on Google if the webpage has been edited or removed.

Google now offers the ability to request a refresh of specific search results, prompting its systems to recrawl the webpage.

Previously, you had to wait for Google’s regular crawling schedule to notice any changes, which could take weeks.

Now, you can click the three dots next to an outdated search result and request a refresh. Google’s systems will then recrawl the page to retrieve the latest information.

Looking Ahead

Google’s latest update responds to the need for better privacy controls as more people worry about their personal information online. This change also shows that Google is adapting to regulatory pressure to protect personal data.

It’s important to note that these features only affect Google’s search results. They do not affect how your personal information appears on other search engines and websites.

For more details, see Google’s announcement.


Featured Image: mundissima/Shutterstock

Mullenweg Rebuffs Plea To Restore Automattic’s WordPress Core Contributions via @sejournal, @martinibuster

A WordPress developer pleaded with Matt Mullenweg at WordCamp Asia 2025, asking him to restore Automattic’s contributions to the WordPress core. Mullenweg apologized and said it’s not up to him; it’s up to WP Engine to drop their lawsuit, and he encouraged the community to put pressure on WP Engine.

Automattic’s Scaled-Back WordPress Contributions

Automattic announced in January 2025 that they were scaling back contributions to the WordPress core to those related to security and critical updates. Contributions that would otherwise have gone to core would be diverted to for-profit initiatives related to Automattic and WordPress.com.

Automattic attributed its January 2025 decision to WP Engine’s lawsuits:

“We’ve made the decision to reallocate resources due to the lawsuits from WP Engine. This legal action diverts significant time and energy that could otherwise be directed toward supporting WordPress’s growth and health. We remain hopeful that WP Engine will reconsider this legal attack, allowing us to refocus our efforts on contributions that benefit the broader WordPress ecosystem.”

WP Engine’s lawsuits, however, were a response to Matt Mullenweg’s WordCamp USA 2024 statements and other actions against WP Engine (like the WP Engine Tracker website). A federal judge has since sided with WP Engine and granted its request for a preliminary injunction against Automattic and Mullenweg.

WordCamp Attendee Urges Mullenweg To Reinstate Core Contributions

A WordCamp Asia 2025 attendee stepped up during the Q&A portion of the conference and shared his concerns, as a business owner and a plugin developer, about the stagnation of WordPress core development.

He said:

“Hi Matt. So this is not about a question, but I am a bit concerned about like if I see that the last five years or even ten years Automattic is the biggest core contributor in the code base and everything. So it’s not actually biggest, maybe 60%, 70% of the commit… as a company, Automattic do that.

So you recently published in a blog post that you are pulling out all the contribution and everything. So as a developer, as a business owner, …my whole business depends on WordPress. We build WordPress plugins, I think if there is no Automattic in the core contribution, the whole development will be super slow.

I want to request you to reconsider that, and at least in the core development maybe you can make some changes, give more resources in the core. Because it’s complicated, …someone needs to work and I think Automattic has lots of resources, experienced people in there, so I want to request you to reconsider your position and give more developers to the core.”

Matt Mullenweg States Condition For Restoring Core Contributions

Mullenweg responded that Automattic is spending millions of dollars to defend itself against WP Engine. He insisted that the decision to restore Automattic’s core contributions hinges on WP Engine dropping its lawsuits, and he encouraged the person to ask WP Engine.

Mullenweg answered:

“Yeah, thank you. Well, it’s definitely not a situation I want to be in. As we said, we’re pausing things. But very, very excited to return to having all those hundred-ish folks back doing some of the work we were doing before.

But right now we’re facing not just a maker and taker program problem… but maker-attacker. So well Automattic’s having to spend millions of dollars, per month sometimes, to defend against these attacks from WP Engine and with the court injunction, it’s just hard to be both be motivated and to just spare the resources to contribute so much.

Now, they could end it tomorrow. And I would love to welcome WP Engine back into the fold, back at WordCamp and everything. But we can’t end it, we can only defend it, you know, to all the legal attacks and they are increasing actually. And they’re coming after me personally too. As soon as they stop that, we’ll get back to it.

So please, I can’t stop it. Ask them.”

Mullenweg Asks Audience To Pressure WP Engine To Drop Lawsuit

The person asking the question said he understood Mullenweg’s position but insisted that, as an end user, he wants the software to continue to thrive. For that reason, he pleaded for Automattic to find a way to restore core contributions.

Mullenweg answered the developer’s second plea and asked the audience to pressure WP Engine to drop the lawsuit:

“I can’t until the lawsuit is over. So if there’s anything y’all can do to put pressure for the lawsuit to end, that would be the fastest thing to get our contributions back.”

He ended his response with a smile, saying:

“So… sorry about that.”

Concern Over Cuts To Core Contribution

The WordPress developer expressed deep concern and anxiety about the pace of WordPress core development. He emphasized that Automattic has historically provided a significant portion of core contributions and feared that without its support, WordPress development would slow significantly, impacting his business and those of others who rely on the platform.

Matt Mullenweg’s response did not directly address the WordPress developer’s plea to reconsider Automattic’s core contribution cuts. His answer framed the decision to restore core contributions as out of his control because it is dependent on WP Engine dropping its lawsuit. He stated that the lawsuit costs Automattic millions of dollars.

Mullenweg’s main points in his response to restoring Automattic’s core contributions were:

  • Automattic’s reduced contributions result from the financial and legal burden of defending against WP Engine’s lawsuit.
  • WP Engine’s legal actions make it difficult for Automattic to contribute at previous levels.
  • He urged the audience to pressure WP Engine to drop the lawsuit.

Watch the question and answer segment at the 6:21:32 mark.

73% Of Marketers Use Generative AI, Consumer Acceptance Up via @sejournal, @MattGSouthern

Recent studies by Gartner and Adobe show that generative AI is becoming a key tool in marketing.

Almost three-quarters of marketing teams now use GenAI, and most consumers are comfortable with AI in advertising.

AI Adoption In Marketing

A survey by Gartner of 418 marketing leaders found that 73% of marketing teams use generative AI.

However, 27% of CMOs say their organizations have limited or no use of GenAI in their marketing campaigns.

Correlation With Top Performers

Marketing teams that consistently exceed targets and meet customer acquisition goals are adopting AI faster than competitors.

Greg Carlucci, Senior Director Analyst in the Gartner Marketing Practice, states:

“The most successful marketing organizations are leading the way when it comes to GenAI adoption.”

Most marketers are using GenAI for:

  • Creative development (77%)
  • Strategy work (48%)
  • Campaign evaluation (47% reporting benefits)

Challenges With Generative AI

Despite spending almost half their budgets on campaigns, 87% of CMOs faced performance problems last year, and nearly half had to end underperforming campaigns early.

The Gartner study found:

“On average, 87% of CMOs report they experienced campaign performance issues in the last 12 months, with 45% reporting that they sometimes, often, or always had occasion to terminate campaigns early in the last year due to poor performance.”

CMOs identified several departments as barriers to their success:

  • Finance (31%)
  • Executive leadership (26%)
  • Sales (26%)

Opportunities With Generative AI

Adobe’s research highlights personalization as the primary AI opportunity for marketers.

Heather Freeland, Chief Brand Officer at Adobe, notes:

“Across all industries, there is an insatiable demand for content as customers expect every encounter with a brand to be personalized.”

She adds:

“Just when this challenge seemed insurmountable, the emergence of generative AI is presenting creative and marketing teams with a new way to keep pace with customer demands while also breaking through with their brands.”

The study finds that 97% of marketers believe mass personalization is achievable with AI, but most find it challenging without appropriate tools.

AI Acceptance Among Consumers

Consumers say that knowing content was created by AI either makes them more engaged or does not change their engagement at all.

Adobe’s study found:

Three in four consumers surveyed agree that knowing content was AI-produced would either improve or not impact their likelihood of engaging with it.

Consumers are even willing to share their data for a better AI-driven experience.

Adobe’s study finds the top data points consumers are willing to share include:

“… past purchases (56%), products they’ve viewed (52%), their gender (47%), age (41%), and language (35%).”

Generational Differences

Different age groups prefer personalization in different channels.

According to Adobe’s research:

“Gen Z respondents show a higher affinity for personalized content from the consumer electronics industry, particularly music (45%) and video games (43%)…

This contrasts with Baby Boomers, who prefer personalization in retail industry content, specifically from grocery stores (46%).”

The study also found:

“Millennials prefer personalized email campaigns (45%) and website content (40%), while Gen Z values social media personalization (51%).”

Measurable Results

Adobe reports that the implementation of GenAI tools delivered performance improvements.

Its report states:

“… in one of our first generative AI-powered email tests, we used the tool to quickly build and test five versions of an Adobe Photoshop email. It delivered a more than 10% increase in click-through rates, and a subsequent test reported a 57% increase in click rates for an Adobe Illustrator email.”

Additionally:

“Testing scale and speed transformed our approach to content optimization, significantly enhancing our marketing performance and efficiency.”

What This Means

Generative AI is shifting from a novel technology to a standard practice within marketing.

Marketing departments are facing tighter budgets while consumer demand for personalized content grows. Generative AI offers a potential solution to create personalized content at scale.

Further, using AI to personalize marketing messages is unlikely to impact consumer perception of your brand. Some marketers believe it may even improve retention.

Adobe’s research suggests:

“Over one in four (26%) marketer respondents agree that AI-powered personalization will increase consumer brand loyalty.”

If you want to incorporate AI into your advertising strategy but are unsure where to start, data suggests that the best approach is to enhance personalization.


Featured Image: Frame Stock Footage/Shutterstock

WordCamp Asia: No Plans For WordPress In 5 Years via @sejournal, @martinibuster

An awkward Q&A at WordCamp Asia 2025 saw Matt Mullenweg struggle to answer where WordPress will be in five years. Apparently caught off guard, he turned to the Lead Architect of Gutenberg for ideas, but the architect couldn’t answer either.

Project Gutenberg

Gutenberg is a reimagining of how WordPress users can build websites without knowing any code, with a visual interface of blocks for different parts of a web page, which is supposed to make it easy. Conceived as a four-phase project, it has been in development since 2017 and is currently in phase three.

The four phases are:

  • Phase 1: Easier Editing
  • Phase 2: Customization
  • Phase 3: Collaborative Editing
  • Phase 4: Multilingual Support

There’s a perception that Project Gutenberg has not been enthusiastically received by the WordPress developer community or by regular users, even though there are currently 85.9 million installations of the Gutenberg WordPress editor.

However, one developer at WordCamp Asia told Matt Mullenweg at the end of the conference Q&A session that she was hearing hesitations from the people she speaks with about using WordPress, and she expressed frustration about how difficult it was to use.

She said:

“Some of those hesitations were it’s easy to get overwhelmed. You know, when you look up how to learn WordPress, and I had to be really motivated… for myself to actually study it and kind of learn the basics of blocks… So do you have any advice on how I could convince my friends to start a WordPress site or how to address these challenges myself? You know like, getting overwhelmed and feeling like there’s just so much. I’m not a coder and things like that… any advice you can offer small business owners?”

The whole purpose of the Gutenberg block editor was to make it easier for non-coders to use WordPress. So a WordPress user asking for ideas on how to convince people to use WordPress presented an unflattering view of the success of the WordPress Gutenberg Project.

Where Will WordPress Be In Five Years?

Another awkward moment was when someone else asked Matt Mullenweg where he saw WordPress being in five years. The question seemingly caught him off guard as he was unable to articulate what the plan is for the world’s most popular content management system.

Mullenweg had been talking about the importance of AI and of some integrations being tested in the commercial version at WordPress.com. So the person asking the question asked if he had any other ideas beyond AI.

The person asked:

“If you have other ideas beyond AI or even how we consume WordPress five years from now that might be different from today.”

Matt Mullenweg answered:

“Yeah, it’s hard to think about anything except AI right now. And as I said a few years ago, before ChatGPT came out, learn AI deeply. Everyone in the room should be playing with it. Try out different models. Check out Grok, check out DeepSeek, two of the coolest ones that just launched.

And for WordPress, at that point will be past all the phases of Gutenberg. I think… I don’t know…”

It was at this point that Mullenweg called on Matías Ventura, Lead Architect of Gutenberg, to ask whether he had any ideas about where WordPress is headed in five years.

He continued:

“Matías, what do you think? What’s post-Gutenberg? We’ve been working for so long, it’s…”

Matías Ventura, Lead Architect of Gutenberg, came up to a microphone to help Mullenweg answer the question he was struggling with.

Matías answered:

“I mean, hopefully we’ll be done by then so…”

Mullenweg commented:

“Sometimes that last 10% takes, you know, 90% of the time.”

Matías quipped that it could take a hundred years, then continued his answer, which essentially conceded that there is no five-year plan without saying so outright.

He continued his answer:

“I don’t know, I think, well in the talk I gave I… also reflected a bit that part of the thing is just discovering as we go, like figuring out how like, right now it’s AI that’s shaping reality but who knows, in a few decades what it would be. And to me, the only conviction is that yeah, we’ll need to adapt, we’ll need to change. And that’s part of the fun of it, I think. So I’m looking forward to whatever comes.”

Mullenweg jumped in at this point with his thoughts:

“That’s a good point of the, you know, how many releases we have of WordPress right now, 60 or whatever… 70 probably…. Outside of Gutenberg, we haven’t had a roadmap that goes six months or a year, or a couple versions, because the world changes in ways you can’t predict.

But being responsive is, I think, really is how organisms survive.

You know, Darwin, said it’s not the fittest of the species that survives. It’s the one that’s most adaptable to change. I think that’s true for software as well.”

Mullenweg Challenged To Adapt To Change

His statement about being adaptable to change set up another awkward moment at the 6:55:47 minute mark where Taco Verdonschot, co-owner of Progress Planner, stood up to the microphone and asked Mullenweg if he really was committed to being adaptable.

Taco Verdonschot is formerly of Yoast SEO and currently sponsored to work on WordPress by Emilia Capital (owned by Joost de Valk and Marieke van de Rakt).

Taco asked:

“I’m Taco, co-owner of Progress Planner. I was wondering, you were talking about adaptability before and survival of the fittest. That means being open to change. What we’ve seen in the last couple of months is that people who were talking about change got banned from the project. How open are you to discussing change in the project?”

Mullenweg responded:

“Sure. I don’t want to go too far into this but I will say that talking about change will not get you banned. There’s other behaviors… but just talking about change is something that we do pretty much every day. And we’ve changed a lot over the years. We’ve changed a lot in the past year. So yeah. But I don’t want to speak to anyone personally, you know. So keep it positive.”

Biggest Challenges WordPress Will Face In Next Five Years

Watch the question and answer at the 6:19:24 mark