Cloudflare DDoS Report: 63% Of Known Attacks Blamed On Competitors via @sejournal, @martinibuster

Cloudflare released their 2025 Q2 DDoS Threat Report, which names the top ten sources of DDoS attacks and cites businesses targeting competitors as the largest source of DDoS attacks, according to surveyed respondents who had identified their attackers.

Survey: Who Attacked You?

Cloudflare surveyed customers about DDoS attacks, and 29% said they had identified the sources of those attacks. Of those who identified their attackers, 63% pointed to competitors, a pattern most common among businesses in the crypto, gambling, and gaming industries. Another 21% of those respondents said they were victims of state‑sponsored attacks, and 5% said they had accidentally attacked themselves, something that can happen with server misconfigurations.

This is how Cloudflare explained it:

“When asked who was behind the DDoS attacks they experienced in 2025 Q2, the majority (71%) of respondents said they didn’t know who attacked them. Of the remaining 29% of respondents that claimed to have identified the threat actor, 63% pointed to competitors, a pattern especially common in the Gaming, Gambling and Crypto industries. Another 21% attributed the attack to state-level or state-sponsored actors, while 5% each said they’d inadvertently attacked themselves (self-DDoS), were targeted by extortionists, or suffered an assault from disgruntled customers/users.”

Most Attacked Locations

One would think that the United States would be the most attacked location, given how many businesses and websites are located there. But the most attacked location was China, which climbed from position three to position one. Brazil also climbed four positions to second place. Turkey dropped four positions to land in sixth place, and Hong Kong dropped to seventh place. Vietnam, however, jumped fifteen places to land in eighth place.

Top Ten Most DDoS-Attacked Countries

  1. China
  2. Brazil
  3. Germany
  4. India
  5. South Korea
  6. Turkey
  7. Hong Kong
  8. Vietnam
  9. Russia
  10. Azerbaijan

Top Attacked Industries

Telecommunications was the most attacked industry, followed by Internet and by Information Technology and Services. Gaming and Gambling came next, followed by Banking and Financial Services and Retail.

  1. Telecommunications
  2. Internet
  3. Information Technology and Services
  4. Gaming
  5. Gambling and Casinos
  6. Banking and Financial Services
  7. Retail
  8. Agriculture
  9. Computer Software
  10. Government

Top Country-Level Sources Of DDoS Attacks

Cloudflare’s data shows that Ukraine is the fifth‑largest source of DDoS attacks, but it doesn’t say which areas of Ukraine are responsible. When I look at my own logs of bot attacks, the Ukrainian‑origin bots consistently trace back to Russian‑occupied territories. In my opinion, Cloudflare should have drawn that distinction.

The country of origin doesn’t mean that one country is shiftier than another. For example, the Netherlands ranks as the ninth‑largest source of DDoS attacks, which may be because it has strong user privacy laws that protect VPN users and is well positioned for low latency to both Europe and North America.

Cloudflare also provides the following note about country-level origins:

“It’s important to note that these “source” rankings reflect where botnet nodes, proxy or VPN endpoints reside — not the actual location of threat actors. For L3/4 DDoS attacks, where IP spoofing is rampant, we geolocate each packet to the Cloudflare data center that first ingested and blocked it, drawing on our presence in over 330 cities for truly granular accuracy.”

Top Ten Country Origins Of DDoS Attacks

  1. Indonesia
  2. Singapore
  3. Hong Kong
  4. Argentina
  5. Ukraine
  6. Russia
  7. Ecuador
  8. Vietnam
  9. Netherlands
  10. Thailand

Top ASN Sources Of DDoS Attacks

An ASN (Autonomous System Number) is a unique number assigned to networks or groups of networks that share the same rules for routing internet traffic. SEOs and publishers who track the origin of bad traffic and use .htaccess to block millions of IP ranges will recognize a number of the networks on this list. Hetzner, OVH, Tencent, Microsoft, the Google Cloud Platform, and Alibaba are all usual suspects.
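
For publishers who prefer to handle this kind of filtering in application code rather than in .htaccess, the sketch below shows the general idea of checking a visitor’s IP against a blocklist of network ranges announced by an abusive network. It is a minimal illustration using only the Python standard library; the CIDR ranges are placeholders, not real ASN data.

```python
import ipaddress

# Placeholder ranges; in practice you would load the CIDR blocks announced
# by the ASN you want to filter from your own threat-intel source.
BLOCKED_NETWORKS = [
    ipaddress.ip_network("203.0.113.0/24"),   # TEST-NET-3, placeholder
    ipaddress.ip_network("198.51.100.0/24"),  # TEST-NET-2, placeholder
]

def is_blocked(ip_string: str) -> bool:
    """Return True if the visitor IP falls inside any blocked network range."""
    ip = ipaddress.ip_address(ip_string)
    return any(ip in network for network in BLOCKED_NETWORKS)

if __name__ == "__main__":
    print(is_blocked("203.0.113.42"))  # True
    print(is_blocked("192.0.2.1"))     # False
```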

According to Cloudflare, Hetzner dropped from first place as the origin of DDoS attacks to third place. DigitalOcean was formerly the number one source of DDoS attacks and was pushed down to position two by Drei‑K‑Tech‑GmbH, which jumped six places to become the leading source of DDoS attacks.

Top Ten Network Sources Of DDoS Attacks

  1. Drei-K-Tech-GmbH
  2. DigitalOcean
  3. Hetzner
  4. Microsoft
  5. Viettel
  6. Tencent
  7. OVH
  8. Chinanet
  9. Google Cloud Platform
  10. Alibaba

DDoS Attacks Could Be Better Mitigated

Cloudflare noted that it has a program that allows cloud computing providers to rapidly respond to bad actors abusing their networks. It’s not just DDoS attacks that originate at cloud and web hosting providers; it’s also bots scanning for vulnerabilities and actively trying to hack websites. If more providers joined the program, there could be fewer DDoS attacks, and the web would be a much safer place.

This is how Cloudflare explains it:

“To help hosting providers, cloud computing providers and any Internet service providers identify and take down the abusive accounts that launch these attacks, we leverage Cloudflare’s unique vantage point to provide a free DDoS Botnet Threat Feed for Service Providers. Over 600 organizations worldwide have already signed up for this feed, and we’ve already seen great collaboration across the community to take down botnet nodes.”

Read the Cloudflare report:

Hyper-volumetric DDoS attacks skyrocket: Cloudflare’s 2025 Q2 DDoS threat report

Wix Announces AI Visibility Overview Citation & Sentiment Tracking Tool via @sejournal, @martinibuster

Wix has added support for Generative Engine Optimization (GEO) with a new tool called AI Visibility Overview, available in English to users with a Wix Business Manager account, with more languages rolling out soon. AI Visibility Overview enables users to track citations, monitor AI query volume and traffic, and benchmark performance against competitors.

AI Visibility Overview

Wix continues its innovative and forward-thinking approach to adding AI-powered tools that provide users with real-world benefits that help get work done. Tracking AI visibility is an advanced capability that no other CMS offers.

The newly announced AI tool provides the following benefits for Generative Engine Optimization (GEO):

  • “Manage AI citations & visibility:
    Users can track how often their website is cited by AI platforms in response to relevant queries, as well as add, or remove questions to better reflect their business.
  • Monitor brand sentiment across LLMs:
    GEO empowers users to stay informed on how their brand is perceived by analyzing sentiment, perception, and positioning in AI-generated content.
  • Benchmark visibility and competitive context:
    Users can compare their AI visibility performance to competitors to gain a better understanding of how their visibility stacks up against industry peers, identify growth opportunities, and discover which other sources are being cited in similar contexts.
  • Measure AI-driven traffic & query volume:
    Users can see how much traffic is driven to their site from AI platforms, as well as how frequently people ask about their brand or services in these engines.”

AI Visibility

Business customers are increasingly searching with AI, and Wix’s new AI Visibility Overview is the right tool at the right time, enabling businesses to keep up with where customers are today and gain an edge over competitors.

Read more about the new tool here:

Wix Analytics: About the AI Visibility Overview

Featured Image by Shutterstock/Roman Samborskyi

Google Rolls Out Gemini 2.5 Pro & Deep Search For Paid Subscribers via @sejournal, @MattGSouthern

Google is rolling out two enhancements to AI Mode in Labs: Gemini 2.5 Pro and Deep Search.

These capabilities are exclusive to users subscribed to Google’s AI Pro and AI Ultra plans.

Gemini 2.5 Pro Now Available In AI Mode

Subscribers can now access Gemini 2.5 Pro from a dropdown menu within the AI Mode tab.

Screenshot from: blog.google/products/search/deep-search-business-calling-google-search, July 2025.

While the default model remains available for general queries, the 2.5 Pro model is designed to handle more complex prompts, particularly those involving reasoning, mathematics, or coding.

In an example shared by Google, the model walks through a multi-step physics problem involving gravitational fields, showing how it can solve equations and explain its reasoning with supporting links.

Screenshot from: blog.google/products/search/deep-search-business-calling-google-search, July 2025.

Deep Search Offers AI-Assisted Research

Today’s update also introduces Deep Search, which Google describes as a tool for conducting more comprehensive research.

The feature can generate detailed, citation-supported reports by processing multiple searches and aggregating information across sources.

Google stated in its announcement:

“Deep Search is especially useful for in-depth research related to your job, hobbies, or studies.”

Availability & Rollout

These features are currently limited to users in the United States who subscribe to Google’s AI Pro or AI Ultra plans and have opted into AI Mode through Google Labs.

Google hasn’t provided a firm timeline for when all eligible users will receive access, but rollout has begun.

The “experimental” label on Gemini 2.5 Pro suggests continued adjustments based on user testing.

What This Means

The launch of Deep Search and Gemini 2.5 Pro reflects Google’s broader effort to incorporate generative AI into the search experience.

For marketers, the shift raises questions about visibility in a time when AI-generated summaries and reports may increasingly shape user behavior.

If Deep Search becomes a commonly used tool for information gathering, the structure and credibility of content could play a larger role in discoverability.

Gemini 2.5 Pro’s focus on reasoning and code-related queries makes it relevant for more technical users. Google has positioned it as capable of helping with debugging, code generation, and explaining advanced concepts, similar to tools like ChatGPT’s coding features or GitHub Copilot.

Its integration into Search may appeal to users who want technical assistance without leaving the browser environment.

Looking Ahead

The addition of these features behind a paywall continues Google’s movement toward monetizing AI capabilities through subscription services.

While billed as experimental, these updates may provide early insight into how the company envisions the future of AI in search: more automated, task-oriented, and user-specific.

Search professionals will want to monitor how these features evolve, as tools like Deep Search could become more widely adopted.

Google Search Can Now Call Local Businesses Using AI via @sejournal, @MattGSouthern

Google has introduced a new AI-powered calling feature in Search that contacts local businesses on a user’s behalf to gather pricing and availability details.

The feature, rolling out to all U.S. Search users this week, allows people to request information from multiple businesses with a single query.

When searching for services like pet grooming or dry cleaning, users may now see a new option to “Have AI check pricing.”

How It Works

After selecting the AI option, users are guided through a form to provide details about the service they need.

Google’s AI then calls relevant local businesses to gather information such as pricing, appointment availability, and service options. The responses are consolidated and presented to the user.

The experience starts with a typical local search, such as “pet groomers near me.” If the AI calling feature is available, users can specify details like:

  • Pet type, breed, and size
  • Requested services (e.g., bath, nail trim, haircut)
  • Time preferences (e.g., within 48 hours)
  • Preferred method of communication (SMS or email)

According to a Google spokesperson, the AI determines which businesses to contact based on traditional local search rankings. Only those that appear in results for the relevant query and match the user’s criteria will be contacted.

What It Looks Like

Examples show a multi-step process where users enter information and confirm their request.

Google displays responses from participating businesses, including prices and availability, all gathered through automated calls.

Before submitting a request, users must confirm that Google can call businesses and share the submitted details. The process is governed by Google’s privacy policy, and users are informed of how their data will be used.

Business Participation & Control

Businesses can manage whether they receive these AI-driven calls via their Business Profile settings.

Google describes the feature as creating “new opportunities” to connect with potential customers, while also giving businesses control over participation.

Available to All (With Premium Perks)

The AI calling feature is available to all users in the U.S., though Google AI Pro and AI Ultra subscribers benefit from higher usage limits.

Google says more agentic AI features will debut for these subscribers before expanding globally.

What This Means

Because the AI selects businesses using standard local search rankings, maintaining strong local SEO becomes even more important.

Businesses with optimized listings and higher rankings are more likely to receive calls and capture leads.

This could also shift how businesses handle inbound requests. Those that rely on phone calls may want to prepare staff or systems to handle more frequent, possibly scripted, AI-initiated inquiries.

Looking Ahead

By automating time-consuming tasks like gathering service quotes, Google aims to make Search more actionable.

Adoption will depend on how well the AI handles real-world complexity, as well as how many businesses opt in.

For marketers and local service providers, it’s another sign that search visibility directly connects to lead generation. Keeping Business Profile data accurate and staying visible in local results could increasingly determine whether a business gets contacted at all.

Confirmed CWV Reporting Glitch In Google Search Console via @sejournal, @martinibuster

Google Search Console Core Web Vitals (CWV) reporting for mobile is experiencing a dip that is confirmed to be related to the Chrome User Experience Report (CrUX). Search Console CWV reports for mobile performance show a marked dip beginning around July 10, after which the reporting appears to stop completely.

Not A Search Console Issue

Someone posted about it on Bluesky:

“Hey @johnmu.com is there a known issue or bug with Core Web Vitals reporting in Search Console? Seeing a sudden massive drop in reported URLs (both “good” and “needs improvement”) on mobile as of July 14.”

The person referred to July 14, but that’s the date the reporting hit zero. The drop actually starts closer to July 10, which you can see by hovering a cursor over the point where the decline begins.

Google’s John Mueller responded:

“These reports are based on samples of what we know for your site, and sometimes the overall sample size for a site changes. That’s not indicative of a problem. I’d focus on the samples with issues (in your case it looks fine), rather than the absolute counts.”

The person who started the discussion responded to inform Mueller that this wasn’t limited to his site; the same drop in reporting was happening on other sites.

Mueller was unaware of any problem with CWV reporting, so he naturally assumed that this was an artifact of natural changes in Internet traffic and user behavior. His next response continued under the assumption that this wasn’t a widespread issue:

“That can happen. The web is dynamic and alive – our systems have to readjust these samples over time.”

Then Jamie Indigo responded to confirm that she was seeing it, too:

“Hey John! Thanks for responding 🙂 It seems like … everyone beyond the usual ebb and flow. Confirming nothing in the mechanics have changed?”

At this point, it was becoming clear that this behavior wasn’t isolated to a single site, and Mueller’s response to Jamie reflected that growing awareness. He confirmed that nothing had changed on the Search Console side, leaving open the possibility that the issue lies with the CrUX side of Core Web Vitals reporting.

His response:

“Correct, nothing in the mechanics changed (at least with regards to Search Console — I’m also not aware of anything on the Chrome / CrUX side, but I’m not as involved there).”

CrUX CWV Field Data

CrUX is the acronym for the Chrome User Experience Report. It’s CWV reporting based on real website visits: the data is collected from Chrome browser visits by users who have opted in to sharing their usage data.

Google’s Chrome For Developers page explains:

“The Chrome User Experience Report (also known as the Chrome UX Report, or CrUX for short) is a dataset that reflects how real-world Chrome users experience popular destinations on the web.

CrUX is the official dataset of the Web Vitals program. All user-centric Core Web Vitals metrics are represented.

CrUX data is collected from real browsers around the world, based on certain browser options which determine user eligibility. A set of dimensions and metrics are collected which allow site owners to determine how users experience their sites.”
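
Publishers who want to check the underlying field data for themselves can query the CrUX API directly rather than relying only on Search Console’s charts. Below is a minimal sketch, assuming you have a CrUX API key and the requests library installed; the origin is a placeholder, and the metrics returned depend on what data CrUX actually has for that origin.

```python
import requests

API_KEY = "YOUR_CRUX_API_KEY"  # placeholder; create one in Google Cloud Console
ENDPOINT = f"https://chromeuxreport.googleapis.com/v1/records:queryRecord?key={API_KEY}"

# Request phone (mobile) field data for an example origin.
payload = {
    "origin": "https://www.example.com",  # placeholder origin
    "formFactor": "PHONE",
    "metrics": [
        "largest_contentful_paint",
        "interaction_to_next_paint",
        "cumulative_layout_shift",
    ],
}

response = requests.post(ENDPOINT, json=payload, timeout=30)
response.raise_for_status()
record = response.json().get("record", {})

# Print the 75th percentile for each metric CrUX returned.
for metric_name, metric in record.get("metrics", {}).items():
    print(metric_name, metric.get("percentiles", {}).get("p75"))
```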

Core Web Vitals Reporting Outage Is Widespread

At this point, more people joined the conversation, with Alan Bleiweiss offering both a comment and a screenshot showing the same complete drop-off in the Search Console CWV reports for other websites.

He posted:

“oooh Google had to slow down server requests to set aside more power to keep the swimming pools cool as the summer heats up.”

Here’s a closeup detail of Alan’s screenshot of a Search Console CWV report:

Screenshot Of CWV Report Showing July 10 Drop

I searched the Chrome Lighthouse changelog to see if anything there corresponds to the drop, but nothing stood out.

So what is going on?

CWV Reporting Outage Is Confirmed

I next checked the X and Bluesky accounts of Googlers on the Chrome team and found that Barry Pollard, Web Performance Developer Advocate on Google Chrome, had posted about the issue last week.

Barry posted a note about a reporting outage on Bluesky:

“We’ve noticed another dip on the metrics this month, particularly on mobile. We are actively investigating this and have a potential reason and fix rolling out to reverse this temporary dip. We’ll update further next month. Other than that, there are no further announcements this month.”

Takeaways

Google Search Console Core Web Vitals (CWV) data drop:
A sudden stop in CWV reporting was observed in Google Search Console around July 10, especially on mobile.

Issue is widespread, not site-specific:
Multiple users confirmed the drop across different websites, ruling out individual site problems.

Origin of issue is not at Search Console:
John Mueller confirmed there were no changes on the Search Console side.

Possible link to CrUX data pipeline:
Barry Pollard from the Chrome team confirmed a reporting outage and said a fix is rolling out to reverse the temporary dip.

We now know that this is a confirmed issue. Google Search Console’s Core Web Vitals reports began showing a reporting outage around July 10, leading users to suspect a bug. Barry Pollard later acknowledged it as a reporting outage affecting CrUX data, particularly on mobile.

Featured Image by Shutterstock/Mix and Match Studio

WordPress Malware Scanner Plugin Contains Vulnerability via @sejournal, @martinibuster

Wordfence published an advisory on the WordPress Malcure Malware Scanner plugin, which was discovered to have a vulnerability rated at a severity level of 8.1. At the time of publishing, there is no patch to fix the problem.

Screenshot Showing 8.1 Severity Rating

Malcure Malware Scanner Vulnerability

The Malcure Malware Scanner plugin, installed on over 10,000 WordPress websites, is vulnerable to “Arbitrary File Deletion due to a missing capability check on the wpmr_delete_file() function” by authenticated attackers. The fact that an attacker needs to be authenticated makes exploitation somewhat less likely, but not by much, because it requires only subscriber-level access, the lowest level of authentication. The “subscriber” role is the default registration level on a WordPress website (if registration is allowed).

According to Wordfence:

“This makes it possible for authenticated attackers, with Subscriber-level access and above, to delete arbitrary files making remote code execution possible. This is only exploitable when advanced mode is enabled on the site.”

There is no known patch available for the plugin, and users are cautioned to take appropriate action, such as uninstalling the plugin, to mitigate risk.

The plugin is currently unavailable for download with a notice showing that it is under review.

Screenshot Of Malcure Plugin At WordPress Repository

Read More WordPress News

WordPress Update 6.8.2 – Ends Security Support For 0.9% of Sites

Featured Image by Shutterstock/Kues

Anthropic’s New Financial Tool Signals Shift To Offering Specialized Services via @sejournal, @martinibuster

Anthropic announced a new Financial Analysis Solution powered by its Claude 4 and Claude Code models. This is Anthropic’s first foray into a major vertical-focused platform, signaling a shift toward AI providers building tools that directly address common pain points in business workflows and productivity.

Claude For Financial Services

Claude for Financial Services is an AI-powered financial analysis tool targeted at financial professionals. It offers data integration via MCP (Model Context Protocol), secure data handling, and privacy: no user data is used to train Claude’s generative models.

According to the announcement:

“Claude has real-time access to comprehensive financial information including:

  • Box enables secure document management and data room analysis
  • Daloopa supplies high-quality fundamentals and KPIs from SEC filings
  • Databricks offers unified analytics for big data and AI workloads
  • FactSet provides comprehensive equity prices, fundamentals, and consensus estimates
  • Morningstar contributes valuation data and research analytics
  • PitchBook delivers industry-leading private capital market data and research, empowering users to source investment and fundraising opportunities, conduct due diligence and benchmark performance, faster and with greater confidence
  • S&P Global enables access to Capital IQ Financials, earnings call transcripts, and more–essentially your entire research workflow”
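
For readers unfamiliar with MCP, integrations like the ones listed above are exposed to Claude through MCP servers that publish tools the model can call. The sketch below is not Anthropic’s product or any of the connectors named above; it is a minimal, hypothetical example using the open-source MCP Python SDK (the mcp package) to show the general shape of such an integration.

```python
from mcp.server.fastmcp import FastMCP

# Hypothetical MCP server exposing a single financial-data tool.
mcp = FastMCP("demo-financials")

# In a real connector this would query a data provider's API;
# here it returns canned numbers purely for illustration.
FAKE_FUNDAMENTALS = {
    "ACME": {"revenue_usd_m": 1200, "eps": 3.42},
}

@mcp.tool()
def get_fundamentals(ticker: str) -> dict:
    """Return basic fundamentals for a ticker symbol (illustrative data only)."""
    return FAKE_FUNDAMENTALS.get(ticker.upper(), {"error": "unknown ticker"})

if __name__ == "__main__":
    # Runs the server over stdio so an MCP-capable client can connect to it.
    mcp.run()
```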

Takeaway:

This launch may signal a shift among AI providers toward building industry-specific tools that solve problems for professionals, rather than offering only general-purpose models that others use to provide the same solutions. Generative AI companies have the ability to stitch together solutions from big data providers in ways that smaller companies can’t.

Read more at Anthropic:

Transform financial services with Claude

Featured Image by Shutterstock/gguy

WordPress Update 6.8.2 – Ends Security Support For 0.9% of Sites via @sejournal, @martinibuster

WordPress released a maintenance update that contains twenty changes to the core and fixes fifteen issues in the Gutenberg block editor. WordPress also announced that it is dropping security support for WordPress versions 4.1 to 4.6.

Short-Cycle Maintenance Release

This is a maintenance release that incrementally makes WordPress a smoother experience.

A representative list of the fixes is available in the official WordPress 6.8.2 announcement, linked below.

Dropping Security Support

WordPress announced that it is dropping support for versions 4.1 through 4.6. According to the official WordPress stats, only 0.9% of websites are using those versions of WordPress.

Statement on release page:

“Dropping security updates for WordPress versions 4.1 through 4.6
This is not directly related to the 6.8.2 maintenance release, but branches 4.1 to 4.6 had their final release today. These branches won’t receive any security update anymore.”

Another WordPress page provides more information:

“As of July 2025, the WordPress Security Team will no longer provide security updates for WordPress versions 4.1 through 4.6.

These versions were first released nine or more years ago and over 99% of WordPress installations run a more recent version. The chances this will affect your site, or sites, is very small.”

Read the official WordPress 6.8.2 announcement:

WordPress 6.8.2 Maintenance Release

Read More WordPress News

Malware Discovered In Gravity Forms WordPress Plugin

Featured Image by Shutterstock/Praew stock

Google Updates Search Analytics API To Clarify Data Freshness via @sejournal, @MattGSouthern

Google has added a new metadata field to the Search Analytics API, making it easier for developers and SEO professionals to identify when they’re working with incomplete or still-processing data.

The update introduces new transparency into the freshness of query results, an improvement for marketers who rely on up-to-date metrics to inform real-time decisions.

What’s New In The API

The metadata field appears when requests include the dataState parameter set to all or hourly_all, enabling access to data that may still be in the process of being collected.

Two metadata values are now available:

  • first_incomplete_date: Indicates the earliest date for which data is still incomplete. Only appears when data is grouped by date.
  • first_incomplete_hour: Indicates the first hour where data remains incomplete. Only appears when data is grouped by hour.

Both values help clarify whether recent metrics can be considered stable or if they may still change as Google finalizes its processing.
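
As a rough illustration, here is a minimal sketch using the Google API Python client, assuming you already have authorized credentials for the property. The dates and site URL are placeholders, and the exact key casing of the metadata fields in the JSON response may differ slightly from the documented names, so the sketch checks both forms.

```python
from googleapiclient.discovery import build
from google.oauth2.credentials import Credentials  # any google-auth credentials class works

def fetch_recent_performance(creds: Credentials, site_url: str) -> None:
    """Query Search Analytics with fresh data included and flag incomplete dates."""
    service = build("searchconsole", "v1", credentials=creds)

    body = {
        "startDate": "2025-07-01",   # placeholder range
        "endDate": "2025-07-15",
        "dimensions": ["date"],
        "dataState": "all",          # include fresh data that may still be processing
    }
    response = service.searchanalytics().query(siteUrl=site_url, body=body).execute()

    # The metadata object only appears when part of the requested range is incomplete.
    metadata = response.get("metadata", {})
    incomplete_from = (
        metadata.get("first_incomplete_date") or metadata.get("firstIncompleteDate")
    )

    for row in response.get("rows", []):
        date = row["keys"][0]
        note = " (may still change)" if incomplete_from and date >= incomplete_from else ""
        print(date, row.get("clicks"), row.get("impressions"), note)
```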

Why It Matters For SEO Reporting

This enhancement allows you to better distinguish between legitimate changes in search performance and temporary gaps caused by incomplete data.

To help reduce the risk of misinterpreting short-term fluctuations, Google’s documentation states:

“All values after the first_incomplete_date may still change noticeably.”

For those running automated reports, the new metadata enables smarter logic, such as flagging or excluding fresh but incomplete data to avoid misleading stakeholders.

Time Zone Consistency

All timestamps provided in the metadata field use the America/Los_Angeles time zone, regardless of the request origin or property location. Developers may need to account for this when integrating the data into local systems.
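
For example, here is a minimal sketch of normalizing one of these Pacific-time values to UTC with the Python standard library; the date is a placeholder.

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# A hypothetical first_incomplete_date value, interpreted as America/Los_Angeles.
pacific_midnight = datetime(2025, 7, 14, tzinfo=ZoneInfo("America/Los_Angeles"))

# Convert to UTC (or swap in your local zone) before comparing with local timestamps.
print(pacific_midnight.astimezone(ZoneInfo("UTC")).isoformat())
```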

Backward-Compatible Implementation

The new metadata is returned as an optional object and doesn’t alter existing API responses unless requested. This means no breaking changes for current implementations, and developers can begin using the feature as needed.

Best Practices For Implementation

To take full advantage of this update:

  • Include logic to check for the metadata object when requesting recent data.
  • Consider displaying warnings or footnotes in reports when metadata indicates incomplete periods.
  • Schedule data refreshes after the incomplete window has passed to ensure accuracy.

Google also reminds users that the Search Analytics API continues to return only top rows, not a complete dataset, due to system limitations.

Looking Ahead

This small but meaningful addition gives SEO teams more clarity around data freshness, a frequent pain point when working with hourly or near-real-time performance metrics.

It’s a welcome improvement for anyone building tools or dashboards on top of the Search Console API.

The metadata field is available now through standard API requests. Full implementation details are available in the Search Analytics API documentation.


Featured Image: Roman Samborskyi/Shutterstock

Google Says AI Won’t Replace The Need For SEO via @sejournal, @martinibuster

Google’s John Mueller and Martin Splitt discussed the question of whether AI will replace the need for SEO. Mueller expressed a common-sense opinion about the reality of the web ecosystem and AI chatbots as they exist today.

Context Of Discussion

The context of the discussion was SEO basics that a business needs to know. Mueller mentioned that businesses might want to consider hiring an SEO who can help guide the site through its SEO journey.

Mueller observed:

“…you also need someone like an SEO as a partner to give you updates along the way and say, ‘Okay, we did all of these things,’ and they can list them out and tell you exactly what they did, ‘These things are going to take a while, and I can show you when Google crawls, we can follow along to see like what is happening there.’”

Is There Value In Learning SEO?

It was at this point that Martin Splitt asked if generative AI will make having to learn SEO obsolete or whether entering a prompt will give all the answers a business person needs to know. Mueller’s answer was tethered to how things are right now and avoided speculating about how things will change in a year or more.

Splitt asked:

“Okay, I think that’s pretty good. Last but not least, with generative AI and chatbot AI things happening. Do you think there’s still a value in learning these kind of things? Or can I just enter a prompt and it’ll figure things out for me?”

Mueller affirmed that knowing SEO will still be needed as long as there are websites, because search engines and chatbots need the information that exists on websites. He offered examples of local businesses and ecommerce sites that still need to be found, whether through an AI chatbot or through search.

He answered:

“Absolutely value in learning these things and in making a good website. I think there are lots of things that all of these chatbots and other ways to get information, they don’t replace a website, especially for local search and ecommerce.

So, especially if you’re a local business, maybe it’s fine if a chatbot mentions your business name and tells people how to get there. Maybe that’s perfectly fine, but oftentimes, they do that based on web content that they found.

Having a website is the basis for being visible in all of these systems, and for a lot of other things where you offer a service or something, some other kind of functionality on a website where you have products to sell, where you have subscriptions or anything, a chat response can’t replace that.

If you want a t shirt, you don’t want a description of how to make your own t-shirt. You want a link to a store where it’s like, ‘Oh, here’s t-shirt designs,’ maybe t-shirt designs in that specific style that you like, but you go to this website and buy those t-shirts there.”

Martin acknowledged the common sense of that answer and they joked around a bit about Mueller hoping that an AI will be able to do his job once he retires.

That’s the context for this part of their conversation:

“Okay. That’s very fair. Yeah, that makes sense. Okay, so you think AI is not going to take it all away from us?”

And Mueller answers with the comment about AI replacing him after he retires:

“Well, we’ll see. I can’t make any promises. I think, at some point, I would like to retire, and then maybe AI takes over my work then. But, like, there’s lots of stuff to be done until then. There are lots of things that I imagine AI is not going to just replace.”

What About CMS Platforms With AI?

Something that wasn’t discussed is the trend of AI within content management systems. Many web hosts and WordPress plugins are already integrating AI into the workflow of creating and optimizing websites. Wix has already integrated AI into its workflow, and it won’t be much longer until AI has a stronger presence within WordPress, which is what the new WordPress AI team is working on.

Screenshot Of ChatGPT Choosing Number 27

Will AI ever replace the need for SEO? Many easy things that can be scaled are already automated. However, many of the best ideas for marketing and communicating with humans are still best handled by humans, not AI. The nature of generative AI, which is to generate the most likely answer or series of words in a sentence, precludes it from ever having an original idea. AI is so locked into being average that if you ask it to pick a number between one and fifty, it will choose the number 27 because the AI training binds it to picking the likeliest number, even when instructed to randomize the choice.

Listen to Search Off The Record at about the 24-minute mark.

Featured Image by Shutterstock/Roman Samborskyi