2024 WordPress Vulnerability Report Shows Errors Sites Keep Making via @sejournal, @martinibuster

WordPress security scanner WPScan’s 2024 WordPress vulnerability report calls attention to WordPress vulnerability trends and suggests the kinds of things website publishers (and SEOs) should be looking out for.

Some of the key findings from the report were that just over 20% of vulnerabilities were rated as high or critical level threats, while medium severity threats made up the majority at 67% of reported vulnerabilities. Medium level vulnerabilities are often treated as if they were low-level threats, but they are not, and they deserve attention.

The WPScan report advised:

“While severity doesn’t translate directly to the risk of exploitation, it’s an important guideline for website owners to make an educated decision about when to disable or update the extension.”

WordPress Vulnerability Severity Distribution

Critical level vulnerabilities, the highest level of threat, represented only 2.38% of vulnerabilities, which is essentially good news for WordPress publishers. Yet, as mentioned earlier, when combined with high level threats (17.68%), the share of concerning vulnerabilities rises to almost 20%.

Here are the percentages by severity ratings:

  • Critical 2.38%
  • High 17.68%
  • Medium 67.12%
  • Low 12.83%

Authenticated Versus Unauthenticated

Authenticated vulnerabilities are those that require an attacker to first obtain user credentials, and their accompanying permission levels, in order to exploit a particular vulnerability. Of the authenticated exploits, those requiring only subscriber-level authentication are the most exploitable, while those requiring administrator-level access present the least risk (although not always a low risk, for a variety of reasons).

Unauthenticated attacks are generally the easiest to exploit because anyone can launch an attack without having to first acquire a user credential.

The WPScan vulnerability report found that about 22% of reported vulnerabilities required subscriber-level credentials or no authentication at all, representing the most exploitable vulnerabilities. On the other end of the exploitability scale are vulnerabilities requiring admin permission levels, which represented 30.71% of reported vulnerabilities.

Permission Levels Required For Exploits

Vulnerabilities requiring administrator-level credentials represented the highest percentage of exploits, followed by Cross Site Request Forgery (CSRF) at 24.74% of vulnerabilities. This is interesting because CSRF is an attack that uses social engineering to get a victim to click a link through which the user’s permission levels are acquired. If attackers can trick an admin-level user into following a link, they can assume that level of privileges on the WordPress website.
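CSRF defenses generally work by tying each form submission to a secret the attacker cannot forge. As a rough illustration (this is not WordPress’s actual nonce implementation, and the function names are hypothetical), an anti-CSRF token can be derived from a server secret and the user’s session:

```python
import hashlib
import hmac
import secrets

SECRET_KEY = secrets.token_bytes(32)  # hypothetical per-application secret

def issue_csrf_token(session_id: str) -> str:
    """Derive a per-session anti-CSRF token from the server secret."""
    return hmac.new(SECRET_KEY, session_id.encode(), hashlib.sha256).hexdigest()

def verify_csrf_token(session_id: str, submitted: str) -> bool:
    """Reject submissions whose token doesn't match this session's token."""
    return hmac.compare_digest(issue_csrf_token(session_id), submitted)
```

A link an attacker tricks the victim into clicking can’t carry a valid token, because the attacker never sees the server secret or the victim’s session-bound token.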

The following are the percentages of exploits, ordered by the role necessary to launch an attack.

Ascending Order Of User Roles For Vulnerabilities

  • Author 2.19%
  • Subscriber 10.4%
  • Unauthenticated 12.35%
  • Contributor 19.62%
  • CSRF 24.74%
  • Admin 30.71%

Most Common Vulnerability Types Requiring Minimal Authentication

Broken Access Control in the context of WordPress refers to a security failure that can allow an attacker without necessary permission credentials to gain access to higher credential permissions.

In the section of the report that looks at the occurrences and vulnerability types underlying unauthenticated or subscriber-level vulnerabilities (Occurrence vs Vulnerability on Unauthenticated or Subscriber+ reports), WPScan breaks down the percentages for each vulnerability type that is most common among the exploits that are easiest to launch, because they require minimal or no user credential authentication.

The WPScan threat report noted that Broken Access Control represents a whopping 84.99%, followed by SQL injection at 20.64%.

The Open Worldwide Application Security Project (OWASP) defines Broken Access Control as:

“Access control, sometimes called authorization, is how a web application grants access to content and functions to some users and not others. These checks are performed after authentication, and govern what ‘authorized’ users are allowed to do.

Access control sounds like a simple problem but is insidiously difficult to implement correctly. A web application’s access control model is closely tied to the content and functions that the site provides. In addition, the users may fall into a number of groups or roles with different abilities or privileges.”
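In code terms, access control failures usually come down to a missing or incorrect capability check before a privileged action. A minimal sketch, with role names loosely modeled on WordPress (the map and function names are illustrative, not WordPress APIs):

```python
# Illustrative role→capability map, loosely modeled on WordPress roles.
CAPABILITIES = {
    "subscriber":    {"read"},
    "contributor":   {"read", "edit_posts"},
    "author":        {"read", "edit_posts", "publish_posts"},
    "administrator": {"read", "edit_posts", "publish_posts", "manage_options"},
}

def user_can(role: str, capability: str) -> bool:
    """Deny by default: unknown roles or capabilities get no access."""
    return capability in CAPABILITIES.get(role, set())

def delete_plugin(role: str) -> str:
    # Broken access control is typically this guard being absent or wrong.
    if not user_can(role, "manage_options"):
        raise PermissionError("insufficient capability")
    return "plugin deleted"
```

The vulnerability class WPScan describes is, in essence, a privileged code path like `delete_plugin` that a lower-privileged or unauthenticated user can reach because the guard is missing.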

SQL injection, at 20.64%, represents the second most prevalent vulnerability type, which WPScan referred to as both “high severity and risk” in the context of vulnerabilities requiring minimal authentication levels, because attackers can access and/or tamper with the database, which is the heart of every WordPress website.

These are the percentages:

  • Broken Access Control 84.99%
  • SQL Injection 20.64%
  • Cross-Site Scripting 9.4%
  • Unauthenticated Arbitrary File Upload 5.28%
  • Sensitive Data Disclosure 4.59%
  • Insecure Direct Object Reference (IDOR) 3.67%
  • Remote Code Execution 2.52%
  • Other 14.45%
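The SQL injection risk noted above comes from splicing user input directly into SQL text; parameterized queries are the standard fix. A minimal sketch using Python’s sqlite3 module (SQLite stands in for WordPress’s MySQL database here):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin'), ('bob', 'subscriber')")

def find_user_unsafe(name: str):
    # VULNERABLE: attacker-controlled input is spliced into the SQL string.
    return conn.execute(
        f"SELECT name, role FROM users WHERE name = '{name}'"
    ).fetchall()

def find_user_safe(name: str):
    # SAFE: placeholder binding keeps the input as data, never as SQL.
    return conn.execute(
        "SELECT name, role FROM users WHERE name = ?", (name,)
    ).fetchall()
```

A classic payload such as `' OR '1'='1` makes the unsafe query return every row in the table, while the parameterized version treats it as a literal (nonexistent) name.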

Vulnerabilities In The WordPress Core Itself

The overwhelming majority of vulnerability issues were reported in third-party plugins and themes. However, a total of 13 vulnerabilities were reported in the WordPress core itself in 2023. Of those thirteen, only one was rated as a high severity threat, the second highest level under the Common Vulnerability Scoring System (CVSS), in which Critical is the highest rating.

The WordPress core platform itself is held to the highest standards and benefits from a worldwide community that is vigilant in discovering and patching vulnerabilities.

Website Security Should Be Considered As Technical SEO

Site audits don’t normally cover website security, but in my opinion every responsible audit should at least talk about security headers. As I’ve been saying for years, website security quickly becomes an SEO issue once a website’s rankings start disappearing from the search engine results pages (SERPs) because the site was compromised by a vulnerability. That’s why it’s critical to be proactive about website security.

According to the WPScan report, the main points of entry for hacked websites were leaked credentials and weak passwords. Enforcing strong password standards plus two-factor authentication is an important part of every website’s security stance.

Using security headers is another way to help protect against Cross-Site Scripting and other kinds of vulnerabilities.
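A quick way to audit this is to compare a page’s response headers against a commonly recommended baseline. A minimal sketch (the header names are real; which ones you require is a policy choice):

```python
# Commonly recommended security response headers.
RECOMMENDED = {
    "Content-Security-Policy",
    "X-Content-Type-Options",
    "X-Frame-Options",
    "Strict-Transport-Security",
    "Referrer-Policy",
}

def missing_security_headers(headers: dict) -> set:
    """Return recommended headers absent from a response (case-insensitive)."""
    present = {name.title() for name in headers}
    return {h for h in RECOMMENDED if h.title() not in present}
```

Feeding this the headers from a `curl -I` of your homepage gives a quick gap list to hand to whoever manages the server configuration.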

Lastly, a WordPress firewall and website hardening are also useful proactive approaches to website security. I once added a forum to a brand new website I created and it was immediately under attack within minutes. Believe it or not, virtually every website worldwide is under attack 24 hours a day by bots scanning for vulnerabilities.

Read the WPScan Report:

WPScan 2024 Website Threat Report

Featured Image by Shutterstock/Ljupco Smokovski

Google On The SEO Impact Of 503 Status Codes via @sejournal, @MattGSouthern

Searchers who expect a website to always be available may be disappointed to encounter a 503 status code.

However, brief downtime is perfectly acceptable.

During the April edition of the Google Search Central SEO office hours, a question was raised about the potential impact of intermittently serving 503 “Service Unavailable” status codes.

Gary Illyes, a long-time Google webmaster trends analyst, clarified the search engine’s stance.

“Serving a 503 status code for an extended period of time will cause a decrease in crawl rate.

Fortunately for you, 10-15 minutes every now and then is not ‘extended’ by any means, so you should be fine.”

The Importance Of Uptime

While 100% uptime may be ideal, it’s not required to maintain good rankings in Google’s search results.

Websites undergo periods of maintenance and updates and can experience unplanned outages from time to time.

As long as these downtimes are brief and infrequent, they’re unlikely to impact crawling and indexing severely.

An extended period wasn’t clearly defined, but the example of 10-15-minute windows several times per week was deemed acceptable.

Previously, Google stated it would start deindexing webpages if a site is down for over a few days.

Planning For Updates

Advanced planning and strategy are recommended for websites expecting extended downtime.

Techniques like shadowing a staging site or employing progressive rollouts can reduce the visibility of errors and downtime.

Illyes advises:

“If you do things by the book, meaning the website keeps being resolvable and the actual downtime is minimal, changing [configuration] should not have negative effects on your pages’ rankings in Google’s search results.”
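Doing things “by the book” for a planned window typically means serving the 503 with a Retry-After header so crawlers know the outage is temporary. A minimal WSGI sketch (the 600-second value is an example, not a Google requirement):

```python
def maintenance_app(environ, start_response):
    """Serve every request with a 503 + Retry-After during a deploy window."""
    start_response(
        "503 Service Unavailable",
        [("Retry-After", "600"),  # seconds until clients should retry
         ("Content-Type", "text/plain")],
    )
    return [b"Down for maintenance, back shortly."]
```

In practice you’d mount this (or the equivalent web-server rule) only for the duration of the deploy, then swap the normal application back in.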

While continuous uptime is ideal for user experience, Google’s systems can tolerate short downtime without negatively ranking the website in search results.

Why SEJ Cares

Google’s guidance on brief 503 status codes provides relief for publishers who may be concerned about the potential negative impacts of website downtime.

Many sites go through regular update cycles on a weekly or monthly cadence, requiring some downtime windows.

These can be for publishing new content, product updates, security patches, and general housekeeping.

Additionally, unplanned outages, server issues, and other unavoidable downtime happen.

As long as temporary blips are being actively resolved, there’s no need to panic about losing rankings and search visibility.

How This Can Help You

Website owners, developers, and SEO professionals managing websites can benefit from understanding Google’s tolerances for 503 status codes.

Some key takeaways:

  • Plan for and minimize downtime during updates, but don’t stress over short periods serving 503s.
  • Monitor analytics and user feedback to ensure users aren’t severely impacted by unavailable periods.
  • Uptime and swift error resolutions should be prioritized as much as possible for overall website health.
  • Investigate techniques like staged rollouts to reduce errors for major updates requiring extended downtime.

With reasonable expectations set by Google, websites can make informed decisions about balancing downtimes with SEO priorities.

Hear the full question and answer in the video below:


Featured Image: Zikku Creative/Shutterstock

Mozilla Squeezes More Speed From Firefox Browser via @sejournal, @MattGSouthern

Mozilla has implemented a performance upgrade to its Firefox web browser that could translate into faster website load times – welcome news for SEO professionals and their clients.

The technical details involve moving certain tasks, specifically decompression of gzip and brotli content, away from the browser’s main processing thread.

While this might sound complex, the result is quite simple: web pages load more quickly and feel more responsive when using Firefox.

The Firefox dev team states:

“This work has delivered huge performance wins on our high-level page load metrics, reducing First Contentful Paint and Largest Contentful Paint by 10%.”

First Contentful Paint and Largest Contentful Paint measure how quickly websites render content visible to users after navigation.

Improving these by 10% could mean millions of web pages loading noticeably faster in Firefox.
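Firefox’s change lives in its C++ networking code, but the idea — moving decompression off the thread that handles user interaction — can be sketched in a few lines of Python by way of analogy:

```python
import gzip
from concurrent.futures import ThreadPoolExecutor

# A worker pool stands in for the browser's off-main-thread helpers.
pool = ThreadPoolExecutor(max_workers=2)

def decompress_in_background(compressed: bytes):
    """Submit gzip decompression to a worker; the caller's thread stays free."""
    return pool.submit(gzip.decompress, compressed)
```

The main thread submits the work, keeps handling events, and collects the result with `.result()` only when the decompressed bytes are actually needed.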

Why SEJ Cares

For SEO professionals, fast-loading websites are crucial because they provide a good user experience, which can potentially influence search rankings.

Any measures that speed up load times are good for SEO.

The performance upgrade has also drawn praise from web experts.

Barry Pollard, a respected voice on web performance, tweeted that Firefox’s threading change “should be some good responsiveness wins” that could enhance browser interactivity.

Looking Ahead

In the constantly accelerating online world, shaving precious milliseconds off load times keeps websites competitive and users engaged.

As Firefox rolls out this updated version, expect faster load times and smoother user experiences in this browser.

FAQ

What are First Contentful Paint and Largest Contentful Paint, and why are they important?

First Contentful Paint (FCP) and Largest Contentful Paint (LCP) are performance metrics used to assess the speed at which a website presents visual content to its users after navigation.

FCP measures the time from navigation to when the browser renders the first piece of content from the DOM, providing a user with the first visual indication that a page is loading.

LCP, on the other hand, marks the point in the page load timeline when the largest text block or image element is rendered on the screen.

These metrics are relevant to SEO as they indicate user experience quality; faster FCP and LCP times generally correlate with a better user experience, which can positively impact search visibility.


Featured Image: T. Schneider/Shutterstock

Google Ends Video Carousel Structured Data Test via @sejournal, @martinibuster

Google has ended support for the video carousel (limited access) markup documentation and the associated rich results because, after testing the structured data, it determined the feature wasn’t useful at scale. Site publishers who use this specific structured data don’t necessarily have to remove it, as other search engines may find it useful. However, Google won’t use it for rich results.

Google removed a paragraph from the section of the documentation that was labeled “Video with key moments in search results”.

Removed paragraph:

“Video host carousel (limited access): Enable users to explore your video gallery pages by adding ItemList structured data. This feature is currently limited to a small set of providers. Google Search may automatically show a host carousel for your site without you having to add ItemList structured data.”

Google also removed the entire structured data example.

Screenshot Of Removed Structured Data Example

Screenshot of a webpage featuring a VideoObject JSON-LD structured data example, with text and code snippets explaining how to implement structured data for a video about self-driving bicycles.

Google removed a link to the “Carousel guidelines” from the section titled Guidelines, where Google links to specific guideline documentation. Curiously, the page that was linked to still exists.

This is what was removed:

Carousel guidelines

Google removed the following phrase from the “Structured data type definitions” section:

“ItemList: Mark up pages with a list of videos to enable a video host carousel.”

And the entire section listing the recommended properties is now gone.

Screenshot Of Removed Recommended Properties Section

Screenshot of the removed Google structured data documentation, showing JSON code examples for adding video objects to an ItemList, with explanations of the ListItem and ItemList properties and URLs.

Why Google Removed The Documentation

Google’s developer changelog lists the following reasons why the documentation was removed:

“Removing video carousel (limited access) documentation
What: Removed video carousel guidance from the video structured data documentation.

Why: We initially tested video carousel markup with a group of site owners, and ultimately found that it wasn’t useful for the ecosystem at scale. You can leave the markup on your site so that search engines and other systems can better understand your web page.”

Read the updated guidance for video structured data

Video (VideoObject, Clip, BroadcastEvent) structured data

View an archive of the old structured data documentation

Archive of Video (VideoObject, Clip, BroadcastEvent) structured data

Featured Image by Shutterstock/Koshiro K

Google Search & Maps Get Upgrades For Greener Travel via @sejournal, @MattGSouthern

Google announced updates to its Maps and Search products on Monday aimed at helping users find and choose more sustainable transportation options over gas-powered vehicles.

James Byers, Google Search’s Group Product Manager, states:

“When it comes to traveling and commuting, we want people to have the right information to make more sustainable choices.”

Eco-Friendly Route Suggestions

A smartphone displaying Google Maps navigation centered on Paris, with several greener travel routes and estimated times to a destination marked on the screen. Screenshot from: blog.google/products/search/, April 2024.

One of Google Maps’ key new features is showing transit and walking route suggestions alongside driving routes when travel times are comparable.

This update will be rolled out in over 15 major cities worldwide in the coming weeks, including Amsterdam, Barcelona, Paris, Rome, and Sydney.

Train & Bus Route Information

A smartphone displaying a Google Search for greener travel options for train tickets from Boston to Philadelphia with several travel options listed according to time and price.

Google is expanding its ability to find long-distance train and bus routes through search.

Users searching queries like “Boston to Philadelphia train” will see schedules, prices, and direct links to book tickets on the results page.

Train route information will now be available for 38 countries, while long-distance bus routes will initially be added for 15 countries.

When users search for flights on Google’s dedicated Flight tool, they will see train route suggestions as an alternative for applicable routes.

“So it’s easy to find the more sustainable option no matter where you begin your search,” Byers stated.

Highlighting Lower-Emission Flights

In cases where flying remains the best option, Google will show you the environmental impact.

Byers explains:

“These estimates are powered by the Travel Impact Model (TIM), a methodology we published in 2022 for predicting per-passenger flight emissions.

To make this information even more accessible, TIM estimates are now also available through a developer API that allows any app or website to display flight emissions.”

Google is also releasing a Google Sheets add-on and online calculator to make the emissions estimates more widely available.

Looking Ahead

While Byers acknowledged “there’s still a long road ahead when it comes to sustainable travel,” he expressed hope that “today’s updates will give more people the information they need to choose lower-carbon alternatives.”

The new features come as consumers increasingly seek ways to make sustainable choices. By integrating these metrics into its flagship maps and travel tools, Google is nudging users towards greener options.


Featured Image: Screenshot from blog.google/products/search/, April 2024. 

Wix Partners Can Now Sell Templates in Wix Marketplace via @sejournal, @martinibuster

Wix announced that Wix Studio agency and enterprise partners can now sell templates in the Wix Marketplace, opening a new way for partners to earn more through Wix and giving Wix publishers access to professionally designed templates.

Wix Studio Partners

Wix Studio is a platform for agencies and enterprise users who manage multiple sites, providing a streamlined workspace for managing client sites in one place. The platform provides the AI-powered Studio Editor for creating websites, making it easy for teams to collaborate and design websites in a code-free workspace that also allows custom CSS.

Wix Marketplace Expansion

Wix partners can already sell services such as web design, SEO, and logo and graphic design through the Marketplace. Now they can also sell templates to users directly through the Marketplace, which can help partners develop leads.

Templates Must Pass Quality Check

A benefit of the Marketplace is that it offers pre-screened professionals, and all templates must pass a review for quality and usability before being listed.

Wix team members review templates for:

  • Design & Aesthetics
  • Layout & structure
  • Template Exclusivity
  • Uniqueness

According to Michal Bignitz, VP of Partners at Wix:

“Expanding the Wix Marketplace to include template sales not only empowers our Partners to unlock additional revenue streams and strengthen their presence within the Wix community but also fosters a thriving ecosystem of creativity and innovation. It’s an exciting opportunity for Partners to collaborate with and benefit from other experts, as we’ve seen with beautifully crafted templates combining design and dev skills.

“This fosters a thriving ecosystem of creativity and innovation and puts us one step closer to the goal we envision for Studio to become the place where experts in web design, development, user experience, marketing, SEO, and more can monetize their creations and collaborate.”

Read more about how to sell templates:

Sell Wix Studio Templates

Brave Announces AI Search Engine – Shares Insights For SEO via @sejournal, @martinibuster

Brave announced their new privacy-focused AI search engine, called Answer with AI, that works with its own search index of billions of websites. Their current search engine already serves 10 billion search queries per year, which means that Brave’s AI-powered search engine is now one of the largest AI search engines online.

Many in the search marketing and ecommerce communities have expressed anxiety about the future of the web because of AI search engines. Brave’s AI search engine still shows links and most importantly it does not by default answer commercial or transactional queries with AI, which should be good news for SEOs and online businesses. Brave values the web ecosystem and will be monitoring website visit patterns.

Search Engine Journal spoke with Josep M. Pujol, Chief of Search at Brave who answered questions about the search index, how it works with AI and most importantly, he shared what SEOs and business owners need to know in order to improve rankings.

Answer With AI Is Powered By Brave

Unlike other AI search solutions, Brave’s AI search engine is powered completely by its own search index of crawled and ranked websites. The entire underlying technology, from the search index to the Large Language Models (LLMs) and even the Retrieval Augmented Generation (RAG) technology is all developed by Brave. This is especially good from a standpoint of privacy and it also makes the Brave search results unique, further distinguishing it from other me-too search engine alternatives.

Search Technology

The search engine itself is all done in-house. According to Josep M. Pujol, Chief of Search at Brave:

“We have query-time access to all our indexes, more than 20 billion pages, which means we are extracting arbitrary information in real-time (schemas, tables, snippets, descriptions, etc.). Also, we go very granular on what data to use, from whole paragraphs or texts on a page to single sentences or rows in a table.

Given that we have an entire search engine at our disposal, the focus is not on retrieval, but selection and ranking. Additionally, to pages in our index, we do have access to the same information used to rank, such as scores, popularity, etc. This is vital to help select which sources are more relevant.”

Retrieval Augmented Generation (RAG)

The search engine pairs its search index with large language models, with Retrieval Augmented Generation (RAG) technology in between that keeps the answers fresh and fact-based. I asked about RAG, and Josep confirmed that’s how it works.

He answered:

“You are correct that our new feature is using RAG. As a matter of fact, we’ve already been using this technique on our previous Summarizer feature released in March 2023. However, in this new feature, we are expanding both the quantity and quality of the data used in the content of the prompt.”
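At its core, RAG means retrieving the most relevant indexed snippets for a query and packing them into the model’s prompt. A toy sketch of that loop (the word-overlap scoring here is a crude stand-in, not Brave’s ranking):

```python
def score(query: str, snippet: str) -> int:
    """Crude relevance: count distinct query words appearing in the snippet."""
    query_words = set(query.lower().split())
    return sum(1 for w in set(snippet.lower().split()) if w in query_words)

def build_prompt(query: str, index: list[str], k: int = 2) -> str:
    """Select the top-k snippets and pack them into the LLM prompt context."""
    top = sorted(index, key=lambda s: score(query, s), reverse=True)[:k]
    context = "\n".join(f"- {s}" for s in top)
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"
```

The selection step is the part Josep emphasizes: a production system ranks candidates with real retrieval signals, then prunes aggressively so only a few thousand tokens of the index reach the model.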

Large Language Models Used

I asked about the language models in use in the new AI search engine and how they’re deployed.

“Models are deployed on AWS p4 instances with VLLM.

We use a combination of Mixtral 8x7B and Mistral 7B as the main LLM model.

However, we also run multiple custom trained transformer models for auxiliary tasks such as semantic matching and question answering. Those models are much smaller due to strict latency requirements (10-20 ms).

Those auxiliary tasks are crucial for our feature, since those are the ones that do the selection of data that will end up being on the final LLM prompt; this data can be query-depending snippets of text, schemas, tabular data, or internal structured data coming from our rich snippets. It is not a matter of being able to retrieve a lot of data, but to select the candidates to be added to the prompt context.

For instance, the query “presidents of france by party” processes 220KB of raw data, including 462 rows selected from 47 tables, 7 schemas. The prompt size is around 6500 tokens, and the final response is a mere 876 bytes.

In short, one could say that with “Answer with AI” we go from 20 billion pages to a few thousand tokens.”

How AI Works With Local Search Results

I next asked about how the new search engine will surface local search. I asked Josep if he could share some scenarios and example queries where the AI answer engine will surface local businesses. For example, if I query for best burgers in San Francisco will the AI answer engine provide an answer for that and links to it? Will this be useful for people making business or vacation travel plans?

Josep answered:

“The Brave Search index has more than 1 billion location-based schemas, from which we can extract more than 100 million businesses and other points of interest.

Answer with AI is an umbrella term for Search + LLMs + multiple specialized machine learning models and services to retrieve, rank, clean, combine and represent information. We mention this because LLMs do not make all the decisions. As of now, we use them predominantly to synthesize unstructured and structured information, which happens in offline operations as well as in query-time ones.

Sometimes the end result feels very LLM-influenced (this is the case when we believe the answer to the user question is a single Point of Interest, e.g. “checkin faro cuisine”, and other times their work is more subtle (e.g.”best burgers sf”), generating a business description across different web references or consolidating a category for the business in a consistent taxonomy.”

Tips For Ranking Well

I next asked if using Schema.org structured data was useful for helping a site rank better in Brave and if he had any other tips for SEO and online businesses.

He answered:

“Definitely, we pay special attention to schema.org structured data when building the context of the LLM prompt. The best is to have structured data about their business (standard schemas from schema.org). The more comprehensive those schemas are, the more accurate the answer will be.

That said, our Answer with AI will be able to surface data about the business not in those schemas too, but it is always advisable to repeat information in different formats.

Some businesses only rely on aggregators (Yelp, Tripadvisor, Yellow Pages) for their business information. There are advantages to adding schemas to the business web site even if only for crawling bots.”
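Following that advice, a business page would embed a schema.org block along these lines. This is an illustrative sketch generated with Python’s json module; all of the values are placeholders, not a real business:

```python
import json

# Minimal schema.org LocalBusiness markup; every value is a placeholder.
business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Burger Bar",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Example St",
        "addressLocality": "San Francisco",
        "addressRegion": "CA",
        "postalCode": "94103",
    },
    "telephone": "+1-415-555-0100",
    "servesCuisine": "Burgers",
}

# The JSON-LD script tag a page would carry in its HTML.
json_ld = '<script type="application/ld+json">' + json.dumps(business) + "</script>"
```

Per Josep’s comments, the more comprehensive the schema, the more material the answer engine has to work with, even for businesses also listed on aggregators.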

Plans For AI Search In The Brave Browser

Brave shared that at some point in the near future they will integrate the new AI search functionality directly in the Brave Browser.

Josep explained:

“We plan to integrate the AI answer engine with Brave Leo (the AI assistant embedded in the Brave browser) very soon. Users will have the option to send the answer to Leo and continue the session there.”

Other Facts

Brave’s announcement also shared these facts about the new search engine:

“Brave Search’s generative answers are not just text. The deep integration between the index and model makes it possible for us to combine online, contextual, named entities enrichments (a process that adds more context to a person, place, or thing) as the answer is generated. This means that answers combine generative text with other media types, including informational cards and images.

The Brave Search answer engine can even combine data from the index and geo local results to provide rich information on points of interest. To date, the Brave Search index has more than 1 billion location-based schemas, from which we can extract more than 100 million businesses and other points of interest. These listings—larger than any public dataset—mean the answer engine can provide rich, instant results for points of interest all over the world.”

Try out the new AI search at http://search.brave.com/

Google Clarifies Vacation Rental Structured Data via @sejournal, @martinibuster

Google’s structured data documentation for vacation rentals was recently updated to require more specific data in a change that is more of a clarification than it is a change in requirements. This change was made without any formal announcement or notation in the developer pages changelog.

Vacation Rentals Structured Data

These structured data types make vacation rental information eligible for rich results that are specific to these kinds of rentals. However, it’s not available to all websites: vacation rental owners are required to be connected to a Google Technical Account Manager and have access to the Google Hotel Center platform.

VacationRental Structured Data Type Definitions

The primary changes were made to the structured data property type definitions where Google defines what the required and recommended property types are.

The changes to the documentation are in the section governing the Recommended properties and represent a clarification of the recommendations rather than a change in what Google requires.

Address Schema.org property

This is a subtle change but it’s important because it now represents a recommendation that requires more precise data.

This is what was recommended before:

“streetAddress”: “1600 Amphitheatre Pkwy.”

This is what it now recommends:

“streetAddress”: “1600 Amphitheatre Pkwy, Unit 6E”

Address Property Change Description

The most substantial change is to the description of what the “address” property is, becoming more descriptive and precise about what is recommended.

The description before the change:

PostalAddress
Information about the street address of the listing. Include all properties that apply to your country.

The description after the change:

PostalAddress
The full, physical location of the vacation rental.
Provide the street address, city, state or region, and postal code for the vacation rental. If applicable, provide the unit or apartment number.
Note that P.O. boxes or other mailing-only addresses are not considered full, physical addresses.

This is repeated in the section for the address.streetAddress property.

This is what it recommended before:

address.streetAddress Text
The full street address of your vacation listing.

And this is what it recommends now:

address.streetAddress Text
The full street address of your vacation listing, including the unit or apartment number if applicable.
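Putting the updated recommendation together, the address block would look something like this. The streetAddress follows Google’s own example; the other field values are placeholders assumed for illustration:

```python
# Sketch of a PostalAddress per the updated recommendation: a full physical
# address including the unit number. Non-streetAddress values are assumed.
address = {
    "@type": "PostalAddress",
    "streetAddress": "1600 Amphitheatre Pkwy, Unit 6E",
    "addressLocality": "Mountain View",  # placeholder city
    "addressRegion": "CA",               # placeholder state
    "postalCode": "94043",               # placeholder postal code
    "addressCountry": "US",              # placeholder country
}
```

The key point of the clarification is simply that the unit or apartment number belongs inside streetAddress, and that a P.O. box does not qualify as a full physical address.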

Clarification And Not A Change

Although these updates don’t represent a change in Google’s guidance, they are nonetheless important because they offer clearer guidance with less ambiguity about what is recommended.

Read the updated structured data guidance:

Vacation rental (VacationRental) structured data

Featured Image by Shutterstock/New Africa

Google Expands Auto Ads with “Ad Intents” Sponsored Links via @sejournal, @MattGSouthern

Google has launched a new “ad intents” format for its Auto ads program.

This format inserts links and anchors into publishers’ site content to serve contextual ads and search results in dialog boxes.

The ad intents format aims to surface relevant ads and information based on page content and visitor interests.

When visitors click on the dynamically generated links or anchors, an overlay dialog displaying organic search results and associated advertising opens.

Google explains in its support documentation:

“Ad intents works by scanning your pages for opportunities to help your users find something they might be looking for.

Based on your content and what your users may be interested in, ad intents automatically converts existing text on your page into links and places anchors that appear at the bottom of your page.”

The new format offers publishers a revenue-share model for monetizing their content. Website owners get paid whenever a visitor clicks on an ad displayed in the ad intents dialog boxes.

The expansion of ad intents aligns with Google’s broader strategic shift towards deploying more privacy-safe ad targeting technologies as it phases out support for third-party cookies.

Google touts the new format as serving “highly relevant ads that don’t rely on third-party cookies.”

Enabling Ad Intents

The ad intents functionality is opt-in for Google AdSense publishers.

You can toggle the setting in the AdSense dashboard under “Auto ads” > “Intent-driven formats.”

Publishers can enable or disable the “ad intent links” or “ad intent anchors” separately.

To implement the new ad units, you must accept Google’s updated terms of service and policies.

Backlash & Criticism

While touted as a way to increase publisher earnings, Google’s announcement of ad intents has drawn backlash from some corners of the publisher community over fears it could divert traffic from their websites.

On X (formerly Twitter), user @darth_na states:

“It’s not enough that you steal traffic from sites in the SERPs, you are now intent on stealing actual site visitors, and directing them back to Google?”

Responding to the concerns, Google Ads Liaison Ginny Marvin emphasized that ad intents is an optional program and that dialog boxes close, leaving visitors on the original publisher page.

The new ad intents format is now being implemented on websites with auto ads enabled through Google AdSense.

Why We Care

The ad intents format is a new monetization opportunity for websites that have implemented Google’s Auto ads.

Ad intents could help publishers earn more revenue than traditional contextual ad placements by tapping visitor intent signals from page content.

However, inserting sponsored links and anchors into publisher content raises questions about editorial control and user experience issues, such as excessive ad obstruction.

The new format is notable because Google is shifting towards more privacy-centric targeting as it moves away from cross-site tracking cookies.

How This Can Help You

Website publishers should evaluate whether enabling ad intents makes sense for their traffic, content, and monetization strategies.

While the new format offers potential revenue upsides, balancing those benefits against maintaining high editorial standards is important.

It will be important to assess whether the targeting aligns with your ideal audience intent signals.


Featured Image: Tada Images/Shutterstock