Google Ads Introduces Generative AI Tools For Demand Gen Campaigns via @sejournal, @MattGSouthern

Google announced the rollout of new generative AI capabilities for Demand Gen campaigns in Google Ads today.

The tools, powered by Google’s artificial intelligence, will allow advertisers to create high-quality image assets using text prompts in just a few steps.

The AI-powered creative features aim to enhance visual storytelling to help brands generate new demand across Google’s platforms, including YouTube, YouTube Shorts, Discover, and Gmail.

Michael Levinson, Vice President and General Manager of Social, Local, and Vertical Ads at Google, stated in the announcement:

“Advertisers need to diversify their creative strategy with multi-format ads to keep audiences engaged and deliver results. With generative image tools, you can now test new creative concepts more efficiently – whether it’s experimenting with new types of images or simply building your creatives from scratch.”

How It Works

Starting today, the generative image tools are rolling out globally to advertisers in English, with more languages coming later this year.

Advertisers can provide text prompts to generate original, high-quality images tailored to their branding and marketing needs.

For example, an outdoor lifestyle brand selling camping gear could use a prompt like “vibrantly colored tents illuminated under the Aurora Borealis” to create engaging visuals targeting customers interested in camping trips to Iceland.

Additionally, a “generate more like this” feature allows advertisers to generate new images inspired by their existing high-performing assets.

Responsible AI Development

In the announcement, Google emphasized its commitment to developing generative AI technology responsibly, with principles in place for fairness, privacy, and security.

The company states:

“On top of making sure advertising content adheres to our long-standing Google Ads policies, we also employ additional technical measures to ensure generative image tools in Google Ads produce novel and unique content. Google AI will never create two identical images.”

All generated images will include identifiable markings, such as an open-standard markup and an invisible digital watermark resistant to manipulations like screenshots and filters.

Creative Best Practices

To accompany the new AI tools, Google released a “Creative Excellence Guide” with the following best practices for building Demand Gen campaigns:

  • Use a combination of videos and images to engage audiences at different stages of the buyer journey.
  • Provide Google with a variety of assets in different aspect ratios to maximize reach across inventory.
  • Utilize high-quality, high-resolution visuals to build brand trust and inspire action.
  • Adopt a test-and-learn strategy, evaluating performance metrics to optimize creatives.

Why SEJ Cares

Generative AI for Demand Gen campaigns represents a potentially valuable new capability for Google Ads advertisers.

By leveraging AI to streamline creative production, brands can experiment with a broader array of visual concepts.

This positions them to better engage audiences in Google’s premium ad environments, such as YouTube, Discover, and Gmail.

How This Can Help You

For brands, generative AI tools open up new creative possibilities with a level of image sophistication that was previously time- and resource-intensive to produce.

The ability to iterate rapidly on visual ideas can lead to more impactful ad creative.

Small businesses and agencies operating with leaner teams can now create a high volume of diverse, on-brand image assets with minimal design resources.

Additionally, the “Generate more like this” functionality allows advertisers to expand on existing assets while maintaining a consistent look and feel.

Brands need to approach this technology responsibly and strategically, however.

While Google has implemented safeguards, advertisers should still apply human oversight, creativity, and brand governance when using AI-generated assets.


Featured Image: blog.google/products/ads-commerce/, April 2024. 

Google On Hyphens In Domain Names via @sejournal, @martinibuster

Google’s John Mueller answered a question on Reddit about why people don’t use hyphens with domains and if there was something to be concerned about that they were missing.

Domain Names With Hyphens For SEO

I’ve been working online for 25 years, and I remember when using hyphens in domains was something affiliates did for SEO, back when Google was still influenced by keywords in the domain, the URL, and basically anywhere on the webpage. It wasn’t something everyone did; it was mainly popular with some affiliate marketers.

Another reason for choosing domain names with keywords in them was that site visitors tended to convert at a higher rate because the keywords essentially prequalified the visitor. I know from experience how useful two-keyword domains (and one-word domain names) are for conversions, as long as they don’t have hyphens in them.

Another consideration that caused hyphenated domain names to fall out of favor is their untrustworthy appearance, which can hurt conversion rates, since trustworthiness is an important factor for conversions.

Lastly, hyphenated domain names look tacky. Why go with tacky when a brandable domain is easier for building trust and conversions?

Domain Name Question Asked On Reddit

This is the question asked on Reddit:

“Why don’t people use a lot of domains with hyphens? Is there something concerning about it? I understand when you tell it out loud people make miss hyphen in search.”

And this is Mueller’s response:

“It used to be that domain names with a lot of hyphens were considered (by users? or by SEOs assuming users would? it’s been a while) to be less serious – since they could imply that you weren’t able to get the domain name with fewer hyphens. Nowadays there are a lot of top-level-domains so it’s less of a thing.

My main recommendation is to pick something for the long run (assuming that’s what you’re aiming for), and not to be overly keyword focused (because life is too short to box yourself into a corner – make good things, course-correct over time, don’t let a domain-name limit what you do online). The web is full of awkward, keyword-focused short-lived low-effort takes made for SEO — make something truly awesome that people will ask for by name. If that takes a hyphen in the name – go for it.”

Pick A Domain Name That Can Grow

Mueller is right about picking a domain name that won’t lock your site into one topic. When a site grows in popularity, the natural growth path is to expand the range of topics the site covers. But that’s hard to do when the domain is locked into one rigid keyword phrase. That’s one of the downsides of picking a “Best + keyword + reviews” domain, too. Those domains can’t grow, and they look tacky besides.

That’s why I’ve always recommended brandable domains that are memorable and encourage trust in some way.

Read the post on Reddit:

Are domains with hyphens bad?

Read Mueller’s response here.

Featured Image by Shutterstock/Benny Marty

Google Ads To Retire Customizers For Text Ads via @sejournal, @MattGSouthern

In an email sent to Google Ads advertisers this week, the company announced a change coming to search ads.

Effective May 31, 2024, ad customizers will stop serving for expanded text ads (ETAs) and Dynamic Search Ads (DSAs).

The notification reads in part:

“On May 31, 2024, existing ad customizers for text ads, expanded text ads and Dynamic Search Ads will stop serving (after this date they will only be able to serve with their default value).”

For advertisers leveraging customizers with their text ads or DSAs, Google recommends “transitioning to responsive search ads and creating ad customizers for responsive search ads by May 31, 2024.”

Reactions Within The Search Marketing Community

News of the impending change sparked discussion among paid search professionals.

Navah Hopkins shared the Google announcement on LinkedIn along with her take:

“I’ve never been a huge fan of customizers (both anecdotally and when looking at large data sets), but I respect that they do work for some advertisers.

If you’re currently running ad customizers on your DSA or ETAs – now is the time to build them out as RSAs so you can gradually move to the new format.”

Hopkins clarified that it is the customizers that will stop serving for text ads and DSAs, not the ad types themselves.

“If you didn’t bother with [customizers]…keep calm and carry on with your amazing human augmented creative segmentation!”

The Writing On The Wall For Text Ads

Google’s push for responsive search ads (RSAs) as the go-to search ad format has been apparent for some time.

In 2021, Google shared that RSAs would become the only search ad type advertisers could create or edit in standard search campaigns.

The following year, in June 2022, Google stopped allowing advertisers to create or edit expanded text ads within any of its surfaces—a clear sign that RSAs were taking over as the primary ad unit.

Many marketers have invested in responsive search ads over the past couple of years, and this latest move seems to be another step in that continued shift.

For those still leveraging custom ad text with their text ads and DSAs, the clock is ticking to rebuild those customized experiences with responsive search ads instead.


FAQ

How does Google Ads’ change regarding ad customizers impact advertisers?

To maintain personalized advertising experiences, Google Ads advertisers now need to:

  • Transition their existing ad customizers to responsive search ads (RSAs) by the May 31 deadline.
  • Rebuild customized ad experiences within the RSA format, which offers dynamic customization capabilities.
  • Adapt to a new ad landscape where RSAs are becoming the primary format for search ads.

What are the steps for advertisers to transition from ETAs to RSAs?

Advertisers must take proactive measures to ensure a smooth transition from expanded text ads to responsive search ads:

  • Review current ad campaigns using ETAs and identify which utilize ad customizers.
  • Create new responsive search ads that implement customizers before the deadline.
  • Test and optimize these RSAs for performance against current ETAs to ensure minimal disruption.
  • Gradually phase out ETAs in favor of RSAs to become accustomed to the new format.

Why is Google pushing for a transition to responsive search ads?

Google’s push towards responsive search ads is rooted in adaptability and efficiency. The transition reflects an effort to:

  • Simplify ad creation while maximizing reach and relevance across different search queries.
  • Employ a more automated approach to ad optimization using machine learning algorithms.
  • Streamline the ad platform by focusing on a single, more effective ad type that can adjust to user queries and device types.


Featured Image: Vladimka production/Shutterstock

Google Is Still Gen Z’s Top Search Engine, Study Shows via @sejournal, @MattGSouthern

A recent report shows that while Gen Z increasingly turns to social media platforms for information searches, Google remains the primary starting point for a significant portion of this age group.

The data published by Axios shows that 46% of internet users aged 18-24 begin their online queries on Google, compared to 58% of those aged 25-39.

This suggests a shift in search habits between Gen Z and millennials, who came of age during Google’s rise to dominance.

However, the numbers also indicate that the move away from traditional search engines may not be as dramatic as some have suggested.

Only 21% of Gen Z users start their searches on TikTok, while 5% begin on YouTube.

The Current Landscape Of Internet Searches

Despite the perceived competition from social media platforms, Google remains the leader in the search engine market.

MaryLeigh Bliss, chief content officer for YPulse, tells Axios:

“Google is still top overall for initial searches, followed by TikTok and YouTube.”

This reinforces the idea that traditional search engines remain at the core of information gathering while usage patterns change.

Challenges & Adaptations

Google has taken steps to adapt to changing user preferences, expanding its AI-powered Search Generative Experience (SGE) tool and highlighting results from forum websites like Reddit.

The company reports that 18- to 24-year-olds have given its AI search results the highest satisfaction scores.

Despite these efforts, some users have expressed dissatisfaction with the quality of Google’s search results.

A recent study by researchers in Germany found that low-quality results are often well-optimized to appear high in Google’s search rankings, particularly for product searches.

Looking Forward

Web usage habits are constantly shifting, and Gen Z’s way of looking for information online differs from that of older generations and will likely continue to evolve.

While social media is becoming a bigger part of how people search, traditional search engines like Google still dominate.

The tug-of-war between emerging and traditional platforms will shape how younger generations seek information on the web.


FAQ

In the context of SEO and online marketing, what are the implications of Gen Z’s shifts in search habits?

The data on Gen Z’s search habits have several implications for SEO and online marketing strategies:

  • Marketers should consider diversifying their SEO tactics. They should not solely rely on traditional search engines but also consider optimizing for vertical video platforms.
  • Understanding that younger audiences may start their product or information searches on platforms like TikTok means marketers may have to develop unique strategies to reach them.
  • To maintain visibility online, consider monitoring and adapting to Gen Z’s preferences, as their satisfaction levels could guide future algorithm adjustments.


Featured Image: DavideAngelini/Shutterstock

Reddit Post Ranks On Google In 5 Minutes – What’s Going On? via @sejournal, @martinibuster

Google’s Danny Sullivan disputed the assertions made in a Reddit discussion that Google is showing a preference for Reddit in the search results. But a Redditor’s example proves that it’s possible for a Reddit post to rank in the top ten of the search results within minutes and even improve to position #2 a week later.

Discussion About Google Showing Preference To Reddit

A Redditor (gronetwork) complained that Google is sending so many visitors to Reddit that the server is struggling with the load and shared an example that proved that it can only take minutes for a Reddit post to rank in the top ten.

That post was part of a 79-post Reddit thread where many in the r/SEO subreddit were complaining about Google allegedly giving too much preference to Reddit over legit sites.

The person who did the test (gronetwork) wrote:

“…The website is already cracking (server down, double posts, comments not showing) because there are too many visitors.

…It only takes few minutes (you can test it) for a post on Reddit to appear in the top ten results of Google with keywords related to the post’s title… (while I have to wait months for an article on my site to be referenced). Do the math, the whole world is going to spam here. The loop is completed.”

Reddit Post Ranked Within Minutes

Another Redditor asked if they had tested if it takes “a few minutes” to rank in the top ten and gronetwork answered that they had tested it with a post titled, Google SGE Review.

gronetwork posted:

“Yes, I have created for example a post named “Google SGE Review” previously. After less than 5 minutes it was ranked 8th for Google SGE Review (no quotes). Just after Washingtonpost.com, 6 authoritative SEO websites and Google.com’s overview page for SGE (Search Generative Experience). It is ranked third for SGE Review.”

It’s true: not only does that specific post (Google SGE Review) rank in the top ten, but it started out in position 8 and has since improved, currently sitting beneath the number one result for the search query “SGE Review”.

Screenshot Of Reddit Post That Ranked Within Minutes


Anecdotes Versus Anecdotes

Okay, the above is just one anecdote. But it’s a heck of an anecdote because it proves that it’s possible for a Reddit post to rank within minutes and stay at the top of the search results ahead of other, possibly more authoritative, websites.

hankschrader79 shared that Reddit posts outrank Toyota Tacoma forums for a phrase related to mods for that truck.

Google’s Danny Sullivan responded to that post and the entire discussion, disputing the claim and arguing that Reddit is not always prioritized over other forums.

Danny wrote:

“Reddit is not always prioritized over other forums. [super vhs to mac adapter] I did this week, it goes Apple Support Community, MacRumors Forum and further down, there’s Reddit. I also did [kumo cloud not working setup 5ghz] recently (it’s a nightmare) and it was the Netgear community, the SmartThings Community, GreenBuildingAdvisor before Reddit. Related to that was [disable 5g airport] which has Apple Support Community above Reddit. [how to open an 8 track tape] — really, it was the YouTube videos that helped me most, but it’s the Tapeheads community that comes before Reddit.

In your example for [toyota tacoma], I don’t even get Reddit in the top results. I get Toyota, Car & Driver, Wikipedia, Toyota again, three YouTube videos from different creators (not Toyota), Edmunds, a Top Stories unit. No Reddit, which doesn’t really support the notion of always wanting to drive traffic just to Reddit.

If I guess at the more specific query you might have done, maybe [overland mods for toyota tacoma], I get a YouTube video first, then Reddit, then Tacoma World at third — not near the bottom. So yes, Reddit is higher for that query — but it’s not first. It’s also not always first. And sometimes, it’s not even showing at all.”

hankschrader79 conceded that they were generalizing when they wrote that Google always prioritizes Reddit. But they insisted that this doesn’t diminish what they called a fact: Google’s “prioritization” of forum content has benefited Reddit more than actual forums.

Why Is The Reddit Post Ranked So High?

It’s possible that Google “tested” that Reddit post in position 8 within minutes and that user interaction signals indicated to Google’s algorithms that users prefer to see that Reddit post. If that’s the case, then it’s not a matter of Google showing preference to the Reddit post; rather, users are showing the preference and the algorithm is responding to it.

Nevertheless, an argument can be made that user preferences for Reddit can be a manifestation of Familiarity Bias. Familiarity Bias is when people show a preference for things that are familiar to them. If a person is familiar with a brand because of all the advertising they were exposed to then they may show a bias for the brand products over unfamiliar brands.

Users who are familiar with Reddit may choose Reddit because they don’t know the other sites in the search results or because they have a bias that Google ranks spammy and optimized websites and feel safer reading Reddit.

Google may be picking up on those user interaction signals that indicate a preference and satisfaction with the Reddit results but those results may simply be biases and not an indication that Reddit is trustworthy and authoritative.

Is Reddit Benefiting From A Self-Reinforcing Feedback Loop?

It may very well be that Google’s decision to prioritize user-generated content has started a self-reinforcing pattern that draws users into Reddit through the search results. Because the answers seem plausible, those users start to prefer Reddit results. As they’re exposed to more Reddit posts, their familiarity bias kicks in and they show an even stronger preference for Reddit. In other words, the users and Google’s algorithm may be creating a self-reinforcing feedback loop.

Is it possible that Google’s decision to show more user generated content has kicked off a cycle where more users are exposed to Reddit which then feeds back into Google’s algorithm which in turn increases Reddit visibility, regardless of lack of expertise and authoritativeness?

Featured Image by Shutterstock/Kues

Google Limits News Links In California Over Proposed ‘Link Tax’ Law via @sejournal, @MattGSouthern

Google announced that it plans to reduce access to California news websites for a portion of users in the state.

The decision comes as Google prepares for the potential passage of the California Journalism Preservation Act (CJPA), a bill requiring online platforms like Google to pay news publishers for linking to their content.

What Is The California Journalism Preservation Act?

The CJPA, introduced in the California State Legislature, aims to support local journalism by creating what Google refers to as a “link tax.”

If passed, the Act would force companies like Google to pay media outlets when sending readers to news articles.

However, Google argues that this approach is misguided and could harm rather than help the news industry.

Jaffer Zaidi, Google’s VP of Global News Partnerships, stated in a blog post:

“It would favor media conglomerates and hedge funds—who’ve been lobbying for this bill—and could use funds from CJPA to continue to buy up local California newspapers, strip them of journalists, and create more ghost papers that operate with a skeleton crew to produce only low-cost, and often low-quality, content.”

Google’s Response

To assess the potential impact of the CJPA on its services, Google is running a test with a percentage of California users.

During this test, Google will remove links to California news websites that the proposed legislation could cover.

Zaidi states:

“To prepare for possible CJPA implications, we are beginning a short-term test for a small percentage of California users. The testing process involves removing links to California news websites, potentially covered by CJPA, to measure the impact of the legislation on our product experience.”

Google Claims Only 2% of Search Queries Are News-Related

Zaidi highlighted people’s changing news consumption habits and their effect on Google search queries:

“It’s well known that people are getting news from sources like short-form videos, topical newsletters, social media, and curated podcasts, and many are avoiding the news entirely. In line with those trends, just 2% of queries on Google Search are news-related.”

Despite the low percentage of news queries, Google wants to continue helping news publishers gain visibility on its platforms.

However, the “CJPA as currently constructed would end these investments,” Zaidi says.

A Call For A Different Approach

Google maintains that the CJPA, in its current form, undermines news in California and could leave all parties worse off.

The company urges lawmakers to consider alternative approaches supporting the news industry without harming smaller local outlets.

Google argues that, over the past two decades, it’s done plenty to help news publishers innovate:

“We’ve rolled out Google News Showcase, which operates in 26 countries, including the U.S., and has more than 2,500 participating publications. Through the Google News Initiative we’ve partnered with more than 7,000 news publishers around the world, including 200 news organizations and 6,000 journalists in California alone.”

Zaidi suggested that a healthy news industry in California requires support from the state government and a broad base of private companies.

As the legislative process continues, Google is willing to cooperate with California publishers and lawmakers to explore alternative paths that would allow it to continue linking to news.


Featured Image: Ismael Juan/Shutterstock

Query Deserves Ads Is Where Google Is Headed via @sejournal, @martinibuster

Google’s CEO Sundar Pichai recently discussed the future of search, affirming the importance of websites (good news for SEO). But how can that be if AI is supposed to make search engines obsolete (along with SEO)?

Search vs Chatbots vs Generative Search

There’s a lot of discussion about AI search but what’s consistently missing is a delineation of what is meant by that phrase.

There are three ways to think about what is being discussed:

  • Search Engines
  • Chatbots like Gemini or ChatGPT
  • Generative Search (which are chatbots stacked on top of a traditional search engine like Perplexity.ai and Bing)

Traditional Search Is A Misnomer

The word misnomer means an inaccurate name, description or label that’s given to something. We still talk about traditional search, perhaps out of habit. The reality that must be acknowledged is that traditional search no longer exists. It’s a misnomer to refer to Google as traditional search.

Sundar Pichai made the point that Google has been using AI for years and we know this is true because of systems like RankBrain, SpamBrain, Helpful Content System (aka HCU) and the Reviews System. AI is involved at virtually every step of Google search from the backend to the frontend in the search results.

Google’s 2021 documentation about SpamBrain noted how AI is used at the crawling and indexing level:

“First, we have systems that can detect spam when we crawl pages or other content. …Some content detected as spam isn’t added to the index.

These systems also work for content we discover through sitemaps and Search Console. …We observed spammers hacking into vulnerable sites, pretending to be the owners of these sites, verifying themselves in the Search Console and using the tool to ask Google to crawl and index the many spammy pages they created. Using AI, we were able to pinpoint suspicious verifications and prevented spam URLs from getting into our index this way.”

AI is involved in the indexing process and all the way through to the ranking process and lastly in the search results themselves.

The most recent March 2024 update is described by Google as complex and is still not finished as of April 2024. I suspect that Google has transitioned to a more AI-friendly infrastructure in order to accommodate things like integrating the AI signals formerly associated with the HCU and the Reviews System straight into the core algorithm.

People are freaking out because the AI search of the future will summarize answers. Well, Google already does that in featured snippets and knowledge graph search results.

Let’s be real: traditional search no longer exists; it’s a misnomer. Google is more accurately described as an AI search engine, and this is important to acknowledge because, as you’ll shortly see, it directly relates to what Sundar Pichai means when he talks about what search will look like in ten years.

Blended Hybrid Search AKA Generative Search

What people currently call AI Search is also a misnomer. The more accurate label is Generative Search. Bing and Perplexity.ai are generative AI chatbots stacked on top of a search index, with something in the middle that coordinates between the two, generally referred to as Retrieval-Augmented Generation (RAG), a technology created in 2020 by Facebook AI researchers.
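The RAG pattern described above can be sketched in a few lines. The following is a toy illustration only: the tiny in-memory index, the word-overlap scoring, and the function names are all invented for this sketch, and real systems use vector embeddings and an actual LLM API rather than string matching.

```python
# Toy sketch of Retrieval-Augmented Generation (RAG): a retriever
# selects relevant documents from a search index, and their text is
# folded into the prompt sent to a language model. Everything here
# (index contents, scoring, prompt format) is illustrative only.

INDEX = {
    "doc1": "Speculative loading prefetches pages before a user clicks.",
    "doc2": "Responsive search ads adapt headlines to each query.",
    "doc3": "Reddit posts can rank in Google within minutes.",
}

def retrieve(query, k=1):
    """Rank documents by naive word overlap with the query."""
    q_words = set(query.lower().split())
    scored = sorted(
        INDEX.items(),
        key=lambda item: len(q_words & set(item[1].lower().split())),
        reverse=True,
    )
    return [text for _, text in scored[:k]]

def build_prompt(query):
    """Combine retrieved context with the user's question for the LLM."""
    context = "\n".join(retrieve(query))
    return f"Answer using this context:\n{context}\n\nQuestion: {query}"

print(build_prompt("How does speculative loading work?"))
```

The point of the middle layer is exactly this coordination step: the search index grounds the chatbot's answer in retrieved documents instead of relying on the model's training data alone.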

Chatbots

Chatbots are a lot of things including ChatGPT and Gemini. No need to belabor this point, right?

Search Vs Generative Search Vs Chatbots: Who Wins?

Generative search is an awkward mix of a chatbot and a search engine with a somewhat busy interface. It’s awkward because it wants to do your homework and tell you the phone number of the local restaurant but it’s mediocre at both. But even if generative search improves does anyone really want a search engine that can also write an essay? It’s almost a given that those awkwardly joined capabilities are going to drop off and it’ll eventually come to resemble what Google already is.

Chatbots and Search Engines

That leaves us with a near-future of chatbots and search engines. Sam Altman said that an AI chatbot search that shows advertising is dystopian.

Google is pursuing both strategies by tucking the Gemini AI chatbot into Android as an AI assistant that can make your phone calls, phone the local restaurant for you and offer suggestions for the best pizza in town. CEO Sundar Pichai is on record stating that the web is an important resource that they’d like to continue using.

But if the chatbot doesn’t show ads, that’s going to significantly cut into Google’s ad revenue. Nevertheless, the SEO industry is convinced that SEO is over because search engines are going to be replaced by AI.

It’s possible that Google at some point makes a lot of money from cloud services and SaaS products and it will be able to walk away from search-based advertising revenue if everyone migrates towards AI chatbots.

Query Deserves Advertising

But if there’s money in search advertising, why go through all the trouble to crawl the web, develop the technology and not monetize it? Who leaves money on the table? Not Google.

There’s a search engine algorithm called Query Deserves Freshness. The algorithm determines whether a search query is trending or newsworthy and, if so, will choose a webpage on the topic that is recently published, i.e. fresh.

Similarly, I believe at some point that chatbots are going to differentiate when a search query deserves ads and switch over to a search result.

Google’s CEO Pichai contradicts the SEO narrative of the decline and disappearance of search engines. Pichai says that the future of search includes websites because search needs the diversity of opinions inherent in the web. So where is this all leading toward?

Google Search already surfaces answers for informational, non-money queries like the weather and currency conversions. There are no ads on those queries, so Google loses nothing by answering them in a chatbot.

But for shopping and other transactional types of search queries, the best solution is Query Deserves Advertising.

If a user asks a shopping-related search query, there will come a time when the chatbot will “helpfully” decide that the Query Deserves Advertising and switch over to the search engine inventory that also includes advertising.
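The hand-off described above is speculation about where Google is headed, but the routing idea itself is simple enough to sketch. This is purely illustrative: the keyword list and function names are invented, and a real system would use a trained query-intent classifier rather than keyword matching.

```python
# Conceptual sketch of "Query Deserves Advertising" routing: a chatbot
# front end decides whether a query is transactional and, if so, hands
# it off to ad-supported search results instead of answering directly.
# The keyword heuristic below is invented for illustration only.

TRANSACTIONAL_HINTS = {"buy", "price", "cheap", "deal", "best"}

def route(query):
    """Return which surface should handle the query."""
    words = set(query.lower().split())
    if words & TRANSACTIONAL_HINTS:
        return "search_results_with_ads"
    return "chatbot_answer"

print(route("what is the weather today"))   # informational query
print(route("best price on hiking boots"))  # transactional query
```

Under this model the chatbot keeps the informational queries that carry no ads anyway, while monetizable queries flow back into the existing ad inventory.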

That may explain why Google’s CEO sees a future where the web is not replaced by an AI but rather they coexist. So if you think about it, Query Deserves Advertising may be how search engines preserve their lucrative advertising business in the age of AI.

Query Deserves Search

An extension of this concept is to think about search queries where comparisons, user reviews, expert human reviews, news, medical, financial and other queries that require human input will need to be surfaced. Those kinds of queries may also switch over to a search result. The results may not look like today’s search results but they will still be search results.

People love reading reviews, news, gossip, and other human-generated topics, and that’s not going away. Insights matter. Personality matters.

Query Deserves SEO

So maybe the SEO knee jerk reaction that SEO is dead is premature. We’re still at the beginning of this and as long as there’s money to be made off of search there will still be a need for websites, search engines and SEO.

Featured Image by Shutterstock/Shchus

WordPress Releases A Performance Plugin For “Near-Instant Load Times” via @sejournal, @martinibuster

WordPress released an official plugin that adds support for a cutting edge technology called speculative loading that can help boost site performance and improve the user experience for site visitors.

Speculative Loading

Speculative loading is a technique that fetches pages or resources before a user clicks a link to navigate to another webpage.

The official WordPress page about this new functionality describes it:

“The Speculation Rules API is a new web API… It allows defining rules to dynamically prefetch and/or prerender URLs of certain structure based on user interaction, in JSON syntax—or in other words, speculatively preload those URLs before the navigation.

This API can be used, for example, to prerender any links on a page whenever the user hovers over them. Also, with the Speculation Rules API, “prerender” actually means to prerender the entire page, including running JavaScript. This can lead to near-instant load times once the user clicks on the link as the page would have most likely already been loaded in its entirety. However that is only one of the possible configurations.”

The new WordPress plugin adds support for the Speculation Rules API. The Mozilla developer pages, a great resource for understanding web technologies, describe it like this:

“The Speculation Rules API is designed to improve performance for future navigations. It targets document URLs rather than specific resource files, and so makes sense for multi-page applications (MPAs) rather than single-page applications (SPAs).

The Speculation Rules API provides an alternative to the widely available <link rel="prefetch"> feature and is designed to supersede the Chrome-only deprecated <link rel="prerender"> feature. It provides many improvements over these technologies, along with a more expressive, configurable syntax for specifying which documents should be prefetched or prerendered.”
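To illustrate the syntax the API uses, speculation rules are embedded in a page as JSON inside a script tag. The following is a minimal sketch, not the plugin’s exact output, telling the browser to prerender same-site links with "moderate" eagerness (which in practice triggers when the user hovers over a link); the ".no-prerender" opt-out class is a hypothetical example:

```html
<!-- Minimal speculation rules sketch: prerender same-site links on hover,
     excluding links carrying a hypothetical opt-out class. -->
<script type="speculationrules">
{
  "prerender": [
    {
      "where": {
        "and": [
          { "href_matches": "/*" },
          { "not": { "selector_matches": ".no-prerender" } }
        ]
      },
      "eagerness": "moderate"
    }
  ]
}
</script>
```

Browsers that support the API read these rules and begin prefetching or prerendering matching URLs; browsers that don’t simply skip the script block.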

Performance Lab Plugin

The new plugin was developed by the official WordPress performance team which occasionally rolls out new plugins for users to test ahead of possible inclusion into the actual WordPress core. So it’s a good opportunity to be first to try out new performance technologies.

The new WordPress plugin is by default set to prerender “WordPress frontend URLs” which are pages, posts, and archive pages. How it works can be fine-tuned under the settings:

Settings > Reading > Speculative Loading

Browser Compatibility

The Speculation Rules API is supported as of Chrome 108; however, the specific rules used by the new plugin require Chrome 121 or higher. Chrome 121 was released in early 2024.

Browsers that do not support the API will simply ignore the rules, with no effect on the user experience.

Check out the new Speculative Loading WordPress plugin developed by the official core WordPress performance team.

Speculative Loading By WordPress Performance Team

Are Websites Getting Faster? New Data Reveals Mixed Results via @sejournal, @MattGSouthern

Website loading times are gradually improving, but a new study shows significant variance in performance across sites and geographic regions.

The study from web monitoring company DebugBear examined data from Google’s Chrome User Experience Report (CrUX), which collects real-world metrics across millions of websites.

“The average website takes 1.3 seconds to load the main page content for an average visit,” the report stated, using Google’s Largest Contentful Paint (LCP) metric to measure when the main content element becomes visible.

While that median LCP time of 1.3 seconds represents a reasonably fast experience, the data shows a wide range of loading performances:

  • On 25% of mobile websites, visitors have to wait over 2.1 seconds for the main content to appear
  • For the slowest 1% of websites, even an average page load takes more than 5.7 seconds on mobile
  • The slowest 10% of websites make 10% of users wait over 5 seconds for the LCP on mobile
  • Almost 1% of mobile page loads take nearly 20 seconds before the main content shows up

“Even on a fast website, some percentage of page views will be slow,” the study reads.

Continue reading for a deeper dive into the study to understand how your website speed compares to others.

Site Speed Divergences

The data reveals divergences in speeds between different user experiences, devices, and geographic locations:

  • Desktop sites (1.1-second median LCP) load faster than mobile (1.4 seconds)
  • While 25% of mobile page loads hit LCP in under 1 second, 10% take over 4 seconds
  • In the Central African Republic, a typical mobile LCP is 9.2 seconds (75th percentile)
  • Sweden, Slovenia, Japan, and South Korea all had 75th percentile mobile LCPs under 1.7 seconds
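To make the percentile figures above concrete, here is a small Python sketch, using illustrative made-up LCP samples in milliseconds, showing how a nearest-rank median and 75th percentile are derived from per-visit measurements:

```python
# Illustrative per-visit LCP samples in milliseconds (made-up data).
lcp_samples_ms = [800, 950, 1100, 1300, 1400, 2100, 4200, 5700, 9200, 19000]

def percentile(values, pct):
    """Return the pct-th percentile using the nearest-rank method."""
    ordered = sorted(values)
    rank = max(1, round(pct / 100 * len(ordered)))
    return ordered[rank - 1]

median_lcp = percentile(lcp_samples_ms, 50)  # the "typical" visit
p75_lcp = percentile(lcp_samples_ms, 75)     # the threshold CrUX reports against
print(median_lcp, p75_lcp)
```

Reporting the 75th percentile, as CrUX does, captures slower visits that a median alone would hide, which is why a site can have a fast median LCP while a quarter of its visitors still wait much longer.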

“Differences in network connections and device CPU speed mean that visitors in different countries experience the web differently,” the report noted.

The study also found that more popular sites are faster, with the median LCP on the top 1000 sites being 1.1 seconds compared to 1.4 seconds for the top 10 million sites.

Steady Improvement Continues

DebugBear’s analysis shows that websites have steadily become faster across device types over the past few years despite the variances.

A similar improvement was seen for other loading metrics, like First Contentful Paint.

“While changes to the LCP definition may have impacted the data, the First Contentful Paint metric – which is more stable and well-defined – has also improved,” the report stated.

The gains could be attributed to faster devices and networks, better website optimization, and improvements in the Chrome browser.

The study’s key finding was that “Page speed has consistently improved.” However, it also highlighted the wide range of experiences in 2024.

As DebugBear summarized, “A typical visit to a typical website is fast, but you likely visit many websites each day, some slow and some fast.”

Why SEJ Cares

This study provides an annual check-in to see how the web is progressing in terms of loading performance.

In recent years, Google has been emphasizing page load times and its Core Web Vitals metrics to measure and encourage better user experiences.

Speed also plays a role in search rankings. However, its precise weight as a ranking signal is debated.

How This Can Help You

SEO professionals can use studies like this to advocate for prioritizing page speed across an organization.

This report highlights that even high-performing sites likely have a segment of visitors hitting a subpar speed.

Refer to the study as a benchmark for how your site compares to others. If you’re unsure where to start, look at LCP times in the Chrome User Experience Report.

If a segment is well above the 2.1-second threshold for mobile, as highlighted in this study, it may be worth prioritizing front-end optimization efforts.

Segment your page speed data by country for sites with an international audience. Identifying geographic weak spots can inform performance budgeting and CDN strategies.

Remember that you can’t do it all alone. Performance optimization is a collaborative effort between SEOs and developers.


Featured Image: jamesteohart/Shutterstock

Google Explains Index Selection During A Core Update via @sejournal, @martinibuster

Google’s Gary Illyes answered a question about canonicalization, indexing and core algorithm updates that gives a clearer picture of how the different systems work together but independently.

A search marketer named David Minchala asked if Google’s canonicalization processes still worked but in a slower manner during a core algorithm update. The answer to that question is interesting because it offers a way to better understand how these backend processes function.

David’s question used the word “posit” which means to put an idea or statement forward for consideration as a possible fact.

This is the question:

“Posit: during core algo updates (and maybe any big update?), indexing services like canonicalization (i.e., selecting the URL to index and merging all signals from other known duplicate URLs) still work but are slower. Maybe much slower.

Any chance for a comment, Gary Illyes or John Mueller? Could also be a good topic for Search Off the Record: what are the technical demands on Google to roll out core updates and how could that affect “normal” services like crawling and indexing.”

Google’s Gary Illyes responded by saying that the posited statement is incorrect, using an analogy to explain how the two things function. Gary specifically mentions the index selection process (where Google chooses what goes into the index) and canonicalization (choosing which URL represents the webpage when there are duplicates).

He explained:

“the posit is incorrect. those systems are independent from the “core updates”.

think of core updates as playing with cooking ingredients: you change how much salt or msg you put in your stir fry and you can radically change the result.

in this context index selection and canonicalization is more about what’s happening in the salt mines or the msg factories; not much to do with the cooking just yet.”

Google Indexing Engine

So, in other words, what happens in a core update happens independently of the index selection and canonicalization processes. That way of looking at it, as Gary Illyes suggested, aligns with many of Google’s patents describing how search systems work. When discussing a search engine, patents describe it as a collection of engines, using the phrase “indexing engine” when talking about indexing.

For example, in one patent illustration there’s an indexing engine, a ranking engine, and a score modification engine. Data goes in and out of each engine where it gets processed according to its function.

Screenshot From A Google Patent

Flowchart depicting a search system. It includes a query input, search results output, and components like an index database, indexing engine, ranking engine, and a score modification database.

The above screenshot makes it easier to understand what a search engine is and how the different parts work both together and separately.

Read the LinkedIn discussion here.

Featured Image by Shutterstock/Roman Samborskyi