Query Deserves Ads Is Where Google Is Headed via @sejournal, @martinibuster

Google’s CEO Sundar Pichai recently discussed the future of search, affirming the importance of websites (good news for SEO). But how can that be if AI is supposed to make search engines obsolete (along with SEO)?

Search vs Chatbots vs Generative Search

There’s a lot of discussion about AI search but what’s consistently missing is a delineation of what is meant by that phrase.

There are three ways to think about what is being discussed:

  • Search Engines
  • Chatbots like Gemini or ChatGPT
  • Generative Search (chatbots stacked on top of a traditional search engine, like Perplexity.ai and Bing)

Traditional Search Is A Misnomer

The word misnomer means an inaccurate name, description or label that’s given to something. We still talk about traditional search, perhaps out of habit. The reality that must be acknowledged is that traditional search no longer exists. It’s a misnomer to refer to Google as traditional search.

Sundar Pichai made the point that Google has been using AI for years and we know this is true because of systems like RankBrain, SpamBrain, Helpful Content System (aka HCU) and the Reviews System. AI is involved at virtually every step of Google search from the backend to the frontend in the search results.

Google’s 2021 documentation about SpamBrain noted how AI is used at the crawling and indexing level:

“First, we have systems that can detect spam when we crawl pages or other content. …Some content detected as spam isn’t added to the index.

These systems also work for content we discover through sitemaps and Search Console. …We observed spammers hacking into vulnerable sites, pretending to be the owners of these sites, verifying themselves in the Search Console and using the tool to ask Google to crawl and index the many spammy pages they created. Using AI, we were able to pinpoint suspicious verifications and prevented spam URLs from getting into our index this way.”

AI is involved in the indexing process and all the way through to the ranking process and lastly in the search results themselves.

Google describes the most recent March 2024 update as complex, and it was still rolling out in April 2024. I suspect that Google has transitioned to a more AI-friendly infrastructure in order to accommodate things like integrating the AI signals formerly associated with the HCU and the Reviews System straight into the core algorithm.

People are freaking out because the AI search of the future will summarize answers. Well, Google already does that in featured snippets and knowledge graph search results.

Let’s be real: traditional search no longer exists; it’s a misnomer. Google is more accurately described as an AI search engine, and this is important to acknowledge because, as you’ll shortly see, it directly relates to what Sundar Pichai means when he talks about what search will look like in ten years.

Blended Hybrid Search AKA Generative Search

What people currently call AI Search is also a misnomer. The more accurate label is Generative Search. Bing and Perplexity.ai are generative AI chatbots stacked on top of a search index, with something in the middle that coordinates between the two, generally referred to as Retrieval-Augmented Generation (RAG), a technology created in 2020 by Facebook AI researchers.

Chatbots

Chatbots are a lot of things including ChatGPT and Gemini. No need to belabor this point, right?

Search Vs Generative Search Vs Chatbots: Who Wins?

Generative search is an awkward mix of a chatbot and a search engine with a somewhat busy interface. It’s awkward because it wants to do your homework and tell you the phone number of the local restaurant, but it’s mediocre at both. But even if generative search improves, does anyone really want a search engine that can also write an essay? It’s almost a given that those awkwardly joined capabilities are going to drop off and it’ll eventually come to resemble what Google already is.

Chatbots and Search Engines

That leaves us with a near-future of chatbots and search engines. Sam Altman said that an AI chatbot search that shows advertising is dystopian.

Google is pursuing both strategies by tucking the Gemini AI chatbot into Android as an AI assistant that can make phone calls for you, like phoning the local restaurant, and offer suggestions for the best pizza in town. CEO Sundar Pichai is on record stating that the web is an important resource that they’d like to continue using.

But if the chatbot doesn’t show ads, that’s going to significantly cut into Google’s ad revenue. Nevertheless, the SEO industry is convinced that SEO is over because search engines are going to be replaced by AI.

It’s possible that Google at some point makes a lot of money from cloud services and SaaS products and it will be able to walk away from search-based advertising revenue if everyone migrates towards AI chatbots.

Query Deserves Advertising

But if there’s money in search advertising, why go through all the trouble to crawl the web, develop the technology and not monetize it? Who leaves money on the table? Not Google.

There’s a search engine algorithm called Query Deserves Freshness. The algorithm determines if a search query is trending or is newsworthy and will choose a webpage on the topic that is recently published, fresh.

Similarly, I believe that at some point chatbots are going to recognize when a search query deserves ads and switch over to a search result.

Google’s CEO Pichai contradicts the SEO narrative of the decline and disappearance of search engines. Pichai says that the future of search includes websites because search needs the diversity of opinions inherent in the web. So where is this all leading?

Google Search already surfaces answers for non-money queries that are informational, like the weather and currency conversions. There are no ads for those queries, so Google loses nothing by answering informational queries in a chatbot.

But for shopping and other transactional types of search queries, the best solution is Query Deserves Advertising.

If a user asks a shopping related search query there’s going to come a time where the chatbot will “helpfully” decide that the Query Deserves Advertising and switch over to the search engine inventory that also includes advertising.

That may explain why Google’s CEO sees a future where the web is not replaced by an AI but rather they coexist. So if you think about it, Query Deserves Advertising may be how search engines preserve their lucrative advertising business in the age of AI.

Query Deserves Search

An extension of this concept is to think about search queries for comparisons, user reviews, expert human reviews, news, medical, financial and other topics that require human input to be surfaced. Those kinds of queries may also switch over to a search result. The results may not look like today’s search results, but they will still be search results.

People love reading reviews, news, gossip and other human-generated content, and that’s not going away. Insights matter. Personality matters.

Query Deserves SEO

So maybe the SEO knee jerk reaction that SEO is dead is premature. We’re still at the beginning of this and as long as there’s money to be made off of search there will still be a need for websites, search engines and SEO.

Featured Image by Shutterstock/Shchus

Google Unplugs “Notes on Search” Experiment via @sejournal, @martinibuster

Google is shutting down its Google Notes Search Labs experiment, which allowed users to see and leave notes on Google’s search results, and many in the search community aren’t too surprised.

Google Search Notes

Availability of the feature was limited to Android and Apple devices, and the Notes experiment never had a clearly defined practical purpose. Search marketers’ reaction throughout was consistently that it would become a spam magnet.

The Search Labs page for the experiment touts it as a mode of self-expression, a way to help other users, and a way for users to collect their own notes within their Google profiles.

The official Notes page in Search Labs has a simple notice:

Notes on Search Ends May 2024

That’s it.

Screenshot Of Notice

Screenshot of Google's notice of cancellation

Reaction From Search Community

Kevin Indig tweeted his thoughts that anything Google makes with a user-generated content aspect is doomed to attract spam.

He tweeted:

“I’m gonna assume Google retires notes because of spam.

It’s crazy how spammy the web has become. Google can’t launch anything UGC without being bombarded.”

Cindy Krum (@Suzzicks) tweeted that it was author Purna Virji who predicted it would be shut down once Google received enough data.

She shared:

“It was actually @purnavirji who predicted it when we were at @BarbadosSeo – while I was talking. Everyone agreed that it would be spammed, but she said it would just be a test to collect a certain type of information until they got what they needed, and then it would be retired.”

Purna herself responded with a tweet:

“My personal (non-employer) opinion is that everyone wants all the UGC to train the AI models. Eg Reddit deal also could potentially help with that.”

Google’s Notes for Search seemed destined never to take off. It was met with skepticism and a shrug when it came out, and nobody’s really mourning that it’s on the way out, either.

Featured Image by Shutterstock/Jamesbin

WordPress Releases A Performance Plugin For “Near-Instant Load Times” via @sejournal, @martinibuster

WordPress released an official plugin that adds support for a cutting-edge technology called speculative loading, which can help boost site performance and improve the user experience for site visitors.

Speculative Loading

Speculative loading is a technique that fetches pages or resources before a user clicks a link to navigate to another webpage.

The official WordPress page about this new functionality describes it:

“The Speculation Rules API is a new web API… It allows defining rules to dynamically prefetch and/or prerender URLs of certain structure based on user interaction, in JSON syntax—or in other words, speculatively preload those URLs before the navigation.

This API can be used, for example, to prerender any links on a page whenever the user hovers over them. Also, with the Speculation Rules API, “prerender” actually means to prerender the entire page, including running JavaScript. This can lead to near-instant load times once the user clicks on the link as the page would have most likely already been loaded in its entirety. However that is only one of the possible configurations.”

The new WordPress plugin adds support for the Speculation Rules API. The Mozilla developer pages, a great resource for understanding web technologies, describe it like this:

“The Speculation Rules API is designed to improve performance for future navigations. It targets document URLs rather than specific resource files, and so makes sense for multi-page applications (MPAs) rather than single-page applications (SPAs).

The Speculation Rules API provides an alternative to the widely available `<link rel="prefetch">` feature and is designed to supersede the Chrome-only deprecated `<link rel="prerender">` feature. It provides many improvements over these technologies, along with a more expressive, configurable syntax for specifying which documents should be prefetched or prerendered.”
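To make the mechanism concrete, here is a minimal hand-written sketch of a document-level speculation rule of the kind described above, not the plugin’s actual output; the URL patterns (prerendering all same-site links except the admin area) are illustrative assumptions:

```html
<!-- Speculation rules are declared as JSON inside a script block,
     typically in the page <head>. -->
<script type="speculationrules">
{
  "prerender": [
    {
      "where": {
        "and": [
          { "href_matches": "/*" },
          { "not": { "href_matches": "/wp-admin/*" } }
        ]
      },
      "eagerness": "moderate"
    }
  ]
}
</script>
```

With `"eagerness": "moderate"`, the browser typically begins prerendering when the user hovers over a matching link. Note that this document-rule syntax (`where`, `eagerness`) is the newer form that requires Chrome 121 or higher, which matches the plugin’s browser requirement described below; unsupported browsers simply ignore the script block.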

Performance Lab Plugin

The new plugin was developed by the official WordPress performance team which occasionally rolls out new plugins for users to test ahead of possible inclusion into the actual WordPress core. So it’s a good opportunity to be first to try out new performance technologies.

The new WordPress plugin is by default set to prerender “WordPress frontend URLs” which are pages, posts, and archive pages. How it works can be fine-tuned under the settings:

Settings > Reading > Speculative Loading

Browser Compatibility

The Speculation Rules API is supported from Chrome 108, but the specific rules used by the new plugin require Chrome 121 or higher. Chrome 121 was released in early 2024.

Browsers that do not support the API will simply ignore the rules, with no effect on the user experience.

Check out the new Speculative Loading WordPress plugin developed by the official core WordPress performance team.

Speculative Loading By WordPress Performance Team

Are Websites Getting Faster? New Data Reveals Mixed Results via @sejournal, @MattGSouthern

Website loading times are gradually improving, but a new study shows significant variance in performance across sites and geographic regions.

The study from web monitoring company DebugBear examined data from Google’s Chrome User Experience Report (CrUX), which collects real-world metrics across millions of websites.

“The average website takes 1.3 seconds to load the main page content for an average visit,” the report stated, using Google’s Largest Contentful Paint (LCP) metric to measure when the main content element becomes visible.

While that median LCP time of 1.3 seconds represents a reasonably fast experience, the data shows a wide range of loading performances:

  • On 25% of mobile websites, visitors have to wait over 2.1 seconds for the main content to appear
  • For the slowest 1% of websites, even an average page load takes more than 5.7 seconds on mobile
  • The slowest 10% of websites make 10% of users wait over 5 seconds for the LCP on mobile
  • Almost 1% of mobile page loads take nearly 20 seconds before the main content shows up

“Even on a fast website, some percentage of page views will be slow,” the study reads.

Continue reading for a deeper dive into the study to understand how your website speed compares to others.

Site Speed Divergences

The data reveals divergences in speeds between different user experiences, devices, and geographic locations:

  • Desktop sites (1.1-second median LCP) load faster than mobile (1.4 seconds)
  • While 25% of mobile page loads hit LCP in under 1 second, 10% take over 4 seconds
  • In the Central African Republic, a typical mobile LCP is 9.2 seconds (75th percentile)
  • Sweden, Slovenia, Japan, and South Korea all had 75th percentile mobile LCPs under 1.7 seconds

“Differences in network connections and device CPU speed mean that visitors in different countries experience the web differently,” the report noted.

The study also found that more popular sites are faster, with the median LCP on the top 1000 sites being 1.1 seconds compared to 1.4 seconds for the top 10 million sites.

Steady Improvement Continues

DebugBear’s analysis shows that websites have steadily become faster across device types over the past few years despite the variances.

A similar improvement was seen for other loading metrics, like First Contentful Paint.

“While changes to the LCP definition may have impacted the data, the First Contentful Paint metric – which is more stable and well-defined – has also improved,” the report stated.

The gains could be attributed to faster devices and networks, better website optimization, and improvements in the Chrome browser.

The study’s key finding was that “Page speed has consistently improved.” However, it also highlighted the wide range of experiences in 2024.

As DebugBear summarized, “A typical visit to a typical website is fast, but you likely visit many websites each day, some slow and some fast.”

Why SEJ Cares

This study provides an annual check-in to see how the web is progressing in terms of loading performance.

In recent years, Google has been emphasizing page load times and its Core Web Vitals metrics to measure and encourage better user experiences.

Speed also plays a role in search rankings. However, its precise weight as a ranking signal is debated.

How This Can Help You

SEO professionals can use studies like this to advocate for prioritizing page speed across an organization.

This report highlights that even high-performing sites likely have a segment of visitors hitting a subpar speed.

Refer to the study as a benchmark for how your site compares to others. If you’re unsure where to start, look at LCP times in the Chrome User Experience Report.

If a segment is well above the 2.1-second threshold for mobile, as highlighted in this study, it may be worth prioritizing front-end optimization efforts.

Segment your page speed data by country for sites with an international audience. Identifying geographic weak spots can inform performance budgeting and CDN strategies.

Remember that you can’t do it all alone. Performance optimization is a collaborative effort between SEOs and developers.


Featured Image: jamesteohart/Shutterstock

Google Explains Index Selection During A Core Update via @sejournal, @martinibuster

Google’s Gary Illyes answered a question about canonicalization, indexing and core algorithm updates that gives a clearer picture of how the different systems work together but independently.

A search marketer named David Minchala asked if Google’s canonicalization processes still worked but in a slower manner during a core algorithm update. The answer to that question is interesting because it offers a way to better understand how these backend processes function.

David’s question used the word “posit” which means to put an idea or statement forward for consideration as a possible fact.

This is the question:

“Posit: during core algo updates (and maybe any big update?), indexing services like canonicalization (i.e., selecting the URL to index and merging all signals from other known duplicate URLs) still work but are slower. Maybe much slower.

Any chance for a comment, Gary Illyes or John Mueller ? Could also be a good topic for Search Off the Record: what are the technical demands on Google to roll out core updates and how could that affect “normal” services like crawling and indexing.”

Google’s Gary Illyes responded by saying that the posited statement is incorrect, using an analogy to explain how the two things function. Gary specifically mentions the index selection process (where Google chooses what goes into the index) and canonicalization (choosing which URL represents the webpage when there are duplicates).

He explained:

“the posit is incorrect. those systems are independent from the “core updates”.

think of core updates as playing with cooking ingredients: you change how much salt or msg you put in your stir fry and you can radically change the result.

in this context index selection and canonicalization is more about what’s happening in the salt mines or the msg factories; not much to do with the cooking just yet.”

Google Indexing Engine

So in other words, what happens in a core update happens independently of the index selection and canonicalization processes. That way of looking at it, as Gary Illyes suggested, aligns with many of Google’s patents that describe how search systems work. When talking about a search engine, patents describe it as a collection of engines, using the phrase “indexing engine” when talking about indexing.

For example, in one patent illustration there’s an indexing engine, a ranking engine, and a score modification engine. Data goes in and out of each engine where it gets processed according to its function.

Screenshot From A Google Patent

Flowchart depicting a search system. It includes a query input, search results output, and components like an index database, indexing engine, ranking engine, and a score modification database.

The above screenshot makes it easier to understand what a search engine is and how the different parts work together and separately as well.

Read the LinkedIn discussion here.

Featured Image by Shutterstock/Roman Samborskyi

Google Updates Carousels (Beta) Structured Data Documentation via @sejournal, @martinibuster

Google updated the structured data documentation for the Structured Data Carousels (beta) that show rich results for qualifying topics. The new documentation clarifies specific requirements and makes it more explicit that the rich results features are limited to a single geographic area.

Structured Data Carousels (beta)

Carousels Structured Data (beta) lets web publishers that aggregate travel, local, and shopping information add structured data to their pages, making them eligible for a new carousel rich result that prominently displays their content in the search results as a horizontally scrollable list (the carousel).

This beta rich result feature uses the ItemList structured data and is available for webpages that display content related to LocalBusiness, Product, and Event Schema.org structured data properties. Each tile in the carousel displays relevant information such as price, rating, dates and images in a rich and interactive format.
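As a rough illustration of the ItemList markup involved (a hand-written sketch with placeholder names and example.com URLs, not Google’s official sample), a summary page listing three hotels might carry JSON-LD like this:

```html
<!-- Hypothetical ItemList markup for a summary page; every name and
     URL below is a placeholder, and detail URLs must live on the
     same domain as the summary page. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "ItemList",
  "itemListElement": [
    {
      "@type": "ListItem",
      "position": 1,
      "item": {
        "@type": "LocalBusiness",
        "name": "Example Hotel One",
        "url": "https://example.com/hotels/one"
      }
    },
    {
      "@type": "ListItem",
      "position": 2,
      "item": {
        "@type": "LocalBusiness",
        "name": "Example Hotel Two",
        "url": "https://example.com/hotels/two"
      }
    },
    {
      "@type": "ListItem",
      "position": 3,
      "item": {
        "@type": "LocalBusiness",
        "name": "Example Hotel Three",
        "url": "https://example.com/hotels/three"
      }
    }
  ]
}
</script>
```

Each `ListItem` points at a separate detail page, consistent with the summary-page model the updated documentation describes: the markup lives on the summary page only, while the detail pages need no markup of their own.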

Stronger Emphasis On Summary Page

The updated documentation makes it clearer that the beta carousel structured data is meant to be implemented on a summary page that links out to pages with more detailed information, and that the linked detail pages do not need to carry this specific structured data.

The old documentation contained the following instructions:

“Add markup to a single page (also known as a single, all-in-one-page list) that contains all list information, including full text of each item. For example, a list of the top hotels in a location, all contained on one page.”

The new documentation now explains it like this:

“Pick a single summary page that contains some information about every entity in the list. For example, a category page that lists the “Top hotels in Paris”, with links out to specific detail pages on your site for more information about each hotel.”

There is also an addition of an example for clarification:

“For example, if you have a “Things to do in Switzerland” article that lists both local events and local businesses.

Add the required properties to that summary page. You don’t need to add markup to the detail pages in order to be eligible for this beta feature.”

There is also an entirely new paragraph:

“Your site must have a summary page and multiple detail pages. Currently, this feature isn’t designed to support other scenarios, such as an all-in-one page where the “details” are anchor points within the same page.

The markup must be on a summary or category page, which is a list-like page that contains information about at least three entities and then links out to other pages on your site for more information on those entities. While you don’t need to add markup to the detail pages, you must include the detail page URLs in your summary page’s markup.”

Lastly, there is an edit to a short paragraph that makes it clearer that the structured data is for a standalone summary page.

This is the previous version:

“The canonical URL of the item detail page (for example, hotel or vacation listing on that page). All URLs in the list must be unique, but live on the same domain (the same domain, or sub or super domain as the current page).”

This is the new version (new wording is italicized):

“The canonical URL of the item detail page (for example, the standalone page for a single hotel or vacation listing that was referenced in the summary page). All URLs in the list must be unique, but live on the same domain (the same domain, or sub or super domain as the summary page).”

Clarification On Geographic Eligibility

Google’s changelog notes that the changes are meant to clarify that the structured data is for use on summary pages. However, it fails to note that the new documentation also has more information about where the new rich results features are available.

This is what the changelog says:

“Clarified that the beta carousel feature is for sites that have a summary page that links out to other detail pages on their website. The markup must be on the summary page, and you don’t need to add markup to the detail pages in order to be eligible for this feature.”

But that changelog is incomplete because it omits that there is an additional paragraph clarifying that this rich results feature is geographically limited.

The previous version said nothing about which countries are eligible for the beta rich results. That information was contained in the initial announcement of the new feature but not in its documentation.

The new documentation has this additional content which corrects the omission:

“Feature availability
This feature is in beta and you may see changes in requirements or guidelines, as we develop this feature. If your business is based in EEA, or serves users in EEA, and you would like to learn more and express interest in these new experiences, you can start by filling out the applicable form (for flights queries, use the interest form for flights queries).

This feature is currently only available in European Economic Area (EEA) countries, on both desktop and mobile devices. It’s available for travel, local, and shopping queries. For shopping queries, it’s being tested first in Germany, France, Czechia, and the UK.”

It is curious that Google would leave out important information about feature availability from the original Carousels (beta) documentation and then fail to mention in the changelog that it was added back in.

That’s important information and adding it to the newly updated documentation should have been noted in the changelog.

Read the newly updated documentation and guidelines:

Structured data carousels (beta)

Featured Image by Shutterstock/Framalicious

15 Reasons Why Your Business Absolutely Needs SEO via @sejournal, @searchmastergen

The need for quality SEO keeps increasing.

Brands that execute an organic strategy the right way are standing out early and often – and it’s more important now than ever, thanks to the emergence of AI and other technological innovations.

Blend those emerging technologies with the tumultuous few years that made up the COVID pandemic – where millions of consumers were pushed online to do their business, make purchases, and stay entertained – and you can begin to scratch the surface of SEO’s importance in marketing’s modern-day ecosystem.

SEO is the most viable, sustainable, and cost-effective way to both understand and reach your customers in key moments that matter.

Doing so not only helps build trust while educating the masses – it also establishes an organic footprint that transcends multiple marketing channels with measurable impact.

But while it will certainly improve a website’s overall searchability and visibility, what other real value does SEO offer for brands willing to commit to legitimate recurring or project-based SEO engagements?

And why does SEO continue to grow into a necessity rather than a luxury?

Here are 15 reasons why businesses need SEO to take their brand to the next level – regardless of the industry or business size.

1. Organic Search Is Most Often The Primary Source Of Website Traffic

Organic search is a massive part of most businesses’ website performance and a critical component of the buyer funnel, ultimately getting users to complete a conversion or engagement.

Google owns a significantly larger portion of the search market than competitors like Yahoo, Bing, Baidu, Yandex, DuckDuckGo, and many others.

Search engine market share. Screenshot from gs.statcounter.com, February 2024

That’s not to say that other search engines don’t contribute to a brand’s visibility – they do. It’s just that Google owns a considerable portion of the overall search market, so its guidelines are important to follow.

But the remaining part of the market owned by other engines is valuable to brands, too. This is especially true for brands in niche verticals where voice, visual, and vertical search engines play an essential role.

Google, being the most visited website in the world (and specifically in the United States), also happens to be one of the most popular email providers in the world.

YouTube is the second most-used search engine, with at least 2.5 billion people accessing it at least once a month, or 122 million people daily.

We know that a clear majority of the world with access to the internet is visiting Google at least once a day to get information.

Being highly visible as a trusted resource by Google and other search engines will always work in a brand’s favor. Quality SEO and a high-quality website take brands there.

2. SEO Builds Trust & Credibility

The problem for many brands is that building trust and credibility overnight is impossible – just like in real life. Authority is earned and built over time.

And, with the AI revolution we’ve experienced over the last year showing no signs of slowing down, building real credibility has become even harder to achieve – and even more critical.

Following Google’s E-E-A-T guidelines is vital to ensure successful results when creating content for your audience.

The goal of any experienced SEO professional is to establish a strong foundation of trust and credibility for a client. It helps to have a beautiful website with a clean, effective user experience that represents a quality brand with a loyal customer base – or at least the potential for one.

A brand of this nature would be easily discoverable in search with the right SEO strategy. The more channels you’re comfortable publishing on and partnering with, the more discoverable you will be.

This can also be attributed to being a respected brand offering quality goods or services to customers, being honest and forthcoming with the public, and earning the trust and credibility among peers, competitors, and other stakeholders.

This becomes a lot easier to succeed with when the brand already has trust signals tied to it and its digital properties.

So many varying elements contribute to establishing that authority with search engines like Google. It starts with building that credibility with humans.

In addition to the factors mentioned above, authority is accrued over time as a result of many contributing aspects.

But now, in the age of AI, establishing that authority continues to become even more complicated and difficult to do.

Yet still, doing so the right way will do more for a brand than most other digital campaigns or optimizations.

Establishing a brand as an authority takes patience, effort, and commitment that relies on offering a valuable, quality product or service that allows customers to trust a brand.

3. It’s An AI Battlefield Out There & It’s Getting Even Harder

Since what seemed like the overnight emergence of AI going mainstream and becoming available at every person’s fingertips, search engine results pages (SERPs) are now more competitive than ever.

Organic real estate keeps shrinking.

Bots, scrapers, and other AI-led technologies are stealing content and regurgitating things they learn along the way, which are often inaccurate or confusing, all while clouding the competitive market with duplicated or plain awful content.

Real SEO – including thorough keyword research, industry analysis, and competitive benchmarking to create high-value content for your customers and loyalists – allows brands to stand apart from the lowly regurgitated spam that floods our SERPs daily.

Search engines are relying on their own AI technologies to enhance the user experience within their platforms more than ever before, and optimizing websites and content for them is just another layer of complication brought on by the emergence of AI.

It’s no secret Google’s Search Generative Experience (SGE) hasn’t exactly been the magic touch to take search to the next level. And, in some instances – up to this point – SGE has even taken Google backward in terms of user experience and information retrieval on a boatload of varying topics and queries.

SEO will undoubtedly help brands navigate and distill – and stand out among – the search engine noise that is littered with D-list content and AI-generated mediocrity.

4. Good SEO Also Means A Better User Experience

User experience has become every marketer’s number one priority.

Everyone wants better organic rankings and maximum visibility. However, few realize that optimal user experience is a big part of getting there.

Google has learned how to interpret a good or unfavorable user experience, and a positive user experience has become a pivotal element to a website’s success.

Google’s Page Experience Update is something that marketers in all industries will need to adhere to, and it’s part of Google’s longstanding focus on the customer experience.

Customers know what they want. If they can’t find it, that website will struggle to hold up against the competition, which will inevitably surpass it by offering the same, or better, content with a better user experience.

We know how much Google values user experience. We see the search engine getting closer to delivering answers to search queries directly on the SERP every day, and it’s been doing it – and expanding its integration – for years.

The intention is to quickly and easily offer users the information they are looking for in fewer clicks.

Quality SEO incorporates a positive user experience, leveraging it to work in a brand’s favor.

It also means understanding how Google’s on-the-SERP delivery tactics affect high-value content that garnered significant traffic and engagement for sites in the past but is now losing significant portions of both to the SERPs themselves.

5. Local SEO Means Increased Engagement, Traffic & Conversions

The mobile-first mindset of humans and search engines alike has made local search fundamental for most small- and medium-sized businesses.

Local SEO aims to optimize digital properties for a specific vicinity so people can find a business quickly and easily, putting them one step closer to a transaction.

Local optimizations focus on specific neighborhoods, towns, cities, regions, and even states to establish a meaningful medium for a brand’s messaging on a local level.

SEO pros do this by optimizing the brand’s website and its content, including local citations and backlinks, in addition to regional listings relevant to the location and business sector to which a brand belongs.

To promote engagement locally, SEO pros should optimize a brand’s Knowledge Graph panel, its Google Business Profile, and its social media profiles as a start.

There should also be a strong emphasis on user reviews on Google and other third-party sites like Yelp, Home Advisor, and Angie’s List (among others), depending on the industry.

I recommend following the local SEO tips on SEJ here.

6. SEO Impacts The Buying Cycle

Research is becoming a critical element of SEO, and the importance of real-time research is growing.

Using SEO tactics to relay your messaging for good deals, ground-breaking products and services, and the importance and dependability of what you offer customers will be a game-changer.

It will also undoubtedly positively impact the buying cycle when done right.

Brands must be visible where people need them for a worthy connection to be made. Local SEO enhances that visibility and lets potential customers find the answers and the businesses providing those answers.

7. SEO Is Constantly Improving & Best Practices Are Always Being Updated

It’s great to have SEO tactics implemented on a brand’s website and across its digital properties.

Still, if it’s a short-term engagement (budget constraints, etc.) and the site isn’t re-evaluated consistently over time, it will reach a threshold where it can no longer improve because of other hindrances.

Or, it will require such a lift that brands will end up spending far more than expected to reach a place they could have otherwise reached naturally over time through marketing efforts that included SEO.

How the search world evolves (basically at the discretion of Google) requires constant monitoring for changes to stay ahead of the competition and, hopefully, on Page 1.

Being proactive and monitoring for significant algorithm changes will always benefit the brands doing so.

We know Google makes thousands of algorithm changes a year. Fall too far behind, and it will be tough to come back. SEO pros help to ensure this is avoided.

8. Understanding SEO Helps You Understand The Environment Of The Web

With the always-changing environment that is the World Wide Web, it can be challenging to stay on top of the changes as they occur.

But staying on top of SEO includes being in the loop for the major changes taking place for search.

The AI renaissance has been a clear indication of that.

Knowing the environment of the web, including tactics being used by other local, comparable businesses and competitors, will always be beneficial for those brands.

Observing and measuring what works and what doesn’t only strengthens your brand further as well.

Knowing the search ecosystem will be beneficial 10 out of 10 times.

9. SEO Is Relatively Cheap & Extremely Cost-Effective

Sure, it costs money. But all the best things do, right?

SEO is relatively inexpensive in the grand scheme of things, and the payoff will most likely be considerable in terms of a brand’s benefit to the bottom line.

This isn’t a marketing cost; this is an actual business investment.

Exemplary SEO implementation will hold its own for years to come. And, like most things in life, it will only get better with more attention (and investment).

Not only is it cost-effective, but it’s scalable, measurable, and rarely loses value over time.

10. It’s A Long-Term Strategy

SEO can (and hopefully does) have a noticeable impact within the first year of taking action – and many of those actions will have a lasting effect.

As the market evolves, it’s best to follow the trends and changes closely.

But even a site that hasn’t had a boatload of intense SEO recommendations implemented will improve from basic SEO best practices being employed on an honest website with a decent user experience.

And the more SEO time, effort, and budget committed to it, the better and longer a website stands to be a worthy contender in its market.

The grass is green where you water it.

11. It’s Quantifiable

While SEO doesn’t offer the same easy-to-calculate return on investment (ROI) as paid search, you can measure almost anything with proper tracking and analytics.

The big problem is connecting the dots on the back end since there is no definitive way to understand the correlation between all actions.

Tracking and attribution technology will continue to improve, which will only help SEO pros and their efforts.

Still, it is worth understanding how specific actions are supposed to affect performance and growth – and hopefully, they do.

Any good SEO pro will aim at those improvements, so connecting the dots should not be a challenge.

Brands also want to know and understand where they were, where they are, and where they’re going in terms of digital performance – especially for SEO when they have a person/company being paid to execute on its behalf.

There’s no better way to show the success of SEO, either.

And we all know the data never lies.

12. SEO Is PR

SEO helps build long-term equity for your brand. A good ranking and a favorable placement help elevate your brand’s profile.

People search for news and related items, and having a good SEO and PR strategy means your brand will be seen and likely remembered for something positive.

Providing a good user experience on your website means your messages will be heard, and your products or services will sell.

SEO is no longer a siloed channel, so integrating with content and PR helps with brand reach and awareness alongside other worthwhile results.

13. SEO Brings New Opportunities To Light

High-quality SEO will always find a means of discovering and leveraging new opportunities for brands not just to be discovered but to shine.

And that becomes a lot easier when experienced SEO pros can help distill the millions and millions of websites competing for – and flooding – the SERPs daily.

This goes beyond keyword research and website audits.

SEO is also extremely helpful for understanding the voice of your consumers.

From understanding macro market shifts to understanding consumer intent in granular detail, SEO tells us what customers want and need through the data it generates.

SEO data and formats – spoken or written – give us clear signals of intent and user behavior.

It does this in many ways.

Hiring an SEO professional is not always an easy task, either. It requires money, time, vision, communication, more time, and other things that will undoubtedly need adjusting over the course of time.

Executing SEO on behalf of brands means immersing an SEO team in everything that makes that brand what it is. It’s the only way to truly market something with the passion and understanding that its stakeholders have for it: by becoming a stakeholder.

The better a brand is understood, the more opportunities will arise to help it thrive. The same can be said about SEO.

New opportunities with SEO today can come in many ways – from content, digital, and social opportunities to helping with sales, product, and customer service strategies.

14. If You’re Not On Page One, You’re Not Winning The Click – Especially With Zero-Click Results

SEO is becoming a zero-sum game as zero-click SERPs show the answer directly at the top of a Google search result.

This has only intensified with AI, SGE, Gemini, and more sure-to-come technologies that continue to shape our industry.

Early 2023 data showed that about 56% of queries in a testing sample triggered SGE automatically on the SERP as part of an answer, largely based on the semantics and intent of the query.

SGE results are also still incredibly volatile; sometimes they show up automatically, other times not at all, and other times there’s even an option to use SGE for results or not.

Regardless of that or any speculation on the future, there’s one thing for sure: Zero-click results in searches are winning.

If you’re not on Page 1, you need to be.

There are still too many instances when a user types a search query and can’t find exactly what they’re looking for. And sadly, SGE hasn’t been great at changing that so far.

15. SEO Is Always Going To Be Here

Consumers will always want products and services online, and brands will always look for the most cost-effective way to connect them with each other.

While the role of SEO may shift and strategies will surely change, new avenues are constantly opening up through different entry points such as voice, apps, wearables, and the Internet of Things (IoT). AI is another prime example, and we can already see its impact.

Outdated SEO tactics aren’t going to work much longer. New organic search opportunities will always arise. SEO helps find the best ways to capitalize on them.

Conclusion

The role of SEO has expanded significantly over the last few years, and it’s only becoming more challenging and expansive in the face of AI.

New technologies are constantly creating new processes and even shortcuts and workarounds that are changing the game, sometimes for the better and sometimes for the worse.

One thing is certain, though: Without giving SEO efforts some significant attention through a brand’s fiscal year, you are doing your business a disservice. Try it and see. Analyze the results. Test some more. Try new things.

Stay up to date with changes and guidelines, and make sure you’re offering unique content that is valuable. And if it’s not originally yours, include proper citation and linking.

SEO will continue to help consumers find what they need when they need it.

Implementing robust, quality SEO updates on a brand’s website and digital properties will benefit them and their marketing efforts in measurable ways, and the impact will be felt.

There will be challenges, but when done right, there can also be success.

Featured Image: Rawpixel.com/Shutterstock

Google’s CEO On What Search Will Be Like In 10 Years via @sejournal, @martinibuster

Google and Alphabet’s CEO Sundar Pichai sat down for a discussion on AI that inevitably focused on the future of search. He explained his vision of search and the role of websites, insisting that the only thing different is the technology.

These Are The Early Days

The interviewer asked if Sundar was caught by surprise by how fast AI has progressed recently. Google’s CEO made it clear that Google was indeed at the forefront of AI and that they have been creating the infrastructure for it since 2016. He also reminded the interviewer that the world is at the very beginning of the AI age and that there’s a lot more coming.

Sundar answered:

“…one of the main things I did as a CEO is to really pivot the company towards working on AI and I think that’ll serve us well for the next decade ahead.

For example now I look back and compute is the hot currency now. We built TPUs, we started really building them at scale in 2016 right, so we have definitely been thinking about this for a long time.

…we’ve always had a sense for uh the trajectory ahead and in many ways we’ve been preparing the company for that and so I think foundationally a lot of our R&D …a lot of it has gone into AI for a long time and so I feel incredibly well positioned for what’s coming.

We’re still in the very early days I think people will be surprised at the level of progress we’re going to see and I feel like we’ve scratched the tip of the iceberg.”

The Only Thing Different Is The Technology

Sundar was also asked about the future of search and what it would look like. There’s a lot of anxiety by publishers and search marketers that AI will replace search entirely and that websites will fall into decline, taking the SEO industry down with it.

So it may come as a relief that Google’s CEO anticipates a future in which people and websites continue playing an important role in search just as it does today.

He starts by asserting that AI has been a part of search for many years and that the web ecosystem still plays a role in making search useful. He also underlines the point that the ten blue links haven’t been a thing for 15 years (people also ask, videos, top news, carousels) and that Google has long given direct answers (featured snippets, etc.).

This is the question asked:

How are things going to evolve? Like how will people access information in 10 years?

Sundar answers that the only thing different is the technology:

“Look, I think it’s one of the common myths around that Google has been ten blue links for a long time. You know, when mobile came we knew Google search had to evolve a lot. We call it Featured Snippets, but for almost ten years now, you go to Google for many questions we kind of use AI to answer them right, we call it web answers internally.

And so, we’ve always answered questions where we can but we always felt when people come and look for information people in certain cases want answers but they also want the richness and the diversity of what’s out there in the world and it’s a good balance to be had and we’ve always, I think, struck that balance pretty well.

To me all that is different is now the technology by which you can answer is progressing, so we will continue doing that. But this evolution has been underway in search for a long long time.”

People Trust Search

Sundar observed that search has always evolved and despite it being different today than what it was like fifteen years ago, it’s still about surfacing information from the web.

He continued:

“Search used to be text and 10 blue links maybe 15 years ago but you know be it images, be it videos, be it finding answers for your questions, those are all changes you know …to to my earlier point people kind of shrug and …we’ve done all this in Google search for a long time and people like it, people engage with it, people trust it.

So to me, I view it as a more natural continuation, obviously with LLMs and AI. I think you have a more powerful tool to do that and so which is what we are putting in search, you know with Search Generative Experience and so we’ll continue evolving it in that direction too.”

Search Engines And The Web Go Together

He was next asked about the question of political and cultural biases in search engines, mentioning that Google’s output has been accused of reflecting the liberal biases of its employees. He was asked, how do you think about what answer to give to questions?

Sundar’s answer returned to referencing the value of information created by people as found on websites as the best source of answers. He said that even with Search Generative Experience, they still want to point users to websites.

This is how he explained it:

“Let’s talk about search for a second here, you’re asking a very important question. I think you know the the work we have done over many many years making sure, from a search standpoint, in search we try to reflect what’s out in the web. And we want to give trustworthy high quality information. We’ve had to navigate all of this for a long time.

I think we’ve always struck the balance, that’s what I’m saying, it’s not about giving an answer, there are certain times you give an answer, what’s the population of the United States, yes it’s an answerable question. There are times you want to surface the breadth of opinions out there on the web which is what search does and does it well.

Just because you’re saying we are summarizing it on top doesn’t mean we veer from those principles. The summary can still point you to the range of opinions out there right, and we do that today all the time.”

SGE Is Not A Chatbot Experience

This next part is very important because it emphasizes the word “search” in the phrase Search Generative Experience in order to contrast it with talking to a chatbot.

There are a lot of articles predicting the decline of search traffic due to SGE, but there are many reasons why that’s not what’s happening, and Sundar explains why by differentiating the search experience from the chatbot experience. This is super important because it’s a point that’s lost on those whose knee-jerk reaction is that SGE is going to replace websites. According to Sundar, that’s not the case because search and chatbots are two different things.

His answer:

“And so I think that’s different from when you’re in a chatbot and I think that’s the more active area of research where sometimes it has its voice so how do you get those moments right and you know again for us I think it’s an area where we will be deeply committed to getting it right.

How do you do it in a way that which you represent the wide range of views that are held by people around the world and I think there are many aspects to it, the issues with AI models are not just at Google you see it across other models.”

AI Improves Search (Not Replaces It)

Near the end of the discussion, Sundar describes AI as a technology that improves current technologies (and not as something that replaces them). This too is an important point to consider when thinking about how AI will impact search and SEO.

His explanation of how AI improves but not necessarily replaces:

“…of course as a company you want to make sure you’re capitalizing on those innovations and building successful products, businesses, but I think we’ve long demonstrated that we can do it. The thing that excites me about AI is it’s the same underlying piece of technology for the first time in our history we have one leveraged piece of technology which can improve search, can improve YouTube, can improve Waymo and we put it all as cloud to our customers outside and so I feel good about that.”

Takeaways

There was a lot of important information in this interview that provides the most comprehensive picture of what the state of search will look like in the future.

Some of the important points:

  • AI is not new. It’s been a part of Google for many years.
  • Google has provided answers and summaries for years.
  • Websites are important to search.
  • SGE is not a chatbot experience; it’s a search experience.
  • Search and chatbots are two different things.
  • AI improves search (not replaces it).

Watch the interview at the 1 hour 17 minute mark:

Featured Image by Shutterstock/photosince

Google Improves INP For Sites Using Consent Management Platforms via @sejournal, @MattGSouthern

Google has announced improvements to its Interaction to Next Paint (INP) metric for websites using popular consent management platforms (CMPs).

Google made this possible by working directly with platforms like OneTrust, Complianz, and Axeptio.

Barry Pollard, a Chrome User Experience Report (CrUX) team member, announced the initiative in a recent post on the CrUX Announcements group.

Pollard stated:

“The team at Google have been working with a number of Consent Management Platforms, including OneTrust, Complianz and Axeptio, to improve Interaction To Next Paint (INP) by yielding more often—particularly when cookies are accepted.”

INP Insights From Google’s Chrome UX Team

Pollard revealed that Google’s collaboration with CMPs has “resulted in much improved INP for sites using these platforms.”

He explained that the platforms now “yield more often” when cookies are accepted, directly impacting the site’s INP performance.

Related: Get Ready For Google’s INP Metric With These 5 Tools

The Importance Of INP

Introduced as a replacement for First Input Delay (FID), INP measures the time from when a user interacts with a page to when the browser can render the changed pixels to the screen.

As a Core Web Vital, INP plays a role in assessing a website’s interactivity and overall user experience.

Optimizing INP & Identifying Issues

You can evaluate your site’s current INP performance using tools like PageSpeed Insights and CrUX.
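For illustration, CrUX field data can also be pulled programmatically via the public CrUX API. Here’s a minimal sketch of building such a request; the API key and origin are placeholders you’d supply yourself:

```javascript
// Build a request for the Chrome UX Report (CrUX) API, which returns
// real-user field data, including interaction_to_next_paint.
// The API key and origin are placeholders, not real credentials.
function buildCruxRequest(origin, apiKey) {
  return {
    method: "POST",
    url: `https://chromeuxreport.googleapis.com/v1/records:queryRecord?key=${apiKey}`,
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      origin,                                 // e.g. "https://example.com"
      metrics: ["interaction_to_next_paint"], // limit the response to INP
    }),
  };
}

// Usage (in any fetch-capable runtime):
//   const req = buildCruxRequest("https://example.com", "YOUR_API_KEY");
//   fetch(req.url, { method: req.method, headers: req.headers, body: req.body })
//     .then((res) => res.json())
//     .then((data) => console.log(data.record.metrics));
```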

Google has also published a tutorial on identifying and resolving INP issues, guiding developers through steps like diagnosing problematic areas, optimizing JavaScript, and streamlining the DOM structure.

According to data from DebugBear, a web performance monitoring platform, the average website takes 1.3 seconds to load the main page content (as measured by the Largest Contentful Paint metric).

However, there is significant variation in loading speeds across different websites, devices, and locations.

FAQ

What is Interaction to Next Paint (INP) and why is it important?

In simple terms, INP measures the time from when a user interacts with a page to when the browser can render the changed pixels on the screen. It’s an evolution of the First Input Delay (FID) metric and is considered a Core Web Vital by Google.

INP is important because it quantifies the responsiveness of a webpage, which is an aspect of user experience. A lower INP (under 200 ms) indicates a more interactive and responsive website, which can contribute to user satisfaction and potentially better search visibility.
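Those Core Web Vitals thresholds (good up to 200 ms, needs improvement up to 500 ms, poor above that) can be expressed directly in code. A small sketch; the helper name is ours, while the commented browser usage follows the open-source web-vitals library’s API:

```javascript
// Classify an INP value using Google's Core Web Vitals thresholds:
// good <= 200 ms, needs improvement <= 500 ms, poor above 500 ms.
function rateINP(ms) {
  if (ms <= 200) return "good";
  if (ms <= 500) return "needs-improvement";
  return "poor";
}

// In the browser, the web-vitals library reports the page's field INP:
//   import { onINP } from "web-vitals";
//   onINP(({ value }) => console.log(`INP: ${value} ms (${rateINP(value)})`));
```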

How have Consent Management Platforms been improved for better INP results?

CMPs like OneTrust, Complianz, and Axeptio have been optimized through Google’s collaboration to enhance website INP metrics.

This was achieved by having the platforms “yield more often,” specifically when users accept cookies.

Yielding more often means these platforms allow the browser’s main thread to be less occupied with processing consent-related tasks, improving the INP metric and overall performance.
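The pattern Pollard describes – breaking long consent-handling work into chunks so the browser can respond and paint in between – can be sketched roughly like this. The task structure below is hypothetical; the actual CMP internals are not public:

```javascript
// A minimal "yield to the main thread" helper: prefer scheduler.yield()
// where the browser supports it, otherwise fall back to setTimeout.
function yieldToMain() {
  if (typeof scheduler !== "undefined" && scheduler.yield) {
    return scheduler.yield();
  }
  return new Promise((resolve) => setTimeout(resolve, 0));
}

// Instead of running all consent work in one long task, run one unit,
// then yield so the browser can handle input and paint in between.
// (Task names and shapes are hypothetical, for illustration only.)
async function onCookiesAccepted(tasks) {
  const results = [];
  for (const task of tasks) {
    results.push(task()); // one small unit of consent-related work
    await yieldToMain();  // give the main thread back to the browser
  }
  return results;
}
```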

Featured Image: rafapress/Shutterstock

Google’s John Mueller Clarifies 404 & 410 Confusion For SEO via @sejournal, @MattGSouthern

A recent discussion on the r/SEO Reddit forum provided insights from Google’s Search Advocate, John Mueller, regarding website penalties and the use of HTTP status codes.

Mueller addressed questions raised by a website owner who had previously used AI to generate content for their videogame guide website.

After removing approximately 200 AI-generated pages due to concerns, the owner sought advice on recovery.

The conversation led to a discussion of the nuances of HTTP status codes 404 and 410, which indicate that a page is missing (404) or permanently removed (410).

Mueller’s responses clarified Google’s stance, emphasizing practical considerations over theoretical differences in SEO.

Website Owner Admits To AI Content Creation

The conversation began when a website owner admitted using AI technologies like GPT to generate content for older games on their long-standing game guide website.

The site owner confessed:

“I did try to see if I could get GPT to write game guides for older games that I haven’t played, just to boost content on the site and take advantage of the authority the site had.”

After a brief period of success, concerns arose, prompting the removal of approximately 200 AI-generated pages.

As they grapple with the repercussions, they ask:

“I’m wondering if this has typically been enough for others to see some recovery?”

Addressing 404 Status Codes

One Reddit user suggested the site might be facing penalties due to 404 status codes, which indicate a webpage cannot be found.

However, Mueller swiftly clarified the situation:

“Google does not penalize for 404’s (those pages drop out of the index though).”

404 vs. 410 Status Codes

A follow-up question asked about the potential impact of using a 410 status code, indicating that a resource is permanently gone, versus a 404.

Mueller confirms the differences are negligible in terms of SEO:

“It doesn’t matter. The difference in processing of 404 vs 410 is so minimal that I can’t think of any time I’d prefer one over the other for SEO purposes.”

He acknowledged the theoretical correctness of using the appropriate status code but says practical considerations take priority.

A Lighthearted Closing

Recognizing the widespread attention his comments would likely receive, Mueller concluded his response with a touch of humor:

“And I realize that writing this out now will trigger another cycle of needless attention – or is it really needless? Hi, mom. I would like to thank the academy for the honor of being here. Support the Women in Tech SEO group. Floss.”

Why SEJ Cares

With the March core update still rolling out, Mueller’s insights provide valuable guidance on navigating potential demotions and ensuring compliance with best practices.

Mueller’s comments on HTTP status codes offer a pragmatic approach to handling missing or removed web pages.

With this knowledge, SEO professionals can make more informed decisions.

How This Can Help You

Mueller’s advice provides a starting point for those facing similar situations.

By following best practices and addressing potential issues promptly, website owners can work towards regaining their search engine visibility.


FAQ

How does Google view 404 and 410 HTTP status codes regarding SEO?

Google’s position on HTTP 404 and 410 status codes is that they are treated similarly with minimal differences in SEO impact.

These codes signal to Google that a page is missing (404) or permanently removed (410), and as such, the pages will be dropped from the index, but these responses do not result in penalties.

Understanding these distinctions allows SEO professionals to handle missing content appropriately without fear of negative SEO repercussions.

Are there negative ramifications for using AI to create content on websites?

While not inherently penalized, AI-generated content must meet quality guidelines, as low-quality content can negatively impact a site’s SEO.

Recovery from removing such content depends on various factors, including adherence to best practices and the quality of the remaining content.

Genuine and value-driven content tends to be favored in search ranking.

Can the removal of low-quality or non-compliant content lead to search ranking recovery?

Eliminating low-quality or non-compliant content is often a step towards recovery in search rankings because it aligns with Google’s emphasis on high-quality and relevant information.

However, the recovery process can also depend on factors like the creation of valuable content, overall site performance, and adherence to SEO best practices.


Featured Image: Roman Samborskyi/Shutterstock