Google: Proximity Not A Factor For Local Service Ads Rankings via @sejournal, @MattGSouthern

Google has clarified that a business’s proximity to a searcher isn’t a primary factor in how Local Services Ads are ranked.

This change reflects Google’s evolving understanding of what’s relevant to users searching for local service providers.

Chris Barnard, a Local SEO Analyst at Sterling Sky, started the discussion by pointing out an update to a Google Help Center article.

In a screenshot, he highlighted that Google removed the section stating proximity is a factor in local search ad rankings.

Ginny Marvin, Google’s Ads Liaison, responded to clarify the change.

In a statement, Marvin said:

“LSA ranking has evolved over time as we have learned what works best for consumers and advertisers. We’ve seen that proximity of a business’ location is often not a key indicator of relevancy.

For example, the physical location of a home cleaning business matters less to potential customers than whether their home is located within the business’ service area.”

Marvin confirmed this wasn’t a sudden change but an update to “more accurately reflect these ranking considerations” based on Google’s learnings.

The updated article now states that location relevance factors include:

“…the context of a customer’s search… the service or job a customer is searching for, time of the search, location, and other characteristics.”

Proximity Still A Factor For Service Areas

Google maintains policies requiring service providers to limit their ad targeting to areas they can service from their business locations.

As Marvin cites, Google’s Local Services platform policies state:

“Local Services strives to connect consumers with local service providers. Targeting your ads to areas that are far from your business location and/or that you can’t reasonably serve creates a negative and potentially confusing experience for consumers.”

Why SEJ Cares

By de-emphasizing proximity, Google is giving its ad-serving algorithms the flexibility to surface the most relevant and capable providers.

This allows the results to match user intent better and connect searchers with companies that can realistically service their location.


FAQ

What should businesses do in response to the change in Local Services Ads ranking factors?

With the recent changes to how Google ranks Local Services Ads, businesses should update the service areas listed for their ads to reflect the regions they can realistically serve. You’ll want to match the service areas to what’s listed on your Google Business Profile.

Companies should also ensure their service offerings and availability information are up to date, as these are other key factors that will impact how well their Local Services Ads rank and show up for relevant local searches.

Why is it important for marketers to understand changes to Local Services Ads ranking?

These changes affect how businesses get matched with potential customers. Google no longer heavily prioritizes closeness when ranking local service ads. Instead, it focuses more on other relevant factors.

Understanding this shift allows businesses to update their local service ad strategies. By optimizing for Google’s new priorities, companies can get their ads in front of the right audience.

Can a business still target areas far from their location with Local Services Ads?

No, Google doesn’t allow businesses to target areas they can’t realistically service.

This is to prevent customers from being matched with providers who are too far away to help them. Businesses can only advertise in areas close to their location or service areas.


Featured Image: Mamun sheikh K/Shutterstock

OpenAI Announces ChatGPT 4o Omni via @sejournal, @martinibuster

OpenAI announced a new version of ChatGPT that can accept audio, image, and text inputs and also generate outputs in audio, image, and text. OpenAI is calling the new version GPT-4o, with the “o” standing for “omni,” a combining form meaning “all.”

ChatGPT 4o (Omni)

OpenAI described this new version of ChatGPT as a progression toward more natural human-machine interaction, responding to user inputs at the same speed as a human-to-human conversation. The new model matches GPT-4 Turbo in English and significantly outperforms Turbo in other languages. There is also a significant improvement in API performance, with higher speed and 50% lower cost.

The announcement explains:

“As measured on traditional benchmarks, GPT-4o achieves GPT-4 Turbo-level performance on text, reasoning, and coding intelligence, while setting new high watermarks on multilingual, audio, and vision capabilities.”

Advanced Voice Processing

The previous method for communicating with voice chained together three different models: one transcribed voice inputs to text, a second (GPT-3.5 or GPT-4) processed that text and output a text response, and a third converted the text back into audio. That pipeline is said to lose nuance at each step.

OpenAI described the downsides of the previous approach that are (presumably) overcome by the new approach:

“This process means that the main source of intelligence, GPT-4, loses a lot of information—it can’t directly observe tone, multiple speakers, or background noises, and it can’t output laughter, singing, or express emotion.”

The new version doesn’t need three different models because all inputs and outputs are handled together in one model for end-to-end audio input and output. Interestingly, OpenAI states that it hasn’t yet explored the full capabilities of the new model or fully mapped its limitations.

New Guardrails And An Iterative Release

GPT-4o features new guardrails and filters designed to keep it safe and avoid unintended voice outputs. However, today’s announcement says OpenAI is only rolling out text and image inputs and text outputs at launch, with limited audio capabilities. GPT-4o is available for both free and paid tiers, with Plus users receiving five times higher message limits.

Audio capabilities are due for a limited alpha-phase release for ChatGPT Plus and API users within weeks.

The announcement explained:

“We recognize that GPT-4o’s audio modalities present a variety of novel risks. Today we are publicly releasing text and image inputs and text outputs. Over the upcoming weeks and months, we’ll be working on the technical infrastructure, usability via post-training, and safety necessary to release the other modalities. For example, at launch, audio outputs will be limited to a selection of preset voices and will abide by our existing safety policies.”

Read the announcement:

Hello GPT-4o

Featured Image by Shutterstock/Photo For Everything

Google Warns Against “Sneaky Redirects” When Updating Content via @sejournal, @MattGSouthern

When dealing with outdated website content, Google has warned against using certain redirects that could be perceived as misleading to users.

The advice came up during a recent episode of Google’s Search Off The Record podcast.

In the episode, Search Relations team members John Mueller and Lizzi Sassman discussed strategies for managing “content decay” – the gradual process of website content becoming obsolete over time.

During the conversation, the two Googlers addressed the practice of using redirects when older content is replaced or updated.

However, they cautioned against specific redirect methods that could be seen as “sneaky.”

When Rel=canonical Becomes “Sneaky”

The redirect method that raised red flags is the incorrect use of rel=canonical tags.

This was brought up during a discussion about linking similar, but not equivalent, content.

Sassman stated:

“… for that case, I wish that there was something where I could tie those things together, because it almost feels like that would be better to just redirect it.

For example, Daniel Weisberg on our team blogged about debugging traffic drops with Search Console in a blog post. And then we worked on that to turn that into documentation and we added content to it. We want people to go look at the new thing, and I would want people to find that new thing in search results as well.

So, to me, like that one, I don’t know why people would need to find the older version for that, because it’s not like an announcement. It was best practice kind of information.

So, for that, would it be better to do like a rel=canonical situation?”

Mueller immediately raised concerns with Sassman’s proposed use of the rel=canonical tag.

Mueller replied:

“The rel=canonical would be kind of sneaky there because it’s not really the same thing… it’s not equivalent.

I always see rel=canonical as something where you tell search engines ‘these are actually equivalent, and you can pick whichever one you want.’

We’re kind of seeing it as like, ‘Well, these are equivalent, but treat this as a redirect,’ which is tricky because they’re like, ‘Ah, they say rel=canonical, but they actually mean something different.’”

What To Do Instead

If you find yourself having to make a similar decision as Sassman, Mueller says this is the correct approach:

“I think either redirecting or not redirecting. It’s like really saying that it’s replaced or keeping both.”

The best way to point an old page to a newer, more comprehensive version is a redirect, not a rel=canonical.

Or you can keep them both up if you feel there’s still value in the older page.

Why SEJ Cares

Using redirects or canonical tags incorrectly can be seen as an attempt to manipulate search rankings, which violates Google’s guidelines and can result in penalties or a decrease in visibility.

Following Google’s recommendations can ensure your site remains in good standing and visitors access the most relevant content.

Listen to the full podcast episode below:


FAQ

What are the issues with using rel=canonical tags for updated content?

Using rel=canonical tags can be misleading if the old and new pages aren’t equivalent.

Google’s John Mueller suggests that rel=canonical implies the pages are identical and a search engine can choose either. Using it to signal a redirect when the content isn’t equivalent is seen as “sneaky” and potentially manipulative.

Rel=canonical should only be used when content is truly equivalent; otherwise, a 301 redirect or maintaining both pages is recommended.
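For reference, a canonical tag is a `<link>` element in the page’s `<head>`. A minimal illustrative snippet follows (the URL is a placeholder); per Mueller’s guidance, it’s only appropriate when the two pages are genuinely equivalent:

```html
<!-- Placed in the <head> of the duplicate page; the href is a placeholder. -->
<!-- Signals that the two pages are equivalent and either may be indexed. -->
<link rel="canonical" href="https://www.example.com/new-guide" />
```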

Is it acceptable to keep outdated content accessible to users?

Yes, it’s acceptable to keep outdated content accessible if it still holds value. Google’s John Mueller suggests that you can either redirect outdated content to the updated page or keep both versions of the content live.

If the older content offers valuable information or historical context, it’s worthwhile to keep it accessible along with the updated version.

How should redirects be handled when updating website content?

The correct approach to handling redirects is to use a 301 redirect if the old content has been replaced or is considered obsolete.

A 301 redirect tells search engines—and visitors—that the old page has moved permanently to a new location. Additionally, it allows the transfer of link equity and minimizes negative impact on search rankings.
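For illustration, here is how a permanent redirect might look in an Apache `.htaccess` file. This is a sketch with placeholder paths; other servers have equivalent directives:

```apache
# Apache .htaccess sketch: permanently redirect a replaced page to its successor.
# Both paths are placeholders for illustration.
Redirect 301 /old-guide /new-guide
```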


Featured Image: Khosro/Shutterstock

OpenAI Expected to Integrate Real-Time Data In ChatGPT via @sejournal, @martinibuster

Sam Altman, CEO of OpenAI, dispelled rumors that a new search engine would be announced on Monday, May 13. Recent deals have raised the expectation that OpenAI will announce the integration of real-time content from English, Spanish, and French publications into ChatGPT, complete with links to the original sources.

OpenAI Search Is Not Happening

Many competing search engines have tried and failed to challenge Google as the leading search engine. A new wave of hybrid generative AI search engines is currently trying to knock Google from the top spot with arguably very little success.

Sam Altman is on record saying that creating a search engine to compete against Google is not a viable approach. He suggested that technological disruption was the way to replace Google by changing the search paradigm altogether. The speculation that Altman is going to announce a me-too search engine on Monday never made sense given his recent history of dismissing the concept as a non-starter.

So perhaps it’s not a surprise that he recently ended the speculation by explicitly saying that he will not be announcing a search engine on Monday.

He tweeted:

“not gpt-5, not a search engine, but we’ve been hard at work on some new stuff we think people will love! feels like magic to me.”

“New Stuff” May Be Iterative Improvement

It’s quite likely that what’s going to be announced is iterative, meaning it improves ChatGPT rather than replaces it. This fits how Altman has recently described his approach with ChatGPT.

He remarked:

“And it does kind of suck to ship a product that you’re embarrassed about, but it’s much better than the alternative. And in this case in particular, where I think we really owe it to society to deploy iteratively.

There could totally be things in the future that would change where we think iterative deployment isn’t such a good strategy, but it does feel like the current best approach that we have and I think we’ve gained a lot from from doing this and… hopefully the larger world has gained something too.”

Improving ChatGPT iteratively is Sam Altman’s preference and recent clues point to what those changes may be.

Recent Deals Contain Clues

OpenAI has been making deals with news media and user-generated content publishers since December 2023. Mainstream media has reported these deals as being about licensing content for training large language models. But they overlooked a key detail we reported on last month: these deals give OpenAI access to real-time information, which OpenAI has stated it will attribute with links to the original sources.

That means that ChatGPT users will gain the ability to access real-time news and to use that information creatively within ChatGPT.

Dotdash Meredith Deal

Dotdash Meredith (DDM) is the publisher of big brand publications such as Better Homes & Gardens, FOOD & WINE, InStyle, Investopedia, and People magazine. The deal that was announced goes way beyond using the content as training data. The deal is explicitly about surfacing the Dotdash Meredith content itself in ChatGPT.

The announcement stated:

“As part of the agreement, OpenAI will display content and links attributed to DDM in relevant ChatGPT responses. …This deal is a testament to the great work OpenAI is doing on both fronts to partner with creators and publishers and ensure a healthy Internet for the future.

Over 200 million Americans each month trust our content to help them make decisions, solve problems, find inspiration, and live fuller lives. This partnership delivers the best, most relevant content right to the heart of ChatGPT.”

A statement from OpenAI gives credibility to the speculation that OpenAI intends to directly show licensed third-party content as part of ChatGPT answers.

OpenAI explained:

“We’re thrilled to partner with Dotdash Meredith to bring its trusted brands to ChatGPT and to explore new approaches in advancing the publishing and marketing industries.”

Something that DDM also gets out of this deal is that OpenAI will enhance DDM’s in-house ad targeting in order to show more tightly focused contextual advertising.

Le Monde And Prisa Media Deals

In March 2024, OpenAI announced a deal with two global media companies, Le Monde and Prisa Media. Le Monde is a French news publication, and Prisa Media is a Spanish-language multimedia company. The interesting aspect of these two deals is that they give OpenAI access to real-time data in French and Spanish.

Prisa Media is a global Spanish-language media company based in Madrid, Spain, comprising magazines, newspapers, podcasts, radio stations, and television networks. Its reach extends from Spain to the Americas, with publications in the United States, Argentina, Bolivia, Chile, Colombia, Costa Rica, Ecuador, Mexico, and Panama. That is a massive amount of real-time information, in addition to an audience of millions.

OpenAI explicitly announced that the purpose of this deal was to bring this content directly to ChatGPT users.

The announcement explained:

“We are continually making improvements to ChatGPT and are supporting the essential role of the news industry in delivering real-time, authoritative information to users. …Our partnerships will enable ChatGPT users to engage with Le Monde and Prisa Media’s high-quality content on recent events in ChatGPT, and their content will also contribute to the training of our models.”

That deal is not just about training data. It’s about bringing current events data to ChatGPT users.

The announcement elaborated in more detail:

“…our goal is to enable ChatGPT users around the world to connect with the news in new ways that are interactive and insightful.”

As noted in our April 30th article that revealed that OpenAI will show links in ChatGPT, OpenAI intends to show third party content with links to that content.

OpenAI commented on the purpose of the Le Monde and Prisa Media partnership:

“Over the coming months, ChatGPT users will be able to interact with relevant news content from these publishers through select summaries with attribution and enhanced links to the original articles, giving users the ability to access additional information or related articles from their news sites.”

There are additional deals with other groups, like the Financial Times, that also stress these agreements will result in a new ChatGPT feature allowing users to interact with real-time news and current events.

OpenAI’s Monday May 13 Announcement

There are many clues that the announcement on Monday will be that ChatGPT users will gain the ability to interact with content about current events. This fits the terms of recent deals with news media organizations. Other features may be announced as well, but many clues point to this one.

Watch Altman’s interview at Stanford University

Featured Image by Shutterstock/photosince

Google Defends Lack Of Communication Around Search Updates via @sejournal, @MattGSouthern

While Google informs the public about broad core algorithm updates, it doesn’t announce every minor change or tweak, according to Google’s Search Liaison Danny Sullivan.

The comments were in response to Glenn Gabe’s question about why Google doesn’t provide information about volatility following the March core update.

Gabe wrote:

“… when site owners think a major update is done, they are not expecting crazy volatility that sometimes completely reverses what happened with the major update.
The impact from whatever rolled out on 5/3 and now 5/8 into 5/9 has been strong.”

Sullivan explained that Google continuously updates its search ranking systems, with around 5,000 updates per year across different algorithms and components.

Many of these are minor adjustments that would go unnoticed, Sullivan says:

“If we were giving notice about all the ranking system updates we do, it would be like this:

Hi. It’s 1:14pm — we just did an update to system 112!
Hi. It’s 2:26pm — we just did an update to system 34!

That’s because we do around 5,000 updates per year.”

While Google may consider these minor changes, the combined effect of thousands of tweaks can lead to significant shifts in rankings and traffic that site owners struggle to understand.

More open communication from Google could go a long way.

Ongoing Shifts From Web Changes

Beyond algorithm adjustments, Sullivan noted that search results can fluctuate due to the nature of web content.

Google’s ranking systems continually process new information, Sullivan explains:

“… already launched and existing systems aren’t themselves being updated in how they operate, but the information they’re processing isn’t static but instead is constantly changing.”

Google focuses communications on major updates versus a never-ending stream of notifications about minor changes.

Sullivan continues:

“This type of constant ‘hey, we did an update’ notification stuff probably isn’t really that useful to creators. There’s nothing to ‘do’ with those types of updates.”

Why SEJ Cares

Understanding that Google Search is an ever-evolving platform is vital for businesses and publishers that rely on search traffic.

It reiterates the need for a long-term SEO strategy focused on delivering high-quality, relevant content rather than reacting to individual algorithm updates.

However, we realize Google’s approach to announcing updates can leave businesses scrambling to keep up with ranking movements.

More insight into these changes would be valuable for many.

How This Can Help You

Knowing that Google processes new information in addition to algorithm changes, you may have more realistic expectations post-core updates.

Instead of assuming stability after a major update, anticipate fluctuations as Google’s systems adapt to new web data.


Featured Image: Aerial Film Studio/Shutterstock

Google’s Strategies For Dealing With Content Decay via @sejournal, @MattGSouthern

In the latest episode of the Search Off The Record podcast, Google Search Relations team members John Mueller and Lizzi Sassman did a deep dive into dealing with “content decay” on websites.

Outdated content is a natural issue all sites face over time, and Google has outlined strategies beyond just deleting old pages.

While removing stale content is sometimes necessary, Google recommends taking an intentional, format-specific approach to tackling content decay.

Archiving vs. Transitional Guides

Google advises against immediately removing content that becomes obsolete, like materials referencing discontinued products or services.

Removing content too soon could confuse readers and lead to a poor experience, Sassman explains:

“So, if I’m trying to find out like what happened, I almost need that first thing to know. Like, “What happened to you?” And, otherwise, it feels almost like an error. Like, “Did I click a wrong link or they redirect to the wrong thing?””

Sassman says you can avoid confusion by providing transitional “explainer” pages during deprecation periods.

A temporary transition guide informs readers of the outdated content while steering them toward updated resources.

Sassman continues:

“That could be like an intermediary step where maybe you don’t do that forever, but you do it during the transition period where, for like six months, you have them go funnel them to the explanation, and then after that, all right, call it a day. Like enough people know about it. Enough time has passed. We can just redirect right to the thing and people aren’t as confused anymore.”

When To Update Vs. When To Write New Content

For reference guides and content that provide authoritative overviews, Google suggests updating information to maintain accuracy and relevance.

However, for archival purposes, major updates may warrant creating a new piece instead of editing the original.

Sassman explains:

“I still want to retain the original piece of content as it was, in case we need to look back or refer to it, and to change it or rehabilitate it into a new thing would almost be worth republishing as a new blog post if we had that much additional things to say about it.”

Remove Potentially Harmful Content

Google recommends removing pages in cases where the outdated information is potentially harmful.

Sassman says she arrived at this conclusion when deciding what to do with a guide involving obsolete structured data:

“I think something that we deleted recently was the “How to Structure Data” documentation page, which I thought we should just get rid of it… it almost felt like that’s going to be more confusing to leave it up for a period of time.

And actually it would be negative if people are still adding markup, thinking they’re going to get something. So what we ended up doing was just delete the page and redirect to the changelog entry so that, if people clicked “How To Structure Data” still, if there was a link somewhere, they could still find out what happened to that feature.”

Internal Auditing Processes

To keep your content current, Google advises implementing a system for auditing aging content and flagging it for review.

Sassman says she sets automated alerts for pages that haven’t been checked in set periods:

“Oh, so we have a little robot to come and remind us, “Hey, you should come investigate this documentation page. It’s been x amount of time. Please come and look at it again to make sure that all of your links are still up to date, that it’s still fresh.””
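An automated reminder like the one Sassman describes can be approximated with a short script. Below is a minimal sketch in Python; the page inventory, URLs, dates, and review threshold are all hypothetical:

```python
from datetime import date

# Hypothetical content inventory: URL -> date the page was last reviewed.
PAGES = {
    "/docs/structured-data": date(2022, 3, 1),
    "/docs/sitemaps": date(2024, 4, 15),
}

def stale_pages(pages, today, max_age_days=365):
    """Return (url, age_in_days) for pages not reviewed within max_age_days,
    oldest first, so the most overdue pages surface at the top of the list."""
    stale = [(url, (today - reviewed).days)
             for url, reviewed in pages.items()
             if (today - reviewed).days > max_age_days]
    return sorted(stale, key=lambda item: -item[1])

# Pages flagged for review as of a given day.
flagged = stale_pages(PAGES, today=date(2024, 5, 13))
```

In practice, a script like this would run on a schedule and file a reminder (email, ticket, chat message) for each flagged page.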

Context Is Key

Google’s tips for dealing with content decay center around understanding the context of outdated materials.

You want to prevent visitors from stumbling across obsolete pages without clarity.

Additional Google-recommended tactics include:

  • Prominent banners or notices clarifying a page’s dated nature
  • Listing original publish dates
  • Providing inline annotations explaining how older references or screenshots may be obsolete
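A notice of this kind can be a short block of HTML near the top of the page. A minimal sketch, where the class name, dates, and link target are all placeholders:

```html
<!-- Dated-content notice; all values here are illustrative placeholders. -->
<div class="outdated-notice">
  <p><strong>Note:</strong> This article was published in March 2019 and
  references features that have since changed. See the
  <a href="/docs/current-guide">current version</a> for up-to-date guidance.</p>
</div>
```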

How This Can Help You

Following Google’s recommendations for tackling content decay can benefit you in several ways:

  • Improved user experience: By providing clear explanations, transition guides, and redirects, you can ensure that visitors don’t encounter confusing or broken pages.
  • Maintained trust and credibility: Removing potentially harmful or inaccurate content and keeping your information up-to-date demonstrates your commitment to providing reliable and trustworthy resources.
  • Better SEO: Regularly auditing and updating your pages can benefit your website’s search rankings and visibility.
  • Archival purposes: By creating new content instead of editing older pieces, you can maintain a historical record of your website’s evolution.
  • Streamlined content management: Implementing internal auditing processes makes it easier to identify and address outdated or problematic pages.

By proactively tackling content decay, you can keep your website a valuable resource, improve SEO, and maintain an organized content library.

Listen to the full episode of Google’s podcast below:


Featured Image: Stokkete/Shutterstock

Google Defines “Content Decay” In New Podcast Episode via @sejournal, @MattGSouthern

In the latest episode of Google’s Search Off The Record podcast, hosts John Mueller and Lizzi Sassman discussed “content decay”—the natural process by which online content becomes outdated or loses relevance over time.

While not a widely used term among SEO professionals, the concept raises questions about how websites should handle aging content that may contain obsolete information, broken links, or outdated references.

What Is Content Decay?

Mueller, a Search Advocate at Google, defines content decay as:

“[Content decay is] something where, when you look at reference material, it’s kind of by definition old. People wrote about it because they’ve studied it for a really long time. So it’s an old thing. But that doesn’t mean it’s no longer true or no longer useful.”

It’s worth noting Mueller was initially unfamiliar with the term:

“When I looked at it, it sounded like this was a known term, and I felt inadequate when I realized I had no idea what it actually meant, and I had to interpret what it probably means from the name.”

Sassman, who oversees the Search Central website’s content, admitted she was also unfamiliar with content decay.

She stated:

“… it sounded a little bit negative … Like something’s probably wrong with the content. Probably it’s rotting or something has happened to it over time.”

After defining the term, the two dissected various approaches to handling content decay, using Google’s help documents as a case study.

Content Decay Not Necessarily A Bad Thing

Content decay isn’t, by definition, a bad thing.

Blog posts announcing past events or product changes may seem like sources of content decay.

However, Sassman advises keeping that content for historical accuracy.

Sassman gives an example, citing Google’s decision to keep pages containing the outdated term “Webmaster Tools.”

“If we went back and replaced everything where we said ‘Google Webmasters’ with ‘Search Console,’ it would be factually incorrect. Search Console didn’t exist at that point. It was Webmaster Tools.”

Avoiding User Confusion

According to Mueller, the challenge in dealing with content decay is “avoiding confusing people.”

Indicating when content is outdated, providing context around obsolete references, and sensible use of redirects can help mitigate potential confusion.

Mueller stated:

“People come to our site for whatever reason, then we should make sure that they find information that’s helpful for them and that they understand the context. If something is old and they search for it, they should be able to recognize, ‘Oh, maybe I have to rethink what I wanted to do because what I was searching for doesn’t exist anymore or is completely different now.’”

No One-Size-Fits-All Solution

There are no easy solutions to content decay. You must thoughtfully evaluate aging content, understanding that some pieces warrant archiving while others remain valuable historical references despite age.

Listen to the full episode of Google’s podcast below:

Why SEJ Cares

The concept of “content decay” addresses a challenge all website owners face – how to manage and maintain content as it ages.

Dealing with outdated website content is essential to creating a positive user experience and building brand trust.

How This Can Help You

By examining Google’s approaches, this podcast episode offers the following takeaways:

  • There’s value in preserving old content for historical accuracy.
  • Consider updating old pages to indicate outdated advice or deprecated features.
  • Establish an auditing process for routinely evaluating aging content.

FAQ

What does “content decay” mean in the context of SEO?

Online content tends to become outdated or irrelevant over time. This can happen due to industry changes, shifts in user interests, or simply the passing of time.

In the context of SEO, outdated content impacts how useful and accurate the information is for users, which can negatively affect website traffic and search rankings.

To maintain a website’s credibility and performance in search results, SEO professionals need to identify and update or repurpose content that has become outdated.

Should all outdated content be removed from a website?

Not all old content needs to be deleted. It depends on what kind of content it is and why it was created. Content that shows past events, product changes, or uses outdated terms can be kept for historical accuracy.

Old content provides context and shows how a brand or industry has evolved over time. It’s important to consider value before removing, updating, or keeping old content.

What are the best practices to avoid user confusion with outdated content?

Website owners and SEO professionals should take the following steps to avoid confusing users with outdated content:

  • Show when content was published or note if the information has changed since it was created.
  • Add explanations around outdated references to explain why they may no longer be relevant.
  • Set up redirects to guide users to the most current information if the content has moved or been updated.

These strategies help people understand a page’s relevance and assist them in getting the most accurate information for their needs.


Featured Image: Screenshot from YouTube.com/GoogleSearchCentral, May 2024. 

Ex-Google CEO Implies AI Search Will Have No Links via @sejournal, @martinibuster

Eric Schmidt, former CEO of Google, said in an interview that Google’s mission is to organize the world’s information, not to provide blue links. Schmidt’s pragmatic statements seem to describe a future where websites are unnecessary and advertising is increasingly effective.

Answers Without Links Is A Good User Experience?

The ex-CEO’s prediction of the future of Google seems to contradict statements by Google’s current CEO, who asserts that search and the web will continue to coexist, as well as by Danny Sullivan, who has said many times that a healthy web ecosystem is important to Google.

Yet many of Google’s past actions indicate that Eric Schmidt’s prediction fits perfectly with how Google has ranked sites before.

The early days of the web were navigated not just by search engines but by curated web directories that served as starting places for Internet users to find information, hopping from link to link across a hyperlinked Internet. The idea was that hyperlinks were how users found information.

Google Search not only ranked webpages from web directories; Google itself hosted a version of DMOZ, an open web directory curated by thousands of volunteers, much like Wikipedia is maintained by volunteer editors today.

But a day came when Google stopped ranking directories, and the reason given was that it was a better user experience to show answers rather than links to pages containing more links (the discussion is likely archived somewhere on the WebmasterWorld forum, as it happened a long time ago).

Then there are Google’s answers for flight tracking, package tracking, stock quotes, the time, and weather, all of which contain zero links.

Example Of An Answer Without Links

Eric Schmidt’s assertion that Google will take advantage of AI to show answers fits into Google’s design principle that showing answers is a good user experience if it fully satisfies the query.

The only difference between the old days and now is that AI has (mostly) unlocked the ability to show answers without linking to any websites.

So it’s not far-fetched that Google may decide that showing answers alone is a good user experience; there is precedent for that approach.

AI Is Underhyped

Schmidt put forward the idea that AI is not overhyped but in fact is underhyped.

He observed:

“I hate to tell you but I think this stuff is underhyped not overhyped. Because the arrival of intelligence of a non-human form is really a big deal for the world.

It’s coming. It’s here. It’s about to happen. It happens in stages. …the reason I’m saying it’s underhyped is you’re seeing the future of reasoning, the future of human interaction, the future of research, the future of planning is being invented right now.

There’s something called infinite context windows, which means that you can — it’s like having an infinite short-term memory, which I certainly don’t have, where you can basically keep feeding it information and it keeps learning and changing.”

Eric Schmidt On The Future Of Search

The interviewer asked Schmidt about a future where AI answers questions without links to sources on the web.

The interviewer asked this question:

“In a world where the AI provides the answer, and doesn’t necessarily need to send you to 12 places where you’re going to go find it yourself… what happens to all of that?”

Eric Schmidt answered:

“It’s pretty important to understand that Google is not about blue links, it’s about organizing the world’s information. What better tool than the arrival of AI to do that better.

Do you think you can monetize that? You betcha.”

Will Answers Without Links Happen?

It has to be reiterated that Eric Schmidt (LinkedIn profile) has not been CEO at Google or Executive Chairman & Technical Advisor at Alphabet for four years now. His opinions may not reflect the current thinking within Google.

However, it’s not unreasonable to speculate that he may be saying out loud what those within Google cannot officially discuss.

The most solid information we have now is that Google Search will continue to have links but that Google (and others like Apple) are moving ahead with AI assistants on mobile devices that can answer questions and perform tasks.

Watch the Eric Schmidt interview here:

Google Launches New ‘Saved Comparisons’ Feature For Analytics via @sejournal, @MattGSouthern

Google announced a new tool for Analytics to streamline data comparisons.

The ‘saved comparisons’ feature allows you to save filtered user data segments for rapid side-by-side analysis.

Google states in an announcement:

“We’re launching saved comparisons to help you save time when comparing the user bases you care about.

Learn how you can do that without recreating the comparison every time!”

Google links to a help page that lists several benefits and use cases:

“Comparisons let you evaluate subsets of your data side by side. For example, you could compare data generated by Android devices to data generated by iOS devices.”

“In Google Analytics 4, comparisons take the place of segments in Universal Analytics.”

Saved Comparisons: How They Work

The new comparisons tool allows you to create customized filtered views of Google Analytics data based on dimensions like platform, country, traffic source, and custom audiences.

These dimensions can incorporate multiple conditions using logic operators.

For example, you could generate a comparison separating “Android OR iOS” traffic from web traffic. Or you could combine location data like “Country = Argentina OR Japan” with platform filters.
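Google hasn’t published how comparisons are represented internally, but the condition logic described above can be sketched in plain Python. This is a hypothetical model for illustration only, not the Analytics interface or API: each comparison maps a dimension to the values it accepts (OR within a dimension), and a row must satisfy every dimension condition (AND across dimensions).

```python
# Hypothetical model of how a saved comparison's conditions combine.
# Each analytics row is a dict of dimension values; a comparison maps
# a dimension name to the list of values it accepts.

def matches(row, comparison):
    """True if the row satisfies every dimension condition (AND),
    where each condition accepts any of its listed values (OR)."""
    return all(row.get(dim) in allowed for dim, allowed in comparison.items())

rows = [
    {"platform": "Android", "country": "Argentina"},
    {"platform": "iOS", "country": "Japan"},
    {"platform": "web", "country": "Japan"},
]

# "Platform = Android OR iOS" AND "Country = Argentina OR Japan"
comparison = {"platform": ["Android", "iOS"], "country": ["Argentina", "Japan"]}

segment = [r for r in rows if matches(r, comparison)]
# The web row is excluded; the two mobile rows remain in the segment.
```

Saving a comparison, in this mental model, simply means storing the `comparison` definition at the property level so it can be reapplied to any report without rebuilding the conditions.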

These customized comparison views can then be saved to the property level in Analytics.

Users with access can quickly apply saved comparisons to any report for efficient analysis without rebuilding filters.

Google’s documentation states:

“As an administrator or editor…you can save comparisons to your Google Analytics 4 property. Saved comparisons enable you and others with access to compare the user bases you care about without needing to recreate the comparisons each time.”

Rollout & Limitations

The saved comparisons feature is rolling out gradually. There’s a limit of 200 saved comparisons per property.

For more advanced filtering needs, such as sequences of user events, Google recommends creating a custom audience first and saving a comparison based on that audience definition.

Some reports may be incompatible if they don’t include the filtered dimensions used in a saved comparison. In that case, the documentation suggests choosing different dimensions or conditions for that report type.

Why SEJ Cares

The ability to create and apply saved comparisons addresses a time-consuming aspect of analytics work.

Analysts must view data through different lenses, segmenting by device, location, traffic source, etc. Manually recreating these filtered comparisons for each report can slow down production.

Any innovation streamlining common tasks is welcome in an arena where data teams are strapped for time.

How This Can Help You

Saved comparisons mean less time getting bogged down in filter recreation and more time for impactful analysis.

Here are a few key ways this could benefit your work:

  • Save time by avoiding constant recreation of filters for common comparisons (e.g. mobile vs desktop, traffic sources, geo locations).
  • Share saved comparisons with colleagues for consistent analysis views.
  • Switch between comprehensive views and isolated comparisons with a single click.
  • Break down conversions, engagement, audience origins, and more by your saved user segments.
  • Use thoughtfully combined conditions to surface targeted segments (e.g. paid traffic for a certain product/location).

The new saved comparisons in Google Analytics may seem like an incremental change. However, simplifying workflows and reducing time spent on mundane tasks can boost productivity in a big way.


Featured Image: wan wei/Shutterstock

LinkedIn Report: AI Skills Now Must-Have For Marketers via @sejournal, @MattGSouthern

A new report by Microsoft and LinkedIn reveals the rapid adoption of AI tools and skills in the marketing industry.

According to the 2024 Work Trend Index Annual Report, which surveyed over 31,000 people across 31 countries, marketing professionals who leverage AI enjoy a competitive advantage.

Employers recognize the efficiency gains AI capabilities provide in marketing roles and increasingly seek applicants with those skills.

Karim R. Lakhani, Chair of the Digital Data Design Institute at Harvard, states in the report:

“Marketers are harnessing the power of AI to work smarter, not just faster. It’s enabling them to focus on higher-value, creative work while automating more routine tasks.”

Here are some highlights from the report illustrating the need to develop an AI skill set to remain competitive.

AI Aptitude: The New Must-Have Skill for Marketers

The survey data reveals a strong preference among business leaders for candidates and employees with AI skills.

A majority, 66%, stated they wouldn’t consider hiring candidates lacking AI proficiency.

Further, 71% expressed a preference for less experienced job seekers with AI skills over more seasoned professionals without that expertise.

This inclination was pronounced in creative fields like marketing and design.

Michael Platt, a neuroscience professor at the Wharton School, states in the report:

“AI is redefining what it means to be a competitive marketer in today’s digital landscape. Professionals who can effectively integrate AI into their work are positioning themselves as invaluable assets to their organizations.”

The report indicates that early-career marketers who develop AI skills could benefit significantly.

77% of leaders reported that employees adept at leveraging AI would be trusted with greater responsibilities earlier in their careers than their peers without AI skills.

The AI Arms Race For Top Marketing Talent

Data from LinkedIn shows that job postings highlighting AI tools and applications have seen a 17% increase in application growth compared to those that don’t mention AI.

Additionally, 54% of early-career employees cited access to AI technologies as a key factor influencing their choice of employer.

Organizations that provide AI training and support for their marketing teams likely have an advantage in attracting top talent.

Why SEJ Cares

The widespread adoption of AI in marketing signifies a shift in the skills and capabilities necessary for succeeding in this rapidly evolving industry.

As AI transforms marketing approaches, professionals who fail to adapt risk being left behind.

The 2024 Work Trend Index Annual Report’s findings are relevant to marketing professionals at all levels. They demonstrate that AI proficiency is necessary for career advancement and job market competitiveness.

Additionally, the report highlights businesses’ role in fostering an AI-driven culture.

Companies investing in AI tools, training, and employee support will be better positioned to attract and retain top talent, drive innovation, and achieve better results.

Read the full report.

How This Can Help You

For marketing professionals to succeed in the AI era, the report suggests:

  • Prioritize developing AI skills through courses, workshops, training programs, and collaborating with AI practitioners to gain hands-on experience.
  • Embrace experimenting with new AI tools and techniques, integrating them into daily workflows to improve efficiency.
  • Share AI knowledge actively with colleagues to foster a culture of knowledge sharing and drive organizational AI adoption.
  • Highlight AI capabilities during job searches by demonstrating the successful use of AI to drive results in previous roles.
  • Choose employers committed to AI adoption that provide access to cutting-edge AI tools and support ongoing learning.

These recommendations can help you future-proof your career and advance in an increasingly competitive field.


Featured Image: eamesBot/Shutterstock