Google Lighthouse To Undergo Major Audit Overhaul: What To Know via @sejournal, @MattGSouthern

Google announced plans to revamp Lighthouse’s performance audits.

The new version will match the recently launched insights in Chrome DevTools’ Performance panel.

This shift will alter how performance data is organized and presented, impacting SEO professionals who utilize Lighthouse for website optimization.

Background: Combining Google’s Performance Tools

This update is part of Google’s effort to consolidate its various performance tools.

Barry Pollard from Google’s Chrome team explains:

“We’re updating the audits in Lighthouse to be based on the same Insights we recently launched in the Performance panel of Chrome DevTools. This will help align the two tools but will be a breaking change.”

What’s Changing: Renamed, Combined, and Removed Audits

The upcoming changes fall into three main categories:

1. Audit Merging and Renaming

Many existing Lighthouse audits will get new names and be merged. For example:

  • Three separate audits (“layout shifts,” “non-composited animations,” and “unsized images”) will be combined into a single “cls culprits insight” audit.
  • Several image optimization audits will combine into a single “image-delivery-insight” audit.

This merging means you can no longer turn off individual parts of these combined audits. You’ll need to turn the entire insight audit on or off.

Note that this is not a comprehensive list. For the complete list of renamed and consolidated audits, refer to Google’s announcement.
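For teams that configure Lighthouse directly, turning one of the merged insight audits off would look something like the sketch below. This is a minimal example assuming the standard Lighthouse config format and its skipAudits setting; the “image-delivery-insight” ID is taken from the announcement above and may not match the final audit IDs, so verify against Google’s published list.

```typescript
// Minimal sketch of a Lighthouse config that disables one combined insight
// audit. Compile to JS and pass the file to the CLI with --config-path.
// Assumption: "image-delivery-insight" is the final audit ID (taken from the
// announcement above); confirm against Google's list before relying on it.
const config = {
  extends: "lighthouse:default",
  settings: {
    // Skips the entire merged insight; its individual sub-checks can no
    // longer be toggled one by one.
    skipAudits: ["image-delivery-insight"],
  },
};

export default config;
```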

2. Audit Removals

Several audits will be removed entirely, including:

  • First Meaningful Paint (replaced by Largest Contentful Paint)
  • No Document Write (rarely an issue in modern scripts)
  • Offscreen Images (browsers already handle these well)
  • Uses Passive Event Listeners (rarely an issue today)
  • Uses Rel Preload (too often recommended when not needed)
  • Third-Party Facades (limited usefulness and potential concerns)

3. New Organization

The new insight audits will appear under an “Insights” heading in reports. Unchanged audits will stay under the “Diagnostics” heading.

Timeline for Changes

Google will roll out these changes in stages:

  • Now: The new insights are already available in the Lighthouse JSON output for early adopters
  • May/June 2025 (Chrome 137): Lighthouse 12.6 will include a toggle to switch between old and new views
  • June: Lighthouse 12.7 will use the new insight audits by default
  • October: Lighthouse 13 will remove the old audit data completely

Pollard confirms:

“This has now been released to PageSpeed Insights too and will be included in Chrome 137 in about a month.”

How To Prepare

Here’s what to do to get ready:

  1. Use Lighthouse 12.6.0’s toggle feature to see how future reports will look
  2. If you use specific audit names in reports or analysis, start updating them
  3. Update any systems that use Lighthouse data
  4. Explain to clients and stakeholders why performance reports will look different later this year

Pollard advises:

“Other Lighthouse tooling (for example if you’re using this in your CI) can also start migrating to these insights-based audits — the audits are available in the JSON outputs now.”
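As a starting point, a script along these lines could flag which insight audits appear in a report generated with `lighthouse <url> --output=json`. It’s a minimal sketch that assumes the new audit IDs share an “-insight” suffix, as the example names in Google’s announcement suggest; adjust the check once the full list of renamed audits is published.

```typescript
// Minimal sketch: list the insight-style audits found in a Lighthouse JSON
// report (generated with: lighthouse <url> --output=json --output-path=report.json).
// Assumption: the new audit IDs end in "-insight", per the examples above.
import { readFileSync } from "node:fs";

interface LighthouseAudit {
  id: string;
  title: string;
  score: number | null;
  scoreDisplayMode: string;
}

const report = JSON.parse(readFileSync("report.json", "utf8"));
const audits: Record<string, LighthouseAudit> = report.audits ?? {};

const insightAudits = Object.values(audits).filter((audit) =>
  audit.id.endsWith("-insight")
);

for (const audit of insightAudits) {
  console.log(`${audit.id}: score=${audit.score} (${audit.scoreDisplayMode})`);
}
```

A check like this can run in CI to confirm a toolchain is already reading the new audit IDs before Lighthouse 13 removes the old data.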

What This Means

Google continues to emphasize page experience and Core Web Vitals in its ranking algorithm. The underlying metrics remain unchanged, but the reorganization will impact how you identify and address performance issues.

The merged audits may provide a more comprehensive overview of related performance issues. This could make it easier to spot patterns and prioritize fixes. However, teams that have built custom tools around specific Lighthouse audits will need to adapt.

Looking Ahead

Google will publish documentation about the new insights on developer.chrome.com before the October change. They’ll keep the older documentation available for users of previous Lighthouse versions.

If you have concerns about these changes, Google has opened a GitHub discussion to gather feedback and answer questions.


Featured Image: brucephotography103/Shutterstock

Google Is Launching Search Central Deep Dive Events via @sejournal, @martinibuster

Google announced via their Search Off the Record podcast that they are launching a multi-day conference series to enable in-depth workshops on SEO topics that matter. The series is launching as a test pilot in the Asia-Pacific region, then expanding from there.

Google’s Gary Illyes said that he’s been thinking about doing a multi-day event for the past year because he believes the one-day format allows only relatively shallow coverage of important topics. He said they’re constrained to 25 minutes per topic, which means they end up speeding through the discussion without being able to “contextualize” it, to show how it’s relevant for people.

Gary explained:

“One of my pet peeves with Search Central Live is that we have these well-rehearsed talks that speed through one topic, and then you do, with that information, whatever you want. Basically, we don’t have time, like we have 25 minutes, maybe, for a talk. …how do you link the topic that you talked about to something tangible? Like, for example, if you are talking about crawling, then how do you show people how that looks like in Search Console or in server logs or whatever, if you don’t have the time, if you only have 25 minutes or even less?”

A Googler named Cherry commented:

“With longer time, of course, we can talk about more things, deeper things. We can have more time for networking, interactive… or even practical things that usually we might not have.”

Topics To Be Covered

The Googlers indicated that they haven’t settled on the topics they’ll cover, or whether the events will focus on the technical side of SEO, the marketing side, or both. User feedback during the signup process may influence the sessions they choose to present so that they can keep each event relevant to what users in that particular geographic area are most concerned about.

Location Of Deep Dive Events

In a sign that this event is still in the planning stage, the Googlers said that they haven’t chosen where the first event will be held, only that they’re looking to kick them off in the Asia Pacific (APAC) region, mentioning that Bali is on the list of places under consideration. Budget is one of the considerations.

Search Central Live Global

Lastly, they announced that they will still be presenting Search Central Live but will be expanding it to more locations globally, including possibly to the Baltics.

Listen To Search Off The Record Episode 90

Featured Image by Shutterstock/fongleon356

Google AI Mode Exits Waitlist, Now Available To All US Users via @sejournal, @MattGSouthern

Google has removed the waitlist for AI Mode in Search. This Gemini-powered search tool is now available to all US users.

The update introduces new features, including visual cards for places and products, shopping integration, and a history panel for desktop users.

This growth aligns with Google’s recent earnings reports, which indicate that investments in AI are yielding financial returns.

AI Mode Now Available to All US Users

Previously, AI Mode was only available to participants in Google Labs. Now, anyone in the United States can access it.

Google reports that early users provided “incredibly positive feedback” about the tool.

The announcement reads:

“Millions of people are using AI Mode in Labs to search in new ways. They’re asking longer, harder questions, using follow-ups to dig deeper, and discovering new websites and businesses.”

New Visual Cards for Places and Products

The update adds visual cards to AI Mode results. These cards help users take action after getting information.

For local businesses, cards show:

  • Ratings and reviews
  • Opening hours
  • How busy a place is right now
  • Quick buttons to call or get directions

Here’s an example of a local business query in Google’s AI Mode:

Image Credit: Google

For products, cards include:

  • Current prices and deals
  • Product images
  • Shipping details
  • Local store availability

Google’s announcement reads:

“This is made possible by Google’s trusted and up-to-date info about local businesses, and our Shopping Graph — with over 45 billion product listings.”

It’s worth noting this expansion comes days after OpenAI announced an upgrade to ChatGPT’s shopping capabilities.

History Panel for Continuous Research

Google has added a new left-side panel on desktop that saves your past AI Mode searches. This helps with ongoing research projects. You can:

  • Return to previous search topics
  • Pick up where you left off
  • Ask follow-up questions
  • Take the next steps based on what you found earlier

Here’s an example of what it looks like:

Image Credit: Google

Limited Test Outside of Labs

Google plans to test AI Mode beyond the Labs environment. The company says:

“In the coming weeks, a small percentage of people in the U.S. will see the AI Mode tab in Search.”

This indicates that Google is moving cautiously toward broader integration.

AI Mode Capabilities

Google’s AI Mode utilizes a technology called “query fan-out.” This means it runs multiple searches at once across different topics and sources. It then combines this information into a comprehensive answer, providing links to sources.
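To make the idea concrete, here is a toy sketch of a fan-out pattern: several sub-queries run concurrently, and the results are merged and de-duplicated before being summarized. It only illustrates the general technique; Google hasn’t published how AI Mode implements query fan-out, and the searchWeb function and result shape here are hypothetical placeholders.

```typescript
// Illustrative fan-out sketch only; not Google's implementation.
interface Result {
  url: string;
  snippet: string;
}

// Hypothetical stand-in for a real search backend call.
async function searchWeb(query: string): Promise<Result[]> {
  return [{ url: `https://example.com/${encodeURIComponent(query)}`, snippet: query }];
}

async function fanOut(subQueries: string[]): Promise<Result[]> {
  // Issue every sub-query at the same time instead of one after another.
  const resultSets = await Promise.all(subQueries.map((q) => searchWeb(q)));

  // Merge and de-duplicate by URL so each source appears only once.
  const merged = new Map<string, Result>();
  for (const results of resultSets) {
    for (const result of results) {
      if (!merged.has(result.url)) merged.set(result.url, result);
    }
  }
  return [...merged.values()];
}

// Example: one broad question broken into narrower sub-queries.
fanOut([
  "best running shoes for flat feet",
  "running shoe arch support reviews",
  "flat feet overpronation shoes",
]).then((results) => console.log(results.length, "unique sources"));
```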

The system also supports image search. You can upload pictures and ask questions about them. It combines Google Lens, which identifies objects, with Gemini’s reasoning abilities to understand and explain what’s in the image.

AI Investment Reflected in Earnings

The expansion of AI Mode follows strong financial results from Google.

Despite concerns that AI might harm traditional search, Google Search revenue increased 10% to $50.7 billion in Q1 2025. This suggests AI is helping, not hurting, their core business.

Google plans to invest $75 billion in capital improvements in 2025, including infrastructure to support its AI features.

In February, CEO Sundar Pichai announced:

  • 11 new Cloud regions and data centers worldwide.
  • 7 new undersea cable projects to improve global connectivity.

Alphabet’s spending on infrastructure jumped 43% to $17.2 billion in Q1 2025.

Pichai claims that modern data centers now deliver four times more computing power using the same amount of energy.

For marketers, this financial context matters. Google’s investment in AI search isn’t just a tech experiment. It’s a core business strategy that’s already showing positive returns.

As these AI-powered search experiences continue to grow, marketing strategies must evolve to remain visible.

What This Means for Digital Marketers

For SEO and marketing professionals, these updates signal the following trends:

  • Visual content is becoming increasingly important as Google improves its ability to understand and display images in search results.
  • Local SEO remains critical, with business details appearing directly in AI Mode responses.
  • As AI Mode pulls from Google’s Shopping Graph, product data feeds must be accurate and complete.
  • Long-form content addressing complex questions may become more valuable, as AI Mode is better equipped to handle longer, more nuanced queries.
  • Google’s success with AI search, resulting in 10% revenue growth in Q1 2025, indicates that these features will continue to expand.

Availability

To access AI Mode, you need:

  • To be in the United States
  • To be at least 18 years old
  • The latest Google app or Chrome browser
  • Search history turned on

You can access AI Mode through google.com/aimode, the Google.com homepage (tap AI Mode below the search bar), or the Google app.

Reimagining EEAT To Drive Higher Sales And Search Visibility via @sejournal, @martinibuster

The SEO Charity podcast recently discussed a different way to think about EEAT, one that focuses on activities that lead to external signals Google may associate with the underlying concepts of EEAT (expertise, experience, authoritativeness, and trustworthiness). Google’s John Mueller recently said that EEAT is not something you can add to a site, and most of what was discussed on the show lines up perfectly with that reality.

The podcast, hosted by Olesia Korobka and Anton Shulke, featured Amanda Walls (LinkedIn profile), founder of Cedarwood Digital in Manchester, UK.

Aristotle And SEO

Amanda introduced the concept of applying Aristotle’s principles of ethos, pathos, and logos to SEO strategy. These are three modes of persuasion for winning over site visitors and potential customers:

  1. Credibility (ethos)
  2. Emotional appeal (pathos)
  3. Logical reasoning (logos)

Amanda explains these concepts in more depth, but those three principles form the basis of her approach: creating the circumstances that lead to positive external signals, the kind that can be correlated with concepts like expertise, experience, authoritativeness, and trustworthiness.

Why It Matters for SEO

Amanda says that SEO is ultimately about driving leads and conversions, not just rankings, and I agree with that 100%. The history of SEO is littered with gurus crowing about all the traffic they gained for clients, but they never talk about the part that really matters: sales and leads.

Link building historically falls into that trap, where both the client and the link builder focus on how many links are acquired each month and look to traffic as evidence of success. But really, as Amanda points out, everything a good SEO does should be focused on increasing sales. Nothing else matters.

Amanda explained:

“SEO is more than just rankings, it’s about conversion. It’s about business return. It’s about getting that success, those leads, those sales that we need… Bringing people to a website ….means nothing if they don’t convert. …we don’t just want to bring people to the website, we want them to engage and love your brand and have a really, really good reason to go through and fulfill the conversion journey.”

Reputation Management

Amanda recommends focusing on managing the business’s reputation, such as in reviews, interviews, and what’s written online about the brand.

She cites the following statistics:

  • 87% of consumers will back out of a purchase decision if they read something negative about the brand.
  • 81% of consumers do extensive research before a purchase, spending as long as 79 days on it.

Amanda prescribes findability, credibility, and persuasion as the ingredients for successful search optimization:

“We’re working on SEO to help people find us, and then most importantly, we are convincing them or we’re persuading them to actually go and purchase our product…”

Monitor Off-Site Signals

Amanda recommends regularly researching your brand to uncover potential issues, monitor online user sentiment, and assess media coverage, because poor off-site sentiment can push users out of the conversion funnel.

Manage On-Site Signals

Amanda also recommends using the About Us page to share relatable stories that help users develop genuine positive feelings for the brand, using the phrase “emotional appeal” to describe the experience users should get from an About Us page. She says this can be as simple as telling potential customers about the business.

User-Generated Content And Authenticity

Many of the fastest-growing businesses on the Internet cultivate high-quality user-generated content. Encouraging customers to post reviews and images helps build confidence in products.

Amanda explains:

“And then also from a pathos perspective, you know, really getting that kind of user generated content, getting people to connect… because fundamentally humans, they buy from humans and the more human and the more emotional that we can be in our sales process, the more likely that we are to get that buy-in and that connection that we need to actually get across to our audience.”

Pitching To Journalists

This last part, pitching story ideas to journalists, is something that link building companies consistently get wrong. I know because I get approached by them all the time, and their approach is always the same: too much focus on links and not enough on understanding my audience.

I specialized in link building back in the early days of SEO (early 2000s). I was even the moderator of the link building forum at WebmasterWorld. Although I don’t do link building anymore, I have a vast, vast amount of experience persuading publishers to give my clients a link.

My opinion is that PR to journalists should be approached strictly for brand exposure. Don’t make links the goal.

Focus instead on building positive stories with journalists and let them write those articles with or without adding a link; let them decide. What will happen is that consumers will go out and type your business’s name into Google, and that’s a strong, strong signal. I prefer thousands of consumers typing my website’s name into Google over a handful of links, every time, all day long.

I strongly agree with what Amanda says about understanding a journalist’s audience:

“92% of journalists say that understanding their audience is crucial for them to consider a story pitch.”

Understanding the audience is super important. I’ll go even deeper and recommend understanding what motivates the audience. Focus on the reasons why a journalist’s readers will click an article title that’s displayed on Google News. Once you understand that part, I can practically guarantee that PR outreach approval rates will skyrocket.

Takeaway

The SEO Charity podcast episode featuring Amanda Walls introduces a novel way to build signals associated with Google’s EEAT (expertise, experience, authoritativeness, trustworthiness) by focusing on credibility, emotion, and logic in content strategy. Walls emphasizes using Aristotle’s persuasive principles to influence reputation, brand perception, and conversion, encouraging SEO strategies focused on meaningful business outcomes like leads and sales, with better search visibility that supports those ends.

Watch the SEO Charity episode on EEAT:

Reimagining E-E-A-T with Amanda Walls

Featured Image by Shutterstock/Ollyy

WordPress Jubilee Of Forgiveness Continues via @sejournal, @martinibuster

Last week, WordPress declared a “jubilee,” announcing that it is unblocking all community members who were previously blocked. The official WordPress X (formerly Twitter) account posted a reminder that the unblocking is still ongoing.

According to the latest post:

“We’re clearing out all previous human blocks to create a more open and collaborative environment. While community and directory guidelines remain, consider any old blocks to be bugs that are on their way out.”

A similar post on the official WordPress site echoed the post on X:

“As I said, we’re dropping all the human blocks. Community guidelines, directory guidelines, and such will need to be followed going forward, but whatever blocks were in place before are now cleared. It may take a few days, but any pre-existing blocks are considered bugs to be fixed.”

WordPress appears to be using the word Jubilee in the sense of the Jewish and biblical tradition of a year of forgiveness.

The part about “Dropping all the human blocks” is similar to the Jewish jubilee in terms of forgiveness.

Moving forward, all pre-existing blocks will be treated as “bugs” to be fixed. Anyone who is unblocked, along with those who were never blocked, will still be subject to being banned should they fail to abide by WordPress community guidelines.

The post on X received a handful of responses.

Read the latest post on X:

Featured Image by Shutterstock/Ollyy

Google’s Updated Raters Guidelines Refines Concept Of Low Quality via @sejournal, @martinibuster

Google’s Search Quality Rater Guidelines were updated a few months ago, and several of the changes closely track the talking points shared by Googlers at the 2025 Search Central Live events. Among the most consequential updates are those to the sections defining the lowest quality pages, which more clearly reflect the kinds of sites Google wants to exclude from the search results.

Section 4.0 Lowest Quality Pages

Google added a new definition of the Lowest Rating to the Lowest Quality Pages section. While Google has always been concerned with removing low-quality sites from the search results, this change to the raters guidelines likely reflects an emphasis on weeding out a specific kind of low-quality website.

The new guideline focuses on identifying the publisher’s motives for publishing the content.

The previous definition said:

“The Lowest rating is required if the page has a harmful purpose, or if it is designed to deceive people about its true purpose or who is responsible for the content on the page.”

The new version keeps that sentence but adds one that encourages the quality rater to consider the underlying motives of the publisher responsible for the web page. The guidance asks raters to consider how the page benefits a site visitor and to judge whether its purpose is entirely to benefit the publisher.

The addition to this section reads:

“The Lowest rating is required if the page is created to benefit the owner of the website (e.g. to make money) with very little or no attempt to benefit website visitors or otherwise serve a beneficial purpose.”

There’s nothing wrong with being motivated to earn an income from a website. What Google is looking at is whether the content serves only that purpose or whether it also benefits the user.

Focus On Effort

The next change is focused on identifying how much effort was put into creating the site. This doesn’t mean that publishers must now document how much time and effort went into creating the content. This section is simply about looking for evidence that the content is indistinguishable from content on other sites and offers no clear advantage over what is already found elsewhere on the Internet.

This part about the main content (MC) was essentially rewritten:

“● The MC is copied, auto-generated, or otherwise created without adequate effort.”

The new version has more nuance about the main content (MC):

“● The MC is created with little to no effort, has little to no originality and the MC adds no value compared to similar pages on the web”

Three things to unpack there:

  1. Content created with little to no effort
  2. Contains little to no originality
  3. Main content adds no additional value

Publishers who focus on keeping up with competitors should be careful that they’re not simply creating the same thing as their competitors. Saying it’s not the same thing because it covers the same topic, only better, doesn’t change the fact that it’s the same thing. Even if the content is “ten times better,” it’s still basically the same thing as the competitor’s content, only ten times more of it.

A Word About Content Gap Analysis

Some people are going to lose their minds about what I’m going to say about this, but keep an open mind.

There is a popular SEO process called Content Gap Analysis. It’s about reviewing competitors to identify topics they write about that are missing from the client’s site, then copying those topics to fill the content gap.

That is precisely the kind of thing that leads to unoriginality and content that is indistinguishable from everything else on the Internet. It’s the number one reason I would never use a software program that scrapes top-ranked sites and suggests topics based on what the competitors are publishing. It results in virtually indistinguishable content and pure unoriginality.

Who wants to jump from one site to another and read the exact same recipes, even if one version has more images, graphs, and videos? Copying a competitor’s content “but doing it better” is not original.

Scraping Google’s PAAs (People Also Asked) just like everyone else does not result in original content. It results in content that’s exactly the same as that of everyone else scraping PAAs.

While the practice of content gap analysis is about writing about the same thing, only better, it’s still unoriginal. Saying it’s better doesn’t change the fact that it’s the same thing.

Lack of originality is a huge issue with Internet content and it’s something that Google’s Danny Sullivan discussed extensively at the recent Google Search Central Live in New York City.

Instead of looking for information gaps, it’s better to review your competitor’s weaknesses. Then look at their strengths. Then compare that to your own weaknesses and strengths.

A competitor’s weakness can become your strength. This is especially valuable information when competing against a bigger and more powerful competitor.

Takeaways

1. Google’s Emphasis on Motive-Based Quality Judgments

  • Quality raters are now encouraged to judge not just content, but the intent behind it.
  • Pages created purely for monetization, with no benefit to users, should be rated lowest.
  • This may signal Google’s intent to refine its ability to weed out low-quality content based on the user experience.

2. Effort and Originality Are Now Central Quality Signals

  • Low-effort or unoriginal content is explicitly called out as justification for the lowest rating.
  • This may signal that Google’s algorithms will increasingly focus on surfacing content with higher levels of originality.
  • Content that doesn’t add distinctive value over competitors may struggle in the search results.

3. Google’s Raters Guidelines Reflect Public Messaging

  • Changes to the Guidelines mirror talking points in recent Search Central Live events.
  • This suggests that Google’s algorithms may become more precise about things like originality, added value, and the effort put into creating content.
  • This means publishers should (in my opinion) consider ways to make their sites more original than other sites, to compete by differentiation.

Google updated its Quality Rater Guidelines to draw a sharper line between content that helps users and content that only helps publishers. Pages created with little effort, no originality, or no user benefit are now listed as examples of the lowest quality, even if they seem more complete than competing pages.

Google’s Danny Sullivan cited travel sites that all have the same sidebar introducing the smiling site author, along with other hallmarks of the genre, as an example of how sites become indistinguishable from each other.

The reason publishers do that is that they see what Google is ranking and assume that’s what Google wants. In my experience, that’s not the case. In my opinion, it may be useful to think about what you can do to make a site more original.

Download the latest version of Google’s Search Quality Raters Guidelines here (PDF).

Featured Image by Shutterstock/Kues

Google Answers Why Landing Page Ranks For An E-Commerce Query via @sejournal, @martinibuster

Google’s John Mueller answered a question on Bluesky about why an e-commerce page with minimal content is ranking, illustrating that sometimes optimized content isn’t enough.

E-Commerce Search Results

A person posted their concerns about an e-commerce site that was ranking in the search results with barely any content. In fact, the domain that was ranking redirects to another domain. On the face of it, it appears that something is not right. Why would Google rank a landing page about a domain name transfer, right?

Why would Google rank what is essentially a landing page with virtually zero content for a redirected domain?

Why A Landing Page Ranks

The company with the landing page had acquired another company and subsequently joined the two domains. There was nothing wrong or spammy going on; one business bought another, which happens every day.

The person asking the question dropped a URL and a screenshot of the landing page and asked:

“How does Google think this would be the best result and also, do you think this is a relevant result for users?”

Google’s John Mueller answered:

“It looks like a normal ecommerce site to me. They could have handled the site-migration a bit more gracefully (and are probably losing a lot of “SEO value” by doing this instead of a real migration), but it doesn’t seem terrible for users.”

Site Migration

Mueller’s comment about the site migration was expanded further.

He posted:

“Our guidance for site migrations is at https://developers.google.com/search/docs/crawling-indexing/site-move-with-url-changes . What they’re doing is a “soft or crypto redirect”, and they’re doing it “N:1″ (meaning all old pages go there). Both of these make transfering information about the old site hard / impossible.”
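For contrast, a conventional migration maps each old URL to its closest new equivalent with a permanent redirect, rather than sending everything to one landing page. The sketch below shows that 1:1 pattern using Express purely for illustration; the paths and domains are made up, and the same mapping could live in server configuration instead.

```typescript
// Minimal sketch of 1:1 permanent redirects for a site migration, as opposed
// to an "N:1" setup that points every old URL at a single landing page.
// Express and the example paths/domains are illustrative assumptions only.
import express from "express";

const app = express();

// Each old URL maps to its matching page on the new domain.
const redirectMap: Record<string, string> = {
  "/old-shop/widgets": "https://www.new-example.com/shop/widgets",
  "/old-shop/gadgets": "https://www.new-example.com/shop/gadgets",
  "/about-old-brand": "https://www.new-example.com/about",
};

app.use((req, res, next) => {
  const target = redirectMap[req.path];
  if (target) {
    // 301 signals a permanent move, which lets crawlers carry signals forward.
    res.redirect(301, target);
    return;
  }
  next();
});

app.listen(3000);
```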

Sometimes Google ranks pages that seem like they don’t belong. But sometimes those rankings make sense when looked at from a different perspective, particularly the perspective of what’s good and makes sense for the user. Rankings change all the time, and the rankings for that page could fade after a while. But waiting for a competitor to drop away isn’t really a good SEO strategy. Google’s Danny Sullivan had some good advice about differentiating a site for better rankings.

Channel Reporting Is Coming To Performance Max Campaigns via @sejournal, @brookeosmundson

Google announced substantial upgrades to its Performance Max campaigns today.

In their announcement, they introduced long-anticipated reporting features that will provide advertisers with much-needed visibility into how their campaigns perform across different Google surfaces.

These updates include new channel-level reporting, full search terms data, and expanded asset performance metrics.

The goal?

To help marketers better understand, evaluate, and optimize their Performance Max campaigns.

The rollout is expected to begin with an open beta for channel performance reporting in the coming weeks.

For advertisers managing budget and strategy across a mix of formats and inventory, these reporting enhancements mark a meaningful step forward in understanding where results are coming from and how to take informed action.

Advertiser Feedback is Directly Shaping PMax’s Direction

According to Google, Performance Max is now used by over one million advertisers.

In 2024 alone, Google implemented more than 90 improvements to Performance Max, leading to measurable gains in both conversions and conversion value.

But alongside performance, advertisers have consistently asked for better transparency and reporting.

Google’s latest announcements make clear that advertiser feedback has played a central role in shaping these enhancements.

The goal is to deliver clearer insights, support decision-making, and increase control—without sacrificing the benefits of automation.

Channel Performance Reporting Is Coming To Performance Max

Channel-level reporting is the most significant update in this release.

For the first time, advertisers will be able to view results by channel: Search, YouTube, Display, Discover, Gmail, Maps, and Search partners.

The new “Channel performance” page will show:

  • Visual breakdowns of performance by surface
  • Campaign-level metrics for each channel, including clicks, conversions, and spend
  • A downloadable table with key performance data
  • Diagnostics to surface missed opportunities or setup issues

You’ll be able to find the Channel Performance report in the “Insights & reports” tab on the left-hand side of Google Ads. See the example below for how the report will function.

For example, if Maps isn’t generating traffic, diagnostics might suggest adding a location asset. Or if YouTube is outperforming, advertisers can shift their focus to high-impact video creatives.

The ability to view spend and conversion value by channel adds clarity that Performance Max has previously lacked.

Search Terms Reporting Reaches (Almost) Full Visibility

Another major enhancement is the addition of full search terms reporting.

Advertisers will now be able to see the actual queries driving performance – similar to what’s available in standard Search and Shopping campaigns.

With this rollout, marketers can:

  • Identify top-performing search terms
  • Create tailored assets around those queries
  • Apply negative keywords or brand exclusions when needed

For agencies managing multiple clients or accounts at scale, this change improves daily workflow efficiency.

Rather than relying solely on limited theme-level insights or making assumptions about what’s driving performance, teams can now analyze exact queries.

This supports better keyword refinement, more accurate exclusions, and tighter alignment between campaign objectives and user behavior, all within the familiar framework of Search best practices.

Privacy thresholds will still apply, but the reporting experience will be much more detailed than before.

At launch, this feature will be available in the Google Ads UI only, with API support expected later.

For marketers focused on search intent, this change makes Performance Max a more actionable channel.

More Granular Asset Metrics Across Campaign Types

Asset reporting is also expanding. In addition to conversion data, advertisers will now see:

  • Impressions
  • Clicks
  • Cost
  • Conversion Value

Example of expanded asset-level reporting in Performance Max.
Image Credit: Google, April 2025

These new metrics will apply across Performance Max, Search, and Display. This allows advertisers to evaluate creative performance at a deeper level.

Want to know if your video is driving more conversions than your static image? Now you can. Want to see if your headline gets more clicks than your call-to-action? The data is there.

These insights support better creative testing and stronger Ad Strength scores, all based on performance—not assumptions.

Built-In Diagnostics Help Spot Gaps and Missed Opportunities

Google is also adding diagnostics that flag potential performance issues. These insights will live within the Channel performance page and highlight areas for improvement.

For example:

  • If you’re not showing on Maps, diagnostics might suggest adding a location feed or location asset
  • If Search delivery is limited, landing page relevance could be the cause
Image credit: Google, April 2025

This feature won’t give full control over where ads appear, but it does provide better visibility into what’s working and what’s not.

Channel exclusions are still not available in Performance Max, but Google confirmed it’s exploring future control options. For now, diagnostics serve as a step toward more informed decision-making.

Why These Updates Matter For Advertisers

This round of updates helps address a long-standing challenge with Performance Max: the lack of visibility.

Advertisers have embraced the campaign type for its scale and automation, but often struggled to understand the “how” behind performance.

With these new features, advertisers will gain:

  • Channel-level transparency
  • Deeper search intent insights
  • Clearer creative performance metrics
  • Actionable recommendations to fix delivery issues

These aren’t just incremental changes. They reshape how marketers can evaluate and optimize PMax.

The updates make it easier to align creative strategy, understand channel contribution, and refine search targeting.

It’s also clear that Google is listening. The inclusion of diagnostics, downloadable tables, and more detailed reporting shows a strong response to real-world feedback.

These updates also signal a broader industry shift toward hybrid automation models, where AI handles scale but humans still guide strategy with the help of robust data.

As marketers continue to seek clarity on campaign performance, updates like these help reinforce trust in automated systems by making them easier to measure and manage.

More details are expected at Google Marketing Live. But this release signals a new phase for Performance Max: one that balances automation with greater accountability and insight.

SEO Rockstar Names 7 SEO Fundamentals To Win In AI Search via @sejournal, @martinibuster

Todd Friesen, one of the most experienced digital marketers in our industry, recently posted on LinkedIn that the core fundamentals that apply to traditional search engines work exactly the same for AI search optimization. His post quickly received dozens of comments and more than a hundred likes, indicating that he’s not the only one who believes there’s no need to give SEO another name.

Who Is Todd Friesen?

Todd has had a long career in SEO, formerly of Salesforce and other top agencies and businesses. Like me, he was a moderator at the old WebmasterWorld Forums, only he’s been doing SEO for even longer. Although he’s younger than I am, I totally consider him my elder in the SEO business. Todd Friesen, along with Greg Boser, was an SEO podcasting pioneer with their SEO Rockstars show.

AEO – Answer Engine Optimization

There’s been a race to give a name to optimizing web content for AI search engines, with few details on why it merits a new name.

We find ourselves today with five names for the exact same thing:

  1. AEO (Answer Engine Optimization)
  2. AIO (AI Optimization)
  3. CEO (Chat Engine Optimization)
  4. GEO (Generative Engine Optimization)
  5. LMO (Language Model Optimization)

Many people today agree that optimizing for an AI search engine is fundamentally the same as optimizing for a traditional search engine. There’s little case for a new name when even an AI search engine like Perplexity uses a version of Google’s PageRank algorithm for ranking authoritative websites.

Todd Friesen’s post on LinkedIn made the case that optimizing for AI search engines is essentially the same thing as SEO:

“It is basically fundamental SEO and fundamental brand building. Can we stop over complicating it?

– proper code (html, schema and all that)
– fast and responsive site
– good content
– keyword research (yes, we still do this)
– coordination with brand marketing
– build some links
– analytics and reporting (focus on converting traffic)
– rinse and repeat”

SEO For AI = The Same SEO Fundamentals

Todd Friesen is right. While there’s room for quibbling about the details, the overall framework for SEO, regardless of whether it’s for an AI search engine or not, can be reduced to these seven fundamentals of optimization.

Digital Marketer Rosy Callejas (LinkedIn Profile) agreed that there were too many names for the same thing:

“Too many names! SEO vs AEO vs GEO”

Kevin Doory (LinkedIn Profile), Director of SEO at Razorfish, commented:

“The ones that talk about what they do, can change the names to whatever they want. The rest of us will just do the darn things.”

SEO Consultant Don Rhoades (LinkedIn Profile) agreed:

“Still SEO after all these (failed) attempts to distance from it by “thought leaders” – eg: inbound marketing, growth hacking, and whatever other nomenclature du jour they decide to cook up next.”

Ryan Jones (LinkedIn Profile), Senior Vice President, SEO at Razorfish (and founder of SERPrecon.com) commented on the ridiculousness of the GEO name: 

“GEO is a terrible name”

Pushback On AEO Elsewhere

A discussion on Bluesky saw Google’s John Mueller commenting on the motivations for creating hype.

Preeti Gupta‬ posted her opinion on Bluesky:

“It is absolutely wild to me that in this debate of GEO/AEO and SEO, everyone is saying that building a brand is not a requisite for SEO, but it is important for GEO/AEO.

Like bro, chill. This AI stuff didn’t invent the need for building a brand. It existed way before it. smh.”

Google’s John Mueller responded:

“You don’t build an audience online by being reasonable, and you don’t sell new things / services by saying the current status is sufficient.”

What Do You Think?

What’s your opinion? Is SEO for AI fundamentally the same as for regular search engines?

Google & Apple Maps: 20% of Local Searches Now Start Here via @sejournal, @MattGSouthern

New research shows that map platforms have become key search engines for local businesses.

One in five consumers now searches directly in map apps instead of traditional search engines.

BrightLocal’s Consumer Search Behavior study found that Google, Apple, and Bing Maps make up 20% of all local searches.

This is a big part of search traffic that many marketers might be missing in their local SEO plans.

The Rise of Map-First Search Behavior

The research found that 15% of consumers use Google Maps as their first choice for local searches. This makes it the second most popular platform after Google Search (45%).

The study reads:

“Another significant finding is the prominence of Google Maps in local search. 15% of consumers said they would use Google Maps as their first port of call, meaning they are searching local terms—which could be brand or non-brand terms—directly in Google Maps.”

It continues:

“Google Maps, Apple Maps, and Bing Maps combined make up 20% of default local search platforms. This reinforces the importance of ensuring you’re optimizing for both map packs and organic search listings. You might have a strong presence in the SERPs, but if consumers are looking for businesses like yours on a map search, you need to ensure you’re going to be found there, too.”

This change shows that consumers favor visual, location-based searches for local businesses, especially when making spontaneous decisions.

Generational Differences in Map Usage

Different age groups use map platforms at different rates:

  • 18% of Gen Z consumers use Google Maps as their primary local search tool, three percentage points higher than the average.
  • 21% of Millennials use Google Maps as their default local search platform.
  • 5% of Millennials prefer Apple Maps as their primary local search option.

Younger consumers appear to be more comfortable using maps to discover local businesses, which may be because they’re used to doing everything on mobile devices.

What Consumers Look for in Map Results

The study found key information that drives consumer decisions when using maps:

  • 85% of consumers say contact information and opening hours are “important” or “very important”
  • 46% rate business contact information as “very important”
  • Nearly half (49%) of consumers “often” or “always” plan their route to a business after searching

Map-based searches have high potential to convert browsers into customers, the report notes:

“Almost half of consumers (49%) said that they ‘often’ or ‘always’ go on to plan their travel route to the chosen business. This suggests two things: one, how quickly consumers seem to be making their decisions, and two, that consumers are conducting local business research with the aim of visiting in the very near future.”

SEO Implications for Local Businesses

For SEO pros and local marketers, these findings highlight several actions to take:

  • Prioritize optimizing map listings beyond your Google Business Profile.
  • Ensure accuracy across all map platforms, not just Google.
  • Focus on complete business information, especially contact details and hours.
  • Monitor the “justifications” in map results, which can be sourced from your business information, reviews, and website.
  • Treat maps as a primary search channel rather than an afterthought.

BrightLocal highlights:

“So, don’t lose out to potential customers by not having a correct address, phone number, or email address listed on your platforms—and be sure to check your opening hours are up to date.”
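One practical piece of that is exposing the same details in structured data on your site. The sketch below builds a schema.org LocalBusiness object covering the fields the study highlights (address, phone, email, and opening hours); the business details are placeholder values, and the serialized output would be embedded in a script tag of type application/ld+json.

```typescript
// Minimal sketch of LocalBusiness structured data with the contact and hours
// fields consumers look for. All business details are placeholder values.
const localBusiness = {
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  name: "Example Coffee Roasters",
  url: "https://www.example.com",
  telephone: "+1-555-010-0000",
  email: "hello@example.com",
  address: {
    "@type": "PostalAddress",
    streetAddress: "123 Main St",
    addressLocality: "Springfield",
    addressRegion: "IL",
    postalCode: "62701",
    addressCountry: "US",
  },
  openingHoursSpecification: [
    {
      "@type": "OpeningHoursSpecification",
      dayOfWeek: ["Monday", "Tuesday", "Wednesday", "Thursday", "Friday"],
      opens: "07:00",
      closes: "18:00",
    },
  ],
};

// Serialize for embedding in <script type="application/ld+json"> on the page.
console.log(JSON.stringify(localBusiness, null, 2));
```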

Looking Ahead

Map platforms are evolving from simple navigation tools into search engines that drive sales and revenue.

If you treat map listings as an afterthought, you risk missing many motivated, ready-to-buy consumers.

As search continues to fragment across platforms, investing specific resources in optimizing your map presence, beyond standard local SEO, is increasingly essential for businesses that rely on local traffic.


Featured Image: miss.cabul/Shutterstock