Google On Diagnosing Multi-Domain Crawling Issues via @sejournal, @MattGSouthern

Google’s Search Advocate, John Mueller, shared insights on diagnosing widespread crawling issues.

This guidance was shared in response to a disruption reported by Adrian Schmidt on LinkedIn. Google’s crawler stopped accessing several of his domains at the same time.

Despite the interruption, Schmidt noted that live tests via Search Console continued to function without error messages.

Investigations indicated no increase in 5xx errors or issues with robots.txt requests.

What could the problem be?

Mueller’s Response

Addressing the situation, Mueller pointed to shared infrastructure as the likely cause:

“If it shared across a bunch of domains and focuses on something like crawling, it’s probably an issue with a shared piece of infrastructure. If it’s already recovering, at least it’s not urgent anymore and you have a bit of time to poke at recent changes / infrastructure logs.”

Infrastructure Investigation

All affected sites used Cloudflare as their CDN, which raised questions about whether the shared CDN was the common factor.

When asked about debugging, Mueller recommended checking Search Console data to determine whether DNS or failed requests were causing the problem.

Mueller stated:

“The crawl stats in Search Console will also show a bit more, perhaps help decide between say DNS vs requests failing.”

He also pointed out that the timing was a key clue:

“If it’s all at exactly the same time, it wouldn’t be robots.txt, and probably not DNS.”

Impact on Search Results

Regarding search visibility concerns, Mueller reassured site owners that this type of disruption wouldn’t cause any problems:

“If this is from today, and it just lasted a few hours, I wouldn’t expect any visible issues in search.”

Why This Matters

When Googlebot suddenly stops crawling across numerous sites simultaneously, it can be challenging to identify the root cause.

While temporary crawling pauses might not immediately impact search rankings, they can disrupt Google’s ability to discover and index new content.

The incident highlights a vulnerability organizations might face without realizing it, especially those relying on shared infrastructure.

How This Can Help You

The next time Googlebot stops crawling your sites, run through this checklist (a rough diagnostic sketch follows it):

  • Check if the problem hits multiple sites at once
  • Look at your shared infrastructure first
  • Use Search Console data to narrow down the cause
  • Don’t rule out DNS just because regular traffic looks fine
  • Keep an eye on your logs
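
As a rough illustration of the checklist above, the sketch below uses only Python’s standard library to test DNS resolution and robots.txt availability for several domains at once. The domain names are placeholders, and a real investigation would still lean on Search Console’s crawl stats and your server or CDN logs.

```python
import socket
import urllib.request
import urllib.error

# Placeholder domains - replace with the sites that stopped being crawled.
DOMAINS = ["example.com", "example.org", "example.net"]

for domain in DOMAINS:
    # 1. Does the hostname still resolve? A shared DNS failure would show up here.
    try:
        ip = socket.gethostbyname(domain)
        print(f"{domain}: DNS OK ({ip})")
    except socket.gaierror as err:
        print(f"{domain}: DNS FAILED ({err})")
        continue

    # 2. Can robots.txt be fetched, and with what status code?
    url = f"https://{domain}/robots.txt"
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            print(f"{domain}: robots.txt HTTP {resp.status}")
    except urllib.error.HTTPError as err:
        print(f"{domain}: robots.txt HTTP {err.code}")
    except (urllib.error.URLError, TimeoutError) as err:
        print(f"{domain}: robots.txt request failed ({err})")
```

Keep in mind this only mirrors what Googlebot sees from one network location; a clean result here doesn’t rule out a CDN-level block on Googlebot’s user agent or IP ranges.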

For anyone running multiple sites behind a CDN, make sure you:

  • Have good logging set up
  • Watch your crawl rates
  • Know who to call when things go sideways
  • Keep tabs on your infrastructure provider

Featured Image: PeopleImages.com – Yuri A/Shutterstock

Google Revises URL Parameter Best Practices via @sejournal, @MattGSouthern

In a recent update to its Search Central documentation, Google has added specific guidelines for URL parameter formatting.

The update brings parameter formatting recommendations from a faceted navigation blog post into the main URL structure documentation, making these guidelines more accessible.

Key Updates

The new documentation specifies that developers should use the following:

  • Equal signs (=) to separate key-value pairs
  • Ampersands (&) to connect multiple parameters

Google recommends against using alternative separators such as:

  • Colons and brackets
  • Single or double commas

Why This Matters

URL parameters play a role in website functionality, particularly for e-commerce sites and content management systems.

They control everything from product filtering and sorting to tracking codes and session IDs.

While powerful, they can create SEO challenges like duplicate content and crawl budget waste.

Proper parameter formatting ensures better crawling efficiency and can help prevent common indexing issues that affect search performance.
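
As a small illustration of the recommended format, Python’s standard library already emits key-value pairs joined with equal signs and ampersands; the parameter names below are hypothetical.

```python
from urllib.parse import urlencode

# Hypothetical filter parameters for a category page.
params = {"category": "shoes", "color": "blue", "sort": "price_asc"}

# Recommended: key=value pairs separated by "&".
print("https://www.example.com/products?" + urlencode(params))
# -> https://www.example.com/products?category=shoes&color=blue&sort=price_asc

# Discouraged alternatives (colons, brackets, commas), for contrast:
# https://www.example.com/products?category:shoes,color:blue,sort:price_asc
```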

The documentation addresses broader URL parameter challenges, such as managing dynamic content generation, handling session IDs, and effectively implementing sorting parameters.

Previous Guidance

Before this update, developers had to reference an old blog post about faceted navigation to find specific URL parameter formatting guidelines.

Consolidating this information into the main guidelines makes it easier to find.

The updated documentation can be found in Google’s Search Central documentation under the Crawling and Indexing section.

Looking Ahead

If you’re using non-standard parameter formats, start planning a migration to the standard format. Ensure proper redirects, and monitor your crawl stats during the switch.
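
How that migration might look in code is sketched below; the colon-and-comma style, paths, and parameter names are hypothetical, and the rewritten URL would be served as the target of a 301 redirect.

```python
from urllib.parse import urlencode

def standardize_query(raw: str) -> str:
    """Rewrite a hypothetical 'key:value,key:value' query string into key=value&key=value."""
    pairs = [item.split(":", 1) for item in raw.split(",") if ":" in item]
    return urlencode(dict(pairs))

# Old, non-standard query -> standard format used as the 301 redirect target.
old_query = "category:shoes,color:blue"
print("/products?" + standardize_query(old_query))
# -> /products?category=shoes&color=blue
```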

While Google has not said non-standard parameters will hurt rankings, this update clarifies what Google prefers. New sites and redesigns should adhere to the standard format to avoid future headaches.


Featured Image: Vibe Images/Shutterstock

Google Now Recommends Higher Resolution Favicons via @sejournal, @martinibuster

Google Search Central updated their favicon documentation to recommend higher-resolution images, exceeding the previous minimum standard. Be aware of the changes described below, as they may impact how your site appears in search results.

Favicon

A favicon is a custom icon shown in browser tabs, bookmarks, and favorites lists, and sometimes in the search results. The word “favicon” is short for “favorites icon.”

An attractive favicon makes it easier for users to find links to your site in their bookmarks, folders, and browser tabs, and can (in theory) help increase clicks from the search results. A high-quality favicon that meets Google’s requirements is therefore worth the effort to maximize user engagement and visits from the search engine results pages (SERPs).

What Changed?

One of the changes to Google’s documentation makes it clearer that a favicon must have a square (1:1) aspect ratio. The other important change is a strong recommendation that publishers use a favicon that’s at least 48×48 pixels. Eight by eight pixels is still the minimum acceptable size, but publishers who stick with an 8×8 pixel favicon will likely miss out on a better presentation in the search results.

This is the part of the documentation that changed:

Previous version:

“Your favicon must be a multiple of 48px square, for example: 48x48px, 96x96px, 144x144px and so on (or SVG with a 1:1 (square) aspect ratio).”

New version:

“Your favicon must be a square (1:1 aspect ratio) that’s at least 8x8px. While the minimum size requirement is 8x8px, we recommend using a favicon that’s larger than 48x48px so that it looks good on various surfaces.”
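
To check an existing icon against those numbers, a short sketch using the Pillow imaging library (the file path is a placeholder) can confirm the image is square and at least 48×48 pixels:

```python
from PIL import Image  # third-party dependency: pip install Pillow

FAVICON_PATH = "favicon.png"  # placeholder path to your favicon file

with Image.open(FAVICON_PATH) as icon:
    width, height = icon.size
    print(f"Favicon is {width}x{height}px")

    if width != height:
        print("Problem: the favicon must be square (1:1 aspect ratio).")
    elif width < 8:
        print("Problem: the favicon is below the 8x8px minimum.")
    elif width < 48:
        print("Acceptable, but below the recommended 48x48px.")
    else:
        print("Meets the square and 48x48px-or-larger recommendation.")
```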

Comparison Of Favicon Sizes

Reason For Documentation Changes

Google’s changelog says the change was made to make its requirements clearer. This is an example of Google reviewing its own documentation to see how it can be improved. It’s the kind of review all publishers should do at least once a year to catch overlooked opportunities to communicate a clearer message; even ecommerce and local merchants can benefit from a yearly content review, because things change and customer feedback can reveal gaps in necessary information.

This is Google’s official explanation for the change:

“What: Updated the favicon guidelines to state that favicons must have a 1:1 aspect ratio and be at least 8x8px in size, with a strong recommendation for using a higher resolution favicon of at least 48x48px.

Why: To reflect the actual requirements for favicons.”

Read the newly updated favicon guidelines in Google’s Search Central documentation.

Featured Image by Shutterstock/Elnur

Google Rolls Out AI-Powered Updates To Performance Max Campaigns via @sejournal, @MattGSouthern

Google Ads is enhancing its Performance Max campaigns with new AI-driven features.

These updates are focused on asset testing, video optimization, and campaign management.

The features arrive as advertisers gear up for the holiday shopping season.

Key Updates

New Asset Testing Capabilities

Starting in early November, retailers will gain access to new experimental features within Performance Max.

A key addition is the ability to measure the impact of supplementary assets beyond product feeds.

That means advertisers can measure the effectiveness of adding images, text, and video content to product-feed campaigns.

Google is also implementing Final URL expansion testing. This allows advertisers to evaluate whether alternative landing pages can drive better conversion rates by matching user intent.

Advanced Image Generation

Google is integrating Imagen 3, its latest text-to-image AI model, into the Google Ads platform.

This update aims to generate higher-performing visuals across Performance Max, Demand Gen, App, and Display campaigns.

The model has been trained on advertising performance data to create more effective commercial imagery.

Video Enhancement Tools

Google Ads is introducing automated video optimization features that include:

  • Automatic aspect ratio adaptation for different YouTube formats
  • Smart video shortening while preserving key messages
  • Granular control over enhanced video assets

These features roll out with built-in quality controls and opt-out options at the campaign and individual asset levels.

While most features are immediately available, video shortening for Demand Gen campaigns will launch in 2025.

Campaign Hierarchy Changes

There is a significant change in how Performance Max and Standard Shopping campaigns interact.

Instead of automatic prioritization for Performance Max campaigns, Google is introducing an Ad Rank-based system.

This new system determines which ads to serve when both campaign types target the same products within an account.

Improved Collaboration Features

Google is expanding shareable ad previews to Performance Max campaigns that include product feeds and travel objectives.

This simplifies the creative review process by allowing preview access without requiring Google Ads credentials.

Context

These updates demonstrate Google’s commitment to AI-driven advertising, particularly as businesses prepare for seasonal peaks.

This timely release suggests Google Ads is focusing on providing advanced tools for optimizing holiday marketing campaigns.

Looking Ahead

For advertisers currently using Performance Max, these updates provide new opportunities to optimize campaign performance with experimental features and improved creative capabilities.

The rollout starts immediately for most features. Specific tools, such as retail asset testing, will be available in early November, and video shortening for Demand Gen campaigns is expected to launch in 2025.

Google’s Mueller Dismisses Core Web Vitals Impact On Rankings via @sejournal, @MattGSouthern

Google Search Advocate John Mueller has reaffirmed that Core Web Vitals are not major ranking factors, responding to data that suggested otherwise.

His statements come amid growing industry discussion about the immediate impact of site performance on search visibility.

Mueller’s Stance

Mueller stated on LinkedIn:

“We’ve been pretty clear that Core Web Vitals are not giant factors in ranking, and I doubt you’d see a big drop just because of that.”

The main benefit of improving website performance is providing a better user experience.

A poor experience could naturally decrease traffic by discouraging return visitors, regardless of how they initially found the site.

Mueller continues:

“Having a website that provides a good experience for users is worthwhile, because if users are so annoyed that they don’t want to come back, you’re just wasting the first-time visitors to your site, regardless of where they come from.”

Small Sites’ Competitive Edge

Mueller believes smaller websites have a unique advantage when it comes to implementing SEO changes.

Recalling his experience of trying to get a big company to change a robots.txt line, he explains:

“Smaller sites have a gigantic advantage when it comes to being able to take advantage of changes – they can be so much more nimble.”

Mueller noted that larger organizations may need extensive processes for simple changes, while smaller sites can update things like robots.txt in just 30 minutes.

He adds:

“None of this is easy, you still need to figure out what to change to adapt to a dynamic ecosystem online, but I bet if you want to change your site’s robots.txt (for example), it’s a matter of 30 minutes at most.”

Context

Mueller’s response followed research presented by Andrew Mcleod, who documented consistent patterns across multiple websites indicating rapid ranking changes after performance modifications.

In one case, a site with over 50,000 monthly visitors experienced a drop in traffic within 72 hours of implementing advertisements.

Mcleod’s analysis, which included five controlled experiments over three months, showed:

  • Traffic drops of up to 20% within 48 hours of enabling ads
  • Recovery periods of 1-2 weeks after removing ads
  • Consistent patterns across various test cases

Previous Statements

This latest guidance aligns with Mueller’s previous statements on Core Web Vitals.

In a March podcast, Mueller confirmed that Core Web Vitals are used in “ranking systems or in Search systems,” but emphasized that perfect scores won’t notably affect search results.

Mueller’s consistent message is clear: while Core Web Vitals are important for user experience and are part of Google’s ranking systems, you should prioritize content quality rather than focus on metrics.

Looking Ahead

Core Web Vitals aren’t major ranking factors, per Mueller.

While Google’s stance on ranking factors remains unchanged, the reality is that technical performance and user experience work together to influence traffic.


Featured Image: Ye Liew/Shutterstock

Google’s Mueller On How To Handle Legacy AMP Subdomains via @sejournal, @MattGSouthern

Google’s John Mueller advises site owners on managing outdated AMP subdomains, suggesting redirects or complete DNS removal.

  • Mueller recommends either keeping 301 redirects or removing the AMP subdomain entirely from DNS (a redirect sketch follows this list).
  • For sites with 500,000 pages, crawl budget impact from legacy AMP URLs isn’t a major concern.
  • AMP subdomains have their own separate crawl budget from the main domain.
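
For the redirect option, here is a minimal sketch of a catch-all 301 from a legacy AMP subdomain to the equivalent canonical URL. The hostnames are placeholders, and in practice this logic would usually live in your web server or CDN configuration rather than a standalone Python process.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Placeholder canonical host: amp.example.com/some/page -> https://www.example.com/some/page
CANONICAL_HOST = "https://www.example.com"

class AmpRedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # self.path includes the original path and query string, so both are preserved.
        self.send_response(301)
        self.send_header("Location", CANONICAL_HOST + self.path)
        self.end_headers()

if __name__ == "__main__":
    # Serve the legacy AMP hostname on port 8080 behind your load balancer.
    HTTPServer(("", 8080), AmpRedirectHandler).serve_forever()
```
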
Google Expands Travel Feeds In Search Ads via @sejournal, @MattGSouthern

Google has announced an expansion of its Travel Feeds feature for Search Ads.

This update allows hotel advertisers to directly include more detailed information from their feeds in search results.

A Google support page reads:

“When you link your Hotel Center to Google Ads, Google will automatically enrich existing Search ad formats with prices, images, and more to help drive better performance.”

New Capabilities

The expanded feature lets hotel advertisers display the following information in their ads:

  • Hotel details
  • Current pricing
  • Available dates
  • Customer ratings
  • Property images

Google notes that advertisers can:

“Use the same feed data already available in Hotel Center and used by travel advertisers in hotel campaigns to enhance more ad types.”

The hotel price and landing page are automatically sourced from your Hotel Center feed, with selections based on ad relevance, creative, and query.

Multiple designs are available as part of Travel Feeds in Search Ads.

Google provided some examples. Note, these are mockups only.

Screenshot from support.google.com, October 2024.

Potential Impact

According to Google’s internal data, advertisers using the full range of available formats have observed up to a 20% increase in click-through rates.

However, individual results may vary, and these figures have not been independently verified.

Implementation

Google will automatically display Travel Feeds in Search Ads after you link your Hotel Center feed to a Google Ads account.

You can set feed sharing controls at the account and campaign levels.

Key points:

  • If you need a Hotel Center account, refer to Google’s Hotels starter guide.
  • Create subset feeds to link specific properties.
  • Use URL parameters to track clicks from travel ads.

Availability

Advertisers must have a Hotel Center account with a price accuracy rating of at least “Poor” to use Travel Feeds in Search Ads.

The feature is currently available in 21 countries and supports 12 languages.

Looking Ahead

The expansion of Travel Feeds in Search Ads represents one of several recent changes to travel-related search results.

Google plans to test the Travel Feeds feature beyond hotels.

In the coming months, Google will include other travel-related categories such as attractions, car rentals, and events.


Featured Image: support.google.com, October 2024. 

Google To Retire Sitelinks Search Box In November via @sejournal, @MattGSouthern

Google has announced the retirement of the sitelinks search box feature.

This change, set to take effect on November 21, marks the end of a tool that has been part of Google Search for over a decade.

The sitelinks search box, introduced in 2014, allowed users to perform site-specific searches directly from Google’s search results page.

It appeared above the sitelinks for certain websites, usually when searching for a company by name.

Declining Usage

Google cites declining usage as the reason for this decision, stating:

“Over time, we’ve noticed that usage has dropped.”

Potential Impact

Google affirms that removing the sitelinks search box won’t affect search rankings or the display of other sitelinks.

This change is purely visual and doesn’t impact a site’s position in search results.

Implementation

This update will be rolled out globally, affecting search results in all languages and countries.

Google has confirmed that the change won’t be listed in the Search status dashboard, indicating that it’s not considered a significant algorithmic update.

Search Console & Rich Results Test

Following the removal of the sitelinks search box, Google plans to update the following tools:

  1. The Search Console rich results report for sitelinks search box will be removed.
  2. The Rich Results Test will no longer highlight the related markup.

Structured Data Considerations

While you can remove the sitelinks search box structured data from your site, Google says that’s unnecessary.

Unsupported structured data won’t cause issues in Search or trigger errors in Search Console reports.

It’s worth noting that the ‘WebSite’ structured data, also used for site names, continues to be supported.
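
For reference, site names rely on schema.org’s WebSite type. A minimal sketch (the name and URL are placeholders) that emits that JSON-LD from Python looks like this:

```python
import json

# Placeholder values illustrating schema.org WebSite markup used for site names.
website_markup = {
    "@context": "https://schema.org",
    "@type": "WebSite",
    "name": "Example Site",
    "url": "https://www.example.com/",
}

# Embed the output in a <script type="application/ld+json"> tag on the homepage.
print(json.dumps(website_markup, indent=2))
```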

Historical Context

The sitelinks search box was initially announced in September 2014 as an improvement to help users find specific website content more easily.

It supported features like autocomplete and allowed websites to implement schema markup for better integration with their own search pages.

Looking Ahead

Website owners and SEO professionals should take note of this update, though no immediate action is required.


Featured Image: MrB11/Shutterstock

Google’s AI Fails At 43% Of Finance Queries, Study Finds via @sejournal, @MattGSouthern

A study by The College Investor finds significant inaccuracies in Google’s AI-generated summaries for finance queries.

Out of 100 personal finance searches, 43% had misleading or incorrect information.

Key Findings

The study evaluated AI overviews across various financial topics, including banking, credit, investing, taxes, and student loans.

The results showed:

  • 57% of AI overviews were accurate
  • 43% contained misleading or inaccurate information
  • 12% were completely incorrect
  • 31% were either misleading or missing crucial details

Areas of Concern

Researchers noted that the AI struggled most with nuanced financial topics, such as taxes, investing, and student loans.

Some of the most concerning issues included:

  • Outdated information on student loan repayment plans
  • Incorrect details about IRA contribution limits
  • Misleading statements regarding 529 college savings plans
  • Inaccurate tax information that could potentially lead to penalties if followed

The AI handled basic financial concepts well but overlooked important exceptions and recent policy changes.

There are notable patterns in the queries Google’s AI got right versus those it got wrong.

Here are common themes.

Queries Google AI Got Right

  • Basic definitions and explanations: For example, “What is a wire transfer?” and “How does a credit card work?”
  • Simple, straightforward questions: Such as “Do I have to pay back student loans?”
  • Recent trending topics: Like “What was the Chase Glitch?”
  • General insurance questions: For instance, “When should I get life insurance?”

Queries Google AI Got Wrong

  • Complex tax topics: For example, “Can you use a 529 plan for a Roth IRA?” and “Does owning your house in an LLC help with taxes?”
  • Nuanced financial products: Such as “Is an IUL better than a 401k?”
  • Time-sensitive information: Like outdated student loan repayment plans or savings account rates.
  • State-specific financial rules: For instance, misrepresenting California’s 529 plan rules.
  • Questions requiring context-dependent answers: Such as “Can I file as independent for FAFSA?”
  • Queries about financial limits or thresholds: For example, incorrect IRA contribution limits.
  • Complex student loan topics: Particularly around forgiveness programs and repayment plans.
  • Investment comparisons: Like “Are annuities better than CDs?”

What This Means

Google’s AI performs well at giving straightforward answers to factual queries.

On the other hand, it struggles with nuanced understanding, up-to-date information, and consideration of multiple factors.

This suggests that the AI can handle basic financial literacy topics, but it’s unreliable for complex financial decisions or advice.

Potential Impact

Robert Farrington, founder of The College Investor, expressed concern about the findings, stating:

“If Google continues to present bad or misinformation about money topics to searchers, not only could it hurt their personal finances, but it could weaken already poor financial literacy in the United States.”

The study noted that following AI guidance could result in tax penalties or financial harm to consumers.

The College Investor believes Google should disable these AI-generated overviews for finance-related queries, especially those concerning taxes and investments.

Looking Ahead

Searchers must exercise caution when relying on AI-generated summaries for financial decisions.

When questioned about instances of misinformation, Google has previously stated, “the vast majority of AI Overviews provide high-quality information.”

The complete study, including detailed examples and methodology, is available on The College Investor’s website.


Featured Image: Koshiro K/Shutterstock

Mullenweg Criticized for 1st Amendment Claims via @sejournal, @martinibuster

Matt Mullenweg portrayed himself as a victim in his dispute with WP Engine, claiming in a tweet and blog post that they are ‘trying to curtail’ his free speech. Social media responses ranged from polite debunking of his First Amendment claim to accusations of hypocrisy.

TL/DR Of Dispute With WP Engine

Matt Mullenweg, co-creator of WordPress and CEO of Automattic, ignited a dispute with managed WordPress web host WP Engine (WPE), using a Q&A session at WordCamp to denounce WP Engine for not giving enough back to the WordPress open source project. He followed that statement with a post on WordPress.org that called WPE a cancer to WordPress, writing:

“This is one of the many reasons they are a cancer to WordPress, and it’s important to remember that unchecked, cancer will spread. WP Engine is setting a poor standard that others may look at and think is ok to replicate. We must set a higher standard to ensure WordPress is here for the next 100 years.”

He next banned thousands of WP Engine customers, cutting them off from updating their websites. Mullenweg later offered a temporary “reprieve” to prevent further inconvenience to WordPress publishers caught in the middle of the dispute and to allow WP Engine to create a workaround.

Banning WP Engine elicited a negative response from WordPress developers and businesses. A tweet by the CEO of Ruby Media Group was representative of the general sentiment:

“My dev team can’t update my plugins because of this. You are destroying people’s lives.”

WP Engine responded with a cease and desist letter against Mullenweg and Automattic, followed by a federal lawsuit seeking relief from what it alleges is an attempt by Mullenweg to extort millions of dollars from WPE.

Claim Of Attempt To Curtail First Amendment Rights

Mullenweg on Sunday published a blog post claiming that WP Engine’s lawsuit against him and Automattic is an attempt to “curtail” his “First Amendment rights.”

He wrote:

“WP Engine has filed hundreds pages of legal documents seeking an injunction against me and Automattic. They say this about community or some nonsense, but if you look at the core, what they’re trying to do is ask a judge to curtail my First Amendment rights.”

Mullenweg ended the post by stating he will no longer comment on the lawsuit filed by WP Engine but encouraged others to speak up in support of his side of the dispute.

Mullenweg Mocked On Social Media

The First Amendment is a guarantee that the United States government shall not create a law that infringes on a person’s free speech. Many on social media were quick to point out that WP Engine cannot curtail his First Amendment rights because it is not the government.

A WordPress software developer tweeted:

“You have no first amendment rights in this context. WP Engine is not the government trying to curtail your 1A rights. 1A only applies to government entities.”

Another person followed up with:

“Please reread what the first amendment is, dum dum”

Another developer went further, calling Mullenweg “moronic”:

“WPEngine isn’t the government, how moronic can one man be?”

Another web developer advised Mullenweg to seek legal counsel to explain to him how the First Amendment works:

“Please go talk to your big expensive lawyer. I am sure they can break it down into small words for you.”

Accused Of Hypocrisy

Others on social media accused Mullenweg of hypocrisy for curtailing the free speech of others in the official WordPress Slack channel and banning WP Engine users from accessing plugins from the official repository.

A tweet by a WordPress and open source enthusiast captured the general feeling:

“Yes, “freedom of speech” is so important. I assume you now will be unblocking everyone that was exercising their right to freedom of speech in the WordPress Slack and here on X. Or, did you just mean literally “my freedom of speech” only.”

A WordPress developer from Denver tweeted:

“How many people have you banned from the WP slack channel over the past couple weeks?”

And another tweet:

“This coming from the guy cancelling anyone’s account/ access that disagrees with him. Really?”

Not Much Sympathy For Mullenweg

The overwhelming response to Matt Mullenweg’s post about his First Amendment rights was not sympathetic to his side of the story. A web applications developer’s tweet captured the lack of support:

“Maybe, maybe. But you probably won’t be getting a lot of sympathy from the crowd right now due to this thing the kids these days call “consequences.””

Read the original tweet by Mullenweg and his blog post, “My Freedom of Speech.”

Featured Image by Shutterstock/Wirestock Creators