The Great Reversal: Why Agencies Are Replacing PPC With Predictable SEO via @sejournal, @mktbrew

This post was sponsored by Market Brew. The opinions expressed in this article are the sponsor’s own.

What if your client’s PPC budget could fund long-term organic growth instead?

Why do organic results dominate user clicks, but get sidelined in budget discussions?

Organic Drives 5x More Traffic Than PPC. Can We Prove It?

The Short Answer: Yes!

Over the past decade, digital marketers have witnessed a dramatic shift in how search budgets are allocated.

A decade ago, companies funded SEO teams alongside PPC teams. Since then, a PPC-first approach has come to dominate inbound marketing.

Where Have SEO Budgets Gone?

Today, more than $150 billion is spent annually on paid search in the United States alone, while only $50 billion is invested in SEO.

That’s a 3-to-1 ratio, even though 90% of search clicks go to organic results, and only 10% to ads.

It’s not because paid search is more effective. Paid search is just easier to measure.

But that’s changing with the return of attribution within predictive SEO.

What Is Attribution?

Attribution in marketing is the process of identifying which touchpoints or channels contributed to a conversion or sale.

It helps us understand the customer journey so we can allocate budget more effectively and optimize campaigns for higher ROI.

As Google’s algorithms evolved, the cause-and-effect between SEO efforts and business outcomes became harder to prove.

Ranking fluctuations seemed random. Timelines stretched.

Clients became impatient.

Trackable Digital Marketing Has Destroyed SEO

With Google Ads, every dollar has a direct, reportable outcome:

  • Impressions.
  • Clicks.
  • Conversions.

SEO, by contrast, has long been a black box.

As a result, agencies and the clients that hire them followed the money, even when SEO’s results were higher.

PPC’s Direct Attribution Makes PPC Look More Important, But SEO Still Dominates

Hard facts:

  • SEO drives 5x more traffic than PPC.
  • Companies spend 3x more on PPC than on SEO.
Image created by MarketBrew, August 2025

You Can Now Trace ROI Back To SEO

As a result, many SEO professionals and agencies want a way back to organic. Now, there is one, and it’s powered by attribution.

Attribution Is the Key to Measurable SEO Performance

Instead of sitting on the edge of the search engine’s black box, guessing what might happen, we can now go inside the SEO black box, to simulate how the algorithms behave, factor by factor, and observe exactly how rankings react to each change.

This is SEO with attribution.

Image created by MarketBrew, August 2025

With this model in place, you are no longer stuck saying “trust us.”

You can say, “Here’s what we changed. Here’s how rankings moved. Here’s the value of that movement.” Whether the change was a new internal link structure or a content improvement, it’s now visible, measurable, and attributable.

For the first time, SEO teams have a way to communicate performance in terms executives understand: cause, effect, and value.

This transparency is changing the way agencies operate. It turns SEO into a predictable system, not a gamble. And it arms client-facing teams with the evidence they need to justify the budget, or win it back.

How Agencies Are Replacing PPC With Measurable Organic SEO

For agencies, attribution opens the door to something much bigger than better reporting; it enables a completely new kind of offering: performance-based SEO.

Traditionally, SEO services have been sold as retainers or hourly engagements. Clients pay for effort, not outcomes. With attribution, agencies can now flip that model and say: You only pay when results happen.

Enter Market Brew’s AdShift feature, which models this value and success, as shown here:

Screenshot from a video by MarketBrew, August 2025

The AdShift tool starts with a keyword, then discovers the URLs with the top clustered similarities for that term: your own website plus up to four top-ranking competitors.

Screenshot of PPC vs. MarketBrew comparison dashboard by Marketbrew, August 2025

AdShift averages CPC and search volume across all keywords and URLs, giving you a reliable, market-wide estimate of the monthly PPC investment your brand would need to rank #1.

Screenshot of a dashboard by Marketbrew, August 2025

AdShift then calculates your percentage of PPC spend that can be replaced to fund SEO.

This lets you model your own Performance Plan, with variable discounts on Market Brew license fees, always priced at less than 50% of the PPC fee for the clicks replaced by new SEO traffic.

Screenshot of a dashboard by Marketbrew, August 2025

AdShift then simulates the PPC replacement plan option you select, based on your keyword footprint, so you can instantly see the savings from the associated Performance Plans.

That’s the heart of the PPC replacement plan: a strategy you can use to gradually shift a client’s paid search budget into measurable, performance-based SEO.

What Is A PPC Replacement Plan? Trackable SEO.

A PPC replacement plan is a strategy in which agencies gradually shift their clients’ paid search budgets into organic investments, with measurable outcomes and shared performance incentives.

Here’s how it works (a hypothetical worked example follows the steps):

  1. Benchmark Paid Spend: Identify the current Google Ads budget, e.g., $10,000 per month or $120,000 per year.
  2. Forecast Organic Value: Use search engine modeling to predict the lift in organic traffic from specific SEO tasks.
  3. Execute & Attribute: Complete tasks and monitor real-time changes in rankings and traffic.
  4. Charge on Impact: Instead of billing for time, bill for results, often at a fraction of the client’s former ad spend.
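
To make the economics concrete, here is a minimal sketch of the math behind such a plan. Every figure (ad spend, CPC, replacement rate) is a hypothetical placeholder, not Market Brew pricing; the $10,000 budget mirrors step 1, and the fee cap mirrors the “less than 50% of the PPC fee for clicks replaced” framing described earlier.

    # Hypothetical illustration of a PPC replacement calculation.
    # None of these figures reflect Market Brew's actual pricing.
    monthly_ppc_spend = 10_000.00   # current Google Ads budget (step 1)
    avg_cpc = 2.50                  # assumed average cost per paid click
    replacement_rate = 0.40         # assumed share of paid clicks replaced by new organic traffic
    fee_cap = 0.50                  # performance fee kept below 50% of the PPC cost replaced

    paid_clicks = monthly_ppc_spend / avg_cpc
    replaced_clicks = paid_clicks * replacement_rate
    ppc_cost_replaced = replaced_clicks * avg_cpc
    performance_fee = ppc_cost_replaced * fee_cap    # billed on impact, not hours (step 4)
    monthly_savings = ppc_cost_replaced - performance_fee

    print(f"Paid clicks replaced per month: {replaced_clicks:,.0f}")
    print(f"PPC cost of those clicks: ${ppc_cost_replaced:,.2f}")
    print(f"Performance-based SEO fee: ${performance_fee:,.2f}")
    print(f"Client's monthly savings: ${monthly_savings:,.2f}")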

This is not about replacing all paid spend.

Branded queries and some high-value targets may remain in PPC. But for the large, expensive middle of the keyword funnel, agencies can now offer a smarter path: predictable, attributable organic results, at a lower cost-per-click, with better margins.

And most importantly, instead of lining Google’s pockets with PPC revenue, your investments begin to fuel both organic and LLM searches!

Real-World Proof That SEO Attribution Works

Agencies exploring this new attribution-powered model aren’t just intrigued … they’re energized. For many, it’s the first time in years that SEO feels like a strategic growth engine, not just a checklist of deliverables.

“We’ve pitched performance SEO to three clients this month alone,” said one digital strategy lead. “The ability to tie ranking improvements to specific tasks changed the entire conversation.”

Sean Myers, CEO, ThreeTech

Another partner shared,

“Instead of walking into meetings looking to justify an SEO retainer, we enter with a blueprint representing a SEO/GEO/AEO Search Engine’s ‘digital twin’ with the AI-driven tasks that show exactly what needs to be changed and the rankings it produces. Clients don’t question the value … they ask what’s next.”

Stephen Heitz, Chief Innovation Officer, LAVIDGE

Several agencies report that new business wins are increasing simply because they offer something different. While competitors stick to vague SEO promises or expensive PPC management, partners leveraging attribution offer clarity, accountability, and control.

And when the client sees that they’re paying less and getting more, it’s not a hard sell, it’s a long-term relationship.

A Smarter, More Profitable Model for Agencies and SEOs

The traditional agency model in search has become a maze of expectations.

Managing paid search may deliver short-term wins, but it comes down to a bidding war in which only those with the biggest budgets win. SEO, meanwhile, has often felt like a thankless task … necessary but underappreciated, valuable but difficult to prove.

Attribution changes that.

For agencies, this is a path back to profitability and positioning. With attribution, you’re not just selling effort … you’re selling outcomes. And because the work is modeled and measured in advance, you can confidently offer performance plans that are both client-friendly and agency-profitable.

For SEOs, this is about getting the credit they deserve. Attribution allows practitioners to demonstrate their impact in concrete terms. Rankings don’t just move … they move because of you. Traffic increases aren’t vague … they’re connected to your specific strategies.

Now, you can show this.

Most importantly, this approach rebuilds trust.

Clients no longer have to guess what’s working. They see it. In dashboards, in forecasts, in side-by-side comparisons of where they were and where they are now. It restores SEO to a place of clarity and control where value is obvious, and investment is earned.

The industry has been waiting for this. And now, it’s here.

From PPC Dependence to Organic Dominance — Now Backed by Data

Search budgets have long been upside down, pouring billions into paid clicks that capture a mere fraction of user attention, while underfunding the organic channel that delivers lasting value.

Why? Because SEO lacked attribution.

That’s no longer the case.

Today, agencies and SEO professionals have the tools to prove what works, forecast what’s next, and get paid for the real value they deliver. It’s a shift that empowers agencies to move beyond bidding-war PPC management and into lower-cost, higher-ROAS, performance-based SEO.

This isn’t just a new service model; it’s a rebalancing of power in search.

Organic is back. It’s measurable. It’s profitable. And it’s ready to take center stage again.

The only question is: will you be the agency or brand that leads the shift, or will you watch as others do it first?

Image Credits

Featured Image: Image by Market Brew. Used with permission.

In-Post Image: Images by Market Brew. Used with permission.

Bing Recommends lastmod Tags For AI Search Indexing via @sejournal, @MattGSouthern

Bing has updated its sitemap guidance with a renewed focus on the lastmod tag, highlighting its role in AI-powered search to determine which pages need to be recrawled.

While real-time tools like IndexNow offer faster updates, Bing says accurate lastmod values help keep content discoverable, especially on frequently updated or large-scale sites.

Bing Prioritizes lastmod For Recrawling

Bing says the lastmod field in your sitemap is a top signal for AI-driven indexing. It helps determine whether a page needs to be recrawled or can be skipped.

To make it work effectively, use ISO 8601 format with both date and time (e.g. 2004-10-01T18:23:17+00:00). That level of precision helps Bing prioritize crawl activity based on actual content changes.

Avoid setting lastmod to the time your sitemap was generated, unless the page was truly updated.
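
For reference, here is what a minimal sitemap entry with a full ISO 8601 lastmod value looks like; the URL is a placeholder, and the timestamp reuses the example format above:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/products/widget</loc>
        <!-- Set lastmod only when the page content actually changed -->
        <lastmod>2004-10-01T18:23:17+00:00</lastmod>
      </url>
    </urlset>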

Bing also confirmed that changefreq and priority tags are ignored and no longer affect crawling or ranking.

Submission & Verification Tips

Bing recommends submitting your sitemap in one of two ways:

  • Reference it in your robots.txt file
  • Submit it via Bing Webmaster Tools

Once submitted, Bing fetches the sitemap immediately and rechecks it daily.

You can verify whether it’s working by checking the submission status, last read date, and any processing errors in Bing Webmaster Tools.

Combine With IndexNow For Better Coverage

To increase the chances of timely indexing, Bing suggests combining sitemaps with IndexNow.

While sitemaps give Bing a full picture of your site, IndexNow allows real-time URL-level updates—useful when content changes frequently.

The Bing team states:

“By combining sitemaps for comprehensive site coverage with IndexNow for fast, URL-level submission, you provide the strongest foundation for keeping your content fresh, discoverable, and visible.”

Sitemaps at Massive Scale

If you manage a large website, Bing’s sitemap capacity limits are worth your attention:

  • Up to 50,000 URLs per sitemap
  • 50,000 sitemaps per index file
  • 2.5 billion URLs per index
  • Multiple index files support indexing up to 2.5 trillion URLs

That makes the standard sitemap protocol scalable enough even for enterprise-level ecommerce or publishing platforms.

Fabrice Canel and Krishna Madhavan of Microsoft AI, Bing, noted that using these limits to their full extent helps ensure content remains discoverable in AI search.

Why This Matters

As search becomes more AI-driven, accurate crawl signals matter more.

Bing’s reliance on sitemaps, especially the lastmod field, shows that basic technical SEO practices still matter, even as AI reshapes how content is surfaced.

For large sites, Bing’s support for trillions of URLs offers scalability. For everyone else, the message is simpler: keep your sitemaps clean, accurate, and updated in real time. This gives your content the best shot at visibility in AI search.


Featured Image: PJ McDonnell/Shutterstock

How To Win In Generative Engine Optimization (GEO) via @sejournal, @maltelandwehr

This post was sponsored by Peec.ai. The opinions expressed in this article are the sponsor’s own.

The first step of any good GEO campaign is creating something that LLM-driven answer machines actually want to link out to or reference.

GEO Strategy Components

Think of experiences you wouldn’t reasonably expect to find directly in ChatGPT or similar systems:

  • Engaging content like a 3D tour of the Louvre or a virtual reality concert.
  • Live data like prices, flight delays, available hotel rooms, etc. While LLMs can integrate this data via APIs, I see the opportunity to capture some of this traffic for the time being.
  • Topics that require EEAT (experience, expertise, authoritativeness, trustworthiness).

LLMs cannot have first-hand experience. But users want it. LLMs are incentivized to reference sources that provide first-hand experience. That’s just one of the things to keep in mind, but what else?

We need to differentiate between two approaches: influencing foundational models versus influencing LLM answers through grounding. The first is largely out of reach for most creators, while the second offers real opportunities.

Influencing Foundational Models

Foundational models are trained on fixed datasets and can’t learn new information after training. For current models like GPT-4, it is too late – they’ve already been trained.

But this matters for the future: imagine a smart fridge stuck with o4-mini from 2025 that might – hypothetically – favor Coke over Pepsi. That bias could influence purchasing decisions for years!

Optimizing For RAG/Grounding

When LLMs can’t answer from their training data alone, they use retrieval augmented generation (RAG) – pulling in current information to help generate answers. AI Overviews and ChatGPT’s web search work this way.

As SEO professionals, we want three things:

  1. Our content gets selected as a source.
  2. Our content gets quoted most within those sources.
  3. Other selected sources support our desired outcome.

Concrete Steps To Succeed With GEO

Don’t worry, it doesn’t take rocket science to optimize your content and brand mentions for LLMs. Actually, plenty of traditional SEO methods still apply, with a few new SEO tactics you can incorporate into your workflow.

Step 1: Be Crawlable

Sounds simple but it is actually an important first step. If you aim for maximum visibility in LLMs, you need to allow them to crawl your website. There are many different LLM crawlers from OpenAI, Anthropic & Co.

Some of them behave so badly that they can trigger scraping and DDoS preventions. If you are automatically blocking aggressive bots, check in with your IT team and find a way to not block LLMs you care about.

If you use a CDN, like Fastly or Cloudflare, make sure LLM crawlers are not blocked by default settings.
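
As a rough illustration, a robots.txt that explicitly allows a few well-known LLM crawlers could look like the snippet below. The user-agent names shown (GPTBot for OpenAI, ClaudeBot for Anthropic, PerplexityBot for Perplexity) are the ones those vendors have documented at the time of writing, but verify them against each provider’s documentation before relying on this:

    # Allow selected LLM crawlers (verify current user-agent names with each vendor)
    User-agent: GPTBot
    Allow: /

    User-agent: ClaudeBot
    Allow: /

    User-agent: PerplexityBot
    Allow: /

    # Everything else keeps your existing rules
    User-agent: *
    Disallow: /private/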

Step 2: Continue Gaining Traditional Rankings

The most important GEO tactic is as simple as it sounds. Do traditional SEO. Rank well in Google (for Gemini and AI Overviews), Bing (for ChatGPT and Copilot), Brave (for Claude), and Baidu (for DeepSeek).

Step 3: Target the Query Fanout

The current generation of LLMs actually does a little more than simple RAG. They generate multiple queries. This is called query fanout.

For example, when I recently asked ChatGPT “What is the latest Google patent discussed by SEOs?”, it performed two web searches for “latest Google patent discussed by SEOs patent 2025 SEO forum” and “latest Google patent SEOs 2025 discussed”.

Advice: Check the typical query fanouts for your prompts and try to rank for those keywords as well.

Typical fanout-patterns I see in ChatGPT are appending the term “forums” when I ask what people are discussing and appending “interview” when I ask questions related to a person. The current year (2025) is often added as well.

Beware: fanout patterns differ between LLMs and can change over time. Patterns we see today may not be relevant anymore in 12 months.

Step 4: Keep Consistency Across Your Brand Mentions

This is something simple everyone should do – both as a person and an enterprise. Make sure you are consistently described online. On X, LinkedIn, your own website, Crunchbase, Github – always describe yourself the same way.

If your X and LinkedIn profiles say you are a “GEO consultant for small businesses”, don’t change it to “AIO expert” on Github and “LLMO Freelancer” in your press releases.

I have seen people achieve positive results within a few days on ChatGPT and Google AI Overviews by simply having a consistent self description across the web. This also applies to PR coverage – the more and better coverage you can obtain for your brand, the more likely LLMs are to parrot it back to users.

Step 5: Avoid JavaScript

As an SEO, I always ask for as little JavaScript usage as possible. As a GEO, I demand it!

Most LLM crawlers cannot render JavaScript. If your main content is hidden behind JavaScript, you are out.

Step 6: Embrace Social Media & UGC

Unsurprisingly, LLMs seem to rely on reddit and Wikipedia a lot. Both platforms offer user-generated-content on virtually every topic. And thanks to multiple layers of community-driven moderation, a lot of junk and spam is already filtered out.

While both can be gamed, the average reliability of their content is still far better than on the internet as a whole. Both are also regularly updated.

Reddit also gives LLM labs insight into how people discuss topics online, what language they use to describe different concepts, and knowledge of obscure niche topics.

We can reasonably assume that moderated UGC found on platforms like reddit, Wikipedia, Quora, and Stackoverflow will stay relevant for LLMs.

I do not advocate spamming these platforms. However, if you can influence how you and competitors show up there, you might want to do so.

Step 7: Create For Machine-Readability & Quotability

Write content that LLMs understand and want to cite. No one has figured this one out perfectly yet, but here’s what seems to work:

  • Use declarative and factual language. Instead of writing “We are kinda sure this shoe is good for our customers”, write “96% of buyers have self-reported to be happy with this shoe.”
  • Add schema. It has been debated many times. Recently, Fabrice Canel (Principal Product Manager at Bing) confirmed that schema markup helps LLMs to understand your content.
  • If you want to be quoted in an already existing AI Overview, have content with similar length to what is already there. While you should not just copy the current AI Overview, having high cosine similarity helps (see the short sketch after this list). And for the nerds: yes, given normalization, you can of course use the dot product instead of cosine similarity.
  • If you use technical terms in your content, explain them. Ideally in a simple sentence.
  • Add summaries of long text paragraphs, lists of reviews, tables, videos, and other types of difficult-to-cite content formats.
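
For the cosine-similarity point in the list above, here is a minimal sketch. The vectors are placeholder numbers standing in for embeddings of your draft snippet and the text currently cited in the AI Overview; plug in whatever embedding model you actually use.

    import numpy as np

    def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
        # Cosine similarity; if both vectors are already L2-normalized,
        # this reduces to a plain dot product.
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    # Placeholder embeddings; in practice these come from your embedding model
    # applied to your draft text and to the currently cited AI Overview text.
    draft_vec = np.array([0.12, 0.88, 0.34, 0.05])
    overview_vec = np.array([0.10, 0.90, 0.30, 0.07])

    print(f"Cosine similarity: {cosine_similarity(draft_vec, overview_vec):.3f}")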

Step 8: Optimize your Content

The original GEO paper: “GEO: Generative Engine Optimization” (arXiv:2311.09735)

If we look at GEO: Generative Engine Optimization (arXiv:2311.09735), What Evidence Do Language Models Find Convincing? (arXiv:2402.11782v1), and similar scientific studies, the answer is clear. It depends!

To be cited for some topics in some LLMs, it helps to:

  • Add unique words.
  • Have pro/cons.
  • Gather user reviews.
  • Quote experts.
  • Include quantitative data and name your sources.
  • Use easy to understand language.
  • Write with positive sentiment.
  • Add product text with low perplexity (predictable and well-structured).
  • Include more lists (like this one!).

However, for other combinations of topics and LLMs, these measures can be counterproductive.

Until broadly accepted best practices evolve, the only advice I can give is do what is good for users and run experiments.

Step 9: Stick to the Facts

For over a decade, algorithms have extracted knowledge from text as triples like (Subject, Predicate, Object) — e.g., (Lady Liberty, Location, New York). A text that contradicts known facts may seem untrustworthy. A text that aligns with consensus but adds unique facts is ideal for LLMs and knowledge graphs.

So stick to the established facts. And add unique information.

Step 10: Invest in Digital PR

Everything discussed here is not just true for your own website. It is also true for content on other websites. The best way to influence it? Digital PR!

The more and better coverage you can obtain for your brand, the more likely LLMs are to parrot it back to users.

I have even seen cases where advertorials were used as sources!

Concrete GEO Workflows To Try

Before I joined Peec AI, I was a customer. Here is how I used the tool – and how I advise our customers to use it.

Learn Who Your Competitors Are

Just like with traditional SEO, using a good GEO tool will often reveal unexpected competitors. Regularly look at a list of automatically identified competitors. For those who surprise you, check in which prompts they are mentioned. Then check the sources that led to their inclusion. Are you represented properly in these sources? If not, act!

Is a competitor referenced because of their PeerSpot profile but you have zero reviews there? Ask customers for a review.

Was your competitor’s CEO interviewed by a Youtuber? Try to get on that show as well. Or publish your own videos targeting similar keywords.

Is your competitor regularly featured on top 10 lists where you never make it to the top 5? Offer the publisher who created the list an affiliate deal they cannot decline. With the next content update, you’re almost guaranteed to be the new number one.

Understand the Sources

When performing search grounding, LLMs rely on sources.

Typical LLM Sources: Reddit & Wikipedia

Look at the top sources for a large set of relevant prompts. Ignore your own website and your competitors for a second. You might find some of these:

  • A community like Reddit or X. Become part of the community and join the discussion. X is your best bet to influence results on Grok.
  • An influencer-driven website like YouTube or TikTok. Hire influencers to create videos. Make sure to instruct them to target the right keywords.
  • An affiliate publisher. Buy your way to the top with higher commissions.
  • A news and media publisher. Buy an advertorial and/or target them with your PR efforts. In certain cases, you might want to contact their commercial content department.

You can also check out this in-depth guide on how to deal with different kinds of source domains.

Target Query Fanout

Once you have observed which searches are triggered by query fanout for your most relevant prompts, create content to target them.

On your own website. With posts on Medium and LinkedIn. With press releases. Or simply by paying for article placements. If it ranks well in search engines, it has a chance to be cited by LLM-based answer engines.

Position Yourself for AI-Discoverability

Generative Engine Optimization is no longer optional – it’s the new frontline of organic growth. At Peec AI, we’re building the tools to track, influence, and win in this new ecosystem.

We currently see clients growing their LLM traffic by 100% every 2 to 3 months, sometimes with up to 20x the conversion rate of typical SEO traffic!

Whether you’re shaping AI answers, monitoring brand mentions, or pushing for source visibility, now is the time to act. The LLMs consumers will trust tomorrow are being trained today.


Image Credits

Featured Image: Image by Peec.ai. Used with permission.

Google Warns: CSS Background Images Aren’t Indexed via @sejournal, @MattGSouthern

In a recent Search Off the Record podcast, Google’s Search Relations team cautioned developers against using CSS for all website images.

While CSS background images can enhance visual design, they’re invisible to Google Image Search. This could lead to missed opportunities in image indexing and search visibility.

Here’s what Google’s Search Advocates advise.

The CSS Image Problem

During the episode, John Mueller shared a recurring issue:

“I had someone ping me I think last week or a week before on social media: ‘It looks like my developer has decided to use CSS for all of the images because they believe it’s better.’ Does this work?”

According to the Google team, this approach stems from a misunderstanding of how search engines interpret images.

When visuals are added via CSS background properties instead of standard HTML image tags, they may not appear in the page’s DOM, and therefore can’t be indexed.

As Martin Splitt explained:

“If you have a content image, if the image is part of the content… you want an img, an image tag or a picture tag that actually has the actual image as part of the DOM because you want us to see like ah so this page has this image that is not just decoration. It is part of the content and then image search can pick it up.”

Content vs. Decoration

The difference between a content image and a decorative image is whether it adds meaning or is purely cosmetic.

Decorative images, such as patterned backgrounds, atmospheric effects, or animations, can be safely implemented using CSS.

When the image conveys meaning or is referenced in the content, CSS is a poor fit.

Splitt offered the following example:

“If I have a blog post about this specific landscape and I want to like tell people like look at this amazing panoramic view of the landscape here and then it’s a background image… the problem is the content specifically references this image, but it doesn’t have the image as part of the content.”

In such cases, placing the image in HTML using the img or picture tag ensures it’s understood as part of the page’s content and eligible for indexing in Google Image Search.
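
To make the distinction concrete, here is a simplified sketch: the decorative texture stays in CSS, while the content image referenced in the copy lives in the DOM as an img element (URLs, class names, and alt text are placeholders):

    <!-- Decorative only: fine to keep in CSS -->
    <style>
      .hero { background-image: url("/img/texture.png"); }
    </style>

    <!-- Content image: put it in the DOM so Image Search can index it -->
    <div class="hero">
      <p>Look at this amazing panoramic view of the landscape:</p>
      <img src="/img/landscape-panorama.jpg"
           alt="Panoramic view of the valley at sunrise"
           width="1200" height="600">
    </div>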

What Makes CSS Images Invisible?

Splitt explained why this happens:

“For a user looking at the browser, what are you talking about, Martin? The image is right there. But if you look at the DOM, it absolutely isn’t there. It is just a CSS thing that has been loaded to style the page.”

Because Google parses the DOM to determine content structure, images styled purely through CSS are often overlooked, especially if they aren’t included as actual HTML elements.

This distinction reflects a broader web development principle.

Splitt adds:

“There is ideally a separation between the way the site looks and what the content is.”

What About Stock Photos?

The team addressed the use of stock photos, which are sometimes added for visual appeal rather than original content.

Splitt says:

“The meaning is still like this image is not mine. It’s a stock image that we bought or licensed but it is still part of the content.”

While these images may not rank highly due to duplication, implementing them in HTML still helps ensure proper indexing and improves accessibility.

Why This Matters

The team highlighted several examples where improper implementation could reduce visibility:

  • Real estate listings: Home photos used as background images won’t show up in relevant image search queries.
  • News articles: Charts or infographics added via CSS can’t be indexed, weakening discoverability.
  • E-commerce sites: Product images embedded in background styles may not appear in shopping-related searches.

What To Do Next

Google’s comments indicate that you should follow these best practices:

  • Use HTML (img or picture) tags for any image that conveys content or is referenced on the page.
  • Reserve CSS backgrounds for decorative visuals that don’t carry meaning.
  • If users might expect to find an image via search, it should be in the HTML.
  • Proper implementation helps not only with SEO, but also with accessibility tools and screen readers.

Looking Ahead

Publishers should be mindful of how images are implemented.

While CSS is a powerful tool for design, using it to deliver content-related images may conflict with best practices for indexing, accessibility, and long-term SEO strategy.

Listen to the full podcast episode below:


Featured Image: Roman Samborskyi/Shutterstock

Google Confirms CSS Class Names Don’t Influence SEO via @sejournal, @MattGSouthern

In a recent episode of Google’s Search Off the Record podcast, Martin Splitt and John Mueller clarified how CSS affects SEO.

While some aspects of CSS have no bearing on SEO, others can directly influence how search engines interpret and rank content.

Here’s what matters and what doesn’t.

Class Names Don’t Matter For Rankings

One of the clearest takeaways from the episode is that CSS class names have no impact on Google Search.

Splitt stated:

“I don’t think it does. I don’t think we care because the CSS class names are just that. They’re just assigning a specific somewhat identifiable bit of stylesheet rules to elements and that’s it. That’s all. You could name them all “blurb.” It would not make a difference from an SEO perspective.”

Class names, they explained, are used only for applying visual styling. They’re not considered part of the page’s content. So they’re ignored by Googlebot and other HTML parsers when extracting meaningful information.

Even if you’re feeding HTML into a language model or a basic crawler, class names won’t factor in unless your system is explicitly designed to read those attributes.

Why Content In Pseudo Elements Is A Problem

While class names are harmless, the team warned about placing meaningful content in CSS pseudo elements like :before and :after.

Splitt stated:

“The idea again—the original idea—is to separate presentation from content. So content is in the HTML, and how it is presented is in the CSS. So with before and after, if you add decorative elements like a little triangle or a little dot or a little light bulb or like a little unicorn—whatever—I think that is fine because it’s decorative. It doesn’t have meaning in the sense of the content. Without it, it would still be fine.”

Adding visual flourishes is acceptable, but inserting headlines, paragraphs, or any user-facing content into pseudo elements breaks the core principle of web development.

That content becomes invisible to search engines, screen readers, and any other tools that rely on parsing the HTML directly.
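
A short sketch of that distinction (selectors and text are illustrative): the decorative marker in the first rule is fine, while the second rule injects user-facing text that never reaches the DOM and therefore never reaches search engines.

    /* Fine: purely decorative flourish */
    .note::before {
      content: "★ ";
    }

    /* Problematic: user-facing text injected via CSS never appears in the DOM */
    .headline::before {
      content: "Spring Sale: 20% Off Everything";
    }

The sale message in the second rule belongs in the HTML itself, for example inside a heading element.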

Mueller shared a real-world example of how this can go wrong:

“There was once an escalation from the indexing team that said we should contact the site and tell them to stop using before and after… They were using the before pseudo class to add a number sign to everything that they considered hashtags. And our indexing system was like, it would be so nice if we could recognize these hashtags on the page because maybe they’re useful for something.”

Because the hashtag symbols were added via CSS, they were never seen by Google’s systems.

Splitt tested it live during the recording and confirmed:

“It’s not in the DOM… so it doesn’t get picked up by rendering.”

Oversized CSS Can Hurt Performance

The episode also touched on performance issues related to bloated stylesheets.

According to data from the HTTP Archive’s 2022 Web Almanac, the median size of a CSS file had grown to around 68 KB for mobile and 72 KB for desktop.

Mueller stated:

“The Web Almanac says every year we see CSS grow in size, and in 2022 the median stylesheet size was 68 kilobytes or 72 kilobytes. … They also mentioned the largest one that they found was 78 megabytes. … These are text files.”

That kind of bloat can negatively impact Core Web Vitals and overall user experience, which are two areas that do influence rankings. Frameworks and prebuilt libraries are often the cause.

While developers can mitigate this with minification and unused rule pruning, not everyone does. This makes CSS optimization a worthwhile item on your technical SEO checklist.

Keep CSS Crawlable

Despite CSS’s limited role in ranking, Google still recommends making CSS files crawlable.

Mueller joked:

“Google’s guidelines say you should make your CSS files crawlable. So there must be some kind of magic in there, right?”

The real reason is more technical than magical. Googlebot uses CSS files to render pages the way users would see them.

Blocking CSS can affect how your pages are interpreted, especially for layout, mobile-friendliness, or elements like hidden content.

Practical Tips For SEO Pros

Here’s what this episode means for your SEO practices:

  • Stop optimizing class names: Keywords in CSS classes won’t help your rankings.
  • Check pseudo elements: Any real content, like text meant to be read, should live in HTML, not in :before or :after.
  • Audit stylesheet size: Large CSS files can hurt page speed and Core Web Vitals. Trim what you can.
  • Ensure CSS is crawlable: Blocking stylesheets may disrupt rendering and impact how Google understands your page.

The team also emphasized the importance of using proper HTML tags for meaningful images:

“If the image is part of the content and you’re like, ‘Look at this house that I just bought,’ then you want an img, an image tag or a picture tag that actually has the actual image as part of the DOM because you want us to see like, ah, so this page has this image that is not just decoration.”

Use CSS for styling and HTML for meaning. This separation helps both users and search engines.

Listen to the full podcast episode below:

Google Clarifies Structured Data Rules For Returns & Loyalty Programs via @sejournal, @MattGSouthern

Google has updated its structured data documentation to clarify how merchants should implement markup for return policies and loyalty programs.

The updates aim to reduce confusion and ensure compatibility with Google Search features.

Key Changes In Return Policy Markup

The updated documentation clarifies that only a limited subset of return policy data is supported at the product level.

Google now explicitly states that comprehensive return policies must be defined using the MerchantReturnPolicy type under the Organization markup. This ensures a consistent policy across the full catalog.

In contrast, product-level return policies, defined under Offer, should be used only for exceptions and support fewer properties.

Google explains in its return policy documentation:

“Product-level return policies support only a subset of the properties available for merchant-level return policies.”
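
As a rough sketch of the merchant-level approach, here is minimal JSON-LD placing MerchantReturnPolicy under Organization via hasMerchantReturnPolicy. The values are placeholders; check Google’s return policy documentation for the full set of required and recommended properties.

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Organization",
      "name": "Example Store",
      "url": "https://www.example.com",
      "hasMerchantReturnPolicy": {
        "@type": "MerchantReturnPolicy",
        "applicableCountry": "US",
        "returnPolicyCategory": "https://schema.org/MerchantReturnFiniteReturnWindow",
        "merchantReturnDays": 30,
        "returnMethod": "https://schema.org/ReturnByMail",
        "returnFees": "https://schema.org/FreeReturn"
      }
    }
    </script>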

Loyalty Program Markup Must Be Separate

For loyalty programs, Google now emphasizes that the MemberProgram structured data must be defined under the Organization markup, either on a separate page or in Merchant Center.

While loyalty benefits like member pricing and points can still be referenced at the product level via UnitPriceSpecification, the program structure itself must be maintained separately.

Google notes in the loyalty program documentation:

“To specify the loyalty benefits… separately add UnitPriceSpecification markup under your Offer structured data markup.”
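
And a hedged sketch of the offer-level side: the member price sits in a UnitPriceSpecification whose validForMemberTier points at a tier @id defined in your separate MemberProgram markup. Property names and values here are illustrative assumptions; verify them against Google’s loyalty program documentation before use.

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Product",
      "name": "Example Widget",
      "offers": {
        "@type": "Offer",
        "price": "25.00",
        "priceCurrency": "USD",
        "priceSpecification": {
          "@type": "UnitPriceSpecification",
          "price": "22.50",
          "priceCurrency": "USD",
          "validForMemberTier": { "@id": "https://www.example.com/rewards#gold" }
        }
      }
    }
    </script>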

What’s Not Supported

Google’s documentation now states that shipping discounts and extended return windows offered as loyalty perks aren’t supported in structured data.

While merchants may still offer these benefits, they won’t be eligible for enhanced display in Google Search results.

This is particularly relevant for businesses that advertise such benefits prominently within loyalty programs.

Why It Matters

The changes don’t introduce new capabilities, but they clarify implementation rules that have been inconsistently followed or interpreted.

Merchants relying on offer-level markup for return policies or embedding loyalty programs directly in product offers may need to restructure their data.

Here are some next steps to consider:

  • Audit existing markup to ensure return policies and loyalty programs are defined at the correct levels.
  • Use product-level return policies only when needed, such as for exceptions.
  • Separate loyalty program structure from loyalty benefits, using MemberProgram under Organization, and validForMemberTier under Offer.

Staying compliant with these updated guidelines ensures eligibility for structured data features in Google Search and Shopping.


Featured Image: Roman Samborskyi/Shutterstock

Google: Many Top Sites Have Invalid HTML And Still Rank via @sejournal, @MattGSouthern

A recent discussion on Google’s Search Off the Record podcast challenges long-held assumptions about technical SEO, revealing that most top-ranking websites don’t use valid HTML.

Despite these imperfections, they continue to rank well in search results.

Search Advocate John Mueller and Developer Relations Engineer Martin Splitt referenced a study by former Google webmaster Jens Meiert, which found that only one homepage among the top 200 websites passed HTML validation tests.

Mueller highlighted:

“0.5% of the top 200 websites have valid HTML on their homepage. One site had valid HTML. That’s it.”

He described the result as “crazy,” noting that the study surprised even developers who take pride in clean code.

Mueller added:

“Search engines have to deal with whatever broken HTML is out there. It doesn’t have to be perfect, it’ll still work.”

When HTML Errors Matter

While most HTML issues are tolerated, certain technical elements, such as metadata, must be correctly implemented.

Splitt said:

“If something is written in a way that isn’t HTML compliant, then the browser will make assumptions.”

That usually works fine for visible content, but can fail “catastrophically” when it comes to elements that search engines rely on.

Mueller said:

“If [metadata] breaks, then it’s probably not going to do anything in your favor.”

SEO Is Not A Technical Checklist

Google also challenged the notion that SEO is a box-ticking exercise for developers.

Mueller said:

“Sometimes SEO is also not so much about purely technical things that you do, but also kind of a mindset.”

Splitt said:

“Am I using the terminology that my potential customers would use? And do I have the answers to the things that they will ask?”

Naming things appropriately, he said, is one of the most overlooked SEO skills and often more important than technical precision.

Core Web Vitals and JavaScript

Two recurring sources of confusion, Core Web Vitals and JavaScript, were also addressed.

Core Web Vitals

The podcast hosts reiterated that good Core Web Vitals scores don’t guarantee better rankings.

Mueller said:

“Core Web Vitals is not the solution to everything.”

Mueller added:

“Developers love scores… it feels like ‘oh I should like maybe go from 85 to 87 and then I will rank first,’ but there’s a lot more involved.”

JavaScript

On the topic of JavaScript, Splitt said that while Google can process it, implementation still matters.

Splitt said:

“If the content that you care about is showing up in the rendered HTML, you’ll be fine generally speaking.”

Splitt added:

“Use JavaScript responsibly and don’t use it for everything.”

Misuse can still create problems for indexing and rendering, especially if assumptions are made without testing.

What This Means

The key takeaway from the podcast is that technical perfection isn’t 100% necessary for SEO success.

While critical elements like metadata must function correctly, the vast majority of HTML validation errors won’t prevent ranking.

As a result, developers and marketers should be cautious about overinvesting in code validation at the expense of content quality and search intent alignment.

Listen to the full podcast episode below:

Google’s ‘srsltid’ Parameter Appears In Organic URLs, Creating Confusion via @sejournal, @MattGSouthern

Google’s srsltid parameter, originally meant for product tracking, is now showing on blog pages and homepages, creating confusion among SEO pros.

Per a recent Reddit thread, people are seeing the parameter attached not just to product pages, but also to blog posts, category listings, and homepages.

Google Search Advocate John Mueller responded saying, “it doesn’t cause any problems for search.”  However, it may still raise more questions than it answers.

Here’s what you need to know.

What Is the srsltid Parameter Supposed to Do?

The srsltid parameter is part of Merchant Center auto-tagging. It’s designed to help merchants track conversions from organic listings connected to their product feeds.

When enabled, the parameter is appended to URLs shown in search results, allowing for better attribution of downstream behavior.

A post on Google’s Search Central community forum clarifies that these URLs aren’t indexed.

As Product Expert Barry Hunter (not affiliated with Google) explained:

“The URLs with srsltid are NOT really indexed. The param is added dynamically at runtime. That’s why they don’t show as indexed in Search Console… but they may appear in search results.”

While it’s true the URLs aren’t indexed, they’re showing up in indexed-page reports from third-party tools.

Why SEO Pros Are Confused

Despite Google’s assurances, the real-world impact of srsltid is causing confusion for these reasons:

  • Inflated URL counts: Tools often treat URLs with unique parameters as separate pages. This inflates site page counts and can obscure crawl reports or site audits.
  • Data fragmentation: Without filtering, analytics platforms like GA4 split traffic between canonical and parameterized URLs, making it harder to measure performance accurately.
  • Loss of visibility in Search Console: As documented in a study by Oncrawl, sites saw clicks and impressions for srsltid URLs drop to zero around September, even though those pages still appeared in search results.
  • Unexpected reach: The parameter is appearing on pages beyond product listings, including static pages, blogs, and category hubs.

Oncrawl’s analysis also found that Googlebot crawled 0.14% of pages with the srsltid parameter, suggesting minimal crawling impact.

Can Anything Be Done?

Google hasn’t indicated any rollback or revision to how srsltid works in organic results. But you do have a few options depending on how you’re affected.

Option 1: Disable Auto-Tagging

You can turn off Merchant Center auto-tagging by navigating to Tools and settings > Conversion settings > Automatic tagging. Switching to UTM parameters can provide greater control over traffic attribution.

Option 2: Keep Auto-Tagging, Filter Accordingly

If you need to keep auto-tagging active:

  • Ensure all affected pages have correct canonical tags.
  • Configure caching systems to ignore srsltid as a cache key.
  • Update your analytics filters to exclude or consolidate srsltid traffic.

Blocking the parameter in robots.txt won’t prevent the URLs from appearing in search results, as they’re added dynamically and not crawled directly.
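
For the canonical-tag point above, each affected page should self-reference its clean URL so the parameterized variants consolidate. A minimal example (the URL is a placeholder):

    <!-- Served on both /blog/post/ and /blog/post/?srsltid=Abc123 -->
    <link rel="canonical" href="https://www.example.com/blog/post/">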

What This Means

The srsltid parameter may not affect rankings, but its indirect impact on analytics and reporting is being felt.

When performance reporting shifts without explanation, SEO pros need to provide answers. Understanding how srsltid works, and how it doesn’t, helps mitigate confusion.

Staying informed, filtering correctly, and communicating with stakeholders are the best options for navigating this issue.


Featured Image: Roman Samborskyi/Shutterstock

The Smart SEO Team’s Guide To Timing & Executing A Large-Scale Site Migration via @sejournal, @inmotionhosting

This post was sponsored by InMotion Hosting. The opinions expressed in this article are the sponsor’s own.

We’ve all felt it, that sinking feeling in your stomach when your site starts crawling instead of sprinting.

Page speed reports start flashing red. Search Console is flooding your inbox with errors.

You know it’s time for better hosting, but here’s the thing: moving a large website without tanking your SEO is like trying to change tires while your car is still moving.

We’ve seen too many migrations go sideways, which is why we put together this guide.

Let’s walk through a migration plan that works. One that’ll future-proof your site without disrupting your rankings or overburdening your team.

Free Website Migration Checklist

Step 1: Set Your Performance Goals & Audit Your Environment

Establish Performance Benchmarks

Before you touch a single line of code, you need benchmarks. Think of these as your “before” pictures in a website makeover.

If you skip this step, you’ll regret it later. How will you know if your migration was successful if you don’t know where you started?

Gather your current page speed numbers, uptime percentages, and server response times. These will serve as proof that the migration was worth it.

Document Current Site Architecture

Next, let’s identify what’s working for your site and what’s holding it back. Keep a detailed record of your current setup, including your content management system (CMS), plugins, traffic patterns, and peak periods.

Large sites often have unusual, hidden connections that only reveal themselves at the worst possible moments during migrations. Trust us, documenting this now prevents those 2 AM panic attacks later.

Define Your Website Migration Goals

Let’s get specific about what success looks like. Saying “we want the site to be faster” is like saying “we want more leads.” It sounds great, but how do you measure it?

Aim for concrete targets, such as:

  • Load times under 2 seconds on key pages (we like to focus on product pages first).
  • 99.99% uptime guarantees (because every minute of downtime is money down the drain).
  • Server response times under 200ms.
  • 30% better crawl efficiency (so Google sees your content updates).

We recommend running tests with Google Lighthouse and GTmetrix at different times of day. You’d be surprised how performance can vary between your morning coffee and afternoon slump.

Your top money-making pages deserve special attention during migration, so keep tabs on those.

Step 2: Choose The Right Hosting Fit

Not all hosting options can handle the big leagues.

We’ve seen too many migrations fail because someone picked a hosting plan better suited for a personal blog than an enterprise website.

Match Your Needs To Solutions

Let’s break down what we’ve found works best.

Managed VPS is excellent for medium-sized sites. If you’re receiving 100,000 to 500,000 monthly visitors, this might be your sweet spot. You’ll have the control you need without the overkill.

Dedicated servers are what we recommend for the major players. If you’re handling millions of visitors or running complex applications, this is for you.

What we appreciate about dedicated resources is that they eliminate the “noisy neighbor” problem, where someone else’s traffic spike can tank your performance. Enterprise sites on dedicated servers load 40-60% faster and rarely experience those resource-related outages.

WordPress-optimized hosting is ideal if you’re running WordPress. These environments come pre-tuned with built-in caching and auto-updates. Why reinvent the wheel, right?

Understand The Must-Have Features Checklist

Let’s talk about what your web hosting will need for SEO success.

Free Website Migration Checklist

NVMe SSDs are non-negotiable these days. They’re about six times faster than regular storage for database work, and you’ll feel the difference immediately.

A good CDN is essential if you want visitors from different regions to have the same snappy experience. Server-level caching makes a huge difference, as it reduces processing work and speeds up repeat visits and search crawls.

Illustration showing how caching works on a website. Image created by InMotion Hosting, June 2025

Staging environments aren’t optional for big migrations. They’re your safety net. Keep in mind that emergency fixes can cost significantly more than setting up staging beforehand.

And please ensure you have 24/7 migration support from actual humans. Not chatbots, real engineers who answer the phone when things go sideways at midnight.

Key Considerations for Growth

Think about where your site is headed, not just where it is now.

Are you launching in new markets? Planning a big PR push? Your hosting should handle growth without making you migrate again six months later.

One thing that often gets overlooked: redirect limits. Many platforms cap at 50,000-100,000 redirects, which sounds like a lot until you’re migrating a massive product catalog.

Step 3: Prep for Migration – The Critical Steps

Preparation separates smooth migrations from disasters. This phase makes or breaks your project.

Build Your Backup Strategy

First things first: backups, backups, backups. We’re talking complete copies of both files and databases.

Don’t dump everything into one giant folder labeled “Site Stuff.” Organize backups by date and type. Include the entire file system, database exports, configuration files, SSL certificates, and everything else.

Here’s a common mistake we often see: not testing the restore process before migration day. A backup you can’t restore is wasted server space. Always conduct a test restore on a separate server to ensure everything works as expected.

Set Up the New Environment and Test in Staging

Your new hosting environment should closely mirror your production environment. Match PHP versions, database settings, security rules, everything. This isn’t the time to upgrade seven different things at once (we’ve seen that mistake before).

Run thorough pre-launch tests on staging. Check site speed on different page types. Pull out your phone and verify that the mobile display works.

Use Google’s testing tools to confirm that your structured data remains intact. The goal is no surprises on launch day.

Map Out DNS Cutover and Minimize TTL for a Quick Switch

DNS strategy might sound boring, but it can make or break your downtime window.

Here’s what works: lower your TTL to 300 seconds (5 minutes) or less about 48 hours before migration. This makes DNS changes propagate quickly when you flip the switch.

Have all your DNS records prepared in advance: A records, CNAMEs for subdomains, MX records for email, and TXT records for verification. Keep a checklist and highlight the mission-critical ones that would cause panic if forgotten.
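
If you manage your own zone file, the TTL change is a one-line edit per record. A sketch in standard BIND zone syntax, using placeholder hostnames and documentation IP addresses:

    ; ~48 hours before migration: lower the TTL so the cutover propagates quickly
    www     300   IN  A   203.0.113.10     ; old server, 5-minute TTL
    ; on migration day: point the record at the new server
    ; www   300   IN  A   198.51.100.20    ; new server
    ; after the cutover has settled: raise the TTL again, e.g. to 3600 or 86400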

Freeze Non-Essential Site Updates Before Migration

This might be controversial, but we’re advocates for freezing all content and development changes for at least 48 hours before migration.

The last thing you need is someone publishing a new blog post right as you’re moving servers.

You can use this freeze time for team education. It’s a perfect moment to run workshops on technical SEO or explain how site speed affects rankings. Turn downtime into learning time.

Step 4: Go-Live Without the Guesswork

Migration day! This is where all your planning pays off, or where you realize what you forgot.

Launch Timing Is Everything

Choose your timing carefully. You should aim for when traffic is typically lowest.

For global sites, consider the “follow-the-sun” approach. This means migrating region by region during their lowest traffic hours. While it takes longer, it dramatically reduces risk.

Coordinate Your Teams

Clear communication is everything. Everyone should know exactly what they’re doing and when.

Define clear go/no-go decision points. Who makes the call if something looks off? What’s the threshold for rolling back vs. pushing through?

Having these conversations before you’re in the middle of a migration saves a ton of stress.

Live Performance Monitoring

Once you flip the switch, monitoring becomes your best friend. Here are the key items to monitor:

  • Watch site speed across different page types and locations.
  • Set up email alerts for crawl errors in Search Console.
  • Monitor 404 error rates and redirect performance.

Sudden spikes in 404 errors or drops in speed need immediate attention. They’re usually signs that something didn’t migrate correctly.

The faster you catch these issues, the less impact they’ll have on your rankings.

Post-Migration Validation

After launch, run through a systematic checklist:

  • Test redirect chains (we recommend Screaming Frog for this).
  • Make sure internal links work.
  • Verify your analytics tracking (you’d be surprised how often this breaks).
  • Check conversion tracking.
  • Validate SSL certificates.
  • Watch server logs for crawl issues.

One step people often forget: resubmitting your sitemap in Search Console as soon as possible. This helps Google discover your new setup faster.

Even with a perfect migration, most large sites take 3-6 months for complete re-indexing, so patience is key.

Step 5: Optimize, Tune, and Report: How To Increase Wins

The migration itself is just the beginning. Post-migration tuning is where the magic happens.

Fine-Tune Your Configuration

Now that you’re observing real traffic patterns, you can optimize your setup.

Start by enhancing caching rules based on actual user behavior. Adjust compression settings, and optimize those database queries that seemed fine during testing but are sluggish in production.

Handling redirects at the server level, rather than through plugins or CMS settings, is faster and reduces server load.
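
As a sketch of what server-level handling can look like, here is a hypothetical nginx example with placeholder paths; the equivalent is possible in Apache or at the CDN edge:

    # nginx: handle redirects before any plugin or CMS code runs
    server {
        listen 80;
        server_name www.example.com;

        # single page moved permanently
        location = /old-page/ {
            return 301 https://www.example.com/new-page/;
        }

        # pattern-based example: an entire renamed directory
        location ^~ /old-catalog/ {
            rewrite ^/old-catalog/(.*)$ /catalog/$1 permanent;
        }
    }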

Automate Performance Monitoring

Set up alerts for issues before they become problems. We recommend monitoring:

  • Page speed drops of more than 10%.
  • Uptime drops.
  • Changes in crawl rates.
  • Spikes in server resource usage.
  • Organic traffic drops of more than 20%.

Automation saves you from constantly checking dashboards, allowing you to focus on improvements instead of firefighting.
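
Here is a minimal sketch of one such automated check. It covers only the uptime and server-response-time piece of the list above, and the URL, threshold, and alert mechanism are placeholders to swap for your own monitoring stack:

    import requests

    URL = "https://www.example.com/"   # key page to watch (placeholder)
    RESPONSE_TIME_BUDGET = 0.2         # 200 ms server response target

    def check_page(url: str) -> None:
        try:
            response = requests.get(url, timeout=10)
        except requests.RequestException as exc:
            print(f"ALERT: {url} is unreachable ({exc})")
            return

        elapsed = response.elapsed.total_seconds()
        if response.status_code != 200:
            print(f"ALERT: {url} returned HTTP {response.status_code}")
        elif elapsed > RESPONSE_TIME_BUDGET:
            print(f"ALERT: {url} responded in {elapsed:.2f}s (budget {RESPONSE_TIME_BUDGET}s)")
        else:
            print(f"OK: {url} responded in {elapsed:.2f}s")

    if __name__ == "__main__":
        check_page(URL)

Run it on a schedule (cron, CI, or your existing monitoring platform) and replace the print calls with whatever alerting channel your team already uses.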

Analyze for SEO Efficiency

Server logs tell you a lot about how well your migration went from an SEO perspective. Look for fewer crawl errors, faster Googlebot response times, and better crawl budget usage.

Improvements in crawl efficiency mean Google can discover and index your new content much faster.

Measure and Report Success

Compare your post-migration performance to those baseline metrics you wisely collected.

When showing results to executives, connect each improvement to business outcomes. For example:

  • “Faster pages reduced our bounce rate by 15%, which means more people are staying on the site.”
  • “Better uptime means we’re not losing sales during peak hours.”
  • “Improved crawl efficiency means our new products get indexed faster.”

Pro tip: Build easy-to-read dashboards that executives can access at any time. This helps build confidence and alleviate concerns.

Ready to Execute Your High-Performance Migration?

You don’t need more proof that hosting matters. Every slow page load and server hiccup already demonstrates it. What you need is a plan that safeguards your SEO investment while achieving tangible improvements.

This guide provides you with that playbook. You now know how to benchmark, choose the right solutions, and optimize for success.

This approach can be applied to sites of all sizes, ranging from emerging e-commerce stores to large enterprise platforms. The key lies in preparation and partnering with the right support team.

If you’re ready to take action, consider collaborating with a hosting provider that understands the complexities of large-scale migrations. Look for a team that manages substantial redirect volumes and builds infrastructure specifically for high-traffic websites. Your future rankings will thank you!

Image Credits

Featured Image: Image by InMotion Hosting. Used with permission.

In-Post Image: Images by InMotion Hosting. Used with permission.