Ask An SEO: Should We Optimize For Keywords With High Search Volume Or Competition? via @sejournal, @rollerblader

In this week’s Ask An SEO, Chandrika asks:

“What are the important points to consider when doing keyword research for SEO using Google Keyword Planner? Should we focus on keywords with a monthly search volume of 500? Or, should we prioritize keywords with low or high competition?”

This is a great question, and here’s an easy answer: Don’t focus on the keyword. Focus on the solution for the user based on the intent of the keyword.

Google Keyword Planner shares the estimated search volume for a keyword, but that doesn’t mean the entire volume represents your audience. Some of them may be looking for information rather than shopping, and only a portion of them are there to be converted into revenue.

The word “bark,” for example, could be the bark on a tree or the noise a dog makes.

A search for bark on a tree could be what it looks like or feels like, whether it’s a sign the tree is healthy or not, and questions about using it to determine the age or genus of the tree.

“Bark” for a dog could refer to the specific sounds made by certain breeds, could indicate that the dog is sick, or the user is looking for ways to get a dog to stop barking or train a dog to bark on command.

If there are 500 searches, perhaps 300 are for the noise a dog makes, of which 200 are about determining whether the dog is sick or healthy, and 50 are for training your dog to bark.

If you sell books on dog training, this may not be the best phrase to go after, but it is a topic you may want to cover. This is where optimizing for the topic comes in.

The topic will encompass the “SEO keywords” and increase the potential pool of traffic based on the entity it ranks for, and the solution it provides.

Optimize For The Solution And Topic

Instead of optimizing for a keyword by stuffing it into the copy, headers, and title, optimize for the topic it relates to.

Ask yourself what the person searching for this keyword is looking for, and build a series of pages that meet these needs.

  • If it is a conversion phrase, then incorporate the questions and solutions the person has related to the product query into the product or collection page. This can be done in the copy itself or in the FAQs, if your template has them.
  • When the keyword has an informational and conversion intent, such as “micro needling,” it can be about the process and procedure, a before-and-after photo series, or someone looking to find a local med spa. This means your site should have multiple content types for the SEO keywords based on the stage of the customer’s journey, including:
    • Pages that show the before and after, and by skin type and age.
    • Blog posts and guides that cover the process and alternatives if it isn’t a match.
    • Comparisons between micro needling and similar procedures to help the person know which is better suited to their needs.
    • A direct conversion page where you can onboard the lead or take payment.

By creating guides that address the topic, your website becomes stronger for the specific phrases.

Machine learning and AI are getting better at understanding what the content solves, and they use the trustworthiness of the content and its phrasing to determine the keywords the page should rank for.

If the content is clearly showing knowledge and expertise, and the claims or solutions are backed up by proven facts, you can show up for keywords without optimizing for the phrase from Google Keyword Planner.

Once the content is written and matched to the user intent, such as shopping or learning, add schema markup.

For informative content, use Article or BlogPosting schema, depending on whether you’re a news site. For commercial pages, use shopping schema, such as Product, Collection, or Service, along with the area served and additional types to help drive the intent of the page home.
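As a rough illustration, intent can be reinforced with JSON-LD structured data in the page head. The sketch below builds a schema.org Service object with `areaServed` and `additionalType`; the service name, URL, and location are hypothetical examples, not taken from the article.

```python
import json

# Minimal sketch: build schema.org JSON-LD for a conversion page.
# All names, URLs, and business details below are hypothetical examples.
def build_service_schema(name, url, area, additional_type):
    return {
        "@context": "https://schema.org",
        "@type": "Service",
        "name": name,
        "url": url,
        "areaServed": {"@type": "City", "name": area},
        "additionalType": additional_type,
    }

schema = build_service_schema(
    name="Microneedling Consultation",
    url="https://example.com/microneedling",
    area="Austin",
    additional_type="https://schema.org/MedicalProcedure",
)

# Emit as the payload for a <script type="application/ld+json"> tag.
print(json.dumps(schema, indent=2))
```

The same pattern applies to Article or BlogPosting markup for informational pages; only the `@type` and properties change.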

Keywords With Higher Search Volumes

Keywords with high search volumes are tempting to optimize for. However, instead of fixating on the single keyword, gather other similar keywords that are part of the same solution.

Put those together into a group, and then think about how they work together to educate the person, so that they have the information they need to make an informed purchase decision, whether the page is a product page or a collection/category page.
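To make the grouping concrete, here is a minimal sketch that buckets related phrases by intent so each bucket maps to one page, then sums volumes into a potential traffic goal. The keywords, volumes, and intent labels are invented for illustration.

```python
# Minimal sketch: group related keywords under one topic page.
# Keywords, volumes, and intent labels are hypothetical examples.
keywords = [
    {"phrase": "train dog to bark on command", "volume": 50, "intent": "learn"},
    {"phrase": "how to stop dog barking", "volume": 200, "intent": "learn"},
    {"phrase": "dog training books", "volume": 150, "intent": "buy"},
    {"phrase": "best dog training book for puppies", "volume": 40, "intent": "buy"},
]

def group_by_intent(keywords):
    """Bucket keywords by intent so each bucket maps to one page."""
    groups = {}
    for kw in keywords:
        groups.setdefault(kw["intent"], []).append(kw)
    return groups

groups = group_by_intent(keywords)
for intent, kws in groups.items():
    total = sum(kw["volume"] for kw in kws)
    print(f"{intent}: {len(kws)} phrases, potential volume {total}")
```

A grouping like this doubles as the start of a content map: one page per bucket, which also helps prevent the cannibalization discussed below.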

Keywords and search volumes are part of topics, but you don’t focus on their volumes – focus on the solutions for the phrases.

Your goal is to create the ultimate resource for the topic, whether it’s a question, a guide, or compatibility for products and services.

When you do this, the keyword search volume may multiply exponentially, and you can optimize the same page for multiple high-volume phrases.

By maintaining a content map of your website, you may also avoid creating content that cannibalizes itself.

When you know a page is dedicated to a topic and specific intent, you have your reminder not to create another page just because there is a search volume you found.

Instead, try to incorporate the theme of the phrase, based on its search intent, into the correct existing page.

Competition Scores Do Not Matter

Someone has to show up for the phrase, so why shouldn’t it be you?

Competition scores are metrics made up by SEO tools; search engines don’t use them.

Search engines are concerned with providing the most accurate answer in the easiest-to-absorb format and in the fastest way possible. If you do this, you may be the site that gets the ranking and the traffic.

For highly competitive phrases where big money is being spent, you will need some authority and trust, but there’s no reason you shouldn’t create the content that can rank.

You may get lucky and take traffic from the more established sites – it happens a lot. When it does, it can attract backlinks naturally from highly authoritative sites, which helps build your site’s stability.

Another reason to create this content now is that having it in an easy-to-use and trustworthy format can help it rank once your website is strong enough. I’ve seen this happen, where multiple pages rise to the top during core updates.

If you don’t create the content because you think it’s too competitive, you won’t have the chance to rank it when core updates happen.

The last thing I’d consider when looking at keywords with 500+ monthly searches is the long tail.

Long-tail phrases can be part of the topic. When you filter a keyword research tool to only show volumes at 500+, you miss out on parts of the entity, which can include consumer questions.

Knowing what matters to the consumer or user helps to provide them with more complete solutions.

When the page answers all of their questions, they can now convert (if your funnel is good), or they may subscribe to your publication because you’re a solution provider.

We never focus on SEO keyword volume when doing research, but we love high volumes when we find them.

We look at what will benefit the person on the page and if it matches the topic of the site, products, and services.

From there, we use keywords and search volumes to set a potential goal in traffic, but we don’t stress if there is no search volume.

Google Discover data, for example, isn’t going to show up, but if the content aligns with interests and your site qualifies, you could get featured and attract a ton of new visitors.

I hope this helps answer your question.



Featured Image: Paulo Bobita/Search Engine Journal

How CMOs Can Tell Stories To Manage Change [Case Study With Mondelēz International] via @sejournal, @gregjarboe

Chief marketing officers should evaluate and synthesize success stories to learn from past marketing efforts, identify repeatable strategies, and demonstrate the return on investment (ROI) of their work to stakeholders.

Ultimately, this can help to drive better future campaigns and business outcomes.

As Steve Jobs once observed, “The most powerful person in the world is the storyteller. The storyteller sets the vision, values, and agenda of an entire generation that is to come.”

Telling Stories To Manage Change And Uncertainty

Storytelling is important because it’s a fundamental way that humans connect, share experiences, and learn.

It fosters empathy, creativity, and emotional intelligence while also helping to build relationships, convey complex ideas, and inspire action.

Mondelēz International, a Fortune 500 company selling in over 150 countries, generated around $36 billion in net revenue in 2024.

Its well-known international and local brands include Oreo, Ritz, and Tate’s Bake Shop cookies and baked goods, along with chocolate favorites like Cadbury Dairy Milk and Toblerone.

(Disclosure: I was a member of a team of subject matter experts who taught a bespoke digital marketing training program for hundreds of marketers at Mondelēz International. I can share its story now without violating my non-disclosure agreement because it has since made this information public.)

Mondelēz International’s Journey To Customer-Centric Growth

The challenge for any Fortune 500 CMO is navigating ever-evolving consumer behavior and technological advancements.

Mondelēz International, a global snack giant, offers a compelling blueprint for not only reacting to change, but also proactively shaping it.

Its journey, spanning several years, highlights the critical elements of foresight, collaboration, and a deep commitment to understanding the customer.

Embracing Empathy At Scale

Back in 2019, Mondelēz recognized a fundamental shift in consumer expectations. The desire for generic brand messaging was waning, replaced by a craving for familiarity and personalization.

This insight spurred a strategic pivot, moving the company from a margin-focused approach to one centered on growth, fueled by increased marketing investment and a concept it termed “empathy at scale.”

This wasn’t just about collecting data; it was about establishing the right connection with the right customer at the right time.

The early days of the pandemic underscored the wisdom of this shift.

While consumer behavior was in flux, Mondelēz’s prior investment in digital maturity and flexibility provided the agility needed to adapt.

The bedrock of this strategy was a profound understanding of its consumers, allowing it to create genuine value, a principle that remains timeless in the face of uncertainty.

Mondelez India’s Automation-Driven Success

Mondelez India has achieved remarkable success through automation, particularly in the diverse Indian market.

Its innovative approach to ad personalization has demonstrated the transformative power of marketing automation and machine learning in creating deep customer connections and driving significant sales.

During the Diwali festive season, Mondelez India recognized the immense value of local relevance for its Cadbury Celebrations gift boxes.

It ingeniously leveraged voice AI and ML to create ads featuring megastar Shah Rukh Khan, in which he personally named local stores selling its products.

This technology enabled the efficient generation of a staggering 130,000 videos, each tailored to a specific store.

Using YouTube’s advanced contextual targeting, the campaign matched ad versions with the right audience based on their proximity to local stores.

This hyper-local approach resonated strongly, resulting in a 60% increase in YouTube engagement, 42% growth in sales at local stores, and 33 million gift boxes sold during the festive season. The campaign underscored the power of making consumers feel directly seen and acknowledged within their local context.

Mondelez India further pushed the boundaries of ad personalization with its campaign for Perk, a chocolate brand popular among youth.

Recognizing the cultural phenomenon of “cancel culture,” the brand aimed to inject humor and encourage levity.

Using AI to identify 2.5 million of the most searched videos, it created custom disclaimers that playfully warned viewers of potential “triggers” within the content, such as a carrot being aggressively chopped in a cooking video.

These short, pre-roll ads were seamlessly integrated into each of the millions of trending videos using Google’s custom-built API and Director Mix technology.

The campaign’s clever and highly contextualized approach resonated with viewers, bringing in an impressive 84 million views, 635 million impressions, and a 20% spike in sales.

It demonstrated how injecting timely cultural relevance, powered by automation, can capture attention and drive business results.

Bridging The Art And Science Of Marketing

The execution of “empathy at scale” demanded a fundamental transformation in how Mondelēz operated. It wasn’t enough to have insightful data; the brand needed to activate it effectively.

This required a powerful synergy between the “art” of marketing and the “science” of data.

A pivotal element was the strong partnership between the chief marketing and sales officer and the architect of their data infrastructure. This collaboration was the engine driving their digital transformation.

Recognizing the need for robust data management, Mondelēz partnered with Google Cloud to build regional data hubs for first-party data.

Critically, it also invested in training its teams to leverage these new capabilities. This wasn’t just about technology adoption; it was about empowering its people to harness the power of data.

This strategic overhaul yielded impressive results. By integrating previously siloed data, Mondelēz gained a holistic view of its consumers, enabling it to deliver personalized content that cut through the noise.

This human-driven strategic shift, augmented by technology, resulted in significant ROI increases globally and in the U.S., laying a solid foundation for sustained growth.

Leveraging AI To Scale Personalization And Reach New Audiences

The marketing landscape continues to evolve, with audience fragmentation across media platforms becoming a significant challenge.

For brands with deep heritage, like Cadbury, the added complexity lies in extending their reach beyond traditional channels to engage new generations.

The story of Cadbury’s Creme Egg offers a powerful illustration of how to navigate this challenge.

Faced with increased competition and cost-of-living pressures impacting consumer spending, Cadbury recognized the need to connect with Gen Z and Millennials, who were less engaged with traditional TV advertising.

Building on its existing digital presence, particularly on YouTube, the brand explored the potential of AI-powered video advertising. Initially, adapting its existing TV ad for digital seemed like the most cost-effective approach.

However, it discovered that YouTube’s AI ad formats, specifically Video Reach Campaigns, required a diverse range of creative assets built from the ground up. This realization highlighted the importance of platform-specific creative strategies.

Through a collaboration with Google’s Creative Works team and its creative agency VCCP, Cadbury embraced this challenge. It developed a series of assets for an AI-driven campaign centered around its iconic “How do you eat yours?” slogan.

Leveraging consumer research, it highlighted different eating styles, creating quirky and engaging video statements in various formats, from six-second bumpers to longer ads with compelling story arcs.

By providing a diverse content ecosystem, Cadbury empowered YouTube’s AI to effectively match the right Creme Egg message with the right viewer at the right time.

This approach, managed through a single campaign, allowed the AI to optimize ad delivery based on business goals and audience signals far more effectively than manual adjustments.

Despite economic pressures, the success of this AI-powered campaign, which focused on maximizing unique reach, led to increased investment in both production and media, demonstrating the power of AI to enhance campaigns while underscoring the enduring importance of human creativity.

Key Takeaways For CMOs

This series of Mondelēz International case studies offers valuable insights for CMOs seeking to navigate the complexities of modern marketing and foster customer-centric growth.

Several key takeaways emerge from these examples.

1. Customer Empathy Serves As The Foundational Element For Sustainable Growth

Mondelēz’s early recognition of the necessity to prioritize understanding its customers over solely focusing on margin proved pivotal.

This “empathy at scale” approach became the cornerstone of its subsequent achievements.

This goes beyond mere data collection; true empathy involves utilizing those insights to generate genuine value for the customer by deeply understanding their needs and desires.

The resilience of this customer-centric strategy was particularly evident during the pandemic, enabling Mondelēz to adapt swiftly due to its preexisting strong understanding of its consumers.

2. Hyper-Personalization Implemented At Scale Drives Significant Results

The success of Mondelez India with campaigns for Cadbury Celebrations and Perk illustrates the transformative potential of marketing automation and machine learning in delivering personalized experiences on a large scale.

The Cadbury Celebrations campaign brilliantly demonstrated the impact of hyper-local personalization, making consumers feel directly seen and acknowledged within their own communities.

Furthermore, the Perk campaign highlighted the effectiveness of incorporating timely cultural relevance, powered by AI, to cut through the noise and resonate effectively with audiences.

3. Bridging The Gap Between The Art And Science Of Marketing Is Essential For Success

Effective marketing in today’s landscape demands a strong synergy between the creative aspects of marketing and the analytical power of data.

Achieving this requires critical cross-functional collaboration, particularly a strong working relationship between the CMO/CSO and the data infrastructure architect to drive digital transformation.

Investing in robust data infrastructure is only part of the equation; CMOs must also prioritize training their teams to effectively utilize these new capabilities.

Ultimately, integrating siloed data to gain a holistic view of the customer enables more effective personalization and improves overall return on investment.

Summary

While AI is a powerful tool for scaling personalization and reaching new audiences, it necessitates a strategic approach.

AI can assist brands in overcoming the challenge of reaching increasingly fragmented audiences across numerous platforms.

However, it’s crucial to recognize that platform-specific creative is often necessary, as simply repurposing traditional creative for digital platforms may not be optimal.

AI-powered ad formats often require tailored creative strategies developed from the outset.

Despite the capabilities of AI, human creativity remains essential. Compelling and engaging creative, driven by human insights, is still fundamental to campaign success.

Even during periods of economic pressure, investing in AI-powered campaigns focused on maximizing unique reach can yield significant results and justify further investment in this technology.

The Mondelēz journey underscores the importance of a fundamental shift towards customer-centricity, enabled by strategic investments in technology, data, and talent.

By embracing these principles, CMOs can equip their Fortune 500 companies to not only weather the storms of change and uncertainty, but also to emerge stronger and more connected with their customers.



Featured Image: StockLite/Shutterstock

Reimagining EEAT To Drive Higher Sales And Search Visibility via @sejournal, @martinibuster

The SEO Charity podcast recently discussed a different way to think about EEAT, one that focuses on activities that lead to external signals Google may associate with the underlying concepts of EEAT (experience, expertise, authoritativeness, and trustworthiness). Google’s John Mueller recently said that EEAT is not something you can add to a site, and most of what was discussed on the show lines up perfectly with that reality.

The podcast, hosted by Olesia Korobka and Anton Shulke, featured Amanda Walls (LinkedIn profile), founder of Cedarwood Digital in Manchester, UK.

Aristotle And SEO

Amanda introduced the concept of applying Aristotle’s principles of ethos, pathos, and logos to SEO strategy. These principles are three ways to persuade site visitors and potential customers:

  1. Credibility (ethos)
  2. Emotional appeal (pathos)
  3. Logical reasoning (logos), which is used to convince an audience.

Amanda explains these concepts in more depth, but those three principles form the basis for her approach to creating the circumstances that lead to positive external signals that can be correlated with concepts like experience, expertise, authoritativeness, and trustworthiness.

Why It Matters For SEO

Amanda says that SEO is ultimately about driving leads and conversions, not just rankings, and I agree with that 100%. The history of SEO is littered with gurus crowing about all the traffic they gained for clients, but they never talk about the part that really matters: sales and leads.

Link building historically falls into that trap where both the client and the link builder focus on how many links are acquired each month and look to traffic as evidence of success. But really, as Amanda points out, everything that a good SEO does should be focused on increasing sales. Nothing else matters.

Amanda explained:

“SEO is more than just rankings, it’s about conversion. It’s about business return. It’s about getting that success, those leads, those sales that we need… Bringing people to a website ….means nothing if they don’t convert. …we don’t just want to bring people to the website, we want them to engage and love your brand and have a really, really good reason to go through and fulfill the conversion journey.”

Reputation Management

Amanda recommends focusing on managing the business’s reputation, such as in reviews, interviews, and what’s written online about the brand.

She cites the following statistics:

  • 87% of consumers will back out of a purchase decision if they read something negative about the brand.
  • 81% of consumers do extensive research before a purchase, taking as long as 79 days.

Amanda prescribes findability, credibility, and persuasion as the ingredients for successful search optimization:

“We’re working on SEO to help people find us, and then most importantly, we are convincing them or we’re persuading them to actually go and purchase our product…”

Monitor Off-Site Signals

Amanda recommends regularly researching your brand to uncover potential issues, monitor online user sentiment, and assess media coverage, because poor off-site sentiment can remove users from the conversion funnel.

Manage On-Site Signals

Amanda also recommends using the About Us page to share relatable stories that help users generate genuine positive feelings for the brand, using the phrase “emotional appeal” to describe the experience users should get from an About Us page. She says this can be as simple as telling potential customers about the business.

User-Generated Content And Authenticity

Many of the fastest-growing businesses on the Internet cultivate high-quality user-generated content. Encouraging customers to post reviews and images helps build confidence in products.

Amanda explains:

“And then also from a pathos perspective, you know, really getting that kind of user generated content, getting people to connect… because fundamentally humans, they buy from humans and the more human and the more emotional that we can be in our sales process, the more likely that we are to get that buy-in and that connection that we need to actually get across to our audience.”

Pitching To Journalists

This last part, pitching story ideas to journalists, is something that link building companies consistently get wrong. I know because I get approached by them all the time and they consistently have the wrong approach, which is focusing too much on links and not enough on understanding my audience.

I specialized in link building back in the early days of SEO (early 2000s). I was even the moderator of the link building forum at WebmasterWorld. Although I don’t do link building anymore, I have a vast, vast amount of experience persuading publishers to give my clients a link.

My opinion is that PR to journalists should be approached strictly for brand exposure. Don’t make links the goal.

Focus instead on building positive stories with journalists, and let them write those articles with or without a link; let them decide. What will happen is that consumers will go out and type your business’s name into Google, and that’s a strong, strong signal. I prefer thousands of consumers typing my website’s name into Google over a handful of links, every time, all day long.

I strongly agree with what Amanda says about understanding a journalist’s audience:

“92% of journalists say that understanding their audience is crucial for them to consider a story pitch.”

Understanding the audience is super important. I’ll go even deeper and recommend understanding what motivates the audience. Focus on the reasons why a journalist’s readers will click an article title that’s displayed on Google News. Once you understand that part, I can practically guarantee that PR outreach approval rates will skyrocket.

Takeaway

The SEO Charity podcast episode featuring Amanda Walls introduces a novel way to build signals associated with Google’s EEAT (experience, expertise, authoritativeness, trustworthiness) by focusing on credibility, emotion, and logic in content strategy. Walls emphasizes using Aristotle’s persuasive principles to influence reputation, brand perception, and conversion, encouraging SEO strategies focused on meaningful business outcomes like leads and sales, with better search visibility that supports those ends.

Watch the SEO Charity episode on EEAT:

Reimagining E-E-A-T with Amanda Walls

Featured Image by Shutterstock/Ollyy

WordPress Jubilee Of Forgiveness Continues via @sejournal, @martinibuster

Last week, WordPress declared a “jubilee” and is unblocking all community members who were previously blocked. The official WordPress X (formerly Twitter) account posted a reminder that the unblocking is still ongoing.

According to the latest post:

“We’re clearing out all previous human blocks to create a more open and collaborative environment. While community and directory guidelines remain, consider any old blocks to be bugs that are on their way out.”

A similar post on the official WordPress site echoed the post on X:

“As I said, we’re dropping all the human blocks. Community guidelines, directory guidelines, and such will need to be followed going forward, but whatever blocks were in place before are now cleared. It may take a few days, but any pre-existing blocks are considered bugs to be fixed.”

WordPress appears to be using the word Jubilee in the sense of the Jewish and biblical tradition of a year of forgiveness.

The part about “Dropping all the human blocks” is similar to the Jewish jubilee in terms of forgiveness.

Moving forward, all pre-existing blocks will be treated as “bugs” to be fixed. Everyone, whether newly unblocked or never blocked, will remain subject to bans if they fail to abide by WordPress community guidelines.

The post on X received a handful of responses.


Featured Image by Shutterstock/Ollyy

Why Your Loyalty Program Isn’t Working

Loyalty programs can be more than the usual rewards of 10% off, free shipping, and birthday emails. Done well, loyalty incentives draw on psychological and behavioral science to deepen retention.

Smarter Segmentation

Seasoned marketers segment for campaigns, but what about loyalty impact? Try building segments based on motivational context, not just purchase history.

  • Redemption behavior. Who hoards rewards, and who redeems quickly? Target accordingly.
  • Dormancy within loyalty tiers. Users with no activity for 60 days may need a different prod than recent converts.
  • High browse, low buy customers. Use loyalty nudges to bridge the gap with non-monetary perks or risk-free trials.
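The segments above can be expressed as simple rules over loyalty data. This is an illustrative sketch only; the field names and thresholds (the 60-day dormancy window, the session and order cutoffs) are assumptions, not prescriptions.

```python
from datetime import date, timedelta

# Illustrative segmentation rules; field names and thresholds are assumptions.
def segment_member(member, today):
    days_inactive = (today - member["last_activity"]).days
    if days_inactive >= 60:
        return "dormant"              # needs a reactivation prod
    if member["points_earned"] > 0 and member["points_redeemed"] == 0:
        return "hoarder"              # earns but never redeems
    if member["sessions"] >= 10 and member["orders"] == 0:
        return "high_browse_low_buy"  # nudge with non-monetary perks
    return "active"

today = date(2025, 6, 1)
member = {
    "last_activity": today - timedelta(days=75),
    "points_earned": 120,
    "points_redeemed": 0,
    "sessions": 3,
    "orders": 1,
}
print(segment_member(member, today))
```

In practice, rules like these would run against your loyalty platform's data export and feed the targeting lists for each campaign.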

Build loyalty throughout the shopping journey:

  • Acquisition. Display loyalty perks on product pages and modals, and on ad copy (Meta, Google) that speaks to exclusive benefits.
  • Onboarding. Pre-enroll customers or ask for their birth dates and unique interests early to tailor benefits faster.

Sephora’s Beauty Insider program offers tiered perks, birthday gifts, and exclusive experiences that encourage purchase frequency and aspiration.


More Than Discounts

Discounts offer short-term gratification, but they don’t build lasting loyalty. Instead, think about what motivates long-term engagement:

  • Progress effect. People are more likely to complete a task when they feel they’ve already started. Pre-load new customers with points or status and visually highlight their progress.
  • Variable rewards. Unpredictable perks (e.g., surprise freebies, mystery discounts) can spur action and boost engagement.
  • Goal-gradient hypothesis. The closer people are to a goal (e.g., a gift at 100 points), the more effort they exert to reach it. Use dynamic emails or texts to show progress bars and remaining required actions.
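The goal-gradient idea is straightforward to operationalize: compute how close a member is to the next reward and surface it in an email or text. Below is a minimal sketch; the 100-point threshold and the message copy are hypothetical.

```python
# Minimal sketch of a goal-gradient nudge; the 100-point threshold
# and the message copy are hypothetical examples.
def reward_nudge(points, goal=100):
    remaining = max(goal - points, 0)
    pct = min(points / goal, 1.0)
    if remaining == 0:
        return "Your reward is ready to redeem!"
    return f"You're {remaining} points from your next reward ({pct:.0%} there)."

print(reward_nudge(90))
print(reward_nudge(100))
```

The same calculation can drive a visual progress bar in email templates, which is where the goal-gradient effect tends to show up most clearly.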

For your high-value customers, consider layered benefits based on lifecycle and psychology:

  • Exclusive access. Think status and belonging, such as early drops, members-only content, and personalized products.
  • Identity-based rewards. Customers want recognition. Use first-party data (e.g., style quiz responses, dietary preferences) to personalize loyalty perks that align with their values.
  • Mission-aligned incentives. Offer donation matching, carbon offset rewards, or “choose your perk” flexibility for cause-conscious customers.

Beyond Email

Experienced teams know this, but it’s worth reiterating: An email-only loyalty program is limited and often ineffective. A little integration goes a long way in making the program feel alive, not automated.

Connect loyalty data to:

  • SMS platforms for real-time nudges (“You’re 10 points from your next reward!”).
  • Ad platforms.
  • Customer service platforms so agents can surprise and delight based on tier or behavior.

In short, customers remember the shopping experience and interaction with your brand, not points alone. Design rewards to tap into progress, surprise, exclusivity, and identity. Move from boring and predictable to habit-forming and sticky.

Google’s Updated Raters Guidelines Refines Concept Of Low Quality via @sejournal, @martinibuster

Google’s Search Quality Rater Guidelines were updated a few months ago, and several of the changes closely track the talking points shared by Googlers at the 2025 Search Central Live events. Among the most consequential updates are those to the sections defining the lowest quality pages, which more clearly reflect the kinds of sites Google wants to exclude from the search results.

Section 4.0 Lowest Quality Pages

Google added a new definition of the Lowest rating in the Lowest Quality Pages section. While Google has always been concerned with removing low-quality sites from the search results, this change to the raters guidelines likely reflects an emphasis on weeding out a specific kind of low-quality website.

The new guideline focuses on identifying the publisher’s motives for publishing the content.

The previous definition said:

“The Lowest rating is required if the page has a harmful purpose, or if it is designed to deceive people about its true purpose or who is responsible for the content on the page.”

The new version keeps that sentence but adds one that encourages quality raters to consider the underlying motives of the publisher responsible for the web page. The guidance asks raters to consider how the page benefits a site visitor and to judge whether the purpose of the page is entirely to benefit the publisher.

The addition to this section reads:

“The Lowest rating is required if the page is created to benefit the owner of the website (e.g. to make money) with very little or no attempt to benefit website visitors or otherwise serve a beneficial purpose.”

There’s nothing wrong with being motivated to earn an income from a website. What Google is looking at is whether the content serves only that purpose or whether it also benefits the user.

Focus On Effort

The next change focuses on identifying how much effort was put into creating the site. This doesn’t mean that publishers must now document how much time and effort went into creating the content. This section is simply about looking for evidence that the content is indistinguishable from content on other sites and offers no clear advantages over what is found elsewhere on the internet.

This part about the main content (MC) was essentially rewritten:

“● The MC is copied, auto-generated, or otherwise created without adequate effort.”

The new version has more nuance about the main content (MC):

“● The MC is created with little to no effort, has little to no originality and the MC adds no value compared to similar pages on the web”

Three things to unpack there:

  1. Content created with little to no effort
  2. Contains little to no originality
  3. Main content adds no additional value

Publishers who focus on keeping up with competitors should be careful that they’re not simply recreating what their competitors publish. Covering the same topic “only better” doesn’t change the fact that it’s the same thing. Even content that is “ten times better” is still basically the same as the competitor’s content, only ten times more of it.

A Word About Content Gap Analysis

Some people are going to lose their minds about what I’m going to say about this, but keep an open mind.

There is a popular SEO process called Content Gap Analysis. It’s about reviewing competitors to identify topics that the competitors are writing about that are missing on the client’s site, then copying those topics to fill the content gap.

That is precisely the kind of thing that leads to unoriginal content that is indistinguishable from everything else on the internet. It’s my number one reason I would never use a software program that scrapes top-ranked sites and suggests topics based on what competitors are publishing.

Who wants to jump from one site to another and read the exact same recipes, even if they have more images, graphs, and videos? Copying a competitor’s content “but doing it better” is not original.

Scraping Google’s PAAs (People Also Asked) just like everyone else does not result in original content. It results in content that’s exactly the same as everyone else that’s scraping PAAs.

While content gap analysis is framed as writing about the same thing, only better, the result is still unoriginal. Saying it’s better doesn’t change the fact that it’s the same thing.

Lack of originality is a huge issue with Internet content and it’s something that Google’s Danny Sullivan discussed extensively at the recent Google Search Central Live in New York City.

Instead of looking for information gaps, it’s better to review your competitor’s weaknesses. Then look at their strengths. Then compare that to your own weaknesses and strengths.

A competitor’s weakness can become your strength. This is especially valuable information when competing against a bigger and more powerful competitor.

Takeaways

1. Google’s Emphasis on Motive-Based Quality Judgments

  • Quality raters are now encouraged to judge not just content, but the intent behind it.
  • Pages created purely for monetization, with no benefit to users, should be rated lowest.
  • This may signal Google’s intent to refine its ability to weed out low quality content based on the user experience.

2. Effort and Originality Are Now Central Quality Signals

  • Low-effort or unoriginal content is explicitly called out as justification for the lowest rating.
  • This may signal that Google’s algorithms will increasingly focus on surfacing content with higher levels of originality.
  • Content that doesn’t add distinctive value over competitors may struggle in the search results.

3. Google’s Raters Guidelines Reflect Public Messaging

  • Changes to the Guidelines mirror talking points in recent Search Central Live events.
  • This suggests that Google’s algorithms may become more precise on things like originality, added value, and effort put into creating the content.
  • This means publishers should (in my opinion) consider ways to make their sites more original than other sites, to compete by differentiation.

Google updated its Quality Rater Guidelines to draw a sharper line between content that helps users and content that only helps publishers. Pages created with little effort, no originality, or no user benefit are now listed as examples of the lowest quality, even if they seem more complete than competing pages.

Google’s Danny Sullivan cited travel sites as an example of an area where sites become indistinguishable from each other: so many share the same sidebar introducing the smiling site author, along with other hallmarks of the genre.

Publishers do that because they see what Google is ranking and assume that’s what Google wants. In my experience, that’s not the case. In my opinion, it may be more useful to think about what you can do to make a site more original.

Download the latest version of Google’s Search Quality Raters Guidelines here (PDF).

Featured Image by Shutterstock/Kues

Google Answers Why Landing Page Ranks For An E-Commerce Query via @sejournal, @martinibuster

Google’s John Mueller answered a question on Bluesky about why an e-commerce page with minimal content is ranking, illustrating that sometimes optimized content isn’t enough.

E-Commerce Search Results

A person posted their concerns about an e-commerce site that was ranking in the search results with barely any content. In fact, the domain that was ranking redirects to another domain. On the face of it, something appears to be off: why would Google rank what is essentially a landing page, with virtually zero content, for a redirected domain?

Why A Landing Page Ranks

The company with the landing page had acquired another company and subsequently joined the two domains. There was nothing wrong or spammy going on; one business bought another business, which happens every day.

The person asking the question dropped a URL and a screenshot of the landing page and asked:

“How does Google think this would be the best result and also, do you think this is a relevant result for users?”

Google’s John Mueller answered:

“It looks like a normal ecommerce site to me. They could have handled the site-migration a bit more gracefully (and are probably losing a lot of “SEO value” by doing this instead of a real migration), but it doesn’t seem terrible for users.”

Site Migration

Mueller’s comment about the site migration was expanded further.

He posted:

“Our guidance for site migrations is at https://developers.google.com/search/docs/crawling-indexing/site-move-with-url-changes . What they’re doing is a “soft or crypto redirect”, and they’re doing it “N:1″ (meaning all old pages go there). Both of these make transfering information about the old site hard / impossible.”

Sometimes Google ranks pages that seem like they don’t belong. But those rankings can make sense when looked at from a different perspective, particularly the perspective of what’s good and makes sense for the user. Rankings change all the time, and the rankings for that page could eventually go away. But waiting for a competitor to drop away isn’t a good SEO strategy. Google’s Danny Sullivan has shared good advice about differentiating a site for better rankings.

Channel Reporting Is Coming To Performance Max Campaigns via @sejournal, @brookeosmundson

Google just launched substantial upgrades to its Performance Max campaigns.

In its announcement, Google introduced long-anticipated reporting features that will give advertisers much-needed visibility into how their campaigns perform across different Google surfaces.

These updates include new channel-level reporting, full search terms data, and expanded asset performance metrics.

The goal? To help marketers better understand, evaluate, and optimize their Performance Max campaigns.

The rollout is expected to begin with an open beta for channel performance reporting in the coming weeks.

For advertisers managing budget and strategy across a mix of formats and inventory, these reporting enhancements mark a meaningful step forward in understanding where results are coming from and how to take informed action.

Advertiser Feedback Is Directly Shaping PMax’s Direction

According to Google, Performance Max is now used by over one million advertisers.

In 2024 alone, Google implemented more than 90 improvements to Performance Max, leading to measurable gains in both conversions and conversion value.

But alongside performance, advertisers have consistently asked for better transparency and reporting.

Google’s latest announcements make clear that advertiser feedback has played a central role in shaping these enhancements.

The goal is to deliver clearer insights, support decision-making, and increase control—without sacrificing the benefits of automation.

Channel Performance Reporting Is Coming To Performance Max

Channel-level reporting is the most significant update in this release.

For the first time, advertisers will be able to view results by channel: Search, YouTube, Display, Discover, Gmail, Maps, and Search partners.

The new “Channel performance” page will show:

  • Visual breakdowns of performance by surface
  • Campaign-level metrics for each channel, including clicks, conversions, and spend
  • A downloadable table with key performance data
  • Diagnostics to surface missed opportunities or setup issues

You’ll be able to find channel performance reporting in the “Insights & reports” tab on the left-hand side of Google Ads. The example below shows how the report will function.

For example, if Maps isn’t generating traffic, diagnostics might suggest adding a location asset. Or if YouTube is outperforming, advertisers can shift their focus to high-impact video creatives.

The ability to view spend and conversion value by channel adds clarity that Performance Max has previously lacked.

Search Terms Reporting Reaches (Almost) Full Visibility

Another major enhancement is the addition of full search terms reporting.

Advertisers will now be able to see the actual queries driving performance – similar to what’s available in standard Search and Shopping campaigns.

With this rollout, marketers can:

  • Identify top-performing search terms
  • Create tailored assets around those queries
  • Apply negative keywords or brand exclusions when needed

For agencies managing multiple clients or accounts at scale, this change improves daily workflow efficiency.

Rather than relying solely on limited theme-level insights or making assumptions about what’s driving performance, teams can now analyze exact queries.

This supports better keyword refinement, more accurate exclusions, and tighter alignment between campaign objectives and user behavior, all within the familiar framework of Search best practices.

Privacy thresholds will still apply, but the reporting experience will be much more detailed than before.

At launch, this feature will be available in the Google Ads UI only, with API support expected later.

For marketers focused on search intent, this change makes Performance Max a more actionable channel.

More Granular Asset Metrics Across Campaign Types

Asset reporting is also expanding. In addition to conversion data, advertisers will now see:

  • Impressions
  • Clicks
  • Cost
  • Conversion Value
Example of expanded asset-level reporting in Performance Max. (Image Credit: Google, April 2025)

These new metrics will apply across Performance Max, Search, and Display. This allows advertisers to evaluate creative performance at a deeper level.

Want to know if your video is driving more conversions than your static image? Now you can. Want to see if your headline gets more clicks than your call-to-action? The data is there.

These insights support better creative testing and stronger Ad Strength scores, all based on performance—not assumptions.

Built-In Diagnostics Help Spot Gaps and Missed Opportunities

Google is also adding diagnostics that flag potential performance issues. These insights will live within the Channel performance page and highlight areas for improvement.

For example:

  • If you’re not showing on Maps, diagnostics might suggest adding a location feed or location asset
  • If Search delivery is limited, landing page relevance could be the cause
Image credit: Google, April 2025

This feature won’t give full control over where ads appear, but it does provide better visibility into what’s working and what’s not.

Channel exclusions are still not available in Performance Max, but Google confirmed it’s exploring future control options. For now, diagnostics serve as a step toward more informed decision-making.

Why These Updates Matter For Advertisers

This round of updates helps address a long-standing challenge with Performance Max: the lack of visibility.

Advertisers have embraced the campaign type for its scale and automation, but often struggled to understand the “how” behind performance.

With these new features, advertisers will gain:

  • Channel-level transparency
  • Deeper search intent insights
  • Clearer creative performance metrics
  • Actionable recommendations to fix delivery issues

These aren’t just incremental changes. They reshape how marketers can evaluate and optimize PMax.

The updates make it easier to align creative strategy, understand channel contribution, and refine search targeting.

It’s also clear that Google is listening. The inclusion of diagnostics, downloadable tables, and more detailed reporting shows a strong response to real-world feedback.

These updates also signal a broader industry shift toward hybrid automation models, where AI handles scale but humans still guide strategy with the help of robust data.

As marketers continue to seek clarity on campaign performance, updates like these help reinforce trust in automated systems by making them easier to measure and manage.

More details are expected at Google Marketing Live. But this release signals a new phase for Performance Max: one that balances automation with greater accountability and insight.

AI Search & SEO: Key Trends and Insights [Webinar] via @sejournal, @lorenbaker

As AI continues to reshape search, marketers and SEOs are facing a new set of challenges and opportunities. 

From the rise of AI Overviews to shifting SERP priorities, it’s more important than ever to know what to focus on in 2025.

Why This Webinar Is A Must-Attend Event

In this session, you’ll learn how to:

  • Adapt your approach to optimize for both answer engines and traditional search engines.
  • Create top-of-SERP content that stands out to AI Overviews.
  • Update technical SEO strategies for the AI era.
  • Use success in conversions as the overall KPI.

Expert Insights From Conductor

Join Shannon Vize, Sr. Content Marketing Manager at Conductor, and Pat Reinhart, VP of Services & Thought Leadership, as they walk through the biggest search and content shifts shaping 2025. From Google’s AI Overviews to new content strategies that actually convert, you’ll get clear guidance to help you move forward with confidence.

Don’t Miss Out!

Join us live and walk away with a clear roadmap for leading your SEO and content strategy in 2025.

Can’t attend live?

Register anyway and we’ll send you the full recording to watch at your convenience.

11 Lessons Learned From Auditing Over 500 Websites via @sejournal, @olgazarr

After conducting more than 500 in-depth website audits over the past 12 years, I’ve noticed clear patterns in what works and what doesn’t in SEO.

I’ve seen almost everything that can go right – and wrong – with websites of different types.

To help you avoid costly SEO mistakes, I’m sharing 11 practical lessons from critical SEO areas, such as technical SEO, on-page SEO, content strategy, SEO tools and processes, and off-page SEO.

It took me more than a decade to discover all these lessons. By reading this article, you can apply these insights to save yourself and your SEO clients time, money, and frustration – in less than an hour.

Lesson #1: Technical SEO Is Your Foundation For SEO Success

  • Lesson: You should always start any SEO work with technical fundamentals; crawlability and indexability determine whether search engines can even see your site.

Technical SEO ensures search engines can crawl, index, and fully understand your content. If search engines can’t properly access your site, no amount of quality content or backlinks will help.

After auditing over 500 websites, I believe technical SEO is the most critical aspect of SEO. It comes down to two fundamental concepts:

  • Crawlability: Can search engines easily find and navigate your website’s pages?
  • Indexability: Once crawled, can your pages appear in search results?

If your pages fail these two tests, they won’t even enter the SEO game — and your SEO efforts won’t matter.

I strongly recommend regularly monitoring your technical SEO health using at least two essential tools: Google Search Console and Bing Webmaster Tools.

Google Search Console’s Indexing report provides valuable insights into crawlability and indexability. (Screenshot from Google Search Console, April 2025)

When starting any SEO audit, always ask yourself these two critical questions:

  • Can Google, Bing, or other search engines crawl and index my important pages?
  • Am I letting search engine bots crawl only the right pages?

This step alone can save you huge headaches and ensure no major technical SEO blockages.
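
Those two questions map to two different controls that are easy to conflate: robots.txt governs what bots may crawl, while a meta robots tag governs whether a crawled page may be indexed. A minimal sketch (the paths and domain are hypothetical):

```text
# Hypothetical robots.txt: this controls crawling only, not indexing
User-agent: *
Disallow: /cart/
Disallow: /internal-search

Sitemap: https://www.example.com/sitemap.xml
```

Indexing, by contrast, is controlled on the page itself, for example with `<meta name="robots" content="noindex">`. Note that a page blocked in robots.txt can’t be crawled, so Google may never see a noindex directive placed on it.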

Read more: 13 Steps To Boost Your Site’s Crawlability And Indexability

Lesson #2: JavaScript SEO Can Easily Go Wrong

  • Lesson: You should be cautious when relying heavily on JavaScript. It can easily prevent Google from seeing and indexing critical content.

JavaScript adds great interactivity, but search engines (even as smart as Google) often struggle to process it reliably.

Google handles JavaScript in three steps (crawling, rendering, and indexing) using an evergreen Chromium browser. However, rendering delays (from minutes to weeks) and limited resources can prevent important content from getting indexed.

I’ve audited many sites whose SEO was failing because key JavaScript-loaded content wasn’t visible to Google.

Typically, important content was missing from the initial HTML, it didn’t load properly during rendering, or there were significant differences between the raw HTML and rendered HTML when it came to content or meta elements.

You should always test if Google can see your JavaScript-based content:

  • Use the Live URL Test in Google Search Console and verify rendered HTML.
The Google Search Console Live Test allows you to see the rendered HTML. (Screenshot from Google Search Console, April 2025)
  • Or, search Google for a unique sentence from your JavaScript content (in quotes). If your content isn’t showing up, Google probably can’t index it.*
The site: search in Google allows you to quickly check whether a given piece of text on a given page is indexed by Google. (Screenshot from Google Search, April 2025)

*This will only work for URLs that are already in Google’s index.

Here are a few best practices regarding JavaScript SEO:

  • Critical content in HTML: You should include titles, descriptions, and important content directly in the initial HTML so search engines can index it immediately. You should remember that Google doesn’t scroll or click.
  • Server-Side Rendering (SSR): You should consider implementing SSR to serve fully rendered HTML. It’s more reliable and less resource-intensive for search engines.
  • Proper robots.txt setup: Websites should not block the JavaScript files needed for rendering, as blocking them can prevent Google from rendering and indexing the content.
  • Use crawlable URLs: You should ensure each page has a unique, crawlable URL. You should also avoid URL fragments (#section) for important content; they often don’t get indexed.
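
As a minimal illustration of the first best practice above (the page and its content are hypothetical, not taken from the article), the elements search engines need should already be present in the server-delivered HTML, with JavaScript reserved for enhancements:

```html
<!-- Critical elements present in the initial HTML response -->
<head>
  <title>Trail Running Shoes | Example Store</title>
  <meta name="description" content="Compare lightweight trail running shoes.">
</head>
<body>
  <h1>Trail Running Shoes</h1>
  <p>Key product details rendered server-side, indexable without JavaScript.</p>
  <!-- Enhancement-only JavaScript can hydrate this placeholder later -->
  <div id="reviews-widget"></div>
</body>
```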

For a full list of JavaScript SEO common errors and best practices, you can navigate to the JavaScript SEO guide for SEO pros and developers.

Read more: 6 JavaScript Optimization Tips From Google

Lesson #3: Crawl Budget Matters, But Only If Your Website Is Huge

  • Lesson: You should only worry about the crawl budget if your website has hundreds of thousands or millions of pages.

Crawl budget refers to how many pages a search engine like Google crawls on your site within a certain timeframe. It’s determined by two main factors:

  • Crawl capacity limit: This prevents Googlebot from overwhelming your server with too many simultaneous requests.
  • Crawl demand: This is based on your site’s popularity and how often content changes.

No matter what you hear or read on the internet, most websites don’t need to stress about crawl budget at all. Google typically handles crawling efficiently for smaller websites.

But for huge websites – especially those with millions of URLs or daily-changing content – crawl budget becomes critical (as Google confirms in its crawl budget documentation).

Google, in its documentation, clearly defines which types of websites should be concerned about crawl budget. (Screenshot from Search Central, April 2025)

In this case, you need to ensure that Google prioritizes and crawls important pages frequently without wasting resources on pages that should never be crawled or indexed.

You can check your crawl budget health using Google Search Console’s Indexing report. Pay attention to:

  • Crawled – Currently Not Indexed: This usually indicates indexing problems, not crawl budget.
  • Discovered – Currently Not Indexed: This typically signals crawl budget issues.

You should also regularly review Google Search Console’s Crawl Stats report to see how many pages Google crawls per day. Comparing crawled pages with total pages on your site helps you spot inefficiencies.

While those quick checks in GSC naturally won’t replace log file analysis, they give fast insight into possible crawl budget issues and may indicate that a detailed log file analysis is necessary.

Read more: 9 Tips To Optimize Crawl Budget For SEO

This brings us to the next point.

Lesson #4: Log File Analysis Lets You See The Entire Picture

  • Lesson: Log file analysis is a must for many websites. It reveals details you can’t see otherwise and helps diagnose problems with crawlability and indexability that affect your site’s ability to rank.

Log files track every visit from search engine bots, like Googlebot or Bingbot. They show which pages are crawled, how often, and what the bots do. This data lets you spot issues and decide how to fix them.

For example, on an ecommerce site, you might find Googlebot crawling product pages, adding items to the cart, and removing them, wasting your crawl budget on useless actions.

With this insight, you can block those cart-related URLs with parameters to save resources so that Googlebot can crawl and index valuable, indexable canonical URLs.

Here is how you can make use of log file analysis:

  • Start by accessing your server access logs, which record bot activity.
  • Look at what pages bots hit most, how frequently they visit, and if they’re stuck on low-value URLs.
  • You don’t need to analyze logs manually. Tools like Screaming Frog Log File Analyzer make it easy to identify patterns quickly.
  • If you notice issues, like bots repeatedly crawling URLs with parameters, you can easily update your robots.txt file to block those unnecessary crawls.
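
As a rough sketch of the parameterized-URL check described above (the log lines, paths, and user-agent matching are simplified and hypothetical; real analysis should verify Googlebot via reverse DNS), a few lines of Python can count bot hits per URL:

```python
import re
from collections import Counter

# Matches the request line in common/combined log format, e.g. "GET /path HTTP/1.1"
LOG_PATTERN = re.compile(r'"(?:GET|POST) (?P<path>\S+) HTTP/[\d.]+"')

def googlebot_hits(log_lines):
    """Return (hits_per_path, parameterized_paths) for Googlebot requests."""
    hits = Counter()
    for line in log_lines:
        # Naive UA check for illustration; production code should confirm
        # the IP really belongs to Google (reverse DNS lookup).
        if "Googlebot" not in line:
            continue
        match = LOG_PATTERN.search(line)
        if match:
            hits[match.group("path")] += 1
    # URLs with query parameters are candidates for crawl-budget waste
    parameterized = {p for p in hits if "?" in p}
    return hits, parameterized

sample = [
    '66.249.66.1 - - [10/Apr/2025:10:00:00 +0000] "GET /product/shoes HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [10/Apr/2025:10:00:01 +0000] "GET /cart?add=123 HTTP/1.1" 200 128 "-" "Googlebot/2.1"',
    '203.0.113.5 - - [10/Apr/2025:10:00:02 +0000] "GET /product/shoes HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
]
hits, parameterized = googlebot_hits(sample)
```

If the parameterized set keeps filling up with cart or faceted-navigation URLs, that is exactly the pattern worth blocking in robots.txt.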

Getting log files isn’t always easy, especially for big enterprise sites where server access might be restricted.

If that’s the case, you can use the aforementioned Google Search Console’s Crawl Stats, which provides valuable insights into Googlebot’s crawling activity, including pages crawled, crawl frequency, and response times.

The Google Search Console Crawl Stats report provides a sample of data about Google’s crawling activity. (Screenshot from Google Search Console, April 2025)

While log files offer the most detailed view of search engine interactions, even a quick check in Crawl Stats helps you spot issues you might otherwise miss.

Read more: 14 Must-Know Tips For Crawling Millions Of Webpages

Lesson #5: Core Web Vitals Are Overrated. Stop Obsessing Over Them

  • Lesson: You should focus less on Core Web Vitals. They rarely make or break SEO results.

Core Web Vitals measure loading speed, interactivity, and visual stability, but they do not influence SEO as significantly as many assume.

After auditing over 500 websites, I’ve rarely seen Core Web Vitals alone significantly improve rankings.

Most sites only see measurable improvement if their loading times are extremely poor – taking more than 30 seconds – or have critical issues flagged in Google Search Console (where everything is marked in red).

The Core Web Vitals report in Google Search Console provides real-world user data. (Screenshot from Google Search Console, April 2025)

I’ve watched clients spend thousands, even tens of thousands of dollars, chasing perfect Core Web Vitals scores while overlooking fundamental SEO basics, such as content quality or keyword strategy.

Redirecting those resources toward content and foundational SEO improvements usually yields way better results.

When evaluating Core Web Vitals, you should focus exclusively on real-world data from Google Search Console (as opposed to lab data in Google PageSpeed Insights) and consider users’ geographic locations and typical internet speeds.

If your users live in urban areas with reliable high-speed internet, Core Web Vitals won’t affect them much. But if they’re rural users on slower connections or older devices, site speed and visual stability become critical.

The bottom line here is that you should always base your decision to optimize Core Web Vitals on your specific audience’s needs and real user data – not just industry trends.

Read more: Are Core Web Vitals A Ranking Factor?

Lesson #6: Use Schema (Structured Data) To Help Google Understand & Trust You

  • Lesson: You should use structured data (Schema) to tell Google who you are, what you do, and why your website deserves trust and visibility.

Schema Markup (or structured data) explicitly defines your content’s meaning, which helps Google easily understand the main topic and context of your pages.

Certain schema types, like rich results markup, allow your listings to display extra details, such as star ratings, event information, or product prices. These “rich snippets” can grab attention in search results and increase click-through rates.

You can think of schema as informative labels for Google. You can label almost anything – products, articles, reviews, events – to clearly explain relationships and context. This clarity helps search engines understand why your content is relevant for a given query.

You should always choose the correct schema type (like “Article” for blog posts or “Product” for e-commerce pages), implement it properly with JSON-LD, and carefully test it using Google’s Rich Results Test or the Schema Markup Validator.

In its documentation, Google shows examples of structured data markup supported by Google Search. (Screenshot from Google Search Console, April 2025)

Schema lets you optimize SEO behind the scenes without affecting what your audience sees.

While SEO clients often hesitate about changing visible content, they usually feel comfortable adding structured data because it’s invisible to website visitors.
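
As an illustration, a minimal Product markup block in JSON-LD might look like the following (all values are placeholders, and the exact properties required depend on the rich result you’re targeting; verify against Google’s structured data documentation):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Trail Running Shoes",
  "description": "Lightweight trail running shoes.",
  "offers": {
    "@type": "Offer",
    "price": "89.99",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "128"
  }
}
</script>
```

Because it lives in a script tag, this markup is invisible to visitors but readable by search engines, which is why clients rarely object to adding it.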

Read more: CMO Guide To Schema: How Your Organization Can Implement A Structured Data Strategy

Lesson #7: Keyword Research And Mapping Are Everything

  • Lesson: Technical SEO gets you into the game by controlling what search engines can crawl and index. But, the next step – keyword research and mapping – tells them what your site is about and how to rank it.

Too often, websites chase the latest SEO tricks or target broad, competitive keywords without any strategic planning. They skip proper keyword research and rarely invest in keyword mapping, both essential steps to long-term SEO success:

  • Keyword research identifies the exact words and phrases your audience actually uses to search.
  • Keyword mapping assigns these researched terms to specific pages and gives each page a clear, focused purpose.

Every website should have a spreadsheet listing all its indexable canonical URLs.

Next to each URL, there should be the main keyword that the page should target, plus a few related synonyms or variations.

Having a keyword mapping document is a vital element of any SEO strategy. (Image from author, April 2025)

Without this structure, you’ll be guessing and hoping your pages rank for terms that may not even match your content.

A clear keyword map ensures every page has a defined role, which makes your entire SEO strategy more effective.

This isn’t busywork; it’s the foundation of a solid SEO strategy.
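
The spreadsheet described above can also be sanity-checked programmatically. Here is a hypothetical sketch (the URLs and keywords are invented) that flags two common problems: pages with no primary keyword, and two pages targeting the same keyword, which risks cannibalization:

```python
# Hypothetical keyword map: one primary keyword per indexable canonical URL,
# plus supporting variations.
keyword_map = [
    {"url": "/running-shoes", "primary": "running shoes", "variations": ["best running shoes"]},
    {"url": "/trail-shoes", "primary": "trail running shoes", "variations": []},
    {"url": "/blog/shoe-care", "primary": "", "variations": []},
]

def audit_keyword_map(rows):
    """Return (urls_missing_a_primary_keyword, duplicate_keyword_pairs)."""
    missing = [r["url"] for r in rows if not r["primary"].strip()]
    seen, duplicates = {}, []
    for r in rows:
        key = r["primary"].strip().lower()
        if key and key in seen:
            duplicates.append((seen[key], r["url"]))  # two pages, same target
        elif key:
            seen[key] = r["url"]
    return missing, duplicates

missing, duplicates = audit_keyword_map(keyword_map)
```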

Read more: How To Use ChatGPT For Keyword Research

Lesson #8: On-Page SEO Accounts For 80% Of Success

  • Lesson: From my experience auditing hundreds of websites, on-page SEO drives about 80% of SEO results. Yet, only about one in 20 or 30 sites I review has done it well. Most get it wrong from the start.

Many websites rush straight into link building, generating hundreds or even thousands of low-quality backlinks with exact-match anchor texts, before laying any SEO groundwork.

They skip essential keyword research, overlook keyword mapping, and fail to optimize their key pages first.

I’ve seen this over and over: chasing advanced or shiny tactics while ignoring the basics that actually work.

When your technical SEO foundation is strong, focusing on on-page SEO can often deliver significant results.

There are thousands of articles about basic on-page SEO: optimizing titles, headers, and content around targeted keywords.

Yet, almost nobody implements all of these basics correctly. Instead of chasing trendy or complex tactics, you should focus first on the essentials:

  • Do proper keyword research to identify terms your audience actually searches.
  • Map these keywords clearly to specific pages.
  • Optimize each page’s title tags, meta descriptions, headers, images, internal links, and content accordingly.

These straightforward steps are often enough to achieve SEO success, yet many overlook them while searching for complicated shortcuts.
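As a rough illustration of the third step, a basic on-page spot-check can be scripted with Python's standard library alone. This sketch only covers three signals (title keyword, meta description presence, H1 count), and the sample HTML is invented; it's a starting point, not a complete audit.

```python
from html.parser import HTMLParser

class OnPageChecker(HTMLParser):
    """Collects the title, meta description, and H1s from an HTML page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.meta_description = ""
        self.h1s = []
        self._in = None  # tag currently being captured

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "description":
            self.meta_description = attrs.get("content", "")
        elif tag in ("title", "h1"):
            self._in = tag

    def handle_data(self, data):
        if self._in == "title":
            self.title += data
        elif self._in == "h1":
            self.h1s.append(data.strip())

    def handle_endtag(self, tag):
        if tag == self._in:
            self._in = None

def check_page(html, keyword):
    """Return a list of basic on-page issues for the target keyword."""
    parser = OnPageChecker()
    parser.feed(html)
    issues = []
    if keyword.lower() not in parser.title.lower():
        issues.append("keyword missing from <title>")
    if not parser.meta_description:
        issues.append("no meta description")
    if len(parser.h1s) != 1:
        issues.append(f"expected exactly one <h1>, found {len(parser.h1s)}")
    return issues

html = ("<html><head><title>Dog Training Books</title></head>"
        "<body><h1>Dog Training Books</h1></body></html>")
print(check_page(html, "dog training books"))  # → ['no meta description']
```

Running a check like this across your mapped URLs makes it obvious which pages skipped the basics.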

Read more: Google E-E-A-T: What Is It & How To Demonstrate It For SEO

Lesson #9: Internal Linking Is An Underused But Powerful SEO Opportunity

  • Lesson: Internal links hold more power than overhyped external backlinks and can significantly clarify your site’s structure for Google.

Internal links are way more powerful than most website owners realize.

Everyone talks about backlinks from external sites, but internal linking – when done correctly – can actually make a huge impact.

Unless your website is brand new, improving your internal linking can give your SEO a serious lift by helping Google clearly understand the topic and context of your site and its specific pages.

Still, many websites don’t use internal links effectively. They rely heavily on generic anchor texts like “Read more” or “Learn more,” which tell search engines absolutely nothing about the linked page’s content.

Low-value internal links. (Image from author, April 2025)
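These low-value anchors can be surfaced by crawling your own HTML and flagging generic anchor texts. The sketch below uses only Python's standard library; the list of generic phrases and the sample markup are assumptions for illustration, not an exhaustive rule.

```python
from html.parser import HTMLParser

GENERIC_ANCHORS = {"read more", "learn more", "click here", "here", "more"}

class AnchorAuditor(HTMLParser):
    """Records (anchor text, href) pairs for every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []
        self._href = None
        self._text = ""

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href", "")
            self._text = ""

    def handle_data(self, data):
        if self._href is not None:
            self._text += data

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append((self._text.strip(), self._href))
            self._href = None

def generic_links(html):
    """Return only the links whose anchor text tells Google nothing."""
    auditor = AnchorAuditor()
    auditor.feed(html)
    return [(text, href) for text, href in auditor.links
            if text.lower() in GENERIC_ANCHORS]

html = '<p>Our guide covers crate training. <a href="/crate-training">Read more</a></p>'
print(generic_links(html))  # flags the "Read more" anchor
```

Rewriting each flagged anchor to describe the destination page (e.g., "crate training guide") is the quick win here.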

Website owners often approach me convinced they need a deep technical audit.

Yet, when I take a closer look, their real issue frequently turns out to be poor internal linking or unclear website structure, both making it harder for Google to understand the site’s content and value.

Internal linking can also give a boost to underperforming pages.

For example, if you have a page with strong external backlinks, linking internally from that high-authority page to weaker ones can pass authority and help those pages rank better.

Investing a little extra time in improving your internal links is always worth it. They’re one of the easiest yet most powerful SEO tools you have.

Read more: Internal Link Structure Best Practices To Boost Your SEO

Lesson #10: Backlinks Are Just One SEO Lever, Not The Only One

  • Lesson: You should never blindly chase backlinks to fix your SEO. Build them strategically only after mastering the basics.

SEO audits often show websites placing too much emphasis on backlinks while neglecting many other critical SEO opportunities.

Blindly building backlinks without first covering SEO fundamentals – like removing technical SEO blockages, doing thorough keyword research, and mapping clear keywords to every page – is a common and costly mistake.

Even after getting those basics right, link building should never be random or reactive.

Too often, I see sites start building backlinks simply because their SEO isn’t progressing, hoping more links will magically help. This rarely works.

Instead, you should always approach link building strategically by first analyzing your direct SERP competitors to determine whether backlinks are genuinely your missing element:

  • Look closely at the pages outranking you.
  • Identify whether their advantage truly comes from backlinks or better on-page optimization, content quality, or internal linking.

The decision on whether or not to build backlinks should be based on whether direct competitors have more and better backlinks. (Image from author, April 2025)

Only after ensuring your on-page SEO and internal links are strong and confirming that backlinks are indeed the differentiating factor, should you invest in targeted link building.
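That comparison can be sketched as a toy script. The rows below mimic a hypothetical CSV export from a backlink tool (the column names and the threshold are purely illustrative assumptions), and the heuristic simply asks whether every outranking competitor beats you on referring domains by a clear margin.

```python
# Hypothetical rows as exported from a backlink tool's CSV:
competitors = [
    {"url": "competitor-a.com/guide", "referring_domains": 42},
    {"url": "competitor-b.com/guide", "referring_domains": 18},
]
my_page = {"url": "mysite.com/guide", "referring_domains": 15}

def backlink_gap(mine, others):
    """Rough heuristic: backlinks are likely the differentiator only if
    every outranking competitor beats you by a clear margin."""
    margins = [c["referring_domains"] - mine["referring_domains"] for c in others]
    return min(margins) > 5  # threshold is an arbitrary illustration

print(backlink_gap(my_page, competitors))  # → False
```

Here competitor B outranks the page with only three more referring domains, so the heuristic says backlinks are probably not the missing element; on-page quality or internal linking deserves the first look.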

Typically, you don’t need hundreds of low-quality backlinks. Often, just a few strategic editorial links or well-crafted SEO press releases can close the gap and improve your rankings.

Read more: How To Get Quality Backlinks: 11 Ways That Really Work

Lesson #11: SEO Tools Alone Can’t Replace Manual SEO Checks

  • Lesson: You should never trust SEO tools blindly. Always cross-check their findings manually using your own judgment and common sense.

SEO tools make our work faster, easier, and more efficient, but they still can’t fully replicate human analysis or insight.

Tools lack the ability to understand context and strategy in the way that SEO professionals do. They usually can’t “connect the dots” or assess the real significance of certain findings.

This is exactly why every recommendation provided by a tool needs manual verification. You should always evaluate the severity and real-world impact of the issue yourself.

Often, website owners come to me alarmed by “fatal” errors flagged by their SEO tools.

Yet, when I manually inspect these issues, most turn out to be minor or irrelevant.

Meanwhile, fundamental aspects of SEO, such as strategic keyword targeting or on-page optimization, are completely missing since no tool can fully capture these nuances.

Screaming Frog SEO Spider says there are rich result validation errors, but when I check that manually, there are no errors. (Screenshot from Screaming Frog, April 2025)
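A manual spot-check of a structured-data flag can start as simply as extracting a page's JSON-LD blocks yourself and confirming they parse. This is a minimal standard-library sketch (the sample markup is invented), not a full rich-results validator; a real check would continue with Google's Rich Results Test.

```python
import json
from html.parser import HTMLParser

class JsonLdExtractor(HTMLParser):
    """Collects the raw contents of <script type="application/ld+json"> blocks."""
    def __init__(self):
        super().__init__()
        self.blocks = []
        self._capturing = False

    def handle_starttag(self, tag, attrs):
        if tag == "script" and dict(attrs).get("type") == "application/ld+json":
            self._capturing = True
            self.blocks.append("")

    def handle_data(self, data):
        if self._capturing:
            self.blocks[-1] += data

    def handle_endtag(self, tag):
        if tag == "script":
            self._capturing = False

def structured_data(html):
    """Parse every JSON-LD block; json.loads raises if a block is malformed."""
    extractor = JsonLdExtractor()
    extractor.feed(html)
    return [json.loads(block) for block in extractor.blocks]

html = ('<script type="application/ld+json">'
        '{"@type": "Product", "name": "Dog Training Book"}</script>')
print(structured_data(html))
```

If the JSON parses cleanly and the expected properties are present, a tool's "fatal" validation error deserves skepticism before you act on it.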

SEO tools are still incredibly useful because they handle large-scale checks that humans can’t easily perform, like analyzing millions of URLs at once.

However, you should always interpret their findings carefully and manually verify the importance and actual impact before taking any action.

Final Thoughts

After auditing hundreds of websites, the biggest pattern I notice isn’t complex technical SEO issues, though they do matter.

Instead, the most frequent and significant problem is simply a lack of a clear, prioritized SEO strategy.

Too often, SEO is done without a solid foundation or clear direction, which makes all other efforts less effective.

Another common issue is undiagnosed technical problems lingering from old site migrations or updates. These hidden problems can quietly hurt rankings for years if left unresolved.

The lessons above cover the majority of challenges I encounter daily, but remember: Each website is unique. There’s no one-size-fits-all checklist.

Every audit must be personalized and consider the site’s specific context, audience, goals, and limitations.

SEO tools and AI are increasingly helpful, but they’re still just tools. Ultimately, your own human judgment, experience, and common sense remain the most critical factors in effective SEO.

More Resources:


Featured Image: inspiring.team/Shutterstock