Google On Negative Authorship Signal And Mini-Site Reputation via @sejournal, @martinibuster

At the recent Search Central Live NYC event, Danny Sullivan discussed what happens when a site begins publishing vastly different content and how that may affect rankings, introducing the concept of a mini-site as a metaphor for how a site’s reputation can be divided. He also discussed the concept of negative authorship authority, which some SEOs believe follows authors from penalized websites and can negatively affect the other sites they publish on.

Negative Authorship Reputation

Danny initially discussed a negative authorship signal that some in the SEO community believe can follow an author from site to site. The idea is that an author whose content is banned on one site will also have their content banned on other sites they write for. He denied that Google tracks author authority signals from site to site.

Sullivan explained:

“If you wrote for a site that got a manual action, it doesn’t somehow infect the other site that you might work for later on, so again, this is not something that freelancers should be worried about.

If you’re a publication and for whatever reason you feel like employing a freelancer, and it makes sense, that’s fine. You don’t need to worry about who they worked for before.

And if you are a freelancer you do not need to go back to the publications and say, can you take my byline down because now I can’t get hired from anybody else because they think I’m going to infect them. It is not like that. It’s not a disease.”

The above SEO myth likely began when publishers noticed that content created by a certain author was banned across multiple sites. In that case, it’s reasonable to assume that there was something wrong with the content, but that’s not necessarily true. It could have been that the websites were promoted with unnatural links, or that the sites were engaged in selling links.

The takeaway from what Danny Sullivan shared is that a manual action on one site doesn’t follow an author to another site. Another takeaway is that there is no negative authorship signal that Google is tracking.

And if there’s no negative authorship signal, could it be that there’s no positive authorship signal either? In my opinion that’s a reasonable assumption. A signal like that would be too easy to manipulate. Whatever signals Google uses to understand site reputation are likely enough for the purpose of citing an information source in the search results.

Although some SEOs have made claims about authorship signals, such signals have never been known to be part of Google’s algorithms. Google has a long history of denying the use of authorship signals, and Danny’s statements offer further confirmation that Google does not use authorship as a ranking signal.

Ranking Drops And Mini-Site Reputation

Danny next discussed how a new section of a site could suddenly lose rankings. He said this isn’t necessarily a bad thing; it’s just Google trying to figure out the new section, and if it’s sufficiently different, Google could even start treating it as a standalone mini-site. This is a really fascinating part of his talk.

Danny used the example of the addition of a forum to a website.

Danny explained:

“For example, you might have a site where you start running a forum. Forums can be different and we would want to understand that this looks like a forum so that we can then rank the forum content against other kinds of forum content on kind of a level playing field or understand that that forum content should be included in things where we try to show forum content.

What can happen is… that it could be that part of your site was doing better because it was seen as part of the overall site. Now we kind of see it as more of independent and part of a full site on its own.

And potentially you could see a traffic drop that comes from that. That doesn’t mean that you suddenly got a site reputation abuse ban issue because first of all that might not have involved third party content abusing first party work, right? Those were the things. So if it doesn’t have any of that it doesn’t have anything to do with that. Secondly, we would have sent you an email. So, it’s not bad.

Because it just could be we’ve had a general re-ranking… It could also mean that in the long run that part of your site might actually do better, because we might recognize it in different ways, that we might be able to surface it in different ways. And it might start sort of earning its own like ‘mini-site’ reputation along the way.”

Three things to take away from that last part.

First, a ranking drop could be due to benign things; don’t always assume that a ranking drop is due to spam or another negative algorithmic action.

Second, a rankings drop could be due to a “general re-ranking” which is a vague term that went unexplained but is probably a reference to minor ranking adjustments outside of a core algorithm update.

The third takeaway is the part about a section of a website earning its own “mini-site” reputation. I think SEOs should not create theories about mini-sites and mini-site reputations because that’s not what Danny Sullivan said. He used the word “like,” which suggests he used the phrase “mini-site” as a metaphor.

Featured Image by Shutterstock/Joseph Hendrickson

YouTube Unveils New AI-Powered Hook Generator via @sejournal, @MattGSouthern

YouTube has announced three new features in its ‘Inspiration’ suite for creators. These tools use AI to generate fresh ideas that can help keep viewers watching.

The most eye-catching of the three is the new hook generator, which offers suggestions for engaging video openings.

Below, I’ll break down the new features and explain how they can improve video performance.

YouTube’s New AI Hook Generator

Video hooks are the first moments in a video that grab the viewer’s attention. With shorter attention spans and more online competition, strong hooks are essential.

YouTube’s new AI tool provides three hook suggestions that can help creators capture their audience quickly.

1. Statement Hook

This hook uses a strong statement to address a common problem. For example, a statement might say, “Stop letting creative block control you. I’ll show you how to beat it for good in this video. Let’s get started.”

2. Visual Hook

The visual hook guides creators to use striking imagery. One suggestion starts with an extreme close-up of a blinking cursor, then zooms out to reveal a messy desk and a frustrated face, ending with a bold brushstroke on a canvas.

3. Action Hook

This hook is all about movement.

YouTube’s example suggests using a hand crumpling paper and repeatedly tossing it in the trash, then quickly changing to a fresh start as the hand picks up a new sheet and begins to draw.

See an example of the interface below:

Screenshot from: YouTube.com/CreatorInsider, March 2025.

Additional Inspiration Features

While hooks are the main highlight, YouTube has introduced two other tools.

Brainstorm from Anywhere

YouTube notes that creators often get new ideas while checking performance data or reading audience comments. This tool uses data from past videos to suggest new content ideas, making brainstorming easier as you work.

Quick Saves

The quick saves feature lets creators save ideas directly from the brainstorming list. This helps you capture inspiration when it strikes without breaking your creative flow.

See each of these tools in action in the video below:

Availability

These new updates are part of YouTube’s growing Inspiration Tab in YouTube Studio. The tab now helps creators with hooks, video outlines, titles, and thumbnails.

The full Inspiration suite is available on desktop for most creators worldwide. However, due to local regulations, it’s not yet available in the European Union, United Kingdom, or Switzerland.

Implications for Content Strategy

YouTube does warn that “AI-generated content may be inaccurate or inappropriate, vary in quality, or provide information that doesn’t reflect YouTube’s views.”

Even so, these tools provide a helpful base for creators to build on and refine.

As online competition grows, these AI-powered features come at the right time. They offer a practical way to engage viewers better and optimize video performance.


Featured Image: Best Smile Studio/Shutterstock

Google’s Martin Splitt: JavaScript-Loaded Images Can Be Indexed via @sejournal, @MattGSouthern

Google’s Developer Advocate Martin Splitt recently debunked a common SEO myth. He confirmed that images loaded with JavaScript can be indexed by Google when set up correctly.

Splitt shared these insights during the SEO for Paws Conference, a live-streamed fundraiser by Anton Shulke.

Here’s how to avoid common image indexing issues when loading images with JavaScript.

JavaScript Image Loading Isn’t the Problem

When asked about images loaded by JavaScript, Splitt clarified that the method is not to blame for indexing issues.

Splitt explains:

“JavaScript to load images is fine. A purely JavaScript image loading solution can absolutely get your images indexed.”

This comment clears up a worry shared by many SEO pros: when images don’t appear in Google Images, the cause lies somewhere other than the use of JavaScript itself.
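To make that concrete, here is a minimal sketch (my own illustration, not from Splitt’s talk) of a purely JavaScript image insertion. The image URL and container selector are placeholders; the point is that the resulting img element, with a crawlable src and alt text, ends up in the rendered HTML that Google indexes.

```javascript
// Minimal sketch: an image added entirely with JavaScript.
// After rendering, this <img> element is part of the rendered HTML,
// which is the version of the page Google uses for indexing.
const img = document.createElement('img');
img.src = 'https://example.com/images/product-photo.jpg'; // crawlable URL, not a blob: or data: URI
img.alt = 'Red running shoe, side view';                  // alt text still matters for image search
document.querySelector('#gallery').appendChild(img);      // placeholder container
```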

The Real Culprits Behind Unindexed Images

Splitt explained that something else is usually wrong if JavaScript-loaded images don’t appear in search results.

He pointed to a few common issues:

  • Sitemap Problems: Sometimes, key images are missing from XML sitemaps.
  • HTTP Headers: Some image files may have headers that stop them from being indexed (see the sketch after this list).
  • Rendered HTML Issues: If images don’t appear in the rendered HTML (the version Google sees after JavaScript runs), they won’t get indexed.
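The header issue in particular is easy to overlook. As an illustrative check (the image URL is a placeholder), you can request an image and inspect its response headers; an X-Robots-Tag header containing noindex will keep the file out of Google Images even if everything else is set up correctly.

```javascript
// Illustrative sketch: check whether an image response carries an
// X-Robots-Tag header that blocks indexing. Run it against your own
// domain (cross-origin requests may be restricted by CORS).
fetch('https://example.com/images/product-photo.jpg', { method: 'HEAD' })
  .then((response) => {
    const robotsHeader = response.headers.get('x-robots-tag');
    if (robotsHeader && robotsHeader.toLowerCase().includes('noindex')) {
      console.warn('Image is blocked from indexing by its headers:', robotsHeader);
    } else {
      console.log('No blocking X-Robots-Tag header found.');
    }
  });
```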

Debugging JavaScript Image Indexing Issues

Splitt offers a simple process to spot problems. Start by checking if images appear in the rendered HTML using tools like Search Console’s URL Inspection tool.

Splitt explains:

“You would have to check: is the rendered HTML containing the images? If it is, fantastic. If it’s not, then something else is off.”

Since Google indexes the rendered HTML, any image missing from it won’t be found by Googlebot.

See Splitt’s full talk on JavaScript SEO in the video below:

Common JavaScript Image Loading Techniques & Their SEO Impact

There are several ways to load images with JavaScript. Some common methods include:

  • Lazy Loading: Loads images only when needed.
  • Progressive Loading: Shows a low-quality image first, then upgrades to a high-quality one.
  • Infinite Scroll Loading: Loads images as users continue to scroll.
  • Background Image Insertion: Adds images through CSS backgrounds.

If they are set up properly, all these methods can work with Google’s indexing. Each may need its own checks to ensure everything is working as expected.
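As one example of “set up properly,” here is a rough sketch of lazy loading with IntersectionObserver, a pattern Google’s rendering can generally handle. The class name and data attribute are illustrative, and the safest confirmation remains checking the rendered HTML in the URL Inspection tool, as described above.

```javascript
// Rough sketch of lazy loading that stays indexable: the real URL sits
// in a data attribute and is swapped into src when the image nears the
// viewport. Class and attribute names here are illustrative.
const lazyImages = document.querySelectorAll('img.lazy[data-src]');

const observer = new IntersectionObserver((entries, obs) => {
  entries.forEach((entry) => {
    if (!entry.isIntersecting) return;
    const img = entry.target;
    img.src = img.dataset.src;    // load the real image
    img.classList.remove('lazy');
    obs.unobserve(img);
  });
}, { rootMargin: '200px' });      // start loading slightly before the image is visible

lazyImages.forEach((img) => observer.observe(img));
```

Simpler setups, such as the native loading attribute covered in the best practices below, avoid most of these edge cases.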

Best Practices for SEO-Friendly JavaScript Image Loading

Even though JavaScript-loaded images can be indexed, following these best practices can help avoid issues:

  • Verify with the URL Inspection Tool: Ensure images appear in the rendered HTML.
  • Update Your XML Sitemaps: Include key images with proper tags.
  • Use Alt Text: Provide clear alt text for images loaded via JavaScript.
  • Use Native Lazy Loading: Add the loading="lazy" attribute where it makes sense.
  • Check Robots.txt: Ensure you are not blocking JavaScript resources that load images.

What This Means for SEO Professionals

Instead of avoiding JavaScript, verify that images are loaded correctly and appear in the rendered HTML.

As websites rely more on JavaScript, understanding these details is key. SEO professionals who learn to troubleshoot and optimize JavaScript-based image loading will be better prepared to support their clients’ visibility in search results.

Looking Ahead

This clarification is timely. Many modern sites built with frameworks like React, Vue, or Angular load images with JavaScript instead of plain HTML image tags in the source.

Splitt’s insights help dispel the myth that JavaScript harms image indexing. Developers can now focus on performance without worrying about SEO penalties.


Featured Image: Alicia97/Shutterstock

Stop Guessing. Start Converting: The Key To Smarter Lead Generation In 2025 via @sejournal, @Juxtacognition

Marketers have always relied on data to fine-tune their strategies.

But for years, that data has been based on assumptions made from broad industry benchmarks, competitor insights, and vague trends.

The result?

Marginal conversion improvements and a constant struggle to truly connect with the audience.

Better Leads → More Sales In 2025: How To Analyze Leads To Improve Marketing Performance from CallRail shows you how to transform customer conversations into actionable insights that improve your marketing performance at every stage of the funnel.

And you can download your copy right now.

What If You Could Stop Guessing & Start Marketing With Precision?

The secret lies in the conversations your customers are already having with you.

Every call, chat, and interaction holds valuable insights and the exact words, concerns, and motivations that drive your audience to take action.

Yet, most businesses overlook this goldmine of data.

Instead, they continue to make decisions based on surface-level analytics like click-through rates, page views, and lead form submissions.

And while that strategy works, it’s only part of the story. Why?

Image by Paulo Bobita/Search Engine Journal, March 2025

While numbers show what’s happening, your customers’ conversations can tell you why a customer did or didn’t convert and what he or she was thinking.

By leveraging these conversations and other first-party conversational data, you can unlock real insights, refine your messaging, and optimize your marketing, sales, customer service, and more.

Without the guesswork.

The Power Of Listening To Your Customers

Better Leads → More Sales In 2025: How To Analyze Leads To Improve Marketing Performance shows you how to make the most of those insights by providing practical, step-by-step strategies for improving lead quality, increasing conversions, and refining your marketing approach.

You’ll learn how to craft messaging that resonates with your audience, optimize funnels to reflect actual user behavior, and uncover friction points before they impact conversions.

From improving marketing copy to boosting customer retention and increasing Customer Lifetime Value (CLV), the strategies outlined are practical and results-driven.

You’ll be able to move away from guesswork with confidence and build marketing campaigns that feel relevant, personal, and persuasive, leading to higher-quality leads and more sales.

The Future Of Marketing Is Data-Driven And Customer-Focused

Brands that win in 2025 won’t be those with the biggest ad budgets.

They’ll be the ones that listen.

When you understand your customers’ frustrations, their needs, and the exact words they use to describe their problems, you can craft campaigns that feel personal, relevant, and persuasive.

This isn’t just about getting more leads.

It’s about getting the right leads and turning them into loyal customers.

What You’ll Learn In This Ebook:

  • Uncover Customer Insights: Learn how to extract powerful insights from customer conversations, sentiment analysis, and first-hand interactions.
  • Improve Marketing Messaging: Use the language your audience naturally speaks to create high-converting ads, landing pages, and content.
  • Optimize Your Lead Generation Funnels & Customer Journeys: Build a pipeline that reflects real customer behavior. Not assumptions.
  • Reduce Friction & Increase Conversions: Identify barriers before they impact your bottom line.
  • Increase CLV & Customer Lifespan: Find upsell opportunities and improve customer retention using call and chat transcript analysis.

Why This Matters

  • Marketing is evolving. Customers expect brands to understand them and not just sell to them.
  • Data beats guesswork. First-party conversational data gives you direct access to what your customers truly care about.
  • Better insights = higher conversions. When your message aligns with customer needs, engagement and sales increase.

Want to put these insights to work for your business?

Download your free copy today and start turning customer conversations into your most powerful marketing asset.


Featured Image: Paulo Bobita/Search Engine Journal

Data Shows Google AIO Is Citing Deeper Into Websites via @sejournal, @martinibuster

New data from BrightEdge that was shared with Search Engine Journal shows that Google’s AI Overviews (AIO) in March 2025 have rapidly expanded in size and are shifting how traffic is distributed across search results. The analysis suggests that deep, specialized content is likelier to be cited than homepages.

This shows that AIO is becoming more precise about the answers it is giving, aligning with the concepts of Predictive Summaries and Grounding Links that Google recently shared with marketers at Search Central NYC.

Google has traditionally favored showing precise answers to questions, something that the home pages of websites generally cannot do. It makes sense that BrightEdge’s data reflects the kind of precise linking that Predictive Summaries and Grounding Links display in AIO.

Google expanded the pixel height of AIO by 18.26% in the first two weeks of March. Although some may rightly note that this reduces the visibility of organic search results and their outbound links, it’s important to put that into context: Google’s AIO also has outbound links, and those links are highly precise and contextually relevant.

The expansion of AIO size was not across the board. Industry-specific increases in AI Overview size:

  • Travel: +39.49%
  • B2B Tech: +37.13%
  • Education: +35.49%
  • Finance: +32.89%

Strategic Response for SEOs And Publishers

BrightEdge suggests that publishers and SEOs should monitor performance metrics to track changes in traffic, impressions, CTRs, and clicks to evaluate how AI Overviews may be influencing traffic trends. It’s especially important to identify sales or revenue trends, because those, not traffic, are the most important metrics.

Although it may be useful to create citable content, Google is generally summarizing content and then linking to where users can read more. Now more than ever, it’s important to be aware of user trends in your industry and be able to anticipate them, including the context of a user’s search. Jono Alderson recently suggested targeting users at the very early stages of the consumer journey in order to get ahead of AI-based citations.

Importance of In-Depth, Specialized Content

Google AIO is showing a citation preference for deep-linked content, meaning pages that are two or more clicks from the home page (2+ deep). 82.5% of clicks went to 2+ deep pages, while home pages accounted for less than 0.5% of all clicks.

86% of cited pages ranked for only one keyword, often a high-volume one. This represents an opportunity to capture traffic from high-volume keywords. The median search volume for citation-triggering keywords was 15,300 monthly searches, and 19% of those keywords exceeded 100,000 monthly searches.

Implications For Technical SEO And Content Optimization

BrightEdge suggests that full site indexing is critical for AI Overviews in order to ensure that every page is available to be cited as a potential source. Even older and otherwise overlooked content may gain value, especially if it’s reviewed and updated so that it’s suitable to be cited and reflects the most current information.

Google has been citing deeper content for many years now, and the age of home page primacy has long been over, except in local search. That said, home pages accounted for only half a percent of clicks from AIO, so it’s now more important than ever to optimize inner pages.

Takeaways:

The following are the top takeaways from the data:

  • Google’s AI Overviews are rapidly expanding in visual size on the SERP
  • Industries like Travel, B2B Tech, Education, and Finance are experiencing the fastest AI Overview growth
  • Deeper, more specific content is overwhelmingly favored for AI citations over homepages
  • Pages cited in AI Overviews often surface for just one keyword—frequently high-volume
  • Technical SEO and full-site indexing are now essential for brand visibility in AI-driven search

Google’s AI Overviews are not just expanding in size; they are improving the contextual relevance of outbound links to websites. Optimizing for AIO should turn an eye toward keeping older content fresh and up to date to keep it relevant for users who will appreciate that content.

BrightEdge data shared with Search Engine Journal has not been published but a monthly updated guide to AIO is available.

Featured Image by Shutterstock/B..Robinson

Google On Scaled Content: “It’s Going To Be An Issue” via @sejournal, @martinibuster

Google’s John Mueller and Danny Sullivan discussed why AI-generated content is problematic, citing the newly updated quality rater guidelines and sharing examples of how AI can be used in a positive way that adds value.

Danny Sullivan, known as Google Search Liaison, spoke about the topic in more detail, providing an example of what a high quality use of AI generated content is to serve as a contrast to what isn’t a good use of it.

Update To The Quality Rater Guidelines

The quality rater guidelines (QRG) is a book created by Google to provide guidance to third-party quality raters who rate tests of changes to Google’s search results. It was recently updated and it now includes guidance about AI generated content that’s folded into a section about content created with little effort or originality.

Mueller discussed AI generated content in the context of scaled content abuse, noting that the quality raters are taught to rate that kind of content as low quality.

The new section of the QRG advises the raters:

“The lowest rating applies if all or almost all of the MC on the page (including text, images, audio, videos, etc) is copied, paraphrased, embedded, auto or AI generated, or reposted from other sources with little to no effort, little to no originality, and little to no added value for visitors to the website. Such pages should be rated Lowest, even if the page assigns credit for the content to another source.”

Doesn’t Matter How It’s Scaled: It’s Going To Be An Issue

Danny Sullivan, known as Google Search Liaison, started his part of the discussion by saying that to Google, AI generated content is no different than scaled content tactics from the past, comparing it to the spam tactics of 2005 when Google used statistical analysis and other methods to catch scaled content. He also emphasized that it doesn’t matter how the content was scaled.

According to my notes, here’s a paraphrase of what he said:

“The key things are, large amounts of unoriginal content and also no matter how it’s created.

Because like, ‘What are you going to do about AI? How are you going to deal with all the AI explosion? AI can generate thousands of pages?’

Well 2005 just called, it’d like to explain to you how human beings can generate thousands of pages overnight that look like they’re human generated because they weren’t human generated and etcetera, etcetera, etcetera.

If you’ve been in the SEO space for a long time, you well understand that scaled content is not a new type of thing. So we wanted to really stress: we don’t really care how you’re doing this scaled content, whether it’s AI, automation, or human beings. It’s going to be an issue.

So those are things that you should consider if you’re wondering about the scaled content abuse policy and you want to avoid being caught by it.”

How To Use AI In A Way That Adds Value

A helpful thing about Danny’s session is that he offered an example of a positive use of AI, citing how retailers offer a summary of actual user reviews that gives the overall user sentiment of a product without requiring shoppers to read every review. This is an example of AI providing added value as opposed to generating the entire main content.

This is from my notes of what he said:

“When I go to Amazon, I skip down to the reviews and the reviews have a little AI-generated thing at the top that tells me what the users generally think, and I’m like, this is really helpful.

And the thing that’s really helpful to me about it is, it’s AI applied to original content, the reviews, to give me a summary. That was added value for me and unique value for me. I liked it.”

As Long As It’s High Quality….

Danny next discussed how they tried to put out a detailed policy about AI generated content but he said it was misconstrued by some parts of the SEO community to mean that AI generated content was fine as long as it was quality AI generated content.

In my 25 years of SEO experience, let me tell you: whenever an SEO tells you that a tactic is fine “as long as it’s quality,” run. The “as long as it’s quality” excuse has been used to justify low-quality SEO practices like reciprocal links, directory links, paid links, and guest posts. If it’s not already an SEO joke, it should be.

Danny continued:

“And then people’s differentiation of what’s quality is all messed up. And they say ‘Google doesn’t care if it’s AI!’ And that is not really what we said.

We didn’t say that.”

Don’t Mislead Yourself About Quality Of Scaled Content

Danny advised that anyone using artificially generated content should think about two things to use as tests for whether it’s a good idea:

  1. The motivation for mass generated content.
  2. Unoriginality of the scaled content.

Traffic Motivated Content

The motivation shouldn’t be that it will bring more traffic. The motivation should be that there’s a value-add for site visitors.

This is how Danny Sullivan explained it, according to my notes:

“Any method that you undertake to mass generate content, you should be carefully thinking about it. There’s all sorts of programmatic things, maybe they’re useful. Maybe they’re not. But you should think about it.

And the things to especially think about is if you’re primarily doing it to game search traffic.

Like, if the primary intent of the content was, ‘I’m going to get that traffic’ and not, ‘some user actually expected it’ if they ever came to my website directly. That’s one of the many things you can use to try to determine it.”

Originality Of Scaled Content

SEOs who praise their AI-generated content lose their enthusiasm when the content is about a topic they’re actually expert in, and will concede that it’s not as smart as they are. What’s going on is that if you are not an expert, you lack the expertise to judge the credibility of the AI-generated content.

AI is trained to crank out the next likeliest word in a series of words, a level of unoriginality so extreme that only a computer can accomplish it.

Sullivan next offered a critique of the originality of AI-generated content:

“The other thing is, is it unoriginal?

If you are just using the tool saying, ‘Write me 100 pages on the 100 different topics that I got because I ran some tool that pulled all the People Also Asked questions off of Google and I don’t know anything about those things and they don’t have any original content or any value. I just kind of think it’d be nice to get that traffic.’

You probably don’t have anything original.

You’re not necessarily offering anything with really unique value with it there.

A lot of AI tools or other tools are very like human beings because they’ve read a lot of human being stuff like this as well. Write really nice generic things that read very well as if they are quality and that they answer what I’m kind of looking for, but they’re not necessarily providing value.

And sometimes people’s idea of quality differ, but that’s not the key point of it when it comes to the policy that we have with it from there, that especially because these days some people would tell you that it’s quality.”

Takeaways:

  • Google doesn’t “care how you’re doing this scaled content, whether it’s AI, automation, or human beings. It’s going to be an issue.”
  • The QRG explicitly includes AI-generated content in its criteria for ‘Lowest’ quality ratings, signaling that this is something Google is concerned about.
  • Ask if the motivation for using AI-generated content is primarily to drive search traffic or to help users.
  • Originality and value-add are important qualities of content to consider.

Google’s Martin Splitt Reveals 3 JavaScript SEO Mistakes & Fixes via @sejournal, @MattGSouthern

Google’s Martin Splitt recently shared insights on how JavaScript mistakes can hurt a website’s search performance.

His talk comes as Google Search Advocate John Mueller also urges SEO pros to learn more about modern client-side technologies.

Mistake 1: Rendered HTML vs. Source HTML

During the SEO for Paws Conference, a live-streamed fundraiser by Anton Shulke, Splitt drew attention to a trend he’s noticing.

Many SEO professionals still focus on the website’s original source code even though Google uses the rendered HTML for indexing. Rendered HTML is what you see after JavaScript has finished running.

Splitt explains:

“A lot of people are still looking at view source. That is not what we use for indexing. We use the rendered HTML.”

This is important because JavaScript can change pages by removing or adding content. Understanding this can help explain some SEO issues.
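As a simple illustration (mine, not Splitt’s), imagine a page whose source HTML contains only an empty div with the ID “description.” The paragraph in the sketch below exists only in the rendered HTML, which is why checking “view source” can be misleading.

```javascript
// The HTML file ships with an empty <div id="description"></div>.
// This script fills it in at runtime, so the paragraph appears in the
// rendered HTML (what Google indexes) but not in "view source."
document.addEventListener('DOMContentLoaded', () => {
  const container = document.querySelector('#description'); // placeholder ID
  container.innerHTML = '<p>Full product description, generated client-side.</p>';
});
```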

Mistake 2: Error Pages Being Indexed

Splitt pointed out a common error with single-page applications and JavaScript-heavy sites: they often return a 200 OK status for error pages.

This happens because the server sends a 200 response before the JavaScript checks if the page exists.

Splitt explains:

“Instead of responding with 404, it just responds with 200 … always showing a page based on the JavaScript execution.”

When error pages get a 200 code, Google indexes them like normal pages, hurting your SEO.

Splitt advises checking server settings to handle errors properly, even when using client-side rendering.
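Splitt’s fix, returning the correct status code from the server, is the cleanest option. Where that isn’t possible in a fully client-rendered app, Google’s JavaScript SEO guidance has also described a client-side fallback: flag the error state as noindex, or redirect to a URL that does return a 404. Below is a rough sketch with placeholder endpoint and function names.

```javascript
// Rough sketch for a single-page app that cannot return a real 404:
// when the data lookup fails, add a robots noindex tag so the error
// state doesn't get indexed as a normal page.
// The /api/products endpoint, renderNotFound, and renderProduct are placeholders.
async function loadProduct(productId) {
  const response = await fetch(`/api/products/${productId}`);
  if (!response.ok) {
    const meta = document.createElement('meta');
    meta.name = 'robots';
    meta.content = 'noindex';
    document.head.appendChild(meta); // tell search engines to skip this error state
    renderNotFound();                // show the "not found" UI to users
    return;
  }
  renderProduct(await response.json());
}
```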

Mistake 3: Geolocation Request Issue

Another problem arises when sites ask users for location or other permissions.

Splitt says Googlebot will always refuse the request if a site relies on geolocation (or similar requests) without a backup plan.

Splitt explains:

“Googlebot does not say yes on that popup. It says no on all these requests … so if you request geolocation, Googlebot says no.”

Without alternative content, the page can appear blank to Googlebot, meaning nothing gets indexed. This can turn into a serious SEO mistake.
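Here is a rough sketch of the kind of fallback Splitt is describing (the function names are placeholders): if the geolocation request fails or is declined, which is what happens for Googlebot, the page still renders useful default content.

```javascript
// Sketch of a geolocation request with a default fallback. Googlebot
// declines permission prompts, so the fallback path is what it sees.
// showNearbyStores and showDefaultStoreList are placeholder functions.
function initStoreLocator() {
  if (!('geolocation' in navigator)) {
    showDefaultStoreList(); // no geolocation API available at all
    return;
  }
  navigator.geolocation.getCurrentPosition(
    (position) => showNearbyStores(position.coords),
    () => showDefaultStoreList() // permission denied or unavailable
  );
}
initStoreLocator();
```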

How to Debug JavaScript for SEO

Splitt shared a few steps to help diagnose and fix JavaScript issues:

  1. Start with Search Console: Use the URL Inspection tool to view the rendered HTML.
  2. Check the Content: Verify if the expected content is there.
  3. Review HTTP Codes: Look at the status codes in the “More info” > “Resources” section.
  4. Use Developer Tools: Open your browser’s developer tools. Check the “initiator” column in the Network tab to see which JavaScript added specific content.

Splitt adds:

“The initiator is what loaded it. If it’s injected by JavaScript, you can see which part of the code did it.”

Following these steps can help you find the problem areas and work with your developers to fix them.

See Splitt’s full talk in the recording below:

A Shift in SEO Skills

Splitt’s advice fits with Mueller’s call for SEOs to broaden their skill set.

Mueller recently suggested that SEO professionals learn about client-side frameworks, responsive design, and AI tools.

Mueller stated:

“If you work in SEO, consider where your work currently fits in … if your focus was ‘SEO at server level,’ consider that the slice has shrunken.”

Modern JavaScript techniques create new challenges that old SEO methods cannot solve alone. Splitt’s real-world examples show why understanding these modern web practices is now critical.

What This Means For SEO Professionals

Both Google Advocates point to a clear trend: SEO now requires more technical skills. As companies look for professionals who can blend SEO and web development, the demand for these modern skills is growing.

To keep up, SEO pros should:

  • Learn How JavaScript Affects Indexing: Know the difference between source and rendered HTML.
  • Master Developer Tools: Use tools like Search Console and browser developer tools to spot issues.
  • Collaborate with Developers: Work together to build sites that serve users and search engines well.
  • Broaden Your Skillset: Add client-side techniques to your traditional SEO toolkit.

Looking Ahead

As the web evolves, so must the skills of SEO professionals. However, leveling up your knowledge doesn’t have to be intimidating.

This fresh look at JavaScript’s role in SEO shows that even simple changes can have a big impact.


Featured Image: BestForBest/Shutterstock

AI Researchers Warn: Hallucinations Persist In Leading AI Models via @sejournal, @MattGSouthern

A report from the Association for the Advancement of Artificial Intelligence (AAAI) reveals a disconnect between public perceptions of AI capabilities and the reality of current technology.

Factuality remains a major unsolved challenge for even the most advanced models.

The AAAI’s “Presidential Panel on the Future of AI Research” report draws on input from 24 experienced AI researchers and survey responses from 475 participants.

Here are the findings that directly impact search and digital marketing strategies.

Leading AI Models Fail Basic Factuality Tests

Despite billions in research investment, AI factuality remains largely unsolved.

According to the report, even the most advanced models from OpenAI and Anthropic “correctly answered less than half of the questions” on new benchmarks like SimpleQA, a collection of straightforward factual questions.

The report identifies three main techniques being deployed to improve factuality:

  • Retrieval-augmented generation (RAG): Gathering relevant documents using traditional information retrieval before generating answers.
  • Automated reasoning checks: Verifying outputs against predefined rules to cull inconsistent responses.
  • Chain-of-thought (CoT): Breaking questions into smaller units and prompting AI to reflect on tentative conclusions.

However, these techniques show limited success, with 60% of AI researchers expressing pessimism that factuality issues will be “solved” in the near future.

This suggests you should prepare for continuous human oversight to ensure content and data accuracy. AI tools may speed up routine tasks, but full autonomy remains risky.

The Reality Gap: AI Capabilities vs. Public Perception

The report highlights a concerning perception gap, with 79% of AI researchers surveyed disagreeing or strongly disagreeing that “current perception of AI capabilities matches the reality.”

The report states:

“The current Generative AI Hype Cycle is the first introduction to AI for perhaps the majority of people in the world and they do not have the tools to gauge the validity of many claims.”

As of November, Gartner placed Generative AI just past its peak of inflated expectations, now heading toward the “trough of disillusionment” in its Hype Cycle framework.

For those in SEO and digital marketing, this cycle can provoke boom-or-bust investment patterns. Decision-makers might overcommit resources based on AI’s short-term promise, only to experience setbacks when performance fails to meet objectives.

Perhaps most concerning, 74% of researchers believe research directions are driven by hype rather than scientific priorities, potentially diverting resources from foundational issues like factuality.

Dr. Henry Kautz, chair of the Factuality & Trustworthiness section of the report, notes that “many of the public statements of people quite new to the field are out of line with reality,” suggesting that even expert commentary should be evaluated cautiously.

Why This Matters for SEOs & Digital Marketing

Adopting New Tools

The pressure to adopt AI tools can overshadow their limitations. Since issues of factual accuracy remain unresolved, marketers should use AI responsibly.

Conducting regular audits and seeking expert reviews can help reduce the risks of misinformation, particularly in YMYL (Your Money, Your Life) industries such as finance and healthcare.

The Impact On Content Quality

AI-based content generation can lead to inaccuracies that can directly harm user trust and brand reputation. Search engines may demote websites that publish unreliable or deceptive material produced by AI.

Taking a human-plus-AI approach, where editors meticulously fact-check AI outputs, is recommended.

Navigating the Hype

Beyond content creation challenges, leaders must adopt a clear-eyed view to navigate the hype cycle. The report warns that hype can misdirect resources and overshadow more sustainable gains.

Search professionals who understand AI’s capabilities and limitations will be best positioned to make strategic decisions that deliver real value.

For more details, read the full report (PDF link).


Featured Image: patpitchaya/Shutterstock

Google Cautions SEOs & Creators About Filler Content via @sejournal, @martinibuster

Google’s John Mueller cautioned publishers and SEOs about filler content, which is generally created with the apparent goal of reaching a word-count threshold without concern for the user experience. Although recipe sites are the cause of this warning, this is the kind of thing that all SEOs and publishers should be concerned about.

Probable Origin Of Filler Content

What Mueller warned about, filler content, probably has its origins in the idea that “content is king,” which gave rise to the SEO practice of creating content similar to what Google is already ranking but ten times better, so that it towers over the competition (the 10x and skyscraper content strategies).

John Mueller Warns About Filler Content

Mueller’s observations about filler content were in the context of an overview of recent changes in the Quality Rater Guidelines (QRG), a book that Google created to bring more objective standards to how third party raters rate search results that are being tested.

Mueller said that filler content is low quality content that’s designed to make a page longer. Speaking informally and loosely, he said that filler content is problematic because users can find it “annoying.”

This is, according to my notes, what he said:

“Recently, quality rater guidelines, there are few things that I think are interesting for some sites that we have mentioned in the quality rater guidelines now which weren’t in there before.

And so this is the kind of thing which I think is important for sites to realize. On the one hand we’ve written about filler content, which is the kind of fluff that some websites put on their pages to make the pages longer. And sometimes they have good reasons to make the pages longer.

But for us this is sometimes problematic and users sometimes find it annoying. So we have that mentioned in the quality rater guidelines.”

Filler Content Is A Poor User Experience

What Mueller is referring to is the new section 5.2.2 of the QRG which lays out how to objectively judge whether a page has filler content. Filler content previously was nestled within section 5.2 but it’s now broken out into its own section. The main takeaway is that filler content is a user experience issue.

Here’s what the new section of the QRG says:

“5.2.2 Filler as a Poor User Experience

The main content (MC) of a page should support its purpose. Web site owners and content creators should place the most helpful and essential MC near the top of the page so that visitors can immediately access it.

A high quality page has the most helpful MC placed most prominently. Content that supports the page purpose without directly contributing to the primary goal can be included, but it should be shown lower on the page in a less prominent position. For example, on recipe pages, the recipe itself and important supporting content directly related to the recipe should be prominently displayed near the top of the webpage.

Sometimes, MC includes “filler” – low-effort content that adds little value and doesn’t directly support the purpose of the page. Filler can artificially inflate content, creating a page that appears rich but lacks content website visitors find valuable. Filler can result in a poor experience for people who visit the page, especially if placed prominently ahead of helpful content for the purpose of the page.

Important: Content that supports the page purpose without directly contributing to its primary goal can still be valuable if placed appropriately. Filler refers to low-effort content that occupies valuable and prominent space without providing value or without being helpful or satisfying for the primary purpose of the page.

A Low rating is appropriate if the page

● Contains a large amount of low quality and unhelpful filler, causing a poor experience for people visiting the page

● Contains a large amount of visually prominent filler that makes it difficult to find the helpful MC, causing frustration for people visiting the page”

Content Filler – Not Just For Recipe Sites

The new quality rater guideline section about filler content specifically mentions recipe sites, likely because their filler content is so notoriously long that they have to add a link to skip to the part that’s useful.

Some SEOs generally dislike change and this will probably make some people angry, but in my opinion, any publisher that feels they need to add a link to “skip to recipe” should probably consider that they’re doing something wrong. As Google says, important supporting content should be up front. If users have to skip the content to get to it then whatever it is that’s being skipped is not useful.

Recipe sites may be the worst offenders for filler content, but that doesn’t mean publishers in other niches can ignore the policy. All SEOs and content creators should recognize that filler content is problematic.

Imitating the top-ranked content is a practice that content publishers need to reconsider. It works against what Google is actually trying to rank and can lead to artificial word count targets instead of focusing on the user’s needs.

Reddit Introduces Faster Ad Setup and Pixel Integration via @sejournal, @brookeosmundson

Starting today, Reddit is rolling out a series of updates aimed at making it easier for small and medium-sized businesses (SMBs) to advertise on the platform.

The changes focus on simplifying the ad creation process, improving signal quality, and helping advertisers move campaigns from other platforms like Meta with fewer headaches.

These updates follow Reddit’s continued push to make its Ads Manager more accessible, especially for smaller businesses that may not have the luxury of dedicated ad ops teams or outside agencies.

Launching Campaigns Faster With New Tools

In the Reddit Ads update, they announced two new tools to streamline campaign creation:

  • Campaign Import.
  • Simplified Campaign Quality Assurance (QA).

The first of the additions is Campaign Import, a tool that lets advertisers bring campaigns over from Meta directly into Reddit Ads Manager.

The process is straightforward — after connecting their Meta account, advertisers can select an existing campaign, import it, and make any necessary adjustments to suit Reddit’s environment.

This isn’t just a time-saver; it gives brands a quick way to leverage proven creative and targeting strategies while adapting them to Reddit’s unique audiences.

Another welcomed update is Reddit’s new Campaign Quality Assurance (QA) system. Instead of clicking back and forth between settings pages, advertisers now get a consolidated review page summarizing all key campaign details.

If something looks off — budget, targeting, placements, or creative — users can jump directly to the relevant section and make fixes before going live.

It may seem small, but anyone who’s fumbled through nested ad platforms under tight deadlines knows how much this improves workflow.

Improved Quality Signals In Reddit Ads

In addition to the streamlined campaign creation tools, Reddit also announced two features to improve the quality of audience and user behavior signals:

  • 1-click Google Tag Manager integration for Reddit Pixel.
  • Event Manager Quality Assurance (QA).

The platform now offers a 1-click integration with Google Tag Manager (GTM) for the Reddit Pixel, dramatically reducing the friction of installing and configuring conversion tags.

Advertisers can now fire up GTM, install the Reddit Pixel in minutes, and start sending conversion data without needing to pull in a developer. This update alone will make performance-focused advertisers breathe a little easier.
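To illustrate what the pixel does once GTM has installed it, here is a rough sketch (my own, not from Reddit’s announcement) of firing a conversion event with the rdt() function the Reddit Pixel exposes. The “Purchase” event name is one of Reddit’s standard events as I understand them; confirm event names and any metadata fields against Reddit’s current documentation before relying on this.

```javascript
// Illustrative only: fire a Reddit Pixel conversion event after the
// base pixel has loaded (for example, via the GTM integration).
// Verify event names against Reddit's documentation.
if (typeof rdt === 'function') {
  rdt('track', 'Purchase');
}
```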

Reddit also upgraded its Event Manager QA tools. The revamped Events Overview now gives a clearer breakdown of conversion events coming from both the Reddit Pixel and the Conversions API (CAPI).

Advertisers can spot data discrepancies faster and ensure their lower-funnel campaigns are set up for success.

Jim Squires, EVP of Business Marketing and Growth at Reddit, noted that SMBs have always been an essential part of the platform’s community and advertising base:

“We continue to make improvements to the Reddit Ads Manager that make it easier to launch and manage campaigns, so they can focus on what matters most: growing and running their businesses.”

Reddit Ads Continues To Push Forward

With these latest updates, Reddit continues refining its ad platform for a broader range of advertisers, with particular attention to reducing friction for growing businesses.

Advertisers who have been looking for more streamlined ways to import, optimize, and measure campaigns will likely find these tools helpful as they plan their next steps on Reddit.

Have you already tried out Reddit Ads? Will these updates make you lean towards testing a new platform next quarter?