YouTube Tests AI Overviews In Search Results via @sejournal, @MattGSouthern

YouTube is now testing AI-powered video summaries in its search results, a feature similar to Google Search’s AI overviews.

The new tool helps users find relevant videos faster by highlighting specific clips that best match their search criteria.

New AI-Powered Search Experience

YouTube’s test introduces a new video results carousel that appears when you search for specific topics. This feature uses AI to find the most helpful parts of videos related to your search.

As YouTube explains:

“This new feature will use AI to highlight clips from videos that will be most helpful for your search query.”

The AI summaries will mainly show up for two types of searches:

  • Product searches (like “best noise cancelling headphones”)
  • Location-based searches (such as “museums to visit in San Francisco”)

Limited Testing Phase

Right now, only “a small number of YouTube Premium members in the US” can see this feature, and only for searches in English.

If you’re part of the test group, YouTube wants your feedback. You can rate the feature using the three-dot menu, where you can give it a thumbs-up or thumbs-down.

Part of YouTube’s Experimental Process

This test follows YouTube’s standard approach to new features. The company regularly tests ideas with small groups before deciding whether to roll them out more widely.

YouTube explains:

“YouTube product teams are constantly testing out new tools and features.”

These tests help users “find, watch, share, and create content more easily.”

The company uses feedback from these experiments to decide “if, when, and how to release these features more broadly.”

What This Means

YouTube’s AI Overviews present opportunities and challenges for SEO pros and content creators.

On the positive side, the feature may help users discover content they might have missed. This could especially benefit creators who make detailed, information-rich videos.

However, there are also concerns similar to those with Google’s AI Overviews:

  • Will these summaries reduce click-through rates by answering questions directly in search results?
  • How will the AI choose which content to feature in these summaries?

These questions may change how creators structure their YouTube videos. Some might start creating clearly defined segments that AI can identify and highlight.

Looking Ahead

YouTube’s test is another step in transforming search across Alphabet’s platforms.

YouTube hasn’t announced when the feature might launch more widely. However, based on how quickly Google expanded AI Overviews, successful testing could lead to a broader rollout in the coming months.


Featured Image: aaddyy/Shutterstock

Google: How To Remove Site From Search Without Verifying Ownership via @sejournal, @martinibuster

Google’s John Mueller answered a question on Reddit, showing an easy way to completely remove an entire website from Google’s search index without a verified Search Console account.

The person who started the Reddit discussion had an old Canva website that they wanted removed from Google’s search results.

They wrote:

“As a disclaimer, I am not a tech savvy person, I just use Canva for design. I’ve been reading every piece of literature I can find on how to fully remove my old website from Google search results. I took the website down from Canva’s side, but I can’t get the search result on Google to disappear. Is there a way to do this? Thank you!”

One of the Redditors provided a link to a Google help page that offers a lot of information about removing sites, pages and images from Google Search by using the Refresh Outdated Content tool. The tool is for situations in which web pages and images no longer exist or pages with sensitive content that was deleted. The Google support page further explains:

“Use this tool if…
you do not own the web page pointed to by Google. (If you own the page, you can ask Google to recrawl the page or hide the page.) AND
the page or image no longer exists or is significantly different from the current version of the page or image.”

Google’s John Mueller responded with an option for people who don’t have a site verified in Google Search Console. He provided a URL to a page where the person could submit the website’s URL for removal, explaining that this route is slower than going through Search Console as a verified site owner.

He wrote:

“It requires that your old pages are removed from the internet — so you’d need to take them down from wherever you were hosting your old website.

If by “old” website you mean that you also have a “new” website, you can also check to see if your hoster allows you to redirect your old pages to your new ones. This is a bit cleaner than just removing your pages, since it forwards any “signals” that have been collected with the old web pages. https://developers.google.com/search/docs/crawling-indexing/site-move-with-url-changes has a bit more about site migrations (when you redirect from an old site to a new one). If you’re hosting the old site with Canva, I don’t know if they support redirects.”
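Mueller’s advice boils down to what the old URLs return over HTTP: taken down entirely, redirected to a new site, or still live. The sketch below is illustrative only (the `classify_status` helper is hypothetical, not part of any Google tool), but it captures the three states discussed.

```python
from http import HTTPStatus

def classify_status(status: int) -> str:
    """Classify an HTTP status code for a site-removal check.

    'removed'    -> the page is gone; Google's Refresh Outdated Content
                    tool can be used, and recrawling will drop the URL
    'redirected' -> signals are forwarded to the new URL (Mueller's
                    preferred option when a new site exists)
    'live'       -> the page is still being served; take it down first
    """
    if status in (HTTPStatus.NOT_FOUND, HTTPStatus.GONE):  # 404, 410
        return "removed"
    if status in (301, 302, 307, 308):                     # redirects
        return "redirected"
    return "live"
```

Fetching each old URL and running its status code through a check like this confirms whether the prerequisite for the removal tool (the page no longer exists) is actually met.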

Read the Reddit discussion here:

Removing website from Google

Featured Image by Shutterstock/Anatoliy Karlyuk

7 AI Terms Microsoft Wants You to Know In 2025 via @sejournal, @MattGSouthern

Microsoft released its 2025 Annual Work Trend Index this week.

The report claims this is the year companies will move beyond AI experiments and rebuild their core operations around AI.

Microsoft also introduced several new terms that it believes will shape the future of the workplace.

Let’s look at what Microsoft wants to add to your work vocabulary. Remember, Microsoft has invested heavily in AI, so they have good reasons to make these concepts seem normal.

The Microsoft AI Dictionary

1. The “Frontier Firm”

Microsoft says “Frontier Firms” are organizations built around on-demand AI, human-agent teams, and employees who act as “agent bosses.”

The report claims 71% of workers at these AI-forward companies say their organizations are thriving. That’s much higher than the global average of just 37%.

2. “Intelligence on Tap”

This refers to AI that’s easily accessible whenever needed. Microsoft calls it “abundant, affordable, and scalable on-demand.”

The company suggests AI is now a resource that isn’t limited by staff size or expertise but can be purchased and used as needed, conveniently through Microsoft’s products.

3. “The Capacity Gap”

This term refers to the growing disparity between what businesses require and what humans can provide.

Microsoft’s research indicates that 53% of leaders believe productivity must increase, while 80% of workers report a lack of time or energy to complete their work. They suggest that AI tools can fill this gap.

4. “Work Charts”

Forget traditional org charts. Microsoft envisions more flexible “Work Charts” that adapt to business needs by leveraging both human workers and AI.

These structures focus on results rather than rigid hierarchies. They allow companies to use the best mix of human and AI workers for each task.

5. “Human-Agent Ratio”

This term refers to the balance between AI agents and human workers required for optimal results.

Microsoft suggests that leaders need to determine the number of AI agents required for specific roles and the number of humans who should guide those agents. This essentially redefines how companies staff their teams.

6. “Agent Boss”

Perhaps the most interesting term is that of an “agent boss,” someone who builds, assigns tasks to, and manages AI agents to boost their impact and advance their career.

Microsoft predicts that within five years, teams will be training (41%) and managing (36%) AI agents as a regular part of their jobs.

7. “Digital Labor”

This is Microsoft’s preferred term for AI-powered work automation. Microsoft positions AI not as a replacement for humans, but as an addition to the workforce.

The report states that 82% of leaders plan to use digital labor to expand their workforce within the next year and a half.

However, this shift towards AI-powered work automation raises important questions about job displacement, the need for retraining, and the ethical use of AI.

These considerations are crucial as we navigate this new era of work.

Behind the Terminology

These terms reveal Microsoft’s vision for embedding AI deeper into workplace operations, with its products leading the way.

The company also announced updates to Microsoft 365 Copilot, including:

  • New Researcher and Analyst agents
  • An AI image generator
  • Copilot Notebooks
  • Enhanced search functions

Jared Spataro, Microsoft’s CMO of AI at Work, states in the report:

“2025 will be remembered as the year the Frontier Firm was born — the moment companies moved beyond experimenting with AI and began rebuilding around it.”

Looking Ahead

While Microsoft’s terms may or may not stick, the trends it describes are already changing digital marketing.

Whether you embrace the title “agent boss” or not, knowing how to use AI tools while maintaining human creativity will likely become essential in the changing marketing workforce.

Will Microsoft’s vision of “Frontier Firms” happen exactly as they describe? Time and the number of people who adopt these ideas will tell.


Featured Image: Drawlab19/Shutterstock

Google’s Martin Splitt Explains How To Find & Remove Noindex Tags via @sejournal, @MattGSouthern

Google’s Search Relations team has released a new SEO Office Hours video with Martin Splitt.

He tackles a common problem many website owners face: unwanted noindex tags that keep pages out of search results.

In the video, Splitt helps a user named Balant who couldn’t remove a noindex tag from their website. Balant wanted their page to be public, but the tag prevented this.

Where Unwanted Noindex Tags Come From

Splitt listed several places where unwanted noindex tags might be hiding:

“Make sure that it’s not in the source code, it’s not coming from JavaScript, it’s not coming from a third-party JavaScript.”

Splitt pointed out that A/B testing tools often cause this problem. These tools sometimes add noindex tags to test versions of your pages without you realizing it.

CDN & Cache Problems

If you use a Content Delivery Network (CDN), Splitt warned that old cached versions might still have noindex tags even after you remove them from your site.

Splitt explained:

“If you had a noindex in and you’re using a CDN, it might be that the cache hasn’t updated yet.”

Check Your CMS Settings & Plugins

Splitt explained that your Content Management System (CMS) settings might be adding noindex tags without you knowing.

He said:

“If you’re using a CMS, there might be settings or plugins for SEO, and there might be something like ‘allow search engines to index this content’ or ‘to access this content,’ and you want to make sure that’s set.”

Splitt added that settings labeled as “disallow search engines” should be unchecked if you want your content to appear in search results.

See the full video:

Debugging Process for Persistent Noindex Issues

If you’re dealing with stubborn noindex problems, Splitt suggests checking these places in order:

  1. Check your HTML source code directly
  2. Look at JavaScript files that might add meta tags
  3. Review third-party scripts, especially testing tools
  4. Check if your CDN cache needs updating
  5. Look at your CMS settings and SEO plugins
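The first few checklist items can be partially automated. The sketch below uses a hypothetical `find_noindex` helper (illustrative, not an official tool) to scan the two places a crawler actually sees directives: robots meta tags in the HTML and the `X-Robots-Tag` response header. Passing in the rendered DOM rather than raw source is what catches JavaScript-injected tags.

```python
import re

def find_noindex(html: str, headers: dict) -> list:
    """Report every place a noindex directive was found.

    Checks robots/googlebot meta tags in the (rendered) HTML and the
    X-Robots-Tag HTTP header, which a CDN or server config may set.
    """
    findings = []
    # <meta name="robots" content="...noindex..."> in any attribute order
    for tag in re.findall(r"<meta[^>]+>", html, flags=re.I):
        if re.search(r'name=["\'](?:robots|googlebot)["\']', tag, re.I) \
                and "noindex" in tag.lower():
            findings.append(f"meta tag: {tag}")
    # X-Robots-Tag header (case-insensitive header-name lookup)
    xrt = next((v for k, v in headers.items()
                if k.lower() == "x-robots-tag"), "")
    if "noindex" in xrt.lower():
        findings.append(f"X-Robots-Tag header: {xrt}")
    return findings
```

Running this against both the raw HTML and the post-JavaScript DOM, and diffing the results, narrows down whether a third-party script is the culprit.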

What This Means For SEO Professionals

Google’s advice shows why thorough technical SEO checks are essential. Modern websites are complex with dynamic content and third-party tools, so finding technical SEO problems takes deeper digging.

SEO professionals should regularly crawl their sites with tools that process JavaScript. This practice provides a deeper understanding of how search engines interpret your pages, going beyond the basic HTML and revealing the true visibility of your content.

Google keeps covering these basic technical issues in its videos, suggesting that even well-designed websites often struggle with indexing problems.

If your pages aren’t showing up in search results, use Google’s URL Inspection tool in the Search Console. This shows you how Google sees your page and whether any noindex tags exist.

Google Quietly Ends COVID-Era Structured Data Support via @sejournal, @martinibuster

Google announced that it is dropping support for the 2020 COVID-era Special Announcements structured data type and completely phasing it out by July 31, 2025. The announcement was posted on the SpecialAnnouncement structured data documentation.

SpecialAnnouncement Structured Data

Google adopted this structured data type in April 2020 as a way to surface a wide range of information related to the COVID pandemic. It was created specifically for COVID-related announcements and never evolved beyond that purpose, although Google did allow local businesses to use it to announce new store hours, communicating that data to Google without necessarily triggering a rich result.

Interestingly, this structured data was released as a “beta” feature, meaning it was a live test subject to change. It was never promoted to an official structured data type, remaining in beta to the end.

There were two ways to submit a special announcement notice: via structured data markup or through Google Search Console.

Users who continue to use the SpecialAnnouncement structured data won’t experience any negative effect from keeping it on their site, but the markup will have no effect on Google Search.
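For sites that want to clean out the deprecated markup anyway, a rough audit script can flag pages that still carry it. The `find_special_announcements` helper below is purely illustrative; it scans JSON-LD script blocks for the deprecated `@type`.

```python
import json
import re

def find_special_announcements(html: str) -> list:
    """Return parsed JSON-LD objects whose @type is SpecialAnnouncement."""
    found = []
    pattern = r'<script[^>]*type=["\']application/ld\+json["\'][^>]*>(.*?)</script>'
    for block in re.findall(pattern, html, flags=re.I | re.S):
        try:
            data = json.loads(block)
        except ValueError:
            continue  # skip malformed JSON-LD
        items = data if isinstance(data, list) else [data]
        found += [i for i in items
                  if isinstance(i, dict)
                  and i.get("@type") == "SpecialAnnouncement"]
    return found
```

Running this over a site crawl produces a list of templates or pages where the leftover markup can be removed.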

Read Google’s special announcement about the deprecation of the SpecialAnnouncement structured data here:

Special announcement (SpecialAnnouncement) structured data (BETA)

Featured Image by Shutterstock/Blinx

Wix Announces Adaptive Content For Driving Higher Sales & Engagement via @sejournal, @martinibuster

Wix announced a new feature that enables businesses to create personalized content for visitors, increasing relevance and opportunities for higher sales and lead generation. The feature integrates AI into the workflow, making it easier for publishers to deliver advanced personalized experiences to returning customers.

Relevance = Higher Sales

It’s commonly known that site visitors who land on a site that’s an exact match for the keywords used in a search tend to convert at a higher rate than visitors who land on a site with less relevant content. A website experience that’s directly relevant to site visitors contributes to higher conversion rates. Being able to optimize the factors that contribute to that relevance is an innovative and useful way to deploy technology.

The new feature is easily configurable and offers simulations of what the adaptive content may look like so that Wix users can preview what their site visitors will see.

Muly Gelman, Senior Product Manager at Wix Personalize shared:

“Website personalization is now essential for delivering the relevant, engaging experiences today’s consumers expect. This application highlights how we can move beyond using AI to generate website content but leverage AI to dynamically adapt and personalize the live website experience for each visitor in real-time, empowering businesses to connect more effectively with their customers.

As a result, businesses can deliver engaging, personalized experiences that resonate with their audience, ultimately driving higher engagement rates and creating greater monetization opportunities.”

The new adaptive content feature complements Wix’s new Automation Builder and its Wix Functions feature.

Featured Image by Shutterstock/chainarong06

Why Do Web Standards Matter? Google Explains SEO Benefits via @sejournal, @MattGSouthern

Google Search Relations team members recently shared insights about web standards on the Search Off the Record podcast.

Martin Splitt and Gary Illyes explained how these standards are created and why they matter for SEO. Their conversation reveals details about Google’s decisions that affect how we optimize websites.

Why Some Web Protocols Become Standards While Others Don’t

Google has formally standardized robots.txt through the Internet Engineering Task Force (IETF). However, they left the sitemap protocol as an informal standard.

This difference illustrates how Google determines which protocols require official standards.

Illyes explained during the podcast:

“With robots.txt, there was a benefit because we knew that different parsers tend to parse robots.txt files differently… With sitemap, it’s like ‘eh’… it’s a simple XML file, and there’s not that much that can go wrong with it.”

This statement from Illyes reveals Google’s priorities. Protocols that confuse platforms receive more attention than those that work well without formal standards.

The Benefits of Protocol Standardization for SEO

The standardization of robots.txt created several clear benefits for SEO:

  • Consistent implementation: Robots.txt files are now interpreted more consistently across search engines and crawlers.
  • Open-source resources: “It allowed us to open source our robots.txt parser and then people start building on it,” Illyes noted.
  • Easier to use: According to Illyes, standardization means “there’s less strain on site owners trying to figure out how to write the damned files.”

These benefits make technical SEO work more straightforward and more effective, especially for teams managing large websites.

Inside the Web Standards Process

The podcast also revealed how web standards are created.

Standards groups, such as the IETF, W3C, and WHATWG, work through open processes that often take years to complete. This slow pace ensures security, clear language, and broad compatibility.

Illyes explained:

“You have to show that the thing you are working on actually works. There’s tons of iteration going on and it makes the process very slow—but for a good reason.”

Both Google engineers emphasized that anyone can participate in these standards processes. This creates opportunities for SEO professionals to help shape the protocols they use on a daily basis.

Security Considerations in Web Standards

Standards also address important security concerns. When developing the robots.txt standard, Google included a 500-kilobyte limit specifically to prevent potential attacks.

Illyes explained:

“When I’m reading a draft, I would look at how I would exploit stuff that the standard is describing.”

This demonstrates how standards establish security boundaries that safeguard both websites and the tools that interact with them.

Why This Matters

For SEO professionals, these insights indicate several practical strategies to consider:

  • Be precise when creating robots.txt directives, since Google has invested heavily in this protocol.
  • Use Google’s open-source robots.txt parser to check your work.
  • Know that sitemaps offer more flexibility with fewer parsing concerns.
  • Consider joining web standards groups if you want to help shape future protocols.
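On the second point, Google’s open-source C++ parser is the reference implementation, but Python’s standard library also ships a robots.txt parser that handles the basic directives. A quick sanity check of rules before deploying them might look like this:

```python
from urllib.robotparser import RobotFileParser

# Candidate robots.txt rules to sanity-check before deployment.
rules = """
User-agent: *
Disallow: /private/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Verify the rules block and allow what we intended.
print(parser.can_fetch("*", "https://example.com/public/page"))   # allowed
print(parser.can_fetch("*", "https://example.com/private/data"))  # blocked
```

Note that the stdlib parser's matching behavior can differ from Google’s in edge cases (e.g., competing Allow/Disallow rules), so for exact Googlebot behavior, test against Google’s own parser.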

As search engines continue to prioritize technical quality, understanding the underlying principles behind web protocols becomes increasingly valuable for achieving SEO success.

This conversation shows that even simple technical specifications involve complex considerations around security, consistency, and ease of use, all factors that directly impact SEO performance.

Hear the full discussion in the video below:

AI Use Jumps to 78% Among Businesses As Costs Drop via @sejournal, @MattGSouthern

Stanford University’s latest AI Index Report reveals a significant increase in AI adoption among businesses.

Now 78% of organizations use AI, up from 55% a year ago. At the same time, the cost of using AI has dropped, becoming 280 times cheaper in less than two years.

More Businesses Than Ever Are Using AI

The latest report, now in its eighth year, shows a turning point for AI in business.

The number of organizations using generative AI in at least one business area more than doubled, from 33% in 2023 to 71% in 2024.

“Business is all in on AI, fueling record investment and usage,” the report states.

In 2024, U.S. companies invested $109.1 billion in AI, nearly 12 times more than China’s $9.3 billion and 24 times more than the U.K.’s $4.5 billion.

AI Costs Are Dropping

One reason more companies are using AI is that it’s becoming increasingly affordable. The report indicates that the cost of running AI queries has decreased significantly.

The report highlights:

“The cost of querying an AI model that performs like GPT-3.5 dropped from $20.00 per million tokens in November 2022 to just $0.07 per million tokens by October 2024.”

That’s 280 times cheaper in about 18 months.
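That multiple follows directly from the two quoted prices:

```python
# Prices quoted in the AI Index Report, per million tokens.
old_price = 20.00  # November 2022, GPT-3.5-level model
new_price = 0.07   # October 2024

print(round(old_price / new_price))  # ~286, i.e. roughly "280 times cheaper"
```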

Prices have dropped between 9 and 900 times per year, depending on the use case for AI. This makes powerful AI tools much more affordable for companies of all sizes.

Regional Differences and Business Impact

Different regions are adopting AI at different rates.

North America remains the leader, but Greater China has shown the most significant jump, with a 27-point increase in company AI use. Europe was next with a 23-point increase.

For marketing teams, AI is starting to show financial benefits. About 71% of companies using AI in marketing and sales report increased revenue, although most say the increase is less than 5%.

This suggests that while AI is helping, most companies are still figuring out how to use it best.

What This Means for Marketers & SEO Pros

These findings matter for several reasons:

  1. The drop in AI costs means powerful tools are getting more affordable, even for smaller teams.
  2. Companies report that AI boosts productivity and helps bridge skill gaps. This can enable you to accomplish more with limited resources.
  3. The report notes that “smaller models drive stronger performance.” Today’s models are 142 times smaller than the 2022 versions, so more AI tools can run on regular computers.

The 2025 AI Index Report makes clear that AI is no longer an experimental technology; it’s a mainstream business tool. For marketers, the question isn’t whether to use AI, but how to use it effectively to stay ahead of competitors.

For more insights, see the full report.


Featured Image: kanlaya wanon/Shutterstock

OpenAI Expresses Interest In Buying Chrome Browser via @sejournal, @martinibuster

Nick Turley, Head of Product at ChatGPT, testified that OpenAI would be interested in acquiring the Chrome browser should a judge decide to break it off from Alphabet, Google’s parent company.

According to a report in Reuters:

“ChatGPT head of product Nick Turley made the statement while testifying at trial in Washington where U.S. Department of Justice seeks to require Google to undertake far-reaching measures [to] restore competition in online search.”

Perplexity Comes Out Against Chrome Divestiture

On Monday, Perplexity CEO Aravind Srinivas wrote a post on X (formerly Twitter) stating that he intends to testify in support of Google at the U.S. government’s antitrust trial.

Perplexity simultaneously published an article explaining that its position isn’t so much about supporting Google as about supporting the future of web browsers and a more open Android ecosystem. Srinivas argues these two things will preserve a high level of browser quality and create more opportunity and innovation on mobile devices, a win-win for consumers and businesses.

The United States Department of Justice wants to split Chrome off from Google to reduce Google’s monopoly position across multiple industries, which it asserts is having a negative effect on competition. Srinivas argues that separating Chrome from Google would have the opposite effect.

Srinivas laid out his two key concerns:

“1. Google should not be broken up. Chrome should remain within and continue to be run by Google. Google deserves a lot of credit for open-sourcing Chromium, which powers Microsoft’s Edge and will also power Perplexity’s Comet. Chrome has become the dominant browser due to incredible execution quality at the scale of billions of users.

2. Android should become more open to consumer choice. There shouldn’t be a tight coupling to the default apps set by Google, and the permission for OEMs to have the Play Store and Maps. Consumers should have the choice to pick who they want as a default search and default voice assistant, and OEMs should be able to offer consumers this choice without having to be blocked by Google on the ability to have the Play Store and other Google apps (Maps, YouTube).”

Takeaways

OpenAI Expresses Interest In Chrome Browser

  • Nick Turley, Head of Product at ChatGPT, stated OpenAI would be interested in purchasing Chrome if a court orders Google to divest it.
  • His statement was made during testimony in the U.S. Department of Justice’s antitrust trial against Google.

Perplexity AI’s Position Against Chrome Divestiture

  • Perplexity CEO Aravind Srinivas publicly opposed the idea of separating Chrome from Google.
  • He announced plans to testify in support of Google in the antitrust case.
  • Perplexity emphasized that their stance is focused on preserving innovation.

Call for a More Open Android Ecosystem

  • Srinivas advocated for a more open Android ecosystem.
  • He proposed that consumers should freely choose their default search engine and voice assistant.
  • He criticized Google’s practice of requiring OEMs to bundle Google services like the Play Store and Maps.
  • He urged regulators to focus on increasing consumer choice on Android rather than breaking up Chrome.

Featured Image by Shutterstock/Prathmesh T