Google Spam Update Sparks Relentless Discontent via @sejournal, @martinibuster

Google spam updates have previously been welcomed by the search marketing community. The response to today’s announcement, however, reflected the sour mood of a search marketing and publishing community that is still reeling from six months of disruptive updates and the rollout of AI Overviews, which is widely regarded as a traffic-stealing feature.

It’s not an overstatement to say that the response to Google’s spam update is relentlessly negative.

Not The Update Publishers Are Waiting For

Google’s March 2024 Core Update, which took about 45 days to complete, devastated the rankings of many site owners. Although Google no longer has a standalone Helpful Content system (the update known as the HCU), many site owners and SEOs affected by last year’s HCU are still waiting for a new update that would, they hope, “fix” what many feel was a broken update.

One person tweeted:

“@JohnMu @searchliaison can this update remove the sitewide classifier still applied to sites since last September HCU? Or do we need to wait for a larger core update?”

Another person appeared to be laughing through their tears when they tweeted a screenshot showing their web traffic was down to six organic visitors:

“Google is coming after my last 6 organic visitors🤣Bring it on! Let’s see if we get to 0.”

Another person shared that they are demoralized from having lost 95% of their traffic from past updates:

“Honestly, it doesn’t matter what update you have under your sleeve. I’m uninstalling Google Site Kit from my site. Seeing constant, declining charts and figures every time I log into WordPress is demoralizing. They remind me that I’ve lost 95% of my traffic for no reason at all.”

One tweet was representative of the widespread sentiment that Google’s updates are broken:

“Your harmful monopoly is ruining the internet. Every one of your updates kills more independent websites while boosting spam.”

Another person tweeted:

“Google has turned many helpful websites into lost places”

It could be that Google’s last update missed the mark. But some of those affected by last year’s updates could (rightly or wrongly) be suffering from a shift in how Google defines site quality or relevance. Many are hoping that Google reverses course.

Backlash Against Pinterest In SERPs

Some of the feedback concerned dissatisfaction with how Google ranks websites. One person tweeted that they hoped the spam update would fix Google’s preference for ranking Pinterest:

“Does this update means that Google will start to show my website when users make a “brand search” instead of my pins on pinterest?”

Backlash About Reddit in SERPs

Another person offered feedback about the (common) perception that Google is ranking Reddit for too many queries.

They tweeted:

“Reddit is the only spam in the SERP right now”

That sentiment about Reddit in the SERPs was shared by many others:

“Interesting to see Google roll out a spam update! I wonder how it will affect Reddit’s ranking in search results. Personally, I haven’t found a lot of truly helpful content there, Reddit is just spamming in search result.”

What About The Site Reputation Update?

Site reputation abuse is a form of spam in which a digital marketer publishes content on a third-party website for the purpose of leveraging that site’s reputation for quick rankings. It’s a shortcut that avoids having to create and promote an entirely new website.

Google SearchLiaison responded to a question about whether this spam update included the algorithmic version of the site reputation abuse update that Google announced was forthcoming. The answer was no: this update does not contain algorithmic elements targeting site reputation abuse.

He tweeted:

“For the third time now, I’ll say again, I have every confidence that when we’re acting on site reputation abuse algorithmically, we’ll say that. It’s not right now. I also won’t be responding to this particular question every week so maybe let it go a month between asking (I don’t mean that as harsh as it sounds just that it’s not useful or productive for me to do the “are we there yet” over and over again)”

SearchLiaison followed up with:

“I mean I’d figure most wondering about this would know it’s a standard spam update given there’s no blog post, no “FYI things to know” and it’s just a regular posting to our dashboard

That said, I know people are asking Barry even though I’ve said what I just said above at least twice before. So I figured if I’m going to say it at least a third time, I’ll try again to explain why it’s not really something to ask about each week.”

No Description Of Spam Update

Google rarely announces ranking changes unless it anticipates that the effects on rankings will be noticeable. By that measure, this update is notable and significant, particularly because it will take an entire week to roll out.

Google sometimes publishes a blog post about its spam updates, but there is no accompanying article detailing what this one targets, which may be a factor contributing to the anxiety expressed in some of the responses to Google’s announcement.

Google Has A Sentiment Problem

Everything from AI Overviews and the Helpful Content Update of late 2023 to the recent updates dating from March has combined to create negative sentiment in the digital marketing community. The so-called leak added fuel to that fire. Even though the data revealed nothing that wasn’t already known, some are using it to justify long-held suspicions and to accuse Google of lying. And it’s not just the search marketing community: independent web publishers and big-brand news organizations have soured on Google, too.

So much negative sentiment has accumulated over the past year that the spam update, which would normally be cheered, is now met with skepticism and complaints.

Read Google’s spam announcement:

Featured Image by Shutterstock/Cast Of Thousands

Study: Google Favors Ecommerce Sites & User-Generated Content via @sejournal, @MattGSouthern

A recent study by the digital marketing agency Amsive documented a notable change in Google’s search results rankings over the last year.

The study found that Google is surfacing more ecommerce websites and sites featuring user-generated content while reducing the visibility of product review and affiliate marketing sites.

Here’s a look at the findings and the implications for online businesses if the shifts continue.

Ecommerce Dominance In Search Results

The study found a marked increase in ecommerce sites appearing in top search positions for many commercial queries.

Keywords that previously returned results from product reviews and affiliate sites now predominantly feature online retailers.

For example:

  • “Bird feeders”: Ecommerce stores now hold all 10 top positions, replacing several product review sites from the previous year.
  • “Laptops”: The top 10 results now consist exclusively of ecommerce websites, with some appearing multiple times.
  • “Towel warmer”: Ecommerce giants like Amazon and Walmart have multiple listings, completely replacing affiliate websites in the top results.

Rise Of User-Generated Content

Alongside ecommerce sites, user-generated content (UGC) platforms have seen a significant boost in search visibility.

Reddit, Quora, and YouTube now frequently appear in top positions for various queries where they were previously absent or ranked lower.

This trend is particularly noticeable for longer queries like “toys for 2-year-old boys,” where UGC sites are more visible.

Impact On Product Review & Affiliate Sites

The shift in search rankings introduces challenges for product review and affiliate websites, as they’re now less visible for many commercial queries.

While Google hasn’t explicitly stated that product review content is considered “unhelpful,” the data suggests that recent updates have disproportionately affected these pages.

Implications For Digital Marketing Strategies

Due to these changes, product review and affiliate sites may need to reconsider their strategies to maintain visibility and traffic.

Lily Ray and Silvia Gituto, the study’s authors, suggest diversifying traffic sources through:

  • Increased focus on digital media and PR.
  • Enhanced social media engagement.
  • Creation of video content for platforms like YouTube Shorts and TikTok.
  • Development of podcast content.
  • Active participation in relevant online forums.

What This Means For Websites

For ecommerce sites, this is an opportunity to gain more visibility and traffic.

They could take advantage of this shift by getting more customer reviews and user-generated content on their sites.

Product review and affiliate sites may need to change strategies.

Promoting themselves on social media, making videos, starting podcasts, and engaging in online forums could help compensate for lost Google search traffic.

Adapting to these changes, especially around user-generated content, will likely be needed for continued success.


Featured Image: hanss/Shutterstock

Google Dials Back AI Overviews In Search Results, Study Finds via @sejournal, @MattGSouthern

According to new research, Google’s AI-generated overviews have undergone significant adjustments since the initial rollout.

The study from SE Ranking analyzed 100,000 keywords and found Google has greatly reduced the frequency of AI overviews.

However, when they appear, they’re more detailed than they were previously.

The study digs into which topics and industries are more likely to get an AI overview. It also looks at how the AI snippets interact with other search features like featured snippets and ads.

Here’s an overview of the findings and what they mean for your SEO efforts.

Declining Frequency Of AI Overviews

In contrast to pre-rollout figures, 8% of the examined searches now trigger an AI Overview.

This represents a 52% drop compared to January levels.

Yevheniia Khromova, the study’s author, believes this means Google is taking a more measured approach, stating:

“The sharp decrease in AI Overview presence likely reflects Google’s efforts to boost the accuracy and trustworthiness of AI-generated answers.”

Longer AI Overviews

Although the frequency of AI overviews has decreased, the ones that do appear provide more detailed information.

The average length of the text has grown by nearly 25% to around 4,342 characters.

In another notable change, AI overviews now link to fewer sources on average – usually just four links after expanding the snippet.

However, 84% still include at least one domain from that query’s top 10 organic search results.

Niche Dynamics & Ranking Factors

The chances of getting an AI overview vary across different industries.

Searches related to relationships, food and beverages, and technology were most likely to trigger AI overviews.

Sensitive areas like healthcare, legal, and news showed AI summaries at a low rate, less than 1%.

Longer search queries with ten words were more likely to generate an AI overview, at a rate of 19%, suggesting that AI summaries are more useful for complex information needs.

Search terms with lower search volumes and lower cost-per-click were more likely to display AI summaries.

Other Characteristics Of AI Overviews

The research reveals that 45% of AI overviews appear alongside featured snippets, often sourced from the same domains.

Around 87% of AI overviews now coexist with ads, compared to 73% previously, a shift that could increase competition for advertising space.

What Does This Mean?

SE Ranking’s research on AI overviews has several implications:

  1. Reduced Risk Of Traffic Losses: Fewer searches trigger AI Overviews that directly answer queries, making organic listings less likely to be demoted or receive less traffic.
  2. Most Impacted Niches: AI overviews appear more in relationships, food, and technology niches. Publishers in these sectors should pay closer attention to Google’s AI overview strategy.
  3. Long-form & In-Depth Content Essential: As AI snippets become longer, companies may need to create more comprehensive content beyond what the overviews cover.

Looking Ahead

While the number of AI overviews has decreased recently, we can’t assume this trend will continue.

AI overviews will undoubtedly continue to transform over time.

It’s crucial to monitor developments closely, try different methods of dealing with them, and adjust game plans as needed.


Featured Image: DIA TV/Shutterstock

Google Launches June 2024 Spam Update via @sejournal, @MattGSouthern

Google has announced the rollout of the June 2024 spam update, which aims to further improve search results by targeting websites that violate Google’s spam policies.

According to a statement, the update, which began on June 20, is expected to take up to one week to roll out fully.

Background On Google’s Spam Updates & Policies

Google regularly updates its systems to reduce low-quality and spammy content from its search results.

Spam updates target websites that break Google’s rules, such as:

  • Automatically generating content solely to improve search rankings.
  • Buying or selling links to manipulate rankings.
  • Having thin, duplicated, or poor-quality content.
  • Tricking users with hidden redirects or other deceptive techniques.

Google’s last spam update was released in March.

Despite the March update impacting many spammy websites, some AI-generated content still managed to rank well in search results.

Analysis by Search Engine Journal’s Roger Montti notes that some AI spam sites ranked for over 217,000 queries, with more than 14,900 ranking in the top 10 search results.

The sites employed tactics such as rapid content churn, AI-generated images, and templated article structures, exploiting a loophole that allowed new content to receive an initial ranking boost.

Potential Impact On Search Results

The June spam update will likely refine Google’s spam detection capabilities further.

However, past experiences have shown that closing loopholes can inadvertently impact legitimate websites.

As with any significant update, the June spam update may result in fluctuations in search rankings for some websites.

Websites that engage in practices that violate Google’s spam policies or rely heavily on AI-generated content may see a decline in their search visibility.

On the other hand, some websites may benefit from the update, as they will face less competition from spammy websites in search results.

Looking Ahead

Google says the June 2024 spam update may take up to one week to roll out fully.

Once the rollout is complete, Google will post an update on its Search Status Dashboard, and you can assess the update’s impact on your search rankings.


Featured Image: Danishch/Shutterstock

New Bluehost Agency Partner Program For WordPress Agencies via @sejournal, @martinibuster

Bluehost announced a partner program that’s expressly designed to support WordPress agencies and freelancers that service small-to-medium-sized businesses (SMBs). The program offers revenue-generating opportunities in the form of commissions, exclusive discounts, priority customer service, and other benefits intended to help agencies grow their client base and earn more revenue.

Focus On WordPress Websites

Bluehost is an active member of the WordPress community, which includes helping to develop the WordPress core itself by directly sponsoring six WordPress core contributors. That involvement positions Bluehost to offer the products, community, service, and revenue-generating opportunities that align with the goals of WordPress-based development agencies and freelancers that service SMBs.

A key element of the Agency Partner Program is Bluehost Cloud, a managed WordPress hosting platform that provides a 100% uptime SLA. Bluehost’s managed WordPress Cloud is designed as a secure, high-performance solution, which makes it well suited to freelancers and agencies that depend on performant hosting.

Exclusive Benefits for Partner Agencies

Acceptance into the program grants agencies early access to Bluehost’s referral program (commissions), product discounts, learning webinars, access to priority customer support, and membership in an exclusive LinkedIn network.

According to the Bluehost announcement:

“By partnering with Bluehost, agencies can now provide their clients with the highest quality customer service, WordPress expertise and some of the most comprehensive hosting products, including Bluehost Cloud, Yoast SEO and eCommerce plug-ins.”

The Bluehost Agency Partner Program offers WordPress agencies and freelancers the resources to level up their service offerings, generate new revenue streams, and deliver superior results for their clients. It’s a win-win partnership that may be worth looking into.

Visit the Bluehost Partner Program page:

Early Applications: Introducing the Bluehost Agency Partner Program.

Read the official announcement here:

Bluehost Unlocks New Opportunities For WordPress Agencies

Featured Image by Shutterstock/Shift Drive

Reddit Traffic Up 39%: Is Google Prioritizing Opinions Over Expertise? via @sejournal, @MattGSouthern

Reddit’s website traffic has grown 39% compared to the previous year, according to data from Similarweb.

This growth seems fueled by Reddit’s increased visibility in Google search results.

Why is Reddit growing so fast, and what does this mean for businesses and SEO professionals?

Here’s our take on it.

Why Is Reddit Growing?

Several factors, including Google prioritizing “helpful content” from discussion forums in a recent algorithm update, have likely contributed to Reddit’s improved search rankings and visibility.

A report from Business Insider indicates that more people are now finding Reddit through Google searches than by directly visiting the reddit.com website.

Mordy Oberstein, Wix’s Head of SEO, shared recent data showing a consistent increase in the share of Reddit sources appearing in Google’s Discussion and Forums SERP feature.

Lily Ray, Senior Director of SEO and Head of Organic Research at Amsive Digital, tweeted about Reddit’s increased visibility in Google search results.

She noted that Reddit had appeared in “Discussions and Forums” results for various medical queries in recent weeks but no longer does.

Ray also observed that the number of Discussion and Forum features with multiple Reddit URLs has decreased slightly over the past months.

Google’s $60 Million Deal with Reddit

Google recently signed a $60 million deal to license Reddit data for AI products.

The timing of the deal and Reddit’s search growth raise questions.

Google has denied a direct connection between the deal and Reddit’s search visibility, but the coincidence is notable.

Implications For Marketers & SEO Professionals

Reddit’s newfound dominance in Google search results presents business challenges and opportunities.

Challenges

Roger Montti, a staff writer for Search Engine Journal, raises concerns about the expertise and trustworthiness of Reddit content.

In the article, “Let’s Be Real: Reddit In Google Search Lacks Credibility,” Montti states:

“Opinions shared on Reddit by people who lack expertise and are sharing opinions in anonymity qualify as dubious. Yet Google is not only favoring Reddit in the search results, it is also paying millions of dollars for access to content that is lacking in expertise, experience, authoritativeness and trustworthiness.”

This is challenging because it means your expert-written content could get outranked by the opinions of anonymous Reddit users.

Opportunities

Search Engine Journal founder Brent Csutoras offers a more optimistic view, believing marketers should lean into Reddit’s newfound prominence.

In the article, “Why Every Marketer Should Be On Reddit,” Csutoras states:

“If your brand has something meaningful to say and is interested in truly connecting with your audience, then yes, you should be on Reddit.”

However, Reddit’s community-driven nature requires a delicate approach, Csutoras adds:

“Reddit communities can be highly negative toward self-serving promotion. But if you put in the effort and solve people’s needs and problems, Reddit has the potential to be a high-performance channel.”

Why SEJ Cares

SEO professionals and marketers should be mindful that expert-written resources could be outranked by Reddit threads that reflect personal opinions rather than authoritative information.

However, by providing genuine value and respecting Reddit’s community guidelines, businesses may be able to leverage the platform’s prominence for increased visibility and audience engagement.


Featured Image: rafapress/Shutterstock

Is Google Crawling Your Site A Lot? That Could Be A Bad Sign via @sejournal, @MattGSouthern

According to a recent LinkedIn post by Gary Illyes, Analyst at Google, you should be cautious if Google starts aggressively crawling your website.

While an uptick in crawling can be a good sign, Illyes says it may indicate underlying issues.

Illyes cautions:

“Don’t get happy prematurely when search engines unexpectedly start to crawl like crazy from your site.”

He says there are two common problems to watch out for: infinite spaces and website hacks.

Infinite Spaces Could Cause Crawling Spike

An issue Illyes highlighted is sites with “infinite spaces”—areas like calendar modules or endlessly filterable product listings that can generate unlimited potential URLs.

If a site is crawled a lot already, crawlers may get extra excited about infinite spaces.

Illyes explains:

“If your site generally has pages that search users find helpful, crawlers will get excited about these infinite spaces for a time.”

He recommends using the robots.txt file to block crawlers from accessing infinite spaces.
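As a rough illustration of that advice, blocking an infinite space usually takes only a few robots.txt rules. The paths and parameter below are hypothetical examples (an endlessly paginated calendar module and a filtered product listing), not taken from Illyes’ post; a site would substitute whatever URL patterns its own crawl logs show being generated without limit:

User-agent: *
# Hypothetical example: block an endlessly paginated calendar module
Disallow: /calendar/
# Hypothetical example: block filter-parameter URLs that create unlimited combinations
Disallow: /*?filter=

Googlebot supports the * wildcard in Disallow rules, so parameter-driven infinite spaces can be blocked by pattern rather than by listing individual URLs.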

Hacked Sites Can Trigger Crawling Frenzy

Another troubling cause of a crawling spike is a security breach where hackers inject spam onto a reputable site.

Crawlers may initially interpret this as new content to index before realizing it’s malicious.

Illyes states:

“If a no-good-doer somehow managed to get access…they might flood your site with, well, crap… crawlers will get excited about these new pages for a time and happily crawl them.”

Remain Skeptical Of Crawling Spikes

Rather than assuming a crawling spike is positive, Illyes suggests treating it as a potential issue until the root cause is identified.

He states:

“Treat unexpected sharp increases in crawling as a symptom…until you can prove otherwise. Or, you know, maybe I’m just a hardline pessimist.”

Fixing Hacked Sites: Help From Google

For hacked sites, Illyes pointed to a page that includes a video with further assistance:

Here are the key points.

Tips From Google’s Video

Google’s video outlines the steps in the recovery process.

1. Identify The Vulnerability

The first crucial step is finding how the hacker gained access. Tools like Google’s Webmaster Tools (now Search Console) can assist in detecting issues.

2. Fix The Vulnerability

Once the security hole is identified, it must be closed to prevent any future unauthorized access. This could involve updating software, changing passwords, etc.

3. Clean The Hacked Content

Check the entire site’s content and code to remove any spam, malware, defaced pages, or other injections by the hacker. Security plugins like Wordfence can assist in this process.

4. Harden Security

Beyond fixing the specific vulnerability, take additional measures to harden the site’s security. This could include enabling firewalls, limiting user permissions, and more frequent software updates.

5. Request A Review

Once the vulnerability is patched and any hacked content is removed, you can request that Google review the site and remove any security warnings or blacklists after it’s verified as clean.

The video notes that the review process is faster for malware issues (days) than for spam issues (weeks), since Google has to inspect spam cleanup efforts more thoroughly.

Additional Tips From Google’s John Mueller

Google’s John Mueller has previously offered specific advice on recovering from the SEO impact of hacked pages:

  1. Use the URL removal tool to deindex the hacked pages quickly.
  2. Focus on improving the overall site quality beyond removing hacked content.
  3. Lingering impacts may persist for months until the site recovers Google’s trust.

Why SEJ Cares

Website security is crucial for all businesses, as hacked content can impact trust and search engine rankings.

Google’s Gary Illyes pointed out that sudden spikes in crawling activity could indicate security breaches or technical issues that need immediate attention.


Featured Image: Stacey Newman/Shutterstock

Razorfish R-Index Turns Consumer Data Silos Into Strategic Insights via @sejournal, @martinibuster

Razorfish launched a new technology called R-Index that measures disparate online and offline customer interactions (including paid and owned interactions) and generates prescriptive insights on consumer sentiment, brand performance, and business impact. R-Index turns otherwise disconnected data into strategic insights on consumer journeys and brand sentiment.

R-Index is based on a custom algorithm that leverages Google Cloud, BigQuery, and a suite of machine learning and Vertex AI capabilities, which work together to analyze what customers are doing at every step and to provide actionable insights into how to engage with them better.

What Is R-Index About?

I interviewed Razorfish to get a better idea of what R-Index is and why it’s an important tool for brands.

I asked Razorfish about what’s being measured:

“R-Index helps measure brand performance, consumer sentiment, and business impact. It includes a brand’s experience touchpoints across the consumer journey, including paid and owned interactions.”

The press release notes how there’s an abundance of data about “moments that matter” but that its inherent disparate quality makes it challenging to get a holistic picture of what it all means and extract meaning from it. So I asked them to elaborate on that.

“The holistic journey looks different for different consumers and consumer journeys. R-Index aims to capture how consumers start their journeys through purchase and loyalty, and distill how resonant each of these touchpoints are along the journey into a single, easy-to-use metric.

A moment that matters is a specific engagement that a consumer has with any of our experience touchpoints, whether that’s marketing, going to a website, etc. These are the moments where we see more engagement based on our observations. They can be different across consumers and segments.

As we analyze what the consumer is doing across the full journey, we’re identifying touchpoints that are resonating more and helping brands refine and optimize those experiences. This could be increasing the frequency, delivering a more personalized message, or focusing on a specific touchpoint. But with R-Index, we’re capitalizing on this behavioral data and using it to serve consumers better.”

What are the concrete real-world “touchpoints” you are referring to?

“Real-world touchpoints include call data, CRM information, web traffic, mobile app clicks, ad traffic or offline interactions like TV. As the number of avenues for consumers to interact with a brand continues to increase, data from those sources is continuing to fragment and shift further into silos.

Similarly, despite recent delays, third-party cookies will continue to deprecate and newer regulations will further the challenges in data collection, making it vital for brands to be able to access and process any and all data options into one source.

When you think about how traditional measurement tools have looked at performance (ex: acquisition and how that works across specific channels, paid media, or television) they aren’t really connected to measuring the actual sentiment or perception of consumers and how these translate into specific business value for brands.

And sentiment descriptions really differ from brand to brand, as some labels that are considered “negative” for one brand might not be the same for another.

R-index is meant to aggregate all the different touchpoints that a consumer could theoretically interact with and get to a perspective of what’s actually driving either positive sentiment and resonance for consumers or what areas need to be optimized for better experiences.”

Tell me more about the insights and how R-Index provides a more “nuanced view”?

“R-Index is a simpler tool to get the insights that are needed to help a brand optimize overall performance, dive into the specific drivers of that performance for a brand, and make those experiences more resonant and relevant for their core consumers. R-Index provides more insights into what’s truly driving positive and helpful consumer experiences, and driving resonance for brands across the entire marketing mix and marketing investment.

Even if you have specific segments of consumers, they can behave very differently based on how they’re interacting with the touchpoints. While the aim is not to drill down to any one specific customer, it can provide improved segment understanding to make each touchpoint more appropriate and personalized.

There are many measurement solutions in the market that can look at channel performance or sentiment performance in a silo, but R-Index is putting everything together in one place. R-Index has the components of being more dynamic, being able to scale and being able to plug into a number of different tools and AI capabilities to provide predictive optimized recommendations at scale.

The combination and connectivity of the data being pulled, the AI capabilities, and rigorous testing of the tool is helping drive the more nuanced views of insights that provide prescriptive strategic recommendations and analyses of data with greater detail. The definition and understanding of a brand’s audience segments will be deeper than ever before.

R-Index is prescriptive, providing automated insights and recommendations, and allows for drill-down insights at granular levels for components that make up the index score.

R-Index’s capabilities go beyond simply understanding what ads work to understanding how nuances across media investment, macroeconomic data, etc., impact overall consumer perceptions and interactions with brands, and how to best refine experiences to be resonant to consumers with those insights in mind.”

A Powerful Tool For Actionable Insights

R-Index is a powerful marketing insight tool that measures brand performance, consumer sentiment, and business impact, and provides prescriptive recommendations to help marketers and marketing teams improve consumer experiences and business outcomes.

Read more about R-Index:

Razorfish Unveils R-Index, A Proprietary Data Solution for Creating Unified Experiences in Collaboration with Google Cloud

Featured Image by Shutterstock/PCH.Vector

YouTube Tests Crowdsourced Annotations For Videos via @sejournal, @MattGSouthern

YouTube is piloting a new experimental feature allowing users to add contextual notes to videos to provide supplemental information.

The “Video Context Notes” feature, currently being tested on mobile in the United States for English language videos, allows invited contributors to write short annotations.

Screenshot from: blog.youtube.com, June 2024.

In its announcement, YouTube describes how it intends for people to use context notes:

“These notes could clarify when footage contains parody material, point out if a product review is outdated due to a newer version release, or confirm whether viral clips actually depict current events.”

Notes build on other YouTube efforts to present context alongside videos, such as information panels and disclosure labels for altered or synthetic media.

However, YouTube recognizes there’s potential for inaccurate or unsuitable notes during the experimental phase, stating:

“We anticipate there will be mistakes – notes that aren’t a great match for the video or potentially incorrect information. That’s part of how we’ll learn from the experiment.”

Availability

A limited number of YouTube channels in good standing will be invited to write and attach context notes to videos.

Viewers in the U.S. will be able to see and rate the helpfulness of these notes.

Third-party evaluators, the same contracted personnel who provide feedback on YouTube’s search and recommendation systems, will also assess the quality and accuracy of posted notes.

Their ratings and viewer input will be processed through a “bridging-based algorithm” to determine which notes get published broadly.

YouTube explains in the announcement:

“If many people who have rated notes differently in the past now rate the same note as helpful, then our system is more likely to show that note under a video.”

As the pilot progresses, YouTube plans to explore having contributors rate each other’s notes to further train the note-publishing system.
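YouTube hasn’t said how its bridging-based algorithm is implemented, but the general idea it describes is that a note earns broad visibility only when raters who usually disagree with each other independently find it helpful. The sketch below is a toy illustration of that bridging intuition only, not YouTube’s system; the rater “camp” labels, the min-based scoring, and the publication threshold are all assumptions made for the example:

from collections import defaultdict

def bridging_score(ratings):
    """Toy bridging score. ratings is a list of (rater_camp, is_helpful) pairs.
    A note scores well only if raters from different camps call it helpful.
    Illustrative only; not YouTube's actual algorithm."""
    helpful_by_camp = defaultdict(list)
    for camp, is_helpful in ratings:
        helpful_by_camp[camp].append(1.0 if is_helpful else 0.0)
    # Per-camp helpfulness rates.
    camp_rates = {camp: sum(v) / len(v) for camp, v in helpful_by_camp.items()}
    # Bridging: the score is capped by the least-convinced camp, so cross-camp
    # agreement matters more than raw vote counts.
    return min(camp_rates.values()) if camp_rates else 0.0

# A note rated helpful across both camps clears an assumed threshold;
# a note that only one camp likes does not.
broad_note = [("camp_a", True), ("camp_a", True), ("camp_b", True), ("camp_b", False)]
one_sided = [("camp_a", True), ("camp_a", True), ("camp_a", True), ("camp_b", False)]
SHOW_THRESHOLD = 0.4  # assumed for the example
print(bridging_score(broad_note) >= SHOW_THRESHOLD)  # True
print(bridging_score(one_sided) >= SHOW_THRESHOLD)   # False

In this toy version, a note endorsed only by one group never clears the threshold no matter how many votes it gets, which captures the behavior YouTube describes: helpfulness ratings from people who have historically rated notes differently carry the most weight.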

Why SEJ Cares

Letting users add context could bring another layer of credibility to videos, such as confirming or debunking a presenter’s claims.

While there are bound to be some growing pains, if YouTube can get this new notes system right, it could raise the bar for transparency when it comes to video content across the web.


Featured Image: Queenmoonlite Studio/Shutterstock