Duda Website Builder For Agencies Adds More AI Tools via @sejournal, @martinibuster

Duda has announced the addition of new AI assistant features that help digital agencies scale their website creation and optimization, making it easier for them to handle more clients without increasing costs.

Duda Website Builder For Digital Agencies

Duda is a proprietary website builder platform created specifically for digital marketing agencies, allowing them to scale and take on more clients without having to hire an army of web developers to keep pace with that growth.

The Duda platform facilitates creating high-quality websites and easily maintaining them for clients. White labeling gives digital agencies the ability to provide a branded experience to their clients.

Duda AI Assistant Features

Duda added two new features to its robust set of AI tools that further automate webpage creation and alt text optimization. The new Sections tool creates webpage sections and layouts from prompts describing what is required. It’s almost like something out of a science fiction movie: a designer tells the computer what they want, and the AI completes the project.

The improved alt text tool expands on the current one with the ability to create alt text for images in bulk, for the purposes of accessibility and search optimization, and it can generate alt text in multiple languages.
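Duda hasn’t published implementation details, but as a purely illustrative sketch of what a bulk, multi-language alt text workflow generally looks like, consider the following Python example (generate_alt_text is a hypothetical stand-in for any captioning model or API, not Duda’s tool):

    # Purely illustrative: a generic bulk alt-text workflow, not Duda's code.
    # generate_alt_text() is a hypothetical stand-in for a captioning model/API.
    from pathlib import Path

    def generate_alt_text(image: Path, language: str) -> str:
        # A real implementation would call a vision/captioning model here.
        return f"[{language}] description of {image.name}"

    def bulk_alt_text(image_dir: str, languages: list[str]) -> dict[str, dict[str, str]]:
        # Generate alt text for every image at once, in each requested language.
        return {
            image.name: {lang: generate_alt_text(image, lang) for lang in languages}
            for image in sorted(Path(image_dir).glob("*.jpg"))
        }

    print(bulk_alt_text("./site-images", ["en", "es"]))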


According to the Duda announcement:

“Duda’s AI sections tool automatically generates new site sections with suggested design, layout, copy, and images from a short user-provided prompt in just a few clicks. This feature drastically reduces the time needed to lay out the page, write copy, and select images. Now, content sections can be tailored and stylized in seconds.

Similarly, Duda’s AI alt text tool boosts productivity by enabling the creation of alt text for all site images at once. Users can generate alt text in multiple languages for images lacking tags or for all site images, streamlining the process of optimizing websites for SEO and accessibility.”

Duda’s continual product improvements demonstrate a commitment to helping digital agencies lower their costs while scaling their ability to handle more clients and grow into larger, more profitable businesses.

Read the announcement here:

Duda reveals industry-first website creation and optimization AI Assistant for digital marketing agencies

Read about the Duda AI Assistant:

Duda’s AI Assistant: The future of web building is here

Featured Image by Shutterstock/Shutter_M

Google CEO Addresses Concerns Over AI’s Impact On Search Traffic via @sejournal, @MattGSouthern

In a recent interview, Google CEO Sundar Pichai discussed the company’s implementation of AI in search results and addressed concerns from publishers and website owners about its potential impact on web traffic.

Background On AI In Google Search

Google has been gradually incorporating AI-generated overviews and summaries into its search results.

These AI overviews aim to provide users with quick answers and context upfront on the search page. However, publishers fear this could dramatically reduce website click-through rates.

Pichai Claims AI Drives Traffic

Despite concerns, Pichai maintained an optimistic outlook on how AI will affect the web ecosystem in the long run.

He told The Verge:

“I remain optimistic. Empirically, what we are seeing throughout these years is that human curiosity is boundless.”

The Google CEO claimed that the company’s internal data shows increased user engagement with AI overviews, including higher click-through rates on links within these previews compared to regular search results.

Pichai stated:

“When you give the context, it also exposes people to various branching off, jumping off, points, and so they engage more. So, actually, this is what drives growth over time.”

Unfortunately, Pichai didn’t provide specific metrics to support this assertion.

Balancing User Experience & Publisher Interests

Pichai claims that Google is attempting to balance meeting user expectations and sending website traffic, stating:

“I look at our journey, even the last year through the Search Generative Experience, and I constantly found us prioritizing approaches that would send more traffic while meeting user expectations.

… what’s positively surprising us is that people engage more, and that will lead to more growth over time for high-quality content.”

When pressed on anecdotal evidence of some websites losing significant traffic, Pichai cautioned against drawing broad conclusions from individual cases.

He argued that Google has provided more traffic to the web ecosystem over the past decade.

Pichai believes the sites losing traffic are the “aggregators in the middle.”

He stated:

“From our standpoint, when I look historically, even over the past decade, we have provided more traffic to the ecosystem, and we’ve driven that growth.

Ironically, there are times when we have made changes to actually send more traffic to the smaller sites. Some of those sites that complain a lot are the aggregators in the middle.

So should the traffic go to the restaurant that has created a website with their menus and stuff or people writing about these restaurants? These are deep questions. I’m not saying there’s a right answer.”

Takeaways For Website Owners & SEO Professionals

For those in the SEO community, Pichai’s comments offer insight into Google’s strategy and perspective but should be viewed with a degree of skepticism.

While the CEO painted a rosy picture of AI’s impact, concrete data was lacking to support his claims. Website owners must monitor their analytics closely to assess the real-world effects of AI overviews on their traffic.

As Google continues to roll out AI features in search, the dust is far from settled on this issue.

Pichai’s optimism aside, the true impact of AI on the web ecosystem remains to be seen. For now, publishers and SEOs must stay vigilant, adaptable, and vocal about their concerns in this rapidly shifting landscape.


Featured Image: Muhammad Alimaki/Shutterstock

38% Of Webpages From 2013 Have Vanished, Pew Study Finds via @sejournal, @MattGSouthern

A new study by Pew Research Center reveals the fleeting nature of online information: 38% of webpages from 2013 are no longer accessible a decade later.

The analysis, conducted in October, examined broken links on government and news websites and in the “References” section of Wikipedia pages.

The findings reveal that:

  • 23% of news webpages and 21% of government webpages contain at least one broken link
  • Local-level government webpages, particularly those belonging to city governments, are especially prone to broken links
  • 54% of Wikipedia pages have at least one link in their “References” section pointing to a non-existent page

Social Media Not Immune To Content Disappearance

To investigate the impact of digital decay on social media, Pew Research collected a real-time sample of tweets on X and monitored them for three months.

The study discovered that “nearly one-in-five tweets are no longer publicly visible on the site just months after being posted.”

In 60% of these cases, the original posting account was made private, suspended, or deleted.

In the remaining 40%, the account holder deleted the tweet, but the account still existed.

Certain types of tweets are more likely to disappear than others, with more than 40% of tweets written in Turkish or Arabic no longer visible within three months of posting.

Additionally, tweets from accounts with default profile settings are particularly susceptible to vanishing from public view.

Defining “Inaccessible” Links & Webpages

For the purpose of this report, Pew Research Center focused on pages that no longer exist when defining inaccessibility.

Other definitions, such as changed content or accessibility issues for visually impaired users, were beyond the scope of the research.

The study used a conservative approach, counting pages as inaccessible if they returned one of nine error codes, indicating that the page and/or its host server no longer exist or have become nonfunctional.
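The report doesn’t enumerate the nine error codes, but as a rough sketch of this kind of check (treating any 4xx/5xx response or a dead host as inaccessible, which is an assumption, not the study’s exact rule), a link audit might look like this in Python:

    # Rough sketch of a broken-link check in the spirit of the study's
    # methodology; the nine specific error codes aren't listed in the article.
    from urllib.request import Request, urlopen
    from urllib.error import HTTPError, URLError

    def is_inaccessible(url: str, timeout: float = 10.0) -> bool:
        req = Request(url, method="HEAD",
                      headers={"User-Agent": "link-audit-sketch/0.1"})
        try:
            with urlopen(req, timeout=timeout) as resp:
                return resp.status >= 400
        except HTTPError as err:
            return err.code >= 400   # the page returned an error code
        except URLError:
            return True              # the host/server no longer exists

    print(is_inaccessible("https://example.com/"))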

Why SEJ Cares

Digital decay raises important questions about the preservation and accessibility of online content for future generations.

Pew Research Center’s study sheds light on the extent of this problem across various online spaces, from government and news websites to social media platforms.

The high rate of link rot and disappearing webpages has implications for anyone who relies on the internet as a reliable source of information.

It poses challenges for citing online sources, as the original content may no longer be accessible in the future.

What This Means For SEO Professionals

This study underscores the need to regularly audit and update old content, as well as consistently monitor broken links and resolve them promptly.

SEO professionals should also consider the impact of digital decay on backlink profiles.

As external links to a website become inaccessible, it can affect the site’s link equity and authority in the eyes of search engines.

Monitoring and diversifying backlink sources can help mitigate the risk of losing valuable links to digital decay.

Lastly, the study’s findings on social media content reinforce that SEO efforts should focus on driving users back to more stable, owned channels like websites and email lists.


Featured Image: apghedia/Shutterstock

Google’s AI Vision Driven By Panic, Not User Needs: Former Product Manager via @sejournal, @MattGSouthern

A 16-year Google veteran is raising concerns about the company’s current focus on AI, labeling it a “panic reaction” driven by fear of falling behind competitors.

Scott Jenson, who left Google last month, took to LinkedIn to critique the tech giant’s AI projects as “poorly motivated and driven by this mindless panic that as long as it had ‘AI’ in it, it would be great.”

Veteran’s Criticism Of Google’s AI Focus

Jenson stated that Google’s vision of creating an AI assistant for its ecosystem is “pure catnip” fueled by the fear of letting someone else get there first.

He draws a parallel to the ill-fated Google+ product, which he calls a “similar hysterical reaction” to Facebook’s rise.

Jenson wrote:

“This exact thing happened 13 years ago with Google+ (I was there for that fiasco as well). That was a similar hysterical reaction but to Facebook.”

Lack Of User-Driven Motivation

Jenson argues that Google’s strategy lacks motivation driven by genuine user needs, a sentiment echoed by a recent Gizmodo article that described this year’s Google I/O developer conference as “the most boring ever.”

The article, which Jenson linked to in his post, criticized Google for failing to clarify how Gemini’s new AI technology would integrate into its existing products and enhance the user experience.


Can You Turn Off Google’s AI Overviews?

One prime example of Google’s AI overreach is the AI overviews feature, which generates summaries to directly answer search queries by ingesting information from across the web.

This controversial move has sparked legal battles, with publishers accusing Google of violating intellectual property rights and unfairly profiting from their content without permission.

Turning Off AI Overviews

While Google doesn’t provide an official setting to turn off AI overviews, a viral article from Tom’s Hardware suggests using browser extensions.

Alternatively, you can configure Chrome to go directly to web search results, bypassing the AI-generated overviews.

Here are the steps:

  • Open Chrome settings by clicking the three dots in the top-right corner and selecting “Settings” from the menu.
  • In the Settings window, click on the “Search Engine” tab on the left side.
  • Under the “Search Engine” section, click “Manage search engines and site search.”
  • Scroll down to the “Site search” area and click “Add” to create a new entry.

In the new entry, enter the following details:

  • Name: Google (Web)
  • Shortcut: www.google.com
  • URL: {google:baseURL}/search?udm=14&q=%s
  • Click “Add”
Screenshot from: chrome://settings/searchEngines, May 2024.

Lastly, click the three dots next to the new “Google (Web)” entry and select “Make default.”

Screenshot from: chrome://settings/searchEngines, May 2024.

After following these steps, Chrome will now default to showing regular web search results instead of the AI overview summaries when you perform searches from the address bar.
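For readers who want the same web-only results outside of Chrome’s settings, here is a minimal sketch (the query string is just an example) of how the udm=14 parameter assembles into a search URL:

    # Minimal sketch: udm=14 requests Google's "Web" results view,
    # which is what the Chrome shortcut above relies on.
    from urllib.parse import urlencode

    def web_only_search_url(query: str) -> str:
        return "https://www.google.com/search?" + urlencode({"udm": 14, "q": query})

    print(web_only_search_url("turn off ai overviews"))
    # https://www.google.com/search?udm=14&q=turn+off+ai+overviews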

Tensions Over Data Usage

The controversy surrounding AI overviews creates tension between tech companies and content creators over using online data for AI training.

Publishers argue that Google’s AI summaries could siphon website traffic, threatening independent creators’ revenue streams, which rely on search referrals.

The debate reflects the need for updated frameworks to balance innovation and fair compensation for content creators, maintaining a sustainable open internet ecosystem.


FAQ

What concerns has Scott Jenson raised about Google’s AI focus?

Scott Jenson, a former Google product manager, has expressed concerns that Google’s current AI focus is more of a “panic reaction” to stay ahead of competitors rather than addressing user needs. He critiques Google’s AI initiatives as poorly motivated and driven by a fear of letting others get ahead.

How does Scott Jenson compare Google’s AI strategy to past projects?

Jenson draws a parallel between Google’s current AI focus and the company’s response to Facebook’s rise years ago with Google+.

Why are content creators concerned about Google’s AI overviews?

Content creators worry that Google’s AI overviews, which generate summaries by ingesting web content, could reduce site traffic. They argue that this practice is unfair as it uses their content without permission and impacts their revenue streams that rely on search referrals.

How can users turn off Google’s AI overviews in Chrome?

Although no official setting exists to disable AI overviews, users can work around them by adding a custom site search entry in Chrome’s settings.

Here are the steps:

  • Open Chrome settings by clicking the three dots in the top-right corner and selecting “Settings” from the menu.
  • In the Settings window, click on the “Search Engine” tab on the left side.
  • Under the “Search Engine” section, click “Manage search engines and site search.”
  • Scroll down to the “Site search” area and click “Add” to create a new entry.

In the new entry, enter the following details:

    • Name: Google (Web)
    • Shortcut: www.google.com
    • URL: {google:baseURL}/search?udm=14&q=%s
    • Click “Add”

This will force Chrome to skip AI-generated overviews and show the classic list of web links.


Featured Image: Sira Anamwong/Shutterstock

Google Helpfulness Signals Might Change – Why It’s Not Enough via @sejournal, @martinibuster

Google’s John Mueller indicated the possibility of changes to sitewide helpful content signals so that new pages may be allowed to rank. But there is reason to believe that even if that change goes through it may not be enough to help.

Helpful Content Signals

Google’s helpful content signals (aka the Helpful Content Update, aka HCU) originally worked as a site-wide signal when launched in 2022. That meant an entire site could be classified as unhelpful and become unable to rank, regardless of whether some of its pages were helpful.

Recently the signals associated with the Helpful Content System were absorbed into Google’s core ranking algorithm, generally changing them to page-level signals, with a caveat.

Google’s documentation advises:

“Our core ranking systems are primarily designed to work on the page level, using a variety of signals and systems to understand the helpfulness of individual pages. We do have some site-wide signals that are also considered.”

There are two important takeaways:

  1. There is no longer a single system for helpfulness. It’s now a collection of signals within the core ranking algorithm.
  2. The signals are page-level but there are site-wide signals that can impact the overall rankings.

Some publishers have tweeted that the site-wide effect is preventing new helpful pages from ranking, and John Mueller offered some hope.

If Google follows through with relaxing the sitewide helpfulness signals so that individual pages are able to rank, there is reason to believe it still may not help many of the websites that publishers and SEOs believe are suffering from sitewide helpfulness signals.

Publishers Express Frustration With Sitewide Algorithm Effects

Someone on X (formerly Twitter) shared:

“It’s frustrating when new content is also being penalized without having a chance to gather positive user signals. I publish something it goes straight to page 4 and stays there, regardless of if there are any articles out on the location.”

Someone else brought up the point that if helpfulness signals are page-level then in theory the better (helpful) pages should begin ranking but that’s not happening.

John Mueller Offers Hope

Google’s John Mueller responded to a query about sitewide helpfulness signals suppressing the rankings of new pages created to be helpful and later indicated there may be a change to the way helpfulness signals are applied sitewide.

Mueller tweeted:

“Yes, and I imagine for most sites strongly affected, the effects will be site-wide for the time being, and it will take until the next update to see similar strong effects (assuming the new state of the site is significantly better than before).”

Possible Change To Helpfulness Signals

Mueller followed up his tweet by saying that the search ranking team is working on a way to surface high quality pages from sites that may contain strong negative sitewide signals indicative of unhelpful content, providing relief to some sites that are burdened by sitewide signals.

He tweeted:

“I can’t make any promises, but the team working on this is explicitly evaluating how sites can / will improve in Search for the next update. It would be great to show more users the content that folks have worked hard on, and where sites have taken helpfulness to heart.”

Why Changes To Sitewide Signals May Not Be Enough

Google’s Search Console tells publishers when they’ve received a manual action, but it doesn’t tell them when their sites lost rankings due to algorithmic issues like helpfulness signals. Publishers and SEOs don’t, and cannot, “know” whether their sites are affected by helpfulness signals. The core ranking algorithm alone contains hundreds of signals, so it’s important to keep an open mind about what may be affecting search visibility after an update.

Here are five examples of changes during a broad core update that can affect rankings:

  1. The way a query is understood could have changed which affects what kinds of sites are able to rank
  2. Quality signals changed
  3. Rankings may change to respond to search trends
  4. A site may lose rankings because a competitor improved their site
  5. Infrastructure may have changed to accommodate more AI on the back end

A lot of things can influence rankings before, during, and after a core algorithm update. If rankings don’t improve, then it may be time to consider that a knowledge gap is standing in the way of a solution.

Examples Of Getting It Wrong

For example, a publisher who recently lost rankings correlated the date of their rankings collapse with the announcement of the Site Reputation Abuse update. It’s a reasonable assumption: if rankings drop on the same date as an update, then it must be the update.

Here’s the tweet:

“@searchliaison feeling a bit lost here. Judging by the timing, we got hit by the Reputation Abuse algorithm. We don’t do coupons, or sell links, or anything else.

Very, very confused. We’ve been stable through all this and continue to re-work/remove older content that is poor.”

They posted a screenshot of the rankings collapse.

Screenshot Showing Search Visibility Collapse

SearchLiaison responded to that tweet by noting that Google is currently enforcing site reputation abuse only through manual actions, which rules out that update as the cause of an algorithmic drop. It’s reasonable to assume that an update that correlates with a ranking issue is the cause.

But one cannot ever be 100% sure about the cause of a rankings drop, especially if there’s a knowledge gap about other possible reasons (like the five I listed above). This bears repeating: one cannot be certain that a specific signal is the reason for a rankings drop.

In another tweet SearchLiaison remarked about how some publishers mistakenly assumed they had an algorithmic spam action or were suffering from negative Helpful Content Signals.

SearchLiaison tweeted:

“I’ve looked at many sites where people have complained about losing rankings and decide they have a algorithmic spam action against them, but they don’t.

…we do have various systems that try to determine how helpful, useful and reliable individual content and sites are (and they’re not perfect, as I’ve said many times before, anticipating a chorus of “whatabouts…..” Some people who think they are impacted by this, I’ve looked at the same data they can see in Search Console and … not really.”

SearchLiaison, in the same tweet, addressed a person who remarked that getting a manual action is more fair than receiving an algorithmic action, pointing out the inherent knowledge gap that would lead someone to surmise such a thing.

He tweeted:

“…you don’t really want to think “Oh, I just wish I had a manual action, that would be so much easier.” You really don’t want your individual site coming the attention of our spam analysts. First, it’s not like manual actions are somehow instantly processed.”

The point I’m trying to make (and I have 25 years of hands-on SEO experience, so I know what I’m talking about) is to keep an open mind that maybe something else is going on that is undetected. Yes, there are such things as false positives, but it’s not always the case that Google is making a mistake; it could be a knowledge gap. That’s why I suspect many people will not experience a lift in rankings if Google makes it easier for new pages to rank. If that happens, keep an open mind: maybe there’s something else going on.

Featured Image by Shutterstock/Sundry Photography

Google Hints At Improving Site Rankings In Next Update via @sejournal, @MattGSouthern

Google’s John Mueller says the Search team is “explicitly evaluating” how to reward sites that produce helpful, high-quality content when the next core update rolls out.

The comments came in response to a discussion on X about the impact of March’s core update and September’s helpful content update.

In a series of tweets, Mueller acknowledged the concerns, stating:

“I imagine for most sites strongly affected, the effects will be site-wide for the time being, and it will take until the next update to see similar strong effects (assuming the new state of the site is significantly better than before).”

He added:

“I can’t make any promises, but the team working on this is explicitly evaluating how sites can / will improve in Search for the next update. It would be great to show more users the content that folks have worked hard on, and where sites have taken helpfulness to heart.”

What Does This Mean For SEO Professionals & Site Owners?

Mueller’s comments confirm Google is aware of critiques about the March core update and is refining its ability to identify high-quality sites and reward them appropriately in the next core update.

For websites, clearly demonstrating an authentic commitment to producing helpful and high-quality content remains the best strategy for improving search performance under Google’s evolving systems.

The Aftermath Of Google’s Core Updates

Google’s algorithm updates, including the September “Helpful Content Update” and the March 2024 update, have far-reaching impacts on rankings across industries.

While some sites experienced surges in traffic, others faced substantial declines, with some reporting visibility losses of up to 90%.

As website owners implement changes to align with Google’s guidelines, many question whether their efforts will be rewarded.

There’s genuine concern about the potential for long-term or permanent demotions for affected sites.

Recovery Pathway Outlined, But Challenges Remain

In a previous statement, Mueller acknowledged the complexity of the recovery process, stating that:

“some things take much longer to be reassessed (sometimes months, at the moment), and some bigger effects require another update cycle.”

Mueller clarified that not all changes would require a new update cycle but cautioned that “stronger effects will require another update.”

While affirming that permanent changes are “not very useful in a dynamic world,” Mueller added that “recovery” implies a return to previous levels, which may be unrealistic given evolving user expectations.

“It’s never ‘just-as-before’,” Mueller stated.

Improved Rankings On The Horizon?

Despite the challenges, Mueller has offered glimmers of hope for impacted sites, stating:

“Yes, sites can grow again after being affected by the ‘HCU’ (well, core update now). This isn’t permanent. It can take a lot of work, time, and perhaps update cycles, and/but a different – updated – site will be different in search too.”

He said the process may require “deep analysis to understand how to make a website relevant in a modern world, and significant work to implement those changes — assuming that it’s something that aligns with what the website even wants.”

Looking Ahead

Google’s search team is actively working on improving site rankings and addressing concerns with the next core update.

However, recovery requires patience, thorough analysis, and persistent effort.

The best way to spend your time until the next update is to remain consistent and produce the most exceptional content in your niche.


FAQ

How long does it generally take for a website to recover from the impact of a core update?

Recovery timelines can vary and depend on the extent and type of updates made to align with Google’s guidelines.

Google’s John Mueller noted that some changes might be reassessed quickly, while more substantial effects could take months and require additional update cycles.

Google acknowledges the complexity of the recovery process, indicating that significant improvements aligned with Google’s quality signals might be necessary for a more pronounced recovery.

What impact did the March and September updates have on websites, and what steps should site owners take?

The March and September updates had widespread effects on website rankings, with some sites experiencing traffic surges while others faced up to 90% visibility losses.

Publishing genuinely useful, high-quality content is key for website owners who want to bounce back from a ranking drop or maintain strong rankings. Stick to Google’s recommendations and adapt as they keep updating their systems.

To minimize future disruptions from algorithm changes, it’s a good idea to review your whole site thoroughly and build a content plan centered on what your users want and need.

Is it possible for sites affected by core updates to regain their previous ranking positions?

Sites can recover from the impact of core updates, but it requires significant effort and time.

Mueller suggested that recovery might happen over multiple update cycles and involves a deep analysis to align the site with current user expectations and modern search criteria.

While a return to previous levels isn’t guaranteed, sites can improve and grow by continually enhancing the quality and relevance of their content.


Featured Image: eamesBot/Shutterstock

Google Reveals Two New Web Crawlers via @sejournal, @martinibuster

Google revealed details of two new crawlers that are optimized for scraping image and video content for “research and development” purposes. Although the documentation doesn’t explicitly say so, it’s presumed that there is no impact on rankings should publishers decide to block the new crawlers.

It should be noted that the data scraped by these crawlers is not explicitly for AI training; that’s what the Google-Extended crawler is for.

GoogleOther Crawlers

The two new crawlers are versions of Google’s GoogleOther crawler that was launched in April 2023. The original GoogleOther crawler was also designated for use by Google product teams for research and development in what is described as one-off crawls, the description of which offers clues about what the new GoogleOther variants will be used for.

The purpose of the original GoogleOther crawler is officially described as:

“GoogleOther is the generic crawler that may be used by various product teams for fetching publicly accessible content from sites. For example, it may be used for one-off crawls for internal research and development.”

Two GoogleOther Variants

There are two new GoogleOther crawlers:

  • GoogleOther-Image
  • GoogleOther-Video

The new variants are for crawling binary data, which is data that is not text. HTML is generally regarded as a text format (ASCII or Unicode); if a file can be viewed in a text editor, then it’s a text file. Binary files, such as images, audio, and video, can’t be opened in a text viewer app.
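The sketch below treats a file as text if a sample of its bytes decodes cleanly as UTF-8 (a simple heuristic to illustrate the distinction, not a definitive test):

    # Rough heuristic, not a definitive test: treat a file as text if a
    # sample of its bytes has no NUL bytes and decodes cleanly as UTF-8.
    def looks_like_text(path: str, sample_size: int = 4096) -> bool:
        with open(path, "rb") as f:
            sample = f.read(sample_size)
        if b"\x00" in sample:          # NUL bytes strongly suggest binary
            return False
        try:
            sample.decode("utf-8")     # HTML/ASCII/Unicode decodes cleanly
            return True
        except UnicodeDecodeError:
            return False               # images, audio, and video typically fail

    print(looks_like_text("index.html"))  # likely True for an HTML page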

The new GoogleOther variants are for image and video content. Google lists user agent tokens for both of the new crawlers, which can be used in robots.txt to block them.

1. GoogleOther-Image

User agent tokens:

  • GoogleOther-Image
  • GoogleOther

Full user agent string:

GoogleOther-Image/1.0

2. GoogleOther-Video

User agent tokens:

  • GoogleOther-Video
  • GoogleOther

Full user agent string:

GoogleOther-Video/1.0
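As a minimal sketch (assuming a publisher wants to opt both new crawlers out of crawling the entire site), the tokens above could be used in robots.txt like this:

    User-agent: GoogleOther-Image
    Disallow: /

    User-agent: GoogleOther-Video
    Disallow: /

And because GoogleOther is also listed as a token for both variants, a single group targeting GoogleOther should cover the original crawler and both new variants, with the more specific token taking precedence when both groups are present.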

Newly Updated GoogleOther User Agent Strings

Google also updated the user agent strings for the regular GoogleOther crawler. For blocking purposes, you can continue using the same user agent token as before (GoogleOther). The new user agent strings are simply the data sent to servers to identify the full description of the crawler, in particular the technology used. In this case the technology is Chrome, with the version number periodically updated to reflect the version in use (W.X.Y.Z is a Chrome version number placeholder in the examples listed below).

The full list of GoogleOther user agent strings:

  • Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/W.X.Y.Z Mobile Safari/537.36 (compatible; GoogleOther)
  • Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; GoogleOther) Chrome/W.X.Y.Z Safari/537.36

GoogleOther Family Of Bots

These new bots may show up in your server logs from time to time. The information above will help in identifying them as genuine Google crawlers, and it will help publishers who may want to opt out of having their images and videos scraped for research and development purposes.
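As a minimal sketch of how to confirm that a log entry claiming to be one of these crawlers is genuine, here is the reverse-then-forward DNS check commonly used to verify Google’s crawlers (the IP below is just an example):

    # Minimal sketch: verify a crawler IP via reverse DNS, then confirm the
    # hostname resolves back to the same IP. Genuine Google crawler hostnames
    # end in googlebot.com or google.com.
    import socket

    def is_google_crawler(ip: str) -> bool:
        try:
            host = socket.gethostbyaddr(ip)[0]       # reverse DNS lookup
        except (socket.herror, socket.gaierror):
            return False
        if not host.endswith((".googlebot.com", ".google.com")):
            return False
        try:
            # Forward-confirm: the hostname must resolve back to the same IP.
            return ip in socket.gethostbyname_ex(host)[2]
        except socket.gaierror:
            return False

    print(is_google_crawler("66.249.66.1"))  # an example Googlebot-range IP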

Featured Image by Shutterstock/ColorMaker

YouTube Unveils New Content And Ad Offerings At Brandcast via @sejournal, @gregjarboe

YouTube unveiled four new content and ad offerings at its 13th annual Brandcast at David Geffen Hall, Lincoln Center.

Key announcements include:

  • WNBA Friday night games: Google and Scripps Sports announced an agreement for YouTube TV to show the locally televised WNBA Friday night games on ION in both the home and away markets of the teams playing. This season, YouTube TV will be the only digital multichannel video programming distributor (dMVPD) carrying local and national WNBA games. The games will be part of the YouTube TV Base Plan starting May 31 and continuing through the regular season.
  • Creator Takeovers via YouTube Select: YouTube announced the expansion of this takeover option, initially piloted at the end of 2023, to more creators. With the formalization of this program, brands will be able to collaborate with top YouTube creators to own a 100% share of voice on their channels, leveraging the solid connections creators have with their fans.
  • Non-Skips for Video Reach Campaigns: YouTube announced a new AI-powered format optimized for connected TV (CTV), using non-skippable assets across in-stream inventory.
  • Branded QR Codes: YouTube launched branded QR codes, enabling advertisers to drive more interactivity by putting their brand front and center in a more modern QR code.

In addition to these news announcements, YouTube’s executive bench also took the stage to talk about their vision, the importance of content, and innovation in advertising.

Neal Mohan, the CEO of YouTube, said, “Creators are drawing audiences on the big screen because they’re the new Hollywood. They have business strategies, writers’ rooms, and production teams. They’re reimagining classic TV genres, from morning shows to sports commentary. And they’re inventing entirely new ones!”

He added, “Along the way, creators are redefining what we think of as ‘TV.’ And they deserve the same acclaim as other creative professionals. I believe it’s time a creator won an Emmy.”

YouTube CEO Neal Mohan speaks onstage during YouTube Brandcast 2024 at David Geffen Hall on May 15, 2024, in New York City. (Photo by Kevin Mazur/Getty Images for YouTube)

Mary Ellen Coe, the Chief Business Officer of YouTube, said:

“No one is more engaged than loyal YouTube fans. They excitedly count down to new videos, rewatch old ones, and create their own in response. And they rush to their favorite creator’s channel in the 24 hours after new videos are released. Which presents an ideal moment for brands to engage with these fans.”

Sean Downey, the President of Americas and Global Partners at Google, said:

“Google AI has been at the core of our ads solutions for years. As we advance, our ability to help brands drive ROI keeps improving.”

The night featured musical performances by Billie Eilish featuring FINNEAS, Benson Boone, and Stray Kids, as well as various YouTube creators, including Haley Kalil, Kinigra Deon, Ryan Trahan, Shannon Sharpe, and Zach King.

This underscored the finding of a survey conducted by Kantar: viewers in the United States say that if they could only watch one service for an entire year, YouTube would be their #1 choice.

Mary Ellen Coe, CBO, YouTube, speaks at YouTube Brandcast. (Photo by Noam Galai/Getty Images for YouTube)

The audience gleaned other critical data throughout the evening event, which was part of the Upfronts.

For example, according to Nielsen’s total TV and streaming report for the US, YouTube has remained the leader in streaming watch time every month since February 2023. And 9 out of 10 viewers say they use YouTube, according to a Pew Research Study.

According to YouTube’s internal data, the key CTV metrics included:

  • Views in the living room have increased by more than 130% from 2020 to 2023.
  • On average, viewers watch over 1 billion hours of YouTube content on the big screen (television) daily.
  • YouTube TV has more than 8 million paid subscribers.
  • Over 40 of YouTube’s top 100 channels by watch time have TV as their most-watched screen.
  • Last year, views of Shorts on connected TVs more than doubled.

Advertisers in the audience also snacked on these news nuggets:

  • According to a custom MMM meta-analysis commissioned by Google with Nielsen, on average, YouTube drives higher long-term Return on Ad Spend (ROAS) than TV, other online video, and paid social.
  • Based on a meta-analysis across 13 NCS sales lift studies, AI-powered video reach campaign mixes earned an average ROAS 3.7x higher (271%) than manually optimized campaigns.
  • According to a Kantar survey, viewers in the United States agree that YouTube is the #1 video platform for gaming content, outperforming TV, social, and streaming platforms.
  • According to a Google/Ipsos YouTube Trends Survey, 54% of people would rather watch creators break down a significant event like the Oscars or Grammys than watch it themselves.

Now, that’s a lot of news to digest. Still, as I mentioned in “Google Unveils Updates At IAB NewFronts 2024,” YouTube is expected to make more announcements at VidCon Anaheim 2024, which will take place from June 26–29, 2024, at the Anaheim Convention Center.

So, as TV newscasters would say in the old days, “Don’t touch that dial.”


Featured Image: Muhammad Alimaki/Shutterstock

Google Ads Restricts Brand Names & Logos From AI Image Generation via @sejournal, @MattGSouthern

Google has provided details about the capabilities and limitations of its AI image generation tools for Google Ads.

The clarification came after search marketer Darcy Burk expressed excitement about the potential for AI to create product images.

This prompted Google’s Ads Liaison, Ginny Marvin, to outline some key restrictions.

Branded Content Off-Limits

Marvin confirmed that while Google’s AI tools can generate generic product images, they are designed to avoid creating visuals that depict branded items or logos.

Marvin stated:

“The tool will generate product images, but it won’t generate product images that include brand names or logos.”

She provided an illustrative example:

“So, for example, you could ask it to generate images of ‘a dog in a pet stroller in a park,’ but if you asked it to generate images of ‘a dog in a pet stroller in a park with a Doggo logo,’ you’ll get an error notification to remove mentions of brands and branded items from your description.”

Guidelines Outlined

Marvin points to Google’s support documentation for more details on using the AI image generation and editing capabilities.

When attempting to generate branded product images, users will likely receive an error message instructing them to remove any branded terms from their prompts.

Google’s support page notes:

“Generative AI tools in Google Ads are designed to automatically limit the creation of certain content.”

It lists “Faces, children, or specific individuals” and “Branded items and logos” as examples of restricted subject matter.
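Google hasn’t published how this screening works; the toy filter below is purely illustrative of the kind of prompt check the documentation describes (the blocked terms are hypothetical):

    # Purely illustrative toy filter, not Google's implementation: prompts
    # mentioning brands or logos are rejected before generation runs.
    BLOCKED_TERMS = {"logo", "doggo", "brand"}  # hypothetical denylist

    def screen_prompt(prompt: str) -> str:
        if BLOCKED_TERMS & set(prompt.lower().split()):
            raise ValueError("Remove mentions of brands and branded items "
                             "from your description.")
        return prompt

    screen_prompt("a dog in a pet stroller in a park")            # accepted
    # screen_prompt("a dog in a pet stroller with a Doggo logo")  # raises ValueError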

Restricted Verticals

Google’s documentation also addresses concerns around safety and responsible AI development.

Generated images include digital watermarking to identify their AI-generated nature and deter misuse.

Sensitive advertising verticals like politics and pharmaceuticals are also restricted from automatically receiving AI-generated image suggestions.

“As this technology evolves, we’re continuously evaluating and improving our approach to safety,” Google states.

Why SEJ Cares

As generative AI capabilities expand across the advertising ecosystem, clear guidelines from Google help provide guardrails to mitigate potential risks while allowing advertisers to experiment.

Understanding current limitations, such as restrictions around branded visuals, is critical for marketers looking to incorporate AI image generation into their workflows.

How This Can Help You

For advertisers, Google’s AI image generation tools can produce large volumes of high-quality generic product and lifestyle images at scale.

By following the outlined guidelines around avoiding branded references, you can generate a variety of visual assets suited for ecommerce product listings, display ads, social media marketing and more.

This can streamline traditionally time-consuming processes like product photoshoots while maintaining brand safety.


FAQ

How does Google Ads’ AI image generation tool handle branded content?

Google’s AI image generation tool can create generic product images but is designed to exclude any branded items or logos.

If a user tries to generate an image with specific brands or logos, the system will trigger an error notification directing them to remove those references before proceeding.

  • The tool generates generic product images
  • It excludes brand names and logos
  • Users receive error notifications guiding them to correct prompts

What kind of content is restricted when using Google Ads’ AI image generation tools?

Several types of content are restricted when using the AI image generation tools in Google Ads.

Restrictions include creating images featuring faces, children, specific individuals, branded items, and logos.

Sensitive verticals like politics and pharmaceuticals are also barred from receiving AI-generated image suggestions.

How does the restriction on branded content benefit marketers using Google’s AI tools?

By focusing on generating only generic product images, advertisers can utilize the tool for a variety of applications, such as ecommerce product listings, display ads, and social media marketing, without risking any legal issues related to brand misuse.


Featured Image: DANIEL CONSTANTE/Shutterstock