Google SGE: Study Reveals Potential Disruption For Brands & SEO via @sejournal, @MattGSouthern

A new study by Authoritas suggests that Google’s AI-powered Search Generative Experience (SGE), currently being tested with a limited group of users, could adversely impact brand visibility and organic search traffic.

These findings include:

  • When an SGE box is expanded, the top organic result drops by over 1,200 pixels on average, significantly reducing visibility.
  • 62% of SGE links come from domains outside the top 10 organic results.
  • Ecommerce, electronics, and fashion-related searches saw the greatest disruption, though all verticals were somewhat impacted.

Adapting to generative search may require a shift in SEO strategies, focusing more on long-form content, expert insights, and multimedia formats.

As Google continues to invest in AI-powered search, the Authoritas study provides an early look at the potential challenges and opportunities ahead.

High Penetration Rate & Industry-Wide Effects

The study analyzed 2,900 brand and product-related keywords across 15 industry verticals and found that Google displayed SGE results for 91.4% of the queries analyzed.

The prevalence of SGE results indicates they impact a majority of websites across various industries.

The research analyzed the typical composition of SGE results. On average, each SGE element contained 10 to 11 links sourced from an average of four different domains.

This indicates brands may need to earn multiple links and listings within these AI-curated results to maintain visibility and traffic.

The research also suggests that larger, well-established websites like Quora and Reddit will likely perform better in SGE results than smaller websites and lesser-known brands.

Shifting Dynamics In Organic Search Results

With SGE results occupying the entire first page, websites that currently hold the top positions may experience a significant decrease in traffic and click-through rates.

When a user clicks to expand the SGE element, the study found that, on average, the #1 ranked organic result drops a sizeable 1,255 pixels down the page.

Even if a website ranks number one in organic search, it may effectively be pushed down to the second page due to the prominence of SGE results.

New Competition From Unexpected Sources

The study revealed that SGE frequently surfaces links and content from websites that didn’t appear in the top organic rankings.

On average, only 20.1% of SGE links exactly matched a URL from the first page of Google search results.

An additional 17.9% of SGE links were from the same domains as page one results but linked to different pages. The remaining 62% of SGE links came from sources outside the top organic results.
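
Authoritas hasn’t published the code behind these comparisons, but the classification described above can be sketched in a few lines. The Python below is illustrative only; the function name and example URLs are assumptions, not details taken from the study.

from urllib.parse import urlparse

def classify_sge_links(sge_links, page_one_urls):
    """Bucket SGE links as exact URL matches, same-domain matches, or new domains."""
    page_one_set = set(page_one_urls)
    page_one_domains = {urlparse(url).netloc for url in page_one_urls}

    counts = {"exact_match": 0, "same_domain": 0, "new_domain": 0}
    for link in sge_links:
        if link in page_one_set:
            counts["exact_match"] += 1          # same URL appears on page one
        elif urlparse(link).netloc in page_one_domains:
            counts["same_domain"] += 1          # same site, different page
        else:
            counts["new_domain"] += 1           # source not in the top organic results

    total = len(sge_links) or 1
    return {bucket: round(100 * count / total, 1) for bucket, count in counts.items()}

# Illustrative data only, not from the study.
sge_links = [
    "https://example.com/buy-widgets",       # exact page-one match
    "https://example.com/widget-guide",      # same domain, different page
    "https://forum.example.org/thread/123",  # domain absent from page one
]
page_one_urls = ["https://example.com/buy-widgets", "https://shop.example.net/widgets"]

print(classify_sge_links(sge_links, page_one_urls))
# {'exact_match': 33.3, 'same_domain': 33.3, 'new_domain': 33.3}

Applied across a full keyword set, a breakdown like this is what produces the 20.1% / 17.9% / 62% split reported above.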

Challenges For Brand Term Optimization & Local Search

The study reveals that SGE results for branded terms may include competitors’ websites alongside the brand’s own site, potentially leading to increased competition for brand visibility.

Laurence O’Toole, CEO and founder of Authoritas, states:

“Brands are not immune. These new types of generative results introduce more opportunities for third-party sites and even competitors to rank for your brand terms and related brand and product terms that you care about.”

Additionally, local businesses may face similar challenges, as SGE results could feature competing local brands even when users search for a specific brand in a regional context.

Methodology & Limitations

To arrive at these insights, Authoritas analyzed a robust dataset of 2,900 search keywords across a spectrum of query types, including specific brand names, brand + generic terms, brand + product names, generic terms, and specific product names. The keywords were distributed across 15 industry verticals.

The study utilized a consistent desktop browser viewport to quantify pixel-based changes in the search results. Authoritas also developed proprietary “alignment scores” to measure the degree of overlap between traditional organic search results and the new SGE links.
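
The study doesn’t share its measurement tooling, but the general approach of recording pixel positions at a fixed desktop viewport can be sketched with a headless browser such as Playwright. Everything below is an illustrative assumption, including the results-page URL and the CSS selector used to find the first organic result.

from playwright.sync_api import sync_playwright

# A minimal sketch: load a results page at a fixed desktop viewport and record
# how far down the page the first organic result starts, in pixels.
# The URL and the "div.g" selector are assumptions for illustration only.
with sync_playwright() as p:
    browser = p.chromium.launch()
    page = browser.new_page(viewport={"width": 1920, "height": 1080})
    page.goto("https://www.google.com/search?q=example+query")

    box = page.locator("div.g").first.bounding_box()  # x, y, width, height in CSS pixels
    if box:
        print(f"Top organic result starts {box['y']:.0f}px from the top of the page")

    browser.close()

Repeating a measurement like this before and after expanding the SGE element is the kind of comparison that yields the 1,255-pixel average shift reported earlier.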

While acknowledging some limitations, such as the keyword set not being fully representative of each vertical and the still-evolving nature of SGE, Authoritas maintains that the insights hold value in preparing brands for the new realities of an AI-powered search ecosystem.

Why We Care

The findings of the Authoritas study have implications for businesses, marketers, and SEO professionals. As Google’s SGE becomes more prevalent, it could disrupt traditional organic search rankings and traffic patterns.

Brands that have invested heavily in SEO and have achieved top rankings for key terms may find their visibility and click-through rates diminished by the prominence of SGE results.

SGE introduces new competition from unexpected sources, as most SGE links come from domains outside the top 10 organic results. This means businesses may need to compete not only with their traditional rivals but also with a broader range of websites that gain visibility through SGE.

As Google is a primary source of traffic and leads for many businesses, any changes to its search results can impact visibility, brand awareness, and revenue.

How This Could Help You

While the rise of SGE presents challenges, it also offers opportunities.

Taking into account what we’ve learned from the Authoritas study, here are some actionable takeaways:

  • As SGE favors in-depth, informative content, businesses may benefit from investing in comprehensive, well-researched articles and guides that provide value to users.
  • Incorporating expert quotes, interviews, and authoritative sources within your content could increase the likelihood of being featured in SGE results.
  • Enriching your content with images, videos, and other multimedia elements may help capture the attention of both users and the SGE algorithm.
  • Building a strong brand presence across multiple channels, including social media, industry forums, and relevant websites, can increase your chances of appearing in SGE.
  • Creating a trustworthy brand and managing your online reputation will be crucial, as SGE may feature competitors alongside your website.

Looking Ahead

While the long-term impact of SGE will depend on user adoption and the perceived usefulness of results, this study’s findings serve as a valuable starting point for businesses and SEO professionals.

By proactively addressing the challenges and opportunities SGE presents, you can increase your chances of success in the new search environment.


Featured Image: BestForBest/Shutterstock

Google Blocks 5.5 Billion Bad Ads, Leverages AI For Improved Ad Safety via @sejournal, @brookeosmundson

Google announced the release of the 2023 Ads Safety Report to demonstrate its commitment to safe and transparent advertising for users and businesses.

The ad safety report shares progress on how Google enforces advertiser and publisher policies and the key trend in 2023: generative AI.

Read on for insights shared within the report and what they mean for advertisers.

Advertiser Policy Enforcement Insights

The main driver of increased ad enforcement has been, without a doubt, generative AI.

While traditional machine learning models are still trained to detect and flag ad policy violations, Google is also leveraging its generative AI tool, Gemini, in its ad safety tools.

According to the ad safety report, more than 90% of publisher page enforcement used machine learning models and LLMs (large language models) in 2023.

Another staggering metric reported was the number of bad ads stopped in 2023: 5.5 billion.

The top five policies Google enforced the most in 2023 include:

  1. Abusing the ad network
  2. Trademark
  3. Personalized ads
  4. Financial services
  5. Legal requirements

Another key trend in 2023 was the rise of scams and fraud across online platforms.

Digital marketing isn’t sheltered from these types of scams, which is why Google took the following measures to combat these threats:

  • The Limited Ads Serving policy launched in November 2023. The policy intends to protect users by limiting the reach of lesser-known advertisers who don’t have an established track record of good ad behavior.
  • An update to the misrepresentation policy in late 2023. The policy enables a swifter response from Google in suspending the accounts of bad actors.

Google blocked over 1 billion ads for violating the “abusing the ad network” policy.

Almost 7 Billion Ads Restricted in 2023

Another component of ad safety comes in the way of restricting or limiting ads.

According to Google’s report, it restricted 6.9 billion ads in the following content areas:

  • Legal requirements
  • Financial services
  • Gambling and games
  • Adult content
  • Copyright
  • Healthcare and medicine
  • Alcohol

Now, restricting ads doesn’t mean they were entirely blocked. The categories listed above can be shown to users but on a much more limited basis.

Google Invests in Election Integrity

Each election year, users are inundated with countless political advertisements.

2024 is, and will be, no different.

Per the ad safety report, Google takes the following measures to ensure voter trust in the ad content they’re viewing:

  • Identity verification process
  • Transparency requirements for election ads
  • Restrictions on how election advertisers can target their ads
  • A required “paid for” disclosure in election ads

Publisher Enforcement Insights

The other aspect of Google’s ad safety report includes enforcing restrictions on publishers who monetize their content.

In 2023, Google took action against more than 2 billion publisher pages.

The top five areas Google enforced restrictions the most included:

  1. Sexual content
  2. Dangerous or derogatory content
  3. Weapons promotion and sales
  4. Shocking content
  5. Online gambling

Conclusion

As the use of generative AI continues to increase, the industry will likely see an increase in ad fraud and more restrictions.

While AI and machine learning increasingly drive ad disapprovals, advertisers must remain proactive in monitoring their accounts for potential false positives.

If you’re a new advertiser on Google this year, it’s important to note that, due to Google’s ad safety policies, your ads may be limited while Google learns who you are as an advertiser.

You can read the full 2023 ad safety report here.

Google Explains Why Images Are Gone From Some SERPs via @sejournal, @martinibuster

Google’s John Mueller offered a suggestion about a long-standing problem with missing images in search results for recipe sites but recipe bloggers don’t seem convinced or satisfied.

Missing Images In The Search Results

Several recipe bloggers have, for the past few months, raised the issue of image thumbnails that stopped appearing in the search engine results pages (SERPs) for their websites. Their listings previously displayed thumbnail images, but at some point Google appears to have stopped showing them.

The recipe bloggers feel this is a problem that needs fixing and brought it to the attention of SearchLiaison and John Mueller but the problem keeps happening.

One of the first posts in the discussion that Mueller answered dates back to January 29th, when a recipe blogger noticed that thumbnail images had stopped showing for their site in the search results.

They tweeted:

“@searchliaison

Hi. I saw today that Google is not showing any of my featured images, not one. From the most popular recipes to any other random one I checked.

Even if the picture appears in the Carousel (below the first one), it doesn’t appear in the list (second link). Help!”

Google SearchLiaison responded that they had passed it on.

Casey Markee, founder of Media Wyse, offered some background information about the problem, explaining that the problem may be widespread.

He tweeted:

“FYI Danny, this is not a unique issue. We are seeing it on dozens of recipe sites.

Also, second issue, sites are reporting they can’t get any new recipes into the carousel in over a month!

I am going to have the sites tag you with specific examples where possible.”

Google SearchLiaison responded that he had passed the report on to the search team and reassured Casey that he would also discuss the feedback internally.

Missing Images Not An Isolated Issue

The tweets about missing images continued into February and March, when someone from the home improvement space tweeted that the missing image issue was affecting them as well.

@Sleblanc921 tweeted:

“It’s not just food bloggers, I’m in home improvement/ DIY and mine are gone too.”

The reports of missing images in the search results even affected travel bloggers.

The Answer That Probably Nobody Wanted

Mueller responded to the latest tweet by assuring everyone that he, too, had passed the feedback on to the search team.

He tweeted:

“We did pass these on, and folks have been looking into the report…”

But his tweet also contained a caveat about the image thumbnails, saying that the images don’t show for all sites.

His tweet continued:

“…but in general, we don’t guarantee that images are shown with search results, so sometimes things are just shown without them.”

The travel blogger, @flyingwithbaby, tweeted:

“Hi @JohnMu it seems to be affecting my site ( family travel) for every post- everyone else’s has an image next to theirs but mine have gone. Content is super helpful according to readers , own photos etc. and it’s harming what little has been left :-(“

In response to the travel blogger, John Mueller reiterated his previous response with a possible explanation of why the images are no longer appearing.

He tweeted:

“Like I mentioned, it’s not expected that every post / search result has an image next to it. I imagine things would look pretty cluttered if that were the case.”

@flyingwithbaby responded:

“I get that- but from my experience, it was all showing before and I’m unsure why my entire site is being singled out. Seems that others are having the exact same issue. It always showed before. …I’m not sure how more helpful this can be with personal experience, unique photos and content ( which others have copycatted) and then now even the image wiped off next to the search results.

It’s crippling John & so frustrating “

Indirect Statements Perpetuate Confusion

John Mueller’s answer implies that Google’s search results are working the way they’re supposed to work.

But the problem with implication is that it never explicitly says there is no bug and never explicitly says there is no outage.

It might not be Mueller’s fault that his language is not more explicit; there may be a prohibition on mentioning specifics. Or he may not realize that an explicit statement that there is no outage is what users need to hear.

Regardless, Mueller’s statement does imply that there is no outage and that the decrease in image thumbnails is the new normal.

Recipe bloggers raised concerns to Google SearchLiaison and John Mueller about the image thumbnails that used to appear alongside their search results, reporting that their images had completely stopped showing and that it was affecting their visibility. Images in rich results draw attention to a search result and help it attract clicks.

Food and recipe sites invest time and money in equipment for photographing food to accompany their articles, as well as in researching and testing recipes, which makes it easy to understand how disheartening it must feel to see those image thumbnails disappear.

The issue with missing image thumbnails in Google search results extends beyond recipe blogs to home improvement and travel sites, which may indicate that the problem is widespread.

Featured Image by Shutterstock/Roman Samborskyi

WordPress 6.5 Release Derailed By Bugs In New Feature via @sejournal, @martinibuster

WordPress 6.5 has been postponed because of disagreements over a new feature that would have required some websites to install a plugin in order to keep functioning.

Font Directory Problem

What derailed the 6.5 release is a new Font Library feature for managing fonts that also makes using Google Fonts GDPR compliant. It allows a publisher to connect with Google Fonts and download the fonts locally instead of serving them from Google’s servers. Hosting the fonts locally is what makes them GDPR compliant, because Google’s servers never see your site visitors’ IP addresses.

The fonts themselves were going to be stored in a new location, /wp-content/fonts/. However, it was acknowledged that doing this would result in bugs due to some file systems’ inability to store fonts in that directory. The compromise that the core contributors initially agreed to was to require affected sites to use a plugin (Fonts to Uploads) in order to be able to function with WordPress 6.5.

That plan didn’t go down too well because it didn’t conform to the design philosophy of making WordPress work out of the box.

There was strong opposition from the WordPress community.

Among the comments:

“I think this is a significant mistake that deviates from WordPress’ core philosophies, the existing technical infrastructure, and the stated requirements and plan for shipping in 6.5. …By removing a key aspect of the compromise plan, it undermines the entire plan, and means we are fundamentally shipping something we know is broken with 6.5.”

Multiple comments agreed that the original plan was flawed, including this one:

“So basically the decision of moving uploadable fonts outside of the uploadable folder is already proven problematic even before releasing with partial testing.

And TBH `wp-content/uploads/fonts` path was semantically better in the first place.”

So the fonts are now back to being stored in the more compatible location of /wp-content/uploads/fonts/.

And that led to the announcement that 6.5 would be delayed by a week:

“Based on community feedback on the Unblocking WP6.5 – Font Library and Synced Pattern Overrides and Font Library follow up posts, there has been a change to the WordPress 6.5 release schedule and a final change to the Font Library.

…This approach ensures that the greatest number of sites possible can benefit from the new Font Library feature without the need to install or configure anything. …Shipping a feature that requires additional configuration or technical knowledge isn’t in line with the guiding philosophies that have helped the project mature into the successful project that exists today.”

Not Without Some Minor Complaints

The change to where the files will be stored was met with the observation, among others, that storing fonts in the /uploads/ directory is not semantically correct, because Google fonts are downloaded rather than uploaded.

WordPress 6.5 is now scheduled for release on Tuesday, April 2nd, 2024.

Read the full announcement:

WordPress 6.5 release delayed 1 week

Featured Image by Shutterstock/PBXStudio

Let’s Be Real: Reddit In The SERPs Lacks Credibility via @sejournal, @martinibuster

Google dramatically increased the number of Reddit answers in the search results in order to surface actual user opinions, but it’s becoming increasingly apparent that this is having a negative effect on the quality of search results. Everyone seems to know it but Google.

Lack of Expertise But Makes Sense

The user generated content typically found in product reviews, travel reviews, Reddit, and web forums consists of opinions formed by anonymous people of unknown experience and expertise. It is based on personal opinions and biases limited to each person’s experience, and unfortunately, a lack of experience has never stopped anyone from expressing an opinion.

A frightening quality of the user generated content typically found on Reddit is that when an opinion “makes sense” (regardless of factual accuracy), it is practically guaranteed to get upvoted by other similarly experienced users who lack the expertise to tell the difference between inaccurate opinions that “make sense” and factually accurate ones.

All you have to do is look at the SEO community running from one trend to another because the ideas make sense, only for those ideas to inevitably be discarded.

Ideas that make sense are similar to AI hallucinations. AI hallucinations “sound right,” but sounding right is not the same as being factually accurate. The legacy of things that make sense includes the idea that the sun revolves around the earth, which “made sense” for millennia.

Common sense is a big problem at Reddit because it allows those who lack experience and expertise to share plausible yet factually inaccurate opinions.

I asked Chef Allen (Facebook Profile) about the expertise of the food and recipe discussions on Reddit.

She shared her expert opinion:

“Reddit is not a reliable place to get recipes. From untested recipes to improper measurements and lack of instructions, finding a trustworthy recipe on Reddit is nearly impossible. People mean well, but most responses to recipe requests are met with what is sure to be disappointing results if one were actually to attempt to make the suggested recipe.

Reddit just isn’t designed for sharing tried-and-tested recipes from experienced recipe developers.”

Example Of Facts Versus Reddit Opinions

There are many examples of how a lack of expertise among Redditors leads to opinions and biases that override facts and negatively impact the quality of the content.

This is one example that is related to SEO.

Two facts about links:

  • Googler Gary Illyes explained at Pubcon Austin that links aren’t in the top 3 of ranking factors anymore.
  • Then consider that in March 2024, Google’s spam policy page deemphasized links by stating they’re a ranking factor but no longer saying they’re an important ranking factor.

On Reddit today, someone observed that links aren’t playing as strong a role as they did in the past.

Screenshot Of A Reddit Post About Links

Screenshot of a Reddit post asserting that links play a less important role in rankings than they did in the past.

And several Redditors argue against it, including one who joined Reddit within the past few years and may have entirely missed the heyday when links were the undisputed #1 ranking factor.

Here’s one person’s response:

A typical response arguing that there is no change in how Google uses links for ranking purposes.

Here’s another example: a top-ranking Reddit thread about how to make birria de res, a braised beef version of birria, a Mexican dish that originated in the state of Jalisco and is traditionally made with goat.

The first person to answer is someone who is guessing what type of meat is good for a birria.

Redditor Guessing An Answer

Screenshot of a Redditor answering a post with a guess

That person posting a guess is typical of the responses given in not just that Reddit thread but in any number of discussions. That’s not to say there aren’t experts on Reddit but for those of you with years of experience in SEO, when was the last time you turned to Reddit for SEO advice?

Lack Of EEAT In User Generated Content

The idea of ranking the views of real people makes sense, perhaps more in some contexts than others. For example, when it comes to saltwater fishing, I would 100% take the opinion of someone posting on a dedicated east coast fishing forum like Stripers Online or from one of the writers at Surfcaster’s Journal, who all have decades of hands-on fishing experience. I value the expertise of Surfcaster’s Journal so much that I pay for access to their content. It’s that good.

But what about Reddit? I have expertise in saltwater and freshwater fishing and truthfully, Reddit is the last place I would ever turn to for fishing advice.

For example, it’s well-known that the best time for catching striped bass on the east coast is at night when striped bass sneak around in the dark to ambush prey. They can be caught during the day but the night time is without question the best time for catching the larger keeper size fish.

Yet many top ranked discussions on Reddit focus on daytime fishing.

Example Of Personal Bias In A Reddit Answer

Screenshot of a poor answer about saltwater fishing on Reddit

Contrast that with the logo of Surfcaster’s Journal where it’s clear that fishing at night is the heart of the sport of striped bass fishing.

Surfcaster's Journal emphasizes striped bass fishing at night

Recipe Sites Versus Reddit

Casey Markee (@MediaWyse), a well-regarded search marketing consultant specializing in optimizing recipe and food blogs, agrees that Reddit threads in the search results are a poor experience for users.

He explained:

“Although Google has argued that the proliferation of Reddit results is to provide more first-hand experience to guide users, rarely does this result in superior results in the recipe niche.  Especially for more complicated recipes, the detailed step by step instructions and expert tips that come with those recipes, fare outshine what you’ll find in a ranking Reddit thread.

For example, a complicated recipe like “Beef Wellington” has a lot that can go wrong, especially with regard to the phyllo dough and getting clean cuts when it’s done. You need only look at the Reddit result returned for this query, and compare it to the many better written and more detailed recipes, to see how large the gap is between presented expertise.

You absolutely wouldn’t want to rely on a Reddit thread alone to make even the average complicated recipe. The results will not be satisfying. Most Reddit threads are “too general” and really don’t provide the needed level of expertise to ensure a recipe is done ‘perfectly’ the first time.”

Top Reasons Why Reddit Should Not Be Highly Ranked

  1. Lack of Expertise
    This is self-evident by now.
  2. Anonymity
    The anonymous nature of Reddit among the millions of users makes it easy for anyone to post anything without experiencing the self-moderating effects of a forum community where expert members are always near to pick apart poor advice.
  3. Bias
    Reddit answers tend to reflect the biases of the users, some of which arise from a lack of experience.
  4. Subjectivity
    Reddit posts tend to be based on preconceptions or tastes that may have more to do with a poster’s geographic and cultural background than with facts, experience, and knowledge.
  5. Echo Chambers
    This is a well-known effect where likeminded people will coalesce and reinforce each other’s preconceptions and biases.
  6. Cognitive Biases
    A common trait in social media and user generated content is that cognitive biases like the Dunning-Kruger effect are amplified. The Dunning-Kruger effect is when someone who lacks expertise overestimates their subject matter knowledge, contributing to the overall decline of understanding.
  7. Oversimplification
    This relates to what Casey mentioned about Reddit threads being “too general.” One of the reasons for this is that the people answering the questions lack the contextual understanding and nuance that come with expertise, which leads to oversimplification of any given topic.

Google Is Using Content Of Dubious Expertise

The definition of dubious is something that cannot be trusted and is doubtful. Opinions shared anonymously on Reddit by people who lack expertise qualify as dubious. Yet Google is not only favoring Reddit in the search results, it is also paying millions of dollars for access to content that lacks expertise, experience, authoritativeness, and trustworthiness.

Google has stated that it is prioritizing content from actual people with experience, which makes sense in certain contexts such as their experiences with products. But does prioritizing Reddit content stray too far from surfacing content with actual expertise?

Featured Image by Shutterstock/Roman Samborskyi

Google Updates Product Structured Data With 3DModel Type via @sejournal, @martinibuster

Google updated its documentation to add a new 3DModel markup Type to the Product structured data type, connecting the two through the use of the subjectOf property.

Understanding The New Structured Data Property

A structured data Type is an entity or a concept (like EatAction or DrugCost). Some common structured data Types most people are familiar with are CreativeWork, Product, Event, and Organization.

A property is simply an attribute of the structured data Type.

The new markup that Google is introducing for use with the Product Type is the 3DModel Type, which is connected to the Product Type through the use of the subjectOf property.

The subjectOf property is a way to link two Types. In this case the 3DModel type is linked with the subjectOf property to the Product structured data type. The 3DModel structured data type adds more information about the Product.

Why Is Google Introducing The 3DModel Type?

Google’s developer pages changelog explains that the reason Google is making the 3DModel Type official is that 3D models are increasingly used on product webpages, and the markup gives merchants a way to add information about the 3D model associated with a product on a product page.

There is no indication of how the 3DModel structured data might be used as a rich result but it’s not unreasonable to imagine that merchant listings or the regular SERPs may one day have a rich result associated with 3D representations of products.

Even if Google doesn’t create a rich result for the search results pages (SERPs), it’s still worthwhile to use the new structured data type because it helps Google know that there is a 3D representation of the product on the webpage and use that information for ranking purposes.

Google’s explanation for introducing the new Schema.org structured data type is described in the changelog:

“Sometimes 3D models appear on pages with multiple products and are not clearly connected with any of them. This markup lets site owners link a 3D model to a specific product.”

Example Of 3DModel Type

Google published an example of how the subjectOf property is used to connect the 3DModel type to the Product structured data type.

Example Of 3DModel Type In Use

{
  "@context": "https://schema.org/",
  "@type": "Product",
  "sku": "1234-5678",
  "image": "https://www.example.com/sofa.jpg",
  "name": "Water heater",
  "description": "White 3-Seat Sofa",
  "gtin14": "12345678901231",
  "mpn": "S1234W3",
  "brand": {
    "@type": "Brand",
    "name": "ExampleSofaBrand"
  },
  "subjectOf": {
    "@type": "3DModel",
    "encoding": {
      "@type": "MediaObject",
      "contentUrl": "https://example.com/sofa.gltf"
    }
  }
}

As can be seen above, the subjectOf property links the 3DModel Type to the overall Product structured data Type.

Read the new documentation on Google’s Search Central page for Product structured data:

Product (Product, Review, Offer) structured data – 3D Model

Featured Image by Shutterstock/Castleski

Google: About Us & Contact Pages Not Important? via @sejournal, @martinibuster

Google’s John Mueller answered a question about whether it’s true that adding a “contact page” and/or an “about page” is a good idea because it’s important to Google. Mueller checked around and explained when contact and about us pages can be useful.

Needs To Convince Company To Add Contact & About Us Pages

@jaclynbrandt tweeted a question to John Mueller, explaining that they were trying to convince the company they worked for to add contact and about us pages and needed a statement or documentation to show why those pages should be added.

They tweeted:

“@JohnMu I am trying to convince my company to add a contact us and/or about us page to our website. We are an online directory/blog for a niche sport (but run by a ecommerce company in that sport). Do you have any tips I can present to them as to why it’s important?”

John Mueller asked if it was because they wanted feedback from users and suggested that they check with their users.

The person asking the question responded that they wanted to do it because they heard it’s important to Google that a company have those pages.

They tweeted:

“Nope! No need for feedback (there are other ways to do this). I have just heard it’s important to Google – but I can’t find any documentation on this.”

And added:

“And yes I know I should not do anything just because Google wants it – I generally stay away from that and just try to be helpful. But I have heard this is a make or break rule.”

Are Contact & About Us Pages Important To Google?

It’s not an unreasonable question to ask if an about us or contact page is important. Google’s Quality Raters Guidelines specifically tell raters that it should be clear who is responsible for the website.

The guidelines (PDF) explain on page 25:

“In determining page quality, Raters must consider EEAT:

The first-hand experience of the creator.

The expertise of the creator.

The authoritativeness of the creator, the main content itself and the website.”

All of the above considerations cannot be confirmed by the quality raters if there is no documentation on the webpage about who is responsible for the website, which is information that could be found on an about page.

The page quality section continues:

“Raters determine a page quality rating by:

…Reviewing the information available about the website and its creator: It should be clear who is responsible for the website and who created the content on the page, even if through an alias or username.”

Identifying who is responsible for a webpage is easier when there’s an about page that explains who are the people responsible for the website and why site visitors should trust them. This is information that is, for the purposes of rating websites, directly linked to E-E-A-T in the quality raters guidelines.

John Mueller Answers The Question

Mueller responded to the question of whether having contact and about us pages is a make-or-break rule at Google and whether it’s important to have those pages on a website.

He tweeted his response:

“I can think of good reasons for some sites to have these kinds of pages, but, after double-checking, there’s nothing in our search developer documentation that suggests this is needed.”

About Us & Contact Pages

There are a lot of things that people feel are “important to Google” that really aren’t. For example, recipe bloggers have long believed that having lots of content is important for ranking in Google. Even recipe blogger SEOs insisted on this to me, even though there is no documentation or Googler statement that confirms it.

And the reality is that the length of content is not a ranking factor or an influence on ranking; it simply doesn’t matter to Google. The only thing that matters is whether the content is useful or helpful and offers a good and satisfying experience for users.

Similarly, the thing about contact and about us pages is that Google doesn’t care about those either; they’re not “important” to Google or required.

But they are important if you want to communicate to site visitors that the people responsible for the site aren’t affiliates with zero expertise. Even for the sake of conversions, getting people to return to the site to click on ads or buy more stuff, it’s important to earn their trust and confidence.

So if a company needs convincing, the argument that these pages will help it make more money may be as good an argument as any, because if people are buying more, clicking on affiliate ads, or driving up ad impressions, that’s also a sign they trust the site.

The user experience is a money-earning thing, and it could have a downstream effect on rankings, but that effect isn’t direct, even if it’s important to the quality raters.

Featured Image by Shutterstock/Dean Drobot

2M+ WordPress Sites Hit By Essential Addons For Elementor Vulnerability via @sejournal, @martinibuster

Security researchers published an advisory on the popular Essential Addons For Elementor WordPress plugin, which was discovered to contain a Stored Cross-Site Scripting vulnerability affecting over 2 million websites.

Flaws in two different widgets that are a part of the plugin are responsible for the vulnerabilities.

Two Widgets That Lead To Vulnerabilities

  1. Countdown Widget
  2. Woo Product Carousel Widget

Essential Addons For Elementor

Essential Addons is a plugin that extends the popular Elementor WordPress page builder. Elementor makes it easy for anyone to create websites, and Essential Addons adds even more website features and widgets.

The Vulnerability

The advisory by Wordfence announced that the plugin contained a Stored Cross-Site Scripting (XSS) vulnerability that allows an attacker to upload a malicious script and attack website visitors’ browsers, which can lead to stolen session cookies and a takeover of the website.

XSS vulnerabilities are among the most common and arise from a failure to properly sanitize (screen or filter) fields that accept inputs like text or images.

Plugins typically “sanitize” inputs which means that they filter out unwanted inputs like scripts.

Another flaw that creates an XSS vulnerability is the failure to “escape output” which means to remove any output that contains unwanted data in order to prevent it from reaching a browser.
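
The plugin itself is written in PHP, but the two concepts are language-agnostic. Here is a minimal Python sketch, not the plugin’s actual code, showing the difference between sanitizing input when it is stored and escaping output when it is rendered:

import html
import re

def sanitize_input(value: str) -> str:
    """Strip anything that looks like markup before storing user-supplied text.
    A naive regex is used purely for illustration; real plugins rely on
    purpose-built sanitizers."""
    return re.sub(r"<[^>]*>", "", value).strip()

def escape_output(value: str) -> str:
    """Escape stored data immediately before writing it into an HTML page."""
    return html.escape(value)

user_supplied = '<script>alert("stolen cookie")</script>10 days left!'

stored = sanitize_input(user_supplied)   # the <script> tags are stripped, plain text survives
rendered = escape_output(stored)         # remaining special characters become harmless entities

print(stored)
print(rendered)

Skipping either step is what allows a crafted widget parameter to be saved and later executed in a visitor’s browser.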

Wordfence cites both of those flaws as factors that led to the vulnerabilities.

They warned about the countdown widget:

“The Essential Addons for Elementor – Best Elementor Templates, Widgets, Kits & WooCommerce Builders plugin for WordPress is vulnerable to Stored Cross-Site Scripting via the countdown widget’s message parameter in all versions up to, and including, 5.9.11 due to insufficient input sanitization and output escaping.

This makes it possible for authenticated attackers, with contributor access or higher, to inject arbitrary web scripts in pages that will execute whenever a user accesses an injected page.”

The warning about the Woo Product Carousel Widget:

“The Essential Addons for Elementor …plugin for WordPress is vulnerable to Stored Cross-Site Scripting via the alignment parameter in the Woo Product Carousel widget in all versions up to, and including, 5.9.10 due to insufficient input sanitization and output escaping. “

Authenticated Attackers

What’s meant by the phrase “authenticated attackers” is that a hacker first needs to acquire website credentials in order to launch the attack. The Essential Addons for Elementor vulnerability requires an attacker to have contributor-level access or higher.

Medium Level Threat – Updating Recommended

The vulnerability is rated as a medium threat and has been assigned a score of 6.4 on a scale of 1 – 10, with 10 being the most critical level of vulnerability.

Plugin users running version 5.9.11 or lower are advised to upgrade to the latest version of the plugin, currently version 5.9.13.

Read the Wordfence security bulletins:

Essential Addons for Elementor – Best Elementor Templates, Widgets, Kits & WooCommerce Builders <= 5.9.11 – Authenticated (Contributor+) Stored Cross-Site Scripting

Essential Addons for Elementor – Best Elementor Templates, Widgets, Kits & WooCommerce Builders <= 5.9.11 – Authenticated (Contributor+) Stored Cross-Site Scripting

Featured Image by Shutterstock/Aleksandrs Sokolovs

Generative AI Tool Guide: Transforming Your Strategy For SEO Success via @sejournal, @sejournal

The rise of AI has been on a fast track lately, with no sign of slowing down anytime soon.

Though artificial intelligence isn’t a new concept, generative AI, in particular, had a major breakthrough last year.

In fact, 80% of Fortune 500 companies adopted ChatGPT in 2023.

From search engines and social media networks to advertising platforms and productivity software, generative AI is already integrated into many aspects of our work lives.

So, if you’re looking to step up your search strategy in 2024, we’re sharing ways to leverage generative AI tools to your advantage.

In our latest ebook, you’ll learn how generative AI works, how to incorporate it into your team’s workflow, and how to avoid common pitfalls.

Download Leveraging Generative AI Tools For SEO and discover how to use them to achieve your marketing objectives more efficiently.

Inside, we cover the following topics:

Part I. How Generative AI Works & How Best To Use It

  • Defining Narrow AI & Generative AI
  • Limitations & Dangers Of Generative AI
  • Revolutionizing SEO With Google’s Search Generative Experience

Part II. Scalable AI Tools For SEO

  • The Top AI Chatbots
  • Generative AI & AI Chatbots On Social Media
  • Built-In AI Chatbots & Features In Tools
  • Prompting To Understand AI Discovery
  • Training AI Models & Building Generative AI Applications

Part III. Implementing & Automating Generative AI

  • Integrating ChatGPT With Google Sheets For Enhanced Data Analysis
  • GPT 4 With Vision SEO Applications
  • More Cool Generative AI Implementations & Guides

Generative AI uses deep learning to understand large amounts of data and generate new outputs based on user inputs (prompts).

This unique ability is the defining feature of generative AI, allowing it to perform an impressive array of new tasks. However, it’s also the source of significant risk and ethical debate.

If you’re looking to steer clear of these risks as you adopt this technology into your SEO strategy, this comprehensive guide covers everything you need to know.

Here are some key takeaways:

Impact On SEO

From the platforms you attempt to rank within to the tools you use in your workflow, generative AI is part of the future of SEO.

So, even if you’re not particularly interested in AI-powered tools, it’s important to understand how the technology will impact search.

Customizability Of AI Tools

Generative AI, and AI tools in general, are becoming exceptionally customizable.

Whether you’re picking a chatbot with specific functionalities, stringing together functions to create custom workflows, or creating custom tools from scratch, the possibilities with AI are endless.

This level of flexibility empowers you to fine-tune your SEO strategy and optimize your workflow for maximum efficiency.

Limitations Of Generative AI

While generative AI can certainly be useful, it’s important to acknowledge its limitations, particularly when it comes to content creation.

When creating content, AI should be used as a complementary tool to enhance your creative process rather than replace it entirely. Learning to leverage the strengths of generative AI while avoiding its limitations can help you maximize its potential.

Whether you’re a manager looking to harness the power of AI or an experienced SEO professional looking to refine your skills, this guide is sure to transform your search strategy.

Don’t miss out on the potential of generative AI, but learn how to use it responsibly.

Grab your copy today and start leveraging cutting-edge tools to improve your SEO strategy.

Generative AI Tool Guide: Transforming Your Strategy For SEO Success

Rank Math WordPress SEO Plugin Vulnerability Affects +2 Million Sites via @sejournal, @martinibuster

The Rank Math SEO plugin, which has more than two million users, recently patched a Stored Cross-Site Scripting vulnerability that makes it possible for attackers to upload malicious scripts and launch attacks.

Rank Math SEO Plugin

Rank Math is a popular SEO plugin that’s installed on over 2 million websites. It has an incredible array of functions, including keyword tracking, Schema.org structured data integration, Google Search Console and Analytics integration, a redirect manager, and other features that make it unnecessary to use other plugins for technical or on-page SEO.

A popular feature that users appreciate is that it’s a modular plugin, which means users can choose the features they require and turn off those they don’t, which can help a website perform even faster.

Many turn to Rank Math as an alternative to Yoast. A comparison between the two shows that Rank Math is smaller (61.1k lines of code versus Yoast’s 97.1k lines) and uses less server resources (+0.35 MB of memory versus Yoast’s +1.62 MB).

Authenticated Stored Cross-Site Scripting

Wordfence WordPress security researchers published an advisory about a stored Cross-Site Scripting (XSS) vulnerability in the Rank Math SEO plugin.

A stored XSS vulnerability allows an attacker to upload malicious scripts and attack visitors’ browsers, which can result in stolen session cookies, unauthorized website access, and compromised sensitive data.

Insufficient Input Sanitization And Output Escaping

The vulnerability stems from insufficient input sanitization and output escaping, which are common causes of XSS vulnerabilities in areas of plugins that allow users to upload or input data.

Sanitizing input means filtering out unwanted types of input, such as scripts or HTML, where only plain text is expected. Output escaping validates what the website outputs so that unwanted data, such as malicious scripts, never reaches the browser.

Wordfence warned:

“The Rank Math SEO with AI SEO Tools plugin for WordPress is vulnerable to Stored Cross-Site Scripting via the HowTo block attributes in all versions up to, and including, 1.0.214 due to insufficient input sanitization and output escaping on user supplied attributes.

This makes it possible for authenticated attackers, with contributor-level access and above, to inject arbitrary web scripts in pages that will execute whenever a user accesses an injected page.”

Rank Math’s update changelog responsibly acknowledges what was changed in the plugin and the reason for the update. This transparency makes it possible for plugin users to understand the importance of a given update and make an informed decision about the urgency of updating.

The changelog identifies the patched vulnerability:

“Improved: Strengthened the security of the plugin’s HowTo Block to prevent potential exploitation by users with post edit access. Thanks to [WordFence]
(https://www.wordfence.com/) for revealing it responsibly”

Read the official Wordfence advisory:

Rank Math SEO with AI SEO Tools <= 1.0.214 – Authenticated(Contributor+) Stored Cross-Site Scripting via HowTo block attributes

Featured Image by Shutterstock/Roman Samborskyi