Top 15 Ways To Secure A WordPress Site via @sejournal, @inmotionhosting

Thankfully, there are plenty of steps you can take to protect your WordPress website.

Easy WordPress Security Basics

When setting up your WordPress site security, there are some basic things you can do to beef up your protection.

Below, we will take a look at some of the first things you should do to help protect your website.

1. Implement SSL Certificates

Secure Sockets Layer (SSL) certificates are a standard technology that establishes an encrypted connection between a web server (host) and a web browser (client). This connection ensures all data passed between the two remains private and integral.

SSL certificates are an industry standard used by millions of websites to protect their online transactions with their customers, and obtaining one should be one of the first steps you take to secure your website.
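
Once a certificate is installed, you can also tell WordPress to require HTTPS for logins and admin sessions. Here is a minimal sketch for wp-config.php, assuming a valid certificate is already set up on your server:

    /* Force HTTPS for the WordPress login and admin area.
       Assumes a valid SSL certificate is already installed. */
    define( 'FORCE_SSL_ADMIN', true );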

2. Require & Use Strong Passwords

Along with obtaining an SSL certificate, one of the very first things you can do to protect your site is use strong passwords for all your logins.

It might be tempting to create or reuse a familiar or easy-to-remember password, but doing so puts both you and your website at risk. Improving your password strength and security decreases your chances of being hacked. The stronger your password, the less likely you are to be a victim of a cyberattack.

When creating a password, there are some general password best practices you should follow.

If you aren’t sure whether you are using a strong enough password, you can check the strength of one using a free tool like this helpful Password Strength Checker.
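
If you’d rather run a rough programmatic check than paste passwords into a web tool, a PHP sketch like the one below tests length and character variety. The looks_strong() helper is hypothetical and only a heuristic, not a substitute for a real strength estimator:

    /* Rough password sanity check: minimum length plus character variety.
       A heuristic sketch only, not a real strength estimator. */
    function looks_strong( string $password ): bool {
        $classes  = preg_match( '/[a-z]/', $password );
        $classes += preg_match( '/[A-Z]/', $password );
        $classes += preg_match( '/[0-9]/', $password );
        $classes += preg_match( '/[^a-zA-Z0-9]/', $password );
        return strlen( $password ) >= 12 && $classes >= 3;
    }

    var_dump( looks_strong( 'correct-Horse-Battery-9' ) ); // bool(true)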

3. Install A Security Plugin

WordPress plugins are a great way to quickly add useful features to your website, and there are several great security plugins available.

Installing a security plugin can add some extra layers of protection to your website without requiring much effort.

To get you started, check out this list of recommended WordPress security plugins.

4. Keep WordPress Core Files Updated

As of 2024, there are an estimated 1.09 billion total websites on the web, with more than 810 million of those sites using WordPress.

Because of its popularity, WordPress websites are oftentimes a target for hackers, malware attacks, and data thieves.

Keeping your WordPress installation up to date at all times is critical to maintain the security and stability of your site.

Every time a WordPress security vulnerability is reported, the core team starts working to release an update that fixes the issue.

If you aren’t updating your WordPress website, then you are likely using a version of WordPress that has known vulnerabilities.

Since the introduction of automatic updates, there is no excuse for running an outdated version of WordPress.

Don’t leave yourself open to attack by using an old version of WordPress. Turn on auto updates and forget about it.
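
If your host doesn’t handle this for you, one way to opt in to all core updates (major releases included, not just the minor security releases that are automatic by default) is a single constant in wp-config.php. A minimal sketch:

    /* Opt in to automatic updates for all WordPress core releases. */
    define( 'WP_AUTO_UPDATE_CORE', true );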

If you would like an even easier way to handle updates, consider a Managed WordPress solution that has auto updates built in.

5. Pay Attention To Themes & Plugins

Keeping WordPress updated ensures your core files are in check, but core updates might not protect other vulnerable areas, such as your themes and plugins.

For starters, only ever install plugins and themes from trusted developers. If a plugin or theme wasn’t developed by a credible source, you are probably safer not using it.

On top of that, make sure to update WordPress plugins and themes. Just like an outdated version of WordPress, using outdated plugins and themes makes your website more vulnerable to attack.
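
Recent WordPress versions let you toggle auto-updates per plugin and theme from the dashboard. If you’d rather enable them globally, here is a minimal sketch using WordPress’s built-in filters, added to your theme’s functions.php or a small must-use plugin:

    /* Enable automatic background updates for all plugins and themes. */
    add_filter( 'auto_update_plugin', '__return_true' );
    add_filter( 'auto_update_theme', '__return_true' );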

6. Run Frequent Website Backups

One way to protect your WordPress website is to always have a current backup of your site and important files.

The last thing you want is for something to happen to your site when you don’t have a backup.

Back up your site, and do so often. That way, if something does happen to your website, you can restore a previous version of it and quickly get back up and running.

Intermediate WordPress Security Measures That Add More Protection

If you’ve completed all the basics but you still want to do more to protect your website, there are some more advanced steps you can take to bolster your security.

Let’s take a look at what you should do next.

7. Never Use The “Admin” Username

Never use the “admin” username. Doing so makes you susceptible to brute force attacks and social engineering scams.

Because “admin” is such a common username, it is easily guessed, which makes it much easier for scammers to trick people into giving away their login credentials.

Much like having a strong password, using a unique username for your logins is a good idea because it makes it much harder for hackers to crack your login info.

If you are currently using the “admin” username, change your WordPress admin username.

8. Hide Your WP Admin Login Page

On top of using a unique username, another thing you can do to protect your login credentials is hide your WordPress admin login page with a plugin like WPS Hide Login.

By default, a majority of WordPress login pages can be accessed by adding “/wp-admin” or “/wp-login.php” to the end of a URL. Once a hacker or scammer has identified your login page, they can then attempt to guess your username and password in order to access your Admin Dashboard.

Hiding your WordPress login page is a good way to make yourself a harder target.

9. Disable XML-RPC

WordPress uses an implementation of the XML-RPC protocol to extend functionality to software clients.

Most users don’t need WordPress XML-RPC functionality, and it’s one of the most common vulnerabilities that opens sites up to exploits.

That’s why it’s a good idea to disable it. Thanks to the Wordfence Security plugin, it is really easy to do just that.
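
If you’d rather not add another plugin, WordPress also exposes a filter that switches off the XML-RPC methods requiring authentication, which closes the common brute-force vector. A minimal sketch (note that some tools, such as the official WordPress mobile apps, still rely on XML-RPC):

    /* Disable XML-RPC methods that require authentication. */
    add_filter( 'xmlrpc_enabled', '__return_false' );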

10. Harden wp-config.php File

The process of adding extra security features to your WordPress site is sometimes known as “hardening” because you are essentially giving your site some extra armor against hackers.

You can “harden” your website by protecting your wp-config.php file via your .htaccess file. Your WordPress wp-config.php file contains very sensitive information about your WordPress installation including your WordPress security keys and the WordPress database connection details, which is exactly why you don’t want it to be easy to access.
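
As a sketch of that hardening, the following .htaccess rules deny all web requests for wp-config.php (this uses Apache 2.4 syntax; on Apache 2.2 you would use “Order allow,deny” with “Deny from all” instead):

    # Block all HTTP access to wp-config.php (Apache 2.4+).
    <Files wp-config.php>
        Require all denied
    </Files>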

11. Run A Security Scanning Tool

Sometimes your WordPress website might have a vulnerability that you had no idea existed. That’s why it’s wise to use some tools that can find vulnerabilities and even fix them for you.

The WPScan plugin scans for known vulnerabilities in WordPress core files, plugins and themes. The plugin also notifies you by email when new security vulnerabilities are found.

Strengthen Your Server-Side Security

So you have taken all of the above measures to protect your website, but you still want to know if there is more you can do to make it as secure as possible.

The remaining actions you can take to beef up your security will need to be done on the server side of your website.

12. Look For A Hosting Company That Does This

One of the best things you can do to protect your site from the very get-go is to choose the right hosting company to host your WordPress website.

When looking for a hosting company, you want to find one that is fast, reliable, and secure, and will support you with great customer service.

That means they should offer robust server resources, maintain an uptime of at least 99.5%, and use server-level security tactics.

If a host can’t check those basic boxes, they are not worth your time or money.

13. Use The Latest PHP Version

Like old versions of WordPress, outdated versions of PHP are no longer safe to use.

If you aren’t on the latest version of PHP, upgrade your PHP version to protect yourself from attack.
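
WordPress’s Site Health screen (Tools > Site Health) reports the PHP version your server runs. If you want a programmatic check, a minimal sketch follows; the 8.1 floor here is an assumption, so check php.net for the currently supported branches:

    /* Log a warning if the server's PHP version is below a chosen floor.
       The 8.1 floor is an assumption; verify current support on php.net. */
    if ( version_compare( PHP_VERSION, '8.1', '<' ) ) {
        error_log( 'Outdated PHP version ' . PHP_VERSION . ' detected; plan an upgrade.' );
    }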

14. Host On A Fully-Isolated Server

Fully isolated virtual private servers offer many advantages, and one of those advantages is increased security.

The physical isolation offered by a cloud-based VPS is inherently secure, protecting your website against cross-infection from other customers. Combined with robust firewalls and DDoS protection, your data remains secure against potential threats and vulnerabilities.

Looking for the perfect cloud environment for your WordPress website? Look no further.

With InMotion Hosting’s Platform i, you receive unparalleled security features including managed server updates, real-time security patching, web application firewalls, and DDoS prevention, along with purpose-built high-availability servers optimized for fast and reliable WordPress sites.

15. Use A Web Application Firewall

One of the final things you can do to add extra security measures to your WordPress website is use a web application firewall (WAF).

A WAF is usually a cloud-based security system that offers another layer of protection around your site. Think of it as a gateway for your site. It blocks hacking attempts and filters out other types of malicious traffic, like distributed denial-of-service (DDoS) attacks or spam.

WAFs usually require monthly subscription fees, but adding one is worth the cost if you place a premium on your WordPress website security.

Make Sure Your Website & Business Are Safe & Secure

If your website is not secure, you could be leaving yourself open to a cyberattack.

Thankfully, securing a WordPress site doesn’t require too much technical knowledge as long as you have the right tools and hosting plan to fit your needs.

Instead of waiting to respond to threats once they happen, you should proactively secure your website to prevent security issues.

That way if someone does target your website, you are prepared to mitigate the risk and go about your business as usual instead of scrambling to locate a recent backup.

Get Managed WordPress Hosting featuring robust security measures on high-performance servers, complete with free SSL, dedicated IP address, automatic server updates, DDoS protection, and included WAF.

Learn more about how Managed WordPress Hosting can help protect your website and valuable data from exposure to hackers and scammers.

Big Brands Receive Site Abuse Manual Actions via @sejournal, @martinibuster

Google indicated that manual actions were coming for webpages that host third-party content, and according to some, the effects of those manual actions may already be showing up in the search results.

Site Reputation Abuse Manual Actions

Google’s SearchLiaison tweeted late on May 6th that Google was enforcing the new site reputation abuse policy with manual actions. Manual actions are when someone at Google inspects a webpage to determine if the page is in violation of a spam policy.

The Site Reputation Abuse policy affects sites that host third-party content that is published with little to no oversight from the hosting website. The purpose of the arrangement is for the third party to take advantage of the host site’s reputation so that both receive a share of affiliate sales. An example could be a news website that hosts coupon code content entirely created by a third party.

What Are Manual Actions?

A manual action is when a human at Google visually inspects a website to determine if it engaged in violations of Google’s spam policies. The result of a manual action is typically, but not always, a removal from Google’s search index. Sometimes the offending webpages are completely removed, and sometimes they are only prevented from ranking.

Sites With Manual Actions

Google communicates to the site publisher if a site has been issued a manual action. Only the site publisher and those with access to the website’s Search Console account are able to know. Google generally doesn’t announce which sites have received a manual action. So unless a site has completely disappeared from Google Search, it’s not possible to say with any degree of certainty whether a site has received a manual action.

The fact that a webpage has disappeared from Google’s search results is not confirmation that it has received a manual action, especially if other pages from the site can still be found.

It’s important, then, to understand that unless a website or Google publicly acknowledges a manual action, anyone on the outside can only speculate about whether a site has received one. The only exception is when a site is completely removed from the search index, in which case there’s a high probability that the site has indeed been penalized.

Big Brands Dropped From Search Results

It can’t be said with certainty that a site received a manual action if the page is still in the search index. But Aleyda Solis noticed that some big brand websites have recently stopped ranking for coupon related search queries.

Aleyda shared screenshots of coupon-related search results before and after the Site Abuse policies were enforced. Her tweets showed screenshots of sites that were no longer ranking. In her screenshots, sites that appear to have removed their coupon webpages are highlighted in red, while sites that still hosted coupon pages but were no longer ranking in the search results are highlighted in orange.

It should be noted that Aleyda does not accuse any site of having received a manual action. She only shows that some sites are no longer ranking for coupon code search queries.

Aleyda tweeted:

“Google has already started taking action for the new site reputation abuse policy 👀👇 See the before/after for many of the most popular “promo code(s)” queries:

* carhartt promo code
* postmates promo code
* samsung promo code
* godaddy promo code

Sites that were ranking before and not anymore:

* In Orange (with still existing coupon sections): Cnet, Glamour, Reuters, USA Today, CNN, Business Insider
* In Red (with removed coupon sections): LA Times, Time Magazine, Wired, Washington Post”

Did Reuters Receive A Manual Action?

The global news agency Reuters formerly took the number one ranking spot for the keyword phrase “GoDaddy promo code” (as seen in the “before” screenshot posted by Aleyda to Twitter).

But Reuters is completely removed from the search results today.

Did the Reuters GoDaddy page receive a manual action? Manual actions typically result in a webpage’s complete removal from Google’s search index, but that’s not the case with the Reuters GoDaddy coupon page. A site search shows that the GoDaddy coupon page is still in Google’s index. It’s just not ranking anymore.

Reuters Coupon Page Remains In Search Index

Screenshot of search results

It’s hard to say with certainty if the Reuters page received a manual action but what is clear is that the page is no longer ranking, as Aleyda correctly points out.

Did Reuters GoDaddy Page Violate Google’s Spam Policy?

Google’s Site Reputation Abuse policy says that a characteristic of site reputation abuse is the lack of oversight of the third party content.

“Site reputation abuse is when third-party pages are published with little or no first-party oversight or involvement…”

Reuters’ current GoDaddy page contains a disclaimer that asserts oversight over the third-party content.

This is the current disclaimer:

“The Reuters newsroom staff have no role in the production of this content. It was checked and verified by the coupon team of Reuters Plus, the brand marketing studio of Reuters, in collaboration with Upfeat.”

Reuters’ disclaimer shows that there is first-party oversight, which indicates that Reuters is in full compliance with Google’s spam policy.

But there’s a problem. There was a completely different disclaimer prior to Google’s Site Reputation Abuse policy announcement. This raises the question of whether Reuters changed its disclaimer in order to give the appearance of oversight.

Fact: Reuters Changed The Disclaimer

The current disclaimer on the Reuters coupon page asserts that there was some oversight of the third-party content. If that’s true, then Reuters complies with Google’s spam policy.

But from March 11, 2024, and earlier, Reuters published a disclaimer that clearly disavowed involvement with the third-party content.

This is what Google’s site reputation abuse policy says:

“Site reputation abuse is when third-party pages are published with little or no first-party oversight or involvement…”

And this is the March 11, 2024 disclaimer on the Reuters coupon page:

“Reuters was not involved in the creation of this content.”

Reuters Previously Denied Oversight Of 3rd Party Content

Screenshot of Reuters' previous disclaimer that disavows involvement in third party coupon content

Reuters changed its disclaimer about a week after Google’s core update was announced. Prior to Google’s spam policy announcement, that disclaimer had always distanced Reuters from involvement.

This is their 2023 disclaimer on the same GoDaddy Coupon page:

“This service is operated under license by Upfeat Media Inc. Retailers listed on this page are curated by Upfeat. Reuters editorial staff is not involved.”

Why did that disclaimer change after Google’s Site Reputation Abuse announcement? If Reuters is in violation, did it receive a manual action but get spared from having those pages removed from Google’s search index?

Manual Actions

Manual actions can result in a complete removal of the offending webpage from Google’s search index. That’s not what happened to Reuters and the other big brand coupon pages highlighted by Aleyda, so it’s possible that the big brand coupon pages only received a ranking demotion and not the full-blown de-indexing that is common for regular sites. Or it could be that the demotion of those pages in the rankings is complete coincidence.

Featured Image by Shutterstock/Mix and Match Studio

Gravity: How To Cultivate Focus And Execute Better via @sejournal, @Kevin_Indig

Focus is mission-critical, but most companies don’t lower the gravitational forces pulling on attention.

As a result, large companies with too much mass have a hard time navigating and adapting to the quickly changing organic growth landscape:

  • Google’s algorithm updates have struck hard and left casualties.
  • Even small rank changes in top positions have an outsized impact.
  • Consumer behavior on the internet is messy and hard to track.
  • Teams are shrinking: 80,000 tech workers have been laid off in 2024 so far.
  • Marketing budgets were slashed on average by 10 to 20% over the last two years and only slowly started to recover.
  • Tech workers spend two out of five days per week on meetings and email. Only 43% of the time is spent on actual tasks.

No platform changes its requirements as often as Google. Over the last three years, Google launched eight core, 19 major, and 75-150 minor updates. The company mentions thousands of improvements every year.

As individuals, we live in a distracted world, where one of the most important skills is managing attention. Why would we think teams are any different?

Image Credit: Lyna ™

No Win Without Focus

Sir Isaac Newton realized that the sun’s gravity causes planets to orbit it in an elliptical path. Gravity in the workplace is whatever distracts individuals and teams from their focus:

  • Meetings.
  • Fire drills.
  • Red tape.
  • Strategy pivots.
  • Too many goals.
  • Alignment overhead.
  • Too many cooks in the kitchen.
  • Procurement and legal rabbit holes.
  • Non-critical emails and Slack messages.

The larger a company gets, the stronger its gravity.

Taken to an extreme, it takes companies forever to launch even a single feature, and they fall behind the competition.

For individuals, gravity is even more consequential: Scattered attention means getting nothing done, having no impact, and likely being fired. Worse, people get exhausted and burned out in the process.

“Tranquility comes from doing less” (Ryan Holiday), but a lot of teams execute in a scattered way, like a teenager multitasking between Netflix, TikTok, and texting.

Individual and team focus are joined at the hip. When a team is distracted, that distraction transfers to individuals. Two-thirds of people struggle to find the energy to do their job.

Whenever I get overwhelmed, my brain tells me to open my email inbox and look for a quick dopamine hit. But finding quick tasks and busy work is no achievement.

Real impact comes from working through tedious, complex problems.

We cannot erase gravity, but we can do five things better:

  1. Communication.
  2. Prioritization.
  3. Strategy.
  4. Red Tape.
  5. Meetings.

Better Communication

Unclear communication is one of the biggest attention drainers. We waste a lot of time deciphering what other people mean.

At Shopify, we had a very high bar for what internal communications went out to the Growth org and how they would be framed.

It’s easy to @ your whole team on Slack, but what people really need is key information:

  • What’s going on?
  • How is it relevant to me?
  • What do I need to know/do?

Lazy communication has a massive speed cost. In the book “Smart Brevity,” the authors provide a simple framework for writing clear statements:

  1. Start with a muscular tease that grabs attention with six or fewer strong words.
  2. Explain what recipients need to know in the first sentence.
  3. Explain why it matters.
  4. Offer a choice to go deeper by providing more optional context.

Most important: Think about one thing you want people to remember – not more. Nobody has time to read a Slack novel.

Better Prioritization

At PayPal, Peter Thiel established a culture of hardcore focus. He would only discuss their No. 1 priority with managers and hold them accountable for just their one main contribution to the company.

Focus is a forcing function to separate the wheat from the chaff when it comes to what to spend time on.

No one could be a better example of hardcore prioritization than engineers. If you want to get something on the ENG roadmap, something else has to give.

An effective roadmap operating system has only a few lines of code:

  • What’s the goal?
  • What are the top 3 things that get us there?
  • For those three things, what do we need in terms of people, assets, time, support from other teams, and tools?
  • For those three things, who does what, by when?
  • Defend yourself and your team as much as possible from anything else.

You never look back at your time at a company and say, “Man, my fourth, fifth, and sixth priority back then really hit home,” but you might remember the impact of priorities one, two, and three.

Cal Newport’s new book, “Slow Productivity,” mentions doing less as one of the top ways to do better work.

But the advice I like the most is doubling the time you think a project takes.

Doubling automatically trims your roadmap by probably 50% but makes it more likely that you deliver on time and deliver well.

A big part of moving ourselves into an overcommitment corner is underestimating how long projects take (I think I wrote the last sentence at least as much for myself as for you).

Better Strategies

Poor strategies are hard to follow and confuse the team.

In my experience, managers want to get fancy, but they miss the most important point: A good strategy means doing something different from the competition.

Instead of outworking contenders, you want to do something that’s unique and leans into your competitive advantage.

Pairing differentiation with prioritization, your three most important projects should underline how you achieve the same goal in a different way than your competitors.

For example, instead of writing 100 blog articles, can you build a programmatic play or a list of tools? Or can you leverage contributors who write the content instead of a large in-house team?

I also found that most strategies simply aren’t clear. A simple test for clarity is to up- or downsize the surface you have to explain it: Can you express your strategy in one paragraph (TL;DR), one page (high-level), and one doc (in-depth)?

Less Red Tape

Red tape in the form of excessive bureaucracy kills execution. I’ve seen many companies take weeks and endless alignment meetings before being able to sign up for a simple SaaS tool.

Procurement and legal teams can slow companies down and frustrate teams beyond measure.

The key to having both speed and a good evaluation process is clear guidelines for when legal or procurement steps in.

With one of my former clients, the fastest-growing fintech startup in history, we sat down with the legal team and got a full download on guardrails: What can we say, and what can’t we? When do we have to get legal approval, and when can we move forward without it?

This is a task for the team manager or org leader. While tedious, the good news is that once the borders have been established, teams can move forward faster and focus on execution.

Fewer Meetings

Tobi Lütke, founder and CEO of Shopify, called meetings a “bug.” The leadership team regularly deployed the “Chaos Monkey,” a script that deletes all recurring meetings with more than two participants. Other companies set guardrails around time: Calendly, for example, restricts meetings to between noon and 5 p.m.

Most meetings are poorly run, unnecessary, or simply a way for people to socialize.

Besides an agenda, every meeting should have a clear purpose. There really are only three types of meetings: socializing, information sharing, and decision-making.

Building relationships in the workplace is important, and there is nothing wrong with socializing. It’s important to be explicit and avoid meeting to “talk about Project X” while really wanting to socialize.

Information-sharing meetings are best done async. Instead of getting a large group of people together, record your message in a video or write a memo.

Decision-making meetings should be led by the decision maker and come with a pre-read.

The problem with many large organizations is that decisions are poorly framed; it’s unclear who makes the decision, and the decision-maker doesn’t have explicit criteria for how to make the decision.

Outlook: Can AI Help Us Regain Focus?

Show me how focused your team is, and I’ll show you a team that will win.

High gravity in large organizations, on the other hand, is an invitation to be disrupted by a smaller, more agile player. The good news is that technology is working against gravity – at least in the workplace.

AI has the potential to help us find fragmented information, force clarity, and take over bland admin tasks that drain time so we can focus on things that matter.

Microsoft’s Future of Work report concludes:

“Organizational knowledge is fragmented across documents, conversations, apps and devices, but LLMs hold the potential to gather and synthesize this information in ways that were previously impossible.”

In the future, we’ll be able to ask LLMs questions about any internal process, like “What are our top goals?” or “Does this need legal review?” The freed-up time allows us to refine our strategies and get work done.

That future still seems a few years away. Until then, we can do a lot to improve our attention.


Sources:

  • Layoffs.fyi
  • Marketing spending shows signs of growth, but AI adoption is slow: report
  • Beyond belt-tightening: How marketing can drive resiliency during uncertain times
  • Will AI Fix Work?
  • Workers Now Spend Two Full Days a Week on Email and in Meetings
  • Microsoft New Future of Work Report 2023


Featured Image: Paulo Bobita/Search Engine Journal

Google SGE Organic Traffic Impact Divided By Verticals [Data Study] via @sejournal, @bart_goralewicz

We analyzed 500k queries to find patterns and triggers enabling SGE results.

We started to research SGE in June 2023 and have conducted bi-weekly scrapes of 100-500k keywords along with data analysis from our R&D team.

From this data, we have in-depth findings about SGE development and ranking factors.

While verticals are not the only aspect that affects whether we’ll see generative AI in search result pages or not, the vertical’s topic is the strongest of all the factors we’ve measured.

Although Google’s Search Generative Experience (SGE) is not showing up for all of the user queries, the rollout of SGE will have a different impact on your organic traffic acquisition, depending on the vertical in which your business operates.

Let’s start with the statistics and what we found.

Vertical vs. SGE answer data (Image from author, April 2024)

As you can see above, the verticals differentiate significantly. This is a key data point when starting your SGE optimization journey.

While most verticals seem to be in the 80-90% range, the financial vertical is a strong outlier, with 47% SGE coverage compared to the beauty industry, which has 94% query coverage.

Before I dive into more exciting findings from our research, I wanted to stop for a second and address the elephant in the room: Google SGE and YMYL & E-E-A-T.

Google SGE, YMYL & E-E-A-T

Google began to hold websites to a higher standard around 10 years ago, and the guidelines for so-called “Your Money, Your Life” (YMYL) content became stricter year after year.

Two of the verticals (health and finance) with the lowest Google SGE coverage fall into that category. However, what came as a shock to me is that despite being one of the lowest-covered categories, health-related queries were still covered almost 80% of the time.

Google SGE presents AI-generated answers to queries like [cure for cancer].

Screenshot from search for [cure for cancer], Google, April 2024

Not only is Google SGE using AI to cover YMYL queries, but our research suggests that Google’s AI search results will be present more often in the future.

Google SGE Will Cover More Queries Over Time

Our early Google SGE research on July 1, 2023 revealed that the finance/investing vertical was only 22% covered. Our check in April 2024 showed 47% coverage of the same group of queries.

We have been tracking Google SGE coverage since its inception and observing Google’s efforts to increase SGE coverage month after month.

Based on these trends, my prediction is that we’ll see more Google SGE results as soon as Google gathers more data, allowing for a higher certainty of high-quality results with lower chances of AI hallucinations.

Understanding The “Generate” Button And How You Can Use It To Prioritize Your SGE Optimization

If you’ve been using Google SGE for a while, you’ve seen this button multiple times. In some cases, Google will show a “Generate” button rather than showing Google SGE results by default.

Screenshot by Gyazo.com from search for [trek road bike], Google, April 2024

After looking at the ratio of the default Google SGE layouts compared to the ones showing the generate button, we realized that this ratio is different for each of the verticals analyzed.

How Often Is Google SGE The Default SERP View?

The ratios of auto-generated Google SGE compared to the ones with the generate button depend on the vertical you are in. The graph below will show you the ratio for your set of queries.

Why is this important? Auto-generated Google SGE results don’t require any action on the user’s end to cover the whole viewport on both mobile and desktop. Based on this SGE behavior, I expect auto-generated SGE views to disrupt the search experience, effectively pushing the “traditional” Google SERPs down. To be exact, 905 px down (the average height of a Google SGE response in our research).

Vertical percentage of auto-generated Google SGE results (Image from author, April 2024)

The health vertical is, shockingly, an outlier, with 68% of queries returning Google SGE automatically and only 9% showing the generate button.

On the other side of the spectrum, hotels and entertainment have only 4% to 6% auto-generated Google SGE pages, with 77% of Google SGE queries showing the generate button for each of these verticals.

The impact of Google SGE will depend on multiple metrics, with SGE coverage being one of them.

However, we can assume that the queries with auto-generated AI responses will be affected sooner than the ones returning the generate button.

Let’s see how you can use this data to prepare your business for this shift.

Data Over Time

We’ve tracked the manually and auto-generated ratios for a diverse set of queries since June last year, and the ratios changed in the following way:

  • June 2023: 58% (auto) – 13% (manually).
  • November 2023: 48% – 31%.
  • March 2024: 46% – 27%.

How To Use This Data?

Estimating Google SGE’s Impact On Your Brand

Prioritization is key to estimating Google SGE’s impact on your brand.

Looking at the data above, the health industry will be impacted more quickly than any other on the list, as users will automatically be presented with a Google SGE response.

Hotels, entertainment, real estate, and food verticals will have more time to prepare, with 4-12% of queries returning Google SGE automatically.

Action Point #1

  • Gather Google SGE data for queries that cover 80-90% of your search volume (you can use a free demo of ZipTie.dev to run mass checks of Google SGE coverage).
  • What percentage of your traffic and queries comes from queries returning auto-generated Google SGE vs. manually generated SGE? (See the sketch after this list.)
  • Is there a pattern between these two groups?
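
As a sketch of how that split might look once you’ve exported your tracking data, the PHP snippet below buckets queries from a hypothetical sge_data.csv file (columns: query, sge_type, monthly_visits, where sge_type is auto, button, or none) and reports each bucket’s share of queries and traffic:

    /* Bucket queries by SGE type from a hypothetical sge_data.csv export
       and report each bucket's share of queries and traffic. */
    $rows = array_map( 'str_getcsv', file( 'sge_data.csv' ) );
    array_shift( $rows ); // Drop the header row.

    $counts  = [ 'auto' => 0, 'button' => 0, 'none' => 0 ];
    $traffic = [ 'auto' => 0, 'button' => 0, 'none' => 0 ];

    foreach ( $rows as [ $query, $type, $visits ] ) {
        $counts[ $type ]  += 1;
        $traffic[ $type ] += (int) $visits;
    }

    $total = array_sum( $counts );
    foreach ( $counts as $type => $count ) {
        printf( "%s: %.1f%% of queries, %d monthly visits\n",
            $type, 100 * $count / $total, $traffic[ $type ] );
    }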

Query-Based Google SGE Impact And Prioritization

SEO teams all around the world are shifting their efforts to encompass Google SGE optimization.

Dividing your queries into auto-generated and generated-after-the-click buckets allows you to focus on auto-generated Google SGE queries first and to track this ratio over time, as our research shows that Google SGE is covering more queries automatically.

Action Point #2

  • After dividing your queries into two buckets – auto-generated and manually generated SGE – focus on optimizing the queries in the auto-generated bucket.
  • Track Google SGE’s coverage in the manually generated bucket (over time, Google will show more auto-generated SGE SERPs).

Once you have a clear set of optics for your queries and know SGE’s impact on your brand, it’s time to look into the sources.

Spoiler alert: The number of Google SGE sources also depends on the vertical.

SGE Sources And Why Hotels May Get Up To 40% Fewer SGE Clicks Compared To Other Verticals

You can only get organic traffic from Google SGE when users click on the source of the AI-generated content. This is where it gets interesting. Depending on the vertical you are in, the number of sources varies from three to five.

This 40% to 50% difference will make Google SGE optimization more difficult, especially in the finance or hotel vertical, where Google SGE shows an average of around three sources per query.

The data below shows an average number of Google SGE sources quoted on the search engine result page.

Interestingly, some of the queries do not return any sources (Google is generating the answer based on its internal database).

Average number of SGE sources per industry (Image from author, April 2024)

SGE results that mention fewer content sources have a higher chance of generating traffic due to higher average CTR (click-through rate). At the same time, these queries will be more difficult to optimize for due to limited real estate.

This makes queries with a limited number of sources a bit more complex to track and strategize for.

Google SGE Will Not Show Any Sources For 5% of The Queries

Our research shows that 5% of the queries will return Google SGE results with no external sources quoted. This topic is one of the most interesting and also most complex in our research, both because of the risk of Google taking over a ton of traffic and because Google seems to be using different logic for zero-source SGE across verticals and types of queries. We are still running checks and experiments; to understand this feature, we need to track it over time and spot more queries that fall into the zero-sources bucket. I’ll keep you posted.

Carousel Rankings – To The Top Left

We are all used to aiming for the top of SERPs. Google SGE results with more than three sources will come with a challenge that’s new to us SEO professionals: horizontal scrolling.

Being included within the first three sources (counting from the left to the right side) will mean that your URL is visible without scrolling horizontally, leading to higher CTRs and more impressions.

Action Points

  • Track the number of SGE sources for all of your queries.
    • Prioritize your SGE optimization efforts for queries with the least amount of sources (higher returns from higher CTRs).
    • For queries with four or more sources, track carousel positions and aim to be included in the top three sources to gain better visibility and boost CTRs.

Google SGE Sources Vs. Traditional Rankings

Google SGE operates on a brand-new algorithm, and your current rankings will not translate to Google SGE one-to-one.

Shockingly, these data points are also completely different when analyzed per vertical.

% of SGE sources not ranking in Top 10 (Image from author, April 2024)

Industry Winners

While analyzing our study, we found that Google SGE seems to favor a few specific domains per vertical. In many cases, the key factors driving these outliers were lean technology combined with content structure focused on answering the query in a concise and clear manner.

Final Thoughts

With Bing rapidly becoming a co-pilot rather than a search engine and the Jeff Bezos-backed Perplexity threatening Google, the SEO industry needs to adjust to this never-before-seen pace.

I believe that navigating this change without overwhelm and chaos is only possible with a proper set of optics and wisely chosen battles.

I hope the data I shared in this article can help you prioritize and make better strategic decisions. We are going to share more insights into each vertical over the next few months here on SEJ.

If there’s anything else you’d like to know or understand about how Google SGE is reshaping organic search, reach out to me, and I’ll do my best to get the data needed to help our industry during this time of transition and change.

Happy optimizing!

Featured Image: BestForBest/Shutterstock

Hit By The Core Algorithm? 5 Factors To Be Aware Of via @sejournal, @martinibuster

Many factors can affect rankings after a core algorithm update. It’s not always about the helpfulness of content; there are other factors that can play a role in why the algorithm changed and negatively affected your website rankings.

If you find yourself saying, “It used to rank before, why doesn’t it rank now?” then some of these factors may be something to consider.

1. Algorithmic Losses Are Not Necessarily Persistent

Sites hit by the core algorithm update (which includes the Helpful Content part) do not have a permanent strike against them. Over the past ten years, Google has rolled out complicated algorithms and systems that can take months between update cycles, leaving affected sites unable to find a quick path back to the search results. While that’s not a permanent mark, it can feel as if the site has acquired a curse that marks it as no good and permanently excludes it.

Google’s John Mueller answered a question where he confirmed that getting caught in a Core Algorithm Update is not persistent and with work a site can recover from being hit by an update.

Someone asked on X (formerly Twitter):

“Can a site hit by HCU grow again in terms of traffic if it improves in quality? Many fear that no matter the amount of improvements we make a HCU hit site will forever have a classifier assigned to it that keeps it from growing again.”

John Mueller responded:

“Yes, sites can grow again after being affected by the “HCU” (well, core update now). This isn’t permanent. It can take a lot of work, time, and perhaps update cycles, and/but a different – updated – site will be different in search too.”

2. Recovering Is Not The Right Word

A lot of people think of recovering from an update as resetting the rankings so that websites regain positions to a previous state. John Mueller’s answer on X suggests that publishers can understand algorithmic effects as something that requires adjusting a website to fit into an evolving web, including user expectations.

Mueller tweeted:

“Permanent changes are not very useful in a dynamic world, so yes. However, “recover” implies going back to just-as-before, and IMO that is always unrealistic, since the world, user-expectations, and the rest of the web continues to change. It’s never “just-as-before”.”

This statement seems to imply that, to a certain degree, algorithmic updates reflect what users expect to see in the search results. A way to understand this is with the example of Google’s Medic Update from a few years back. That update reflected a realignment of the search results with what users expect to see when making certain queries. After the Medic update, search queries for medical topics required search results with a scientific approach. Sites that reflected folk remedies and unscientific approaches did not fit that updated definition of relevance.

There are subtle variations to this realignment of search results that go directly to answering the question: What do users mean when they ask a search query? Sometimes relevance means informational sites, while for other queries it may mean review sites are what users expect to see.

So if your site is hit by a core algorithm update, revisit the SERPs, try to determine what the new SERPs mean in terms of relevance, and self-assess whether your site meets that new definition of relevance.

Circling back to Mueller’s response, there is no “going back to just-as-before” and that may be because there has been a subtle shift in relevance. Sometimes the fix is subtle. Sometimes getting back into the search engine results (SERPs) requires a major change in the website so that it meets with user expectations.

3. Thresholds And Ranking Formulas

Another interesting point that Mueller discussed is the difference between an ongoing algorithmic evaluation and the more persistent effects from a ranking system that requires an update cycle before a site can recover.

Someone asked:

“The simple question is whether you need to wait for a new core update to recover from the HCU. A simple “yes” or “no you can recover anytime” would suffice.”

John Mueller answered:

“It’s because not all changes require another update cycle. In practice, I’d assume that stronger effects will require another update. Core updates can include many things.”

Then continued with these interesting comments:

“For example, a ranking formula + some thresholds could be updated. The effects from the updated formula are mostly ongoing, the changes to thresholds often require another update to adjust.

…(“thresholds” is a simplification for any numbers that need a lot of work and data to be recalculated, reevaluated, reviewed)”

The above means there are two kinds of effects that can hit a site. One is part of a continually updated ranking formula that can quickly reflect changes made to a site. These used to be called rolling updates, where the core algorithm can make relatively instant evaluations about a site and boost or demote its rankings.

The other kind of algorithmic issue is one that requires a massive recalculation. This is what the HCU and even the Penguin algorithm used to be like until they were folded into the core algorithm. They were massive calculations that seemed to assign scores that were only updated on the following cycle.

4. The Web & Users Change

In another recent exchange on X, John Mueller affirmed that a key to success is keeping track of what users expect.

He tweeted:

“…there is no one-shot secret to long-lasting online success. Even if you find something that works now, the web, user desires, and how they engage with websites changes. It’s really hard to make good, popular, persistent things.”

That statement offers these concepts to keep in mind for online success:

  • The Internet
  • User desires
  • How users engage with websites
  • Popularity is not persistent

Those are not algorithm factors. But they could be things that Google picks up on in terms of understanding what users expect to see when they make a search query.

What users expect to see is my preferred definition of relevance. That has practically zero to do with “semantic relevance” and everything to do with what users themselves expect. This is something that some SEOs and publishers trip over. They focus hard on what words and phrases mean and forget that what really matters is what they mean to users.

Mueller posted something similar in an answer about why a website ranks #1 in one country and doesn’t perform as well in another. He said that what users expect to see in response to a query can be different from country to country. The point is that it’s not about semantics and entities and other technical aspects but often search ranking relevance has a lot to do with the users.

He tweeted:

“It’s normal for the search results in countries to vary. Users are different, expectations may vary, and the web is also very different.”

That insight may be helpful for some publishers who have lost rankings in a core algorithm update. It could be that user expectations have changed and the algorithm is reflecting those expectations.

5. Page-Level Signal

Google’s SearchLiaison affirmed that the Helpful Content component of the core algorithm is generally a page-level signal but that there are sitewide ones as well. His tweet quoted the Helpful Content Update FAQ which says:

“Do Google’s core ranking systems assess the helpfulness of content on a page-level or site-wide basis?

Our core ranking systems are primarily designed to work on the page level, using a variety of signals and systems to understand the helpfulness of individual pages. We do have some site-wide signals that are also considered.”

Keep An Open Mind

It’s frustrating to lose rankings in a core algorithm update. I’ve been working in SEO for about 25 years and auditing websites since 2004. Helping site owners identify why their sites no longer rank has taught me that it’s useful to keep an open mind about what is affecting the rankings.

The core algorithm has a lot of signals, some of which pertain to helpfulness, while others relate to relevance to users, relevance to search queries, and plain site quality. So it may be helpful not to get stuck thinking that a site lost rankings because of one thing, because it could be something else or even multiple factors.

Featured Image by Shutterstock/Benny Marty

Google’s Mueller Outlines Path To Recovery For Sites Hit By Core Update via @sejournal, @MattGSouthern

Google’s Search Advocate John Mueller recently addressed the SEO community’s concerns about site recovery after being impacted by algorithm updates.

The conversation arose as people questioned whether sites hit by the September helpful content update could regain lost traffic and rankings after future core updates.

The exchange began on X when an SEO professional, Thomas Jepsen, asked Mueller if Google’s previous stance still held true – that the search engine “doesn’t hold a grudge” and sites will recover once issues are resolved.

Mueller confirmed, “That’s still the case,” but cautioned that “some things take much longer to be reassessed (sometimes months, at the moment), and some bigger effects require another update cycle.”

Addressing Lingering Confusion

Following Mueller’s statements, confusion persisted around whether sites hit by the helpful content update require a new core update to recover lost rankings.

Mueller clarified:

“… not all changes require another update cycle. In practice, I’d assume that stronger effects will require another update. Core updates can include many things.”

He likened core updates to adjustments in ranking formulas and thresholds, with the latter often necessitating another update cycle.

Dismissing Permanence Concerns

There’s concern that sites affected by the September helpful content update will be permanently classified, obstructing future growth.

Mueller addressed those concerns and affirmed that affected sites could regain traffic by improving quality.

However, Mueller says full recovery to pre-update levels is unrealistic.

He states:

“Permanent changes are not very useful in a dynamic world… However, ‘recover’ implies going back to just-as-before, and IMO that is always unrealistic, since the world, user-expectations, and the rest of the web continues to change. It’s never ‘just-as-before’.”

When asked directly if a site affected by the helpful content update can grow in traffic if it improves in quality, Mueller stated:

“Yes, sites can grow again after being affected by the ‘HCU’ (well, core update now). This isn’t permanent. It can take a lot of work, time, and perhaps update cycles, and/but a different – updated – site will be different in search too.”

The Long Road Ahead

Continuing the conversation on LinkedIn, Mueller stressed that the recovery process isn’t specific to helpful content updates or core updates but applies to all kinds of systems and updates within Google.

Mueller states:

“… to be clear, it’s not that “helpful content update” “recoveries” take longer than other updates. It’s just that some kinds of changes take a long time to build up, and that applies to all kinds of systems & updates in Google & in any other larger computer system. Saying that this is specific to the helpful content system, or to core updates would be wrong & misleading.”

Mueller acknowledged that the recovery process doesn’t have a single, straightforward solution and may require deep analysis and significant work to understand how to make a website relevant again.

“There is, however, the additional aspect of the “core update” being about how our systems assess content overall, how we consider it to be helpful, reliable, relevant to users’ queries. This does not map back to a single change that you can make on a website, so – in my experience – it’s not something that a website can just tweak overnight and be done with it. It can require deep analysis to understand how to make a website relevant in a modern world, and significant work to implement those changes — assuming that it’s something that aligns with what the website even wants.”

Lastly, he adds that a recovery will take more than fixing technical issues. It may require a realignment of business priorities.

“These are not “recoveries” in the sense that someone fixes a technical issue and they’re back on track – they are essentially changes in a business’s priorities (and, a business might choose not to do that).”

Why SEJ Cares

Google’s core algorithm updates can dramatically impact a website’s search visibility and traffic.

For sites negatively affected, clear guidance on recovery is critical – both for setting realistic expectations and charting a practical path forward.

Mueller’s insights reassure that improvement remains possible through strategic realignment with Google’s current quality standards.

How This Can Help You

Mueller’s insights allow impacted sites to set realistic expectations for recovery.

Regaining visibility remains possible with patience, thorough analysis, and persistent effort.

Mueller’s statements offer the following takeaways for sites impacted by Google’s updates:

  • Recovery isn’t out of the question but will require significant effort over multiple update cycles.
  • Simply restoring previous tactics is insufficient; sites must evolve to meet changing user needs and internet best practices.
  • Deep analysis is necessary to identify areas for improvement and realign content strategy with modern relevance signals.
  • Returning to previous ranking positions is unrealistic due to evolving user needs.

Featured Image: rudall30/Shutterstock

Google Confirms Site Reputation Abuse Update via @sejournal, @martinibuster

Google’s SearchLiaison confirmed that Google’s site reputation abuse update started on Monday May 6th. Many sites from across the web took down webpages that could be perceived as hosting third-party content for the purpose of ranking in search engines.

Site Reputation Abuse

An old strategy that’s made a recent comeback is one in which a marketer piggybacks their content on another publisher’s website in order to rank it in search engines.

Some newbie marketers slapped the awkward name “parasite SEO” on the practice. It’s an inept name for this strategy because a parasite subsists on an unwilling host organism, while this approach to ranking is done by agreement, not by one site attacking another without permission.

This isn’t a low-level affiliate marketer strategy though. It’s also one that’s practiced by many major brands, particularly for credit cards and product reviews.

Google Targets Third Party Content

This specific spam policy targets sites that host third-party content with which the host publisher has little involvement. It takes more than just hosting third-party content, however, to be targeted as spam.

Google’s formal definition is:

“Site reputation abuse is when third-party pages are published with little or no first-party oversight or involvement, where the purpose is to manipulate Search rankings by taking advantage of the first-party site’s ranking signals. Such third-party pages include sponsored, advertising, partner, or other third-party pages that are typically independent of a host site’s main purpose or produced without close oversight or involvement of the host site, and provide little to no value to users.”

Google’s SearchLiaison confirmed in a tweet that the policy went into effect today.

He tweeted:

“It’ll be starting later today. While the policy began yesterday, the enforcement is really kicking off today.”

Some big brand sites have recently removed sections of their site that featured product reviews lacking evidence that the reviewer actually handled the reviewed products: no original product photos, no product measurements, and no testing results.

Read Google’s guidelines on Site Reputation Abuse.

Featured Image by Shutterstock/Lets Design Studio

Google Core Update: How To Write High-Ranking, High-Quality Content in 2024 via @sejournal, @fiverr

This post was sponsored by Fiverr Pro. The opinions expressed in this article are the sponsor’s own.

Were you impacted by Google’s recent core update?

Have your search visibility and rankings begun to drop?

Even in the era of automation, quality still wins over quantity when it comes to search rankings.

With the recent surge of generative AI, the demand for human-created, high-quality content is greater than ever.

Google’s latest update aimed to improve search result quality by reducing unhelpful, unoriginal content by 40%.

So, what kind of content does Google deem high quality these days?

How can you ensure your content performs well on search engines in 2024?

The simple answer is to focus on user experience rather than the search engine itself – in other words: make human connection the priority.

While many businesses are taking advantage of AI tools to populate their sites, those who still see the value in human-created, user-centric content will ultimately win out.

In this guide, we’ll break down the key steps to creating high-quality content that ranks on Google in 2024.

Step 1: Align Your Content Strategy With Campaign & User Journey Goals

Delivering the right content to the right audience at the right time is crucial for SEO success.

And with Google’s latest core update, it’s more important than ever to provide a seamless and personalized experience for your audience throughout their user journey.

Here’s how you can use audience research and user insights to create content that drives results:

  • Identify Your Target Personas: Start by defining your ideal audience personas. Understand their demographics, preferences, pain points, and behaviors.
  • Map Out Your User Journey: Visualize the various stages your customers go through, and identify the touchpoints where they interact with your brand.
  • Tailor Your Content To Each Stage: Meet your audience at each stage of the user journey, and address their specific needs with content that’s relevant, targeted, and effective.

Pro Tip: Don’t have the time or bandwidth to tackle these tasks? You can still move your business forward by hiring a professional SEO strategist to help plan your campaigns and keep you on track.

Image created by Fiverr Pro, April 2024

Step 2: Analyze Real User Data & Identify Content Gaps

As you’re building your content strategy, you’ll need insights into your audience’s behavior, preferences, and engagement patterns.

Utilize analytics tools to gather valuable user data and track key metrics such as website traffic, click-through rates, and conversion rates.
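The core engagement metrics above are simple ratios you can compute from any analytics export. Below is a minimal sketch in Python; the field names and numbers are hypothetical stand-ins for whatever your analytics tool actually provides.

```python
# Minimal sketch: computing CTR and conversion rate from analytics data.
# The keys and values below are hypothetical; adapt them to your tool's export.

def click_through_rate(clicks: int, impressions: int) -> float:
    """CTR = clicks / impressions, expressed as a percentage."""
    return 100 * clicks / impressions if impressions else 0.0

def conversion_rate(conversions: int, sessions: int) -> float:
    """Conversion rate = conversions / sessions, expressed as a percentage."""
    return 100 * conversions / sessions if sessions else 0.0

page = {"url": "/blog/example-post", "impressions": 12400,
        "clicks": 310, "sessions": 290, "conversions": 12}

print(f"{page['url']}: "
      f"CTR {click_through_rate(page['clicks'], page['impressions']):.1f}%, "
      f"conversion rate {conversion_rate(page['conversions'], page['sessions']):.1f}%")
```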

With this information, you can conduct a thorough audit of your existing content and identify gaps and areas for improvement.

Here’s how you can adjust your strategy to a more user-centric approach:

  • Build credibility by homing in on E-E-A-T (Experience, Expertise, Authoritativeness, and Trustworthiness).
  • Try leveraging real experts as content creators, elevating their first-hand experience to create truly helpful content from a trustworthy source. (This is particularly helpful for subjects such as finance or medicine.)
  • Be sure to display the writer as an experienced, authoritative figure in your post, with a link to an author archive page.
  • Maintain relevance by keeping your content up to date and checking frequently for possible revamps.

Get started by outsourcing your audience research and content auditing needs to a Fiverr Pro freelancer.

Step 3: Plan Content That Matches User Intent

Once you gain a deeper understanding of your customer touchpoints, it’s time to plan your content accordingly.

Start by developing detailed outlines for each piece of content that match user intent and meet their needs and expectations.

“Understanding and aligning content with user intent is crucial for SEO success. With Google’s algorithm updates, it’s essential to create content that is not only helpful but also answers specific questions your target audience has. Focus on crafting content that resonates with your audience, addresses their needs, and communicates your message clearly, while ensuring it’s relevant, engaging, and directly addresses the concerns and interests of your readers.”

Alan Redondo, SEO and PPC Expert

It’s also important to establish your own unique voice, as authentic and original content tends to rank higher on SERPs.

And make sure to avoid fluff and keep your content to the point – searchers want to see their questions answered and needs addressed with as little fuss as possible.

If content is not exactly your specialty but you’re ready to start laying out the process, try Fiverr Pro’s professional services to get the help you need.

Image created by Fiverr Pro, April 2024

Step 4: Write Content With Proactive Engagement In Mind

When you create content, the goal is to encourage interaction, discussion, and sharing among users, which boosts your brand awareness organically.

One effective strategy is to craft compelling headlines, introductions, and calls-to-action that prompt users to engage with your content.

You can also use interactive elements such as polls, quizzes, and surveys to encourage participation and feedback.

If you’re looking to maximize user engagement, a multichannel approach is key.

Choose the most appropriate content formats and channels for each stage of the user journey, and consider using blog posts, social media, email campaigns, etc. based on where your audience is most active.

Pro Tip: User Generated Content (UGC) can be beneficial for your overall search strategy. With Google favoring human insights, comment sections and other forms of human interaction are helpful for discoverability.

Ready to start getting the most out of your content and utilizing the multitude of channels available?

Whether you need a social media manager or an email marketer, Fiverr Pro has you covered.

Step 5: Analyze Performance & Pivot

Once you’ve implemented your content strategy, it’s essential to regularly monitor its performance and be prepared to make adjustments as needed.

Keep an eye on key performance indicators (KPIs) to gauge the effectiveness of your content, and pay attention to user feedback and comments to understand how your audience is responding.

Are they finding it helpful and engaging, or are there areas where you could improve?

Take note of any trends or patterns that emerge and use this information to inform future content decisions.

Image created by Fiverr Pro, April 2024

Embracing Human Talent In The AI Era

The human touch is essential to SEO success in today’s landscape, and Google’s latest update supports that.

As it stands, content that’s created for and by humans will outperform generic, AI-generated content every time.

So if you’re looking to set your website apart, it’s best to pivot away from sheer search volume and more towards what will resonate deeply with your target audience.

After all, the more value you provide to users, the more likely you are to beat the algorithm and boost your rankings.

But let’s say you’re a small business owner with a small team and limited hiring capacity.

How do you compete with organizations that have the in-house expertise necessary to implement this kind of content strategy?

How can you access the human talent needed to form genuine connections with your audience?

Whether you’re seeking an SEO professional to help you target your users, or a writer/artist to help you create content, it might help to start bringing on freelancers to fill in any talent gaps.

Fiverr Pro, for example, is a great solution for matching and collaborating with top-tier, pre-vetted freelance talent.

You can use it to find people with the right set of skills to help you execute your projects and add that much-needed human touch to your content.

In the meantime, use these tips to build out your content strategy, so that as you begin to expand your team, you’ll know which tasks to prioritize moving forward.

Elevate Your Content Strategy With Fiverr Pro

With Google’s updates constantly shaking things up, the key to staying ahead is staying true to your audience and their needs.

While AI may be able to churn out content at lightning speed, it’s the authentic, user-centric content that truly stands out these days.

By creating content that connects with real people on a human level, you can ensure sustained SEO success in 2024 and beyond.

So if you’re looking for the right talent to bring that human touch to your content, Fiverr Pro is the ultimate solution.

Whether you’re a small business owner or a seasoned marketer, this subscription-free service offers a wealth of talented, pre-vetted professionals ready to elevate your content strategy.

Get started with Fiverr Pro today and let their experts find talent, plan, and manage projects for you.

Plus, you can hire with confidence, knowing that you’ll get a full refund if you’re not satisfied with the results.


Image Credits

Featured Image: Image by Fiverr Pro. Used with permission.

Apple’s “Intelligent Search” Will Summarize Webpages via @sejournal, @martinibuster

A report based on independently verified accounts notes that Apple’s Safari 18 will come with an Intelligent Search feature that summarizes webpages in response to search queries. There may also be a new feature called Web Eraser that allows users to permanently remove text, images, and advertisements from webpages.

The Potential For Disruption By Apple AI Search

Apple has been gathering website data for years through its internet crawler, which uses the Applebot user agent. In the past, the harvested data has been used in the context of Siri and Apple’s Spotlight Suggestions feature.

Many in the search community have been aware of Applebot and have welcomed the prospect of a new search engine from Apple, but despite the constant crawling, Apple has never released one. A reason may be that it has become apparent that the best way to challenge Google Search is with a technology that replaces search engines altogether, much as the iPhone made standalone digital cameras obsolete.

The latest news coming out about Safari 18 appears to confirm that supplanting Google is the strategy that Apple is pursuing.

Duane Forrester, formerly of Bing and now at Yext, commented on the potentially disruptive quality of Apple’s new technology:

“Intelligent Search could change how iOS consumers get, see and interact with content and answers. We are likely to see Apple taking a big step forward, into the consumer-accessible AI conversation which has been dominated by OpenAI, Microsoft, and Google et al to this point. Our phones are about to become AI-enabled and that could be ground zero for AI Agents. It’s going to impact and potentially change consumer behavior and be the gateway to new experiences.”

The approach Apple is taking has the potential to disrupt not just search engines but also the search optimization and publishing industries, which have been waiting years for an Apple search engine. But the extent of that disruption depends on how Apple implements its AI web search summarizer.

Webpage Summarization

Although news reports didn’t provide details about how the new search result summaries will appear, it seems reasonable to speculate that Apple will provide attribution in the form of a link so that users can click through to the source website.

Duane Forrester speculated:

“Apple was just in court for Google’s anti-trust trial and likely isn’t keen to dive into “monopoly” waters themselves. My guess is that any “result” will have a source.”

This is what was reported:

“With the release of Safari 18, Apple is expected to introduce article summarization through a new Intelligent Search feature — meaning users will have the option to generate a brief summary of any webpage currently on screen.

Apple’s built-in AI software can analyze the keywords and phrases within a webpage or document and generate a short summary containing only the most important information.”

SEOs have been giddy about the prospect of an Apple search engine for years. It now appears the Google Killer they’ve been waiting for could result in less traffic from search queries, though to what extent is impossible to tell at this point.

One search marketing expert mused in a private chat that if Intelligent Search summarizes more than it links out, that may signal it’s time to start selling off the domain names they’ve invested in.

On-Device Processing

An interesting feature of the text summarization is that the technology that creates the summary (called Ajax) resides on the mobile device itself. Ajax extracts keywords and entities, uses that data to identify the topic and build a loose summary of the webpage, and then turns that into a text summary for the user.

This is how the functionality is described:

“In analyzing texts, the software takes into account all relevant information available. It can recognize and classify entities such as companies, people, and locations. For instance, if a name appears at the top of a text, the software will likely recognize the name as belonging to the text’s author.”
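Apple hasn’t published how Ajax works, but the behavior described above resembles classic extractive summarization: score each sentence by the keywords it contains and keep the top scorers. The toy Python sketch below illustrates that general technique only; it is not Apple’s actual implementation, and the stopword list and scoring are deliberately crude.

```python
# Toy extractive summarizer: score sentences by word frequency, keep the top ones.
# Illustrates the general technique described above, not Apple's Ajax software.
import re
from collections import Counter

STOPWORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "is", "it", "for", "on", "that"}

def summarize(text: str, max_sentences: int = 2) -> str:
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    words = [w for w in re.findall(r"[a-z']+", text.lower()) if w not in STOPWORDS]
    freq = Counter(words)  # crude stand-in for keyword/entity extraction

    def score(sentence: str) -> int:
        return sum(freq[w] for w in re.findall(r"[a-z']+", sentence.lower()))

    # Keep the highest-scoring sentences, restored to their original order.
    top = sorted(sorted(sentences, key=score, reverse=True)[:max_sentences],
                 key=sentences.index)
    return " ".join(top)

article = ("Apple is reportedly adding an Intelligent Search feature to Safari 18. "
           "The feature summarizes webpages on the device itself. "
           "Unrelated filler text appears here. "
           "On-device summarization means page text never leaves the phone.")
print(summarize(article))
```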

Apple Also Plans A Web Eraser

As if an Apple search summarizer weren’t bad enough, Apple reportedly also has a “Web Eraser” feature planned for Safari. Web Eraser removes content from webpages so that site visitors don’t have to look at it anymore. Advertising, videos, comments, suggested reading, and maybe even popups could be permanently blocked. Once a user “erases” a block of content from a webpage, that block stays erased for the site visitor on subsequent visits.

According to a report about the Apple Web Eraser:

“The feature is expected to build upon existing privacy features within Safari and will allow users to erase unwanted content from any webpage of their choosing. Users will have the option to erase banner ads, images, text or even entire page sections, all with relative ease.”

Technological Disruptions

It’s a natural response to experience anxiety in the face of change. For many, the dawning of AI Search is their first experience of a major shift. But those of us who have been in search for 25+ years have experienced, and grown accustomed to, sudden and transformative changes that alter publishing and SEO. Like Duane, I tend to feel that Apple’s implementation of an AI search engine that summarizes websites will be disruptive, but not to the point that it harms websites. It’s in Apple’s self-interest not to disrupt the internet to the point of extinction.

Featured Image by Shutterstock/frantic00

Big Change To Google’s Product Structured Data Page via @sejournal, @martinibuster

Google revamped its product structured data documentation by splitting one comprehensive page into three pages that are each more closely focused on a single topic, providing a model for how to rewrite an oversized page into multiple topically relevant webpages.

Google Product Structured Data

Product structured data is essential for ecommerce and product review websites because it helps make them eligible for rich result listings in Google’s search engine results pages (SERPs). When correctly deployed, product structured data can contribute significantly to a website’s traffic and earnings.

Google’s restructuring of the official documentation gives the ecommerce community a lot to take in, but it also simplifies what had become an increasingly large product structured data webpage.

What Changed?

The most notable change is that the entire document has been split into three pages. The original document, Product Structured Data, was renamed Introduction To Product Structured Data. The word count went from 4,808 words to only 667, with approximately 50% of the new document containing the same content. Aside from trivial changes, there is a brand-new section on the revamped page called Deciding Which Markup To Use, which serves as a jumping-off point to the two new pages.

Merchants, product review site publishers and SEOs now have three product structured data documents to read:

  1. Introduction To Product Structured Data
  2. Product Snippet Structured Data
  3. Merchant Listing Structured Data

In addition to the above changes, there’s a new blue-tinted callout box that draws attention to the Product Variant Structured Data page, replacing similar text that was buried in the old document and easily overlooked.

Screenshot of Callout Box

Document Extensively Rewritten

In several instances, headings in the new documentation were rewritten to make the topic of each section clearer.

The new Introduction To Product Structured Data page also contains a revamped section: the old section called “Result types” is now called Deciding Which Markup To Use, a more descriptive heading.

This is the new content:

“Deciding which markup to use
There are two main classes of product structured data. Follow the requirements for the type that best suits your use case:

Product snippets: For product pages where people can’t directly purchase the product. This markup has more options for specifying review information, like pros and cons on an editorial product review page.

Merchant listings: For pages where customers can purchase products from you. This markup has more options for specifying detailed product information, like apparel sizing, shipping details, and return policy information.

Note that there is some overlap between these two features. In general, adding the required product information properties for merchant listings means that your product pages can also be eligible for product snippets. Both features have their own enhancements, so be sure to review both when deciding which markup makes sense in the context of your site (the more properties you can add, the more enhancements your page can be eligible for).”

First New Page: Product Snippets

In addition to the revamped introduction page, there is now a new standalone page focused on the product structured data needed to become eligible for rich results showing ratings, reviews, price, and product availability.

The title element for the new page is How To Add Product Snippet Structured Data. Excluding some of the structured data examples, the new product snippet page has about 2,500 words.

Much of the content on this new page isn’t new. The similarities between a section of the old mega-page and this new standalone page indicate that Google split that section off from the old documentation and turned it into its own page.
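For context, product snippet markup centers on review and rating properties rather than purchase details. The sketch below uses Python’s json module to produce JSON-LD of the kind the page documents; the product name, reviewer, and ratings are invented for illustration, and you can check output like this with Google’s Rich Results Test.

```python
# Minimal product snippet markup (Product + Review + AggregateRating),
# serialized as JSON-LD for a <script type="application/ld+json"> tag.
# All product details below are invented for illustration.
import json

product_snippet = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Trail Running Shoe",
    "review": {
        "@type": "Review",
        "reviewRating": {"@type": "Rating", "ratingValue": 4, "bestRating": 5},
        "author": {"@type": "Person", "name": "Jane Reviewer"},
    },
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": 4.3,
        "reviewCount": 127,
    },
}

print(json.dumps(product_snippet, indent=2))
```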

Second New Page: Merchant Listing Structured Data

The second new page is dedicated to the product structured data that’s specific to merchant listings, which is critical for surfacing products in the shopping-related rich results that Google calls Merchant Listing Experiences.

Apart from a single illustration, the dedicated page for Merchant Listing Structured Data has no information at all about what Merchant Listing Experiences are or how they look in the search results. The second paragraph of the new page instead encourages readers to visit the Introduction To Product Structured Data webpage to learn what the Merchant Listing Experiences rich results look like.

An argument could be made that the Merchant Listing Structured Data page should provide context about Merchant Listing Experiences. But someone at Google decided that this page needs to be 100% focused on a single topic: structured data.

The new webpage comes in at roughly 1,700 words.
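In contrast to product snippets, merchant listing markup centers on the Offer: price, availability, and optional shipping and return details. Below is a minimal sketch in the same style; every value is invented for illustration. In practice, the JSON output would be embedded in the page’s HTML inside a script tag of type application/ld+json.

```python
# Minimal merchant listing markup (Product + Offer with shipping and return
# info), serialized as JSON-LD. All values below are invented for illustration.
import json

merchant_listing = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Trail Running Shoe",
    "image": "https://www.example.com/shoe.jpg",
    "offers": {
        "@type": "Offer",
        "price": 79.99,
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock",
        "shippingDetails": {
            "@type": "OfferShippingDetails",
            "shippingRate": {"@type": "MonetaryAmount", "value": 4.99, "currency": "USD"},
            "shippingDestination": {"@type": "DefinedRegion", "addressCountry": "US"},
        },
        "hasMerchantReturnPolicy": {
            "@type": "MerchantReturnPolicy",
            "applicableCountry": "US",
            "returnPolicyCategory": "https://schema.org/MerchantReturnFiniteReturnWindow",
            "merchantReturnDays": 30,
        },
    },
}

print(json.dumps(merchant_listing, indent=2))
```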

Editorial Decisions Based On User Needs

Rather than keep one giant, comprehensive page on the topic, Google decided to split it into its constituent subtopics. The result is three pages that are more tightly focused, which presumably will not only serve users better but also rank better.

Read The Documentation

Review the original version of the documentation:

Internet Archive: Product (Product, Review, Offer) structured data

Read the new documentation:

Introduction to Product structured data

Product snippet (Product, Review, Offer) structured data

Merchant listing (Product, Offer) structured data

Featured Image by Shutterstock/achinthamb