Critical SERP Features Of Google’s Shopping Marketplace via @sejournal, @Kevin_Indig


Google’s launch and pullback of AI Overviews (AIOs) caught the most attention in the SEO scene over the last two months.

However, a change with at least the same significance flew under the radar: Google’s transformation from search engine to marketplace for shopping queries.

Yes, AIOs are impactful: In my initial analysis, I found an impact of -8.9% when a page is cited in an AIO compared to when it ranks at the top of the classic web search results.

I then found that Google pulled 50-66% of AIOs back. However, Google shows a whole slew of SERP features and AI features for ecommerce queries that are at least as impactful as AIOs.

To better understand the key trends for shopping queries, I analyzed 35,305 keywords across categories like fashion, beds, plants, and automotive in the US over the last five months using SEOClarity.

The results:

  • Product listings appear more often in position 1 in June compared to February 2024.
  • SERP features like Discussions & Forums gained visibility and opened a new playground for marketers.
  • SERP features fluctuate in visibility and introduce a lot of noise in SEO metrics.

Google Shopping Marketplace

To summarize Ecommerce Shifts, the piece in which I explain Google’s shift from search engine to ecommerce marketplace: Google has merged the web results and the Shopping tab for shopping searches in response to Amazon’s long-standing dominance:

  • Google has fully transitioned into a shopping marketplace by adding product filters to search result pages and implementing a direct checkout option.
  • These new features create an ecommerce search experience within Google Search and may significantly impact the organic traffic merchants and retailers rely on.
  • Google has quietly introduced a direct checkout feature that allows merchants to link free listings directly to their checkout pages.
  • Google’s move to a shopping marketplace was likely driven by the need to compete with Amazon’s successful advertising business.
  • Google faces the challenge of balancing its role as a search engine with the need to generate revenue through its shopping marketplace, especially considering its dependence on partners for logistics.

To illustrate with an example:

  1. Say you are looking for kayaks (summertime!).
  2. On desktop (logged in), Google will now show you product filters in the left sidebar and product carousels in the middle, on top of classic organic results – and ads, of course.

Google search for kayaks (Image Credit: Kevin Indig)

  3. On mobile, you get product filters at the top, ads above organic results, and product carousels in the form of popular products.

Google search for kayaks on mobile (Image Credit: Kevin Indig)

  4. This experience doesn’t look very different from Amazon, which is the whole point.

Amazon results (Image Credit: Kevin Indig)

Google’s new shopping experience lets users explore products on Amazon, Walmart, eBay, Etsy, & Co.

From an SEO perspective, the prominent position of the product grid (listings) and filters likely has a significant impact on CTR, organic traffic, and, ultimately, revenue.

Product Listings Appear More Often In Position 1

In my analysis, 30,172 out of 35,305 keywords (85.6%) show product listings – the free product carousels – making them the most visible SERP feature in shopping search.

In February, product listings showed up for 39% of queries in position 1 and 15% of queries in position 3.

In July, that number shifted to 43% for position 1 and 13.6% for position 3. Google moved product listings higher up the SERPs.

Google product listings by position (Image Credit: Kevin Indig)

The shift from web links to product images makes product listings a cornerstone feature in Google’s transformation. The increased visibility suggests Google is doubling down on the new model.

Discussions & Forums Gain Visibility

After product listings (85.6% of queries), image carousels (61.8% of queries) are the most common SERP features.

SERP features by occurrence (Image Credit: Kevin Indig)

Image carousels are highly impactful because shopping is a visual act. Seeing the right product can very quickly trigger a purchase, as opposed to customers being stuck in the Messy Middle for longer.

Retailers and ecommerce brands put a lot of effort into high-quality product pictures and need to spend equal time optimizing images for Google Search, even though traffic from image results is usually much lower than traffic from web rankings.

Google now tests “generate image with AI,” a feature that lets users generate product images with prompts and then see similar (real) products.

It’s a powerful application of AI that, again, flies under the AIO radar but could also be impactful by making it easier for users to find things they want.

Image Credit: Kevin Indig

Visibility for most SERP features remained relatively unchanged between February and July, with one exception: Discussions & Forums grew from 28.7% to 34% of all queries (+5.3 percentage points).

SERP features, February vs. June 2024 (Image Credit: Kevin Indig)

The change in Discussions & Forums SERP features is in line with Reddit’s unprecedented SEO visibility gain over the last 12 months. The domain now operates at the traffic level of Facebook and Amazon.

Google’s Discussions & Forums feature highlights threads in forums like Reddit, Quora, and others. People visit forums when they are looking for authentic and unincentivized opinions from other consumers. Many review articles are biased, and it seems consumers know it.

As a result, Google compensates for lower review quality with more user-generated content from forums. In Free Content, I referenced a study from Germany titled “Is Google getting worse?” that found:

  • “An overall downward trend in text quality in all three search engines.”
  • “Higher-ranked pages are on average more optimized, more monetized with affiliate marketing, and they show signs of lower text quality.”

Discussions & Forums show that high visibility doesn’t equal high impact for SERP features.

SERP Features And Their Impact Fluctuate

SERP features are commonly assumed to show up at a stable rate in Search, but Google constantly tests them.

As a result, SERP features that impact click-through rates can introduce a lot of noise into common SEO data (CTR, clicks, even revenue).

At the same time, Google switching some features on and off can help SEO pros understand the impact of SERP features on SEO metrics.

A good example is the Things To Know feature (TTK), which answers two common questions about a product with links to websites.

Things To Know feature (Image Credit: Kevin Indig)

After months of stable visibility, Google suddenly reduced the number of TTKs by 37.5% for a month before bringing it back to previous levels.

Sites that were linked in TTK might have seen less organic traffic during that month. Since TTK isn’t reported in Search Console, those sites might wonder why their organic traffic dropped even though ranks might be stable.

Things to Know SERP feature (Image Credit: Kevin Indig)

Coming back to the kayak example from earlier, Google tests variations like deals and carousel segments (“Kayaks For Beginners”).

Kayaks for beginners (Image Credit: Kevin Indig)

You can imagine how hard this makes getting stable data and why it’s so critical to monitor SERP features.


Featured Image: Paulo Bobita/Search Engine Journal

Google Struggles To Boost Search Traffic On Its iPhone Apps via @sejournal, @MattGSouthern

According to a report by The Information, Google is working to reduce its reliance on Apple’s Safari browser, but progress has been slower than anticipated.

As Google awaits a ruling on the U.S. Department of Justice’s antitrust lawsuit, its arrangement with Apple is threatened.

The current agreement, which makes Google the default search engine on Safari for iPhones, could be in jeopardy if the judge rules against Google.

To mitigate this risk, Google encourages iPhone users to switch to its Google Search or Chrome apps for browsing. However, these efforts have yielded limited success.

Modest Gains In App Adoption

Over the past five years, Google has increased the percentage of iPhone searches conducted through its apps from 25% to the low 30s.

While this represents progress, it falls short of Google’s internal target of 50% by 2030.

The company has employed various marketing strategies, including campaigns showcasing features like Lens image search and improvements to the Discover feed.

Despite these efforts, Safari’s preinstalled status on iPhones remains an obstacle.

Financial Stakes & Market Dynamics

The financial implications of this struggle are considerable for both Google and Apple.

In 2023, Google reportedly paid over $20 billion to Apple to maintain its status as the default search engine on Safari.

By shifting more users to its apps, Google aims to reduce these payments and gain leverage in future negotiations.

Antitrust Lawsuit & Potential Consequences

The ongoing antitrust lawsuit threatens Google’s business model.

If Google loses the case, it could potentially lose access to approximately 70% of searches conducted on iPhones, which account for about half of the smartphones in the U.S.

This outcome could impact Google’s mobile search advertising revenue, which exceeded $207 billion in 2023.

New Initiatives & Leadership

To address these challenges, Google has brought in new talent, including former Instagram and Yahoo product executive Robby Stein.

Stein is now tasked with leading efforts to shift iPhone users to Google’s mobile apps, exploring ways to make the apps more compelling, including the potential use of generative AI.

Looking Ahead

With the antitrust ruling on the horizon, Google’s ability to attract users to its apps will determine whether it maintains its search market share.

We’ll be watching closely to see how Google navigates these challenges and if it can reduce its reliance on Safari.


Featured Image: photosince/shutterstock

Technical SEO Strategy: Expert Tips To Maximize Website Performance via @sejournal, @lorenbaker

Wondering why your carefully crafted content isn’t climbing the search rankings? 

You might be overlooking a crucial piece of the puzzle: technical SEO. 

It’s easy to get lost in content optimization and on-page SEO, but the real game-changer lies behind the scenes. 

Technical SEO is basically the backbone of your website’s performance, ensuring that search engines can find, crawl, and index your pages effectively.

So if your site’s technical foundation hasn’t been a top priority, you could be missing out on major ranking opportunities. 

But it’s never too late to pivot – if you’re ready to start maximizing your web performance and outranking your competition, our upcoming webinar is one you won’t want to miss.  

Join us live on July 17, as we lay out an actionable framework for auditing and improving your technical SEO across four key pillars:

  1. Discoverability is all about how easily search engines can find your website and its pages.
  2. Crawlability ensures that search engine bots can navigate and access your site without any issues.
  3. Indexability means your pages can be stored in the search engine’s database and shown in search results.
  4. User Experience (UX) focuses on making sure your site is easy for visitors to navigate and enjoyable to use.

Our presenters, Steven van Vessum, Director of Organic Marketing at Conductor, and Alexandra Dristas, Principal Solutions Consultant at Conductor, will explore ways you can implement core technical SEO best practices.

You’ll also learn which to prioritize based on impact, as well as how to maintain these improvements moving forward.

In this webinar, we’ll cover the following topics: 

  • Optimizing for Discoverability: Learn how creating a clear sitemap and well-organized site architecture helps search engines find and index your pages efficiently.
  • Improving Crawl Budget: Ensure search engine bots focus on valuable pages rather than getting stuck in loops or wasting resources on low-priority content. 
  • Leveraging Schema and Headings: How using Schema markup and optimizing your heading structure can help improve indexability in search results.
  • Core Web Vitals and Accessibility: Discover best practices to provide a seamless and satisfying experience for all visitors.
  • Monitoring Technical SEO: Learn the top tools and processes to continuously identify and fix technical issues, maintaining optimal site performance.

Don’t miss this opportunity to elevate your technical SEO strategy and boost your search visibility. 

Plus, if you stick around after the presentation, Steven and Alexandra will be answering questions live in our Q&A session. 

Sign up now and get the expert insights you need to rank higher on SERPs.

The Three Pillars Of SEO: Authority, Relevance, And Experience via @sejournal, @marktraphagen

If there’s one thing we SEO pros are good at, it’s making things complicated.

That’s not necessarily a criticism.

Search engine algorithms, website coding and navigation, choosing and evaluating KPIs, setting content strategy, and more are highly complex tasks involving lots of specialized knowledge.

But as important as those things all are, at the end of the day, there is really just a small set of things that will make most of the difference in your SEO success.

In SEO, there are really just three things – three pillars – that are foundational to achieving your SEO goals.

  • Authority.
  • Relevance.
  • Experience (of the users and bots visiting the site).

Nutritionists tell us our bodies need protein, carbohydrates, and fats in the right proportions to stay healthy. Neglect any of the three, and your body will soon fall into disrepair.

Similarly, a healthy SEO program involves a balanced application of authority, relevance, and experience.

Authority: Do You Matter?

In SEO, authority refers to the importance or weight given to a page relative to other pages that are potential results for a given search query.

Modern search engines such as Google use many factors (or signals) when evaluating the authority of a webpage.

Why does Google care about assessing the authority of a page?

For most queries, there are thousands or even millions of pages available that could be ranked.

Google wants to prioritize the ones that are most likely to satisfy the user with accurate, reliable information that fully answers the intent of the query.

Google cares about serving users the most authoritative pages for their queries because users who are satisfied by the pages they click through to from Google are more likely to use Google again – and thus get more exposure to Google’s ads, the primary source of its revenue.

Authority Came First

Assessing the authority of webpages was the first fundamental problem search engines had to solve.

Some of the earliest search engines relied on human evaluators, but as the World Wide Web exploded, that quickly became impossible to scale.

Google overtook all its rivals because its creators, Larry Page and Sergey Brin, developed the idea of PageRank, using links from other pages on the web as weighted citations to assess the authoritativeness of a page.

Page and Brin realized that links were an already-existing system of constantly evolving polling, in which other authoritative sites “voted” for pages they saw as reliable and relevant to their users.

Search engines use links much like we might treat scholarly citations; the more scholarly papers relevant to a source document that cite it, the better.

The relative authority and trustworthiness of each of the citing sources come into play as well.

So, of our three fundamental categories, authority came first because it was the easiest to crack, given the ubiquity of hyperlinks on the web.

The other two, relevance and user experience, would be tackled later, as machine learning/AI-driven algorithms developed.

Links Still Primary For Authority

The big innovation that made Google the dominant search engine in a short period was that it used an analysis of links on the web as a ranking factor.

This started with a paper by Larry Page and Sergey Brin called The Anatomy of a Large-Scale Hypertextual Web Search Engine.

The essential insight behind this paper was that the web is built on the notion of documents inter-connected with each other via links.

Since putting a link on your site to a third-party site might cause a user to leave your site, there was little incentive for a publisher to link to another site unless it was really good and of great value to their site’s users.

In other words, linking to a third-party site acts a bit like a “vote” for it, and each vote could be considered an endorsement of the page the link points to as one of the best resources on the web for a given topic.

Then, in principle, the more votes you get, the better and the more authoritative a search engine would consider you to be, and you should, therefore, rank higher.

Passing PageRank

A significant piece of the initial Google algorithm was based on the concept of PageRank, a system for evaluating which pages are the most important based on scoring the links they receive.

So, a page that has large quantities of valuable links pointing to it will have a higher PageRank and will, in principle, be likely to rank higher in the search results than other pages without as high a PageRank score.

When a page links to another page, it passes a portion of its PageRank to the page it links to.

Thus, pages accumulate more PageRank based on the number and quality of links they receive.
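
To make the idea of passing PageRank concrete, here is a minimal sketch in Python of the classic iteration over a made-up three-page link graph. It only illustrates the original concept described above – each page passes an equal share of its score along its outgoing links, dampened by a constant – and is not how Google’s production systems work today; the graph, damping value, and iteration count are arbitrary assumptions.

```python
# Minimal PageRank sketch over a hypothetical link graph (illustration only).
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = set(links) | {target for targets in links.values() for target in targets}
    rank = {page: 1.0 / len(pages) for page in pages}

    for _ in range(iterations):
        # Every page keeps a small baseline score, then receives shares from its inbound links.
        new_rank = {page: (1 - damping) / len(pages) for page in pages}
        for page, targets in links.items():
            if not targets:
                continue
            share = damping * rank[page] / len(targets)  # each outlink passes an equal share
            for target in targets:
                new_rank[target] += share
        rank = new_rank
    return rank

# Toy graph: C is linked from both A and B, so it ends up with the highest score.
graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
print(pagerank(graph))
```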

Three Pillars of SEO: Authority, Relevance, and Trust | SEJ

Not All Links Are Created Equal

So, more votes are better, right?

Well, that’s true in theory, but it’s a lot more complicated than that.

PageRank scores range from a base value of one to values that likely exceed trillions.

Higher PageRank pages can have a lot more PageRank to pass than lower PageRank pages. In fact, a link from one page can easily be worth more than one million times a link from another page.

Three Pillars of SEO: Authority, Relevance, and Trust | SEJ

But the PageRank of the source page of a link is not the only factor in play.

Google also looks at the topic of the linking page and the anchor text of the link, but those have to do with relevance and will be referenced in the next section.

It’s important to note that Google’s algorithms have evolved a long way from the original PageRank thesis.

The way that links are evaluated has changed in significant ways – some of which we know, and some of which we don’t.

What About Trust?

You may hear many people talk about the role of trust in search rankings and in evaluating link quality.

For the record, Google says it doesn’t have a concept of trust it applies to links (or ranking), so you should take those discussions with many grains of salt.

These discussions began because of a Yahoo patent on the concept of TrustRank.

The idea was that if you started with a seed set of hand-picked, highly trusted sites and then counted the number of clicks it took you to go from those sites to yours, the fewer clicks, the more trusted your site was.
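
As a rough illustration of that click-distance idea (and only that – Google has not confirmed using anything like it), the sketch below walks a made-up link graph outward from a seed set of “trusted” sites and records how many link hops away each site is.

```python
from collections import deque

# Toy illustration of link distance from a hand-picked seed set of trusted sites.
# The graph and seed set are made up for illustration only.
def link_distance_from_seeds(links, seeds):
    """links maps each site to the sites it links to; returns hop count per reachable site."""
    distance = {seed: 0 for seed in seeds}
    queue = deque(seeds)
    while queue:
        site = queue.popleft()
        for target in links.get(site, []):
            if target not in distance:
                distance[target] = distance[site] + 1
                queue.append(target)
    return distance

graph = {
    "trusted-news.example": ["industry-blog.example"],
    "industry-blog.example": ["your-site.example"],
    "your-site.example": [],
}
print(link_distance_from_seeds(graph, ["trusted-news.example"]))
# {'trusted-news.example': 0, 'industry-blog.example': 1, 'your-site.example': 2}
```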

Google has long said it doesn’t use this type of metric.

However, in 2013, Google was granted a patent related to evaluating the trustworthiness of links. We should note, though, that the existence of a granted patent does not mean it’s used in practice.

For your own purposes, however, if you want to assess a site’s trustworthiness as a link source, using the concept of trusted links is not a bad idea.

If a site does any of the following, it probably isn’t a good source for a link:

  • It sells links to others.
  • It has less-than-great content.
  • It otherwise doesn’t appear reputable.

Google may not be calculating trust the way you do in your analysis, but chances are good that some other aspect of its system will devalue that link anyway.

Fundamentals Of Earning & Attracting Links

Now that you know that obtaining links to your site is critical to SEO success, it’s time to start putting together a plan to get some.

The key to success is understanding that Google wants this entire process to be holistic.

Google actively discourages, and in some cases punishes, schemes to get links in an artificial way. This means certain practices are seen as bad, such as:

  • Buying links for SEO purposes.
  • Going to forums and blogs and adding comments with links back to your site.
  • Hacking people’s sites and injecting links into their content.
  • Distributing poor-quality infographics or widgets that include links back to your pages.
  • Offering discount codes or affiliate programs as a way to get links.
  • And many other schemes where the resulting links are artificial in nature.

What Google really wants is for you to make a fantastic website and promote it effectively, with the result that you earn or attract links.

So, how do you do that?

Who Links?

The first key insight is understanding who it is that might link to the content you create.

Here is a chart that profiles the major groups of people in any given market space (based on research by the University of Oklahoma):

Three Pillars of SEO: Authority, Relevance, and Trust | SEJ

Who do you think are the people most likely to link to you?

It’s certainly not the laggards, and it’s also not the early or late majority.

It’s the innovators and early adopters. These are the people who write on media sites or have blogs and might add links to your site.

There are also other sources of links, such as locally oriented sites like the local chamber of commerce or local newspapers.

You might also find some opportunities with colleges and universities if they have pages that relate to some of the things you’re doing in your market space.

Relevance: Will Users Swipe Right On Your Page?

You have to be relevant to a given topic.

Think of every visit to a page as an encounter on a dating app. Will users “swipe right” (thinking, “This looks like a good match!”)?

If you have a page about Tupperware, it doesn’t matter how many links you get – you’ll never rank for queries related to used cars.

This defines a limitation on the power of links as a ranking factor, and it shows how relevance also impacts the value of a link.

Consider a page on a site that is selling a used Ford Mustang. Imagine that it gets a link from Car and Driver magazine. That link is highly relevant.

Also, think of this intuitively. Is it likely that Car and Driver magazine has some expertise related to Ford Mustangs? Of course it does.

In contrast, imagine a link to that Ford Mustang from a site that usually writes about sports. Is the link still helpful?

Probably, but not as helpful because there is less evidence to Google that the sports site has a lot of knowledge about used Ford Mustangs.

In short, the relevance of the linking page and the linking site impacts how valuable a link might be considered.

What are some ways that Google evaluates relevance?

The Role Of Anchor Text

Anchor text is another aspect of links that matters to Google.

Three Pillars of SEO: Authority, Relevance, and Trust | SEJ

The anchor text helps Google confirm what the content on the page receiving the link is about.

For example, if the anchor text is the phrase “iron bathtubs” and the page has content on that topic, the anchor text, plus the link, acts as further confirmation that the page is about that topic.

Thus, links help Google evaluate both the page’s relevance and its authority.

Be careful, though, as you don’t want to go aggressively obtaining links to your page that all use your main keyphrase as the anchor text.

Google also looks for signs that you are manually manipulating links for SEO purposes.

One of the simplest indicators is if your anchor text looks manually manipulated.

Internal Linking

There is growing evidence that Google uses internal linking to evaluate how relevant a site is to a topic.

Properly structured internal links connecting related content are a way of showing Google that you have the topic well-covered, with pages about many different aspects.

By the way, anchor text is as important when creating internal links as it is for external, inbound links.

Your overall site structure is related to internal linking.

Think strategically about where your pages fall in your site hierarchy. If it makes sense for users, it will probably make sense to search engines.
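
If you want a quick look at how a single page’s internal links and anchor text appear to a crawler, a small standard-library script like the sketch below can help. The URL is a placeholder, and a real audit would crawl the whole site rather than one page.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

# Rough sketch: list the internal links (and their anchor text) on a single page.
class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []          # (href, anchor_text) pairs
        self._current_href = None
        self._text_parts = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._current_href = dict(attrs).get("href")
            self._text_parts = []

    def handle_data(self, data):
        if self._current_href is not None:
            self._text_parts.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._current_href:
            self.links.append((self._current_href, "".join(self._text_parts).strip()))
            self._current_href = None

page_url = "https://www.example.com/"  # placeholder URL
html = urlopen(page_url).read().decode("utf-8", errors="ignore")
parser = LinkCollector()
parser.feed(html)

site_host = urlparse(page_url).netloc
for href, anchor in parser.links:
    absolute = urljoin(page_url, href)
    if urlparse(absolute).netloc == site_host:   # keep internal links only
        print(f"{anchor or '[no anchor text]'} -> {absolute}")
```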

The Content Itself

Of course, the most important indicator of the relevance of a page has to be the content on that page.

Most SEO professionals know that assessing content’s relevance to a query has become way more sophisticated than merely having the keywords a user is searching for.

Due to advances in natural language processing and machine learning, search engines like Google have vastly increased their competence in being able to assess the content on a page.

What are some things Google likely looks for in determining what queries a page should be relevant for?

  • Keywords: While the days of keyword stuffing as an effective SEO tactic are (thankfully) way behind us, having certain words on a page still matters. My company has numerous case studies showing that merely adding key terms that are common among top-ranking pages for a topic is often enough to increase organic traffic to a page.
  • Depth: The top-ranking pages for a topic usually cover the topic at the right depth. That is, they have enough content to satisfy searchers’ queries and/or are linked to/from pages that help flesh out the topic.
  • Structure: Structural elements like H1, H2, and H3, bolded topic headings, and schema-structured data may help Google better understand a page’s relevance and coverage (a minimal structured data example follows this list).
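
As an example of the structured data mentioned in the last point, here is a minimal schema.org Product snippet built in Python and serialized as JSON-LD. The product values are placeholders (a nod to the iron bathtub example earlier), and you should check Google’s structured data documentation for the properties required for rich results on any given page type.

```python
import json

# Minimal schema.org Product markup rendered as JSON-LD. All values are placeholders.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Cast Iron Bathtub",
    "description": "Classic freestanding cast iron bathtub with enamel finish.",
    "brand": {"@type": "Brand", "name": "ExampleBrand"},
    "offers": {
        "@type": "Offer",
        "price": "799.00",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock",
    },
}

# Embed the output in the page's HTML inside a <script type="application/ld+json"> tag.
print(json.dumps(product, indent=2))
```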

What About E-E-A-T?

E-E-A-T is a Google initialism standing for Experience, Expertise, Authoritativeness, and Trustworthiness.

It is the framework of the Search Quality Rater Guidelines, a document used to train Google Search Quality Raters.

Search Quality Raters evaluate pages that rank in search for a given topic using defined E-E-A-T criteria to judge how well each page serves the needs of a search user who visits it as an answer to their query.

Those ratings are accumulated in aggregate and used to help tweak the search algorithms. (They are not used to affect the rankings of any individual site or page.)

Of course, Google encourages all site owners to create content that makes a visitor feel that it is authoritative, trustworthy, and written by someone with expertise or experience appropriate to the topic.

The main thing to keep in mind is that the more YMYL (Your Money or Your Life) your site is, the more attention you should pay to E-E-A-T.

YMYL sites are those whose main content addresses things that might have an effect on people’s well-being or finances.

If your site is YMYL, you should go the extra mile in ensuring the accuracy of your content and in showing that qualified experts write it.

Building A Content Marketing Plan

Last but certainly not least, create a real plan for your content marketing.

Don’t just suddenly start doing a lot of random stuff.

Take the time to study what your competitors are doing so you can invest your content marketing efforts in a way that’s likely to provide a solid ROI.

One approach is to pull their backlink profiles using a backlink analysis tool.

With this information, you can see what types of links they’ve been getting and, based on that, figure out what links you need to get to beat them.

Take the time to do this exercise and also to map which links are going to which pages on the competitors’ sites, as well as what each of those pages ranks for.

Building out this kind of detailed view will help you scope out your plan of attack and give you some understanding of what keywords you might be able to rank for.

It’s well worth the effort!

In addition, study your competitors’ content plans.

Learn what they are doing and carefully consider what you can do that’s different.

Focus on developing a clear differentiation in your content for topics that are in high demand with your potential customers.

This is another investment of time that will be very well spent.

Experience

As we traced above, Google started by focusing on ranking pages by authority, then found ways to assess relevance.

The third evolution of search was evaluating the site and page experience.

This actually has two separate but related aspects: the technical health of the site and the actual user experience.

We say the two are related because a technically sound site creates a good experience for both human users and the crawling bots Google uses to explore and understand a site and add its pages to the index – the first step to qualifying to rank in search.

In fact, many SEO pros (and I’m among them) prefer to speak of SEO not as Search Engine Optimization but as Search Experience Optimization.

Let’s talk about the human (user) experience first.

User Experience

Google realized that authoritativeness and relevancy, as important as they are, were not the only things users were looking for when searching.

Users also want a good experience on the pages and sites Google sends them to.

What is a “good user experience”? It includes at least the following:

  • The page the searcher lands on is what they would expect to see, given their query. No bait and switch.
  • The content on the landing page is highly relevant to the user’s query.
  • The content is sufficient to answer the intent of the user’s query but also links to other relevant sources and related topics.
  • The page loads quickly, the relevant content is immediately apparent, and page elements settle into place quickly (all aspects of Google’s Core Web Vitals).

In addition, many of the suggestions above about creating better content also apply to user experience.

Technical Health

In SEO, the technical health of a site is how smoothly and efficiently it can be crawled by Google’s search bots.

Broken connections or even things that slow down a bot’s progress can drastically affect the number of pages Google will index and, therefore, the potential traffic your site can qualify for from organic search.
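
As a tiny illustration of one narrow technical-health check, the sketch below requests a short, hypothetical list of URLs and reports any that fail to respond. Real audits rely on a full crawler and log analysis rather than a hand-picked list, so treat this only as a starting point.

```python
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

# Check that a handful of important URLs still respond successfully.
# The URL list is a placeholder for illustration.
urls_to_check = [
    "https://www.example.com/",
    "https://www.example.com/products",   # hypothetical path
]

for url in urls_to_check:
    request = Request(url, method="HEAD", headers={"User-Agent": "site-health-check"})
    try:
        with urlopen(request, timeout=10) as response:
            print(url, response.status)
    except HTTPError as error:            # 4xx / 5xx responses
        print(url, "HTTP error", error.code)
    except URLError as error:             # DNS failures, timeouts, refused connections
        print(url, "unreachable:", error.reason)
```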

The practice of maintaining a technically healthy site is known as technical SEO.

The many aspects of technical SEO are beyond the scope of this article, but you can find many excellent guides on the topic, including Search Engine Journal’s Advanced Technical SEO.

In summary, Google wants to rank pages that it can easily find, that satisfy the query, and that make it as easy as possible for the searcher to identify and understand what they were searching for.

What About The Google Leak?

You’ve probably heard by now about the leak of Google documents containing thousands of labeled API calls and many thousands of attributes for those data buckets.

Many assume that these documents reveal the secrets of the Google algorithms for search. But is that a warranted assumption?

No doubt, perusing the documents is interesting and reveals many types of data that Google may store or may have stored in the past. But some significant unknowns about the leak should give us pause.

  • As Google has pointed out, we lack context around these documents and how they were used internally by Google, and we don’t know how out of date they may be.
  • It is a huge leap from “Google may collect and store data point x” to “therefore data point x is a ranking factor.”
  • Even if we assume the document does reveal some things that are used in search, we have no indication of how they are used or how much weight they are given.

Given those caveats, it is my opinion that while the leaked documents are interesting from an academic point of view, they should not be relied upon for actually forming an SEO strategy.

Putting It All Together

Search engines want happy users who will come back to them again and again when they have a question or need.

They create and sustain happiness by providing the best possible results that satisfy that question or need.

To keep their users happy, search engines must be able to understand and measure the relative authority of webpages for the topics they cover.

When you create content that is highly useful (or engaging or entertaining) to visitors – and when those visitors find your content reliable enough that they would willingly return to your site or even seek you out above others – you’ve gained authority.

Search engines work hard to continually improve their ability to match the human quest for trustworthy authority.

As we explained above, that same kind of quality content is key to earning the kinds of links that assure the search engines you should rank highly for relevant searches.

That can be either content on your site that others want to link to or content that other quality, relevant sites want to publish, with appropriate links back to your site.

Focusing on these three pillars of SEO – authority, relevance, and experience – will increase the opportunities for your content and make link-earning easier.

You now have everything you need to know for SEO success, so get to work!



Featured Image: Paulo Bobita/Search Engine Journal

Google Gives Exact Reason Why Negative SEO Doesn’t Work via @sejournal, @martinibuster

Google’s Gary Illyes answered a question about negative SEO, providing useful insights into the technical details of how Google prevents low-quality spam links from affecting normal websites.

The answer about negative SEO was given in an interview in May and has gone unnoticed until now.

Negative SEO

Negative SEO is the practice of sabotaging a competitor with an avalanche of low quality links. The idea is that Google will assume that the competitor is spamming and knock them out of the search engine results pages (SERPs).

The practice of negative SEO originated in the online gambling space where the rewards for top ranking are high and the competition is fierce. I first heard of it around the mid-2000s (probably before 2010) when someone involved in the gambling space told me about it.

Virtually all websites that rank for meaningful search queries attract low-quality links, and there is nothing unusual about it; it’s always been this way. The concept of negative SEO became more prominent after the Penguin link spam update caused site owners to become more aware of the state of their inbound links.

Does Negative SEO Cause Harm?

The person interviewing Gary Illyes was taking questions from the audience.

She asked:

“Does negative SEO via spammy link building, a competitor throwing tens of thousands of links at another competitor, does that kind of thing still harm people or has Google kind of pushed that off to the side?”

Google’s Gary Illyes answered the question by first asking the interviewer if she remembered the Penguin update to which she answered yes.

He then explained his experience reviewing examples of negative SEO that site owners and SEOs had sent him. He said that out of hundreds of cases he reviewed there was only one case that might have actually been negative SEO but that the web spam team wasn’t 100% sure.

Gary explained:

“Around the time we released Penguin, there was tons and tons of tons of complaints about negative SEO, specifically link based negative SEO and then very un-smartly, I requested examples like show me examples, like show me how it works and show me that it worked.

And then I got hundreds, literally hundreds of examples of alleged negative SEO and all of them were not negative SEO. It was always something that was so far away from negative SEO that I didn’t even bother looking further, except one that I sent to the web spam team for double checking and that we haven’t made up our mind about it, but it could have been negative SEO.

With this, I want to say that the fear about negative SEO is much bigger than or much larger than it needs to be, we disable insane numbers of links…”

The above is Gary’s experience of negative SEO. Next he explains the exact reason why “negative SEO links” have no effect.

Links From Irrelevant Topics Are Not Counted

At about the 30-minute mark of the interview, Gary confirmed something interesting about how links are evaluated that is important to understand. Google has, for a very long time, examined the context of the site that’s linking out to match it to the site that’s being linked to, and if they don’t match up, then Google wouldn’t pass the PageRank signal.

Gary continued his answer:

“If you see links from completely irrelevant sites, be that p–n sites or or pure spam sites or whatever, you can safely assume that we disabled the links from those sites because, one of the things is that we try to match the the topic of the target page plus whoever is linking out, and if they don’t match then why on Earth would we use those links?

Like for example if someone is linking to your flower page from a Canadian casino that sells Viagra without prescription, then why would we trust that link?

I would say that I would not worry about it. Like, find something else to worry about.”

Google Matches Topics From Page To Page

There was a time, in the early days of SEO, when thousands of links from non-matching topics could boost a site to the top of Google’s search results. Some link builders used to offer “free” traffic counter widgets to universities that, when placed in the footer, would contain a link back to their client sites – and they used to work. But Google tightened up on those kinds of links.

What Gary said about links having to be relevant matches up with what link builders have known for at least twenty years. The concept of off-topic links not being counted by Google was understood way back in the days when people did reciprocal links.

Although I can’t remember everything every Googler has ever said about negative SEO, this seems to be one of the rare occasions that a Googler offered a detailed reason why negative SEO doesn’t work.

Watch Gary Illyes answer the question at the 26 minute mark:

Featured Image by Shutterstock/MDV Edwards

42 Facebook Statistics & Facts For 2024 via @sejournal, @annabellenyst

Don’t believe what you may have heard; Facebook is still a dominant social media force in 2024.

With over 3 billion active users, it remains a key player for businesses, marketers, and social media enthusiasts.

And despite the rise of newer, shinier platforms, Facebook’s expansive reach and diverse user base are still unrivaled, making it a powerful channel for both personal and business engagement.

In this article, we’ll highlight the latest Facebook statistics and facts, providing a comprehensive overview of its reach, user behavior, and influence.

Facebook Overview

1. Facebook is the world’s most-used social platform in 2024, with over 3 billion global active users.

2. It is the third most-used app globally among mobile users, trailing only WhatsApp and YouTube.

3. Facebook ranks third in terms of time spent (behind TikTok and YouTube), with users spending an average of 19 hours and 47 minutes on the Android app per month.

4. 64.1% of Facebook Android users open the app every day.

5. Facebook is the third most visited website in the US, with an estimated 2.90 billion monthly visits in April 2024.

6. Of its monthly US visitors, roughly 50.07% are mobile users, and 49.93% are using a desktop.

7. Globally, users spend an average of 3 minutes and 42 seconds on Facebook per app session.

8. Facebook is the second most searched query globally, with a search volume of 584.9 million.

9. Facebook is the fourth most downloaded social networking app in the US, behind Threads, WhatsApp, and Telegram.

(Source) (Source) (Source) (Source) (Source) (Source) (Source)

Facebook Company Background

10. Facebook was founded in 2004 by Mark Zuckerberg, Eduardo Saverin, Andrew McCollum, Dustin Moskovitz, and Chris Hughes.

11. The platform was originally launched as ‘TheFacebook’ on February 4, 2004. In August of 2005, it rebranded to Facebook.

12. Mark Zuckerberg is the current CEO of Facebook.

13. Facebook is headquartered in Menlo Park, California.

14. Facebook has 69,329 employees in 2024, a decrease of 10% year-over-year.

(Source) (Source) (Source)

Facebook Financial Performance

15. As of May 2024, Meta, Facebook’s parent company, has a market cap of $562.19 billion.

16. Meta generated $36.46 billion in revenue in Q1 2024, reflecting a 27% increase year-over-year.

17.  The company reported a net income of $12.37 billion in Q1 2024 – a significant 117% uptick from Q1 of 2023.

(Source) (Source)

Facebook User Statistics

18. Facebook had an average of 2.11 billion daily active users (DAUs) in 2023.

19. Facebook has approximately 3.07 billion monthly active users (MAUs).

20. That figure represents 37.7% of the total population and 57% of total internet users.

21. Facebook saw a 3.4% increase in MAUs between April 2023 and April 2024.

22. More than two-thirds of the world’s total internet users visit Facebook monthly.

23. English is the most represented language among Facebook users (53.8%), followed by Spanish (14.9%) and Hindi (8.5%).

24. Approximately seven in 10 US adults report ever using Facebook, second only to YouTube (83%).

25. A third of US teens aged 13-17 use Facebook, a decrease from 71% in 2014-2015.

26. 56.8% of Facebook users are male in 2024.

(Source) (Source) (Source) (Source) (Source)

Facebook Statistics By Location

27. 1.37 billion of Facebook’s MAUs are based in the Asia Pacific, making it the largest segment of the app’s users.

28. Europe and the US & Canada make up the next largest user groups.

29. Facebook’s global audience size, April 2023:

Country Active Facebook Users
India 369.9 million
US 186.4 million
Indonesia 135.1 million
Brazil 114.2 million
Mexico 93.3 million
The Philippines 91.9 million
Vietnam 75.6 million
Bangladesh 54.2 million
Thailand 51.6 million
Egypt 47.0 million

(Source) (Source)

Facebook Advertising

30. Advertisers can reach 2.24 billion users on Facebook in 2024, representing 41.3% of all internet users and 27.7% of the total population.

31. Among active Facebook users, 53.8% say they use the platform to follow or research brands and products. This ranks the platform second behind Instagram (62.7%) and ahead of TikTok (47.4%).

32. Male users aged 25-34 years old make up the largest portion of Facebook’s advertising audience (18.4%), followed by those aged 18-24 years old (13.5%).

33. Ad impressions on Meta’s Family of Apps (FoA), which includes Facebook, Instagram, WhatsApp, and Messenger, increased by 28% YoY in 2023.

(Source) (Source) (Source)

Facebook User Activities And Engagement

34. Active users primarily use the app to message friends and family, with 72.6% doing so regularly.

35. Posting or sharing photos or videos is a common activity for 63.2% of Facebook users.

36. Almost 60% of users leverage Facebook to keep up to date with news and current events.

37. Facebook is the go-to platform for news for three in 10 Americans, making it the most popular social platform for this purpose.

(Source) (Source)

Facebook Content And Engagement

38. Link posts account for 44.5% of Facebook posts.

39. Photo posts follow at 33.4%.

40. Video posts make up 18.9% of content.

41. Photo posts receive an average engagement rate of 0.35%, followed by video posts at 0.23%, and album posts at 0.22%.

(Source)

Most Followed Facebook Pages

42. The top 10 most followed Facebook pages are:

Brand Followers*
1 Facebook App 188 million
2 Cristiano Ronaldo 168 million
3 Samsung 161 million
4 Mr. Bean 140 million
5 5-Minute Crafts 126 million
6 Shakira 124 million
7 Real Madrid C.F. 121 million
7 CGTN 121 million
9 Will Smith 116 million
9 Lionel Messi 116 million

*Facebook followers as of January 2024

(Source)

In Summary

Say what you will about Facebook, but its enduring relevance is undeniable.

With extensive reach, a broad user base, and significant advertising potential, Facebook will remain a cornerstone of any social media strategy in 2024.

By understanding these trends and user behaviors – and leveraging many of the insights covered above – you can maximize the potential of Facebook to drive engagement, awareness, and impact.



Featured Image: Kaspars Grinvalds/Shutterstock

Organic SEO Strategy Guide: How To Boost Search Visibility & Drive Growth via @sejournal, @idigitalinear

This post was sponsored by Digitalinear. The opinions expressed in this article are the sponsor’s own.

In a world where consumers search online for nearly everything – from product recommendations to local service providers – your brand’s digital presence is everything.

The right organic SEO strategy can boost your search visibility and drive sustainable business growth.

SEO is becoming increasingly complex with sweeping algorithm changes, intense competition, and the recent flood of AI-generated content.

So how can you navigate these challenges to enhance your site’s visibility and raise brand awareness?

Here, we’ll break down a strategic approach to organic SEO, focusing on building a solid foundation and continuously optimizing and analyzing performance.

Step One: Web Inspection

The first step to address is the core user experience of your website.

SEO strategies should always begin with the end user, how you solve their problems, and how you communicate your value to them.

If your website is an obstacle to them, they’ll find solutions elsewhere. So, your first job is to clarify your intended user experience and make it seamless. This will take both audience research and technical optimization.

The audience research portion of inspecting your website should include:

  • Identify Success Metrics and Conversion Points: Your website must do its job well. You need a clear understanding of what you want users to do, which target audiences are likely to take those actions, what questions and pain points those users have, and how you can address them. Much of that strategy work comes later in the process, but for now, you need a clear view of your goals and intended user journeys. This will help you prioritize the most impactful technical fixes.
  • Assess Content Coverage and Quality: High-quality, relevant content is crucial for engaging users and ranking well in search results. Moreover, you need content that addresses the real needs of your target audiences at multiple stages of their journey. Understanding users’ needs and questions is critical to their experience and to lead them toward desired actions. Conduct a content audit to identify gaps and opportunities for improvement: identify low-engagement pages and analyze competitors to see what content you’re missing.

A comprehensive technical website audit should include:

  • Technical SEO: Ensure your site is technically sound by checking for issues like broken links, duplicate content, and proper use of meta tags. Many tools can assist with this, but individual tools may not provide a complete view. You may need to combine reports from multiple different tools.
  • Site Speed and Core Web Vitals: Slow-loading websites can deter visitors and negatively affect search rankings. Use tools like Google PageSpeed Insights to identify speed issues (a rough API-based sketch follows this list).
  • Mobile-Friendliness: Google uses mobile-first indexing. This means that mobile-friendliness isn’t only important for mobile users. Your website’s indexing and ranking depend on its mobile performance, no matter what device is being used to view it. The content of your pages should be the same and provide the same experience between desktop and mobile. Google’s Mobile-Friendly Test can help you assess your site’s performance on mobile devices. If you want to lean into mobile trends, then mobile app development allows you to provide mobile users with unique, seamless experiences.
  • User Experience (UX): A positive user experience encourages visitors to stay on your site longer, reducing bounce rates and improving SEO. Evaluate your site’s navigation, layout, and overall usability, keeping your success metrics and conversion points in mind. Users should be able to find the next steps quickly.
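
For the site speed point above, here is a rough sketch of pulling Core Web Vitals field data through Google’s public PageSpeed Insights API (v5). The page URL is a placeholder, response field names can change over time, and an API key is recommended for anything beyond occasional checks.

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

# Query the PageSpeed Insights API (v5) for a page and print its Core Web Vitals
# field data. The page URL is a placeholder; add &key=YOUR_API_KEY for regular use.
params = urlencode({"url": "https://www.example.com/", "strategy": "mobile"})
endpoint = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed?" + params

with urlopen(endpoint) as response:
    report = json.load(response)

# loadingExperience holds real-user (CrUX) metrics when enough field data exists.
metrics = report.get("loadingExperience", {}).get("metrics", {})
if not metrics:
    print("No field data available for this URL.")
for name, metric in metrics.items():
    print(name, metric.get("percentile"), metric.get("category"))
```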

A full website audit is both technical and strategic. Sometimes, you need an external perspective to accurately identify the UX and communication issues you might be encountering. This is where an SEO agency can provide the perspective, research, and dedicated resources a website needs for long-term success.

Digitalinear specializes in organic SEO, and packages begin with deep research into your business and niche alongside technical audits.

With a team of dedicated SEO professionals, Digitalinear performs a thorough website inspection using the best tools available, ensuring no stone is left unturned in your audit.

Step Two: Deep Optimization

Once you’ve identified the areas of your site that need improvement, the next step is to optimize.

Optimizing your website ensures that it meets search engines’ technical requirements while providing a seamless and engaging user experience.

This is where you should get into the fine details of keyword research and query intent matching, ensuring that your SEO goals align with the business goals of your website that you identified in step one.

Top rankings for keywords won’t have a business impact if you haven’t matched them to your core audience. Traffic won’t result in signups or sales if you’re not effectively engaging those users.

Research is one area where an SEO consultancy can be particularly helpful in providing objective competitor and industry analysis.

Here are some key research elements for optimization:

  • Keyword Research: Identify the most relevant and high-performing keywords for your industry. Use tools like SEMrush or Ahrefs to find keywords that your target audience is searching for.
  • Content Optimization: Intent has been a big deal in SEO lately, and you must optimize for it as well as keywords. Matching your content to users’ needs and intents is where all your research will pay off in engagement, retention, and conversion. Use keywords naturally and ensure your content answers your audience’s questions and needs. Ensure you have wide and deep coverage of relevant topics demonstrating your unique expertise. Build strong networks of internal links to help users and search engines navigate and parse your content.

Key optimization techniques that lead to higher search rankings include:

  • On-Page SEO: This broad category includes content-focused and technical implementations to make individual pages shine. Your research and analysis thus far should culminate in a page with exceptional user experience. The elements load quickly and provide a consistent experience. The content demonstrates your experience, expertise, authoritativeness, and trustworthiness (E-E-A-T). You implement keywords, metadata, and linking effectively.
  • Link Building: Build high-quality backlinks to your site from reputable sources. This can help improve your site’s authority and search rankings. Creating content people share and want to link to is the first step. Then, you can actively seek links through many outreach strategies, such as digital PR, email, and social channels.

A full optimization process is a ton of work. Digitalinear is an SEO agency with services designed to simplify your process.

Their expert team uses advanced tools and techniques to ensure your site is fully optimized for search engines and users alike.

Additionally, Digitalinear provides web design and development solutions to enhance user interface and overall experience.

Step Three: Analyze Growth

After optimizing your website, it’s important to continuously analyze its performance to refine and improve your strategies over time.

This involves monitoring key metrics to evaluate the success of your SEO efforts and make data-driven decisions for continuous improvement.

By regularly analyzing your growth, you can better understand what’s working and what’s not.

Here are some key SEO metrics to monitor:

  • Organic Traffic: Track the number of visitors coming to your site through organic search.
  • Engagement Rate: Monitor the percentage of sessions in which visitors actively engage with your content (the inverse of bounce rate).
  • Keyword Rankings: Keep an eye on how your targeted keywords are ranking over time.
  • Conversion Rates: Measure the percentage of visitors who take a desired action, such as making a purchase, filling out a form, etc.
  • Backlink Profile: Analyze the quantity and quality of backlinks pointing to your site.

Digitalinear offers ongoing SEO consultancy. They can help you select the best metrics for your business goals and track and analyze them.

Their expertise ensures you make informed decisions that lead to tangible results, helping you drive continuous growth and stay ahead of the competition.

Start Meeting & Exceeding Your Growth Goals With Digitalinear

To navigate the complexities of SEO and achieve your growth goals, it’s essential to have a strategic partner that takes the time to understand your business, where you are now and where you want to be.

This involves a holistic approach to organic SEO through comprehensive site audits, deep optimization techniques, and continuous performance analysis. The audience research and testing involved are ongoing processes of learning.

Digitalinear offers expert guidance and tailored SEO solutions to help you enhance your online presence and drive sustainable growth.

With a team of professionals dedicated to exceeding your business objectives, they ensure that every SEO strategy is optimized for success.

Learn more about how Digitalinear’s tailored SEO services can make a difference for your business.


Image Credits

Featured Image: Image by Digitalinear. Used with permission.

Google Shows How To Beat Reddit & Big Brands via @sejournal, @martinibuster

In an interview published on YouTube, Google’s Gary Illyes offered advice on what small sites should consider doing if they want to compete against Reddit, Amazon and other big brand websites.

About Big Brand Dominance

Google’s Gary Illyes answered questions about SEO back in May that went underreported, so I’m correcting that oversight this month. Gary answered a question about how to compete against Reddit and big brands.

While it may appear that Gary is skeptical that Reddit is dominating, he’s not disputing that perception and that’s not the context of his answer. The context is larger than Reddit because his answer is about the core issue of competing against big brands in the search engine results pages (SERPs).

This is the question that an audience member asked:

“Since Reddit and big publishers dominate nowadays in the SERPS for many keywords, what can the smaller brands do besides targeting the long tail keywords?”

The History Of Big Brands In The SERPs

Gary’s answer encompasses the entire history of big brands in the SERPs and the SEO response to that. About.com was a website about virtually any topic of interest and it used to rank for just about everything. It was like the Wikipedia of its day and many SEOs resented how About.com used to rank so well.

He first puts that context into his answer, that this complaint about Reddit is part of a long history of various brands ranking at the top of the SERPs then washing out of the SERPs as trends change.

Gary answered:

“So before I joined Google I was doing some SEO stuff for big publishers. …SEO type. Like I was also server manager like a cluster manager.

So, I would have had the same questions and in fact back in the day we saw these kind of questions all the time.

Now it’s Reddit. Back then it was Amazon. A few years before that, it was I think …About.com.

Pretty much every two years the name that you would put there …changes.”

Small Sites Can Outcompete Big Brands

Gary next shares that the history of SEO is also about small sites figuring out how to outcompete the bigger sites. This is also true. Some big sites started as small sites that figured out a way to outcompete larger big brand sites. For example, Reviewed.com, before it was purchased by USA Today, was literally started by a child whose passion for the topic contributed to it becoming massively successful.

Gary says that there are two things to do:

  1. Wait until someone else figures out how to outcompete and then copy them
  2. Or figure it out yourself and lead the way

But of course, if you wait for someone else to show the way it’s probably too late.

He continued:

“It seems that people always figure out ways to compete with whoever would be the second word in that question.

So it’s not like, oh my God, like everything sucks now and we can retire. It’s like, one thing you could do is to wait it out and let someone else come up with something for you that you can use to compete with Reddit and the big publishers that allegedly dominate nowadays the SERPs.

Or you sit down and you start thinking about how can you employ some marketing strategies that will boost you to around the same positions as the big publishers and Reddit and whatnot.

One of the most inspiring presentations I’ve seen was the empathetic marketing… do that. Find a way to compete with these positions in the SERPs because it is possible, you just have to find the angle to compete with them.”

Gary is right. Big brands are slowed down by bureaucracy and scared to take chances. As I mentioned about Reviewed.com, a good strategy can outrun the big brands all day long. I know this from my own experience and from knowing others who have done the same thing, including the founder of Reviewed.com.

Long Tail Keywords & Other Strategies

Gary next talked about long tail keywords. A lot of newbie SEO gurus define long tail keyword phrases as phrases with a lot of words in them. That’s 100% wrong. Long tail keyword phrases are keyword phrases that searchers rarely use. It’s the rareness of keyword use that makes them long tail, not how many words are in the keyword phrase.
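
To make that distinction concrete, here is a minimal sketch (my own illustration, not anything Gary or Google provides) that sorts queries into long tail and head by impression rarity rather than word count. The CSV file name, column names, and threshold are hypothetical placeholders for whatever query export you are working with.

```python
# A minimal sketch, not an official tool: classify queries as "long tail"
# by how rarely they are searched, not by how many words they contain.
# Assumes a hypothetical CSV export ("queries.csv") with columns
# "query" and "impressions"; the rarity threshold is arbitrary.
import csv

LONG_TAIL_MAX_IMPRESSIONS = 50  # illustrative cutoff, tune to your data

def split_queries(path: str) -> tuple[list[str], list[str]]:
    long_tail, head = [], []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            # Rarity is the criterion: a one-word query with almost no
            # impressions still counts as long tail.
            if int(row["impressions"]) <= LONG_TAIL_MAX_IMPRESSIONS:
                long_tail.append(row["query"])
            else:
                head.append(row["query"])
    return long_tail, head

if __name__ == "__main__":
    tail, head = split_queries("queries.csv")
    print(f"{len(tail)} long-tail queries, {len(head)} head queries")
```

Under this framing, a one-word query that almost nobody searches for is long tail, while a five-word query with heavy search volume is not.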

The context of this part of Gary’s answer is that the person asking the question essentially dismissed long tail search queries as the crumbs that the big brands leave behind for small sites.

Gary explains:

“And also the other thing is that, like saying that you are left with the long tail keywords. It’s like we see like 15 to even more percent of new long tail keywords every single day.

There’s lots of traffic in long tail keywords. You can jump on that bandwagon and capture a ton of traffic.”

Something left unmentioned is that conquering long tail keyword phrases is one way to create awareness that a site is about a topic. People come for the long tail and return for the head phrases (the queries with more traffic).

The problem with some small sites is that they’re trying to hit the big traffic keywords without first showing relevance in the long tail. Starting small and building up toward big is one of the secrets of successful sites.

Small Sites Can Be Powerful

Gary is right: there is a lot of traffic in the long tail and in emerging trends. The thing that small sites need to remember is that big sites move slowly and have to get through layers of bureaucracy in order to make a strategic decision. The stakes for them are also higher, so they’re not prone to take big swings either. Speed and the ability to make bold moves are the small site’s superpower. Exercise it.

I know from my own experience and from working with clients that it’s absolutely possible to outrank big sites that have been around for years. The history of SEO is littered with small sites that outpaced slower-moving, bigger sites.

Watch Gary answer this question at the 20-minute mark:

Featured Image by Shutterstock/Volodymyr TVERDOKHLIB

Google Explains Reasons For Crawled Not Indexed via @sejournal, @martinibuster

Back in May, Google’s Gary Illyes sat for an interview at the SERP Conf 2024 conference in Bulgaria and answered a question about the causes of crawled but not indexed, offering multiple reasons that are helpful for debugging and fixing this error.

Although the interview happened in May, the video went underreported, and not many people have actually watched it. I only heard of it because the always awesome Olesia Korobka (@Giridja) recently drew attention to the interview in a Facebook post.

So even though the interview happened in May, the information is still timely and useful.

Reason For Crawled – Currently Not Indexed

Crawled – currently not indexed refers to a status in the Google Search Console Page Indexing report, which indicates that a page was crawled by Google but was not indexed.

During a live interview someone submitted a question, asking:

“Can crawled but not indexed be a result of a page being too similar to other stuff already indexed?

So is Google suggesting there is enough other stuff already and your stuff is not unique enough?”

Google’s Search Console documentation doesn’t provide an answer as to why Google may crawl a page and not index it, so it’s a legitimate question.

Gary Illyes answered that yes, one of the reasons could be that similar content already exists. But he went on to say that there are other reasons, too.

He answered:

“Yeah, that could be one thing that it can mean. Crawled but not indexed is, ideally we would break up that category into more granular chunks, but it’s super hard because of how the data internally exists.

It can be a bunch of things, dupe elimination is one of those things, where we crawl the page and then we decide to not index it because there’s already a version of that or an extremely similar version of that content available in our index and it has better signals.

But yeah, but it can be multiple things.”
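
Google’s dupe elimination is far more sophisticated than anything visible from the outside, but if you want a rough, do-it-yourself signal of whether a page reads as “extremely similar” to something already published, a crude shingle-overlap check like the sketch below can flag obvious near-duplicates. This is only an illustration, and the file names are placeholders.

```python
# A crude near-duplicate check (illustration only; this is not Google's
# dupe-elimination method): Jaccard similarity over word shingles.
from pathlib import Path

def shingles(text: str, n: int = 5) -> set[str]:
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(max(len(words) - n + 1, 1))}

def jaccard(a: str, b: str) -> float:
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

# Compare your draft against a page that already ranks; scores near 1.0
# mean the two texts are nearly identical.
score = jaccard(Path("my_page.txt").read_text(), Path("ranking_page.txt").read_text())
print(f"Shingle overlap: {score:.2f}")
```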

General Quality Of Site Can Impact Indexing

Gary then called attention to another reason why Google might crawl a page but choose not to index it, saying that it could be a site quality issue.

Illyes then continued his answer:

“And the general quality of the site, that can matter a lot of how many of these crawled but not indexed you see in search console. If the number of these URLs is very high, that could hint at general quality issues.

And I’ve seen that a lot since February, where suddenly we just decided that we are not indexing a vast amount of URLs on a site just because …our perception of the site has changed.”

Other Reasons For Crawled Not Indexed

Gary next offered other reasons why URLs might be crawled but not indexed, saying that Google’s perception of the site could have changed or that it could be a technical issue.

Gary explained:

“…And one possibility is that when you see that number rising, that the perception of… Google’s perception of the site has changed, that could be one thing.

But then there could also be that there was an error, for example on the site and then it served the same exact page to every single URL on the site. That could also be one of the reasons that you see that number climbing.

So yeah, there could be many things.”
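
The “same exact page served to every single URL” failure Gary describes is one of the easier ones to check yourself. The sketch below is only an illustration (it assumes the third-party requests library is installed, and the URLs are placeholders): it fetches a few distinct URLs and compares content hashes, and if every URL comes back byte-identical, something on the site is probably misconfigured.

```python
# Sanity check: do distinct URLs on the site return the same exact page?
# Assumes the third-party `requests` library; URLs below are placeholders.
import hashlib
import requests

urls = [
    "https://example.com/",
    "https://example.com/products/",
    "https://example.com/blog/some-post/",
]

hashes = {url: hashlib.sha256(requests.get(url, timeout=10).content).hexdigest()
          for url in urls}

if len(set(hashes.values())) == 1:
    print("Warning: every URL returned byte-identical content.")
else:
    for url, digest in hashes.items():
        print(digest[:12], url)
```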

Takeaways

Gary provided answers that should help debug why a web page might be crawled but not indexed by Google.

  • Content is similar to content already ranked in the search engine results pages (SERPs)
  • Exact same content exists on another site that has better signals
  • General site quality issues
  • Technical issues

Although Illyes didn’t elaborate on what he meant about another site with better signals, I’m fairly certain he’s describing the scenario in which a site syndicates its content to another site, and Google chooses to rank the other site for the content instead of the original publisher.

Watch Gary answer this question at the 9-minute mark of the recorded interview:

Featured Image by Shutterstock/Roman Samborskyi