Don’t believe what you may have heard; Facebook is still a dominant social media force in 2024.
With over 3 billion active users, it remains a key player for businesses, marketers, and social media enthusiasts.
And despite the rise of newer, shinier platforms, Facebook’s expansive reach and diverse user base are still unrivaled, making it a powerful channel for both personal and business engagement.
In this article, we’ll highlight the latest Facebook statistics and facts, providing a comprehensive overview of its reach, user behavior, and influence.
Facebook Overview
1. Facebook is the world’s most-used social platform in 2024, with over 3 billion global active users.
2. It is the third most-used app globally among mobile users, trailing only WhatsApp and YouTube.
3. Facebook ranks third in terms of time spent (behind TikTok and YouTube), with users spending an average of 19 hours and 47 minutes in the Android app per month.
4. 64.1% of Facebook Android users open the app every day.
5. Facebook is the third most visited website in the US, with an estimated 2.90 billion monthly visits in April 2024.
6. Of its monthly US visitors, roughly 50.07% are on mobile and 49.93% are on desktop.
7. Globally, users spend an average of 3 minutes and 42 seconds on Facebook per app session.
8. Facebook is the second most searched query globally, with a search volume of 584.9 million.
9. Facebook is the fourth most downloaded social networking app in the US, behind Threads, WhatsApp, and Telegram.
30. Advertisers can reach 2.24 billion users on Facebook in 2024, representing 41.3% of all internet users and 27.7% of the total population.
31. Among active Facebook users, 53.8% say they use the platform to follow or research brands and products. This ranks the platform second behind Instagram (62.7%) and ahead of TikTok (47.4%).
32. Male users aged 25-34 years old make up the largest portion of Facebook’s advertising audience (18.4%), followed by those aged 18-24 years old (13.5%).
33. Ad impressions on Meta’s Family of Apps (FoA), which includes Facebook, Instagram, WhatsApp, and Messenger, increased by 28% YoY in 2023.
Say what you will about Facebook, but its enduring relevance is undeniable.
With extensive reach, a broad user base, and significant advertising potential, Facebook will remain a cornerstone of any social media strategy in 2024.
By understanding these trends and user behaviors – and leveraging many of the insights covered above – you can maximize the potential of Facebook to drive engagement, awareness, and impact.
This post was sponsored by Digitalinear. The opinions expressed in this article are the sponsor’s own.
In a world where consumers search online for nearly everything – from product recommendations to local service providers – your brand’s digital presence is everything.
The right organic SEO strategy can boost your search visibility and drive sustainable business growth.
SEO is becoming increasingly complex with sweeping algorithm changes, intense competition, and the recent flood of AI-generated content.
So how can you navigate these challenges to enhance your site’s visibility and raise brand awareness?
Here, we’ll break down a strategic approach to organic SEO, focusing on building a solid foundation and continuously optimizing and analyzing performance.
Step One: Web Inspection
The first step to address is the core user experience of your website.
SEO strategies should always begin with the end user, how you solve their problems, and how you communicate your value to them.
If your website is a new obstacle to them, they’ll find solutions elsewhere. So, your first job is to clarify your intended user experience and make it seamless. This will take both audience research and technical optimization.
The audience research portion of inspecting your website should include:
Identify Success Metrics and Conversion Points: Your website must do its job well. You need a clear understanding of what you want users to do, which target audiences are likely to take those actions, what questions and pain points those users have, and how you can address them. Much of that strategy work comes later in the process, but for now, you need a clear view of your goals and intended user journeys. This will help you prioritize the most impactful technical fixes.
Assess Content Coverage and Quality: High-quality, relevant content is crucial for engaging users and ranking well in search results. Moreover, you need content that addresses the real needs of your target audiences at multiple stages of their journey. Understanding users’ needs and questions is critical to their experience and to lead them toward desired actions. Conduct a content audit to identify gaps and opportunities for improvement: identify low-engagement pages and analyze competitors to see what content you’re missing.
A comprehensive technical website audit should include:
Technical SEO: Ensure your site is technically sound by checking for issues like broken links, duplicate content, and proper use of meta tags. Many tools can assist with this, but individual tools may not provide a complete view. You may need to combine reports from multiple different tools.
Site Speed and Core Web Vitals: Slow-loading websites can deter visitors and negatively affect search rankings. Use tools like Google PageSpeed Insights to identify speed issues.
Mobile-Friendliness: Google uses mobile-first indexing. This means that mobile-friendliness isn’t only important for mobile users. Your website’s indexing and ranking depend on its mobile performance, no matter what device is being used to view it. The content of your pages should be the same and provide the same experience between desktop and mobile. Google’s Mobile-Friendly Test can help you assess your site’s performance on mobile devices. If you want to lean into mobile trends, then mobile app development allows you to provide mobile users with unique, seamless experiences.
User Experience (UX): A positive user experience encourages visitors to stay on your site longer, reducing bounce rates and improving SEO. Evaluate your site’s navigation, layout, and overall usability, keeping your success metrics and conversion points in mind. Users should be able to find the next steps quickly.
A full website audit is both technical and strategic. Sometimes, you need an external perspective to accurately identify the UX and communication issues you might be encountering. This is where an SEO agency can provide the perspective, research, and dedicated resources a website needs for long-term success.
Digitalinear specializes in organic SEO, and packages begin with deep research into your business and niche alongside technical audits.
With a team of dedicated SEO professionals, Digitalinear performs a thorough website inspection using the best tools available, ensuring no stone is left unturned in your audit.
Step Two: Deep Optimization
Once you’ve identified the areas of your site that need improvement, the next step is to optimize.
Optimizing your website ensures that it meets search engines’ technical requirements while providing a seamless and engaging user experience.
This is where you should get into the fine details of keyword research and query intent matching, ensuring that your SEO goals align with the business goals of your website that you identified in step one.
Top rankings for keywords won’t have a business impact if you haven’t matched them to your core audience. Traffic won’t result in signups or sales if you’re not effectively engaging those users.
Research is one area where an SEO consultancy can be particularly helpful in providing objective competitor and industry analysis.
Here are some key research elements for optimization:
Keyword Research: Identify the most relevant and high-performing keywords for your industry. Use tools like SEMrush or Ahrefs to find keywords that your target audience is searching for.
Content Optimization: Intent has been a big deal in SEO lately, and you must optimize for it as well as keywords. Matching your content to users’ needs and intents is where all your research will pay off in engagement, retention, and conversion. Use keywords naturally and ensure your content answers your audience’s questions and needs. Ensure you have wide and deep coverage of relevant topics demonstrating your unique expertise. Build strong networks of internal links to help users and search engines navigate and parse your content.
Key optimization techniques that lead to higher search rankings include:
On-Page SEO: This broad category includes content-focused and technical implementations to make individual pages shine. Your research and analysis thus far should culminate in a page with exceptional user experience. The elements load quickly and provide a consistent experience. The content demonstrates your experience, expertise, authoritativeness, and trustworthiness (E-E-A-T). You implement keywords, metadata, and linking effectively.
Link Building: Build high-quality backlinks to your site from reputable sources. This can help improve your site’s authority and search rankings. Creating content people share and want to link to is the first step. Then, you can actively seek links through many outreach strategies, such as digital PR, email, and social channels.
A full optimization process is a ton of work. Digitalinear is an SEO agency with services designed to simplify your process.
Their expert team uses advanced tools and techniques to ensure your site is fully optimized for search engines and users alike.
Additionally, Digitalinear provides web design and development solutions to enhance user interface and overall experience.
Step Three: Performance Analysis
This step involves monitoring key metrics to evaluate the success of your SEO efforts and make data-driven decisions for continuous improvement.
By regularly analyzing your growth, you can better understand what’s working and what’s not.
Here are some key SEO metrics to monitor:
Organic Traffic: Track the number of visitors coming to your site through organic search.
Engagement Rate: Monitor the percentage of sessions in which visitors actively engage with your content.
Keyword Rankings: Keep an eye on how your targeted keywords are ranking over time.
Conversion Rates: Measure the percentage of visitors who take a desired action, such as making a purchase, filling out a form, etc.
Backlink Profile: Analyze the quantity and quality of backlinks pointing to your site.
Digitalinear offers ongoing SEO consultancy. They can help you select the best metrics for your business goals and track and analyze them.
Their expertise ensures you make informed decisions that lead to tangible results, helping you drive continuous growth and stay ahead of the competition.
Start Meeting & Exceeding Your Growth Goals With Digitalinear
To navigate the complexities of SEO and achieve your growth goals, it’s essential to have a strategic partner that takes the time to understand your business, where you are now and where you want to be.
This involves a holistic approach to organic SEO through comprehensive site audits, deep optimization techniques, and continuous performance analysis. The audience research and testing involved are ongoing processes of learning.
Digitalinear offers expert guidance and tailored SEO solutions to help you enhance your online presence and drive sustainable growth.
With a team of professionals dedicated to exceeding your business objectives, they ensure that every SEO strategy is optimized for success.
Learn more about how Digitalinear’s tailored SEO services can make a difference for your business.
In an interview published on YouTube, Google’s Gary Illyes offered advice on what small sites should consider doing if they want to compete against Reddit, Amazon and other big brand websites.
About Big Brand Dominance
Google’s Gary Illyes answered questions about SEO back in May that went underreported, so I’m correcting that oversight this month. Gary answered a question about how to compete against Reddit and big brands.
While it may appear that Gary is skeptical that Reddit is dominating, he’s not disputing that perception and that’s not the context of his answer. The context is larger than Reddit because his answer is about the core issue of competing against big brands in the search engine results pages (SERPs).
This is the question that an audience member asked:
“Since Reddit and big publishers dominate nowadays in the SERPS for many keywords, what can the smaller brands do besides targeting the long tail keywords?”
The History Of Big Brands In The SERPs
Gary’s answer encompasses the entire history of big brands in the SERPs and the SEO response to that. About.com was a website about virtually any topic of interest and it used to rank for just about everything. It was like the Wikipedia of its day and many SEOs resented how About.com used to rank so well.
He first puts that context into his answer, that this complaint about Reddit is part of a long history of various brands ranking at the top of the SERPs then washing out of the SERPs as trends change.
Gary answered:
“So before I joined Google I was doing some SEO stuff for big publishers. …SEO type. Like I was also server manager like a cluster manager.
So, I would have had the same questions and in fact back in the day we saw these kind of questions all the time.
Now it’s Reddit. Back then it was Amazon. A few years before that, it was I think …About.com.
Pretty much every two years the name that you would put there …changes.”
Small Sites Can Outcompete Big Brands
Gary next shares that the history of SEO is also about small sites figuring out how to outcompete the bigger sites. This is also true. Some big sites started as small sites that figured out a way to outcompete larger big brand sites. For example, Reviewed.com, before it was purchased by USA Today, was literally started by a child whose passion for the topic contributed to it becoming massively successful.
Gary says that there are two things to do:
Wait until someone else figures out how to outcompete and then copy them
Or figure it out yourself and lead the way
But of course, if you wait for someone else to show the way it’s probably too late.
He continued:
“It seems that people always figure out ways to compete with whoever would be the second word in that question.
So it’s not like, oh my God, like everything sucks now and we can retire. It’s like, one thing you could do is to wait it out and let someone else come up with something for you that you can use to compete with Reddit and the big publishers that allegedly dominate nowadays the SERPs.
Or you sit down and you start thinking about how can you employ some marketing strategies that will boost you to around the same positions as the big publishers and Reddit and whatnot.
One of the most inspiring presentations I’ve seen was the empathetic marketing… do that. Find a way to compete with these positions in the SERPs because it is possible, you just have to find the the angle to compete with them.”
Gary is right. Big brands are slowed down by bureaucracy and scared to take chances. As I mentioned about Reviewed.com, a good strategy can outrun the big brands all day long, I know this from my own experience and from knowing others who have done the same thing, including the founder of Reviewed.com.
Long Tail Keywords & Other Strategies
Gary next talked about long tail keywords. A lot of newbie SEO gurus define long tail keyword phrases as phrases with a lot of words in them. That’s 100% wrong. Long tail keyword phrases are phrases that searchers rarely use. It’s the rarity of the query that makes it long tail, not how many words are in the keyword phrase.
The context of this part of Gary’s answer is that the person asking the question essentially dismissed long tail search queries as the crumbs that the big brands leave behind for small sites.
Gary explains:
“And also the other thing is that, like saying that you are left with the long tail keywords. It’s like we see like 15 to even more percent of new long tail keywords every single day.
There’s lots of traffic in long tail keywords. You you can jump on that bandwagon and capture a ton of traffic.”
Something left unmentioned is that conquering long tail keyword phrases is one way to create awareness that a site is about a topic. People come for the long tail and return for the head phrases (the queries with more traffic).
The problem with some small sites is that they’re trying to hit the big traffic keywords without first showing relevance in the long tail. Starting small and building up toward big is one of the secrets of successful sites.
Small Sites Can Be Powerful
Gary is right: there is a lot of traffic in the long tail and emerging trends. The thing small sites need to remember is that big sites move slowly and have to get through layers of bureaucracy to make a strategic decision. The stakes for them are also higher, so they’re not prone to take big swings either. Speed and the ability to make bold moves are the small site’s superpower. Exercise it.
I know from my own experience and from working with clients that it’s absolutely possible to outrank big sites that have been around for years. The history of SEO is littered with small sites that outpaced the slower-moving bigger sites.
Watch Gary answer this question at the 20 minute mark:
Featured Image by Shutterstock/Volodymyr TVERDOKHLIB
Back in May Google’s Gary Illyes sat for an interview at the SERP Conf 2024 conference in Bulgaria and answered a question about the causes of crawled but not indexed, offering multiple reasons that are helpful for debugging and fixing this error.
Although the interview happened in May, the video of the interview went underreported and not many people have actually watched it. I only heard of it because the always awesome Olesia Korobka (@Giridja) recently drew attention to the interview in a Facebook post.
So even though the interview happened in May, the information is still timely and useful.
Reason For Crawled – Currently Not Indexed
Crawled Currently Not Indexed is a reference to an error report in the Google Search Console Page Indexing report which alerts that a page was crawled by Google but was not indexed.
During a live interview someone submitted a question, asking:
“Can crawled but not indexed be a result of a page being too similar to other stuff already indexed?
So is Google suggesting there is enough other stuff already and your stuff is not unique enough?”
Google’s search console documentation doesn’t provide an answer as to why Google may crawl a page and not index it, so it’s a legitimate question.
Gary Illyes answered that yes, one of the reasons could be that there is already other content that is similar. But he also goes on to say that there are other reasons, too.
He answered:
“Yeah, that that could be one thing that it can mean. Crawled but not indexed is, ideally we would break up that category into more granular chunks, but it’s super hard because of how the data internally exists.
It can be a bunch of things, dupe elimination is one of those things, where we crawl the page and then we decide to not index it because there’s already a version of that or an extremely similar version of that content available in our index and it has better signals.
But yeah, but it it can be multiple things.”
General Quality Of Site Can Impact Indexing
Gary then called attention to another reason why Google might crawl but choose not to index a site, saying that it could be a site quality issue.
Illyes then continued his answer:
“And the general quality of the of the site, that can matter a lot of how many of these crawled but not indexed you see in search console. If the number of these URLs is very high that could hint at general quality issues.
And I’ve seen that a lot since February, where suddenly we just decided that we are indexing a vast amount of URLs on a site just because …our perception of the site has changed.”
Other Reasons For Crawled Not Indexed
Gary next offered other reasons why URLs might be crawled but not indexed, saying that Google’s perception of the site may have changed, or that there may be a technical issue.
Gary explained:
“…And one possibility is that when you see that number rising, that the perception of… Google’s perception of the site has changed, that could be one thing.
But then there could also be that there was an error, for example on the site and then it served the same exact page to every single URL on the site. That could also be one of the reasons that you see that number climbing.
So yeah, there could be many things.”
Takeaways
Gary provided answers that should help debug why a web page might be crawled but not indexed by Google.
Content is similar to content already ranked in the search engine results pages (SERPs)
Exact same content exists on another site that has better signals
General site quality issues
Technical issues
Although Illyes didn’t elaborate on what he meant about another site with better signals, I’m fairly certain that he’s describing the scenario when a site syndicates its content to another site and Google chooses to rank the other site for the content and not the original publisher.
Watch Gary answer this question at the 9 minute mark of the recorded interview:
Generative AI, SGE, and now AI Overviews have been hot topics since the launch of ChatGPT in November 2022, which gave Gen AI an accessible interface to the wide market.
Since then, the SEO industry has been trying to figure out just how much search behavior will change and how much this will impact organic search traffic.
Will we see the catastrophic drops in clicks that are being estimated?
Google’s aim is to integrate Gen AI into search to provide better answers and, in its words:
Sometimes you want a quick answer, but you don’t have time to piece together all the information you need. Search will do the work for you with AI Overviews.
However, there has been much contention and discussion about this because, in practice, the results are somewhat unpredictable – with AI Overviews serving advice such as the health benefits of running with scissors, taking a bath with a toaster, and adding glue to pizza to make the cheese stick.
Google is still experimenting with AIO. Recently (June 19), a study from SE Ranking showed the frequency of AIO in SERPs has reduced from 64% to 8%. Meanwhile, BrightEdge reports that Google went from showing AI on 84% of queries to 15%.
Google also keeps experimenting with how AIO results appear in SERPs, and the latest iteration features citations in the top carousel.
Gen AI is disrupting the industry faster than anything else in the 25-year history of SEO. Some of the main discussion points for SEO include: How much is AI plagiarizing content, and how much do we need to pivot our approach to SEO?
In Your Opinion, What Do You Think About AI Overviews, How Will They Impact The Industry, And Where Is This Going?
This is what Pedro had to say:
“As I have mentioned before, Google wants to be your personal assistant and not your friendly librarian.
This is an important distinction, to see Google from this perspective moving forward. Instead of pointing us to the books, they will do the work for us.
If we continue to put out content that only requires a quick answer, this is where we will be disrupted. We need to focus on what people want beyond quick answers.
Google wants to be the personal assistant and caters for this by providing quick answers.
AI Overviews is no more than an evolution of instant answers.
If a site owner wants to target the quick answers, they should also be putting effort into more in-depth content that you can funnel your readers to and ideally, is closed to Google.
By doing this, you can protect the content assets you build.
We need to focus more than ever on building our own communities with users aligned to our brands. And doing more than simply providing a ‘this will do’ snippet, or an instant answer.
Right now, it’s impossible to predict how AIO will develop and what the format will be. Google keeps changing how it is presenting the SERP results and playing with the format much like live beta testing.
But, AIO will trigger different search behaviors.
Before, in SEO, we had ten blue links and no instant answers. From this, users would have to visit your website to get the answer, so a site could get considerable traffic for a basic question.
However, this type of traffic has little value, and these are not your customers – they are Google’s customers.
We need to understand how we can distinguish between instant answer traffic and users who want to consume our content. And this is the area where we should put our efforts.
Focus on building content for the people who don’t want the summary or the quick answer. Those who want to ‘read the book’ and consume the details to augment their knowledge.
In the same way that the web disrupted the music industry and the publisher industry, we are about to go through another change and we have to adapt to it. It’s a matter of time – when and not if.”
How Can You Leverage AIO And Google To Build A Content Community?
I asked Pedro:
“If we want to embrace this new approach, it will require thinking about how to gain users from a ‘take all the traffic you can get’ mentality to a selective one – leveraging Google to provide targeted traffic that you can absorb into your own community.
This will be a big change for some, so how can you leverage Google to achieve this?”
Pedro responded:
“Trying to figure out how much ‘discovery’ traffic Google will take away will be different for all verticals. For example, in the legal industry, or accountancy, the industry is based on consultants who understand and are gatekeepers to complex rules.
You can now ask AI to explain complex legislation on wider topics. But, if you have a specific scenario, you still need to visit a specialist who can deal with this for you.
AI can give you the wider information, but the expert is still needed for the detail.
As professionals in SEO, we can create content that covers broad concepts that AI can tap into. And then, for the specific scenarios and questions, this is where we can build out much more in-depth content.
This in-depth content is kept away from Google and AI and gated for your community or clients.
Every business will need to consider where to draw this line of what they give away for free and what they keep back as a premium.
AI came along to create more distance between those who know something and those who are specialists and will be sources of information.
The middle ground is about to disappear.
The professionals will remain because industries rely on the knowledge and the research these people do. And the rest will just be the rest.
Users will be divided into those who want a little information from AI and then the others who want specialist in-depth knowledge.
Being able to discern where you fit into this scenario and being able to create a strategy around this is how you can adapt.”
Fundamental Rules Never Change
I think we can expect more experimentation from Google before we begin to embrace AI in SERPs and SEO.
During a time of great flux, the best thing we can focus on is the fundamental rules that never change. And those fundamentals are all centered around how a brand builds a direct relationship with their user.
For SEO pros, it could be a challenging shift to move away from the mindset of chasing high-volume keyword traffic. Instead, focus on building user journeys and considering content touchpoints where relevant.
The old days of gaining huge amounts of traffic by ranking for one high-volume keyword are becoming outdated. Moving forward, more effort will be needed to achieve far fewer clicks. However, those clicks should be far more relevant and beneficial.
Thank you to Pedro Dias for offering his opinion and being my guest on IMHO.
SEO metrics such as rankings are great, but what matters most is SEO’s impact on business growth.
This means you can only understand SEO’s value if you track the right metrics and how they tie into revenue.
Your metrics should focus on:
Audience quality: Are you attracting visitors who are likely to become customers?
Engagement and behavior: Are users finding the information they need, spending time on your site, and taking desired actions?
Conversions: Is your organic traffic translating into desired outcomes?
Brand impact: Is SEO influencing your brand’s reputation and visibility?
In this article, we’ve categorized important metrics to focus on at a high level.
User Engagement Metrics
Here are some user engagement metrics to track:
Bounce Rate
Bounce rate is the percentage of users who return to the SERP or exit the webpage (and your site) without interacting with another page on your website. In GA4, a bounce is any session that does not qualify as an engaged session. A high bounce rate can indicate that visitors are not finding what they want on your site, causing them to exit quickly.
Why Is Bounce Rate Important In SEO?
Bounce rate helps you fix issues such as:
User Experience: A high bounce rate may indicate issues with your website’s content, design, or alignment with user intent. When you notice a high bounce rate, address these issues to improve user experience.
SEO Rankings: Search engines aim to provide users with the most relevant results. A high bounce rate may signal to Google that your site is not meeting user expectations, which can cost you visibility. It also hurts conversions, since those users never engaged with your page.
How To Analyze Bounce Rate
In Google Analytics 4, divide the number of single-page (non-engaged) sessions by the total number of sessions, then multiply by 100. The result is your bounce rate.
For example, if your website received 500 visitors and 100 interacted with more than one page, then 400 visitors bounced. Therefore, your bounce rate would be 80% (400 single-page visits / 500 total visits × 100).
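The arithmetic above can be sketched as a small helper. This is an illustrative function of our own, not part of any GA4 API; the numbers match the example:

```python
def bounce_rate(total_sessions: int, multi_page_sessions: int) -> float:
    """Percentage of sessions that ended without a second page interaction."""
    bounced = total_sessions - multi_page_sessions
    return bounced / total_sessions * 100

# 500 total visits, 100 of which viewed more than one page -> 400 bounced
print(bounce_rate(500, 100))  # 80.0
```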
Engaged Session Duration
Engaged session duration measures the amount of time a user actively spends on your website during an engaged session. This metric indicates how long users interact with your content before leaving the site or becoming inactive.
For example, if a user searches for “best running shoes,” clicks on your link, spends three minutes reading your content, and continues to interact with other parts of your site, the engaged session duration is three minutes.
Why Is Engaged Session Duration Important In SEO?
It indicates content engagement and relevance: Longer engaged session durations show that users find your content valuable and are willing to spend time interacting with it.
It impacts rankings: High-engaged session durations signal to search engines that your page provides content that satisfies user intent, which can improve your rankings.
It helps gauge content effectiveness: If users spend more time on your site, it suggests your content is meeting their expectations and providing the information they need.
How To Analyze Engaged Session Duration
Open GA4 and click on Reports in the left-hand menu.
Choose the Traffic acquisition: Session default channel group.
Click on the pencil icon at the top right corner and select Metrics.
In the bottom search box that says Add metric, type “average engagement time” and hit Apply.
Screenshot from GA4, June 2024
Engaged Sessions Per User
Engaged sessions per user is a metric that measures how frequently users interact meaningfully with your website.
In Google Analytics, an engaged session is defined by user activity that includes spending a certain amount of time on the site, viewing multiple pages, or completing specific actions like form submissions or purchases.
For example, if a user lands on your homepage, spends more than a minute exploring your content, clicks on a product page, and completes a form, this counts as an engaged session.
Why Is Engaged Sessions Per User Important To SEO?
It reflects user engagement and satisfaction: High engaged sessions per user indicate that visitors find your content valuable and are willing to interact with it in a meaningful way.
It impacts SEO positively: Search engines use engagement metrics as signals of content quality and relevance. High engagement suggests that your site meets user needs, which can boost your rankings.
How To Calculate Engaged Sessions Per User
Google Analytics provides this metric directly, but to calculate it manually, divide the total number of engaged sessions by the number of unique users.
For example, if your website had 50,000 engaged sessions and 20,000 unique users in a month, engaged sessions per user equals 50,000 divided by 20,000 (2.5).
This means, on average, each user had 2.5 engaged sessions during that month.
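The calculation above is simple enough to sketch in Python, using the example figures from the text:

```python
def engaged_sessions_per_user(engaged_sessions: int, unique_users: int) -> float:
    """Average number of engaged sessions each user had in the period."""
    if unique_users == 0:
        raise ValueError("unique_users must be greater than zero")
    return engaged_sessions / unique_users

# Example from the text: 50,000 engaged sessions across 20,000 unique users.
print(engaged_sessions_per_user(50_000, 20_000))  # → 2.5
```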
Organic Conversion Rate
The organic conversion rate is the percentage of visitors who find your website through organic search results and complete a desired action. This could be:
Making a purchase (usually on ecommerce sites).
Submitting a lead form (for businesses focused on lead gen).
Subscribing to a newsletter (to build an email list).
Or any other goal that moves them further along the customer journey.
This metric shows how SEO drives valuable clicks that contribute to your business objectives.
How To Calculate the Organic Conversion Rate
Determine what constitutes a conversion for your business (e.g., form completion, sales, subscription).
Track the number of users who complete the desired action and the total number of organic visitors over a specific period.
Divide the number of conversions by the total number of organic visitors, then multiply by 100 to get a percentage.
In other words: organic conversion rate = (conversions ÷ organic visitors) × 100. So if 500 out of 10,000 organic visitors complete the desired action, the conversion rate is 5%.
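The same calculation, expressed as a small Python helper with the example numbers from the text:

```python
def organic_conversion_rate(conversions: int, organic_visitors: int) -> float:
    """Percentage of organic visitors who completed the desired action."""
    if organic_visitors == 0:
        raise ValueError("organic_visitors must be greater than zero")
    return conversions / organic_visitors * 100

# Example from the text: 500 conversions from 10,000 organic visitors.
print(organic_conversion_rate(500, 10_000))  # → 5.0
```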
Goal Completions
A goal completion is recorded whenever a user completes a specific action you’ve defined as valuable, such as the actions bulleted out in the previous section.
Goal completions matter because they tell you whether your SEO is driving the right traffic and whether visitors are taking the actions you want them to take.
How To Track Goal Completions
Choose an analytics platform such as GA4, Adobe Analytics, Matomo, etc.
Define your goals and be specific (e.g., “purchase confirmation page viewed”).
Set up goal tracking.
For this article, we’ll use GA4, and tracking looks like this:
Go to the Admin section.
In the Property column, click on Events.
Click the “Create Event” button to set up a new event.
Name your event (e.g., “form_submission” or “purchase_completed”).
Define the conditions for your event. For example, if tracking a form submission, set parameters like event name equals “form_submit” or similar.
Click Create to save your new event.
Mark that event as a Key Event (conversion).
Then, monitor and analyze the reports to track goal completions.
Screenshot from GA4, June 2024
Ecommerce Transactions
In ecommerce, a conversion is completing a desired action that generates revenue.
The most apparent conversion is a purchase, but other valuable actions include adding items to a cart, creating an account, or subscribing to emails.
What Does Tracking Ecommerce Transactions Look Like?
A user searches for [best running shoes] on Google.
They click on your blog post, “Top 10 Running Shoes for 2024,” which ranks high in organic search results.
They read your review and click on the buy button link to a product page on your website.
They add the shoes to their cart and complete the purchase.
If you set up ecommerce tracking in GA4, it’ll track the entire customer journey (from product view to purchase).
UTM parameters will identify the blog post as the conversion source, your attribution model will assign credit to the post, and your CRM can link the purchase to the user’s profile for further analysis.
Organic traffic volume is the number of visitors arriving at your website through unpaid search results – organic clicks from search engine result pages (SERPs).
High organic traffic indicates that search engines consider your website relevant and authoritative for your target keywords.
This way, as long as you publish quality content, your website can attract and convert users without relying on paid advertising.
How To Measure Organic Traffic
Log into GA4 and go to Acquisition Reports. Navigate to Reports > Acquisition > Traffic Acquisition.
This report provides a detailed breakdown of your traffic sources, including organic search.
Organic Traffic Value
Organic traffic value goes beyond numbers to assess the actual worth of visitors your SEO efforts attract. It quantifies the potential revenue or business impact of your organic traffic.
Organic traffic value is ROI-focused; it answers the question, “What is the monetary value of the organic traffic we’re getting?”
The answer then informs decisions on how to allocate marketing resources.
How To Calculate Organic Traffic Value
You can either use the cost-per-click (CPC), conversion-based value, or the customer lifetime value (LTV) metrics:
The CPC method estimates the value of organic traffic by calculating how much you would have spent on paid advertising (PPC) to get the same number of clicks. It uses the average CPC for your target keywords.
If your website receives 1,000 organic clicks per month for a keyword with an average CPC of $2, the estimated organic traffic value would be $2,000.
The conversion-based value metric calculates the revenue generated from organic traffic by tracking conversions and assigning a value to each conversion. For example, if your website receives 1,000 organic visitors and 50 convert into customers with an average order value of $100, the organic traffic value would be $5,000.
Another method is the customer lifetime value (LTV). This method takes a long-term view by considering the total value a customer brings over their entire relationship with your business. It factors in repeat purchases, customer retention, and average order value.
For example, if your average customer from organic search makes three purchases per year with an average order value of $100 and remains a customer for two years, their LTV would be $600.
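The three valuation methods above boil down to simple arithmetic. A minimal sketch in Python, using the example figures from the text:

```python
def cpc_value(organic_clicks: int, avg_cpc: float) -> float:
    """CPC method: what the same clicks would have cost as paid traffic."""
    return organic_clicks * avg_cpc

def conversion_value(conversions: int, avg_order_value: float) -> float:
    """Conversion-based method: revenue generated by organic conversions."""
    return conversions * avg_order_value

def customer_ltv(purchases_per_year: float, avg_order_value: float, years: float) -> float:
    """LTV method: total value a customer brings over the relationship."""
    return purchases_per_year * avg_order_value * years

# Examples from the text:
print(cpc_value(1_000, 2.0))        # → 2000.0 (1,000 clicks at $2 CPC)
print(conversion_value(50, 100.0))  # → 5000.0 (50 customers at $100 AOV)
print(customer_ltv(3, 100.0, 2))    # → 600.0  (3 purchases/year for 2 years)
```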
Technical SEO Metrics
Technical SEO metrics provide insights into your website’s infrastructure to ensure search engines can access, crawl, and index your content. Here are some metrics to focus on:
Crawl Errors
Crawl errors occur when search engine bots (like Googlebot) encounter issues while crawling pages on your website.
These errors can prevent search engines from understanding your content, potentially leading to lower rankings and visibility in SERPs.
Types of Crawl Errors
404 (Not Found): The requested page doesn’t exist. This could be due to a broken link, a deleted page, or a typo in the URL.
5xx (Server Errors): The server encountered an error while processing the request. This could be due to a temporary outage, a misconfiguration, or a server overload.
Robots.txt Errors: The robots.txt file blocks search engine bots from accessing certain pages or sections of your website.
How To Identify Crawl Errors
Head to Google Search Console (GSC). Go to Index > Coverage to see a list of crawl errors and warnings. Click on each error for more details, including the affected URLs and the error type. Then, prioritize the most critical errors, such as 404 errors on essential pages.
Create a 301 redirect to the new URL if the page has been moved permanently.
Create a helpful custom 404 page that guides users back to relevant content.
Afterward, validate your fixes using the URL Inspection tool in GSC to test if the fixed page can be crawled and indexed correctly.
Indexation Status
Indexation status refers to whether or not a specific webpage has been added to a search engine’s index.
When a page is indexed, it appears in search results when users search for relevant queries. In contrast, if a page is not indexed, it’s invisible to search engines and won’t be found by users.
How To Ensure Proper Indexing of Pages
Create high-quality, unique content and use relevant keywords to signal to search engines what your page is about.
Submit a sitemap to help search engines discover and crawl your pages.
Optimize internal linking to help search engine bots navigate your site and discover all your pages.
Check Robots.txt to ensure your txt file is not blocking search engines from crawling and indexing critical pages.
Monitor indexation status by checking the Index > Coverage report in GSC to see which pages have been indexed and if there are any indexing errors.
Screenshot from Google Search Console, June 2024
Site Speed
Site speed is the time a website’s content takes to load and become fully interactive for users. Think of it as the digital stopwatch that measures the responsiveness and efficiency of your website.
Why Is Site Speed Important for SEO?
User experience (UX): Studies have shown that users expect websites to load within a few seconds. Fast website speed keeps users engaged, encourages them to explore more pages, consume more content, and ultimately convert into customers or leads. It also enhances the mobile experience.
Search engine rankings: Search engines prioritize faster websites because they provide a better user experience, which can help your faster website outrank slower competitors.
Content Performance Metrics
These metrics show how effective your content is:
Content Engagement
Content engagement measures users’ level of interaction and involvement with your web pages.
It goes beyond passive consumption and delves into how visitors actively engage with your content to indicate genuine interest and value.
How To Measure Content Engagement
In GA4, track metrics like average engagement time, sessions, and engagement rate to gauge how long users actively interact with your content. You can also implement event tracking to measure specific interactions (video views, downloads, form submissions, or clicks on internal links).
Use heatmaps and session recording tools like Hotjar or Crazy Egg to visualize how users interact with your pages. This will reveal where they click, scroll, and spend the most time.
Content Shares And Backlinks
Content shares, or social signals, are the number of times your content is shared across social media platforms.
Social shares indicate that your content is valuable and worthy of being shared and can amplify reach, build brand awareness, and attract backlinks.
Backlinks, on the other hand, are links from external websites that point to your web pages. Quality backlinks from other authoritative sites act as “votes of confidence” and signal to search engines that your content is trustworthy and authoritative.
High-quality backlinks can boost rankings, drive referral traffic from other websites, and increase your domain authority.
To track social shares, use the built-in analytics tools provided by social media platforms to track the number of shares, likes, comments, and overall engagement for your content. You can also use third-party tools like Hootsuite or Buffer.
To track backlinks, use tools like Ahrefs, Semrush, or Moz to see your total backlinks, referring domains, and link quality.
Local SEO Metrics
Local SEO ensures your business appears when users search for products or services in your geographic area. Let’s start with getting insights from Google Business.
Google Business Profile Insights
Google Business Profile (GBP) is a free tool for businesses to manage their online presence across Google, including Search and Maps.
GBP Insights provides valuable data on how customers find and interact with your business listing.
How To Track GBP Performance
Log in to your GBP account and click the Insights tab. Look for the section titled How customers search for your business.
You’ll see a breakdown of:
Direct searches: when customers search for your business name directly (branded searches).
Discovery searches: when customers search for a general category, product, or service that you offer (non-branded searches).
Maps searches: when customers find your business through Google Maps.
Image from Google Support, June 2024
In the same Insights tab, look for the section called Where customers view your business on Google. It will show whether customers find your listing more often in Search results or Maps.
Image from Google Support, June 2024
Also, check for customer actions in the Insights tab. Here, you can track website visits, calls directly from your listing, and direct requests to your location. This data reveals how customers engage with your business after finding your listing.
Other data to track include photo views and search queries.
Local Search Rankings
Local search rankings refer to your business’s position in the SERPs for queries with local intent.
These searches include location-specific keywords like “coffee shops near me” or “best dentist in Albany.”
Local search results often include a map pack (a group of three to four businesses displayed on a map) and organic listings.
How to Track Local SEO Success
Track local keyword rankings using tools like Semrush, Ahrefs, or Moz Local. Monitor your rankings for critical local keywords, as well as your map pack rankings and organic rankings.
Monitor GBP Insights to learn how customers find your business, what actions they take, and which search queries they use.
Analyze local traffic and conversions on GA4 to segment your traffic by location and track conversions (phone calls, direction requests, website visits, purchases) that originated from local searches.
Customer Reviews And Ratings
Customer reviews and ratings provide valuable feedback about customers’ experiences with your business, products, or services.
These reviews are often publicly accessible on Google, Yelp, Facebook, and industry-specific review sites.
Why Are Reviews Important For Local SEO?
Reviews are a ranking factor: businesses with positive reviews are more likely to appear higher in local search results, including the map pack and organic listings. For instance, Google ranks your business higher if you have many reviews, a high frequency of new reviews, multiple review sources, and a high overall star rating.
Star ratings (or positive reviews) displayed alongside your business listing in search results can increase CTR.
Positive reviews enhance customer trust and conversion, as customers now rely on online reviews when making purchasing decisions.
Competitor Analysis
Competitive Benchmarking
Competitive benchmarking in SEO involves identifying, analyzing, and comparing your website’s performance to that of your top competitors in the search engine results pages (SERPs).
This helps you uncover your strengths and weaknesses, discover opportunities, and make data-driven decisions.
Some competitor performance metrics to analyze include:
Their keywords, search volume, and keyword gaps.
Their high-performing content format.
Backlink analysis.
Technical SEO audit (site speed, mobile friendliness, crawlability, and indexability).
Rankings are great, but conversions pay the bills.
Conversions are important because they determine the efficacy of all your marketing efforts.
Tracking these metrics (and how they contribute to sales) will help you intensify marketing efforts on the strategies that work and allocate budgets effectively.
A new study by search industry expert Rand Fishkin has revealed that Google’s rollout of AI overviews in May led to a noticeable decrease in search volume, particularly on mobile devices.
The study, which analyzed millions of Google searches in the United States and European Union, sheds light on the unexpected consequences of AI integration.
AI Overviews Rollout & Reversal
In May 2024, Google rolled out AI overviews in the United States, which generate summaries for many search queries.
However, the feature was met with mixed reactions and was quickly dialed back by the end of the month.
Google says it implemented over a dozen technical improvements to its systems in response.
A subsequent study by SE Ranking found the frequency of these summaries decreased, with only 8% of searches now triggering an AI Overview. However, when shown, these overviews are now longer and more detailed, averaging 25% more content.
SE Ranking also noted that after expansion, AI overviews typically link to fewer sources, usually around four.
Decline In Mobile Searches
Fishkin’s analysis reveals that the introduction of AI Overviews coincided with a marked decline in mobile searches in May.
While desktop searches saw a slight increase, the drop in mobile searches was significant, considering that mobile accounts for nearly two-thirds of all Google queries.
This finding suggests that users may have been less inclined to search on their mobile devices when confronted with AI-generated summaries.
Fishkin commented:
“The most visible changes in May were shared by both the EU and US, notably… Mobile searches fell a considerable amount (if anything spooked Google into rolling back this feature, I’d put my money on this being it).”
He adds:
“If I were running Google, that dip in mobile searches (remember, mobile accounts for almost 2/3rds of all Google queries) would scare the stock-price-worshiping-crap outta me.”
Impact On Overall Search Behavior
Despite the dip in mobile searches, the study found that search behavior remained relatively stable during the AI overviews rollout.
The number of clicks per search on mobile devices increased slightly, while desktop clicks per search remained flat.
This indicates that while some users may have been deterred from initiating searches, those who did engage with the AI Overviews still clicked on results at a similar or slightly higher rate than the previous months.
Implications For Google & the Search Industry
The study highlights the challenges Google faces in integrating AI-generated content into its search results.
Additionally, the research found other concerning trends in Google search behavior:
Low Click-through Rates: Only 360 out of every 1,000 Google searches in the US result in clicks to non-Google websites. The EU fares slightly better with 374 clicks per 1,000 searches.
Zero-click Searches Dominate: Nearly 60% of searches in both regions end without any clicks, classified as “zero-click searches.”
Google’s Self-referral Traffic: About 30% of clicks from US searches go to Google-owned properties, with a somewhat lower percentage in the EU.
Why SEJ Cares
This study underscores the need for adaptable SEO strategies.
As an industry, we may need to shift focus towards optimizing for zero-click searches and diversifying traffic sources beyond Google.
The findings also raise questions about the future of AI in search.
While major tech companies continue to invest in AI technologies, this study suggests that implementation may not always yield the expected results.
This means the file will still be processed even if you accidentally include unrelated content or misspell directives.
He elaborated that parsers typically recognize and process key directives such as user-agent, allow, and disallow while overlooking unrecognized content.
Unexpected Feature: Line Comments
Illyes pointed out the presence of line comments in robots.txt files, a feature he found puzzling given the file’s error-tolerant nature.
He invited the SEO community to speculate on the reasons behind this inclusion.
Responses To Illyes’ Post
The SEO community’s response to Illyes’ post provides additional context on the practical implications of robots.txt’s error tolerance and the use of line comments.
Andrew C., Founder of Optimisey, highlighted the utility of line comments for internal communication, stating:
“When working on websites you can see a line comment as a note from the Dev about what they want that ‘disallow’ line in the file to do.”
Screenshot from LinkedIn, July 2024.
Nima Jafari, an SEO Consultant, emphasized the value of comments in large-scale implementations.
He noted that for extensive robots.txt files, comments can “help developers and the SEO team by providing clues about other lines.”
Screenshot from LinkedIn, July 2024.
Providing historical context, Lyndon NA, a digital marketer, compared robots.txt to HTML specifications and browsers.
He suggested that the file’s error tolerance was likely an intentional design choice, stating:
“Robots.txt parsers were made lax so that content might still be accessed (imagine if G had to ditch a site, because someone borked 1 bit of robots.txt?).”
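The leniency Lyndon NA describes is easy to demonstrate with Python’s standard-library robots.txt parser, which silently skips line comments and unrecognized directives while still honoring the key ones. A minimal sketch (the directives and URLs are illustrative, not taken from any real site):

```python
from urllib import robotparser

rules = """
# Line comment: a note from the dev about the disallow below
User-agent: *
Disallow: /private/
Totally-made-up-directive: parsers simply ignore this line
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Recognized directives still apply; the comment and junk line change nothing.
print(rp.can_fetch("*", "https://example.com/public/page"))   # → True
print(rp.can_fetch("*", "https://example.com/private/page"))  # → False
```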
It has become quiet around AI Overviews. One month after my initial traffic impact analysis, I updated my data for AIOs. The results are important for anyone who aims for organic traffic from Google as we’re seeing a shift in AIO structures.
Shortly after Google just launched AI Overviews on May 14, I looked at 1,675 queries and found:
8.9% fewer organic clicks when a domain is cited in AIOs than in regular results.
A strong relationship between a domain’s organic ranks and AIO citations.
Variations of referral traffic depending on user intent.
Since then:
Featured snippets and AIOs confuse users with slightly different answers.
Google has significantly pulled back AIOs across all industries.
AIOs cite more sources.
AIOs Dropped By Two-Thirds
A few days after Google launched AIOs in the US, users found misleading and borderline harmful answers.
In a post titled “About last week,” VP of Search Liz Reid addressed the issue, but also called out that many queries were phrased in a way that would likely return questionable answers.
The debate about LLM answers and questionable queries is not new. Yes, you might get a funny answer when you ask an LLM a funny question. Leading queries were used in the NY Times vs. OpenAI lawsuit and in the backlash against Perplexity, and they are no different from leading questions that suggest the answer.
After the PR backlash, Google dropped AIOs across almost every industry by an average of two-thirds.
May 30: 0.6% on desktop, 0.9% on mobile.
June 28: 0.2% on desktop, 0.3% on mobile.
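Those figures work out to the “two-thirds” drop almost exactly. A quick check:

```python
def relative_drop(before: float, after: float) -> float:
    """Fraction by which a value fell between two measurements."""
    return 1 - after / before

# AIO share of queries, May 30 vs. June 28 (figures above):
print(round(relative_drop(0.6, 0.2), 2))  # desktop → 0.67
print(round(relative_drop(0.9, 0.3), 2))  # mobile  → 0.67
```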
Industries with the largest drops (data from Semrush Sensor):
Health: -3.7% desktop, -1.3% mobile.
Science: -1% desktop, -2.6% mobile.
People & Society: -2% desktop, -3.9% mobile.
Image Credit: Kevin Indig
It seems that YMYL industries, such as health, science, animals, and law, were most affected. Some industries gained a small amount of AIOs, but not more than a negligible 0.2%.
Example: SEOmonitor clearly shows the pullback in visibility metrics for the jobs site monster.com.
Image Credit: Kevin Indig
For the 1,675 queries I analyzed, the share of queries showing AIOs dropped from 42% to 23% (almost half). Interestingly, domains were cited more often (31% vs. 25%, more on this shortly) and ranked more often in the top 10 spots (45% vs. 41%).
Image Credit: Kevin Indig
Queries that stopped showing AIOs had, on average, less search volume. However, I couldn’t detect a clear pattern across word count, user intent, or SERP features for queries that gained vs. lost AIOs. The effect applies broadly, meaning Google reduced AIOs for all types of queries.
Image Credit: Kevin Indig
AIOs Lean Heavily On No. 1 Web Result For Text Snippets
The before and after comparison allows us to learn more about the structure and behavior of AIOs.
For example, [hair growth products] and [best hair growth products] deliver almost identical AIOs (see screenshots below). The text is the same, but the product list and cited sources are slightly different. Google treats product searches as equal to “best” searches (makes sense).
SERPs for hair growth products (Image Credit: Kevin Indig)
SERPs for best hair growth products (AIO text is identical to screenshot above) Image Credit: Kevin Indig
The biggest difference is that the query for [hair growth products] shows no citation carousel on the side when you click the “show more” button (another example below).
On mobile, the carousel lives at the bottom of the AIO, which is not great for click-throughs. These subtle design differences likely make a big difference when it comes to clicks from AIOs since more prominently featured citations increase the likelihood of clicks.
Citations only expand when users click “show more” (Image Credit: Kevin Indig)
For transactional queries like [hair growth products], Google ranks products in the AIO in no apparent order.
I cross-referenced reviews, average ratings, price, organic product carousel and references in top-ranking articles – none indicate a relationship with the ranking in the AIO. It seems Google leans on its Shopping Graph to sort product lists.
To structure the AIO text, Google seems to pick more elements from the organic No. 1 result than others. For example, time.com ranks No. 1 for [best hair growth products]. Even though the citation in the AIO highlights a section about ingredients (purple in the screenshot below), the whole text closely mirrors the structure of the TIME article before it lists products.
The AIO mirrors the text on the No. 1 web result (time.com) (Image Credit: Kevin Indig)
AIOs use fragments of top web results because LLMs commonly use Retrieval Augmented Generation (RAG) to generate answers.
Sridhar says that Neeva uses a technique called Retrieval Augmented Generation (RAG), a hybrid of classic information retrieval and machine learning. With RAG, you can train LLMs (Large Language Models) through documents and “remove” inaccurate results by setting constraints. In plain terms, you can show AI what you want with the ranking score for web pages. That seems to be the same or a similar technique Bing uses to make sure Prometheus results are as accurate and relevant as possible.
The best example of Google mirroring the AIO after the No. 1 web result (in some cases) is the answer for [rosemary oil for hair growth]. The AIO pulls its text from MedicalNewsToday (No. 1) and restructures the answer.
Text in the AI Overview vs. a snippet from MedicalNewsToday (Image Credit: Kevin Indig)
AIOs And Featured Snippets Still Co-Exist
For more informational queries with a featured snippet, like [dht], [panic attack vs. anxiety attack], or [does creatine cause hair loss], Google closely mirrors the answer in the featured snippets and elaborates further.
High overlap between AIOs and featured snippets (Image Credit: Kevin Indig)
In some cases, the elaboration might confuse users. When searching for [which vitamin deficiency causes hair loss], users see a long list in the AIO and a single answer in the featured snippet. While not contradicting each other, the AIO answer makes the featured snippet seem less trustworthy.
Image Credit: Kevin Indig
In my opinion, Google would be best off not showing a featured snippet when an AIO is present. However, that would be bad news for sites ranking in featured snippets.
AIOs Contain More Citations
One way Google seems to have increased the accuracy of AIOs after the PR backlash is by adding more citations. The average number of citations increased from 15 to 32 in the sample of 1,675 keywords I analyzed. I haven’t yet been able to confirm that more citations are used to compile the answer, but more outgoing links to webpages are a good signal for the open web because they increase the chance of getting click-throughs from AIOs.
Both Reddit and Wikipedia were cited more often after the PR Backlash. I counted citations from those two domains because marketers pay a lot of attention to influencing the public discourse on Reddit, while Wikipedia has a reputation for having more gatekeepers.
Image Credit: Kevin Indig
Keep in mind that, at 0.8% and 1%, these citation shares are relatively low. Google seems to heavily diversify the sources AIOs cite. Only 23 keywords in the 1,675-keyword sample returned more than 10% of citations from Reddit after the PR backlash (28 for Wikipedia).
Accountability
We can conclude that:
Google shows 50-66% fewer AIOs, which reduces the risk of losing organic traffic – for now.
There seem to be more opportunities to be cited in AIOs, but strong performance in classic web search still largely determines citations and referral clicks from AIOs.
Featured snippets get fewer clicks when AIOs are present, since AIOs elaborate much more on the answer.
Google becomes more accountable as AI Overviews push it toward the border of publishing. Verticals like health, science, and law continuously morph as new evidence comes out. It will be interesting to see whether AIOs can factor in new evidence and opinions, and at what speed.
It’s not clear how, exactly, AI Overviews evaluate the strength of evidence, or whether it takes into account contradictory research findings, like those on whether coffee is good for you. “Science isn’t a bunch of static facts,” Dr. Yasmin said. She and other experts also questioned whether the tool would draw on older scientific findings that have since been disproved or don’t capture the latest understanding of an issue.
If AIOs adapt to new information, websites need to monitor AIOs and adapt content at an equal speed. The adaptation challenge alone will provide room for competitive advantages.