Google Search Console is among the most helpful free tools for search engine optimization. GSC’s popular “Performance” section shows a site’s rankings, sources of organic clicks, and other info.
Yet the Performance section provides much more. Here are three little-used reports to improve SEO.
Visitors’ Devices
Smartphones are increasingly the top device for visitors and conversions. But sites vary. Some still have more desktop users than mobile.
Search Console reports the number of visitors searching and landing on your site on mobile and desktop devices. In “Performance” > “Search results”:
Click “New” to create a new filter,
Select “Device,”
Click the “Compare” tab,
Apply “Desktop” vs. “Mobile.”
The example below shows a site where visitors mostly search and click from desktop devices. Thus SEO for this site should focus on the desktop version.
Search Console’s Performance section shows visitors’ devices. In this report, most visitors use desktops. Click image to enlarge.
Traffic from Images
Google’s image results and image packs can generate many clicks. Yet there’s no easy way to identify the best performers. Search Console gives some insight at “Performance” > “Search results.”
Select “Search type: Image,”
Click the “Queries” column,
Sort results by “Clicks.”
Then search Google Images for those queries to find your image that ranks and produces clicks. There’s no better way in my experience to see a site’s performance in Image search.
Keep an eye on your image optimization techniques — e.g., alt text, file type, size — and adjust the strategy accordingly.
Filter by “Search type: Image” to view traffic (clicks) from that source. Click screenshot to enlarge.
Visitors’ Countries
Searchers landing on English-language sites are likely from the United States because it is the most populous English-speaking country. But other countries offer much ecommerce potential.
To see visitors’ countries, use “Performance” > “Search results”:
Click the “Countries” column,
Sort by “Clicks.”
Get deeper insight into this data by creating a filter to reveal the visitors who translate search results into their native language before accessing your site. Countries with substantial clicks from translated results suggest a market better served in that language.
Click “New” to create a filter,
Select “Search appearance,”
Choose “Translated results.”
In the example below, visitors from Indonesia translated search results the most.
Search Console can report the number of visitors that first translated in search. Click image to enlarge.
Search engine optimization tools can be pricey. But a few, such as Bing Webmaster Tools, are free. Bing’s platform provides several helpful resources, including site scans.
To use, verify your site with Bing. The process is easy if you’ve already verified on Google Search Console. Choose “Import sites from Google Search Console” and allow access through your Google account. Otherwise, verify through DNS, meta tags, or a file upload.
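For reference, Bing’s meta-tag method adds a verification tag to the head section of your homepage. Here is a minimal sketch; the content value is a placeholder, so use the code Bing Webmaster Tools generates for your account.

```html
<!-- Hypothetical example of Bing's meta-tag verification.
     Replace the content value with the code Bing generates for you. -->
<head>
  <meta name="msvalidate.01" content="YOUR-BING-VERIFICATION-CODE" />
</head>
```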
Once verified, allow Bing Webmaster Tools a day or two to collect performance and optimization info. Then go to the “Site Scan” section to start the crawl. (It won’t launch automatically.)
The scan can take about a day. The ensuing SEO report contains “Errors,” “Warnings,” and “Notices.”
Bing Webmaster Tools site scan report contains “Errors,” “Warnings,” and “Notices.” Click image to enlarge.
Errors
This section lists all pages with missing or incorrect components affecting organic search. Examples include:
Missing meta descriptions,
Pages returning 400-499 errors,
Broken redirects,
Broken canonical URLs,
Missing title tags,
Pages blocked by robots.txt,
Pages returning server timeouts.
Of the reported errors, missing meta descriptions are the least worrisome since they don’t affect rankings. Bing will generate a description itself, which it would likely do anyway based on the query.
In my experience, all other errors require fixing. Clicking any error will produce an explanation, instructions for resolving, and the affected pages.
For example, Bing provides the following explanation for the “missing title tag” error:
What is the issue about?
The title is missing in the head section of the page. Search engines use the title tag as an important signal for determining the page’s relevancy for a given keyword search. It is important to ensure that your title tag is unique and descriptive, and contains accurate information about the content of the page. The title should be unique to each page on your website.
How to fix?
Add a title to your page using the title tag, which should be placed inside the head section of the page’s source. Write a concise, descriptive, keyword-rich page title that best describes the page content.
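As a minimal illustration of that fix, here is a sketch of a page with a title tag in the head section. The page and title text are hypothetical examples, not from Bing’s report.

```html
<!-- Minimal sketch: a page with a descriptive title tag in the head. -->
<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="utf-8">
    <title>Blue Widgets: Prices, Reviews, and Buying Guide</title>
  </head>
  <body>
    <h1>Blue Widgets</h1>
  </body>
</html>
```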
Warnings
The Warnings section contains issues that don’t prevent pages from ranking but, if fixed, could improve your site’s organic performance. On my Smarty Marketing site, Bing reported several warnings.
Not all of these are critical. I don’t pay much attention to warnings on title tags, meta descriptions, meta language tags, and too-long HTML. Search crawlers are increasingly advanced and can handle longer HTML and missing language tags. Plus, Google has confirmed using longer title tags for ranking signals even if truncated in search listings.
Nonetheless, Bing’s recommendations won’t hurt. Certainly image alt texts and H1 tags could improve rankings.
Overall, Bing’s SEO report covers the basics. It’s an overview of potentially critical errors — at no cost.
HTML headings such as H2 and H3 are powerful ranking signals to search engines. Their importance is increasing because Google’s featured snippet algorithm looks for sections of a page to answer a query. Headings help Google better understand the content.
Thus analyzing competitors’ HTML headings and optimizing your own can improve rankings.
Here are three tools to help.
Devaka Tools
Devaka Tools SEO bookmarklet.
Devaka Tools offers a quick (and free) way to identify and extract on-page headings via a bookmarklet, which the site explains:
Bookmarklets are small JavaScript programs that can be stored as bookmarks in a web browser. Unlike regular bookmarks, which simply navigate to a specific webpage, bookmarklets perform actions on the current page you are viewing. They are executed by clicking on the bookmark, triggering the JavaScript code embedded within.
Install on any desktop browser. Then click the bookmarklet in your browser’s toolbar for any page, select “Headings” in the bookmarklet’s controls, and it will highlight all HTML headings.
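For the curious, the core technique behind such a bookmarklet is simple: select the heading elements on the current page and style them so they stand out. The sketch below is illustrative only, not Devaka Tools’ actual code; the function name and styling are assumptions.

```javascript
// Simplified sketch of a heading-highlighting bookmarklet.
// To use as a bookmarklet, wrap the call in a "javascript:" URL, e.g.:
// javascript:(function(){ /* ...this code... */ highlightHeadings(document); })();
function highlightHeadings(doc) {
  const headings = doc.querySelectorAll("h1, h2, h3, h4, h5, h6");
  headings.forEach((h) => {
    h.style.outline = "3px solid red"; // make each heading visible at a glance
    h.title = h.tagName;               // hover to see the heading level
  });
  return headings.length; // number of headings highlighted
}
```

In a browser, pass the real `document`; the same function works on any object exposing `querySelectorAll`.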
Thruuu
Thruuu is an AI-powered analyzer of search engine result pages. Type in a keyword, and the tool will perform a SERP analysis that includes images, keywords, and HTML headings.
Use the “Extended view” to access heading information for every ranking page. The “Explore Headings” feature uses AI to pull headings from all ranking pages. Clicking the “i” icon to the right of any heading shows which URLs use it or a similar version.
Thruuu is a helpful tool for understanding competitors’ HTML headings relative to search queries.
Thruuu’s free trial includes 10 SERP analyses. Paid plans start at $13 per month for 75 SERPs.
Surfer
Surfer is a content writing tool using AI to create search-engine-optimized copy. Like Thruuu, Surfer analyzes SERPs for keywords but doesn’t show the details, only the content recommendations.
To use, paste your article text into Surfer with the target keyword. The tool will then provide on-page suggestions — including the number of headings and their keywords — in a guided and clear process.
Pricing for Surfer starts at $89 per month to optimize 20 pages.
Thruuu is more affordable than Surfer and offers a free trial. Plus it provides more details on high-ranking competitors, which I appreciate. Yet busy entrepreneurs and managers may prefer Surfer’s streamlined approach.
Google launched its disavow tool in 2012 to help websites exclude incoming links from affecting organic rankings. At the time, Google was fighting “link spam,” the practice of obtaining low-quality links through manipulative link-building tactics.
Google made it clear that websites should use the disavow tool only after receiving a manual penalty notice via Webmaster Tools (now Search Console) and only when that penalty was related to backlinks, stating:
You should disavow backlinks only if:
1. You have a considerable number of spammy, artificial, or low-quality links pointing to your site,
AND
2. The links have caused a manual action, or likely will cause a manual action, on your site.
Nonetheless, many website owners used the disavow tool to preemptively avoid penalties after investing in low-quality link-building services or suspecting “negative SEO attacks” from competitors.
To this day, there’s little agreement in the search engine community on whether (or how) to use the disavow tool. Multiple providers offer disavow services that continue to attract clients. Google’s documentation says the disavow tool “is an advanced feature and should only be used with caution. If used incorrectly, this feature can potentially harm your site’s performance in Google Search results.”
In my experience, most businesses don’t know if their backlinks are spammy. In-house SEO teams don’t typically report on the quality or type of acquired links. Third-party services wrongfully label links as “toxic,” adding to the confusion.
Here are updates from industry practitioners to help you decide.
Disavows are ineffective
The disavow tool has always been a request to Google, which may or may not honor it.
The recent experiment by search optimizer Cyrus Shepard proves the point. He disavowed all links he could find pointing to his site, waited seven weeks, and saw little impact on organic traffic.
Cyrus Shepard, a search optimizer, disavowed all known incoming links to his website. It experienced no material traffic change. Click image to enlarge.
Shepard believes Google processed the document, as his site encountered a slight traffic increase shortly after submitting the disavow list. But the search engine then ignored it, he says.
The only change was the number of backlinks reported in Search Console, although it could have been a widespread reporting bug as many sites experienced it.
Hence Google likely ignores disavows unless your site receives a manual penalty. There’s no reason to invest in preemptive disavow services.
Manual link-related penalties are rare
I know of very few manual link penalties in recent years, and I’ve witnessed just one this year. This suggests Google doesn’t need help or suggestions to determine which links are unnatural or spammy and will likely drop the disavow tool eventually.
Nonetheless, if your site received a link-related manual penalty (as reported in Search Console) and you cannot remove the unnatural links, the disavow tool remains an option. There’s no better alternative at present.
AI is upending keyword research and analysis for search engine optimization.
I’ve addressed AI’s ability to generate new terms and identify search intent for an existing term. Yet the industry continues to innovate and evolve. Here are three AI tools for analyzing keywords, target audiences, content, intent, and more.
TermSniper
TermSniper identifies search intent for any keyword and suggests content tips for optimizing. Type a keyword, and the tool will provide an analysis.
I typed “best web hosting.” Here’s TermSniper’s analysis:
The primary search intent of this search term is to find and compare the best web hosting services available in 2024. The secondary search intent is to get recommendations and reviews of specific web hosting providers from trusted sources.
Insights
Tone of voice: Informational
User’s next action: Continue research
Webpage format: Listicle
Expert author needed: Not necessarily
Brand authority needed: High to medium
Incentive: Yes, various offers like discounts and promotions are common
CTR boost: Yes, the use of numbers like “2024,” specific terms like “Top Picks,” and mentions of well-known brands
The tool pulls keywords that can match the search intent. For “best web hosting,” those keywords were “best,” “bluehost,” “wordpress,” and “millions.”
TermSniper scores the related keywords from 1 to 100 depending on how many high-ranking pages include each in their page copy and meta tags. The tool recommends adding words scoring 20 to 100 into your meta title, H1, and first paragraph. Add words scoring 1 to 19 throughout the rest of your page in a natural fashion.
TermSniper offers five free credits to test, with one keyword analysis per credit. Paid plans start at $2 per credit with a $10 minimum.
SEO.ai
SEO.ai offers AI-driven research that can limit keyword lists by the target audience. Provide a base term, click on the “Audience keywords” tab, and describe your target audience. The tool will find related keywords matching that audience.
For “best web hosting,” I described the target audience as “Beginner users who are planning to start a site.”
SEO.ai’s keyword suggestions that match my audience were:
“reliable providers,”
“starter plans,”
“money-back,”
“low-cost.”
SEO.ai provides keyword suggestions to match a target audience. Click image to enlarge.
I can sort those keywords by search volume or ranking difficulty. Based on those metrics, I can launch a dedicated page, add the words to existing pages, or both.
SEO.ai offers a 7-day free trial with a credit card. Paid plans start at $49 per month.
Free SEO Keyword Research & SERP Analyzer
Free SEO Keyword Research & SERP Analyzer is a custom GPT for all ChatGPT users. It can scrape any Google search-result page (or any page) and analyze keyword usage, intent, and the target audience.
For “best web hosting,” the tool’s SERP analysis was as follows:
User Intent: Users searching for “best web hosting” are likely looking for comprehensive reviews, comparisons, and recommendations to make an informed decision on web hosting services.
Content Freshness: Most articles are from 2024, ensuring up-to-date information on the latest web hosting trends and services.
Authority & Trust: The top results come from reputable tech review sites like TechRadar, PCMag, and CNET, as well as a highly engaged community platform (Reddit).
As with any ChatGPT or custom GPT prompts, users can refine the prompt as needed. For example, follow-up prompts could be:
“What are the most common keywords found in search snippets for this query?”
“What types of pages rank for this search query?”
Users can also identify any URL ranking for a target query and prompt the custom GPT to analyze its keywords and usefulness and compare it to users’ own pages.
Optimizing for organic search now requires knowing which queries trigger AI Overviews and the URLs those summaries reference. Neither metric — queries or citations — is easily identified because Overviews appear only for logged-in users, and Search Console does not include separate Overviews reporting.
We have only third-party sources. Here are three tools for AI Overviews tracking and analysis.
ZipTie
ZipTie is a standalone AI Overviews tracker that monitors users’ important queries weekly and sends alerts when opportunities are missed.
To start, import your queries — I used Search Console’s list. ZipTie will then generate a report containing:
Your page rankings for each query.
Whether your page is featured.
Whether your query triggered an AI Overview and if your page was referenced.
ZipTie reports users’ queries that produce Overviews and whether users’ domains were cited. Click image to enlarge.
Click any query in the report to see which URLs are included in the AI Overview and where they rank organically. This is helpful insight I could not find elsewhere.
Early tests confirm that Search Console’s Performance tab includes clicks from AI Overviews. And ZipTie’s “Standard” tier at $89 per month integrates with Search Console. Thus combining ZipTie with Search Console will show clicks to your links in Overviews.
Otherwise, ZipTie.dev offers a free 14-day trial with a credit card. Paid plans start at $29 per month for “Basic.”
Semrush
Semrush includes AI Overviews in its Position Tracking section. To see whether your site was included in Overviews, create a project in Position Tracking with your website URL and list of keywords.
Once the project is running, go to the Overview tab and use the “SERP features” filter to limit your report to queries that trigger AI Overviews. If your site is referenced in any AI Overview, an icon will appear next to your organic position.
In my testing, ZipTie.dev provided more accurate results as I couldn’t replicate some of Semrush’s reported AI Overviews.
Semrush offers no free trial. Paid plans start at $129 per month to track 500 keywords.
Install the extension and then type your comma-separated queries into its sidebar. The extension will run each search, identify AI Overviews, and report which Overview includes your URL.
The “AI Overview Citations” report includes all the URLs referenced in each Overview and how often each domain appeared for all queries. This feature was handy for identifying domains referenced from multiple Overviews for my important queries. The report is downloadable as an Excel file.
Google’s search results are highly personalized. Varying levels of personalization exist for each query based on:
Search history,
Interaction (clicks) history,
The searcher’s Search Console properties,
Location,
Browser settings (e.g., language).
Thus what a merchant sees in search engine result pages could differ from what shoppers see. Here are three ways to depersonalize the results to understand what others may encounter for important keywords.
Chrome Incognito Mode
Incognito mode prevents Chrome from saving browsing history, cookies, site data, and form-completion info.
Searching Google using Incognito will remove the three top personalization types — search history, clicks, and Search Console properties — but not necessarily location and browser settings.
To access Incognito, go to “File” > “New Incognito Window.”
From there, use third-party SERP extractor tools to copy to a local file. For example, Serpsonar exports SERPs in an Excel file to include ranking URLs, titles, and descriptions. SEO Minion, a Chrome app, offers a one-click copy of the entire SERP in view.
Keyword Insights
Keyword Insights’ freemium SERP Explorer can extract up to 100 results per query, capturing all components of the SERP, not just organic listings.
It also provides country- or city-specific SERPs without needing a VPN service or proxies.
To start a new SERP Explorer project:
Type a keyword,
Choose the target location and language,
Select the type of search results: web, image, news, shopping, or video,
Select device: desktop, mobile, or tablet.
Once extracted, the search results are easily copied and retained locally.
SERP Explorer offers five free daily searches with exports limited to the top 10 desktop results. To access all features, upgrade to the professional tier at $145 per month.
To start a new SERP Explorer project, type the keyword (“buy laptop”), location, language, and more. Click image to enlarge.
Thruuu
Thruuu is an AI-powered optimization tool for keyword discovery, SERP analysis, and content creation. The tool extracts organic results and additional SERP components such as local packs, videos, and “People also ask” sections.
Thruuu extracts up to 100 results on mobile and desktop devices and provides additional data for each listing, such as the number of words, number of images, publication date, on-page schema types, page type, and page rank (via the Open PageRank initiative).
Users can download the SERP and related data as an Excel file.
Thruuu offers four free credits to analyze and compare SERPs. Paid plans start at $19 per month.
Thruuu is a tool for keyword discovery, SERP analysis, and content creation. This example is for the keyword “buy a laptop.” Click image to enlarge.
Google releases updates to its search algorithm seemingly nonstop. Each release typically lowers organic rankings for many sites. Hence keeping an eye on those updates and their impact is essential for evolving optimization strategies.
Here are four tools to track the releases and their effect.
Google Search Status Dashboard
Google’s Search Status Dashboard is the only reliable way to confirm when an update has started and ended. It is also helpful for determining if search fluctuations are due to non-algorithm incidents such as crawling and indexing glitches.
Sistrix
Sistrix’s Google Updates page.
Sistrix is premium SEO software with a free tool to evaluate the impact of a Google update on your site. Choose your country and type your domain; the tool will check your ranking fluctuations for each update, allowing 25 free queries daily.
Sistrix also provides a summary of each update and its purpose.
Advanced Web Ranking
Advanced Web Ranking publishes daily SERP volatility and fluctuations based on approximately 400,000 desktop keywords (and 200,000 mobile) across multiple countries.
Users can filter reports by industry, device, and country — with confirmed Google updates.
The tool offers free alerts on volatility and Google-announced releases.
GSC Guardian
Search Console overlay on GSC Guardian.
GSC Guardian is a free Chrome extension that overlays Google’s update timeline on Search Console reports. Users can create annotations of tasks or observations in their Search Console dashboards.
The extension simplifies evaluating the impact of a Google update on your organic search visibility.
Monitoring Google’s updates is essential, but remember, organic traffic fluctuations are normal, and core algorithm releases often address queries and their intent, not weaknesses in websites.
Thus traffic declines are not typically penalties. Assess search result pages and note the changes. Have your organic positions changed, or are they pushed down the page owing to AI Overviews or new SERP features? For example, recent updates replaced review sites with ecommerce results for many queries.
Not all ranking fluctuations are reversible. But knowing the status of target queries and SERPs is key for identifying new opportunities.
Google has many interesting free tools, but two important ones for helping you improve your site are Search Console and the Rich Results Testing Tool. Search Console helps you get a general feel for how your site is doing in the SERPs, plus keep an eye on any errors to fix and improvements to make. The other one, the Rich Results Testing Tool, helps you see which of your pages are eligible for rich results. Rich results are those highlighted search results like product and event listings.
Rich results are incredibly important in today’s world. Once you add structured data to your site, you might get a highlighted listing in the SERPs. This gives you an edge over your competitors, as highlighted listings tend to get more clicks. For many sites and types of content, it can make sense to target rich results.
Adding structured data to your courses might lead to highlights like this one
Here, we look at how to verify your eligibility and what you can do to improve on that. Google’s Rich Results Testing Tool helps you check your pages to see if they have valid structured data applied to them and if they might be eligible for rich results. You’ll also find which rich results the page is eligible for and get a preview of how these would look for your content.
Using the Rich Results Testing Tool is very easy. There are two ways to get your insights: enter the URL of the page you want to test or the piece of code you want to test. The second option can be a piece of structured data or the full source code of a page, whichever you prefer.
While testing, you can also choose between a smartphone and a desktop crawler. Google defaults to the smartphone crawler since we live in a mobile-first world, people! Of course, you can switch to a desktop if needed.
Enter a URL or a piece of code to get going. You can also choose between a smartphone or desktop crawler.
There is a difference, of course. It is a good idea to use the URL option if your page is already online. You’ll see if the page is eligible for rich results, view a preview of these rich results, and check out the rendered HTML of the page. But there’s nothing you can ‘do’ in the code. The code option does let you do that.
This particular page has a valid Course list item and Course info and is, therefore, eligible for rich results — which you can see in the first screenshot.
Working with structured data code
If you paste a piece of JSON structured data into the code field and run the test, you get the same results as the URL option. However, you can now also use the code input field to edit your code to fix errors or improve the structured data by fixing warnings.
If it’s minified, unminify it for better readability
Paste the code in the code field of the Rich Results Testing Tool
Run the test
You’ll get a view similar to the one below.
Code input is on the left; rich results test is on the right. You can now edit the code and quickly run the test after making those edits to see the changes.
Editing an event page
The page above is an event page; you’ll notice warnings in orange. Remember: red is an error, and orange is a warning. You must fix an error for the structured data to be valid, while a warning points to a possible improvement. Because this concerns a paid event, the page is missing an offers property. It also omits the optional fields performer, organizer, description, and image. We could add these to remove the warnings and round out this structured data listing — because more is better.
Look at Google’s documentation about events and find out how they’d like the offers to appear in the code. To keep it simple, you could copy the example code and adapt it to your needs. Find a good place for it in your structured data on the left-hand side of your Rich Results Testing Tool screen and paste the code.
You could expand the code until it looks something like this:
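For illustration, a rounded-out Event snippet with offers, performer, organizer, description, and image might look like this sketch. All names, dates, URLs, and prices below are hypothetical placeholders; check Google’s event documentation for the required and recommended properties.

```json
{
  "@context": "https://schema.org",
  "@type": "Event",
  "name": "Example SEO Conference 2024",
  "startDate": "2024-09-15T09:00",
  "endDate": "2024-09-15T17:00",
  "description": "A one-day conference on structured data and search.",
  "image": "https://www.example.com/images/seo-conference.jpg",
  "location": {
    "@type": "Place",
    "name": "Example Convention Center",
    "address": {
      "@type": "PostalAddress",
      "streetAddress": "100 Main Street",
      "addressLocality": "Springfield",
      "addressCountry": "US"
    }
  },
  "performer": {
    "@type": "Person",
    "name": "Jane Doe"
  },
  "organizer": {
    "@type": "Organization",
    "name": "Example Events",
    "url": "https://www.example.com"
  },
  "offers": {
    "@type": "Offer",
    "url": "https://www.example.com/tickets",
    "price": "99.00",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock",
    "validFrom": "2024-05-01"
  }
}
```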
Rerun the test, and more sections should turn green. If not, you might have to check if you’ve correctly applied and closed your code.
Once you’ve validated your code and know it’s working, you can apply it to your pages. Remember that we’ve described a very simple way of validating your code, and there are other ways to scale this into production. But that’s not the goal of this article. Here, we’d like to give you a quick insight into structured data and what you can do with the Rich Results Testing Tool.
See a preview of your rich results
The preview option is one of the coolest things in the Rich Results Testing Tool. This gives you an idea of how that page or article will appear on Google. There are several rich results that you can test, like breadcrumbs, courses, job postings, recipes, and many more.
These previews aren’t just for show — you can use them to improve the look of the rich results. Maybe the images look weird, or the title is not very attractive. Use these insights to your advantage and get people to click your listings!
Get a glimpse of how your rich result might appear in the SERPs
This is a short overview of what you can see and do with the Rich Results Testing Tool. Remember that your content is eligible for rich results if everything is green in the Rich Results Testing Tool and no errors are found. This does not — and we mean not — guarantee that Google will show rich results for this page. You’ll just have to wait and see.
Artificial intelligence is upending Google Search. Merchants who rely on organic search traffic must track a nonstop flow of changes and updates to maintain performance.
Search Console is the most reliable tool because it uses Google’s own data. Here are three Chrome extensions to better understand Search Console’s data.
GSC Guardian
Sample overlay on Search Console from GSC Guardian. Click image to enlarge.
GSC Guardian overlays Search Console reports with information from Google’s Search Status Dashboard. Users can create annotations on Search Console for tasks or observations and then export them to a CSV file or Google Sheet.
The extension helps correlate Google’s updates with your site’s traffic, to react accordingly.
Search Console Enhanced Analytics
Google Search Console Enhanced Analytics compares clicks, impressions, and positions across periods. Click image to enlarge.
Google Search Console Enhanced Analytics compares clicks, impressions, and positions across two periods to analyze (via color coding) traffic fluctuations, trends, and more.
A premium feature generates search volume for each query your site ranks for (per Search Console) at $0.05 per request.
GSC Crawl Stats Downloader
GSC Crawl Stats Downloader builds a handy visualization of crawl activity. Click image to enlarge.
GSC Crawl Stats Downloader provides a better way to download crawl stats from Search Console. Instead of downloading multiple CSV files (by response, file type, purpose, Googlebot type, or summary), this extension can download everything with one click.
Google assigns a crawl budget to every site, so knowing how often Google visits yours is essential. Crawl frequency and page volume usually signal the site’s priority to Google — sites with higher rankings typically see more crawls.
Conversely, it’s a possible structural problem if Google crawls many pages but fails to index them.
Install the extension and click it while viewing Search Console. The extension will identify and merge the crawl stats and build a handy visualization to show the crawl activity at a glance.