12 reasons your page won’t rank – even though it’s optimized

What could be the matter if your perfectly optimized post isn’t ranking? Is the problem that your site is not on Google, or is something else going wrong? What is keeping your content from reaching that coveted #1 position? In this post, we’ll discuss many possible reasons why your page is not ranking, even though it’s optimized.

We’ve divided the possible issues you might be having into four sections: indexing and crawl issues, technical issues, linking issues, and content and keyword issues.

Pro tip

Quick question: how’s your internal linking? If your content is optimized but not ranking, or Google is ranking the wrong pages from your site, it could be because you need to improve your site structure or fix your orphaned content. We’ve made some really neat SEO workouts to help you check and remedy these kinds of issues — check them out and fix those issues now!

Indexing and crawl issues

The first few points on the list all deal with indexing and crawl issues. Put simply, you can’t rank if your page or site is not on Google in the first place. If you find these topics confusing, you might want to read up on how Google works and how to start with SEO.

1. Your site/page is not on Google

If you need help determining whether your site is on Google, you can use the site: search operator in Google. Type site:yoast.com, and you’ll see a list of pages for that domain. If you type in the full URL of a specific article, you should see only one result returned. If you see your pages, Google knows about your site and has put at least some of it in its index. If your page is in the index but you think it isn’t performing well, you’ll want to dig deeper.

[Image: a site: search for yoast.com in Google showing thousands of indexed pages. The site: search operator helps you find your site in Google’s index.]

How to fix it

Check your WordPress Reading Settings. If you’ve ticked the ‘Discourage search engines from indexing this site’ box under Search Engine Visibility, that’s the most likely reason your site is not on Google. If so, uncheck that box and save your changes. If only some specific pages aren’t showing up on Google, review your Search Appearance settings in Yoast SEO: go to the ‘Content Types’ tab and ensure your settings are correct.

2. Your site/page is still too new

If your site or page is new, it might simply be a matter of chilling out and checking back in a little while. There are many moving parts in getting your content crawled, indexed and ranked. Sometimes, it takes days or maybe even weeks for Google to finish its discovery process.

How to fix it

If you check and find your site is not on Google yet, you can install Yoast SEO and submit the generated XML sitemap to Google Search Console to help Google discover your website. You can also use the URL Inspection tool in Search Console to determine how specific pages are doing. It tells you exactly how Google crawls and views your site.

3. Your content is noindexed

One of the most common reasons Google does not index your site or a specific page is that it has been noindexed inadvertently. Adding a noindex meta robots tag to a page tells Googlebot that it can crawl the page but that the page can’t be added to the index.

How can you check if your page is noindexed? That’s easy; simply open the page and view the source code. You’ll find the code below somewhere at the top of the page. This tells search engine crawlers that the page’s content shouldn’t be added to the index, thus keeping it from ranking.
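The tag usually looks something like this (the exact attributes can vary depending on your SEO plugin and settings):

  <meta name="robots" content="noindex" />

If you find noindex in a robots meta tag on a page you want to rank, that’s the culprit.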

How to fix it

It happens! Even we occasionally make a mistake and inadvertently noindex a post. Luckily, it’s an easy fix. We wrote about how to set a piece of content back on the right track with Yoast SEO.

4. Your site/page is blocking Google with robots.txt

You might have told Google not to index your content, but it’s also possible you’ve told Google not to crawl your site at all! Blocking crawlers in a so-called robots.txt file is a surefire way never to get any traffic. Blocking robots is easier than you might think. For instance, WordPress has a Search Engine Visibility setting that does its best to keep crawlers out once ‘Discourage search engines from indexing this site’ is ticked. Uncheck it to make your site available again.

[Image: the Search Engine Visibility setting in WordPress, unchecked. Check that this option isn’t inadvertently ticked.]

WordPress uses the noindex approach described above to handle the indexing of sites via the Search Engine Visibility setting. It does have a warning that it’s up to search engines to honor the request.

Besides telling WordPress to block search engines, other technical issues might generate crawl errors that prevent Google from crawling your site properly. Your site’s web server could be acting up and returning server errors, or buggy bits of JavaScript in your code could trip up the crawler. Make sure Google can crawl your site easily.

How to fix it

If your robots.txt file is blocking Google from crawling your website (or parts of it) and you want to change that, then you’ll need to edit the file. You can follow this guide to edit your robots.txt file.
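For reference, a robots.txt file that blocks all crawlers from the entire site looks like this (a deliberately extreme example; your own file will usually block only specific paths):

  User-agent: *
  Disallow: /

Removing the Disallow: / line, or narrowing it to directories you genuinely want to keep out of search (for example, Disallow: /wp-admin/), lets Google crawl your content again.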

5. You must enhance your index coverage

Ensuring that Google indexes your web pages is essential to succeed. Index coverage refers to the number of your site’s URLs included in Google’s search index. Even the most optimized content may not appear in search results without comprehensive index coverage.

To identify the issue, examine the Index Coverage report in Google Search Console. This report groups your pages into categories and explains why pages are not indexed. If you notice many pages falling under “Error” or “Excluded,” it’s time to investigate further. One of the most common statuses you’ll see there is ‘Crawled – currently not indexed’.

How to fix it

Ensure your XML sitemap is current and accurately represents your site structure, and submit it to Google Search Console to help Google find your pages. Review and resolve any crawl errors, such as 404s, server errors, or redirect issues; these can prevent pages from being indexed. Pages with low-quality or duplicate content might be excluded from the index, so focus on creating unique, valuable content that genuinely serves your users. Finally, use the URL Inspection tool to request indexing for crucial pages that aren’t indexed yet. This tool also shows how Google perceives your page.
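If you’re wondering what a sitemap actually contains, a minimal XML sitemap entry has this shape (the URL and date below are placeholders); Yoast SEO generates and maintains this file for you automatically:

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>https://www.example.com/sample-page/</loc>
      <lastmod>2024-01-01</lastmod>
    </url>
  </urlset>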

[Image: Google Search Console’s index report helps you understand why pages are not indexed.]

Technical issues affecting ranking

Is your page or website indexed but not ranking? Then it’s time to check for technical problems.

6. You’re not ranking because your site has technical issues

Your website needs to meet certain technical benchmarks if you’re going to rank on Google! Loading speed, or how quickly your pages load, is important. Security and hosting quality are important too, and that’s not all. You can read about all the essentials in our article: things everyone should know about technical SEO.

If your post doesn’t appear in the search engines, technical issues could be keeping it out of the search results entirely. You could have conflicting plugins causing problems, and we’ve also seen themes that prevent Google from indexing your site. And while Yoast SEO takes care of many technical issues under the hood, it needs to be configured correctly to do so.

How to fix it

The fix you need will depend on the technical issues your website is having, and we can’t cover everything here. You might want to check the following points:

  • Ensure all your Yoast plugin settings are correct
  • Check that you’re doing things the right way to keep loading times down
  • Make sure your site is set to https:// and your security certificates are up to date
  • Upgrade your hosting plan
  • Check your plugins and/or theme aren’t causing problems

If your technical SEO looks good and your site is indexed, you must dig deeper to discover the problem. Keep reading!

7. You’re being penalized for breaking SEO rules

If Google catches you using shady SEO techniques that it doesn’t allow — e.g., sneaky tactics like buying links or stuffing keywords into hidden text — your page or site can be penalized. When you’re already putting in the effort to make a good website and quality content, trying these tricks is counterproductive. Even when everything else on your page is perfect, if you’re doing something that Google doesn’t allow, you will have problems ranking (or appearing in the Google search results at all).

Most of these things are common sense, so you probably don’t need to worry if you’re not trying to trick Google or spam people. However, a few things that used to be common SEO practices can now lead to issues — check out our article about SEO myths for more examples of bad SEO practices to avoid.

How to fix it

You can check whether Google has flagged your page for these problems in the Manual Actions tab in Google Search Console (GSC). If you’re still new to using GSC, you might want to check out our introductory article. If you find an issue under the Manual Actions tab, read this help article to learn more about what it means and how to fix it.

Linking issues that affect ranking

A good internal linking structure and quality backlinks are important if you want to rank high. Google crawls the web, following each link it finds, so if your links are lacking, it can cause problems with ranking.

8. Your site doesn’t have a proper internal linking structure

Another reason your content doesn’t appear in the search results is that a crucial part of your SEO strategy is not in order. Don’t underestimate the importance of site structure – your internal linking structure – for your SEO strategy. A clear site structure helps Google understand your site. If your internal linking structure is poor, your chances of ranking high are lower – even when your content is well-optimized and awesome.

How to fix it

Start adding those links! Make sure that your important posts and pages have the most internal links to them. But don’t randomly add links: make sure you add relevant, related links that add value for your users.

You can use the Yoast SEO orphaned content filter to find posts without incoming internal links. Yoast SEO Premium will help you even more by offering helpful linking suggestions as you write. In addition, if you use Yoast SEO Premium, you get various other AI features, like Yoast AI Optimize, that help you do the hard work. And if you really want to improve your site structure, check out our site structure training — which is also included in Premium!

Pro tip: Take care of your orphaned content and internal linking the easy way with our SEO workouts, available in Yoast SEO Premium.

Read on: Site structure: the ultimate guide »

9. Your site doesn’t have enough quality backlinks

If you just started with your website, your content won’t instantly rank. Not even if you have optimized everything perfectly and every bullet in Yoast SEO is green. To rank, you’ll need some links from other websites. After all, Google has to know your website exists.

How to fix it

Creating incredible content is a good way to get links to your pages. High-quality content attracts clicks from readers who might share the content far and wide via social media. All this helps to get those links. Of course, you can do more to get links in a natural, non-spammy way: here are fifteen ways of getting high-quality backlinks.

To get (more) backlinks, you can reach out to other websites. You’ll need to do some PR or link building. Ask them to mention your site or talk about your product and link to your site. You can also use social media to get the word out! Learn all about link-building strategies in our All-Around SEO training!

Content and keyword issues affecting ranking

If everything else is as it should be SEO-wise, then the reason your page or site is not ranking might be related to your content or keywords.

10. Your page is great, but there’s too much competition

Usually, a page doesn’t rank because there’s simply too much competition. If you optimize your content for competitive keywords and keyphrases, such as [cat behavior], [robot vacuum cleaner], or [real estate agent], chances are high that you won’t rank for that term. 

Check the results pages for your keyword to determine if this is the problem. Do high authority sites like Wikipedia or Amazon dominate the first page? Do you see many sites that have already firmly established themselves in this niche? Your site probably doesn’t have the authority that these other sites have (yet). So you can optimize all you want, but unfortunately, that’s not enough to rank high in the search results if your niche is too competitive.

How to fix it

If you want to rank for highly competitive terms, try a long-tail keyword strategy. Write content that targets related long-tail keywords and phrases before tackling the competitive keywords. Once these long-tail articles start ranking, it becomes easier to rank for more competitive terms too. Such a strategy requires long-term effort, but in the end, it will pay off.

Read more: Why you should focus on long tail keywords »

11. Low-quality content or wrong type of intent

Another reason your content isn’t ranking is that it doesn’t match the intent of people searching for your keyword. Search intent is important for search engines: do people want to buy something, go to a specific website, or seek information? Even if you’re targeting a more long-tail keyphrase, if your content doesn’t match the dominant intent of searchers, search engines won’t show it in the results because it won’t be what people are looking for.

Let’s look at a few examples. Say you’re a dog trainer who wants to rank for puppy training services, so you optimize for [training your puppy], with transactional intent in mind. But if you look at the search results, you’ll see that there are informational videos, and all the results explain how to train a puppy yourself. So, searchers have informational intent. This can work the other way around, too. If you’ve written a step-by-step guide for your blog on making garden decorations, aiming to rank for [flower garland garden decoration], you may have trouble ranking for that term if people just want to buy that, not make it themselves.

Remember that not every search term has one dominant type of intent. Also, it isn’t impossible to rank with content for differing intent. Still, it can be worthwhile to look into this if your optimized content doesn’t rank in the search engines.

How to fix it

Unfortunately, you don’t have the power to change the intent of search engine users. But you can adapt your content strategy. If your optimized content isn’t ranking, look at the search results (use private mode) and analyze what you see. Is one specific type of result dominant? Are there images or videos? Which related queries are shown? This is where your opportunities are. If you find primarily informational intent for a query, you can write content to get people to your site, establish your brand as a reliable source of information, and stay top of mind when people want to buy something. If you find a lot of images in the search results, you may need to focus more on image SEO. Consider what you see on the results pages when determining your SEO strategy.

12. Your content lacks uniqueness

Even well-written and optimized content might struggle to rank if it doesn’t stand out. Search engines prioritize content that offers a unique perspective or provides additional value compared to existing articles on the same topic.

Check the search results for your target keywords and examine the top-ranking pages. Does your content offer something different or more insightful? If your page presents similar information in a comparable format, you may find it difficult to climb the rankings. With the advent of generative AI, we’ll see a wave of mediocre sameness appear in the search results. If you publish the same stuff, search engines won’t bother with it.

Generative AI can help create content, but maintaining quality and relevance still takes work. While AI can quickly produce large volumes of content, you should prioritize quality over quantity and make sure the material is original and valuable to your audience. AI-generated content can be repetitive or lack diverse perspectives, so it’s essential to refine it with your unique insights or expert opinions.

Additionally, the content should always align with your audience’s needs and search intent, as AI may not fully capture human nuances. Always comply with search engine guidelines regarding AI-generated content to avoid potential penalties or indexing issues. You can enhance your content strategy while preserving its integrity by using AI as a supportive tool rather than a standalone solution.

How to fix it

Quite simply: add unique insights and views. Add your own voice and incorporate original research, case studies, or expert opinions to set your content apart. Keep your content fresh with the latest information, trends, or data to maintain relevance and uniqueness. Encourage comments and discussions to build a community around your content, making it more dynamic and engaging.

Is your optimized content still not ranking?

Multiple reasons could prevent a post from ranking. Have you optimized your post correctly with Yoast SEO? Then the most common cause is likely to be that the competition in your niche is too fierce. Unfortunately, SEO is a long-term strategy: you need to work hard and be patient. In the meantime, you can tackle many other aspects of your SEO (site structure, link building). Try to focus on all aspects of website optimization and aim to be the best result. It will pay off eventually!

Read more: Rank tracking: why you should monitor your keywords »

Google Shows How To Block Bots And Boost Site Performance via @sejournal, @martinibuster

Google’s Martin Splitt answered a question about malicious bots that impact site performance, offering suggestions every SEO and site owner should know and put into action.

Malicious Bots Are An SEO Problem

Many SEOs who do site audits overlook security and bot traffic because it’s not widely understood by digital marketers that security events impact site performance and can explain why a site is inadequately crawled. Improving core web vitals will do little for site performance when a poor security posture is dragging it down.

Every website is under attack, and the effects of excessive crawling can trigger a “500 server error” response code, signaling an inability to serve web pages and hindering Google’s ability to crawl them.

How To Defend Against Bot Attacks

The person asking the question wanted Google’s advice on how to fight back against the waves of scraper bots impacting their server performance.

This is the question asked:

“Our website is experiencing significant disruptions due to targeted scraping by automated software, leading to performance issues, increased server load, and potential data security concerns. Despite IP blocking and other preventive measures, the problem persists. What can we do?”

Google’s Martin Splitt suggested identifying the service that is serving as the source of the attacks and notifying them of an abusive use of their services. He also recommended the firewall capabilities of a CDN (Content Delivery Network).

Martin answered:

“This sounds like somewhat of a distributed denial-of-service issue if the crawling is so aggressive that it causes performance degradation.

You can try identifying the owner of the network where the traffic is coming from, thank “their hoster” and send an abuse notification. You can use WHOIS information for that, usually.

Alternatively, CDNs often have features to detect bot traffic and block it and by definition they take the traffic away from your server and distribute it nicely, so that’s a win. Most CDNs recognize legitimate search engine bots and won’t block them but if that’s a major concern for you, consider asking them before starting to use them.”

Will Google’s Advice Work?

Identifying the cloud provider or server data center that’s hosting the malicious bots is good advice. But there are many scenarios where that won’t work.

Three Reasons Why Contacting Resource Providers Won’t Work

1. Many Bots Are Hidden

Bots often use VPNs and open source “Tor” networks that hide their source, defeating attempts to identify the cloud service or web host providing the infrastructure for the bots. Hackers also hide behind compromised home and business computers, called botnets, to launch their attacks. There’s no way to identify them.

2. Bots Switch IP Addresses

Some bots respond to IP blocking by instantly switching to a different network to immediately resume their attack. An attack can originate from a German server and, when blocked, will switch to a network provider in Asia.

3. Inefficient Use Of Time

Contacting network providers about abusive users is futile when the source of the traffic is obfuscated or from hundreds of sources. Many site owners and SEOs might be surprised to discover how intensive the attacks on their websites are. Even taking action against a small group of offenders is an inefficient use of time because there are literally millions of other bots that will replace the ones blocked by a cloud provider.

And what about botnets made up of thousands of compromised computers around the world? Think you have time to notify all of those ISPs?

Those are three reasons why notifying infrastructure providers is not a viable approach to stopping bots that impact site performance. Realistically, it’s a futile and inefficient use of time.

Use A WAF To Block Bots

Using a Web Application Firewall (WAF) is a good idea, and that’s the function Martin Splitt was pointing to when he mentioned using a CDN (content delivery network). A CDN, like Cloudflare, sends browsers and crawlers the requested web page from a server located closest to them, speeding up site performance and reducing server resources for the site owner.

A CDN also has a WAF (Web Application Firewall) which automatically blocks malicious bots. Martin’s suggestion for using a CDN is definitely a good option, especially because it has the additional benefit of improving site performance.

An option that Martin didn’t mention is to use a WordPress plugin WAF like Wordfence. Wordfence has a WAF that automatically shuts down bots based on their behavior. For example, if a bot requests a ridiculous number of pages, it will automatically create a temporary IP block. If the bot rotates to another IP address, it will identify the crawling behavior and block it again.

Another solution to consider is a SaaS platform like Sucuri that offers a WAF and a CDN to speed up performance. Both Wordfence and Sucuri are trustworthy providers of WordPress security and they come with limited but effective free versions.

Listen to the question and answer at the 6:36 minute mark of the Google SEO Office Hours podcast:

Featured Image by Shutterstock/Krakenimages.com

What To Know About Medium-Level WordPress Vulnerabilities via @sejournal, @martinibuster

The majority of WordPress vulnerabilities, about 67% of those discovered in 2023, are rated as medium level. Because they’re the most common, it makes sense to understand what they are and when they represent an actual security threat. These are the facts about those kinds of vulnerabilities and what you should know about them.

What Is A Medium Level Vulnerability?

A spokesperson from WPScan, a WordPress security scanning company owned by Automattic, explained that they use the Common Vulnerability Scoring System (CVSS) to rate the severity of a threat. The scores are based on a numbering system from 1 to 10 and ratings of low, medium, high, and critical.

The WPScan spokesperson explained:

“We don’t flag levels as the chance of happening, but the severity of the vulnerability based on FIRST’s CVSS framework. Speaking broadly, a medium-level severity score means either the vulnerability is hard to exploit (e.g., SQL Injection that requires a highly privileged account) or the attacker doesn’t gain much from a successful attack (e.g., an unauthenticated user can get the content of private blog posts).

We generally don’t see them being used as much in large-scale attacks because they are less useful than higher severity vulnerabilities and harder to automate. However, they could be useful in more targeted attacks, for example, when a privileged user account has already been compromised, or an attacker knows that some private content contains sensitive information that is useful to them.

We would always recommend upgrading vulnerable extensions as soon as possible. Still, if the severity is medium, then there is less urgency to do so, as the site is less likely to be the victim of a large-scale automated attack.

An untrained user may find the report a bit hard to digest. We did our best to make it as suitable as possible for all audiences, but I understand it’d be impossible to cover everyone without making it too boring or long. And the same can happen to the reported vulnerability. The user consuming the feed would need some basic knowledge of their website setup to consider which vulnerability needs immediate attention and which one can be handled by the WAF, for example.

If the user knows, for example, that their site doesn’t allow users to subscribe to it. All reports of subscriber+ vulnerabilities, independent of the severity level, can be reconsidered. Assuming that the user maintains a constant review of the site’s user base.

The same goes for contributor+ reports or even administrator levels. If the person maintains a small network of WordPress sites, the admin+ vulnerabilities are interesting for them since a compromised administrator of one of the sites can be used to attack the super admin.”

Contributor-Level Vulnerabilities

Many medium severity vulnerabilities require contributor-level access. A contributor is an access role that gives a registered user the ability to write and submit content, although in general they don’t have the ability to publish it.

Most websites don’t have to worry about security threats that require contributor level authentication because most sites don’t offer that level of access.

Chloe Chamberland – Threat Intelligence Lead at Wordfence – explained that most site owners shouldn’t worry about medium severity vulnerabilities that require contributor-level access to exploit, because most WordPress sites don’t offer that permission level. She also noted that these kinds of vulnerabilities are hard to scale because exploiting them is difficult to automate.

Chloe explained:

“For most site owners, vulnerabilities that require contributor-level access and above to exploit are something they do not need to worry about. This is because most sites do not allow contributor-level registration and most sites do not have contributors on their site.

In addition, most WordPress attacks are automated and are looking for easy to exploit high value returns so vulnerabilities like this are unlikely to be targeted by most WordPress threat actors.”

Website Publishers That Should Worry

Chloe also said that publishers who do offer contributor-level permissions may have several reasons to be concerned about these kinds of exploits:

“The concern with exploits that require contributor-level access to exploit arises when site owners allow contributor-level registration, have contributors with weak passwords, or the site has another plugin/theme installed with a vulnerability that allows contributor-level access in some way and the attacker really wants in on your website.

If an attacker can get their hands on one of these accounts, and a contributor-level vulnerability exists, then they may be provided with the opportunity to escalate their privileges and do real damage to the victim. Let’s take a contributor-level Cross-Site Scripting vulnerability for example.

Due to the nature of contributor-level access, an administrator would be highly likely to preview the post for review at which point any injected JavaScript would execute – this means the attacker would have a relatively high chance of success due to the admin previewing the post for publication.

As with any Cross-Site Scripting vulnerability, this can be leveraged to add a new administrative user account, inject backdoors, and essentially do anything a site administrator could do. If a serious attacker has access to a contributor-level account and no other trivial way to elevate their privileges, then they’d likely leverage that contributor-level Cross-Site Scripting to gain further access. As previously mentioned, you likely won’t see that level of sophistication targeting the vast majority of WordPress sites, so it’s really high value sites that need to be concerned with these issues.

In conclusion, while I don’t think a vast majority of site owners need to worry about contributor-level vulnerabilities, it’s still important to take them seriously if you allow user registration at that level on your site, you don’t enforce unique strong user passwords, and/or you have a high value WordPress website.”

Be Aware Of Vulnerabilities

While many of the medium level vulnerabilities may not be something to worry about, it’s still a good idea to stay informed about them. Security scanners like the free version of WPScan can warn you when a plugin or theme becomes vulnerable. It’s a good way to have a warning system in place to keep on top of vulnerabilities.

WordPress security plugins like Wordfence offer a proactive security stance that actively blocks automated hacking attacks and can be further tuned by advanced users to block specific bots and user agents. The free version of Wordfence offers significant protection in the form of a firewall and a malware scanner. The paid version offers protection for all vulnerabilities as soon as they’re discovered and before the vulnerability is patched. I use Wordfence on all of my websites and can’t imagine setting up a website without it.

Security is generally not regarded as an SEO issue, but it should be, because failure to secure a site can undo all the hard work done to make it rank well.

Featured Image by Shutterstock/Juan villa torres

2024 WordPress Vulnerability Report Shows Errors Sites Keep Making via @sejournal, @martinibuster

WordPress security scanner WPScan’s 2024 WordPress vulnerability report calls attention to WordPress vulnerability trends and suggests the kinds of things website publishers (and SEOs) should be looking out for.

Some of the key findings from the report were that just over 20% of vulnerabilities were rated as high or critical level threats, with medium severity threats, at 67% of reported vulnerabilities, making up the majority. Many treat medium level vulnerabilities as if they were low-level threats, but they’re not, and they deserve attention.

The WPScan report advised:

“While severity doesn’t translate directly to the risk of exploitation, it’s an important guideline for website owners to make an educated decision about when to disable or update the extension.”

WordPress Vulnerability Severity Distribution

Critical level vulnerabilities, the highest level of threat, represented only 2.38% of vulnerabilities, which is essentially good news for WordPress publishers. Yet, as mentioned earlier, when combined with the percentage of high level threats (17.68%), the number of concerning vulnerabilities rises to just over 20%.

Here are the percentages by severity ratings:

  • Critical 2.38%
  • High 17.68%
  • Medium 67.12%
  • Low 12.83%

Authenticated Versus Unauthenticated

Authenticated vulnerabilities are those that require an attacker to first attain user credentials and their accompanying permission levels in order to exploit a particular vulnerability. Exploits that require subscriber-level authentication are the most exploitable of the authenticated exploits, and those that require administrator level access present the least risk (although not always a low risk, for a variety of reasons).

Unauthenticated attacks are generally the easiest to exploit because anyone can launch an attack without having to first acquire a user credential.

The WPScan vulnerability report found that about 22% of reported vulnerabilities required subscriber level or no authentication at all, representing the most exploitable vulnerabilities. On the other end of the exploitability scale are vulnerabilities requiring admin permission levels, representing 30.71% of reported vulnerabilities.

Permission Levels Required For Exploits

Vulnerabilities requiring administrator level credentials represented the highest percentage of exploits, followed by Cross Site Request Forgery (CSRF) at 24.74% of vulnerabilities. This is interesting because CSRF is an attack that uses social engineering to get a victim to click a link from which the user’s permission levels are acquired. If attackers can trick an admin level user into following a link, they can assume that level of privileges on the WordPress website.

The following are the percentages of exploits ordered by the roles necessary to launch an attack.

Ascending Order Of User Roles For Vulnerabilities

  • Author 2.19%
  • Subscriber 10.4%
  • Unauthenticated 12.35%
  • Contributor 19.62%
  • CSRF 24.74%
  • Admin 30.71%

Most Common Vulnerability Types Requiring Minimal Authentication

Broken Access Control in the context of WordPress refers to a security failure that can allow an attacker without necessary permission credentials to gain access to higher credential permissions.

In the section of the report that looks at the occurrences and vulnerabilities underlying unauthenticated or subscriber level vulnerabilities reported (Occurrence vs Vulnerability on Unauthenticated or Subscriber+ reports), WPScan breaks down the percentages for each vulnerability type that is most common for exploits that are the easiest to launch (because they require minimal to no user credential authentication).

The WPScan threat report noted that Broken Access Control represents a whopping 84.99% followed by SQL injection (20.64%).

The Open Worldwide Application Security Project (OWASP) defines Broken Access Control as:

“Access control, sometimes called authorization, is how a web application grants access to content and functions to some users and not others. These checks are performed after authentication, and govern what ‘authorized’ users are allowed to do.

Access control sounds like a simple problem but is insidiously difficult to implement correctly. A web application’s access control model is closely tied to the content and functions that the site provides. In addition, the users may fall into a number of groups or roles with different abilities or privileges.”

SQL injection, at 20.64% represents the second most prevalent type of vulnerability, which WPScan referred to as both “high severity and risk” in the context of vulnerabilities requiring minimal authentication levels because attackers can access and/or tamper with the database which is the heart of every WordPress website.

These are the percentages:

  • Broken Access Control 84.99%
  • SQL Injection 20.64%
  • Cross-Site Scripting 9.4%
  • Unauthenticated Arbitrary File Upload 5.28%
  • Sensitive Data Disclosure 4.59%
  • Insecure Direct Object Reference (IDOR) 3.67%
  • Remote Code Execution 2.52%
  • Other 14.45%

Vulnerabilities In The WordPress Core Itself

The overwhelming majority of vulnerability issues were reported in third-party plugins and themes. However, a total of 13 vulnerabilities were reported in the WordPress core itself in 2023. Of those thirteen, only one was rated as a high severity threat, the second highest level under the Common Vulnerability Scoring System (CVSS), with critical being the highest.

The WordPress core platform itself is held to the highest standards and benefits from a worldwide community that is vigilant in discovering and patching vulnerabilities.

Website Security Should Be Considered As Technical SEO

Site audits don’t normally cover website security, but in my opinion every responsible audit should at least talk about security headers. As I’ve been saying for years, website security quickly becomes an SEO issue once a website’s rankings start disappearing from the search engine results pages (SERPs) because the site has been compromised by a vulnerability. That’s why it’s critical to be proactive about website security.

According to the WPScan report, the main point of entry for hacked websites were leaked credentials and weak passwords. Ensuring strong password standards plus two-factor authentication is an important part of every website’s security stance.

Using security headers is another way to help protect against Cross-Site Scripting and other kinds of vulnerabilities.
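As an illustration, these are some commonly used security headers; the values below are generic starting points rather than a one-size-fits-all policy, so adjust them to your site before deploying:

  Content-Security-Policy: default-src 'self'
  X-Content-Type-Options: nosniff
  X-Frame-Options: SAMEORIGIN
  Strict-Transport-Security: max-age=31536000; includeSubDomains
  Referrer-Policy: strict-origin-when-cross-origin

A strict Content-Security-Policy in particular limits where scripts can load from, which blunts many Cross-Site Scripting attacks.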

Lastly, a WordPress firewall and website hardening are also useful proactive approaches to website security. I once added a forum to a brand new website I created and it was immediately under attack within minutes. Believe it or not, virtually every website worldwide is under attack 24 hours a day by bots scanning for vulnerabilities.

Read the WPScan Report:

WPScan 2024 Website Threat Report

Featured Image by Shutterstock/Ljupco Smokovski

WordPress Discovers XSS Vulnerability – Recommends Updating To 6.5.2 via @sejournal, @martinibuster

WordPress announced the 6.5.2 Maintenance and Security Release update, which patches a stored cross-site scripting vulnerability and fixes over a dozen bugs in the core and the block editor.

The same vulnerability affects both the WordPress core and the Gutenberg plugin.

Cross Site Scripting (XSS)

An XSS vulnerability was discovered in WordPress that could allow an attacker to inject scripts into a website that then attacks site visitors to those pages.

There are three kinds of XSS vulnerabilities but the most commonly discovered in WordPress plugins, themes and WordPress itself are reflected XSS and stored XSS.

Reflected XSS requires a victim to click a link, an extra step that makes this kind of attack harder to launch.

A stored XSS is the more worrisome variant because it exploits a flaw that allows the attacker to upload a script into the vulnerable site that can then launch attacks against site visitors. The vulnerability discovered in WordPress is a stored XSS.

The threat itself is mitigated to a certain degree because this is an authenticated stored XSS, which means that the attacker first needs to acquire at least contributor-level permissions in order to exploit the website flaw that makes the vulnerability possible.

This vulnerability is rated as a medium level threat, receiving a Common Vulnerability Scoring System (CVSS) score of 6.4 on a scale of 1 – 10.

Wordfence describes the vulnerability:

“WordPress Core is vulnerable to Stored Cross-Site Scripting via user display names in the Avatar block in various versions up to 6.5.2 due to insufficient output escaping on the display name. This makes it possible for authenticated attackers, with contributor-level access and above, to inject arbitrary web scripts in pages that will execute whenever a user accesses an injected page.”

WordPress.org Recommends Updating Immediately

The official WordPress announcement recommended that users update their installations, writing:

“Because this is a security release, it is recommended that you update your sites immediately. Backports are also available for other major WordPress releases, 6.1 and later.”

Read the Wordfence advisories:

WordPress Core < 6.5.2 – Authenticated (Contributor+) Stored Cross-Site Scripting via Avatar Block

Gutenberg 12.9.0 – 18.0.0 – Authenticated (Contributor+) Stored Cross-Site Scripting via Avatar Block

Read the official WordPress.org announcement:

WordPress 6.5.2 Maintenance and Security Release

Featured Image by Shutterstock/ivan_kislitsin

XSS Vulnerability Affects Beaver Builder WordPress Page Builder via @sejournal, @martinibuster

The popular Beaver Builder WordPress Page Builder was found to contain an XSS vulnerability that can allow an attacker to inject scripts into the website that will run when a user visits a webpage.

Beaver Builder

Beaver Builder is a popular plugin that allows anyone to create a professional looking website using an easy to use drag and drop interface. Users can start with a predesigned template or create a website from scratch.

Stored Cross Site Scripting (XSS) Vulnerability

Security researchers at Wordfence published an advisory about an XSS vulnerability affecting the page builder plugin. An XSS vulnerability is typically found in a part of a theme or plugin that allows user input. The flaw arises when there is insufficient filtering of what can be input (a process called input sanitization). Another flaw that leads to XSS is insufficient output escaping, a security measure that prevents harmful scripts from being passed to the browser.

This specific vulnerability is called a stored XSS. Stored means that an attacker is able to inject a script directly onto the web server. This is different from a reflected XSS, which requires a victim to click a link to the attacked website in order to execute a malicious script. A stored XSS (like the one affecting Beaver Builder) is generally considered more dangerous than a reflected XSS.

The security flaws that gave rise to an XSS vulnerability in the Beaver Builder were due to insufficient input sanitization and output escaping.
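To make that concrete, here is a rough, hypothetical sketch of output escaping in WordPress PHP. This is not Beaver Builder’s actual code; the $attributes array and markup are made up for illustration, but esc_url() and esc_html() are the standard WordPress escaping functions:

  <?php
  // Hypothetical button attributes supplied by a contributor-level user.
  $link  = $attributes['link'];
  $label = $attributes['label'];

  // Unsafe: echoing raw user input lets an injected <script> run in the viewer's browser.
  // echo '<a href="' . $link . '">' . $label . '</a>';

  // Safer: escape on output so HTML special characters are rendered harmless.
  echo '<a class="button" href="' . esc_url( $link ) . '">' . esc_html( $label ) . '</a>';
  ?>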

Wordfence described the vulnerability:

“The Beaver Builder – WordPress Page Builder plugin for WordPress is vulnerable to Stored Cross-Site Scripting via the plugin’s Button Widget in all versions up to, and including, 2.8.0.5 due to insufficient input sanitization and output escaping on user supplied attributes. This makes it possible for authenticated attackers, with contributor-level access and above, to inject arbitrary web scripts in pages that will execute whenever a user accesses an injected page.”

The vulnerability is rated 6.4, a medium level threat. Attackers must gain at least contributor-level permissions in order to launch an attack, which makes this vulnerability a little harder to exploit.

The official Beaver Builder changelog, which documents what’s contained in an update, notes that a patch was issued in version 2.8.0.7.

The changelog notes:

“Fix XSS issue in Button & Button Group Modules when using lightbox”

Recommended action: It’s generally good practice to update and patch a vulnerability before an attacker is able to exploit it. It’s also a best practice to stage the site before pushing an update live, in case the updated plugin conflicts with another plugin or theme.

Read the Wordfence advisory:

Beaver Builder – WordPress Page Builder <= 2.8.0.5 – Authenticated (Contributor+) Stored Cross-Site Scripting via Button

Featured Image by Shutterstock/Prostock-studio

WordPress Backup Plugin DoS Vulnerability Affects +200,000 Sites via @sejournal, @martinibuster

A popular WordPress backup plugin installed in over 200,000 websites recently patched a high severity vulnerability that could lead to a denial of service attack. Wordfence assigned a CVSS severity level rating of High, with a score of 7.5/10, indicating that plugin users should take note and update their plugin.

Backuply Plugin

The vulnerability affects the Backuply WordPress backup plugin. Creating backups is a necessary function for every website, not just WordPress sites, because backups help publishers roll back to a previous version should the server fail and lose data in a catastrophic failure.

Website backups are invaluable for site migrations, hacking recovery and failed updates that render a website non-functional.

Backuply is an especially useful plugin because it backs up data to multiple trusted third-party cloud services and supports multiple ways to download local copies, creating redundant backups so that if a cloud backup is bad, the site can be recovered from another backup stored locally.

According to Backuply:

“Backuply comes with Local Backups and Secure Cloud backups with easy integrations with FTP, FTPS, SFTP, WebDAV, Google Drive, Microsoft OneDrive, Dropbox, Amazon S3 and easy One-click restoration.”

Vulnerability Affecting Backuply

The United States Government National Vulnerability Database warns that Backuply up to and including version 1.2.5 contains a flaw that can lead to denial of service attacks.

The warning explains:

“This is due to direct access of the backuply/restore_ins.php file and. This makes it possible for unauthenticated attackers to make excessive requests that result in the server running out of resources.”

Denial Of Service (DoS) Attack

A denial of service (DoS) attack is one in which a flaw in a software allows an attacker to make so many rapid requests that the server runs out of resources and can no longer process any further requests, including serving webpages to site visitors.

A feature of DoS attacks is that it is sometimes possible to upload scripts, HTML or other code that can then be executed, allowing the attacker to perform virtually any action.

Vulnerabilities that enable DoS attacks are considered critical, and steps to mitigate them should be taken as soon as possible.

Backuply Changelog Documentation

The official Backuply changelog, which announces the details of every update, notes that a fix was implemented in version 1.2.6. Backuply’s transparency and rapid response are responsible and a sign of a trustworthy developer.

According to the Changelog:

“1.2.6 (FEBRUARY 08 2024)
[Security-Fix] In some cases it was possible to fill up the logs and has been fixed. Reported by Villu Orav (WordFence)”

Recommendations

In general it is highly recommended that all users of the Backuply plugin update their plugin as soon as possible in order to prevent an unwanted security event.

Read the National Vulnerability Database description of the vulnerability:

CVE-2024-0842

Read the Wordfence Backuply vulnerability report:

Backuply – Backup, Restore, Migrate and Clone <= 1.2.5 – Denial of Service

Featured Image by Shutterstock/Doppelganger4

Mozilla VPN Security Risks Discovered via @sejournal, @martinibuster

Mozilla published the results of a recent third-party security audit of its VPN service as part of its commitment to user privacy and security. The audit revealed security issues, which were presented to Mozilla to be addressed with fixes to ensure user privacy and security.

Many search marketers use VPNs during the course of their business, especially when using a Wi-Fi connection, in order to protect sensitive data, so the trustworthiness of a VPN is essential.

Mozilla VPN

A Virtual Private Network (VPN) is a service that hides (encrypts) a user’s Internet traffic so that no third party (like an ISP) can snoop and see what sites a user is visiting.

VPNs also add a layer of security from malicious activities such as session hijacking which can give an attacker full access to the websites a user is visiting.

There is a high expectation from users that the VPN will protect their privacy when they are browsing on the Internet.

Mozilla thus employs the services of a third party to conduct a security audit to make sure their VPN is thoroughly locked down.

Security Risks Discovered

The audit revealed vulnerabilities of medium or higher severity, ranging from Denial of Service (DoS) risks to keychain access leaks (related to encryption) and a lack of access controls.

Cure53, the third-party security firm, discovered several risks. Among the issues were potential VPN leaks and a vulnerability in which a rogue extension could disable the VPN.

The scope of the audit encompassed the following products:

  • Mozilla VPN Qt6 App for macOS
  • Mozilla VPN Qt6 App for Linux
  • Mozilla VPN Qt6 App for Windows
  • Mozilla VPN Qt6 App for iOS
  • Mozilla VPN Qt6 App for Android

These are the risks identified by the security audit:

  • FVP-03-003: DoS via serialized intent
  • FVP-03-008: Keychain access level leaks WG private key to iCloud
  • FVP-03-010: VPN leak via captive portal detection
  • FVP-03-011: Lack of local TCP server access controls
  • FVP-03-012: Rogue extension can disable VPN using mozillavpnnp (High)

The rogue extension issue was rated as high severity. Each risk was subsequently addressed by Mozilla.

Mozilla presented the results of the security audit as part of their commitment to transparency and to maintain the trust and security of their users. Conducting a third party security audit is a best practice for a VPN provider that helps assure that the VPN is trustworthy and reliable.

Read Mozilla’s announcement:
Mozilla VPN Security Audit 2023

Featured Image by Shutterstock/Meilun