10 Hosting Trends Agencies Should Watch In 2025

This post was sponsored by Bluehost. The opinions expressed in this article are the sponsor’s own.

Which hosting service is best for agencies?

How do I uncover what will be best for my clients in 2025?

What features should my hosting service have in 2025?

Hosting has evolved well beyond keeping websites online.

Hosting providers must align their services with clients’ needs and keep pace with constant technological change.

Today, quality hosting companies must focus on speed, security, and scalability. Staying ahead of hosting trends is critical to maintaining competitive offerings, optimizing workflows, and meeting client demands.

So, what should you watch for in 2025?

The next 12 months promise significant shifts in hosting technologies, with advancements in AI, automation, security, and sustainability leading the way.

Understanding and leveraging these trends enables agencies and professionals to provide better client experiences, streamline operations, and reduce the negative effects of future industry changes.

Trend 1: Enhanced AI & Automation Implemented In Hosting

AI and automation are already transforming hosting, making it smarter and more efficient for service providers, agencies, brands, and end customers alike.

Hosting providers now leverage AI to optimize server performance, predict maintenance needs, and even supplement customer support with AI-driven features like chatbots.

As a result, automating routine tasks such as backups, updates, and resource scaling reduces downtime and the need for manual intervention. These innovations are game-changing for those managing multiple client sites and will become increasingly important in 2025.

It only makes sense.

Automated systems free up valuable time, allowing you to focus on strategic growth instead of tedious maintenance tasks. AI-powered insights can also identify performance bottlenecks, enabling you to address issues before they impact your website or those of your clients.

Agencies that adopt these technologies this year will not only deliver exceptional service but also position themselves as forward-thinking.

Bluehost embraces automation with features like automated backups, one-click updates, and a centralized dashboard for easy site management. These tools streamline workflows, enabling agencies and professionals to manage multiple sites with minimal effort while ensuring optimal performance.

Trend 2: Multi-Cloud & Hybrid Cloud Solutions Are Now Essential

In 2025, as businesses demand more flexibility and reliability from their online infrastructure, multi-cloud and hybrid cloud solutions will become essential in the hosting world.

These approaches offer the best of both worlds:

  • The ability to leverage multiple cloud providers for redundancy and performance.
  • The option to combine public and private cloud environments for greater control and customization.

For agencies managing diverse client needs, multi-cloud and hybrid cloud strategies provide the scalability and adaptability required to meet modern demands. Multi-cloud solutions allow agencies to distribute their clients’ workloads across multiple cloud providers, ensuring that no single point of failure disrupts their operations.

This feature is particularly valuable for agencies with high-traffic websites, where downtime or slow performance can have a significant impact on revenue and user experience. Hybrid cloud solutions, on the other hand, let agencies blend the scalability of public clouds with the security and control of private cloud environments.

This service is ideal for clients with sensitive data or compliance requirements, such as ecommerce or healthcare businesses.

Bluehost Cloud provides scalable infrastructure and tools that enable agencies to customize hosting solutions to fit their clients’ unique requirements. Our cloud hosting solution’s elastic architecture ensures that websites can handle sudden traffic spikes without compromising speed or reliability.

Additionally, our intuitive management dashboard allows agencies to easily monitor and allocate resources across their client portfolio, making it simple to implement tailored solutions for varying workloads.

By adopting multi-cloud and hybrid cloud strategies, agencies can offer their clients enhanced performance, improved redundancy, and greater control over their hosting environments.

With our scalable solutions and robust toolset, agencies can confidently deliver hosting that grows with their clients’ businesses while maintaining consistent quality and reliability. This flexibility not only meets today’s hosting demands but also helps position your agency for long-term success in a rapidly evolving digital landscape.

Trend 3: Edge Computing & CDNs Replace AMP For Improving Website Speed

As online audiences grow, the demand for faster, more responsive websites has never been higher. Edge computing and Content Delivery Networks (CDNs) are at the forefront of this evolution, enabling websites to reduce latency significantly. For agencies managing clients with diverse and international audiences, these technologies are crucial for improving user experience and ensuring website performance remains competitive.

Edge computing brings data processing closer to the end user by leveraging servers located at the “edge” of a network, reducing the time it takes for information to travel.

Combined with CDNs that cache website content on servers worldwide, these technologies ensure faster load times, smoother navigation, and better performance metrics.

These features are especially beneficial for media-heavy or high-traffic websites, where even a slight delay can impact engagement and conversions.

Bluehost integrates with leading CDN solutions to deliver content quickly and efficiently to users across the globe. By leveraging a CDN, Bluehost ensures that websites load faster regardless of a visitor’s location, enhancing user experience and SEO performance.

This integration simplifies the optimization of site speed for agencies with multiple clients. By adopting edge computing and CDN technology, you can help your clients achieve faster load times, improved site stability, and higher customer satisfaction.

Bluehost’s seamless CDN integration enables you to deliver a hosting solution that meets the expectations of a modern, global audience while building trust and loyalty with your clients.

Trend 4: Core Web Vitals & SEO Hosting Features Make Or Break Websites

Core Web Vitals play an important role in today’s SEO, as Google is increasingly emphasizing website performance and user experience in its ranking algorithms. Today, loading speed, interactivity, and visual stability impact a site’s ability to rank well in search results and keep visitors engaged.

That means optimizing Core Web Vitals isn’t just an SEO task for agencies managing client websites. Fast load times and responsive design are critical parts of delivering a high-quality digital experience. For example, metrics like Largest Contentful Paint (LCP), which measures how quickly a page’s main content loads, depend heavily on hosting infrastructure.

Agencies need hosting solutions optimized for these metrics to ensure their clients’ sites stay competitive in the SERPs.

Bluehost offers a WordPress-optimized hosting environment with features specifically designed to improve load times and server response speeds. From advanced caching technology to robust server architecture, Bluehost ensures that sites meet Core Web Vitals standards with ease.

Additionally, our hosting solutions include tools for monitoring site performance, allowing agencies to proactively address any issues that could impact rankings or user experience.

By prioritizing Core Web Vitals and leveraging SEO-focused hosting features, agencies can enhance their clients’ visibility, engagement, and overall online success. With Bluehost’s optimized hosting solutions, you’ll have the tools and infrastructure needed to deliver fast, stable, and high-performing websites that delight users and search engines.

Trend 5: Sustainable Hosting Practices Help Reduce Energy Consumption

Sustainability is no longer just a buzzword. It’s a key consideration for businesses and agencies alike. As 2025 progresses, more clients will prioritize environmentally conscious practices, and hosting providers will step up to offer greener solutions, such as energy-efficient data centers and carbon offset programs.

Migrating to a sustainable hosting provider not only supports client values but also demonstrates a commitment to responsible business practices, which will resonate more with consumers in 2025 than ever before.

Efficient hosting practices reduce energy consumption and create a more sustainable digital ecosystem. They also allow you to help clients meet their environmental goals without compromising on performance.

These benefits are especially valuable for clients with higher energy and performance demands, such as those in ecommerce, media-heavy, or high-traffic industries.

Bluehost has long been recognized as a trusted hosting provider that operates with efficiency in mind.

Our robust, energy-efficient infrastructure already aligns with the sustainability goals of environmentally conscious clients.

In addition, our long-standing reputation, proven history with WordPress, and demonstrable reliability enhance your clients’ sustainability objectives, ensuring they can operate responsibly and confidently.

By choosing sustainable hosting practices and partners like Bluehost, you can contribute to a greener digital future while reinforcing your clients’ environmental commitments and strengthening client relationships by aligning with their values.

Trend 6: Security Must Be A Core Offering

Security is a non-negotiable priority for any website. Cyber threats like data breaches, malware, and DDoS attacks are on the rise, and the consequences of a breach, including lost revenue, damaged reputations, and potential legal issues, can devastate clients. As a result, offering secure hosting solutions with proactive security measures is essential to safeguarding clients’ businesses and building trust.

These key features include SSL certificates, which protect sensitive data while boosting SEO rankings and user trust, and regular malware scans to prevent vulnerabilities.

They should also include automated backups that enable quick restoration in the event of a crash or attack, providing comprehensive protection and peace of mind.

Essential security features come standard in Bluehost hosting plans, including SSL certificates, daily automated backups, and proactive malware scanning.

These built-in tools eliminate the need for additional solutions, added complexity, or costs. For agencies, our security features reduce risks for your clients and provide peace of mind.

By choosing a hosting provider like Bluehost, you can prioritize client security, reinforce client trust, and minimize emergencies, allowing you to avoid spending time and resources addressing threats or repairing damage.

In short, by partnering with Bluehost, security becomes a core part of your agency’s value proposition.

Trend 7: Hosting Optimized For AI & Machine Learning Is Key To High Visibility On SERPs

As artificial intelligence and machine learning become increasingly integrated with websites and applications in 2025, hosting providers must keep pace with the increasing demands these technologies place on infrastructure.

AI-driven tools like chatbots, recommendation engines, and predictive analytics require significant computational power and seamless data processing.

AI and machine learning applications often involve handling large datasets, running resource-intensive algorithms, and maintaining real-time responsiveness. Hosting optimized for these needs ensures that websites can perform reliably under heavy workloads, reducing latency and downtime and delivering consistent performance.

If you plan to be successful, you’ll also require scalable hosting solutions. These solutions allow resources to expand dynamically with demand, accommodate growth, and handle traffic surges.

Bluehost’s scalable hosting is built to support advanced tools and applications, making it an ideal choice for agencies working on AI-driven projects. Our robust infrastructure delivers consistent performance, and flexibility allows you to scale easily as your client’s needs evolve. By leveraging Bluehost, agencies can confidently deliver AI-integrated websites that meet modern performance demands.

Trend 8: Managed Hosting Helps You Focus More On Profits

In 2025, websites will become increasingly complex. Businesses will require higher performance and reliability, and everyone will be looking to operate as lean and efficiently as possible. These trends mean managed hosting will become the go-to solution for agencies and their clients.

Managed hosting shifts time-intensive technical maintenance away from agencies and business owners by including features such as automatic updates, performance monitoring, and enhanced security. In short, managed hosting enables you to simplify workflows, save time, and deliver consistent quality to your clients.

These hosting services are particularly valuable for WordPress websites, where regular updates, plugin compatibility checks, and security enhancements occur frequently but are essential to maintaining optimal performance.

Managed hosting also typically includes tools like staging environments, which allow agencies to test changes and updates without risking disruptions to live sites and ensure you can deliver a seamless experience to clients.

Bluehost offers managed WordPress hosting that includes automatic updates, staging environments, and 24/7 expert support. These features allow you to handle technical details efficiently while focusing on delivering results for your clients without added stress or time.

Trend 9: The Shift Toward Decentralized Hosting Boosts Your Brand’s Longevity

In 2025, expect to see decentralized hosting gain attention as a futuristic approach to web hosting. Like Bitcoin and similar advancements, it leverages blockchain and peer-to-peer networks to create hosting environments that prioritize privacy, resilience, and independence from centralized control.

While this model appears to offer exciting new opportunities, it’s still in its early stages. It faces challenges in scalability, user-friendliness, and widespread adoption, so agencies typically can’t rely on it for client sites.

Decentralized hosting may become a viable option for specific use cases, such as privacy-focused projects or highly distributed systems. However, centralized hosting providers still offer the best balance of reliability, scalability, and accessibility for most businesses and agencies today.

For these reasons, agencies managing client websites will continue to focus on proven, reliable hosting solutions that deliver consistent performance and robust support.

So, while decentralized hosting may gain traction this year, Bluehost will continue to provide a trustworthy hosting environment designed to meet the needs of modern websites. With a strong emphasis on reliability, scalability, and user-friendly management tools, we offer a proven solution agencies can depend on to deliver exceptional client results.

Trend 10: Scalable Hosting Is Key For High-Growth Websites

As businesses grow, their websites will experience increasing traffic and resource demands. High-growth websites, such as ecommerce platforms, content-heavy blogs, or viral marketing campaigns, require hosting solutions that can scale instantly. And scalable hosting is critical to delivering consistent user experiences and avoiding downtime during peak periods.

Scalable hosting from a provider like Bluehost ensures your clients’ websites can easily adjust resources like bandwidth, storage, and processing power to meet fluctuating demands. Our scalable hosting solutions are designed for high-growth websites, with unmetered bandwidth and infrastructure built to handle traffic surges, ensuring websites remain fast and accessible.

These features make us the ideal choice for agencies looking to future-proof their clients’ hosting needs.

As the digital landscape continues to evolve in 2025, keeping up with the latest trends in hosting is essential for agencies to provide top-tier service, drive client satisfaction, and maintain a competitive edge. From AI and automation to scalability and security, the future of hosting demands flexible, efficient solutions tailored to modern needs.

By understanding and leveraging these trends, you can position your agency as a trusted partner and deliver exceptional results to your clients, whether by adopting managed hosting or integrating CDNs.

Bluehost hosting will meet today’s demands while helping to prepare agencies like yours for tomorrow. With features like 100% uptime guaranteed through our Service Level Agreement (SLA), 24/7 priority support, and built-in tools like SSL certificates, automated backups, and advanced caching, Bluehost offers a robust and reliable hosting environment.

Additionally, Bluehost Cloud makes switching easy and cost-effective with $0 migration costs and credit for remaining contracts, giving you the flexibility to transition seamlessly without the high cost.

Take your agency’s hosting strategy to the next level with Bluehost. Discover how our comprehensive hosting solutions can support your growth, enhance client satisfaction, and keep your business ahead of the curve.


Image Credits

Featured Image: Image by Bluehost. Used with permission.

7 Things To Look For In An SEO-Friendly WordPress Host

This post was sponsored by Bluehost. The opinions expressed in this article are the sponsor’s own.

When trying to improve your WordPress site’s search rankings, hosting might not be the first thing on your mind.

But your choice of hosting provider can significantly impact your SEO efforts.

A poor hosting setup can slow down your site, compromise its stability and security, and drain valuable time and resources.

The answer? Choosing the right WordPress hosting provider.

Here are seven essential features to look for in an SEO-friendly WordPress host.

1. Reliable Uptime & Speed for Consistent Performance

A website’s uptime and speed can significantly influence your site’s rankings and the success of your SEO strategies.

Users don’t like sites that suffer from significant downtime or sluggish load speeds. Not only are these sites inconvenient, but they also reflect negatively on the brand and its products and services, making them appear less trustworthy and of lower quality.

For these reasons, Google values websites that load quickly and reliably. So, if your site suffers from significant downtime or sluggish load times, it can negatively affect your site’s position in search results as well as frustrate users.

Reliable hosting with minimal downtime and fast server response times helps ensure that both users and search engines can access your content seamlessly.

Performance-focused infrastructure, optimized for fast server responses, is essential for delivering a smooth and engaging user experience.

When evaluating hosting providers, look for high uptime guarantees through a robust Service Level Agreement (SLA), which assures site availability and speed.

Bluehost Cloud, for instance, offers a 100% SLA for uptime, response time, and resolution time.

Built specifically with WordPress users in mind, Bluehost Cloud leverages an infrastructure optimized to deliver the speed and reliability that WordPress sites require, enhancing both SEO performance and user satisfaction. This guarantee provides you with peace of mind.

Your site will remain accessible and perform optimally around the clock, and you’ll spend less time troubleshooting and dealing with your host’s support team trying to get your site back online.

2. Data Center Locations & CDN Options For Global Reach

Fast load times are crucial not only for providing a better user experience but also for reducing bounce rates and boosting SEO rankings.

Since Google prioritizes websites that load quickly for users everywhere, having data centers in multiple locations and Content Delivery Network (CDN) integration is essential for WordPress sites with a global audience.

To ensure your site loads quickly for all users, no matter where they are, choose a WordPress host with a distributed network of data centers and CDN support. Consider whether it offers CDN options and data center locations that align with your audience’s geographic distribution.

This setup allows your content to reach users swiftly across different regions, enhancing both user satisfaction and search engine performance.

Bluehost Cloud integrates with a CDN to accelerate content delivery across the globe. This means that whether your visitors are in North America, Europe, or Asia, they’ll experience faster load times.

By leveraging global data centers and a CDN, Bluehost Cloud ensures your site’s SEO remains strong, delivering a consistent experience for users around the world.

3. Built-In Security Features To Protect From SEO-Damaging Attacks

Security is essential for your brand, your SEO, and overall site health.

Websites that experience security breaches, malware, or frequent hacking attempts can be penalized by search engines, potentially suffering from ranking drops or even removal from search indexes.

Therefore, it’s critical to select a host that offers strong built-in security features to safeguard your website and its SEO performance.

When evaluating hosting providers, look for options that include additional security features.

Bluehost Cloud, for example, offers comprehensive security features designed to protect WordPress sites, including free SSL certificates to encrypt data, automated daily backups, and regular malware scans.

These features help maintain a secure environment, preventing security issues from impacting your potential customers, your site’s SEO, and ultimately, your bottom line.

With Bluehost Cloud, your site’s visitors, data, and search engine rankings remain secure, providing you with peace of mind and a safe foundation for SEO success.

4. Optimized Database & File Management For Fast Site Performance

A poorly managed database can slow down site performance, which affects load times and visitor experience. Therefore, efficient data handling and optimized file management are essential for fast site performance.

Choose a host with advanced database and file management tools, as well as caching solutions that enhance site speed. Bluehost Cloud supports WordPress sites with advanced database optimization, ensuring quick, efficient data handling even as your site grows.

With features like server-level caching and optimized databases, Bluehost Cloud is built to handle WordPress’ unique requirements, enabling your site to perform smoothly without additional plugins or manual adjustments.

Bluehost Cloud contributes to a better user experience and a stronger SEO foundation by keeping your WordPress site fast and efficient.

5. SEO-Friendly, Scalable Bandwidth For Growing Sites

As your site’s popularity grows, so do its bandwidth requirements. Scalable or unmetered bandwidth is vital to handle traffic spikes without slowing down your site and impacting your SERP performance.

High-growth websites, in particular, benefit from hosting providers that offer flexible bandwidth options, ensuring consistent speed and availability even during peak traffic.

To avoid disaster, select a hosting provider that offers scalable or unmetered bandwidth as part of their package. Bluehost Cloud’s unmetered bandwidth, for instance, is designed to accommodate high-traffic sites without affecting load times or user experience.

This ensures that your site remains responsive and accessible during high-traffic periods, supporting your growth and helping you maintain your SEO rankings.

For websites anticipating growth, unmetered bandwidth with Bluehost Cloud provides a reliable, flexible solution to ensure long-term performance.

6. WordPress-Specific Support & SEO Optimization Tools

WordPress has unique needs when it comes to SEO, making specialized hosting support essential.

Hosts that cater specifically to WordPress provide an added advantage by offering tools and configurations such as staging environments and one-click installations specifically for WordPress.

WordPress-specific hosting providers also have an entire team of knowledgeable support and technical experts who can help you significantly improve your WordPress site’s performance.

Bluehost Cloud is a WordPress-focused hosting solution that offers priority, 24/7 support from WordPress experts, ensuring any issue you encounter is dealt with effectively.

Additionally, Bluehost’s staging environments enable you to test changes and updates before going live, reducing the risk of SEO-impacting errors.

Switching to Bluehost is easy, affordable, and stress-free, too.

Bluehost offers a seamless migration service designed to make switching hosts simple and stress-free. Our dedicated migration support team handles the entire transfer process, ensuring your WordPress site’s content, settings, and configurations are moved safely and accurately.

Currently, Bluehost also covers all migration costs, so you can make the switch with zero out-of-pocket expenses. We’ll credit the remaining cost of your existing contract, making the transition financially advantageous.

You can actually save money, or even gain credit, by switching.

7. Integrated Domain & Site Management For Simplified SEO Administration

SEO often involves managing domain settings, redirects, DNS configurations, and SSL updates, which can become complicated without centralized management.
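For instance, a permanent redirect is one of the most common SEO housekeeping tasks. On Apache-based hosting it can be handled with a single line in an .htaccess file (the paths and domain here are placeholders):

```apache
# Permanently (301) redirect an old page to its new location
Redirect 301 /old-page/ https://www.example.com/new-page/
```

A 301 tells search engines the move is permanent, so ranking signals are passed to the new URL rather than split between the two.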

An integrated hosting provider that allows you to manage your domain and hosting in one place simplifies these SEO tasks and makes it easier to maintain a strong SEO foundation.

When selecting a host, look for providers that integrate domain management with hosting. Bluehost offers a streamlined experience, allowing you to manage both domains and hosting from a single dashboard.

SEO-related site administration becomes more manageable, and you can focus on the things you do best: growth and optimization.

Find An SEO-Friendly WordPress Host

Choosing an SEO-friendly WordPress host can have a significant impact on your website’s search engine performance, user experience, and long-term growth.

By focusing on uptime, global data distribution, robust security, optimized database management, scalable bandwidth, WordPress-specific support, and integrated domain management, you create a solid foundation that supports both SEO and usability.

Ready to make the switch?

As a trusted WordPress partner with over 20 years of experience, Bluehost offers a hosting solution designed to meet the unique demands of WordPress sites big and small.

Our dedicated migration support team handles every detail of your transfer, ensuring your site’s content, settings, and configurations are moved accurately and securely.

Plus, we offer eligible customers a credit toward their remaining contracts, making the transition to Bluehost not only seamless but also cost-effective.

Learn how Bluehost Cloud can elevate your WordPress site. Visit us today to get started.


Image Credits

Featured Image: Image by Bluehost. Used with permission.

In-Post Image: Images by Bluehost. Used with permission.

12 reasons your page won’t rank – even though it’s optimized

What could be the matter if your perfectly optimized post isn’t ranking? Is the problem that your site is not on Google, or is something else going wrong? What is keeping your content from reaching that coveted #1 position? In this post, we’ll discuss many possible reasons why your page is not ranking, even though it’s optimized.

We’ve divided the possible issues you might be having into four sections.

Pro tip

Quick question: how’s your internal linking? If your content is optimized but not ranking, or Google is ranking the wrong pages from your site, it could be because you need to improve your site structure or fix your orphaned content. We’ve made some really neat SEO workouts to help you check and remedy these kinds of issues — check them out and fix those issues now!

Indexing and crawl issues

The first few points on the list all deal with indexing and crawl issues. Put simply, you can’t rank if your page or site is not on Google in the first place. If you find these topics confusing, you might want to read up on how Google works and how to start with SEO.

1. Your site/page is not on Google

If you need help determining whether your site is on Google, use the site: search operator. Type site:yoast.com, and you’ll see a list of indexed pages for that domain. If you type in the full URL of a specific article, you should see only one search result. If you see your pages, Google knows about your site and has put at least some of it in its index. If your page is in the index but you think it is not performing well, you might want to dig deeper.

The site: search operator helps you find your site in Google’s index

How to fix it

Check your WordPress Reading Settings. Under the Search Engine Visibility option, if you’ve ticked the box ‘Discourage search engines from indexing this site’, that’s the most likely reason your site is not on Google. If that’s the case, uncheck the box and save your changes. If only some specific pages aren’t showing up on Google, review your Search Appearance settings in Yoast SEO. Go to the ‘Content Types’ tab and ensure your settings are correct.

2. Your site/page is still too new

If your site or page is new, it might simply be a matter of chilling out and checking back in a little while. There are many moving parts in getting your content crawled, indexed and ranked. Sometimes, it takes days or maybe even weeks for Google to finish its discovery process.

How to fix it

If you check and find your site is not on Google yet, you can install Yoast SEO and submit the generated XML sitemap to Google Search Console to help Google discover your website. You can also use the URL Inspection tool in Search Console to determine how specific pages are doing. It tells you exactly how Google crawls and views your site.

3. Your content is noindexed

One of the most common reasons Google does not index your site or a specific page is that it has been noindexed inadvertently. Adding a noindex meta robots tag to a page tells Googlebot that it may crawl the page but must not add it to the index.

How can you check if your page is noindexed? That’s easy; simply open the page and view the source code. You’ll find the code below somewhere at the top of the page. This tells search engine crawlers that the page’s content shouldn’t be added to the index, thus keeping it from ranking.
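A typical noindex meta robots tag looks like this:

```html
<meta name="robots" content="noindex">
```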

How to fix it

It happens! Even we occasionally make a mistake and inadvertently noindex a post. Luckily, it’s an easy fix. We wrote about how to set a piece of content back on the right track with Yoast SEO.
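If you manage many pages, the view-source check can be scripted. Here is a minimal sketch using only Python’s standard library; the sample page string and function names are just an illustration:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tags in a page."""
    def __init__(self):
        super().__init__()
        self.robots_directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            self.robots_directives.append((attrs.get("content") or "").lower())

def is_noindexed(html_source: str) -> bool:
    """Return True if the page source contains a noindex robots directive."""
    parser = RobotsMetaParser()
    parser.feed(html_source)
    return any("noindex" in directive for directive in parser.robots_directives)

page = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
print(is_noindexed(page))  # prints: True
```

You could feed this function the downloaded source of each URL on your site to flag accidentally noindexed pages in bulk.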

4. Your site/page is blocking Google with robots.txt

You might have told Google not to index your content, but it’s also possible you’ve told Google not to crawl your site at all! Blocking crawlers in a so-called robots.txt file is a surefire way never to get any traffic. Blocking robots is easier than you might think. For instance, WordPress has a Search Engine Visibility setting that does its best to keep crawlers out once ‘Discourage search engines from indexing this site’ is checked. Uncheck this to make your site available again.

[Screenshot: the Search Engine Visibility setting in WordPress, unchecked. Check that this option isn’t inadvertently checked.]

WordPress uses the noindex approach described above to handle the indexing of sites via the Search Engine Visibility setting. It does have a warning that it’s up to search engines to honor the request.

Besides telling WordPress to block search engines, other technical issues might generate crawl errors that prevent Google from crawling your site properly. Your site’s web server could be acting up and returning server errors, or buggy bits of JavaScript in your code could trip up the crawler. Make sure Google can crawl your site easily.

How to fix it

If your robots.txt file is blocking Google from crawling your website (or parts of it) and you want to change that, then you’ll need to edit the file. You can follow this guide to edit your robots.txt file.
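To illustrate, this robots.txt blocks every crawler from the entire site, which is the worst-case configuration to look out for:

```
User-agent: *
Disallow: /
```

Whereas a more typical file allows crawling but keeps bots out of specific areas (the directory below is just an illustrative example):

```
User-agent: *
Disallow: /private/
```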

5. Your index coverage is incomplete

Ensuring that Google indexes your web pages is essential to succeed. Index coverage refers to the number of your site’s URLs included in Google’s search index. Even the most optimized content may not appear in search results without comprehensive index coverage.

To identify the issue, examine the Index Coverage report in Google Search Console. This tool groups your pages into categories and explains why pages are not indexed. If you notice many pages falling under “Error” or “Excluded,” it’s time to investigate further. One of the most common errors is ‘Crawled – currently not indexed’ in Search Console.

How to fix it

Ensure your XML sitemap is current and accurately represents your site structure, and submit it to Google Search Console to help Google find your pages. Review and resolve any crawl errors, such as 404s, server errors, or redirect issues; these can prevent pages from being indexed. Pages with low-quality or duplicate content might be excluded from the index, so focus on creating unique, valuable content that genuinely engages users. Finally, use the URL Inspection tool to request indexing for crucial pages that aren’t indexed yet. This tool also provides insights into how Google perceives your page.

Google Search Console helps you understand why pages are not indexed

Technical issues affecting ranking

Is your page/website indexed but not ranking? Then it’s time to check for technical problems.

6. You’re not ranking because your site has technical issues

Your website needs to meet certain technical benchmarks if you’re going to rank on Google! Loading speed, or how quickly your pages load, is important. Security and hosting quality are important too, and that’s not all. You can read about all the essentials in our article: things everyone should know about technical SEO.

If your post doesn’t appear in the search engines, technical issues could be keeping it out of the search results altogether. You could have conflicting plugins causing problems, and we’ve also seen themes that prevent Google from indexing your site. And, while Yoast SEO takes care of many technical issues under the hood, it needs to be set up correctly to do that properly.

How to fix it

The fix you need will depend on the technical issues your website is having, and we can’t cover everything here. You might want to check the following points:

  • Ensure all your Yoast plugin settings are correct
  • Check that you’re doing things the right way to keep loading times down
  • Make sure your site is set to https:// and your security certificates are up to date
  • Upgrade your hosting plan
  • Check that your plugins and/or theme aren’t causing problems

If your technical SEO looks good and your site is indexed, you must dig deeper to discover the problem. Keep reading!

7. You’re being penalized for breaking SEO rules

If Google catches you using shady SEO techniques that it doesn’t allow (sneaky tactics like buying links or stuffing keywords into hidden text), your page or site can be penalized. When you’re already putting in the effort to make a good website and quality content, those shortcuts are counterproductive. Even when everything else on your page is perfect, if you’re doing something that Google doesn’t allow, you will have problems ranking (or appearing in the Google search results at all).

Most of these things are common sense, so you probably don’t need to worry if you’re not trying to trick Google or spam people. However, a few things used to be common SEO practices that can now lead to issues — check out our article about SEO myths for more examples of bad SEO practices to avoid.

How to fix it

You can check whether Google has flagged your page for these problems in the Manual Actions tab in Google Search Console (GSC). If you’re still new to using GSC, you might want to check out our introductory article. If you find an issue under the Manual Actions tab, read this help article to learn more about what it means and how to fix it.

Linking issues that affect ranking

A good internal linking structure and quality backlinks are important if you want to rank high. Google crawls the web, following each link it finds, so if your links are lacking, it can cause problems with ranking.

8. Your site doesn’t have a proper internal linking structure

Another reason your content doesn’t appear in the search results is that a crucial part of your SEO strategy is not in order. Don’t underestimate the importance of site structure – the internal linking structure – for your SEO strategy. Having a clear site structure leads to a better understanding of your site by Google. If your internal linking structure is poor, your chances of ranking high are lower – even when your content is well-optimized and awesome.

How to fix it

Start adding those links! Make sure that your important posts and pages have the most internal links to them. But don’t randomly add links: make sure you add relevant, related links that add value for your users.

You can use the Yoast SEO orphaned content filter to find posts without incoming internal links. Yoast SEO Premium will help you even more by offering helpful linking suggestions as you write. In addition, if you use Yoast SEO Premium, you get various other AI features, like Yoast AI Optimize, that help you do the hard work. And if you really want to improve your site structure, check out our site structure training — which is also included in Premium!

Pro tip: Take care of your orphaned content and internal linking the easy way with our SEO workouts, available in Yoast SEO Premium.

Read on: Site structure: the ultimate guide »

9. Your site doesn’t have (enough) backlinks

If you just started with your website, your content won’t instantly rank. Not even if you have optimized everything perfectly and every bullet in Yoast SEO is green. To rank, you’ll need some links from other websites. After all, Google has to know your website exists.

How to fix it

Creating incredible content is a good way to get links to your pages. High-quality content attracts clicks from readers who might share the content far and wide via social media. All this helps to get those links. Of course, you can do more to get links in a natural, non-spammy way: here are fifteen ways of getting high-quality backlinks.

To get (more) backlinks, you can reach out to other websites. You’ll need to do some PR or link building. Ask them to mention your site or talk about your product and link to your site. You can also use social media to get the word out! Learn all about link-building strategies in our All-Around SEO training!

Content and keyword issues affecting ranking

If everything else is as it should be SEO-wise, then the reason your page or site is not ranking might be related to your content or keywords.

10. Your page is great, but there’s too much competition

Usually, a page doesn’t rank because there’s simply too much competition. If you optimize your content for competitive keywords and keyphrases, such as [cat behavior], [robot vacuum cleaner], or [real estate agent], chances are high that you won’t rank for that term. 

Check the results pages for your keyword to determine if this is the problem. Do high-authority sites like Wikipedia or Amazon dominate the first page? Do you see many sites that have already firmly established themselves in this niche? Your site probably doesn’t have the authority that these other sites have (yet). So you can optimize all you want, but unfortunately, that’s not enough to rank high in the search results if your niche is too competitive.

How to fix it

If you want to rank for highly competitive terms, try a long-tail keyword strategy. Write content that targets related long-tail keywords and phrases before tackling the competitive keywords. If these long-tail articles start ranking, you can also rank for more competitive terms. Such a strategy requires long-term efforts, but in the end, it will pay off.

Read more: Why you should focus on long tail keywords »

11. Low-quality content or wrong type of intent

Another reason your content isn’t ranking is that it doesn’t match the intent of people searching for your keyword. Search intent is important for search engines: do people want to buy something, go to a specific website, or seek information? Even if you’re targeting a more long-tail keyphrase, if your content doesn’t match the dominant intent of searchers, search engines won’t show it in the results because it won’t be what people are looking for.

Let’s look at a few examples. Say you’re a dog trainer who wants to rank for puppy training services, so you optimize for [training your puppy], with transactional intent in mind. But if you look at the search results, you’ll see that there are informational videos, and all the results explain how to train a puppy yourself. So, searchers have informational intent. This can work the other way around, too. If you’ve written a step-by-step guide for your blog on making garden decorations, aiming to rank for [flower garland garden decoration], you may have trouble ranking for that term if people just want to buy that, not make it themselves.

Remember that not every search term has one dominant type of intent. Also, it isn’t impossible to rank with content for differing intent. Still, it can be worthwhile to look into this if your optimized content doesn’t rank in the search engines.

How to fix it

Unfortunately, you don’t have the power to change the intent of search engine users. But you can adapt your content strategy. If your optimized content isn’t ranking, look at the search results (use private mode) and analyze what you see. Is one specific type of result dominant? Are there images or videos? Which related queries are shown? This is where your opportunities are. If you find primarily informational intent for a query, you can write content to get people to your site, establish your brand as a reliable source of information, and stay top of mind when people want to buy something. If you find a lot of images in the search results, you may need to focus more on image SEO. Consider what you see on the results pages when determining your SEO strategy.

12. Your content lacks uniqueness

Even well-written and optimized content might struggle to rank if it doesn’t stand out. Search engines prioritize content that offers a unique perspective or provides additional value compared to existing articles on the same topic.

Check the search results for your target keywords and examine the top-ranking pages. Does your content offer something different or more insightful? If your page presents similar information in a comparable format, you may find it difficult to climb the rankings. With the advent of generative AI, we’ll see a wave of mediocre sameness appear in the search results. If you publish the same stuff, search engines won’t bother with it.

Generative AI can help create content, but it needs human oversight to maintain quality and relevance. While AI can quickly produce large volumes of content, you should prioritize quality over quantity and make sure the material is original and valuable to your audience. AI-generated content can be repetitive or lack diverse perspectives, so it’s essential to refine it with your unique insights or expert opinions.

Additionally, the content should always align with your audience’s needs and search intent, as AI may not fully capture human nuances. Always comply with search engine guidelines regarding AI-generated content to avoid potential penalties or indexing issues. You can enhance your content strategy while preserving its integrity by using AI as a supportive tool rather than a standalone solution.

How to fix it

Quite simply: add unique insights and views. Add your own voice and incorporate original research, case studies, or expert opinions to set your content apart. Keep your content fresh with the latest information, trends, or data to maintain relevance and uniqueness. Encourage comments and discussions to build a community around your content, making it more dynamic and engaging.

Is your optimized content still not ranking?

Multiple reasons could prevent a post from ranking. Have you optimized your post correctly with Yoast SEO? Then the most common cause is likely to be that the competition in your niche is too fierce. Unfortunately, SEO is a long-term strategy: you need to work hard and be patient. In the meantime, you can tackle many other aspects of your SEO, such as site structure and link building. Focus on every aspect of website optimization and aim to be the best result. It will pay off eventually!

Read more: Rank tracking: why you should monitor your keywords »


Google Shows How To Block Bots And Boost Site Performance via @sejournal, @martinibuster

Google’s Martin Splitt answered a question about malicious bots that impact site performance, offering suggestions every SEO and site owner should know and put into action.

Malicious Bots Are An SEO Problem

Many SEOs who do site audits overlook security and bot traffic because it’s not widely understood by digital marketers that security events impact site performance and can account for why a site is inadequately crawled. Improving Core Web Vitals will do nothing to improve site performance when a poor security posture is contributing to poor site performance.

Every website is under attack and the effects of excessive crawling can trigger a “500 server error” response code, signaling an inability to serve web pages and hindering Google’s ability to crawl web pages.

How To Defend Against Bot Attacks

The person asking the question wanted Google’s advice on how to fight back against the waves of scraper bots impacting their server performance.

This is the question asked:

“Our website is experiencing significant disruptions due to targeted scraping by automated software, leading to performance issues, increased server load, and potential data security concerns. Despite IP blocking and other preventive measures, the problem persists. What can we do?”

Google’s Martin Splitt suggested identifying the service that is serving as the source of the attacks and notifying them of an abusive use of their services. He also recommended the firewall capabilities of a CDN (Content Delivery Network).

Martin answered:

“This sounds like somewhat of a distributed denial-of-service issue if the crawling is so aggressive that it causes performance degradation.

You can try identifying the owner of the network where the traffic is coming from, thank “their hoster” and send an abuse notification. You can use WHOIS information for that, usually.

Alternatively, CDNs often have features to detect bot traffic and block it and by definition they take the traffic away from your server and distribute it nicely, so that’s a win. Most CDNs recognize legitimate search engine bots and won’t block them but if that’s a major concern for you, consider asking them before starting to use them.”

Will Google’s Advice Work?

Identifying the cloud provider or server data center that’s hosting the malicious bots is good advice. But there are many scenarios where that won’t work.

Three Reasons Why Contacting Resource Providers Won’t Work

1. Many Bots Are Hidden

Bots often use VPNs and open source “Tor” networks that hide the source of the bots, defeating all attempts at identifying the cloud services or web hosts providing the infrastructure for the bots. Hackers also hide behind compromised home and business computers, called botnets, to launch their attacks. There’s no way to identify them.

2. Bots Switch IP Addresses

Some bots respond to IP blocking by instantly switching to a different network to immediately resume their attack. An attack can originate from a German server and when blocked will switch to a network provider in Asia.

3. Inefficient Use Of Time

Contacting network providers about abusive users is futile when the source of the traffic is obfuscated or from hundreds of sources. Many site owners and SEOs might be surprised to discover how intensive the attacks on their websites are. Even taking action against a small group of offenders is an inefficient use of time because there are literally millions of other bots that will replace the ones blocked by a cloud provider.

And what about botnets made up of thousands of compromised computers around the world? Think you have time to notify all of those ISPs?

Those are three reasons why notifying infrastructure providers is not a viable approach to stopping bots that impact site performance. Realistically, it’s a futile and inefficient use of time.

Use A WAF To Block Bots

Using a Web Application Firewall (WAF) is a good idea, and that’s the function Martin Splitt was pointing to when he mentioned using a CDN (content delivery network). A CDN, like Cloudflare, sends browsers and crawlers the requested web page from a server that’s located closest to them, speeding up site performance and reducing server resources for the site owner.

A CDN also has a WAF (Web Application Firewall) which automatically blocks malicious bots. Martin’s suggestion for using a CDN is definitely a good option, especially because it has the additional benefit of improving site performance.

An option that Martin didn’t mention is to use a WordPress plugin WAF like Wordfence. Wordfence has a WAF that automatically shuts down bots based on their behavior. For example, if a bot is requesting ridiculous amounts of pages it will automatically create a temporary IP block. If the bot rotates to another IP address it will identify the crawling behavior and block it again.
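The behavior-based blocking described above can be sketched as a sliding-window rate limiter that temporarily blocks any IP exceeding a request budget. This is a rough illustration of the general technique, not Wordfence’s actual implementation; all thresholds are made up for the example:

```python
import time
from collections import defaultdict, deque

# Illustrative thresholds only -- real WAFs tune these per site.
WINDOW_SECONDS = 60    # look at the last 60 seconds of traffic per IP
MAX_REQUESTS = 120     # allow at most 120 requests per window
BLOCK_SECONDS = 600    # length of the temporary IP block

class RateLimiter:
    def __init__(self):
        self.requests = defaultdict(deque)  # ip -> timestamps of recent requests
        self.blocked_until = {}             # ip -> unix time the block expires

    def allow(self, ip, now=None):
        """Return True if this request should be served, False if blocked."""
        now = time.time() if now is None else now
        # Still inside a temporary block?
        if self.blocked_until.get(ip, 0) > now:
            return False
        # Drop timestamps that fell out of the sliding window.
        q = self.requests[ip]
        while q and q[0] <= now - WINDOW_SECONDS:
            q.popleft()
        q.append(now)
        # Over budget: impose a temporary IP block.
        if len(q) > MAX_REQUESTS:
            self.blocked_until[ip] = now + BLOCK_SECONDS
            return False
        return True
```

Note the weakness the article goes on to describe: a bot that rotates to a fresh IP address gets a fresh budget, which is why real WAFs also look at crawling behavior rather than IP addresses alone.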

Another solution to consider is a SaaS platform like Sucuri that offers a WAF and a CDN to speed up performance. Both Wordfence and Sucuri are trustworthy providers of WordPress security and they come with limited but effective free versions.

Listen to the question and answer at the 6:36 minute mark of the Google SEO Office Hours podcast:

Featured Image by Shutterstock/Krakenimages.com

What To Know About Medium-Level WordPress Vulnerabilities via @sejournal, @martinibuster

The majority of WordPress vulnerabilities discovered in 2023, about 67% of them, are rated as medium level. Because they’re the most common, it makes sense to understand what they are and when they represent an actual security threat. Here are the facts about those kinds of vulnerabilities and what you should know about them.

What Is A Medium Level Vulnerability?

A spokesperson from WPScan, a WordPress security scanning company owned by Automattic, explained that they use the Common Vulnerability Scoring System (CVSS) to rate the severity of a threat. The scores are based on a numbering system from 1–10 with ratings of low, medium, high, and critical.
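For reference, the mapping from numeric base score to qualitative rating can be sketched as below; the thresholds follow the published CVSS v3.x qualitative severity scale:

```python
def cvss_rating(score):
    """Map a CVSS v3.x base score to its qualitative severity rating.

    Thresholds are taken from the CVSS v3.x qualitative severity scale.
    """
    if not 0.0 <= score <= 10.0:
        raise ValueError("CVSS base scores range from 0.0 to 10.0")
    if score == 0.0:
        return "None"
    if score <= 3.9:
        return "Low"
    if score <= 6.9:
        return "Medium"   # the bulk of reported WordPress vulnerabilities
    if score <= 8.9:
        return "High"
    return "Critical"
```

So a medium rating covers scores from 4.0 through 6.9, a fairly wide band, which is part of why so many WordPress vulnerabilities land there.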

The WPScan spokesperson explained:

“We don’t flag levels as the chance of happening, but the severity of the vulnerability based on FIRST’s CVSS framework. Speaking broadly, a medium-level severity score means either the vulnerability is hard to exploit (e.g., SQL Injection that requires a highly privileged account) or the attacker doesn’t gain much from a successful attack (e.g., an unauthenticated user can get the content of private blog posts).

We generally don’t see them being used as much in large-scale attacks because they are less useful than higher severity vulnerabilities and harder to automate. However, they could be useful in more targeted attacks, for example, when a privileged user account has already been compromised, or an attacker knows that some private content contains sensitive information that is useful to them.

We would always recommend upgrading vulnerable extensions as soon as possible. Still, if the severity is medium, then there is less urgency to do so, as the site is less likely to be the victim of a large-scale automated attack.

An untrained user may find the report a bit hard to digest. We did our best to make it as suitable as possible for all audiences, but I understand it’d be impossible to cover everyone without making it too boring or long. And the same can happen to the reported vulnerability. The user consuming the feed would need some basic knowledge of their website setup to consider which vulnerability needs immediate attention and which one can be handled by the WAF, for example.

If the user knows, for example, that their site doesn’t allow users to subscribe to it. All reports of subscriber+ vulnerabilities, independent of the severity level, can be reconsidered. Assuming that the user maintains a constant review of the site’s user base.

The same goes for contributor+ reports or even administrator levels. If the person maintains a small network of WordPress sites, the admin+ vulnerabilities are interesting for them since a compromised administrator of one of the sites can be used to attack the super admin.”

Contributor-Level Vulnerabilities

Many medium-severity vulnerabilities require contributor-level access. A contributor is an access role that gives a registered user the ability to write and submit content, although in general they don’t have the ability to publish it.

Most websites don’t have to worry about security threats that require contributor-level authentication because most sites don’t offer that level of access.

Chloe Chamberland – Threat Intelligence Lead at Wordfence explained that most site owners shouldn’t worry about medium level severity vulnerabilities that require a contributor-level access in order to exploit them because most WordPress sites don’t offer that permission level. She also noted that these kinds of vulnerabilities are hard to scale because exploiting them is difficult to automate.

Chloe explained:

“For most site owners, vulnerabilities that require contributor-level access and above to exploit are something they do not need to worry about. This is because most sites do not allow contributor-level registration and most sites do not have contributors on their site.

In addition, most WordPress attacks are automated and are looking for easy to exploit high value returns so vulnerabilities like this are unlikely to be targeted by most WordPress threat actors.”

Website Publishers That Should Worry

Chloe also said that publishers who do offer contributor-level permissions may have several reasons to be concerned about these kinds of exploits:

“The concern with exploits that require contributor-level access to exploit arises when site owners allow contributor-level registration, have contributors with weak passwords, or the site has another plugin/theme installed with a vulnerability that allows contributor-level access in some way and the attacker really wants in on your website.

If an attacker can get their hands on one of these accounts, and a contributor-level vulnerability exists, then they may be provided with the opportunity to escalate their privileges and do real damage to the victim. Let’s take a contributor-level Cross-Site Scripting vulnerability for example.

Due to the nature of contributor-level access, an administrator would be highly likely to preview the post for review at which point any injected JavaScript would execute – this means the attacker would have a relatively high chance of success due to the admin previewing the post for publication.

As with any Cross-Site Scripting vulnerability, this can be leveraged to add a new administrative user account, inject backdoors, and essentially do anything a site administrator could do. If a serious attacker has access to a contributor-level account and no other trivial way to elevate their privileges, then they’d likely leverage that contributor-level Cross-Site Scripting to gain further access. As previously mentioned, you likely won’t see that level of sophistication targeting the vast majority of WordPress sites, so it’s really high value sites that need to be concerned with these issues.

In conclusion, while I don’t think a vast majority of site owners need to worry about contributor-level vulnerabilities, it’s still important to take them seriously if you allow user registration at that level on your site, you don’t enforce unique strong user passwords, and/or you have a high value WordPress website.”

Be Aware Of Vulnerabilities

While many of the medium-level vulnerabilities may not be something to worry about, it’s still a good idea to stay informed about them. Security scanners like the free version of WPScan can warn you when a plugin or theme becomes vulnerable. It’s a good way to have a warning system in place to keep on top of vulnerabilities.

WordPress security plugins like Wordfence offer a proactive security stance that actively blocks automated hacking attacks and can be further tuned by advanced users to block specific bots and user agents. The free version of Wordfence offers significant protection in the form of a firewall and a malware scanner. The paid version offers protection for all vulnerabilities as soon as they’re discovered and before the vulnerability is patched. I use Wordfence on all of my websites and can’t imagine setting up a website without it.

Security is generally not regarded as an SEO issue, but it should be, because failure to secure a site can undo all the hard work done to make it rank well.

Featured Image by Shutterstock/Juan villa torres

2024 WordPress Vulnerability Report Shows Errors Sites Keep Making via @sejournal, @martinibuster

WordPress security scanner WPScan’s 2024 WordPress vulnerability report calls attention to WordPress vulnerability trends and suggests the kinds of things website publishers (and SEOs) should be looking out for.

Some of the key findings from the report were that just over 20% of vulnerabilities were rated as high or critical level threats, with medium-severity threats, at 67% of reported vulnerabilities, making up the majority. Many treat medium-level vulnerabilities as if they were low-level threats, but they’re not, and they deserve attention.

The WPScan report advised:

“While severity doesn’t translate directly to the risk of exploitation, it’s an important guideline for website owners to make an educated decision about when to disable or update the extension.”

WordPress Vulnerability Severity Distribution

Critical level vulnerabilities, the highest level of threat, represented only 2.38% of vulnerabilities, which is essentially good news for WordPress publishers. Yet as mentioned earlier, when combined with the percentage of high level threats (17.68%), the number of concerning vulnerabilities rises to almost 20%.

Here are the percentages by severity ratings:

  • Critical 2.38%
  • Low 12.83%
  • High 17.68%
  • Medium 67.12%

Authenticated Versus Unauthenticated

Authenticated vulnerabilities are those that require an attacker to first attain user credentials and their accompanying permission levels in order to exploit a particular vulnerability. Exploits that require subscriber-level authentication are the most exploitable of the authenticated exploits, and those that require administrator-level access present the least risk (although not always a low risk, for a variety of reasons).

Unauthenticated attacks are generally the easiest to exploit because anyone can launch an attack without having to first acquire a user credential.

The WPScan vulnerability report found that about 22% of reported vulnerabilities required subscriber level or no authentication at all, representing the most exploitable vulnerabilities. On the other end of the scale of the exploitability are vulnerabilities requiring admin permission levels representing a total of 30.71% of reported vulnerabilities.

Permission Levels Required For Exploits

Vulnerabilities requiring administrator-level credentials represented the highest percentage of exploits, followed by Cross-Site Request Forgery (CSRF) with 24.74% of vulnerabilities. This is interesting because CSRF is an attack that uses social engineering to get a victim to click a link from which the user’s permission levels are acquired. If attackers can trick an admin-level user into following a link, they can assume that level of privileges on the WordPress website.

The following are the percentages of exploits, ordered by the role necessary to launch an attack.

Ascending Order Of User Roles For Vulnerabilities

  • Author 2.19%
  • Subscriber 10.4%
  • Unauthenticated 12.35%
  • Contributor 19.62%
  • CSRF 24.74%
  • Admin 30.71%

Most Common Vulnerability Types Requiring Minimal Authentication

Broken Access Control in the context of WordPress refers to a security failure that can allow an attacker without necessary permission credentials to gain access to higher credential permissions.

In the section of the report that looks at the occurrences and vulnerabilities underlying unauthenticated or subscriber level vulnerabilities reported (Occurrence vs Vulnerability on Unauthenticated or Subscriber+ reports), WPScan breaks down the percentages for each vulnerability type that is most common for exploits that are the easiest to launch (because they require minimal to no user credential authentication).

The WPScan threat report noted that Broken Access Control represents a whopping 84.99% followed by SQL injection (20.64%).

The Open Worldwide Application Security Project (OWASP) defines Broken Access Control as:

“Access control, sometimes called authorization, is how a web application grants access to content and functions to some users and not others. These checks are performed after authentication, and govern what ‘authorized’ users are allowed to do.

Access control sounds like a simple problem but is insidiously difficult to implement correctly. A web application’s access control model is closely tied to the content and functions that the site provides. In addition, the users may fall into a number of groups or roles with different abilities or privileges.”

SQL injection, at 20.64% represents the second most prevalent type of vulnerability, which WPScan referred to as both “high severity and risk” in the context of vulnerabilities requiring minimal authentication levels because attackers can access and/or tamper with the database which is the heart of every WordPress website.

These are the percentages:

  • Broken Access Control 84.99%
  • SQL Injection 20.64%
  • Cross-Site Scripting 9.4%
  • Unauthenticated Arbitrary File Upload 5.28%
  • Sensitive Data Disclosure 4.59%
  • Insecure Direct Object Reference (IDOR) 3.67%
  • Remote Code Execution 2.52%
  • Other 14.45%

Vulnerabilities In The WordPress Core Itself

The overwhelming majority of vulnerability issues were reported in third-party plugins and themes. However, in 2023 there were a total of 13 vulnerabilities reported in the WordPress core itself. Of those thirteen, only one was rated as a high-severity threat. High is the second-highest severity level under the Common Vulnerability Scoring System (CVSS), with critical being the highest.

The WordPress core platform itself is held to the highest standards and benefits from a worldwide community that is vigilant in discovering and patching vulnerabilities.

Website Security Should Be Considered As Technical SEO

Site audits don’t normally cover website security, but in my opinion every responsible audit should at least talk about security headers. As I’ve been saying for years, website security quickly becomes an SEO issue once a website’s rankings start disappearing from the search engine results pages (SERPs) because the site was compromised by a vulnerability. That’s why it’s critical to be proactive about website security.

According to the WPScan report, the main points of entry for hacked websites were leaked credentials and weak passwords. Enforcing strong password standards plus two-factor authentication is an important part of every website’s security stance.
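
To sketch how the second factor in two-factor authentication works, here is a minimal TOTP generator (RFC 6238, the algorithm most authenticator apps use) in standard-library Python; a real site should rely on an audited authentication plugin rather than hand-rolled code:

```python
import base64, hashlib, hmac, struct, time

def totp(secret_b32, step=30, digits=6, now=None):
    """Generate a time-based one-time password (RFC 6238, HMAC-SHA1)."""
    key = base64.b32decode(secret_b32, casefold=True)
    # The moving factor is the number of `step`-second intervals since the epoch.
    counter = int((time.time() if now is None else now) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    # Dynamic truncation (RFC 4226): pick 4 bytes at an offset from the last nibble.
    offset = digest[-1] & 0x0F
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

# RFC 6238 test vector: ASCII secret "12345678901234567890", time 59s, 8 digits.
code = totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", digits=8, now=59)
print(code)  # 94287082
```

The printed value matches the RFC 6238 test vector for that secret and timestamp; in practice the server and the authenticator app compute the code independently and compare.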

Using security headers is another way to help protect against Cross-Site Scripting and other kinds of vulnerabilities.
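
As a sketch of what “using security headers” can look like in practice, here is a hypothetical Python WSGI middleware (the helper names are invented) that appends a few widely used security headers, including a Content-Security-Policy that mitigates XSS, to every response:

```python
# Illustrative list of common security headers; the exact policy values
# must be tuned to the individual site.
SECURITY_HEADERS = [
    ("Content-Security-Policy", "default-src 'self'"),  # restricts where scripts may load from
    ("X-Content-Type-Options", "nosniff"),              # stops MIME-type sniffing
    ("X-Frame-Options", "DENY"),                        # blocks clickjacking via iframes
    ("Strict-Transport-Security", "max-age=31536000"),  # forces HTTPS on return visits
]

def add_security_headers(app):
    """WSGI middleware that appends the security headers to every response."""
    def wrapped(environ, start_response):
        def patched_start_response(status, headers, exc_info=None):
            return start_response(status, headers + SECURITY_HEADERS, exc_info)
        return app(environ, patched_start_response)
    return wrapped

# Minimal demo application wrapped by the middleware.
def demo_app(environ, start_response):
    start_response("200 OK", [("Content-Type", "text/html")])
    return [b"hello"]

captured = {}
def fake_start_response(status, headers, exc_info=None):
    captured["headers"] = headers

body = add_security_headers(demo_app)({}, fake_start_response)
```

On a WordPress site the same headers would typically be set in the server configuration or via a security plugin rather than in application code.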

Lastly, a WordPress firewall and website hardening are also useful proactive approaches to website security. I once added a forum to a brand-new website I created, and it came under attack within minutes. Believe it or not, virtually every website worldwide is under attack 24 hours a day by bots scanning for vulnerabilities.

Read the WPScan Report:

WPScan 2024 Website Threat Report

Featured Image by Shutterstock/Ljupco Smokovski

WordPress Discovers XSS Vulnerability – Recommends Updating To 6.5.2 via @sejournal, @martinibuster

WordPress announced the 6.5.2 Maintenance and Security Release update, which patches a stored cross-site scripting vulnerability and fixes over a dozen bugs in the core and the block editor.

The same vulnerability affects both the WordPress core and the Gutenberg plugin.

Cross Site Scripting (XSS)

An XSS vulnerability was discovered in WordPress that could allow an attacker to inject scripts into a website that then attack visitors to those pages.

There are three kinds of XSS vulnerabilities, but the most commonly discovered in WordPress plugins, themes, and WordPress itself are reflected XSS and stored XSS.

Reflected XSS requires a victim to click a link, an extra step that makes this kind of attack harder to launch.

A stored XSS is the more worrisome variant because it exploits a flaw that allows the attacker to upload a script into the vulnerable site that can then launch attacks against site visitors. The vulnerability discovered in WordPress is a stored XSS.

The threat itself is mitigated to a certain degree because this is an authenticated stored XSS, which means the attacker first needs to acquire at least contributor-level permissions in order to exploit the flaw that makes the vulnerability possible.

This vulnerability is rated as a medium level threat, receiving a Common Vulnerability Scoring System (CVSS) score of 6.4 out of 10.

Wordfence describes the vulnerability:

“WordPress Core is vulnerable to Stored Cross-Site Scripting via user display names in the Avatar block in various versions up to 6.5.2 due to insufficient output escaping on the display name. This makes it possible for authenticated attackers, with contributor-level access and above, to inject arbitrary web scripts in pages that will execute whenever a user accesses an injected page.”
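
Output escaping, the missing defense named in the advisory, converts script-capable characters into inert HTML entities before display. A minimal Python illustration (WordPress itself is PHP, where helpers such as esc_html() do this work):

```python
import html

# A display name an attacker might set on a contributor account.
display_name = '<script>alert("XSS")</script>'

# Unsafe: the raw value is interpolated into the page,
# so the browser executes the script for every visitor.
unsafe_html = "<p>Posted by " + display_name + "</p>"

# Safe: html.escape() converts <, >, & and quotes into entities,
# so the browser renders the text instead of executing it.
safe_html = "<p>Posted by " + html.escape(display_name) + "</p>"

print(safe_html)
# <p>Posted by &lt;script&gt;alert(&quot;XSS&quot;)&lt;/script&gt;</p>
```

The escaped output displays the literal characters to the reader while stripping them of any ability to run as code.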

WordPress.org Recommends Updating Immediately

The official WordPress announcement recommended that users update their installations, writing:

“Because this is a security release, it is recommended that you update your sites immediately. Backports are also available for other major WordPress releases, 6.1 and later.”

Read the Wordfence advisories:

WordPress Core < 6.5.2 – Authenticated (Contributor+) Stored Cross-Site Scripting via Avatar Block

Gutenberg 12.9.0 – 18.0.0 – Authenticated (Contributor+) Stored Cross-Site Scripting via Avatar Block

Read the official WordPress.org announcement:

WordPress 6.5.2 Maintenance and Security Release

Featured Image by Shutterstock/ivan_kislitsin

XSS Vulnerability Affects Beaver Builder WordPress Page Builder via @sejournal, @martinibuster

The popular Beaver Builder WordPress Page Builder was found to contain an XSS vulnerability that can allow an attacker to inject scripts into the website that will run when a user visits a webpage.

Beaver Builder

Beaver Builder is a popular plugin that allows anyone to create a professional-looking website using an easy-to-use drag-and-drop interface. Users can start with a predesigned template or create a website from scratch.

Stored Cross Site Scripting (XSS) Vulnerability

Security researchers at Wordfence published an advisory about an XSS vulnerability affecting the page builder plugin. An XSS vulnerability is typically found in a part of a theme or plugin that accepts user input. The flaw arises when there is insufficient filtering of what can be input (a process called input sanitization). A related flaw is insufficient output escaping, a security measure applied to a plugin’s output that prevents harmful scripts from being passed to a site visitor’s browser.

This specific vulnerability is called a stored XSS. “Stored” means that an attacker is able to inject a script directly onto the web server. This differs from a reflected XSS, which requires a victim to click a link to the attacked website in order to execute a malicious script. A stored XSS, such as the one affecting Beaver Builder, is generally considered more dangerous than a reflected XSS.

The security flaws that gave rise to the XSS vulnerability in Beaver Builder were insufficient input sanitization and output escaping.
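
Input sanitization and output escaping are complementary defenses: one filters data on the way in, the other neutralizes it on the way out. As a hypothetical Python sketch of allowlist-style sanitization (the function name is invented), here is a filter that only lets http/https URLs through a user-supplied link attribute:

```python
import re

def sanitize_url(value):
    """Allowlist-style sanitizer (illustrative): only http/https URLs pass
    through; anything else, such as javascript: URLs that would execute
    script on click, is replaced with a harmless default."""
    value = value.strip()
    if re.match(r"^https?://", value, re.IGNORECASE):
        return value
    return "#"

print(sanitize_url("https://example.com/page"))  # https://example.com/page
print(sanitize_url("javascript:alert(1)"))       # #
```

Allowlisting (defining what is permitted) is generally safer than blocklisting (enumerating what is forbidden), because attackers only need to find one input variant the blocklist missed.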

Wordfence described the vulnerability:

“The Beaver Builder – WordPress Page Builder plugin for WordPress is vulnerable to Stored Cross-Site Scripting via the plugin’s Button Widget in all versions up to, and including, 2.8.0.5 due to insufficient input sanitization and output escaping on user supplied attributes. This makes it possible for authenticated attackers, with contributor-level access and above, to inject arbitrary web scripts in pages that will execute whenever a user accesses an injected page.”

The vulnerability is rated 6.4, a medium level threat. Attackers must gain at least contributor-level permissions in order to launch an attack, which makes this vulnerability somewhat harder to exploit.

The official Beaver Builder changelog, which documents what’s contained in an update, notes that a patch was issued in version 2.8.0.7.

The changelog notes:

“Fix XSS issue in Button & Button Group Modules when using lightbox”

Recommended action: It’s generally good practice to update and patch a vulnerability before an attacker is able to exploit it. It’s also a best practice to stage the site before pushing an update live, in case the updated plugin conflicts with another plugin or theme.

Read the Wordfence advisory:

Beaver Builder – WordPress Page Builder <= 2.8.0.5 – Authenticated (Contributor+) Stored Cross-Site Scripting via Button

Featured Image by Shutterstock/Prostock-studio

WordPress Backup Plugin DoS Vulnerability Affects +200,000 Sites via @sejournal, @martinibuster

A popular WordPress backup plugin installed on over 200,000 websites recently patched a high severity vulnerability that could lead to a denial of service attack. Wordfence assigned a CVSS severity rating of High, with a score of 7.5/10, indicating that plugin users should take note and update their plugin.

Backuply Plugin

The vulnerability affects the Backuply WordPress backup plugin. Creating backups is a necessary function for every website, not just WordPress sites, because backups allow publishers to roll back to a previous version should the server fail catastrophically and lose data.

Website backups are invaluable for site migrations, hacking recovery and failed updates that render a website non-functional.

Backuply is an especially useful plugin because it backs up data to multiple trusted third-party cloud services and supports multiple ways to download local copies, making it possible to create redundant backups so that if a cloud backup is bad, the site can be recovered from another backup stored locally.

According to Backuply:

“Backuply comes with Local Backups and Secure Cloud backups with easy integrations with FTP, FTPS, SFTP, WebDAV, Google Drive, Microsoft OneDrive, Dropbox, Amazon S3 and easy One-click restoration.”

Vulnerability Affecting Backuply

The United States government’s National Vulnerability Database warns that Backuply up to and including version 1.2.5 contains a flaw that can lead to denial of service attacks.

The warning explains:

“This is due to direct access of the backuply/restore_ins.php file and. This makes it possible for unauthenticated attackers to make excessive requests that result in the server running out of resources.”

Denial Of Service (DoS) Attack

A denial of service (DoS) attack is one in which a flaw in software allows an attacker to make so many rapid requests that the server runs out of resources and can no longer process further requests, including serving webpages to site visitors.

With some DoS-related flaws, it is also possible to upload scripts, HTML, or other code that can then be executed, allowing the attacker to perform virtually any action.

Vulnerabilities that enable DoS attacks are considered critical, and steps to mitigate them should be taken as soon as possible.
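
A common mitigation for this kind of request flooding is rate limiting. Here is a hypothetical token-bucket limiter in Python, a sketch of the general technique rather than Backuply’s actual fix:

```python
import time

class TokenBucket:
    """Allows up to `rate` requests per second, with bursts up to `capacity`."""

    def __init__(self, rate, capacity):
        self.rate = rate            # tokens regained per second
        self.capacity = capacity    # maximum burst size
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self):
        now = time.monotonic()
        # Refill tokens based on elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1   # each request spends one token
            return True
        return False           # bucket empty: reject the request

# Simulate a burst of 15 rapid requests against a 5 req/s limiter.
bucket = TokenBucket(rate=5, capacity=10)
results = [bucket.allow() for _ in range(15)]
# Roughly the first 10 requests succeed; the rest of the burst is rejected.
```

Tokens refill at a steady rate, so sustained floods are rejected while normal traffic and short bursts pass through unaffected.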

Backuply Changelog Documentation

The official Backuply changelog, which announces the details of every update, notes that a fix was implemented in version 1.2.6. Backuply’s transparency and rapid response are responsible and a sign of a trustworthy developer.

According to the Changelog:

“1.2.6 (FEBRUARY 08 2024)
[Security-Fix] In some cases it was possible to fill up the logs and has been fixed. Reported by Villu Orav (WordFence)”

Recommendations

In general, it is highly recommended that all users of the Backuply plugin update as soon as possible in order to prevent an unwanted security event.

Read the National Vulnerability Database description of the vulnerability:

CVE-2024-0842

Read the Wordfence Backuply vulnerability report:

Backuply – Backup, Restore, Migrate and Clone <= 1.2.5 – Denial of Service

Featured Image by Shutterstock/Doppelganger4

Mozilla VPN Security Risks Discovered via @sejournal, @martinibuster

Mozilla published the results of a recent third-party security audit of its VPN service as part of its commitment to user privacy and security. The audit revealed security issues that were presented to Mozilla and addressed with fixes to ensure user privacy and security.

Many search marketers use VPNs in the course of their business, especially when using a Wi-Fi connection, in order to protect sensitive data, so the trustworthiness of a VPN is essential.

Mozilla VPN

A Virtual Private Network (VPN) is a service that hides (encrypts) a user’s Internet traffic so that no third party (like an ISP) can snoop and see what sites a user is visiting.

VPNs also add a layer of security from malicious activities such as session hijacking which can give an attacker full access to the websites a user is visiting.

There is a high expectation from users that the VPN will protect their privacy when they are browsing on the Internet.

Mozilla thus employs the services of a third party to conduct a security audit to make sure their VPN is thoroughly locked down.

Security Risks Discovered

The audit revealed vulnerabilities of medium or higher severity, ranging from Denial of Service (DoS) risks to keychain access leaks (related to encryption) and a lack of access controls.

Cure53, the third-party security firm, discovered several risks. Among the issues were potential VPN leaks and a rogue extension that could disable the VPN.

The scope of the audit encompassed the following products:

  • Mozilla VPN Qt6 App for macOS
  • Mozilla VPN Qt6 App for Linux
  • Mozilla VPN Qt6 App for Windows
  • Mozilla VPN Qt6 App for iOS
  • Mozilla VPN Qt6 App for Android

These are the risks identified by the security audit:

  • FVP-03-003: DoS via serialized intent
  • FVP-03-008: Keychain access level leaks WG private key to iCloud
  • FVP-03-010: VPN leak via captive portal detection
  • FVP-03-011: Lack of local TCP server access controls
  • FVP-03-012: Rogue extension can disable VPN using mozillavpnnp (High)

The rogue extension issue was rated as high severity. Each risk was subsequently addressed by Mozilla.

Mozilla presented the results of the security audit as part of its commitment to transparency and to maintaining the trust and security of its users. Conducting a third-party security audit is a best practice for a VPN provider that helps assure users that the VPN is trustworthy and reliable.

Read Mozilla’s announcement:
Mozilla VPN Security Audit 2023

Featured Image by Shutterstock/Meilun