Bricks Builder For WordPress RCE Vulnerability via @sejournal, @martinibuster

Bricks Visual Site Builder for WordPress recently patched a critical vulnerability, rated 9.8/10, that is being actively exploited right now.

Bricks Builder

Bricks Builder is a popular WordPress development theme that makes it easy to create attractive, fast-performing websites in hours, sites that could cost up to $20,000 in development time to build from scratch. Ease of use and developer components for CSS have made it a popular choice for developers.

Unauthenticated RCE Vulnerability

Bricks Builder is affected by a remote code execution (RCE) vulnerability. It’s rated 9.8/10 on the Common Vulnerability Scoring System (CVSS), which is nearly the highest level.

What makes this vulnerability particularly bad is that it’s unauthenticated, which means that an attacker doesn’t need to obtain permission credentials to exploit it. Any hacker who knows of the vulnerability can exploit it, which in this case means executing code remotely.

Wordfence describes what can happen:

“This makes it possible for unauthenticated attackers to execute code on the server.”

The details of the vulnerability have not been officially published.

According to the official Bricks Builder changelog:

“We just released a mandatory security update with Bricks 1.9.6.1.

A leading security expert in the WordPress space just brought this vulnerability to our attention, and we instantly got to work, providing you now with a verified patch.

As of the time of this release, there’s no evidence that this vulnerability has been exploited. However, the potential for exploitation increases the longer the update to 1.9.6.1 is delayed.

We advise you to update all your Bricks sites immediately.”

Vulnerability Is Being Actively Exploited

According to Adam J. Humphreys (LinkedIn), founder of the web development company Making 8, the vulnerability is actively being exploited. The Bricks Builder Facebook community is said to be responding to affected users with information on how to recover from the vulnerability.

Adam J. Humphreys commented to SEJ:

“Everyone is getting hit bad. People on hosts without good security got exploited. A lot of people are dealing with it now. It’s a bloodbath and it’s the number one rated builder.

I have strong security. I’m so glad that I’m very protective of clients. It all seemed overkill until this.

SiteGround when installed has WordPress security. They also have a CDN and easy migrations with their plugin. I’ve found their support more responsive than the most expensive hosts. The WordPress security plugin at SiteGround is good but I also combine this with Wordfence because protection never hurts.”

Recommendations:

All Bricks Builder users are encouraged to update to the latest version, 1.9.6.1.

The Bricks Builder changelog announcement advises:

“Update Now: Update all your Bricks sites to the latest Bricks 1.9.6.1 as soon as possible. But at least within the next 24 hours. The earlier, the better.

Backup Caution: If you use website backups, remember they may include an older, vulnerable version of Bricks. Restoring from these backups can reintroduce the vulnerability. Please update your backups with the secure 1.9.6.1 version.”

This is a developing event; more information will be added as it becomes known.

Google Updates Guidance On Image Removal From Search Index via @sejournal, @martinibuster

Google updated their emergency and non-emergency image removal guidance with added details that give new clarity to the documentation.

Removing Images From Search Index

Google offers multiple ways to remove images from the search index on both an emergency and non-emergency basis.

Many of the changes are relatively trivial, but these are the topics that received more substantive updates:

  • How to quickly remove images.
  • What to do when there is no access to the CDN that’s hosting the images or if the CMS doesn’t offer a way to block indexing.
  • More details about the use of robots.txt for images.
  • How to use wildcards in robots.txt.
  • A caveat about the use of noimageindex robots tag.

How To Quickly Remove Images From Index

The first addition to the documentation is the following paragraph:

“For emergency image removal
To quickly remove images hosted on your site from Google’s search results, use the Removals tool. Keep in mind that unless you also remove the images from your site or otherwise block the images as described in the non-emergency image removal section, the images may resurface in Google’s search results once the removal request expires.”

When There’s No Access To Images On A CDN Or Through The CMS

The next scenario is when images are hosted on a CDN but for whatever reason can’t be accessed, or the CMS doesn’t provide a way to block them.

This is the added paragraph:

“If you don’t have access to the site that’s hosting your images (for example a CDN) or your CMS doesn’t provide a way to block images with the noindex X-Robots-Tag HTTP header or robots.txt, you might need to delete the images altogether from your site.”
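For publishers who do control their image hosting, setting that header is straightforward. The following is a minimal sketch, assuming an Apache server with mod_headers enabled (the file extensions are illustrative):

# Send a noindex X-Robots-Tag with common image formats
<FilesMatch "\.(png|jpe?g|gif|webp)$">
  Header set X-Robots-Tag "noindex"
</FilesMatch>

Other servers, such as NGINX, have their own syntax for adding response headers.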

Images And Robots.txt

The next changes are minor additions to two paragraphs that together make the message clearer, with the addition of the phrase “for example https://yoursite.example.com/robots.txt” and some other extra words that are relatively trivial.

The following passage about robots.txt structure was changed from this:

“Rules may include special characters for more flexibility and control. The * character matches any sequence of characters, and patterns may end in $ to indicate the end of a path.”

To this:

“Rules may include special characters for more flexibility and control. Specifically, the * character matches any sequence of characters which lets you match multiple image paths with one rule.”

Change To Guidance On Robots.txt Wildcards

The next change is more substantial because it offers more details on how to use wildcards. Wildcards in this context refer to the * symbol, which matches any sequence of characters.

This part:

“# Wildcard character in the filename for
# images that share a common suffix:”

Becomes this:

“# Wildcard character in the filename for
# images that share a common suffix. For example,
# animal-picture-UNICORN.jpg and
# animal-picture-SQUIRREL.jpg
# in the “images” directory
# will be matched by this pattern.”
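Putting that together into a complete robots.txt sketch (the user agent and paths here are illustrative, not taken from Google’s documentation), a rule blocking those images from Google Images might look like this:

# Hypothetical sketch: block matching images from Google Images
User-agent: Googlebot-Image
Disallow: /images/animal-picture-*.jpg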

New Paragraph About Noimageindex Robots Tag

The last of the significant changes is a passage that offers a caveat about the use of the noimageindex robots tag.

This is the new passage:

“Note that adding the noimageindex robots tag to a particular page will also prevent images embedded in that page from getting indexed. However, if the same images also appear in other pages, they might get indexed through those pages. To make sure a particular image is blocked no matter where it appears, use the noindex X-Robots-Tag HTTP response header.”
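As a concrete illustration of the difference, the page-level directive goes in the HTML head of the embedding page, while the header travels with the image’s own HTTP response (both lines below are generic sketches):

<!-- page-level, in the head of the embedding page -->
<meta name="robots" content="noimageindex">

X-Robots-Tag: noindex

The meta tag only affects images embedded in that one page; the response header is attached to the image file itself, blocking it no matter where it appears.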

Google Search Central Updating Documentation

This is the latest in an ongoing series of updates to Google’s documentation. Long webpages are edited to make them more concise. Others, like this webpage, are edited to make them clearer.

Read the newly updated guidance on removing images from Google’s index:

Remove images hosted on your site from search results

Featured Image by Shutterstock/Piotr Swat

Google Is Rolling Out New Search Features In Europe via @sejournal, @MattGSouthern

Google has unveiled new search experiences that will soon roll out across the European Economic Area (EEA).

The changes are part of Google’s preparations to comply with the European Union’s Digital Markets Act (DMA).

One notable change is a carousel-style rich result for queries like “hotels near me.”

Google is also adding dedicated ‘aggregator units’ to showcase links to major aggregator sites relevant to the search.

In a blog post, Google explains that it aims to “present users with rich and relevant information for their searches” and “improve the visibility of ecosystem participants.”

Rich Results Carousel

Google will soon roll out a new search feature that displays carousel-style rich results for queries related to travel, local services, and shopping.

This new format will allow users to horizontally scroll through tiles showing additional information like prices, ratings, and images.

The carousel results depend on web pages having the appropriate structured data markup. Without that markup, pages will continue showing the standard text search results.
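The general mechanism Google documents for carousels is schema.org ItemList markup. The JSON-LD below is a minimal, hypothetical sketch of that pattern; the exact types and properties the new EEA carousels require are spelled out in Google’s structured data documentation:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "ItemList",
  "itemListElement": [
    {
      "@type": "ListItem",
      "position": 1,
      "item": {
        "@type": "Hotel",
        "name": "Hotel Example",
        "url": "https://www.example.com/hotel-example"
      }
    },
    {
      "@type": "ListItem",
      "position": 2,
      "item": {
        "@type": "Hotel",
        "name": "Another Hotel",
        "url": "https://www.example.com/another-hotel"
      }
    }
  ]
}
</script>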

While initially launched for travel and local searches, Google is testing the carousel for shopping queries in Germany, France, Czechia, and the UK.

Aggregator Units & Refinement Chips

Google is implementing new search features that provide direct links to content from aggregator websites in top search results.

Additionally, Google is adding refinement chips that allow searchers to narrow down results to specific types of content.

Varieties of these aggregator units include:

  • Places sites
  • Jobs sites
  • Flight sites
  • Product sites (initially tested in Germany, France, Czechia, and the UK)

Google says these features will not require additional work from publishers to implement.

Flight Queries

Alongside the abovementioned features, Google is testing a new search feature for flight-related searches.

This feature displays airline website results in a separate unit, allowing users to find flight details more easily.

Google’s Invitation To EEA Publishers

These new search features are exclusively available to users in the EEA to comply with the DMA.

Google invites EEA-based companies or those serving EEA users to express interest in these new search features by filling out a form.

Through these changes, Google aims to create a more user-focused and competitive digital market in line with DMA goals while providing businesses with new opportunities in search results.

Why These Features Are Exclusive To Europe

Google is rolling out these features in Europe because of a legal obligation to comply with the DMA.

The DMA is new EU legislation aimed at regulating large technology companies designated as “gatekeepers” due to their market dominance.

It comes into effect in March and requires significant changes by companies like Google, Apple, Amazon, Meta, Microsoft, and ByteDance.

As a designated gatekeeper, Google must adjust products like Search, Maps, and the Android app store to comply.

Requirements For “Gatekeepers”

The DMA’s main objectives are the following:

  • Give users more choice over default apps and services
  • Allow alternative app stores
  • Mandate interoperability between messaging services
  • Ban self-preferencing in rankings
  • Require consent for targeted ads
  • Improve data transparency

Gatekeepers will need user consent for targeted ads and to provide more precise advertising data to business customers. They will also have to allow other app stores on their devices.

E-commerce, search, and social media platforms cannot unfairly rank their services above competitors. Messaging services like WhatsApp and iMessage will have to become interoperable.

Lawmakers in the EU believe this will benefit consumers, who can choose default apps and services easily.

Looking Ahead

As the enforcement date approaches, Google intends to comply with the DMA while maintaining user experience.

Companies have a six-month transition period before the DMA fully takes effect. The legislation is expected to reshape the digital market in the EU significantly.


FAQ

What new search experiences are Google rolling out in the European Economic Area?

Google is introducing a range of new search enhancements designed to comply with the European Union’s Digital Markets Act (DMA). These enhancements include:

  • A carousel-style rich result for travel-related queries, such as “hotels near me.”
  • ‘Aggregator units’ that showcase links to major aggregator websites relevant to the user’s search.
  • Refinement chips that enable users to filter search results more precisely.
  • Features to improve the visibility of airline websites for flight-related queries.

How will Google’s carousel-style rich results affect search visibility?

Introducing carousel-style rich results will enhance the visibility of specific search results by displaying them in a scannable and visually appealing horizontal format. Key impacts include:

  • Improved user engagement through interactive elements like prices, ratings, and images.
  • Increased visibility for web pages that implement the appropriate structured data markup.
  • Possibility for businesses with correctly marked-up pages to gain more attention and potentially drive more traffic to their site.

Without the requisite structured data markup, pages will continue to appear as standard text search results.

What implications does the Digital Markets Act have for tech companies and users in the EU?

The Digital Markets Act (DMA) introduces a series of regulations and requirements that will have wide-ranging implications for tech companies and consumers in the EU, including:

  • More user autonomy over default apps and services, achieved by prohibiting self-preferencing and mandating options for alternative services.
  • Introduction of alternative app stores and interoperability between messaging services.
  • Requirement for explicit user consent for targeted advertising and greater transparency regarding advertising data.
  • An environment that encourages fair competition by preventing e-commerce, search, and social media platforms from unfairly ranking their services above competitors.

The DMA is expected to significantly alter the digital market environment within the European Union, providing consumers with increased choice and control.


Featured Image: Screenshot from developers.google.com, February 2024. 

Google Updates Rel=Canonical Documentation via @sejournal, @martinibuster

Google updated their rel=canonical documentation to clarify how Google handles the extraction of rel=canonical annotations. The clarification is not meant to indicate a change in how Google handles rel=canonical annotations but rather to make it explicitly clear how Google processes them.

Canonical Link Relation – RFC 5988

Google’s documentation has always referenced RFC 5988 as the standard it follows for the canonical link relation. The RFC is a standard published by the Internet Engineering Task Force (IETF) that defines specifications for various Internet and networking technologies, in this case the standards related to the HTML rel link attribute.

An HTML element is like a basic building block of an HTML webpage. An element can be extended with an attribute. In this case the link element is modified by the rel attribute.

RFC 6596 defines the canonical link relation as:

“RFC 5988 specifies a way to define relationships between links on the web. This document describes a new type of such a relationship, “canonical”, to designate an Internationalized Resource Identifier (IRI) as preferred over resources with duplicative content.

…Common implementations of the canonical link relation are to specify the preferred version of an IRI from duplicate pages created with the addition of IRI parameters (e.g., session IDs) or to specify the single-page version as preferred over the same content separated on multiple component pages.”

What that means is that the canonical link element identifies which of two or more duplicative documents is the preferred original. These are the parameters that Google has used to process the canonical link element.

Changes To Canonical Documentation

The changes to the Search Central documentation are specific to rel=”canonical” link annotations that fall outside the use case of specifying duplicative documents, plus some minor and trivial changes to the page.

Google changed the following sentence:

“Google supports rel canonical link annotations as described in RFC 6596.”

The change is limited to adding the word explicit:

“Google supports explicit rel canonical link annotations as described in RFC 6596.”

While that change may seem trivial, it’s actually the focus of the documentation change in that it makes clear that Google is not deviating from the standards laid out in RFC 6596.

The next change is an addition of an entirely new paragraph.

This is the new paragraph:

“rel=”canonical” annotations that suggest alternate versions of a page are ignored; specifically, rel=”canonical” annotations with hreflang, lang, media, and type attributes are not used for canonicalization.

Instead, use the appropriate link annotations to specify alternate versions of a page; for example, link rel=”alternate” hreflang for language and country annotations.”

What that means is: don’t use “canonical” to point at something that is not a duplicative webpage, such as a version in another language or for another device; use “alternate” for those instead.
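As a hedged illustration (the URLs are hypothetical), the two annotations look like this in a page’s head, with the canonical naming the preferred duplicate and the alternate naming a language version:

<!-- hypothetical URLs for illustration -->
<link rel="canonical" href="https://www.example.com/page">
<link rel="alternate" hreflang="de" href="https://www.example.com/de/page">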

This does not represent a change in how Google uses or ignores canonical or alternate link elements.

Google’s changelog documentation explains it:

“Clarifying the extraction of rel=”canonical” annotations
What: Clarified that rel=”canonical” annotations with certain attributes are not used for canonicalization.

Why: The rel=”canonical” annotations help Google determine which URL of a set of duplicates is the canonical. Adding certain attributes to the link element changes the meaning of the annotation to denote a different device or language version. This is a documentation change only; Google has always ignored these rel=”canonical” annotations for canonicalization purposes.”

Read Google’s updated documentation:

How to specify a canonical with rel=”canonical” and other methods

Featured Image by Shutterstock/Kues

Google Launches “How Search Works” Series To Demystify SEO via @sejournal, @MattGSouthern

Google is releasing a new educational video series on its Search Central YouTube channel.

The videos feature Google engineer Gary Illyes, who provides an inside look at how Google Search works. The goal of the series is to explain the complexities of the search engine in an understandable way.

The first episode serves as an introduction to the series. Future episodes will delve deeper into techniques for improving website visibility in search results.

“How Search Works” Series Debut

“How Search Works” is a five-part series that delves into the technical aspects of Google’s Search functionality.

“We created this series to help you, your friends, family, business partners, and anyone, really, increase the visibility of your sites,” Illyes states, emphasizing the technical focus of the content.

The Mechanics of Search

The series aims to unpack the three core stages of Google Search:

  1. Crawling – The process by which Google discovers URLs and explores the web.
  2. Indexing – How Google understands a page’s content and context in relation to the internet, storing it in a searchable format.
  3. Serving – The method Google employs to serve and rank search results.

Additionally, future videos will dissect the components of search results and how to optimize webpages to enhance visibility.

Key Takeaways From Episode One

Illyes underlines two critical points for understanding Google Search:

  1. Google doesn’t accept payment for more frequent crawling or higher ranking. Gary emphasizes, “If anyone tells you otherwise, they’re wrong.”
  2. The quality of a website’s content is paramount for achieving a favorable position in search results.

Illyes says Google’s definition of “quality” content will be explained in future episodes.

More Of The Same?

It’s unclear how Google’s new “How Search Works” video series will differ from its existing videos like “Search for Beginners.”

Based on the introduction, there is potential for a lot of repetitive content between the two series.

However, given that Google advocates publishing unique content, one would expect the new videos to provide additional insights and information beyond what has already been covered in its previous educational materials.

FAQ

What is the purpose of Google’s “How Search Works” video series?

  • Google created the “How Search Works” video series to make the intricate workings of Google Search easier to understand for more people.
  • By offering explanations straight from the source at Google, the goal is to help website owners optimize their sites to rank higher in search results.

Does payment for crawling or ranking influence Google Search?

  • Google makes it clear it doesn’t take money in exchange for boosting a website’s ranking or crawling rate in search results. In the video, Illyes denies any suggestion that payments can sway search rankings.
  • Google insists that search result order is determined by assessing website content quality and relevance to the user’s query. Financial transactions play no role in influencing search rankings.

How can marketers leverage insights from the “How Search Works” series?

  • The “How Search Works” series offers marketers valuable insights into the technical inner workings of Google Search.
  • By learning about critical SEO concepts like crawling, indexing, and serving, marketers can better optimize websites to improve visibility and rankings on Google.
  • The series aims to give marketers a clearer picture of how to optimize sites for Google Search ethically.


Featured Image: Screenshot from YouTube.com/GoogleSearchCentral, February 2024. 

Google’s Danny Sullivan Provides 5-Step Plan To Diagnose Ranking Drops via @sejournal, @MattGSouthern

Google’s Search Liaison, Danny Sullivan, recently offered guidance on how to diagnose ranking declines.

Sullivan provided the advice on X (formerly Twitter) to Wesley Copeland, owner of a gaming news website, who sought help after seeing a significant drop in traffic from Google searches.

Google’s Search Liaison Offers SEO Tips

According to Copeland’s post, he’s been struggling to understand why his website’s status as the go-to source for Steam Deck guides has changed. He stated:

“Hey Danny! Any chance you could take a look at my site http://RetroResolve.com, please? We used to be the go-to for guides on Steam Deck but got hit pretty badly and I’m a bit lost as to why.”

A Five-Step Plan To Recovery

Sullivan recommended several steps to diagnose and address potential issues with the website’s performance:

  1. First, use Google Search Console to compare the site’s metrics over the past six months versus the prior period.
  2. Next, sort the Queries report by click change to identify notable decreases.
  3. Check if the site still ranks highly for those terms.
  4. If so, the content quality and SEO may not be the problem.
  5. Recognize that Google’s ranking algorithms evolve continually, so some volatility is expected.

“If you’re still ranking in the top results, there’s probably nothing fundamental you have to correct,” Sullivan assured.

He elaborated that changes in traffic could be due to Google’s systems finding other content that could be deemed more useful at the time.

Implications & Insights For SEO Professionals

Sullivan’s advice highlights the importance of SEO professionals regularly analyzing performance with tools like Google Search Console. His recommended approach can provide insights into traffic changes and identify areas to potentially optimize.

High search rankings require aligning with Google’s evolving ranking criteria. Google continually improves its algorithms to deliver the most relevant content to users. Therefore, search ranking fluctuations are expected.

Final Words

Copeland’s experience illustrates the volatile nature of SEO, showing that even well-established websites can be impacted by changes to Google’s ranking priorities.

Sullivan’s final words offer a mix of assurance and the reality of SEO:

“But you probably don’t have any fundamental issues, and it might be the mix of how we show content could change to help you over time.”

The conversation between Copeland and Sullivan is a lesson in staying vigilant and responsive to the ever-evolving demands of Google’s algorithms.

FAQ

What strategies should SEO professionals employ to adapt to ranking fluctuations?

  • Regularly monitor website performance data through Google Search Console to detect trends or changes in traffic.
  • Keep informed about updates to Google’s ranking algorithms and adapt SEO tactics accordingly.
  • Focus on creating content that aligns with Google’s current relevance and usefulness standards.
  • Remain vigilant and prepared to make optimizations as market conditions and ranking criteria evolve.

What insight does the interaction between Wesley Copeland and Danny Sullivan give to SEO marketers?

  • It underscores the unpredictable nature of SEO, indicating that even popular websites can experience shifts due to ranking algorithm updates.
  • The guidance confirms the necessity of maintaining a proactive approach to SEO, particularly in evaluating performance metrics.
  • It highlights that, while traffic declines can be worrying, they don’t always signal fundamental issues with the content or SEO practices.


Featured Image: Who is Danny/Shutterstock

WordPress User Survey Indicates Rising Frustration via @sejournal, @martinibuster

WordPress released the results of their annual user and developer survey which showed mixed feelings about the direction the software is going and an increasing sense of not being welcome in the overall WordPress community.

The Gutenberg Editor

Gutenberg is the modernized version of the default site editor, which brings the paradigm of a visual editor to the WordPress core.

Third-party visual WordPress editors have revolutionized the process of building websites with WordPress, making it relatively easy to create websites with intuitive interfaces.

That was the goal behind Gutenberg, which introduced the full site editor in 2022. The WordPress core development team has spent the last two years making incremental improvements to the user interface to make it more intuitive, as well as adding more features.

What was reflected in the 2023 annual survey, especially in contrast to the previous year, is a sense that users feel less confident about Gutenberg, even though more publishers are using it now than at any other time.

Which Editor Do You Use?

Question nine tracks the percentage of users adopting Gutenberg, showing a steady increase of users from 37% in 2020 to 60% in 2023.

But according to the answers to question 10, which asks whether WordPress meets their needs, 29% of respondents disagreed that WordPress meets their needs and less than half (45%) agreed that it did. A full 26% answered that they were neutral.

Those results mean that 55% of WordPress users did not answer that WordPress meets their needs. This was the first year the question was asked so there’s no data to show whether that’s an increase or a decrease but it’s still an underwhelming result.

Fewer Users Believe WordPress Is As Good As Others

Question #19 asked if WordPress was as good as or better than other site builders and content management systems.

In 2022 68% of users agreed that WordPress was as good as or better. That number dropped to 63% in 2023.

The number of users who disagreed that WordPress is as good as or better increased from 9% in 2022 to 13% in 2023, and the number of people who were neutral increased by one percentage point to 24% of respondents.

That means that in 2023, 37% of WordPress users responding to the survey did not agree with the statement that WordPress is as good as or better, an increase of five percentage points over the previous year.

Clearly the results about how users feel about Gutenberg and WordPress in general indicate that users are losing confidence in WordPress.

That response must surely be a disappointment to the core development team, because the 2023 version of Gutenberg is more intuitive to use than it has ever been, and WordPress performance scores are at all-time highs.

So what’s going on? Why are user satisfaction signals trending downward?

Why User Satisfaction Is Trending Downward

Why user happiness and confidence in WordPress are trending downward may have something to do with users looking over the fence at platforms like Wix and Duda, which boast significantly better performance scores and make it easier to build websites.

On the other side of the fence are third-party website builders (like Bricks Builder, Breakdance Website Builder, and Elementor) and WordPress hosts (Bluehost) that offer an arguably superior website building experience for developers who need advanced flexibility and for users who don’t know how to code.

Perhaps a clue to why user satisfaction is dropping can be found in the answers to question 20, which asks what the three best things about WordPress are.

The biggest declines were for:

  1. Ease of use
  2. Flexibility
  3. Cost
  4. Block themes

Ease Of Use
In 2022, 32% of users cited Ease Of Use as one of the three best things about WordPress. In 2023 that number dropped to 21.7%.

Flexibility
In 2022, 31% of users cited Flexibility; by 2023 that number dropped to 18.5%.

Cost
In 2022, 37% of users cited Cost as one of the best things, but by 2023 that number collapsed to 17%.

Block Themes
Block Themes went from 10% of users citing them as one of the three best things in 2022 to only 5.3% in 2023.

Users aren’t feeling it for WordPress, and that lack of “feels” is reflected in the market share statistics reported by W3Techs, which indicate a two-year downward trend in market share.

Market share dropped from 43.3% in 2022 (a figure cited in an article by Joost de Valk) to 43.2% in February 2023 (according to W3Techs), and from there to 43.1% in February 2024.

Wix usage increased from 2.5% in February 2023 to 2.6% in 2024. Shopify went from 3.8% in 2023 to 4.3% in 2024.

Joost de Valk, co-founder of Yoast SEO, sounded the alarm back in 2022 when he noted that WordPress market share was shrinking, pointing to the slow pace of performance improvements and the difficulty of using WordPress as two major reasons.

The article written by Joost explained:

“WordPress has a performance team now, and it has made some progress. But the reality is that it hasn’t really made big strides yet… I think WordPress, for the first time in a decade, is being out-‘innovated’.”

What Frustrates WordPress Users

Another clue as to why WordPress users are increasingly expressing dissatisfaction is what frustrates them most about WordPress, captured in question 21, where survey respondents were asked to choose the three most frustrating things.

The answer of “too many plugins (finding the right one)” experienced a whopping 133% increase, with 8% citing too many plugins in 2022 and 18.6% in 2023.

Site editing experience (17%), security (16.4%), and performance (16.2%) were top sources of frustration with WordPress.

One bright spot is that the number of respondents who were frustrated because site editing is difficult to learn dropped from 26% in 2022 to 15% in 2023.

Those answers were echoed in question 25, which asked which three areas of WordPress need more attention.

Here are the top five areas users say need more attention:

  1. Performance 19%
  2. Security 18%
  3. Developer resources (examples, demos, docs, tutorials, etc.) 16%
  4. Design/UI 14%
  5. Core functionality/stability 13%

The Future Of WordPress

WordPress was at a crossroads two years ago with regard to site performance, and it took steps to address those problems. But its competitors are “out-innovating” it by improving at a faster pace, not just in site speed but in ease of use, SEO, and features.

The results of this survey provide clear direction to the WordPress community, which has a history of being responsive to user needs. Part of the solution is acknowledging the search marketing, affiliate, and publishing communities, which are influential but not recognized in the annual surveys.

When I saw the survey last year, I offered the core development team feedback about question number five, which asked how respondents used WordPress.

These were the choices:

  • A personal or passion project
  • A service offering for my clients
  • A platform for running my business
  • A website for my employer or place of work
  • School or academics or research
  • None of the above

What was missing were the categories of content publishing, affiliate marketing, recipe bloggers and local businesses.

Lumping WordPress users like Disney together with family-run restaurants and recipe bloggers into the category of “a platform for running my business” is unhelpful and provides little actionable insight. That oversight feeds into the perception that WordPress is aloof to the millions of users the survey seeks to understand.

The good news is that WordPress is not aloof. The survey provides feedback on how the publishing community feels. My email conversations with members of the core development team make it clear to me that they are keen to embrace all their users as part of the greater WordPress community.

Read the summary of the WordPress survey:

2023 Annual Survey Results and Next Steps

Download the PDF version with more details:

Report for 2023 WordPress Annual Survey

Featured Image by Shutterstock/Krakenimages.com

Google On 404 Errors And Search Console Validation Fix via @sejournal, @martinibuster

Google’s John Mueller answered an interesting question about what actually happens after clicking the “validate a fix” link in Search Console when the 404 status still exists, explaining what that function is really for.

What Is A 404 Status Code?

When a browser requests a webpage, the server returns a response with a status code that communicates the status of the request. If the request is successful, the server responds with a 200 (OK) status code. If the request is unsuccessful because the webpage does not exist at the requested URL, the server responds with a 404 (Not Found) status code.
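As a simple illustration of the protocol (the hostname and path are hypothetical), the exchange for a missing page looks like this:

GET /missing-page HTTP/1.1
Host: www.example.com

HTTP/1.1 404 Not Found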

How does Google Search Console (GSC) handle the validation of fixed 404 errors?

Dixon Jones, CEO of Inlinks, asked what it means to validate a 404 error response in Search Console when the 404 still exists.

He tweeted his question:

“Hi @JohnMu – boring question… If a 404 means a page does not exist (and should not exist), what does GSC do when it tries to “validate a fix”?

It will still be a 404… so what drives it out of the 404 list? Removal of links To that page? Or should we start creating 301s? I assume not…”

Google’s John Mueller explained the purpose of Search Console’s 404 validation:

“It’s more if you accidentally 404’d something and fixed it. You obviously don’t have to fix 404s that you want to be 404s. Also, this is more about tracking for you (“I fixed this, tell me when you see it fixed too”).”

It’s not uncommon for webpages to be accidentally removed or to disappear because of technical issues. As a convenience to publishers (and to searchers), Google keeps remembering the location of missing webpages so that it can start showing them in the search results again once a page returns, as John Mueller says, “…if you accidentally 404’d something and fixed it.”

What Causes A 404 Status Code And How Should It Be Interpreted?

The 404 response is called an error because the webpage requested at the URL does not exist, and thus the request itself is in error; it does not mean that Google found an error on the webpage that needs to be fixed.

RFC-Editor.org publishes the official Internet standards, including those for HTTP, and the official description of the 404 status code does not even mention the word error.

This is the official standard for the 404 status code:

“15.5.5. 404 Not Found
The 404 (Not Found) status code indicates that the origin server did not find a current representation for the target resource or is not willing to disclose that one exists.

A 404 status code does not indicate whether this lack of representation is temporary or permanent; the 410 (Gone) status code is preferred over 404 if the origin server knows, presumably through some configurable means, that the condition is likely to be permanent.”

Technically, if the 404 status is known to be permanent and the webpage is never coming back, the correct response is a 410 status code.

But Google treats the 404 and 410 response codes almost equally. The 410 response causes the webpage to drop out of Google’s search index just a little bit faster. But the end result is the same.
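For publishers who want to send the 410 instead, a minimal sketch on an Apache server with mod_alias (the path is hypothetical):

# Tell clients and crawlers the page is gone for good
Redirect gone /old-page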

Is It Necessary To Fix All 404 Errors Including From External Links?

Jeannie Hill stepped into the discussion to ask about inbound links from other sites to the wrong URL.

She tweeted:

“Most of the 404s we don’t want are derived from external sources that fail to get the inbound URL right. Even Research Gate. Trying to correspond typically lags or has no response. Is it worth pursuing?”

John Mueller responded:

“Probably not. (Also “validate fix” is about checking the URL on your site, not the linking URL, so it wouldn’t apply there anyway.)”

Jeannie followed up:

“Thanks, @JohnMu for the response.

It is useful to identify these inbound 404s while “Validate fix” helps resolve internal linking issues.

We’ve resolved a few inbound 404s that we thought were more important. However, I question the value gained for the effort it takes.”

John responded with a comment on the value of spending the time to fix inbound links:

“I’d look at the traffic and not SEO-Juice. Are too many people getting lost when they want to visit you? That seems like something worth fixing if you can.”

The Role Of “Validate Fix” In Managing Internal And External 404 Errors

John Mueller made clear that the 404 report in Search Console is meant to be a way to communicate that Google found missing pages. It’s up to publishers to decide what to do about them.

When it comes to external links to URLs that don’t exist, Mueller suggests that fixing them is not worth pursuing, but I think most SEOs would disagree if the link is from a legit website. It makes sense to fix those inbound links by creating a 301 redirect from the malformed URL to the correct URL.
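A minimal sketch of that kind of fix, again assuming an Apache server with mod_alias (both paths hypothetical):

# Permanently redirect the malformed inbound URL to the real page
Redirect 301 /misspelled-url /correct-url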

Featured Image by Shutterstock/tynyuk

New Google Analytics Feature Detects Subtle Data Trend Changes via @sejournal, @MattGSouthern

Google has announced a new trend change detection feature in Google Analytics.

This new capability will allow you to more easily identify subtle but meaningful shifts in data trends over time.

What Is Trend Change Detection?

Trend change detection is designed to detect gradual, long-term changes in metrics rather than sudden spikes or dips.

While Google Analytics already has anomaly detection to highlight abnormal spikes or drops, this new feature focuses on more subtle changes in data trends spanning weeks or months.

How It Works

When Google Analytics detects a change in the direction of time-series data for a metric, it overlays a marker on charts and graphs at the date the change occurred.

You can hover over the marker to see details about the change, including the previous and current rates of change and the exact date the trend shifted.

Screenshot from: support.google.com, February 2024.

Clicking “Investigate Report” will open a more detailed view to analyze the data further. You can adjust the date range, compare other dimensions, and add breakdowns to better understand what is driving the trend change.

Why Trend Change Detection Matters

Unexpected downward trends require investigation to determine the cause. For example, a website code update could inadvertently break a registration button, leading to stalled user growth.

Without trend change detection, it could take weeks or months to notice the gradual decline in new registrations.

Google Analytics uses a signal segmentation algorithm to detect trend changes in time series data. For daily data, it examines approximately 90 days of history. For weekly data, the algorithm looks at around 32 weeks to identify potential trend changes.

Where You’ll See Trend Change Detection

Trend changes will be displayed in the following places in the Google Analytics interface:

  • Insight cards on the main Home page
  • The Reports snapshot page
  • The Advertising snapshot page
  • The dedicated Insights hub

Types of Trend Changes Detected

The new detection capability is designed to surface the following types of trend changes:

  • Increase to decrease (growth slowing)
  • Decrease to increase (decline reversing)
  • Larger increase or decrease (growth or decline accelerating)
  • Smaller increase or decrease (growth slowing or decline easing)

Tips For Using Trend Change Detection

Leveraging Google’s new granular trend tracking provides an edge in diagnosing opportunities and threats in organic data patterns.

Monitoring organic traffic metrics week-over-week and month-over-month allows you to detect unexpected downward trends that could indicate issues like:

  • Sites being penalized or blocked
  • New algorithm updates decreasing rankings
  • Technical SEO issues preventing crawling and indexing

Early detection of these trend changes allows you to investigate and address the root causes quickly.

FAQ

How does trend change detection differ from anomaly detection in Google Analytics?

Trend change detection and anomaly detection are two distinct features within Google Analytics that cater to different analytical requirements:

  • Trend change detection is engineered to identify prolonged, gradual shifts in metrics over time rather than immediate, stark fluctuations.
  • Anomaly detection highlights irregular spikes or drops that deviate from the standard data patterns.
  • Together, they provide a comprehensive overview of both short-term anomalies and long-term trends within the data.

How can marketers effectively utilize Google Analytics’ trend change detection?

Marketers can leverage trend change detection to maintain oversight of website performance in the following ways:

  • Regularly monitor insight cards and reports within Google Analytics to detect and investigate any marked changes in data trends.
  • Use the “Investigate Report” feature to delve deeper into the data and understand the dynamics driving these changes.
  • Quickly address issues flagged by downward trends that may indicate technical SEO problems, penalties, or adverse effects from algorithm updates.


Featured Image: Vladimka production/Shutterstock