WordPress.com Launches Studio Sync Local Development via @sejournal, @martinibuster

WordPress.com, the Automattic web hosting platform, just announced that the free and open source Sync local development app can now integrate directly with WordPress.com hosting. The new synchronization feature streamlines the process of developing a website on the desktop and then pushing it live when it’s ready for deployment.

WordPress.com Hosting

WordPress.com is a WordPress web hosting and publishing platform that offers free and paid tiers, plus bargain-priced domain name registrations. WordPress.com is a for-profit company that’s owned by Automattic. Their slogan is “Everything you need to build and grow any website—all in one place” and with this new feature that slogan has never been more true.

The new feature is available to WordPress.com users on the paid Business plan level or higher.

According to the announcement, the new features bring the following benefits to users:

  • “Push and Pull with Ease: You can keep your local Studio sites connected to your WordPress.com site, so pushing or pulling will be as easy as clicking a button.
  • Flexible Syncing: Having complete freedom, you can connect a WordPress.com site to multiple Studio sites.
  • Team Collaboration: Multiple developers can connect a local Studio site to a shared WordPress.com site, making it easy to push and pull changes as a team.
  • Sync To and From Staging: If using staging sites are part of your development workflow, you can now easily push from your local Studio site to your WordPress.com staging site.
  • One-Click Bliss: No need to worry yourself with database dumps, manually syncing files over SFTP, or performing other monotonous manual steps. Synchronize your local and hosted sites at any time with just one click.”

Studio is available for both the Mac and Windows desktop environments.

Read the announcement:

Build Locally, Deploy Globally: Meet Studio Sync for WordPress.com

Featured Image by Shutterstock/Fauzi Muda

Study: 47% Find Search Terms Irrelevant During Product Research via @sejournal, @MattGSouthern

A recent study from Reddit shows that many people are frustrated with traditional search engines and ads, pushing them to seek product recommendations on community platforms.

The study found that 47% of social media users find “irrelevant search terms” particularly annoying during online product research. This frustration drives users to Reddit, where they can engage in discussions and receive personalized advice.

These findings suggest Reddit plays an increasingly important role in consumer decision-making.

Consumer Frustrations With Search

The study suggests there’s a growing gap between what users expect from search engines and what they deliver.

Almost half (47%) of the users surveyed were unhappy with search engines because they couldn’t find relevant, helpful answers during product research.

As a result, many users are turning to Reddit for better, context-specific recommendations.

71% of respondents said that Reddit is the best social media platform for finding quick and specific answers to their questions.

Reddit’s Role in the Purchase Journey

Reddit’s research shows how important the platform is in the buying process, from discovering products to making decisions.

The study finds that ads on Reddit help create more conversations about brands, which increases their visibility.

For every 1,000 ad impressions on Reddit, advertisers get about two organic posts, averaging 3,500 views.

The study also reveals that 23% of recommendation posts on Reddit lead to “redirection.” This means users start considering brands they had not thought about before.

These redirections usually occur when users ask for help, such as when they want advice on products that better suit their needs.

Reddit vs. Other Recommendation Sources

One of the study’s more striking claims is that Reddit recommendations are more trusted than many other forms of advice.

According to the findings, 42% of social media users value Reddit recommendations over other sources, including influencer-sponsored posts, branded ads, and even some expert reviews.

While Reddit recommendations ranked slightly below expert review sites (+17%) and consumer review sites (+15%) in terms of influence, they reportedly outperformed social media ads and influencer posts.

This suggests an increasing preference for community-driven recommendations over more traditional forms of advertising.

Reddit’s Reach Compared to Influencers

The study claims that Reddit’s reach is comparable to, or in some cases greater than, traditional influencer marketing:

  • 82 Reddit recommendation posts reportedly reach the same audience as an article on an endemic review site.
  • Six Reddit posts are said to match the reach of an Instagram influencer with 100,000 followers.
  • Eight Reddit posts equal the reach of a TikTok influencer with the same follower count.

Given that an estimated 25% of Reddit posts are recommendation-related, the platform’s potential for scale is significant.

For example, over 25,000 recommendation posts were recorded in the beauty category alone in December.

Balancing Reddit’s Claims

While Reddit’s findings highlight the platform’s potential as a discovery and recommendation tool, it’s important to view these claims in the context of the study’s source.

As a platform promoting itself as a solution to consumer frustrations, Reddit is interested in positively presenting its influence.

That said, the data does align with broader industry trends, showing a growing demand for authentic, peer-driven recommendations.

Consumers increasingly prioritize trust and personalization in their decision-making processes, and platforms like Reddit offer a space for this type of engagement.

Looking Ahead

Reddit’s research shows how consumer behavior is changing. Consumers increasingly value personalized recommendations from their communities over traditional discovery methods.

This shift could change how brands interact with consumers. Right now, Reddit’s statement that “conversation is the new influencer” highlights an important trend to watch.

Methodology

The study surveyed 1,000 social media users in each of seven key product verticals—laptops, TVs, cars, refrigerators, credit cards, makeup, and movie tickets—totaling 7,000 respondents.

Participants were asked to evaluate various recommendation sources, including Reddit posts, influencer-sponsored content, expert reviews, and branded ads.


Featured Image: voronaman/Shutterstock

Google Search Console Adds Hourly Data Export To 24-Hour View via @sejournal, @MattGSouthern

Google has expanded its newly launched 24-hour data view in Search Console by adding the ability to export hourly performance data.

This enhancement comes in response to user feedback since the feature’s initial rollout.

The announcement, shared on LinkedIn, stated:

“Last month we announced the 24-hour view in the Search Console Performance reports, and we got lots of positive feedback and feature requests. Today, we’re making the export button available for that view (one of the most requested features): the ability to export data on an hourly basis for the last 24 hours. Enjoy the new data!”

This update allows users to access detailed website performance data within the last 24 hours, broken down by the hour, and export it for further analysis.


Key Details About the 24-Hour Data View

The 24-hour data view, introduced in December, is an upgrade to Google Search Console’s performance reports.

It aims to provide near real-time insights into how website content performs across Google Search, Discover, and News.

Fresher, Faster Data

With the 24-hour view, website owners no longer need to wait up to 48 hours for performance metrics to update.

Data for clicks, impressions, average click-through rate (CTR), and average position is now available with minimal delay.

This feature is handy for monitoring the early performance of newly published content or tracking the immediate impact of updates to existing pages.

It provides timely insights into which search queries drive traffic and how pages rank in search results.

Hourly Breakdown

The 24-hour view includes an hourly breakdown of performance metrics displayed in an overtime graph.

While the data updates as soon as it becomes available, Google notes that incomplete data is marked with a dotted line in the interface to indicate ongoing collection.

You can filter the data by query, page, or country and analyze performance within your local time zone, as determined by browser settings.


How the Export Feature Enhances the 24-Hour View

Adding an export button to the 24-hour view addresses one of the most requested features from users.

With this functionality, you can now download hourly data for the past day, enabling deeper analysis through external tools like Excel or Google Sheets.
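As a rough illustration of that kind of external analysis, here is a minimal Python sketch that summarizes hourly clicks from an export. The column names (“Hour,” “Clicks,” “Impressions”) and the sample values are assumptions for illustration only; check them against your actual export file.

```python
import csv
import io

# Hypothetical sample standing in for a Search Console hourly export.
# Real exports may use different column names and formats.
sample_export = """Hour,Clicks,Impressions
2025-01-15 09:00,120,3400
2025-01-15 10:00,180,4100
2025-01-15 11:00,95,2800
"""

# Parse the CSV into dictionaries keyed by column name.
rows = list(csv.DictReader(io.StringIO(sample_export)))

# Total clicks across the window, and the hour with the most clicks.
total_clicks = sum(int(r["Clicks"]) for r in rows)
peak = max(rows, key=lambda r: int(r["Clicks"]))

print(f"total clicks: {total_clicks}, peak hour: {peak['Hour']}")
```

The same approach works in a spreadsheet; a script is simply easier to rerun every day as new 24-hour exports come in.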

This enhancement is expected to help businesses better respond to performance trends and optimize their content strategies.


Why It Matters

By making website metrics more accessible and timely, Google is helping marketers and SEO professionals stay competitive.

To access the 24-hour view, you can go to the performance reports section of Search Console and select the “24 hours” tab. The export button is now available to all users with access to this feature.

As the rollout continues over the coming months, this update highlights Google’s broader commitment to providing real-time tools that enhance website performance tracking and optimization.


Featured Image: Vladimka production/Shutterstock

Google Criticizes Bing For Mimicking Google’s Homepage via @sejournal, @MattGSouthern

Parisa Tabriz, the security leader for Google Chrome, has criticized Microsoft for a new strategy involving Bing’s search interface.

In a post on X (formerly Twitter), Tabriz denounced Microsoft’s decision to imitate the design of Google’s homepage, labeling it “another tactic in its long history of tricks to confuse users and limit choice.”

She concluded her statement with sharp words: “New year; new low, Microsoft.”

This criticism comes after Bing introduced a controversial feature that mimics Google’s user interface when users search for “Google” or “Google.com.”

Microsoft’s Sneaky New Bing Interface

When users not signed into a Microsoft account search for Google on Bing, they see a page that looks a lot like Google’s homepage.

Screenshot from: Bing.com, January 2025.

The page has a search bar in the center, a banner with animated figures similar to Google Doodles, and a message saying, “Every search brings you closer to a free donation. Choose from over 2 million nonprofits!”

This message links to the Microsoft Rewards catalog, where users can donate their reward points to nonprofit organizations.

The page loads scrolled slightly down, hiding the Bing logo and making Bing’s branding hard to see.

Users may only realize they’re still using Bing when they scroll or interact with the page further.

Attempt To Retain Users

Industry observers like The Verge note this move appears targeted at users setting up new Windows PCs, who might initially search for Google through Microsoft Edge’s default Bing search engine.

The design change could potentially retain users who might otherwise switch to Google’s search platform.

Many of these users search for Google to switch their search engine. Microsoft’s change aims to keep users from leaving Bing.

While tech-savvy users may notice this strategy, it might persuade less experienced users to keep searching on Bing, helping Microsoft retain more users.

Broader Context: The Search Engine Wars

This latest tactic highlights the ongoing competition between Microsoft and Google in the search engine market.

Microsoft has employed various strategies to promote its Bing search engine and Edge browser, including pop-ups and changes to Chrome’s download pages.

In parallel, Google has encouraged users to download Chrome and set Google as their default search engine, though its methods haven’t included outright deception.

Google’s and Microsoft’s rivalry remains heated. As of December, Google’s search engine maintained a dominant global market share of 89.74%, while Microsoft’s Bing held 3.97%.

Final Thoughts

As Microsoft continues to push for greater adoption of Bing, the company’s latest tactic raises questions about user trust and transparency.

While the mimicry may boost Bing’s metrics in the short term, the backlash from users and industry leaders could damage Microsoft’s reputation.

Whether Microsoft will address the criticism or double down on its strategy remains to be seen.


Featured Image: kovop/Shutterstock

Google Clarifies 404 & Redirect Validation In Search Console via @sejournal, @MattGSouthern

Google’s Search Advocate, John Mueller, has provided insights into Search Console’s validation process, addressing how it handles 404 errors and redirects during site migrations.

Key Points

A Reddit user shared their experience with a client’s website migration that led to a loss in rankings.

They explained that they took several steps to address the issues, including:

  • Fixing on-site technical problems.
  • Redirecting 404 pages to the appropriate URLs.
  • Submitting these changes for validation in Google Search Console.

Although they confirmed that all redirects and 404 pages were working correctly, the changes failed validation in Search Console.

Feeling frustrated, the user sought advice on what to do next.

This prompted a response from Mueller, who provided insights into how Google processes these changes.

Mueller’s Response

Mueller explained how Google manages 404 errors and redirect validations in Search Console.

He clarified that the “mark as fixed” feature doesn’t speed up Google’s reprocessing of site changes. Instead, it’s a tool for site owners to monitor their progress.

Mueller noted:

“The ‘mark as fixed’ here will only track how things are being reprocessed. It won’t speed up reprocessing itself.”

He also questioned the purpose of marking 404 pages as fixed, noting that no further action is needed if a page intentionally returns a 404 error.

Mueller adds:

“If they are supposed to be 404s, then there’s nothing to do. 404s for pages that don’t exist are fine. It’s technically correct to have them return 404. These being flagged don’t mean you’re doing something wrong, if you’re doing the 404s on purpose.”

For pages that aren’t meant to be 404, Mueller advises:

“If these aren’t meant to be 404 – the important part is to fix the issue though, set up the redirects, have the new content return 200, check internal links, update sitemap dates, etc. If it hasn’t been too long (days), then probably it’ll pick up again quickly. If it’s been a longer time, and if it’s a lot of pages on the new site, then (perhaps obviously) it’ll take longer to be reprocessed.”

Key Takeaways From Mueller’s Advice

Mueller outlined several key points in his response.

Let’s break them down:

For Redirects and Content Updates

  • Ensure that redirects are correctly set up and new content returns a 200 (OK) status code.
  • Update internal links to reflect the new URLs.
  • Refresh the sitemap with updated dates to signal changes to Google.
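The redirect checks above can be sketched as a small script. This is an illustrative sketch, not an official tool; the URLs and the observed status-code chains below are hypothetical examples of what a crawl of old URLs might report.

```python
def classify_migration_url(status_chain):
    """Given the HTTP status codes observed while following a URL
    (e.g. [301, 200]), report whether the migration rule holds:
    old URLs should permanently redirect to a page returning 200."""
    if not status_chain:
        return "no response"
    if status_chain[-1] != 200:
        # The chain ends somewhere other than a live page.
        return f"broken: final status {status_chain[-1]}"
    if len(status_chain) > 1 and any(s not in (301, 308) for s in status_chain[:-1]):
        # Redirect exists but uses a temporary (302/307) status.
        return "redirects via a non-permanent status"
    return "ok"

# Hypothetical observations from crawling old URLs after a migration:
observed = {
    "/old-page": [301, 200],  # permanent redirect to a live page
    "/old-post": [302, 200],  # temporary redirect; should be 301
    "/gone":     [301, 404],  # redirect target is missing
}
for url, chain in observed.items():
    print(url, "->", classify_migration_url(chain))
```

Running a check like this over the full URL mapping before requesting validation helps confirm that every redirect actually ends in a 200 response.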

Reprocessing Timeline

  • If changes were made recently (within a few days), Google will likely process them quickly.
  • For larger websites or older issues, reprocessing may take more time.

Handling 404 Pages

  • If a page is no longer meant to exist, returning a 404 error is the correct approach.
  • Seeing 404s flagged in Search Console doesn’t necessarily indicate a problem, provided the 404s are intentional.

Why This Matters

Website migrations can be complicated and may temporarily affect search rankings if not done correctly.

Google Search Console is useful for tracking changes, but it has limitations.

The validation process tracks whether fixes have been reprocessed; it doesn’t control how quickly that reprocessing happens.

Practice patience and ensure all technical details—redirects, content updates, and internal linking—are adequately addressed.


Featured Image: Sammby/Shutterstock

Google AI Overviews Appear in 18% Of Publisher-Related Queries via @sejournal, @MattGSouthern

New research indicates that Google’s AI Overviews appear in 18% of publisher-related search queries.

Additionally, the findings suggest that traditional search ranking factors may be less relevant for content appearing in AI Overviews.

Here are more highlights from the study released by ZipTie.dev, which analyzed over 500,000 queries across multiple industries.

Key Findings

Data indicates that 63% of sources cited in AI Overviews are not found in the top 10 traditional search results.

This change illustrates a shift in Google’s strategy, as explained by Rudzki:

“In traditional ranking, Google’s job is to send you to pages that you will likely be satisfied with. With AI Overviews the goal is different, it’s about showing you the best answer.”

The analysis found different frequencies of AI Overviews in search results:

  • “How much” queries show AI Overviews 54% of the time.
  • Review-related queries show AI Overviews only 9% of the time.
  • “What is” queries generate AI Overviews 39% of the time.

The study also notes that Google is using YouTube content in AI Overviews. This change could give publishers with video strategies more visibility opportunities.

Questions About Authority

Research shows that some publications are featured prominently in AI Overviews, even when the topics are outside their usual areas of expertise.

For example, Business Insider is often cited for celebrity news, while The Times of India is mentioned in health-related discussions.

This trend indicates that traditional ideas about who has authority on a topic are becoming less important.

Looking Ahead

AI Overviews are now available in over 100 countries and territories. However, their use in the EU is limited because of regulations. Right now, the feature carries only a limited number of ads.

The study expects AI Overviews to grow more in the future, but notes two main factors that could slow this expansion.

Rudzki states:

“Google is not putting ads in AI Overviews, except for very limited usage. Once they will find a good way to earn money, they will likely increase the share of AI Overviews.”

Additionally, he notes that user experience remains crucial:

“Google just can’t put AI Overviews for every keyword. This would translate to extremely low satisfaction rates.”

Methodology

The analysis examined over 500,000 queries across multiple industries between June and December 2024.

The complete study and detailed methodology are available through ZipTie.


Featured Image: Below the Sky/Shutterstock

Mullenweg Criticizes WP Engine For Something He Also Does via @sejournal, @martinibuster

Matt Mullenweg cited a Reddit thread on X to promote the idea that WP Engine makes it difficult to cancel accounts. Turns out that his own hosting company does the exact same thing.

“Money Grab” Post By Redditor

Someone posted that they cancelled a WP Engine account on Friday, December 6th. They subsequently learned that WP Engine has a 30-day advance notice cancellation policy, so they called customer service and were assured they wouldn’t be charged, despite not giving 30 days’ advance notice.

They wrote:

“On Dec 6th, I cancelled my WPEngine service that I’ve had since 2015. …That’s when I discovered that WPEngine requires 30 days notice to cancel. An obvious money-grab. A user should be able to cancel a single-site hosting environment instantly with one click of a button. In fact, this will be the law soon, created because of unscrupulous cancellation tactics like this.

WPEngine support informed me that my site would be cancelled on Jan 3rd. …Surprise, surprise… it’s Jan 5th. My account hasn’t been cancelled, and I was charged $300 today for another year of service.”

Hours after starting the Reddit post to complain about WP Engine they updated it to say that WP Engine had refunded their money.

They posted:

“UPDATE: WP Engine support got back to me, cancelled the account today, and initiated a refund that’ll take up to 10 business days.”

Mullenweg Dumps On WP Engine

Mullenweg posted on X to compare WP Engine to an unscrupulous gym, accompanying his post with a screenshot of the Reddit post:

“One way @wpengine juices its profits at the expense of its customers is by making it hard to export or cancel your plan, like one of those bad gyms.”

He followed that up with another post touting that WordPress.com has a flexible cancellation policy:

“The WordPress philosophy is to make it easy for people to leave, so they’re more likely to stay. Give freedom and choice.”

But the reality is that WordPress.com’s cancellation policy also requires 30 days’ advance notice:

“You must cancel at least one month before the scheduled end date of any annual subscription…”

What’s Going On?

WP Engine’s contract is clear that they require 30 days notice to cancel a service. But it’s not like once you pay you’re committed to a whole year of hosting. The contract enables customers to cancel their yearly hosting plan at any time (with 30 days notice) and the difference for any remaining months will be refunded.

The Redditor cancelled their account with less than 30 days notice (on a Friday), got charged 24 days later and then refunded on a Sunday, before the weekend was over.

Response On Reddit

While many Redditors were supportive of the person who started the discussion, others pointed out the obvious that it’s a weekend and they failed to give adequate notice.

A Redditor named ThePresidentOfStraya posted:

“Not affiliated with WPEngine. Downvoted. This is a boring billing issue, you’re not being oppressed. Annoying sure. But just call them Monday mate.”

Another Redditor downplayed the events:

“Meh, crap happens. It’s not at all abnormal to have a 30 day opt-out prior to renewal.”

Another Redditor put the original poster’s situation into perspective, commenting:

“Money grab? You know the rules…

Inform yourself. You should have cancelled sooner. Now be polite and ask them for a solution instead of ranting about it online.”

Read the original Reddit post:

UPDATE: WPEngine didn’t follow through on cancelling my account on Jan 3rd and I was charged for another year of service

Featured Image by Shutterstock/Nicoleta Ionescu

WordPress Popular Posts Plugin Vulnerability Affects 100k+ Sites via @sejournal, @martinibuster

An advisory has been issued about a high-severity WordPress vulnerability that makes it possible for attackers to inject arbitrary shortcodes into sites using the WordPress Popular Posts plugin. Attackers do not need a user account to launch an attack.

WordPress Popular Posts, installed on over 100,000 websites, enables websites to display the most popular posts within any given time period. It has been translated into sixteen languages to extend its use around the world, and it comes with caching features to improve performance and an admin console that allows website administrators to view popularity statistics.

WordPress Shortcode Vulnerability

Shortcodes are a feature that allows users to insert functionality into a web page by placing a predefined snippet within brackets, which automatically inserts a script that performs a function, such as adding a contact form with a shortcode that looks like this: [add_contact_form].

WordPress is gradually evolving away from shortcodes in favor of blocks with specific functionalities. The official WordPress developer site encourages plugin and theme developers to discontinue using shortcodes in favor of dedicated blocks, mainly because it’s a smoother workflow for a user to select and insert a block than to configure a shortcode within a plugin and then manually insert the shortcode into a web page.

WordPress advises:

“We would recommend people eventually upgrade their shortcodes to be blocks.”

The vulnerability discovered in the WordPress Popular Posts plugin is due to the implementation of the shortcode functionality, specifically a part called do_shortcode(), the WordPress function for processing and executing shortcodes, which requires input sanitization and other standard WordPress plugin and theme security practices.

According to an advisory published by Wordfence:

“The WordPress Popular Posts plugin for WordPress is vulnerable to arbitrary shortcode execution in all versions up to, and including, 7.1.0. This is due to the software allowing users to execute an action that does not properly validate a value before running do_shortcode. This makes it possible for unauthenticated attackers to execute arbitrary shortcodes.”

That part about “validating a value” generally means checking that what the user inputs (the “value”), such as the content of a shortcode, is safe and conforms to expected inputs before being passed along for use by the website.
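As a conceptual illustration of that kind of validation (sketched in Python, not WordPress’s actual PHP code), a processor can check a shortcode name against an explicit allowlist before executing it. The shortcode names and the render_shortcode() helper below are hypothetical stand-ins.

```python
import re

# Only shortcodes on this explicit allowlist will be executed;
# anything else is left inert in the output.
ALLOWED_SHORTCODES = {"add_contact_form", "recent_posts"}

def render_shortcode(tag):
    # Stand-in for a real shortcode handler (do_shortcode in WordPress
    # dispatches to registered handlers); here we return a marker string.
    return f"<rendered:{tag}>"

def safe_process(text):
    """Replace [tag] tokens with rendered output, but only for
    allowlisted tags; unknown tags pass through unexecuted."""
    def substitute(match):
        tag = match.group(1)
        if tag in ALLOWED_SHORTCODES:
            return render_shortcode(tag)
        return match.group(0)  # leave unrecognized shortcodes as-is
    return re.sub(r"\[([a-z_]+)\]", substitute, text)

print(safe_process("Contact us: [add_contact_form]"))  # executed
print(safe_process("Attack: [malicious_shortcode]"))   # left inert
```

The key point is that the check happens before execution, so arbitrary attacker-supplied shortcode names never reach the rendering step.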

Official Plugin Changelog

A changelog documents what’s being updated, giving plugin users an opportunity to understand what is changing and to decide whether to update their installation, which is why transparency is important.

The WordPress Popular Posts plugin is responsibly transparent in their documentation of the update.

The plugin changelog advises:

“Fixes a security issue that allows unintended arbitrary shortcode execution (props to mikemyers and the Wordfence team!)”

Recommended Actions

All versions of the WordPress Popular Posts plugin up to and including version 7.1.0 are vulnerable. Wordfence recommends updating to the latest version of the plugin, 7.2.0.

Read the official Wordfence advisory:

WordPress Popular Posts <= 7.1.0 – Unauthenticated Arbitrary Shortcode Execution

Featured Image by Shutterstock/GrandeDuc

WordPress Backup Plugin Vulnerability Affects 3+ Million Sites via @sejournal, @martinibuster

A high-severity vulnerability in a popular WordPress backup plugin can be exploited by unauthenticated attackers. The vulnerability is rated 8.8 on a scale of 0.0 to 10.

UpdraftPlus: WP Backup & Migration Plugin

The vulnerability affects the popular Updraft Plus WordPress plugin, installed on over 3 million websites. Updraft Plus comes in free and paid versions that allow users to upload backups to cloud storage or email the files. The plugin lets users back up the website manually or schedule automatic backups. It offers a tremendous amount of flexibility in what can be backed up, can make a huge difference when recovering from a catastrophic server issue, and is also useful for migrating to a different server altogether.

Wordfence explains the vulnerability:

“The UpdraftPlus: WP Backup & Migration Plugin plugin for WordPress is vulnerable to PHP Object Injection in all versions up to, and including, 1.24.11 via deserialization of untrusted input in the ‘recursive_unserialized_replace’ function. This makes it possible for unauthenticated attackers to inject a PHP Object.

No known POP chain is present in the vulnerable software. If a POP chain is present via an additional plugin or theme installed on the target system, it could allow the attacker to delete arbitrary files, retrieve sensitive data, or execute code. An administrator must perform a search and replace action to trigger the exploit.”

The Updraft Plus changelog seems to minimize the vulnerability; it doesn’t even call the update a security patch, labeling it a “tweak” instead.

From the official Updraft Plus WordPress plugin changelog:

“TWEAK: Complete the review and removal of calls to the unserialize() PHP function allowing class instantiation begun in 1.24.7. (The final removal involved a theoretical security defect, if your development site allowed an attacker to post content to it which you migrated to another site, and which contained customised code that could perform destructive actions which the attacker knew about, prior to you then cloning the site. The result of this removal is that some search-replaces, highly unlikely to be encountered in practice, will be skipped).”

Updraft Plus Vulnerability Patched

Users are recommended to consider updating their installations of Updraft Plus to the latest version, 1.24.12. All versions prior to the latest version are vulnerable.

Read the Wordfence advisory:

UpdraftPlus: WP Backup & Migration Plugin <= 1.24.11 – Unauthenticated PHP Object Injection

Featured Image by Shutterstock/Tithi Luadthong

Google Shows How To Confirm Indexing Issues Due To JavaScript via @sejournal, @martinibuster

SearchNorwich recently published an excellent video featuring Google’s Martin Splitt discussing how to debug crawling and indexing issues related to JavaScript. Splitt says that most of the time it’s not JavaScript causing the indexing issues; the actual cause is something else. Even if you don’t know how to code in JavaScript, the tips Martin shares will give anyone a good start on debugging crawl issues originating on a website.

JavaScript Is Rarely The Cause Of SEO Issues

Martin’s SearchNorwich video was published a month ago. Just a few days ago, John Mueller advised that too much JavaScript can have a negative impact on SEO, which aligns with Martin’s assertion that JavaScript itself is rarely the reason for SEO issues; it’s either the misuse of JavaScript or something else entirely.

He explains that virtually all of the suspected JavaScript issues that get emailed to him end up being something else. He pins the blame on a flawed approach to debugging SEO issues. What he describes is confirmation bias: suspecting that something is the cause and then looking for clues to justify that opinion. Confirmation bias is the tendency to interpret existing evidence, or to seek out new evidence, in a way that confirms existing beliefs while ignoring evidence that contradicts them.

Martin explained:

“…it seems to me, as someone on the Google side of things, that SEOs look for clues that allow them to blame things they’re seeing on JavaScript. Then they show up, or someone from their team shows up, in my inbox or on my social media and says, “We found a bug. It’s JavaScript. You say JavaScript works in Google Search, but we have a strong hint that it doesn’t, and you know it’s because of JavaScript.”

He goes on to say that out of hundreds of times a year that he’s approached with a diagnosis that JavaScript is to blame for an SEO problem he has only seen one actual instance where an actual bug related to JavaScript was to blame. Just one.

He also says:

“People often claim, “You say it works if you use client-side rendering, but clearly, it is not working. It must be a JavaScript problem and maybe even a bug in Google.” Surprisingly, many of the people who end up in my inbox suspect it’s a Google bug. I find that interesting, especially when a small, niche website claims to be affected by a bug that doesn’t affect any other websites. Most of the time, it’s not us—it’s you.”

Splitt explains that when JavaScript is involved in a crawling or rendering issue, it’s most often not because JavaScript is to blame but rather because it’s being used incorrectly.

Finding Source Of Rendering Issues

Martin suggests debugging rendering issues by checking how Google “sees” the web page. Rendering, in the context of Googlebot crawling, is the process of downloading all the resources of a web page, such as fonts, JavaScript, CSS, and HTML, and then creating a fully functional web page that’s similar to what a human user would experience in a web browser.

Debugging how Google renders a page may show that the page renders fine, that certain parts don’t render or that the page cannot be indexed at all.

He recommends using the following tools for debugging possible JavaScript issues:

1. Google Search Console URL Inspection Tool

2. Google Rich Results Test

3. Chrome Dev Tools

Easy JavaScript Debugging

Both of the first two tools let you submit a URL that gets immediately crawled by Google and they’ll show you the rendered page, what the page looks like for Google for indexing purposes.

Martin explains the usefulness of the JavaScript console messages in Chrome Dev Tools:

“There’s also more info that gives you very helpful details about what happened in the JavaScript console messages and what happened in the network. If your content is there and it’s what you expect it to be, then it’s very likely not going to be JavaScript that is causing the problem. If people were doing just that, checking these basics, 90% of the people showing up in my inbox would not show up in my inbox. That’s what I do.”

He also explained that just because the JavaScript console flags an error that doesn’t mean that the problem is with the JavaScript itself. He uses the example of an error in how JavaScript failed to execute that was caused by an API that’s blocked by Robots.txt, preventing the page from rendering.
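That robots.txt scenario is straightforward to check. Here is a minimal Python sketch, using the standard library’s robotparser, that tests whether a resource URL a page fetches via JavaScript is blocked for Googlebot; the robots.txt rules and URLs are made-up examples.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that blocks an API path for all crawlers.
# In practice you would fetch the live file with parser.set_url(...)
# and parser.read(), but parsing a string keeps this sketch offline.
robots_txt = """\
User-agent: *
Disallow: /api/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A JS-rendered page might depend on the first URL; if Googlebot
# can't fetch it, the rendered page may be missing content.
for url in ("https://example.com/api/content", "https://example.com/page"):
    allowed = parser.can_fetch("Googlebot", url)
    print(url, "allowed" if allowed else "blocked by robots.txt")
```

If a resource a page depends on comes back blocked, the fix is in robots.txt, not in the JavaScript itself, which is exactly the kind of misdiagnosis Splitt describes.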

Why Do So Many SEOs Blame JavaScript?

Martin implies that not knowing how to debug JavaScript is the reason it has earned a reputation as a cause of crawling and indexing issues. I get it; I learned the basics of coding JavaScript by hand 25 years ago, and I disliked it then as much as I do now. It’s never been my thing.

But Martin’s right that knowing a few tricks for debugging JavaScript will save a lot of wasted time chasing down the wrong problem.

Watch Martin Splitt’s presentation here:

Maybe It Isn’t JavaScript – Martin Splitt at SearchNorwich 18

Featured Image by Shutterstock/Artem Samokhvalov