Google Clarifies Site Reputation Abuse Policy via @sejournal, @MattGSouthern

Google has issued new clarification for its site reputation abuse policy, first launched earlier this year, which targets “parasite SEO” practices where websites leverage established domains to manipulate search rankings through third-party content.

Chris Nelson from the Google Search Quality team states:

“We’ve heard very clearly from users that site reputation abuse – commonly referred to as ‘parasite SEO’ – leads to a bad search experience for people, and today’s policy update helps to crack down on this behavior.”

Policy Clarification

The updated policy states that using third-party content to exploit a site’s ranking signals violates Google’s guidelines, regardless of first-party involvement or oversight.

This clarification comes after Google’s review of various business arrangements, including white-label services, licensing agreements, and partial ownership structures.

The updated policy language states:

“Site reputation abuse is the practice of publishing third-party pages on a site in an attempt to abuse search rankings by taking advantage of the host site’s ranking signals.”

Policy Details

What’s A Violation?

Google outlines several examples of policy violations, including:

  • Educational sites hosting third-party payday loan reviews
  • Medical sites publishing unrelated content about casino reviews
  • Movie review sites featuring content about social media services
  • Sports websites hosting third-party supplement reviews without editorial oversight
  • News sites publishing coupon content from third parties without proper involvement

What’s Not A Violation?

Google acknowledges there’s a difference between abusive practices and legitimate third-party content.

Acceptable examples include:

  • Wire service and syndicated news content
  • User-generated content on forum websites
  • Editorial content with close host site involvement
  • Properly disclosed advertorial content
  • Standard advertising units and affiliate links

Background

Enforcement of the site reputation abuse policy began in May.

The rollout is having a notable impact in the news and publishing industry, as documented by Olga Zarr.

Major organizations including CNN, USA Today, and LA Times were among the first to receive manual penalties, primarily for hosting third-party coupon and promotional content.

Glenn Gabe’s early observations point to a clear recovery pattern: sites that removed the offending content or implemented noindex tags on affected sections have started seeing their manual actions lifted. However, ranking recovery takes time, as Google’s crawlers need to process these changes.
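For publishers pursuing that recovery path, a section can be noindexed with a meta robots tag or an X-Robots-Tag response header. Below is a minimal sketch of the header approach; the Express server and the /coupons path are illustrative assumptions, not details taken from Google or the affected sites:

```typescript
// Send an X-Robots-Tag header so search engines drop a third-party
// section (here, a hypothetical /coupons path) from their index.
import express from 'express';

const app = express();

// Equivalent to adding <meta name="robots" content="noindex"> to each page.
app.use('/coupons', (_req, res, next) => {
  res.set('X-Robots-Tag', 'noindex');
  next();
});

app.get('/coupons/:slug', (req, res) => {
  res.send(`Coupon page: ${req.params.slug}`); // placeholder content
});

app.listen(3000);
```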

Looking Ahead

While enforcement relies on manual actions, Google has indicated plans for algorithmic updates to automate the detection and demotion of site reputation abuse, though no specific timeline has been announced.

Site owners found in violation will receive notifications through Search Console and can submit reconsideration requests.


Featured Image: JarTee/Shutterstock

Google Rolls Out AI-Powered In-Store Shopping Tools via @sejournal, @MattGSouthern

Google announced new features for in-store shopping and expanded payment options, marking changes to its retail technology offerings.

Key Updates

Google Lens

Google Lens, which reportedly processes 20 billion searches monthly, will enable users to photograph products in stores to find price comparisons and reviews.

The system uses Google’s product database of over 45 billion listings and its Gemini AI models.

Google’s announcement states:

“This new update is made possible by major advancements in our AI image recognition technology. It’s powered by the Shopping Graph’s 45 billion+ product listings, in-stock inventory data from a range of retailers and our Gemini models to bring you an entirely new way to shop in-store.”

Internal research cited by the company suggests that 72% of Americans use smartphones while shopping in physical stores.

The feature will initially be launched for beauty products, toys, and electronics at participating retailers in the United States.

Users must opt into location sharing through the Google app on Android or iOS to access the functionality.

In a related development, Google Maps will incorporate product search capabilities, allowing users to locate specific items at nearby stores.

Security Measures

Google also announced plans to test new fraud detection services for merchants.

The system aims to better identify fraudulent transactions while reducing false positives that may block legitimate purchases.

Google explains:

“We’re always working to protect consumers and businesses from fraud, which is forecasted to grow substantially in the coming years. Soon we’ll begin piloting a service to help merchants better identify fraudulent transactions and help prevent fraudsters from using stolen financial information. This will also help unblock good transactions that may be mistaken as fraud.”

Looking Ahead

The announcements come as retailers prepare for increased holiday shopping activity.

According to company statements, the features are expected to roll out gradually over the coming weeks.

The timing coincides with broader industry efforts to integrate AI technology into retail experiences while addressing growing concerns about payment security.

Google Analytics 4 (GA4) Users Report Data Collection Issues via @sejournal, @MattGSouthern

Google Analytics 4 (GA4) users report data collection issues affecting websites globally, with many experiencing up to 50% drops in reported traffic since November 13.

The problem has sparked discussions across Google’s support forums and social media platforms.

Key Issues

Multiple website owners have documented discrepancies between GA4 reports and actual traffic levels.

While GA4 shows reduced numbers, cross-referencing with Google Search Console and other analytics platforms confirms normal traffic levels.

One user explained the severity of the issue:

“The incomplete data is there since 13th November which shows only 4445 users when in actual (looking at Search and Discover in GSC), I am calculating more than 13,000 users (at least).”

Real-time tracking appears unaffected, suggesting the issue impacts historical data.

Technical Details

Investigations reveal that complete data still flows to BigQuery for users with connected accounts.

However, this only provides a partial solution, as many GA4 users don’t utilize BigQuery integration.
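For properties that do have the export connected, the raw event tables offer a way to sanity-check GA4’s reported numbers. A minimal sketch, assuming the standard GA4 BigQuery export schema; the project ID and the analytics_123456789 dataset name are placeholders:

```typescript
// Cross-check daily user counts against the raw GA4 BigQuery export.
import { BigQuery } from '@google-cloud/bigquery';

async function dailyUsersSinceNov13(): Promise<void> {
  const bigquery = new BigQuery({ projectId: 'my-project' });
  const query = `
    SELECT
      event_date,
      COUNT(DISTINCT user_pseudo_id) AS users
    FROM \`my-project.analytics_123456789.events_*\`
    WHERE _TABLE_SUFFIX BETWEEN '20241113' AND '20241120'
    GROUP BY event_date
    ORDER BY event_date`;

  const [rows] = await bigquery.query({ query });
  for (const row of rows) {
    // Compare these counts with what the GA4 reporting UI shows.
    console.log(`${row.event_date}: ${row.users} users`);
  }
}

dailyUsersSinceNov13().catch(console.error);
```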

The timing coincides with Google’s mid-November attribution system updates, though no direct connection has been confirmed.

Affected metrics include:

  • Overall traffic volumes
  • Channel attribution data
  • Landing page metrics
  • Event tracking

Site owners from multiple countries, including Taiwan and various European regions, report identical patterns of data loss beginning November 13:

“Taiwan is experiencing the same issue. On 11/13, there was a sudden drop in traffic, and from 11/14 to 11/17, it decreased by 20-30% compared to the same period last month.”

Users note that while their real-time analytics show expected traffic levels, historical data since November 13 reflects only about half of their actual visitor numbers:

“I usually track the data from the day before yesterday on the current day. However, there’s only nearly 50% traffic on my website. Just want to know is there anyone with the same situation as me?”

Why This Matters

This disruption poses challenges for organizations relying on GA4 for business intelligence and reporting.

Many companies face difficulties in performance analysis and decision-making processes without accurate historical data.

Despite numerous support threads and community discussions, Google hasn’t officially addressed the situation or indicated whether the missing data will be retroactively restored to affected accounts.

We will continue to monitor this situation and provide updates as information becomes available.


Featured Image: MacroEcon/Shutterstock

Google’s AI Search Experiment: “Learn About” via @sejournal, @martinibuster

Google has quietly introduced a new AI Search experiment called Learn About, which summarizes content and offers navigational menus to explore related subtopics. This new way of exploring content uses drill-down navigational menus called Interactive Lists; if the user scrolls down far enough, they will eventually find links to human-created content.

This new way of searching encourages exploration with an interface that continually presents additional summaries and links to human-created content. The experience resembles a children’s “choose your story” book, where the narrative shifts based on the reader’s decisions.

Google’s Learning Initiative

The Learn About AI Search is offered as part of Google Labs. It’s also a part of Google’s Learning Initiative. The Learning Initiative page offers links to Google Labs projects that are related to learning.

The Learning Initiative contains links to various projects:

  • Learn About
  • Shiffbot
  • Illuminate
  • NotebookLM
  • Pilot Program (early access to AI products for K-12 and higher education)
  • Experiments for Learning (AI learning tools that students can use to create songs or travel virtually to Mars)

The Google Learning Initiative page describes Learn About:

“Learn About
Grasp new topics and deepen understanding with this adaptable, conversational, AI-powered learning companion.”

Interactive List User Interface

Learn About’s Interactive List exploration menus are illustrated with images, which is appealing because humans are visually oriented. That makes it faster to comprehend the written content because the image reinforces the text.

The images in the interactive menu appear to be licensed from stock image providers like Shutterstock, Adobe, and Alamy. None of the images appear to be sourced from creator websites.

Screenshot Of Interactive List Navigational Menu

Questions trigger a summary and a drill-down navigational menu called an Interactive List. These search results lead to related topics, progressively more granular summaries, and further Interactive Lists.

Beneath the Interactive Lists is a section called “Explore related content” that offers links to actual human-created content, such as YouTube videos and website content.

Beneath the links to creator content is a group of buttons labeled with options to Simplify, Go deeper, or Get images. Beneath those three choices are speech balloons with additional search queries on related topics.

Screenshot Of Explore Related Content Section

There is also a left-hand navigational menu with an invitation to explore using the Interactive List menu.

Screenshot Of Left-Hand Navigation

Availability Of Learn About

Learn About is only available to users who are 18 or older in the United States, and only in English.

Interestingly, it also answers questions in Spanish but then quickly erases the Spanish answer and replaces it with a statement that it doesn’t speak that language yet. However, if you ask a question in English followed by another in Spanish, it may answer the Spanish question in English and provide links to Spanish-language, human-created content, as shown in the screenshot below.

Learn About will also understand a query that contains a typo. The query below misspelled the word “comer,” dropping the letter “r.”

The Spanish language query I tried was “Es posible a comer el ojo de un pescado” which means, “is it possible to eat the eye of a fish?”

Screenshot Of Spanish Language Query In Learn About

Privacy Controls

Google’s Learn About has privacy controls that are explained in a consent form that must be agreed to before using Learn About.

It contains information about how Google handles questions, a warning not to ask questions of a personal or private nature, and details about managing the information saved by Learn About. It also says that human reviewers may access information shared with Learn About, but that it will be stripped of identifying information.

The consent agreement explains:

“Google stores your Learn About activity with your Google Account for up to 18 months.

You can choose to delete your Learn About data any time by clicking the settings button next to your Google account profile photo in Learn About and then choosing “Delete activity”.

To help with quality and improve our products (such as generative machine-learning models that power Learn About), human reviewers read, annotate, and process your Learn About conversations. We take steps to protect your privacy as part of this process. This includes disconnecting your conversations with Learn About from your Google Account before reviewers see or annotate them.

Please don’t enter confidential information in your conversations or any data you wouldn’t want a reviewer to see or Google to use to improve our products, services, and machine-learning technologies.”

Google Learn About And SEO

There is no hint about whether this will eventually be integrated into Google Search. Given that it’s a part of Google’s Learning Initiative it’s possible that it could become a learning-only tool.

Try Learn About, an experimental project of Google Labs.

Featured Image by Shutterstock/Cast Of Thousands

WP Engine Escalates Legal Battle With Automattic and Mullenweg via @sejournal, @martinibuster

WP Engine escalated its Federal complaint by citing Automattic’s publication of the WP Engine Tracker website as evidence of intent to harm WP Engine and to expose customers to potential cybercrime. The updated complaint incorporates recent actions by Mullenweg to further strengthen its case.

A spokesperson for WP Engine issued a statement to Search Engine Journal about the WP Engine Tracker website:

“Automattic’s wrongful and reckless publication of customer’s information without their consent underscores why we have moved for a preliminary injunction. WP Engine has requested the immediate takedown of this information and looks forward to the November 26th hearing on the injunction.”

Legal Complaint Amended With More Evidence

WP Engine (WPE) filed a complaint in Federal court seeking a preliminary injunction to prevent Matt Mullenweg and Automattic from continuing actions that harm WPE’s business and its relationships with its customers. That complaint was amended with further details to support the allegations against Mullenweg and Automattic.

The legal complaint begins by stating in general terms what gives rise to their claim:

“This is a case about abuse of power, extortion, and greed.”

It then grows progressively specific by introducing evidence of how Automattic and Mullenweg continue their “bad acts unabated” for the purpose of harming WP Engine (WPE).

The amended claim adds the following, quoting Mullenweg himself:

“Since then, Defendants have continued to escalate their war, unleashing a campaign to steal WPE’s software, customers and employees. Indeed, just days ago, Defendants were unambiguous about their future plans:”

This is the statement Mullenweg made that is quoted in the amended complaint:

“[S]ince this started [with WPE] they’ve had uh, we estimate tens of thousands of customers leave. . . . So, um you know, I think over the next few weeks, they’re actually gonna lose far more than 8% of their business . . . we’re at war with them. We’re . . . going to go brick by brick and take . . . take every single one of their customers . . . if they weren’t around guess what? . . . We’d happily have those customers, and in fact we’re getting a lot of them.”

WP Engine Tracker Site Used As Evidence

Automattic recently created a website called WP Engine Tracker, on the WordPressEngineTracker.com domain, that encourages WP Engine customers to leave, linking to promotions that offer discounts and promise a smooth transition to other web hosts.

WPE states that the WP Engine Tracker website is part of a campaign to encourage WPE customers to abandon it, writing:

“Defendants also created a webpage at wordpress.org offering “Promotions and Coupons” to convince WPE customers to stop doing business with WPE and switch over to Automattic’s competitor hosting companies like wordpress.com and Pressable; they later added links to other competitors as well.”

The WP Engine Tracker website calls attention to the number of sites that have abandoned WP Engine (WPE) since September 21st, the day after Matt Mullenweg’s public denunciation of WP Engine and the start of his “nuclear” war against the web host. The amended Federal lawsuit points to the September 21st date listed on that site as additional evidence tying Automattic to a campaign to harm WP Engine’s business.

The legal document explains:

“Just last week, in an apparent effort to brag about how successful they have been in harming WPE, Defendants created a website—www.wordpressenginetracker.com—that “list[s] . . . every domain hosted by @wpengine, which you can see decline every day. 15,080 sites have left already since September 21st.

September 21 was not selected randomly. It is the day after Defendants’ self-proclaimed nuclear war began – an admission that these customer losses were caused by Defendants’ wrongful actions. In this extraordinary attack on WPE and its customers, Defendants included on their disparaging website a downloadable file of ‘all [WPE] sites ready for a new home’—that is, WPE’s customer list, literally inviting others to target and poach WPE’s clients while Defendants’ attacks on WPE continued.”

The purpose of the above allegations is to build evidence lending credence to WP Engine’s claim that Automattic is actively trying to harm WP Engine’s business.

WPE Accuses Automattic Of Additional Harms

Another new allegation against Automattic is that the spreadsheet offered for download on the WP Engine Tracker website includes sensitive information that is not publicly available and could cause direct harm to WPE customers.

The amended Federal lawsuit explains:

“Worse, this downloadable file contains private information regarding WPE’s customers’ domain names, including development, test, and pre-production servers—many of which are not intended to be accessed publicly and contain sensitive or private information. Many of these servers are intentionally not indexed or otherwise included in public search results because the servers are not safe, secure or production-ready and not intended to be accessed by the general public.

By disclosing this information to the general public, Defendants put these development, test, and pre-production domains at risk for hacking and unauthorized access.”

WP Engine Tracker Site Part Of A Larger Strategy

WPE’s amended complaint alleges that the WP Engine Tracker site is one part of a larger strategy to harm WP Engine’s business, one that also includes encouraging WPE employees to resign.

The updated document adds the following new allegations as evidence of WPE’s claims:

“Not content with interfering with WPE’s customer relations, Automattic has recently escalated its tactics by actively recruiting hundreds of WPE employees, in an apparent effort to weaken WPE by sowing doubts about the company’s future and enticing WPE’s employees to join Automattic:”

The document includes a screenshot of an email solicitation apparently sent to an employee that encourages them to join Automattic.

Screenshot Of Evidence Presented In Amended Complaint

Escalation Of Federal Complaint

WP Engine’s amended complaint against Mullenweg and Automattic invokes the Sherman Act (prohibiting monopolization to maintain a competitive marketplace), the Lanham Act (governing trademarks, false advertising, and unfair competition), and the Computer Fraud and Abuse Act (addressing unauthorized computer access and cybercrimes). The amendments tie recent actions by Mullenweg and Automattic—such as the creation of the WP Engine Tracker website—directly to their claims, turning Mullenweg’s attacks on WP Engine into evidence.

Read the amended Federal complaint here: (PDF).

Featured Image by Shutterstock/chaiyapruek youprasert

6 Web Hosts Ranked By Core Web Vitals: One Outperforms All via @sejournal, @martinibuster

HTTPArchive is offering a new technology comparison dashboard, currently in beta testing. Users can now view real-world web hosting performance scores for Core Web Vitals. We compare six web hosts and find one that consistently performs better across nearly all metrics.

About HTTPArchive

HTTPArchive tracks websites through crawling and with data collected in the Chrome User Experience Report (CrUX). It publishes reports about the technologies that power websites, including Core Web Vitals performance of content management systems like WordPress and Wix.

New Technology Comparison Dashboard – Beta

HTTPArchive has new reports under development, one of which compares Core Web Vitals and Lighthouse performance scores by web host. HTTPArchive also tracks the median page weight by web host; that report, too, is still in beta testing.

The new reports allow comparisons by web host. There isn’t data yet for many web hosts, but there is for the following six. Comparing web hosts by Core Web Vitals is not an entirely fair comparison: a web host like IONOS might host many thousands of small, local sites that aren’t resource intensive.

So with those caveats, here are the six web hosts under comparison:

  1. Bluehost
  2. GoDaddy
  3. HostGator
  4. IONOS
  5. SiteGround
  6. WP Engine

Core Web Vitals By Web Host

The following list ranks the web hosts by the percentage of sites hosted at each that pass Core Web Vitals. HTTPArchive says this report is still under development and, as previously mentioned, the percentages don’t necessarily reflect the quality of the web hosts themselves, but rather the quality of the sites hosted there.

This is the description of the CWV metric scores:

“Passes Core Web Vitals
The percentage of origins passing all three Core Web Vitals (LCP, INP, CLS) with a good experience. Note that if an origin is missing INP data, it’s assessed based on the performance of the remaining metrics.”

However, it’s interesting to see that the number one web host is a managed WordPress host, as that may indicate the platform itself is better optimized than a general web host. The following scores are based on a snapshot taken at the beginning of September.

Core Web Vitals Scores In Descending Order

  • WP Engine 70%
  • GoDaddy 67%
  • SiteGround 65%
  • HostGator 58%
  • IONOS 58%
  • Bluehost 45%
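These percentages aggregate CrUX field data, and site owners can query the same dataset for a single origin through the CrUX API. A minimal sketch; the API key and the origin are placeholders:

```typescript
// Query the Chrome UX Report (CrUX) API for an origin's Core Web Vitals.
const API_KEY = process.env.CRUX_API_KEY ?? '';

async function fetchCoreWebVitals(origin: string): Promise<void> {
  const response = await fetch(
    `https://chromeuxreport.googleapis.com/v1/records:queryRecord?key=${API_KEY}`,
    {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ origin }),
    },
  );
  if (!response.ok) throw new Error(`CrUX request failed: ${response.status}`);

  const { record } = await response.json();
  // p75 is the value used to judge whether an experience is "good."
  for (const name of [
    'largest_contentful_paint',
    'interaction_to_next_paint',
    'cumulative_layout_shift',
  ]) {
    const metric = record.metrics[name];
    if (metric) console.log(name, metric.percentiles.p75);
  }
}

fetchCoreWebVitals('https://www.example.com').catch(console.error);
```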

Largest Contentful Paint (LCP)

LCP measures the perceived page loading speed, how fast the page appears to load for a site visitor.

HTTPArchive defines this metric:

“Largest Contentful Paint (LCP) is an important, stable Core Web Vital metric for measuring perceived load speed because it marks the point in the page load timeline when the page’s main content has likely loaded—a fast LCP helps reassure the user that the page is useful. Good experiences are less than or equal to 2.5 seconds.”

WP Engine again comes out on top, perhaps indicating the quality of the sites hosted on that platform as well as the performance optimizations that are a key element of that web host.

LCP Scores In Descending Order

  • WP Engine 79%
  • GoDaddy 78%
  • SiteGround 75%
  • HostGator 69%
  • IONOS 69%
  • Bluehost 52%

Cumulative Layout Shift (CLS)

HTTPArchive also provides a comparison of the six web hosts by the CLS score. CLS measures how much a web page shifts around as it’s rendered in a web browser. A score of 0.1 or less for 75% of visitors is recommended. The percentages for each of the web hosts were all higher than the 75% minimum. This time WP Engine is tied for first place with HostGator.

CLS Scores In Descending Order

  • WP Engine 88%
  • HostGator 88%
  • Bluehost 87%
  • SiteGround 86%
  • IONOS 85%
  • GoDaddy 84%

First Contentful Paint (FCP)

FCP measures how long it takes for content to become visible. A low FCP means that content renders quickly. The number one ranked web host for FCP turns out to be GoDaddy, ahead by a significant margin of six points. WP Engine comes in second, followed by SiteGround.

FCP Scores In Descending Order

  • GoDaddy 73%
  • WP Engine 67%
  • SiteGround 62%
  • IONOS 60%
  • HostGator 57%
  • Bluehost 39%

Time To First Byte (TTFB)

TTFB measures how long it takes for a browser to receive the first byte of a resource after requesting it. GoDaddy again tops the list.

TTFB In Descending Order

  • GoDaddy 59%
  • IONOS 45%
  • WP Engine 39%
  • HostGator 38%
  • SiteGround 37%
  • Bluehost 25%

Interaction to Next Paint (INP)

This metric represents the overall responsiveness of the entire web page.

HTTPArchive explains what this score means:

“INP is a metric that assesses a page’s overall responsiveness to user interactions by observing the latency of all click, tap, and keyboard interactions that occur throughout the lifespan of a user’s visit to a page. The final INP value is the longest interaction observed, ignoring outliers. A good experience is less than or equal to 200ms.”

The scores are the percentage of pages that provide a good INP experience. WP Engine is back on top for INP but the other five web hosts are not far behind.

INP Scores In Descending Order

  • WP Engine 95%
  • SiteGround 94%
  • Bluehost 92%
  • GoDaddy 90%
  • HostGator 89%
  • IONOS 88%
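To see how your own pages score on this metric, Google’s open-source web-vitals library reports INP (and the other Core Web Vitals) from real visits. A minimal sketch, assuming a /vitals collection endpoint on your own server:

```typescript
// Measure Core Web Vitals in the browser and beacon them to your server.
import { onCLS, onINP, onLCP, type Metric } from 'web-vitals';

function sendToAnalytics(metric: Metric): void {
  const body = JSON.stringify({
    name: metric.name,     // 'CLS', 'INP', or 'LCP'
    value: metric.value,   // e.g., milliseconds for INP
    rating: metric.rating, // 'good' | 'needs-improvement' | 'poor'
  });
  navigator.sendBeacon('/vitals', body); // placeholder endpoint
}

onCLS(sendToAnalytics); // good CLS is <= 0.1
onINP(sendToAnalytics); // good INP is <= 200ms
onLCP(sendToAnalytics); // good LCP is <= 2.5s
```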

Lighthouse Performance Score

Lighthouse is an open-source auditing tool that scores web pages for performance, SEO, and other metrics. The performance scores for the six web hosts are fairly close to each other, clustering on either side of a score of 40.

This is HTTPArchive’s description of this score:

“In general, only metrics contribute to your Lighthouse Performance score, not the results of Opportunities or Diagnostics.”

Interestingly, HostGator ranks the highest for the Lighthouse Performance score, with GoDaddy and IONOS tied for second place. The other three tied for third place, one point behind. Nevertheless, HostGator was the clear winner for the Lighthouse Performance metric.

Lighthouse Performance Scores

  • HostGator 43
  • GoDaddy 40
  • IONOS 40
  • Bluehost 39
  • SiteGround 39
  • WP Engine 39

HostGator, which placed mid-pack for Core Web Vitals, scores at the top of the list for the Lighthouse Performance metric, while WP Engine is clustered with two other web hosts at 39 points.

Lighthouse Accessibility Scores

The accessibility scores are clustered similarly to the performance scores, on either side of a score of 85.

This is how HTTPArchive describes this metric:

“The Lighthouse Accessibility score is a weighted average of all accessibility audits. Weighting is based on axe user impact assessments. Each accessibility audit is pass or fail. Unlike the Performance audits, a page doesn’t get points for partially passing an accessibility audit.”

Accessibility Scores In Descending Order

  • GoDaddy 87
  • Bluehost 86
  • WP Engine 86
  • SiteGround 86
  • HostGator 85
  • IONOS 85

Lighthouse SEO Scores

The SEO scores were even more tightly clustered, with GoDaddy scoring the highest of the six web hosts under comparison.

HTTPArchive describes what the SEO Score is measuring:

“These checks ensure that your page is following basic search engine optimization advice. There are many additional factors Lighthouse does not score here that may affect your search ranking, including performance on Core Web Vitals.”

SEO Scores In Descending Order

  • GoDaddy 91
  • Bluehost 88
  • WP Engine 88
  • HostGator 88
  • IONOS 88
  • SiteGround 88

Lighthouse Best Practices Score

The last score is interesting because it measures whether the hosted sites are built with web development best practices. HTTPArchive doesn’t currently explain what those best practices are.

Here’s the description of this score:

“This ensures that your page is built using modern web development best practices.”

Best Practices Scores In Descending Order

  • Bluehost 79
  • HostGator 79
  • SiteGround 79
  • WP Engine 77
  • GoDaddy 77
  • IONOS 77

Takeaway

HTTPArchive is expanding what it measures. The performance dashboard is still in beta and under development, meaning it may have bugs but is ready for a public preview. It’s interesting to see a managed WordPress host come out on top. The scores will become more meaningful once more managed web hosts can be compared against one another. Nevertheless, this is a good start.

Visit the new dashboard here and provide your feedback to make it better.

Featured Image by Shutterstock/TierneyMJ

Microsoft’s AI SEO Tips: New Guidance For AI Search Optimization via @sejournal, @MattGSouthern

Microsoft has provided guidance on how to optimize content for AI-powered search engines.

This advice is timely now that OpenAI has launched ChatGPT Search, which uses Bing’s search index.

Understanding user intent is everything in this new era of search, Microsoft says:

“In the past, digital marketing strategies often relied heavily on demographic data and broad customer segments. But in this era of generative AI, the focus now shifts from who the customer is to what they are looking for—in real-time.”

Microsoft explains several ways websites can optimize content for AI-powered search.

AI SEO Recommendations

Intent-Based Content

Content should address the underlying purpose of user queries, Microsoft says:

“Focus on the intent behind the search query rather than just the keywords themselves. For example, if based on your keyword research, you find that users are searching for “how to choose eco-friendly coffee makers,” provide detailed, step-by-step guides rather than just general information.”

Natural Language Processing (NLP)

Websites should leverage NLP techniques to align content with how AI systems process and understand language.

Microsoft states:

“Generative engines, such as Bing Generative Search, deliver content to searchers by understanding and generating human language through Natural Language Processing (NLP). By analyzing vast amounts of text data to learn language patterns, context, and semantics, they’re able to provide relevant and accurate responses to user queries.”

Additionally, Microsoft emphasized the following sentence in italics:

“Leveraging these same NLP strategies in creating your content can optimize it to rank higher, increase its relevance, and enhance its authority, ultimately boosting its visibility and effectiveness.”

Strategic Keyword Implementation

To improve your website and landing pages for AI search engines, Microsoft recommends these keyword strategies:

  • Long-tail keywords for specific user interests
  • Conversational phrases matching natural speech patterns
  • Semantic keywords providing contextual relevance
  • Question-based keywords addressing common user queries

Freshness

Microsoft encourages keeping content updated and suggests using the IndexNow protocol to quickly notify search engines about website changes.

This helps maintain search rankings and ensures AI systems have the latest information.

Microsoft states:

“While it can be tempting to set it and forget it, AI systems depend on the latest, freshest information to determine the most relevant content to display to searchers. Regularly updating your content not only helps maintain your rankings but also keeps your audience engaged with current and valuable information. This practice can significantly influence how AI systems perceive and rank your website.”
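Acting on the IndexNow suggestion is a single HTTP request. A minimal sketch of the protocol’s JSON submission; the host, key, and URLs are placeholders, and the key file must be hosted on your own domain:

```typescript
// Notify IndexNow-enabled search engines (including Bing) of updated URLs.
async function submitToIndexNow(urls: string[]): Promise<void> {
  const response = await fetch('https://api.indexnow.org/indexnow', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json; charset=utf-8' },
    body: JSON.stringify({
      host: 'www.example.com',
      key: 'your-indexnow-key',
      keyLocation: 'https://www.example.com/your-indexnow-key.txt',
      urlList: urls,
    }),
  });
  if (!response.ok) {
    throw new Error(`IndexNow submission failed: ${response.status}`);
  }
}

submitToIndexNow(['https://www.example.com/updated-article']).catch(console.error);
```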

Why This Matters

ChatGPT Search now uses Bing’s index, making these optimization strategies vital for websites seeking better visibility in AI-powered searches.

While this can help you create more optimized content, Microsoft acknowledges there’s no “secret sauce” for AI search systems.

How To Get Indexed In ChatGPT Search

Refer to our article on ChatGPT search indexing to ensure your content is indexed in ChatGPT’s real-time search engine.

You can also watch the short video I recorded on this topic below:


Featured Image: jomel alos/Shutterstock

WordPress Security Plugin Vulnerability Endangers 4 Million+ Sites via @sejournal, @martinibuster

A critical vulnerability was discovered in a popular WordPress security plugin with over 4 million installations. The flaw allows attackers to log in as any user, including administrators, and gain that user’s full site-level permissions. Assigned a threat score of 9.8 out of 10, it underscores the ease of exploitation and the potential for full site compromise, including malware injection, unauthorized content changes, and attacks on site visitors.

Really Simple Security

Really Simple Security is a WordPress plugin developed to improve the resistance of WordPress sites against exploits (known as security hardening), enable two-factor authentication, detect vulnerabilities, and generate SSL certificates. One reason it promotes itself as lightweight is its modular design, which lets users choose which security enhancements to enable so that (in theory) the processes behind disabled capabilities don’t load and slow down the website. This is a popular trend in WordPress plugins: software that can do many things but only runs the tasks a user requires.

The plugin is promoted through affiliate reviews and, according to Google’s AI Overview, enjoys highly positive reviews. Over 97% of reviews on the official WordPress repository give it five stars, the highest possible rating, while less than 1% rate the plugin one star.

What Went Wrong?

A security flaw makes the plugin vulnerable to authentication bypass, a class of flaw that allows an attacker to access areas of a website that require a username and password without providing credentials. The vulnerability specific to Really Simple Security allows an attacker to gain access as any registered user of the website, including the administrator, simply by knowing the username.

This is called an unauthenticated access vulnerability, one of the most severe kinds of flaws because it is generally easier to exploit than an “authenticated” flaw, which requires an attacker to first obtain the username and password of a registered user.

Wordfence explains the exact reason for the vulnerability:

“The Really Simple Security (Free, Pro, and Pro Multisite) plugins for WordPress are vulnerable to authentication bypass in versions 9.0.0 to 9.1.1.1. This is due to improper user check error handling in the two-factor REST API actions with the ‘check_login_and_get_user’ function. This makes it possible for unauthenticated attackers to log in as any existing user on the site, such as an administrator, when the “Two-Factor Authentication” setting is enabled (disabled by default).

Wordfence blocked 310 attacks targeting this vulnerability in the past 24 hours.”
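To illustrate the class of bug Wordfence describes (an authentication helper whose error result is mishandled by the caller), here is a generic sketch. It is deliberately simplified and is not the plugin’s actual code:

```typescript
// Generic illustration of an authentication-bypass bug caused by
// improper error handling; not Really Simple Security's actual code.
type User = { id: number; name: string; role: string };
type AuthError = { error: string };

const users: User[] = [{ id: 1, name: 'admin', role: 'administrator' }];

// Returns either a valid user or an error object.
function checkLoginAndGetUser(name: string, otp: string): User | AuthError {
  const user = users.find((u) => u.name === name);
  if (!user) return { error: 'unknown user' };
  if (otp !== '123456') return { error: 'invalid one-time code' };
  return user;
}

function handleTwoFactorRequest(name: string, otp: string): void {
  const result = checkLoginAndGetUser(name, otp);
  // BUG: an error object is truthy too, so a failed check still "logs in."
  // A correct caller must reject the error case explicitly:
  // if ('error' in result) return;
  if (result) {
    console.log(`logged in as ${name}`);
  }
}

handleTwoFactorRequest('admin', 'wrong-code'); // prints "logged in as admin"
```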

Recommended Course Of Action:

Wordfence encourages users of the plugin to update to Really Simple Security version 9.1.2 or higher.

The Really Simple Security plugin’s changelog responsibly discloses the reason for the update:

“Changelog
9.1.2
security: authentication bypass”

Read the Wordfence security advisory:

Really Simple Security (Free, Pro, and Pro Multisite) 9.0.0 – 9.1.1.1 – Authentication Bypass

Featured Image by Shutterstock/Tithi Luadthong

Google Maps Launches Product Search: Steps For Retailers via @sejournal, @MattGSouthern

Google has updated Maps to include product-specific search capabilities, just in time for the holiday shopping season.

This feature allows consumers to search for specific items and find nearby stores with available inventory.

Key Features

The update lets you search for products directly within Google Maps and find real-time inventory availability at nearby stores.

Shoppers can find a wide range of items and have immediate access to local pickup options.

In an announcement, Google states:

“Need a last-minute gift? Now, you can search in Google Maps for items like pickleball rackets, board games, ice cream makers and more to find nearby stores that have them in stock so you can pick up what you need, stat. No empty stockings here!”

Implementation Guide For Retailers

Retailers should focus on three key components to succeed with Google Maps’ product search feature.

1. Structured Data Markup

Local inventory visibility depends on using Schema.org markup correctly.

The example below shows what’s needed for products to be visible on Google Maps:
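A minimal sketch of JSON-LD Product markup follows; the product details are placeholders, and Google’s structured data documentation lists the full set of required and recommended properties:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Pickleball Racket Pro",
  "image": "https://www.example.com/images/racket.jpg",
  "description": "Lightweight composite pickleball racket.",
  "sku": "sku-42",
  "brand": { "@type": "Brand", "name": "ExampleBrand" },
  "offers": {
    "@type": "Offer",
    "price": "49.99",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  }
}
</script>
```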


2. Inventory Management

To ensure this feature works well, accurate inventory tracking is important.

Follow these key steps to keep your data correct:

  • Regularly sync with Google Merchant Center.
  • Automate inventory updates (see the sketch after this list).
  • Update inventory at least once every 24 hours.
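As an illustration of automating those updates, below is a minimal sketch that posts a local-inventory update through the Content API for Shopping; the merchant ID, product ID, store code, and OAuth token are all placeholders:

```typescript
// Push a store-level inventory update to Google Merchant Center.
const MERCHANT_ID = '123456789';
const PRODUCT_ID = 'online:en:US:sku-42';
const ACCESS_TOKEN = process.env.GOOGLE_OAUTH_TOKEN ?? '';

async function updateLocalInventory(storeCode: string, quantity: number): Promise<void> {
  const url =
    `https://shoppingcontent.googleapis.com/content/v2.1/` +
    `${MERCHANT_ID}/products/${encodeURIComponent(PRODUCT_ID)}/localinventory`;

  const response = await fetch(url, {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${ACCESS_TOKEN}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      storeCode, // must match a store code in your Business Profile
      availability: quantity > 0 ? 'in stock' : 'out of stock',
      quantity,
    }),
  });

  if (!response.ok) {
    throw new Error(`Inventory update failed: ${response.status}`);
  }
}

// Example: run after each point-of-sale sync, at least once every 24 hours.
updateLocalInventory('store-001', 7).catch(console.error);
```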

3. Store Location Integration

Accurate store location information helps customers find you after discovering your products.

Here are the key elements to focus on:

  • Ensure your Google Business Profile is correct.
  • Keep your geocoding information accurate.
  • Update your operating hours regularly.
  • Adjust pickup availability settings as needed.

Looking Ahead

For retailers aiming to make the most of this feature, here are the recommended steps to follow:

  1. Verify the setup of your Google Merchant Center.
  2. Implement the necessary structured data.
  3. Test the transmission of your data.
  4. Monitor performance using Google Search Console.
  5. Regularly update your inventory feeds.

This update can help drive more customers to physical stores during the holiday shopping season and beyond.

For more information, Google provides detailed guides in the Merchant Center help section.


Featured Image: Rokas Tenys/Shutterstock