Chrome To Warn Users Before Loading HTTP Sites Starting Next Year via @sejournal, @MattGSouthern

Google Chrome will enable “Always Use Secure Connections” by default with the release of Chrome 154 in October 2026, the company announced.

The change means Chrome will ask for user permission before loading any public website that doesn’t use HTTPS encryption. Users will see a bypassable warning explaining the security risks of unencrypted connections.

Google is rolling out the feature in stages. Chrome 147 will enable it for over 1 billion Enhanced Safe Browsing users in April 2026. All Chrome users will get it by default six months later.

What’s Changing

Public Site Warning

The warning system applies exclusively to public websites. Chrome excludes private sites including local IP addresses, single-label hostnames, and internal shortlinks.

Chris Thompson and the Chrome Security Team wrote:

“HTTP navigations to private sites can still be risky, but are typically less dangerous than their public site counterparts because there are fewer ways for an attacker to take advantage of these HTTP navigations.”

Here’s an example of what the warning will look like:

Image Credit: Google

Warning Frequency

Chrome limits how often users see warnings for the same sites. The browser won’t repeatedly warn about regularly visited insecure sites.

Testing data shows the median user sees fewer than one warning per week. The 95th percentile user sees fewer than three warnings per week.

Current HTTPS Adoption

HTTPS usage has plateaued at 95-99% of Chrome navigations across platforms. When excluding private sites, public HTTPS usage reaches 97-99% on most platforms.

Windows shows 98% HTTPS on public sites. Android and Mac exceed 99%. Linux reaches nearly 97%.

Why This Matters

You face security risks when clicking HTTP links. Attackers can hijack unencrypted navigations to load malware, exploitation tools, or phishing content.

Google’s transparency report shows HTTPS adoption stalled after rapid growth from 2015-2020. The remaining 1-5% of insecure traffic represents millions of navigations that create attack opportunities.

Website owners running HTTP-only sites have one year to migrate before Chrome warns their visitors.

You can enable “Always Use Secure Connections” today at chrome://settings/security to test how the warnings affect your site traffic.
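
If you run a site and want to check where you stand before the deadline, a quick script can confirm whether your key URLs already end up on HTTPS. The sketch below is a minimal example, assuming Python with the requests library installed; the URLs are placeholders for your own pages.

```python
# Rough sketch: check whether key URLs end up on HTTPS, so you know if
# visitors will eventually see Chrome's warning. Assumes the third-party
# "requests" library is installed; the URLs below are placeholders.
import requests

URLS = [
    "http://example.com/",
    "http://example.com/products/",
]

for url in URLS:
    try:
        response = requests.get(url, allow_redirects=True, timeout=10)
    except requests.RequestException as exc:
        print(f"{url}: request failed ({exc})")
        continue

    if response.url.startswith("https://"):
        print(f"{url}: redirects to HTTPS ({response.url})")
    else:
        print(f"{url}: still served over HTTP; Chrome will warn visitors")
```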

Looking Ahead

Google continues outreach to companies responsible for the highest HTTP traffic volumes. Many sites use HTTP only to redirect visitors to HTTPS destinations, but that first unencrypted request can still be intercepted, an invisible security gap the new warnings will close.

Chrome plans additional work to reduce HTTPS adoption barriers for local network sites. The company introduced a local network access permission that allows HTTPS pages to communicate with private devices once users grant permission.

Users can disable warnings by turning off the “Always Use Secure Connections” setting. Enterprise and educational institutions can configure Chrome to meet their specific warning requirements.


Featured Image: Philo Athanasiou/Shutterstock

Google Labs & DeepMind Launch Pomelli AI Marketing Tool via @sejournal, @MattGSouthern

Pomelli, a Google Labs & DeepMind AI experiment, builds a “Business DNA” from your site and generates editable branded campaign assets for small businesses.

  • Pomelli scans your website to create a “Business DNA” profile.
  • It uses the created profile to keep content consistent across channels.
  • It suggests campaign ideas and generates editable marketing assets.
Trust In AI Shopping Is Limited As Shoppers Verify On Websites via @sejournal, @MattGSouthern

A new IAB and Talk Shoppe study finds AI is accelerating discovery and comparisons, but it’s not the last stop.

Here are the key points before we get into the details:

  • AI pushes people to verify details on retailer sites, search, reviews, and forums rather than replacing those steps.
  • Only about half fully trust AI recommendations, which creates predictable detours when links are broken or specs and pricing don’t match.
  • Retailer traffic rises after AI, with one in three shoppers clicking through directly from an assistant.

About The Report

This report combines more than 450 screen-recorded AI shopping sessions with a U.S. survey of 600 consumers, giving you observed behavior and stated attitudes in one place.

It tracks where AI helps, where trust breaks, and what people do next.

Key Findings

AI speeds up research and makes it more focused, especially for comparing options, but it increases the number of steps as shoppers validate details elsewhere.

In the sessions, people averaged 1.6 steps before AI and 3.8 afterward, and 95% took extra steps to feel confident before ending a session.

Retailer and marketplace sites are the primary destination for validation. Seventy-eight percent of shoppers visited a retailer or marketplace during the journey, and 32% clicked directly from an AI tool.

The share that visited retailer sites rose from 20% before AI to 50% after AI. On those pages, people most often checked prices and deals, variants, reviews, and availability.

Low Trust In AI Recommendations

Trust is a constraint. Only 46% fully trusted AI shopping recommendations.

Common friction points where people lost trust were:

  • Missing links or sources
  • Mismatched specs or pricing
  • Outdated availability
  • Recommendations that didn’t fit budget or compatibility needs

These friction points sent people back to search, retailers, reviews, and forums.

Why This Matters

AI chatbots now shape mid-journey research.

If your product data, comparison content, and reviews are inconsistent with retailer listings, shoppers will notice when they verify elsewhere.

This reinforces the need to align details across channels to retain customer trust.

What To Do With This Info

Here are concrete steps you can take based on the report’s information:

  • Keep specs, pricing, availability, and variants in sync with retailer feeds.
  • Build comparison and “alternatives” pages around the attributes people prompt for.
  • Expand structured data for specs, variants, availability, and reviews (see the sketch after this list).
  • Create content to answer common objections surfaced in forums and comment threads.
  • Monitor the queries and communities where shoppers validate information to close recurring gaps.
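
On the structured data point, one approach is to generate schema.org Product markup from the same record that feeds your retailer listings, so the values can't drift apart. The sketch below is illustrative only: it uses Python's standard json module, and every product value shown is hypothetical.

```python
# Illustrative sketch: build schema.org Product JSON-LD from the same record
# that feeds retailer listings, so specs, price, and availability stay in sync.
# All product values below are hypothetical.
import json

product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Wireless Headphones",
    "sku": "EX-WH-100",
    "brand": {"@type": "Brand", "name": "ExampleBrand"},
    "offers": {
        "@type": "Offer",
        "price": "129.99",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock",
        "url": "https://www.example.com/products/ex-wh-100",
    },
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.6",
        "reviewCount": "213",
    },
}

# Embed the output in a <script type="application/ld+json"> tag on the product page.
print(json.dumps(product, indent=2))
```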

Looking Ahead

Respondents said AI made research feel easier, but confidence still depends on clear sources and verified reviews.

Expect assistants to keep influencing discovery while retailer and brand pages confirm the details that matter.

For more insight into how AI influences the shopping journey, see the full report.


Featured Image: Andrey_Popov/Shutterstock

Google’s Advice On Canonicals: They’re Case Sensitive via @sejournal, @martinibuster

Google’s John Mueller answered a question about canonicals, expressing his opinion that “hope” shouldn’t be a part of your SEO strategy with regard to canonicals. The implication is that hoping Google will figure it out on its own misses the point of what SEO is about.

Canonicals And Case Sensitivity

Rel=canonical is an HTML link element that lets a publisher or SEO tell Google which URL they prefer. For example, it's useful for suggesting the best URL when multiple URLs carry the same or similar content. Google isn't obligated to obey the rel=canonical declaration; it's treated as a strong hint.

Someone on Reddit described a situation where a website's category names begin with a capital letter, but the canonical tag points to a lowercase version. The lowercase version currently redirects to the uppercase one.

They aren't seeing any negative impact from this and asked whether it's okay to leave the site as-is, since it hasn't affected search visibility.

The person asking the question wrote:

“…I’m running into something annoying on our blog and could use a sanity check before I push dev too hard to fix it. It’s been an issue for a month, after a redesign was launched.

All of our URLs resolve in this format: /site/Topic/topic-title/

…but the canonical tag uses a lowercase topic, like: /site/topic/topic-title/

So the canonical doesn’t exactly match the actual URL’s case. Lowercase topic 301 redirects to the correct, uppercase version.

I know that mismatched canonicals can send mixed signals to Google.

Dev is asking, “Are you seeing any real impact from this?” and technically, the answer is no — but I still think it’s worth fixing to follow best practices.”

If It Works Don’t Fix It?

This is an interesting case because, in many areas of SEO, if something is working there's little point in fixing a small detail for fear of triggering a negative response. Relying on Google to figure things out is another common fallback.

Google’s John Mueller has a different opinion. He responded:

“URL path, filename, and query parameters are case-sensitive, the hostname / domain name aren’t. Case-sensitivity matters for canonicalization, so it’s a good idea to be consistent there. If it serves the same content, it’ll probably be seen as a duplicate and folded together, but “hope” should not be a part of an SEO strategy.

Case-sensitivity in URLs also matters for robots.txt.”
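
A quick way to audit for this kind of mismatch is to fetch a page and compare the canonical URL's path to the live URL's path without lowercasing anything. Here's a minimal sketch, assuming Python with the requests and beautifulsoup4 libraries installed and a hypothetical URL modeled on the Reddit example:

```python
# Minimal sketch: flag canonicals that differ from the live URL only by case.
# Assumes "requests" and "beautifulsoup4" are installed; the URL is hypothetical.
from urllib.parse import urlparse

import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/site/Topic/topic-title/"
response = requests.get(url, timeout=10)
soup = BeautifulSoup(response.text, "html.parser")

canonical_tag = soup.find("link", rel="canonical")
if canonical_tag is None:
    print("No canonical tag found.")
else:
    canonical_path = urlparse(canonical_tag.get("href", "")).path
    live_path = urlparse(response.url).path

    if canonical_path == live_path:
        print("Canonical matches the live URL exactly.")
    elif canonical_path.lower() == live_path.lower():
        print(f"Case mismatch: live path {live_path!r} vs canonical {canonical_path!r}")
    else:
        print(f"Canonical points elsewhere: {canonical_tag.get('href')}")
```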

Takeaway

In highly competitive niches, SEO tends to be executed at a near-flawless level. If there's something to improve, it gets improved, and there's a good reason for that. Someone at one of the search engines once told me that anything you can do to make it easier for the crawlers is a win. They advised me to make sites easy to crawl and content easy to understand. That advice still holds, and it aligns with Mueller's point: don't "hope" that Google figures things out; make sure it does.

Featured Image by Shutterstock/MyronovDesign

YouTube Introduces ‘Ask Studio’ AI For Channel Analytics via @sejournal, @MattGSouthern

YouTube launched Ask Studio, an AI assistant built into YouTube Studio that analyzes channel data to provide insights and content suggestions.

The tool appears as a chat interface accessed through a sparkle icon in YouTube Studio. You can ask for comment summaries, video performance analysis, and content ideas based on your channel’s data.

What’s New

Ask Studio analyzes three primary types of channel data: comments, analytics, and past content performance.

For comments, Ask Studio can summarize key themes and sentiment across videos. You can ask for summaries on a specific video or get an overall view of what viewers are talking about.

For analytics, Ask Studio pulls from the same performance metrics already in YouTube Studio. It identifies patterns and suggests areas for improvement based on the channel’s data.

For content planning, Ask Studio can generate ideas tailored to what viewers already respond to. You can prompt it for new angles on an ongoing series, ask what topics are resonating with your audience, or get title and outline suggestions.

See a full walkthrough in the video below:

How It Differs From Inspiration Tab

Ask Studio and the Inspiration Tab are both designed to help with content ideas, but they work differently.

Inspiration Tab is a visual surface. It shows idea cards, images, and thumbnail suggestions for creators who like to browse concepts.

Ask Studio is conversational. You type a prompt and get an answer in plain language. It’s meant for creators who already have a direction and want help sharpening the angle, planning the next video, or understanding what viewers are saying.

Both use your channel data, but Ask Studio responds in real time. Inspiration Tab curates pre-generated suggestions.

Availability

Ask Studio is currently available in English to a limited group of creators in the United States.

YouTube says it’s continuing to expand access to more U.S. creators, experimenting with additional languages, and working on international rollout.

Some prompts may return a generic response or “I can’t help with that.” YouTube says that happens when Ask Studio doesn’t have enough context or doesn’t support that request yet.

Why This Matters

Ask Studio can surface patterns in your comments and analytics without manually digging through dashboards or scrolling hundreds of viewer messages. That reduces the time spent on reporting and lets you focus on packaging the next video.

The current limitation is reach. Right now it’s U.S.-only, English-only, and only some channels are in the test group, which restricts access for international creators and teams that work across multiple languages.

Looking Ahead

YouTube says it plans to roll out Ask Studio to more creators in the United States before expanding internationally. The company is also testing additional language support but hasn’t announced specific languages or dates.

The launch continues YouTube’s push toward AI-assisted creator tools inside YouTube Studio, alongside features like the Inspiration Tab for idea generation.


Featured Image: vrlibsstudio/Shutterstock

OpenAI Flags Emotional Reliance On ChatGPT As A Safety Risk via @sejournal, @MattGSouthern

OpenAI is telling companies that “relationship building” with AI has limits. Emotional dependence on ChatGPT is considered a safety risk, with new guardrails in place.

  • OpenAI says it has added “emotional reliance on AI” as a safety risk.
  • The new system is trained to discourage exclusive attachment to ChatGPT.
  • Clinicians helped define what “unhealthy attachment” looks like and how ChatGPT should respond.
Google Uses AI To Group Queries In Search Console Data via @sejournal, @MattGSouthern

Google announced Query groups in Search Console Insights. The AI feature clusters similar search queries, surfaces trends, and shows which topics drive clicks.

  • Query groups uses AI to cluster similar search queries.
  • The new card shows total clicks per group and highlights groups trending up or down.
  • Query groups will roll out over the coming weeks to high-volume accounts.
Automattic’s Legal Claims About SEO… Is This Real? via @sejournal, @martinibuster

SEO plays a role in Automattic’s counterclaim against WP Engine. The legal document mentions “search engine optimization” six times and “SEO” once in counterclaims asserting that WP Engine excessively used words like “WordPress” to rank in search engines, part of an alleged “infringement” campaign that uses WordPress trademarks in commerce. A close look at those claims shows that some of the evidence may be biased and that the claims about SEO rely on outdated information.

Automattic’s Claims About SEO

Automattic’s counterclaim asserts that WP Engine used SEO to rank for WordPress-related keywords and that this is causing confusion.

The counterclaim explains:

“WP Engine also has sown confusion in recent years by dramatically increasing the number of times Counterclaimants’ Marks appear on its websites. Starting in or around 2021, WP Engine began to sharply increase its use of the WordPress Marks, and starting in or around 2022, began to sharply increase its use of the WooCommerce Marks.”

Automattic next argues that the repetition of keywords on a web page is WP Engine’s SEO strategy. Here’s where their claims become controversial to those who know how search engines rank websites.

The counterclaim asserts:

“The increased number of appearances of the WordPress Marks on WP Engine’s website is particularly likely to cause confusion in the internet context.

On information and belief, internet search engines factor in the number of times a term appears in a website’s text in assessing the “relevance” of a website to the terms a user enters into the search engine when looking for websites.

WP Engine’s decision to increase the number of times the WordPress Marks appear on WP Engine’s website appears to be a conscious “search engine optimization” strategy to ensure that when internet users look for companies that offer services related to WordPress, they will be exposed to confusingly written and formatted links that take them to WP Engine’s sites rather than WordPress.org or WordPress.com.”

They call WP Engine’s strategy aggressive:

“WP Engine’s strategy included aggressive utilization of search engine optimization to use the WordPress and WooCommerce Marks extremely frequently and confuse consumers searching for authorized providers of WordPress and WooCommerce software;”

Is The Number Of Keywords Used A Ranking Factor?

I have twenty-five years of experience in search engine optimization and have a concomitantly deep understanding of how search engines rank content. The fact is that Automattic’s claim that search engines “factor in the number of times” a keyword is used in a website’s content is incorrect. Modern search engines don’t factor in the number of times a keyword appears on a web page as a ranking factor. Google’s algorithms use models like BERT to gain a semantic understanding of the meaning and intent of the keyword phrases used in search queries and content, resulting in the ability to rank content that doesn’t even contain the user’s keywords.

Those aren’t just my opinions; Google’s web page about how search works explicitly says that content is ranked according to the user’s intent, regardless of keywords, which directly contradicts Automattic’s claim about WPE’s SEO:

“To return relevant results, we first need to establish what you’re looking for – the intent behind your query. To do this, we build language models to try to decipher how the relatively few words you enter into the search box match up to the most useful content available.

This involves steps as seemingly simple as recognizing and correcting spelling mistakes, and extends to our sophisticated synonym system that allows us to find relevant documents even if they don’t contain the exact words you used.”

If Google’s documentation is not convincing enough, take a look at the search results for the phrase “Managed WordPress Hosting.” WordPress.com ranks #2, despite the phrase being completely absent from its web page.

Screenshot Of WordPress.com In Search Results

What Is The Proof?

Automattic provides a graph comparing WP Engine’s average monthly mentions of the word “WordPress” with mentions published by 18 other web hosts. The comparison of 19 total web hosts dramatically illustrates that WP Engine mentions WordPress more often than any of the other hosting providers, by a large margin.

Screenshot Of Graph

Here’s a close-up of the graph (with the values inserted) showing that WP Engine’s monthly mentions of “WordPress” far exceed the number of times words containing WordPress are used on the web pages of the other hosts.

Screenshot Of Graph Closeup

People say numbers don’t lie, and the graph seems to present compelling evidence that WP Engine aggressively uses keywords containing the word “WordPress.” But leaving aside the debunked idea that keyword spamming actually improves rankings, a closer look at the comparison shows the evidence is not so strong, because it is biased.

Automattic’s Comparison Is Arguably Biased

Automattic’s counterclaim compares eighteen other web hosts against WP Engine. Of the nineteen hosts in the comparison, only five (including WPE) are managed WordPress hosting platforms. The remaining fourteen are generalist hosting platforms that offer cloud hosting, VPS (virtual private servers), dedicated hosting, and domain name registrations.

The significance of this fact is that the comparison can be considered biased against WP Engine because the average mention of WordPress will naturally be lower across the entire website of a company that offers multiple services (like VPS, dedicated hosting, and domain name registrations) versus a site like WP Engine that offers only one service, managed WordPress hosting.

Two of the hosts listed in the comparison, Namecheap and GoDaddy, are primarily known as domain name registrars. Namecheap is the second biggest domain name registrar in the world. There’s no need to belabor the point that these two companies in Automattic’s comparison may be biased choices to compare against WP Engine.

Of the five hosts that offer WordPress hosting, two are plugin platforms: Elementor and WPMU Dev. Both are platforms built around their respective plugins, which means that the average number of mentions of WordPress is going to be lower because the average may be diluted by documentation and blog posts about the plugins. Those two companies are also arguably biased choices for this kind of comparison.

Of the eighteen hosts that Automattic chose to compare with WP Engine, only two of them are comparable in service to WP Engine: Kinsta and Rocket.net.

Comparison Of Managed WordPress Hosts

Automattic compares monthly mentions of phrases containing “WordPress,” and it’s clear that the choice of hosts biases the results against WP Engine. A fairer approach is to compare each host’s top-ranked web page for the phrase “managed WordPress hosting.”

The following is a one-to-one comparison of the top-ranked web page for each of the three managed WordPress hosts in Automattic’s list. I appended each domain name to the phrase “managed WordPress hosting” in a search query to surface the top-ranked page from each website, then counted how many times the word “WordPress” appears on those pages.
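
For anyone who wants to reproduce these counts (or the keyword density figure cited later), here’s a rough sketch of one way to do it, assuming Python with the requests and beautifulsoup4 libraries installed. Counts will naturally drift as pages change, and different tools count slightly differently.

```python
# Rough sketch: count how often "WordPress" appears in a page's visible text
# and compute its keyword density (occurrences / total words * 100).
# Assumes "requests" and "beautifulsoup4" are installed; results will vary
# as pages change and depending on how a given tool tokenizes the text.
import re

import requests
from bs4 import BeautifulSoup

url = "https://wpengine.com/managed-wordpress-hosting/"  # page to check
response = requests.get(url, timeout=10)
soup = BeautifulSoup(response.text, "html.parser")

# Drop script/style content so only visible text is counted.
for tag in soup(["script", "style", "noscript"]):
    tag.decompose()

text = soup.get_text(separator=" ")
words = re.findall(r"[\w.-]+", text)

mentions = sum(1 for word in words if "wordpress" in word.lower())
density = (mentions / len(words)) * 100 if words else 0.0

print(f"'WordPress' mentions: {mentions}")
print(f"Keyword density: {density:.2f}%")
```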

Here are the results:

Rocket.net

The home page of Rocket.net ranks #1 for the phrase “rocket.net managed wordpress hosting.” The home page of Rocket.net contains the word “WordPress” 21 times.

Screenshot of Google’s Search Results

Kinsta

The top ranked Kinsta page is kinsta.com/wordpress-hosting/ and that page mentions the word “WordPress” 55 times.

WP Engine

The top ranked WP Engine web page is wpengine.com/managed-wordpress-hosting/ and that page mentions the word “WordPress” 27 times.

A fair one-to-one comparison of managed WordPress host providers, selected from Automattic’s own list, shows that WP Engine is not using the word “WordPress” more often than its competitors. Its use falls directly in the middle of a fair one-to-one comparison.

Number Of Times Page Mentions WordPress

  • Rocket.net: 21 times
  • WP Engine: 27 times
  • Kinsta: 55 times

What About Other Managed WordPress Hosts?

For the sake of comparison, I compared an additional five managed WordPress hosts that Automattic omitted from its comparison to see how often the word “WordPress” was mentioned on the top-ranked web pages of WP Engine’s direct competitors.

Here are the results:

  • WPX Hosting: 9
  • Flywheel: 16
  • InstaWP: 22
  • Pressable: 23
  • Pagely: 28

It’s apparent that WP Engine’s 27 mentions put it near the upper end of that comparison, though nowhere near Kinsta’s level. So far, we’ve only seen part of the story. As you’ll see, other web hosts use the word “WordPress” far more than Kinsta does, and Kinsta won’t look like such an outlier when compared to generalist web hosts.

A Comparison With Generalist Web Hosts

Next, we’ll compare the generalist web hosts listed in Automattic’s comparison.

I did the same kind of search for the generalist web hosts to surface their top-ranked pages for the query “managed WordPress hosting” plus the name of the website, which is a one-to-one comparison to WP Engine.

Other Web Hosts Compared To WP Engine:

  1. InMotion Hosting: 101 times
  2. Greengeeks: 97 times
  3. Jethost: 71 times
  4. Verpex: 52 times
  5. GoDaddy: 49 times
  6. Cloudways: 47 times
  7. Namecheap: 41 times
  8. Liquidweb: 40 times
  9. Pair: 40 times
  10. Hostwinds: 37 times
  11. KnownHost: 33 times
  12. Mochahost: 33 times
  13. Pantheon: 31 times
  14. Siteground: 30 times
  15. WP Engine: 27 times

Crazy, right? WP Engine uses the word “WordPress” less often than any of the generalist web hosts in the comparison. This one-to-one comparison contradicts Automattic’s graph.

And just for the record, WordPress.com’s top-ranked page wordpress.com/hosting/ uses the word “WordPress” 62 times, over twice as often as WP Engine’s web page.

Will Automattic’s SEO Claims Be Debunked?

Automattic’s claims about WP Engine’s use of SEO may be based on shaky foundations. The claims about how keywords work for SEO contradict Google’s own documentation, and the fact that WordPress.com’s own website ranks for the phrase “Managed WordPress Hosting” despite not using that exact phrase appears to debunk the assertion that search engines factor in the number of times a user’s keywords appear on a web page.

The graph that Automattic presents in their counterclaim does not represent a comparison of direct competitors, which may contribute to a biased impression that WP Engine is aggressively using the “WordPress” keywords more often than competitors. However, a one-to-one comparison of the actual web pages that compete against each other for the phrase “Managed WordPress Hosting” shows that many of the web hosts in Automattic’s own list use the word “WordPress” far more often than WP Engine, which directly contradicts Automattic’s narrative.

I ran WP Engine’s Managed WordPress Hosting URL in a Keyword Density Tool, and it shows that WP Engine’s web page uses the word “WordPress” a mere 1.92% of the time, which, from an SEO point of view, could be considered a modest amount and far from excessive. It will be interesting to see how the judge decides the merits of Automattic’s SEO-related claims.

Featured Image by Shutterstock/file404

Perplexity Responds To Reddit Lawsuit Over Data Access via @sejournal, @MattGSouthern

Reddit sued Perplexity and three data-scraping firms in New York federal court, alleging the companies bypassed access controls to obtain Reddit content at scale, including by scraping Google search results.

Perplexity posted a public response, saying it summarizes Reddit discussions with citations and doesn’t train AI models on Reddit content.

The position is consistent with the company’s past statements. Whether it addresses the specific allegations in Reddit’s filing remains an open question.

The complaint names Oxylabs UAB, AWMProxy, and SerpApi as intermediaries. It alleges Perplexity is a SerpApi customer and purchased and/or utilized SerpApi services to circumvent controls and copy Reddit data.

Evidence In The Complaint

Perplexity’s argument is built around a technical distinction. The company says it summarizes and cites discussions rather than training models on Reddit posts.

Perplexity wrote in its Reddit response:

“We summarize Reddit discussions, and we cite Reddit threads in answers, just like people share links to posts here all the time.”

The complaint, however, presents technical claims that call that framing into question.

According to the filing, Reddit created a test post that was only crawlable by Google’s search engine and not accessible anywhere else on the internet. Within hours, that hidden content appeared in Perplexity’s results.

The filing also says that after Reddit sent a cease-and-desist letter, Perplexity’s citations to Reddit increased roughly forty-fold.

Similar Accusations From Publishers

Forbes previously accused Perplexity of republishing an exclusive and threatened legal action.

Wired reported that Perplexity used undisclosed IPs and spoofed user-agent strings to bypass robots.txt.

Cloudflare later said Perplexity used “stealth, undeclared crawlers” that ignored no-crawl directives, based on tests it ran in August.

How Perplexity Has Responded

In previous disputes, Perplexity said issues stemmed from rough edges on new products and promised clearer attribution.

The company has also argued that some media organizations are trying to control “publicly reported facts.”

In this latest response, Perplexity frames Reddit’s lawsuit as leverage in broader training-data negotiations and writes:

“We summarize Reddit discussions… We won’t be extorted, and we won’t help Reddit extort Google.”

Why This Matters

This issue matters because it concerns how AI assistants use forum content that your audiences read and that publishers frequently cite.

The legal questions go beyond just training.

Courts may examine whether technical controls were bypassed, whether summarization infringes protected expression, and whether using third-party scrapers creates legal liability for downstream products.

If courts accept Reddit’s anti-circumvention argument, it could lead to changes in how assistants cite or link Reddit threads.

On the other hand, if courts agree with Perplexity’s viewpoint, assistants might start relying more on forum discussions that are less restricted by licensing.

What We Don’t Know Yet

The filing alleges Perplexity obtained data via at least one scraping firm, but the public complaint doesn’t specify which vendor supplied which data or include transaction details.