Google Opens Gemini Deep Research To Free Users (With Limits) via @sejournal, @MattGSouthern

Google announced it will make its Deep Research feature available to all users for free on a limited basis, while introducing several updates to Gemini.

With this rollout, Gemini is now equipped with enhanced reasoning capabilities, personalization features, and expanded app connectivity.

Free Access with Limitations

Google’s Deep Research tool, which processes information from multiple websites and documents, will now be accessible to non-paying users “a few times a month.”

Gemini Advanced subscribers will continue to have more extensive access to the feature.

The company describes Deep Research as an AI research assistant that searches and synthesizes web information.

Google reports the feature has been updated with its Flash Thinking 2.0 model, which displays its reasoning process while browsing.

Google stated in its announcement:

“Gemini users can try Deep Research a few times a month at no cost, and Gemini Advanced users get expanded access to Deep Research.”

The feature is rolling out in more than 45 languages.

Model Updates

The Flash Thinking 2.0 model has been updated to include file upload capabilities and faster processing speeds.

For paid subscribers, the model now supports a context window of up to 1 million tokens.

Dave Citron, Senior Director of Product Management for the Gemini app, stated in the announcement that the updated model is “trained to break down prompts into a series of steps to strengthen its reasoning capabilities.”

Testing has shown the system can still make errors in both analysis and conclusions, the company acknowledged.

Additional Features

Google also announced a new experimental personalization feature that connects with users’ Google apps and services. The feature uses data from search history to provide tailored responses to queries such as restaurant recommendations.

Additional app integrations now include Calendar, Notes, and Tasks, allowing users to make requests involving multiple applications. Google Photos integration is planned for the coming weeks.

Lastly, Google announced that its Gems feature, which lets users create customized AI assistants for specific topics, is now available to all users at no cost.

These updates are available now at gemini.google.com.


Featured Image: Screenshot from blog.google.com, March 2025. 

Google Search History Can Now Power Gemini AI Answers via @sejournal, @martinibuster

Google announced an update to its Gemini personal AI assistant that increases the personalization of responses so that it anticipates users’ needs and feels more like a natural personal assistant than a tool. Examples of how the new Gemini will help users include brainstorming travel ideas and making personalized recommendations.

The new feature rolls out first to desktop and then to mobile apps.

Gemini With Personalization

Google announced a new version of Gemini that adapts responses to a user’s unique interests. It does this based on their search history, which enables Gemini to deliver responses with a higher level of contextual relevance and personalization. Google intends to expand personalization by integrating other Google apps and services, naming Photos and YouTube as examples.

Google explained:

“In the coming months, Gemini will expand its ability to understand you by connecting with other Google apps and services, including Photos and YouTube. This will enable Gemini to provide more personalized insights, drawing from a broader understanding of your activities and preferences to deliver responses that truly resonate with you.”

How Personalization Works

Users can share personal preferences and details, like dietary requirements or a partner’s name, to obtain responses that feel specific to the individual. Gemini Advanced users can allow Gemini to access past chats to further improve the relevance of responses.

Google’s access to search history and data from other apps may give it an advantage that competing apps like ChatGPT may not be able to match.

Personalization Is Opt-In

There are four key points to understand about personalization in Gemini:

  1. Personalization is currently an opt-in feature that’s labeled “experimental.”
  2. Users need to choose to use Personalization from the model drop-down menu in order to activate it.
  3. Gemini asks for permission to connect to search history and other Google services and apps before it uses them for personalization.
  4. Users can also disconnect from the feature.

That means millions of Gemini users won’t suddenly shift from search to a contextual AI assistant for an increasing share of their information needs. But the door to that shift now exists, and the next step is for Google users to open it.

What Publishers Need To Know

This update blurs the line between traditional Search and Google’s AI assistant while making information increasingly accessible outside of search results. Publishers and SEOs should be concerned enough to research how it affects them and how to respond.

Privacy considerations may keep Google from turning personalization into an opt-out feature. Personalization is currently opt-in from a drop-down menu because it’s still experimental, but once it matures it’s not unreasonable to assume that Google may begin nudging users to adopt it.

Even though this is an experimental feature, publishers and SEOs may want to understand how it impacts them. Is it possible to track personalized Gemini referral traffic, or will it be masked because of privacy considerations? Will answers from Gemini reduce the need for clicks to publisher sites?

Read Google’s announcement:

Gemini gets personal, with tailored help from your Google apps

Featured Image by Shutterstock/Tada Images

Google Begins Rolling Out March Core Algorithm Update via @sejournal, @MattGSouthern

Google has officially begun rolling out its March 2025 core algorithm update, according to an announcement posted to the Google Search Status Dashboard today.

The update commenced at 9:23 AM PDT and is expected to fully deploy across all of Google’s search systems in up to two weeks.

The company provided minimal details beyond the timing and expected duration of the rollout.

The official announcement reads:

“Released the March 2025 core update. The rollout may take up to 2 weeks to complete.”

What This Means For SEO Professionals

Core updates are comprehensive changes to Google’s main search algorithms and systems.

Unlike more minor updates that might focus on specific issues, core updates typically produce noticeable changes to search rankings across the web.

Website owners and SEO professionals should expect fluctuations in search visibility and rankings over the coming weeks as the update gradually rolls out.

These changes often stabilize once the update is fully implemented, though permanent shifts in positioning can occur based on how the new algorithm evaluates content quality and relevance.

Preparing For Algorithm Changes

As with previous core updates, Google hasn’t provided specific details about changes made to its ranking systems. The company typically advises creating high-quality content rather than trying to fix particular issues when rankings drop after an update.

Monitor your analytics during this period to identify any significant changes in traffic or rankings. Documenting these changes can help determine whether adjustments are needed once the update has been fully implemented.

Search Engine Journal will continue to monitor the impact of this update and provide additional information as it becomes available.

WordPress Backup Plugin Vulnerability Affects 5+ Million Websites via @sejournal, @martinibuster

A high-severity vulnerability was discovered and patched in the All-in-One WP Migration and Backup plugin, which has over five million installations. The vulnerability requires no user authentication, making it easier for an attacker to compromise a website, but this is mitigated by a restricted attack method.

The vulnerability was assigned a severity rating of 7.5 (High), which is below the highest severity level, labeled Critical.

Unauthenticated PHP Object Injection

The vulnerability is called an unauthenticated PHP object injection, but it’s less severe than a typical unauthenticated PHP object injection, where an attacker could directly exploit the vulnerability. This specific vulnerability requires that a user with administrator-level credentials export and restore a backup with the plugin in order to trigger the exploit.

The way this kind of vulnerability works is that the plugin deserializes potentially malicious data during backup restoration without properly verifying it. Because there’s only a narrow attack opportunity, exploiting it is less straightforward.
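
The plugin itself is written in PHP, where object injection happens through unserialize() on untrusted input, but the underlying pattern is language-agnostic. Below is a minimal Python sketch of the same class of mistake, using pickle as the deserializer; it is an illustrative analogy with hypothetical function names, not the plugin’s actual code.

  import json
  import pickle

  # UNSAFE: deserializing attacker-controlled bytes lets the attacker decide
  # what gets constructed. pickle.loads() can execute attacker-supplied code
  # while loading; PHP's unserialize() is analogous, though its impact depends
  # on which classes ("POP chains") are available to abuse.
  def restore_backup_value(raw: bytes):
      return pickle.loads(raw)

  # Safer: use a data-only format and validate the expected structure.
  def restore_backup_value_safely(raw: bytes):
      data = json.loads(raw)
      if not isinstance(data, dict):
          raise ValueError("unexpected backup payload")
      return data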

Nevertheless, if the right conditions are met, an attacker can delete files, access sensitive information, and run malicious code.

According to a report by Wordfence:

“The All-in-One WP Migration and Backup plugin for WordPress is vulnerable to PHP Object Injection in all versions up to, and including, 7.89 via deserialization of untrusted input in the ‘replace_serialized_values’ function.

This makes it possible for unauthenticated attackers to inject a PHP Object. No known POP chain is present in the vulnerable software. If a POP chain is present via an additional plugin or theme installed on the target system, it could allow the attacker to delete arbitrary files, retrieve sensitive data, or execute code. An administrator must export and restore a backup in order to trigger the exploit.”

The vulnerability affects versions up to and including 7.89. Users of the plugin are advised to update to the latest version, which at the time of writing is 7.90.

Read the Wordfence vulnerability advisory:

All in One WP Migration <= 7.89 – Unauthenticated PHP Object Injection

Google Publishes New Robots.txt Explainer via @sejournal, @martinibuster

Google published a new robots.txt refresher explaining how robots.txt enables publishers and SEOs to control search engine crawlers and other bots (that obey robots.txt). The documentation includes examples of blocking specific pages (like shopping carts), restricting certain bots, and managing crawling behavior with simple rules.

From Basics To Advanced

The new documentation offers a quick introduction to what robots.txt is and gradually progresses to increasingly advanced coverage of what publishers and SEOs can do with robots.txt and how it benefits them.

The main point of the first part of the document is to introduce robots.txt as a stable web protocol with a 30-year history that’s widely supported by search engines and other crawlers.

Google Search Console will report a 404 error message if the robots.txt file is missing. It’s okay for that to happen, but if it bothers you to see the error in GSC, you can wait 30 days and the warning will drop off. An alternative is to create a blank robots.txt file, which is also acceptable to Google.

Google’s new documentation explains:

“You can leave your robots.txt file empty (or not have one at all) if your whole site may be crawled, or you can add rules to manage crawling.”

From there it covers the basics like custom rules for restricting specific pages or sections.

The advanced uses of robots.txt cover these capabilities (a sample file follows the list):

  • Targeting specific crawlers with different rules.
  • Blocking URL patterns like PDFs or search pages.
  • Granular control over specific bots.
  • Comments for internal documentation.
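
To make those capabilities concrete, here is a short illustrative robots.txt file. It is not taken from Google’s documentation, and the bot name and paths are hypothetical examples:

  # Rules for all crawlers that obey robots.txt
  User-agent: *
  Disallow: /cart/        # block the shopping cart
  Disallow: /search/      # block internal search result pages
  Disallow: /*.pdf$       # block PDF files (wildcard syntax supported by Google)

  # A stricter rule set for one specific (hypothetical) bot
  User-agent: ExampleBot
  Disallow: /

  Sitemap: https://www.example.com/sitemap.xml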

The new documentation finishes by describing how simple it is to edit the robots.txt file (it’s a plain text file with simple rules), so all you need is a basic text editor. Many content management systems have a way to edit it, and there are tools available for testing whether the robots.txt file uses the correct syntax.

Read the new documentation here:

Robots Refresher: robots.txt — a flexible way to control how machines explore your website

Featured Image by Shutterstock/bluestork

AI Model Showdown: Top Choices For Text, Image, & Video Generation via @sejournal, @MattGSouthern

With so many AI models available today, it’s tough to decide where to begin. A recent study from Quora’s Poe provides guidance for those unsure about which models to choose.

The study analyzes millions of interactions to highlight the most popular tools for generating text, images, and videos.

With nearly every tech company offering an AI solution, it’s easy to get overwhelmed by choices. Poe’s data clarifies which models are trusted and widely used.

Whether you’re new to AI or experienced, this report shows trends that can help you find the best models. Remember that this data represents Poe subscribers and may not reflect the broader AI community.

Text Generation Trends

A Two-Way Race

The study shows that among Poe subscribers, Anthropic models are quickly becoming as popular as OpenAI’s, especially after the release of Claude 3.5 Sonnet. The usage of text models from both providers is now almost evenly split.

Rapid Adoption of New Releases

Poe users often switch to the latest models, even if loyal to a specific brand. For example, people rapidly move from OpenAI’s GPT-4 to GPT-4o or from Claude 3 to Claude 3.5.

Emerging Players

DeepSeek’s R1 and V3 have captured about 7% of the messages on Poe. Google’s Gemini family has seen a slight decline in use among Poe subscribers but remains a key player.

Image Generation Trends

Market Share of Early Movers

DALL-E 3 and Stable Diffusion were once leaders in image generation, but their shares have dropped by about 80%. This decline occurred as the number of image generation models increased from three to around 25.

Leading Models

The FLUX family from Black Forest Labs is now the leading image model, holding a nearly 40% share, while Google’s Imagen 3 family has about a 30% share.

Smaller Models

Smaller image providers like Playground and Ideogram update their services frequently, which helps them maintain a loyal user base. However, they only account for about 10% of Poe’s image generation usage.

Video Generation Trends

An Emerging Industry

Video generation was almost nonexistent on Poe until late 2024, but it has quickly grown in popularity. Now, at least eight providers offer this ability.

Runway: Most Used Model

Runway’s single video model handles 30–50% of video generation requests. Although its usage is lower than it used to be, many people still choose this brand.

New Player: Veo-2

Since launching on Poe, Google’s Veo-2 has gained about 40% of the market, showing how quickly customer preferences can change. Other new models, such as Kling-Pro v1.5, Hailuo-AI, HunyuanVideo, and Wan-2.1, have captured around 15% of the market.

Key Takeaway & Looking Ahead

The data shows a clear pattern of newer models replacing older ones in user preference. If you want the best performance, use the latest version rather than sticking with familiar but outdated models.

Whether these usage patterns will hold steady or continue to shift remains to be seen. At some point, cost will be a barrier to adoption, as new models tend to get more expensive with every release.

In future reports, Poe plans to share insights on how different models fit various tasks and price points.


Featured Image: stokkete/Shutterstock

New Wix Automations Makes It Easy To Grow Revenue And Engagement via @sejournal, @martinibuster

Wix announced Automations, a new automation builder that enables businesses to create and manage custom actions, like sending emails based on customer activity. Users are able to create these automations with an easy-to-use visual interface and track their performance from a dashboard.

Wix Automations Is Powerful But Easy To Use

There are four key features:

  • Intuitive Automation Design
    Simplifies the process of creating advanced automations.
  • Advanced Customization
    Supports conditions and formulas for creating highly customizable automations.
  • Centralized Automation Management
    Users can track key metrics, adjust settings in real time, and manage all automations, no matter which apps they’re connected to.
  • Email Automation Insights
    Provides detailed reporting on email success rates and engagement which enables businesses to fine-tune their email messaging.

The new Automations feature integrates with Wix Services, so businesses can use customer data to set up personalized automations like custom discounts based on what customers buy.

A user-friendly interface makes it easy to click and build advanced automations based on site visitor actions. Wix Automations supports conditions and formulas for creating customizable automations. What makes Wix Automations powerful is that these features enable users to easily set up complex, multi-step actions.

For example, a customer purchase can be the trigger to check a condition, such as whether the total is over $50. If the condition is met, a formula calculates a 10% discount, and the automation sends the customer an email with a discount code for a future purchase.
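
Expressed as code, the logic of that example looks roughly like the following. This is only an illustration of the trigger, condition, formula, and action pattern with hypothetical names; Wix Automations itself is configured through the visual builder, not code.

  MIN_TOTAL = 50.00        # condition: only orders over $50 qualify
  DISCOUNT_RATE = 0.10     # formula: 10% of the order total

  def send_email(to: str, body: str) -> None:
      # Stand-in for the automation's email action.
      print(f"Email to {to}: {body}")

  def on_purchase(order_total: float, customer_email: str) -> None:
      # Trigger: a customer completes a purchase.
      if order_total > MIN_TOTAL:                           # condition
          discount = round(order_total * DISCOUNT_RATE, 2)  # formula
          send_email(customer_email, f"Here is ${discount:.2f} off your next purchase.")  # action

  on_purchase(80.00, "customer@example.com")  # sends an $8.00 discount offer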

According to the press release:

“The builder’s clear and intuitive design makes it easier than ever to build and manage automations, significantly improving efficiency by streamlining and automating tasks and, ultimately, enhancing overall user experience.

With the addition of conditions and formulas, the automations builder now allows users to create more accurate, highly tailored workflows that adapt to their business needs allowing businesses to operate more smoothly and effectively.”

Mullenweg Considers Delaying WordPress Releases Through 2027 via @sejournal, @martinibuster

A leaked WordPress Slack chat shows that Matt Mullenweg is considering limiting future WordPress releases to just one per year from now through 2027, and that he insists the only way to get Automattic to contribute more is for the community to pressure WP Engine to drop its lawsuit. One WordPress developer who read that message characterized it as blackmail.

WordPress Core Development

Mullenweg’s Automattic had already reduced its contributions to core, prompting a WordPress developer attending WordCamp Asia 2025 to plead with Mullenweg to increase Automattic’s contributions to WordPress because his business, and so many others, depend on it. Mullenweg smiled and said no without actually saying the word no.

Automattic’s January 2025 statement about reducing contributions:

“…Automattic will reduce its sponsored contributions to the WordPress project. This is not a step we take lightly. It is a moment to regroup, rethink, and strategically plan how Automatticians can continue contributing in ways that secure the future of WordPress for generations to come. Automatticians who contributed to core will instead focus on for-profit projects within Automattic, such as WordPress.com, Pressable, WPVIP, Jetpack, and WooCommerce. Members of the “community” have said that working on these sorts of things should count as a contribution to WordPress.

As part of this reset, Automattic will match its volunteering pledge with those made by WP Engine and other players in the ecosystem, or about 45 hours a week that qualify under the Five For the Future program as benefitting the entire community and not just a single company. These hours will likely go towards security and critical updates.

We’ve made the decision to reallocate resources due to the lawsuits from WP Engine. This legal action diverts significant time and energy that could otherwise be directed toward supporting WordPress’s growth and health. We remain hopeful that WP Engine will reconsider this legal attack, allowing us to refocus our efforts on contributions that benefit the broader WordPress ecosystem.

WP Engine’s historically slim contributions underscore the imbalance that must be addressed for the health of WordPress. We believe in fairness and shared responsibility, and we hope this move encourages greater participation across all organizations that benefit from WordPress.”

Leaked Slack Post

The post on Slack blamed WP Engine for the slowdown and encouraged others to put pressure on WP Engine to drop the suit.

The following is a leaked quote of Mullenweg’s post on the WordPress Slack channel, as posted in the Dynamic WordPress Facebook Group (must join the Facebook group to read the post) by a reliable source:

“Would like to put together a Zoom for core committers to discuss future release schedule, hopefully bringing together some of the conversations happening the past 6 weeks:
6.8 includes a lot of “overhang” contributions from Automatticians, including 890+ enhancements and bug fixes in Gutenberg.

I’d like to make sure we get extra testing on 6.8 from web hosts, especially if they can upgrade perhaps their company blogs or something, employee sites, etc to make sure upgrades and everything work well in all environments and with the most popular plugins without regressions.
The Chromecast update issues today (https://x.com/james_dunthorne/status/1898871402049999126 )remind us how easily this can happen.

I’m willing to commit people to early roll-out to WP .com to provide widespread testing with hundreds of thousands of users. This is very resource-intensive, but has contributed a lot to making sure releases are stable before they deploy to the wider array of non-engaged web hosts in the past.

We should consider modifying the release schedule:Other corporate sponsors are protesting WPE’s actions by pulling back contributions, which I think will effect some of the other largest contributors after Automattic.

The court schedule in the WP Engine lawsuit against Automattic, me, and WordPress .org ( https://cloudup.com/c33IWQHdNMj ) goes to jury trial in 2027. WPE appears to be unresponsive to public pressure to resolve things earlier. (As I said at WC Asia, I’m ready to end it yesterday.)

We are approaching 7.0 in two releases, which has an emotional valence and I’d rather not be purely maintenance. (Nor do I want to break our naming structure and do 6.10.)
One approach would be delaying 6.8 and making it the only release this year, 6.9 in 2026, and then aim for a 7.0 in late 2027 assuming a positive outcome of the jury trial.

FWIW I would estimate WPE is spending the equivalent of 60 engineers full-time salary at 250k/yr as plaintiffs / attackers, and Automattic a similar amount on defense. Imagine the project could do for democratizing publishing and competing against proprietary alternatives if that were going into core and community development.
Drop any other thoughts or agenda items you may have in this thread.”

Response to Mullenweg’s leaked post:

One Facebook user accused Mullenweg of trying to “blackmail” the WordPress community into pressuring WP Engine (WPE). They wrote that the community is largely more sympathetic to WPE than to Mullenweg. But in general, Mullenweg’s statement was met with a shrug, because commenters feel a slower release schedule will give core contributors the chance to catch up on maintaining the core, which to them is a greater priority than adding more features to Gutenberg, which many of the developers in this group apparently don’t use.

One lone commenter in the Facebook discussion asked if anyone in the discussion had made a positive contribution to WordPress. At the time of writing, nobody had cared to respond.

Is Google’s Use Of Compressibility An SEO Myth? via @sejournal, @martinibuster

I recently came across an SEO test that attempted to verify whether compression ratio affects rankings. It seems there may be some who believe that higher compression ratios correlate with lower rankings. Understanding compressibility in the context of SEO requires reading both the original source on compression ratios and the research paper itself before drawing conclusions about whether or not it’s an SEO myth.

Search Engines Compress Web Pages

Compressibility, in the context of search engines, refers to how much web pages can be compressed. Shrinking a document into a zip file is an example of compression. Search engines compress indexed web pages because it saves space and results in faster processing. It’s something that all search engines do.
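
To make the metric concrete, here is a minimal Python sketch of how a compression ratio can be computed, assuming the common definition used in this line of research: the uncompressed size of a page divided by its gzip-compressed size. The snippet is illustrative and not taken from the paper.

  import gzip

  def compression_ratio(html: str) -> float:
      # Uncompressed size divided by gzip-compressed size; a higher ratio
      # means the page is more redundant and compresses more.
      raw = html.encode("utf-8")
      return len(raw) / len(gzip.compress(raw))

  # A repetitive, keyword-stuffed page compresses far better than varied prose,
  # which is why very high ratios correlated with spam in the research.
  stuffed_page = "<p>best cheap widgets buy cheap widgets best widgets</p>" * 500
  print(round(compression_ratio(stuffed_page), 2))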

Websites & Host Providers Compress Web Pages

Web page compression is a good thing because it helps search crawlers access pages quickly, which in turn signals to Googlebot that the server isn’t strained and that it’s okay to grab even more pages for indexing.

Compression also speeds up websites, giving site visitors a high-quality user experience. Most web hosts automatically enable compression because it’s good for websites and site visitors, and good for web hosts because it reduces bandwidth loads. Everybody wins with website compression.

High Levels Of Compression Correlate With Spam

Researchers at a search engine discovered that highly compressible web pages correlated with low-quality content. The study, called Spam, Damn Spam, and Statistics: Using Statistical Analysis to Locate Spam Web Pages (PDF), was conducted in 2006 by two of the world’s leading researchers, Marc Najork and Dennis Fetterly.

Najork currently works at DeepMind as a Distinguished Research Scientist. Fetterly, a software engineer at Google, is an author of many important research papers related to search, content analysis, and other related topics. This isn’t just any research paper; it’s an important one.

The research paper shows that 70% of web pages that compress at a ratio of 4.0 or higher tended to be low-quality pages with a high level of redundant word usage. The average compression ratio of sites was around 2.0.

Here are the compression ratio statistics for normal web pages listed by the research paper:

  • Compression ratio of 2.0 (mode):
    The most frequently occurring compression ratio in the dataset is 2.0.
  • Compression ratio of 2.1 (median):
    Half of the pages have a compression ratio below 2.1, and half have a compression ratio above it.
  • Compression ratio of 2.11 (mean):
    On average, the compression ratio of the pages analyzed is 2.11.

Compressibility would be an easy first-pass way to filter out obvious content spam, so it makes sense that search engines would use it to weed out heavy-handed content spam. But weeding out spam is more complicated than simple solutions. Search engines use multiple signals because it results in a higher level of accuracy.

The researchers reported that 70% of sites with a compression level of 4.0 or higher were spam. That means that the other 30% were not spam sites. There are always outliers in statistics and that 30% of non-spam sites is why search engines tend to use more than one signal.

Do Search Engines Use Compressibility?

It’s reasonable to assume that search engines use compressibility to identify heavy handed obvious spam. But it’s also reasonable to assume that if search engines employ it they are using it together with other signals in order to increase the accuracy of the metrics. Nobody knows for certain if Google uses compressibility.

Is There Proof That Compression Is An SEO Myth?

Some SEOs have published research analyzing the rankings of thousands of sites for hundreds of keywords. They found that both the top-ranking and bottom-ranked sites had a compression ratio of about 2.4. The difference between their compression ratios was just 2%, meaning the scores were essentially equal. Those results are close to the normal average range of 2.11 reported in the 2006 scientific study.

The SEOs claimed that the mere 2% higher compression levels of the top-ranked sites over the bottom-ranked sites prove that compressibility is an SEO myth. Of course, that claim is incorrect. The average compression ratio of normal sites in 2006 was 2.11, which means the average 2.4 ratio in 2025 falls well within the range of normal, non-spam websites.

The ratio for spam sites is 4.0 or higher, so the fact that both the top- and bottom-ranked sites have a ratio of about 2.4 is meaningless, since both scores fall within the range of normal.

If we assume that Google is using compressibility, a site would have to produce a compression ratio of 4.0 or higher, plus send other low-quality signals, to trigger an algorithmic action. If that happened, those sites wouldn’t be in the search results at all because they wouldn’t be in the index, and therefore there is no way to test it with the SERPs, right?

It would be reasonable to assume that the sites with high 4.0 compression ratios were removed. But we don’t know that, it’s not a certainty.

Is Compressibility An SEO Myth?

Compressibility may not be an SEO myth. But it’s probably not anything publishers or SEOs should worry about as long as they’re avoiding heavy-handed tactics like keyword stuffing or repetitive cookie-cutter pages.

Google uses de-duplication which removes duplicate pages from their index and consolidates the PageRank signals to whichever page they choose to be the canonical page (if they choose one). Publishing duplicate pages will likely not trigger any kind of penalty, including anything related to compression ratios, because, as was already mentioned, search engines don’t use signals in isolation.

U.S. DOJ Antitrust Filing Proposes 4 Ways To Break Google’s Monopoly via @sejournal, @martinibuster

The plaintiffs in an antitrust lawsuit against Google filed a revised proposed final judgment for the judge in the case to consider. The proposal comes after a previous ruling where the court determined that Google broke antitrust laws by illegally maintaining its monopoly.

The legal filing by the plaintiffs, the United States Department of Justice and State Attorneys General, argues that Google has maintained monopolies in search services and text advertising through anticompetitive practices.

The filing proposes four ways to loosen Google’s monopolistic hold on search and advertising.

  1. Requiring Google to separate Chrome from its business—this could mean selling it or spinning it off into an independent company.
  2. Limiting Google’s payments to companies like Apple for making Google the default search engine, reducing its ability to secure exclusive deals.
  3. Stopping Google from favoring its own products over competitors in search results and other services, ensuring a more level playing field.
  4. Increasing transparency in Google’s advertising and data practices so competitors have fairer access to key information.

The proposal asks that Google be subjected to continuous oversight through mandatory reporting to ensure transparency in Google’s advertising and data practices:

“Google must provide to the Technical Committee and Plaintiffs a monthly report outlining any changes to its search text ads auction and its public disclosure of those changes.”

It also suggests ongoing enforcement to guarantee that Google doesn’t impose new restrictions that undermine transparency requirements:

“Google must not limit the ability of advertisers to export in real time (by downloading through an interface or API access) data or information relating to their entire portfolio of ads or advertising campaigns bid on, placed through, or purchased through Google.”

The goal of the above provision is to increase transparency in Google’s advertising system and make it easier for advertisers to analyze their ad performance.

Real-time access ensures advertisers can make immediate adjustments to their campaigns instead of waiting for delayed reports, and it ensures advertisers aren’t locked into Google’s advertising system by being held hostage to their historical data.

The legal filing proposes government-imposed restrictions and changes to Google’s advertising business practices, along with remedies for how Google should be regulated or restructured following the court’s earlier ruling that Google engaged in monopolistic practices. However, this is not the final judgment; the court must still decide whether to adopt, modify, or reject these proposed remedies.