New WordPress Vibe Coding Simplifies Building Websites via @sejournal, @martinibuster

10Web, an AI website-building platform, launched Vibe for WordPress, an AI-based site builder that works natively with WordPress. Vibe for WordPress aims to simplify and scale the process of creating websites.

Conversational AI WordPress Development

Vibe for WordPress enables users to build websites by explaining what they need in conversational language. It generates a working WordPress site that can be refined in chat, in the drag-and-drop visual editor, or in code mode. This process links AI-generated prototypes with WordPress’s live environment, minimizing manual setup or reliance on outside CMS tools.

Features and Integration

According to 10Web, Vibe connects to the WordPress backend, offering access to plugins, WooCommerce for e-commerce, user management, and built-in SEO tools. The hosted stack includes CDN, SSL, and backups, making each project ready for production. It is open source, so developers can modify or migrate code freely.

By combining AI-based frontend building with the WordPress backend, 10Web positions Vibe as a bridge between flexible AI creation and open-source infrastructure.

10Web describes the benefits:

  • “Unlimited Frontend Freedom — Build any layout, interaction, or animation—no drag-and-drop limits.
  • Real WordPress Backend — Plugins, auth, content models, and WooCommerce (soon) baked in.
  • Prompt → Website — Generate full sites from a prompt, then refine via chat or direct code.
  • All-in-One Hosted Stack — Managed hosting, security, performance tools, backups—plus open-source flexibility.
  • Flexible Delivery — Use the platform today; API, self-hosted, and white-label are on the roadmap.”

Future Roadmap and Availability

Planned updates include WooCommerce support for ecommerce, Custom Post Type support, Figma or screenshot-based prompts, an API, self-hosted deployment, and white-label options.

Read more at 10Web:

10Web Unveils First AI-Powered Vibe Coding Frontend Builder with Complete WordPress Backend

Featured Image by Shutterstock/Reyburn

Bing Supports data-nosnippet For Search Snippets & AI Answers via @sejournal, @MattGSouthern

Bing now supports the data-nosnippet HTML attribute, giving websites more precise control over what appears in search snippets and AI-generated answers.

The attribute lets you exclude specific page sections from Bing Search results and Copilot while keeping the page indexed.

Content marked with data-nosnippet remains eligible to rank but will not surface in previews.

What’s New

data-nosnippet can be applied to any HTML element you want to keep out of previews.

When Bing crawls your site, marked sections are discoverable but are omitted from snippet text and AI summaries.

Bing highlights common use cases:

  • Keep paywalled or premium content out of previews
  • Reduce exposure of user comments or reviews in AI answers
  • Hide legal boilerplate, disclaimers, and cookie notices
  • Suppress outdated notices and expired promotions
  • Exclude sponsored blurbs and affiliate disclaimers from neutral previews
  • Avoid A/B test noise by hiding variant copy during experiments
  • Emphasize high-value content while keeping sensitive parts behind the click

Implementation is straightforward. Add the attribute to any HTML element; for example, a subscriber-only section might be marked up like this (the wrapper tags shown are illustrative):

   <div data-nosnippet>
     <h2>Subscriber Content</h2>
     <p>This section will not appear in Bing Search or Copilot answers.</p>
   </div>

After adding it, you can verify changes in Bing Webmaster Tools with URL inspection. Depending on crawl timing, updates may appear within seconds or take up to a week.

How It Compares To Other Directives

data-nosnippet complements page-level directives.

  • noindex removes a page from the index
  • nosnippet blocks all text and preview thumbnails
  • max-snippet, max-image-preview, and max-video-preview cap preview length or size

Unlike those page-wide controls, data-nosnippet targets specific sections for finer control.
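To make the contrast concrete, here is how a page-level directive and the section-level attribute might sit side by side. The element and copy below are hypothetical examples, not taken from Bing’s announcement:

   <!-- Page-level: caps the snippet length for the entire page -->
   <meta name="robots" content="max-snippet:100">

   <!-- Section-level: only this element is excluded from previews -->
   <p data-nosnippet>Internal pricing notes that should not appear in snippets.</p>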

Why This Matters

If you run a subscription site, you can keep subscriber-only passages out of previews without sacrificing indexation.

For pages with user-generated content, you can prevent comments or reviews from appearing in AI summaries while leaving your editorial copy visible.

In short, it lets websites exclude specific sections from search snippets and maintain ranking potential.

Looking Ahead

The attribute is available now. Consider adding it to pages where preview control matters most, then confirm behavior in Bing Webmaster Tools.


Featured Image: T. Schneider/Shutterstock

Google Says It Surfaces More Video, Forums, And UGC via @sejournal, @MattGSouthern

Google says it has adjusted rankings to surface more short-form video, forums, and user-generated content in response to how people search.

Liz Reid, VP and head of Google Search, discussed the changes in a Wall Street Journal Bold Names podcast interview.

What Reid Said

Reid described a shift in where people go for certain questions, especially among younger users:

“There’s a behavioral shift that is happening in conjunction with the move to AI, and that is a shift of who people are going to for a set of questions. And they are going to short-form video, they are going to forums, they are going to user-generated content a lot more than traditional sites.”

She added:

“We do have to respond to who users want to hear from. We are in the business of both giving them high quality information but information that they seek out. And so we have over time adjusted our ranking to surface more of this content in response to what we’ve heard from users.”

To illustrate the behavior change, she gave a lifestyle example:

“Where are you getting your cooking? Are you getting your cooking recipes from a newspaper? Are you getting your cooking recipes from YouTube?”

Reid also highlighted a pattern with search updates:

“One of the things that’s always true about Google Search is that you make changes and there are winners and losers. That’s true on any ranking update.”

Ads And Query Mix

Reid said the impact of AI Overviews on ads is offset by people running more searches overall:

“The revenue with AI Overviews has been relatively stable… some queries may get less clicks on ads, but also it grows overall queries so people do more searches. And so those two things end up balancing out.”

She noted most queries have no ads:

“Most queries don’t have any ads at all… that query is sort of unaffected by ads.”

Reid also described how lowering friction (e.g., Lens, multi-page answers via AI Overviews) increases total searches.

Attribution & Personalization

Reid highlighted work on link prominence and loyal-reader connections:

“We’ve started doing more with inline links that allows you to say according to so-and-so with a big link for whoever the so-and-so is… building both the brand, as well as the click through.”

Quality Signals & Low-Value Content

On quality and spam posture:

“We’ve… expanded beyond this concept of spam to sort of low-value content.”

She said richer, deeper material tends to drive the clicks from AI experiences.

How Google Tests Changes

Asked whether there is a “push” as well as a “pull,” Reid described the evaluate-and-learn loop:

“You take feedback from what you hear from research about what users want, you then test it out, and then you see how users actually act. And then based on how users act, the system then starts to learn and adjust as well.”

Why This Matters

In certain cases, your pages may face increased competition from forum threads and short videos.

That means improvements in quality and technical SEO alone might not fully account for traffic fluctuations if the distribution of formats has changed.

If hit by a Google update, teams should examine where visibility decreases and identify which query types are impacted. From there, determine if competing results have shifted to forum threads or short videos.

Open Questions

Reid didn’t provide timing for when the adjustments began or metrics indicating how much weighting changed.

It’s unclear which categories are most affected or whether the impact will expand further.

Looking Ahead

Reid’s comments confirm that Google has adjusted ranking to reflect evolving user behavior.

Given this, it makes sense to consider creating complementary formats like short videos while continuing to invest in in-depth expertise where traditional pages still win.


Featured Image: Michael Vi/Shutterstock

Google’s John Mueller Flags SEO Issues In Vibe Coded Website via @sejournal, @MattGSouthern

Google Search Advocate John Mueller provided detailed technical SEO feedback to a developer on Reddit who vibe-coded a website in two days and launched it on Product Hunt.

The developer posted in r/vibecoding that they built a Bento Grid Generator for personal use, published it on Product Hunt, and received over 90 upvotes within two hours.

Mueller responded with specific technical issues affecting the site’s search visibility.

Mueller wrote:

“I love seeing vibe-coded sites, it’s cool to see new folks make useful & self-contained things for the web, I hope it works for you.

This is just a handful of the things I noticed here. I’ve seen similar things across many vibe-coded sites, so perhaps this is useful for others too.”

Mueller’s Technical Feedback

Mueller identified multiple issues with the site.

The homepage stores key content in a JavaScript file and an llms.txt file. Mueller noted that Google doesn’t use llms.txt, and he’s not aware of other search engines using it either.

Mueller wrote:

“Generally speaking, your homepage should have everything that people and bots need to understand what your site is about, what the value of your service / app / site is.”

He recommended adding a popup welcome div in the HTML that includes this information, making it immediately available to bots.

For meta tags, Mueller said the site only needs title and description tags. The keywords, author, and robots meta tags provide no SEO benefit.

The site includes hreflang tags despite having just one language version. Mueller said these aren’t necessary for single-language sites.
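Putting those points together, a minimal head for a single-language, single-page site along the lines Mueller describes might look like this sketch (the description text is a placeholder):

   <head>
     <meta charset="utf-8">
     <title>Bento Grid Generator</title>
     <meta name="description" content="Build bento-style grid layouts in your browser.">
     <!-- No keywords, author, or robots meta tags, and no hreflang
          for a single-language site -->
   </head>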

Mueller flagged the JSON-LD structured data as ineffective, noting:

“Check out Google’s ‘Structured data markup that Google Search supports’ for the types supported by Google. I don’t think anyone else supports your structured data.”
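For comparison, a sketch using a type from Google’s supported list, such as SoftwareApplication, might look like the following. All property values are illustrative, and Google’s documentation lists further requirements (such as rating or review data) for rich result eligibility:

   <script type="application/ld+json">
   {
     "@context": "https://schema.org",
     "@type": "SoftwareApplication",
     "name": "Bento Grid Generator",
     "applicationCategory": "DesignApplication",
     "operatingSystem": "Web",
     "offers": { "@type": "Offer", "price": "0", "priceCurrency": "USD" }
   }
   </script>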

He called the hidden h1 and h2 tags “cheap & useless.” Mueller suggested using a visible, dismissable banner in the HTML instead.

The robots.txt file contains unnecessary directives. Mueller recommended skipping the sitemap if it’s just one page.
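For a single-page site, a robots.txt pared down along those lines could be as short as the classic allow-everything form, with no sitemap line at all:

   User-agent: *
   Disallow: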

Mueller suggested adding the domain to Search Console and making it easier for visitors to understand what the app or site does.

Setting Expectations

Mueller closed his feedback with realistic expectations about the impact of technical SEO fixes.

He said:

“Will you automatically get tons of traffic from just doing these things? No, definitely not. However, it makes it easier for search engines to understand your site, so that they could be sending you traffic from search.”

He noted that implementing these changes now sets you up for success later.

Mueller added:

“Doing these things sets you up well, so that you can focus more on the content & functionality, without needing to rework everything later on.”

The Vibe Coding Trade-Off

This exchange highlights a tension with vibe coding and search visibility.

The developer built a functional product that generated immediate user engagement. The site works, looks polished, and achieved success on Product Hunt within hours.

None of the flagged issues affects user experience. But every implementation choice Mueller criticized shares the same characteristic: it works for visitors while providing nothing to search engines.

Sites built for rapid launch can achieve product success without search visibility. But the technical debt adds up.

The fixes aren’t too challenging, but they require addressing issues that seemed fine when the goal was to ship fast rather than rank well.


Featured Image: Panchenko Vladimir/Shutterstock

Google Answers What To Do For AEO/GEO via @sejournal, @martinibuster

Google’s VP of Product, Robby Stein, recently answered the question of what people should think about in terms of AEO/GEO. He provided a multi-part answer that began with how Google’s AI creates answers and ended with guidance on what creators should consider.

Foundations Of Google AI Search

The question asked was about AEO/GEO, which the podcast host characterized as the evolution of SEO. Robby Stein’s answer suggested thinking about the context of AI answers.

This is the question that was asked:

“What’s your take on this whole rise of AEO, GEO, which is kind of this evolution of SEO?

I’m guessing your answer is going to be just create awesome stuff and don’t worry about it, but you know, there’s a whole skill of getting to show up in these answers. Thoughts on what people should be thinking about here?”

Stein began his answer describing the foundations of how Google’s AI search works:

“Sure. I mean, I can give you a little bit of under the hood, like how this stuff works, because I do think that helps people understand what to do.

When our AI constructs a response, it’s actually trying to, it does something called query fan-out, where the model uses Google search as a tool to do other querying.

So maybe you’re asking about specific shoes. It’ll add and append all of these other queries, like maybe dozens of queries, and start searching basically in the background. And it’ll make requests to our data kind of backend. So if it needs real-time information, it’ll go do that.

And so at the end of the day, actually something’s searching. It’s not a person, but there’s searches happening.”

Robby Stein shows that Google’s AI still relies on conventional search engine retrieval, it’s just scaled and automated. The system performs dozens of background searches and evaluates the same quality signals that guide ordinary search rankings.

That means that “answer engine optimization” is basically the same as SEO because the underlying indexing, ranking and quality factors inherent to traditional SEO principles still apply to queries that the AI itself issues as part of the query fan-out process.
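As a rough mental model only, not Google’s actual implementation, the fan-out pattern resembles the sketch below. The function names and the search_index call are hypothetical stand-ins:

   # Conceptual sketch of query fan-out; all names are hypothetical,
   # not Google internals.
   def answer_with_fan_out(user_query, expand_queries, search_index, synthesize):
       # 1. Expand the question into many related sub-queries.
       sub_queries = expand_queries(user_query)
       # 2. Run each sub-query through conventional retrieval and ranking,
       #    so ordinary quality signals still decide what comes back.
       ranked_results = []
       for query in sub_queries:
           ranked_results.extend(search_index(query, top_k=10))
       # 3. Synthesize the answer from the retrieved, ranked documents.
       return synthesize(user_query, ranked_results)

The point of the sketch is that each background search in step 2 is an ordinary search, which is why traditional ranking and quality signals carry over.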

For SEOs, the insight is that visibility in AI answers depends less on gaming a new algorithm and more on producing content that satisfies intent so thoroughly that Google’s automated searches treat it as the best possible answer. As you’ll see later in this article, originality also plays a role.

Role Of Traditional Search Signals

An interesting part of this discussion is centered on the kinds of quality signals that Google describes in its Quality Raters Guidelines. Stein talks about originality of the content, for example.

Here’s what he said:

“And then each search is paired with content. So if for a given search, your webpage is designed to be extremely helpful.

And then you can look up Google’s human rater guidelines and read… what makes great information? This is something Google has studied more than anyone.

And it’s like:

  • Do you satisfy the user intent of what they’re trying to get?
  • Do you have sources?
  • Do you cite your information?
  • Is it original or is it repeating things that have been repeated 500 times?

And there’s these best practices that I think still do largely apply because it’s going to ultimately come down to an AI is doing research and finding information.

And a lot of the core signals, is this a good piece of information for the question, they’re still valid. They’re still extremely valid and extremely useful. And that will produce a response where you’re more likely to show up in those experiences now.”

Although Stein is describing AI Search results, his answer shows that Google’s AI Search still values the same underlying quality factors found in traditional search. Originality, source citations, and satisfying intent remain the foundation of what makes information “good” in Google’s view. AI has changed the interface of search and encouraged more complex queries, but the ranking factors continue to be the same recognizable signals related to expertise and authoritativeness.

More On How Google’s AI Search Works

The podcast host, Lenny, followed up with another question about how Google’s AI Search differs from a strictly chatbot approach.

He asked:

“It’s interesting your point about how it goes in searches. When you use it, it’s like searching a thousand pages or something like that. Is that just a different core mechanic to how other popular chatbots work, because the others don’t go search a bunch of websites as you’re asking?”

Stein answered with more details about how AI search works, going beyond query fan-out to identify factors used to surface what Google considers the best answers. For example, he mentions parametric memory. Parametric memory is the knowledge an AI has as part of its training; it’s essentially the knowledge stored within the model and not fetched from external sources.

Stein explained:

“Yeah, this is something that we’ve done uniquely for our AI. It obviously has the ability to use parametric memory and thinking and reasoning and all the things a model does.

But one of the things that makes it unique for designing it specifically for informational tasks, like we want it to be the best at informational needs. That’s what Google’s all about.

  • And so how does it find information?
  • How does it know if information is right?
  • How does it check its work?

These are all things that we built into the model. And so there is a unique access to Google. Obviously, it’s part of Google search.

So it’s Google search signals, everything from spam, like what’s content that could be spam and we don’t want to probably use in a response, all the way to, this is the most authoritative, helpful piece of information.

We’re going to link to it and we’re going to explain, hey, according to this website, check out that information and you’re going to probably go see that yourself.

So that’s how we’ve thought about designing this.”

Stein’s explanation makes it clear that Google’s AI Search is not designed to mimic the conversational style of general chatbots but to reinforce the company’s core goal of delivering trustworthy information that’s authoritative and helpful.

Google’s AI Search does this by relying on signals from Google Search, such as spam detection and helpfulness; these signals ground its AI-generated answers in the same evaluation and ranking framework inherent in regular search ranking.

This approach positions AI Search as less a standalone version of search and more like an extension of Google’s information-retrieval infrastructure, where reasoning and ranking work together to surface factually accurate answers.

Advice For Creators

Stein at one point acknowledges that creators want to know what to do for AI Search. His advice, essentially, is to think about the questions people are asking. In the old days, that meant thinking about which keywords searchers were using. He explains that’s no longer the case because people now use long conversational queries.

He explained:

“I think the only thing I would give advice to would be, think about what people are using AI for.

I mentioned this as an expansionary moment, …that people are asking a lot more questions now, particularly around things like advice or how to, or more complex needs versus maybe more simple things.

And so if I were a creator, I would be thinking, what kind of content is someone using AI for? And then how could my content be the best for that given set of needs now?
And I think that’s a really tangible way of thinking about it.”

Stein’s advice doesn’t add anything new but it does reframe the basics of SEO for the AI Search era. Instead of optimizing for isolated keywords, creators should consider anticipating the fuller intent and informational journey inherent in conversational questions. That means structuring content to directly satisfy complex informational needs, especially “how to” or advice-driven queries that users increasingly pose to AI systems rather than traditional keyword search.

Takeaways

  • AI Search Is Still Built on Traditional SEO Signals
    Google’s AI Search relies on the same core ranking principles as traditional search—intent satisfaction, originality, and citation of sources.
  • How Query Fan-Out Works
    AI Search issues dozens of background searches per query, using Google Search as a tool to fetch real-time data and evaluate quality signals.
  • Integration of Parametric Memory and Search Signals
    The model blends stored knowledge (parametric memory) with live Google Search data, combining reasoning with ranking systems to ensure factual accuracy.
  • Google’s AI Search Is Like An Extension of Traditional Search
    AI Search isn’t a chatbot; it’s a search-based reasoning system that reinforces Google’s informational trust model rather than replacing it.
  • Guidance for Creators in the AI Search Era
    Optimizing for AI means understanding user intent behind long, conversational queries—focusing on advice- and how-to-style content that directly satisfies complex informational needs.

Google’s AI Search builds on the same foundations that have long defined traditional search, using retrieval, ranking, and quality signals to surface information that demonstrates originality and trustworthiness. By combining live search signals with the model’s own stored knowledge, Google has created a system that explains information and cites the websites that provided it. For creators, this means that success now depends on producing content that fully addresses the complex, conversational questions people bring to AI systems.

Watch the podcast segment starting at about the 15:30 minute mark:

Featured Image by Shutterstock/PST Vector

WPBakery WordPress Vulnerability Lets Attackers Inject Malicious Code via @sejournal, @martinibuster

An advisory was issued for the popular WPBakery plugin that’s bundled in thousands of WordPress themes. The vulnerability enables authenticated attackers to inject malicious scripts that execute when someone visits an affected page.

WPBakery Plugin

WPBakery is a drag-and-drop page builder plugin for WordPress that enables users to create custom layouts and websites without writing code. WPBakery is frequently bundled with premium themes; theme developers license it to bring drag-and-drop page-building functionality to their WordPress themes.

WPBakery Vulnerability

The WPBakery Page Builder WordPress plugin was discovered to have insufficient input sanitization and output escaping in its Custom JS module.

Insufficient input sanitization and output escaping are flaws that enable attackers to upload malicious code into a website and cause the affected site to output malicious code. In general, this can lead to vulnerabilities such as Cross-Site Scripting (XSS) and SQL Injection.

  • Input Sanitization filters user-submitted data before it is stored or processed by the plugin.
  • Output Escaping converts characters with special HTML meaning into safe output before it is displayed on a web page. This prevents executable code from being rendered on a live web page and affecting users. (A generic sketch of both defenses follows this list.)
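To make the distinction concrete, here is a generic sketch in Python; it is not WordPress code, which would use the platform’s own sanitization and escaping functions:

   import html
   import re

   def sanitize_input(raw):
       # Input sanitization: filter user-submitted data before it is
       # stored, e.g. strip script blocks from a comment field.
       return re.sub(r"(?is)<script.*?>.*?</script>", "", raw)

   def escape_output(stored):
       # Output escaping: convert characters with special HTML meaning
       # (<, >, &, quotes) so the browser displays them as text instead
       # of executing them.
       return html.escape(stored)

   payload = '<script>alert("XSS")</script>Nice post!'
   print(sanitize_input(payload))  # Nice post!
   print(escape_output(payload))
   # &lt;script&gt;alert(&quot;XSS&quot;)&lt;/script&gt;Nice post!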

This flaw enables attackers with contributor-level access or higher to inject arbitrary scripts into affected websites. The vulnerability affects WPBakery plugin versions up to and including version 8.6.1.

Users of the plugin are encouraged to update to the latest version of WPBakery, which is currently version 8.7.

Featured Image by Shutterstock/3d artwork wallpaper

Google Redesigns How Search Ads Are Labeled via @sejournal, @brookeosmundson

Google is rolling out a change to how ads appear in Search, and this time it’s focused on clarity and user control.

Text ads will now be grouped under a single “Sponsored results” label that stays visible as you scroll. In addition, a new “Hide sponsored results” option lets users collapse the entire ad block with one click.

This update doesn’t change how ads are served or ranked, but it does change how they’re presented to users. Even small interface updates can influence how people interact with search results, so advertisers should pay attention to how this evolves over time.

A Look at the New Sponsored Label on Google Search

Previously, each text ad showed a small “Sponsored” label at the top of each ad.

Now, Google is grouping all text ads together with a single header that clearly signals where the sponsored section begins and ends. That label remains visible even if the user scrolls down the page.

While doing a Search in the wild, the new format appeared, even with just one ad:

New “Sponsored results” layout on a Google Search result. Screenshot taken by author.

Google is also extending this approach to other formats. For example, Shopping placements will use a “Sponsored products” label.

On results that include AI Overviews, the sponsored section can appear above or below the AI-generated content, but it will still follow the same grouping and labeling format.

The most noticeable addition is the ability to collapse all sponsored results. Not every user will hide the section, but the option itself introduces a new behavior that didn’t exist before.

Google noted that these updates are rolling out globally to users on both desktop and mobile.

Why This Matters to Advertisers

From a performance perspective, the underlying mechanics are unchanged. Bidding, Quality Score, ranking, and the maximum number of ads (up to four in a block) all remain the same.

That said, grouping ads together can influence how users perceive them.

When ads are visually separated from organic listings, the difference between the two becomes more intentional.

Users who skim results may pause and decide whether to interact with the sponsored block at all. For lower-intent searches, this could result in fewer casual clicks. For higher-intent queries, the impact may be minimal.

This puts more pressure on the quality of the ad itself. Clear value propositions, relevant messaging, and strong alignment with search intent will matter even more.

Ranking at the top will still be valuable, but visibility alone won’t guarantee engagement if users are more aware of what they’re clicking.

While the update is primarily visual, advertisers should keep an eye on performance once it fully rolls out across mobile and desktop. A few areas to watch include:

  • Changes in CTR or Impression-to-Click patterns
  • Differences in engagement based on query intent
  • Any vertical-specific impact where users are more likely to hide ads

Early shifts may be small, but trends could emerge over time as users adjust to the new layout.

Why Did Google Make This Change?

Google notes that these changes were driven by user testing and feedback. The goal is to create a more consistent and transparent experience across all ad formats. It also reflects increasing expectations around clarity in search results as AI-generated content becomes more common.

By making it easier to recognize sponsored content, Google is signaling that paid placements can be both visible and trustworthy, as long as they’re clearly labeled.

This approach may help maintain long-term confidence in search results as the interface continues to evolve.

Moving Towards a More Transparent SERP

Google’s update reinforces a larger shift: how ads appear on the page is becoming just as important as where they appear.

The auction logic and placement limits remain the same, but the experience around ads is becoming more clearly defined for users.

As presentation evolves, it’s reasonable to expect user behavior to follow. Some people will ignore the change. Others may start to be more selective about when they engage with ads.

This puts more weight on relevance, clarity, and value in the message itself.

Advertisers don’t need to overhaul their campaign structure or bidding strategy because of this change. Instead, the focus should be on tightening creative quality, aligning closely with intent, and paying attention to early performance shifts.

Even if the impact is subtle at first, updates like this often lead to gradual behavior changes over time.

Search has always been a balance between visibility and trust. Advertisers who adapt early and continue to prioritize useful, high-quality messaging will be in the best position to maintain performance as the SERP continues to evolve.

Google Explains Next Generation Of AI Search via @sejournal, @martinibuster

Google’s Robby Stein, VP of Product at Google, explained that Google Search is converging with AI in a new manner that builds on three pillars of AI. The implications for online publishers, SEOs, and eCommerce stores are profound.

Three Pillars Of AI Search

Google’s Stein said that there are three essential components to the “next generation” of Google Search:

  1. AI Overviews
  2. Multimodal search
  3. AI Mode

AI Overviews is natural language search. Multimodal search covers new ways of searching with images, enabled by Google Lens. AI Mode harnesses web content and structured knowledge to provide a conversational, turn-based way of discovering information and learning. Stein indicates that all three of these components will converge as the next step in the evolution of search, and that step is coming.

Stein explained:

“I can tell you there’s kind of three big components to how we can think about AI search and kind of the next generation of search experiences. One is obviously AI overviews, which are the quick and fast AI you get at the top of the page many people have seen. And that’s obviously been something growing very, very quickly. This is when you ask a natural question, you put it into Google, you get this AI now. It’s really helpful for people.

The second is around multimodal. This is visual search and lens. That’s the other big piece. You go to the camera in the Google app, and that’s seeing a bunch of growth.

And then with AI mode, it brings it all together. It creates an end-to-end frontier search experience on state-of-the-art models to really truly let you ask anything of Google search.”

AI Mode Triggered By Complex Queries

Screenshot showing how a complex two-sentence query automatically triggers an AI Mode preview.

The above screenshot shows a complex two-sentence search query entered into Google’s search box. The complex query automatically triggers an AI Mode preview with a “Show more” link that leads to an immersive AI Mode conversational search experience. Publishers who wish to be cited need to think about how their content will fit into this kind of context.

Next Generation Of Google: AI Mode Is Like A Brain

Stein described the next frontier of search as something radically different from what we know as Google Search. Many SEOs still think of search as a ranking paradigm with ten blue links, yet that paradigm hasn’t quite existed since Google debuted Featured Snippets back in 2014. For eleven years, the concept of ten blue links has been out of step with the reality of Google’s search results.

What Stein goes on to describe does away with the concept of ten blue links entirely, replacing it with the concept of a brain that users can ask questions of and interact with. SEOs, merchants, and other publishers need to set aside the mental model of ten blue links and focus on surfacing content within an interactive natural language environment that sits outside the traditional results page.

Stein explained this new concept of a brain in the context of AI Mode:

“You can go back and forth. You can have a conversation. And it taps into and is specially designed for search. So what does that mean? One of the cool things that I think it does is it’s able to understand all of this incredibly rich information that’s within Google.

  • So there’s 50 billion products in the Google Shopping Graph, for instance. They’re updated 2 billion times an hour by merchants with live prices.
  • You have 250 million places and maps.
  • You have all of the finance information.
  • And not to mention, you have the entire context of the web and how to connect to it so that you can get context, but then go deeper.

And you put all of that into this brain that is effectively this way to talk to Google and get at this knowledge.

That’s really what you can do now. So you can ask anything on your mind and it’ll use all of this information to hopefully give you super high quality and informed information as best as we can.”

Stein’s description shows that Google’s long-term direction is to move beyond retrieval toward an interactive turn-based mode of information discovery. The “brain” metaphor signals that search will increasingly be less about locating web pages and more about generating informed responses built from Google’s own structured data, knowledge graphs, and web content. This represents a fundamental change, and as you’ll see in the following paragraphs, it is happening right now.

AI Mode Integrates Everything

Stein describes how Google is increasingly triggering AI Mode as the next evolution of how users find answers to questions and discover information about the world immediately around them. This goes beyond asking “what’s the best kayak” and becomes more of a natural language conversation, an information journey that can encompass images, videos, and text, just like in real life. It’s an integrated experience that goes way beyond a simple search box and ten links.

Stein provided more information of what this will look like:

“And you can use it directly at this google.com/ai, but it’s also been integrated into our core experiences, too. So we announced you can get to it really easily. You can ask follow-up questions of AI overviews right into AI mode now.

Same for the lens stuff, take a picture, takes it to AI mode. So you can ask follow-up questions and go there, too. So it’s increasingly an integrated experience into the core part of the product.”

How AI Will Converge Into One Interface

At this point the host of the podcast asked for a clearer explanation of how all of these things will be integrated.

He asked:

“I imagine much of this is… wait and see how people use it. But what’s the vision of how all these things connect?

Is the idea to continue having this AI mode on the side, AI overviews at the top, and then this multimodal experience? Or is there a vision of somehow pushing these together even more over time?”

Stein answered that all of these modes of information discovery will converge together. Google will be able to detect by the query whether to trigger AI Mode or just a simple search. There won’t be different interfaces, just the one.

Stein explained:

“I think there’s an opportunity for these to come closer together. I think that’s what AI Mode represents, at least for the core AI experiences. But I think of them as very complementary to the core search product.

And so you should be able to not have to think about where you’re asking a question. Ultimately, you just go to Google.

And today, if you put in whatever you want, we’re actually starting to use much of the power behind AI mode, right in AI Overviews. So you can just ask really hard, you could put a five-sentence question right into Google search.

You can try it. And then it should trigger AI at the top, it’s a preview. And then you can go deeper into AI mode and have this back and forth. So that’s how these things connect.

Same for your camera. So if you take a picture of something, like, what’s this plant? Or how do I buy these shoes? It should take you to an AI little preview. And then if you go deeper, again, it’s powered by AI mode. You can have that back and forth.

So you shouldn’t have to think about that. It should feel like a consistent, simple product experience, ultimately. But obviously, this is a new thing for us. And so we wanted to start it in a way that people could use and give us feedback with something like a direct entry point, like google.com/AI.”

Stein’s answer shows that Google is moving from separate AI features toward one unified search system that interprets intent and context automatically.

  • For users, that means typing, speaking, or taking a picture will all connect to the same underlying process that decides how to respond.
  • For publishers and SEOs, it means visibility will depend less on optimizing for keywords and more on aligning content with how Google understands and responds to different kinds of questions.

How Content Can Fit Into AI Triggered Search Experiences

Google is transitioning users out of the traditional ten blue links paradigm into a blended AI experience. Users can already enter questions consisting of multiple sentences, and Google will automatically transition into an AI Mode question-and-answer experience. The answer is a preview, with an option to trigger a deeper back-and-forth conversation.

Robby Stein indicated that the AI Search experience will converge even more, depending on user feedback and how people interact with it.

These are profound changes that demand publishers ask deep questions about their content:

  • Should you consider how curating unique images, useful video content, and step-by-step tutorials may fit into your content strategies?
  • Information discovery is increasingly conversational; does your content fit into that context?
  • Information discovery may increasingly include camera snapshots; will your content fit into that kind of search?

These are examples of the kinds of questions publishers, SEOs and store owners should be thinking about.

Watch the podcast interview with Robby Stein

Inside Google’s AI turnaround: AI Mode, AI Overviews, and vision for AI-powered search | Robby Stein

Featured image/Screenshot of Lenny’s Podcast video

Google Adds AI Previews To Discover, Sports Feed Coming via @sejournal, @MattGSouthern

Google rolled out AI trending previews in Discover and will add a “What’s new” sports feed to U.S. mobile search in coming weeks.

  • AI trending previews in Discover are live in the U.S., South Korea, and India.
  • A sports “What’s new” button will begin rolling out in the U.S. in the coming weeks.
  • Both experiences show brief previews with links to publisher/creator content.

YouTube Upgrades Shorts Editor With Timeline View via @sejournal, @MattGSouthern

YouTube is launching an all-new timeline editor for Shorts that puts video clips, overlays, and audio in one unified view.

The update responds to creator requests for more precise controls inside the YouTube app.

What’s New

The editor adds timeline-level controls you can use without switching modes. You can trim and reorder clips via drag-and-drop and use zoom for precise timing and transitions.

YouTube states in a video announcement:

“When you’re creating a short, you’ve told us how critical it is to be able to make all of your edits to a video in one place to be able to really make something you’re proud of. We appreciate the feedback and we’ve been listening. So, we’re launching an all new Shorts timeline editor. Now, everything is visible in one place: all of your video clips, overlays, and audio. You can trim, reorder the clips, simple drag and drop. You can zoom in to make precise edits.”

See it in action in the video below:

Planned Additions

YouTube says it plans to add slip editing, clip splitting, and the ability to add media directly from the timeline. No release window was provided in the video.

YouTube adds:

“And this is just the beginning. We’re making a lot more key improvements, for example, like being able to do slip editing, being able to do splitting, and adding media directly from timeline.”

YouTube also said it will continue expanding Edit with AI, the Gemini-assisted tool that can assemble a first-draft edit with music, transitions, and voiceover.

While integration with the new timeline editor is planned, YouTube didn’t share timing or regional availability.

YouTube said:

“We also plan to continue to expand Edit with AI… to make it even easier for you to craft your vision with Gemini Assistance.”

Why This Matters

A unified timeline brings more of the desktop editing experience into the YouTube app.

The single view reduces mode switching, and the upcoming slip and split controls should improve pacing and transitions without relying on third-party editors.

As YouTube put it:

“The goal is really simple. More creative freedom with less friction. We’re really committed to building easy to use creation tools that help you be your most creative self and make something you’re really proud of.”

Looking Ahead

YouTube didn’t specify when the timeline editor will reach all creators or which regions will get it first.

These updates fit into YouTube’s broader push to enhance Shorts creation tools, following earlier improvements like beat syncing, templates, and AI-generated stickers.


Featured Image: FotoField/Shutterstock