5 SEO Tactics to Be Seen & Trusted on AI Search [Webinar] via @sejournal, @duchessjenm

Is your brand ready for AI-driven SERPs?

Search is evolving faster than ever. AI-driven engines like ChatGPT, Google SGE, and Bing Copilot are changing how users discover and trust brands. Traditional SEO tactics alone may no longer guarantee visibility or authority in Answer Engines.

Discover five proven tactics to protect your SERP presence and maintain trust in AI search.

What You’ll Learn

Craig Smith, Chief Strategy Officer at Outerbox, will show exactly how to adapt your SEO strategy for generative search and answer engines. 

You’ll walk away with actionable steps to:

Register now to get the SEO playbook your competitors wish they had.

Why You Can’t Miss This Webinar

AI Overviews are already impacting traffic. Brands that adapt now will dominate visibility and authority while others fall behind.

🛑 Can’t attend live? Register anyway and we’ll send you the recording so you can watch at your convenience.

WPBakery WordPress Vulnerability Lets Attackers Inject Malicious Code via @sejournal, @martinibuster

An advisory was issued for the popular WPBakery plugin that’s bundled in thousands of WordPress themes. The vulnerability enables authenticated attackers to inject malicious scripts that execute when someone visits an affected page.

WPBakery Plugin

WPBakery is a drag-and-drop page builder plugin for WordPress that enables users to easily create custom layouts and websites without writing code. WPBakery is frequently bundled with premium themes; theme developers license it to bring drag-and-drop page-building functionality to their WordPress themes.

WPBakery Vulnerability

The WPBakery Page Builder WordPress plugin was discovered to have insufficient input sanitization and output escaping in its Custom JS module.

Insufficient input sanitization and output escaping are flaws that enable attackers to upload malicious code into a website and cause the affected site to output malicious code. In general, this can lead to vulnerabilities such as Cross-Site Scripting (XSS) and SQL Injection.

  • Input Sanitization filters uploaded user data before it is stored or processed by the plugin.
  • Output Escaping converts characters that have HTML meanings into safe output before it is displayed on a web page. This prevents executable code from outputting onto a live web page and affecting users.
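To make the distinction concrete, here is a minimal sketch of output escaping in Python (an illustration of the general technique only, not WPBakery's actual code, which is PHP-based):

```python
import html

def render_comment(user_input: str) -> str:
    # Output escaping: convert characters with HTML meaning
    # (<, >, &, quotes) into safe entities before display,
    # so user-supplied text cannot execute as script.
    return "<p>" + html.escape(user_input) + "</p>"

# A script-injection attempt is rendered inert:
payload = '<script>alert("xss")</script>'
print(render_comment(payload))
# → <p>&lt;script&gt;alert(&quot;xss&quot;)&lt;/script&gt;</p>
```

Without the `html.escape` call, the `<script>` tag would be written into the page verbatim and run in visitors' browsers, which is exactly the class of stored XSS described above.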

This flaw enables attackers with contributor-level access or higher to inject arbitrary scripts into affected websites. The vulnerability affects WPBakery plugin versions up to and including version 8.6.1.

Users of the plugin are encouraged to update to the latest version of WPBakery, which is currently version 8.7.

Featured Image by Shutterstock/3d artwork wallpaper

Google Redesigns How Search Ads Are Labeled via @sejournal, @brookeosmundson

Google is rolling out a change to how ads appear in Search, and this time it’s focused on clarity and user control.

Text ads will now be grouped under a single “Sponsored results” label that stays visible as you scroll. In addition, a new “Hide sponsored results” option lets users collapse the entire ad block with one click.

This update doesn’t change how ads are served or ranked, but it does change how they’re presented to users. Even small interface updates can influence how people interact with search results, so advertisers should pay attention to how this evolves over time.

A Look at the New Sponsored Label on Google Search

Previously, each text ad showed a small “Sponsored” label at the top of each ad.

Now, Google is grouping all text ads together with a single header that clearly signals where the sponsored section begins and ends. That label remains visible even if the user scrolls down the page.

The new format has already appeared in searches in the wild, even with just one ad:

New “Sponsored results” layout on a Google search result. Screenshot taken by author.

Google is also extending this approach to other formats. For example, Shopping placements will use a “Sponsored products” label.

On results that include AI Overviews, the sponsored section can appear above or below the AI-generated content, but it will still follow the same grouping and labeling format.

The most noticeable addition is the ability to collapse all sponsored results. Not every user will hide the section, but the option itself introduces a new behavior that didn’t exist before.

Google noted that these updates are rolling out globally to users on both desktop and mobile.

Why This Matters to Advertisers

From a performance perspective, the underlying mechanics are unchanged. Bidding, Quality Score, ranking, and the maximum number of ads (up to four in a block) all remain the same.

That said, grouping ads together can influence how users perceive them.

When ads are visually separated from organic listings, the difference between the two becomes more intentional.

Users who skim results may pause and decide whether to interact with the sponsored block at all. For lower-intent searches, this could result in fewer casual clicks. For higher-intent queries, the impact may be minimal.

This puts more pressure on the quality of the ad itself. Clear value propositions, relevant messaging, and strong alignment with search intent will matter even more.

Ranking at the top will still be valuable, but visibility alone won’t guarantee engagement if users are more aware of what they’re clicking.

While the update is primarily visual, advertisers should keep an eye on performance once it fully rolls out across mobile and desktop. A few areas to watch include:

  • Changes in CTR or Impression-to-Click patterns
  • Differences in engagement based on query intent
  • Any vertical-specific impact where users are more likely to hide ads

Early shifts may be small, but trends could emerge over time as users adjust to the new layout.

Why Did Google Make This Change?

Google notes that these changes were driven by user testing and feedback. The goal is to create a more consistent and transparent experience across all ad formats. It also reflects increasing expectations around clarity in search results as AI-generated content becomes more common.

By making it easier to recognize sponsored content, Google is signaling that paid placements can be both visible and trustworthy, as long as they’re clearly labeled.

This approach may help maintain long-term confidence in search results as the interface continues to evolve.

Moving Towards a More Transparent SERP

Google’s update reinforces a larger shift: how ads appear on the page is becoming just as important as where they appear.

The auction logic and placement limits remain the same, but the experience around ads is becoming more clearly defined for users.

As presentation evolves, it’s reasonable to expect user behavior to follow. Some people will ignore the change. Others may start to be more selective about when they engage with ads.

This puts more weight on relevance, clarity, and value in the message itself.

Advertisers don’t need to overhaul their campaign structure or bidding strategy because of this change. Instead, the focus should be on tightening creative quality, aligning closely with intent, and paying attention to early performance shifts.

Even if the impact is subtle at first, updates like this often lead to gradual behavior changes over time.

Search has always been a balance between visibility and trust. Advertisers who adapt early and continue to prioritize useful, high-quality messaging will be in the best position to maintain performance as the SERP continues to evolve.

Get Your Ad Campaigns Ready Before Black Friday via @sejournal, @brookeosmundson

For most PPC marketers, the weeks leading up to Black Friday aren’t just busy work. They’re loaded with decisions, deadlines, and last-minute requests.

If you’re managing Google Ads, Microsoft Ads, or any ad platform in between, this time of year can either be a strong finish or a missed opportunity.

The difference usually comes down to planning.

If you’re looking to approach Black Friday with a more structured and thoughtful strategy, keep reading. This article focuses on what you can control (like budgets, campaign builds, and feed readiness) and includes specific examples across platforms to help you avoid common pitfalls.

Let’s start with what to revisit from last year.

Take The Time To Audit Last Year’s Wins And Pitfalls

Before building anything new, it’s worth taking a closer look at last year’s performance.

The strategy here isn’t about copying old campaigns; it’s about understanding where they overdelivered, where they stalled out, and how the landscape might have changed since then.

In Google Ads, start with the attribution reports. Look beyond just last-click conversions and examine how various campaign types contributed throughout the funnel.

If Performance Max campaigns played more of an assist role, that should inform how you structure them this year.

If Standard Shopping capped out early or certain product categories were underrepresented, those are fixable issues.

You can also use auction insights to see when competitors ramped up spend, or whether you lost impression share due to budget or rank. These reports offer useful context if you’re planning to scale this year but didn’t last year.

If you’re using Microsoft Ads, review audience and device performance to see where volume shifted.

Holiday behavior isn’t always the same across platforms. What worked well on Google may not have translated to Bing or Meta, and vice versa.

The goal is to identify specific opportunities, not just assume last year’s playbook will hold up.

Build Early, Even If You’re Not Launching Yet

There’s value in building out your campaigns well in advance of Black Friday, even if you don’t plan to activate them until closer to the sale.

Whether you’re launching new campaigns or just updating ads in existing ones, getting ahead on structure gives you time to QA creative, troubleshoot disapprovals, and coordinate across teams.

If you’re planning to reuse existing campaigns, you can still stay organized using labels. For example:

  • Apply labels to new Responsive Search Ads (RSAs) that include holiday-specific copy or promotions.
  • Label sitelinks, callouts, or promo assets that reference Black Friday offers.
  • Tag ad groups or asset groups that are tied to limited-time sale messaging.

Using a clear naming convention makes it easier to filter, review, and schedule changes across campaigns without confusion.

If you want to automate this even further, you can create automated rules based on labels.

For example, you can set a rule to enable all ads with your Black Friday label at 12:01 a.m. on November 28. You can also set up rules to pause those same ads at the end of the promotion, reducing the chance that outdated messaging stays live.

You’d also want to create an automated rule to pause all non-Black Friday ads at the same time. This ensures that only your promo ads run during the Black Friday season.

If you end up creating Black Friday-specific campaigns, you can easily set start and end dates on them to ensure they only run during the allotted time.

While you don’t have complete scheduling control at the ad or asset level across platforms, you can use a combination of labels, automated rules, or campaign/ad group start and end dates. These give you enough flexibility to manage most scenarios without scrambling the morning of your launch.

If you’re running Meta Ads, be sure to upload your Black Friday creative and audience setups well in advance. Platforms are slower to review and approve ads during peak periods, and early delivery data will help the algorithm optimize once you start increasing budgets.

Give Smart Bidding Better Direction

Most advertisers are using some sort of Smart Bidding for their campaigns, especially around Black Friday. That doesn’t mean you should take a hands-off approach, though.

If you’re using Google Ads, consider seasonality adjustments if you’re planning for a short-term sale or expect a sudden fluctuation in conversion rates. These adjustments tell Google to expect better-than-usual performance during a specific window, and can help avoid underspending during flash sales.

Seasonality adjustments are currently available for these campaign types that use either a Target ROAS or Target CPA bid strategy:

  • Search.
  • Shopping.
  • Display.

If you’re using seasonality adjustments for conversion rates, then you can choose between these campaign types:

  • Search.
  • Display.
  • Shopping.
  • Performance Max.
  • App (in beta).

That said, they’re not suited for every situation. If you’re running a longer sale or have limited historical volume, the adjustment could introduce more volatility than benefit.

For broader holiday performance, make sure your campaigns have enough data to support Smart Bidding decisions. Review the “Bid Strategy Report” and watch for signs of limited learning or constrained budgets.

Pushing into a critical promo window without stabilized bidding can lead to inefficient spend, especially with newer campaigns.

Check Your Product Feed Before It Becomes A Problem

It’s easy to focus on campaign settings and forget that your product feed is powering everything from standard Shopping campaigns to Performance Max. If it’s not accurate or timely, your best offers might not show up correctly.

In Google Merchant Center, navigate to the Diagnostics tab and resolve any disapprovals or mismatched pricing issues. These often spike around holidays when sale prices don’t sync correctly or out-of-stock products remain active.

Make sure your feed includes items like:

  • Up-to-date GTINs and product identifiers.
  • Attributes like ‘sale_price’ and ‘sale_price_effective_date’ for promotions.
  • High-quality images that meet platform guidelines.
  • Clear shipping and availability details.
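As a hedged illustration, promotional pricing attributes in an XML (RSS 2.0) Merchant Center feed item might look like the following; every value here is a placeholder, not a recommendation:

```xml
<item>
  <g:id>SKU-1234</g:id>
  <g:title>Example Wireless Headphones</g:title>
  <g:link>https://www.example.com/products/sku-1234</g:link>
  <g:image_link>https://www.example.com/images/sku-1234.jpg</g:image_link>
  <g:gtin>00012345678905</g:gtin>
  <g:availability>in_stock</g:availability>
  <g:price>129.99 USD</g:price>
  <!-- Sale pricing: shown only within the effective-date window -->
  <g:sale_price>89.99 USD</g:sale_price>
  <g:sale_price_effective_date>2025-11-28T00:00-0500/2025-12-01T23:59-0500</g:sale_price_effective_date>
</item>
```

Because `sale_price_effective_date` defines the promotion window, the sale price starts and stops on schedule even if nobody edits the feed on launch morning, which is one less thing to sync manually during BFCM.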

If you’re running Performance Max campaigns, review the Listing Groups report to ensure your most valuable products are getting served. Many advertisers find that certain SKUs get minimal impressions due to budget spread or structural issues.

This is also a good time to upload holiday-themed creative assets, including lifestyle images and product videos. These can improve performance in placements like YouTube and Discover, which tend to ramp during PMax campaigns in Q4.

The more you control the feed and asset side, the less you have to worry about automation making subpar choices when competition is highest.

Expect Things To Break, And Plan Around That

Black Friday campaigns don’t always go according to plan.

Promo pages fail to update. Budgets cap out early. Tracking drops off mid-day. It’s worth thinking through what could go wrong now, while you still have time to build a backup plan.

Start with some of the basics in campaign planning:

  • Double-check conversion actions in Google Ads and Google Analytics 4. Make sure no duplicate events are being counted, and key actions like purchases, add-to-cart, and email sign-ups are being tracked.
  • Test final URLs on mobile and desktop. If you’re using promo pages, confirm they’re live and loading quickly. A slow checkout experience during Black Friday Cyber Monday (BFCM) will almost always tank performance.
  • Pre-schedule creative updates where possible. You don’t want to be manually swapping sitelinks or headlines in the middle of a surge.
  • Double-check your automated rules. If you’re using rules to enable sale ads and pausing evergreen ads, make sure to have the platform(s) email you with any changes so you can confirm with confidence the right ads are being shown at the right time.
  • Set up alerts for unusual activities. If campaigns showcase a sudden ROAS drop, zero conversions, or unusual spend, you’ll want to be alerted in real-time. Even something as simple as a budget cap hitting before 10 a.m. can throw off the day if it goes unnoticed.

The more you can troubleshoot before launch week, the fewer fires you’ll need to put out when things are moving fast.

Don’t Shut Down Campaigns The Minute Cyber Monday Ends

It’s common for brands to ramp hard through Cyber Monday, then pause everything until January. But, many shoppers are still active well into December, especially those looking for last-minute gifts or deals that weren’t available earlier.

In my experience, Google Ads auction data shows that competition often dips after Cyber Monday while shopping intent doesn’t disappear. Conversion rates tend to stay steady through the first two weeks of December, particularly for brands with fast shipping or digital products.

Rather than winding down completely, consider updating your messaging to reflect the urgency. Swap out “Black Friday” language for “Still Time to Save” or “Guaranteed Delivery Before Christmas.” Countdown ads and shipping deadline assets work well here.

If you’re running remarketing campaigns, exclude recent purchasers and focus on users who visited key pages but didn’t convert. These audiences tend to convert at lower cost-per-acquisition (CPA) during post-Cyber sales, especially if you’ve got gift cards or bundled offers to promote.

December also gives you a chance to build audience pools for Q1. Visitors from BFCM campaigns can be remarketed to in January for loyalty or cross-sell efforts. Just make sure your campaign structure allows for clean audience segmentation.

Planning Ahead Is Still Your Best Defense

Black Friday doesn’t reward last-minute execution. It rewards structure, preparation, and proactive troubleshooting.

The platforms are going to do what they do. Smart Bidding will make the best decisions based on your inputs. Asset groups will mix and match in ways you can’t fully control.

But, what you can control, like budgets, tracking, and product feed health, still has a major impact on your campaign performance.

Getting your campaigns in order early gives you the breathing room to monitor performance, scale what’s working, and catch issues before they snowball.

And when something inevitably breaks or shifts unexpectedly, you’ll already have a plan in place.

Featured Image: Roman Samborskyi/Shutterstock

Google Explains Next Generation Of AI Search via @sejournal, @martinibuster

Robby Stein, Google’s VP of Product, explained that Google Search is converging with AI in a new way that builds on three pillars of AI. The implications for online publishers, SEOs, and ecommerce stores are profound.

Three Pillars Of AI Search

Google’s Stein said that there are three essential components to the “next generation” of Google Search:

  1. AI Overviews
  2. Multimodal search
  3. AI Mode

AI Overviews is natural language search. Multimodal search covers new ways of searching with images, enabled by Google Lens. AI Mode harnesses web content and structured knowledge to provide a conversational, turn-based way of discovering information and learning. Stein indicated that all three of these components will converge as the next step in the evolution of search.

Stein explained:

“I can tell you there’s kind of three big components to how we can think about AI search and kind of the next generation of search experiences. One is obviously AI overviews, which are the quick and fast AI you get at the top of the page many people have seen. And that’s obviously been something growing very, very quickly. This is when you ask a natural question, you put it into Google, you get this AI now. It’s really helpful for people.

The second is around multimodal. This is visual search and lens. That’s the other big piece. You go to the camera in the Google app, and that’s seeing a bunch of growth.

And then with AI mode, it brings it all together. It creates an end-to-end frontier search experience on state-of-the-art models to really truly let you ask anything of Google search.”

AI Mode Triggered By Complex Queries

Screenshot showing how a complex two-sentence query automatically triggers an AI Mode preview.

The above screenshot shows a complex two-sentence search query entered into Google’s search box. The complex query automatically triggers an AI Mode preview with a “Show more” link that leads to an immersive AI Mode conversational search experience. Publishers who wish to be cited need to think about how their content will fit into this kind of context.

Next Generation Of Google: AI Mode Is Like A Brain

Stein described the next frontier of search as something radically different from what we know as Google Search. Many SEOs still think of search as a ranking paradigm with ten blue links, but that paradigm hasn’t quite existed since Google debuted Featured Snippets back in 2014. For eleven years, the concept of ten blue links has been out of step with the reality of Google’s search results.

What Stein goes on to describe does away with the concept of ten blue links entirely, replacing it with the concept of a brain that users can ask questions of and interact with. SEOs, merchants, and other publishers need to begin retiring the mental model of ten blue links and focus on surfacing content within an interactive natural language environment that extends well beyond the traditional search box.

Stein explained this new concept of a brain in the context of AI Mode:

“You can go back and forth. You can have a conversation. And it taps into and is specially designed for search. So what does that mean? One of the cool things that I think it does is it’s able to understand all of this incredibly rich information that’s within Google.

  • So there’s 50 billion products in the Google Shopping Graph, for instance. They’re updated 2 billion times an hour by merchants with live prices.
  • You have 250 million places and maps.
  • You have all of the finance information.
  • And not to mention, you have the entire context of the web and how to connect to it so that you can get context, but then go deeper.

And you put all of that into this brain that is effectively this way to talk to Google and get at this knowledge.

That’s really what you can do now. So you can ask anything on your mind and it’ll use all of this information to hopefully give you super high quality and informed information as best as we can.”

Stein’s description shows that Google’s long-term direction is to move beyond retrieval toward an interactive, turn-based mode of information discovery. The “brain” metaphor signals that search will increasingly be less about locating web pages and more about generating informed responses built from Google’s own structured data, knowledge graphs, and web content. This represents a fundamental change, and as you’ll see in the following paragraphs, it is happening right now.

AI Mode Integrates Everything

Stein describes how Google is increasingly triggering AI Mode as the next evolution of how users find answers to questions and discover information about the world immediately around them. This goes beyond asking “what’s the best kayak” and becomes more of a natural language conversation, an information journey that can encompass images, videos, and text, just like in real life. It’s an integrated experience that goes way beyond a simple search box and ten links.

Stein provided more information about what this will look like:

“And you can use it directly at this google.com/ai, but it’s also been integrated into our core experiences, too. So we announced you can get to it really easily. You can ask follow-up questions of AI overviews right into AI mode now.

Same for the lens stuff, take a picture, takes it to AI mode. So you can ask follow-up questions and go there, too. So it’s increasingly an integrated experience into the core part of the product.”

How AI Will Converge Into One Interface

At this point the host of the podcast asked for a clearer explanation of how all of these things will be integrated.

He asked:

“I imagine much of this is… wait and see how people use it. But what’s the vision of how all these things connect?

Is the idea to continue having this AI mode on the side, AI overviews at the top, and then this multimodal experience? Or is there a vision of somehow pushing these together even more over time?”

Stein answered that all of these modes of information discovery will converge together. Google will be able to detect by the query whether to trigger AI Mode or just a simple search. There won’t be different interfaces, just the one.

Stein explained:

“I think there’s an opportunity for these to come closer together. I think that’s what AI Mode represents, at least for the core AI experiences. But I think of them as very complementary to the core search product.

And so you should be able to not have to think about where you’re asking a question. Ultimately, you just go to Google.

And today, if you put in whatever you want, we’re actually starting to use much of the power behind AI mode, right in AI Overviews. So you can just ask really hard, you could put a five-sentence question right into Google search.

You can try it. And then it should trigger AI at the top, it’s a preview. And then you can go deeper into AI mode and have this back and forth. So that’s how these things connect.

Same for your camera. So if you take a picture of something, like, what’s this plant? Or how do I buy these shoes? It should take you to an AI little preview. And then if you go deeper, again, it’s powered by AI mode. You can have that back and forth.

So you shouldn’t have to think about that. It should feel like a consistent, simple product experience, ultimately. But obviously, this is a new thing for us. And so we wanted to start it in a way that people could use and give us feedback with something like a direct entry point, like google.com/AI.”

Stein’s answer shows that Google is moving from separate AI features toward one unified search system that interprets intent and context automatically.

  • For users, that means typing, speaking, or taking a picture will all connect to the same underlying process that decides how to respond.
  • For publishers and SEOs, it means visibility will depend less on optimizing for keywords and more on aligning content with how Google understands and responds to different kinds of questions.

How Content Can Fit Into AI Triggered Search Experiences

Google is transitioning users out of the traditional ten blue links paradigm into a blended AI experience. Users can already enter questions consisting of multiple sentences and Google will automatically transition into an AI Mode deep question and answer. The answer is a preview with an option to trigger a deeper back and forth conversation.

Robby Stein indicated that the AI search experience will converge even more, depending on user feedback and how people interact with it.

These are profound changes that demand publishers ask deep questions about their content:

  • How might curating unique images, useful video content, and step-by-step tutorials fit into your content strategy?
  • Information discovery is increasingly conversational; does your content fit into that context?
  • Information discovery may increasingly include camera snapshots; will your content surface in that kind of search?

These are examples of the kinds of questions publishers, SEOs and store owners should be thinking about.

Watch the podcast interview with Robby Stein

Inside Google’s AI turnaround: AI Mode, AI Overviews, and vision for AI-powered search | Robby Stein

Featured image/Screenshot of Lenny’s Podcast video

An Earthling’s guide to planet hunting

The pendant on Rebecca Jensen-Clem’s necklace is only about an inch wide, composed of 36 silver hexagons entwined in a honeycomb mosaic. At the Keck Observatory, in Hawaii, just as many segments make up a mirror that spans 33 feet, reflecting images of uncharted worlds for her to study. 

Jensen-Clem, an astronomer at the University of California, Santa Cruz, works with the Keck Observatory to figure out how to detect new planets without leaving our own. Typically, this pursuit faces an array of obstacles: Wind, fluctuations in atmospheric density and temperature, or even a misaligned telescope mirror can create a glare from a star’s light that obscures the view of what’s around it, rendering any planets orbiting the star effectively invisible. And what light Earth’s atmosphere doesn’t obscure, it absorbs. That’s why researchers who study these distant worlds often work with space telescopes that circumvent Earth’s pesky atmosphere entirely, such as the $10 billion James Webb Space Telescope. 

But there’s another way over these hurdles. At her lab among the redwoods, Jensen-Clem and her students experiment with new technologies and software to help Keck’s primary honeycomb mirror and its smaller, “deformable” mirror see more clearly. Using measurements from atmospheric sensors, deformable mirrors are designed to adjust shape rapidly, so they can correct for distortions caused by Earth’s atmosphere on the fly. 

This general imaging technique, called adaptive optics, has been common practice since the 1990s. But Jensen-Clem is looking to level up the game with extreme adaptive optics technologies, which aim to create the highest image quality over a small field of view. Her group, in particular, does so by tackling issues involving wind or the primary mirror itself. The goal is to focus starlight so precisely that a planet can be visible even if its host star is a million to a billion times brighter.

In April, she and her former collaborator Maaike van Kooten were named co-recipients of the Breakthrough Prize Foundation’s New Horizons in Physics Prize. The prize announcement says they earned this early-career research award for their potential “to enable the direct detection of the smallest exoplanets” through a repertoire of methods the two women have spent their careers developing.

In July, Jensen-Clem was also announced as a member of a new committee for the Habitable Worlds Observatory, a concept for a NASA space telescope that would spend its career on the prowl for signs of life in the universe. She’s tasked with defining the mission’s scientific goals by the end of the decade.

The Keck Observatory’s 10-meter primary mirror features a honeycomb structure with 36 individual mirror segments.
ETHAN TWEEDIE

“In adaptive optics, we spend a lot of time on simulations, or in the lab,” Jensen-Clem says. “It’s been a long road to see that I’ve actually made things better at the observatory in the past few years.”

Jensen-Clem has long appreciated astronomy for its more mind-bending qualities. In seventh grade, she became fascinated by how time slows down near a black hole when her dad, an aerospace engineer, explained that concept to her. After starting her bachelor’s degree at MIT in 2008, she became taken with how a distant star can seem to disappear—either suddenly winking out or gently fading away, depending on the kind of object that passes in front of it. “It wasn’t quite exoplanet science, but there was a lot of overlap,” she says.

During this time, Jensen-Clem began sowing the seeds for one of her prize-winning methods after her teaching assistant recommended that she apply for an internship at NASA’s Jet Propulsion Laboratory. There, she worked on a setup that could perfect the orientation of a large mirror. Such mirrors are more difficult to realign than the smaller, deformable ones, whose shape-changing segments cater to Earth’s fluctuating atmosphere.

“At the time, we were saying, ‘Oh, wouldn’t it be really cool to install one of these at Keck Observatory?’” Jensen-Clem says. The idea stuck around. She even wrote about it in a fellowship application when she was gearing up to start her graduate work at Caltech. And after years of touch-and-go development, Jensen-Clem succeeded in installing the system—which uses a technology called a Zernike wavefront sensor—on Keck’s primary mirror about a year ago. “My work as a college intern is finally done,” she says. 

The system, which is currently used for occasional recalibrations rather than continuous adjustments, includes a special kind of glass plate that bends the light rays from the mirror to reveal a specific pattern. The detector can pick up a hairbreadth misalignment in that picture: If one hexagon is pushed too far back or forward, its brightness changes. Even the tiniest misalignment is important to correct, because “when you’re studying a faint object, suddenly you’re much more susceptible to little mistakes,” Jensen-Clem says.

She has also been working to perfect the craft of molding Keck’s deformable mirror. This instrument, which reflects light that’s been rerouted from the primary mirror, is much smaller—only six inches wide—and is designed to reposition as often as 2,000 times a second to combat atmospheric turbulence and create the clearest picture possible. “If you just look up at the night sky and see stars twinkling, it’s happening fast. So we have to go fast too,” Jensen-Clem says. 

Even at this rapid rate of readjustment, there’s still a lag. The deformable mirror is usually about one millisecond behind the actual outdoor conditions at any given time. “When the [adaptive optics] system can’t keep up, then you aren’t going to get the best resolution,” says van Kooten, Jensen-Clem’s former collaborator, who is now at the National Research Council Canada. This lag has proved especially troublesome on windy nights. 

Jensen-Clem thought it was an unsolvable problem. “The reason we have that delay is because we need to run computations and then move the deformable mirror,” she says. “You’re never going to do those things instantaneously.”

But while she was still a postdoc at UC Berkeley, she came across a paper that posited a solution. Its authors proposed that using previous measurements and simple algebra to predict how the atmosphere will change, rather than trying to keep up with it in real time, would yield better results. She wasn’t able to test the idea at the time, but coming to UCSC and working with Keck presented the perfect opportunity. 
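The core idea from that paper can be sketched in a few lines of code. The following is purely my own illustration, not the actual Keck software: it fits a linear predictor to past wavefront measurements by least squares, then forecasts the next measurement instead of simply reacting to the last one, which is what standard adaptive optics effectively does.

```python
import numpy as np

def fit_predictor(history, k=3):
    """Least-squares weights w such that w @ [m[t-k], ..., m[t-1]] ~ m[t]."""
    # Each row of X is a window of k consecutive past measurements.
    X = np.stack([history[i:i + k] for i in range(len(history) - k)])
    y = history[k:]  # the measurement that followed each window
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return w

# Toy "atmosphere": a slowly drifting phase signal plus measurement noise.
rng = np.random.default_rng(0)
t = np.arange(600)
signal = np.sin(2 * np.pi * t / 40) + 0.02 * rng.standard_normal(t.size)

k = 3
w = fit_predictor(signal[:300], k=k)

# Compare the predictor against the naive strategy of holding the
# last measurement (i.e., always being one step behind).
pred_err, naive_err = [], []
for i in range(300, 599):
    forecast = signal[i - k + 1:i + 1] @ w   # predict sample i + 1
    pred_err.append(abs(forecast - signal[i + 1]))
    naive_err.append(abs(signal[i] - signal[i + 1]))

print(np.mean(pred_err) < np.mean(naive_err))  # predictor wins on average
```

Because the drift in the signal is partly deterministic, a few past samples carry enough information to extrapolate it, so the forecast lands closer to the true next value than the stale measurement does, which is the same intuition behind compensating for the deformable mirror's one-millisecond lag.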

Around this time, Jensen-Clem invited van Kooten to join her team at UCSC as a postdoc because of their shared interest in the predictive software. “I didn’t have a place to live at first, so she put me up in her guest room,” van Kooten says. “She’s just so supportive at every level.”

After creating experimental software to try out at Keck, the team compared the predictive version with the more standard adaptive optics, examining how well each imaged an exoplanet without its drowning in starlight. They found that the predictive software could image even faint exoplanets two to three times more clearly. The results, which Jensen-Clem published in 2022, were part of what earned her the New Horizons in Physics Prize. 

Thayne Currie, an astronomer at the University of Texas at San Antonio, says that these new techniques will become especially vital as researchers build bigger and bigger ground-based facilities to capture images of exoplanets—including upcoming projects such as the Extremely Large Telescope at the European Southern Observatory and the Giant Magellan Telescope in Chile. “There’s an incredible amount that we’re learning about the universe, and it is really driven by technology advances that are very, very new,” Currie says. “Dr. Jensen-Clem’s work is an example of that kind of innovation.”

In May, one of Jensen-Clem’s graduate students went back to Hawaii to reinstall the predictive software at Keck. This time, the program isn’t just a trial run; it’s there to stay. The new software has shown it can refocus artificial starlight. Next, it will have to prove it can handle the real thing. 

And in about a year, Jensen-Clem and her students and colleagues will brace themselves for a flood of observations from the European Space Agency’s Gaia mission, which recently finished measuring the motion, temperature, and composition of billions of stars over more than a decade. 

When the project releases its next set of data—slated for December 2026—Jensen-Clem’s team aims to hunt for new exoplanetary systems using clues like the wobbles in a star’s motion caused by the gravitational tugs of planets orbiting around it. Once a system has been identified, exoplanet photographers will then be able to shoot the hidden planets using a new instrument at Keck that can reveal more about their atmospheres and temperatures. 

There will be a mountain of data to sort through, and an even steeper supply of starlight to refocus. Thankfully, Jensen-Clem has spent more than a decade refining just the techniques she’ll need: “This time next year,” she says, “we’ll be racing to throw all our adaptive optics tricks at these systems and detect as many of these objects as possible.”

Jenna Ahart is a science journalist specializing in the physical sciences. 

The Download: planet hunting, and India’s e-scooters

This is today’s edition of The Download, our weekday newsletter that provides a daily dose of what’s going on in the world of technology.

An Earthling’s guide to planet hunting

The pendant on Rebecca Jensen-Clem’s necklace is composed of 36 silver hexagons entwined in a honeycomb mosaic. At the Keck Observatory, in Hawaii, just as many segments make up a mirror that spans 33 feet, reflecting images of uncharted worlds for her to study.

Jensen-Clem, an astronomer at the University of California, Santa Cruz, works with the Keck Observatory to try to detect new planets without leaving our own. It’s a pursuit that faces a vast array of obstacles, such as wind and fluctuations in atmospheric density and temperature.

At her lab among the redwoods, Jensen-Clem and her students experiment with new technologies and software to help overcome these challenges and see into space more clearly. Read more about her and her work.

—Jenna Ahart

This story is from the forthcoming print issue of MIT Technology Review, which is all about the body. If you’re not already a subscriber, sign up now to receive issues as soon as they land.

2025 climate tech companies to watch: Ather Energy and its premium e-scooters


While sales of Tesla or BYD cars drove electric vehicle adoption elsewhere in the world, two-wheelers have led the green energy transition in India.

As one of the earliest “pure play” e-scooter makers, Ather Energy has helped drive micromobility EV penetration throughout India and boosted the shift away from carbon-emitting vehicles. Read the full story.


—Nilesh Christopher

Ather Energy is one of our 10 climate tech companies to watch—our annual list of some of the most promising climate tech firms on the planet. Check out the rest of the list here.

The must-reads

I’ve combed the internet to find you today’s most fun/important/scary/fascinating stories about technology.

1 The Trump administration has laid off thousands of federal health workers
It’s blaming the government shutdown—but labor unions are suing. (Reuters $)
The firings will decimate parts of the CDC that work on disease surveillance. (STAT)

2 AI videos of dead celebrities are upsetting their families
OpenAI is coming under mounting pressure to restrict what its video generator Sora 2 can create. (WP $)
Sora was downloaded over a million times in less than five days—a rate even faster than ChatGPT. (BBC)

3 The Dutch government has taken control of a Chinese-owned chipmaker
The move comes after Beijing tightened restrictions on the export of rare earth elements, which could hurt Europe’s car industry. (CNBC)

4 Why some developers reject AI coding tools
Even at their best, they introduce bugs into the code base that can be tricky to spot. (The Information $)
The second wave of AI coding is here. (MIT Technology Review)

5 Police are begging teens to stop pulling the AI homeless man prank
Kids are using AI to create images of a disheveled person in their home, then sending that to their parents. (The Verge)

6 How Elon Musk’s embrace of Trump continues to hurt Tesla
The result is that his cars are now more expensive and less desirable. (The Atlantic $)
China might force Tesla to redesign its door handles. (Wired $)
How did China come to dominate the world of electric cars? (MIT Technology Review)

7 What happened after schools in Australia banned phones?
Both students and staff say the impact has been overwhelmingly positive. (The Guardian)

8 AI is fantastic at detecting small earthquakes 
But the really big prize is seeing if it can help with predicting them, too. (Ars Technica)
What we can learn from Japan’s “megaquake” preparations. (MIT Technology Review)

9 Climate change is creating new hybrid species
The “grue jay” is half-blue jay, half-green jay. It’s also a sign of the times we live in. (Nautilus)
How a breakthrough gene-editing tool will help the world cope with climate change. (MIT Technology Review)

10 How people gamify Hinge to get the dates they want 
It puts people’s most promising matches behind a paywall—but they’re finding workarounds. (The Cut $)
There’s now a term for daters who use AI to boost their appeal: chatfishers. (The Guardian)

Quote of the day

“It’s as emotionally devastating as it is dangerous to the American public.”

—An employee at the Centers for Disease Control and Prevention tells STAT about the impact of the Trump administration’s decision to carry out mass layoffs at the agency. 

One more thing

gloved hands insert a test strip into a tube

GETTY IMAGES

How the federal government is tracking changes in the supply of street drugs

In 2021, the Maryland Department of Health and the state police were confronting a crisis: Fatal drug overdoses in the state were at an all-time high, and authorities didn’t know why.

Seeking answers, Maryland officials turned to scientists at the National Institute of Standards and Technology, the national metrology institute for the United States, which defines and maintains standards of measurement essential to a wide range of industrial sectors and health and security applications.

There, a research chemist named Ed Sisco and his team had developed methods for detecting trace amounts of drugs, explosives, and other dangerous materials—techniques that could protect law enforcement officials and others who had to collect these samples. And a pilot uncovered new, critical information almost immediately. Read the full story.

—Adam Bluestein

We can still have nice things

A place for comfort, fun and distraction to brighten up your day. (Got any ideas? Drop me a line or skeet ’em at me.)

+ This spicy tea is exactly what you need when you’re under the weather. 
+ Just a man, jamming with his cat. 
+ Empathy is not a fixed trait—you can grow it. Here’s how.
+ There’s something very soothing about João Bernardino’s photos of Portugal. 

Transforming commercial pharma with agentic AI 

Amid the turbulence of the wider global economy in recent years, the pharmaceuticals industry is weathering its own storms. The rising cost of raw materials and supply chain disruptions are squeezing margins as pharma companies face intense pressure—including from countries like the US—to control drug costs. At the same time, a wave of expiring patents threatens around $300 billion in potential lost sales by 2030. As companies lose the exclusive right to sell the drugs they have developed, competitors can enter the market with lower-cost generic and biosimilar alternatives, leading to a sharp decline in branded drug sales—a “patent cliff.” Simultaneously, the cost of bringing new drugs to market is climbing. McKinsey estimates that the cost per launch is growing 8% each year, reaching $4 billion in 2022.

In clinics and health-care facilities, norms and expectations are evolving, too. Patients and health-care providers are seeking more personalized services, leading to greater demand for precision drugs and targeted therapies. While proving effective for patients, the complexity of formulating and producing these drugs makes them expensive and restricts their sale to a smaller customer base.

The need for personalization extends to sales and marketing operations, too, as pharma companies increasingly compete for the attention of health-care professionals (HCPs). Estimates suggest that biopharmas were able to reach 45% of HCPs in 2024, down from 60% in 2022. Personalization, real-time communication channels, and relevant content offer a way of building trust and reaching HCPs in an increasingly competitive market. But with ever-growing volumes of content requiring medical, legal, and regulatory (MLR) review, companies are struggling to keep up, leading to potential delays and missed opportunities.

Download the report.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff. It was researched, designed, and written by human writers, editors, analysts, and illustrators. AI tools that may have been used were limited to secondary production processes that passed thorough human review.

Google’s AI Reshapes Organic Listings

Google is quickly changing organic search results as it integrates AI. Thus far, the new features have caused traffic losses to most external sites, necessitating new search engine tactics and priorities.

Here’s how AI is impacting traditional organic search visibility to date.

AI Overviews

AI Overviews are answers to search queries. They typically summarize and cite top-ranking pages, correlating traditional SEO with visibility in Overviews.

AI Overviews:

  • Eliminate searchers’ need to click. If your target query triggers an Overview, the result is likely fewer clicks, even if the Overview cites your page.
  • Cite pages that then often appear at average position 1 in Google Search Console with abnormally low click-throughs. Thus the average position in the Performance tab will improve, but click-throughs will decline.

‘People also ask’

Traditional SEO typically recommends answering “People also ask” questions in content to generate clicks. However, Google now serves occasional AI-generated answers to “People also ask” queries, which decreases clicks both in that section and in organic listings.

Google often serves AI-generated answers to “People also ask” queries, such as this example for “oil stains.”

Suggested topics

Google now provides a search-result section that I call “suggested topics.” It functions similarly to a fan-out result, wherein Google suggests related topics for queries with multiple intents. For example, a search for “roof repair” could trigger suggestions exploring the symptoms and causes of roof damage.

Clicking on any of these suggestions produces an AI-generated answer, which is unlikely to generate traffic to an external source.

A search for “roof repair” could trigger suggestions exploring the symptoms and causes of roof damage.

AI-generated search snippets

Google is apparently testing AI-generated search snippets, forgoing the practice of using publishers’ meta descriptions or body text.

Google reportedly sometimes enhances a snippet with additional info, which can increase clicks.

Google’s testing of AI-generated search snippets replaces or enhances publishers’ meta descriptions or body text.

Local search

Google is integrating AI in blended results, especially local packs. Reportedly, Google’s AI now invites users to learn more about a local business and will even suggest related fan-out-style questions.

Google’s AI suggests fan-out-style questions, such as “Do they offer roof cleaning?”

The feature mimics what Google’s URL bar does now: encourage users to learn more about any page.

Hence local businesses should focus on providing on-site details of products or services, encouraging customer reviews, answering questions, and more.

Google is also integrating AI actions into local packs, following the practice in AI Mode. For example, for a “car tires near me” search, Google might suggest having AI check prices.

A “car tires near me” search might include a suggestion, such as “Have AI check prices.”

I once feared generative AI platforms would replace organic search. Instead, search engines are adopting AI themselves, making organic results less predictable, less trackable, and less traffic-generating.

We know what is happening. The key is adjusting tactics and traffic expectations accordingly.

Google Adds AI Previews To Discover, Sports Feed Coming via @sejournal, @MattGSouthern

Google rolled out AI trending previews in Discover and will add a “What’s new” sports feed to U.S. mobile search in coming weeks.

  • AI trending previews in Discover are live in the U.S., South Korea, and India.
  • A sports “What’s new” button will begin rolling out in the U.S. in the coming weeks.
  • Both experiences show brief previews with links to publisher/creator content.