Google Files Patent On Personal History-Based Search via @sejournal, @martinibuster

Google recently filed a patent for a way to provide search results based on a user’s browsing and email history. The patent outlines a new way to search that can operate within a search engine, within an email interface, and through a voice-based assistant (referred to in the patent as a voice-based dialog system).

A problem that many people have is that they can remember what they saw but can’t remember where they saw it or how they found it. The new patent, titled Generating Query Answers From A User’s History, solves that problem by helping people find information they’ve previously seen on a web page or in an email, letting them ask for what they’re looking for in everyday language such as “What was that article I read last week about chess?”

The problem the invention solves is that traditional search engines don’t enable users to easily search their own browsing or email history using natural language. The invention works by taking a user’s spoken or typed question, recognizing that the question is asking for previously viewed content, and then retrieving results from the user’s personal history (such as their browser history or emails). To accomplish this, it uses filters like date, topic, or device.

What’s novel about the invention is the system’s ability to understand vague or fuzzy natural language queries and match them to a user’s specific past interactions, including showing the version of a page as it looked when the user originally saw it (a cached version of the web page).

Query Classification (Intent) And Filtering

Query Classification

The system first determines whether the intent of the user’s spoken or typed query is to retrieve previously accessed information. This process is called query classification and involves analyzing the phrasing of the query to detect the intent. The system compares parts of the query to known patterns associated with history-seeking questions and uses techniques like semantic analysis and similarity thresholds to identify if the user’s intent is to seek something they’d seen before, even when the wording is vague or conversational.

The similarity threshold is an interesting part of the invention because it compares what the user is saying or typing to known history-seeking phrases to see if they are similar. It’s not looking for an exact match but rather a close match.
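
The patent doesn’t spell out the exact matching technique, but a minimal sketch of the idea could compare a query’s words against known history-seeking phrases and fire when the overlap clears a threshold. In the sketch below, the phrase list, the token-overlap (Jaccard) score, and the 0.5 threshold are all illustrative assumptions, not details from the patent:

```python
# Minimal sketch of history-seeking query classification.
# The phrase list, Jaccard scoring, and 0.5 threshold are illustrative;
# the patent does not specify the similarity measure it uses.

HISTORY_SEEKING_PATTERNS = [
    "what was that article i read",
    "find the page i saw",
    "show me the email i got",
    "i'm looking for a recipe i read",
]

def jaccard(a, b):
    return len(a & b) / len(a | b) if a | b else 0.0

def is_history_seeking(query, threshold=0.5):
    q_tokens = set(query.lower().split())
    return any(
        jaccard(q_tokens, set(pattern.split())) >= threshold
        for pattern in HISTORY_SEEKING_PATTERNS
    )

print(is_history_seeking("What was that article I read last week about chess?"))  # True
print(is_history_seeking("best chess openings"))  # False
```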

Filtering

The next part is filtering, and it happens after the system has identified the history-seeking intent. It then applies filters such as the topic, time, or device to limit the search to content from the user’s personal history that matches those criteria.

The time filter is a way to constrain the search to within a specific time frame that’s mentioned or implied in the search query. This helps the system narrow down the search results to what the user is trying to find. So if a user speaks phrases like “last week” or “a few days ago” then it knows to restrict the query to those respective time frames.

An interesting quality of the time filter is that it’s applied with a level of fuzziness, which means it’s not exact. So when a person asks the voice assistant to find something from the past week it won’t do a literal search of the past seven days but will expand it to a longer period of time.

The patent describes the fuzzy quality of the time filter:

“For example, the browser history collection… may include a list of web pages that were accessed by the user. The search engine… may obtain documents from the index… based on the filters from the formatted query.

For example, if the formatted query… includes a date filter (e.g., “last week”) and a topic filter (e.g., “chess story”), the search engine… may retrieve only documents from the collection… that satisfy these filters, i.e., documents that the user accessed in the previous week that relate to a “chess story.”

In this example, the search engine… may apply fuzzy time ranges to the “last week” filter to account for inaccuracies in human memory. In particular, while “last week” literally refers to the seven calendar days of the previous week, the search engine… may search for documents over a wider range, e.g., anytime in the past two weeks.”
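
As a rough sketch, that fuzzy widening could be implemented as a simple mapping from time phrases to expanded date ranges. Only the “last week, search two weeks” expansion comes from the patent’s example; the other phrase and the exact factors are illustrative guesses:

```python
from datetime import date, timedelta

# Illustrative fuzzy expansion of spoken time phrases. The "last week"
# -> two weeks widening mirrors the patent's example; the rest is assumed.

def fuzzy_range(phrase, today):
    if phrase == "last week":
        # Literally the previous seven days; widened to fourteen
        # to account for inaccuracies in human memory.
        return today - timedelta(days=14), today
    if phrase == "a few days ago":
        return today - timedelta(days=7), today
    raise ValueError(f"unrecognized time phrase: {phrase}")

start, end = fuzzy_range("last week", date(2025, 4, 9))
print(start, "to", end)  # 2025-03-26 to 2025-04-09
```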

Once a query is classified as asking for something that was previously seen, the system identifies details in the user’s phrasing that are indicative of topic, date or time, source, device, sender, or location and uses them as filters to search the user’s personal history.

Each filter helps narrow the scope of the search to match what the user is trying to recall: for example, a topic filter (“turkey recipe”) targets the subject of the content; a time filter (“last week”) restricts results to when it was accessed; a source filter (“WhiteHouse.gov”) limits the search to specific websites; a device filter (e.g., “on my phone”) restricts results to those accessed on a certain device; a sender filter (“from grandma”) helps locate emails or shared content; and a location filter (e.g., “at work”) restricts results to those accessed in a particular physical place.

By combining these context-sensitive filters, the system mimics the way people naturally remember content in order to help users retrieve exactly what they’re looking for, even when their query is vague or incomplete.
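
Here is a minimal sketch of how such filters might be applied to a personal-history index. The record fields and filter set are hypothetical, chosen to mirror the examples above rather than taken from the patent:

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical personal-history record; field names are illustrative.

@dataclass
class HistoryItem:
    url: str
    topic_terms: set
    accessed: date
    source: str
    device: str

def matches(item, topic=None, accessed_after=None, source=None, device=None):
    """Return True if the item satisfies every filter that was supplied."""
    if topic and not topic & item.topic_terms:
        return False
    if accessed_after and item.accessed < accessed_after:
        return False
    if source and source != item.source:
        return False
    if device and device != item.device:
        return False
    return True

history = [
    HistoryItem("https://news.example.com/chess-story", {"chess", "story"},
                date(2025, 4, 2), "news.example.com", "laptop"),
    HistoryItem("https://example.com/turkey-recipe", {"turkey", "recipe"},
                date(2025, 3, 1), "example.com", "phone"),
]

# "What was that chess story I read last week?" -> topic + fuzzy time filters.
hits = [h for h in history
        if matches(h, topic={"chess"}, accessed_after=date(2025, 3, 26))]
print([h.url for h in hits])  # ['https://news.example.com/chess-story']
```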

Scope of Search: What Is Searched

The next part of the patent is about figuring out the scope of what is going to be searched, which is limited to predefined sources such as browser history, cached versions of web pages, or emails. So, rather than searching the entire web, the system focuses only on the user’s personal history, making the results more relevant to what the user is trying to recall.

Cached Versions of Previously Viewed Content

Another interesting feature described in the patent is web page caching. Caching refers to saving a copy of a web page as it appeared when the user originally viewed it. This enables the system to show the user that specific version of the page in search results, rather than the current version, which may have changed or been removed.

The cached version acts like a snapshot in time, making it easier for the user to recognize or remember the content they are looking for. This is especially useful when the user doesn’t remember precise details like the name of the page or where they found it, but would recognize it if they saw it again. By showing the version that the user actually saw, the system makes the search experience more aligned with how people remember things.

Potential Applications Of The Patent Invention

The system described in the patent can be applied in several real-world contexts where users may want to retrieve content they’ve previously seen:

Search Engines

The patent refers multiple times to the use of this technique in the context of a search engine that retrieves results not from the public web, but from the user’s personal history, such as previously visited web pages and emails. While the system is designed to search only content the user has previously accessed, the patent notes that some implementations may also include additional documents relevant to the query, even if the user hasn’t viewed them before.

Email Clients

The system treats previously accessed emails as part of the searchable history. For example, it can return an old email like “Grandma’s turkey meatballs” based on vague, natural language queries.

Voice Assistants

The patent includes examples of “a voice-based search” where users speak conversational queries like “I’m looking for a turkey recipe I read on my phone.” The system handles speech recognition and interprets intent to retrieve relevant results from personal history.

Read the entire patent here:

Generating query answers from a user’s history

To Navigate AI Turbulence, CMOs Can Apply The Flywheel Model via @sejournal, @gregjarboe

Right now, as technology changes daily, chief marketing officers face exceptional levels of change and uncertainty. But this isn’t the first time (or the last).

During the COVID-19 pandemic, nearly two-thirds of CMOs in Fortune 500 companies overcame the extraordinary challenge of navigating change and uncertainty.

As a result, 65% of the CMOs who exited their roles after an average tenure of 4.3 years were “promoted to more senior roles” or made “lateral moves to other attractive CMO positions.”

However, what was a pandemic obstacle course has been followed by an AI Olympic steeplechase.

To navigate these turbulent times, CMOs should consider analyzing marketing research and applying digital trends to:

  • Discover consumer insights for effective marketing in a dynamic market.
  • Unlock exceptional marketing results and increase return on investment (ROI) with the power of AI.
  • Reach customers across search, video, social, and shopping platforms.
  • Drive progress in marketing by championing the latest innovations and ideas.
  • Transform their data into a tool for building a lasting business advantage.

To lead their teams, CMOs could also apply the flywheel model, a customer-centric approach to business growth.

Adding AI To The Traditional Flywheel

Recently, based on a survey of 2,000 global marketers, Think With Google wrote:

“The traditional flywheel has always existed in marketing. Now, leaders are adding AI to multiply its momentum.”

Screenshot from Think With Google, April 2025

The article provides CMOs with a framework, built on four interconnected pillars:

  1. Measurement and insights.
  2. Media and personalization.
  3. Creativity and content.
  4. People and process.

This framework outlines how AI is amplifying the traditional marketing flywheel.

Measurement And Insights

The first pillar stresses the importance of aligning key performance indicators (KPIs) with business performance metrics like profit and ROI.

Implementing modern, AI-powered measurement tools is crucial for accurate data and insights while respecting privacy.

A foundation of well-defined KPIs, historical data, and first-party data enables outcome-based planning, where AI predicts and improves campaign performance, optimizing budget allocation.

The future involves an AI-powered Marketing Engine for continuous, real-time optimization.

Media And Personalization

The second pillar focuses on AI’s role in delivering the right ad to the right person at the right time.

Leading marketers scale successful AI-powered campaigns, shifting budgets for maximum ROI and flexibility.

AI identifies engaged, high-value audiences across channels, revealing valuable consumer behavior insights.

The ultimate stage is AI-powered media transformation, where an AI engine autonomously creates and refines media plans in real time based on continuous measurement.

Creativity And Content

The third pillar explores how generative AI aids in brainstorming impactful ideas to help develop innovative content.

AI identifies and amplifies top-performing assets, and AI-powered “creative studios” accelerate time-to-market.

AI also enables pre-launch testing and optimization, bringing the goal of real-time, personalized creative delivery closer to reality.

People And Process

The fourth pillar emphasizes collaboration, extending to the C-suite.

Sharing prioritized AI opportunities early is vital. Transformative leaders restructure organizations to fully leverage the AI engine.

Scaling AI success requires investing in AI talent to develop new operational methods, which are then formalized and disseminated.

Leading marketers design improved workflows and assess AI impact, recognizing that holistic organizational transformation is needed.

The article concludes that these four interdependent pathways merge to create the AI-powered Marketing Engine, amplifying the traditional marketing flywheel.

Analyzing Market Research And Applying Audience Research

CMOs will quickly notice that “The AI-powered Marketing Engine” framework can help to achieve four of the five goals that I mentioned above:

  • Measurement and insights can help transform their data into a tool for building a lasting business advantage.
  • Media and personalization can help to reach customers across search, video, social, and shopping platforms.
  • Creativity and content can help unlock exceptional marketing results and increase ROI.
  • People and process can help to drive progress in marketing by championing the latest innovations and ideas.

And CMOs will immediately wonder: Why can’t the AI-powered Marketing Engine help our analysts discover consumer insights for effective marketing in a dynamic market?

That’s the right question to ask, and there are two probable answers.

The first was provided by Avinash Kaushik in 2014, when he asked, “Is your company creating reporting squirrels or analysis ninjas?”

In any organization, investments in data generate two distinct types of work: Reporting Squirrel work and Analysis Ninja work. While both are important, only one directly contributes to improving the company’s financial performance.

Reporting Squirrels primarily focus on data production, spending most of their time creating reports for various stakeholders.

Their responsibilities include data extraction, query writing, fulfilling ad-hoc requests, scheduling data outputs, and coordinating with IT teams for data acquisition.

Conversely, Analysis Ninjas dedicate their time to analyzing data and generating actionable insights, which are typically communicated in clear, plain language.

Their work involves tasks such as data retrieval, segmentation, in-depth exploration, modeling, creating unique datasets, answering business questions, and defining data requirements for Reporting Squirrels and IT teams.

It’s important to note that Fortune 500 companies don’t typically hire individuals with the titles “Reporting Squirrel” and “Analysis Ninja.” Instead, they employ analysts or data scientists.

However, CMOs need to ask if these professionals are primarily focused on data output rather than providing actionable recommendations.

The second probable answer was in my recent article, where I mentioned, “GA4 gives us less than a third of the data we need to know about user acquisition: The initial stage of building business awareness and acquiring user interest.”

I added, “Somehow, we’ve missed what GA4 can’t – or doesn’t – tell us about the Zero Moment of Truth (ZMOT): the moment in the purchase process when the consumer or business buyer researches a product or service prior to visiting your website.”

So, if CMOs realize that they don’t have a clue about where the lion’s share of their customers discovered their brands or products before visiting their website, then what should they do?

They have two options: Get audience research and conduct market research.

Audience research and market research are distinct but complementary approaches to understanding a business environment.

Audience Research

Audience research focuses on the individual, delving into the needs, preferences, behaviors, and language of the target audience.

This micro-level perspective is achieved through direct engagement with the audience via interviews, surveys, focus groups, social media analysis, and by leveraging existing customer data like customer relationship management (CRM) and support logs.

Market Research

In contrast, market research takes a broader, macroeconomic view, examining the overall landscape.

It involves analyzing industry trends, competitor activities, economic data, and trade publications to assess the viability of products or services.

Think of market research as providing the map, indicating where to go, and audience research as the compass, guiding you on the best path to get there. Therefore, both types of research play crucial roles.

AI Won’t Take Your Job. Somebody Using AI Will

CMOs remember what economist Richard Baldwin said at the 2023 World Economic Forum’s Growth Summit: “AI won’t take your job. It’s somebody using AI that will.”

They understand that their Fortune 500 company expects them to successfully navigate the complexities of the AI era and achieve sustainable growth.

To do that, they must embrace AI-powered tools and frameworks while prioritizing a deep understanding of their audience through dedicated research efforts.

By integrating these approaches, CMOs can transform data into actionable insights, optimize marketing strategies, and ultimately, build a lasting competitive advantage in an increasingly dynamic market.


Featured Image: R.bussarin/Shutterstock

A new biosensor can detect bird flu in five minutes

Over the winter, eggs suddenly became all but impossible to buy. As a bird flu outbreak rippled through dairy and poultry farms, grocery stores struggled to keep them on shelves. The shortages and record-high prices in February raised costs dramatically for restaurants and bakeries and led some shoppers to skip the breakfast staple entirely. But a team based at Washington University in St. Louis has developed a device that could help slow future outbreaks by detecting bird flu in air samples in just five minutes. 

Bird flu is an airborne virus that spreads between birds and other animals. Outbreaks on poultry and dairy farms are devastating; mass culling of exposed animals can be the only way to stem outbreaks. Some bird flu strains have also infected humans, though this is rare. As of early March, there had been 70 human cases and one confirmed death in the US, according to the Centers for Disease Control and Prevention.

The most common way to detect bird flu involves swabbing potentially contaminated sites and sequencing the genetic material that’s been collected, a process that can take up to 48 hours.

The new device samples the air in real time, running the samples past a specialized biosensor every five minutes. The sensor carries strands of genetic material called aptamers that are designed to bind specifically to the virus. When that binding happens, it creates a detectable electrical change. The research, published in ACS Sensors in February, may help farmers contain future outbreaks.

Part of the group’s work was devising a way to deliver airborne virus particles to the sensor. 

With bird flu, says Rajan Chakrabarty, a professor of energy, environmental, and chemical engineering at Washington University and lead author of the paper, “the bad apple is surrounded by a million or a billion good apples.” He adds, “The challenge was to take an airborne pathogen and get it into a liquid form to sample.”

The team accomplished this by designing a microwave-size box that sucks in large volumes of air and spins it in a cyclone-like motion so that particles stick to liquid-coated walls. The process seamlessly produces a liquid drip that is pumped to the highly sensitive biosensor. 

Though the system is promising, its effectiveness in real-world conditions remains uncertain, says Sungjun Park, an associate professor of electrical and computer engineering at Ajou University in South Korea, who was not involved in the study. Dirt and other particles in farm air could hinder its performance. “The study does not extensively discuss the device’s performance in complex real-world air samples,” Park says. 

But Chakrabarty is optimistic that it will be commercially viable after further testing and is already working with a biotech company to scale it up. He hopes to develop a biosensor chip that detects multiple pathogens at once. 

Carly Kay is a science writer based in Santa Cruz, California.

This Texas chemical plant could get its own nuclear reactors

Nuclear reactors could someday power a chemical plant in Texas, making it the first with such a facility onsite. The factory, which makes plastics and other materials, could become a model for power-hungry data centers and other industrial operations going forward.

The plans are the work of Dow Chemical and X-energy, which last week applied for a construction permit with the Nuclear Regulatory Commission, the agency in the US that governs nuclear energy.

It’ll be years before nuclear reactors will actually turn on, but this application marks a major milestone for the project, and for the potential of advanced nuclear technology to power industrial processes.

“This has been a long time coming,” says Harlan Bowers, senior vice president at X-energy. The company has been working with the NRC since 2016 and submitted its first regulatory engagement plan in 2018, he says.

In 2020, the US Department of Energy chose X-energy as one of the awardees of the Advanced Reactor Demonstration Program, which provides funding for next-generation nuclear technologies. And it’s been two years since X-energy and Dow first announced plans for a joint development agreement at Dow’s plant in Seadrift, Texas.  

The Seadrift plant produces 4 billion pounds of materials each year, including plastic used for food and pharmaceutical packaging and chemicals used in products like antifreeze, soaps, and paint. A natural-gas plant onsite currently provides both steam and electricity. That equipment is getting older, so the company was looking for alternatives.  

“Dow saw the opportunity to replace end-of-life assets with safe, reliable, lower-carbon-emissions technology,” said Edward Stones, an executive at Dow, in a written statement in response to questions from MIT Technology Review.

Advanced nuclear reactors designed by X-energy emerged as a fit for the Seadrift site in part because of their ability to deliver high-temperature steam, Stones said in the statement.

X-energy’s reactor is not only smaller than most nuclear plants coming online today but also employs different fuel and different cooling methods. The design is a high-temperature gas-cooled reactor, which flows helium over self-contained pebbles of nuclear fuel. The fuel can reach temperatures of around 1,000 °C (1,800 °F). As it flows through the reactor and around the pebbles, the helium reaches up to 750 °C (about 1,400 °F). Then that hot helium flows through a steam generator, making steam at a high temperature and pressure that can be piped directly to industrial equipment or converted into electricity.

The Seadrift facility will include four of X-energy’s Xe-100 reactors, each of which can produce about 200 megawatts’ worth of steam or about 80 megawatts of electricity.

A facility like Dow’s requires an extremely consistent supply of steam, Bowers says. So during normal operation, two of the modules will deliver steam, one will deliver electricity, and the final unit will sell electricity to the local grid. If any single reactor needs to shut down for some reason, there will still be enough onsite power to keep running, he explains.

The progress with the NRC is positive news for the companies involved, but it also represents an achievement for advanced reactor technology more broadly, says Erik Cothron, a senior analyst at the Nuclear Innovation Alliance, a nonprofit think tank. “It demonstrates real-world momentum toward deploying new nuclear reactors for industrial decarbonization,” Cothron says.

While there are other companies looking to bring advanced nuclear reactor technology online, this project could be the first to incorporate nuclear power onsite at a factory. It thus sets a precedent for how new nuclear energy technologies can integrate directly with industry, Cothron says—for example, showing a pathway for tech giants looking to power data centers.

It could take up to two and a half years for the NRC to review the construction permit application for this site. The site will also need to receive an operating license before it can start up. Operations are expected to begin “early next decade,” according to Dow.

Correction: A previous version of this story misspelled Erik Cothron’s name.

The Download: detecting bird flu, and powering industrial processes with nuclear energy

This is today’s edition of The Download, our weekday newsletter that provides a daily dose of what’s going on in the world of technology.

A new biosensor can detect bird flu in five minutes

Over the winter, eggs suddenly became all but impossible to buy. As a bird flu outbreak rippled through dairy and poultry farms, grocery stores struggled to keep them on shelves.

The shortages and record-high prices in February raised costs dramatically for restaurants and bakeries and led some shoppers to skip the breakfast staple entirely. But a team based at Washington University in St. Louis has developed a device that could help slow future outbreaks by detecting bird flu in air samples in just five minutes. Read the full story.

—Carly Kay

This story is from the next edition of our print magazine, which is all about the body. Subscribe now to read it and get a copy of the magazine when it lands!

This Texas chemical plant could get its own nuclear reactors

Nuclear reactors could someday power a chemical plant in Texas, making it the first with such a facility onsite. The factory, which makes plastics and other materials, could become a model for power-hungry data centers and other industrial operations going forward.

The plans are the work of Dow Chemical and X-energy, which last week applied for a construction permit with the Nuclear Regulatory Commission, the agency in the US that governs nuclear energy.

While it’ll be years before nuclear reactors will actually turn on, this application marks a major milestone for the project, and for the potential of advanced nuclear technology to power industrial processes. Read the full story.

—Casey Crownhart

MIT Technology Review Narrated: Exosomes are touted as a trendy cure-all. We don’t know if they work.

People are spending thousands of dollars on unproven exosome therapies for hair loss, skin aging, and acne, as well as more serious conditions like long covid and Alzheimer’s.

This is our latest story to be turned into an MIT Technology Review Narrated podcast, which we’re publishing each week on Spotify and Apple Podcasts. Just navigate to MIT Technology Review Narrated on either platform, and follow us to get all our new content as it’s released.

The must-reads

I’ve combed the internet to find you today’s most fun/important/scary/fascinating stories about technology.

1 Donald Trump is confident Apple can make iPhones in the US 
Tim Cook is probably less sure about that. (9to5Mac)
+ Politicians are obsessed with the fantasy of an America-made iPhone. (404 Media)
+ If you need a new phone, you’re better off buying one now. (Wired $)

2 Trade groups are weighing up suing Trump to fight his tariffs
The Chamber of Commerce and other groups feel they may not have another option. (WSJ $)
+ Trump has hit China with a 104% tariff. (CNBC)
+ What does he really hope to achieve? (Vox)
+ Even the conservative podcasters that helped him win aren’t happy. (FT $)
+ Trump’s tariffs will deliver a big blow to climate tech. (MIT Technology Review)

3 The UK government is building a “murder prediction” tool
But research shows that algorithmic crime prediction systems don’t work. (The Guardian)
+ Predictive policing algorithms are racist. They need to be dismantled. (MIT Technology Review)

4 DOGE has converted magnetic tapes to digital records
The problem is, magnetic tapes are stable and safe. Digital records are both hackable and vulnerable to bit rot. (404 Media)
+ Government technologists aren’t happy about the switch. (Economist $)
+ Can AI help DOGE slash government budgets? It’s complex. (MIT Technology Review)

5 The crypto industry isn’t benefiting from Trump quite yet
In fact, VC investment has fallen. (Bloomberg $)
+ However, prosecutors are being told to stop pursuing certain crypto crimes. (WP $)

6 Tech bros are building a Christian utopia in Appalachia
These groups have traditionally existed only online. Can building a town bring them together? (Mother Jones $)

7 California’s only nuclear power plant is using AI
It’s the first time generative AI has been used onsite at a power plant. (The Markup)
+ Interest in nuclear power is surging. Is it enough to build new reactors? (MIT Technology Review)

8 Custom 3D-printed railway shelters are being trialed in Japan
In a bid to help rural stations replace aging infrastructure. (Ars Technica)

9 We’re learning more about how the Titanic sank
Thanks to a new scan of its wreckage. (BBC)

10 Would you ride this headless horse robot?
Kawasaki’s outlandish concept model looks decidedly unsafe. (Vice)
+ A skeptic’s guide to humanoid-robot videos. (MIT Technology Review)

Quote of the day

“iPhone manufacturing isn’t coming back to America.”

—An anonymous source familiar with Apple’s plans has some bad news for the Trump administration, the Washington Post reports.

The big story

Inside effective altruism, where the far future counts a lot more than the present

Since its birth in the late 2000s, effective altruism has aimed to answer the question “How can those with means have the most impact on the world in a quantifiable way?”—and supplied methods for calculating the answer.

It’s no surprise that effective altruism’s ideas have long faced criticism for reflecting white Western saviorism, alongside an avoidance of structural problems in favor of abstract math. And as believers pour even greater amounts of money into the movement’s increasingly sci-fi ideals, such charges are only intensifying. Read the full story.

—Rebecca Ackermann

We can still have nice things

A place for comfort, fun and distraction to brighten up your day. (Got any ideas? Drop me a line or skeet ’em at me.)

+ Why is everybody suddenly obsessed with Dubai chocolate? 🍫
+ Inside one academic’s quest to locate the famous photograph hanging on the wall of The Shining’s Overlook Hotel.
+ Adorable: a Japanese town has created its own trading card game featuring older men in the community.
+ I think it’s safe to say Val Kilmer really didn’t enjoy being in the largely forgotten film Spartan.

Tariffs are bad news for batteries

Update: Since this story was first published in The Spark, our weekly climate newsletter, the White House announced that most reciprocal tariffs would be paused for 90 days. That pause does not apply to China, which will see an increased tariff rate of 125%.

Today, new tariffs go into effect for goods imported into the US from basically every country on the planet.

Since Donald Trump announced his plans for sweeping tariffs last week, the vibes have been, in a word, chaotic. Markets have seen one of the quickest drops in the last century, and it’s widely anticipated that the global economic order may be forever changed.  

While many try not to look at the effects on their savings and retirement accounts, experts are scrambling to understand what these tariffs might mean for various industries. As my colleague James Temple wrote in a new story last week, anxieties are especially high in climate technology.

These tariffs could be particularly rough on the battery industry. China dominates the entire supply chain and is subject to monster tariff rates, and even US battery makers won’t escape the effects.   

First, in case you need it, a super-quick refresher: Tariffs are taxes charged on goods that are imported (in this case, into the US). If I’m a US company selling bracelets, and I typically buy my beads and string from another country, I’ll now be paying the US government an additional percentage of what those goods cost to import. Under Trump’s plan, that might be 10%, 20%, or upwards of 50%, depending on the country sending them to me. 

In theory, tariffs should help domestic producers, since products from competitors outside the country become more expensive. But since so many of the products we use have supply chains that stretch all over the world, even products made in the USA often have some components that would be tariffed.

In the case of batteries, we could be talking about really high tariff rates, because most batteries and their components currently come from China. As of 2023, the country made more than 75% of the world’s lithium-ion battery cells, according to data from the International Energy Agency.

Trump’s new plan adds a 34% tariff on all Chinese goods, and that stacks on top of a 20% tariff that was already in place, making the total 54%. (Then, as of Wednesday, the White House further raised the tariff on China, making the total 104%.)

But when it comes to batteries, that’s not even the whole story. There was already a 3.5% tariff on all lithium-ion batteries, for example, as well as a 7.5% tariff on batteries from China that’s set to increase to 25% next year.

If we add all those up, lithium-ion batteries from China could have a tariff of 82% in 2026. (Or 132%, with this additional retaliatory tariff.) In any case, that’ll make EVs and grid storage installations a whole lot more expensive, along with phones, laptops, and other rechargeable devices.
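
For anyone who wants to check the arithmetic, the stacking above is a simple sum of the published rates (a simplification; actual customs calculations can compound differently):

```python
# Simple-sum stacking, matching the article's arithmetic above.

pre_existing_china = 20.0   # % tariff already in place on Chinese goods
new_reciprocal     = 34.0   # % added under the new plan
all_li_ion         = 3.5    # % existing tariff on all lithium-ion batteries
china_battery_2026 = 25.0   # % China battery tariff scheduled for next year

total_2026 = pre_existing_china + new_reciprocal + all_li_ion + china_battery_2026
print(f"{total_2026:.1f}%")  # 82.5% -> the "82%" figure above

# With the later retaliatory raise to 104% on Chinese goods:
retaliatory_total = 104.0 + all_li_ion + china_battery_2026
print(f"{retaliatory_total:.1f}%")  # 132.5% -> the "132%" figure
```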

The economic effects could be huge. The US still imports the majority of its lithium-ion batteries, and nearly 70% of those imports are from China. The US imported $4 billion worth of lithium-ion batteries from China just during the first four months of 2024.

Although US battery makers could theoretically stand to benefit, there are a limited number of US-based factories. And most of those factories are still purchasing components from China that will be subject to the tariffs, because it’s hard to overstate just how dominant China is in battery supply chains.

While China makes roughly three-quarters of lithium-ion cells, it’s even more dominant in components: 80% of the world’s cathode materials are made in China, along with over 90% of anode materials. (For those who haven’t been subject to my battery ramblings before, the cathode and anode are two of the main components of a battery—basically, the plus and minus ends.)

Even battery makers that work in alternative chemistries don’t seem to be jumping for joy over tariffs. Lyten is a California-based company working to build lithium-sulfur batteries, and most of its components can be sourced in the US. (For more on the company’s approach, check out this story from 2024.) But tariffs could still spell trouble. Lyten has plans for a new factory, scheduled for 2027, that rely on sourcing affordable construction materials. Will that be possible? “We’re not drawing any conclusions quite yet,” Lyten’s chief sustainability officer, Keith Norman, told Heatmap News.

The battery industry in the US was already in a pretty tough spot. Billions of dollars’ worth of factories have been canceled since Trump took office. Companies making investments that can total hundreds of millions or billions of dollars don’t love uncertainty, and tariffs are certainly adding to an already uncertain environment.

We’ll be digging deeper into what the tariffs mean for climate technology broadly, and specifically some of the industries we cover. If you have questions, or if you have thoughts to share about what this will mean for your area of research or business, I’d love to hear them at casey.crownhart@technologyreview.com. I’m also on Bluesky @caseycrownhart.bsky.social.

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

Charts: European Views on U.S. Tariffs

YouGov is a U.K.-based research and analytics firm operating in Europe, North America, the Middle East, and Asia-Pacific. A recent YouGov survey (PDF) explores public opinion on tariffs across Western Europe.

YouGov surveyed 9,455 adults in March 2025 in the U.K. (2,155), France (1,002), Germany (2,196), Denmark (999), Sweden (1,011), Spain (1,061), and Italy (1,031).

The survey consisted of four questions, starting with, “If the U.S. were to place tariffs on E.U. goods imported to the U.S., would you support or oppose the E.U. responding by placing tariffs on American goods imported to the E.U.?”

Most respondents back retaliatory tariffs, with support highest in Denmark at 79%.

Next, the survey asked, “How much impact, if any, do you think that the U.S. placing tariffs on E.U. goods imported to the U.S. would have on the E.U. economy?” and “the [country’s] economy?”

Seventy-five percent of German respondents believe tariffs will have “a lot” or “significant” impact on their national economy.

The final survey question addressed fairness: “Do you think the E.U. has been fair or unfair in its trade dealings with the U.S. in recent years?”

AI Costs Drop 280x In 18 Months: What This Means For Marketers via @sejournal, @MattGSouthern

The cost of using advanced AI has fallen sharply.

Since late 2022, the price of using GPT-3.5-level AI models has dropped from $20.00 to just $0.07 per million tokens.

According to Stanford HAI’s AI Index Report, that’s a 280-fold reduction in less than two years.

This massive cost drop is changing the pricing of AI marketing tools. Tools that only big companies could afford are now within reach for businesses of all sizes.

AI Cost Reduction

The report shows that large language model (LLM) prices have fallen between 9 and 900 times yearly, depending on the task.

These cost reductions change the ROI for AI in marketing. Tools that were too expensive before could now pay off even for medium-sized companies.

Source: McKinsey & Company Survey, 2024 | Chart: 2025 AI Index report

The gap between the best AI models is closing. The difference between the first- and tenth-ranked models has shrunk from 11.9% to just 5.4% over the past year.

The report also shows that AI models are getting smaller while staying powerful. In 2022, to get 60% accuracy on the MMLU benchmark (a test of AI reasoning), you needed models with 540 billion parameters.

By 2024, models 142 times smaller could do the same job. This means businesses can now use advanced AI tools with less computing power and lower costs.

Chart: 2025 AI Index Report

What This Means For Marketers

For marketers, these changes bring several potential benefits:

1. Advanced Content Creation at Scale
The price drop makes it affordable to create and optimize content in bulk. Tasks can now be automated cheaply without losing quality.

2. Better Analysis
Newer AI models can process up to 1-2 million tokens (pieces of text) at once. This is enough to analyze entire websites for competitive insights.

3. Smarter Knowledge Management
Retrieval-augmented generation (RAG), where AI pulls information from your company’s data, is improving. This helps marketers build systems that ensure AI outputs match their brand voice and expertise.
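
To make the RAG pattern concrete, here is a toy sketch: score a handful of internal documents against the question, then prepend the best match to the prompt before it goes to a language model. The documents, the word-overlap scoring, and the prompt format are all stand-ins; production systems typically retrieve with vector embeddings:

```python
import re

# Toy retrieval-augmented generation flow: pick the most relevant internal
# document by word overlap, then prepend it to the prompt. Everything here
# is a stand-in for illustration only.

BRAND_DOCS = [
    "Our brand voice is plain-spoken, warm, and jargon-free.",
    "Returns are accepted within 30 days with proof of purchase.",
]

def tokens(text):
    return set(re.findall(r"[a-z0-9]+", text.lower()))

def retrieve(query, docs, k=1):
    return sorted(docs, key=lambda d: len(tokens(query) & tokens(d)),
                  reverse=True)[:k]

def build_prompt(query):
    context = "\n".join(retrieve(query, BRAND_DOCS))
    return f"Context:\n{context}\n\nAnswer in our brand voice: {query}"

print(build_prompt("How many days for returns and proof of purchase?"))
# The assembled prompt would then go to whichever LLM the team uses.
```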

The End of AI Moats?

The report shows that AI models are becoming more similar in performance, with little difference between leading systems.

This suggests that the edge in marketing technology may shift from raw AI power to how well you use it: your strategy and your integration skills.

As AI capabilities become more common, the real difference-maker for marketing teams will be how effectively they use these tools to create unique value for their companies.

For more on the state of AI, see the full report.

Google Confirms Discover Coming To Desktop Search via @sejournal, @MattGSouthern

Google has announced plans to bring Discover to desktop search. This move could change how publishers get traffic from Google.

The news came from the Search Central Live event in Madrid and was first shared by SEO expert Gianluca Fiorelli on X.

Google has tested Discover on desktop before, but this is the first time it has confirmed it’s happening. The company hasn’t said when it will launch.

What Is Google Discover?

Google Discover is a feed that shows content based on what you might like. It appears in the Google app, Chrome’s new tab page, and google.com on phones.

Unlike regular searches, you don’t need to type anything. Discover suggests content based on your interests and search history.

As Google defines it:

“Discover is a part of Google Search that shows people content related to their interests, based on their Web and App Activity.”

Why This Matters: Discover’s Growing Impact on Publisher Traffic

This desktop launch is important as Discover has become a bigger traffic source for many sites.

A January survey from NewzDash found that 52% of news publishers consider Discover a top priority. The survey also showed that 56% of publishers saw recent traffic increases from Discover.

Martin Little from Reach plc (publisher of UK news sites like Daily Mirror) recently said that Google Discover has become their “single largest traffic referral source.”

Little told Press Gazette:

“Discover is making up for [search traffic losses] and then some. Almost 50% of our titles are growing year-on-year now, partly because of the shifts in Google.”

Optimizing Content for Google Discover

You don’t need special markup or tags to appear in Discover. However, Google suggests these best practices:

  • Create quality content that matches user interests
  • Use good, large images (at least 1200px wide)
  • Write honest titles that accurately describe your content
  • Don’t use misleading previews to trick people into clicking
  • Focus on timely, unique content that tells stories well

Little noted that Discover prefers “soft-lens” content – personal stories, lifestyle articles, and niche topics. Breaking news and hard news often don’t do as well.

“You don’t get court content in there, no crime, our council content doesn’t get in there,” Little said of the kinds of stories Discover tends to avoid.

Desktop Expansion: Potential Traffic Implications

The desktop rollout could significantly change traffic patterns for publishers already using mobile Discover.

Google’s presentation slide at the Madrid event highlighted “expanding surfaces,” which suggests Google wants a more consistent experience across all devices.

For SEO pros, this is both an opportunity and a challenge. Desktop users browse differently from mobile users, which might affect how content performs in Discover.

Building a Discover Strategy

Publishers wanting to get more Discover traffic should consider these approaches:

  1. Monitor performance: Use Search Console’s Discover report to track how your content is doing.
  2. Diversify content: Don’t ignore traditional search traffic while optimizing for Discover.
  3. Focus on keeping readers: Consider using newsletters to turn Discover visitors into regular readers.
  4. Use effective headlines: Publishers note that Discover often picks headlines with a “curiosity gap” – titles that tell enough of the story but hold back key details to encourage clicks.

What’s Next?

As Google expands Discover to desktop, publishers should prepare for traffic changes. This move shows Google’s shift from just answering searches to actively suggesting content.

While we don’t know the exact launch date, publishers who understand and optimize for Discover will have an advantage.


Featured Image: DJSully/Shutterstock

Ecommerce PPC Challenges & Strategies For Second-Hand Retailers

The second-hand ecommerce sector is significant.

The global market for resale apparel alone reached $227 billion in 2024 and is projected to hit $367 billion by 2029.

What was once a traditional way of shopping, in thrift stores and auction houses, has changed drastically. U.S. online resale is expected to nearly double by 2029, reaching $40 billion.

What’s referred to as the “second-hand economy” represents a shift in how people shop, their adaptability to economic changes, and a way of acting on growing sustainability concerns by buying pre-loved items.

As this market expands at pace, brands are ramping up their investment in paid search, with major players like eBay spending over $150 million per year on Google Ads alone.

With this growth in PPC spending, brands are looking to scale and scale fast.

However, running PPC for second-hand or resale ecommerce is a very different ballgame from a traditional ecommerce model, where brands are either manufacturing the items they sell or reselling new items.

In this post, I’ve shared five ecommerce PPC strategies that will help second-hand retailers find success.

Before we jump into them, let’s dig into a few key challenges that are unique to managing paid search in this market.

Key Challenges Unique To PPC For Second-Hand Retailers

Inventory Turnover And One-Of-A-Kind Products

The flow of products will vary by retailer.

Take eBay, for example. It likely has hundreds (even thousands) of certain items, but smaller retailers or specialised brands (such as antique or vintage resellers) are most likely dealing with one-of-a-kind products.

In this scenario, once a product is gone, it’s gone.

Bidding algorithms get little time to learn which products convert the best, as many items may only be in the feed briefly, whereas others may remain in the product feed for a long time and be deprioritized in favor of newer items.

Frequent Product Updates & Data Quality

For some second-hand retailers, inventory can change daily (or hourly) as new products are acquired and are listed on the site to sell through as soon as possible.

This movement, whether fast or slow, impacts PPC campaigns that use product feeds (such as Google Shopping or Performance Max), as new data is fed into the campaigns on a frequent basis.

It can also impact search campaigns as products move in and out of stock.

Let’s say a brand has a search campaign bidding on keywords themed around “second-hand Herman Miller chairs.” It sells through 80% of the stock and is waiting for new SKUs to be added.

The efficiency of the campaign will decline, and spend could be wasted. This isn’t just for second-hand retailers; it also applies to all PPC ecommerce strategies.

In addition, data quality has to be bulletproof to ensure that products are entered into the most relevant auctions and searchers are provided with the best possible data prior to clicking through.

For example, say one product is uploaded with the title: Nike – Air Force 1 ’07 – White – Size 10. And another: Carhartt Hoodie.

In this scenario, retailers will be forever going back and forth across various teams to fix data issues with the feed (something I’ve seen firsthand).

Then, throw in brands such as Depop and Vinted, which have user-generated listings, and the task of creating a refined, rich data feed becomes even more complex.

Dynamic Budget Allocation

With an ever-changing flow of products and search queries, accurately forecasting and allocating budgets can be a difficult task.

A category may perform great one month, where SKUs that are in high demand are in stock, then drop off the following month as the conversion rate declines due to a less desirable product selection.

Dynamic budget allocation is essential, as there are so many moving parts.

Advertisers must monitor stock levels across many touchpoints (e.g., brand, category, material) and trends in search queries, and undertake systematic performance reviews to feed into how much budget to cut out for PPC and where to allocate this.

Complex Measurement And Reporting

With SKUs coming and going, traditional product reporting is limited.

Advertisers can’t rely on item-level metrics alone, as many items record zero sales (or a single sale) before being removed from the feed and dropping out of product/listing groups.

This essentially takes away the traditional strategy of catering to your “best sellers” first – a strategy that relies on accrued product-level data to feed into various characteristics set by advertisers (e.g., X number of sales over X days at a ROAS of X = best seller).

Second-hand retailers must aggregate their product data to uncover trends in brands, styles, materials, product types, and more.

This comes with a level of expertise in creating these reports and the time to maintain, update, and actually use them to inform the PPC strategy.

So, How Can Second-Hand Retailers Succeed In Paid Search Given The Limitations?

Despite these challenges, second-hand retailers can thrive with PPC.

Here are five strategies that are tried and tested and will lay the groundwork for creating a second-hand PPC powerhouse.

1. Optimize And Enrich Your Shopping Feed

Product feeds are the heart of PPC for ecommerce.

Campaign types that use product listings, such as Google Shopping and Performance Max, allow advertisers to get their products in front of searchers prior to clicking through.

Screenshot from search for [second hand supreme jackets], Google, March 2025

As with a couple of points raised so far, this isn’t a strategy exclusive to second-hand retailers, but the importance of making sure data is rich and processes are in place is critical with many different SKUs flowing in and out of the inventory.

So that you can sleep at night knowing you’re matching the most relevant queries and have the best possible data in your feed, I’d recommend this approach:

  • The Basics: Create a structure and put a process in place that accounts for every stakeholder who will be involved in feeding data at any point. If you want to ensure you spot any anomalies immediately (definitely recommended), you could use a third-party tool, or export your feed to a sheet and build a script to check that all SKUs follow the same pattern (a minimal example follows this list).
  • The Next Step: Custom labels, keyword research, supplemental feeds, and more. This could be:
    • Adding detailed information on the condition of an item in the description, with a summary in the title (e.g., new with tags, used once, X number of owners, etc.).
    • Qualifying that the items are not brand new. This will help with entering ad auctions for pre-loved/second-hand queries, and it will also help qualify traffic, as your listing will clearly show up front that the item is not new.
    • Categorizing groupings such as era, designer, or material for antique and vintage stores. This is useful for structuring both the feed and the way campaigns are grouped in the ad platform.
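
As a minimal example of the anomaly-checking script mentioned under “The Basics,” the snippet below validates exported titles against the “Brand - Product - Colour - Size” convention from the earlier Nike example (shown with plain hyphens). The regex is an assumption; adapt it to your own feed’s rules:

```python
import re

# Illustrative feed-title check. The pattern assumes a
# "Brand - Product - Colour - Size" convention; adjust as needed.

TITLE_PATTERN = re.compile(r"^[\w' .]+ - [\w' .]+ - [\w' .]+ - Size \d+$")

titles = [
    "Nike - Air Force 1 '07 - White - Size 10",  # follows the convention
    "Carhartt Hoodie",                           # too sparse: flag it
]

for title in titles:
    status = "OK" if TITLE_PATTERN.match(title) else "NEEDS FIXING"
    print(f"{status}: {title}")
```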

2. Think Categories (Or Bespoke Groupings), Not Individual Product Sales

Ecommerce PPC strategies are often built on best-selling product data.

This segment naturally demands the highest budget allocation as conversion rate, return on ad spend (ROAS), etc., is often the highest.

However, many second-hand retailers may only ever have one (or a handful) of every item, which almost breaks apart the traditional approach of managing paid search for ecommerce.

All is not lost, though. Brands can find success by segmenting (and reporting) by category and using this to steer budgeting, forecasting, day-to-day optimisation, and more.

Aggregating this data helps to:

  • Uncover meaningful trends to both share with the wider business and feed into bidding algorithms.
  • Set the foundations for adapting to change. For example, say a luxury handbag reseller receives a high intake of products from a new brand/designer. A category-level split makes it easier to drive visibility for these items through PPC, whereas a “best-seller” structure would not contain the new items and wouldn’t prioritize them.
  • Assist with flexing media budgets; depending on size, some retailers may be dealing with hundreds of thousands of items, and being able to pull back or scale spend on what works is crucial (a minimal rollup example follows this list).
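
As a minimal rollup example, item-level data with almost no per-SKU signal can still be aggregated into usable category trends. The column names and figures below are hypothetical, and pandas is used purely for familiarity:

```python
import pandas as pd

# Hypothetical item-level export: ad cost and attributed revenue per SKU.
items = pd.DataFrame({
    "category": ["handbags", "handbags", "watches", "watches", "watches"],
    "cost":     [120.0, 80.0, 60.0, 95.0, 40.0],
    "revenue":  [300.0, 0.0, 210.0, 150.0, 0.0],
})

# Individual SKUs rarely accrue enough data, so roll up to category level.
by_category = items.groupby("category").agg(
    spend=("cost", "sum"),
    revenue=("revenue", "sum"),
    skus=("cost", "size"),
)
by_category["roas"] = by_category["revenue"] / by_category["spend"]
print(by_category.round(2))
```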

3. Don’t Be Afraid To Broaden Your Reach, With Care

I have seen many brands in this space doubling down on Search and Shopping, with strict query funneling to only serve ads for queries that contain “second-hand”/”pre-loved”/”used.”

This is logical and may work well. However, absent account data telling us otherwise, this strategy neglects multiple audiences who are not only in the market for the items but may also convert at a higher rate in the short term and help drive up Customer Lifetime Value (CLV) in the long run.

This strategy makes the assumption that if the query has been pre-qualified (second-hand/pre-loved/used, etc.), the audience searching will be the most profitable, which, in my experience, is not always the case.

Take a second-hand camera retailer, for example. If it only bids on pre-qualified queries such as “used Canon cameras” or “second-hand point-and-shoot cameras,” it would miss all users who are looking for the brands they sell, general camera queries, longer-tail searches, and more.

This is where campaign types such as Performance Max and especially Dynamic Search Ads (DSA) are certainly worth testing to expand your reach and serve ads for intent-driven searches across a wide range of audiences.

4. Align PPC Efforts With Inventory And Operations

This isn’t exclusive to second-hand retailers, but it is especially important.

Cross-team collaboration is a must when products are flowing in and out of stock, and retailers have an ever-changing number of products on site.

Data should flow both ways:

PPC → Wider Team (Merchandising, Buying, Operations, etc.)

  • Which categories/brands/designers have indexed up or down vs. average over a certain time period?
  • Are there any new queries that can help with product acquisition?
  • How has category X trended over time since stock volume increased considerably?

Wider Team → PPC

  • We’ve got X units of brand A and more to come over the next three months. How do we prioritize this?
  • The stock of category X has begun drying up. There’s not much on the market, so a restock is unlikely soon.
  • Returns for brand X are 50% above average. How much are we spending on these items each month?

Creating a virtuous cycle will only improve PPC performance and build relationships.

Finding the best way to pull this data may take time, as teams will need to share various datasets (stock reports, CRM, order books, etc.) to then feed into a centralized report, but the payoff is definitely worth it.

5. Think Outside Of The PPC Box

In the world of second-hand retail, it is crucial for PPC teams to have a clear understanding of profitability beyond account-level KPIs such as ROAS or cost per acquisition (CPA).

Unlike a traditional ecommerce model where brands manufacture the products themselves, the second-hand market, whatever the product may be, will likely make less margin comparatively due to lower prices, costs of acquiring the product, operational expenses, etc.

Here are a few metrics I would highly recommend keeping close to when making strategic PPC decisions:

  • Return Rate: The average return rate for ecommerce was 16.9% in 2024, with products that require specific fits (clothing, shoes, etc.) rising as high as 30%, and even higher during peak. With margin front of mind, weaving these rates into PPC budgeting, forecasting, and KPI setting is essential.
  • New Customer Acquisition Cost (nCAC): This measures the average expense incurred to acquire a new customer and is calculated as total new-customer marketing expenses divided by the number of new customers acquired (see the worked example after this list). While it may not be the primary goal, nor are all accounts built to accommodate clear new-versus-returning budget splits, this is a metric that must be observed in line with CLV, ROAS, etc.
  • Customer Lifetime Value (CLV): PPC teams operating within this business model have to look past the first sale. CLV helps quantify the long-term value of a customer, which unlocks more informed decisions for budgeting, forecasting, and optimization, especially when acquiring new customers.
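
To show how these metrics interact, here is a back-of-envelope calculation. Every figure is hypothetical except the roughly 17% return rate, which echoes the ecommerce average cited above:

```python
# Back-of-envelope nCAC vs. CLV comparison; all inputs are hypothetical.

monthly_new_customer_spend = 12_000.0  # ad spend attributed to new customers
new_customers_acquired = 300
ncac = monthly_new_customer_spend / new_customers_acquired  # $40.00

avg_order_value = 65.0
return_rate = 0.17        # near the 16.9% ecommerce average cited above
orders_per_year = 2.5
gross_margin = 0.35       # second-hand margins are tighter than new retail

# Simplistic one-year CLV: net-of-returns revenue times margin.
clv = avg_order_value * (1 - return_rate) * orders_per_year * gross_margin
print(f"nCAC: ${ncac:.2f}, one-year CLV: ${clv:.2f}")
# nCAC: $40.00, one-year CLV: $47.21 -> some headroom to scale acquisition.
```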

In second-hand retail, where margins are tighter, understanding the full customer journey and setting KPIs using a clear view of profitability will empower PPC teams to make smarter, more commercially aligned decisions.

Summary: A Different Approach, A World Of Potential

With changing inventory and tighter margins, advertisers need to adopt a different approach to PPC.

Whether a billion-dollar resale store with self-serve listings or a small clothing store, the same principles apply. As with most things PPC, it all comes back to having clear, accurate data.

Advertisers have a wealth of tactics to consider, from ensuring the feed is the best it can be to setting targets using bespoke groupings that change over time.

One-size-fits-all approaches may bring short-term stability, but for long-term growth and scalability, the teams that think and adapt quickly will lead the pack.


Featured Image: Wayhome Studio/Shutterstock