Google Reveals How It Prefetches Search Results For Faster Loading

Google has shared new details on how it uses the Speculation Rules API to speed up clicks on search results.

When searching in Chrome, Google preloads parts of a webpage before you click, leading to faster load times.

Here’s an overview of how it works and the benefits Google has observed.

How Prefetching Works

Google prefetches the top two search results before you click them. As soon as the results appear on screen, your browser fetches those links in the background.

If you click on one, it will already be partially loaded, reducing your wait time.

Google explains:

“Google Search has been making use of the Speculation Rules API to improve navigation speed from the search results page to the result links and they’ve been using a few features of the API that may be of interest to other site owners.”

Early on, one of Google’s primary tactics was:

“One of the first uses of speculation rules was to prefetch the first two search results.”

In other words, your browser quietly fetches the HTML from the top search results as soon as you land on the results page, giving you a head start if you decide to click.
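For site owners who want to try the same pattern, speculation rules are declared as JSON inside a `<script type="speculationrules">` tag. Here is a minimal sketch of a list rule that prefetches two links; the URLs are hypothetical, not Google's actual markup:

```html
<!-- Sketch: prefetch the HTML of two specific links ahead of any click. -->
<script type="speculationrules">
{
  "prefetch": [
    { "urls": ["/top-result", "/second-result"] }
  ]
}
</script>
```

A prefetch fetches only the document itself; subresources such as images and scripts are loaded once the user actually navigates.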

Performance Gains

Tests show a noticeable speed boost.

On Chrome for Android, Google recorded a 67-millisecond drop in Largest Contentful Paint (LCP), while Desktop Chrome users saw a 58.6-millisecond improvement.

Beyond prefetching the top two results, Google selectively prefetches other results when a user’s cursor hovers over them on desktop:

“[The Speculation Rules API] was enhanced with an eagerness property that allows speculations to only happen when the user hovers on, or starts to click a link. Google Search decided to move beyond the first two search results and also prefetch remaining search results—but only when the user hovers over the link…”

The hover action triggers a moderate prefetch, saving bandwidth for links that might not be clicked.
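In markup, that hover-triggered behavior maps to the API's document rules combined with the `eagerness` property. A hedged sketch, assuming the result links carry a hypothetical `result-link` class:

```html
<!-- Sketch: prefetch any matching link, but only once the user hovers it. -->
<script type="speculationrules">
{
  "prefetch": [
    {
      "where": { "selector_matches": ".result-link" },
      "eagerness": "moderate"
    }
  ]
}
</script>
```

With `"moderate"`, the browser waits for a hover before speculating; `"conservative"` waits until pointer-down, and `"immediate"` speculates right away.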

Mobile devices, however, lack hover functionality, so Google didn’t see the same benefits there.

Future Experiments & Browser Support

Google is exploring prerendering entire search results pages (SERPs) in certain scenarios, such as when you start typing a search in Chrome’s address bar.

Other search engines can adopt this technology, too, but Google remains the main implementer for now.

The Speculation Rules API currently works in Chromium-based browsers like Chrome.
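Given that support gap, it is worth feature-detecting before injecting rules on your own site. A minimal sketch; the helper name and URLs are illustrative, not part of the API:

```javascript
// Inject a prefetch rule set only where the Speculation Rules API exists.
// Returns true if rules were added, false in unsupported browsers/runtimes.
function addPrefetchRules(urls) {
  const supported =
    typeof HTMLScriptElement !== 'undefined' &&
    typeof HTMLScriptElement.supports === 'function' &&
    HTMLScriptElement.supports('speculationrules');
  if (!supported) return false; // graceful no-op in non-Chromium browsers

  const script = document.createElement('script');
  script.type = 'speculationrules';
  script.textContent = JSON.stringify({ prefetch: [{ urls }] });
  document.head.appendChild(script);
  return true;
}
```

Unsupported browsers simply ignore the hint, so the worst case is no speedup rather than an error.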

Why This Matters

Prefetching can shave critical milliseconds off your page load time.

Google notes that even slight speed boosts matter, especially with billions of daily searches.

If your audience primarily uses Chrome, you could see performance gains by implementing the Speculation Rules API on your site.



Google CrUX Report Update Targets LCP & Network Delays

Google Chrome has released its latest Chrome User Experience Report (CrUX).

This update zeroes in on individual components of page speed, offering fresh data on Largest Contentful Paint (LCP) image subparts and real-world network round-trip times (RTT).

For SEO professionals, this means you’ll have a better understanding of what needs improvement.

Barry Pollard, Web Performance Developer Advocate at Google Chrome, made the announcement on Bluesky.

Key Updates to CrUX Report

Speed and user experience are known to impact search visibility, and Google’s latest CrUX update breaks down site performance in greater detail:

Granular LCP Details
New “image subparts” let you pinpoint what’s slowing down your largest image element.

With Time to First Byte, Resource Load Delay, Resource Load Duration, and Element Render Delay reported separately, you can see whether your bottleneck is server lag, render delays, or even how late the browser discovers your image.
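To make the arithmetic concrete, here is a sketch of how those four subparts partition a single LCP value; the millisecond figures are invented for illustration:

```javascript
// Split one LCP measurement into CrUX's four image subparts.
// All inputs are milliseconds since navigation start (hypothetical values).
function lcpSubparts({ responseStart, loadStart, loadEnd, renderTime }) {
  return {
    timeToFirstByte: responseStart,               // server/network lag
    resourceLoadDelay: loadStart - responseStart, // how late the image was discovered
    resourceLoadDuration: loadEnd - loadStart,    // how long the download took
    elementRenderDelay: renderTime - loadEnd,     // wait before the browser painted it
  };
}

const parts = lcpSubparts({
  responseStart: 300, loadStart: 900, loadEnd: 1400, renderTime: 1500,
});
// parts.resourceLoadDelay === 600, the biggest slice, so a late-discovered
// image would be the first thing to fix in this example.
```

The four subparts always sum back to the LCP time itself, so the largest term tells you which fix to prioritize.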

RTT Over ECT
Rather than lumping user connections into outdated “3G/4G” bins, Google’s new round trip time (RTT) tri-bins highlight the true speed of your audience’s networks.

Identifying high-latency segments can guide you toward optimizing for users in specific regions or network conditions.

BigQuery & CrUX Vis Updates
SEO professionals can access updated BigQuery datasets, which include more granular RTT information and broader coverage for metrics like Interaction to Next Paint (INP).

Additionally, the CrUX Vis tool (cruxvis.withgoogle.com) now shows everything from streamlined LCP subparts to country-level RTT stats, ideal for spotting speed issues at a glance.

Better Data Coverage, More Competitive Edge
By retiring the Effective Connection Type (ECT) dimension, Google can report richer data for a broader range of sites.

How To Leverage The New Metrics

  • Zero In on Server Delays: Pinpoint high Time to First Byte if your hosting setup or backend scripts are slowing that crucial first load.
  • Check Image Discovery Times: A Resource Load Delay might indicate a late-discovered hero image or JavaScript injection issues, which are common trouble spots in modern frameworks.
  • Optimize Media Files: Long Resource Load Duration can highlight oversized or unoptimized images that hamper page speed, a frequent cause of poor LCP.
  • Tailor Solutions for Different Regions: With RTT tri-bins now informing you if certain countries or networks face big delays, you can serve region-specific content faster (e.g., via CDNs or local servers).

Why This Matters

The latest CrUX update provides new data that identifies where your page may be failing visitors and harming your rankings, helping you fix issues quickly and improve site performance and visibility.

To analyze your site, visit cruxvis.withgoogle.com or check the CrUX BigQuery dataset.



Surprising Review Stats To Feed Your Local Strategy [Study]

If you’ve already walked a long mile in your local SEO shoes, chances are you have plenty of lived experiences that allow you to predict some of the responses to large-scale local business review surveys.

It’s affirming to see, for example, that 85% of our respondents place a degree of trust in local business reviews, and 92% now believe that brands responding to reviews have become part of offering good customer service.

Consumers are also seeking review content across a wide variety of platforms, including traditional local business listings like Google Business Profiles, social media sites like Instagram, and other localized online spaces.

You could have made an educated guess about stats like these, but pay attention to the data in this survey that contains genuine surprises.

Statistics that challenge your biases provide critical learning moments that can feed into your local search marketing strategy.

Unexpected data points can also help you earn buy-in from decision-makers for local SEO initiatives you want to explore.

I’d like to share six local business review findings from our survey of 1,200+ North American consumers that taught me something new, plus one stat I accurately predicted and that I want to be sure is accessible to anyone involved in marketing local brands.

1. Young Consumers Are Surprisingly Patient When It Comes To Owner Responses To Reviews

Screenshot from GatherUp, January 2025

I was genuinely surprised to discover that consumers aged 45 to 60 have the highest expectations when it comes to review response time frames, with 46% of them expecting to hear back from businesses within one day.

I wrongly supposed that our youngest demographic would have the least patience because they have grown up in an era of such intense automation.

Brands that primarily serve youthful consumers are constantly told that all processes must be made as frictionless as possible to avoid abandonment and loss.

Still, this survey question reveals that – at least when it comes to owner responses to reviews – 18- to 29-year-olds are the group most willing to tolerate waiting a week or more for a reply.

The best practice remains to respond to all incoming reviews as quickly as you possibly can.

If your consumer base is young, there could be other elements of your local search marketing and brand-consumer communications that require more urgent action.

2. Word-Of-Mouth Recommendations Are Far More Trusted Than Reviews

Screenshot from GatherUp, January 2025

The survey found that just 31% of consumers trust online local business reviews as much as they do personal recommendations from family and friends.

This stat will come as a genuine shocker to anyone who has concluded from other surveys over the past couple of decades that most people trust reviews as much as they do word-of-mouth (WOM) referrals.

It’s vital to know that 45% of your consumer base will likely put more trust in whether the people they know in real life think your business is worth trying than in the sentiment of online strangers.

This finding emphasizes the critical need for customer service standards that inspire consumers to recommend your brand to their circle.

Formal loyalty programs should be strongly considered in your local search marketing strategy.

3. Review Reading Is On The Rise

Screenshot from GatherUp, January 2025

Due to our tech-driven society’s fascination with the next new thing, I might have thought this survey would yield signs that the review honeymoon could be over.

After all, local business reviews are now more than 20 years old, and the internet is increasingly full of distractions that could supplant the quiet habit of perusing review content.

As it turns out, I couldn’t be more wrong.

A significant 59% of consumers report spending more time reading local business reviews than they did five years ago.

We can theorize about whether this uptrend might be the result of the COVID-19 pandemic causing more dependence on the web, the outrageous cost of remote shipping prompting consumers to search for local alternatives, or other contributing factors.

Whatever the cause, the narrative you need to take to your next local search marketing strategy session is that the value of reviews is on the rise, meaning reputation management deserves priority resources.

It’s important to note that Google continues to invest in highlighting review content, both on Google Business Profiles and in other formats like the bonus text snippets called local justifications that can appear in local packs and Maps. Google clearly thinks that reviews matter.

4. Are Star Ratings Less Important Than You Think?

Image from GatherUp, January 2025

My gut would tell me that the overall star rating of local businesses on listings like Google Business Profiles would be the ultimate factor determining whether a particular business gets chosen by a consumer for a transaction. The data says otherwise.

Just 23% of respondents stated that they looked at the overall star rating of brands the last time they consulted reviews.

This pales in comparison to the 67% who focused on the most recent reviews, and the 50% who prioritized looking at the lowest-star reviews first.

This is a takeaway I find so surprising that it is hard to construct any narrative other than this: Modern consumers have realized that average ratings include every review a business has ever received, and that this may not reflect current quality.

The public is smart to seek out how fellow consumers feel about a business today, this week, or this month, instead of how a brand has performed historically.

The learning here is obvious: A successful reputation management program is one that delivers a steady stream of fresh, incoming review content.

If your review river is stagnating, you need to find whatever is damming it and remove those obstacles to ensure that your community can quickly access recent sentiment about your brand.

5. Only A Minority Of Review Readers Are Interested In Responses That Detail Brand Improvements

Screenshot from GatherUp, January 2025

In the past, I’ve recommended local business clients be certain that their owner responses to negative reviews include a detailed explanation of the improvements they’ve made to ensure that other customers don’t experience the same problem the reviewer encountered.

For example, if an unhappy reviewer mentioned that their pizza was delivered cold, I would have typically advised the brand to analyze whether this sentiment about cold food was emergent and uptrending, and then to make an operational fix.

I would have counseled them to respond to all such negative reviews with the information that the business had invested in new insulated carrier bags, or what have you.

Now, seeing that just 34% of review readers highly value this type of explanation, I may alter my best practice advice in a particular use case.

I am frequently asked by large multi-location enterprises about how to prioritize review responses when dealing with hundreds or thousands of incoming reviews.

I have seen some marketers suggest that a business should respond only to negative reviews to make scaling more manageable, but I remain leery of this advice because surveys like this one confirm that 73% of consumers appreciate being thanked by the business for their positive feedback.

Instead, if scaling review management is necessitating a shortcut at the moment, you might experiment with limiting the text of your owner responses to negative reviews to a sincere apology and contact information for in-person resolution, rather than taking extra time to describe operational improvements.

6. Instagram Is Definitely In The Local Business Reputation Game

Screenshot from GatherUp, January 2025

I hear a lot of grief from small business owners about Instagram’s algorithm, and though I use the platform fairly frequently, I find its formatting a bit of a mess.

These are biases on my part that led to my surprise that 52% of modern consumers are relying on this social media space for local business recommendations.

I think YouTube is a more natural fit for local business marketing for most brands, but if there’s one mantra to put at the heart of your company, it’s to be wherever your customers are.

Of course, your vertical comes into play here. Business models that relate to pleasure (think restaurants, bakeries, travel) have an advantage in the Instagram community.

If you are marketing a legal firm or a plumbing franchise, this particular social sphere could be a hard one to make headway in.

My overall takeaway from responses to this question is that a growing number of platforms are influencing local purchasing decisions. It’s not enough to manage your reputation on Google, Yelp, or TripAdvisor.

You need a presence and a fandom on whichever platforms are favored by the towns and cities you serve to maximize the referrals your brand receives around the web.

7. The One Stat I Don’t Want You To Miss!

Image from GatherUp, January 2025

92% of consumers now consider owner responses to reviews as part and parcel of providing good customer service.

This is the statistic that did not surprise me, but which I had never seen codified by any other local business review survey.

It confirms the advice I’ve been giving small-to-enterprise brands for many years now: creating the best possible online consumer experiences is as crucial to building a strong reputation as what happens within a business’s physical premises.

Your customers’ online and offline experiences with your company work in concert to form their opinions and determine whether they will come to you for repeat transactions, recommend you to others, and speak well of you socially.

Given this, timely, professional, accountable owner responses to reviews must be seen as a top-tier activity in your local search marketing strategy.

Few brands are large enough to safely be able to ignore a customer who is trying to communicate with them via a review.

Monopolies and near-monopolies that get away with review neglect are likely leaving profits on the table: even if a town has only one hardware franchise, fabric store branch, or supermarket, remote fulfillment is now at most consumers’ fingertips, thanks to the internet.

It’s my hope that this statistic will cut through so many of the tantalizing shortcuts to real customer service that are on offer today.

There is no more vital or lucrative focus for local brands of any size than ensuring that they are in a trustworthy, responsive, and reliable relationship with their customer base.

Smart brands will put this at the heart of their marketing strategy.

Summing Up

Surveys matter to the local SEO industry because they both confirm hypotheses and challenge biases, offering the opportunity to base strategy on data instead of guesses.

This useful survey taught me not to undervalue the patience of the youngest consumers and to encourage my clients to earn more WOM recommendations because they are more trusted than online equivalents.

Also, it taught me that online distractions aren’t getting in the way of review reading, fresh review content is more important than ever, shorter responses to negative reviews may be acceptable in some cases, and Instagram needs to be thought of as a dominant player in the local business reputation milieu.

It also confirmed my long-suspected but up-to-now unproven theory that owner responses must be seen as integral to providing good customer service.

If you’re marketing a brand that is not yet bringing its A-game to reputation management, you can share the following tips to help it rapidly improve, based on additional findings of this survey:

  • Begin collecting email and SMS contact info at the time of service so that you can request reviews. 83% of your customers will be at least somewhat responsive to your requests for their reviews.
  • Train staff to ask for reviews in person at the time of service. 47% of customers prefer this form of request.
  • Respond to all incoming reviews in a timely fashion. 73% of consumers appreciate being thanked for positive feedback, and 79% expect your response to their complaints.
  • Respond to negative reviews with an apology and an offer to make things right. 73% of unhappy customers will be willing to give your business a second chance if your owner response solves their problems.
  • Avoid engaging in any form of review fraud. Only 14% of people will give your business a try if your local business profiles get stamped with a review spam warning.

My final tip: A good large-scale review survey should inspire you to conduct a smaller one of your own within your unique consumer base.

Polling customers on a regular basis is the best way to spot new trends, behaviors, and opportunities. The better you know the preferences and habits of your community, the better prepared you’ll be to serve.

You can read the full survey results from GatherUp.


The dream of offshore rocket launches is finally blasting off

Want to send something to space? Get in line. The demand for rides off Earth is skyrocketing, pushing even the busiest spaceports, like Florida’s Kennedy Space Center, to their operational limits. Orbital launches worldwide have more than doubled over the past four years, from about 100 to 250 annually. That number is projected to spiral further up this decade, fueled by an epic growth spurt in the commercial space sector.

To relieve the congestion, some mission planners are looking to the ocean as the next big gateway to space. China has sent more than a dozen space missions from ocean platforms since 2019, most recently in January 2025. Italy’s space program has announced it will reopen its ocean launchpad off the coast of Kenya, while German space insiders envision an offshore spaceport in the North Sea. In the US, the idea of sea launches has attracted attention from heavyweights like SpaceX and inspired a new startup called the Spaceport Company.

Launching rockets from offshore platforms like barges or oil rigs has a number of advantages. For one thing, it dramatically expands the potential locations to lift off from, especially along the equator (launching there gives rockets a natural speed boost, because Earth’s rotation carries the surface fastest at the equator). At the same time, it is potentially safer and more environmentally friendly, placing launches farther from population centers and delicate ecosystems.

Ocean launches have taken place on and off for decades. But the renewed interest in offshore spaceports raises a host of questions about the unique regulatory, geopolitical, and environmental trade-offs of sea-based launches. It also offers a glimpse of new technologies and industries, enabled by a potentially limitless launch capacity, that could profoundly reshape our lives.

“The best way to build a future where we have dozens, hundreds, or maybe thousands of spaceports is to build them at sea,” says Tom Marotta, CEO and founder of the Spaceport Company, which is working to establish offshore launch hubs. “It’s very hard to find a thousand acres on the coast over and over again to build spaceports. It’s very easy to build the same ship over and over again.”

The saga of sea launches

The vision of oceanic spaceports is almost as old as rocketry itself. The first large rocket to take off from sea was a V2, the notorious missile developed by Germany in World War II and subsequently adopted by the United States, which the US Navy launched from the aircraft carrier USS Midway south of Bermuda on September 6, 1947. 

As it turned out, the inaugural flight was a bit of a mixed bag. Neal Casey, an 18-year-old technician stationed on the Midway, later recalled how the missile tilted dangerously starboard and headed toward the vessel’s own command center, known as the island.

“I had no problem tracking the rocket,” said Casey, according to the USS Midway Museum. “It almost hit the island.”

Despite this brush with disaster, the test was considered a success because it proved that launching rockets from sea platforms was technically feasible. That revelation enabled the proliferation of missile-armed vessels, like warships or submarines, that have prowled the sea ever since.

Of course, missiles are designed to hit targets on Earth, not venture into space. But in the early 1960s Robert Truax, an American rocketry engineer, began pursuing a spectacular vision: the Sea Dragon. 

Standing nearly 500 feet tall, it would have been by far the biggest rocket in history, towering over the Apollo Program’s Saturn V or SpaceX’s Starship. No launchpad on land could withstand the force of its liftoff. A rocket this gargantuan could only be launched from a submerged position beneath the sea, rising out of the water like a breaching whale and leaving whirlpools swirling in its wake.

Truax proposed this incredible idea in 1963 while he was working at the rocket and missile manufacturer Aerojet General. He was even able to test a few small prototypes, including the Sea Bee, which was fired from under the waters of San Francisco Bay. Though the Sea Dragon never became a reality, the concept captured the imaginations of space dreamers for decades; most recently, it was depicted bursting from the ocean in the Apple TV+ series For All Mankind.

Truax was eerily prescient about many future trends in spaceflight, and indeed, various governments and private entities have developed offshore launch platforms to take advantage of the flexibility offered by the seas.

“The most wanted launching sites are close to the equator,” says Gerasimos Rodotheatos, an assistant professor of international law and security at the American University in the Emirates who has researched sea-based launches. “Many countries there are hard to deal with because of political instability or because they don’t have the infrastructure. But if you’re using a platform or a vessel, it’s easier to select your location.”

Another major advantage is safety. “You’re far away from cities,” Rodotheatos adds. “You’re far away from land. You’re minimizing the risk of any accidents or any failures.”

For these reasons, rockets have intermittently lifted off from sea for nearly 60 years, beginning with Italy’s Luigi Broglio Malindi Space Center, a retrofitted oil rig off the coast of Kenya that launched orbital missions from the 1960s to the 1980s and may soon reopen after a nearly 40-year hiatus. 

Sea Launch, a multinational company founded in 1995, launched dozens of missions into orbit from the LP Odyssey, another repurposed drilling rig. The company might still be in business if Russia had not annexed Crimea in 2014, a move that prompted the venture—a partnership between Russia, Ukraine, the United States, and Norway—to shutter later the same year. 

The saga of Sea Launch proved that offshore launches could be commercially profitable, but it also exposed gray areas in international marine and space law. For instance, while Sea Launch was a venture between four spacefaring nations, it registered its rig and vessels to Liberia, which has been interpreted as a flag of convenience. Such strategies could present the opportunity for companies or other entities to evade certain labor laws, tax obligations, and environmental regulations.  

“Some states are very strict on the nationality and transparency of ownership, and other states less strict,” says Alla Pozdnakova, a professor of law at the University of Oslo’s Scandinavian Institute for Maritime Law, who has researched sea-based launches. “For now, it seems that it hasn’t been really that problematic because the United States, for example, would require that if you’re a US citizen or a US company, then you have to apply for a license from the US space authorities, regardless of where you want to launch.”

But if the US imposes strict oversight on launches, other nations might apply different standards to licensing agreements with launch providers. “I can imagine that some unauthorized projects may become possible simply because they are on the seas and there is no real authority—by contrast to land-based space launches—to supervise those kinds of launches,” Pozdnakova says.

Boeing, which managed Sea Launch, was fined $10 million in 1998 by the US Department of State for allegedly sharing information about American defense technology with its foreign partners in violation of the Arms Export Control Act. In addition to the legal and national security risks posed by Sea Launch, Pacific Island nations raised concerns to the United Nations in 1999 that the company’s offshore rockets could damage the environment by, for instance, creating oil slicks from unused fuel in discarded boosters. 

The complex issues that offshore spaceports raise for international law, environmental protection, and launch access have never been more relevant. SpaceX, which is famous for pioneering offshore rocket landings, has also flirted with sea-based launches. The company went so far as to purchase two oil rigs for $3.5 million apiece in 2020. They were renamed Deimos and Phobos after the two moons of Mars.

“SpaceX is building floating, superheavy-class spaceports for Mars, moon & hypersonic travel around Earth,” SpaceX CEO Elon Musk posted on Twitter (when it was still Twitter) in 2020. 

SpaceX eventually abandoned this project and sold the rigs, though Gwynne Shotwell, its president and COO, said in 2023 that sea-based launches were likely to be part of the company’s future. SpaceX did not respond to a request for comment. 

The company might need to move launch operations offshore if it wants to carry through on its aspirations for Starship, which is the most powerful rocket ever developed and the keystone of SpaceX’s future plans to send humans to the moon and Mars. “We have designed Starship to be as much like aircraft operations as we possibly can get it,” Shotwell said at a conference in 2023, according to SpaceNews. “We want to talk about dozens of launches a day, if not hundreds of launches a day.”

The environmental impact of launching hundreds of rockets a day, either from sea or land, is not known. While offshore launches pose fewer direct risks to local environments than land launches, very little is understood about the risks that rocket emissions and chemical pollution pose to the climate and human health at current levels, much less exponentially higher ones. 

“It’s hard to deny that launching or emitting anything further from people is usually better,” says Sebastian Eastham, the senior lecturer in sustainable aviation at Imperial College London, who studies aerospace emissions and their environmental impacts. “But when we say that we’re concerned about the emissions, it is incomplete to say that we’re not launching near people, so people aren’t going to be affected.”

“I really hope that we find out that the impacts are small,” he continues. “But because you have this very rapid growth in launch emissions, you can’t sample now and say that this is representative of what it’s going to be like in five years. We’re nowhere near a steady state.”

In other words, rocket launches have been largely overlooked as a source of greenhouse-gas emissions and air pollution, simply because they have been too rare to be considered a major contributor. As space missions ramp up around the world, experts must aim to constrain the impact on climate change, the ozone layer, and pollution from spent parts that burn up in the atmosphere.

The McDonald’s of spaceports

Offshore launches are almost routine in China, where companies like Galactic Energy, Orienspace, and the China Aerospace Science and Technology Corporation have expanded orbital liftoffs from barges. (None of these companies responded to a request for comment.) 

But at the moment, sea-based launches are limited to small rockets that can deploy payloads of a few thousand pounds to orbit. No ocean spaceport is currently equipped to handle the world’s most powerful rockets, like SpaceX’s Falcon Heavy, which can deliver more than 140,000 pounds to orbit. There are also currently no public plans to invest in sea-based infrastructure for heavy-lift rockets, but that may change if smaller offshore spaceports prove to be reliable and affordable options.

“All the activities now are based on off-the-shelf technologies,” Rodotheatos says, meaning facilities like oil rigs or barges. “If one company makes an investment to design and implement a floating platform from zero, specifically fitted for that purpose, I expect to see a big change.” 

Tom Marotta founded the Spaceport Company in 2022 with a similar long-term vision in mind. After working both for the space company Astra and on the regulatory side at the Federal Aviation Administration’s Office of Commercial Space Transportation, Marotta observed what he calls a “spaceport bottleneck” that had to be addressed to keep pace with the demands of the commercial space sector.  

To that end, the Spaceport Company procured a former US Navy training vessel, named the Once in a Lifetime after the Talking Heads song, as its first launchpad. The company is currently serving customers for suborbital space missions and missile tests, but its broader vision is to establish a network of scalable orbital spaceports across the ocean.

“We want to be the McDonald’s of spaceports, and build a model that can be repeated and copied-and-pasted all around the world,” Marotta says.

Marotta sees boundless applications for such a network. It could expand launch capacity without threatening coastal ecosystems or provoking pushback from local communities. It could serve as a reliable backup option for busy spaceports on land. It could give nations that normally don’t have access to spaceflight an affordable option for their own launch services. 

“Many nations want their own sovereign orbital launch capability, but they don’t want to spend a billion dollars to build a launchpad that might only be used once or twice,” Marotta says. “We see an opportunity there to basically give them a launchpad on demand.”

Marotta also has another dream in mind: ocean platforms could help to enable point-to-point rocket travel, capable of transporting cargo and passengers anywhere on Earth in under 90 minutes.

“You’re going to need dedicated and exclusive use of rockets off the coasts of major cities to serve that point-to-point rocket travel concept,” Marotta says. “This is science fiction right now, but I would not be surprised if in the next five years we see [organizations], particularly the military, experimenting with point-to-point rocket cargo.” 

Offshore launches currently represent a small tile in the global space mosaic, but they could dramatically change our lives in the coming decades. What that future might look like, with all of its risks and benefits, depends on the choices that companies, governments, and the public make right now.

Becky Ferreira is a science reporter based in Ithaca, NY. She writes the weekly Abstract column for 404 Media and is the author of the upcoming book First Contact, about the search for alien life.

Can AI help DOGE slash government budgets? It’s complex.

This story originally appeared in The Algorithm, our weekly newsletter on AI. To get stories like this in your inbox first, sign up here.

No tech leader before has played the role in a new presidential administration that Elon Musk is playing now. Under his leadership, DOGE has entered offices in a half-dozen agencies and counting, begun building AI models for government data, accessed various payment systems, had its access to the Treasury halted by a federal judge, and sparked lawsuits questioning the legality of the group’s activities.  

The stated goal of DOGE’s actions, per a statement from a White House spokesperson to the New York Times on Thursday, is “slashing waste, fraud, and abuse.”

As I point out in my story published Friday, these three terms mean very different things in the world of federal budgets, from errors the government makes when spending money to nebulous spending that’s legal and approved but disliked by someone in power. 

Many of the new administration’s loudest and most sweeping actions—like Musk’s promise to end the entirety of USAID’s varied activities or Trump’s severe cuts to scientific funding from the National Institutes of Health—might be said to target the latter category. If DOGE feeds government data to large language models, it might easily find spending associated with DEI or other initiatives the administration considers wasteful as it pushes for $2 trillion in cuts, nearly a third of the federal budget. 

But the fact that DOGE aides are reportedly working in the offices of Medicaid and even Medicare—where budget cuts have been politically untenable for decades—suggests the task force is also driven by evidence published by the Government Accountability Office. The GAO’s reports also give a clue into what DOGE might be hoping AI can accomplish.

Here’s what the reports reveal: Six federal programs account for 85% of what the GAO calls improper payments by the government, or about $200 billion per year, and Medicare and Medicaid top the list. These make up small fractions of overall spending but nearly 14% of the federal deficit. Estimates of fraud, in which courts found that someone willfully misrepresented something for financial benefit, run between $233 billion and $521 billion annually. 

So where is fraud happening, and could AI models fix it, as DOGE staffers hope? To answer that, I spoke with Jetson Leder-Luis, an economist at Boston University who researches fraudulent federal payments in health care and how algorithms might help stop them.

“By dollar value [of enforcement], most health-care fraud is committed by pharmaceutical companies,” he says. 

Often those companies promote drugs for uses that are not approved, called “off-label promotion,” which is deemed fraud when Medicare or Medicaid pay the bill. Other types of fraud include “upcoding,” where a provider sends a bill for a more expensive service than was given, and medical-necessity fraud, where patients receive services that they’re not qualified for or didn’t need. There’s also substandard care, where companies take money but don’t provide adequate services.

The way the government currently handles fraud is referred to as “pay and chase.” Questionable payments occur, and then people try to track them down after the fact. The more effective way, as advocated by Leder-Luis and others, is to look for patterns and stop fraudulent payments before they occur. 

This is where AI comes in. The idea is to use predictive models to find providers that show the marks of questionable payment. “You want to look for providers who make a lot more money than everyone else, or providers who bill a specialty code that nobody else bills,” Leder-Luis says, naming just two of many anomalies the models might look for. In a 2024 study by Leder-Luis and colleagues, machine-learning models achieved an eightfold improvement over random selection in identifying suspicious hospitals.
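As a loose illustration of the kind of anomaly screening Leder-Luis describes — not the actual models from the 2024 study — a screen might flag providers whose billing totals sit far above their peers, or who bill a code no other provider bills. The data shape, threshold, and function name below are invented for the example:

```python
# Toy sketch of anomaly-based provider screening. Flags a provider if its
# billing total is a statistical outlier among peers, or if it bills a
# specialty code that no other provider in the pool bills.
from statistics import mean, stdev

def flag_suspicious(providers, z_threshold=3.0):
    """providers: dict of name -> {"total": dollars, "codes": set of billing codes}."""
    totals = [p["total"] for p in providers.values()]
    mu, sigma = mean(totals), stdev(totals)
    # Codes that appear for exactly one provider across the whole pool.
    all_codes = [c for p in providers.values() for c in p["codes"]]
    rare = {c for c in set(all_codes) if all_codes.count(c) == 1}
    flagged = {}
    for name, p in providers.items():
        reasons = []
        if sigma > 0 and (p["total"] - mu) / sigma > z_threshold:
            reasons.append("billing total far above peers")
        if p["codes"] & rare:
            reasons.append("bills a code no other provider bills")
        if reasons:
            flagged[name] = reasons
    return flagged
```

Real systems layer many such signals into machine-learning models; this two-rule version only conveys the shift from chasing individual payments to scoring providers against their peers.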

The government does use some algorithms to do this already, but they’re vastly underutilized and miss clear-cut fraud cases, Leder-Luis says. Switching to a preventive model requires more than just a technological shift. Health-care fraud, like other fraud, is investigated by law enforcement under the current “pay and chase” paradigm. “A lot of the types of things that I’m suggesting require you to think more like a data scientist than like a cop,” Leder-Luis says.

One caveat is procedural. Building AI models, testing them, and deploying them safely in different government agencies is a massive feat, made even more complex by the sensitive nature of health data. 

Critics of Musk, like the tech and democracy group Tech Policy Press, argue that his zeal for government AI discards established procedures and is based on a false idea “that the goal of bureaucracy is merely what it produces (services, information, governance) and can be isolated from the process through which democracy achieves those ends: debate, deliberation, and consensus.”

Jennifer Pahlka, who served as US deputy chief technology officer under President Barack Obama, argued in a recent op-ed in the New York Times that ineffective procedures have held the US government back from adopting useful tech. Still, she warns, abandoning nearly all procedure would be an overcorrection.

Democrats’ goal “must be a muscular, lean, effective administrative state that works for Americans,” she wrote. “Mr. Musk’s recklessness will not get us there, but neither will the excessive caution and addiction to procedure that Democrats exhibited under President Joe Biden’s leadership.”

The other caveat is this: Unless DOGE articulates where and how it’s focusing its efforts, our insight into its intentions is limited. How much is Musk identifying evidence-based opportunities to reduce fraud, versus just slashing what he considers “woke” spending in an effort to drastically reduce the size of the government? It’s not clear DOGE makes a distinction.


Now read the rest of The Algorithm

Deeper Learning

Meta has an AI for brain typing, but it’s stuck in the lab

Researchers working for Meta have managed to analyze people’s brains as they type and determine what keys they are pressing, just from their thoughts. The system can determine what letter a typist has pressed as much as 80% of the time. The catch is that it can only be done in a lab.

Why it matters: Though brain scanning with implants like Neuralink has come a long way, this approach from Meta is different. The company says it is oriented toward basic research into the nature of intelligence, part of a broader effort to uncover how the brain structures language.  Read more from Antonio Regalado.

Bites and Bytes

An AI chatbot told a user how to kill himself—but the company doesn’t want to “censor” it

While Nomi’s chatbot is not the first to suggest suicide, researchers and critics say that its explicit instructions—and the company’s response—are striking. Taken together with a separate case—in which the parents of a teen who died by suicide filed a lawsuit against Character.AI, the maker of a chatbot they say played a key role in their son’s death—it’s clear we are just beginning to see whether an AI company is held legally responsible when its models output something unsafe. (MIT Technology Review)

I let OpenAI’s new “agent” manage my life. It spent $31 on a dozen eggs.

Operator, the new AI that can reach into the real world, wants to act like your personal assistant. This fun review shows what it’s good and bad at—and how it can go rogue. (The Washington Post)

Four Chinese AI startups to watch beyond DeepSeek

DeepSeek is far from the only game in town. These companies are all in a position to compete both within China and beyond. (MIT Technology Review)

Meta’s alleged torrenting and seeding of pirated books complicates copyright case

Newly unsealed emails allegedly provide the “most damning evidence” yet against Meta in a copyright case raised by authors alleging that it illegally trained its AI models on pirated books. In one particularly telling email, an engineer told a colleague, “Torrenting from a corporate laptop doesn’t feel right.” (Ars Technica)

What’s next for smart glasses
Smart glasses are on the verge of becoming—whisper it—cool. That’s because, thanks to various technological advancements, they’re becoming useful, and they’re only set to become more so. Here’s what’s coming in 2025 and beyond. (MIT Technology Review)

The Download: offshore rocket launches, and how DOGE plans to use AI

This is today’s edition of The Download, our weekday newsletter that provides a daily dose of what’s going on in the world of technology.

The dream of offshore rocket launches is finally blasting off

Want to send something to space? Get in line. The demand for rides off Earth is skyrocketing, with launches more than doubling over the past four years, from about 100 to 250 annually. That number is projected to spiral further up, fueled by an epic growth spurt in the commercial space sector.

To relieve the congestion, some mission planners are looking to the ocean as the next big gateway to space. But sea-based launches come with some unique regulatory, geopolitical, and environmental trade-offs. They also offer a glimpse of new technologies and industries, enabled by a potentially limitless launch capacity, that could profoundly reshape our lives. Read the full story. 

—Becky Ferreira

Can AI help DOGE slash government budgets? It’s complex.

No tech leader before has played the role in a new presidential administration that Elon Musk is playing now. Under his leadership, DOGE has entered offices in a half-dozen agencies and counting, accessed various payment systems, had its access to the Treasury halted by a federal judge, and sparked lawsuits questioning the legality of the group’s activities.  

The stated goal of DOGE’s actions is “slashing waste, fraud, and abuse.” So where is fraud happening, and could AI models fix it, as DOGE staffers hope? Read our story to find out.

—James O’Donnell

This story is from The Algorithm, our weekly newsletter giving you the inside track on all things AI. Sign up to receive it in your inbox every Monday.

The must-reads

I’ve combed the internet to find you today’s most fun/important/scary/fascinating stories about technology.

1 Elon Musk is leading an unsolicited bid to buy OpenAI for $97.4 billion
This is an escalation in his long-running feud with CEO Sam Altman, but it may well come to nothing. (WSJ $)
The timing is annoying for Altman, as he’s in the middle of complex restructuring negotiations. (FT $)
Still, he says he’s confident OpenAI’s board is going to reject Musk’s offer. (The Information $)

2 What we’re learning from the AI Action Summit in Paris
As tech companies ship AI products relentlessly, policymakers still haven’t got a clue how to respond. (NYT $)

3 A federal judge blocked NIH cuts to research grants
A hearing has been set for February 21. (STAT $)
Why the cuts would be so devastating, according to the scientists who’d be affected. (Scientific American $)

4 AI chatbots cannot accurately summarize news
A study of leading models found 51% of their answers to questions about the news had ‘significant issues’. (BBC)
The tendency to make things up is holding chatbots back. But that’s just what they do. (MIT Technology Review)

5 BYD is bringing advanced self-driving to its cars
Including even the cheapest models. (FT $)
Analysts expect this to solidify the company’s position as China’s top EV maker. (South China Morning Post)
Why the world’s biggest EV maker is getting into shipping. (MIT Technology Review)
+ Meanwhile in the US, the next big robotaxi push is underway. (Quartz)

6 Trump is imposing 25% tariffs on foreign steel
You may recall he did this during his last term, and ended up having to roll it back. (NYT $)

7 A Silicon Valley job isn’t as desirable as it used to be 
Multiple rounds of layoffs have really broken employees’ trust in their superiors. (WP $)

8 Google Maps now shows the ‘Gulf of America’
Unless you live in Mexico! (The Verge)

9 Can the human body endure a voyage to Mars? 🧑‍🚀
Space travel exacts an extremely high physical toll on even the fittest astronauts. (New Yorker $)
Space travel is dangerous. Could genetic testing and gene editing make it safer? (MIT Technology Review)

10 Thinking of re-playing the Sims? Maybe don’t.
25 years on, it feels a bit like a psyop to prepare millennials for the capitalist grind. (The Guardian)

Quote of the day

“No thank you but we will buy twitter for $9.74 billion if you want.”

—Sam Altman responds on X to news that Elon Musk is leading an unsolicited bid to buy OpenAI for $97.4 billion.

The big story

This sci-fi blockchain game could help create a metaverse that no one owns

screenshot from Dark Forest game

DARK FOREST VIA DFWIKI

November 2022

Dark Forest is a vast universe, and most of it is shrouded in darkness. Your mission, should you choose to accept it, is to venture into the unknown, avoid being destroyed by opposing players who may be lurking in the dark, and build an empire of the planets you discover and can make your own.

But while the video game seemingly looks and plays much like other online strategy games, it doesn’t rely on the servers running other popular online strategy games. And it may point to something even more profound: the possibility of a metaverse that isn’t owned by a big tech company. Read the full story.

—Mike Orcutt

We can still have nice things

A place for comfort, fun and distraction to brighten up your day. (Got any ideas? Drop me a line or skeet ’em at me.)

+ Tiles can be such a beautiful artform. Just ask the Portuguese!
+ Here are some quick ways to jumpstart your energy levels.
+ I’m obsessed with spicy smacked cucumbers. Turns out, they’re easy to make at home.
+ Aww… This little boy and his Dad managed to visit every city in England by train last year.

AI crawler wars threaten to make the web more closed for everyone

We often take the internet for granted. It’s an ocean of information at our fingertips—and it simply works. But this system relies on swarms of “crawlers”—bots that roam the web, visit millions of websites every day, and report what they see. This is how Google powers its search engine, how Amazon sets competitive prices, and how Kayak aggregates travel listings. Beyond the world of commerce, crawlers are essential for monitoring web security, enabling accessibility tools, and preserving historical archives. Academics, journalists, and civil society groups also rely on them to conduct crucial investigative research.

Crawlers are endemic. Now representing half of all internet traffic, they will soon outpace human traffic. This unseen subway of the web ferries information from site to site, day and night. And as of late, they serve one more purpose: Companies such as OpenAI use web-crawled data to train their artificial intelligence systems, like ChatGPT. 

Understandably, websites are now fighting back for fear that this invasive species—AI crawlers—will help displace them. But there’s a problem: This pushback is also threatening the transparency and open borders of the web that allow non-AI applications to flourish. Unless we are thoughtful about how we fix this, the web will increasingly be fortified with logins, paywalls, and access tolls that inhibit not just AI but the biodiversity of real users and useful crawlers.

A system in turmoil 

To grasp the problem, it’s important to understand how the web worked until recently, when crawlers and websites operated together in relative symbiosis. Crawlers were largely undisruptive and could even be beneficial, bringing people to websites from search engines like Google or Bing in exchange for their data. In turn, websites imposed few restrictions on crawlers, even helping them navigate their sites. Websites, then and now, use machine-readable files, called robots.txt files, to specify what content they want crawlers to leave alone. But there were few efforts to enforce these rules or identify crawlers that ignored them. The stakes seemed low, so sites didn’t invest in obstructing those crawlers.
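For readers unfamiliar with robots.txt, here is a brief sketch of how a well-behaved crawler consults one before fetching a page. The rules and bot names below are invented for illustration; Python’s standard library ships a parser for the format:

```python
# Minimal sketch: a polite crawler checks robots.txt before fetching a URL.
# The rules here are illustrative, not taken from any real site.
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: *
Disallow: /private/

User-agent: BadBot
Disallow: /
"""

def can_fetch(agent: str, path: str) -> bool:
    parser = RobotFileParser()
    # A real crawler would download this from https://example.com/robots.txt
    parser.parse(ROBOTS_TXT.splitlines())
    return parser.can_fetch(agent, path)

print(can_fetch("GoodBot", "/articles/1"))  # True: only /private/ is off-limits
print(can_fetch("GoodBot", "/private/x"))   # False
print(can_fetch("BadBot", "/articles/1"))   # False: blocked from the whole site
```

Note that nothing in this mechanism is enforcement: a crawler that never calls the check faces no technical barrier, which is why the stakes stayed low for so long.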

But now the popularity of AI has thrown the crawler ecosystem into disarray.

As with an invasive species, crawlers for AI have an insatiable and undiscerning appetite for data, hoovering up Wikipedia articles, academic papers, and posts on Reddit, review websites, and blogs. All forms of data are on the menu—text, tables, images, audio, and video. And the AI systems that result can (but not always will) be used in ways that compete directly with their sources of data. News sites fear AI chatbots will lure away their readers; artists and designers fear that AI image generators will seduce their clients; and coding forums fear that AI code generators will supplant their contributors. 

In response, websites are starting to turn crawlers away at the door. The motivator is largely the same: AI systems, and the crawlers that power them, may undercut the economic interests of anyone who publishes content to the web—by using the websites’ own data. This realization has ignited a series of crawler wars rippling beneath the surface.

The fightback

Web publishers have responded to AI with a trifecta of lawsuits, legislation, and computer science. What began with a litany of copyright infringement suits, including one from the New York Times, has turned into a wave of restrictions on use of websites’ data, as well as legislation such as the EU AI Act to protect copyright holders’ ability to opt out of AI training. 

However, legal and legislative verdicts could take years, while the consequences of AI adoption are immediate. So in the meantime, data creators have focused on tightening the data faucet at the source: web crawlers. Since mid-2023, websites have erected crawler restrictions covering over 25% of the highest-quality data. Yet many of these restrictions can simply be ignored, and while major AI developers like OpenAI and Anthropic do claim to respect websites’ restrictions, they’ve been accused of ignoring them or aggressively overwhelming websites (the major technical support forum iFixit is among those making such allegations).

Now websites are turning to their last alternative: anti-crawling technologies. A plethora of new startups (TollBit, ScalePost, etc.), and web infrastructure companies like Cloudflare (estimated to support 20% of global web traffic), have begun to offer tools to detect, block, and charge nonhuman traffic. These tools erect obstacles that make sites harder to navigate or require crawlers to register.

These measures offer immediate protection. After all, AI companies can’t use what they can’t obtain, regardless of how courts rule on copyright and fair use. But the effect is that large web publishers, forums, and sites are often raising the drawbridge to all crawlers—even those that pose no threat. This remains the case even after they ink lucrative deals with AI companies that want to preserve exclusivity over that data. Ultimately, the web is being subdivided into territories where fewer crawlers are welcome.

How we stand to lose out

As this cat-and-mouse game accelerates, big players tend to outlast little ones.  Large websites and publishers will defend their content in court or negotiate contracts. And massive tech companies can afford to license large data sets or create powerful crawlers to circumvent restrictions. But small creators, such as visual artists, YouTube educators, or bloggers, may feel they have only two options: hide their content behind logins and paywalls, or take it offline entirely. For real users, this is making it harder to access news articles, see content from their favorite creators, and navigate the web without hitting logins, subscription demands, and captchas each step of the way.

Perhaps more concerning is the way large, exclusive contracts with AI companies are subdividing the web. Each deal raises the website’s incentive to remain exclusive and block anyone else from accessing the data—competitor or not. This will likely lead to further concentration of power in the hands of fewer AI developers and data publishers. A future where only large companies can license or crawl critical web data would suppress competition and fail to serve real users or many of the copyright holders.

Put simply, following this path will shrink the biodiversity of the web. Crawlers from academic researchers, journalists, and non-AI applications may increasingly be denied open access. Unless we can nurture an ecosystem with different rules for different data uses, we may end up with strict borders across the web, exacting a price on openness and transparency. 

While this path is not easily avoided, defenders of the open internet can insist on laws, policies, and technical infrastructure that explicitly protect noncompeting uses of web data from exclusive contracts while still protecting data creators and publishers. These rights are not at odds. We have so much to lose or gain from the fight to get data access right across the internet. As websites look for ways to adapt, we mustn’t sacrifice the open web on the altar of commercial AI.

Shayne Longpre is a PhD Candidate at MIT, where his research focuses on the intersection of AI and policy. He leads the Data Provenance Initiative.

AI Is Changing Buying Behavior, Study Finds

Artificial intelligence is reshaping global shopping experiences, according to Capgemini Research Institute’s annual trends report.

“What matters to today’s consumer 2025,” published Jan. 9, recaps the firm’s survey in October and November 2024 of 12,000 consumers in Australia, Canada, France, Germany, India, Italy, Japan, the Netherlands, Spain, Sweden, the United Kingdom, and the United States.

The 100-page report focuses on how consumers discover products, how they shop, and why they switch brands.

Product Discovery

ChatGPT and other generative AI platforms have largely replaced traditional search engines for product recommendations, according to the survey. Nearly two-thirds of Gen Zs (ages 18-25), Millennials (26-41), and Gen Xs (42-57) prefer genAI for that purpose. Only Boomers (58 and over) still favor Google and other search engines for recommendations.

Moreover, genAI is transforming seemingly every touchpoint of the shopping journey. Consumers now ask genAI to curate images and aggregate product searches from multiple platforms. Some have even found virtual assistants more adept at making fashion, home décor, and travel recommendations than sales associates.

We addressed in December the power of influencers and social media for gift recommendations. An Adobe survey found that 20% of all U.S. Cyber Monday sales came from influencer endorsements. The Capgemini survey confirms those findings and more, reporting:

  • 32% of consumers purchased products through social media.
  • 68% of Gen Zs have discovered a product or brand through social.

Shopping

Consumers are responding to retail media, according to the survey.

  • 67% of respondents notice ads on retailer sites.
  • 35% found the ads helpful.
  • 22% discovered products from those ads.

Despite the recent cost-of-living improvements, consumers still seek in-store and online discounts.

  • 64% visit multiple physical stores seeking deals.
  • 65% buy private-label or low-cost brands.

Consumers also value “quick commerce,” the hyper-fast delivery of online goods. Approximately two-thirds of respondents stated a 2-hour or a 10-minute delivery was important to their purchase decisions. Forty-two percent valued the order-online pick-up in-store option.

“Demand for quick commerce is on the rise, with consumers from some geographies increasingly willing to pay for speed and efficiency,” researchers wrote, adding that merchants continue to invest in AI and logistics to improve infrastructures.

Switching Brands

Brand loyalty is increasingly rare among consumers, according to the survey. Researchers advised brands to (i) augment genAI tools to become more consumer-centric, (ii) use technology to lower prices, and (iii) leverage social and retail media networks.