WordPress Scraper Plugin Compromised By Security Vulnerability via @sejournal, @martinibuster

A WordPress plugin that automatically posts content scraped from other websites has been discovered to contain a critical vulnerability that allows anyone to upload malicious files to affected websites. The severity of the vulnerability is rated 9.8 out of 10.

Crawlomatic Multisite Scraper Post Generator Plugin for WordPress

The Crawlomatic WordPress plugin is sold via the Envato CodeCanyon store for $59 per license. It enables users to crawl forums, weather statistics, articles from RSS feeds, and directly scrape the content from other websites and then automatically publish the content on the user’s website.

The plugin’s Envato CodeCanyon web page features a banner that notes that the author of the plugin has been recognized for having met “WordPress quality standards” and displays a badge indicating that it is “Envato WP Requirements Compliant,” an indication that it meets Envato’s “security, quality, performance and coding standards in WordPress plugins and themes.”

The plugin’s directory page explains that it can crawl and scrape virtually any website, including JavaScript-based sites, promising that it can turn a user’s website into a “money making machine.”

Unauthenticated Arbitrary File Upload

The Crawlomatic WordPress plugin is missing a file type validation check in all versions up to and including version 2.6.8.1.

According to a warning posted on Wordfence:

“The Crawlomatic Multipage Scraper Post Generator plugin for WordPress is vulnerable to arbitrary file uploads due to missing file type validation in the crawlomatic_generate_featured_image() function in all versions up to, and including, 2.6.8.1. This makes it possible for unauthenticated attackers to upload arbitrary files on the affected site’s server which may make remote code execution possible.”

Wordfence recommends that users of the plugin update to version 2.6.8.2 or later.
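The root cause named in the advisory, missing file type validation on an upload path, is a well-known flaw class. As a rough illustration only (the actual fix lives in the plugin’s PHP, which is not shown here), a Python sketch of allowlist-based filename validation might look like this; the function name and extension list are hypothetical:

```python
import os

# Hypothetical illustration of the missing control: an allowlist check
# applied before saving an uploaded "featured image". This is a sketch
# of the principle, not the plugin's actual code.
ALLOWED_EXTENSIONS = {".jpg", ".jpeg", ".png", ".gif", ".webp"}

def is_safe_image_filename(filename: str) -> bool:
    """Reject anything that is not a plain image file name."""
    # Strip any directory components an attacker may have smuggled in.
    basename = os.path.basename(filename)
    root, ext = os.path.splitext(basename)
    if ext.lower() not in ALLOWED_EXTENSIONS:
        return False
    # Conservatively reject double extensions like "shell.php.jpg".
    inner_ext = os.path.splitext(root)[1].lower()
    if inner_ext and inner_ext not in ALLOWED_EXTENSIONS:
        return False
    return True

print(is_safe_image_filename("photo.png"))      # benign upload accepted
print(is_safe_image_filename("shell.php"))      # executable payload blocked
print(is_safe_image_filename("shell.php.jpg"))  # double extension blocked
```

Without a check of this kind, an unauthenticated attacker can upload a script file that the server will later execute, which is how arbitrary file upload escalates to remote code execution.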

Read more at Wordfence:

Crawlomatic Multipage Scraper Post Generator <= 2.6.8.1 – Unauthenticated Arbitrary File Upload

Featured Image by Shutterstock/nakaridore

The Intersection Of Video SEO And Social Media: Tactics To Win via @sejournal, @rio_seo

Video content is now a driving force behind consumers’ attention in today’s digital era.

Gone are the days of simple blog posts and stock images.

Social media has transformed what entices clicks, follows, and likes, displaying a consistent stream of highly engaging visual content for users to sift through.

Visual content, particularly in the form of video, is a winning content format.

In fact, during the third quarter of 2024, a whopping 92% of internet users worldwide watched online videos on a monthly basis. The opportunity is massive, and now is the time for social media marketers to embrace the evolution of effective content marketing.

Videos, however, don’t just show up in social media algorithms and search engine results pages (SERPs) because we create them.

Success is contingent on numerous factors, perhaps the most notable of which is SEO.

Success in today’s digital landscape means knowing how to optimize your videos to appear more favorably in both search and social media. Video SEO and social media are now more closely tied than ever.

In this post, we will uncover the tried and tested tactics necessary to capture attention, prompt clicks, encourage engagement, and ensure your video shows up when people search for your products or services.

Why Video SEO Matters More Than Ever

More than half (56.4%) of all social media ad budgets in the U.S. went to video in 2024. That share is projected to hit 60% by 2026.

If your strategy doesn’t prioritize video, you’re missing where the money (and the momentum) is going.

Most people are inherently visual learners and, therefore, prefer to consume content in the form of a video rather than reading lengthy chunks of text.

Google, TikTok, Instagram, and YouTube are all strong advocates for this form of content creation.

Google, which acquired YouTube nearly two decades ago, often displays YouTube videos at the top of the SERPs, especially if that video can satisfy search intent quickly.

TikTok has skyrocketed in popularity over recent years, praised for its visually appealing, short-form video format, allowing users to connect with people and businesses in more meaningful ways.

Meanwhile, YouTube continues to dominate the social media market, with many younger audiences turning to social media to answer their questions rather than traditional search engines.

YouTube remains the second most popular social network worldwide in terms of monthly active users, right behind Facebook.

Simply put, brands that aren’t creating videos will fall behind as video continues to remain front and center across the web.

From a technical SEO perspective, video content:

  • Typically sees viewers sticking around longer and earns attention better than written content.
  • Can increase engagement and help surface the video higher in results with optimized thumbnails and metadata.
  • Can be shared or embedded on websites when users deem the video valuable, which helps increase a brand’s domain authority.
  • Can encourage users to take action by clicking through to a landing page or consuming more video content from the brand.
  • Can improve brand awareness as viewers discover a product or service for the first time.

Bottom line? Video can be a high-impact asset across the funnel if you follow SEO best practices to become more discoverable.

How Social Media Influences Brand Discoverability

Social platforms have become key drivers of content discovery, supported by a recent study that found social media is the preferred product discovery channel for Gen Z and Millennials, with Gen X and Boomers quickly jumping aboard the growing trend.

The same study found 33% of respondents reported having discovered a product on social media in the past three months.

Whether it’s TikTok serving as a search engine for Gen Z or Instagram Reels driving top-of-funnel awareness, social media isn’t just a distribution channel. It now plays an integral role in SEO strategy and helps brands get discovered by highly motivated audiences.

Social engagement and content can both act as indirect ranking signals that can amplify your video’s reach:

  • Shares and saves can improve organic search visibility.
  • Comments can highlight to search algorithms that consumers find your content engaging and relevant.
  • Higher watch times can prompt search engines to see your video as authoritative and credible on the topic.

When your video content performs well socially, it can prompt a ripple effect across the entire digital ecosystem, including coveted positioning in the search results.

Video SEO Strategies That Also Boost Social Performance

Video optimization is crucial to discoverability.

Many foundational SEO tactics remain relevant when it comes to video optimization.

Each platform will require slight refinements. However, overall SEO efforts will look largely similar across each platform.

Here are a few areas that deserve your time and attention.

Conduct Keyword Research

Keyword optimization is still a must to be found in social media and search platforms.

However, the keywords you wish to target may vary by platform. Conduct keyword research for each platform where you plan to distribute your video.

To find the keywords that are getting the most traction, type your desired keyword into the platform’s search bar to surface trending topics and phrases related to it.

For example, TikTok’s Creator Search Insights tool shows what users are searching for most on the platform, along with suggested keywords and even scripts to help you create high-performing promotional content for your business.

Additionally, keyword research tools like TubeBuddy, VidIQ, and Google Trends can help you identify high-volume phrases that resonate with your intended audience.

Don’t stuff your title with keywords and expect it to perform well. Instead, make it compelling and searchable. Example: “Want Clear Skin in 7 Days? This Acne Fix Changed Everything”

Optimize Thumbnails

Your video thumbnail is often the first thing customers see, on both YouTube and social feeds.

It should match the context of the video, but should also be engaging enough to get people to stop scrolling and click on your content instead.

Close-up images tend to perform better than overly crowded ones, which can look abstract at a glance.

Additionally, brand your visuals for consistency so that potential customers start to recognize your video thumbnails whenever they come across them.

Use bold text, expressive faces, and high contrast colors in your video thumbnail to grab attention and communicate the video’s value at a glance.

Leverage Transcripts And Captions

Closed captions aren’t just for making your content more accessible. They can actually help search engines understand your video.

Captions and transcripts provide text-based content for crawlers, which can help boost your video’s search visibility.

YouTube indexes this text, increasing the likelihood of keyword terms and phrases spoken in a video appearing in relevant searches.

Transcripts can be turned into content such as social and blog posts, amplifying your message further.
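To make the “text layer” captions provide concrete, here is a minimal sketch of a WebVTT caption file assembled in Python; the cue text and timestamps are invented placeholders, and real workflows would typically export this from a captioning tool:

```python
# Hypothetical sketch: assembling a minimal WebVTT caption file from
# (start, end, text) cues. Timestamps follow the WebVTT HH:MM:SS.mmm form.
cues = [
    ("00:00:00.000", "00:00:03.500", "Welcome back! Today: video SEO basics."),
    ("00:00:03.500", "00:00:07.000", "First, research keywords for each platform."),
]

lines = ["WEBVTT", ""]  # a WebVTT file must begin with this header
for start, end, text in cues:
    lines.append(f"{start} --> {end}")
    lines.append(text)
    lines.append("")  # blank line terminates each cue

vtt = "\n".join(lines)
print(vtt)
```

The resulting text file is exactly the kind of machine-readable content that platforms like YouTube can index alongside the video itself.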

Implement Structured Data And Schema Markup

If you plan to host videos on your own website, add VideoObject schema markup to help Google better understand your content, such as your video title, description, duration, upload date, and more.

Schema markup can also help improve the chances of your videos appearing in rich results like featured snippets or video carousels, increasing the likelihood of being found in search and customers clicking through to learn more.
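For illustration, a minimal VideoObject snippet might look like the following sketch, built here as a Python dict and serialized to the JSON-LD that would be embedded in a `<script type="application/ld+json">` tag. All values are placeholders; Google’s structured-data documentation lists the full set of supported properties.

```python
import json

# Minimal VideoObject JSON-LD sketch. Every value below is a placeholder.
video_schema = {
    "@context": "https://schema.org",
    "@type": "VideoObject",
    "name": "How to Install a Ceiling Fan",            # video title
    "description": "Step-by-step ceiling fan install for beginners.",
    "thumbnailUrl": ["https://example.com/thumb.jpg"],
    "uploadDate": "2025-05-15",
    "duration": "PT4M30S",                              # ISO 8601: 4 min 30 s
    "contentUrl": "https://example.com/video.mp4",
}

# Serialize for embedding in the page's <head> or <body>.
json_ld = json.dumps(video_schema, indent=2)
print(json_ld)
```

Validating the output with Google’s Rich Results Test is a sensible final step before shipping markup like this.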

Create Custom URLs And Metadata

Your videos should be treated like any other piece of high-value content you produce.

Use a clear and relevant URL structure to improve SEO clarity. Be sure to include keywords in your URLs as well.

In addition to your URLs, you should optimize your video metadata, such as your title tags, descriptions, and thumbnail alt text.

This helps search engines index your videos appropriately and ensures consistency when videos are embedded, shared on social media, or linked across channels.
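As one small example of “SEO clarity” in URLs, a keyword-bearing slug can be derived directly from a video title. The helper below is a hypothetical sketch, not a standard library function:

```python
import re

# Hypothetical helper: turn a video title into a short, keyword-bearing
# URL slug (lowercase, hyphen-separated, punctuation stripped).
def slugify(title: str) -> str:
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # runs of non-alphanumerics -> one hyphen
    return slug.strip("-")

print(slugify("Want Clear Skin in 7 Days? This Acne Fix Changed Everything"))
# e.g. used as: https://example.com/videos/<slug>
```

A slug like this keeps the target keywords visible in the URL while staying readable for both crawlers and people.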

Craft Engaging Intros

Your introduction is your first opportunity to capture a viewer’s attention. It sets the tone for what’s to come and often determines whether a viewer drops off or continues to consume your content.

Keep the first few seconds of your video punchy and unique. You only have a few seconds to make an impression, so make them count. It’s worth getting to the meat of your content quickly.

If you’re using TikTok or Instagram, use trending audio to capture attention and potentially appear to more consumers. Use relevant hashtags to surface keywords and terms related to your video content.

Create Playlists

YouTube gives brands the option to create playlists to help increase engagement and dwell time.

Playlists should include videos that relate to each other to ensure an optimal user experience.

For example, a home improvement company may want to create a playlist featuring “how-to” videos to assist customers who want to tackle home projects on their own.

For YouTube videos, use relevant tags and strong descriptions that include your focus keyword or keywords to boost your SEO efforts. End screens are also an optimal opportunity to give users a next step.

What action do you want the viewer to take next? Include a strong call-to-action (CTA) to encourage viewers to click through to a product page, subscribe to your newsletter, follow your YouTube channel, and more.

Include A Call To Action

For most businesses, a video is not only an opportunity to provide informational content to an audience, but it’s also a tool to encourage viewers to perform a desired action.

An end screen should not be the only place you include your CTA, as some viewers may not watch until the end of your video.

Including your CTA early in your video is recommended to capitalize on viewer attention.

Another area to include your CTA is in your caption (Instagram, Facebook, X, and TikTok) or your description (YouTube).

By putting your CTA in more places, your customers will hear your message loud and clear, and may be more likely to convert.

Common Mistakes To Avoid

Even seasoned marketers can miss the mark with video if they’re not careful.

Keep the following in mind when optimizing your social media for SEO to ensure you adhere to best practices and avoid common mistakes.

Over-Optimizing

Don’t over-optimize your content for search and lose human appeal.

This includes cramming as many keywords as possible into your video until the dialogue sounds unnatural and unhelpful. Over-optimizing may also look like adding too many irrelevant tags.

Ignoring Platform-Specific Specs

Each social media platform has specific video guidelines to adhere to, such as dimensions, length, and caption formats.

Be sure to familiarize yourself with each to provide a positive user experience irrespective of the platform.

Failing To Engage With Your Audience

If your business has time to create a video, it must also make the time to respond to comments.

Responses can act as engagement signals that support improved search visibility, helping your content get surfaced.

Responses also encourage interaction and show that your business values its customers.

Not Testing

You might not get it right the first time, which is where A/B testing becomes useful.

A/B testing allows your business to see what’s resonating with users and what isn’t. Test your video titles, thumbnails, copy, and CTAs to see what works best.

Only Focusing On One Platform

Social media users are active across myriad platforms. You want your business to be found everywhere your customers are looking.

Don’t focus all your attention on one place. Instead, spread the love and create a solid social media presence across the platforms where your target demographic spends their time.

Not Posting Consistently

Make your presence known on social media. Just as you regularly produce written content, your brand should also produce a steady stream of video content.

Customers prefer diverse content formats. Video increases engagement and also helps break up your feed, helping you to create a more visually appealing profile for your business.

Not Linking To Your Social Media Profiles On GBP

Your Google Business Profile offers myriad opportunities to advertise your business and your products or services to searchers.

One of these features is the ability to add your business’s social media profile links to your GBP.

Take advantage of this free opportunity to help drive customers to your social media channels to learn more about your brand and engage with your community.

Why Alignment Wins

Video SEO and social media go hand-in-hand.

Proper video SEO helps your brand appear more favorably in the SERPs, improves your chances of appearing for featured snippets, and ultimately drives more eyes to your landing pages.

As social media and SEO continue to intertwine, optimizing video for each and every social media channel is a non-negotiable.

To stay ahead, social media marketers should:

  • Follow best practices for each platform.
  • Prioritize creating and distributing video content.
  • Let data guide your video strategy.
  • Test different formats and messaging to see what works best.
  • Learn where your target audience spends their time.
  • Respond to feedback and comments.
  • Optimize every single video.
  • Delve into the data to uncover key findings.

Marketers who optimize for both discovery and engagement, while also adhering to the unique dynamics of each platform, will be the ones who win attention, build authority, and drive more potential revenue for the business.

Start by creating one video. Choose a high-performing piece of content, turn it into a 60-second video, optimize it for search and social, and track how it performs across platforms.

Chances are, the return will be well worth the work.

Featured Image: Xavier Lorenzo/Shutterstock

How US research cuts are threatening crucial climate data

Over the last few months, and especially the last few weeks, there’s been an explosion of news about proposed budget cuts to science in the US. One trend I’ve noticed: Researchers and civil servants are sounding the alarm that those cuts mean we might lose key data that helps us understand our world and how climate change is affecting it.

My colleague James Temple has a new story out today about researchers who are attempting to measure the temperature of mountain snowpack across the western US. Snow that melts in the spring is a major water source across the region, and monitoring the temperature far below the top layer of snow could help scientists more accurately predict how fast water will flow down the mountains, allowing farmers, businesses, and residents to plan accordingly.

But long-running government programs that monitor the snowpack across the West are among those being threatened by cuts across the US federal government. Also potentially in trouble: carbon dioxide measurements in Hawaii, hurricane forecasting tools, and a database that tracks the economic impact of natural disasters. It’s all got me thinking: What do we lose when data is in danger?

Take for example the work at Mauna Loa Observatory, which sits on the northern side of the world’s largest active volcano. In this Hawaii facility, researchers have been measuring the concentration of carbon dioxide in the atmosphere since 1958.

The resulting graph, called the Keeling Curve (after Charles David Keeling, the scientist who kicked off the effort) is a pillar of climate research. It shows that carbon dioxide, the main greenhouse gas warming the planet, has increased in the atmosphere from around 313 parts per million in 1958 to over 420 parts per million today.

Proposed cuts to the National Oceanic and Atmospheric Administration (NOAA) jeopardize the Keeling Curve’s future. As Ralph Keeling (current steward of the curve and Keeling’s son) put it in a new piece for Wired, “If successful, this loss will be a nightmare scenario for climate science, not just in the United States, but the world.”

This story has echoes across the climate world right now. A lab at Princeton that produces what some consider the top-of-the-line climate models used to make hurricane forecasts could be in trouble because of NOAA budget cuts. And last week, NOAA announced it would no longer track the economic impact of the biggest natural disasters in the US.

Some of the largest-scale climate efforts will feel the effects of these cuts, and as James’s new story shows, they could also seep into all sorts of specialized fields. Even seemingly niche work can have a huge impact not just on research, but on people.

The frozen reservoir of the Sierra snowpack provides about a third of California’s groundwater, as well as the majority used by towns and cities in northwest Nevada. Researchers there are hoping to help officials better forecast the timing of potential water supplies across the region.

This story brought to mind my visit to El Paso, Texas, a few years ago. I spoke with farmers there who rely on water coming down the Rio Grande, alongside dwindling groundwater, to support their crops. There, water comes down from the mountains in Colorado and New Mexico in the spring and is held in the Elephant Butte Reservoir. One farmer I met showed me pages and pages of notes of reservoir records, which he had meticulously copied by hand. Those crinkled pages were a clear sign: Publicly available data was crucial to his work.

The endeavor of scientific research, particularly when it involves patiently gathering data, isn’t always exciting. Its importance is often overlooked. But as cuts continue, we’re keeping a lookout, because losing data could harm our ability to track, address, and adapt to our changing climate. 

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

The Download: Montana’s experimental treatments, and Google DeepMind’s new AI agent

This is today’s edition of The Download, our weekday newsletter that provides a daily dose of what’s going on in the world of technology.

The first US hub for experimental medical treatments is coming

The news: A bill that allows clinics to sell unproven treatments has been passed in Montana. Under the legislation, doctors can apply for a license to open an experimental treatment clinic and recommend and sell therapies not approved by the Food and Drug Administration (FDA) to their patients.

Why it matters: Once it’s signed by the governor, the law will be the most expansive in the country in allowing access to drugs that have not been fully tested. The bill allows for any drug produced in the state to be sold in it, providing it has been through phase I clinical trials—but these trials do not determine if the drug is effective.

The big picture: The bill was drafted and lobbied for by people interested in extending human lifespans. And these longevity enthusiasts are hoping Montana will serve as a test bed for opening up access to experimental drugs. Read the full story.

—Jessica Hamzelou

Google DeepMind’s new AI agent cracks real-world problems better than humans can

Google DeepMind has once again used large language models to discover new solutions to long-standing problems in math and computer science. This time the firm has shown that its approach can not only tackle unsolved theoretical puzzles, but improve a range of important real-world processes as well.

The new tool, called AlphaEvolve, uses large language models (LLMs) to produce code for a wide range of different tasks. LLMs are known to be hit and miss at coding. The twist here is that AlphaEvolve scores each of Gemini’s suggestions, throwing out the bad and tweaking the good, in an iterative process, until it has produced the best algorithm it can. In many cases, the results are more efficient or more accurate than the best existing (human-written) solutions. Read the full story.

—Will Douglas Heaven

Research cuts are threatening crucial climate data

—Casey Crownhart

Over the last few weeks, there’s been an explosion of news about proposed budget cuts to science in the US. Researchers and civil servants are sounding the alarm that those cuts mean we might lose key data that helps us understand our world and how climate change is affecting it.

Long-running US government programs that monitor the snowpack across the West are among those being threatened by cuts across the US federal government, as my colleague James Temple’s new story explores. Also potentially in trouble: carbon dioxide measurements in Hawaii, hurricane forecasting tools, and a database that tracks the economic impact of natural disasters. 

It’s all got me thinking: What do we lose when data is in danger? Read the full story.

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

The must-reads

I’ve combed the internet to find you today’s most fun/important/scary/fascinating stories about technology.

1 Donald Trump doesn’t want Apple building iPhones in India
The US President claims Apple will be upping their US production as a result. (Bloomberg $)
+ He also said that India was willing to “literally charge us no tariffs.” (WSJ $)

2 Elon Musk’s Grok chatbot ranted about white genocide 
In response to completely unrelated queries. (FT $)
+ It’s not the first time Grok has shared questionable responses. (Bloomberg $)
+ Grok told users it was instructed to accept white genocide as real. (The Guardian)

3 RFK Jr doesn’t think we should take his medical advice
Which begs the question: why is he US Health and Human Services secretary? (NY Mag $)
+ Kennedy said his opinions on vaccines are irrelevant. (NYT $)
+ He defended his decision to downsize the health department amid protests. (The Guardian)

4 GM’s new EV battery can power a truck for more than 400 miles 
Its lithium manganese-rich cells use cheaper minerals than lithium-ion ones. (Fast Company $)
+ Tariffs are bad news for batteries. (MIT Technology Review)

5 Anthropic has been accused of using AI-generated evidence in a legal case
A lawyer for Universal Music Group claimed an expert cited a source that didn’t exist. (Reuters)
+ A judge in another case reportedly caught fake AI citations, too. (Ars Technica)
+ AI companies are finally being forced to cough up for training data. (MIT Technology Review)

6 AI won’t put human radiologists out of a job any time soon
The technology is helpful, but is unable to do everything trained human experts can. (NYT $)
+ Why it’s so hard to use AI to diagnose cancer. (MIT Technology Review)

7 The US Defense Department wants faster aircraft and missiles
And startups are more than willing to answer the call. (WP $)
+ Phase two of military AI has arrived. (MIT Technology Review)

8 SpaceX has successfully tested its Starship rocket 🚀
Clearing a major hurdle ahead of its planned launch later this month. (Wired $)

9 YouTube will start inserting ads into videos’ crucial moments
Wow, that doesn’t sound annoying at all. (TechCrunch)

10 Apple’s Vision Pro headset is a pain in the neck
And early adopters are regretting shelling out $3,500 apiece. (WSJ $)
+ Maybe the ability to scroll using their eyes will change their minds. (Bloomberg $)

Quote of the day

“To say a professor is ‘some kind of monster’ for using AI to generate slides is, to me, ridiculous.”

—Paul Shovlin, a professor at Ohio University, reacts to student backlash against professors using AI to create teaching materials, the New York Times reports.

One more thing

Who gets to decide who receives experimental medical treatments?

There has been a trend toward lowering the bar for new medicines, and it is becoming easier for people to access treatments that might not help them—and could even harm them. Anecdotes appear to be overpowering evidence in decisions on drug approval. As a result, we’re ending up with some drugs that don’t work.

We urgently need to question how these decisions are made. Who should have access to experimental therapies? And who should get to decide? Such questions are especially pressing considering how quickly biotechnology is advancing. We’re not just improving on existing classes of treatments—we’re creating entirely new ones. Read the full story.

—Jessica Hamzelou

We can still have nice things

A place for comfort, fun and distraction to brighten up your day. (Got any ideas? Drop me a line or skeet ’em at me.)

+ Food nostalgia is the best nostalgia, and this Bluesky account of discontinued foods doesn’t disappoint.
+ Don’t even think of calling your newborn baby King if you live in New Zealand.
+ Actor Jeremy Strong just loves a bucket hat.
+ Watch out Swiss drivers—a duck has been caught speeding 🦆

This baby boy was treated with the first personalized gene-editing drug

Doctors say they constructed a bespoke gene-editing treatment in less than seven months and used it to treat a baby with a deadly metabolic condition.

The rapid-fire attempt to rewrite the child’s DNA marks the first time gene editing has been tailored to treat a single individual, according to a report published in the New England Journal of Medicine.

The baby who was treated, Kyle “KJ” Muldoon Jr., suffers from a rare metabolic condition caused by a particularly unusual gene misspelling.

Researchers say their attempt to correct the error demonstrates the high level of precision new types of gene editors offer. 

“I don’t think I’m exaggerating when I say that this is the future of medicine,” says Kiran Musunuru, an expert in gene editing at the University of Pennsylvania whose team designed the drug. “My hope is that someday no rare disease patients will die prematurely from misspellings in their genes, because we’ll be able to correct them.”

The project also highlights what some experts are calling a growing crisis in gene-editing technology. That’s because even though the technology could cure thousands of genetic conditions, most are so rare that companies could never recoup the costs of developing a treatment for them. 

In KJ’s case, the treatment was programmed to correct a single letter of DNA in his cells.

“In reality, this drug will probably never be used again,” says Rebecca Ahrens-Nicklas, a physician at the Children’s Hospital of Philadelphia who treats metabolic diseases in children and who led the overall effort to treat the child.

That effort involved more than 45 scientists and doctors as well as pro bono assistance from several biotechnology companies. Musunuru says he cannot estimate how much it cost in time and effort.

Eventually, he says, the cost of custom gene-editing treatments might be similar to that of liver transplants, which is around $800,000, not including lifelong medical care and drugs.

The researchers used a new version of CRISPR technology, called base editing, that can replace a single letter of DNA at a specific location. 

Previous versions of CRISPR have generally been used to delete genes, not rewrite them to restore their function.

The researchers say they were looking for a patient to treat when they learned about KJ. After he was born in August, a doctor noted that the infant was lethargic. Tests found he had a metabolic disorder that leads to the buildup of ammonia, a condition that’s frequently fatal without a liver transplant.

In KJ’s case, gene sequencing showed that the cause was a misspelled letter in the gene CPS1 that stopped it from making a vital enzyme.

The researchers approached KJ’s parents, Nicole and Kyle Muldoon, with the idea of using gene editing to try to correct their baby’s DNA. After they agreed, a race ensued to design the editing drug, test it in animals, and get permission from the US Food and Drug Administration to treat KJ in a one-off experiment.

The team says the boy, who hasn’t turned one yet, received three doses of the gene-editing treatment, of gradually increasing strength. They can’t yet determine exactly how well the gene editor worked because they don’t want to take a liver biopsy, which would be needed to check if KJ’s genes have really been corrected.

But Ahrens-Nicklas says that because the child is “growing and thriving,” she thinks the editing has been at least partly successful and that he may now have “a milder form of this horrific disease.”

“He’s received three doses of the therapy without any complications, and is showing some early signs of benefit,” she says. “It’s really important to say that it’s still very early, so we will need to continue to watch KJ closely to fully understand the full effects of this therapy.”

The case suggests a future in which parents will take sick children to a clinic where their DNA will be sequenced, and then they will rapidly receive individualized treatments. Currently, this would only work for liver diseases, for which it’s easier to deliver gene-editing instructions, but eventually it might also become a possible approach for treating brain diseases and conditions like muscular dystrophy.

The experiment is drawing attention to a gap between what gene editing can do and what treatments are likely to become available to people who need them.

So far, biotechnology companies testing gene editing are working only on fairly common gene conditions, like sickle cell disease, leaving hundreds of ultra-rare conditions aside. One-off treatments, like the one helping KJ, are too expensive to create and get approved without some way to recoup the costs.

The apparent success in treating KJ, however, is making it even more urgent to figure out a way forward. Researchers acknowledge that they don’t yet know how to scale up personalized treatment, although Musunuru says initial steps to standardize the process are underway at his university and in Europe.

New Ecommerce Tools: May 15, 2025

We publish a rundown each week of new products from companies offering services to ecommerce merchants. This installment includes updates on AI-powered site-search, shopping assistants, product listings, merchant financing, point-of-sale solutions, cross-border transactions, and warehouse automation.

Got an ecommerce product release? Email releases@practicalecommerce.com.

New Tools for Merchants

Miva expands ecommerce platform with AI search infrastructure and no-code tools. Miva, an ecommerce software and service provider, has launched Miva 10.12, introducing tools to help ecommerce teams move faster and scale. The update adds platform infrastructure for Vexture, Miva’s AI-powered search engine that uses vector-based technology to interpret shopper intent. Merchants can now create and assign product page designs — a new design control panel helps teams manage fonts, buttons, and page spacing. Also, background tasks are now easier to monitor, prioritize, and troubleshoot.

Home page of Miva

eBay launches warranties for open-box goods. eBay is offering a warranty program on open-box products in the Electronics and Home categories. Listings in those categories will automatically display the Certified Open Box badge when they meet key service standards and offer free shipping and returns. Each Certified Open Box item comes with a one-year warranty serviced by Allstate Protection Plans.

New genAI tool helps Amazon sellers improve product listings. Amazon has launched Enhance My Listing, a new genAI tool that helps sellers automatically maintain and optimize their product listings. The tool generates timely, relevant recommendations for product titles, attributes, descriptions, and missing details. Sellers can review, customize, or decline these suggestions, ensuring that listings reflect their expertise while benefiting from Amazon’s detailed knowledge.

Storfund launches merchant financing on marketplaces via Mirakl Connect. Storfund, an ecommerce cash-flow provider, has announced a new partnership enabling businesses to use its cash-flow solution on the 400 marketplaces powered by Mirakl, an ecommerce software provider. Designed to eliminate the standard delay of 25 days between sale and payout, Storfund’s new Daily Advance pays sellers as soon as they ship their goods. Sellers can apply for cash flow on hundreds of marketplaces at the same time.

Home page of Storfund

WooCommerce names Square as preferred POS partner. WooCommerce has named Square as its preferred point-of-sale provider for in-person payments. The strategic partnership enables seamless synchronization of inventory, orders, and customer data across digital and physical retail. Square’s POS integration is available to WooCommerce merchants in eight countries: the U.S., Canada, Australia, Japan, the U.K., France, Spain, and Ireland.

Ebanx integrates UPI recurring payments for cross-border ecommerce in India. Ebanx, a global provider of payment services for emerging markets, has integrated UPI Autopay, the recurring payments feature of India’s instant payments system, Unified Payments Interface, into its cross-border payments platform. With the integration, Ebanx’s global ecommerce merchants in industries such as SaaS, streaming, and other subscription-based businesses can now offer recurring payments to their India-based customers.

Bluecore brings AI shopping assistant Alby to Shopify merchants. Bluecore, a marketing technology company, has expanded Alby, its genAI customer-service agent. Merchants can access Alby through Shopify to power conversational AI shopping unique to their brand and customers. Once activated, the AI agent will learn, make decisions, and act on a brand’s behalf, pulling from product data, brand assets, and the company’s tone of voice to ensure accurate responses.

Home page of Alby

Fluent and Rebuy partner on post-purchase advertising for Shopify merchants. Fluent, a provider of commerce media solutions, and Rebuy, an ecommerce personalization platform for Shopify brands, have announced a strategic partnership to launch Rebuy Ads powered by Fluent, designed to help merchants further engage their customers while unlocking additional revenue at no cost. The partnership pairs Fluent’s AI-powered advertiser marketplace and demand-generation expertise with Rebuy’s Shopify integration and partner network.

Shopify updates its Sidekick AI tool. Shopify has updated its AI-powered commerce assistant, Sidekick, with multi-step reasoning. Merchants can ask complex questions, and Sidekick will analyze multiple data sources to pinpoint the issue and make suggestions. Sidekick can now generate custom images based on text prompts to help merchants create visually appealing content for hero banners, marketing materials, and illustrations. Also, Sidekick now supports all 20 admin languages, automatically responding in the language detected.

Uber rolls out Courier XL in India to deliver packages up to 1,650 pounds. Uber has launched Courier XL in India, an expanded service under its logistics arm aimed at transporting packages weighing up to 750 kilograms (1,650 pounds). Initially rolled out in Delhi and Mumbai, the service is part of Uber’s broader strategy to strengthen courier offerings and compete in the large-package delivery space, with plans to expand to more cities in the coming months. Courier XL builds on Uber’s two-wheeler parcel service, Courier.

Robotic warehouse firm Pio expands product lineup. Pio, a warehouse automation provider, has announced its global expansion with a new product lineup. Pio now offers four scalable, modular systems: P100 for businesses with fewer than 10,000 monthly orders; P200 for mid-sized operations; P400 for high volume fulfillment centres; P600 for larger high-capacity warehouses. According to Pio, customers can reduce picking labour costs by up to 80%, and installations take as little as five days.

Home page of Pio

Googler’s Deposition Offers View Of Google’s Ranking Systems via @sejournal, @martinibuster

A Google engineer’s redacted testimony, published online by the U.S. Justice Department, offers a look inside Google’s ranking systems: it gives an idea of how Google’s quality scores work and introduces a mysterious popularity signal that uses Chrome data.

The document offers a high level and very general view of ranking signals, providing a sense of what the algorithms do but not the specifics.

Hand-Crafted Signals

For example, it begins with a section about the “hand crafting” of signals, which describes the general process of taking data from quality raters, clicks, and so on, and applying mathematical and statistical formulas to generate a ranking score from three kinds of signals. “Hand crafted” means algorithms that operate at scale but are tuned by search engineers. It doesn’t mean that engineers are manually ranking websites.

Google’s ABC Signals

The DOJ document lists three kinds of signals that are referred to as ABC Signals and correspond to the following:

  • A – Anchors (pages linking to the target pages),
  • B – Body (search query terms in the document),
  • C – Clicks (user dwell time before returning to the SERP)

The statement about the ABC signals is a generalization of one part of the ranking process. Ranking search results is far more complex and involves hundreds, if not thousands, of additional algorithms at every step, from indexing and link analysis to anti-spam processing, personalization, and re-ranking. For example, Liz Reid has discussed Core Topicality Systems as part of the ranking algorithm, and Martin Splitt has discussed annotations as a part of understanding web pages.

This is what the document says about the ABC signals:

“ABC signals are the key components of topicality (or a base score), which is Google’s determination of how the document is relevant to the query.

T* (Topicality) effectively combines (at least) these three signals in a relatively hand-crafted way. Google uses to judge the relevance of the document based on the query terms.”
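The deposition doesn’t reveal the actual formula, but the idea of hand-crafting a topicality score out of the A, B, and C signals can be illustrated with a toy sketch. The weights, the linear combination, and the function name below are entirely my own assumptions; the document only says the signals are combined “in a relatively hand-crafted way”:

```python
def topicality_score(anchor_score: float, body_score: float, click_score: float) -> float:
    """Toy illustration of combining ABC signals into a base topicality score.

    The weights and the linear combination are hypothetical; Google's actual
    combination is not disclosed in the deposition.
    """
    # Hand-tuned weights an engineer might adjust when a query class misbehaves.
    weights = {"anchors": 0.3, "body": 0.5, "clicks": 0.2}
    score = (weights["anchors"] * anchor_score
             + weights["body"] * body_score
             + weights["clicks"] * click_score)
    # Clamp to [0, 1] so downstream systems can rely on a fixed range.
    return max(0.0, min(1.0, score))
```

Because each weight is explicit, an engineer can see exactly which component moved a score, which is the kind of transparency and troubleshootability the deposition says Google values in hand-crafted signals.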

The document offers an idea of the complexity of ranking web pages:

“Ranking development (especially topicality) involves solving many complex mathematical problems. For topicality, there might be a team of engineers working continuously on these hard problems within a given project.

The reason why the vast majority of signals are hand-crafted is that if anything breaks Google knows what to fix. Google wants their signals to be fully transparent so they can trouble-shoot them and improve upon them.”

The document compares their hand-crafted approach to Microsoft’s automated approach, saying that when something breaks at Bing it’s far more difficult to troubleshoot than it is with Google’s approach.

Interplay Between Page Quality And Relevance

An interesting point revealed by the search engineer is that page quality is largely independent of the query. If a page is determined to be high quality and trustworthy, it is regarded as trustworthy across all related queries; that is what is meant by the word static: the score is not dynamically recalculated for each query. However, query-related relevance signals are factored into the final rankings, which shows how decisive relevance is in determining what gets ranked.

This is what they said:

“Quality
Generally static across multiple queries and not connected to a specific query.

However, in some cases Quality signal incorporates information from the query in addition to the static signal. For example, a site may have high quality but general information so a query interpreted as seeking very narrow/technical information may be used to direct to a quality site that is more technical.

Q* (page quality (i.e., the notion of trustworthiness)) is incredibly important. If competitors see the logs, then they have a notion of “authority” for a given site.

Quality score is hugely important even today. Page quality is something people complain about the most…”

AI Gives Cause For Complaints Against Google

The engineer states that people complain about quality, and that AI is making the problem worse.

He says about page quality:

“Nowadays, people still complain about the quality and AI makes it worse.

This was and continues to be a lot of work but could be easily reverse engineered because Q is largely static and largely related to the site rather than the query.”

eDeepRank – A Way To Understand LLM Rankings

The Googler lists other ranking signals, including one called eDeepRank, an LLM-based system that uses BERT, a transformer-based language model.

He explains:

“eDeepRank is an LLM system that uses BERT, transformers. Essentially, eDeepRank tries to take LLM-based signals and decompose them into components to make them more transparent. “

That part about decomposing LLM signals into components seems to be a reference to making LLM-based ranking signals more transparent, so that search engineers can understand why the LLM ranks something the way it does.

PageRank Linked To Distance Ranking Algorithms

PageRank is Google’s original ranking innovation, and it has since been updated. I wrote about this kind of algorithm six years ago. Link distance algorithms calculate the distance from authoritative websites on a given topic (called seed sites) to other websites on the same topic. Sites closer to the seed set are likelier to be authoritative and trustworthy; sites further away are determined to be less so.
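A minimal sketch of such a link-distance computation, using breadth-first search outward from a seed set, might look like the following. The graph shape, decay factor, and function names are illustrative assumptions, not a description of Google’s implementation:

```python
from collections import deque

def link_distances(graph: dict[str, list[str]], seeds: set[str]) -> dict[str, int]:
    """Breadth-first search returning each site's link distance
    from the nearest seed (authoritative) site."""
    distances = {site: 0 for site in seeds}
    queue = deque(seeds)
    while queue:
        site = queue.popleft()
        for neighbor in graph.get(site, []):
            if neighbor not in distances:
                distances[neighbor] = distances[site] + 1
                queue.append(neighbor)
    return distances

def trust_score(distance: int, decay: float = 0.5) -> float:
    """Hypothetical trust value that decays with each hop away from a seed."""
    return decay ** distance

# Example: seed links to a.example, which links to b.example.
graph = {"seed.example": ["a.example"], "a.example": ["b.example"]}
dists = link_distances(graph, {"seed.example"})
```

Here `b.example`, two hops from the seed, would receive a lower trust value than `a.example` at one hop, matching the intuition that sites further from the seed set are less trustworthy.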

This is what the Googler said about PageRank:

“PageRank. This is a single signal relating to distance from a known good source, and it is used as an input to the Quality score.”

Read about this kind of link ranking algorithm: Link Distance Ranking Algorithms

Cryptic Chrome-Based Popularity Signal

There is another signal whose name is redacted that’s related to popularity.

Here’s the cryptic description:

“[redacted] (popularity) signal that uses Chrome data.”

A plausible claim can be made that this confirms that the Chrome API leak is about actual ranking factors. However, many SEOs, myself included, believe that those APIs are developer-facing tools used by Chrome to show performance metrics like Core Web Vitals within the Chrome Dev Tools interface.

I suspect that this is a reference to a popularity signal that we might not know about.

The Google engineer also refers to another leak of documents that named actual “components of Google’s ranking system,” but says the documents don’t contain enough information to reverse engineer the algorithm.

They explain:

“There was a leak of Google documents which named certain components of Google’s ranking system, but the documents don’t go into specifics of the curves and thresholds.

For example
The documents alone do not give you enough details to figure it out, but the data likely does.”

Takeaway

The newly released document summarizes a U.S. Justice Department deposition of a Google engineer that offers a general outline of parts of Google’s search ranking systems. It discusses hand-crafted signal design, the role of static page quality scores, and a mysterious popularity signal derived from Chrome data.

It provides a rare look into how signals like topicality, trustworthiness, click behavior, and LLM-based transparency are engineered and offers a different perspective on how Google ranks websites.

Featured Image by Shutterstock/fran_kie

HTTP Status Codes Google Cares About (And Those It Ignores) via @sejournal, @MattGSouthern

Google’s Search Relations team recently shared insights about how the search engine handles HTTP status codes during a “Search Off the Record” podcast.

Gary Illyes and Martin Splitt from Google discussed several status code categories commonly misunderstood by SEO professionals.

How Google Views Certain HTTP Status Codes

While the podcast didn’t cover every HTTP status code (obviously, 200 OK remains fundamental), it focused on categories that often cause confusion among SEO practitioners.

Splitt emphasized during the discussion:

“These status codes are actually important for site owners and SEOs because they tell a story about what happened when a particular request came in.”

The podcast revealed several notable points about how Google processes specific status code categories.

The 1xx Codes: Completely Ignored

Google’s crawlers ignore all status codes in the 1xx range, including newer features like “early hints” (HTTP 103).

Illyes explained:

“We are just going to pass through [1xx status codes] anyway without even noticing that something was in the 100 range. We just notice the next non-100 status code instead.”

This means implementing early hints might help user experience, but won’t directly benefit your SEO.

Redirects: Simpler Than Many SEOs Believe

While SEO professionals often debate which redirect type to use (301, 302, 307, 308), Google’s approach focuses mainly on whether redirects are permanent or temporary.

Illyes stated:

“For Google search specifically, it’s just like ‘yeah, it was a redirection.’ We kind of care about in canonicalization whether something was temporary or permanent, but otherwise we just [see] it was a redirection.”

This doesn’t mean redirect implementation is unimportant, but it suggests the permanent vs. temporary distinction is more critical than the specific code number.
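The permanent-vs-temporary distinction maps onto the common redirect status codes like this (a minimal lookup based on the codes’ conventional HTTP semantics; the function name is my own):

```python
def redirect_permanence(status_code: int) -> str:
    """Classify a 3xx redirect as permanent or temporary, which per the
    discussion is what matters for canonicalization."""
    permanent = {301, 308}  # hint that the destination URL should become canonical
    temporary = {302, 307}  # hint that the original URL should stay canonical
    if status_code in permanent:
        return "permanent"
    if status_code in temporary:
        return "temporary"
    raise ValueError(f"{status_code} is not a common redirect code")
```

The 307/308 variants differ from 302/301 in that they forbid changing the request method, but by this account that nuance matters to browsers and APIs more than to Google’s canonicalization.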

Client Error Codes: Standard Processing

The 4xx range of status codes functions largely as expected.

Google appropriately processes standard codes like 404 (not found) and 410 (gone), which remain essential for proper crawl management.

The team humorously mentioned status code 418 (“I’m a teapot”), an April Fool’s joke in the standards, which has no SEO impact.

Network Errors in Search Console: Looking Deeper

Many mysterious network errors in Search Console originate from deeper technical layers below HTTP.

Illyes explained:

“Every now and then you would get these weird messages in Search Console that like there was something with the network… and that can actually happen in these layers that we are talking about.”

When you see network-related crawl errors, you may need to investigate lower-level protocols like TCP, UDP, or DNS.

What Wasn’t Discussed But Still Matters

The podcast didn’t cover many status codes that definitely matter to Google, including:

  • 200 OK (the standard successful response)
  • 500-level server errors (which can affect crawling and indexing)
  • 429 Too Many Requests (rate limiting)
  • Various other specialized codes

Practical Takeaways

While this wasn’t a comprehensive guide to HTTP status codes, the discussion revealed several practical insights:

  • For redirects, focus primarily on the permanent vs. temporary distinction
  • Don’t invest resources in optimizing 1xx responses specifically for Google
  • When troubleshooting network errors, look beyond HTTP to deeper protocol layers
  • Continue to implement standard status codes correctly, including those not specifically discussed

As web technology evolves with HTTP/3 and QUIC, understanding how Google processes these signals can help you build more effective technical SEO strategies without overcomplicating implementation.


Featured Image: Roman Samborskyi/Shutterstock

Ask A PPC: How Will The Tariffs Impact My PPC Campaigns? via @sejournal, @navahf

Tariffs – and other international trade turbulence – tend to live in the operations or finance side of the house.

In reality, these shifts disrupt marketing, media efficiency, and PPC strategy more than many brands are prepared for.

When the ripple effects hit, they often show up first in your performance metrics.

This month’s Ask A PPC is focused on:

  • The impact of tariffs.
  • What you can do about it.
  • Adjacent implications.

National & International Implications Of Tariffs For PPC

When new tariffs are imposed or existing ones are expanded, they change the fundamental cost structure of goods.

That alone is enough to throw a wrench into your performance benchmarks, but the real chaos comes from how differently brands respond.

  • Some advertisers raise prices, hoping to preserve margins.
  • Others eat the cost, at least in the short term, to maintain market share.
  • Still others pull back spend entirely in affected markets or shift budgets into “safer” channels.

This reshuffling affects auction dynamics. If big players reduce spend in your vertical, you might see the cost-per-click drop – temporarily.

But, if a price increase tanks your conversion rate and you’re still optimizing for return on ad spend (ROAS), your cost-per-acquisition can spike even with stable CPCs.

As Mike Ryan of Smarter Ecommerce (SMEC) reported, Temu’s sharp decline in impression share reflects a fear of investing in a chaotic market.

For international brands, the implications are even more tangled.

A product line that’s suddenly 20% more expensive in the U.S. might still perform normally in the EU or Canada. That means different messaging, different ROAS targets, and possibly different bid strategies across markets.

This is why it’s really important to segment markets by Google campaign so you can dynamically adjust budgets. It’s worth noting that Microsoft, Meta, and LinkedIn allow for ad group/ad set location targeting.

How Tariffs Influence CPCs (Even When They Don’t Change the Bid)

One of the biggest misconceptions is that tariffs = higher CPCs. The reality is more subtle.

Tariffs increase the cost of doing business. For physical products, that usually means higher retail prices or tighter margins. And that, in turn, changes how efficiently your ads can convert.

  • Higher prices can depress conversion rates, especially if your landing pages haven’t been updated to match the current world state.
  • Softening conversion rates make your CPAs more expensive, even if the platform’s reported CPC hasn’t changed.
  • Smart bidding reacts to this. If ROAS or CPA targets aren’t being hit due to lower conversion rates, Google and Meta will either scale back delivery or hunt for cheaper (possibly lower-intent) clicks.

So no, tariffs don’t directly change your bids, but they will change how your bidding strategy performs – and whether you’re hitting your key performance indicators (KPIs).
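The arithmetic behind that point is simple: effective CPA is CPC divided by conversion rate, so a falling conversion rate raises CPA even when CPC is flat. A quick sketch, with illustrative numbers:

```python
def cpa(cpc: float, conversion_rate: float) -> float:
    """Cost per acquisition = cost per click / conversion rate."""
    return cpc / conversion_rate

# Same $1.50 CPC before and after a tariff-driven price hike,
# but the higher price halves the conversion rate:
before = cpa(1.50, 0.04)  # 4% of clicks convert
after = cpa(1.50, 0.02)   # 2% of clicks convert: CPA doubles, CPC unchanged
```

This is exactly the scenario where smart bidding, still chasing the old CPA or ROAS target, starts scaling back delivery or hunting for cheaper clicks.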

What Should Advertisers Do?

There is no right or wrong answer here. Developing a success plan in the tariff world requires balancing proven tactics and empathy for evolving consumer sentiment.

Here are some good places to start:

1. Your PPC Accounts

  • Check your conversion rates by market. A dip in one region but not another could indicate a local pricing or availability issue. If a market is no longer profitable enough to justify the budget, consider pausing the investment and moving that spend to other regions.
  • Refine your audience targeting. Consider excluding in-market and life event audiences that are too far out of your core market, as well as layering on segments from YouTube content, custom intent, and lookalikes (Demand Gen only).
  • Adjust your creative. Emphasize non-price value props: longevity, warranty, local support, sustainability. Also, make sure your creative doesn’t pigeonhole you into one country or another. This is a great time to audit your assets (formerly known as extensions) to ensure nothing comes across in a way you don’t intend.
  • Recalibrate smart bidding. Adjust your ROAS or CPA targets to reflect new economic realities as well as any micro-conversions you may introduce. If performance is drastically different, you may need to input exclusions into Google’s algorithm.

2. On Your Landing Pages

  • Be transparent about pricing changes. Consumers are more forgiving when they understand why something costs more, especially if the messaging is human and upfront. Additionally, make sure your language speaks to all customers, not just those in the U.S.
  • Lean into trust-building elements: Shipping policies, customer reviews, and return guarantees help offset price sensitivity. This is especially important above the fold.
  • Highlight sustainable or local production practices, but with care. “Made in USA” messaging can work for domestic campaigns, but be cautious with international audiences. What resonates in one market might alienate in another.

Bonus: Don’t Forget The Environmental Angle

Tariffs aren’t just about money. They often reflect or trigger shifts in global logistics. That means longer shipping routes, more warehousing costs, and bigger carbon footprints. Consumers are paying attention.

If your brand has made changes to source materials domestically or reduce emissions, that’s worth testing in ad copy and landing pages. Sustainability isn’t just a PR point; it’s a conversion lever.

Final Thought

PPC doesn’t operate in a vacuum. Every economic policy, trade shift, or tariff war changes the playing field, often before your attribution model can catch up.

As paid media managers, we can’t control tariffs, but we can recognize their downstream effects early, respond quickly, and guide our brands through the storm with a smarter strategy.



Featured Image: Paulo Bobita/Search Engine Journal

How Referral Traffic Undermines Long-Term Brand Growth via @sejournal, @martinibuster

Mordy Oberstein, a search marketing professional whom I hold in high esteem, recently shared the provocative idea that referral traffic is not a brand’s friend and that every brand, as it matures, should wean itself from it. Referrals from other websites are generally considered a sign of a high-performing business, but it’s not a long-term strategy because it depends on sources that cannot be controlled.

Referral Traffic Is Necessary But…

Mordy Oberstein (LinkedIn profile), formerly of Wix, asserted in a Facebook post that relying on a traffic source, whether that’s another website or a search engine, offers a degree of vulnerability to maintaining steady traffic and performance.

He broke it down as a two-fold weakness:

  • Relying on the other site to keep featuring your brand.
  • Relying on Google to keep ranking that other site which in turn sends visitors to your brand.

The flow of traffic can stop at either of those two points, which is a hidden weakness that can affect the long-term sustainability of healthy traffic and sales.

Mordy explained:

“It’s a double vulnerability…

1) Relying on being featured by the website (the traffic source)
2) Relying on Google to give that website …traffic (the channel)

There are two levels of exposure & vulnerability.

As your brand matures, you want to own your own narrative.

More referral traffic is not your friend. It’s why, as a brand matures, it should wean off of it.

Full disclosure, this is my opinion. I am sure a lot of people will disagree.”

Becoming A Destination

I’ve always favored promoting a site in a way that helps it become synonymous with a given topic, because that’s how to make it a default destination and encourage the kinds of signals that Google interprets as authoritative. I’ve done things like creating hats with logos to give away, annual product giveaways, and other promotional activities, both online and offline. While my competition was doing SEO busy work, I created fans. Promoting a site is basically just getting it in front of people, both online and offline.

Brand Authority Is An Excuse, Not A Goal

Some SEOs believe in a concept called Brand Authority, which is a misleading explanation for why a website ranks. The term Brand Authority is not about branding, and it’s not about authoritativeness, either. It’s just an excuse for why a site is top-ranked.

The phrase Brand Authority has its roots in PageRank. Big-brand websites used to have a PageRank of 9 out of 10, or even 10/10, which enabled them to rank for virtually any keywords they wanted. A link from one of those sites practically guaranteed a top-ten ranking. But Google ended the outsized influence of PageRank because it resulted in less relevant results. That was around 2004, about the time Google started using Navboost, a ranking signal that essentially measures how people feel about a site, which, in a sense, is what PageRank measures, too.

This insight, that Google uses signals about how people feel about a site, is important because the feelings people have for a business are what being a brand is all about.

Marty Neumeier, a thought leader on how to promote companies (author of The Brand Gap) explained what being a brand is all about:

“Instead of creating the brand first, the company creates customers (through products and social media), the customers build the brand (through purchases and advocacy), and the customer-built brand sustains the company (through “tribal” loyalty). This model takes into account a profound and counterintuitive truth: a brand is not owned by the company, but by the customers who draw meaning from it. Your brand isn’t what you say it is. It’s what they say it is.”

Neumeier also explains how brand is about customer feelings:

“The best brands are vivid. They create clear mental pictures and powerful feelings in the minds and hearts of customers. They’re brought to life through their touchpoints, the places where customers experience them, from the first exposure to a brand’s name, to buying the product, to eventually making it part of who they are.”

That “tribal loyalty” is the kind of thing Google tries to measure. So when Danny Sullivan talks about differentiating your site to make it like a brand, he is not referring to so-called “brand authority.” He is talking about doing the kinds of things that influence people to feel positive about a site.

Getting Back To Mordy Oberstein

It seems to me that what he’s saying is that referral traffic is a means to an end: a stepping stone toward becoming a destination, not the goal itself.

On the other side of that process, I think it’s important to maintain relevance with potential site visitors and customers, especially today with the rapid pace of innovation, generational change, new inventions, and new product models. Relevance to people has been a Google ranking signal for a long time, beginning with PageRank, then with additional signals like Navboost.

The SEO factor that the SEO industry has largely missed is the part about getting people to think positive thoughts about your site and your business, enough to share them with other people.

Mordy’s insight about traffic is beautiful and elegant.

Read Mordy’s entire post on Facebook.

Featured Image by Shutterstock/Yunus Praditya