Why the term “women of childbearing age” is problematic

This article first appeared in The Checkup, MIT Technology Review’s weekly biotech newsletter. To receive it in your inbox every Thursday, and read articles like this first, sign up here.

Every journalist has favorite topics. Regular Checkup readers might already know some of mine, which include the quest to delay or reverse human aging, and new technologies for reproductive health and fertility. So when I saw trailers for The Substance, a film centered on one middle-aged woman’s attempt to reexperience youth, I had to watch it.

I won’t spoil the movie for anyone who hasn’t seen it yet (although I should warn that it is not for the squeamish, or anyone with an aversion to gratuitous close-ups of bums and nipples). But a key premise of the film involves harmful attitudes toward female aging.

“Hey, did you know that a woman’s fertility starts to decrease by the age of 25?” a powerful male character asks early in the film. “At 50, it just stops,” he later adds. He never explains what stops, exactly, but to the viewer the message is pretty clear: If you’re a woman, your worth is tied to your fertility. Once your fertile window is over, so are you.

The insidious idea that women’s bodies are, above all else, vessels for growing children has plenty of negative consequences for us all. But it has also set back scientific research and health policy.

Earlier this week, I chatted about this with Alana Cattapan, a political scientist at the University of Waterloo in Ontario, Canada. Cattapan has been exploring the concept of “women of reproductive age”—a descriptor that is ubiquitous in health research and policy.

The idea for the research project came to her when the Zika virus was making headlines around eight years ago. “I was planning on going to the Caribbean for a trip related to my partner’s research, and I kept getting advice that women of reproductive age shouldn’t go,” she told me. At the time, Zika was being linked to microcephaly—unusually small heads—in newborn babies. It was thought that the virus was affecting key stages of fetal development.

Cattapan wasn’t pregnant. And she wasn’t planning on becoming pregnant at the time. So why was she being advised to stay away from areas with the virus?

The experience got her thinking about the ways in which attitudes toward our bodies are governed by the idea of potential pregnancy. Take, for example, biomedical research on the causes and treatment of disease. Women’s health has lagged behind men’s as a focus of such work, for multiple reasons. Male bodies have long been considered the “default” human form, for example. And clinical trials have historically been designed in ways that make them less accessible for women.

Fears about the potential effects of drugs on fetuses have also played a significant role in keeping people who have the potential to become pregnant out of studies. “Scientific research has excluded women of ‘reproductive age,’ or women who might potentially conceive, in a blanket way,” says Cattapan. “The research that we have on many, many drugs does not include women and certainly doesn’t include women in pregnancy.”  

This lack of research goes some way to explaining why women are much more likely to experience side effects from drugs—some of them fatal. Over the last couple of decades, greater effort has been made to include people with ovaries and uteruses in clinical research. But we still have a long way to go.

Women are also often subjected to medical advice designed to protect a potential fetus, whether they are pregnant or not. Official guidelines on how much mercury-containing fish is safe to eat can be different for “women of childbearing age,” according to the US Environmental Protection Agency, for example. And in 2021, the World Health Organization used the same language to describe people who should be a focus of policies to reduce alcohol consumption.

The takeaway message is that it’s women who should be thinking about fetal health, says Cattapan. Not the industries producing these chemicals or the agencies that regulate them. Not even the men who contribute to a pregnancy. Just women who stand a chance of getting pregnant, whether they intend to or not. “It puts the onus of the health of future generations squarely on the shoulders of women,” she says.

Another problem is the language itself. The term “women of reproductive age” typically includes women between 15 and 44. Women at one end of that spectrum will have very different bodies and a very different set of health risks from those at the other. And the term doesn’t account for people who might be able to get pregnant but don’t necessarily identify as female.

In other cases it is overly broad. In the context of the Zika virus, for example, it was not all women between the ages of 15 and 44 who should have considered taking precautions. The travel advice didn’t apply to people who’d had hysterectomies or did not have sex with men, for example, says Cattapan. “Precision here matters,” she says. 

More nuanced health advice would be helpful in cases like these. Guidelines often read as though they’re written for people assumed to be stupid, she adds. “I don’t think that needs to be the case.”

Another thing

On Thursday, president-elect Donald Trump said that he will nominate Robert F. Kennedy Jr. to lead the US Department of Health and Human Services. The news was not entirely a surprise, given that Trump had told an audience at a campaign rally that he would let Kennedy “go wild” on health, “the foods,” and “the medicines.”

The role would give Kennedy some control over multiple agencies, including the Food and Drug Administration, which regulates medicines in the US, and the Centers for Disease Control and Prevention, which coordinates public health advice and programs.

That’s extremely concerning to scientists, doctors, and health researchers, given Kennedy’s positions on evidence-based medicine, including his antivaccine stance. A few weeks ago, in a post on X, he referred to the FDA’s “aggressive suppression of psychedelics, peptides, stem cells, raw milk, hyperbaric therapies, chelating compounds, ivermectin, hydroxychloroquine, vitamins, clean foods, sunshine, exercise, nutraceuticals and anything else that advances human health and can’t be patented by Pharma.”  

“If you work for the FDA and are part of this corrupt system, I have two messages for you,” continued the post. “1. Preserve your records, and 2. Pack your bags.”

There’s a lot to unpack here. But briefly, we don’t yet have good evidence that mind-altering psychedelic drugs are the mental-health cure-alls some claim they are. There’s not enough evidence to support the many unapproved stem-cell treatments sold by clinics throughout the US and beyond, either. These “treatments” can be dangerous.

Health agencies are currently warning against the consumption of raw unpasteurized milk, because it might carry the bird flu virus that has been circulating in US dairy farms. And it’s far too simplistic to lump all vitamins together—some might be of benefit to some people, but not everyone needs supplements, and high doses can be harmful.

Kennedy’s 2021 book The Real Anthony Fauci has already helped spread misinformation about AIDS. Here at MIT Technology Review, we’ll continue our work reporting on whatever comes next. Watch this space.


Now read the rest of The Checkup

Read more from MIT Technology Review’s archive

The tech industry has a gender problem, as the Gamergate and various #MeToo scandals made clear. A new generation of activists is hoping to remedy it.

Male and female immune systems work differently. Which is another reason why it’s vital to study both women and female animals as well as males.

Both of the above articles were published in the Gender issue of MIT Technology Review magazine. You can read more from that issue online here.

Women are more likely to receive abuse online. My colleague Charlotte Jee spoke to the technologists working on an alternative way to interact online: a feminist internet.

From around the web 

The scientific community and biopharma investors are reacting to the news of Robert F. Kennedy Jr.’s nomination to lead the Department of Health and Human Services. “It’s hard to see HHS functioning,” said one biotech analyst. (STAT)

Virologist Beata Halassy successfully treated her own breast cancer with viruses she grew in the lab. She has no regrets. (Nature)

Could diet influence the growth of endometriosis lesions? Potentially, according to research in mice fed high-fat, low-fiber “Western” diets. (BMC Medicine)

Last week, 43 female rhesus macaque monkeys escaped from a lab in South Carolina. The animals may have a legal claim to freedom. (Vox)

Health Crisis Drives $50 Million Supplement CEO

Dean Brennan says a diet of beer, pizza, and fast food led to his ulcerative colitis. Doctors diagnosed it in his twenties and told him he’d need medication for life. But Brennan decided otherwise.

“I didn’t want to take lifelong medication,” he told me. “It sparked my passion for health and led me to want to help others.”

Fast forward to 2024, and Brennan is the CEO of Heart & Soil, a nutritional supplement company doing $50 million in annual revenue.

In our recent conversation, he discussed his journey to Heart & Soil, key supplement ingredients, supply chain challenges, and more. The entire audio is embedded below. The transcript is edited for clarity and length.

Eric Bandholz: Give us a rundown of what you do.

Dean Brennan: I’m the CEO of Heart & Soil, a nutritional supplements company. I entered ecommerce in 2020 with no experience, coming from a background in filmmaking.

I got involved with the company from my personal health journey. In my twenties, I was diagnosed with ulcerative colitis, and doctors told me I’d need medication for life. I grew up eating home-cooked, natural foods, although in college I consumed a lot of beer, pizza, and fast food.

I didn’t want to take lifelong medication. It sparked my passion for health and led me to want to help others who suffer from conditions like psoriasis, Crohn’s disease, and eczema.

Heart & Soil offers supplements containing nature-based multivitamins made from bovine organs sourced from regenerative farms, initially in New Zealand and now also from the U.S.

Bandholz: How did you get connected with Heart & Soil?

Brennan: I was aware of Paul Saladino, our founder, but not the company. He’s a board-certified physician and a nutrition specialist. I followed him on social media while experimenting with a carnivore diet. I admired his ability to simplify complex health concepts and share them in an engaging way.

In 2020, I met Paul by chance, along with two employees who are now our chief research officer and head of operations. At the time, the company hadn’t launched yet, and I offered feedback on its prototype product. Initially, I wasn’t looking for a position in the company, but I was passionate about their mission.

Later that year, after my persistence, Paul brought me on board the day the company launched. I printed shipping labels and prepared the orders. Within three months, I had worked my way into a bigger role.

The team was small then — Paul, me, and three others. We worked out of a rental house in West Austin, packing and shipping supplements ourselves. We grew quickly. Paul realized his expertise was podcasting and researching, not operations. He assigned those responsibilities to me by January 2021.

Bandholz: How did you earn Paul’s trust so quickly?

Brennan: It was a gradual transition. Paul left for a trip to Africa. Then there was a massive ice storm in Austin, and he couldn’t return. Eventually, he went to Costa Rica and decided to stay there, leaving me to run the business. I think he trusted me because I showed up every day, worked hard, and didn’t ask for anything.

The transition was easy. I was nervous about how the team would react, but they were all on board. We’ve worked well together ever since.

Bandholz: How do you spread awareness beyond Paul’s podcast audience?

Brennan: Only about 30% of our customers come from Paul’s audience, with the same percentage coming through word of mouth. Our product works, and we’ve received hundreds of customer success stories. One of our strengths is personalized customer service. Our team of health guides offers one-on-one support, which has led to word-of-mouth referrals. People often tell others about us, even if they haven’t purchased our products themselves.

We also started another podcast called Radical Health Radio, and we’re producing films for YouTube. Our documentary on seed oils will be released next month.

Bandholz: What’s your supply chain like?

Brennan: Our long-term goal is to build a U.S.-based supply chain to produce all the organs needed for our supplements. In 2020, nothing like this existed in the U.S., so we sourced from New Zealand, where regenerative farming is common. But we’ve worked hard over the last four years to develop U.S. suppliers, supporting American farmers.

There’s a huge education gap in the U.S. regarding organ consumption. Around the world, most cultures consume organs regularly. We hope that educating consumers can drive demand for better products and ingredients.

When consumers ask for healthier alternatives, large companies will have to respond. This movement isn’t just about our products but about supporting sustainable farming practices and improving public health.

Bandholz: Where can people buy your supplements and follow you?

Brennan: Our ecommerce site is Heartandsoil.co. You can follow me on X and LinkedIn.

WP Engine Escalates Legal Battle With Automattic and Mullenweg via @sejournal, @martinibuster

WP Engine escalated its federal complaint by citing Automattic’s publication of the WP Engine Tracker website as evidence of intent to harm WP Engine and to expose customers to potential cybercrimes. The updated complaint incorporates recent actions by Mullenweg to further strengthen the case.

A spokesperson for WP Engine issued a statement to Search Engine Journal about the WP Engine Tracker website:

“Automattic’s wrongful and reckless publication of customer’s information without their consent underscores why we have moved for a preliminary injunction. WP Engine has requested the immediate takedown of this information and looks forward to the November 26th hearing on the injunction.”

Legal Complaint Amended With More Evidence

WP Engine (WPE) filed a complaint in federal court seeking a preliminary injunction to prevent Matt Mullenweg and Automattic from continuing actions that harm WPE’s business and its relationships with its customers. That complaint was amended with further details to support the allegations against Mullenweg and Automattic.

The legal complaint begins by stating in general terms what gives rise to their claim:

“This is a case about abuse of power, extortion, and greed.”

It then grows progressively more specific, introducing evidence of how Automattic and Mullenweg continue their “bad acts unabated” for the purpose of harming WP Engine (WPE).

The amended claim adds the following, quoting Mullenweg himself:

“Since then, Defendants have continued to escalate their war, unleashing a campaign to steal WPE’s software, customers and employees. Indeed, just days ago, Defendants were unambiguous about their future plans:”

This is the statement Mullenweg made that is quoted in the amended complaint:

“[S]ince this started [with WPE] they’ve had uh, we estimate tens of thousands of customers leave. . . . So, um you know, I think over the next few weeks, they’re actually gonna lose far more than 8% of their business . . . we’re at war with them. We’re . . . going to go brick by brick and take . . . take every single one of their customers . . . if they weren’t around guess what? . . . We’d happily have those customers, and in fact we’re getting a lot of them.”

WP Engine Tracker Site Used As Evidence

Automattic recently created a website on the WordPressEngineTracker.com domain, called WP Engine Tracker, that encourages WP Engine customers to leave, offering links to promotions with discounts and promises of a smooth transition to other web hosts.

WPE states that the WP Engine Tracker website is part of a campaign to encourage WPE customers to abandon it, writing:

“Defendants also created a webpage at wordpress.org offering “Promotions and Coupons” to convince WPE customers to stop doing business with WPE and switch over to Automattic’s competitor hosting companies like wordpress.com and Pressable; they later added links to other competitors as well.”

The WP Engine Tracker website calls attention to the number of sites that have left WP Engine (WPE) since Matt Mullenweg’s September 20th public denunciation of WP Engine and the start of his “nuclear” war against the web host. The amended federal lawsuit points to the September 21st date listed on that site as additional evidence tying Automattic to a campaign to harm WP Engine’s business.

The legal document explains:

“Just last week, in an apparent effort to brag about how successful they have been in harming WPE, Defendants created a website—www.wordpressenginetracker.com—that “list[s] . . . every domain hosted by @wpengine, which you can see decline every day. 15,080 sites have left already since September 21st.”

September 21 was not selected randomly. It is the day after Defendants’ self-proclaimed nuclear war began – an admission that these customer losses were caused by Defendants’ wrongful actions. In this extraordinary attack on WPE and its customers, Defendants included on their disparaging website a downloadable file of ‘all [WPE] sites ready for a new home’—that is, WPE’s customer list, literally inviting others to target and poach WPE’s clients while Defendants’ attacks on WPE continued.”

The purpose of the above allegations is to build evidence lending credence to WP Engine’s claim that Automattic is actively trying to harm WP Engine’s business.

WPE Accuses Automattic Of Additional Harms

Another new allegation against Automattic is that the spreadsheet offered for download on the WP Engine Tracker website includes sensitive information that is not publicly available and could cause direct harm to WPE customers.

The amended Federal lawsuit explains:

“Worse, this downloadable file contains private information regarding WPE’s customers’ domain names, including development, test, and pre-production servers—many of which are not intended to be accessed publicly and contain sensitive or private information. Many of these servers are intentionally not indexed or otherwise included in public search results because the servers are not safe, secure or production-ready and not intended to be accessed by the general public.

By disclosing this information to the general public, Defendants put these development, test, and pre-production domains at risk for hacking and unauthorized access.”

WP Engine Tracker Site Part Of A Larger Strategy

WPE’s amended complaint alleges that the WP Engine Tracker site is just one part of a larger strategy to harm WP Engine’s business, a strategy that also includes encouraging WPE employees to resign.

The updated document adds the following new allegations as evidence of WPE’s claims:

“Not content with interfering with WPE’s customer relations, Automattic has recently escalated its tactics by actively recruiting hundreds of WPE employees, in an apparent effort to weaken WPE by sowing doubts about the company’s future and enticing WPE’s employees to join Automattic:”

The document includes a screenshot of an email solicitation apparently sent to an employee that encourages them to join Automattic.

Screenshot Of Evidence Presented In Amended Complaint

Escalation Of Federal Complaint

WP Engine’s amended complaint against Mullenweg and Automattic invokes the Sherman Act (prohibiting monopolization to maintain a competitive marketplace), the Lanham Act (governing trademarks, false advertising, and unfair competition), and the Computer Fraud and Abuse Act (addressing unauthorized computer access and cybercrimes). The amendments tie recent actions by Mullenweg and Automattic—such as the creation of the WP Engine Tracker website—directly to their claims, turning Mullenweg’s attacks on WP Engine into evidence.

Read the amended Federal complaint here: (PDF).

Featured Image by Shutterstock/chaiyapruek youprasert

Bad & Toxic Backlinks You Should Avoid via @sejournal, @BennyJamminS

Link building is a complicated art form with many different tactics and approaches.

Despite being one of the most mature processes in SEO, there’s still much disagreement about what makes a “bad” or “good” link building strategy, including effectiveness vs. risk, and what tactics Google can detect or punish a website for.

This post will help you determine what to avoid when link building or vetting the tactics of a new service provider.

I’m not going to claim to put any disagreements to rest, and if you’re a particularly experiment-minded SEO you might find this post a little on the conservative side.

As with all things in the industry, there’s inconsistency between what Google says and what works, and everyone benefits from those who experiment and push boundaries.

But I’m taking a conservative approach that follows Google’s guidelines closely for two core reasons:

  • This post is for readers looking for reliable and sustainable strategies. I don’t advise that you use experimental or high-risk tactics when it comes to link building if you don’t already know what you’re doing and what the risks are.
  • You should take the guidelines as a statement of intent, not absolute or current truth. Even if a link building tactic that goes against Google’s guidelines works now, there is reason to believe that Google intends to address it.

Types Of Unnatural Links

An unnatural link is any link that is created for the purpose of manipulating search engines or that violates Google’s spam policies.

The following are some of the most common types of unnatural links.

Buying Or Selling Links

There is nothing fundamentally wrong with paying for a link or exchanging some kind of product or service for a link as long as the nature of the relationship is disclosed and the links are not for SEO purposes.

Buying, exchanging, or trading for links for SEO is the problem. Links for SEO are supposed to be a choice influenced only by the content on the page.

If your content is highly valued and people choose to link to it for that reason, then you deserve SEO benefits.

When you enter money or value exchanges into that dynamic, it breaks the ideal purpose of SEO links and introduces a high potential for manipulation. In such cases, Google requires marking the link as rel=nofollow or rel=sponsored so that the links do not pass SEO value. As long as you or the parties linking to you do this, for the most part, there’s no problem.

Here is an example of implementing nofollow and sponsored attributes:
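The original example appears to have been lost; here is a minimal sketch (the URLs are placeholders, not real pages). Both attributes are set via the anchor tag’s `rel` attribute:

```html
<!-- A paid or sponsored placement: tells search engines not to pass ranking credit -->
<a href="https://example.com/partner-product" rel="sponsored">Partner product</a>

<!-- A link you don't want to vouch for (e.g., untrusted or unvetted sources) -->
<a href="https://example.com/some-page" rel="nofollow">Some page</a>

<!-- Multiple rel values can be combined, space-separated -->
<a href="https://example.com/ad" rel="sponsored nofollow">Advertisement</a>
```

Google treats `sponsored` as the preferred marker for paid links, with `nofollow` as an acceptable fallback; either prevents the link from passing SEO value.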

Here are some ways that buying or selling links can fall afoul of Google’s spam policies:

  • Text advertisements with links that pass SEO signals because they haven’t been identified with “nofollow” or “sponsored.”
  • Paying for articles that include links that pass SEO signals.

Another way to buy links is to pay someone to create them for you. In this case, a service provider does the work of creating assets, reaching out to acquire links, or both. As long as this service provider doesn’t engage in shady tactics of their own and doesn’t give you links on domains that they own, this is totally fine.

Keep in mind that the “buying” and “selling” definitions are not limited to an exchange of currency.

They describe any kind of relationship where something is exchanged for a link, such as a product.

As Matt Cutts explained in 2014, Google aligns pretty closely with the FTC on what it understands to be a “material connection” between a link provider and link recipient:

  • If a party receives enough value to reasonably change their behavior, a material connection must be disclosed.
    • A pen or a t-shirt likely won’t change behavior (unless received for the explicit purpose of reviewing / linking to it).
    • A direct payment for a link, a gift card, or a product with a high dollar value likely changes behavior and incentivizes a link.
    • An item loaned has different implications than an item given.
  • Consider the intended audience: if you’re giving things away for reasons other than to acquire links (for example as part of a conference attendance gift package), then disclosure might be necessary, but it might not be strictly necessary to ask all those people to mark links as sponsored if they choose to talk about it.
  • Consider whether a link relationship would be surprising: it makes sense that a movie reviewer might see a movie for free. It makes less sense that a tech reporter would get to keep a laptop they’re reporting on without disclosure.

Link Exchange Agreements

Link exchanges are similar to buying links because they involve an exchange of value.

Mutual linking happens often, and when it occurs organically, it’s no problem. It makes perfect sense for some websites to link back and forth.

But you need to watch out for any kind of agreement. “Link for link” is a no-go, and if you do it often enough, it can become easy to spot.

The thing about links is that any time you give or get a link for a reason other than the value and relevance of the link itself, it’s easy to spot – likely easier than you think.

The occasional bit of back rubbing isn’t a big deal. When given a few different choices of websites to reference, it makes sense that people would choose those they already know or have existing relationships with.

That’s generally fine. The problem comes when you enter into specific agreements: You link to me, and I’ll link to you.

The video below explains the difference between a link that’s an editorial choice and a link that’s based on an agreement.

Private Blog Networks

Private blog networks (PBNs) are networks of sites created to artificially inflate the rankings of one specific central website.

Basically, one entity controls an entire network of websites and can use a few specific linking methods to pass authority and SEO value around.

This network can then be used to artificially inflate the rankings of other websites by linking out to them.

In order for this tactic to work, all the websites need to have relationships or be owned by the same entity.

This is a pretty clear violation of Google’s guidelines, and it’s also pretty easy to spot.

Sites that are part of these networks can be penalized, and if you’re a little too lax with user-generated content on your site, you could find yourself accidentally becoming one.

If you accept any kind of content from external parties, scrutinize it carefully, especially links. Skip down to “How To Spot Shady Links” to find out more.

Unnatural Links From Forums, Blog Comments, And Other User-Generated Content

User-generated content is tricky when it comes to links. Ideally, a random person loves your content so much that they use you as a reference. Not so ideal is faking it.

Comments, forums, blogs, guestbooks, and even sites like Reddit might be tempting sources for links, and in the right context, they can absolutely be part of a healthy backlink profile. You can even link to yourself if you’re genuinely engaging in a relevant discussion. Google doesn’t consider all comment links and UGC links to be spam.

However, it’s a bad idea to try and engineer these links as part of a mass strategy.

The first thing to keep in mind is that many user-generated content (UGC) websites have blanket nofollow attributes on outgoing links. It’s an old tactic, so many high-quality communities moderate UGC heavily. This means that doing this effectively requires effort. The big question to ask yourself is: does the comment add genuine value to the community?

Most commonly, people create these links unnaturally by using bots to post automatically. Automated posting generally isn’t valuable, and you’ll be flagged and moderated out of those communities.

Automated Link Syndication

There are tons of ways to automate links, but Google considers automating links at scale to be spam.

There are plenty of ways to safely automate your content processes, but we aren’t talking about that. We’re talking about using automation to post content externally from your website purely to acquire SEO links.

From automated article spinners to bots that will post comments and social media posts, if you’re intentionally building links “at scale,” then the chances are high that you’re building toxic links.

This could look like an automated press release or directory posting. It could look like low-quality article directories, which are often filled with spammy content that is widely distributed.

Generative AI has enabled new forms of automation for links and content, so it’s important to consider the overall principles in Google’s and the FTC’s guidelines when you evaluate novel functions and strategies.

Links In Distributed Widgets

People sometimes engage in automated link building by adding links to widgets distributed to multiple websites. Google clarified its stance on this and provided examples of manipulative widgets.

This kind of link building is pretty easy to spot, and it’s pretty clear that these types of links don’t add value.

Using Expired Domains To Build Links

Expired domain abuse is another tactic Google is wise to, but that doesn’t stop people from trying it.

One way that expired domains can be used to build unnatural links is by purchasing one and then redirecting it to another website. The idea is that all of the authority and backlinks belonging to the expired domain will be forwarded through the redirect. Don’t do this.

Any Link Can Be Bad If It’s Lazy Enough

Does the automated press release spam mean you shouldn’t send press releases? No!

Does the prevalence of poor-quality directories mean you can’t use directories in a high-quality way? Also no!

This goes for many link building strategies. There’s usually a high-effort, valuable version and a low-effort, spammy version.

Take guest posting as an example.

If you’re an expert in your field and take the time to write useful content aligned with E-E-A-T best practices, that’s valuable.

If you want to reach new audiences, you could send that post to a website with a large reach. It makes sense for that website to then link back to you as a reference for readers if they like your writing and want to learn more.

This is an ideal linking relationship. A website has chosen your content because it provides value to its readers and links to you as the source of the expertise.

But when one party turns lazy, this becomes toxic.

A website might decide that, for whatever reason, it makes sense to start allowing poor-quality content with links.

Maybe it starts charging or uses a big catalog of content to build an affiliate strategy.

On the other side, link builders might generate poor-quality content with links and post it on websites that either don’t mind or don’t know better. Or they might try to sneak them past stricter editorial guidelines.

When one side of the equation gets lazy, guest posting becomes a manipulative linking strategy.

The Risk Of Manual Actions

The most likely risk of an unnatural link is that it will be a waste of time and/or money.

If you build a link for SEO that goes against Google’s guidelines, algorithms will simply ignore it either immediately or at an unspecified time in the future when they discover it.

If you have many toxic links and you’re using a strategy that the algorithms don’t immediately catch, this can open you up to a sudden reduction in SEO effectiveness.

At some point, Google will likely release an update that improves how the algorithms detect the links.

When that happens, if you have many of them, the adjustment can significantly impact your rankings and traffic. This can look like a targeted penalty, but generally, it isn’t.

Google uses automated systems and manual actions to punish toxic and spammy link building, but generally, you’re safe from this action unless you’re intentionally using these tactics on a large scale.

On the other hand, you can receive specific penalties for unnatural links, both coming to your site or going out from your site.

Unnatural links manual action notification in Search Console.

Links To Your Site Vs. Links From Your Site

If you host unnatural links from your site to other sites, you may be hit with a manual action. This indicates to Google that you’re on the supply side of the ecosystem it’s trying to stop.

A large number of unnatural links coming from your website could cause Google to decide it doesn’t trust you and issue a penalty. This will be communicated to you in Google Search Console. These penalties can be reversed, but generally this requires you to fix the problems and submit a request for reevaluation.

It’s your responsibility to ensure that your site does not host unnatural links. This video from Google about unnatural links from your site provides a great overview. Remember: “A natural link is an editorial choice.”

For example, if you use your domains to host bad link tactics and sell links to others, you’re at a high risk of receiving a manual penalty from Google that suppresses or removes your website from the Search index.

You can also receive a manual penalty for unnatural links to your website. This seems less likely, because there are many cases where it wouldn’t be fair to punish a website for incoming links. However, you might still receive a manual penalty if Google is confident that you are trying to manipulate your ranking.

This video from Google about unnatural links to your site has more information.

How To Spot Shady Links

A good link is a genuine interaction of trust between two parties.

Spotting shady links is actually pretty easy, especially when there’s a pattern.

If you’re auditing your backlink profile or putting a potential service provider through their paces, here are some signs to look for.

1. New or young sites on blogging domains.

If you notice links from blogging subdomains (e.g., blogger.com) to your website, especially if they aren’t directly relevant, appear in high numbers without a nofollow attribute, or in some cases even use your website or brand name in the blog, this is a sign that someone was building shady links to your website.

This is a good indication of a PBN.

You should ask a link building service provider whether they create new websites to build links. This is a red flag.
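
A first pass at auditing for sign #1 can be automated. The sketch below uses hypothetical backlink data and an illustrative (non-exhaustive) list of free blogging hosts to flag links hosted on blogging subdomains:

```python
# Hypothetical backlink audit sketch: flag links hosted on free blogging
# subdomains, a common PBN signal. The host list and sample URLs are
# illustrative, not a definitive registry of blogging platforms.
from urllib.parse import urlparse

BLOG_HOSTS = (".blogspot.com", ".wordpress.com", ".weebly.com")

def flag_suspicious(backlinks):
    """Return backlink URLs hosted on free blogging subdomains."""
    flagged = []
    for url in backlinks:
        host = urlparse(url).netloc.lower()
        if host.endswith(BLOG_HOSTS):
            flagged.append(url)
    return flagged

sample = [
    "https://myrecipes.blogspot.com/post-about-plumbing",
    "https://reputable-news-site.com/article",
]
print(flag_suspicious(sample))  # only the blogspot link is flagged
```

In practice you would feed this an export from your backlink tool of choice and then manually review the flagged URLs, since not every blogging-subdomain link is spam.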

2. Many unnatural links from unrelated forums.

Links like this can indicate automated link building with bots. Generally, using UGC sites to build links is against the terms of service of those websites.

Usually, the strategy involves pretending to be a genuine user. If you have to pretend you’re someone you’re not, it’s a shady link.

3. Links from irrelevant websites and directories.

Relevance really does matter with links, and if you’re looking through a link profile and see domains that just don’t make sense, they bear investigation. For example, if you are a recipe publisher, a link from a plumber’s article is highly irrelevant and was likely the result of an unnatural link building technique.

However, if you add your website to relevant directories that have value from the users’ perspective, this can be totally fine. For example, you should add your restaurant website to Yelp, which is used by 32M active users who look for reviews before booking a reservation. Check our list of directories that still matter.

If you want to learn more about link building and its many pitfalls, check out SEJ’s ebook The Dark Side Of Link Building.

More resources: 


Featured Image: Jakub Krechowicz/Shutterstock

6 Web Hosts Ranked By Core Web Vitals: One Outperforms All via @sejournal, @martinibuster

HTTPArchive is offering a new technology comparison dashboard, currently in beta testing. Users can now view real-world web hosting performance scores for Core Web Vitals. We compare six web hosts and find one that consistently performs better across nearly all metrics.

About HTTPArchive

HTTPArchive tracks websites through crawling and with data collected in the Chrome User Experience Report (CrUX). It publishes reports about the technologies that power websites, including Core Web Vitals performance of content management systems like WordPress and Wix.

New Technology Comparison Dashboard – Beta

HTTPArchive has new reports under development, one of which compares Core Web Vitals and Lighthouse performance scores by web host. It also tracks median page weight by web host; that report is likewise still under development and in beta testing.

The new reports allow comparison by web host. There isn’t data yet for many web hosts, but there is for the following six. Note that comparing web hosts by Core Web Vitals is not a totally fair comparison: a host like IONOS might serve many thousands of small, local sites that aren’t resource intensive.

So with those caveats, here are the six web hosts under comparison:

  1. Bluehost
  2. GoDaddy
  3. HostGator
  4. IONOS
  5. SiteGround
  6. WP Engine

Core Web Vitals By Web Host

The following is a list of web hosts by the percentage of sites hosted at each one that pass Core Web Vitals. HTTPArchive says that this report is still under development and, as previously mentioned, the percentages don’t necessarily reflect the quality of the web hosts themselves, but rather the quality of the sites hosted there.

This is the description of the CWV metric scores:

“Passes Core Web Vitals
The percentage of origins passing all three Core Web Vitals (LCP, INP, CLS) with a good experience. Note that if an origin is missing INP data, it’s assessed based on the performance of the remaining metrics.”
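
The quoted definition, together with the metric thresholds cited elsewhere in this article (LCP ≤ 2.5 seconds, CLS ≤ 0.1, INP ≤ 200 ms), can be sketched as a simple pass/fail check:

```python
# Sketch of the pass/fail logic described above. Thresholds come from the
# metric definitions quoted in this article: LCP <= 2.5 s, CLS <= 0.1,
# INP <= 200 ms. A missing INP value is ignored, per the quoted note.
def passes_core_web_vitals(lcp_s, cls, inp_ms=None):
    checks = [lcp_s <= 2.5, cls <= 0.1]
    if inp_ms is not None:
        checks.append(inp_ms <= 200)
    return all(checks)

print(passes_core_web_vitals(2.1, 0.05, 180))  # True: all three good
print(passes_core_web_vitals(3.0, 0.05, 180))  # False: slow LCP
print(passes_core_web_vitals(2.1, 0.05))       # True: missing INP is ignored
```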

It’s interesting to see that the number one web host is a managed WordPress host, which may indicate that the platform itself is better optimized than a general web host. The following scores are based on a snapshot taken at the beginning of September.

Core Web Vitals Scores In Descending Order

  • WP Engine 70%
  • GoDaddy 67%
  • SiteGround 65%
  • HostGator 58%
  • Ionos 58%
  • Bluehost 45%

Largest Contentful Paint (LCP)

LCP measures the perceived page loading speed, how fast the page appears to load for a site visitor.

HTTPArchive defines this metric:

“Largest Contentful Paint (LCP) is an important, stable Core Web Vital metric for measuring perceived load speed because it marks the point in the page load timeline when the page’s main content has likely loaded—a fast LCP helps reassure the user that the page is useful. Good experiences are less than or equal to 2.5 seconds.”

WP Engine again comes out on top, perhaps indicating the quality of the sites hosted on that platform as well as the performance optimizations that are a key element of that web host.

LCP Scores In Descending Order

  • WP Engine 79%
  • GoDaddy 78%
  • SiteGround 75%
  • HostGator 69%
  • IONOS 69%
  • Bluehost 52%

Cumulative Layout Shift (CLS)

HTTPArchive also provides a comparison of the six web hosts by the CLS score. CLS measures how much a web page shifts around as it’s rendered in a web browser. A score of 0.1 or less for 75% of visitors is recommended. The percentages for each of the web hosts were all higher than the 75% minimum. This time WP Engine is tied for first place with HostGator.

CLS Scores In Descending Order

  • WP Engine 88%
  • HostGator 88%
  • Bluehost 87%
  • SiteGround 86%
  • IONOS 85%
  • GoDaddy 84%

First Contentful Paint (FCP)

FCP measures how long it takes for content to become visible; a low FCP means that content is rendered quickly. The number one ranked web host for FCP turns out to be GoDaddy, ahead by a significant six-point margin. WP Engine comes in second, followed by SiteGround.

FCP Scores In Descending Order

  • GoDaddy 73%
  • WP Engine 67%
  • SiteGround 62%
  • IONOS 60%
  • HostGator 57%
  • Bluehost 39%

Time To First Byte (TTFB)

TTFB measures how long it takes to download the first byte of a resource after the browser requests it. GoDaddy tops the list again.

TTFB In Descending Order

  • GoDaddy 59%
  • IONOS 45%
  • WP Engine 39%
  • HostGator 38%
  • SiteGround 37%
  • Bluehost 25%

Interaction to Next Paint (INP)

This metric represents the overall responsiveness of the entire web page.

HTTPArchive explains what this score means:

“INP is a metric that assesses a page’s overall responsiveness to user interactions by observing the latency of all click, tap, and keyboard interactions that occur throughout the lifespan of a user’s visit to a page. The final INP value is the longest interaction observed, ignoring outliers. A good experience is less than or equal to 200ms.”

The scores are the percentage of pages that provide a good INP experience. WP Engine is back on top for INP but the other five web hosts are not far behind.

INP Scores In Descending Order

  • WP Engine 95%
  • SiteGround 94%
  • Bluehost 92%
  • GoDaddy 90%
  • HostGator 89%
  • IONOS 88%

Lighthouse Performance Score

Lighthouse is an open-source auditing tool that scores web pages for performance, SEO, and other metrics. The performance scores for the six web hosts are fairly close to each other, clustering on either side of 40.

This is HTTPArchive’s description of this score:

“In general, only metrics contribute to your Lighthouse Performance score, not the results of Opportunities or Diagnostics.”

Interestingly, HostGator ranks highest for the Lighthouse Performance score, with GoDaddy and IONOS tied for second place. The other three tied for third, one point behind. Nevertheless, HostGator was the clear winner for this metric.

Lighthouse Performance Scores

  • HostGator 43
  • GoDaddy 40
  • IONOS 40
  • Bluehost 39
  • SiteGround 39
  • WP Engine 39

HostGator placed mid-pack for Core Web Vitals but tops the list for the Lighthouse Performance metric. WP Engine is clustered with two other web hosts at 39 points.

Lighthouse Accessibility Scores

The accessibility scores are clustered similarly to the performance scores, on either side of a score of 85.

This is how HTTPArchive describes this metric:

“The Lighthouse Accessibility score is a weighted average of all accessibility audits. Weighting is based on axe user impact assessments. Each accessibility audit is pass or fail. Unlike the Performance audits, a page doesn’t get points for partially passing an accessibility audit.”

Accessibility Scores In Descending Order

  • GoDaddy 87
  • Bluehost 86
  • WP Engine 86
  • SiteGround 86
  • HostGator 85
  • Ionos 85

Lighthouse SEO Scores

The SEO scores were even more tightly clustered, with GoDaddy scoring the highest of the six web hosts under comparison.

HTTPArchive describes what the SEO Score is measuring:

“These checks ensure that your page is following basic search engine optimization advice. There are many additional factors Lighthouse does not score here that may affect your search ranking, including performance on Core Web Vitals.”

SEO Scores In Descending Order:

  • GoDaddy 91
  • Bluehost 88
  • WP Engine 88
  • HostGator 88
  • IONOS 88
  • SiteGround 88

Lighthouse Best Practices Score

The last score is interesting because it measures whether the hosted sites are built with web development best practices. HTTPArchive doesn’t currently explain what those best practices are.

Here’s the description of this score:

“This ensures that your page is built using modern web development best practices.”

Best Practices Scores In Descending Order

  • Bluehost 79
  • HostGator 79
  • SiteGround 79
  • WP Engine 77
  • GoDaddy 77
  • IONOS 77

Takeaway

HTTPArchive is expanding what it measures. The performance dashboard is still in beta and under development, meaning it may have bugs but is ready for a public preview. It’s interesting to see a managed WordPress host come out on top. The scores will be more meaningful once more managed web hosts can be compared against each other. Nevertheless, this is a good start.

Visit the new dashboard here and provide your feedback to make it better.

Featured Image by Shutterstock/TierneyMJ

New Ecommerce Tools: November 14, 2024

This week, our rundown of new tools from companies offering services to ecommerce merchants includes updates on holiday marketing campaigns, drone deliveries, analytics and insights, search, video generators, and several AI-based platforms.

Got an ecommerce product release? Email releases@practicalecommerce.com.

New Tools for Merchants

Amazon opens Virtual Holiday Shop, a 3D shopping experience. Amazon has launched Virtual Holiday Shop, a virtual shopping experience that uses three-dimensional technology powered by the Amazon Beyond virtual store. Inside the shop, visitors experience music, animations, and a guided search for gifts. Visitors can add products directly to a cart and then check out as usual. Per Amazon, the Virtual Holiday Shop spotlights selections of the top 100-plus gifts, stocking stuffers, holiday decor, and premium products, including customer favorites.

Web page showing Amazon's Virtual Holiday Shop

Amazon’s Virtual Holiday Shop

WPForms launches AI-powered form builder for WordPress. WPForms, a WordPress plugin, has released an AI-powered form builder to automatically generate customizable forms for contact-us, surveys, registrations, and feedback. Users describe what they want through an AI chatbot, and WPForms AI generates a form. The builder can translate entire forms into multiple languages, automatically set up conditional logic, and tweak or adjust forms afterward.

eBay mobile app adds traffic and performance data. Performance Insights is now live on the eBay mobile app to help merchants understand and improve their businesses. With Performance Insights, sellers can view real-time traffic graphs, track listing views, and monitor click rates and traffic sources — all on the go.

CapCut launches a video content platform for ecommerce merchants. Short-form editing app CapCut by ByteDance has launched Commerce Pro, a platform for ecommerce sellers and creators to produce and scale ads and branded content. The AI video generator instantly converts the URL of a product into ad videos with links to the products. AI-generated presenters assist with product demonstrations, explainer videos, and more. AI models can virtually try on products and generate photos for showcasing.

Home page of CapCut Commerce

CapCut Commerce

Buy-now, pay-later provider Affirm expands to the U.K. Affirm, a U.S.-based fintech firm, has launched its BNPL loans in the U.K. According to Affirm, the U.K. offering will include both interest-free and interest-bearing payment options. Interest on its plans will be fixed and calculated on the original principal amount, so it won’t increase or compound. The U.K. expansion is Affirm’s first outside of the U.S. and Canada.

Coveo partners with Shopify on scalable AI search and commerce experiences. Coveo, a commerce experience platform that leverages search and generative AI, has partnered with Shopify to bring its AI capabilities to enterprise customers. Coveo says its platform enables Shopify enterprise merchants to manage AI models and strategies for search relevance, personalization, real-time recommendations, unified indexing, and generative shopper experiences for product discovery and session optimization.

Contentsquare expands its analytics platform with advanced AI features. Contentsquare, an analytics provider, has upgraded its AI-driven Experience Intelligence platform to help marketing, product, and tech teams work more efficiently — with flexible purchasing options for businesses of all sizes. According to Contentsquare, the genAI CoPilot offers immediate insights and recommended next steps, summaries of customer sentiment, automatic session replays, and more. Contentsquare has also added heatmaps, enhanced customer feedback, and expanded analysis.

Home page of Contentsquare

Contentsquare

Zenapse launches AI-powered marketing platform on Google Cloud Marketplace. Zenapse, an AI-powered marketing platform utilizing emotional intelligence, has launched the ZenImpact Optimization Studio on Google Cloud Marketplace. Zenapse states Google Cloud Marketplace users can access its AI-driven psychographic signals, which analyze consumer thoughts, feelings, and beliefs, combined with demographic and behavioral data, to predict in real-time which content, products, and offers will resonate most with an audience. This launch will help marketers enhance business outcomes and gain deeper customer insights.

DreamHost partners with ecommerce solution provider Ecwid by Lightspeed. DreamHost, a provider of web hosting and managed WordPress services, has partnered with Ecwid by Lightspeed, enabling individuals and businesses to set up online stores and scale their businesses through multiple channels. According to DreamHost, the partnership means customers can quickly set up an online store without technical expertise. The solution includes built-in real-time reporting, marketing tools, and integrations for scaling. DreamHost customers have immediate access to Ecwid’s free tier.

Brizy launches a page builder for Shopify. Brizy, a London-based developer of website-building tools, has launched a landing page builder for Shopify store owners. With its drag-and-drop interface, users can design custom pages, product showcases, and marketing materials without needing any coding skills. For a limited time, Shopify users can try Brizy’s free plan. Brizy’s library includes over 90 templates and advanced elements such as countdowns, pop-ups, and alert bars.

Amazon expands drone delivery in Arizona. Amazon is expanding Prime Air drone delivery in the West Valley of metro Phoenix. Customers who live near Amazon’s Same-Day site in Tolleson, Arizona, and purchase an eligible item weighing five pounds or less can have it delivered by drone in under an hour. Tolleson’s Same-Day Delivery site is a hybrid — part fulfillment center and part delivery station. Amazon’s new MK30 drones will deploy from the facility.

Photo of an Amazon Prime Air drone

Amazon Prime Air: Phoenix Metro Area Drone Deliveries

Microsoft’s AI SEO Tips: New Guidance For AI Search Optimization via @sejournal, @MattGSouthern

Microsoft has provided guidance on how to optimize content for AI-powered search engines.

This advice is timely now that OpenAI has launched ChatGPT Search, which uses Bing’s search index.

Understanding user intent is everything in this new era of search, Microsoft says:

“In the past, digital marketing strategies often relied heavily on demographic data and broad customer segments. But in this era of generative AI, the focus now shifts from who the customer is to what they are looking for—in real-time.”

Microsoft explains several ways websites can optimize content for AI-powered search.

AI SEO Recommendations

Intent-Based Content

Content should address the underlying purpose of user queries, Microsoft says:

“Focus on the intent behind the search query rather than just the keywords themselves. For example, if based on your keyword research, you find that users are searching for “how to choose eco-friendly coffee makers,” provide detailed, step-by-step guides rather than just general information.”

Natural Language Processing (NLP)

Websites should leverage NLP techniques to align content with how AI systems process and understand language.

Microsoft states:

“Generative engines, such as Bing Generative Search, deliver content to searchers by understanding and generating human language through Natural Language Processing (NLP). By analyzing vast amounts of text data to learn language patterns, context, and semantics, they’re able to provide relevant and accurate responses to user queries.”

Additionally, Microsoft emphasized the following sentence in italics:

“Leveraging these same NLP strategies in creating your content can optimize it to rank higher, increase its relevance, and enhance its authority, ultimately boosting its visibility and effectiveness.”

Strategic Keyword Implementation

To improve your website and landing pages for AI search engines, Microsoft recommends these keyword strategies:

  • Long-tail keywords for specific user interests
  • Conversational phrases matching natural speech patterns
  • Semantic keywords providing contextual relevance
  • Question-based keywords addressing common user queries

Freshness

Microsoft encourages keeping content updated and suggests using the IndexNow protocol to quickly notify search engines about website changes.

This helps maintain search rankings and ensures AI systems have the latest information.

Microsoft states:

“While it can be tempting to set it and forget it, AI systems depend on the latest, freshest information to determine the most relevant content to display to searchers. Regularly updating your content not only helps maintain your rankings but also keeps your audience engaged with current and valuable information. This practice can significantly influence how AI systems perceive and rank your website.”
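
As a rough sketch of the IndexNow protocol Microsoft recommends, the function below builds the JSON payload a submission expects. The host, key, and URL are placeholders; a real deployment must host the key file at the keyLocation URL it declares:

```python
# Hedged sketch of an IndexNow submission payload. Host, key, and URLs
# are placeholders, not real values. The payload is built but not sent.
import json

def build_indexnow_payload(host, key, urls):
    """Build the JSON body an IndexNow endpoint expects."""
    return {
        "host": host,
        "key": key,
        "keyLocation": f"https://{host}/{key}.txt",
        "urlList": list(urls),
    }

payload = build_indexnow_payload(
    "www.example.com",
    "abc123",  # placeholder key; must match a file hosted at keyLocation
    ["https://www.example.com/updated-page"],
)
# To submit, POST this as JSON to https://api.indexnow.org/indexnow
# (not executed here).
print(json.dumps(payload, indent=2))
```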

Why This Matters

ChatGPT Search now uses Bing’s index, making these optimization strategies vital for websites seeking better visibility in AI-powered searches.

While this can help you create more optimized content, Microsoft acknowledges there’s no “secret sauce” for AI search systems.

How To Get Indexed In ChatGPT Search

Refer to our article on ChatGPT search indexing to ensure your content is indexed in ChatGPT’s real-time search engine.

You can also watch the short video I recorded on this topic below:


Featured Image: jomel alos/Shutterstock

WordPress Security Plugin Vulnerability Endangers 4 Million+ Sites via @sejournal, @martinibuster

A critical vulnerability was discovered in a popular WordPress security plugin with over 4 million installations. The flaw allows attackers to log in as any user, including administrators, and gain full access to their site-level permissions. Assigned a threat score of 9.8 out of 10, it underscores the ease of exploitation and the potential for full site compromise, including malware injection, unauthorized content changes, and attacks on site visitors.

Really Simple Security

Really Simple Security is a WordPress plugin developed to improve the resistance of WordPress sites against exploits (known as security hardening), enable two-factor authentication, and detect vulnerabilities; it also generates an SSL certificate. One reason it promotes itself as lightweight is its modular design, which lets users choose which security enhancements to enable so that (in theory) the processes for disabled capabilities don’t load and slow down the website. This is a popular trend in WordPress plugins: software that can do many things but only does the tasks a user requires.

The plugin is promoted through affiliate reviews and, according to Google’s AI Overview, enjoys highly positive reviews. Over 97% of reviews on the official WordPress repository give the plugin five stars, the highest possible rating, and less than 1% give it one star.

What Went Wrong?

A security flaw in the plugin makes it vulnerable to authentication bypass, a class of flaw that allows an attacker to access areas of a website that require a username and password without providing credentials. The vulnerability in Really Simple Security allows an attacker to gain access as any registered user of the website, including the administrator, simply by knowing the username.

This is called an Unauthenticated Access Vulnerability, one of the most severe kinds of flaws because it is generally easier to exploit than an “authenticated” flaw, which requires an attacker to first obtain the username and password of a registered user.

Wordfence explains the exact reason for the vulnerability:

“The Really Simple Security (Free, Pro, and Pro Multisite) plugins for WordPress are vulnerable to authentication bypass in versions 9.0.0 to 9.1.1.1. This is due to improper user check error handling in the two-factor REST API actions with the ‘check_login_and_get_user’ function. This makes it possible for unauthenticated attackers to log in as any existing user on the site, such as an administrator, when the “Two-Factor Authentication” setting is enabled (disabled by default).

Wordfence blocked 310 attacks targeting this vulnerability in the past 24 hours.”
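
The bug class Wordfence describes can be illustrated with a small sketch. This is hypothetical Python, not the plugin’s actual PHP code: a credential check returns an error object on failure, and a caller that only tests truthiness treats that error object as a verified user, so anyone who knows a valid username gets in.

```python
# Illustrative sketch (NOT the plugin's actual code) of the bug class
# Wordfence describes: improper error handling in a user check.
class WPError:
    """Stand-in for WordPress's WP_Error object."""
    def __init__(self, message):
        self.message = message

def check_login_and_get_user(username, password, users):
    user = users.get(username)
    if user is None:
        return WPError("no such user")
    if user["password"] != password:
        return WPError("bad password")  # error object, not None/False
    return user

def two_factor_login(username, password, users):
    user = check_login_and_get_user(username, password, users)
    # BUG: a WPError instance is truthy, so this check passes and the
    # attacker is "authenticated" knowing only a valid username.
    return f"authenticated as {username}" if user else "denied"

def two_factor_login_fixed(username, password, users):
    user = check_login_and_get_user(username, password, users)
    # FIX: explicitly reject error objects before trusting the result.
    if user is None or isinstance(user, WPError):
        return "denied"
    return f"authenticated as {username}"

users = {"admin": {"password": "s3cret"}}
print(two_factor_login("admin", "wrong-password", users))        # attacker gets in
print(two_factor_login_fixed("admin", "wrong-password", users))  # denied
```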

Recommended Course Of Action:

Wordfence encourages users of the plugin to update to Really Simple Security version 9.1.2 or higher.

The Really Simple Security plugin’s changelog responsibly announces the reason for the updated software:

“Changelog
9.1.2
security: authentication bypass”

Read the Wordfence security advisory:

Really Simple Security (Free, Pro, and Pro Multisite) 9.0.0 – 9.1.1.1 – Authentication Bypass

Featured Image by Shutterstock/Tithi Luadthong

Google Maps Launches Product Search: Steps For Retailers via @sejournal, @MattGSouthern

Google has updated Maps to include product-specific search capabilities, just in time for the holiday shopping season.

This feature allows consumers to search for specific items and find nearby stores with available inventory.

Key Features

The update lets you search for products directly within Google Maps and find real-time inventory availability at nearby stores.

Shoppers can find a wide range of items and have immediate access to local pickup options.

In an announcement, Google states:

“Need a last-minute gift? Now, you can search in Google Maps for items like pickleball rackets, board games, ice cream makers and more to find nearby stores that have them in stock so you can pick up what you need, stat. No empty stockings here!”

Implementation Guide For Retailers

Retailers should focus on three key components to succeed with Google Maps’ product search feature.

1. Structured Data Markup

Local inventory visibility depends on using Schema.org markup correctly.

The example below shows the kind of structured data markup required for products to be visible on Google Maps:


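As a hypothetical illustration only, here is roughly what Schema.org Product markup with an availability signal looks like; the product name, SKU, and price are invented placeholders, and actual local-inventory eligibility depends on Google’s own structured data requirements:

```python
# Hypothetical Schema.org Product markup with availability; all field
# values are placeholders. Real requirements come from Google's
# structured-data documentation, not this sketch.
import json

product_markup = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Pickleball Racket",   # placeholder product
    "sku": "PR-1001",              # placeholder SKU
    "offers": {
        "@type": "Offer",
        "price": "49.99",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock",
    },
}

# Embed in a page as: <script type="application/ld+json"> ... </script>
print(json.dumps(product_markup, indent=2))
```
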
2. Inventory Management

To ensure this feature works well, accurate inventory tracking is important.

Follow these key steps to keep your data correct:

  • Regularly sync with Google Merchant Center.
  • Automate inventory updates.
  • Update inventory at least once every 24 hours.
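
The 24-hour rule above can be enforced with a trivial staleness check; the timestamps below are illustrative:

```python
# Simple staleness check for the 24-hour inventory sync rule above.
# Timestamps are illustrative placeholders.
from datetime import datetime, timedelta

def inventory_is_stale(last_sync, now=None, max_age_hours=24):
    """True if the last sync is older than the allowed maximum age."""
    now = now or datetime.now()
    return now - last_sync > timedelta(hours=max_age_hours)

now = datetime(2024, 11, 14, 12, 0)
print(inventory_is_stale(datetime(2024, 11, 13, 11, 0), now))  # True: >24h old
print(inventory_is_stale(datetime(2024, 11, 14, 9, 0), now))   # False: fresh
```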

3. Store Location Integration

Accurate store location information helps customers find you after discovering your products.

Here are the key elements to focus on:

  • Ensure your Google Business Profile is correct.
  • Keep your geocoding information accurate.
  • Update your operating hours regularly.
  • Adjust pickup availability settings as needed.

Looking Ahead

For retailers aiming to make the most of this feature, here are the recommended steps to follow:

  1. Verify the setup of your Google Merchant Center.
  2. Implement the necessary structured data.
  3. Test the transmission of your data.
  4. Monitor performance using Google Search Console.
  5. Regularly update your inventory feeds.

This update can help drive more customers to physical stores during the holiday shopping season and beyond.

For more information, Google provides detailed guides in the Merchant Center help section.


Featured Image: Rokas Tenys/Shutterstock

Insights From IAB’s Cross-Channel Measurement For Marketers Release via @sejournal, @gregjarboe

In the digital era, consumers interact with brands through a range of platforms and devices — such as social media, display ads, and video on mobile, desktop, tablet, and connected TV (CTV).

This diversity in touchpoints creates both opportunities and complexities for marketers. To navigate this landscape effectively, a robust measurement strategy is essential.

This morning, IAB released two new guides: “Implementing Cross-Channel Measurement for Marketers Playbook” and “Cross-Channel Measurement Best Practices for Marketers.”

The resources from IAB offer marketers detailed strategies for enhancing cross-channel measurement to achieve better business results.

From these guides, let me share some of the strategic insights and tactical advice that they offer marketers.

Implementing Cross-Channel Measurement For Marketers Playbook

The IAB’s “Implementing Cross-Channel Measurement for Marketers Playbook” provides a step-by-step approach to implementing cross-channel measurement for successful outcomes.

This guide lays out a comprehensive approach to building a unified measurement strategy, from setting goals and key performance indicators (KPIs) to using advanced attribution and ensuring privacy compliance.

It stresses the importance of regular audits and team collaboration, enabling marketers to keep up with industry trends and improve measurement tactics continually.

By following these steps, marketers can address common challenges, gain a comprehensive understanding of their marketing activities, and improve business results.

An effective cross-channel measurement strategy unifies campaign insights, allowing marketers to see how different channels contribute to success.

Integrating data from various sources offers a holistic view of consumer behavior, enables media budget optimization, and enhances customer experiences.

However, setting up a successful cross-channel measurement approach requires careful planning, diverse tool integration, and a commitment to continuous improvement.

By sidestepping common mistakes and following best practices, marketers can maximize media efficiency, elevate customer experiences, and improve business outcomes.

Focusing on continuous improvement through regular audits, active stakeholder involvement, and adapting to industry trends is essential for sustained success in an ever-changing digital environment.

Key points from the playbook include:

  • Setting Clear Objectives and KPIs: Defining SMART (specific, measurable, attainable, relevant, and time-bound) goals and KPIs is fundamental for alignment and clarity in measurement strategies. It’s also critical that these metrics align with strategic decisions – whether targeting audiences, gathering consumer insights, crafting messaging, or designing campaign approaches.
  • Developing a Unified Data Strategy: Achieving a holistic view of customer interactions across channels depends on making data interoperable and ensuring its quality. ETL (extract, transform, and load) processes allow integration from multiple sources. Beyond bringing data together, accessibility and seamless connectivity are essential for deep insights and well-informed decision-making.
  • Implementing Advanced Analytics and Attribution Models: Selecting suitable attribution models, employing predictive analytics, and using machine learning help marketers accurately assess the impact of each channel and refine their strategies.
  • Ensuring Privacy Compliance and Data Integrity: Adherence to privacy laws like GDPR and CCPA, implementing consent management systems, and maintaining data accuracy are key to protecting user privacy and ensuring data reliability.
  • Fostering Continuous Improvement and Collaboration: Regular data audits, setting performance benchmarks, engaging stakeholders, promoting team collaboration, and adapting to industry trends are vital for the ongoing refinement and effectiveness of cross-channel measurement strategies.
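The "unified data strategy" point above hinges on ETL: pulling rows from each channel's export, mapping them onto one shared schema, and loading the result somewhere queryable. A minimal sketch of that idea, using made-up field names (no specific platform's export format is assumed) and a simple in-memory "load" step in place of a real warehouse write:

```python
# Minimal ETL sketch: unify campaign rows from two hypothetical channel
# exports (a social feed and a programmatic feed) into one shared schema.
# All field and channel names are illustrative, not from any real platform.

def extract():
    """Simulate pulling raw rows from two channel sources."""
    social = [{"post_id": "s1", "clicks": 120, "cost_usd": 45.0}]
    programmatic = [{"imp_id": "p9", "click_count": 80, "spend": 30.0}]
    return social, programmatic

def transform(social, programmatic):
    """Map each source's differing field names onto one common schema."""
    unified = []
    for row in social:
        unified.append({"channel": "social",
                        "clicks": row["clicks"],
                        "spend": row["cost_usd"]})
    for row in programmatic:
        unified.append({"channel": "programmatic",
                        "clicks": row["click_count"],
                        "spend": row["spend"]})
    return unified

def load(unified):
    """'Load' here just totals the unified rows; a real pipeline would
    write them to a warehouse table instead."""
    return {
        "total_clicks": sum(r["clicks"] for r in unified),
        "total_spend": sum(r["spend"] for r in unified),
    }

social, programmatic = extract()
report = load(transform(social, programmatic))
print(report)  # {'total_clicks': 200, 'total_spend': 75.0}
```

The transform step is where interoperability is won or lost: once every source speaks the same schema, cross-channel totals and comparisons become straightforward.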

Cross-Channel Measurement Best Practices for Marketers

This resource explores the challenges of today’s complex digital advertising environment, offering practical advice on data integration, attribution, advanced analytics, privacy compliance, and ongoing optimization.

Digital advertising is increasingly complex as consumers engage with brands across a variety of channels and devices.

These touchpoints include direct media buys, social media, and programmatic campaigns across desktop, mobile web, mobile apps, and connected TVs, among others. This calls for a comprehensive measurement approach to achieve a unified view of marketing performance across all channels.

Cross-channel measurement is not only a technical necessity but also a strategic priority.

Marketers who excel at it gain insight into their campaigns’ overall impact, optimize media spending, and enhance customer experiences.

A unified view of consumer interactions allows for data-driven decision-making, leading to higher ROI, better conversion rates, and stronger brand loyalty.

However, achieving this unified perspective presents challenges. Data silos, complex attribution, technological limitations, and privacy regulations can all impede effective measurement.

This guide offers marketers practical insights and best practices to overcome these obstacles, laying the groundwork for a successful cross-channel measurement strategy in today’s dynamic digital environment.

Why Cross-Channel Measurement Matters For Brand Advertisers And Agencies

Cross-channel measurement is essential for brand advertisers and agencies for several reasons:

  • Comprehensive Understanding of the Consumer Journey: It reveals how different touchpoints drive advertising success, enabling marketers to see the entire customer journey and understand how each interaction affects conversions.
  • Effective Budget Allocation: By identifying the most impactful channels, advertisers can allocate budgets more effectively, ensuring investments are directed toward channels with the highest ROI.
  • Refined Creative Strategies: Cross-channel insights allow marketers to improve creative strategies and messaging. By assessing content performance across platforms, they can tailor messaging to resonate better with target audiences.
  • Quantifying Marketing Impact: Overcoming data silos and using advanced attribution models helps advertisers quantify the true impact of their efforts, providing clear ROI evidence and justifying media spending.
  • Privacy Compliance and Trust: In a privacy-focused world, adhering to data regulations through rigorous practices builds consumer trust and protects brand reputation.
  • Informed Decision-Making: Effective cross-channel measurement empowers advertisers to make data-driven decisions, leading to more successful strategies, stronger brand loyalty, and better business results.
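The attribution point above is easiest to see with a toy example. The sketch below compares two of the most common models, last-touch and linear, over one hypothetical conversion path; the channel names are illustrative, and real measurement platforms offer many more models (time-decay, position-based, data-driven):

```python
# Toy comparison of two common attribution models over a single
# hypothetical conversion path. Channel names are illustrative.

def last_touch(path):
    """All conversion credit goes to the final touchpoint."""
    credit = {ch: 0.0 for ch in path}
    credit[path[-1]] = 1.0
    return credit

def linear(path):
    """Credit is split evenly across every touchpoint."""
    credit = {ch: 0.0 for ch in path}
    share = 1.0 / len(path)
    for ch in path:
        credit[ch] += share
    return credit

# One user's touchpoints, in the order they occurred.
path = ["display", "social", "search"]

print(last_touch(path))  # {'display': 0.0, 'social': 0.0, 'search': 1.0}
print(linear(path))      # each channel receives 1/3 of the credit
```

The contrast illustrates why model choice matters for budget allocation: last-touch would funnel the entire budget toward search here, while linear would reward the upper-funnel display and social touches that started the journey.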

Why Cross-Channel Measurement Matters For Publishers And Platforms

For publishers and platforms, cross-channel measurement is crucial for these reasons:

  • Showcasing Value to Advertisers: It provides detailed performance data to show how your channels contribute to campaign success, making inventory more appealing and building trust-based relationships with buyers.
  • Boosting Revenue Growth: By highlighting the effectiveness of your media inventory through cross-channel insights, you can drive revenue growth, as advertisers are likely to invest in high-performing channels.
  • Optimizing Content and Ad Delivery: Insights into cross-channel interactions allow for refined content and ad strategies, enhancing user engagement and maximizing ad effectiveness to better meet market demands.
  • Identifying High-Performing Content: With accurate attribution and analytics, publishers can identify top-performing content formats, enabling them to adjust and optimize offerings in line with audience preferences and market trends.
  • Ensuring Privacy Compliance: Strong privacy practices protect user data – maintaining platform reputation – and ensure adherence to regulatory requirements.
  • Staying Competitive: In a data-driven market, effective cross-channel measurement enables publishers to remain competitive, fostering innovation and continuous improvement to deliver exceptional value to advertisers and audiences.

These are some of the strategic insights and tactical tips I gleaned during my first dive into the guides.

Marketers will want to dive even deeper themselves. To do that, they can find both of IAB’s new resources here.

Featured Image: Visual Generation/Shutterstock