The Future Of Content In An AI World: Provenance & Trust In Information

When Emily Epstein shared her perspective on LinkedIn about how “people didn’t stop reading books when encyclopedias came out,” it sparked a conversation about the future of primary sources in an AI-driven world.

In this episode, Katie Morton, Editor-in-Chief of Search Engine Journal, and Emily Anne Epstein, Director of Content at Sigma, dig into her post and unpack what AI really means for publishers, content creators, and marketers now that AI tools present shortcuts to knowledge.

Their discussion highlights the importance of provenance, the layers involved in online knowledge acquisition, and the need for more transparent editorial standards.

If you’re a content creator, this episode can help you gain insight into how to provide value as the competition for attention becomes a competition for trust.

Watch the video or read the full transcript below:

Katie Morton: Hello, everybody. I’m Katie Morton, Editor-in-Chief of Search Engine Journal, and today I’m sitting down with Emily Anne Epstein, Director of Content at Sigma. Welcome, Emily.

Emily Anne Epstein: Thanks so much. I’m so excited to be here.

Katie: Me too. Thanks for chatting with me. So Emily wrote a really excellent post on LinkedIn that caught my attention. Emily, for our audience, would you mind summarizing that post for us?

Emily: So this should feel both shocking and non-shocking to everybody. But the idea is, people didn’t stop reading books when encyclopedias came out. And this is a response to the hysteria that’s going on with the way AI tools are functioning as summarizing devices for complicated and complex situations. And so the idea is, just because there’s a shortcut now to acquiring knowledge, it doesn’t mean we’re getting rid of the need for primary sources and original sources.

These two different types of knowledge acquisition exist together, and they layer on top of one another. You may start your book report with an encyclopedia or ChatGPT search, but what you find there doesn’t matter if you can’t back it up. You can’t just say in a book report, “I heard it in Encarta.” Where did the information come from? I think about the way this is going to transform search: There’s simply going to be layers now.

Maybe start your search with an AI tool, but you’ll need to finish somewhere else that organizes primary sources, provides deeper analysis, and even shows contradictions that go into creating knowledge.

Because a lot of what these synthesized summaries do is present a calm, “impartial” view of reality. But we all know that’s not true. All knowledge is biased in some way because it cannot be “all-containing.”

The Importance Of Provenance

Katie: I want to talk about something you mentioned in your LinkedIn post: provenance. What needs to happen, whether culturally, editorially, or socially, for “show me the source material” to become standard in AI-assisted search?

With Wikipedia or encyclopedias, ideally, people should still cite the original source, go deeper into the analysis, and be able to say, “Here’s where this information came from.” How do we get there so people aren’t just skimming surface-level summaries and taking them as gospel?

Emily: First, people need to use these tools, and there needs to be a reckoning with how reliable they are. Thinking about provenance means thinking about knowledge acquisition as triangulation. When I was a journalist, I had to balance hearsay, direct quotes, press releases, and social media.

You create your story from a variety of sources so that you get something in the middle that can explain multiple truths and realities. That comes from understanding that truth has never been linear, and reality is fracturing.

What AI does, even more advanced than that, is deliver personalized responses. People are prompting their models differently, so we’re all working from different sets of information and getting different answers. Once reality is fractured to that degree, knowing where something comes from – the provenance – becomes essential for context.

And triangulation won’t just be important for journalists; it’s going to be important for everyone because people make decisions based on the information that they receive.

If you get bad inputs, you’ll get bad outputs, make bad decisions, and that affects everything from your work to your housing. People will need to triangulate a better version of reality that is more accurate than what they’re getting from the first person or the first tool they asked.

Creators: From Competing For Attention To Competing For Trust

Katie: So if AI becomes the top layer in how people access information – designed to hold attention within its own ecosystem – what does that mean for content creators and publishers? It feels like they’re creating a commodity that AI then repackages as its own.

How do you see that playing out for creators in terms of revenue and visibility?

Emily: Instead of competing for attention, creators and publishers will compete for trust. That means making editorial standards more transparent. They’re going to have to show the work that they’re doing, because with most AI tools, you don’t see how they work; it’s a bit of a black box.

But if creators can serve as a “blockchain” (a verifiable ledger of information sources), showing their sources and methods, that will be their value.

Think about photography. When it first came out, it was considered a science. People thought photos were pure fact. Then, darkroom techniques like dodging and burning or combining multiple exposures showed that photos could lie.

And when photography became an art form, people realized that the photographer’s role was to provide a filter. That’s where we are with AI. There are filters on every piece of information that we receive.

And those organizations that make their filter transparent are going to be more successful, and people will return to them because again, they’re getting better information. They know where it’s coming from, so they can make better decisions and live better lives.

AI Hallucinations & Deepfakes

Emily: It was a shocking moment in the history of photography that people could lie with photographs. And that’s sort of where we are right now. Everybody is using AI, and we know there are hallucinations, but we have to understand that we cannot trust this tool, generally speaking, unless it shows its work.

Katie: And the risks are real. We’re already seeing AI voiceovers and video deepfakes mimicking creators, often without their consent.

Inspiring People To Go Deeper

Katie: In your post, you ended with “people still doing the work of deciding what’s enough.” In an attention economy of speed and convenience, how do we help people go deeper?

Emily: The idea that people don’t want to go deeper flies in the face of Wikipedia rabbit holes. People start with summarized information, but then they click a citation, keep going further, watch another show, keep digging.

People want more of what they want. If you give them a breadcrumb of fascinating information, they’ll want more of that. Knowledge acquisition has an emotional side. It gives you dopamine hits: “I found that, that’s for me.”

And as content marketers, we have to provide that value for people where they say, ‘Wow, I am smarter because of this information. I like this brand because this brand has invested in my intelligence and my betterment.’

And for content creators, that needs to be the gold star.

Wrapping Up

Katie: Right on. For those who want to follow your work, where can they find you?

Emily: I’m dialoguing and writing my thoughts on AI out loud and in public on LinkedIn. Come join me, and let’s think out loud together.

Katie: Sounds great. And I’m always at searchenginejournal.com. Thank you so much, Emily, for taking the time today.

Emily: Thank you!

Featured Image: Paulo Bobita/Search Engine Journal

Google’s Robby Stein Names 5 SEO Factors For AI Mode via @sejournal, @martinibuster

Robby Stein, Vice President of Product for Google Search, recently sat down for an interview where he answered questions about how Google’s AI Mode handles quality, how Google evaluates helpfulness, and how it leverages its experience with search to identify which content is helpful, including metrics like clicks. He also outlined five quality SEO-related factors used for AI Mode.

How Google Controls Hallucinations

Stein answered a question about hallucinations, instances where an AI fabricates information in its answers. He said that the quality systems within AI Mode are built on everything Google has learned about quality from 25 years of classic search: the systems that determine which links to show and whether content is good are encoded within the model.

The interviewer asked:

“These models are non-deterministic and they hallucinate occasionally… how do you protect against that? How do you make sure the core experience of searching on Google remains consistent and high quality?”

Robby Stein answered:

“Yeah, I mean, the good news is this is not new. While AI and generative AI in this way is frontier, thinking about quality systems for information is something that’s been happening for 20, 25 years.

And so all of these AI systems are built on top of those. There’s an incredibly rigorous approach to understanding, for a given question, is this good information? Are these the right links? Are these the right things that a user would value?

What’s all the signals and information that are available to know what the best things are to show someone. That’s all encoded in the model and how the model’s reasoning and using Google search as a tool to find you information.

So it’s building on that history. It’s not starting from scratch because it’s able to say, oh, okay, Robbie wants to go on this trip and is looking up cool restaurants in some neighborhood.

What are the things that people who are doing that have been relying on on Google for all these years? We kind of know what those resources are we can show you right there. And so I think that helps a lot.

And then obviously the models, now that you release the constraint on layout, obviously the models over time have also become just better at instruction following as well. And so you can actually just define, hey, here are my primitives, here are my design guidelines. Don’t do this, do this.

And of course it makes mistakes at times, but I think just the quality of the model has gotten so strong that those are much less likely to happen now.”

Stein’s explanation makes clear that AI Mode is encoded with everything learned from Google’s classic search systems rather than being a rebuild from scratch or a break from them. The risk of hallucinations is managed by grounding AI answers in the same relevance, trust, and usefulness signals that have underpinned classic search for decades. Those signals continue to determine which sources are considered reliable and which information users have historically found valuable. Accuracy in AI search follows from that continuity, with model reasoning guided by longstanding search quality signals rather than operating independently of them.

How Google Evaluates Helpfulness In AI Mode

The next question is about the quality signals that Google uses within AI Mode. Robby Stein’s answer explains that the way AI Mode determines quality is very much the same as with classic search.

The interviewer asked:

“And Robbie, as search is evolving, as the results are changing and really, again, becoming dynamic, what signals are you looking at to know that the user is not only getting what they want, but that is the best experience possible for their search?”

Stein answered:

“Yeah, there’s a whole battery of things. I mean, we look at, like we really study helpfulness and if people find information helpful.

And you do that through evaluating the content kind of offline with real people. You do that online by looking at the actual responses themselves.

And are people giving us thumbs up and thumbs downs?

Are they appreciating the information that’s coming?

And then you kind of like, you know, are they using it more? Are they coming back? Are they voting with their feet because it’s valuable to you.

And so I think you kind of triangulate, any one of those things can lead you astray.

There’s lots of ways that, interestingly, in many products, if the product’s not working, it may also cause you to use it more.

In search, it’s an interesting thing.

We have a very specific metric that manages people trying to use it again and again for the same thing.

We know that’s a bad thing because it means that they can’t find it.

You got to be really careful.

I think that’s how we’re building on what we’ve learned in search, that we really feel good that the things that we’re shipping are being found useful by people.”

Stein’s answer shows that AI Mode evaluates success using the same core signals used for search quality, even as the interface becomes more dynamic. Usefulness is not inferred from a single engagement signal but from a combination of human evaluation, explicit feedback, and behavioral patterns over time.

Importantly, Stein notes that heavy usage, presumably within a single session, is not treated as success on its own, since repeated attempts at the same query indicate failure rather than satisfaction. The takeaway is that AI Mode’s success is judged by whether users are satisfied, and that it uses quality signals designed to detect friction and confusion as much as positive engagement. This carries continuity over from classic search rather than redefining what usefulness means.
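Stein’s repeated-search signal is easy to picture in code. The sketch below is a minimal, hypothetical illustration of the idea rather than Google’s metric: it flags sessions in which the same normalized query is issued more than once and treats the share of such sessions as a friction signal. The function names and sample data are assumptions.

```python
from collections import Counter

def normalize(query: str) -> str:
    """Collapse case and whitespace so trivially different repeats match."""
    return " ".join(query.lower().split())

def repeat_query_rate(sessions):
    """Share of sessions in which any query was issued more than once.

    A high rate is read as a failure signal: users re-asking the same thing
    because the first results didn't satisfy them. (Hypothetical metric,
    loosely modeled on Stein's description, not Google's implementation.)
    """
    if not sessions:
        return 0.0
    failed = 0
    for queries in sessions:
        counts = Counter(normalize(q) for q in queries)
        if any(n > 1 for n in counts.values()):
            failed += 1
    return failed / len(sessions)

# Toy example: the second session repeats "best running shoes".
sessions = [
    ["best running shoes", "running shoe reviews"],
    ["best running shoes", "best running shoes", "best running shoes 2025"],
]
print(repeat_query_rate(sessions))  # 0.5
```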

Five Quality Signals For AI Search

Lastly, Stein answers a question about the ranking of AI-generated content and whether SEO best practices still help for ranking in AI. His answer includes five factors used to determine whether a website meets Google’s quality and helpfulness standards.

Stein answered:

“The core mechanic is the model takes your question and reasons about it, tries to understand what you’re trying to get out of this.

It then generates a fan-out of potentially dozens of queries that are being Googled under the hood. That’s approximating what information people have found helpful for those questions.

There’s a very strong association to the quality work we’ve done over 25 years.

Is this piece of content about this topic?

Has someone found it helpful for the given question?

That allows us to surface a broader diversity of content than traditional Search, because it’s doing research for you under the hood.

The short of it is the same things apply.

  1. Is your content directly answering the user’s question?
  2. Is it high quality?
  3. Does it load quickly?
  4. Is it original?
  5. Does it cite sources?

If people click on it, value it, and come back to it, that content will rank for a given question and it will rank in the AI world as well.”
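The fan-out mechanic Stein describes maps onto a familiar retrieval pattern: expand one question into several narrower queries, run each of them, and merge the results. The sketch below is a conceptual illustration only; expand_query and search are hypothetical stand-ins, not Google APIs, and the hard-coded data exists purely to make the example runnable.

```python
def expand_query(question: str) -> list:
    """Hypothetical query fan-out: derive narrower sub-queries from one question.
    A real system would use a language model; here it's hard-coded for illustration."""
    return [
        question,
        f"{question} reviews",
        f"{question} best options",
    ]

def search(query: str) -> list:
    """Stand-in for a search backend; returns (url, score) records."""
    fake_index = {
        "italian restaurants in soho": [{"url": "https://example.com/soho-guide", "score": 0.9}],
        "italian restaurants in soho reviews": [{"url": "https://example.com/reviews", "score": 0.7}],
        "italian restaurants in soho best options": [{"url": "https://example.com/soho-guide", "score": 0.8}],
    }
    return fake_index.get(query, [])

def fan_out(question: str, top_n: int = 3) -> list:
    """Run every sub-query, merge the results, and keep the best-scoring unique URLs."""
    merged = {}
    for sub_query in expand_query(question):
        for hit in search(sub_query):
            merged[hit["url"]] = max(merged.get(hit["url"], 0.0), hit["score"])
    return sorted(merged, key=merged.get, reverse=True)[:top_n]

print(fan_out("italian restaurants in soho"))
# ['https://example.com/soho-guide', 'https://example.com/reviews']
```

Deduplicating by URL and keeping the best score per page is one simple way to merge results from overlapping sub-queries.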

Watch the interview starting at about the one hour and twenty-three minute mark:

Let’s Be Honest About The Ranking Power Of Links via @sejournal, @martinibuster

What link building should be trying to accomplish, in my opinion, is proving that a site is trustworthy and making sure the machine understands what topic your web pages fit into. The way to communicate trustworthiness is to be careful about what sites you obtain links from and to be super careful about what sites your site links out to.

Context Of Links Matter

Maybe it doesn’t have to be said, but I’ll say it: it’s more important now than ever that the page your link is on has relevant content and that the context for your link is an exact match for the page being linked to.

Outgoing Links Can Signal A Site Is Poisoned

Also make sure that the outgoing links are to legitimate sites, not to sites that are low quality or in problematic neighborhoods. If those kinds of links are anywhere on the site it’s best to consider the entire site poisoned and ignore it.

The reason I say to consider the site poisoned is the link distance ranking algorithm concept where inbound links tell a story about how trustworthy a site is. Low quality outbound links are a signal that something’s wrong with the site. It’s possible that a site like that will have its ability to pass PageRank removed.

Reduced Link Graph

This is how the Reduced Link Graph works, where the spammy sites are kicked out of the link graph and only the legit sites are kept for ranking purposes and link propagation. The link graph can be thought of as a map of the internet with websites connected to each other by links. When you kick out the spammy sites that’s called the reduced link graph.
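To make the idea concrete, here is a toy sketch of the filtering step: remove the sites flagged as spammy (including sites considered poisoned by their own low-quality outbound links) and keep only the edges between the sites that remain. The graph, the spam labels, and the function name are all invented for illustration; this is not any search engine's actual implementation.

```python
def reduce_link_graph(links, spammy):
    """Return a copy of the link graph with spammy sites (and edges to them) removed.

    `links` maps each site to the set of sites it links out to.
    Sites that link into problematic neighborhoods would typically end up
    flagged in `spammy` themselves and be dropped along with their links.
    """
    kept = {site for site in links if site not in spammy}
    return {site: links[site] & kept for site in kept}

# Toy graph: siteC links out to a spam network, so it is treated as poisoned.
links = {
    "siteA": {"siteB"},
    "siteB": {"siteA", "siteC"},
    "siteC": {"spam1", "spam2"},
    "spam1": {"spam2"},
    "spam2": {"spam1"},
}
spammy = {"siteC", "spam1", "spam2"}

print(reduce_link_graph(links, spammy))
# e.g. {'siteA': {'siteB'}, 'siteB': {'siteA'}}
```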

Search engines are at a point where they can rank websites based on the content alone. Links still matter, but the content itself is now the highest-level ranking factor. I suspect that, in general, the link signal isn’t very healthy right now. Fewer people are blogging across all topics. Some topics have a healthy blogging ecosystem, but in general there aren’t professors blogging about technology in the classroom and there aren’t HR executives sharing workplace insights and so on like there used to be ten or fifteen years ago.

Links For Inclusion

I’m of the opinion that links increasingly are useful for determining if a site is legit, high quality, and trustworthy, deeming it worthy for consideration in the search results. In order to stay in the SERPs it’s important to think about the outbound links on your site and the sites you obtain links from. Think in terms of reduced link graphs, with spammy sites stuck on the outside within their own spammy cliques and the non-spam on the inside within the trusted Reduced Link Graph.

In my opinion, you must be in the trusted Reduced Link Graph in order to stay in play.

Is Link Building Over?

Link building is definitely not over. Links are still important. What needs to change is how links are acquired. The age of blasting out emails at scale is over. There aren’t enough legitimate websites to make that worthwhile. It’s better to be selective and targeted about which sites you get a (free) link from.

Something else that’s becoming increasingly important is citations, other sites talking about your site. An interesting thing right now is that sponsored articles, sometimes known as native advertising, will get cited in AI search engines, including Google AI Overviews and AI Mode. This is a great way to get a citation in a way that will not hurt your rankings as long as the sponsored article is clearly labeled as sponsored and the outbound links are nofollowed.

Takeaways

  • Links As Trust And Context Signals, Not Drivers Of Ranking
    Links increasingly function to confirm that a site is legitimate and topically aligned, rather than to directly push rankings through volume or anchor text manipulation as in the old days.
  • The Reduced Link Graph Matters
    Search engines filter out spammy or low-quality sites, leaving a smaller trusted network where links and associations still count. Being outside this trusted graph puts sites at risk of exclusion.
  • Content Matters, Links Qualify
    Search engines can rank many pages based on content alone, but links can still act as a gatekeeper for credibility and inclusion, especially for competitive topics.
  • Outbound Links Are A Risk Signal
    Linking out to low-quality or problematic sites can damage a site’s perceived trustworthiness and its ability to pass value.
  • Traditional Link Building Is Obsolete
    Scaled outreach, anchor text strategies, and chasing volume are ineffective in an AI-driven search environment.
  • Citations Are Rising In Importance
    Mentions and discussions of a website can help a site rank better in AI search engines.
  • Sponsored Articles
    Sponsored articles that are properly labeled as sponsored content and contain nofollowed links are increasingly surfaced in AI search features and contribute to visibility.

Link building is still relevant, but not in the way it used to be. Its function now is likely more about establishing whether a site is legitimate and clearly associated with a real topic area, not to push rankings through volume, anchors, or scale. Focusing on clean outbound links, selective relationships with trusted sites, and credible citations keeps a site inside the trusted reduced link graph, which is the condition that allows strong content to compete and appear in both traditional search results and AI-driven search surfaces.

Featured Image by Shutterstock/AYO Production

Google Says What To Tell Clients Who Want SEO For AI via @sejournal, @martinibuster

Google’s Danny Sullivan offered advice to SEOs who have clients asking for updates on what they’re going to do for AI SEO. He acknowledged it’s easier to give the advice than it is to have to actually tell clients, but he also said that advancements in content management systems drive technical SEO into the background, enabling SEOs and publishers to focus on the content.

What To Tell Clients

Danny Sullivan acknowledged that SEOs are in a tough spot with clients. He didn’t suggest specifics for how to rank better in AI search (although later in the podcast he did offer suggestions on that front).

But he did offer suggestions for what to tell clients.

Danny explained:

“And the other thing is, and I’ve seen a number of people remark on this, is this concern that, well, I’ve been doing SEO, but now I’m getting clients or people saying to me, but I need the new stuff. I need the new stuff. And I can’t just tell them it’s the same old stuff.

So I don’t know if you feel like you need to dress it up a bit more, but I think the way you dress it up is to say, These are continuing to be the things that are going to make you successful in the long-term. I get you want the fancy new type of thing, but the history is that the fancy new type of thing doesn’t always stick around if we go off and do these particular types of things…

I’m keeping an eye on it, but right now, the best advice I can tell you when it comes to how we’re going to be successful with our AEO is that we continue on doing the stuff that we’ve been doing because that is what it’s built on.

Which is easy for me to say ’cause I don’t got someone banging on the door to say, Well, actually we do. And so we are doing that.

So that’s why, as part of the podcast, it’s just to kind of reassure that, look, just because the formats are changing didn’t mean you have to change everything that you had to do and that everything you had to shift around.”

Downside Of Prioritizing AEO/GEO For AI Search Visibility

There are many in the SEO community who are suggesting fairly spammy things to do to rank better in AI chatbots like ChatGPT, such as creating listicles that recommend themselves as the best whatever. Others are doing things like tweaking keyword phrases, the kind of thing SEOs stopped doing by 2005 or 2006.

The problem with making dramatic changes to content in order to rank better in chatbots is that ChatGPT, Perplexity, and Anthropic Claude’s search traffic share is a fraction of a percent for each of them, with Claude close to zero and ChatGPT estimated at 0.2% to 0.5%.

So it absolutely makes zero sense to prioritize AEO/GEO over Google and Bing search at this point because the return on the investment is close to zero. It’s a different story when it comes to Google AI Overviews and AI Mode, but the underlying ranking systems for both AI interfaces remain Google’s classic search.

Danny shared that focusing on things that are specific to AI risks complicating what should be simple.

Google’s Danny Sullivan shared:

“And in fact, that the more that you dramatically shift things around, and start doing something completely different, or the more that you start thinking I need to do two different things, the more that you may be making things far more complicated, not necessarily successful in the long term as you think they are.”

Technical SEO Is Needed Less?

John Mueller followed up by mentioning that the advanced state of content management systems today means that SEOs and publishers no longer have to spend as much time on technical SEO issues because most CMSs have the basics of SEO handled virtually out of the box. Danny Sullivan said that this frees up SEOs and creators to focus on their content, which he insisted will be helpful for ranking in AI search surfaces.

John Mueller commented:

“I think that makes a lot of sense. I think one of the things that perhaps throws SEOs off a little bit is that in the early days, there was a lot of almost like a technical transition where people initially had to do a lot of technical specific things to make their site even kind of accessible in search. And at some point nowadays, I think if you’re using a popular CMS like WordPress or Wix or any of them, basically you don’t have to worry about any of those technical details.

So it’s almost like that technical side of things is a lot less in the foreground now, and you can really focus on the content, and that’s really what users are looking for. So it’s like that, almost like a transition from technical to content side with regards to SEO.”

This echoed a previous statement from earlier in the podcast where Danny remarked on how some people have begun worrying less about SEO and focusing on content.

Danny said:

“But we really just want you to focus on your content and not really worry about this. If your content is on the web and generally accessible as most people’s content is, that’s it.

I’ve actually been heartened that I’ve seen a number of people saying things like: I don’t even want to think about this SEO stuff anymore. I’m just getting back into the joy of writing blogs.

I’m like, yes, great. That’s what we want you to do. That’s where we think you’re going to find your most success.”

Listen to Danny Sullivan’s remarks at about the 8 minute mark:

Featured Image by Shutterstock/Just dance

Google Explains How To Rank In AI Search via @sejournal, @martinibuster

Google’s John Mueller and Danny Sullivan discussed their thoughts on AI search and what SEOs and creators should be doing to make sure their content is surfaced. Danny showed some concern for folks who were relying on commodity content that is widely available.

What Creators Should Focus On For AI

John Mueller asked Danny Sullivan what publishers should be focusing on right now that’s specific to AI. Danny answered by explaining what kind of content you should not focus on and what kind of content creators should be focusing on.

He explained that the kind of content that creators should not focus on is commodity content. Commodity content is web content that consists of information that’s widely available and offers no unique value, no perspective, and requires no expertise. It is the kind of content that’s virtually interchangeable with any other site’s content because they are all essentially generic.

While Danny Sullivan did not mention recipe sites, his discussion about commodity content immediately brought recipe sites to mind because those kinds of sites seemingly go out of their way to present themselves as generically as possible, from the way the sites look to the “I’m just a mom of two kids” bio and the recipes they provide. In my opinion, what Danny Sullivan said should make creators consider what they bring to the web that makes them notable.

To explain what he meant by commodity content, Danny used the example of publishers who used to optimize a web page for the time that the Super Bowl game began. His description of the long preamble they wrote before giving the generic answer of what time the Super Bowl starts reminded me again of recipe sites.

At about the twelve minute mark John Mueller asked Danny:

“So what would you say web creators should focus on nowadays with all of the AI?”

Danny answered:

“A key thing is to really focus on is the original aspect. Not a new thing.

These are not new things beyond search, but if you’re really trying to reframe your mind about what’s important, I think that on one hand, there’s a lot of content that is just kind of commodity content, factual information, and I think that the… LLM, AI systems are doing a good job of presenting that sort of stuff.

And it’s not originating from any type of thing.

So the classic example, as you know, will make people laugh, …but every year we have this little American football thing called the Super Bowl, which is our big event.

…But no one ever can seem to remember what time it’s on.

…Multiple places would then all write their “what time does the Super Bowl start in 2011?” post. And then they would write these giant long things.

…So, you know, and then at some point, we could see enough information and we have data feeds and everything else that we just kind of said, you do a search and …the Super Bowl is going to be at 3:30.

…I think the vast majority of people say, that’s a good thing. Thank you for just telling me the time of the Super Bowl.

It wasn’t super original information.”

Commodity Content Is Not Your Strength

Next, Danny considered some of the content people are publishing today, encouraging publishers to think about the generic nature of their content and how they can share something more original and unique.

Danny continued his answer:

“I think that is a thing people need to understand, is that more of this sort of commodity stuff, it isn’t going to necessarily be your strength.

And I do worry that some people, even with traditional SEO, focus on it too much.

There are a number of sites I know from the research and things that I’ve done that get a huge amount of traffic for the answer to various popular online word-solving games.

It’s just every day I’m going to give you the answer to it. …and that is great. Until the system shifts or whatever, and it’s common enough, or we’re pulling it from a feed or whatever, and now it’s like, here’s the answer.”

Bring Your Expertise To AI

Danny next suggested that people who are concerned about showing up in AI should start exploring how to express their authentic experience or expertise. He said this advice applies not just to text content but to video and podcast content as well.

He continued:

“Your original voice is that thing that only you can provide. It’s your particular take.

And so that’s what we think was our number one thing when we’re telling people is like, this is what we think your strength is going to be.

As we go into this new world, is already what you should be doing, but this is what your strength that you should be doing is focus on that original content.

I think related to that is this idea that people are also seeking original content that’s, …authentic to them, which typically means it’s a video, it’s a podcast…

…And you’ve seen that in the search we’ve already done, where we brought in more social, more experiential content.  Not to take away from the expert takes, it’s just that people want that.

Sometimes you’re just wanting to know someone’s firsthand experience alongside some expert take on it as well.

But if you are providing those expert takes, you’re doing reviews or whatever, and you’ve done that in the written form, you still have the opportunity to be doing those in videos and podcasts and so on.  Those are other opportunities.

So those are things that, again, it’s not unique to the AI formats, but they just may be, as you’re thinking about, how do I reevaluate what I’m doing overall in this era, that these are things you may want to be considering with it from there.”

John Mueller agreed that it makes sense to bring your unique voice to content in order to make it stand out. Danny’s point treats visibility in AI-driven search as a matter of differentiation rather than optimization. The emphasis is not on adapting content to a new format, but on creating a recognizable voice and perspective with which to stand out. Given that AI Search is still classic search under the hood, it makes sense to stand out from competitors with unique content that people will recognize and recommend.

Listen to the passage at around the twelve minute mark:

Featured Image by Shutterstock/Asier Romero

The Facts About Trust Change Everything About Link Building via @sejournal, @martinibuster

Trust is commonly understood to be a standalone quality that is passed between sites regardless of link neighborhood or topical vertical. What I’m going to demonstrate is that “trust” is not a thing that trickles down from a trusted site to another site. The implication for link building is that many may have been focusing on the wrong thing.

Six years ago I was the first person to write about link distance ranking algorithms that are a way to create a map of the Internet that begins with a group of sites that are judged to be trustworthy. These sites are called the seed set. The seed set links to other sites, which in turn link to ever increasing groups of other sites. The sites closer to the original seed set tend to be trustworthy websites. The sites that are furthest away from the seed set tend to be not trustworthy.

Google still counts links as part of the ranking process, so it’s likely that there continues to be a seed set that is considered trustworthy, and the further away a site is linked from those seeds, the likelier it is to be considered spam.

Circling back to the idea of trust as a ranking-related factor, trust is not a thing that is passed from one site to another. Trust, in this context, is not even a part of the conversation. Sites are judged trustworthy by the link distance between the site in question and the original seed set. So you see, there is no trust conveyed from one site to another.
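That distance idea can be made concrete with a short breadth-first search. In the toy sketch below, nothing is “passed” between sites; each site simply gets a hop count from the nearest trusted seed, and a site with a single inbound link can still sit one hop from a seed. The graph, the seed set, and the names are all invented for illustration.

```python
from collections import deque

def distance_from_seeds(links, seeds):
    """Breadth-first search outward from the trusted seed set.

    The result is a hop count from the nearest seed, not a quantity of
    'trust' handed from one site to the next. Sites never reached by the
    search simply get no score at all.
    """
    dist = {seed: 0 for seed in seeds}
    queue = deque(seeds)
    while queue:
        site = queue.popleft()
        for target in links.get(site, set()):
            if target not in dist:
                dist[target] = dist[site] + 1
                queue.append(target)
    return dist

# Toy graph: "niche-site" has only one inbound link, but it comes straight
# from a seed, so its distance is 1 even though its link count is tiny.
links = {
    "seedA": {"niche-site", "hub"},
    "hub": {"popular-site"},
    "popular-site": {"far-site"},
}
print(distance_from_seeds(links, {"seedA"}))
# e.g. {'seedA': 0, 'niche-site': 1, 'hub': 1, 'popular-site': 2, 'far-site': 3}
```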

The word trustworthiness is even a part of the E-E-A-T standard of what constitutes a quality website. But trust should never be thought of as something that is passed from one site to another, because trust as a transferable link signal does not exist.

The takeaway is that link building decisions based on the idea of trust propagated through links are built on an outdated premise. What matters is whether a site sits close to trusted seed sites within the same topical neighborhood, not whether it receives a link from a widely recognized or authoritative domain. This insight transforms link evaluation into a relevance problem rather than a reputation problem. This insight should encourage site owners to focus on earning links that reinforce topical alignment instead of chasing links that appear impressive but have little, if any, ranking value.

Why Third Party Authority Metrics Are Inaccurate

The second thing about the link distance ranking algorithms that I think is quite cool and elegant is that websites naturally coalesce around each other according to their topics. Some topics are highly linked and some, like various business association verticals, are not well linked at all. The consequence is that those poorly linked sites that are nevertheless close to the original seed set do not acquire much “link equity” because their link neighborhoods are so small.

What that means is that a low-linked vertical can be a part of the original seed set and still display low scores in third-party authority metrics. The implication is that third-party link metrics that measure how many inbound links a site has fail. They fail because third-party authority metrics follow the old and outdated PageRank scoring method that counts the number of inbound links a site has. PageRank was created around 1998 and is so old that the patent on it has expired.

The seed set paradigm does not measure inbound links. It measures the distance from sites that are judged to be trustworthy. That has nothing to do with how many links those seed set sites have and everything to do with them being trustworthy, which is a subjective judgment.

That’s why I say that third-party link authority metrics are outdated. They don’t follow the seed set paradigm; they follow the old and outdated PageRank paradigm.

The insight to take away from this is that many highly trustworthy sites are being overlooked for link building purposes because link builders are judging the quality of a site by outdated metrics that incorrectly devalue sites in verticals that aren’t well linked but are actually very close to the trustworthy seed set.

The Importance Of Link Neighborhoods

Let’s circle back to the observation that websites tend to naturally link to other sites that are on the same topic. What’s interesting about this is that the seed sets can be chosen according to topic verticals. Some verticals have a lot of inbound links and some verticals are in their own little corner of the Internet and aren’t linked to from outside of their clique.

A link distance ranking algorithm can thus be used to calculate relevance according to whatever neighborhood a site is located in. Majestic does something like that with their Trust Flow and Topical Trust Flow metrics, which actually start with trusted seed sites. Topical Trust Flow breaks that score down into specific topic categories. The Topical Trust Flow metric shows how relevant a website is for a given topic.

My point isn’t that you should use that metric, although I think it’s the best one available today. The point is that there is no context for thinking about trustworthiness as something that spreads from link to link.

Once you can think of links in the paradigm of distance within a topic category, it becomes easier to understand why a link from a university website or some other so-called “high trust” site isn’t necessarily that good or useful. I know this for certain because there was a time, before distance ranking, when the topic of the linking site didn’t matter. It matters very much now, and it has mattered for a long time.

The takeaways here are:

  1. It is counterproductive to go after so-called “high trust” links from verticals that are well outside of the topic of the website you’re trying to get a link to.
  2. This means that it’s more important to get links from sites that are on the right topic, or from a context that exactly matches the topic on a website that’s in an adjacent topical category.

For example, a site like The Washington Post is not a part of the Credit Repair niche. Any “trust” that may be calculated from a Washington Post link to a Credit Repair site will likely be dampened to zero. Of course it will. Remember, seed set trust distance is calculated within groups within a niche. There is no trust passed from one link to another link. It is only the distance that is counted.

Logically, it makes sense to assume that there will be no validating effect between topically irrelevant sites for the purposes of the seed set trust calculations.
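Putting the pieces together, distance can be computed against topic-specific seed sets, so an off-topic link from a famous site never even enters the calculation for your niche. The toy sketch below repeats the breadth-first helper from the earlier sketch so it runs on its own; the data is invented, and this is an illustration of the concept rather than any vendor’s metric.

```python
from collections import deque

def distance_from_seeds(links, seeds):
    """Hop count from the nearest seed (same helper as in the earlier sketch)."""
    dist, queue = {s: 0 for s in seeds}, deque(seeds)
    while queue:
        site = queue.popleft()
        for target in links.get(site, set()):
            if target not in dist:
                dist[target] = dist[site] + 1
                queue.append(target)
    return dist

def topical_distance(links, seeds_by_topic, site):
    """Distance from each topic's own seed set to one site; None means unreachable."""
    return {topic: distance_from_seeds(links, seeds).get(site)
            for topic, seeds in seeds_by_topic.items()}

# Toy data: the newspaper is famous but never links into the credit niche.
links = {
    "finance-seed": {"credit-blog"},
    "credit-blog": {"credit-repair-site"},
    "news-seed": {"newspaper"},
}
seeds_by_topic = {"finance": {"finance-seed"}, "news": {"news-seed"}}

print(topical_distance(links, seeds_by_topic, "credit-repair-site"))
# {'finance': 2, 'news': None}
```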

Takeaways

  • Trust is not something that’s passed by links
    Link distance ranking algorithms do not deal with “trust.” They only measure how close a site is to a trusted seed set within a topic.
  • Link distance matters more than link volume
    Ranking systems based on link distance assess proximity to trusted seed sites, not how many inbound links a site has.
  • Topic-based link neighborhoods shape relevance
    Websites naturally cluster by topic, and link value is likely evaluated within those topical clusters rather than across the entire web. A non-relevant link may still carry some small value, but irrelevant links as a strategy stopped working almost twenty years ago.
  • Third-party authority metrics are misaligned with modern link ranking systems
    Some third-party metrics rely on outdated PageRank-style link counting and fail to account for seed set distance and topical context.
  • Low-link verticals are undervalued by SEOs
    Entire niches that are lightly linked can still sit close to trusted seed sets, yet appear weak in third-party metrics, causing them to be overlooked by link builders.
  • Relevance outweighs perceived link strength
    Links from well-known but topically irrelevant sites likely contribute little or nothing compared to links from closely related or adjacent topic sites.

Modern link evaluation is about topical proximity, not “trust” or raw link counts. Search systems measure how close a site is to trusted seed sites within its own topic neighborhood, which means relevant links from smaller, niche sites can matter more than links from famous but unrelated domains.

This knowledge should enable smarter link building by focusing efforts on contextually relevant websites that may actually strengthen relevance and rankings, instead of chasing outdated link authority scores that no longer reflect how search works.

Featured Image by Shutterstock/Kues

Improve Any Link Building Strategy With One Small Change via @sejournal, @martinibuster

Link building outreach is not just blasting out emails. There’s also a conversation that happens when someone emails you back with a skeptical question. The following are tactics to use for overcoming skeptical responses.

In my opinion it’s always a positive sign when someone responds to an email, even if they’re skeptical. I consider nearly all email responses to be indicators that a link is waiting to happen. This is why a good strategy that anticipates common questions will help you convert skeptical responses into links.

Many responses tend to be questions. What they are asking, between the lines, is for you to help them overcome their suspicions. Anytime you receive a skeptical response, try to view it as them asking you, “Help me understand that you are legitimate and represent a legitimate website that we should be linking to.”

The question is asked between the lines. The answer should similarly be addressed between the lines. Ninety-nine percent of the time, a between-the-lines question should not be answered directly. The perfect way to answer those questions, the perfect way to address an underlying concern, is to answer it in the same way you received it, between the lines.

Common (and sometimes weird) questions that I used to get included:

  • Who are you?
  • Who do you work for?
  • How did you get my email address?

Before I discuss how I address those questions, I want to mention something important that I do not do. I do not try to actively convert the respondent in the first response. In my response to their response to my outreach, I never ask them to link to the site.

The question of linking is already hanging in the air and is the object of their email to you; there is no need to bring that up. If in your response you ask them again to link to your site, it will tilt them back to being suspicious of you, raising the odds of losing the link.

In trout fishing, the successful angler crouches so that the trout does not see them. The successful angler may even wear clothing that helps them blend into the background. The best anglers imitate the crane, a fish-eating bird that stands perfectly still, imperceptibly inching closer to its prey. This is done to avoid being noticed. Your response should imitate the crane or the camouflaged angler. You should put yourself into the mindset of anything but a marketer asking for a link.

Your response must not be to immediately ask for a link because that in my opinion will just lose the link. So don’t do it just yet.

Tribal Affinity

One approach that I used to use successfully is what I called the Tribal Affinity approach. For a construction/home/real estate related campaign, I used to approach it with the mindset of a homeowner. I wouldn’t say that I’m a homeowner (even though I was); I would just think in terms of what I would say as a homeowner contacting a company to suggest a real estate or home repair type of link. In the broken link or suggest-a-link strategy, I would say that the three links I am suggesting for their links page have been useful to me.

Be A Mirror

A tribal affinity response that has been useful to me is to mirror the person I’m outreaching to, to assume the mindset of the person I am responding to. So, for example, if they are a toy collector, then your mindset can also be a toy collector. If the outreach target is a club member, then your outreach mindset can be an enthusiast of whatever the club is about. I never claim membership in any particular organization, club, or association. I limit my affinity to mirroring the same shared mindset as the person I’m outreaching to.

Assume The Mindset

Another approach is to assume the mindset of someone who happened upon the links page with a broken link or missing a good quality link. When you get into the mindset the text of your email will be more natural.

Thus, when someone responds by challenging me, asking how I found their site or who I’m working for, my response is to stick to my mindset of a homeowner and respond accordingly.

And really, what’s going on is that they’re not really asking how you found their site. What they’re really asking, between the lines, is if you’re a marketer of some kind. You can go ahead and say yes, you are. Or you can respond between the lines and say that you’re just a homeowner. Up to you.

There are many variations to this approach. The important points are:

  • Responses that challenge you are not necessarily hostile but are often link conversions waiting to happen.
  • Never respond to a response by asking for a link.
  • Put yourself into the right mindset. Thinking like a marketer will usually lead to a conversion dampening response.
  • Put yourself into the mindset that mirrors the person you outreach to.

Get into the mindset that gives you a plausible reason for finding their site and the best words for asking for a link will write themselves.

Featured Image by Shutterstock/Luis Molinero

Questions The CEO Should Be Asking About Their Website (But Rarely Does) via @sejournal, @billhunt

Few CEOs ever ask hard questions about their company website. They’ll sign off on multimillion-dollar redesigns, approve ad budgets, and endorse “digital transformation” plans, but rarely ask how much enterprise value their digital infrastructure is actually creating.

That’s a problem, because the website is no longer a marketing artifact. It’s the factory floor of digital value creation. Every lead, sale, customer interaction, and data signal runs through it. When the site performs well, it compounds growth. When it underperforms, it silently leaks shareholder value.

Executives don’t need to understand HTML or crawl budgets. But they do need to ask sharper questions: the kind that expose hidden risk, surface inefficiencies, and align digital investments with measurable business outcomes. In the age of AI-driven search, where visibility and trust are determined algorithmically, these questions aren’t optional. They’re fiduciary.

Why CEOs Must Ask – Even If SEOs Believe It Is “Beneath” Them

There’s a persistent misconception in digital circles: that CEOs shouldn’t concern themselves with SEO, site performance, or technical issues. “That’s marketing’s job,” people say. But the truth is, these issues directly affect the metrics that boards and investors care about most – operating margin, revenue growth, capital efficiency, and risk mitigation.

When a website is treated as an expense line rather than a capital asset, accountability disappears. Teams chase traffic over value, marketing spend rises to offset organic losses, and executives are left with fragmented data that hides the real cost of inefficiency.

A CEO’s job isn’t to approve color palettes or keyword lists. It’s to ensure the digital infrastructure is producing measurable returns on invested capital just as they would for a factory, logistics system, or data center.

The Cost Of Not Asking

Every company has a “digital balance sheet,” even if it’s never been documented. Behind every campaign and click lies a network of dependencies, from page speed and content accuracy to structured data, discoverability, and cross-market alignment. When those systems falter, the losses are invisible but compounding:

  • Organic visibility declines, forcing paid media spend to rise.
  • Technical debt accumulates, slowing innovation.
  • AI search engines misattribute content or cite competitors instead.
  • Global teams duplicate content, fragmenting authority and wasting budget.

In one multinational I audited, over $5 million per month in paid search spend was compensating for lost organic traffic caused by broken hreflang tags and indexation gaps.

A similar disconnect played out publicly when the CMO of a major retail brand was asked during an earnings call about their online holiday strategy. He confidently declared, “As the largest reseller in our category, we’ll dominate the season online.” Within seconds, a reporter searched the category term, and the brand didn’t appear on page one. The CMO was stunned. He had assumed offline dominance guaranteed online visibility. It didn’t.

That thirty-second fact-check illustrated a billion-dollar truth: market leadership offline doesn’t ensure findability online. Without the right questions and governance, digital equity erodes silently until someone outside the company exposes it.

No CEO would tolerate that level of inefficiency in their supply chain. Yet it happens online every day, unnoticed, because few know which questions to ask.

The 10 Questions Every CEO Should Be Asking

These questions aren’t tactical; they’re financial. They surface whether the digital system that represents your brand to the world is operating efficiently, effectively, and in alignment with corporate goals.

  1. Are we treating the website as a capital asset or a cost center?
     Why it matters: Capital assets require lifecycle planning, maintenance, and reinvestment.
     Executive red flag: Budgets are reset annually with no cumulative accountability.

  2. What’s our digital yield – the value per visit or per impression?
     Why it matters: Links traffic and investment to tangible business outcomes (a simple calculation is sketched after this list).
     Executive red flag: Traffic grows, revenue stays flat.

  3. Where are we leaking value?
     Why it matters: Surfaces inefficiencies across SEO, paid, content, and conversion funnels.
     Executive red flag: Paid media dependency rises while organic visibility declines.

  4. How fast can we diagnose and fix a problem?
     Why it matters: Measures organizational agility and governance maturity.
     Executive red flag: Issues discovered only after quarterly reports.

  5. Do we have digital “command and control”?
     Why it matters: Reveals whether teams, agencies, and regions share accountability.
     Executive red flag: Multiple CMSs, duplicated content, and conflicting data.

  6. How does our web performance translate to shareholder metrics?
     Why it matters: Connects digital KPIs to ROIC and margin.
     Executive red flag: Dashboards report sessions, not value.

  7. Who owns web effectiveness?
     Why it matters: Ownership drives accountability and resourcing.
     Executive red flag: Everyone claims a piece; no one owns the outcome.

  8. Are we findable, understandable, and trusted by both humans and machines?
     Why it matters: Future-proofs the brand in AI-driven search.
     Executive red flag: Generative engines cite competitors, not us.

  9. How resilient is our digital ecosystem?
     Why it matters: Tests readiness for migrations, rebrands, and AI shifts.
     Executive red flag: Every platform change causes a traffic cliff.

  10. What are we learning from our data that informs decisions?
      Why it matters: Turns analytics into strategy, not hindsight.
      Executive red flag: Insights exist but never reach decision-makers.
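As a concrete illustration of question 2, digital yield is simply attributable value divided by visits (or impressions), tracked per channel and per period. The short sketch below is a hypothetical example of that arithmetic; the channel names and figures are invented, and a real program would substitute its own attribution model.

```python
def digital_yield(value, visits):
    """Value created per visit; use impressions instead of visits if that is the unit you manage."""
    return value / visits if visits else 0.0

# Hypothetical quarter: revenue (or pipeline value) attributed to each channel.
channels = {
    "organic search": {"value": 1_200_000, "visits": 800_000},
    "paid search":    {"value":   900_000, "visits": 300_000},
    "direct":         {"value":   400_000, "visits": 250_000},
}

for name, c in channels.items():
    print(f"{name}: ${digital_yield(c['value'], c['visits']):.2f} per visit")
# organic search: $1.50 per visit
# paid search: $3.00 per visit
# direct: $1.60 per visit
```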

Each question reframes a “marketing” issue as a governance issue. When CEOs ask these questions, they encourage teams to think systemically, connecting content, code, and conversion as interdependent components of a single digital value chain.

From Questions To Action: Building A Culture Of Digital Accountability

Asking the right questions isn’t micromanagement – it’s leadership through intent.

When a CEO defines the Commander’s Intent for digital, it brings clarity of purpose, alignment of teams, and shared metrics, and it changes how the organization approaches the web. Instead of chasing redesigns or vanity KPIs, teams operate with a shared understanding:

“Our website’s job is to create enterprise value – measurable, sustainable, and scalable.”

That intent cascades into structure:

  • Visibility: Reporting evolves from traffic to contribution value.
  • Speed: Teams track time-to-detect and time-to-resolve issues.
  • Alignment: Marketing, IT, and product teams operate under a unified governance framework.

This is where the Web Effectiveness Score or Digital Value Creation Framework bridges web metrics (load time, index coverage) to enterprise KPIs (ROIC, margin, growth). Once that link is visible, executives start managing digital performance as a financial asset because it is.

The CEO’s Digital Playbook

CEOs who ask these questions consistently outperform those who don’t – not because they know more about SEO, but because they lead with system awareness. When they do:

  1. Wasted Spend Decreases.
    Duplicative content, overlapping agencies, and redundant tools are identified and rationalized.

  2. Visibility and Trust Increase.
    Content becomes findable, structured, and cited by both search engines and generative AI.

  3. Risk Declines.
    Technical debt, migration shocks, and compliance failures are detected early.

  4. Innovation Accelerates.
    Modular systems and shared data layers enable faster experimentation.

  5. Enterprise Value Compounds.
    Web performance improvements flow into revenue growth and cost efficiency.

This is the same logic CFOs apply to physical assets. The only difference is that digital assets rarely appear on the balance sheet, so their underperformance remains invisible until a crisis.

Why Now: The AI Search Inflection Point

The rise of generative search makes these questions urgent. Search is no longer a static list of links; it’s a recommendation system. AI engines evaluate authority, trust, and structured data across the web to synthesize answers.

If your website isn’t structured, trusted, and machine-readable, your company risks digital disintermediation and being invisible in the ecosystems that shape decisions. For CEOs, that’s not a marketing problem; it’s an enterprise risk.

As AI systems determine which brands get cited and recommended, your digital infrastructure becomes the new supply chain for relevance and reputation.

Final Thought

The CEOs who win the next decade won’t outspend their competitors – they’ll out-align them. They’ll treat digital infrastructure with the same financial discipline as physical assets, measure contribution instead of activity, and lead teams to think in systems rather than silos.

Every boardroom already measures financial capital. It’s time to start measuring digital capital, and your website is where it compounds.

In the AI era, your website isn’t just how people find you.
It’s how machines define you.

Featured Image: Master1305/Shutterstock

Five Ways To Boost Traffic To Informational Sites via @sejournal, @martinibuster

Informational sites can easily decline into a crisis of search visibility. No site is immune. Here are five ways to manage content to maintain steady traffic, increase the ability to adapt to changing audiences, and make confident choices that help the site maintain growth momentum over time.

1. Create A Mix Of Content Types

Publishers are in a constant race to publish what’s latest because being first to publish can be a source of massive traffic. The main problem with these kinds of sites is that relying on current events carries risks that put the sustainability of the publication into question.

  • Current events quickly become stale and no longer relevant to an audience.
  • Unforeseen events like an industry strike, accidents, world events, and pandemics can disrupt interest in a topic.

The focus then is to identify content topics that are reliably relevant to the website’s current audience. This kind of content is called evergreen content, and it can form a safety net of reliable traffic that can sustain the business during slow cycles.

An example of the mixed approach to content that comes to mind is how the New York Times has a standalone recipes section on a subdomain of the main website. It also has a category-based section dedicated to gadget reviews called The Wirecutter.

Another example is the entertainment niche, which in addition to industry news also publishes interviews with stars and essays about popular movies. Music websites publish the latest news but also content based on snippets from interviews with famous musicians, where the musicians make interesting statements about songs, inspirations, and cultural observations.

Rolling Stone magazine publishes content about music but also about current events like politics that align with their reader interests.

All three of those examples expand into adjacent topics in order to strengthen their ability to attract steady, reliable traffic.

2. Evergreen Content Also Needs Current Event Topics

Conversely, evergreen topics can generate new audience reach and growth by expanding to cover current events. Content sites about recipes, gardening, home repairs, DIY, crafts, parenting, personal finance, and fitness all feature evergreen content and can branch out into related current events. The traffic derived from trending topics is an excellent source of devoted readers who return for the evergreen content and end up recommending the site to friends for both current events and evergreen topics.

Current events can be related to products and even to statements by famous people. If you enjoy creating content or making discoveries, then you’ll enjoy the challenge of discovering new sources of trending topics.

If you don’t already have a mix of evergreen and ephemeral content, then I would encourage you to look for opportunities to add whichever kind you’re missing. Together, they can help sustain traffic levels while feeding growth and life into the website.

3. Beware Of Old Content

Google evaluates the total content of a website in order to generate a quality score. Google is vague about these whole-site evaluations. We only know that they do it and that a good evaluation can have a positive effect on traffic.

However, what happens when the site becomes top-heavy with old, stale content that’s no longer relevant to site visitors? This can become a drag on a website. There are multiple ways of handling this situation.

Content that is absolutely out of date, of no interest to anyone, and therefore no longer useful should be removed. The criterion for judging content is usefulness, not age. The reason to prune this content is that a whole-site evaluation may conclude that most of the website is composed of unhelpful, outdated web pages, which could drag down site performance.

There’s nothing inherently wrong with old content as long as it’s useful. For example, the New York Times keeps old movie reviews in archives that are organized by year, month, day, category, and article title.

The URL slug for the movie review of E.T. looks like this:  /1982/06/11/movies/et-fantasy-from-spielberg.html

Screenshot Of Archived Article

Take Decisive Steps

  • Useful historical content can be archived.
  • Older content that is out of date can be rehabilitated.
  • Content that’s out of date and has been superseded by new content can be redirected with a 301 response code to the new content.
  • Content that is out of date and objectively useless should be removed from the website and allowed to show a 404 response code (a sketch of these last two steps follows this list).
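As a rough illustration of those last two steps, here is a minimal sketch assuming a site served by Node with Express; the URLs are hypothetical, and most CMS platforms and web servers expose equivalent redirect settings.

  // Minimal sketch (TypeScript/Express), with hypothetical URLs.
  import express from "express";

  const app = express();

  // Superseded article: send a permanent (301) redirect to the replacement page.
  app.get("/2015/old-buyers-guide", (_req, res) => {
    res.redirect(301, "/2024/updated-buyers-guide");
  });

  // Removed, no-longer-useful article: no route is defined for it, so the
  // server falls through to its default handler and returns a 404 response.

  app.listen(3000);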

4. Topic Interest

Something that can cause traffic to decline on an informational site is waning interest in the topic itself. Technological innovation can cause the popularity of an entire product category to decline, dragging website traffic down with it. For example, I consulted for a website that reported its traffic was declining. The site still ranked for its keywords, but a quick look at Google Trends showed that interest in the website’s topic was falling. This was several months after the introduction of the iPhone, which negatively impacted the broad category of products that the website was centered on.

Always keep track of how interested your audience is in your topic. Follow influencers in your niche topic on social media to gauge what they are talking about and whether there are any shifts in the conversation that indicate waning interest or growing interest in a related topic.

Always try out new subcategories of your topic that cross over with your readership to see if there is an audience there that can be cultivated.

Another nuance to consider is the difference between temporary dips in interest and long-term structural decline. Some topics experience predictable cycles driven by seasons, economic conditions, or news coverage, while others face permanent erosion as user needs change or alternatives emerge. Misreading a cyclical slowdown as a permanent decline can lead to unnecessary pivots, while ignoring structural shifts can leave a site over-invested in content that no longer aligns with how people search, buy, or learn.

Monitoring topic interest is less about reacting to short-term fluctuations and more about maintaining a long-term view of where demand is heading. By monitoring audience behavior, tracking broader trends, and experimenting at the edges of the core topic, an informational site can adjust gradually rather than being forced into abrupt changes after traffic has already declined. This ongoing attention helps ensure that content decisions remain grounded in how interest evolves over time.

5. Differentiate

Something that happens to a lot of informational websites is that competitors in a topic tend to cover the exact same stories and even have similar styles of photos, about pages, and bios.

B2B software sites have images of people around a laptop, images of a serious professional, and people gesturing at a computer or a whiteboard.

Recipe sites feature the Flat Lay (food photographed from above), the Ingredient Still Life portrait, and action shots of ingredients grated, sprinkled, or in mid air.

Websites tend to converge into homogeneity in the images they use and the kind of content that’s shared, based on the idea that if it’s working for competitors, then it may be a good approach. But sometimes it’s best to step out of the pack and do things differently.

Evolve your images so that they stand out or catch the eye, try a different way of communicating your content, identify the common concept that everyone uses, and see if there’s an alternate approach that makes your site more authentic.

For example, a recipe site can show photographic bloopers or discuss what can go wrong and how to fix or avoid it. Being real is authentic. So why not show what underbaked looks like? Instagram and Pinterest are traffic drivers, but does that mean all images must be impossibly perfect? People might respond to the opposite of homogeneity and fake perfection.

The thing that’s almost always missing from product reviews is photos of the testers actually using the products. Is it because the reviews are fake? Hm… Show images of the products with proof that they’ve been used.

Takeaways

  • Sustainable traffic can be cultivated with a mix of evergreen and timely content. Find the balance that works for your website.
  • Evergreen content performs best when it is periodically refreshed with up-to-date details.
  • Outdated content that lacks utility or meaning in people’s lives can quietly grow to suppress site-wide performance. Old pages should be reviewed for usefulness and then archived, updated, redirected, or removed.
  • Audience interest in a topic can decline even if rankings remain stable. Monitoring search demand and cultural shifts helps publishers know when it’s time to expand into adjacent topics before traffic erosion becomes severe.
  • Differentiation matters as much as coverage. Sites that mirror competitors in visuals, formats, and voice risk blending into sameness, while original presentation and proof of authentic experience build trust and attention.

Search visibility declines are not caused by a single technical flaw or isolated content mistake but by gradual misalignment between what a site publishes and what audiences continue to value. Sites that rely too heavily on fleeting interest, allow outdated material to accumulate, or follow competitors into visual and editorial homogeneity risk signaling mediocrity rather than relevance. Sustained performance depends on actively managing content, balancing evergreen coverage with current events, pruning what’s no longer useful, and making deliberate choices that distinguish the site as useful, authentic, and credible.

Featured Image by Shutterstock/Sergey Nivens

Apple Safari Update Enables Tracking Two Core Web Vitals Metrics via @sejournal, @martinibuster

Safari 26.2 adds support for measuring Largest Contentful Paint (LCP) and for the Event Timing API, which is used to calculate Interaction to Next Paint (INP). This enables site owners to collect LCP and INP data from Safari users through the browser Performance API, using their own analytics and real user monitoring tools.

LCP And INP In Apple Safari Browser

LCP is a Core Web Vital and a ranking signal. Interaction To Next Paint (INP), also a Core Web Vitals metric, measures how quickly your website responds to user interactions. Native Safari browser support enables accurate measurement, which closes a long-standing blind spot for performance diagnostics of site visitors using Apple devices.

INP is a particularly critical measurement because it reports on the total time between a user’s action (click, tap, or key press) and the visual update on the screen. It tracks the slowest interaction observed during a user’s visit. INP is important because it enables site owners to know if the page feels “frozen” or laggy for site visitors. Fast INP scores translate to a positive user experience for site visitors who are interacting with the website.

This change will have no effect on public tools like PageSpeed Insights and CrUX data because they are Chrome-based.

However, Safari site visitors can now be included in field performance data where site owners have configured measurement, such as in Google Analytics or other performance monitoring platforms.
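For example, one common pattern (a sketch, not the only approach) is to capture the metrics with the open-source web-vitals JavaScript library and forward them to GA4 as custom events. The event and parameter names below are illustrative, and the snippet assumes gtag.js is already loaded on the page.

  // Sketch (TypeScript): collecting LCP and INP in the browser and sending them to GA4.
  // Assumes the web-vitals library is installed and gtag.js is already on the page.
  import { onLCP, onINP, type Metric } from "web-vitals";

  function sendToAnalytics(metric: Metric) {
    // The event name and parameter names are illustrative, not a required schema.
    (window as any).gtag?.("event", metric.name, {
      value: Math.round(metric.value),
      metric_id: metric.id,
      non_interaction: true,
    });
  }

  // With Safari 26.2+, these callbacks now fire for Safari visitors as well.
  onLCP(sendToAnalytics);
  onINP(sendToAnalytics);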

The following analytics packages can now be configured to surface these metrics from Safari browser site visitors:

  • Google Analytics (GA4, via Web Vitals or custom event collection)
  • Adobe Analytics
  • Matomo
  • Amplitude (with performance instrumentation)
  • Mixpanel (with custom event pipelines)
  • Custom / In-House Monitoring

Apple Safari’s update also enables Real User Monitoring (RUM) platforms to surface this data for site owners:

  • Akamai mPulse
  • Cloudflare Web Analytics
  • Datadog RUM
  • Dynatrace
  • Elastic Observability (RUM)
  • New Relic Browser
  • Raygun
  • Sentry Performance
  • SpeedCurve
  • Splunk RUM

Apple’s official documentation explains:

“Safari 26.2 adds support for two tools that measure the performance of web applications, Event Timing API and Largest Contentful Paint.

The Event Timing API lets you measure how long it takes for your site to respond to user interactions. When someone clicks a button, types in a field, or taps on a link, the API tracks the full timeline — from the initial input through your event handlers and any DOM updates, all the way to when the browser paints the result on screen. This gives you insight into whether your site feels responsive or sluggish to users. The API reports performance entries for interactions that take longer than a certain threshold, so you can identify which specific events are causing delays. It makes measuring “Interaction to Next Paint” (INP) possible.

Largest Contentful Paint (LCP) measures how long it takes for the largest visible element to appear in the viewport during page load. This is typically your main image, a hero section, or a large block of text — whatever dominates the initial view. LCP gives you a clear signal about when your page feels loaded to users, even if other resources are still downloading in the background.”
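For teams that prefer to work with the Performance API directly rather than through a library, the sketch below registers observers for the two entry types Apple describes. The 40 ms duration threshold is just an example value, and the console logging stands in for sending the values to an analytics endpoint.

  // Sketch (TypeScript): observing LCP and Event Timing entries via the Performance API.

  // Largest Contentful Paint: each entry is a new LCP candidate; the last one
  // reported before the user interacts is the final LCP value.
  new PerformanceObserver((list) => {
    for (const entry of list.getEntries()) {
      console.log("LCP candidate (ms):", entry.startTime);
    }
  }).observe({ type: "largest-contentful-paint", buffered: true });

  // Event Timing: each entry covers input delay, event handler time, and the
  // paint that follows. Slow entries are what drive a poor INP score.
  new PerformanceObserver((list) => {
    for (const entry of list.getEntries()) {
      console.log("Slow interaction:", entry.name, "duration (ms):", entry.duration);
    }
  }).observe({ type: "event", durationThreshold: 40, buffered: true });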

Safari 26.2 provides new data that is critical for SEO and for monitoring the user experience. Because Safari traffic represents a significant share of site visits, these improvements give site owners a more complete view of the real user experience across more devices and browsers.