New Ecommerce Tools: July 17, 2025

Every week we publish a handpicked list of new products and services from vendors to ecommerce merchants. This installment includes updates on AI-powered shopping assistants, email and SMS marketing tools, ad management platforms, automated tax solutions, and last-mile delivery networks.

Got an ecommerce product release? Email releases@practicalecommerce.com.

New Tools for Merchants

Genus AI launches Sage, an AI growth agent for ecommerce brands. Genus AI, a developer of AI tools for D2C brands, has launched Sage, an AI growth agent to help merchants scale across digital channels. Available for Shopify and Meta platforms, Sage is an AI agent that unifies creative generation, product catalog management, digital campaigns, customer insights, and order data. According to Genus AI, Sage can enhance product catalogs, optimize campaigns, and provide real-time visibility into the customer lifecycle.

Home page of Genus AI – Sage

Infobip brings voice calling to WhatsApp Business users. Infobip, a cloud communications platform for connected experiences across the customer journey, has launched WhatsApp Business Calling, enabling businesses to make and receive voice calls via WhatsApp. Users can start calls directly from WhatsApp chats, interactive messages, or deep links embedded in websites and apps. Integration with Infobip Conversations, the company’s cloud contact center platform, enables customer support agents to switch from chat to voice while maintaining a unified conversation history and context.

Xnurta expands full-funnel Amazon Ads support to five markets. Xnurta, an agentic AI-powered ad management platform for brands, has announced its continued international expansion with full-funnel support now live in Poland, Belgium, Sweden, Egypt, and South Africa. Xnurta equips advertisers in these five countries with advanced automation, performance insights, and AI-assisted campaign management across Sponsored Products, Sponsored Brands, Sponsored Display, and Amazon Demand-Side Platform (DSP). Xnurta’s latest expansion brings its presence to nearly all of Amazon’s 23 international marketplaces.

Klaviyo introduces an AI shopping assistant. Klaviyo, an email and SMS marketing platform for B2C sellers, has launched its Conversational AI Agent to help brands deliver personalized, always-on support using real-time session context, storefront knowledge, and marketing insights, all powered by the Klaviyo Data Platform. Built into Customer Hub, Klaviyo’s Conversational AI Agent trains quickly using data from a brand’s storefront, including its product catalog and FAQs. The agent guides shoppers from discovery to purchase by answering common questions and recommending products.

Klaviyo home page

Avalara embeds AI-powered assistant into tax research. Avalara, a platform for tax compliance automation, has launched Avi, a generative AI assistant embedded within Avalara Tax Research. According to Avalara, Avi for Tax Research can (i) quickly obtain precise sales tax rates tailored to specific street addresses, (ii) instantly check the tax status of products and services through straightforward queries, and (iii) access real-time guidance that supports defensible tax positions and enables proactive adaptation to evolving tax regulations.

New Gen unveils AI-native storefronts for agentic commerce. New Gen, a company building infrastructure for the AI internet, now enables agentic commerce, powering secure, AI-initiated transactions through intelligent storefronts and embedded payment flows. The platform allows AI agents to quickly and securely check out from merchant sites across chat, voice, and (soon) through emerging agent-driven channels. New Gen leverages trusted payments infrastructure from Visa and is among the first collaborators in the Visa Intelligent Commerce sandbox.

UniUni simplifies ecommerce shipping for Canada-based sellers. UniUni, a last-mile delivery company, has announced the launch of UniUni Small Business, a system of local drop-off and service points to simplify shipping for Canada-based sellers. Through UniUni Small Business, merchants gain access to a full-stack domestic and cross-border shipping platform that includes broad carrier partnerships, an in-house last-mile delivery network, real-time tracking and automation tools, and an all-in-one dashboard for managing shipments. UniUni Small Business is now available in Toronto.

Home page of UniUni Small Business

Ordoro launches branded tracking pages for ecommerce. Ordoro, a provider of ecommerce logistics and inventory management tools, has launched Branded Tracking Pages to help merchants transform generic shipping updates into branded touchpoints. The feature provides ecommerce businesses a tracking page (already visited by customers) to reinforce their brand, answer common questions, and drive repeat purchases. Branded Tracking Pages are included with Shipping Premium.

Oro Inc. launches OroPay, a unified payment solution for B2B commerce. Oro Inc., a developer of open-source B2B digital commerce tools, has launched OroPay, an integrated payment platform for OroCommerce, its ecommerce platform. OroPay unifies invoicing, payments, ERP connectivity, and commerce. Powered by Global Payments, a technology provider, OroPay supports Level 2 and Level 3 credit card processing, dramatically lowering fees for high-value B2B transactions. Customers also benefit from advanced fraud protection, SCA/PSD2 compliance, tokenization, and support for local and global payment methods.

Clearco launches Rolling Funding for ecommerce brands. Clearco, a lender to ecommerce companies, has announced the launch of Rolling Funding, a continuous loan for D2C brands. Customer merchants pay weekly and can now automatically increase their available debt limit in real-time, dollar for dollar. This eliminates the need to submit new funding applications, providing founders and finance teams with real-time visibility into available and projected capacity through their Clearco dashboard.

Privy acquires Emotive to launch a unified email-SMS platform. Privy, an ecommerce marketing automation provider, has acquired Emotive, an SMS platform. The acquisition enables Privy to offer a unified platform where merchants can manage email campaigns, SMS automation, and on-site pop-ups from one place. Privy says it will soon roll out real-time 1:1 conversation capabilities and a slate of new features, including advanced marketing automations, more zero-party data collection options, and improved third-party integrations.

Home page of Privy

USPS Rate Change Analysis July 2025

The United States Postal Service released new postage and shipping rates this month, increasing costs for popular services such as Priority Mail by up to 51%.

The USPS announced the increases in April 2025 to help achieve financial stability, meet regulatory requirements, and cover the transportation cost of packages.

Home page of the USPS

New USPS rates help achieve financial stability.

Rate Changes

The average increase for Priority Mail is 6.3%, but shipping software maker Pirate Ship noted that some Zone 4-6 shipments rose far more, by as much as 51%.

The cost of USPS Flat Rate boxes rose by 3%, 11%, and 7% for the small, medium, and large options, respectively.

Ground Advantage rates climbed an average of 7.1%. A new $4 fee on non-standard packages, such as mailing tubes, could add up quickly for businesses selling posters, rods, or other rolled products.

The USPS also lowered some rates. According to Pirate Ship, small 2-3 pound Priority Mail shipments to Zones 1-4 are now about 6.5% less expensive.

Similarly, prices for Priority Mail Cubic shipments within the same zone dropped 10%. And Media Mail rates dropped slightly, by about 2%.

Some additional services, such as insurance, also experienced price decreases.

Thus, for small and midsize ecommerce businesses, the rate changes are uneven. Some shipments increased just a few cents. Others, depending on weight and zone, jumped by 30% or more. Still others decreased.

The result is a potential reshuffling of fulfillment costs, product margins, and, perhaps, carrier selections.

Shipping Review

The USPS transports more than 7 billion packages annually — more than UPS and FedEx. The rate changes present an opportunity for sellers to audit shipping and fulfillment practices.

A good first step is to export and analyze past orders. Download the last three months of shipping data and prompt a generative AI platform to organize it by service type and zone, for example. The aim is a profile that summarizes the merchant’s mix of shipping services and destination regions.
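The grouping step can also be done locally. Below is a minimal sketch in Python, where the order records and field names (service, zone, label_cost) are hypothetical stand-ins for whatever fields your platform exports:

```python
from collections import defaultdict

def shipping_profile(orders):
    """Group exported orders by (service, zone) and total label count and spend."""
    profile = defaultdict(lambda: {"count": 0, "spend": 0.0})
    for order in orders:
        key = (order["service"], order["zone"])
        profile[key]["count"] += 1
        profile[key]["spend"] += order["label_cost"]
    return dict(profile)

# Illustrative export rows, not real rates or orders.
orders = [
    {"service": "Ground Advantage", "zone": 4, "label_cost": 5.10},
    {"service": "Ground Advantage", "zone": 4, "label_cost": 5.35},
    {"service": "Priority Mail", "zone": 7, "label_cost": 9.80},
]
profile = shipping_profile(orders)
```

The resulting dictionary is the shipping profile: how many labels, and how much spend, fall into each service-and-zone bucket.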

The review should be recurring, as the USPS now adjusts rates every January and July.

Trimming just 25¢ off a per-order shipping cost could have the same bottom-line impact as increasing average order value or decreasing customer acquisition costs. At 10,000 orders a year, for instance, that 25¢ works out to $2,500 in annual profit.

Once it knows its shipping profile, an ecommerce business can apply the new USPS rates and estimate the cost. The process is as simple as duplicating the existing rate sheet and updating the numbers for a reusable shipping cost model.
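As a sketch of that reusable cost model, the duplicated rate sheet becomes a second lookup table. The rates and volumes below are illustrative placeholders, not actual USPS prices:

```python
# Hypothetical per-label rates keyed by (service, zone); not actual USPS prices.
OLD_RATES = {("Ground Advantage", 4): 5.10, ("Priority Mail", 7): 9.80}
NEW_RATES = {("Ground Advantage", 4): 5.45, ("Priority Mail", 7): 10.55}

def estimate_cost(profile, rates):
    """Apply a rate sheet to a shipping profile of {(service, zone): label_count}."""
    return sum(count * rates[key] for key, count in profile.items())

# Monthly label counts per bucket, taken from the merchant's shipping profile.
profile = {("Ground Advantage", 4): 120, ("Priority Mail", 7): 40}
old_cost = estimate_cost(profile, OLD_RATES)
new_cost = estimate_cost(profile, NEW_RATES)
increase = new_cost - old_cost
```

Updating only the rate table each January and July keeps the model reusable across future USPS adjustments.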

Profit Impact

Armed with new costs, sellers can calculate the impact on profit.

For shops that offer free or flat-rate shipping, recalculating profits will be straightforward.

Sellers that pass shipping costs to customers should estimate how the changes could affect conversion rates. And don’t forget to include the cost of return shipping.
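For a shop offering free or flat-rate shipping, the per-order arithmetic, including expected return shipping, can be sketched as follows (all figures illustrative):

```python
def order_profit(price, cogs, shipping_cost, return_rate=0.0, return_shipping=0.0):
    """Expected profit per order, net of the expected cost of return shipping."""
    return price - cogs - shipping_cost - return_rate * return_shipping

# A $40 item with $22 cost of goods; shipping rises from $5.10 to $5.45.
before = order_profit(40.00, 22.00, 5.10, return_rate=0.05, return_shipping=5.10)
after = order_profit(40.00, 22.00, 5.45, return_rate=0.05, return_shipping=5.45)
per_order_hit = before - after
```

Multiplying the per-order hit by annual order volume gives the total profit impact of the rate change.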

Ultimately, merchants can increase prices, adjust free shipping offers or thresholds, create product bundles, or change carriers or service levels for specific shipments.

Third-party tools can help with the analysis. Examples include Pitney Bowes’ PitneyShip software, Pirate Ship, ShipStation, and EasyPost.

Compare policies and packaging, too. Do the new USPS rates, for example, impact dimensional weight enough that it makes sense to adjust box sizes?

USPS Value

Despite the rate changes, the USPS is often the most cost-effective option for ecommerce shippers, especially those with limited volume.

The USPS is vital for last-mile delivery. UPS and FedEx rely on it, for example.

The USPS is the only option for serving some rural or military customers.

In short, recurring USPS price changes can be frustrating, but they are essential for the future of U.S. ecommerce. The agency loses billions of dollars annually and could cease to exist if it cannot recoup the shortfalls.

Perplexity Looks Beyond Search With Its AI Browser, Comet via @sejournal, @MattGSouthern

Perplexity has launched a web browser, Comet, offering users a look at how the company is evolving beyond AI search.

While Comet shares familiar traits with Chrome, it introduces a different interface model, one where users can search, navigate, and run agent-like tasks from a single AI-powered environment.

A Browser Designed for AI-Native Workflows

Comet is built on Chromium and supports standard browser features like tabs, extensions, and bookmarks.

What sets it apart is the inclusion of a sidebar assistant that can summarize pages, automate tasks, schedule meetings, and fill out forms.

You can see it in action in the launch video below:

In an interview, Perplexity CEO Aravind Srinivas described Comet as a step toward combining search and automation into a single system.

Srinivas said:

“We think about it as an assistant rather than a complete autonomous agent but one omni box where you can navigate, you can ask formational queries and you can give agentic tasks and your AI with you on your new tab page, on your side car, as an assistant on any web page you are, makes the browser feel like more like a cognitive operating system rather than just yet another browser.”

Perplexity sees Comet as a foundation for agentic computing. Future use cases could involve real-time research, recurring task management, and personal data integration.

Strategy Behind the Shift

Srinivas said Comet isn’t just a product launch; it’s a long-term bet on browsers as the next major interface for AI.

He described the move as a response to growing user demand for AI tools that do more than respond to queries in chat windows.

Srinivas said:

“The browser is much harder to copy than yet another chat tool.”

He acknowledged that OpenAI and Anthropic are likely to release similar tools, but believes the technical challenges of building and maintaining a browser create a longer runway for Perplexity to differentiate.

A Different Approach From Google

Srinivas also commented on the competitive landscape, including how Perplexity’s strategy differs from Google’s.

He pointed to the tension between AI-driven answers and ad-based monetization as a limiting factor for traditional search engines.

Referring to search results where advertisers compete for placement, Srinivas said:

“If you get direct answers to these questions with booking links right there, how are you going to mint money from Booking and Expedia and Kayak… It’s not in their incentive to give you good answers at all.”

He also said Google’s rollout of AI features has been slower than expected:

“The same feature is being launched year after year after year with a different name, with a different VP, with a different group of people, but it’s the same thing except maybe it’s getting better but it’s never getting launched to everybody.”

Accuracy, Speed, and UX as Priorities

Perplexity is positioning Comet around three core principles: accuracy, low latency, and clean presentation.

Srinivas said the company continues to invest in reducing hallucinations and speeding up responses while keeping user experience at the center.

Srinivas added:

“Let there exist 100 chat bots but we are the most focused on getting as many answers right as possible.”

Internally, the team relies on AI development tools like Cursor and GitHub Copilot to accelerate iteration and testing.

Srinivas noted:

“We made it mandatory to use at least one AI coding tool and internally at Perplexity it happens to be Cursor and like a mix of Cursor and GitHub Copilot.”

Srinivas said the browser provides the structure needed to support more complex workflows than a standalone chat interface.

What Comes Next

Comet is currently available to users on Perplexity’s Max plan through early access invites. A broader release is expected, along with plans for mobile support in the future.

Srinivas said the company is exploring business models beyond advertising, including subscriptions, usage-based pricing, and affiliate transactions.

“All I know is subscriptions and usage based pricing are going to be a thing. Transactions… taking a cut out of the transactions is good.”

While he doesn’t expect to match Google’s margins, he sees room for a viable alternative.

“Google’s business model is potentially the best business model ever… Maybe it was so good that you needed AI to kill it basically.”

Looking Ahead

Comet’s release marks a shift in how AI tools are being integrated into user workflows.

Rather than adding assistant features into existing products, Perplexity is building a new interface from the ground up, designed around speed, reasoning, and task execution.

As the company continues to build around this model, Comet may serve as a test case for how users engage with AI beyond traditional search.


Featured Image: Ascannio/Shutterstock 

OpenAI ChatGPT Agent Marks A Turning Point For Businesses And SEO via @sejournal, @martinibuster

OpenAI announced a new way for users to interact with the web to get things done in their personal and professional lives. ChatGPT agent is said to be able to automate planning a wedding, booking an entire vacation, updating a calendar, and converting screenshots into editable presentations. The impact on publishers, ecommerce stores, and SEOs cannot be overstated. This is what you should know and how to prepare for what could be one of the most consequential changes to online interactions since the invention of the browser.

OpenAI ChatGPT Agent Overview

OpenAI ChatGPT agent is built from three core parts: OpenAI’s Operator and Deep Research, two autonomous AI agents, plus ChatGPT’s natural language capabilities.

  1. Operator can browse the web and interact with websites to complete tasks.
  2. Deep Research is designed for multi-step research that is able to combine information from different resources and generate a report.
  3. ChatGPT agent requests permission before taking significant actions and can be interrupted and halted at any point.

ChatGPT Agent Capabilities

ChatGPT agent has access to multiple tools to help it complete tasks:

  • A visual browser for interacting with web pages through their on-page interfaces.
  • A text-based browser for answering reasoning-based queries.
  • A terminal for executing actions through a command-line interface.
  • Connectors, which are authorized user-friendly integrations (using APIs) that enable ChatGPT agent to interact with third-party apps.

Connectors are like bridges between ChatGPT agent and your authorized apps. When users ask ChatGPT agent to complete a task, the connectors enable it to retrieve the needed information and complete tasks. Direct API access via connectors enables it to interact with and extract information from connected apps.

ChatGPT agent can open a page with a browser (either text or visual), download a file, perform an action on it, and then view the results in the visual browser. ChatGPT connectors enable it to connect with external apps like Gmail or a calendar for answering questions and completing tasks.

ChatGPT Agent Automation of Web-Based Tasks

ChatGPT agent is able to complete entire complex tasks and summarize the results.

Here’s how OpenAI describes it:

“ChatGPT can now do work for you using its own computer, handling complex tasks from start to finish.

You can now ask ChatGPT to handle requests like “look at my calendar and brief me on upcoming client meetings based on recent news,” “plan and buy ingredients to make Japanese breakfast for four,” and “analyze three competitors and create a slide deck.”

ChatGPT will intelligently navigate websites, filter results, prompt you to log in securely when needed, run code, conduct analysis, and even deliver editable slideshows and spreadsheets that summarize its findings.

…ChatGPT agent can access your connectors, allowing it to integrate with your workflows and access relevant, actionable information. Once authenticated, these connectors allow ChatGPT to see information and do things like summarize your inbox for the day or find time slots you’re available for a meeting—to take action on these sites, however, you’ll still be prompted to log in by taking over the browser.

Additionally, you can schedule completed tasks to recur automatically, such as generating a weekly metrics report every Monday morning.”

What Does ChatGPT Agent Mean For SEO?

ChatGPT agent raises the stakes for publishers, online businesses, and SEOs: making websites agentic-AI-friendly becomes increasingly important as more users adopt the tool and share how it helps them in their daily lives and at work.

A recent study of AI agents found that OpenAI’s Operator responded well to structured on-page content. Structured content enables AI agents to accurately retrieve the specific information relevant to their tasks, perform actions (such as filling in a form), and disambiguate the web page, i.e., make it easily understood. I usually refrain from using jargon, but disambiguation is a word all SEOs need to understand because agentic AI makes it more important than it has ever been.

Examples Of On-Page Structured Data

  • Headings
  • Tables
  • Forms with labeled input fields
  • Product listings with consistent fields, such as price, availability, and the product’s name or label in a title
  • Authors, dates, and headlines
  • Menus and filters in ecommerce web pages
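To illustrate why consistent, labeled fields matter, here is a minimal agent-side extractor in Python using only the standard library. The class names (name, price, availability) and the listing markup are hypothetical, not a real store's schema:

```python
from html.parser import HTMLParser

class ProductFieldParser(HTMLParser):
    """Collect text from elements whose class attribute names a product field."""
    FIELDS = {"name", "price", "availability"}

    def __init__(self):
        super().__init__()
        self.fields = {}
        self._current = None  # field name whose text we are waiting for

    def handle_starttag(self, tag, attrs):
        cls = dict(attrs).get("class")
        if cls in self.FIELDS:
            self._current = cls

    def handle_data(self, data):
        if self._current:
            self.fields[self._current] = data.strip()
            self._current = None

# A hypothetical product listing with consistently labeled fields.
listing = """
<div class="product">
  <h2 class="name">Walnut Desk Organizer</h2>
  <span class="price">$34.00</span>
  <span class="availability">In stock</span>
</div>
"""
parser = ProductFieldParser()
parser.feed(listing)
```

When every listing labels the same fields the same way, extraction like this is trivial and unambiguous; when fields are unlabeled or inconsistent, an agent has to guess.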

Takeaways

  • ChatGPT agent is a milestone in how users interact with the web, capable of completing multi-step tasks like planning trips, analyzing competitors, and generating reports or presentations.
  • OpenAI’s ChatGPT agent combines autonomous agents (Operator and Deep Research) with ChatGPT’s natural language interface to automate personal and professional workflows.
  • Connectors extend ChatGPT agent’s capabilities by providing secure API-based access to third-party apps like calendars and email, enabling task execution across platforms.
  • Agent can interact directly with web pages, forms, and files, using tools like a visual browser, code execution terminal, and file handling system.
  • Agentic AI responds well to structured, disambiguated web content, making SEO and publisher alignment with structured on-page elements more important than ever.
  • Structured data improves an AI agent’s ability to retrieve and act on website information. Sites that are optimized for AI agents will gain the most, as more users depend on agent-driven automation to complete online tasks.

OpenAI’s ChatGPT agent is an automation system that can independently complete complex online tasks, such as booking trips, analyzing competitors, or summarizing emails, by using tools like browsers, terminals, and app connectors. It interacts directly with web pages and connected apps, performing actions that previously required human input.

For publishers, ecommerce sites, and SEOs, ChatGPT agent makes structured, easily interpreted on-page content critical because websites must now accommodate AI agents that interact with and act on their data in real time.

Read More About Optimizing For Agentic AI

Marketing To AI Agents Is The Future – Research Shows Why

Featured Image by Shutterstock/All kind of people

Ex-Google Engineer Launches Athena For AI Search Visibility via @sejournal, @MattGSouthern

A former Google Search engineer is betting on the end of traditional SEO, and building tools to help marketers prepare for what comes next.

Andrew Yan, who left Google’s search team earlier this year, co-founded Athena, a startup focused on helping brands stay visible in AI-generated responses from tools like ChatGPT and Perplexity.

The company launched last month with $2.2 million in funding from Y Combinator and other venture firms.

Athena is part of a new wave of companies responding to a shift in how people discover information. Instead of browsing search results, people are increasingly getting direct answers from AI chatbots.

As a result, the strategies that once helped websites rank in Google may no longer be enough to drive visibility.

Yan told The Wall Street Journal:

“Companies have been spending the last 10 or 20 years optimizing their website for the ‘10 blue links’ version of Google. That version of Google is changing very fast, and it is changing forever.”

Building Visibility In A Zero-Click Web

Athena’s platform is designed to show how different AI models interpret and describe a brand. It tracks how chatbots talk about companies across platforms and recommends ways to optimize web content for AI visibility.

According to the company, Athena already has over 100 customers, including Paperless Post.

The broader trend reflects growing concern among marketers about the rise of a “zero-click internet,” where users get answers directly from AI interfaces and never visit the underlying websites.

Yan’s shift from Google to startup founder underscores how seriously some search insiders are taking this transformation.

Rather than competing for rankings on a search results page, Athena aims to help brands influence the outputs of large language models.

Profound Raises $20 Million For AI Search Monitoring

Athena isn’t the only company working on this.

Profound, another startup highlighted by The Wall Street Journal, has raised more than $20 million from venture capital firms. Its platform monitors how chatbots gather and relay brand-related information to users.

Profound has attracted several large clients, including Chime, and is positioning itself as an essential tool for navigating the complexity of generative AI search.

Co-founder James Cadwallader says the company is preparing for a world where bots, not people, are the primary visitors to websites.

Cadwallader told The Wall Street Journal:

“We see a future of a zero-click internet where consumers only interact with interfaces like ChatGPT. And agents or bots will become the primary visitors to websites.”

Saga Ventures’ Max Altman added that demand for this kind of visibility data has surpassed expectations, noting that marketers are currently “flying completely blind” when it comes to how AI tools represent their brands.

SEO Consultants Are Shifting Focus

The shift is also reaching practitioners. Cyrus Shepard, founder of Zyppy SEO, told The Wall Street Journal that AI visibility went from being negligible at the start of 2025 to 10-15% of his current workload.

By the end of the year, he expects it could represent half of his focus.

Referring to new platforms like Athena and Profound, Shepard said:

“I would classify them all as in beta. But that doesn’t mean it’s not coming.”

While investor estimates suggest these startups have raised just a fraction of the $90 billion SEO industry, their traction indicates a need to address the challenges posed by AI search.

What This Means

These startups are early signs of a larger shift in how content is surfaced and evaluated online.

With AI tools synthesizing answers from multiple sources and often skipping over traditional links, marketers face a new kind of visibility challenge.

Companies like Athena and Profound are trying to fill that gap by giving marketers a window into how generative AI models see their brands and what can be done to improve those impressions.

It’s not clear yet which strategies will work best in this new environment, but the race to figure it out has begun.


Featured Image: Roman Samborskyi/Shutterstock

Google’s John Mueller Clarifies How To Remove Pages From Search via @sejournal, @MattGSouthern

In a recent installment of SEO Office Hours, Google’s John Mueller offered guidance on how to keep unwanted pages out of search results and addressed a common source of confusion around sitelinks.

The discussion began with a user question: how can you remove a specific subpage from appearing in Google Search, even if other websites still link to it?

Sitelinks vs. Regular Listings

Mueller noted he wasn’t “100% sure” he understood the question, but assumed it referred either to sitelinks or standard listings. He explained that sitelinks, those extra links to subpages beneath a main result, are automatically generated based on what’s indexed for your site.

Mueller said:

“There’s no way for you to manually say I want this page indexed. I just don’t want it shown as a sitelink.”

In other words, you can’t selectively prevent a page from being a sitelink while keeping it in the index. If you want to make sure a page never appears in any form in search, a more direct approach is required.

How To Deindex A Page

Mueller outlined a two-step process for removing pages from Google Search results using a noindex directive:

  1. Allow crawling: First, make sure Google can access the page. If it’s blocked by robots.txt, the noindex tag won’t be seen and won’t work.
  2. Apply a noindex tag: Once crawlable, add a noindex meta tag to the page to instruct Google not to include it in search results.

This method works even if other websites continue linking to the page.
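Both steps can be sanity-checked programmatically. The sketch below uses only Python's standard library; the robots.txt rules, URL, and markup are illustrative:

```python
import re
from urllib.robotparser import RobotFileParser

def can_be_deindexed(robots_txt, page_url, page_html, user_agent="Googlebot"):
    """Step 1: the page must be crawlable; step 2: it must carry a noindex
    meta tag. Returns True only when both hold, so the directive can work."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    if not rp.can_fetch(user_agent, page_url):
        return False  # blocked by robots.txt: Google never sees the noindex
    return bool(re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex',
        page_html, re.IGNORECASE))

# Illustrative inputs: a noindex tag behind a robots.txt block does nothing.
html = '<head><meta name="robots" content="noindex, follow"></head>'
robots = "User-agent: *\nDisallow: /private/"
blocked_result = can_be_deindexed(robots, "https://example.com/private/page", html)
allowed_result = can_be_deindexed(robots, "https://example.com/public/page", html)
```

The ordering mirrors Mueller's point: a robots.txt block short-circuits everything, because a page Google cannot crawl can never surface its noindex tag.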

Removing Pages Quickly

If you need faster action, Mueller suggested using Google Search Console’s URL Removal Tool, which allows site owners to request temporary removal.

“It works very quickly” for verified site owners, Mueller confirmed.

For pages on sites you don’t control, there’s also a public version of the removal tool, though Mueller noted it “takes a little bit longer” since Google must verify that the content has actually been taken down.

Hear Mueller’s full response in the video below:

What This Means For You

If you’re trying to prevent a specific page from appearing in Google results:

  • You can’t control sitelinks manually. Google’s algorithm handles them automatically.
  • Use noindex to remove content. Just make sure the page isn’t blocked from crawling.
  • Act quickly when needed. The URL Removal Tool is your fastest option, especially if you’re a verified site owner.

Choosing the right method, whether it’s noindex or a removal request, can help you manage visibility more effectively.

Brand Bias For Visibility In Search & LLMs: A Conversation With Stephen Kenwright via @sejournal, @theshelleywalsh

I recently saw Stephen Kenwright speak at a small Sistrix event in Leeds about strategies for exploiting Google’s brand bias, and a lot of what he said still feels as fresh today as it did over a decade ago when he first started promoting this theory.

Right now, the search experience is changing more than at any point in the last 25 years, and many SEOs are saying that brand is the critical focus for survival.

Some might say (Stephen included) that this is what SEO should always have been about.

I spoke to Stephen, the founder of Rise at Seven, about his talk and about how his theories and strategies could translate to a world of large language model (LLM) optimization alongside a fractured search journey.

You can watch the full interview with Stephen on IMHO below, or continue reading the article summary.

Google’s Brand Bias Is Foundational

Brand bias isn’t a recent development. Stephen was already writing about it in 2016 during his time at Branded3. What underlines this bias is the trust users have in brands.

“Google wants to give a good experience to its users. That means surfacing the results they expect to see. Often, that’s a brand they already know,” Stephen explained.

When users search, they’re often subconsciously looking to reconnect with a mental shortcut that brands provide. It’s not about discovery; it’s about recognition.

When brands invest in traditional marketing channels, they influence user behavior in ways that create cascading effects across digital platforms.

Television advertising, for example, makes viewers significantly more likely to click on branded results even when searching for generic terms.

Traditional Marketing Directly Influences Search Behavior

At his talk in Leeds, Stephen referenced research that demonstrates television advertising creates measurable impacts on search behavior, with viewers 33% more likely to click on advertised brands in search results.

“People are about a third more likely to click your result after seeing a TV ad, and they convert better, too,” Stephen said.

When users encounter brands through traditional marketing channels, they develop mental associations that influence their subsequent search behavior. These behavioral patterns then signal to Google that certain brands provide better user experiences.

“Having the trust from the user comes from brand building activity. It doesn’t come from having an exact match domain that happens to rank first for a keyword,” Stephen emphasized. “That’s just not how the real world works.”

Investment In Brand Building Gains More Buy-In From C-Suite

Even though this bias has been evident for so long, Stephen highlighted a disconnect from brand-building activities within the industry.

“Every other discipline from PR to the marketing manager through to the social media team, literally everyone else, including the C-suite is interested in brand in some capacity and historically SEOs have been the exception,” Stephen explained.

This separation has created missed opportunities for SEOs to access larger marketing budgets and gain executive support for their initiatives.

By shifting focus toward brand-building activities that impact search visibility, they can better align with broader marketing objectives.

“Just by switching that mindset and asking, ‘What’s the impact on brand of our SEO activity?’ we get more buy-in, bigger budgets, and better results,” he said.

Make A Conscious Decision About Which Search Engine To Optimize For

While Google’s dominance remains statistically intact, user behavior shows that the search journey has always been fractured.

Stephen cited data indicating that half of UK adults use Bing monthly and a quarter use Quora. Pinterest and Reddit are seeing massive engagement, especially among younger users. Nearly everyone uses YouTube, and they spend significantly more time on it than on Google.

Also, specialized search engines like Autotrader for used cars and Amazon for ecommerce have captured significant market share in their respective categories.

This fragmentation means that conscious decisions about platform optimization become increasingly important. Different platforms serve different demographics and purposes, requiring strategic choices about where to invest optimization efforts.

I asked Stephen if he thought Google’s dominance was under threat, or if it would remain part of a fractured search journey. He thought Google would remain relevant for at least another half decade.

“I don’t see Google going anywhere. And I also don’t see the massive difference in LLM optimization. So most of the things that you would be doing for Google now … are broadly marketing things anyway and broadly impact LLM optimization.”

LLM Optimization Could Be A Return To Traditional Marketing

Looking toward AI-driven search platforms, Stephen believes the same brand-building tactics that work for Google will prove effective across LLM platforms. These new platforms don’t necessarily demand new rules; they reinforce old ones.

“What works in Google now, broadly speaking, is good marketing. That also applies to LLMs,” he said.

While we’re still learning how LLMs surface content and determine authority, early indicators suggest trust signals, brand presence, and real-world engagement all play pivotal roles.

The key insight is that LLM optimization doesn’t require entirely new approaches but rather a return to fundamental marketing principles focused on audience needs and brand trust.

Television Advertising Creates Significant Impact

I asked Stephen what he would do if he were to launch a new brand and how he would quickly gain traction.

In an interesting twist for someone who has worked in the SEO industry for so long, he cited TV as his primary focus.

“I’d build a transactional website and spend millions on TV [advertising]. If I did more [marketing], I’d add PR,” Stephen told me.

This recommendation reflects his belief that traditional marketing channels create a significant impact.

He believes the combination of a functional ecommerce website, substantial television advertising investment, and supplementary PR activities provides the foundation for rapid brand recognition and search visibility.

Before We Ruined The Internet

To me, it feels like we are coming full circle, back to the days before the introduction of “new media” in the early ’90s, when TV advertising was dominant and offline advertising was heavily influential.

“It’s like we’re going back to before we ruined the internet,” Stephen joked.

In reality, we’re circling back to what always worked: building real brands that people trust, remember, and seek out. The future requires classical marketing principles that prioritize audience understanding and brand building over technical optimization tactics.

This shift benefits the entire marketing industry by encouraging more integrated approaches that consider the complete customer journey rather than isolated technical optimizations.

Success in both search and LLM platforms increasingly depends on building genuine brand recognition and trust through consistent, audience-focused marketing activities across multiple channels.

Whether it’s Google, Bing, an LLM, or something we haven’t seen yet, brand is the one constant that wins.

Thank you to Stephen Kenwright for offering his insights and being my guest on IMHO.

Featured Image: Shelley Walsh/Search Engine Journal

Is It Time To Remove Focus From Average Position In GSC? via @sejournal, @TaylorDanRW

Average position has been a cornerstone metric in SEO reporting for years. It provides a simple, at-a-glance sense of where a site typically ranks in Google’s search results.

That sense is growing increasingly misleading as Google layers generative AI features, such as AI Overviews and AI Mode, on top of traditional blue link results.

Search Console then counts all these placements under the same metric. This merging of disparate result types means average position can be dragged toward worse numbers by low-impact features, or artificially improved by high-visibility but low-traffic placements.

It is time to retire average position as a primary organic key performance indicator (KPI) and adopt a more nuanced set of metrics focused on authentic engagement and conversions.

The Changing Landscape Of Search Features

Over the past decade, Google has transformed from a simple list of 10 blue links into a dynamic search results page packed with interactive elements.

Readers now see AI Overviews at the top of the page, generating concise summaries drawn from multiple sources. They also encounter AI Mode, which combines machine-generated insights with standard links.

Further down, they may find knowledge panels that present quick facts and structured data, People Also Ask widgets that prompt deeper exploration, video snippets that surface relevant clips, local packs showing nearby businesses, and image or news carousels that encourage visual browsing.

Each new format alters user behavior and fragments the attention once reserved for blue link results. The result is a declining share of clicks on traditional listings, which makes a simple ranking metric far less meaningful.

How AI Overviews And AI Mode Inflate Your Average Position

Google Search Console now assigns AI Overviews the same rank position value as the very top link in the organic listings.

If your page features in that AI Overview box at position one and simultaneously ranks at position four in the blue links, your average position works out to around 2.5.

That figure suggests a page one presence, even though most traffic still comes from the standard link at position four.
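A toy calculation makes the distortion concrete. This is a sketch of the scenario above, assuming the two placements are simply averaged; the click counts are hypothetical:

```python
# Toy model of blended placements for a single query.
# Assumption (per the scenario above): Search Console averages the
# reported position across placements; all numbers are hypothetical.

placements = [
    {"type": "AI Overview", "position": 1, "clicks": 2},
    {"type": "Blue link",   "position": 4, "clicks": 30},
]

avg_position = sum(p["position"] for p in placements) / len(placements)
total_clicks = sum(p["clicks"] for p in placements)

print(f"average position: {avg_position}")  # 2.5 -- looks like a page-one win
for p in placements:
    print(f'{p["type"]}: {p["clicks"] / total_clicks:.0%} of clicks')
```

Despite the page-one average, roughly 94% of the clicks in this sketch come from the position-four blue link, which is exactly the gap the headline metric hides.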

In older versions of Search Console, rare placements, such as a query slot at position 12 in a People Also Ask box, would drag your average position toward a worse number.

Now, those obscure placements are balanced or outweighed by top-heavy AI features.

The overall metric becomes distorted. An average position of two may feel like a genuine page one victory, but it offers little insight into where user clicks land.

Why This Matters For Your SEO Strategy

An inflated average position can mislead stakeholders into believing content is performing better than it is. A marketing dashboard reporting an average rank of 2.3 will create confidence in page one visibility.

Resources may shift away from high-value keywords that sit at positions five to 10 but deliver strong conversion rates.

Teams might pour effort into optimizing for AI Overviews or AI Mode triggers that look impressive in reports yet generate few real visits.

Over time, this misplaced focus undermines return on investment. Budgets skew toward vanity improvements rather than actions that drive tangible engagement, leads, and sales.

If click-through rate and traffic volumes stay flat or decline despite a rising average position, you risk missing warning signs until revenues slip.

Metrics To Focus On Instead

To gain an accurate picture of SEO performance, we must unbundle average position.

Classify your rankings by feature type. Separate blue link placements from AI Overviews, People Also Ask entries, video snippets, local packs, and other rich features.

Generate click-through rates for each segment so you can see where users engage. Measure absolute organic traffic for top queries and compare that with historical baselines.

Analyze time on page to understand content resonance. Most importantly, connect behavior data to conversions or goal completions. This end-to-end view shows whether search visibility translates into business value.

Another way to reduce distortion is to use percentile-based position metrics. The median position or P50 gives the midpoint ranking across all queries. It is not swayed by a few very high or very low positions.

The 90th percentile or P90 shows the position below which 90% of your rankings fall.

Charting P50 and P90 over time highlights trend directions with less noise from outliers.

You can also calculate a trimmed mean by excluding the top and bottom 5% of positions. Any of these approaches will provide a steadier reading of where your pages stand in the SERP landscape.
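These percentile metrics can be computed with the Python standard library. In this sketch the position list is hypothetical, and P90 uses the nearest-rank method:

```python
import math
import statistics

# Hypothetical per-query positions exported from Search Console.
positions = [1, 2, 2, 3, 3, 4, 5, 6, 8, 12, 28, 45]
ranked = sorted(positions)

# P50 (median): the midpoint ranking, unaffected by extreme outliers.
p50 = statistics.median(ranked)

# P90 (nearest-rank method): position below which 90% of rankings fall.
p90 = ranked[math.ceil(0.9 * len(ranked)) - 1]

# Trimmed mean: drop roughly the top and bottom 5% of positions.
k = round(0.05 * len(ranked))  # values to trim from each end
trimmed = ranked[k:len(ranked) - k] if k else ranked
trimmed_mean = statistics.mean(trimmed)

print(p50, p90, trimmed_mean)   # 4.5 28 7.3
print(statistics.mean(ranked))  # ~9.92: the plain mean, pulled up by outliers
```

Here the plain mean (~9.92) is dragged toward the two outlier rankings, while the median (4.5) and trimmed mean (7.3) give a steadier picture of typical visibility.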

Putting It Into Practice

First, export your Google Search Console data for the period or keywords you wish to analyze.

Add a feature tag to each query to mark whether it appeared as a blue link, AI Overview, People Also Ask entry, or other rich element/special content result block (SCRB).

Many SEO tools now include feature filters to automate this step.

Once tagging is complete, calculate the click-through rate for each feature type by dividing clicks by impressions for that feature. Compare click-through rates to identify which formats drive engagement and which only inflate visibility.
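A minimal sketch of that per-feature segmentation follows. The field names and numbers are hypothetical, not the actual GSC export schema:

```python
from collections import defaultdict

# Hypothetical rows from a feature-tagged Search Console export.
rows = [
    {"query": "blue widgets", "feature": "blue_link",   "clicks": 120, "impressions": 3000},
    {"query": "blue widgets", "feature": "ai_overview", "clicks": 4,   "impressions": 2500},
    {"query": "widget sizes", "feature": "paa",         "clicks": 9,   "impressions": 1800},
    {"query": "widget sizes", "feature": "blue_link",   "clicks": 60,  "impressions": 1500},
]

# Aggregate clicks and impressions per feature type, then derive CTR.
totals = defaultdict(lambda: {"clicks": 0, "impressions": 0})
for row in rows:
    totals[row["feature"]]["clicks"] += row["clicks"]
    totals[row["feature"]]["impressions"] += row["impressions"]

ctr = {f: t["clicks"] / t["impressions"] for f, t in totals.items()}
for feature, rate in sorted(ctr.items(), key=lambda kv: -kv[1]):
    print(f"{feature}: {rate:.2%}")
```

In this toy data, blue links earn a 4% CTR while the AI Overview placement earns 0.16%, the kind of contrast a blended average position never surfaces.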

Total the organic clicks and analyze sessions to identify the content that earns sustained visits.

Update your dashboards to reflect these new metrics. Replace an overall average position chart with a histogram showing the distribution of rankings by feature.

Include a bar chart of click-through rates for each result type. Display time series graphs of organic sessions and goal completions to link visibility improvements back to conversions. Keep these graphs simple and focused on actionable insights.

For optimization, focus on tactics that boost click-through rate and conversion paths.

For blue link results, refine title tags and meta descriptions to create a stronger call to action.

Use structured data markup so that when your page appears in People Also Ask or as a video snippet, the preview offers more context. Review the content that underpins AI Overviews.

Make sure your page answers core user questions in clear headings and concise paragraphs so the generative model can source accurate summaries.

Where gaps exist between AI Overview content and user needs, create or expand sections to fill them.

Continuously iterate by filtering your data for high-value keywords and checking whether the AI features you trigger align with intent and deliver clicks.

In larger organizations, you may need to educate stakeholders on the limitations of average position.

Share before and after views of dashboards to show how the metric shifted once AI features entered the mix.

Walk through specific examples, such as a page that jumped from position five to an AI Overview at position one, yet saw no change in traffic.

Demonstrations like these will build consensus around moving to feature-based and engagement metrics that drive tangible business outcomes.

Summary

Generative AI features in Google Search represent a fundamental shift in how search results appear.

Average position once served as a valuable proxy for visibility, and it was one of the only first-party data sources to provide that proxy. It now obscures more than it reveals.

By breaking performance down by feature type, measuring click-through rates and conversions, and adopting percentile-based ranking metrics, you can cut through the noise.

This richer approach reveals what matters to your users and your bottom line. In the new era of search, a deeper, more actionable analysis will be your key to sustained SEO success.

Featured Image: Roman Samborskyi/Shutterstock

Google Answers Question About Structured Data And Logged Out Users via @sejournal, @martinibuster

Someone asked whether it’s okay to show different content to logged-out users than to logged-in users and to Google via structured data. The answer from Google’s John Mueller was unequivocal.

This is the question that was asked:

“Will this markup work for products in a unauthenticated view in where the price is not available to users and they will need to login (authenticate) to view the pricing information on their end? Let me know your thoughts.”

John Mueller answered:

“If I understand your use-case, then no. If a price is only available to users after authentication, then showing a price to search engines (logged out) would not be appropriate. The markup should match what’s visible on the page. If there’s no price shown, there should be no price markup.”

What’s The Problem With That Structured Data?

The price is visible to logged-in users, so technically the content (in this case, the product price) is available to those users. It’s a good question because a case can be made that the content shown to Google is available, much like content behind a paywall; here the barrier is simply a login.

But that’s not good enough for Google, and it isn’t really comparable to a paywall because they are two different things. Google judges what “on the page” means based on what logged-out users will see on the page.

Google’s guideline about the structured data matching what’s on the page is unambiguous:

“Don’t mark up content that is not visible to readers of the page.

…Your structured data must be a true representation of the page content.”
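As an illustration of that guideline, the markup can simply mirror what the rendered page shows, including a price only when the viewer would actually see one. This is a sketch with hypothetical field values, not a prescribed implementation:

```python
import json

# Sketch: build Product structured data that matches the rendered page.
# Field values are hypothetical; the point is that the "offers" block
# (the price) appears only when the page actually displays a price.

def product_jsonld(product: dict, price_visible: bool) -> dict:
    data = {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": product["name"],
    }
    if price_visible:  # e.g., the user is authenticated and sees the price
        data["offers"] = {
            "@type": "Offer",
            "price": product["price"],
            "priceCurrency": product["currency"],
        }
    return data

item = {"name": "Widget Pro", "price": "99.00", "currency": "USD"}
logged_out = product_jsonld(item, price_visible=False)
logged_in = product_jsonld(item, price_visible=True)

print(json.dumps(logged_out, indent=2))  # no "offers": markup matches the page
```

For the logged-out view, the emitted JSON-LD contains no price at all, which is exactly what Mueller’s answer calls for.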

This is a question that gets asked fairly frequently on social media and in forums so it’s good to go over it for those who might not know yet.

Read More

Confirmed CWV Reporting Glitch In Google Search Console

Google’s New Graph Foundation Model Improves Precision By Up To 40X

Featured Image by Shutterstock/ViDI Studio

Brave Search API Now Available Through AWS Marketplace via @sejournal, @martinibuster

Brave Search and Amazon Web Services (AWS) announced the availability of the Brave Search API in the new AI Agents and Tools category of the AWS Marketplace.

AI Agents And Tools Category Of AWS Marketplace

AWS is entering the AI agent space with a new marketplace category that enables customers to select from hundreds of AI agents and tools.

According to the AWS announcement:

“With this launch, AWS Marketplace becomes a single destination where customers can find everything needed for successful AI agent implementations. This includes not just agents themselves, but also the critical components that make agents truly valuable—knowledge bases that power them with relevant data, third-party guardrails that enhance security, professional services to support implementation, and deployment options that enable agents to seamlessly interoperate with existing software.”

Customers can choose pay-as-you-go pricing or a monthly or yearly plan.

Brave Search

Brave is an independent, privacy-focused search engine. The Brave Search API provides LLMs with real-time data, can power agentic search, and can be used to build applications that need access to the internet.

The Brave Search API already supplies many of the top LLMs with up-to-date search data.

According to Brian Brown, Chief Business Officer at Brave Software:

“By offering the Brave Search API in AWS Marketplace, we’re providing customers with a streamlined way to access the only independent search API in the market, helping them buy and deploy agent solutions faster and more efficiently. Our customers in foundation models, search engines, and publishing are already using these capabilities to power their chatbots, search grounding, and research tools, demonstrating the real-world value of the only commercially-available search engine API at the scale of the global Web.”

Featured Image by Shutterstock/Deemerwha studio