The Download: training robots with gen AI, and the state of climate tech

This is today’s edition of The Download, our weekday newsletter that provides a daily dose of what’s going on in the world of technology.

AI-generated images can teach robots how to act

Generative AI models can produce images in response to prompts within seconds, and they’ve recently been used for everything from highlighting their own inherent bias to preserving precious memories.

Now, researchers from Stephen James’s Robot Learning Lab in London are using image-generating AI models for a new purpose: creating training data for robots. They’ve developed a new system, called Genima, that fine-tunes the image-generating AI model Stable Diffusion to draw robots’ movements, helping guide them both in simulations and in the real world. 

Genima could make it easier to train different types of robots to complete tasks—machines ranging from mechanical arms to humanoid robots and driverless cars—as well as making AI web agents more useful. Read the full story.

—Rhiannon Williams

These 15 companies are innovating in climate tech

We’ve just unveiled our 2024 list of 15 Climate Tech Companies to Watch. This annual project is one the climate team at MIT Technology Review pours a lot of time and thought into, and we’re thrilled to finally share it with you.

Our goal is to spotlight businesses we believe could help make a dent in climate change. This year’s list includes companies from a wide range of industries, headquartered on five continents. If you haven’t checked it out yet, I highly recommend giving it a look. Each company has a profile in which we’ve outlined why it made the list, what sort of impact the business might have, and what challenges it’s likely to face. 

Casey Crownhart, our senior climate reporter, has dug into what these pioneering businesses reveal about the race to address climate change. Read about what she found out here.

This story is from The Spark, our weekly climate and energy newsletter. Sign up to receive it in your inbox every Wednesday.

The must-reads

I’ve combed the internet to find you today’s most fun/important/scary/fascinating stories about technology.

1 OpenAI has been valued at an eye-watering $157 billion
A new funding round has made it one of the most valuable startups of all time. (WP $)
+ The company has urged investors to avoid funding rival AI firms. (FT $)
+ The secret to OpenAI’s fundraising success? Its extremely capable CFO. (The Information $)

2 Chipmakers are keeping a close eye on two North Carolina mines
Hurricane Helene has forced production to grind to a halt. (Bloomberg $)
+ The mines contain high-purity quartz, which is essential for making chips. (Vox)

3 Hacking Meta’s smart glasses turns them into powerful doxxing tools
Students equipped the device with real-time facial recognition software. (404 Media)
+ The coolest thing about smart glasses is not the AR. It’s the AI. (MIT Technology Review)

4 American chips are powering Russian missiles
The deadly weapons are killing Ukrainian civilians, including a six-year-old girl. (Bloomberg $)

5 Character.ai is pivoting away from making AI models
Ultimately, training LLMs proved to be too expensive. (FT $)
+ Make no mistake—AI is owned by Big Tech. (MIT Technology Review)

6 Apple is punishing social apps
They’re no longer allowed to access a user’s contact list. (NYT $)
+ Threads is letting users connect with other social networks for the first time. (WP $)

7 Flying cars are hovering in a gray legal area
Today’s eVTOLs are technically breaking the law, and it’s hard to see that changing. (NY Mag $)
+ These aircraft could change how we fly. (MIT Technology Review)

8 Workplace AI tools can’t always be trusted
Make sure you’re aware of when it’s still writing a transcript, for one. (WP $)
+ You should think twice about sharing personal info with chatbots, too. (The Atlantic $)

9 How to boost the benefits of meditation
Stimulating the brain could help to unlock the mysteries of the mind. (Vox)
+ Here’s how personalized brain stimulation could treat depression. (MIT Technology Review)

10 This video game birthed a generation of historians 📜 
Age of Empires is a classic that defined a genre. (The Guardian)

Quote of the day

“We have some stock in Nvidia, and that’s who’s going to get all of this money anyway.”

—A venture capitalist who didn’t participate in OpenAI’s massive funding round explains to Axios’ business editor Dan Primack why they don’t have FOMO.

The big story

This town’s mining battle reveals the contentious path to a cleaner future

January 2024

In June last year, Talon, an exploratory mining company, submitted a proposal to Minnesota state regulators to begin digging up as much as 725,000 metric tons of raw ore per year, mainly to unlock the rich and lucrative reserves of high-grade nickel in the bedrock.

Talon is striving to distance itself from the mining industry’s dirty past, portraying its plan as a clean, friendly model of modern mineral extraction. It proclaims the site will help to power a greener future for the US by producing the nickel needed to manufacture batteries for electric cars and trucks, but with low emissions and light environmental impacts.

But as the company has quickly discovered, a lot of locals aren’t eager for major mining operations near their towns. Read the full story.

—James Temple

We can still have nice things

A place for comfort, fun and distraction to brighten up your day. (Got any ideas? Drop me a line or tweet ’em at me.)

+ Watch out, Nibi the adorable beaver’s about! 🦫 (Thanks Alice!)
+ If you’ve never seen one man sing both sides of Phantom of the Opera before, now you have.
+ TV doesn’t come much more unhinged than Love Is Blind (if you haven’t seen it, you’re in for a treat).
+ How to catch a glimpse of the Comet A3, also known as Comet Tsuchinshan-ATLAS.

People are using Google study software to make AI podcasts—and they’re weird and amazing

“All right, so today we are going to dive deep into some cutting-edge tech,” a chatty American male voice says. But this voice does not belong to a human. It belongs to Google’s new AI podcasting tool, called Audio Overview, which has become a surprise viral hit. 

The podcasting feature was launched in mid-September as part of NotebookLM, a year-old AI-powered research assistant. NotebookLM, which is powered by Google’s Gemini 1.5 model, allows people to upload content such as links, videos, PDFs, and text. They can then ask the system questions about the content, and it offers short summaries. 

The tool generates a podcast called Deep Dive, which features a male and a female voice discussing whatever you uploaded. The voices are breathtakingly realistic—the episodes are laced with little human-sounding phrases like “Man” and “Wow” and “Oh right” and “Hold on, let me get this right.” The “hosts” even interrupt each other. 

To test it out, I copied every story from MIT Technology Review’s 125th-anniversary issue into NotebookLM and made the system generate a 10-minute podcast with the results. The system picked a couple of stories to focus on, and the AI hosts did a great job at conveying the general, high-level gist of what the issue was about. Have a listen.

The AI system is designed to create “magic in exchange for a little bit of content,” Raiza Martin, the product lead for NotebookLM, said on X. The voice model is meant to create emotive and engaging audio, which is conveyed in an “upbeat hyper-interested tone,” Martin said.

NotebookLM, which was originally marketed as a study tool, has taken on a life of its own among users. The company is now working on adding more customization options, such as changing the length, format, voices, and languages, Martin said. Currently it’s supposed to generate podcasts only in English, but some users on Reddit have managed to get the tool to create audio in French and Hungarian.

Yes, it’s cool—bordering on delightful, even—but it is also not immune from the problems that plague generative AI, such as hallucinations and bias. 

Here are some of the main ways people are using NotebookLM so far. 

On-demand podcasts

Andrej Karpathy, a member of OpenAI’s founding team and previously the director of AI at Tesla, said on X that Deep Dive is now his favorite podcast. Karpathy created his own AI podcast series called Histories of Mysteries, which aims to “uncover history’s most intriguing mysteries.” He says he researched topics using ChatGPT, Claude, and Google, and used a Wikipedia link from each topic as the source material in NotebookLM to generate audio. He then used NotebookLM to generate the episode descriptions. The whole podcast series took him two hours to create, he says. 

“The more I listen, the more I feel like I’m becoming friends with the hosts and I think this is the first time I’ve actually viscerally liked an AI,” he wrote. “Two AIs! They are fun, engaging, thoughtful, open-minded, curious.” 

Study guides

The tool shines when it is given complicated source material that it can describe in an easily accessible way. Allie K. Miller, a startup AI advisor, used the tool to create a study guide and summary podcast of F. Scott Fitzgerald’s The Great Gatsby.

Machine-learning researcher Aaditya Ura fed the code base of Meta’s Llama-3 architecture into NotebookLM. He then used another AI tool to find images that matched the transcript to create an educational video. 

Mohit Shridhar, a research scientist specializing in robotic manipulation, fed a recent paper he’d written about using generative AI models to train robots into NotebookLM.

“It’s actually really creative. It came up with a lot of interesting analogies,” he says. “It compared the first part of my paper to an artist coming up with a blueprint, and the second part to a choreographer figuring out how to reach positions.”

Event summaries 

Alex Volkov, a human AI podcaster, used NotebookLM to create a Deep Dive episode summarizing the announcements from OpenAI’s global developer conference, Dev Day. 

Hypemen

The Deep Dive outputs can be unpredictable, says Martin. For example, Thomas Wolf, the cofounder and chief science officer of Hugging Face, tested the AI model on his résumé and received eight minutes of “realistically-sounding deep congratulations for your life and achievements from a duo of podcast experts.”

Just pure silliness

In one viral clip, someone managed to send the two voices into an existential spiral when they “realized” they were, in fact, not humans but AI systems. The video is hilarious. 

The tool is also good for some laughs. Exhibit A: Someone just fed it the words “poop” and “fart” as source material, and got over nine minutes of two AI voices analyzing what this might mean. 

The problems

NotebookLM created amazingly realistic-sounding and engaging AI podcasts. But I wanted to see how it fared with toxic content and accuracy. 

Let’s start with hallucinations. In one AI podcast version of a story I wrote on hyperrealistic AI deepfakes, the AI hosts said that a journalist called “Jess Mars” wrote the story. In reality, this was an AI-generated character from a story I had to read out to record data for my AI avatar. 

This made me wonder what other mistakes had crept into the AI podcasts I had generated. Humans already have a tendency to trust what computer programs say, even when they are wrong. I can see this problem being amplified when the false statements are made by a friendly and authoritative voice, causing wrong information to proliferate.    

Next I wanted to put the tool’s content moderation to the test. I added some toxic content, such as racist stereotypes, into the mix. The model did not pick it up. 

I also pasted an excerpt from Adolf Hitler’s Mein Kampf into NotebookLM. To my surprise, the model started generating audio based on it. Despite being programmed to be hyper-enthusiastic about topics, the AI voices expressed clear disgust and discomfort with the text, and they added a lot of context to highlight how problematic it was. What a relief.

I also fed NotebookLM policy manifestos from both Kamala Harris and Donald Trump.

The hosts were far more enthusiastic about Harris’s election platform, calling the title “catchy” and saying its approach was a good way to frame things. For example, the AI hosts supported Harris’s energy policy. “Honestly, that’s the kind of stuff people can really get behind—not just some abstract policy, but something that actually impacts their bottom line,” the female host said. 

For Trump, the AI hosts were more skeptical. They repeatedly pointed out inconsistencies in the policy proposals, called the language “intense,” deemed certain policy proposals “head scratchers,” and said the text catered to Trump’s base. They also asked whether Trump’s foreign policy could lead to further political instability. 

In a statement, a Google spokesperson said: “NotebookLM is a tool for understanding, and the Audio Overviews are generated based on the sources that you upload. Our products and platforms are not built to favor any specific candidates or political viewpoints.”

How to try it yourself

  1. Go to NotebookLM and create a new notebook. 
  2. You first need to add a source. It can be a PDF document, a public YouTube link, an MP3 file, a Google Docs file, or a link to a website, or you can paste in text directly. 
  3. A “Notebook Guide” pop-up should appear. If not, it’s in the right-hand corner next to the chat. This will display a short AI-generated summary of your source material and suggested questions you can ask the AI chatbot about it. 
  4. The Audio Overview feature is in the top-right corner. Click “Generate.” This should take a few minutes. 
  5. Once it is ready, you can either download it or share a link. 

Rhiannon Williams contributed reporting.

New Ecommerce Tools: October 3, 2024

Every week we publish a list of new products from companies offering services to ecommerce merchants. This installment includes updates on search ad campaigns, financing, collecting reviews, supply-chain management, and cross-border shipping.

Got an ecommerce product release? Email releases@practicalecommerce.com.

New Tools for Merchants

TikTok introduces Search Ads Campaign. TikTok’s new Search Ads Campaign enables keyword-based ads that target TikTok’s search results page. Brands control how their content appears in search results, ensuring the right creative reaches the right users. The campaign type supports traffic and web conversion objectives, enabling advertisers to optimize for scale and performance, and it accepts both video and image carousel assets.

Amazon launches fully managed supply chain service. The fully managed Supply Chain by Amazon service is now available. Sellers can continue to choose supply chain services to manage product movement or have Amazon automate movement with a fully managed option. Sellers provide product details and pickup locations, and Amazon oversees the rest, including carrier pickup, inventory consolidation, placement into fulfillment centers, and optimization based on demand, inventory, and costs.

Marketing portal GetResponse launches content monetization platform. GetResponse, a global email and digital marketing portal, has launched its content monetization platform, empowering online educators, solopreneurs, and creators with tools to create, market, and monetize their content. The platform’s features include an AI course creator, webinars, premium newsletters, advanced email marketing, and automation. The platform will launch with a lineup of courses from industry leaders alongside content from GetResponse’s brand evangelist.

Selfridges partners with Criteo on sponsored products. Criteo, a retargeting platform, has partnered with Selfridges, a luxury department store. Criteo is enabling sponsored products across Selfridges’ online properties and powering its onsite display. This enables Selfridges’ brand partners to access retail networks enhanced through Criteo’s measurement capabilities and comprehensive reporting in near real-time. Brand partners can create personalized, full-funnel customer engagements for all shoppers.

Trustpilot integrates with HubSpot to help businesses collect reviews. Trustpilot, a consumer review platform, is now listed in the HubSpot App Marketplace. The integration is intended to streamline review collection, improve customer experiences, drive conversions, and build brand trust. According to Trustpilot, by automating review invitations at critical moments in the customer journey, businesses can capture quality feedback, strengthening relationships and enhancing credibility. Additionally, businesses can leverage reviews at scale through Trustpilot’s marketing assets.

eBay partners with Liberis on Flexible Cash Advance. eBay has partnered with Liberis, an embedded finance platform, to expand its Seller Capital program in the U.S. with the introduction of Flexible Cash Advance. According to eBay, the option provides sellers with quick access to $5,000 to $2 million with the flexibility to unlock even more as they pay.

Ecommerce fulfillment provider ShipTop closes seed round funding. ShipTop, a platform for ecommerce logistics, has announced closing a $500,000 seed round led by Dino Verbrugge and Jared Vegosen, co-founders of DV Trading Company, a Chicago-based securities firm. With this capital, ShipTop plans to enhance its technology platform, expand service offerings, and reach a broader market of ecommerce merchants. This funding round will also enable ShipTop to continue its mission of providing sustainable and eco-friendly shipping solutions.

ChannelEngine integrates with SAP order management services. ChannelEngine, a provider of marketplace integration software, is now available on the SAP Store. According to SAP, this integration enables order management efficiency for marketplace selling while leveraging advanced AI capabilities to optimize multichannel operations. The integration with SAP’s order management services allows merchants to automatically synchronize inventory, receive orders, update shipments, and manage cancellations across 950 marketplaces, social commerce, and third-party channels.

DHL Global Forwarding introduces a cross-border ecommerce platform in China. DHL Global Forwarding, the freight specialist of DHL Group, is introducing a variety of cross-border ecommerce services ahead of the holiday season. The solutions will offer cross-border shipping from China with different service levels and features, as well as an integrated tracking platform for end-to-end visibility. The ecommerce solutions feature access to over 10,000 local specialists, simple integration options, fully managed customs clearance, and expedited service to other markets.

Klarna and Xero bring buy now, pay later to small businesses. Klarna, a payments service, is working with the accounting platform Xero to help small businesses accept payments from consumers who want a buy-now, pay-later option. For Xero’s customers, offering Klarna’s alternatives to traditional credit means they can get paid upfront without high interest rates and hidden fees. Klarna provides an alternative to credit cards with controls that enable responsible spending.

Ecommerce solutions provider Carbon6 joins Amazon’s embedded third-party apps program. Carbon6, a provider of Amazon seller tools, has announced the inclusion of its advertising feature PixelMe by Carbon6, in Amazon’s embedded third-party apps program. With PixelMe embedded in Seller Central, sellers can track advertising costs across multiple channels — Amazon, social media, and search platforms — all from a customizable dashboard. This integration also unlocks opportunities for sellers to optimize advertising campaigns through improved keyword recommendations and enhanced control over external traffic drivers.

Google Expands Structured Data Support For Product Certifications

Google has announced an update to its Search Central documentation, introducing support for certification markup in product structured data.

This change will take full effect in April and aims to provide more comprehensive and globally relevant product information.

New Certification Markup For Merchant Listings

Google has added Certification markup support for merchant listings in its product structured data documentation.

This addition allows retailers and ecommerce sites to include detailed certification information about their products.

Transition From EnergyConsumptionDetails to Certification Type

A key aspect of this update is replacing the EnergyConsumptionDetails type with the more versatile Certification type.

The new type can support a wider range of countries and broader certifications.

Google recommends that websites using EnergyConsumptionDetails in their structured data switch to the Certification type before April.

This will ensure product pages remain optimized for Google’s merchant listing experiences.

Expanded Capabilities & Global Relevance

The move to the Certification type represents an expansion in the types of product certifications that can be communicated through structured data.

While energy efficiency ratings were a primary focus of the EnergyConsumptionDetails type, the new Certification markup can encompass a much wider array of product certifications and standards.

This change is relevant for businesses operating in multiple countries, as it allows for more nuanced and locally applicable certification information to be included.

Implementation Guidelines

Google has provided examples in its updated documentation to guide webmasters in implementing the new Certification markup.

These examples include specifying certifications such as CO2 emission classes for vehicles and energy efficiency labels for electronics.

The structured data should be added to product pages using JSON-LD format, with the Certification type nested within the product’s structured data.
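As a minimal sketch (the product name and identifier here are invented, and the exact properties Google honors should be verified against the updated documentation), a certification nested within product markup might look like this:

```json
{
  "@context": "https://schema.org/",
  "@type": "Product",
  "name": "Example refrigerator",
  "hasCertification": {
    "@type": "Certification",
    "issuedBy": {
      "@type": "Organization",
      "name": "European_Commission"
    },
    "name": "EPREL",
    "certificationIdentification": "123456"
  }
}
```

The Certification object replaces what would previously have been expressed with EnergyConsumptionDetails, which is why a single nested type can now cover energy labels, CO2 emission classes, and other standards.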

Review the full documentation to ensure proper implementation.

Including certification information in structured data could lead to more informative product listings, potentially influencing user click-through rates and purchase decisions.

For consumers, this update means access to more detailed and standardized product information directly in search results, particularly regarding certifications and compliance with various standards.

Next Steps

Website owners and SEO professionals should take the following steps:

  1. Review current use of EnergyConsumptionDetails in product structured data.
  2. Plan for the transition to the Certification type before April.
  3. Implement the new Certification markup on product pages, following Google’s guidelines.
  4. Test the implementation using Google’s Rich Results Test tool.

As with any significant change to structured data implementation, it is advisable to monitor search performance and rich result appearances after making these updates.



Google Officially Launches Ads In AI Overviews

The much anticipated launch of ads in AI Overviews is officially live.

Originally announced at May’s Google Marketing Live event, the format is now available for mobile users in the United States.

Through several months of testing, early findings show that ads in AI Overviews help quickly connect users with brands at exactly the right moment.

How It Works

When a user searches for something in Google, a new ad format is shown in the AI Overview section along with other helpful content that is relevant to the query.

If a user asks a question, their AI Overviews results may include helpful solutions, articles related to their search, or a new Shopping ad format.

Ads shown in AI Overviews will still carry the standard “Sponsored” label to denote an ad.

What This Means For Advertisers

For those already running Shopping, Performance Max, or AI-powered Search ads, your ads are now eligible to serve in those placements.

No other action is needed on your part to show ads in AI Overviews.

Considering AI Overviews are often the first experience a user sees when searching on Google, this can be a great way to get in front of your audience sooner.

As other ad types are now being pushed further down the page, capturing as many relevant ad placements as you can will be crucial, especially during key research moments.

Google also mentioned in their announcement that having different types of information in AI Overviews can help alleviate the need for additional searches.

If that’s the case, we could see search trends or volume decrease over time, but it’s still too early to tell.

Serving ads in the AI Overviews section is an opportunity to introduce your brand and products earlier on during the customer journey, such as when they’re asking a question.

This can lead to better brand recall, keeping you top of mind in their decision-making process.

Looking Ahead

It’s not clear yet how advertisers will be able to segment campaign performance by these new placements.

Since campaign types like Search, Shopping and Performance Max are automatically eligible to show ads in AI Overviews, it’s also unclear if they will be able to opt out of showing in certain places, like they can for the Search Partner Network, for example.

We have reached out to the Google Ads Liaison for more information and will update this story as more becomes available.

Google Shopping Ads Now Available In Google Lens Results

Google is rolling out new Google Lens features to make visual shopping easier for users.

Starting this week, users will see detailed product information when searching with Google Lens.

This update gives advertisers an opportunity to showcase their products in a format that engages users in real time.

A New Results Page For Google Lens Searches

Rolling out today, Google Lens will show more prominent product information when the product is identified.

Users will now be able to see information for products like:

  • Price comparisons across retailers
  • Current deals on the product
  • Product reviews
  • Where to buy

This update is fueled by the Google Shopping Graph, and that’s where Google Shopping Ads come into play.

New Shopping Ads Placement In Time For The Holidays

With the new Google Lens results experience, Shopping ads will now be eligible to show in this placement.

While it’s rolling out today for both Android and iOS, it’s only available in select countries as of now.

Additionally, Google is prioritizing Shopping ad placements for top holiday categories including:

  • Toys
  • Electronics
  • Beauty

How Will This New Shopping Ads Placement Work?

Your Google Shopping ads will show up in the same format as they currently do for keyword or query searches.

The benefit is that users are already familiar with that format and are more likely to engage with it.

As visual search continues to become more prominent, it’s more important than ever to keep an up-to-date product feed with relevant titles, product descriptions, and accurate details in order to show for Google Lens searches.

If you’re an advertiser, no additional action is required to show up for the new Google Lens shopping ads experience.

Looking Ahead

Since the announcement is new and the experience is rolling out today, it’s not available to all countries or product categories yet.

Google will continue to roll out this experience to other product categories throughout Q4 and into 2025 as it evolves.

There’s no word yet on the type of reporting advertisers will have access to with this new placement. We’ll continue to update as more information becomes available.

SEO Reinvented: Responding To Algorithm Shifts

A lot has been said about the remarkable opportunities of Generative AI (GenAI), and some of us have also been extremely vocal about the risks associated with using this transformative technology.

The rise of GenAI presents significant challenges to the quality of information, public discourse, and the general open web. GenAI’s power to predict and personalize content can be easily misused to manipulate what we see and engage with.

Generative AI search engines are contributing to the overall noise, and rather than helping people find the truth and forge unbiased opinions, they tend (at least in their present implementation) to promote efficiency over accuracy, as highlighted by a recent study by Jigsaw, a unit inside Google.

Despite the hype surrounding SEO alligator parties and content goblins, our generation of marketers and SEO professionals has spent years working towards a more positive web environment.

We’ve shifted the marketing focus from manipulating audiences to empowering them with knowledge, ultimately aiding stakeholders in making informed decisions.

Creating an ontology for SEO is a community-led effort that aligns perfectly with our ongoing mission to shape, improve, and provide directions that truly advance human-GenAI interaction while preserving content creators and the Web as a shared resource for knowledge and prosperity.

Traditional SEO practices in the early 2010s focused heavily on keyword optimization. This included tactics like keyword stuffing, link schemes, and creating low-quality content primarily intended for search engines.

Since then, SEO has shifted towards a more user-centric approach. The Hummingbird update (2013) marked Google’s transition towards semantic search, which aims to understand the context and intent behind search queries rather than just the keywords.

This evolution has led SEO pros to focus more on topic clusters and entities than individual keywords, improving content’s ability to answer multiple user queries.

Entities are distinct items like people, places, or things that search engines recognize and understand as individual concepts.

By building content that clearly defines and relates to these entities, organizations can enhance their visibility across various platforms, not just traditional web searches.

This approach ties into the broader concept of entity-based SEO, which ensures that the entity associated with a business is well-defined across the web.

Fast-forward to today, static content that aims to rank well in search engines is constantly transformed and enriched by semantic data.

This involves structuring information so that it is understandable not only by humans but also by machines.

This transition is crucial for powering Knowledge Graphs and AI-generated responses like those offered by Google’s AIO or Bing Copilot, which provide users with direct answers and links to relevant websites.

As we move forward, the importance of aligning content with semantic search and entity understanding is growing.

Businesses are encouraged to structure their content in ways that are easily understood and indexed by search engines, thus improving visibility across multiple digital surfaces, such as voice and visual searches.

The use of AI and automation in these processes is increasing, enabling more dynamic interactions with content and personalized user experiences.

Whether we like it or not, AI will help us compare options faster, run deep searches effortlessly, and make transactions without passing through a website.

The future of SEO is promising. The SEO service market size is expected to grow from $75.13 billion in 2023 to $88.91 billion in 2024 – a staggering CAGR of 18.3% (according to The Business Research Company) – as it adapts to incorporate reliable AI and semantic technologies.
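As a quick sanity check on the quoted figures (both taken from the report cited above), the 18.3% rate is simply the year-over-year growth implied by the two market-size estimates:

```python
# Year-over-year growth implied by the two market-size estimates
# quoted from The Business Research Company.
size_2023 = 75.13  # USD billions
size_2024 = 88.91  # USD billions

growth_pct = (size_2024 / size_2023 - 1) * 100
print(f"{growth_pct:.1f}%")  # prints "18.3%"
```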

These innovations support the creation of more dynamic and responsive web environments that adeptly cater to user needs and behaviors.

However, the journey hasn’t been without challenges, especially in large enterprise settings. Implementing AI solutions that are both explainable and strategically aligned with organizational goals has been a complex task.

Building effective AI involves aggregating relevant data and transforming it into actionable knowledge.

This differentiates an organization from competitors using similar language models or development patterns, such as conversational agents or retrieval-augmented generation copilots, and enhances its unique value proposition.

Imagine an ontology as a giant instruction manual for describing specific concepts. In the world of SEO, we deal with a lot of jargon, right? Topicality, backlinks, E-E-A-T, structured data – it can get confusing!

An ontology for SEO is a giant agreement on what all those terms mean. It’s like a shared dictionary, but even better. This dictionary doesn’t just define each word. It also shows how they all connect and work together. So, “queries” might be linked to “search intent” and “web pages,” explaining how they all play a role in a successful SEO strategy.

Imagine it as untangling a big knot of SEO practices and terms and turning them into a clear, organized map – that’s the power of ontology!

While Schema.org is a fantastic example of a linked vocabulary, it focuses on defining specific attributes of a web page, like content type or author. It excels at helping search engines understand our content. But what about how we craft links between web pages?

What about the query a web page is most often searched for? These are crucial elements in our day-to-day work, and an ontology can be a shared framework for them as well. Think of it as a playground where everyone is welcome to contribute on GitHub similar to how the Schema.org vocabulary evolves.

The idea of an ontology for SEO is to augment Schema.org with an extension, similar to what GS1 did by creating its own vocabulary. So, is it a database? A collaboration framework? It is all of these things together: SEOntology operates like a collaborative knowledge base.

It acts as a central hub where everyone can contribute their expertise to define key SEO concepts and how they interrelate. By establishing a shared understanding of these concepts, the SEO community plays a crucial role in shaping the future of human-centered AI experiences.

SEOntology snapshot
Screenshot from WebVowl, August 2024. SEOntology: a snapshot (see an interactive visualization here).

The Data Interoperability Challenge In The SEO Industry

Let’s start small and review the benefits of a shared ontology with a practical example (here is a slide taken from Emilija Gjorgjevska’s presentation at this year’s ZagrebSEOSummit).

The data interoperability challenge. Image from Emilija Gjorgjevska’s presentation, ZagrebSEOSummit, August 2024

Imagine your colleague Valentina uses a Chrome extension to export data from Google Search Console (GSC) into Google Sheets. The data includes columns like “ID,” “Query,” and “Impressions” (as shown on the left). But Valentina collaborates with Jan, who’s building a business layer using the same GSC data. Here’s the problem: Jan uses a different naming convention (“UID,” “Name,” “Impressionen,” and “Klicks”).

Now, scale this scenario up. Imagine working with n different data partners, tools, and team members, all using various languages. The effort to constantly translate and reconcile these different naming conventions becomes a major obstacle to effective data collaboration.

Significant value gets lost in just trying to make everything work together. This is where an SEO ontology comes in. It is a common language, providing a shared name for the same concept across different tools, partners, and languages.

By eliminating the need for constant translation and reconciliation, an SEO ontology streamlines data collaboration and unlocks the true value of your data.
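A toy sketch of what that shared naming buys you: once each tool's columns are mapped to one canonical vocabulary (the `seovoc:` names below are hypothetical stand-ins for the ontology's terms), Valentina's and Jan's rows become directly comparable:

```python
# Sketch of how a shared vocabulary removes per-tool naming drift.
# The mapping is hypothetical; SEOntology would supply the canonical
# property names (e.g. seovoc:query, seovoc:impressions).
CANONICAL = {
    "ID": "seovoc:id", "UID": "seovoc:id",
    "Query": "seovoc:query", "Name": "seovoc:query",
    "Impressions": "seovoc:impressions", "Impressionen": "seovoc:impressions",
    "Clicks": "seovoc:clicks", "Klicks": "seovoc:clicks",
}

def normalize(row: dict) -> dict:
    """Rename tool-specific columns to their shared ontology terms."""
    return {CANONICAL.get(k, k): v for k, v in row.items()}

valentina = {"ID": 1, "Query": "buy shoes", "Impressions": 120}
jan = {"UID": 1, "Name": "buy shoes", "Impressionen": 120}

# After normalization, both rows speak the same language.
assert normalize(valentina) == normalize(jan)
```

The mapping table is written once, per tool, instead of being re-negotiated in every spreadsheet and pipeline.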

The Genesis Of SEOntology

In the last year, we have witnessed the proliferation of AI Agents and the wide adoption of Retrieval Augmented Generation (RAG) in all its different forms (Modular, Graph RAG, and so on).

RAG represents an important leap forward in AI technology, addressing a key limitation of traditional large language models (LLMs) by letting them access external knowledge.

Traditionally, LLMs are like libraries with one book – limited by their training data. RAG unlocks a vast network of resources, allowing LLMs to provide more comprehensive and accurate responses.

RAG improves factual accuracy and context understanding, potentially reducing bias. While promising, RAG faces challenges in data security, accuracy, scalability, and integration, especially in the enterprise sector.

For successful implementation, RAG requires high-quality, structured data that can be easily accessed and scaled.
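The retrieve-then-generate loop can be sketched in a few lines. This toy version scores documents by word overlap purely for illustration; a production system would retrieve via embeddings or a knowledge graph:

```python
# A toy Retrieval-Augmented Generation loop: retrieve the most relevant
# documents, then assemble them into the prompt sent to an LLM.
DOCS = [
    "Schema.org markup helps search engines understand page content.",
    "Internal links distribute authority across a website.",
    "RAG grounds LLM answers in external documents.",
]

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Rank documents by naive word overlap with the query; keep the top k."""
    q = set(query.lower().split())
    scored = sorted(docs, key=lambda d: len(q & set(d.lower().split())), reverse=True)
    return scored[:k]

def build_prompt(query: str) -> str:
    """Ground the LLM by prepending retrieved context to the question."""
    context = "\n".join(retrieve(query, DOCS))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

prompt = build_prompt("How do internal links help a website?")
```

The quality of the whole pipeline hinges on `retrieve()`, which is exactly the point made below about the "R" in RAG.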

We’ve been among the first to experiment with AI Agents and RAG powered by the Knowledge Graph in the context of content creation and SEO automation.

Agent WordLift. Screenshot from Agent WordLift, August 2023

Knowledge Graphs (KGs) Are Indeed Gaining Momentum In RAG Development

Microsoft’s GraphRAG and solutions like LlamaIndex demonstrate this. Baseline RAG struggles to connect information across disparate sources, hindering tasks requiring a holistic understanding of large datasets.

KG-powered RAG approaches like the one offered by LlamaIndex in conjunction with WordLift address this by creating a knowledge graph from website data and using it alongside the LLM to improve response accuracy, particularly for complex questions.

LlamaIndex in conjunction with WordLift. Image from author, August 2024

We have tested workflows with clients in different verticals for over a year.

From keyword research for large editorial teams to the generation of question and answers for ecommerce websites, from content bucketing to drafting the outline of a newsletter or revamping existing articles, we’ve been testing different strategies and learned a few things along the way:

1. RAG Is Overhyped

RAG is simply one of many development patterns for achieving a more complex goal. A RAG (or Graph RAG) pipeline is meant to help you save time finding an answer. It’s brilliant, but it doesn’t solve the marketing tasks a team must handle daily. You need to focus on the data and the data model.

While there are good RAGs and bad RAGs, the key differentiator is often the “R” part of the equation: retrieval. Retrieval is what separates a fancy demo from a real-world application, and behind a good RAG there is always good data. Good data, though, is not just any data (or any graph data).

It is built around a coherent data model that makes sense for your use case. If you build a search engine for wines, you need to get the best dataset and model the data around the features a user will rely on when looking for information.

So, data is important, but the data model is even more important. If you are building an AI Agent that has to do things in your marketing ecosystem, you must model the data accordingly. You want to represent the essence of web pages and content assets.

Only some data vs. good data. Image from author, August 2024

2. Not Everyone Is Great At Prompting

Expressing a task in written form is hard. Prompt engineering is moving at full speed toward automation (here is my article on going from prompting to prompt programming for SEO) because only a few experts can write the prompts that lead to the expected outcome.

This poses several challenges for designing the user experience of autonomous agents. Jakob Nielsen has been very vocal about the negative impact of prompting on the usability of AI applications:

“One major usability downside is that users must be highly articulate to write the required prose text for the prompts.”

Even in rich Western countries, statistics provided by Nielsen tell us that only 10% of the population can fully utilize AI! 

Simple prompt using Chain-of-Thought (CoT):

“Explain step-by-step how to calculate the area of a circle with a radius of 5 units.”

More sophisticated prompt combining Graph-of-Thought (GoT) and Chain-of-Knowledge (CoK):

“Using the Graph-of-Thought (GoT) and Chain-of-Knowledge (CoK) techniques, provide a comprehensive explanation of how to calculate the area of a circle with a radius of 5 units. Your response should:

1. Start with a GoT diagram that visually represents the key concepts and their relationships, including: circle, radius, area, pi (π), and the formula for the area of a circle.
2. Follow the GoT diagram with a CoK breakdown that: a) defines each concept in the diagram, b) explains the relationships between these concepts, and c) provides the historical context for the development of the circle area formula.
3. Present a step-by-step calculation process, including: a) stating the formula for the area of a circle, b) explaining the role of each component in the formula, c) showing the substitution of values, d) performing the calculation, and e) rounding the result to an appropriate number of decimal places.
4. Conclude with practical applications of this calculation in real-world scenarios.

Throughout your explanation, ensure that each step logically follows the previous one, creating a clear chain of reasoning from basic concepts to the final result.”

This improved prompt incorporates GoT by requesting a visual representation of the concepts and their relationships. It also employs CoK by asking for definitions, historical context, and connections between ideas. The step-by-step breakdown and real-world applications further enhance the depth and practicality of the explanation.

3. You Shall Build Workflows To Guide The User

The lesson learned is that we must build detailed standard operating procedures (SOP) and written protocols that outline the steps and processes to ensure consistency, quality, and efficiency in executing particular optimization tasks.

We can see empirical evidence of the rise of prompt libraries like the one offered to users of Anthropic models or the incredible success of projects like AIPRM.

In reality, we learned that what creates business value is a series of guided steps that help users translate the context they are navigating into a consistent task definition.

We can start to envision marketing tasks like conducting keyword research as a standard operating procedure that guides the user across multiple steps (here is how we approach the SOP for keyword discovery using Agent WordLift).

4. The Great Shift To Just-in-Time UX 

In traditional UX design, information is pre-determined and can be organized in hierarchies, taxonomies, and pre-defined UI patterns. As AI becomes the interface to the complex world of information, we’re witnessing a paradigm shift.

UI topologies tend to disappear, and the interaction between humans and AI remains predominantly dialogic. Just-in-time assisted workflows can help the user contextualize and improve a workflow.

You need to think in terms of business value creation, focus on the user’s interactive journey, and facilitate the interaction by creating a UX on the fly. Taxonomies remain a strategic asset, but they operate behind the scenes as the user is teleported from one task to another, as recently brilliantly described by Yannis Paniaras from Microsoft.

Image from “The Shift to Just-In-Time UX: How AI is Reshaping User Experiences” by Yannis Paniaras, August 2024

5. From Agents To RAG (And GraphRAG) To Reporting

Because the user needs a business impact and RAG is only part of the solution, the focus quickly shifts from more generic questions and answering user patterns to advanced multi-step workflows.

The biggest issue, though, is what outcome the user needs. If we increase the complexity to capture the highest business goals, it is not enough to, let’s say, “query your data” or “chat with your website.”

A client wants a report, for example, on the thematic consistency of content across the entire website (a concept that surfaced as siteRadius in Google’s massive data leak), an overview of seasonal trends across hundreds of paid campaigns, or a review of the optimization opportunities in the Google Merchant Center feed.

You must understand how the business operates and what deliverables a client will actually pay for. What concrete actions could boost the business? What questions need to be answered?

This is the starting point for creating a genuinely useful AI-assisted reporting tool.

How Can A Knowledge Graph (KG) Be Coupled With An Ontology For AI Alignment, Long-term Memory, And Content Validation?

The three guiding principles behind SEOntology:

  • Making SEO data interoperable to facilitate the creation of knowledge graphs while reducing unneeded crawls and vendor lock-in.
  • Infusing SEO know-how into AI agents using a domain-specific language.
  • Collaboratively sharing knowledge and tactics to improve findability and prevent misuse of Generative AI.

As soon as an SEO automation task involves at least two data sources, you will see the advantage of using SEOntology.

SEOntology As “The USB-C Of SEO/Crawling Data”

Standardizing data about content assets, products, user search behavior, and SEO insights is strategic. The goal is to have a “shared representation” of the Web as a communication channel.

Let’s take a step back. How does a search engine represent a web page? That is our starting point. Can we standardize how a crawler represents data extracted from a website? What are the advantages of adopting standards?

Practical Use Cases

Integration With Botify And Dynamic Internal Linking

Over the past few months, we’ve been working closely with the Botify team to create something exciting: a Knowledge Graph powered by Botify’s crawl data and enhanced by SEOntology. This collaboration is opening up new possibilities for SEO automation and optimization.

Leveraging Existing Data With SEOntology

Here’s the cool part: If you’re already using Botify, we can tap into that goldmine of data you’ve collected. No need for additional crawls or extra work on your part. We use the Botify Query Language (BQL) to extract and transform the needed data using SEOntology.

Think of SEOntology as a universal translator for SEO data. It takes the complex information from Botify and turns it into a format that’s not just machine-readable but machine-understandable. This allows us to create a rich, interconnected Knowledge Graph filled with valuable SEO insights.

What This Means for You

Once we have this Knowledge Graph, we can do some pretty amazing things:

  • Automated Structured Data: We can automatically generate structured data markup for your product listing pages (PLPs). This helps search engines better understand your content, potentially improving your visibility in search results.
  • Dynamic Internal Linking: This is where things get really interesting. We use the data in the Knowledge Graph to create smart, dynamic internal links across your site. Let me break down how this works and why it’s so powerful.

In the diagram below, we can also see how data from Botify can be blended with data from Google Search Console.

In most implementations, Botify already imports this data into its crawl projects; when that is not the case, we can trigger a new API request and import clicks, impressions, and positions from GSC into the graph.
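As a rough sketch of that blending step (with simplified, hypothetical field names, since the actual BQL and GSC payloads are far richer), merging the two sources can be as simple as joining records keyed by URL:

```python
# Hypothetical sketch: blend crawl data with Search Console metrics,
# keyed by URL, to build nodes for a knowledge graph.
crawl = {
    "https://example.com/a": {"title": "Page A", "depth": 1},
    "https://example.com/b": {"title": "Page B", "depth": 2},
}
gsc = {
    "https://example.com/a": {"clicks": 40, "impressions": 900, "position": 3.2},
}

def blend(crawl: dict, gsc: dict) -> dict:
    """Merge GSC metrics into crawl records; pages absent from GSC get defaults."""
    graph = {}
    for url, page in crawl.items():
        node = dict(page)  # copy crawl attributes
        node.update(gsc.get(url, {"clicks": 0, "impressions": 0, "position": None}))
        graph[url] = node
    return graph

nodes = blend(crawl, gsc)
```

Once both sources share one set of node properties, downstream consumers (structured data generation, internal linking) no longer care which tool a value came from.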

Collaboration With Advertools For Data Interoperability

Similarly, we collaborated with the brilliant Elias Dabbas, creator of Advertools, a favorite Python library among marketers for automating a wide range of marketing tasks.

Our joint efforts aim to enhance data interoperability, allowing for seamless integration and data exchange across different platforms and tools.

In the first Notebook, available in the SEOntology GitHub repository, Elias showcases how we can effortlessly construct attributes for the WebPage class, including title, meta description, images, and links. This foundation enables us to easily model complex elements, such as internal linking strategies. Here is the structure:

  • Internal_Links
    • anchorTextContent
    • NoFollow
    • Link

We can also add a flag if the page is already using schema markup:

  • usesSchema
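Here is a minimal sketch of that structure in code. The `seovoc:` property names are assumptions modeled on the list above, not the published vocabulary, and the input row only mimics the kind of fields a crawler like Advertools exposes:

```python
# Sketch mapping a crawled page onto the WebPage attributes listed above.
# Property names under "seovoc:" are illustrative assumptions.
def to_webpage(row: dict) -> dict:
    return {
        "@type": "seovoc:WebPage",
        "seovoc:title": row["title"],
        "seovoc:metaDescription": row["meta_desc"],
        # Flag whether the page already carries schema markup.
        "seovoc:usesSchema": bool(row.get("jsonld_types")),
        "seovoc:internalLinks": [
            {
                "seovoc:anchorTextContent": text,
                "seovoc:noFollow": nofollow,
                "seovoc:link": url,
            }
            for url, text, nofollow in row.get("links", [])
        ],
    }

page = to_webpage({
    "title": "Red running shoes",
    "meta_desc": "Lightweight shoes for road running.",
    "jsonld_types": ["Product"],
    "links": [("https://example.com/sale", "See the sale", False)],
})
```

The point is interoperability: any crawler able to emit this shape can feed the same downstream graph.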

Formalizing What We Learned From The Analysis Of The Leaked Google Search Documents

We want to be cautious about deriving tactics or quick schemes from Google’s massive leak, and we are well aware that Google will move quickly to prevent any potential misuse of such information.

Despite these constraints, the leak offers valuable insights into improving web content representation and marketing data organization. To democratize access to these insights, I’ve developed a Google Leak Reporting tool designed to make this information readily available to SEO pros and digital marketers.

For instance, understanding Google’s classification system and its segmentation of websites into various taxonomies has been particularly enlightening. These taxonomies – such as ‘verticals4’, ‘geo’, and ‘products_services’ – play a crucial role in search ranking and relevance, each with unique attributes that influence how websites and content are perceived and ranked in search results.

By leveraging SEOntology, we can adopt some of these attributes to enhance website representation.

Now, pause for a second and imagine transforming the complex SEO data you manage daily through tools like Moz, Ahrefs, Screaming Frog, Semrush, and many others into an interactive graph. Now, envision an Autonomous AI Agent, such as Agent WordLift, at your side.

This agent employs neuro-symbolic AI, a cutting-edge approach that combines neural learning capabilities with symbolic reasoning, to automate SEO tasks like creating and updating internal links. This streamlines your workflow and introduces a level of precision and efficiency previously unattainable.

SEOntology serves as the backbone for this vision, providing a structured framework that enables the seamless exchange and reuse of SEO data across different platforms and tools. By standardizing how SEO data is represented and interconnected, SEOntology ensures that valuable insights derived from one tool can be easily applied and leveraged by others. For instance, data on keyword performance from SEMrush could inform content optimization strategies in WordLift, all within a unified, interoperable environment. This not only maximizes the utility of existing data but also accelerates the automation and optimization processes that are crucial for effective marketing.

Infusing SEO Know-How Into AI Agents

As we develop a new agentic approach to SEO and digital marketing, SEOntology serves as our domain-specific language (DSL) for encoding SEO skills into AI agents. Let’s look at a practical example of how this works.

GraphQL query generator and validator. Screenshot from WordLift, August 2024

We’ve developed a system that makes AI agents aware of a website’s organic search performance, enabling a new kind of interaction between SEO professionals and AI. Here’s how the prototype works:

System Components

  • Knowledge Graph: Stores Google Search Console (GSC) data, encoded with SEOntology.
  • LLM: Translates natural language queries into GraphQL and analyzes data.
  • AI Agent: Provides insights based on the analyzed data.
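In the prototype, the LLM performs the natural-language-to-GraphQL translation; the toy function below only stands in for it to show the shape of the exchange (the GraphQL field names are illustrative, not the actual schema):

```python
# Stand-in for the LLM translation step: turn a plain-language question
# about search performance into a GraphQL query. Field names are illustrative.
def to_graphql(question: str, top_n: int = 10) -> str:
    metric = "clicks" if "click" in question.lower() else "impressions"
    return (
        "query {\n"
        f"  searchPerformance(orderBy: {metric}, first: {top_n}) {{\n"
        "    query\n"
        "    clicks\n"
        "    impressions\n"
        "    position\n"
        "  }\n"
        "}"
    )

gql = to_graphql("What are my top queries by clicks?")
```

The agent then runs the generated query against the knowledge graph and analyzes the returned rows, rather than asking the user to write GraphQL by hand.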

Human-Agent Interaction

Human, LLM, Knowledge Graph, and AI Agent interaction. Image from author, August 2024

The diagram illustrates the flow of a typical interaction. Here’s what makes this approach powerful:

  • Natural Language Interface: SEO professionals can ask questions in plain language without constructing complex queries.
  • Contextual Understanding: The LLM understands SEO concepts, allowing for more nuanced queries and responses.
  • Insightful Analysis: The AI agent doesn’t just retrieve data; it provides actionable insights, such as:
    • Identifying top-performing keywords.
    • Highlighting significant performance changes.
    • Suggesting optimization opportunities.
  • Interactive Exploration: Users can ask follow-up questions, enabling a dynamic exploration of SEO performance.
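To make the "insightful analysis" step concrete, here is a small, self-contained sketch of two analyses an agent might run over GSC rows; the thresholds and field names are illustrative, not fixed rules:

```python
# Illustrative analyses over Search Console rows: surface the top-performing
# keyword and flag high-impression, low-CTR optimization opportunities.
rows = [
    {"query": "seo ontology", "clicks": 120, "impressions": 1500, "position": 2.1},
    {"query": "knowledge graph seo", "clicks": 15, "impressions": 4000, "position": 8.7},
]

def top_keyword(rows: list[dict]) -> str:
    """The query driving the most clicks."""
    return max(rows, key=lambda r: r["clicks"])["query"]

def optimization_opportunities(rows, min_impressions=1000, max_ctr=0.02):
    """Queries seen often but rarely clicked: candidates for title/snippet work."""
    return [
        r["query"]
        for r in rows
        if r["impressions"] >= min_impressions
        and r["clicks"] / r["impressions"] < max_ctr
    ]
```

Here, `"knowledge graph seo"` would be flagged: 4,000 impressions but a CTR under 2% suggests the snippet underperforms its visibility.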

By encoding SEO knowledge through SEOntology and integrating performance data, we’re creating AI agents that can provide context-aware, nuanced assistance in SEO tasks. This approach bridges the gap between raw data and actionable insights, making advanced SEO analysis more accessible to professionals at all levels.

This example illustrates how an ontology like SEOntology can empower us to build agentic SEO tools that automate complex tasks while maintaining human oversight and ensuring quality outcomes. It’s a glimpse into the future of SEO, where AI augments human expertise rather than replacing it.

Human-In-The-Loop (HITL) And Collaborative Knowledge Sharing

Let’s be crystal clear: While AI is revolutionizing SEO and Search, humans are the beating heart of our industry. As we dive deeper into the world of SEOntology and AI-assisted workflows, it’s crucial to understand that Human-in-the-Loop (HITL) isn’t just a fancy add-on—it’s the foundation of everything we’re building.

The essence of creating SEOntology is to transfer our collective SEO expertise to machines while ensuring we, as humans, remain firmly in the driver’s seat. It’s not about handing over the keys to AI; it’s about teaching it to be the ultimate co-pilot in our SEO journey.

Human-Led AI: The Irreplaceable Human Element

SEOntology is more than a technical framework – it’s a catalyst for collaborative knowledge sharing that emphasizes human potential in SEO. Our commitment extends beyond code and algorithms to nurturing skills and expanding the capabilities of new-gen marketers and SEO pros.

Why? Because AI’s true power in SEO is unlocked by human insight, diverse perspectives, and real-world experience. After years of working with AI workflows, I’ve realized that agentive SEO is fundamentally human-centric. We’re not replacing expertise; we’re amplifying it.

We deliver more efficient and trustworthy results by blending cutting-edge tech with human creativity, intuition, and ethical judgment. This approach builds trust with clients within our industry and across the web.

Here’s where humans remain irreplaceable:

  • Understanding Business Needs: AI can crunch numbers but can’t replace the nuanced understanding of business objectives that seasoned SEO professionals bring. We need experts who can translate client goals into actionable SEO strategies.
  • Identifying Client Constraints: Every business is unique, with its limitations and opportunities. It takes human insight to navigate these constraints and develop tailored SEO approaches that work within real-world parameters.
  • Developing Cutting-Edge Algorithms: The algorithms powering our AI tools don’t materialize out of thin air. We need brilliant minds to develop state-of-the-art algorithms, learn from human input, and continually improve.
  • Engineering Robust Systems: Behind every smooth-running AI tool is a team of software engineers who ensure our systems are fast, secure, and reliable. This human expertise keeps our AI assistants running like well-oiled machines.
  • Passion for a Better Web: At the heart of SEO is a commitment to making the web a better place. We need people who share Tim Berners-Lee’s vision: people who are passionate about developing the web of data and improving the digital ecosystem for everyone.
  • Community Alignment and Resilience: We need to unite to analyze the behavior of search giants and develop resilient strategies. It’s about solving our problems innovatively as individuals and as a collective force. This is what I always loved about the SEO industry!

Extending The Reach Of SEOntology

As we continue to develop SEOntology, we’re not operating in isolation. Instead, we’re building upon and extending existing standards, particularly Schema.org, and following the successful model of the GS1 Web Vocabulary.

SEOntology As An Extension Of Schema.org

Schema.org has become the de facto standard for structured data on the web, providing a shared vocabulary that webmasters can use to markup their pages.

However, while Schema.org covers a broad range of concepts, it doesn’t delve deeply into SEO-specific elements. This is where SEOntology comes in.

An extension of Schema.org, like SEOntology, is essentially a complementary vocabulary that adds new types, properties, and relationships to the core Schema.org vocabulary.

This allows us to maintain compatibility with existing Schema.org implementations while introducing SEO-specific concepts not covered in the core vocabulary.

Learning From GS1 Web Vocabulary

The GS1 Web Vocabulary offers a great model for creating a successful extension that interacts seamlessly with Schema.org. GS1, a global organization that develops and maintains supply chain standards, created its Web Vocabulary to extend Schema.org for e-commerce and product information use cases.

The GS1 Web Vocabulary demonstrates, even recently, how industry-specific extensions can influence and interact with schema markup:

  • Real-world impact: The https://schema.org/Certification property, now officially embraced by Google, originated from GS1’s https://www.gs1.org/voc/CertificationDetails. This showcases how extensions can drive the evolution of Schema.org and search engine capabilities.

We want to follow a similar approach to extend Schema.org and become the standard vocabulary for SEO-related applications, potentially influencing future search engine capabilities, AI-driven workflows, and SEO practices.

Much like GS1 defined their namespace (gs1:) while referencing schema terms, we have defined our namespace (seovoc:) and are integrating the classes within the Schema.org hierarchy when possible.
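Conceptually, the two namespaces can coexist in a single JSON-LD document. In this sketch both the `seovoc:` namespace URL and the `seovoc:primaryQuery` property are placeholders for illustration, not the published vocabulary:

```python
import json

# Sketch of a seovoc: extension sitting alongside schema.org terms in one
# JSON-LD document. The seovoc namespace URL and property are placeholders.
doc = {
    "@context": {
        "schema": "https://schema.org/",
        "seovoc": "https://example.org/seovoc/",  # placeholder namespace URL
    },
    "@type": "schema:WebPage",
    "schema:name": "Product listing page",
    # An SEO-specific property the core vocabulary does not cover:
    "seovoc:primaryQuery": "red running shoes",
}
print(json.dumps(doc, indent=2))
```

Because the extension only adds terms, existing consumers that understand plain Schema.org can simply ignore the `seovoc:` properties.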

The Future Of SEOntology

SEOntology is more than just a theoretical framework; it’s a practical tool designed to empower SEO professionals and tool makers in an increasingly AI-driven ecosystem.

Here’s how you can engage with and benefit from SEOntology.

If you’re developing SEO tools:

  • Data Interoperability: Implement SEOntology to export and import data in a standardized format. This ensures your tools can easily interact with other SEOntology-compliant systems.
  • AI-Ready Data: By structuring your data according to SEOntology, you’re making it more accessible for AI-driven automations and analyses.

If you’re an SEO professional:

  • Contribute to Development: Just like with Schema.org, you can contribute to SEOntology’s evolution. Visit its GitHub repository to:
    • Raise issues for new concepts or properties you think should be included.
    • Propose changes to existing definitions.
    • Participate in discussions about the future direction of SEOntology.
  • Implement in Your Work: Start using SEOntology concepts in your structured data.

In Open Source We Trust

SEOntology is an open-source effort, following in the footsteps of successful projects like Schema.org and other shared linked vocabularies.

All discussions and decisions will be public, ensuring the community has a say in SEOntology’s direction. As we gain traction, we’ll establish a committee to steer its development and share regular updates.

Conclusion And Future Work

The future of marketing is human-led, not AI-replaced. SEOntology isn’t just another buzzword – it’s a step towards this future. SEO is strategic for the development of agentive marketing practices.

SEO is no longer about rankings; it’s about creating intelligent, adaptive content and fruitful dialogues with our stakeholders across various channels. Standardizing SEO data and practices is strategic to build a sustainable future and to invest in responsible AI.

Are you ready to join this revolution?

There are three guiding principles behind the work of SEOntology that we need to make clear to the reader:

  • As AI needs semantic data, we need to make SEO data interoperable, facilitating the creation of knowledge graphs for everyone. SEOntology is the USB-C of SEO/crawling data. Standardizing data about content assets and products and how people find content, products, and information in general is important. This is the first objective. Here, we have two practical use cases. We have a connector for WordLift that gets crawl data from the Botify crawler and helps you jump-start a KG that uses SEOntology as a data model. We are also working with Advertools, an open-source crawler and SEO tool, to make data interoperable with SEOntology;
  • As we progress with the development of a new agentic way of doing SEO and digital marketing, we want to use SEOntology as a domain-specific language to infuse SEO know-how into AI agents (or multi-agent systems like Agent WordLift). In this context, the skill required to create dynamic internal links is encoded as nodes in a knowledge graph, and opportunities become triggers that activate workflows.
  • We expect to work human-in-the-loop (HITL), meaning that the ontology will become a way to collaboratively share knowledge and tactics that help improve findability and prevent the misuse of Generative AI that is polluting the web today.

Project Overview

This work on SEOntology is the product of collaboration. I extend my sincere thanks to the WordLift team, especially CTO David Riccitelli. I also appreciate our clients for their dedication to innovation in SEO through knowledge graphs. Special thanks to Milos Jovanovik and Emilija Gjorgjevska for their critical expertise. Lastly, I’m grateful to the SEO community and the SEJ editorial team for their support in sharing this work.


Google Rolls Out AI-Organized Search Results Pages via @sejournal, @MattGSouthern

Google is introducing AI-organized search results pages in the United States.

The new feature, set to launch this week, returns a full page of multi-format results personalized for the searcher.

Google’s announcement states:

“This week, we’re rolling out search results pages organized with AI in the U.S. — beginning with recipes and meal inspiration on mobile. You’ll now see a full-page experience, with relevant results organized just for you. You can easily explore content and perspectives from across the web including articles, videos, forums and more — all in one place.”

Key Features

The AI-organized pages will compile various content types, including articles, videos, and forum discussions.

Google claims this approach will provide users with a more diverse range of information sources and perspectives.

In its announcement, Google adds:

“… with AI-organized search results pages, we’re bringing people more diverse content formats and sites, creating even more opportunities for content to be discovered.”

Industry Implications

While Google touts the benefits of AI-organized search results pages, the update raises several questions:

  1. How will the AI-organized pages affect traffic to individual websites? Keeping users on Google’s results page might reduce clicks to source websites.
  2. With AI determining content organization, there are concerns about potential biases in how information is presented.
  3. The new format may require new strategies to ensure visibility within these AI-organized results.
  4. It’s unclear how this change will impact ad visibility.

This update could alter how we approach SEO. We may need to adapt strategies to ensure content is discoverable and presentable in this new format.

Microsoft’s Bing recently announced an expansion of its generative search capabilities, focused on complex, informational queries. Google’s approach of reorganizing entire results pages appears distinct from Bing’s offering.

The initial rollout, focused on mobile devices for recipe- and meal-related queries, aligns with Google’s broader mobile-first approach.

It remains to be seen how this feature will translate to desktop searches.

Google’s Response to Industry Concerns

In light of the questions raised by this update, we contacted Google for clarification on several key points.

Impact on Search Console Tracking

Regarding how AI-organized search results will be tracked in Google Search Console, a Google spokesperson stated:

“We do not separate traffic by every feature in Search Console, but publishers will continue to see their traffic from Search reflected there. Check out the supported search appearances in our documentation.”

This suggests that while Search Console will not break out metrics for AI-organized pages specifically, site owners will still see that traffic reflected in their overall data.
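For site owners who want to watch whether new appearance types surface as the rollout progresses, the Search Console API’s Search Analytics endpoint can already group traffic by the `searchAppearance` dimension. Below is a minimal sketch, not an official recipe: the site URL, date range, and credentials are placeholders, and it assumes the `google-api-python-client` library with OAuth credentials scoped for Search Console read access.

```python
def build_appearance_query(start_date: str, end_date: str, row_limit: int = 100) -> dict:
    """Build a Search Analytics request body that groups clicks and
    impressions by search appearance (rich results, videos, etc.)."""
    return {
        "startDate": start_date,
        "endDate": end_date,
        "dimensions": ["searchAppearance"],
        "rowLimit": row_limit,
    }


def fetch_appearance_rows(credentials, site_url: str, body: dict) -> list:
    """Query the Search Console API for the given property.
    The client library is imported lazily so the request-building
    helper above works without it installed."""
    from googleapiclient.discovery import build

    service = build("searchconsole", "v1", credentials=credentials)
    response = service.searchanalytics().query(siteUrl=site_url, body=body).execute()
    return response.get("rows", [])


body = build_appearance_query("2024-09-01", "2024-09-30")
# rows = fetch_appearance_rows(creds, "https://www.example.com/", body)
# Each row contains "keys" (the appearance label), "clicks", and "impressions".
```

Comparing these rows month over month would reveal whether a new appearance label ever shows up for AI-organized pages, or whether, as Google indicates, that traffic simply folds into existing categories.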

Timeline for Expansion

When asked about the timeline for expanding this feature to other categories and regions, Google responded:

“When we previewed this feature, we mentioned expanding this to additional categories including dining, movies, music, books, hotels, and shopping. No further details to share at this time.”

While this confirms expansion plans, Google has not provided specific timelines for these rollouts.

Guidance for SEO Professionals and Content Creators

On whether new tools or guidance will be provided for optimizing content for AI-organized search results, Google emphasized that no changes are necessary:

“SEO professionals and creators don’t need to do anything differently. Search results pages organized with AI are rooted in our core Search ranking and quality systems, which we have been honing for decades to surface high quality information.”

This response suggests that existing SEO best practices should continue to be effective for visibility in these new result formats.

Looking Ahead

Google’s responses provide some clarity but also leave room for speculation.

The lack of specific tracking for AI-organized pages in Search Console may present challenges for SEO professionals in understanding the direct impact of this new feature on their traffic.

The confirmation of plans to expand to other categories like dining, movies, music, books, hotels, and shopping indicates that this update could have far-reaching effects across various industries.

Despite Google’s assurances, new best practices may emerge as the SEO community adapts to this significant change in search result presentation.

We here at SEJ will closely monitor the rollout and report on its effects and what it means for you in the coming months. Sign up for the SEJ newsletter to stay up to date.


Featured Image: JarTee/Shutterstock