AI-Enhanced Keyword Selection In PPC via @sejournal, @brookeosmundson

Let’s be honest – PPC keyword research can be tedious.

Sifting through search terms, analyzing performance data, and trying to predict what people will type into the search bar next feels like an endless puzzle.

This is where AI enters the game.

This isn’t just about adding another buzzword to your marketing toolbox – AI can actually save you time and give you insights you might miss on your own.

In this article, we’ll explore how AI can take the grunt work out of your keyword strategy.

Whether you’re hunting for new keywords, optimizing your existing campaigns, or cleaning up your negative keyword lists, AI offers real, actionable solutions.

Let’s break down exactly how you can use AI to level up your PPC keyword game without the headaches.

Why AI Matters For PPC Keyword Selection

Before we jump into how to use AI to enhance your keyword strategy, it’s worth understanding why AI is a game-changer for PPC.

AI tools can process vast amounts of data faster than any human ever could. They identify patterns, analyze search behavior, and even predict trends, giving you insights that are both actionable and timely.

Instead of spending hours combing through search terms, competitor ads, or campaign performance reports, AI does the heavy lifting, allowing you to focus on higher-level strategy.

More importantly, AI tools learn and adapt over time, becoming smarter with each data point they analyze.

That means your keyword research and selection process becomes not just automated, but also continually improving.

Using AI For New Keyword Research

One of the most significant ways AI can enhance your PPC strategy is by discovering new keywords.

Traditional keyword research relies on manual tools and processes, but AI-backed options like Google’s Keyword Planner, third-party platforms, and large language models like ChatGPT take it up a notch.

These tools don’t just spit out related search terms – they can provide context, trends, and relevance scores based on real-time data.

How AI Tools Find New Keywords

AI-powered keyword tools analyze search patterns across millions of queries, detecting emerging trends, consumer interests, and semantic relationships that would otherwise go unnoticed.

For example, if you’re managing an ad campaign for a fitness brand, AI might detect an uptick in searches for [home workout routines for busy moms] or [low-impact exercises for seniors].

Even more powerful is AI’s ability to consider user intent.

AI doesn’t just give you a list of keywords – it provides context, predicting whether a user is more likely searching for information, looking to buy, or wanting to compare products.

This helps you create campaigns that align closely with user intent, which is critical for achieving higher quality scores, leading to better ad placement.
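The intent-prediction idea can be illustrated with a toy heuristic. Real AI tools use trained models rather than keyword lists, but a simple rule-based classifier shows what it means to bucket queries into informational, transactional, or comparison intent (the signal words below are illustrative assumptions, not an exhaustive list):

```python
# Toy illustration of search-intent classification. Production AI tools
# use trained models; this heuristic only demonstrates the concept of
# bucketing queries by the intent they signal.

INTENT_SIGNALS = {
    "transactional": ["buy", "price", "coupon", "near me", "for sale"],
    "comparison": ["vs", "best", "top", "review", "alternative"],
    "informational": ["how to", "what is", "guide", "tips"],
}

def classify_intent(query):
    """Return the first intent bucket whose signal words appear in the query."""
    q = query.lower()
    for intent, signals in INTENT_SIGNALS.items():
        if any(signal in q for signal in signals):
            return intent
    return "informational"  # default bucket when nothing matches

print(classify_intent("buy running shoes near me"))    # transactional
print(classify_intent("nike vs adidas running shoes"))  # comparison
print(classify_intent("how to start running"))          # informational
```

Grouping keywords this way lets you route transactional queries to conversion-focused ad groups and informational ones to awareness campaigns.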

Tools And Tactics For AI-Driven Keyword Discovery

  • Google’s Keyword Planner (AI-driven recommendations): Google’s own AI-backed keyword tool not only suggests keywords but prioritizes them based on real-time search trends.
  • ChatGPT or Jasper for idea generation: These tools can help brainstorm new keyword ideas based on competitor campaigns, product descriptions, or industry trends. Just input your product or service, and these AI systems will offer insights on relevant keywords, often from angles you hadn’t thought of.

Adding Keywords To Existing Campaigns

Once you’ve got a solid list of new keywords, it’s time to put them to work.

AI isn’t just great at discovering keywords – it’s also incredibly useful for helping you refine your existing campaigns.

This is especially important if your campaign has been running for a while and might need some fine-tuning.

AI For Keyword Expansion

AI can help you intelligently expand your keyword lists by finding closely related keywords, synonyms, and long-tail variations.

For example, if you’ve been running ads for a local bakery using keywords like “best bakery near me,” AI tools might suggest adding longer variations like “best custom birthday cakes in pittsburgh.”

These expanded keywords help you reach more specific audiences who are ready to convert.

Leveraging AI For Semantic Keywords

Semantic keyword matching is another area where AI shines.

Unlike traditional keyword match types, AI doesn’t rely strictly on exact keyword matches.

Instead, it understands the broader meaning behind search queries, enabling you to target more relevant searches.

Google’s AI algorithms, for instance, now consider the overall intent of a search query, offering a more nuanced keyword match than we had a few years ago.

This makes adding keywords to your campaign not just about volume but relevance and intent alignment.

Optimizing Existing Campaigns With AI Tools

  • Google Ads Recommendations: Google’s built-in AI will continuously monitor your campaign and suggest keyword additions based on ongoing performance and search trends.
  • AI-Powered Keyword Expansion in Optmyzr: Tools like Optmyzr integrate AI and machine learning to suggest keyword expansions and bid adjustments in real time.
  • Microsoft Ads AI Integration: Microsoft’s platform offers AI-based keyword suggestions, making it easier to add or remove keywords from your existing campaigns, ensuring your ad stays relevant.

How AI Helps With Negative Keyword Selection

A successful PPC campaign isn’t just about the keywords you include – it’s also about the keywords you exclude.

This is where negative keywords come into play, and AI can help you refine your negative keyword strategy like a pro.

Common Mistakes With Negative Keywords (And How AI Can Help)

One of the most common mistakes PPC marketers make with negative keywords is not updating them regularly.

It’s easy to set a few negative keywords at the start of a campaign and then forget about them.

But search behaviors change, new trends emerge, and without adjusting your negative keyword list, you might start showing ads for irrelevant searches.

For example, a brand selling high-end outdoor gear might inadvertently show ads to people searching for [cheap camping supplies], which dilutes the brand image and wastes ad spend.

Another mistake is being too broad with negative keywords. While you want to exclude irrelevant searches, casting too wide of a net can accidentally block valuable traffic.

For instance, adding “free” as a negative keyword could prevent your ads from showing to users looking for “free delivery” or “free returns,” which are often associated with ready-to-buy customers.

This is where AI can step in and save the day.

AI tools can analyze search queries in real time, identifying irrelevant traffic while being nuanced enough to avoid overly broad exclusions.

They allow you to add negative keywords that prevent wasted ad spend without cutting off relevant users.

AI can also spot trends in what kinds of queries lead to bounces or low engagement, helping you automatically refine your negative keyword list with precision, based on performance data.

Identifying Irrelevant Traffic With AI

AI excels at spotting patterns and trends that humans might miss.

By analyzing search terms that trigger your ads but don’t lead to conversions, AI can suggest negative keywords that will help you avoid wasted spend.

Let’s say you’re running a campaign for luxury watches, and your ads are being triggered by search terms related to “cheap watches” or “free watch giveaways.”

AI tools can analyze the performance of these search terms and suggest adding them as negative keywords.

AI can even analyze the context of negative keywords, understanding when a specific word or phrase is irrelevant in one campaign but useful in another.

This level of nuance helps ensure your ads aren’t being wasted on the wrong audience while still reaching relevant customers.

AI Tools For Managing Negative Keywords

  • Google Ads Search Query Report (AI-Enhanced): Google Ads provides search query reports, and its AI-powered suggestions will flag irrelevant search terms for potential negative keyword additions.
  • Custom AI Algorithms for Negative Keyword Mining: Some marketers use Python with machine learning libraries to automate the detection of irrelevant terms that are draining budgets.
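A minimal sketch of that mining approach might look like the following. The column names (`term`, `clicks`, `cost`, `conversions`) and thresholds are assumptions for illustration; adapt them to the fields in your actual search-term report export:

```python
# Minimal sketch of negative-keyword mining from a search-term report.
# Field names and thresholds are illustrative assumptions, not the
# schema of any specific ad platform export.

def suggest_negatives(rows, min_clicks=20, max_conversions=0):
    """Flag search terms that spend budget but never convert."""
    suggestions = [
        (row["term"], row["cost"])
        for row in rows
        if row["clicks"] >= min_clicks and row["conversions"] <= max_conversions
    ]
    # Surface the biggest budget drains first.
    return sorted(suggestions, key=lambda pair: pair[1], reverse=True)

# Hypothetical report rows for a luxury-watch campaign.
report = [
    {"term": "cheap watches", "clicks": 140, "cost": 210.0, "conversions": 0},
    {"term": "free watch giveaway", "clicks": 95, "cost": 88.5, "conversions": 0},
    {"term": "luxury watches for men", "clicks": 60, "cost": 150.0, "conversions": 7},
]

for term, cost in suggest_negatives(report):
    print(f"Consider negative keyword: {term!r} (wasted spend ${cost:.2f})")
```

In practice you would feed this from a scheduled export of your search query report and review the suggestions before adding them as negatives, since a zero-conversion term may simply need more data.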

Making AI Work For You: Practical Tips

While AI tools are incredibly powerful, the best results come from combining AI’s capabilities with human expertise.

Here are some practical tips for making AI-enhanced keyword selection work for you:

  • Regularly Update Your Keyword Lists: AI tools provide real-time insights, but the digital landscape is always changing. Review and update your keyword lists at least once a month to stay ahead of emerging trends.
  • Refine Your Negative Keywords Consistently: Just like with regular keywords, your negative keyword list needs to evolve. AI tools can help you keep this list up to date without much manual effort.
  • Experiment with Different AI Tools: No single tool will give you everything you need. Experiment with different AI-powered platforms to find the ones that best fit your workflow.

In Summary: AI As Your PPC Keyword Sidekick

AI is not here to replace PPC marketers – it’s here to make us more efficient, more strategic, and ultimately, more successful.

By leveraging AI to enhance keyword research, optimize existing campaigns, and refine negative keyword strategies, you can free up time for more creative and strategic tasks.

The key to success is combining the power of AI with your own expertise and instincts.

After all, while AI can analyze data, it’s the human touch that ultimately connects with customers and drives results.

So, dive into AI-enhanced keyword selection and start reaping the benefits of smarter, more efficient PPC campaigns.

Featured Image: Sammby/Shutterstock

How ChatGPT search paves the way for AI agents

This story originally appeared in The Algorithm, our weekly newsletter on AI. To get stories like this in your inbox first, sign up here.

OpenAI’s Olivier Godement, head of product for its platform, and Romain Huet, head of developer experience, are on a whistle-stop tour around the world. Last week, I sat down with the pair in London before DevDay, the company’s annual developer conference. London’s DevDay is the first one for the company outside San Francisco. Godement and Huet are heading to Singapore next. 

It’s been a busy few weeks for the company. In London, OpenAI announced updates to its new Realtime API platform, which allows developers to build voice features into their applications. The company is rolling out new voices and a function that lets developers generate prompts, which will allow them to build apps and more helpful voice assistants more quickly. Meanwhile, for consumers, OpenAI announced it was launching ChatGPT search, which allows users to search the internet using the chatbot. Read more here.

Both developments pave the way for the next big thing in AI: agents. These are AI assistants that can complete complex chains of tasks, such as booking flights. (You can read my explainer on agents here.) 

“Fast-forward a few years—every human on Earth, every business, has an agent. That agent knows you extremely well. It knows your preferences,” Godement says. The agent will have access to your emails, apps, and calendars and will act like a chief of staff, interacting with each of these tools and even working on long-term problems, such as writing a paper on a particular topic, he says. 

OpenAI’s strategy is to both build agents itself and allow developers to use its software to build their own agents, says Godement. Voice will play an important role in what agents will look and feel like. 

“At the moment most of the apps are chat based … which is cool, but not suitable for all use cases. There are some use cases where you’re not typing, not even looking at the screen, and so voice essentially has a much better modality for that,” he says. 

But there are two big hurdles that need to be overcome before agents can become a reality, Godement says. 

The first is reasoning. Building AI agents requires us to trust that they can complete complex tasks and do the right things, says Huet. That’s where OpenAI’s “reasoning” feature comes in. Introduced in OpenAI’s o1 model last month, it uses reinforcement learning to teach the model how to process information using “chain of thought.” Giving the model more time to generate answers allows it to recognize and correct mistakes, break down problems into smaller ones, and try different approaches to answering questions, Godement says. 

But OpenAI’s claims about reasoning should be taken with a pinch of salt, says Chirag Shah, a computer science professor at the University of Washington. Large language models are not exhibiting true reasoning. It’s most likely that they have picked up what looks like logic from something they’ve seen in their training data.

“These models sometimes seem to be really amazing at reasoning, but it’s just like they’re really good at pretending, and it only takes a little bit of picking at them to break them,” he says.

There is still much more work to be done, Godement admits. In the short term, AI models such as o1 need to be much more reliable, faster, and cheaper. In the long term, the company needs to apply its chain-of-thought technique to a wider pool of use cases. OpenAI has focused on science, coding, and math. Now it wants to address other fields, such as law, accounting, and economics, he says. 

Second on the to-do list is the ability to connect different tools, Godement says. An AI model’s capabilities will be limited if it has to rely on its training data alone. It needs to be able to surf the web and look for up-to-date information. ChatGPT search is one powerful way OpenAI’s new tools can now do that. 

These tools need to be able not only to retrieve information but to take actions in the real world. Competitor Anthropic announced a new feature where its Claude chatbot can “use” a computer by interacting with its interface to click on things, for example. This is an important feature for agents if they are going to be able to execute tasks like booking flights. Godement says o1 can “sort of” use tools, though not very reliably, and that research on tool use is a “promising development.” 

In the next year, Godement says, he expects the adoption of AI for customer support and other assistant-based tasks to grow. However, he says that it can be hard to predict how people will adopt and use OpenAI’s technology. 

“Frankly, looking back every year, I’m surprised by use cases that popped up that I did not even anticipate,” he says. “I expect there will be quite a few surprises that you know none of us could predict.” 


Now read the rest of The Algorithm

Deeper Learning

This AI-generated version of Minecraft may represent the future of real-time video generation

When you walk around in a version of the video game Minecraft from the AI companies Decart and Etched, it feels a little off. Sure, you can move forward, cut down a tree, and lay down a dirt block, just like in the real thing. If you turn around, though, the dirt block you just placed may have morphed into a totally new environment. That doesn’t happen in Minecraft. But this new version is entirely AI-generated, so it’s prone to hallucinations. Not a single line of code was written.

Ready, set, go: This version of Minecraft is generated in real time, using a technique known as next-frame prediction. The AI companies behind it did this by training their model, Oasis, on millions of hours of Minecraft game play and recordings of the corresponding actions a user would take in the game. The AI is able to sort out the physics, environments, and controls of Minecraft from this data alone. Read more from Scott J. Mulligan.

Bits and Bytes

AI search could break the web
At its best, AI search can better infer a user’s intent, amplify quality content, and synthesize information from diverse sources. But if AI search becomes our primary portal to the web, it threatens to disrupt an already precarious digital economy, argues Benjamin Brooks, a fellow at the Berkman Klein Center at Harvard University, who used to lead public policy for Stability AI. (MIT Technology Review)

AI will add to the e-waste problem. Here’s what we can do about it.
Equipment used to train and run generative AI models could produce up to 5 million tons of e-waste by 2030, a relatively small but significant fraction of the global total. (MIT Technology Review)

How an “interview” with a dead luminary exposed the pitfalls of AI
A state-funded radio station in Poland fired its on-air talent and brought in AI-generated presenters. But the experiment caused an outcry and was stopped when one of them “interviewed” a dead Nobel laureate. (The New York Times)

Meta says yes, please, to more AI-generated slop
In Meta’s latest earnings call, CEO Mark Zuckerberg said we’re likely to see “a whole new category of content, which is AI generated or AI summarized content or kind of existing content pulled together by AI in some way.” Zuckerberg added that he thinks “that’s going to be just very exciting.” (404 Media)

The Download: inside animals’ minds, and how to make AI agents useful

This is today’s edition of The Download, our weekday newsletter that provides a daily dose of what’s going on in the world of technology.

What do jumping spiders find sexy? How DIY tech is offering insights into the animal mind.

Studying the minds of other animals comes with a challenge that human psychologists don’t usually face: Your subjects can’t tell you what they’re thinking. 

To get answers from animals, scientists need to come up with creative experiments to learn why they behave the way they do. Sometimes this requires designing and building experimental equipment from scratch. 

These contraptions can range from ingeniously simple to incredibly complex, but all of them are tailored to help answer questions about the lives and minds of specific species. Do honeybees need a good night’s sleep? What do jumping spiders find sexy? Do falcons like puzzles? For queries like these, off-the-shelf gear simply won’t do. Check out these contraptions custom-built by scientists to help them understand the lives and minds of the animals they study.

—Betsy Mason

This piece is from the latest print issue of MIT Technology Review, which is all about the weird and wonderful world of food. If you don’t already, subscribe to receive future copies once they land.

How ChatGPT search paves the way for AI agents

It’s been a busy few weeks for OpenAI. Alongside updates to its new Realtime API platform, which will allow developers to build apps and voice assistants more quickly, it recently launched ChatGPT search, which allows users to search the internet using the chatbot.

Both developments pave the way for the next big thing in AI: agents. These AI assistants can complete complex chains of tasks, such as booking flights. OpenAI’s strategy is to both build agents itself and allow developers to use its software to build their own agents, and voice will play an important role in what agents will look and feel like.

Melissa Heikkilä, our senior AI reporter, sat down with Olivier Godement, OpenAI’s head of product for its platform, and Romain Huet, head of developer experience, last week to hear more about the two big hurdles that need to be overcome before agents can become a reality. Read the full story.

This story is from The Algorithm, our weekly newsletter giving you the inside track on all things AI. Sign up to receive it in your inbox every Monday.

The must-reads

I’ve combed the internet to find you today’s most fun/important/scary/fascinating stories about technology.

1 America is heading to the polls
Here’s how Harris and Trump will attempt to lead the US to tech supremacy. (The Information $)
+ The ‘Stop the Steal’ election denial movement is preparing to contest the vote. (WP $)
+ The muddy final polls suggest it’s still all to play for. (Vox)

2 Abortion rights are on the 2024 ballot
A lack of access to basic health care has led to the deaths of at least four women. (NY Mag $)
+ Nine states will decide whether to guarantee their residents abortion access. (Fortune)
+ If Trump wins he could ban abortion nationwide, even without Congress. (Politico)

3 Inside New York’s election day wargames
Tech, business and policy leaders gathered to thrash out potential risks. (WSJ $)
+ Violence runs throughout all aspects of this election cycle. (FT $)

4 Elon Musk’s false and misleading X election posts have billions of views
In fact, they’ve been viewed twice as much as all X’s political ads this year. (CNN)
+ Musk’s decision to hitch himself to Trump may end up backfiring, though. (FT $)

5 Meta will permit the US military to use its AI models
It’s an interesting update to its previous policy, which explicitly banned its use for military purposes. (NYT $)
+ Facebook has kept a low profile during the election cycle. (The Atlantic $)
+ Inside the messy ethics of making war with machines. (MIT Technology Review)

6 The hidden danger of pirated software
It’s not just viruses you should be worried about. (404 Media)

7 Apple is weighing up expanding into smart glasses
Where Meta leads, Apple may follow. (Bloomberg $)
+ The coolest thing about smart glasses is not the AR. It’s the AI. (MIT Technology Review)

8 India’s lithium plans may have been a bit too ambitious
Reports of a major lithium reserve appear to have been massively overblown. (Rest of World)
+ Some countries are ending support for EVs. Is it too soon? (MIT Technology Review)

9 Your air fryer could be surveilling you
Household appliances are now mostly smart, and stuffed with trackers. (The Guardian)

10 How to stay sane during election week
Focus on what you can control, and try to let go of what you can’t. (WP $)
+ Here’s how election gurus are planning to cope in the days ahead. (The Atlantic $)
+ How to log off. (MIT Technology Review)

Quote of the day

“We’re in kind of the ‘throw spaghetti at the wall’ moment of politics and AI, where this intersection allows people to try new things for propaganda.”

—Rachel Tobac, chief executive of ethical hacking company SocialProof Security, tells the Washington Post why a deepfake video of Martin Luther King endorsing Donald Trump is being shared online in the closing hours of the presidential race.

The big story

The hunter-gatherer groups at the heart of a microbiome gold rush

December 2023

Over the last couple of decades, scientists have come to realize just how important the microbes that crawl all over us are to our health. But some believe our microbiomes are in crisis—casualties of an increasingly sanitized way of life. Disturbances in the collections of microbes we host have been associated with a whole host of diseases, ranging from arthritis to Alzheimer’s.

Some might not be completely gone, though. Scientists believe many might still be hiding inside the intestines of people who don’t live in the polluted, processed environment that most of the rest of us share.

They’ve been studying the feces of people like the Yanomami, an Indigenous group in the Amazon, who appear to still have some of the microbes that other people have lost. But they’re having to navigate an ethical minefield in order to do so. Read the full story.

—Jessica Hamzelou

We can still have nice things

A place for comfort, fun and distraction to brighten up your day. (Got any ideas? Drop me a line or tweet ’em at me.)

+ Move over Moo Deng—Haggis the baby pygmy hippo is the latest internet star!
+ To celebrate the life of the late, great Quincy Jones, check out this sensational interview in which he spills the beans on everything from the Beatles’ musical shortcomings to who shot Kennedy. Thank you for the music, Quincy.
+ The color of the season? Sage green, apparently.
+ Dinosaurs are everywhere, you just need to look for them.

Google Search Snippets Show Contradictory Information, Study Finds via @sejournal, @MattGSouthern

A recent investigation finds that Google’s Featured Snippets may display conflicting information from the same source material, depending on how users phrase their search queries.

This raises concerns about the search engine’s ability to interpret content accurately.

Sarah Presch, director at Dragon Metrics, discovered that Google’s Featured Snippets pull opposing statements from the same articles when users frame questions differently.

For example, searching “link between coffee and hypertension” generates a Featured Snippet highlighting caffeine’s potential to cause blood pressure spikes.

Searching “no link between coffee and hypertension” produces a contradictory snippet from the same Mayo Clinic article stating caffeine has no long-term effects.

Similar contradictions appeared across health topics, political issues, and current events.

The investigation found that asking whether a political candidate is “good” versus “bad” yields dramatically different results despite the fundamental question remaining the same.

Impact On Search Quality

“It’s one big bias machine,” Presch notes, explaining how Google’s algorithms appear to prioritize content that matches user intent rather than providing comprehensive, balanced information.

The findings align with internal Google documents from 2016, where engineers admitted, “We do not understand documents – we fake it.”

While Google maintains these documents are outdated, SEO experts suggest the underlying technical limitations persist.

Presch adds:

“What Google has done is they’ve pulled bits out of the text based on what people are searching for and fed them what they want to read.”

Mark Williams-Cook, founder of AlsoAsked, commented on the findings, stating:

“Google builds models to try and predict what people like, but the problem is this creates a kind of feedback loop. If confirmation bias pushes people to click on links that reinforce their beliefs, it teaches Google to show people links that lead to confirmation bias.”

Implications

These findings have implications for content creators and SEO professionals:

  • Featured Snippets may not accurately represent comprehensive content
  • User intent heavily influences how content is interpreted and displayed
  • Content strategy may need adjustment to maintain accuracy across various query formats

Google’s spokesperson defended the system, stating that users can find diverse viewpoints if they scroll beyond initial results.

The company also highlighted features like “About this result” that help users evaluate information sources.

Recommendations

Based on these findings, publishers should take the following actions:

  • Develop comprehensive content that remains accurate regardless of how queries are phrased.
  • Recognize the impact of search intent on the selection of Featured Snippets.
  • Track how your content is displayed in Featured Snippets for different search phrases.

As Google moves toward becoming an “answer engine” with AI-generated responses, digital marketers and content creators need to understand these limitations.


Featured Image: Song_about_summer/Shutterstock

ChatGPT Search May Have A Shot At Google via @sejournal, @Kevin_Indig

ChatGPT Search (CGS) is a landmark launch in the shift from traditional to AI Search.

Now, OpenAI competes with Google (Search) head-on. Note the subtle elbow hit between the lines in the announcement:

Getting useful answers on the web can take a lot of effort. It often requires multiple searches and digging through links to find quality sources and the right information for you.

The positioning is clear: ChatGPT Search is a way to get a straight answer without digging through cluttered search results or browsing websites.

CGS, which is directly integrated with ChatGPT rather than being a standalone search engine, decides whether a query benefits from web results, and you can rerun queries through other models like o1-preview to compare the answers:

ChatGPT will choose to search the web based on what you ask, or you can manually choose to search by clicking the web search icon.

It keeps the context of your search going in a conversation interface (bolding from me):

Go deeper with follow-up questions, and ChatGPT will consider the full context of your chat to get a better answer for you.

ChatGPT Search’s interface features prominent links to sources (Image Credit: Kevin Indig)

OpenAI has a strategic advantage, as I explained in Search GPT:

The Information reports that OpenAI loses $5 billion a year in expenses. Just capturing 3% of Google’s $175 billion Search business would allow OpenAI to recoup expenses.

OpenAI has a strategic advantage over Google: Search GPT can provide a very different, maybe less noisy, user experience than Google because it’s not reliant on ad revenue. In any decision regarding Search, Google needs to take ads into account.

CGS marks the entry to a new paradigm where traditional search engines like Google or Bing compete with AI chatbots.

They solve the same problems for users as search engines but with lower friction. But it also marks a critical event that should lead you to evaluate your strategy.

Companies face a choice to invest and “be early” to AI Search or ignore the noise and stay the course. What makes this decision hard:

  1. Divided opinions about ChatGPT’s chance to take significant market share from Google.
  2. Rapidly changing mechanisms of AI Search platforms.
  3. Confusion about what to do.

The first search engines didn’t represent the model (Google) that eventually won.

In the same vein, the AI Search experience we’re seeing today might be completely different in a few years. However, there is little doubt that search is fundamentally changing.

As a result, my recommendation is to invest in AI Search. It is not capital-intensive (yet), but the upside to finding a playbook is high.

If CGS grabs significant Google market share, you’re in a good position. If it fails, no harm is done.

Collision Course

Based on recent traffic trends, ChatGPT could catch up to Google in 2 years. (Image Credit: Kevin Indig)

In the chart above, I extrapolated ChatGPT’s and Google’s total traffic over the next two years if the trend from the last six months remains constant.

This chart will probably outrage or scare you, but the chance that events unfold exactly as depicted in this chart is low.

The reason I bring it up here is to consider the fact that many structural changes start slowly based on the saying “first gradually, then suddenly.”

It took Google about three to four years to beat Yahoo, Altavista, and Lycos. Given that new technology gets to critical mass ever faster, I’m not surprised ChatGPT could do it faster (in theory).
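The back-of-the-envelope extrapolation behind a chart like this can be sketched in a few lines. The starting figures and growth rates below are made-up placeholders, not the actual data behind the chart; the point is only the mechanic of projecting a constant month-over-month trend forward:

```python
# Illustrative sketch of constant-growth traffic extrapolation.
# All numbers are hypothetical placeholders, not real traffic data.

def extrapolate(start, monthly_growth, months):
    """Project traffic forward at a constant month-over-month growth rate."""
    return start * (1 + monthly_growth) ** months

# Hypothetical monthly visits (billions) and growth rates.
google_now, chatgpt_now = 85.0, 3.7
google_growth, chatgpt_growth = 0.00, 0.15

months = 0
while extrapolate(chatgpt_now, chatgpt_growth, months) < extrapolate(google_now, google_growth, months):
    months += 1

print(f"Crossover after ~{months} months under these assumptions")
```

Under these invented inputs, the crossover lands at roughly two years out, which is exactly why such charts are both striking and fragile: the result is extremely sensitive to the assumption that recent growth rates hold.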

ChatGPT’s traffic has already passed the No. 3 search engine, Bing (YouTube is second).

When you look at comments and posts on social media, more and more people report using ChatGPT instead of Google for various purposes, but that could be availability bias.

Image Credit: Kevin Indig

One point a lot of people miss when looking at the traffic comparison between ChatGPT and Bing is that they’re not the same, and yet this is a fair comparison.

ChatGPT is more than a search engine. People use it for all sorts of things. But that’s exactly the point: a search engine that looks like Google never stood a chance to compete with Google or Bing.

CGS is something new, and that’s why it stands a chance. So, when you see chatgpt.com passing bing.com, the critical argument is not that both do different things but that they’re used to accomplish the same goal.

After all, search is just a way to solve problems or achieve goals, not to search for the sake of searching.

To clarify, I don’t think Google or Alphabet as a company is at risk of dying. I do think CGS has a chance to capture significant market share, and too many people underestimate how fast this can go.

Referral Traffic Skyrockets

ChatGPT’s outgoing referral traffic is skyrocketing (Image Credit: Kevin Indig)

AI Search marks a new paradigm where users get a direct answer without having to browse websites. So, how should companies think about pivoting their strategy?

Here’s what I’m telling my clients when they ask me whether they should pivot their SEO roadmap: For now, no. Reserve 10 to 20% of capacity to establish visibility in AI Search and for experimentation.

Look for signals: If you’re hesitant to invest more in AI Search right now, at least monitor traffic to and from ChatGPT. Base your decision on how long ChatGPT can keep its current traffic trend going.

Establishing visibility: This referral dashboard from Flow Agency is great for monitoring referral traffic.

With a few tweaks, you can monitor conversions in GA4 as well. You should also monitor site crawls from LLMs and your performance on Bing.
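One lightweight way to monitor crawls from LLMs is to tally known crawler user agents in your access logs. A minimal sketch follows; the user-agent substrings are the ones these vendors publish (e.g., GPTBot, OAI-SearchBot), and the log lines here are fabricated examples.

```python
from collections import Counter

# User-agent substrings of known LLM crawlers; extend as new ones appear.
LLM_CRAWLERS = ["GPTBot", "OAI-SearchBot", "ChatGPT-User", "PerplexityBot"]

def count_llm_crawls(log_lines):
    """Tally hits per LLM crawler across raw access-log lines."""
    counts = Counter()
    for line in log_lines:
        for bot in LLM_CRAWLERS:
            if bot in line:
                counts[bot] += 1
    return counts

# Fabricated combined-format log lines, for illustration only.
sample_log = [
    '1.2.3.4 - - [01/Nov/2024:00:00:01] "GET /post HTTP/1.1" 200 512 "-" "GPTBot/1.0"',
    '5.6.7.8 - - [01/Nov/2024:00:00:02] "GET / HTTP/1.1" 200 1024 "-" "OAI-SearchBot/1.0"',
    '9.9.9.9 - - [01/Nov/2024:00:00:03] "GET / HTTP/1.1" 200 1024 "-" "Mozilla/5.0"',
]
print(dict(count_llm_crawls(sample_log)))  # {'GPTBot': 1, 'OAI-SearchBot': 1}
```

Running this on a daily cron against your server logs gives you a simple trend line for LLM crawl activity alongside your Bing performance data.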

Then, experiment with content tweaks to improve your AI Search visibility. Keep investing in traditional SEO because it forms the basis of AI Search and answers.

Place a bet: The big question in this is whether you’re willing to take a bet or play it safe.

Being a first mover in SEO had massive benefits, as incumbents tend to stay incumbents, mainly thanks to strong backlink profiles, robust user signals, and brand familiarity.

For now, ChatGPT uses Bing search results to ground and weigh answers, which means sites with strong visibility on Bing also have a high chance of being very visible in CGS.

However, there is a chance that using Search for RAG (grounding) is just a jumping-off point until AI Search platforms have gathered enough of their own data (queries and user behavior).

Early in this transition period, not much changes. Content that ranks well in traditional search engines, specifically Bing, gets a higher weighting in CGS, which means traditional SEO has a big impact on visibility in AI Search.

AI Chatbot referral traffic is skyrocketing, and ChatGPT’s new search capability could accelerate that growth even more.

Outgoing referral traffic from chatgpt.com increased massively in August and September, according to Similarweb.

Image Credit: Kevin Indig

Noticeable call-outs:

  • YouTube’s referral traffic increased from 0.17% in July to 3.9% in September.
  • Bing grew from 0% in April to 1.8% in September.
  • Amazon grew from 0% in July to 1.1% in September.

If referral traffic keeps growing at the same rate, it will get interesting in the next six to 12 months. It’s not just the volume but also the quality of traffic.

People use longer and more complex questions when they engage with AI answers, according to Sundar Pichai. Length is a way to be more specific.

Longer questions allow search engines, LLMs, and marketers to better understand and serve users what they want.

Based on conversations and observations, referral traffic from AI chatbots isn’t higher quality than search traffic in every case, but it is in most.

Looking Forward

I’m leaving you with two interesting questions:

1. Is it a coincidence that ChatGPT Search came out three days after Apple Intelligence launched publicly?

Apple launched Apple Intelligence, which uses ChatGPT in certain situations:

Apple is integrating ChatGPT access into experiences within iOS 18, iPadOS 18, and macOS Sequoia, allowing users to access its expertise — as well as its image- and document-understanding capabilities — without needing to jump between tools. Siri can tap into ChatGPT’s expertise when helpful.

Users are asked before any questions are sent to ChatGPT, along with any documents or photos, and Siri then presents the answer directly.

Additionally, ChatGPT will be available in Apple’s systemwide Writing Tools, which help users generate content for anything they are writing about. With Compose, users can also access ChatGPT image tools to generate images in a wide variety of styles to complement what they are writing.

We also know how valuable Google’s exclusive search deal with Apple is.

From Monopoly:

Apple’s impact on Google Search is massive. The court documents reveal that 28% of Google searches (US) come from Safari and make up 56% of search volume. Consider that Apple sees 10 billion searches per week across all of its devices, with 8 billion happening on Safari and 2 billion from Siri and Spotlight.

“Google receives only 7.6% of all queries on Apple devices through user-downloaded Chrome” and “10% of its searches on Apple devices through the Google Search App (GSA).” Google would take a big hit without the exclusive agreement with Apple.

Since Search is part of ChatGPT, any API request could trigger the new Search feature.

As a result, ChatGPT has a direct line to searches and actions on Apple devices whenever Apple Intelligence uses ChatGPT. Is that integration the new version of Google’s deal with Apple?

I speculated that OpenAI could work on a browser in Search GPT:

If the main benefit of Search GPT for OpenAI is a revenue stream and access to more user data, the next logical step for OpenAI is to build an (AI-powered) browser.

Browser data is incredibly valuable for understanding user behavior, personalization and LLM training. Best of all, it’s app-agnostic, so OpenAI could learn from users even when they use Perplexity or Google.

We’ve seen the power of browser data in the Google lawsuit, where it turned out Google relied on Chrome data all along for ranking. The only layer that’s more powerful is the operating system and device layer.

OpenAI seems to be very aware of the importance of being the default when we look at how hard it pushes its Chrome extension, which changes the default browser search engine to ChatGPT.

2. As it’s likely that more users don’t browse the web but get answers from ChatGPT, Gemini, Perplexity, etc. directly, will the open web become a place primarily for bots instead of humans? And how would that change the purpose and look of websites?


1. Introducing ChatGPT search

2. Introducing Apple Intelligence, the personal intelligence system that puts powerful generative models at the core of iPhone, iPad, and Mac


Featured Image: Paulo Bobita/Search Engine Journal

What Is A Google Broad Core Algorithm Update? via @sejournal, @RyanJones

Since the first broad core algorithm update, there has been much confusion, debate, speculation, and questions about what exactly it is.

How does a core update differ from a named update that many SEO pros are used to?

Google’s acknowledgment of core updates is usually vague, offering little detail beyond the fact that the update occurred. As we’ll see in this post, there is a good reason for this.

Typically, core updates take a few days (or weeks) to fully roll out, with Google making one announcement at the start and another at its conclusion. This invariably leaves SEO professionals and site owners wondering how the core update impacted their rankings.

Understanding a broad core update and how it differs from other algorithm updates can provide insight into what may have caused a site’s rankings to go up, down, or stay the same.

So, What Exactly Is A Core Update?

First, let me get the obligatory “Google makes hundreds of algorithm changes per year, often more than one per day” boilerplate out of the way.

Many of the named updates we hear about (Penguin, Panda, Pigeon, Fred, and the much-talked-about Helpful Content Update – which has since been integrated into core) are implemented to address specific faults or issues in Google’s algorithms.

In the case of Penguin, it was link spam; in the case of Pigeon, it was local SEO spam. Each of these targeted one specific thing – and, in many cases, brought a new metric or new calculation or new machine learning model into the overall algorithm.

They all had a specific purpose but also required new data or new systems to run.

In these cases, Google (sometimes reluctantly) informed us what it was trying to accomplish or prevent with the algorithm update, and we were able to go back and remedy our sites.

A core update is different.

The way I understand it, a core update is a tweak or change to the main search algorithm itself.

It’s not adding anything “new” in terms of metrics, data, signals, machine learning, etc. It’s simply re-arranging or adjusting existing signals/factors and their importance.

It is believed that there are somewhere between 200 and 500 (or maybe more) ranking factors and signals – the exact number is unknown.

What a core update means to me is that Google slightly tweaked the importance, order, weights, or values of these signals.

Because of that, they can’t come right out and tell us what changed without revealing the secret sauce.

The simplest way to visualize this would be to imagine, let’s say, 200 factors listed in order of importance.

Now imagine Google changing the order of 42 of those 200 factors.

Rankings would change, but it would be a combination of many things, not due to one specific factor or cause.

Obviously, it isn’t that simple, but that’s a good way to think about a core update.
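To make that reordered-list idea tangible, here is a purely hypothetical sketch: two pages scored by three invented signals. Nothing about either page changes; only the weights do, and the ranking flips.

```python
# Two hypothetical pages, each scored on three made-up signals (0 to 1).
pages = {
    "page_a": {"content_quality": 0.9, "anchor_text": 0.4, "freshness": 0.6},
    "page_b": {"content_quality": 0.7, "anchor_text": 0.9, "freshness": 0.5},
}

def score(signals, weights):
    """Weighted sum of a page's signal values."""
    return sum(signals[name] * w for name, w in weights.items())

before = {"content_quality": 0.5, "anchor_text": 0.4, "freshness": 0.1}
# The "core update": content quality weighted up, anchor text weighted down.
after = {"content_quality": 0.7, "anchor_text": 0.2, "freshness": 0.1}

for weights in (before, after):
    ranking = sorted(pages, key=lambda p: score(pages[p], weights), reverse=True)
    print(ranking)  # ['page_b', 'page_a'] before, ['page_a', 'page_b'] after
```

Neither page was “penalized”; the scales simply tipped, which is exactly how a core update behaves from the outside.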

A good analogy would be your list of top 10 favorite Taylor Swift songs. You might re-order that occasionally based on what’s going on in your life – or she may update an older song or release some new music. All of that might change your list.

Here’s a purely made-up, slightly more complicated example of things Google might adjust and why it can’t tell us.

“In this core update, we increased the value of keywords in H1 tags by 2%, changed the core web vitals boost from 0.00001 to 0.0000001, decreased the value of the ratio of keyword trigrams covered by title trigrams, changed the D value in our PageRank calculation from .85 to .70, and started using a TF-iDUF retrieval method for logged in users instead of the traditional TF-PDF method.”

(I swear these are real things in information retrieval. I just have no idea if they’re real things used by Google.)

As you can see, these types of updates wouldn’t be helpful at all to SEO professionals – even if they understood them.

Put simply, a core update means Google changed the weight and importance of existing ranking factors and signals – and some results shifted because of it.

In its most complex form, Google ran a new training set through its machine learning ranking model. Quality raters gave this new set of results a higher Information Satisfaction (IS) score than the previous set, and the engineers have no idea which weights changed or how they changed because that’s just how machine learning works.

(We all know Google uses quality raters to rate search results. These ratings are one input into how it chooses one algorithm change over another – not how it rates your site. Whether it feeds this into machine learning is anybody’s guess. But it’s one possibility.)

It’s likely some random combination of weighting delivered more relevant results for the quality raters, so they tested it more, the test results (clicks!) confirmed it, and they pushed it live.

Remember clicks in the famous Google leak? This is one way they use those – for A/B testing new algorithm variants.

How Can You Recover From A Core Update?

I want to be very clear about language here.

A core algorithm update didn’t “penalize” you for something. It’s not adding negative weights. It most likely just rewarded another site more than yours when it comes to relevance, authority, and quality signals.

Unlike a major named update that targets specific things, a core update may tweak the values of everything.

New sites could be considered for this query, and old sites may no longer be. Many sites were probably updated. The user intent (click data) could have changed for this query, new entities could be deemed relevant to it, the link graph could have changed, etc.

The data changed, and the weights of factors were re-ordered.

Because websites are weighted against other websites relevant to your query (engineers call this a corpus), the reason your site dropped could be entirely different than the reason somebody else’s increased or decreased in rankings.

To put it simply, Google isn’t telling you how to “recover” because it’s likely a different answer for every website and query.

Maybe you were killing it with internal anchor text and doing a great job of formatting content to match user intent – and Google shifted the weights so that content formatting was slightly higher and internal anchor text was slightly lower.

(Again, hypothetical examples here.)

In reality, it was probably several minor tweaks that, when combined, tipped the scales slightly in favor of one site or another (think of our reordered list here).

Finding that “something else” that is helping your competitors isn’t easy – but it’s what provides SEO pros with some job security.

What About AI?

It’s worth pointing out that core updates do affect Google’s AI overviews and citations – as those things are powered by search.

Next Steps And Action Items

Rankings are down after a core update – now what?

First step: Figure out if there are any patterns in the pages that dropped. Were they all the same template? Same intent? Same private blog network of paid links? (Just kidding).

Google is good at patterns; we should be good at spotting them, too.

Your next step is to gather intel on the pages that are ranking where your site used to be.

Conduct a search engine results page (SERP) analysis to find positive correlations between pages that are ranking higher for queries where your site is now lower.

Try not to overanalyze the minute technical details, such as how fast each page loads or its core web vitals scores. These are very minor tiebreaker factors.

Pay attention to the content itself. As you go through it, ask yourself questions like:

  • Does it provide a better answer to the query than your article?
  • Does the content contain more recent data and current stats than yours?
  • What’s the information gain of this page compared to the others that rank? Does it just say the same stuff or does it offer more?
  • What is the intent of the user doing this query? Does this help them accomplish that better?

Google aims to serve content that provides the best and most complete answers to searchers’ queries. Relevance is the one ranking factor that will always win out over all others.

Take an honest look at your content to see if it’s as relevant today as it was prior to the core algorithm update.

From there, you’ll have an idea of what needs improvement.

The best advice for conquering core updates? Keep focusing on relevance, user intent, and the overall quality of your content.

Finally, don’t stop improving your site once you reach position 1 because the site in position 2 will not stop.

Yeah, I know – it’s not the answer anybody wants, and it sounds like Google propaganda, but it’s just the reality of what a core update is.

Nobody said SEO was easy.


Featured Image: BestForBest/Shutterstock

Structured Data Markup for Ecommerce Product Pages

Structured data markup helps optimize a site for search engines in two ways. First, it aids in understanding the content and purpose of a web page. For example, structured data will help search engines distinguish a page selling beer-making kits from an article about beer.

Second, structured data can enhance the appearance of an organic search listing, making it much more prominent. These enhancements — called “rich results” — can include:

  • Average rating stars
  • Product images
  • Pricing
  • Availability
  • Special offers
  • Shipping pricing
  • Shipping time
  • Return policies

Product Rich Results

Rich snippets make organic search listings much more noticeable, and they can add a competitive advantage by displaying lower pricing, higher ratings, and better delivery terms.

For example, in an “apple macbook pro 16 m3” search, Best Buy’s structured data claims affordable pricing options, while B&H Photo includes detailed delivery information, including free one-day delivery.

Hence rich snippets stand out in search results and likely drive targeted clicks and higher conversions since buyers land on a website with set expectations.

For a search of “apple macbook pro 16 m3,” Best Buy’s structured data shows prices from “$1,740.99 to $1,999.00,” while B&H Photo includes “Free 1-day delivery.”

More Visibility

Structured data markup helps brands stand out beyond Google’s traditional organic listings to include image packs, “popular products” sections, and “deals.”

For example, for a “buy laptops” search, Google generates a “Deals on Laptops” section that partly relies on structured data (and partly on Shopping feeds).

Searching “buy laptops” produces a “Deals on Laptops” section.

Search for “superman costume” in Google Images, and the results blend images into traditional organic listings, labeling images associated with product pages.

An image search for “superman costume” produces blended results with images of products labeled as such.

Types of Structured Data

Structured data markup sits in your code. Google supports three markup formats for generating rich snippets: JSON-LD, Microdata, and RDFa. Schema.org provides the vocabulary for that markup, recognized and recommended by Google and Bing and easily understood by non-coders. (JSON-LD helpfully resides inside a script tag, away from the HTML.)

Markup from Schema.org is now more or less ubiquitous. Hence the term “schema” is synonymous with “structured data markup.”

There’s schema to describe just about anything on a product detail page — pricing, ratings, shipping, and more.
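For illustration, a minimal Product snippet in JSON-LD might look like the following. All values (name, image, price, ratings) are placeholders; a real implementation would include whichever properties the page can actually support.

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Beer-Making Kit",
  "image": "https://www.example.com/images/kit.jpg",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "128"
  },
  "offers": {
    "@type": "Offer",
    "price": "49.99",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  }
}
</script>
```

The price and availability values here are exactly the fields that must stay in sync with inventory, which is why product schema needs to be generated dynamically.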

Implementation

Implementing product schema on a site depends on the content management system or platform. Shopify’s App Store lists a variety of free and paid apps for that purpose. Wix has a built-in solution.

Schema for products must be dynamic, requiring updates based on inventory. Only seasoned developers should attempt to code it manually.

Next, test and validate the schema once it’s set up. Then keep an eye on the “Enhancements” tab in Search Console to ensure Google can see the structured data and has no recommendations for improving it. (Search Console lists only the schema that results in rich snippets.)

Organic Rankings

Structured data has no direct impact on organic rankings. Google has confirmed this multiple times, as recently as last year.

However, it can influence rankings by clarifying the page’s content, thereby helping to rank for the right search queries at the right time.

Rich snippets from structured data can increase clicks and engagement. Clicks and on-page engagement are confirmed ranking signals. Thus rich snippets can improve organic positions by making results more attractive and setting the right expectations for searchers.

Top AI Search Engine Features Compared

AI-powered search engines have profoundly impacted how we obtain information and are lessening our reliance on established providers such as Google and others. Even Meta is reportedly working on AI-powered search.

Here is a list of AI search engines to explore, offering new options for research, marketing, and even advertising. In addition to new disruptive AI startups, I’ve included recent AI enhancements and applications from established providers.

AI Search Engines

SearchGPT is a new feature within OpenAI’s ChatGPT. OpenAI says the search offers up-to-date sports scores, news, stock quotes, weather info, and more, powered by real-time search and partnerships with data providers. ChatGPT will choose to search the web based on what users ask, or they can manually choose to search by clicking the web search icon. Search will be available at ChatGPT.com and in its desktop and mobile apps. Access for free users will roll out over the coming months.

Web page for OpenAI's SearchGPT

OpenAI’s SearchGPT

Google has included AI in its Search product to enhance its capabilities. Google also recently extended its real-time search for its Gemini AI platform, enabling its language models to access current information from Google Search. The Grounding with Google Search feature is available for Gemini API and AI Studio, enabling improved accuracy in the model’s responses. When it enriches results with data from Search, Google provides supporting links back to the underlying sources.

Additionally, Google recently launched AI Overviews, which summarizes search results into short paragraphs. Google Search is gradually making AI Overviews available to more users and regions, providing AI-generated snapshots on applicable searches with key information and links to dig deeper.

Microsoft Copilot, formerly called Bing Chat, is an AI-powered digital assistant that engages in conversations and helps people with a range of tasks, such as answering queries or creating images or text drafts. Users can choose their chat style, such as “More Creative” or “More Precise,” to generate their preferred style of help. Copilot also provides citations to ensure the information is reliable. It is powered by OpenAI’s GPT-4 and is free to use but requires a Microsoft account.

You.com is an AI search tool and productivity platform for individuals and businesses. You.com says it’s the first company to connect an LLM to live web access for real-time responses with citations. The platform is built on ChatGPT, Claude, and other models for its LLM capabilities, enabling complex searches and accurate results by carefully controlling which models are prompted and how. It’s developing shared AI workspace tools where multiple users can add documents and then summarize or ask questions among a team. You.com aims to provide next-generation AI research with clickable citations that are deep-linked and in context.

Perplexity, backed by Jeff Bezos and Nvidia, is an AI-powered conversational search engine. The standard version uses OpenAI’s GPT-3.5. Users can create a user profile so the search engine can optimize for personalization. The focus option tailors searches, such as Web, Writing, and Academic. Users can upload documents to get summary information and detailed insights. Users can enhance their search experience with the copilot feature for conversational interaction and thorough results. The free version of Perplexity offers unlimited quick searches and limits the conversational Pro Searches to five per day. The Pro version provides 300 daily searches, access to advanced AI models such as GPT-4 and Claude 3, unlimited file analysis, API credits, and more.

Home page for Perplexity

Perplexity

Brave is a privacy-first search engine built into the Brave browser. Brave uses multiple self-hosted open-source LLMs and erases the search query information when a session has ended. In June, Brave integrated its search results with its AI chatbot, Leo, enabling users to research information without navigating the search page. A premium version of Leo is also available, offering access to better models and higher limit rates. When accessing the AI search feature, users can choose an external LLM from a partner such as Anthropic, and Brave automatically shields the users’ IP addresses and identifiable information. Brave now allows users to link their models to the browser.

Andi is an AI search tool that provides answers and content. Andi uses generative AI and large language models combined with live data, smart algorithms, and semantic search technology. Andi’s proprietary Trantora backend analyzes the meaning and credibility of web content, not just keywords. Andi provides accurate answers free from ads, spam, and misinformation. Recently, in the inaugural SearchBench AI benchmark, an independent evaluation of AI search accuracy, Andi received an accuracy score of 87% versus 71% for Google Gemini and 62% for OpenAI ChatGPT.

Komo is a free AI-powered search engine that creates personalized experiences for searchers with instant, reliable information, and no ads. It’s designed primarily to help individuals and businesses conduct in-depth research. Komo’s recent Search Everywhere feature enhances the journey by enabling users to dive deeper into any source of information. As they read through AI search responses, users can select text or click on the highlighted text, learn more, ask follow-up questions, get quick links to related websites, or explore opinions on the selected topic.

Arc Search from The Browser Company is an AI-powered mobile browser. Its “Browse for me” feature finds answers to a search query by generating a web page of information from at least six sources powered by models from OpenAI and others. Recently, Arc Search added an AI-powered “Call Arc” feature that lets users get quick answers on the go. Users access the feature by opening the app, holding the phone to an ear, and asking questions. The app provides instant voice responses; users can then ask follow-up questions.

Home page for Arc Search

Arc Search

Waldo AI is a search tool that delivers relevant and personalized results that consider user context, preferences, and previous interactions. Users can automatically generate scheduled or recurring research with insights, competitive intel, and trends, with full transparency on publication sources. Waldo integrates with paid-access sources and can pull sentiment data from the major social networks for up-to-date insights.

Consensus is an AI-powered academic search engine with a mission to make scientific and evidence-based knowledge more accessible. Consensus uses language models and search technology to surface relevant research papers and then synthesizes topic-level and paper-level insights. Its current source material comes from Semantic Scholar, which includes over 200 million documents across all domains of science, updated monthly.

Phind is an AI search engine for developers, finding and generating coding solutions with references. The Playground mode is a vanilla chat interface for learning about its models. The Ask mode is a default experience for answering technical questions, automatically deciding when to search the internet for additional context.

DuckDuckGo is an independent browser and privacy-focused search engine that doesn’t track a user’s search history. DuckDuckGo recently launched a beta of DuckAssist, a generative AI-assisted feature to provide instant, sourced answer summaries in search results. Answers are intentionally short, with prominently cited sources to click through and learn more. Additionally, DuckDuckGo has released DuckDuckGo AI Chat, a free and anonymous way to access four popular generative AI chatbots. The company says its AI chat can access popular chatbots such as OpenAI’s GPT-3.5 Turbo, Anthropic’s Claude 3 Haiku, and two open-source models (Meta Llama 3 and Mistral’s Mixtral 8x7B), with more to come.

Web page for DuckDuckGo's DuckAssist

DuckDuckGo’s DuckAssist

ChatGPT Search Indexing: Essential Steps For Websites via @sejournal, @MattGSouthern

As the availability of ChatGPT Search expands, understanding its indexing mechanics will be vital for digital visibility.

While Bing’s index plays a key role, OpenAI’s system surfaces content using its own crawlers and attribution methods.

Here is a breakdown of the technical requirements for ensuring your website is indexed correctly.

Technical Framework

ChatGPT Search combines Bing’s search index with OpenAI’s proprietary technology.

According to OpenAI’s technical documentation, the platform utilizes a fine-tuned version of GPT-4o, enhanced with synthetic data generation techniques and integration with their o1-preview system.

The platform employs three distinct crawlers, each serving different purposes.

The OAI-SearchBot serves as the primary crawler for search functionality, while ChatGPT-User handles real-time user requests and enables direct interaction with external applications.

The third crawler, GPTBot, manages AI model training and can be blocked without affecting search visibility.

Implementation

Proper indexing begins with robots.txt configuration.

Your website’s robots.txt should specifically allow OAI-SearchBot while maintaining separate permissions for different OpenAI crawlers.

In addition to this basic configuration, websites must ensure proper indexing by Bing and maintain a clear site architecture.

It’s worth noting that allowing OAI-SearchBot doesn’t automatically mean the content will be used for AI training.

It can take approximately 24 hours for OpenAI’s systems to adjust to new crawling directives after a site’s robots.txt update.
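A minimal robots.txt along these lines might look like the sketch below. The crawler names are the ones OpenAI documents; blocking GPTBot is optional and shown only to illustrate separate permissions for search versus model training.

```text
# Allow the crawler that powers ChatGPT Search
User-agent: OAI-SearchBot
Allow: /

# Optionally opt out of AI model training without affecting search visibility
User-agent: GPTBot
Disallow: /
```

Keep in mind the roughly 24-hour lag noted above before new directives take effect on OpenAI’s side.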

Content Attribution

ChatGPT Search includes several key features for content publishers:

  • Source Attribution: All referenced content includes proper citation
  • Source Sidebar: Provides reference links for verification
  • Multiple Citation Opportunities: A single query can generate multiple source citations
  • Locations: Searches for specific locations will return an interactive map.
Image Credit: OpenAI

Additional Considerations

Recent testing has revealed several important factors:

  • Content freshness affects visibility
  • Pages behind paywalls can still be cited
  • URLs returning 404 errors may still appear in citations
  • Multiple pages from the same domain can be referenced in a single response

Recommendations

Indexing in ChatGPT requires ongoing attention to technical health, including regular verification of the robots.txt file and crawler access.

Publishers should prioritize maintaining factual accuracy and up-to-date information while implementing a clear content structure.

This ensures that pages remain accessible across traditional search engines and AI-powered platforms, helping websites achieve broader visibility.


Featured Image: designkida/Shutterstock

Making Social Media & SEO Work Together via @sejournal, @krisjonescom

The synergy between social media and SEO is critical in modern digital marketing.

As a seasoned SEO professional with 26 years of experience, I’ve observed how recent developments – such as Google’s algorithm updates and the rise of Artificial Intelligence Optimization (AIO) – have reshaped the interplay between social media and search engine optimization.

Understanding this dynamic is essential for businesses aiming to enhance their online visibility and connect with their target audience effectively.

In this article, we’ll explore how social media can significantly boost your SEO efforts.

We’ll delve into practical examples, consider the impact of recent industry changes, and provide up-to-date references to help you navigate this complex landscape.

The Evolving Relationship Between Social Media And SEO

A Brief History And Recent Developments

Historically, the connection between social media and SEO has been nuanced.

While Google has consistently stated that social signals (likes, shares, comments) are not direct ranking factors, the indirect benefits are undeniable.

The Google leak of 2024, which unveiled insights into the search giant’s algorithmic considerations, highlighted the growing importance of user engagement metrics – many of which are influenced by social media activity.

Moreover, the introduction of AIO has further intertwined social media with SEO.

AIO leverages AI to optimize content and user experiences across platforms, including social media.

With AI algorithms now better at understanding content context and user intent, the lines between social media engagement and SEO performance are blurring.

The Impact of AI And Algorithm Updates On Social Signals

Google’s advancements in AI, particularly with the BERT and MUM algorithms, have enhanced the search engine’s ability to interpret natural language and user intent.

These developments mean that content shared on social media – especially when it generates significant engagement – can influence how search engines perceive and rank your website indirectly.

For instance, a viral social media post can lead to increased brand searches on Google, which signals your brand’s authority and relevance to the search engine.

Additionally, AI-powered tools now analyze social media trends to inform SEO strategies, making social media an indispensable component of SEO planning.
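As a simple illustration of that kind of analysis, the sketch below ranks social post topics by average engagement rate to surface candidate topics for SEO content. The data, function names, and the engagement formula are all hypothetical placeholders, not any particular tool’s method.

```python
def engagement_rate(post):
    """Engagement (likes + shares + comments) per impression."""
    interactions = post["likes"] + post["shares"] + post["comments"]
    return interactions / post["impressions"] if post["impressions"] else 0.0

def top_topics(posts, n=3):
    """Return topics ordered by average engagement rate, highest first."""
    totals = {}
    for post in posts:
        rate_sum, count = totals.get(post["topic"], (0.0, 0))
        totals[post["topic"]] = (rate_sum + engagement_rate(post), count + 1)
    averages = {topic: s / c for topic, (s, c) in totals.items()}
    return sorted(averages, key=averages.get, reverse=True)[:n]

# Hypothetical post data exported from a social platform.
posts = [
    {"topic": "sustainable living", "likes": 120, "shares": 40,
     "comments": 25, "impressions": 2000},
    {"topic": "solar panels", "likes": 30, "shares": 5,
     "comments": 2, "impressions": 1500},
    {"topic": "sustainable living", "likes": 90, "shares": 60,
     "comments": 30, "impressions": 1800},
]

print(top_topics(posts))  # topics your SEO content plan might prioritize
```

In practice, a tool would pull this data from platform APIs and weight signals far more carefully; the point is simply that engagement data from social channels can be turned into a ranked topic list that feeds SEO planning.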

How Social Media Helps SEO

1. Amplifying Content Reach And Engagement

Social media platforms serve as powerful amplifiers for your content, extending its reach beyond your website’s regular audience.

By actively sharing and promoting your content on these platforms, you tap into a vast network of potential readers, customers, and influencers who can engage with and spread your message further.

Example: Imagine you’ve published a comprehensive guide on sustainable living on your website. By sharing this guide on your social media platforms, you expose it to a broader audience who may not have found it through search engines alone. If your post gains traction – receiving shares, comments, and likes – it can drive substantial traffic to your site.

Recent Changes: With the rise of short-form video content on platforms like TikTok and Instagram Reels, sharing snippets or highlights of your content can entice users to visit your website for the full version. These platforms’ algorithms favor engaging content, increasing the likelihood of your content reaching a wider audience.

2. Enhancing Link Building Opportunities

Backlinks remain a cornerstone of SEO, signaling to search engines that your content is valuable and authoritative.

Social media can be a catalyst for earning these backlinks by exposing your content to individuals and organizations that may link to it from their own websites or blogs.

Example: A tech startup publishes an insightful article about emerging technologies. Promoting the article on LinkedIn puts it in front of industry professionals, who may reference it in their own blogs or articles, creating valuable backlinks.

Recent Changes: Google has provided guidance that focusing too much on links can be a distraction from what matters most to your audience. While links are still important, engagement with your content is becoming a key metric of effectiveness. Social media is an excellent way to both encourage and measure engagement with your content. A strong social media strategy encourages the most valuable types of links: earned links based on content that resonates with your audience.

3. Social Profiles In SERPs And Brand Visibility

Your social media profiles are extensions of your brand and can occupy prominent positions in search engine results pages (SERPs).

Optimizing these profiles not only strengthens your online presence but also provides additional pathways for users to discover and engage with your brand.

In recent updates, Google has emphasized results from user-generated content on platforms like Reddit, so being present in relevant conversations and managing your brand on social platforms is becoming more critical to your SEO strategy.

Example: When users search for your brand, your social media profiles can appear alongside your website in the search results, occupying more real estate on the SERP and increasing your brand’s visibility.

Recent Changes: With Google’s continuous updates, there is a greater emphasis on providing users with comprehensive information. Social media profiles often appear in the knowledge panel or as rich results, offering users direct access to your latest updates and engagement opportunities.
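One practical way to help search engines connect your official profiles to your brand entity – and to the knowledge panel – is `Organization` structured data with `sameAs` links. The sketch below uses standard schema.org vocabulary; the brand name and URLs are placeholders.

```json
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Brand",
  "url": "https://www.example.com",
  "sameAs": [
    "https://www.linkedin.com/company/example-brand",
    "https://x.com/examplebrand",
    "https://www.instagram.com/examplebrand"
  ]
}
```

Placed in a `<script type="application/ld+json">` tag on your homepage, this markup gives search engines an explicit link between your site and your official social profiles. It doesn’t guarantee a knowledge panel, but it removes ambiguity about which profiles belong to your brand.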

4. Building Brand Trust And Authority

In a crowded digital marketplace, establishing trust and authority is essential.

Social media allows you to showcase your expertise, engage authentically with your audience, and build a community around your brand – all of which contribute to a stronger, more trustworthy online presence.

Example: A financial advisor regularly shares expert insights on X (Twitter), engaging with followers’ questions and discussions. This consistent presence builds credibility, encouraging users to visit their website for more in-depth information.

Recent Changes: The integration of user-generated content and reviews on social media platforms can influence public perception. Google considers brand reputation in its E-E-A-T (Experience, Expertise, Authoritativeness, Trustworthiness) criteria, and a strong social media presence contributes to this.

Conclusion

Social media and SEO are more interconnected than ever.

The evolution of Google’s algorithms and the advent of AI optimization have amplified the impact social media can have on your SEO efforts.

By strategically leveraging social platforms to share content, engage with your audience, and build brand authority, you indirectly boost your search engine rankings.

Remember, while social media signals may not be direct ranking factors, the ripple effects – such as increased traffic, enhanced backlink opportunities, and improved brand perception – play a significant role in your overall SEO performance.

Embrace the synergy between social media and SEO to stay ahead in this dynamic environment.

By integrating these strategies into your digital marketing plan, you’ll not only enhance your SEO efforts but also build a more robust and engaged online presence.

Stay informed about the latest developments, and don’t hesitate to adapt your approach as the digital landscape continues to evolve.

Featured Image: gonin/Shutterstock