Archive

Generative AI

Google Labs & DeepMind Launch Pomelli AI Marketing Tool via @sejournal, @MattGSouthern

Pomelli, a Google Labs & DeepMind AI experiment, builds a “Business DNA” from your site and generates editable branded campaign assets for small businesses.

Pomelli scans your website to create a “Business DNA” profile.

It uses the created profile to keep content consistent across channels.

It suggests campaign ideas and generates editable marketing assets.

Generative AI

Why The Build Process Of Custom GPTs Matters More Than The Technology Itself

When Google introduced the transformer architecture in its 2017 paper “Attention Is All You Need,” few realized how much it would help transform digital work. Transformer architecture laid the foundations for today’s GPTs, which are now part of our daily work in SEO and digital marketing.

Search engines have used machine learning for decades, but it was the rise of generative AI that made many of us actively explore AI. AI platforms and tools like custom GPTs are already influencing how we research keywords, generate content ideas, and analyze data.
The real value, however, is not in using these tools to cut corners. It lies in designing them intentionally, aligning them with business goals, and ensuring they serve users’ needs.
This article is not a tutorial on how to build GPTs. I share why the build process itself matters, what I have learned so far, and how SEOs can use this product mindset to think more strategically in the age of AI.
From Barriers To Democratization
Not long ago, building tools without coding experience meant relying on developers, dealing with long lead times, and waiting for vendors to release new features. That has changed. The democratization of technology has lowered the entry barriers, making it possible for anyone with curiosity to experiment with building tools like custom GPTs. At the same time, expectations have risen in step: we now expect tools to be intuitive, efficient, and genuinely useful.
This is one reason technical skills still matter, but they’re not enough on their own. What matters more, in my opinion, is how we apply them. Are we solving a real problem? Are we creating workflows that align with business needs?
The strategic questions SEOs should be asking are no longer just “Can I build this?” but:

Should I build this?
What problem am I solving, and for whom?
What’s the ultimate goal?

Why The Build Process Matters
Building a custom GPT is straightforward. Anyone can add a few instructions and click “save.” What really matters is what happens before and after: defining the audience, identifying the problem, scoping the work realistically, testing and refining outputs, and aligning them with business objectives.
In many ways, this is what good marketing has always been about: understanding the audience, defining their needs, and designing solutions that meet them.
As an international SEO, I’ve often seen cultural relevance and digital accessibility treated as afterthoughts. OpenAI’s custom GPTs offered me a way to explore whether AI could help address these challenges, especially since the tool is accessible to those of us without any coding expertise.
What began as a single project to improve cultural relevance in global SEO soon evolved into two separate GPTs when I realized the scope was larger than I could manage at the time.
That change wasn’t a failure, but a part of the process that led me toward a better solution.
Case Study: 2 GPTs, 1 Lesson
The Initial Idea
My initial idea was to build a custom GPT that could generate content ideas tailored to the UK, US, Canada, and Australia, taking both linguistic and cultural nuances into account.
As an international SEO, I know it is hard to engage global audiences who expect personalized experiences. Translation alone is not enough. Content must be linguistically accurate and contextually relevant.
This mirrors the wider shift in search itself. Users now expect personalized, context-driven results, and search engines are moving in that same direction.
A Change In Direction
As I began building, I quickly realized that the scope was bigger than expected. Capturing cultural nuance across four different markets while also learning how to build and refine GPTs required more time than I could commit at that moment.
Rather than abandoning the project, I reframed it as a minimum viable product. I revisited the scope and shifted focus to another important challenge, but one with a more consistent set of requirements – digital accessibility.
The accessibility GPT was designed to flag issues, suggest inclusive phrasing, and support internal advocacy. It adapted outputs to different roles, so SEOs, marketers, and project managers could each use it in relevant ways in their day-to-day work.
This wasn’t giving up on the content project. It was a deliberate choice to learn from one use case and apply those lessons to the next.
The Outcome
Working on the accessibility GPT first helped me think more carefully about scope and validation, which paid off.
As accessibility requirements are more consistent than cultural nuance, it was easier to refine prompts and test role-specific outputs, ensuring an inclusive, non-judgmental tone.
I shared the prototype with other SEOs and accessibility advocates, and their feedback was invaluable. Although generally positive, they pointed out inconsistencies I hadn’t seen, including in how I described the prompt in the GPT store.
After all, accessibility is not just about alt text or color contrast. It’s about how information is presented.
Once the accessibility GPT was running, I went back to the cultural content GPT, better prepared, with clearer expectations and a stronger process.
The key takeaway here is that the value lies not only in the finished product, but in the process of building, testing, and refining.
Risks And Challenges Along The Way
Not every risk became an issue, but the process brought its share of challenges.
The biggest was underestimating time and scope, which I solved by revisiting the plan and starting smaller. There were also platform limitations – ongoing model development, AI fatigue, and hallucinations. OpenAI itself has admitted that hallucinations are mathematically unavoidable. The best response is to be precise with prompts, keep instructions detailed, and always maintain a human-in-the-loop approach. GPTs should be seen as assistants, not replacements.
Collaboration added another layer of complexity. Feedback loops depended on colleagues’ availability, so I had to stay flexible and allow extra time. Their input, however, was crucial – I couldn’t have made progress without them. As none of these factors were under my control, I could only stay on top of developments and handle them as best I could.
These challenges reinforced an important truth: Building strategically isn’t about chasing perfection, but about learning, adapting, and improving with each iteration.
Applying Product Thinking
The process I followed was similar to how product managers approach new products. SEOs can adopt the same mindset to design workflows that are both practical and strategic.
Validate The Problem
Not every issue needs AI – and not every issue needs solving. Identify and prioritize what really matters at the time, and confirm whether a custom GPT, or any other tool, is the right way to address it.
Define The Use Case
Who will use the GPT, and how? A wide reach may sound appealing, but value comes from meeting specific needs. Otherwise, success can quickly fade away.
My GPTs are designed to support SEOs, marketers, and project managers in different scenarios of their daily work.
Prototype And Test
There is real value in starting small. With GPTs, I needed to write clear, specific instructions, then review the outputs and refine.
For instance, instead of asking the accessibility GPT for general ideas on making a form accessible, I instructed it to act as an SEO briefing developers on fixes or as a project manager assigning tasks.
For the content GPT, I instructed it to act as a UK/US content strategist, developing inclusive, culturally relevant ideas for specific publications in British English or Standard American English.
Iterate With Feedback
Bring colleagues and subject-matter experts into the process early. Their insights challenge assumptions, highlight inconsistencies, and make outputs more robust.
Keep On Top Of Developments
AI platforms evolve quickly, and processes also need to adapt to different scenarios. Product thinking means staying agile, adapting to change, and reassessing whether the tools we build still serve their purpose.
The troubled rollout of GPT-5 reminded me how volatile the landscape can be.
Practical Applications For SEOs
Why build GPTs when there are already so many excellent SEO tools available? For me, it was partly curiosity and partly a way to test what I could achieve with my existing skills before suggesting a collaboration for a different product.
Custom GPTs can add real value in specific situations, especially with a human-in-the-loop approach. Some of the most useful applications I have found include:

Analyzing campaign data to support decision-making.
Assisting with competitor analysis across global markets.
Supporting content ideation for international audiences.
Clustering keywords or highlighting internal linking opportunities.
Drafting documentation or briefs.

The point is not to replace established tools or human expertise, but to use them as assistants within structured workflows. They can free up time for deeper thinking, while still requiring careful direction and review.
How SEOs Can Apply Product Thinking
Even if you never build a GPT, you can apply the same mindset in your day-to-day work. Here are a few suggestions:

Frame challenges strategically: Ask who the end user is, what they need, and what is broken in their experience. Don’t start with tactics without context.
Design repeatable processes: Build workflows that scale and evolve over time, instead of one-off fixes.
Test and learn: Treat tactics like prototypes. Run experiments and refine based on results. If A/B testing isn’t possible, as is often the case, at least be open to making adjustments where needed.
Collaborate across teams: SEO does not exist in isolation. Work with UX, development, and content teams early. The key is to find ways to add value to their work.
Redefine success metrics: Qualified traffic, conversions, and internal process improvements all matter in the AI era. Success should reflect actual business impact.
Use AI strategically: Quick wins are tempting, but GPTs and other tools are best used to support structured workflows and highlight blind spots. Keep a human-in-the-loop approach to ensure outputs are accurate and relevant to your business needs.

Final Thought
The real innovation is not in the technology itself, but in how we choose to apply it.
We are now in the fifth industrial revolution, a time when humans and machines collaborate more closely than ever.
For SEOs, the opportunity is to move beyond tactical execution and start thinking like product strategists. That means asking sharper questions, testing hypotheses, designing smarter workflows, and creating solutions that adapt to real-world constraints.
It is about providing solutions, not just executing tasks.

Featured Image: SvetaZi/Shutterstock

SEO

How Google Discover REALLY Works

This is all based on the Google leak and tallies up with my experience of content that does well in Discover over time. I have pulled out what I think are the most prominent Discover proxies and grouped them into what seems like the appropriate workflow.

Like a disgraced BBC employee, thoughts are my own.
TL;DR

Your site needs to be seen as a “trusted source” with low spam, evaluated by proxies like publisher trust score, in order to be eligible.
Discover is driven by a six-part pipeline, using good vs. bad clicks (long dwell time vs. pogo-sticking) and repeat visits to continuously score and re-score content quality.
Fresh content gets an initial boost. Success hinges on a strong CTR and positive early-stage engagement (good clicks/shares from all channels count, not just Discover).
Content that aligns with a user’s interests is prioritized. To optimize, focus on your areas of topical authority, use compelling headlines, be entity-driven, and use large (1200px+) images.

Image Credit: Harry Clarkson-Bennett
I count 15 different proxies that Google uses to satiate the doomscrollers’ desperate need for quality content in the Discover feed. It’s not that different from how traditional Google search works.
But traditional search (a high-quality pull channel) is worlds apart from Discover: audiences killing time on trains, at their in-laws’, on the toilet. Yet because they’re part of the same ecosystem, the two are bundled together into one monolithic entity.
And here’s how it works.
Image Credit: Harry Clarkson-Bennett
Google’s Discover Guidelines
This section is boring, and Google’s guidelines around eligibility are exceptionally vague:

Content is automatically eligible to appear in Discover if it is indexed by Google and meets Discover’s content policies.
Any kind of dangerous, spammy, deceptive, or violent/vulgar content gets filtered out.

“…Discover makes use of many of the same signals and systems used by Search to determine what is… helpful, reliable, people-first content.”
Then they give some solid, albeit beige, advice: quality titles (clicky, not baity, as John Shehata would say), a featured image at least 1200px wide, and timely, value-added content.
But we can do better.
Discover’s Six-Part Content Pipeline
From cradle to grave, let’s review exactly how your content does or, in most cases, doesn’t appear in Discover. As always, remembering I have made these clusters up, albeit based on real Google proxies from the Google leak.

Eligibility check and baseline filtering.
Initial exposure and testing.
User quality assessment.
Engagement and feedback loop.
Personalization layer.
Decay and renewal cycles.

Eligibility And Baseline Filtering
For starters, your site has to be eligible for Google Discover. This means you are seen as a “trusted source” on the topic, and your spam score is low enough that the threshold isn’t triggered.
There are three primary proxy scores to account for eligibility and baseline filtering:

is_discover_feed_eligible: a Boolean feature that filters non-eligible pages.
publisher_trustScore: a score that evaluates publisher reliability and reputation.
topicAuthority_discover: a score that helps Discover identify trusted sources at the topic level.

The site’s reputation and topical authority are ranked for the topic at hand. These three metrics help evaluate whether your site is eligible to appear in Discover.
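Taken together, the three proxies suggest a simple gate. Here is a minimal sketch in Python, assuming the leaked names map to one boolean and two normalized scores; the threshold values are invented for illustration, not known Google numbers:

```python
def is_discover_eligible(feed_eligible: bool,
                         publisher_trust_score: float,
                         topic_authority: float,
                         trust_threshold: float = 0.5,
                         authority_threshold: float = 0.5) -> bool:
    """Gate a page on the three leaked eligibility proxies.

    The thresholds are illustrative guesses, not known Google values.
    """
    return (feed_eligible
            and publisher_trust_score >= trust_threshold
            and topic_authority >= authority_threshold)
```

The point of the sketch is simply that eligibility is a hard filter: if any one proxy falls short, nothing downstream in the pipeline matters.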
Initial Exposure And Testing
This is very much the freshness stage, where fresh content is given a temporary boost (because contemporary content is more likely to satiate a dopamine-addicted mind).

freshnessBoost_discover: provides a temporary boost for fresh content to keep the feed alive.
discover_clicks: where early-stage article clicks are used as a predictor of popularity.
headlineClickModel_discover: is a predictive CTR model based on the headline and image.

I would hypothesize that, using a Bayesian-style predictive model, Google applies learnings at a site and subfolder level to predict likely CTR. The more quality content you have published over time (presumably at a site, subfolder, and author level), the more likely you are to feature.
Because there is less ambiguity. A key feature of SEO now.
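That hypothesis can be sketched as Beta-Binomial smoothing: a new article’s predicted CTR starts at the site’s (or subfolder’s) historical CTR and shifts toward its own observed performance as impressions accumulate. The prior weight and all numbers below are assumptions for illustration:

```python
def predicted_ctr(prior_clicks: int, prior_impressions: int,
                  clicks: int, impressions: int,
                  prior_weight: float = 200.0) -> float:
    """Beta-Binomial CTR estimate for a fresh article.

    The prior is centered on historical site/subfolder CTR; `prior_weight`
    is the pseudo-count strength of that prior (an assumed value).
    """
    prior_ctr = prior_clicks / prior_impressions
    alpha = prior_ctr * prior_weight + clicks                        # successes
    beta = (1 - prior_ctr) * prior_weight + (impressions - clicks)   # failures
    return alpha / (alpha + beta)
```

With no impressions, the estimate is simply the site’s historical CTR; strong early clicks pull it up quickly, which is consistent with the freshness-testing behavior described above.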
User Quality Assessment
An article is ultimately judged by the quality of user engagement. Google uses the good and bad click style model from Navboost to establish what is and isn’t working for users. Low CTR and/or pogo-sticking style behavior downgrades an article’s chance of featuring.
Valuable content is decided by the good vs bad click ratio. Repeat visits are used to measure lasting satisfaction and re-rank top-performing content.

discover_blacklist_score: Penalty for spam, misinformation, or clickbait.
goodClicks_discover: Positive user interactions (long dwell time).
badClicks_discover: Negative interactions (bounces, short dwell).
nav_boosted_discover_clicks: Repeat or return engagement metric.

The quality of the article is then measured by its user engagement. As Discover is a personalized platform, this can be done accurately and at scale. Cohorts of users can be grouped together. People with the same general interests are served the content if, by the algorithm’s standard, they should be interested.
But if the overly clicky or misleading title delivers poor engagement (dwell time and on-page interactions), then the article may be downgraded. Over time, this kind of practice can compound and nerf your site completely.
Headlines like this are a one-way ticket to devaluing your brand in the eyes of people and search engines (Image Credit: Harry Clarkson-Bennett).
Important to note that this click data doesn’t have to come from Discover. Once an article is out in the ether – it’s been published, shared on social, etc. – Chrome click data is stored and is applied to the algorithm.
So, the more quality click data and shares you can generate early in an article’s lifecycle (accounting for the importance of freshness), the better your chance of success on Discover. Treat it like a viral platform. Make noise. Do marketing.
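One way to picture the Navboost-style judgment above is a smoothed good-to-bad click ratio in which repeat visits count extra. The weighting here is pure guesswork on my part, not a leaked value:

```python
def engagement_score(good_clicks: int, bad_clicks: int,
                     repeat_visits: int, repeat_weight: float = 2.0) -> float:
    """Score in [0, 1]: share of 'good' engagement, with repeat visits
    weighted more heavily (the weight is an assumption for illustration)."""
    weighted_good = good_clicks + repeat_weight * repeat_visits
    total = weighted_good + bad_clicks
    return weighted_good / total if total else 0.0
```

Under this sketch, an article with mostly long-dwell clicks and some return visitors outscores one with the same CTR but heavy pogo-sticking, which matches the good-vs-bad-click model described above.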
Engagement And Feedback Loop
Once the article enters the proverbial fray, a scoring and rescoring loop begins. Continuous CTR, impressions, and explicit user feedback (like, hate, and “don’t show me this again, please” style buttons) feed models like Navboost to refine what gets shown.

discover_impressions: The number of times an article appears in a Discover feed.
discover_ctr: Clicks divided by impressions; impression and click data feed CTR modeling.
discover_feedback_negative: Specific user feedback, i.e., not interested suppresses content for individuals, groups, and on the platform as a whole.

These behavioral signals define an article’s success. It lives or dies on relatively simple metrics. And the more you use it, the better it gets. Because it knows what you and your cohort are more likely to click and enjoy.
This is as true in Discover as it is in the main algorithm. Google admitted as such in the DoJ rulings. (Image Credit: Harry Clarkson-Bennett)
I imagine headline and image data are stored so that the algorithm can apply some rigorous standards to statistical modeling. Once it knows what types of headlines, images, and articles perform best for specific cohorts, personalization becomes effective faster.
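As a rough sketch of those “relatively simple metrics”: CTR is clicks over impressions, and explicit negative feedback plausibly acts as a kill switch once its rate crosses some threshold. The suppression threshold here is made up for illustration:

```python
def feed_score(clicks: int, impressions: int,
               negative_feedback: int, suppress_rate: float = 0.02) -> float:
    """CTR-based score, zeroed out when the 'not interested' rate
    crosses an assumed suppression threshold."""
    if impressions == 0:
        return 0.0
    if negative_feedback / impressions >= suppress_rate:
        return 0.0  # discover_feedback_negative suppresses the article
    return clicks / impressions
```

The design point is that explicit feedback is not just another weighted signal: a small number of “don’t show me this” taps can outweigh an otherwise healthy CTR.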
Personalization Layer
Google knows a lot about us. It’s what its business is built on. It collects a lot of non-anonymized data (credit card details, passwords, contact details, etc.) alongside every conceivable interaction you have with webpages.
Discover takes personalization to the next level. I think it may offer an insight into how part of the SERP could look in the future: a personalized cluster of articles, videos, and social posts designed to hook you in, embedded somewhere alongside search results and AI Mode.
All of this is designed to keep you on Google’s owned properties for longer. Because they make more money that way.
Hint: They want to keep you around because they make more money (Image Credit: Harry Clarkson-Bennett)

contentEmbeddings_discover: Content embeddings determine how well the content aligns with the user’s interests. This powers Discover’s interest-matching engine.
personalization_vector_match: This module dynamically personalizes the user’s feed in real time by identifying similarity between content and user interest vectors.

Content that matches your own and your cohort’s interests well will be boosted into your feed.
You can see the sites you engage with frequently via Chrome’s site engagement page (from your address bar: chrome://site-engagement/), and every stored interaction via its histograms. This histogram data indirectly shows key interaction points you have with webpages by measuring the browser’s response and performance around those interactions.
It doesn’t explicitly say user A clicked X, but it logs the technical impact, i.e., how long the browser spent processing a given click or scroll.
Decay And Renewal Cycles
Discover boosts freshness because people are thirsty for it. By boosting fresh content, older or saturated stories naturally decay as the news cycle moves on and article engagement declines.
For successful stories, that decay arrives through market saturation: once most of the interested audience has seen a piece, engagement falls and the boost fades.

freshnessDecay_timer: This module measures recency decay after initial exposure, gradually reducing visibility to make way for fresher content.
content_staleness_penalty: Outdated content or topics are given a lower priority once engagement starts to decline to keep the feed current.
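A freshness timer like this is commonly modeled as exponential decay. Here is a sketch with an assumed 24-hour half-life; the real curve and constants are unknown:

```python
def freshness_multiplier(hours_since_publish: float,
                         half_life_hours: float = 24.0) -> float:
    """Visibility multiplier that halves every `half_life_hours`.

    The half-life is an assumption for illustration only; the real
    decay curve Google uses is not public.
    """
    return 0.5 ** (hours_since_publish / half_life_hours)
```

Whatever the actual shape, the practical implication is the same: your best window for generating quality clicks and shares is the first hours after publication.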

Discover is Google’s answer to a social network. None of us spend time in Google. It’s not fun. I use the word fun loosely. It isn’t designed to hook us in and ruin our attention spans with constant spiking of dopamine.
But Google Discover is clearly on the way to that. They want to make it a destination. Hence, all the recent changes where you can “catch up” with creators and publishers you care about across multiple platforms.
Videos, social posts, articles … the whole nine yards. I wish they’d stop summarizing literally everything with AI, however.
My 11-Step Workflow To Get The Most Out Of Google Discover
Follow basic principles and you will put yourself in good stead. Understand where your site is topically strong and focus your time on content that will drive value. There are multiple ways to do this.
If you don’t feature much in Discover, you can use your Search Console click and impression data to identify areas where you generate the highest value – where you are topically authoritative. I would do this at a subfolder and entity level (e.g., politics and Rachel Reeves or the Labour Party).
It’s also worth breaking this down in total and by article. Or you can use something like Ahrefs’ Traffic Share report to determine your share of voice via third-party data.
Essentially share of voice data (Image Credit: Harry Clarkson-Bennett)
Then really focus your time on a) areas where you’re already authoritative and b) areas that drive value for your audience.
Assuming you’re not focusing on NSFW content and you’re vaguely eligible, here’s what I would do:

Make sure you’re meeting basic image requirements. 1200 pixels wide as a minimum.
Identify your areas of topical authority. Where do you already rank effectively at a subfolder level? Is there a specific author who performs best? Try to build on your valuable content hubs with content that should drive extra value in this area.
Invest in content that will drive real value (links and engagement) in these areas. Do not chase clicks via Discover. It’s a one-way ticket to clickbait city.
Make sure you’re plugged into the news cycle. Being first has a huge impact on your news visibility in search. If you’re not first on the scene, make sure you’re adding something additional to the conversation. Be bold. Add value. Understand how news SEO really works.
Be entity-driven. In your headlines, first paragraph, subheadings, structured data, and image alt text. Your page should remove ambiguity. You need to make it incredibly clear who this page is about. A lack of clarity is partly why Google rewrites headlines.
Use the Open Graph title. The OG title is a headline that doesn’t show on your page. Primarily designed for social media use, it is one of the most commonly picked up headlines in Discover. It can be jazzy. Curiosity led. Rich. Interesting. But still entity-focused.
Make sure you share content likely to do well on Discover across relevant push channels early in its lifecycle. It needs to outperform its predicted early-stage performance.*
Create a good page experience. Your page (and site) should be fast, secure, ad-lite, and memorable for the right reasons.
Try to drive quality onward journeys. If you can treat users arriving from Discover differently from your main-site users, think about how you would link effectively for them. Maybe you use a pop-up “we think you’ll like this next” section based on a user’s scroll depth or dwell time.
Get the traffic to convert. While Discover is a personalized feed, the standard scroller is not very engaged. So, focus on easier conversions like registrations (if you’re a subscriber-first company) or advertising revenue.
Keep a record of your best performers. Evergreen content can be refreshed and republished year after year. It can still drive value.

*What I mean here is if your content is predicted to drive three shares and two links, if you share it on social and in newsletters and it drives seven shares and nine links, it is more likely to go viral.
As such, the algorithm identifies it as ‘Discover-worthy.’
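For step 6 above, the Open Graph title is a single meta tag in the page head. A hypothetical example (the headline, entity, and site name are invented for illustration):

```html
<head>
  <!-- On-page headline: clear and entity-led -->
  <title>Rachel Reeves Confirms Budget Tax Changes | Example News</title>
  <!-- OG title: the curiosity-led variant Discover often picks up -->
  <meta property="og:title" content="The Budget Change Nobody Saw Coming">
</head>
```

Note how the curiosity-led OG title still leans on the on-page headline’s clear entity signals rather than replacing them.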

This was originally published on Leadership in SEO.

Featured Image: Roman Samborskyi/Shutterstock

Local Search

How To Do A Complete Local SEO Audit: 11-Point Checklist via @sejournal, @JRiddall

Local SEO includes several specific tasks geared to establishing the relevance and authority of a business within a targeted geographic area.

Search engines and large language models (LLMs) like Google Gemini and ChatGPT reference many different data points to determine who will be surfaced in their respective result sets, which include AI Overviews and AI Mode in Google, featured snippets, local map packs, image or video carousels, and other emerging search formats.
So, how can you identify and prioritize optimizations with the greatest potential to deliver converting traffic to your website or your business door from traditional organic local SEO or AI search?
Below, we’ll walk through an evaluation of each key facet of your local search presence and uncover your best opportunities to improve your visibility in traditional organic and AI search.
The Local SEO Audit Checklist

1. Keyword Topic/AI Prompt Audit
2. Website Audit
3. Google Business Profile Audit
4. Review Monitoring And Management
5. Local Business Listing/Citation Audit
6. Backlink Audit
7. Local Content Audit
8. Google Search Console Review
9. Analytics Review
10. Competitor Analysis
11. AI Search For Local Businesses
12. Prioritizing Your Action Items
These tasks are listed in typical order of completion during a full audit, but some can be accomplished concurrently.
1. Keyword Topic/AI Prompt Audit
Although the introduction of AI in search has changed the keyword-first strategy, the natural place to start a local SEO audit is in organic and AI search results. Start with the topical keywords, phrases, and AI prompts you are hoping your business will be found for, in order to identify where you are positioned relative to your competitors and other websites/content.
This research can help you quickly identify where you have established some level of authority/momentum to build on, as well as topics upon which you should not waste your time and effort.
SEO is a long-term strategy, so no keyword or prompt should be summarily dismissed. Even so, it’s generally best to focus on keyword topics you realistically have a chance to gain visibility and drive traffic for. Pay close attention to the intent behind the keywords you choose and ideally focus on those with commercial or transactional intent, as informational content search results are largely being dominated by AI summaries.
You will also need to consider optimizing for conversational search queries or prompts and voice search, as AI Mode will increasingly rely on natural language processing.
Further, some younger users have developed different searching behaviors altogether and are using social media platforms like Instagram and TikTok for local searches. Search optimization for these platforms is a different conversation, but having an eye on how your business and its products/services are found when searching here can provide insight into how searches are conducted in more traditional and emerging AI formats.
Different people search in different ways, and it’s important not to limit your research to single keywords, but rather account for the various ways and phrases your audience may use to try to find you or your offerings; hence, taking a topical approach. This only becomes amplified in AI search, where every prompt is the beginning of a potentially long, drawn-out chat.
2. Website Audit
You can now conduct full content and technical website audits to ensure your site is optimized for maximum crawlability, indexability, and visibility by search engine and LLM crawlers. A typical audit is designed to analyze the underlying structure, content, and overall site experience.
Here again, there are many site auditing tools to crawl a website and then identify issues and prioritize actions to be taken based on SEO best practices.
A website audit and optimization can be broken down into a few buckets:
Page Optimization
Webpage optimization is all about ensuring pages are well structured, focused around targeted topical keywords, and provide a positive user experience.
As a search engine crawls a webpage, it looks for signals to determine what the page is about and what questions it can answer. These crawlers analyze the entire page to determine its focus, but weigh page titles and headings most heavily as primary descriptors. A well-structured page with a hierarchical heading structure is key to helping site visitors, search engines, and LLM bots easily scan and consume your content.
Ideally, each webpage is keyword topic-focused and unique. As such, keyword variations should be used consistently in titles, URLs, headings, and body content.
Another important potential issue raised in an audit, depending on the nature of your local business, is image optimization. As a best practice, all images should include relevant descriptive filenames and alt text, which may include pertinent keywords. This becomes particularly important when images (e.g., product or service photos) are central to your business, as image carousels can and will show up in web search results. In every case, attention should be paid to the images appearing on your primary ranking pages.
Lastly, an over-reliance on JavaScript can be particularly detrimental for LLM visibility, as some LLMs currently do not execute JavaScript. If your site is powered by JavaScript, you’ll want to address this with your developer to see how the most important content can be presented in raw HTML or via server-side scripting to enable crawling and indexing.
Internal Link Audit
A link audit will help you quickly identify any potential misdirected or broken links, which can create a less-than-optimal experience for your site visitors and may confuse search engine and LLM bots.
Links are likewise signals the search engines use to determine the structure of a website and its ability to direct searchers to appropriate, authoritative answers to their questions.
Part of this audit should include the identification of opportunities to crosslink prominent pages. If a page within your site has keywords (anchor text) referencing relevant content on another page, a link should be created, provided the link logically guides users to more relevant content or an appropriate conversion point.
External links should also be considered, especially when there is an opportunity to link to an authoritative source of information. From a local business perspective, this may include linking to relevant local organizations, partners, or events.
Schema Review
Schema or structured data can help search engines and LLMs better understand your business and its offerings and offer enhanced visibility. An effective local SEO audit should include the identification of content within a website to which schema can be applied.
Local businesses have an opportunity to have their content highlighted if they:

Publish highly authoritative and relevant content.
Use structured schema markup to tag content.

Relevant local business schema markup includes LocalBusiness, Product, Service, Review, and FAQPage, among others. All schema markup code should be validated via Google’s Rich Results Test and/or the Schema.org validator.
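As an illustration, a minimal `LocalBusiness` JSON-LD block might look like the following. All business details here are placeholders; replace them with your own and validate the result with the Rich Results Test or the Schema.org validator:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Plumbing Co.",
  "url": "https://www.example.com/",
  "telephone": "+1-555-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701",
    "addressCountry": "US"
  },
  "openingHours": "Mo-Fr 08:00-17:00"
}
</script>
```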
Mobile Audit
As most consumers search via their mobile devices – especially for local services – it’s essential for local businesses to provide a positive mobile web experience. Websites need to load quickly, be easily navigated, and enable seamless user interaction.
Google offers a range of free mobile testing and mobile-specific monitoring tools, such as Page Experience and Core Web Vitals, in Google Search Console.
More in-depth user experience and SEO analysis can be done via Google Lighthouse, though a local business owner will likely want to enlist the help of a web developer to action any of the recommendations this tool provides.
Duplicate Content
High-quality, authoritative content is, by definition, original content.
As such, it’s important to let Google know which version of a page is the original by adding a canonical tag to the HTML head of the page, particularly for any content/pages you did not create. Most pages, being unique unto themselves, will simply carry a self-referencing canonical.
Not doing so can have a detrimental effect on your authority and, by extension, your ability to rank. Most site auditing tools will flag pages with missing or malformed canonical tags.
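For reference, a self-referencing canonical is a single `link` element in the page's `head`; the URL below is a placeholder:

```html
<head>
  <!-- Self-referencing canonical pointing at the page's preferred URL -->
  <link rel="canonical" href="https://www.example.com/services/drain-cleaning/" />
</head>
```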
3. Google Business Profile Audit
A Google Business Profile (GBP) effectively represents a “secondary” website and highly visible point of presence for most local businesses. Increasingly, this “secondary” website is becoming the consumers’ first point of contact.
An accurate, comprehensive GBP is critical to establishing visibility in organic and now AI search results.
A recent behavioral study of travel booking in AI Mode conducted by Propellic found GBP to be among the most highly displayed and engaged content for searchers booking local accommodations and experiences.
A Google Business Profile audit should focus on the accuracy and completeness of the various components within the profile, including:

Business information and location details.
Correct primary business category.
Hours of operation.
Correct pin location in Google Maps.
Proper categorization as a physical location or service area business.
Products.
Services.
Appointment link(s), if applicable.
Photos or Videos.
Social Profiles.
Offers.
Regular updates.
Events.
Informational content.

Screenshot from Google Business Profile, September 2025
The more complete the profile is, the more likely it will be viewed as a reliable local resource and be given appropriate billing in the search results.
Assuming you have claimed and are authorized to manage your GBP, you can access and edit your info directly within the search results.
4. Review Monitoring And Management
Another very important aspect of a GBP is reviews.
Local business customers have the opportunity to write reviews, which appear on the GBP for other customers to reference and play a significant role in determining visibility in the local map pack. They are almost certainly a determining factor in appearing in Google AI Overviews as well.
Google will notify business owners as soon as reviews are submitted, and they should be responded to as soon as possible. This goes for negative reviews just as much as positive ones. Include an analysis of your reviews to ensure none have fallen through the cracks. This will also help determine whether there are recurring customer service and satisfaction issues or themes to be addressed. A detailed analysis of reviews can be a great source of content ideas aimed at answering customers’ most pressing questions or concerns.
Of course, there are also several other places for consumers to submit reviews, including Facebook, local review sites like Yelp, and industry-specific sites such as TripAdvisor and Houzz. A full audit should take inventory of reviews left on any of these services, as they can show up in search results.
Pro tip: Request positive reviews from all customers and politely suggest they reference the product or service they are reviewing, as keywords contained in reviews can have a positive effect from a ranking perspective.
5. Local Business Listing/Citation Audit
Local business listings and citations provide search engines and LLM bots with a way of confirming a business is both local and reputable within a specific geographic region. Recent studies have revealed unlinked brand mentions and citations play a significant role in AI Visibility.
It is important to have a presence in reputable local directories, review sites, business directories (e.g., Chambers of Commerce), or local partner sites to prove your “localness.”
Depending on the size and scope of your local business, an audit of your listings and citations can be done in an automated or manual fashion.
Business listings and citation management tools can be used to find, monitor, and update all primary citations with your proper Name, Address, Phone Number (aka NAP), and other pertinent business details found in broader listings (e.g., website address, business description).
If you manage a limited number of locations and have the time, one quick method of identifying where your current listings can be found is to simply conduct a search on your business name. The first three to four pages of search results should reveal most of your existing listings.
It’s also important to find and resolve any duplicate listings to prevent confusing customers and search engines alike with outdated, inaccurate information.
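For smaller audits, even a simple script can flag listings whose NAP details have drifted from your reference data. The sketch below uses hypothetical function names and a loose normalization (US-style 10-digit phone numbers) that you would adapt to your own market:

```python
import re

def normalize_nap(name, address, phone):
    """Normalize a Name/Address/Phone triple for cross-listing comparison."""
    def squash(text):
        # Lowercase, drop punctuation, and collapse whitespace for loose matching.
        cleaned = re.sub(r"[^a-z0-9\s]", "", text.lower())
        return re.sub(r"\s+", " ", cleaned).strip()
    digits = re.sub(r"\D", "", phone)[-10:]  # keep last 10 digits (US-style)
    return (squash(name), squash(address), digits)

def nap_mismatches(reference, listings):
    """Return the listing sources whose NAP differs from the reference triple."""
    ref = normalize_nap(*reference)
    return [source for source, nap in listings.items() if normalize_nap(*nap) != ref]
```

This tolerates cosmetic differences (punctuation, phone formatting) while still catching substantive ones, such as an outdated business name or address.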
Local business owners and managers should also monitor Reddit for their brand and local product/service offerings to gauge activity and sentiment. Reddit is a unique platform where “karma” and trust are paramount, but there is an opportunity for brands and local businesses to engage with their customers if they do so in a transparent, authentic, and non-promotional way.
6. Backlink Audit
Backlinks or inbound links are similar to citations, but are effectively any links to your website pages from other third-party websites.
Links remain an important factor in determining the authority of a website, as they lend validity if they come from relevant, reputable sources.
As with other components of an audit, there are several good free and paid backlink tools available, including a link monitoring service in Google Search Console, which is a great place to start.
An effective backlink audit serves the dual purpose of identifying potentially valuable backlink sources and building links from them, which can positively affect your ranking and visibility.
For local businesses, reputable local sources of links are naturally beneficial in validating location, as noted with citations above.
Potential backlink sources can be researched in a variety of locations:

Free and paid backlink research tools, such as Ahrefs or Semrush, can identify domains where your primary competition has acquired backlinks but you have not.
Any non-competitive sites appearing in the organic search results for your primary keywords are, by definition, good potential backlink sources. Look for directories you can be listed in, blogs or articles you can comment on, or publications you can submit articles to.
Referral sources in Google Analytics may reveal relevant external websites where you already have links and may be able to acquire more.

7. Local Content Audit
People search differently and require different types of information depending on where they are in their buying journey. A well-structured local web presence will include content tailored and distributed for consumption during each stage of this journey, to bolster visibility and awareness.
You want to be found throughout your customer’s search experience. A content audit can be used to make sure you have helpful content for each of the journey buckets your audience members may find themselves in.
Informational content may be distributed via social or other external channels or published on your website to help educate your consumers on the products, services, and differentiators you offer at the beginning of their path to purchase.
As AI is consuming and repurposing much of this informational content, it’s important to ensure your informational content includes your unique perspective based on your experience and expertise. This content ideally answers your prospects’ why, how, and what types of questions.
Transactional content is designed to address those consumers who already know what they want, but are in the process of deciding where or who to purchase from. This type of content may include reviews, testimonials, or competitive comparisons.
Navigational content ensures when people click through from Google after having searched your brand name or a variation thereof, they land on a page or information validating your position as a leader in your space. This page should also include a clear call-to-action with the assumption they have arrived with a specific goal in mind.
Commercial content addresses those consumers who have signaled a strong intent to buy. Effective local business sites and social pages must include offers, coupons, discounts, and clear paths to purchase.
Optimizing Content For AI
From an AI search and visibility perspective, keep in mind the vast majority of AI results are responses to long-form questions/prompts from consumers. As such, it is crucial for some of your content to be in a direct question/answer format.
One quick and effective tactic is to create an FAQ section within product or service pages. However, avoid padding FAQs with generic questions and answers; FAQs should be specific to the pages they reside on.
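If you do add an FAQ section, it can also be marked up with `FAQPage` schema, mentioned earlier. A minimal, hypothetical example follows; the question and answer are placeholders and should mirror the visible content on the page:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "Do you offer emergency drain cleaning on weekends?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "Yes, our Springfield team is on call Saturdays and Sundays from 8 a.m. to 5 p.m."
    }
  }]
}
</script>
```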
We’ve previously touched upon the importance of structured content for improved crawling, scanning, and comprehension. When reviewing your content, look for opportunities to incorporate defined heading structures, tables of contents for long-form content, and ordered lists.
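As a simple illustration, a long-form service page might pair an explicit heading hierarchy with an ordered list; the headings and steps below are placeholders:

```html
<h1>Drain Cleaning Services in Springfield</h1>
<h2>What Our Service Includes</h2>
<ol>
  <li>Camera inspection of the affected line</li>
  <li>Hydro-jetting or mechanical snaking</li>
  <li>Post-service camera verification</li>
</ol>
<h2>Frequently Asked Questions</h2>
```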
Content Variety And Distribution
Quality content is content your audience wants to consume, like, and share. For many businesses, this means considering and experimenting with content beyond simple text and images.
Video content shared via platforms like YouTube, Instagram, Facebook, TikTok, and others is easier to consume and generally more engaging.
8. Google Search Console Review
Google Search Console is an invaluable free resource for data related to keyword and content performance, indexing, schema/rich results validation, mobile/desktop experience monitoring, and security/manual actions.
A complete local SEO audit must include a review and analysis of this data to identify and react to strengths, weaknesses, opportunities, and threats outlined in each section.
Google Search Console screenshot, September 2025
Website owners and managers will want to pay particular attention to any issues related to pages not being crawled/indexed or manual actions having been taken based on questionable practices, as both can have a detrimental effect on search engine visibility.
Google Search Console does send notifications for these types of issues as well as regular performance updates, but an audit will ensure nothing has been overlooked.
9. Analytics Review
Whether you are using Google Analytics or another site/visitor tracking solution, the data available here is useful during an audit to validate top and lesser-performing content, traffic sources, audience profiles, and paths to purchase.
Findings in analytics will be key to your content audit.
As you review your site analytics, you may ask the following questions:

Are my top-visited pages also my top-ranking pages in search engines?
Which are my top entry pages from organic and AI search?
Which LLMs are sending traffic to my site?
Which pages/content are not receiving the level of traffic or engagement desired?
What is the typical path to purchase on my site, and can it be condensed or otherwise optimized?
Which domains are my top referrers, and are there opportunities to further leverage these sites for backlinks? (see Backlink Audit above).

Use Google Analytics (or another tool of your choice) to find the answers to these questions, so you can focus and prioritize your content and keyword optimization efforts.
10. Competitor Analysis
A comprehensive local SEO audit should identify and review the strengths and weaknesses of your competition.
You may already have a good sense of who your competition is, but to begin, it’s always a good idea to confirm who specifically shows up in the organic search and AI results when you enter your target keywords. You may find different competitors in these two formats, which represent both a threat and an opportunity.
These businesses/domains are your true online competitors and the sites you can learn the most from. If any of your online competitors’ sites and/or pages are ranking ahead of yours, you’ll want to review what they may be doing to gain this advantage.
You can follow the same checklist of steps you would conduct for your own audit to identify how they may be optimizing their keywords, content, Google Business Profile, reviews, local business listings, or backlinks.
In general, the best way to outperform your competition is to provide a better overall experience online and off, which includes generating more relevant, unique, high-quality content to more fully address the questions your mutual customers have.
11. AI Search For Local Businesses
AI Overviews and AI Mode are increasingly superseding traditional organic search results in Google, as the search engine aims to provide the answers to questions directly within its SERPs. Further, Google has signaled its commitment to AI Mode by recently integrating it into the Chrome address bar.
While AI search optimization has some new considerations, a strong foundation in traditional SEO will go a long way to building visibility in AI search results; chief among these at a local level is a fully optimized Google Business Profile, which appears prominently for local searches with commercial intent as outlined above.
Screenshot of Google AI Mode displaying Google Business Profile Cards, September 2025
Your AI Mode strategy checklist should consider the following:

Enhanced GBP Features: Stay updated on new features within Google Business Profile, allowing for direct interactions or transactions, as these will be favored by AI Mode.
Focus on User Intent: Understand the transactional and informational intent behind local searches. AI Mode aims to provide immediate solutions, so businesses facilitating this will gain an advantage.
Voice Search Optimization: As AI Mode becomes more conversational, optimizing for natural language queries and voice search will be crucial. Ensure your content answers questions directly and uses conversational language.
Direct Action Integrations: This may still be some way off, but review and explore opportunities to integrate with Google’s booking or reservation features, if applicable to your business. This could become a direct pathway to conversions within AI Mode.

Prioritizing Your Action Items
A complete local SEO audit is going to produce a fairly significant list of action items.
Many of the keyword, site, content, and backlink auditing tools do a good job of prioritizing tasks; however, the list can still be daunting.
One of the best places to start with an audit action plan is around the keywords, AI prompts, and content you have already established some, but not enough, authority for.
Determine how to best address deficiencies or opportunities to optimize this content first before moving on to more competitive keywords or those you have less or no visibility for. Establishing authority and trust is a long-term game.
These audit items should be reviewed every six to 12 months, depending on the size and scale of your web presence, to enable the best chance of being found by your local target audience.
More Resources:

Featured Image: BestForBest/Shutterstock

Read More »
digital marketing

How to Turn Every Campaign Into Lasting SEO Authority [Webinar] via @sejournal, @hethr_campbell

Capture Links, Mentions, and Citations That Make a Difference
Backlinks alone no longer move the authority needle. Brand mentions are just as critical for visibility, recognition, and long-term SEO success. Are your campaigns capturing both?
Join Michael Johnson, CEO of Resolve, for a webinar where he shares a replicable campaign framework that aligns media outreach, SEO impact, and brand visibility, helping your campaigns become long-term assets.
What You’ll Learn

The Resolve Campaign Framework: Step-by-step approach to ideating, creating, and pitching SEO-focused digital PR campaigns.
The Dual Outcome Strategy: How to design campaigns that earn both high-quality backlinks and brand mentions from top-tier media.
Real Campaign Case Studies: Examples of campaigns that created a compounding effect of links, mentions, and brand recognition.
Techniques for Measuring Success: How to evaluate the SEO and branding impact of your campaigns.

Why You Can’t Miss This Webinar
Successful SEO campaigns today capture authority on multiple fronts. This session provides actionable strategies for engineering campaigns that work hand in hand with SEO, GEO, and AEO to grow your brand.
📌 Register now to learn how to design campaigns that earn visibility, links, and citations.
🛑 Can’t attend live? Register anyway, and we’ll send you the recording so you don’t miss out.

Read More »
App

An AI adoption riddle

A few weeks ago, I set out on what I thought would be a straightforward reporting journey.  After years of momentum for AI—even if you

Read More »