The 4 Principles Of Effective Retail Marketing via @sejournal, @jasonhennessey

From window displays and newspaper ads, to sidewalk sandwich boards and pop-up events, there are many ways to market a retail store.

Whether your goal is to draw in casual passersby or increase online sales, having a well-planned (and well-executed) marketing strategy is key to wooing more customers.

But before you get fancy with flashy ads or influencer partnerships, it’s best to start with the fundamentals. That’s what makes this guide essential reading for any savvy retail business owner.

Master the four pillars of retail marketing – often referred to as “the 4 Ps”  – and you’re well on your way to having an iron-clad marketing plan.

What Is Retail Marketing?

Retail marketing refers to the various activities, whether in-store, locally, or online, that are used to attract customers to a retail business.

While the exact tactics may vary, retail marketing at its core is about establishing a brand identity, promoting your products, and engaging with potential customers (often across multiple channels).

Marketing as a whole has changed over the years, evolving from traditional media (print ads, flyers, in-person networking, etc.) to more technologically advanced methods (social media, online ads, email marketing, etc.), but the fundamentals have remained mostly the same.

That said, it’s important to know that retail marketing differs from other types of marketing.

How Retail Marketing Differs From Other Types Of Marketing

Like all types of marketing, retail marketing is all about connecting the product or service with the consumer. But retail marketing is different from other types of marketing – like Business to Business (B2B) marketing or service marketing – in a few distinct ways:

  • Customer Needs: Retail marketing focuses on individual consumers (B2C), whereas B2B marketing targets other businesses. Retail customers are typically driven by personal needs, while B2B decisions are often based on business requirements, return on investment (ROI), and long-term objectives.
  • Sales Cycle: Retail marketing usually involves a shorter sales cycle, with consumers making relatively faster purchasing decisions than B2B buyers.
  • Tangible Products: Retail marketing primarily deals with tangible products that consumers can see and touch, whereas other types of marketing (B2B or Service) often deal with intangible offerings like consulting or software.
  • Physical Presence: Retail marketing often (but not always) involves a physical presence, usually via a brick-and-mortar store. Digital marketing, while it can support retail efforts, primarily operates online using tools like social media and email to reach customers.

Retail marketing is different from other types of marketing in its focus on the close interaction between the business and the consumer at the point of sale.

Many retail business owners understand that the success of their marketing efforts often comes down to face-to-face interactions and personalized experiences.

What Are The 4 Principles Of Retail Marketing?

When it comes to something as broad as “marketing,” simplicity is key. The essential elements of retail marketing revolve around four primary pillars:

  • Product.
  • Price.
  • Place.
  • Promotion.

We’ll refer to these as “the 4 Ps” throughout this article. Some models extend them to include “Presentation” and “Personnel,” but for our purposes, we’ll stick to the primary four.

1. Product: What You Sell

The first pillar, product, pertains to the actual item or service you offer customers. This might involve a single category of products (e.g., novelty candles) or, most often, a variety of products (e.g., candles, home decor, furniture, etc.) offered by your brand.

Before you market your product(s), you need to understand it. This means not only its physical attributes and design but also the value it provides to customers. This also includes its material quality, branding, and even post-sale support resources.

Your product (again, it can pertain to a category of products) should speak to the needs, challenges, or interests of your prospective customers. You must fundamentally understand what it is that you sell and how that provides a benefit to customers.

For example:

  • If you sell office chairs, your product could address the challenge of reducing back pain or increasing comfort for people who spend long hours at a desk.
  • If you sell natural skincare products, your product could appeal to customers interested in natural ingredients and being environmentally conscious.
  • If you sell durable running shoes, your product could cater to athletes looking for footwear that lasts long, provides support, and prevents injuries.
  • If you sell gourmet coffee, your product might connect with coffee enthusiasts looking for unique flavors, high-quality beans, and a connection to Fair Trade growers.

The key is to gain a deeper understanding of your product’s connection to your customers. Ask yourself: What do they need? What are their challenges? How does your product address a need or a problem?

Try This To Better Understand Your Product

Every retail business owner can benefit from some practice in examining their products and how they might appeal to the needs of their customers.

If you aren’t crystal clear on the “why” behind your product(s), start with this activity:

  1. Workshop: Gather your team (sales, marketing, and service) to identify the key features of your most important products. Off the cuff, what are the primary features that stand out?
  2. Map: Then, outline the customer journey, from the time someone first discovers your product to the after-sale experience. Discuss what points of interaction a customer is likely to have during this process (e.g., entering your store, being welcomed by a sales rep, trying on clothes, weighing pricing options, etc.).
  3. Empathize: At each touchpoint, put yourself in the customer’s shoes. How might the customer feel? What else might they need?
  4. Apply: Based on your customer journey map, consider any improvements to be made to your product or process. Could merchandise be laid out differently? How might you enhance the customer experience? Could post-sale support be improved?

Refining your product is a continuous process, influenced often by customer feedback and actual sales numbers.

Train your team on how they should communicate about your product, associate products with related offerings (cross-selling), and answer customers’ questions to direct them to the most appropriate product (read: solution).

2. Price: What People Pay For The Product

The second pillar, price, refers to the amount of money customers are willing to pay for your product.

This is more than just the number you put on the price tag. It is a representation of your product’s perceived value and the benefit it provides to your customers.

Some things to consider are your own brand’s positioning in your market, your competitors’ pricing, and the quality of materials used to create the product.

For example, if your product is of superior quality, has unique features, and conveys a sense of luxury, premium pricing may be the way to go.

On the other hand, if you’re in a saturated market and can’t outshine your competitors based on quality, you could undercut them on price.

The objective is to find that sweet spot – where your pricing generates a profit but also feels appropriate based on your customer’s perception of the product’s value.

Developing Your Pricing Strategy

Not sure how to price your products? Pricing is both an art and a science.

Here are some steps to follow to develop a profitable yet appropriate pricing strategy:

  1. Research the Competition: Scope out what your competitors are charging for similar products. Consider the materials used to create your product relative to your competitors. Determine where your product stands in terms of quality, features, convenience, and brand positioning.
  2. Consider Your Audience: As stated, pricing isn’t just about quality and materials, but also customer perception. Think about who your target customer is, what they need, and what they’re willing to spend. Consider their income level, spending habits, location, and desire/necessity for the product.
  3. Count the Costs: Figure out how much it costs for you to acquire, market, and sell the product. How many products do you need to sell to turn a profit? Make sure all the associated costs are covered by the price, plus a healthy margin.
  4. Edit and Adjust: Over time, you might need to test different pricing models to determine what resonates with your customers and still turns a profit. When you apply discounts or bundled pricing, observe how these changes impact sales. Monitor your sales data and customer behavior to adjust your pricing strategy accordingly.

Simple Retail Pricing Formula

Here’s a simple retail pricing formula to help you:

Retail Price = Cost of Goods Sold (COGS) / (1 − Desired Profit Margin)

Where:

  • Cost of Goods Sold (COGS): The total cost of producing or purchasing the product, including materials, labor, shipping, marketing, etc.
  • Desired Profit Margin: The percentage of profit you want to make on the product, expressed as a decimal.
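If you want to sanity-check the math, here is a minimal Python sketch of the formula above (the sample numbers are illustrative):

```python
def retail_price(cogs: float, desired_margin: float) -> float:
    """Margin-based pricing: Retail Price = COGS / (1 - Desired Profit Margin)."""
    if not 0 <= desired_margin < 1:
        raise ValueError("Desired margin must be a decimal in [0, 1).")
    return cogs / (1 - desired_margin)

# Example: a candle that costs $12.50 to produce, priced for a 40% margin.
print(round(retail_price(12.50, 0.40), 2))  # 20.83
```

Note that the margin is a share of the selling price, which is why the formula divides by (1 − margin) rather than simply adding a markup on top of cost.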

Your prices aren’t set in stone. Prices often fluctuate due to market conditions, operational costs, customer behavior, and many other factors.

The key is to effectively communicate the value behind your pricing – and train your team to understand your product’s offerings – so your customers feel confident that the product is worth the price.

3. Place: Where You Sell The Product

The third pillar of retail marketing, place, refers to the channels through which you advertise and sell your product. This might include your physical storefront, but also includes online marketplaces, an ecommerce website, digital marketing channels, pop-up events, partnerships, and more.

When considering a place, think about where prospective customers are most likely to look for products like yours. Are they scrolling social media? Window shopping while on vacation? Searching blogs for product reviews? Put yourself in their shoes when it comes to searching for products.

For example:

  • If you sell luxury handbags, your place might be a high-end boutique located in the prime shopping district.
  • If you sell fresh produce, your place could be a local farmers’ market on the weekends.
  • If you sell handmade gifts, your place could be a mix of local craft fairs, pop-up shops, and online marketplaces like Etsy.

The Place(s) To Sell For Retail

When it comes to place, the key is to ensure that your products are available where your customers are looking for them. In fact, this might include several different channels.

While you don’t need to (and probably shouldn’t) attempt to sell in all of these places, here are the most common sales channels for retail:

  • Brick-and-mortar stores.
  • Ecommerce website.
  • Online marketplaces (like Amazon, eBay, Etsy, or Faire).
  • Social media (Instagram, Facebook, Pinterest, etc.).
  • Pop-up shops.
  • Mobile apps (e.g., Shopify App, Etsy App, InstaCart, etc.).
  • Wholesale (selling products in bulk to other retailers).
  • Direct sales (via parties, door-to-door sales, etc.).

It’s best to focus on one to three channels where your target customers will most likely spend their time. This helps ensure that your marketing budget is allocated to those channels most likely to yield the best return.

4. Promotion: How You Advertise The Product

The fourth pillar, promotion, is all about connecting with your target customers and making them more aware of your brand and products.

Making sales isn’t just about being visible, but also about using marketing strategically to draw customers in and convince them to buy from you!

Rarely do people stumble upon a retail store online and immediately make a purchase. They might require multiple touchpoints to discover, research, compare, and finally purchase your product.

The length of this sales cycle can look different for different types of retail businesses, but the idea is the same: Make sure customers have the experience and information they need to make their purchase decision.

For example:

  • If you run a clothing store, a customer might first discover your brand through a social media ad, and then visit your website to browse your products. They might sign up for your newsletter to receive a discount code, check out reviews on your blog, and finally make a purchase.
  • If you sell electronics, your customers may initially see your new gadget on YouTube, visit your online store to compare specs, read customer reviews, and then make a purchase.
  • If you sell home decor, your potential buyers might find your post on Pinterest, visit your website and add a product to their cart, consult a friend, and finally decide to buy a product to complete their home aesthetic.

Obviously, there are many different channels and means of promoting your products. The channels and approach you use will vary depending on what you sell, who your customers are, and your budget.

Increase The Visibility Of Your Retail Business

Once you’ve determined where (place) you want to sell your products, it’s time to use those channels for promotion.

Using the examples listed in the previous section, here are a few ways to promote your retail business:

  • Brick-and-mortar store: Use eye-catching window displays and signage to draw in passersby. Host in-store events like product launches or workshops, and offer in-store discounts to incentivize customers.
  • Ecommerce website: Optimize your website for search engines to drive organic website visitors from Google. Use email marketing to keep customers engaged, send personalized offers, and offer product recommendations.
  • Online marketplace: Showcase your products on websites like Amazon, Faire, or Etsy. Optimize your product listings with high-quality images, detailed descriptions, features, and customer reviews. Consider running sponsored ads on the marketplace to increase product visibility.
  • Social media: Stay active online with engaging posts, videos, reels, and stories. Reply to customer comments and re-share happy customer reviews. Consider running social media ads to reach your target audience based on shopping behavior, demographics, location, etc.
  • Pop-up shop: Partner with other local businesses to attract more customers and foot traffic. Promote your pop-up or event on social media, via email, and through local community channels.
  • Mobile apps: Consider connecting your store with a third-party app like Shopify, Uber, or InstaCart. Entice customers to subscribe for access to special offers and discounts. Add delivery options to make shopping more convenient for your customers.
  • Wholesale: Partner with wholesalers or distributors to close more deals in bulk. Attend trade shows or industry events to showcase your products to potential retail partners.
  • Direct sales: Host product demonstrations or home parties to create a personalized shopping experience. Incentivize happy customers or other brands to become referral partners.
  • Paid ads: Use Google Ads, Meta Ads, LinkedIn Ads, etc. to reach target customers online. Consider implementing retargeting ads to re-engage visitors who have joined your email list but haven’t made a purchase.

Develop Your Retail Marketing Strategy

Your retail business is unique in the experience and products that it offers. But how do you make your store the obvious choice for potential customers?

With an effective retail marketing strategy, you’ll have everything you need to Price, Place, and Promote your Product, attracting more customers to you!

By focusing on the key pillars of product, price, place, and promotion, you’ll create an environment that resonates with your ideal customers.

You can use a variety of channels – from in-store sales to ecommerce to social media – to promote your business and keep your sales strong.

Ultimately, the success of your retail business depends on your ability to connect with customers and communicate the value your brand has to offer.

Ready to master the 4 Ps? You got this!

Featured Image: PeopleImages.com – Yuri A/Shutterstock

SEO Strategy Guide: 14 Must-Do Things to Prepare for 2025 via @sejournal, @Brian_W_Gareth

1. Find The Best Keywords For Your Site

Keywords are the foundation of SEO. Although content is king, keywords come first: they decide what sorts of users will find you in search. And since you want to be found by the right users, you’d better choose your keywords wisely.

What kind of keywords are good for your site?

  • They have a high search volume.

In non-SEO terms, it means lots of people type those keywords into search bars. A few hundred searches per month is a good baseline, and higher is better.

  • They accurately capture search intent.

The relationship between a site owner and the users works like any business transaction: if you don’t offer them what they want, they won’t take it.

It’s like buying new shoes. If you are an adult with a size 7.5, you are not going to buy children’s shoes (not for yourself, anyway). And if you shop for generic shoes with nothing specific in mind, it will take you forever to find what you really need.

Keywords are much the same. If you have an online store where you sell shoes, then a product page optimized for the keyword “shoes for women size 7.5” will do a much better job than one saying “shoes for women” or even just “shoes.” Bottom line: use keywords which describe precisely what your target audience wants to find.

  • They aren’t too competitive.

High competition for a keyword means many other sites are already ranking for it – and beating them all won’t be easy. But pretty much every keyword has a less competitive version. You just need to find and use it.

How do you find keywords which match all these criteria?

For search intent, you must know your target audience and their needs really well, and then use your best judgment. Other factors can be represented in numbers, and that’s where SEO tools come in, such as WebCEO’s Keyword Suggestions tool.

Screenshot from WebCEO, September 2024

Do you have any keyword ideas of your own? Enter them in the field and press Search. The tool will generate a table of related keywords, and then you just pick the most promising ones.
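If your keyword tool lets you export ideas to a CSV, you can also shortlist candidates programmatically. Here is a minimal Python sketch; the file name, column names, and thresholds are illustrative assumptions, not a specific tool's export format:

```python
import pandas as pd

# Hypothetical export: one row per keyword idea.
ideas = pd.read_csv("keyword_ideas.csv")  # columns: keyword, volume, difficulty

# Keep keywords with decent volume (a few hundred searches per month)
# and low competition, then surface the easiest wins first.
shortlist = ideas[(ideas["volume"] >= 200) & (ideas["difficulty"] <= 40)]
shortlist = shortlist.sort_values(["difficulty", "volume"], ascending=[True, False])
print(shortlist.head(20))
```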

2. Optimize Your Pages With Keywords

Got your keywords? Great. Now, you need to make sure you are using them well.

For maximum effectiveness, your site pages must have keywords in these places:

  • Page URL
  • Page title
  • Meta description
  • H1-H4 headings (even better if you have a table of contents)
  • Throughout the text itself
  • Image filenames and ALT texts (for Google Image Search)
  • Video transcripts (if you have videos)

Scan your site pages with WebCEO’s Landing Page SEO tool to check the state of your keyword placement.

Screenshot from WebCEO, September 2024

If the tool finds any spaces that could be filled with keywords, fill them and run another scan afterward. Instant improvement before your eyes!

One more thing: while having keywords is a must, avoid going overboard with them. One set of related keywords per page, or even one keyword per page is usually enough. Then weave the keywords into your text in a natural-sounding way. The gold standard for content is normally written text with helpful information.
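For a quick programmatic spot-check of these placements between scans, here is a minimal sketch using requests and BeautifulSoup (the URL and keyword are illustrative):

```python
import requests
from bs4 import BeautifulSoup

def keyword_placement(url: str, keyword: str) -> dict:
    """Report where a keyword appears among the key on-page locations."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    kw = keyword.lower()
    return {
        "url": kw in url.lower(),
        "title": bool(soup.title) and kw in (soup.title.string or "").lower(),
        "meta_description": any(
            kw in (m.get("content") or "").lower()
            for m in soup.find_all("meta", attrs={"name": "description"})
        ),
        "headings": any(
            kw in h.get_text().lower() for h in soup.find_all(["h1", "h2", "h3", "h4"])
        ),
        "image_alts": any(
            kw in (img.get("alt") or "").lower() for img in soup.find_all("img")
        ),
    }

print(keyword_placement("https://example.com/shoes-women", "shoes for women"))
```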

3. Optimize Your Site Structure

It’s easy to turn your website into a poorly interlinked mess if you don’t know what you are doing.

When you do know what you are doing, you can help your most important pages receive a significant ranking boost – just by placing links correctly.

Your users will appreciate it, too. Who doesn’t like having all the content they need at their fingertips?

So here’s the recipe for an optimal site structure:

  • Page hierarchy. Picture a tree: the home page as the root and the destination pages (i.e. landing pages, product pages, blog articles) at the ends of the branches.
Screenshot from IncreMentors.com, January 2024
  • Topic clusters. It’s good practice to interlink pages that are dedicated to related topics.
  • Navigation bar. A bar at the top (or, less commonly, on the left side) of the screen, containing links to the most important site pages (e.g. home page, About Us, Contact Us).
  • Footer bar. Another bar at the bottom of a page, containing the same links from the navigation plus some others, at your discretion. Often, the footer bar contains social media links.
  • Breadcrumbs. Have you ever seen a bunch of links in a row, something like Home » Category » Subcategory » Page? They are called breadcrumbs and they help users keep track of where exactly they are on a website.
  • Three-click rule. An unspoken rule says: users should be able to get from any page A to any other page B in three clicks or fewer.

But to use links on your site like a pro, you want to know exactly how much authority your web pages have. And you can find out with the right SEO tools.

Scan your site with WebCEO’s Internal Links tool to get this information.

Screenshot from WebCEO, September 2024

This tool will reveal the pages with the highest amount of link juice. Proceed to share it with your most valuable pages just by linking to them from those high-authority pages.

This practice is at its most effective when the interlinked pages are related to each other through their topics – in other words, when they form a topic cluster. For example, a page about the best toothbrushes and another about the best toothpastes. It’s natural to link the two together, so both are likely to get a slight ranking boost.
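If you have a crawl export of internal links, you can also check the three-click rule and spot orphan pages yourself. Here is a minimal sketch with networkx; the edge list is illustrative and would normally come from your crawler:

```python
import networkx as nx

# Directed internal-link graph: (source_page, target_page) pairs from a crawl.
edges = [("/", "/category"), ("/category", "/product-a"), ("/", "/about")]
G = nx.DiGraph(edges)

# Click depth = shortest path length from the home page.
depth = nx.shortest_path_length(G, source="/")
too_deep = [page for page, d in depth.items() if d > 3]

# Pages missing from `depth` are unreachable from home, i.e., orphans.
orphans = set(G.nodes) - set(depth)
print(depth, too_deep, orphans)
```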

4. Max Out Your Loading Speed

How long is too long? Five seconds may not seem like much, but if that’s how long it takes your page to load, most users will have already left.

People hate slow-loading pages. People hate waiting in general. Whatever the place or the website, everybody wants to be served without delay.

And Google concurs. That’s why site loading speed is a major ranking factor, one you absolutely must not neglect.

And it’s one of the easiest ones to improve, too!

First, scan your site with WebCEO’s Speed Optimization tool.

Screenshot from WebCEO, September 2024

Not only does it measure your pages’ loading speed and Core Web Vitals, it also offers constructive criticism by detecting what’s slowing your website down. Just follow the tips from the report and watch your website soar.

And remember to be on constant alert for any slow loading site pages. Set the Speed Optimization tool to send you regular reports, and if you find a page that’s dragging its feet, help it take off.

Screenshot from WebCEO, September 2024
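Alongside scheduled reports, you can also poll Google's PageSpeed Insights API directly. Here is a minimal sketch; an API key is only needed for frequent use, and the URL is illustrative:

```python
import requests

API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
params = {"url": "https://www.example.com/", "strategy": "mobile"}  # add "key" for regular use
data = requests.get(API, params=params).json()

# Lighthouse performance score (0-1) plus two lab Core Web Vitals audits.
print("Performance:", data["lighthouseResult"]["categories"]["performance"]["score"])
for audit in ("largest-contentful-paint", "cumulative-layout-shift"):
    print(audit, data["lighthouseResult"]["audits"][audit]["displayValue"])
```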

5. Audit Your Site For Errors — And Fix Them

Nothing is perfect, not even the best website in the world. Things break, errors appear. But no self-respecting site owner will let things stay broken – that’s a recipe for losing your customers!

You are better than that, too. Scan your site for errors now with WebCEO’s Technical Audit tool.

Screenshot from WebCEO, September 2024

This tool detects all kinds of hiccups, from broken links to more serious issues like server errors. Look upon your report and do not despair. It’s merely a list of fixable things.

You can solve those problems yourself or send the report to your site admin and let them handle it. After the job is done, rescan your site and generate another report showing the drop in errors. Your client will love it.

And yes, the Technical Audit tool can also send automated scheduled reports.

Screenshot from WebCEO, September 2024

6. Check The Quality Of Your Backlinks

What do you think is the number one ranking factor? Which one can give you the biggest ranking boost?

The hint is right there in the heading above. That’s right: backlinks.

Links from other sites pointing to yours. If your site isn’t on Google’s #1 page, then lack of good backlinks is most likely why (assuming everything else is okay).

To see if you have a backlink problem, you need to check the current state of your link profile.

How can you do that? Scan your site with WebCEO’s Backlink Checker.

Screenshot from WebCEO, September 2024

What should you be looking for there?

  • Total backlinks and linking domains. The ratio between them can give you a rough idea of how many links each domain gives you on average. If that ratio is too high (e.g. 1000 backlinks per domain), then most of those backlinks are probably of poor quality (see the sketch after this list).
  • Loss of backlinks. Sometimes sites stop linking to you. Maybe they found someone with better content than yours, maybe they took down the page with the backlink, or maybe their site died altogether. Whatever the reason, it can affect your rankings negatively.
  • Backlink texts. A good anchor text tells users what they are going to find on the other side of the link. If it fails to do that, fewer people will click on the link. Look for non-descriptive anchor texts (such as “click here”) that are keeping your rankings down – changing those texts can be just what you need.
  • Harmful backlinks. Spammy links from low-quality sites will do you no favors. If you have too many toxic backlinks, you will need to have them removed or disavow them.
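If your backlink tool can export the raw links to a CSV, the first and third checks are easy to script. Here is a minimal sketch; the file and column names are illustrative assumptions:

```python
import pandas as pd

links = pd.read_csv("backlinks.csv")  # columns: source_url, source_domain, anchor_text

# Backlinks-to-domains ratio: a very high average suggests low-quality links.
per_domain = links.groupby("source_domain").size().sort_values(ascending=False)
print("Average backlinks per domain:", round(len(links) / per_domain.size, 1))
print(per_domain[per_domain > 100])  # domains linking suspiciously often

# Non-descriptive anchor texts worth improving.
generic = links["anchor_text"].str.lower().isin(["click here", "here", "read more"])
print(links[generic])
```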

Knowing the state of your link profile opens two different paths to improving it: link building and link detoxification. Let’s start with the former.

7. Revise And Expand Your Link Building

If you want to gain new backlinks and increase your site rankings, you’ll want to do some link building.

Which sites give the best backlinks?

  1. They are highly authoritative;
  2. They are topically related to your site.

And the closer they fit these criteria, the harder it will be to land your backlinks there. Those sites have high standards.

Link building is a whole challenge of its own – but there are plenty of good strategies for that.

To name but a few:

  • Have high-quality (and ideally unique) content on your site that others will want to link to;
  • Find broken links on other sites and suggest that those sites’ owners link to your content instead;
  • Find unlinked mentions of your site or brand and offer to add a backlink;

And we strongly encourage you to try out even more. You may find some of the link building strategies easier or more effective than others.

What About Steps 8-14?

You bet – these seven steps are just the beginning. Do you want even more SEO techniques to start preparing for 2025?

Good news: the full SEO guide is exclusively available to WebCEO users in PDF format, and it’s completely free. Download it now and get a head start on your competitors!

MarketMuse Acquired By Siteimprove via @sejournal, @martinibuster

Siteimprove announced the acquisition of MarketMuse, creating a comprehensive SaaS solution for content, accessibility, and SEO. This unifies vital marketing processes, benefiting customers of both organizations with a single, integrated platform.

MarketMuse

MarketMuse is a leading AI content planning software that helps users research, plan, and execute a scaled content strategy. It enables users to analyze their content to understand whether it adequately covers a topic, scale that analysis across an entire topic, and create content briefs that take the guesswork out of building a content calendar, enabling an organization to consistently publish high-quality, authoritative content.

Siteimprove

Siteimprove is a platform for analyzing content for SEO and accessibility as well as continuous site monitoring for issues.

MarketMuse’s Jeff Coyle wrote:

“I’m excited to announce that MarketMuse has entered a definitive agreement to be acquired by Siteimprove, one of the biggest players in martech!

Siteimprove’s known far and wide for assembling accessibility, digital governance, analytics, SEO, and cross-channel advertising into one platform.

The acquisition spells transformation: Marketers of all stripes will be relieved of attending to the ever-changing technical details that shroud their work. It means that you will be better able to focus on transformative strategy rather than minutiae — and build better digital experiences that are meaningful, credible, and deliver results.”

The announcement states that MarketMuse customers will have a more unified approach to SEO, Accessibility and Content Optimization from one SaaS platform.

Read more:

Breaking News: MarketMuse Enters a Definitive Agreement to be Acquired by Siteimprove!

Featured Image by Shutterstock/Cast Of Thousands

Pro-Tech SEO Checklist For Agencies via @sejournal, @JetOctopus

This post was sponsored by JetOctopus. The opinions expressed in this article are the sponsor’s own.

When you’re taking on large-scale projects or working with extensive websites with hundreds to thousands of pages, you must leverage advanced technical SEO techniques.

Large websites come with challenges such as vast site architectures, dynamic content, and higher-stakes competition in maintaining rankings.

Leveling up your team’s technical SEO chops can help you establish a stronger value proposition, ensuring your clients gain that extra initial edge and choose to continue growing with your agency.

With this in mind, here’s a concise checklist covering the most important nuances of advanced technical SEO that can lead your clients to breakthrough performance in the SERPs.

1. Advanced Indexing And Crawl Control

Optimizing search engine crawl and indexation is foundational for effective technical SEO. Managing your crawl budget effectively begins with log file analysis—a technique that offers direct insights into how search engines interact with your clients’ websites.

A log file analysis helps you:

  • Manage Crawl Budget: Essential for ensuring Googlebot crawls and indexes your most valuable pages. Log file analysis indicates how many pages are crawled daily and whether important sections are missed.
  • Identify Non-Crawled Pages: Reveals pages Googlebot misses due to issues like slow loading times, poor internal linking, or unappealing content, giving you clear insights into necessary improvements.
  • Understand Googlebot Behavior: Shows what Googlebot actually crawls on a daily basis. Spikes in crawl activity may signal technical issues on your website, like auto-generated thin, trashy pages.

For this, integrating your SEO log analyzer data with GSC crawl data provides a complete view of site functionality and search engine interactions, enhancing your ability to guide crawler behavior.

Next, structure robots.txt to exclude search engines from admin areas or low-value add-ons while ensuring they can access and index primary content. Or, use the x-robots-tag—an HTTP header—to control indexing at a more granular level than robots.txt. It is particularly useful for non-HTML files like images or PDFs, where robot meta tags can’t be used.
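Here is a minimal Python sketch for verifying both mechanisms behave as intended, using the standard library's robotparser plus requests (the URLs are illustrative):

```python
import requests
from urllib.robotparser import RobotFileParser

# Check robots.txt rules: admin areas blocked, primary content crawlable.
rp = RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()
print(rp.can_fetch("Googlebot", "https://www.example.com/admin/"))     # expect False
print(rp.can_fetch("Googlebot", "https://www.example.com/products/"))  # expect True

# Check the X-Robots-Tag header on a non-HTML asset such as a PDF.
resp = requests.head("https://www.example.com/files/brochure.pdf")
print(resp.headers.get("X-Robots-Tag"))  # e.g., "noindex" if it should stay out of the index
```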

For large websites, the approach to sitemaps is different from what you may have experienced. It makes little sense to put millions of URLs in your sitemaps and expect Googlebot to crawl them all. Instead, do this: generate sitemaps with new products, categories, and pages on a daily basis. This helps Googlebot find new content and makes your sitemaps more efficient. For instance, DOM.RIA, a Ukrainian real estate marketplace, implemented a strategy that included creating mini-sitemaps for each city directory to improve indexing. This approach significantly increased Googlebot visits (by over 200% for key pages), leading to enhanced content visibility and click-through rates from the SERPs.
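Here is a minimal sketch of that daily "fresh content" sitemap using only the standard library; in practice, the list of new URLs would come from your CMS or database:

```python
import xml.etree.ElementTree as ET
from datetime import date

def build_daily_sitemap(new_urls: list, path: str) -> None:
    """Write a small sitemap containing only today's new pages."""
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc in new_urls:
        url_el = ET.SubElement(urlset, "url")
        ET.SubElement(url_el, "loc").text = loc
        ET.SubElement(url_el, "lastmod").text = date.today().isoformat()
    ET.ElementTree(urlset).write(path, encoding="utf-8", xml_declaration=True)

# Illustrative: new product pages published today.
build_daily_sitemap(
    ["https://www.example.com/p/new-product-1", "https://www.example.com/p/new-product-2"],
    "sitemap-new.xml",
)
```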

2. Site Architecture And Navigation

An intuitive site structure aids both users and search engine crawlers in navigating the site efficiently, enhancing overall SEO performance.

Specifically, a flat site architecture minimizes the number of clicks required to reach any page on your site, making it easier for search engines to crawl and index your content. It enhances site crawling efficiency by reducing the depth of important content. This improves the visibility of more pages in search engine indexes.

So, organize (or restructure) content with a shallow hierarchy, as this facilitates quicker access and better link equity distribution across your site.

For enterprise eCommerce clients, in particular, ensure proper handling of dynamic parameters in URLs. Use the rel=”canonical” link element to direct search engines to the original page, avoiding parameters that can result in duplicates.

Similarly, product variations (such as color and size) can create multiple URLs with similar content. It depends on the particular case, but the general rule is to apply the canonical tag to the preferred version of a product page so that all variations point back to the primary URL for indexing. If there is a significant number of pages where Google ignores the canonical hint and indexes non-canonical variants anyway, consider reviewing the canonicalization approach on the website.
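A quick way to audit this is to fetch each variant URL and compare its declared canonical. Here is a minimal sketch with requests and BeautifulSoup; the variant URLs are illustrative:

```python
import requests
from bs4 import BeautifulSoup

def canonical_of(url):
    """Return the rel=canonical target of a page, if declared."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    link = soup.find("link", rel="canonical")
    return link.get("href") if link else None

variants = [
    "https://shop.example.com/dress?color=red",
    "https://shop.example.com/dress?color=blue",
]
for v in variants:
    print(v, "->", canonical_of(v))  # all variants should point to the same primary URL
```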

3. JavaScript SEO

As you know, JavaScript (JS) is crucial in modern web development, enhancing site interactivity and functionality but introducing unique SEO challenges. Even if you’re not directly involved in development, ensuring effective JavaScript SEO is important.

The foremost consideration in this regard is critical rendering path optimization — wait, what’s that?

The critical rendering path refers to the sequence of steps the browser must take to convert HTML, CSS, and JavaScript into a rendered web page. Optimizing this path is crucial for improving the speed at which a page becomes visible to users.

Here’s how to do it:

  • Reduce the number and size of the resources required to display initial content.
  • Minify JavaScript files to reduce their load time.
  • Prioritize loading of above-the-fold content to speed up page render times.

If you’re dealing with Single Page Applications (SPAs), which rely on JavaScript for dynamic content loading, then you might need to fix:

  • Indexing Issues: Since content is loaded dynamically, search engines might see a blank page. Implement Server-Side Rendering (SSR) to ensure content is visible to search engines upon page load.
  • Navigation Problems: Traditional link-based navigation is often absent in SPAs, affecting how search engines understand site structure. Use the HTML5 History API to maintain traditional navigation functionality and improve crawlability.

Dynamic rendering is another technique useful for JavaScript-heavy sites, serving static HTML versions to search engines while presenting interactive versions to users.

However, ensure the browser console shows no errors, confirming the page is fully rendered with all necessary content. Also, verify that pages load quickly, ideally in under a couple of seconds, to prevent user frustration (nobody likes a prolonged loading spinner) and reduce bounce rates.

Employ tools like GSC and Lighthouse to test and monitor your site’s rendering and web vitals performance. Regularly check that the rendered content matches what users see to ensure consistency in what search engines index.

4. Optimizing For Seasonal Trends

In the retail eCommerce space, seasonal trends influence consumer behavior and, consequently, search queries.

So, for these projects, you must routinely adapt your SEO strategies to stay on par with any product line updates.

Seasonal product variations—such as holiday-specific items or summer/winter editions—require special attention to ensure they are visible at the right times:

  • Timely Content Updates: Update product descriptions, meta tags, and content with seasonal keywords well before the season begins.
  • Seasonal Landing Pages: Create and optimize dedicated landing pages for seasonal products, ensuring they link appropriately to main product categories.
  • Ongoing Keyword Research: Continually perform keyword research to capture evolving consumer interests and optimize new product categories accordingly.
  • Technical SEO: Regularly check for crawl errors, ensure fast load times, and confirm that new pages are mobile-friendly and accessible.

On the flip side, managing discontinued products or outdated pages is just as crucial in maintaining site quality and retaining SEO value:

  • Evaluate Page Value: Conduct regular content audits to assess whether a page still holds value. If a page hasn’t received any traffic or a bot hit in the last half-year, it might not be worth keeping.
  • 301 Redirects: Use 301 redirects to transfer SEO value from outdated pages to relevant existing content.
  • Prune Content: Remove or consolidate underperforming content to focus authority on more impactful pages, enhancing site structure and UX.
  • Informative Out-of-Stock Pages: Keep pages for seasonally unavailable products informative, providing availability dates or links to related products.

Put simply, optimizing for seasonal trends means preparing for high-traffic periods and effectively managing the transition periods. This supports sustained SEO performance and a streamlined site experience for your clients.

5. Structured Data And Schema Implementation

Structured data via schema.org markup is a powerful tool to enhance a site’s SERP visibility and boost CTR through rich snippets.

Advanced schema markup goes beyond basic implementation, allowing you to present more detailed and specific information in SERPs. Consider these schema markups in your next client campaign:

  • Nested Schema: Utilize nested schema objects to provide more detailed information. For example, a Product schema can include nested Offer and Review schemas to display prices and reviews in search results (see the sketch after this list).
  • Event Schema: For clients promoting events, implementing an Event schema with nested attributes like startDate, endDate, location, and offers can help in displaying rich snippets that show event details directly in SERPs.
  • FAQ and How-To Pages: Implement FAQPage and HowTo schemas on relevant pages to provide direct answers in search results.
  • Ratings, Reviews, and Prices: Implement the AggregateRating and Review schema on product pages to display star ratings and reviews. Use the Offer schema to specify pricing information, making the listings more attractive to potential buyers.
  • Availability Status: Use the ItemAvailability schema to display stock status, which can increase the urgency and likelihood of a purchase from SERPs.
  • Blog Enhancements: For content-heavy sites, use Article schema with properties like headline, author, and datePublished to enhance the display of blog articles.
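Here is a minimal sketch of a Product snippet with nested Offer and AggregateRating objects, rendered from Python (all values are illustrative):

```python
import json

product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Trail Running Shoe",
    "aggregateRating": {"@type": "AggregateRating", "ratingValue": "4.6", "reviewCount": "212"},
    "offers": {
        "@type": "Offer",
        "price": "89.99",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock",  # an ItemAvailability value
    },
}
print(f'<script type="application/ld+json">\n{json.dumps(product, indent=2)}\n</script>')
```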

Use Google’s Structured Data Testing Tool to test your pages’ structured data and identify any errors or warnings in your schema implementation. Also, use Google’s Rich Results Test to get feedback on how your page may appear in SERPs with the implemented structured data.

Conclusion

Considering their long SEO history and legacy, enterprise-level websites require more profound analysis from different perspectives.

We hope this mini checklist serves as a starting point for your team to take a fresh look at your new and existing clients and help deliver great SEO results.


Image Credits

Featured Image: Image by JetOctopus. Used with permission.

In-Post Images: Image by JetOctopus. Used with permission.

SEO Reinvented: Responding To Algorithm Shifts via @sejournal, @pageonepower

A lot has been said about the remarkable opportunities of Generative AI (GenAI), and some of us have also been extremely vocal about the risks associated with using this transformative technology.

The rise of GenAI presents significant challenges to the quality of information, public discourse, and the general open web. GenAI’s power to predict and personalize content can be easily misused to manipulate what we see and engage with.

Generative AI search engines are contributing to the overall noise, and rather than helping people find the truth and forge unbiased opinions, they tend (at least in their present implementation) to promote efficiency over accuracy, as highlighted by a recent study by Jigsaw, a unit inside Google.

Despite the hype surrounding SEO alligator parties and content goblins, our generation of marketers and SEO professionals has spent years working towards a more positive web environment.

We’ve shifted the marketing focus from manipulating audiences to empowering them with knowledge, ultimately aiding stakeholders in making informed decisions.

Creating an ontology for SEO is a community-led effort that aligns perfectly with our ongoing mission to shape, improve, and provide directions that truly advance human-GenAI interaction while preserving content creators and the Web as a shared resource for knowledge and prosperity.

Traditional SEO practices in the early 2010s focused heavily on keyword optimization. This included tactics like keyword stuffing, link schemes, and creating low-quality content primarily intended for search engines.

Since then, SEO has shifted towards a more user-centric approach. The Hummingbird update (2013) marked Google’s transition towards semantic search, which aims to understand the context and intent behind search queries rather than just the keywords.

This evolution has led SEO pros to focus more on topic clusters and entities than individual keywords, improving content’s ability to answer multiple user queries.

Entities are distinct items like people, places, or things that search engines recognize and understand as individual concepts.

By building content that clearly defines and relates to these entities, organizations can enhance their visibility across various platforms, not just traditional web searches.

This approach ties into the broader concept of entity-based SEO, which ensures that the entity associated with a business is well-defined across the web.

Fast-forward to today, static content that aims to rank well in search engines is constantly transformed and enriched by semantic data.

This involves structuring information so that it is understandable not only by humans but also by machines.

This transition is crucial for powering Knowledge Graphs and AI-generated responses like those offered by Google’s AIO or Bing Copilot, which provide users with direct answers and links to relevant websites.

As we move forward, the importance of aligning content with semantic search and entity understanding is growing.

Businesses are encouraged to structure their content in ways that are easily understood and indexed by search engines, thus improving visibility across multiple digital surfaces, such as voice and visual searches.

The use of AI and automation in these processes is increasing, enabling more dynamic interactions with content and personalized user experiences.

Whether we like it or not, AI will help us compare options faster, run deep searches effortlessly, and make transactions without passing through a website.

The future of SEO is promising. The SEO service market size is expected to grow from $75.13 billion in 2023 to $88.91 billion in 2024 – a staggering CAGR of 18.3% (according to The Business Research Company) – as it adapts to incorporate reliable AI and semantic technologies.

These innovations support the creation of more dynamic and responsive web environments that adeptly cater to user needs and behaviors.

However, the journey hasn’t been without challenges, especially in large enterprise settings. Implementing AI solutions that are both explainable and strategically aligned with organizational goals has been a complex task.

Building effective AI involves aggregating relevant data and transforming it into actionable knowledge.

This differentiates an organization from competitors using similar language models or development patterns, such as conversational agents or retrieval-augmented generation copilots, and enhances its unique value proposition.

Imagine an ontology as a giant instruction manual for describing specific concepts. In the world of SEO, we deal with a lot of jargon, right? Topicality, backlinks, E-E-A-T, structured data – it can get confusing!

An ontology for SEO is a giant agreement on what all those terms mean. It’s like a shared dictionary, but even better. This dictionary doesn’t just define each word. It also shows how they all connect and work together. So, “queries” might be linked to “search intent” and “web pages,” explaining how they all play a role in a successful SEO strategy.

Imagine it as untangling a big knot of SEO practices and terms and turning them into a clear, organized map – that’s the power of ontology!

While Schema.org is a fantastic example of a linked vocabulary, it focuses on defining specific attributes of a web page, like content type or author. It excels at helping search engines understand our content. But what about how we craft links between web pages?

What about the query a web page is most often searched for? These are crucial elements in our day-to-day work, and an ontology can be a shared framework for them as well. Think of it as a playground where everyone is welcome to contribute on GitHub, similar to how the Schema.org vocabulary evolves.

The idea of an ontology for SEO is to augment Schema.org with an extension similar to what GS1 did by creating its vocabulary. So, is it a database? A collaboration framework or what? It is all of these things together. SEO ontology operates like a collaborative knowledge base.

It acts as a central hub where everyone can contribute their expertise to define key SEO concepts and how they interrelate. By establishing a shared understanding of these concepts, the SEO community plays a crucial role in shaping the future of human-centered AI experiences.

SEOntology – a snapshot. Screenshot from WebVowl, August 2024.

The Data Interoperability Challenge In The SEO Industry

Let’s start small and review the benefits of a shared ontology with a practical example (here is a slide taken from Emilija Gjorgjevska’s presentation at this year’s ZagrebSEOSummit).

Data Interoperability Challenge: image from Emilija Gjorgjevska, ZagrebSEOSummit, August 2024

Imagine your colleague Valentina uses a Chrome extension to export data from Google Search Console (GSC) into Google Sheets. The data includes columns like “ID,” “Query,” and “Impressions” (as shown on the left). But Valentina collaborates with Jan, who’s building a business layer using the same GSC data. Here’s the problem: Jan uses a different naming convention (“UID,” “Name,” “Impressionen,” and “Klicks”).

Now, scale this scenario up. Imagine working with n different data partners, tools, and team members, all using various languages. The effort to constantly translate and reconcile these different naming conventions becomes a major obstacle to effective data collaboration.

Significant value gets lost in just trying to make everything work together. This is where an SEO ontology comes in. It is a common language, providing a shared name for the same concept across different tools, partners, and languages.

By eliminating the need for constant translation and reconciliation, an SEO ontology streamlines data collaboration and unlocks the true value of your data.
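In code, the fix can be as small as one shared mapping that every tool and teammate agrees on. Here is a minimal pandas sketch; the canonical names and file names are illustrative stand-ins for the ontology's actual terms:

```python
import pandas as pd

# One agreed-upon vocabulary, mapped from each partner's export headers.
COLUMN_MAP = {
    "ID": "query_id", "UID": "query_id",
    "Query": "query", "Name": "query",
    "Impressions": "impressions", "Impressionen": "impressions",
    "Clicks": "clicks", "Klicks": "clicks",
}

valentina = pd.read_csv("gsc_export_en.csv").rename(columns=COLUMN_MAP)
jan = pd.read_csv("gsc_export_de.csv").rename(columns=COLUMN_MAP)

# With shared names, the two datasets concatenate with no manual reconciliation.
combined = pd.concat([valentina, jan], ignore_index=True)
```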

The Genesis Of SEOntology

In the last year, we have witnessed the proliferation of AI Agents and the wide adoption of Retrieval Augmented Generation (RAG) in all its different forms (Modular, Graph RAG, and so on).

RAG represents an important leap forward in AI technology, addressing a key limitation of traditional large language models (LLMs) by letting them access external knowledge.

Traditionally, LLMs are like libraries with one book – limited by their training data. RAG unlocks a vast network of resources, allowing LLMs to provide more comprehensive and accurate responses.

RAG improves factual accuracy and context understanding, potentially reducing bias. While promising, it faces challenges in data security, accuracy, scalability, and integration, especially in the enterprise sector.

For successful implementation, RAG requires high-quality, structured data that can be easily accessed and scaled.

We’ve been among the first to experiment with AI Agents and RAG powered by the Knowledge Graph in the context of content creation and SEO automation.

Screenshot from Agent WordLift, August 2023

Knowledge Graphs (KGs) Are Indeed Gaining Momentum In RAG Development

Microsoft’s GraphRAG and solutions like LlamaIndex demonstrate this. Baseline RAG struggles to connect information across disparate sources, hindering tasks requiring a holistic understanding of large datasets.

KG-powered RAG approaches like the one offered by LlamaIndex in conjunction with WordLift address this by creating a knowledge graph from website data and using it alongside the LLM to improve response accuracy, particularly for complex questions.

LlamaIndex in conjunction with WordLift: image from author, August 2024

We have tested workflows with clients in different verticals for over a year.

From keyword research for large editorial teams to the generation of question and answers for ecommerce websites, from content bucketing to drafting the outline of a newsletter or revamping existing articles, we’ve been testing different strategies and learned a few things along the way:

1. RAG Is Overhyped

It is simply one of many development patterns that achieve a goal of higher complexity. A RAG (or Graph RAG) is meant to help you save time finding an answer. It’s brilliant but doesn’t solve any marketing tasks a team must handle daily. You need to focus on the data and the data model.

While there are good RAGs and bad RAGs, the key differentiation is often represented by the “R” part of the equation: the Retrieval. Primarily, the retrieval differentiates a fancy demo from a real-world application, and behind a good RAG, there is always good data. Data, though, is not just any type of data (or graph data).

It is built around a coherent data model that makes sense for your use case. If you build a search engine for wines, you need to get the best dataset and model the data around the features a user will rely on when looking for information.

So, data is important, but the data model is even more important. If you are building an AI Agent that has to do things in your marketing ecosystem, you must model the data accordingly. You want to represent the essence of web pages and content assets.

Only some data vs. good data: image from author, August 2024

2. Not Everyone Is Great At Prompting

Expressing a task in written form is hard. Prompt engineering is going at full speed towards automation (here is my article on going from prompting to prompt programming for SEO) as only a few experts can write the prompt that brings us to the expected outcome.

This poses several challenges for the design of the user experience of autonomous agents. Jakob Nielsen has been very vocal about the negative impact of prompting on the usability of AI applications:

“One major usability downside is that users must be highly articulate to write the required prose text for the prompts.”

Even in rich Western countries, statistics provided by Nielsen tell us that only 10% of the population can fully utilize AI! 

Simple Prompt Using Chain-of-Thought (CoT):

“Explain step-by-step how to calculate the area of a circle with a radius of 5 units.”

More Sophisticated Prompt Combining Graph-of-Thought (GoT) and Chain-of-Knowledge (CoK):

“Using the Graph-of-Thought (GoT) and Chain-of-Knowledge (CoK) techniques, provide a comprehensive explanation of how to calculate the area of a circle with a radius of 5 units. Your response should:

  1. Start with a GoT diagram that visually represents the key concepts and their relationships, including: circle, radius, area, pi (π), and the formula for circle area.
  2. Follow the GoT diagram with a CoK breakdown that: a) defines each concept in the diagram; b) explains the relationships between these concepts; c) provides the historical context for the development of the circle area formula.
  3. Present a step-by-step calculation process, including: a) stating the formula for the area of a circle; b) explaining the role of each component in the formula; c) showing the substitution of values; d) performing the calculation; e) rounding the result to an appropriate number of decimal places.
  4. Conclude with practical applications of this calculation in real-world scenarios.

Throughout your explanation, ensure that each step logically follows the previous one, creating a clear chain of reasoning from basic concepts to the final result.”

This improved prompt incorporates GoT by requesting a visual representation of the concepts and their relationships. It also employs CoK by asking for definitions, historical context, and connections between ideas. The step-by-step breakdown and real-world applications further enhance the depth and practicality of the explanation.

3. You Shall Build Workflows To Guide The User

The lesson learned is that we must build detailed standard operating procedures (SOPs) and written protocols that outline the steps and processes to ensure consistency, quality, and efficiency in executing particular optimization tasks.

We can see empirical evidence of the rise of prompt libraries like the one offered to users of Anthropic models or the incredible success of projects like AIPRM.

In reality, we learned that what creates business value is a series of concise steps that help the user translate the context they are navigating into a consistent task definition.

We can start to envision marketing tasks like conducting keyword research as a Standard Operating Procedure that can guide the user across multiple steps (here is how we structure the SOP for keyword discovery using Agent WordLift).

4. The Great Shift To Just-in-Time UX 

In traditional UX design, information is pre-determined and can be organized in hierarchies, taxonomies, and pre-defined UI patterns. As AI becomes the interface to the complex world of information, we’re witnessing a paradigm shift.

UI topologies tend to disappear, and the interaction between humans and AI remains predominantly dialogic. Just-in-time assisted workflows can help the user contextualize and improve a workflow.

  • You need to think in terms of business value creation, focus on the user’s interactive journey, and facilitate the interaction by creating a UX on the fly. Taxonomies remain a strategic asset, but they operate behind the scenes as the user is teleported from one task to another, as recently brilliantly described by Yannis Paniaras from Microsoft.
Image from “The Shift to Just-In-Time UX: How AI is Reshaping User Experiences” by Yannis Paniaras, August 2024

5. From Agents To RAG (And GraphRAG) To Reporting

Because the user needs a business impact and RAG is only part of the solution, the focus quickly shifts from generic question-and-answer patterns to advanced multi-step workflows.

The biggest issue, though, is what outcome the user needs. If we increase the complexity to capture the highest business goals, it is not enough to, let’s say, “query your data” or “chat with your website.”

A client wants a report, for example, of the thematic consistency of content across the entire website (a concept we recently discovered as siteRadius in Google’s massive data leak), an overview of the seasonal trends across hundreds of paid campaigns, or the ultimate review of optimization opportunities related to the Google Merchant feed.

You must understand how the business operates and what deliverables the client will pay for. What concrete actions could boost the business? What questions need to be answered?

This is the start of creating a tremendous AI-assisted reporting tool.

How Can A Knowledge Graph (KG) Be Coupled With An Ontology For AI Alignment, Long-term Memory, And Content Validation?

The three guiding principles behind SEOntology:

  • Making SEO data interoperable to facilitate the creation of knowledge graphs while reducing unneeded crawls and vendor lock-in.
  • Infusing SEO know-how into AI agents using a domain-specific language.
  • Collaboratively sharing knowledge and tactics to improve findability and prevent misuse of Generative AI.

When you deal with at least two data sources in your SEO automation task, you will already see the advantage of using SEOntology.

SEOntology As “The USB-C Of SEO/Crawling Data”

Standardizing data about content assets, products, user search behavior, and SEO insights is strategic. The goal is to have a “shared representation” of the Web as a communication channel.

Let’s take a step back. How does a search engine represent a web page? That is our starting point. Can we standardize how a crawler represents data extracted from a website? And what are the advantages of adopting standards?

Practical Use Cases

Integration With Botify And Dynamic Internal Linking

Over the past few months, we’ve been working closely with the Botify team to create something exciting: a Knowledge Graph powered by Botify’s crawl data and enhanced by SEOntology. This collaboration is opening up new possibilities for SEO automation and optimization.

Leveraging Existing Data With SEOntology

Here’s the cool part: If you’re already using Botify, we can tap into that goldmine of data you’ve collected. No need for additional crawls or extra work on your part. We use the Botify Query Language (BQL) to extract and transform the needed data using SEOntology.
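As a rough illustration (not Botify’s or WordLift’s actual integration code), the extract-and-transform step could look something like the sketch below. The endpoint path, BQL payload fields, and seovoc property names are assumptions to be checked against the Botify API documentation.

```python
# A hedged sketch of pulling crawl data via BQL and reshaping it into
# SEOntology-style statements. The endpoint path, payload fields, and
# seovoc property names are illustrative; check Botify's API docs for
# the exact contract.
import requests

API_TOKEN = "YOUR_BOTIFY_TOKEN"  # placeholder
QUERY_URL = "https://api.botify.com/v1/projects/acme/acme-site/query"  # hypothetical project

bql_payload = {
    "collections": ["crawl.20240801"],  # hypothetical crawl snapshot
    "query": {
        "dimensions": ["url", "metadata.title.content"],
        "metrics": [],
    },
}

resp = requests.post(
    QUERY_URL,
    json=bql_payload,
    headers={"Authorization": f"Token {API_TOKEN}"},
    timeout=30,
)
resp.raise_for_status()

# Map each result row to a (subject, property, value) triple using seovoc terms.
SEOVOC = "https://w3id.org/seovoc/"  # namespace assumed for illustration
triples = [
    (row["dimensions"][0], SEOVOC + "title", row["dimensions"][1])
    for row in resp.json().get("results", [])
]
print(triples[:5])
```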

Think of SEOntology as a universal translator for SEO data. It takes the complex information from Botify and turns it into a format that’s not just machine-readable but machine-understandable. This allows us to create a rich, interconnected Knowledge Graph filled with valuable SEO insights.

What This Means For You

Once we have this Knowledge Graph, we can do some pretty amazing things:

  • Automated Structured Data: We can automatically generate structured data markup for your product listing pages (PLPs). This helps search engines better understand your content, potentially improving your visibility in search results.
  • Dynamic Internal Linking: This is where things get really interesting. We use the data in the Knowledge Graph to create smart, dynamic internal links across your site. Let me break down how this works and why it’s so powerful.

In the diagram below, we can also see how data from Botify can be blended with data from Google Search Console.

In most implementations, Botify already imports this data into its crawl projects; when that is not the case, we can trigger a new API request and import clicks, impressions, and positions from GSC into the graph.
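Where that fallback is needed, the request could be made with the official google-api-python-client, whose Search Console service exposes a searchanalytics query method. The sketch below assumes OAuth credentials with the Search Console scope already exist; the date range and row limit are placeholders.

```python
# A sketch of importing clicks, impressions, and positions from the
# Google Search Console API when Botify hasn't already joined them in.
from googleapiclient.discovery import build

def fetch_gsc_rows(credentials, site_url, start, end):
    service = build("searchconsole", "v1", credentials=credentials)
    body = {
        "startDate": start,    # e.g. "2024-07-01"
        "endDate": end,        # e.g. "2024-07-31"
        "dimensions": ["page"],
        "rowLimit": 1000,
    }
    response = service.searchanalytics().query(
        siteUrl=site_url, body=body
    ).execute()
    # Each row carries the page URL plus the three metrics we want in the graph.
    return [
        {
            "page": row["keys"][0],
            "clicks": row["clicks"],
            "impressions": row["impressions"],
            "position": row["position"],
        }
        for row in response.get("rows", [])
    ]
```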

Collaboration With Advertools For Data Interoperability

Similarly, we collaborated with the brilliant Elias Dabbas, creator of Advertools, a favorite Python library among marketers, to automate a wide range of marketing tasks.

Our joint efforts aim to enhance data interoperability, allowing for seamless integration and data exchange across different platforms and tools.

In the first Notebook, available in the SEOntology GitHub repository, Elias showcases how we can effortlessly construct attributes for the WebPage class, including title, meta description, images, and links. This foundation enables us to easily model complex elements, such as internal linking strategies (a modeling sketch follows the list). Here is the structure:

  • Internal_Links
    • anchorTextContent
    • NoFollow
    • Link

We can also add a flag if the page is already using schema markup:

  • usesSchema
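To ground this, here is a minimal rdflib sketch of the structure above: a WebPage with one internal link and the usesSchema flag. The seovoc namespace URI and exact property names are assumptions based on the list above, not the repository’s authoritative terms.

```python
# A minimal rdflib sketch of a WebPage node carrying an internal link
# and a usesSchema flag. Namespace URI and property names are assumed
# for illustration.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF, XSD

SEOVOC = Namespace("https://w3id.org/seovoc/")
g = Graph()
g.bind("seovoc", SEOVOC)

page = URIRef("https://example.com/blue-widgets")
link = URIRef("https://example.com/widget-guide")

g.add((page, RDF.type, SEOVOC.WebPage))
g.add((page, SEOVOC.title, Literal("Blue Widgets")))
g.add((page, SEOVOC.usesSchema, Literal(True, datatype=XSD.boolean)))

# Model the internal link with its anchor text and nofollow status.
g.add((page, SEOVOC.internalLink, link))
g.add((link, SEOVOC.anchorTextContent, Literal("widget buying guide")))
g.add((link, SEOVOC.noFollow, Literal(False, datatype=XSD.boolean)))

print(g.serialize(format="turtle"))
```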

Formalizing What We Learned From The Analysis Of The Leaked Google Search Documents

We want to be extremely cautious about deriving tactics or quick schemes from Google’s massive leak, and we are well aware that Google will act swiftly to prevent any potential misuse of such information.

Despite these constraints, the leak offers valuable insights into improving web content representation and marketing data organization. To democratize access to these insights, I’ve developed a Google Leak Reporting tool designed to make this information readily available to SEO pros and digital marketers.

For instance, understanding Google’s classification system and its segmentation of websites into various taxonomies has been particularly enlightening. These taxonomies – such as ‘verticals4’, ‘geo’, and ‘products_services’ – play a crucial role in search ranking and relevance, each with unique attributes that influence how websites and content are perceived and ranked in search results.

By leveraging SEOntology, we can adopt some of these attributes to enhance website representation.

Pause for a second and imagine transforming the complex SEO data you manage daily through tools like Moz, Ahrefs, Screaming Frog, Semrush, and many others into an interactive graph. Now, envision an autonomous AI agent, such as Agent WordLift, at your side.

This agent employs neuro-symbolic AI, a cutting-edge approach that combines neural learning capabilities with symbolic reasoning, to automate SEO tasks like creating and updating internal links. This streamlines your workflow and introduces a level of precision and efficiency previously unattainable.

SEOntology serves as the backbone for this vision, providing a structured framework that enables the seamless exchange and reuse of SEO data across different platforms and tools. By standardizing how SEO data is represented and interconnected, SEOntology ensures that valuable insights derived from one tool can be easily applied and leveraged by others. For instance, data on keyword performance from SEMrush could inform content optimization strategies in WordLift, all within a unified, interoperable environment. This not only maximizes the utility of existing data but also accelerates the automation and optimization processes that are crucial for effective marketing.
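As an illustration of the kind of automation this enables, here is a hedged sketch of querying such a graph for internal-link opportunities with rdflib. The seovoc property names (mentionsEntity, internalLink) are hypothetical stand-ins for whatever the ontology actually defines, and the file name is a placeholder.

```python
# A sketch of surfacing internal-link opportunities from a knowledge
# graph: find page pairs that mention the same entity but don't yet
# link to each other. Property names are illustrative assumptions.
from rdflib import Graph

g = Graph()
g.parse("seo-knowledge-graph.ttl", format="turtle")  # hypothetical export

query = """
PREFIX seovoc: <https://w3id.org/seovoc/>
SELECT ?source ?target WHERE {
    ?source seovoc:mentionsEntity ?entity .
    ?target seovoc:mentionsEntity ?entity .
    FILTER (?source != ?target)
    FILTER NOT EXISTS { ?source seovoc:internalLink ?target }
}
"""

for row in g.query(query):
    print(f"Suggest linking {row.source} -> {row.target}")
```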

Infusing SEO Know-How Into AI Agents

As we develop a new agentic approach to SEO and digital marketing, SEOntology serves as our domain-specific language (DSL) for encoding SEO skills into AI agents. Let’s look at a practical example of how this works.

GraphQL query generator and validator. Screenshot from WordLift, August 2024

We’ve developed a system that makes AI agents aware of a website’s organic search performance, enabling a new kind of interaction between SEO professionals and AI. Here’s how the prototype works:

System Components

  • Knowledge Graph: Stores Google Search Console (GSC) data, encoded with SEOntology.
  • LLM: Translates natural language queries into GraphQL and analyzes data.
  • AI Agent: Provides insights based on the analyzed data (see the sketch below).
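Below is a high-level sketch of how these three components could fit together. The `call_llm` and `run_graphql` callables stand in for your model and graph endpoint, and the schema fields shown are illustrative, not the prototype’s actual schema.

```python
# A high-level sketch of the flow: the LLM turns a natural-language
# question into GraphQL, the knowledge graph answers it, and the agent
# summarizes the result. `call_llm` and `run_graphql` are stand-ins.

SCHEMA_HINT = """
type Query { searchPerformance(url: String): [Metric] }
type Metric { query: String, clicks: Int, impressions: Int, position: Float }
"""

def answer_seo_question(question, call_llm, run_graphql):
    # 1. Translate the question into a GraphQL query against the KG schema.
    graphql = call_llm(
        f"Schema:\n{SCHEMA_HINT}\n"
        f"Write a GraphQL query answering: {question}\n"
        "Return only the query."
    )
    # 2. Execute it against the knowledge graph's GraphQL endpoint.
    data = run_graphql(graphql)
    # 3. Ask the LLM to turn raw rows into an actionable insight.
    return call_llm(
        f"Question: {question}\nData: {data}\n"
        "Summarize the key insight and one optimization suggestion."
    )
```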

Human-Agent Interaction

Human, LLM, knowledge graph, and AI agent interaction. Image from author, August 2024

The diagram illustrates the flow of a typical interaction. Here’s what makes this approach powerful:

  • Natural Language Interface: SEO professionals can ask questions in plain language without constructing complex queries.
  • Contextual Understanding: The LLM understands SEO concepts, allowing for more nuanced queries and responses.
  • Insightful Analysis: The AI agent doesn’t just retrieve data; it provides actionable insights, such as:
    • Identifying top-performing keywords.
    • Highlighting significant performance changes.
    • Suggesting optimization opportunities.
  • Interactive Exploration: Users can ask follow-up questions, enabling a dynamic exploration of SEO performance.

By encoding SEO knowledge through SEOntology and integrating performance data, we’re creating AI agents that can provide context-aware, nuanced assistance in SEO tasks. This approach bridges the gap between raw data and actionable insights, making advanced SEO analysis more accessible to professionals at all levels.

This example illustrates how an ontology like SEOntology can empower us to build agentic SEO tools that automate complex tasks while maintaining human oversight and ensuring quality outcomes. It’s a glimpse into the future of SEO, where AI augments human expertise rather than replacing it.

Human-In-The-Loop (HITL) And Collaborative Knowledge Sharing

Let’s be crystal clear: While AI is revolutionizing SEO and search, humans are the beating heart of our industry. As we dive deeper into the world of SEOntology and AI-assisted workflows, it’s crucial to understand that Human-in-the-Loop (HITL) isn’t just a fancy add-on; it’s the foundation of everything we’re building.

The essence of creating SEOntology is to transfer our collective SEO expertise to machines while ensuring we, as humans, remain firmly in the driver’s seat. It’s not about handing over the keys to AI; it’s about teaching it to be the ultimate co-pilot in our SEO journey.

Human-Led AI: The Irreplaceable Human Element

SEOntology is more than a technical framework – it’s a catalyst for collaborative knowledge sharing that emphasizes human potential in SEO. Our commitment extends beyond code and algorithms to nurturing skills and expanding the capabilities of new-gen marketers and SEO pros.

Why? Because AI’s true power in SEO is unlocked by human insight, diverse perspectives, and real-world experience. After years of working with AI workflows, I’ve realized that agentive SEO is fundamentally human-centric. We’re not replacing expertise; we’re amplifying it.

We deliver more efficient and trustworthy results by blending cutting-edge tech with human creativity, intuition, and ethical judgment. This approach builds trust with clients within our industry and across the web.

Here’s where humans remain irreplaceable:

  • Understanding Business Needs: AI can crunch numbers but can’t replace the nuanced understanding of business objectives that seasoned SEO professionals bring. We need experts who can translate client goals into actionable SEO strategies.
  • Identifying Client Constraints: Every business is unique, with its limitations and opportunities. It takes human insight to navigate these constraints and develop tailored SEO approaches that work within real-world parameters.
  • Developing Cutting-Edge Algorithms: The algorithms powering our AI tools don’t materialize out of thin air. We need brilliant minds to develop state-of-the-art algorithms, learn from human input, and continually improve.
  • Engineering Robust Systems: Behind every smooth-running AI tool is a team of software engineers who ensure our systems are fast, secure, and reliable. This human expertise keeps our AI assistants running like well-oiled machines.
  • Passion for a Better Web: At the heart of SEO is a commitment to making the web a better place. We need people who share Tim Berners-Lee’s vision – people who are passionate about developing the web of data and improving the digital ecosystem for everyone.
  • Community Alignment and Resilience: We need to unite to analyze the behavior of search giants and develop resilient strategies. It’s about solving our problems innovatively as individuals and as a collective force. This is what I always loved about the SEO industry!

Extending The Reach Of SEOntology

As we continue to develop SEOntology, we’re not operating in isolation. Instead, we’re building upon and extending existing standards, particularly Schema.org, and following the successful model of the GS1 Web Vocabulary.

SEOntology As An Extension Of Schema.org

Schema.org has become the de facto standard for structured data on the web, providing a shared vocabulary that webmasters can use to mark up their pages.

However, while Schema.org covers a broad range of concepts, it doesn’t delve deeply into SEO-specific elements. This is where SEOntology comes in.

An extension of Schema.org, like SEOntology, is essentially a complementary vocabulary that adds new types, properties, and relationships to the core Schema.org vocabulary.

This allows us to maintain compatibility with existing Schema.org implementations while introducing SEO-specific concepts not covered in the core vocabulary.

Learning From GS1 Web Vocabulary

The GS1 Web Vocabulary offers a great model for creating a successful extension that interacts seamlessly with Schema.org. GS1, a global organization that develops and maintains supply chain standards, created its Web Vocabulary to extend Schema.org for e-commerce and product information use cases.

The GS1 Web Vocabulary has recently demonstrated how industry-specific extensions can influence and interact with schema markup:

  • Real-world impact: The https://schema.org/Certification property, now officially embraced by Google, originated from GS1’s https://www.gs1.org/voc/CertificationDetails. This showcases how extensions can drive the evolution of Schema.org and search engine capabilities.

We want to follow a similar approach to extend Schema.org and become the standard vocabulary for SEO-related applications, potentially influencing future search engine capabilities, AI-driven workflows, and SEO practices.

Much like GS1 defined its namespace (gs1:) while referencing schema terms, we have defined our namespace (seovoc:) and are integrating its classes into the Schema.org hierarchy where possible.
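To make the extension pattern concrete, here is a minimal, hypothetical example of what such markup could look like, generated with Python. The seovoc context URL and property names are assumptions for illustration; check the SEOntology repository for the actual terms.

```python
# A sketch of extension-friendly JSON-LD: schema.org types for broad
# compatibility, seovoc terms for the SEO-specific layer. Context URL
# and property names are illustrative assumptions.
import json

doc = {
    "@context": {
        "@vocab": "https://schema.org/",
        "seovoc": "https://w3id.org/seovoc/",
    },
    "@type": "WebPage",
    "url": "https://example.com/blue-widgets",
    "name": "Blue Widgets",
    # SEO-specific facts live under the extension namespace.
    "seovoc:primaryQuery": "buy blue widgets",
    "seovoc:usesSchema": True,
}

print(json.dumps(doc, indent=2))
```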

The Future Of SEOntology

SEOntology is more than just a theoretical framework; it’s a practical tool designed to empower SEO professionals and tool makers in an increasingly AI-driven ecosystem.

Here’s how you can engage with and benefit from SEOntology.

If you’re developing SEO tools:

  • Data Interoperability: Implement SEOntology to export and import data in a standardized format. This ensures your tools can easily interact with other SEOntology-compliant systems.
  • AI-Ready Data: By structuring your data according to SEOntology, you’re making it more accessible for AI-driven automations and analyses.

If you’re an SEO professional:

  • Contribute to Development: Just like with Schema.org, you can contribute to SEOntology’s evolution. Visit its GitHub repository to:
    • Raise issues for new concepts or properties you think should be included.
    • Propose changes to existing definitions.
    • Participate in discussions about the future direction of SEOntology.
  • Implement in Your Work: Start using SEOntology concepts in your structured data.

In Open Source We Trust

SEOntology is an open-source effort, following in the footsteps of successful projects like Schema.org and other shared linked vocabularies.

All discussions and decisions will be public, ensuring the community has a say in SEOntology’s direction. As we gain traction, we’ll establish a committee to steer its development and share regular updates.

Conclusion And Future Work

The future of marketing is human-led, not AI-replaced. SEOntology isn’t just another buzzword – it’s a step towards this future. SEO is strategic for the development of agentive marketing practices.

SEO is no longer only about rankings; it’s about creating intelligent, adaptive content and fruitful dialogues with our stakeholders across various channels. Standardizing SEO data and practices is strategic for building a sustainable future and investing in responsible AI.

Are you ready to join this revolution?

There are three guiding principles behind the work of SEOntology that we need to make clear to the reader:

  • As AI needs semantic data, we need to make SEO data interoperable, facilitating the creation of knowledge graphs for everyone. SEOntology is the USB-C of SEO/crawling data: standardizing data about content assets, products, and how people find content, products, and information in general is the first objective. Here, we have two practical use cases: a connector for WordLift that gets crawl data from the Botify crawler and helps you jump-start a KG that uses SEOntology as a data model, and a collaboration with Advertools, an open-source crawler and SEO tool, to make its data interoperable with SEOntology.
  • As we progress with the development of a new agentic way of doing SEO and digital marketing, we want to infuse SEO know-how into agents using SEOntology, a domain-specific language that instills the SEO mindset in AI agents (or multi-agent systems like Agent WordLift). In this context, the skill required to create dynamic internal links is encoded as nodes in a knowledge graph, and opportunities become triggers that activate workflows.
  • We expect to keep humans in the loop (HITL), meaning that the ontology will become a way to collaboratively share knowledge and tactics that improve findability and prevent the misuse of the Generative AI that is polluting the web today.

Project Overview

This work on SEOntology is the product of collaboration. I extend my sincere thanks to the WordLift team, especially CTO David Riccitelli. I also appreciate our clients for their dedication to innovation in SEO through knowledge graphs. Special thanks to Milos Jovanovik and Emilia Gjorgjevska for their critical expertise. Lastly, I’m grateful to the SEO community and the SEJ editorial team for their support in sharing this work.


Google Rolls Out AI-Organized Search Results Pages via @sejournal, @MattGSouthern

Google is introducing AI-organized search results pages in the United States.

The new feature, set to launch this week, returns a full page of multi-format results personalized for the searcher.

Google’s announcement states:

“This week, we’re rolling out search results pages organized with AI in the U.S. — beginning with recipes and meal inspiration on mobile. You’ll now see a full-page experience, with relevant results organized just for you. You can easily explore content and perspectives from across the web including articles, videos, forums and more — all in one place.”

Key Features

The AI-organized pages will compile various content types, including articles, videos, and forum discussions.

Google claims this approach will provide users with a more diverse range of information sources and perspectives.

In its announcement, Google adds:

“… with AI-organized search results pages, we’re bringing people more diverse content formats and sites, creating even more opportunities for content to be discovered.”

Industry Implications

While Google touts the benefits of AI-organized search results pages, the update raises several questions:

  1. How will the AI-organized pages affect traffic to individual websites? Keeping users on Google’s results page might reduce clicks to source websites.
  2. With AI determining content organization, there are concerns about potential biases in how information is presented.
  3. The new format may require new strategies to ensure visibility within these AI-organized results.
  4. It’s unclear how this change will impact ad visibility.

This update could alter how we approach SEO. We may need to adapt strategies to ensure content is discoverable and presentable in this new format.

Microsoft’s Bing recently announced an expansion of its generative search capabilities, focusing on handling complex, informational queries. Google’s reorganization of entire results pages appears to be a unique offering by comparison.

The initial rollout focusing on mobile devices for recipe and meal-related queries aligns with Google’s mobile-first indexing approach.

It remains to be seen how this feature will translate to desktop searches.

Google’s Response to Industry Concerns

In light of the questions raised by this update, we contacted Google for clarification on several key points.

Impact on Search Console Tracking

Regarding how AI-organized search results will be tracked in Google Search Console, a Google spokesperson stated:

“We do not separate traffic by every feature in Search Console, but publishers will continue to see their traffic from Search reflected there. Check out the supported search appearances in our documentation.”

This suggests that while specific metrics will not be available for AI-organized pages, site owners will still be able to access overall traffic data.

Timeline for Expansion

When asked about the timeline for expanding this feature to other categories and regions, Google responded:

“When we previewed this feature, we mentioned expanding this to additional categories including dining, movies, music, books, hotels, and shopping. No further details to share at this time.”

While this confirms expansion plans, Google has not provided specific timelines for these rollouts.

Guidance for SEO Professionals and Content Creators

On whether new tools or guidance will be provided for optimizing content for AI-organized search results, Google emphasized that no changes are necessary:

“SEO professionals and creators don’t need to do anything differently. Search results pages organized with AI are rooted in our core Search ranking and quality systems, which we have been honing for decades to surface high quality information.”

This response suggests that existing SEO best practices should continue to be effective for visibility in these new result formats.

Looking Ahead

Google’s responses provide some clarity but also leave room for speculation.

The lack of specific tracking for AI-organized pages in Search Console may present challenges for SEO professionals in understanding the direct impact of this new feature on their traffic.

The confirmation of plans to expand to other categories like dining, movies, music, books, hotels, and shopping indicates that this update could have far-reaching effects across various industries.

Despite Google’s assurances, new best practices may emerge as the SEO community adapts to this significant change in search result presentation.

We here at SEJ will closely monitor the rollout and report on its effects and what it means for you in the coming months. Sign up for the SEJ newsletter to stay up to date.



An Introduction To SEO Strategy For A Digital Presence

This edited extract is from Digital and Social Media Marketing: A Results-Driven Approach, edited by Aleksej Heinze, Gordon Fletcher, Ana Cruz, and Alex Fenton ©2024, and is reproduced with permission from Routledge. The extract below was taken from the chapter “Using Search Engine Optimisation to Build Trust,” co-authored with Aleksej Heinze, Senior Professor at KEDGE Business School, France.

The key challenge for SEO is that good rankings in SERPs are almost entirely determined by each search engine’s private algorithm for identifying high-quality content and results, which makes SEO a long-term activity.

The initial formula of PageRank (Page et al. 1999) used by Google, which used links pointing to a page to rank its importance, has evolved significantly and is no longer publicly available.

All search engines regularly update their algorithms to identify high-quality content that is relevant to a particular search query. Google implements around 500 to 600 changes to its algorithm each year (Gillespie 2019).

These are product updates, similar to Windows updates. Most of these changes are minor, with little impact, but a few critical core updates each year require careful review of the majority of websites, since they can result in major SERP changes.

Search engines use artificial intelligence to improve their ability to identify high-quality, relevant content and are constantly testing new ways to present users with relevant results.

The arrival of ChatGPT by OpenAI in 2022 presented a rival type of offering that has shaken the foundations of the traditional search engine business model (Poola 2023).

In such a dynamic environment, it is important to keep up to date with algorithm changes.

This can be done by following the Google Search Status dashboard (Google) and SEO-related blog posts and monitoring resources, including the Moz algorithm change calendar (Moz).

How Search Engines Work 

In essence, a search engine’s crawler, spider, robot, or ‘bot’ discovers web page links and then internally determines whether there is value in analysing those links.

Then, the bot automatically retrieves the content behind each link (including more links). This process is called crawling.

Bots may then add the discovered pages to the search engine’s index, to be retrieved when a user searches for something.
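To make the crawl step concrete, here is a toy Python sketch of the discover-fetch-extract loop described above. It is an illustration only: production crawlers add politeness delays, robots.txt handling, deduplication, and far more.

```python
# A toy illustration of crawling: fetch a page, extract its links, and
# queue the ones not seen before. Not a production crawler.
from urllib.parse import urljoin
import requests
from bs4 import BeautifulSoup

def crawl(start_url, max_pages=10):
    seen, queue = set(), [start_url]
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)  # "indexing" here is just recording the page
        html = requests.get(url, timeout=10).text
        # Discover more links behind the retrieved content.
        for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
            queue.append(urljoin(url, a["href"]))
    return seen

print(crawl("https://example.com"))
```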

The ranking order in which the links appear in SERPs is calculated by the engine’s algorithm, which examines the relevance of the content to the query.

This relevance is determined by a combination of over 200 factors such as the visible text, keywords, the position and relationship of words, links, synonyms and semantic entities (Garg 2022).

When the user of a search engine types in a query, they are presented with a list of links to content that the engine calculates will satisfy the intent of the query – the list of results is the SERP.

Typically, the list of results that are shown in SERPs includes a mix of paid-for and organic results. Each link includes a short URL, title and description, as well as other options such as thumbnail images, videos and other related internal site links.

Search engines are constantly making changes to SERPs to improve the experience for those searching. For example, Bing includes Bing Chat, allowing responses to be offered by their AI bot.

Google introduced a knowledge graph, or summary answer box, found underneath the search box, to the right of the organic search results.

Bing Chat and the Google knowledge graph provide a direct, relevant summary response to a query without the need for a further click through to the source page (retaining the user at the search engine).

This offering leads to so-called zero-click searches, which cannot be tracked in the analytics of a digital presence and are only visible in data that relates content visibility to SERPs.

Some Google SERP snippets can also appear as a knowledge graph (Figure 12.8) or a search snippet (Figure 12.9).

Figure 12.8: Google SERP for “KEDGE Business School” including a knowledge graph on the right-hand side of the page (Google and the Google logo are trademarks of Google LLC).
Figure 12.9: Search snippet for Jean Reno (Google and the Google logo are trademarks of Google LLC).

The volatility of the SERPs can be evidenced by the varying results produced by the same search in different locations.

The listing for the US market (Figure 12.10) and carousel for the European market (Figure 12.11) for “best DJs” shows that geolocation increasingly comes into play in the page ranking of SERPs.

Personalisation is also relevant. For example, when a user is logged into a Google product, their browser history influences the organic SERPs. SERPs change depending on what terms are used.

This means a pluralised term produces different SERPs to searches that use the singular term.

Tools, such as those offered by Semrush, include functionality to quickly identify this form of volatility and understand sectors that are being affected by changes.

Figure 12.10: US results for “best DJs” (Google and the Google logo are trademarks of Google LLC).
Figure 12.11: European results for “best DJs” (Google and the Google logo are trademarks of Google LLC).

Recent innovations by Google include the Search Generative Experience (SGE), currently being tested in the US market. This is a different, more visual search experience that uses artificial intelligence.

The 2015 introduction of RankBrain and other algorithms means that Google now better understands human language and context.

Industry publications, including Search Engine Roundtable and Search Engine Land, keep pace with this dynamic landscape.

Implementing Search Engine Optimisation 

Identification of the most relevant search terms is the starting point for developing a website map and themes for content.

The search terms will also define the focus for individual pages and blog posts. This approach focuses on the technical/on-page, content, and off-page aspects of the website.

Any SEO activity begins with prior knowledge of the organisation, including its objectives and targets as well as the persona that has been defined.

The initial phase of optimising a website for Google search involves the following (a short automation sketch for steps 4 and 5 follows the list):

  1. A technical and content audit.
  2. Keyword identification and analysis.
  3. Implementing any changes in the content management system (CMS) and content.
  4. Using the secure HTTPS protocol for the website.
  5. Submitting the website to Google Search Console.
  6. Submitting the website to Bing Webmaster Tools.
  7. Submitting the website to other appropriate search engines.
  8. Adding website tracking code such as Google Analytics, Hotjar or others to the website.
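Some of these steps lend themselves to automation. Below is a hedged Python sketch covering two of them: confirming the site serves over HTTPS and submitting a sitemap via the Search Console API. It assumes OAuth credentials with the Search Console scope already exist, and the site and sitemap URLs are placeholders.

```python
# A sketch automating checklist steps 4 and 5: verify HTTPS and submit
# a sitemap to Google Search Console. URLs and credentials are placeholders.
import requests
from googleapiclient.discovery import build

SITE = "https://www.example.com/"

# Step 4: check that the site actually responds over the secure protocol.
assert requests.get(SITE, timeout=10).url.startswith("https://"), \
    "Site is not serving over HTTPS"

# Step 5: submit the sitemap to Google Search Console.
def submit_sitemap(credentials, site_url, sitemap_url):
    service = build("searchconsole", "v1", credentials=credentials)
    service.sitemaps().submit(siteUrl=site_url, feedpath=sitemap_url).execute()

# Usage (with real credentials):
# submit_sitemap(creds, SITE, SITE + "sitemap.xml")
```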

Summary

SEO plays a critical role in enhancing an organisation’s digital presence, and keeping pace with the dynamic nature of search engine algorithms provides a way to address the immediate pain points of a persona.

The focus throughout has been on the imperative for organisations to offer content that not only resonates with a persona’s needs but also aligns with the evolving criteria of search engines like Google, Baidu, or Bing.

This latter alignment is crucial given the stakeholder tendency to focus only on the first SERP. It is important to adhere to ethical SEO practices, employing ‘White Hat SEO’ tactics that comply with search engine guidelines, as opposed to more manipulative techniques.

There is a need for continuous monitoring and reviewing of any SEO activities.

Frequently changing search engine algorithms, which now heavily incorporate AI and machine learning, mean that a campaign’s parameters can change quickly. SEO is not a “set and forget” activity.

Staying informed and adapting to these changes is essential for maintaining and improving search engine rankings.

The environmental impact of digital activities should also be a consideration in SEO and wider marketing practices; optimising websites not only aligns with SEO best practices but also contributes to sustainability.

Search engines offer marketers one of the largest big data sets available to refine and target their content creation activities.

Historic search behaviours are good predictors of the future, and the use of these resources helps marketers to optimise and be better placed to offer value to their persona.


To read the book, SEJ readers have an exclusive 20% discount until the end of 2024 using the code DSMM24 at Routledge.

The book officially launches on October 7, 2024, and you can attend the launch event, with a chance to hear from some of the authors, by registering through this link.


Google Rolls Out CrUX Vis Core Web Vitals Tool via @sejournal, @martinibuster

Google rolled out a new Core Web Vitals tool called CrUX Vis that shows you hidden patterns in performance scores and offers guidance on what to improve. The data is sourced from the CrUX dataset which is based on actual user experiences on the URLs and websites that are analyzed and explored in the new tool.

CrUX

The new tool draws on the CrUX dataset, the same dataset that Core Web Vitals scores are based on.

Chrome’s documentation of CrUX explains:

“The Chrome User Experience Report (also known as the Chrome UX Report, or CrUX for short) is a dataset that reflects how real-world Chrome users experience popular destinations on the web.

CrUX is the official dataset of the Web Vitals program. All user-centric Core Web Vitals metrics are represented.

CrUX data is collected from real browsers around the world, based on certain browser options which determine user eligibility. A set of dimensions and metrics are collected which allow site owners to determine how users experience their sites.

The data collected by CrUX is available publicly through a number of Google tools and third-party tools and is used by Google Search to inform the page experience ranking factor.

Not all origins or pages are represented in the dataset. There are separate eligibility criteria for origins and pages, primarily that they must be publicly discoverable and there must be a large enough number of visitors in order to create a statistically significant dataset.”

Debugging Core Web Vitals

Improving website performance scores may not offer the direct ranking benefit that many SEOs and publishers hoped it would, but it is still as critical to get right as it has always been. High performance scores improve earnings, ad clicks, conversions, user experience, website popularity, and virtually every goal an SEO or publisher has for a site, including indirect benefits to rankings. A site can still limp along with poor performance scores, but it will not live up to its full earnings potential.

Tools based on Chrome’s Lighthouse offer performance snapshots and estimated scores, but they cannot show how a site has performed over time or break out the key metrics needed to gauge whether performance is trending up or down.

CrUX Vis

Chrome’s new tool, CrUX Vis, is a data visualization tool for Chrome User Experience (CrUX) data. It provides an entirely new way to understand website performance and gain a big-picture view of what’s going on at the URL and website (origin) level.

The variables for what is visualized can be changed in the Controls section at the top of the page, covering data, device, and period.

Screenshot Of CrUX Vis Controls

Segment Data By Multiple Variables

As seen in the screenshot above, the data can be segmented in three ways (a sketch of pulling the same data from the CrUX API follows the list):

  1. Data
    Performance scores can be viewed by origin (the entire site) or by URL.
  2. Device
    Data can be segmented and visualized by mobile, desktop, or a combined view.
  3. Period (Date Range)
    The tool currently allows data visualization across 25 overlapping time periods stretching back about six months, currently from 3/17/2024 through 09/28/2024.
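For those who prefer raw numbers, the same trend data the tool visualizes is available programmatically. Below is a hedged Python sketch against the public CrUX History API; the endpoint and field names follow Google’s published CrUX API documentation, while the origin and API key are placeholders.

```python
# A sketch of pulling the CrUX trend data the tool visualizes, using
# the public CrUX History API. Requires an API key from Google Cloud.
import requests

API_KEY = "YOUR_CRUX_API_KEY"  # placeholder
ENDPOINT = (
    "https://chromeuxreport.googleapis.com/v1/records:queryHistoryRecord"
    f"?key={API_KEY}"
)

payload = {
    "origin": "https://example.com",  # or "url": "..." for page-level data
    "formFactor": "PHONE",            # PHONE, DESKTOP, or omit for combined
    "metrics": [
        "largest_contentful_paint",
        "interaction_to_next_paint",
        "cumulative_layout_shift",
    ],
}

resp = requests.post(ENDPOINT, json=payload, timeout=30)
resp.raise_for_status()

# The record holds one time series per requested metric.
record = resp.json()["record"]
print(record["metrics"].keys())
```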

Five Views Of Metrics

There are five ways to analyze the data, covering Core Web Vitals, three categories of metrics, and all metrics combined. These views are accessible in the left-hand navigation panel of the desktop UI (user interface).

  1. Core Web Vitals
  2. Loading Performance
  3. Interactivity
  4. Visual Stability
  5. All Metrics Combined

Visualizing Data

The Core Web Vitals visualization shows a time-based trend graph colored green, yellow, and pink. Green is good; pink is not.

The three Core Web Vitals are represented by a circle, a square, and a triangle:

  • Circle = Largest Contentful Paint (LCP)
  • Square = Interaction to Next Paint (INP)
  • Triangle = Cumulative Layout Shift (CLS)

The desktop UI shows the trend graph and a summary on the left and a text explanation on the right.

Screenshot Of User Interface

The graph offers a visual snapshot of which direction the core web vitals are moving and an explanation of the kind of trend for each metric.

The three kinds of trends are:

  1. Good And Improving
  2. Good And Stable
  3. Poor And Regressing

Screenshot Showing CWV Performance

A more comprehensive explanation of the data is to the right of the trend graph, with each metric identified by the circle, square, and triangle icons.

Screenshot Of Data Explanation

Loading Performance

Using the left-hand navigation to reach the Loading Performance screen shows another trend graph with additional metrics related to how fast the site or URL loads.

It offers the following six visualizations:

  • Largest Contentful Paint (LCP)
  • First Contentful Paint (FCP)
  • Time to First Byte (TTFB)
  • Round Trip Time (RTT)
  • Navigation Types
  • Form Factors

Screenshot Of Six Visualization Choices

There’s a toggle next to each choice; clicking the toggle shows that metric’s trend graph. The rest of the choices show similar breakdowns for each kind of metric.

The new CrUX Vis tool should be useful to publishers and digital marketers who want an accurate measurement of website performance, visualized as a trend. It’s also useful for competitor research and website audits.

Go check it out at:

CrUX Vis


Google’s Search Liaison Addresses Brand Bias Concerns via @sejournal, @MattGSouthern

In a recent interview with Aleyda Solis, Google’s Search Liaison, Danny Sullivan, discussed the company’s approach to ranking smaller websites versus larger brands.

This topic has long been a point of contention, with concerns that Google’s ranking systems favor brands over independent sites.

Fairness In Search Results

Sullivan claims that Google doesn’t inherently favor brands, stating:

“Our ranking systems aren’t saying ‘are you a big brand therefore you rank’… The core of it isn’t really whether you’re big or you’re small, the core of it is whether you have the most useful, the most relevant, most satisfying information.”

The Perception Problem

Despite Google’s stance, Sullivan acknowledged the widespread perception that larger, well-established sites have an advantage in search results.

He recognized the frustration of smaller site owners who feel they cannot compete with bigger brands for visibility.

Sullivan states:

“I have looked at cases where people say you don’t like small sites, and I am not taking away from any of the real concerns because they are there… I wish they were doing better, but I can also see co-occurring in some of the same queries that I’m given other independent sites that are doing well.”

Challenges & Improvements

Sullivan admitted that Google’s systems sometimes fail to recognize high-quality content from smaller sites.

He assured that the company is actively improving this aspect of its algorithms.

Sullivan said:

“We don’t want it to be only the very big things rank well and I think in the last update we did talk about how we were taking in a lot of these concerns and trying to understand how we can do more for some of the smaller sites, the so-called independent sites.”

Advice For Smaller Sites

For independent website owners feeling discouraged, Sullivan offered some advice: focus on developing your brand.

He advised:

“If you’re a smaller site that feels like you haven’t really developed your brand, develop it. That’s not because we’re going to rank you because of your brand, but because it’s probably the things that cause people externally to recognize you as a good brand may in turn co-occur or be alongside the kinds of things that our ranking systems are kind of looking to reward.”

On advice for content creators, Sullivan adds:

“Just keep listening to your heart and doing what it is that you think is the right thing to be doing… Our ranking systems are trying to reward great content that’s made for people and if you feel like you’re doing that, then we’re going to try to catch up to you.”

Looking Ahead

Google appears to be taking these concerns seriously.

Sullivan mentioned that recent updates have aimed to do more for smaller sites. However, he maintains that Google’s goal is to show the best content regardless of brand recognition.

While challenges remain, Google’s acknowledgment of the issue and its efforts to improve suggest a potential shift with future updates.




Google’s SEO Tip To Get New Site Picked Up Faster via @sejournal, @martinibuster

Google’s John Mueller offered a useful technical SEO tip for those launching a new site: avoiding one common mistake can help your site get picked up by Google faster.

High Priority For Site Launch

Launching a website is a chance to take everything learned from previous experiences and apply it with the benefit of hindsight. There’s no better teacher than failure because lessons learned from mistakes are never forgotten.

Someone who recently registered a new domain started a discussion on Reddit asking for the top three things to consider when launching a successful website, before anything else has been done. The person asking preemptively ruled out the obvious answer of adding the domain to Google Search Console and set the ground rule that the niche or type of business didn’t matter. What did matter was that the suggestions must be important for scaling traffic within the first six months of the website.

They asked:

“Let’s say you have a brand new domain and you’ve been given a task to build traffic in the next 6 months. The niche, business does not matter, and the basics like ‘adding domain to Google search console’ don’t matter.

Tell me what are the first 3, high-priority things you’ll implement.”

The Most Upvoted Answer

It’s somewhat surprising that the most upvoted answer, with 83 votes, was one that offered the most obvious suggestions.

The top upvoted answer was:

“Create landing pages/content for your lowest funnel keyword opportunities and work the way up.”

It’s a matter of course that the information architecture of the site should be planned out ahead of time (things like keywords, topics, key pages, a complete org-chart-style map of categories with room left for expanding topical coverage, and an interlinking strategy). The upvoted answer is absolutely correct, but it’s also fairly obvious.

The rest of that highly upvoted response:

“Claim brand on top social medias.

Build easiest citations and directories that I know get indexed. Plus niche relevant ones.

Start reactive digital PR as main initial link building campaign.”

The obviousness of that upvoted answer contrasts with the less obvious quality of Mueller’s response.

John Mueller Advice On SEO Preparation

John Mueller’s advice is excellent and offers an insight into a technical issue that is easy to overlook.

He wrote:

“Just throwing this out there – if you don’t have a site ready, either keep DNS disabled or put up a custom holding page. Don’t use a generic server / CMS holding page. It generally takes longer for a site that’s known to be parked / duplicate to get recognized as a normal site than it does for a site to be initially picked up.”

Keep DNS Disabled

DNS stands for Domain Name System and refers to the backend process of converting a domain name to the IP address where the actual content lives. All content exists at an IP address, not at the domain name; the domain name just points to where the content is. By keeping DNS disabled, Google never discovers the domain pointing to anything, so the site essentially doesn’t exist.
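You can verify this state yourself: while DNS stays disabled, a lookup simply fails, and crawlers find nothing either. Here is a quick Python check; the domain names are placeholders.

```python
# A quick check of whether a domain resolves: if DNS is disabled, the
# lookup fails, and the domain effectively doesn't exist to crawlers.
import socket

def resolves(domain):
    try:
        return socket.gethostbyname(domain)
    except socket.gaierror:
        return None

print(resolves("example.com"))         # an IP address
print(resolves("no-dns-yet.example"))  # None while DNS stays disabled
```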

Don’t Use Generic Server/CMS Holding Page

A generic server holding page is the same as a parked domain: it sends a false signal to Google that something exists at the IP address the domain name resolves to.

The effect of Mueller’s advice about disabling DNS and avoiding a generic holding page is to keep the domain name from resolving to a holding page at all (assuming the registrar’s holding page is also turned off). This keeps Google from sniffing out the domain and finding a generic “nothing here” page.

Mueller’s advice points to a technical reality: Google will recognize and index a site faster if a generic version is never activated and the domain essentially doesn’t exist until launch.

So, if you want your website to be picked up and indexed quickly, don’t use a generic domain holding page.

Read Mueller’s advice here:

Brand New Domain : What are the first 3 things you’ll do?
