Some marketers now measure and attribute the cost of labor, technology, and services to individual promotional channels.
The concept is simple: many companies track only the revenue from marketing channels without considering the expense of managing them. The result is often a misleading picture of bottom-line performance.
Channel Comparison
Imagine a business with two marketing channels, A and B, each costing $1,000. Both generate 3,000 interactions from potential customers. However, Channel A converts at 2.5%, while Channel B converts at 4%.
If both channels had a $75 average order value and a 25% gross profit margin, Channel A would produce $406 in profit, and Channel B would earn $1,250. Channel B is the clear winner when compared in this way.
                   Channel A    Channel B
Promotional Cost   $1,000       $1,000
Interactions       3,000        3,000
Conv. Rate         2.50%        4.00%
Orders             75           120
Avg. Order Value   $75          $75
Sales Generated    $5,625       $9,000
Margin             25%          25%
Gross Profit       $1,406       $2,250
Profit             $406         $1,250
Just about every business would take the $1,000 invested in Channel A and double down on Channel B. After all, Channel B produces about three times as much profit.
This is often the proper choice, but not always.
Marketing Budgets
There’s more to marketing expenses than advertising or accessing a channel.
Let’s apply this idea to Channel A and Channel B. Suppose each channel is a demand-side platform (DSP), wherein marketers choose from a list of potential publishers.
DSP A allows marketers to pick some basic targeting demographics, but there’s little a specialist could do to optimize performance. It is a set-it-and-forget-it sort of platform.
On the other hand, DSP B has 100 targeting options that can be compared, fine-tuned, and optimized.
DSP B’s platform provides real-time data with Slack notifications every time a campaign’s conversion rate changes.
The marketing specialist spends about 30 minutes a month setting up the simplistic DSP A but about an hour a day monitoring, studying, and tweaking DSP B.
If the marketing specialist earns $50 an hour, DSP A costs about $25 per month in labor. Given 20 working days a month and an hour per day spent monitoring and optimizing, DSP B takes $1,000 in labor to run.
When counting labor, DSP A generates $381 in profit compared to DSP B’s $250. DSP A is the clear winner.
                   DSP A        DSP B
Promotional Cost   $1,000       $1,000
Interactions       3,000        3,000
Conv. Rate         2.50%        4.00%
Orders             75           120
Avg. Order Value   $75          $75
Sales Generated    $5,625       $9,000
Margin             25%          25%
Gross Profit       $1,406       $2,250
Labor Cost         $25          $1,000
Profit             $381         $250
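The arithmetic behind both tables reduces to one formula. Here is a minimal sketch, using only the figures from the tables above (a $50-per-hour specialist spending 30 minutes a month on DSP A and an hour a day, 20 days a month, on DSP B):

```python
def channel_profit(cost, interactions, conv_rate, aov, margin, labor=0.0):
    """Net monthly profit for a channel: gross profit on sales,
    minus promotional cost and labor."""
    orders = interactions * conv_rate
    sales = orders * aov
    gross_profit = sales * margin
    return gross_profit - cost - labor

# Figures from the tables above.
dsp_a = channel_profit(1000, 3000, 0.025, 75, 0.25, labor=0.5 * 50)   # 30 min/mo
dsp_b = channel_profit(1000, 3000, 0.04, 75, 0.25, labor=20 * 50)     # 20 hrs/mo
print(round(dsp_a), round(dsp_b))  # 381 250
```

Omitting the `labor` argument reproduces the first table, where Channel B's $1,250 beats Channel A's $406.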
Applying the Concept
Beyond labor, other expenses — e.g., software, creative design, agency retainers — could also change a channel’s return on investment, although not every expense is ongoing. Some are one-time or upfront charges that go away.
Thus, when attributing marketing expenses by channel:
Decide what to measure. Labor, software, or simply the cost of an ad or promotion?
Choose when to measure. Should the channel be measured per interaction? Or would monthly work better?
Plan for upfront expenses. Should upfront expenses be amortized? If so, over what period? How will channels with amortized costs compare with those having ongoing expenses?
Manage sensitive information. Some costs are sensitive or private. Will salaries be shared, or will the labor portion of the equation be closely held?
Decide how you will measure. Should marketers use time-tracking software?
Document the process. Record what, when, and how results are measured.
Collect only essential data. There’s no need to track labor or software costs if they don’t impact marketing decisions.
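The amortization question above can be sketched simply: spread a one-time charge across a chosen window so channels with upfront costs compare fairly against those with purely ongoing expenses. The figures below are hypothetical, and the 12-month window is an illustrative assumption.

```python
def monthly_channel_cost(ongoing, upfront=0.0, amortize_months=12):
    """Spread a one-time setup charge across an amortization window
    so channels with upfront costs compare fairly with ongoing ones.
    The 12-month default is an illustrative assumption."""
    return ongoing + upfront / amortize_months

# Hypothetical example: a channel with a $1,200 setup fee amortized
# over 12 months costs the same per month as a purely ongoing channel.
with_setup = monthly_channel_cost(ongoing=500, upfront=1200)   # 600.0
ongoing_only = monthly_channel_cost(ongoing=600)               # 600.0
```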
Lastly, remember that sometimes the cure can be worse than the illness. Attributing expenses by channel can drive performance at the cost of damaging staff morale. So attribute with prudence.
Press release distribution is a mainstay in ecommerce search engine optimization. Public relations consultants issue press releases for announcements or news, while search engine optimizers use them to acquire backlinks. But a digital PR consultant aims for both: announcements and links.
For fees ranging from $50 to $2,500 per press release, distribution services can get a release republished on other websites. Sometimes these are reputable news organizations. Other times, they’re low-traffic sites posing as news.
There are a few large press release distributors and dozens of smaller middlemen that syndicate to their own news circuits. Distributing a press release can be complicated given the decisions on reach, the length of the announcement, and the number of accompanying images, logos, and videos.
First-timers can waste a lot of money paying for services they don’t need. Smaller distributors are more willing to assist and provide advisory services. Larger distributors have less patience, especially for low-volume customers.
Search Optimized
“Press releases are a great way to boost ecommerce rankings, but you need to know how to use them,” says Maj Hussain, owner of Magic PR, one of the smaller press release distributors.
The process of search optimizing a press release and a web page is similar. Select a keyword and its variations and support them with related entities and semantic terms in the headline, sub-headings, and body text.
A press release is usually search-optimized for the same keyword and tactics as the page it links to and typically includes three components: (i) a news angle, (ii) optimized body and anchor text, and (iii) hyperlinks.
The example release below is a news angle optimized for an unbranded keyword (“Sebastopol Wine Tasting”).
And here’s an example of my own release for backlink acquisition optimized for a compound keyword: a brand (“Eric Schwartzman”) and a category (“SEO consultant”).
Maj Hussain of Magic PR suggests that ecommerce merchants start with a branded press release to establish authority, followed by product or category releases to drive traffic and link equity.
Branded releases promote the store’s name and link to its home page. Once the brand and home page rank, target your product listing pages, where the sales occur.
Press Coverage vs. Backlinks
The most powerful ecommerce press releases spur third-party editorial coverage with links to product-detail or category pages. A media-savvy digital PR consultant can help.
For newsworthy scoops, I’ve negotiated exclusives with a single editorial outlet. That site gets to break the story first in exchange for a prominent link in the article.
Google considers links from credible news outlets as authoritative and arranges its indexing system accordingly. I addressed the issue in July, explaining how the Google API leak suggests a three-tiered link storage hierarchy that places the most valuable, highly trafficked pages in RAM storage for faster access.
News articles with traffic are more likely to be stored on Google’s RAM tier.
But even if it doesn’t score original news coverage, a press release can still be aggregated by a group of websites that post it unchanged. If the aggregators are legit, this is a reliable method of acquiring backlinks.
Anchor Text
How you link from the press release to an ecommerce page matters. Qamar Zaman, founder of Kiss PR Brand Story, recommends linking “product name as anchor text to a product detail page and including a separate brand anchor link to the home page.”
He prefers no more than one link per 100 words, adding, “Naked URLs work very well, especially in ecommerce, because they drive engagement and link power directly to the page.”
Naked URLs start with https:// followed by the domain you’re linking to. The text of the URL remains in the body even if an aggregator strips out the link. Google may consider it active nonetheless.
“It’s also smart to use partial-match keywords or compound links that include both the brand and product name,” says Zaman. A compound ecommerce anchor could be the company name plus a product category, wherein both a store name and product category link to the same page.
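Zaman's one-link-per-100-words rule of thumb is easy to check before submitting a release. The sketch below counts naked `https://` URLs only; links embedded as HTML anchor tags would need a real parser, and the threshold is simply the guideline quoted above.

```python
import re

def link_density_ok(body_text, per_words=100):
    """Rough check of the one-link-per-100-words rule of thumb.
    Counts https:// URLs (naked links); an illustrative sketch,
    not a substitute for a distributor's editorial review."""
    words = len(body_text.split())
    links = len(re.findall(r"https?://\S+", body_text))
    allowed = max(1, words // per_words)  # at least one link permitted
    return links <= allowed
```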
Press Release Pick-ups
A press release is “picked up” when it is republished (“aggregated”) on another website.
Not all press release distributors have relationships with aggregators that can deliver high-authority backlinks. PR Newswire, Business Wire, and GlobeNewswire are three of the larger distributors. They satisfy the Securities and Exchange Commission’s fair disclosure requirements (Regulation FD), which are imposed on publicly traded companies to ensure no one investor has an unfair trading advantage based on selectively revealed information.
Publicly traded companies comply by using one of the distribution or “wire” services that discloses to the Associated Press, Dow Jones, Bloomberg, and Reuters simultaneously.
Distributors that comply with Reg FD generally have superior distribution. But they’re more expensive and have stricter editorial guidelines, employing editors who review and approve releases before distributing them. I offer free training on satisfying Reg FD with social media.
Some of the smaller press release companies buy extended distribution through wires such as GlobeNewswire and then pad their pick-up reports with links from publishers with little traffic. The Associated Press aggregates press releases, but its backlinks are no-followed.
Aggregators have their own (undisclosed) rules and conditions. Many are low-quality, quasi-news sites that programmatically publish the same content. The sites frequently use the same templates with different mastheads. Smaller distributors often send their releases to these “news” aggregators to fluff up the number of pick-ups to justify their fees.
It is unlikely that Google will index press release pick-ups on these sites, much less store them on its RAM index.
The goal is press release pick-ups from known news outlets. “Press releases distributed through major outlets like Yahoo News and Bloomberg can attract attention from Google’s search engine and get those critical backlinks indexed,” Hussain adds. “The result is higher rankings, more traffic, and increased visibility for ecommerce sites.”
Google News
Press releases featured in Google News Top Stories or Google Discover can significantly boost rankings, but it’s not guaranteed.
“Press releases can show up in Top Stories, but they typically have to be picked up by major publications like MarketWatch or Bloomberg,” says Zaman. “A content creator with an established site has a better chance of getting into Google News than newer sites.”
Press Release Pitfalls
Press releases must follow distributors’ guidelines, and how those guidelines are applied often depends on the editor’s experience.
I recently had a release with the headline “Predatory Practices of Military Credit Card Issuers Exposed” rejected by an editor at one wire service, only to be approved with no changes by an editor at another. The rules are opaque and unevenly enforced.
The sites a release links to must be approved too. “Editors won’t approve releases that promote third-party brands without a direct relationship,” says Hussain. “But it can still work depending on how you structure the release.” Announcing a brand-specific sale or promotion at your online store is one way to get a brand name in the headline.
Another challenge of press releases (and all content) is link decay. You must continuously acquire quality backlinks as their value decreases over time, especially from lower-tier sites.
“Many smaller sites with low domain authority can initially get indexed, but over time, their impact on rankings fades,” says Zaman. “The key is securing high-authority placements and building links from topical, relevant sources.”
Ecommerce SEO
Despite these challenges, press releases remain a valuable tool for building ecommerce brand authority, securing backlinks, and boosting organic traffic.
Start with brand-focused releases. Establish authority and recognition by targeting the home page.
Target product detail and category pages. Focus on specific categories and product pages to drive direct traffic and conversions. Use compound anchors and naked links to target pages.
Maintain link quality. Use tools such as Ahrefs, Semrush, and Moz to assess the quality and authority of backlinks. Be wary of press release providers that bundle quasi-news pick-ups with one or two known outlets.
Leverage major news outlets. Target top-tier publications to increase the chances of appearing in Google’s most valuable index tiers, securing high-authority, long-lasting links. Examine distributors’ news circuits carefully to determine how many reputable news sites aggregate their releases.
Struggling to rank for your target keywords? You’re not alone.
The SEO landscape is more complex than ever, with search intent evolving and SERP features constantly changing.
So, how do you make sure your content aligns with Google’s evolving expectations?
Check out our webinar on September 25, 2024: “Navigating SERP Complexity: How to Leverage Search Intent for SEO.”
Tom Capper of STAT will discuss the role of search intent in SEO and how to use it to climb in the right SERPs for your brand.
Why This Webinar Is A Must-Attend Event
Ranking isn’t just about keywords anymore—it’s about understanding the intent behind each search.
We’ll cover:
How intent is nuanced, and many keywords can support multiple intents.
Why the same keyword can have a different intent depending on where it was searched from and on what device.
The differences in SERP features depending on intent, and how this impacts your content strategy.
Expert Insights From Tom Capper
Leading this session is Tom Capper from STAT Search Analytics.
Capper will dive deep into searcher motivations using first-party research data and provide actionable insights to help you improve your site’s organic visibility.
Reserve your spot and find out more about how these insights can impact your ranking.
Who Should Attend?
This webinar is perfect for:
SEO professionals looking to take their strategies to the next level
Content managers and strategists wanting to increase the effectiveness of their work
Enterprise professionals and digital marketers looking to blend branding, marketing, and SEO for a unified customer experience
Anyone interested in search results and consumer behavior
Live Q&A: Get Your Questions Answered
Following the presentation, Tom will host a live Q&A session.
This is your chance to clarify misconceptions surrounding the intersection of content, search intent, and the SERPs and get expert advice on optimizing your strategies.
Don’t Miss Out!
Understanding search intent is critical to staying competitive in SEO. Reserve your spot today to ensure you’re not left behind.
Can’t attend live? Sign up anyway for the recording.
Get ready to unlock new SEO opportunities and boost your rankings. See you there!
WordPress security company Patchstack announced a $5 million funding round and the addition of Joost de Valk, co-founder of Yoast SEO, to its board. The funding will accelerate Patchstack’s development toward becoming the fastest full-cycle security solution.
Patchstack – Trusted Security Partner
Patchstack, based in Estonia, is a fast-growing WordPress security company trusted by major web hosts, plugins, and websites around the world. It recently released a free security tool that helps open-source software vendors comply with the upcoming European Cyber Resilience Act.
Patchstack is a highly regarded WordPress security company that is trusted by customers such as GoDaddy, Digital Ocean, Plesk, and cPanel and is a security partner with over 300 WordPress plugins such as Elementor, WP Rocket, WP Bakery Page Builder and Slider Revolution.
Patchstack provides security scans for over five million websites every day and offers a free plugin for vulnerability detection as well as low-cost real-time protection (starting at $5 per website per month).
The announcement by Patchstack offers details of the $5 million funding:
“Estonian cybersecurity startup Patchstack who in 2022 received €2.7M R&D grant from European Innovation Council announced an additional 5 million USD funding round to further their mission of covering the entire lifecycle of open-source security to provide the fastest mitigation to the emerging security threats.
Patchstack’s Series A round was led by Karma Ventures, an early-stage venture capital fund focusing on deep-tech software companies, with participation from G+D Ventures, the German TrustTech investor, and Emilia Capital, the investment firm of Yoast founders Marieke van de Rakt and Joost de Valk.”
Joost de Valk commented to Search Engine Journal:
“Patchstack is really an amazing company and product. I recently joined their board.”
He’s right. Patchstack currently prevents millions of vulnerability attacks and should be on the shortlist of security solutions for every WordPress website. Although WordPress security is not typically considered an SEO concern, it should be an important factor in every SEO audit: all it takes is one major vulnerability event to lose the trust of customers and site visitors, which can affect both earnings and rankings.
Since its launch in 2021, Google’s Performance Max (PMax) campaigns have revolutionized how we approach cross-channel advertising.
But while this AI-powered campaign has been a game-changer for many, particularly in ecommerce, its application for lead generation comes with unique challenges and opportunities.
I thought this would be a good time to share advanced strategies for leveraging PMax in lead generation campaigns. We’ll also identify common pitfalls and reveal how to maximize your results.
The Lead Gen Challenge: Quality Over Quantity
Unlike ecommerce, where a purchase marks a clear end to the customer journey, lead generation is just the beginning.
This fundamental difference creates a significant hurdle for PMax campaigns, presenting unique challenges that require advanced strategies.
Menachem Ani, founder of JXT Group, explains the core issue:
“Unlike e-commerce, where a purchase signifies the ‘end’ of the transaction, lead creation is the beginning of the sales process—and just because someone fills out a form doesn’t make them a quality lead.”
The challenge lies in teaching Google’s algorithm to distinguish between high-quality and low-quality leads.
Without this crucial information, PMax campaigns can fall into what some marketers call the “feedback loop of doom” – optimizing for quantity over quality and potentially flooding your pipeline with unqualified leads.
Another critical factor in determining whether PMax suits your lead generation efforts is Google’s understanding of your business and website.
PMax heavily relies on your website content when targeting users. If Google misinterprets your service offerings, your campaigns are likely to underperform.
To assess Google’s comprehension of your business, try this simple test: Input your website URL into the Search Terms section of Google’s Keyword Planner.
Review the generated keywords and evaluate their relevance to your services. This exercise can provide valuable insights into how well Google understands your business, which is crucial for PMax’s success in lead generation campaigns.
Cracking The Code: Strategies For Success
1. Harness The Power Of Offline Conversion Tracking (OCT)
The key to success with PMax for lead generation lies in feeding Google’s algorithm with quality data. This means implementing robust OCT.
“To make Performance Max for lead generation work, it is vital to instruct Google on the outcome. If you don’t, it will seek to bring in as many submissions as possible – regardless of the quality.”
Import offline conversion data, either manually or automatically.
Define what a qualified lead is for your business and make the form difficult to complete.*
Set qualified leads as your primary conversion point.
* You can do this by adding several qualifying questions, setting up confirmation pages for qualified leads versus unqualified leads, and only counting the qualified leads.
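Once sales has marked which form fills were qualified, the feedback step is mechanical: export only the qualified leads, keyed by Google Click ID, for import into Google Ads. This is a hedged sketch; the column names below follow Google's click-conversion import template, but verify them against the current template before uploading, and the lead fields (`gclid`, `qualified`, etc.) are illustrative CRM field names.

```python
import csv

def write_offline_conversions(leads, path="qualified_leads.csv"):
    """Write qualified CRM leads to a CSV for Google Ads offline
    conversion import. Column names follow Google's click-conversion
    template; verify against the current template before uploading."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["Google Click ID", "Conversion Name",
                         "Conversion Time", "Conversion Value",
                         "Conversion Currency"])
        for lead in leads:
            if lead["qualified"]:  # only feed back leads sales accepted
                writer.writerow([lead["gclid"], "Qualified Lead",
                                 lead["converted_at"], lead["value"], "USD"])
```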
2. Leverage Cross-Campaign Optimization
Use data from successful campaigns across other channels to inform your PMax strategy. This is especially useful when dealing with limited data in new PMax campaigns.
Practical tips:
Incorporate broader themes and topics related to your products or services as asset signals rather than specific keywords. For example, if you sell home renovation products, use signals like “home improvement,” “interior design,” or “home renovations” instead of specific product keywords.
Note: It’s important to understand that PMax uses these signals differently than traditional keyword targeting. The goal is to help Google understand your potential customers’ broader context and interests, not to target specific search terms.
Collaborate with your sales and marketing teams to incorporate insights from persona studies and campaign success stories.
3. Rethink Your Campaign Structure
It’s vital to understand PMax’s strengths and limitations. Performance Max excels at capturing existing demand but struggles with generating new demand.
To leverage this characteristic effectively, consider implementing an advanced tactic:
Run a PMax campaign alongside a separate Search campaign that targets high-quality, intent-driven search keywords. This approach allows you to:
Use traditional Search campaigns to capture high-intent traffic based on specific keywords.
Let PMax excel at what it does best: Remarketing to this high-quality traffic and working on converting them.
Action steps:
Identify your highest-performing, intent-driven keywords from historical data.
Create a focused Search campaign using these keywords.
Set up a complementary PMax campaign.
Ensure your audience settings allow PMax to remarket to users interacting with your Search ads.
Monitor both campaigns closely, adjusting budget allocation based on performance.
This strategy lets you maintain control over your highest-value search terms while leveraging PMax’s optimization and remarketing capabilities. It combines the precise targeting of traditional Search with the broad reach and AI-driven optimization of PMax.
Remember, viewing these campaigns as complementary rather than competitive is key. Doing so can create a more comprehensive and effective lead generation strategy that captures high-intent traffic and then nurtures it effectively toward conversion.
4. Harness The Power Of First-Party Data
Your first-party data is gold for PMax campaigns. It provides Google with clear signals about who is genuinely interested in your products or services.
Steps to implement:
Add high-value customer lists from your CRM.
Utilize existing remarketing lists.
If compliant with privacy regulations, import contacts who have engaged with your email campaigns.
5. Strategic Exclusions
While PMax limits our control compared to traditional search campaigns, we can still guide the algorithm by telling it where not to show our ads.
Tactics to consider:
Exclude poor-performing keywords and placements at the account level.
Use Google’s Insights reports to identify and exclude irrelevant search terms and placements.
Pitfalls To Avoid
1. Neglecting Lead Scoring
Not all leads are created equal. Implement a lead scoring system in your CRM and feed this data back to Google. This allows the algorithm to optimize for lead quality, not just quantity.
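A lead score can be as simple as a weighted sum of a few CRM fields. The sketch below is illustrative only: the field names, weights, and threshold are assumptions to tune against your own closed-won data, not a recommended model.

```python
def score_lead(lead):
    """Toy lead-scoring sketch: weight a few CRM fields into a
    0-100 score. Field names and weights are illustrative
    assumptions -- tune them against closed-won CRM data."""
    score = 0
    if lead.get("company_email"):
        score += 30   # business address, not a free mailbox
    if lead.get("budget_stated"):
        score += 30
    score += min(lead.get("pages_viewed", 0) * 5, 20)
    if lead.get("phone_provided"):
        score += 20
    return min(score, 100)

# Leads at or above the threshold get reported back to Google as
# qualified conversions; the rest stay CRM-only.
QUALIFIED_THRESHOLD = 60
```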
2. Ignoring The Full Funnel
Remember that lead generation is just the start. Track and optimize for downstream metrics like qualified leads, meetings set, and eventual sales.
3. Over-Reliance On Automation
While PMax is highly automated, it still requires human oversight and strategic input. Review performance regularly, adjust audience signals, and refine your creative assets.
4. Neglecting Creative Assets
PMax relies heavily on your creative input. Ensure you provide a diverse range of high-quality assets, including compelling ad copy, eye-catching images, and engaging videos.
5. Leverage Negative Keywords And URL Controls
While Performance Max limits traditional keyword targeting, Google now allows advertisers to use negative keyword lists in these campaigns.
This feature is crucial for lead generation efforts, serving two essential purposes:
Brand Protection: Create a list of your branded keywords and exclude them in PMax. As Brooke Osmundson explains, “At the very minimum, create a list of targeted brand keywords and exclude them in PMax. This allows your tried-and-true search campaign to run and optimize per usual, without PMax cannibalizing any existing efforts”.
Competitor Exclusion: You can also exclude competitor brand terms if you believe Google might show your ads for these searches inappropriately.
Additionally, pay close attention to URL settings. Consider disabling URL Expansion in your PMax settings, as this will help you retain control over what landing pages are used. For lead generation, sending users to the right landing page is crucial for lead quality.
If you choose to keep URL Expansion enabled, be sure to exclude irrelevant pages like blogs, recruitment pages, or ‘About’ sections to maintain control over the user journey.
These controls allow you to shape your PMax campaign more precisely, ensuring it complements your existing strategies and maintains the quality of your lead generation efforts.
6. Overlooking Placement And Asset Control
Reviewing the display placements and excluding irrelevant websites from showing your ads is essential. Typically, gaming sites and kids’ apps are the first things you want to exclude.
Also, be wary of Google’s automatically created assets. At my PPC agency, Hop Skip Media, we’ve seen instances where Google automatically creates YouTube videos of inferior quality. Make sure you turn those off and review assets regularly.
7. Misunderstanding Google’s Comprehension Of Your Business
Before fully committing to PMax for lead generation, assess whether Google accurately understands your business and website content.
You can do this by inputting your website URL into the Search Terms section of the Keyword Planner and evaluating the relevance of the generated keywords.
If there’s a significant mismatch, PMax may struggle to target the right audience for your lead generation efforts.
8. Neglecting To Define And Track Quality Leads
Make your lead generation forms more discriminating by including qualifying questions.
Set up separate confirmation pages for qualified and unqualified leads, and only count qualified leads when firing your conversion pixel. This approach helps ensure that PMax optimizes for quality leads rather than just quantity.
By avoiding these pitfalls and implementing the strategies we’ve discussed, you can significantly improve the effectiveness of your Performance Max campaigns for lead generation.
Remember, the key is continually providing Google’s algorithm with high-quality data and maintaining strategic oversight of your campaigns.
The Future Of PMax For Lead Gen
Like anything in Google Ads, Performance Max for lead generation is not a set-it-and-forget-it solution. It requires a strategic approach, continuous optimization, and a deep understanding of your lead qualification process.
By implementing offline conversion tracking, leveraging first-party data, structuring campaigns thoughtfully, and avoiding common pitfalls, you can harness the power of Google’s AI to generate more leads and better leads.
As Google continues to refine and improve Performance Max, we can expect even more sophisticated targeting and optimization capabilities.
The marketers who will succeed are those who stay ahead of the curve, continuously testing, learning, and adapting their strategies.
Remember, the goal isn’t just to fill your funnel – it’s to fill it with suitable leads. With these advanced strategies, you’re well-equipped to make Performance Max a powerful tool in your lead generation arsenal.
Google’s John Mueller affirmed in a LinkedIn post that two site characteristics that could be perceived as indicative of site quality aren’t ranking factors, suggesting that other perceived indicators of quality may not be either.
Site Characteristics And Ranking Factors
John Mueller posted something interesting on LinkedIn because it offers insight into how an attribute of quality sometimes isn’t enough to be an actual ranking factor. His post also encourages a more realistic consideration of what should be considered a signal of quality and what is simply a characteristic of a site.
The two characteristics of site quality that Mueller discussed are valid HTML and typos (typographical errors, commonly spelling errors). His post was inspired by an analysis of the home pages of the 200 most popular websites, which found that only 0.5% had valid HTML. That means that out of the 200 most popular sites, only one home page was written with valid HTML.
John Mueller said that a ranking factor like valid HTML would be a low bar, presumably because spammers can easily create web page templates that use valid HTML. Mueller also made the same observation about typos.
Valid HTML
Valid HTML means that the code underlying a web page follows all of the rules for how HTML should be used. What constitutes valid HTML is defined by the W3C (World Wide Web Consortium), the international standards-making body for the web. HTML, CSS, and Web Accessibility are examples of standards the W3C creates. The validity of HTML can be tested with the W3C Markup Validation Service, available at validator.w3.org.
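To see how mechanical such a check can be, here is a rough tag-balance sketch using Python's standard-library parser. This falls far short of full W3C validation (it only catches mismatched open/close tags); use validator.w3.org for the real thing.

```python
from html.parser import HTMLParser

# Void elements never take a closing tag, so they are skipped.
VOID = {"area", "base", "br", "col", "embed", "hr", "img", "input",
        "link", "meta", "source", "track", "wbr"}

class TagBalanceChecker(HTMLParser):
    """Rough well-formedness check: flags mismatched open/close tags.
    Far short of full W3C validation -- a sketch only."""
    def __init__(self):
        super().__init__()
        self.stack, self.errors = [], []

    def handle_starttag(self, tag, attrs):
        if tag not in VOID:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        if self.stack and self.stack[-1] == tag:
            self.stack.pop()
        else:
            self.errors.append(f"unexpected </{tag}>")

def check(html):
    checker = TagBalanceChecker()
    checker.feed(html)
    return checker.errors + [f"unclosed <{t}>" for t in checker.stack]

print(check("<div><p>ok</p></div>"))  # []
```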
Is Valid HTML A Ranking Factor?
The post begins by stating that a commonly asked question is whether valid HTML is a ranking factor or some other kind of factor for Google Search. It’s a valid question because valid HTML could be seen as a characteristic of quality.
He wrote:
“Every now and then, we get questions about whether “valid HTML” is a ranking factor, or a requirement for Google Search.
Jens has done regular analysis of the validity of the top websites’ homepages, and the results are sobering.”
The phrase “the results are sobering” means that the finding that most home pages use invalid HTML is surprising and possibly cause for concern.
Given how virtually all content management systems do not generate valid HTML, I’m somewhat surprised that even one site out of 200 used valid HTML. I would expect a number closer to zero.
Mueller goes on to note that valid HTML is a low bar for a ranking factor:
“…this is imo a pretty low bar. It’s a bit like saying professional writers produce content free of typos – that seems reasonable, right? Google also doesn’t use typos as a ranking factor, but imagine you ship multiple typos on your homepage? Eww.
And, it’s trivial to validate the HTML that a site produces. It’s trivial to monitor the validity of important pages – like your homepage.”
Ease Of Achieving Characteristic Of Quality
There have been many false signals of quality promoted and abandoned by SEOs, the most recent being “authorship” and “content reviews,” which are supposed to show that an authoritative author wrote an article and that someone authoritative checked it. People did things like invent authors with AI-generated images associated with fake LinkedIn profiles, in the naïve belief that adding an author to the article would trick Google into awarding ranking-factor points (or whatever, lol).
The authorship signal turned out to be a misinterpretation of Google’s Search Quality Raters Guidelines and a big waste of a lot of people’s time. If SEOs had considered how easy it was to create an “authorship” signal it would have been apparent to more people that it was a trivial thing to fake.
So, one takeaway from Mueller’s post is that if there’s a question about whether something is a ranking factor, first check whether Google explicitly says it is one; if not, consider whether literally any spammer could achieve the “something” that an SEO claims is a ranking factor. If it’s trivial to achieve, there’s a high likelihood it’s not a ranking factor.
There Is Still Value To Be Had From Non-Ranking Factors
The fact that something is relatively easy to fake doesn’t mean that web publishers and site owners should stop doing it. If something is good for users and helps build trust, it’s likely a good idea to keep doing it. Just because something is not a ranking factor doesn’t invalidate the practice. In the long run, it’s always good practice to keep doing activities that build trust in the business or the content, ranking factor or not. Google tries to pick up on the signals that users and other websites give off to determine whether a website is high quality, useful, and helpful, so anything that generates trust and satisfaction is likely a good thing.
The other core function of search engines is to retain users.
Search engines retain users by ensuring their confidence and trust in the displayed results. Over time, they build expectations that using their platform is a safe, streamlined experience that quickly leads users to what they want.
SEO success depends on being found by your target audience for what they are looking for and consistently providing a satisfying user experience based on the context of the queries they type into search engines.
Search Is Built On Content
The core function of search engines is to help users find information. Search engines first discover webpages, then parse and render them, and then add them to an index. When a user inputs a query, search engines retrieve relevant webpages from the index and then “rank” them.
Search engines need to know what pages are about and what they contain in order to serve them to the right users. In concept, they do this quite simply: They examine the content. The real process behind this is complicated, executed by automated algorithms and evaluated with human feedback.
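As a toy illustration of that concept (and in no way Google’s actual process), a hypothetical inverted index shows how examining content lets an engine retrieve pages for a query:

```python
from collections import defaultdict

def build_index(docs):
    """Map each token to the set of document IDs containing it (an inverted index)."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for token in text.lower().split():
            index[token].add(doc_id)
    return index

def retrieve(index, query):
    """Return IDs of documents containing every query token (no ranking applied)."""
    results = None
    for token in query.lower().split():
        matches = index.get(token, set())
        results = matches if results is None else results & matches
    return results or set()

# Hypothetical pages standing in for a crawled corpus.
pages = {
    "page-a": "valid html on home pages",
    "page-b": "html content guidelines",
}
index = build_index(pages)
print(retrieve(index, "valid html"))  # only page-a contains both tokens
```

Real engines layer parsing, rendering, link analysis, and ranking signals on top, but the index-then-retrieve core is the same idea.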
This relationship between searchers, search engines, and websites has come to define the internet experience for most users. Unless you know the exact URL of the website you intend to visit, you must find it via a third party. That could be social media, a search engine, or even discovering the website offline and then typing it in. This is called a “referral,” and Google sends 64% of all website referrals in the U.S. Microsoft and Bing send the next largest share of referrals, followed by YouTube.
Getting discovered by people who don’t already know you depends on search engines, and search engines depend on content.
At this point, whether this relationship is causal or correlative doesn’t matter. You must prioritize user experience and satisfaction because it’s a key indicator of SEO success.
Written language is still the primary way users interact with search engines and how algorithms understand websites. Google algorithms can interpret audio and videos, but written text is core to SEO functionality.
Enticing clicks and engaging users through content that satisfies their queries is the baseline of SEO. If your pages can’t do that, you won’t have success.
High-quality content and user experiences aren’t just important for SEO; they’re prerequisites.
This is true for all advertising and branding. Entire industries and careers are built on the skills to refine the right messaging and put it in front of the right people.
Evidence For The SEO Value Of Content
Google highlights the importance of content in its “SEO fundamentals” documentation. It advises that Google’s algorithms look for “helpful, reliable information that’s primarily created to benefit people,” and provides details about how to self-assess high-quality content.
In fact, Google’s analysis of the content may determine whether a page enters the index at all to become eligible to rank. If you work hard to provide a good experience and serve the needs of your users, search engines have more reason to surface your content and may do so more often.
A 2024 study in partnership between WLDM, ClickStream, and SurferSEO suggests that the quality of your coverage on a topic is highly correlated with rankings.
Content And User Behavior
Recent developments in the SEO industry, such as the Google leak, continue to highlight the value of both content and user experience.
Google values user satisfaction to determine the effectiveness and quality of webpages and does seem to use behavioral analysis in ranking websites. It also focuses on the user intent of queries and whether a specific intent is served by a particular resource.
The satisfaction of your users is, if not directly responsible for SEO performance, highly correlated with it.
Many factors affect user experience and satisfaction. Website loading speed and other performance metrics are part of it. Intrusive page elements are another.
Content, however, is one of the primary determiners of a “good” or “bad” experience.
Does the user find what they’re looking for? How long does it take?
Is the content accurate and complete?
Is the content trustworthy and authoritative?
The answers to these questions reflect whether the user has a good or bad experience with your content, and this determines their behavior. Bad experiences tend to result in the user leaving without engaging with your website, while good experiences tend to result in the user spending more time on the page or taking action.
This makes content critical not only to your SEO efforts on search engines but also to your website’s performance metrics. Serving the right content to the right users in the right way impacts whether they become leads, convert, or come back later.
Leaning into quality and experience is a win all around. Good experiences lead to desirable behaviors. These behaviors are strong indications of the quality of your website and content. They lead to positive outcomes for your business and are correlated with successful SEO.
What Kinds Of Content Do You Need?
Successful content looks different for each goal you have and the different specific queries you’re targeting.
Text is still the basis of online content when it comes to search. Videos are massively popular. YouTube is the second-most popular search engine in the world. However, in terms of referrals, it only sends 3.5% of referral traffic to the web in the U.S. In addition, videos have titles, and these days, most have automated transcripts. These text elements are critical for discovery.
That isn’t to say videos and images aren’t popular. Video, especially “shorts”-style video, is an increasingly popular medium. Cisco reported that video made up 82% of all internet traffic in 2022. So you absolutely should consider images and video as part of your content strategy to best serve your audiences and customers.
Both can enhance text-based webpages and stand on their own on social platforms.
But for SEO, it’s critical to remember that Google search sends the most referral traffic to other websites. Text content is still the core of a good SEO strategy. Multi-modal AI algorithms are getting very good at translating information between various forms of media, but text content remains critical for several reasons:
Plain text has high accessibility. Screen readers can access it, and it can be resized easily.
Text is the easiest way for both people and algorithms to analyze semantic connections between ideas and entities.
Text doesn’t depend on device performance like videos and images might.
Text hyperlinks are very powerful SEO tools because they convey direct meaning along with the link.
It’s easier to skim through text than video.
Text content is still dominant for SEO. But you should not ignore other content. Images, for example, make for strong link building assets because they’re attractive and easily sharable. Accompanying text with images and video accommodates a variety of user preferences and can help capture attention when plain text might not.
Like everything else, it’s down to what best serves users in any given situation.
SEO Content: Serving Users Since Search Was A Thing
Search engines match content to the needs of users.
Content is one-third of this relationship: user – search engine – information.
You need content to perform SEO, or any digital marketing activity, successfully.
The difficulty comes from serving that perfect content for the perfect situation.
Google provides an “About this result” panel for organic listings. It appears behind the three dots to the right of every organic result and contains data about that page.
The panel does not appear on blended results, such as “stores,” “news,” and “Interesting finds.” Sometimes the only way to tell organic results from other listings is via the three dots.
The panel may vary slightly, but it generally includes the following sections.
Options to Interact
The top section links to the page and provides options to share it on social media, save it, remove it from your personalized results, and provide feedback.
Search engine practitioners debate whether these options provide additional ranking data to Google. For example, could a page rank higher if enough people saved or shared it?
I’ve seen no direct confirmation from Google, but manipulating these signals (e.g., recruiting folks to save URLs or post negative feedback on a competitor) is a bad idea.
‘About the source’
This section contains summary info about the site. The sources are usually from the site itself or Wikipedia, although I’ve also seen AI-generated snapshots in this section.
“About the source” contains summary info about the page and site.
Click the “More about this page” button for more detail, such as:
An in-depth description of the site (sometimes AI-generated),
Keywords (including related keywords) on the page matching the query,
Publish date,
Relevant images,
Links from relevant pages,
Images on those relevant pages,
Alignment with the searcher’s language and location.
For example, I searched “best waterfall trails in NY.” One of the panels included the following (I added the bold text):
“These search terms appear in the result: best, waterfall, trails.”
“Terms related to your search in the result: highest, waterfalls, falls, new york.”
“This result has images related to your search.”
“This result was published or updated recently.”
“Other websites with your search terms link to this result.”
“The result is in English.”
“This result seems relevant for searches from: United States.”
Search on key queries and then check this section for higher-ranking pages. Note their keywords and other potential factors, such as recency and visuals.
Have you ever found yourself lost in a building that felt impossible to navigate? Thoughtful building design should center on the people who will be using those buildings. But that’s no mean feat.
It’s not just about navigation, either. Just think of an office that left you feeling sleepy or unproductive, or perhaps a health center that had a less-than-reviving atmosphere. A design that works for some people might not work for others. People have different minds and bodies, and varying wants and needs. So how can we factor them all in?
To answer that question, neuroscientists and architects are joining forces at an enormous laboratory in East London—one that allows researchers to build simulated worlds. In this lab, scientists can control light, temperature, and sound. They can create the illusion of a foggy night, or the tinkle of morning birdsong.
And they can study how volunteers respond to these environments, whether they be simulations of grocery stores, hospitals, pedestrian crossings, or schools. That’s how I found myself wandering around a fake art gallery, wearing a modified baseball cap with a sensor that tracked my movements.
I first visited the Person-Environment-Activity Research Lab, referred to as PEARL, back in July. I’d been chatting to Hugo Spiers, a neuroscientist based at University College London, about the use of video games to study how people navigate. Spiers had told me he was working on another project: exploring how people navigate a lifelike environment, and how they respond during evacuations (which, depending on the situation, could be a matter of life or death).
For their research, Spiers and his colleagues set up what they call a “mocked-up art gallery” within PEARL. The center in its entirety is pretty huge as labs go, measuring around 100 meters in length and 40 meters across, with 10-meter-high ceilings in places. There’s no other research center in the world like this, Spiers told me.
The gallery setup looked a little like a maze from above, with a pathway created out of hanging black sheets. The exhibits themselves were videos of dramatic artworks that had been created by UCL students.
When I visited in July, Spiers and his colleagues were running a small pilot study to trial their setup. As a volunteer participant, I was handed a numbered black cap with a square board on top, marked with a large QR code. This code would be tracked by cameras above and around the gallery. The cap also carried a sensor, transmitting radio signals to devices around the maze that could pinpoint my location within a range of 15 centimeters.
At first, all the volunteers (most of whom seemed to be students) were asked to explore the gallery as we would any other. I meandered around, watching the videos, and eavesdropping on the other volunteers, who were chatting about their research and upcoming dissertation deadlines. It all felt pretty pleasant and calm.
That feeling dissipated in the second part of the experiment, when we were each given a list of numbers, told that each one referred to a numbered screen, and informed that we had to visit all the screens in the order in which they appeared on our lists. “Good luck, everybody,” Spiers said.
Suddenly everyone seemed to be rushing around, slipping past each other and trying to move quickly while avoiding collisions. “It’s all got a bit frantic, hasn’t it?” I heard one volunteer comment as I accidentally bumped into another. I hadn’t managed to complete the task by the time Spiers told us the experiment was over. As I walked to the exit, I noticed that some people were visibly out of breath.
The full study took place on Wednesday, September 11. This time, there were around 100 volunteers (I wasn’t one of them). And while almost everyone was wearing a modified baseball cap, some had more complicated gear, including EEG caps to measure brainwaves, or caps that use near-infrared spectroscopy to measure blood flow in the brain. Some people were even wearing eye-tracking devices that monitored which direction they were looking.
“We will do something quite remarkable today,” Spiers told the volunteers, staff, and observers as the experiment started. Taking such detailed measurements from so many individuals in such a setting represented “a world first,” he said.
I have to say that being an observer was much more fun than being a participant. Gone was the stress of remembering instructions and speeding around a maze. Here in my seat, I could watch as the data collected from the cameras and sensors was projected onto a screen. The volunteers, represented as squiggly colored lines, made their way through the gallery in a way that reminded me of the game Snake.
The study itself was similar to the pilot study, although this time the volunteers were given additional tasks. At one point, they were given an envelope with the name of a town or city in it, and asked to find others in the group who had been given the same one. It was fascinating to see the groups form. Some had the names of destination cities like Bangkok, while others had been assigned fairly nondescript English towns like Slough, made famous as the setting of the British television series The Office. At another point, the volunteers were asked to evacuate the gallery from the nearest exit.
The data collected in this study represents something of a treasure trove for researchers like Spiers and his colleagues. The team is hoping to learn more about how people navigate a space, and whether they move differently if they are alone or in a group. How do friends and strangers interact, and does this depend on whether they have certain types of material to bond over? How do people respond to evacuations—will they take the nearest exit as directed, or will they run on autopilot to the exit they used to enter the space in the first place?
All this information is valuable to neuroscientists like Spiers, but it’s also useful to architects like his colleague Fiona Zisch, who is based at UCL’s Bartlett School of Architecture. “We do really care about how people feel about the places we design for them,” Zisch tells me. The findings can guide not only the construction of new buildings, but also efforts to modify and redesign existing ones.
PEARL was built in 2021 and has already been used to help engineers, scientists, and architects explore how neurodivergent people use grocery stores, and the ideal lighting to use for pedestrian crossings, for example. Zisch herself is passionate about creating equitable spaces—particularly for health and education—that everyone can make use of in the best possible way.
In the past, models used in architecture have been developed with typically built, able-bodied men in mind. “But not everyone is a 6’2″ male with a briefcase,” Zisch tells me. Age, gender, height, and a range of physical and psychological factors can all influence how a person will use a building. “We want to improve not just the space, but the experience of the space,” says Zisch. Good architecture isn’t just about creating stunning features; it’s about subtle adaptations that might not even be noticeable to most people, she says.
The art gallery study is just the first step for researchers like Zisch and Spiers, who plan to explore other aspects of neuroscience and architecture in more simulated environments at PEARL. The team won’t have results for a while yet. But it’s a fascinating start. Watch this space.
Now read the rest of The Checkup
Read more from MIT Technology Review’s archive
Brain-monitoring technology has come a long way, and tech designed to read our minds and probe our memories is already being used. Futurist and legal ethicist Nita Farahany explained why we need laws to protect our cognitive liberty in a previous edition of The Checkup.
Achim Menges is an architect creating what he calls “self-shaping” structures with wood, which can twist and curve with changes in humidity. His approach is a low-energy way to make complex curved architectures, Menges told John Wiegand.
From around the web
Scientists are meant to destroy research samples of the poliovirus, as part of efforts to eradicate the disease it causes. But lab leaks of the virus may be more common than we’d like to think. (Science)
Neurofeedback allows people to watch their own brain activity in real time, and learn to control it. It could be a useful way to combat the impacts of stress. (Trends in Neurosciences)
Microbes, some of which cause disease in people, can travel over a thousand miles on wind, researchers have shown. Some appear to be able to survive their journey. (The Guardian)
Is the X chromosome involved in Alzheimer’s disease? A study of over a million people suggests so. (JAMA Neurology)
A growing number of men are paying thousands of dollars a year for testosterone therapies that are meant to improve their physical performance. But some are left with enlarged breasts, shrunken testicles, blood clots, and infertility. (The Wall Street Journal)
David Heinemeier Hansson is the creator of the Ruby on Rails software framework, the co-founder of Basecamp, an investor in multiple tech startups, a race car driver, and a family man. He’s a modern-day polymath.
Yet his workday calendar is not full of appointments. He abhors managing employees and attending meetings. His is a maker’s schedule, he says, with much uninterrupted time dedicated to solving problems he cares about.
In our recent conversation, his second in 16 years, Heinemeier Hansson addressed the rise of Rails, Basecamp, and, yes, time management.
The entire audio of our discussion is embedded below. The transcript is edited for clarity and length.
Eric Bandholz: Give us your pitch.
David Heinemeier Hansson: I am a co-owner of 37signals. We make software products. Our original tool is Basecamp, a project management tool we’ve been running for over 20 years. Hey.com is the email service we launched a few years ago and an alternative to Gmail. I also write a lot with my business partner, Jason Fried.
We’ve written four books on starting a business, running a business, and thinking about business. We published “Rework” in 2010, which sold a million copies worldwide. We also wrote “Remote: Office Not Required,” “It Doesn’t Have to be Crazy at Work,” and “Getting Real: The smarter, faster, easier way to build a successful web application.”
As part of building Basecamp in 2003, I created Ruby on Rails, the web framework behind Shopify, GitHub, and Airbnb. It was the original Twitter platform and powers about a million other prominent websites and applications worldwide.
I still work on that. We’re just putting the final touches on Rails 8, a big upgrade for a framework that’s also been around for 20 years and is powering 10% of worldwide ecommerce. That is what Shopify is responsible for. If you add on whatever else in the ecommerce world runs on Rails, it’s probably a higher number. Shopify is the largest Rails application. It’s 5 million lines of code and a huge portion of all ecommerce worldwide.
In my free time, I like racing cars. I’ve been driving race cars for about 15 years, mainly endurance events. The 24 Hours of Le Mans is my pivotal moment.
Bandholz: How do you prioritize your day?
Heinemeier Hansson: From the outset, Jason and I were on the same page about setting good habits early. We had seen so many entrepreneurs try to do the mode switch and fail. They’ll work 80, 100 hours a week in the early days and get accustomed, if not outright addicted, to that style of working.
We designed the business from the get-go so that we would work 40 hours a week, eight hours a day. That’s plenty. Negative things often happen when you push beyond that, when you are so focused on work that you miss other things. You don’t have the right perspective on stuff. And you also think it’s all about input, which it’s not. It’s all about output.
After dropping my three kids off at school in the morning, I have a block of time, and I make it count. I’ve repeatedly seen entrepreneurs take pride in bragging about how much they work. It usually means sitting in front of a computer for many hours, but what’s the output of those hours?
The way I make them count is through long stretches of uninterrupted time. I try to be on a maker’s schedule most days of most weeks. That’s not a luxury I can do every day or every week, but it is surprisingly easy to structure your business so that you don’t have a day full of meetings.
When I look at my schedule, very often it’s empty. It’s full of one long, beautiful block of uninterrupted time that I can dedicate to solving the problems I care deeply about, and that requires me to think for more than 20 minutes here or 40 minutes there or whatever crumbs are left over. We’ve designed 37signals not to need that level of constant minding and intervention.
We don’t have status update meetings where we sit around in a circle and tell each other what we’ve done. We use Basecamp’s automated questions. It’ll ask every employee on Monday morning, “What will you work on this week?” They will record it for the whole company to know, not just to their manager, not just to me, not just to Jason, but to everyone.
So the entire staff is in the loop on what’s happening in the business. At the end of every day, the system asks, what have you worked on today? That clock frequency allows me to check in on the business, to develop trust that the people we’ve hired are doing the work we intend for them to do and that they’re going in the right direction without me constantly supervising them.
It is incredible how much time you have in a 40-hour week when no one is constantly bothering you. Forty hours is a luxurious amount of time to make progress, but most people don’t see it that way because they squander it. They cut it into little bits, and then they end up Friday afternoon going, “Oh, man, I was so busy this week. What did I get done?”
Because we don’t work like that, we have room for kids, racing, hobbies, vacations, and time off while still progressing on Basecamp and Hey. We’re working on two new products simultaneously. I’m working on Rails 8, and I write a bunch. I can clear the decks and get stuff done.
Bandholz: How much insight are you looking to get from your team on those daily updates?
Heinemeier Hansson: I’m expecting a story. It can focus on whatever you want to emphasize. This is one of the reasons why we collect this information in an open text field. It’s not derived from what to-dos you’ve checked off or the files you uploaded. It’s not automated. It is an opportunity to reflect on what you did today that was important and that you would like to convey to others. Sometimes, the answer is pretty mundane, “I worked on this same project. Here’s a quick anecdote about an issue I encountered and why it was hard, and why it sucked up a lot of my time.”
Often, those anecdotes become conversation starters in the comment thread for that update. Maybe I’ll chime in. “I hadn’t seen that problem or seen it elsewhere, and here’s how I solved it. Maybe you can do that too.” Or someone else from another part of the business goes, “Actually, we had a customer ask about that.” The updates in Basecamp are public to everyone in the company. If you work in an office and occasionally have that hallway or water cooler conversation, it’s usually contained to your team. When you do it on Basecamp, everyone gets to see everything. We’re 60 people, and it works excellently.
Bandholz: You’re not reading all 60, right?
Heinemeier Hansson: No, I scan. I usually scroll through most of these check-ins daily or weekly. Something will catch my eye, and I can scroll back up. I can consume the status updates of 60 people in about five minutes.
We have zero full-time managers. Out of the 60 people we have, everyone, including Jason and me, treats management as a second job to put on only when necessary.
Bandholz: Where can people follow you?
Heinemeier Hansson: Dhh.dk is my website. I’m also on X, @dhh.