AI is rapidly changing how search engines like Google rank websites. To stay competitive, it’s essential to know which ranking signals matter most and how to optimize for them.
Join us for our upcoming webinar on October 16th, “Optimizing For Google’s New Landscape And The Future Of Search.”
We’ll explore how these changes are impacting SEO strategies and what you can do to succeed in 2024 and beyond.
You’ll leave with a collection of actionable insights that will help you craft a winning SEO strategy and navigate the disruption successfully, while getting optimal engagement and ROI.
Why This Webinar Is A Must-Attend Event
Google’s AI-driven algorithm is transforming search rankings, and this webinar will provide you with the expert insights and actionable strategies needed to stay ahead of the competition.
Here’s what you’ll learn:
The top ranking signals to optimize in order to increase your search visibility.
Expert-level SEO strategies that will improve performance right now.
Insights on how to optimize your website to win rich search result types.
Expert Insights From Nathan Endres
Nathan Endres, Sr. SEO Analyst at Conductor, will walk you through the ranking signals that still matter, and how to fine-tune your content strategy to maximize engagement and ROI in an AI-first world.
Who Should Attend?
SEO professionals who want to understand Google’s AI-driven algorithm and the strategies to gain visibility in AI-powered search results.
In-house SEO and marketing professionals building strategies for the next year.
Agency professionals who want to prepare for the SERPs of 2025.
Live Q&A: Get Your Questions Answered
Got questions about Google’s future direction? Stick around for a live Q&A session where Nathan will answer your pressing SEO concerns and provide additional insights.
Can’t make it? No problem! Register now, and we’ll send you the recording after the event.
Don’t miss this opportunity to decode Google’s latest updates and gain a competitive edge.
Keywords are the foundation of SEO. Although content is king, keywords come first: they decide what sorts of users will find you in search. And since you want to be found by the right users, you’d better choose your keywords wisely.
What kind of keywords are good for your site?
They have a high search volume.
In non-SEO terms, it means lots of people type those keywords into search bars. A few hundred searches per month is good, but the more, the better.
They accurately capture search intent.
The relationship between a site owner and users works like any business transaction: if you don’t offer them what they want, they won’t take it.
It’s like buying new shoes. If you are an adult with a size 7.5, you are not going to buy children’s shoes (not for yourself, anyway). And if you shop for generic shoes without anything specific in mind, it will take you forever to find what you really need.
Keywords are much the same. If you have an online store where you sell shoes, then a product page optimized for the keyword “shoes for women size 7.5” will do a much better job than one saying “shoes for women” or even just “shoes.” Bottom line: use keywords which describe precisely what your target audience wants to find.
They aren’t too competitive.
High competition for a keyword means many other sites are already ranking for it – and beating them all won’t be easy. But pretty much every keyword has a less competitive version. You just need to find and use it.
How do you find keywords which match all these criteria?
For search intent, you must know your target audience and their needs really well, and then use your best judgment. Other factors can be represented in numbers, and that’s where SEO tools come in, such as WebCEO’s Keyword Suggestions tool.
Screenshot from WebCEO, September 2024
Do you have any keyword ideas of your own? Enter them in the field and press Search. The tool will generate a table of related keywords, and then you just pick the most promising ones.
2. Optimize Your Pages With Keywords
Got your keywords? Great. Now, you need to make sure you are using them well.
For maximum effectiveness, your site pages must have keywords in these places:
Page URL
Page title
Meta description
H1-H4 headings (even better if you have a table of contents)
Throughout the text itself
Image filenames and ALT texts (for Google Image Search)
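To make the checklist above concrete, here is a minimal sketch, using only Python’s standard library, that checks whether a target keyword appears in a page’s title, meta description, headings, and image ALT text. The sample HTML and the `keyword_coverage` helper are hypothetical illustrations, not part of any particular SEO tool.

```python
from html.parser import HTMLParser

class KeywordAudit(HTMLParser):
    """Collect the text of SEO-relevant fields from an HTML page."""

    def __init__(self):
        super().__init__()
        self.fields = {}      # field name -> collected text
        self._capture = None  # tag whose inner text we are collecting

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("title", "h1", "h2", "h3", "h4"):
            self._capture = tag
        elif tag == "meta" and attrs.get("name") == "description":
            self.fields["meta description"] = attrs.get("content", "")
        elif tag == "img":
            # Accumulate ALT texts from all images on the page.
            self.fields["img alt"] = self.fields.get("img alt", "") + " " + attrs.get("alt", "")

    def handle_data(self, data):
        if self._capture:
            self.fields[self._capture] = self.fields.get(self._capture, "") + data

    def handle_endtag(self, tag):
        if tag == self._capture:
            self._capture = None

def keyword_coverage(html, keyword):
    """Return {field: True/False} showing where the keyword appears."""
    parser = KeywordAudit()
    parser.feed(html)
    kw = keyword.lower()
    return {field: kw in text.lower() for field, text in parser.fields.items()}

# Hypothetical product page following the checklist above.
page = """<html><head><title>Shoes for Women Size 7.5 | Example Store</title>
<meta name="description" content="Buy shoes for women size 7.5 online.">
</head><body><h1>Shoes for Women Size 7.5</h1>
<img src="shoes.jpg" alt="red shoes for women size 7.5"></body></html>"""

coverage = keyword_coverage(page, "shoes for women size 7.5")
```

Any field that comes back `False` is a spot where the keyword is missing and could be added.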
If your on-page SEO tool finds any spots that could be filled with keywords, fill them and run another scan afterward. Instant improvement before your eyes!
One more thing: while having keywords is a must, avoid going overboard with them. One set of related keywords per page, or even one keyword per page is usually enough. Then weave the keywords into your text in a natural-sounding way. The gold standard for content is normally written text with helpful information.
3. Optimize Your Site Structure
It’s easy to turn your website into a poorly interlinked mess if you don’t know what you are doing.
When you do know what you are doing, you can help your most important pages receive a significant ranking boost – just by placing links correctly.
Your users will appreciate it, too. Who doesn’t like having all the content they need at their fingertips?
So here’s the recipe for an optimal site structure:
Page hierarchy. Picture a tree: the home page as the root and the destination pages (i.e. landing pages, product pages, blog articles) at the ends of the branches.
Screenshot from IncreMentors.com, January 2024
Topic clusters. It’s good practice to interlink pages that are dedicated to related topics.
Navigation bar. A bar at the top (less commonly on the left side) of the screen, containing links to the most important site pages (e.g. home page, About Us, Contact Us).
Footer bar. Another bar at the bottom of a page, containing the same links from the navigation plus some others, at your discretion. Often, the footer bar contains social media links.
Breadcrumbs. Have you ever seen a bunch of links in a row, something like Home » Category » Subcategory » Page? They are called breadcrumbs and they help users keep track of where exactly they are on a website.
Three-click rule. An unspoken rule says: users should be able to get from any page A to any other page B in three clicks or fewer.
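The three-click rule is easy to verify programmatically. Here is a sketch that runs a breadth-first search over the internal link graph and reports how many clicks the deepest page is from the home page; the site map below is a hypothetical example, not a real crawl.

```python
from collections import deque

def max_click_depth(links, start):
    """Breadth-first search over the internal link graph; returns the
    number of clicks needed to reach the farthest reachable page."""
    depth = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depth:
                depth[target] = depth[page] + 1
                queue.append(target)
    return max(depth.values())

# Hypothetical site map: each page maps to the pages it links to.
site = {
    "home": ["category", "about", "contact"],
    "category": ["subcategory"],
    "subcategory": ["product"],
    "about": [], "contact": [], "product": [],
}

# "product" is 3 clicks from "home" (home -> category -> subcategory -> product),
# so this structure just satisfies the three-click rule.
deepest = max_click_depth(site, "home")
```

If the result exceeds three, adding a link from a shallower page (a breadcrumb, a topic-cluster link, or a navigation entry) brings the deep page back within reach.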
But to use links on your site like a pro, you want to know exactly how much authority your web pages have. And you can find out with the right SEO tools.
This tool will reveal the pages with the highest amount of link juice. Share that authority with your most valuable pages by linking to them from those high-authority pages.
This practice is at its most effective when the interlinked pages are related to each other through their topics – in other words, when they form a topic cluster. For example, a page about the best toothbrushes and another about the best toothpastes. It’s natural to link the two together, so both are likely to see a slight ranking boost.
4. Max Out Your Loading Speed
How long is too long? Five seconds may not seem like much, but if that’s how long it takes your page to load, most users will have already left.
People hate slow-loading pages. People hate waiting in general. Whatever the place or the website, everybody wants to be served without delay.
And Google concurs. That’s why site loading speed is a major ranking factor, one you absolutely must not neglect.
Not only does it measure your pages’ loading speed and Core Web Vitals, it also offers constructive criticism by detecting what’s slowing your website down. Just follow the tips from the report and watch your website soar.
And remember to be on constant alert for any slow loading site pages. Set the Speed Optimization tool to send you regular reports, and if you find a page that’s dragging its feet, help it take off.
Screenshot from WebCEO, September 2024
5. Audit Your Site For Errors — And Fix Them
Nothing is perfect, not even the best website in the world. Things break, errors appear. But no self-respecting site owner will let things stay broken – that’s a recipe for losing your customers!
This tool detects all kinds of hiccups, from broken links to more serious issues like server errors. Look upon your report and do not despair. It’s merely a list of fixable things.
You can solve those problems yourself or send the report to your site admin and let them handle it. After the job is done, rescan your site and generate another report showing the drop in errors. Your client will love it.
And yes, the Technical Audit tool can also send automated scheduled reports.
Screenshot from WebCEO, September 2024
6. Check The Quality Of Your Backlinks
What do you think is the number one ranking factor? Which one of them can give you the highest ranking boost?
The hint is right there in the heading above. That’s right: backlinks.
Links from other sites pointing to yours. If your site isn’t on Google’s #1 page, then lack of good backlinks is most likely why (assuming everything else is okay).
To see if you have a backlink problem, you need to check the current state of your link profile.
Total backlinks and linking domains. The ratio between them can give you a rough idea about how many links each domain gives you on average. If that ratio is too high (e.g. 1000 backlinks per domain), then most of those backlinks are probably of poor quality.
Loss of backlinks. Sometimes sites stop linking to you. Maybe they found someone with better content than yours, maybe they took down the page with the backlink, or maybe even their site died. Whatever the reason, it can affect your rankings negatively.
Backlink texts. A good anchor text tells users what they are going to find on the other side of the link. If it fails to do that, fewer people will click on the link. Look for non-descriptive anchor texts (such as “click here”) that are keeping your rankings down – changing those texts can be just what you need.
Harmful backlinks. Spammy links from low-quality sites will do you no favors. If you have too many toxic backlinks, you will have to remove or disavow them.
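Two of the checks above, the backlinks-per-domain ratio and non-descriptive anchor texts, are simple enough to sketch in code. The numbers and anchor list below are hypothetical examples, not real data:

```python
def backlinks_per_domain(total_backlinks, linking_domains):
    """Average number of backlinks per referring domain. A very high
    ratio usually means sitewide or low-quality links dominate."""
    return total_backlinks / linking_domains

# Non-descriptive anchors that tell users nothing about the target page.
GENERIC_ANCHORS = {"click here", "read more", "here", "link", "this"}

def flag_generic_anchors(anchors):
    """Return the anchor texts worth rewriting into descriptive ones."""
    return [a for a in anchors if a.strip().lower() in GENERIC_ANCHORS]

# 50,000 backlinks from only 50 domains: 1,000 links per domain,
# a strong hint that most of them are low quality.
ratio = backlinks_per_domain(50_000, 50)

flagged = flag_generic_anchors(["Click here", "best running shoes 2024", "Read more"])
```

In practice you would feed these helpers the export from your backlink tool, but the thresholds themselves remain a judgment call.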
Knowing the state of your link profile opens two different paths to improving it: link building and link detoxification. Let’s start with the former.
7. Revise And Expand Your Link Building
If you want to gain new backlinks and increase your site rankings, you’ll want to do some link building.
Which sites give the best backlinks?
They are highly authoritative;
They are topically related to your site.
And the closer sites fit these criteria, the harder it will be to land your backlinks there. Those sites have high standards. Here are a few ways to earn their links:
Have high-quality (and ideally unique) content on your site that others will want to link to;
Find broken links on other sites and suggest that those sites’ owners link to your content instead;
Find unlinked mentions of your site or brand and offer to add a backlink;
And we strongly encourage you to try out even more. You may find some of the link building strategies easier or more effective than others.
What About Steps 8-14?
You bet it’s just the beginning. Do you want to take up even more SEO techniques to start preparing for 2025?
Good news: the full SEO guide is exclusively available to WebCEO users in PDF format, and it’s completely free. Download it now and get a head start on your competitors!
Automattic CEO and WordPress co-creator Matt Mullenweg announced a new Executive Director for WordPress.org after the previous director’s resignation. Social media reactions, while generally positive, were notably subdued, with many comments focused on the recent WordPress controversy.
New Executive Director
Mullenweg announced that Mary Hubbard was hired as the new Executive Director. Hubbard served as Chief Product Officer for WordPress.com starting in 2020 and most recently resigned as the Head of TikTok Americas, Governance and Experience. She will begin her new position on October 21st.
The Executive Director position at WordPress.org opened up after the resignation of 8.4% of Automattic employees, including the previous Executive Director, Josepha Haden Chomphosy. Mullenweg offered employees who wished to leave $30,000 or the equivalent of six months’ pay, whichever was higher. The severance package was offered after the recent conflict between Automattic, Mullenweg, and WP Engine (WPE), which resulted in WPE filing a federal lawsuit against Mullenweg and Automattic, alleging attempted extortion.
Muted Response To Announcement
A post in the popular Dynamic WordPress Facebook Group generated 21 responses within seven hours, with most of the comments discussing the recent drama, Mullenweg’s ownership of WordPress.org, and other similar topics (the discussion is visible only to members of the private group).
The response to the official WordPress.org announcement on X was muted, with about equal numbers of people posting welcomes as taking the opportunity to post their displeasure and opinions about recent events.
Seven hours after posting the announcement there were only 15 responses, 21 retweets, and 117 likes.
What does the E.D. do for https://t.co/sbi8NmJkOL? Is https://t.co/sbi8NmJkOL part of the Foundation? Part of Automattic? Something else? Unlike WPEngine, the differences between these organizations, their ownership, and governance are incredibly confusing.
Adobe has announced a new tool to help creators watermark their artwork and opt out of having it used to train generative AI models.
The web app, called Adobe Content Authenticity, allows artists to signal that they do not consent for their work to be used by AI models, which are generally trained on vast databases of content scraped from the internet. It also gives creators the opportunity to add what Adobe is calling “content credentials,” including their verified identity, social media handles, or other online domains, to their work.
Content credentials are based on C2PA, an internet protocol that uses cryptography to securely label images, video, and audio with information clarifying where they came from—the 21st-century equivalent of an artist’s signature.
Although Adobe had already integrated the credentials into several of its products, including Photoshop and its own generative AI model Firefly, Adobe Content Authenticity allows creators to apply them to content regardless of whether it was created using Adobe tools. The company is launching a public beta in early 2025.
The new app is a step in the right direction toward making C2PA more ubiquitous and could make it easier for creators to start adding content credentials to their work, says Claire Leibowicz, head of AI and media integrity at the nonprofit Partnership on AI.
“I think Adobe is at least chipping away at starting a cultural conversation, allowing creators to have some ability to communicate more and feel more empowered,” she says. “But whether or not people actually respond to the ‘Do not train’ warning is a different question.”
The app joins a burgeoning field of AI tools designed to help artists fight back against tech companies, making it harder for those companies to scrape their copyrighted work without consent or compensation. Last year, researchers from the University of Chicago released Nightshade and Glaze, two tools that let users add an invisible poison attack to their images. One causes AI models to break when the protected content is scraped, and the other conceals someone’s artistic style from AI models. Adobe has also created a Chrome browser extension that allows users to check website content for existing credentials.
Users of Adobe Content Authenticity will be able to attach as much or as little information as they like to the content they upload. Because it’s relatively easy to accidentally strip a piece of content of its unique metadata while preparing it to be uploaded to a website, Adobe is using a combination of methods, including digital fingerprinting and invisible watermarking as well as the cryptographic metadata.
This means the content credentials will follow the image, audio, or video file across the web, so the data won’t be lost if it’s uploaded on different platforms. Even if someone takes a screenshot of a piece of content, Adobe claims, credentials can still be recovered.
However, the company acknowledges that the tool is far from infallible. “Anybody who tells you that their watermark is 100% defensible is lying,” says Ely Greenfield, Adobe’s CTO of digital media. “This is defending against accidental or unintentional stripping, as opposed to some nefarious actor.”
The company’s relationship with the artistic community is complicated. In February, Adobe updated its terms of service to give it access to users’ content “through both automated and manual methods,” and to say it uses techniques such as machine learning in order to improve its vaguely worded “services and software.” The update was met with a major backlash from artists who took it to mean the company planned to use their work to train Firefly. Adobe later clarified that the language referred to features not based on generative AI, including a Photoshop tool that removes objects from images.
While Adobe says that it doesn’t (and won’t) train its AI on user content, many artists have argued that the company doesn’t actually obtain consent or own the rights to individual contributors’ images, says Neil Turkewitz, an artists’ rights activist and former executive vice president of the Recording Industry Association of America.
“It wouldn’t take a huge shift for Adobe to actually become a truly ethical actor in this space and to demonstrate leadership,” he says. “But it’s great that companies are dealing with provenance and improving tools for metadata, which are all part of an ultimate solution for addressing these problems.”
It can also automatically import your ad campaigns from your Google Ads, Facebook, and Instagram accounts, create a new note when you publish an article, and annotate confirmed Google algorithm updates.
The tool offers a limited free version. Paid plans start at $39 per month with a free two-week trial.
Siteimprove announced the acquisition of MarketMuse, creating a comprehensive SaaS solution for content, accessibility, and SEO. This unifies vital marketing processes, benefiting customers of both organizations with a single, integrated platform.
MarketMuse
MarketMuse is a leading AI content planning software that helps users research, plan, and execute a scaled content strategy. It enables users to analyze their content to understand whether it adequately covers a topic, scale that analysis across the entire topic, and create content briefs that take the guesswork out of a content calendar, enabling an organization to consistently publish high-quality, authoritative content.
Siteimprove
Siteimprove is a platform for analyzing content for SEO and accessibility as well as continuous site monitoring for issues.
MarketMuse’s Jeff Coyle wrote:
“I’m excited to announce that MarketMuse has entered a definitive agreement to be acquired by Siteimprove, one of the biggest players in martech!
Siteimprove’s known far and wide for assembling accessibility, digital governance, analytics, SEO, and cross-channel advertising into one platform.
The acquisition spells transformation: Marketers of all stripes will be relieved of attending to the ever-changing technical details that shroud their work. It means that you will be better able to focus on transformative strategy rather than minutiae — and build better digital experiences that are meaningful, credible, and deliver results.”
The announcement states that MarketMuse customers will have a more unified approach to SEO, Accessibility and Content Optimization from one SaaS platform.
To make it easier for advertisers, Google said “These videos can be downloaded to use across Ads campaigns, Merchant Center, your website, or other marketing channels.”
Video generation is available in Merchant Center, or the Google and YouTube app within Shopify.
How to Get Started
In order to start generating videos in Merchant Center, you’ll need to have brand information set up within the platform.
This includes your brand colors and logo.
Then, within Merchant Center Next, navigate to the “Products” page, then click “Product Studio.” To choose a product, click “Get started”.
From there, select the product you want to generate a new video for.
This is where you’ll select a theme and choose from a variety of the optional add-ons, like a headline or audio tracks.
The video generation can take a few minutes, and then you’ll have the option to generate a high-res video or download a low-res video for easy sharing.
Summary
As of the announcement, video generation in Product Studio is only available for merchants in the United States.
Google does plan to expand to additional countries in the near future, but there is no concrete timeline as of yet.
Creating product videos for cross-channel marketing efforts is about to get more streamlined for many advertisers, right in time for the holidays.
The update can save marketers valuable time and resources on creatives, allowing them to shift focus to other high-priority items that may arise during a busy Q4 season.
The ascension of Reddit to one of the largest sites on the web over the last 15 months is quite controversial and unique. Never in the history of SEO has a site grown that fast to such a level.
In a recent interview on the Motley Fool Money podcast, CEO Steve Huffman paints a picture of how large sites can succeed on Google, and it’s one you shouldn’t miss.
Image Credit: Kevin Indig
I extracted the key quotes and added my own takes below. Every quote is verbatim, but I removed filler words.
The takeaway questions at the end of each section hopefully inspire you to find new growth opportunities.
I also uploaded this Memo to NotebookLM’s new podcast feature, so you can optionally listen to it.
I’d love to hear your opinion in the comments about whether you prefer reading or listening to Memos!
On: Growth
We made sign up much, much easier. We made both the website and the app much faster. We redesigned it in a lot of little ways so it’s easier on the eyes. There are fewer bugs. And our home feed has gotten much better at making recommendations of communities that you might like. [We’re] getting people into their home on Reddit and then finding all their interests much more effectively.
We used to be more aggressive about ‘hey, login, download the app’. That worked in the short term, but long term, it was just kind of annoying because in that moment, that person probably has a question and Reddit likely has the Answer, but they’re not looking to be on Reddit in that moment. They’re trying to do something. I’m trying to buy this thing or I’m trying to get an answer to this question.
Our attitude now is give the person what they want. Give them the answer, let them see all the content, let them go about their day and trust that we’ll see them again on the front page or opening the app when they’re more primed to have the community experience. Every time they come to Reddit and get the Answer, they’re learning ‘Reddit has the Answer to my questions’. That alone is really valuable.
Logically, the more friction you remove for users, the easier it will be for them to solve their problems, and the better the signals you send to Google.
The key point here is that these changes are related to the product, not just the website. The product experience influences SEO.
The effect of positive user signals is often masked by time, as Google takes months to collect user behavior for queries and might only slowly reward sites.
The slow pace starkly contrasts the fact that companies are often incentivized to harvest short-term gains, usually by adding friction to the experience rather than removing it. While there is a balance to be had, the result is usually worse user engagement signals.
Brand recognition in the search results plays into the same challenge: When visitors have a good experience with a site or product, they’re more likely to click on it again when they see it in the search results.
If they encounter too much friction signing up or a poor product experience, though, that opportunity goes out the window.
Takeaway questions:
How can you improve your product and onboarding experience?
Do you have too much friction in the sign-up process to allow users a good experience?
Where can you take friction away, and where do you need to find a compromise?
How do you measure user experience on the site vs. in the product?
Either you haven’t heard of Reddit or it didn’t work for you. Those are the two we’re really focused on. There’s a third one, which is you don’t speak English. That’s the next frontier of Reddit.
We can actually translate the existing Reddit corpus into other languages at human quality. Now, not all the content is relevant, but a lot of it is. We have been testing this in France, in French in the first half this year, and it’s gone very, very well.
After winning in the U.S., international markets are a huge growth lever for Reddit, and machine translation has become good enough for most cases. The Hidden Gems update initiated Reddit’s rise in the SERPs – not just in the U.S. – and Reddit needs to capture the momentum.
As a marketplace, it faces the classic chicken-egg problem: You need content to attract users, but users need to create the content.
In the U.S., Reddit has famously solved the problem with fake users.
In international markets, Reddit can use the content it already has to stimulate new content creation and “make the site feel alive.” The key is to get the translation good enough, and that’s where Reddit uses machine learning.
Takeaway questions:
What assets do you have in your core market that you could leverage to enter new markets?
Can you use machine translation to get to “good enough” quickly?
Do you have momentum in INTL markets that you should capitalize on?
On: SEO
We made our website substantially faster – two to five times faster. We launched this in May of 2023. Googlebot likes speed, and faster pages rank higher and get indexed faster.
When our website got a lot faster, we started ranking higher. Users are having a better experience on Reddit. It creates this Flywheel that we’re really benefiting from as we see a lot of new and core users coming from search.
A lot of SEO pros miss this: Google crawls and indexes faster sites more.
As a direct ranking factor, speed and Core Web Vitals optimization have the biggest impact on ecommerce.
I don’t recommend prioritizing it for other types of sites – unless you see a high number of “Discovered – currently not indexed” or “Crawled – currently not indexed” pages in your Google Search Console Pages report in combination with low CWV scores. As a result, crawl and indexing rates are relevant metrics for site speed as well.
Takeaway questions:
Could (server) speed be the reason you’re seeing a high number of “Discovered – currently not indexed” or “Crawled – currently not indexed” pages in GSC?
Could you slim down the amount of stuff Google has to download to render the page without a massive resource investment?
We have no idea how search works. Nobody does. Right? Right. Nobody does.
Google algorithm and product changes sometimes help, sometimes hurt, but we don’t live or die by them by any means.
The art of SEO is leaning into it really hard and then diversifying. It’s like investing: Double down on something and diversify once you have wealth.
A common approach to getting wealthy is to double down on one investment and diversify once you’ve reached a certain return to hedge your bets. SEO for marketplaces should be no different. The question here is how dependent Reddit is on Google for growth.
My take is that Reddit depends on Google for its top-of-the-funnel (TOFU) but provides a good enough experience that users would come to Reddit even if a Search algo update brought it back to its 2022 baseline.
Over 50% of Reddit’s traffic comes from SEO and 42% direct, according to Similarweb. But Search is not Search. There is branded and non-branded SEO traffic.
A significant number of searchers append “Reddit” to their queries, which is an incredibly strong ranking signal for Google and shows that users want Reddit results specifically.
Reddit also saw strong user growth due to its exploding presence in Search. So, both are true: Reddit needs Search to grow but wouldn’t die without its front-row seat.
Takeaway questions:
Are you at the point at which you should diversify from SEO?
How can you stimulate more brand searches?
On: Brand Search
If you go to Google Trends, you can see this: Reddit is the sixth most searched word on Google in the U.S. in 2024. Number five is news, and maybe number eight is maps.
People are going to Google looking for Reddit. A lot of those users are already logged in. They’re actually core Reddit users. They’re using Google to navigate Reddit. If you’re just searching on the Internet, there’s a good chance you end up on Reddit.
This quote goes back to my previous point and addresses the common criticism that Reddit’s search is so bad that users need to use Google. But isn’t that in Reddit’s interest?
If Google is Reddit’s TOFU and searches that include “Reddit” are a strong signal, why would Reddit improve its onsite search and kill that behavior?
Reddit needs to thread the needle and make the experience good enough that users sign up once they find a Reddit result, but not so bad that users can’t find anything on Reddit.
To be fair, the chance of coming across a Reddit result on the web sooner or later is incredibly high since the platform is huge. It also hosts many small but passionate sub-communities that form one large community.
Unlike Reddit’s competitors, which are mostly small niche forums, Reddit’s footprint on the web is large enough to allow poor site search. Not every business can get into such a position, but some can.
Takeaway questions:
Do users love you so much that they would search your site on Google even if your onsite search were poor?
Is your footprint large enough that users would come to you either way?
On: Monetization
Our ad server doesn’t care if you’re logged out or logged in. They both have a user id. The main difference between a logged in user and a logged out user is logged in users spend more time on Reddit. So, we’ll have a more fulsome view of what your interests are, because over time, people join more and more communities on Reddit.
You might have 100 subscriptions or more, and a logged out user doesn’t have any. They may have just visited a few subreddits. The main difference in value to us between a logged in user and a logged out user is time spent: the logged in accounts just have more inventory. But we monetize logged out users as well.
There’s broadly two ways that we’ll target an ad. One is based on your explicitly expressed interest on Reddit. If you join the skiing subreddit, you’re likely to see outdoor ads. The other is the context of what you’re looking at.
If you’re on a comments page, we call them post detail pages, that page is likely mentioning a company or companies by name and often specific products. We can target an ad based on the context as well.
We think targeting based on your explicit interests or the context of what you’re looking at are healthy and explainable and not creepy ways of targeting ads. What we don’t do is we don’t target ads based on your personal information, your Internet browsing habits.
Reddit’s ad targeting system is very similar to Google’s. Instead of tracking user behavior and interests, Reddit and Google target ads based on the user’s search query or the subreddits and posts they view.
The benefit is not just a lower “creep factor” but also less dependency on logged-in users.
As a matter of fact, about half of Reddit’s daily active users (DAU) are logged in and half logged out.
Meta, for example, couldn’t operate under these circumstances. They need more logged-in users for ad targeting.
Takeaway question:
What intent can you derive from pages users visited, e.g., with a customer journey intelligence platform or attribution model?
30% to 60% of our users are not on the other platforms.
40% of conversations on Reddit are about products or product recommendations.
First of all, these are stunning statistics. Reddit has a unique audience, which is rare for a social platform, and it speaks to the passion and engagement of its many sub-communities.
Second, the fact that almost half of Reddit’s conversations are about products might explain why Google ranks Reddit so highly for so many product-related keywords.
Reddit’s visibility in the space seems to be justified by its content and Google’s ambition to display better content for products.
Cases like House Fresh, an affiliate site that was punished badly by Google for unknown reasons while big affiliate sites seem to be sloppy with their product reviews, highlight how hard it is to find good affiliate content.
I also referenced a study from Germany that shows how easy it is to identify affiliate content based on borderline spammy optimization.
Even though Reddit is by no means perfect and needs to find ways to combat spam, it’s still a place on the web where users can find unfiltered product reviews.
Takeaway questions:
How can you be present in Reddit’s product conversations in an open, transparent way?
Should you start a Subreddit or engage more passively?
What can you learn from product conversations about your space on Reddit?
On: Moderation
Every piece of content starts at zero points. Human beings have to vote a piece of content up, not down, to make it popular. Stuff that’s out of alignment with the community, or being a jerk in the comments, is likely to get voted down. In that sense, every user is a moderator on Reddit because every user can vote.
Then we have users called moderators. They’re not employees. These are the users who create communities on Reddit. They write the rules that can be as strict as they want for their communities. They write that and enforce it for themselves.
Of course, we have our own safety team. Those are our employees. We enforce our policies at scale. We have all sorts of fancy tooling for doing so. By the way, we expose much of that tooling, all the AI stuff to the user, moderators as well. They have all sorts of filtering and this and that. By and large, Reddit is a really safe and welcoming place because it’s organized by community.
Again, Google’s decision to raise Reddit’s search visibility is controversial, and Reddit is by no means clean of spam.
Yet, Reddit’s multi-layered moderation of votes, Karma, moderators, and safety teams makes it a good experience for most users.
Takeaway question:
As a user platform, how can you leverage your user base for moderation, such as Twitter’s community notes?