Silos don’t cut it anymore. User journeys are too complex for you to view and track channels separately.
To improve your campaign performance, you need a holistic view of your marketing activities and how they intertwine. This is especially true for organic and paid search strategies.
You need to be front and center with your ideal customers at multiple touchpoints, including active interactions and passive awareness. An ideal marketing strategy has paid and organic campaigns working in tandem, and it’s becoming harder to succeed without doing both.
If you’re looking to drive quality growth in your own campaigns, iQuanti can help.
Join us live on July 24 as we delve into this intricate relationship between organic and paid search channels. You’ll get actionable insights for measuring success to maximize their combined potential.
You’ll gain a comprehensive, data-driven understanding of how to measure, analyze, and optimize holistic search marketing efforts, ensuring sustainable growth and superior ROI for your business.
You’ll walk away with:
Integrated Metrics and KPIs: Learn how to define and track key metrics to capture the performance of your organic and paid search campaigns, so you can make informed strategic decisions that work.
Attribution Models: You’ll see firsthand how strong attribution models are crucial to understanding your customers’ journeys, allowing you to identify influential touchpoints and allocate budget effectively for maximum ROI.
Optimization Strategies: You’ve gathered data from your campaigns…now what? Take the data and leverage it to further optimize your paid and organic search campaigns, increasing conversions along the way.
Shaubhik Ray, Senior Director of Digital Analytics Solutions at iQuanti, is an expert at crafting holistic search strategies to reach more of your ideal audiences at relevant stages in their journeys. Now, he’s ready to share his insights with you.
You’ll walk away equipped with the knowledge and tools necessary to execute a combined organic and paid strategy that improves the performance of each channel. You’ll gain data-driven insights on how to align a combined strategy with business goals and lead your organization to success.
Sign up now and prepare to maximize the potential of combining your organic and paid campaigns.
At the end of the presentation, you’ll get a chance to ask Shaubhik your burning questions in our live Q&A, so be sure to attend.
And if you can’t make it that day, register here and we’ll send you a recording following the webinar.
Are you certain that the content you’re publishing on your website is 100% original?
Steering clear of plagiarism is a top priority for content creators, educators, businesses, and others in order to maintain credibility and avoid legal issues – among other things.
While Copyscape has long been one of the most well-known and popular options for plagiarism checking, the range of available tools has expanded significantly, with various features designed to meet people’s unique needs.
In this article, we will cover the basics of plagiarism – what it is, why you should check for it, how to check, and what to do if someone plagiarizes your content – before highlighting some of the top alternatives to Copyscape, helping you keep your content unique and valuable.
What Is Plagiarism?
Plagiarism is when you use someone else’s work, whether words or ideas, and present it as your own without proper attribution.
Plagiarism can range from directly copying someone’s work to closely paraphrasing something without acknowledging the source. Sometimes, it’s purposeful, while other times, the perpetrator might not even realize they’re doing it.
Regardless of intent, plagiarism is a widespread problem that is difficult to combat – but the first step is detecting it.
Why It’s Important To Check For Plagiarism
The consequences of plagiarism can be severe – you can lose credibility, harm your reputation, and even face legal repercussions.
Here are a few reasons why it’s essential to check for (and avoid) plagiarism:
Prevent legal problems. Engaging in plagiarism or copyright infringement can expose you to a range of potential legal issues.
Maintain your reputation. Trust is vital. But why should audiences trust you if you’re stealing somebody else’s work? Checking for plagiarism is crucial to preserving your reputation and trust with your audience or customers.
Preserve your SEO efforts. Google and other search engines are actively trying to crack down on plagiarism and will penalize plagiarized content. This can hurt your website’s ranking and visibility.
How You Can Check For Plagiarism
There are a handful of different ways to check for plagiarism, including:
Manual checks. This is precisely what it sounds like: manually reviewing content for plagiarism by cross-checking text using search engines and academic databases. If you’re examining a small chunk of text, this can work, but it can get unwieldy fast.
Use alerts. It’s possible to create your own plagiarism checker by setting up Google Alerts. Simply enter your content into the search query field and let Google know how frequently you want it to alert you of copied content. While not a totally accurate or complete method, it can be effective at times.
Monitoring services. You can use existing tools that help flag unauthorized use of your content. They do so by scanning the internet and leveraging algorithms to detect plagiarized content.
Online plagiarism checker tools. Software and tools designed specifically to analyze content and run a comprehensive check for plagiarism.
While checking text for direct plagiarism is one thing, identifying paraphrased content or ideas is much more complicated.
And while we will highlight many useful tools in this article, it’s worth remembering that no tool is perfect.
With the sheer amount of content available, and more being produced and published every second, it’s nearly impossible to complete a full check – which is why plagiarism remains an ongoing issue.
What To Do If Someone Plagiarized Your Content
So, what do you do if you discover that somebody else has plagiarized your content? Here are a few steps you can take:
Collect evidence. Take screenshots, make notes, and save any URLs as proof of the offense.
Contact the perpetrator. As we mentioned earlier, sometimes, plagiarism can be an innocent mistake. No matter the situation, we recommend contacting the offending party and requesting that they either remove your content or label it with the proper attribution.
File a complaint. If that doesn’t work, you can file a Digital Millennium Copyright Act (DMCA) takedown complaint, which will send notice to the service provider (e.g., Google or web hosting companies) to remove the content or face legal liability.
Seek legal advice. If the case is particularly egregious, or the above steps fail, you can consider speaking with a legal professional.
Top 11 Plagiarism-Checking Alternative Tools To Copyscape
1. Grammarly
Screenshot from Grammarly.com, June 2024
While most people know Grammarly for its spelling and grammar check capabilities, it also offers a useful plagiarism checker tool.
Grammarly’s free plagiarism checker will compare your text (up to 10,000 characters) against academic databases and billions of webpages, then give you an immediate report that lets you know whether it found any plagiarized content.
As a helpful bonus, it will also flag if it finds problems with grammar, spelling, punctuation, conciseness, readability, word choice, or other writing issues.
If you want to take it a step further, Grammarly offers a Premium version of the tool with more advanced capabilities. The paid version will highlight specific sentences of concern, include source information, give you deeper writing feedback, and even assign your text an “overall originality score.”
Cost
Free version available with limited plagiarism detection, as well as basic grammar, spelling, and punctuation checks.
Premium Grammarly membership starts at $12/month and includes advanced plagiarism detection.
2. Plagiarisma
Screenshot from Plagiarisma.net, June 2024
If you’re looking for a plagiarism checker that works in several languages, look no further than Plagiarisma. It supports 190+ languages and offers both free and paid versions.
Users can enter text into Plagiarisma in a variety of ways, including uploading documents, entering URLs, or pasting text directly into the tool. Once you’ve shared your copy, it will check it against sources like books, websites (you can choose between Google and Bing as your search engine of choice), and academic papers.
With the free version, users can run plagiarism checks up to three times in one day. You can also upgrade to a Premium membership for access to more features, including a Synonymizer (which helps you leverage synonyms to recreate sentences), a Similarity Checker (which compares documents for similarity), and unlimited access to plagiarism checks.
Cost
Free version with up to three plagiarism checks per day.
Premium membership starts at $5/month and offers unlimited plagiarism checks and more advanced features.
3. ProWritingAid
Screenshot from ProWritingAid.com, June 2024
Similar to Grammarly, ProWritingAid is an AI-powered writing assistant tool that analyzes your copy and suggests areas for improvement. It also offers a helpful plagiarism checker – and while there is no free version, it’s still reasonably affordable.
According to ProWritingAid, its plagiarism detection tool can compare your text (up to 2,000 words) against billions of sources, both online and offline, including databases, periodicals, and websites.
It will flag directly copied content and give similarity percentages to show areas needing improved paraphrasing or citation.
You can use ProWritingAid’s online editing tool to conduct your check or leverage its Microsoft Word Add-In.
Unlike some other tools, ProWritingAid charges based on the number of checks you want to conduct rather than a monthly or yearly subscription – worth noting, and potentially a benefit if you only have a set number of documents to review.
Cost
No free version.
Pricing starts at $10 for 10 checks, $40 for 100 checks, $120 for 500 checks, and $200 for 1,000 checks.
4. Plagiarism Checker
Screenshot from Plagiarism-Checker.me, June 2024
Plagiarism Checker is a fairly straightforward plagiarism detection tool that’s both free and easy to use. If you need a quick and simple option, this is worth checking out.
It boasts a simple user interface and allows users to insert their text directly into the web-based editor, share a URL, or upload a document. You can even denote a URL you want it to exclude, which is a helpful feature if there are particular pages on your site that you want to ignore for now.
Plagiarism Checker scans your text against blogs, websites, and academic papers to detect plagiarism, which it reports as a percentage. It’s compatible with Mac, Windows, and Android, and supports multiple file formats, including .rtf, .pdf, .docx, .odt, and .txt.
Note that there is a limit of 1,000 words per check. The tool also includes a grammar checker and word counter, and you can download the reports it gives you.
Cost
Free.
5. CopyGator
Screenshot from CopyGator.com, June 2024
CopyGator is a free service designed to help bloggers and content creators monitor and detect duplicate versions of their content on other blogs or websites.
It works by monitoring your website’s RSS feed to see whether content has been republished elsewhere – and automatically notifying you if it finds plagiarism or quotations.
There are two different options for using CopyGator:
Image badge: By copying and pasting some code into your site, you can add a CopyGator image badge to your blog that will monitor your feeds for you. When you want to run a check, simply click the badge. If it turns red, CopyGator has detected plagiarized versions of your content.
RSS feed: Your other option is to input an RSS feed directly into CopyGator’s tool and ask it to watch the feed. It will create your own custom overview page where you can get updates.
Cost
Free.
6. PlagScan
Screenshot from PlagScan.com, June 2024
PlagScan is quite a robust plagiarism detection tool most commonly used by academic institutions and professional writers. One thing to note upfront: There is no free version of this tool.
PlagScan compares your text to a massive database of websites, academic resources, and journals to find plagiarism and compiles a report to help you understand the results.
You’ll receive a PlagLevel score, which summarizes the level of duplicate text found within a document, as well as colored highlighting for possible plagiarism:
Red for direct matches.
Blue for potentially altered copy.
Green for correctly cited text.
With PlagScan, you get a list of sources that match your document to help you with proper citation. You can also compare two documents side-by-side to find similarities. It works with most file types, and your data is protected.
Cost
No free version.
PlagScan uses a prepaid pricing model based on the number of words/pages. Pricing starts at $6.50 for 6,000 words/24 pages.
7. CopyLeaks
Screenshot from CopyLeaks.com, June 2024
CopyLeaks is a more sophisticated plagiarism detection tool than many of the options on this list, making it a popular choice for businesses, educational institutions, and individuals around the world.
According to CopyLeaks, it uses “advanced AI” to detect instances of plagiarism across over 100 languages, including paraphrasing, plagiarism in programming code, and even AI-generated plagiarism. Each scan checks content against 60 trillion websites, more than 16,000 journals, over 1 million internal documents, and 20+ code data repositories.
The tool has a very user-friendly interface, allowing you to choose from different types of files you might want to scan – text, documents, code, URLs, etc. You can also use the “compare” option to compare two documents or URLs to each other.
Another handy feature within CopyLeaks is the ability to schedule recurring scans so that it will automatically check for duplicate content on a regular basis. It also offers easy and flexible API integration.
Cost
Free trial available.
Paid plans start at $8.99/month for up to 1,200 credits (equal to 300,000 words). For $13.99/month, you’ll get access to both the plagiarism detection and AI content detection tools in one.
8. Plagium
Screenshot from Plagium.com, June 2024
Plagium is a good choice if you’re looking for an easy and cost-effective plagiarism checker. It uses a simple web-based text box and offers both “quick search” and “Deep Search” functions, the latter being a more thorough check that can also handle large documents.
A quick search is free and allows up to 500 characters – though the website appears to indicate that the number of quick searches is capped. In order to use the Deep Search feature, you’ll need to create an account – and these searches start at $0.08/page using Plagium’s credits system.
As a member, you’re able to upload different types of documents – such as PDFs – and Plagium also integrates with Google Drive and offers a Google Docs Add-on.
Cost
Free quick search up to 500 characters.
Paid plans start at $9.99/month for over 143,000 characters, with options for prepaid plans if that is more your speed.
9. Dupli Checker
Screenshot from DupliChecker.com, June 2024
Need a free, easy-to-use plagiarism checker that’s available in up to seven languages and accepts a variety of file formats? Dupli Checker could be for you.
Dupli Checker’s simple interface makes it easy to scan your documents for plagiarism. You can paste directly into the website or upload files from your computer, Dropbox, or Google Drive. Like other tools in this list, you can also share a URL you’d like the tool to check, and up to five URLs you want it to exclude.
The tool promises 100% privacy – meaning it doesn’t save any of your documents – and summarizes your results in a report that highlights duplicate copy, gives you a percentage rating, and flags other issues, such as grammar problems.
Cost
Free version with up to 1,000 words per search.
Paid plans start at $10/month for increased searches, higher word limits, and other advanced features.
10. Quetext
Screenshot from quetext.com, June 2024
Quetext has become a popular plagiarism detection tool, and for good reason. It’s dependable and user-friendly, with some handy little features to help you spot plagiarism in your documents.
How does it work? You just enter your text into the web-based text box and click “Check for plagiarism.” Quetext then uses its DeepSearch™ Technology (a machine-learning algorithm) to scan your text against billions of internet sources and spot plagiarism.
It provides you with a report that includes a plagiarism score and both exact matches and near matches to other existing text.
It highlights the latter using its ColorGrade™ feedback feature, which uses different colors to highlight exact match copy vs. “fuzzy” matches (or close matches) – a valuable tool for spotting plagiarism that might have otherwise flown under the radar.
It also offers a “Cite Source” feature, which helps you produce citations across Chicago, MLA, and APA formats.
Cost
Free version available, which includes up to 500 words, a website citation generator, and a citation assistant.
Paid tiers start at $8.80/month, which includes 100,000 words per month and a range of other advanced features.
11. PlagTracker
Screenshot from Plagtracker.com, June 2024
PlagTracker is an online, web-based plagiarism detector that bills itself as “the most accurate plagiarism checking service.” The tool lists students, teachers, publishers, and site owners as its intended users, and it checks text against over 14 billion webpages and “more than 20 million academic works.”
Using PlagTracker is pretty straightforward. Users upload a document into the tool, which scans it and then returns a detailed report that shows what percentage of their document is plagiarized and highlights specific sections with sources.
It supports multiple languages – English, German, French, Romanian, Spanish, and Italian – making it a versatile tool. PlagTracker has a 5,000-word limit for free users, though you can pay for a Premium membership for unlimited access.
Cost
Free version is available with a 5,000-word limit.
Premium subscription starts at $7.49/month for unlimited volume and other advanced features.
The Best Plagiarism Detection Tools On The Market
And there you have it: Copyscape is by no means the only option for plagiarism detection tools.
Those listed above are great alternatives that cater to a wide range of use cases, whether you’re looking for a cheap and easy solution or an all-in-one AI-powered writing assistant.
If you’re a content creator of any kind, you must produce work that’s original and unique – and these tools can help you do just that.
Avoiding plagiarism will protect your credibility and reputation and ultimately drive more traffic to your website. Not to mention, it’ll keep you out of trouble.
Recently, I had the pleasure of hosting a webinar discussing Reddit, its growth, and how it impacts search results.
It’s been quite a while since I covered Reddit as a topic, but I could talk about it all day, and I think it has been, and remains, one of the most influential communities around today.
Let’s dive into the key points we covered during the session.
The Current State Of Reddit
Reddit has seen explosive growth recently. Here are some stats that highlight this surge:
Reddit’s growth rate of 37% year-over-year is phenomenal, and 50% of its audience is now outside the US, marking its impressive international expansion.
Why Reddit Has Become So Influential
Reddit is successful because it addresses a significant problem: the oversaturation of low-quality content on the internet.
Traditional search experiences are becoming less effective, and users seek more reliable, conversational answers.
Reddit fills this gap by providing authentic, user-generated content that is trusted by so many searchers today that they actually add [reddit] to the end of their search queries to force Reddit results.
Deals with Google and OpenAI underscore Reddit’s value, with Google signing a $60 million deal for real-time content access and training future AI models using Reddit data.
Tips For Having Success On Reddit
1. Understand The Platform
Reddit isn’t just another social media site; it’s a content-sharing platform.
This fundamental distinction is crucial for understanding how to navigate and succeed on Reddit. Unlike traditional social media platforms, Reddit is designed around user-generated subreddits – communities where content is shared and discussed.
Each subreddit is unique, with its own tone, culture, and rules. Think of subreddits as completely separate communities rather than categories of the same community.
The individuality of each subreddit means you must tailor your approach to fit the specific norms and expectations of each community.
On Reddit, the focus is on topics, not individuals. Influencer marketing, as it exists on other platforms, doesn’t translate well here.
While notable figures like Bill Gates participate, it’s their contributions and the topics they discuss that matter, not their personal brand. This topic-centric approach sets Reddit apart from other social media sites.
Anonymity is a core feature of Reddit, encouraging users to speak freely and honestly.
This anonymity fosters open discussion, as seen in subreddits like “/r/AmItheAsshole,” where users seek unbiased opinions on personal situations. Understanding and valuing this anonymity is key to engaging authentically with the Reddit community.
It’s important to distinguish between moderators and admins on Reddit. Moderators are regular users who manage subreddits, while admins are Reddit employees.
Confusing the two can lead to frustration, as moderators don’t have the same powers or responsibilities as admins. Recognize this distinction to better navigate issues and interactions within the platform.
Understanding these fundamental aspects of Reddit is critical for making a strong first impression and achieving success on the platform.
Mistakes can quickly derail your efforts, but with the right approach, Reddit offers a unique and valuable space for content sharing and community engagement.
2. Avoid Spamming
Spamming on Reddit can take many forms, and it’s essential to avoid behaviors that may be perceived as spammy.
Understanding and respecting the community guidelines is crucial for maintaining a positive presence on the platform.
Spam can be defined differently by each subreddit, but common behaviors include:
Posting off-topic content.
Submitting too frequently, even with good content.
Sharing the same content across multiple subreddits.
Trying to bypass subreddit rules (such as using redirects to post prohibited links).
Posting without engaging in comments.
Engaging in excessive self-promotion.
Each subreddit often lists its specific rules and definitions of spam in the sidebar, so make sure to read and follow them.
Additionally, Reddit continuously enhances its spam prevention measures, making it harder to game the system. Key improvements include:
AutoMod: Automated moderation scripts that filter submissions based on various parameters like account age and karma.
Contributor Quality Tiers: Assessing users based on their overall activity and legitimacy.
Ban Evasion Filters: Using AI to detect and prevent users from creating new accounts to bypass bans.
These evolving measures ensure a better community experience by reducing spam and encouraging genuine engagement.
Avoid attempting to game the system, as these improvements make it increasingly difficult – and ultimately, it just doesn’t lead to success.
3. Become A Redditor And Build Karma
Engage authentically by commenting and participating in discussions before starting your own posts. Build karma and learn what works within different communities.
To establish a presence on Reddit, it’s essential to start by becoming an active member of the community. Engage in subreddits that align with your passions, whether it’s growing peppers or discussing “Rick and Morty.”
This involvement helps you understand Reddit’s unique features, language, and community norms. By participating in discussions, you can start building karma – a reputation score based on upvotes and downvotes.
While karma is a simplified measure of your acceptance on Reddit, it’s vital for unlocking certain privileges and ensuring your posts aren’t hidden by automated moderation.
Focus on subreddits with low restrictions to accumulate karma easily. For instance, posting cute pictures in a cat subreddit or offering advice in a subreddit dedicated to questions can help you earn upvotes and interaction.
However, it’s crucial to approach this naturally and avoid trying to game the system, as patterns of inauthentic engagement can lead to bans and negative karma.
Before diving into posting your own content, prioritize commenting on existing posts.
Commenting helps you blend into the community and learn what types of content generate engagement. By focusing on “rising” posts – threads likely to gain significant visibility – you can maximize your exposure and karma.
This strategic approach allows your comments to receive more attention as the post gains popularity.
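If you want to surface “rising” threads programmatically, the sketch below uses the third-party PRAW library to list posts currently gaining traction in a subreddit. Treat it as a minimal illustration: the subreddit name and credentials are placeholders, and PRAW is simply one convenient option, not something prescribed above.

```python
import praw

# Read-only client; register a "script" app at reddit.com/prefs/apps
# to obtain real credentials. These values are placeholders.
reddit = praw.Reddit(
    client_id="YOUR_CLIENT_ID",
    client_secret="YOUR_CLIENT_SECRET",
    user_agent="rising-posts-research by u/your_username",
)

# List threads currently gaining traction so you can add thoughtful
# comments while visibility is still climbing.
for submission in reddit.subreddit("Cooking").rising(limit=10):
    print(f"{submission.score:>4}  {submission.title}")
    print(f"      https://www.reddit.com{submission.permalink}")
```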
Starting with comments also offers insights into the community’s interests and opinions, helping you tailor future submissions to better resonate with the audience.
Submitting content places you directly under scrutiny, whereas commenting allows you to learn and experiment while becoming a Redditor.
Take the time to craft thoughtful, well-considered comments, as Reddit users value effort and sincerity.
Even as a brand, adopt a personable tone. Engaging authentically can shift perceptions positively, as seen in successful campaigns like the one we did for TikTok.
By focusing on building karma through comments first and understanding the community dynamics, you set a strong foundation for successful content submissions in the future.
4. Choose Your Subreddits Carefully
When you’re ready to submit content on Reddit, the first step is to identify the subreddits that align with your interests and goals.
Use Reddit’s search functionality to explore relevant subreddits. You can search specific domains by using “site:searchenginejournal.com” to see where your content or similar content is being discussed.
This helps you understand which subreddits have positive engagement with your topics. Additionally, you can research your competitors to see where they are active and successful.
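If you prefer to script this research, the hedged sketch below uses the PRAW library and Reddit’s own search syntax to tally which subreddits already link to a given domain. The domain and credentials are placeholders; swap in your own site or a competitor’s.

```python
from collections import Counter

import praw

# Read-only client; credentials are placeholders.
reddit = praw.Reddit(
    client_id="YOUR_CLIENT_ID",
    client_secret="YOUR_CLIENT_SECRET",
    user_agent="subreddit-research by u/your_username",
)

# Find recent submissions linking to a domain (placeholder) and count
# which subreddits they land in, to spot where engagement already exists.
domain = "searchenginejournal.com"
counts = Counter()
for submission in reddit.subreddit("all").search(f"site:{domain}", sort="new", limit=100):
    counts[submission.subreddit.display_name] += 1

for name, total in counts.most_common(10):
    print(f"r/{name}: {total} recent submissions")
```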
Look for subreddits that address user journeys and questions related to your content, ensuring they match the topics you want to discuss.
Once you’ve identified potential subreddits, evaluate their activity levels. Check both the number of members and the current active users to ensure the subreddit is lively and engaged.
A large member count doesn’t always mean high engagement, so prioritize active subreddits over those with inflated, inactive memberships.
Review each subreddit’s rules meticulously. If the rules don’t align with your content goals – such as restrictions on link submissions – find a more suitable subreddit. Attempting to bypass rules will only harm your efforts.
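Subreddit size, live activity, and posted rules can all be pulled programmatically before you commit to a community. Here is a minimal PRAW sketch; the subreddit name is a placeholder, and the attribute names assume a recent PRAW version.

```python
import praw

# Read-only client; credentials are placeholders.
reddit = praw.Reddit(
    client_id="YOUR_CLIENT_ID",
    client_secret="YOUR_CLIENT_SECRET",
    user_agent="subreddit-vetting by u/your_username",
)

subreddit = reddit.subreddit("HomeImprovement")  # placeholder community

# Raw size vs. live activity: a big but quiet subreddit is a poor target.
print(f"Members: {subreddit.subscribers:,}")
print(f"Active right now: {subreddit.active_user_count:,}")

# Read the rules before you post; restrictions on links or self-promotion
# are usually spelled out here.
for rule in subreddit.rules:
    print(f"- {rule.short_name}: {rule.description or '(no details)'}")
```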
Observe the subreddit moderators to understand their tone and interaction style. Knowing how moderators behave and what content they support can provide valuable insights into how your submissions might be received.
Since moderators play a crucial role in approving or removing content, aligning with their preferences increases your chances of success.
Finally, assess the overall tone around your topics within the subreddit. Even if a subreddit focuses on a relevant subject, the community might have varying opinions.
Be mindful of these nuances to ensure your content resonates positively with the audience. By carefully selecting and evaluating subreddits, you can effectively position your content for maximum engagement and success.
5. Create Your Own Subreddit
For brands, having a dedicated subreddit can be highly beneficial. It allows you to control the tone and foster in-depth discussions that not only solve your customers’ needs but also enhance your search and AI visibility.
Selecting knowledgeable moderators is crucial. They should understand Reddit culture and handle interactions professionally. Avoid arguing with users or getting defensive, which only incites negative responses.
Establish clear rules and use automated moderation tools to manage your subreddit effectively, ensuring a positive experience even when you’re not online.
Embrace critical comments and engage thoughtfully rather than deleting negative posts or banning users. The community respects genuine engagement and can differentiate between legitimate criticism and trolling.
By responding as a relatable, empathetic person, you can turn potential conflicts into opportunities for positive interaction.
Always approach interactions as if you’re the Redditor who convinced your company to join Reddit, focusing on authentic, helpful communication. This personal touch can significantly enhance your brand’s reputation on the platform.
6. Post The Right Content That Adds Value
Once you’ve found the right subreddit and know where you want to submit content, focus on providing value to the community.
Research thoroughly to understand what topics resonate and where you can contribute your expertise. This will ensure positive engagement and brand interaction.
Support existing discussions by offering solutions or insights, enhancing the conversation with your brand’s unique perspective. Reviewing top content from the past year in your chosen subreddit can help you identify successful topics and understand the community’s interests.
Pay attention to standout users and comments to gauge what works and anticipate reactions.
Engage with moderators by studying their submissions and preferences. Building a rapport with them can significantly influence your content’s success, as they play a crucial role in approving and promoting submissions.
Always consider the needs of the subreddit members. While you have the freedom to post as you like, focusing on what the community wants will lead to greater success.
Again, avoid the temptation to spam or overpromote your content. Focus on what will really provide value to the community.
Finally, keep in mind the broader impact on search and AI. Reddit’s influence on search engine results is significant, and the platform’s content is increasingly used to train AI models.
Choose content that enhances your brand’s visibility and reputation, ensuring it aligns with how you want your brand and products to be perceived. This strategic approach will maximize your reach and effectiveness on Reddit.
7. Post At The Right Time
Timing is crucial for gaining initial engagement on Reddit. Content typically thrives for about 24 hours, although exceptionally popular content can remain visible longer. The general trend is a rapid decline after the initial 24-hour period, and the initial votes are the most critical for boosting visibility.
Avoid the temptation to game the system by creating multiple accounts to upvote your own content, as this can lead to bans and diminish your credibility. Instead, focus on organic engagement to achieve those vital first votes.
To maximize your reach, consider using tools to analyze the best times for posting in your specific subreddit. Generally, posting between 8 a.m. and 10 a.m. Eastern Time is effective, as it captures a full day of high activity, including both domestic and international users.
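If you schedule submissions with a script, you can gate posting on that window. The sketch below uses PRAW with an authenticated account; the credentials, subreddit, and post content are placeholders, and the 8–10 a.m. Eastern window simply mirrors the general guidance above rather than a hard rule.

```python
from datetime import datetime
from zoneinfo import ZoneInfo

import praw

# Submitting requires an authenticated account (all values are placeholders).
reddit = praw.Reddit(
    client_id="YOUR_CLIENT_ID",
    client_secret="YOUR_CLIENT_SECRET",
    username="your_username",
    password="your_password",
    user_agent="scheduled-post by u/your_username",
)

now_et = datetime.now(ZoneInfo("America/New_York"))

# Only submit inside the suggested 8-10 a.m. Eastern window; run this from
# a scheduler (cron, Task Scheduler, etc.) that fires during that window.
if 8 <= now_et.hour < 10:
    submission = reddit.subreddit("test").submit(
        title="How we cut our page load time in half",  # placeholder title
        selftext="Sharing what worked for us - happy to answer questions.",
    )
    print(f"Posted: https://www.reddit.com{submission.permalink}")
else:
    print(f"Outside posting window ({now_et:%H:%M} ET); skipping.")
```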
8. Stay Engaged After You Submit
After submitting a post on Reddit, it’s crucial to stay engaged. Monitor your post closely for the full 24 hours it’s live.
Engage with comments by voting and responding thoughtfully. Upvote genuine comments and downvote trolls or low-quality contributions to manage comment visibility effectively.
Engage with commenters without being defensive, but don’t feel obligated to respond to every single comment.
Approach interactions naturally, applying common social standards. The goal is to foster a positive, constructive discussion, enhancing your presence on Reddit.
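Monitoring those first 24 hours can also be scripted so comments don’t slip past you. Below is a minimal PRAW sketch, assuming an authenticated account; the submission URL is a placeholder, and comments are only printed for review here rather than replied to automatically.

```python
import praw

# Authenticated client; all credential values are placeholders.
reddit = praw.Reddit(
    client_id="YOUR_CLIENT_ID",
    client_secret="YOUR_CLIENT_SECRET",
    username="your_username",
    password="your_password",
    user_agent="post-monitor by u/your_username",
)

# Placeholder URL of the post you submitted.
submission = reddit.submission(url="https://www.reddit.com/r/test/comments/abc123/example/")

# Expand the comment tree and review each top-level comment.
submission.comments.replace_more(limit=0)
for comment in submission.comments:
    print(f"u/{comment.author}: {comment.body[:120]}")
    # Once you have judged whether a comment is genuine or a troll, vote and
    # respond manually, or via comment.upvote() / comment.reply("..."), as
    # described above.
```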
Reddit Is An Invaluable Platform If You Invest The Time And Effort
Reddit is one of the hardest communities to break into. It takes a lot of dedication and sometimes years of experience, but Reddit’s community-driven approach and authentic content make it an invaluable platform for marketers willing to invest the time and effort.
Whether it’s engaging in existing subreddits, creating your own, or running ads, Reddit offers unique opportunities for meaningful connections and impactful marketing.
Thank you to everyone who joined the webinar. It was definitely fun talking about Reddit again. If you have any questions or need help with your brand’s impact on Reddit, feel free to reach out.
A Little Something About Me You Might Not Know…
I’ve been an avid Redditor since 2006. My journey with social media started in San Francisco, where I got involved with companies like Digg, Reddit, and StumbleUpon right as they were taking off.
One of my favorite memories is emailing Alexis Ohanian, Reddit’s co-founder, to tell him how much I loved the site. He responded, we chatted, and before I knew it, we were hanging out.
From tacos to SEO chats, we bonded over our shared passion for Reddit. Fun fact: My son is the first baby Redditor, thanks to Alexis.
Google updated its documentation to reflect that it has added eight new languages to its translated results feature, broadening publishers’ reach to an increasingly global audience through automatic translations into a site visitor’s native language.
Google Translated Results
Translated Results is a Google Search feature that automatically translates the title link and meta description into a user’s local language, making a website published in one language available to a searcher in another. If the searcher clicks on the link of a translated result, the web page itself will also be automatically translated.
According to Google’s documentation for this feature:
“Google doesn’t host any translated pages. Opening a page through a translated result is no different than opening the original search result through Google Translate or using Chrome in-browser translation. This means that JavaScript on the page is usually supported, as well as embedded images and other page features.”
This feature benefits publishers because it makes their website available to a larger audience.
Search Feature Available In More Languages
Google’s documentation for this feature was updated to reflect that it is now available in eight more languages.
Users who speak the following languages will now have automatic access to a broader range of websites.
List Of Added Languages
Arabic
Gujarati
Korean
Persian
Thai
Turkish
Urdu
Vietnamese
Why Did It Take So Long?
It seems odd that Google didn’t already translate results into major languages like Turkish, Arabic, or Korean. So I asked international SEO expert Christopher Shin why it might have taken so long for Google to do this for the Korean language.
Christopher shared:
“Google was always facing difficulties in the South Korean market as a search engine, and that has to do mainly with Naver and Kakao, formerly known as Daum.
But the whole paradigm shift to Google began when more and more students that went abroad to where Google is the dominant search engine came back to South Korea. When more and more students, travelers abroad etc., returned to Korea, they started to realize the strengths and weaknesses of the local search portals and the information capabilities these local portals provided. Laterally, more and more businesses in South Korea like Samsung, Hyundai etc., started to also shift marketing and sales to global markets, so the importance of Google as a tool for companies was also becoming more important with the domestic population.
Naver is still the dominant search portal, but not to retrieve answers to specific queries, rather for the purpose of shopping, reviews etc.
So I believe that market prioritization may be a big part as to the delayed introduction of Translated Google Search Results. And in terms of numbers, Korea is smaller with only roughly 52M nationwide and continues to decline due to poor birth rates.
Another big factor as I see it, has to do with the complexity of the Korean language which would make it more challenging to build out a translation tool that only replicates a simple English version. We use the modern Korean Hangeul but also the country uses Hanja, which are words from the Chinese origin. I used to have my team use Google Translate until all of them complained that Naver’s Papago does a better job, but with the introduction of ChatGPT, the competitiveness offered by Google was slim.”
Takeaway
It’s no exaggeration to say that 2024 has not been a good year for publishers. From the introduction of AI Overviews to the 2024 Core Algorithm Update and missing image thumbnails on recipe blogger sites, there hasn’t been much good news coming out of Google. But this news is different because it creates the opportunity for publisher content to be shown in even more languages than ever.
This excerpt is from The Digital Marketing Success Plan, the new book from SEJ VIP Contributor Corey Morris.
In the most distracted and disrupted era in the history of digital marketing – and especially SEO – we’re testing and trying things out faster than ever. While change is coming at us fast, it is critically important to still have a documented, actionable, and accountable plan for your digital marketing efforts.
In his new book, Corey Morris details a five-step START Planning process to help brands arrive at their own digital marketing success plans, ensuring ROI and business outcomes are at the heart of every effort while allowing plenty of room and agility for the rapid changes we’re experiencing in digital and search marketing.
Search Engine Journal has an exclusive feature of the first step in the START Planning process – “S for Strategy” – unpacking the four steps in this first and most critical phase.
Chapter 3: S For Strategy
The Strategy Phase is the most comprehensive part of the START planning process. The subsequent phases are all dependent on the work done and defined in this phase.
Strategy works through profiling, auditing, research, and goal setting. Knowing what marketing has been done in the past, where things stand currently, and—most importantly—where you want to go is critical at this juncture and overall for any digital marketing success plan.
The strategy phase has four steps, the first of which is profile. This could be considered a simple step, as we’re just gathering information and definitions.
However, it could also be misinterpreted, and it is challenging because it requires an expert to ask the right questions. That includes detailing the team involved in the effort and defining the product (services) we must sell, the brand, and the target audiences.
In short, we’re putting the details on the table about who we are, our resources, and our capabilities. We are identifying what we’re selling, what value it has, how we deliver it, and the pricing model. We also must know what our brand is in terms of positioning, differentiation, and equity that it holds.
And, as important as anything, we must know who our target audience personas are, their customer behaviors, and the funnels or journeys they take to buy.
Anyone can rattle off some demographics or targets. But, as companies grow, having a mutually agreed understanding of what the business sells, who it sells to, and the money it costs to do so is extremely hard.
I say all of this in hopes that you don’t get stuck here on some of the hard details, and also knowing that if it is easy, you might want to challenge some things and see if you can go deeper and ensure that you truly have the agreement and buy-in that you seem to.
The second step in the strategy phase is audit. We need to know what we’ve done in the past and are currently doing so we have a full picture of what has worked, what hasn’t, and why. Audits are important at this juncture, and this step might be one of the most time-consuming in the entire digital marketing success plan development journey.
As you obtain or create documentation of historical activities, you’ll need access to all the past and present networks and platforms. Then, you can deep dive into audits, including technical paid search, technical SEO, content SEO, web systems, email marketing systems, and more, based on what has been done in the past and what is available for you at this juncture.
The third step in the strategy phase is research. So far, the focus has been on who we are and what we’ve done leading up to where we currently stand with our efforts. This phase is where we get perspectives beyond our own data and understanding.
This is where we seek out internal perspectives from marketing, sales, ops, product, and other relevant teams and stakeholders—as well as from our customers or clients. Additionally, we’re doing external research to learn new insights or validate what we think when it comes to competitors, target audiences, and what the future opportunity forecasts or models out for us.
The final step in strategy is goals. With a thorough picture of who we are, where we stand, and what opportunities are out there for us, we can workshop to arrive at a realistic set of goals. Maybe we came into the process with our own goals, or maybe at this point, we’re starting from scratch.
Regardless, this step is critical to the rest of the process and arriving at a plan that can drive success. This is where we look at business goals and how marketing can affect them and ensure we set proper expectations before we move the strategy from ideas to action.
“WE HAVE A PROBLEM” Premium Roofing Manufacturer Story
A high-end roofing manufacturing company came to us with a unique problem. Marcy, their marketing manager, had a lot of past success with SEO, their website and email marketing, and extensive campaigns driving traffic to their websites for homeowners and contractors alike – fueling their sales operations.
Marcy had gone through several different agencies over the past few years. She had varying experiences with them, had a great one for a while, and then had a couple that didn’t value or know as much about SEO. She didn’t realize that, at the time, it was a line item to some of those agencies. It was getting done, and rankings and traffic were fine. Nothing was sticking out of the ordinary.
One day, Marcy notices a problem in Google Analytics: traffic is starting to drop overall. She dives in and, as she is very familiar with the reports and channels, diagnoses this as an SEO problem within a minute. SEO traffic is dropping, but she can’t tell why.
The agency says everything looks good on their end. Marcy can’t find any errors on the site. However, there’s this mysterious drop where she can see they’re not where they used to be in the Google rankings. Subsequent drops in traffic, conversions, and form submissions going through to their sales team validate it.
She remembered her work with me a few years prior at a different agency and reached out. She thought of me as someone she could trust to fix any SEO problem, which I take as high praise. I was at a conference in Silicon Valley, getting ready to take the stage to speak about SEO troubleshooting.
And so that was the ironic part of it to me. I gave my speech and immediately after had a longer conversation with Marcy over the phone. I could dive in and see the same things she saw, and I knew that we needed to do a full audit very quickly and understand what was going on.
I brought the rest of my team back home into the challenge. Within two days, we had diagnosed two very acute issues that were hidden and that most people wouldn’t see. We wouldn’t have found them unless we had gone through our analysis auditing process to get that deep.
We presented those findings to Marcy and her CEO, who both knew how big of a negative impact this would have on their business if they didn’t get this corrected.
We presented three options. The first was to fix the issues technically within their current site. Still, being forward-thinking and ROI-driven, we didn’t want to just patch the holes and wait for the next problem to come. So, we presented two other plans: a midrange plan and a long-range plan to build a new website and not only fix the issues but also strategically amplify some other things.
They opted to invest in the new website, and that turned into an ongoing relationship with us to monitor and amplify their SEO and take it to new heights, not just reclaiming what they had lost but making new ground. And I’m excited that we saw that all the way through. It played out exactly as we had projected and was validated by growth for them.
The company eventually sold for a record amount and won awards from our peers for that work. The moral of the story is not just to accept the status quo but to realize that not all professionals who have SEO in their title have an equal set of skills. Auditing is an important tool in getting to the root cause, not just for fixing an immediate problem but even more critically for long-term success.
“WE HAVE TO GET THIS RIGHT” Continuing Care Retirement Community Story
Jamaal found us through Google. He was the director of admissions and marketing for a high-end retirement facility that serves as a continuing care community. They had everything: independent living, dining in chef-inspired restaurants, activities, a pub, and anything that active senior living would want through the continuum of care, including assisted living and skilled nursing.
They have an excellent reputation in their city and are well known; however, that’s with the community at large. They needed help to reach their target audience, who could be potential residents or adult child influencers in their lives—the next generation down.
When something happens and it’s time to look for this type of living situation, the people at that important step are often less aware and less prepared for the conversations they must have with their loved ones in a critical phase of life. These are the people who should be moving into research and action toward admission.
Also, while it was a wealthy, high-end property, it was nonprofit, very benevolent, and gave back so much. The margins were tight, and there wasn’t a large marketing budget, but they knew they needed to do something.
Jamaal’s challenge when he came to us was, “I know you can do everything. I know I probably need all the things under the digital marketing umbrella. I even need a new website, but I don’t have the budget.”
We said, “That’s not a problem. We start small with many of our clients and find the areas where we can have the greatest ROI and impact. Then, we build from there and create budgets, opening up dollars for investment in other opportunities.”
So, we came into the situation, and we analyzed their audience. They had a wealth of data. They knew their business inside and out, and it was fantastic for us to see that. Still, they needed help understanding digital marketing and couldn’t connect the dots.
They had talked to three or four other providers who gave them high-ticket products or service offerings and didn’t want to work with them to find the right solution or where they should get the most bang for their buck.
We returned to them and recommended, “You should start with SEO.”
Jamaal laughed because he said that was the opposite recommendation that several of the other agencies had made. They had said, “No, you should start with $100,000 a month in Google Ads.”
I said, “You should start on SEO at a fraction of that,” even though we knew the challenges were there with being unable to build a new website. We’d have to navigate their antiquated website and optimize what they had.
We knew that telling the story, getting the content right, and even optimizing a lousy website would get us further along in the long-term journey of driving new leads to the website. We knew we only needed a handful of people to find the site to understand what they did at the right moment, get the right story, and come through the doors and experience this wonderful place.
After building momentum, one lead at a time, we could start talking about a new website, activate additional marketing channels, and layer in aspects of the digital marketing success plan to see success in the long term.
Ultimately, they grew as a business, and their marketing investment grew along with it. Eventually, they were acquired by a large hospital system, where everyone could flourish and get the mission and the word out.
The moral of the story is it’s always better to do something rather than nothing.
But if you’re on a limited budget, understand that the obvious answers or the expensive ones aren’t necessarily the best ones. Be willing to dig into the data, do the hard work, and see the opportunity to create new budgets.
By seeing small successes, one at a time, you can build toward bigger things.
To learn more about why digital marketing planning is so important, Corey’s START Planning process, and how to implement it – all of which he details in the full book (including more real stories and “how to” sections for each phase of the process) – download the book now on Amazon.
For a limited time through July 17, the Kindle version is only 99 cents.
You can also find more information and free resources at https://thedmsp.com
An SEO asked on LinkedIn why an anonymous user on Reddit could outrank a credible website with a named author. Google’s answer gives a peek at what’s going on with search rankings and why Reddit can outrank expert articles.
Why Do Anonymous Redditors Outrank Experts?
The person asking the question wanted to know why an anonymous author on Reddit can outrank an actual author who has “credibility,” such as on a brand-name site like PCMag.
The person wrote:
“I was referring to how important the credibility of the writer is now. If we search for ‘best product under X amount,’ we see, let’s say, PCMag and Reddit both on the first page.
PCMag is a reliable source for that product, while Reddit has UGC and surely doesn’t guarantee authenticity. Where do you see this in terms of credibility?
In my opinion, Google must be focusing more on this, especially after the AI boom, where misinformation can be easily and massively spread.
Do you think this is an important factor in rankings anymore?”
Here is their question, which points out what the SEO feels is wrong with Google’s search results:
“As we can see, Reddit, popular for anonymous use, ranks much higher than many other websites.
This means that content from anonymous users is acceptable.
Can I conclude that a blog without any ‘about’ page or ‘author profile’ can also perform as well?”
Relevance And Usefulness Versus Credibility
Google’s John Mueller answered the question by pointing out that there are multiple kinds of websites, not just sites that are perceived to be credible and everyone else. Credibility is just one dimension of what a site can be, one quality of a website. Mueller’s answer is a reminder that search (and SEO) is multidimensional.
Google’s John Mueller answered:
“Both are websites, but also, they’re quite different, right? Finding the right tools for your needs, the right platforms for your audience and for your messages – it’s worth looking at more than just a simplification like that. Google aims to provide search results that are relevant & useful for users, and there’s a lot involved in that.
I feel this might fit, perhaps you have seen it before -“
Does Reddit Lack Credibility?
In my opinion, Reddit users lack a lot of credibility in some contexts. When it comes to recipes, for example, I’ll take the opinions of a recipe blogger or Serious Eats over what a random Redditor “thinks” a recipe should be.
The person asking the question cited product reviews as a topic where Reddit lacks credibility, but ironically, that’s a topic where Reddit actually shines. A person on Reddit who is sharing their hands-on experience using a brand of air fryer or mobile phone is the epitome of what Google is trying to rank for reviews, because it’s the opinion of someone with days, weeks, months, or years of actual experience with a product.
Saying that UGC product reviews are useful doesn’t invalidate the professional product reviews. It’s possible that both UGC and professional reviews have value, right? And I think that’s the point that John Mueller was trying to get across about not simplifying search to one ranking criteria, one dimension.
This is a dimension of search that the person asking the question overlooked – the hands-on experience of the reviewer – and it illustrates what Mueller means when he says that “it’s worth looking at more than just a simplification” of what’s ranking in the search results.
OTOH… Feels Like A Slap In The Face
There are many high-quality sites with original photos, actual reviews, and content based on real experience that are no longer ranking in the search results. I know because I have seen many of these sites that, in my opinion, should be ranking but are not. Googlers have expressed the possibility that a future update will help more quality sites bounce back, and many expert publishers are counting on that.
Nevertheless, it must be acknowledged that it must feel like a slap in the face for an expert author to see an anonymous Redditor outranking them in Google’s search results.
Multidimensional Approach To SEO
A common issue I see in how some digital marketers and bloggers debug the search engine results pages (SERPs) is that they see them through only one, two, or three dimensions, such as:
Keywords.
Expertise.
Credibility.
Links.
Reviewing the SERPs to understand why Google is ranking something is a good idea. But reviewing it with just a handful of dimensions, a limited amount of “signals” can be frustrating and counterproductive.
It was only a few years ago that SEOs convinced themselves that “author signals” were a critical part of ranking. Now, almost everyone (finally) understands that this was a misinterpretation of what Google and Googlers said (despite Googlers consistently denying that authorship was a ranking signal).
The “authorship” SEO trend is an example of a one-dimensional approach to SEO that overlooked the multidimensional quality of how Google ranks web pages.
There are thousands of contexts that contribute to what is ranked, like solving a problem from the user perspective, interpreting user needs, adapting to cultural and language nuances, nationwide trends, local trends, and so on. There are also ranking contexts (dimensions) that are related to Google’s Core Topicality Systems which are used to understand search queries and web pages.
Ranking web pages, from Google’s perspective, is a multidimensional problem. What that means is that reducing a search ranking problem to one dimension, like the anonymity of User Generated Content, inevitably leads to frustration. Broadening the perspective leads to better SEO.
SEO is a complex, vast, and sometimes mysterious practice. There are a lot of aspects to SEO that can lead to confusion.
Not everyone will agree with what SEO entails – where technical SEO stops and development begins.
What also doesn’t help is the vast amount of misinformation that goes around. There are a lot of “experts” online and not all of them should bear that self-proclaimed title. How do you know who to trust?
Even Google employees can sometimes add to the confusion. They struggle to define their own updates and systems and sometimes offer advice that conflicts with previously given statements.
The Dangers Of SEO Myths
The issue is that we simply don’t know exactly how the search engines work. Due to this, much of what we do as SEO professionals is trial and error and educated guesswork.
When you are learning about SEO, it can be difficult to test out all the claims you hear.
That’s when the SEO myths begin to take hold. Before you know it, you’re proudly telling your line manager that you’re planning to “AI Overview optimize” your website copy.
How, exactly, would Google be able to measure that? Would that actually benefit the end user in any way?
There is a danger in SEO of considering the search engines to be omnipotent, and because of this, wild myths about how they understand and measure our websites start to grow.
Myths in SEO tend to take the form of handed-down wisdom that isn’t tested.
As a result, something that might well have no impact on driving qualified organic traffic to a site gets treated like it matters.
Minor Factors Blown Out Of Proportion
SEO myths might also be something that has a small impact on organic rankings or conversion but are given too much importance.
This might be a “tick box” exercise that is hailed as a critical factor in SEO success, or simply an activity that might only edge your site ahead if everything else between you and your competition were truly equal.
Outdated Advice
Myths can arise simply because something that used to be effective in helping sites rank and convert well no longer does, but is still being advised.
Over time, the algorithms have grown smarter, and the public has become more averse to being marketed to.
Simply, what was once good advice is now defunct.
Google Being Misunderstood
Many times, the start of a myth is Google itself.
Unfortunately, a slightly obscure or just not straightforward piece of advice from a Google representative gets misunderstood and run away with.
Before we know it, a new optimization service is being sold off the back of a flippant comment a Googler made in jest.
SEO myths can be based on fact, or perhaps these are, more accurately, SEO legends?
In the case of Google-born myths, it tends to be that the fact has been so distorted by the SEO industry’s interpretation of the statement that it no longer resembles useful information.
26 Common SEO Myths
So, now that we know what causes and perpetuates SEO myths, let’s find out the truth behind some of the more common ones.
1. The Google Sandbox And Honeymoon Effects
Some SEO professionals believe that Google will automatically suppress new websites in the organic search results for a period of time before they are able to rank more freely.
Others suggest there is a sort of Honeymoon Period, during which Google will rank new content highly to test what users think of it.
The content would be promoted to ensure more users see it. Signals like click-through rate and bounces back to the search engine results pages (SERPs) would then be used to measure if the content is well received and deserves to remain ranked highly.
There is, however, the Google Privacy Sandbox, which is designed to help maintain people’s privacy online. This is a different sandbox from the one that allegedly suppresses new websites.
When asked specifically about the Honeymoon Effect and the rankings Sandbox, John Mueller answered:
“In the SEO world, this is sometimes called kind of like a sandbox where Google is like keeping things back to prevent new pages from showing up, which is not the case.
Or some people call it like the honeymoon period where new content comes out and Google really loves it and tries to promote it.
And it’s again not the case that we’re explicitly trying to promote new content or demote new content.
It’s just, we don’t know and we have to make assumptions.
And then sometimes those assumptions are right and nothing really changes over time.
Sometimes things settle down a little bit lower, sometimes a little bit higher.”
So, there is no systematic promotion or demotion of new content by Google, but what you might be noticing is that Google’s assumptions are based on the rest of the website’s rankings.
Verdict: Officially? It’s a myth.
2. Duplicate Content Penalty
This is a myth that I hear a lot. The idea is that if you have content on your website that is duplicated elsewhere on the web, Google will penalize you for it.
The key to understanding what is really going on here is knowing the difference between algorithmic suppression and manual action.
A manual action, the situation that can result in webpages being removed from Google’s index, will be actioned by a human at Google.
An algorithmic suppression occurs when your page cannot rank well due to it being caught by a filter from an algorithm.
Essentially, having copy that is taken from another webpage might mean you can’t outrank that other page.
The search engines may determine that the original host of the copy is more relevant to the search query than yours.
As there is no benefit to having both in the search results, yours gets suppressed. This is not a penalty. This is the algorithm doing its job.
There are some content-related manual actions, but essentially, copying one or two pages of someone else’s content is not going to trigger them.
It is, however, potentially going to land you in other trouble if you have no legal right to use that content. It also can detract from the value your website brings to the user.
What about content that is duplicated across your own site? Mueller clarifies that duplicate content is not a negative ranking factor. If there are multiple pages with the same content, Google may choose one to be the canonical page, and the others will not be ranked.
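If you want a rough sense of how similar two pages on your own site are, a quick script can help. The sketch below is purely illustrative (it is not how Google detects duplicates): it compares placeholder text using simple word shingles and Jaccard similarity.

```python
# A rough sketch of flagging near-duplicate copy on your own site.
# Not how Google's systems work internally - just a quick way to spot pages
# so similar that one will likely be canonicalized rather than "penalized".

def shingles(text: str, size: int = 5) -> set:
    """Break text into overlapping word shingles (n-grams)."""
    words = text.lower().split()
    return {" ".join(words[i:i + size]) for i in range(max(len(words) - size + 1, 1))}

def jaccard(a: set, b: set) -> float:
    """Jaccard similarity: shared shingles / total distinct shingles."""
    return len(a & b) / len(a | b) if a | b else 0.0

page_a = "Our handmade oak dining table seats six and ships within five days."
page_b = "Our handmade oak dining table seats six people and ships within five days."

similarity = jaccard(shingles(page_a), shingles(page_b))
print(f"Similarity: {similarity:.2f}")  # values near 1.0 suggest near-duplicates
```

Pages that score close to 1.0 are candidates for consolidation or canonicalization, not for any kind of penalty.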
Verdict: SEO myth.
3. PPC Advertising Helps Rankings
This is a common myth. It’s also quite quick to debunk.
The idea is that Google will favor websites that spend money with it through pay-per-click advertising. This is simply false.
Google’s algorithm for ranking organic search results is completely separate from the one used to determine PPC ad placements.
Running a paid search advertising campaign through Google while carrying out SEO might benefit your site for other reasons, but it won’t directly benefit your ranking.
Verdict: SEO myth.
4. Domain Age Is A Ranking Factor
This claim is seated firmly in the “confusing causation and correlation” camp.
The reasoning goes: because a website has been around for a long time and is ranking well, its age must be a ranking factor.
Google has debunked this myth itself many times.
In July 2019, Mueller replied to a post on Twitter.com (recovered through Wayback Machine) that suggested that domain age was one of “200 signals of ranking” saying, “No, domain age helps nothing.”
Image from Twitter.com recovered through Wayback Machine, June 2024
The truth behind this myth is that an older website has had more time to do things well.
For instance, a website that has been live and active for 10 years may well have acquired a high volume of relevant backlinks to its key pages.
A website that has been running for less than six months will be unlikely to compete with that.
The older website appears to be ranking better, and the conclusion is that age must be the determining factor.
Verdict: SEO myth.
5. Tabbed Content Affects Rankings
This idea is one that has roots going back a long way.
The premise is that Google will not assign as much value to the content sitting behind a tab or accordion.
For example, text that is not viewable on the first load of a page.
Google again debunked this myth in March 2020, but it has been a contentious idea among many SEO professionals for years.
In September 2018, Gary Illyes, Webmaster Trends Analyst at Google, answered a tweet thread about using tabs to display content.
His response:
“AFAIK, nothing’s changed here, Bill: we index the content, and its weight is fully considered for ranking, but it might not get bolded in the snippets. It’s another, more technical question of how that content is surfaced by the site. Indexing does have limitations.”
If the content is visible in the HTML, there is no reason to assume that it is being devalued just because it is not apparent to the user on the first load of the page. This is not an example of cloaking, and Google can easily fetch the content.
As long as there is nothing else stopping the text from being viewed by Google, it should be weighted the same as copy that isn’t in tabs.
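A quick way to check this for your own pages is to confirm the tab or accordion copy is present in the server-rendered HTML. The sketch below assumes the requests and beautifulsoup4 packages; the URL and the sample sentence are placeholders.

```python
# Check whether "hidden" tab or accordion copy is actually present in the
# server-rendered HTML that Google fetches. Swap in your own page and copy.
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/product"
snippet = "free returns within 30 days"  # a sentence that only appears inside a tab

html = requests.get(url, timeout=10).text
visible_text = BeautifulSoup(html, "html.parser").get_text(" ", strip=True).lower()

if snippet.lower() in visible_text:
    print("Tab copy is in the raw HTML - Google can read it on the first fetch.")
else:
    print("Tab copy is missing - it may only be injected by JavaScript after load.")
```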
Want more clarification on this? Then check out this SEJ article that discusses this subject in detail.
Verdict: SEO myth.
6. Google Uses Google Analytics Data In Rankings
This is a common fear among business owners.
They study their Google Analytics reports. They feel their average sitewide bounce rate is too high, or their time on page is too low.
So, they worry that Google will perceive their site to be low quality because of that. They fear they won’t rank well because of it.
The myth is that Google uses the data in your Google Analytics account as part of its ranking algorithm.
It’s a myth that has been around for a long time.
Illyes has again debunked this idea simply with, “We don’t use *anything* from Google analytics [sic] in the “algo.”
Image from Twitter.com recovered from Wayback Machine, June 2024
If we think about this logically, using Google Analytics data as a ranking factor would be really hard to police.
For instance, using filters could manipulate data to make it seem like the site was performing in a way that it isn’t really.
What is good performance anyway?
High “time on page” might be good for some long-form content.
Low “time on page” could be understandable for shorter content.
Is either one right or wrong?
Google would also need to understand the intricate ways in which each Google Analytics account had been configured.
Some might be excluding all known bots, and others might not. Some might use custom dimensions and channel groupings, and others haven’t configured anything.
Using this data reliably would be extremely complicated to do. Consider the hundreds of thousands of websites that use other analytics programs.
How would Google treat them?
Verdict: SEO myth.
This myth is another case of confusing correlation with causation.
A high sitewide bounce rate might be indicative of a quality problem, or it might not be. Low time on page could mean your site isn’t engaging, or it could mean your content is quickly digestible.
These metrics give you clues as to why you might not be ranking well; they aren’t the cause of it.
7. Google Cares About Domain Authority
PageRank is a link analysis algorithm used by Google to measure the importance of a webpage.
Google used to display a page’s PageRank score, a number from 0 to 10, on its toolbar. It stopped updating the PageRank displayed in toolbars in 2013.
In 2016, Google confirmed that the PageRank toolbar metric was not going to be used going forward.
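The underlying idea is public, though: in the original PageRank paper, a page’s score depends on the scores of the pages linking to it, split across however many links each of those pages casts. Here is a toy version of that calculation on a made-up three-page site. It illustrates the concept only; Google’s production systems have long since moved beyond this 1998-era formula.

```python
# A toy version of the original PageRank idea: a page's score depends on the
# scores of the pages linking to it. Purely illustrative.

links = {                       # who links to whom on a tiny hypothetical site
    "home": ["about", "blog"],
    "about": ["home"],
    "blog": ["home", "about"],
}

damping = 0.85
pages = list(links)
scores = {page: 1 / len(pages) for page in pages}   # start with an even split

for _ in range(50):                                  # iterate until scores settle
    new_scores = {}
    for page in pages:
        inbound = sum(
            scores[source] / len(targets)
            for source, targets in links.items()
            if page in targets
        )
        new_scores[page] = (1 - damping) / len(pages) + damping * inbound
    scores = new_scores

print({page: round(score, 3) for page, score in scores.items()})
```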
In the absence of PageRank, many other third-party authority scores have been developed.
Commonly known ones are:
Moz’s Domain Authority and Page Authority scores.
Majestic’s Trust Flow and Citation Flow.
Ahrefs’ Domain Rating and URL Rating.
Some SEO pros use these scores to determine the “value” of a page.
That calculation can never be an entirely accurate reflection of how a search engine values a page, however.
SEO pros will sometimes refer to the ranking power of a website, often in conjunction with its backlink profile, and this, too, is known as the domain’s authority.
You can see where the confusion lies.
Google representatives have dispelled the notion of a domain authority metric used by them.
“We don’t use domain authority. We generally try to have our metrics as granular as possible, sometimes that’s not so easy, in which case we look at things a bit broader (e.g., we’ve talked about this in regards to some of the older quality updates).”
Image from Twitter.com recovered through Wayback Machine, June 2024
Verdict: SEO myth.
8. Longer Content Is Better
You will have definitely heard it said before that longer content ranks better.
More words on a page automatically make yours more rank-worthy than your competitor’s. This is “wisdom” that is often shared around SEO forums with little evidence to substantiate it.
There are a lot of studies that have been released over the years that state facts about the top-ranking webpages, such as “on average pages in the top 10 positions in the SERPs have over 1,450 words on them.”
It would be quite easy for someone to take this information in isolation and assume it means that pages need approximately 1,500 words to rank on Page 1. That isn’t what the study is saying, however.
Unfortunately, this is an example of correlation, not necessarily causation.
Just because the top-ranking pages in a particular study happened to have more words on them than the pages ranking 11th and lower does not make word count a ranking factor.
As John Mueller has said:
“From our point of view the number of words on a page is not a quality factor, not a ranking factor.”
For more information on how content length can impact SEO, check out Sam Hollingsworth’s article.
Verdict: SEO myth.
9. LSI Keywords Will Help You Rank
What exactly are LSI keywords? LSI stands for “latent semantic indexing.”
It is a technique used in information retrieval that allows concepts within the text to be analyzed and relationships between them identified.
Words have nuances dependent on their context. The word “right” has a different connotation when paired with “left” than when it is paired with “wrong.”
Humans can quickly gauge concepts in a text. It is harder for machines to do so.
The ability of machines to understand the context and linking between entities is fundamental to their understanding of concepts.
LSI is a huge step forward for a machine’s ability to understand text. What it isn’t is synonyms.
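To make that distinction concrete, here is a minimal latent semantic analysis sketch using scikit-learn: a TF-IDF term-document matrix reduced to a handful of latent “concepts” with truncated SVD. This illustrates the information retrieval technique itself, not anything Google uses for ranking, and the sample documents are made up.

```python
# A minimal latent semantic analysis (LSA/LSI) sketch: build a term-document
# matrix, then compress it into a small number of latent "concepts" with SVD.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD

docs = [
    "turn right at the next junction",
    "the right answer to a hard question",
    "left and right turns at the crossing",
    "the wrong answer is still an answer",
]

tfidf = TfidfVectorizer(stop_words="english")
term_doc_matrix = tfidf.fit_transform(docs)

svd = TruncatedSVD(n_components=2, random_state=0)
concepts = svd.fit_transform(term_doc_matrix)  # each row: a document in concept space

print(concepts.round(2))
```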
Unfortunately, the concept of LSI has been reduced by the SEO community to the idea that using words that are similar or thematically linked will boost rankings for words that aren’t expressly mentioned in the text.
It’s simply not true. Google has gone far beyond LSI in its understanding of text with the introduction of BERT, as just one example.
For more about what LSI is and how it does or doesn’t affect rankings, take a look at this article.
Verdict: SEO myth.
10. SEO Takes 3 Months
“SEO takes at least three months to have an effect.” This claim helps us get out of sticky conversations with our bosses or clients, and it leaves a lot of wiggle room if you aren’t getting the results you promised.
It is fair to say that there are some changes that will take time for the search engine bots to process.
There is then, of course, some time to see if those changes are having a positive or negative effect. Then more time might be needed to refine and tweak your work.
That doesn’t mean that any activity you carry out in the name of SEO is going to have no effect for three months. Day 90 of your work will not be when the ranking changes kick in. There is a lot more to it than that.
If you are in a very low-competition market, targeting niche terms, you might see ranking changes as soon as Google recrawls your page. A competitive term could take much longer to see changes in rank.
A study by Semrush suggested that of the 28,000 domains they analyzed, only 19% of domains started ranking in the top 10 positions within six months and managed to maintain those rankings for the rest of the 13-month study.
This study indicates that newer pages struggle to rank high.
However, there is more to SEO than ranking in the top 10 of Google.
For instance, a well-positioned Google Business Profile listing with great reviews can pay dividends for a company. Bing, Yandex, and Baidu might make it easier for your brand to conquer the SERPs.
A small tweak to a page title could see an improvement in click-through rates. That could be the same day if the search engine were to recrawl the page quickly.
Although it can take a long time to see first page rankings in Google, it is naïve of us to reduce SEO success just down to that.
Verdict: SEO myth.
11. Bounce Rate Is A Ranking Factor
Bounce rate is the percentage of visits to your website that result in no interactions beyond landing on the page. It is typically measured by a website’s analytics program, such as Google Analytics.
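For clarity, this is roughly how an analytics package arrives at the number. The session data below is invented, but it shows why the metric is noisy: single-page visits count as bounces even when the visitor got exactly what they needed.

```python
# How an analytics tool typically derives bounce rate: the share of sessions
# with no interaction beyond the landing page. The session data is made up.

sessions = [
    {"pages_viewed": 1, "events": 0},   # read the page, then called the business
    {"pages_viewed": 1, "events": 0},   # hit the back button immediately
    {"pages_viewed": 3, "events": 2},
    {"pages_viewed": 1, "events": 1},   # single page, but clicked "call now"
]

bounces = sum(1 for s in sessions if s["pages_viewed"] == 1 and s["events"] == 0)
bounce_rate = bounces / len(sessions)
print(f"Bounce rate: {bounce_rate:.0%}")  # 50%, yet one of those "bounces" was a happy visitor
```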
Unfortunately, it is not a good measure of quality.
There are many reasons why a visitor might land on a webpage and leave again without interacting further with the site. They may well have read all the information they needed on that page and left the site to call the company and book an appointment.
In that instance, the visitor bouncing has resulted in a lead for the company.
Although a visitor leaving a page having landed on it could be an indicator of poor quality content, it isn’t always. Therefore, it wouldn’t be reliable enough for a search engine to use as a measure of quality.
“Pogo-sticking,” or a visitor clicking on a search result and then returning to the SERPs, would be a more reliable indicator of the quality of the landing page.
It would suggest that the content of the page was not what the user was after, so much so that they have returned to the search results to find another page or re-search.
John Mueller cleared this up (again) during Google Webmaster Central Office Hours in June 2020. He was asked if sending users to a login page would appear to be a “bounce” to Google and damage their rankings:
“So, I think there is a bit of misconception here, that we’re looking at things like the analytics bounce rate when it comes to ranking websites, and that’s definitely not the case.”
Back on another Google Webmaster Central Office Hours in July 2018, he also said:
“We try not to use signals like that when it comes to search. So that’s something where there are lots of reasons why users might go back and forth, or look at different things in the search results, or stay just briefly on a page and move back again. I think that’s really hard to refine and say, “well, we could turn this into a ranking factor.”
So, why does this keep coming up? Well, for a lot of people, it’s because of this one paragraph in Google’s How Search Works:
“Beyond looking at keywords, our systems also analyze if content is relevant to a query in other ways. We also use aggregated and anonymised interaction data to assess whether Search results are relevant to queries.”
The issue with this is that Google doesn’t specify what this “aggregated and anonymised interaction data” is. This has led to a lot of speculation and of course, arguments.
My opinion? Until we have some more conclusive studies, or hear something else from Google, we need to keep testing to determine what this interaction data is.
For now, regarding the traditional definition of a bounce, I’m leaning towards “myth.”
In itself, bounce rate (measured through the likes of Google Analytics) is a very noisy, easily manipulated figure. Could something akin to a bounce be a ranking signal? Absolutely, but it will need to be a reliable, repeatable data point that genuinely measures quality.
In the meantime, if your pages are not satisfying user intent, that is definitely something you need to work on – not simply because of bounce rate.
Fundamentally, your pages should encourage users to interact or, if they’re not that sort of page, at least send visitors away with a positive brand association.
Verdict: SEO myth.
12. It’s All About Backlinks
Backlinks are important – that’s without much contention within the SEO community. However, exactly how important is still debated.
Some SEO pros will tell you that backlinks are one of the many tactics that will influence rankings, but they are not the most important. Others will tell you it’s the only real game-changer.
What we do know is that the effectiveness of links has changed over time. Back in the wild pre-Jagger days, link-building consisted of adding a link to your website wherever you could.
Forum comments, spun articles, and irrelevant directories were all good sources of links.
It was easy to build effective links. It’s not so easy now.
Google has continued to make changes to its algorithms that reward higher-quality, more relevant links and disregard or penalize “spammy” links.
However, the power of links to affect rankings is still great.
There will be some industries that are so immature in SEO that a site can rank well without investing in link-building, purely through the strength of their content and technical efficiency.
That’s not the case with most industries.
Relevant backlinks will, of course, help with ranking, but they need to go hand-in-hand with other optimizations. Your website still needs to have relevant content, and it must be crawlable.
If you want your traffic to actually do something when they hit your website, it’s definitely not all about backlinks.
Ranking is only one part of getting converting visitors to your site. The content and usability of the site are extremely important in user engagement.
Following the slew of Helpful Content updates and a better understanding of what Google considers E-E-A-T, we know that content quality is extremely important.
Backlinks can definitely help to indicate that a page would be useful to a reader, but there are many other factors that would suggest that, too.
Verdict: SEO myth.
13. Keywords In URLs Are Very Important
Cram your URLs full of keywords. It’ll help.
Unfortunately, it’s not quite as powerful as that.
John Mueller has said several times that keywords in a URL are a very minor, lightweight ranking signal.
“We use the words in a URL as a very, very lightweight factor. And from what I recall, this is primarily something that we would take into account when we haven’t had access to the content yet.
So, if this is the absolute first time we see this URL and we don’t know how to classify its content, then we might use the words in the URL as something to help rank us better.
But as soon as we’ve crawled and indexed the content there, then we have a lot more information.”
If you are looking to rewrite your URLs to include more keywords, you are likely to do more damage than good.
URLs should only be redirected en masse when necessary, as there is always a risk when restructuring a site.
For the sake of adding keywords to a URL? Not worth it.
Verdict: SEO myth.
14. Website Migrations Are All About Redirects
SEO professionals hear this too often. If you are migrating a website, all you need to do is remember to redirect any URLs that are changing.
If only this one were true.
In actuality, website migration is one of the most fraught and complicated procedures in SEO.
A website changing its layout, content management system (CMS), domain, and/or content can all be considered a website migration.
In each of those examples, there are several aspects that could affect how the search engines perceive the quality and relevance of the pages to their targeted keywords.
As a result, there are numerous checks and configurations that need to occur if the site is to maintain its rankings and organic traffic – ensuring tracking hasn’t been lost, maintaining the same content targeting, and making sure the search engine bots can still access the right pages.
All of this needs to be considered when a website is significantly changing.
Redirecting URLs that are changing is a very important part of website migration. It is in no way the only thing to be concerned about.
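That said, the redirects do still need to be right. A sketch like the one below (using the requests package, with placeholder URLs standing in for your migration mapping) can sanity-check that each old URL returns a 301 to the intended destination.

```python
# Sanity-check one part of a migration: that each old URL 301-redirects to the
# intended new URL. The mapping is a placeholder for your migration spreadsheet.
import requests

redirect_map = {
    "https://www.example.com/old-category/widgets": "https://www.example.com/widgets",
    "https://www.example.com/blog/2019/01/post": "https://www.example.com/blog/post",
}

for old_url, expected in redirect_map.items():
    response = requests.get(old_url, allow_redirects=False, timeout=10)
    status = response.status_code
    location = response.headers.get("Location", "")
    ok = status == 301 and location.rstrip("/") == expected.rstrip("/")
    print(f"{'OK ' if ok else 'FIX'} {old_url} -> {status} {location}")
```

Tracking, content parity, internal links, and crawlability all need their own checks on top of this.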
Verdict: SEO myth.
15. Well-Known Websites Will Always Outrank Unknown Websites
It stands to reason that a larger brand will have resources that smaller brands do not. As a result, more can be invested in SEO.
More exciting content pieces can be created, leading to a higher volume of backlinks acquired. The brand name alone can lend more credence to outreach attempts.
The real question is, does Google algorithmically or manually boost big brands because of their fame?
In 2009, Google released an algorithm update named “Vince.” This update had a huge impact on how brands were treated in the SERPs.
Brands that were well-known offline saw ranking increases for broad competitive keywords. It stands to reason that brand awareness can help with discovery through Search.
It’s not necessarily time for smaller brands to throw in the towel.
The Vince update falls very much in line with other Google moves towards valuing authority and quality.
Big brands are often more authoritative on broad-level keywords than smaller contenders.
However, small brands can still win.
Long-tail keyword targeting, niche product lines, and local presence can all make smaller brands more relevant to a search result than established brands.
Yes, the odds are stacked in favor of big brands, but it’s not impossible to outrank them.
Verdict: Not entirely true or myth.
16. Your Page Needs To Include ‘Near Me’ To Rank Well For Local SEO
It’s understandable that this myth is still prevalent.
There is still a lot of focus on keyword search volumes in the SEO industry, sometimes at the expense of considering user intent and how the search engines understand it.
When a searcher is looking for something with local intent, i.e., a place or service relevant to a physical location, the search engines will take this into consideration when returning results.
With Google, you will likely see the Google Maps results as well as the standard organic listings.
The Maps results are clearly centered around the location searched. However, so are the standard organic listings when the search query denotes local intent.
So, why do “near me” searches confuse some?
A typical keyword research exercise might yield something like the following:
“pizza restaurant manhattan” – 110 searches per month.
“pizza restaurants in manhattan” – 110 searches per month.
“best pizza restaurant manhattan” – 90 searches per month.
“best pizza restaurants in manhattan” – 90 searches per month.
“best pizza restaurant in manhattan”– 90 searches per month.
“pizza restaurants near me” – 90,500 searches per month.
With search volume like that, you would think [pizza restaurants near me] would be the one to rank for, right?
It is likely, however, that people searching for [pizza restaurant manhattan] are in the Manhattan area or planning to travel there for pizza.
[pizza restaurants near me] has 90,500 searches across the USA. The likelihood is that the vast majority of those searchers are not looking for Manhattan pizza.
Google knows this and, therefore, will serve pizza restaurant results relevant to the searcher’s location.
Therefore, the “near me” element of the search becomes less about the keyword and more about the intent behind the keyword. Google will just consider it to be the location the searcher is in.
So, do you need to include “near me” in your content to rank for those [near me] searches? No. Google cares about where the searcher is and whether your business is relevant to that location, not whether your copy contains the phrase “near me.”
Verdict: SEO myth.
17. Better Content Means Better Rankings
It’s prevalent in SEO forums and X (formerly Twitter) threads. The common complaint is, “My competitor is ranking above me, but I have amazing content, and theirs is terrible.”
The cry is one of indignation. After all, shouldn’t search engines reward sites for their “amazing” content?
This is both a myth and sometimes a delusion.
The quality of content is a subjective consideration. If it is your own content, it’s harder still to be objective.
Perhaps in Google’s eyes, your content isn’t better than your competitors’ for the search terms you are looking to rank for.
Perhaps you don’t meet searcher intent as well as they do. Maybe you have “over-optimized” your content and reduced its quality.
In some instances, better content will equal better rankings. In others, the technical performance of the site or its lack of local relevance may cause it to rank lower.
Verdict: SEO myth.
18. You Need To Add Fresh Content Daily
This is a frustrating myth because it seems to have spread outside of the SEO industry.
Google loves frequent content. You should add new content or tweak existing content daily for “freshness.”
Where did this idea come from?
Google had an algorithm update in 2011 that rewards fresher results in the SERPs.
This is because, for some queries, the fresher the results, the better the likelihood of accuracy.
For instance, if you had searched for [royal baby] in the UK in 2013, you would have been served news articles about Prince George. Searching again in 2015, you would have seen pages about Princess Charlotte.
In 2018, you would see reports about Prince Louis at the top of the Google SERPs, and in 2019 it would be baby Archie.
If you were to search [royal baby] in 2021, shortly after the birth of Lilibet, then seeing news articles on Prince George would likely be unhelpful.
In this instance, Google discerns the user’s search intent and decides showing articles related to the newest UK royal baby would be better than showing an article that is arguably more rank-worthy due to authority, etc.
If Google decides a query deserves freshness, then the age of content becomes a more important ranking factor.
This means that if you are creating content purely to make sure it is newer than competitors’ content, you are not necessarily going to benefit.
If the query you are looking to rank for does not deserve freshness (for example, [who is Prince William’s third child?], a fact that will not change), then the age of content will not play a significant part in rankings.
If you are writing content every day thinking it is keeping your website fresh and, therefore, more rank-worthy, then you are likely wasting time.
It would be better to write well-considered, researched, and useful content pieces less frequently and reserve your resources to make those highly authoritative and shareable.
Verdict: SEO myth.
19. You Can Optimize Copy Once & Then It’s Done
The phrase “SEO-optimized copy” is a common one in agency-land.
It’s used as a way to explain the process of creating copy that will be relevant to frequently searched queries.
The trouble with this is that it suggests that once you have written that copy – and ensured it adequately answers searchers’ queries – you can move on.
Unfortunately, over time, how searchers look for content might change. The keywords they use and the type of content they want could alter.
The search engines, too, may change what they feel is the most relevant answer to the query. Perhaps the intent behind the keyword is perceived differently.
The layout of the SERPs might alter, meaning videos are being shown at the top of the search results where previously it was just webpage results.
If you look at a page only once and then don’t continue to update it and evolve it with user needs, then you risk falling behind.
Verdict: SEO myth.
20. Google Respects The Declared Canonical URL As The Preferred Version For Search Results
This can be very frustrating. You have several pages that are near duplicates of each other. You know which one is your main page, the one you want to rank, the “canonical.” You tell Google that through the specially selected “rel=canonical” tag.
You’ve chosen it. You’ve identified it in the HTML.
Google ignores your wishes, and another of the duplicate pages ranks in its place.
The idea that Google will take your chosen page and treat it like the canonical out of a set of duplicates isn’t a challenging one.
It makes sense that the website owner would know best which page should be the one that ranks above its cousins. However, Google will sometimes disagree.
There may be instances where another page from the set is chosen by Google as a better candidate to show in the search results.
This could be because the page receives more backlinks from external sites than your chosen page. It could be that it’s included in the sitemap or is linked to from your main navigation.
Essentially, the canonical tag is a signal – one of many that will be taken into consideration when Google chooses which page from a set of duplicates should rank.
If you have conflicting signals on your site, or externally, then your chosen canonical page may be overlooked in favor of another page.
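A small script can at least confirm what you have declared, so you can compare it against what Google actually indexed. The sketch below assumes the requests and beautifulsoup4 packages and uses a placeholder URL.

```python
# Pull the canonical you've declared on a page so you can compare it with the
# URL Google actually chose (reported in Search Console). The URL is a placeholder.
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/red-widgets?sort=price"
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

link = soup.find("link", rel="canonical")
declared = link.get("href") if link else None
print(f"Declared canonical: {declared}")
# If Search Console still reports "Duplicate, Google chose different canonical
# than user", look for stronger conflicting signals: internal links, sitemap
# entries, and external backlinks pointing at the other URL.
```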
Want to know if Google has selected another URL to be the canonical despite your canonical tag? In Google Search Console, in the Index Coverage report, you might see this: “Duplicate, Google chose different canonical than user.”
“This page is marked as canonical for a set of pages, but Google thinks another URL makes a better canonical. Google has indexed the page that we consider canonical rather than this one.”
Verdict: SEO myth.
21. The Top Three Ranking Factors Are Content, Links, And RankBrain
This myth traces back to a 2016 Google Q&A in which Search Quality Senior Strategist Andrey Lipattsev was asked to name Google’s top ranking factors. When questioned on the “other two” top ranking factors (the questioner assumed that RankBrain was one), Lipattsev stated that links pointing to a site and content were the other two. He did clarify by saying:
“Third place is a hotly contested issue. I think… It’s a funny one. Take this with a grain of salt. […] And so I guess, if you do that, then you’ll see elements of RankBrain having been involved in here, rewriting this query, applying it like this over here… And so you’d say, ‘I see this two times as often as the other thing, and two times as often as the other thing’. So it’s somewhere in number three.
It’s not like having three links is ‘X’ important, and having five keywords is ‘Y’ important, and RankBrain is some ‘Z’ factor that is also somehow important, and you multiply all of that … That’s not how this works.”
However it started, the concept prevails. A good backlink profile, great copy, and RankBrain-type signals are what matter most for rankings, according to many SEO pros.
Mueller is asked if there is a one-size-fits-all approach to the top three ranking signals in Google. His answer is a clear “No.”
He follows that statement with a discussion around the timeliness of searches and how that might require different search results to be shown.
He also mentions that depending on the context of the search, different results may need to be shown, for instance, brand or shopping.
He continues to explain that he doesn’t think that there is one set of ranking factors that can be declared the top three that apply to all search results all the time.
Google’s own How Search Works documentation puts it this way:
“To give you the most useful information, Search algorithms look at many factors and signals, including the words of your query, relevance and usability of pages, expertise of sources, and your location and settings.
The weight applied to each factor varies depending on the nature of your query. For example, the freshness of the content plays a bigger role in answering queries about current news topics than it does about dictionary definitions.”
Verdict: Not entirely true or myth.
22. Use The Disavow File To Proactively Maintain A Site’s Link Profile
To disavow or not disavow — this question has popped up a lot over the years since Penguin 4.0.
Some SEO professionals are in favor of adding any link that could be considered spammy to their site’s disavow file. Others are more confident that Google will ignore them anyway and save themselves the trouble.
It’s definitely more nuanced than that.
In a 2019 Webmaster Central Office Hours Hangout, Mueller was asked about the disavow tool and whether we should trust Google to ignore links that are somewhat spammy but not egregiously so.
His answer indicated that there are two instances where you might want to use a disavow file:
Where a manual action has been taken against your site because of its links.
And where you might think that if someone from the webspam team saw your link profile, they would issue a manual action.
You might not want to add every spammy link to your disavow file. In practice, that could take a long time if you have a very visible site that accrues thousands of these links a month.
There will be some links that are obviously spammy, and their acquisition is not a result of activity on your part.
However, where they are a result of some less-than-awesome link building strategies (buying links, link exchanges, etc.) you may want to proactively disavow them.
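The file itself is just plain text: one URL or domain: entry per line, with lines starting with # treated as comments. A sketch like this (the domains and URL are placeholders) keeps the formatting consistent after a link audit.

```python
# The disavow file is plain text: one URL or "domain:" entry per line, and
# lines starting with "#" are comments. The entries below are placeholders for
# links you've reviewed and genuinely want to disavow.

bad_domains = ["spammy-directory.example", "paid-links.example"]
bad_urls = ["https://forum.example/thread-123#comment-9"]

lines = ["# Disavow file generated after a link audit"]
lines += [f"domain:{d}" for d in bad_domains]
lines += bad_urls

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(lines) + "\n")

print("\n".join(lines))
```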
Read Roger Montti’s full breakdown of the 2019 exchange with John Mueller to get a better idea of the context around this discussion.
Verdict: Not a myth, but don’t waste your time unnecessarily.
23. Google Values Backlinks From All High Authority Domains
The better the website authority, the bigger the impact it will have on your site’s ability to rank. You will hear that in many SEO pitches, client meetings, and training sessions.
The reality is more nuanced: there is a lot that goes into Google’s calculations of whether a link will impact a site’s ability to rank highly or not.
Relevancy, contextual clues, and nofollow link attributes should all be considered when chasing a link from a high “domain authority” website.
John Mueller also threw a cat among the pigeons during a live Search Off the Record podcast recorded at BrightonSEO in 2022 when he said:
“And to some extent, links will always be something that we care about because we have to find pages somehow. It’s like how do you find a page on the web without some reference to it? But my guess is over time, it won’t be such a big factor as sometimes it is today. I think already, that’s something that’s been changing quite a bit.”
Verdict: Myth.
24. You Cannot Rank A Page Without Lightning-Fast Loading Speed
There are many reasons to make your pages fast: usability, crawlability, and conversion. Arguably, it is important for the health and performance of your website, and that should be enough to make it a priority.
However, is it something that is absolutely key to ranking your website?
As this Google Search Central post from 2010 suggests, it was definitely something that factored into the ranking algorithms. Back when it was published, Google stated:
“While site speed is a new signal, it doesn’t carry as much weight as the relevance of a page. Currently, fewer than 1% of search queries are affected by the site speed signal in our implementation and the signal for site speed only applies for visitors searching in English on Google.com at this point.”
Is it still only affecting such a low percentage of visitors?
In 2021, the Google Page Experience system, which incorporates the Core Web Vitals for which speed is important, rolled out on mobile. It was followed in 2022 with a rollout of the system to desktop.
This was met with a flurry of activity from SEO pros, trying to get ready for the update.
Many perceived it as something that would make or break their site’s ranking potential. However, over time, Google representatives have downplayed the ranking effect of Core Web Vitals.
More recently, in May 2023, Google introduced Interaction to Next Paint (INP) to the Core Web Vitals to replace First Input Delay (FID).
Google claims that INP helps to deal with some of the limitations found with FID. This change in how a page’s responsiveness is measured shows that Google still cares about accurately measuring user experience.
However, it will not necessarily cause your website to dramatically increase or decrease in rankings.
Google representatives Gary Illyes, Martin Splitt, and John Mueller hypothesized in 2021 during a “Search off the Record” podcast about the weighting of speed as a ranking factor.
Their discussion drew out the thinking around page load speed as a ranking metric and how it would need to be considered a fairly lightweight signal.
They went on to talk about it being more of a tie-breaker, as you can make an empty page lightning-fast, but it will not serve much use for a searcher.
“Core Web Vitals is definitely a ranking factor. We have that for mobile and desktop now. It is based on what users actually see and not kind of a theoretical test of your pages […] What you don’t tend to see is big ranking changes overall for that.
But rather, you would see changes for queries where we have similar content in the search results. So if someone is searching for your company name, we would not show some random blog, just because it’s a little bit faster, instead of your homepage.
We would show your homepage, even if it’s very slow. On the other hand, if someone is searching for, I don’t know, running shoes, and there are lots of people writing about running shoes, then that’s where the speed aspect does play a bit more of a role.”
With this in mind, can we consider page speed a major ranking factor?
My opinion is no. Page speed is one of the ways Google decides which pages should rank above others, but it is not a major one.
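If you want to see where your own pages stand, the public PageSpeed Insights API exposes both lab scores and Chrome UX Report field data. The sketch below queries the v5 runPagespeed endpoint with the requests package; the target URL is a placeholder, heavier use may require an API key, and the exact response fields can change, so it simply prints whatever field metrics come back.

```python
# Pull speed data from the public PageSpeed Insights API (v5 runPagespeed).
# The URL is a placeholder; response fields may vary, so we print whatever
# Core Web Vitals field data the API returns.
import requests

endpoint = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
params = {"url": "https://www.example.com/", "strategy": "mobile"}

data = requests.get(endpoint, params=params, timeout=60).json()

lab_score = data["lighthouseResult"]["categories"]["performance"]["score"]
print(f"Lighthouse performance score (lab): {lab_score:.2f}")

field_metrics = data.get("loadingExperience", {}).get("metrics", {})
for name, values in field_metrics.items():
    print(name, values.get("percentile"), values.get("category"))
```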
Verdict: Myth.
25. Crawl Budget Isn’t An Issue
Crawl budget – the idea that every time Googlebot visits your website, there is a limited number of resources it will visit – isn’t a contentious issue. However, how much attention should be paid to it is.
For instance, many SEO professionals will consider crawl budget optimization a central part of any technical SEO roadmap. Others will only consider it if a site reaches a certain size or complexity.
Google is a company with finite resources. It cannot possibly crawl every single page of every site every time its bots visit them. Therefore, some of the sites that get visited might not see all of their pages crawled every time.
Google has helpfully created a guide for owners of large and frequently updated websites to help them understand how to enable their sites to be crawled.
“If your site does not have a large number of pages that change rapidly, or if your pages seem to be crawled the same day that they are published, you don’t need to read this guide; merely keeping your sitemap up to date and checking your index coverage regularly is adequate.”
Therefore, it would seem that Google is in favor of some sites paying attention to its advice on managing crawl budget, but doesn’t consider it necessary for all.
For some sites, particularly ones that have a complex technical setup and many hundreds of thousands of pages, managing crawl budget is important. For those with a handful of easily crawled pages, it isn’t.
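One way to judge which camp you are in is to look at how often Googlebot actually hits your server. The sketch below counts Googlebot requests per day from an access log; the log path and combined log format are assumptions, and properly verifying Googlebot would require a reverse-DNS check that this skips.

```python
# Count Googlebot requests per day from a server access log. The path and the
# combined log format are assumptions; verifying that a hit really came from
# Googlebot needs a reverse-DNS check, which this sketch skips.
from collections import Counter
import re

log_path = "/var/log/nginx/access.log"
date_pattern = re.compile(r"\[(\d{2}/\w{3}/\d{4})")   # e.g. [12/Jun/2024:10:15:32 ...]

hits_per_day = Counter()
with open(log_path, encoding="utf-8", errors="ignore") as log_file:
    for line in log_file:
        if "Googlebot" in line:
            match = date_pattern.search(line)
            if match:
                hits_per_day[match.group(1)] += 1

for day, hits in hits_per_day.items():   # insertion order follows the log
    print(day, hits)
```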
Verdict: SEO myth.
26. There Is A Right Way To Do SEO
This is probably a myth in many industries, but it seems prevalent in SEO. There is a lot of gatekeeping in SEO social media, forums, and chats.
Unfortunately, it’s not that simple.
We know some core tenets about SEO.
Usually, something is stated by a search engine representative that has been dissected, tested, and ultimately declared true.
The rest is a result of personal and collective trial and error, testing, and experience.
Processes are extremely valuable within SEO business functions, but they have to evolve and be applied appropriately.
Different websites within different industries will respond to changes in ways others would not. Altering a meta title so it is under 60 characters long might help the click-through rate for one page and not for another.
Ultimately, we have to hold any SEO advice we’re given lightly before deciding whether it is right for the website we are working on.
Verdict: SEO myth.
When Can Something Appear To Be A Myth?
Sometimes an SEO technique can be written off as a myth by others purely because they have not experienced success from carrying out this activity for their own site.
It is important to remember that every website has its own industry, set of competitors, technology stack, and other factors that make it unique.
Blanket application of techniques to every website and expecting them to have the same outcome is naive.
Someone may not have had success with a technique when they have tried it in their highly competitive vertical.
It doesn’t mean it won’t help someone in a less competitive industry have success.
Causation & Correlation Being Confused
Sometimes, SEO myths arise because of an inappropriate connection between an activity that was carried out and a rise in organic search performance.
If an SEO has seen a benefit from something they did, then it is natural that they would advise others to try the same.
Unfortunately, we’re not always great at separating causation and correlation.
Just because rankings or click-through rates increased around the same time as you implemented a new tactic doesn’t mean it caused the increase. There could be other factors at play.
Soon, an SEO myth will arise from an overeager SEO who wants to share what they incorrectly believe to be a golden ticket.
Steering Clear Of SEO Myths
It can save you from experiencing headaches, lost revenue, and a whole lot of time if you learn to spot SEO myths and act accordingly.
Test
The key to not falling for SEO myths is making sure you can test advice whenever possible.
If you have been given the advice that structuring your page titles a certain way will help your pages rank better for their chosen keywords, then try it with one or two pages first.
This can help you measure whether making a change across many pages will be worth the time before you commit to it.
Is Google Just Testing?
Sometimes, there will be a big uproar in the SEO community because of changes in the way Google displays or orders search results.
These changes are often tested in the wild before they are rolled out to more search results.
Once a big change has been spotted by one or two SEO pros, advice on how to optimize for it begins to spread.
Remember the favicons in the desktop search results? The upset that it caused the SEO industry (and Google users in general) was vast.
Suddenly, articles sprang up about the importance of favicons in attracting users to your search results. There was barely time to study whether favicons would impact the click-through rate that much.
Because just like that, Google changed it back.
Before you jump on the latest SEO advice being spread around Twitter as a result of a change by Google, wait to see if it will hold.
It could be that the advice that appears sound now will quickly become a myth if Google rolls back changes.
Google’s Gary Illyes’ answer about authorship shared insights into why Google has less trust in signals that are under the direct control of site owners and SEOs, and it provides a better understanding of what site owners and SEOs should focus on when optimizing a website.
The question that Illyes answered came up in a live interview at a search conference in May 2024. The interview went largely unnoticed, but it’s full of great information related to digital marketing and how Google ranks web pages.
Authorship Signals
Someone asked whether Google would bring back authorship signals. Authorship has been a fixation for some SEOs, based on Google’s encouragement that SEOs and site owners review the Search Quality Raters Guidelines to understand what Google aspires to rank. SEOs, however, took the encouragement too literally and started to parse the document for ranking signal ideas instead.
Digital marketers came to see the concept of E-E-A-T (Experience, Expertise, Authoritativeness, and Trustworthiness) as actual signals that Google’s algorithms were looking for, and from there came the idea that authorship signals were important for ranking.
The idea of authorship signals is not far-fetched, because Google at one time created a way for site owners and SEOs to pass along metadata about webpage authorship, but Google eventually abandoned that idea.
SEO-Controlled Markup Is Untrustworthy
Google’s Gary Illyes answered the question about authorship signals and, within the same sentence, shared that in Google’s experience, SEO-controlled data on the web page (markup) tends to become spammy (implying that it’s untrustworthy).
This is the question as relayed by the interviewer:
“Are Google planning to release some authorship sooner or later, something that goes back to that old authorship?”
Google’s Gary Illyes answered:
“Uhm… I don’t know of such plans and honestly I’m not very excited about anything along those lines, especially not one that is similar to what we had back in 2011 to 2013 because pretty much any markup that SEOs and site owners have access to will be in some form spam.”
Gary next went into greater detail, saying that SEO- and author-controlled markup are not good signals.
Here is how he explained it:
“And generally they are not good signals. That’s why rel-canonical, for example is not a directive but a hint. And that’s why Meta description is not a directive, but something that we might consider and so on.
Having something similar for authorship, I think would be a mistake.”
The concept of SEO-controlled data not being a good signal is important to understand because many in search marketing believe that they can manipulate Google by spoofing authorship signals with fake author profiles, with reviews that pretend to be hands-on, and with metadata (like titles and meta descriptions) that is specifically crafted to rank for keywords.
What About Algorithmically Determined Authorship?
Gary then turned to the idea of algorithmically determined authorship signals, and it may surprise some that he describes those signals as lacking in value. This may come as a blow to SEOs and site owners who have spent significant amounts of time updating their web pages to improve their authorship data.
The concept of the importance of “authorship signals” for ranking is something that some SEOs created all by themselves; it’s not an idea that Google encouraged. In fact, Googlers like John Mueller and SearchLiaison have consistently downplayed the necessity of author profiles for years.
Gary explained about algorithmically determined authorship signals:
“Having something similar for authorship, I think would be a mistake. If it’s algorithmically determined, then perhaps it would be more accurate or could be higher accuracy, but honestly I don’t necessarily see the value in it.”
The interviewer commented about rel-canonicals sometimes being a poor source of information:
“I’ve seen canonical done badly a lot of times myself, so I’m glad to hear that it is only a suggestion rather than a rule.”
Gary’s response to the observation about poor canonicals is interesting because he doesn’t downplay the importance of “suggestions” but implies that some of them are stronger although still falling short of a directive. A directive is something that Google is obligated to obey, like a noindex meta tag.
Gary explained about rel-canonicals being a strong suggestion:
“I mean it’s it’s a strong suggestion, but still it’s a suggestion.”
Gary affirmed that even though rel=canonical is a suggestion, it’s a strong suggestion. That implies a relative scale of how much Google trusts certain inputs that publishers make. In the case of a canonical, Google’s stronger trust in rel=canonical is probably a reflection of the fact that it’s in a publisher’s best interest to get it right, whereas other data like authorship could be prone to exaggeration or outright deception and is therefore less trustworthy.
What Does It All Mean?
Gary’s comments should give a foundation for setting the correct course on what to focus on when optimizing a web page. Gary (and other Googlers) have said multiple times that authorship is not really something that Google is looking for. That’s something that SEOs invented, not something that Google encouraged.
This also provides guidance on not overestimating the importance of metadata that is controlled by a site owner or SEO.
Google Search Central updated their documentation to reflect support for labeling images that were extended or manipulated with AI. Google also quietly removed the “AI generated” metadata from Beta status, indicating that the “AI Generated” label is now fully supported in search.
IPTC Photo Metadata
The International Press Telecommunications Council (IPTC) is a standards-making body that, among other things, creates standards for photo metadata. Photo metadata enables a photograph to be labeled with information about the photo, such as copyright, licensing, and image descriptions.
Although the standard is maintained by an international press standards organization, the metadata standards it curates are used by Google Images in contexts outside of Google News. The metadata allows Google Images to show additional information about the image.
Google’s documentation explains the use case and benefit of the metadata:
“When you specify image metadata, Google Images can show more details about the image, such as who the creator is, how people can use an image, and credit information. For example, providing licensing information can make the image eligible for the Licensable badge, which provides a link to the license and more detail on how someone can use the image.”
AI Image Manipulation Metadata
Google quietly adopted the metadata standards pertaining to images that were manipulated with the kinds of AI algorithms typically used for image manipulation, like convolutional neural networks (CNNs) and generative adversarial networks (GANs).
There are two forms of AI image manipulation that are covered by the new metadata:
Inpainting
Outpainting
Inpainting
Inpainting is generally conceived of as enhancing an image for the purpose of restoring or reconstructing it, filling in missing parts. But inpainting is also any algorithmic manipulation that adds to an image.
Outpainting
Outpainting is the algorithmic process of extending an image beyond the borders of the original photograph, adding more to it than was in the original image.
Google now supports labeling images that were manipulated in both of those ways with a new Digital Source Type metadata value called compositeWithTrainedAlgorithmicMedia.
compositeWithTrainedAlgorithmicMedia
While the new property looks like structured data, it’s not Schema structured data. It’s metadata that’s embedded in a digital image.
This is what was added to Google’s documentation:
“Digital Source Type
compositeWithTrainedAlgorithmicMedia: The image is a composite of trained algorithmic media with some other media, such as with inpainting or outpainting operations.”
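Because this is embedded metadata rather than on-page markup, it has to be written into the image file itself. One common way to do that is ExifTool. The sketch below calls ExifTool from Python and assumes it is installed and exposes the IPTC Extension DigitalSourceType tag in the image’s XMP block; the filename is a placeholder, and the tag name and value should be checked against ExifTool’s and IPTC’s documentation.

```python
# Write the IPTC digital source type into an image via ExifTool (assumed to be
# installed). The filename is a placeholder; double-check the tag name against
# ExifTool's documentation before relying on it.
import subprocess

image = "hero-extended.jpg"
source_type = (
    "http://cv.iptc.org/newscodes/digitalsourcetype/compositeWithTrainedAlgorithmicMedia"
)

subprocess.run(
    ["exiftool", f"-XMP-iptcExt:DigitalSourceType={source_type}", image],
    check=True,
)

# Read it back to confirm the value was written.
subprocess.run(["exiftool", "-XMP-iptcExt:DigitalSourceType", image], check=True)
```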
Label For “AI Generated” – algorithmicMedia Metadata
Google also lifted the Beta status of the algorithmicMedia metadata specifications, which means that images that are created with AI can now be labeled as AI Generated if the algorithmicMedia metadata is embedded within an image.
This is the documentation before the change:
“algorithmicMedia: The image was created purely by an algorithm not based on any sampled training data (for example, an image created by software using a mathematical formula).
Beta: Currently, this property is in beta and only available for IPTC photo metadata. Adding this property makes your image eligible for display with an AI-generated label, but you may not see the label in Google Images right away, as we’re still actively developing it.”
The change to the documentation was to remove the second paragraph entirely, dropping any mention of Beta status. Curiously, this change is not reflected in Google’s changelog.
Google’s Search Central documentation changelog noted:
“Supporting a new IPTC digital source type
What: Added compositeWithTrainedAlgorithmicMedia to the IPTC photo metadata documentation.
Why: Google can now extract the compositeWithTrainedAlgorithmicMedia IPTC NewsCode.”
Google’s launch and pullback of AI Overviews (AIOs) caught the most attention in the SEO scene over the last two months.
However, a change with at least the same significance flew under the radar: Google’s transformation from search engine to marketplace for shopping queries.
Yes, AIOs are impactful: In my initial analysis, I found a negative impact of -8.9% for pages cited in an AIO compared to ranking at the top of the classic web search results.
I then found that Google pulled 50-66% of AIOs back. However, Google shows a whole slew of SERP features and AI features for ecommerce queries that are at least as impactful as AIOs.
To better understand the key trends for shopping queries, I analyzed 35,305 keywords across categories like fashion, beds, plants, and automotive in the US over the last five months using SEOClarity.
The results:
Product listings appear more often in position 1 in July compared to February 2024.
SERP features like Discussions & Forums gained visibility and opened a new playground for marketers.
SERP features fluctuate in visibility and introduce a lot of noise in SEO metrics.
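The roll-up behind numbers like these can be reproduced from any keyword-level rank tracker export. The pandas sketch below uses an invented four-row sample and invented column names purely to show the shape of the calculation.

```python
# The shape of the roll-up, sketched with pandas. The column names and the
# four-row sample are invented; a real keyword-level export would replace `rows`.
import pandas as pd

rows = [
    {"keyword": "kayak", "month": "2024-02", "features": ["product_listings", "ads"], "pl_position": 1},
    {"keyword": "kayak", "month": "2024-07", "features": ["product_listings", "discussions"], "pl_position": 1},
    {"keyword": "bed frame", "month": "2024-02", "features": ["images"], "pl_position": None},
    {"keyword": "bed frame", "month": "2024-07", "features": ["product_listings", "discussions"], "pl_position": 3},
]
df = pd.DataFrame(rows)

keywords_per_month = df.groupby("month")["keyword"].nunique()

# Share of keywords showing each SERP feature, per month.
feature_share = (
    df.explode("features")
    .groupby(["month", "features"])["keyword"]
    .nunique()
    .div(keywords_per_month, level="month")
)
print(feature_share)

# Share of keywords where product listings sit in position 1, per month.
print(df.assign(top_spot=df["pl_position"] == 1).groupby("month")["top_spot"].mean())
```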
Google Shopping Marketplace
To summarize ecommerce shifts, where I explain Google’s move from search engine to ecommerce marketplace: Google has merged the web results and the Shopping tab for shopping searches in response to Amazon’s long-standing dominance:
Google has fully transitioned into a shopping marketplace by adding product filters to search result pages and implementing a direct checkout option.
These new features create an ecommerce search experience within Google Search and may significantly impact the organic traffic merchants and retailers rely on.
Google has quietly introduced a direct checkout feature that allows merchants to link free listings directly to their checkout pages.
Google’s move to a shopping marketplace was likely driven by the need to compete with Amazon’s successful advertising business.
Google faces the challenge of balancing its role as a search engine with the need to generate revenue through its shopping marketplace, especially considering its dependence on partners for logistics.
To illustrate with an example:
Say you are looking for kayaks (summertime!).
On desktop (logged-in), Google will now show you product filters on the left sidebar and product carousels in the middle on top of classic organic results – and ads, of course.
Image Credit: Kevin Indig
On mobile, you get product filters at the top, ads above organic results, and product carousels in the form of popular products.
Image Credit: Kevin Indig
This experience doesn’t look very different from Amazon, which is the whole point.
Image Credit: Kevin Indig
Google’s new shopping experience lets users explore products on Amazon, Walmart, eBay, Etsy & Co.
From an SEO perspective, the prominent position of the product grid (listings) and filters likely has a significant impact on CTR, organic traffic, and, ultimately, revenue.
Product Listings Appear More Often In Position 1
In my analysis, 30,172 out of 35,305 keywords (85.6%) show product listings, which are the free product carousels. It’s the most visible SERP feature in shopping search.
In February, product listings showed up for 39% of queries in position 1 and 15% of queries in position 3.
In July, that number shifted to 43% for position 1 and 13.6% for position 3. Google moved product listings higher up the SERPs.
Image Credit: Kevin Indig
The shift from web links to product images makes product listings a cornerstone feature in Google’s transformation. The increased visibility suggests Google is doubling down on the new model.
Discussions & Forums Gain Visibility
After product listings (85.6% of queries), image carousels (61.8% of queries) are the most common SERP features.
Image Credit: Kevin Indig
Image carousels are highly impactful because shopping is a visual act. Seeing the right product can very quickly trigger a purchase, as opposed to customers being stuck in the Messy Middle for longer.
Retailers and ecommerce brands put a lot of effort into high-quality product pictures and need to spend equal time optimizing images for Google Search, even though organic traffic from image results is usually much lower than from web rankings.
Google now tests “generate image with AI,” a feature that lets users generate product images with prompts and then see similar (real) products.
It’s a powerful application of AI that, again, flies under the AIO radar but could also be impactful by making it easier for users to find things they want.
Image Credit: Kevin Indig
Visibility for most SERP features remained relatively unchanged between February and July, with one exception: Discussions & Forums grew from 28.7% to 34% of all queries (+5.3 percentage points).
Image Credit: Kevin Indig
The change in Discussions & Forums SERP features is in line with Reddit’s unprecedented SEO visibility gain over the last 12 months. The domain now operates at the traffic level of Facebook and Amazon.
Google’s Discussions & Forums feature highlights threads in forums like Reddit, Quora, and others. People visit forums when they are looking for authentic, unincentivized opinions from other consumers. Many review articles are biased, and it seems consumers know it.
As a result, Google compensates for lower review quality with more user-generated content from forums. In Free Content, I referenced a study from Germany titled “Is Google getting worse?” that found:
“An overall downward trend in text quality in all three search engines.”
“Higher-ranked pages are on average more optimized, more monetized with affiliate marketing, and they show signs of lower text quality.”
Discussions & Forums show that high visibility doesn’t equal high impact for SERP features.
SERP Features And Their Impact Fluctuate
SERP features are commonly assumed to show up at a stable rate in Search, but Google constantly tests them.
As a result, SERP features that impact click-through rates can introduce a lot of noise into common SEO data (CTR, clicks, even revenue).
At the same time, Google switching some features on and off can help SEO pros understand the impact of SERP features on SEO metrics.
A good example is the Things To Know feature (TTK), which answers two common questions about a product with links to websites.
Image Credit: Kevin Indig
After months of stable visibility, Google suddenly reduced the number of TTKs by 37.5% for a month before bringing it back to previous levels.
Sites that were linked in TTK might have seen less organic traffic during that month. Since TTK isn’t reported in Search Console, those sites might wonder why their organic traffic dropped even though rankings remained stable.
Image Credit: Kevin Indig
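One way to sanity-check whether a SERP feature test like this explains a traffic dip is a simple before/during comparison on Search Console data: compare clicks and average position for the affected window against a baseline window. The sketch below assumes a daily GSC export with date, clicks, and position columns; the date ranges are placeholders, and this is a rough diagnostic, not a causal measurement.

```python
# Rough diagnostic sketch: did clicks drop while rankings stayed stable during
# the window when a SERP feature (e.g., Things To Know) was switched off?
# Assumes a daily Search Console export with "date", "clicks", and "position"
# columns; the date ranges below are placeholders.
import pandas as pd

gsc = pd.read_csv("gsc_daily.csv", parse_dates=["date"])

baseline = gsc[(gsc["date"] >= "2024-03-01") & (gsc["date"] < "2024-04-01")]
test_window = gsc[(gsc["date"] >= "2024-04-01") & (gsc["date"] < "2024-05-01")]

click_change = (
    test_window["clicks"].mean() / baseline["clicks"].mean() - 1
) * 100
rank_change = test_window["position"].mean() - baseline["position"].mean()

print(f"Avg daily clicks change: {click_change:+.1f}%")
print(f"Avg position change: {rank_change:+.2f} (negative = improved)")

# If clicks dropped noticeably while average position barely moved, a SERP
# feature change is a plausible (though unproven) explanation.
```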
Coming back to the Kayak example from earlier, Google tests variations like deals and carousel segments (“Kayaks For Beginners”).
Image Credit: Kevin Indig
You can imagine how hard this makes it to get stable data, and why it’s so critical to monitor SERP features.
Featured Image: Paulo Bobita/Search Engine Journal