Job listings for SEO roles dropped 37% in the first quarter of 2024 compared to the same period last year, according to an analysis by specialty job board SEOJobs.com.
The analysis drew on data from over 80,000 job listings posted by 4,700 employers in 2023 and early 2024.
The report cites the increasing use of AI in search as a critical factor impacting SEO hiring.
Mid-Level Roles Hardest Hit
While senior SEO positions saw a 3% year-over-year increase in Q1 2024 and entry-level roles rose 1%, mid-level SEO jobs experienced a 6% decline compared to Q1 2023.
Nick LeRoy, owner of SEOJobs.com, attributed this disparity to AI automation handling routine tasks, which suggests companies now want experts capable of higher-level work.
He states:
“Tasks historically mapped to an entry-level position are now being done faster and cheaper with AI technology…
… These entry-level SEOs are now expected to have a base-level knowledge of search AND the soft skills to compete against their mid-level peers with 3+ years of experience.
… Companies want to “do more with less,” which means hiring cheap junior resources and paying for proven experience/results via senior SEOs.”
On a more positive note, LeRoy finds that remote SEO opportunities grew in Q1 2024 after dipping late last year.
Lack Of Salary Transparency
The SEO job report also highlighted the industry’s need for salary transparency.
Only 18% of job listings provide pay details, though there may be more wage disclosure as more states legally require it.
Economic Pressures
Alongside AI disruption, broader economic conditions appear to weigh on SEO employment.
According to data from the United States Bureau of Labor Statistics (BLS), the pullback in listings coincides with slowing job growth across the U.S. labor market.
The BLS recently reported that “employment was little changed over the month in major industries,” including professional and business services. SEO roles often fall under this sector.
6 Ways SEO Professionals Can Stay Competitive
As AI’s role in search grows, the most successful SEO professionals will likely be those who can combine technical mastery with strategic thinking, analytical skills, and a lifelong commitment to learning and professional development.
Here are some potential strategies to differentiate yourself in today’s job market:
1. Develop AI Expertise
With AI playing a prominent role in search, understanding and leveraging AI technologies will be critical.
Professionals who can apply AI tools for natural language processing, content generation, and semantic analysis will have a competitive edge.
2. Focus On Strategic SEO
As AI automates many technical and execution-based SEO tasks, employers will likely prioritize hiring SEOs with strategic abilities.
Professionals adept at competitive analysis, audience research, content strategy development, and conversion optimization may be in higher demand.
3. Build Analytics Prowess
With AI’s impact on search rankings and user behavior, the ability to extract insights from data will become even more valuable.
Expertise in analytics platforms, statistical analysis, data visualization, and communicating data-driven recommendations can set you apart from other candidates.
4. Specialize
While some SEO professionals take a generalist approach, increasing specialization within disciplines like local search, e-commerce, enterprise-level SEO, or a particular industry vertical could appeal more to potential employers.
5. Emphasize Soft Skills
As technical duties become automated, soft skills like communication, problem-solving, creativity, and adaptability may carry more weight in the hiring process.
SEO professionals who can collaborate across teams and articulate strategies can set themselves apart.
6. Build A Personal Brand
Developing a solid personal brand through blogging, public speaking, publishing authoritative content, and engaging on social media can raise your profile. This increased visibility can lead to new job opportunities.
Parting Thoughts
While the short-term outlook is challenging, the role of the SEO professional is transforming, not disappearing.
Those able to adapt their value proposition and align with new demands can find viable career opportunities.
Since March 2024, the SEO industry has seen significant disruptions, with Google rolling out its latest algorithm changes.
From the deindexing of websites to the delivery of manual penalties, site owners and SEO professionals have found themselves in a tough spot as Google attempts to clean up search results.
So what’s the path forward? How can you keep your site in good standing?
If you’re looking for strategies to help you not just survive but thrive in this dynamic digital landscape, join us on June 5 for an insightful webinar with PageOne Power.
In this live session, we’ll demonstrate how businesses like yours have maintained steadfast rankings amid the recent volatility in search. You’ll learn:
How to create valuable content for users: Crafting content that not only satisfies search engine algorithms but also resonates with users is key to long-term SEO success. By focusing on E-E-A-T (Experience, Expertise, Authoritativeness, Trustworthiness), you can ensure that your content provides genuine value to your audience and position your site as a reliable source of information in your niche.
How to build links that can boost rankings: Building high-quality links is essential for withstanding algorithm updates. By cultivating a diverse and robust backlink profile, you can enhance your website’s authority and relevance, thus improving your search rankings.
Best practices for content creation and link building: Discover the effectiveness of manual link building techniques and person-first content strategies. Through real-life case studies, we’ll demonstrate how you can use these tactics to thrive amidst algorithm updates.
Are you ready to start thriving in the new era of search?
Join Vince Ramos, SEO Consultant at PageOne Power, as he showcases how to implement white hat link building and user-focused content creation to elevate your SEO strategy.
Sign up now and get actionable insights and inspiration to help you endure future algorithm updates and maintain high search rankings.
Plus, you’ll get the chance to ask Vince your SEO questions in our live Q&A session after the presentation.
Can’t make the live event? No worries – just register here and we’ll send you a recording following the webinar.
In a recent interview, Google CEO Sundar Pichai sidestepped questions about whether the company would provide website owners with more granular data on traffic from AI-generated search previews.
As Google continues to integrate AI overviews, or “AI previews,” into its search results, publishers have grown increasingly concerned about the impact on their click-through rates and overall traffic.
Google could alleviate some concerns by breaking out traffic metrics for AI-generated results separately from traditional search clicks.
However, the company won’t commit to providing that data.
Pichai Dodges Direct Question
When pressed by The Verge on whether Google would commit to providing this data breakdown to publishers, Pichai avoided giving a straight answer.
“It’s a good question for the search team. They think about this at a deeper level than I do,” he said, deflecting responsibility.
The CEO suggested that Google needs to provide a “balance” in its data, arguing that website owners might try to game the system if it provides too many specifics.
“The more we spec it out, then the more people design for that,” he claimed.
Lack Of Transparency Fuels Publisher Frustration
Google’s lack of commitment to transparency will likely frustrate publishers who feel they have a right to know how much of their traffic is affected by Google’s AI implementations.
Google’s AI models are trained on publisher content, and now publishers’ traffic is at stake. For Google to be so elusive about sharing that data breakdown feels disingenuous.
Pichai’s comments come across as tone-deaf to the plight of web publishers, who rely on search traffic to drive ad revenue and sustain their businesses.
Without precise data on how AI previews affect click-through rates, publishers can’t adapt their strategies to maintain visibility.
Antitrust Concerns Loom
Google’s reluctance to share this information also raises questions about anti-competitive practices.
As the dominant search engine, Google holds power over web traffic flow.
By keeping publishers in the dark about AI-driven metrics, the company could be seen as using its market position to unfairly disadvantage content creators.
This issue will likely attract further scrutiny from antitrust regulators, who are already investigating Google for alleged monopolistic behavior in the search market.
Long-Term Effects On Web Ecosystem
If publishers feel they’re not fairly compensated for their content or given the data they need to make informed decisions, it could disincentivize the creation of high-quality, original content.
This could lead to a poorer experience for internet users and less diversity of information online.
As AI becomes more integral to search, Google must find a way to collaborate with publishers and provide them with the insights they need to thrive.
FAQ
How does the introduction of AI previews by Google impact search traffic for publishers?
AI-generated search overviews might draw user attention away from traditional organic search results, leading to fewer clicks on publisher content.
As a result, the transparency and availability of separate traffic metrics for AI-generated results versus traditional search data become crucial for publishers to understand and respond to these changes effectively.
What are the main concerns of publishers regarding Google’s AI data transparency?
Publishers are particularly concerned about the lack of detailed data on traffic from AI-generated search previews. This transparency is vital for them to gauge the impact of AI on their website traffic and ad revenue.
Google’s reluctance to share this breakdown frustrates publishers, as it limits their ability to adapt their strategies to the new search environment.
Why does Google’s CEO believe providing specific data on AI preview traffic could be problematic?
Google CEO Sundar Pichai suggested that offering granular AI preview traffic data might encourage website owners to manipulate the system.
He believes providing detailed metrics could result in publishers designing their content specifically to game Google’s search engine, which may lead to a worse user experience.
What potential long-term impact could Google’s approach to AI search data have on the web ecosystem?
Publishers may produce less content if they aren’t compensated for their content or provided with data to make informed decisions. This could result in a poorer experience online and reduced diversity of information.
In a recent interview, Google CEO Sundar Pichai discussed the company’s implementation of AI in search results and addressed concerns from publishers and website owners about its potential impact on web traffic.
Background On AI In Google Search
Google has been gradually incorporating AI-generated overviews and summaries into its search results.
These AI overviews aim to provide users with quick answers and context upfront on the search page. However, publishers fear this could dramatically reduce website click-through rates.
Pichai Claims AI Drives Traffic
Despite concerns, Pichai maintained an optimistic outlook on how AI will affect the web ecosystem in the long run.
“I remain optimistic. Empirically, what we are seeing throughout these years is that human curiosity is boundless.”
The Google CEO claimed that the company’s internal data shows increased user engagement with AI overviews, including higher click-through rates on links within these previews compared to regular search results.
Pichai stated:
“When you give the context, it also exposes people to various branching off, jumping off, points, and so they engage more. So, actually, this is what drives growth over time.”
Unfortunately, Pichai didn’t provide specific metrics to support this assertion.
Balancing User Experience & Publisher Interests
Pichai claims that Google is attempting to balance meeting user expectations with sending traffic to websites, stating:
“I look at our journey, even the last year through the Search Generative Experience, and I constantly found us prioritizing approaches that would send more traffic while meeting user expectations.
… what’s positively surprising us is that people engage more, and that will lead to more growth over time for high-quality content.”
When pressed on anecdotal evidence of some websites losing significant traffic, Pichai cautioned against drawing broad conclusions from individual cases.
He argued that Google has provided more traffic to the web ecosystem over the past decade.
Pichai believes the sites losing traffic are the “aggregators in the middle.”
He stated:
“From our standpoint, when I look historically, even over the past decade, we have provided more traffic to the ecosystem, and we’ve driven that growth.
Ironically, there are times when we have made changes to actually send more traffic to the smaller sites. Some of those sites that complain a lot are the aggregators in the middle.
So should the traffic go to the restaurant that has created a website with their menus and stuff or people writing about these restaurants? These are deep questions. I’m not saying there’s a right answer.”
Takeaways For Website Owners & SEO Professionals
For those in the SEO community, Pichai’s comments offer insight into Google’s strategy and perspective but should be viewed with a degree of skepticism.
While the CEO painted a rosy picture of AI’s impact, he offered no concrete data to support his claims. Website owners must monitor their analytics closely to assess the real-world effects of AI overviews on their traffic.
As Google continues to roll out AI features in search, the dust is far from settled on this issue.
Pichai’s optimism aside, the true impact of AI on the web ecosystem remains to be seen. For now, publishers and SEOs must stay vigilant, adaptable, and vocal about their concerns in this rapidly shifting landscape.
A new study by Pew Research Center reveals the fleeting nature of online information: 38% of webpages from 2013 are no longer accessible a decade later.
The analysis, conducted in October 2023, examined broken links on government and news websites and in the “References” section of Wikipedia pages.
Among the findings:
23% of news webpages and 21% of government webpages contain at least one broken link
Local-level government webpages, particularly those belonging to city governments, are especially prone to broken links
54% of Wikipedia pages have at least one link in their “References” section pointing to a non-existent page
Social Media Not Immune To Content Disappearance
To investigate the impact of digital decay on social media, Pew Research collected a real-time sample of tweets on X and monitored them for three months.
The study discovered that “nearly one-in-five tweets are no longer publicly visible on the site just months after being posted.”
In 60% of these cases, the original posting account was made private, suspended, or deleted.
In the remaining 40%, the account holder deleted the tweet, but the account still existed.
Certain types of tweets are more likely to disappear than others, with more than 40% of tweets written in Turkish or Arabic no longer visible within three months of posting.
Additionally, tweets from accounts with default profile settings are particularly susceptible to vanishing from public view.
Defining “Inaccessible” Links & Webpages
For the purposes of this report, Pew Research Center defined an inaccessible page as one that no longer exists.
Other definitions, such as changed content or accessibility issues for visually impaired users, were beyond the scope of the research.
The study used a conservative approach, counting pages as inaccessible if they returned one of nine error codes, indicating that the page and/or its host server no longer exist or have become nonfunctional.
Why SEJ Cares
Digital decay raises important questions about the preservation and accessibility of online content for future generations.
Pew Research Center’s study sheds light on the extent of this problem across various online spaces, from government and news websites to social media platforms.
The high rate of link rot and disappearing webpages has implications for anyone who relies on the internet as a reliable source of information.
It poses challenges for citing online sources, as the original content may no longer be accessible in the future.
What This Means For SEO Professionals
This study underscores the need to regularly audit and update old content, as well as consistently monitor broken links and resolve them promptly.
SEO professionals should also consider the impact of digital decay on backlink profiles.
As external links to a website become inaccessible, it can affect the site’s link equity and authority in the eyes of search engines.
Monitoring and diversifying backlink sources can help mitigate the risk of losing valuable links to digital decay.
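As a starting point, here is a minimal Python sketch of the kind of link audit described above, using the requests library. The URL list is a hypothetical placeholder; in practice, you would feed it the outbound links from your own content and the pages on your site that external backlinks point to.

```python
import requests

# Hypothetical URL list: replace with the outbound links from your
# content and the pages on your site that backlinks point to.
URLS_TO_CHECK = [
    "https://example.com/old-blog-post",
    "https://example.com/resources/guide",
]

def check_links(urls, timeout=10):
    """Return (url, reason) pairs for links that appear inaccessible."""
    broken = []
    headers = {"User-Agent": "link-audit-script/1.0"}
    for url in urls:
        try:
            # HEAD is cheaper than GET; fall back to GET if the server rejects it.
            resp = requests.head(url, timeout=timeout,
                                 allow_redirects=True, headers=headers)
            if resp.status_code in (405, 501):
                resp = requests.get(url, timeout=timeout,
                                    allow_redirects=True, headers=headers)
            if resp.status_code >= 400:
                broken.append((url, f"HTTP {resp.status_code}"))
        except requests.RequestException as exc:
            # DNS failures, timeouts, and dead servers all count as inaccessible.
            broken.append((url, str(exc)))
    return broken

if __name__ == "__main__":
    for url, reason in check_links(URLS_TO_CHECK):
        print(f"BROKEN: {url} -> {reason}")
```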
Lastly, the study’s findings on social media content underscore that SEO efforts should focus on driving users back to more stable, owned channels like websites and email lists.
Google’s John Mueller indicated the possibility of changes to sitewide helpful content signals so that new pages may be allowed to rank. But there is reason to believe that even if that change goes through, it may not be enough to help.
Helpful Content Signals
Google’s helpful content signals (aka the Helpful Content Update, or HCU) were originally a site-wide signal when launched in 2022. That meant an entire site could be classified as unhelpful and become unable to rank, regardless of whether some of its pages were helpful.
Recently, the signals associated with the helpful content system were absorbed into Google’s core ranking algorithm, generally changing them to page-level signals, with a caveat. Google’s documentation explains:
“Our core ranking systems are primarily designed to work on the page level, using a variety of signals and systems to understand the helpfulness of individual pages. We do have some site-wide signals that are also considered.”
There are two important takeaways:
There is no longer a single system for helpfulness. It’s now a collection of signals within the core ranking algorithm.
The signals are page-level but there are site-wide signals that can impact the overall rankings.
Some publishers have tweeted that the site-wide effect is preventing new helpful pages from ranking, and John Mueller offered some hope.
If Google follows through with easing the sitewide helpfulness signals so that individual pages can rank, there is reason to believe the change may not help many of the websites that publishers and SEOs believe are suffering from sitewide helpfulness signals.
Publishers Express Frustration With Sitewide Algorithm Effects
One publisher tweeted:
“It’s frustrating when new content is also being penalized without having a chance to gather positive user signals. I publish something it goes straight to page 4 and stays there, regardless of if there are any articles out on the location.”
Someone else made the point that if helpfulness signals are page-level, then in theory the better (helpful) pages should begin ranking, but that’s not happening.
John Mueller Offers Hope
Google’s John Mueller responded to a query about sitewide helpfulness signals suppressing the rankings of new pages created to be helpful and later indicated there may be a change to the way helpfulness signals are applied sitewide.
“Yes, and I imagine for most sites strongly affected, the effects will be site-wide for the time being, and it will take until the next update to see similar strong effects (assuming the new state of the site is significantly better than before).”
Possible Change To Helpfulness Signals
Mueller followed up by saying that the search ranking team is working on a way to surface high-quality pages from sites burdened by strong negative sitewide signals indicative of unhelpful content, which could provide relief to some of those sites.
“I can’t make any promises, but the team working on this is explicitly evaluating how sites can / will improve in Search for the next update. It would be great to show more users the content that folks have worked hard on, and where sites have taken helpfulness to heart.”
Why Changes To Sitewide Signal May Not Be Enough
Google’s Search Console tells publishers when they’ve received a manual action, but it doesn’t tell them when their sites lost rankings due to algorithmic issues like helpfulness signals. Publishers and SEOs can’t truly “know” whether their sites are affected by helpfulness signals. The core ranking algorithm alone contains hundreds of signals, so it’s important to keep an open mind about what may be affecting search visibility after an update.
Here are five examples of changes during a broad core update that can affect rankings:
The way a query is understood could have changed which affects what kinds of sites are able to rank
Quality signals changed
Rankings may change to respond to search trends
A site may lose rankings because a competitor improved their site
Infrastructure may have changed to accommodate more AI on the back end
A lot of things can influence rankings before, during, and after a core algorithm update. If rankings don’t improve then it may be time to consider that a knowledge gap is standing in the way of a solution.
Examples Of Getting It Wrong
For example, a publisher who recently lost rankings correlated the date of their rankings collapse with the announcement of the Site Reputation Abuse update. It’s a reasonable assumption: if rankings drop on the same date as an update, it must be the update.
“@searchliaison feeling a bit lost here. Judging by the timing, we got hit by the Reputation Abuse algorithm. We don’t do coupons, or sell links, or anything else.
Very, very confused. We’ve been stable through all this and continue to re-work/remove older content that is poor.”
They posted a screenshot of the rankings collapse.
Screenshot Showing Search Visibility Collapse
SearchLiaison responded to that tweet by noting that Google is currently enforcing site reputation abuse only through manual actions, meaning that update couldn’t have been the cause.
It’s reasonable to assume that an update that coincides with a ranking issue is related to it. But one can never be 100% sure about the cause of a rankings drop, especially if there’s a knowledge gap about other possible reasons (like the five listed above). This bears repeating: one cannot be certain that a specific signal is the reason for a rankings drop.
In another tweet SearchLiaison remarked about how some publishers mistakenly assumed they had an algorithmic spam action or were suffering from negative Helpful Content Signals.
“I’ve looked at many sites where people have complained about losing rankings and decide they have a algorithmic spam action against them, but they don’t.
“…we do have various systems that try to determine how helpful, useful and reliable individual content and sites are (and they’re not perfect, as I’ve said many times before, anticipating a chorus of “whatabouts…”). Some people who think they are impacted by this, I’ve looked at the same data they can see in Search Console and… not really.”
SearchLiaison, in the same tweet, addressed a person who remarked that getting a manual action is more fair than receiving an algorithmic action, pointing out the inherent knowledge gap that would lead someone to surmise such a thing.
He tweeted:
“…you don’t really want to think “Oh, I just wish I had a manual action, that would be so much easier.” You really don’t want your individual site coming the attention of our spam analysts. First, it’s not like manual actions are somehow instantly processed.”
The point I’m trying to make (and I have 25 years of hands-on SEO experience, so I know what I’m talking about) is to keep an open mind: maybe there’s something else going on that is undetected. Yes, there are such things as false positives, but it’s not always the case that Google is making a mistake; it could be a knowledge gap. That’s why I suspect many people will not experience a lift in rankings if Google makes it easier for new pages to rank. If that happens, again, keep an open mind that maybe something else is going on.
AI can help brands and marketers be more efficient, productive, and fast, but it is not perfect and has drawbacks.
With the rise and adoption of AI into SEO workflows, processes, and tools, SEO pros must take an ethical approach to artificial intelligence.
What exactly does an ethical approach to AI mean?
An ethical approach involves using AI technologies transparently, fairly, and responsibly while respecting user privacy and ensuring the accuracy and integrity of information.
We are all aware that AI output is imperfect and can be inaccurate, biased, or fluffy, which can cause many problems for agencies and marketers that rely on AI to create content.
With the March core update, sites publishing AI content that was not edited, original, or helpful lost a substantial portion of their organic traffic.
Here are some ways to use AI more ethically.
Be Transparent And Provide Disclosure
Do not use generative AI to create content for publishing. If you use generative AI in parts of your process, be fully transparent with the brands you work with about how you use AI in your SEO practices.
Maintain Accuracy And Integrity
If you’re going to use AI, you should take a human-led approach to writing long-form content. Humans should always do the content creation, but AI can be helpful for brainstorming, organizing, rewording, transcription, and reworking content. In each case, outputs must be checked for originality using Copyscape or the tool of your choice.
Additionally, the information must be trustworthy and accurate. With the helpful content update (HCU) incorporated into the March core update, it’s more important than ever to focus on people-first content rather than content that isn’t helpful, useful, or satisfying to the end user’s intent.
Be Original And Useful
With Google focusing on a good user experience and people-first content, we should not rely on AI-generated content because of inadequacies in training data and a lack of originality. AI could be great for compiling a list of notes from people with first-hand experience and pulling them into a cohesive article, for example, but not for producing the list and facts itself, even with fact-checking.
Follow Compliance With Search Engine Guidelines
It’s imperative that we follow search engine guidelines and ethical standards.
AI should not be used to engage in practices like keyword stuffing, cloaking, or creating doorway pages. Instead, it should support the creation of high-quality and useful content.
Ethically using AI in SEO also means considering the broader impact on society. This entails promoting trustworthy, useful content that contributes positively to users’ knowledge and well-being.
Develop Safely & Respect Privacy
If you build your own tools and platforms with AI, ensure you have strong security protocols and practices to prevent causing any harm.
Always assess your technologies before launching them into the production environment to ensure they are safe and secure, and continue monitoring them after they are released to the general public.
LLMs are not secure. It may be necessary to get legal advice before implementing certain types of AI, like generative AI, in processes that include user/customer information. Updating a privacy policy may not be enough.
Never put proprietary or confidential information into a generative AI chatbot like ChatGPT. Most LLMs save user inputs, and that information could be used to generate responses for other users.
Respect Intellectual Property & Originality
One of the biggest issues with AI is intellectual property (IP). If I create some content using ChatGPT, who owns it?
We need to ensure that when AI recommends content, it is original and not taken from elsewhere. This can be problematic because some AI platforms don’t list the source of the information unless you instruct them to do so.
ChatGPT can tell you where the content sources are coming from if you list them in your prompt. For example, I asked ChatGPT to write me a 750-word blog post on the top things to do in NY and list the sources, and it did.
Screenshot from ChatGPT, April 2024
If you’re getting information from ChatGPT, you need to credit the source and ensure you’re not copying other people’s content. Also, setting clear rules for using AI in content creation can help you avoid legal problems and stay fair and honest.
I checked the content that I created in ChatGPT, and according to Copyscape, it is full of similar text.
Screenshot from Copyscape, April 2024
Note: Please keep in mind that asking LLMs to cite sources doesn’t guarantee you’re citing the right content or that the content is original. The best and safest way to avoid accidental plagiarism is for humans to do the research and write the content.
Google Is Not About Content That Is Artificial And Lacking In Originality
With the rapid growth of AI-based tools entering the market, and AI being incorporated into many platforms and daily SEO tasks, it is extremely important to adhere to ethical AI principles to ensure that the use of AI in SEO supports a fair, equitable, and user-focused search ecosystem.
Google has always been about quality, original content that offers value to end users, not content that is fully artificial, thin, duplicative, untrustworthy, or lacking in originality and value.
To compete in today’s ever-changing SERPs, focusing on improving E-E-A-T is more important than ever because it is a quality signal that shows Google and end users that you’re the subject matter expert and authority in your niche.
It’s highly recommended to have thought leaders and experts in your niche create your content and show their expertise on your site.
Additionally, it’s important to focus on user experience and ensure that your site loads quickly, is easy to navigate, and helps users find exactly what they came to your site for.
Google’s John Mueller says the Search team is “explicitly evaluating” how to reward sites that produce helpful, high-quality content when the next core update rolls out.
The comments came in response to a discussion on X about the impact of March’s core update and September’s helpful content update.
In a series of tweets, Mueller acknowledged the concerns, stating:
“I imagine for most sites strongly affected, the effects will be site-wide for the time being, and it will take until the next update to see similar strong effects (assuming the new state of the site is significantly better than before).”
“I can’t make any promises, but the team working on this is explicitly evaluating how sites can / will improve in Search for the next update. It would be great to show more users the content that folks have worked hard on, and where sites have taken helpfulness to heart.”
What Does This Mean For SEO Professionals & Site Owners?
Mueller’s comments confirm Google is aware of critiques about the March core update and is refining its ability to identify high-quality sites and reward them appropriately in the next core update.
For websites, clearly demonstrating an authentic commitment to producing helpful and high-quality content remains the best strategy for improving search performance under Google’s evolving systems.
The Aftermath Of Google’s Core Updates
Google’s algorithm updates, including the September “Helpful Content Update” and the March 2024 update, have far-reaching impacts on rankings across industries.
While some sites experienced surges in traffic, others faced substantial declines, with some reporting visibility losses of up to 90%.
As website owners implement changes to align with Google’s guidelines, many question whether their efforts will be rewarded.
There’s genuine concern about the potential for long-term or permanent demotions for affected sites.
Recovery Pathway Outlined, But Challenges Remain
In a previous statement, Mueller acknowledged the complexity of the recovery process, stating that:
“some things take much longer to be reassessed (sometimes months, at the moment), and some bigger effects require another update cycle.”
Mueller clarified that not all changes would require a new update cycle but cautioned that “stronger effects will require another update.”
While affirming that permanent changes are “not very useful in a dynamic world,” Mueller adds that “recovery” implies a return to previous levels, which may be unrealistic given evolving user expectations.
Despite the challenges, Mueller has offered glimmers of hope for impacted sites, stating:
“Yes, sites can grow again after being affected by the ‘HCU’ (well, core update now). This isn’t permanent. It can take a lot of work, time, and perhaps update cycles, and/but a different – updated – site will be different in search too.”
He says the process may require “deep analysis to understand how to make a website relevant in a modern world, and significant work to implement those changes — assuming that it’s something that aligns with what the website even wants.”
Looking Ahead
Google’s search team is actively working on improving site rankings and addressing concerns with the next core update.
However, recovery requires patience, thorough analysis, and persistent effort.
The best way to spend your time until the next update is to remain consistent and produce the most exceptional content in your niche.
FAQ
How long does it generally take for a website to recover from the impact of a core update?
Recovery timelines can vary and depend on the extent and type of updates made to align with Google’s guidelines.
Google’s John Mueller noted that some changes might be reassessed quickly, while more substantial effects could take months and require additional update cycles.
Google acknowledges the complexity of the recovery process, indicating that significant improvements aligned with Google’s quality signals might be necessary for a more pronounced recovery.
What impact did the March and September updates have on websites, and what steps should site owners take?
The March and September updates had widespread effects on website rankings, with some sites experiencing traffic surges while others faced up to 90% visibility losses.
Publishing genuinely useful, high-quality content is key for website owners who want to bounce back from a ranking drop or maintain strong rankings. Stick to Google’s recommendations and adapt as they keep updating their systems.
To minimize future disruptions from algorithm changes, it’s a good idea to review your whole site thoroughly and build a content plan centered on what your users want and need.
Is it possible for sites affected by core updates to regain their previous ranking positions?
Sites can recover from the impact of core updates, but it requires significant effort and time.
Mueller suggested that recovery might happen over multiple update cycles and involves a deep analysis to align the site with current user expectations and modern search criteria.
While a return to previous levels isn’t guaranteed, sites can improve and grow by continually enhancing the quality and relevance of their content.
Google revealed details of two new crawlers that are optimized for scraping image and video content for “research and development” purposes. Although the documentation doesn’t explicitly say so, it’s presumed that blocking the new crawlers has no impact on rankings.
It should be noted that the data scraped by these crawlers is not explicitly for AI training data; that’s what the Google-Extended crawler is for.
GoogleOther Crawlers
The two new crawlers are variants of Google’s GoogleOther crawler, which launched in April 2023. The original GoogleOther crawler is designated for use by Google product teams for research and development in what are described as one-off crawls, a description that offers clues about what the new variants will be used for.
The purpose of the original GoogleOther crawler is officially described as:
“GoogleOther is the generic crawler that may be used by various product teams for fetching publicly accessible content from sites. For example, it may be used for one-off crawls for internal research and development.”
Two GoogleOther Variants
There are two new GoogleOther crawlers:
GoogleOther-Image
GoogleOther-Video
The new variants are for crawling binary data, which is data that’s not text. HTML is generally stored as text (ASCII or Unicode) files; if it can be viewed in a text editor, it’s a text file. Binary files are files that can’t be opened in a text viewer, such as image, audio, and video files.
The new GoogleOther variants are for image and video content. Google lists user agent tokens for both of the new crawlers, which can be used in a robots.txt file to block them (see the example after the token list below).
1. GoogleOther-Image
User agent tokens:
GoogleOther-Image
GoogleOther
Full user agent string:
GoogleOther-Image/1.0
2. GoogleOther-Video
User agent tokens:
GoogleOther-Video
GoogleOther
Full user agent string:
GoogleOther-Video/1.0
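Based on the tokens above, a robots.txt opt-out might look like the following sketch. Note that because both variants also honor the generic GoogleOther token, a single rule for GoogleOther would block all three crawlers; the rules below target only the new image and video variants.

```
# Block only the new image and video research crawlers
User-agent: GoogleOther-Image
Disallow: /

User-agent: GoogleOther-Video
Disallow: /
```

To opt out of the whole family, including the original crawler, you would instead use one group with User-agent: GoogleOther and Disallow: /.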
Newly Updated GoogleOther User Agent Strings
Google also updated the user agent strings for the regular GoogleOther crawler. For blocking purposes, you can continue using the same user agent token as before (GoogleOther). The new user agent strings are simply the data sent to servers to fully identify the crawler, in particular the technology used, which in this case is Chrome, with the version number periodically updated to reflect the current release (W.X.Y.Z is a placeholder for the Chrome version number in the examples listed below).
The full list of GoogleOther user agent strings:
Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/W.X.Y.Z Mobile Safari/537.36 (compatible; GoogleOther)
Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; GoogleOther) Chrome/W.X.Y.Z Safari/537.36
GoogleOther Family Of Bots
These new bots may show up in your server logs from time to time. This information will help identify them as genuine Google crawlers and will help publishers who want to opt out of having their images and videos scraped for research and development purposes.
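As a rough sketch, here is one way you might flag GoogleOther hits in an access log with Python. It assumes a combined log format in which the user agent is the last quoted field, and it does not verify that a hit genuinely originates from Google, which would be a separate step.

```python
import re

# Matches any member of the GoogleOther family in a user agent string.
GOOGLEOTHER_PATTERN = re.compile(r"GoogleOther(-Image|-Video)?")

def googleother_hits(log_lines):
    """Yield log lines whose user agent identifies a GoogleOther crawler."""
    for line in log_lines:
        # In combined log format, the user agent is the last quoted field.
        quoted_fields = re.findall(r'"([^"]*)"', line)
        if quoted_fields and GOOGLEOTHER_PATTERN.search(quoted_fields[-1]):
            yield line

if __name__ == "__main__":
    # "access.log" is a placeholder path; point it at your own server log.
    with open("access.log", encoding="utf-8") as log_file:
        for hit in googleother_hits(log_file):
            print(hit.rstrip())
```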
Web directories, once tools for discovering websites in the early days of the Internet, have evolved over the past two decades.
While the rise of Google led many to assume that web directories would become obsolete, plenty of these online catalogs have adapted to remain relevant.
This article examines the current state of web directories and explores their value for websites and businesses across various industries.
You’ll find an overview of 20 web directories that continue to attract traffic and offer value to websites and businesses.
These directories range from general directories like Best of the Web (BOTW) and AboutUs to niche-specific platforms like Blogarama for bloggers and local business directories like Google Business Profile, Yelp, and Foursquare.
Do Web Directories Still Exist?
While web directories may not hold the same prominence they once did, they still exist and can offer value.
Web directories have evolved to cater to specific niches, industries, and local markets. They serve as a curated collection of websites, providing a targeted audience with a more focused and relevant browsing experience.
Local business directories such as Google Business Profile, Yelp, and Foursquare have gained importance. These platforms provide valuable information to potential customers and contribute to a business’s online presence and local SEO efforts.
Specialized directories, such as those for blogs, ecommerce, or specific industries, continue to thrive. These directories cater to a particular audience and can provide valuable exposure and referral traffic to websites within their niche.
Do Web Directories Still Have Any Value?
While the SEO value of web directories has diminished, they offer certain benefits to websites and businesses.
Referral traffic: High-quality, niche-specific directories can drive targeted referral traffic to your website.
Local visibility: Directories like Google Business Profile, Yelp, and Foursquare are essential for local businesses. These platforms help companies appear in local search results and on maps, making it easier for potential customers to find them.
Brand exposure: Being listed in reputable directories can help increase your brand’s exposure and visibility online. When users come across your listing in a trusted directory, it can lend credibility to your website and business.
Backlinks: Directory links have lost much of their SEO value, but a few high-quality, relevant links can still contribute to a well-rounded backlink profile. Focus on well-maintained directories that have strict submission guidelines and are relevant to your industry or niche.
Competitor analysis: Directories can be useful tools for competitor analysis. By examining which directories your competitors are listed in, you can identify new opportunities for your website and gain insights into their marketing strategies.
Where Do Web Directories Stand Today?
Google’s algorithm is complex.
While links remain among the top ranking signals, Google no longer views all links equally.
Links from a web directory listing are much less influential than a highly relevant contextual link from a high-authority site in your niche.
What’s a marketer to do?
Move beyond viewing web directories as a source for links.
Instead, view directories as a source of traffic and trust.
Any business with a local presence needs to maintain its local citations with a consistent NAP (name, address, and phone number), but web directories won’t help your SEO beyond that.
The real returns come from the credibility and traffic directories drive to your business site.
As you search for web directories, keep those two criteria in mind.
Consider these questions before you start filling out your listing:
Is this a reputable site? Put another way, if a customer saw me on this site, would they view my business as more – or less – legitimate?
Is my target audience likely to visit this site? If not, it’s probably not worth listing your business.
Now, let’s get into what you came here for: the web directories that are still relevant today.
Web Directories That Still Have Value Today
To remain relevant, many web directories of yore have transitioned beyond basic listings to detailed review sites.
Many of the sites listed below reflect this trend.
We could have included many more on this list – like Jasmine Directory, Brownbook, and Bloggapedia – but based on their current traffic numbers (or lack thereof), we’re not sure they’re worth the effort anymore.
Instead, this section focuses only on sites that are more than a mere citation opportunity for your business.
Useful Web Directories For Any Kind Of Website
1. BOTW
Best of the Web (BOTW) is a well-established web directory that has been around since 1994.
One of BOTW’s unique aspects is that it charges a fee for listings. While this may seem like a drawback, it helps maintain the directory’s quality by discouraging spam and low-quality submissions.
The directory’s human-edited listings and strict submission guidelines ensure that only high-quality websites are included, which can lend credibility to your site by association.
BOTW’s sub-directories for blogs and local businesses make it an attractive option for those niches.
The local business sub-directory, local.botw.org, is valuable for small businesses looking to improve their local SEO.
For bloggers, the blogs.botw.org sub-directory can help attract targeted traffic and increase their blog’s visibility.
2. AboutUs
Screenshot from AboutUs, April 2024
AboutUs is a unique web directory that has evolved from its original purpose as a business domain directory. Today, AboutUs allows websites of all types to be submitted and discussed.
One of AboutUs’s standout features is its wiki-style format, which allows users to contribute information about the websites listed in the directory. This collaborative approach helps to keep the directory up-to-date.
Being listed on AboutUs can help improve brand exposure. Companies can provide detailed information about their history, mission, and products or services.
Bloggers and content creators can also benefit from being listed on AboutUs. The directory’s diverse categories make it easy for users to discover new blogs and websites related to their interests.
For Blogs Only: One Web Directory To Rule Them All
3. Blogarama
Screenshot from Blogarama, April 2024
Blogarama is dedicated solely to blog listings, making it a niche-specific platform for bloggers looking to attract readers.
One of the standout features of Blogarama is its active management by the site’s administrators. The directory is regularly updated, with new blog listings and inactive or deleted blogs removed.
Another helpful feature of Blogarama is its RSS feed integration. Bloggers can provide their RSS feed URL during submission, automatically allowing the directory to update their listing with the latest posts.
The directory’s niche focus and diverse categories make it an attractive option for bloggers in various industries and with different target audiences.
Relevant Web Directories For Local Businesses
4. Google Business Profile
Google Business Profile (formerly known as Google My Business) is a free tool provided by Google that allows businesses to manage their online presence across Search and Maps.
One key benefit of Google Business Profile is its impact on local SEO. By creating and verifying a Business Profile, businesses can improve their chances of appearing in Google’s Local Pack, Local Finder, and Google Maps results.
Google Business Profile is particularly valuable for small and medium-sized businesses with a local focus, as it helps level the playing field and compete with larger, more established brands.
5. Bing Places
Screenshot from Bing Places, April 2024
Bing Places, Microsoft’s equivalent to Google Business Profile, allows businesses to manage their online presence on the Bing search engine.
Bing Places provides detailed analytics and insights, allowing businesses to track the performance of their listings, view customer interactions, and understand how customers find them online.
While Bing’s market share is less than Google’s, it has a significant user base, particularly among desktop users and older demographics.
Bing Places can be valuable for businesses targeting these audiences or operating in regions where Bing has a higher market share, such as the United States, the United Kingdom, and Canada.
6. Facebook
Facebook Pages offers a platform for companies to establish an online presence on the social network.
These Pages allow businesses to connect with their target audience, share content, and utilize Facebook’s advertising tools.
The platform’s options allow targeted ads based on user demographics, interests, and behaviors. The “Insights” feature also offers data on page performance and audience engagement.
Other features available on Facebook Business Pages include the “Shop” section for product sales, appointment booking, event creation, and customer support through Messenger.
7. Yelp
Screenshot from Yelp, April 2024
Yelp connects consumers with local businesses. Its collection of user-generated reviews provides insights and social proof for potential customers, influencing their decision-making.
Paid advertising options like Yelp Ads allow businesses to enhance visibility and attract potential customers.
Yelp’s filter algorithm determines which reviews are displayed based on authenticity and reliability. While this can occasionally frustrate businesses, it aims to protect the platform’s integrity for users.
8. Foursquare
Screenshot from Foursquare, April 2024
Foursquare is a local search and discovery platform that offers personalized recommendations based on user-generated tips, ratings, and machine learning algorithms.
The platform’s audience consists of tech-savvy urban users, particularly millennials, who actively seek new experiences and trust peer recommendations.
Foursquare has a strong presence in the food and beverage, nightlife, and travel industries. It’s an essential platform for restaurants, bars, cafes, and hotels looking to reach an experience-driven audience.
Additionally, Foursquare offers a suite of location data products and services for businesses and developers, including tools for location-based advertising, audience targeting, and foot traffic analysis.
9. Yellow Pages
Screenshot from The Real Yellow Pages, April 2024
Yellow Pages, now primarily known as YP.com, is an online directory connecting consumers with local businesses.
Yellow Pages is handy for businesses targeting a local audience, as its users often have high purchase intent. Paid advertising options are also available to enhance visibility.
Yellow Pages partners with other online platforms, such as Google, to expand the reach of its listings. The platform also provides digital marketing services to help businesses improve their online presence beyond the Yellow Pages platform.
10. Chamber Of Commerce
A Chamber of Commerce is an organization that advocates for businesses in a specific area.
By joining a Chamber, business owners and professionals can access networking opportunities and connect with potential partners.
The audience for Chambers of Commerce primarily consists of local business owners and professionals from various industries who are invested in their area’s economic success and growth.
While Chambers provide valuable resources and support, business owners must actively work to leverage these opportunities for success.
11. Hotfrog
Screenshot from Hotfrog, April 2024
Hotfrog helps small and medium-sized businesses connect with potential customers and increase online visibility.
The platform offers free business listings and paid advertising options, such as featured listings and banner ads.
User-generated content, such as customer reviews and ratings, is encouraged on Hotfrog to build credibility and trust for businesses.
12. Superpages
Screenshot from Superpages, April 2024
Superpages helps connect consumers with local businesses across the United States.
Originally a print Yellow Pages directory, it has evolved into a comprehensive digital platform.
Superpages’ user base consists of consumers searching for local businesses and services across various industries, including home services, automotive, healthcare, dining, and professional services.
The platform prioritizes user-friendly experience through tools like detailed maps, directions, and the ability to save or share listings.
It also emphasizes customer reviews and ratings, which build credibility and trust for businesses while providing valuable feedback for improvement.
13. MerchantCircle
Screenshot from MerchantCircle, April 2024
MerchantCircle helps small and medium-sized businesses connect with local customers and nearby businesses.
MerchantCircle’s diverse user base comprises SMBs across various industries seeking to connect with local customers and businesses.
The platform fosters these connections through discussion forums, content sharing, and collaborative marketing opportunities for members of local business communities.
Uniquely, MerchantCircle emphasizes content creation, encouraging businesses to share blog posts, articles, and other materials to establish expertise and build trust.
14. Better Business Bureau
Screenshot from Better Business Bureau, April 2024
One of the BBB’s primary functions is to provide ratings that help consumers make informed decisions.
The BBB assigns businesses a letter grade (A+ through F) based on factors such as complaint history, time in business, and adherence to BBB standards. While a high rating can be a positive signal, the absence of a rating or a lower rating does not necessarily indicate a problem with the business.
Businesses can become accredited by the BBB by undergoing an evaluation process and agreeing to adhere to the BBB’s Code of Business Practices, signaling their commitment to ethical and transparent practices.
The BBB’s audience consists of consumers seeking information, guidance, and assistance in resolving business disputes. It also serves businesses seeking accreditation, dispute resolution services, and resources for best practices.
15. B2B Yellow Pages
Screenshot from B2B Yellowpages, April 2024
B2B Yellow Pages connects businesses with suppliers, partners, and service providers across various industries.
It offers an extensive database of business listings and allows companies to create detailed profiles showcasing their offerings and expertise.
The platform caters to decision-makers, procurement professionals, and business owners. It enables businesses to research potential suppliers and identify suitable partners.
B2B Yellow Pages is particularly valuable for small and medium-sized businesses looking to expand their reach and connect with new clients.
The platform facilitates connections and business interactions through messaging systems, request-for-quote functionality, and lead-generation tools. Additionally, it provides resources, such as articles and guides, to help companies stay informed about industry trends and best practices.
16. Nextdoor
Screenshot from Nextdoor, April 2024
Nextdoor is a private social networking platform designed to connect residents within specific neighborhoods and communities.
It emphasizes location-based networking, where users must verify their address to join and interact with actual neighbors.
Highly localized and community-focused businesses can benefit from having a presence on Nextdoor by actively engaging with residents, sharing relevant content, and building relationships.
17. eLocal
Screenshot from eLocal, April 2024
eLocal is an online platform that connects consumers with local businesses and service providers. It simplifies finding and hiring trusted professionals for various needs.
Consumers provide detailed information about their specific needs and location, which is used to match them with the most relevant and qualified service providers.
This approach helps businesses connect with consumers who are more likely to convert into actual customers.
18. Dexknows
Dexknows provides a platform for consumers to research and compare options based on user-generated reviews and ratings.
Businesses can respond to reviews, address concerns, and showcase their commitment to customer satisfaction.
The platform integrates with major search engines and directories, offering businesses additional features like photos, videos, and special offers to enhance their listings and attract more customers.
19. Alignable
Screenshot from Alignable, April 2024
Alignable aims to create a supportive network where entrepreneurs can share advice, resources, and opportunities.
One key feature is its focus on local networking. Users create profiles and connect with other local small business owners.
The platform caters to businesses across industries, from local retailers and service providers to home-based businesses and solopreneurs.
In addition to networking, Alignable provides resources like discussion forums, articles, and guides.
20. Local.com
Screenshot from Local.com, April 2024
Local.com is an online directory that helps users find local businesses, services, and events in their city.
In addition to its directory, Local.com offers a blog section with articles and guides on topics related to local living.
For businesses, Local.com allows them to claim and manage their listings on the platform, ensuring accurate and up-to-date information.
The platform partners with Yext, enabling businesses to manage their listings across multiple online directories.
What Else?
Beyond the directories listed above, additional niche directories with high traffic may be pertinent to your industry, like Avvo for attorneys, Thumbtack for local contractors, or Porch for home improvement professionals.
You can find an excellent list of these, helpfully organized by industry and domain authority, on BrightLocal.com.
There are also online services, notably Moz Local and Yext, that will create, update, and maintain your local citations across dozens of online directories.
A listing on many of these directories will be a citation for citation’s sake, but these services will include big names like Yahoo, Yelp, and others on our list.
Working with one of these services can significantly speed up adding your website (and take the work off your plate), which is why they aren’t free.
However, depending on how many websites you manage, they can be worth it.
Ways You Can Still Benefit From Web Directories
While web directories may have lost some of their former prominence, this exploration demonstrates they can still play a valuable role in a well-rounded online marketing strategy.
A few key takeaways include:
Don’t Overlook Specialized Niche Directories
As general web directories have waned, many niche-specific directories focused on blogs, e-commerce, local businesses, and more have emerged to serve targeted audiences.
Tapping into these specialized, relevant directories can help drive highly qualified traffic.
User Reviews Build Trust And Credibility
Many of today’s most valuable web directories have evolved into robust review platforms where user-generated feedback is paramount.
Actively managing your listings and responding to reviews on these sites is crucial for building reputation and credibility.
Leverage Directories For Local Marketing Visibility
The importance of local business directories like Google Business Profile and Yelp in enhancing local SEO and visibility to nearby customers has been reinforced.
An optimized, consistent listing presence is essential.
Quality Over Quantity For Backlinks
While overemphasizing web directories for link building is ill-advised, a few carefully selected, topically relevant directory backlinks can provide value as part of a natural link profile.
Marketers must approach web directories today with a more nuanced, focused strategy.
Final Thoughts On Web Directories
As you can see, there are still directories that provide value. If a directory receives traffic from your target audience, is relevant to your website, and maintains quality listings, it’s a good candidate for your backlink profile.
Local businesses may also find relevant directories in local newspapers, magazines, and on business websites. They can be good candidates if the directories are highly relevant to your website and receive traffic from your target audience.
Look at the page where your website would be listed and decide if you’re happy to be alongside the other websites on the page.
Follow these tips, and you’ll choose the most valuable directories for your business.