Archive

SEO

What Is Personal Branding? Here’s Why It’s So Important via @sejournal, @AdamHeitzman

Personal branding lets you take charge of your digital footprint, ensuring that what people find when they search for you online is accurate, positive, and aligned with your professional goals.
Examples Of Personal Branding That Work
Now that you understand the advantages of developing a personal brand, let’s look at a couple of real-life examples of professionals who have nailed their branding and reaped the rewards from it.
David Perell
Known as “The Writing Guy,” David Perell has built a strong personal brand around his expertise in online writing.
He has harnessed the power of the internet to inspire and coach thousands of people to improve their writing, build an online audience, and leverage that audience to propel their careers.
Thanks to his popular course “Write of Passage,” his prolific social media output, and his frequent podcast appearances, David has successfully positioned himself as one of the world’s leading proponents of the power of online writing.
Image from perell.com, April 2024
Marie Forleo
Marie Forleo is an entrepreneur, author, and the creator of MarieTV, an award-winning web show that helps viewers realize their potential in business and in their personal lives.
Her personal brand revolves around the philosophy that anyone can lead a dream life if they’re willing to put in the effort. The title of her book, “Everything Is Figureoutable,” has become a mantra for personal growth and resilience, emphasizing that all problems can be solved with enough determination.
Marie has successfully used her platform to inspire millions with her practical advice, engaging personality, and unwavering belief in the potential of her viewers.
Image from YouTube, April 2024
6 Steps To Build Your Own Personal Brand
So, what should you do to develop a personal brand?
Here are the six key steps to successful personal branding.
1. Define Your Brand Identity
The first step is all about self-reflection and clarity.
Your goal here is to figure out what you stand for and how you want to be perceived.
Identify what makes you unique – this could be a combination of your distinctive talents, perspectives, values, and passions. Consider your career achievements, distinguishing personality traits, and any notable challenges you’ve overcome that shape who you are today.
Remember to stay true to your authentic self, not just what you think others want to see. Being genuine will help you connect more deeply with your audience, shaping a brand identity that is both relatable and trustworthy.
Plus, being yourself is much less work than pretending to be someone else!
2. Determine Your Target Audience
Next, you need to clarify who your personal brand is speaking to.
Your target audience could be potential employers, clients in a specific industry, a professional community, or peers who share similar interests to you.
You should understand what they care about and their challenges so you can tailor your content and messaging to align with their specific needs.
3. Develop A Personal Brand Statement
A personal brand statement is a succinct description of who you are, what you do, who you serve, and why it matters.
Think of it as your professional tagline.
It should be compelling and memorable, encapsulating your unique value proposition.
For example, if you’re a software developer with a focus on educational technology, your personal brand statement might be something like: “Designing edtech to empower learners everywhere.”
A solid brand statement not only helps focus your communication across different platforms but is also a powerful tool that can differentiate you from your peers and competitors.
4. Optimize Your Online Presence
Once you’ve laid the groundwork for your personal brand, it’s time to update your digital footprint accordingly.
Make sure your social media profiles on platforms like LinkedIn and your personal website are polished, professional, and aligned with your brand identity. Consider removing any old posts or content pieces that don’t reflect the image you want to project.
It’s also worth investing in new professional headshots, a logo for your brand, and a consistent color scheme and design elements across your platforms. Visual consistency helps reinforce your brand identity and makes you easily recognizable to your audience.
5. Create And Share Valuable Content
Posting high-quality content is essential for establishing your authority and amplifying your message.
You can bolster your reputation and expand your reach by creating and sharing insightful blog posts, social media content, and videos, as well as appearing on other creators’ podcasts and YouTube channels.
Your content should not only reflect your professional insights but also your unique personality and perspectives.
Also, to build your email list, it’s a good idea to offer audience members a free, valuable resource on your website (like an ebook, webinar, or online course) in exchange for their email address.
6. Network And Engage With Your Community
Finally, building and nurturing a professional network is critical.
In addition to connecting with people online, you should actively engage with their content, contribute to relevant industry conversations, and participate in both virtual and in-person events related to your field.
Offering your expertise by answering questions, sharing insights, and providing valuable feedback solidifies your reputation as an approachable and knowledgeable thought leader.
By investing time in building relationships and promoting the work and efforts of your peers, you’ll foster more goodwill around your brand, which could open up new doors for you down the line.
Personal Branding Should Be Important For Everyone

Keyword research

How AI Improves Keyword Research

Artificial intelligence can reveal content gaps and opportunities to improve organic search rankings. It is also helpful for keyword research.
Traditional keyword tools extend core terms with long-tail phrases. AI tools discover new terms by identifying related queries with the same search intent but not necessarily the same words.
What follows are AI tools for keyword discovery. Push these suggestions through traditional keyword research tools to discover long-tail opportunities and additional insights such as search volume and keyword difficulty.
I tested each tool with the same core phrase: “take text from image.”
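None of the tools below expose their internals, but the underlying idea is easy to prototype against any general-purpose LLM API. Here is a minimal sketch using the OpenAI Python SDK; the model name, prompt wording, and output parsing are illustrative assumptions, not anything these tools actually use:

```python
# pip install openai
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def suggest_keywords(seed: str, n: int = 15) -> list[str]:
    """Ask an LLM for queries that share the seed phrase's search intent."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        messages=[{
            "role": "user",
            "content": f"List {n} search queries with the same intent as "
                       f"'{seed}'. One query per line, no numbering.",
        }],
    )
    lines = response.choices[0].message.content.splitlines()
    return [line.strip() for line in lines if line.strip()]

print(suggest_keywords("take text from image"))
```

As with the tools below, treat the output as candidates only: push them through a traditional keyword research tool to attach search volume and difficulty.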
SEO.ai
SEO.ai offers a free AI-powered tool to generate keyword suggestions. For “take text from image,” it produced the following variations.

text from image services
image to text conversion
OCR services
image transcription services
document digitization services
scanned documents to text
photo to text service
handwriting to text conversion
image processing text extraction
OCR text recognition
convert image to editable text
image text analysis
image to word conversion

Junia
Junia provides free AI-powered research and complementary data from Google Ads’ Keyword Planner on competition, monthly search volume, and cost-per-click. Junia’s suggestions were the most relevant to my initial phrase. I’ve reproduced them for this article.

News

Google Declares It The “Gemini Era” As Revenue Grows 15% via @sejournal, @MattGSouthern

Alphabet Inc., Google’s parent company, announced its first quarter 2024 financial results today.
While Google reported double-digit growth in key revenue areas, the focus was on its AI developments, dubbed the “Gemini era” by CEO Sundar Pichai.
The Numbers: 15% Revenue Growth, Operating Margins Expand
Alphabet reported Q1 revenues of $80.5 billion, a 15% increase year-over-year, exceeding Wall Street’s projections.
Net income was $23.7 billion, with diluted earnings per share of $1.89. Operating margins expanded to 32%, up from 25% in the prior year.
Ruth Porat, Alphabet’s President and CFO, stated:
“Our strong financial results reflect revenue strength across the company and ongoing efforts to durably reengineer our cost base.”
Google’s core advertising units, such as Search and YouTube, drove growth. Google advertising revenues hit $61.7 billion for the quarter.
The Cloud division also maintained momentum, with revenues of $9.6 billion, up 28% year-over-year.
Pichai highlighted that YouTube and Cloud are expected to exit 2024 at a combined $100 billion annual revenue run rate.
Generative AI Integration in Search
Google experimented with AI-powered features in Search Labs before recently introducing AI overviews into the main search results page.
Regarding the gradual rollout, Pichai states:
“We are being measured in how we do this, focusing on areas where gen AI can improve the Search experience, while also prioritizing traffic to websites and merchants.”
Pichai reported that Google’s generative AI features have already served billions of queries:
“We’ve already served billions of queries with our generative AI features. It’s enabling people to access new information, to ask questions in new ways, and to ask more complex questions.”
Google reports increased Search usage and user satisfaction among those interacting with the new AI overview results.
The company also highlighted its “Circle to Search” feature on Android, which allows users to circle objects on their screen or in videos to get instant AI-powered answers via Google Lens.
Reorganizing For The “Gemini Era”
As part of the AI roadmap, Alphabet is consolidating all teams building AI models under the Google DeepMind umbrella.
Pichai revealed that, through hardware and software improvements, the company has reduced machine costs associated with its generative AI search results by 80% over the past year.
He states:
“Our data centers are some of the most high-performing, secure, reliable and efficient in the world. We’ve developed new AI models and algorithms that are more than one hundred times more efficient than they were 18 months ago.”
How Will Google Make Money With AI?
Alphabet sees opportunities to monetize AI through its advertising products, Cloud offerings, and subscription services.
Google is integrating Gemini into ad products like Performance Max. The company’s Cloud division is bringing “the best of Google AI” to enterprise customers worldwide.
Google One, the company’s subscription service, surpassed 100 million paid subscribers in Q1 and introduced a new premium plan featuring advanced generative AI capabilities powered by Gemini models.
Future Outlook
Pichai outlined six key advantages positioning Alphabet to lead the “next wave of AI innovation”:

Research leadership in AI breakthroughs like the multimodal Gemini model
Robust AI infrastructure and custom TPU chips
Integrating generative AI into Search to enhance the user experience
A global product footprint reaching billions
Streamlined teams and improved execution velocity
Multiple revenue streams to monetize AI through advertising and cloud

With upcoming events like Google I/O and Google Marketing Live, the company is expected to share further updates on its AI initiatives and product roadmap.

Featured Image: Sergei Elagin/Shutterstock

News

Google Stresses The Need To Fact Check AI-Generated Content via @sejournal, @MattGSouthern

In a recent episode of Google’s Search Off The Record podcast, team members got hands-on with Gemini to explore creating SEO-related content.
However, their experiment raised concerns over factual inaccuracies when relying on AI tools without proper vetting.
The discussion involved Lizzi Harvey, Gary Illyes, and John Mueller taking turns utilizing Gemini to write sample social media posts on technical SEO concepts.
As they analyzed Gemini’s output, Illyes highlighted a limitation shared by all AI tools:
“My bigger problem with pretty much all generative AI is the factuality – you always have to fact check whatever they are spitting out. That kind of scares me that now we are just going to read it live, and maybe we are going to say stuff that is not even true.”
Outdated SEO Advice Exposed
The concerns stemmed from an AI-generated tweet suggesting using rel=”prev/next” for pagination – a technique that Google has deprecated.
Gemini suggested publishing the following tweet:
“Pagination causing duplicate content headaches? Use rel=prev, rel=next to guide Google through your content sequences. #technicalSEO, #GoogleSearch.”
Harvey immediately identified the advice as outdated. Mueller confirmed that rel=prev and rel=next is no longer supported:
“It’s gone. It’s gone. Well, I mean, you can still use it. You don’t have to make it gone. It’s just ignored.”
Earlier in the podcast, Harvey warned inaccuracies could result from outdated training data information.
Harvey stated:
“If there’s enough myth circulating or a certain thought about something or even outdated information that has been blogged about a lot, it might come up in our exercise today, potentially.”
Sure enough, it took only a short time for outdated information to come up.
Human Oversight Still Critical
While the Google Search Relations team saw the potential for AI-generated content, their discussion stressed the need for human fact-checking.
Illyes’ concerns reflect the broader discourse around responsible AI adoption. Human oversight is necessary to prevent the spread of misinformation.
As generative AI use increases, remember that its output can’t be blindly trusted without verification from subject matter experts.
Why SEJ Cares
While AI-powered tools can potentially aid in content creation and analysis, as Google’s own team illustrated, a healthy degree of skepticism is warranted.
Blindly deploying generative AI to create content can result in publishing outdated or harmful information that could negatively impact your SEO and reputation.
Hear the full podcast episode below:
[embedded content]

FAQ

How can inaccurate AI-generated content affect my SEO efforts?

Using AI-generated content for your website can be risky for SEO because the AI might include outdated or incorrect information.
Search engines like Google favor high-quality, accurate content, so publishing unverified AI-produced material can hurt your website’s search rankings. For example, if the AI promotes outdated practices like using the rel=”prev/next” tag for pagination, it can mislead your audience and search engines, damaging your site’s credibility and authority.
It’s essential to carefully fact-check and validate AI-generated content with experts to ensure it follows current best practices.

How can SEO and content marketers ensure the accuracy of AI-generated output?

To ensure the accuracy of AI-generated content, companies should:

Have a thorough review process involving subject matter experts
Have specialists check that the content follows current guidelines and industry best practices
Fact-check any data or recommendations from the AI against reliable sources
Stay updated on the latest developments to identify outdated information produced by AI

Featured Image: Screenshot from YouTube.com/GoogleSearchCentral, April 2024. 

Google Patents & Research Papers

Google’s New Infini-Attention And SEO via @sejournal, @martinibuster

Google has published a research paper on a new technology called Infini-attention that allows it to process massively large amounts of data with “infinitely long contexts” while also being easy to insert into other models to vastly improve their capabilities.
That last part should be of interest to those who follow Google’s algorithm. Infini-attention is plug-and-play, which means it’s relatively easy to insert into other models, including those in use by Google’s core algorithm. The part about “infinitely long contexts” may have implications for how some of Google’s search systems work.
The name of the research paper is: Leave No Context Behind: Efficient Infinite Context Transformers with Infini-attention
Memory Is Computationally Expensive For LLMs
Large Language Models (LLMs) are limited in how much data they can process at one time because computational complexity and memory usage can spiral upward significantly. Infini-attention gives the LLM the ability to handle longer contexts while keeping down the memory and processing power needed.
The research paper explains:
“Memory serves as a cornerstone of intelligence, as it enables efficient computations tailored to specific contexts. However, Transformers …and Transformer-based LLMs …have a constrained context-dependent memory, due to the nature of the attention mechanism.
Indeed, scaling LLMs to longer sequences (i.e. 1M tokens) is challenging with the standard Transformer architectures and serving longer and longer context models becomes costly financially.”
And elsewhere the research paper explains:
“Current transformer models are limited in their ability to process long sequences due to quadratic increases in computational and memory costs. Infini-attention aims to address this scalability issue.”
The researchers hypothesized that Infini-attention can scale to handle extremely long sequences with Transformers without the usual increases in computational and memory resources.
Three Important Features
Google’s Infini-Attention solves the shortcomings of transformer models by incorporating three features that enable transformer-based LLMs to handle longer sequences without memory issues and use context from earlier data in the sequence, not just data near the current point being processed.
The features of Infini-Attention

Compressive Memory System
Long-term Linear Attention
Local Masked Attention

Compressive Memory System
Infini-Attention uses what’s called a compressive memory system. As more data is input (as part of a long sequence of data), the compressive memory system compresses some of the older information in order to reduce the amount of space needed to store the data.
Long-term Linear Attention
Infini-attention also uses what’s called “long-term linear attention mechanisms,” which enable the LLM to process data from earlier in the sequence being processed, allowing it to retain context. That’s a departure from standard transformer-based LLMs.
This is important for tasks where the context exists on a larger plane of data. It’s like being able to discuss an entire book, all of its chapters, and explain how the first chapter relates to another chapter closer to the end of the book.
Local Masked Attention
In addition to the long-term attention, Infini-attention also uses what’s called local masked attention. This kind of attention processes nearby (localized) parts of the input data, which is useful for responses that depend on the closer parts of the data.
Combining the long-term and local attention helps solve the problem of transformers being limited in how much input data they can remember and use for context.
The researchers explain:
“The Infini-attention incorporates a compressive memory into the vanilla attention mechanism and builds in both masked local attention and long-term linear attention mechanisms in a single Transformer block.”
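To make the three features concrete, here is a minimal sketch of a single Infini-attention segment step, loosely following the paper’s equations (the ELU+1 feature map, a linear-attention memory update, and a learned scalar gate). It is a single-head simplification for illustration, not the paper’s implementation; the tensor shapes and toy usage at the end are assumptions:

```python
import torch
import torch.nn.functional as F

def sigma(x):
    # ELU+1 feature map used by the paper's linear attention
    return F.elu(x) + 1.0

def infini_attention_segment(q, k, v, M, z, beta):
    """One segment step. q, k, v: (seq, d); M: (d, d) compressive memory;
    z: (d,) normalization term; beta: learned scalar gate."""
    d = q.size(-1)

    # 1) Local masked (causal) dot-product attention within the segment.
    scores = (q @ k.T) / d ** 0.5
    causal = torch.triu(torch.ones_like(scores, dtype=torch.bool), diagonal=1)
    a_local = torch.softmax(scores.masked_fill(causal, float("-inf")), -1) @ v

    # 2) Long-term linear attention: retrieve context from the compressive memory.
    q_f = sigma(q)
    a_mem = (q_f @ M) / (q_f @ z).clamp(min=1e-6).unsqueeze(-1)

    # 3) Compress this segment's keys/values into memory for later segments.
    k_f = sigma(k)
    M = M + k_f.T @ v
    z = z + k_f.sum(dim=0)

    # 4) A sigmoid gate mixes long-term retrieval with local attention.
    g = torch.sigmoid(beta)
    return g * a_mem + (1 - g) * a_local, M, z

# Toy usage: feed segments one after another, carrying M and z across them.
d, seg_len = 64, 128
M, z, beta = torch.zeros(d, d), torch.zeros(d), torch.tensor(0.0)
for _ in range(4):  # four segments share one compressive memory
    q, k, v = (torch.randn(seg_len, d) for _ in range(3))
    out, M, z = infini_attention_segment(q, k, v, M, z, beta)
```

Because the memory M is a fixed-size matrix, the cost per segment stays constant no matter how many earlier segments have been compressed into it, which is the point of the design.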
Results Of Experiments And Testing
Infini-attention was tested against other models across multiple benchmarks involving long input sequences, such as long-context language modeling, passkey retrieval, and book summarization. Passkey retrieval is a test where the language model has to retrieve specific data from within an extremely long text sequence.
List of the three tests:

Long-context Language Modeling
Passkey Test
Book Summary

Long-Context Language Modeling And The Perplexity Score
The researchers write that Infini-attention outperformed the baseline models, and that increasing the training sequence length brought even further improvements in the perplexity score. Perplexity is a metric that measures language model performance, with lower scores indicating better performance.
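As a quick illustration of the metric itself: perplexity is the exponential of the average per-token cross-entropy loss, so a score of about 2.2 means the model is roughly as uncertain as choosing between two equally likely tokens at each step. A toy calculation (the loss values are made up):

```python
import math

per_token_nll = [0.85, 0.70, 0.82]   # made-up cross-entropy losses per token
perplexity = math.exp(sum(per_token_nll) / len(per_token_nll))
print(round(perplexity, 2))           # ~2.2, lower is better
```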
The researchers shared their findings:
“Infini-Transformer outperforms both Transformer-XL …and Memorizing Transformers baselines while maintaining 114x less memory parameters than the Memorizing Transformer model with a vector retrieval-based KV memory with length of 65K at its 9th layer. Infini-Transformer outperforms memorizing transformers with memory length of 65K and achieves 114x compression ratio.
We further increased the training sequence length to 100K from 32K and trained the models on Arxiv-math dataset. 100K training further decreased the perplexity score to 2.21 and 2.20 for Linear and Linear + Delta models.”
Passkey Test
The passkey test is where a random number is hidden within a long text sequence, and the model must fetch it. The passkey is hidden near the beginning, middle, or end of the long text. The model was able to solve the passkey test up to a length of 1 million tokens.
“A 1B LLM naturally scales to 1M sequence length and solves the passkey retrieval task when injected with Infini-attention. Infini-Transformers solved the passkey task with up to 1M context length when fine-tuned on 5K length inputs. We report token-level retrieval accuracy for passkeys hidden in a different part (start/middle/end) of long inputs with lengths 32K to 1M.”
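The benchmark itself is simple to construct: bury a random key in a long stretch of filler text and ask the model to recall it. A sketch of a common construction (the filler sentence and placement are illustrative):

```python
import random

filler = "The grass is green. The sky is blue. The sun is yellow. " * 5000
passkey = random.randint(10000, 99999)
pos = random.randrange(len(filler))

haystack = (filler[:pos]
            + f" The pass key is {passkey}. Remember it. "
            + filler[pos:])
prompt = haystack + "\nWhat is the pass key?"
# A model passes if it answers `passkey` despite the surrounding noise.
```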
Book Summary Test
Infini-attention also excelled at the book summary test, outperforming top benchmarks and achieving new state-of-the-art (SOTA) performance levels.
The results are described:
“Finally, we show that a 8B model with Infini-attention reaches a new SOTA result on a 500K length book summarization task after continual pre-training and task fine-tuning.
…We further scaled our approach by continuously pre-training a 8B LLM model with 8K input length for 30K steps. We then fine-tuned on a book summarization task, BookSum (Kry´sci´nski et al., 2021) where the goal is to generate a summary of an entire book text.
Our model outperforms the previous best results and achieves a new SOTA on BookSum by processing the entire text from book. …There is a clear trend showing that with more text provided as input from books, our Infini-Transformers improves its summarization performance metric.”
Implications Of Infini-Attention For SEO
Infini-attention is a breakthrough in modeling long- and short-range attention with greater efficiency than previous models. It also supports “plug-and-play continual pre-training and long-context adaptation by design,” which means it can easily be integrated into existing models.
Lastly, the “continual pre-training and long-context adaptation” makes it exceptionally useful for scenarios where it’s necessary to constantly train the model on new data. This is especially interesting because it may make the technology useful on the back end of Google’s search systems, particularly where it’s necessary to analyze long sequences of information and understand how a part near the beginning of a sequence relates to another part closer to the end.
Other articles focused on the “infinitely long inputs” this model is capable of. What’s relevant to SEO is that ability to handle huge inputs and “leave no context behind,” and how some of Google’s systems might work if Google adapted Infini-attention to its core algorithm.
Read the research paper:
Leave No Context Behind: Efficient Infinite Context Transformers with Infini-attention
Featured Image by Shutterstock/JHVEPhoto

Google algorithm

Surviving the March 2024 Google core update

Google’s algorithm updates can shift the visibility of your sites. In March 2024, Google launched one of its biggest core algorithm updates yet, giving many people food for thought. This update differs from previous ones, as it targets low-quality, often AI-generated, content that’s cluttering search results. All of this affects the quality of information users find online, meaning Google had to step in and clean up. Here, we’ll give you more insights into the March 2024 Google core update.


Google updates are nothing new
Algorithm updates are not new. Updates are part of Google’s ongoing efforts to improve users’ experience. Google tries to find new ways of promoting high-quality, relevant, and trustworthy content. However, the March 2024 Google core update stands out. This is because it directly addresses the wave of AI content hitting the search engine. 
Generative AI is now widely accessible. As a result, the web has seen an influx of content lacking the depth and accuracy of content written by real people. While Google is clear that using AI to generate content is not prohibited, it does value and reward true quality.
Understanding Google algorithm updates
Google’s algorithm is a list of rules for ranking sites in search results. As with most things, these rules aren’t static. The rules change often to adapt to new technologies and user behaviors. Today, one of those changes is the tidal wave of new content. This prompts Google to update its algorithm. As a result, it should help users find the most relevant, high-quality content.
Google launched the core update on March 5, and it still has not fully rolled out at the time of writing.
Why do algorithm updates happen?
Google’s mission is to organize the world’s information and make it universally accessible and useful. For this, Google must continuously adjust its algorithms to understand and categorize content. Updates can target specific issues, like reducing spam. Others aim to prioritize expert content in medical searches. For instance, the 2018 “Medic” update focused on health and medical sites. Its goal was to uncover more authoritative content in areas where it matters most.
Content quality and user experience
Over the years, Google has increasingly prioritized content quality and user experience. Google algorithm updates such as “Panda” (2011) and “Penguin” (2012) were early examples, penalizing poor-quality content and manipulative link practices. Today, we see updates focusing on page experience (Core Web Vitals) and the helpfulness of content (Helpful Content Update). All of this shows Google is moving towards user-centric metrics.
It can be very insightful to learn about Google’s algorithm updates. The insights help you align your content strategy. It’s about understanding the need for content quality and user satisfaction. To survive, you must create highly relevant content that serves the user’s intent. Only then will your site remain a valuable resource for your audience.
Why the March 2024 Google Core Update?
We all notice the surge in content powered by generative AI. This technological leap has made it easy to produce content at scale. However, not all content meets the quality standards that Google seeks. The March 2024 core update is Google’s move to address this challenge. It hopes that users will once again find valuable and trustworthy content.
With this update, Google aims to drastically reduce spammy content in search, by over 40%:

“We believe these updates will reduce the amount of low-quality content on Search and send more traffic to helpful and high-quality sites. Based on our evaluations, we expect that the combination of this update and our previous efforts will collectively reduce low-quality, unoriginal content in search results by 40%.”

The challenge of low-quality content
The core issue with AI-generated content isn’t the use of AI but the quality of that content. Pre-March 2024, search results started showing content that offered little value to users seeking information or solutions. This is a bad user experience and makes it harder for high-quality, human-crafted content to get the visibility it deserves.
This goes specifically for what Google calls scaled content abuse:

“Scaled content abuse is when many pages are generated for the primary purpose of manipulating search rankings and not helping users. This abusive practice is typically focused on creating large amounts of unoriginal content that provides little to no value to users, no matter how it’s created.”

Content quality and relevance
With the March 2024 core update, Google wants to highlight high-quality content. Of course, the update doesn’t penalize the use of AI in content creation. However, it aims to ensure that content meets the same standards as work written by real authors. This means AI-generated content should be informative, accurate, and engaging. Also, it should offer unique perspectives or insights that benefit the user.
Google even said as much:

“There’s nothing new or special that creators need to do for this update as long as they’ve been making satisfying content meant for people. For those that might not be ranking as well, we strongly encourage reading our creating helpful, reliable, people-first content help page.”

The March 2024 Google core update is very interesting for content creators. We should use it as a call to action to raise our content standards. As a result, we’d publish unique content that truly enriches users’ online experiences. This update gives you the chance to review your content strategies. Try to embrace AI’s potential but prioritize the quality and authenticity only people provide. It’s all about balance! Together, we’ll ensure the web is informative, trustworthy, and user-friendly.
Hit by the March 2024 Google core update?
Identifying the impact of a Google core algorithm update is the first step toward recovery. The March 2024 Google core update might lead to changes in your site performance. However, you should distinguish this from online traffic’s usual ebb and flow. This requires carefully analyzing your site’s metrics.
Monitoring traffic and rankings
First things first: don’t panic. A sudden drop in traffic or a decline in rankings can be alarming. These things happen, so you must try to remain calm. As you know, these shifts can signal that an update has affected your site. Use analytics tools to monitor your site’s traffic patterns. Look for abrupt changes that coincide with the timing of Google’s announced updates.
Something definitely happened here
Analyzing different traffic sources
It’s crucial to differentiate between organic search, referral, and direct traffic. A decline in organic search traffic suggests your site’s visibility in SERPs might be waning. This could be due to the latest algorithm update. Conversely, changes in referral or direct traffic might indicate other factors at play that are unrelated to an update.
Use Google Search Console
Use Search Console to understand your site’s performance in Google’s SERPs. After an update, check for any notifications or warnings that could indicate specific issues Google has identified with your site. 
In GSC, look for sudden and unexpected losses. Losses in impressions and click-throughs would be the main things to check. If there are losses in impressions, try to identify the specific pages and queries that were hit. Was everything hit? Just certain queries? Certain pages? Confirm that the dates of the drops match the dates of the updates. This data can help pinpoint which aspects of your site were most affected.
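If you’d rather pull this data programmatically, the Search Console API exposes the same clicks and impressions. Here’s a minimal sketch comparing the month before and after the March 5 rollout; the site URL, the service-account key file, and the date windows are placeholders you’d adjust:

```python
# pip install google-api-python-client google-auth
from google.oauth2 import service_account
from googleapiclient.discovery import build

SITE_URL = "https://www.example.com/"   # placeholder property
creds = service_account.Credentials.from_service_account_file(
    "service-account.json",             # placeholder key file
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
gsc = build("searchconsole", "v1", credentials=creds)

def totals(start_date: str, end_date: str) -> tuple[float, float]:
    """Sum clicks and impressions for the property over a date range."""
    rows = gsc.searchanalytics().query(
        siteUrl=SITE_URL,
        body={"startDate": start_date, "endDate": end_date,
              "dimensions": ["date"]},
    ).execute().get("rows", [])
    return (sum(r["clicks"] for r in rows),
            sum(r["impressions"] for r in rows))

print("before:", totals("2024-02-05", "2024-03-04"))
print("after: ", totals("2024-03-05", "2024-04-03"))
```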
Investigating the impact of a core update
Once you’ve identified potential clues, the next step involves digging deeper. This phase is about pinpointing the impacted elements and understanding why.
Dive deep into the data to figure out what happened when
Aligning traffic changes with update timelines
Start by aligning the observed traffic and ranking changes with the update’s rollout dates. Google’s Search Central Blog and X account are reliable sources for announcements and timelines. Plotting your traffic data against these dates helps confirm if the changes correlate with the updates.
Also, ensure that nothing else might have happened on your website that could have caused the traffic loss. Eliminate the other possibilities to make sure you’re solving the right riddle. It’s important to attribute shifts in performance to the update rather than other variables like seasonal trends or external site changes.
Dive into Google Search Console data
Google Search Console offers a wealth of data for your work. Pay special attention to the Performance report. This report provides insights into impressions, CTR, and rankings for your pages and queries. A drop in impressions might mean your pages appear less frequently in search results, possibly due to the update. Google has a helpful guide on debugging drops in search traffic that helps you get started.

Identifying content and technical shortfalls
After pinpointing the issues, assess whether these are related to content quality and relevance, technical SEO, or both. For content, consider factors like originality, depth, and user engagement. Does the content provide unique value beyond what’s already out there? 
Review your site holistically. Check the site structure, internal linking, mobile performance, page speed, and the Core Web Vitals for technical aspects. Google’s PageSpeed Insights and Lighthouse are helpful tools for this analysis.
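PageSpeed Insights also has a public HTTP API, which is handy for checking many pages at once. A short sketch against the v5 endpoint; the page URL is a placeholder, and the exact response fields are worth verifying against the current docs:

```python
# pip install requests
import requests

resp = requests.get(
    "https://www.googleapis.com/pagespeedonline/v5/runPagespeed",
    params={"url": "https://www.example.com/", "strategy": "mobile"},
    timeout=60,
)
result = resp.json()["lighthouseResult"]
print("Performance score:", result["categories"]["performance"]["score"])
for audit in ("largest-contentful-paint", "cumulative-layout-shift"):
    print(audit, "->", result["audits"][audit]["displayValue"])
```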
Recovering from Google’s March 2024 core update
After identifying how the update affected your site, the focus shifts to recovery. Recovering involves addressing content quality and technical SEO aspects, depending on where the issues lie. Here’s how to approach recovery, ensuring your site aligns more closely with Google’s standards.
Enhancing content quality
If the traffic loss results from a rankings drop for queries you used to rank well for, the issue is probably with your content. Start by looking at the content that ranks for those terms now. Look for clues as to why that content might be preferred over yours, and adjust your content accordingly. Try to make your content more relevant to the user.
Helpful content questions
Google developed a set of guidelines to help you assess your content’s relevance and quality:

Does the content provide original information, reporting, research, or analysis? Assess whether your content offers new insights, unique viewpoints, or comprehensive research that adds value beyond what’s already available online.
Does the content provide a substantial, complete, or comprehensive topic description? Evaluate if your content thoroughly covers the topic, comprehensively addressing the audience’s questions, concerns, and related interests.
Is the content written by an expert or enthusiast who demonstrably knows the topic well? Consider whether the author has the necessary expertise, experience, or passion for the subject matter, evident in the content’s depth and accuracy.
Does the content have a clear purpose or goal that it successfully fulfills? Identify the primary objective of your content (to inform, entertain, persuade, etc.) and assess if it effectively achieves this goal.
Would someone reading your content leave feeling they’ve learned enough about a topic to help achieve their goal? Reflect on whether the reader would come away with actionable knowledge, solutions, or a deeper understanding of the subject.
Does the content present information that makes you trust it, such as clear sourcing, evidence of the author’s expertise, and a lack of factual errors? Verify the reliability and credibility of your content through accurate sourcing, showcasing the author’s qualifications, and ensuring factual correctness.
Is the content free from spelling or stylistic issues? Ensure your content is professionally presented, with attention to grammar, spelling, and style, making it accessible and enjoyable to read. 
Would you feel comfortable trusting this content for issues relating to YMYL? For content that impacts significant decisions (health, finance, etc.), consider if it meets the highest standards of accuracy and trustworthiness.
Is the content designed to meet the needs of a human audience rather than search engines? Create content that serves your audience’s interests and queries instead of search engines.
Does your site have a primary purpose or focus, and does your content support that purpose? Ensure your content aligns with and supports your website’s overarching theme or mission, providing a cohesive user experience.

Addressing technical issues
How you recover depends on the problem, and the issues are usually related to your content. If you have a technical problem severe enough to cause a sudden loss in traffic, it probably wasn’t caused by the core update. However, if you find a technical issue, you should fix it. Consider this: if your check engine light is on, washing your car and hanging a new air freshener won’t help the car’s long-term driveability.
Some areas to focus on:

Fix the performance of your mobile site: Having a well-performing mobile site is non-negotiable. Use Google’s Lighthouse to identify and rectify any usability issues. Keep ads at bay.
Page speed and Core Web Vitals: Fast-loading pages create a positive user experience. Use PageSpeed Insights to identify and fix issues impacting load times, such as large image files or slow server response times.  
Site structure and navigation: A well-organized site helps users and search engines find content efficiently. Review your site’s structure for logical hierarchy and navigation ease.

Best practices for future updates
Usually, Google algorithm updates don’t hit sites that follow the rules and play nice. Still, the last core update reminds us to work hard to improve our sites where possible. Looking ahead, there are several best practices you can adopt to stay ahead of the impact of future updates.
Stay informed about SEO news and updates
Keep track of SEO news and updates. Sometimes, Google offers insights into what site owners can do to prepare for updates. Follow reputable SEO blogs (like ours and the SEO update by Yoast), forums, and other channels, such as the Google Search Central Blog and the @searchliaison X account.
Continuously review and improve content
Work on your content’s quality is never done. Regularly review your site’s content. It should always remain relevant, accurate, and valuable to your audience. Keep working on it by updating statistics, refreshing outdated references, and adding new insights to keep content engaging. Consider user feedback as an opportunity to enhance your content’s scope and depth.
Diversify your traffic sources
Try to avoid relying solely on organic search traffic. Explore other opportunities by diversifying your traffic sources with social media and email marketing. This makes your traffic more stable and allows you to reach new audiences.
All about the Google core update of March 2024
Adaptability, quality, and user focus are what you need to succeed. It’s hard work to recover if an algorithm update has hit you. The whole process offers valuable lessons to improve your site. In addition, you might find new growth opportunities.
Google’s algorithms will continue to evolve. The March 2024 core update aims to combat low-quality content. It underscores a shift towards a future that hopefully values genuine, insightful, and user-centric content.

Edwin Toonen

Edwin is a strategic content specialist. Before joining Yoast, he spent years honing his skill at The Netherlands’ leading web design magazine.


News

Google Crawler Documentation Has A New IP List via @sejournal, @martinibuster

Google updated their Googlebot and crawler documentation to add a range of IPs for bots triggered by users of Google products. The names of the feeds switched, which is important for publishers who are whitelisting Google-controlled IP addresses. The change will be useful for publishers who want to block scrapers that use Google’s cloud and other crawlers not directly associated with Google itself.
New List Of IP Addresses
Google says that the list contains IP ranges that have long been in use, so they’re not new IP address ranges.
There are two kinds of IP address ranges:

IP ranges that are initiated by users but controlled by Google and resolve to a Google.com hostname. These are tools like Google Site Verifier and presumably the Rich Results Tester Tool.
IP ranges that are initiated by users but not controlled by Google and resolve to a gae.googleusercontent.com hostname. These are apps that run on Google Cloud or Apps Scripts called from Google Sheets.

The lists that correspond to each category are different now.
Previously the list that corresponded to Google IP addresses was this one: special-crawlers.json (resolving to gae.googleusercontent.com)
Now the “special crawlers” list corresponds to crawlers that are not controlled by Google.
“IPs in the user-triggered-fetchers.json object resolve to gae.googleusercontent.com hostnames. These IPs are used, for example, if a site running on Google Cloud (GCP) has a feature that requires fetching external RSS feeds on the request of the user of that site.”
The new list that corresponds to Google controlled crawlers is: 
user-triggered-fetchers-google.json
“Tools and product functions where the end user triggers a fetch. For example, Google Site Verifier acts on the request of a user. Because the fetch was requested by a user, these fetchers ignore robots.txt rules.
Fetchers controlled by Google originate from IPs in the user-triggered-fetchers-google.json object and resolve to a google.com hostname.”
The list of IPs from Google Cloud and App crawlers that Google doesn’t control can be found here:
https://developers.google.com/static/search/apis/ipranges/user-triggered-fetchers.json
The list of IPs that are triggered by users and controlled by Google is here:
https://developers.google.com/static/search/apis/ipranges/user-triggered-fetchers-google.json
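For publishers who want to act on these lists, the JSON files can be checked directly. A minimal sketch that tests whether an IP falls inside the Google-controlled ranges; it assumes the file uses the same prefixes/ipv4Prefix/ipv6Prefix layout as Google’s other crawler lists, so verify against the actual file:

```python
# pip install requests
import ipaddress
import requests

GOOGLE_CONTROLLED = ("https://developers.google.com/static/search/apis/"
                     "ipranges/user-triggered-fetchers-google.json")

def ip_in_ranges(ip: str, list_url: str) -> bool:
    """True if `ip` falls inside any prefix published in the JSON list."""
    prefixes = requests.get(list_url, timeout=30).json()["prefixes"]
    addr = ipaddress.ip_address(ip)
    for entry in prefixes:
        cidr = entry.get("ipv4Prefix") or entry.get("ipv6Prefix")
        if cidr and addr in ipaddress.ip_network(cidr):
            return True
    return False

print(ip_in_ranges("203.0.113.7", GOOGLE_CONTROLLED))  # example/test IP
```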
New Section Of Content
There is a new section of content that explains what the new list is about.
“Fetchers controlled by Google originate from IPs in the user-triggered-fetchers-google.json object and resolve to a google.com hostname. IPs in the user-triggered-fetchers.json object resolve to gae.googleusercontent.com hostnames. These IPs are used, for example, if a site running on Google Cloud (GCP) has a feature that requires fetching external RSS feeds on the request of the user of that site.”
The documentation lists the reverse DNS masks as ***-***-***-***.gae.googleusercontent.com or google-proxy-***-***-***-***.google.com, with the corresponding IP lists user-triggered-fetchers.json and user-triggered-fetchers-google.json.
Google Changelog
Google’s changelog explained the changes like this:
“Exporting an additional range of Google fetcher IP addresses
What: Added an additional list of IP addresses for fetchers that are controlled by Google products, as opposed to, for example, a user controlled Apps Script. The new list, user-triggered-fetchers-google.json, contains IP ranges that have been in use for a long time.
Why: It became technically possible to export the ranges.”
Read the updated documentation: Verifying Googlebot and other Google crawlers
Read the old documentation: Archive.org – Verifying Googlebot and other Google crawlers
Featured Image by Shutterstock/JHVEPhoto
