Google Confirms 3 Ways To Make Googlebot Crawl More via @sejournal, @martinibuster

Google’s Gary Illyes and Lizzi Sassman discussed three factors that trigger increased Googlebot crawling. While they downplayed the need for constant crawling, they acknowledged there are ways to encourage Googlebot to revisit a website.

1. Impact of High-Quality Content on Crawling Frequency

One of the things they talked about was website quality. A lot of people suffer from the “discovered, currently not indexed” issue, and that’s sometimes caused by SEO practices that people have learned and believe are good practice. I’ve been doing SEO for 25 years, and one thing that’s always stayed the same is that industry-defined best practices are generally years behind what Google is doing. Yet it’s hard to see what’s wrong if a person is convinced they’re doing everything right.

Gary Illyes shared a reason for an elevated crawl frequency at the 4:42 minute mark, explaining that one of the triggers for a high level of crawling is signals of high quality that Google’s algorithms detect.

Gary said:

“…generally if the content of a site is of high quality and it’s helpful and people like it in general, then Googlebot–well, Google–tends to crawl more from that site…”

There’s a lot of nuance missing from that statement, like what the signals of high quality and helpfulness are that will trigger Google to crawl more frequently.

Well, Google never says. But we can speculate and the following are some of my educated guesses.

We know that there are patents about branded search that count branded searches made by users as implied links. Some people think that “implied links” are brand mentions, but “brand mentions” are absolutely not what the patent talks about.

Then there’s the Navboost patent, which has been around since 2004. Some people equate Navboost with clicks, but if you read the actual patent from 2004 you’ll see that it never mentions click-through rates (CTR). It talks about user interaction signals. Clicks were a topic of intense research in the early 2000s, and if you read the research papers and the patents it’s easy to understand why it’s not as simple as “monkey clicks the website in the SERPs, Google ranks it higher, monkey gets banana.”

In general, I think signals that indicate people perceive a site as helpful can help a website rank better. And sometimes that means giving people what they expect to see.

Site owners will tell me that Google is ranking garbage, and when I take a look I can see what they mean: the sites are kind of garbagey. But on the other hand, the content is giving people what they want, because users don’t really know how to tell the difference between what they expect to see and actual good-quality content (I call that the Froot Loops algorithm).

What’s the Froot Loops algorithm? It’s an effect from Google’s reliance on user satisfaction signals to judge whether their search results are making users happy. Here’s what I previously published about Google’s Froot Loops algorithm:

“Ever walk down a supermarket cereal aisle and note how many sugar-laden kinds of cereal line the shelves? That’s user satisfaction in action. People expect to see sugar bomb cereals in their cereal aisle and supermarkets satisfy that user intent.

I often look at the Froot Loops on the cereal aisle and think, “Who eats that stuff?” Apparently, a lot of people do, that’s why the box is on the supermarket shelf – because people expect to see it there.

Google is doing the same thing as the supermarket. Google is showing the results that are most likely to satisfy users, just like that cereal aisle.”

An example of a garbagey site that satisfies users is a popular recipe site (that I won’t name) that publishes easy to cook recipes that are inauthentic and uses shortcuts like cream of mushroom soup out of the can as an ingredient. I’m fairly experienced in the kitchen and those recipes make me cringe. But people I know love that site because they really don’t know better, they just want an easy recipe.

What the helpfulness conversation is really about is understanding the online audience and giving them what they want, which is different from giving them what they should want. Understanding what people want and giving it to them is, in my opinion, what searchers will find helpful and what will ring Google’s helpfulness signal bells.

2. Increased Publishing Activity

Another thing Illyes and Sassman said could trigger Googlebot to crawl more is an increased frequency of publishing, such as a site suddenly increasing the number of pages it publishes. Illyes mentioned this in the context of a hacked site that all of a sudden started publishing more web pages; a hacked site that’s publishing a lot of pages would cause Googlebot to crawl more.

If we zoom out to examine that statement from the perspective of the forest, it’s pretty evident that he’s implying that an increase in publication activity may trigger an increase in crawl activity. It’s not the hack itself that causes Googlebot to crawl more; it’s the increase in publishing that’s causing it.

Here is where Gary cites a burst of publishing activity as a Googlebot trigger:

“…but it can also mean that, I don’t know, the site was hacked. And then there’s a bunch of new URLs that Googlebot gets excited about, and then it goes out and then it’s crawling like crazy.”​

The takeaway there is that a lot of new pages makes Googlebot excited and causes it to crawl a site “like crazy.” No further elaboration is needed; let’s move on.

3. Consistency Of Content Quality

Gary Illyes goes on to mention that Google may reconsider the overall site quality and that may cause a drop in crawl frequency.

Here’s what Gary said:

“…if we are not crawling much or we are gradually slowing down with crawling, that might be a sign of low-quality content or that we rethought the quality of the site.”

What does Gary mean when he says that Google “rethought the quality of the site”? My take is that the overall quality of a site can decline if parts of the site aren’t to the same standard as the original content. In my opinion, based on things I’ve seen over the years, at some point the low-quality content may begin to outweigh the good content and drag the rest of the site down with it.

When people come to me saying they have a “content cannibalism” issue and I take a look, what they’re really suffering from is a low-quality content issue in another part of the site.

Lizzi Sassman goes on to ask, at around the six-minute mark, whether there’s an impact if the website content is static, neither improving nor getting worse, but simply not changing. Gary resisted giving a definitive answer, saying that Googlebot returns to check on the site to see if it has changed and that Googlebot “probably” might slow down the crawling if there are no changes, but he qualified that statement by saying that he didn’t know.

Something that went unsaid but is related to the consistency of content quality is that sometimes the topic itself changes, and if the content is static it may automatically lose relevance and begin to lose rankings. So it’s a good idea to do a regular content audit to see whether the topic has changed and, if so, to update the content so that it continues to be relevant to users, readers, and consumers when they have conversations about the topic.

Three Ways To Improve Relations With Googlebot

As Gary and Lizzi made clear, it’s not really about poking Googlebot to get it to come around just for the sake of getting it to crawl. The point is to think about your content and its relationship to the users.

1. Is the content high quality?
Does the content address a topic or does it address a keyword? Sites that use a keyword-based content strategy are the ones I see suffering in the 2024 core algorithm updates. Strategies based on topics tend to produce better content, and those sites sailed through the algorithm updates.

2. Increased Publishing Activity
An increase in publishing activity can cause Googlebot to come around more often. Whether it’s because a site was hacked or because a site is putting more vigor into its content publishing strategy, a regular content publishing schedule is a good thing and always has been. There is no “set it and forget it” when it comes to content publishing.

3. Consistency Of Content Quality
Content quality, topicality, and relevance to users over time are important considerations that will help ensure Googlebot continues to come around to say hello. A drop in any of those factors (quality, topicality, and relevance) could affect Googlebot crawling, which is itself a symptom of the more important factor: how Google’s algorithm regards the content.

Listen to the Google Search Off The Record Podcast beginning at about the 4 minute mark:

Featured Image by Shutterstock/Cast Of Thousands

Google Warns: URL Parameters Create Crawl Issues via @sejournal, @MattGSouthern

Gary Illyes, Analyst at Google, has highlighted a major issue for crawlers: URL parameters.

During a recent episode of Google’s Search Off The Record podcast, Illyes explained how parameters can create endless URLs for a single page, causing crawl inefficiencies.

Illyes covered the technical aspects, SEO impact, and potential solutions. He also discussed Google’s past approaches and hinted at future fixes.

This info is especially relevant for large or e-commerce sites.

The Infinite URL Problem

Illyes explained that URL parameters can create what amounts to an infinite number of URLs for a single page.

He explains:

“Technically, you can add that in one almost infinite–well, de facto infinite–number of parameters to any URL, and the server will just ignore those that don’t alter the response.”

This creates a problem for search engine crawlers.

While these variations might lead to the same content, crawlers can’t know this without visiting each URL. This can lead to inefficient use of crawl resources and indexing issues.

E-commerce Sites Most Affected

The problem is prevalent among e-commerce websites, which often use URL parameters to track, filter, and sort products.

For instance, a single product page might have multiple URL variations for different color options, sizes, or referral sources.
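
To illustrate, all of the following hypothetical URLs (example.com is a placeholder) could return the same product page:

    https://www.example.com/products/widget
    https://www.example.com/products/widget?color=blue
    https://www.example.com/products/widget?color=blue&size=m
    https://www.example.com/products/widget?ref=newsletter&sessionid=abc123

A crawler can’t know that the tracking and session parameters don’t change the content without fetching each variation.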

Illyes pointed out:

“Because you can just add URL parameters to it… it also means that when you are crawling, and crawling in the proper sense like ‘following links,’ then everything– everything becomes much more complicated.”

Historical Context

Google has grappled with this issue for years. In the past, Google offered a URL Parameters tool in Search Console to help webmasters indicate which parameters were important and which could be ignored.

However, this tool was deprecated in 2022, leaving some SEOs concerned about how to manage this issue.

Potential Solutions

While Illyes didn’t offer a definitive solution, he hinted at potential approaches:

  1. Google is exploring ways to handle URL parameters, potentially by developing algorithms to identify redundant URLs.
  2. Illyes suggested that clearer communication from website owners about their URL structure could help. “We could just tell them that, ‘Okay, use this method to block that URL space,’” he noted.
  3. Illyes mentioned that robots.txt files could potentially be used more to guide crawlers. “With robots.txt, it’s surprisingly flexible what you can do with it,” he said.
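
On that last point, here’s a minimal robots.txt sketch of what blocking a parameter URL space could look like (the paths and parameter names are hypothetical, not something Illyes prescribed):

    User-agent: *
    # Block any parameterized URL under /products/
    Disallow: /products/*?
    # Block specific tracking parameters anywhere on the site
    Disallow: /*?*sessionid=
    Disallow: /*?*ref=

The original robots.txt standard doesn’t define wildcards, but major crawlers such as Googlebot honor “*” and “$” in Disallow rules.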

Implications For SEO

This discussion has several implications for SEO:

  1. Crawl Budget: For large sites, managing URL parameters can help conserve crawl budget, ensuring that important pages are crawled and indexed.
  2. Site Architecture: Developers may need to reconsider how they structure URLs, particularly for large e-commerce sites with numerous product variations.
  3. Faceted Navigation: E-commerce sites using faceted navigation should be mindful of how this impacts URL structure and crawlability.
  4. Canonical Tags: Using canonical tags can help Google understand which URL version should be considered primary.
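
On the canonical tag point, a hypothetical product page with parameterized variants could declare the clean URL like this (example.com is a placeholder):

    <!-- Placed in the <head> of every parameterized variant -->
    <link rel="canonical" href="https://www.example.com/products/widget" />

Note that Google treats rel=canonical as a hint rather than a directive, so it works best alongside consistent internal linking to the clean URL.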

In Summary

URL parameter handling remains tricky for search engines.

Google is working on it, but you should still monitor URL structures and use tools to guide crawlers.

Hear the full discussion in the podcast episode below:

Free WordPress AI Writing Assistant By Jetpack via @sejournal, @martinibuster

Jetpack announced a free WordPress writing tool called Write Brief With AI that improves the clarity and conciseness of content. The AI writing assistant is based on an internal tool used at Automattic and is now available without limitations regardless of whether a user is subscribed to Jetpack AI Assistant or not.

Write Brief With AI Is Free

The new AI tool started as an internal writing tool used at Automattic, the company behind WordPress.com, Jetpack, WooCommerce, and other products. It is now being integrated into the Jetpack AI plugin. Although Jetpack AI is a premium plugin (with a limited free trial), Write Brief With AI is available to all users, both free and paid.

What It Does

The new Jetpack AI writing tool does three important things that can improve engagement and the overall quality of the content.

  1. Measures the readability of the text.
  2. Flags long-winded sentences.
  3. Highlights words that convey uncertainty.
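
To make those checks concrete, here is a minimal Python sketch of the kinds of heuristics such an assistant could run: a Flesch Reading Ease score, a long-sentence flag, and a hedging-word check. This illustrates the general approach only; it is not Jetpack’s implementation, and the word list and thresholds are assumptions.

    import re

    # Hypothetical hedging words an assistant might flag (assumption).
    HEDGES = {"maybe", "perhaps", "possibly", "might", "i think"}

    def syllables(word):
        # Rough heuristic: count groups of consecutive vowels.
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    def analyze(text, max_words=25):
        sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
        words = re.findall(r"[A-Za-z']+", text)
        # Flesch Reading Ease: higher scores mean easier reading.
        score = (206.835
                 - 1.015 * (len(words) / len(sentences))
                 - 84.6 * (sum(map(syllables, words)) / len(words)))
        long_sentences = [s.strip() for s in sentences if len(s.split()) > max_words]
        hedging = sorted(h for h in HEDGES if h in text.lower())
        return round(score, 1), long_sentences, hedging

    print(analyze("Maybe we should review the budget. I think it might help."))

Running it on a hedged sentence like the examples further down flags “maybe,” “might,” and “i think,” which is exactly the kind of nudge the tool is meant to give.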

Importance Of Readability

Readability and a direct writing style are important for clearly expressing the content’s topic, which can indirectly benefit SEO, conversions, and engagement. This is because clarity and conciseness make the topic more evident and easily understood by search algorithms.

Why Removing Uncertainty Is Important

Flagging words that sound uncertain encourages the writer to consider revisions that make the content more definitive and confident.

Here are examples of how confident writing improves content:

Example 1

This sentence expresses uncertainty:

I think we should consider expanding our marketing efforts.

This improved version of the same sentence is more confident:

We should expand our marketing efforts.

Example 2

This sentence is unconfident:

Maybe we should review the budget before making a decision.

This sentence is direct and definitive:

We should review the budget before making a decision.

The above examples show how improving directness and making sentences more decisive removes a level of ambiguity and makes them more understandable.

Will that help a web page rank better? Communicating without ambiguity makes it easy for search-related algorithms to understand content, which in turn makes it easier to rank for the respective topic.

Embedded Within The WordPress Editor

Write Brief With AI lives within the WordPress editor. Blocks must be enabled, because it won’t work with the Classic Editor. Additionally, the functionality is turned off by default and has to be activated by toggling it on within the AI Assistant settings sidebar.

Should You Try Write Brief With AI?

If your site is already using blocks then it may be convenient to give the new writing assistant a try. The tool is focused on improving content according to best practices but not actually doing the writing itself. That’s a good use of AI because it preserves the authenticity of human authored content.

Download Jetpack and activate the free trial of the AI Assistant. Write Brief With AI is switched off by default, so toggle it on in the AI Assistant settings. While AI Assistant is limited in how many times it can be used, Write Brief With AI is in Beta and can be used without limitations.

Download Jetpack here:

Jetpack by Automattic

Learn More About Write Brief With AI

Read more at the official WordPress.com announcement:

Clearer Writing at Your Fingertips: Introducing Write Brief with AI (Beta)

Read the documentation on requirements, activation instructions and how to use it:

Create Better Content with Jetpack AI

Featured Image by Shutterstock/Velishchuk Yevhen

Google’s AI Now Chooses Your Local Ad Photos via @sejournal, @MattGSouthern

Google has announced a new update to its Local Services Ads (LSA) platform, implementing an automated photo selection feature.

Ginny Marvin, Google Ads Liaison, revealed that the company will now choose photos from advertisers’ LSA profiles to display in their ads.

According to the announcement, this change is designed to increase ad engagement. The selection process will be based on an image’s perceived likelihood of improving user interaction with the advertisement.

Key Points of the Update:

  1. Photo inclusion may affect ad ranking
  2. Google recommends uploading 3-5 images to LSA profiles
  3. Photos should be high-quality, relevant, and original
  4. Not all ads will consistently include photos

Impact On Advertisers

This update introduces a new variable for Local Services advertisers to consider.

While Google suggests that adding photos could improve ad rankings, the actual impact remains to be seen.

Advertisers may need to reassess their visual content strategies in light of this change.

Photo Requirements & Recommendations

Google says photos must be high quality, relevant to the advertiser’s work, and original.

The company explicitly states that copied or stolen images are not permitted. Advertisers can manage their photos through the Profile and Budget page in their LSA dashboard.

Variable Photo Display

It’s important to note that photo inclusion in ads is not guaranteed. Google states that ad appearance will vary depending on user queries and other unspecified factors.

This variability may present challenges for advertisers seeking to control their ad presentation consistently.

As this feature rolls out, local service providers using Google’s advertising platform must monitor its effects on their ad performance and adjust their strategies accordingly.

How This Can Help You

This LSA update matters for digital marketers and local businesses.

It changes how visuals impact local service ads, potentially shaking up ad performance and user engagement.

What it means for LSA advertisers:

  • Better visibility: Good photos could boost your ad placement.
  • More clicks: Eye-catching visuals might up your CTR.
  • Edge over competitors: Quick adapters could get ahead.
  • Time-saver: No more manual image selection headaches.

What it means for marketers and agencies:

  • New optimization angles: Fresh ways to tweak LSA campaigns.
  • Added value for clients: Guide them on nailing their LSA imagery.
  • Data insights: Track how this change impacts performance metrics.

Keep a close eye on your LSA performance and be ready to pivot. Savvy marketers can turn this update into a win for their local ad game.


Featured Image: Mamun sheikh K/Shutterstock

Reddit Considers Adding AI-Powered Search Results via @sejournal, @MattGSouthern

During Reddit’s Q2 2024 earnings call, CEO Steve Huffman revealed the company is exploring implementing AI-powered search results on its platform.

Though details remain limited, this feature could enhance content discovery.

Huffman stated during the call:

“Later this year, we will begin testing new search result pages powered by AI to summarize and recommend content.”

He suggested this could help users find information on products, shows, and games and discover new communities.

Reddit’s consideration of AI search aligns with broader industry trends, as many tech companies integrate AI capabilities into their products.

Financial Context

This announcement was made alongside Reddit’s Q2 2024 financial results.

The company reported 54% year-over-year revenue growth, reaching $281.2 million for the quarter.

User growth also increased, though specific figures were not provided in this initial report.

Potential Challenges

While AI-powered search could improve content discovery, its implementation may face hurdles.

These could include technical challenges, user adoption concerns, and questions about how AI-curated results might affect the visibility of certain communities or content types on the platform.

Reddit hasn’t provided a specific timeline for testing or rolling out this feature, nor has it shared details on how it would be developed or implemented.

Reddit Blocks Most Search Engines

Any change to Reddit’s on-site search is notable, as it’s now one of the few reliable ways to search the site.

Reddit’s latest robots.txt update has prevented most search engines from crawling its recent content.

The big exception? Google, thanks to a $60M deal for AI training data.
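
The mechanism is simple. A robots.txt along these lines (a hypothetical sketch, not Reddit’s actual file) admits one crawler and shuts out the rest:

    # The permitted crawler gets everything
    User-agent: Googlebot
    Allow: /

    # Every other crawler gets nothing
    User-agent: *
    Disallow: /

Crawlers follow the most specific user-agent group that matches them, so the blanket Disallow doesn’t apply to a crawler named explicitly above it.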

See: Reddit Limits Search Engine Access, Google Remains Exception

Key points from the above article:

  • Only Google and Reddit’s on-site search can now reliably find fresh Reddit posts.
  • Bing, DuckDuckGo, and others are left in the cold for new content.
  • SEOs and marketers face new hurdles in tracking Reddit discussions.

This move fits the trend of platforms monetizing their content and protecting it from AI scrapers.

This could impact users in the following ways:

  • Users must use Google or Reddit’s native search for recent posts.
  • SEOs need new strategies for Reddit content monitoring.
  • Google gains an edge in providing access to Reddit’s vast user-generated content.

It’s a sign of the times as platforms balance openness with monetization in the AI era.

Looking Ahead

As Reddit continues to grow, this AI initiative could play a pivotal role in the platform’s future.

As the company moves forward with testing and potential implementation, users and industry observers alike will be watching closely to see how this AI-powered search transforms the Reddit experience.


Featured Image: T. Schneider/Shutterstock

OpenAI Scraps ChatGPT Watermarking Plans via @sejournal, @MattGSouthern

OpenAI has decided against implementing text watermarking for ChatGPT-generated content despite having the technology ready for nearly a year.

This decision, reported by The Wall Street Journal and confirmed in a recent OpenAI blog post update, stems from user concerns and technical challenges.

The Watermark That Wasn’t

OpenAI’s text watermarking system, designed to subtly alter word prediction patterns in AI-generated text, promised near-perfect accuracy.

Internal documents cited by the Wall Street Journal claim it was “99.9% effective” and resistant to simple paraphrasing.

However, OpenAI has revealed that more sophisticated tampering methods, like using another AI model for rewording, can easily circumvent this protection.
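
OpenAI hasn’t described its method in detail, but the “green list” watermark from the public literature (Kirchenbauer et al., 2023) gives a sense of how word-prediction patterns can be altered and later detected, and why rewording defeats it. Here is a minimal Python sketch with toy values; it is not OpenAI’s actual system.

    import hashlib
    import random

    VOCAB_SIZE = 50000      # toy vocabulary of token ids
    GREEN_FRACTION = 0.5    # share of the vocabulary marked "green"

    def green_list(prev_token):
        # Seed a PRNG with the previous token so the same green list
        # can be recomputed at detection time with no stored state.
        seed = int(hashlib.sha256(str(prev_token).encode()).hexdigest(), 16)
        rng = random.Random(seed)
        return set(rng.sample(range(VOCAB_SIZE), int(VOCAB_SIZE * GREEN_FRACTION)))

    # During generation, the model adds a small bias to the logits of
    # green-list tokens, nudging (not forcing) it to pick them.

    def green_fraction(tokens):
        # Detection: human text lands near GREEN_FRACTION, while
        # watermarked text scores noticeably higher.
        hits = sum(tok in green_list(prev) for prev, tok in zip(tokens, tokens[1:]))
        return hits / max(len(tokens) - 1, 1)

Rewording by another model resamples the tokens, pulling the green fraction back toward chance, which is exactly the circumvention OpenAI describes.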

User Resistance: A Key Factor

Perhaps more pertinent to OpenAI’s decision was the potential user backlash.

A company survey found that while global support for AI detection tools was strong, almost 30% of ChatGPT users said they would use the service less if watermarking were implemented.

This presents a significant risk for a company rapidly expanding its user base and commercial offerings.

OpenAI also expressed concerns about unintended consequences, particularly the potential stigmatization of AI tools for non-native English speakers.

The Search For Alternatives

Rather than abandoning the concept entirely, OpenAI is now exploring potentially “less controversial” methods.

Its blog post mentions early-stage research into metadata embedding, which could offer cryptographic certainty without false positives. However, the effectiveness of this approach remains to be seen.

Implications For Marketers and Content Creators

This news may be a relief to the many marketers and content creators who have integrated ChatGPT into their workflows.

The absence of watermarking means greater flexibility in how AI-generated content can be used and modified.

However, it also means that ethical considerations around AI-assisted content creation remain largely in users’ hands.

Looking Ahead

OpenAI’s move shows how tough it is to balance transparency and user growth in AI.

The industry needs new ways to tackle authenticity issues as AI content booms. For now, ethical AI use is the responsibility of users and companies.

Expect more innovation here, from OpenAI or others. Finding a sweet spot between ethics and usability remains a key challenge in the AI content game.


Featured Image: Ascannio/Shutterstock

Google Found in Violation of Antitrust Law, Judge Rules via @sejournal, @MattGSouthern

A federal judge has ruled that Google violated U.S. antitrust law by illegally maintaining monopolies in the markets for general search services and general search text advertising.

Judge Amit P. Mehta of the U.S. District Court for the District of Columbia, ruling in a case brought against Google by the Justice Department, said that Google had abused its monopoly power over the search business in part by paying companies to present its search engine as the default choice on their devices and web browsers.

Judge Mehta wrote in his opinion filed Monday:

“After having carefully considered and weighed the witness testimony and evidence, the court reaches the following conclusion: Google is a monopolist, and it has acted as one to maintain its monopoly. It has violated Section 2 of the Sherman Act.”

The court found that Google abused its dominant position in several ways:

  • Paying hefty sums to ensure default status on devices and browsers
  • Leveraging user data to reinforce its search engine’s dominance
  • Illegally protecting its monopoly over search-related advertising

Key Findings Of Anticompetitive Behavior

The judge found that Google’s agreements with Apple, Mozilla, and Android partners foreclosed about 50% of the search market and 45% of the search advertising market from rivals.

These exclusive distribution agreements deprived competitors like Microsoft’s Bing of the scale needed to compete with Google in search and search advertising.

Judge Mehta concluded that Google’s conduct had anticompetitive effects:

  • Foreclosing a substantial share of the market
  • Depriving rivals of scale needed to compete
  • Reducing incentives for rivals to invest and innovate in search

The case began in 2020 and culminated in a 10-week trial last fall.

Financial Revelations

The trial disclosed financial details of Google’s default search agreements.

In 2022, Google paid Apple $20 billion for default search placement on iOS devices, an increase from $18 billion in 2021.

Additionally, Google shares 36% of Safari’s search ad revenue with Apple.

These figures highlight the value of default search positioning in the industry.

Google’s Defense & Market Share

Throughout the trial, Google maintained that its market dominance resulted from superior product quality rather than anticompetitive practices.

The company disputed the DOJ’s estimate that it held a 90% share of the search market, arguing for a broader definition of its competitive landscape.

However, Judge Mehta rejected this defense:

“Google has thwarted true competition by foreclosing its rivals from the most effective channels of search distribution.”

Ruling On Search Advertising

On search advertising, the judge found that Google could charge supra-competitive prices for text ads without constraint from rivals.

However, the judge ruled in Google’s favor on some claims, finding Google doesn’t have monopoly power in the broader search advertising market.

Potential Ramifications

While Judge Mehta has yet to determine specific remedies, the ruling opens the door to potentially far-reaching consequences for Google’s business model. Possible outcomes could include:

  • Forced changes to Google’s search operations
  • Divestiture of specific business segments
  • Restrictions on default search agreements

The decision is likely to face appeals, and the final resolution may evolve, as seen in the Microsoft antitrust case of the 1990s.

Broader Context

This ruling sets a precedent that could influence other ongoing antitrust cases against tech giants like Amazon, Apple, and Meta.

It signals a shift in how century-old antitrust laws are applied to modern digital markets.

What’s Next

Google is expected to appeal the decision, potentially leading to a protracted legal battle that could shape the future of online search and digital advertising.

The Department of Justice and a group of attorneys general from 38 states and territories, who filed similar antitrust suits against Google in 2020, will be watching the next steps in this legal battle closely.


Featured Image: Sergei Elagin/Shutterstock

Facebook Attracts Gen Z Users While TikTok’s Boomer Audience Grows via @sejournal, @MattGSouthern

According to a recent report by eMarketer, Facebook is experiencing a resurgence among Gen Z users, while TikTok is gaining traction with baby boomers.

Despite these shifts, both platforms maintain a stable core user base.

Facebook’s Gen Z Renaissance

Facebook’s seeing unexpected Gen Z growth despite overall decline. The share of U.S. Gen Z using Facebook is projected to increase from 49.0% (33.9M users) in 2024 to 56.9% (40.5M) by 2028.

Key drivers:

  1. Utility: Event planning, niche groups, and Marketplace appeal to younger users.
  2. Demo shift: ~36% of Gen Z are still under 18, many just entering the social media space.

The e-commerce potential is strong: 75.0% of Gen Z Facebook users (ages 15-26) bought on Marketplace last year.

However, Gen Z still trails Gen X and millennials in user numbers and time spent on the platform. Interestingly, time on Facebook is decreasing for users under 55, suggesting a shift in how younger generations interact with the platform.

TikTok’s Boomer Boom

TikTok’s Gen Z market is saturated, but it’s seeing surprising growth among boomers.

Projections show a 10.5% increase in U.S. boomer users next year, from 8.7M to 9.7M.

This modest uptick underscores TikTok’s accessibility and its appeal to older adults who want to stay culturally relevant and connected with younger relatives.

While boomers are the fastest-growing demographic, TikTok adoption rates are rising steadily across all generations, indicating the platform’s broad appeal.

Shifting Social Media Landscape

Facebook use continues to decrease across all generations except Gen Z, highlighting the platform’s evolving role in the social media ecosystem.

This trend, coupled with TikTok’s growth among older users, suggests a blurring of generational lines in social media usage. Platforms that can adapt to changing user demographics while maintaining their core appeal will be best positioned for long-term success.

Implications For Marketers

Platforms and users are constantly changing. Brands must adapt or risk losing ground to competitors.

TikTok’s boomer growth opens up new avenues for brands targeting older demographics, but marketers should be mindful of the platform’s primarily young user base.

For Facebook marketers, the growing Gen Z user base presents new opportunities, especially in e-commerce via Marketplace. However, decreasing time spent on the platform means content needs to be more engaging and targeted.

Action items:

  1. Audit strategy: Check content appeal across age groups and platforms.
  2. Diversify: Create multi-faceted strategies for different demographics while maintaining brand identity.
  3. Leverage analytics: Track engagement by age group and adjust tactics.
  4. Test and optimize: Experiment with content formats and messaging for each platform.
  5. Stay current: Follow platform updates and demographic trends.

Stay flexible and update strategies as user demographics and preferences change.

Brands that can reach across generations while respecting platform-specific norms will likely see the most success in this changing landscape.


Featured Image: Halfpoint/Shutterstock