Google Says This Will Cancel Your “Linking Power” via @sejournal, @martinibuster

Google’s John Mueller was asked in an SEO Office Hours podcast if blocking the crawl of a webpage will have the effect of cancelling the “linking power” of either internal or external links. His answer suggested an unexpected way of looking at the problem and offers an insight into how Google Search internally approaches this and other situations.

About The Power Of Links

There are many ways to think about links, but in terms of internal links, the use that Google consistently talks about is telling Google which pages are the most important.

Google hasn’t published any patents or research papers lately about how it uses external links for ranking web pages, so pretty much everything SEOs know about external links is based on old information that may be out of date by now.

What John Mueller said doesn’t add anything to our understanding of how Google uses inbound links or internal links, but it does offer a different way to think about them that, in my opinion, is more useful than it appears at first glance.

Impact On Links From Blocking Indexing

The person asking the question wanted to know if blocking Google from crawling a web page affected how internal and inbound links are used by Google.

This is the question:

“Does blocking crawl or indexing on a URL cancel the linking power from external and internal links?”

Mueller suggests finding an answer to the question by thinking about how a user would react to it, which is a curious answer but also contains an interesting insight.

He answered:

“I’d look at it like a user would. If a page is not available to them, then they wouldn’t be able to do anything with it, and so any links on that page would be somewhat irrelevant.”

The above aligns with what we know about the relationship between crawling, indexing and links. If Google can’t crawl a link then Google won’t see the link and therefore the link will have no effect.

Keyword Versus User-Based Perspective On Links

Mueller’s suggestion to look at it the way a user would is interesting because it’s not how most people would consider a link-related question. But it makes sense: if you block a person from seeing a web page, then they wouldn’t be able to see the links, right?

What about for external links? A long, long time ago I saw a paid link for a printer ink website that was on a marine biology web page about octopus ink. Link builders at the time thought that if a web page had words in it that matched the target page (octopus “ink” to printer “ink”) then Google would use that link to rank the page because the link was on a “relevant” web page.

As dumb as that sounds today, a lot of people believed in that “keyword based” approach to understanding links as opposed to a user-based approach that John Mueller is suggesting. Looked at from a user-based perspective, understanding links becomes a lot easier and most likely aligns better with how Google ranks links than the old fashioned keyword-based approach.

Optimize Links By Making Them Crawlable

Mueller continued his answer by emphasizing the importance of making pages discoverable with links.

He explained:

“If you want a page to be easily discovered, make sure it’s linked to from pages that are indexable and relevant within your website. It’s also fine to block indexing of pages that you don’t want discovered, that’s ultimately your decision, but if there’s an important part of your website only linked from the blocked page, then it will make search much harder.”

About Crawl Blocking

A final word about blocking search engines from crawling web pages. A surprisingly common mistake I see some site owners make is using the robots meta directive to tell Google not to index a web page but to still crawl the links on it.

The (erroneous) directive looks like this:
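Based on the article’s description, the configuration in question is presumably a noindex combined with a follow directive (reconstructed here for illustration):

```html
<!-- Erroneous expectation: block indexing but still have the links crawled.
     If the page can't be shown to users (or the index), its links don't count. -->
<meta name="robots" content="noindex, follow">
```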

There is a lot of misinformation online that recommends the above robots directive, which is even reflected in Google’s AI Overviews:

Screenshot Of AI Overviews

A screenshot of Google's AI Overviews recommending an erroneous robots directive configuration

Of course, the above robots directive does not work because, as Mueller explains, if a person (or search engine) can’t see a web page then the person (or search engine) can’t follow the links that are on the web page.

Also, while there is a “nofollow” directive rule that can be used to make a search engine crawler ignore links on a web page, there is no “follow” directive that forces a search engine crawler to crawl all the links on a web page. Following links is a default behavior that a search engine decides on its own.
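For contrast, here is how the nofollow rule that does exist can be applied, either page-wide or per link (URLs and anchor text hypothetical):

```html
<!-- Page-wide: ask crawlers to ignore every link on this page -->
<meta name="robots" content="nofollow">

<!-- Per-link: ask crawlers to ignore this one link -->
<a href="https://example.com/some-page" rel="nofollow">example anchor</a>
```

There is no equivalent “follow” value that forces crawling; following links is simply the default.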

Read more about robots meta tags.

Listen to John Mueller answer the question from the 14:45 minute mark of the podcast:

Featured Image by Shutterstock/ShotPrime Studio

Google Says How To Get More Product Rich Results via @sejournal, @martinibuster

In an SEO Office Hours podcast, Google’s John Mueller answered the question of how to get more product rich results to show in the search results. John listed four things that are important in order to get rich results for product listings.

Product Rich Results

Product search queries can trigger rich results that present products in a visually rich manner that Google refers to as Search Experiences.

Google product search experiences can include:

  • Product snippets that include ratings, reviews, price, and availability information
  • Visual representations of products
  • Knowledge panel with vendors and products
  • Product images in Google Images search results
  • Result enhancements (reviews, shipping information, etc.)

John Mueller Answers Question About Product Rich Results

The person asking the question wanted to know how to get more “product snippets in Search Console,” which confused Mueller because product snippets are displayed in the search results, not Search Console. So Mueller answered the question in the context of search results.

This is the question:

“How to increase the number of product snippets in Search Console?”

John Mueller explained that there were four things to get right in order to qualify for product rich results.

Mueller answered:

“It’s not really clear to me what exactly you mean… If you’re asking about product rich results, these are tied to the pages that are indexed for your site. And that’s not something which you can change by force.

It requires that the page be indexed, that the page has valid structured data on it, and that our systems have determined that it’s worth showing this structured data.”

So, according to John Mueller, these are the four things to get right to qualify for product rich results:

  1. Page must be indexed
  2. The page has valid structured data
  3. Google’s systems determine that it’s worth showing
  4. Submit a product feed

1. Page Indexing

Getting a page indexed (and ranked) can be difficult for some search queries. People who come to me with this kind of problem tend to have content quality issues that can be traced back to outdated SEO strategies, like copying what’s already ranking in the SERPs and making it “better,” which often results in content that’s not meaningfully different from what Google is already ranking.

Content quality on the page level and on the site level is important. Focusing on content that has that little extra, like better images, helpful graphs, or more concise writing, is far more effective than focusing on keywords and entities.

2. Valid Structured Data

This is another area that explains why some sites lose their rich results or fail to get them altogether. Google changes their structured data recommendations and usually the structured data plugins will update to conform to the new guidelines. But I’ve seen examples where that doesn’t happen. So when there’s a problem with rich results, go to Google’s Rich Results Test tool first.

It’s also important to be aware that getting the structured data correct is not a guarantee that Google will show rich results for that page; it just makes the page eligible to show in the rich results.
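As an illustration, a minimal Product JSON-LD block of the kind the Rich Results Test validates might look like this; all names and values here are hypothetical:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "image": "https://example.com/images/widget.jpg",
  "offers": {
    "@type": "Offer",
    "price": "19.99",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.4",
    "reviewCount": "89"
  }
}
</script>
```

Even when a block like this validates, it only makes the page eligible; whether the rich result shows is Google’s call.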

3. How Does Google Determine Something’s Worth Showing?

This is the part that Google doesn’t talk about. But if you’ve read about the reviews system, the quality guidelines, Google’s SEO starter guide, and maybe even the Search Quality Raters Guidelines, then that should be more than enough information to inform any question about content quality.

Google doesn’t say why they may decline to show an image thumbnail as a rich result or why they won’t show a product in the rich results. My opinion is that debugging the issue is more productive if the problem is reconceptualized as a content quality issue. Images are content; if it’s on the page, even if it’s not text, it’s content. Evaluate all of the content in terms of how the images or products might look in the search results. Does it look good as a thumbnail? Is the content distinctive, helpful, or useful?

4. Merchant Feed

John Mueller lastly said that the merchant feed is another way to get products from a website to show as a rich result in Google.

Mueller answered:

“There’s also the possibility to submit a feed to your merchant center account, to show products there. This is somewhat separate, and has different requirements which I’ll link to. Often a CMS or platform will take care of these things for you, which makes it a bit easier.”

Mueller linked to this page:
Onboarding Guide – Create a feed

There’s also another page about Rich Snippets, which is more about text snippets:

Product snippet (Product, Review, Offer) structured data

Getting Product Rich Results in Google

While John Mueller listed four ways to get product rich results, Google Search Experiences, it’s not always as easy as 1, 2, 3, and 4. There are always nuances to be aware of.

Listen to the Google SEO Office Hours podcast at the 7:00 minute mark:

Featured Image by Shutterstock/ViDI Studio

Google Warns Of Last Chance To Export Notes Search Data via @sejournal, @martinibuster

Google updated their documentation for the Google Labs Notes experiment to remind users that Notes will go away at the end of July 2024 and to show how to download notes content before the final deadline, beyond which it will be impossible to retrieve it.

Google Notes

Notes is an experimental feature in Google Labs that lets users annotate search results with their ideas and experiences. The idea behind it is to make search more helpful and improve the quality of the search results through the opinions and insights of real people. It’s almost like Wikipedia where members of the public curate topics.

Google eventually decided that the Notes feature had undergone enough testing and announced in April 2024 that it is shutting down Google Notes.

Update To Documentation

The official documentation was updated to make it clear that Notes is shutting down at the end of July and that users who wish to download their data can do so with Google Takeout, a Google Accounts feature that allows users to export their content from their Google Account. Google Takeout allows Google Account holders to export data from Google Calendar, Google Drive, Google Photos, and more; a total of up to 56 kinds of content can be exported.

Google’s Search Central document changelog explains:

“A note about Notes

What: Added a note about the status of Notes to the Notes documentation.

Why: Notes is winding down at the end of July 2024.”

This is the new announcement:

“Notes is winding down at the end of July 2024. If you created a note, your notes content is available to download using Google Takeout through the end of August 2024.”

Check out the updated Google Notes documentation here:

Notes on Google Search and your website (experimental)

Featured Image by Shutterstock/ra2 studio

WP Engine WordPress Hosting Acquires NitroPack via @sejournal, @martinibuster

Managed WordPress web host WP Engine announced that they are acquiring NitroPack, a leading SaaS website performance optimization solution. The acquisition of NitroPack by WP Engine demonstrates their continued focus on improving site performance for clients.

NitroPack

NitroPack is a relatively pricey but well regarded site performance solution that has been known for years as a leader. WP Engine and NitroPack formed a partnership in 2023 to power WP Engine’s PageSpeed Boost product, which is offered to customers. The NitroPack team will be integrated into WP Engine this month, July.

There are no immediate plans to change the pricing options for NitroPack so it’s safe to say that it will continue to be a standalone product. WP Engine commented to Search Engine Journal that there will be no immediate changes in services pricing or billing for current NitroPack customers.

“We have no immediate plans to change the pricing options for NitroPack products.

Today NitroPack works with page builders and other hosting providers and that will continue to be available. In the coming months, we will continue to leverage NitroPack to enhance additional functionality to Page Speed Boost for WP Engine’s customers.”

What the acquisition means for WP Engine customers is that WP Engine will continue to leverage NitroPack’s technology to add even more functionalities to their PageSpeed Boost product.

The WP Engine spokesperson said that these new integrations will be coming to WP Engine PageSpeed Boost in a matter of months.

They shared:

“In the coming months, we will continue to leverage NitroPack’s strength to enhance additional functionality to Page Speed Boost.”

Read the official announcement:

WP Engine Acquires NitroPack, Extending Leadership in Managed WordPress Site Performance

Featured Image by Shutterstock/Asier Romero

OpenAI GPT-4o Mini Costs Less & Wallops Competition via @sejournal, @martinibuster

OpenAI rolled out GPT-4o mini, a replacement for GPT-3.5 Turbo that is more powerful than other models in its class. Because it’s hyper efficient, GPT-4o mini will make AI available to more people at a cheaper price through better end-user applications.

GPT-4o mini

GPT-4o mini is a highly efficient version of GPT-4o that is cheaper to run and fast. Despite its designation as “mini,” this language model outperforms GPT-4 and GPT-3.5 Turbo, and solidly outperforms Google’s comparable model, Gemini Flash 1.5.

Preliminary scores from the open source Large Model Systems Organization (LMSYS) show GPT-4o mini outperforming Anthropic’s Claude 3 Opus and Google’s Gemini Flash 1.5, and reaching benchmark scores comparable to GPT-4 Turbo and Gemini 1.5 Pro.

Screenshot Of Language Model Scores

Cost Effective Language Model

An important feature of GPT-4o mini is that it’s cheaper to use, 60% cheaper than GPT 3.5 Turbo, which means that companies that make AI products based on OpenAI language models will be able to offer high performance AI applications that cost significantly less. This makes AI available to more people around the world.

According to OpenAI:

“Today, we’re announcing GPT-4o mini, our most cost-efficient small model. We expect GPT-4o mini will significantly expand the range of applications built with AI by making intelligence much more affordable. GPT-4o mini scores 82% on MMLU and currently outperforms GPT-4 on chat preferences in the LMSYS leaderboard. It is priced at 15 cents per million input tokens and 60 cents per million output tokens, an order of magnitude more affordable than previous frontier models and more than 60% cheaper than GPT-3.5 Turbo.

GPT-4o mini is now available as a text and vision model in the Assistants API, Chat Completions API, and Batch API. Developers pay 15 cents per 1M input tokens and 60 cents per 1M output tokens (roughly the equivalent of 2500 pages in a standard book). We plan to roll out fine-tuning for GPT-4o mini in the coming days.”

GPT-4o mini Availability

GPT-4o mini is available today to users of ChatGPT Free, Plus, and Team, with GPT-3.5 no longer a selectable option. Enterprise users will have access next week.

Read the official announcement:

GPT-4o mini: advancing cost-efficient intelligence

Featured Image by Shutterstock/Dean Drobot

Google Confirms Ranking Boost For Country Code Domains via @sejournal, @martinibuster

Google’s Gary Illyes answered a question about a ranking preference given to sites that use country level domain names and explained how that compares to non-country domain names. The question was asked in the SEO Office Hours podcast.

ccTLD Aka Country Code Domain Names

Domain names that are specific to countries are called ccTLDs (Country Code Top Level Domains). These are domain names that target specific countries. Examples of these ccTLDs are .de (Germany), .in (India) and .kr (Korea). These kinds of domain names don’t target specific languages, they only target Internet users in a specific country.

Some ccTLDs are treated by Google for ranking purposes as if they are regular Generic Top Level Domains (gTLDs), which are domains that are not specific to a country. A popular example is .io, which technically is a ccTLD (pertaining to the British Indian Ocean Territory) but because of how it’s used, Google treats it like a regular gTLD (generic top level domain).

Ranking Boosts For ccTLDs

The question that Gary Illyes answered was about the ranking boost given to ccTLDs.

This is the question:

“When a Korean person searches Google in Korean, does a com.kr domain or a .com domain do better?”

Gary Illyes answered:

“Good question. Generally speaking the local domain names, in your case .kr, tend to do better because Google Search promotes content local to the user.”

A lot of people want to rank better in a specific country and one of the best practices for doing that is to register a domain name that is specific to the country. Google will give it a ranking boost over other sites that are not explicitly targeting a specific country.

Gary continued his answer by explaining the ranking boost of a ccTLD over a generic top level domain (gTLD), like .com, .net and so on.

This is Gary’s explanation:

“That’s not to say that a .com domain can’t do well, it can, but generally .kr has a little more benefit, albeit not too much.”

Targeting Country Versus Targeting Language

Lastly, Gary mentioned that targeting a user’s language has more impact than the domain name.

He continued his answer:

“If the language of a site matches the user’s query language, that probably has more impact than the domain name itself.”

A benefit of targeting a language is that a site can rank regardless of the country a user is searching from, whereas a country code top level domain targets a single country.
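Gary didn’t get into mechanics, but language and country variants of a site are commonly declared with hreflang annotations; a sketch with hypothetical URLs:

```html
<!-- Language-targeted version for Korean speakers anywhere -->
<link rel="alternate" hreflang="ko" href="https://example.com/ko/" />
<!-- Country-and-language version for users in Korea -->
<link rel="alternate" hreflang="ko-KR" href="https://example.kr/" />
<!-- Fallback for everyone else -->
<link rel="alternate" hreflang="x-default" href="https://example.com/" />
```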

Something that Gary didn’t mention is that using a ccTLD can inspire user trust from searchers whose country matches the country that the domain name is targeting and because of that searchers on Google may be more inclined to click on a search result that uses the geotargeted ccTLD.

If a user is in Korea they may feel that a .kr domain is meant specifically for them. If a searcher is in Australia they may feel more inclined to click on a .au domain name.

Listen to the podcast answer from the 3:35 minute mark:

Featured Image by Shutterstock/Dean Drobot

Google Clarifies H1-H6 Headings For SEO via @sejournal, @martinibuster

Google’s Gary Illyes answered a question about the SEO value of hierarchically ordering heading elements (H1, H2, etc.). His answer offered an insight into the actual value of heading elements for digital marketing.

Heading Elements

In simple terms, HTML Elements are the building blocks of a web page and they all have their place much like the foundation and a roof of a home have their places in the overall structure.

Heading elements communicate the topic and subtopics of a web page; viewed on their own, the headings literally form an outline of the page’s topics.
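That “outline of topics” view is easy to see programmatically. A minimal sketch, using only the Python standard library, that extracts a page’s heading outline:

```python
from html.parser import HTMLParser

class HeadingOutline(HTMLParser):
    """Collects (level, text) pairs for every H1-H6 in a document."""
    HEADING_TAGS = {"h1", "h2", "h3", "h4", "h5", "h6"}

    def __init__(self):
        super().__init__()
        self.outline = []      # list of (level, text) tuples
        self._level = None     # heading level currently open, or None

    def handle_starttag(self, tag, attrs):
        if tag in self.HEADING_TAGS:
            self._level = int(tag[1])      # "h2" -> 2
            self.outline.append((self._level, ""))

    def handle_endtag(self, tag):
        if tag in self.HEADING_TAGS:
            self._level = None

    def handle_data(self, data):
        if self._level is not None:
            level, text = self.outline[-1]
            self.outline[-1] = (level, (text + data).strip())

def heading_outline(html):
    parser = HeadingOutline()
    parser.feed(html)
    return parser.outline

# heading_outline("<h1>Topic</h1><h2>Subtopic</h2>")
# → [(1, "Topic"), (2, "Subtopic")]
```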

The World Wide Web Consortium (W3C), which defines HTML, describes headings like this:

“HTML defines six levels of headings. A heading element implies all the font changes, paragraph breaks before and after, and any white space necessary to render the heading. The heading elements are H1, H2, H3, H4, H5, and H6 with H1 being the highest (or most important) level and H6 the least.

Headers play a related role to lists in structuring documents, and it is common to number headers or to include a graphic that acts like a bullet in lists.”

Strictly speaking, it is absolutely correct to order headings according to their hierarchical structure.

What Google Says About Headings

The person asking the question commented that the SEO Starter Guide recommends using heading elements in “semantic” order for people who use screen readers (devices that translate text into spoken words) but that otherwise it’s not important for Google. The person asking the question wanted to know if the SEO Starter Guide was out of date because an SEO tool had a different recommendation.

Gary narrated the submitted question:

“I recently read on the SEO starter guide that “Having headings in semantic order is fantastic for screen readers, but from Google Search perspective, it doesn’t matter if you’re using them out of order.”

Is this correct because an SEO tool told me otherwise.”

It’s a good question because it makes sense to use heading elements in a way that shows the hierarchical importance of different sections of a web page, right?

Here’s Gary’s response:

“We update our documentation quite frequently to ensure that it’s always up to date. In fact the SEO starter guide was refreshed just a couple months back to ensure it’s still relevant, so what you read in the guide is as accurate as it can get.

Also, just because a non-Google tool tells you something is good or bad, that doesn’t make it relevant for Google; it may still be a good idea, just not necessarily relevant to Google.”

Is It Relevant For Google?

The official HTML standards are flexible about the use of headings.

Here’s what the standards say here:

“A heading element briefly describes the topic of the section it introduces. Heading information may be used by user agents, for example, to construct a table of contents for a document automatically.”

And here:

“The heading elements are H1, H2, H3, H4, H5, and H6 with H1 being the highest (or most important) level and H6 the least.”

The official HTML5 specifications for headings state that the hierarchical ordering is implied but that in both cases the headings communicate the start of a new section within a web page. Also, the official standards encourage “nesting” headings for subtopics, but that’s a “strong” encouragement and not a rigid rule.

“The first element of heading content in an element of sectioning content represents the heading for that section. Subsequent headings of equal or higher rank start new (implied) sections, headings of lower rank start implied subsections that are part of the previous one. In both cases, the element represents the heading of the implied section.

Sections may contain headings of any rank, but authors are strongly encouraged to either use only h1 elements, or to use elements of the appropriate rank for the section’s nesting level.”

That last part of the official standards is quite explicit that authors are “encouraged” either to use only H1 elements or to match heading rank to the section’s nesting level, which might sound crazy to some people, but that’s the reality. Still, that’s just an encouragement, not a rigid rule.

It’s only in the official HTML standards for heading elements in the context of accessibility that the recommendations are more rigid about using heading elements in a hierarchical structure (most important to least important).

So as you can see, Google’s usage of heading elements appears to be in line with the official standards because the standards allow for deviation, except for accessibility reasons.

The SEO tool is correct that the proper use of heading elements is to put them into hierarchical order. But the tool is incorrect in saying that it’s better for SEO.

This means that H1 is the most important heading for screen readers but it’s not the most important for Google. When I was doing SEO in 2001, the H1 was the most important heading element. But that hasn’t been the case for decades.

For some reason, some SEO tools (and SEOs) still believe that H1 is the most important heading for Google. But that’s simply not correct.

Listen to the SEO Office Hours Podcast at the 13:17 minute mark:

Featured Image by Shutterstock/AlenD

Google’s John Mueller On How To Verify An SEO Agency’s Work via @sejournal, @MattGSouthern

In a recent session of Google’s SEO office-hours Q&A, the Search Relations team addressed a common concern among business owners: how to determine if an SEO agency is actively optimizing your website.

The Business Owner’s Question

The discussion was prompted by a business owner who asked:

“If I have an agency that is managing our organic SEO on a monthly basis, how can I tell if anyone has been actively optimizing? I have a suspicion that the agency has not optimized our site for years.”

Google’s Response

In response, John Mueller, a Search Relations team member, shared his experience collaborating with an agency on Google’s Search Central content.

Key Points from Mueller’s Advice

  1. Regular Meetings: Hold frequent discussions with the SEO agency to review their work.
  2. Progress Reports: Request reports that detail the site’s progress over time.
  3. Future Planning: Discussing upcoming work helps ensure the agency addresses your needs.
  4. Client Education: Clients should have a basic understanding of SEO work to better evaluate the agency’s efforts.

While acknowledging that increased engagement requires additional time from both parties, Mueller believes it’s worth the effort.

This allows you to check if the SEO agency is meeting your needs. However, he notes that you need to have some trust in your relationship with the agency.

Resources For SEO Education

To assist businesses in managing their SEO efforts, Mueller pointed to two valuable resources:

  1. Google’s guide on hiring an SEO provides insights into the selection process.
  2. The SEO starter guide offers a foundational understanding of SEO principles.

Mueller’s Full Response

“This is a great question. When we worked with an SEO agency for some of the Search Central content, we had regular meetings to discuss the work that they did, to look at reports about the site’s progress, and to discuss any upcoming work. This did require a bit more time, both from them and from us, but I found it very insightful. I think it helps to lightly understand the kind of work that an agency would do, so that you can confirm that they’re doing what you expect them to do, and even then there’s a component of trust involved. We have a page about hiring an SEO which has some insights, and there’s our SEO starter guide, which can explain a bit more. And also, perhaps some folks from the SEO industry can comment on how they’d help a client understand how they’re spending their time.”

Previous Discussions On SEO Hiring

This advice from Mueller echoes a similar discussion he initiated last year, where he sought recommendations on what businesses should look for when hiring SEO consultants.

The conversation among industry experts highlighted key factors such as experience, customization, transparency, and adherence to ethical practices.

For more insights on choosing the right SEO professional, refer to our previous coverage of that discussion.

When To Seek Professional SEO Help

For businesses unsure about when to seek professional SEO help, here’s an article that outlines five critical situations that warrant hiring an SEO expert.

These include when Google isn’t indexing your site, during site migrations or redesigns, when organic traffic drops significantly, to reverse manual actions, and when current SEO strategies aren’t yielding results.

This information complements Mueller’s advice by helping businesses recognize when professional intervention is necessary.


Featured Image: YouTube.com/GoogleSearchCentral

Research Confirms Google AIO Keyword Trends via @sejournal, @martinibuster

New research by enterprise search marketing company BrightEdge reveals dramatic changes to the sites surfaced through Google’s AI Overviews search feature, and shows that, while Google maintains its search market share, AI search engine Perplexity is gaining ground at a remarkable pace.

Rapid & Dramatic Changes In AIO Triggers

The words that trigger AI Overviews are changing at an incredibly rapid pace. Some keyword trends from June may have already changed in July.

AI Overviews were triggered 50% more times for keywords with the word “best” in them. But Google may have reversed that behavior because those phrases, when applied to products, don’t appear to be triggering AIOs in July.

Other AIO triggers for June 2024:

  • “What is” keywords increased by 20%
  • “How to” queries increased by 15%
  • Queries with the phrase “symptoms of” increased by about 12%
  • Queries with the word “treatment” increased by 10%

A spokesperson from BrightEdge responded to my questions about ecommerce search queries:

“AI’s prevalence in ecommerce is indeed increasing, with a nearly 20% rise in ecommerce keywords showing AI overviews since the beginning of July, and a dramatic 62.6% increase compared to the last week of June. Alongside this growth, we’re seeing a significant 66.67% uptick in product searches that contain both pros and cons from the AI overview. This dual trend indicates not only more prevalent use of AI in ecommerce search results but also more comprehensive and useful information being provided to consumers through features like the pros/cons modules.”

Google Search And AI Trends

BrightEdge used its proprietary BrightEdge Generative Parser™ (BGP) tool to identify key trends in search that may influence digital marketing for the rest of 2024. BGP is a tool that collects massive amounts of search trend data and turns it into actionable insights.

Their research estimates that each percentage point of search market share represents $1.2 billion, which means that gains as small as single digits are still incredibly valuable.

Jim Yu, founder and executive chairman of BrightEdge noted:

“There is no doubt that Google’s dominance remains strong, and what it does in AI matters to every business and marketer across the planet.

At the same time, new players are laying new foundations as we enter an AI-led multi-search universe. AI is in a constant state of progress, so the most important thing marketers can do now is leverage the precision of insights to monitor, prepare for changes, and adapt accordingly.”

Google continues to be the most dominant source of search traffic, driving approximately 92% of organic search referrals. A remarkable data point from the research is that AI competitors in all forms have not yet made a significant impact as a source of traffic, deflating speculation that AI competitors will cut into Google’s search traffic.

Massive Decrease In Reddit & Quora Referrals

Back in May 2024, Google signaled that it would reduce the amount of user generated content (UGC) surfaced through its AI Overviews search feature, and of interest to search marketers is that Google has followed through. UGC is responsible for many of the outrageously bad responses that generated negative press. BrightEdge’s research shows that referrals to Reddit and Quora from AI Overviews declined to “near zero” in the month of June.

Citations to Quora from AI Overviews are reported to have decreased by 99.69%. Reddit fared marginally better in June, with an 85.71% decrease.

BrightEdge’s report noted:

“Google is prioritizing established, expert content over user discussions and forums.”

Bing, Perplexity And Chatbot Impact

Market share for Bing continues to increase but only by fractions of a percentage point, growing from 4.2% to 4.5%. But as they say, it’s better to be moving forward than standing still.

Perplexity, on the other hand, is growing at a monthly rate of 31%. Percentages can be misleading, however, because 31% of a relatively small number is still a relatively small number. Most publishers aren’t reporting meaningful traffic from Perplexity, so it still has a way to go. Nevertheless, a monthly growth rate of 31% is movement in the right direction.
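To put that growth rate in perspective, a quick sketch of what 31% monthly compounding does over a year, starting from a deliberately hypothetical base of 1,000 referrals (not a BrightEdge figure):

```python
# Hypothetical starting point: 1,000 monthly referrals (illustrative only,
# not a number from BrightEdge's report).
referrals = 1_000
monthly_growth = 0.31

for month in range(1, 13):
    referrals *= 1 + monthly_growth  # compound 31% each month

# After twelve months of 31% monthly growth, the volume is roughly
# 25x the starting base -- large relative growth, still a modest absolute number
# if the base was small.
print(round(referrals))
```

The point the sketch illustrates is exactly the one above: the percentage is dramatic, but whether it matters depends entirely on the absolute base it compounds from.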

Traffic from chatbots isn’t really a thing yet, so this comparison should be put into that perspective. Sending referral traffic to websites isn’t what chatbots like Claude and ChatGPT are about (at this point in time). The data shows that neither Claude nor ChatGPT is sending much traffic.

OpenAI, however, hides referrals from the websites it sends traffic to, which makes that traffic difficult to track. A full understanding of the impact of LLM traffic is therefore elusive, because ChatGPT uses the rel=noreferrer HTML attribute, which hides the origin of all traffic going from ChatGPT to websites. The use of the rel=noreferrer link attribute is not unusual, though, because it’s an industry standard for privacy and security.
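For context on why that hides the traffic: rel=noreferrer tells the browser to omit the Referer header entirely, so the visit arrives with no source and gets bucketed as “direct” by analytics tools. A minimal sketch of that bucketing logic, using hypothetical log entries:

```python
# Each tuple is (landing_page, referer_header). An empty referer is what a
# rel="noreferrer" link produces -- the browser omits the header entirely.
# These log entries are hypothetical, for illustration only.
access_log = [
    ("/pricing", "https://www.google.com/"),  # organic search referral
    ("/pricing", ""),                         # could be ChatGPT -- or a bookmark
    ("/blog/post", ""),                       # noreferrer strips the source
]

def classify(referer: str) -> str:
    """Bucket a hit by its Referer header, roughly the way analytics tools do."""
    if not referer:
        return "direct"  # noreferrer traffic lands here, indistinguishable
    if "google." in referer:
        return "organic search"
    return "referral"

counts: dict[str, int] = {}
for _, referer in access_log:
    bucket = classify(referer)
    counts[bucket] = counts.get(bucket, 0) + 1

print(counts)  # ChatGPT visits are folded into "direct"
```

Because the ChatGPT visit and the bookmark visit look identical in the log, there is no way to separate them after the fact, which is exactly why LLM referral volume is hard to measure.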

BrightEdge’s analysis looks at this from a long term perspective and anticipates that referral traffic from LLMs will become more prevalent and at some point will become a significant consideration for marketers.

This is the conclusion reached by BrightEdge:

“The overall number of referrals from LLMs is small and expected to have little industry impact at this time. However, if this incremental growth continues, BrightEdge predicts it will influence where people search online and how brands approach optimizing for different engines.”

Before the iPhone existed, many scoffed at the idea of the Internet on mobile devices. So BrightEdge’s conclusions about what to expect from LLMs are not unreasonable.

AIO trends have already changed in July, pointing to the importance of having fresh data for adapting to fast-changing AIO keyword trends. BrightEdge delivers real-time data updated on a daily basis so that marketers can make better-informed decisions.

Understand AI Overview Trends:

Ten Observations On AI Overviews For June 2024

Featured Image by Shutterstock/Krakenimages.com

Anthropic Announces Free Claude AI Chatbot For Android via @sejournal, @martinibuster

Anthropic announced the release of a new Claude Android app that uses their powerful Claude 3.5 Sonnet language model. The app is available free (with usage limits) and also with paid plans.

Anthropic Claude

Claude is a powerful AI chatbot that offers advanced reasoning, can do real-time image analysis, and can translate languages in real-time. Claude 3.5 Sonnet is Anthropic’s most advanced language model, introduced in late June 2024.

According to Anthropic:

“Claude 3.5 Sonnet raises the industry bar for intelligence, outperforming competitor models and Claude 3 Opus on a wide range of evaluations, with the speed and cost of our mid-tier model, Claude 3 Sonnet.

Claude 3.5 Sonnet sets new industry benchmarks for graduate-level reasoning (GPQA), undergraduate-level knowledge (MMLU), and coding proficiency (HumanEval). It shows marked improvement in grasping nuance, humor, and complex instructions, and is exceptional at writing high-quality content with a natural, relatable tone.”

Claude By Anthropic Android App

The Claude AI chatbot app was already available for iOS, and now it’s available from the Google Play store for Android users. Downloading and signing up is easy. Once signed in and verified, users can start using Claude absolutely free. I downloaded it and gave it a try and was pleasantly surprised at its ability to help create a ramen recipe from scratch. A cool feature of the app is that it can continue chats from other devices.

The official announcement described various ways it’s useful:

“Use Claude for work or for fun. Whether you’re drafting a business proposal between meetings, translating menus while traveling, brainstorming gift ideas while shopping, or composing a speech while waiting for a flight, Claude is ready to assist you.”

Download the Claude by Anthropic Android App from Google Play:

Claude by Anthropic

Read the official announcement:

Claude Android app