Google May Rely Less On Hreflang, Shift To Auto Language Detection via @sejournal, @MattGSouthern

In the latest episode of Google’s “Search Off The Record” podcast, a member of the Search Relations team suggested that Google may be moving towards automatically detecting language versions of web pages, potentially reducing the need for manual hreflang annotations.

Google’s Stance On Automatic Language Detection

Gary Illyes, a Google analyst, believes that search engines should rely less on annotations like hreflang and more on automatically learned signals.

Illyes stated during the podcast:

“Ultimately, I would want less and less annotations, site annotations, and more automatically learned things.”

He argued that this approach is more reliable than the current system of manual annotations.

Illyes elaborated on the existing capabilities of Google’s systems:

“Almost ten years ago, we could already do that, and this was what, almost ten years ago.”

Illyes emphasized the potential for improvement in this area:

“If, almost ten years ago, we could already do that quite reliably, then why would we not be able to do it now.”

The Current State Of Hreflang Implementation

The discussion also touched on the current state of hreflang implementation.

According to data cited in the podcast, only about 9% of websites currently use hreflang annotations on their home pages.

This relatively low adoption rate might be a factor in Google’s consideration of alternative methods for detecting language and regional targeting.

Potential Challenges & Overrides

While advocating for automatic detection, Illyes acknowledged that website owners should be able to override automatic detections if necessary.

He conceded, “I think we should have overrides,” recognizing the need for manual control in some situations.

The Future Of Multilingual SEO

While no official changes have been announced, this discussion provides insight into the potential future direction of Google’s approach to multilingual and multi-regional websites.

Stay tuned for any official updates from Google on this topic.

What This Means For You

This potential shift in Google’s language detection and targeting approach could have significant implications for website owners and SEO professionals.

It could reduce the technical burden of implementing hreflang annotations, particularly for large websites with multiple language versions.
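
For reference, hreflang annotations are usually added as link elements in a page’s head section (they can also be supplied via HTTP headers or an XML sitemap). A minimal illustrative snippet, using hypothetical URLs, for a page available in English and German might look like this:

    <!-- English version -->
    <link rel="alternate" hreflang="en" href="https://example.com/en/" />
    <!-- German version -->
    <link rel="alternate" hreflang="de" href="https://example.com/de/" />
    <!-- Fallback for users whose language isn't listed -->
    <link rel="alternate" hreflang="x-default" href="https://example.com/" />

Each language version is expected to carry the full set of annotations, including a self-referencing entry, which is part of why the markup becomes burdensome at scale.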

The top takeaways from this discussion include the following:

  1. It’s advisable to continue following Google’s current guidelines on implementing hreflang annotations.
  2. Ensure that your multilingual content is high-quality and accurately translated. This will likely remain crucial regardless of how Google detects language versions.
  3. While no immediate changes are planned, be ready to adapt your SEO strategy if Google moves towards more automatic language detection.
  4. If you’re planning a new multilingual site or restructuring an existing one, consider a clear and logical structure that makes language versions obvious, as this may help with automatic detection.

Remember, while automation may increase, having a solid understanding of international SEO principles will remain valuable for optimizing your global web presence.

Listen to the full podcast episode below:

Google Insights: Can Incorrect Hreflang Tags Hurt SEO? via @sejournal, @MattGSouthern

In a recent episode of Google’s Search Off The Record podcast, Gary Illyes, a member of Google’s Search Relations team, addressed concerns about incorrect hreflang implementation and its potential impact on SEO.

Hreflang Errors: Less Problematic Than Expected?

During the discussion, Illyes was asked about the consequences of mismatched hreflang annotations and actual page content.

Specifically, he addressed scenarios where a page might be incorrectly labeled as one language while containing content in another.

Illyes stated:

“As far as I remember, I worked on the parsing implementation plus the promotion implementation of hreflang, and back then, it didn’t cause problems.”

However, he also noted that his direct experience with this was from around 2016, adding the following:

“That’s a few years back… since then, we changed so many things that I would have to check whether it causes problems.”

Language Demotion & Country Promotion

Providing further context, Illyes explained Google’s approach to language and country relevance:

“When I spelled out LDCP, I said the language demotion country promotion. So, for example, if someone is searching in German and your page is in English, then you would get a negative demotion in the search results.”

This suggests that while incorrect hreflang implementation might not directly cause problems, the actual language of the content still plays a vital role in search relevance.
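
To illustrate the scenario raised in the question: a hypothetical page whose content is written in English, but which carries an annotation like the sketch below, would be labeled as German by its hreflang. Per Illyes’ explanation, it is the actual language of the content that still drives relevance for a German-language query.

    <!-- hreflang claims German, but the content at this hypothetical URL is in English -->
    <link rel="alternate" hreflang="de" href="https://example.com/de/" />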

Exceptions To Language Matching

Interestingly, Illyes pointed out that there are exceptions to strict language matching:

“It’s less relevant to the query to the person unless you are searching for something like ‘how do you spell banana’… Because then it doesn’t really matter… well no it does… it still matters but… because you’re searching for something in English, so we would think okay you want some page that explains how to spell banana in English, not German.”

What This Means For You

Understanding how Google handles hreflang and language mismatches can help inform international SEO strategies.

While Google’s systems appear to be somewhat forgiving of hreflang errors, the actual language of the content remains a key factor in search relevance.

Here are the top takeaways:

  1. While incorrect hreflang implementation may not directly penalize your site, it’s still best practice to ensure your annotations accurately reflect your content.
  2. The actual language of your content appears to be more important than hreflang annotations for search relevance.
  3. For specific queries, like spelling or language-learning topics, Google may be more flexible in presenting content in various languages.

As Illyes noted, Google’s systems have changed over time. Continue to monitor official Google documentation and announcements for the most up-to-date best practices in international SEO.

Listen to the full podcast episode below:


Featured Image: Longfin Media/Shutterstock

Google Hints Lowering SEO Value Of Country Code Top-Level Domains via @sejournal, @MattGSouthern

In a recent episode of Google’s Search Off The Record podcast, the company’s Search Relations team hinted at potential changes in how country-code top-level domains (ccTLDs) are valued for SEO.

This revelation came during a discussion on internationalization and hreflang implementation.

The Fading Importance Of ccTLDs

Gary Illyes, a senior member of Google’s Search Relations team, suggested that the localization boost traditionally associated with ccTLDs may soon disappear.

Illyes stated:

“I think eventually, like in years’ time, that [ccTLD benefit] will also fade away.”

He explained that ccTLDs are becoming less reliable indicators of a website’s geographic target audience.

Creative Use Of ccTLDs For Branding

According to Illyes, the primary reason for this shift is the creative use of ccTLDs for branding purposes rather than geographic targeting.

He elaborated:

“Think about the all the funny domain names that you can buy nowadays like the .ai. I think that’s Antigua or something… It doesn’t say anything anymore about the country… it doesn’t mean that the content is for the country.”

Illyes further explained the historical context and why this change is occurring:

“One of the main algorithms that do the whole localization thing… is called something like LDCP – language demotion country promotion. So basically if you have like a .de, then for users in Germany you would get like a slight boost with your .de domain name. But nowadays, with .co or whatever .de, which doesn’t relate to Germany anymore, it doesn’t really make sense for us to like automatically apply that little boost because it’s ambiguous what the target is.”

The Impact On SEO Strategies

This change in perspective could have implications for international SEO strategies.

Traditionally, many businesses have invested in ccTLDs to gain a perceived advantage in local search results.

If Google stops using ccTLDs as a strong signal for geographic relevance, this could alter how companies approach their domain strategy for different markets.

Marketing Value Of ccTLDs

However, Illyes also noted that from a marketing perspective, there might still be some value in purchasing ccTLDs:

“I think from a marketing perspective there’s still some value in buying the ccTLDs and if I… if I were to run some… like a new business, then I would try to buy the country TLDs when I can, when like it’s monetarily feasible, but I would not worry too much about it.”

What This Means For You

As search engines become more capable of understanding content and context, traditional signals like ccTLDs may carry less weight.

This could lead to a more level playing field for websites, regardless of their domain extension.

Here are some top takeaways:

  1. If you’ve invested heavily in country-specific domains for SEO purposes, it may be time to reassess this strategy.
  2. Should the importance of ccTLDs decrease, proper implementation of hreflang tags becomes crucial for indicating language and regional targeting (see the example after this list).
  3. While the SEO benefits may diminish, ccTLDs can still have branding and marketing value.
  4. Watch for official announcements or changes in Google’s documentation regarding the use of ccTLDs and international SEO best practices.
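
For illustration, here is a minimal sketch (with hypothetical URLs) of how a single generic domain can signal regional targeting through hreflang rather than relying on separate ccTLDs:

    <!-- US English version -->
    <link rel="alternate" hreflang="en-us" href="https://example.com/us/" />
    <!-- UK English version -->
    <link rel="alternate" hreflang="en-gb" href="https://example.com/uk/" />
    <!-- Fallback for everyone else -->
    <link rel="alternate" hreflang="x-default" href="https://example.com/" />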

While no immediate changes were announced, this discussion provides valuable insight into the potential future direction of international SEO.

Listen to the full podcast episode below:

Google Advises Caution With AI Generated Answers via @sejournal, @martinibuster

Google’s Gary Illyes cautioned about the use of Large Language Models (LLMs), affirming the importance of checking authoritative sources before accepting any answers from an LLM. His answer was given in response to a question, but curiously, he didn’t publish what that question was.

LLM Answer Engines

Based on what Gary Illyes said, it’s clear that the context of his recommendation is the use of AI for answering queries. The statement comes in the wake of OpenAI’s announcement of SearchGPT, an AI search engine prototype the company is testing, though his statement may be unrelated to that announcement and simply a coincidence.

Gary first explained how LLMs craft answers to questions and mentioned that a technique called “grounding” can improve the accuracy of AI-generated answers, though it’s not 100% perfect and mistakes still slip through. Grounding is a way to connect a database of facts, knowledge, and web pages to an LLM, with the goal of anchoring AI-generated answers to authoritative facts.

This is what Gary posted:

“Based on their training data LLMs find the most suitable words, phrases, and sentences that align with a prompt’s context and meaning.

This allows them to generate relevant and coherent responses. But not necessarily factually correct ones. YOU, the user of these LLMs, still need to validate the answers based on what you know about the topic you asked the LLM about or based on additional reading on resources that are authoritative for your query.

Grounding can help create more factually correct responses, sure, but it’s not perfect; it doesn’t replace your brain. The internet is full of intended and unintended misinformation, and you wouldn’t believe everything you read online, so why would you LLM responses?

Alas. This post is also online and I might be an LLM. Eh, you do you.”

AI Generated Content And Answers

Gary’s LinkedIn post is a reminder that LLMs generate answers that are contextually relevant to the questions asked, but contextually relevant answers aren’t necessarily factually accurate.

Authoritativeness and trustworthiness are important qualities of the kind of content Google tries to rank. Therefore, it is in publishers’ best interest to consistently fact-check content, especially AI-generated content, to avoid inadvertently becoming less authoritative. The need to verify facts also holds true for those who use generative AI for answers.

Read Gary’s LinkedIn Post:

Answering something from my inbox here

Featured Image by Shutterstock/Roman Samborskyi

Google Cautions On Blocking GoogleOther Bot via @sejournal, @martinibuster

Google’s Gary Illyes answered a question about the non-search features that the GoogleOther crawler supports, then added a caution about the consequences of blocking GoogleOther.

What Is GoogleOther?

GoogleOther is a generic crawler created by Google for purposes that fall outside those of the bots that specialize in Search, Ads, Video, Images, News, Desktop, and Mobile. It can be used by internal teams at Google for research and development related to various products.

The official description of GoogleOther is:

“GoogleOther is the generic crawler that may be used by various product teams for fetching publicly accessible content from sites. For example, it may be used for one-off crawls for internal research and development.”

Something that may be surprising is that there are actually three kinds of GoogleOther crawlers.

Three Kinds Of GoogleOther Crawlers

  1. GoogleOther
    Generic crawler for public URLs
  2. GoogleOther-Image
    Optimized to crawl public image URLs
  3. GoogleOther-Video
    Optimized to crawl public video URLs

All three GoogleOther crawlers can be used for research and development, which is the one purpose Google publicly acknowledges for all three versions.

What Non-Search Features Does GoogleOther Support?

Google doesn’t say what specific non-search features GoogleOther supports, probably because it doesn’t really “support” a specific feature. It exists for research and development crawling, which could be in support of a new product or an improvement to a current product; its purpose is highly open and generic.

This is the question, as narrated by Gary:

“What non-search features does GoogleOther crawling support?”

Gary Illyes answered:

“This is a very topical question, and I think it is a very good question. Besides what’s in the public I don’t have more to share.

GoogleOther is the generic crawler that may be used by various product teams for fetching publicly accessible content from sites. For example, it may be used for one-off crawls for internal research and development.

Historically Googlebot was used for this, but that kind of makes things murky and less transparent, so we launched GoogleOther so you have better controls over what your site is crawled for.

That said GoogleOther is not tied to a single product, so opting out of GoogleOther crawling might affect a wide range of things across the Google universe; alas, not Search, search is only Googlebot.”

It Might Affect A Wide Range Of Things

Gary is clear that blocking GoogleOther wouldn’t have an effect on Google Search because Googlebot is the crawler used for indexing content. So if blocking any of the three versions of GoogleOther is something a site owner wants to do, it should be okay to do that without a negative effect on search rankings.
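
For site owners who do want to opt out, the mechanism is a standard robots.txt rule. A minimal sketch covering all three crawler variants might look like this:

    # Block the generic GoogleOther crawlers (does not affect Googlebot or Search)
    User-agent: GoogleOther
    User-agent: GoogleOther-Image
    User-agent: GoogleOther-Video
    Disallow: /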

But Gary also cautioned about the outcome of blocking GoogleOther, saying that it would have an effect on other products and services across Google. He didn’t state which other products it could affect, nor did he elaborate on the pros and cons of blocking GoogleOther.

Pros And Cons Of Blocking GoogleOther

Whether or not to block GoogleOther doesn’t necessarily have a straightforward answer. Several considerations determine whether doing so makes sense.

Pros

Inclusion in research for a future Google product related to search (maps, shopping, images, or a new search feature) could be useful. A site included in that kind of research might be one of the few chosen to test a feature that could ultimately increase its visibility or earnings.

Another consideration is that blocking GoogleOther to save on server resources is not necessarily a valid reason because GoogleOther doesn’t seem to crawl so often that it makes a noticeable impact.

If blocking Google from using site content for AI is a concern then blocking GoogleOther will have no impact on that at all. GoogleOther has nothing to do with crawling for Google Gemini apps or Vertex AI, including any future products that will be used for training associated language models. The bot for that specific use case is Google-Extended.

Cons

On the other hand it might not be helpful to allow GoogleOther if it’s being used to test something related to fighting spam and there’s something the site has to hide.

It’s possible that a site owner might not want to participate if GoogleOther comes crawling for market research or for training machine learning models (for internal purposes) that are unrelated to public-facing products like Gemini and Vertex.

Allowing GoogleOther to crawl a site for unknown purposes is like giving Google a blank check to use your site data in any way it sees fit outside of training public-facing LLMs or purposes related to named bots like Googlebot.

Takeaway

Should you block GoogleOther? It’s a coin toss. There are potential benefits, but in general there isn’t enough information to make an informed decision.

Listen to the Google SEO Office Hours podcast at the 1:30 minute mark:

Featured Image by Shutterstock/Cast Of Thousands

Reddit Limits Search Engine Access, Google Remains Exception via @sejournal, @MattGSouthern

Reddit has recently tightened its grip on who can access its content, blocking major search engines from indexing recent posts and comments.

This move has sparked discussions in the SEO and digital marketing communities about the future of content accessibility and AI training data.

What’s Happening?

As first reported by 404 Media, Reddit updated its robots.txt file to prevent most web crawlers from accessing its latest content.

Google, however, remains an exception, likely due to a $60 million deal that allows the search giant to use Reddit’s content for AI training.
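
Reddit’s actual robots.txt file isn’t reproduced here, but the general pattern described, blocking crawlers by default while treating one named crawler as an exception, is a standard robots.txt construct. A simplified, hypothetical illustration:

    # Hypothetical example of a "block by default, allow one crawler" pattern
    User-agent: Googlebot
    Allow: /

    User-agent: *
    Disallow: /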

Brent Csutoras, founder of Search Engine Journal, offers some context:

“Since taking on new investors and starting their pathway to IPO, Reddit has moved away from being open-source and allowing anyone to scrape their content and use their APIs without paying.”

The Google Exception

Currently, Google is the only major search engine able to display recent Reddit results when users search with “site:reddit.com.”

This exclusive access sets Google apart from competitors like Bing and DuckDuckGo.

Why This Matters

For users who rely on appending “Reddit” to their searches to find human-generated answers, this change means they’ll be limited to using Google or search engines that pull from Google’s index.

It presents new challenges for SEO professionals and marketers in monitoring and analyzing discussions on one of the internet’s largest platforms.

The Bigger Picture

Reddit’s move aligns with a broader trend of content creators and platforms seeking compensation for using their data in AI training.

As Csutoras points out:

“Publications, artists, and entertainers have been suing OpenAI and other AI companies, blocking AI companies, and fighting to avoid using public content for AI training.”

What’s Next?

While this development may seem surprising, Csutoras suggests it’s a logical step for Reddit.

He notes:

“It seems smart on Reddit’s part, especially since similar moves in the past have allowed them to IPO and see strong growth for their valuation over the last two years.”


FAQ

What is the recent change Reddit has made regarding content accessibility?

Reddit has updated its robots.txt file to block major search engines from indexing its latest posts and comments. This change exempts Google due to a $60 million deal, allowing Google to use Reddit’s content for AI training purposes.

Why does Google have exclusive access to Reddit’s latest content?

Google has exclusive access to Reddit’s latest content because of a $60 million deal that allows Google to use Reddit’s content for AI training. This agreement sets Google apart from other search engines like Bing and DuckDuckGo, which are unable to index new Reddit posts and comments.

What broader trend does Reddit’s recent move reflect?

Reddit’s decision to limit search engine access aligns with a larger trend where content creators and platforms seek compensation for the use of their data in AI training. Many publications, artists, and entertainers are taking similar actions to either block or demand compensation from AI companies using their content.


Featured Image: Mamun sheikh K/Shutterstock

Study Backs Google’s Claims: AI Search Boosts User Satisfaction via @sejournal, @MattGSouthern

A new study finds that despite concerns about AI in online services, users are more satisfied with search engines and social media platforms than before.

The American Customer Satisfaction Index (ACSI) conducted its annual survey of search and social media users, finding that satisfaction has either held steady or improved.

This comes at a time when major tech companies are heavily investing in AI to enhance their services.

Search Engine Satisfaction Holds Strong

Google, Bing, and other search engines have rapidly integrated AI features into their platforms over the past year. While critics have raised concerns about potential negative impacts, the ACSI study suggests users are responding positively.

Google maintains its position as the most satisfying search engine with an ACSI score of 81, up 1% from last year. Users particularly appreciate its AI-powered features.

Interestingly, Bing and Yahoo! have seen notable improvements in user satisfaction, notching 3% gains to reach scores of 77 and 76, respectively. These are their highest ACSI scores in over a decade, likely due to their AI enhancements launched in 2023.

The study hints at the potential of new AI-enabled search functionality to drive further improvements in the customer experience. Bing has seen its market share improve by small but notable margins, rising from 6.35% in the first quarter of 2023 to 7.87% in Q1 2024.

Customer Experience Improvements

The ACSI study shows improvements across nearly all benchmarks of the customer experience for search engines. Notable areas of improvement include:

  • Ease of navigation
  • Ease of using the site on different devices
  • Loading speed performance and reliability
  • Variety of services and information
  • Freshness of content

These improvements suggest that AI enhancements positively impact various aspects of the search experience.

Social Media Sees Modest Gains

For the third year in a row, user satisfaction with social media platforms is on the rise, increasing 1% to an ACSI score of 74.

TikTok has emerged as the new industry leader among major sites, edging past YouTube with a score of 78. This underscores the platform’s effective use of AI-driven content recommendations.

Meta’s Facebook and Instagram have also seen significant improvements in user satisfaction, showing 3-point gains. While Facebook remains near the bottom of the industry at 69, Instagram’s score of 76 puts it within striking distance of the leaders.

Challenges Remain

Despite improvements, the study highlights ongoing privacy and advertising challenges for search engines and social media platforms. Privacy ratings for search engines remain relatively low but steady at 79, while social media platforms score even lower at 73.

Advertising experiences emerge as a key differentiator between higher- and lower-satisfaction brands, particularly in social media. New ACSI benchmarks reveal user concerns about advertising content’s trustworthiness and personal relevance.

Why This Matters For SEO Professionals

This study provides an independent perspective on how users are responding to the AI push in online services. For SEO professionals, these findings suggest that:

  1. AI-enhanced search features resonate with users, potentially changing search behavior and expectations.
  2. The improving satisfaction with alternative search engines like Bing may lead to a more diverse search landscape.
  3. The continued importance of factors like content freshness and site performance in user satisfaction aligns with long-standing SEO best practices.

As AI becomes more integrated into our online experiences, SEO strategies may need to adapt to changing user preferences.


Featured Image: kate3155/Shutterstock

OpenAI Launches SearchGPT: AI-Powered Search Prototype via @sejournal, @MattGSouthern

OpenAI has announced the launch of SearchGPT, a prototype AI-powered search engine.

This move marks the company’s entry into the competitive search market, potentially challenging established players.

Key Features & Functionality

SearchGPT aims to directly answer user queries by combining AI language models with real-time web information.

Rather than offering a list of links, SearchGPT attempts to deliver concise responses with citations to source material.

Here’s an example of a search results page for the query: “music festivals in boone north carolina in august.”

Screenshot from openai.com/index/searchgpt-prototype/, July 2024.

The SearchGPT prototype includes:

  • A conversational interface allowing follow-up questions
  • Real-time information retrieval from web sources
  • In-line attributions and links to original content

Publisher Controls & Content Management

OpenAI is also introducing tools for publishers to manage how their content appears in SearchGPT, giving them more control over their presence in AI-powered search results.

Key points about the publisher controls include:

  1. Separate from AI training: OpenAI emphasizes that SearchGPT is distinct from the training of their generative AI models. Sites can appear in search results even if they opt out of AI training data.
  2. Content management options: Publishers can influence how their content is displayed and used within SearchGPT.
  3. Feedback mechanism: OpenAI has provided an email (publishers-feedback@openai.com) for publishers to share their thoughts and concerns.
  4. Performance insights: The company plans to share information with publishers about their content’s performance within the AI search ecosystem.

These tools are OpenAI’s response to ongoing debates about AI’s use of web content and concerns over intellectual property rights.
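
The first point above mirrors a mechanism OpenAI already documents publicly: the GPTBot user agent can be disallowed in robots.txt to opt out of model-training crawls, and because SearchGPT is described as separate from training, an opt-out like the sketch below should not by itself keep a site out of SearchGPT results (exact tokens and behavior are subject to OpenAI’s current documentation):

    # Opt out of OpenAI model-training crawls only
    User-agent: GPTBot
    Disallow: /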

Publisher Partnerships & Reactions

OpenAI reports collaborating with several publishers during the development of SearchGPT.

Nicholas Thompson, CEO of The Atlantic, provided a statement supporting the initiative, emphasizing the importance of valuing and protecting journalism in AI search development.

Robert Thomson, News Corp’s chief executive, also commented on the project, stressing the need for a symbiotic relationship between technology and content and the importance of protecting content provenance.

Limited Availability & Future Plans

Currently, SearchGPT is available to a restricted group of users and publishers.

OpenAI describes it as a temporary prototype, indicating plans to eventually integrate its features into the existing ChatGPT product.

Why This Matters

The introduction of SearchGPT represents a potential shakeup to the search engine market.

This development could have far-reaching implications for digital marketing, content creation, and user behavior on the internet.

Potential effects include:

  • Changes in content distribution and discovery mechanisms
  • New considerations for search engine optimization strategies
  • Evolving relationships between AI companies and content creators

Remember, this is still a prototype, and we have yet to see its capabilities.

There’s a waitlist available for those trying to get their hands on it early.

What This Means For You

AI-powered search might offer users more direct access to information. However, the accuracy and comprehensiveness of results may depend on publisher participation and content management choices.

For content creators and publishers, these new tools provide opportunities to have more say in how their work is used in AI search contexts.

While it may increase content visibility and engagement, it also requires adapting to new formats and strategies to ensure content is AI-friendly and easily discoverable.

As SearchGPT moves from prototype to integration with ChatGPT, it will be vital to stay informed about these developments and adapt your strategies.

The future of search is evolving, and AI is at the forefront of this transformation.

Bing’s Updated AI Search Will Make Site Owners Happy via @sejournal, @martinibuster

Bing is rolling out a new version of Generative Search that displays information in an intuitive way that encourages exploration but also prioritizes clicks from the search results to websites.

Microsoft introduced their new version of AI search:

“After introducing LLM-powered chat answers on Bing in February of last year, we’ve been hard at work on the ongoing revolution of search. …Today, we’re excited to share an early view of our new generative search experience which is currently shipping to a small percentage of user queries.”

New Layout

Bing’s announcement discusses new features that not only make it easy for users to find information but also make it easy to see the organic search results, click through, and browse websites.

On the desktop view, Bing shows three panels:

  • A table of contents on the left
  • AI answers in the center (with links to website sources)
  • Traditional organic search results on the right hand side
  • Even more organic search results beneath “the fold”

The table of contents on the left-hand side invites exploration. It has the main topic at the top, with directly related subtopics beneath it. This is more useful than a People Also Asked type of navigation because it encourages users to explore and click through to organic search results to keep exploring.

Screenshot: Table Of Contents

This layout is the result of a conscious decision at Bing to engineer it so that it preserves and encourages clicks to websites.

Below is a screenshot of the new generative AI search experience. What’s notable is how Bing surrounds the AI answers with organic search results.

Screenshot Of The New Bing AI Search Results

Bing makes a point of explaining that it tested the new interface to make sure the search results send the same amount of traffic to websites and to avoid creating a layout that results in an increase in zero-click searches.

When other search engines talk about search quality, it is always in the context of user satisfaction. Bing’s announcement makes it clear that sustaining traffic to websites was an important consideration that guided the design of the new layout.

Below is a screenshot of a typical Bing AI search result for a query about the life span of elephants.

Note that all the areas I bounded with blue boxes are AI answers, while everything outside the blue boxes is organic search results.

Screenshot Of Mix of AI And Organic Results


The screenshot makes it clear that there is a balance of organic search results and AI answers. In addition to those contextually relevant organic search results there are also search results on the right hand side (not shown in the above screenshot).

Microsoft’s blog post explained:

“We are continuing to look closely at how generative search impacts traffic to publishers. Early data indicates that this experience maintains the number of clicks to websites and supports a healthy web ecosystem. The generative search experience is designed with this in mind, including retaining traditional search results and increasing the number of clickable links, like the references in the results.”

Bing’s layout is a huge departure from the zero-click style of layouts seen in other search engines. Bing has purposely designed their generative AI layout to maintain clicks to websites. It cannot be overstated how ethical Bing’s approach to the web ecosystem is.

Bing Encourages Browsing And Discovery

An interesting feature of Bing’s implementation of generative AI search is that it shows the answer to the initial question first and also anticipates related questions. This is similar to a technique called “information gain,” where an AI search assistant ranks an initial set of pages that answers a search query, but also ranks a second, third, and fourth set of results containing additional information a user may be interested in, such as information on related topics.

What Bing does differently from the Information Gain technique is that Bing displays all the different search results on a single page and then uses a table of contents on the left hand side that makes it easy for a user to click and go straight to the additional AI answers and organic search results.

Bing’s Updated AI Search Is Rolling Out Now

Bing’s newly updated AI search engine layout is slowly rolling out and they are observing the feedback from users. Microsoft has already tested it and is confident that it will continue to send clicks to websites. Search engines have a relationship with websites, what is commonly referred to as the web ecosystem. Every strong relationship is based on giving, not taking. When both sides give it creates a situation where both sides receive.

More search engines should take Bing’s approach of engineering their search results to satisfy users in a way that encourages discovery on the websites that originate the content.

Read Bing’s announcement:

Introducing Bing generative search

Featured Image by Shutterstock/Primakov

Google To Upgrade All Retailers To New Merchant Center By September via @sejournal, @MattGSouthern

Google has announced plans to transition all retailers to its updated Merchant Center platform by September.

This move will affect e-commerce businesses globally and comes ahead of the holiday shopping season.

The Merchant Center is a tool for online retailers to manage how their products appear across Google’s shopping services.

Key Changes & Features

The new Merchant Center includes several significant updates.

Product Studio

An AI-powered tool for content creation. Google reports that 80% of current users view it as improving efficiency.

This feature allows retailers to generate tailored product assets, animate still images, and modify existing product images to match brand aesthetics.

It also simplifies tasks like background removal and image resolution enhancement.

Centralized Analytics

A new tab consolidating various business insights, including pricing data and competitive analysis tools.

Retailers can access pricing recommendations, competitive visibility reports, and retail-specific search trends, enabling them to make data-driven decisions and capitalize on popular product categories.

Redesigned Navigation

Google claims the new interface is more intuitive and cites increased setup success rates for new merchants.

The platform now offers simplified website verification processes and can pre-populate product information during setup.

Initial User Response

According to Google, early adopters have shown increased engagement with the platform.

The company reports a 25% increase in omnichannel merchants adding product offers in the new system. However, these figures have yet to be independently verified.

Jeff Harrell, Google’s Senior Director of Merchant Shopping, states in an announcement:

“We’ve seen a significant increase in retention and engagement among existing online merchants who have moved to the new Merchant Center.”

Potential Challenges and Support

While Google emphasizes the upgrade’s benefits, some retailers, particularly those comfortable with the current version, may face challenges adapting to the new system.

The upgrade’s mandatory nature could raise concerns among users who prefer the existing interface or have integrated workflows based on the current system.

To address these concerns, Google has stated that it will provide resources and support to help with the transition. This includes tutorial videos, detailed documentation, and access to customer support teams for troubleshooting.

Industry Context

This update comes as e-commerce platforms evolve, with major players like Amazon and Shopify enhancing their seller tools. Google’s move is part of broader efforts to maintain competitiveness in the e-commerce services sector.

The upgrade could impact consumers by improving product listings and providing more accurate information across Google’s shopping services.

For the e-commerce industry as a whole, it signals a continued push towards AI-driven tools and data-centric decision-making.

Transition Timeline

Google states that retailers who have not yet transitioned will be upgraded automatically by September.

The company advises users to familiarize themselves with the new features before the busy holiday shopping period.


Featured Image: BestForBest/Shutterstock