Google Says GSC Sitemap Uploads Don’t Guarantee Immediate Crawls via @sejournal, @martinibuster

Google’s John Mueller answered a question about how many sitemaps to upload, and then said there are no guarantees that any of the URLs will be crawled right away.

A member of the r/TechSEO community on Reddit asked if it’s enough to upload the main sitemap.xml file, which then links to the more granular sitemaps. What prompted the question was their concern over recently changing their website page slugs (URL file names).

That person asked:

“I submitted “sitemap.xml” to Google Search Console, is this sufficient or do I also need to submit page-sitemap.xml and sitemap-misc.xml as separate entries for it to work?
I recently changed my website’s page slugs, how long will it take for Google Search Console to consider the sitemap”

Mueller responded that uploading the sitemap index file (sitemap.xml) was enough and that Google would proceed from there. He also shared that it wasn’t necessary to upload the individual granular sitemaps.

What was of special interest were his comments indicating that uploading sitemaps didn’t “guarantee” that all the URLs would be crawled and that there is no set time for when Googlebot would crawl the sitemap URLs. He also suggested using the Inspect URL tool.

He shared:

“You can submit the individual ones, but you don’t really need to. Also, sitemaps don’t guarantee that everything is recrawled immediately + there’s no specific time for recrawling. For individual pages, I’d use the inspect URL tool and submit them (in addition to sitemaps).”

Is There Value In Uploading All Sitemaps?

According to John Mueller, it’s enough to upload the index sitemap file. However, from our side of the Search Console, I think most people would agree that it’s better not to leave it to chance that Google will or will not crawl a URL. For that reason, SEOs may decide it’s reassuring to go ahead and upload all sitemaps that contain the changed URLs.

The URL Inspection tool is a solid approach because it enables SEOs to request crawling for a specific URL. The downside of the tool is that you can only request this for one URL at a time. Google’s URL Inspection tool does not support bulk URL submissions for indexing.

See also: Bing Recommends lastmod Tags For AI Search Indexing

Featured Image by Shutterstock/Denis OREA

LinkedIn Study: Professionals Trust Their Networks Over AI & Search via @sejournal, @MattGSouthern

LinkedIn reports that professionals are more likely to seek workplace advice from people they know than from AI tools or search engines.

A new LinkedIn study finds that 43% turn to their networks first, with nearly two-thirds saying colleagues help them decide faster and with more confidence.

Key Findings

LinkedIn’s research indicates that professional networks rank ahead of AI and search for advice at work, with 43% naming their network as the first stop.

Sixty-four percent say colleagues improve the quality and speed of decision-making. The study also notes an 82% rise in posts about feeling overwhelmed or navigating change, suggesting that people are looking for clarity from trusted human voices.

Pressure To Learn AI

Learning about AI is causing stress for many people. Over half (51%) say upskilling feels like a second job, 33% feel embarrassed about their knowledge, and 35% feel nervous discussing AI at work.

Additionally, 41% say the fast pace of AI changes affects their well-being. Younger workers, especially Gen Z, are more likely to exaggerate their AI skills compared to Gen X.

Among those aged 18 to 24, 75% believe AI cannot replace the intuition from trusted colleagues. This aligns with the finding that people prefer advice from known experts, especially when the stakes are high.

Implications For B2B Buying And Marketing

The study shows that 77% of B2B marketing leaders say audiences rely on both a company’s channels and their professional networks. Millennials and Gen Z now represent 71% of B2B buyers, leading marketers to invest in trusted individuals within those networks.

Eighty percent of marketers plan to increase spending on community-driven content featuring creators, employees, and experts. They believe that trusted creators are key to building credibility with younger buyers.

This highlights that social discovery and community participation matter as much as search rankings. Content that’s easy to share and linked to recognized experts may reach more people than generic brand messages.

Why This Matters

As professionals turn to their networks for advice, you may need to adjust how you build trust and generate demand.

You can do this by encouraging your employees to share messages, working with trusted creators, and creating expert-led content that’s easy to find on social media.

While traditional SEO and paid ads still matter, networks can affect how people find, discuss, and validate your content before they visit your website.

Looking Ahead

As more people use AI, professionals are learning to combine new tools with their own judgment. Marketers can gain lasting benefits by focusing on building real relationships, rather than just mastering AI tools.

Methodology

The findings are based on research commissioned by LinkedIn and conducted by Censuswide. The study included 19,268 professionals and 7,000 B2B marketers from 14 countries, conducted from July 3 to July 15, 2025.

The percentages and program details mentioned above are taken directly from LinkedIn’s pressroom post.


Featured Image: Nurulliaa/Shutterstock

Google Brings Loyalty Offerings To Merchant Retailers via @sejournal, @brookeosmundson

Google has announced a new set of Merchant loyalty offerings, giving retailers a way to surface existing member perks.

Retailers who have loyalty offerings to their customers, such exclusive pricing, shipping, and points, can now show across both free listings and paid Shopping ads.

In addition to the loyalty offering, Google Ads is introducing a new loyalty goal to help brands optimize toward higher-value customers rather than focusing purely on short-term clicks.

The move, which officially launched on August 26, 2025, signals Google’s deeper investment in connecting retention strategies with its commerce ecosystem.

For retailers already managing robust loyalty programs, this rollout could be an opportunity to strengthen visibility and attract repeat shoppers directly within Google surfaces.

What is the New Loyalty Offering?

Merchant Center retailers can now activate a loyalty add-on within Merchant Center to display member benefits in Google Shopping results.

This includes member-only pricing, shipping perks, or points. This can appear across Search, the Shopping tab, free listings, as well as Wallet.

To go along with this loyalty offering, Google Ads is now offering a loyalty goal.

This gives advertisers the ability to steer Smart Bidding toward audiences with a higher lifetime value. This means campaign optimization shifts from a narrow one-time transaction focus to a longer-term view that considers repeat purchases and retention.

Where do Loyalty Perks Show Up?

Loyalty benefits can now appear across multiple touchpoints. Shoppers may see a member price next to the standard price or a shipping perk highlighted in listings.

Loyalty offerings example in Google Shopping adImage credit: Google Ads, August 2025

In the United States, retailers using Customer Match can show personalized loyalty annotations to identified members.

Google also allows member pricing to appear for unknown members in the U.S. and Australia, with more countries currently in beta testing.

This shift makes loyalty more visible during product research and comparison, when shoppers are deciding where to buy.

Who Can Take Advantage of Loyalty Offerings?

The program is currently available in the U.S., U.K., Germany, France, and Australia. Merchants must have an existing loyalty program and enable the loyalty add-on within Merchant Center.

To qualify, member pricing discounts must be at least 5% off or five units of local currency. Only national-level loyalty pricing is supported, and if a site-wide promotion is running, that will override any member pricing in ads.

Importantly, retailers need to use the dedicated “loyalty_program” attribute in their product feed. This supplies details like:

  • Member price
  • Points
  • Shipping benefits
  • Other member perks.

Google requires consistency between submitted feed data and what appears on-site.

Customer Match is required to show known-member personalization in ads within the U.S. Google is also piloting its use in free listings.

How do Retailers Get Started?

Retailers should begin by enabling the loyalty add-on in Merchant Center. Membership tiers and benefits must be clearly defined.

Feeds should be updated with the correct “loyalty_program” attributes. Customer Match lists need to be uploaded and kept current to unlock personalization for U.S. shoppers.

From there, testing the new loyalty goal in Google Ads will be key. Advertisers should compare performance against other bid strategies and review Merchant Center’s loyalty reporting to measure impact.

Highlighting Membership Value

Google’s loyalty features give retailers new ways to highlight membership value where it matters most: at the point of discovery. By surfacing perks in Search and Shopping, brands can differentiate themselves before the click.

The addition of a loyalty goal also encourages smarter optimization. Campaigns can focus not just on conversion volume but on the quality and long-term value of customers.

For retailers with established loyalty programs, this rollout is worth exploring now. It connects retention strategies with acquisition in a way that could drive measurable impact.

Perplexity’s Discover Pages Offer A Surprising SEO Insight via @sejournal, @martinibuster

A post on LinkedIn called attention to Perplexity’s content discovery feed called Discover, which generates content on trending news topics. It praised the feed as a positive example of programmatic SEO, although some said that its days in Google’s search results are numbered. Everyone in that discussion believes those pages are one thing. In fact, they are something else entirely.

Context: Perplexity Discover

Perplexity publishes a Discover feed of trending topics. The page is like a portal to the news of the day, featuring short summaries and links to web pages containing the full summary plus links to the original news reporting.

SEOs have noticed that some of those pages are ranking in Google Search, spurring a viral discussion on LinkedIn.

Perplexity Discover And Programmatic SEO

Programmatic SEO is the use of automation to optimize web content and could also apply to scaled content creation. It can be tricky to pull off well and can result in a poor outcome if not.

A LinkedIn post calling attention to the Perplexity AI-generated Discover feed cited it as an example of programmatic SEO “on steroids.”

They wrote:

“For every trending news topic, it automatically creates a public webpage.

These pages are now showing up in Google Search results.

When clicked, users land on a summary + can ask follow-up questions in the chatbot.

…This is such a good Programmatic SEO tactic put on steroids!”

One of the comments in that discussion hailed the Perplexity pages as an example of good programmatic SEO:

“This is a very bold move by Perplexity. Programmatic SEO at scale, backed by trending topics, is a smart way to capture attention and traffic. The key challenge will be sustainability – Google may see this as thin content or adjust algorithms against it. Still, it shows how AI + SEO is evolving faster than expected.”

Another person agreed:

“SEO has been part of their growth strategy since last year, and it works for them quite well”

The rest of the comments praised Perplexity’s SEO as “bold” and “clever” as well as providing “genuine user value.”

But there were also some that predicted that “Google won’t allow this trend…” and that “Google will nerf it in a few weeks…”

The overall sentiment of Perplexity’s implementation of programmatic SEO was positive.

Except that there is no SEO.

 Perplexity Discover Is Not Programmatic SEO

Contrary to what was said in the LinkedIn discussion, Perplexity is not engaging in “programmatic SEO,” nor are they trying to rank in Google.

A peek at the source code of any of the Discover pages shows that the title elements and the meta descriptions are not optimized to rank in search engines.

Screenshot Of A Perplexity Discover Web Page

Every single page created by Perplexity appears to have the exact same title and meta description elements:

Perplexity

Every page contains the same canonical tag:

https://www.perplexity.ai” />

It’s clear that Perplexity’s Discover pages are not optimized for Google Search and that the pages are not created for search engines.

The pages are created for humans.

Given how the Discover pages are not optimized, it’s not a surprise that:

  • Every page I tested failed to rank in Google Search.
  • It’s clear that Perplexity is engaged in programmatic SEO.
  • Perplexity’s Discover pages are not created to rank in Google Search.
  • Perplexity’s Discover pages are created specifically for humans.
  • If any pages rank in Google, that’s entirely an accident and not by design.

What Is Perplexity Actually Doing?

Perplexity’s Discover pages are examples of something bigger than SEO. They are web pages created for the benefit of users. The fact that no SEO is applied shows that Perplexity is focused on making the Discover pages destinations that users turn to in order to keep in touch with the events of the day.

Perplexity Discover is a user-first web destination created with zero SEO, likely because the goals are more ambitious than depending on Google for traffic.

The Surprising SEO Insight?

It may well be that a good starting point for creating a website and forming a strategy for promoting it lies outside the SEO sandbox. In my experience, I’ve had success creating and promoting outside the standard SEO framework, because SEO strategies are inherently limited: they have one goal, ranking, and miss out on activities that create popularity.

SEO limits how you can promote a site with arbitrary rules such as: 

  • Don’t obtain links from sites that nofollow their links.
  • Don’t get links from sites that have low popularity.
  • Offline promotion doesn’t help your site rank.

And here’s the thing: promoting a site with strategies focused on building brand name recognition with an audience tends to create the kinds of user behavior signals that we know Google is looking for.

Check out Perplexity’s Discover at perplexity.ai/discover.

Featured Image by Shutterstock/Cast Of Thousands

Google Wants To Show More Links In AI Mode via @sejournal, @MattGSouthern

Google says it’s actively working to surface more source links inside AI Mode.

Robby Stein, VP of Product for Google Search, outlined changes designed to make links more visible.

Stein wrote on X that Google has been testing where links appear inside AI answers and that the long-term “north star” is to show more inline links.

He added that people are more likely to click when links are embedded with context directly in the response.

Stein stated:

“We’ve been experimenting with how and where to show links in ways that are most helpful to users and sites… our long term north star is to show more inline links.”

What’s Changing

Link Carousels On Desktop.

Google has launched carousels that surface multiple source links directly inside AI Mode responses on desktop. Stein said mobile support is coming soon.

The idea is to present links with enough context to help people decide where to go next without hunting below the answer.

Smarter Inline Links

Google is rolling out model updates that decide where inline links appear within the response text.

The system is trained to place links at moments when people are most likely to click out to see where information came from or to learn more.

Stein noted you might see fluctuations over the next few weeks as this is deployed, with a longer-term push toward more inline links overall.

Web Guide

Separately, Google’s Web Guide experiment uses a custom Gemini model to group useful links by topic.

It launched in Search Labs on the “Web” tab and, for opted-in users, will begin appearing on the main “All” tab when systems determine it could help for a query.

Google introduced Web Guide in July and indicated it would expand beyond the Web tab over time.

Why It Matters

How Google presents links in AI Mode can influence how people reach your site.

Placing carousels within the answer and adjusting inline placements differ from links that appear only below the response. This may change click behavior depending on the query and presentation.

Looking Ahead

Google is trying to strike a balance between innovation and supporting publishers. Expect continued testing around link density, placement, and labeling as Google refines AI mode.


Featured Image: subh_naskar/Shutterstock

Perplexity Launches Comet Plus, Shares Revenue With Publishers via @sejournal, @MattGSouthern

Perplexity announced Comet Plus, a monthly subscription that pays participating publishers when people read their work and when AI systems use it to answer questions.

The company says subscriber payments go to partners, with a small portion retained to cover compute costs.

How Comet Plus Works

Comet Plus will be available for $5 per month. Existing Perplexity Pro and Max subscribers will have Comet Plus included.

Subscribers get direct access to participating publisher sites, answers informed by those sources, and agent workflows that can complete tasks on those sites. The offering is tied to the Comet browser and assistant.

About Revenue Sharing

Perplexity positions Comet Plus as a compensation model for an AI-centric web.

Publishers are paid for three interaction types:

  1. Human visits
  2. Search citations
  3. Agent actions.

Perplexity’s example of “agent traffic” is Comet Assistant scanning a calendar and suggesting relevant reading from publisher sites.

The idea is to reflect how people now consume information across browsing, AI answers, and agent workflows.

Perplexity wrote:

“Comet Plus is the first compensation model… based on three types of internet traffic: human visits, search citations, and agent actions.”

Availability

Interested publishers can email publishers@perplexity.ai to request to join the program.

Why It Matters

For publishers and marketers, the model expands monetization and measurement beyond traditional clicks.

Websites are testing a range of responses to AI usage of their content, from blocking crawlers to signing licenses.

Comet Plus differs from flat-fee deals by tying payouts to actual user and assistant activity, which could align compensation more closely with real demand.

Looking Ahead

Perplexity says it will announce an initial roster of publishing partners when the Comet browser becomes available to all users for free.

Early adoption, reporting transparency, and real revenue for partners will determine whether this model becomes a viable framework or stays a niche experiment.

Perplexity Comet Browser Vulnerable To Prompt Injection Exploit via @sejournal, @martinibuster

Brave published details about a security issue with Comet, Perplexity’s AI browser, that enables an attacker to inject a prompt into the browser and gain access to data in other open browser tabs.

Comet AI Browser Vulnerability

Brave described a vulnerability that can be activated when a user asks the Comet AI browser to summarize a web page. The LLM will read the web page, including any embedded prompts that command the LLM to take action on any open tabs

According to Brave:

“The vulnerability we’re discussing in this post lies in how Comet processes webpage content: when users ask it to “Summarize this webpage,” Comet feeds a part of the webpage directly to its LLM without distinguishing between the user’s instructions and untrusted content from the webpage. This allows attackers to embed indirect prompt injection payloads that the AI will execute as commands. For instance, an attacker could gain access to a user’s emails from a prepared piece of text in a page in another tab.”

A post on Simon Willison’s Weblog shared that Perplexity tried to patch the vulnerability but the fix does not work.

A developer posted the following on X:

“Why is no one talking about this?

This is why I don’t use an AI browser

You can literally get prompt injected and your bank account drained by doomscrolling on reddit:”

Things aren’t looking good for Comet Browser at this time.

Non-Profit Organization Announces Free Domain Names via @sejournal, @martinibuster

A non-profit organization that is supported by Cloudflare, GitHub, and other organizations has open-sourced domain names, making them available with no catches or hidden fees. The sponsor of the free domain names explains that their purpose is not to replace commercial domain names but to offer an open-source alternative for developers, students, and people who want to create a hobby site for free.

The goal is to encourage making the Internet a free and open space so that everyone can publish and express themselves online without financial barriers.

DigitalPlat

The open source domains are offered by DigitalPlat, a non-profit organization that’s sponsored by 1Password, The Hack Club (The Hack Foundation), twilio, GitHub and Cloudflare.

The Hack Foundation is a certified non-profit organization of high school students that receive support from hundreds of supporters including Google.org and Elon Musk. The organization was founded in 2016.

According to their website:

“In 2018, The Hack Foundation expanded to act as a nonprofit fiscal sponsor for Hack Clubs, hackathons, community organizations, and other for-good projects.

Today, hundreds of diverse groups ranging from a small town newspaper in Vermont to the largest high-school hackathon in Pennsylvania are fiscally sponsored by The Hack Foundation.”

A notice posted on The Hack Foundation donation web page explains their connection to DigitalPlat:

“The DigitalPlat Foundation is a global non-profit organization that supports open-source and community development while exploring innovative projects. All funds are supervised and managed by The Hack Foundation, and are strictly regulated in compliance with US IRS guidance and legal requirements under section 501(c)(3). “

DigitalPlat FreeDomain

The free domain names can be registered via DigitalPlat and the free domains project is open source, licensed under AGPL-3.0.

An announcement was made by the GitHubs Projects Community on X with a link to a GitHub page for the free domains where the following domain extensions are listed as choices:

  • .DPDNS.ORG
  • .US.KG
  • .QZZ.IO
  • .XX.KG

Technically, those are subdomains. But so are .uk.com domains.

The official GitHub page for the domains recommends using Cloudflare, FreeDNS by Afraid.org, or Hostry for managing the DNS for zero cost.

The .KG domain is from the country code of Kyrgyzstan. DPDNS.ORG is the domain name of DigitalPlat FreeDomain. .US.KG is operated by the DigitalPlat Foundation, a non-profit charitable organization that’s sponsored by The Hack Foundation.

The Open-Source Projects page for the free domains explains the purpose and goals of the free domain offers:

“The project is open source (licensed under AGPL-3.0), transparent, and backed by The Hack Foundation, a U.S. 501(c)(3) nonprofit. This isn’t a trial or a limited-time offer—it’s a sustainable effort to increase accessibility on the web.”

Full directions for registering a free domain name can be found here.

Featured Image by Shutterstock/TenPixels

Google: Why Lazy Loading Can Delay Largest Contentful Paint (LCP) via @sejournal, @MattGSouthern

In a recent episode of Google’s Search Off the Record podcast, Martin Splitt and John Mueller discussed when lazy loading helps and when it can slow pages.

Splitt used a real-world example on developers.google.com to illustrate a common pattern: making every image lazy by default can delay Largest Contentful Paint (LCP) if it includes above-the-fold visuals.

Splitt said:

“The content management system that we are using for developers.google.com … defaults all images to lazy loading, which is not great.”

Splitt used the example to explain why lazy-loading hero images is risky: you tell the browser to wait on the most visible element, which can push back LCP and cause layout shifts if dimensions aren’t set.

Splitt said:

“If you are using lazy loading on an image that is immediately visible, that is most likely going to have an impact on your largest contentful paint. It’s like almost guaranteed.”

How Lazy Loading Delays LCP

LCP measures the moment the largest text or image in the initial viewport is painted.

Normally, the browser’s preload scanner finds that hero image early and fetches it with high priority so it can paint fast.

When you add loading="lazy" to that same hero, you change the browser’s scheduling:

  • The image is treated as lower priority, so other resources start first.
  • The browser waits until layout and other work progress before it requests the hero image.
  • The hero then competes for bandwidth after scripts, styles, and other assets have already queued.

That delay shifts the paint time of the largest element later, which increases your LCP.

On slow networks or CPU-limited devices, the effect is more noticeable. If width and height are missing, the late image can also nudge layout and feel “jarring.”

SEO Risk With Some Libraries

Browsers now support a built-in loading attribute for images and iframes, which removes the need for heavy JavaScript in standard scenarios. WordPress adopted native lazy loading by default, helping it spread.

Splitt said:

“Browsers got a native attribute for images and iframes, the loading attribute … which makes the browser take care of the lazy loading for you.”

Older or custom lazy-loading libraries can hide image URLs in nonstandard attributes. If the real URL never lands in src or srcset in the HTML Google renders, images may not get picked up for indexing.

Splitt said:

“We’ve seen multiple lazy loading libraries … that use some sort of data-source attribute rather than the source attribute… If it’s not in the source attribute, we won’t pick it up if it’s in some custom attribute.”

How To Check Your Pages

Use Search Console’s URL Inspection to review the rendered HTML and confirm that above-the-fold images and lazy-loaded modules resolve to standard attributes. Avoid relying on the screenshot.

Splitt advised:

“If the rendered HTML looks like it contains all the image URLs in the source attribute of an image tag … then you will be fine.”

Ranking Impact

Splitt framed ranking effects as modest. Core Web Vitals contribute to ranking, but he called it “a tiny minute factor in most cases.”

What You Should Do Next

  • Keep hero and other above-the-fold images eager with width and height set.
  • Use native loading="lazy" for below-the-fold images and iframes.
  • If you rely on a library for previews, videos, or dynamic sections, make sure the final markup exposes real URLs in standard attributes, and confirm in rendered HTML.

Looking Ahead

Lazy loading is useful when applied selectively. Treat it as an opt-in for noncritical content.

Verify your implementation with rendered HTML, and watch how your LCP trends over time.


Featured Image: Screenshot from YouTube.com/GoogleSearchCentral, August 2025.