Google Answers Question About Structured Data And Logged Out Users via @sejournal, @martinibuster

Someone asked whether it's okay to show logged-out users different content than logged-in users see, and to give Google the hidden details via structured data. John Mueller's answer was unequivocal.

This is the question that was asked:

“Will this markup work for products in a unauthenticated view in where the price is not available to users and they will need to login (authenticate) to view the pricing information on their end? Let me know your thoughts.”

John Mueller answered:

“If I understand your use-case, then no. If a price is only available to users after authentication, then showing a price to search engines (logged out) would not be appropriate. The markup should match what’s visible on the page. If there’s no price shown, there should be no price markup.”

What’s The Problem With That Structured Data?

The price is visible to logged-in users, so technically the content (in this case the product price) is available to those who are logged in. It's a good question because a case can be made that the content shown to Google is available, much like content behind a paywall, except here the barrier is a login.

But that's not good enough for Google, and it isn't really comparable to a paywall; those are two different things. Google judges what "on the page" means based on what logged-out users will see.

Google’s guideline about the structured data matching what’s on the page is unambiguous:

“Don’t mark up content that is not visible to readers of the page.

…Your structured data must be a true representation of the page content.”

This is a question that gets asked fairly frequently on social media and in forums so it’s good to go over it for those who might not know yet.
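The principle is mechanical enough to sketch in code. This hypothetical Python helper emits Product markup and includes an Offer with a price only when the price is actually rendered for logged-out visitors (the function and its behavior are illustrative, not a Google-endorsed pattern):

```python
import json

def product_jsonld(name, price=None, currency="USD"):
    """Build Product structured data that mirrors what the page shows.

    Illustrative sketch only: if the rendered page hides the price
    (e.g., behind a login), omit the offer/price from the markup too.
    """
    data = {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": name,
    }
    if price is not None:  # price is visible to logged-out visitors
        data["offers"] = {
            "@type": "Offer",
            "price": str(price),
            "priceCurrency": currency,
        }
    return json.dumps(data)

# Logged-out view: no visible price, so no price markup.
print(product_jsonld("Widget"))
# A view that does render the price may include the Offer.
print(product_jsonld("Widget", price=19.99))
```

The point is that the markup generator keys off the same condition that controls what the visitor sees, so the two can never disagree.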

Read More

Confirmed CWV Reporting Glitch In Google Search Console

Google’s New Graph Foundation Model Improves Precision By Up To 40X

Featured Image by Shutterstock/ViDI Studio

Brave Search API Now Available Through AWS Marketplace via @sejournal, @martinibuster

Brave Search and Amazon Web Services (AWS) announced the availability of the Brave Search API in the new AI Agents and Tools category of the AWS Marketplace.

AI Agents And Tools Category Of AWS Marketplace

AWS is entering the AI agent space with a new marketplace category that lets customers select from hundreds of AI agents and tools.

According to the AWS announcement:

“With this launch, AWS Marketplace becomes a single destination where customers can find everything needed for successful AI agent implementations— includes not just agents themselves, but also the critical components that make agents truly valuable—knowledge bases that power them with relevant data, third-party guardrails that enhance security, professional services to support implementation, and deployment options that enable agents to seamlessly interoperate with existing software.”

Customers can choose pay-as-you-go pricing or a monthly or yearly subscription.

Brave Search

Brave is an independent, privacy-focused search engine. The Brave Search API provides large language models (LLMs) with real-time data, can power agentic search, and can be used to build applications that need access to the web.

The Brave Search API already supplies many of the top LLMs with up-to-date search data.
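As a rough illustration of how an application might call the API, the sketch below builds (but does not send) a request using only the standard library; the endpoint path and header name are assumptions based on Brave's public documentation, and the API key is a placeholder, so verify both against the current API reference:

```python
import urllib.parse
import urllib.request

# Hedged sketch: endpoint path and header name are assumptions based on
# Brave's public API docs; check the current reference before relying on them.
API_KEY = "YOUR_BRAVE_API_KEY"  # placeholder

def build_search_request(query: str) -> urllib.request.Request:
    params = urllib.parse.urlencode({"q": query})
    url = f"https://api.search.brave.com/res/v1/web/search?{params}"
    return urllib.request.Request(
        url,
        headers={
            "Accept": "application/json",
            "X-Subscription-Token": API_KEY,
        },
    )

req = build_search_request("agentic search grounding")
print(req.full_url)
```

Sending the request with `urllib.request.urlopen(req)` would return JSON search results, assuming a valid subscription token.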

According to Brian Brown, Chief Business Officer at Brave Software:

“By offering the Brave Search API in AWS Marketplace, we’re providing customers with a streamlined way to access the only independent search API in the market, helping them buy and deploy agent solutions faster and more efficiently. Our customers in foundation models, search engines, and publishing are already using these capabilities to power their chatbots, search grounding, and research tools, demonstrating the real-world value of the only commercially-available search engine API at the scale of the global Web.”

Featured Image by Shutterstock/Deemerwha studio

Google Adds Comparison Mode To Search Console’s 24-Hour View via @sejournal, @MattGSouthern

Google has rolled out a new comparison feature in Search Console, letting you analyze hourly performance data against two baselines: the previous 24 hours and the same day one week earlier.

The feature expands on Search Console’s 24-hour performance view, which launched in December. With this new capability, you can compare short-term trends more easily within Search Console’s performance reports.

Building On Near Real-Time Data

The original 24-hour view introduced hourly granularity and reduced the lag in data availability.

Now, the comparison feature adds context to that data. Instead of viewing isolated metrics, you can measure shifts in clicks, impressions, average CTR, and position over time.

The feature appears across Search Console’s performance reports for Search, Discover, and Google News.

How It Works

The comparison mode lives within the same interface as the 24-hour view and operates based on your local timezone.

You can toggle between viewing data for the last 24 hours, the previous 24 hours, and the same day from the week before. Visual indicators show how each metric has changed hour by hour.

Why This Matters

Before this update, the 24-hour view was a valuable but somewhat isolated tool. While it gave fast access to recent performance, there was no way to tell whether a spike or dip was meaningful without exporting the data for external comparison.

Now, you can assess whether fluctuations are part of a broader trend or a one-off anomaly.

For marketers and SEOs, this could help:

  • Validate the impact of content updates or site changes sooner.
  • Spot issues or opportunities that occur at specific times of day.
  • Establish baseline expectations for hourly performance.

News publishers and ecommerce sites with time-sensitive strategies may find this especially useful when timing is critical to outcomes.

Looking Ahead

Over the past year, Search Console has evolved from multi-day delays to near real-time feedback paired with reporting options.

As always, the rollout is gradual, so not all properties may see the new feature immediately. But once live, it fits directly into existing workflows, requiring no additional setup.


Featured Image: Roman Samborskyi/Shutterstock

Cloudflare DDoS Report: 63% Of Known Attacks Blamed On Competitors via @sejournal, @martinibuster

Cloudflare released its 2025 Q2 DDoS Threat Report, which names the top sources of DDoS attacks and, according to surveyed respondents who identified their attackers, cites businesses targeting competitors as the largest source of attacks.

Survey: Who Attacked You?

Cloudflare surveyed customers about DDoS attacks, and 29% claimed to have identified the sources of those attacks. Of those who identified their attackers, 63% pointed to competitors, a pattern especially common among businesses in the crypto, gambling, and gaming industries. Another 21% said they were victims of state-sponsored attacks, and 5% said they had accidentally attacked themselves, something that can happen with server misconfigurations.

This is how Cloudflare explained it:

“When asked who was behind the DDoS attacks they experienced in 2025 Q2, the majority (71%) of respondents said they didn’t know who attacked them. Of the remaining 29% of respondents that claimed to have identified the threat actor, 63% pointed to competitors, a pattern especially common in the Gaming, Gambling and Crypto industries. Another 21% attributed the attack to state-level or state-sponsored actors, while 5% each said they’d inadvertently attacked themselves (self-DDoS), were targeted by extortionists, or suffered an assault from disgruntled customers/users.”

Most Attacked Locations

One would think that the United States would be the most attacked location, given how many businesses and websites are located there. But the most attacked location was China, which climbed from position three to position one. Brazil also climbed four positions to second place. Turkey dropped four positions to land in sixth place, and Hong Kong dropped to seventh place. Vietnam, however, jumped fifteen places to land in eighth place.

Top Ten Most DDoS-Attacked Countries

  1. China
  2. Brazil
  3. Germany
  4. India
  5. South Korea
  6. Turkey
  7. Hong Kong
  8. Vietnam
  9. Russia
  10. Azerbaijan

Top Attacked Industries

Telecommunications was the most attacked industry, followed by Internet and Information Technology Services. Gaming and Gambling were the third and fourth most attacked industries, followed by Banking/Financial and Retail industries.

  1. Telecommunications
  2. Internet
  3. Information Technology and Services
  4. Gaming
  5. Gambling and Casinos
  6. Banking and Financial Services
  7. Retail
  8. Agriculture
  9. Computer Software
  10. Government

Top Country-Level Sources Of DDoS Attacks

Cloudflare’s data shows that Ukraine is the fifth‑largest source of DDoS attacks, but doesn’t say which areas of Ukraine are responsible. When I look at my logs of bot attacks, the Ukrainian‑origin bots are consistently in Russian‑occupied territories. Cloudflare should have made a distinction about this point, in my opinion.

The country of origin doesn't mean that one country is shiftier than another. For example, the Netherlands ranks as the ninth-largest source of DDoS attacks, which may be because it has strong user-privacy laws that protect VPN users and is well positioned for low latency to both Europe and North America.

Cloudflare also provides the following note about country-level origins:

“It’s important to note that these “source” rankings reflect where botnet nodes, proxy or VPN endpoints reside — not the actual location of threat actors. For L3/4 DDoS attacks, where IP spoofing is rampant, we geolocate each packet to the Cloudflare data center that first ingested and blocked it, drawing on our presence in over 330 cities for truly granular accuracy.”

Top Ten Country Origins Of DDoS Attacks

  1. Indonesia
  2. Singapore
  3. Hong Kong
  4. Argentina
  5. Ukraine
  6. Russia
  7. Ecuador
  8. Vietnam
  9. Netherlands
  10. Thailand

Top ASN Sources Of DDoS Attacks

An ASN (Autonomous System Number) is a unique number assigned to networks or groups of networks that share the same rules for routing internet traffic. SEOs and publishers who track the origin of bad traffic and use .htaccess to block millions of IP ranges will recognize a number of the networks on this list. Hetzner, OVH, Tencent, Microsoft, the Google Cloud Platform, and Alibaba are all usual suspects.
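Blocking at this scale is usually done in .htaccess or at the firewall, but the idea can be sketched at the application level with Python's standard `ipaddress` module. The CIDR ranges below are documentation-reserved placeholders, not real allocations of any named provider:

```python
import ipaddress

# Hedged sketch: a minimal server-side filter in the spirit of blocking
# abusive network ranges. These CIDR blocks are illustrative placeholders
# reserved for documentation, not real provider allocations.
BLOCKED_NETWORKS = [
    ipaddress.ip_network("203.0.113.0/24"),   # TEST-NET-3 (documentation range)
    ipaddress.ip_network("198.51.100.0/24"),  # TEST-NET-2 (documentation range)
]

def is_blocked(ip: str) -> bool:
    """Return True if the visitor's IP falls inside any blocked range."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in BLOCKED_NETWORKS)

print(is_blocked("203.0.113.7"))  # True: inside a blocked /24
print(is_blocked("192.0.2.1"))    # False: not in any blocked range
```

In practice, one would populate the list from an ASN-to-prefix lookup for the networks being blocked, and do the filtering at the edge rather than in application code.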

According to Cloudflare, Hetzner dropped from first place as the origin of DDoS attacks to third place. DigitalOcean was formerly the number one source of DDoS attacks and was pushed down to position two by Drei‑K‑Tech‑GmbH, which jumped six places to become the leading source of DDoS attacks.

Top Ten Network Sources Of DDoS Attacks

  1. Drei-K-Tech-GmbH
  2. DigitalOcean
  3. Hetzner
  4. Microsoft
  5. Viettel
  6. Tencent
  7. OVH
  8. Chinanet
  9. Google Cloud Platform
  10. Alibaba

DDoS Attacks Could Be Better Mitigated

Cloudflare noted that it has a program that allows cloud computing providers to rapidly respond to bad actors abusing their networks. It's not just DDoS attacks that originate at cloud and web hosting providers; it's also bots scanning for vulnerabilities and actively trying to hack websites. If more providers joined Cloudflare's program, there could be fewer DDoS attacks, and the web would be a much safer place.

This is how Cloudflare explains it:

“To help hosting providers, cloud computing providers and any Internet service providers identify and take down the abusive accounts that launch these attacks, we leverage Cloudflare’s unique vantage point to provide a free DDoS Botnet Threat Feed for Service Providers. Over 600 organizations worldwide have already signed up for this feed, and we’ve already seen great collaboration across the community to take down botnet nodes.”

Read the Cloudflare report:

Hyper-volumetric DDoS attacks skyrocket: Cloudflare’s 2025 Q2 DDoS threat report

Wix Announces AI Visibility Overview Citation & Sentiment Tracking Tool via @sejournal, @martinibuster

Wix has added support for Generative Engine Optimization (GEO) with a new tool called the AI Visibility Overview, available to users with a Wix Business Manager account in English, with more languages rolling out soon. The AI Visibility Overview enables users to track citations, monitor AI query volume and traffic, and benchmark performance against competitors.

AI Visibility Overview

Wix continues its innovative, forward-thinking approach of adding AI-powered tools that provide users with real-world benefits. Tracking AI visibility is an advanced capability that no other CMS currently offers.

The newly announced tool provides the following benefits for Generative Engine Optimization (GEO):

  • “Manage AI citations & visibility:
    Users can track how often their website is cited by AI platforms in response to relevant queries, as well as add, or remove questions to better reflect their business.
  • Monitor brand sentiment across LLMs:
    GEO empowers users to stay informed on how their brand is perceived by analyzing sentiment, perception, and positioning in AI-generated content.
  • Benchmark visibility and competitive context:
    Users can compare their AI visibility performance to competitors to gain a better understanding of how their visibility stacks up against industry peers, identify growth opportunities, and discover which other sources are being cited in similar contexts.
  • Measure AI-driven traffic & query volume:
    Users can see how much traffic is driven to their site from AI platforms, as well as how frequently people ask about their brand or services in these engines.”

AI Visibility

Business customers are increasingly searching with AI, and Wix's new AI Visibility Overview is the right tool at the right time, enabling businesses to keep up with where customers are today and offering a competitive edge.

Read more about the new tool here:

Wix Analytics: About the AI Visibility Overview

Featured Image by Shutterstock/Roman Samborskyi

Google Rolls Out Gemini 2.5 Pro & Deep Search For Paid Subscribers via @sejournal, @MattGSouthern

Google is rolling out two enhancements to AI Mode in Labs: Gemini 2.5 Pro and Deep Search.

These capabilities are exclusive to users subscribed to Google’s AI Pro and AI Ultra plans.

Gemini 2.5 Pro Now Available In AI Mode

Subscribers can now access Gemini 2.5 Pro from a dropdown menu within the AI Mode tab.

Screenshot from: blog.google/products/search/deep-search-business-calling-google-search, July 2025.

While the default model remains available for general queries, the 2.5 Pro model is designed to handle more complex prompts, particularly those involving reasoning, mathematics, or coding.

In an example shared by Google, the model walks through a multi-step physics problem involving gravitational fields, showing how it can solve equations and explain its reasoning with supporting links.

Screenshot from: blog.google/products/search/deep-search-business-calling-google-search, July 2025.

Deep Search Offers AI-Assisted Research

Today’s update also introduces Deep Search, which Google describes as a tool for conducting more comprehensive research.

The feature can generate detailed, citation-supported reports by processing multiple searches and aggregating information across sources.

Google stated in its announcement:

“Deep Search is especially useful for in-depth research related to your job, hobbies, or studies.”

Availability & Rollout

These features are currently limited to users in the United States who subscribe to Google’s AI Pro or AI Ultra plans and have opted into AI Mode through Google Labs.

Google hasn’t provided a firm timeline for when all eligible users will receive access, but rollout has begun.

The “experimental” label on Gemini 2.5 Pro suggests continued adjustments based on user testing.

What This Means

The launch of Deep Search and Gemini 2.5 Pro reflects Google’s broader effort to incorporate generative AI into the search experience.

For marketers, the shift raises questions about visibility in a time when AI-generated summaries and reports may increasingly shape user behavior.

If Deep Search becomes a commonly used tool for information gathering, the structure and credibility of content could play a larger role in discoverability.

Gemini 2.5 Pro's focus on reasoning and code-related queries makes it relevant for more technical users. Google has positioned it as capable of helping with debugging, code generation, and explanation of advanced concepts, similar to tools like ChatGPT's coding features or GitHub Copilot.

Its integration into Search may appeal to users who want technical assistance without leaving the browser environment.

Looking Ahead

The addition of these features behind a paywall continues Google’s movement toward monetizing AI capabilities through subscription services.

While billed as experimental, these updates may provide early insight into how the company envisions the future of AI in search: more automated, task-oriented, and user-specific.

Search professionals will want to monitor how these features evolve, as tools like Deep Search could become more widely adopted.

Google Search Can Now Call Local Businesses Using AI via @sejournal, @MattGSouthern

Google has introduced a new AI-powered calling feature in Search that contacts local businesses on a user’s behalf to gather pricing and availability details.

The feature, rolling out to all U.S. Search users this week, allows people to request information from multiple businesses with a single query.

When searching for services like pet grooming or dry cleaning, users may now see a new option to “Have AI check pricing.”

How It Works

After selecting the AI option, users are guided through a form to provide details about the service they need.

Google’s AI then calls relevant local businesses to gather information such as pricing, appointment availability, and service options. The responses are consolidated and presented to the user.

The experience starts with a typical local search, such as “pet groomers near me.” If the AI calling feature is available, users can specify details like:

  • Pet type, breed, and size
  • Requested services (e.g., bath, nail trim, haircut)
  • Time preferences (e.g., within 48 hours)
  • Preferred method of communication (SMS or email)

According to a Google spokesperson, the AI determines which businesses to contact based on traditional local search rankings. Only those that appear in results for the relevant query and match the user’s criteria will be contacted.

What It Looks Like

Examples show a multi-step process where users enter information and confirm their request.

Google displays responses from participating businesses, including prices and availability, all gathered through automated calls.

Before submitting a request, users must confirm that Google can call businesses and share the submitted details. The process is governed by Google’s privacy policy, and users are informed of how their data will be used.

Business Participation & Control

Businesses can manage whether they receive these AI-driven calls via their Business Profile settings.

Google describes the feature as creating “new opportunities” to connect with potential customers, while also giving businesses control over participation.

Available to All (With Premium Perks)

The AI calling feature is available to all users in the U.S., though Google AI Pro and AI Ultra subscribers benefit from higher usage limits.

Google says more agentic AI features will debut for these subscribers before expanding globally.

What This Means

Because the AI selects businesses using standard local search rankings, maintaining strong local SEO becomes even more important.

Businesses with optimized listings and higher rankings are more likely to receive calls and capture leads.

This could also shift how businesses handle inbound requests. Those that rely on phone calls may want to prepare staff or systems to handle more frequent, possibly scripted, AI-initiated inquiries.

Looking Ahead

By automating time-consuming tasks like gathering service quotes, Google aims to make Search more actionable.

Adoption will depend on how well the AI handles real-world complexity, as well as how many businesses opt in.

For marketers and local service providers, it’s another sign that search visibility directly connects to lead generation. Keeping Business Profile data accurate and staying visible in local results could increasingly determine whether a business gets contacted at all.

Confirmed CWV Reporting Glitch In Google Search Console via @sejournal, @martinibuster

Google Search Console Core Web Vitals (CWV) reporting for mobile is experiencing a dip that is confirmed to be related to the Chrome User Experience Report (CrUX). Search Console CWV reports for mobile performance show a marked dip beginning around July 10, at which point the reporting appears to stop completely.

Not A Search Console Issue

Someone posted about it on Bluesky:

“Hey @johnmu.com is there a known issue or bug with Core Web Vitals reporting in Search Console? Seeing a sudden massive drop in reported URLs (both “good” and “needs improvement”) on mobile as of July 14.”

The person referred to July 14th, but that’s the date the reporting hit zero. The drop actually starts closer to July 10th, which you can see when you hover a cursor at the point that the drops begin.

Google’s John Mueller responded:

“These reports are based on samples of what we know for your site, and sometimes the overall sample size for a site changes. That’s not indicative of a problem. I’d focus on the samples with issues (in your case it looks fine), rather than the absolute counts.”

The person who initially started the discussion responded to inform Mueller that it isn't just his site; the peculiar drop in reporting is happening on other sites.

Mueller was unaware of any problem with CWV reporting, so he naturally assumed this was an artifact of natural changes in internet traffic and user behavior. His next response continued under the assumption that this wasn't a widespread issue.

He responded:

“That can happen. The web is dynamic and alive – our systems have to readjust these samples over time.”

Then Jamie Indigo responded to confirm she’s seeing it, too. 

“Hey John! Thanks for responding 🙂 It seems like … everyone beyond the usual ebb and flow. Confirming nothing in the mechanics have changed?”

At this point it was becoming clear that this behavior wasn't isolated to one site, and Mueller's response to Jamie reflected that growing awareness. He confirmed that nothing had changed on the Search Console side, leaving open the possibility that the issue was on the CrUX side of Core Web Vitals reporting.

His response:

“Correct, nothing in the mechanics changed (at least with regards to Search Console — I’m also not aware of anything on the Chrome / CrUX side, but I’m not as involved there).”

CrUX CWV Field Data

CrUX is the acronym for the Chrome User Experience report. It’s CWV reporting based on real website visits. The data is collected from Chrome browser website visits by users who have opted in to reporting their data for the report.

Google’s Chrome For Developers page explains:

“The Chrome User Experience Report (also known as the Chrome UX Report, or CrUX for short) is a dataset that reflects how real-world Chrome users experience popular destinations on the web.

CrUX is the official dataset of the Web Vitals program. All user-centric Core Web Vitals metrics are represented.

CrUX data is collected from real browsers around the world, based on certain browser options which determine user eligibility. A set of dimensions and metrics are collected which allow site owners to determine how users experience their sites.”
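Site owners who want to inspect the same field data directly can query CrUX through its public API. The sketch below builds (but does not send) a request for a site's mobile record; the endpoint and body fields follow the public CrUX API documentation as best I recall, so verify against the current reference, and the API key is a placeholder:

```python
import json
import urllib.request

# Hedged sketch of querying the CrUX API for a site's mobile field data.
# Endpoint and body fields are assumptions based on the public docs.
API_KEY = "YOUR_API_KEY"  # placeholder

def build_crux_request(origin: str) -> urllib.request.Request:
    url = ("https://chromeuxreport.googleapis.com/v1/"
           f"records:queryRecord?key={API_KEY}")
    body = json.dumps({
        "origin": origin,
        "formFactor": "PHONE",  # mobile, where the reporting dip appeared
        "metrics": ["largest_contentful_paint"],
    }).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_crux_request("https://www.example.com")
print(req.method, req.full_url)
```

Sending the request with `urllib.request.urlopen(req)` would return the aggregated field metrics for that origin, assuming a valid API key.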

Core Web Vitals Reporting Outage Is Widespread

At this point more people joined the conversation, with Alan Bleiweiss offering both a comment and a screenshot showing the same complete drop-off in reporting in the Search Console CWV reports for other websites.

He posted:

“oooh Google had to slow down server requests to set aside more power to keep the swimming pools cool as the summer heats up.”

Here’s a closeup detail of Alan’s screenshot of a Search Console CWV report:

Screenshot Of CWV Report Showing July 10 Drop

I searched the Chrome Lighthouse changelog to see if there’s anything there that corresponds to the drop but nothing stood out.

So what is going on?

CWV Reporting Outage Is Confirmed

I next checked the X and Bluesky accounts of Googlers who work on the Chrome team and found a post by Barry Pollard, Web Performance Developer Advocate on Google Chrome, who had posted about this issue last week.

Barry posted a note about a reporting outage on Bluesky:

“We’ve noticed another dip on the metrics this month, particularly on mobile. We are actively investigating this and have a potential reason and fix rolling out to reverse this temporary dip. We’ll update further next month. Other than that, there are no further announcements this month.”

Takeaways

Google Search Console Core Web Vitals (CWV) data drop:
A sudden stop in CWV reporting was observed in Google Search Console around July 10, especially on mobile.

Issue is widespread, not site-specific:
Multiple users confirmed the drop across different websites, ruling out individual site problems.

Origin of issue is not at Search Console:
John Mueller confirmed there were no changes on the Search Console side.

Possible link to CrUX data pipeline:
Barry Pollard from the Chrome team confirmed a reporting outage and mentioned a fix may be rolled out at an unspecified time in the future.

We now know this is a confirmed issue. Google Search Console's Core Web Vitals reports began showing a reporting outage around July 10, leading users to suspect a bug. Barry Pollard later acknowledged it as a reporting outage affecting CrUX data, particularly on mobile.

Featured Image by Shutterstock/Mix and Match Studio

WordPress Malware Scanner Plugin Contains Vulnerability via @sejournal, @martinibuster

Wordfence published an advisory on the WordPress Malcure Malware Scanner plugin, which was discovered to have a vulnerability rated at a severity level of 8.1. At the time of publishing, there is no patch to fix the problem.

Screenshot Showing 8.1 Severity Rating

Malcure Malware Scanner Vulnerability

The Malcure Malware Scanner plugin, installed on over 10,000 WordPress websites, is vulnerable to "Arbitrary File Deletion due to a missing capability check on the wpmr_delete_file() function" by authenticated attackers. The need for authentication makes exploitation somewhat less likely, though not by much, because only subscriber-level access is required, the lowest level of authentication. The "subscriber" role is the default registration level on a WordPress website (if registration is allowed).

According to Wordfence:

“This makes it possible for authenticated attackers, with Subscriber-level access and above, to delete arbitrary files making remote code execution possible. This is only exploitable when advanced mode is enabled on the site.”
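To illustrate the vulnerability class Wordfence describes (this is not the plugin's actual code, which is PHP), here is a minimal Python sketch of a delete handler that performs the capability check, plus a path check, that this kind of endpoint needs; the role names and paths are illustrative:

```python
import os

# Generic sketch of the vulnerability class, not the plugin's actual code:
# a delete endpoint must verify the caller's capability, not merely that
# the caller is logged in. Role names and paths here are illustrative.
ALLOWED_ROLES = {"administrator"}

def delete_file(user_role: str, path: str, base_dir: str = "/var/www/uploads"):
    # The capability check that was reportedly missing: being authenticated
    # (e.g., as a subscriber) is not the same as being authorized.
    if user_role not in ALLOWED_ROLES:
        raise PermissionError("insufficient capability")
    # Also confine deletion to the intended directory to block attempts
    # like "../../wp-config.php" that can lead to remote code execution.
    full = os.path.realpath(os.path.join(base_dir, path))
    if not full.startswith(os.path.realpath(base_dir) + os.sep):
        raise ValueError("path traversal attempt")
    os.remove(full)
```

With both checks in place, a subscriber-level caller is rejected before any filesystem operation happens, which is the behavior the missing capability check would have provided.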

There is no known patch available for the plugin and users are cautioned to take necessary actions such as uninstalling the plugin to mitigate risk.

The plugin is currently unavailable for download with a notice showing that it is under review.

Screenshot Of Malcure Plugin At WordPress Repository

Read More WordPress News

WordPress Update 6.8.2 – Ends Security Support For 0.9% of Sites

Featured Image by Shutterstock/Kues

Anthropic’s New Financial Tool Signals Shift To Offering Specialized Services via @sejournal, @martinibuster

Anthropic announced a new Financial Analysis Solution powered by its Claude 4 and Claude Code models. This is Anthropic’s first foray into a major vertical-focused platform, signaling a shift toward AI providers building tools that directly address common pain points in business workflows and productivity.

Claude For Financial Services

Claude for Financial Services is an AI-powered financial analysis tool targeted at financial professionals. It offers data integration via MCP (Model Context Protocol) and secure, private handling of data: no user data is used to train Claude's generative models.

According to the announcement:

“Claude has real-time access to comprehensive financial information including:

  • Box enables secure document management and data room analysis
  • Daloopa supplies high-quality fundamentals and KPIs from SEC filings
  • Databricks offers unified analytics for big data and AI workloads
  • FactSet provides comprehensive equity prices, fundamentals, and consensus estimates
  • Morningstar contributes valuation data and research analytics
  • PitchBook delivers industry-leading private capital market data and research, empowering users to source investment and fundraising opportunities, conduct due diligence and benchmark performance, faster and with greater confidence
  • S&P Global enables access to Capital IQ Financials, earnings call transcripts, and more–essentially your entire research workflow”

Takeaway:

This launch may signal a shift among AI providers toward building industry-specific tools that solve problems for professionals, rather than offering only general-purpose models that others use to provide the same solutions. Generative AI companies have the ability to stitch together solutions from big data providers in ways that smaller companies can’t.

Read more at Anthropic:

Transform financial services with Claude

Featured Image by Shutterstock/gguy