HTTP Archive Report: 61% Of Cookies Enable Third-Party Tracking via @sejournal, @MattGSouthern

HTTP Archive published 12 chapters of its annual Web Almanac, revealing disparities between mobile and desktop web performance.

The Almanac analyzes data from millions of sites to track trends in web technologies, performance metrics, and user experience.

This year’s Almanac details changes in technology adoption patterns that will impact businesses and users.

Key Highlights

Mobile Performance Gap

The most significant finding centers on the growing performance gap between desktop and mobile experiences.

With the introduction of Google’s new Core Web Vitals metric, Interaction to Next Paint (INP), the gap has become wider than ever.

“Web performance is tied to what devices and networks people can afford,” the report notes, highlighting the socioeconomic implications of this growing divide.

The data shows that while desktop performance remains strong, mobile users—particularly those with lower-end devices—face challenges:

  • Desktop sites achieve 97% “good” INP scores
  • Mobile sites lag at 74% “good” INP scores
  • Mobile median Total Blocking Time is 18 times higher than desktop

Third-Party Tracking

The report found that tracking remains pervasive across the web.

“We find that 61% of cookies are set in a third-party context,” the report states, noting that these cookies can be used for cross-site tracking and targeted advertising.

Key privacy findings include:

  • Google’s DoubleClick sets cookies on 44% of top websites
  • Only 6% of third-party cookies use partitioning for privacy protection
  • 11% of first-party cookies have SameSite set to None, potentially enabling tracking
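
To make those cookie attributes concrete, here is a hypothetical sketch (not from the report) of how a crawler might classify a Set-Cookie header along the dimensions the Almanac measures; the function name and hostnames are illustrative:

```python
# A hypothetical sketch (not from the report) of how a crawler might classify
# a Set-Cookie header along the dimensions the Almanac measures.
def classify_cookie(set_cookie: str, cookie_host: str, page_host: str) -> dict:
    # Attributes follow the first "name=value" pair, separated by semicolons.
    attrs = [part.strip().lower() for part in set_cookie.split(";")[1:]]
    samesite = next(
        (a.split("=", 1)[1] for a in attrs if a.startswith("samesite=")),
        "lax",  # browsers default to Lax when SameSite is absent
    )
    return {
        "third_party": cookie_host != page_host,  # set in a third-party context
        "samesite_none": samesite == "none",      # usable cross-site
        "partitioned": "partitioned" in attrs,    # CHIPS-style partitioning
    }
```

A tracker cookie such as `id=abc; SameSite=None; Secure; Partitioned` set from a different host than the page would register as third-party, SameSite=None, and partitioned under this scheme.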

CMS Market Share

In the content management space, WordPress continues its dominance, with the report stating:

“Of the over 16 million mobile sites in this year’s crawl, WordPress is used by 5.7 millions sites for a total of 36% of sites.”

However, among the top 1,000 most-visited websites, only 8% use identifiable CMS platforms, suggesting larger organizations opt for custom solutions.

In the ecommerce sector, WooCommerce leads with 38% market share, followed by Shopify at 18%.

The report found that “OpenCart is the last of the 362 detected shop systems that manage to secure a share above 1% of the market.”

PayPal remains the most detected payment method (3.5% of sites), followed by Apple Pay and Shop Pay.

Performance By Platform

Some platforms markedly improved Core Web Vitals scores over the past year.

Squarespace increased from 33% good scores in 2022 to 60% in 2024, while others like Magento and WooCommerce continue to face performance challenges.

Structured Data Trends

The deprecation of FAQ and HowTo rich results by Google hasn’t significantly impacted their implementation.

This suggests website owners find value in these features beyond search.

Google expanded support for structured data types for various verticals, including vehicles, courses, and vacation rentals.

Why This Matters

These findings highlight that mobile optimization remains a challenge for developers and businesses.

HTTP Archive researchers noted in the report:

“These results highlight the ongoing need for focused optimization efforts, particularly in mobile experience.

The performance gap between devices suggests that many users, especially those on lower-end mobile devices, may be experiencing a significantly degraded web experience.”

Additionally, as privacy concerns grow, the industry faces pressure to balance user tracking with privacy protection.

Businesses reliant on third-party tracking mechanisms may need to adapt their marketing and analytics strategies accordingly.

The 2024 Web Almanac is available on HTTP Archive’s website; the remaining chapters are expected to be published in the coming weeks.


Featured Image: BestForBest/Shutterstock

Google’s Updated Machine Learning Courses Build SEO Understanding via @sejournal, @martinibuster

Google has updated its machine learning crash course with new videos and modules on large language models and automated machine learning. These courses are useful introductions to the technologies behind modern search engines and generative AI, information that will make you a better SEO.

What Is Google’s Machine Learning Crash Course?

Google’s machine learning course is an easy-to-understand introduction to machine learning. It shows what machine learning is all about and how it can be useful to you and your business.

The different courses are self-contained modules, beginning with introductions to the fundamentals of Linear Regression, Logistic Regression, and Binary Classification Models.

The other modules cover:

  • Data
    How to work with machine learning data
  • Advanced Machine Learning Models
    Introductions to Neural Networks, Embeddings, and Large Language Models
  • Real-world ML
    These modules cover best practices for deploying machine learning models in the real world.
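
As a flavor of what the Linear Regression module teaches, here is a toy sketch (illustrative only, not course material) of fitting a line y ≈ w·x + b with plain gradient descent:

```python
# A toy illustration (not course material): fit y ≈ w*x + b with gradient descent.
def fit_linear(xs, ys, lr=0.01, steps=5000):
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        # Gradients of the mean squared error with respect to w and b.
        grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b
```

On data generated by y = 2x + 1, this converges to w ≈ 2 and b ≈ 1; the same loop structure underlies far larger models.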

The new course adds topics that include large language models.

New Large Language Model (LLM) Module

The Large Language Models module is a new addition to the courses and is a good way to get up to speed quickly with the technology and become conversant in it.

Google’s documentation shows what students learn with the module:

“Define a few different types of language models and their components.

Describe how large language models are created and the importance of context and parameters.

Identify how large language models take advantage of self-attention.

Reveal three key problems with large language models.

Explain how fine-tuning and distillation can improve a model’s predictions and efficiency.”

Google recommends first taking six other courses before starting the LLM module, so as to gain an understanding of the fundamentals. The six recommended courses look very interesting:

  1. Introduction to Machine Learning
  2. Linear regression
  3. Working with categorical data
  4. Datasets, generalization, and overfitting
  5. Neural networks
  6. Embeddings

The courses for linear regression, neural networks, and embeddings can arguably be called essential for SEOs because these technologies have been a major part of how search ranking algorithms work. Obtaining a basic understanding of these technologies will improve your ability to understand how the backend of a search engine works.

Many misleading ideas are popular in the SEO community because they sound like common sense, much like some generative AI answers that sound plausible but are hallucinations. Learning what these technologies are and how they work will help you become a better search marketer.

Read Google’s announcement:

Our Machine Learning Crash Course goes in depth on generative AI

Featured Image by Shutterstock/ProStockStudio

YouTube Ad Updates: Enhanced Shorts Monetization via @sejournal, @MattGSouthern

YouTube expands Shorts monetization with new ad controls, measurement tools, and creator partnerships.

  • Advertisers can now run campaigns exclusively in Shorts feed or focus on horizontal content.
  • New interactive stickers and animated ads will enhance advertising options by year-end.
  • YouTube is expanding measurement capabilities with third-party sales lift tracking and improved Brand Lift surveys.

WordPress 6.7 Released – This Is Why It’s A Winner via @sejournal, @martinibuster

WordPress has released version 6.7, codenamed Rollins. This update introduces better font controls, a new default theme, enhanced design tools for easier page creation, 65 accessibility improvements, and performance optimizations. Version 6.7 focuses on making it easy to build attractive, high-performance websites.

Twenty Twenty-Five Theme

Twenty Twenty-Five is the new default theme that ships with WordPress. It was intentionally built to offer users an easier and more intuitive experience for creating websites.

The official WordPress Twenty Twenty-Five documentation explains:

“While ideating Twenty Twenty-Five, one recurring idea was that simple things should be intuitive while complex things should be possible. This concept of simplicity and complexity leads to a reliable foundation for extending a default WordPress experience to make it yours.

Twenty Twenty-Five embodies ultimate flexibility and adaptability, showcasing the many ways WordPress enables people to tell their stories with many patterns and styles to choose from.”

The key improvements are:

  • Better Patterns
    WordPress patterns are pre-designed, ready-to-use blocks for different parts of a page. They let users choose from pre-made sections of a web page, such as headers, calls to action, and pricing tables. Twenty Twenty-Five ships with a wide range of patterns appropriate for different kinds of sites.
  • Improved Styles
    Better support for fonts in multiple languages plus bundled color variations.

New Templates

There are three base templates that can serve as a starting point for creating a website.

The new template versions are:

  • Personal Blog (Default)
    The Personal Blog template is focused on simplicity and ease of use.
  • Photo Blog (Alternative)
    This template has multiple layouts that are suitable for image heavy sites.
  • Complex Blog (Alternative)
    This template is intended for complex websites, offering more design flexibility.

Typography

As part of the emphasis on a better design experience, WordPress 6.7 features better font management that gives users more control over fonts.

The WordPress announcement explains:

“Create, edit, remove, and apply font size presets with the next addition to the Styles interface. Override theme defaults or create your own custom font size, complete with fluid typography for responsive font scaling.”

New Zoom Out Feature

WordPress 6.7 has a new design feature that lets users zoom out from the details and see the site as a whole, so they can swap out block patterns and preview the result at a macro level. This is in keeping with the focus on making it easy to design attractive websites.

Accessibility Improvements

The documentation for WordPress 6.7 is less organized than usual, making it difficult to locate the details of the 65 accessibility improvements.

This is what the announcement said about the accessibility improvements:

“65+ accessibility fixes and enhancements focus on foundational aspects of the WordPress experience, from improving user interface components and keyboard navigation in the Editor, to an accessible heading on WordPress login screens and clearer labeling throughout.”

Performance Updates

The latest version of the WordPress core ships with faster pattern loading and better PHP 8+ support. Deprecated code is removed for a more lightweight codebase, and a new auto sizes feature improves lazy-loaded images.

That last improvement should help improve Core Web Vitals scores because the auto sizes feature lets the browser use the rendered layout width from the CSS to select the right image from the srcset, rather than waiting on the image itself. CSS is usually downloaded before lazy-loaded images begin to load, so the browser can make the choice earlier. Chrome shipped this ability in December 2023.

Addy Osmani, an engineering lead on Google Chrome, tweeted about it at the time:

“Chrome is shipping support for lazy-loaded images with srcset, this allows the browser to use the layout width of the image in order to select the source url from the srcset.

For lazy-loaded images, CSS is often available before the image load begins. The browser can take the actual width of the image from CSS and use that as if it was the image’s sizes.”

The official WordPress documentation for the auto sizes feature explains:

“WordPress 6.7 adds sizes=”auto” for lazy-loaded images. This feature, which was recently added to the HTML specification, allows the browser to use the rendered layout width of the image when selecting a source from the srcset list, since lazy loaded images don’t load until after the layout is known.”
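
In practice, the change means markup along these lines (the filenames are placeholders); per the HTML specification, `auto` must be the first entry in the `sizes` list and applies only to lazy-loaded images:

```html
<!-- Illustrative markup: sizes="auto" lets the browser use the image's
     rendered layout width when picking a source from srcset. -->
<img src="hero-800.jpg"
     srcset="hero-400.jpg 400w, hero-800.jpg 800w, hero-1600.jpg 1600w"
     sizes="auto, (max-width: 600px) 100vw, 800px"
     loading="lazy"
     alt="Example hero image">
```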

Is It Safe To Download WordPress 6.7?

Most developers discussing the latest version of WordPress in the private Dynamic WordPress Facebook group report that updating to the latest version is easy and trouble-free.

But some developers reported maintenance mode errors that were easily resolved by deleting the .maintenance file. The maintenance mode error doesn’t mean there’s something wrong with the update itself; it’s usually caused by something going on with the upstream server that’s providing the update. The WordPress.org 6.7 documentation page was temporarily down, so the WordPress servers may have been experiencing heavy traffic.

Featured Image by Shutterstock/Asier Romero

Google November Core Update: 6 Insights From Millions of Queries via @sejournal, @martinibuster

Enterprise SEO platform BrightEdge is actively monitoring millions of keyword search results, detecting real-time trends in Google’s AI Overview and organic search tied to the ongoing November 2024 Core algorithm update. The data suggests six preliminary observations on the direction of Google’s algorithm and what publishers and SEOs need to know now.

AI Overviews is a search feature, so any changes to Google’s core ranking algorithm will be reflected in AIO, especially because there are several ways AIO and the organic search results overlap.

1. Overlap Between Organic And AIO Citations

One of the more interesting trends that continues this month is an overlap between the websites cited in AIO and the organic search results. This shift, first noticed in September, was highly noticeable, especially within the top ten organic search results. High ranking organic content has a high chance of becoming a citation in AIO. This trend suggests that Google is increasingly aligning AIO citations with the organic search algorithm.

How Google is aligning organic SERPs with AIO is a matter of speculation because Google has not commented on this trend. It may be that AIO is grounding itself in organic search results that are themselves increasingly aligned with search query topicality.

Google’s information gain patent describes a way to rank websites that closely links Google’s organic search ranking with an AI-based search interface. These trends that BrightEdge noticed align with that kind of symmetry between AI Search and organically ranked answers.

2. Shopping Queries Ranked Differently

The trend toward overlap between organic and AIO SERPs doesn’t hold for shopping-related queries. Organic shopping SERPs and AIO results are increasingly uncoupled and heading in different directions. BrightEdge interpreted the data to mean that additional supporting results in AIO explain why organic and AIO results for shopping queries increasingly don’t match.

Google’s algorithm update won’t be finished for about two weeks. However, BrightEdge’s Generative Parser technology is showing how the search results are trending, hinting at what’s going on under the surface of the search results.

3. Downward Trends In Overlap

BrightEdge shared that ranking overlap between organic and AIO initially experienced a slight increase in volatility (+2.3%) leading into November 8th but that it subsequently started trending downward (-3.7%) on the following two days and the downward trend continued as the update was announced.

4. Increased Volatility In Overlap

After the release of the update, the volatility between organic search results and AIO began to spike sharply. BrightEdge interprets the changes as suggesting a pattern of redistribution. In my opinion, this may reflect changes to both AIO and organic rankings, which should stabilize at some point. The scale of the changes at the lowest ranking positions (21-30) indicates a high level of volatility.

How SERPs Are Currently Trending Since Update Announcement:

  • Top 10 positions: +10.6% increase in volatility
  • Positions 11-20: -5.9% decline in volatility
  • Positions 21-30: +23.3% increase in volatility

5. Industry Specific Changes

It must be stressed that what BrightEdge’s Generative Parser is reporting represents real-time changes across millions of search results, indicative of the scale of change within the SERPs. BrightEdge next looked at specific industries and at this time is seeing significant shifts in e-commerce queries and notable changes in education-related queries.

Here are changes by industry:

  • E-commerce showing -22.1% shift in top citations
  • Education observing moderate -7.3% adjustment
  • Healthcare maintaining stability at -1.5% shift
  • B2B Tech recording -0.4% change

6. Patterns In How AIO Cites Content

The volatility patterns give an early tentative indication of what kinds of queries Google is giving priority in this update. Again, these are real-time results that are subject to change as new parts of the update are rolled out.

BrightEdge’s Insights From Volatility Rates:

  • “Educational content maintaining stronger stability
  • Product-focused content showing higher volatility
  • Research-oriented sites demonstrating resilience
  • Industry expertise appearing to gain prominence”

Takeaway From Real-Time Volatility

BrightEdge gave Search Engine Journal their interpretation of what the real-time data might suggest for future AIO citations:

  • “Prioritize genuine user value in content creation over keyword optimization
  • Don’t ignore the importance of your content that may not be on page 1 for your target keywords
  • Carefully monitor your AIO citations as the data suggests there could be some fluctuations”

Reason For Optimism?

There’s a lot of negative sentiment around this update, which is understandable because 2024 has been a bad year for many publishers. For example, a common complaint on X (formerly Twitter) is that Google shows too much Reddit content.

Google’s AI Overviews has not been welcomed by publishers at any level or in any industry because it’s brutal to see your content reworded by Google’s AI and added to a summary alongside reworded content from competitors, with just a tiny, hard-to-see link for the citation.

Frank Pine, executive editor of Media News Group and Tribune Publishing (a network of 68 newspapers) was quoted earlier this year by the New York Times as remarking that Google’s AI Overviews is “cannibalizing” content and harming publishers.

The Times quoted him:

“It potentially chokes off the original creators of the content,” Mr. Pine said. The feature, AI Overviews, felt like another step toward generative A.I. replacing ‘the publications that they have cannibalized…’”

At this point in time it doesn’t do anyone good to sit around and grumble. Keep an eye on the search results to monitor changes as this update rolls out and follow the data.

Read more about the November 2024 Google Core Algorithm Update and learn more about AI Overviews here.

Featured Image by Shutterstock/Inkley Studio

Google SERPs Without Favicons Is A Glitch via @sejournal, @martinibuster

Last week, reports claimed Google was testing mobile search results without favicons or colors, eliminating any advantage conferred by a well optimized favicon. However, a Google representative confirmed this was not a test.

Favicons In Search Results

The word favicon is short for “favorite icon,” an image meant to be shown in browser tabs, in bookmarks, and in Google search results. Google recently recommended that publishers use favicons that are at least 32×32 pixels in size.

Google shows favicons in search results, which makes it a subtle way for ranked websites to attract attention and traffic. For that reason it would be a blow for many websites should Google remove the favicons because it would further erode search visibility.

Favicons Missing?

Lily Ray tweeted a screenshot showing that the favicons were missing in search results.

The screenshot clearly shows the favicons missing from the search results. However, the reason turned out to be something else entirely.

Googler Rajan Patel, VP of Engineering at Google, tweeted a response that explained that this was not the way the search results should be displaying. He explained that this kind of search result is meant for low end devices that can’t support rich results.

He tweeted:

“This isn’t intended. @lilyraynyc Did you have JavaScript disabled on the device? What kind of device was it? The experience you’re showing is what we show for lower end devices that can’t support the richness of the search result page.”

Lily Ray responded:

“JS is not disabled. iPhone iOS 18.1 on an iPhone 15 pro. I was either using the Chrome app or Google app to search, can’t remember which.”

Rajan Patel speculated that it may have been a misconfigured experiment.

Glitches At Google

It turns out to be an example of Google glitching. Something that wasn’t explored is whether the Internet connection or latency played a role in causing Google to treat the SERP request as if it came from a low-end device. In any case, it’s an example of how complex serving search results can be.

Featured Image by Shutterstock/tomertu

Google Ad Manager Launches Curation Capabilities For Agencies via @sejournal, @brookeosmundson

Just in – Google announces launch of new curation capability for agency advertisers.

The updates in Google Ad Manager are designed to streamline workflow, reduce complexity, and drive efficiencies for agencies.

These changes offer agencies new tools and capabilities to improve ad operations and campaign management.

Per Ginny Marvin, Google Ads Liaison, curation with Google Ad Manager is launching in North America, and will roll out globally early next year.

Here’s a breakdown of the key updates, along with insights on how they’ll impact agency operations moving forward.

The Launch Of Curation

In the recent blog post, Google announced the launch of their curation tool.

Google recognizes the challenge advertisers currently face trying to reach the right audiences across multiple networks and platforms.

With curation, agencies will be able to connect more easily with partners to find curated auction inventory packages, along with activating data segments within their agency Ad Manager account.

Agencies, in turn, can expect more accurate forecasting and increased match rates for audience targeting.

Another item rolling out with this launch is more billing efficiency.

Now, Google Ad Manager will handle billing and payments to data providers and inventory curators, which frees up admin time that would be spent creating separate invoices or different payment processes.

As of the announcement, curation supports the following:

  • Audigent
  • IAS
  • Liveramp
  • Lotame
  • Multilocal
  • Permutive
  • PrimeAudience
  • Scope3

More Streamlined Programmatic Buying For Agencies

Google Ad Manager already has a streamlined user interface for agencies to report and analyze campaigns across different DSPs.

In addition, Google has one central location for agencies to discover new deals, advanced forecasting capabilities, and packaged deals across participating DSPs.

Google states that agencies can negotiate directly with publishers to create a single deal ID that works across all their buying platforms. It’s currently available for Preferred Deals and Private Auctions through Display & Video 360 and Yahoo.

Per the announcement, support for Programmatic Guaranteed and other DSPs is coming in future months.

The Importance of These Updates for Agencies

PPC marketers are currently juggling multiple platforms, increased client demands, and tighter deadlines.

Google’s new Ad Manager curation tool hopes to help ease this burden, allowing agencies to operate more efficiently and keep the focus on driving value.

For PPC agencies, these updates come with tangible benefits and strategic implications:

  • Improved Client Relationships: With more insightful and customizable reporting, agencies can offer clients a more transparent view. This enhanced visibility can strengthen client relationships by building trust through transparency. When clients are better informed, agencies can demonstrate value more efficiently.
  • Increased Focus on Strategy Over Operations: Automation in inventory management, order creation, and billing processes allows agencies to shift focus from operational tasks to higher-level strategy. Agencies can now allocate more resources to developing creative and more impactful campaigns, ultimately driving better results for clients.
  • Scalability Made Easier: As agencies grow, managing a large number of campaigns across multiple clients becomes challenging. These new Ad Manager tools provide a framework that makes it easier to scale operations.

Final Thoughts

Google’s curation capabilities in Ad Manager represent a step forward in meeting the evolving needs of agencies.

For PPC agencies, the ability to automate processes, manage inventory, and access enhanced reporting with greater flexibility can lead to improved results and stronger client relationships.

These tools allow agencies to focus on what matters most – driving impactful campaigns and delivering meaningful results.

Google’s Martin Splitt: Duplicate Content Doesn’t Impact Site Quality via @sejournal, @MattGSouthern

Google’s Search Central team has released a new video in its “SEO Made Easy” series. In it, Search Advocate Martin Splitt addresses common concerns about duplicate content and provides practical solutions for website owners.

Key Takeaways

Despite concerns in the SEO community, Google insists that duplicate content doesn’t harm a site’s perceived quality.

Splitt states:

“Some people think it influences the perceived quality of a site but it doesn’t. It does cause some challenges for website owners though, because it’s harder to track performance of pages with duplicates.”

However, it can create several operational challenges that website owners should address:

  • Difficulty in tracking page performance metrics
  • Potential competition between similar content pieces
  • Slower crawling speeds, especially at scale

Splitt adds:

“It might make similar content compete with each other and it can cause pages to take longer to get crawled if this happens at a larger scale. So it’s not great and is something you might want to clean up, but it isn’t something that you should lose sleep over.”

Three Solutions

1. Implement Canonical Tags

Splitt recommends using canonical tags in HTML or HTTP headers to indicate preferred URLs for duplicate content.

While Google treats these as suggestions rather than directives, they help guide the search engine’s indexing decisions.

Splitt clarifies:

“This tag is often used incorrectly by website owners so Google search can’t rely on it and treats it as a hint but might choose a different URL anyway.”
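
For reference, the two forms Splitt mentions look like this (the URL is a placeholder):

```
<!-- In the page's <head>: -->
<link rel="canonical" href="https://www.example.com/widgets">

As an HTTP response header (useful for non-HTML files such as PDFs):
Link: <https://www.example.com/widgets>; rel="canonical"
```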

2. Manage Internal Links and Redirects

When Google chooses different canonical URLs than specified, website owners should:

  • Review and update internal links to point to preferred canonical URLs
  • Consider implementing 301 redirects for external links
  • Ensure redirects are appropriately configured to maintain site performance
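
For the 301 case, a rule like the following (an illustrative Apache .htaccess example; the paths are placeholders) sends visitors and link signals to the preferred URL:

```apache
# Permanently redirect a duplicate URL to its canonical counterpart.
Redirect 301 /old-duplicate-page https://www.example.com/canonical-page
```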

3. Consolidate Similar Content

The most strategic approach involves combining similar pages to:

  • Improve user experience
  • Streamline Search Console reporting
  • Reduce site clutter

Splitt explains:

“If you find that you have multiple very similar pages, even if Google doesn’t consider them duplicates, try to combine them. It makes information easier to find for your users, will make reporting in Google Search Console easier to work with, and will reduce clutter on your site.”

Search Console Notices

Google Search Console may flag pages with various duplicate content notices:

  • “Duplicate without user-selected canonical”
  • “Alternate page with proper canonical tag”
  • “Duplicate, Google chose different canonical than user”

These notifications indicate that Google has indexed the content, possibly under different URLs than initially intended.

International SEO Considerations

Splitt addresses duplicate content in international contexts, noting that similar content across multiple language versions is acceptable and handled appropriately by Google’s systems.

Why This Matters

This guidance represents Google’s current stance on duplicate content and clarifies best practices for content organization and URL structure optimization.

See the full video below:


Featured Image: AnnaKu/Shutterstock

Google Is Updating Its Customer Match Policy via @sejournal, @brookeosmundson

Google sent out an update to its Customer Match policy to advertisers yesterday, going into effect in January 2025.

Customer Match allows advertisers to use their first-party data to reach people across many different campaign types, making it one of the most valuable features available for brands.

The new policy change introduces new standards designed to protect user experience. Read on to learn more about the update and how advertisers can prepare for the change.

The Customer Match Policy Update

In its email, Google stated that advertisers who misuse Customer Match may have their access taken away.

Google Ads email to advertisers about Customer Match policy updates.

They cite items like:

  • Having the potential to cause harm to users
  • Creating a poor user experience

If advertisers repeatedly violate these policies, it could lead to an account suspension.

However, Google reiterated that account suspensions for violating Customer Match policies wouldn’t be immediate.

Advertisers will get a warning issued at least seven days before any type of Google Ads account suspension.

In the short email to Google Ads account managers, they were also clear on how they gather feedback for consideration of its Customer Match policy violations:

  • User feedback: Advertisers should expect Google to scrutinize feedback from users, especially if their ads receive negative interactions or reported concerns.
  • Prevalence or severity of abuse: Ads deemed abusive or misleading could lead to Customer Match restrictions.
  • Repeated violations of Customer Match policy: Consistent breaches of the policy will be grounds for account suspension.

This update is slated to go into effect on January 13, 2025.

What This Means For Advertisers

If you’re an advertiser who is using Customer Match for any Google Shopping campaigns, now is a great time to revisit how you’re collecting user data.

Below are a few examples to ensure you’re collecting user data within compliance policy:

  • Make sure you’re getting a user’s consent before collecting their email address (or any other type of data)
  • Check your targeting settings to ensure you’re not targeting anyone under the age of 18
  • Don’t overly personalize your ads, and monitor user feedback
  • Educate your team to ensure everyone involved in managing Google Ads campaigns understands these changes.

Navah Hopkins from Optmyzr provided her perspective on the Customer Match policy, stating:

This email just went out to advertisers letting us know that Customer Match is a privilege, not a right.

This is a great reminder that the policies Google has in place are there to help advertisers stay effective. But if brands are caught misusing them in any way, access can be taken away at any time.

The upcoming policy change introduces new standards designed to protect user experience, which Google continues to emphasize as a priority across its platforms.

By focusing on relevance, quality, and compliance, brands can continue to leverage Customer Match without interruption.

Digital Marketers See Schema Structured Data Shifting Beyond SEO via @sejournal, @martinibuster

An interesting discussion emerged on Twitter inspired by an article written by Jono Alderson. The article proposes thinking about Schema.org structured data markup as a way for emerging AI technologies to better understand and surface published Internet content.

Schema.org Structured Data Markup

The content on a website is called unstructured data because it has no formal structure that labels each part of the content in a machine-readable way. Structured data, on the other hand, is the exact same content organized with labels that identify images, authors, and content so that a machine can immediately understand it.

Schema.org structured data markup is generally seen by publishers and the SEO community as something to use in order to make a web page eligible for rich results features in Google. That way of thinking is manifested in the many SEO and Schema.org WordPress plugins that are limited to outputting structured data that Google may use for surfacing rich results.

New AI technologies that can use structured data are here, requiring search marketers to consider a new approach to how structured data is deployed. What Jono encouraged in the article is to think of structured data as a way to create a “data-first foundation” that is ready for the near future.

The article proposes thinking of Schema.org markup as a way to communicate what a web page is about and how it relates to everything else on the website. Jono writes:

“But don’t shy away from building a connected graph of broader, “descriptive” schema just because Google’s not showing an immediate return. These “descriptive” types and relationships might end up being the lifeline between your content and the AI models of the future.”
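
A connected graph of “descriptive” schema might look like the following JSON-LD sketch, where `@id` references link entities to each other across the site (all names and URLs are placeholders):

```json
{
  "@context": "https://schema.org",
  "@graph": [
    {
      "@type": "Organization",
      "@id": "https://www.example.com/#org",
      "name": "Example Co"
    },
    {
      "@type": "WebSite",
      "@id": "https://www.example.com/#website",
      "publisher": { "@id": "https://www.example.com/#org" }
    },
    {
      "@type": "Article",
      "@id": "https://www.example.com/widget-guide/#article",
      "headline": "A Guide to Widgets",
      "author": { "@id": "https://www.example.com/#org" },
      "about": { "@type": "Thing", "name": "Widgets" },
      "isPartOf": { "@id": "https://www.example.com/#website" }
    }
  ]
}
```

None of this targets a specific rich result; it simply describes the page and its relationships, which is the shift the article argues for.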

Jono tweeted about his article on X (formerly Twitter) and Martha van Berkel, founder of SchemaApp, agreed with Jono’s article that the role of Schema structured data markup is shifting.

She tweeted:

“I agree with you that the role of schema markup is changing. Building a knowledge graph to manage how your website/content is understood with schema, and then asking it questions will be more important than optimizing for Rich Results or for Google.”

Ammon Johns tweeted:

“The biggest issue with Schema is that it is largely just self-declaration, no different in essence to META content, and we know how reliable Google decided that stuff was. So Google will use it, but they are unlikely to fully trust it.”

Ammon is right, of course, that structured data can’t be blindly trusted. One way to solve that problem is to use a smaller index of high-quality websites, the way Perplexity AI does.

Gagan Ghotra tweeted how they sometimes would like to expand their use of structured data but are limited by what the SEO and structured data tools offer.

Read Jono Alderson’s X discussion here.

Read Jono’s article:

What if Schema.org is just… Labels?

Featured Image by Shutterstock/PureSolution