Google Confirms Site Reputation Abuse Update via @sejournal, @martinibuster

Google’s SearchLiaison confirmed that Google’s site reputation abuse update started on Monday, May 6th. Many sites across the web took down webpages that could be perceived as hosting third-party content for the purpose of ranking in search engines.

Site Reputation Abuse

An old strategy that’s made a recent comeback is one in which a marketer piggybacks their content on another website in order to rank it in search engines. The simplest way to describe the practice is that one publisher is piggybacking on another publisher’s website.

Some newbie marketers slapped the awkward name “parasite SEO” on the practice. It’s an inept name for this strategy because a parasite subsists on an unwilling host organism, while this approach to ranking happens by agreement, not by one site attacking another without permission.

This isn’t just a low-level affiliate marketer strategy, though. It’s also one that’s practiced by many major brands, particularly for credit cards and product reviews.

Google Targets Third Party Content

This spam policy specifically targets sites that host third-party content with which the host publisher has little involvement. However, it takes more than just hosting third-party content to be flagged as spam.

Google’s formal definition is:

“Site reputation abuse is when third-party pages are published with little or no first-party oversight or involvement, where the purpose is to manipulate Search rankings by taking advantage of the first-party site’s ranking signals. Such third-party pages include sponsored, advertising, partner, or other third-party pages that are typically independent of a host site’s main purpose or produced without close oversight or involvement of the host site, and provide little to no value to users.”

Google’s SearchLiaison confirmed in a tweet that enforcement of the policy began today.

He tweeted:

“It’ll be starting later today. While the policy began yesterday, the enforcement is really kicking off today.”

Some big-brand sites have recently removed sections of their sites that featured product reviews lacking evidence that the reviewer actually handled the products. The reviews had no original product photos, no product measurements, and no testing results.

Read Google’s guidelines on Site Reputation Abuse.

Featured Image by Shutterstock/Lets Design Studio

Apple’s “Intelligent Search” Will Summarize Webpages via @sejournal, @martinibuster

A report based on independently verified accounts notes that Apple’s Safari 18 will come with an Intelligent Search feature that summarizes webpages in response to search queries. There may also be a new feature called Web Eraser that allows users to permanently remove text, images, and advertisements from webpages.

The Potential For Disruption By Apple AI Search

Apple has been gathering website data for years through its Internet crawler that uses the user agent, Applebot. The harvested data has in the past been used in the context of Siri and Apple’s Spotlight Suggestions feature.

Many in the search community have been aware of Applebot and have welcomed the prospect of a new search engine from Apple, but despite constant crawling, Apple has not released one. A reason may be that it’s become apparent the best way to challenge Google Search is with a technology that replaces search engines altogether, much as the iPhone made standalone digital cameras obsolete.

The latest news coming out about Safari 18 appears to confirm that supplanting Google is the strategy that Apple is pursuing.

Duane Forrester, formerly of Bing and now at Yext, commented on the potentially disruptive quality of Apple’s new technology:

“Intelligent Search could change how iOS consumers get, see and interact with content and answers. We are likely to see Apple taking a big step forward, into the consumer-accessible AI conversation which has been dominated by OpenAI, Microsoft, and Google et al to this point. Our phones are about to become AI-enabled and that could be ground zero for AI Agents. It’s going to impact and potentially change consumer behavior and be the gateway to new experiences.”

The approach Apple is taking has the potential to disrupt not just search engines but also the search optimization and publishing industries, which have been waiting years for an Apple search engine. But the extent of that disruption depends on how Apple implements its AI web search summarizer.

Webpage Summarization

Although news reports didn’t provide details as to how the new search result summaries will appear, it seems reasonable to speculate that Apple will provide attribution in the form of a link so that users can click through to the source website.

Duane Forrester speculated:

“Apple was just in court for Google’s anti-trust trial and likely isn’t keen to dive into “monopoly” waters themselves. My guess is that any “result” will have a source.”

This is what was reported:

“With the release of Safari 18, Apple is expected to introduce article summarization through a new Intelligent Search feature — meaning users will have the option to generate a brief summary of any webpage currently on screen.

Apple’s built-in AI software can analyze the keywords and phrases within a webpage or document and generate a short summary containing only the most important information.”

SEOs have been giddy about the prospect of an Apple search engine for years. It now appears that the “Google Killer” they’ve been waiting for may result in less traffic from search queries, though to what extent is impossible to tell at this point.

One search marketing expert mused in a private chat that if Intelligent Search summarizes more than it links out then that may signal it’s time to start selling off the domain names they’ve invested in.

On-Device Processing

An interesting feature of the text summarization is that the technology that creates the summary (called Ajax) resides on the mobile device itself. Ajax extracts keywords and entities, uses that data to identify the topic, and builds a loose summary of a webpage, which is then turned into a text summary for the user.

This is how the functionality is described:

“In analyzing texts, the software takes into account all relevant information available. It can recognize and classify entities such as companies, people, and locations. For instance, if a name appears at the top of a text, the software will likely recognize the name as belonging to the text’s author.”
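The kind of keyword- and frequency-based extraction described above can be sketched as a toy extractive summarizer. This is purely illustrative; Apple’s Ajax model is not public, and the scoring below is a generic frequency heuristic, not Apple’s actual method:

```python
import re
from collections import Counter

def summarize(text, max_sentences=2):
    """Toy extractive summarizer: keep the sentences whose words
    occur most frequently across the whole text."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    words = re.findall(r"[a-z]+", text.lower())
    stop = {"the", "a", "an", "is", "are", "of", "to", "and", "in", "on", "it"}
    freq = Counter(w for w in words if w not in stop)
    # Score each sentence by the document-wide frequency of its words.
    ranked = sorted(
        sentences,
        key=lambda s: sum(freq[w] for w in re.findall(r"[a-z]+", s.lower())),
        reverse=True,
    )
    top = ranked[:max_sentences]
    # Re-emit the chosen sentences in their original order.
    return " ".join(s for s in sentences if s in top)
```

A production system would layer entity recognition and abstractive generation on top of (or in place of) this kind of frequency scoring.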

Apple Also Plans A Web Eraser

As if an Apple search summarizer isn’t bad enough, Apple reportedly has a “Web Eraser” feature planned for Safari. Web Eraser removes content from webpages so that site visitors don’t have to look at it anymore. Things like advertising, videos, comments, suggested reading, and maybe even popups could be permanently blocked. Once a user “erases” a block of content from a webpage, that block stays erased for that visitor on subsequent visits.

According to a report about the Apple Web Eraser:

“The feature is expected to build upon existing privacy features within Safari and will allow users to erase unwanted content from any webpage of their choosing. Users will have the option to erase banner ads, images, text or even entire page sections, all with relative ease.”

Technological Disruptions

It’s a natural response to experience anxiety in the face of change. For many, the dawning of AI search is their first experience of a major upheaval. But those of us who have been in search for 25+ years have experienced and grown accustomed to sudden, transformative changes that alter publishing and SEO. Like Duane, I tend to feel that Apple’s implementation of an AI search engine that summarizes websites will be disruptive, but not to the point that it harms websites. It’s in Apple’s self-interest not to disrupt the Internet to the point of extinction.

Featured Image by Shutterstock/frantic00

Google Connects Imported User Data To GA4 Audiences via @sejournal, @MattGSouthern

Google has streamlined the process for using imported user data to build audiences in Google Analytics 4 (GA4).

This eliminates the need for users to visit a website or app before joining designated audiences.

Google’s official announcement states:

“We’ve made it easier to use imported user data with audiences. Now, data you upload, joined via either user ID or client ID can be used to qualify users for your GA4 audiences right away, without them needing to visit your site or app first.”

Google illustrated the new capability with an example:

“Imagine you have a ‘Gold’ loyalty tier audience. By uploading your loyalty tier data, ‘Gold’ users will automatically be added to your GA4 audience without requiring further activity from them.”

Streamlining Audience Building

Google is tapping into GA4’s user data import functionality, which allows you to add external data like customer loyalty status, purchase histories, and lifetime values from CRM systems.

Before this change, uploaded user data informed analysis but didn’t automatically qualify users for audiences.

The new process provides a more seamless way to construct valuable audience segments using offline user insights.

A Google Analytics help document on user data import confirms:

“You can enhance user segmentation and remarketing audiences by importing user metadata that you store outside of Analytics.”

Requirements

Importing user data requires unique User IDs, Client IDs (web), or App Instance IDs (apps) to map user records. Google cautions against uploading files “that include duplicate keys.”

Strict guidelines prohibit uploading “personally identifiable information with user-data import.” User IDs must be hashed.
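As an illustration of that hashing step, identifiers are typically normalized and then run through a one-way hash such as SHA-256 before upload. SHA-256 and the normalization shown are assumptions here; confirm the required algorithm and rules against Google’s current documentation:

```python
import hashlib

def hash_user_id(user_id: str) -> str:
    """Normalize an identifier, then return its SHA-256 hex digest."""
    normalized = user_id.strip().lower()  # consistent normalization before hashing
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# The same underlying user always yields the same key, so duplicate
# detection still works on the hashed values.
print(hash_user_id(" User123 "))
```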

Once uploaded and processed, the imported data becomes instantly available for audience qualification without additional actions. Subsequent uploads or data collection can overwrite values.

Why SEJ Cares

The change bridges the gap between an organization’s customer database and GA4 audience strategy.

Previously, uploaded offline user data informed analysis but didn’t directly build audiences. Building audiences depended on tracked site/app behavior separate from imported attributes like loyalty status.

This overcomes a limitation to utilizing holistic customer insights for GA4 audiences, creating an opportunity for audience-centric use cases.

How It Helps

This update to GA4 can simplify your audience marketing workflows in the following ways.

Audience List Uploads & Portability

You can now translate offline customer lists and CRM segmentation into GA4 audience definitions, allowing audience portability across systems.

Dynamic Audience Refreshes

You can overwrite imported values through periodic data uploads based on the latest offline qualifications. This removes the need for complex script-based audience refreshes or manual list management.

Richer Audience Enrichment

Uploaded attributes provide an additional layer of demographic, interest, intent, and engagement indicators.

These can be layered into audience definitions alongside on-site and app behavior, enabling richer segmentation logic.

Looking Ahead

This update could be a difference-maker for brands that leverage promotions, content experiences, and messaging tailored to their offline customer stratification, like loyalty tiers.

While the ability to upload offline user attributes has existed, instantly aligning those attributes to GA4 audience definitions opens up new opportunities for richer audience segmentation, tailored marketing, and optimized customer experiences.


Featured Image: Piotr Swat/Shutterstock

Big Change To Google’s Product Structured Data Page via @sejournal, @martinibuster

Google revamped its product structured data documentation by splitting one comprehensive page into three more tightly focused pages, providing an example of how to take a page that’s too big and turn it into multiple topically relevant webpages.

Google Product Structured Data

Product structured data is essential for ecommerce and product review websites because it helps make them eligible for rich results in Google’s search engine results pages (SERPs). When correctly deployed, product structured data can contribute significantly to a website’s traffic and earnings.

Google’s restructuring of the official documentation gives the ecommerce community a lot to take in, but it also simplifies what had become an increasingly large product structured data webpage.

What Changed?

The most notable change to the documentation is that the entire document has been split into three pages. The original document, Product Structured Data, was renamed Introduction To Product Structured Data. Its word count went from 4,808 words to only 667, with approximately 50% of the new document containing the same content. Aside from trivial changes, there is a brand-new section on the revamped page called Deciding Which Markup To Use, which serves as a jumping-off point to the two new pages.

Merchants, product review site publishers and SEOs now have three product structured data documents to read:

  1. Introduction To Product Structured Data
  2. Product Snippet Structured Data
  3. Merchant Listing Structured Data

In addition to the above changes, there’s a new blue-tinted callout box that draws attention to the Product Variant Structured Data page, replacing similar text that was buried in the document and easily overlooked.

Screenshot of Callout Box

Document Extensively Rewritten

There are instances where the headings of the new documentation were rewritten to make clear what the topic of each section is about.

The new Introduction To Product Structured Data page now contains a brand-new section. The old section was called “Result types,” while the revamped section is called Deciding Which Markup To Use, a more descriptive heading.

This is the new content:

“Deciding which markup to use
There are two main classes of product structured data. Follow the requirements for the type that best suits your use case:

Product snippets: For product pages where people can’t directly purchase the product. This markup has more options for specifying review information, like pros and cons on an editorial product review page.

Merchant listings: For pages where customers can purchase products from you. This markup has more options for specifying detailed product information, like apparel sizing, shipping details, and return policy information.

Note that there is some overlap between these two features. In general, adding the required product information properties for merchant listings means that your product pages can also be eligible for product snippets. Both features have their own enhancements, so be sure to review both when deciding which markup makes sense in the context of your site (the more properties you can add, the more enhancements your page can be eligible for).”
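For context, product structured data of either class is usually embedded in the page as JSON-LD. The snippet below builds a minimal, hypothetical Product example with rating and offer properties from the schema.org vocabulary; consult Google’s documentation for the full list of required and recommended properties for each feature:

```python
import json

# Hypothetical product; property names come from schema.org's
# Product, AggregateRating, and Offer types.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Widget",
    "image": "https://example.com/widget.jpg",
    "description": "A demo widget used to illustrate the markup.",
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": 4.4,
        "reviewCount": 89,
    },
    "offers": {
        "@type": "Offer",
        "price": "19.99",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock",
    },
}

# Embed the JSON-LD in the page head or body.
snippet = (
    '<script type="application/ld+json">\n'
    + json.dumps(product, indent=2)
    + "\n</script>"
)
print(snippet)
```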

First New Page: Product Snippets

In addition to the revamped introduction, there is now a new standalone page focused on the product structured data that makes pages eligible for rich results about ratings, reviews, price, and product availability.

The title element for the new page is How To Add Product Snippet Structured Data. Excluding some of the structured data examples, the new product snippet page has about 2,500 words.

Much of the content on this new page isn’t new. The similarities between a section of the old mega-page and this new standalone page indicate that Google split this section off from the old documentation and turned it into its own page.

Second New Page: Merchant Listing Structured Data

The second new page is dedicated to the product structured data that’s specific to merchant listings, critical for surfacing products in the shopping-related rich results which Google calls Merchant Listing Experiences.

Apart from a single illustration, the dedicated page for Merchant Listing Structured Data has no information at all about what Merchant Listing Experiences are and how they look in the search results. The second paragraph of the new page encourages a reader to visit the Introduction To Product Structured Data webpage to learn more about what the Merchant Listing Experiences rich results look like.

An argument could be made that the Merchant Listing Structured Data page is the natural home for context about the Merchant Listing Experiences. But someone at Google decided that this one page needs to be 100% focused on a single topic (structured data).

The new webpage comes in at roughly 1,700 words.

Editorial Decisions Based On User Needs

Rather than keep one giant, comprehensive page about a topic, the decision was made to split it into its constituent subtopics, resulting in three pages that are now more tightly focused and that presumably will not only be better for users but will also rank better.

Read The Documentation

Review the original version of the documentation:

Internet Archive: Product (Product, Review, Offer) structured data

Read the new documentation:

Introduction to Product structured data

Product snippet (Product, Review, Offer) structured data

Merchant listing (Product, Offer) structured data

Featured Image by Shutterstock/achinthamb

Google Performance Max For Marketplaces: Advertise Without A Website via @sejournal, @MattGSouthern

Google has launched a new advertising program called Performance Max for Marketplaces, making it easier for sellers on major e-commerce platforms to promote their products across Google’s advertising channels.

The key draw? Sellers no longer need a website or a Google Merchant Center account to start.

The official Google Ads Help documentation states:

“Performance Max for Marketplaces helps you reach more customers and drive more sales of your products using a marketplace. After you connect your Google Ads account to the marketplace, you can create Performance Max campaigns that send shoppers to your products there.”

The move acknowledges the growing importance of online marketplaces like Amazon in product discovery.

For sellers already listing products on marketplaces, Google is providing a way to tap into its advertising ecosystem, including Search, Shopping, YouTube, Gmail, and more.

As ecommerce marketer Mike Ryan pointed out on LinkedIn:

“Polls vary, but a recent single-choice survey showed that 50% of consumers start product searches on Amazon, while a multiple-choice survey showed that 66% of consumers start on Amazon.”

The source for his data is a 2023 report by PowerReviews.

Getting Started

To use Performance Max for Marketplaces, sellers need an active account on a participating marketplace platform and a Google Ads account.

Google has yet to disclose which marketplaces are included. We contacted Google to request a list and will update this article when we receive it.

Once the accounts are linked, sellers can launch Performance Max campaigns, drawing product data directly from the marketplace’s catalog.

Google’s documentation states:

“You don’t need to have your own website or Google Merchant Center account. You can use your existing marketplace product data to create ads with product information, prices, and images.”

Conversion tracking for sales is handled by the marketplace, with sales of the advertiser’s products being attributed to their Google campaigns.

While details on Performance Max For Marketplaces are still emerging, Google is providing information when asked directly.

Navah Hopkins states on LinkedIn she received these additional details:

“I finally got a straight answer from Google that we DO need a Merchant Center for this, we just don’t need one to start with.”

Differences From Standard Performance Max

These are the key differences from regular Performance Max campaigns:

  • No URL expansion, automatically-created assets, or video assets
  • No cross-account conversion tracking or new customer acquisition modeling
  • No audience segmentation reporting

Why SEJ Cares

Performance Max for Marketplaces represents a new way to use Google advertising while operating on third-party platforms.

Getting products displayed across Google’s ecosystem without the overhead of a standalone ecommerce presence is a significant opportunity.

How This Can Help You

Through Google’s ecosystem, merchants have new ways to connect with customers.

Performance Max for Marketplaces is a potential difference maker for smaller retailers that have struggled to gain traction through Google’s standard shopping campaigns.

Established merchants invested in Google Ads may find the program opens new merchandising opportunities. By making an entire marketplace catalog available for ad serving, sellers could uncover previously undiscovered pockets of demand.

The success of Performance Max for Marketplaces will depend on its execution and adoption by major players like Amazon and Walmart.


Featured Image: Tada Images/Shutterstock

Google’s Search Engine Market Share Drops As Competitors’ Grows via @sejournal, @MattGSouthern

According to data from GS Statcounter, Google’s search engine market share has fallen to 86.99%, the lowest point since the firm began tracking search engine share in 2009.

The drop represents a more than 4% decrease from the previous month, marking the largest single-month decline on record.

Screenshot from: https://gs.statcounter.com/search-engine-market-share/, May 2024.

U.S. Market Impact

The decline is most significant in Google’s key market, the United States, where its share of searches across all devices fell by nearly 10%, reaching 77.52%.

Screenshot from: https://gs.statcounter.com/search-engine-market-share/, May 2024.

Concurrently, competitors Microsoft Bing and Yahoo Search have seen gains. Bing reached a 13% market share in the U.S. and 5.8% globally, its highest since launching in 2009.

Yahoo Search’s worldwide share nearly tripled to 3.06%, a level not seen since July 2015.

Screenshot from: https://gs.statcounter.com/search-engine-market-share/, May 2024.

Search Quality Concerns

Many industry experts have recently expressed concerns about the declining quality of Google’s search results.

A portion of the SEO community believes that the search giant’s results have worsened following the latest update.

These concerns have begun to extend to average internet users, who are increasingly voicing complaints about the state of their search results.

Alternative Perspectives

Web analytics platform SimilarWeb provided additional context on X (formerly Twitter), stating that its data for the US for March 2024 suggests Google’s decline may not be as severe as initially reported.

SimilarWeb also highlighted Yahoo’s strong performance, categorizing it as a News and Media platform rather than a direct competitor to Google in the Search Engine category.

Why It Matters

The shifting search engine market trends can impact businesses, marketers, and regular users.

Google has been on top for a long time, shaping how we find things online and how users behave.

However, as its market share drops and other search engines gain popularity, publishers may need to rethink their online strategies and optimize for multiple search platforms besides Google.

Users are becoming vocal about Google’s declining search quality over time. As people start trying alternate search engines, the various platforms must prioritize keeping users satisfied if they want to maintain or grow their market position.

It will be interesting to see how these competitors respond to this boost in market share.

What It Means for SEO Pros

As Google’s competitors gain ground, SEO strategies may need to adapt by accounting for how each search engine’s algorithms and ranking factors work.

This could involve diversifying SEO efforts across multiple platforms and staying up-to-date on best practices for each one.

The increased focus on high-quality search results emphasizes the need to create valuable, user-focused content that meets the needs of the target audience.

SEO pros must prioritize informative, engaging, trustworthy content that meets search engine algorithms and user expectations.

Remain flexible, adaptable, and proactive to navigate these shifts. Keeping a pulse on industry trends, user behaviors, and competing search engine strategies will be key for successful SEO campaigns.


Featured Image: Tada Images/Shutterstock

New Anthropic Claude Team Plan Versus ChatGPT Team via @sejournal, @martinibuster

Anthropic announced a new Team plan that gives businesses access to more powerful features, along with management and security controls that make it a strong option for companies to consider.

Generative AI For Teams

Modern SaaS technologies for businesses generally come with a team version that allows collaboration within a company and also gives security and control to management so that proprietary documents don’t accidentally leak to the public.

OpenAI launched its ChatGPT Team plan in January 2024, offering a secured workspace for users within a company at reasonable monthly and yearly subscription rates. Anthropic has finally launched its own Team version of Claude, with features that exceed what’s offered by ChatGPT Team.

Claude Team Compares To ChatGPT Team

Claude is known as a great model for creative purposes, and a team version that’s more powerful than the regular paid version makes it even more attractive. The important question is: how does it compare to ChatGPT Team?

ChatGPT Team is $5/month cheaper (on the yearly billing plan) than Anthropic’s collaborative plan. Otherwise, both are priced at $30/month per user.

Claude Team is a step up from ChatGPT Team in one important way: the context window. A context window is a measure of how much data a model can process at one time. The larger the context window, the more data the model can analyze in one batch.

ChatGPT Team offers a 32k context window, but Anthropic Claude Team users enjoy a whopping 200k context window, which is about 150,000 words or 500 pages of text.
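Those word and page figures follow from common rules of thumb, roughly 0.75 words per token and about 300 words per page; both ratios are assumptions, and real documents vary:

```python
def context_estimate(tokens, words_per_token=0.75, words_per_page=300):
    """Convert a token budget into approximate words and pages."""
    words = int(tokens * words_per_token)
    pages = round(words / words_per_page)
    return words, pages

print(context_estimate(200_000))  # → (150000, 500)
print(context_estimate(32_000))   # → (24000, 80)
```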

Team Versus Pro Version

When it comes to collaborating within a company, Claude Team is a better value than the regular Pro version because it provides more usage, which means individual users can do more work than users on the Pro plan. The Team version also offers collaborative features like the ability to create a shared database that can be used for projects. Beyond that, it offers administrative tools for managing users and billing.

More features are on the way according to the official announcement:

“In the coming weeks, we will be releasing additional collaboration features, including citations from reliable sources to verify AI-generated claims, integrations with data repositories like codebases or CRMs, and iterating with colleagues on AI-generated documents or projects—all while maintaining the highest standards of security and safety.”

Read the announcement here:

Introducing The Claude Team Plan

Wix Proposals Tool Helps Businesses Win More Clients via @sejournal, @martinibuster

Wix announced a new feature that allows businesses to create business proposals and manage payments, streamlining the process of converting prospects into clients and setting up payments.

Proposals Powered By Prospero

The new feature, called Wix Proposals, is powered by the Prospero business proposal platform which streamlines the process of creating a professional-looking proposal and automates invoices and contracts.

There are three key features:

  1. Proposal Design
    Wix users can choose from ready-made templates or build their own.
  2. Payment Scheduler
    This makes it easy to offer multiple ways to pay, such as one-time payments or installments.
  3. Digital Signature Support

Helps Businesses Convert More Clients

Wix Proposals features are designed to help businesses be more successful by streamlining tasks related to winning more business and receiving payments.

According to the announcement:

“Wix Proposals offers business owners the tools needed to create, manage, and finalize proposals with ease,” said Asaf Remler, Director of Strategic Partnerships at Wix. “By empowering businesses to take control of their long-term financial engagements, and with a focus on user-friendly design and powerful features, we believe it will redefine the way professionals across various industries approach proposal creation and management. We’re enabling users to manage long-term financial engagements with several payment milestones, ultimately helping Wix to capture new GPV that was usually being processed offline.”

“At Prospero, we believe proposals are more than just documents – they’re the first handshake, the elevator pitch, the decisive turning point in a crucial business negotiation. Wix Proposals helps businesses to tell their stories with captivating proposals built for conversion,” said Tomer Aharon, Prospero Co-Founder and CEO. “Through the seamless integration of Wix’s innovative platform and Prospero’s industry-leading expertise, businesses in any industry can unlock a potent competitive advantage, ensuring their proposals not only stand out but win the deal. We are thrilled about this partnership and proud that our platform is included in Wix’s offering, empowering users with cutting-edge tools to elevate their proposal game.”

Read more on how to get started with Wix Proposals

Wix Proposals by Prospero: Adding and Setting up Proposals

Featured Image by Shutterstock/monticello

What To Know About Medium-Level WordPress Vulnerabilities via @sejournal, @martinibuster

The majority of WordPress vulnerabilities, about 67% of those discovered in 2023, are rated as medium severity. Because they’re the most common, it makes sense to understand what they are and when they represent an actual security threat. These are the facts about those kinds of vulnerabilities and what you should know about them.

What Is A Medium Level Vulnerability?

A spokesperson from WPScan, a WordPress security scanning company owned by Automattic, explained that they use the Common Vulnerability Scoring System (CVSS) to rate the severity of a threat. The scores run from 0 to 10 and map to qualitative ratings of low, medium, high, and critical.

The WPScan spokesperson explained:

“We don’t flag levels as the chance of happening, but the severity of the vulnerability based on FIRST’s CVSS framework. Speaking broadly, a medium-level severity score means either the vulnerability is hard to exploit (e.g., SQL Injection that requires a highly privileged account) or the attacker doesn’t gain much from a successful attack (e.g., an unauthenticated user can get the content of private blog posts).

We generally don’t see them being used as much in large-scale attacks because they are less useful than higher severity vulnerabilities and harder to automate. However, they could be useful in more targeted attacks, for example, when a privileged user account has already been compromised, or an attacker knows that some private content contains sensitive information that is useful to them.

We would always recommend upgrading vulnerable extensions as soon as possible. Still, if the severity is medium, then there is less urgency to do so, as the site is less likely to be the victim of a large-scale automated attack.

An untrained user may find the report a bit hard to digest. We did our best to make it as suitable as possible for all audiences, but I understand it’d be impossible to cover everyone without making it too boring or long. And the same can happen to the reported vulnerability. The user consuming the feed would need some basic knowledge of their website setup to consider which vulnerability needs immediate attention and which one can be handled by the WAF, for example.

If the user knows, for example, that their site doesn’t allow users to subscribe to it. All reports of subscriber+ vulnerabilities, independent of the severity level, can be reconsidered. Assuming that the user maintains a constant review of the site’s user base.

The same goes for contributor+ reports or even administrator levels. If the person maintains a small network of WordPress sites, the admin+ vulnerabilities are interesting for them since a compromised administrator of one of the sites can be used to attack the super admin.”
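The triage approach the WPScan spokesperson describes, deprioritizing reports that require a privilege level the site never grants to untrusted users, can be sketched roughly as follows. The feed fields (`required_role`, `severity`) are illustrative only and are not WPScan’s actual feed schema:

```python
# Hypothetical triage sketch: flag only the vulnerability reports whose
# required privilege level a site actually grants to untrusted users.

def needs_attention(report: dict, roles_granted: set) -> bool:
    """A report demands immediate attention if the exploit needs no
    account at all, or if the site grants accounts at the required role."""
    role = report["required_role"]
    return role == "unauthenticated" or role in roles_granted

feed = [
    {"id": 1, "required_role": "unauthenticated", "severity": "high"},
    {"id": 2, "required_role": "subscriber", "severity": "medium"},
    {"id": 3, "required_role": "contributor", "severity": "medium"},
]

# A site with no public registration grants no untrusted roles at all,
# so only the unauthenticated report remains urgent.
urgent = [r for r in feed if needs_attention(r, set())]
print([r["id"] for r in urgent])  # [1]
```

As the spokesperson notes, this kind of filtering only works if the site owner maintains a constant review of the site’s user base, since the set of granted roles can change.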

Contributor-Level Vulnerabilities

Many medium-severity vulnerabilities require contributor-level access. A contributor is a user role that gives a registered user the ability to write and submit content, although contributors generally don’t have the ability to publish it.

Most websites don’t have to worry about security threats that require contributor-level authentication because most sites don’t offer that level of access.

Chloe Chamberland, Threat Intelligence Lead at Wordfence, explained that most site owners shouldn’t worry about medium-severity vulnerabilities that require contributor-level access to exploit because most WordPress sites don’t offer that permission level. She also noted that these kinds of vulnerabilities are hard to scale because exploiting them is difficult to automate.

Chloe explained:

“For most site owners, vulnerabilities that require contributor-level access and above to exploit are something they do not need to worry about. This is because most sites do not allow contributor-level registration and most sites do not have contributors on their site.

In addition, most WordPress attacks are automated and are looking for easy to exploit high value returns so vulnerabilities like this are unlikely to be targeted by most WordPress threat actors.”

Website Publishers That Should Worry

Chloe also said that publishers who do offer contributor-level permissions may have several reasons to be concerned about these kinds of exploits:

“The concern with exploits that require contributor-level access to exploit arises when site owners allow contributor-level registration, have contributors with weak passwords, or the site has another plugin/theme installed with a vulnerability that allows contributor-level access in some way and the attacker really wants in on your website.

If an attacker can get their hands on one of these accounts, and a contributor-level vulnerability exists, then they may be provided with the opportunity to escalate their privileges and do real damage to the victim. Let’s take a contributor-level Cross-Site Scripting vulnerability for example.

Due to the nature of contributor-level access, an administrator would be highly likely to preview the post for review at which point any injected JavaScript would execute – this means the attacker would have a relatively high chance of success due to the admin previewing the post for publication.

As with any Cross-Site Scripting vulnerability, this can be leveraged to add a new administrative user account, inject backdoors, and essentially do anything a site administrator could do. If a serious attacker has access to a contributor-level account and no other trivial way to elevate their privileges, then they’d likely leverage that contributor-level Cross-Site Scripting to gain further access. As previously mentioned, you likely won’t see that level of sophistication targeting the vast majority of WordPress sites, so it’s really high value sites that need to be concerned with these issues.

In conclusion, while I don’t think a vast majority of site owners need to worry about contributor-level vulnerabilities, it’s still important to take them seriously if you allow user registration at that level on your site, you don’t enforce unique strong user passwords, and/or you have a high value WordPress website.”

Be Aware Of Vulnerabilities

While many medium-level vulnerabilities may not be something to worry about, it’s still a good idea to stay informed about them. Security scanners like the free version of WPScan can warn when a plugin or theme becomes vulnerable, giving you an early warning system for staying on top of vulnerabilities.

WordPress security plugins like Wordfence offer a proactive security stance that actively blocks automated hacking attacks and can be further tuned by advanced users to block specific bots and user agents. The free version of Wordfence offers significant protection in the form of a firewall and a malware scanner. The paid version offers protection for all vulnerabilities as soon as they’re discovered and before the vulnerability is patched. I use Wordfence on all of my websites and can’t imagine setting up a website without it.

Security is generally not regarded as an SEO issue, but it should be, because failure to secure a site can undo all the hard work done to make it rank well.

Featured Image by Shutterstock/Juan villa torres

OpenAI To Show Content & Links In Response To Queries via @sejournal, @martinibuster

An OpenAI content deal will enhance ChatGPT with the ability to show real-time content with links in response to queries. OpenAI quietly took a step toward search engine-style functionality as part of a content licensing deal that may have positive implications for publishers and SEO.

Content Licensing Deal

OpenAI agreed to a content licensing deal with the Financial Times, a global news organization with offices in London, New York, and across continental Europe and Asia.

Content licensing deals between AI organizations and publishers are generally about getting access to high quality training data. The training data is then used by language models to learn connections between words and concepts. This deal goes far beyond that use.

ChatGPT Will Show Direct Quotes With Attribution

What makes this content licensing deal between the Financial Times and OpenAI notable is the reference to giving attribution to content within ChatGPT.

The announced licensing deal explicitly mentions the use of the licensed content so that ChatGPT could directly quote it and provide links to the licensed content.

Further, the licensing deal is intended to help improve ChatGPT’s “usefulness”, which is vague and can mean many things, but it takes on a slightly different meaning when used in the context of attributed answers.

The Financial Times agreement states that the licensing deal is for use in ChatGPT when it provides “attributed content” which is content with an attribution, commonly a link to where the content appeared.

This is the part of the announcement that references attributed content:

“The Financial Times today announced a strategic partnership and licensing agreement with OpenAI, a leader in artificial intelligence research and deployment, to enhance ChatGPT with attributed content, help improve its models’ usefulness by incorporating FT journalism, and collaborate on developing new AI products and features for FT readers.”

And this is the part of the announcement that mentions ChatGPT offering users attributed quotes and links:

“Through the partnership, ChatGPT users will be able to see select attributed summaries, quotes and links to FT journalism in response to relevant queries.”

The Financial Times Group CEO was even more explicit about OpenAI’s intention to show content and links in ChatGPT:

“This is an important agreement in a number of respects,” said FT Group CEO John Ridding. “It recognises the value of our award-winning journalism and will give us early insights into how content is surfaced through AI. …this partnership will help keep us at the forefront of developments in how people access and use information.

OpenAI understands the importance of transparency, attribution, and compensation…”

Brad Lightcap, COO of OpenAI, directly referenced showing real-time news content in ChatGPT, but more importantly, he referenced OpenAI exploring new ways to show content to its user base.

Lastly, the COO stated that they embraced disruption, which means innovation that creates a new industry or paradigm, usually at the expense of an older one, like search engines.

Lightcap is quoted:

“We have always embraced new technologies and disruption, and we’ll continue to operate with both curiosity and vigilance as we navigate this next wave of change.”

Showing direct quotes of Financial Times content with links to that content is very similar to how search engines work. This is a big change to how ChatGPT works and could be a sign of where ChatGPT is going in the future, a functionality that incorporates online content with links to that content.

Something Else That Is Possibly Related

Someone on Twitter recently noticed a change that is related to “search” in relation to ChatGPT.

This change involves an SSL security certificate that was added for a subdomain of ChatGPT.com. ChatGPT.com is a domain name that was snapped up by someone to capitalize on the 2022 announcement of ChatGPT by OpenAI. OpenAI eventually acquired the domain and it’s been redirecting to ChatGPT.

The change that was noticed is to the subdomain: search.chatgpt.com.

This is a screenshot of the tweet:

Screenshot of SSL logs for search.chatgpt.com

Big News For SEO and Publishers

This is significant news for publishers and search marketers: ChatGPT will become a source of valuable traffic if OpenAI takes it in the direction of providing attributed summaries and direct quotes.

How Can Publishers Get Traffic From ChatGPT?

Questions remain about attributed quotes with links in response to relevant queries. Here are several unknowns about ChatGPT attributed links.

  • Does this mean that only licensed content will be shown and linked to in ChatGPT?
  • Will ChatGPT incorporate and use most web data without licensing deals in the same way that search engines do?
  • Will OpenAI offer an opt-in model where publishers can use a notation in robots.txt or in meta data to opt in to receiving traffic from ChatGPT?
  • Would you opt into receiving traffic from ChatGPT in exchange for allowing your content to be used for training?
  • How would SEOs’ and publishers’ calculus on ChatGPT change if their competitors are all receiving traffic from it?
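On the opt-in question above: no such robots.txt directive exists today, so the following is purely speculative. The `ChatGPT-Traffic` field name is invented to illustrate how a simple opt-in notation might be parsed if OpenAI ever introduced one (GPTBot, by contrast, is OpenAI’s real crawler user agent):

```python
# Speculative sketch only: the "ChatGPT-Traffic" directive is invented,
# not part of any real robots.txt specification.

def chatgpt_opted_in(robots_txt: str) -> bool:
    """Return True if the hypothetical opt-in directive is present."""
    for line in robots_txt.splitlines():
        key, _, value = line.partition(":")
        if key.strip().lower() == "chatgpt-traffic":
            return value.strip().lower() == "opt-in"
    return False

example = """User-agent: GPTBot
Allow: /
ChatGPT-Traffic: opt-in
"""
print(chatgpt_opted_in(example))  # True
```

If OpenAI went this route, the robots.txt model would mirror how publishers already control search engine crawling, which would make adoption straightforward.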

Read the original announcement:

Financial Times announces strategic partnership with OpenAI

Featured Image by Shutterstock/Photo For Everything