8 Out Of 10 TikTok Videos By Brands Fail To Capture Attention via @sejournal, @gregjarboe

Despite TikTok’s increasing importance to marketers, the majority of brands are not getting it right, and their videos on the social media platform are under-performing.

According to new research, 84% of the video content released by brands on TikTok fails to generate strong positive emotions, capture attention, or enhance brand recall.

The mini-study, conducted by DAIVID, which aids advertisers in assessing and enhancing the impact of their content at scale, also found that 24% of TikTok videos triggered strong negative emotions that are potentially damaging to brand reputations.

The study evaluated the effectiveness of video content shared on the platform using a variety of metrics, including the positive and negative emotions elicited by each video, the attention it generated, and the impact the content had on brand metrics such as brand recall.

The study’s findings include:

  • Only 16% of the branded TikTok videos scored higher than the average Creative Effectiveness Score (CES) of 5.8 out of 10 – a composite metric created by DAIVID that combines the three main drivers of effectiveness: attention, emotions, and memory.
  • 60% of branded TikTok videos were simply forgettable, with below-average positive emotional responses and below-average brand recall. They also ranked above the global average for causing confusion and boredom.
  • 24% of branded TikTok videos triggered strong, extreme, negative emotions like anxiety, fear, discomfort, disgust, and shame.
  • Overall, branded TikTok content was 9% less likely to generate intense positive emotions than the global average and attracted 2.5% less attention.

In a press release, Ian Forrester, the CEO and founder of DAIVID, said:

“This research reveals that the vast majority of content being released on TikTok simply isn’t up to scratch. Sixty per cent of the creative is simply forgettable, under-indexing for positive emotions and over-indexing for negative emotions such as confusion and boredom. In one ear and out of the other for the viewer.”

He added:

“Yet, even more concerning for brands are the 24% of videos that evoked intense, extreme, negative emotions such as disgust, anxiety and shame. If these emotions are attached to the brand, they’re likely to do the brand damage, negatively impacting future sales potential.

This should be a wake-up call to brands and underlines the importance of analyzing the effectiveness of your social creative beyond just the basic reach, impressions and engagement rate data provided by the platforms to understand the real impact that it’s having.”

What Marketers Can Do To Avoid Negative Reactions

I realize that many marketers working at big brands will ask: What should I do next?

For starters, read “39 Emotions Digital Marketers Can Use In Advertising.”

You’ll discover what the latest research conducted at the University of California, Berkeley; Stanford University; and the Ehrenberg-Bass Institute for Marketing Science at the University of South Australia has uncovered: “Generally, it’s advisable for … brands to leave us with a positive emotion.”

Next, read “TikTok Trends 2024: The Most Important Trends To Watch.” Among other things, you’ll learn that TikTok launched its Creative Center earlier this year.

This can help you find inspiration by giving you insights into what type of content is trending in your country, whether that’s hashtags, lip-syncing audio, popular creators, or specific video formats.

Finally, read the “10 Most Emotionally Engaging Olympics Ads (For Paris 2024 & For All Time).”

Yes, these video ads were uploaded to YouTube, not TikTok. But Procter & Gamble (P&G) dominates DAIVID’s all-time chart, holding five of the most emotionally engaging Olympics ads – including the top three tear-jerking positions.

So, you’ll want to figure out what they know that you don’t.

Spoiler alert: Brands should focus on creating more TikTok videos that generate intense positive emotions, including hope, admiration, amusement, and trust, as well as making fewer TikTok videos that evoke intense negative emotions, such as anger, disgust, fear, and anxiety.

We shouldn’t need new research to reinforce this important lesson. But, if 84% of TikTok videos by brands are underperforming and 24% generate a strong negative reaction, then I guess it’s time to ask some hard questions.

What SEO Pros Should Do To Seize This Opportunity

Some SEO professionals might mistakenly assume that they should “move along – nothing to see here” because they don’t optimize TikTok videos. But they should reconsider.

TikTok videos have been appearing in Google’s search results for a couple of years. If you need to verify this, then Google “most viewed TikTok videos in 24 hours.”

So, what should you do about this?

If you are a savvy SEO, then you’re already incorporating relevant keywords in your video title, caption, and hashtags, leveraging trending topics, engaging with your audience, and collaborating with other creators to increase visibility.

But if another group within your company or client organization is cranking out TikTok content without optimizing it first, then reach out and propose a “lunch and learn” session, where training can happen in an informal setting.

If you want to do a little homework to make suggestions on optimizing TikTok content, try reading the following.

For starters, read “Video SEO: 10 Steps to Optimizing Videos for Search and Discovery.” It focuses on video SEO best practices for YouTube.

But if you plan to optimize videos for TikTok.com, which gets 2.7 billion visits a month worldwide, then you should also optimize videos for YouTube.com, which gets 73.0 billion visits a month worldwide, according to Semrush.

Next, read “The Future of SEO Lies in the ‘Messy Middle’ of the Purchase Journey.”

Among other things, you’ll learn that people look for information about products and brands in a looping process across a huge array of online sources, including search engines, social video platforms, and review websites, in two mental modes: exploration and evaluation.

Finally, check out “Customer Personas Can Transform SEO, PPC, and Content Marketing,” which was published in March 2021.

It says:

“… decision-making is not a rational process, but one driven mainly by how people feel. The rational brain layers on reasons for our choices only after they’re made.” This explains why video should be a critical component of any future SEO strategy.

It’s Time To Pay Attention To Video SEO

Many SEO professionals have been busy preparing for the threat of a “searchquake” that was supposed to be triggered by Google’s Search Generative Experience (SGE), so they may have overlooked the opportunity of video SEO.

But TikTok does present an opportunity with huge potential for the brands that get it right.

It’s time for SEO professionals, as well as marketers, to pay attention to video marketing and to do their homework to understand why some brands are generating negative emotions – and how they can be the ones that get the positive reactions.

The data above comes from a study conducted by DAIVID, a global creative effectiveness platform.

Featured Image: Pheelings Media/Shutterstock

Google’s Guidance About The Recent Ranking Update via @sejournal, @martinibuster

Google’s Danny Sullivan explained the recent update, addressing site recoveries and cautioning against making radical changes to improve rankings. He also offered advice for publishers whose rankings didn’t improve after the last update.

Google’s Still Improving The Algorithm

Danny said that Google is still working on their ranking algorithm, indicating that more changes (for the positive) are likely on the way. The main idea he was getting across is that they’re still trying to fill the gaps in surfacing high-quality content from independent sites, which is good because big brand sites don’t necessarily have the best answers.

He wrote:

“…the work to connect people with “a range of high quality sites, including small or independent sites that are creating useful, original content” is not done with this latest update. We’re continuing to look at this area and how to improve further with future updates.”

A Message To Those Who Were Left Behind

There was a message to those publishers whose work failed to recover with the latest update, to let them know that Google is still working to surface more of the independent content and that there may be relief on the next go.

Danny advised:

“…if you’re feeling confused about what to do in terms of rankings…if you know you’re producing great content for your readers…If you know you’re producing it, keep doing that…it’s to us to keep working on our systems to better reward it.”

Google Cautions Against “Improving” Sites

Something really interesting he mentioned was a caution against trying to improve the rankings of something that’s already on page one in order to rank even higher. Tweaking a site to move from position six (or wherever) to something higher has always been risky, for many reasons I won’t elaborate on here. But Danny’s warning increases the pressure to not just think twice before trying to optimize a page for search engines, but to think three times and then some more.

Danny cautioned that sites that make it to the top of the SERPs should consider that a win and let it ride instead of making changes right now to improve their rankings. The reason for that caution is that the search results continue to change, and the implication is that changing a site now may negatively impact its rankings in a newly updated search index.

He wrote:

“If you’re showing in the top results for queries, that’s generally a sign that we really view your content well. Sometimes people then wonder how to move up a place or two. Rankings can and do change naturally over time. We recommend against making radical changes to try and move up a spot or two”

How Google Handled Feedback

There was also some light shed on what Google did with all the feedback it received from publishers who lost rankings. Danny wrote that the feedback and site examples he received were summarized and sent to the search engineers for review. They continue to use that feedback for the next round of improvements.

He explained:

“I went through it all, by hand, to ensure all the sites who submitted were indeed heard. You were, and you continue to be. …I summarized all that feedback, pulling out some of the compelling examples of where our systems could do a better job, especially in terms of rewarding open web creators. Our search engineers have reviewed it and continue to review it, along with other feedback we receive, to see how we can make search better for everyone, including creators.”

Feedback Itself Didn’t Lead To Recovery

Danny also pointed out that sites that recovered their rankings did not do so because they submitted feedback to Google. Danny wasn’t specific about this point, but it conforms with previous statements about Google’s algorithms: they implement fixes at scale. So instead of saying, “Hey, let’s fix the rankings of this one site,” it’s more about figuring out whether the problem is symptomatic of something widescale and how to change things for everybody with the same problem.

Danny wrote:

“No one who submitted, by the way, got some type of recovery in Search because they submitted. Our systems don’t work that way.”

That the feedback didn’t lead to recovery but was used as data shouldn’t be surprising. Even as far back as the 2004 Florida Update, Matt Cutts collected feedback from people, including myself, and I didn’t see a recovery for a false positive until everyone else also got their rankings back.

Takeaways

Google’s work on their algorithm is ongoing:
Google is continuing to tune its algorithms to improve its ability to rank high quality content, especially from smaller publishers. Danny Sullivan emphasized that this is an ongoing process.

What content creators should focus on:
Danny’s statement encouraged publishers to focus on consistently creating high quality content and not to focus on optimizing for algorithms. Focusing on quality should be the priority.

What should publishers do if their high-quality content isn’t yet rewarded with better rankings?
Publishers who are certain of the quality of their content are encouraged to hold steady and keep it coming because Google’s algorithms are still being refined.

Read the post on LinkedIn.

Featured Image by Shutterstock/Cast Of Thousands

Google Analytics Update: Plot Up To Five Metrics At Once via @sejournal, @MattGSouthern

Google has rolled out changes to Analytics, adding features to help you make more sense of your data.

The update brings several key improvements:

  • You can now compare up to five different metrics side by side.
  • A new tool automatically spots unusual trends in your data.
  • A more detailed report on transactions gives a closer look at revenue.
  • The acquisition reports now separate user and session data more clearly.
  • It’s easier to understand what each report does with new descriptions.

Here’s an overview of these new features, why they matter, and how they might help improve your data analysis and decision-making.

Plot Rows: Enhanced Data Visualization

The most prominent addition is the “Plot Rows” feature.

You can now visualize up to five rows of data simultaneously within your reports, allowing for quick comparisons and trend analysis.

This feature is accessible by selecting the desired rows and clicking the “Plot Rows” option.

Anomaly Detection: Spotting Unusual Patterns

Google Analytics has implemented an anomaly detection system to help you identify potential issues or opportunities.

This new tool automatically flags unusual data fluctuations, making it easier to spot unexpected traffic spikes, sudden drops, or other noteworthy trends.
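Google hasn’t shared the exact method behind these alerts. As a purely illustrative sketch of the general idea – not Google’s implementation – a simple z-score check over a recent metric series flags values that sit far outside the norm:

```python
import statistics

# Hypothetical daily session counts; the final value is an obvious spike.
daily_sessions = [1180, 1225, 1198, 1240, 1210, 1195, 1230, 2950]

baseline = daily_sessions[:-1]  # everything except the latest day
mean = statistics.mean(baseline)
stdev = statistics.stdev(baseline)

latest = daily_sessions[-1]
z_score = (latest - mean) / stdev

# Flag anything more than ~3 standard deviations from the recent average.
if abs(z_score) > 3:
    print(f"Anomaly: {latest} sessions (z = {z_score:.1f}) vs. a recent average of {mean:.0f}")
else:
    print("Latest value looks normal.")
```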

Improved Report Navigation & Understanding

Google Analytics has added hover-over descriptions for report titles.

These brief explanations provide context and include links to more detailed information about each report’s purpose and metrics.

Key Event Marking In Events Report

The Events report allows you to mark significant events for easy reference.

This feature, accessed through a three-dot menu at the end of each event row, helps you prioritize and track important data points.

New Transactions Report For Revenue Insights

For ecommerce businesses, the new Transactions report offers granular insights into revenue streams.

This feature provides information about each transaction, utilizing the transaction_id parameter to give you a comprehensive view of sales data.
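The transaction_id values surfaced in this report come from the purchase events a site sends to GA4. As a hedged sketch of how such an event might be sent via the GA4 Measurement Protocol (the measurement ID, API secret, and product details below are placeholders, and the requests library is assumed to be installed):

```python
import requests

# Placeholder credentials - replace with your own GA4 measurement ID and API secret.
MEASUREMENT_ID = "G-XXXXXXXXXX"
API_SECRET = "your_api_secret"

endpoint = (
    "https://www.google-analytics.com/mp/collect"
    f"?measurement_id={MEASUREMENT_ID}&api_secret={API_SECRET}"
)

payload = {
    "client_id": "555.1234567890",  # identifies the browser/device instance
    "events": [
        {
            "name": "purchase",
            "params": {
                # transaction_id is what ties this event to a row in the Transactions report.
                "transaction_id": "T_12345",
                "currency": "USD",
                "value": 59.99,
                "items": [
                    {"item_id": "SKU_001", "item_name": "Example Product", "price": 59.99, "quantity": 1}
                ],
            },
        }
    ],
}

response = requests.post(endpoint, json=payload, timeout=10)
print(response.status_code)  # a 2xx status means the hit was accepted
```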

Scope Changes In Acquisition Reports

Google has refined its acquisition reports to offer more targeted metrics.

The User Acquisition report now includes user-related metrics such as Total Users, New Users, and Returning Users.

Meanwhile, the Traffic Acquisition report focuses on session-related metrics like Sessions, Engaged Sessions, and Sessions per Event.

What To Do Next

As you explore these new features, keep in mind:

  • Familiarize yourself with the new Plot Rows function to make the most of comparative data analysis.
  • Pay attention to the anomaly detection alerts, but always investigate the context behind flagged data points.
  • Take advantage of the more detailed Transactions report to understand your revenue patterns better.
  • Experiment with the refined acquisition reports to see which metrics are most valuable for your needs.

As with any new tool, there will likely be a learning curve as you incorporate these features into your workflow.


FAQ

What is the “Plot Rows” feature in Google Analytics?

The “Plot Rows” feature allows you to visualize up to five rows of data at the same time. This makes it easier to compare different metrics side by side within your reports, facilitating quick comparisons and trend analysis. To use this feature, select the desired rows and click the “Plot Rows” option.

How does the new anomaly detection system work in Google Analytics?

Google Analytics’ new anomaly detection system automatically flags unusual data patterns. This tool helps identify potential issues or opportunities by spotting unexpected traffic spikes, sudden drops, or other notable trends, making it easier for users to focus on significant data fluctuations.

What improvements have been made to the Transactions report in Google Analytics?

The enhanced Transactions report provides detailed insights into revenue for ecommerce businesses. It utilizes the transaction_id parameter to offer granular information about each transaction, helping businesses get a better understanding of their revenue streams.


Featured Image: Vladimka production/Shutterstock

Mediavine Bans Publisher For Overuse Of AI-Generated Content via @sejournal, @MattGSouthern

According to details surfacing online, ad management firm Mediavine is terminating publishers’ accounts for overusing AI.

Mediavine is a leading ad management company providing products and services to help website publishers monetize their content.

The company holds elite status as a Google Certified Publishing Partner, which indicates that it meets Google’s highest standards and requirements for ad networks and exchanges.

AI Content Triggers Account Terminations

The terminations came to light in a post on the Reddit forum r/Blogging, where a user shared an email they received from Mediavine citing “overuse of artificially created content.”

Trista Jensen, Mediavine’s Director of Ad Operations & Market Quality, states in the email:

“Our third party content quality tools have flagged your sites for overuse of artificially created content. Further internal investigation has confirmed those findings.”

Jensen stated that due to the overuse of AI content, “our top partners will stop spending on your sites, which will negatively affect future monetization efforts.”

Consequently, Mediavine terminated the publisher’s account “effective immediately.”

The Risks Of Low-Quality AI Content

This strict enforcement aligns with Mediavine’s publicly stated policy prohibiting websites from using “low-quality, mass-produced, unedited or undisclosed AI content that is scraped from other websites.”

In a March 7 blog post titled “AI and Our Commitment to a Creator-First Future,” the company declared opposition to low-value AI content that could “devalue the contributions of legitimate content creators.”

Mediavine warned in the post:

“Without publishers, there is no open web. There is no content to train the models that power AI. There is no internet.”

The company says it’s using its platform to “advocate for publishers” and uphold quality standards in the face of AI’s disruptive potential.

Mediavine states:

“We’re also developing faster, automated tools to help us identify low-quality, mass-produced AI content across the web.”

Targeting ‘AI Clickbait Kingpin’ Tactics

While the Reddit user’s identity wasn’t disclosed, the incident has drawn connections to the tactics of Nebojša Vujinović Vujo, who was dubbed an “AI Clickbait Kingpin” in a recent Wired exposé.

According to Wired, Vujo acquired over 2,000 dormant domains and populated them with AI-generated, search-optimized content designed purely to capture ad revenue.

His strategies represent the low-quality, artificial content Mediavine has vowed to prohibit.

Potential Implications

Lost Revenue

Mediavine’s terminations highlight potential implications for publishers that rely on artificial intelligence to generate website content at scale.

Perhaps the most immediate and tangible implication is the risk of losing ad revenue.

For publishers that depend heavily on programmatic advertising or sponsored content deals as key revenue drivers, being blocked from major ad networks could devastate their business models.

Devalued Domains

Another potential impact is the devaluation of domains and websites built primarily on AI-generated content.

If this pattern of AI content overuse triggers account terminations from companies like Mediavine, it could drastically diminish the value proposition of scooping up these domains.

Damaged Reputations & Brands

Beyond the lost monetization opportunities, publishers leaning too heavily into automated AI content also risk permanent reputational damage to their brands.

Once an authority such as Mediavine flags a website for AI overuse, it could impact how that site is perceived by readers, other industry partners, and search engines.

In Summary

AI has value as an assistive tool for publishers, but relying heavily on automated content creation poses significant risks.

These include monetization challenges, potential reputation damage, and increasing regulatory scrutiny. Mediavine’s strict policy illustrates the possible consequences for publishers.

It’s important to note that Mediavine’s move to terminate publisher accounts over AI content overuse represents an independent policy stance taken by the ad management firm itself.

The action doesn’t directly reflect the content policies or enforcement positions of Google, whose publishing partner program Mediavine is certified under.

We have reached out to Mediavine requesting a comment on this story. We’ll update this article with more information when it’s provided.


Featured Image: Simple Line/Shutterstock

Google Users Warned Of Surging Malvertising Campaigns via @sejournal, @MattGSouthern

Cybersecurity researchers are warning about a troubling rise in “malvertising” – the use of online ads to deploy malware, phishing scams, and other attacks.

A report from Malwarebytes found that malvertising incidents in the U.S. surged 42% last fall.

The prime target? Unsuspecting users conducting searches on Google.

Jérôme Segura, senior director of research at Malwarebytes, warns:

“What I’m seeing is just the tip of the iceberg. Hackers are getting smarter and the ads are often so realistic that it’s easy to be duped.”

Poisoned Paid Promotions

The schemes frequently involve cybercriminals purchasing legitimate-looking sponsored ad listings that appear at the top of Google search results.

Clicking these can lead to drive-by malware downloads or credential phishing pages spoofing major brands like Lowe’s and Slack.

Segura explained of one recent Lowe’s employee portal phishing attack:

“You see the brand, even the official logo, and for you it’s enough to think it’s real.”

Undermining User Trust

Part of what makes these malvertising attacks so dangerous is that they hijack and undermine user trust in Google as an authoritative search source.

Stuart Madnick, an information technology professor at MIT, notes:

“You see something appearing on a Google search, you kind of assume it is something valid.”

The threats don’t end with poisoned promotions, either. Malicious ads can also sneak through on trusted websites.

Protecting Against Malvertising: For Users

Experts advise several precautions to reduce malvertising risk, including:

  • Carefully vetting search ads before taking any action
  • Keeping device operating systems and browsers updated
  • Using ad-blocking browser extensions
  • Reporting suspicious ads to Google for investigation

Madnick cautioned:

“You should assume that this could happen to you no matter how careful you are.”

Staying vigilant against malvertising exploits will become more critical as cyber attackers evolve their deceptive tactics.

Protecting Against Malvertising: For Websites

While individual users must stay vigilant, websites are also responsible for implementing safeguards to prevent malicious ads from being displayed on their platforms.

Some best practices include:

Ad Verification Services

Many websites rely on third-party ad verification services and malware scanning tools to monitor the ads being served and block those identified as malicious before reaching end users.

Whitelisting Ad Sources

Rather than accepting ads through open real-time bidding advertising exchanges, websites can whitelist only thoroughly vetted and trusted ad networks and sources.
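As a simplified, hypothetical sketch of that allowlisting idea (the domains below are invented placeholders, not recommendations), the server-side logic amounts to refusing any ad whose source isn’t on a vetted list:

```python
# Invented placeholder domains for illustration only.
TRUSTED_AD_SOURCES = {
    "ads.vetted-network-a.example",
    "ads.vetted-network-b.example",
}

def is_allowed_ad_source(ad_domain: str) -> bool:
    """Return True only if the ad is served from a vetted, whitelisted source."""
    return ad_domain.lower() in TRUSTED_AD_SOURCES

incoming_ads = ["ads.vetted-network-a.example", "cheap-ads.unknown.example"]
for domain in incoming_ads:
    action = "serve" if is_allowed_ad_source(domain) else "block"
    print(f"{domain}: {action}")
```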

Review Process

For an added layer of protection, websites can implement a human review process on top of automated malware scanning to manually analyze ads before serving them to visitors.

Continuous Monitoring

Malvertisers constantly update their techniques, so websites must monitor their ad traffic data for anomalies or suspicious patterns that could indicate a malicious campaign.

By implementing multi-layered ad security measures, websites can avoid unknowingly participating in malvertising schemes that put their visitors at risk while protecting their brand reputation.


Featured Image: Bits And Splits/Shutterstock

Why Google Indexes Blocked Web Pages via @sejournal, @martinibuster

Google’s John Mueller answered a question about why Google indexes pages that are disallowed from crawling by robots.txt, and why it’s safe to ignore the related Search Console reports about those crawls.

Bot Traffic To Query Parameter URLs

The person asking the question documented that bots were creating links to non-existent query parameter URLs (?q=xyz) pointing to pages that carry noindex meta tags and are also blocked in robots.txt. What prompted the question is that Google is crawling the links to those pages, getting blocked by robots.txt (without seeing the noindex robots meta tag), and then reporting them in Google Search Console as “Indexed, though blocked by robots.txt.”

The person asked the following question:

“But here’s the big question: why would Google index pages when they can’t even see the content? What’s the advantage in that?”

Google’s John Mueller confirmed that if Google can’t crawl a page, it can’t see the noindex meta tag. He also made an interesting mention of the site: search operator, advising to ignore its results because “average” users won’t see them.

He wrote:

“Yes, you’re correct: if we can’t crawl the page, we can’t see the noindex. That said, if we can’t crawl the pages, then there’s not a lot for us to index. So while you might see some of those pages with a targeted site:-query, the average user won’t see them, so I wouldn’t fuss over it. Noindex is also fine (without robots.txt disallow), it just means the URLs will end up being crawled (and end up in the Search Console report for crawled/not indexed — neither of these statuses cause issues to the rest of the site). The important part is that you don’t make them crawlable + indexable.”
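A quick way to see the mechanics Mueller describes is Python’s standard urllib.robotparser: once a path is disallowed, a compliant crawler never fetches the page, so it never gets the chance to read a noindex meta tag on it. The robots.txt rules and URL below are hypothetical:

```python
from urllib import robotparser

# Hypothetical robots.txt rules for an example site.
robots_lines = [
    "User-agent: *",
    "Disallow: /private/",
]

rp = robotparser.RobotFileParser()
rp.parse(robots_lines)

blocked_url = "https://www.example.com/private/page?q=xyz"

# Prints False: a compliant crawler is not allowed to fetch this URL, so it can
# never see a <meta name="robots" content="noindex"> tag placed on the page itself.
print(rp.can_fetch("Googlebot", blocked_url))
```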

Takeaways:

1. Mueller’s answer confirms the limitations of using the site: advanced search operator for diagnostic purposes. One of those reasons is that it’s not connected to the regular search index; it’s a separate thing altogether.

Google’s John Mueller commented on the site search operator in 2021:

“The short answer is that a site: query is not meant to be complete, nor used for diagnostics purposes.

A site query is a specific kind of search that limits the results to a certain website. It’s basically just the word site, a colon, and then the website’s domain.

This query limits the results to a specific website. It’s not meant to be a comprehensive collection of all the pages from that website.”

2. A noindex tag without a robots.txt disallow is fine for these kinds of situations, where a bot is linking to non-existent pages that are getting discovered by Googlebot.

3. URLs with the noindex tag will generate a “crawled/not indexed” entry in Search Console, and those entries won’t have a negative effect on the rest of the website.

Read the question and answer on LinkedIn:

Why would Google index pages when they can’t even see the content?

Featured Image by Shutterstock/Krakenimages.com

Google May Unify Schema Markup & Merchant Center Feed Data via @sejournal, @MattGSouthern

Google revealed it’s working to bridge the gap between two key product data sources that power its shopping results – website markup using schema.org structured data and product feeds submitted via Google Merchant Center.

The initiative, mentioned during a recent “Search Off The Record” podcast episode, aims to achieve one-to-one parity between the product attributes supported by schema.org’s open-source standards and Google’s merchant feed specifications.

Leveraging Dual Product Data Pipelines

In search results, Google leverages structured data markup, and Merchant Center product feeds to surface rich product listings.

Irina Tuduce, a longtime Google employee involved with the company’s shopping search infrastructure, says merchants should utilize both options.

Tuduce stated:

“We recommend doing both. Because, as I said, in signing up on the Merchant Center UI, you make sure some of your inventory, the one that you specify, will be in the Shopping results. And you can make sure you’ll be on dotcom on the Shopping tab and Image tab.

And then, if you specify how often you want us to refresh your data, then you can be sure that that information will be refreshed. Otherwise, yeah, you don’t know when we will have the resources to recrawl you and update that information.”

Meanwhile, implementing schema.org markup allows Google to extract product details from websites during the crawling process.

Reconciling Markup and Feed Discrepancies

However, discrepancies can arise when the product information in a merchant’s schema.org markup doesn’t perfectly align with the details provided via their Merchant Center feed uploads.

Tuduce explained:

“If you don’t have the schema.org markup on your page, we’ll probably stick to the inventory that you specify in your feed specification.”

Google’s initiative aims to resolve such discrepancies.

Simplifying Merchant Product Data Management

Unifying the product attributes across both sources aims to simplify data management and ensure consistent product listings across Google.

Regarding the current inconsistencies between schema.org markup and merchant feed specifications, Tuduce says:

“The attributes overlap to a big extent, but there are still gaps that exist. We will want to address those gaps.”

As the effort progresses, Google plans to keep marketers informed by leveraging schema.org’s active GitHub community and opening the update process to public feedback.

The unified product data model could keep product details like pricing, availability, and variant information consistently updated and accurately reflected across Google’s search results.

Why This Matters

For merchants, consistent product listings with accurate, up-to-date details can boost visibility in Google’s shopping experiences. Streamlined data processes also mean less redundant work.

For consumers, a harmonized system translates to more relevant, trustworthy shopping journeys.

What You Can Do Now

  • Audit current product data across website markup and merchant feeds for inconsistencies.
  • Prepare to consolidate product data workflows as Google’s unified model rolls out.
  • Implement richer product schema markup using expanded vocabulary.
  • Monitor metrics like impressions/clicks as consistent data surfaces.
  • Prioritize product data hygiene and frequent catalog updates.

By aligning your practices with Google’s future plans, you can capitalize on new opportunities for streamlined product data management and enhanced shopping search visibility.
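For reference, the richer product markup mentioned in the checklist above is typically emitted as schema.org JSON-LD. Here is a minimal sketch in Python (the product details are invented for illustration):

```python
import json

# Invented product details for illustration only.
product_jsonld = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Trail Running Shoe",
    "sku": "SKU_001",
    "brand": {"@type": "Brand", "name": "ExampleBrand"},
    "offers": {
        "@type": "Offer",
        "url": "https://www.example.com/products/trail-running-shoe",
        "price": "89.99",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock",
    },
}

# Embed the output in the product page inside a
# <script type="application/ld+json"> ... </script> tag.
print(json.dumps(product_jsonld, indent=2))
```

Keeping attributes like price and availability consistent with what is submitted in the Merchant Center feed is exactly the kind of parity Google says it wants to make easier.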

Hear the full discussion on the “Search Off The Record” podcast episode, starting around the 12-minute mark.

New LiteSpeed Cache Vulnerability Puts 6 Million Sites at Risk via @sejournal, @martinibuster

Another vulnerability was discovered in the LiteSpeed Cache WordPress plugin—an Unauthenticated Privilege Escalation that could lead to a total site takeover. Unfortunately, updating to the latest version of the plugin may not be enough to resolve the issue.

LiteSpeed Cache Plugin

The LiteSpeed Cache Plugin is a website performance optimization plugin that has over 6 million installations. A cache plugin stores a static copy of the data used to create a web page so that the server doesn’t have to repeatedly fetch the exact same page elements from the database every time a browser requests a web page.

Storing the page in a “cache” reduces the server load and speeds up the time it takes to deliver a web page to a browser or a crawler.

LiteSpeed Cache also performs other page speed optimizations, like compressing CSS and JavaScript files (minifying), inlining the most important CSS for rendering a page directly in the HTML code (inlined CSS), and other tweaks that together make a site faster.
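As a rough illustration of the general caching idea (a deliberately simplified sketch, not LiteSpeed Cache’s actual implementation), a page cache boils down to keying rendered HTML by URL so repeat requests skip the expensive database work:

```python
import time

# Simplified in-memory page cache: URL -> (timestamp, rendered HTML).
page_cache: dict[str, tuple[float, str]] = {}
CACHE_TTL_SECONDS = 300  # how long a stored copy stays fresh

def render_page_from_database(url: str) -> str:
    """Placeholder for the expensive work a CMS does on every uncached request."""
    return f"<html><body>Content for {url}</body></html>"

def get_page(url: str) -> str:
    """Serve a cached copy when available; otherwise render once and store it."""
    cached = page_cache.get(url)
    if cached and time.time() - cached[0] < CACHE_TTL_SECONDS:
        return cached[1]  # cache hit: no database work needed
    html = render_page_from_database(url)  # cache miss: render and store
    page_cache[url] = (time.time(), html)
    return html

print(get_page("/blog/hello-world"))  # rendered and cached
print(get_page("/blog/hello-world"))  # served from the cache
```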

Unauthenticated Privilege Escalation

An unauthenticated privilege escalation is a type of vulnerability that allows a hacker to attain site access privileges without having to sign in as a user. This makes it easier to hack a site in comparison to an authenticated vulnerability that requires a hacker to first attain a certain privilege level before being able to execute the attack.

Unauthenticated privilege escalation typically occurs because of a flaw in a plugin (or theme) and in this case it’s a data leak.

Patchstack, the security company that discovered the vulnerability, writes that the vulnerability can only be exploited if either of two conditions is met:

“Active debug log feature on the LiteSpeed Cache plugin.

Has activated the debug log feature once before (not currently active now) and the /wp-content/debug.log file is not purged or removed.”

Discovered By Patchstack

The vulnerability was discovered by researchers at Patchstack, a WordPress security company that offers a free vulnerability warning service and advanced protection for as little as $5/month.

Oliver Sild, founder of Patchstack, explained to Search Engine Journal how this vulnerability was discovered and warned that updating the plugin is not enough – users still need to manually purge their debug logs.

He shared these specifics about the vulnerability:

“It was found by our internal researcher after we processed the vulnerability from a few weeks ago.

Important thing to keep in mind with this new vulnerability is that even when it gets patched, the users still need to purge their debug logs manually. It’s also a good reminder not to keep debug mode enabled in production.”

Recommended Course of Action

Patchstack recommends that users of LiteSpeed Cache WordPress plugin update to at least version 6.5.0.1.
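Alongside updating, it may also be worth confirming that an old debug log isn’t still publicly reachable. Below is a minimal check, assuming your site lives at example.com (the URL is a placeholder; swap in your own domain):

```python
import urllib.error
import urllib.request

# Hypothetical URL - replace with your own domain.
url = "https://www.example.com/wp-content/debug.log"

request = urllib.request.Request(url, method="HEAD")
try:
    with urllib.request.urlopen(request, timeout=10) as response:
        # A 200 response means the debug log is publicly reachable and should be purged.
        print(f"{url} returned {response.status}: purge or delete this file.")
except urllib.error.HTTPError as error:
    # A 403 or 404 here generally means the log is not exposed.
    print(f"{url} returned {error.code}: not publicly accessible.")
except urllib.error.URLError as error:
    print(f"Could not reach {url}: {error.reason}")
```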

Read the advisory at Patchstack:

Critical Account Takeover Vulnerability Patched in LiteSpeed Cache Plugin

Featured Image by Shutterstock/Teguh Mujiono

SearchGPT vs. Google: Early Analysis & User Feedback via @sejournal, @MattGSouthern

OpenAI, the company behind ChatGPT, has introduced a prototype of SearchGPT, an AI-powered search engine.

The launch has sparked considerable interest, leading to discussions about its potential to compete with Google.

However, early studies and user feedback indicate that while SearchGPT shows promise, it has limitations and needs more refinement.

Experts suggest it needs further development before challenging current market leaders.

Study Highlights SearchGPT’s Strengths and Weaknesses

SE Ranking, an SEO software company, conducted an in-depth analysis of SearchGPT’s performance and compared it to Google and Bing.

The study found that SearchGPT’s search results are 73% similar to Bing’s but only 46% similar to Google’s.

Interestingly, 26% of domains ranking in SearchGPT receive no traffic from Google, indicating opportunities for websites struggling to gain traction.

The study highlighted some of SearchGPT’s key features, including:

  • The ability to summarize information from multiple sources.
  • A conversational interface for refining searches.
  • An ad-free user experience.

However, the research noted that SearchGPT lacks the variety and depth of Google’s search results, especially for navigational, transactional, and local searches.

The study also suggested that SearchGPT favors authoritative, well-established websites, with backlinks being a significant ranking factor.

Around 32% of all SearchGPT results came from media sources, increasing to over 75% for media-related queries.

SE Ranking notes that SearchGPT needs improvement in providing the latest news, as some news results were outdated.

User Experiences & Limitations Reported By The Washington Post

The Washington Post interviewed several early testers of SearchGPT and reported mixed reviews.

Some users praised the tool’s summarization capabilities and found it more helpful than Google’s AI-generated answers for certain queries.

Others, however, found SearchGPT’s interface and results less impressive than those of smaller competitors like Perplexity.

The article also highlighted instances where SearchGPT provided incorrect or “hallucinated” information, a problem that has plagued other AI chatbots.

While the SE Ranking study estimated that less than 1% of searches returned inaccurate results, The Washington Post says there’s significant room for improvement.

The article also highlighted Google’s advantage in handling shopping and local queries due to its access to specialized data, which can be expensive to acquire.

Looking Ahead: OpenAI’s Plans For SearchGPT and Potential Impact on the Market

OpenAI spokesperson Kayla Wood revealed that the company plans to integrate SearchGPT’s best features into ChatGPT, potentially enhancing the popular language model’s capabilities.

When asked about the possibility of including ads in SearchGPT, Wood stated that OpenAI’s business model is based on subscriptions but didn’t specify whether SearchGPT would be offered for free or as part of a ChatGPT subscription.

Despite the excitement surrounding SearchGPT, Google CEO Sundar Pichai recently reported continued growth in the company’s search revenue, suggesting that Google may maintain its dominant position even with the emergence of new AI-powered search tools.

Top Takeaways

Despite its current limitations, SearchGPT has the potential to shake up online information seeking. As OpenAI iterates based on user feedback, its impact may grow significantly.

Integrating SearchGPT’s best features into ChatGPT could create a more powerful info-seeking tool. The proposed subscription model raises questions about competition with free search engines and user adoption.

While Google’s search revenue and specialized query handling remain strong, SearchGPT could carve out its own niche. The two might coexist, serving different user needs.

For SearchGPT to truly compete, OpenAI must address accuracy issues, expand query capabilities, and continuously improve based on user input. It could become a viable alternative to traditional search engines with ongoing development.


Featured Image: Robert Way/Shutterstock

Google Confirms It’s Okay To Ignore Spam Scores via @sejournal, @martinibuster

Google’s John Mueller answered a Reddit question about how to lower a website’s spam score. His answer reflected an important insight about third-party spam scores and their relation to how Google ranks websites.

What’s A Spam Score?

A spam score is the opinion of a third-party tool that reviews data like inbound links and on-page factors based on whatever the tool’s developers believe are spam-related factors and signals. While there are a few things about SEO that most people can agree on, there is a lot more about SEO that digital marketers dispute.

The reality is that third-party tools use undisclosed factors to assign a spam score that guesses at how a search engine might use its own unknown metrics to assess website quality. That’s multiple layers of uncertainty to trust.

Should You Worry About Spam Scores?

The question asked in Reddit was about whether they should be worrying about a third-party spam score and what can be done to achieve a better score.

This is the question:

“My site is less than 6 months old with less than 60 blog posts.

I was checking with some tool it says I have 302 links and 52 referring domains. My worry is on the spam score.

How should I go about reducing the score or how much is the bad spam score?”

Google’s John Mueller answered:

“I wouldn’t worry about that spam score.

The real troubles in your life are apt to be things that never crossed your worried mind, the kind that blindside you at 4 p.m. on some idle Tuesday.”

He then followed up with a more detailed response:

“And to be more direct – Google doesn’t use these spam scores. You can do what you want with them. They’re not going to change anything for your site.

I’d recommend taking the time and instead making a tiny part of your website truly awesome, and then working out what it would take the make the rest of your website like that. This spam score tells you nothing in that regard. Ignore it.”

Spam Scores Tell You Nothing In That Regard

John Mueller is right: third-party spam scores don’t reflect site quality. They’re only opinions based on what the developers of a tool believe, which could be outdated or insufficient; we just don’t know, because the factors used to calculate third-party spam scores are secret.

In any case, there is no agreement about what the ranking factors are, no agreement on what the on-page and off-page factors are, and even the idea of “ranking factors” is somewhat debatable, because nowadays Google uses various signals to determine whether a site is trustworthy and relies on core topicality systems to understand search queries and web pages. That’s a world away from using ranking factors to score web pages. Can we even agree on whether there’s a difference between ranking factors and signals? Where does something like a (missing) quality signal even fit in a third-party spam metric?

Popular lists of 200 ranking factors often contain factual errors and outdated ideas based on decades-old concepts of how search engines rank websites. We’re in a period of time when search engines are somewhat moving past the concepts of “ranking factors” in favor of core topicality systems for understanding web pages (and search queries) and an AI system called SpamBrain that weeds out low-quality websites.

So yes, Mueller makes a valid point when he advises not to worry about spam scores.

Read the discussion on Reddit:

Is site spam score of 1% bad?

Featured Image by Shutterstock/Krakenimages.com