New Anthropic Claude Team Plan Versus ChatGPT Team via @sejournal, @martinibuster

Anthropic announced a new Team plan that gives businesses access to more powerful features, along with the management and security controls that make it a strong option for companies to consider.

Generative AI For Teams

Modern SaaS technologies for businesses generally come with a team version that allows collaboration within a company and also gives security and control to management so that proprietary documents don’t accidentally leak to the public.

OpenAI launched its ChatGPT Team plan in January 2024, offering a secured workspace for users within a company at a reasonable monthly and yearly subscription price. Anthropic has now launched its own Team version of Claude with features that exceed what’s offered by ChatGPT Team.

How Claude Team Compares To ChatGPT Team

Claude is known as a great model for creative purposes, and a Team version that’s more powerful than the regular paid version makes it even more attractive. The important question is: how does it compare to ChatGPT Team?

ChatGPT Team is $5/month per user cheaper on the yearly billing plan ($25 versus Claude’s $30). On monthly billing, both plans are priced at $30/month per user.

Claude Team is a step up from ChatGPT Team in one important way: the context window. A context window is a measure of how much data a model can process at one time. The larger the context window, the more data the model can analyze in one batch.

ChatGPT Team offers a context window of 32k tokens, but Anthropic Claude Team users enjoy a whopping 200k context window, which is about 150,000 words or 500 pages of text.
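For a rough sense of where those figures come from, here is a back-of-the-envelope sketch using the common heuristics of about 0.75 words per token and roughly 300 words per page (both are assumptions, not Anthropic’s official conversion; actual tokenization varies by model):

```python
def tokens_to_words(tokens: int, words_per_token: float = 0.75) -> int:
    """Rough conversion; real token counts depend on the model's tokenizer."""
    return int(tokens * words_per_token)

def tokens_to_pages(tokens: int, words_per_page: int = 300) -> int:
    """Rough page estimate derived from the word estimate."""
    return tokens_to_words(tokens) // words_per_page

print(tokens_to_words(200_000))  # 150000 words
print(tokens_to_pages(200_000))  # 500 pages
```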

Team Versus Pro Version

When it comes to collaborating within a company, Claude Team is a better value than the regular Pro version because it provides more usage, which means individual users can do more work than they could on the Pro plan. The Team version also offers collaborative features, like the ability to create a shared database for use in projects, as well as administrative tools for managing users and billing.

More features are on the way according to the official announcement:

“In the coming weeks, we will be releasing additional collaboration features, including citations from reliable sources to verify AI-generated claims, integrations with data repositories like codebases or CRMs, and iterating with colleagues on AI-generated documents or projects—all while maintaining the highest standards of security and safety.”

Read the announcement here:

Introducing The Claude Team Plan

Wix Proposals Tool Helps Businesses Win More Clients via @sejournal, @martinibuster

Wix announced a new feature that allows businesses to create business proposals and manage payments, streamlining the process of converting prospects into clients and getting paid.

Proposals Powered By Prospero

The new feature, called Wix Proposals, is powered by the Prospero business proposal platform which streamlines the process of creating a professional-looking proposal and automates invoices and contracts.

There are three key features:

  1. Proposal Design
    Wix users can choose from ready-made templates or build their own.
  2. Payment Scheduler
    This makes it easy to offer multiple ways to pay, such as one-time or installment payments.
  3. Digital Signature Support

Helps Businesses Convert More Clients

Wix Proposals features are designed to help businesses be more successful by streamlining tasks related to winning more business and receiving payments.

According to the announcement:

“Wix Proposals offers business owners the tools needed to create, manage, and finalize proposals with ease,” said Asaf Remler, Director of Strategic Partnerships at Wix. “By empowering businesses to take control of their long-term financial engagements, and with a focus on user-friendly design and powerful features, we believe it will redefine the way professionals across various industries approach proposal creation and management. We’re enabling users to manage long-term financial engagements with several payments milestones, ultimately helping Wix to capture new GPV that was usually being processed offline.”

“At Prospero, we believe proposals are more than just documents – they’re the first handshake, the elevator pitch, the decisive turning point in a crucial business negotiation. Wix Proposals helps businesses to tell their stories with captivating proposals built for conversion,” said Tomer Aharon, Prospero Co-Founder and CEO. “Through the seamless integration of Wix’s innovative platform and Prospero’s industry-leading expertise, businesses in any industry can unlock a potent competitive advantage, ensuring their proposals not only stand out but win the deal. We are thrilled about this partnership and proud that our platform is included in Wix’s offering, empowering users with cutting-edge tools to elevate their proposal game.”

Read more on how to get started with Wix Proposals:

Wix Proposals by Prospero: Adding and Setting up Proposals

Featured Image by Shutterstock/monticello

What To Know About Medium-Level WordPress Vulnerabilities via @sejournal, @martinibuster

The majority of WordPress vulnerabilities, about 67% of those discovered in 2023, are rated as medium severity. Because they’re the most common, it makes sense to understand what they are and when they represent an actual security threat. Here are the facts about these kinds of vulnerabilities and what you should know about them.

What Is A Medium Level Vulnerability?

A spokesperson from WPScan, a WordPress security scanning company owned by Automattic, explained that they use the Common Vulnerability Scoring System (CVSS) to rate the severity of a threat. The scores range from 0 to 10, with qualitative ratings of low, medium, high, and critical.
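For reference, FIRST’s CVSS v3.x specification maps base scores to those qualitative ratings in fixed bands. A minimal sketch of that mapping:

```python
def cvss_severity(score: float) -> str:
    """Map a CVSS v3.x base score to its qualitative rating per the FIRST spec."""
    if not 0.0 <= score <= 10.0:
        raise ValueError("CVSS base scores range from 0.0 to 10.0")
    if score == 0.0:
        return "none"
    if score <= 3.9:
        return "low"
    if score <= 6.9:
        return "medium"
    if score <= 8.9:
        return "high"
    return "critical"

print(cvss_severity(5.4))  # "medium" -- the band most medium-rated WordPress CVEs fall into
```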

The WPScan spokesperson explained:

“We don’t flag levels as the chance of happening, but the severity of the vulnerability based on FIRST’s CVSS framework. Speaking broadly, a medium-level severity score means either the vulnerability is hard to exploit (e.g., SQL Injection that requires a highly privileged account) or the attacker doesn’t gain much from a successful attack (e.g., an unauthenticated user can get the content of private blog posts).

We generally don’t see them being used as much in large-scale attacks because they are less useful than higher severity vulnerabilities and harder to automate. However, they could be useful in more targeted attacks, for example, when a privileged user account has already been compromised, or an attacker knows that some private content contains sensitive information that is useful to them.

We would always recommend upgrading vulnerable extensions as soon as possible. Still, if the severity is medium, then there is less urgency to do so, as the site is less likely to be the victim of a large-scale automated attack.

An untrained user may find the report a bit hard to digest. We did our best to make it as suitable as possible for all audiences, but I understand it’d be impossible to cover everyone without making it too boring or long. And the same can happen to the reported vulnerability. The user consuming the feed would need some basic knowledge of their website setup to consider which vulnerability needs immediate attention and which one can be handled by the WAF, for example.

If the user knows, for example, that their site doesn’t allow users to subscribe to it. All reports of subscriber+ vulnerabilities, independent of the severity level, can be reconsidered. Assuming that the user maintains a constant review of the site’s user base.

The same goes for contributor+ reports or even administrator levels. If the person maintains a small network of WordPress sites, the admin+ vulnerabilities are interesting for them since a compromised administrator of one of the sites can be used to attack the super admin.”

Contributor-Level Vulnerabilities

Many medium-severity vulnerabilities require contributor-level access. A contributor is an access role that gives a registered user the ability to write and submit content, although in general contributors don’t have the ability to publish it.

Most websites don’t have to worry about security threats that require contributor-level access because most sites don’t offer that level of access.

Chloe Chamberland, Threat Intelligence Lead at Wordfence, explained that most site owners shouldn’t worry about medium-severity vulnerabilities that require contributor-level access to exploit, because most WordPress sites don’t offer that permission level. She also noted that these kinds of vulnerabilities are hard to scale because exploiting them is difficult to automate.

Chloe explained:

“For most site owners, vulnerabilities that require contributor-level access and above to exploit are something they do not need to worry about. This is because most sites do not allow contributor-level registration and most sites do not have contributors on their site.

In addition, most WordPress attacks are automated and are looking for easy to exploit high value returns so vulnerabilities like this are unlikely to be targeted by most WordPress threat actors.”

Website Publishers That Should Worry

Chloe also said that publishers who do offer contributor-level permissions may have several reasons to be concerned about these kinds of exploits:

“The concern with exploits that require contributor-level access to exploit arises when site owners allow contributor-level registration, have contributors with weak passwords, or the site has another plugin/theme installed with a vulnerability that allows contributor-level access in some way and the attacker really wants in on your website.

If an attacker can get their hands on one of these accounts, and a contributor-level vulnerability exists, then they may be provided with the opportunity to escalate their privileges and do real damage to the victim. Let’s take a contributor-level Cross-Site Scripting vulnerability for example.

Due to the nature of contributor-level access, an administrator would be highly likely to preview the post for review at which point any injected JavaScript would execute – this means the attacker would have a relatively high chance of success due to the admin previewing the post for publication.

As with any Cross-Site Scripting vulnerability, this can be leveraged to add a new administrative user account, inject backdoors, and essentially do anything a site administrator could do. If a serious attacker has access to a contributor-level account and no other trivial way to elevate their privileges, then they’d likely leverage that contributor-level Cross-Site Scripting to gain further access. As previously mentioned, you likely won’t see that level of sophistication targeting the vast majority of WordPress sites, so it’s really high value sites that need to be concerned with these issues.

In conclusion, while I don’t think a vast majority of site owners need to worry about contributor-level vulnerabilities, it’s still important to take them seriously if you allow user registration at that level on your site, you don’t enforce unique strong user passwords, and/or you have a high value WordPress website.”

Be Aware Of Vulnerabilities

While many of the medium-severity vulnerabilities may not be something to worry about, it’s still a good idea to stay informed about them. Security scanners like the free version of WPScan can warn you when a plugin or theme becomes vulnerable, which is a good way to have a warning system in place and keep on top of vulnerabilities.

WordPress security plugins like Wordfence offer a proactive security stance that actively blocks automated hacking attacks and can be further tuned by advanced users to block specific bots and user agents. The free version of Wordfence offers significant protection in the form of a firewall and a malware scanner. The paid version offers protection for all vulnerabilities as soon as they’re discovered and before the vulnerability is patched. I use Wordfence on all of my websites and can’t imagine setting up a website without it.

Security is generally not regarded as an SEO issue, but it should be, because failure to secure a site can undo all the hard work done to make it rank well.

Featured Image by Shutterstock/Juan villa torres

OpenAI To Show Content & Links In Response To Queries via @sejournal, @martinibuster

An OpenAI content deal will enhance ChatGPT with the ability to show real-time content with links in response to queries. OpenAI quietly took a step toward search engine-like functionality as part of a content licensing deal that may have positive implications for publishers and SEO.

Content Licensing Deal

OpenAI agreed to a content licensing deal with the Financial Times, a global news organization with offices in London, New York, continental Europe, and Asia.

Content licensing deals between AI organizations and publishers are generally about getting access to high quality training data. The training data is then used by language models to learn connections between words and concepts. This deal goes far beyond that use.

ChatGPT Will Show Direct Quotes With Attribution

What makes this content licensing deal between The Financial Times and OpenAI notable is that there is a reference to giving attribution to content within ChatGPT.

The announced licensing deal explicitly mentions the use of the licensed content so that ChatGPT could directly quote it and provide links to the licensed content.

Further, the licensing deal is intended to help improve ChatGPT’s “usefulness”, which is vague and can mean many things, but it takes on a slightly different meaning when used in the context of attributed answers.

The Financial Times agreement states that the licensing deal is for use in ChatGPT when it provides “attributed content” which is content with an attribution, commonly a link to where the content appeared.

This is the part of the announcement that references attributed content:

“The Financial Times today announced a strategic partnership and licensing agreement with OpenAI, a leader in artificial intelligence research and deployment, to enhance ChatGPT with attributed content, help improve its models’ usefulness by incorporating FT journalism, and collaborate on developing new AI products and features for FT readers.”

And this is the part of the announcement that mentions ChatGPT offering users attributed quotes and links:

“Through the partnership, ChatGPT users will be able to see select attributed summaries, quotes and links to FT journalism in response to relevant queries.”

The Financial Times Group CEO was even more explicit about OpenAI’s intention to show content and links in ChatGPT:

“This is an important agreement in a number of respects,” said FT Group CEO John Ridding. “It recognises the value of our award-winning journalism and will give us early insights into how content is surfaced through AI. …this partnership will help keep us at the forefront of developments in how people access and use information.

OpenAI understands the importance of transparency, attribution, and compensation…”

Brad Lightcap, COO of OpenAI, directly referenced showing real-time news content in ChatGPT, but more importantly, he referenced OpenAI exploring new ways to show content to its user base.

Lastly, the COO stated that they embraced disruption, which means innovation that creates a new industry or paradigm, usually at the expense of an older one, like search engines.

Lightcap is quoted:

“We have always embraced new technologies and disruption, and we’ll continue to operate with both curiosity and vigilance as we navigate this next wave of change.”

Showing direct quotes of Financial Times content with links to that content is very similar to how search engines work. This is a big change to how ChatGPT works and could be a sign of where ChatGPT is going in the future, a functionality that incorporates online content with links to that content.

Something Else That Is Possibly Related

Someone on Twitter recently noticed a change that is related to “search” in relation to ChatGPT.

This change involves an SSL security certificate that was added for a subdomain of ChatGPT.com. ChatGPT.com is a domain name that was snapped up by someone to capitalize on the 2022 announcement of ChatGPT by OpenAI. OpenAI eventually acquired the domain and it’s been redirecting to ChatGPT.

The change that was noticed is to the subdomain: search.chatgpt.com.
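Certificate issuances like this are recorded in public Certificate Transparency logs, which is how changes of this kind get spotted. As an illustration, here is a sketch that queries crt.sh’s JSON output for a domain (an unofficial endpoint whose response format may change):

```python
import requests

def certificates_for(domain: str) -> list[dict]:
    """Fetch certificates logged for a domain from crt.sh's JSON endpoint."""
    resp = requests.get(
        "https://crt.sh/",
        params={"q": domain, "output": "json"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

# Print when each logged certificate became valid and for which name.
for cert in certificates_for("search.chatgpt.com"):
    print(cert["not_before"], cert["common_name"])
```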

[Screenshot of the tweet: SSL certificate logs for search.chatgpt.com]

Big News For SEO and Publishers

This is significant news for publishers and search marketers: ChatGPT will become a source of valuable traffic if OpenAI takes it in the direction of providing attributed summaries and direct quotes.

How Can Publishers Get Traffic From ChatGPT?

Questions remain about attributed quotes with links in response to relevant queries. Here are some of the unknowns about ChatGPT attributed links.

  • Does this mean that only licensed content will be shown and linked to in ChatGPT?
  • Will ChatGPT incorporate and use most web data without licensing deals, in the same way that search engines do?
  • Will OpenAI offer an opt-in model where publishers can use a notation in robots.txt or in meta data to opt in to receiving traffic from ChatGPT?
  • Would you opt into receiving traffic from ChatGPT in exchange for allowing your content to be used for training?
  • How would SEOs’ and publishers’ calculus on ChatGPT change if their competitors are all receiving traffic from it?

Read the original announcement:

Financial Times announces strategic partnership with OpenAI

Featured Image by Shutterstock/Photo For Everything

Google Ads To Automatically Pause Low-Activity Keywords via @sejournal, @MattGSouthern

Google is emailing advertisers to notify them it will soon begin automatically pausing low-activity keywords.

According to the email, positive keywords in search ad campaigns will be considered low-activity if they haven’t generated impressions in over a year.

This change is designed to help advertisers focus on valuable keywords rather than clutter their accounts with underperforming keywords.

Google’s Email To Advertisers

In an email circulated to advertisers, Google states:

“Starting in June 2024, we will begin to automatically pause low-activity keywords.”

The email defined “low-activity” as keywords created over 13 months ago that have registered zero impressions during that time.
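Advertisers who want a head start on finding affected keywords could approximate that definition against their own account data. A minimal sketch, assuming a hypothetical CSV export of a keyword report with these column names (adjust them to match your actual export):

```python
import pandas as pd

# Hypothetical columns; match these to your actual Google Ads report export.
report = pd.read_csv("keyword_report.csv")  # one row per positive keyword

low_activity = report[
    (report["impressions"] == 0)
    & (report["days_since_created"] > 13 * 30)  # roughly 13 months
]
print(low_activity[["keyword", "campaign"]])
```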

Rationale Behind The Change

Google explained the reason for this change:

“We want to help advertisers simplify their accounts and focus on keywords that drive results.”

Advertiser Options & Recommendations

You can unpause automatically paused keywords, though Google recommends only unpausing keywords that will receive impressions in the coming weeks.

Unpaused keywords will be automatically paused again if they fail to generate any impressions over the next three months.

Google’s email reads:

“If you decide that a paused keyword is necessary for your campaign, you can unpause it. However, we strongly recommend that you only unpause keywords that you believe will receive impressions in the coming weeks. Unpaused keywords will be automatically paused again if they don’t receive any impressions over the next 3 months.”

Why SEJ Cares

By automatically pausing low-activity keywords, Google aims to declutter advertiser accounts and encourage a more focused approach to high-performing keywords.

However, some advertisers may find value in maintaining low-activity keywords for various reasons, such as capturing long-tail searches or aligning with specific marketing goals.

What This Means For Advertisers

This change means advertisers need to review and refine their keyword strategies.

With Google Ads set to pause low-activity keywords automatically, advertisers must be more proactive in identifying and prioritizing high-performing keywords.

This may involve revisiting campaign goals, adjusting bidding strategies, and ensuring that keyword lists are up-to-date and relevant.

Use this update to reevaluate keyword strategies and consider new, relevant keywords with potential.


Featured Image: foxanst/Shutterstock

Google’s John Mueller On Website Recovery After Core Updates via @sejournal, @MattGSouthern

John Mueller, a Google Search Advocate, provided guidance this week regarding the path forward for websites impacted by recent search algorithm updates.

The discussion started on X (formerly Twitter) by SEO professional Thomas Jepsen.

Jepsen tagged Mueller, asking:

“Google has previously said Google doesn’t hold a grudge and sites will recover once issues have been solved. Is that still the case after HCU?”

Mueller’s response offered hope to site owners while being realistic about the challenges ahead.

Addressing Recovery Timelines

Mueller affirmed Google’s stance on not holding grudges, stating, “That’s still the case.”

However, he acknowledged the complexity of rankings, saying:

“…some things take much longer to be reassessed (sometimes months, at the moment), and some bigger effects require another update cycle.”

Mueller pointed to a Google help document explaining the nuances. The document reads:

“Broad core updates tend to happen every few months. Content that was impacted in Search or Discover by one might not recover—assuming improvements have been made—until the next broad core update is released.

Do keep in mind that improvements made by site owners aren’t a guarantee of recovery, nor do pages have any static or guaranteed position in our search results. If there’s more deserving content, that will continue to rank well with our systems.”

The Comments Sparking Debate

Jepsen probed further, asking, “Is a core update what’s needed for HCU-affected sites to recover (assuming they’ve fixed their issues)?”

Mueller’s response highlighted how situations can differ:

“It depends on the situation… I realize there’s a big space between the situations, but generalizing doesn’t help. Sometimes it takes a lot of work on the site, a long time, and an update.”

The thread grew as user @selectgame raised concerns about Google Discover traffic, to which Mueller replied:

“Google Discover is affected by core updates as well as other parts of Search (and there are more policies that apply to Discover).”

Growing Frustrations

Prominent industry figure Lily Ray voiced mounting frustrations, stating,

“…many HCU-affected websites – which have been making all kinds of improvements over the last 7 months – have only seen further declines with the March Core Update.

I have seen some sites lose 90% or more of their SEO visibility since the HCU, with the last few weeks being the nail in the coffin, despite making significant improvements.”

Ray continued:

“And in my professional opinion, many of these sites did not deserve anywhere near that level of impact, especially the further declines over the past month.”

Mueller hasn’t responded to Ray’s tweet at this time.

Looking Ahead

As the search community awaits Google’s next moves, the path to recovery appears arduous for many impacted by recent algorithm reassessments of “Helpful Content.”

Site improvements don’t guarantee immediate recovery, so publishers face an uphill battle guided only by Google’s ambiguous public advice.

Why SEJ Cares

The March 2024 core update has proven disastrous for many websites, with severe traffic losses persisting even after sites try to improve low-quality content, address technical issues, and realign with Google’s guidelines.

Having clear, actionable guidance from Google on recovering from core updates would be invaluable.

As evidenced by the frustrations expressed, the current communications leave much to be desired regarding transparency and defining a straightforward recovery path.

How This Can Help You

While Mueller’s comments provide some insights, the key takeaways are:

  • Regaining previous rankings after an algorithm hit is possible if sufficient content/site quality improvements are made.
  • Recovery timelines can vary significantly and may require a future core algorithm update.
  • Even with enhancements, recovery isn’t guaranteed as rankings depend on the overall pool of competing content.

The path is undoubtedly challenging, but Mueller’s comments underscore that perseverance with substantial site improvements can eventually pay off.


FAQ

Can SEO professionals predict recovery time for a website hit by core updates?

SEO professionals can’t pinpoint when a site will recover after a core Google algorithm update.

Reasons for this include:

  • Google releases core updates every few months, so sites may need to wait for the next one.
  • It can take months for Google to reassess and adjust rankings.
  • How competitive the query is also impacts if and when a site recovers.

Does making site improvements after a core update ensure recovery in rankings and visibility?

After making improvements following a Google algorithm update, regaining your previous rankings isn’t guaranteed.

Reasons why include:

  • Your impacted content may not recover until the next core update, provided you’ve implemented enough site improvements.
  • Google’s search results are dynamic, and rankings can fluctuate based on the quality of competitor content.
  • There’s no fixed or guaranteed position in Google’s search results.

What is the relationship between Google Discover traffic and core search updates?

Google’s core algorithm updates that impact regular search results also affect Google Discover.

However, Google Discover has additional specific policies that determine what content appears there.

This means:

  • Improving your content and website quality can boost your visibility on Google Discover, just like regular searches.
  • You may see changes in your Discover traffic when Google rolls out core updates.
  • Your SEO and content strategy should account for potential impacts on regular searches and Google Discover.


Featured Image: eamesBot/Shutterstock

Google News Deletes Manual Submission Option via @sejournal, @martinibuster

Google removed its documentation on manually submitting publications for review to be included in Google News. An announcement via new documentation advises that publishers can no longer manually add their sites to be considered for Google News. Existing publishers will continue to have access to customization features, but those too will be removed soon.

Google Publisher Center

The Google Publisher Center was a service that allowed publishers to submit their site content to Google News. Publishers could also submit a constantly updated feed that would alert Google to newly published articles, and Google could use a sitemap to discover newly published news articles.

While Google offered publishers a way to submit webpages to Google for display in Google News, Google never promised to show those sites.

Google has also maintained that it has automated ways to include content in Google News.

Google News Inclusion System

Google removed the ability for publishers to manually submit websites to be considered for inclusion to Google News. The process is transitioning to a fully automated system of discovery and inclusion.

The new system will be rolling out later this year.

The “Submit your publication for review” documentation was removed and now returns a 404 response.

[Screenshot: 404 Page Not Found displayed on the Publisher Center help page]

New Documentation

Google’s new documentation explains that publishers can no longer make manual submissions and that current publishers will continue to have access to the submission tools, but those tools will be removed sometime later in 2024.

From the new Publisher Center documentation:

“Starting today, publishers can no longer add publications to the Publisher Center. We are making this change as part of a transition to roll out automatically created publication pages later this year.”

The new documentation further explains that current publishers with access to the Publisher Center will continue to have access to customization features but that access will at some point be removed.

“What’s next for publication pages?
By adding a publication to Publisher Center, publishers were able to manually create a source page for users to follow. Publishers with manually created publication pages will continue to have access to customization features until later this year, when pages will shift to being automatically created. Users will continue to be able to follow their favorite publications.”

SearchLiaison Explains Reasons For Change

Google SearchLiaison posted this explanation on X (formerly known as Twitter):

“To clarify (some people get this, I know — but others get confused)….

Content is automatically considered for Google News and news surfaces, as has been the case since Dec. 2019.  That hasn’t changed.

Publisher Center offered a way to create source pages manually, which did not somehow get content overall considered for Google News and news surfaces. Some who made these pages would get confused and think source page = content in Google News and news surfaces. We’ve tried to clarify this more over the years…

Despite that, some still get confused because … yeah, it’s confusing! So part of this change is eliminating that confusion. Sites small and big are still automatically considered…

Also, anticipating those who say we should do more to include more news sites — moving to automatic consideration has helped, versus the past where someone would actually have to know to apply.

That said, yes, there’s always more we can do — it’s something I push on (and I’m not alone) especially in cases where sites are not having their content appear at all in news queries but are referenced by other who do. I’d really like to see us improve with this.”

How Sites Can Be Included In Google News

Sites wishing to be included in Google News need to abide by Google’s content policies. Google then uses algorithmic signals to determine eligibility for being shown in Google News as well as in additional Google News surfaces such as Top Stories and the News tab in the search results.

Google News Ranking Signals

Understanding the ranking signals Google uses for Google News is more important now than ever before.

Google lists general factors as part of the Google News ranking signals and a separate group of factors for personalized news (this is the “For You” tab of Google News).

General Google News Ranking Factors

  • Relevance of content
  • Prominence
  • Authoritativeness
  • Freshness
  • Location
  • Language

Ranking Factors For Personalized News

  • Interests
  • Usability
  • User preferences for topics or publishers

That’s the extent of hints offered by Google on what to know about Google News ranking factors.

A statement reads:

“You can improve your site’s ranking by maintaining a Google-friendly website.

…While we’re happy to help you with technical issues in Google News, we can’t provide much feedback regarding ranking. We appreciate your understanding.”

Read Google’s newly updated documentation for eligibility for Google News:
An update on publication pages

Read Google News content policies:
Google News Content Policies

See an archive of the missing Publisher Center support page:
Archive.org – Submit your publication for review

Featured Image by Shutterstock/robuart

Google Ads Performance Max Updates For Full-Funnel Optimization via @sejournal, @MattGSouthern

Google Ads has announced a series of updates to Performance Max campaigns.

The enhancements give advertisers more granular control over their campaigns, improved customer insights, and better budget management.

New Customer Value Mode & Acquisition Goals

Google Ads has introduced Customer Value mode, currently in beta, designed for purchase conversion goals.

This update allows advertisers to focus on high-value customers and optimize their campaigns accordingly.

Additionally, new customer acquisition goals are now available in Search Ads 360 (SA360), enabling businesses to target and attract new customers.

Winning Back Lapsed Customers

Google Ads has launched a customer retention goal (in beta) to help advertisers win back lapsed customers.

This feature enables businesses to re-engage with inactive customers, potentially increasing customer loyalty and retention rates.

Detailed Demographics & Audience Insights

Performance Max now offers detailed demographics, including age and gender groups, in its audience insights.

This update gives advertisers a more nuanced understanding of their target audience, enabling them to create more targeted and effective campaigns.

Budget Pacing Insights & Forecasting

To help advertisers better manage their budgets, Google Ads has introduced budget pacing insights.

These insights provide projected campaign spend and forecasted conversion performance.

This feature allows businesses to make informed decisions about their budget allocation and optimize their campaigns for maximum ROI.

Account-Level IP Address Exclusions

Performance Max now supports account-level IP address exclusions, giving advertisers greater control over their campaigns and enabling them to exclude specific IP addresses from their targeting.

Advertisers can refer to the official Google Ads blog post for more information on these updates and how to leverage them for improved campaign performance.

Testing Optimization Strategies

Perhaps the most intriguing update is the ability to run experiments testing optimization strategies directly within Performance Max campaigns.

The first strategy being tested is “Final URL expansion,” which dynamically points ads to the most relevant landing page.

According to Google data cited in the announcement:

“Advertisers who use Final URL expansion with Performance Max campaigns see an average increase of over 9% in conversions/conversion value at a similar Cost Per Action (CPA)/Return on Ad Spend (ROAS).”

The experiments will split traffic, with a portion dedicated to testing while tracking performance against the original setup.

Data will “reveal how this change impacts your campaign’s performance.”

Why SEJ Cares

These updates demonstrate Google Ads’ ongoing commitment to providing advertisers with tools and insights to succeed in an increasingly competitive market.

The ability to optimize for different customer lifecycle stages, test optimization strategies, and detect audience opportunities are all valuable enhancements.

Performance Max is Google’s future for performance marketing, so staying up-to-date on new features and best practices is critical.

How This Can Help You

The new customer lifetime value and retention goals could open up strategies to better nurture customers through the entire marketing funnel from acquisition to repeat purchases.

More sophisticated audience insights about age and gender may highlight opportunities to tailor messaging better.

Running optimization experiments could unlock unique performance gains, especially if utilizing the dynamic Final URL expansion option.

Taking advantage of these new controls and insights within Performance Max could improve marketers’ efficiency and return on investment.


Featured Image: Tada Images/Shutterstock

Google’s Gary Illyes On AI, Site Migrations, & “SEO Is Dead” Claims via @sejournal, @MattGSouthern

Following the recent SERPConf event, Google Search Analyst Gary Illyes shared insights that provided a glimpse into the latest trends and challenges in the SEO world.

His LinkedIn post touches on a range of topics, from technical issues to the role of AI, providing a thorough overview of the industry’s current state.

Addressing hreflang Frustrations

One key discussion area was the ongoing struggle with hreflang, a markup that indicates a website’s language and regional variations.

Illyes acknowledged the difficulties, stating, “hreflang is annoying. I don’t disagree.”

However, he was open to finding a better solution, adding:

“I’m still very open to coming up with something less annoying, but it needs to work for small sites and mammoths as well, while delivering at least the same amount of information.”

The AI Conundrum: Fear vs Innovation

The rise of AI, particularly generative AI, was a hot topic at the conference.

While some attendees expressed concerns, Illyes offered a balanced perspective.

“AI is a tool that you can use and, just like with any tool, misuse. It’s not in any way different from any other feature accessible to you.”

He encouraged SEOs to “use it as you like, liberally and innovatively” but cautioned against contributing to the abundance of low-quality content already present online.

Site Migrations: A Persistent Challenge

Another area of concern highlighted by Illyes was the ongoing issue of site migrations gone wrong.

“This shouldn’t happen,” he stated, urging webmasters who have experienced prolonged issues with site migrations to contact Google for assistance.

The Persistent “SEO is Dead” Myth

Addressing the recurring claim that SEO is becoming obsolete, Illyes dismissed it as a cyclical phenomenon.

“Every single time something changes on search engines there’s a plethora of posts published about how SEO is dead now,” he remarked.

Drawing from the history of search engines, he added:

“Looking at the history of search engines and according to these posts, SEO has been dying since cca 2003. Hell of a slow death.”

Why SEJ Cares

As SEO evolves, with AI and other emerging technologies playing a prominent role, Illyes’ insights show there are both challenges and opportunities.

His willingness to address concerns head-on and acknowledge areas for improvement may foster a more open dialogue between Google and SEO professionals.

How This Can Help You

There are several key takeaways for SEO professionals:

  • Stay tuned for potential hreflang improvements. Illyes suggested that changes could be on the horizon.
  • Embrace AI responsibly. Illyes encourages SEO professionals to explore these technologies innovatively while emphasizing the importance of maintaining high standards and producing quality content.
  • Reach out to Google regarding site migration issues. With site migrations being a pain point, Illyes’ recommendation to escalate prolonged issues highlights the need for proactive communication with Google.
  • Tune out the “SEO is dead” noise. Illyes offered a reality check for those overwhelmed by doomsday narratives. Understanding that the industry has weathered similar claims can help you maintain focus.

Featured Image: fran_kie/Shutterstock

5 Things To Consider Before A Site Migration via @sejournal, @martinibuster

One of the scariest SEO tasks is a site migration because the stakes are so high and there are pitfalls at every step. Here are five tips that will help keep a site migration on track to a successful outcome.

Site Migrations Are Not One Thing

Site migrations are not one thing; they are actually different scenarios, and the only thing they have in common is that there is always something that can go wrong.

Here are examples of some of the different kinds of site migrations:

  • Migration to a new template
  • Migrating to a new web host
  • Merging two different websites
  • Migrating to a new domain name
  • Migrating to a new site architecture
  • Migrating to a new content management system (CMS)
  • Migrating to a new WordPress site builder

There are many ways a site can change and more ways for those changes to result in a negative outcome.

The following is not a site migration checklist. It’s five suggestions for things to consider.

1. Prepare For Migration: Download Everything

Rule number one is to prepare for the site migration. One of my big concerns is making sure the old version of the website is properly documented.

These are some of the ways to document a website:

  • Download the database and save it in at least two places. I like to have a backup of the backup stored on a second device.
  • Download all the website files. Again, I prefer to save a backup of the backup stored on a second device.
  • Crawl the site, save the crawl, and export it as a CSV or an XML sitemap. I prefer to have redundant backups just in case something goes wrong (a short verification sketch follows this list).
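A minimal sketch of that redundancy idea: copy a backup file to more than one location and verify each copy with a checksum. The paths and filenames are placeholders, not part of the original advice:

```python
import hashlib
import shutil
from pathlib import Path

def redundant_backup(source: Path, destinations: list[Path]) -> None:
    """Copy a backup to multiple locations and verify each copy's checksum."""
    expected = hashlib.sha256(source.read_bytes()).hexdigest()
    for dest_dir in destinations:
        dest_dir.mkdir(parents=True, exist_ok=True)
        copy = dest_dir / source.name
        shutil.copy2(source, copy)
        actual = hashlib.sha256(copy.read_bytes()).hexdigest()
        assert actual == expected, f"Corrupted copy at {copy}"

# Example: a database dump saved to an external drive and a local folder.
redundant_backup(
    Path("site-backup.sql.gz"),
    [Path("/mnt/external-drive/backups"), Path("local-backups")],
)
```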

An important thing to remember about downloading files by FTP is that there are two transfer modes: ASCII and binary.

  1. Use ASCII for downloading files that contain code, like CSS, JS, PHP and HTML.
  2. Use Binary for media like images, videos and zip files.

Fortunately, most modern FTP software has an automatic setting that can distinguish between the two kinds of files. A sad thing that can happen is downloading image files in ASCII format, which results in corrupted images.
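For illustration, here is a minimal ftplib sketch of the two transfer modes; the host, credentials, and filenames are placeholders:

```python
from ftplib import FTP

ftp = FTP("ftp.example.com")  # placeholder host
ftp.login(user="backup-user", passwd="your-password")  # placeholder credentials

# ASCII mode for text-based files (CSS, JS, PHP, HTML):
# retrlines transfers line by line and normalizes line endings.
with open("style.css", "w") as f:
    ftp.retrlines("RETR style.css", lambda line: f.write(line + "\n"))

# Binary mode for media and archives -- ASCII mode would corrupt these.
with open("logo.png", "wb") as f:
    ftp.retrbinary("RETR logo.png", f.write)

ftp.quit()
```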

So always check that your files have all been properly downloaded and are not corrupted. Always consider downloading a copy for yourself if you have hired a third party to handle the migration, or if a client is doing it and downloading the files. That way, if their download fails, you’ll still have an uncorrupted copy backed up.

The most important rule about backups: You can never have too many backups!

2. Crawl The Website

Do a complete crawl of the website. Create a backup of the crawl. Then create a backup of the backup and store it on a separate hard drive.

After the site migration, this crawl data can be used to generate a list of the old URLs to re-crawl, identifying any URLs that are missing (404), failing to redirect, or redirecting to the wrong webpage. Screaming Frog also has a list mode that can crawl a list of URLs saved in different formats, including as an XML sitemap, or input directly into a text field. This is a way to crawl a specific batch of URLs as opposed to crawling a site from link to link.
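As a sketch of that post-migration check, assuming the pre-migration crawl was exported to a CSV with a url column (the filename and column name are assumptions, not Screaming Frog defaults):

```python
import csv

import requests

# Load the pre-migration URL list saved before the move.
with open("pre_migration_crawl.csv") as f:
    old_urls = [row["url"] for row in csv.DictReader(f)]

for url in old_urls:
    resp = requests.get(url, allow_redirects=True, timeout=15)
    hops = [r.status_code for r in resp.history]  # status of each redirect hop
    if resp.status_code == 404:
        print(f"MISSING        {url}")
    elif len(hops) > 1:
        print(f"REDIRECT CHAIN {hops} {url} -> {resp.url}")
    elif hops and hops[0] != 301:
        print(f"NON-301 ({hops[0]}) {url} -> {resp.url}")
```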

3. Tips For Migrating To A New Template

Website redesigns can be a major source of anguish when they go wrong. On paper, migrating a site to a new template should be a one-to-one change with minimal issues. In practice, that’s not always the case. For one, no template can be used off the shelf; it has to be modified to conform to what’s needed, which can mean removing and/or altering the code.

Search marketing expert Nigel Mordaunt (LinkedIn), who recently sold his search marketing agency, has experience migrating over a hundred sites and has important considerations for migrating to a new WordPress template.

This is Nigel’s advice:

“Check that all images have the same URL, alt text and image titles, especially if you’re using new images.

Templates sometimes have hard-coded heading elements, especially in the footer and sidebars. Those should be styled with CSS, not with H tags. I had this problem with a template once where the ranks had moved unexpectedly, then found that the Contact Us and other navigation links were all marked up to H2. I think that was more of a problem a few years ago. But still, some themes have H tags hard coded in places that aren’t ideal.

Make sure that all URLs are the exact same, a common mistake. Also, if planning to change content then check that the staging environment has been noindexed then after the site goes live make sure that the newly uploaded live site no longer contains the noindex robots meta tag.

If changing content, then be prepared for the site to perhaps be re-evaluated by Google. Depending on the size of the site, even if the changes are positive it may take several weeks to be rewarded, and in some cases several months. The client needs to be informed of this before the migration.

Also, check that analytics and tracking codes have been inserted into the new site, review all image sizes to make sure there are no new images that are huge and haven’t been scaled down. You can easily check the image sizes and heading tags with a post-migration Screaming Frog crawl. I can’t imagine doing any kind of site migration without Screaming Frog.”
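One item in that checklist, catching a leftover noindex robots meta tag after the site goes live, is easy to script. A crude sketch (the URL list is a placeholder; a thorough check would parse the HTML properly and also inspect the X-Robots-Tag response header, which this only checks naively):

```python
import requests

# Placeholder list; in practice, feed in the URLs from your crawl.
pages = ["https://www.example.com/", "https://www.example.com/contact/"]

for url in pages:
    resp = requests.get(url, timeout=15)
    html = resp.text.lower()
    header = resp.headers.get("x-robots-tag", "").lower()
    # Naive string match for a robots meta tag containing noindex.
    if ("noindex" in html and 'name="robots"' in html) or "noindex" in header:
        print(f"WARNING: possible noindex on {url}")
```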

4. Advice For Migrating To A New Web Host

Mark Barrera (LinkedIn), VP SEO at Newfold Digital (parent company of Bluehost), had this to say about crawling in preparation for a migration to a new web host:

“Thoroughly crawl your existing site to identify any indexing or technical SEO issues prior to the move.

Maintain URL Structure (If Possible): Changing URL structures can confuse search engines and damage your link equity. If possible, keep your URLs the same.

301 Redirects: 301 Redirects are your friend. Search engines need to be informed that your old content now lives at a new address. Implementing 301 redirects from any old URLs to their new counterparts preserves link equity and avoids 404 errors for both users and search engine crawlers.

Performance Optimization: Ensure your new host provides a fast and reliable experience. Site speed is important for user experience.

Be sure to do a final walkthrough of your new site before doing your actual cutover. Visually double-check your homepage, any landing pages, and your most popular search hits. Review any checkout/cart flows, comment/review chains, images, and any outbound links to your other sites or your partners.

SSL Certificate: A critical but sometimes neglected aspect of hosting migrations is the SSL certificate setup. Ensuring that your new host supports and correctly implements your existing SSL certificate—or provides a new one without causing errors is vital. SSL/TLS not only secures your site but also impacts SEO. Any misconfiguration during migration can lead to warnings in browsers, which deter visitors and can temporarily impact rankings.

Post migration, it’s crucial to benchmark server response times not just from one location, but regionally or globally, especially if your audience is international. Sometimes, a new hosting platform might show great performance in one area but lag in other parts of the world. Such discrepancies can affect page load times, influencing bounce rates and search rankings.”
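The SSL point can be smoke-tested before cutover. A minimal sketch using Python’s standard library (the hostname is a placeholder); a handshake failure here surfaces the same certificate error visitors would see in their browsers:

```python
import socket
import ssl
from datetime import datetime

def certificate_expiry(hostname: str, port: int = 443) -> datetime:
    """Handshake with the new host and return the served certificate's expiry."""
    ctx = ssl.create_default_context()
    with socket.create_connection((hostname, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=hostname) as tls:
            cert = tls.getpeercert()
    # Certificate dates look like "Jun  1 12:00:00 2025 GMT".
    return datetime.strptime(cert["notAfter"], "%b %d %H:%M:%S %Y %Z")

print(certificate_expiry("www.example.com"))  # placeholder hostname
```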

5. Accept Limitations

Ethan Lazuk, SEO Strategist & Consultant at Ethan Lazuk Consulting, LLC (LinkedIn, Twitter), offers an interesting perspective on site migrations: anticipating the limitations clients impose on what you are able to do. It can be frustrating when a client pushes back on advice, and it’s important to listen to their reasons for doing so.

I have consulted over Zoom with companies whose SEO departments had concerns about what an external SEO wanted to do. Seeking third-party confirmation of a site migration plan is a reasonable thing to do, so if the internal SEO department has concerns about the plan, it’s not a bad idea to have a trustworthy third party take a look at it.

Ethan shared his experience:

“The most memorable and challenging site migrations I’ve been a part of involved business decisions that I had no control over.

As SEOs, we can create a smart migration plan. We can follow pre- and post-launch checklists, but sometimes, there are legal restrictions or other business realities behind the scenes that we have to work around.

Not having access to a DNS, being restricted from using a brand’s name or certain content, having to use an intermediate domain, and having to work days, weeks, or months afterward to resolve any issues once the internal business situations have changed are just a few of the tricky migration issues I’ve encountered.

The best way to handle these situations that require working around client restrictions is to button up the SEO tasks you can control, set honest expectations for how the business issues could impact performance after the migration, and stay vigilant with monitoring post-launch data and using it to advocate for resources you need to finish the job.”

Different Ways To Migrate A Website

Site migrations are a pain and should be approached with caution. I’ve done many different kinds of migrations for my own sites and have assisted clients with theirs. I’m currently moving thousands of webpages from a folder to the root, a job complicated by multiple redirects that have to be reconfigured, and I’m not looking forward to it. But migrations are sometimes unavoidable, so it’s best to step up to them after careful consideration.

Featured Image by Shutterstock/Krakenimages.com