CMOs Under Pressure: The Unseen Challenges In B2B Marketing via @sejournal, @MattGSouthern

A recent study of 121 B2B CMOs and marketing leaders has uncovered current marketing industry challenges.

The study by Bospar, CMO Huddles, and Redpoint examines the concept of an “underground recession” in marketing departments and its implications for professionals in the field.

Key findings include:

  • Budget constraints and their impact on marketing strategies
  • Changing deal cycles and their effects on revenue
  • Staffing challenges and increased pressure on marketing teams
  • Evolving CMO roles and job market trends

Read on for a data-driven exploration of the current state of B2B marketing.

Marketing Industry In A “Hidden” Recession

Despite positive macroeconomic indicators, including a 9.34% increase in the S&P 500 since the beginning of 2024, marketing departments are experiencing a different reality.

The survey found that 69% of respondents believe their industry is in a recession, while 61% feel that the overall unemployment rate doesn’t accurately reflect the situation in their sector.

Key Challenges Facing CMOs

The study identified four main trends making the job of marketing leaders increasingly difficult:

  1. Budget Cuts & Revenue Declines: 77% of marketing leaders reported flat or reduced budgets, with 38% experiencing cuts of at least 3%.
  2. Longer Deal Cycles: 54% of respondents noted extended sales cycles, impacting revenue timing and marketing budgets.
  3. Staffing Cuts & Layoffs: Half of the surveyed companies experienced layoffs, with 41% seeing cuts within their marketing departments.
  4. Pressure to Deliver More with Less: 69% of marketing leaders were asked to do more with reduced budgets in the past year.

Personal & Professional Toll on CMOs

The pressures of the current economic environment are reportedly taking a toll on marketing leaders.

67% of respondents reported that the past year’s challenges have impacted their overall well-being.

Many experienced adverse effects, including reduced exercise (80%), less time off (70%), and weight gain (40%).

Declining Job Prospects For CMOs

The study also highlighted a concerning trend in the job market for CMOs.

LinkedIn data shows a 62% decrease in CMO job postings in the United States from February 2023 to February 2024.

This decline is partly attributed to companies consolidating marketing responsibilities under other C-suite roles.

Adapting To The New Reality

Despite these challenges, industry experts emphasize the need for CMOs to adapt and evolve their strategies.

MarTech entrepreneur Jon Miller suggests that “the old playbooks just aren’t working anymore, and it’s time for a new playbook (and new technology) that aligns with modern buyers.”

Drew Neisser of CMO Huddles recommends four key areas for CMOs to focus on:

  1. Role expansion beyond traditional marketing duties
  2. Metrics expansion to demonstrate marketing’s full value
  3. Idea concentration to maximize impact with limited resources
  4. AI implementation to drive innovation and efficiency

Why Does This Matter?

This study shows what’s happening in marketing beyond the rosy economic headlines.

It matters because:

  • It explains why your job might feel harder lately.
  • It shows we need to get creative with our strategies.
  • It highlights why proving marketing’s value is so important right now.

What Does This Mean For You?

Here’s what you should keep in mind:

  • Learn skills that clearly show your worth, like data analysis.
  • Get ready to do more with less – focus on what really matters.
  • Look for ways to expand your role in the company.
  • Network more – it could help you find new opportunities.
  • Keep learning about new trends and tools.
  • Take care of yourself – everyone’s feeling the pressure, not just you.

Featured Image: Ground Picture/Shutterstock

Google’s Web Crawler Fakes Being “Idle” To Render JavaScript via @sejournal, @MattGSouthern

In a recent episode of the Search Off The Record podcast, Google revealed that its rendering system now pretends to be “idle” to trigger certain JavaScript events and improve webpage rendering.

The podcast features Zoe Clifford from Google’s rendering team, who discussed how the company’s web crawlers deal with JavaScript-based sites.

This revelation matters to web developers who defer content loading with JavaScript.

Google’s “Idle” Trick

Googlebot simulates “idle” states during rendering, which triggers JavaScript events like requestIdleCallback.

Developers use this function to defer loading less critical content until the browser is free from other tasks.
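As a rough sketch (the helper name, element ID, and fallback behavior are illustrative, not from the podcast), the deferral pattern looks like this. The key point: if the browser never reports an idle period, the callback never fires and the deferred content never loads.

```javascript
// Schedule work for when the browser is idle, with a timeout safety net
// so the task eventually runs even if the main thread is never idle.
function scheduleWhenIdle(task, timeoutMs) {
  if (typeof requestIdleCallback === 'function') {
    // Browser path: the timeout option guarantees the task eventually runs.
    requestIdleCallback(task, { timeout: timeoutMs });
  } else {
    // Fallback for environments without requestIdleCallback.
    setTimeout(task, 0);
  }
}

// Example: defer a non-critical widget until the browser is free.
scheduleWhenIdle(() => {
  if (typeof document === 'undefined') return; // no-op outside a browser
  const el = document.querySelector('#related-articles');
  if (el) el.innerHTML = '<p>Related articles loaded.</p>';
}, 2000);
```

The `timeout` option is the defensive piece: without it, a site deferring critical content this way is betting that every client, including a crawler's browser, will eventually go idle.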

Before this change, Google’s rendering process was so efficient that the browser was always active, causing some websites to fail to load important content.

Clifford explained:

“There was a certain popular video website which I won’t name…which deferred loading any of the page contents until after requestIdleCallback was fired.”

Since the browser was never idle, this event wouldn’t fire, preventing much of the page from loading properly.

Faking Idle Time To Improve Rendering

Google implemented a system where the browser pretends to be idle periodically, even when it’s busy rendering pages.

This tweak ensures that idle callbacks are triggered correctly, allowing pages to fully load their content for indexing.

Importance Of Error Handling

Clifford emphasized the importance of developers implementing graceful error handling in their JavaScript code.

Unhandled errors can lead to blank pages, redirects, or missing content, negatively impacting indexing.

She advised:

“If there is an error, I just try and handle it as gracefully as possible…web development is hard stuff.”
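In practice, that advice can be as simple as wrapping optional enhancements in a try/catch so a failure degrades gracefully instead of blanking the page. This is a minimal sketch (the function and message are illustrative, not Google-prescribed):

```javascript
// Run an optional JavaScript enhancement; if it throws, keep the
// server-rendered base content intact rather than breaking the page.
function enhancePage(enhancement) {
  try {
    enhancement();
    return true; // enhancement applied successfully
  } catch (err) {
    // Log for debugging, but leave the existing content untouched.
    console.error('Enhancement failed; base content preserved:', err.message);
    return false;
  }
}
```

The pattern matters for indexing because an unhandled exception can halt script execution before the rest of the page's content is inserted.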

What Does This Mean?

Implications For Web Developers

  • Graceful Error Handling: Implementing graceful error handling ensures pages load as intended, even if certain code elements fail.
  • Cautious Use of Idle Callbacks: While Google has adapted to handle idle callbacks, be wary of over-relying on these functions.

Implications For SEO Professionals

  • Monitoring & Testing: Implement regular website monitoring and testing to identify rendering issues that may impact search visibility.
  • Developer Collaboration: Collaborate with your development team to create user-friendly and search engine-friendly websites.
  • Continuous Learning: Stay updated with the latest developments and best practices in how search engines handle JavaScript, render web pages, and evaluate content.

Other Rendering-Related Topics Discussed

The discussion also touched on other rendering-related topics, such as the challenges posed by user agent detection and the handling of JavaScript redirects.

The whole podcast provides valuable insights into web rendering and the steps Google takes to assess pages accurately.

See also: Google Renders All Pages For Search, Including JavaScript-Heavy Sites


Featured Image: fizkes/Shutterstock

CWV & Google Page Experience Ranking Factor Updated via @sejournal, @martinibuster

The June 2024 Chrome User Experience Report (CrUX) is out, and it shows that real-world websites saw an across-the-board improvement in Core Web Vitals (CWV) performance scores. Some of the improvement is attributable to a change in how Interaction to Next Paint is measured, which will be good news for websites with dialog modals (popups).

CrUX Dataset

The CrUX dataset consists of actual Core Web Vitals performance scores as measured in Chrome browsers when visiting websites. The data comes from browsers that were voluntarily opted in to report website performance metrics. The CrUX dataset is publicly available and is used by PageSpeed Insights and third-party tools.

CrUX Influences Page Experience Ranking Factor

The CrUX report feeds Google’s Page Experience ranking factor. The data is publicly available and can be used to evaluate performance, including competitor performance. CrUX is important because it is one of the few metrics website publishers can check that is directly connected to a ranking factor.

According to Google’s overview documentation:

“The data collected by CrUX is available publicly through a number of Google tools and third-party tools and is used by Google Search to inform the page experience ranking factor.”

While the influence of the Page Experience ranking factor may be on the lower side, it’s still important for reasons beyond rankings, such as improving conversions and ad clicks.

June 2024 Dataset

The dataset for June 2024 has been published, and it shows that Core Web Vitals (CWV) website performance scores rose incrementally across the board by modest percentages. This suggests that website performance continues to be a focus. Most of the popular content management systems are doing their best to improve, with WordPress making positive improvements with each new version it releases.

The following scores are for origins. An origin covers the entire website, which is different from scores for individual pages.

These are the average origin scores:

  • Largest Contentful Paint (LCP)
    This is a measurement of how fast the main content of a page loads. It specifically measures the largest image or content block that’s visible in a browser (viewport).
    63.4% (↑ 2.0%) had good LCP
  • Cumulative Layout Shift (CLS)
    Measures visual stability: how much page elements unexpectedly shift and jump around as the page loads.
    77.8% (↑ 0.5%) had good CLS
  • Interaction to Next Paint (INP)
    INP measures how long it takes for a web page to respond to user interactions.
    84.1% (↑ 1.1%) had good INP
  • Percentage Of Sites With Good CWV
    This is the percentage of sites that had passing scores across all three Core Web Vitals metrics
    51.0% (↑ 2.3%) had good LCP, CLS and INP
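For context on the pass rate above: Google’s published “good” thresholds are LCP ≤ 2.5 seconds, CLS ≤ 0.1, and INP ≤ 200 milliseconds, and an origin passes CWV only if all three are good. A minimal sketch of that check:

```javascript
// Classify a set of field measurements against Google's published
// "good" thresholds for the three Core Web Vitals:
// LCP <= 2500ms, CLS <= 0.1, INP <= 200ms.
function passesCoreWebVitals({ lcpMs, cls, inpMs }) {
  return lcpMs <= 2500 && cls <= 0.1 && inpMs <= 200;
}

// Example: good LCP and CLS, but sluggish interactions fail the check.
const result = passesCoreWebVitals({ lcpMs: 1800, cls: 0.05, inpMs: 350 });
// result === false
```

This all-three requirement is why the combined “good CWV” percentage (51.0%) is lower than any individual metric’s pass rate.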

Changes To INP Measurements

Chrome changed how Interaction to Next Paint (INP), the metric for how long it takes a page to respond to user interactions, is measured, making it more accurate. This may have helped increase the scores of some sites that were inadvertently scored lower for INP because the metric failed to account for some kinds of popups.

The Chrome team explained:

“The Chrome team has been continuing work on improving efficiencies in Chrome’s handling of the Core Web Vitals metrics and recently launched some changes to INP which may have contributed to the positive trend this month. The most notable change is to better handle use of the basic modal dialogs (alert, confirm, print). While technically these are synchronous and block the main thread—and so are not recommended if there are alternatives—they do present user feedback for an interaction. They were previously not counted as presentation feedback for INP, which could result in very high INP values for sites that did use these. From Chrome 127 the presentation of the modal will mark the end measurement time for INP and so should lead to improved INP times for those sites.”

Read the June 2024 CWV Announcement

The 202406 dataset is live

Featured Image by Shutterstock/Ivan Dudka

The Reason Why Google Uses The Word “Creators” via @sejournal, @martinibuster

Google’s SearchLiaison responded to criticism over how they refer to website publishers with an answer that reflects not just changing times but also the practical reasons for doing so. The answer reflects how important it is for digital marketing to maintain the flexibility to bend with change.

Change: There Isn’t Always A Motivation

The discussion began with a tweet by someone who objected to the use of the phrase “creators” instead of other terms like businesses or publishers because the word creators minimizes the fact that there are businesses behind the websites.

This is the tweet:

“Notice the term “creators” in this piece. This is an example of Google’s successful effort to change the narrative. In the past they have used “publishers”, “businesses”, and just “web sites”. But “creators” minimizes business impact. And clearly some are falling for the trap.”

Keeping Up With The Pace Of Change

SearchLiaison’s response reflected something that is commonly misunderstood: everything changes, including fashion, customs, norms, and even speech. Those who lack self-awareness on this point will blink and miss it when the page turns on their generation and another one steps forward to take their place at the center of the world.

This is especially true for SEO, where Google typically is a brand new search engine every five years.

This is SearchLiaison’s answer:

“We used to say “webmasters” in the past, and that doesn’t really speak to so many people who have an interest in appearing in search results. That’s in part why we have tended to say “creators” more — though not exclusively — for years now. It’s not a particularly new thing. It’s also why Search Central got its new name in 2020, the whole “webmasters” isn’t really that inclusive (or used) term: https://developers.google.com/search/blog/2020/11/goodbye-google-webmasters

“Publishers” tends to be heard by and used by those involved in news publishing. Businesses often just think of themselves as businesses. SEOs tend to be SEOs, and if you use that term, you exclude those who don’t think of SEOs but want to understand some of the things we share.

So “creators” tends to be the catch-all term we used, as imperfect as it is, because sometime you really need one term rather than “Here’s what creators and SEO and businesses and brands and news publishers and etc etc should know about something….”

All that said, I am seeing more of a need to use creators as less a catch-all and more to refer to people like Brandon who really do view themselves as content creators first-and-foremost. The work they do can be much different than an SEO, or a content marketer, or a local business and so on.”

And in a follow up he continued:

“We do say web sites when talking about web sites. But “web sites” isn’t a term that’s workable when addressing the people who are involved with web sites and have questions about their content appearing in search results.”

Ephemeral Quality Of Digital Marketing

It’s not just Google that changes, people change as well. Demand for certain products peak and then disappear. Ringtones used to be the hot affiliate product and then it was not. Technology drives change as well, as we’re currently seeing with AI.

Google’s choice of the word creators is a small marker of change. You can roll with it or simply roll your own.

Featured Image by Shutterstock/Mix and Match Studio

Google’s Indifference To Site Publishers Explained via @sejournal, @martinibuster

An interview with Google’s SearchLiaison offered hope that quality sites hit by Google’s algorithms may soon see their traffic levels bounce back. But that interview and a recent Google podcast reveal deeper issues that may explain why Google seems indifferent to publishers with every update.

The interview by Brandon Saltalamacchia comes against the backdrop of many websites having lost traffic to Google’s recent algorithm updates, a situation in which Google feels its algorithms are generally working fine for users while many website publishers insist that no, they are not working fine.

Search ranking updates are just one reason publishers are hurting. Google’s decision to send more traffic to Reddit is also impacting website owners; it’s a fact that Reddit traffic is surging. Another issue bedeviling publishers is AI Overviews, where Google’s AI summarizes answers derived from websites so that users no longer have to visit a website to get their answers.

Those changes are driven by a desire to increase user satisfaction. The problem is that website publishers have been left out of the equation that determines whether the algorithm is working as it should.

Google Historically Doesn’t Focus On Publishers

A remark by Gary Illyes in a recent Search Off The Record indicated that, in Gary’s opinion, Google is all about the user experience, because if search is good for the user, that will trickle down to publishers and be good for them too.

In the context of Gary explaining whether Google will announce that something is broken in search, Gary emphasized that search relations is focused on the search users and not the publishers who may be suffering from whatever is broken.

John Mueller asked:

“So, is the focus more on what users would see or what site owners would see? Because, as a Search Relations team, we would focus more on site owners. But it sounds like you’re saying, for these issues, we would look at what users would experience.”

Gary Illyes answered:

“So it’s Search Relations, not Site Owners Relations, from Search perspective.”

Google’s Indifference To Publishers

Google’s focus on satisfying search users can, in practice, turn into indifference toward publishers. If you read the Google patents and research papers related to information retrieval (search technology), one thing becomes apparent: the measure of success is always about the users. The impact on site publishers is consistently ignored. That’s why Google Search is perceived as indifferent to site publishers: publishers have never been part of the search satisfaction equation.

This is something that publishers and Google may not have wrapped their minds around just yet.

Later in the Search Off The Record podcast, the Googlers specifically discussed how an update is deemed to be working well even if a (relatively) small number of publishers complain that Google Search is broken, because what matters is whether Google believes it is doing the right thing from its own perspective.

John said:

“…Sometimes we get feedback after big ranking updates, like core updates, where people are like, “Oh, everything is broken.”

At the 12:06 minute mark of the podcast Gary made light of that kind of feedback:

“Do we? We get feedback like that?”

Mueller responded:

“Well, yeah.”

Then Mueller completed his thought:

“I feel bad for them. I kind of understand that. I think those are the kind of situations where we would look at the examples and be like, “Oh, I see some sites are unhappy with this, but overall we’re doing the right thing from our perspective.”

And Gary responded:

“Right.”

And John asks:

“And then we wouldn’t see it as an issue, right?”

Gary affirmed that Google wouldn’t see it as an issue if a legit publisher loses traffic when overall the algorithm is working as they feel it should.

“Yeah.”

It is precisely that shrugging indifference that concerns website publisher Brandon Saltalamacchia, who discussed it with SearchLiaison in a recent blog post.

Lots of Questions

SearchLiaison asked many questions about how Google could better support content creators, which is notable because Google has a long history of focusing on its user experience while seemingly not also considering the impact on businesses with an online presence.

That’s a good sign from SearchLiaison but not entirely a surprise because unlike most Googlers, SearchLiaison (aka Danny Sullivan) has decades of experience as a publisher so he knows what it’s like on our side of the search box.

It will be interesting to see whether SearchLiaison’s concern for publishers makes it back to Google in a profound enough way to foster a better understanding that the search ecosystem is greater than Google’s users and encompasses website publishers, too. Algorithm updates should be about more than how they impact users; they should also account for how they impact publishers.

Hope For Sites That Lost Traffic

Perhaps the most important news from the interview is that SearchLiaison expressed that there may be changes coming over the next few months that will benefit the publishers who have lost rankings over the past few months of updates.

Brandon wrote:

“One main take away from my conversation with Danny is that he did say to hang on, to keep doing what we are doing and that he’s hopeful that those of us building great websites will see some signs of recovery over the coming months.”

Yet despite those promises from Danny, Brandon didn’t come away with hope.

Brandon wrote:

“I got the sense things won’t change fast, nor anytime soon. “

Read the entire interview:

A Brief Meeting With Google After The Apocalypse

Listen to the Search Off The Record Podcast

Featured Image by Shutterstock/Roman Samborskyi

Google Says There’s No Way To Block Content From Discover Feed via @sejournal, @MattGSouthern

Google officials confirmed on X (formerly Twitter) that there’s no way to block content from appearing in Google Discover, despite the ability to do so for Google News.

The conversation was initiated by Lily Ray, who raised concerns about a common challenge where certain content may not be suitable for Google News or Discover but performs well in organic search results.

Ray states:

“We have experienced many situations with publisher clients where it would be helpful to prevent some content from being crawled/indexed specifically for Google News & Discover.

However, this content often performs incredibly well in organic search, so it’s not a good idea to noindex it across the board.

This content often falls in the grey area of what is forbidden in Google’s News & Discover guidelines, but still drives massive SEO traffic. We have noticed that having too much of this content appears to be detrimental to Discover performance over time.

Outside of your guidelines for SafeSearch – has Google considered a mechanism to prevent individual pages from being considered for News/Discover?”

Google’s Response

In response to Ray’s question, Google’s Search Liaison pointed to existing methods for blocking content from Google News.

However, upon checking with John Mueller of Google’s Search Relations team, the Liaison confirmed these methods don’t extend to Google Discover.

The Search Liaison stated:

“John [Mueller] and I pinged, and we’re pretty sure there’s not an option to just block content from Discover.”

Recognizing the potential value of such a feature, he added:

“That would seem useful, so we’ll pass it on.”
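For reference, the News-side control the Liaison pointed to is the `Googlebot-News` token documented in Google’s robots meta tag documentation. Per this exchange, no equivalent token exists for Discover:

```html
<!-- Documented robots meta tag that keeps a page out of Google News
     while leaving regular search indexing unaffected. There is no
     comparable tag for Google Discover, per the exchange above. -->
<meta name="Googlebot-News" content="noindex">
```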

What Does This Mean?

This admission from Google highlights a gap in publishers’ ability to control how Google crawls their content.

While tools exist to manage content crawling for Google News and organic search results, the lack of similar controls for Discover presents a challenge.

Google’s Search Liaison suggests there’s potential for more granular controls, though there are no immediate plans to introduce content blocking features for Discover.


Featured Image: Informa Plus/Shutterstock

Google Renders All Pages For Search, Including JavaScript-Heavy Sites via @sejournal, @MattGSouthern

In a recent episode of Google’s “Search Off The Record” podcast, Zoe Clifford from the rendering team joined Martin Splitt and John Mueller from Search Relations to discuss how Google handles JavaScript-heavy websites.

Google affirms that it renders all websites in its search results, even if those sites rely on JavaScript.

Rendering Process Explained

In the context of Google Search, Clifford explained that rendering involves using a headless browser to process web pages.

This allows Google to index the content as a user would see it after JavaScript has executed and the page has fully loaded.

Clifford stated

“We run a browser in the indexing pipeline so we can index the view of the web page as a user would see it after it has loaded and JavaScript has executed.”

All HTML Pages Rendered

One of the podcast’s most significant revelations was that Google renders all HTML pages, not just a select few. Despite the resource-intensive process, Google has committed to this approach to ensure comprehensive indexing.

Clifford confirmed:

“We just render all of them, as long as they’re HTML and not other content types like PDFs.”

She acknowledged that while the process is expensive, accessing the full content of web pages, especially those relying heavily on JavaScript, is necessary.

Continuous Browser Updates

The team also discussed Google’s shift to using the “Evergreen Googlebot” in 2019.

This update ensures that Googlebot, Google’s web crawling bot, stays current with the latest stable version of Chrome.

This change has improved Google’s ability to render and index modern websites.

What This Means for Website Owners & Developers

  1. Good news for JavaScript: If your website uses a lot of JavaScript, Google will likely understand it.
  2. Speed still matters: Although Google can handle JavaScript better, having a fast-loading website is still important.
  3. Keep it simple when you can: While it’s okay to use JavaScript, try not to overdo it. Simpler websites are often easier for both Google and visitors to understand.
  4. Check your work: Use Google’s free tools, like the URL Inspection tool in Search Console, to confirm that search crawlers can render your site. (The older Fetch As Google tool has been retired.)
  5. Think about all users: Remember that some people might have slow internet or older devices. Ensure your main content works even if JavaScript doesn’t load perfectly.
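Points 3 and 5 above amount to progressive enhancement: ship meaningful content in the initial HTML and layer JavaScript on top. A minimal illustration (the markup and IDs are hypothetical):

```html
<!-- Core content is server-rendered and present before any script runs,
     so the page still works if JavaScript fails to load or execute. -->
<article id="post">
  <h1>Server-rendered headline</h1>
  <p>The main content does not depend on JavaScript.</p>
</article>
<script>
  // Optional enhancement layered on top of the base content.
  document.querySelector('#post')
    .insertAdjacentHTML('beforeend', '<p>Extras added by script.</p>');
</script>
```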

Wrapping Up

Google’s ability to handle JavaScript-heavy websites gives developers more freedom. However, it’s still smart to focus on creating fast, easy-to-use websites that work well for everyone.

By keeping these points in mind, you can keep your website in good shape for both Google and your visitors.

Listen to the full podcast episode below:

Google’s Now Translating SERPs Into More Languages via @sejournal, @martinibuster

Google updated their documentation to reflect that eight new languages have been added to its translated results feature, broadening publishers’ reach to an increasingly global audience with automatic translations into a site visitor’s native language.

Google Translated Results

Translated Results is a Google Search feature that automatically translates the title link and meta description into a user’s local language, making a website published in one language available to a searcher in another. If the searcher clicks the link of a translated result, the web page itself is also automatically translated.

According to Google’s documentation for this feature:

“Google doesn’t host any translated pages. Opening a page through a translated result is no different than opening the original search result through Google Translate or using Chrome in-browser translation. This means that JavaScript on the page is usually supported, as well as embedded images and other page features.”

This feature benefits publishers because it makes their website available to a larger audience.

Search Feature Available In More Languages

Google’s documentation for this feature was updated to reflect that it is now available in eight more languages.

Users who speak the following languages will now have automatic access to a broader range of websites.

List Of Added Languages

  • Arabic
  • Gujarati
  • Korean
  • Persian
  • Thai
  • Turkish
  • Urdu
  • Vietnamese

Why Did It Take So Long?

It seems odd that Google didn’t already translate results into major languages like Turkish, Arabic, or Korean. So I asked international SEO expert Christopher Shin (LinkedIn profile) why it might have taken Google so long to add support for Korean.

Christopher shared:

“Google was always facing difficulties in the South Korean market as a search engine, and that has to do mainly with Naver and Kakao, formerly known as Daum.

But the whole paradigm shift to Google began when more and more students that went abroad to where Google is the dominant search engine came back to South Korea. When more and more students, travelers abroad etc., returned to Korea, they started to realize the strengths and weaknesses of the local search portals and the information capabilities these local portals provided. Laterally, more and more businesses in South Korea like Samsung, Hyundai etc., started to also shift marketing and sales to global markets, so the importance of Google as a tool for companies was also becoming more important with the domestic population.

Naver is still the dominant search portal, but not to retrieve answers to specific queries, rather for the purpose of shopping, reviews etc.

So I believe that market prioritization may be a big part as to the delayed introduction of Translated Google Search Results. And in terms of numbers, Korea is smaller with only roughly 52M nationwide and continues to decline due to poor birth rates.

Another big factor as I see it, has to do with the complexity of the Korean language which would make it more challenging to build out a translation tool that only replicates a simple English version. We use the modern Korean Hangeul but also the country uses Hanja, which are words from the Chinese origin. I used to have my team use Google Translate until all of them complained that Naver’s Papago does a better job, but with the introduction of ChatGPT, the competitiveness offered by Google was slim.”

Takeaway

It’s no overstatement to say that 2024 has not been a good year for publishers. From the introduction of AI Overviews to the 2024 Core Algorithm Update and missing image thumbnails on recipe blogger sites, there hasn’t been much good news coming out of Google. But this news is different because it creates the opportunity for publisher content to be shown in more languages than ever.

Read the updated documentation here:

Translated results in Google Search

Featured Image by Shutterstock/baranq