Google Brings AI Ad Image Editing To Search, Display, & More via @sejournal, @MattGSouthern

Google expands AI-powered ad image editing to more campaigns, enhancing creative capabilities for advertisers across its platform.

  • AI-powered image editing is expanding to Search, Display, App, and Demand Gen campaigns.
  • Google’s AI campaign builder is expanding beyond English-speaking markets.
  • Google is balancing AI automation with more granular advertiser controls.
Google Ads Expands AI Campaign Tools To More Languages via @sejournal, @MattGSouthern

Google expands AI search campaign tools to new languages, adds creative capabilities and advertiser controls to optimize performance.

  • Google is rolling out its AI search campaign building tool to German, French, and Spanish.
  • Advertisers get more AI-powered creative tools and customization options across campaigns.
  • New advertiser controls include negative keywords for Performance Max and omnichannel bidding.
Content Decay And Refresh Strategies To Maintain Site Relevancy via @sejournal, @ronlieback

Before I launched my agency, I worked for several others and noticed a troubling trend.

Many focused solely on creating new on-site content for their clients, often neglecting older posts and pages. This was especially common with blogs at a time when the trend was to prioritize quantity over quality.

The situation always reminded me of the “pump-and-dump” strategies in the stock market – short-term mindsets that sometimes result in wins and sometimes in massive losses.

I knew this approach was flawed and ended in what I call “content decay.” When I launched my agency in 2017, I focused on refreshing older content as much as creating new content.

The results immediately impressed – and continued to impress.

For example, earlier this year, one of our commercial pest control clients had an underperforming blog post that was created by a previous agency. The content was decent but lacked many on-page SEO elements, especially header tags and internal links (two were actually dead!).

We updated internal links and all other on-page SEO elements and rewrote around 30% of the content. That single blog post jumped to the top position for target keywords in the target location within six weeks.

After amplifying it on social media, which naturally attracted other shares, quality links, and a Google Business Profile, we were able to attribute nearly $100,000 in new revenue to that one piece of content.

This experience convinced me that content decay is a serious problem for many businesses and needs to be addressed ASAP. This issue also inspired me to restructure our service offerings, making content refresh a core service for our clients.

What Is Content Decay?

Content decay happens when a webpage experiences a gradual decline in traffic over time. This can be due to several factors.

Search engine algorithms are constantly updating, and what worked a year ago may not work today.

New competitors are constantly popping up, creating newer content that may be more aligned with current audience preferences. Additionally, your content may simply become stale.

This problem has worsened with the rise of AI-generated content. Many brands use AI to churn out as much content as possible without a content strategy to keep it fresh and relevant.

With the right refresh strategies, you can combat content decay and ensure your content remains relevant long after you hit “publish.”

Recognizing The Signs Of Content Decay

First, you need to be able to identify content decay before you can fix it.

Pay attention to your engagement metrics and watch for these signs of decaying content:

  • Decrease in organic traffic to that page/post.
  • Lower overall search engine rankings.
  • Outdated information.
  • High bounce rate.
  • Low average time on page.
  • Fewer social shares.
  • Negative user feedback.

Content Decay Strategies That Will Revitalize Your Content

So how do you combat content decay and improve user experience?

Here are a few content decay strategies to revitalize your content and keep it performing well.

Conduct Regular Content Audits

Periodic content audits help you identify underperforming pages or those needing an update.

Tools like Google Analytics, Google Search Console, Semrush, and Ahrefs track page performance and pinpoint content that would benefit from refreshing.

This will improve your content marketing strategy and boost your online presence. When conducting a content audit, I recommend focusing on key metrics like the following (one way to pull this data automatically is sketched after the list):

  • Organic traffic.
  • Bounce rate.
  • Conversion rate.
  • Time on page.
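
If you want to automate part of this audit, below is a minimal sketch that uses the Search Console API to flag pages whose organic clicks are declining period over period. It assumes the google-api-python-client and google-auth packages and a service account with access to your property; the site URL, date ranges, and 30% decline threshold are illustrative.

```python
# Minimal sketch: pull page-level clicks from the Google Search Console
# API to flag posts whose organic traffic is declining.
# Assumptions: google-api-python-client and google-auth are installed,
# and "service-account.json" is a key with access to the (hypothetical)
# property below.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
SITE = "https://www.example.com/"  # hypothetical property URL

creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES
)
service = build("searchconsole", "v1", credentials=creds)

def clicks_by_page(start_date: str, end_date: str) -> dict:
    """Return {page_url: clicks} for the given date range."""
    response = service.searchanalytics().query(
        siteUrl=SITE,
        body={
            "startDate": start_date,
            "endDate": end_date,
            "dimensions": ["page"],
            "rowLimit": 5000,
        },
    ).execute()
    return {row["keys"][0]: row["clicks"] for row in response.get("rows", [])}

recent = clicks_by_page("2024-06-01", "2024-08-31")
previous = clicks_by_page("2024-03-01", "2024-05-31")

# Flag pages that lost more than 30% of their clicks period over period
# (the 50-click floor filters out noisy, low-traffic pages).
for page, old_clicks in previous.items():
    new_clicks = recent.get(page, 0)
    if old_clicks >= 50 and new_clicks < old_clicks * 0.7:
        print(f"Possible decay: {page} ({old_clicks} -> {new_clicks} clicks)")
```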

Update And Refresh Your Existing Content

Remember, you don’t just have to create new content. Sometimes, refreshing older content is a better use of your time and resources – and when combined with consistently publishing new content, it maximizes your potential results.

If you have content that is performing well but could use some tuning, simply update it slightly and republish it with a new date. Content updating doesn’t have to be a daunting task.

Focus on making a few key changes that will make a big difference.

Content updating can be as simple as adding a few sentences or as complex as rewriting entire sections or refreshing internal links that point to better-performing pages (and making sure those better-performing pages also point back!).

No matter the approach, be sure to let Google and other search engines know that you’ve updated your content.

This will help them crawl and index your content more quickly. Here are a few specific content update ideas that reinforce why you or your agency must stay educated on the latest developments – weekly, I’d argue, given how fast industries change these days:

  • Update outdated statistics.
  • Add new information based on the latest research and developments in your field.
  • Cut the fluff and use shorter sentences and paragraphs to improve the content’s readability and open up “psychological space” so readers can digest it more easily.
  • Add more visuals to your content, like images, videos, and infographics. Regarding videos, we constantly try to get company leaders to produce a short video discussing the focus of a blog or service page. The goal is to upload that to YouTube and link back to the article, then embed the video in the actual article itself. This helps in numerous ways, keeping people engaged and helping them become brand loyalists quickly.
  • Ensure your content is optimized for current SEO best practices. This includes using relevant keywords throughout your content and ensuring your website is mobile-friendly.
  • Check for and fix broken links. Broken links can frustrate users and hurt your search engine rankings (a simple automated check is sketched after this list).
  • Make sure your content is still relevant to your target audience. Your target audience may change over time, and your content needs to reflect that.
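
On the broken-links point specifically, here is a minimal sketch of an automated check for a single page. It assumes the requests and beautifulsoup4 packages; the page URL is hypothetical, and a real audit would loop this over every indexed URL.

```python
# Minimal sketch: fetch one page and report broken links.
# Assumptions: requests and beautifulsoup4 are installed; the page URL
# is hypothetical.
import requests
from urllib.parse import urljoin
from bs4 import BeautifulSoup

PAGE = "https://www.example.com/blog/some-older-post/"

html = requests.get(PAGE, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

for a in soup.find_all("a", href=True):
    url = urljoin(PAGE, a["href"])
    if not url.startswith("http"):
        continue  # skip mailto:, tel:, and fragment-only links
    try:
        # HEAD keeps the check lightweight; some servers only answer GET.
        status = requests.head(url, allow_redirects=True, timeout=10).status_code
    except requests.RequestException:
        status = None
    if status is None or status >= 400:
        print(f"Broken link: {url} (status: {status})")
```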

Repurpose Outdated Content

Instead of letting older pieces of content gather dust in your archives, give them new life by repurposing them into other formats. This is a great content strategy for getting more mileage from your existing content.

For example, you could turn a blog post into a video, infographic, or even a podcast episode.

When you repurpose content, you make the most of your existing material while also reaching a wider audience – an effective way to breathe new life into older pieces.

Content Format Repurposing Ideas

  • Blog post: Create an infographic, video, or social media post based on the information. Turn it into a downloadable checklist, template, or worksheet.
  • Infographic: Break it down into smaller, individual visuals for social media. Expand on each point in a series of blog posts or email newsletters.
  • Video: Transcribe the video into a blog post or create short, shareable clips for social media. Extract the audio and create a podcast episode.
  • Podcast episode: Transcribe the episode and turn it into a blog post, or create short, shareable audiograms for social media. Pull out key quotes and create social media graphics.

Sunset Content That’s Past Its Prime

It’s a good rule of thumb to keep high-performing content for as long as possible. However, not all content is worth saving. Content sunsetting is the practice of removing outdated or irrelevant content from your website.

Not all content needs to be updated. If you have a piece of content that’s factually incorrect or no longer relevant to your target audience, it’s usually best to remove it entirely.

However, you can also choose to redirect that URL to a more relevant page on your site rather than deleting it completely.

Make Use Of User Feedback

User feedback can be incredibly valuable when it comes to identifying content decay.

You can gain valuable insights by using tools like Google Analytics and your Search Console, but don’t stop there. Use comments and social media to your advantage, too.

See what people are saying (or not saying) about your content. What resonates with them? What falls flat? This feedback is like gold when figuring out what content to update and refresh.

Consider sending out surveys to your audience, asking what topics they’d like to see covered or what content they find most helpful.

Create A Content Review Schedule

The best way to stay on top of your content refresh efforts is to create a content review schedule and stick to it. Life gets busy, and a schedule will ensure that your content remains relevant and engaging and doesn’t get lost in the shuffle.

For example, you could review all of your website content every quarter and flag any that needs updating. This ensures that you never let a piece of content go stale.

My agency monitors individual pages/posts weekly. We overhaul older pieces on different timelines depending on the size of the website, from clients producing 25 new pieces of content monthly down to those producing three.

For example, for our large website campaign clients with 200+ pages/posts, we overhaul around five of them monthly. For smaller websites, pages/posts are overhauled quarterly.

Regularly Review Your Content, And Make It A Priority

Content decay is a real problem for websites of all sizes.

By implementing these content decay strategies, you can breathe new life into your old content. You’ll make it more relevant to your audience.

Not only that, but you will also improve your search engine rankings and boost traffic to your site. Regularly review your content, and make it a priority to keep things fresh, updated, and engaging.

Featured Image: Vitalii Vodolazskyi/Shutterstock

Here’s what I made of Snap’s new augmented-reality Spectacles

Before I get to Snap’s new Spectacles, a confession: I have a long history of putting goofy new things on my face and liking it. Back in 2011, I tried on Sony’s head-mounted 3D glasses and, apparently, enjoyed them. Sort of. At the beginning of 2013, I was enamored with a Kickstarter project I saw at CES called Oculus Rift. I then spent the better part of the year with Google’s ridiculous Glass on my face and thought it was the future. Microsoft HoloLens? Loved it. Google Cardboard? Totally normal. Apple Vision Pro? A breakthrough, baby. 

Anyway. Snap announced a new version of its Spectacles today. These are AR glasses that could finally deliver on the promises devices like Magic Leap, or HoloLens, or even Google Glass, made many years ago. I got to try them out a couple of weeks ago. They are pretty great! (But also: See above)

These fifth-generation Spectacles can display visual information and applications directly on their see-through lenses, making objects appear as if they are in the real world. The interface is powered by the company’s new operating system, Snap OS. Unlike typical VR headsets or spatial computing devices, these augmented-reality (AR) lenses don’t obscure your vision and re-create it with cameras. There is no screen covering your field of view. Instead, images appear to float and exist in three dimensions in the world around you, hovering in the air or resting on tables and floors.

Snap CTO Bobby Murphy described the intended result to MIT Technology Review as “computing overlaid on the world that enhances our experience of the people in the places that are around us, rather than isolating us or taking us out of that experience.” 

In my demo, I was able to stack Lego pieces on a table, smack an AR golf ball into a hole across the room (at least a triple bogey), paint flowers and vines across the ceilings and walls using my hands, and ask questions about the objects I was looking at and receive answers from Snap’s virtual AI chatbot. There was even a little purple virtual doglike creature from Niantic, a Peridot, that followed me around the room and outside onto a balcony. 

But look up from the table and you see a normal room. The golf ball is on the floor, not a virtual golf course. The Peridot perches on a real balcony railing. Crucially, this means you can maintain contact—including eye contact—with the people around you in the room. 

To accomplish all this, Snap packed a lot of tech into the frames. There are two processors embedded inside, so all the compute happens in the glasses themselves. Cooling chambers in the sides did an effective job of dissipating heat in my demo. Four cameras capture the world around you, as well as the movement of your hands for gesture tracking. The images are displayed via micro-projectors, similar to those found in pico projectors, that do a nice job of presenting those three-dimensional images right in front of your eyes without requiring a lot of initial setup. It creates a tall, deep field of view—Snap claims it is similar to a 100-inch display at 10 feet—in a relatively small, lightweight device (226 grams). What’s more, they automatically darken when you step outside, so they work well not just in your home but out in the world.

You control all this with a combination of voice and hand gestures, most of which came pretty naturally to me. You can pinch to select objects and drag them around, for example. The AI chatbot could respond to questions posed in natural language (“What’s that ship I see in the distance?”). Some of the interactions require a phone, but for the most part Spectacles are a standalone device. 

They don’t come cheap. Snap isn’t selling the glasses directly to consumers but requires you to agree to at least one year of paying $99 per month for a Spectacles Developer Program account that gives you access to them. I was assured that the company has a very open definition of who can develop for the platform. Snap also announced a new partnership with OpenAI that takes advantage of its multimodal capabilities, which it says will help developers create experiences with real-world context about the things people see or hear (or say).

The author of the post standing outside wearing oversize Snap Spectacles. The photo is a bit goofy
It me.

Having said that, it all worked together impressively well. The three-dimensional objects maintained a sense of permanence in the spaces where you placed them—meaning you can move around and they stay put. The AI assistant correctly identified everything I asked it to. There were some glitches here and there—Lego bricks collapsing into each other, for example—but for the most part this was a solid little device. 

It is not, however, a low-profile one. No one will mistake these for a normal pair of glasses or sunglasses. A colleague described them as beefed-up 3D glasses, which seems about right. They are not the silliest computer I have put on my face, but they didn’t exactly make me feel like a cool guy, either. Here’s a photo of me trying them out. Draw your own conclusions.

Q&A: Joe Natoli, Author of ‘UX Team of One’

Joe Natoli is the co-author of “The User Experience Team of One, Second Edition,” a seminal book to help smaller ecommerce businesses improve customer experience by “doing more with less.”

I asked Natoli, the founder of Give Good UX and a 30-year user-experience consultant, what’s changed in the decade since the book’s acclaimed first edition.

Joe Natoli: A lot — not just in UX, but in business as a whole. Customer expectations across the web have changed. The way we buy products has changed radically. Part of being in any business is the constant necessity to upgrade to meet customer wants, needs, and expectations — everything to do with user experience. If you’re not getting the desired results, there’s a reason. You need to find it.

Joe Natoli

In this second edition, we addressed key questions: What do people want? Why do they want it? What should happen here? How do we figure out what’s going to move the needle?

Jean Gazis: How do merchants stay current amid nonstop change?

Natoli: It boils down to audience expectations. People want to buy things in certain ways. There’s no controlling that. When your competitors are there already, you have to get there yesterday. The methods in the book help do that much faster than traditional UX processes.

The time you have to work with determines what you do. If you can carve out a day to talk to customers, do it. But you have to build the functionality. You have to design things that are easily rolled back. Roll it out, test it, watch it. The minute it looks like a bad decision, go back to where you were.

Gazis: Another trade-off is researching ahead of time and testing after.

Natoli: It’s a question of the situation. There are instances where research is unnecessary — for example, a low-risk change that’s quick and does not risk alienating customers. Just put it out there and watch what happens.

If it’s a major change, such as another step to the checkout flow, where shoppers have to validate their information or log in before they can buy, that’s a different story. Research that upfront because it’s high-risk and could halt your sales. But the research doesn’t have to be lengthy.

I tell teams to take what they can get. If you’ve got a day, it’s a day. Something is always better than nothing. Some of the methods in the book are for internal use. If a merchant doesn’t have time for research, that’s fine. Just put yourself in the customer’s place and run through the process.

Gazis: How do you measure the value of UX for ecommerce?

Natoli: There is no excuse for not having basic analytics in place. It’s dead simple — one line of code on every page. Merchants must understand what they’re measuring and have a tool to do it.

It’s easy to assume that everybody knows what they should be asking. I don’t think that’s the case. In the book we try to walk through the process: “What questions do I ask? Where do I start? How do I find these things out?” It’s about thinking before deciding. Figure out what is worth doing and what to avoid. I’ve seen countless ecommerce sites ruin their checkout, believe it or not.

Gazis: What are the critical UX aspects for ecommerce?

The User Experience Team of One, Second Edition

Natoli: Merchants have to remove every element of friction. You have impatient shoppers looking to buy a product. Their wallets are out, and they’re thinking, “The minute I find this, I’m going to buy it.” Your content should reflect “here’s what’s in it for you.”

You can’t just make claims. You need to show people what they’re getting. So the UX of an ecommerce site has to prove why a product is worthwhile right now. Answer prominently: “Why is this worth my money? Why is it worth my time? How is it useful and valuable to me?”

That’s what I mean by friction. A checkout process has friction if it runs counter to conventional expectations of what happens first, what happens next, how much information you’re asking for, and when you’re asking. Any time the checkout contains something unexpected, that’s friction. Shoppers’ brains are used to a pattern. It’s habit and reflex. The minute something breaks that pattern, it’s a moment of doubt.

Ease of use separates one ecommerce site from another. How easily can people do business with you? Not investing time and money in the user experience is the most short-sighted thing I can think of.

15 AI WordPress Plugins for SEO, Content, Code, More

With approximately 60,000 free plugins, WordPress is a content management system with near-endless customization. The recent wave of development in artificial intelligence has produced new plugins for creating and managing a WordPress site.

Here is a list of AI-powered WordPress plugins for building a website, generating content, designing forms, managing search engine optimization, and helping customers.

AI Plugins for WordPress

Jetpack from Automattic includes tools for security, performance, marketing, and design. Jetpack AI Assistant provides an intuitive chat interface. Provide a prompt, and Jetpack AI Assistant generates compelling blog posts, detailed pages, structured lists, and comprehensive tables. Adjust the content tone to your audience. Use the AI assistant to translate text into numerous languages. Jetpack’s Write Brief With AI is an AI-powered writing tool for the WordPress block editor.

Web page of Jetpack

Jetpack

Uncanny Automator connects plugins and automates a WordPress site without code. Connect posts and site activity to OpenAI, with support for ChatGPT and Dall-E models. Use Automator and OpenAI to generate email campaigns, featured images, social media posts, translations, forms, search-optimized descriptions, and more.

Tidio is a customer experience plugin with live chat, chatbots, and multichannel communication. Automatically answer most visitor questions with the Lyro AI chatbot. Lyro quickly learns from your FAQs and formulates complex answers to solve problems. The AI stays within the boundaries of your knowledge base and is easily updated.

Bertha AI is a content assistant for generative AI on WordPress and Chrome. Built on multiple large language models, it generates conversion-focused images, titles, product descriptions, and blog posts that are SEO optimized. Bertha AI works with major page builders and WordPress SEO tools.

Web page of Bertha AI

Bertha AI

Divi AI is a website builder that can create pages with a simple prompt. Use AI to generate layouts, write content and code, and generate images. Write and improve text-based content on demand. Generate content, then jump into Divi’s visual builder to make edits. Divi AI analyzes existing content and website details to recommend the content you should add next.

All in One SEO is a plugin and marketing toolkit for optimizing settings, adding local info, tracking keyword rankings, performing SEO audits, automating internal linking, and more. Generate SEO titles and meta descriptions via ChatGPT.

Elementor is a website builder for WordPress. It features a drag-and-drop editor, template library, advanced design tools, and 40-plus free widgets to customize your design. It also features native AI integration to amplify the design and content creation process. Instantly create sections, text, code, and images, or reference layouts from other websites.

Web page of Elementor

Elementor.

Starter Templates is a website builder that integrates with ZipWP, an AI website creation platform. Select the type of website you’re creating and provide the business name and description. Then use AI to refine your description and produce the design, copy, and visuals to match your brand.

Formidable Forms creates surveys, polls, quizzes, calculators, and forms for registration, payment, leads, and email signups. Utilize ChatGPT to automate form submission responses and engage users with advanced date pickers and dynamic field relationships for a smart and intuitive user experience. Use AI to answer questions, recommend products, and offer services.

SEOPress optimizes a WordPress site for organic search rankings. Use it to build custom HTML and XML sitemaps, add structured data, create optimized breadcrumbs, and improve social sharing. Use AI to generate metadata and alternative texts for image files. Bulk actions are supported.

GetGenie AI is an AI-powered content and SEO assistant. Use it to generate search-optimized text, keywords, images, and analysis. Perform natural-language-processing keyword research, run head-to-head analysis, and develop content. GetGenie leverages GPT-3.5, GPT-4, DaVinci, and advanced AI algorithms.

Web page of GetGenie AI

GetGenie AI

AI Power provides tools to generate content, images, and forms with customizable options. Generate quality, longer articles using OpenAI’s GPT language models, including GPT-3.5 Turbo, GPT-4, and GPT-4o. Improve product descriptions with WooCommerce integration. Convert your text into lifelike speech with ElevenLabs, Google, and OpenAI text-to-speech integrations. Generate large volumes of content instantly with the bulk editor tool.

Yoast SEO is a plugin with comprehensive analysis tools to elevate search engine performance and content readability. Yoast AI provides premium features for higher click-through rates. AI Generate enables users to produce meta descriptions, page titles, and blog and social posts. AI Optimize improves existing content for search engines, featuring SEO analysis with easy dismiss or apply options.

Translate WordPress with GTranslate uses AI (particularly neural machine translation) and deep learning techniques to provide accurate and context-aware human-level translations. The system continuously improves through user interactions as the model adapts to new phrases, slang, and language usage trends. GTranslate has 103 languages and is SEO-compatible.

AI Engine offers tools for translation, suggestions, and SEO. Create your own chatbot, craft content and images, coordinate AI-related work using templates, leverage the AI Copilot for faster work, track statistics and usage, and more.

Web page of AI Engine

AI Engine

Winning At Bidding: Tips For Effective Google Shopping Bid Management via @sejournal, @brookeosmundson

Google Shopping ads can be a powerful revenue driver – but to get the most out of them, you need to master bid management.

Whether you’re an in-house marketer or working at an agency, effective bid management is crucial for scaling success.

However, understanding how to optimize and adjust bids effectively in Google Shopping ads can be challenging, especially with all the different settings and levers that can be pulled!

Google Shopping has come a long way since its inception in 2002 – and up until 2012, Google Shopping was free!

It seems that every year, more is needed to win the bidding war against rising costs in the Google Ads platform and keep brands’ profitability in check.

In this article, we’ll explore the strategies, tools, and best practices that can help you win the bidding war and maximize ROI on your Google Shopping campaigns.

1. Understanding The Google Shopping Auction Model And Its Impact On Bidding

Before we jump into bid management strategies, it’s essential to understand how Google Shopping works behind the scenes. Unlike Search campaigns, Google Shopping doesn’t rely on keywords to trigger ads.

Instead, product listing ads (PLAs) appear based on a combination of your product feed data and the user’s search intent. Google uses a unique auction system, and your bids interact with factors like relevance, user behavior, and other competitors in the space.

The Role Of Quality Score In Google Shopping

Quality Score plays a role in Google Shopping bid management, but a bit differently from Search campaigns.

Factors such as the product feed quality, landing page relevance, and historical campaign performance can influence how often your ads appear and at what cost. Here’s how to ensure you’re optimizing for Quality Score in Shopping:

  • Product Feed Optimization: Ensure that your product titles, descriptions, and attributes are clear and relevant.
  • Accurate Categorization: Place your products in the most appropriate categories for better relevance.
  • Optimized Landing Page: Make sure the page that users land on after clicking the ad is optimized for a better user experience, and don’t forget about mobile!

How Bid Amount Affects Visibility

Higher bids don’t always guarantee visibility, and low bids don’t always exclude you from auctions.

It’s a balance of ensuring your product feed is optimized while bidding strategically based on the product’s potential to convert.

Bidding strategies should reflect the actual performance of your products and overall business goals related to those campaigns.

2. Craft A Strategic Bidding Approach

One of the first decisions you need to make when managing Google Shopping bids is whether to rely on manual or automated bidding.

Both approaches have advantages depending on your business objectives, campaign budget, and the scale of your operations.

  • Manual Bidding: This gives you more control, allowing you to adjust bids based on performance. For example, if you notice that certain products are underperforming, you can reduce their bids to allocate budget to higher-performing products.
  • Automated Bidding: Automated strategies like Maximize Conversion Value or Target ROAS (Return on Ad Spend) use machine learning to adjust your bids dynamically based on real-time auction signals. These can be ideal for large product catalogs or when performance data is inconsistent across different products.

Google has added more automated bidding strategies over the years, making it easier to effectively bid based on your business goals.

However, there is added complexity in choosing between Standard Shopping campaigns and the newer Performance Max campaign type, as each allows for different bid strategies.

If choosing Standard Shopping campaigns, you have the option of these two automated bid strategies:

  • Maximize Clicks: Helps you get as many clicks as possible within your target daily budget.
  • Target ROAS: Helps you maximize conversion value while reaching an average return on ad spend that you choose.

Standard Shopping bid strategies (screenshot from author, August 2024)

If you choose to set up a Performance Max campaign with your product feed linked, you have the option of more bid strategies:

  • Maximize Conversions: Helps generate the most amount of conversions within your daily budget, regardless of conversion value.
  • Maximize Conversion Value: Helps generate the highest conversion value within your daily budget.

Additionally, Performance Max campaigns have the optional “Target ROAS” input to yield a little bit more control over your campaign bid strategy.

Lastly, you now have the option to choose how to bid for acquiring new customers – a very welcome addition to further maximize those ad dollars!

In Google Ads, you can either bid higher for new customers than for existing customers, or bid for new customers only.

Customer acquisition bid strategy in Google Ads (screenshot from author, August 2024)

For Google Shopping campaigns specifically, you may want to choose to bid higher for new customers instead of excluding them altogether, especially if your brand is used to having repeat customers.

This essentially means you’re willing to pay more to get a new customer, knowing they will likely purchase again in the future, leading to incremental revenue.

For higher-ticket items that users may only purchase once every few years, it may be worthwhile to choose ‘bid for new customers only’.

At the end of the day, make sure to choose the customer acquisition strategy that aligns with your business goals.

Segmentation And Granularity In Bidding

A key component of effective bid management is segmenting your campaigns and ad groups properly. This allows for more granular control over bids and enables better performance optimization.

  • Product-Level Bidding: Rather than bidding at the campaign or ad group level, product-level bidding allows you to adjust bids based on each product’s unique performance metrics. Products that generate more conversions or revenue should receive higher bids, while underperforming products can have bids scaled back.
  • Segment by Profit Margin or Price Point: Grouping products based on their profit margins or price points can help you adjust bids based on the product’s value to your business. High-margin products may justify higher bids since they offer better ROI.
  • Seasonality and Time Sensitivity: Adjust bids based on trends in user behavior throughout the year. For instance, products may perform better during certain seasons or promotional events, requiring temporary bid increases.

3. Use Your Own Data & KPIs To Inform Your Bid Management

Knowing which metrics to monitor is critical for making informed bidding decisions. Below are the core KPIs to watch closely (a quick worked example follows the list):

  • Cost Per Conversion (or CPA) and Return On Ad Spend (ROAS): These two metrics provide insights into your campaign’s efficiency. You want to identify which products or campaigns have the highest ROAS and optimize bidding for those.
  • Impression Share and Click-Through Rate (CTR): These metrics can give you a sense of how your bids are affecting visibility. If you’re seeing low impression shares on profitable products, it may be time to increase your bids.
  • Conversion Rate: Analyze conversion rates to identify which products are most likely to turn clicks into sales, then adjust your bids accordingly.
  • Lifetime Value (LTV) and Customer Acquisition Costs (CAC): If your business has repeat purchases, focusing on lifetime value can give you an advantage when bidding on products that may have lower immediate returns but higher long-term value.
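
To make these metrics concrete, here is a minimal sketch that computes each KPI from raw campaign numbers; all figures are hypothetical.

```python
# Minimal sketch: compute the core KPIs above from raw campaign
# numbers. All figures are hypothetical.
spend = 5_000.00              # ad spend for the period
clicks = 8_200
impressions = 410_000
conversions = 164
conversion_value = 21_500.00  # revenue attributed to the campaign

cpa = spend / conversions         # cost per conversion
roas = conversion_value / spend   # return on ad spend
ctr = clicks / impressions        # click-through rate
conversion_rate = conversions / clicks

print(f"CPA: ${cpa:.2f}")                         # CPA: $30.49
print(f"ROAS: {roas:.1f}x ({roas:.0%})")          # ROAS: 4.3x (430%)
print(f"CTR: {ctr:.2%}")                          # CTR: 2.00%
print(f"Conversion rate: {conversion_rate:.2%}")  # Conversion rate: 2.00%
```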

Knowing these KPIs for your business helps shape your bid management strategy and lets you make strategic changes based on how your Google Shopping performance compares to your business’s targets.

For example, if you have an average conversion rate of 4% as a whole, but your Google Shopping campaigns are only providing a 2% conversion rate, that may tell you something needs to be optimized.

You may need to take a look at the keywords your products are showing up for and do some negative keyword management. Additionally, maybe your ads aren’t reaching the right users and you need to further refine audience targeting within your campaigns.

Another example of using your own data is knowing your profit margin for products. If you have a subset of products that have a high-profit margin, you can add a custom label into your product feed that denotes those products.

From there, you can segment your campaigns to give higher priority to those particular products, or choose to bid higher on them because they’re worth more to you and your business. A minimal sketch of that feed labeling follows.
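
As a rough illustration of that workflow, here is a minimal sketch that generates a supplemental feed applying such a custom label; the SKUs and margin threshold are hypothetical, while "id" and "custom_label_0" are standard Merchant Center feed attributes.

```python
# Minimal sketch: build a Merchant Center supplemental feed that tags
# high-margin products with custom_label_0, so they can be split into
# their own campaign or bid tier. SKUs, margins, and the 50% threshold
# are hypothetical.
import csv

# (item id, profit margin) - in practice, pulled from your own data.
products = [
    ("SKU-1001", 0.62),
    ("SKU-1002", 0.18),
    ("SKU-1003", 0.55),
]

HIGH_MARGIN_THRESHOLD = 0.50

with open("supplemental_feed.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["id", "custom_label_0"])
    for item_id, margin in products:
        label = "high-margin" if margin >= HIGH_MARGIN_THRESHOLD else "standard"
        writer.writerow([item_id, label])
```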

Leveraging Bid Simulators And Other Tools

Google also provides several tools that can help inform your bid decisions:

  • Bid Simulators: These help you understand how different bid levels would impact your impression share, clicks, and conversions. You can use this data to adjust your bids in a way that maximizes your return without overspending.
  • Custom Labels: By using custom labels in your product feed, you can segment your products by performance, seasonality, or promotion. This way, you can quickly adjust bids based on these factors.
  • Scripts and Third-Party Tools: Tools like Optmyzr or custom Google Ads scripts can automate bid adjustments based on performance data, allowing you to focus on strategy rather than manual labor.

4. Optimize Your Bid Management For Long-Term Success

Bid management is not a “set it and forget it” exercise. Continual testing, analyzing, and adjusting are necessary to maintain and improve campaign performance over time.

Bid management should also not mean volatile day-to-day changes at the campaign level.

If you’re micromanaging performance each day and changing bid strategies too often, you may end up with sub-optimal performance because you’re not giving Google enough time to learn and optimize based on performance.

It’s about finding a healthy balance between “set and forget” and “over-optimizing”. Going back to point #3, knowing the values of your core business metrics and goals makes it easier to react to performance swings and know when to take action.

Some ways to optimize for long-term success include:

  • A/B Testing on Bids: Running A/B tests on bid adjustments allows you to assess the impact of bid changes without risking your entire budget. Test different bidding strategies on subsets of your campaigns or products to see what delivers the best performance.
  • Seasonal Adjustments: Stay ahead of trends by adjusting your bids before key periods like Black Friday or holiday shopping spikes. Predictive adjustments can help you capture market share before your competitors ramp up.
  • Monitor Competitor Behavior: Keep an eye on your competition. If you notice that competitors are aggressively bidding on certain products, you may need to adjust your strategy to compete, either by raising bids or adjusting product listings.

Prepare For Future Changes In Google Ads

Google Shopping is constantly evolving, and as machine learning models become more sophisticated, the way bids are managed will continue to change.

Staying informed about new features, tools, and best practices will help you maintain a competitive edge. Subscribe to updates, attend industry events, and engage with the community to keep your knowledge fresh.

Summary

Google Shopping bid management requires a balance between data-driven strategies, an understanding of the auction system, and a willingness to experiment and adapt.

By leveraging the tips and strategies discussed in this article, you’ll be better equipped to navigate the complexities of Google Shopping and drive profitable growth for your brand or clients.

Keep refining your approach, test new strategies, and stay on top of Google Shopping developments to stay competitive and ahead of the bidding war.

Featured Image: voronaman/Shutterstock

Page Speed Insights: 6 Powerful Tips To Optimize Your Website via @sejournal, @DebugBear

This post was sponsored by DebugBear. The opinions expressed in this article are the sponsor’s own.

Having a fast website is important not just to provide a great experience for visitors, but also as an SEO ranking factor.

You’ve probably heard of Google’s PageSpeed Insights tool before.

But do you know how to get the most out of PageSpeed Insights? We’ll look at 6 key tips to help you optimize your website performance.

What Is PageSpeed Insights (PSI)?

Website performance has long impacted Google rankings. Accordingly, Google first launched its free PageSpeed Insights tool back in 2010.

PSI is built to help website operators check how fast their website is as well as provide recommendations for how to improve it.

Why Does Page Speed Matter For SEO?

In 2021, Google introduced a new set of website performance metrics, called the Core Web Vitals. The three metrics are:

  • Largest Contentful Paint: how fast does your website load?
  • Cumulative Layout Shift: do page elements move around unexpectedly?
  • Interaction to Next Paint: does the page respond to user input quickly?

A good page experience is rewarded in Google rankings. There’s a “Good” rating threshold for each metric that you need to reach: 2.5 seconds or less for Largest Contentful Paint, 0.1 or less for Cumulative Layout Shift, and 200 milliseconds or less for Interaction to Next Paint.

Graphic showing Core Web Vitals rating thresholds, September 2024

How To Test Your Website With PageSpeed Insights

Running a performance test with PageSpeed Insights is easy:

  1. Open PageSpeed Insights
  2. Enter your website URL
  3. Click “Analyze”

Test results will appear in just a few seconds. There’s a lot of data, but we’ll explain what it all means next.

Screenshot of test result on PageSpeed Insights, September 2024

1. Understand Where PageSpeed Insights Data Comes From

Each test result on PageSpeed Insights consists of two key sections: “Discover what real users are experiencing” and “Diagnose performance issues”. Each section shows a different type of page speed data.

What Is The Real User Data In PageSpeed Insights?

The real user data in PSI comes from the Chrome User Experience Report (CrUX).

This data is collected from Chrome users on desktop devices and on mobile devices running Android. To contribute to the CrUX report, users need to:

  • Be logged into their Google account
  • Have opted into browser history synchronization
  • Have enabled usage statistics reporting

Wondering if your experiences are included in this real user data? Open the chrome://ukm URL in your Chrome browser and check if metrics collection is enabled.

The real user data tells you how fast your website is for actual visitors and how it’s impacting your SEO.

However, the CrUX report also comes with some limitations:

  • Data is always aggregated over a 28-day period, so you won’t immediately see if your website is getting worse
  • You can see how fast your website is, but CrUX does not give you any diagnostic data to help speed it up
  • Not every page on your website will have CrUX data, as a minimum number of recorded visits has to be reached before Google publishes the data.

You can use a real user monitoring (RUM) tool to get around these limitations. RUM data has several advantages over CrUX data, like instant updates and detailed diagnostics.

Screenshot of a Core Web Vitals trendline in DebugBear real user monitoring, September 2024

What Is The Diagnostic Data In PageSpeed Insights?

While the real user data tells you how well your site is doing, the diagnostic data gives you insight into how to optimize it.

PageSpeed Insights uses Google’s open-source Lighthouse tool to test your website and provide a detailed analysis. A Lighthouse test runs in a controlled lab environment, which means a lot more information can be collected than with real user data.

The lab test also runs on demand and is not subject to the 28-day delay that applies to CrUX data.

At the top of the Lighthouse report, Google shows an overall Performance score between 0 and 100. This score does not directly impact rankings – Google uses CrUX data for that. However, a good Lighthouse score usually means that your website is also loading quickly for real users.

The Lighthouse score itself is determined by five performance metrics:

  • First Contentful Paint: how quickly does the page start loading?
  • Largest Contentful Paint: when does the main page content show up?
  • Total Blocking Time: are user interactions blocked by CPU processing?
  • Cumulative Layout Shift: does content move around after it appears?
  • Speed Index: how quickly does the page content render overall?

Screenshot of performance metrics in PageSpeed Insights, September 2024

Below the overall Lighthouse assessment, you can find diagnostic insights that suggest concrete changes you can make to optimize your website.

Each row audits one particular aspect of your performance. For example, if you eliminate render-blocking resources then it will take less time for page content on your website to become visible.

Screenshot of performance diagnostics in PageSpeed Insights, September 2024

2. Use The Score Calculator To See What’s Dragging Your Score Down

If you want to improve your Performance score on PageSpeed Insights, where do you start?

Every Lighthouse report includes a “View Calculator” link that takes you to the Lighthouse Scoring Calculator. This tool shows how much each of the five measured metrics contributes to the overall score.

For example, here we can see that the page we’ve tested has a good Cumulative Layout Shift score, while the Largest Contentful Paint receives a poor rating.

We can also see that each metric is assigned a weight. For example, 30% of the Performance score is determined by the subscore for the Total Blocking Time metric.

Screenshot of the Lighthouse Scoring Calculator, September 2024
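
To illustrate how that weighting works, here is a minimal sketch that combines the five metric subscores into an overall score, using the Lighthouse v10 weights (the 30% Total Blocking Time weight is the one cited above); the subscores themselves are hypothetical.

```python
# Minimal sketch of how the overall Lighthouse Performance score is
# assembled. Weights are those of Lighthouse v10; the subscores are
# hypothetical 0-1 values that the real tool derives from raw metrics
# via log-normal scoring curves.
WEIGHTS = {
    "first-contentful-paint": 0.10,
    "speed-index": 0.10,
    "largest-contentful-paint": 0.25,
    "total-blocking-time": 0.30,
    "cumulative-layout-shift": 0.25,
}

# Hypothetical page: good CLS subscore, poor LCP subscore.
subscores = {
    "first-contentful-paint": 0.85,
    "speed-index": 0.80,
    "largest-contentful-paint": 0.40,
    "total-blocking-time": 0.70,
    "cumulative-layout-shift": 0.95,
}

score = sum(WEIGHTS[metric] * subscores[metric] for metric in WEIGHTS)
print(f"Performance score: {round(score * 100)}")  # Performance score: 71
```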

3. Review Phase Data For The Largest Contentful Paint (LCP) Metric

One of the most insightful audits is often the “Largest Contentful Paint element.”

This audit shows you the largest content element on the page. The LCP metric measures how long it takes after opening the page for this element to become visible. The largest content element can be any type of page content, for example, a heading or an image.

That’s very useful, but Lighthouse actually provides additional insight by breaking the LCP metric down into four phases, also called subparts (a worked example follows the list):

  • Time to First Byte (TTFB): How quickly does the website server provide the HTML document?
  • Load Delay: How soon after the document loads does the LCP image start downloading?
  • Load Time: How long does it take to download the LCP image?
  • Render Delay: How soon after the LCP resource loads does the LCP element become visible?
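
To make the arithmetic concrete, here is a minimal sketch that breaks a hypothetical LCP measurement into these four subparts; all timestamps are illustrative.

```python
# Minimal sketch: break an LCP measurement into the four subparts
# above, using hypothetical millisecond timestamps of the kind a RUM
# tool or the browser Performance API would report.
ttfb = 600          # first byte of the HTML document arrives
image_start = 1400  # LCP image request starts
image_end = 1900    # LCP image finishes downloading
lcp = 2600          # LCP element is rendered

phases = {
    "Time to First Byte": ttfb,
    "Load Delay": image_start - ttfb,      # 800 ms: late discovery
    "Load Time": image_end - image_start,  # 500 ms: the download itself
    "Render Delay": lcp - image_end,       # 700 ms: rendering was blocked
}

for name, duration in phases.items():
    print(f"{name}: {duration} ms ({duration / lcp:.0%} of LCP)")
```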

This information will tell you where to focus your optimization efforts.

For example, in the screenshot below, we can see that the LCP image loaded quickly but then wasn’t rendered right away by the browser. That could be because other resources on the page were blocking the page from rendering.

Screenshot of the Lighthouse Largest Contentful Paint element audit, September 2024

Google recently ran an analysis to find out which LCP subparts contribute the most to the overall metric value. It found that server response time and image load delay are the biggest factors in LCP optimization for most websites.

While many website performance recommendations have focused on using compact modern image formats, image load time was found to be a minor factor on most slow websites.

However, you should still check the data for your website to see what optimizations can have the most impact.

4. Performance Score Variability Between Tests: What Does It Mean?

We’ve already seen that the real user CrUX data is aggregated over a 28-day period. Accordingly, its value is stable and only changes very gradually.

But the same can’t be said about the Performance score and other metrics measured in the lab. Testing the same page twice will rarely result in the exact same measurements, and often will show high variation. And if you run Lighthouse with other tools like Chrome DevTools you’re likely to see even bigger differences.

There are many reasons for differences between Lighthouse tests, for example:

  • Differences in server response time
  • Variation in content, for example due to A/B tests or advertisements
  • Differences across test devices and test locations
  • Inaccuracies during data collection

Google has written a detailed guide on Lighthouse variability. You can run tests several times and look at the average to get a more consistent assessment.

Data Accuracy: Observed Vs Simulated Data

One common reason for discrepancies between page speed testing tools is the way the data is collected. In a lab test the network is throttled to a fixed speed, typically to match a slower mobile data connection. The way this throttling is achieved can impact your measurements.

PageSpeed Insights uses an approach called simulated throttling. Measurements are collected on a fast network connection. After that, a simulation of a slow 4G connection is applied to estimate how the page might have loaded on a mobile device.

You can install the Site Speed Chrome extension to view the original observed metrics when running a test on PageSpeed Insights.

Screenshot of Lighthouse reported and observed metrics, September 2024

Simulated data can sometimes be unreliable, as the Lighthouse simulation doesn’t handle all real life edge cases that can happen when opening a website.

For example, in this test we can see that the Largest Contentful Paint metric is reported as one second worse than the values observed when opening the page in Chrome.

However, the original values for the First Contentful Paint and the Largest Contentful Paint metrics were identical. This suggests that the simulated metrics may not match what real users experience.

You can check the settings section of the Lighthouse report to see if the metrics were measured as reported or if a simulation has been applied.

Screenshot of Lighthouse settings, September 2024

If you want to get reliable page speed data, the free DebugBear page speed test is built to provide the most accurate insight. Collecting real measurements takes a bit longer than running a simulation, but it will also help you make the best decisions when optimizing your website speed.

Why Does The Real User Data Not Match The Lighthouse Test Results?

When testing your website on PageSpeed Insights you’ll often find that the real user metrics are much better than those reported by the synthetic Lighthouse test. Why is that?

That’s because the Lighthouse test uses a very slow network connection. The CrUX Core Web Vitals data looks at the slowest 25% of user experiences on your website, but typically, even those visits come from a device that has a decent network connection.

So, a bad Lighthouse performance score doesn’t necessarily mean that you’ll fail Google’s Core Web Vitals assessment. But it can indicate that some users are having a poor experience and that there’s more room for improvement.

Screenshot of real user and lab-based performance metrics in PageSpeed Insights, September 2024

5. Use The PSI API To Automate Performance Testing

Got a lot of pages on your website you want to test? You can use the PageSpeed Insights API to automatically run website tests in bulk.

The API provides more granular performance metrics and details on each Lighthouse audit. For example, you can use the API to see the most common performance recommendations across your website.

There’s even a way to access PageSpeed Insights data directly in Google Sheets.
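
As a starting point, here is a minimal sketch that runs the PSI API against a few URLs; the endpoint and response fields are from the public v5 API, while the URLs are hypothetical.

```python
# Minimal sketch: query the PageSpeed Insights v5 API for a batch of
# URLs and print each page's Lighthouse Performance score plus its
# real-user (CrUX) LCP rating. Assumes the requests package; for bulk
# use, add an API key ("key" parameter) and rate limiting.
import requests

API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
URLS = [
    "https://www.example.com/",
    "https://www.example.com/blog/",
]

for url in URLS:
    data = requests.get(API, params={"url": url, "strategy": "mobile"}).json()
    score = round(data["lighthouseResult"]["categories"]["performance"]["score"] * 100)
    # CrUX field data may be missing for low-traffic pages.
    crux = data.get("loadingExperience", {}).get("metrics", {})
    lcp_rating = crux.get("LARGEST_CONTENTFUL_PAINT_MS", {}).get("category", "no data")
    print(f"{url}: Lighthouse performance {score}, real-user LCP: {lcp_rating}")
```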

JSON page speed data reported by the PageSpeed Insights API, September 2024

6. Know When To Reach For A Different Tool

PageSpeed Insights is a great tool for running a quick performance test on a specific URL. However, as we’ve seen above, this data comes with some limitations.

If you just want to get a site-wide overview of Core Web Vitals on your website, the quickest way to find this data is using Google Search Console.

Search Console will show you exactly how many pages on your website are slow or need to be improved.

Screenshot of Core Web Vitals data in Google Search Console, September 2024

Need to dive deep into CPU performance, for example to optimize the new Interaction to Next Paint metric?

The Performance tab in Chrome’s developer tools provides a detailed analysis of all kinds of CPU processing that happens on your website.

Screenshot of a website performance profile in Chrome DevTools, September 2024

Finally, if you want to optimize how different resources are loaded on your website, the DebugBear website speed test can be invaluable.

This test can provide a detailed report on what resources are loaded by your website, when they load, and how they impact rendering.

Screenshot of a website request waterfall in DebugBear, September 2024

How To Always Stay Ahead Of Your Website Speed

PageSpeed Insights and other performance tests are a great starting point for optimizing your website. However, without continuous monitoring, you risk reintroducing problems without noticing.

DebugBear is a monitoring platform for Core Web Vitals that lets you continuously test both your own website and those of your competitors.

Screenshot of the DebugBear performance dashboard, September 2024

In addition to scheduled lab testing, DebugBear also keeps track of Google CrUX data and collects real user analytics directly on your website.

The real user data provides a wide range of insight to not just help you keep track of performance but actively improve it:

  • See what LCP subpart is causing the biggest delay for your visitors
  • Find specific interactions and scripts that cause a poor Interaction to Next Paint score
  • Identify specific countries or devices where performance is worse than usual

Screenshot of real user monitoring data in DebugBear, September 2024

Deliver A Great User Experience

PageSpeed Insights is a helpful tool for any website owner, not just telling you how fast your website is in the real world, but also giving you concrete advice on how to optimize it.

However, if you’d like to go beyond the data PSI provides and test your website continuously, you can sign up for a free 14-day DebugBear trial.

This article has been sponsored by DebugBear, and the views presented herein represent the sponsor’s perspective.

Ready to start optimizing your website? Sign up for DebugBear and get the data you need to deliver great user experiences.


Image Credits

Featured Image: Image by DebugBear. Used with permission.

Google is funding an AI-powered satellite constellation that will spot wildfires faster

Early next year, Google and its partners plan to launch the first in a series of satellites that together would provide close-up, frequently refreshed images of wildfires around the world, offering data that could help firefighters battle blazes more rapidly, effectively, and safely.

The online search giant’s nonprofit and research arms have collaborated with the Moore Foundation, the Environmental Defense Fund, the satellite company Muon Space, and others to deploy 52 satellites equipped with custom-developed sensors over the coming years. 

The FireSat satellites will be able to spot fires as small as 5 by 5 meters (16 by 16 feet) on any speck of the globe. Once the full constellation is in place, the system should be capable of updating those images about every 20 minutes, the group says.

Those capabilities together would mark a significant upgrade over what’s available from the satellites that currently provide data to fire agencies. Generally, they can provide either high-resolution images that aren’t updated rapidly enough to track fires closely or frequently refreshed images that are relatively low-resolution.

The Earth Fire Alliance collaboration will also leverage Google’s AI wildfire tools, which have been trained to detect early indications of wildfires and track their progression, to draw additional insights from the data.

The images and analysis will be provided free to fire agencies around the world, helping to improve understanding of where fires are, where they’re moving, and how hot they’re burning. The information could help agencies stamp out small fires before they turn into raging infernos, place limited firefighting resources where they’ll do the most good, and evacuate people along the safest paths.

“In the satellite image of the Earth, a lot of things can be mistaken for a fire: a glint, a hot roof, smoke from another fire,” says Chris Van Arsdale, climate and energy research lead at Google Research and chairman of the Earth Fire Alliance. “Detecting fires becomes a game of looking for needles in a world of haystacks. Solving this will enable first responders to act quickly and precisely when a fire is detected.”

Some details of FireSat were unveiled earlier this year. But the organizations involved will announce additional information about their plans today, including the news that Google.org, the company’s charitable arm, has provided $13 million to the program and that the inaugural launch is scheduled to occur next year. 

Reducing the fog of war

The news comes as large fires rage across millions of acres in the western US, putting people and property at risk. The blazes include the Line Fire in Southern California, the Shoe Fly Fire in central Oregon, and the Davis Fire south of Reno, Nevada.

Wildfires have become more frequent, extreme, and dangerous in recent decades. That, in part, is a consequence of climate change: Rising temperatures suck the moisture from trees, shrubs, and grasses. But fires increasingly contribute to global warming as well. A recent study found that the fires that scorched millions of acres across Canada last year pumped out 3 billion tons of carbon dioxide, four times the annual pollution produced by the airline industry.

Humans have also increased fire risk by suppressing natural fires for decades, which has allowed fuel to build up in forests and grasslands, and by constructing communities on the edge of wilderness boundaries without appropriate rules, materials, and safeguards.

Observers say that FireSat could play an important role in combating fires, both by enabling fire agencies to extinguish small ones before they grow into large ones and by informing effective strategies for battling them once they’ve crossed that point.

“What these satellites will do is reduce the fog of war,” says Michael Wara, director of the climate and energy policy program at Stanford University’s Woods Institute for the Environment, who is focused on fire policy issues. “Like when a situation is really dynamic and very dangerous for firefighters and they’re trying to make decisions very quickly about whether to move in to defend structures or try to evacuate people.” 

(Wara serves on the advisory board of the Moore Foundation’s Wildfire Resilience Initiative.)

Some areas, like California, already have greater visibility into the current state of fires or early signs of outbreaks, thanks to technology like Department of Defense satellites, remote camera networks, and planes, helicopters, and drones. But FireSat will be especially helpful for “countries that have less-well-resourced wildland fighting capability,” Wara adds.

Better images, more data, and AI will not be able to fully counter the increased fire dangers. Wara and other fire experts argue that regions need to use prescribed burns and other efforts to more aggressively reduce the buildup of fuel, rethink where and how we build communities in fire-prone areas, and do more to fund and support the work of firefighters on the ground. 

Sounding an earlier alarm for fires will only help reduce dangers when regions have, or develop, the added firefighting resources needed to combat the most dangerous ones quickly and effectively. Communities will also need to put in place better policies to determine what types of fires should be left to burn, and under what conditions.

‘A game changer’

Kate Dargan Marquis, a senior wildfire advisor to the Moore Foundation who previously served as state fire marshal for California, says she can “personally attest” to the difference that such tools will make to firefighters in the field.

“It is a game changer, especially as wildfires are becoming more extreme, more frequent, and more dangerous for everyone,” she says. “Information like this will make a lifesaving difference for firefighters and communities around the globe.”

[Photo: Kate Dargan Marquis, senior advisor to the Moore Foundation. Credit: Google]

Google Research developed the satellite's sensors and tested them, along with the company's AI fire detection models, by conducting flights over controlled burns in California. Google intends to work with the Earth Fire Alliance “to ensure AI can help make this data as useful as possible, and also that wildfire information is shared as widely as possible,” the company said.

Google’s Van Arsdale says that providing visual images of every incident around the world from start to finish will be enormously valuable to scientists studying wildfires and climate change. 

“We can combine this data with Google’s existing models of the Earth to help advance our understanding of fire behavior and fire dynamics across all of Earth’s ecosystems,” he says. “All this together really has the potential to help mitigate the environmental and social impact of fire while also improving people’s health and safety.”

Specifically, it could improve assessments of fire risk, as well as our understanding of the most effective means of preventing or slowing the spread of fires. For instance, it could help communities determine where it would be most cost-effective to remove trees and underbrush. 

Figuring out the best ways to conduct such interventions is another key goal of the program, given their high cost and the limited funds available for managing wildlands, says Genny Biggs, the program director for the Moore Foundation’s Wildfire Resilience Initiative.

The launch

The idea for FireSat grew out of a series of meetings that began with a 2019 workshop hosted by the Moore Foundation, which provided the first philanthropic funding for the program. 

The first satellite, scheduled to be launched aboard a SpaceX rocket early next year, will be fully functional aside from some data transmission features. The goals of the “protoflight” mission include testing the onboard systems and the data they send back. The Earth Fire Alliance will work with a handful of early-adopter agencies to prepare for the next phases. 

The group intends to launch three fully operational satellites in 2026, with additional deployments in the years that follow. Muon Space will build and operate the satellites. 

Agencies around the world should be able to receive hourly wildfire updates once about half of the constellation is operational, says Brian Collins, executive director of the Earth Fire Alliance. It hopes to launch all 52 satellites by around the end of this decade.

Each satellite is designed to last about five years, so the organization will eventually need to deploy 10 more each year to maintain the constellation.
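That replacement cadence follows from simple division. As a back-of-the-envelope sketch (the figures come from the article; the arithmetic is mine):

```python
# Rough check on the replacement cadence described above.
CONSTELLATION_SIZE = 52   # planned number of FireSat satellites
DESIGN_LIFE_YEARS = 5     # approximate design life of each satellite

replacements_per_year = CONSTELLATION_SIZE / DESIGN_LIFE_YEARS
print(f"~{replacements_per_year:.1f} satellites/year")  # ~10.4, i.e., about 10
```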

The Earth Fire Alliance has secured about two-thirds of the funding it needs for the first phase of the program, which includes the first four launches. The organization will need to raise additional money from government agencies, international organizations, philanthropies, and other groups to deploy, maintain, and operate the full constellation. It estimates the total cost will exceed $400 million, which Collins notes “is 1/1000th of the economic losses due to extreme wildfires annually in the US alone.”

Asked whether commercial uses of the data, potentially including military ones, could also support the program, Collins said in an email: “Adjacent applications range from land use management and agriculture to risk management and industrial impact and mitigation.”

“At the same time, we know that as large agencies and government agencies adopt FireSat data to support a broad public safety mandate, they may develop all-hazard, emergenc[y] management, and security related uses of data,” he added. “As long as opportunities are in balance with our charter to advance a global approach to wildfire and climate resilience, we welcome new ideas and applications of our data.”

‘Living with fire’

A wide variety of startups have emerged in recent years promising to use technology to reduce the frequency and severity of wildfires—for example, by installing cameras and sensors in forests and grasslands, developing robots to carry out controlled burns, deploying autonomous helicopters that can drop suppressant, and harnessing AI to predict wildfire behavior and inform forest and fire management strategies.

So far, even with all these new tools, communities have struggled to keep pace with the rising dangers.

Dargan Marquis—who founded her own wildfire software company, Intterra—says she is confident the incidence of disastrous fires can be meaningfully reduced with programs like FireSat, along with other improved technologies and policies. But she says it’s likely to take decades to catch up with the growing risks, as the world continues warming up.

“We’re going to struggle in places like California, these Mediterranean climates around the world, while our technology and our capabilities and our inventions, etc., catch up with that level of the problem,” she says. 

“We can turn that corner,” she adds. “If we work together on a comprehensive strategy with the right data and a convincing plan over the next 50 years, I do think that by the end of the century, we absolutely can be living with fire.”

Attributing Marketing Expenses by Channel

Some marketers now measure and attribute the cost of labor, technology, and services to individual promotional channels.

The concept is simple. Many companies track only revenue from marketing channels without considering the expense of managing them. The result is often misleading bottom-line performance.

Channel Comparison

Imagine a business with two marketing channels, A and B, each costing $1,000. Both generate 3,000 interactions from potential customers. However, Channel A converts at 2.5%, while Channel B converts at 4%.

If both channels had a $75 average order value and a 25% gross profit margin, Channel A would produce $406 in profit, and Channel B would earn $1,250. Channel B is the clear winner when compared in this way.

                   Channel A    Channel B
Promotional Cost   $1,000       $1,000
Interactions       3,000        3,000
Conv. Rate         2.50%        4.00%
Orders             75           120
Avg. Order Value   $75          $75
Sales Generated    $5,625       $9,000
Margin             25%          25%
Gross Profit       $1,406       $2,250
Profit             $406         $1,250

Just about every business would take the $1,000 invested in Channel A and double down on Channel B. After all, Channel B produces about three times as much profit.
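For anyone who wants to verify the arithmetic, here is a minimal Python sketch (the function name and structure are illustrative, not a standard formula library):

```python
def channel_profit(promo_cost, interactions, conv_rate, avg_order_value, margin):
    """Profit after promotional cost: (orders x AOV x margin) - promo cost."""
    orders = interactions * conv_rate     # expected orders
    sales = orders * avg_order_value      # revenue generated
    gross_profit = sales * margin         # profit before channel costs
    return gross_profit - promo_cost

print(channel_profit(1_000, 3_000, 0.025, 75, 0.25))  # Channel A -> 406.25
print(channel_profit(1_000, 3_000, 0.04, 75, 0.25))   # Channel B -> 1250.0
```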

This is often the proper choice, but not always.

Marketing Budgets

There’s more to marketing expenses than advertising or accessing a channel.

There are salaries for the marketing team, software subscriptions, creative design expenses, and even influencer fees.

Let’s apply this idea to Channel A and Channel B. Suppose each channel is a demand-side platform (DSP), wherein marketers choose from a list of potential publishers.

DSP A allows marketers to pick some basic targeting demographics, but there’s little a specialist could do to optimize performance. It is a set-it-and-forget-it sort of platform.

On the other hand, DSP B has 100 targeting options that can be compared, fine-tuned, and optimized.

DSP B’s platform provides real-time data with Slack notifications every time a campaign’s conversion rate changes.

The marketing specialist spends about 30 minutes a month setting up the simplistic DSP A but about an hour a day monitoring, studying, and tweaking DSP B.

If the marketing specialist earns $50 an hour, DSP A costs about $25 per month in labor. Given 20 working days a month and an hour per day spent monitoring and optimizing, DSP B costs about $1,000 per month in labor.

When counting labor, DSP A generates $381 in profit compared to DSP B’s $250. DSP A is the clear winner.

                   DSP A        DSP B
Promotional Cost   $1,000       $1,000
Interactions       3,000        3,000
Conv. Rate         2.50%        4.00%
Orders             75           120
Avg. Order Value   $75          $75
Sales Generated    $5,625       $9,000
Margin             25%          25%
Gross Profit       $1,406       $2,250
Labor Cost         $25          $1,000
Profit             $381         $250
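Extending the earlier sketch with a labor line item reproduces these figures (again, an illustrative sketch rather than a standard formula):

```python
def channel_profit_with_labor(promo_cost, interactions, conv_rate,
                              avg_order_value, margin, labor_hours, hourly_rate):
    """Channel profit with monthly labor cost attributed to the channel."""
    gross_profit = interactions * conv_rate * avg_order_value * margin
    labor_cost = labor_hours * hourly_rate
    return gross_profit - promo_cost - labor_cost

# DSP A: ~30 minutes per month; DSP B: an hour a day over 20 working days
print(channel_profit_with_labor(1_000, 3_000, 0.025, 75, 0.25, 0.5, 50))  # -> 381.25
print(channel_profit_with_labor(1_000, 3_000, 0.04, 75, 0.25, 20, 50))    # -> 250.0
```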

Applying the Concept

Beyond labor, other expenses — e.g., software, creative design, agency retainers — could also change a channel’s return on investment, although not every expense is ongoing. Some are one-time or upfront charges that go away.

Thus, when attributing marketing expenses by channel:

  • Decide what to measure. Labor, software, or simply the cost of an ad or promotion?
  • Choose when to measure. Should the channel be measured per interaction? Or would monthly work better?
  • Plan for upfront expenses. Should upfront expenses be amortized? If so, over what period? How will channels with amortized costs compare with those having ongoing expenses? (One simple amortization approach is sketched after this list.)
  • Manage sensitive information. Some costs are sensitive or private. Will salaries be shared, or will the labor portion of the equation be closely held?
  • Decide how you will measure. Should marketers use time-tracking software?
  • Document the process. Record what, when, and how results are measured.
  • Collect only essential data. There’s no need to track labor or software costs if they don’t impact marketing decisions.
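To make the amortization question concrete, here is a minimal sketch. The dollar figures are hypothetical, chosen only for illustration; the idea is simply to spread a one-time charge evenly across a chosen number of months so channels can be compared on the same footing:

```python
def monthly_channel_cost(ongoing_monthly_cost, upfront_cost, amortization_months):
    """Spread a one-time setup charge across N months alongside recurring spend."""
    return ongoing_monthly_cost + upfront_cost / amortization_months

# Hypothetical: $6,000 one-time creative design, amortized over 12 months,
# on top of $1,000/month in ad spend for the channel.
print(monthly_channel_cost(1_000, 6_000, 12))  # -> 1500.0 per month
```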

Lastly, remember that sometimes the cure can be worse than the illness. Attributing expenses by channel can drive performance at the cost of damaging staff morale. So attribute with prudence.