Google Brings Circle To Search To iPhone

Google is expanding ‘Circle to Search’ to iPhones, allowing users to search for anything on their screens.

Google isn’t calling it by that name, though, because you can use “whatever gesture comes naturally.”

If you’ve seen how it works on Android, it essentially does the same thing on iPhones.

The big difference is you can only activate it from the Google or Chrome apps, whereas ‘Circle to Search’ on Android is built into the operating system.

Additionally, Google is expanding AI Overviews within Lens search results.

‘Search Screen With Google Lens’ On iOS

Previously exclusive to Android, this ‘Circle to Search’ alternative expands visual search on iOS.

How It Works

iPhone users can now open the Chrome or Google apps and tap the three-dot menu to initiate a screen-wide Lens search.

Instead of taking a screenshot or navigating to another tab, you can highlight, draw, or tap on the part of the screen you want to identify.

Google will then return information about the object you’ve identified.

Expanding AI Overviews

Circle to Search employs Google Lens, which uses an online image database to help you identify objects.

Google says new AI models allow Lens to recognize more unusual or unfamiliar things.

For those “unusual” searches, Google says an AI Overview will appear with information about what your camera is pointing at.

This overview details what you see and includes links to web resources.

This is all done without prompts. You won’t need to ask a specific question or add extra keywords to find what you need.

However, you can refine your search with follow-up questions if you choose to.

Availability

These new features are rolling out to iOS users this week and will be available globally for Chrome and Google apps.

For AI Overviews, English-language users in select locations will see them first in the Google app on Android and iOS, followed soon by Chrome on desktop and mobile.

How to perform an SEO audit (with checklist)

An SEO audit is a health checkup of your site. It lets you know what works and what doesn't, and it allows you to make improvements based on what you find. This can lead to improved performance, both in the search results and in how visitors engage with your website.

What is an SEO audit?

An SEO audit looks at how well a website performs in search results to find areas that need work. It helps find technical SEO problems, analyze on-page elements, evaluate Core Web Vitals and site speed, and analyze user experience and content quality. An SEO audit also looks at outside variables like backlinks and rival tactics to identify areas for improvement. Making sure your website is optimized for users and search engines can help it rank better and attract more organic traffic.

An SEO audit checklist

Read on below for the step-by-step process, but here is an SEO audit checklist that will help you get started quickly.

⬜️ Crawl your website using Screaming Frog (or similar tools)

⬜️ Analyze your site with an SEO tool (e.g., Semrush or Ahrefs)

⬜️ Pull reports from Google Analytics and Search Console

⬜️ Create a centralized spreadsheet for findings

⬜️ Check the user experience (CTAs, menus, etc.)

⬜️ Audit website content (duplicate and thin content)

⬜️ Optimize internal linking

⬜️ Optimize page titles and meta descriptions

⬜️ Improve content with proper headings (H1 to H6)

⬜️ Ensure the correct use of canonical tags

⬜️ Add and validate Schema markup

⬜️ Monitor and improve Core Web Vitals

⬜️ Improve general site performance

⬜️ Improve mobile responsiveness

⬜️ Boost user engagement

⬜️ Track metrics regularly

⬜️ Check Search Console reports

⬜️ Schedule regular check-ins

Step 1: Preparing an SEO audit

To make your site audit a success, you must prepare well. You need to collect the right information about your website using SEO tools, understand how to diagnose issues, and prioritize fixes.

Crawl your website with Screaming Frog (or something similar)

The first step is crawling your website with crawler software. This helps surface technical SEO issues that otherwise wouldn't be visible. Screaming Frog is one of the most trusted names in this space, and Sitebulb is another highly recommended option. The free version of Screaming Frog crawls up to 500 URLs, but you can upgrade if needed.

Crawling your site is easy; simply download and install Screaming Frog. Open the tool and enter your site’s homepage URL. Then, hit Start, and the crawl will run. Once the scan is complete, export the data into a CSV file for further sorting and prioritization.

Screaming Frog gives you a ton of data that you can export to sheets quickly

What to look for?

Screaming Frog generates a ton of data, so it's good to prioritize the output. Scan for missing, duplicate, or overly long titles and descriptions; each page should have unique, targeted metadata. Find pages or links that return 404 errors, as broken links frustrate users and hurt SEO. Then, identify oversized assets that slow your page load time, such as images, JavaScript, and CSS files. Last but not least, make sure that canonical URLs are properly implemented to avoid duplicate content issues.
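If you want to triage the export automatically, a small script can flag the most common problems. This is a minimal sketch that assumes column names ('Address', 'Status Code', 'Title 1') from a typical Screaming Frog internal-HTML export; adjust them to match your actual file:

```python
import csv
import io
from collections import Counter

def flag_crawl_issues(csv_text):
    """Flag missing/duplicate titles and 404s in a crawl export.

    Column names are assumptions based on a typical Screaming Frog
    internal-HTML export; rename them to match your own file.
    """
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    title_counts = Counter(row["Title 1"].strip() for row in rows)
    issues = []
    for row in rows:
        url, title = row["Address"], row["Title 1"].strip()
        if row["Status Code"] == "404":
            issues.append((url, "broken (404)"))
        if not title:
            issues.append((url, "missing title"))
        elif title_counts[title] > 1:
            issues.append((url, "duplicate title"))
    return issues

# Illustrative export data, not real crawl output
export = """Address,Status Code,Title 1
https://example.com/,200,Home
https://example.com/about,200,Home
https://example.com/old,404,Old page
"""
print(flag_crawl_issues(export))
```

In practice you'd feed the function the contents of your exported CSV file and dump the result into your audit spreadsheet.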

Use an all-in-one SEO tool (Semrush or Ahrefs)

In addition to a technical crawl, you can use tools like Semrush or Ahrefs to conduct a detailed SEO audit. These tools provide many insights, including keyword rankings, backlink health, and competitor performance. 

These tools also let you run a site audit, which gives you a technical health score. You’ll find many improvements to make, like pages blocked by robots.txt or issues with internal linking. The tools also review the quality and relevance of your backlinks and give you ideas on how to get high-quality new links. You’ll also get keyword rankings to track how individual pages perform for target keywords. Identify opportunities to refine content or target new search terms.

Download the most important reports and cross-reference them with your Screaming Frog export.

Pull data from Google Analytics and Search Console

Combining all these insights with your site’s user behavior and engagement data will make your SEO audit come alive. It helps you understand how people use your site and how they experience it to pinpoint pages to improve. Export your findings from Google Analytics and Search Console to include in your audit comparisons.

Check the top-performing landing pages in Google Analytics and their engagement rates. Pages with low engagement rates may have poor content or a disconnect between user expectations and page design. Also, look at session duration and exit rates to find pages where people quickly leave your site.

Use the Performance Report in Search Console to see which pages and queries drive the most clicks and impressions. This will also highlight low CTR pages — ranking well but failing to attract searchers. Then, check the Page Indexing Report for crawl errors, warnings, or blocked pages and review the Core Web Vitals Report to find pages failing on speed or usability metrics.

Google Search Console is an essential tool for SEO audits; here we see the Performance report

Create a centralized spreadsheet

Once you have all the data, combine everything in one big spreadsheet. How you set this up is up to you, as everyone works differently. But you could use something like this:

  • Page URL
  • Technical issues (e.g., broken links, slow load speed)
  • Engagement metrics (e.g., engagement rates, time on page)
  • Keyword rankings
  • Optimization notes (e.g., missing metadata, duplicate content)
  • Priority (High, Medium, Low)

This spreadsheet will guide your fixes throughout the audit process.
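If you'd rather generate that spreadsheet programmatically, here's a minimal sketch using Python's csv module. The column names mirror the list above, and the sample row is invented for illustration:

```python
import csv
import io

# Columns from the audit-spreadsheet structure suggested above
COLUMNS = [
    "Page URL", "Technical issues", "Engagement metrics",
    "Keyword rankings", "Optimization notes", "Priority",
]

def build_audit_sheet(findings):
    """Write audit findings into a CSV string with the columns above."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=COLUMNS)
    writer.writeheader()
    for row in findings:
        writer.writerow(row)
    return buf.getvalue()

# Hypothetical example row
sheet = build_audit_sheet([{
    "Page URL": "https://example.com/pricing",
    "Technical issues": "2 broken links",
    "Engagement metrics": "engagement rate 38%",
    "Keyword rankings": "'pricing tool' #12",
    "Optimization notes": "meta description missing",
    "Priority": "High",
}])
print(sheet)
```

Save the returned string to a .csv file and open it in Google Sheets or Excel to sort and prioritize.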

Minimal SEO audit (optional)

Not every audit needs to be a deep dive into your site. Sometimes, you don't have the time but still feel the need to work on your site. In this case, you could do a simpler, quicker health check and evaluate specific areas of your site to see if they perform well. Such a minimal SEO audit is a streamlined version of a full audit to find and fix critical performance issues.

Here’s a basic framework for a quick audit:

  1. Check that your site is indexed by searching site:yourdomain.com in Google.
  2. Run a Google PageSpeed Insights test for slow-loading pages.
  3. Examine the titles and meta descriptions of your most important pages (e.g., homepage, service pages, and key sales pages).
  4. Fix broken links using Screaming Frog or a quick manual check in your navigation.

This lightweight SEO audit still finds high-priority issues without the time commitment of a full review.

Step 2: User experience & content SEO

The next step is to see how people perceive and interact with your site. Look at the user experience and see if you can find things to improve. High-quality content aimed at the right search intent and audience gets people to your site, and it matters beyond that first visit, because you want them to return.

Improving the user experience

Do you know if your users can find what they need quickly? If not, they might leave your site just as fast. Giving them a good experience will do wonders in the long run. In your SEO audit, start by diagnosing these common UX factors:

Make sure the colors match your branding and are easy to read. Look at contrast, as this is especially important for buttons and links. Make CTAs (like “Buy now” or “Learn more”) stand out visually.

Check if the most important design elements are above the fold. Key messages and CTAs should be visible without scrolling. Think of this as the headline act—it must grab attention immediately. Add customer testimonials, third-party endorsements, and security badges (e.g., SSL or payment protection signs) to build credibility.

Give special attention to your menus. Test menus, drop-downs, and search functions. Breadcrumbs also help users see where they are within the site hierarchy.   

Audit website content

SEO is largely about content, so review its quality and improve where necessary. The Semrush/Ahrefs site audit should have given you many pointers. With this list, start working on the following.

Check the keyword targeting of your content. Make sure that each page targets a primary keyword. Ahrefs and Semrush show which keywords your pages rank for and identify gaps.

Check for duplicate or thin content. Avoid weak, duplicate, or low-value content. Where necessary, merge similar pages into one in-depth article. Provide actionable, valuable content.

Remember Google’s Helpful Content standards. Create content that delivers real value and focuses on user intent. Your content should answer questions with actionable, audience-focused solutions. Last, demonstrate Experience, Expertise, Authoritativeness, and Trustworthiness (E-E-A-T): add author bios, cite reliable sources, and link references where necessary to build expertise and trustworthiness.

Internal linking and related content

SEO is not just about getting users and search engines to your site; it's also about keeping them there and showing them around. One of the most powerful ways to do this is through internal linking, so be sure to include this in your SEO audit.

Check how you link your most important pages, like cornerstone articles or product categories. Your content should have a couple of links based on relevance and importance, but not too many. In addition, you should include a related content section on your pages to encourage further reading.  

Anchor text should include relevant keywords or describe the linked page; try to avoid generic phrases like “click here”.

An internal search feature is another important aspect of showing people around your site. Make sure that your search bar provides relevant results, especially on large websites. Monitor what people search for to inform your content strategy.

Step 3: General on-page SEO

On-page SEO concerns the technical and content improvements you make on specific pages. This helps search engines understand your pages. It also helps your readers to find what they want. 

Optimize page titles and meta descriptions

Page titles and meta descriptions are the first things a visitor sees in search results. While search engines often rewrite these based on relevance, you can still influence how they appear to maximize CTR.

For your page titles, make sure that every page on your site has a unique title. Duplicate titles confuse search engines, which is something you don’t want. And while there’s no limit to how long titles can be in the SERPs, they get cut off visually after a set number of characters. Try to find the sweet spot.

Incorporate your primary keyword close to the beginning of the title, but avoid keyword stuffing. For example, instead of “SEO tips SEO tips SEO tips,” use “10 SEO tips for beginners – Step-by-step guide.” Don’t forget to add your brand name at the end of the title, e.g., “How to do an SEO audit – Your Brand”

For your meta descriptions, make sure that they concisely explain what the page is about. You should also include the primary keyword while making sure the text flows naturally. Don’t forget to encourage action. Incorporate a call-to-action (CTA), such as “Learn more,” “Discover how,” or “Start now.”
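As a rough sanity check, you can script these rules. A minimal sketch; the length limits below are common rules of thumb, not official cut-offs (Google truncates by pixel width, not character count):

```python
def check_metadata(title, description, primary_keyword,
                   title_max=60, desc_max=155):
    """Heuristic checks for a page title and meta description.

    The length limits are rules of thumb, not official limits:
    SERP truncation is pixel-based, not character-based.
    """
    problems = []
    if len(title) > title_max:
        problems.append(f"title longer than {title_max} characters")
    if primary_keyword.lower() not in title.lower():
        problems.append("primary keyword missing from title")
    if len(description) > desc_max:
        problems.append(f"description longer than {desc_max} characters")
    if primary_keyword.lower() not in description.lower():
        problems.append("primary keyword missing from description")
    return problems

print(check_metadata(
    "10 SEO tips for beginners - Step-by-step guide - Your Brand",
    "Learn how to do an SEO audit step by step. Start now.",
    "SEO tips",
))
```

Run this over every row of your crawl export to spot the pages worth rewriting first.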

Optimize heading structures (H1 to H6)

Headings are excellent tools for structuring and making your content easier to read. They also assist search engines with recognizing how important the information is on each page.

  • Start with one H1: The H1 is the main heading for the webpage, and it should contain your targeted keyword. Each page should have a single H1 tag.
  • Use H2s for major sections: Use H2 tags to break up content into logical sections. Consider these the main subheadings of your article.
  • Add H3s or H4s for subsections: You can have more subsections under H2s if you want to break it down further using H3 or H4 for better structuring.
  • Keep it logical: Don’t skip heading levels (e.g., jumping from H1 to H4) or use headings only for styling.
  • Be descriptive: Write headings describing the section’s content. For example, instead of “Step 1,” use “Step 1: Analyze your traffic metrics.”
WordPress has a handy feature to check the heading structure of your articles
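To audit heading structure at scale rather than page by page, a short script can flag skipped levels and missing or duplicate H1s. A minimal sketch using Python's built-in HTML parser:

```python
from html.parser import HTMLParser

class HeadingAudit(HTMLParser):
    """Collect h1-h6 tags and flag skipped levels or H1 problems."""
    def __init__(self):
        super().__init__()
        self.levels = []

    def handle_starttag(self, tag, attrs):
        # Matches h1..h6 but not tags like <hr> or <header>
        if len(tag) == 2 and tag[0] == "h" and tag[1].isdigit():
            self.levels.append(int(tag[1]))

    def problems(self):
        issues = []
        if self.levels.count(1) != 1:
            issues.append("page should have exactly one H1")
        for prev, cur in zip(self.levels, self.levels[1:]):
            if cur > prev + 1:
                issues.append(f"skipped from H{prev} to H{cur}")
        return issues

audit = HeadingAudit()
# Illustrative snippet: jumps from H2 straight to H4
audit.feed("<h1>Guide</h1><h2>Step 1</h2><h4>Details</h4>")
print(audit.problems())
```

Feed it the HTML of each page from your crawl and collect any reported problems in your audit spreadsheet.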

Ensure proper use of canonical tags

Canonical tags show a search engine which version of a page to prioritize when duplicates or near-duplicates of the same page are available on your site. This is especially important for online stores, as these have many variations of the same products due to filtering or session-based URLs. 

You should always choose one canonical version for a page. For example, if both https://example.com and https://www.example.com exist, set one canonical URL to prevent duplicate content issues. Don’t forget to add the canonical tag in the <head> section of each page’s HTML, and be consistent in your internal linking. For instance, always link to one version of the URL rather than switching between http and https.

Regularly check for issues using Screaming Frog or Semrush to find pages missing canonical tags or ones with conflicting canonicals.
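For a single page, you can also spot missing or conflicting canonicals by parsing the HTML yourself. A minimal sketch with Python's built-in parser (tools like Screaming Frog do this across the whole site):

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Extract the URL(s) from <link rel="canonical"> tags."""
    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonicals.append(a.get("href"))

finder = CanonicalFinder()
# Illustrative page head, not a real site
finder.feed('<head><link rel="canonical" '
            'href="https://www.example.com/page"></head>')
# Zero canonicals means one is missing; more than one is a conflict
assert len(finder.canonicals) == 1
print(finder.canonicals[0])
```

A page with zero canonical tags or more than one is exactly the kind of issue to record in your audit.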

Add and test schema markup

Structured data in the form of Schema markup helps make your site more understandable for search engines. The code you add to your site helps structure and identify your content in a way that search engines can easily consume. In some cases, this can even lead to highlighted search results, for instance, for products or ratings and reviews. 

Yoast SEO drastically simplifies adding schema for WordPress, WooCommerce and Shopify users. The SEO plugin outputs JSON-LD (the format preferred by Google) to add schema markup directly to your page’s HTML.

There are many options for adding Schema, but you should start with the basics and things relevant to your site. For instance, you should use the Article schema for articles and blog posts and highlight publication dates, images, authors, and headlines. 

Ecommerce businesses should use Product structured data, highlighting pricing, stock availability, ratings, and reviews. If it makes sense, you can also mark up your FAQ pages, although Google no longer highlights FAQ rich results in its SERPs for most sites.

There are many other options, so you must check what makes sense for your situation. For instance, if you run a recipe site, you can add Recipe structured data, or if you publish events on your site, use Events. 

Don’t forget to test your structured data. Use Google’s Rich Results Test Tool to check if your structured data is correct and valid. Also, check Search Console for errors under the “Enhancements” tab.
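To illustrate what a plugin like Yoast SEO generates for you, here's a hand-rolled sketch of a minimal Article JSON-LD block. The field values are invented, and a real implementation would include more properties:

```python
import json

def article_jsonld(headline, author, date_published, image_url):
    """Build a minimal Article JSON-LD block (schema.org vocabulary).

    A sketch for illustration; plugins like Yoast SEO emit richer
    markup automatically.
    """
    data = {
        "@context": "https://schema.org",
        "@type": "Article",
        "headline": headline,
        "author": {"@type": "Person", "name": author},
        "datePublished": date_published,
        "image": image_url,
    }
    return ('<script type="application/ld+json">'
            + json.dumps(data) + "</script>")

# Hypothetical article details
print(article_jsonld(
    "How to perform an SEO audit",
    "Jane Doe",
    "2025-01-15",
    "https://example.com/audit.png",
))
```

Paste the output of a snippet like this into the Rich Results Test to confirm it validates before shipping it.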

The Google Rich Results Test shows valid items eligible for rich results

Audit and improve your backlinks

Backlinks are as important as ever. Every link from a relevant, high-quality source counts towards your authority. These links prove to search engines that your content is valuable and meaningful. Of course, there's also a ton of link spam out there.

You can use tools like Moz, Ahrefs, or Semrush to audit your backlink profile. The results show a list of spammy backlinks and links from irrelevant websites with low authority. If spammy websites link to you, there’s an option in Google Search Console to disavow these links. This is only needed in very rare cases, though. Only disavow links you’re sure are harmful — this is a last resort for low-quality links you cannot get removed manually.

It’s more important to focus on earning high-quality backlinks. Create shareable, high-value content like guides, research, or infographics while building relationships with related websites, bloggers, or journalists for natural backlink opportunities.

Step 4: Site speed and engagement

Check your site performance, as site speed and user engagement greatly impact success. Pages that load slowly are annoying for users and can give you a poor score in the eyes of search engines. Low engagement rates can hurt your results, as users might stop visiting your site.

Understanding and improving Core Web Vitals

To underscore the importance of performance, Google launched the Core Web Vitals. These metrics help site owners gain insights into how their sites perform in real life and get tips on improving those scores. The metrics focus on loading times, interactivity and stability. Together, these determine how enjoyable users find your site. 

LCP measures how long your largest asset takes to load

The Largest Contentful Paint (LCP) measures how long it takes for the largest visible element on the screen (usually an image, video, or headline) to render fully. If your LCP is poor, improve it by compressing images without sacrificing quality. You can use modern file formats like WebP for faster performance and minimize render-blocking resources like heavy CSS or JavaScript files. Defer unnecessary scripts and prioritize above-the-fold content.

INP measures interactivity 

Interaction to Next Paint (INP) is the Core Web Vitals metric from Google that tracks how quickly your site responds to user input: clicks, taps, and keystrokes. While First Input Delay (FID) only reported the delay of the first interaction, INP evaluates all interaction events in the session. This ensures a fuller score.

You can improve performance by minimizing JavaScript execution. Use Screaming Frog or PageSpeed Insights to flag heavy scripts, and defer or remove non-critical JavaScript. Use browser caching so JavaScript and other assets don't reload unnecessarily, and reduce reliance on third-party scripts. You can also offload heavy tasks to web workers to free up the main thread and process user interactions faster.

CLS measures stability

Cumulative Layout Shift (CLS) measures the stability of a webpage’s visual layout. It checks if the content moves unexpectedly as the page loads (e.g. when an image loads late and pushes buttons elsewhere on the screen).

You can improve this by specifying dimensions (width and height) for all images and videos in your HTML/CSS. This prevents the browser from guessing dimensions and rearranging content. Avoid inserting ads, banners, or other dynamic elements above the fold after loading content. Preload important assets like fonts or images to ensure they appear quickly and predictably.
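Finding images without explicit dimensions is easy to automate. A minimal sketch that flags offending <img> tags using Python's built-in HTML parser:

```python
from html.parser import HTMLParser

class ImgDimensionAudit(HTMLParser):
    """Flag <img> tags lacking explicit width and height attributes,
    a common cause of layout shift (CLS)."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_startendtag(self, tag, attrs):
        # Treat self-closed <img ... /> the same as <img ...>
        self.handle_starttag(tag, attrs)

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        a = dict(attrs)
        if "width" not in a or "height" not in a:
            self.missing.append(a.get("src", "(no src)"))

audit = ImgDimensionAudit()
# Illustrative markup: only the second image lacks dimensions
audit.feed('<img src="hero.webp" width="1200" height="630">'
           '<img src="logo.png">')
print(audit.missing)
```

Run it over the HTML of your key pages and add width and height attributes to anything it reports.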

Site speed optimization beyond Core Web Vitals

Core Web Vitals should be a main focus, but there are other strategies you can implement to improve site speed and page experience. Faster websites improve user satisfaction, reduce bounce rates, and make your audience more likely to stick around in the future.

Start by reducing the number of HTTP requests for a faster site. Combine CSS and JavaScript files where practical, or use modern HTTP/3 protocols, which allow browsers to send multiple requests simultaneously. Also, eliminate unused CSS and JavaScript to reduce file sizes and speed up load times. Compress assets with Gzip or Brotli before serving them to the user. Compressed files load faster without losing quality; most hosting providers or web servers can help you set this up. Tools like Google Lighthouse can also alert you if compression is missing.
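To see why compression matters, you can measure it directly. This sketch compresses a repetitive CSS-like payload with Python's gzip module; real-world ratios vary by content, and compression itself is configured on the web server rather than in application code:

```python
import gzip

# Repetitive text compresses extremely well; real CSS varies
css = ("body{margin:0;padding:0;font-family:sans-serif}" * 200).encode()
compressed = gzip.compress(css)
ratio = len(compressed) / len(css)
print(f"{len(css)} -> {len(compressed)} bytes ({ratio:.1%} of original)")
```

Text assets like CSS, JavaScript, and HTML typically shrink dramatically, which is why enabling server-side compression is one of the cheapest speed wins available.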

Implement lazy loading for images and videos so that only visible content loads immediately while other assets load as needed. WordPress users can easily use plugins like Smush or Lazy Load by WP Rocket to achieve this, or custom JavaScript libraries like lazysizes work on other platforms. Distribute your site’s static assets with a Content Delivery Network (CDN), which delivers files from servers closest to users, improving global load speeds. Popular CDN providers include Cloudflare, Akamai, and Amazon CloudFront. Finally, performance analysis tools such as Google Lighthouse, GTmetrix, or Pingdom analyze bottlenecks, track progress, and ensure your efforts work.

Google’s PageSpeed Insights is one of the best tools to understand your site’s real-life performance

Improving mobile performance and responsiveness

Mobile is everything these days. For most websites, this means that most of the traffic will come from mobile devices. Search engines like Google consider the quality of your mobile site when ranking your content, so being mobile-friendly should always be top of mind.

Run various mobile tests to see how your site performs on phones and tablets. Look for layout issues, problems with interactive elements, or slow-loading pages or assets. Check if your responsive web design works properly so your site dynamically adapts to all device sizes. Also, ensure your CTAs are mobile-friendly, and your forms are accessible from mobile devices. 

Increasing user engagement on your site

Faster pages keep users on your website, but engagement ensures they take meaningful actions. Thanks to better site performance, you'll get higher engagement rates, which can result in more conversions, newsletter signups, product purchases, and more.

Simplify your site’s navigation to make it easy for users to find what they need. Use clear menus with logical structures, such as categories and subcategories, and add breadcrumbs to show users where they are within the site. Dropdown menus should be intuitive, and internal search bars must return accurate, relevant results quickly. Additionally, ensure key Call-to-Actions (CTAs), like “Sign Up” or “Request a Quote,” are prominently placed above the fold or immediately following key content sections. Use descriptive, action-oriented language in your CTAs to make them more compelling and clickable.

Encourage users to explore your site more with internal links and related content suggestions. Add social sharing buttons to blog posts, infographics, or product pages to make it easy for users to share content on platforms like Facebook, LinkedIn, or X. If using popups or exit-intent offers (e.g., subscription prompts or discounts), ensure they are thoughtfully designed and minimally intrusive. Poorly timed or aggressive popups risk driving users away, so aim to balance engagement with user experience.

Tools for site speed and engagement improvements

To help optimize, you can use Google Lighthouse, which shows how your Core Web Vitals perform overall, and GTmetrix, which goes in-depth and gives actionable recommendations for improving page speed.

Hotjar offers insights into where users click, how they scroll, and how they behave overall. WP Rocket is for WordPress users looking to automate technical processes such as caching, lazy loading, and database optimization. Various WordPress plugins add customizable social share buttons to enhance content sharing, making it easier for visitors to share your posts on their favorite platforms.

Step 5: Monitoring and tracking results

SEO is a colossal effort, and the process does not end once that initial effort is made. You must monitor your actions to determine whether those changes work as intended. Regular monitoring is also a great opportunity to find improvements and better calibrate your SEO strategy; it helps you improve your site, adjust to the latest algorithm updates, and stay the course.

Why monitor results?

By tracking results, you can measure the impact of your audit (e.g., increased rankings, traffic, and engagement). It’ll also help spot new issues like broken links, slow pages, or dropped rankings. This will ultimately help you improve your strategy by identifying what’s driving results and where to focus next.

SEO is not something you finish in a month or so. It takes time, and it may be many months before you see results. Consistently track and analyze.

Metrics to track

Start by looking at traffic metrics. Organic traffic shows how many users find your site through search engines, which you can monitor in Google Analytics under Acquisition > Organic Search. Check referral traffic to see if backlinks are sending visitors to your site. This data shows how effective your SEO and link-building work is.

Next, evaluate engagement and search performance. Metrics like engagement rates and time on page help you understand how users interact with your content. On the search side, track keyword rankings with tools like Wincher, Ahrefs, or Semrush to see how well your pages are doing in the SERPs.

Use Google Search Console to monitor your CTR and check for indexing issues in the Page Indexing report. Make sure that your most important pages are indexed. Monitor loading speed, interactivity, and layout stability in tools like PageSpeed Insights.
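Flagging those low-CTR pages can be scripted from a Search Console performance export. A minimal sketch, assuming rows of (page, clicks, impressions) and illustrative thresholds:

```python
def low_ctr_pages(rows, min_impressions=1000, ctr_threshold=0.02):
    """Find pages that get impressions but rarely get clicked.

    `rows` mimics a Search Console performance export as
    (page, clicks, impressions) tuples; the thresholds are
    illustrative, not official benchmarks.
    """
    flagged = []
    for page, clicks, impressions in rows:
        if impressions < min_impressions:
            continue  # too little data to judge
        ctr = clicks / impressions
        if ctr < ctr_threshold:
            flagged.append((page, round(ctr, 4)))
    return flagged

# Hypothetical export rows
rows = [
    ("https://example.com/guide", 300, 5000),  # healthy CTR
    ("https://example.com/specs", 20, 4000),   # ranks, rarely clicked
    ("https://example.com/new", 1, 50),        # too few impressions
]
print(low_ctr_pages(rows))
```

Pages this flags rank well enough to be seen but fail to attract the click, which usually points at a weak title or meta description.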

Schedule regular check-ins

You need to make monitoring results a regular thing. Review rankings, CTR, and new crawl errors weekly. Each month, check traffic trends, user behavior, and fixes made during the audit. Every quarter, you should run a fresh crawl with Screaming Frog, check competitor performance, and update old pages based on new opportunities.

Conclusion on doing SEO audits

Following these steps will help you perform an SEO audit, from preparing your data to addressing user experience and technical SEO improvements. Make sure each fix you make aligns with your goals and strategy. Auditing regularly keeps your site running at its best and ready to rank in search results.

How To Manage Multiple Websites On WordPress

WordPress is the most popular content management system (CMS) in the world.

Many sites worldwide use it for good reasons: its extensibility means you can build more than just a website; its open-source nature means you own your site; and it tends to rank pretty well in search engines. Your only limit when it comes to WordPress is your imagination.

However, like a lot of other websites, it does need some ongoing care to make sure it's safe and secure.

This can be a bit intimidating for a new user, but with a bit of planning, you can have a management strategy for your WordPress websites that works and is scalable for your business.

In this article, I’ll share the questions I ask myself to manage WordPress in a scalable fashion.

Which Version Of WordPress Is Right For You?

The first question comes before your WordPress website is even built: how is your project structured?

If you are running multiple instances of WordPress where each one is relatively similar, WordPress multisite may be suitable.

This is where you run one instance of WordPress but have multiple websites running off one database, and one copy of every plugin and theme. It means scheduled tasks, such as plugin updates and backups, only need to run on one codebase.

WordPress multisite is great for larger sites that are all relatively similar – so subdomains for each department, or different languages or locales are perfect for multisite.

If you have, for example, a site and a blog subdomain, both running on WordPress, then I’d recommend looking at this approach. You can even have a WooCommerce solution in one of those subdomains.

If you’re an SEO agency running multiple WordPress sites, I recommend managing each instance separately.

The bespoke nature of client work could mean that the number of plugins and themes installed and available to every user becomes massive.

There could be client confidentiality issues. Every client could potentially see each other’s themes and plugins.

Also, there are potential security implications with one point of failure. If one site is compromised, then all the other sites in the network could be at risk.

Furthermore, not all hosts support WordPress multisite, so you really should speak to your host. It also requires a bit more technical knowledge to implement.

Should you wish to investigate multisite, WordPress has a guide on how to install a WordPress multisite network. However, for the rest of this guide, I assume you're using the vanilla version of WordPress.

Begin With Tools You May Not Know You Have (But Don’t Rely On Them)

WordPress and your host may have some tools available to you that you can use to automate some of the management.

Speak to your host and find out if they offer backups, how often, and where they are stored (backups hosted on the same server as the WordPress sites are next to useless!).

If they do, go through the process of restoring a backup to a staging server and document that process. Some hosts also put their backups behind a paywall, so you don't want to rely on them.

WordPress has the ability to enable auto-updates. Before enabling this, run through a test update of all plugins and themes on a staging server and review.

If themes have been edited without the creation of a child theme, then theme edits will be overwritten. Likewise, if changes were made to the plugins, then there could be errors.

If things haven’t been updated for a while, jumping from a very old version of the site to the latest may fail. Running through these changes on a staging server and testing thoroughly before deploying to live will minimize potential issues.

If both tests complete without issues, the sites should be safe to auto-update. Even if there is a problem and the site triggers a fatal error, the update will usually roll back to a working version of the site.

Even with both enabled, please don’t rely on them. There are often gotchas, especially with more complex solutions, that you may need to work around.

This is learned knowledge you put into your plan of action. Furthermore, hosts can go bust or change their offering, often at short notice, so it’s a good idea to have a host-independent plan.

Prepare A Plan

Before managing multiple WordPress sites, you need to have a plan. This is what you do on a daily, weekly, and monthly basis.

Daily tasks should be automated – these are your uptime and security monitoring, automatic backups, and updates.
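Of the daily tasks, uptime monitoring is the simplest to automate yourself. Below is a minimal Python sketch; the site list is a hypothetical placeholder, and the `opener` parameter is injectable purely so the check can be exercised without a live site:

```python
import urllib.request
import urllib.error

def check_uptime(url, timeout=10, opener=urllib.request.urlopen):
    """Return (is_up, status_code) for a single site.

    `opener` defaults to urllib's opener but can be swapped out for testing.
    """
    try:
        with opener(url, timeout=timeout) as resp:
            # 2xx/3xx responses count as "up"
            return (200 <= resp.status < 400, resp.status)
    except urllib.error.HTTPError as err:
        return (False, err.code)
    except (urllib.error.URLError, TimeoutError):
        return (False, None)

if __name__ == "__main__":
    # Hypothetical client list -- replace with your own sites.
    for site in ["https://example.com"]:
        up, code = check_uptime(site)
        print(f"{site}: {'UP' if up else 'DOWN'} ({code})")
```

In practice, you would run something like this from a scheduler (cron, for instance) and send an alert when a site reports DOWN; the hosted tools discussed later in this guide do exactly that for you.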

Weekly tasks are more manual tasks – any updates that were missed by the daily tasks, visual checks, and testing if functionality on the site works as intended.

Every month, some time should be dedicated to analyzing Google Search Console errors, testing the loading time of key pages, and searching for broken links. You should also dedicate some time every few months to reviewing plugins and identifying ones that have been abandoned or removed from the WordPress repository. Doing such reviews can help fend off security issues before they arise.
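Broken-link checks are also easy to script. Here is a sketch using Python's standard-library HTML parser to collect the links on a page; in a real monthly check, each collected URL would then be fetched and flagged if it returns a 4xx or 5xx status:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect href values from anchor tags in an HTML document."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html):
    """Return every href found in the given HTML string."""
    collector = LinkCollector()
    collector.feed(html)
    return collector.links
```

Dedicated tools (and some of the dashboards below) do this at scale, but a small script like this is enough for spot-checking a handful of key pages.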

For weekly and monthly tasks, pick a number of pages to look at. These are key pages – either high conversion or traffic pages or pages that have unique functionality.

If you have an ecommerce site, place a test order too.

Finally, you should put a plan in place for what happens if something goes wrong. This is typically if the site gets hacked, or if the site goes down for a considerable length of time.

Do a dummy restore of the site to make sure it works, and have a plan for what happens when a minor security breach (such as a vulnerability) is discovered. A well-built website with up-to-date plugins and themes, on a good host, is unlikely to be hacked, but nipping vulnerabilities in the bud reduces the risk further. Below, I share a few tools that have security monitoring linked to services like Patchstack.

Store this plan somewhere. I have a spreadsheet I use to manage my clients, containing the client name and contact, what package they are subscribed to, the key pages, as well as any gotchas associated with the client’s sites. After setting this up, you’re ready to go.

Tools You Can Use To Manage Multiple Websites On WordPress

Thankfully, to save time, there are a number of tools to help you manage multiple WordPress sites.

These will tend to allow you to update multiple sites from one dashboard, and handle things such as security and uptime monitoring.

They are largely very similar, and unless you have a particularly bad experience, you are unlikely to switch between them.

Here are some of the main players:

  • ManageWP offers updates and monthly backups for free. Paid services are “addons,” which start at $1-$2 per site for each addon, such as EU and U.S. server backups, uptime monitoring (which integrates with Slack), security and performance monitoring, and link monitoring. A site with all premium add-ons would cost $9/month. If you have over 25 sites, you can bundle services, with a maximum cost of $150/month for 100 sites. For full disclosure, I use ManageWP.
  • MainWP also allows free updates. It’s popular in the industry but a bit more complex, as it offers 30+ extensions that handle security and uptime monitoring, as well as integration with popular plugins like WooCommerce, Yoast, and WP Rocket, so you can analyze and update all your sites in one place. You will need to set up backups separately, and it costs $199/yr for unlimited sites or $599 for a lifetime license.
  • InfiniteWP has a free tier that allows you to update WordPress and plugins, and take manual backups that you can download. Its premium tier, however, allows scheduled backups to the cloud, malware scanning, uptime monitoring, broken link checkers, and 15+ other features. Pricing for premium is tiered, starting at $147/year for 10 sites, up to $647/yr for unlimited sites and users.
  • WPRemote allows you to update plugins and themes for free. It has premium tiers with backups, staging site creation, uptime monitoring, and security and vulnerability scans. Premium tiers start at $299/year for five sites at the “Basic” level to $9,999/year for 100 sites at the “Pro” level.

A Simple Plan For Managing Multiple WordPress Websites

If you are part of in-house development, or a marketing team managing multiple similar WordPress websites, then a WordPress multisite installation may be suitable for your needs.

For the vast majority of agencies, multiple WordPress installations with separate databases, and potentially separate hosts, is the way forward. Running an entire agency’s client websites through a WordPress multisite installation would quickly become unwieldy.

For both approaches, a few simple steps can be taken to help manage multiple WordPress websites:

  1. Prepare a task list for all WordPress sites, split into daily, weekly, and monthly tasks.
  2. To begin, run through each task with each client, to manually identify potential gotchas, and include these in your notes.
  3. Come up with a disaster recovery plan for worst case scenarios.
  4. Offload as many tasks as possible, such as security and uptime monitoring and backups to a third party.

Doing this will keep your websites secure and up to date, and get every performance optimization in the latest versions of WordPress onto your sites as quickly as possible.

Featured Image: fizkes/Shutterstock

Beyond Vanity Metrics: How Brands Are Measuring Social Media Impact via @sejournal, @rio_seo

Social media marketing metrics look vastly different today than they did even a few years ago.

Numbers used to define social success – the more likes, shares, and followers, the better. Yet raw numbers now carry less weight when evaluating success.

Connection is key.

If you’re not primarily focused on engaging with your target audience in a meaningful way, you risk falling behind.

Measuring social media impact must go beyond vanity metrics and will be a key determinant for expanding your social reach and follower loyalty.

It will deepen your understanding of where and how your audience engages with you most.

Engagement metrics reveal real impact. They point you toward the facets of your social media marketing that deserve the most attention: how your followers perceive your business.

These metrics enable you to go beyond surface-level indicators and help you align with what social media algorithms now value most, such as adhering to privacy regulations and ethical data collection.

Savvy social marketers looking to gain an edge and adapt to the latest algorithm updates must stay on top of what to track.

This article will explore how forward-thinking brands and marketers alike are redefining success on myriad social media platforms.

We’ll examine why we’re moving beyond traditional metrics, best practices for metric measurement, and the tools and technology shaping the future of social media marketing.

The Problem With Vanity Metrics

Measuring social media success used to rely mostly on the numbers, tied to the number of likes a post received or the number of people following an account.

The more the merrier – and it was easy to track, since more simply meant better. This number-based approach is referred to as “vanity metrics”: how consumers interact with your brand at a surface level.

While vanity metrics can highlight how many people are interested in your business, they aren’t granular enough to truly measure why customers find your content interesting (or if they don’t).

Yet, vanity metrics offer some semblance of success, even if it’s an instant hit of dopamine. Who hasn’t gotten excited by seeing their follower numbers steadily grow or gaining over 1,000 likes on a post?

However, vanity metrics aren’t the sole measure of success and rarely tell the full story of how a brand is actually performing on social media.

Follows, impressions, and likes are only a small fraction of how engaging your social media efforts actually are.

Another pertinent issue with vanity metrics is how they tie to broader business goals. Sure, vanity metrics may measure how quickly your platform is growing, but they simply don’t capture how that translates into qualified leads, sales, and more.

Consider the following: A brand may have thousands of followers, but what if it isn’t capturing (or caring about) how many of those followers turn into paying customers?

This missed opportunity makes it hard to justify social media spend and the effort allocated towards this marketing effort.

For example, a post may receive 2,000 likes, which might lead a brand to think the post is a great success. However, without understanding who liked it or what drove the spike in likes, it’s impossible to pinpoint the true value of the post.

The simple truth is numbers fail to tell the whole story. Context is key and can only be gained when social media marketers move beyond vanity metrics.

In the future, shifting beyond likes will be crucial to having a full, comprehensive view of performance.

Key Metrics That Matter

As a social media marketer, it’s crucial to keep up with ever-evolving best practices to stay visible in the sea of social posts and ahead of the curve.

In fact, the average social media user spends about 143 minutes a day scrolling through their social platforms, leaving ample opportunity for brands to make an impression.

As social platform algorithms change, so too does how these platforms evaluate performance.

For example, social media algorithms now favor and surface content that receives high engagement rather than displaying posts chronologically.

Today, success on social platforms looks different, extending beyond traditional metrics of interest. Building quality connections with followers will win.

It’s no longer a popularity contest but about building an authentic and genuine social presence with your customers, becoming a business customers can trust.

The new era of social media marketing metrics has arrived, and now is the time to up your metric tracking game.

Here are the metrics social media marketers should be tracking.

Engagement Metrics

Social media marketers are diving deeper, focusing on getting their audience to take meaningful action.

To do so, they must examine the depth of that engagement, digging deeper than likes alone.

Comments: A Free Form Of Feedback

There is much to be gained in feedback, including comments on social media. Comments can convey genuine emotion, both positive and negative.

Either way, both types of feedback highlight how captivating your content is. They can help your business glimpse what’s working and what isn’t, allowing you to mine your comments for common themes.

Tip: Focus on who is leaving thoughtful comments or questions, responding to those individuals whenever possible.

Sentiment: How Your Content Makes Your Customers Feel

How a customer feels about your business can often be understood through the context of a comment.

It’s likely emotion prompted the customer to comment in the first place, but knowing how to decipher the comment and the customer’s emotions helps brands take meaningful action. It can also highlight what’s working and what isn’t.

For example, perhaps videos capture your customers’ attention the most, and you may want to shift your content strategy to create more video-type content in the future.

Tip: Businesses should track sentiment accurately to enhance their customer experience.

Conversion Metrics

Understanding what makes customers take action will always be a top metric to track. This won’t be any different this year, and social media marketers should continue to monitor conversion metrics.

Click-Through Rates: Encourage Action

Click-through rates (or CTR) enable you to see how compelling your content truly is. Do people feel motivated to take the next step and read more, learn more, or buy from you?

Tip: Track which posts are gaining the most traction in terms of clicks to determine what’s working well and what to replicate in the future.

Sales: Ensure Your Social Efforts Pay Off

In the social media realm, it’s important to keep a pulse on the type of messaging that is driving your audience to take the next step – whether that’s landing on a product page or signing up for a monthly newsletter.

Tip: It’s equally important to attribute sales to your social media channels to get the most out of your marketing efforts.

Customer Retention Metrics

Social media isn’t just a tool for building a brand reputation and attracting new customers. It should also play a key role in retaining existing ones.

In fact, 88% of business leaders agree social media data is a must for improving customer retention and experiences.

Loyal customers want to come back to consume your content on your social media channels, and the right messaging can help develop deeper relationships.

Repeat Purchases: Keep The Revenue Rolling In

Demonstrating social media value is critical to encourage your customers to keep coming back for more.

Messaging might also differ between how you communicate with a prospect and with a repeat customer.

Having revenue attribution in place is a must on social media to help you better personalize communication between these different audience segments.

Tip: Set up proper attribution to be able to tie repeat sales back to social media.

Loyalty: Becoming More Than A One-Time Purchase

Loyalty programs have long been a tried-and-true marketing tactic to entice repeat customer engagement. They also help demonstrate value to your loyal customers by offering exclusive deals and promotions.

Tip: Ensure your loyalty signups are trackable through customized UTM links so you can attribute new member sign-ups to your social efforts.
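Building UTM-tagged links is mechanical enough to script. Here is a sketch using Python's standard library; the parameter values shown are illustrative, not a prescribed naming convention:

```python
from urllib.parse import urlsplit, urlunsplit, urlencode, parse_qsl

def add_utm(url, source, medium, campaign):
    """Append utm_source/utm_medium/utm_campaign to a URL,
    preserving any query parameters it already carries."""
    parts = urlsplit(url)
    query = dict(parse_qsl(parts.query))
    query.update({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    return urlunsplit(parts._replace(query=urlencode(query)))
```

Calling `add_utm("https://shop.example.com/loyalty", "instagram", "social", "loyalty_signup")` yields a link whose visits show up in analytics under that source, medium, and campaign, which is what makes the sign-up attribution possible.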

Tools And Technologies Empowering Brands

Social media marketing is getting smarter, leaning on emerging technologies to help enhance analytics, workflows, and performance.

This year, marketers can expect social platforms to grow more intelligent, increasingly relying on advanced technology, like artificial intelligence (AI), to fuel their platforms.

In turn, marketers can expect to see smarter insights, powering a clearer view of the metrics that measure success.

Here’s a closer look at how forward-thinking brands are leveraging technology to reshape their social strategies to improve experiences and marketing success.

AI

Undoubtedly, AI is transforming the way nearly every business operates.

Social media isn’t any different, enabling brands to measure success and even predict future performance with a precision never realized before.

AI-driven analytics are becoming deeply woven into the fabric of social media performance measurement.

In fact, predictive models can estimate the success of a campaign before it even kicks off.

With the help of AI, brands can avoid costly campaigns that will likely lead to lackluster outcomes by analyzing historical data and previous campaign data.

This deep level of insight allows social media marketers to make informed decisions that lead to more desirable outcomes.

AI can also predict the optimal posting time and day of the week, taking factors like time zones and target audience location into consideration.

In turn, social media marketers can more effectively get their content in front of more eyes.

In the social realm, AI has been a great help for marketers, enabling brands to get their messages in front of their desired audience more easily than ever before.

The good news is AI already has significant buy-in from organization leaders. A study found that 97% of leaders agree AI and machine learning (ML) enable businesses to analyze social media data and insights more efficiently.

Social Listening Tools

Social listening can lead to higher customer satisfaction rates. The 2025 Sprout Social Index found that 73% of social users will buy from a competitor if a brand doesn’t respond on social.

Your customers are talking, and if your brand isn’t listening, you risk losing their trust, loyalty, and hard-earned dollars.

Enter social listening tools, an easy way for brands to see what prospects and customers alike are saying about their business.

With social listening tools, brands can track specific keywords and set alerts any time someone mentions these keywords.

For example, a brand might want to track relevant hashtags related to their product or service or be alerted any time a competitor is mentioned.

Rapid alerts enable brands to move quickly, replying directly to customers any time they’re in need of customer support.

If a product has a major defect that’s widespread, customers may turn to social media to share their grievances with their followers.

How you respond to this feedback sets the stage for whether they’ll give your business the opportunity to course-correct, or take their business elsewhere.

The same can apply to a gap in service.

For example, a fast-casual Mexican food restaurant sought to find out how they could improve customer experiences. The restaurant leveraged a mix of AI and social media listening tools to identify where there was friction in the customer journey.

Customers frequently mentioned long wait times in their social media feedback, prompting the restaurant to take quick action to remedy this consistent complaint. The result? Better experiences and happier customers.

Integrated Dashboards

To successfully scale as a marketing entity, integrated dashboards are a must. The right technology allows marketers to view all their data in one centralized location.

From social media metrics to local search performance, technology should empower marketers to make better decisions with a clear view of their performance.

In theory, marketers should have a clear picture of the entire customer journey, from the moment a customer first searches for a business on Google, to the page they land on, to checkout.

A unified approach helps marketers measure how social media campaigns contribute to other organizational efforts such as lead generation, purchasing, and more.

Integrated dashboards also allow different teams to assess performance at a higher level.

Content marketers are able to see how their efforts coincide with social media marketers with a transparent view of social media metrics. It also allows teams to easily share their impact with C-level colleagues.

Next Steps For Social Media Marketers Ready To Make An Impact

As we’ve explored throughout this blog post, the days of measuring social media impact by simply looking at followers or like numbers are long gone.

This year, measuring social media impact will rely on looking at the right metrics and integrating the right technology.

Social media marketers will want to consistently showcase the results through integrated dashboards with their broader team to highlight the positive impact they’re making on the business.

By sharing how social media efforts tie to broader business goals, social media marketers can elevate their careers. It’s not all about the business, though; the real drivers of success are loyal and engaged customers.

Social media metrics do more than show what’s working and what isn’t. They show whether your potential and current customers care to consume your content.

Now that the new year has come, it’s time for brands to reassess their social media strategies. Are you relying on metrics that don’t show the full picture? Or are you adopting the strategies, tools, and technology that lead to better customer outcomes?

Featured Image: ImageFlow/Shutterstock

How To Create a Certified Fast Website To Compete In 2025

This post was sponsored by Bluehost. The opinions expressed in this article are the sponsor’s own.

Imagine clicking on a website only to wait several seconds for it to load.

Frustrating, right?

Your prospective customers think so, too.

In a world where attention spans are shrinking, even a one-second delay can lead to lost visitors, lower rankings, and missed revenue opportunities.

Research finds that B2C websites that load in one second or less have conversion rates three times higher than those that load in five seconds or more.

In other words, speed is no longer a luxury.

Speed is a necessity.

A fast-loading website enhances user experience, boosts SEO rankings, and drives higher conversions.

And with search engines and consumer expectations continuing to evolve, businesses must prioritize performance to stay ahead of the competition.

Implementing the right strategies ensures that websites remain fast, competitive, and ready for the demands of 2025.

A trusted partner like Bluehost provides the robust infrastructure, advanced caching mechanisms, and built-in performance enhancements needed to help websites reach peak efficiency.

1. How To Select The Right Hosting Plan

A website’s performance starts with selecting the right hosting plan. The plan should align with the site’s current and future needs to effectively accommodate growth and traffic fluctuations.

Assess Your Website’s Needs

Before settling on a hosting plan, it’s crucial to evaluate key factors like traffic expectations, content types, and scalability.

For example, websites with heavy multimedia content require more resources than text-based sites, and anticipated visitor numbers influence server capacity needs.

Additionally, selecting a plan that supports future growth ensures smooth scaling without performance bottlenecks.

Match Your Website’s Needs To What The Host Provides

Different hosting solutions cater to different website requirements, ranging from budget-friendly shared hosting to more robust, performance-driven plans. Bluehost offers multiple hosting options tailored to various business needs.

Shared Hosting can work well for smaller websites with moderate traffic, offering a cost-effective way to get started.

Bluehost’s VPS hosting offers more power and flexibility by providing dedicated resources, making it an excellent choice for growing websites that need additional performance.

For large-scale websites demanding maximum speed and control, our dedicated hosting plans deliver exclusive server access with top-tier performance for optimal speed and scalability.

2. Implement Caching Mechanisms

Caching is an essential tool for optimizing website speed by reducing the need to load the same data repeatedly. By storing frequently accessed files, caching decreases server load, enhances response times, and ensures visitors experience faster page loads.

Websites that effectively utilize caching experience better performance, lower bounce rates, and improved search rankings.

Use Built-In Caching Features

For instance, Bluehost provides multiple caching mechanisms to enhance website performance, such as PHP APC (Alternative PHP Cache). A powerful opcode caching system, PHP APC improves database query speed and optimizes PHP script execution, ensuring that frequently accessed data is retrieved faster.

On the other hand, edge caching minimizes latency by delivering content from servers closest to the user, reducing server response times and improving load speeds.

Bluehost makes it easy to use caching to enhance website speed. Caching can be enabled directly through the Bluehost control panel, ensuring seamless implementation.

Additionally, Bluehost is powered by Dell rack-mount servers, which use AMD EPYC chips, DDR5 RAM, and ultrafast NVMe storage. With caching plugins like W3 Total Cache or WP Rocket, your web pages will load faster, improving the user experience, SEO, traffic, and conversion rates.

3. Leverage Content Delivery Networks (CDNs)

Another way to speed up websites is to examine how content is delivered to users. A Content Delivery Network (CDN) enhances website performance by distributing content across multiple servers worldwide. This reduces latency and ensures visitors load pages faster, regardless of location.

CDNs minimize the physical distance between the server and the user by caching static assets like images, stylesheets, and scripts at various data centers worldwide. This results in faster load times and reduced bandwidth usage.

Beyond speed improvements, CDNs also enhance website security by protecting against DDoS attacks, traffic spikes, and malicious bots. Some CDNs offer additional features, such as image optimization, automated compression, and firewall rules, that further improve performance and security.

CDNs & Bluehost

Bluehost offers built-in CDN solutions, including Cloudflare integration, to help websites achieve optimal performance and security.

Activating a CDN through Bluehost’s dashboard is straightforward, and configuring settings that best suit a website’s needs significantly improves speed and reliability.

4. Optimize Images & Media

Impact Of Media Files On Load Times

Large images and unoptimized videos can significantly slow down a website. Why? High-resolution media files require more bandwidth and processing power, leading to slower page loads and a poorer user experience.

This is particularly problematic for mobile users and those with slower internet connections since heavy media files can take significantly longer to load, frustrating visitors and increasing bounce rates.

Additionally, media files that are not optimized can consume excessive server resources, potentially affecting overall website performance. If too many large files are loaded simultaneously, the hosting environment can strain, causing slowdowns for all users.

Image- and media-based slowdowns are widespread on websites that rely heavily on visual content, such as e-commerce platforms, portfolios, and media-heavy blogs.

Reducing file sizes, choosing appropriate formats, and leveraging compression techniques can greatly enhance website speed while maintaining visual quality.

How To Size Images The Right Way

First, while it may be common and easy to do, avoid using the width and height attributes in HTML to resize images since this forces the browser to scale the image, increasing load times and decreasing performance.

Instead, resize images before uploading them using graphic editing tools such as Photoshop, GIMP, or online compression services. Scaling images improperly can lead to pixelation and a stretched appearance, negatively impacting user experience.

By resizing images to their intended display size before uploading, websites can significantly reduce the amount of data a browser needs to process, resulting in faster page loads and a more visually appealing layout.

Appropriately resized images will also have a higher visual quality because they are sized for the right display dimensions.
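The arithmetic behind "resize before upload" is simply aspect-ratio-preserving scaling. Here is a small Python sketch that computes the export dimensions for a target bounding box; note that it deliberately never upscales:

```python
def fit_within(width, height, max_width, max_height):
    """Return (w, h) scaled to fit inside max_width x max_height,
    preserving aspect ratio and never enlarging the original."""
    scale = min(max_width / width, max_height / height, 1.0)
    return (round(width * scale), round(height * scale))
```

For example, a 4000x3000 photo destined for a 1200px-wide content column would be exported at 1200x900 in your editing tool, rather than shipped full-size and scaled down by the browser.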

How To Compress Images For Better Website Performance

Compressing images using tools like Squoosh, TinyPNG, or plugins like Smush helps reduce file sizes without sacrificing quality.

Implementing lazy loading ensures that off-screen images and videos only load when needed, reducing initial load times and enhancing overall site performance.

5. Minimize Plugins & External Scripts

How To Discover Your Plugins’ Usage

Overloading a website with excessive plugins and external scripts can severely impact performance. Therefore, it’s essential to regularly assess installed plugins and remove outdated, redundant, or unnecessary ones.

Limiting the number of external scripts running on a page can also help reduce loading times and improve efficiency.

How To Choose Efficient Plugins

Selecting the right plugins is crucial for maintaining website performance. First, look for lightweight, well-coded plugins that prioritize speed and efficiency.

Then, regularly auditing your plugins and removing outdated or redundant ones can prevent conflicts and minimize resource usage.

Bluehost provides hosting environments tailored for WordPress users, ensuring compatibility with essential caching, security, and SEO plugins.

By hosting your website on a reliable platform like Bluehost, you can benefit from a stable infrastructure that complements the best WordPress plugins. This will help you enhance functionality without compromising speed.

6. Tips For Compression, Minification & Technical Tweaks

Additional technical optimizations beyond caching and CDNs can further improve site speed and performance. Compression and minification techniques help reduce file sizes, while other backend optimizations ensure web pages load efficiently.

Implementing these strategies can significantly improve desktop and mobile user experiences.

Benefits Of Compression

Reducing the size of HTML, CSS, and JavaScript files significantly improves page speed. Compressed files require less bandwidth and load faster, creating a smoother user experience.
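Text formats like HTML compress extremely well because their markup is highly repetitive. A quick illustration using Python's standard gzip module (the markup below is a stand-in for a real page, not actual site content):

```python
import gzip

# Stand-in for a real HTML page: repeated markup, like most pages have.
html = b"<html><body>" + b"<p class='item'>Example item</p>" * 200 + b"</body></html>"

compressed = gzip.compress(html)
print(f"{len(html)} bytes -> {len(compressed)} bytes")
```

The server-side equivalent is what GZIP (or Brotli) compression does to every text response before it crosses the wire, which is why enabling it is one of the highest-leverage speed tweaks available.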

Effortless Compression & Technical Optimization With Bluehost

Bluehost makes compression easy. GZIP compression can be enabled via Bluehost’s control panel or by modifying the .htaccess file.

Plugins like Autoptimize help minify code by removing unnecessary characters, ensuring that files remain lightweight and optimized for performance.

Utilizing ETags & Expires Headers

Another important aspect of page speed optimization involves using ETags and Expires headers, which help streamline browser requests and improve overall efficiency.

These settings instruct a visitor’s browser on how to handle cached content, preventing unnecessary reloads and reducing the number of requests made to the server.

ETags (Entity Tags) are used by browsers to determine whether cached resources have been modified since the last visit. If the content remains unchanged, the browser loads the local copy instead of downloading it again, minimizing bandwidth usage and speeding up load times.

On the other hand, Expires headers specify a timeframe after which specific resources should be refreshed.

By setting an appropriate expiration date for static files like images, CSS, and JavaScript, web developers can ensure that repeat visitors are not unnecessarily reloading content that has not changed.

For example, a website logo that remains consistent across pages can be cached efficiently so that users do not have to download it every time they navigate the site.
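The conditional-request handshake that ETags enable can be sketched server-side in a few lines. This is a simplified model rather than any specific server's implementation: the ETag is derived from the response body, and a matching If-None-Match header short-circuits to a body-less 304:

```python
import hashlib

def make_etag(body):
    """Derive a strong ETag from the response body bytes."""
    return '"' + hashlib.sha256(body).hexdigest()[:16] + '"'

def respond(body, if_none_match=None):
    """Return (status, body, etag) for a request for this resource."""
    etag = make_etag(body)
    if if_none_match == etag:
        # Client's cached copy is still valid: send no body at all.
        return (304, b"", etag)
    return (200, body, etag)
```

The first request returns a 200 with the full body plus an ETag; a repeat request echoing that ETag in If-None-Match gets a 304, and the browser reuses its cached copy, which is exactly the saving described for the logo above.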

Properly configuring these settings enhances website performance, particularly for sites with recurring visitors. It prevents redundant data transfers and reduces the workload on the browser and server.

Many hosting providers, including Bluehost, offer tools and support to help website owners implement these optimizations effectively. This ensures a faster and more seamless user experience.

7. Regularly Monitor & Execute Maintenance

Practice Continuous Performance Assessment

Technology changes and slows down. Websites are no exception.

Therefore, websites should undergo regular performance assessments to ensure they’re continually optimized for the best user experience.

Routine speed testing helps identify areas where performance can be improved, whether by addressing slow-loading elements, optimizing server response times, or refining backend processes.

Various tools can assist in performance evaluation. Google PageSpeed Insights, for example, provides detailed reports on website speed and offers specific recommendations for improvements.

Lighthouse, Google’s open-source auditing tool, analyzes performance, accessibility, and SEO, helping site owners fine-tune their pages.

Beyond automated tools, ongoing monitoring through website analytics platforms, such as Google Analytics, can offer valuable insights into user behavior.

High bounce rates and low engagement metrics may indicate slow performance, guiding further refinements.

Businesses running ecommerce platforms or large applications should consider integrating application performance monitoring (APM) tools to track performance bottlenecks in real time.

Maintenance Tips

Regular updates to website software, regardless of the platform used, are essential for security and performance.

Content management systems (CMS) like WordPress, Joomla, and Drupal require frequent updates to core files, themes, and plugins to prevent compatibility issues and vulnerabilities. Similarly, frameworks and libraries for custom-built sites must be kept up to date to ensure efficiency and security.

Database optimization is another crucial maintenance task. Over time, databases accumulate redundant data, slowing down query execution.

Periodic optimizations, such as removing unused tables, cleaning up post revisions, and properly indexing databases, can enhance efficiency.
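The cleanup steps above can be sketched in a short script. This is a minimal illustration only, using SQLite as a stand-in database; the table and column names (`wp_posts`, `post_type`) mirror WordPress conventions, but on a production CMS you would run the equivalent queries through its own tooling (for WordPress, typically WP-CLI or a maintenance plugin) against MySQL, where `OPTIMIZE TABLE` plays the role of SQLite's `VACUUM`.

```python
import sqlite3

# Stand-in for a CMS database: a posts table that accumulates revisions.
# isolation_level=None puts the connection in autocommit mode, which VACUUM requires.
conn = sqlite3.connect(":memory:", isolation_level=None)
conn.execute("""
    CREATE TABLE wp_posts (
        id INTEGER PRIMARY KEY,
        post_type TEXT,      -- 'post' or 'revision'
        post_date TEXT
    )
""")
conn.executemany(
    "INSERT INTO wp_posts (post_type, post_date) VALUES (?, ?)",
    [("post", "2025-01-01"), ("revision", "2025-01-02"),
     ("revision", "2025-01-03"), ("post", "2025-01-04")],
)

# 1. Clean up accumulated post revisions.
conn.execute("DELETE FROM wp_posts WHERE post_type = 'revision'")

# 2. Index the column that queries filter on most often.
conn.execute("CREATE INDEX IF NOT EXISTS idx_posts_type ON wp_posts (post_type)")

# 3. Reclaim the space left behind by the deleted rows.
conn.execute("VACUUM")

remaining = conn.execute("SELECT COUNT(*) FROM wp_posts").fetchone()[0]
print(remaining)  # 2 posts remain after the revisions are purged
```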

Server maintenance is equally important. Websites hosted on dedicated or VPS servers should have automated backups, uptime monitoring, and log analysis configured.

Cloud-based hosting solutions like Bluehost Cloud provide performance-tracking tools that help identify and mitigate slowdowns at the infrastructure level, along with a 100% uptime SLA and other features that keep websites running smoothly.

Lastly, implementing a proactive security strategy ensures ongoing performance stability. Regular malware scans, security patches, and SSL certificate renewals help prevent vulnerabilities that could slow down or compromise a website.

Security plugins and firewalls, such as Cloudflare, add an extra layer of protection while minimizing unwanted traffic that could strain server resources.

This is where Bluehost stands out. We offer automated backups, performance monitoring tools, and dedicated 24/7 support professionals who can help keep your website running at peak efficiency.

And with a range of hosting plans tailored to different needs, Bluehost ensures that your website will remain fast, secure, and scalable as it grows.

Building a certified fast website in 2025 requires strategic hosting, caching, content delivery, and ongoing maintenance.

Leveraging Bluehost’s robust hosting plans, integrated CDN, and performance optimization tools ensures your website remains fast, competitive, and ready for the evolving digital landscape.

Bluehost’s hosting solutions provide an easy and reliable way to optimize performance.


Image Credits

Featured Image: Image by Bluehost. Used with permission.

How to have a child in the digital age

When the journalist and culture critic Amanda Hess got pregnant with her first child, in 2020, the internet was among the first to know. “More brands knew about my pregnancy than people did,” she writes of the torrent of targeted ads that came her way. “They all called me mama.” 

The internet held the promise of limitless information about becoming the perfect parent. But at seven months, Hess went in for an ultrasound appointment and everything shifted. The sonogram looked atypical. As she waited in an exam room for a doctor to go over the results, she felt the urge to reach for her phone. Though it “was ludicrous,” she writes, “in my panic, it felt incontrovertible: If I searched it smart and fast enough, the internet would save us. I had constructed my life through its screens, mapped the world along its circuits. Now I would make a second life there too.” Her doctor informed her of the condition he suspected her baby might have and told her, “Don’t google it.”

Unsurprisingly, that didn’t stop her. In fact, she writes, the more medical information that doctors produced—after weeks of escalating tests, her son was ultimately diagnosed with Beckwith-Wiedemann syndrome—the more digitally dependent she became: “I found I was turning to the internet, as opposed to my friends or my doctors, to resolve my feelings and emotions about what was happening to me and to exert a sense of external control over my body.”  

But how do we retain control over our bodies when corporations and the medical establishment have access to our most personal information? What happens when humans stop relying on their village, or even their family, for advice on having a kid and instead go online, where there’s a constant onslaught of information? How do we make sense of the contradictions of the internet—the tension between what’s inherently artificial and the “natural” methods its denizens are so eager to promote? In her new book, Second Life: Having a Child in the Digital Age (Doubleday, 2025), Hess explores these questions while delving into her firsthand experiences with apps, products, algorithms, online forums, advertisers, and more—each promising an easier, healthier, better path to parenthood. After welcoming her son, who is now healthy, in 2020 and another in 2022, Hess is the perfect person to ask: Is that really what they’re delivering? 

In your book, you write, “I imagined my [pregnancy] test’s pink dye spreading across Instagram, Facebook, Amazon. All around me, a techno-­corporate infrastructure was locking into place. I could sense the advertising algorithms recalibrating and the branded newsletters assembling in their queues. I knew that I was supposed to think of targeted advertising as evil, but I had never experienced it that way.” Can you unpack this a bit?

Before my pregnancy, I never felt like advertising technology was particularly smart or specific. So when my Instagram ads immediately clocked my pregnancy, it came as a bit of a surprise, and I realized that I was unaware of exactly how ad tech worked and how vast its reach was. It felt particularly eerie in this case because in the beginning my pregnancy was a secret that I kept from everyone except my spouse, so “the internet” was the only thing that was talking to me about it. Advertising became so personalized that it started to feel intimate, even though it was the opposite of that—it represented the corporate obliteration of my privacy. The pregnancy ads reached me before a doctor would even agree to see me.

Though your book was written before generative AI became so ubiquitous, I imagine you’ve thought about how it changes things. You write, “As soon as I got pregnant, I typed ‘what to do when you get pregnant’ in my phone, and now advertisers were supplying their own answers.” What do the rise of AI and the dramatic changes in search mean for someone who gets pregnant today and goes online for answers?

I just googled “what to do when you get pregnant” to see what Google’s generative AI widget tells me now, and it’s largely spitting out commonsensical recommendations: Make an appointment to see a doctor. Stop smoking cigarettes. That is followed by sponsored content from Babylist, an online baby registry company that is deeply enmeshed in the ad-tech system, and Perelel, a startup that sells expensive prenatal supplements. 

So whether or not the search engine is using AI, the information it’s providing to the newly pregnant is not particularly helpful or meaningful. 

The Clue period-tracking app
AMIE CHUNG/TRUNK ARCHIVE

For me, the oddly tantalizing thing was that I had asked the internet a question and it gave me something in response, as if we had a reciprocal relationship. So even before AI was embedded in these systems, they were fulfilling the same role for me—as a kind of synthetic conversation partner. It made me feel like I had some kind of relationship with my phone, when all it was really doing was staging a scene of information that it could monetize. 

As I wrote the book, I did put some pregnancy­-related questions to ChatGPT to try to get a sense of the values and assumptions that are encoded in its knowledge base. I asked for an image of a fetus, and it provided this garishly cartoonish, big-eyed cherub in response. But when I asked for a realistic image of a postpartum body, it refused to generate one for me! It was really an extension of something I write about in the book, which is that the image of the fetus is fetishized in a lot of these tech products while the pregnant or postpartum body is largely erased. 

You have this great—but quite sad—quote from a woman on TikTok who said, “I keep hearing it takes a village to raise a child. Do they just show up, or is there a number to call?” 

I really identified with that sentiment, while at the same time being suspicious of this idea that we can just call a hotline to conjure this village.

I am really interested that so many parent-­focused technologies sell themselves this way. [The pediatrician] Harvey Karp says that the Snoo, this robotic crib he created, is the new village. The parenting site Big Little Feelings describes its podcast listeners as a village. The maternity clothing brand Bumpsuit produces a podcast that’s actually called The Village. By using that phrase, these companies are evoking an idealized past that may never have existed, to sell consumer solutions. A society that provides communal support for children and parents is pitched as this ancient and irretrievable idea, as opposed to something that we could build in the future if we wanted to. It will take more than just, like, ordering something.

And the benefit of many of those robotic or “smart” products seems a bit nebulous. You share, for example, that the Nanit baby monitor told you your son was “sleeping more efficiently than 96% of babies, a solid A.”

I’m skeptical of this idea that a piece of consumer technology will really solve a serious problem families or children have. And if it does solve that problem, it only solves it for people who can afford it, which is reprehensible on some level. These products might create a positive difference for how long your baby is sleeping or how easy the diaper is to put on or whatever, but they are Band-Aids on a larger problem. I often found when I was testing out some of these products that the data [provided] was completely useless. My friend who uses the Nanit texted me the other day because she had found a new feature on its camera that showed you a heat map of where your baby had slept in the crib the night before. There is no use for that information, but when you see the heat map, you can try to interpret it to get some useless clues to your baby’s personality. It’s like a BuzzFeed quiz for your baby, where you can say, “Oh, he’s such, like, a right-side king,” or “He’s a down-the-middle guy,” or whatever. 

The Snoo Smart Sleeper Bassinet
COURTESY OF HAPPIEST BABY

These products encourage you to see your child themselves as an extension of the technology; Karp even talks about there being an on switch and an off switch in your baby for soothing. So if you do the “right” set of movements to activate the right switch, you can make the baby acquire some desirable trait, which I think is just an extension of this idea that your child can be under your complete control.

… which is very much the fantasy when you’re a parent.

These devices are often marketed as quasi-­medical devices. There’s a converging of consumer and medical categories in baby consumer tech, where the products are marketed as useful to any potential baby, including one who has a serious medical diagnosis or one who is completely healthy. These companies still want you to put a pulse oximeter on a healthy baby, just in case. They’re marketing a cure for the parents’ anxiety, but the product itself is attached to the body of a newborn child.

After spending so much time in hospital settings with my child hooked up to monitors, I was really excited to end that. So I’m interested in this opposite reaction, where there’s this urge to extend that experience, to take personal control of something that feels medical.

Even though I would search out any medical treatment that would help keep my kids healthy, childhood medical experiences can cause a lot of confusion and trauma for kids and their families, even when the results are positive. When you take that medical experience and turn it into something that’s very sleek and fits in your color scheme and is totally under your control, I think it can feel like you are seizing authority over that scary space.

Another thing you write about is how images define idealized versions of pregnancy and motherhood. 

I became interested in a famous photograph that a Swedish photographer named Lennart Nilsson took in the 1960s that was published on the cover of Life magazine. It’s an image of a 20-week-old fetus, and it’s advertised as the world’s first glimpse of life inside the womb. I bought a copy of the issue off eBay and opened it to find a little editor’s note saying that the cover fetus was actually a fetus that had been removed from its mother’s body through surgery. It wasn’t a picture of life—it was a picture of an abortion. 

I was interested in how Nilsson staged this fetal body to make it look celestial, like it was floating in space, and I recognized a lot of the elements of his work being incorporated in the tech products that I was using, like the CGI fetus generated by my pregnancy app, Flo. 

You also write about the images being provided at nonmedical sonogram clinics.

I was trying to google the address of a medical imaging center during my pregnancy when I came across a commercial sonogram clinic. There are hundreds of them around the country, with cutesy names like “Cherished Memories” and “You Kiss We Tell.” 

In the book I explore how technologies like ultrasound are used as essentially narrative devices, shaping the way that people think about their bodies and their pregnancies. Ultrasound is odd because it’s a medical technology that’s used to diagnose dangerous and scary conditions, but prospective parents are encouraged to view it as a kind of entertainment service while it’s happening. These commercial sonogram clinics interest me because they promise to completely banish the medical associations of the technology and elevate it into a pure consumer experience. 

The Nanit Pro baby monitor with Flex Stand
COURTESY OF NANIT

You write about “natural” childbirth, which, on the face of it, would seem counter to the digital age. As you note, the movement has always been about storytelling, and the story that it’s telling is really about pain.

When I was pregnant, I became really fascinated with people who discuss freebirth online, which is a practice on the very extreme end of “natural” childbirth rituals—where people give birth at home unassisted, with no obstetrician, midwife, or doula present. Sometimes they also refuse ultrasounds, vaccinations, or all prenatal care. I was interested in how this refusal of medical technology was being technologically promoted, through podcasts, YouTube videos, and Facebook groups. 

It struck me that a lot of the freebirth influencers I saw were interested in exerting supreme control over their pregnancies and children, leaving nothing under the power of medical experts or government regulators. And they were also interested in controlling the narratives of their births—making sure that the moment their children came into the world was staged with compelling imagery that centered them as the protagonist of the event. Video evidence of the most extreme examples—like the woman who freebirthed into the ocean—could go viral and launch the freebirther’s personal brand as a digital wellness guru in her own right. 

The phrase “natural childbirth” was coined by a British doctor, Grantly Dick-Read, in the 1920s. There’s a very funny section in his book for prospective mothers where he complains that women keep telling each other that childbirth hurts, and he claimed that the very idea that childbirth hurts was what created the pain, because birthing women were acting too tense. Dick-Read, like many of his contemporaries, had a racist theory that women he called “primitive” experienced no pain in childbirth because they hadn’t been exposed to white middle-class education and technologies. When I read his work, I was fascinated by the fact that he also described birth as a kind of performance, even back then. He claimed that undisturbed childbirths were totally painless, and he coached women through labor in an attempt to achieve them. Painless childbirth was pitched as a reward for reaching this peak state of natural femininity.

He was really into eugenics, by the way! I see a lot of him in the current presentation of “natural” childbirth online—[proponents] are still invested in a kind of denial, or suppression, of a woman’s actual experience in the pursuit of some unattainable ideal. Recently, I saw one Instagram post from a woman who claimed to have had a supernaturally pain-free childbirth, and she looks so pained and miserable in the photos, it’s absurd. 

I wanted to ask you about Clue and Flo, two very different period-tracking apps. Their contrasting origin stories are striking. 

I downloaded Flo as my period-tracking app many years ago for one reason: It was the first app that came up when I searched in the app store. Later, when I looked into its origins, I found that Flo was created by two brothers, cisgender men who do not menstruate, and that it had quickly outperformed and outearned an existing period-tracking app, Clue, which was created by a woman, Ida Tin, a few years earlier. 

The elements that make an app profitable and successful are not the same as the ones that users may actually want or need. My experience with Flo, especially after I became pregnant, was that it seemed designed to get me to open the app as frequently as possible, even if it didn’t have any new information to provide me about my pregnancy. Flo pitches itself as a kind of artificial nurse, even though it can’t actually examine you or your baby, but this kind of digital substitute has also become increasingly powerful as inequities in maternity care widen and decent care becomes less accessible.

One of the features of Flo I spent a lot of time with was its “Secret Chats” area, where anonymous users come together to go off about pregnancy. It was actually really fun, and it kept me coming back to Flo again and again, especially when I wasn’t discussing my pregnancy with people in real life. But it was also the place where I learned that digital connections are not nearly as helpful as physical connections; you can’t come over and help the anonymous secret chat friend soothe her baby. 

I’d asked Ida Tin if she considered adding a social or chat element to Clue, and she told me that she decided against it because it’s impossible to stem the misinformation that surfaces in a space like that.

You write that Flo “made it seem like I was making the empowered choice by surveilling myself.” After Roe was overturned, many women publicly opted out of that sort of surveillance by deleting their period-tracking apps. But you mention that it’s not just the apps that are sharing information.

When I spoke to attorneys who defend women in pregnancy criminalization cases, I found that they had not yet seen a case in which the government actually relied on data from those apps. In some cases, they have relied on users’ Google searches and Facebook messages, but far and away the central surveillance source that governments use is the medical system itself. 

Doctors and nurses test pregnant women for drugs without their explicit consent or tip off authorities to pregnant people they suspect of mishandling their pregnancies in some way. I’m interested in the fact that media coverage has focused so much on the potential danger of period apps and less on the real, established threat. I think it’s because it provides a deceptively simple solution: Just delete your period app to protect yourself. It’s much harder to dismantle the surveillance systems that are actually in place. You can’t just delete your doctor. 

This interview, which was conducted by phone and email, has been condensed and edited.

Inside China’s electric-vehicle-to-humanoid-robot pivot

This story originally appeared in The Algorithm, our weekly newsletter on AI. To get stories like this in your inbox first, sign up here.

While DOGE’s efforts to shutter federal agencies dominate news from Washington, the Trump administration is also making more global moves. Many of these center on China. Tariffs on goods from the country went into effect last week. There’s also been a minor foreign relations furor since DeepSeek’s big debut a few weeks ago. China has already displayed its dominance in electric vehicles, robotaxis, and drones, and the launch of the new model seems to add AI to the list. This caused the US president as well as some lawmakers to push for new export controls on powerful chips, and three states have now banned the use of DeepSeek on government devices. 

Now our intrepid China reporter, Caiwei Chen, has identified a new trend unfolding within China’s tech scene: Companies that were dominant in electric vehicles are betting big on translating that success into developing humanoid robots. I spoke with her about what she found out and what it might mean for Trump’s policies and the rest of the globe. 

James: Before we talk about robots, let’s talk about DeepSeek. The frenzy for the AI model peaked a couple of weeks ago. What are you hearing from other Chinese AI companies? How are they reacting?

Caiwei: I think other Chinese AI companies are scrambling to figure out why they haven’t built a model as strong as DeepSeek’s, despite having access to comparable funding and resources. DeepSeek’s success has sparked self-reflection on management styles and renewed confidence in China’s engineering talent. There’s also strong enthusiasm for building various applications on top of DeepSeek’s models.

Your story looks at electric-vehicle makers in China that are starting to work on humanoid robots, but I want to ask about a crazy stat. In China, 53% of vehicles sold are either electric or hybrid, compared with 8% in the US. What explains that? 

Price is a huge factor—there are countless EV brands competing at different price points, making them both affordable and high-quality. Government incentives also play a big role. In Beijing, for example, trading in an old car for an EV gets you 10,000 RMB (about $1,500), and that subsidy was recently doubled. Plus, finding public charging and battery-swapping infrastructure is much less of a hassle than in the US.

You open your story noting that China’s recent New Year Gala, watched by billions of people, featured a cast of humanoid robots, dancing and twirling handkerchiefs. We’ve covered how sometimes humanoid videos can be misleading. What did you think? 

I would say I was relatively impressed—the robots showed good agility and synchronization with the music, though their movements were simpler than human dancers’. The one trick that is supposed to impress the most is the part where they twirl the handkerchief with one finger, toss it into the air, and then catch it perfectly. This is the signature of the Yangko dance, and having performed it once as a child, I can attest to how difficult the trick is even for a human! There was some skepticism on the Chinese internet about how this was achieved and whether they used additional reinforcement like a magnet or a string to secure the handkerchief, and after watching the clip too many times, I tend to agree.

President Trump has already imposed tariffs on China and is planning even more. What could the implications be for China’s humanoid sector?  

Unitree’s H1 and G1 models are already available for purchase and were showcased at CES this year. Large-scale US deployment isn’t happening yet, but China’s lower production costs make these robots highly competitive. Given that 65% of the humanoid supply chain is in China, I wouldn’t be surprised if robotics becomes the next target in the US-China tech war.

In the US, humanoid robots are getting lots of investment, but there are plenty of skeptics who say they’re too clunky, finicky, and expensive to serve much use in factory settings. Are attitudes different in China?

Skepticism exists in China too, but I think there’s more confidence in deployment, especially in factories. With an aging population and a labor shortage on the horizon, there’s also growing interest in medical and caregiving applications for humanoid robots.

DeepSeek revived the conversation about chips and the way the US seeks to control where the best chips end up. How do the chip wars affect humanoid-robot development in China?

Training humanoid robots currently doesn’t demand as much computing power as training large language models, since there isn’t enough physical movement data to feed into models at scale. But as robots improve, they’ll need high-performance chips, and US sanctions will be a limiting factor. Chinese chipmakers are trying to catch up, but it’s a challenge.

For more, read Caiwei’s story on this humanoid pivot, as well as her look at the Chinese startups worth watching beyond DeepSeek. 


Now read the rest of The Algorithm

Deeper Learning

Motor neuron diseases took their voices. AI is bringing them back.

In motor neuron diseases, the neurons responsible for sending signals to the body’s muscles, including those used for speaking, are progressively destroyed. It robs people of their voices. But some, including a man in Miami named Jules Rodriguez, are now getting them back: An AI model learned to clone Rodriguez’s voice from recordings.

Why it matters: ElevenLabs, the company that created the voice clone, can do a lot with just 30 minutes of recordings. That’s a huge improvement over AI voice clones from just a few years ago, and it can really boost the day-to-day lives of the people who’ve used the technology. “This is genuinely AI for good,” says Richard Cave, a speech and language therapist at the Motor Neuron Disease Association in the UK. Read more from Jessica Hamzelou.

Bits and Bytes

A “true crime” documentary series has millions of views, but the murders are all AI-generated

A look inside the strange mind of someone who created a series of fake true-crime docs using AI, and the reactions of the many people who thought they were real. (404 Media)

The AI relationship revolution is already here

People are having all sorts of relationships with AI models, and these relationships run the gamut: weird, therapeutic, unhealthy, sexual, comforting, dangerous, useful. We’re living through the complexities of this in real time. Hear from some of the many people who are happy in their varied AI relationships and learn what sucked them in. (MIT Technology Review)

Robots are bringing new life to extinct species

A creature called Orobates pabsti waddled the planet 280 million years ago, but as with many prehistoric animals, scientists have not been able to use fossils to figure out exactly how it moved. So they’ve started building robots to help. (MIT Technology Review)

Lessons from the AI Action Summit in Paris

Last week, politicians and AI leaders from around the globe went to Paris for an AI Action Summit. While concerns about AI safety have dominated the event in years past, this year was more about deregulation and energy, a trend we’ve seen elsewhere. (The Guardian)  

OpenAI ditches its diversity commitment and adds a statement about “intellectual freedom”

Following the lead of other tech companies since the beginning of President Trump’s administration, OpenAI has removed a statement on diversity from its website. It has also updated its model spec—the document outlining the standards of its models—to say that “OpenAI believes in intellectual freedom, which includes the freedom to have, hear, and discuss ideas.” (Insider and TechCrunch)

The Musk-OpenAI battle has been heating up

Part of OpenAI is structured as a nonprofit, a legacy of its early commitments to make sure its technologies benefit all. Its recent attempts to restructure that nonprofit have triggered a lawsuit from Elon Musk, who alleges that the move would violate the legal and ethical principles of its nonprofit origins. Last week, Musk offered to buy OpenAI for $97.4 billion, in a bid that few people took seriously. Sam Altman dismissed it out of hand. Musk now says he will retract that bid if OpenAI stops its conversion of the nonprofit portion of the company. (Wall Street Journal)

Nokia is putting the first cellular network on the moon

Later this month, Intuitive Machines, the private company behind the first commercial lander that touched down on the moon, will launch a second lunar mission from NASA’s Kennedy Space Center. The plan is to deploy a lander, a rover, and a hopper to explore a site near the lunar south pole that could harbor water ice, and to put a communications satellite into lunar orbit. 

But the mission will also bring something that’s never been installed on the moon or anywhere else in space before—a fully functional 4G cellular network. 

Point-to-point radio communications, which need a clear line of sight between transmitting and receiving antennas, have always been a backbone of both surface communications and the link back to Earth, starting with the Apollo program. Using point-to-point radio in space wasn’t much of an issue in the past because there have never been that many points to connect. Usually, it was just a single spacecraft, a lander, or a rover talking to Earth. And they didn’t need to send much data either.

“They were based on [ultra high frequency] or [very high frequency] technologies connecting a small number of devices with relatively low data throughput,” says Thierry Klein, president of Nokia Bell Labs Solutions Research, which was contracted by NASA to design a cellular network for the moon back in 2020. 

But it could soon get way more crowded up there: NASA’s Artemis program calls for bringing astronauts back to the moon as early as 2028 and further expanding that presence into a permanent habitat in the 2030s. 

The shift from mostly point-to-point radio communications to a full-blown cell network architecture should result in higher data transfer speeds, better range, and a greater number of devices that can be connected simultaneously, Klein says. But the harsh conditions of space travel and of the lunar surface make it difficult to use Earth-based cell technology straight off the shelf. 

Instead, Nokia designed components that are robust against radiation, extreme temperatures, and the sorts of vibrations that will be experienced during the launch, flight, and landing. They put all these components in a single “network in a box,” which contains everything needed for a cell network except the antenna and a power source.

“We have the antenna on the lander, so together with the box that’s essentially your base station and your tower,” Klein says. The box will be powered by the lander’s solar panels.

During the IM-2 mission, the 4G cell network will allow for communication between the lander and the two vehicles. The network will likely only work for a few days—the spacecraft are not likely to survive after night descends on the lunar surface. 

But Nokia has plans for a more expansive 4G or 5G cell network that can cover the planned Artemis habitat and its surroundings. The company is also working on integrating cell communications in Axiom spacesuits meant for future lunar astronauts. “Maybe just one network in a box, one tower, would provide the entire coverage or maybe we would need multiple of these. That’s not going to be different from what you see in terrestrial cell networks deployment,” Klein says. He says the network should grow along with the future lunar economy. 

Not everyone is happy with this vision. LTE networks usually operate between 700 MHz and 2.6 GHz, a region of the radiofrequency spectrum that partially overlaps with frequencies reserved for radio astronomy. Having such radio signals coming from the moon could potentially interfere with observations.
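The conflict comes down to overlapping frequency intervals, which a quick check makes concrete. The LTE range below (700 MHz to 2.6 GHz) comes from the article; the 1400–1427 MHz band is one of the bands the ITU protects for radio astronomy (the neutral-hydrogen line), and the third band is an illustrative range below LTE for contrast.

```python
def bands_overlap(a, b):
    """Return True if two frequency bands (lo, hi), in MHz, overlap."""
    return a[0] < b[1] and b[0] < a[1]

LTE_BAND = (700, 2600)        # typical LTE operating range, per the article
HYDROGEN_LINE = (1400, 1427)  # ITU-protected radio astronomy band
BELOW_LTE = (470, 700)        # illustrative band below the LTE range

print(bands_overlap(LTE_BAND, HYDROGEN_LINE))  # True: LTE spans this band
print(bands_overlap(LTE_BAND, BELOW_LTE))      # False: no overlap
```

Any lunar transmitter inside a protected band would land directly in the passbands radio telescopes observe, which is why the frequency choice matters.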

“Telescopes are most sensitive in the direction that they are pointing—up towards the sky,” Chris De Pree, deputy spectrum manager at the National Radio Astronomy Observatory (NRAO), said in an email. Communication satellites like Starlink often end up in the radio telescopes’ line of sight. A full-scale cell network on the moon would add further noise to the night sky. 

There is also a regulatory hurdle to clear. There are radio bands that have been internationally allocated to support lunar missions, and the LTE band is not among them. “Using 4G frequencies on or around the moon is a violation of the ITU-R radio regulations,” NRAO’s spectrum manager Harvey Liszt explained in an email.

To legally deploy the 4G network on the moon, Nokia received a waiver specifically for the IM-2 mission. “For permanent deployment we’ll have to pick a different frequency band,” Klein says. “We already have a list of candidate frequencies to consider.” Even with the frequency shift, Klein says Nokia’s lunar network technology will remain compatible with terrestrial 4G or 5G standards.

And that means that if you happened to bring your smartphone to the moon, and it somehow survived both the trip and the brutal lunar conditions, it should work on the moon just like it does here on Earth. “It would connect if we put your phone on the list of approved devices,” Klein explains. All you’d need is a lunar SIM card.