Squarespace Update Strengthens Its Robust Website Builder via @sejournal, @martinibuster

Squarespace announced updates to its Blueprint AI, which automates website creation, and enhancements to its tool suite, further strengthening its website-building platform for small and medium-sized businesses.

Squarespace

Squarespace is known for its easy-to-use drag-and-drop interface, which allows users to select a template, modify it with a few clicks, and drag and drop web page elements to create a professional-looking website. Over 2% of all websites reportedly use Squarespace, making it a popular website-building platform for small to medium-sized businesses.

Blueprint AI

Blueprint AI, launched in late 2023, is Squarespace’s proprietary AI website builder that helps users create a website by answering questions about what kind of site they’re trying to create. The AI then creates a template based on the answers. Users can then use Squarespace’s full suite of editing features to modify the site to suit their needs and create a truly custom website.

Other Improvements

Squarespace also announced other improvements: a layout switcher that helps users change web page layouts and apply changes, a one-click style changer that instantly creates new style combinations, and a new hub for managing a website’s brand identity.

The announcement explained:

Layout Switcher:
An adaptive layout menu that enables faster website design experimentation—offering a set of flexible compositions with one’s content automatically embedded, then applied instantly to a page.

Site Themes:
One-click styling combinations that make it easier to preview and apply a new website aesthetic—via handpicked font pairings, color palettes, button styles and more, with recommendations aligned to a customer’s brand personality.

Brand Identity Management:
A central hub for crafting and storing one’s unique brand identity that guides Squarespace’s AI writer to instantly generate first draft, on-brand copy populated across key surface areas, including website text, content descriptions, and client documents, among others.

Takeaways

Squarespace has about 20 years of experience helping businesses easily build websites and start doing business online. This announcement shows that Squarespace continues to improve its already excellent platform, giving businesses the chance to compete effectively online.

Read Squarespace’s announcement:

Squarespace Refresh 2024: Introducing a New Era for Entrepreneurs

Featured Image by Shutterstock/IB Photography

Maintaining SEO Against Varying International Laws And Regulations via @sejournal, @TaylorDanRW

When implementing effective SEO strategies for clients, a frequent challenge is managing limited resources, especially in content creation and the technical capabilities needed to execute SEO recommendations.

This complexity increases when working with organizations operating across multiple territories and markets.

Each region may have its own set of regulations, language requirements, and market-specific needs, adding another layer of difficulty in executing consistent and compliant SEO strategies across different territories.

In these cases, strategies and routine activities often need to be adjusted to meet the specific laws and regulations of each location.

Non-compliance with these regulations might not directly impact your overall digital performance, but the organization could face significant consequences in the form of legal charges and potential fines.

Adjusting to these differences is essential for maintaining compliance and ensuring the successful implementation of SEO strategies.

Common Legislation

While understanding legislation may not fall entirely within the scope of SEO, being aware of the limitations it imposes on activities and data collection is crucial.

Legal regulations can directly impact how data is gathered, used, and stored, influencing SEO strategies in significant ways.

Beyond the Digital Millennium Copyright Act (DMCA), other legal frameworks can also affect SEO efforts, depending on the region in which a business operates.

Compliance with data privacy laws – like GDPR in Europe or CCPA in California, for example – can shape how businesses handle user data, adjust targeting, and execute their SEO tactics across different jurisdictions.

Global Privacy Legislation

Privacy regulations have a significant impact on SEO, as they influence how businesses can collect, store, and use personal data.

When we talk about privacy legislation, the two that generally come to mind are the General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA).

You may also encounter other regional privacy legislation when working with a global organization.

Understanding these different privacy laws and how they affect data handling (and user tracking) is important, as data between regions may not be directly comparable because of these laws.

European Accessibility Act (EAA) 2025

The EAA 2025 aims to improve accessibility for persons with disabilities across the EU by setting common requirements for certain products and services.

It aims to standardize practices, so that businesses comply with unified accessibility standards by June 28, 2025, promoting equal access to digital products and services.

This means that web design will need to adapt to meet specific accessibility standards, ensuring that websites are usable by individuals with disabilities.

This could include incorporating features like keyboard navigation, screen reader compatibility, alternative text for images, accessible forms, and adequate color contrast, allowing for a more inclusive online experience.
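The "adequate color contrast" requirement above has a precise definition you can check programmatically: WCAG 2.1 specifies a contrast ratio computed from the relative luminance of the foreground and background colors, with a minimum of 4.5:1 for normal body text at Level AA. A minimal Python sketch (the formula and thresholds follow the WCAG 2.1 definition; the function names are my own):

```python
def relative_luminance(hex_color: str) -> float:
    """Relative luminance of an sRGB color, per the WCAG 2.1 definition."""
    rgb = [int(hex_color.lstrip("#")[i:i + 2], 16) / 255 for i in (0, 2, 4)]

    def linearize(c: float) -> float:
        # WCAG's sRGB gamma expansion
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

    r, g, b = (linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg: str, bg: str) -> float:
    """Contrast ratio between two colors, from 1:1 (identical) to 21:1."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

# Black text on a white background is the maximum possible contrast, 21:1.
print(round(contrast_ratio("#000000", "#ffffff"), 1))
# Mid-gray on white falls below the 4.5:1 AA threshold for body text.
print(round(contrast_ratio("#888888", "#ffffff"), 2))
```

A check like this is easy to fold into a design review or CI step when auditing a site against the EAA's accessibility requirements.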

As companies work to adapt to (and become compliant with) this legislation, third-party software may be introduced to websites to facilitate a number (if not all) of the requirements.

This means adding scripts and potentially altering how a page loads and renders for both users and search engines.

Geo-Blocking Regulation (EU) 2018/302

The Geo-Blocking Regulation (EU) 2018/302 is a European Union regulation aimed at preventing unjustified geographical discrimination of customers within the EU’s single market.

It came into effect in December 2018.

The regulation specifically targets practices that block or redirect users trying to purchase goods or services online from a website “based” in a different EU member state.

A key aspect of the regulation is its treatment of geo-based redirects: it aims to prevent practices such as automatically redirecting users to a different section of a website (such as a localized subfolder) based on IP address.

During the COVID-19 pandemic, there were calls for the regulation to adapt to shifts in online shopping behavior.

Anecdotally, I’ve not seen many instances of companies in the EU falling foul of this regulation for geo-blocking.

In 2021, Valve, the company behind Steam, along with a number of video game publishers, was fined a total of €7.8 million for geo-blocking practices. Outside of this instance, very few cases have surfaced in my news feeds.

Differences Between US State Laws

Laws governing consumer protection, digital goods, and subscription services differ widely across U.S. states, resulting in unique legal frameworks that businesses must consider when operating in multiple regions.

These variations create challenges for companies, particularly in advertising and data compliance, as they must tailor their practices to meet the specific requirements of each state’s regulations.

Consumer Protection & Advertising Laws

Many states implement their own criteria for defining deceptive advertising, with some, like California and New York, establishing stricter guidelines than federal standards.

California’s Unfair Competition Law (UCL) and New York’s General Business Law are prime examples of state laws that set specific requirements for advertising practices.

These regulations often demand a higher level of compliance, making it essential for businesses to adjust their marketing efforts accordingly.

An example from the tangible world is the claim of “Made in the USA.”

In California, the definition of what qualifies as “Made in the USA” is notably more stringent than federal guidelines, directly influencing how companies can promote their products.

Businesses must carefully navigate these rules to ensure their advertising aligns with state-specific standards.

Laws Governing Digital Goods & Services

The sale and advertisement of goods and services online in the U.S. are often governed by varying state regulations. One area where this is evident is in the treatment of digital goods, such as ebooks and software.

Some states, like Texas, classify digital goods as taxable, requiring businesses to apply sales tax to their transactions.

Other states, such as Delaware, do not impose taxes on digital goods. These differences mean that businesses selling digital products must remain aware of each state’s rules to ensure compliance across multiple jurisdictions.

Subscription Renewals

Some states, like California, have specific rules around automatic subscription renewals. Businesses must clearly disclose renewal terms, obtain affirmative consent, and make it easy for consumers to cancel. Other states have less stringent or no such regulations.

This could lead to retention and MRR data being lower for states like California than for others, and it is important to understand this when reviewing data and then using it to further inform marketing strategy.

This is especially pertinent in the SaaS space.

What You Should Be Asking Your SEO Vendor

Companies must ensure that any third-party marketing vendors they work with are also compliant with these privacy laws.

This includes reviewing contracts and agreements with vendors to ensure they follow proper data-handling practices, including the ability to delete, disclose, or limit the use of consumer data.

Why This Matters

Global compliance is essential for businesses to effectively manage the complexities of the international digital landscape.

Ensuring that SEO strategies align with the legal frameworks of each region is a key part of this process, and of building long-term, sustainable organic campaigns that drive value across multiple territories.

Looking ahead, it’s not out of the question that Google may introduce a user accessibility metric, similar to how Core Web Vitals serve as a proxy for user experience.

There is some historical basis for this, with prior emphasis on HTTPS for securing the web, along with mobile-first strategies and page speed optimizations.

While these factors are “ranking factors,” the greater emphasis on them was to enact change across the wider internet to benefit users.



Featured Image: Rawpixel.com/Shutterstock

Google Updates Their Spam Policy Documentation via @sejournal, @martinibuster

Google updated their spam documentation, adding a new definition of site reputation abuse as the largest single change, followed by additional information about manual action consequences. The remaining updates are a content refresh aimed at making the documentation easier to understand and more concise. Understanding these changes can provide ideas for how to update your own content effectively.

What Changed

There are about eight kinds of changes made to the documentation that improve the content. That’s eight ways that older content can be made fresher.

These are the types of changes made:

  • More Information About Site Reputation Abuse
  • New Details About Manual Action Consequences
  • Changed Concept Of Thin Affiliate To Thin Affiliation
  • More Appropriate Introductory Sentence
  • Consolidation Of Words: Practices & Spam Practices
  • Added The Concept Of Spam Abuse
  • Improved Conciseness In General
  • Improved Topic: Machine-Generated Traffic

More Information About Site Reputation Abuse

The previous documentation stated that site reputation abuse is when a third party publishes content on an authoritative site “with little or no first-party oversight,” but it didn’t explain what “first-party oversight” is, so the new version of the spam documentation adds a definition.

“Close oversight or involvement is when the first-party hosting site is directly producing or generating unique content (for example, via staff directly employed by the first-party, or freelancers working for staff of the first-party site). It is not working with third-party services (such as “white-label” or “turnkey”) that focus on redistributing content with the primary purpose of manipulating search rankings.”

New Details About Manual Action Consequences

Google added a new sentence explaining that one of the consequences of continuing to violate Google’s spam guidelines is escalation: Google can remove more sections of a site from the search results. This isn’t a new consequence, but it is new information.

This is the new detail in the context of a site that continues to spam:

“…and taking broader action in Google Search (for example, removing more sections of a site from Search results).”

This is an example of refreshing content by adding additional information that was left out of the original version.

Changed Concept Of Thin Affiliate To Thin Affiliation

Google changed the section about “Thin affiliate pages” so that it is now about “Thin affiliation” and added a definition of what they mean.

The original version about thin affiliate pages started like this:

“Thin affiliate pages are pages with product affiliate links…”

The new version starts like this:

“Thin affiliation is the practice of publishing content with product affiliate links…”

More Appropriate Introductory Sentence

Google’s documentation improved the introductory sentence by making it more appropriate for the context of the topic. It now defines what spam is. The new sentence doesn’t replace the old introductory sentence; the old one simply becomes the second sentence.

Original introductory sentence:

“Our spam policies help protect users and improve the quality of search results.”

New introductory sentence:

“In the context of Google Search, spam is web content that’s designed to deceive users or manipulate our Search systems in order to rank highly. Our spam policies help protect users and improve the quality of search results.”

The new version starts with a definition of spam, which makes sense for documentation about spam.

Consolidation Of Words: Practices & Spam Practices

The following examples show how Google consolidated euphemisms for the same thing (spam) into a single, emphasized phrase: “spam practices.”

This change combines phrases like “content and behaviors” and “forms of spam” into the simpler phrases “practices” and “spam practices.” I’m not sure why Google made this change, but using consistent terminology makes content easier to understand.

Here are some examples of the phrase “practices” and “spam practices” being emphasized:

1. The second paragraph is changed to make it more concise.

This:

“We detect policy-violating content and behaviors both through automated systems….”

Is now this:

“We detect policy-violating practices…”

The sentence becomes easier to understand. <— This is important.

2. Around the fourth paragraph:

This:

“Our policies cover common forms of spam, but Google may act against any type of spam we detect.”

Becomes this:

“Our policies cover common spam practices, but Google may act against any type of spam practices we detect.”

The new sentence above is kind of redundant, but it shows a conscious effort to consolidate similar activities into a single category of activity.

Concept Of Spam Abuse

The next change is to increase the use of the word “abuse” in the new version of the spam policies. Abuse is a word that describes a harmful activity. In the case of SEO, Google may be using that word because it describes an activity that intentionally deceives users and search engines.

The old version used the word 11 times and the new version uses it 17 times. It’s a relatively minor change, but it heightens the concept of spam as a form of abuse.

Here are two examples of how Google added the concept of abuse:

  1. The word “doorways” is now “doorway abuse”
  2. The phrase “Hidden text and links” is now “Hidden text and links abuse”

There are other changes to the documentation where Google adds the word “abuse.” What’s interesting is that introducing a single concept (abuse) makes a series of seemingly different things related. This helps reader comprehension because “hidden text” and “doorways” are now connected to each other through the concept of “abuse” in the sense of spam.

Improved Conciseness

Another change which should always be considered in a content refresh is to make phrases more concise.

Google changed the following text:

“Google uses links as a factor in determining the relevancy of web pages. Any links that are intended to manipulate rankings in Google Search results may be considered link spam. This includes any behavior that manipulates links to your site or outgoing links from your site.”

It’s now significantly shorter:

“Link spam is the practice of creating links to or from a site primarily for the purpose of manipulating search rankings.”

Big difference, right? I really like that change because someone probably looked at those original three sentences and considered what core message was trying to get through that thicket of words.

If you read the original three sentences, it’s a lot of information that doesn’t really stick in the mind. Considering whether a series of sentences communicates effectively is a good way to approach a content rewrite. Just read it and ask, “What does this mean?” If the answer is shorter, consider writing that in place of the original sentences.

Improved Topic Communication: Machine-Generated Traffic

This next change dramatically improves the machine-generated traffic section because it removes a part that makes it about Google and makes it more about a definition of machine-generated traffic.

These sentences:

“Machine-generated traffic consumes resources and interferes with our ability to best serve users. Examples of automated traffic include:”

Are now this:

“Machine-generated traffic (also called automated traffic) refers to the practice of sending automated queries to Google. This includes scraping…”

The part about consuming resources is still there but it’s now moved toward the end of that section.

There are other instances in the documentation where two sentences were shortened into one that gets to the point more directly and concisely.

For example, the section about Misleading Functionality replaces two sentences with one sentence that defines what misleading functionality is:

“Misleading functionality refers to the practice of…”

The section about Scraped Content replaced three long sentences with a sentence that defines what scraped content is:

“Scraping refers to the practice of taking content from other sites…”

Content Refresh Versus A Rewrite

The updated spam documentation is not a rewrite but an incremental refresh with some new information. It suggests ways to update your own content by adding new details and making existing information clearer and more concise.

Read the updated documentation:

Spam policies for Google web search

Featured Image by Shutterstock/Shutterstock AI Generator

WordPress Bans Thousands Of WP Engine Customers via @sejournal, @martinibuster

WordPress banned WP Engine, a managed WordPress web host, blocking thousands of websites from adding or updating plugins and themes. Technology writer Robert Scoble described the decision as “universally hated in tech,” noting that out of hundreds of posts on the subject, almost none were on Mullenweg’s side of the issue.

What Happened

Matt Mullenweg, co-creator of WordPress, accused WP Engine of infringing on their trademarks and demanded tens of millions of dollars in compensation. Things came to a head on the last day of the recent WordCamp WordPress conference when Mullenweg gave WP Engine until 4:30 PM that day to comply with his demands. Failure to agree to those demands led to a public shaming of WP Engine by Mullenweg and the subsequent ban of WP Engine.

WordPress.org Bans WP Engine

In a post titled WP Engine is banned from WordPress.org, Mullenweg announced that WP Engine was banned and cut off from their plugin and theme repository.

He wrote:

“Any WP Engine customers having trouble with their sites should contact WP Engine support and ask them to fix it.”

WP Engine posted an incident report on their website that offered a workaround:

“WordPress.org has blocked WP Engine customers from updating and installing plugins and themes via WP Admin. There is currently no impact on the performance, reliability or security of your site nor does it impact your ability to make updates to your code or content. We know how important this is for you and we are actively developing a remediation for this issue. We will update you as soon as we have a fix.

If you need to install or update a plugin or theme, manual instructions can be found at https://wpengine.com/support/manage-plugins-and-themes-manually

If you have any questions or need assistance, do not hesitate to reach out to our technical support team.”

WordPress Core Contributor Sentiment

WordPress core contributors are apparently unhappy with the current situation. A post on Reddit by an anonymous code contributor to WordPress indicated that the core developer community is not rallying around Mullenweg.

The WordPress core contributor wrote:

“WordPress core dev here.

All contributors, Automattic and non Automattic, are watching very closely. We’re also thinking very carefully about our contributions. This is a community project and contributors are part of the community. No matter who is listed as project leadership, we’ll continue to be here for the community.

I’ve said this in other comments, but whether Matt has been accurate doesn’t even come into it for plenty of us. The way this has been done, and is continuing to be done, is such a significant problem to address before even looking at whether he’s been accurate or not.

The community, which includes us at WordPress core, are not rallying around in support of this action. Everyone I’ve spoken to at WordPress core had no prior notice of this action being taken. Given the lack of notice about this latest action, it raises concerns about whether more is to come. Right now, there’s an almost deafening public silence in contributor-to-contributor communication. We’re still trying to regulate our reactions to recent events and hopefully avoid adding more of the kinds of kneejerk actions our whole community have been subjected to in recent days.

Ceasing our own contributions would have further impact on the community. We definitely don’t want that. Even with that at the forefront of my mind, if a decision is made to engage in a collective withdrawal of contributions, with a clearly communicated desired outcome to break that withdrawal, I’d join that action. Regretfully.”

The core contributor related that aside from the warning about legal action, Mullenweg has not discussed his plans or course of action with the WordPress contributors. They also confirmed that none of the core contributors have made a change to “facilitate his actions”.

The core contributor posted a follow up to their comments to say that they’re not seeing anything positive yet.

“It should be crystal clear in Matt’s mind that what’s happening right now can’t continue. How he chooses to act on that knowledge is anyone’s guess. Unfortunately, what I’m seeing in his messages through all communication channels so far doesn’t show signs of anything positive, yet (as I said in my earlier comment, we’re all watching very, very closely).”

Social Media Reaction

An overwhelming number of the posts on X (formerly Twitter) express disappointment with Mullenweg’s actions and are supportive of WP Engine.

Technology writer Robert Scoble posted:

“WordPress at war. So sad.

I was one of the first to use WordPress. I didn’t see this coming. I read hundreds of posts about what @photomatt did and almost none of them are on his side.

Never seen a decision so universally hated in tech. Lots of my old friends are in pain tonight.”

Typical reaction on X:

“This is absolutely insane and such a disgusting abuse of power by @photomatt. I have clients using @wpengine and now they can’t easily install plugins or update themes. #WordPress”

Another WordPress user posted:

“Gotta say, I’m with WP Engine on this. Not saying they are the “good guys” but if we all have to pay Automattic for using the word “WordPress” in our marketing, then we’re all in trouble. This sets a dangerous precedent.”

Read Mullenweg’s post on WordPress.org

WP Engine is banned from WordPress.org

Read the WP Engine Incident Report:

Plugin and Theme Update Restrictions

Featured Image by Shutterstock/Wirestock Creators

Four Reasons You Can’t Ignore Branded SEO in 2025

This post was sponsored by Similarweb. The opinions expressed in this article are the sponsor’s own.

According to Rand Fishkin:

‘For most small and medium businesses and newer creators/publishers, SEO is likely to show poor results until you’ve established credibility, navigational demand, and a strong reputation among a sizable audience.’

In other words, if you want to build organic traffic, build your brand.

The question is: how do search engines measure brand, and what does that mean for your SEO?

In this post, I’ll dig into what influence SEOs have over brand-building, and show you:

  • Why branded SEO is an untapped opportunity you should grab with both hands.
  • How brand in SEO differs from traditional brand strategy.
  • Ways you can educate Google about your brand.

Branded SEO, An Untapped Opportunity

Branded SEO remains a largely untapped opportunity. It’s untapped because, as SEOs, we’re trained to ignore branded traffic. However, by ignoring branded search, you are potentially missing some big opportunities to move your business forward.

The reason is that users don’t just use search engines to discover information and products. They also use search engines to discover brands. By focusing on how your brand shows up, you have an opportunity to influence how middle-of-the-funnel users perceive your brand. If you get it right, that could turn into a long-term relationship with your brand. Get it wrong, and I’m sure you have a number of competitors that would love to have the business.

This leads us to a crucial question…

Is There An SEO Angle To Brand?

As SEOs, we influence how content appears on search engines. The function of a search engine is to match end users with content. This means as SEOs, we are not just dealing with how users perceive your brand. We are dealing with how search engines understand how users perceive your brand.

The difference is not subtle.

Search engine algorithms measure how strong a brand is and incorporate those signals into the search results. They primarily seem to use three methods:

  • Measuring branded search queries
  • Measuring brand engagement metrics
  • Understanding your brand entity

So, if the search engine incorporates brand signals into how it generates search results, then as SEOs we should be looking for a strategic way to influence these signals.

1. Branded Search Queries

Google’s leaked documents reveal a crucial aspect of branded SEO: the strength of a brand significantly influences its search rankings. This is measured through what we can call the BrandQueryFactor. This metric assesses how frequently users search for a brand by name. The more brand-specific queries a company receives, the higher its likelihood of ranking well in search results.

We understand that branded searches affect your rankings; the question now is, in what way do they affect them?

Perhaps the answer can be found in a Google patent called Ranking Search Results. This patent describes how Google uses branded search queries as a quality factor similar to links. In fact, it describes branded and navigational queries as implied links that demonstrate user trust and intent.

These ‘implied links’ have a slightly different role in establishing a site’s authority than actual links:

  • Links act as a vote of confidence from other websites, often indicating external recognition or authority
  • Branded queries reflect real-world user interest, signaling how often users search for and interact with a resource through queries

Putting that together with the Google leak metrics above, we see that brand signals include user engagement and branded query analysis. Google uses these signals to see how users engage with your brand.

Increasing Brand Signals With Branded Queries And Direct Traffic

As an SEO, can you increase branded traffic? The sad truth is: not directly. When your SEO starts to bring in traffic, you will see an increase in branded searches. But this is an indirect benefit rather than a branded search strategy.

Does this mean branded search is out of your hands?

To answer this, it’s important to first understand how to increase brand signals.

A few years ago, when I worked as an SEO manager, I noticed something. Whenever we ran PPC campaigns, we would see increasing amounts of branded search terms in our Search Console accounts.

The reason is pretty simple. People saw our ads and Googled the brand name. Take a look at the Similarweb Channels report below. What do you see? (Hint, the blue line represents organic traffic.)

Screenshot from Similarweb, September 2024

Channel data for greenies.com

As we see above, organic traffic seems to correlate directly with paid search.

Screenshot from Similarweb, September 2024

Looking at the organic search breakdown for the brand in 2023, we see that 72% of the site’s keywords were branded.

Screenshot from Similarweb, September 2024

 

Channel data for elorea.com

What we see above is that organic traffic directly correlates with other channels.

Screenshot from Similarweb, September 2024

Looking at the organic search breakdown for the brand in 2023, we see that 81% of the site’s keywords were branded.

The reason for this pattern is that the more your audience sees your brand, the more likely they are to Google your brand.

Another great example of this is monday.com. The brand has doubled down on its brand strategy, focusing primarily on YouTube ads.

Screenshot from Similarweb, September 2024

Its YouTube ads, together with its memorable domain name, have led to unprecedented levels of direct traffic.

Screenshot from Similarweb, September 2024

Although I haven’t yet seen evidence that Google uses direct traffic as a ranking signal, it stands to reason that it does. What’s more, you can see that Google has taken notice of the brand.

Try Googling the word Monday and then Google the word Tuesday and compare the results. To Google, the word Monday refers to a brand.

Screenshot from search for Monday, Google, September 2024

Tuesday, on the other hand, is a day of the week.

Screenshot from search for Tuesday, Google, September 2024

The takeaway: You can increase brand signals like branded keywords and direct traffic by focusing on other channels.

So where does this leave you as an SEO?

You have the data to assess branded keywords and direct traffic, putting you in a unique position to partner with marketing leaders to work on big-picture marketing strategies designed to increase branded keywords.

2. Brand Engagement Metrics

Looking at the Google leak we can also see that Google measures user engagement as a signal of brand strength.

Brand engagement metrics include user engagement factors such as click-through rates (CTR) and user interactions with the brand's content. Higher engagement can positively influence rankings.

This means focusing on improving user engagement is a crucial aspect of brand SEO.

The best way to evaluate user engagement on your site is to compare your engagement metrics with those of your competitors.

For instance, I'm analyzing toyota.com and four of its competitors with the Similarweb Website Performance report. Looking at the engagement metrics, we see that ford.com gets more engagement on almost all metrics.

Screenshot from Similarweb, September 2024

One of the best ways to improve user engagement is to focus on site navigation. This means figuring out the flow of information on your site and including it in your:

  • URL structure
  • Breadcrumbs
  • Top-level menu

Also, make sure that the above-the-fold section of every piece of content directly answers the user intent.

3. Creating Content For [Brand] + Modifier Keywords

You have direct influence over how your brand appears when users search for it on Google. While you might assume all your branded traffic goes to your homepage, there are actually other ways to capture this traffic.

What’s more, your branded traffic can help you discover customer sticking points or even areas where you are potentially losing customers to your competitors.

Want to see how? Try digging through your branded keywords. You are looking for keyword modifiers that either represent issues to resolve or opportunities to be won.

Keyword modifiers might be:

  • [brand] pricing
  • [brand] reviews
  • [brand] alternative
  • Where is [brand] located

By looking through your branded keywords, you can quickly see how users are interacting with your brand by seeing the questions they are asking. Make sure you have content that answers all of these questions. If you find long-tail queries, it might be a good idea to create an FAQ on your site.
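The modifier audit above can be sketched in a few lines. This is a hypothetical example: the modifier buckets and page names are placeholders you would adapt to your own branded-keyword export and site structure.

```python
# Hypothetical modifier buckets; adjust to your own branded-keyword export.
BUCKETS = {
    "pricing": "pricing page",
    "review": "reviews page",
    "alternative": "comparison page",
    "login": "support/FAQ",
    "cancel": "support/FAQ",
}

def content_gaps(branded_queries, covered_pages):
    """Map branded modifiers to the page type that should answer them,
    flagging buckets you don't yet have content for."""
    gaps = set()
    for query in branded_queries:
        for modifier, page in BUCKETS.items():
            if modifier in query.lower() and page not in covered_pages:
                gaps.add(page)
    return sorted(gaps)

queries = ["acme pricing", "acme reviews", "acme alternative", "cancel acme"]
print(content_gaps(queries, covered_pages={"pricing page"}))
# → ['comparison page', 'reviews page', 'support/FAQ']
```

Each flagged bucket is a candidate page (or FAQ entry) that keeps branded searchers on your site instead of sending them elsewhere.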

If you don't, you might see your branded traffic go to sites like YouTube or, worse, your competitors.

For instance, using the Similarweb SERP Players report below, we see a large portion of branded clicks for Ninja Creami going to YouTube.

Screenshot from Similarweb, September 2024

What's fascinating about this is that YouTube is not above the fold. This means that users often have more than one search intent and are willing to scroll to find what they are looking for. Can you afford to lose traffic to YouTube?

Screenshot from search for Ninja Creami, Google, September 2024

In a case like this, depending on which videos users are clicking on, it might make sense for the brand to create video content designed to feature on the SERP.

A great example of a brand that got this right is wildgrain.com. In 2023, the keyword "wildgrain reviews" was trending. Fortunately for the brand, it already had a page on its site featuring reviews.

Googling the keyword, users were faced with Wildgrain's own reviews page ranking in position #1. What's more, the rich result included a review rating of 4.7 out of 5. It also listed the number of reviews (currently 31,040).

Screenshot from search for Wildgrain reviews, Google, September 2024
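Rich results like that rating are typically driven by structured data on the page. Below is an illustrative JSON-LD sketch, not Wildgrain's actual markup: the product name is a placeholder, and the right `@type` (Product, Organization, etc.) depends on what is actually being reviewed; only the rating figures mirror the example above.

```python
import json

# Illustrative review-snippet markup; the product name is hypothetical
# and the rating values mirror the Wildgrain example in the text.
snippet = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Subscription Box",
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.7",
        "bestRating": "5",
        "reviewCount": "31040",
    },
}
print('<script type="application/ld+json">')
print(json.dumps(snippet, indent=2))
print("</script>")
```

Embedding a block like this on the reviews page gives Google machine-readable rating data to draw on when it assembles the rich result.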

Our data shows that 72% of searches were zero-click. That means the vast majority of users were satisfied with what they saw in the search results.

What's interesting is that of the remaining 28% of users, 57% clicked on Wildgrain's own result.

Screenshot from Similarweb, September 2024

The takeaway: with the right content, you can directly influence how users interact with your brand, even on things like reviews.

4. Educating Search Engines About Your Brand Entity

Another aspect of how Google evaluates your brand is through your brand entity. Google’s machine learning allows the search engine to understand real-world entities. It does this by gathering information about entities mentioned around the web and arranging that information in a similar way to how a human brain arranges information.

The purpose of this is to understand the relationships between people, places, and things so that Google can deliver more relevant and contextual information in the SERPs.

Google’s knowledge is constantly expanding and updating as new information becomes available.

How does Google understand your brand?

To find out, just Google it. If there are strong signals around the web, Google will present you with things like a Knowledge Panel, Twitter (X) boxes, image boxes and more.

Screenshot from search for Mr Beast, Google, September 2024

If you don't see anything, you have work to do. As I mentioned above, a large portion of users hear about your brand and then Google you. What they see when they arrive on your brand SERP is up to you.

The great news is you can educate Google about your brand entity. When you do that, you’ll not only improve your brand SERP, but you might see your brand popping up in other strategic places.

For instance, below, I’ve searched for Fandango, a company that sells movie tickets. If you look at the bottom of the Knowledge Panel on the right, you’ll see Fandango’s direct competitors including:

  • AMC Theatres
  • Regal Cinemas
  • Cinemark Theatres

Screenshot from search for Fandango, Google, September 2024

How did a site's competitors make it into the site's Knowledge Panel? Google doesn't only rank content for keywords anymore. It understands what the brand entity is and what it relates to. The result is that you might find your brand appearing on your competitors' brand SERPs or in other relevant places on the web.

How do you educate Google about your brand entity?

There is a clear method to educate Google about your brand entity.

  1. Establish an entity home page: Create a dedicated page that describes your entity. This page should clearly outline what your business does and who it serves. Although this can be any page on the web, the best place to do this is on your ‘About Us’ page.
  2. Build entity citations: Mentions of your brand across the web will reinforce the information provided on your entity home page. It’s important to keep your brand description consistent around the web so that Google can match each citation with your entity home page. Citations can appear on pages you control, such as social media profiles, but citations on pages you don’t control often carry more weight and provide the most benefit.
  3. Link from your entity home page to your entity citations: This could mean including links to your social media profiles as well as any guest posts, videos, or podcasts your brand is featured on.
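A common way to tie these steps together is Organization markup on the entity home page, with `sameAs` pointing at your citations. The snippet below is a hypothetical sketch: the brand name, URL, description, and profile links are all placeholders, not a prescribed set.

```python
import json

# Hypothetical Organization markup for an entity home page ("About Us").
# `url` points at the entity home; `sameAs` lists entity citations such
# as social profiles. All names and URLs here are placeholders.
entity = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Acme Widgets",
    "url": "https://example.com/about-us",
    "description": "Acme Widgets makes modular widgets for small manufacturers.",
    "sameAs": [
        "https://twitter.com/acmewidgets",
        "https://www.linkedin.com/company/acmewidgets",
        "https://www.youtube.com/@acmewidgets",
    ],
}
print(json.dumps(entity, indent=2))
```

Keeping the description here consistent with the wording used in citations around the web makes it easier for Google to reconcile each mention with the same entity.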

Far From The Final Word On Brand SEO

As an SEO, if you prioritize branded SEO, you are not just a technical specialist. You have access to data to shape the business’s digital identity, which can drive tangible and sometimes immediate results. This paradigm shift allows you to directly impact revenue streams, aligning SEO efforts more closely with overarching business objectives.

If branded SEO is a paradigm shift for you, consider this.

Branded SEO is only one ranking factor in Google’s complex maze of ranking systems. You can read more about it in our latest ebook: Google’s Ranking Anatomy: Dissecting 90+ Ranking Signals.


Image Credits

Featured Image: Image by Similarweb. Used with permission.

In-Post Image: Images by Similarweb. Used with permission.

Coming soon: Our 2024 list of 15 Climate Tech Companies to Watch

MIT Technology Review set out last year to recognize 15 companies from around the world that demonstrated they have a real shot at meaningfully driving down greenhouse-gas emissions and safeguarding society from the worst impacts of climate change.

We’re excited to announce that we took up the task again this year and will publish our 2024 list of 15 Climate Tech Companies to Watch on October 1. We’ll reveal it first on stage to attendees at our upcoming EmTech MIT event, and then share it online later that day.

The work these companies are doing is needed now more than ever. Global warming appears to be accelerating. The oceans are heating up faster than expected. And some scientists fear the planet is approaching tipping points that could trigger dramatic shifts in Earth’s ecosystems.

Nations must cut the greenhouse-gas pollution fueling that warming, and the heat waves, hurricanes, droughts, and fires it brings, as fast as possible. But we can’t simply halt emissions without plunging the global economy into a deep depression and the world into chaos. 

Any realistic plan to cut billions of tons of emissions over the next few decades requires us to develop and scale up cleaner ways of producing electricity, manufacturing goods, generating heat and cooling, and moving people and stuff around the world. 

To do that, we need competitive companies that can displace heavily polluting industries, or force them to clean up their acts. Those firms need to provide consumers with low-emissions options that, ideally, don’t feel like a sacrifice. And because climate change is underway, we also need technologies and services and infrastructure that can keep communities safe even as the world grows hotter and the weather becomes more erratic and extreme.

As we stated last year, we don’t claim to be oracles or soothsayers. The success of any one business depends on many hard-to-predict variables, including market conditions, political winds, investor sentiment, and consumer preferences. Taking aim at the business model and margins of conglomerates is especially fraught—and some of these firms may well fail.

But we did our best to select companies with solid track records that are tackling critical climate problems and have shown recent progress. 

This year’s list includes companies working to cut stubborn agricultural emissions, mine the metals needed for the energy transition in cleaner ways, and help communities tamp out wildfires before they become infernos. Others are figuring out new ways to produce fuels that can power our vehicles and industries, without adding more carbon dioxide to the atmosphere. 

A few companies from last year’s list also made the cut again because they’ve made notable strides toward their goals in the past 12 months.

We’re proud to publish the full list in the coming weeks. We hope you’ll take a look, ideally learn something new, and perhaps leave feeling encouraged that the world can make the changes needed to ease the risks of climate change and build a more sustainable future.

Why one developer won’t quit fighting to connect the US’s grids

Michael Skelly hasn’t learned to take no for an answer.

For much of the last 15 years, the Houston-based energy entrepreneur has worked to develop long-haul transmission lines to carry wind power across the Great Plains, Midwest, and Southwest, delivering clean electricity to cities like Albuquerque, Chicago, and Memphis. But so far, he has little to show for the effort. 

Skelly has long argued that building such lines and linking together the nation’s grids would accelerate the shift from coal- and natural-gas-fueled power plants to the renewables needed to cut the pollution driving climate change. But his previous business, Clean Line Energy Partners, shut down in 2019, after halting two of its projects and selling off interests in three more.

Skelly contends he was early, not wrong, about the need for such lines, and that the market and policymakers are increasingly coming around to his perspective. Indeed, the US Department of Energy just blessed his latest company’s proposed line with hundreds of millions in grants. 

The North Plains Connector would stretch about 420 miles from southeast Montana to the heart of North Dakota and create the first major connection between the US’s two largest grids, enabling system operators to draw on electricity generated by hydro, solar, wind, and other resources across much of the country. This could help keep regional power systems online during extreme weather events and boost the overall share of electricity generated by those clean sources. 

Skelly says he’s already secured the support of nine utilities around the region for the project, as well as more than 90% of the landowners along the route.

Michael Skelly founded Clean Line Energy Partners in 2009.
GRID UNITED

He says that more and more local energy companies have come to recognize that rising electricity demands, the growing threat storms and fires pose to power systems, and the increasing reliance on renewables have hastened the need for more transmission lines to stitch together and reinforce the country’s fraying, fractured grids.

“There’s a real understanding, really, across the country of the need to invest more in the grid,” says Skelly, now chief executive of Grid United, the Houston-based transmission development firm he founded in 2021. “We need more wires in the air.” 

Still, proposals to build long transmission lines frequently stir up controversy in the communities they would cross. It remains to be seen whether this growing understanding will be enough for Skelly’s project to succeed, or to get the US building anywhere near the number of transmission lines it now desperately needs.

Linking grids

Transmission lines are the unappreciated linchpin of the clean-energy transition, arguably as essential as solar panels in cutting emissions and as important as seawalls in keeping people safe.

These long, high, thick wires are often described as the highways of our power systems. They connect the big wind farms, hydroelectric plants, solar facilities, and other power plants to the edges of cities, where substations step down the voltage before delivering electricity into homes and businesses along distribution lines that are more akin to city streets. 

There are three major grid systems in the US: the Western Interconnection, the Eastern Interconnection, and the Texas Interconnected System. Regional grid operators such as the California Independent System Operator, the Midcontinent Independent System Operator, and the New York Independent System Operator oversee smaller local grids that are connected, to a greater or lesser extent, within those larger networks.

Transmission lines that could add significant capacity for sharing electricity back and forth across the nation’s major grid systems are especially valuable for cutting emissions and improving the stability of the power system. That’s because they allow those independent system operators to draw on a far larger pool of electricity sources. So if solar power is fading in one part of the country, they could still access wind or hydropower somewhere else. The ability to balance out fluctuations in renewables across regions and seasons, in turn, reduces the need to rely on the steady output of fossil-fuel plants. 

“There’s typically excess wind or hydro or other resources somewhere,” says James Hewett, manager of the US policy lobbying group at Breakthrough Energy, the Bill Gates–backed organization focusing on clean energy and climate issues. “But today, the limiting constraint is the ability to move resources from the place where they’re excessive to where they’re needed.” 

(Breakthrough Energy Ventures, the investment arm of the firm, doesn’t hold any investments in the North Plains Connector project or Grid United.)

It also means that even if regional wildfires, floods, hurricanes, or heat waves knock out power lines and plants in one area, operators may still be able to tap into adjacent systems to keep the lights on and air-conditioning running. That can be a matter of life and death in the event of such emergencies, as we’ve witnessed in the aftermath of heat waves and hurricanes in recent years.  

Studies have shown that weaving together the nation’s grids can boost the share of electricity that renewables reliably provide, significantly cut power-sector emissions, and lower system costs. A recent study by the Lawrence Berkeley National Lab found that the lines interconnecting the US’s major grids and the regions within them offer the greatest economic value among transmission projects, potentially providing more than $100 million in cost savings per year for every additional gigawatt of added capacity. (The study presupposes that the lines are operated efficiently and to their full capacity, among other simplifying assumptions.)

Experts say that grid interconnections can more than pay for themselves over time because, among other improved efficiencies, they allow grid operators to find cheaper sources of electricity at any given time and enable regions to get by with fewer power plants by relying on the redundancy provided by their neighbors.

But as it stands, the meager links between the Eastern Interconnection and Western Interconnection amount to “tiny little soda straws connecting two Olympic swimming pools,” says Rob Gramlich, president of Grid Strategies, a consultancy in Washington, DC. 

“A win-win-win”

Grid United’s North Plains Connector, in contrast, would be a fat pipe.

The $3.2 billion, three-gigawatt project would more than double the amount of electricity that could zip back and forth between those grid systems, and it would tightly interlink a trio of grid operators that oversee regional parts of those larger systems: the Western Electricity Coordinating Council, the Midcontinent Independent System Operator, and the Southwest Power Pool. If the line is developed, each could then more easily tap into the richest, cheapest sources at any given time across a huge expanse of the nation, be it hydropower generated in the Northwest, wind turbines cranking across the Midwest, or solar power produced anywhere.

The North Plains Connector transmission line would stretch from southeast Montana to the heart of North Dakota, connecting the nation’s two biggest grids.
COURTESY: ALLETE

This would ensure that utilities could get greater economic value out of those energy plants, which are expensive to build but relatively cheap to operate, and it would improve the reliability of the system during extreme weather, Skelly says.

“If you’ve got a heat dome in the Northwest, you can send power west,” he says. “If you have a winter storm in the Midwest, you can send power to the east.”

Grid United is developing the project as a joint venture with Allete, an energy company in Duluth, Minnesota, that operates several utilities in the region. 

The Department of Energy granted $700 million to a larger regional effort, known as the North Plains Connector Interregional Innovation project, which encompasses two smaller proposals in addition to Grid United’s. The grants will be issued through a more than $10 billion program established under the Bipartisan Infrastructure Law, enacted by President Joe Biden in 2021. 

That funding will likely be distributed to regional utilities and other parties as partial matching grants, designed to incentivize investments in the project among those likely to benefit from it. That design may also help address a chicken-and-egg problem that plagues independent transmission developers like Grid United, Breakthrough’s Hewett says. 

Regional utilities can pass along the costs of projects to their electricity customers. Companies like Grid United, however, generally can’t sign up the power producers that will pay to use their lines until they’ve got project approval, but they also often can’t secure traditional financing until they’ve lined up customers.

The DOE funding could ease that issue by providing an assurance of capital that would help get the project through the lengthy permitting process, Hewett says. 

“The states are benefiting, local utilities are benefiting, and the developer will benefit,” he says. “It’s a win-win-win.”

Transmission hurdles

Over the years, developers have floated various proposals to more tightly interlink the nation’s major grid systems. But it’s proved notoriously difficult to build any new transmission lines in the US—a problem that has only worsened in recent years. 

The nation is developing only 20% of the transmission capacity per year in the 2020s that it did in the early 2010s. On average, interstate transmission lines take eight to 10 years to develop “if they succeed at all,” according to a report from the Niskanen Center.

The biggest challenge in adding connections between grids, says Gramlich of Grid Strategies, is that there are no clear processes for authorizing lines that cross multiple jurisdictions and no dedicated regional or federal agencies overseeing such proposals. The fact that numerous areas may benefit from such lines also sparks interregional squabbling over how the costs should be allocated. 

In addition, communities often balk at the sight of wires and towers, particularly if the benefits of the lines mostly accrue around the end points, not necessarily in all the areas the wires cross. Any city, county, or state, or even one landowner, can hold up a project for years, if not kill it.

But energy companies themselves share much of the blame as well. Regional energy agencies, grid operators, and utilities have actively fought proposals from independent developers to erect wires passing through their territories. They often simply don’t want to forfeit control of their systems, invite added competition, or deal with the regulatory complexity of such projects. 

The long delays in building new grid capacity have become a growing impediment to building new energy projects.

As of last year, there were 2,600 gigawatts’ worth of proposed energy generation or storage projects waiting in the wings for transmission capacity that would carry their electricity to customers, according to a recent analysis by Lawrence Berkeley National Lab. That’s roughly the electricity output of 2,600 nuclear reactors, or more than double the nation’s entire power system. 

The capacity of projects in the queue has risen almost eightfold from a decade ago, and about 95% of them are solar, wind, or battery proposals.

“Grid interconnection remains a persistent bottleneck,” Joseph Rand, an energy policy researcher at the lab and the lead author of the study, said in a statement.

The legacy of Clean Line Energy

Skelly spent the aughts as the chief development officer of Horizon Wind Energy, a large US wind developer that the Portuguese energy giant EDP snapped up in 2007 for more than $2 billion. Skelly then made a spirited though ill-fated run for Congress in 2008, as the Democratic nominee for the 7th Congressional District of Texas. He ran on a pro-renewables, pro-education campaign but lost by a sizable margin in a district that was solidly Republican.

The following year, he founded Clean Line Energy Partners. The company raised tens of millions of dollars and spent a decade striving to develop five long-range transmission projects that could connect the sorts of wind projects Skelly had worked to build before.

The company did successfully earn some of the permits required for several lines. But it was forced to shut down or offload its projects amid pushback from landowner groups and politicians opposed to renewables, as well as from regional utilities and public utility commissions. 

“He was going to play in other people’s sandboxes and they weren’t exactly keen on having him in there,” says Russell Gold, author of Superpower: One Man’s Quest to Transform American Energy, which recounted Skelly’s and Clean Line Energy’s efforts and failures.

Ultimately, those obstacles dragged out the projects beyond the patience of the company’s investors, who declined to continue throwing more money at them, he says. 

The company was forced to halt the Centennial West line through New Mexico and the Rock Island project across the Midwest. In addition, it sold off its stake in the Grain Belt Express, which would stretch from Kansas to Indiana, to Invenergy; the Oklahoma portion of the Plains and Eastern line to NextEra Energy; and the Western Spirit line through New Mexico, along with an associated wind farm project, to Pattern Development. 

Clean Line Energy itself wound down in 2019.

The Western Spirit transmission line was electrified in late 2021, but the other two projects are still slogging through planning and permitting.

“These things take a long time,” Skelly says. 

For all the challenges the company faced, Gold still credits it with raising awareness about the importance and necessity of long-distance interregional transmission. He says it helped spark conversations that led the Federal Energy Regulatory Commission to eventually enact rules to support regional transmission planning and encouraged other big players to focus more on building transmission lines.

“I do believe that there is a broader social, political, and commercial awareness now that the United States needs to interconnect its grids,” Gold says. 

Lessons learned

Skelly spent a few years as a senior advisor at Lazard, consulting with companies on renewable energy. But he was soon ready to take another shot at developing long-haul transmission lines and started Grid United in 2021.

The new company has proposed four transmission projects in addition to the North Plains Connector—one between Arizona and New Mexico, one between Colorado and Oklahoma, and one each within Texas and Wyoming.

Asked what he thinks the legacy of Clean Line Energy is, Skelly says it’s mixed. But he soon adds that the history of US infrastructure building is replete with projects that didn’t move ahead. The important thing, he says, is to draw the right lessons from those failures.

“When we’re smart about it, we look at the past to see what we can learn,” he says. “We certainly do that today in our business.”

Skelly says one of the biggest takeaways was that it’s important to do the expensive upfront work of meeting with landowners well in advance of applying for permitting, and to use their feedback to guide the route of the line. 

Anne Hedges, director of policy and legislative affairs at the Montana Environmental Information Center, confirms that this is the approach Grid United has taken in the region so far.

“A lot of developers seem to be more focused on drawing a straight line on a map rather than working with communities to figure out the best placement for the transmission system,” she says. “Grid United didn’t do that. They got out on the ground and talked to people and planned a route that wasn’t linear.”

The other change that may make Grid United’s project there more likely to move forward has more to do with what the industry’s learned than what Skelly has.  

Gramlich says regional grid operators and utilities have become more receptive to collaborating with developers on transmission lines—and for self-interested reasons. They’ll need greater capacity, and soon, to stay online and meet the growing energy demands of data centers, manufacturing facilities, electric vehicles, and buildings, and address the risks to power systems from extreme weather events.

Industry observers are also hopeful that an energy permitting reform bill pending in Congress, along with the added federal funding and new rules requiring transmission providers to do more advance planning, will also help accelerate development. The bipartisan bill promises to shorten the approval process for projects that are determined to be in the national interest. It would also require neighboring areas to work together on interregional transmission planning.

Hundreds of environmental groups have sharply criticized the proposal, which would also streamline approvals for certain oil and gas operations.

“This legislation guts bedrock environmental protections, endangers public health, opens up tens of millions of acres of public lands and hundreds of millions of acres of offshore waters to further oil and gas leasing, gives public lands to mining companies, and would defacto rubberstamp gas export projects that harm frontline communities and perpetuate the climate crisis,” argued a letter signed by 350.org, Earthjustice, the Center for Biological Diversity, the Union of Concerned Scientists, and hundreds of other groups.

But a recent analysis by Third Way, a center-left think tank in Washington, DC, found that the emissions benefits from accelerating transmission permitting could significantly outweigh the added climate pollution from the fossil-fuel provisions in the bill. It projects that the bill would, on balance, reduce global emissions by 400 million to 16.6 billion tons of carbon dioxide through 2050. 

“Guardedly optimistic” 

Grid United expects to begin applying for county and state permits in the next few months and for federal permits toward the end of the year. It hopes to begin construction within the next four years and switch the line on in 2032.

Since the applications haven’t yet been filed, it’s not clear which individuals or groups will oppose the project, though, given the history of such efforts, some surely will.

Hedges says the Montana Environmental Information Center is reserving judgment until it sees the actual application. She says the organization will be particularly focused on any potential impact on water and wildlife across the region, “making sure that they’re not harming what are already struggling resources in this area.”

So if Skelly was too early with his last company, the obvious question is: Are the market, regulatory, and societal conditions now ripe for interregional transmission lines?

“We’re gonna find out if they are, right?” he says. “We don’t know yet.”

Skelly adds that he doesn’t think the US is going to build as much transmission as it needs to. But he does believe we’ll start to see more projects moving forward—including, he hopes, the North Plains Connector.

“You just can’t count on anything, and you’ve just got to keep going and push, push, push,” he says. “But we’re making good progress. There’s a lot of utility interest. We have a big grant from the DOE, which will help bring down the cost of the project. So knock on wood, we’re guardedly optimistic.”

The Download: how to connect the US’s grids, and OpenAI’s new voice mode

This is today’s edition of The Download, our weekday newsletter that provides a daily dose of what’s going on in the world of technology.

Why one developer won’t quit fighting to connect the US’s grids

Michael Skelly hasn’t learned to take no for an answer. For much of the last 15 years, the energy entrepreneur has worked to develop long-haul transmission lines to carry wind power across the Great Plains, Midwest, and Southwest. But so far, he has little to show for the effort.

Skelly has long argued that building such lines and linking together the nation’s grids would accelerate the shift from coal- and natural-gas-fueled power plants to the renewables needed to cut the pollution driving climate change. But his previous business shut down in 2019, after halting two of its projects and selling off interests in three more.

Skelly contends he was early, not wrong, and that the market and policymakers are increasingly coming around to his perspective. After all, the US Department of Energy just blessed his latest company’s proposed line with hundreds of millions in grants. Read the full story.

—James Temple

OpenAI released its advanced voice mode to more people. Here’s how to get it.

OpenAI is broadening access to Advanced Voice Mode, a feature of ChatGPT that allows you to speak more naturally with the AI model. It allows you to interrupt its responses midsentence, and it can sense and interpret your emotions from your tone of voice and adjust its responses accordingly. 

Users who’ve been able to try it have largely described the model as an impressively fast, dynamic, and realistic voice assistant—which has made its limited availability particularly frustrating to some other OpenAI users. This is the first time the company has promised to bring the new voice mode to a wide range of users. Here’s what you need to know.

—James O’Donnell

An AI script editor could help decide what films get made in Hollywood

Every day across Hollywood, scores of film school graduates and production assistants work as script readers. Their job is to find the diamonds in the rough from the 50,000 or so screenplays pitched each year and flag any worth pursuing further. 

Now the film-focused tech company Cinelytic, which works with major studios like Warner Bros. and Sony Pictures to analyze film budgets and box office potential, aims to offer script feedback with generative AI. 

It takes its new tool Callaia less than a minute to compile a synopsis, a list of comparable films, grades for areas like dialogue and originality, and actor recommendations. Cool idea, but is it any good? Read the full story.

—James O’Donnell

The must-reads

I’ve combed the internet to find you today’s most fun/important/scary/fascinating stories about technology.

1 The star witness in the FTX case has been sentenced to two years in prison
 Caroline Ellison got off lightly in exchange for her extensive cooperation. (CNBC)
+ In contrast, Sam Bankman-Fried was sentenced to 25 years earlier this year. (FT $)
+ Her cooperation has been credited with helping to recover customer assets. (The Verge)

2 A Chinese-funded US VC fund is under scrutiny from the FBI
There are fears it may have passed trade secrets to Beijing. (FT $)
+ Hone Capital has invested in heavy-hitters including Stripe. (TechCrunch)

3 CrowdStrike’s CEO apologized to US Congress over the catastrophic outage
The crash highlighted the dangers of relying on single vendors. (WP $)
+ It’s facing legal action from its disgruntled shareholders. (Bloomberg $)
+ The system failure affected millions of PCs across the world. (MIT Technology Review)

4 A bold plan to refreeze the Arctic may just work
Trials pumping seawater over existing ice appear to have proved successful. (New Scientist $)
+ Europe is running rings around the US in terms of heat pump adoption. (The Atlantic $)

5 Huge data centers are springing up across Latin America
And local communities are paying the price. (The Guardian)
+ Energy-hungry data centers are quietly moving into cities. (MIT Technology Review)

6 Why Mark Zuckerberg washed his hands of politics
He regrets some of the political posturing he dabbled in during his 20s. (NYT $)
+ Meta isn’t giving up on giving its chatbots famous voices. (Insider $)

7 Be wary of Google Images of risky mushroom species 🍄
They could be AI-generated and look nothing like the real thing. (404 Media)
+ Director and AI-embracer James Cameron has joined Stability AI’s board. (The Verge)

8 You probably don’t need an iPhone 16
How much better can a camera get, really? (New Yorker $) 

9 Resist the temptation to vent about work online
Anything you share on company devices could come back to bite you. (WSJ $)

10 How to save the Earth from a colossal asteroid ☄
Blast it into oblivion using a massive X-ray beam, obviously. (Vice)
+ Earth is probably safe from a killer asteroid for 1,000 years. (MIT Technology Review)

Quote of the day

“Not a day goes by that I don’t think about all of the people I hurt. I participated in a criminal conspiracy that ultimately stole billions of dollars from people who entrusted their money with us.”

—Caroline Ellison, a former executive at FTX, apologizes to a New York federal court during her sentencing, Bloomberg reports.

The big story

How tracking animal movement may save the planet

February 2024

Animals have long been able to offer unique insights about the natural world around us, acting as organic sensors picking up phenomena invisible to humans. Canaries warned of looming catastrophe in coal mines until the 1980s, for example.

These days, we have more insight into animal behavior than ever before thanks to technologies like sensor tags. But the data we gather from these animals still adds up to only a relatively narrow slice of the whole picture.

This is beginning to change. Researchers are asking: What will we find if we follow even the smallest animals? What could we learn from a system of animal movement, continuously monitoring how creatures big and small adapt to the world around us? It may be, some researchers believe, a vital tool in the effort to save our increasingly crisis-plagued planet. Read the full story.

—Matthew Ponsford

We can still have nice things

A place for comfort, fun and distraction to brighten up your day. (Got any ideas? Drop me a line or tweet ’em at me.)

+ This night parrot looks like it’s had some seriously late nights. 🦜
+ The Nazgûl morning routine sounds like a great way to start the day.
+ Time for a hypnotic pencil-sharpening video.
+ Spooky season’s starting early this year: we’ve just discovered a new species of ghost shark!

A tiny new open-source AI model performs as well as powerful big ones

The Allen Institute for Artificial Intelligence (Ai2), a research nonprofit, is releasing a family of open-source multimodal language models, called Molmo, that it says perform as well as top proprietary models from OpenAI, Google, and Anthropic. 

The organization claims that its biggest Molmo model, which has 72 billion parameters, outperforms OpenAI’s GPT-4o, which is estimated to have over a trillion parameters, in tests that measure things like understanding images, charts, and documents.  

Meanwhile, Ai2 says a smaller Molmo model, with 7 billion parameters, comes close to OpenAI’s state-of-the-art model in performance, an achievement it ascribes to vastly more efficient data collection and training methods. 

What Molmo shows is that open-source AI development is now on par with closed, proprietary models, says Ali Farhadi, the CEO of Ai2. And open-source models have a significant advantage, as their open nature means other people can build applications on top of them. The Molmo demo is available here, and developers will be able to tinker with the models on the Hugging Face website. (Certain elements of the most powerful Molmo model are still shielded from view.) 

Other large multimodal language models are trained on vast data sets containing billions of images and text samples that have been hoovered from the internet, and they can include several trillion parameters. This process introduces a lot of noise to the training data and, with it, hallucinations, says Ani Kembhavi, a senior director of research at Ai2. In contrast, Ai2’s Molmo models have been trained on a significantly smaller and more curated data set containing only 600,000 images, and they have between 1 billion and 72 billion parameters. This focus on high-quality data, versus indiscriminately scraped data, has led to good performance with far fewer resources, Kembhavi says.

Ai2 achieved this by getting human annotators to describe the images in the model’s training data set in excruciating detail over multiple pages of text. They asked the annotators to talk about what they saw instead of typing it. Then they used AI techniques to convert their speech into data, which made the training process much quicker while reducing the computing power required. 

These techniques could prove really useful if we want to meaningfully govern the data that we use for AI development, says Yacine Jernite, who is the machine learning and society lead at Hugging Face, and was not involved in the research. 

“It makes sense that in general, training on higher-quality data can lower the compute costs,” says Percy Liang, the director of the Stanford Center for Research on Foundation Models, who also did not participate in the research. 

Another impressive capability is that the model can “point” at things, meaning it can analyze elements of an image by identifying the pixels that answer queries.

In a demo shared with MIT Technology Review, Ai2 researchers took a photo outside their office of the local Seattle marina and asked the model to identify various elements of the image, such as deck chairs. The model successfully described what the image contained, counted the deck chairs, and accurately pointed to other things in the image as the researchers asked. It was not perfect, however. It could not locate a specific parking lot, for example. 

Other advanced AI models are good at describing scenes and images, says Farhadi. But that’s not enough when you want to build more sophisticated web agents that can interact with the world and can, for example, book a flight. Pointing allows people to interact with user interfaces, he says. 

Jernite says Ai2 is operating with a greater degree of openness than we’ve seen from other AI companies. And while Molmo is a good start, he says, its real significance will lie in the applications developers build on top of it, and the ways people improve it.

Farhadi agrees. AI companies have drawn massive, multitrillion-dollar investments over the past few years. But in the past few months, investors have expressed skepticism about whether that investment will bring returns. Big, expensive proprietary models won’t do that, he argues, but open-source ones can. He says the work shows that open-source AI can also be built in a way that makes efficient use of money and time. 

“We’re excited about enabling others and seeing what others would build with this,” Farhadi says. 

Two Nobel Prize winners want to cancel their own CRISPR patents in Europe

In the decade-long fight to control CRISPR, the super-tool for modifying DNA, it’s been common for lawyers to try to overturn patents held by competitors by pointing out errors or inconsistencies.

But now, in a surprise twist, the team that earned the Nobel Prize in chemistry for developing CRISPR is asking to cancel two of their own seminal patents, MIT Technology Review has learned. The decision could affect who gets to collect the lucrative licensing fees on using the technology.

The request to withdraw the pair of European patents, by lawyers for Nobelists Emmanuelle Charpentier and Jennifer Doudna, comes after a damaging August opinion from a European technical appeals board, which ruled that the duo’s earliest patent filing didn’t explain CRISPR well enough for other scientists to use it and doesn’t count as a proper invention.

The Nobel laureates’ lawyers say the decision is so wrong and unfair that they have no choice but to preemptively cancel their patents, a scorched-earth tactic whose aim is to prevent the unfavorable legal finding from being recorded as the reason. 

“They are trying to avoid the decision by running away from it,” says Christoph Then, founder of Testbiotech, a German nonprofit that is among those opposing the patents, who provided a copy of the technical opinion and response letter to MIT Technology Review. “We think these are some of the earliest patents and the basis of their licenses.”

Discovery of the century

CRISPR has been called the biggest biotech discovery of the century, and the battle to control its commercial applications—such as gene-altered plants, modified mice, and new medical treatments—has raged for a decade.

The dispute primarily pits Charpentier and Doudna, who were honored with the Nobel Prize in 2020 for developing the method of genome editing, against Feng Zhang, a researcher at the Broad Institute of MIT and Harvard, who claimed to have invented the tool first on his own.

Back in 2014, the Broad Institute carried out a coup de main when it managed to win, and later defend, the controlling US patent on CRISPR’s main uses. But the Nobel pair could, and often did, point to their European patents as bright points in their fight. In 2017, the University of California, Berkeley, where Doudna works, touted its first European patent as exciting, “broad,” and “precedent” setting.

After all, a region representing more than 30 countries had not only recognized the pair’s pioneering discovery; it had set a standard for other patent offices around the world. It also made the US Patent Office look like an outlier whose decisions favoring the Broad Institute might not hold up long term. A further appeal challenging the US decisions is pending in federal court.

Long-running saga

But now the European Patent Office is also saying—for different reasons—that Doudna and Charpentier can’t claim their basic invention. And that’s a finding their attorneys think is so damaging, and reached in such an unjust way, that they have no choice but to sacrifice their own patents. “The Patentees cannot be expected to expose the Nobel-prize winning invention … to the repercussions of a decision handed down under such circumstances,” says the 76-page letter sent by German attorneys on their behalf on September 20.

The chief intellectual-property attorney at the University of California, Randi Jenkins, confirmed the plan to revoke the two patents but downplayed their importance. 

“These two European patents are just another chapter in this long-running saga involving CRISPR-Cas9,” Jenkins said. “We will continue pursuing claims in Europe, and we expect those ongoing claims to have meaningful breadth and depth of coverage.”

The patents being voluntarily disavowed are EP2800811, granted in 2017, and EP3401400, granted in 2019. Jenkins added the Nobelists still share one issued CRISPR patent in Europe, EP3597749, and one that is pending. That tally doesn’t include a thicket of patent claims covering more recent research from Doudna’s Berkeley lab that were filed separately.

Freedom to operate

The cancellation of the European patents will affect a broad network of biotech companies that have bought and sold rights as they seek to achieve either commercial exclusivity to new medical treatments or what’s called “freedom to operate”—the right to pursue gene-slicing research unmolested by doubts over who really owns the technique. 

These companies include Editas Medicine, allied with the Broad Institute; Caribou Biosciences and Intellia Therapeutics in the US, both cofounded by Doudna; and Charpentier’s companies, CRISPR Therapeutics and ERS Genomics.

ERS Genomics, which is based in Dublin and calls itself “the CRISPR licensing company,” was set up in Europe specifically to collect fees from others using CRISPR. It claims to have sold nonexclusive access to its “foundational patents” to more than 150 companies, universities, and organizations that use CRISPR in their labs, manufacturing, or research products.

For example, earlier this year Laura Koivusalo, founder of a small Finnish biotech company, StemSight, agreed to a “standard fee” because her company is researching an eye treatment using stem cells that were previously edited using CRISPR.

Although not every biotech company thinks it’s necessary to pay for patent rights long before it even has a product to sell, Koivusalo decided it would be the right thing to do. “The reason we got the license was the Nordic mentality of being super honest. We asked them if we needed a license to do research, and they said yes, we did,” she says.

A slide deck from ERS available online lists the fee for small startups like hers at $15,000 a year. Koivusalo says she agreed to buy a license to the same two patents that are now being canceled. She adds: “I was not aware they were revoked. I would have expected them to give a heads-up.” 

A spokesperson for ERS Genomics said its customers still have coverage in Europe based on the Nobelists’ remaining CRISPR patent and pending application.

In the US, the Broad Institute has also been selling licenses to use CRISPR. And the fees can get big if there’s an actual product involved. That was the case last year, when Vertex Pharmaceuticals won approval to sell the first CRISPR-based treatment, for sickle-cell disease. To acquire rights under the Broad Institute’s CRISPR patents, Vertex agreed to pay $50 million on the barrelhead—and millions more in the future.

PAM problem

There’s no doubt that Charpentier and Doudna were first to publish, in a 2012 paper, how CRISPR can function as a “programmable” means of editing DNA. And their patents in Europe withstood an initial round of formal oppositions filed by lawyers.

But this August, in a separate analysis, a technical body decided that Berkeley had omitted a key detail from its earliest patent application, making it so that “the skilled person could not carry out the claimed method,” according to the finding. That is, it said, the invention wasn’t fully described or enabled.

The omission relates to a feature of DNA molecules called “protospacer adjacent motifs,” or PAMs. These features, a bit like runway landing lights, determine at what general locations in a genome the CRISPR gene scissors are able to land and make cuts, and where they can’t.

In the 76-page reply letter sent by lawyers for the Nobelists, they argue there wasn’t really any need to mention these sites, which they say were so obvious that “even undergraduate students” would have known they were needed. 

The lengthy letter leaves no doubt the Nobel team feels they’ve been wronged. In addition to disavowing the patents, the text runs on because it seeks to “make of public record the reasons for which we strongly disagree with [the] assessment on all points” and to “clearly show the incorrectness” of the decision, which, they say, “fails to recognize the nature and origin of the invention, misinterprets the common general knowledge, and additionally applies incorrect legal standards.”