Amid Downturn, Ecommerce Investor Perseveres

The post-Covid ecommerce hangover has hit Roman Khan. He launched his first direct-to-consumer brand in 2013, acquired others, and in 2021 founded Peak 21, an aggregator with equity investors. The outlook was good.

Fast forward to 2024, and many ecommerce companies are struggling. Mergers and acquisitions have cratered. Yet Khan perseveres. His team reviews dozens of purchase candidates every month, albeit cautiously.

In our recent conversation, Khan shared his investment criteria, current market conditions, and predictions for a recovery. The entire audio is embedded below. The transcript is edited for clarity and length.

Eric Bandholz: Give us a rundown of what you do.

Roman Khan: I’m the founder and president of an ecommerce holding company called Peak 21. We buy, grow, and sell direct-to-consumer brands. My DTC experience began in 2013 when my wife, Jennifer, and I started Linjer. We sold leather bags but now it’s mostly jewelry. We launched it on Indiegogo.

By 2016, we were doing a couple of million in annual revenue — big enough for Jennifer and me to quit our jobs to work on it full-time. In 2017, Linjer produced $1 million in EBITDA — earnings before interest, taxes, depreciation, and amortization. By then we had raised quite a bit of money on Kickstarter and Indiegogo and built up street cred. Folks were reaching out, asking us how we did it. We decided to diversify. We needed more brands, and Meta ads were working well.

I took that $1 million of cash, our street cred, and combined sweat equity with cash to invest in three other DTC companies. Each was doing less than $1 million in revenue annually. By 2019, we were doing $50 million in sales as a group.

When Covid hit in 2020, revenue ballooned to $100 million annually. In 2021, investors were knocking on our door, particularly Jeffrey Yan, whose family owned Forbes Media up until this year. He came to my office and said I needed to take on external capital to buy more prominent companies.

We set up a special purpose acquisition company — a blank check company — called Peak 21. Jeffrey Yan and others invested eight figures in equity. We’re now using that SPAC to buy companies. We seek brands doing $5 to $50 million in annual sales.

Bandholz: What’s an ideal acquisition candidate?

Khan: The pool is shrinking. I’ve spoken with many owners. My acquisitions team talks to 100-plus businesses every month. Only about 10% have product-market fit that can grow on low budgets. Our main criterion now is size. We look at the fundamentals. What’s the customer acquisition cost? And the repeat buyer rate? The best scenario is 70% of first-time buyers repeating in the first quarter. At that rate, we know the investment will likely work out.

Two, we look at customers’ buying habits. For instance, we own a company called Nutrition Kitchen. It’s a daily meal delivery service. Daily rather than weekly or monthly habits play a significant role.

Beyond consumables, we look at contribution margins on three levels.

First, we calculate revenue (net of taxes and coupon-driven sales) plus shipping fees collected at checkout. That gives us “profit contribution one” — PC1.

Then, we deduct roughly 10 variable costs, such as warehouse storage, pick-and-pack, shipping fees, returns, and exchanges. That results in profit contribution two — PC2.

Lastly, we deduct marketing to determine PC3.

From PC3 we subtract operating expenses to arrive at EBITDA.

A key acquisition metric is a 50% or higher PC2 while maintaining a competitive suggested retail price.
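As a back-of-the-envelope sketch, the cascade Khan describes can be expressed as simple arithmetic. Every figure below is hypothetical, and the exact treatment of each cost line will vary by company:

```python
# Hypothetical monthly figures for a DTC brand (all USD) -- illustrative only.
net_revenue = 1_000_000        # revenue net of taxes and coupon discounts
shipping_collected = 50_000    # shipping fees collected at checkout
variable_costs = 400_000       # warehouse storage, pick-and-pack, shipping, returns, exchanges
marketing = 250_000            # ad spend and other marketing costs
operating_expenses = 200_000   # salaries, software, rent, etc.

pc1 = net_revenue + shipping_collected   # profit contribution one
pc2 = pc1 - variable_costs               # profit contribution two
pc3 = pc2 - marketing                    # profit contribution three
ebitda = pc3 - operating_expenses

pc2_margin = pc2 / net_revenue
meets_threshold = pc2_margin >= 0.50     # the 50%-plus PC2 screen

print(pc1, pc2, pc3, ebitda)             # 1050000 650000 400000 200000
print(round(pc2_margin, 2), meets_threshold)  # 0.65 True
```

With these made-up numbers the brand would clear the 50% PC2 bar comfortably.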

Bandholz: A hundred candidates a month is a lot to review.

Khan: Many ecommerce companies are struggling now. Revenue and EBITDA are down. Out of our six main brands, two are struggling massively. Overall we’re okay. We’re growing with a diversified portfolio. But those two are a nightmare. We have lent over $1 million to each one in the last 24 months. So it’s been hard. Many founders are holding out until 2025 or 2026 to sell.

We buy companies in four ways. One is cash. Two is seller financing. Three is using debt, where we borrow the money against the acquired company’s value. That avenue, I should add, is very challenging now. The fourth method is an equity swap wherein we acquire a company with Peak 21 stock. Cash is scarce right now. Our willingness to pay a lot of cash upfront is low to non-existent. We’re often the only real buyers when talking to a company.

For the market to improve, two things need to happen. First, investors must get over the losses from aggregators, such as Perch, Thrasio, and others. Second, interest rates have to come down. Once that happens, liquidity will loosen up, and hopefully, the market will return, likely by Q1 2026 in my estimation.

Bandholz: How can listeners contact you?

Khan: They can reach out through our site, or message me on X or on LinkedIn.

Google Limits News Links In California Over Proposed ‘Link Tax’ Law via @sejournal, @MattGSouthern

Google announced that it plans to reduce access to California news websites for a portion of users in the state.

The decision comes as Google prepares for the potential passage of the California Journalism Preservation Act (CJPA), a bill requiring online platforms like Google to pay news publishers for linking to their content.

What Is The California Journalism Preservation Act?

The CJPA, introduced in the California State Legislature, aims to support local journalism by creating what Google refers to as a “link tax.”

If passed, the Act would force companies like Google to pay media outlets when sending readers to news articles.

However, Google believes this approach needs to be revised and could harm rather than help the news industry.

Jaffer Zaidi, Google’s VP of Global News Partnerships, stated in a blog post:

“It would favor media conglomerates and hedge funds—who’ve been lobbying for this bill—and could use funds from CJPA to continue to buy up local California newspapers, strip them of journalists, and create more ghost papers that operate with a skeleton crew to produce only low-cost, and often low-quality, content.”

Google’s Response

To assess the potential impact of the CJPA on its services, Google is running a test with a percentage of California users.

During this test, Google will remove links to California news websites that the proposed legislation could cover.

Zaidi states:

“To prepare for possible CJPA implications, we are beginning a short-term test for a small percentage of California users. The testing process involves removing links to California news websites, potentially covered by CJPA, to measure the impact of the legislation on our product experience.”

Google Claims Only 2% of Search Queries Are News-Related

Zaidi highlighted people’s changing news consumption habits and their effect on Google search queries:

“It’s well known that people are getting news from sources like short-form videos, topical newsletters, social media, and curated podcasts, and many are avoiding the news entirely. In line with those trends, just 2% of queries on Google Search are news-related.”

Despite the low percentage of news queries, Google wants to continue helping news publishers gain visibility on its platforms.

However, the “CJPA as currently constructed would end these investments,” Zaidi says.

A Call For A Different Approach

In its current form, Google maintains that the CJPA undermines news in California and could leave all parties worse off.

The company urges lawmakers to consider alternative approaches supporting the news industry without harming smaller local outlets.

Google argues that, over the past two decades, it’s done plenty to help news publishers innovate:

“We’ve rolled out Google News Showcase, which operates in 26 countries, including the U.S., and has more than 2,500 participating publications. Through the Google News Initiative we’ve partnered with more than 7,000 news publishers around the world, including 200 news organizations and 6,000 journalists in California alone.”

Zaidi suggested that a healthy news industry in California requires support from the state government and a broad base of private companies.

As the legislative process continues, Google is willing to cooperate with California publishers and lawmakers to explore alternative paths that would allow it to continue linking to news.

Featured Image: Ismael Juan/Shutterstock

Query Deserves Ads Is Where Google Is Headed via @sejournal, @martinibuster

Google’s CEO Sundar Pichai recently discussed the future of search, affirming the importance of websites (good news for SEO). But how can that be if AI is supposed to make search engines obsolete (along with SEO)?

Search vs Chatbots vs Generative Search

There’s a lot of discussion about AI search but what’s consistently missing is a delineation of what is meant by that phrase.

There are three ways to think about what is being discussed:

  • Search Engines
  • Chatbots like Gemini or ChatGPT
  • Generative Search (chatbots stacked on top of a traditional search engine like Bing)

Traditional Search Is A Misnomer

The word misnomer means an inaccurate name, description or label that’s given to something. We still talk about traditional search, perhaps out of habit. The reality that must be acknowledged is that traditional search no longer exists. It’s a misnomer to refer to Google as traditional search.

Sundar Pichai made the point that Google has been using AI for years and we know this is true because of systems like RankBrain, SpamBrain, Helpful Content System (aka HCU) and the Reviews System. AI is involved at virtually every step of Google search from the backend to the frontend in the search results.

Google’s 2021 documentation about SpamBrain noted how AI is used at the crawling and indexing level:

“First, we have systems that can detect spam when we crawl pages or other content. …Some content detected as spam isn’t added to the index.

These systems also work for content we discover through sitemaps and Search Console. …We observed spammers hacking into vulnerable sites, pretending to be the owners of these sites, verifying themselves in the Search Console and using the tool to ask Google to crawl and index the many spammy pages they created. Using AI, we were able to pinpoint suspicious verifications and prevented spam URLs from getting into our index this way.”

AI is involved in the indexing process and all the way through to the ranking process and lastly in the search results themselves.

The most recent March 2024 update is described by Google as complex and is still not over in April 2024. I suspect that Google has transitioned to a more AI-friendly infrastructure in order to accommodate doing things like integrating the AI signals formerly associated with the HCU and the Reviews System straight into the core algorithm.

People are freaking out because the AI search of the future will summarize answers. Well, Google already does that in featured snippets and knowledge graph search results.

Let’s be real: traditional search no longer exists; it’s a misnomer. Google is more accurately described as an AI search engine, and this is important to acknowledge because, as you’ll shortly see, it directly relates to what Sundar Pichai means when he talks about what search will look like in ten years.

Blended Hybrid Search AKA Generative Search

What people currently call AI Search is also a misnomer. The more accurate label is Generative Search. Generative search engines like Bing’s are chatbots stacked on top of a search index, with something in the middle coordinating between the two, generally referred to as Retrieval-Augmented Generation (RAG), a technology created in 2020 by Facebook AI researchers.
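The coordination RAG performs is simple to sketch. The toy example below uses naive keyword-overlap retrieval and never calls a model; the document text, function names, and scoring are all invented for illustration (real systems use dense vector retrieval):

```python
# Toy RAG pattern: retrieve relevant documents from an index, then assemble
# them into a prompt for a generative model. Scoring is naive word overlap.
index = {
    "doc1": "Google uses AI systems like RankBrain in search ranking.",
    "doc2": "Retrieval-augmented generation combines search with chatbots.",
    "doc3": "Featured snippets summarize answers in search results.",
}

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k documents sharing the most words with the query."""
    q_terms = set(query.lower().split())
    return sorted(
        index.values(),
        key=lambda doc: len(q_terms & set(doc.lower().split())),
        reverse=True,
    )[:k]

def build_prompt(query: str) -> str:
    """Stuff the retrieved context into the generation prompt."""
    context = "\n".join(retrieve(query))
    return f"Answer using this context:\n{context}\n\nQuestion: {query}"

prompt = build_prompt("how does retrieval augmented generation work")
print(prompt)
```

The point is the shape of the pipeline, not the retrieval quality: search supplies grounding text, and the chatbot generates from it.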


Chatbots are a lot of things including ChatGPT and Gemini. No need to belabor this point, right?

Search Vs Generative Search Vs Chatbots: Who Wins?

Generative search is an awkward mix of a chatbot and a search engine with a somewhat busy interface. It’s awkward because it wants to do your homework and tell you the phone number of the local restaurant but it’s mediocre at both. But even if generative search improves does anyone really want a search engine that can also write an essay? It’s almost a given that those awkwardly joined capabilities are going to drop off and it’ll eventually come to resemble what Google already is.

Chatbots and Search Engines

That leaves us with a near-future of chatbots and search engines. Sam Altman said that an AI chatbot search that shows advertising is dystopian.

Google is pursuing both strategies by tucking the Gemini AI chatbot into Android as an AI assistant that can make your phone calls, phone the local restaurant for you and offer suggestions for the best pizza in town. CEO Sundar Pichai is on record stating that the web is an important resource that they’d like to continue using.

But if the chatbot doesn’t show ads, that’s going to significantly cut into Google’s ad revenue. Nevertheless, the SEO industry is convinced that SEO is over because search engines are going to be replaced by AI.

It’s possible that Google at some point makes a lot of money from cloud services and SaaS products and it will be able to walk away from search-based advertising revenue if everyone migrates towards AI chatbots.

Query Deserves Advertising

But if there’s money in search advertising, why go through all the trouble to crawl the web, develop the technology and not monetize it? Who leaves money on the table? Not Google.

There’s a search engine algorithm called Query Deserves Freshness. It determines whether a search query is trending or newsworthy and, if so, favors a webpage on the topic that is recently published, i.e., fresh.

Similarly, I believe that at some point chatbots are going to recognize when a search query deserves ads and switch over to a search result.

Google’s CEO Pichai contradicts the SEO narrative of the decline and disappearance of search engines. Pichai says that the future of search includes websites because search needs the diversity of opinions inherent in the web. So where is this all leading?

Google Search already surfaces answers for informational, non-money queries like the weather and currency conversions. There are no ads for those queries, so Google loses nothing by answering them in a chatbot.

But for shopping and other transactional types of search queries, the best solution is Query Deserves Advertising.

If a user asks a shopping-related search query, there’s going to come a time when the chatbot will “helpfully” decide that the Query Deserves Advertising and switch over to the search engine inventory that also includes advertising.
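A trivial sketch of that switch, with an invented cue list (real intent classification would be a trained model, not a keyword set):

```python
# Toy router for "Query Deserves Advertising": transactional queries switch
# over to ad-supported search results; informational ones stay in the chatbot.
TRANSACTIONAL_CUES = {"buy", "price", "cheap", "deal", "coupon"}

def route(query: str) -> str:
    terms = set(query.lower().split())
    if terms & TRANSACTIONAL_CUES:
        return "search_with_ads"   # query deserves advertising
    return "chatbot_answer"        # informational: no ad revenue at stake

print(route("buy running shoes"))       # search_with_ads
print(route("weather in paris today"))  # chatbot_answer
```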

That may explain why Google’s CEO sees a future where the web is not replaced by AI but coexists with it. So if you think about it, Query Deserves Advertising may be how search engines preserve their lucrative advertising business in the age of AI.

Query Deserves Search

An extension of this concept is search queries that require human input: comparisons, user reviews, expert human reviews, news, medical, financial, and similar topics. Those kinds of queries may also switch over to a search result. The results may not look like today’s search results, but they will still be search results.

People love reading reviews, news, gossip, and other human-generated content, and that’s not going away. Insights matter. Personality matters.

Query Deserves SEO

So maybe the SEO knee jerk reaction that SEO is dead is premature. We’re still at the beginning of this and as long as there’s money to be made off of search there will still be a need for websites, search engines and SEO.

Featured Image by Shutterstock/Shchus

10 Paid Search & PPC Planning Best Practices via @sejournal, @LisaRocksSEM

Whether you are new to paid media or reevaluating your efforts, it’s critical to review your performance and best practices for your overall PPC marketing program, accounts, and campaigns.

Revisiting your paid media plan is an opportunity to ensure your strategy aligns with your current goals.

Reviewing best practices for pay-per-click is also a great way to keep up with trends and improve performance with newly released ad technologies.

As you review, you’ll find new strategies and features to incorporate into your paid search program, too.

Here are 10 PPC best practices to help you adjust and plan for the months ahead.

1. Goals

When planning, it is best practice to define goals for the overall marketing program, ad platforms, and at the campaign level.

Defining primary and secondary goals guides the entire PPC program. For example, your primary conversion may be to generate leads from your ads.

You’ll also want to look at secondary goals, such as brand awareness that is higher in the sales funnel and can drive interest to ultimately get the sales lead-in.

2. Budget Review & Optimization

Some advertisers get stuck in a rut and forget to review and reevaluate the distribution of their paid media budgets.

To best utilize budgets, consider the following:

  • Reconcile your planned vs. actual spend for each account or campaign on a regular basis. Depending on the budget size, monthly, quarterly, or semiannual reviews will work as long as you can hit budget numbers.
  • Determine if there are any campaigns that should be eliminated at this time to free up the budget for other campaigns.
  • Is there additional traffic available to capture and grow results for successful campaigns? The ad platforms often include a tool that will provide an estimated daily budget with clicks and costs. This is just an estimate to show more click potential if you are interested.
  • If other paid media channels perform mediocrely, does it make sense to shift those budgets to another?
  • For the overall paid search and paid social budget, can your company invest more in the positive campaign results?
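The reconciliation in the first bullet can be sketched in a few lines. The campaign names, figures, and the 5% variance threshold below are all made up for illustration:

```python
# Hypothetical planned-vs-actual budget reconciliation across campaigns.
campaigns = {
    "brand_search":   {"planned": 5_000, "spend": 4_200},
    "generic_search": {"planned": 8_000, "spend": 9_100},
    "remarketing":    {"planned": 2_000, "spend": 1_950},
}

flags = {}
for name, c in campaigns.items():
    variance = c["spend"] - c["planned"]
    pct = variance / c["planned"] * 100
    # Flag anything more than 5% off plan in either direction.
    flags[name] = "OVER" if pct > 5 else "UNDER" if pct < -5 else "ON TRACK"
    print(f"{name}: {variance:+,} ({pct:+.1f}%) {flags[name]}")
```

Under- and over-spending campaigns surface immediately, which is where the "free up budget" and "capture more traffic" questions above come in.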

3. Consider New Ad Platforms

If you can shift or increase your budgets, why not test out a new ad platform? Knowing your audience and where they spend time online will help inform your decision when choosing ad platforms.

Go beyond your comfort zone in Google, Microsoft, and Meta Ads.

Here are a few other advertising platforms to consider testing:

  • LinkedIn: Most appropriate for professional and business targeting. LinkedIn audiences can also be reached through Microsoft Ads.
  • TikTok: Younger Gen Z audience (16 to 24), video.
  • Pinterest: Products, services, and consumer goods with a female-focused target.
  • Snapchat: Younger demographic (13 to 35), video ads, app installs, filters, lenses.

Need more detailed information and even more ideas? Read more about the 5 Best Google Ads Alternatives.

4. Top Topics in Google Ads & Microsoft Ads

Recently, trends in search and social ad platforms have presented opportunities to connect with prospects more precisely, creatively, and effectively.

Don’t overlook newer targeting and campaign types you may not have tried yet.

  • Video: Incorporating video into your PPC accounts takes some planning for the goals, ad creative, targeting, and ad types. There is a lot of opportunity here as you can simply include video in responsive display ads or get in-depth in YouTube targeting.
  • Performance Max: This automated campaign type serves across all of Google’s ad inventory. Microsoft Ads recently released PMax so you can plan for consistency in campaign types across platforms. Do you want to allocate budget to PMax campaigns? Learn more about how PMax compares to search.
  • Automation: While AI can’t replace human strategy and creativity, it can help manage your campaigns more easily. During planning, identify which elements you want to automate, such as automatically created assets and/or how to successfully guide the AI in the Performance Max campaigns.

While exploring new features, check out some hidden PPC features you probably don’t know about.

5. Revisit Keywords

The role of keywords has evolved over the past several years with match types being less precise and loosening up to consider searcher intent.

For example, [exact match] keywords previously would literally match with the exact keyword search query. Now, ads can be triggered by search queries with the same meaning or intent.

A great planning exercise is to lay out keyword groups and evaluate if they are still accurately representing your brand and product/service.

Review search term queries triggering ads to discover trends and behavior you may not have considered. It’s possible this has impacted performance and conversions over time.

Critical to your strategy:

  • Review the current keyword rules and determine if this may impact your account in terms of close variants or shifts in traffic volume.
  • Brush up on how keywords work in each platform because the differences really matter!
  • Review search term reports more frequently for irrelevant keywords that may pop up from match type changes. Incorporate these into match type changes or negative keyword lists as appropriate.
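As an illustration of that search-term review, here is a deliberately naive sketch. The themes and queries are invented, and a real review needs human judgment rather than simple word overlap:

```python
# Flag search terms that triggered ads but share no words with the ad
# group's target themes -- candidates for the negative keyword list.
target_themes = {"leather", "bag", "briefcase"}

search_terms = [
    "leather laptop bag",
    "free bag pattern pdf",
    "vegan briefcase",
    "how to repair shoes",
]

negatives = [
    term for term in search_terms
    if not (set(term.split()) & target_themes)
]
print(negatives)  # ['how to repair shoes']
```

Note the limits of the naive filter: "free bag pattern pdf" survives because it contains "bag", even though it is probably irrelevant, which is exactly why these reports still need a human pass.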

6. Revisit Your Audiences

Review the audiences you selected in the past, especially given so many campaign types that are intent-driven.

Automated features that expand your audience could be helpful, but keep an eye out for performance metrics and behavior on-site post-click.

Remember, an audience is simply a list of users who are grouped together by interests or behavior online.

Therefore, there are unlimited ways to mix and match those audiences and target per the sales funnel.

Here are a few opportunities to explore and test:

  • LinkedIn user targeting: Besides LinkedIn, this can be found exclusively in Microsoft Ads.
  • Detailed Demographics: Marital status, parental status, home ownership, education, household income.
  • In-market and custom intent: Searches and online behavior signaling buying cues.
  • Remarketing: Your website visitors, users who interacted with your ads, and video/YouTube viewers.

Note: This varies per the campaign type and seems to be updated frequently, so make this a regular check-point in your campaign management for all platforms.

7. Organize Data Sources

You will likely be running campaigns on different platforms with combinations of search, display, video, etc.

Looking back at your goals, what is the important data, and which platforms will you use to review and report? Can you get the majority of data in one analytics platform to compare and share?

Millions of companies use Google Analytics, which is a good option for centralized viewing of advertising performance, website behavior, and conversions.

8. Reevaluate How You Report

Have you been using the same performance report for years?

It’s time to reevaluate your essential PPC key metrics and replace or add that data to your reports.

Your objectives in reevaluating the reporting are:

  • Are we still using this data? Is it still relevant?
  • Is the data we are viewing actionable?
  • What new metrics should we consider adding that we haven’t thought about?
  • How often do we need to see this data?
  • Do the stakeholders receiving the report understand what they are looking at (aka data visualization)?

Adding new data should be purposeful, actionable, and helpful in making decisions for the marketing plan. It’s also helpful to decide what type of data is good to see as “deep dives” as needed.

9. Consider Using Scripts

The current ad platforms have plenty of AI recommendations and automated rules, and there is no shortage of third-party tools that can help with optimizations.

Scripts are another method for advertisers with large accounts or some scripting skills to automate report generation and repetitive tasks in their Google Ads accounts.

Navigating the world of scripts can seem overwhelming, but a good place to start is a post here on Search Engine Journal that provides use cases and resources to get started with scripts.

Luckily, you don’t need a Ph.D. in computer science — there are plenty of resources online with free or templated scripts.

10. Seek Collaboration

Another effective planning tactic is to seek out friendly resources and second opinions.

Much of the skill and science of PPC management is unique to the individual or agency, so there is no shortage of ideas to share between you.

You can visit the Paid Search Association, a resource for paid ad managers worldwide, to make new connections and find industry events.

Preparing For Paid Media Success

Strategies should be based on clear and measurable business goals. Then, you can evaluate the current status of your campaigns based on those new targets.

Your paid media strategy should also be built with an eye for both past performance and future opportunities. Look backward and reevaluate your existing assumptions and systems while investigating new platforms, topics, audiences, and technologies.

Also, stay current with trends and keep learning. Check out ebooks, social media experts, and industry publications for resources and motivational tips.

Featured Image: Vanatchanan/Shutterstock

Google Unplugs “Notes on Search” Experiment via @sejournal, @martinibuster

Google is shutting down its Notes on Search experiment in Search Labs, which allowed users to see and leave notes on Google’s search results. Many in the search community aren’t too surprised.

Google Search Notes

Availability of the feature was limited to Android and Apple devices, and the Notes experiment never had a clearly defined practical purpose. Search marketers’ reaction throughout was consistently that it would become a spam magnet.

The Search Labs page for the experiment touts it as a mode of self-expression, a way to help other users, and a way for users to collect their own notes within their Google profiles.

The official Notes page in Search Labs has a simple notice:

Notes on Search Ends May 2024

That’s it.

Reaction From Search Community

Kevin Indig tweeted his thoughts that anything Google makes with a user generated content aspect was doomed to attract spam.

He tweeted:

“I’m gonna assume Google retires notes because of spam.

It’s crazy how spammy the web has become. Google can’t launch anything UGC without being bombarded.”

Cindy Krum (@Suzzicks) tweeted that it was author Purna Virji who predicted that it would be shut down once Google received enough data.

She shared:

“It was actually @purnavirji who predicted it when we were at @BarbadosSeo – while I was talking. Everyone agreed that it would be spammed, but she said it would just be a test to collect a certain type of information until they got what they needed, and then it would be retired.”

Purna herself responded with a tweet:

“My personal (non-employer) opinion is that everyone wants all the UGC to train the AI models. Eg Reddit deal also could potentially help with that.”

Google’s Notes for Search seemed destined never to take off. It was met with skepticism and a shrug when it came out, and nobody’s really mourning that it’s on the way out, either.

Featured Image by Shutterstock/Jamesbin

Scaling individual impact: Insights from an AI engineering leader

Traditionally, moving up in an organization has meant leading increasingly large teams of people, with all the business and operational duties that entails. As a leader of large teams, your contributions can become less about your own work and more about your team’s output and impact. There’s another path, though. The rapidly evolving fields of artificial intelligence (AI) and machine learning (ML) have increased demand for engineering leaders who drive impact as individual contributors (ICs). An IC has more flexibility to move across different parts of the organization, solve problems that require expertise from different technical domains, and keep their skill set aligned with the latest developments (hopefully with the added benefit of fewer meetings).

In an executive IC role as a technical leader, I have a deep impact by looking at the intersections of systems across organizational boundaries, prioritizing the problems that really need solving, and then assembling stakeholders from across teams to create the best solutions.

Driving influence through expertise

People leaders typically have the benefit of an organization that scales with them. As an IC, you scale through the scope, complexity, and impact of the problems you help solve. The key to being effective is getting really good at identifying and structuring problems. You need to proactively identify the most impactful problems to solve—the ones that deliver the most value but that others aren’t focusing on—and structure them in a way that makes them easier to solve.

People skills are still important because building strong relationships with colleagues is fundamental. When consensus is clear, solving problems is straightforward, but when the solution challenges the status quo, it’s crucial to have established technical credibility and organizational influence.

And then there’s the fun part: getting your hands dirty. Choosing the IC path has allowed me to spend more time designing and building AI/ML systems than other management roles would—prototyping, experimenting with new tools and techniques, and thinking deeply about our most complex technical challenges.

A great example I’ve been fortunate to work on involved designing the structure of a new ML-driven platform. It required significant knowledge at the cutting edge and touched multiple other parts of the organization. The freedom to structure my time as an IC allowed me to dive deep in the domain, understand the technical needs of the problem space, and scope the approach. At the same time, I worked across multiple enterprise and line-of-business teams to align appropriate resources and define solutions that met the business needs of our partners. This allowed us to deliver a cutting-edge solution on a very short timescale to help the organization safely scale a new set of capabilities.

Being an IC lets you operate more like a surgeon than a general. You focus your efforts on precise, high-leverage interventions. Rapid, iterative problem-solving is what makes the role impactful and rewarding.

The keys to success as an IC executive

In an IC executive role, there are key skills that are essential. First is maintaining deep technical expertise. I usually have a couple of different lines of study going on at any given time, one that’s closely related to the problems I’m currently working on, and another that takes a long view on foundational knowledge that will help me in the future.

Second is the ability to proactively identify and structure high-impact problems. That means developing a strong intuition for where AI/ML can drive the most business value, and framing the problem in a way that achieves the best business results.

Determining how the problem will be formulated means considering what specific problem you are trying to solve and what you are leaving off the table. This intentional approach aligns the right complexity level to the problem to meet the organization’s needs with the minimum level of effort. The next step is breaking down the problem into chunks that can be solved by the people or teams aligned to the effort.

Doing this well requires building a diverse network across the organization. Building and nurturing relationships in different functional areas is crucial to IC success, giving you the context to spot impactful problems and the influence to mobilize resources to address them.

Finally, you have to be an effective communicator who can translate between technical and business audiences. Executives need you to contextualize system design choices in terms of business outcomes and trade-offs. And engineers need you to provide crisp problem statements and solution sketches.

It’s a unique mix of skills, but if you can cultivate that combination of technical depth, organizational savvy, and business-conscious communication, you can drive powerful innovations. And you can do it while preserving the hands-on problem-solving abilities that likely drew you to engineering in the first place.

Empowering IC Career Paths

As the fields of AI/ML evolve, there’s a growing need for senior ICs who can provide technical leadership. Many organizations are realizing that they need people who can combine deep expertise with strategic thinking to ensure these technologies are being applied effectively.

However, many companies are still figuring out how to empower and support IC career paths. I’m fortunate that Capital One has invested heavily in creating a strong Distinguished Engineer community. We have mentorship, training, and knowledge-sharing structures in place to help senior ICs grow and drive innovation.

ICs have more freedom than most to craft their own job description around their own preferences and skill sets. Some ICs may choose to focus on hands-on coding, tackling deeply complex problems within an organization. Others may take a more holistic approach, examining how teams intersect and continually collaborating in different areas to advance projects. Either way, an IC needs to be able to see the organization from a broad perspective, and know how to spot the right places to focus their attention.

Effective ICs also need the space and resources to stay on the bleeding edge of their fields. In a domain like AI/ML that’s evolving so rapidly, continuous learning and exploration are essential. They're not a nice-to-have; they're a core part of the job. And since your time as an individual doesn’t scale, this requires disciplined time management.

Shaping the future

The role of an executive IC in engineering is all about combining deep technical expertise with a strategic mindset. That’s a key ingredient in the kind of transformational change that AI is driving, but realizing this potential will require a shift in the way many organizations think about leadership.

I’m excited to see more engineers pursue an IC path and bring their unique mix of skills to bear on the toughest challenges in AI/ML. With the right organizational support, I believe a new generation of IC leaders will emerge and help shape the future of the field. That’s the opportunity ahead of us, and I’m looking forward to leading by doing.

This content was produced by Capital One. It was not written by MIT Technology Review’s editorial staff.

This US startup makes a crucial chip material and is taking on a Japanese giant

It can be dizzying to try to understand all the complex components of a single computer chip: layers of microscopic components linked to one another through highways of copper wires, some barely wider than a few strands of DNA. Nestled between those wires is an insulating material called a dielectric, ensuring that the wires don’t touch and short out. Zooming in further, there’s one particular dielectric placed between the chip and the structure beneath it; this material, called dielectric film, is produced in sheets as thin as white blood cells. 

For 30 years, a single Japanese company called Ajinomoto has made billions producing this particular film. Competitors have struggled to outdo it, and today Ajinomoto holds more than 90% of the market for the product, which is used in everything from laptops to data centers. 

But now, a startup based in Berkeley, California, is embarking on a herculean effort to dethrone Ajinomoto and bring this small slice of the chipmaking supply chain back to the US.

Thintronics is promising a product purpose-built for the computing demands of the AI era—a suite of new materials that the company claims have higher insulating properties and, if adopted, could mean data centers with faster computing speeds and lower energy costs. 

The company is at the forefront of a coming wave of new US-based companies, spurred by the $280 billion CHIPS and Science Act, that is seeking to carve out a portion of the semiconductor sector, which has become dominated by just a handful of international players. But to succeed, Thintronics and its peers will have to overcome a web of challenges—solving technical problems, disrupting long-standing industry relationships, and persuading global semiconductor titans to accommodate new suppliers. 

“Inventing new materials platforms and getting them into the world is very difficult,” Thintronics founder and CEO Stefan Pastine says. It is “not for the faint of heart.”

The insulator bottleneck

If you recognize the name Ajinomoto, you’re probably surprised to hear it plays a critical role in the chip sector: the company is better known as the world’s leading supplier of MSG seasoning powder. In the 1990s, Ajinomoto discovered that a by-product of MSG made a great insulator, and it has enjoyed a near monopoly in the niche material ever since. 

But Ajinomoto doesn’t make any of the other parts that go into chips. In fact, the insulating materials in chips rely on dispersed supply chains: one layer uses materials from Ajinomoto, another uses material from another company, and so on, with none of the layers optimized to work in tandem. The resulting system works okay when data is being transmitted over short paths, but over longer distances, like between chips, weak insulators act as a bottleneck, wasting energy and slowing down computing speeds. That’s recently become a growing concern, especially as the scale of AI training gets more expensive and consumes eye-popping amounts of energy. (Ajinomoto did not respond to requests for comment.) 

None of this made much sense to Pastine, a chemist who sold his previous company, which specialized in recycling hard plastics, to an industrial chemicals company in 2019. Around that time, he started to believe that the chemicals industry could be slow to innovate, and he thought the same pattern was keeping chipmakers from finding better insulating materials. In the chip industry, he says, insulators have “kind of been looked at as the redheaded stepchild”—they haven’t seen the progress made with transistors and other chip components. 

He launched Thintronics that same year, with the hope that cracking the code on a better insulator could provide data centers with faster computing speeds at lower costs. That idea wasn’t groundbreaking—new insulators are constantly being researched and deployed—but Pastine believed that he could find the right chemistry to deliver a breakthrough. 

Thintronics says it will manufacture different insulators for all layers of the chip, for a system designed to swap into existing manufacturing lines. Pastine tells me the materials are now being tested with a number of industry players. But he declined to provide names, citing nondisclosure agreements, and similarly would not share details of the formula. 

Without more details, it’s hard to say exactly how well the Thintronics materials compare with competing products. The company recently tested its materials’ Dk values, which measure how effective an insulator a material is. Venky Sundaram, a researcher who has founded multiple semiconductor startups but is not involved with Thintronics, reviewed the results. Some of Thintronics’ numbers were fairly average, he says, but its most impressive Dk value is far better than anything available today.
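As a rough illustration of why Dk matters, independent of any Thintronics specifics: a signal traveling through wiring surrounded by a dielectric propagates at roughly the speed of light divided by the square root of the material's dielectric constant, so a lower Dk means faster signals and less delay. The values in this sketch are hypothetical, not from the article.

```python
# Illustrative sketch only: the Dk values below are hypothetical and not
# from Thintronics or the article. A signal in wiring surrounded by a
# dielectric propagates at roughly c / sqrt(Dk), so a lower Dk
# (dielectric constant) means faster signals between chips.

import math

C = 299_792_458  # speed of light in a vacuum, m/s

def propagation_speed(dk: float) -> float:
    """Approximate signal propagation speed in a medium with dielectric constant dk."""
    return C / math.sqrt(dk)

# Hypothetical comparison: a conventional film vs. a lower-Dk alternative
conventional = propagation_speed(3.2)
improved = propagation_speed(2.4)
print(f"Relative speedup from the lower-Dk material: {improved / conventional:.2f}x")
```

The same lower capacitive coupling that speeds up signals also reduces the energy wasted charging and discharging the wiring, which is why weak insulators act as the bottleneck described above.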

A rocky road ahead

Thintronics’ vision has already garnered some support. The company received a $20 million Series A funding round in March, led by venture capital firms Translink and Maverick, as well as a grant from the US National Science Foundation. 

The company is also seeking funding from the CHIPS Act. Signed into law by President Joe Biden in 2022, it’s designed to boost companies like Thintronics in order to bring semiconductor manufacturing back to American companies and reduce reliance on foreign suppliers. A year after it became law, the administration said that more than 450 companies had submitted statements of interest to receive CHIPS funding for work across the sector. 

The bulk of funding from the legislation is destined for large-scale manufacturing facilities, like those operated by Intel in New Mexico and Taiwan Semiconductor Manufacturing Company (TSMC) in Arizona. But US Secretary of Commerce Gina Raimondo has said she’d like to see smaller companies receive funding as well, especially in the materials space. In February, applications opened for a pool of $300 million earmarked specifically for materials innovation. While Thintronics declined to say how much funding it was seeking or from which programs, the company does see the CHIPS Act as a major tailwind.

But building a domestic supply chain for chips—a product that currently depends on dozens of companies around the globe—will mean reversing decades of specialization by different countries. And industry experts say it will be difficult to challenge today’s dominant insulator suppliers, who have often had to adapt to fend off new competition. 

“Ajinomoto has been a 90-plus-percent-market-share material for more than two decades,” says Sundaram. “This is unheard-of in most businesses, and you can imagine they didn’t get there by not changing.”

One big challenge is that the dominant manufacturers have decades-long relationships with chip designers like Nvidia or Advanced Micro Devices, and with manufacturers like TSMC. Asking these players to swap out materials is a big deal.

“The semiconductor industry is very conservative,” says Larry Zhao, a semiconductor researcher who has worked in the dielectrics industry for more than 25 years. “They like to use the vendors they already know very well, where they know the quality.” 

Another obstacle facing Thintronics is technical: insulating materials, like other chip components, are held to manufacturing standards so precise they are difficult to comprehend. The layers where Ajinomoto dominates are thinner than a human hair. The material must also be able to accept tiny holes, which house wires running vertically through the film. Every new iteration is a massive R&D effort in which incumbent companies have the upper hand given their years of experience, says Sundaram.

If all this is completed successfully in a lab, yet another hurdle lies ahead: the material has to retain those properties in a high-volume manufacturing facility, which is where Sundaram has seen past efforts fail.

“I have advised several material suppliers over the years that tried to break into [Ajinomoto’s] business and couldn’t succeed,” he says. “They all ended up having the problem of not being as easy to use in a high-volume production line.” 

Despite all these challenges, one thing may be working in Thintronics’ favor: US-based tech giants like Microsoft and Meta are making headway in designing their own chips for the first time. The plan is to use these chips for in-house AI training as well as for the cloud computing capacity that they rent out to customers, both of which would reduce the industry’s reliance on Nvidia. 

Though Microsoft, Google, and Meta declined to comment on whether they are pursuing advancements in materials like insulators, Sundaram says these firms could be more willing to work with new US startups rather than defaulting to the old ways of making chips: “They have a lot more of an open mind about supply chains than the existing big guys.”

SEO Takeaways from SGE’s Partial Rollout

Google extended Search Generative Experience last month beyond Labs, its testing program. A limited number of searchers now see AI snapshots in results regardless of whether they signed up.

Many observers believe it’s the first step to SGE becoming fully public this year. Google hasn’t changed SGE much in Labs for months, perhaps signaling its satisfaction thus far.

I’ve closely monitored SGE developments. Here are my observations and expectations.

Traffic Losses Overestimated

Authoritas, a search-engine-optimization platform, has been testing SGE in Labs and publishing the results. In March, the tests found that SGE appears in 91% of U.S. search results for brand and product terms, in one of two ways.

First, the results could contain a “Generate” button that produces an AI-powered answer only when clicked, as with the query “best laptops.”

Clicking the “Generate” button produces AI-powered answers.

Second, the search results could contain an instant answer, such as the example below for “How healthy is spaghetti squash?” Clicking “Show more” expands the explanation.

Instant SGE answers such as this example appear automatically. Clicking “Show more” expands the explanation.

The instant answer takes much more SERP space and will likely steal more clicks from organic listings because it pushes them further down the page.

Fortunately, according to the same Authoritas study, 81.4% of SGE responses come from clicking the “Generate” button rather than appearing instantly. This suggests organic listings won’t be hugely impacted, at least for now, since the page isn’t disrupted unless the button is clicked.

Organic Listings Displaced

However, when SGE is triggered, organic results appear far below the initial screen.

For example, searching for “smart TV” and clicking “Generate” produces an AI answer that occupies an entire screen on mobile and desktop.

Authoritas estimated an average organic listings drop of 1,243 pixels, depending on the search term. SGE results more or less eliminate the visibility of organic listings, especially for queries seeking consumables such as household goods.

Even before SGE, organic visibility was increasingly limited owing to the various features Google inserts at or near the top, such as ads, “People also ask” boxes, image packs, local packs, and more.

Opportunities in SGE

The good news is that SGE answers contain links, an opportunity for organic visibility. Authoritas states that SGE’s snapshots, on average, contain five unique links, and just one matches the top 10 organic listings below it. Perhaps it’s because the snapshots often address related queries.

For example, SGE snapshots for the search of “best laptops” list many makes and models as well as links for the best laptops for students, budgets, and coding. Organic listings for “best laptops” do not include those additional choices.

SGE snapshots for “best laptops” list many makes and models as well as related links, such as “The Best Laptops for Sims 4” (for students).

Thus, after optimizing for important keywords, consider creating supporting content for related queries to increase the chances of showing up in SGE results. For ideas, keep an eye on “Related searches,” because those keywords seem to appear in SGE.


WordPress Releases A Performance Plugin For “Near-Instant Load Times” via @sejournal, @martinibuster

WordPress released an official plugin that adds support for a cutting-edge technology called speculative loading, which can help boost site performance and improve the user experience for site visitors.

Speculative Loading

Speculative loading is a technique that fetches pages or resources before a user clicks a link to navigate to another webpage.

The official WordPress page about this new functionality describes it:

“The Speculation Rules API is a new web API… It allows defining rules to dynamically prefetch and/or prerender URLs of certain structure based on user interaction, in JSON syntax—or in other words, speculatively preload those URLs before the navigation.

This API can be used, for example, to prerender any links on a page whenever the user hovers over them. Also, with the Speculation Rules API, “prerender” actually means to prerender the entire page, including running JavaScript. This can lead to near-instant load times once the user clicks on the link as the page would have most likely already been loaded in its entirety. However that is only one of the possible configurations.”
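As a concrete sketch, rules like those described above are embedded in a page as a JSON script tag. This hand-written example is illustrative only (the WordPress plugin generates its rules automatically); it asks the browser to prerender same-site links with "moderate" eagerness:

```html
<!-- Illustrative only: the URL pattern is an assumption, and the plugin
     emits its own rules, so hand-writing this is for non-WordPress sites. -->
<script type="speculationrules">
{
  "prerender": [
    {
      "where": { "href_matches": "/*" },
      "eagerness": "moderate"
    }
  ]
}
</script>
```

With "moderate" eagerness, supporting browsers typically begin speculating when the user hovers over a matching link, which matches the hover-to-prerender behavior described in the quote above.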

The new WordPress plugin adds support for the Speculation Rules API. The Mozilla developer pages, a great resource for understanding web technologies, describe it like this:

“The Speculation Rules API is designed to improve performance for future navigations. It targets document URLs rather than specific resource files, and so makes sense for multi-page applications (MPAs) rather than single-page applications (SPAs).

The Speculation Rules API provides an alternative to the widely available prefetch feature and is designed to supersede the Chrome-only deprecated prerender feature. It provides many improvements over these technologies, along with a more expressive, configurable syntax for specifying which documents should be prefetched or prerendered.”

Performance Lab Plugin

The new plugin was developed by the official WordPress performance team, which occasionally rolls out new plugins for users to test ahead of possible inclusion in WordPress core. It’s a good opportunity to be among the first to try new performance technologies.

The new WordPress plugin by default prerenders “WordPress frontend URLs,” meaning pages, posts, and archive pages. Its behavior can be fine-tuned under the settings:

Settings > Reading > Speculative Loading

Browser Compatibility

The Speculation Rules API has been supported since Chrome 108, but the specific rules used by the new plugin require Chrome 121 or higher. Chrome 121 was released in early 2024.

Browsers that do not support the API will simply ignore the rules, so the plugin has no effect on the user experience in those browsers.

Check out the new Speculative Loading WordPress plugin developed by the official core WordPress performance team.


Are Websites Getting Faster? New Data Reveals Mixed Results via @sejournal, @MattGSouthern

Website loading times are gradually improving, but a new study shows significant variance in performance across sites and geographic regions.

The study from web monitoring company DebugBear examined data from Google’s Chrome User Experience Report (CrUX), which collects real-world metrics across millions of websites.

“The average website takes 1.3 seconds to load the main page content for an average visit,” the report stated, using Google’s Largest Contentful Paint (LCP) metric to measure when the main content element becomes visible.

While that median LCP time of 1.3 seconds represents a reasonably fast experience, the data shows a wide range of loading performances:

  • On 25% of mobile websites, visitors have to wait over 2.1 seconds for the main content to appear
  • For the slowest 1% of websites, even an average page load takes more than 5.7 seconds on mobile
  • The slowest 10% of websites make 10% of users wait over 5 seconds for the LCP on mobile
  • Almost 1% of mobile page loads take nearly 20 seconds before the main content shows up
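To make the percentile language above concrete, here is a small sketch of how median and high-percentile figures are read off a distribution of real-user measurements. The LCP samples are made up; actual CrUX data covers millions of page loads.

```python
# A sketch of how figures like the ones above are derived from a
# distribution of field measurements. The LCP samples (in seconds) are
# made up; real CrUX data covers millions of page loads per site.

import statistics

lcp_samples = [0.8, 0.9, 1.1, 1.2, 1.3, 1.4, 1.6, 2.1, 3.5, 5.7]

def percentile(data: list[float], pct: float) -> float:
    """Nearest-rank percentile: the value below which pct% of samples fall."""
    ordered = sorted(data)
    index = max(0, min(len(ordered) - 1, round(pct / 100 * len(ordered)) - 1))
    return ordered[index]

median = statistics.median(lcp_samples)
print(f"Median LCP: {median:.2f}s")                        # the 'typical' visit
print(f"75th percentile: {percentile(lcp_samples, 75)}s")  # a fairly slow visit
print(f"90th percentile: {percentile(lcp_samples, 90)}s")  # the slow tail
```

The same site yields very different numbers depending on which percentile you quote, which is why a "fast" median can coexist with a long slow tail.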

“Even on a fast website, some percentage of page views will be slow,” the study reads.

Continue reading for a deeper dive into the study to understand how your website speed compares to others.

Site Speed Divergences

The data reveals divergences in speeds between different user experiences, devices, and geographic locations:

  • Desktop sites (1.1-second median LCP) load faster than mobile (1.4 seconds)
  • While 25% of mobile page loads hit LCP in under 1 second, 10% take over 4 seconds
  • In the Central African Republic, a typical mobile LCP is 9.2 seconds (75th percentile)
  • Sweden, Slovenia, Japan, and South Korea all had 75th percentile mobile LCPs under 1.7 seconds

“Differences in network connections and device CPU speed mean that visitors in different countries experience the web differently,” the report noted.

The study also found that more popular sites are faster, with the median LCP on the top 1000 sites being 1.1 seconds compared to 1.4 seconds for the top 10 million sites.

Steady Improvement Continues

DebugBear’s analysis shows that websites have steadily become faster across device types over the past few years despite the variances.

A similar improvement was seen for other loading metrics, like First Contentful Paint.

“While changes to the LCP definition may have impacted the data, the First Contentful Paint metric – which is more stable and well-defined – has also improved,” the report stated.

The gains could be attributed to faster devices and networks, better website optimization, and improvements in the Chrome browser.

The study’s key finding was that “Page speed has consistently improved.” However, it also highlighted the wide range of experiences in 2024.

As DebugBear summarized, “A typical visit to a typical website is fast, but you likely visit many websites each day, some slow and some fast.”

Why SEJ Cares

This study provides an annual check-in to see how the web is progressing in terms of loading performance.

In recent years, Google has been emphasizing page load times and its Core Web Vitals metrics to measure and encourage better user experiences.

Speed also plays a role in search rankings. However, its precise weight as a ranking signal is debated.

How This Can Help You

SEO professionals can use studies like this to advocate for prioritizing page speed across an organization.

This report highlights that even high-performing sites likely have a segment of visitors hitting a subpar speed.

Refer to the study as a benchmark for how your site compares to others. If you’re unsure where to start, look at LCP times in the Chrome User Experience Report.

If a segment is well above the 2.1-second threshold for mobile, as highlighted in this study, it may be worth prioritizing front-end optimization efforts.

Segment your page speed data by country for sites with an international audience. Identifying geographic weak spots can inform performance budgeting and CDN strategies.
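As a hypothetical sketch of that segmentation, the snippet below groups per-visit LCP samples by country and reports a 75th-percentile figure for each. The country codes and timings are made up; real data would come from CrUX or your own RUM tooling.

```python
# Hypothetical sketch: segmenting LCP field data by country to surface
# geographic weak spots. Country codes and timings are made up; real
# data would come from CrUX or a RUM tool's export.

from collections import defaultdict

# (country, lcp_seconds) pairs, as a RUM export might provide them
samples = [
    ("SE", 0.9), ("SE", 1.2), ("SE", 1.6),
    ("CF", 6.0), ("CF", 9.2), ("CF", 11.5),
]

by_country: dict[str, list[float]] = defaultdict(list)
for country, lcp in samples:
    by_country[country].append(lcp)

def p75(values: list[float]) -> float:
    """Nearest-rank 75th percentile of the samples."""
    ordered = sorted(values)
    return ordered[max(0, round(0.75 * len(ordered)) - 1)]

for country, values in sorted(by_country.items()):
    print(f"{country}: 75th-percentile LCP = {p75(values):.1f}s")
```

A country whose 75th-percentile LCP sits well above the rest is a natural candidate for a closer CDN edge or a lighter page variant.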

Remember that you can’t do it all alone. Performance optimization is a collaborative effort between SEOs and developers.
