How I Edit AI Content: A Workflow For The New Age Of Content Creation via @sejournal, @Kevin_Indig

In last week’s Memo, I explained how, just as digital DJing transformed music mixing, AI is revolutionizing how we create content by giving us instant access to diverse expressions and ideas.

Instead of fighting this change, writers should embrace AI as a starting point while focusing our energy on adding uniquely human elements that machines can’t replicate, like our personal experiences, moral judgment, and cultural understanding.

Last week, I identified seven distinctly human writing capabilities and 11 telltale signs of AI-generated content.

Today, I want to show you how I personally apply these insights in my editing process.

Image Credit: Lyna ™

Rather than seeing AI as my replacement, I advocate for thoughtful collaboration between human creativity and AI efficiency, much like how skilled DJs don’t just play songs but transform them through artistic mixing.

As someone who’s spent countless hours editing and tinkering with AI-generated drafts, I’ve noticed most people get stuck on grammar fixes while missing what truly makes writing connect with readers.

They overlook deeper considerations like:

  • Purposeful imperfection: Truly human writing isn’t perfectly polished. Natural quirks, occasional tangents, and varied sentence structures create authenticity that perfect grammar and flawless organization can’t replicate.
  • Emotional intelligence: AI content often lacks the intuitive emotional resonance that comes from lived experience. Editors frequently correct grammar but overlook opportunities to infuse genuine emotional depth.
  • Cultural context: Humans naturally reference shared cultural touchpoints and adapt their tone based on context. This awareness is difficult to edit into AI content without completely reframing passages.

In today’s Memo, I explain how to turn these edits into a recurring workflow for you or your team, so you can leverage the power of AI, accelerate content output, and drive more organic revenue.


Turning AI-Editing Into A Workflow

AI Editing Workflow (Image Credit: Kevin Indig)

I like to edit AI content in several passes, each with a specific focus:

  • Round 1: Structure.
  • Round 2: Language.
  • Round 3: Humanization.
  • Round 4: Polish.

Not every type of content needs the same amount of editing:

  • You can be more hands-off with supporting content on category or product pages, while editorial content for blogs or content hubs needs significantly more editing.
  • In the same way, evergreen topics need less editing while thought leadership needs a heavy editorial hand.

Round 1: Structure & Big-Picture Review

First, I read the entire draft like a skeptical reader would.

I’m looking for logical flow issues, redundant sections, and places where the AI went on unhelpful tangents.

This is about getting the bones right before polishing sentences.

Rather than nitpicking grammar, I ask: “Does this piece make sense? Would a human actually structure it this way?”

But the most important question is: “Does this piece meet user intent?” You need to ensure that the structure optimizes for speed-to-insight and helps users solve the implied problem behind their searches.

If sections feel out of order or disconnected, I rearrange them.

If the AI repeats the same point in multiple places (models love doing this), I consolidate.

Round 2: Humanize The Language & Flow

Next, I tackle that sterile AI tone that makes readers’ eyes glaze over.

I break up the robotic rhythm by:

  • Consciously varying sentence lengths (Watch this. I’m doing it right now. Different lengths create natural cadence.)
  • Replacing corporate-speak with how humans actually talk (“use” instead of “utilize,” “start” instead of “commence”).
  • Cutting those meaningless filler sentences AI loves to add (“It’s important to note that…” or “As we can see from the above…”).

For example, I’d transform this AI-written line:

Utilizing appropriate methodologies can facilitate enhanced engagement among target demographics.

Into this:

Use the right approach, and people will actually care about what you’re saying.

Round 3: Add The Human Value Only You Can Provide

Here’s where I earn my keep.

I infuse the piece with:

  • Opinions where appropriate.
  • Personal stories or examples.
  • Unique metaphors or cultural references.
  • Nuanced insights that come from my expertise.

One of the shifts we have to make – and that I made – is to be more deliberate about collecting stories and opinions we can draw on.

In his book “Storyworthy,” Matthew Dicks shares how he saves stories from everyday life in a spreadsheet. He calls this habit Homework For Life, and it’s the most effective way to collect relatable stories that you can use for your content. It’s also a way to slow down time:

As you begin to take stock of your days, find those moments — see them and record them — time will begin to slow down for you. The pace of your life will relax.

Round 4: Final Polish & Optimization

Finally, I do a last pass focusing on:

  • A punchy opening that hooks the reader.
  • Removing any lingering AI patterns (overly formal language, repetitive phrases).
  • Search optimization (user intent, headings, keywords, internal links) without sacrificing readability.
  • Fact-checking every statistic, date, name, and claim.
  • Adding calls to action or questions that engage readers.

I know I’ve succeeded when I read the final piece and genuinely forget that an AI was involved in the drafting process.

The ultimate test: “Would I be proud to put my name on this?”

AI Content Editing Checklist

Before you hit “Publish,” run through this checklist to make sure you’ve covered all bases:

  • User Intent: The content is organized logically and addresses the intended topic or keyword completely, without off-topic detours.
  • Tone & Voice: The writing sounds consistently human and aligns with brand voice (e.g., friendly, professional, witty, etc.).
  • Readability: Sentences and paragraphs are concise and easy to read. Jargon is explained or simplified. The formatting (headings, lists, etc.) makes it skimmable.
  • Repetition: No overly repetitive phrases or ideas. Redundant content is trimmed. The language is varied and interesting.
  • Accuracy: All facts, stats, names, and claims have been verified. Any errors are corrected. Sources are cited for important or non-obvious facts, lending credibility. There are no unsupported claims or outdated information.
  • Original Value: The content contains unique elements (experiences, insights, examples, opinions) that did not come from AI.
  • SEO: The primary keyword and relevant terms are included naturally. Title and headings are optimized and clear. Internal links to related content are added where relevant. External links to authoritative sources support the content.
  • Polish: The introduction is compelling. The content includes elements that engage the reader (questions, conversational bits) and a call to action. It’s free of typos and grammatical errors. All sentences flow well.

If you can check off all (or most) of these items, you’ve likely turned the AI draft into a high-quality piece that can confidently be published.

AI Content Editing = Remixing

We’ve come full circle.

Just as digital technology transformed DJing without eliminating the need for human creativity and curation, AI is reshaping writing while still requiring our uniquely human touch.

The irony I mentioned at the start of this article – trying to make AI content more human – becomes less ironic when we view AI as a collaborative tool rather than a replacement for human creativity.

Just as DJs evolved from vinyl crates to digital platforms without losing their artistic touch, writers are adapting to use AI while maintaining their unique value.

You can raise the chances of creating high-performing content that stands out by selecting the right models, inputs, and direction:

  • The newest models produce significantly better content than older (cheaper) ones. Don’t try to save money here.
  • Spend a lot of time getting style guides and examples right so the models work in the right lanes.
  • The more unique your data sources are, the more defensible your AI draft becomes.

The key insight is this: AI content editing is about enhancing the output with the irreplaceable human elements that make content truly engaging.

Whether that’s through adding lived experience, cultural understanding, emotional depth, or purposeful imperfection, our role is to be the bridge between AI’s computational efficiency and human connection.

The future belongs not to those who resist AI but to those who learn to dance with it, knowing exactly when to lead with their uniquely human perspective and when to follow the algorithmic beat.

Back in my DJ days, the best sets weren’t about the equipment I used but about the moments of unexpected connection I created.

The same holds true for writing in this new era.


Featured Image: Paulo Bobita/Search Engine Journal

Ask An SEO: My Content Can’t Be Seen When I Disable JavaScript – Will This Affect My SEO? via @sejournal, @HelenPollitt1

This week’s question comes from Thomas, who asks:

I disabled the JavaScript just to check the content of my webpage, but unfortunately I could not see any content except the banner H1 tag.

Will it hurt my SEO? If yes, what are the advisable solutions for this?

This is a great question – it’s something that all SEO professionals need to be aware of.

We spend so much time trying to create interesting, engaging content that it would be heartbreaking to think that it isn’t visible to search engines.

However, given the recent advancements in Google’s ability to render JavaScript content, is that something we still need to be concerned about?

The short answer is yes.

Why JavaScript Can Be A Problem

We know that to ingest information, Googlebot will discover a page, crawl it, parse and index it. For JavaScript, the crawler needs to “render” the code. The rendering stage is where JavaScript problems can occur.

JavaScript has to be downloaded and executed in order for the content to be parsed. This takes more resources than the bot parsing content in HTML.

As such, sometimes Google will defer the rendering stage and come back to a page to render it at a later date.

Most websites these days will use some JavaScript – that’s absolutely fine.

However, if your website requires JavaScript to load important content that is crucial to the page, then it might be a risk.

If, for some reason, a search bot does not render the JavaScript on a page, then it will not have any context as to what the page is about.

It is crucial to remember that not every search engine can render JavaScript. This is becoming increasingly important in the era of generative search engines – very few of which render JavaScript.

Diagnosing A Problem

You’ve done the right thing by starting to investigate the effect JavaScript rendering might be having on your site.

Turning off JavaScript and seeing what content remains, and what is still interactive without it, is important.

I suggest going a step further and looking at what is available to the search bots to read on a page’s first load. This will help you identify content accessible without JavaScript rendering.
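One way to run that first-load check programmatically is a quick script that inspects the raw HTML the server returns before any JavaScript executes. This is only a rough sketch using Python’s standard library; the sample markup and the phrase being searched for are invented for illustration.

```python
from urllib.request import urlopen

def fetch_initial_html(url: str) -> str:
    """Fetch the raw HTML the server returns, before any JavaScript runs."""
    with urlopen(url) as response:
        return response.read().decode("utf-8", errors="replace")

def content_in_html(html: str, phrase: str) -> bool:
    """Case-insensitive check for a key content phrase in the markup."""
    return phrase.lower() in html.lower()

# Stand-in for a client-side-rendered page: only an empty app shell
# comes back on first load, so the main content is missing.
shell_page = '<html><body><h1>Banner</h1><div id="root"></div></body></html>'
print(content_in_html(shell_page, "pricing details"))  # False
```

If a phrase from your main content only appears after rendering, that content depends on JavaScript and may be invisible to bots that don’t render it.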

Check Google Search Console

First off, use Google Search Console’s URL Inspection tool and look at the rendered HTML. If the content is present in the rendered HTML, Google should be able to read it.

Check Chrome Browser

You can go to “View Source” in Chrome to see the raw HTML the server returns before JavaScript runs. If the content is all there, you don’t need to worry any further.

However, if it’s not, then you can use the Developer Tools in Chrome for further diagnostics. Look in the “Elements” tab. If you can see your content, then again, you are probably OK.

Check The Robots.txt

Sometimes, developers may choose to block specific JavaScript files from being crawled by disallowing them in the robots.txt.

This isn’t necessarily an issue unless those files are needed to render important information.

It’s always worth checking your robots.txt file to see if there are any JavaScript files blocked that could prevent the bots, in particular, from accessing the content of the page.
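Python’s standard library can sketch this robots.txt check. The rules below are hypothetical; swap in your own robots.txt and the URLs of the script bundles your pages actually depend on.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that blocks an entire JavaScript directory -
# a problem if those files are needed to render important content.
robots_txt = """\
User-agent: *
Disallow: /static/js/
""".splitlines()

parser = RobotFileParser()
parser.parse(robots_txt)
parser.modified()  # mark the rules as loaded so can_fetch() consults them

print(parser.can_fetch("Googlebot", "https://example.com/static/js/app.js"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/pricing"))           # True
```

If a blocked path matches a file your pages need for rendering, that disallow rule is worth revisiting with your developers.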

Next Steps

JavaScript tends to worry a lot of people when it comes to SEO. It’s a significant part of the modern web, however, and there’s no escaping its use.

We need to ensure that our websites use JavaScript in a way that still lets both popular and emerging search engines find and read our content.

You don’t need to panic, but you do need to be diligent.

If you have developer resources at hand, you can work with them to identify the most applicable solution.

Here are some checks you may want to make:

Are We Using Client-Side Rendering Or Server-Side Rendering?

Client-side rendering essentially utilizes the browser to render the JavaScript of a page.

When a page is visited, the server responds by sending the HTML code and the JavaScript files. The browser then downloads those files and generates the content from the JavaScript.

This is counter to server-side rendering, where the content is rendered by the server and then sent to the browser with the data provided.

In general, server-side rendering is easier for bots, can be a quicker experience for users, and tends to be the default recommendation for SEO.

However, it can be more costly for the websites and, therefore, isn’t always the default choice for developers.

Is Our Main Content Able To Be Rendered Without JavaScript?

The most important content on your page, the main content, must be parseable without JavaScript rendering.

That is always the safest way to ensure that bots can access the content.

Are We Using JavaScript Links?

A further consideration is whether your links can be crawled easily by the search bots.

It’s not always an issue to have links generated through JavaScript. However, there is a risk that bots might not be able to resolve them unless they are properly contained in an HTML <a> element with an href attribute.

Google states it “can’t reliably extract URLs from elements that don’t have an href attribute or other tags that perform as links because of script events.”

Remember, though, it’s not just Google that you need to be conscious of. It’s always better to err on the side of making your links easy to follow.
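A small audit script can flag links that lack a crawlable href. This is a simplified sketch with invented markup, using only Python’s standard-library HTML parser.

```python
from html.parser import HTMLParser

class LinkAuditor(HTMLParser):
    """Count <a> tags with and without a crawlable href attribute."""

    def __init__(self):
        super().__init__()
        self.crawlable = 0
        self.missing_href = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            if dict(attrs).get("href"):
                self.crawlable += 1
            else:
                self.missing_href += 1

# Hypothetical markup: one proper link, one script-driven pseudo-link.
auditor = LinkAuditor()
auditor.feed('<a href="/pricing">Pricing</a><a onclick="go()">About</a>')
print(auditor.crawlable, auditor.missing_href)  # 1 1
```

Any link counted in `missing_href` is one that bots may not be able to follow reliably.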

In Summary

It is crucial to make sure your content is accessible to bots, now and in the future.

That means that if your website relies heavily on JavaScript to load content, you may struggle to communicate that information to some search engines.

It’s true that Google is much better at rendering JavaScript-heavy sites than it used to be, but the SEO playing field is not just Google.

To make sure your website can perform well in search platforms beyond Google, you may want to change how your website renders content, making sure your main content is in HTML.



Featured Image: Paulo Bobita/Search Engine Journal

Google Confirms That Structured Data Won’t Make A Site Rank Better via @sejournal, @martinibuster

Google’s John Mueller answered a question on Bluesky about whether structured data helps with SEO, which may change how some people think about it.

Schema.org Structured Data

When SEOs talk about structured data, they mean Schema.org structured data. Many other kinds of structured data exist, but for SEO purposes, only Schema.org structured data matters.
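To make that concrete, here is a minimal sketch of Schema.org markup expressed as JSON-LD, the format Google recommends. The headline, author, and date are hypothetical placeholders; on a live page, this object would be serialized inside a script tag of type application/ld+json.

```python
import json

# Hypothetical Schema.org Article markup expressed as JSON-LD.
# All values are placeholders, not real page data.
article_ld = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "How I Edit AI Content",
    "author": {"@type": "Person", "name": "Jane Doe"},
    "datePublished": "2025-04-01",
}

# Round-trip through json to confirm the markup serializes cleanly.
serialized = json.dumps(article_ld, indent=2)
parsed = json.loads(serialized)
print(parsed["@type"])  # Article
```

Validating that the JSON parses is the bare minimum; Google’s Rich Results Test checks whether the type and properties match a supported search feature.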

Does Google Use Structured Data For Ranking Purposes?

The person starting the discussion first posted that they were adding structured data to see if it helps with SEO.

Mueller’s first post was a comment about the value of preparation:

“Yes, and also no. I love seeing folks stumble into the world of online marketing, search engines, and all that, but reading up on how things technically work will save you time & help you focus.”

The original poster responded with a question:

“In your experience, how has it helped?”

That’s when Mueller gave his answer:

“(All of the following isn’t new, hence the meme.) Structured data won’t make your site rank better. It’s used for displaying the search features listed in developers.google.com/search/docs/… . Use it if your pages map to & are appropriate for any of those features.”

Google Only Uses Structured Data For Rich Results

It might seem confusing that structured data doesn’t help a site rank better, but it makes more sense to think of it as something that makes a site eligible for rich results. In the context of AI Search, Google uses regularly indexed data from websites, and because AI search results are a search feature, it may rely on the documented structured data for search-related features (read more about that here: Google Confirms: Structured Data Still Essential In AI Search Era).

The main points about structured data in the context of AI search is that according to what was shared at a recent Search Central Live (hat tip to Aleyda Solis):

“Structured data is critical for modern search features

Check the documentation for supported types

Structured data is efficient,
…for computers easy to read,
… and very precise”

In a nutshell, in the context of AI Search: structured data supports search features, and AI Search is itself a search feature. AI search also relies on the regular search index in addition to Schema.org structured data.

How Google Uses Structured Data In Search Features

Google uses only a fraction of the available Schema.org structured data. There are currently over 800 Schema.org types, but Google uses only around 30 of them, and for each supported type, it publishes documentation covering required properties and other guidelines and requirements.

Google’s only use for structured data is to collect information in a machine-readable format for displaying rich results, such as recipes, reviews, website information shown in carousel format, and even the ability to buy books directly from the search results.

Adding Schema.org structured data doesn’t guarantee that Google will display the site with a rich results feature in search; it only makes a site eligible. Adding non-documented forms of Schema.org structured data won’t affect search optimization because Google ignores all but the roughly 30 documented types.

Read the original discussion on Bluesky:

Adding structured data to see if it helps with SEO

Featured Image by Shutterstock/ViDI Studio

AI & SEO-Driven Content Marketing: How To Calculate True ROI for B2B Companies in 2025

This post was sponsored by Heeet. The opinions expressed in this article are the sponsor’s own.

How do you calculate the true cost of SEO content production?

Are you overspending or underspending on SEO compared to performance?

Can you connect SEO-driven awareness to pipeline and revenue?

How do you make SEO efforts more visible to your C-suite?

If you aren’t sure, that’s okay.

You may simply lack the tools to measure the actual impact of SEO on revenue.

So, let’s dive in and:

  • Break down the true steps to B2B conversion.
  • Highlight the tools to calculate the true ROI of your SEO-driven content in 2025.
  • Look past the simplified first- and last-touch approaches to attribution.
  • Leverage multitouch solutions that track engagement with SEO content throughout the buyer’s journey.

Can I Connect SEO To Revenue?

Yes, you can connect SEO to revenue.

Why Should I Connect SEO To Revenue?

SEO plays a large role in future conversions.

In fact, SEO helps prospects discover your brand, tool, or company.

SEO also helps provide easy-to-discover content with informational intent, which helps to nurture a prospective lead into a sale.

Your prospect’s journey:

  1. Starts at the first time they find your optimized webpage on the search engine results page (SERP).
  2. Moves into nurture, where your B2B prospects typically perform months of extensive product research via traditional searches and AI results before a sale is closed.

Informative content appears on SERPs because of SEO.

But how is this tracked? How do you know which non-conversion pages are:

  • Part of the user journey?
  • Part of the overall ROI?

How Do I Tie SEO To Company Revenue?

Luckily, your C-suite likely recognizes the need for SEO content.

They are prepared to invest in a strategy incorporating AI search.

However, you need tools that validate the investment and clearly showcase it for your higher-ups.

How To Keep Revenue High When SERPs Are Changing

Gartner predicts that traditional search engine volume will drop 25% by 2026 and flow directly to AI chatbots and agents.

As AI continues to accelerate the evolution of SEO, it’s critical to ensure that high-performing pages:

  • Continue to rank in traditional SERPs.
  • Appear in Google’s AI overviews.
  • Get referenced by the Gen AI tools your audience relies on.
  • Are tracked, so these visits can be attributed to a sale.

That’s why, to calculate the true ROI of your SEO, you need to understand both why certain content is picked up by AI tools and what it costs to generate that content.

Step 1. How To Create Content That Gets Seen In Traditional Search & AI Overviews

With the shift in consumer search behavior, your first step is to create, optimize, and measure the ROI of content sourced by leading AI tools.

That means appearing in AI Overviews and AI Answers that contain list-based content and product comparisons.

Search Your Brand & See What Each AI Tool Recommends

That’s the first step to determining whether your content or your competitor’s stands out.

Give these prompts a try:

  • What is the best solution for…
  • Give me the top tools for…
  • Best alternative to…
  • Is [competitor] solution better than…

Optimize Your Existing Content & Strategy To Feed AI’s Answer Base

The next step is optimizing existing content and adjusting your strategy so that you write copy that gives AI the answers it’s looking for.

With that said, following traditional SEO strategies and best practices championed by Google should help.

Just like traditional search, AI tools also favor:

  • Proper site and article structure with explicit metadata and semantic markup.
  • Content with lists and bullet points that are easier to scan.
  • Websites optimized for speed.
  • Updated content, keeping things fresh with context.
  • Content with backlinks from high-quality publications.
  • FAQ sections.
  • Mobile-responsive websites with indexable content that AI tools can pull from when sourcing an answer.

These factors give your content more authority in your industry. So does content outside your website that Google and LLMs draw answers from, such as videos on YouTube, reviews on G2, and conversations in Reddit forums.

Publishing enough quality content for all those channels to optimize for AI and be visible in traditional search is no small task. It requires substantial human resources, SEO tools, and time.

Step 2. Understand All Aspects Of The Real Cost Of SEO Content In 2025

SEO is a long game, especially in B2B, where the path from first click to purchase can span weeks or months and involve multiple touchpoints.

And now, with AI influencing how content is discovered, the cost of doing SEO well has increased.

To accurately assess the cost of SEO-driven content in 2025, you need to go beyond production budgets and organic traffic. Here’s how:

Break Down Your True SEO Investment

Start by identifying all the resources that go into content creation and maintenance:

  • People: Writers, designers, SEOs, developers, and editors.
  • Tools: SEO platforms, content optimization tools, keyword research databases, analytics software.
  • Distribution: Paid support for SEO content, social promotion, and email newsletters.
  • Maintenance: Refreshing old content, updating links, and improving page experience.
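Totaling those line items and comparing them against SEO-attributed revenue can be as simple as the sketch below. Every figure here is a made-up placeholder, not a benchmark.

```python
# Illustrative only: all figures are hypothetical placeholders.
monthly_costs = {
    "people": 12_000,       # writers, designers, SEOs, developers, editors
    "tools": 1_500,         # SEO platforms, optimization tools, analytics
    "distribution": 2_000,  # paid support, social promotion, newsletters
    "maintenance": 1_000,   # refreshes, link updates, page-experience work
}
seo_attributed_revenue = 45_000  # hypothetical monthly revenue credited to SEO

total_cost = sum(monthly_costs.values())
roi = (seo_attributed_revenue - total_cost) / total_cost

print(f"Total monthly SEO investment: ${total_cost:,}")  # $16,500
print(f"ROI: {roi:.0%}")                                 # 173%
```

The hard part, of course, is the revenue number: it depends on the attribution model discussed in the next steps, not just on adding up invoices.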

Monitor Content Performance Over Time

Track the performance of each piece of content using more than just rankings:

  • Organic traffic (from both traditional search and AI surfaces).
  • Time on page and engagement metrics.
  • Cost per lead and pipeline contribution (if possible).
  • Assisted conversions across all touchpoints.

Map Content to Buyer Journey Stages

Content doesn’t just convert; it nurtures. Tie content assets to specific stages:

  • Top-of-funnel (education, discovery).
  • Mid-funnel (comparison, product evaluation).
  • Bottom-of-funnel (case studies, demos).

Even if content isn’t the final touchpoint, it plays a role. Traditional tools miss this.

Adjust, Monitor & Pivot

No single metric will tell the full story. Instead:

  • Adjust: Re-optimize content based on AI overview visibility, CTR, and engagement.
  • Monitor: Watch how users arrive from search vs. AI sources.
  • Pivot: Invest more in formats and topics that show traction across both human and AI audiences.

Without full-funnel attribution, even the most engaged content may look like a cost center instead of a revenue driver.

That’s why accurate measurement, aligned with total investment and the full buyer journey, is critical to understanding the real ROI of your SEO content in 2025.

However, we know that:

  • AI Overviews and similar answer engines also play a big role in education and nurturing.
  • Attributing a sale to content read in an untrackable AI Overview is impossible, yet that influence is happening.

This is where the calculation gets difficult.

Step 3. Incorporate Multi-Touch Attribution To Your Revenue Calculations

Now that we’re here, you’re beginning to understand how tricky it is to tie ROI to AI Overview responses that nurture your prospects.

How do you accurately determine the cost?

Some people are creating their own attribution models to calculate ROI.

Most people are using tools that are built specifically for this new calculation.

The only way to accurately calculate ROI in B2B SEO is to capture engagement with content throughout the buyer journey, which conventional attribution models don’t credit.

Incorporate These Blindspots: Pre-Acquisition & The Post-Lead Journey

Another substantial blind spot in SEO measurement occurs when companies focus exclusively on pre-acquisition activities, meaning everything that happens before a lead is added to your CRM.

Consider the typical journey enterprise clients take in an account-based marketing approach:

  1. After multiple organic searches, a prospect converts into a lead from direct traffic.
  2. After being qualified as an SQL, they’re included in an email sequence that they never respond to, but return through a Google Ads campaign promoting a white paper.
  3. They download it from an organic search visit and continue reading more blog articles to understand your product and the outcomes they hope to achieve.

Can your marketing team track how each channel (direct, paid search, and organic) influenced the deal throughout the sales process?

Multitouch attribution tools allow marketers to finally link SEO content to tangible business outcomes by tracking what SEO-driven content leads interacted with before a sale.
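To see why multi-touch matters, here is a deliberately simple linear attribution sketch: every touchpoint in a closed deal receives an equal share of its revenue. The journeys and figures are invented, and real attribution tools layer on far more sophisticated models (time decay, position-based weighting, and so on).

```python
from collections import defaultdict

# Hypothetical closed deals with the channels touched along the way.
deals = [
    {"revenue": 30_000,
     "touchpoints": ["organic", "direct", "email", "paid_search", "organic"]},
    {"revenue": 12_000,
     "touchpoints": ["paid_search", "organic"]},
]

# Linear model: each touchpoint gets an equal slice of the deal's revenue.
credit = defaultdict(float)
for deal in deals:
    share = deal["revenue"] / len(deal["touchpoints"])
    for channel in deal["touchpoints"]:
        credit[channel] += share

for channel, amount in sorted(credit.items(), key=lambda kv: -kv[1]):
    print(f"{channel}: ${amount:,.0f}")
```

Under a last-click model, organic search would only get credit when it closed the deal; here, its mid-journey nurturing touches show up in the totals too.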

Heeet Makes SEO ROI Calculations Easy

After years of wrestling with these challenges, we built Heeet to fill the void: an end-to-end attribution solution that connects SEO efforts and interactions generated from content marketing to revenue by highlighting their impact throughout the sales cycle within Salesforce.

Our proprietary cookieless tracking solution collects more data, ensuring your decisions are based on complete, unbiased insights rather than partial or skewed information.

Traditional SEO measurement often relies on first-click or last-click attribution, which fails to capture SEO’s entire influence on revenue. Heeet places SEO on a level playing field by providing full-funnel attribution that tracks SEO’s impact at every customer journey stage.

We help marketers determine whether SEO-driven content is the first touchpoint, one of the many intermediary interactions along the lengthy B2B sales cycle, or the final conversion leading to a sale to pinpoint SEO’s cumulative influence on your pipeline.

Screenshot from Google, April 2025

Heeet actively tracks every touchpoint, ensuring that the actual impact of SEO is neither underestimated nor misrepresented.

Rather than neglecting SEO’s role when a prospect converts through another channel, Heeet delivers a complete view of how different personas in the buying committee interact with each piece of content and where they’re converting. This empowers businesses to make informed, data-driven SEO strategies and investment decisions.

Screenshot from Heeet, April 2025
Screenshot from Heeet, April 2025

Measuring ROI is non-negotiable and hinges on precise revenue tracking and a thorough understanding of costs. Heeet streamlines this process by directly integrating SEO costs into Salesforce, covering all production expenses such as software, human resources, design, and other strategic investments.

Screenshot from Heeet, April 2025

Businesses can accurately evaluate SEO profitability by linking these costs to SEO-driven revenue. Heeet delivers a straightforward, unified view of previously fragmented data within Salesforce, empowering marketing and finance teams to confidently assess SEO ROI with a single tool.

Screenshot from Heeet, April 2025

SEO is more than ranking on Google; it’s about driving impactful engagement with quality content referenced in the multiple search tools buyers use. Heeet tracks which content prospects engage with and ties it directly to revenue outcomes, providing marketing and sales teams with critical insights that propel them forward.

With our Google Search Console integration, we’re helping marketers draw more data into Salesforce to get a unified view of their content’s performance in a single place and connect search intents with business outcomes (leads, converted leads, revenue, etc.). This enables marketers to align ranking position with search intent and revenue, enhancing content strategy and tracking performance over time.

Screenshot from Heeet, April 2025

For B2B marketers pairing their SEO content with a paid strategy, our latest Google Ads update allows users to see the exact search query that prospects typed before clicking on a search result. This allows SEO experts and copywriters to gain the intel they need to reduce their cost per lead by creating content they know their audience is searching for.

Screenshot from Heeet, April 2025

Ready to enhance your marketing ROI tracking and connect every marketing activity to revenue?

From SEO to events, paid ads, social organic, AI referrals, webinars, and social ads, Heeet helps you uncover the real performance of your marketing efforts and turn revenue data into actionable insights.


Image Credits

Featured Image: Image by Shutterstock. Used with permission.

In-Post Image: Images by Heeet. Used with permission.

DOGE’s tech takeover threatens the safety and stability of our critical data

Tech buzzwords are clanging through the halls of Washington, DC. The Trump administration has promised to “leverage blockchain technology” to reorganize the US Agency for International Development, and Elon Musk’s DOGE has already unleashed an internal chatbot to automate agency tasks—with bigger plans on the horizon to take over for laid-off employees. The executive order that created DOGE in the first place claims the agency intends to “modernize Federal technology and software.” But jamming hyped-up tech into government workflows isn’t a formula for efficiency. Successful, safe civic tech requires a human-centered approach that understands and respects the needs of citizens. Unfortunately, this administration laid off all the federal workers with the know-how for that—seasoned design and technology professionals, many of whom left careers in the private sector to serve their government and compatriots.

What’s going on now is not unconventional swashbuckling—it’s wild incompetence. Musk may have run plenty of tech companies, but building technology for government is an entirely different beast. If this administration doesn’t change its approach soon, American citizens are going to suffer far more than they probably realize.

Many may wince remembering the rollout of Healthcare.gov under the Obama administration. Following passage of the Affordable Care Act, Healthcare.gov launched in October of 2013 to facilitate the anticipated wave of insurance signups. But enormous demand famously took down the website two hours after launch. On that first day, only six people were able to complete the registration process. In the wake of the mess, the administration formed the US Digital Service (USDS) and 18F, the digital services office of the General Services Administration. These agencies—the ones now dismantled at the hands of DOGE—pulled experienced technologists from industry to improve critical infrastructure across the federal government, including the Social Security Administration and Veterans Affairs. 

Over the last decade, USDS and 18F have worked to build safe, accessible, and secure infrastructure for the people of the United States. DirectFile, the free digital tax filing system that the IRS launched last year, emerged from years of careful research, design, and engineering and a thoughtful, multi-staged release. As a result, 90% of people who used DirectFile and responded to a survey said their experience was excellent or above average, and 86% reported that DirectFile increased their trust in the IRS. Recently, Sam Corcos, a DOGE engineer, told IRS employees he plans to kill the program. When 21 experienced technologists quit their jobs at USDS in January after their colleagues were let go, they weren’t objecting on political grounds; they chose to leave rather than “compromise core government services” under DOGE, whose orders are incompatible with USDS’s original mission.

As DOGE bulldozes through technological systems, firewalls between government agencies are collapsing and the floodgates are open for data-sharing disasters that will affect everyone. For example, the decision to give Immigration and Customs Enforcement access to IRS data and to databases of unaccompanied minors creates immediate dangers for immigrants, regardless of their legal status. And it threatens everyone else, albeit perhaps less imminently, as every American’s Social Security number, tax returns, benefits, and health-care records are agglomerated into one massive, poorly secured data pool. 

That’s not just speculation. We’ve already seen how data breaches at companies like Equifax can expose the sensitive information of hundreds of millions of people. Now imagine those same risks with all your government data, managed by a small crew of DOGE workers without a hint of institutional knowledge between them. 

Making data sets speak to each other is one of the most difficult technological challenges out there. Anyone who has ever had to migrate from one CRM system to another knows how easy it is to lose data in the process. Centralization of data is on the administration’s agenda—and will more than likely involve the help of contracting tech companies. Giants like Palantir have built entire business models around integrating government data for surveillance, and they stand to profit enormously from DOGE’s dismantling of privacy protections. This is the playbook: Gut public infrastructure, pay private companies millions to rebuild it, and then grant those companies unprecedented access to our data. 

DOGE is also coming for COBOL, a programming language that the entire infrastructure of the Social Security Administration is built on. According to reporting by Wired, DOGE plans to rebuild that system from the ground up in mere months—even though the SSA itself estimated that a project like that would take five years. The difference in those timelines isn’t due to efficiency or ingenuity; it’s the audacity of naïveté and negligence. If something goes wrong, more than 65 million people in the US currently receiving Social Security benefits will feel it where it hurts. Any delay in a Social Security payment can mean the difference between paying rent and facing eviction, affording medication or food and going without. 

There are so many alarms to ring about the actions of this administration, but the damage to essential technical infrastructure may be one of the effects with the longest tails. Once these systems are gutted and these firewalls are down, it could take years or even decades to put the pieces back together from a technical standpoint. And since the administration has laid off the in-house experts who did the important and meticulous work of truly modernizing government technology, who will be around to clean up the mess?  

Last month, an 83-year-old pastor in hospice care summoned her strength to sue this administration over its gutting of the Consumer Financial Protection Bureau, and we can follow her example. Former federal tech workers have both the knowledge and the legal standing to challenge these reckless tech initiatives. And everyday Americans who rely on government services, which is all of us, have a stake in this fight. Support the lawyers challenging DOGE’s tech takeover, document and report any failures you encounter in government systems, and demand that your representatives hold hearings on what’s happening to our digital infrastructure. It may soon be too late.

Steven Renderos is the executive director of Media Justice.

Correction: Due to a CMS error, this article was originally published with an incorrect byline. Steven Renderos is the author.

A vision for the future of automation

The manufacturing industry is at a crossroads: Geopolitical instability is fracturing supply chains from the Suez to Shenzhen, impacting the flow of materials. Businesses are battling rising costs and inflation, coupled with a shrinking labor force, with more than half a million unfilled manufacturing jobs in the U.S. alone. And climate change is further intensifying the pressure, with more frequent extreme weather events and tightening environmental regulations forcing companies to rethink how they operate. New solutions are imperative.

Meanwhile, advanced automation, powered by the convergence of emerging and established technologies, including industrial AI, digital twins, the internet of things (IoT), and advanced robotics, promises greater resilience, flexibility, sustainability, and efficiency for industry. Individual success stories have demonstrated the transformative power of these technologies, providing examples of AI-driven predictive maintenance reducing downtime by up to 50%. Digital twin simulations can significantly reduce time to market, and bring environmental dividends, too: One survey found 77% of leaders expect digital twins to reduce carbon emissions by 15% on average.

Yet, broad adoption of this advanced automation has lagged. “That’s not necessarily or just a technology gap,” says John Hart, professor of mechanical engineering and director of the Center for Advanced Production Technologies at MIT. “It relates to workforce capabilities and financial commitments and risk required.” For small and medium enterprises, and those with brownfield sites—older facilities with legacy systems—the barriers to implementation are significant.

In recent years, governments have stepped in to accelerate industrial progress. Through a revival of industrial policies, governments are incentivizing high-tech manufacturing, re-localizing critical production processes, and reducing reliance on fragile global supply chains.

All these developments converge in a key moment for manufacturing. The external pressures on the industry—met with technological progress and these new political incentives—may finally enable the shift toward advanced automation.

Download the full report.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

This content was researched, designed, and written entirely by human writers, editors, analysts, and illustrators. This includes the writing of surveys and collection of data for surveys. AI tools that may have been used were limited to secondary production processes that passed thorough human review.

How to Track ChatGPT Traffic in GA4

ChatGPT is becoming a valuable traffic source. The volume may be too small to stand out in a Google Analytics overview, but ChatGPT traffic is often the most engaged source, more so than organic search.

I base those observations on my experience optimizing client sites for AI answers.

Screenshot of GA report showing engagement for ChatGPT traffic.

ChatGPT traffic is often the most engaging, per Google Analytics.

I know of no studies examining why ChatGPT traffic performs well, but I have two theories:

  • Like organic search, ChatGPT provides solutions to problems, with occasional links to external sites to learn more.

The trend may change as genAI tools become mainstream. Until then, monitoring AI traffic is essential.

Track ChatGPT Referrals

In Google Analytics 4:

  • Go to Acquisition > Traffic acquisition.
  • In the drop-down below the graph, choose “Session source / Medium.”
  • In the “Search” field, type “gpt” and press Enter to filter session sources.
Screenshot of GA4 traffic acquisition report for ChatGPT.

In GA4, go to Acquisition > Traffic acquisition.

Then create custom reports to access the data quickly.

Some external tools can filter GA4 traffic data. For example, Databox allows users to add the report to its dashboard and even overlay other data, such as conversions:

Screenshot of the Databox report.

Databox allows users to add the GA4 report for ChatGPT.

ChatGPT does not disclose actual user prompts, but we can infer them from the landing pages those visitors reach. Each page solves a problem; presumably the prompt requested that solution.

Analyze ChatGPT Referrals

In GA4:

  • Go to Engagement > Landing pages.
  • Click “Add filter” below “Landing page.”
  • Select “Session source / Medium.”
  • Select “Contains” and type “gpt.”
  • Click “Apply.”

Build a “gpt” traffic source filter in GA4.

This will filter traffic sources to those containing “gpt” and sort the landing pages by sessions from ChatGPT, most to least.

The resulting report will help identify pages that ChatGPT cites to solve relevant problems. From there, query ChatGPT to see the context of those citations, as in:

This is my URL: [URL]. What prompts would trigger ChatGPT to cite the page as a solution?

LinkedIn Launches New Creator Hub With Content Strategy Tips via @sejournal, @MattGSouthern

LinkedIn has launched a new “Create on LinkedIn” hub that helps professionals create better content, understand their stats, and use different post types.

The new hub is organized into three main sections: Create, Optimize, and Grow. It also includes a Creator Tools section with specific advice for each post format.

This resource offers helpful tips straight from LinkedIn for people using it to grow their business, build their brand, or share industry expertise.

Screenshot from: https://members.linkedin.com/create, April 2025.

Content Creation Best Practices

The “Create” section explains what makes a good LinkedIn post. It highlights four key parts:

  • A catchy opening that grabs attention
  • Clear, simple messaging
  • Your personal view or unique angle
  • Questions that start conversations

LinkedIn suggests posting 2-5 times weekly to build your audience, noting that “consistency helps you build community.”

The guide recommends these popular content topics:

  • Career advice and personal lessons
  • Industry knowledge and expertise
  • Behind-the-scenes workplace stories
  • Thoughts on industry trends
  • Stories about overcoming challenges

Analytics-Driven Content Optimization

The “Optimize” section shows how to use LinkedIn’s analytics to improve your strategy. It suggests these four steps:

  1. Regularly check how many people see and engage with your posts
  2. Adjust when you post based on when your audience is most active
  3. Set goals using your average performance numbers
  4. Make more content similar to your best-performing posts

Format-Specific Creator Tools

One of the most useful parts for marketers is the breakdown of LinkedIn’s different content types. Each comes with specific tips and technical requirements:

Video Content

LinkedIn says “videos build trust faster” and reveals that “85% of videos watched on LinkedIn are viewed on mute.” This makes subtitles a must.

The guide suggests keeping videos short (60-90 seconds) and posting them directly on LinkedIn instead of sharing links.

Text and Images

For regular posts, LinkedIn stresses being real:

“People want to learn from those they feel a connection to, so it’s best to be yourself.”

It suggests focusing on specific topics rather than broad ones.

Screenshot from: members.linkedin.com/create-tools, April 2025.

Newsletters

You can create newsletters if you have over 150 followers and have posted original content in the last 90 days.

LinkedIn recommends posting on a regular schedule and using eye-catching cover videos.

Screenshot from: members.linkedin.com/create-tools, April 2025.

Live Events

LinkedIn Live lets you stream to your audience using third-party broadcasting tools if you qualify. To help you get the best results, LinkedIn offers tips before, during, and after your event.

Screenshot from: members.linkedin.com/create-tools, April 2025.

Why This Matters

While organic reach has dropped on many social platforms, LinkedIn still offers good visibility opportunities.

The content strategy advice matches what many marketers already do on other platforms. However, it provides specific insights into how LinkedIn’s algorithm works and what its users prefer.

Next Steps for Marketers

LinkedIn’s focus on analytics and testing different content types shows it wants users to be more strategic.

Check out this new resource to update your LinkedIn strategies. The format details are especially helpful for optimizing your content.

With over 1 billion professionals on LinkedIn, the platform is essential for B2B marketing, promoting professional services, and building thought leadership.

Smart marketers will include these approaches in their social media plans.


Featured Image: Fanta Media/Shutterstock

OpenAI CEO Sam Altman Confirms Planning Open Source AI Model via @sejournal, @martinibuster

OpenAI CEO Sam Altman recently said the company plans to release an open source model more capable than any currently available. While he acknowledged the likelihood of it being used in ways some may not approve of, he emphasized that highly capable open systems have an important role to play. He described the shift as a response to greater collective understanding of AI risks, implying that the timing is right for OpenAI to re-engage with open source models.

The statement was in the context of a Live at TED2025 interview where the interviewer, Chris Anderson, asked Altman whether the Chinese open source model DeepSeek had “shaken” him up.

Screenshot Of Sam Altman At Live at TED2025

Altman responded by saying that OpenAI is preparing to release a powerful open-source model that approaches the capabilities of the most advanced AI models currently available.

Altman responded:

“I think open source has an important place. We actually just last night hosted our first like community session to kind of decide the parameters of our open source model and how we want to shape it.

We’re going to do a very powerful open source model. I think this is important. We’re going to do something near the frontier, I think better than any current open source model out there.
This will not be all… like, there will be people who use this in ways that some people in this room, maybe you or I, don’t like. But there is going to be an important place for open source models as part of the constellation here…”

Altman next admitted that they were slow to act on open source but now plan to contribute meaningfully to the movement.

He continued his answer:

“You know, I think we were late to act on that, but we’re going to do it really well.”

About thirty minutes later in the interview, Altman circled back to the topic of open source, lightheartedly remarking that in a year the interviewer might yell at him for open sourcing an AI model. But, he said, everything involves trade-offs, and he feels OpenAI has done a good job of bringing AI technology into the world responsibly.

He explained:

“I do think it’s fair that we should be open sourcing more. I think it was reasonable for all of the reasons that you asked earlier, as we weren’t sure about the impact these systems were going to have and how to make them safe, that we acted with precaution.

I think a lot of your questions earlier would suggest at least some sympathy to the fact that we’ve operated that way. But now I think we have a better understanding as a world and it is time for us to put very capable open systems out into the world.

If you invite me back next year, you will probably yell at me for somebody who has misused these open source systems and say, why did you do that? That was bad. You know, you should have not gone back to your open roots. But you know, we’re not going to get… there’s trade-offs in everything we do. And we are one player in this, one voice in this AI revolution, trying to do the best we can and kind of steward this technology into the world in a responsible way.

I think we have over the last almost decade …we have mostly done the thing we’ve set out to do. We have a long way to go in front of us, our tactics will shift more in the future, but adherence to this sort of mission and what we’re trying to do I think, very strong.”

OpenAI’s Open Source Model

Sam Altman acknowledged OpenAI was “late to act” on open source but now aims to release a model “better than any current open source model.” The decision is significant because it will add competition and drive improvement on the open source side of AI.

OpenAI was established in 2015 as a non-profit organization but transitioned in 2019 to a closed source model over concerns about potential misuse. Altman used the word “steward” to describe OpenAI’s role in releasing AI technologies into the world, and the transition to a closed source system reflects that concern.

2025 is a vastly different world from 2019 because there are many highly capable open source models available, models such as DeepSeek among them. Was OpenAI’s hand forced by the popularity of DeepSeek? He didn’t say, framing the decision as an evolution from a position of responsible development.

Sam Altman’s remarks at the TED interview suggest that OpenAI’s new open source model will be powerful but not representative of their best model. Nevertheless, he affirmed that open source models have a place in the “constellation” of AI, as a strategically important and technically capable part of the broader ecosystem.

Featured image screenshot by author

AI Search Study: Product Content Makes Up 70% Of Citations via @sejournal, @MattGSouthern

A new study tracking 768,000 citations across AI search engines shows that product-related content tops AI citations. It makes up 46% to 70% of all sources referenced.

This finding offers guidance on how marketers should approach content creation amid the growth of AI search.

The research, conducted over 12 weeks by XFunnel, looked at which types of content ChatGPT, Google (AI Overviews), and Perplexity most often cite when answering user questions.

Here’s what you need to know about the findings.

Product Content Visible Across Queries

The study shows AI platforms prefer product-focused content. Content with product specs, comparisons, “best of” lists, and vendor details consistently got the highest citation rates.

The study notes:

“This preference appears consistent with how AI engines handle factual or technical questions, using official pages that offer reliable specifications, FAQs, or how-to guides.”

Other content types struggled to get cited as often:

  • News and research articles each got only 5-16% of citations.
  • Affiliate content typically stayed below 10%.
  • User reviews (including forums and Q&A sites) ranged between 3-10%.
  • Blog content received just 3-6% of citations.
  • PR materials barely appeared, typically less than 2% of citations.

Citation Patterns Vary By Funnel Stage

AI platforms cite different content types depending on where customers are in their buying journey:

  • Top of funnel (unbranded): Product content led at 56%, with news and research each at 13-15%. This challenges the idea that early-stage content should focus mainly on education rather than products.
  • Middle of funnel (branded): Product citations dropped slightly to 46%. User reviews and affiliate content each rose to about 14%. This shows how AI engines include more outside opinions for comparison searches.
  • Bottom of funnel: Product content peaked at over 70% of citations for decision-stage queries. All other content types fell below 10%.

B2B vs. B2C Citation Differences

The study found big differences between business and consumer queries:

For B2B queries, product pages (especially from company websites) made up nearly 56% of citations. Affiliate content (13%) and user reviews (11%) followed.

For B2C queries, there was more variety. Product content dropped to about 35% of citations. Affiliate content (18%), user reviews (15%), and news (15%) all saw higher numbers.

What This Means For SEO

For SEO professionals and content creators, here’s what to take away from this study:

  • Adding detailed product information improves citation chances even for awareness-stage content.
  • Blogs, PR content, and educational materials are cited less often. You may need to change how you create these.
  • Check your content mix to make sure you have enough product-focused material at all funnel stages.
  • B2B marketers should prioritize solid product information on their own websites. B2C marketers need strategies that also encourage quality third-party reviews.
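One way to act on the content-mix takeaway is a simple inventory audit: tally your pages by content type and check the share of product-focused material against the study’s citation patterns. The sketch below is hypothetical; the URLs and type labels are illustrative stand-ins for your own site inventory:

```python
from collections import Counter

# Tally a hypothetical content inventory by type and report each type's
# share, to compare against the study's finding that product content
# draws 46-70% of AI citations.
def content_mix(pages):
    """pages: (url, content_type) pairs from a site inventory."""
    counts = Counter(ctype for _, ctype in pages)
    total = sum(counts.values())
    return {ctype: round(n / total, 2) for ctype, n in counts.items()}

inventory = [
    ("/products/widget-a", "product"),
    ("/products/widget-b", "product"),
    ("/blog/trends-2025", "blog"),
    ("/compare/widget-a-vs-b", "product"),
    ("/press/launch", "pr"),
]
print(content_mix(inventory))
# {'product': 0.6, 'blog': 0.2, 'pr': 0.2}
```

If the product share comes out well below the ranges reported above, that is a signal to rebalance toward spec pages, comparisons, and “best of” lists.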

The study concludes:

“These observations suggest that large language models prioritize trustworthy, in-depth pages, especially for technical or final-stage information… factually robust, authoritative content remains at the heart of AI-generated citations.”

As AI transforms online searches, marketers who understand citation patterns can gain a competitive edge in visibility.


Featured Image: wenich_mit/Shutterstock