Why your content team is getting stupider by the day (and it’s ChatGPT’s fault)

I watched a colleague cry over a blog post last Tuesday. Not because it was a particularly moving piece of prose, but because she had spent four hours trying to “fix” a 1,500-word article that ChatGPT had spit out in thirty seconds. The draft was technically perfect and entirely hollow. It was like trying to perform CPR on a mannequin. There was no heart to find, but she was under orders to make it feel “human.”

We are currently in the middle of a massive, unacknowledged brain drain. I’m not talking about people leaving their jobs. I’m talking about the people staying in them and letting their brains turn to mush because they’ve outsourced the hardest part of thinking to an LLM. If you’re leading a content team right now, you aren’t just producing more content. You’re likely presiding over the slow-motion decay of your team’s critical thinking skills.

The part where we stop actually thinking

Writing is thinking. That’s the thing everyone forgets. When you sit down to write an article, you aren’t just recording thoughts you already have; you’re wrestling with ideas, finding the gaps in your logic, and realizing that your initial premise was actually kind of garbage.

When you start a piece with a prompt, you skip the wrestling match. You go straight to the result. Put differently: you’re eating the meal without ever knowing what the ingredients were. I’ve seen this happen with my own team. Last month, I asked a junior writer why they included a specific point about “synergy” in a piece about remote work. They stared at me for five seconds before admitting they didn’t know. The AI put it there, it sounded professional, so they left it.

They didn’t even disagree with the point. They just hadn’t thought about it. This is the danger. We’re building a generation of “editors” who don’t actually know how to build an argument from the ground up. They’re just rearranging the furniture in a house someone else built poorly.

The moment you stop asking ‘is this true?’ and start asking ‘does this look right?’ you’ve already lost.

The data is actually pretty depressing

I’m not just being a Luddite here. I actually tracked this. Over the last 14 weeks, I ran a small experiment with two different freelancers I use for my side projects. I paid one a flat rate to use AI as their primary draft generator and then edit it. I paid the other 50% more to write everything from scratch, with a strict “no AI” rule for the first draft.

  • AI-First Content: Average time on page was 42 seconds. Conversion rate (email signups) was 0.4%.
  • Human-First Content: Average time on page was 3 minutes and 12 seconds. Conversion rate was 3.2%.
  • The kicker: The human-first content took 4x longer to produce, but generated 8x the value.

People can smell the “AI-ness” on a brand. It’s like a weird, uncanny valley scent. It’s too balanced. It’s too polite. It never takes a risk. It’s the literary equivalent of a beige hotel room. You can stay there, but you’re never going to remember it the next morning. Total waste of money.

A brief tangent about my keyboard

I’m writing this on a mechanical keyboard with blue switches that click so loudly my neighbor probably thinks I’m operating a 1940s telegraph machine. It’s tactile. It’s physical. There’s a resistance to it that makes me think about every word. I think we’ve made writing too easy. When the friction disappears, the quality disappears with it. Anyway, back to the point.

The “Prompt Engineering” lie

I’m going to say something that will probably get me some angry LinkedIn DMs, but prompt engineering is a fake job. It’s a temporary workaround for people who don’t want to learn how to communicate. If you need a 400-word prompt to get a 500-word article, you should have just written the damn article.

I refuse to hire anyone who lists “Prompt Engineer” as a primary skill. To me, that’s just code for “I’ve forgotten how to have an original thought.” I know people will disagree with this, and they’ll talk about efficiency and scaling, but you can’t scale soul. You can’t scale the specific, weird, slightly-wrong-but-interesting take that a human has after a few drinks or a long walk.

I used to think AI would save us 40 hours a week. I was completely wrong. It just moved those 40 hours into a different, more soul-crushing bucket: the bucket of trying to make robotic text sound like it wasn’t written by a robot. It’s exhausting. It’s boring. And it’s making us all stupider.

How to actually use this stuff without losing your mind

If you want a human-first AI strategy, you have to stop using it for the “middle” of the process. That’s where the thinking happens.

Use it for the grunt work. Use it to transcribe your messy voice notes. Use it to find three counter-arguments to your main point so you can crush them. But for the love of everything, do not let it write your first draft.

I have a rule now: No AI until the structure is 100% finished and at least 300 words of the “core” idea are written by hand. If you can’t explain your point in 300 words without help, you don’t have a point yet. You just have a topic.

Stop optimizing for volume. Nobody wants more content. Everyone is drowning in content. People want a perspective. They want to know that a real person with real skin and real problems sat in a chair and struggled to put these words together.

I’m still not sure if we can actually win this fight. The pressure to produce more, faster, and cheaper is a hell of a drug for most CMOs. But I do know that the teams who stay “dumb” by relying on the machine are going to be the first ones replaced by it. If your only value is being a slightly better editor than the AI, you’re already obsolete.

Go write something that makes someone angry. Or makes someone laugh. Just make sure it’s you doing it.