The AI Cannibalism Problem

Stack Overflow is dying. Is AI the culprit...or just the undertaker?

"If I have seen further it is by standing on the shoulders of giants."

Isaac Newton

Gino Ferrand, writing today from Seattle, WA 🌄

But what happens when the giants vanish?

The exodus is undeniable.

Stack Overflow traffic is down more than 50% since ChatGPT launched. New questions? Down 76% over the last two years. Answer rates are slipping. Veteran contributors are burning out. Moderators are resigning. All while AI tools grow faster, smoother, and more context-aware by the day.

Let’s not pretend it’s a coincidence. Developers aren’t suddenly writing perfect code. They’re skipping the forum. They're asking ChatGPT instead.

And to be clear: I get it. AI gives you an answer in 5 seconds. Stack Overflow gives you a snarky comment and a link to a decade-old thread...if you're lucky.

But here’s the part no one wants to talk about: AI’s ability to answer those questions depends on the very forums it’s displacing.

AI-Enabled Nearshore Engineers: The Ultimate Competitive Edge

The future of software engineering isn’t just AI... it’s AI-powered teams. By combining AI-driven productivity with top-tier remote nearshore engineers, companies unlock exponential efficiency at a 40-60% lower cost, all while collaborating in the same time zone.

✔️ AI supercharges senior engineers: faster development, fewer hires needed
✔️ Nearshore talent = same time zones: real-time collaboration, no delays
✔️ Elite engineering at significant savings: scale smarter, faster, better

ChatGPT and other large language models were trained on years of questions, answers, explanations, and tutorials scraped from Stack Overflow, GitHub, Reddit, and other community-driven platforms. They’re derivative by design. When you ask ChatGPT how to fix a Python bug, you're often getting a remix of code that was posted on Stack Overflow back in 2014.

So what happens when the forums go quiet? What happens when the model eats the entire corpus...and there's no fresh source to learn from?

A recent analysis from Stack Overflow’s own blog notes a significant drop not just in traffic, but in question quality. Fewer new developers are participating. The ones who do show up often post questions that feel like AI-generated prompts themselves. It’s a feedback loop...and not a good one.

Stack Overflow was always more than a database. It was a filtering mechanism. Bad answers got downvoted. Great ones rose. Rebuttals were threaded. Context was preserved. That system produced a kind of open-source editorial layer that AI now bypasses. There’s no voting on ChatGPT. No peer review. Just probability and style.

Now we’re seeing the early warning signs. An open-source project led by researchers at Stanford and EPFL found that AI models trained on synthetic data (i.e., model-generated content) suffer from "model collapse"...a degradation in quality where outputs become increasingly lossy and derivative, like photocopies of photocopies.
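The photocopies-of-photocopies dynamic can be sketched in a few lines of Python. This is a deliberately crude analogy, not a real training pipeline: each "generation" fits a lossy summary (pairwise averages) of the previous generation's output and publishes that summary as the next corpus. Diversity collapses toward a single averaged value.

```python
# Toy sketch of "model collapse" (a hypothetical illustration, not an
# actual training loop): each generation learns only from the previous
# generation's output, so detail is lost at every step.

def lossy_generation(data):
    """'Train' on data by averaging adjacent pairs (a lossy model),
    then publish those averages as the next generation's corpus."""
    return [(data[i] + data[i + 1]) / 2 for i in range(0, len(data) - 1, 2)]

corpus = [float(x) for x in range(16)]  # generation 0: 16 distinct "ideas"
for gen in range(1, 5):
    corpus = lossy_generation(corpus)
    print(f"generation {gen}: {sorted(set(corpus))}")
# Diversity shrinks 16 -> 8 -> 4 -> 2 -> 1: everything converges on the mean.
```

Real model collapse is messier than this, but the direction is the same: without fresh human-written input, each round of training on synthetic output preserves the average and throws away the tails.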

We’re not there yet. But the trajectory is clear.

To be fair, AI won't starve without Stack Overflow...not immediately. There are workarounds. Reinforcement learning from human feedback (RLHF) allows models to adapt based on user interaction. Proprietary datasets can be licensed. Some believe we’ll enter a world of synthetic training loops, where AIs critique each other and learn in closed ecosystems.

But...do we want to live in a world where knowledge creation is privatized, synthetic, and optimized for prediction over understanding? Are we comfortable trading communal knowledge for compressed probability?

Stack Overflow was annoying. It was flawed. But it was real. It made you explain yourself. It forced you to think. It taught you how to learn.

If we dismantle these systems entirely, we’re not just replacing them with AI...we’re replacing them with silence.

More to come…

Recommended Reads

✔️ StackOverflow’s Decline (Eric Holscher)

Gino Ferrand, Founder @ TECLA