Stack Overflow's traffic tanked when ChatGPT launched. Three years later, CEO Prashanth Chandrasekar has pivoted the company hard into enterprise SaaS and AI data licensing.

But there’s a fascinating split happening in the developer community. More than 80 percent of Stack Overflow users are using AI for coding or plan to. Yet only 29 percent actually trust its output.

That gap tells you everything about where we are with AI in 2025. Everyone’s using tools they don’t believe in.

The ChatGPT Existential Crisis

Chandrasekar saw the threat immediately when ChatGPT launched in November 2022. One month after our last interview, his entire business model faced extinction.

Stack Overflow had built itself as the trusted Q&A hub for developers. Suddenly, AI could answer those same questions instantly through a natural language interface.

His response was decisive. He declared a company-wide code red and pulled 10 percent of staff onto the problem. About 40 people got six months to deliver answers by summer 2023.

“This was going to be this very huge change to how people consume technology,” Chandrasekar said. “It became very clear what we needed to focus on because this was an existential moment.”

Plus, he had experience with disruptive threats. Before Stack Overflow, he helped Rackspace respond to Amazon Web Services. So he knew how to structure the response.

AI Slop Flooded the Platform

Stack Overflow faced immediate problems on both sides. The input side got hammered as users tried to game the system with AI-generated answers copied from ChatGPT.

Traffic spiked initially. Then the community revolted.

Moderators quickly identified AI-generated content flooding the forums. So Stack Overflow banned AI answers outright. That ban still stands today.

“Our proposition is to be the trusted vital source for technologies,” Chandrasekar explained. “We had to make sure there are only a few places where you can go and not deal with AI slop.”

Meanwhile, the output side collapsed. Why visit Stack Overflow when ChatGPT could answer your coding questions instantly?

Questions on the platform dropped dramatically. The decline came almost entirely from simple questions AI could handle easily. Complex questions still get asked, but the easy stuff vanished.

The Brutal Pivot to Enterprise

Chandrasekar made hard choices. The company laid off nearly 25 percent of staff in 2023 as traffic declined. Today, Stack Overflow runs with about 300 people.

But he saw opportunity in the chaos. Enterprise customers still needed trustworthy knowledge bases. In fact, they needed them more than ever.

AI tools require high-quality data to work properly. Stack Overflow’s 20 years of human-curated answers became incredibly valuable for training and retrieval-augmented generation (RAG).

So the company built two new revenue streams. First, Stack Internal became the primary business. It’s a private version of Stack Overflow that 25,000 companies now use.

Companies like Uber plug Stack Internal into their AI assistants. Uber Genie uses thousands of Stack Overflow questions to automatically answer employee queries in Slack.
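For a rough sense of what that kind of integration looks like, here's a minimal sketch of a Slack bot answering mentions from a private Q&A store. The slack_bolt library is real, but lookup_internal_answer, the canned data, and the rest are assumptions for illustration; this is not Uber Genie's or Stack Internal's actual code.

```python
# Hypothetical sketch of a Slack bot that answers questions from an internal Q&A store.
# slack_bolt is a real Slack SDK; lookup_internal_answer and its data are invented here.
import os

from slack_bolt import App
from slack_bolt.adapter.socket_mode import SocketModeHandler

app = App(token=os.environ["SLACK_BOT_TOKEN"])


def lookup_internal_answer(question: str) -> str | None:
    """Stand-in for querying a private knowledge base (e.g. a Stack Internal instance).
    A real integration would call the knowledge base's search API and rank results."""
    canned = {"vpn": "See the internal VPN setup guide: https://intranet.example.com/vpn"}
    for keyword, answer in canned.items():
        if keyword in question.lower():
            return answer
    return None


@app.event("app_mention")
def handle_mention(event, say):
    # Triggered when someone @-mentions the bot in a channel it belongs to.
    question = event.get("text", "")
    answer = lookup_internal_answer(question)
    if answer:
        say(f"Here's what the internal knowledge base says:\n{answer}")
    else:
        say("I couldn't find a curated answer for that - try asking in #help-engineering.")


if __name__ == "__main__":
    # Socket Mode avoids exposing a public HTTP endpoint; it requires an app-level token.
    SocketModeHandler(app, os.environ["SLACK_APP_TOKEN"]).start()
```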

Second, data licensing exploded. Every major AI lab wanted access to Stack Overflow’s corpus. Google, OpenAI, Databricks, Snowflake – they all struck deals.

“We put up anti-scrapers very quickly,” Chandrasekar said. “We knew exactly who was scraping and who wasn’t.”

The Controversial OpenAI Deal

Users erupted when Stack Overflow partnered with OpenAI. Some started deleting their contributions to prevent AI training. Stack Overflow had to ban them.

But Chandrasekar defends the licensing model. The old internet economy broke. Traffic from search engines collapsed as AI tools kept users inside their interfaces.

“Companies that support these platforms have to adopt a new business model to survive,” he explained. “Data licensing only felt right.”

These aren’t one-time payments either. AI companies pay recurring fees for continued access to both historical data and new contributions.

However, some labs refuse to play ball. Chandrasekar wouldn’t name names, but he acknowledged that some companies keep scraping despite takedown requests.

“Some of them care and want to be good citizens,” he said. “Some of them absolutely do not care and they would prefer the smoke.”

Nobody Trusts the Tools They’re Using

Here’s the contradiction driving everything. Stack Overflow’s 2025 Developer Survey found 80 percent of users either use AI or plan to use it for coding tasks.

But only 29 percent trust AI outputs.

Think about that gap. Four out of five developers want to use tools that seven out of ten don’t trust. That’s not sustainable.

“Trust is a very deep word,” Chandrasekar said. “You don’t trust something because you don’t think it’s producing high integrity, accurate answers.”

Yet developers keep using these tools anyway. Maybe they’re curious about a transformative technology. Maybe they fear becoming irrelevant if they don’t learn AI workflows.

Or maybe the tools provide just enough value despite their flaws.

Stack Overflow itself uses “vibe coding” tools internally. Designers and product managers prototype features with AI before engineering builds them properly.

“We’ve embraced these tools internally for that benefit,” Chandrasekar admitted. “There will be ways in which you feel comfortable using it.”

AI Assist Bets on Natural Language

Stack Overflow launched AI Assist this week. It’s their answer to the ChatGPT threat, three years later.

The feature uses RAG technology. It searches Stack Overflow’s 80 million questions first, then falls back to OpenAI if needed. Every answer includes attribution links.
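To make that flow concrete, here's a minimal sketch of the retrieve-then-fall-back pattern. It is not Stack Overflow's actual AI Assist implementation; the function names, score threshold, canned data, and data shapes are invented for illustration.

```python
# Hypothetical retrieve-then-fall-back RAG sketch (not the real AI Assist code).
from dataclasses import dataclass, field


@dataclass
class Answer:
    text: str
    sources: list = field(default_factory=list)  # attribution links back to curated questions


def search_curated_qa(query: str) -> list[dict]:
    """Stand-in for searching a curated corpus (e.g. 80 million Stack Overflow questions).
    A real system would hit a keyword or vector index; here we return canned hits."""
    corpus = [
        {"url": "https://stackoverflow.com/q/231767",
         "body": "Use a generator with `yield` to produce values lazily.",
         "score": 0.91},
    ]
    return corpus if "yield" in query.lower() else []  # toy relevance check


def call_general_model(query: str) -> str:
    """Stand-in for the fallback call to a general-purpose LLM (e.g. via OpenAI's API)."""
    return f"(ungrounded model answer to: {query!r})"


def answer_with_rag(query: str, min_score: float = 0.75) -> Answer:
    hits = [h for h in search_curated_qa(query) if h["score"] >= min_score]
    if hits:
        # Grounded path: build the answer from curated content and keep attribution links.
        text = " ".join(h["body"] for h in hits)
        return Answer(text, sources=[h["url"] for h in hits])
    # Fallback path: nothing relevant in the curated corpus, so ask the general model.
    return Answer(call_general_model(query))


if __name__ == "__main__":
    print(answer_with_rag("What does `yield` do in Python?"))
    print(answer_with_rag("Why is my Kubernetes pod stuck in CrashLoopBackOff?"))
```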


Chandrasekar believes this solves the trust problem. Ground answers in human-curated knowledge. Provide sources. Let users verify.

But it’s still dependent on LLM technology that users fundamentally don’t trust. The platform hopes ecosystem improvements will eventually lift that 29 percent trust figure.

“That is more a reflection of what people have access to beyond Stack,” he said. “We can focus on being the most vital source for technologists.”

Does Stack Overflow Still Attract New Users?

Simple coding questions disappeared from Stack Overflow. ChatGPT handles those now.

But complex, thorny problems still drive engagement. Plus, the platform opened new features to broaden its appeal.

Users can now chat directly with experts in topic-specific rooms. Coding challenges let developers prove they understand fundamentals – important as companies worry about juniors who only know vibe coding.

Stack Overflow also partnered with Indeed on tech job listings. The goal is providing multiple reasons to visit beyond just asking questions.

“We want to give people other reasons to come to the site besides just getting their answers,” Chandrasekar explained.

But there’s an open question: Will the next generation of developers ever visit Stack Overflow? Tools like Cursor and GitHub Copilot handle coding entirely within the editor.

Chandrasekar thinks yes. Complex problems still need human expertise. AI can’t reason through truly novel architectural decisions.

Maybe. But that’s a much smaller addressable market than “everyone learning to code.”

The Age of Rationalization Arrives

Chandrasekar predicts 2026 will bring massive changes. Companies tested countless AI tools in 2025. Now CFOs want return on investment.


“Productivity improvements have to come from these,” he said. “There’s going to be tremendous pressure in the system to prove what the real value is.”

He sees consolidation coming. Companies won’t keep paying for four different AI coding assistants. They’ll pick one, maybe two.

Stack Overflow positions itself as the trust layer sitting beneath those tools. The knowledge intelligence platform that makes AI agents actually reliable.

Big customers like HP, Eli Lilly, and Xerox are testing this approach. They need to justify AI spending. Stack Internal provides the human-curated data that grounds AI responses.

“They want this trust layer through a company like Stack that they can actually insert between their data and their AI tools,” Chandrasekar said.

The Uncomfortable Truth

Stack Overflow survived an existential threat by completely reinventing itself. The company went from community platform to enterprise SaaS in three years.

But that survival came with trade-offs. The community that built Stack Overflow’s value feels betrayed by data licensing deals. New developers might never join the community at all.

And the whole business model rests on one assumption: AI tools will keep improving until that 29 percent trust number climbs high enough to justify enterprise spending.

If reasoning models plateau. If RAG doesn’t solve hallucinations. If companies decide junior developers are cheaper than AI subscriptions.

Then what?

Chandrasekar believes improvements are coming. Gemini 3 shocked everyone this month. Custom silicon like Trainium keeps pushing compute prices down. Compounding effects will produce “magical outcomes.”

Maybe he’s right. Or maybe 2026 brings the rationalization he predicts, just not the way anyone expects.

The gap between using AI and trusting AI has to close eventually. Either tools get dramatically better, or everyone admits the emperor has no clothes.

We’ll find out which one soon enough.