“It was very clear that we will never ever write code by hand again.”
That comment, made recently by Dropbox’s former chief technology officer Aditya Agarwal, reflects a growing belief that generative AI is poised to displace swathes of white-collar workers — starting, perhaps, with software developers.
But research by Wharton professor of operations, information and decisions Neha Sharma found that many of the routine coding questions that developers once posted on popular online forum Stack Overflow appear to have moved to AI tools, while the more novel problems still require human expertise.
In a working paper, Sharma found that within four months of ChatGPT’s November 2022 release, the share of previously unseen types of questions rose by 8.6 percentage points on Stack Overflow. By seven months, novel questions made up 40.9% of all posts on the forum, the highest share recorded between 2020 and 2023. Importantly, these novel questions are not necessarily about AI itself; many reflect new combinations of niche “knowledge domains,” or areas of technical expertise.
“When people talk about large language models (LLMs) replacing humans, it wasn’t clear to us what space would remain for people,” Sharma said. “What we find is that the space that remains is where problems haven’t been solved before.”
“The space that remains [for humans] is where problems haven’t been solved before.”— Neha Sharma
What Coding Questions Can LLMs Answer?
According to the paper, co-authored with Simin Li from Tulane University, overall question volume on Stack Overflow fell sharply after the launch of ChatGPT. But the drop was concentrated among the most common questions. Posts linked to existing topics fell by 13.4%, or about 10,669 fewer queries per month, in the months after the chatbot’s launch.
By contrast, questions that combined technical topics in ways not seen before rose by 3.9% — roughly 1,672 additional questions per month — often by recombining less-popular domains.
Sharma and Li analyzed roughly 9.3 million Stack Overflow questions collected between 2018 and 2023, focusing on the period surrounding ChatGPT’s debut. By tracking new combinations of technical “tags” attached to each question, the researchers could identify when developers were asking something genuinely new, rather than repeating an existing problem.
“A novel question is one where the tags have never appeared together before,” Sharma explained. “That usually means someone is combining tools in a way that hasn’t been documented.”
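The novelty criterion Sharma describes can be captured in a few lines. The sketch below is a simplified illustration of the idea, not the authors' actual pipeline: walking through questions in chronological order, a question counts as novel if its exact set of tags has never appeared together before.

```python
def label_novel_questions(questions):
    """questions: list of (question_id, tag_list) in chronological order.
    Returns the ids whose exact tag combination is previously unseen."""
    seen_combinations = set()
    novel_ids = []
    for question_id, tags in questions:
        # frozenset makes the check order-insensitive:
        # ["python", "pandas"] and ["pandas", "python"] are the same combo
        combo = frozenset(tags)
        if combo not in seen_combinations:
            novel_ids.append(question_id)
            seen_combinations.add(combo)
    return novel_ids

history = [
    (1, ["python", "pandas"]),
    (2, ["pandas", "python"]),      # same pairing, reordered: not novel
    (3, ["rust", "webassembly"]),   # unseen pairing: novel
]
print(label_novel_questions(history))  # [1, 3]
```

On a real dump this would run over millions of rows, but the logic is the same: only the first appearance of each tag combination is flagged as new.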
“People say these systems can’t improve without human data. At the same time, they’re thinning the source that created them.”— Neha Sharma
The Long-Term Health of Online Communities
After ChatGPT came out, discussions became less centered on a few dominant topics. Within four months, the most popular topics were connected to fewer others — with the highest-traffic topic seeing its links fall by about 1.6% — as activity shifted toward more specialized and boundary-spanning areas.
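One way to make the "links" idea concrete is to treat tags as nodes in a co-occurrence network, where two tags are connected if they appear on the same question; a topic's links are then its number of distinct neighbors. This is a minimal sketch under that assumption, not the paper's exact measure:

```python
from collections import defaultdict
from itertools import combinations

def tag_degrees(questions):
    """questions: iterable of tag lists.
    Returns {tag: number of distinct co-occurring tags}."""
    neighbors = defaultdict(set)
    for tags in questions:
        # every pair of tags on one question forms an edge
        for a, b in combinations(sorted(set(tags)), 2):
            neighbors[a].add(b)
            neighbors[b].add(a)
    return {tag: len(links) for tag, links in neighbors.items()}

posts = [["python", "pandas"], ["python", "numpy"], ["rust", "webassembly"]]
print(tag_degrees(posts)["python"])  # 2 (linked to pandas and numpy)
```

Comparing a dominant tag's degree before and after a cutoff date would show the kind of decline in connectedness the authors report.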
The pattern is consistent with what the authors call “selective substitution”: Basic problems are moving to AI tools, while more unfamiliar ones still require human expertise.
The findings raise questions about the long-term health of online knowledge communities. If routine questions keep moving to AI tools, platforms like Stack Overflow may shrink even as they become more specialized. “The routine questions are what generates traffic,” Sharma said.
Yet those communities still produce the novel training data that future AI systems depend on. Sharma noted the irony that current LLMs were trained on platforms like Stack Overflow. “People say these systems can’t improve without human data,” she said. “At the same time, they’re thinning the source that created them.”