For roughly three decades, a growing band of academics, economists and technology experts has focused intensely on this question: Why do some high-tech innovations succeed, while others fail?
According to panelists at the 11th Wharton Technology Conference, a growing body of research can increasingly explain why a once cutting-edge technology company like Sun Microsystems eventually imploded, and why an invention like Apple’s iPad became an industry leader when the inner workings of the device differ little from what other firms have developed.
Sidney Winter, an emeritus professor of management at Wharton, said that technology changes during the last 30 years have strengthened his belief that the social environment plays a critical role in technological “evolution” — that innovation does not take place in a vacuum. “There has to be a real context,” Winter told the conference participants. “There has to be something happening in the world that makes the industry appear at roughly the time it appears. That fact then colors what happens in the industry’s very important early stages of development.”
In order for technology industries to evolve, the firms themselves must also participate in creating the right environment for innovations to take hold. “The history of technological evolution isn’t merely the history of patented or non-patented advances,” said INSEAD professor of entrepreneurship Philip Anderson, another conference panelist. “What you have to do, if you really want to create value, is to hook up the technology with complementary assets and customers — it’s not merely coming up with a better technology that works.”
Science-based breakthroughs have proven the quickest and most effective way to carry out Austrian-American economist Joseph Schumpeter’s broader theory of “creative destruction” — bringing together new ideas, organizations and consumers to destroy old ways of doing business while creating more viable ones — Anderson said. But he added that Schumpeter and some of his followers are wrong to focus solely on the firm as the key building block for change, when more recent evidence suggests that wider ecosystems of technological development are at the root of entrepreneurial progress.
“Where do routines come from that make it possible for somebody to come up with a good mini-computer?” Anderson asked. The iPad is a huge commercial success, he said, not because its technology is particularly radical — “as far as I can see, it’s a big phone” — but because Apple created a home for innovative applications, or apps, that many customers would probably never buy otherwise. He cited in particular one iPad app that he can hold up against the evening sky and use to identify constellations — a product he loves but would not likely have purchased as a separate, stand-alone item.
He also pointed out that there is a certain irony in the fact that Facebook’s Silicon Valley headquarters was the home of a recently bought-out 1990s technology leader, Sun Microsystems. “The interesting thing is that while Sun is no longer an organizational structure within which all the brilliant people who worked there can create the kind of value that they once did, the disappearance of [it] has essentially spewed all kinds of genetic material into the environment that now works in different places and can create amazing results.” During the dot-com boom of the late 1990s, he added, some of the greatest advances in reaching customers through e-commerce came out of firms that failed within a few years. Those advances were eventually picked up by larger and more prosperous companies.
The question of how ideas evolve and become commercialized transcends mere academic interest, Anderson said. Understanding how these ecosystems of innovation work can make the world a better place more quickly, “but in this networked world, maybe the most interesting question is: How do talented people find a way to keep moving in these different structures and different places at different times and find a way to express [their ideas] to the world?”