Why Corporate Digital Transformation Didn't Always Deliver
Published: May 07, 2003 in Knowledge@Wharton
In 2000, DaimlerChrysler AG was in crisis. The company, formed by the merger of Germany’s Daimler-Benz and Detroit’s Chrysler, had stumbled in integrating the operations of its predecessors. The Chrysler portion of the business was on its way to losing $1.2 billion that year.
In response, the company laid off 20% of the Chrysler staff and shuttered 13 plants.
Karenann Terrell arrived to help lead its e-business initiatives during this time of trouble. Her timing couldn’t have been better. “It’s a helluva lot easier to implement digital transformation in companies in crisis,” said Terrell, who is now the company’s director for NAFTA technical IT operations. “You have to have urgency, and people can no longer do their business in a silo when 20% of their staff is out the door.”
Terrell was speaking as part of a panel on corporate digital transformation at “Catching Our Breath: Reorienting Strategy During the Internet’s Quiet Time,” a May 1 conference at the Wharton School sponsored by the Reginald H. Jones Center and the Wharton e-Business Initiative (WeBI). David Sherr, vice president for architecture and planning at Charles Schwab, and Bill Raduchel, formerly of AOL Time Warner, joined her on the panel.
Moderator Raffi Amit, professor of entrepreneurship and management at Wharton and WeBI’s co-director, set the stage for the discussion. “First-generation internet investments have been less than profitable,” he pointed out.
During the boom, companies formed 62 Internet-related consortia; only 19 remain today, he said. Investors shelled out more than $20 billion in venture capital from 1998 to 2000, and companies committed to more than $50 billion in web spending. But less than a quarter of corporate investments in customer-relationship management and supply-chain systems yielded positive returns on investment.
What went wrong? Is it, as some critics have suggested, that the Internet’s uses as a business tool were oversold? Or is it, as Amit suggested, that the companies failed to fundamentally change the way they operated to exploit efficiencies enabled by the web?
Based on his research, Amit opts for the second explanation. His work has led him to identify several keys to successful digital transformation, including that “change must be driven from the top, and executives must be obsessed with change management.”
That finding mirrors Terrell’s experience at DaimlerChrysler. “Our CEO said, ‘Let’s use the Internet to turn the company around.’ So we started with strong, visionary leadership. We were going to cut 20% of personnel, but we were going to invest in e-business.”
That was a shift for Chrysler, which Terrell said had been an Internet laggard. Until the crisis, “There was a feeling of an arranged marriage between Chrysler and the Internet. The company was focused on cars and thought, ‘There’s this Internet thing out there. We ought to do something about that.’”
Chrysler, for example, didn’t have a central repository for customer information or a standard way to dig it out. “We had customer information all over the place – in sales, in marketing, in warranty and in dealerships.” It thus couldn’t effectively analyze customer data to identify trends and selling opportunities.
The crisis helped Terrell’s group gain access to that information: Fearing for their livelihood, division managers handed it over. In better times, she suspects they would have guarded it jealously. “Crisis can be an enabler for change,” she noted.
But crisis alone isn’t enough. A company has to have a corporate culture that allows it to accept change readily, she said. Though Chrysler wasn’t web-savvy, it was “a scrappy company that had had to reinvent itself every 10 years.” And its re-inventions had acted as “a natural refresh cycle” that made employees more comfortable with change than they might have been at a more stable place.
Too Much Money, Too Few Questions
David Sherr of San Francisco-based Charles Schwab Corp. has a name for companies such as Chrysler that frequently reinvent themselves. He calls them “transfirms” and counts Schwab among them.
Like Chrysler, Schwab has been a place of frequent change, Sherr pointed out. It started in discount brokerage, then added a “mutual-fund supermarket” and web-based stock trading. Most recently, it has begun distributing independent investment research to individual investors.
Thanks to its frequent transformations, it hasn’t been difficult for Schwab to embrace the web, Sherr said. Even so, not every IT decision it made was a smart one. “During the boom, people would throw money at ideas and build capability in silos.” Throughout corporate America, so much money was available for technology purchases – and so few critical questions were asked about them – that divisions within companies built redundant systems.
Tight budgets in the current tough economy no longer allow that. And IT executives are looking for ways to run multiple department applications through a single server. “The challenge is, will one department let somebody else’s application run side by side with theirs,” Sherr said. “At Schwab, we’re telling them we will manage it like a utility and ‘sell’ you the capability you need.”
Sherr also offered a piece of advice for fellow IT executives: Keep doing everything you can to finagle additional money for system improvements. “Build now, because when the next boom comes, you won’t have the time to structure it right.”
Like Terrell and Sherr, Bill Raduchel, former chief technology officer of New York-based AOL Time Warner Inc., suggested that the secret to digital transformation is changing minds, not machines. “In the past, you captured the benefits of technology by centralizing,” he said. “That’s not true today.”
Older mainframe computers required centralization, which fit well with hierarchical corporate structures. They were too big and too costly to be spread throughout the business. But the Internet and client-server technology distribute computing power – and thus decision-making power – throughout companies.
To take full advantage of them, Raduchel argued that firms must decentralize their operations – something executives have trouble doing.
Terrell suggested that another cultural issue bubbles below the surface. “Big businesses are built of like thinkers. You bring in IT people with different ways of thinking, and your executives, your board, can’t talk to them. What’s the first thing that happens when an entrepreneurial company gets bought by a big company? They throw out the visionary CEO.”
Amit put it slightly differently: “It’s not technology that’s the obstacle to digital transformation. It’s people. The expectation was that technology was going to change business profoundly, but human nature doesn’t change.”
Business Models that No Longer Work
People may be the largest obstacle, but they are not the only one, said David Farber, a conference speaker and professor of telecommunications systems and public policy at the University of Pennsylvania. The Internet has roiled whole fields of public policy, and policymakers and the public are still trying to understand the implications.
Consider intellectual property law. The rules there were well settled until the Internet came along and “we could save everything digitally and make perfect copies every time,” Farber said. “Business models are broken by this technology. There’s panic among people in the media business.”
And there’s no consensus on solutions. In the music industry, for example, consumers clearly want to share music files over the web. Companies want to prevent this and have implored regulators to let them devise solutions. “The courts are confused,” Farber said.
Even national security, which now depends on computers and networks, is in disarray, he said. “These networks were originally built as research efforts by people who were friends. They weren’t built to prevent people from damaging them. I know at least three ways I could take the Internet down, and it would [require] three to five hours to get it back up.”
According to Farber, the economic slowdown has aggravated the security problem. Computer-security staffers were among the first people to be laid off because companies viewed them as luxuries, not generators of badly needed revenue.
Solving the security problems will present another set of questions about limiting free expression and protecting privacy, he pointed out. “We want security, but we don’t want to lose freedom.” Nor should we want to, he added, offering a quote from Benjamin Franklin: “They that can give up essential liberty to obtain a little temporary safety deserve neither liberty nor safety.”
The Rules They Are A-Changing
In the final panel of the conference, Wharton management professor Sidney Winter noted that the Internet scene is replete with examples of situations where the rules of the game are still being developed or rewritten even as the game is being played. All the way from the technological infrastructure to the specific content of websites, there are basic institutional issues affecting governance and standards that are still under vigorous discussion.
Winter, co-director of the Reginald H. Jones Center, pointed out that the way these issues are settled can have a powerful influence on the size of market niches and the viability of business models. The e-business arena is one where the scale of an activity is a particularly powerful determinant of its fundamental economics, he noted.
Panelists who joined Winter provided an introduction to the sometimes-arcane discussions in four crucial areas.
Leading off was Gerald Faulhaber, professor of business and public policy at Wharton. He addressed standards and governance issues in wireless technology, an area he knew firsthand from a year spent as chief economist of the Federal Communications Commission. Faulhaber emphasized that technology can deliver comparable value to consumers in a variety of ways, and that which technology dominates behind the scenes may not matter much – as long as the key consumer needs are met.
For example, in current digital cell service, a variety of technologies can comfortably co-exist provided that interconnection is possible. The additional manufacturing cost of phones that can connect to multiple services is not that high. Regarding the heralded “3G” phones, Faulhaber suggested that progress is slow mainly because no one has yet identified a service product appealing enough to create a mass consumer market.
WiFi – local wireless Internet connection – is “the latest craze,” Faulhaber said. It is the solution not to the problem of the “last mile” of connection, but to the “last 100 feet.” Here there is an issue of compatibility, or product interoperability, but so far it looks to be something the marketplace can work out, he suggested.
Faulhaber proposed that the governance issues related to the use of the radio spectrum are of far greater importance than the technical standards. Digital cell technology and “3G” services operate in the licensed part of the spectrum. Spectrum allocations are made by the Federal Communications Commission in a process still shaped by political and bureaucratic pressures, though spectrum auctions have recently become a prominent part of it.
The big problem is finding frequencies for new services, because we are “out” of spectrum – it has already been allocated to other uses such as UHF television, Faulhaber noted. The present institutional arrangements do not seem to offer a feasible path to redirecting this spectrum to new, more highly valued services.
Faulhaber indicated two possible approaches to radical institutional reform. One is a property rights system that would facilitate transactions in spectrum, the other an unregulated “commons” approach that would largely rely on technology to come to the rescue. Both of these have significant disadvantages, and neither seems likely right now.
How these issues are resolved, Faulhaber suggested, will determine the shape of the wireless revolution – or indeed if there is one.
Another key set of rules for the Internet governs the domain name system. The chief responsibility is borne by ICANN – the Internet Corporation for Assigned Names and Numbers. David Farber, a computer and information science professor at Penn, summarized the rather tortuous history of this institution. As is the case in other areas of the Net’s institutional structure, the early arrangements were quite informal and generally worked well. A key role was played by Jon Postel, a student and subsequently a professor at USC. He initially managed the system as an unpaid volunteer, consulting with his colleagues who were involved with the Net.
This system became unworkable as the Internet grew, and Postel asked the government to do something about it. It responded by setting up Network Solutions, which subsequently tried to lay claim to a monopoly position in domain names. Out of the ensuing controversy, ICANN was born – but controversy intensified. Farber noted that, in addition to the scale and high stakes of the problem, the challenges to ICANN have emerged from its own internal struggles and its apparently strong aversion to transparency of process.
The controversy continues today, and the heritage of the past makes meaningful reform hard to achieve. Farber expressed hope that progress could be made through the Internet community – and fear that the alternative of “adult supervision” from the government could be even worse.
The legal infrastructure regulating the Internet continues to evolve during this “growing up” period as well. Penn law school professor R. Polk Wagner drew a contrast between the scope of the regulatory infrastructure and the source of the regulatory authority and talked about emerging trends in the direction of generalized, bottom-up regulation.
According to Wagner, the scope can range from localized regulations that apply to particular situations within traditional territorial boundaries (nations, states) to generalized regulations that attempt to cover the Internet globally, cutting across all traditional jurisdictions. For example, following the mostly unregulated early days of the Internet, localized regulation arose around particular “red flag” issues such as privacy until a clash of different local approaches pointed towards the advantages of a more generalized approach.
The source can range from top-down regulations that are imposed by national governments attempting to control Internet policy within their traditional territorial boundaries, to bottom-up regulations that typically emerge from non-governmental associations. These associations were traditionally dominated by those involved with policy from the early community-oriented days of the Internet, but increasingly now attract those with other agendas (commercial applications, national security concerns, etc.).
Wagner then talked about the strengths and weaknesses of different regulatory approaches. Regulations with localized scope can be put into place more quickly, are more precise, and can handle the complexities of local situations more readily. But they are also easier to avoid, since Internet activity can so easily be moved outside of a traditional physical territory, where the localized regulations are in force. Similarly, while top-down regulations enforced by national governments have a high level of legal authority and are tied to representative political structures, the history of the Internet reveals that bottom-up regulations created by intensive users are more adept at keeping up with technological developments and better at fostering continued innovation.
His prediction is that the recent development in the ICANN story told by Farber will represent the future trend – with regulations emerging at the generalized level from non-profit, user-based organizations that operate according to bottom-up principles. While these new regulations will emerge slowly, from a tortuous policy-making process, Wagner argued, they will be more adaptable to the future challenges of the Internet than local regulations by traditional governmental bodies.
At a previous Jones Center conference last fall, two speakers independently asked the audience whether they had heard of something called “XBRL”. Many had not. At this conference, the audience got the word about XBRL from Mike Willis, founding chairman of XBRL International and senior partner at PricewaterhouseCoopers. “XBRL” stands for eXtensible Business Reporting Language. It offers a standardized system of financial terminology, coupled with the technological specifications needed to make that language flow easily through the Internet.
A consortium of some 200 member institutions stands behind the XBRL initiative. A large proportion of these are household names, covering a spectrum from technology and systems through accounting and financial services, and including public and non-profit organizations as well as for-profit companies, both U.S. and foreign. They are joined together by virtue of their common involvement in the “business reporting supply chain.”
With XBRL, a number of constraints on the flow of financial information are broken simultaneously. These include the obvious constraint of printed paper, but also those of reporting conventions (XBRL accommodates more than U.S. GAAP) and languages (XBRL dictionaries are multi-lingual). Perhaps most important, any investor can download information directly into a spreadsheet and try to out-do the Wall Street analysts. Some, at least, will probably succeed.
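The basic idea behind tagged financial reporting can be pictured with a short sketch. The snippet below is illustrative only: the element names (`Revenue`, `NetIncome`), the units, and the flat structure are simplified stand-ins, since real XBRL documents use namespaced taxonomy elements, contexts, and much richer specifications. It shows only the core notion that machine-readable tags let software pull financial facts straight into a spreadsheet-ready form.

```python
# Illustrative sketch only: real XBRL uses formal taxonomies, XML
# namespaces, and context references far richer than this toy example.
import xml.etree.ElementTree as ET

# A hypothetical, drastically simplified "report" with tagged facts.
SAMPLE = """
<report>
  <Revenue unit="USD">5000000</Revenue>
  <NetIncome unit="USD">750000</NetIncome>
</report>
"""

def extract_facts(xml_text):
    """Pull each tagged financial fact into a plain Python dict,
    ready to drop into a spreadsheet row or CSV file."""
    root = ET.fromstring(xml_text)
    return {child.tag: float(child.text) for child in root}

facts = extract_facts(SAMPLE)
print(facts)
```

Because every fact carries a standard tag rather than living in free-form text or print, the same extraction works across companies and reporting conventions – which is the interoperability Willis described.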
Willis said that recent key endorsements by organizations like the FDIC and the Australian regulatory authorities indicate that XBRL is gaining momentum and may well be the wave of the future in financial reporting. In the U.S., a supportive stance by the SEC is anticipated, but the IRS may be a holdout.