For global insurance firms, cyberattacks have become the most threatening of all emerging risks, according to a recent survey by Guy Carpenter & Co., the risk and reinsurance specialist. Over the past two years, hackers have infiltrated major airlines, energy companies and defense firms, among many other businesses.

Most sensationally, hackers recently broke into the personal files of Sony Pictures executives and threatened further action against the studio unless it canceled the release of The Interview, a comedy about a plot to assassinate North Korean dictator Kim Jong Un. This week, Sony said it was shelving the film. And on Friday, the FBI confirmed that it had sufficient evidence to conclude that North Korea’s government was behind the hack.

As debate continues about the most effective ways to thwart such attacks in the future, companies and governments worldwide are weighing the efforts by law enforcement agencies to create “back doors” into major computer and data networks: openings that bypass software security to give investigators access to user data. Might such back doors aid the investigation of cyberattacks? Or might they instead wind up weakening system security in unexpected ways?

At a recent Wharton-hosted panel discussion, “Cybersecurity and Law Enforcement Back Doors,” two Penn experts on cybersecurity explored the little-understood legal and technical issues surrounding such questions. The panelists — Jeffrey Vagle, a lecturer at the University of Pennsylvania Law School, and Matt Blaze, a professor at Penn’s School of Engineering and Applied Science — agreed that “back doors,” however well-intentioned, could wind up raising security risks. The discussion was moderated by Howard Kunreuther, co-director of Wharton’s Risk Management and Decision Processes Center. The panel discussion was part of the Risk Regulation Seminar Series co-sponsored by the Penn Program on Regulation and the Wharton Risk Management and Decision Processes Center.

“If you knew that J. Edgar Hoover had this technology, how would you feel about it?”–Jeffrey Vagle 

Rights vs. Safety 

Kunreuther, who is also a Wharton operations and information management professor, summed up the dilemma facing law-enforcement agencies this way: “Law enforcement agencies like the FBI want to obtain data from firms like Google and Apple on criminal suspects such as drug dealers, but there is a concern that if this information is provided by these firms via the ‘back door,’ it will make it easier for hackers to compromise this system.”

Last summer, both Google and Apple announced that new versions of their operating systems – Google’s Lollipop and Apple’s iOS 8 – would encrypt data using users’ passphrases, which would prevent the companies themselves from getting at their customers’ data. “That market change was spurred in part, we think, by the Snowden revelation that the NSA’s PRISM program [of clandestine mass electronic surveillance and data-mining] was mining data from Apple and Google devices,” said Vagle.
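To see why passphrase-based encryption locks the vendor out, consider a minimal sketch of the general technique (in Python, using the third-party `cryptography` package; the key-derivation parameters and sample passphrase here are illustrative assumptions, not Apple’s or Google’s actual designs). The encryption key is derived on the device from the user’s passphrase, so the vendor never holds anything capable of decrypting the data:

```python
import os
import hashlib
from cryptography.hazmat.primitives.ciphers.aead import AESGCM  # pip install cryptography

def encrypt_with_passphrase(passphrase: str, plaintext: bytes) -> dict:
    """Derive a key from the user's passphrase and encrypt locally.

    The key exists only as a function of the passphrase, so the
    device vendor holds nothing that can decrypt the data.
    """
    salt = os.urandom(16)                      # random per-device salt
    key = hashlib.pbkdf2_hmac(                 # deliberately slow derivation
        "sha256", passphrase.encode(), salt, 200_000, dklen=32
    )
    nonce = os.urandom(12)
    ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)
    return {"salt": salt, "nonce": nonce, "ciphertext": ciphertext}

def decrypt_with_passphrase(passphrase: str, blob: dict) -> bytes:
    """Re-derive the same key from the passphrase and decrypt."""
    key = hashlib.pbkdf2_hmac(
        "sha256", passphrase.encode(), blob["salt"], 200_000, dklen=32
    )
    return AESGCM(key).decrypt(blob["nonce"], blob["ciphertext"], None)

blob = encrypt_with_passphrase("correct horse battery staple", b"user photos")
assert decrypt_with_passphrase("correct horse battery staple", blob) == b"user photos"
```

Without the passphrase, a subpoena served on the vendor yields only ciphertext, which is precisely the property law enforcement objected to.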

When Apple and Google announced this policy, then-Attorney General Eric Holder and FBI Director James Comey objected strenuously, saying that the move would contribute to the networks “going dark” for law enforcement.

In simple English, said Vagle, the change meant that U.S. law enforcement agencies “would have certain avenues of investigation cut off from them because the networks or the files are encrypted and they don’t have the facilities to decrypt those files. And so there would be crimes that go unsolved. Criminals would be able to communicate with impunity.”

In fact, noted Vagle, research shows that “only a small number of crimes that were reported actually involved encryption.”

Vagle noted that today’s smartphones “are as powerful as supercomputers were back in the 1980s. This is not an exaggeration.” With mobile devices replacing desktop computers as the primary electronic devices for much of the population, “most of our data ends up being transmitted by these [phones]. We love these [devices] for their convenience, but it is problematic because they are generating a great deal of data that may be open to law enforcement scrutiny.” Some of this is “data that we know about, [but much is] data that we don’t know about.” 

Apart from the technical challenges of securing that data, there are free speech issues as well, noted Vagle. For him, a key question to ask is: “If you knew that J. Edgar Hoover [the much-feared director of the FBI from 1935 to 1972] had this technology, how would you feel about it?” 

“The number of law enforcement wiretaps is actually up … and the number of cases in which cryptography has interfered with an investigation rounds down to zero.”–Matt Blaze

“It is this argument that we are having again today,” said Vagle. “Should law enforcement be able to say to us, ‘We need to have access to everything that you transmit, that you store … so we must enforce a back door’? Can law enforcement require this?”

A Bigger ‘Attack Surface’ 

Blaze then explained why these back doors are so problematic from an engineering perspective. Back in the early 1990s, as the Internet was about to burst onto the scene, it was clear that it would be commercially important, yet equally clear that it “did not have intrinsic security; you couldn’t really rely on the fact that a phone line is somewhat hard to tap and requires physical access and so on. [With the Internet], a lot of things could be done automatically.” In retrospect, it might seem strange, but “the Internet did not have any security built into its design, and all of a sudden, we were about to start using it for incredibly important things – to replace telegrams and telephones and postal communications, and in-person meetings. We were on the cusp of that.”

At that time, Blaze said, “Cryptography, which is essentially the underlying technology that can provide security of messages over an insecure medium, was just starting to be practical to do in software… We were about to incorporate cryptography in software, but the government came up with a hardware solution. It meant that in order to implement cryptography using the government’s back door solution, not only would you have to trust that the government wouldn’t use the back door, but you would also have to buy extra hardware in every product that would use it. In the 1990s, it was clear that this would be commercially disastrous. It would have converted [a technology] that had no marginal cost … into an expensive one. This was doomed from the very beginning.” The U.S. government ultimately lost that battle; by 2000 it had abandoned the idea of back doors in cryptography and removed most export controls on it. And in spite of the FBI’s warning that disaster would follow any day, cryptography proliferated.

Today, virtually every over-the-air cell phone call is encrypted. “We have cryptography in many of the applications that the FBI warned would cause law enforcement to ‘go dark,’” Blaze said, meaning that electronic surveillance would become impossible and cyber-criminals would escape justice.

“[Yet] for some reason,” he pointed out, “the number of law enforcement wiretaps is actually up – not dramatically up, but it is not down – and the number of cases in which cryptography has interfered with an investigation rounds down to zero.” 

Blaze warned that “the FBI needs to be very careful what it wishes for. Adding surveillance mechanisms vastly increases what we in the security field call the ‘attack surface’ of the system. The surveillance back door vastly complicates what a security product has to accomplish, and vastly increases the number of opportunities for someone to attack it, and to take advantage of flaws.” (The attack surface of a software environment is all the points where an unauthorized user can try to hack it.)

“The surveillance back door vastly complicates what a security product has to accomplish, and vastly increases the number of opportunities for someone to attack it.”–Matt Blaze

“We computer scientists actually don’t know what we are doing,” said Blaze. “We are the worst engineers of all the engineers… The way we build software, we build it badly and then we let everyone else find the flaws, and after a little while, we make them a little more robust. The security engineering aspects of information systems are no different. We don’t know how to reliably secure complex systems. We don’t even know how to reliably secure not-so-complex systems.” 

Adding a back door makes securing anything far more difficult: The attack surface increases, and the number of components and interfaces that have to be considered rises enormously. Simply put: “It vastly increases the probability that somebody will find a way around the explicit back door, and find security flaws that are introduced through the back door.”
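A toy example makes the point concrete. Suppose the mandate were met with a simple key-escrow design, in which the key protecting the user’s data is wrapped not only for the user but also under a law-enforcement escrow key. (The scheme and names below are hypothetical, invented to illustrate Blaze’s argument; they do not describe any actual proposal.) The escrow path is extra code, an extra interface, and one more secret to defend, and its compromise unlocks every conforming device:

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM  # pip install cryptography

# Hypothetical escrowed design, for illustration only.
data_key = os.urandom(32)     # protects the user's data
user_key = os.urandom(32)     # normally derived from the user's passphrase
escrow_key = os.urandom(32)   # the mandated "back door" key

def wrap(kek: bytes, key: bytes) -> tuple:
    """Encrypt (wrap) a key under a key-encryption key."""
    nonce = os.urandom(12)
    return nonce, AESGCM(kek).encrypt(nonce, key, None)

# Without escrow: one wrapped copy of the key, one secret to defend.
user_copy = wrap(user_key, data_key)

# With escrow: a second copy, a second code path, a second target.
escrow_copy = wrap(escrow_key, data_key)

# Anyone who obtains escrow_key -- an insider, a hacker, a foreign
# intelligence service -- recovers data_key without the user's secret.
nonce, wrapped = escrow_copy
assert AESGCM(escrow_key).decrypt(nonce, wrapped, None) == data_key
```

Every line in the escrow path is part of the enlarged attack surface Blaze describes; a flaw in that path, or a leak of the escrow key, defeats the security of the whole system.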

A second problem, he added, is that the FBI wants this design mandated for a huge array of products and services — any technology that processes information that criminals might use. “That’s a huge range of products. It means your phone, your computer, the smart devices in your home, and small components that are interacting with other components — many of which are being built by companies with much smaller security staffs than companies like Google and Apple.”  

Finally, noted Blaze, “The FBI is [seeking] these design mandates for features that don’t actually add any value to the end consumer.” From an engineering point of view, “it is very likely that we will see things put in place in the most inexpensive way possible that complies with requirements,” probably with less-than-thorough testing, all of which adds up to an outsized negative impact on information security.

From his perspective, Vagle said there’s one key reason that law enforcement hasn’t “gone dark” yet: “When criminals use encryption or some sort of security device to hide their communication, they usually do it badly.”

But for his part, Blaze has a warning: “I appreciate the FBI’s confidence in my field, that somehow we are on the cusp of being able to imperviously secure every computing device against eavesdropping, but I don’t know how to do it … and I don’t think we’re going to achieve that soon. What we’re talking about is really the cost of these targeted attacks. I do know how to make it more expensive, and how to make it less convenient. But I don’t know how to prevent it.”