The federal government’s demand that Apple create new software to hack into the phone of one dead terrorist speaks to the complex and countervailing forces of privacy and national security. The subject of corporate constitutional rights is of great interest to professors Eric Orts and Amy Sepinwall from Wharton’s legal studies and business ethics department. Perhaps presciently, they recently penned “Privacy and Organizational Persons,” a Minnesota Law Review article that foreshadowed this debate.

Sepinwall and Orts joined the Knowledge at Wharton show on Wharton Business Radio on SiriusXM channel 111, to discuss the juxtaposition of privacy and security in the digital and social media age.

An edited transcript of the conversation appears below.

Knowledge at Wharton: Let’s just start with your reaction to the case itself — Apple against the government, just a case of two sides butting heads, trying to figure this out.

Amy Sepinwall: The issues are really complicated. Because of some security concerns, we don’t have a full sense of just what’s at stake.

In much of the media, this has been portrayed as a kind of dichotomy between security interests on the one hand, which are obviously very important to us, and privacy interests on the other. There is something a little misleading about that framing, though, because it treats privacy as the only countervailing force against our concerns for security. It’s also the case that there are security concerns on Apple’s side as well as on the government’s side, and Apple’s security concerns ought to be of moment for all of us.

Apple’s concern is that if it develops this technology, which it doesn’t currently have, to unlock this phone, that technology is not going to be specific to that particular phone. Once that technology exists, it could get into the wrong hands, which could lead to cyber-attacks or hacking that essentially puts all of us at risk. So again, there are security concerns on both sides of this debate, and it’s important to take those seriously.

Our interest in the issue comes from a larger-scale interest in corporate constitutional rights, and those rights themselves are a source of some anxiety. To take us away from the Apple issue for just a moment, when you think about a case like Citizens United, what’s at stake there is a corporate constitutional right to political speech, and some of us have concerns about whether corporations should be engaging in politics in that way. Or if you think about Hobby Lobby, what’s at stake there is a corporate constitutional right to freedom of religion.

“Once that technology exists, it could get into the wrong hands, which could lead to cyber-attacks or hacking that essentially puts all of us at risk.” –Amy Sepinwall

Here, we have a case of a corporation invoking its asserted rights to privacy. One important distinction, perhaps, is that it is doing so not on its own behalf, but on behalf of its users. So it’s seeking to protect its users in ways that they really couldn’t protect themselves, both because they obviously don’t have the power that Apple has and because, if, for example, the United States government wanted to spy on me through my iPhone, I wouldn’t know about it, so I wouldn’t be able to assert my privacy rights. So it’s incumbent upon Apple, given that it is in this privileged position, to seek to protect the privacy of its users.

Knowledge at Wharton: Tim Cook did an interview yesterday with ABC News’ David Muir, and he talked about a lot of what you just spoke about, and I wanted to play a clip from it.

Tim Cook: What is at stake here is, can the government compel Apple to write software that we believe would make hundreds of millions of customers vulnerable, around the world, including the U.S., and also trample civil liberties that are at the basic foundation of what this country was made on?

David Muir: And you would have to write that system in order to unlock that phone.

Cook: Yes, … the only way we know would be to write a piece of software that we view as sort of the software equivalent of cancer. We think it’s bad news to write, we would never write it, we have never written it. And that is what is at stake here.

Knowledge at Wharton: What’s your reaction to what Cook said?

Sepinwall: I don’t think you could put it in starker terms than likening the software to cancer. That isn’t the way that I’ve been thinking about it. I’ve been thinking about it more like the development of a master key, but of course, a master key has power of its own. One of the interesting pieces here is that Tim Cook, and Apple as a whole, seem to be motivated not by concerns about their bottom line, but by principle.

“What if someone has a cell phone the FBI needs unlocked, and it’s a nuclear threat of some kind? Then what?” –Eric Orts

So just a very deep commitment to privacy, which Tim Cook in other contexts has called a civic duty, motivates them. In fact, at a meeting with shareholders sometime within the last year, Tim Cook effectively said to them, “Look, if what you care about is the bottom line, at the expense of our commitment to privacy, then you really should go elsewhere.” So he has been very forthright about the fact that this is a commitment that deserves weight on its own, independent of whatever effect it may have on the company’s stock value.

Knowledge at Wharton: In the article that you and Eric Orts did for the Minnesota Law Review, you talk about how the term “privacy” is kind of fluid right now. And it’s changing even as we speak.

Sepinwall: That’s right. It’s inevitable, given how quickly technology itself is developing. I teach undergraduate students at Wharton, and they have a completely different conception of informational privacy than I might have, just because so much of their lives is now made available to the world at large through various social media.

Knowledge at Wharton: What do you think the resolution will be of this particular case with Apple and the FBI?

Sepinwall: It’s really complicated. If Apple [finds] itself to be in a bargaining position, it could seek to negotiate terms such that the technology it developed would be destroyed immediately afterward, and such that it wouldn’t be seen as having established a precedent whereby the government can compel it to create technology for the purpose of breaching users’ privacy.

Apple could specify: “Look, we’ll do this, but only because we have really good reason to know that the person whose phone [it is] is someone who [committed] a series of murders. But for other cases, for lower-level crimes, for example, don’t come to us. We’re just not going to be your handmaidens when it comes to developing that technology.”

Knowledge at Wharton: Later in that interview, Tim Cook alludes to that as well. If the government hadn’t taken the tack that they did in making this such a public issue, you get the sense that Apple might have worked with them. So it makes you think: Well, maybe Apple does have the ability to do this, but now they just don’t want to.

Sepinwall: That could be. I think one of the issues here is, why did the government come out so publicly? I think the government thought it had this very sympathetic case. Everyone wants to see what might have been on the San Bernardino shooter’s phone, and as it happens, Apple has been subject to something like 11,000 requests in the last year. And it says it has complied with about 7,000 of them, so it’s largely been cooperative. But of course, this is an especially polarizing case, given what’s at stake. This isn’t a low-level criminal.

“The idea that you use all that effort to create the backdoor, and then somehow can erase that knowledge, I think is the problem.” –Eric Orts

At the same time, the Manhattan [district attorney] has said, “Well, we need to have access to the phones of low-level criminals too, because some of what we’re going to find there is going to lead us to bigger criminals.” So you have a slippery slope, and it’s really hard to know whether you should embark upon it, and if so, whether you’re going to be able to stop that train — to mix metaphors — once it gets going.

Knowledge at Wharton: Eric Orts joins us. I’m sure this has been an interesting week for you as well, watching this all play out along the lines of what you and Amy wrote about a few months ago.

Eric Orts: It’s always helpful when you think you’re writing something that is rather theoretical, and then suddenly it’s on the front pages of all of the papers, and we get invited on to your show, etc. But it is a very important issue … the whole question of the right of organizations to assert privacy rights, whether on their own behalf or for their customers and users.

Knowledge at Wharton: Is a company like Apple asserting this on behalf of its consumers a relatively new concept? And is it one that we will see develop as we go further?

Orts: That’s one of the things that we talk about in our article, but there is a theoretical question: Is Apple only responsible for advancing the interests of its users? In many of Tim Cook’s pronouncements, that is the argument. Encryption is created for users. We respect the privacy of our users, and we’re not going to make a big backdoor to that. And the reason is that we are protecting the privacy of our users.

But there are times — including at one point in the ABC interview — when Tim Cook was asked, “Well, what would Steve Jobs have done?” And he says, “I think about that every day. In fact, I think about him every day.” And one thing that he said about [Jobs] is that he always did the right thing, at least according to how he thinks about it.

So then, the question is: Does Tim Cook, as CEO of Apple, have a responsibility, from a business ethics point of view, to actually take a position on what the right thing to do is? And I think a lot of this argument, too, is about what’s the best thing for keeping everybody safe in the future. Protecting privacy on millions, billions of phones? I think you do have to take a public perspective on that.

At that point, it’s not just about business interests. It’s not just about your customers, or what’s going to make money for you. It’s about a higher principle. And I think you have Google, Facebook, Twitter, some of the other companies weighing in, and they have to think about similar kinds of questions. To what extent can these new technologies be used by terrorists, or to hurt many people? Not just San Bernardino, but what if someone has a cell phone the FBI needs unlocked, and it’s a nuclear threat of some kind? Then what?

“Even if they could manage to erase the new operating system itself, the developers now have the knowledge of how to do this in their heads … and it ends up in the wrong hands.” –Amy Sepinwall

And against that issue, you have to [strike a] balance: Do you really make people safer if you have backdoors in your encryption that criminals, or maybe even terrorists, could then use against you? So it’s not an easy question, but I think that, inevitably, the companies are correct in that they have to stand up, and step up, and take a position one way or another on these issues.

Knowledge at Wharton: The idea that Washington could say, “Listen, we have 14 people who were killed and there is potentially this [valuable] information on this one phone. Could you unlock this? And then destroy the process of doing it?” I think a lot of people would want that, but the problem is, somehow, some way, most likely, that information is going to get out. Where do you fall on that question?

Orts: I’m not a technology expert, so I’m relying, to some extent, on what Cook and other Apple executives are saying. But my understanding is that what they would have to do, technically, is basically create a new operating system that replaces the phone’s OS. Then, with the new operating system, you can crack the password by running high-powered computers against it.

Now, the idea is that you would create that whole operating system, with a bunch of people working on it who would maybe have to use Apple’s confidential information to build it. But the idea that you use all that effort to create the backdoor, and then somehow can erase that knowledge, I think is the problem. At least from Apple’s perspective, once you create that new software, it’s like creating a new Frankenstein monster out there.

…But then once you create that, there are going to be other demands for that, and then what’s the stopping point? What happens when the Chinese government comes to Apple, and says, “We need you to create a backdoor to stop this terrorist.” You can see that there is something to the “slippery slope” argument that Apple is making here.
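To put rough numbers on the brute-force step Orts describes above, here is a minimal, purely hypothetical sketch in Python. Nothing in it is Apple’s actual software or the government’s tooling; the `try_passcode` check and the four-digit passcode length are assumptions made only for illustration. The point is simply that once a device no longer enforces delays between attempts or erases itself after repeated failures, exhausting every numeric passcode is a trivial computation.

```python
# Hypothetical illustration only -- not Apple's software or the court-ordered tool.
# Assumes a stand-in try_passcode() check and a 4-digit numeric passcode.
# What normally protects such a short code is the device enforcing delays between
# attempts and erasing itself after repeated failures; remove those safeguards and
# an exhaustive search over all 10**4 = 10,000 codes finishes almost instantly.
from itertools import product
from typing import Optional


def try_passcode(guess: str) -> bool:
    """Stand-in for the device's passcode check; always fails in this sketch."""
    return False


def brute_force(length: int = 4) -> Optional[str]:
    """Try every numeric passcode of the given length, lowest first."""
    for digits in product("0123456789", repeat=length):
        guess = "".join(digits)
        if try_passcode(guess):
            return guess
    return None


if __name__ == "__main__":
    # With the sketch's always-failing check, this prints None after 10,000 tries.
    print(brute_force(4))
```

In other words, on the account given above, what protects a short passcode is not the size of the search space but the device’s own limits on guessing, which is why the dispute centers on software that would remove those limits rather than on breaking the encryption mathematically.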

Sepinwall: So let me add something that gets back to your earlier question about whether this is a new frontier, whether we are just now seeing … corporations trying to protect their users. The precedent that the government is invoking here is a 1977 case involving phone companies, which were resisting government efforts to track the phone numbers that suspected individuals were calling; they were forced to relent and hand over that information. But of course, that is a much smaller-scale infringement on privacy, relative to all of the content that is currently on a person’s iPhone. Their photos, their notes, their calendar, their location.

Orts: Credit card info.

Sepinwall: Right. Some very sensitive financial information, some potentially intimate information. So what is at stake here is much larger — to say nothing of the concerns that Eric rightly raises about this technology getting into the wrong hands.

Again, even if they could manage to erase the new operating system itself, the developers now have the knowledge of how to do this in their heads, and you don’t want to have a situation where someone becomes disgruntled, or someone gets bought for a high enough price such that they are willing to recreate this technology that we thought was destroyed, and it ends up in the wrong hands.

Knowledge at Wharton: The other interesting thing is the fact that we are talking about an issue of privacy in an era where technology advances so quickly, yet privacy law, in some respects, goes back to the founding of our country in the 1700s.

“It’s only Apple that really has the firepower to stand up against the FBI on this issue. It’s not going to be every individual iPhone user.” –Eric Orts

Orts: There is a lot of law that is developing to protect privacy rights, whether it’s in the context of the Fourth Amendment protection against unreasonable searches and seizures, or other areas of privacy. Amy and I have written about the different areas of privacy, and we’re following other people. But I think you’re right.

It is somewhat ironic that in the United States, where I think we have a general sense that our privacy is very heavily protected, we are in fact a little bit behind some other parts of the world, notably Europe, where they take privacy interests very seriously. There are fairly large controversies between businesses in the United States that want to mine a lot of data, etc., versus Europe’s sense that you should protect privacy more. So Apple gets thrown right into that, as well as the other big companies.

You even have, in the example of this case, just an illustration of how old the law is here: You have the ‘All Writs Act’ that is being relied on — it was a 1789 act. I don’t think anybody was thinking about, “Let’s get all of your information on an iPhone” when that was written, so it’s probably time to update the law a little bit on privacy. Then you would have a democratic determination of what the law of privacy should be in this age, rather than these court fights. But that doesn’t mean that we’re not going to have this court fight. It looks like it’s inevitable.

Knowledge at Wharton: So we need to update our philosophy and laws on privacy. But in the meantime, potentially, if we’re talking about one company deciding what their level of privacy is, other companies could view privacy in a different manner, correct?

Orts: That’s correct. You’re going to have a division between the different companies involved. One of the other interesting features here — and this is something that Amy and I write about in our article — is that the traditional idea was that you would have big government, and then individual people would oppose the government trying to get into their business, invade their privacy.

But in a modern world, the idea that one person is going to stand up against the NSA, for example, or one person in China is going to stand up against their government, is unrealistic. We make the argument that organizations like Apple, the big companies of the world that are engaged in providing this technology, which ostensibly provides privacy through encryption and other means, actually have a responsibility to step up. And in some ways, if you’re an advocate of privacy, as I think we are, you want companies to do that, because they are the players. It’s only Apple that really has the firepower to stand up against the FBI on this issue. It’s not going to be every individual iPhone user.

“Some of these tech companies, even while they are asserting rights to privacy on behalf of their users, engage in a fair amount of data mining on their own.” –Amy Sepinwall

Sepinwall: On the other side of the issue, one of the interesting dynamics here is that some of these tech companies, even while they are asserting rights to privacy on behalf of their users, engage in a fair amount of data mining on their own. So their analytics and their business model depend on tracking what kinds of searches individuals are doing.

Apple has presented itself as operating with a different model, so it sees itself as providing hardware, not software. It sees itself as disabled from seeing what is on an individual’s phone, and that is supposed to be a key virtue of an Apple device. Apple itself is not participating in this double-sided game, where on the one hand it’s trying to stave off the government, and on the other hand it’s engaging in perhaps its own privacy violations. But some of the other tech companies could be said to be playing both sides of this fence.

Orts: It is interesting that, in some respects, it sounds like Apple is kind of playing two sides here, because, obviously, they are a company out to make a profit, and they are providing services for hundreds of millions of people around the globe, yet they also have the technology aspect of this.

Sepinwall: They do, and I think the cynical take is, Apple is just putting up a good front. It’s going to resist for as long as it can, and then the government will compel it, and it will have to turn over the information, or it will have to develop the technology. But of course, we won’t be able to blame it because it put up a very good fight. I am not necessarily prepared to be that cynical about it. I think this reflects a longstanding commitment of Apple’s: it is not always focused first on what is going to enhance its profits. And the commitment is real, as far as I can tell, anyway.

Orts: Yeah, I think that’s right. We had an interesting debate in my MBA class [recently] about this case, where some students presented this issue. The opinion of the class, I think, was roughly divided half and half, because it’s not easy to tell what the motivations are of a company of this sort. When I’ve seen Tim Cook in these interviews — for what it’s worth, I’m not particularly an expert in judging demeanor — it looks to me like he is seriously grappling with the ethical principles involved here. It’s not just window dressing, and the company trying to make as much profit as possible.

So Amy is right to also mention that there are other companies that have different interests, different profiles, and views of privacy. Just as companies can defend privacy, as Apple seems to be doing right now, you can have companies going the other way, and not being so protective.