Recent weeks have not been kind to Facebook. Since the privacy brouhaha over Cambridge Analytica erupted in mid-March, the world’s biggest social network has lost $41 billion in market value as investors unfriended it. While Facebook certainly is not the only tech giant that lets marketers mine the data of its users and their friends — and politicians from both parties also collect and analyze data to better target voters — the firm is taking the brunt of the public’s anger, especially after the fake news fiasco.
Federal and state agencies are investigating Facebook. Congress has asked CEO Mark Zuckerberg to testify. A growing number of companies have halted advertising on Facebook. Tesla CEO Elon Musk and celebrities such as Cher and Will Ferrell have joined the #DeleteFacebook movement and closed their accounts. Nearly 175,000 people have signed a Change.org petition asking Facebook to better protect their data and be more transparent. It is arguably the biggest PR crisis the company has ever faced.
How can Facebook regain the public’s trust? Wharton management professor David Hsu said the social network should deploy far more resources to protect user data and think more deeply about the different ways its platform could be abused. Instead, Facebook has been more concerned with monetizing its mobile platform, especially since becoming a public company. “All the [Wall Street] analysts are saying this is the metric we pay attention to. So it’s not surprising that, up and down, executives are focused on that rather than” safeguarding consumer data, Hsu said.
But it turns out that privacy concerns cut to Facebook’s core product: user content. Not paying attention to this macro-risk has cost the company dearly. “This probably reflects on managerial experience,” Hsu said. Zuckerberg’s management style is to make many of the decisions about the user experience himself. While a founder-CEO can learn on the job as a company grows, it is critical to recognize one’s strengths and weaknesses. “It’s not just a one-person show,” he said. Founders and leaders need to “be realistic about their domain of expertise and where they need wise counsel.”
What Went Wrong
At the heart of the grievance is that in 2013, a university researcher collected data from 270,000 users who opted to take a personality quiz. At the time, Facebook allowed third parties to collect information not only about users but also about their friends. Thus, the researcher was able to get data on millions of Facebook users without having to ask their permission. Facebook said it ended the policy in 2014. A year later, reporters told Facebook that the researcher had shared the data with Cambridge Analytica, a U.K.-based political data firm hired by the Trump campaign. Such sharing without authorization is against Facebook’s policy.
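The arithmetic behind that reach is worth spelling out. In the following back-of-envelope sketch, the average friend count and the overlap adjustment are illustrative assumptions, not figures from the article:

```python
# Back-of-envelope sketch: how ~270,000 quiz takers could yield data on
# millions of profiles under the pre-2014 friends-permission model.
# The friend count and overlap rate are assumptions for illustration.

participants = 270_000    # users who opted in to the personality quiz
avg_friends = 200         # assumed average number of friends per user
overlap_factor = 0.9      # assumed share of friend profiles duplicated

friends_reached = int(participants * avg_friends * (1 - overlap_factor))

print(f"Profiles with consent:    {participants:,}")
print(f"Friend profiles swept in: {friends_reached:,}")
# => roughly 5.4 million friend profiles under these assumptions,
#    none of whom granted permission themselves
```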
“[Part] of the issue Facebook had is that it went beyond just storing user information. It was collecting and adding additional data to help advertisers target [its] users.” –Ron Berman
Facebook said it demanded that the researcher and Cambridge Analytica certify that the unauthorized data had been deleted, and both did so. But last month, journalists said the data might not have been deleted after all. Facebook, which booted the two parties from its network, said Cambridge Analytica agreed to be audited. Meanwhile, Facebook also moved to strengthen other data controls: Now, friends of users must authorize third parties to collect their information. Developers also must get Facebook’s permission before they can ask users for sensitive information.
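The tightened rules amount to a two-part permission check: a friend's data requires that friend's own authorization, and sensitive fields additionally require Facebook's approval of the app. A minimal sketch of that policy; the classes, field names and logic are hypothetical, not Facebook's actual Graph API:

```python
from dataclasses import dataclass, field

@dataclass
class App:
    name: str
    approved_for_sensitive_data: bool = False

@dataclass
class User:
    name: str
    authorized_apps: set = field(default_factory=set)

SENSITIVE_FIELDS = {"religion", "political_views", "relationship_status"}

def can_collect(app: App, requester: User, target: User, data_field: str) -> bool:
    """Return True if `app`, acting for `requester`, may read `data_field`
    belonging to `target` under the tightened rules (hypothetical model)."""
    # A friend's data now requires the friend's own authorization.
    if target is not requester and app.name not in target.authorized_apps:
        return False
    # Sensitive fields also require the platform's prior approval of the app.
    if data_field in SENSITIVE_FIELDS and not app.approved_for_sensitive_data:
        return False
    return True

quiz_app = App("personality_quiz")
alice = User("alice", authorized_apps={"personality_quiz"})
bob = User("bob")  # Alice's friend; never authorized the app

assert can_collect(quiz_app, alice, alice, "likes")         # OK: her own data
assert not can_collect(quiz_app, alice, bob, "likes")       # blocked: friend never consented
assert not can_collect(quiz_app, alice, alice, "religion")  # blocked: app not approved
```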
Last week, Facebook executives met with reporters and pledged to protect the integrity of its data, especially data related to elections. The company detailed a series of improvements to better fight foreign interference, remove fake accounts, boost ad transparency and hamper the spread of fake news. “None of us can turn back the clock, but we are all responsible for making sure the same kind of attack on our democracy does not happen again,” said Guy Rosen, vice president of product management. “We are taking our role in that effort very, very seriously.”
Fixing the Problem
Lyle Ungar, professor of computer and information science at the University of Pennsylvania, said Facebook erred in several ways. It should not have let organizations collect information on people, such as users’ friends, without their consent, and organizations collecting the data should not be able to sell it without people’s consent. It should have required political ads to be more transparent so people know who is paying for them. It needs to take more responsibility for what third parties do on its platform. And it needs to be more sensitive to how people feel about their data being used in different ways.
Lynn Wu, Wharton professor of operations, information and decisions, said Facebook also could take a page from Google in making it easier for users to know what data is captured. “If you log onto your Google profile, you can actually see what they know about you and your interests and you can modify and change it as you see fit,” she said. In addition, Facebook could regularly send users summary statistics about their social graph. “This is what we know about you. These are your friends. This is how your data is shared,” she said. “And maybe you can control that.”
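A transparency digest along the lines Wu describes could be quite simple. Here is a minimal sketch; the structure and field names are invented for illustration and do not reflect any actual Facebook data model:

```python
# Minimal sketch of a periodic transparency digest of the kind Wu suggests.
# All field names are hypothetical.

def build_transparency_digest(profile):
    """Summarize what the platform holds about a user and who can see it."""
    return {
        "inferred_interests": sorted(profile["interests"]),
        "friend_count": len(profile["friends"]),
        "shared_with_advertisers": profile["ad_sharing_enabled"],
        "third_party_apps_with_access": sorted(profile["authorized_apps"]),
    }

profile = {
    "interests": {"cycling", "jazz"},
    "friends": ["bob", "carol"],
    "ad_sharing_enabled": True,
    "authorized_apps": {"personality_quiz"},
}

for key, value in build_transparency_digest(profile).items():
    print(f"{key}: {value}")
```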
Wharton marketing professor Ron Berman added that “part of the issue Facebook had is that it went beyond just storing user information. It was collecting and adding additional data to help advertisers target users on Facebook.” The company should have asked users if they wanted their data to be augmented with third-party information and shared with advertisers, he added. Credit card companies, for example, are required to disclose how they share their data, and consumers can opt out of this practice.
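Berman's suggestion amounts to a consent gate in front of any data enrichment. A minimal sketch of that idea, with hypothetical names throughout:

```python
# Sketch of Berman's suggested consent gate: augment an ad profile with
# third-party data only when the user has explicitly opted in.
# Names and fields are hypothetical.

def augment_for_advertisers(profile, third_party_data):
    """Merge external data into an ad profile only with explicit consent."""
    if not profile.get("consented_to_augmentation", False):
        return profile  # leave the profile untouched; nothing is shared
    enriched = dict(profile)
    enriched.update(third_party_data)
    return enriched

user = {"user_id": 42, "consented_to_augmentation": False}
external = {"estimated_income_band": "B", "offline_purchases": ["golf clubs"]}

print(augment_for_advertisers(user, external))  # unchanged: no consent given
```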
While Facebook stands accused in the public discourse of letting through content that might have helped elect Trump, Berman is not convinced people are that susceptible. “It is very hard to change someone’s opinion about voting. First, they would need to be undecided voters — and there aren’t that many of those, surveys show. Second, they need to be exposed to only one side of the political spectrum,” he said. But undecided voters tend to seek out differing political views, meaning the opposition can present its information as well. “I am just not convinced the impact [on the election] has been as big as people claim it was,” he said.
Move Fast and Break Things?
But there is a larger issue to consider than fixing Facebook’s foibles with more privacy controls. Its famous dictum is to “move fast and break things.” But it is so big now — with 2.1 billion users — that breaking things could have wide repercussions. As such, Facebook has to go through “a real cultural shift,” said Kevin Werbach, Wharton professor of legal studies and business ethics, on the Knowledge at Wharton show on SiriusXM channel 111.
Werbach compared Facebook’s troubles with Microsoft’s situation in the 1990s under co-founder and CEO Bill Gates, when the government sued it for antitrust violations. The company was accused of acting monopolistically by bundling its own software, notably the Internet Explorer browser, with its Windows operating system, which ran most of the world’s personal computers. Microsoft has since learned from that experience. Led by CEO Satya Nadella, it is a “fundamentally different company — not at a product level, but at a cultural level,” he said.
“… [There] are now fundamental questions being asked by users and by governments around the world about not just the specifics of what Facebook did in this case — but also about what Facebook is.” –Kevin Werbach
But Werbach also pondered whether a much more drastic change might be needed for Facebook than a culture shift. The company “understands that there are now fundamental questions being asked by users and by governments around the world about not just the specifics of what Facebook did in this case — but also about what Facebook is,” Werbach said. “Is there something inherently problematic in the kind of information platform that Facebook has created? And that’s a fundamental challenge to the company.
“Will Facebook and companies like it — either on their own or being forced by governments — have to fundamentally change their business model?” Werbach asked. “And not even just at the level of what data do they share with third party apps. Will they be forced to give users control of their data? Will they be forced to share their social graph with competitors, as some are calling for in Europe? These would be fundamental changes that would have huge impacts on their business, but they are the kinds of things that go with this basic question about whether the business model is ethical and trustworthy.”
Make Facebook a Public Utility?
Werbach floated the idea that perhaps digital platforms should be regulated as public utilities such as railroads, electricity, communications and broadband. “We have a whole body of law and regulatory oversight based on the idea that these are fundamental infrastructural platforms for society and for public discourse,” he said. The U.S. chose not to strictly regulate digital platforms to let them innovate and disrupt incumbents. “But we’re, I think, long past the point where it makes sense to talk about Comcast as a big, powerful company and not also talk about Facebook and Google as big and powerful companies.”
Wu has another idea: Consider making Facebook a decentralized social network. That means there is no central authority like Facebook making decisions about the network for its users. Rather, users decide what they want to see on the platform, how it operates and how data sharing is handled. She pointed to the open-source Linux operating system — similar to Windows or Mac OS X — as an analogy. Users decide what features to put into the software. And even if Facebook becomes decentralized, it can still make a profit. Linux is free, Wu said, but applications can be built on top of it that make money.
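A toy sketch of the decentralized model Wu describes, in which every user runs (or rents) a node that stores their own posts and no central operator holds the data. All names and mechanics here are illustrative:

```python
# Toy model of a decentralized social network: each participant's data
# lives on a node they control, and the owner, not a platform, decides
# who may read it. Entirely illustrative.

class UserNode:
    """One participant's self-hosted data store."""
    def __init__(self, owner):
        self.owner = owner
        self.posts = []
        self.followers_allowed = set()

    def publish(self, text):
        self.posts.append(text)

    def read(self, requester):
        # Access control is enforced at the owner's node, so there is
        # no central database for an operator (or advertiser) to mine.
        if requester in self.followers_allowed:
            return list(self.posts)
        return []

alice = UserNode("alice")
alice.followers_allowed.add("bob")
alice.publish("Hello from my own server")

print(alice.read("bob"))      # ['Hello from my own server']
print(alice.read("mallory"))  # []: no central authority to appeal to
```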
However, the drawback to decentralization is that there could be less incentive for users to keep innovating on new features. Wu noted that even though Linux is free, the dominant system for PCs is still Windows, and it’s Mac OS X for Apple computers. Also, there already are decentralized social networks out there for people who care deeply about privacy — but they haven’t become very popular. “They’re still pretty small and they’re very much in startup mode,” she said. To get the best of both worlds, someone has to come up with a new business model. But “I don’t think Facebook is necessarily the right one to do it,” Wu added.
Of course, Facebook could also start charging users instead of relying on advertisers to make money. Berman said the company was making only about $18 per year per user before expenses — or $7 per user after deducting the cost of running the platform. “This isn’t a terribly high number, but consumers are often not willing to pay even this amount to maintain their privacy,” he said. Another option is to limit which companies can advertise, such as allowing ads only from brands whose users have opted in to see them.
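Berman's figures imply a thin per-user margin, which is worth making explicit. Only the $18 and $7 values come from his remarks; the implied cost and the break-even subscription price below are derived from them, not quoted:

```python
# Working through Berman's per-user figures. Only the $18 and $7 values
# come from the article; the rest is derived arithmetic.

ad_revenue_per_user = 18.0  # dollars per user per year, before expenses
profit_per_user = 7.0       # dollars per user per year, after expenses

cost_per_user = ad_revenue_per_user - profit_per_user
print(f"Implied cost of serving one user: ${cost_per_user:.2f}/year")  # $11.00

# If ads were dropped entirely, a subscription would have to replace the
# full $18 of ad revenue to leave per-user profit unchanged:
break_even_subscription = cost_per_user + profit_per_user
print(f"Break-even ad-free price: ${break_even_subscription / 12:.2f}/month")  # $1.50
```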
Privacy Leak Fallout
Facebook is far from alone in monetizing user data in exchange for free services. But the current uproar over Cambridge Analytica could lead to the tightening of privacy laws that will affect many companies going forward, especially the tech giants. “It’s a problem for Google, it’s a problem for Amazon and any other large online company which has been collecting and using data for various reasons,” said Wharton marketing professor Pinar Yildirim, who joined Werbach on the Knowledge@Wharton radio show. “They will all be exposed to any regulation that may come out of this incident.”
It will also affect consumers “in a drastic way,” Yildirim continued. “Consumers are used to using their products or services for free, in exchange for providing their information to advertisers. If we start to build walls [around] third-party developers or advertisers for use of that data, that’s also going to shift the way that services are provided to consumers.”
Just take a look at the European Union, which has more stringent privacy rules than the U.S. A few years ago, a study examined the impact of Europe’s rule requiring consumers to opt in to cookie tracking on the web rather than opt out. Apps, websites and other online services remained free, but ads became more irritating, Wu said. Because most people did not opt in, marketers could no longer target individuals with smaller, specialized ads. The result? Big pop-up ads, ads that take over whole screens and the like became more prevalent. “Because they can’t target you effectively anymore, they give you the screen-takeover [ad], which is annoying,” she said.
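The dynamic Wu describes reduces to a simple decision rule: without tracking consent there is no interest profile to match against, so advertisers fall back to blunt, attention-grabbing formats. A schematic, purely illustrative sketch:

```python
# Schematic version of the trade-off Wu describes under opt-in cookie rules.
# Illustrative only; no real ad-serving system is this simple.

def choose_ad(user_consented_to_cookies, interest_profile=None):
    if user_consented_to_cookies and interest_profile:
        # Small, specialized ad matched to observed interests.
        return f"sidebar ad for {interest_profile[0]}"
    # No targeting data: compete for attention with sheer size instead.
    return "full-screen takeover ad"

print(choose_ad(True, ["running shoes"]))  # sidebar ad for running shoes
print(choose_ad(False))                    # full-screen takeover ad
```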
Google, Amazon and other online giants that collect data “will all be exposed to any regulation that may come out of this incident.” –Pinar Yildirim
When the EU’s new General Data Protection Regulation (GDPR) takes effect on May 25, things will be even tougher for marketers, Wu said. A key change in this legislation is that any company anywhere in the world that targets anyone in the European Union must actively get that person’s consent before collecting their personal data. The penalties are severe: up to 4% of annual global revenue or 20 million euros, whichever is greater. Moreover, cloud storage companies are not exempt.
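The penalty ceiling is a simple maximum of two quantities, which makes its bite on large firms easy to see. A quick sketch, with an illustrative revenue figure:

```python
# The GDPR ceiling described above: the greater of 4% of annual global
# revenue or 20 million euros. The revenue figure below is illustrative.

def max_gdpr_fine(annual_global_revenue_eur):
    return max(0.04 * annual_global_revenue_eur, 20_000_000)

# For a firm with 40 billion euros in revenue, the revenue-based cap wins:
print(f"{max_gdpr_fine(40_000_000_000):,.0f} euros")  # 1,600,000,000 euros
```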
With companies thus constrained, it stands to reason that EU consumers will not be exposed to as many products and services as consumers elsewhere. But at least their data will remain private. “There’s going to be some kind of consumer welfare loss,” Wu said. “At the same time, maybe people care about privacy more, so we have to think about the costs and benefits of doing that.”
In a recent interview with CNN, Zuckerberg said Facebook is open to regulation. “I actually think the question is more what is the right regulation rather than” whether the industry should be regulated at all, he noted. For example, he said there is plenty of regulation of ads on TV and in print. “It’s just not clear why there should be less on the internet. You should have the same level of transparency required.” He said Facebook is proactively rolling out tools that let users know who bought the ads and whom they’re targeting.
Wharton marketing professor Gideon Nave said Facebook has become the symbol of personalized targeting even though other companies are doing similar things. Moreover, personalized targeting has been around for ages; what is new is the improvement in campaign efficiency. “If you publish an ad in a golf magazine based on the demographic and psychographic characteristics of people who like golf, you’re targeting, too,” he said. The advent of the internet and platforms like Facebook and Google “gives us a capacity to do it in a better way because we can target each person individually,” Nave added. “But the idea itself is not new.”
The Facebook imbroglio inflamed public sentiment because it exposed how people could be manipulated. “People don’t like to feel they’re being manipulated,” Nave said. “In reality, we are manipulated all the time from the moment we’re born. We’re manipulated to like specific brands of soft drinks and computers” even though inside the package, the products may be almost identical to each other. “Advertising makes you focus on things [such as emotions] that you will associate with specific products,” he said.
In the long run, Nave believes Facebook will weather this crisis. “People forget about the past and many people still don’t care that much about privacy,” he said. “Facebook’s API policy that had allowed third parties such as Cambridge Analytica to exploit its data has already been changed years ago, and at the time nobody seemed to care,” Nave pointed out. Millennials also have a more relaxed attitude about data privacy since they’re used to giving out personal information to get free services. Indeed, 40% of people choose to keep their Facebook likes public. “It’s possible a few weeks from now everyone will forget about it,” Nave said. “It could be temporary.”