'Act Now, Apologize Later': Will Users 'Friend' Facebook's Latest Intrusion on Privacy?
Published: May 12, 2010 in Knowledge@Wharton
With more than 400 million users, social networking giant Facebook theoretically has a lot to lose if it sidesteps privacy without consumers' consent.
But each time the company introduces new features that make users' personal information available to an increasingly wide circle, the site continues to grow. In 2007, for example, it introduced "Beacon," a system that tracked users' online purchases, reported them back to their profiles and made the information visible to their friends. The goal was to create more opportunities for targeted advertising. The problem: Facebook never asked users for permission to record and publicize their browsing activities. Following a groundswell of resistance from its member base, Facebook CEO Mark Zuckerberg issued an apology and the company shuttered the service.
Flash forward to today, and Facebook -- now the largest social networking site -- has once again become the center of a debate over online privacy as the company pushes norms and raises a few hackles along the way. This time, industry observers are eyeing its newly announced "Open Graph" plan -- an attempt to link Facebook users to other parts of the web by sharing their "likes" and other activities across a number of different sites. Days after Zuckerberg unveiled Open Graph at Facebook's f8 conference for software developers on April 21, privacy groups banded together and petitioned the Federal Trade Commission for an investigation.
Experts at Wharton say that despite vocal opposition, Facebook is increasingly defining the parameters of online privacy through new features and its ever-changing policies. "Facebook's approach is to 'act now, apologize later,'" notes Wharton legal studies and business ethics professor Kevin Werbach. "It has repeatedly pushed the envelope on privacy, sometimes clearly going too far." This time, will the company ultimately face a backlash from users and regulators?
The 'Open Graph'
According to Zuckerberg, Facebook's latest initiative will put "people at the center of the web" by connecting them with sites like Pandora, an Internet radio service; Yelp, a local business review and referral network; and content-driven sites like CNN and IMDB, the Internet Movie Database. All it takes to make these connections is clicking a "like" button on a site while logged in to Facebook. Anyone with a Facebook account can visit Pandora, for example, and see what songs their friends are listening to, or check CNN to read stories their friends liked. The benefits to partner sites that enable these "like" buttons are obvious: increased traffic from recommendations across Facebook communities. And, according to Wharton experts, Facebook benefits by becoming the center of what's known as the "social web," where status -- and any potential monetization -- is based on personal connections and recommendations rather than search engine results.
In contrast to the bumpy Beacon launch, Facebook argues that this time it has found the right balance between being useful to customers, linking together social profiles and gathering data that could be valuable to advertisers. Facebook's Open Graph approach features plugins that can be included on any site so users can "like" or recommend content, and personal data is controlled by Facebook rather than shared with the partner site. However, a user's recommendations show up as public information on his or her Facebook profile. Users can change their privacy settings to determine which recommendations are visible. Nonetheless, critics such as the privacy advocacy group Electronic Frontier Foundation (EFF) say Facebook makes it hard for users to restrict the information they share.
"Facebook's privacy settings are very confusing," says Peter Fader, a marketing professor at Wharton. "And the fact that Facebook changes frequently makes it worse. Due to [the potential for] monetization, Facebook prefers users opt out. It's better for the user [to be able to] opt in. The controls could be much clearer."
Experts at Wharton are quick to note that, in the past, Facebook has been able to push features and change privacy settings without much impact on the site's meteoric growth. While some bloggers and commentators closed their Facebook accounts in response to the recent changes, Facebook's continued expansion makes it unlikely these protests will have much impact on the company. "It's doubtful whether a user boycott would be successful," notes Kendall Whitehouse, director of new media at Wharton. "If you managed to get a million people to close their accounts -- which would be quite a feat -- that only represents one-quarter of one percent of Facebook's user base" and would be unlikely to compel the company to change direction, he says.
Shawndra Hill, a professor of operations and information management at Wharton, says there is a cost to users when it comes to all of these privacy changes and new features: the time it takes people to understand what has changed and how it affects them. "What's important is that Facebook try to understand why people keep some things private. Facebook can't blindly make everything public."
Whitehouse says the risk to Facebook isn't users leaving in droves but, rather, how the company's seemingly cavalier attitude toward privacy could taint its image among advertisers and corporate customers. "The danger for Facebook is that it might become the next MySpace," he notes, adding that while MySpace still commands a large audience, the site isn't as attractive to many corporate and enterprise customers as Facebook or Twitter. MySpace is often eschewed as a place to promote major brands, "unless you're a musician or a pop culture film," says Whitehouse.
In a blog post following his keynote at the f8 conference, Zuckerberg argued that connecting sites and services together seamlessly will have "profound benefits.... We're really proud that Facebook is part of the shift toward more social and personalized experiences everywhere online."
Privacy: A Moving Target
However, Wharton experts say there may be problems along the way. Facebook isn't only benefiting from the propensity of Internet users to share details of their lives; it is also defining what privacy means online. Meanwhile, Facebook has to balance user demands with those of advertisers -- the key to making money -- and the law.
"The challenge with online privacy is that people care very deeply about it, but have a difficult time articulating where to draw the lines," Werbach says. "It's difficult to set rules for services that no one has experience with over time and at scale. As the world becomes increasingly interconnected, many domains that were once private become public. On the other hand, that can't mean that the deep human values associated with privacy should be abandoned."
Hill says that users don't necessarily understand what they give up in privacy when they join Facebook. Meanwhile, Facebook's ability to share information outside of its network can be worrisome if users aren't schooled on what they should keep private.
In addition, Facebook is perceived to have an ethical responsibility to protect user privacy, but the "extent of legal responsibility is totally up in the air," says Andrea Matwyshyn, a legal studies and business ethics professor at Wharton. According to Matwyshyn, the larger problem is that no one -- legislators, users or social networks -- agrees on a common definition of privacy. "Until there is some consensus, companies will keep pushing the edge on privacy. Facebook is testing the waters."
In December 2009, Facebook changed its privacy policy to state: "Certain categories of information such as your name, profile photo, list of friends and pages you are a fan of, gender, geographic region, and networks you belong to are considered publicly available to everyone, including Facebook-enhanced applications, and therefore do not have privacy settings."
That policy change is what riled up 15 privacy groups, which in late April petitioned the FTC to investigate Facebook. The complaint revolved around "material changes to privacy settings made by Facebook that adversely impact the users of the service." Facebook "now discloses personal information to the public that Facebook users previously restricted ... and now discloses personal information to third parties that Facebook users previously did not make available," the groups said in their complaint. "These changes violate user expectations, diminish user privacy and contradict Facebook's own representations."
Facebook has maintained that it cares about privacy and about giving users control over their data. Werbach agrees, but says the definition of privacy is changing by virtue of users' own actions: Users may say they want everything private, yet share the most intimate details of their lives. Given that fact, he suggests, Facebook's approach of acting first -- and apologizing later if needed -- makes sense.
"Facebook has a culture of constant iteration and experimentation on its core service. It makes a lot of mistakes, but it's good at fixing those mistakes or further evolving quickly. That has been the story of all of Facebook's prior privacy intrusions, [including] Beacon," says Werbach. "From my perspective, Facebook cares a great deal about privacy, but it thinks about it differently than most privacy experts. Facebook doesn't start with privacy principles; it tests concrete user experiences on real users."
David Hsu, a management professor at Wharton, agrees that Facebook may simply be reflecting societal changes around privacy. "People have been acclimated to sharing more and more information," says Hsu. "It's a change in mentality that's provocative and problematic as privacy is redefined."
The risk in that approach, according to Werbach and other experts, is that it could eventually backfire on Facebook: Consumers could change their views of privacy and deactivate their accounts.
Werbach points out that these flare-ups have occurred before, but were resolved either with an apology from Facebook -- as in the case of Beacon -- or because users just became used to feature changes. "Facebook has generally been successful at recognizing the difference between sustained user outcry and uncomfortable adjustment to a new feature."
Matwyshyn notes that while consumers can control their privacy settings, it's almost impossible to keep up with Facebook's changes. "Things that are too confusing simply aren't used," adds Whitehouse, indicating that while Facebook's privacy controls are highly specific, their complexity effectively undermines their value.
The perception of control is important, however. "When I talk with my students about Facebook and privacy, they often come back to the issues of control and community," says Werbach. "As long as they feel they have control over their information, they are comfortable sharing it. And they think about Facebook as a distinct environment that largely overlaps with their physical communities such as the university and their families. Taking features such as 'likes' out to the public Internet breaks that 'magic circle.'"
Eric Bradlow, a marketing professor at Wharton, says that any concerns about the Open Graph are likely to blow over. Facebook is "thinking of clever ways to utilize the social graph" and users are likely to play along, because they have the control not to participate if they choose, he notes.
Fader agrees that critics are overstating the privacy concerns. Once everyone makes their likes, recommendations and affiliations public, worries about that information being public will fade, he says. However, he believes the site's convoluted opt-out procedures are ultimately detrimental. "Facebook does make it hard to opt out. It should be totally upfront about this stuff and make it easier to change your settings. Facebook should be bending over backwards to do that." Why? "It's just good business," says Fader. "Facebook has created a deep connection with the customer base that exceeds everybody else's, [including] Google and eBay. While that may give Facebook carte blanche, there's no gain in abusing it."
High Stakes, High Profits
Experts at Wharton say that Facebook's Open Graph initiative -- as well as the privacy debate it has sparked -- boils down to the balance between making money through targeted advertising and keeping users happy. "Eventually, [Open Graph] can lead to recommendations and targeting, which allows for Facebook to monetize" the user information it has, says Bradlow. Hill agrees. "Facebook has the best data by far in social networking. [It] has geographic information, behavior, the sites you click on and groups you like. That's powerful for target marketing."
For now, Facebook has not indicated how it plans to use any data collected through the Open Graph. Werbach predicts that the social web will have a big impact on advertising, but that it may take time. "Going from social graphs to effective advertising in practice is harder than it seems, and it raises a whole additional layer of privacy concerns," he says. Many advertisers are still in the experimental stage with social media, and they don't want to be caught up in privacy flaps.
Matwyshyn notes that Facebook is currently aggregating a massive amount of user data, but that doesn't necessarily mean that advertising is the Holy Grail. "The master plan may be to aggregate information and develop big revenue streams in data resale," she says.
Whatever business models Facebook develops, experts at Wharton suggest that the company may not be invincible in the long run. It's possible that privacy problems, user backlash and the need to generate revenue will create a toxic stew that erodes trust. "People stay with Facebook because they feel locked in, but they may lose trust over time," says Matwyshyn. "It could be an ideal time for a competitor to come in and harness that trust deficit."