Does the Internet empower consumers? Or does it make them more vulnerable to manipulation by companies and potentially the government? While both statements might be correct, the balance tilts decidedly toward the latter, according to Joseph Turow, a professor of communication at the University of Pennsylvania’s Annenberg School. In his book, The Daily You: How the Advertising Industry Is Defining Your Identity and Your Worth, he argues that the advertising industry has launched “one of history’s most massive stealth efforts in social profiling.” The result, he says, is an increase in intrusive practices that are eroding traditional publishing ethics. Does the solution lie in greater self-regulation by advertising firms or more aggressive oversight by the government? Turow addressed this question and others in an interview with Knowledge at Wharton.
An edited transcript of the conversation follows.
Knowledge at Wharton: At the beginning of your book, you say that the advertising industry has launched what you call one of history’s most massive stealth efforts in social profiling. How is this happening, and what are the implications for consumers?
Joseph Turow: A lot of it is happening because companies feel that they have to know a heck of a lot more about their consumers, the users of their products, than they ever have before. As a consequence, there are companies that are prepared to follow consumers wherever they go — try to anyway — and then to share the data anonymously or not anonymously with publishers and marketers. The consequence is the kind of new environment that we are seeing in the media and among advertisers.
Knowledge at Wharton: Historically, if we look at the rise of the media agencies, what were some of the under-the-hood forces that led to this environment of digital intrusiveness?
Turow: Mainly it has to do with the transformation of the media-buying industry. The media-buying industry is really a subset of the advertising industry. Historically media buying was a straightforward activity. The center of advertising was in creating the ads, and then the idea was, where do we go to put those ads? Buy time? Buy space? It was pretty straightforward in a country like the United States where you went to television, magazines, newspapers and maybe billboards.
With the rise of cable television, you had the beginning of the multiplication of channels and the question of how do we try to reach people with different demographics across 150 channels, plus other new media coming online. Then with the rise of the Internet, that multiplied almost infinitely. By then, a whole new set of companies — media planning and buying companies — that had been spun off from the major advertising agency conglomerates as stand-alone profit centers had emerged. Those companies are leading the march into this new world.
Knowledge at Wharton: What are some examples of the kind of intrusiveness that you are talking about?
Turow: The kind of intrusiveness I’m talking about has to do with, in the first instance, following people around the web, even around a site, for example. What is it that you do on a website? How does that relate to what we know about you demographically? Beyond that, what is it that you do across a variety of websites? And then beyond that, what is it that you do connecting your laptop activities with your mobile activities with your in-store activities? All of that begins to create a profile of you. Much of it today is thin, although increasingly we are talking about hundreds of data points. A lot of it today is rather transient. If people get rid of their cookies — and a lot of people don’t — they won’t be able to be followed as well.
But increasingly we are in a situation where hundreds and hundreds of data points about people, across a whole variety of aspects of their being, together with lots of ways to track them across sites and across platforms, are emerging. It is already having implications for the ads we see, for the discounts we get, for the world views that we get in terms of marketing and advertising. But it is also beginning to affect what we see in news and entertainment. It is having enormous ramifications for the so-called legacy media.
Knowledge at Wharton: How so?
Turow: What’s happening is that the legacy media — and by that, I mean traditional magazines, newspapers, particularly print media — are finding that as they lose audiences who are migrating to the web, and particularly in an economic downturn, advertisers are following them to the digital environment. What we see, however, is that advertisers are not sticking with those same platforms. They are moving into other platforms, and the competition is so large that while they used to get tens of dollars per thousand people in, say, newspapers and magazines, now we are talking about a couple of dollars and even less. As a result, magazines and newspapers that used to make a lot of money on those individuals in the legacy world are having trouble surviving even with double or triple the audience in the digital world.
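To make the arithmetic behind that squeeze concrete, here is a rough, hypothetical calculation; the audience sizes and CPM rates are invented for illustration and are not figures from the book or the interview.

```python
# Hypothetical illustration of the CPM squeeze described above.
# All figures are invented for the example.

print_readers = 1_000_000      # legacy print audience for one title
print_cpm = 20.00              # "tens of dollars per thousand people"

digital_readers = 3_000_000    # triple the audience online
digital_cpm = 2.00             # "a couple of dollars and even less"

print_revenue = print_readers / 1000 * print_cpm        # $20,000 per placement
digital_revenue = digital_readers / 1000 * digital_cpm  # $6,000 per placement

print(f"Print placement:   ${print_revenue:,.0f}")
print(f"Digital placement: ${digital_revenue:,.0f}")
```

Even with three times the readers, the digital placement in this example brings in less than a third of the legacy revenue, which is the survival problem Turow describes.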
Knowledge at Wharton: Which are the companies you think are most responsible for some of these intrusive practices?
Turow: What we see are companies that have to do with tracking and managing data. Some of the biggest companies, like Acxiom and Experian, have data about virtually everybody, at least in the United States. If you go to the Acxiom catalogue, which is online, you can see that they sell enormous amounts of information about us, sometimes with actual names. You can find out what kinds of health aspects people search for. You can find out what kind of houses people have, what kind of financial activities they do, what kind of psychographic profiles they have, what vacations they take.
Then there are companies like eXelate, 33 Across, X+1 that are more niche companies in the digital realm. What they do is they track people and create cookies about people, often anonymously, that can be merged with marketers’ cookies about those same people. Sometimes the data flows across these sorts of companies. Many companies now are looking at social relationships, for example. So you add social relationships to demographics to offline activities to shopping activities to lifestyle profiles. All of these things begin to create images of who we are — essentially, reputations.
What I try to do in the book is to show that we are at the beginning of this era, certainly not even the middle or the end. But if you follow the logic, you see how this has the possibility of really creating a situation where we live in these silos, these areas of life based on how marketers define us without knowing anything about where they get their information and even that it is going on, except this vague notion that it is happening.
Knowledge at Wharton: You use a term in the book: “long click.” What exactly is that, and what implications does that have?
Turow: The long click is sort of the holy grail of advertisers today. It is the idea that we can begin to follow a person from the top of the marketing funnel — although we could discuss whether that really exists or not in the way it’s discussed — from the time a person begins to look for a product through the various online and offline platforms that that person uses to the point where that person swipes a credit card or writes a check. Then we can try to attribute that person’s activities through the entire process of searching and eventually purchasing. Knowing that, we know a lot more about what that person does and who that person is, and we can maybe generalize to other people we think are like that. That’s the holy grail. It also justifies the monetization of certain media much better than ever has been done before. As I say, it’s a holy grail, but technologies are developing to try to do that.
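One way to picture the “long click” is as an attribution problem: stitch a person’s touchpoints into a single timeline that ends in a purchase, then decide how much credit each step gets. The sketch below shows two simple crediting rules; the events, channels and amounts are invented, and real systems depend on far more data and on probabilistic matching across devices.

```python
# Simplified "long click" attribution sketch; all events and amounts are invented.
from datetime import datetime

touchpoints = [
    {"time": datetime(2012, 3, 1, 9, 0),   "channel": "search ad"},
    {"time": datetime(2012, 3, 2, 20, 15), "channel": "display ad"},
    {"time": datetime(2012, 3, 5, 12, 30), "channel": "email offer"},
]
purchase = {"time": datetime(2012, 3, 5, 12, 45), "amount": 120.00}

def last_touch(touchpoints, purchase):
    """Credit the whole sale to the final touchpoint before the purchase."""
    prior = [t for t in touchpoints if t["time"] <= purchase["time"]]
    return max(prior, key=lambda t: t["time"])["channel"]

def linear(touchpoints, purchase):
    """Spread the sale's value evenly across every touchpoint on the path."""
    share = purchase["amount"] / len(touchpoints)
    return {t["channel"]: share for t in touchpoints}

print(last_touch(touchpoints, purchase))  # 'email offer'
print(linear(touchpoints, purchase))      # $40 credited to each channel
```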
Knowledge at Wharton: What are some of the technological innovations that the advertising and media industry had to go through in order to get the capability to do some of the things you are describing?
Turow: The first one that we often take for granted in this ballgame is the cookie. The cookie was created as a very conscious effort to try to help marketers, and actually online stores, track people’s purchases when they were buying more than one thing. The people who created the cookie really had some philosophical issues. How do we think about privacy in this? In the mid- and late-90s when all this was going on, they created a certain kind of regimen which was more beneficial, some would say, to marketers than to people who care about privacy. But the cookie is only the beginning.
Together with the cookie, we had the development of probabilistic statistical thinkers and computer scientists who have tried to figure out how we can get a hold of what it means to follow an individual across a whole variety of spaces. Another, of course, is search. With the rise of Google, we have a new marketing model having to do with the notion of the click, which is controversial. The click itself is a fascinating technology. It is really the extension of a human being’s hand. People say, “Well, do we really want that to be the model of what it means to sell a product?” Because display advertising gets hit by that. Now there’s a push back in terms of display and an attempt to try to figure out ways to model engagement as it relates to display.
All of these ideas are taking place with a whole new group of people who are thinking about this. Some of them worked on Wall Street and worked on derivatives, and now they are moving into Madison Avenue, and they are trying to understand how to bring big data and new technologies to the question of how we define people. It’s totally understandable. My concern about it is that it is also totally under the hood, and we as a society really have no idea about what’s going on, what control we have and down the line, what the implications are going to be.
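For readers who want to see the underlying mechanism, a cookie is just a small labelled value a server asks the browser to store and send back on later visits, which is what lets a site, or a third party embedded on many sites, recognize the same visitor again. The toy server below, built on Python’s standard library, shows only that recognition step; it is a sketch, not any ad platform’s actual code.

```python
# Toy server showing how a cookie lets a site recognize a returning visitor.
# A sketch only; the header names are standard HTTP, everything else is invented.
import uuid
from http.cookies import SimpleCookie
from http.server import BaseHTTPRequestHandler, HTTPServer

class VisitorHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        cookies = SimpleCookie(self.headers.get("Cookie", ""))
        self.send_response(200)
        if "visitor_id" in cookies:
            # The browser sent the ID back, so the visitor is recognizable.
            body = f"Welcome back, visitor {cookies['visitor_id'].value}"
        else:
            # First visit: ask the browser to remember a pseudonymous ID.
            visitor_id = uuid.uuid4().hex
            self.send_header("Set-Cookie", f"visitor_id={visitor_id}")
            body = "First visit: setting a cookie"
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(body.encode())

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), VisitorHandler).serve_forever()
```

Third-party tracking reuses the same handshake: when the server setting the cookie is an ad network embedded on thousands of pages, that one identifier follows the visitor across all of them.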
Knowledge at Wharton: Let me just explore this a little bit further with a very specific example that I just remembered. The other day I was on Amazon.com and I bought a gift for a friend. It was a book on Buddhist meditation. The next time I went to Facebook, suddenly I started seeing ads about the Dalai Lama’s videotapes. Now, on the one hand, this could be an example of what you were just describing — the long click — where someone, somewhere saw my behavior on one website, Amazon, and is now presenting commercial messages to me on Facebook. But could it also not be argued this is something that might benefit me if I were genuinely interested in this topic?
Turow: No question.
Knowledge at Wharton: So how do you resolve that in your book?
Turow: Well, the way I try to resolve it is the following. I’m not against target marketing. I’m not against following people online for reasons that you suggested. What I am against, if we want to call it “against,” is it happening without people understanding it and without their giving permission for it. In the end, I would argue it hurts marketers and advertisers.
Fundamentally we are moving into a society where data about us are going to be continually traded in ways that have not happened at this scale before. That is what we started with, that question of collecting information about people. There has never been a time in history where individual bits of data about all of us are circulating the way they are now…. The question becomes, who gives you the right to define me and to surround me with ads and discounts and somebody else with very different ones? Some people are going to lose, other people are going to win. Some people are going to be called “waste,” which is a term they use. Some people are going to be targets. As a society, we have to come to grips with that because eventually people are going to say, “Hey, you showed me a different map of the airplane for seats than you showed the guy next to me. As a consequence, I ended up having to pay $25 for an aisle seat and you didn’t. Why is this going on?”
Right now people are relatively calm about this sort of thing. But eventually, we may have Occupy Madison Avenue the way people had Occupy Wall Street. People are going to get their hands on the idea that something is amiss here because we are being defined by forces we have no clue about.
Knowledge at Wharton: You also argue in the book that some of these advertising models are destroying traditional publishing ethics.
Turow: Yes.
Knowledge at Wharton: Could you give any examples of that?
Turow: Magazines and newspapers used to have this idea of the church/state divide. The term was coined by a famous American publisher, Henry Luce, who owned Time Inc. The notion was that the business side should not even speak to the editorial side. With hyper-competition, particularly in the print area, and with the notion that somehow the web is fundamentally different, you have to make money in advertising, not subscriptions. Although companies, as you know, are trying to get money that way, that divide has fallen apart. In much of the magazine industry, the notion now is how can we get advertisers to like us, and we will bring them in virtually any way we can. Now there are some magazines that are more welcoming than others, but the notion of what a magazine is has fundamentally changed that way.
For example, a major magazine company, Meredith, really sees itself as a marketing services company, and Hearst owns a digital advertising company. We are moving into a world where media are seeing themselves more as an extension of marketers than they ever have. Not that they never did, but we had a period in American history where advertising and media editorial were considered separate. Even in newspapers, they are beginning to see parts of the paper, particularly soft news, as areas where they can welcome advertisers and not get into trouble with journalistic organizations.
Knowledge at Wharton: That’s very interesting. If we believe that publishers are starting to customize news and information based on the characteristics that they think the advertisers want to see, what kind of privacy concerns do you think that raises?
Turow: I think it has to do in that case with the question of why do I get certain agendas of news and you don’t? Let’s say hypothetically that I do an analysis of your clicking behavior, and I find that when you look at optimistic headlines you are much more likely to put your mouse over an ad, or maybe spend more time with a display ad, than you would if we ran pessimistic headlines. It is only a little leap for me to decide that, from now on, I’m going to change the headlines that you see. I could say, “But that’s what that person wants.”
The same with television, which is going to move to an Internet addressable system. There have been attempts at this for the last 10 years or so, but eventually customization of television will exist to the point that when you turn your TV on you won’t see a channel, you will see an agenda, an intelligent navigator, as it may be called. I can very easily see that the agenda that gets set up is a combination of what you say you want and what the algorithms say you are likely to like, together with what some advertisers are interested in supporting. Maybe an advertiser will be likely to say, “I will pay for this movie if you want to watch it this week.” Your next door neighbor won’t get that.
Or Tiffany might say, “We know that your wife’s birthday is coming up. Here’s a romantic movie that we will pay for you to watch.” They will put product placements in there, seamlessly, specifically to your demographic. Other people who see it will get very different product placements. You will see a very different world at large than other people will. It’s not going to be totalistic, but in general, it could have very interesting effects on how you look at the world or reinforce your sense that you are being judged by society in certain ways and not in other ways. Fundamentally, it is a discriminatory activity. Some people gain by discrimination; other people don’t.
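The headline scenario Turow hypothesizes is, at bottom, a small optimization loop: log which framing a given reader responds to, then keep serving the framing with the higher observed click rate. A deliberately crude sketch of that loop follows; the click counts and headline variants are invented.

```python
# Crude sketch of the hypothetical headline-personalization loop described above.
# Click counts and headline wordings are invented.

click_log = {
    "optimistic":  {"shown": 40, "clicked": 18},
    "pessimistic": {"shown": 40, "clicked": 5},
}

def preferred_framing(log):
    """Pick the framing with the higher observed click-through rate for this reader."""
    return max(log, key=lambda framing: log[framing]["clicked"] / log[framing]["shown"])

def choose_headline(story, log):
    variants = {
        "optimistic":  f"{story}: why there is reason for hope",
        "pessimistic": f"{story}: why things may get worse",
    }
    return variants[preferred_framing(log)]

print(choose_headline("The economy", click_log))  # this reader gets the hopeful framing
```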
I also ought to point out that there are companies deeply involved in trending topics. And this, too, is affecting the media. For example, there is a company called Demand Media, which looks at what is trending in Google and Bing and has freelancers rather cheaply churn out videos and written stories on those topics, with the idea of blanketing the web and their own websites so that when you do a search, their articles will come up. That has enormous implications for what we mean by journalism, soft news and even eventually entertainment, and the kinds of things that advertisers want to put their own stories and their advertisements next to.
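Mechanically, the content-farm model can be reduced to a short pipeline: read what people are searching for, commission cheap copy on those exact phrases, and publish it optimized for those queries. The schematic sketch below illustrates the flow; the function names, queries and fee are invented and are not a description of Demand Media’s actual system.

```python
# Schematic content-farm pipeline; all names, queries and figures are invented.

def fetch_trending_queries():
    """Stand-in for pulling trending search phrases from an outside source."""
    return ["how to fix a leaky faucet", "best cheap laptops", "buddhist meditation basics"]

def commission_article(query, fee=15.00):
    """Stand-in for assigning a low-paid freelancer to write on the exact phrase."""
    return {
        "title": query.title(),        # title echoes the search phrase
        "keywords": query.split(),     # stuffed with the query's own words
        "cost": fee,
    }

def publish(article):
    print(f"Published '{article['title']}' targeting {article['keywords']}")

for query in fetch_trending_queries():
    publish(commission_article(query))
```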
Knowledge at Wharton: I think these are all areas of major concern, especially the content farms, as these are called. I think Google and some other companies have actually changed their algorithms to prevent this kind of thing from happening.
Turow: They’ve tried. Ironically, you know what company seems to have been hurt most? About.com, which is owned by The New York Times. The Times had been taking a lot of money from that. They probably wouldn’t call it a content farm, but they were really hit pretty hard as a consequence of it.
Knowledge at Wharton: There has always been this view that the web empowers consumers. From what you say in the book, it seems like that may not be exactly true. Do you think that the web empowers consumers, or does it make them more subject to manipulation by companies and the government?
Turow: I think it’s both. I think that in the most obvious way, the web empowers consumers. We can now walk into a store, look at our mobile devices if we have smartphones and find out what prices are elsewhere. Stores have gotten to the point where they will even allow people to go online in the store, with the idea that if they do not let them compare prices there, they will go somewhere else anyway. Those kinds of things, plus the ability to collaborate and to do a lot of things that some academics have emphasized, certainly show that the web is terrific, that the digital world is liberating in many ways. But what I was concerned about in writing this book is that people seem to stop there. What I am arguing is if you go under the hood of the Internet and see where the core power really is, it has to do with trying to take back what advertisers and marketers were afraid they were losing with the advent of the digital world, really take back certain power.
We see developing into the 2000s and nowadays an attempt to use what people do that they consider liberating and turn it to the advantage of marketers in ways that people might not be at all comfortable with. What we have found in a lot of the surveys we’ve done is that people have this vague sense of discomfort, a vague sense that they know what is going on, but they really don’t know much, for example, about data mining. They know stuff like that happens. But even if you go to websites that try to explain it, like Google, for example, they use one-point descriptions. They will say things like, “How do we do this? Let’s say we follow you across websites and find out that you are a football fan. Well, then we will send you ads about football.” That sounds so benign as to be stupid. Why would Professor Joe Turow care? But the point is they do a lot more than that. We’re talking hundreds of data points that are distributed across everything that you’re looking at.
Knowledge at Wharton: I wonder if we could end with some of the public policy implications. Do you think the solution lies in greater self-awareness and self-regulation by the advertising firms? Or more interventionist action by the government?
Turow: I think all of the above. I think that advertising companies have to realize that there are real responsibilities in today’s world. For a century, the advertising industry has basically gone on with the idea that it is a laissez-faire environment. Now it is quite clear that advertisers are shaping the media environment in ways that will have a major impact on society and also on our understanding of ourselves. For the self-protection of the industry, they have to understand that self-regulation has to be done seriously. There should be a public discussion and an industry discussion about what that means. There is, to some extent, but not deeply enough, I would argue.
At the same time, I believe that regulation has to take place because there are a lot of bad actors out there. And there are a lot of competitive forces. Companies like Google, for example, who said five years ago they would never do something are beginning to do it because now they are competing with Facebook and Twitter. In some ways, they should see public regulation and public discussion about these things as beneficial for the industry because if things are done the right way, they will move forward in ways that people can feel comfortable with and that the public can feel safe about. Right now I think both of those are highly problematic.