Are you sure it is a bad idea to quit a job? In Think Like a Freak, Stephen J. Dubner and Steven D. Levitt argue that we are often overly confident about what we think we know, and they recommend a way to think differently to solve problems and make decisions.

Wharton management professor Adam M. Grant recently interviewed Dubner about his new book when he visited campus as a guest lecturer in the Authors@Wharton series. In this interview, Dubner discusses why we should say, “I don’t know” much more often than we do.

An edited transcript of the conversation appears below. 

Adam Grant: Your books have been fascinating to read. They have created an international explosion. Your latest is Think Like a Freak. What motivated you to write this one?

Stephen J. Dubner: I am a journalist, and [before Freakonomics] I had written an article about Steve Levitt and his strange brand of economic research. I was working on a totally separate book about the psychology of money…. I was interested in what we now call behavioral economics.

I wrote about Levitt. Then someone decided that it would be a good idea if we teamed up. We did, and we wrote Freakonomics, which was very successful. We didn’t plan on it being successful…. Then we thought, “Do we want to do another?” We took about two years to decide if we did and if we could come up with good new material for a second one, which we did. Then for a third one, we were pretty sure we weren’t going to do another because we just didn’t want to milk it, contrary to the wishes of our publisher and agent. Anytime someone sees a franchise presented to them, they want to take it and exploit it. We had slightly different incentives. We felt like we had profited and been really lucky to get to that point. We didn’t want to exploit it unless we had material that we were really proud of. Again, it took us a couple years to come up with a framework for a different book, and that’s this third book, Think Like a Freak….

We hear from people a lot — emails mostly, which is great. Of all the things that the digital revolution has produced, one of the coolest, simplest ones is that you can now contact people who write books that you read. You used to have to write a letter to the publisher and hope they passed it along, which they never did. We hear from people with all these problems and questions and queries about the way the world works. We couldn’t answer them all. It’s hard. To answer one email could take — forget about one day — months of research.

Rather than trying and failing to answer a shard of those questions, we thought, “What if we could write a book that deputizes the entire world or whoever wants to think like we do?” [We wanted] to develop a set of rules, a blueprint for problem solving. It’s not always problem solving, but that’s mostly what we try to do. That’s what this book is. It’s meant to be a fun, engaging, practical way to think about the way the world really works, [to] think about the way incentives really work, and the way that people really respond to incentives rather than how they say they might. Then, if you’re trying to solve a problem — big or small — in business or government or in your own family, you can maybe slightly increase your chances to actually solve [it]. That’s the idea.

Grant: Well, you certainly accomplished those goals. You start with the premise that there are three words that all of us should probably utter more often than we do, which are, “I don’t know.” Where did that come from?

Dubner: Well, I think that came primarily from the fact that Steve Levitt, my co-author, lives in the world of academia, where you are. I’m a writer. I’ve been a journalist for my whole adult life. And neither of us would have a job if we pretended we knew all the answers all the time. The whole premise of what I do as a journalist is to go find people who know things that are interesting or worthwhile or hidden and ask them about [them], to try to find out. So, you have to acknowledge what you don’t know.

“We’ve been conditioned to think that quitting is a failure, a form of failure. How do we know that that’s true?”

[G]ood academic research — like good medical research, like good physics or engineering research — is trying to figure out questions where you don’t yet know the answer. Once you come in with that mindset, you’re going to have a different approach. You’re going to acknowledge what you know, which may not be very much, and what you don’t know. Then, in order to try to figure out what you need to know, you’re going to develop a framework for experimentation, gathering feedback and so on.

Now, as totally ridiculously obvious as that sounds — what I just said — there are huge quadrants of modern society, particularly in business and in government, where people are constantly pretending they know the answer to a question or the solution to a problem. And I get it. I understand the way the incentives work. I understand the way reputation works. Nobody wants to be the ignoramus or the dummy. If I’m a politician and someone says, “Governor Blah Blah, Senator Blah Blah, we just had this terrible mass shooting at a school. If you could do anything — if all options were available to you — what would you do to prevent that in the future?”

The way the world works is, [the politician will respond], “I’m gonna tell you. I’m gonna do these three things, and that’s what will do it.” [But if you follow up with the question:] “Do you have any evidence? Is there any empirical reason to think that that actually would work?” Often, I hate to say it, [the answer is] no. You see that in certain realms — in politics and in business, where the incentives are different. There’s a big incentive to get it right in business, but there’s also a lot of, for lack of a more sophisticated term, peer pressure to be the gal or guy who knows, who has the plan.

A really basic rule of thumb or a basic MO that happens very frequently now is that a firm will say, “We need to come up with a plan or a solution. Let’s get our top 20 people together in a room for an hour” — that’s 20 person-hours — “and let’s come up with the best one, the best idea, and then put all our resources into that and go.” What are the odds? If this were science, what are the odds that that would bear a good result? Almost none.

Then there is the counter-example of something like a Google, which lets its engineers take 20% of their time and work on their projects on the side — the idea being, have a lot of ideas, most of them will be bad, but let the triage process work and let people figure out through scientific or empirical ways how they can really learn stuff. Then, once you have done some experimentation and some small-scale work, then maybe put some resources behind it.

That’s something that I think business needs to do much better. But I think many businesses are moving in the right direction. The digital revolution helps that so much because it’s now so easy and cheap to gather data and do A-B testing or A through Z testing to tell you what’s actually working. 
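
A minimal sketch of what such an A/B test can look like in practice, assuming Python; the variant names and conversion counts are entirely made up, and the two-proportion z-test is just one standard way to check whether a difference is likely noise, not anything prescribed in the interview or the book:

```python
# Hypothetical two-variant A/B test: did variant B "actually work" better than A?
# All numbers are invented for illustration.
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Return the z statistic and two-sided p-value for the difference in conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under the null hypothesis
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))  # two-sided normal tail
    return z, p_value

# Variant A: 120 conversions out of 2,400 visitors; variant B: 156 out of 2,400.
z, p = two_proportion_z_test(120, 2400, 156, 2400)
print(f"z = {z:.2f}, p = {p:.4f}")   # a small p-value suggests B's lift is unlikely to be noise
```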

Grant: Do you have favorite tests that you’re seeing recently that represent this revolution in a positive direction, as opposed to the bad decisions we can all name that should have been based on evidence but weren’t? Are there any standout examples?

“When people quit something that they were generally really worried about quitting, their lives tend to get a little bit better. Even if they didn’t get a lot worse, you might argue that it’s a pretty good bet.”

Dubner: I’ll tell you one example…. I did some reporting on it a few years ago. I have no idea how well it’s working out. I like the idea because it’s the federal government doing it, and the federal government has typically been really bad at this kind of thing — I mean, they are the worst. If you think about it, it makes sense why. They sit at the top, theoretically, in some ways — with the 50 state governments and all those municipal governments under them. So, they are not really in a position to go micro. I understand that.

But what they did with this Race to the Top program in education I thought was a really good idea. Again, I don’t know how well it’s going to work out, but they set up, first of all, a contest, which means that there are incentives that presumably are going to work better than no incentives or better than some kind of negative reinforcement that we’re used to. Arne Duncan, the Secretary of Education, and President Barack Obama said to all the states, “Hey, we need to think of ways to improve or rethink our education system.” Believe me, I could talk about that for years because education is such a complicated box, with so many inputs and so many outputs. It’s really easy to look for magic bullets: Pay teachers more, or get rid of the unions, or shrink class sizes. Everybody likes those magic bullets.

But it’s a very complicated scenario. The Department of Education said to all 50 states, “Each of you, we want you to try to come up with a good program, a good idea, a good solution that works. If it works, we will pay you for it, and there’s a good chance then that we’ll scale it up and we’ll standardize it.” That’s the right kind of thinking. Think small. Don’t pretend you know the answers. Experiment, get feedback. These are all the premises of Think Like a Freak, really.

Grant: You have some fascinating examples in the book that probably stretch beyond what most readers would themselves be willing to do. For example, you actually got people to agree to let you randomly assign them to do things like ask for a raise or quit their job or even break up with their significant other. What was the logic behind that?

Dubner: This came about because of a podcast episode. We do a Freakonomics Radio podcast and public radio show. We did an episode that I loved. It was just a great topic because it’s a blend of data and empirical thinking with narrative storytelling, which is my tradition. It was called “The Upside of Quitting.” It was making an economic argument to some degree, which is this: most of us have been conditioned not to quit. We’ve been conditioned to think that quitting is a failure, a form of failure. How do we know that that’s true?

“If, for five minutes, you spend some time thinking about the sunk cost and thinking about opportunity cost, then you can really get to different places.”

If you think about a project, a job, a war, a relationship…, you could quit [them], but because of sunk costs and because of peer pressure and because of your own moral position, you might not want to quit. We tried to look at what the upside of quitting is. We argue that there’s a significant upside and that people are really bad at estimating opportunity costs — what they could be doing if they [did] quit and so on.

But the fact is, it’s really hard to get data on this because it’s not like you can go into one big school district and say, “I’m going to take a thousand kids, and I’m going to totally mix them so that their grades are equivalent on either side, and I’m going to randomly force half of them to quit school. Don’t allow them to go back to school. Then 10 and 20 and 30 years later, let’s see how their lives turned out.” That’s one way you might do that experiment, but of course, we couldn’t do that.

The people who quit school tend to be a very different population than the people who don’t quit school. So, comparing them afterward is not an apples-to-apples comparison. So, what we came up with was a website called “Freakonomics Experiments” [for people who] had a decision to make…. “Should I quit my job and go back to grad school?” “Should I join the military or stick with my job?” “Should I leave my boyfriend or girlfriend or husband or wife?” “Should I get a tattoo or not?”

If they had a decision, and they were really, truly on the fence, then we offered to help them out and flip a coin for them. All we asked was that they fill out a survey beforehand telling us about it and that they then tell us whether they followed the coin — because we have no power to make them follow the coin. [We said] we would follow up with them and do research later to find out what their outcomes were.
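
A rough sketch of the mechanics of that kind of coin-flip study, assuming Python; the compliance rate, well-being scores, and field names are hypothetical stand-ins simulated for illustration, not the real Freakonomics Experiments data or analysis:

```python
# Illustrative coin-flip experiment: people on the fence are randomly told to quit (heads)
# or stay (tails), report whether they actually followed the coin, and are surveyed later.
import random
from statistics import mean

random.seed(42)

def simulate_participant():
    """Simulated record; in a real study these values would come from the surveys."""
    assignment = random.choice(["quit", "stay"])        # the coin flip
    followed = random.random() < 0.6                    # no power to force compliance
    baseline = random.gauss(5.0, 1.0)                   # e.g., self-reported well-being, 0-10
    effect = 0.5 if (assignment == "quit" and followed) else 0.0  # assumed benefit of quitting
    followup = baseline + effect + random.gauss(0, 1.0)
    return {"assignment": assignment, "followed": followed, "change": followup - baseline}

panel = [simulate_participant() for _ in range(1000)]

# Intention-to-treat comparison: group by what the coin said, not by what people did,
# so self-selection into following the coin doesn't bias the comparison.
for arm in ("quit", "stay"):
    changes = [p["change"] for p in panel if p["assignment"] == arm]
    print(f"{arm:>4}: n={len(changes)}, mean change in well-being = {mean(changes):+.2f}")
```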

Many different categories, a variety of outcomes, and the research isn’t done yet — but the short answer is that when people quit something that they were generally really worried about quitting, their lives tend to get a little bit better. Even if they didn’t get a lot worse, you might argue that it’s a pretty good bet. We should all consider quitting as a really good option. But it’s hard when you have [in your head] the words of Vince Lombardi, “A quitter never wins and a winner never quits,” which wasn’t actually Lombardi originally. And Winston Churchill telling people, “Never, never, never — in nothing, great or small, large or petty — never give in.”

You have these great people, and that gets in your ear and it convinces you that, “Oh man, if I start a project, I have to see it through.” But if, for five minutes, you spend some time thinking about the sunk cost and thinking about opportunity cost, then you can really get to different places. That’s what we were trying to accomplish there.
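
As a toy illustration of that five minutes of thinking, here is the decision rule spelled out in code: ignore what has already been spent and weigh only the future value of staying against the best alternative. Every number and name is invented for the example.

```python
# Toy decision rule: sunk cost is deliberately ignored; only future payoffs matter.
def should_quit(expected_value_if_stay, expected_value_if_quit, sunk_cost):
    """Compare futures only; the sunk cost is unused because it is gone either way."""
    _ = sunk_cost  # already spent whether you stay or quit, so it shouldn't sway the choice
    return expected_value_if_quit > expected_value_if_stay

# Two years and $50,000 already sunk into a struggling project:
print(should_quit(expected_value_if_stay=10_000,
                  expected_value_if_quit=40_000,   # the opportunity cost of staying
                  sunk_cost=50_000))               # True: quitting looks better
```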

Grant: In closing, other than saying, “I don’t know,” through the whole process of working on Freakonomics the book, the radio show, the podcast, the movie, what’s the biggest lesson you have taken away about how to think like a freak?

“Being right doesn’t win that many arguments, weirdly enough. There are a lot of people who are right about a lot of things who don’t get their way.”

Dubner: This is more of a philosophical answer than a tactical or strategic answer. To me the challenge is always going to be the blend between the empirical or scientific or data — whatever you want to call that — and the intuitive or the human or the humane — whatever you want to call that.

What I mean by that is, especially in this era of big data, … we believe that if you get a pile of data representing a million decisions, that’s better than asking three people what decisions they made. While I very much believe that to be true, and I very much applaud the instinct for all of us to work with data in aggregate to distill the biggest truths, I also know that we’re humans and that …we’re biased in a lot of ways.

Even if you could tell me or I could tell you the most foolproof strategic way to reach a decision or the best decision to make or the best strategy or the best set of numbers to embrace, there might be a lot of good reasons why you still won’t be successful. That’s because the people that you are now employing that strategy on, or the people that you’re now offering those incentives to, may not think about the problem the way you do.

That requires a lot of humility. That’s something that people in government, in business, in academia, in journalism — everywhere — people in all those fields aren’t really used to. … When we come up with something and we put it into play, we’re used to people snapping to and saying, “OK, we’re going to do this now.” That’s a lot of power. That’s a lot of authority. But with that power and authority comes the need for humility to understand that when you make decisions like that and put out incentives, whatever they are, big or small, governmental or not, there are people on the other end of that. The decision-makers don’t often think through very well how it affects [those people’s] lives or how they’ll respond to the incentives and so on.

And so, that to me is the balance: to be as scientific as you can while understanding that even if I can present 100 people with the science that says, “Hey, you should really do this,” 90 of them might have a really good reason for not wanting to do it. They might be wrong. I might be right. But it doesn’t mean I’ll win the argument. Being right doesn’t win that many arguments, weirdly enough. There are a lot of people who are right about a lot of things who don’t get their way.

That’s really the trickiest part. I’m working on a radio podcast episode right now about the flu vaccine — very, very simple. The flu vaccine is pretty effective — about 60% or so. Influenza, along with pneumonia, is always one of the 10 leading causes of death in the US, which most people don’t think about or don’t know. Yet, a lot of people who should get the flu vaccine don’t. Why?

It’s kind of a conundrum. We’re going through all these different layers of behavioral and PR and financial decisions to try to figure out how something as seemingly simple as this is so hard to accomplish. That is what I’m constantly reminded of: The smart money may be smart, but unless it can deliver on something that really changes everyone’s behavior, then it’s not worth that much.