HCER Sits Down with Richard Hanania
HCER staff writer Aden Barton sat down with Richard Hanania, a political scientist and public intellectual specializing in foreign policy and partisanship. His work on these subjects has been featured in the New York Times, The Washington Post, and Newsweek, in addition to other publications. He recently published his first book, Public Choice Theory and the Illusion of Grand Strategy. Below is an abridged transcript of the conversation, edited for length and clarity. Neither the Harvard College Economics Review nor the Interviewer necessarily endorses the opinions or viewpoints articulated within the Interview.
Let's start with foreign policy because that's where you got your start. What would you change about Biden's current response to the Russian invasion of Ukraine?
There’s a lot we don't know, obviously, about what's going on and what they're doing behind the scenes. To start with what they've done well, they've all but ruled out direct engagement with Russia unless something extreme happens. It looks like the President has created an expectation that we're not going to do that. I’ve seen some people say, ‘well, you have to leave the possibility open for game theoretic reasons.’ I think that's a simplistic model that gets things wrong, because it assumes some kind of strategy for the country as a whole.
I think you have to think about the practical implications, the internal politics of that. So, I think he's influenced our internal politics in a smart way. Even if, in an ideal world, you would leave it on the table, I think leaving it on the table is actually dangerous, given the public mood and the way elite discourse is going. So, that's been going well.
Nobody knows for sure what the Russians would accept. That was true before the war, and it's true now. You have to deal probabilistically, and you have to deal in reality. I like to think about what the endgame is and how we can get there sooner, to a war that ends more quickly, does less economic damage, and costs fewer lives. I think it's going to have to end with some kind of recognition that Ukraine has some kind of neutral status and that there will be some Russian territorial gains. Hopefully, for the Ukrainians, it's just Crimea and Donetsk and Luhansk.
Now, from what Zelensky has been saying publicly, basically everything else is on the table, but the Ukrainians have never shown any willingness to recognize Russia's territorial gains, even the ones it has held since 2014, which include Crimea. And that is very difficult. Maybe behind the scenes, the administration is pressuring Zelensky to do this. If they're not, I think they should be. I've always thought that ruling out NATO membership was probably the way to try to avoid the war before it started. Now, I don't know if that'll do any good. You're stuck in a place where there aren't a lot of good options now. That's why I was hammering hard on getting NATO expansion off the table, to at least address the Russian concerns. They didn't do that. So, now it's just bad options.
But I think the Biden administration is behaving reasonably, given what you'd expect. There’s a moral question about providing weapons to a conflict if doing so just drags it out and ends up killing more people. Now, the Ukrainians have performed better than people thought, so the weapons actually could end up saving their territory. Before the war, I would have thought arming them was less wise; I thought a Russian victory was all but inevitable, and maybe it still is. But if those weapons go toward unrealistic goals, like seizing back Donetsk and Luhansk and Crimea, then you've just contributed to the bloodshed for what in the end is going to be the same result.
Is your willingness to sign a negotiation that gives Russia those territories just a pragmatic consideration to reduce the risk of war? Or do you think there's any legitimate claim to places like Crimea?
Legitimacy is in the eye of the beholder. I tend not to be a big moralist in international politics. People had a referendum, and people say, ‘it was a referendum under the auspices of the Russians, so you can't trust it.’ But from what I see, most people who understand the region do think there's a lot of popular support for Russia in Crimea and also in parts of eastern Ukraine. So, do you care more about the technicality of international law, or do you care about what the people on the ground think? I don't think there's a right moral answer here, so it's pragmatic. It's not like there's a clear moral case for these things.
Turning to Harvard, if you were to decide Harvard's policies, both with respect to research and its admissions policy, how would you change how the university operates?
I haven't thought about Harvard specifically. With something like universities especially, I'm in favor of what I said about the market: experimentation and people doing what they want.
I'd have to think about what I am trying to maximize. Am I just trying to get the smartest people into Harvard? Am I choosing the people who are going to be elites and who are going to do the best things for society? That would be a different kind of thing. Am I trying to maximize the endowment of Harvard? So, I have to think about sort of what my goals are here.
Okay. Well, let me ask a little bit more specifically. How would you reorient the incentive structures of academic research to make them less bureaucratic, more innovative, etc.?
I think what Tyler Cowen is doing is really great. He has grant programs that tend to care about the person rather than the project. I think that's very important. Anyone can see that people differ in their energy, creativity, and willingness to take risks. Because Tyler's an economist, he really understands the concept of opportunity cost. A lot of these bureaucrats don't. This is a real problem with bureaucracy in a lot of different areas. So definitely, lower bureaucratization.
Some people have thought about lottery systems and at least testing how the lottery system works—that kind of experimentation would be good. I don't know if there's literature on this or not. But if there is, I would either look at it, or, since Harvard's big enough, they can do the research themselves.
And I think there needs to be real reckoning with what fields and what areas of study really aren't worth doing or are based on false premises. To the extent that social science is based on blank slate-ism, I would have no trouble cutting that off or reducing funding for fields or areas of study like that.
It is interesting. There are a lot of fast grant programs being launched; FTX, I think, just launched one as well. So, I'm happy to see that innovation in funding. Following up on that, I have been thinking recently about how I should interpret social science research with more scrutiny. If you see a random behavioral psychology article, or something more social science focused, how do you interpret that? The replication crisis in economics and psychology has made me very distrustful of a lot of studies. How do you interpret and approach academic findings right now?
Before I was a public intellectual, I wrote academic papers, and I read a lot of academic papers. So, I saw, so to speak, how the sausage gets made. Over time, I've changed the way I read academic papers. I skip the literature review now, because I know how it's done. Basically, you have this interesting study, and you say, ‘Oh, how can I find enough citations to make me look smart?’ A lot of this stuff is just fluff. It’s like clicking ‘I agree to proceed’ on a website. That’s how I see a lot of the literature review and the introduction. Sometimes it's useful, but often it's not.
I look at the results. There are a couple of things to keep in mind when you read the research. There's the stuff that most people who follow the social sciences know, which is the replication crisis. So, you look out for p-hacking, and you look out for weak results that are not very robust. Intelligent observers know that.
There are a few other things. You should be on the lookout for a paper using some very narrow result to explain some broader phenomenon. For example, I studied public opinion in international relations. This is pretty close to an actual example I saw. The paper was like, ‘Oh, when people are told that there's UN support for a war, they're more likely to support the war. So, this is probably why politicians try to go to the UN.’
There’s a lot you have to assume there, right? Okay, you did a study that's just an A/B test where people like a war more. But how many people even know which wars have UN approval? If a war doesn't get UN approval, can't the president just ignore that? Sometimes they don't get approval from the UN, so they’ll go to a Latin American equivalent, or they’ll go to NATO, which is basically just the US.
So, I doubt this result. I have no doubt about the survey; it makes sense, and it works. I doubt it explains much of anything about the real world. It's like the drunk searching for his keys where the light is: because we can do the survey, we think it must explain international politics. So, I would be on the lookout for that too.
You know, of course, be on the lookout for the ideology. I think the more an issue touches on ‘wokeness,’ the worse you should expect the p-hacking and the file drawer effect to be. So, if you find a result that the person could be fired for writing, that should increase the credibility of the results. And, if it's something that’s consistent with what they have to believe anyway, you put less credence on that. It's not fair from a scientific perspective to the people who do believe in the ‘conventional wisdom’. But you do have to discount things in that way because that's just the reality of what gets published and what doesn't.
You've written about this extensively in terms of how to treat experts and how to think about expertise. My worry a little bit with having that level of skepticism is that it places such a burden onto the individual. You've also written about what heuristics to use, and you've said to trust futures markets. In the absence of large-scale efficient futures markets, what would be the best signal? Would you go on Metaculus still, or listen to smart intellectuals?
It depends on a lot. It depends on what your job is, how much time you have to devote to an issue, how much you want to divide your time between various issues, and, frankly, how smart you are. If your IQ is 95, I'm going to suggest different heuristics from if your IQ is 130 or 140.
So, it's hard to give blanket rules to people. When I write for my audience, I assume it's a relatively smart audience. I assume I'm not talking to people with very low intelligence or who are uninterested in politics. I assume some kind of threshold level.
It’s probably worth thinking about what priors you should take with you out there. When it comes down to it, we're all operating off a few priors that are really doing the heavy lifting. To the extent that we're aware of that, questioning those priors, and thinking about whether they make sense, I think that has the highest return on your time and mental energy.
So, for example, in economics, my prior is that markets are better than central planning. If I want to know about some new stimulus bill that comes up, I could go read the 1,000 pages and try to track down every claim that every researcher makes, but that's not a good use of my time. The good use of my time is figuring out why I have this prior that markets are better than central planning, seeing if it's correct, and looking at the alternative evidence. If it is correct, I can have a pretty good view on the bullet points of the stimulus package, and the prior is broader: it helps you think about other things, too.
So, I think priors are unavoidable. Simplistic heuristics are unavoidable. Thinking for yourself on each issue or doing your research from scratch—that's not a realistic goal, no matter how smart you are, and much less if you're not very smart. So, acknowledging priors and questioning them—and making sure you have the right ones—is an important thing.
So, the lesson there is that you should spend more time scrutinizing your priors rather than a specific issue?
Yeah, I think so. If the issue is important enough, take your priors and do the work. But you know, generally, yes, that's a place to start.
I want to talk a little bit about your intellectual habits to finish out. First, where do you get your news from? And what weights do you place on those inputs from various sources?
That's a good question. I read the New York Times, The Washington Post, the Financial Times, and The Wall Street Journal. These are the big ones. People like to bash the mainstream media, but for what's actually going on, the pure facts, you still have to read them. They're biased towards Democrats over Republicans. They're super biased on the ‘wokeness’ related stuff. They're biased on foreign policy, in the hawkish direction from my perspective, but they're worth reading.
You can get a lot from Twitter. I prioritize how much I have to pay attention to any particular thing. The Russia and Ukraine war, I think, is sufficiently important, and it's close to my area of expertise, so every day I'm checking what's happening there and looking at the political developments. The bias here is so strong that you actually have to go to Telegram and get the pro-Russian accounts, because everyone in America has a Ukrainian flag in their bio. That’s great, but I'm going to distrust your objectivity if you have the flag of one of the participants in the war. So, sometimes you have to seek out information that is actually not very good, or super biased in the other direction, because what you're getting is biased the opposite way. You're trying to triangulate to see what makes sense.
For people who are interested, I've written about forecasting Ukraine and just understanding what's going on there. I think I have a Substack post or two on those topics that people could check out.
You mentioned actively seeking out people who disagree with you. What other heuristics would you use to maintain intellectual humility?
I think it’s good to understand that there are different communities and that some of these communities have better habits of rationality than others. They call themselves rationalists. The fact that they call themselves rationalists, and other people don’t say, ‘How come you get to call yourselves rationalists?’, shows that everyone has conceded that certain people are the rationalists. That tells you something. Or take effective altruism. Why isn’t everyone claiming that label? Are you doing ineffective altruism? They’re just avoiding the issue. Most people are doing ineffective altruism; they’re not figuring out the costs and benefits of whatever they’re doing. So, that would imply the rationalists and the effective altruists tend to have good ideas.
Frankly, there are some people who are just not worth listening to, like people who are hyper-partisan. You should always ask, in the back of your head, ‘what’s driving this person?’ Sometimes you can tell, like with Trump: he says A one day and then says B the next. A person like that may be helpful for understanding human nature but is not a good person to get you closer to the actual truth.
So, there are heuristics for how to think about the world and heuristics applied to people and to communities of people. I think people tend to shy away from acknowledging how simplistic our mental models are, but they have to be simplistic. I think a lot about the limits of human cognition. My advice is to be conscious of this stuff rather than pretend it doesn’t exist.
Maybe I’ve just missed it, but I haven’t seen you write a lot about rationalism or effective altruism. It seems like you have a favorable view of the movements?
Yes, I’ve been thinking about a Substack on why I’m not an effective altruist, although I’m sympathetic to them. I don’t really fit in anywhere, but I think libertarianism, rationalism, and effective altruism are close enough to where I am, if I had to fit in.
The essay would be on why I’m not fully an effective altruist, because I do like people thinking carefully about what they’re doing. At the same time, my worldview (and I think this may be different from a lot of rationalists) has a very big aesthetic component to it. There are some things that I find disgusting and distasteful in ways that can’t be completely rationalized. I can give you good reasons ‘wokeness’ is bad, and I think they’re logical reasons, but why do I focus on that rather than, say, anti-aging, or something that may have an equal or possibly greater payoff for humanity?
Part of that is just aesthetics. I don’t like how it looks. It’s good to be aware of that fact. This prevents me from going full rationalist or effective altruist. I have to be motivated to work as hard as I do and to write as much as I do. That’s motivated in part by wanting to make the world a better place but also just by my liking this or disliking that. It’s just the way it is.
To finish out, do you have any advice for young intellectuals or current undergraduates in this political moment, whether on thinking freely or on careers?
I’m assuming the audience for something like this is very smart, very capable, and very intellectually curious. I look back and I regret a lot of the time I spent on the conventional path of becoming an intellectual, trying to become an academic. I could have done what I’m doing now five or six years earlier. I learned some things in academia, but the opportunity cost was not being out there and not building my name and not being part of the public discourse.
Part of that was because I knew that as soon as I went public, my views would disqualify me from academia. I couldn’t really jump in. For young people, I would argue that if you’re an intellectual or you want to do something like I do (write books or Substacks) and if you’re talented, there are opportunities for you out there. Don’t feel like you have to go down the conventional path. It costs very little to start a Twitter, start a podcast, or start a Substack.
When I started my Substack, I didn’t think it was something I’d focus on. I thought I would write long reports at the Center for the Study of Partisanship and Ideology (CSPI), recruit other people to write stuff, and write books. I didn’t see myself writing regular pieces, but those really took off. People really enjoyed reading them, and I enjoyed writing them. The cost of starting a Twitter or Substack account is very low if that’s a path you potentially want to go down.