Episode 9: Not Data Tyrants

The second part of Dr. Thomas Scherer and Dan Spokojny's conversation about their article, "Foreign Policy Should Be Evidence-Based."

Transcript

Once more made by an error-prone algorithm.

Thomas Scherer
You and I are not data tyrants; we do not demand that everything be measured. But if people say that the thing they're working on is impossible to measure, that, to me, just cannot be true. You're doing something for some outcome, something tangible in the world; it must be possible to measure that in some way. That does not mean it's easy to measure, or cheap to measure.

Alex Bollfrass
Hello, I'm Alex Bollfrass. It's my pleasure to welcome you to fp21 Minutes, a podcast dedicated to evidence and integrity in foreign policy. We bring you conversations between practitioners and researchers about how American foreign policy is made and how it can be made better. This week, we continue with part two of the conversation between Thomas Scherer and Dan Spokojny about their article recently published by War on the Rocks under the title "Foreign Policy Should Be Evidence-Based." You'll find the link in the show notes. Their conversation starts with Dan reading a paragraph from the article, after which Thomas and Dan discuss whether the things we seek to achieve in American foreign policy can be measured in a meaningful way. The second part of their conversation is about the role of ethics in evidence-based foreign policy. After their conversation, I'll editorialize a little bit about the results of a survey on what the makers of American foreign policy find useful in academic scholarship, and what they emphatically do not.

Dan Spokojny
One of the biggest misconceptions about evidence-based foreign policy is that science should provide the quote-unquote right answers to the problems we face. It will not, and it cannot; anyone who tells you otherwise is trying to sell you something. Instead, the goal of evidence-based foreign policy is to continually strive for better-informed answers. That's the point here, Thomas. What we're asking of this process is that we continually learn: that we push ourselves to evaluate the quality of the data or the evidence and the reasoning behind the logic we use, learn from it, and push ourselves to keep learning. Science is built to be an additive, iterative process in which you continually learn. That learning mechanism is often missing from the policy process.

Thomas Scherer
That brings me to my brief stint in government, where I was working in their learning and research office. We had lots of lessons-learned reports sitting on the shelf, but being able to pull in and organize that information when you need it is a significant challenge. Another response we got to our blog post, to take one tweet: someone wrote, "I think it's a mistake to think we have anything like accurate metrics on many of the issues we would expect these data-driven Foreign Service officers to make decisions about." So there's a question here: sure, evidence is great, data is great, but the things we're dealing with in the foreign policy community are very hard, or impossible, to measure. Dan, what would you say to that?

Dan Spokojny
Of course it's hard to measure. But what's the alternative? What's the status quo that we're measuring evidence or metrics against? Today we're still using evidence; we're just not talking about it as evidence, and we're not evaluating its quality. We sneak the evidence into our arguments; we hide the evidence in order to make our claims that the United States should intervene here, or we should create a peace process here, or we need to lead with our military rather than our diplomacy here. We are still making claims within these mental models. It's really hard to get great metrics that are going to tell you what you need to know; that's always going to be true. But just because it's hard doesn't mean there's a better alternative out there. Maybe you think there's something magical, and perhaps there is; there's really good research, which I've tried to spend time on, about the magic of the human brain and of somebody's gut instincts. So perhaps the status quo position is that you just have to trust that our evidence-based processes can't compete with the human brain yet.

Thomas Scherer
You and I are not data tyrants; we do not demand that everything be measured. But if people say that the thing they're working on is impossible to measure, that, to me, just cannot be true. You're doing something for some outcome, something tangible in the world; it must be possible to measure that in some way. Now, that does not mean it's easy to measure or cheap to measure. It could be too time-consuming, and I can see other reasons why some idealized metric or data might not be feasible. The argument being made here is not that what I'm doing is impossible to measure; it's that it's hard to have accurate metrics, and we don't have them for a lot of things. There are going to be costs involved, and questions about whether it's worth it. Oftentimes we can't get at the real thing we want to measure, and so we have to use some kind of shorthand for it: instead of data on how many children are learning to read, which is hard to get, we have data on how many schools are built. Outputs versus outcomes. That initial measurement isn't quite what we want; it's not as accurate or as precise as we'd like. In a lot of cases that's still okay, as long as we're well aware of the limitations and very careful about the conclusions we reach based on the data that is available. We in academia invest a great deal of time into trying to find good measures for things; we are trained in, and discuss endlessly, ways of finding reasonable measurements, whether quantified as data, which people usually associate with numbers, or as more qualitative storytelling. There are so many different ways to try to understand the world around us. What we're calling for is that more of that expertise be brought into the bureaucracy.
For many of these things, we might not have very accurate metrics now, but that's largely due to the culture and the processes we're currently stuck in. We have a vision of a number of other changes that would lead to the pool of available metrics expanding. Is that your sense as well?

Dan Spokojny
That's such an important point, Thomas. If we don't have these metrics, which are really vital for foreign policy, then let's pay for them. Let's hire researchers to go out into the world and study violent deaths or attempts to pacify civil conflict or different tools that really work to expand access for American businesses. What really is the effect of sanctions versus naming and shaming? Let's go get that evidence. Let's set our mind to it as an institution and say, yeah, we don't have all the answers, but let's go get them.

Thomas Scherer
There's a bit of a chicken-and-egg problem here: there isn't a supply of data, so how can there be any demand for it? But once you build the demand, by requiring more evidence in the decision-making process, that's going to lead to the creation of supply.

Dan Spokojny
From talking with a lot of high-level policymakers, informally when I was working at the Department of State and now from the outside, I get a sense that there's some turf protection here: I'm the person in charge, I know what's best. And if you try to come at me with a different tool or a different method, that could make me seem like an old-timer, or anachronistic, and that's a real threat to my job security.

Thomas Scherer
And I get that. It's incumbent upon those of us advocating for reform to argue convincingly to the power structure that these tools are going to help you. They're not going to threaten your ability to conduct politics; they're not going to threaten the primacy of those with the most experience to continue to lead our institutions; they're going to support our policymakers in achieving their objectives. The point of fp21 is not to overthrow the system; it's to make the system a little bit better. I feel like sometimes people imagine we envision a world where you enter your data into a computer and it spits out the decision, with no place for domestic considerations, or experience working in foreign policy, or all these other factors. That lovely dystopian world is not at all what we're advocating for. Yes, political considerations will still be a factor. And we still want foreign policy decision-makers with a lot of experience to draw upon, using that experience to evaluate and weigh the evidence that, in our world, should hopefully be brought to them.

Dan Spokojny
We see this in some of the feedback we received. People talk about not just the politics that can't be pulled out of policymaking, but also the ethical considerations, the identity considerations, the emotional considerations, the contextual considerations. Evidence can't distinguish right from wrong; therefore, we should not have an evidence-based foreign policy. Is that right?

Thomas Scherer
Evidence is not going to tell you what's right or wrong. If you set good goals, evidence will help you achieve them; if you set bad goals, evidence will help you achieve those too. Evidence plays an important role in identifying unexpected outcomes around a policy. One of my favorite examples is a reconciliation program in Sierra Leone, where researchers were able to embed an experiment in the program. They found good evidence that the reconciliation program was accomplishing its goals, but it was also having the side effect of sparking PTSD in some of the participants. If we bring this back to the question of good or bad: that piece of evidence let decision-makers know that the policy, which largely led to what many would call a good outcome, also had this side effect, which was theirs to weigh in deciding how to expand the policy, or whether it needed to be changed or tweaked. There is a role for evidence here in helping to understand outcomes, which the decision-maker can then judge as good or bad and decide how to move forward.

Dan Spokojny
People are right to distrust new ideas like this. They're right to take these ideas to a logical extreme sometimes and say, okay, if you start trying to quantify everything around me in diplomacy, what's next? Are you going to take away my ability to judge right from wrong? I don't think that point is lost on either of us. But that's just not what we're arguing for. There needs to be a healthy balance between what we really think works, what we're really observing as a result of our policy interventions, and what that information does for our very human judgments of right and wrong and the trade-offs we have to make. We're not trying to create a machine-driven foreign policy.

Thomas Scherer
My guess is that a lot of the policies people see as having the worst outcomes are the result of a lack of evidence, a lack of knowledge about what the outcome of a policy would be. Within the research community, there's a mechanism for making sure that things are done in an ethical way, with guidance from the US government, specifically Health and Human Services. Under this guidance, a group of your peers looks at every research program to decide if it's ethical: the Institutional Review Board process, or IRB. This was also part of my past work in government. Being the IRB enforcer may make you a friend to nobody, but even within that framework, I was sometimes alarmed at the projects that didn't need any kind of ethical review. I would push co-workers: even if a project doesn't meet the threshold of "you have to follow these rules," you should still think hard about the ethics, write out possible concerns and what you're going to do to mitigate them, both as good practice and as something to look back at, so you keep evaluating as you go and can address those issues as they arise.

Dan Spokojny
There's sometimes a negative attitude within academia toward this ethics review stage. But I think what it boils down to is this: if you feel a little bit uncomfortable that evidence-based foreign policymaking doesn't emphasize ethics enough, join us. Let's create more opportunities for ethical reviews of our policy framework, to understand how our research and our policies are really going to impact other people, and then fall back on our very human ability to judge right from wrong and to weigh sometimes difficult choices, like whose equities deserve more attention in a given policy environment.

Data is never going to solve that question for you. Part of an evidence-based policy process is to put mechanisms in place that force us to ask the right questions. Those are questions about ethics, about politics, about context, and about the quality of the data, the quality of the evidence, the quality of the research that underlies the claims you make. To the skeptics out there: those are the same questions we have about the existing policy process, and we've really tried to put some intellectual energy into making something better. So we're with you. There's a risk of saying that ethics exists off in one corner and science exists off in another, and that the two are opposites, mutually exclusive. I don't believe that's the case. Good science asks you to be really clear about, and to define, what your ethics actually look like. Are your ethics about making sure the fewest people possible die? We can measure that, actually. Is it about doing as little harm as possible? We can measure that. Is it about doing the most good possible, how many people we educate, even if maybe there's some harm along the way as we spend our money or make trade-offs? When one is asked to be a little more clear about what they're trying to achieve and what their ethics look like in practice, it brings science and ethics closer together.

Alex Bollfrass
I hope you've enjoyed that behind-the-scenes look at the kind of conversations we're having at fp21. This week's newsletter has a lot of fantastic material and includes a link to an article that reports the results of a survey on the views of foreign policymakers about the usefulness of academic research in their work. The title is "Does Social Science Inform Foreign Policy? Evidence from a Survey of US National Security, Trade, and Development Officials." It was written by Paul Avey and several co-authors and published in International Studies Quarterly. What's great about this work is that it's both an update and an expansion of an earlier, similar survey that was largely focused on the national security community. In the article they write: "There remains a sizable gap between what policymakers want and what scholars provide, but it is not as large as has often been claimed, at least not in all issue areas of international relations. Policy officials are broadly responsive to the views of academic experts and willing to engage with academic work. To a lesser extent, they are also receptive to various social scientific methods, including quantitative methods." This is an interesting change from the last time the survey was run, when quantitative methods were generally not viewed as useful by the surveyed policymakers. The article also reports differences in what kind of academic scholarship is useful to policymakers depending on the areas they work in. To quote from the article again: security practitioners "use social science ideas and data less frequently than their colleagues who work on trade and development, are less likely to think that academic work applies directly to specific components of their work, are less likely to value academic research, are less likely to find mathematical approaches useful, and are more likely to find area studies, ethnographic research and historical approaches to be more helpful in their work."
This is the piece that jumped out at me, as it did the last time the survey was run: this interest in case studies and historical approaches, I think, carries real dangers with it. And since I can talk for a minute about why I think that is the case, that is what I will do. First, I have to emphasize to my academic friends that I think case studies are great; they are the core of a lot of my work. But here is what I worry about: when you spend time with academic historians, they are among the scholars least likely to want to extrapolate their work into neatly packaged current lessons. They're often quite cautious and humble in the claims they make. Many of them would be alarmed that policymakers think their work can be applied in a straightforward way to policy without effort dedicated to making sure that the historical period or event being examined shares the relevant features with the challenges we face today. And some of the less approachable work in political science is trying to solve exactly that problem: what are the generalizable, universal features of international politics that we think we can show persist in similar form across space and time? It's entirely possible that the respondents to the surveys weren't really thinking of those cautious and responsible academic historians, but rather of writers of popular history, who often already have an eye on the contemporary events they want to illuminate with their reading of history. That's all fine and good, and I'm waiting for the next Robert Caro book on Johnson just as much as everyone else is. But that doesn't mean the key to presidential success is to be found in close personal relationships with the Senate. What makes a story about the past compelling and interesting is exactly its ability to hijack our emotions and psychology: our desires for certainty, our judgments about right and wrong.
So if we are enjoying reading a historical case study, it's probably because we've turned our analytical mind off. At its best, science, with its equations and models and diagrams, is an effort, against all odds, to assert our analytical capability against these kinds of narrative seductions.
