H-Diplo | ISSF
Article Review 102
H-Diplo/ISSF Editors: Thomas Maddux and Diane Labrosse
H-Diplo/ISSF Web and Production Editor: George Fujii
Jean-Christophe Boucher. “Yearning for a Progressive Research Program in Canadian Foreign Policy.” International Journal 69:2 (May 2014); and
Brian Bow. “Measuring Canadian Foreign Policy.” International Journal 69:2 (May 2014).
Review by Jonathan Paquin, Université Laval
Published by ISSF on 27 July 2018
Jean-Christophe Boucher’s scholarly essay, “Yearning for a Progressive Research Program in Canadian Foreign Policy,” and Brian Bow’s invited response, “Measuring Canadian Foreign Policy,” offer a timely discussion of the state of Canadian Foreign Policy (CFP) analysis. Boucher’s essay should be applauded for its boldness and its diagnosis of some of the problems encountered in the discipline. Whether or not one agrees with Boucher’s conclusions, his analysis has the merit of shaking up the field of CFP and providing the basis for a long-overdue discussion of methods and scientific progress in the study of CFP. As for Bow’s response to Boucher, I believe it is relevant and provides a broad perspective for reflecting on the state of the discipline, although I take issue with his main argument.
In his article, Boucher assesses the state of Canadian foreign policy analysis by evaluating over 500 peer-reviewed articles published in five Canadian journals from 2002 to 2012. The articles were sorted according to five methodological approaches: description, quantitative analysis, comparative study, critical study, and qualitative analysis. The definitions of these methodological categories are central to Boucher’s argument, as they are the building blocks of both his empirical demonstration and his recommendations. Boucher based these categories on pre-existing methodological categories defined in the Teaching, Research, and International Policy (TRIP) project. However, I would suggest that these categories pose two problems. First, the ‘comparative study’ and ‘qualitative analysis’ categories are not mutually exclusive and often overlap. It might have been sounder to merge them into a broader qualitative-method category. Second, I tend to agree with Bow’s critique that the broad ‘descriptive’ category defined by Boucher may actually contain articles that, while not relying on explicit and transparent methods, are nevertheless “developing concepts, clarifying or challenging theories through logic, or reviewing trends in the field” (Bow, 230). Such articles, which could contribute to scientific progress, may thus have slipped under Boucher’s radar.
Despite these reservations, Boucher’s empirical demonstration is well executed and convincing. His aggregated data show that almost 60% of the articles published on CFP between 2002 and 2012 employed “a loose strategy of empirical validation” and were essentially descriptive (Boucher, 221). Boucher also shows that 21% of the articles were qualitative in nature, although most of them focused on case-study analyses and rarely relied on more ‘sophisticated’ techniques such as process tracing or counterfactual analysis. Another interesting, indeed troubling, finding is that only 5% of the 531 articles published used a quantitative method, even broadly defined (that is, not necessarily statistical regression analysis).
Boucher’s article is therefore important, since it convincingly validates the common perception that CFP analysis is mainly descriptive and lacks scientific rigor. The only exception to Boucher’s general findings is the Canadian Journal of Political Science (CJPS), in which 30% of the CFP articles published were qualitative and 35% quantitative. This is little comfort, however, since CJPS published only 3.7% of the total number of CFP papers between 2002 and 2012.
In his response to Boucher’s essay, Bow’s main argument is that it is a mistake “to make Lakatosian theory-testing the gold standard by which all CFP research is measured and evaluated.” It is true that non-social scientists have contributed to CFP analysis over the years and that a certain number of articles analyzed by Boucher were most likely written by practitioners, historians, lawyers, or even journalists. This leads Bow to argue that non-social scientists cannot and should not be evaluated by social science, let alone political science standards, but rather by the standards of their own fields.
The question that remains, however, is whether we should lower our scientific expectations when evaluating the CFP work produced by social scientists on the pretext that they are contributing to, in Bow’s words, an “interdisciplinary project” (Bow, 229). The answer is no. We should keep in mind that the majority of the articles published in the five journals investigated by Boucher were most likely written by social scientists, that is, political scientists, economists, sociologists, and perhaps anthropologists. Therefore, the interdisciplinary nature of CFP should not cloud our judgment when it comes to assessing the scientific value of social scientists’ work. To take up Bow’s analogy, I believe that hockey players should always be judged using hockey scorecards, because no matter where they play their game, it is always hockey. In other words, we should not lower the standards for evaluating hockey players simply because there are figure skaters in the arena (or vice versa).
Even leaving the non-social-scientist articles aside, I am afraid Boucher’s figures still tell a sad tale about the state of scientific progress in CFP. They suggest that some well-accepted methods that are widely taught in undergraduate and graduate social science programs across Canada are underrepresented in CFP analyses. I therefore agree with Boucher that “it is absolutely imperative that the CFP scholarly community encourage current and future students to acquire a better understanding of the diversity of approaches” (Boucher, 226). I also believe that what is called for here is not interdisciplinary relativism and denial, but a reality check. While the fields of International Relations and U.S. foreign policy (i.e., CFP’s cousins) have been engaged in rich methodological and theoretical debates, experts on Canadian foreign policy have somehow agreed to settle for less; at least this is what the data have suggested for the last ten years.
Jean-Christophe Boucher raises a problem of scientific culture in CFP analysis that deserves to be heard and widely discussed inside as well as outside the field of Canadian foreign policy. It would be unfortunate for Boucher’s valuable contribution to be discarded on the pretext that the study of CFP is part of a so-called ‘interdisciplinary project.’
Jonathan Paquin is Professor of Political Science and the editor of the journal Études internationales at Université Laval in Quebec City. He recently coauthored Foreign Policy Analysis: A Toolbox (Palgrave Macmillan, 2018). Paquin has also written articles in multiple journals, including Cooperation and Conflict, Foreign Policy Analysis, Mediterranean Politics, the Canadian Journal of Political Science, and International Journal. He is a former Fulbright visiting scholar and Resident Fellow at the School of Advanced International Studies (SAIS, Johns Hopkins).
©2018 The Authors | Creative Commons Attribution-NonCommercial-NoDerivs 3.0 United States License