Bridging the Academia-Policy Gap in Canadian Defense: Lessons for Other Smaller Allies

Thomas Juneau and Philippe Lagassé

How much influence should academic expertise have in government policymaking? This question is regularly debated in the United States. Yet the American debate is rarely applicable in other, smaller countries: The size of the university system, the importance of privately funded think tanks, and the enormous defense budget distinguish the American case from other states. This article offers a brief survey of recent efforts to bridge the academia-policy gap in the Canadian defense community, based on a study we recently published in the journal Canadian Public Administration. Drawing on our own experiences as academics who have also worked in government, and on interviews with Canadian defense officials, we outline key takeaways from Canada’s experience that might be relevant in other national contexts, notably those with smaller defense communities.


Becoming of One MINDS

Canada’s Department of National Defence has established a range of initiatives to engage with academia over the decades. The evolution of these programs highlights inherent tensions that surround government funding for academic research, notably in terms of how much policy-relevant analysis academics should aim to produce.

The first of these, the Military and Strategic Studies program, was launched in 1967 and supported the teaching of military and defense studies in Canadian universities, including funding for graduate students. An annual conference was also held to keep academics informed of the military’s thinking and to foster ties between the two communities.

In 1994, the government expanded the Military and Strategic Studies program and transformed it into the Security and Defence Forum. This new iteration, with an annual budget of about CA$2.5 million (about US$1.9 million today) in its final year, offered permanent funding to 12 centers in universities across Canada, with each receiving between CA$100,000 and CA$140,000 per year. The Forum also awarded graduate and post-graduate scholarships to individuals working on defense and security topics, as well as special project funding to support conferences and other activities.

The Security and Defence Forum was successful according to one set of metrics. Its centers of expertise expanded the offering of security and defense courses and the volume of academic publications on these matters. Faculty and graduate students affiliated with the Forum centers also participated more and more in public debates. Many benefitted from its support to study in the United States, allowing them to improve their understanding of Canada’s most important defense ally. These centers were also important feeders of civilian policy analysts into the Department of National Defence. The Forum established a vibrant academic defense community where one would likely not otherwise have taken root.

The Department of National Defence, however, was not satisfied with the Security and Defence Forum. The department ended the program in 2011, much to the chagrin of academics who had benefited from its support. The immediate catalyst was the budgetary context, with the government then engaged in a post-financial crisis deficit-reduction exercise. More broadly, many civilian and military leaders did not think that the academic centers offered them relevant analyses and studies. There was also growing concern that the Forum was too focused on supporting academia, limiting the department’s access to other sources of expertise outside government. As a result, the Forum suffered from benign neglect within the department, which made it easier to terminate once budgetary pressures arose. Few defense leaders saw the program as useful, and even fewer were willing to engage in the laborious effort of ensuring that it better aligned with the department’s priorities, particularly since this would have touched on the sensitive question of academic freedom.

Shortly after the Security and Defence Forum’s demise, the Department of National Defence announced its successor: the Defence Engagement Program (DEP). The program launched three new initiatives: targeted engagement grants, which provided funding for events such as conferences and workshops; the expert briefing series, which invited experts to National Defence Headquarters; and scholarships for graduate students. Importantly, the new program was specifically built to ensure that it would only support projects directly tied to specific Canadian defense interests, in contrast to the more permissive criteria of the Security and Defence Forum. The new Defence Engagement Program also operated more as a contractual relationship, whereas the Forum had been based on grants.

In 2018, the Department of National Defence announced that the Defence Engagement Program would be replaced by the Mobilizing Insights in Defence and Security (MINDS) program. The new program kept its predecessor’s three pillars — grants, expert briefings, and scholarships — and added a new, ambitious initiative: collaborative networks. These are multi-disciplinary groups of scholars and experts who receive CA$750,000 (US$573,000) over three years and commit to focus their activities — workshops, policy papers, in-person briefings — on specific themes or regions. Six networks were selected in 2019 and 2020, with three more to come in 2021. In that sense, the networks arguably aim to create something of a hybrid between the traditional academic center and a policy-relevant think tank.

Lessons from Canadian Efforts to Bridge the Gap

At the core of the Canadian defense engagement debate has been the question of how much direct return on its investment the government should expect from these types of initiatives. Put differently: How tightly should such a program tie its funding to support for the department’s day-to-day priorities? Arguably the most damaging flaw in the Security and Defence Forum’s design was that its long-term grants left the centers with significant discretion over how they spent their money, what topics they would study, and in which venues they would publish their results.

In our interviews with serving and retired officials, we observed that a portion of the blame for the Forum’s demise should go to the Department of National Defence: It failed to define the parameters it expected the centers to work within. This may sound obvious, but it is the first lesson: Academic engagement programs should clearly lay out their expectations in terms of the topics to be covered and the format of expected outputs. This is easier said than done — it requires that program staff gain a detailed understanding of both demand (the precise knowledge gaps inside government) and supply (the experts best qualified to fill them). But it is a necessary first step. If academics are only vaguely told that they are expected to analyze ill-defined issues, they will predictably take their work in whatever direction suits their own research interests.

Another important lesson that the managers of the MINDS program have learned from the demise of the Security and Defence Forum is the need to build a nimbler structure. The new networks, in particular, are meant to be flexible, in contrast to the permanent centers of the past. Since funding lasts only three years, underperforming networks can simply not be renewed (and funding can even be withdrawn before the end of the three-year period). Opening a new competition for three new networks each year (until a total of nine coexist) also allows the Department of National Defence to continually adjust its priorities as new issues emerge. In addition, the MINDS program established a fifth pillar, the rapid response mechanism, which allows it to call in researchers with specific expertise on very short notice if, for example, an unexpected crisis emerges on which there is little or no internal knowledge (this may be rarer in the American context, but is more frequent in smaller allied governments). As the MINDS program has realized during its initial years, however, such programs are difficult to manage in a large department with a heavy bureaucracy.

The MINDS program has also made clear that its funding is not limited to academics. This is one of the key lessons of the Security and Defence Forum’s demise: While some academics can produce knowledge relevant to policymakers, for example by challenging groupthink or through their access to countries or people that government officials cannot reach, they do not have a monopoly on relevant external expertise. Maritime insurance companies, for example, can have unique insights into piracy, banks can have much to say about risk analysis in key countries, and journalists can have unique on-the-ground knowledge. This diversity also helps broaden the talent pool. This may not be an obvious limiting factor in the United States, but in smaller countries such as Canada, the academic and think tank bench strength in defense matters is limited. For the same reason, the MINDS program has also been keen to open its applications to non-Canadian scholars and experts, including, in many cases, Americans.

Engagement programs should, finally, carefully tailor their incentive structures to make participation worthwhile for academics — or at least for those with relevant expertise to share. Typically, academics will agree to invest time when they feel they will gain something valuable in return: access to policymakers (and to data and insight otherwise difficult to reach), financial support for their research (e.g., to hire research assistants or to travel), or direct financial compensation (i.e., honoraria). Effective engagement programs should therefore be structured to guarantee one or more of these returns on researchers’ time investment. If not, the more skilled ones will be reluctant to participate.

In sum, Canadian efforts to bridge the academia-policy gap in the defense sphere suggest that there should be give and take on both sides. Academics may prefer grants that allow them to study questions using the latest fashionable theories and methods, thereby advancing their scholarly careers. Yet governments may not be keen to see their engagement programs produce studies with no evident policy relevance. Governments, for their part, need to be mindful of academic incentives and freedoms when they seek to fund policy-relevant research. If it succeeds, the hybrid approach Canada has devised with the MINDS program may be one way to meet expectations on both sides of the gap.


Thomas Juneau is an associate professor at the University of Ottawa’s Graduate School of Public and International Affairs and a former policy officer with Canada’s Department of National Defence (2003-2014). He tweets @thomasjuneau. 

Philippe Lagassé is an associate professor and Barton Chair at the Norman Paterson School of International Affairs, Carleton University, in Ottawa. He has served as a third-party reviewer of major defense acquisitions for the government of Canada since 2012.

Image: Sailor Third Class Melissa Gonzalez