How Can We Know if Professional Military Education Works?
My position as a professor of educational methodology is a unique one within professional military education: Rather than contributing substantive expertise in topics like national security, military operations and tactics, or strategic leadership, I offer expertise in the scholarship of teaching and learning. I explore questions like, “How does game-based learning develop students’ strategic thinking skills?” and, “How is the seminar learning environment influenced by different student demographics?” These are the types of answerable questions that are missing from recent discussions around structural reforms, instructional strategies, and similar topics. Jim Golby argues that professional military institutions should emphasize applied social science research. I agree, but our efforts to achieve “intellectual overmatch” in professional military education systems could be further bolstered if we also include applied research on professional military education itself. Doing so would allow the U.S. military’s schoolhouses to make evidence-based curricular, instructional, and even administrative decisions.
Applied research is actionable scholarly inquiry. Its experimental design follows the scientific method: posing a research question, sharing a hypothesis, testing that hypothesis, analyzing data, and communicating results. Applied research is rigorous and transparent in its methods, and because of this, its findings are not only testable but often reproducible. Applied research helps us understand experiences, behaviors, and relationships beyond discrete, highly controlled variables and measured effects. Context is key. One has only to read RAND’s recent report on culture and competition across the military services to understand that no two professional military education institutions, classrooms, or (especially!) individual students are identical. Applied educational research honors plurality and diversity, and a variety of theories and methods can shape one’s approach to studying the military learning environment.
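To make those steps concrete, consider a minimal sketch of one experimental cycle, written here in Python. Everything in it is hypothetical: the game-based-learning scenario, the scores, and the choice of an independent-samples t-test are illustrative assumptions rather than a prescription, and a real classroom study would involve ethical review, validated instruments, and larger samples.

```python
# A hypothetical applied-research cycle: question, hypothesis, test,
# analysis, communication. All scores below are invented for illustration.
from scipy import stats

# Research question: does game-based learning improve strategic-thinking
# assessment scores relative to standard seminar instruction?
# Hypothesis: the game-based seminar scores higher on average.
game_based_scores = [88, 92, 79, 85, 90, 83, 87]  # treatment group (invented)
standard_scores = [81, 78, 85, 76, 80, 84, 79]    # comparison group (invented)

# Test the hypothesis with an independent-samples t-test.
t_stat, p_value = stats.ttest_ind(game_based_scores, standard_scores)

# Communicate the result.
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
if p_value < 0.05:
    print("Observed difference is unlikely to be due to chance alone.")
else:
    print("No statistically significant difference detected.")
```

Transparency about each of these steps, from the hypothesis through the analysis choice, is what makes such a study testable and, often, reproducible by colleagues at other institutions.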
We can learn much from educational research by civilian partners, but there are limits to this research’s findings and recommendations in a military learning context. Further, it is unwise to relegate an understanding of what goes on within military learning environments solely to the analyses often seen as part of standardized institutional assessments like end-of-course surveys or student evaluations of teaching. These surveys yield reaction data. Such data, corresponding to the first level of Kirkpatrick’s evaluation model, is not indicative of student learning. Rather, it reflects only students’ satisfaction with a given quality of a learning event (most commonly, teacher performance). Student evaluations of teaching have also been increasingly exposed as biased against educators who identify as women or as racial and ethnic minorities.
While it is important for faculty and schoolhouse leaders to know if students are satisfied with their educational experiences, it is more important to know what and how they learned from these experiences. Students’ self-ratings of their knowledge of curricular concepts in end-of-course surveys are a shallow data point, more useful as a reflection activity than as a measure of learning. Instead, we can use applied educational research to capture students’ true demonstration of learning through their own behaviors and dialogue, including the quality and vocabulary of the questions they ask. In this way, applied educational research can supplement the data we are already capturing from experiential learning activities and formative and summative assessments like tests, practical exercises, and capstone requirements.
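As one hedged illustration of what capturing such dialogue data could look like, the toy sketch below tags seminar questions with cognitive-level categories loosely inspired by Bloom’s taxonomy. The verb lists, the `code_question` function, and the transcript are all invented for illustration; an actual study would rely on a validated coding scheme and trained human raters.

```python
# A toy sketch of coding student questions by cognitive level, using
# keyword sets loosely inspired by Bloom's taxonomy. The keyword sets and
# transcript are invented; a real study would use a validated coding
# scheme and trained raters.

LEVELS = {
    "recall":   {"what", "when", "who", "define", "list"},
    "analyze":  {"why", "compare", "differ", "cause", "relate"},
    "evaluate": {"should", "justify", "assess", "better", "tradeoff"},
}

def code_question(question: str) -> str:
    """Return the highest cognitive level whose keywords appear."""
    words = set(question.lower().replace("?", "").split())
    for level in ("evaluate", "analyze", "recall"):  # check highest first
        if words & LEVELS[level]:
            return level
    return "uncoded"

# Hypothetical seminar transcript excerpts.
transcript = [
    "What is the definition of operational art?",
    "Why did the campaign plan differ from the original strategy?",
    "Should we assess the tradeoff between speed and sustainment?",
]

for question in transcript:
    print(f"{code_question(question):>8}: {question}")
```

Even a crude scheme like this shows how question vocabulary can be turned into analyzable evidence of thinking, rather than leaving it as an unexamined impression.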
Professional military education scholars, administrators, and educational methodologists should incorporate frequent, methodologically rigorous interventions — or experiments — in the classroom to test educational strategies. After all, as Beth McMurtrie wrote: “The future of learning is not ‘trust us.’ The future of learning is ‘did it work?’”
Teaching as Research
Applied research on professional military education tests if, how, and why educational strategies are working. This understanding can ensure that leaders and educators base decisions on actual, observable evidence rather than on “gut instinct” or anecdote alone. In my work, I have often heard experienced instructors say that they can tell a student has learned something because they “see it in her eyes.” They proudly point to students’ “ah-ha moments” as the pinnacle of their teaching achievement. However, many of these instructors cannot clearly explain the process that got the student to that ah-ha moment, or how they know the student is truly experiencing what the instructor believes they are experiencing. The value of these educators’ good instincts and years of practice is not in question. But instinct alone is not enough. Making assumptions about students’ learning based on gut instinct is the equivalent of a student who, when asked for the rationale behind a strategic decision, responds, “I just know.” Instructors would not accept this answer from their students and should not accept it from themselves. Doing so limits the professional military education community’s ability to teach and learn from each other in a clear, demonstrable, and actionable way. It is time for professionals in the military education system to dig deeper into the science of learning.
The questions asked in applied educational research can take many forms. As the Carnegie Mellon Eberly Center teaches its faculty in its annual Teaching as Research workshop, these questions could include: “What does the process of student learning look like?”; “Does this process vary for subgroups of students?”; and “How does student learning change over time?”
The data yielded by these studies is invaluable and can be immediately applied to improving teaching practice. The concept of teaching as research, which honors the discipline of the individual educator (e.g., military history, political science, or organizational psychology) within the context of the scholarship of teaching and learning, is already in place at institutions like Harvard, Johns Hopkins, Princeton, Vanderbilt, Yale, the Massachusetts Institute of Technology, and the University of Michigan, to name a few. This type of research is ethically guided by and shared in professional organizations such as the American Educational Research Association, founded over 100 years ago. In professional military education, it is time to mobilize institutions’ already highly experienced and credentialed faculty to conduct their own classroom-based studies, centered on the research questions that matter most to them and conducted rigorously, with valid and reliable methods and institutional support.
Some educators have made a start on this already. In the past several years, our colleagues in professional military education have presented their applied research findings in venues like the Scholarship of Teaching and Learning Commons Conference and the Professional and Organizational Development Network Conference. For example, Lauren Mackenzie of Marine Corps University and Angelle Khachadoorian and Susan Steen of Air University jointly presented on “Metacognitive Strategies for Teaching and Assessing Military Students” at the former conference this year. Relatedly, Brandy Jenner of the U.S. Army War College presented our combined research on “Assessing Sense of Belonging in a Problem-Based Learning Environment” at the 2019 Higher Education Data Sharing Consortium.
Educators are also beginning to conduct and share applied educational research with colleagues across institutions and within their own schoolhouses as part of other institutional projects. For example, Kate Kuehn’s work with the Krulak Center on defining, observing, and teaching innovation at Marine Corps University has contributed to a partnership between that university, the Naval Postgraduate School, and the Naval War College. This partnership has resulted in an annual Innovation Summit. The dearth of professional military education-specific colloquia for applied educational research led me to create the Joint Professional Military Education Scholarship of Teaching and Learning Forum earlier this year. The forum is organized to facilitate focused dialogue on research findings, research in progress, and plans for classroom-based pilot experiments in areas like evidence-based instructional strategies, faculty development, and educational technology. The desire and need for this type of publicly shared scholarship are evidenced by the nearly 200 participants who registered to attend the forum, from approximately 30 different federal government agencies, professional military education schoolhouses, and civilian higher education institutions. Before the event was postponed due to COVID-19, we also received over 40 research presentation submissions on topics ranging from “A Quantitative Study of the U.S. Army Command and General Staff Officer’s Course Attendance and Student Resilience” to “Designing and Validating a Military Graduate Student Reading Assessment.” As another example, forum presenter Celestino Perez and I worked together to transform his initial work on strategic performance into a classroom-based study that measured students’ understanding of causal logic and ethical discourse.
Getting Started
One of the most important requirements to make applied educational research a success in professional military education is a shared innovative and investigative ethos within the studied institution. As Karen Peel discusses in her introduction to applied educational research, transparency and rigor in this research go hand-in-hand. Intellectual humility is required: the institution must be willing to take a hard look at itself in real time, as reflected in observable and measurable student and faculty behaviors. The data might not be complimentary. Revealed barriers to better learning might not be quickly or easily fixed. Institutions may find, for example, that the obstacles to a top-tier military learning experience are not the usual suspects of funding or technology, but fixed mindsets and tired teaching practices. Institutional leaders who encourage innovation by protecting academic freedom, and who are open to learning through experimentation and reflection, will benefit the most from this type of research.
To ensure that applied educational research on professional military education is done well, leaders should also put their money where their mouths are. As the Joint Chiefs of Staff explained, “A world-class educational program is not an accident, nor does it come cheap.” Experimental research requires both tangible and intangible resources, from educational technology contracts for pilot experiments to a bench of professionally trained faculty and staff to collect and analyze data. Students’ attainment of learning outcomes should not be put at risk by educational experimentation. It is therefore vital to dedicate the proper resources and planning to designing these studies, to include the expertise of experienced researchers. Learning the craft of applied educational research takes time, and expecting professional military education faculty to undertake these critical discoveries without proper training and support is irresponsible.
Creative approaches will not necessarily require hiring a bevy of Ph.D.s in education, however. The U.S. Army War College has found recent success with a postdoctoral fellowship program funded by the Army War College Foundation. This program is not only an effective outreach and recruiting tool, exposing early career scholars to what might otherwise be the unfamiliar world of professional military education, but it also leads to actionable research done by recent doctoral graduates with fresh research skills. For example, Jenner’s research on the value of diversity in professional military education settings helped inform the Army War College’s decision to conduct gender-blind assignment of students into seminars for the 2020 academic year resident education program, a first in the institution’s history. Now that the academic year has closed out, the next step will be to explore how these seminar assignments may have influenced outcomes such as student grades. Faculty fellows programs are another option. Programs like the one in place at Stanford’s Institute for Research in the Social Sciences can enable faculty to conduct applied educational research through funded research time and educational resources like software licenses. This work is undertaken separately from faculty members’ teaching responsibilities and recognized as part of their workload. These programs capitalize on faculty members’ research and disciplinary expertise to test new educational methods, tools, and initiatives. Competitively selected faculty fellows are experts in their own fields of study who are also passionate about reflective teaching practice informed by the scholarship of teaching and learning.
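For readers curious what a gender-blind assignment procedure can look like in practice, here is a minimal sketch in Python. It is an assumption-laden illustration, not the Army War College’s actual algorithm: the round-robin approach, the function name, and the roster are all invented, and the key point is simply that the assignment logic never reads a demographic field.

```python
# A minimal sketch of gender-blind seminar assignment. The procedure sees
# only opaque student identifiers, so no demographic attribute can
# influence placement. Invented for illustration; not an actual algorithm
# used by any institution.
import random

def assign_seminars(student_ids, num_seminars, seed=None):
    """Randomly assign opaque student IDs to numbered seminars."""
    rng = random.Random(seed)  # seeding makes the assignment auditable
    ids = list(student_ids)
    rng.shuffle(ids)
    # Round-robin distribution keeps seminar sizes within one student.
    seminars = {s: [] for s in range(1, num_seminars + 1)}
    for i, student in enumerate(ids):
        seminars[(i % num_seminars) + 1].append(student)
    return seminars

# Example: a hypothetical roster of 375 anonymized students across
# 25 seminars (15 students each).
roster = [f"student_{n:03d}" for n in range(375)]
assignments = assign_seminars(roster, num_seminars=25, seed=42)
print(len(assignments[1]), "students in seminar 1")
```

Because the procedure handles only opaque identifiers, later analyses, such as comparing seminar grade distributions, can join demographics back in after the fact without those attributes ever having influenced placement.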
With strong organizational support, this data can feed into cycles of curriculum development and faculty development, which can in turn positively affect student learning. With data on whether and how educational strategies are working, instructors can customize learning experiences to individual students. To elaborate, different students could achieve the same learning outcomes as guided by the Officer Professional Military Education Policy, but their paths to learning might look different based on their incoming experiences and levels of proficiency, and even the way they communicate and interact with others. The same customization could apply to faculty. As an example, the U.S. Army War College currently teaches a standardized curriculum across 25 concurrently facilitated seminars in the Resident Education Program. This could be adjusted to a more flexible model that better honors and capitalizes on the curricular expertise and teaching abilities of individual faculty. When supported by the findings of applied educational research, such a model plays to faculty strengths while addressing students’ incoming educational deficiencies.
Finally, to lead to real change, resulting analyses and findings of applied educational research should be shared publicly. As this type of research continues to take hold and develop throughout professional military education, so, too, should the accessibility of colloquia like the aforementioned Joint Professional Military Education Scholarship of Teaching and Learning Forum and publications like the Journal of Military Learning. World-class faculty share their discoveries and perspectives in their disciplinary fields of expertise, and they should do the same in the discourse of the scholarship of teaching and learning. By prioritizing, funding, publicly sharing, and applying the findings of applied educational research, professional military education institutions will take tangible steps toward achieving real intellectual overmatch.
Megan J. Hennessey, Ph.D., is the professor of educational methodology at the U.S. Army War College. She has held faculty and senior instructional systems designer positions at Marine Corps University, the FBI Academy, and the National Geospatial-Intelligence College, and earned a direct commission as an officer in the U.S. Navy Reserve. She earned a doctorate in higher education from George Mason University, and her research explores educational methodology and faculty development practices at professional military education institutions. The opinions expressed here do not represent those of the Army War College, the Department of Defense, or any part of the U.S. government.
Image: U.S. Air National Guard (Photo by Master Sgt. Mike R. Smith)