Reframing Rigor for Senior Service Colleges

In 2004, Maj. Gen. James Mattis wrote that he found both comfort and professional guidance in his stalwart habit of reading. “It doesn’t give me all the answers,” he wrote, “but it lights what is often a dark road ahead.” Mattis went on to explain the power and military advantage to be had when reading informs officers’ agile heuristics, as opposed to the danger of regimented thinking processes that cannot flourish in adaptive environments. He describes going “deeply” into books and how reading can be a tool for coaching. What he does not do is suggest that reading be anything other than its own inherent reward when it comes to an individual officer’s education and development, and he certainly does not prescribe a reading load for military leaders. In my professional role as an educational methodologist at the U.S. Army War College, I have seen that as senior service colleges seek to operate in a culture of assessment fostered by accreditation requirements and accountability, reading load has mechanically become a proxy for rigor, a metric that can be easily allocated and quantified.

Faculty and administrators at the U.S. Army War College are constantly asking the question, “Is our curriculum rigorous enough?” The debate over definitions and qualifications of rigor in professional military education is not a new one, as explored by U.S. Naval War College faculty member Nicholas Murray and, in response, Marine Corps Command and Staff College’s James Joyner. Scholars and practitioners have examined this issue as it relates to both enlisted and officer professional military education, including intermediate level education and senior service colleges. Echoing Gen. Raymond Odierno’s rationale for founding Army University as a way to intensify rigor, Chief of Staff of the Army Gen. Mark Milley has repeatedly associated operational readiness with rigor in training and professional military education and advocated for increased rigor in “leader education and development systems.” The 1989 “Skelton Report” often provides the framework for these discussions of rigor and its application in military learning models, and the report defines rigor in professional military education as “(1) a challenging curriculum, (2) student accountability for mastering it, and (3) established standards against which student performance is measured.”

The efforts of professional military education institutions to operate within this framework and define and measure rigor in an observable way have had mixed results. However, Milley has repeatedly insisted that reading is a professional responsibility, and the pressure on school leaders throughout the Army to increase the weekly page count of students’ required readings is therefore trickling down from the chief of staff of the Army. Relatedly, the Command and General Staff College now offers speed-reading courses as remediation for students who score under a certain threshold on their incoming reading rate, vocabulary, and comprehension diagnostic. The Command and General Staff College uses the Nelson-Denny Reading Test for this diagnostic, a standardized test originally targeted toward high school and undergraduate audiences. The Army War College also piloted the test this academic year. While the test has been normed externally for validity and reliability, the age-based normative data covers test-takers ages 14 through 24. This, obviously, does not reflect the age range of students at either the Command and General Staff College or the Army War College. While grade-based normative data is also available, it is irrelevant for students at the senior service college level. Some senior service college students arrive at the beginning of the academic year already having earned multiple master’s degrees and, in some cases, doctorates. As such, there is no homogeneous educational grade even within their own peer group. Without reliable normative data for the senior service college population, the value of the test is questionable. Comparing Army War College Nelson-Denny results to those at the Command and General Staff College is also empirically irresponsible: age differences contribute to varied perceptual spans during reading (which affect reading rate), and ocular degeneration predictably affects performance on timed spatial and nonspatial reasoning tests.

Even if professional military education institutions could administer a psychometrically valid and standardized test of reading rate (and implement associated remedial training) to keep pace with the proposed increased reading load, using reading load as a metric for rigor is still unwise. Indeed, the journal New Directions for Higher Education recently dedicated a special issue to this topic. The journal’s editor, Corbin Campbell, and co-authors Deniece Dortch and Brian Burt put it best: If we anchor our definition of rigor on reading load without giving due attention to increased time for reflection, analysis, and collaboration, we send the message that we prioritize consuming information over producing, interpreting, and, when appropriate, acting upon it. A reading-load model of rigor also disadvantages diverse segments of the student population and perpetuates an achievement gap. Cognitive competence aside, increasing curricular demands on students while assuming they all have equal demands on their time, equal access to resources, and equal responsibilities outside of the classroom is akin to Sheryl Sandberg’s argument that women can overcome external barriers to upward mobility by simply leaning in.

A better and more useful understanding of rigor will focus not on increasing inputs such as reading load or even contact hours, but on refining the military learning environment to support inquiry-based learning and to allow, as Mattis suggested years ago, an opportunity to read and think deeply. The Army War College is taking steps in the right direction with its team-based Integrated Research Projects, and Celestino Perez rightly makes the case for performance-based inquiry in the seminar room. Perez and I are currently implementing a problem-based learning capstone pilot in which students work in groups to solve a contemporary problem before briefing their findings to an authentic audience of senior leaders. The pilot gives faculty the opportunity not only to observe small-group dynamics in classroom-based research and decision-making contexts, but also to encourage students to translate their knowledge of strategy into real and usable products for policymakers.

Pilots such as this help make the case that, for strategic purposes, professional military education should straddle both the skills-based policy voiced in the 2006 Spellings Commission report, Charting the Future of U.S. Higher Education, and the vision of intellectually developing an informed citizenry voiced in the 1947 Truman Commission on Higher Education. Marrying the two approaches allows for the professional and personal development of students who can teach others and engage in a global society through 21st-century skills. Indeed, to enhance rigor in professional military education, faculty should actively incorporate teaching others as a pedagogical measure of performance and learning outcome attainment. This is a relational (versus transactional) learning approach that can be honed through inquiry-based curriculum design that includes guided questioning, problem- and project-based learning, and purposeful, multimodal course materials.

What is the cost? Time. Critical reflection, analysis, discourse, and performance require students’ time both in and out of the classroom. This is time that might not be spent reading increased page counts per lesson or per course, but it should nonetheless be factored into curricular requirements and protected from overscheduling by faculty and administrators. This framing of rigor also requires serious preparation on the part of faculty, who must simultaneously adopt the roles of facilitator and disruptor, and who must, along with the institution, value formative assessment just as much as, if not more than, summative assessment. Some schoolhouses throughout the intelligence community are already leading the way in fostering transformative learning environments that appeal to students’ internal motivation and encourage peer learning and communities of practice. If rigor is to be measured at the senior service college level, let it be measured by these and similar opportunities, not by static, well-intentioned, but misaligned inputs such as reading load. Similarly, let rigor be measured by the faculty’s preparation as well as by that of the students, as both are equal catalysts for learning in the seminar room and beyond.

Megan J. Hennessey, Ph.D., is the Professor of Educational Methodology at the U.S. Army War College. She has held faculty and senior instructional systems designer positions at Marine Corps University, the FBI Academy, and the National Geospatial-Intelligence College, and earned a direct commission as an officer in the U.S. Navy Reserve. She earned a doctorate in higher education from George Mason University, and her research explores educational methodology and faculty development practices at professional military education institutions. The opinions expressed here do not represent those of the Army War College, the Department of Defense, or any part of the U.S. government.

 

Image: U.S. Army War College photo by Megan Clugh and Thomas Zimmerman