Rigor in Joint Professional Military Education


Policymakers have called for more rigor in the Joint Professional Military Education (JPME) system for over three decades — at least since the Goldwater-Nichols Act, the Skelton Panel of 1989, and more recently the 2010 Congressional report Another Crossroads? Professional Military Education Two Decades After the Goldwater-Nichols Act and the Skelton Panel. All of these efforts sought to “grow” more strategically minded critical thinkers.

Despite this, the record shows that holding JPME schools to rigorous standards has been patchy at best. Why is this the case? The main problem is that Congress and the Department of Defense do not have a clear definition of what they mean by academic rigor. And though Congress has required the services to educate more strategically minded thinkers, and provided guidance on broad topic areas (outlined below) to support this, it has not done the same for academic rigor. Even so, the services have claimed they are academically rigorous, and point to the joint learning objectives they established in the Officer Professional Military Education Policy (OPMEP). However, an examination of the stated learning objectives of the service schools indicates that the rhetoric is not matched by reality. The measures used by the Department of Defense and the services to establish and assess rigor within JPME, judged against the levels of learning outlined below, are grossly inadequate. Unless this is dealt with, there will be a never-ending debate over how to improve the level of rigor within the JPME system and, in turn, how to fix the problems of poor strategic decision-making that have plagued us for the last fifty or more years.

First, what is rigor? Rigor is often defined as strict judgment and is usually measured, academically, through strict grading. Although this is useful, a better definition can be found in an approach used by Larry Rosenstock, an educational expert. He writes that rigor is not “about more content. … Rigor is about discerning among the avalanche of content that’s coming at you all the time. … It’s not about more complex content. It’s about deepening the quality of analysis” (The Global Achievement Gap, 209–210). This last point is crucial, particularly if the goal is to produce critical thinkers. Rigorous academic work should be analytically challenging and should require students to evaluate information in order to create new understanding of the material. Evaluation standards should be high. Challenging does not simply mean more work or more time; it means an intellectual challenge to a student’s abilities. Such a challenge might well require a lot of work on the part of the student, but the amount of work should not be the gauge of academic rigor. Building on Rosenstock’s definition, academic rigor can be defined as work that requires students to critically analyze information, evaluate it, and create new ideas. This definition provides a starting point for the discussion that follows.

Second, how do we assess rigor? Academic rigor is usually assessed by comparing the learning objectives of a course or class against the cognitive levels of Bloom’s Taxonomy. Bloom’s Taxonomy is the standard framework for gauging the level of cognitive learning that takes place in a course. Its levels, from highest to lowest, are: creating, evaluating, analyzing, applying, understanding, and remembering. In other words, creating is the top level of cognitive learning and remembering the bottom. These levels are used to assess the cognitive academic level of an individual college course or a degree program. For example, if a student is studying for a graduate degree in military operations and strategy, the course’s learning levels should, at a minimum, correspond with analyzing, evaluating, and creating. Thus, someone studying the Allied campaign in Sicily in World War II would be expected to understand a broad range of contextual information relating to the campaign. They would then analyze the information and their understanding of it, before evaluating the protagonists’ options and/or creating viable alternatives. The same project at the undergraduate level might simply require a student to know and understand what happened, along with some analysis. Ideally, as students move through their undergraduate education and into graduate work, the learning level increases correspondingly and becomes more academically rigorous. Students would be required not only to know, understand, and apply, but also to analyze, evaluate, and create new ideas with the knowledge and analytical base they have developed over the course of their studies.
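
To make the mechanics of such an assessment concrete, the sketch below is purely illustrative: the course, the objectives, and the way each one is tagged with a Bloom level are invented for this example rather than drawn from any actual syllabus. It simply tallies a set of stated learning objectives against the revised taxonomy and reports what share of them sit at “analyzing” or above.

```python
# Illustrative sketch only: tallying hypothetical course learning objectives
# against the six levels of the revised Bloom's Taxonomy, ordered lowest to highest.

BLOOM_LEVELS = ["remembering", "understanding", "applying",
                "analyzing", "evaluating", "creating"]

# Hypothetical objectives for a graduate course on the Sicily campaign, each
# tagged with the highest cognitive level its action verb implies (these tags
# are assumptions made for the example, not taken from any real course).
objectives = {
    "Describe the Allied command structure in the Mediterranean": "understanding",
    "Analyze the operational choices open to the Allied commanders": "analyzing",
    "Evaluate the decision-making that allowed Axis forces to escape Sicily": "evaluating",
    "Propose and defend an alternative campaign plan": "creating",
}

def share_at_or_above(tagged_objectives: dict[str, str], threshold: str) -> float:
    """Return the fraction of objectives at or above the given Bloom level."""
    cutoff = BLOOM_LEVELS.index(threshold)
    hits = sum(1 for level in tagged_objectives.values()
               if BLOOM_LEVELS.index(level) >= cutoff)
    return hits / len(tagged_objectives)

print(f"{share_at_or_above(objectives, 'analyzing'):.0%} at 'analyzing' or above")  # 75%
```

Applied to a real course, the same tally makes it immediately visible whether the stated objectives reach the higher cognitive levels or cluster at the bottom of the taxonomy.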

With this in mind, there seems to be a clear relationship between the level of a class (e.g., first-year undergraduate), the learning level, and the increasingly advanced cognitive work expected of students. At the first-year undergraduate level, most courses emphasize building a knowledge base, with a focus on the cognitive learning levels of remembering, understanding, and applying, along with some analysis or synthesis. Typically, this takes the form of a survey class on a topic such as The British Empire, in which students gain a broad understanding. As students progress, classes become more specialized and learning occurs at a higher level. For example, one might see an advanced class on Revolutions in Iran that expects students to use comparative analysis, such as comparing Iran’s revolutions with those of other countries. A student’s grounding in the history of the British Empire might help to provide context for his or her analysis of Iran, given Britain’s routine interference in Iran’s internal politics. Thus, the higher cognitive learning levels build upon the lower ones, becoming more rigorous as a student advances through the levels of education.

In terms of JPME, the above means that junior officers should be expected to understand strategy and its connection to their tactical mission, whereas mid-level officers should be analyzing the strategic consequences of a tactical, operational, or strategic plan and evaluating information in a way that allows them to identify and create alternate courses of action for their senior leaders. At the highest level, senior leaders must clearly be able to do all of these things if they are to create effective plans to match the strategic goals of the republic. Indeed, the services have emphasized this link by using Bloom’s Taxonomy in their overarching guidelines for JPME and in their emphasis on developing critical thinking. Thus, the cognitive learning levels within JPME should closely mirror the pattern seen in the civilian examples above. Unfortunately, they do not.

So, what do the services and Congress claim they want from JPME? The Goldwater-Nichols Act required that “[t]he Secretary shall require such schools [JPME] to maintain rigorous standards for the military education of officers with the joint specialty” (Section 663). In addition to Goldwater-Nichols, the Skelton Panel’s report provided broad educational guidelines for the JPME system to “nurture the development of strategic thinkers,” and to this end it deemed history, international relations, political science, and economics critical subjects. Furthermore, the Skelton Panel called for an emphasis on analysis, critical examination, and creativity (Skelton, 29–30).

The follow-on report, Another Crossroads (2010), reinforced these ideas and referenced critical thinking repeatedly, right from the start. It pointed to deficiencies in the teaching of strategic thinking and called for them to “be addressed throughout an officer’s professional military education” (Another Crossroads, xii). With this in mind, Another Crossroads reemphasized that a mission of the Joint Staff was to lay out a “general educational philosophy” to specify “the learning objectives for each level of PME [JPME].” As such, the Joint Staff was to establish “broad educational standards for all PME institutions” (Another Crossroads, 11–12). The Joint Staff does this through the OPMEP, which provides the overarching education objectives for the JPME schools: “Critical and reflective thinkers who broadly view military affairs across an array of academic disciplines are capable of identifying and evaluating likely changes and associated responses affecting the employment of U.S. military forces. Graduates should possess acuity of mind at the highest level, gained as a result of a continuum of lifetime learning.” These objectives are similar in tone to Congress’s goals for JPME, and so far the two match. However, in terms of academic rigor, the actual learning objectives set out in the OPMEP do not match these education objectives. One key reason for this gap is that the Joint schools take a flawed approach to the assessment of academic rigor.

First, we must look at the standards themselves. The OPMEP uses a slightly modified version of Bloom’s Taxonomy to measure the levels of cognitive learning and to “define the JPME objectives.” It adds “synthesis” between evaluating and analyzing, and replaces “understand” and “remember” with “comprehend” and “knowledge,” respectively. Other than that, the two are essentially the same. For reasons of space, this article restricts its examination of rigor to the intermediate level of JPME, although the problems identified at this level apply elsewhere as well.

An assessment of the course standards established in the OPMEP can provide a guide to the level of rigor claimed by JPME courses. For example, a student of strategy at the graduate level (which is equivalent to the intermediate level of JPME and above) should expect the cognitive learning levels to match. Thus, we should expect analyzing, synthesizing, evaluating, and creating to dominate the learning levels, with perhaps a few lower-level objectives where students simply need to know something. If the overarching goal is “acuity of mind at the highest level,” one should expect to see higher learning levels such as analysis throughout the learning objectives set out in the OPMEP. Surely, such higher levels should predominate from the intermediate level up.

The intermediate level of JPME Phase I, where most attendees will be O-4s (Army majors or their sister-service equivalents), focuses on “warfighting within the context of operational art,” and its mission is to “expand student understanding of Joint Matters.” Simply expanding understanding does not appear to match the overarching goal of “acuity of mind at the highest level” set out earlier. Furthermore, the link to strategy called for by Congress is not mentioned as part of the mission at the intermediate level. Instead, the OPMEP lists six learning areas for intermediate-level PME, along with 31 cognitive learning standards. The learning areas identify the broad subjects required by the Joint Staff. Each learning area is broken into multiple learning standards, each of which establishes the cognitive learning level that is the objective for that area. Of the 31 cognitive learning standards, however, only three are at the level of analysis. None are higher. At this level of PME, which has the goal of educating “critical and reflective thinkers” and of fulfilling Congress’s call for PME to “nurture the development of strategic thinkers,” one would expect most of the 31 learning standards to be at the levels of analysis, synthesis, evaluation, and creation. In fact, 27 of the cognitive learning standards are at the level of comprehension and one is at the level of application. That is, the higher levels of cognition make up less than 10 percent of the learning objectives for this intermediate course. Although the services claim to be providing rigor, their own guidelines suggest the opposite. It is essential that officers comprehend what they are doing, but comprehension is surely not the standard that should be applied at the level of a master’s degree when dealing with mid-career professionals.
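
The “less than 10 percent” figure follows directly from the counts cited above (27 standards at comprehension, one at application, and three at analysis, for 31 in total); the short sketch below uses only those numbers from the text to reproduce the arithmetic.

```python
# Counts of intermediate-level OPMEP learning standards as cited in the text:
# 27 at "comprehend", one at "apply", three at "analyze", and none higher.
standards = {"comprehend": 27, "apply": 1, "analyze": 3}

total = sum(standards.values())                    # 31 standards in all
higher = standards["analyze"]                      # analysis is the highest level that appears
print(f"{higher}/{total} = {higher / total:.1%}")  # 3/31 = 9.7%, i.e. under 10 percent
```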

PME schools will claim their learning standards already match or exceed those set out in the OPMEP. That does not, however, address whether the OPMEP’s learning standards, or indeed those claimed by the schools, are appropriate in the first place. Schools might also claim that they are externally accredited and that this equates with rigor. But this ignores the obvious problem that accrediting agencies themselves have been criticized in Congress for their lack of standards, and that at least one civilian school remains accredited despite paying huge fines for education fraud. The schools will also likely argue that the Military Education Coordination Council (MECC) accredits them for JPME. That body, however, is in essence run by the schools themselves and is responsible for the very learning standards criticized here. Furthermore, tautology aside, claiming that something is rigorous because it says it is does not meet typical standards of evidence.

Given the above, it is important for the reader to ask the following questions: What is the justification for setting such low learning levels as the standard for JPME? It has been argued that the schools exist simply as a check-the-box assignment and that raising the level of rigor would therefore be problematic; if so, what is the actual purpose of PME? Finally, which component of the Office of the Secretary of Defense, as called for in Goldwater-Nichols (Section 663), is best equipped to assist and work with the Joint Staff and the MECC in order to elevate standards and add some outside scrutiny?

In conclusion, the guiding standards for academic rigor in JPME are low. Indeed, they are far lower than the standards demanded by Congress, or claimed by the services themselves. They are also low when compared to the equivalent levels of civilian academia. This is not easily dismissed, as a recent Army University white paper points out: the civilian education “system produces high-quality critical and creative thinkers at a pace that makes them the envy of the world. Our goal is to blend the best of this proven civilian model with military education to produce the agile and adaptive leaders required by the Army Operating Concept.” In addition, low standards of academic rigor will not produce the types of critical and creative strategic thinkers required in an increasingly complex world. Only an upward revision of the OPMEP’s standards will ensure the schools are heading in the right direction, and only a thorough review of grading and graduation protocols will ensure that officers are genuinely challenged to meet the standards of intellect needed. Civilian oversight, through the Office of the Secretary of Defense, is clearly required and has repeatedly been called for by Congress. Furthermore, the security of the republic demands the best, and many of our service people demand the best of themselves, too. We must enable them to achieve this through increased rigor and higher standards. After all, Ranger School wouldn’t be Ranger School if everyone got a trophy.


Dr. Nicholas Murray is an instructor in the Department of Strategy and Policy at the U.S. Naval War College. He is a regular writer on PME and was awarded the Army’s Superior Civilian Service Award in 2014 for his work in service education. His views are his own, and do not represent those of the services or of the Department of Defense.


Photo credit: Chief Mass Communication Specialist James E. Foehl, U.S. Navy