Digital Transformation Is a Cultural Problem, Not a Technological One
“I don’t know what we mean when we say we’re ‘pursuing AI.’ Do you?”
“We don’t change to accommodate new technologies, anyway … We just shove them into our current paradigm.”
“I don’t even understand what we’re supposed to be doing right now!”
Twenty officers are seated around a table, mired in the discomfort of an “adaptive leadership” workshop. This framework, developed by Ronald Heifetz and colleagues at the Harvard Kennedy School, is designed to help organizations make progress on complex, collective challenges, known as “adaptive” challenges. Unlike “technical” problems, which can be solved with existing know-how, adaptive challenges demand learning and change — adaptation — from the stakeholders themselves.
Digital transformation presents an adaptive challenge for the Department of Defense. As long as the Department of Defense relies on painless, “technical” fixes — what Steve Blank calls “innovation theater” — America will become increasingly vulnerable to exploitation by foreign adversaries, costing both dollars and lives. To make progress on the challenge of digital transformation — and to maintain technological superiority — the Department of Defense should reexamine and reshape its deeply held values, habits, beliefs, and norms.
The officers in the workshop are an excellent example of a group wrestling with adaptation. As in many groups, they begin by looking outwards. One says, “It’s the ‘frozen middle’ that prevents us from doing anything digital,” while another adds, “Our higher-ups can’t agree on what they want, anyway. … What are we supposed to do?” The instructor nudges them: “It seems the group is shifting responsibility to anywhere but here. What makes it difficult to look inward?”
Next, the officers drift away from the challenge. They share stories of previous successes, appraise the instructor’s credentials, and joke about the workshop itself. Again, the instructor intervenes: “I notice we’re avoiding uncertainty. Can we stay longer in the nebulous space of ‘digital transformation’? Or will we escape the moment it’s not clear how to proceed?”
Begrudgingly, they return to digital transformation, but after a few minutes, they ask the instructor for help: “Are you going to chime in here, or …?” The instructor responds, “You’re depending on an authority — someone in charge — to solve a problem that can only be addressed collectively — by all of you.”
At this point, the room burns with frustration. But the officers can’t be blamed. Their moves to avoid adaptive work — diverting attention away from the issue and shifting responsibility for it elsewhere — are typical for groups confronting a difficult reality.
More specifically, in what Heifetz terms the “classic failure,” groups attempt to resolve adaptive challenges via “technical fixes”: painless attempts that apply existing know-how, rather than working with stakeholders to change how they operate.
Hiring someone, firing someone, increasing the budget, expanding the timeline, creating a committee, restructuring the organization, building a new tool, pushing a new policy: These are all technical fixes, which, while not inherently harmful, are easier than — and can distract from — the internal work of reevaluating values, habits, beliefs, and norms.
Even now, the Department of Defense is attempting to address digital transformation through technical means: It has created the Joint AI Center, partnered with the Massachusetts Institute of Technology (MIT), and established the position of Chief Digital and AI Officer. These steps are not without benefit: The Joint AI Center has developed AI ethics principles and a new acquisition process; MIT has produced valuable research and educational content; and the Chief Digital and AI Officer provides an opportunity to integrate across various technological functions. But these actions are not enough. In fact, they’re not even the most challenging steps.
The real obstacles to digital transformation are deep-seated norms and conflicting perspectives that exist across the entire organization. “How valuable are technologists, really? Should they be treated differently from others?”; “What about computers: Can we trust them to do our jobs as well as we do? If so, what will be the role of humans afterward?”; and perhaps most importantly, “How do we move beyond simply articulating new standards to actually living them?” These are hard questions that affect the Department of Defense’s objectives, strategies, and tasks at every level — but answers will be earned only through discussion and experimentation across the defense ecosystem itself.
Back in the workshop, at least, the officers have made a breakthrough. Toward the end of the session, the instructor says, “I feel a sense of sadness in the room. Does anyone else feel that?” Predictably, everyone shakes their head — admitting sadness feels like admitting failure — but then a major speaks up: “I’ll bite. Yeah, I do feel sad. This just feels overwhelming. If we can’t depend on our commanders to get this done …” He pauses. “I have no idea how we’re going to do it. Especially when we’re told to just keep our heads down all the time. It feels hopeless.”
The major’s comment is the most honest moment the group has seen, and the shift in the room is palpable: An hour prior, the officers were hardly aware of their own duty to generate adaptive work, and if they were, they did not appreciate its weight. Now, they are coming to terms with this responsibility, and they are doing it publicly — vulnerably — where the whole group can learn from individual experience. This shift is the stuff of real change.
The truth is, no one knows how a digitally transformed Department of Defense will operate. But no one will find out without the collective process of trying, failing, and learning. The Department of Defense should therefore become comfortable learning through experience — gathering data through discussion and experimentation — and publicizing that learning across the organization. And while the Department of Defense has good reasons for maintaining a risk-averse culture, avoiding learning creates its own set of risks. The world is changing, and America’s adversaries are improving their capabilities. We cannot afford to wait for our enemies to make clear that they’ve surpassed us.
Officers can take three actions to make progress on digital transformation now.
First, officers should generate and run low-risk experiments: actions that will produce learning for the future, not actions that will produce success based on today’s metrics — who knows whether those metrics will be relevant post-transformation? For example, at the Department of the Air Force–Massachusetts Institute of Technology Artificial Intelligence Accelerator, we have experimented with multiple forms of educating servicemembers, from live lectures and online courses to interactive exercises and project-based workshops. When an experiment produces failure, so be it: Failure is the primary ingredient of learning.
Second, officers should surface as many perspectives on digital transformation as possible. Who balks at digitization? Who supports it? Why? And what’s the wisdom in each perspective? If everyone is part of the problem, everyone should also be part of the solution — even if it means engaging people across boundaries in a way the Department of Defense has never done before.
Finally, officers should prepare those around them for a prolonged period of ambiguity, where operational reality dictates that those in charge will be unable to answer critical questions. This serves two purposes. First, it helps to manage expectations, so those in positions of authority can resist the pressure of providing answers where none exist. Second, it empowers those without authority to run their own experiments — to try something new and to fail — and report back on what they learned.
Ultimately, transforming a system requires transforming the people within it. If the Department of Defense is seriously committed to digital transformation, everyone should be engaged in the uncomfortable and personal process of change. As the work continues, both the organization and the people within it will find themselves better equipped to handle new and challenging realities.
The workshop, meanwhile, closes on a note that applies across the Department of Defense: “This moment demands courage. Try better. Fail better. Learn better. One day, you’ll look back and see that you’ve transformed.”
Brandon Leshchinskiy is an AI innovation fellow at the Department of the Air Force–Massachusetts Institute of Technology Artificial Intelligence Accelerator, where he has taught over 600 servicemembers, including over 60 generals, admirals, and Senior Executive Service members, about AI. He also works with Ronald Heifetz and others at the Harvard Kennedy School, where he has coached over 50 students, ranging from young professionals to senior executives, on complex, collective challenges.
Andrew Bowne is an Air Force judge advocate and the chief legal counsel of the Department of the Air Force–Massachusetts Institute of Technology Artificial Intelligence Accelerator. He is also a Ph.D. candidate at the University of Adelaide, examining the nexus of national security and AI with a focus on the role of industry. He has published numerous articles and book chapters on topics including national security, security cooperation, contract law, rule of law, machine learning, and intellectual property.
The views expressed are those of the authors and do not reflect the official guidance or position of the U.S. government, the Department of Defense, or the U.S. Air Force. Further, the appearance of external hyperlinks does not constitute endorsement by the Department of Defense of the linked websites, or the information, products, or services contained therein. The Department of Defense does not exercise any editorial, security, or other control over the information you may find at these locations.