What Churchill Can Teach Us About the Coming Era of Lasers, Cyborgs, and Killer Drones

Or why the Pentagon needs to think big and look to the past in order to prepare for the chaotic technowars of the future.

BY P.W. SINGER | OCTOBER 22, 2013

There's a famous (though, as with all great quotes, perhaps apocryphal) line attributed to Mark Twain that is often quoted as a guide to world leaders: "History doesn't repeat itself, but it does rhyme."
With that quote in mind, for the last year I've been taking an informal poll of the joint chiefs who lead the U.S. military, asking each of them what period in history provides the most apt parallel to today. Interestingly, every single one of them has given the same answer: the early 1990s, when the United States sharply pared back its military spending and drew down the size of its armed forces following the collapse of the Soviet Union. That experience was painful for the military of the day (most of today's joint chiefs were then midcareer officers) and in many ways haunted it a decade later in Iraq and Afghanistan, when the force had to be re-expanded and had to recover skills and technologies that had atrophied during a procurement holiday.
With similar worries today about force cuts and the utterly unstrategic nature of sequestration-fueled budget slashing, the 1990s are certainly an apt parallel, and indeed one widely shared in the defense establishment. But I fear this comparison may fall prey to another of history's recurring mistakes: in looking for rhymes, we too often turn to the songs we know best, not the tune that might be a better fit.
History is longer than our own past -- and sometimes the most important lessons must be drawn from beyond our personal experience. Today's tough budget times will lead to military cuts like those that occurred in the 1990s, but the United States is in the midst of a fundamentally different context, both politically and technologically. If there is a natural historical comparison, it is not to the end of the Cold War, but to the period surrounding World War I. And it's not Wilsonian America of which I speak -- today, the United States has assumed a role parallel to that of Britain, the last great power, whose time at the top was just coming to an end.
Like Britain back then, today's United States has global responsibilities but also global burdens. To those who would claim that today's United States merely mirrors the America of that era: owning the Philippines and fielding an army one-third the size of Bulgaria's is not the same as maintaining roughly 800 bases in 156 countries and accounting for half the world's military spending. And like Britain with its colonial wars of that period, the U.S. military is engaged in a series of conflicts around the planet, but the Boer Wars and Afghanistans of the world remain at the level of what back then would have been called "small wars": tough, painful, and exhausting, but not existential threats. There is breathing space for strategic assessment.
Part of this assessment must face the fact that, from a geopolitical standpoint, the United States is what scholars call a status quo power. The term cuts against Americans' popular self-image of the United States as a positive change agent in the world, but the reality is that Americans like the shape and structure of the current geopolitical order, as it serves U.S. interests relatively well. After all, the United States sits atop a system that it had a large hand in designing -- first in the wake of World War II and then at the end of the Cold War.
Yet, much like the British during the last century, the essential strategic challenge for the United States is trying to hold on as the world changes rapidly around it. The parallels extend from the challenge of dealing with rising great powers like China (it is no coincidence that studies of imperial Germany's rise are of deep interest in Chinese policy circles), to the loss of competitive edges in economic power and innovation, to the looming loss of the dollar as the world's reserve currency.
But while it rarely appears in actual planning documents, strategists must take into account not only the underpinnings of power, but also the will to retain it. Much as then, there is a growing isolationism among the public that the foreign-policy elite had better heed. Just as isolationism in Britain was not limited to the nascent Communist Party, today's arguments against adventures abroad are not solely a Tea Party phenomenon. Nor are they a flash in the pan. What we saw play out in the Syria vote, where the traditional coalition for the use of force was eaten away from within, losing key wings of both parties, will only continue.
A few years back, I carried out a survey of more than 1,100 millennial leaders and found roughly twice the level of resistance to foreign entanglements among the next generation of American leaders (with higher levels among young Democrats) as among the current generation of baby boomers -- not unlike how the youth of Britain began to question the unquestionable empire in the 1920s. The young isolationists of the famous 1933 "King and Country" Oxford pledge had a "deep disgust with the way in which past wars for 'King and Country' have been made, and in which, they suspect, future wars may be made," as the Manchester Guardian put it. The sentiment echoes today. But just as their resistance to greater intervention abroad may have allowed the poisons of that era to fester, today's brewing isolationism is no guarantee of an actually safer world.
Yet, for military planners and strategists, what may matter most -- above the geopolitical shifts in power and the attendant attitudinal shifts -- is the remarkable technological shift lying behind all this change, as both a causal and an exacerbating factor.
It has become fashionable for leaders to argue that one of the lessons of the last decade of war is that "technology doesn't matter in the human-centric wars we fight," as one four-star general put it to me not too long ago. But that assumes a definition of technology as the exotic and unworkable. To paraphrase the musician Brian Eno's remarks citing inventor Danny Hillis, technology is the name we give to things that we don't yet use every day; once we use something every day, we don't call it technology anymore. Whether it is a stone or a drone, it is technology, a tool that we apply to a task.
These tools, though, are shifting, and with them come the kinds of questions that strategists trying to stay atop a geopolitical order have not faced in almost a century. Much as the submarine, tank, and airplane created massive disruptions both on and off the battlefield in the decades surrounding World War I, a new series of science fiction-like technologies have recently become real and, as back then, are in the midst of shaking up the system of war.
For the last year, I helped organize and lead a Pentagon project called "NeXTech." Run by the international consulting firm Noetic and involving partners that ranged from the Army War College to the Naval Academy, its goal was to figure out whether any comparable "game-changers" loom for war and strategy today, akin to the once-crazy new technologies of flying machines and underwater boats. It involved everything from interviews with the scientists and investors who will create and pay for the tools of the future, to war games with soldiers and experts from multiple countries, organizations, and generations. It even included an "ethics war game": not an oxymoron, but a unique gathering of philosophers, lawyers, human rights activists, and policymakers to discuss the difficult new ethical and legal dilemmas that only truly revolutionary technologies yield.
The results are startling in their range. Autonomous robotics -- from large drones that can now perform the most difficult human pilot tasks (see: the X-47B that recently made a successful landing on an aircraft carrier) to tiny systems the size of swarming insects -- are not just moving humans farther from the point of action in war geographically; they are also moving them farther away chronologically. Key decisions are being made by software developed months or even years beforehand.
Add to that a growing dependence on big data and the so-called "Internet of things." In a world on its way from a few billion devices, each with a human behind it, to one with 75 billion networked together, our computers and the various infrastructures they power will increasingly gather data, communicate about it, and, most importantly, make decisions -- all without human instruction. Imagine the way an escalator turns on just before you step on it, or how your car communicates with your house to let the smart thermostat linked to the smart power grid know to change the temperature when you're 10 minutes from home. Now imagine an entire battlefield terrain that operates that way: constantly reacting, constantly changing, but also woven with vulnerabilities (we've already seen such systems hacked).
New weapons are also changing the battlefield. Directed energy systems (aka lasers) are being deployed on Navy ships and for missile defense, marking the first time weapons have employed something other than kinetic force. Meanwhile, 3-D printing is allowing a bit -- a computer design -- to be turned into an atom, a thing (be it a car part, a gun, or even a drone). The ability not just to prototype more rapidly but also to manufacture on site and on demand represents a massive disruption to the defense economy, one that rivals the impact of early assembly lines.
Finally, human performance modification (HPM) technologies are using hardware and chemistry to change our physical and mental capabilities. We already use what we once thought of as science-fiction technology to replace what has been lost: whereas Luke Skywalker had a robotic hand, we now have bionic prosthetics that have allowed victims of improvised explosive devices to return to combat units even after losing an arm or leg. Consider that 10 percent of the U.S. population now has a pacemaker, drug delivery system, or some other medical device embedded in their bodies. Increasingly, we will use technology not merely to replace but to enhance. Humankind has evolved from using fists and stones to guns, bombs, and soon lasers and cyberweapons in our wars against one another. But the frail human body remains fundamentally the same. HPM is about changing that fact, encompassing everything from technological hardware implants to chemical effectors that extend stamina, focus, even learning ability.
Much as H.G. Wells's concepts like the "land ironclad" (which Winston Churchill renamed the "tank") or the "atomic bomb" must have seemed to leaders in Britain a century back -- not merely science fiction, but almost magical -- these new technologies are often ridiculed today. Indeed, when Arthur Conan Doyle published a short story titled "Danger!" before the start of World War I warning of undersea threats, the British Admiralty went public to mock his notion that the new technology of submarines might be used to blockade the island nation.
Much as then, though, these shifts are real, and they are part and parcel of sweeping changes that affect the where, when, how, and even who of war. Still, especially given the overreach of the acolytes of network-centric warfare during the 1990s drawdown (who argued that technology would somehow solve all our problems, "lifting the fog of war"), it must be noted that none of this new technology changes the why of war -- our human flaws and mistakes still drive conflict, whether it is fought with a stone or a drone. Nor does it mean that we can ignore the historic lessons of war. War will never be perfect. Indeed, when military aircraft gained widespread adoption in the 1920s, a new breed of thinkers (or false prophets, depending on which military service you are from) like Gen. Billy Mitchell and Gen. Giulio Douhet claimed that there would be no more need for old ground armies. Yet the need for "boots on the ground" lived on throughout the 20th century -- just as it will live on into the 21st.
Such caveats are not to say that new technologies like the tank or the airplane weren't fundamental shifts in the last century, or that the new technological advances should be ignored in ours. If the United States wants to keep its grip on the top, simply spending more is no longer doable, nor is it the right answer. Much as both military and civilian leaders in the British Empire had to rethink certain assumptions about war, our old assumptions need to be re-examined today.
These arguments are uncomfortable, of course. And it is to be expected that necessary change will be resisted, sometimes for valid reasons, sometimes for reasons that have nothing to do with battlefield performance. For instance, the British not only invented the tank and used it successfully in World War I, but they also carried out a series of innovative tests during the interwar years on the famous Salisbury Plain that showed just how game-changing tanks could be in the next conflict. Yet the British veered away from fully adopting the Blitzkrieg concept they arguably birthed, largely because of the consequences that implementing it would have had on the cherished regimental system at the center of British military culture. This was not just a British phenomenon; as late as 1939, the head of the U.S. Cavalry, Maj. Gen. John Knowles Herr, declared that "not one more horse will I give up for a tank."
One can see eerie parallels today. Discussions in Washington about the F-35 strike fighter -- a plane conceived in the 1990s whose massive budget threatens to strangle a new generation of unmanned systems at birth (including shaping recent decisions to deliberately lower their capabilities, at their very moment of triumph in testing) -- are as much about identity as anything else. Likewise, it is not merely tactics or operations, but fundamental organizational questions, that are at the center of what lies next in the realm of cyberwar. Much as our forebears in the last century had to figure out battle in the new domain of air, operations in the new cyber-domain were first handled by technical units in the traditional services, and today many are arguing that cyberwar should become its own independent service.
Today's technological game-changers also provoke fundamental questions about strategy and its intersection with ethics. Indeed, for Britain back then, the use of these new technologies started out in the hinterlands and only later struck close to home. For example, the British saw the new technology of aircraft strikes as a godsend for the empire, offering a cheaper way to police the extremities of rule in an age of austerity: asymmetric capabilities, lower casualties, and smaller troop commitments. Indeed, the Royal Air Force pilots who dropped bombs in Somaliland, Mesopotamia, and the North-West Frontier back then would recognize not just the very same locations, but also the underlying philosophy of modern-day drone strikes. Of course, the use of aircraft in punitive strikes grew more complex for the British a few years later, when the technology proliferated.
Admittedly, no historical parallel is exact -- hence Twain's notion of a rhyme. Unlike the last interwar years, today's security environment is peopled by a wider set of actors that face lower barriers to entry. (There are 87 countries with drones and more than 100 with cybermilitary skills.) And that is just counting states. The so-called Syrian Electronic Army, which has bedeviled the websites of everyone from the New York Times to the U.S. Marine Corps, is not an actual army, but a collective of pro-Assad hackers reportedly led by a 19-year-old. An organization like the Lebanese militant group Hezbollah couldn't build or even operate previously dominant battle platforms like a battleship or an aircraft carrier, but it can use drones -- and already has.
Also unlike the last interwar period, the civilian side is racing ahead of the military, not just by developing technology more rapidly but also by coming up with more innovative uses for it. For the last several generations, militaries played a central role in fostering new technologies that would spin out to the civilian sector, whether early jet engines, atomic energy, or computers. Today, their descendants' challenge is instead how to "spin in" civilian technology, be it IT systems or robots. From the seeds of DARPA research, Google and Volkswagen are now working on self-driving cars, even as civilian airspace is set to open to drones in 2015. If we don't watch out, a few years from now young service members may well join the force and find that, much as already happened with computers and smartphones, they have better, and cheaper, gear at home.
What this swirl of change means is that the continued focus in D.C. defense circles on traditional approaches to force and resource planning is simply not enough. In this time of strategic and technological shift, we should not think small, consuming ourselves with which budget line item to tweak to mitigate sequestration or how many hundred staff jobs to cut to shave personnel costs. If we ignore the big shifts and fail to ask the questions they raise today, we will be setting ourselves up for failure in the geopolitics and on the battlefields of tomorrow.
As with so much else about the last interwar period, and today, Churchill may have said it best: "Want of foresight, unwillingness to act when action would be simple and effective, lack of clear thinking, confusion of counsel until the emergency comes, until self-preservation strikes its jarring gong -- these are the features which constitute the endless repetition of history."
