03.10.13

Why The U.S. Is Not In A Cyber War

America needs to protect itself from growing cyber threats—but the language of military aggression is misleading, says Brookings’ Ian Wallace.

For several weeks, it has been difficult to open a newspaper or watch a Sunday talk show without hearing about the advent of “cyber war.” The media has been filled with an avalanche of cyber threat-related stories: the hacking of leading newspapers, evidence of Chinese government involvement in intellectual property theft, and now further distributed denial-of-service attacks against U.S. banks. All these events present real and serious national security challenges. But cyber espionage, cyber crime and the malicious disruption of critical infrastructure are not the same as war, and the distinction is important.

The idea that America is in the middle of a “cyber war” isn't just lazy and wrong. It's dangerous. The war analogy implies that cyber intrusions demand a military response. What America genuinely needs are effective civilian government cyber defense organizations with strong relationships with the private sector and the active engagement of an informed general public. Stoking fear of “cyber war” makes building those more difficult. Here’s why:

First, while the U.S. fights its wars with the highly trained professionals of the U.S. Armed Forces, defending against cyber threats does not necessarily require military expertise or prowess. True, most private individuals and corporations lack the knowledge and training needed to fight off attacks from elite Chinese, Iranian and Russian cyber “warriors.” As a result, there is and will continue to be a pressing need for highly qualified information security experts to help defend the larger U.S. cyber landscape. Nonetheless, there are relatively simple ways to make life more difficult for the bad guys without escalating to a “war” footing. In 2011, the Australian Defence Signals Directorate (their equivalent of the U.S. National Security Agency) showed that just four key measures--“whitelisting” (i.e., allowing only authorized software to run on a computer or network), very rapid patching of applications, equally rapid patching of operating system vulnerabilities, and restricting the number of people with administrator access to a system--can prevent 85 percent of targeted intrusions. These look more like prophylactic public health measures than warfare--and that’s the point. The United States does not need to declare “war” and call up the military to fend off cyber threats.

Second, people expect wars to end, and when they drag on, they often succumb to war fatigue. People want to believe that victory is achievable. Cyber security, however, is a mission without end. As a result, using the language of war may only frustrate and mislead the public. The fight against cyber attacks will never achieve a definitive, all-encompassing, long-term victory. As more and different devices are connected to the Internet, the threat will continually evolve. While technological countermeasures will surely improve, cyber attacks will remain a very attractive means to coerce, defraud, and potentially even harm us as our lives grow ever more dependent on the Internet. The problem with “war” terminology is that it may breed frustration and contempt, and eventually complacency and cynicism. The growing use of sensational terms like “electronic Pearl Harbor”--which evokes a horrific event that ended the lives of 2,402 sailors, airmen, and civilians--becomes as much a part of the problem as of the solution. Better analogies (and better public policy) are needed to ensure that the public comes to “own” the cyber security challenge as part of daily life.

The third problem with the war analogy is that it legitimizes expedients, especially institutional ones. This goes to the core of the ongoing cyber legislation debate. An important point of difference between advocates and opponents of the failed Senate Cybersecurity Act of 2012 was the role that the National Security Agency (NSA) should play in information exchange with industry. And while the recently relaunched House Intelligence Committee’s Cyber Intelligence Sharing and Protection Act (CISPA) is carefully worded to acknowledge the centrality of the Department of Homeland Security to its information-sharing process, concerns remain. Internet advocacy groups like the Center for Democracy and Technology have argued that its provisions could weaken Homeland Security’s role in favor of more engagement between the private sector and the NSA. Whether or not that is true--and CISPA advocates deny it--there are still those in Congress who see “giving the problem” to the Department of Defense as part of the answer.

Now is not the time for expedients, however well intentioned. The NSA certainly has a key role to play; when dealing with overseas threats, it would be self-defeating not to utilize the capabilities of the world’s most impressive signals intelligence organization. Privacy concerns need to be balanced against the potential for extreme privacy loss when your data is spread across the web by cyber criminals or exfiltrated by foreign intelligence operatives. It is also unrealistic, both financially and practically, to create a parallel organization within the Department of Homeland Security. That is why President Obama’s recent Executive Order sensibly includes measures to widen the pool of organizations that can benefit from what the NSA knows. However, none of that means additional responsibility for America’s cybersecurity efforts should be put into military hands. What is required is a more effective DHS, not a more customer-focused NSA.

The quicker the country builds the civilian institutional capacity it needs for long-term cyber security, the better. It would be unfortunate indeed if the specter of “cyber war” gave succor to those who favor further boosting the Pentagon’s and the Intelligence Community’s responsibilities at the expense (in practice, if not in theory) of a non-military security agency such as DHS. This would be particularly true if the short-term effect was a continued block on the passage of much-needed cyber legislation.

However, this is not just a Congressional problem. The Obama administration has also internalized the lessons of the last decade: in a time of “war,” it is far easier to fund the military to take on a mission than to build new civilian capacity to handle the job. Just as it was with nation building in Iraq, so it is with cyber defense. The reported plan to establish national mission forces under the military’s U.S. Cyber Command, tasked with protecting critical infrastructure, is an understandable bureaucratic response to a perceived need to “defend the nation.” The problem comes if nothing more happens. The challenge then becomes ensuring that the necessary cyber defense architecture and robust civilian government support extend to the private sector. That will be difficult enough; the banging of war drums will make it even harder.

Not that the Defense Department and U.S. military should stay out of the cyber security business--quite the opposite. The fourth and final reason to be cautious in talking about cyber warfare is the risk that such imprecision leaves us ill-prepared to deal with the cyber elements of war when we do have to confront them. Director of the NSA and Commander of U.S. Cyber Command General Keith Alexander must not only continue to supply U.S. leaders with top-quality strategic intelligence; he must also ensure the United States is prepared to exploit cyber opportunities when the country does go to war. At the same time, he will need to ensure that U.S. forces’ extraordinary technological capabilities retain their edge in the face of the cyber attacks that will very likely target them whenever they next take the field. While General Alexander and his organizations will remain major contributors to any government effort to fend off serious national threats, we should be mindful of the opportunity cost of making the NSA and Cyber Command the “super Geek Squad” for the private sector and the nation at large. These organizations must stay focused on their primary mission: defending U.S. national security.

Rejecting the war metaphor for cybersecurity does not diminish the challenges now facing governments, the public, and cyber security professionals. If a real cyber war does come, it will be messy and dangerous, and we need to be prepared, especially on the home front. That preparation is best done deliberately, dispassionately and holistically. Declaring “war” too early will undermine our efforts and our likelihood of success.

Ian Wallace is a visiting fellow in cybersecurity at Brookings’ Center on 21st Century Security and Intelligence in Washington, DC.  He was previously a senior official at the British Ministry of Defence where he helped develop UK cyber strategy as well as the UK’s cyber relationship with the United States.