As an industrial nation transitioning to an information society marked by digital conflict, we tend to see the technology itself as the weapon. In the process, we overlook the fact that a handful of humans can have a large-scale operational impact.

But we underestimate the importance of applicable intelligence: the intelligence of knowing how to apply things in the right order. Cyber and card games have one thing in common: the order in which you play your cards matters. In cyber, the tools are mostly publicly available; anyone can download them from the Internet and use them. But the weaponization of those tools occurs when they are used by someone who understands how to apply them in the right order.

In 2017, Gen. Paul Nakasone said “our best [coders] are 50 or 100 times better than their peers,” and asked “Is there a sniper or is there a pilot or is there a submarine driver or anyone else in the military 50 times their peer? I would tell you, some coders we have are 50 times their peers.”

The success of cyber operations depends not on tools but on the super-empowered individual Nakasone calls “the 50-x coder.”

There have always been exceptional individuals with an irreplaceable ability to see the challenge early, create a technical solution, and know how to play it for maximum impact. They are out there – the Einsteins, Oppenheimers, and Fermis of cyber. The arrival of artificial intelligence increases our reliance on these highly capable individuals, because someone must set the rules and point out the trajectory for artificial intelligence at the outset.

But this also raises a series of questions. Even if we identify the human mind as a weapon, how do we make it “classified”? How do we protect these high-ability individuals who are weapons in the digital world?

These minds are different because they see an opportunity to exploit in the digital fog of war when others do not. They address problems in innovative ways, unburdened by traditional thinking, maximizing the dual-purpose nature of digital tools, and they can generate decisive cyber effects.

It is this applicable intelligence that creates the process, understands the application of the tools, and turns simple digital software into digitally lethal weapons. In the analog world, it is as if individuals had the supernatural ability to build a hypersonic missile from materials readily available at Kroger or Albertsons. For the nation, these individuals are strategic national security assets.

For years, what these individuals can deliver was hidden in secret vaults and discussed only in sensitive facilities. We classify these individuals’ output to ensure the confidentiality and integrity of our cyber capabilities. Meanwhile, we assign no value to the most critical component, the militarized intellect, because it is human. The technical machinery we marvel at is the only thing we care about in 2019, and we do not protect our elite militarized brains enough.

Systemically, we struggle to see humans as the weapon, perhaps because we like weapons to be something tangible, painted black, tan, or green, that can be stored and brought into action when needed.

The Manhattan Project had 125,000 workers on its payroll at its peak, but the intellects that drove the project to success were few. The difference between the Manhattan Project and the future of cyber is that Oppenheimer and his team had to rely on a massive industrial effort to provide them with the input material for a weapon. In cyber, the tools are free, downloadable, and easily accessible. It is the power of the mind that is unique.

For America, technological wonders are a sign of prosperity, ability, self-determination, and advancement, a story that began in the early days of the colonies and ran through the Erie Canal, the manufacturing era, and the moon landing, all the way to today’s autonomous systems, drones, and robots. In this default mindset, there is always a tool, an automated process, a piece of software, or a set of technical steps that can solve a problem or take action. The same mindset sees humans merely as an input to technology, so humans are interchangeable and replaceable.

Super-empowered individuals are not interchangeable and cannot be replaced, unless we want to be stuck in a digital war. Artificial intelligence and machine learning support the intellectual endeavor to defend America in cyberspace, but humans set the strategy and direction.

It is time to see weaponized minds for what they are: not dudes and dudettes, but strike capabilities.

Jan Kallberg, Ph.D., LL.M., is a research scientist at the Army Cyber Institute at West Point and an assistant professor in the department of social sciences at the United States Military Academy. The views expressed are those of the author and do not reflect the official policy or position of the Army Cyber Institute at West Point, the United States Military Academy, or the Department of Defense.
