The Pentagon’s new data, analytics, and AI adoption strategy focuses on data shareability, with the aim of better enabling all-domain command and control. But the biggest challenge, DOD’s chief digital and AI officer said, will be getting tech companies to work together instead of keeping their data to themselves.
“How do we get our industrial partners to work with us in a way where they help us build out this open standard data layer and the data that they provide isn’t locked up in a silo? That’s going to be our biggest challenge,” Craig Martell told reporters Thursday. “If we end up having providers continually locking data in silos and not in this data mesh that allows for free discovery and accessibility of the data, then that’s going to be a blocker. So that’s a real challenge we have to break through.”
But Martell said he feels confident after talks with some of the Defense Department’s biggest tech contractors, particularly those in the area of enterprise cloud.
“My conversations with all of the big folks, my conversations with Palantir, my conversations with Google, my conversations with Oracle, my conversation with Microsoft, have all been very fruitful. And I think everybody’s on board with a new way of thinking about this,” he said.
The strategy conceives of a new way for the Defense Department to produce and handle its data. “To improve DOD data quality across the enterprise, the department will develop and implement a decentralized network among data providers and users,” as opposed to allowing data to be centralized in a single office or letting different service components or combatant commands hold onto whatever data they feel is theirs.
That follows a path the Defense Department has been on for several years, through reforms to joint requirements and the 2020 publication of a new data strategy. But the new strategy makes clear that allowing different parts of the Defense Department to find data held by others will be key to how the military pursues its connect-everything vision, combined joint all-domain command and control, sometimes called CJADC2.
“We’re integrating sensors and fusing data across every domain while leveraging cutting-edge decision support tools to enable high op-tempo operations,” Deputy Defense Secretary Kathleen Hicks told reporters Thursday. “It’s making us even better than we already are in joint operations and combat integration.”
But the data strategy is also the foundation of how the Defense Department will use AI tools, she said.
The Pentagon can’t just take impressive but unreliable generative AI tools like ChatGPT, built on so-called large language models, and apply them to DOD missions, largely because the Defense Department can’t control the data that goes into those products. Generative AI programs work by synthesizing enormous corpora of data, basically the entire open web (though some portions are more useful than others).
“Candidly, most commercially available systems enabled by large language models aren’t yet technically mature enough to comply with our ethical AI principles, which is required for responsible operational use,” Hicks said. “But we have found over 180 instances where such generative AI tools could add value for us, with oversight, like helping to debug and develop software, faster analysis of battle damage assessments, and verifiably summarizing texts from both open source and classified data sets.”
Beyond flashy generative AI tools, faster and more transparent access to data will enable quicker operations at much larger scale.
One of the key lessons from the Ukraine conflict, Hicks said, is that having all your data findable and usable is the key to out-maneuvering your opponent.
“I think what we’ve seen play out in Ukraine is instructive for where the department in general is going, which is we’ve got to have really good quality data, and then you’ve got to take that decision-quality data and move it to either the operator, logistician, or decision maker, and that’s what we’re doing here at DOD.”