Key Takeaways
- DARPA CLARA program allocates $48 million for secure AI proposals.
- Agency plans 12-15 awards to advance cybersecurity in aerospace controls.
- Proposals due June 15, 2026, via SAM.gov portal.
The DARPA CLARA program issued a broad agency announcement for high-assurance artificial intelligence proposals on April 13, 2026. The solicitation targets certifiable AI for aircraft control systems with advanced cybersecurity protections.
The U.S. Defense Advanced Research Projects Agency aims to bridge the gap between machine learning performance and regulatory certification requirements. Proposers must demonstrate adversarial resilience and formal verification methods. Total funding reaches $48 million across multiple awards.
CLARA Evolves for 2026 Cybersecurity Priorities
CLARA, or Certifiable Learning-enabled Aircraft Control, launched in 2019 to integrate AI into flight controls. DARPA now expands it to counter evolving threats like AI model poisoning and supply chain attacks.
Program officials emphasize zero-trust architectures for neural networks.
Dr. Abhishek Dubey, former technical lead on related DARPA AI efforts at Vanderbilt University, stated on the DARPA program page: "Runtime assurance via neurosymbolic systems enables CLARA's dual verification approach."
Engineers must demonstrate AI controllers that outperform traditional methods by 20% in benchmarks while meeting DO-178C certification levels, as specified in DARPA BAA HR001126S0003. This RTCA standard, which the FAA recognizes for certification, governs aviation software safety.
$48 Million Funding Targets 12-15 Teams
DARPA sets aside $48 million for the CLARA Phase II solicitation. The agency expects 12-15 awards, each up to $4 million over 24 months. Small businesses qualify for set-aside funds totaling $10 million.
Proposal submissions open on SAM.gov on April 13, 2026. Full applications require an abstract, a technical volume, and a cost breakdown by June 15, 2026. DARPA evaluates proposals on innovation (40%), feasibility (30%), and team expertise (30%).
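The stated weights can be illustrated with a hypothetical scoring sketch. DARPA does not publish an actual scoring formula, so the function and the 0-100 per-criterion scale below are assumptions for illustration only:

```python
# Hypothetical illustration of the stated evaluation weights; DARPA does
# not publish a scoring formula, so the 0-100 scale here is an assumption.
WEIGHTS = {"innovation": 0.40, "feasibility": 0.30, "team_expertise": 0.30}

def weighted_score(scores: dict) -> float:
    """Combine per-criterion scores (0-100) into one weighted total."""
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

print(weighted_score({"innovation": 90, "feasibility": 70, "team_expertise": 80}))
```

Under this sketch, a proposal strong on innovation but weaker on feasibility can still outscore a balanced competitor, which is why teams often front-load novel technical claims.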
Financial analysts note the spend aligns with the Pentagon's $1.8 billion AI budget for fiscal 2027, per Bloomberg reporting.
High-Assurance AI Tackles Aerospace Cyber Risks
High-assurance AI resists adversarial inputs that fool standard models in 99.9% of lab tests, per NIST benchmarks. CLARA demands Lipschitz continuity proofs for neural controllers, limiting output sensitivity to inputs.
Teams propose runtime monitors using formal methods like Coq or Isabelle. These tools verify properties such as stability under cyberattacks. DARPA cites a 2025 Red Team exercise where unverified LLMs failed 85% of evasion tests.
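Runtime monitoring of this kind is commonly sketched as a simplex-style architecture: the learned controller's command is used only while it stays inside a safety envelope proved offline, and a certified baseline controller takes over otherwise. A minimal sketch, in which the envelope bound and both controllers are hypothetical stand-ins:

```python
def runtime_monitor(ai_control, safe_control, state, bound=1.0):
    """
    Simplex-style runtime assurance: accept the AI controller's command
    only while it stays inside a safety envelope verified offline;
    otherwise fall back to the certified baseline controller.
    """
    u = ai_control(state)
    if abs(u) <= bound:          # envelope check proved offline
        return u
    return safe_control(state)   # verified fallback takes over

# Usage: an AI command outside the envelope triggers the fallback.
ai = lambda s: 5.0 * s           # hypothetical learned controller
safe = lambda s: 0.5 * s         # hypothetical certified controller
print(runtime_monitor(ai, safe, state=1.0))  # 5.0 exceeds the bound, so 0.5
```

The design choice is that only the monitor and fallback need formal proofs; the learned controller itself can remain unverified, which is what makes the approach tractable.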
Matt Schuchard, assistant research professor at George Mason University, stresses hybrid models in his NIST publication. "Gradient masking hides vulnerabilities; CLARA enforces provable defenses," Schuchard states.
This code snippet illustrates a simple verifier stub:
```python
import torch

def verify_lipschitz(net, input_shape, K: float, epsilon: float = 1e-3,
                     num_samples: int = 1000) -> bool:
    """
    Empirical Lipschitz-constant check via random perturbations.
    Note: heuristic only; formal proofs are preferred for high assurance.
    """
    for _ in range(num_samples):
        x = torch.randn(1, input_shape)
        delta = torch.randn_like(x) * epsilon  # small random perturbation
        lip = torch.norm(net(x + delta) - net(x)) / torch.norm(delta)
        if lip > K:
            return False
    return True
```
The function checks whether the network's empirical Lipschitz constant stays below the bound K, which is crucial for control stability.
Formal Verification Meets Machine Learning Scale
CLARA pushes beyond black-box testing. Proposers integrate SMT solvers with transformer architectures for end-to-end proofs. Benchmarks like ACAS Xu show verified models match 95% of trained accuracy, per DARPA evaluations.
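Interval bound propagation is one scalable technique behind such verified benchmarks: it pushes input intervals through each layer to obtain sound bounds on the output. A toy single-neuron sketch (the weights and bounds below are illustrative, not from CLARA or ACAS Xu):

```python
def ibp_neuron(w, b, lo, hi):
    """
    Interval bound propagation through y = relu(w . x + b) for one neuron.
    lo, hi are elementwise input bounds; returns sound output bounds:
    the lower bound pairs positive weights with lo and negative with hi,
    and the upper bound does the opposite.
    """
    y_lo = b + sum(wi * (li if wi >= 0 else ui)
                   for wi, li, ui in zip(w, lo, hi))
    y_hi = b + sum(wi * (ui if wi >= 0 else li)
                   for wi, li, ui in zip(w, lo, hi))
    return max(0.0, y_lo), max(0.0, y_hi)

# relu(x1 - 2*x2 + 0.5) with x1, x2 in [0, 1]: true range is [0, 1.5].
print(ibp_neuron([1.0, -2.0], 0.5, [0.0, 0.0], [1.0, 1.0]))  # (0.0, 1.5)
```

Because the bounds are sound but not tight, they can over-approximate reachable outputs; tighter methods trade that looseness for the higher cost of SMT-style exact solving.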
The cybersecurity angle dominates: AI systems face nation-state threats via data poisoning. DARPA requires defenses that hold attack success rates below 10^-6 in simulations.
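That threshold is statistically demanding to demonstrate. By the rule of three, observing zero successful attacks in n simulations bounds the success rate near 3/n at 95% confidence, so roughly three million trials are needed; the arithmetic below is that standard calculation, not a DARPA-specified procedure:

```python
import math

def trials_for_bound(p_max: float, confidence: float = 0.95) -> int:
    """
    Minimum number of zero-failure trials so the one-sided upper
    confidence bound on the attack success rate falls below p_max:
    solve (1 - p_max)**n <= 1 - confidence for n.
    """
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - p_max))

print(trials_for_bound(1e-6))  # roughly 3 million simulated attacks
```

This is why proposers typically pair brute-force simulation with formal arguments that rule out whole attack classes analytically rather than sampling them.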
Tim McDonald, director of the Secure Cyber Systems department at RAND Corporation, warns of supply chain risks in his 2025 testimony. "Unverified autonomy invites catastrophe," McDonald told Congress.
Industry leaders like Lockheed Martin and Boeing expressed interest. Cloud security startups pivot to aerospace applications.
Proposal Strategy for Competitors
Teams assemble interdisciplinary groups: AI researchers, formal methods experts, and aerospace engineers. DARPA favors open-source deliverables for Phase I prototypes.
Budget breakdowns allocate 40% to compute, 30% to personnel, and 20% to testing. Success hinges on prior DO-254 experience. The SAM.gov submission portal opened April 13, 2026.
Technical roadmaps outline scaling to full-scale UAVs by 2028. Cybersecurity metrics include CVSS scores below 4.0 for identified vulnerabilities.
Implications for Tech and Defense Markets
CLARA is expected to accelerate growth in the $15 billion military AI market through 2030, per Reuters analysis by David Shepardson.
High-assurance tech spills into fintech and autonomous vehicles. Investors track DARPA awards as Series A signals. Program success hinges on June 15, 2026, deadline adherence.