Stalker UAS Now Included in DIU Blue List

Redwire subsidiary Edge Autonomy's Stalker UAS secured an ATO and is now included in the DIU Blue List

Redwire revealed that Edge Autonomy’s Stalker uncrewed aerial system has secured an authority to operate, or ATO, and is now included in the Defense Innovation Unit’s Blue UAS List.

The aerospace and defense technology company said Monday that the Stalker UAS’s inclusion in the DIU Blue List means it has met cybersecurity, National Defense Authorization Act compliance and operational requirements, making it available to government agencies and operational units.

What Is The Stalker UAS?

Stalker is a UAS developed by Edge Autonomy, a subsidiary of Redwire. It is designed to conduct intelligence, surveillance and reconnaissance missions in a variety of environments. Over the past two decades, the Stalker has flown missions around the world.

“The Blue List selection is an important recognition that streamlines Redwire’s ability to deliver combat-proven, commercially developed UAS technology at scale to meet the Department of Defense’s evolving mission needs,” said Peter Cannito, chairman and CEO of Redwire.

“The Stalker system is a field-proven aircraft with hundreds of thousands of flight hours across six continents. It is designed with a Modular Open Systems Approach, which has enabled us to work alongside our customers to ensure each system can integrate the third-party sensors and advanced technology required to rapidly evolve operations to meet emerging mission needs around the world,” said Steve Adlich, president of Edge Autonomy.

Written by Miles Jamison
