Our slew-to-cue EO/IR solution supports third-party systems and sensors with autonomous optical acquisition, classification, and tracking of unwanted UAS in your airspace.
What is AirScout Verify?
AirScout Verify is an EO/IR-based, slew-to-cue system that autonomously acquires, classifies, and tracks UAS. With dynamic AI targeting and acquisition algorithms and low-latency predictive tracking, AirScout Verify acquires, classifies, and tracks UAS at tactical ranges more reliably than comparable systems.
Unlike legacy techniques that rely on motion, contrast, and/or manual target detection, AirScout Verify uses advanced AI feature-based detection for accurate target acquisition and track initiation. AirScout Verify employs multispectral EO/IR imaging systems, is hardware agnostic, and can be rapidly upgraded or downgraded with different optical sensors and optics. AirScout Verify’s easy-to-use API enables efficient integrations with other detection sensors and command-and-control systems.
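As a rough illustration of what a slew-to-cue integration like this can look like, the sketch below packages a third-party sensor detection as a cue message and posts it over a JSON API. The endpoint URL, field names, and message schema here are illustrative assumptions, not AirScout Verify's actual interface; consult the vendor's API documentation for the real contract.

```python
# Hypothetical sketch of handing a third-party detection to a slew-to-cue
# system over a JSON API. Endpoint, field names, and schema are assumptions.
import json
import urllib.request


def build_cue(azimuth_deg: float, elevation_deg: float, range_m: float) -> dict:
    """Package a third-party sensor detection as a cue message."""
    return {
        "azimuth_deg": azimuth_deg,      # bearing of the detection
        "elevation_deg": elevation_deg,  # elevation angle of the detection
        "range_m": range_m,              # estimated slant range to the target
    }


def send_cue(cue: dict, url: str = "http://airscout.local/api/v1/cues") -> int:
    """POST the cue to a placeholder endpoint and return the HTTP status."""
    req = urllib.request.Request(
        url,
        data=json.dumps(cue).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

In a real deployment the detection sensor (e.g. an RF or radar detector) would call something like `send_cue` each time it localizes a track, and the EO/IR camera would slew to the cued bearing to begin optical acquisition.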
The AirScout Platform
AirScout is a signal processing and sensor fusion software platform that uses hardened artificial intelligence to fuse and process real-time data from sensor arrays. Designed for processing at the edge, AirScout locally fuses data from deployed sensors, running billions of computations in real-time to identify known patterns or objects. AirScout’s hardware-agnostic software can be rapidly upgraded or downgraded with different sensors or optics and seamlessly integrated with third-party sensors and command-and-control systems through its easy-to-use API.
AirScout Verify delivers many best-in-class benefits.
Automated Response Chain
AirScout’s autonomy enables rapid acquisition, identification, and mitigation of each UAS target, allowing operators to move confidently and quickly from one UAS target to the next.
Capable Against Swarms
AirScout Verify can acquire, classify, and track an unlimited number of UAS simultaneously in a single camera’s field of view, prioritizing UAS tracks autonomously or with operator instruction. Further, AirScout Verify can be quickly upgraded with additional cameras.
Hardened AI for Complex Scenarios
AirScout Verify’s novel artificial intelligence and computer vision algorithms are designed to perform in complex, cluttered scenarios and at tactical ranges.
Configurable, Scalable, and Modular
AirScout’s open-architecture software can process signals from an unlimited number of sensors and can be quickly upgraded with additional sensors, meeting the needs and requirements of the most demanding organizations.
AirScout’s AI computer vision autonomously rejects false positives and alerts operators only to threats that have been classified with a high degree of confidence.
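The idea of confidence-gated alerting can be sketched generically: only detections whose classifier label and confidence clear a threshold are surfaced to the operator. This is a minimal illustration of the pattern, not AirScout's actual logic; the threshold value and record format are assumed.

```python
# Illustrative confidence-gated alerting (not AirScout's actual algorithm).
ALERT_THRESHOLD = 0.90  # assumed value for illustration


def filter_alerts(detections: list[dict], threshold: float = ALERT_THRESHOLD) -> list[dict]:
    """Surface only detections classified as UAS with high confidence."""
    return [
        d for d in detections
        if d["label"] == "uas" and d["confidence"] >= threshold
    ]


detections = [
    {"label": "uas",  "confidence": 0.97},  # alerted
    {"label": "bird", "confidence": 0.88},  # wrong class: suppressed
    {"label": "uas",  "confidence": 0.42},  # low confidence: suppressed
]
```

Gating alerts this way is what lets an autonomous system run continuously without flooding operators with birds, clutter, and marginal tracks.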
Low Operator Burden
Automating the ID portion of response chains lowers operator burden, accelerates decision cycles, reduces operator errors, and improves asset and personnel protection.
All Environment and Day/Night
AirScout uses multispectral EO/IR imaging systems to detect, acquire, classify, and track UAS in any environment — day or night.
Camera Hardware Agnostic
AirScout’s hardware-agnostic software can be rapidly upgraded or downgraded with different sensors and optics, scaling up or down to meet requirements and threats.
AI-based Targeting and Acquisition
Dynamic targeting algorithms and feature-based acquisition ensure reliable target engagement at tactical ranges.
Processing at the Edge
AirScout’s high-performance computer vision is optimized for edge devices, delivering the low-latency performance required to reliably detect, acquire, classify, and track small and nimble UAS targets.
AirScout’s low-latency predictive tracking enables precision-guided cuing for directing mitigation responses.
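To give a sense of why predictive tracking matters for cueing, the sketch below leads a moving target with a constant-velocity extrapolation: the pointing command aims where the target will be after the system's processing latency, not where it was last seen. A fielded tracker would typically use a Kalman filter or similar; the constant-velocity model and latency figure here are simplifying assumptions for illustration.

```python
# Minimal predictive-cueing sketch: constant-velocity lead compensation.
# A real tracker would use a Kalman filter; this is an assumed simplification.


def predict_position(
    p_now: tuple[float, float],
    p_prev: tuple[float, float],
    dt: float,
    latency: float,
) -> tuple[float, float]:
    """Extrapolate an (x, y) position `latency` seconds ahead.

    dt is the time between the two observations; velocity is estimated by
    finite difference, then projected forward over the latency interval.
    """
    vx = (p_now[0] - p_prev[0]) / dt
    vy = (p_now[1] - p_prev[1]) / dt
    return (p_now[0] + vx * latency, p_now[1] + vy * latency)


# Example: target moved from (8, 4) to (10, 5) in 1 s; with 0.5 s of
# system latency, aim at the extrapolated point rather than the last fix.
aim_point = predict_position((10.0, 5.0), (8.0, 4.0), dt=1.0, latency=0.5)
```

The shorter the end-to-end latency, the less extrapolation error accumulates, which is why edge processing and predictive tracking work together for precision-guided cueing.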
No Additional Manpower
AirScout requires minimal training to operate, demands no additional manpower, and can run unmanned or with an operator on the loop.
Perpetual updates, paired with AirScout’s extensibility and interoperability with third-party sensors and systems, ensure that customers evolve as threats evolve.