VISIONAIRY®
Agents - Land | Air | Sea
VISIONAIRY® Agents enable fully autonomous decision-making and behavior execution for individual drones, ground vehicles, and maritime platforms. Agents empower assets to perceive, plan, and act independently or as part of a team, unlocking intelligent mission autonomy across all domains.
Why Agents - Land | Air | Sea?
Conventional autonomous systems often rely on centralized control or scripted behaviors, limiting flexibility and responsiveness. VISIONAIRY® Agents deliver decentralized, real-time decision-making directly at the asset level—allowing each unit to assess its surroundings, make mission-critical decisions, and adapt autonomously to dynamic environments.
Agents operate reliably in complex, GPS-denied, or communication-limited scenarios across air, land, and sea, providing scalable, mission-ready autonomy where it matters most.
Decentralized, Intelligent Autonomy
Real-Time Multi-Domain Decision-Making
Mission-Scalable Performance
See how it works


Benefits
Lightweight and Efficient
Runs on low-power, embedded systems for seamless deployment across air, land, and sea assets.
Intelligent, Onboard Decision-Making
Enables each platform to make mission-critical decisions independently, without reliance on ground control.
Seamless Integration
Fully compatible with existing autopilots, control systems, and Spleenlab’s VISIONAIRY® technologies.
Features
Empower autonomous assets with VISIONAIRY® Agents, designed to enable intelligent, flexible, and decentralized behavior across complex defense missions.
Real-Time Environmental Awareness - Continuously interprets sensory input to build a situational understanding of surroundings.
Autonomous Decision Layer - Supports mission-specific behaviors like obstacle avoidance, patrol, target engagement, and formation flying.
Multi-Domain Compatibility - Optimized for drones, ground vehicles, and maritime platforms.
Dynamic Mission Adaptation - Adjusts mission execution in real time based on environmental changes or evolving tactical requirements.
Configurable Behavior Modules - Customizable for use cases such as surveillance, area coverage, threat avoidance, or autonomous logistics.
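The configurable behavior modules above can be pictured as a priority-ordered stack the onboard decision layer evaluates each cycle. The following is a minimal illustrative sketch of that idea; all names (WorldState, Behavior, decide) are hypothetical and not the VISIONAIRY® API.

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class WorldState:
    obstacle_distance_m: float  # nearest obstacle from onboard perception
    on_patrol_route: bool       # whether the asset is on its patrol leg

@dataclass
class Behavior:
    name: str
    applies: Callable[[WorldState], bool]  # situational trigger
    command: str                           # action issued when triggered

def decide(state: WorldState, behaviors: list[Behavior]) -> Optional[str]:
    """Return the command of the highest-priority applicable behavior."""
    for b in behaviors:  # list order encodes priority
        if b.applies(state):
            return b.command
    return None

# Example stack: avoid obstacles first, otherwise continue the patrol.
stack = [
    Behavior("avoid", lambda s: s.obstacle_distance_m < 5.0, "EVADE"),
    Behavior("patrol", lambda s: s.on_patrol_route, "FOLLOW_ROUTE"),
]

print(decide(WorldState(obstacle_distance_m=3.0, on_patrol_route=True), stack))   # EVADE
print(decide(WorldState(obstacle_distance_m=50.0, on_patrol_route=True), stack))  # FOLLOW_ROUTE
```

Because each asset evaluates its own stack locally, no ground-control round-trip is needed, which matches the decentralized model described above.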
These performance metrics are for demonstrative purposes only, based on configurations with proven results. Actual performance may vary by setup. Our algorithms are optimized for use with any chip, platform, or sensor. Contact us for details.
Update Rate: 5–10 Hz
Initialization Time: <10 seconds
Approach Accuracy: ±0.2 m (scene-dependent)
Target Speed (moving targets): up to 10 km/h
Operating Range: line-of-sight or sensor-limited
Latency: <100 ms
Supported Companion Hardware: NVIDIA Jetson, ModalAI VOXL 2 / Mini, Qualcomm RB5, IMX7, IMX8, Raspberry Pi
Supported Flight Controllers: PX4, APX4, ArduPilot
Base SW/OS: Linux, Docker required
Interfaces: ROS 2 or MAVLink
Input - Sensors
Any type of camera (sensor-agnostic)
Input - Data
• Camera's video frames
• Aerial vehicle’s odometry
• Aerial vehicle’s current flight height
• Intrinsic & extrinsic sensor calibration
Output - Data
• Navigation commands for the aerial vehicle
• Position commands for the Autopilot
• Velocity and orientation commands for the Autopilot
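The position and velocity commands listed above are typically carried to PX4 or ArduPilot autopilots in the MAVLink SET_POSITION_TARGET_LOCAL_NED message. A minimal sketch of assembling a velocity-only setpoint follows; only the type_mask bit arithmetic is MAVLink-defined, while the helper names are illustrative and not the VISIONAIRY® API.

```python
# type_mask bits mark the fields the autopilot should IGNORE.
IGNORE_POS      = 0b0000_0000_0111   # x, y, z position
IGNORE_ACCEL    = 0b0001_1100_0000   # ax, ay, az acceleration
IGNORE_YAW      = 0b0100_0000_0000
IGNORE_YAW_RATE = 0b1000_0000_0000

def velocity_only_mask() -> int:
    """Mask for pure velocity setpoints: position, accel, and yaw ignored."""
    return IGNORE_POS | IGNORE_ACCEL | IGNORE_YAW | IGNORE_YAW_RATE

def make_velocity_setpoint(vx: float, vy: float, vz: float) -> dict:
    """Assemble the fields an agent would pass to a MAVLink sender such as
    pymavlink's set_position_target_local_ned_send (frame 1 = MAV_FRAME_LOCAL_NED)."""
    return {
        "coordinate_frame": 1,             # MAV_FRAME_LOCAL_NED
        "type_mask": velocity_only_mask(),
        "vx": vx, "vy": vy, "vz": vz,
    }

# 2 m/s north, 0.5 m/s climb (in NED, negative z is up).
sp = make_velocity_setpoint(2.0, 0.0, -0.5)
print(hex(sp["type_mask"]))  # 0xdc7
```

Orientation commands would instead clear the yaw bits from the mask and supply a yaw or yaw-rate value alongside the velocities.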
          Minimum                 Recommended
RAM       2 GB                    4 GB
Storage   20 GB                   50 GB
Camera    640 x 480 px, 10 FPS    1920 x 1080 px, 30 FPS
IMU       100 Hz or GPS           300 Hz or GPS
The information provided reflects recommended hardware specifications based on insights gained from successful customer projects and integrations. These recommendations are not limitations, and actual requirements may vary depending on the specific configuration.
Our algorithms are compatible with any chip, platform, sensor, and individual configuration. Please contact us for further information.
Performance Metrics
Position Accuracy: ±2.5 cm in typical environments
Update Rate: up to 200 Hz
Initialization Time: <1 second
Maximum Velocity: 20 m/s with full accuracy
Operating Range: unlimited (environment-dependent)
Drift: <0.1% of distance traveled
Spleenlab GmbH is a highly specialized AI perception software company founded with the vision of redefining safety and artificial intelligence.
© 2025 Spleenlab GmbH | All Rights Reserved