

visionairy®
Precision Landing
Enables drones to autonomously land on designated surfaces with centimeter-level accuracy, actively assessing vertical clearance and avoiding obstacles.
Why Precision Landing?
Spleenlab’s Precision Landing goes beyond traditional systems by combining centimeter-level horizontal accuracy with real-time vertical awareness using vision-based AI. This ensures safe, autonomous landings even in cluttered, dynamic, or uneven environments—where standard descent isn't enough.
Unlike most solutions that focus only on flat-surface accuracy, Spleenlab’s technology actively evaluates vertical clearance and avoids obstacles during descent. It's built for safety-critical missions where precision and adaptability are essential.
Centimeter-level accuracy with vertical awareness
Safe landings in complex environments
Built for safety-critical missions
See how it works

Features
Unlock precise, safe, and intelligent landings with Spleenlab’s Precision Landing. Engineered for reliability in complex environments, it delivers advanced capabilities that go beyond standard descent systems.
Landing Zone Assessment - Continuously evaluates the intended landing area for vertical obstructions or terrain irregularities.
Precision Descent Control - Executes centimeter-level landings with live corrections based on camera feedback (illustrated in the sketch after this list).
Markerless or Marker-Based Support - Operates with natural features, designated patterns, or visual landing pads.
Fail-Safe Behavior - Built-in abort-and-retry logic for unsafe zones or interrupted descents, plus defined responses when tracking is lost, such as stopping, returning to base, or switching to manual control.
Camera-Agnostic - Works with any monocular camera—no need for specialized hardware.
AI-Powered Vision System - Uses onboard cameras and neural networks for reliable, GPS-independent guidance.
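To make this concrete, here is a minimal sketch of such a descent loop in Python, assuming a hypothetical vehicle and camera interface; the names used (estimate_pad_offset, vehicle.send_velocity, and so on) are illustrative placeholders, not the actual Visionairy API.

```python
# Minimal sketch of the descent behavior described above; vehicle/camera are
# hypothetical interfaces, not Spleenlab's actual API.
import time
from dataclasses import dataclass


@dataclass
class PadEstimate:
    dx: float            # lateral offset to pad centre, metres (body frame)
    dy: float
    clearance_ok: bool   # False if a vertical obstruction was detected in the zone


def estimate_pad_offset(frame) -> PadEstimate:
    """Placeholder for the vision stage: locate the pad (marker or natural
    features) in the frame and check the landing zone for obstructions."""
    return PadEstimate(dx=0.0, dy=0.0, clearance_ok=True)


def precision_land(vehicle, camera, descent_rate=0.5, max_retries=3):
    """Descend with continuous lateral correction; abort and retry on unsafe zones."""
    retries = 0
    while vehicle.altitude() > 0.1:                     # ~10 cm above touchdown
        est = estimate_pad_offset(camera.latest_frame())
        if not est.clearance_ok:
            retries += 1
            if retries > max_retries:
                vehicle.return_to_launch()              # fail-safe: give up and return
                return False
            vehicle.climb(2.0)                          # abort: regain height, re-assess
            continue
        # Proportional lateral correction plus a constant descent rate
        # (positive vz = downward, NED-style convention assumed).
        vehicle.send_velocity(vx=0.8 * est.dx, vy=0.8 * est.dy, vz=descent_rate)
        time.sleep(0.05)                                # ~20 Hz control loop
    vehicle.disarm()
    return True
```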
Performance Metrics
Position accuracy: ±2.5 cm in typical environments
Update rate: Up to 200 Hz
Initialization time: <1 second
Maximum velocity: 20 m/s with full accuracy
Operating range: Unlimited (environment-dependent)
Drift: <0.1% of distance traveled

These performance metrics are for demonstrative purposes only, based on configurations with proven results. Actual performance may vary by setup. Our algorithms are optimized for use with any chip, platform, or sensor. Contact us for details.
Technical Specifications
Update rate: 10-50 Hz
Initialization time: <1 second
Landing accuracy: ±5-10 cm
Minimum object size for vertical clearance: 10 cm (depends on sensor used)
Operating range: Field of view (FoV) of the sensors
Latency: <100 ms
Supported companion hardware: NVIDIA Jetson, ModalAI VOXL 2 / VOXL 2 Mini, Qualcomm RB5, IMX7, IMX8, Raspberry Pi
Supported flight controllers: PX4, APX4, ArduPilot
Base software / OS: Linux, Docker required
Interfaces: ROS 2 or MAVLink (integration sketches below)
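As an illustration of the MAVLink integration path, the sketch below shows how a landing module's velocity setpoints could be streamed to a PX4 or ArduPilot flight controller with pymavlink; the connection string, update rate, and the fixed velocities are assumptions for the example only.

```python
# Hedged sketch: forwarding velocity setpoints to PX4/ArduPilot over MAVLink.
import time
from pymavlink import mavutil

# Connect to the flight controller (UDP here; adjust for serial or your setup).
master = mavutil.mavlink_connection("udpin:0.0.0.0:14550")
master.wait_heartbeat()


def send_velocity_setpoint(vx, vy, vz):
    """Stream a body-frame velocity command (NED: positive vz descends)."""
    master.mav.set_position_target_local_ned_send(
        0,                                    # time_boot_ms (not used by the autopilot)
        master.target_system,
        master.target_component,
        mavutil.mavlink.MAV_FRAME_BODY_OFFSET_NED,
        0b0000111111000111,                   # type_mask: use only vx, vy, vz
        0, 0, 0,                              # x, y, z positions (ignored)
        vx, vy, vz,                           # velocities in m/s
        0, 0, 0,                              # accelerations (ignored)
        0, 0)                                 # yaw, yaw rate (ignored)


# Example: descend at 0.5 m/s with a small forward correction, at ~20 Hz.
for _ in range(100):
    send_velocity_setpoint(0.1, 0.0, 0.5)
    time.sleep(0.05)
```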
Input - Sensors
• Any type of camera (sensor agnostic)
• Any type of depth sensor (lidar/stereo) for vertical clearance
• RGB camera (for image-based vertical clearance)
Input - Data
• Camera's video frames
• Aerial vehicle’s odometry
• Aerial vehicle’s current flight height
• Intrinsic & extrinsic sensor calibration
Output - Data
• Navigation of the aerial vehicle
• Position commands for the Autopilot
• Velocity and orientation commands for the Autopilot
• Safe/unsafe landing status
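For the ROS 2 integration path, a bridge node could map the inputs and outputs listed above onto topics roughly as sketched below; topic names and message types are assumptions for illustration, not the product's actual interface.

```python
# Hypothetical ROS 2 (rclpy) node layout for the listed inputs and outputs.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image
from nav_msgs.msg import Odometry
from geometry_msgs.msg import TwistStamped
from std_msgs.msg import Bool


class PrecisionLandingBridge(Node):
    def __init__(self):
        super().__init__("precision_landing_bridge")
        # Inputs: camera frames and vehicle odometry (flight height taken from odometry here).
        self.create_subscription(Image, "/camera/image_raw", self.on_frame, 10)
        self.create_subscription(Odometry, "/odom", self.on_odom, 10)
        # Outputs: velocity commands for the autopilot and a safe/unsafe landing flag.
        # Position setpoints could be published the same way on a separate topic.
        self.cmd_pub = self.create_publisher(TwistStamped, "/landing/velocity_cmd", 10)
        self.safe_pub = self.create_publisher(Bool, "/landing/zone_safe", 10)
        self.last_odom = None

    def on_odom(self, msg: Odometry):
        self.last_odom = msg

    def on_frame(self, msg: Image):
        # The vision stage (pad detection, clearance check) would run here.
        cmd = TwistStamped()
        cmd.header.stamp = self.get_clock().now().to_msg()
        cmd.twist.linear.z = -0.5      # descend at 0.5 m/s (ENU: negative z is down)
        self.cmd_pub.publish(cmd)
        self.safe_pub.publish(Bool(data=True))


def main():
    rclpy.init()
    rclpy.spin(PrecisionLandingBridge())
    rclpy.shutdown()


if __name__ == "__main__":
    main()
```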
Hardware requirements (minimum / recommended)
RAM: 2 GB / 4 GB
Storage: 20 GB / 50 GB
Camera: 640 x 480 px, 10 FPS / 1920 x 1080 px, 30 FPS
LiDAR: not required / 3D LiDAR (no single-point/2D)
IMU: 100 Hz or GPS / 300 Hz or GPS
The information provided reflects recommended hardware specifications based on insights gained from successful customer projects and integrations. These recommendations are not limitations, and actual requirements may vary depending on the specific configuration.
Our algorithms are compatible with any chip, platform, sensor, and individual configuration. Please contact us for further information.
Benefits
Lightweight and Efficient
Runs on small, low-power hardware
Obstacle Avoidance
Detects and avoids objects and elevation hazards during landing
Seamless Integration
Works with any Autopilot and with other Visionairy systems