
VISIONAIRY®

SLAM – LiDAR

AI-powered LiDAR technology enables precise, real-time 3D mapping and localization, even in challenging conditions such as dust, fog, and dynamic environments.

Why SLAM – LiDAR?

VISIONAIRY® SLAM – LiDAR fuses AI with LiDAR sensing to deliver real-time, high-precision mapping and localization—even in GPS-denied, dusty, or low-light environments. It outperforms traditional GPS and vision systems by adapting dynamically to complex, changing conditions.

Unlike fixed infrastructure-based solutions, it builds and updates 3D maps on the fly, enabling robust autonomy for robotics, drones, and vehicles in the most demanding scenarios.

Real-time, infrastructure-free navigation

Reliable in low-visibility and GPS-denied areas

AI-boosted precision and resilience

Features

Here are the core features that make VISIONAIRY® SLAM – LiDAR a powerful and reliable solution for precise mapping and autonomous navigation:

Real-Time 3D Mapping - Continuously generates and updates accurate 3D maps of the environment.

Simultaneous Localization - Precisely tracks the system’s position within the mapped space.

AI-Powered Sensor Fusion - Combines LiDAR data with IMU, GNSS, and other sensors for enhanced robustness.

Environmental Resilience - Performs reliably in dust, fog, darkness, and dynamic surroundings.

Loop Closure & Drift Correction - Reduces cumulative errors for long-term navigation accuracy.

Edge-Optimized Performance - Lightweight and efficient, suitable for real-time deployment on embedded systems.
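The loop closure and drift correction feature above can be illustrated with a minimal sketch. The code below is a hypothetical 1-D toy, not VISIONAIRY® internals: dead-reckoned positions accumulate drift, and once a revisited place is recognized, the measured closure error is distributed linearly back along the trajectory.

```python
def correct_loop(poses, closure_error):
    """Distribute an observed loop-closure error linearly along the trajectory.

    poses: list of accumulated 1-D positions (index 0 is the anchor/start).
    closure_error: measured gap between the revisited pose and the anchor.
    Returns drift-corrected poses; the final pose snaps back onto the anchor.
    """
    n = len(poses) - 1
    return [p - closure_error * (i / n) for i, p in enumerate(poses)]

# Dead-reckoned positions for a loop that should end back at 0.0,
# but odometry drift leaves a 0.4 m residual at the end.
raw = [0.0, 1.1, 2.2, 1.1, 0.4]
corrected = correct_loop(raw, raw[-1] - raw[0])
print(corrected)  # final element is exactly 0.0 again
```

Production SLAM systems solve this as a pose-graph optimization over full 6-DoF poses rather than a linear interpolation, but the principle of spreading the closure residual over past poses is the same.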

See how it works

  • These performance metrics are for demonstrative purposes only, based on configurations with proven results. Actual performance may vary by setup. Our algorithms are optimized for use with any chip, platform, or sensor. Contact us for details.

    Position accuracy

    ±2.5 cm in typical environments

    Update rate

    Up to 200 Hz

    Initialization time

    <1 second

    Maximum velocity

    20 m/s with full accuracy

    Operating range

    Unlimited (environment-dependent)

    Drift

    <0.1% of distance traveled
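The drift and velocity figures above combine into a simple worst-case error budget. A quick back-of-the-envelope check (figures taken from the metrics above; the function name is illustrative only):

```python
def drift_bound(distance_m, drift_fraction=0.001):
    """Worst-case accumulated drift for a given distance, per the <0.1% figure."""
    return distance_m * drift_fraction

# One minute at the 20 m/s maximum velocity covers 20 * 60 = 1200 m,
# so accumulated drift stays below about 1.2 m under the 0.1% bound.
distance = 20 * 60
print(drift_bound(distance))
```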

  • Supported companion hardware

    NVIDIA Jetson, ModalAI VOXL 2 / VOXL Mini, Qualcomm RB5, i.MX 7, i.MX 8, Raspberry Pi

    Supported flight controllers

    PX4, APX4, ArduPilot

    Base software / OS

    Linux; Docker required

    Interfaces

    ROS 2 or MAVLink

                Minimum                   Recommended
    RAM         2 GB                      4 GB
    Storage     20 GB                     50 GB
    Camera      640 x 480 px, 10 FPS      1920 x 1080 px, 30 FPS
    IMU         100 Hz                    300 Hz

    The information provided reflects recommended hardware specifications based on insights gained from successful customer projects and integrations. These recommendations are not limitations, and actual requirements may vary depending on the specific configuration.


    Our algorithms are compatible with any chip, platform, sensor, and individual configuration. Please contact us for further information.
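To make the minimum/recommended split above concrete, a small pre-flight check could compare a candidate platform against the table. This is a hypothetical sketch (the `classify` helper and the platform dict keys are inventions for illustration, not part of the product):

```python
# Thresholds copied from the minimum/recommended table above.
REQUIREMENTS = {
    "ram_gb":  {"minimum": 2,   "recommended": 4},
    "cam_fps": {"minimum": 10,  "recommended": 30},
    "imu_hz":  {"minimum": 100, "recommended": 300},
}

def classify(platform):
    """Return 'recommended', 'minimum', or 'insufficient' for a platform dict."""
    levels = []
    for key, bounds in REQUIREMENTS.items():
        value = platform[key]
        if value >= bounds["recommended"]:
            levels.append("recommended")
        elif value >= bounds["minimum"]:
            levels.append("minimum")
        else:
            return "insufficient"
    return "minimum" if "minimum" in levels else "recommended"

# Example: a board with 8 GB RAM, a 30 FPS camera, and a 200 Hz IMU.
# The IMU sits below the 300 Hz recommendation, so the overall tier is "minimum".
print(classify({"ram_gb": 8, "cam_fps": 30, "imu_hz": 200}))
```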

Related Industries

ADAS & AD
Industrial
Agriculture
Cargo & Logistics
Mining
Defense

Benefits

Accurate 3D Mapping

Creates detailed, real-time maps of any environment.

Autonomous Navigation

Enables safe, self-guided movement without GPS.

Versatile Integration

Works across drones, robots, and vehicles in diverse industries.

Get started with SLAM – LiDAR

VISIONAIRY® SLAM – LiDAR empowers robots to navigate complex environments with precision. Learn how it improves pathfinding, spatial awareness, and operational efficiency for your specific use case.
