
FAST-LIVO2: Fast, Direct LiDAR-Inertial-Visual Odometry 

MARS LAB HKU

Preprint: arxiv.org/pdf/...
GitHub: github.com/hku...
We propose FAST-LIVO2, an efficient and accurate LiDAR-inertial-visual fusion localization and mapping system that shows great potential for real-time 3D reconstruction and onboard robot localization in degraded scenes.
What can FAST-LIVO2 do?
1. Real-time, high-precision reconstruction: The system generates photo-realistic dense colored point clouds in real time. More importantly, it runs in real time on low-power ARM-based platforms such as the RK3588, Jetson Orin NX, and RB5.
2. Stability in extreme environments: It maps stably and returns to the origin in severely degraded, GPS-denied tunnel environments (over 25 minutes of data collection). We have also tested it on the private FAST-LIVO2 dataset, which contains numerous sequences with LiDAR/visual degradation (over 2 TB), verifying its efficiency and robustness.
3. Breakthrough in UAV autonomous navigation: FAST-LIVO2 is the world's first application of a LiDAR-inertial-visual odometry (LIVO) system to UAV autonomous navigation. It enables UAVs to operate stably in environments where both LiDAR and vision are degraded.
4. Enhanced airborne mapping accuracy: It effectively addresses the cumulative drift caused by LiDAR degeneration or inaccurate point-cloud measurements in aerial surveying (when the air-to-ground distance is large and the LiDAR spot effect becomes significant), achieving pixel-level mapping results.
5. Support for downstream 3D scene representation applications: It quickly produces dense, accurate, large-scale colored point clouds and camera poses for downstream applications such as mesh generation, texture mapping, and depth-supervised 3D Gaussian Splatting (see the sketch after this list).
6. Real-world 3D scanning: Leveraging its non-contact, high-precision, high-detail, efficient, and large-scale scanning capability, it captures 3D data of ancient buildings and landscape features, which can then be imported into modeling software such as UE5. This allows game environments (such as the 'Black Myth: Wukong' DLC) to achieve detail comparable to the real world.
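
As a hedged illustration of point 5: the sketch below loads an exported colored point cloud and prepares it for a downstream pipeline such as meshing or 3DGS initialization. The file name "scans.pcd" is hypothetical, and Open3D is an assumed post-processing tool, not part of FAST-LIVO2 itself; the actual export format and tooling are defined by the repository.

```python
# Illustrative sketch only: assumes the mapping run exported its colored map
# as a PCD file (the path "scans.pcd" is hypothetical). Open3D is used here
# as a common point-cloud post-processing library, not FAST-LIVO2's own API.
import open3d as o3d

# Load the dense colored point cloud produced by the odometry/mapping run.
pcd = o3d.io.read_point_cloud("scans.pcd")
print(pcd)  # e.g. "PointCloud with N points."

# Voxel-downsample to a manageable density for meshing or 3DGS initialization.
down = pcd.voxel_down_sample(voxel_size=0.05)

# Estimate normals, which surface reconstruction (e.g. Poisson meshing) needs.
down.estimate_normals(
    search_param=o3d.geometry.KDTreeSearchParamHybrid(radius=0.2, max_nn=30)
)

# Save the processed cloud for the downstream application.
o3d.io.write_point_cloud("scans_down.pcd", down)
```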
Our source code, datasets, handheld and UAV devices, hardware synchronization schemes, and follow-up applications will be open-sourced on GitHub to promote development in the robotics and computer vision communities.

Published: 9 Oct 2024

Comments: 7
@marslabhku1418 · 2 days ago
To make sure you don't miss any of the highlights, the chapter timestamps are as follows:
0:03 Experiment 1: Benchmark
0:34 Experiment 2: Evaluation in Environments with LiDAR Degeneration and Visual Challenges
3:44 Experiment 3: Pixel-Level High-Accuracy 3D Reconstruction
5:08 Application 1: Fully Onboard Autonomous UAV Navigation
7:46 Application 2: Airborne Mapping
9:46 Application 3: Gaussian Splatting
@trollenz · 2 days ago
Next level truly... Congrats 👏🏻
@eloyaldao435 · 2 days ago
Awesome!! Better and better every time. Congratulations
@SLAM-ib5ln · 2 days ago
High-energy alert starts at 7:46, things are heating up!
@psneves · 2 days ago
Have I seen this movie?
@jackhutton9048 · 2 days ago
7:50 map what you can, give nothing back
@louisli1004 · 1 day ago
Waiting for code ...