Autonomous Navigation using ALOAM and ROS

Project Overview

This project implements an autonomous navigation system using ALOAM (Advanced LOAM: LiDAR Odometry and Mapping) and ROS (Robot Operating System). The system supports two modes of operation: manual joystick control and autonomous map-building in real-world environments.

The system integrates sensor data (LiDAR and IMU) for odometry and real-time mapping, enabling the vehicle to autonomously navigate and build maps of its surroundings.

Key Features

  • Manual Control: Using a joystick for real-time vehicle control.
  • Autonomous Navigation: The vehicle autonomously builds a map of the environment as it navigates.
  • Real-World Testing: Successfully tested in various real-world environments, showcasing reliable map-building and navigation.
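As a rough illustration of the manual-control mode, the joystick-to-velocity mapping could look like the sketch below. It is pure Python with no ROS dependency; the axis indices, speed limits, and deadzone value are illustrative assumptions, not values taken from this project.

```python
# Hypothetical joystick-axis -> velocity mapping for the manual-control mode.
# Axis layout, speed caps, and deadzone are assumptions for illustration.

MAX_LINEAR = 1.0   # m/s, assumed cap on forward speed
MAX_ANGULAR = 0.8  # rad/s, assumed cap on turn rate
DEADZONE = 0.05    # ignore small stick noise around center

def apply_deadzone(value, deadzone=DEADZONE):
    """Zero out tiny axis readings so the vehicle does not creep."""
    return 0.0 if abs(value) < deadzone else value

def joy_to_cmd(axes):
    """Map joystick axes [forward, turn] in [-1, 1] to (linear, angular) commands."""
    forward = apply_deadzone(axes[0])
    turn = apply_deadzone(axes[1])
    return (forward * MAX_LINEAR, turn * MAX_ANGULAR)

print(joy_to_cmd([1.0, 0.0]))    # full forward -> (1.0, 0.0)
print(joy_to_cmd([0.02, -0.5]))  # deadzone zeroes forward -> (0.0, -0.4)
```

In a ROS setup these commands would typically be published as a `geometry_msgs/Twist` on the vehicle's command topic.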

Technologies Used

  • ALOAM: An advanced implementation of the LOAM LiDAR odometry and mapping algorithm, estimating vehicle motion and building a map in real time.
  • ROS (Robot Operating System): Provides the middleware for controlling the vehicle and handling sensor data.
  • LiDAR & IMU Sensors: The LiDAR provides point clouds of the environment; the IMU provides inertial measurements used in estimating vehicle motion.
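The core idea behind LOAM-style odometry is to select features from each LiDAR scan by local curvature: high-curvature points are edge candidates, low-curvature points are planar candidates, and motion is estimated by aligning these features across scans. The sketch below illustrates only the curvature-based selection step on a single 1-D scan line; the window size and threshold are illustrative and are not the values ALOAM uses.

```python
# Minimal sketch of LOAM-style feature selection on one scan line:
# points with high local curvature are edge candidates, low-curvature
# points are planar candidates. Parameters are illustrative assumptions.

def curvature(ranges, i, half_window=2):
    """Squared sum of range differences to neighbors (LOAM's smoothness term, 1-D form)."""
    neighbors = ranges[i - half_window:i] + ranges[i + 1:i + 1 + half_window]
    diff = sum(r - ranges[i] for r in neighbors)
    return diff * diff

def classify(ranges, edge_thresh=0.5, half_window=2):
    """Label each interior point as 'edge' or 'planar' by its curvature."""
    labels = {}
    for i in range(half_window, len(ranges) - half_window):
        c = curvature(ranges, i, half_window)
        labels[i] = "edge" if c > edge_thresh else "planar"
    return labels

# A flat wall (constant range) with one protruding corner at index 5.
scan = [2.0] * 5 + [1.0] + [2.0] * 5
labels = classify(scan)
print(labels[5])  # corner point -> 'edge'
print(labels[2])  # flat region -> 'planar'
```

The real algorithm runs this selection per scan ring in 3-D, then registers edge features to lines and planar features to planes to recover the sensor's motion.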

Contribution

  • Integrated ALOAM with ROS for seamless vehicle control.
  • Developed algorithms to process LiDAR and IMU data in real time.
  • Tested and validated the system in outdoor environments, ensuring robust performance.
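One common step when combining LiDAR and IMU streams is time alignment: for each incoming scan, find the IMU sample closest in time. The sketch below shows one way this could be done with a binary search; the timestamps and sample rate are invented for illustration, and this is a simplification of what a full pipeline (e.g. with interpolation or motion compensation) would do.

```python
import bisect

# Hypothetical sensor-alignment step: for each LiDAR scan timestamp,
# find the index of the IMU sample closest in time.
# Timestamps are in seconds; the 100 Hz rate below is an assumption.

def nearest_imu(imu_stamps, scan_stamp):
    """Return the index of the IMU sample closest to scan_stamp.
    Assumes imu_stamps is sorted ascending (IMU messages arrive in order)."""
    pos = bisect.bisect_left(imu_stamps, scan_stamp)
    if pos == 0:
        return 0
    if pos == len(imu_stamps):
        return len(imu_stamps) - 1
    before, after = imu_stamps[pos - 1], imu_stamps[pos]
    return pos if after - scan_stamp < scan_stamp - before else pos - 1

imu_stamps = [0.00, 0.01, 0.02, 0.03, 0.04]  # assumed 100 Hz IMU
print(nearest_imu(imu_stamps, 0.026))  # closest sample is 0.03 -> index 3
```

In practice, a production pipeline would interpolate between the two bracketing IMU samples rather than snapping to the nearest one, but the lookup structure is the same.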

Outcome

  • Successful demonstration of autonomous navigation and map-building capabilities.
  • Captured vehicle movement in real time (see GIF below).