Enabling Autonomous Navigation in a Snake Robot through Visual-Inertial Odometry and Closed-Loop Trajectory Tracking Control
Snake robots offer exceptional mobility across extreme terrain inaccessible to conventional rovers, yet their highly articulated bodies present fundamental challenges for autonomous navigation in environments lacking external tracking infrastructure. This thesis develops a complete autonomy pipeline for COBRA, an 11-degree-of-freedom modular snake robot designed for planetary exploration. While the robot’s biologically inspired serpentine gaits achieve impressive mobility, prior work has relied entirely on open-loop teleoperation. The approach presented here integrates onboard visual-inertial SLAM, reduced-order state estimation, and closed-loop trajectory tracking to enable autonomous waypoint navigation. A depth camera paired with edge computing performs real-time localization during dynamic locomotion, validated against motion-capture ground truth to characterize drift behavior and failure modes unique to snake robot platforms. A reduced-order framework estimates Center-of-Mass pose, driving a closed-loop controller that modulates CPG gait parameters through distance-dependent yaw error blending. Physical experiments validate the complete system, demonstrating accurate multi-waypoint tracking and establishing foundations for autonomous snake robot navigation.
💡 Research Summary
This thesis presents the development and experimental validation of a comprehensive autonomy pipeline for the COBRA robot, an 11-degree-of-freedom modular snake robot designed for planetary exploration in extreme terrains. The core challenge addressed is enabling autonomous navigation without external tracking infrastructure, despite the robot’s highly articulated and dynamically oscillating body, which complicates perception and control.
The work integrates three key components into a cohesive system. First, an onboard perception stack was built using an Intel RealSense D455 depth camera and an NVIDIA Jetson Orin NX for edge computing. This stack runs a visual-inertial SLAM pipeline (RTAB-Map) in real time on the ROS2 framework to estimate the robot’s head pose during dynamic locomotion gaits such as sidewinding.
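For readers unfamiliar with how such a stack is wired together, a ROS2 launch file of the general shape below could bring up RTAB-Map's odometry and SLAM nodes against RealSense topics. This is an illustrative sketch only: the node names and parameters are modeled on the public rtabmap_ros examples, and the topic remappings are assumptions, not the thesis's actual configuration.

```python
# Illustrative ROS2 launch sketch (assumed topic names and parameters,
# modeled on public rtabmap_ros examples; not the thesis configuration).
from launch import LaunchDescription
from launch_ros.actions import Node


def generate_launch_description():
    params = {'frame_id': 'camera_link',     # assumed base frame
              'approx_sync': True,           # RGB/depth/IMU are not hardware-synced
              'wait_imu_to_init': True}      # let IMU settle before VIO starts
    remaps = [('rgb/image', '/camera/color/image_raw'),
              ('depth/image', '/camera/aligned_depth_to_color/image_raw'),
              ('rgb/camera_info', '/camera/color/camera_info'),
              ('imu', '/camera/imu')]
    return LaunchDescription([
        # Visual-inertial odometry front end
        Node(package='rtabmap_odom', executable='rgbd_odometry',
             parameters=[params], remappings=remaps),
        # SLAM back end consuming the odometry estimate
        Node(package='rtabmap_slam', executable='rtabmap',
             parameters=[params], remappings=remaps),
    ])
```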
Second, a reduced-order state estimation framework processes the SLAM output. Instead of tracking all 11 joint states, it calculates a simplified representation of the robot’s overall pose by computing the Center-of-Mass (CoM) and a dynamic bounding box through forward kinematics from the estimated head pose. This abstraction is crucial for making control tractable.
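The reduction step can be sketched in a few lines. The snippet below is a hypothetical, simplified planar version: it assumes uniform mass per module, a fixed link length, and yaw-only joints, none of which are specified in the summary, and the function name `com_and_bbox` is invented for illustration.

```python
import numpy as np

# Assumed module spacing in metres; the real robot's geometry is not given here.
LINK_LENGTH = 0.1


def com_and_bbox(head_pose, joint_angles, link_length=LINK_LENGTH):
    """Propagate a planar kinematic chain backward from the estimated head pose,
    then reduce it to a Center-of-Mass position and an axis-aligned bounding box.

    head_pose    -- (x, y, yaw) of the head in the world frame
    joint_angles -- relative yaw angles along the body (radians)
    """
    x, y, yaw = head_pose
    points = [np.array([x, y])]
    heading = yaw
    for q in joint_angles:
        heading += q                       # accumulate relative joint rotation
        step = link_length * np.array([np.cos(heading), np.sin(heading)])
        points.append(points[-1] - step)   # next module lies behind the previous one
    pts = np.array(points)
    com = pts.mean(axis=0)                 # uniform mass per module (assumption)
    bbox = (pts.min(axis=0), pts.max(axis=0))
    return com, bbox
```

The controller then only ever sees `com` and `bbox`, which is what makes the whole-body control problem tractable: a 2D pose target instead of an 11-dimensional joint trajectory.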
Third, a closed-loop trajectory tracking controller was developed. This high-level controller uses the estimated CoM pose and a target waypoint to calculate a steering error. A novel control law dynamically blends “absolute yaw error” (for final orientation alignment) and “relative yaw error” (for path convergence) based on the distance to the target. This combined error signal modulates the amplitude parameters of a Central Pattern Generator (CPG) that produces the robot’s underlying serpentine gait, effectively steering the robot by creating asymmetric lateral thrust.
The system was rigorously validated through physical experiments. The visual-inertial odometry was benchmarked against a motion-capture ground truth system, characterizing its accuracy and identifying drift behaviors unique to snake robot motion. The closed-loop navigation was tested in multiple scenarios: single-waypoint convergence, sequential multi-waypoint navigation in a star-shaped topology, and bidirectional path following. Results demonstrated accurate positioning (final errors under 5 cm) and orientation alignment at each waypoint. Additional tests showed the system’s robustness to external pushes.
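A benchmark of this kind typically reduces to an error metric between the estimated and ground-truth trajectories. The sketch below shows one common choice, translational RMSE over time-synchronized positions; it is a generic metric, not necessarily the one used in the thesis, and a full evaluation would also align the two reference frames (e.g. with a Umeyama fit) before comparing.

```python
import numpy as np


def translational_rmse(est, gt):
    """Root-mean-square translational error between two trajectories.

    est, gt -- (N, 2) arrays of planar positions, assumed already
    time-synchronized and expressed in a common frame.
    """
    est, gt = np.asarray(est, float), np.asarray(gt, float)
    err = np.linalg.norm(est - gt, axis=1)   # per-sample Euclidean error
    return float(np.sqrt(np.mean(err ** 2)))
```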
In conclusion, this research successfully demonstrates a functional, integrated autonomy pipeline for a complex snake robot, moving beyond open-loop scripted gaits. It provides a foundational framework for autonomous navigation in GPS-denied, extreme environments by tackling the intertwined problems of state estimation under dynamic motion and whole-body control through simplified, effective abstractions. The work highlights the importance of co-designing perception, estimation, and control specifically for the unique challenges posed by highly articulated mobile robots.