Autonomy Foundations with NVIDIA Jetson Nano
Autonomy Foundations equips learners with a practical understanding of how artificial intelligence shapes robotic perception and autonomy. Using the NVIDIA Jetson Nano and JetBot platform, this course offers a hands-on approach to core, real-world robotic capabilities, including Networking, Collision Avoidance, Path Following, AprilTag Navigation, and SLAM. Autonomy Foundations provides a comprehensive foundation for operating and maintaining intelligent robotic systems. This curriculum includes videos, animations, and step-by-step lessons designed to help learners build an understanding of Applied AI in Robotics.
Autonomy Foundations is organized into six units. Within each unit, students engage with their robot and learn core concepts through step-by-step, media-driven instructional content, “Try It” remix activities, mini-challenges, “Check Your Understanding” questions, a culminating end-of-unit challenge to apply what they have learned, and an end-of-unit quiz.
Unit 1: Autonomous systems combine hardware and software that enable machines to operate independently. In this unit, participants set up their JetBot, covering software installation, network requirements, assembly, and initial operation.
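As a taste of the kind of check performed during initial setup, here is a minimal Python sketch that verifies the robot has a working network connection. The test host (a public DNS server) and port are illustrative choices, not part of the course materials.

```python
import socket

def check_connectivity(host="8.8.8.8", port=53, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # host/port above are placeholders; any reliably reachable service works
    print("Network OK" if check_connectivity() else "No network connection")
```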
Unit 2: Autonomous systems like the JetBot can be configured to navigate using pre-programmed routines, operator teleoperation, or a blend of both. This unit guides participants through motion control, precise navigation techniques, and teleoperation.
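As an illustration of programmatic motion control, the sketch below drives the robot forward and then arcs right. It assumes NVIDIA's open-source `jetbot` Python package is installed on the robot; the Waveshare ROS kit may instead expose motion through ROS topics, and the speeds shown are arbitrary.

```python
import time

from jetbot import Robot  # NVIDIA's jetbot package; an assumption for this sketch

robot = Robot()

# Drive forward at 30% speed for one second, then arc gently to the right.
robot.forward(0.3)
time.sleep(1.0)
robot.set_motors(0.3, 0.1)  # left wheel faster than right -> right turn
time.sleep(1.0)
robot.stop()
```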
Unit 3: Sensors enable robots to perceive their environment and make autonomous decisions. In this unit, participants perform sensor integration, utilizing GPIO (General-Purpose Input/Output) digital inputs and outputs, an IMU (Inertial Measurement Unit), motor encoders, an IR (infrared) camera, and LiDAR.
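For a flavor of the GPIO work, the sketch below blinks an LED using NVIDIA's Jetson.GPIO library, whose API mirrors RPi.GPIO. The header pin number is an assumption and must match your actual wiring.

```python
import time

import Jetson.GPIO as GPIO  # NVIDIA's GPIO library for Jetson boards

LED_PIN = 12  # illustrative physical header pin; check your wiring first

GPIO.setmode(GPIO.BOARD)       # use physical header pin numbering
GPIO.setup(LED_PIN, GPIO.OUT)

try:
    for _ in range(5):         # blink the LED five times
        GPIO.output(LED_PIN, GPIO.HIGH)
        time.sleep(0.5)
        GPIO.output(LED_PIN, GPIO.LOW)
        time.sleep(0.5)
finally:
    GPIO.cleanup()             # release the pin on exit
```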
Unit 4: To prepare to navigate unknown environments, autonomous systems are often trained with data from known environments. This unit emphasizes the importance of data collection and labeling for applications like Collision Avoidance and Path Following. Participants will perform supervised learning techniques, utilizing Classification for detecting obstacles and Regression for path prediction.
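As a sketch of the classification side, the snippet below sets up a two-class ("blocked" vs. "free") image classifier with a pretrained ResNet-18 in PyTorch, similar in spirit to common JetBot collision-avoidance examples. The filename, class ordering, and backbone choice are illustrative assumptions.

```python
import torch
import torch.nn as nn
from torchvision import models, transforms
from PIL import Image

# Pretrained ResNet-18 with its final layer replaced by a two-class head;
# in practice the head would be trained on images collected by the robot.
model = models.resnet18(pretrained=True)
model.fc = nn.Linear(model.fc.in_features, 2)
model.eval()

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# "camera_frame.jpg" is a placeholder for a frame captured by the robot camera.
image = Image.open("camera_frame.jpg").convert("RGB")
with torch.no_grad():
    logits = model(preprocess(image).unsqueeze(0))
probs = torch.softmax(logits, dim=1)
print("P(blocked) =", probs[0, 0].item())  # class order depends on training labels
```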
Unit 5: AprilTags are a special type of marker that allows a robot to know its precise position (localization) and orientation (pose estimation) for accurate navigation. In this unit, participants calibrate the camera to improve AprilTag detection accuracy and leverage ROS (Robot Operating System) to perform waypoint navigation with the AprilTag markers.
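A minimal detection-and-pose sketch using the `pupil_apriltags` Python bindings (one of several AprilTag libraries) is shown below. The camera intrinsics, tag size, and image filename are placeholders; the intrinsics would come from this unit's calibration step.

```python
import cv2
from pupil_apriltags import Detector

# Camera intrinsics (fx, fy, cx, cy) and tag size are placeholder values.
camera_params = (600.0, 600.0, 320.0, 240.0)
tag_size_m = 0.05  # physical tag edge length in meters

detector = Detector(families="tag36h11")
gray = cv2.imread("camera_frame.jpg", cv2.IMREAD_GRAYSCALE)  # placeholder frame

for tag in detector.detect(gray, estimate_tag_pose=True,
                           camera_params=camera_params, tag_size=tag_size_m):
    # pose_t is the tag's translation relative to the camera, in meters
    print(f"tag {tag.tag_id}: position {tag.pose_t.ravel()}")
```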
Unit 6: Simultaneous Localization and Mapping (SLAM) is a technique used in robotics to build a map of an environment while simultaneously keeping track of the robot's location within it. In this unit, participants configure ROS to communicate over a network, allowing the JetBot to transmit LiDAR data used to generate a high-fidelity map in an Ubuntu Virtual Machine.
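As a sketch of the networking side, the ROS 1 (rospy) node below subscribes to the robot's LiDAR scans from another machine. It assumes both machines already share a ROS master (ROS_MASTER_URI and ROS_IP configured) and that scans are published on the conventional /scan topic.

```python
#!/usr/bin/env python
import rospy
from sensor_msgs.msg import LaserScan

def on_scan(msg):
    # Report the closest valid return in this scan.
    valid = [r for r in msg.ranges if msg.range_min < r < msg.range_max]
    if valid:
        rospy.loginfo("nearest obstacle: %.2f m", min(valid))

rospy.init_node("lidar_listener")
rospy.Subscriber("/scan", LaserScan, on_scan)  # "/scan" is an assumption
rospy.spin()
```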
This material is based upon work supported by the Office of Naval Research under Contract Number N00014-23-C-2015. Any opinions, findings and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the Office of Naval Research.
Required Materials
Robot Hardware and Software
- Jetson Nano Developer Kit
- 1x 64 GB Micro SD Card and Micro SD Card Reader
- Waveshare JetBot ROS AI Kit
- IR-Cut Infrared Night Vision Camera Module
- 3x 18650 Rechargeable Batteries
- USB Keyboard
- USB Mouse
- HDMI Monitor and Cable
- Laptop (up-to-date Windows PC or Mac)
- Internet access for Laptop and Jetson Nano
The following is recommended, but not required, for this course:
- HDMI-to-USB Video Capture Card
  - Note: alternatives available
Other Materials
- Building Block City Street Plates
  - Note: alternatives available
- Electrical or Painter's tape
- Open areas for the robot to safely move
- Small, colored objects for the robot to manipulate
- Boxes or other objects to serve as barriers and obstacles
- Meter sticks
- Protractors