Robot Operating System
ROS2 (Robot Operating System 2) is middleware for robotics — a standardized communication framework and ecosystem of reusable packages. It provides pub/sub messaging, service calls, hardware abstraction, and a vast library of algorithms (SLAM, navigation, manipulation) so you don’t build everything from scratch.
Why It Matters
ROS2 is the de facto standard for research and increasingly for production robotics. It lets you compose a robot system from modular nodes — swap a LiDAR driver, change a SLAM algorithm, add a planner — without rewriting everything. Understanding ROS2 architecture is essential for working with any modern robot software stack.
Core Concepts
Nodes
A node is a single-purpose process: one node reads the LiDAR, another runs SLAM, another plans paths. Each node is independently launchable and restartable.
```python
import rclpy
from rclpy.node import Node

class MinimalNode(Node):
    def __init__(self):
        super().__init__('my_node')
        self.get_logger().info('Node started')

rclpy.init()
node = MinimalNode()
rclpy.spin(node)        # blocks, processing callbacks, until shutdown
node.destroy_node()
rclpy.shutdown()
```
Topics (Pub/Sub)
Asynchronous, many-to-many messaging. Publishers send, subscribers receive. Decoupled — publisher doesn’t know who’s listening.
/imu/data → IMU driver publishes Imu messages at 200Hz
/scan → LiDAR driver publishes LaserScan at 10Hz
/cmd_vel → Navigation publishes Twist (velocity commands)
/map → SLAM publishes OccupancyGrid
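That decoupling is easy to picture in plain Python. A toy bus (no ROS required; `TopicBus`, `subscribe`, and `publish` are illustrative names, not rclpy API) where a topic name maps to a list of callbacks:

```python
from collections import defaultdict

class TopicBus:
    """Toy many-to-many pub/sub: a topic name maps to subscriber callbacks."""
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)

    def publish(self, topic, msg):
        # The publisher does not know (or care) who is listening.
        for callback in self._subs[topic]:
            callback(msg)

bus = TopicBus()
heard = []
bus.subscribe('/chatter', heard.append)       # subscriber 1
bus.subscribe('/chatter', lambda m: None)     # subscriber 2, fully independent
bus.publish('/chatter', 'hello')
bus.publish('/unheard', 'no subscribers')     # silently dropped, no error
```

Adding or removing a subscriber never touches the publisher, which is exactly what makes swapping nodes cheap.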
```python
from std_msgs.msg import String

# Publisher (inside a Node subclass)
self.pub = self.create_publisher(String, '/chatter', 10)
msg = String()
msg.data = 'hello'
self.pub.publish(msg)

# Subscriber (callback is a method on the same class)
self.sub = self.create_subscription(String, '/chatter', self.callback, 10)

def callback(self, msg):
    self.get_logger().info(f'Heard: {msg.data}')
```
Services (Request/Response)
Synchronous, one-to-one. Client sends request, server computes, returns response.
/set_mode → change robot state (manual/auto)
/spawn_entity → add object to simulation
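The round trip can be modeled without ROS. In this sketch (`ServiceRegistry` and its methods are illustrative stand-ins, not rclpy API) exactly one server owns each service name and the client blocks on the call:

```python
class ServiceRegistry:
    """Toy one-to-one request/response: one server handler per service name."""
    def __init__(self):
        self._servers = {}

    def create_service(self, name, handler):
        self._servers[name] = handler

    def call(self, name, request):
        # Client blocks until the server's handler returns a response.
        return self._servers[name](request)

registry = ServiceRegistry()
registry.create_service('/set_mode',
                        lambda req: {'ok': req in ('manual', 'auto')})

response = registry.call('/set_mode', 'auto')   # {'ok': True}
```

Contrast with topics: the caller names one specific server and waits for its answer, rather than broadcasting to whoever happens to listen.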
Actions (Long-Running Tasks)
Like services but with progress feedback and cancellation:
/navigate_to_pose → send goal, get periodic progress, final result
/follow_path → execute path, report completion percentage
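Periodic feedback plus a final result is what separates actions from services. A generator gives a compact model (illustrative only, not the rclpy action API): each `yield` is a feedback message, the `return` value is the result, and the caller can stop iterating to cancel.

```python
def navigate_to_pose(goal, steps=5):
    """Toy long-running task: yields progress feedback, returns a result."""
    for i in range(1, steps + 1):
        yield i / steps                 # periodic feedback (fraction complete)
    return {'reached': goal}

feedback = []
task = navigate_to_pose((2.0, 3.0))
result = None
try:
    while True:
        feedback.append(next(task))     # caller could break here to "cancel"
except StopIteration as done:
    result = done.value                 # final result arrives at completion
```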
Communication Pattern
```
┌────────────┐  /scan    ┌───────────┐  /map   ┌─────────────┐
│ LiDAR node │ --------→ │ SLAM node │ ------→ │ Navigation  │
└────────────┘           └───────────┘         │    node     │
                               ↑               └──────┬──────┘
┌────────────┐  /imu/data      │                      │ /cmd_vel
│  IMU node  │ ----------------┘                      ↓
└────────────┘                                 ┌──────────────┐
                                               │ Motor driver │
                                               │     node     │
                                               └──────────────┘
```
DDS (Data Distribution Service)
ROS2 replaced the ROS1 master with DDS — a decentralized middleware standard:
| Feature | ROS1 | ROS2 |
|---|---|---|
| Discovery | Central master (single point of failure) | Decentralized (DDS multicast) |
| Transport | Custom (TCPROS) | DDS (standardized, QoS) |
| Real-time | Not designed for it | DDS supports real-time QoS |
| Platforms | Linux only | Linux, Windows, macOS, RTOS |
Quality of Service (QoS)
| Policy | Options | Use Case |
|---|---|---|
| Reliability | Best-effort / Reliable | Sensor data (best-effort) vs commands (reliable) |
| Durability | Volatile / Transient local | Late subscribers get last message (map, config) |
| History | Keep last N / Keep all | Sensor: keep last 1; Log: keep all |
| Deadline | Max time between messages | Detect sensor failure |
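Two of these policies are easy to picture in plain Python: "keep last N" history is a bounded buffer, and transient-local durability means a late subscriber still receives the last published message. A toy model (illustrative only, not rclpy's `QoSProfile`):

```python
from collections import deque

class Topic:
    """Toy topic with KEEP_LAST history and TRANSIENT_LOCAL durability."""
    def __init__(self, depth=1):
        self.history = deque(maxlen=depth)   # keep last N: old messages drop off
        self.subscribers = []

    def publish(self, msg):
        self.history.append(msg)
        for cb in self.subscribers:
            cb(msg)

    def subscribe(self, cb):
        self.subscribers.append(cb)
        for msg in self.history:             # transient local: replay to latecomers
            cb(msg)

map_topic = Topic(depth=1)
map_topic.publish('map_v1')
map_topic.publish('map_v2')                  # depth=1: map_v1 is discarded

late = []
map_topic.subscribe(late.append)             # subscribed after both publishes
```

This is why a map or a config topic is typically published transient-local with depth 1: nodes started late still get the current map.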
tf2 (Transforms)
A tree of coordinate frame transforms, continuously updated:
```
world → odom → base_link → imu_link
                         → lidar_link
                         → camera_link
```
Any node can look up the transform between any two frames at any time. Essential for combining sensor data from different positions on the robot.
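A lookup between two frames amounts to composing the transforms along the tree path. A minimal 2D sketch of that chaining (pure Python with made-up numbers; real tf2 works in 3D with quaternions and timestamps):

```python
import math

def compose(a, b):
    """Compose 2D transforms (x, y, theta): apply b in a's rotated frame."""
    ax, ay, ath = a
    bx, by, bth = b
    return (ax + bx * math.cos(ath) - by * math.sin(ath),
            ay + bx * math.sin(ath) + by * math.cos(ath),
            ath + bth)

# Edges of the tf tree: (parent, child) -> transform
tree = {
    ('world', 'odom'):           (1.0, 0.0, 0.0),
    ('odom', 'base_link'):       (2.0, 1.0, math.pi / 2),
    ('base_link', 'lidar_link'): (0.5, 0.0, 0.0),
}

# world -> lidar_link: chain the transforms down the tree path
t = (0.0, 0.0, 0.0)
for edge in [('world', 'odom'), ('odom', 'base_link'),
             ('base_link', 'lidar_link')]:
    t = compose(t, tree[edge])
# base_link sits at (3, 1) rotated 90°, so a lidar mounted 0.5 m
# ahead of it ends up near (3.0, 1.5) in the world frame
```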
```python
import rclpy
from tf2_ros import TransformListener, Buffer

# Inside a Node: the buffer caches transforms, the listener fills it from /tf
tf_buffer = Buffer()
tf_listener = TransformListener(tf_buffer, self)

# Time() with no arguments means "latest available transform"
transform = tf_buffer.lookup_transform('odom', 'base_link', rclpy.time.Time())
```
Key Packages
| Package | Purpose |
|---|---|
| nav2 | Full autonomous navigation stack (path planning, obstacle avoidance, recovery) |
| MoveIt2 | Robot arm motion planning (IK, collision avoidance, trajectory execution) |
| slam_toolbox | 2D LiDAR SLAM |
| robot_localization | EKF/UKF sensor fusion for odometry |
| image_pipeline | Camera calibration, stereo, depth |
| Gazebo | Physics simulation with ROS2 integration |
Build System
```bash
# Create workspace and a package
mkdir -p ~/ros2_ws/src
cd ~/ros2_ws/src
ros2 pkg create --build-type ament_python my_package

# Build
cd ~/ros2_ws
colcon build
source install/setup.bash

# Run
ros2 run my_package my_node
ros2 launch my_package my_launch.py
```
Related
- SLAM — slam_toolbox, cartographer run as ROS2 nodes
- Path Planning — nav2 implements planning algorithms
- Sensor Fusion — robot_localization node fuses IMU + odometry
- Computer Vision for Robotics — image processing via ROS2 image_pipeline
- Kinematics — MoveIt2 handles robot arm kinematics