Physical AI & Humanoid Robotics

Topic 1 — Gazebo Fundamentals & Physics Simulation

This topic grounds the idea of the digital twin in a concrete toolchain: Gazebo (formerly Ignition; releases such as Fortress and Garden) as your primary open-source physics simulator. You will learn why simulation is indispensable for humanoid robotics, how Gazebo is structured, how to describe robots and environments in SDF, and how to configure physics parameters so your simulated humanoid behaves like a plausible real robot.


1.1 Why Simulation? The Robotics Dilemma

Humanoid robots are expensive (tens of thousands of dollars) and fragile. A single bad control command can mean:

  • A fall that breaks expensive hardware.
  • A collision that damages lab infrastructure.
  • A fire or safety incident if actuators saturate or overheat.

Simulation solves a fundamental dilemma:

  • You need many experiments to design, debug, and train policies.
  • You cannot afford to run those experiments first on real hardware.

Digital Twins in Physical AI

A digital twin is a virtual replica of:

  • The robot (geometry, mass, joints, actuators, sensors).
  • The environment (terrain, obstacles, contact materials).
  • The dynamics (gravity, friction, damping, contacts).

For Physical AI, a good digital twin allows you to:

  • Run 1,000+ scenarios in simulation before touching hardware.
  • Generate labeled sensor data (RGB-D, LiDAR, IMU) for perception training.
  • Validate motion plans and controllers in a safe sandbox.

However, simulation is not perfect. Common sim-to-real gaps include:

  • Contact forces that do not match reality (grippers slip, feet skate).
  • Sensor noise and latency that are too clean or too simple.
  • Inaccurate mass/inertia, leading to unrealistic accelerations.

Your goal is not to build a perfect universe. It is to build a simulation that is realistic enough to design algorithms that still work when deployed on hardware.


1.2 Gazebo Architecture & Workflow

Gazebo (available as legacy Gazebo Classic and the modern Gazebo, formerly Ignition, with releases such as Fortress and Garden) is an open-source robotics simulator with:

  • A simulation server (gzserver in Gazebo Classic, gz sim -s in modern Gazebo): runs physics and plugins.
  • A client / GUI (gzclient or gz sim -g): visualizes worlds and robots.
  • A plugin system: C++ or Python plugins for sensors, controllers, and custom logic.
  • ROS 2 integration via gazebo_ros (or ros_gz for modern Gazebo): bridges simulation to your ROS 2 graph.

Typical Directory Structure

  • worlds/ — World definitions (.world or .sdf files).
  • models/ — Robot and object models (nested SDF, meshes, materials).
  • plugins/ — Compiled or scripted extensions (sensor, model, system plugins).

Basic Workflow

  1. Define a world: gravity, physics engine, ground plane, lights, basic geometry.
  2. Import or define a robot: URDF from Chapter 2, converted or embedded into SDF.
  3. Add sensors: cameras, LiDAR, IMUs defined in SDF.
  4. Add plugins: motor controllers, ROS 2 bridge, logging.
  5. Run simulation: step physics forward, visualize in GUI, monitor ROS 2 topics.
  6. Record data: ros2 bag record or Gazebo loggers.

This topic focuses on steps 1–4; later topics integrate ROS 2 and perception.


1.3 SDF: Simulation Description Format

While URDF describes robot kinematics and basic visuals, SDF (Simulation Description Format) is designed for full simulation:

  • Worlds, models, and lights.
  • Links and joints (like URDF, but with richer physics).
  • Collision geometry and contact parameters.
  • Sensors (camera, depth, LiDAR, IMU, GPS, etc.).
  • Plugins (model, sensor, world).

Minimal World Example

<?xml version="1.0"?>
<sdf version="1.9">
  <world name="humanoid_world">
    <!-- Physics -->
    <gravity>0 0 -9.81</gravity>
    <physics name="default_physics" type="ode">
      <max_step_size>0.001</max_step_size>
      <real_time_update_rate>1000</real_time_update_rate>
      <ode>
        <solver>
          <type>quick</type>
          <iters>50</iters>
          <sor>1.3</sor>
        </solver>
        <constraints>
          <cfm>0.0</cfm>
          <erp>0.2</erp>
        </constraints>
      </ode>
    </physics>

    <!-- Ground plane -->
    <include>
      <uri>model://ground_plane</uri>
    </include>

    <!-- Sun light -->
    <include>
      <uri>model://sun</uri>
    </include>
  </world>
</sdf>
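Before loading a world in Gazebo, it can save debugging time to confirm the file parses and that the key physics values are what you intended. A minimal sketch using Python's standard xml.etree module (the function name and trimmed-down world string are illustrative, not part of any Gazebo API):

```python
import xml.etree.ElementTree as ET

def check_world(sdf_text: str) -> dict:
    """Parse an SDF world string and return key physics settings."""
    root = ET.fromstring(sdf_text)  # raises ParseError on malformed XML
    world = root.find("world")
    physics = world.find("physics")
    return {
        "world_name": world.get("name"),
        "gravity": world.findtext("gravity"),
        "max_step_size": float(physics.findtext("max_step_size")),
        "update_rate": float(physics.findtext("real_time_update_rate")),
    }

# Usage with a trimmed-down world (same structure as the example above):
sdf = """<sdf version="1.9">
  <world name="humanoid_world">
    <gravity>0 0 -9.81</gravity>
    <physics name="default_physics" type="ode">
      <max_step_size>0.001</max_step_size>
      <real_time_update_rate>1000</real_time_update_rate>
    </physics>
  </world>
</sdf>"""
info = check_world(sdf)
print(info["world_name"], info["max_step_size"])
```

The same pattern extends to any tag you care about; a parse error here is far easier to diagnose than a silent failure inside the simulator.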

Key parameters:

  • max_step_size: Simulation time step (seconds). Smaller → more stable but slower.
  • real_time_update_rate: Target number of physics updates per wall-clock second; together with max_step_size this sets the real-time factor.
  • Solver iterations and error-reduction parameters determine contact stability.
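The first two parameters together determine the real-time factor (RTF): how many simulated seconds advance per wall-clock second, assuming the engine keeps up. A small illustrative calculation:

```python
def real_time_factor(max_step_size: float, real_time_update_rate: float) -> float:
    """Simulated seconds advanced per wall-clock second, assuming the
    physics engine keeps up with the requested update rate."""
    return max_step_size * real_time_update_rate

# The example world above: 0.001 s steps at 1000 Hz gives an RTF of 1.0 (real time).
print(real_time_factor(0.001, 1000))
# Halving the step without raising the rate slows simulated time to 0.5x.
print(real_time_factor(0.0005, 1000))
```

This is why shrinking the time step for stability has a direct cost: to keep RTF at 1.0 you must raise the update rate, which demands more compute per wall-clock second.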

A simple link with visual and collision geometry:

<link name="base_link">
  <pose>0 0 0.5 0 0 0</pose>

  <inertial>
    <mass>10.0</mass>
    <inertia>
      <ixx>0.5</ixx>
      <ixy>0.0</ixy>
      <ixz>0.0</ixz>
      <iyy>0.5</iyy>
      <iyz>0.0</iyz>
      <izz>0.5</izz>
    </inertia>
  </inertial>

  <collision name="base_collision">
    <geometry>
      <box>
        <size>0.4 0.3 0.1</size>
      </box>
    </geometry>
    <surface>
      <friction>
        <ode>
          <mu>0.8</mu>
          <mu2>0.8</mu2>
        </ode>
      </friction>
    </surface>
  </collision>

  <visual name="base_visual">
    <geometry>
      <box>
        <size>0.4 0.3 0.1</size>
      </box>
    </geometry>
    <material>
      <ambient>0.2 0.2 0.8 1</ambient>
      <diffuse>0.2 0.2 0.8 1</diffuse>
    </material>
  </visual>
</link>

Best practices:

  • Always specify mass and inertia for links (never leave defaults).
  • Use simple collision shapes (boxes, cylinders) even if visuals are meshes.
  • Set friction coefficients for contacts that matter (feet, grippers).
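To make "never leave defaults" actionable: for a solid box, the diagonal inertia terms follow the standard formula I_xx = m(y² + z²)/12 (and cyclic permutations). A sketch for generating values to paste into the inertia tags; note that the 0.5 values in the base_link example above are round placeholders, and the physically consistent values for that box are much smaller:

```python
def box_inertia(mass, x, y, z):
    """Diagonal inertia tensor of a solid box with side lengths x, y, z
    (metres) about its centre of mass, in kg*m^2."""
    ixx = mass * (y**2 + z**2) / 12.0
    iyy = mass * (x**2 + z**2) / 12.0
    izz = mass * (x**2 + y**2) / 12.0
    return ixx, iyy, izz

# For the 10 kg, 0.4 x 0.3 x 0.1 m base_link above:
ixx, iyy, izz = box_inertia(10.0, 0.4, 0.3, 0.1)
print(f"<ixx>{ixx:.4f}</ixx> <iyy>{iyy:.4f}</iyy> <izz>{izz:.4f}</izz>")
```

Analogous closed-form formulas exist for cylinders and spheres; using them (or a CAD export) beats guessing, because wrong inertia is a leading cause of unstable simulations.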

Joints in SDF

Joints connect links and define degrees of freedom:

<joint name="shoulder_pitch" type="revolute">
  <parent>torso</parent>
  <child>upper_arm</child>
  <pose>0.15 0.1 0.3 0 0 0</pose>
  <axis>
    <xyz>0 1 0</xyz>
    <limit>
      <lower>-1.57</lower>
      <upper>1.57</upper>
      <effort>80.0</effort>
      <velocity>3.0</velocity>
    </limit>
    <dynamics>
      <damping>0.3</damping>
      <friction>0.1</friction>
    </dynamics>
  </axis>
</joint>

Joint limits, damping, and friction heavily influence stability and realism.


1.4 Physics Simulation: Rigid Body Dynamics

Gazebo’s physics engine (ODE/Bullet/DART/PhysX depending on configuration) simulates:

  • Rigid bodies: Links with mass and inertia.
  • Joints: Constraints between links with limits and motors.
  • Collisions: Contacts between collision geometries.
  • Forces and torques: Gravity, actuators, external pushes.

Forces, Torques, and Joint Motors

Conceptually, joint motors compute torques (often written as tau) to drive motion:

  • Position control (PD example): tau = K_p * (q_target - q) + K_d * (qdot_target - qdot)
  • Velocity control: direct velocity targets with damping (e.g., tau = K_d * (v_target - v)).

You rarely implement these equations directly in SDF; instead, you:

  • Configure effort, velocity, and dynamics fields.
  • Use controllers (Gazebo plugins or ROS 2 controllers) to compute commands.
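The PD law above fits in a few lines. In practice a Gazebo plugin or ros2_control computes the equivalent, but the arithmetic is the same; the gains here are illustrative placeholders, not tuned values:

```python
def pd_torque(q_target, q, qdot_target, qdot, kp, kd, effort_limit):
    """PD position control: tau = Kp*(q_t - q) + Kd*(qdot_t - qdot),
    saturated at the joint's effort limit."""
    tau = kp * (q_target - q) + kd * (qdot_target - qdot)
    return max(-effort_limit, min(effort_limit, tau))

# One control tick for a shoulder_pitch-like joint (80 N*m effort limit):
tau = pd_torque(q_target=0.5, q=0.2, qdot_target=0.0, qdot=0.1,
                kp=100.0, kd=5.0, effort_limit=80.0)
print(tau)  # Kp*0.3 + Kd*(-0.1), approximately 29.5
```

The effort clamp matters: with a large position error, an unclamped PD law can command torques far beyond what the joint's effort limit allows, which is one common source of simulated "explosions."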

Time Stepping and Stability

Key stability levers:

  • Time step (max_step_size):
    • Too large → interpenetration, missed collisions, “exploding” robots.
    • Typical values: 0.001 to 0.005 seconds.
  • Solver iterations:
    • More iterations → better contact resolution but slower simulation.
    • Start with ~50, increase if you see jitter or penetration.
  • Damping:
    • Under-damped joints oscillate; over-damped joints feel sluggish.
    • Start with conservative damping and reduce as needed.

If your humanoid explodes, check:

  • Inertia tensors (not accidentally tiny or enormous).
  • Joint limits (no impossible constraints).
  • Time step (too large).
  • Solver iterations (too few).
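The first item on that checklist can be partially automated. A hedged sketch (function name and thresholds are illustrative, not any Gazebo tooling) that scans an SDF string for links with missing or implausible inertials using only the standard library:

```python
import xml.etree.ElementTree as ET

def lint_links(sdf_text, min_mass=0.01, max_mass=1000.0):
    """Flag links with no <inertial> block or an implausible <mass>."""
    warnings = []
    for link in ET.fromstring(sdf_text).iter("link"):
        name = link.get("name", "?")
        inertial = link.find("inertial")
        if inertial is None:
            warnings.append(f"{name}: no <inertial> (engine will use defaults)")
            continue
        mass = float(inertial.findtext("mass", "0"))
        if not (min_mass <= mass <= max_mass):
            warnings.append(f"{name}: suspicious mass {mass} kg")
    return warnings

# Usage on a toy model with one good link and one missing its inertial:
sdf = """<sdf version="1.9"><model name="demo">
  <link name="ok"><inertial><mass>10.0</mass></inertial></link>
  <link name="bad"/>
</model></sdf>"""
issues = lint_links(sdf)
print(issues)
```

Running a check like this on every model before spawning it catches the "accidentally tiny or enormous" inertia case before the robot ever explodes.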

1.5 Sensor Simulation in Gazebo (Overview)

Gazebo can simulate many sensor types:

  • Cameras: RGB, depth, segmentation.
  • LiDAR: 2D/3D laser scanners with configurable resolution and FOV.
  • IMUs: Accelerometers, gyroscopes, magnetometers.
  • GPS/GNSS: For outdoor robots (less critical for indoor humanoids).

Sensors are defined in SDF under a link:

<sensor name="head_camera" type="camera">
  <always_on>true</always_on>
  <update_rate>30</update_rate>
  <pose>0 0 0.1 0 0 0</pose>
  <camera>
    <horizontal_fov>1.047</horizontal_fov> <!-- 60 degrees -->
    <image>
      <width>640</width>
      <height>480</height>
      <format>R8G8B8</format>
    </image>
    <clip>
      <near>0.1</near>
      <far>10.0</far>
    </clip>
  </camera>
  <plugin name="camera_ros" filename="libgazebo_ros_camera.so">
    <ros>
      <namespace>/humanoid</namespace>
      <remapping>image_raw:=/camera/rgb/image_raw</remapping>
    </ros>
  </plugin>
</sensor>

LiDAR and IMU sensors follow similar patterns with specialized tags. A later topic dives deep into sensor configuration and noise models; for now, remember:

  • Match FOV, resolution, and range to your real hardware.
  • Add noise that resembles real sensors (not perfectly clean).
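Gazebo's sensor SDF supports a noise element (commonly additive Gaussian). The effect can be sketched with the standard library; the stddev here is an illustrative placeholder, not a datasheet value:

```python
import random

def gaussian_noise_channel(true_values, mean=0.0, stddev=0.017, seed=42):
    """Additive Gaussian noise on a sensor channel, analogous to an SDF
    Gaussian noise model. Seeded for reproducibility."""
    rng = random.Random(seed)
    return [v + rng.gauss(mean, stddev) for v in true_values]

# A perfectly clean gyro reading of zero becomes plausibly noisy:
clean = [0.0] * 5
noisy = gaussian_noise_channel(clean)
print(noisy)  # small nonzero values scattered around 0.0
```

Training perception or state estimation on perfectly clean channels like `clean` is exactly the sim-to-real gap described in Section 1.1; even a simple model like this one narrows it.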

1.6 ROS 2 Integration Preview (gazebo_ros)

Gazebo becomes truly powerful when connected to your ROS 2 system:

  • Sensor plugins publish data to ROS 2 topics (e.g., /camera/rgb, /scan, /imu).
  • Model plugins subscribe to motor command topics and apply joint torques.
  • World plugins expose services/actions for resetting or querying the simulation.

Typical integration steps:

  1. Launch Gazebo with gazebo_ros plugins enabled.
  2. Use ROS 2 launch files (from Chapter 2) to start:
    • Gazebo server + world file.
    • Your humanoid model.
    • Sensor and motor controller nodes.
  3. Visualize data in RViz (topics match real robot topics where possible).

You will implement this end-to-end in the Topic 1 lab.


1.7 Hands-On Lab: Build Your First Gazebo Humanoid World

In this lab, you will:

  1. Create a world file (humanoid_world.sdf):
    • Ground plane, gravity, physics engine configuration.
    • Simple environment (floor, one or two obstacles).
  2. Import your humanoid URDF from Chapter 2:
    • Use gazebo_ros tools or ros2_control to spawn the robot.
  3. Add a sensor suite:
    • Head-mounted RGB camera.
    • Depth camera or RGB-D.
    • IMU at the torso.
  4. Set up ROS 2 bridge:
    • Sensors publish to topics like /humanoid/camera/rgb, /humanoid/imu.
    • Motor commands subscribed from /humanoid/joint_commands.
  5. Implement a basic motor controller:
    • A simple Gazebo or ROS 2 controller that:
      • Receives joint position targets.
      • Applies position or velocity commands to joint motors.
  6. Test and visualize:
    • Send joint commands via a ROS 2 node or CLI.
    • Confirm the humanoid moves and does not fall through the floor.
    • Visualize sensor data in RViz.

Success criteria:

  • The robot spawns in Gazebo and stands on the ground plane.
  • Motor commands (e.g., arm wave) cause visible motion without instability.
  • RGB / depth / IMU topics publish data at expected rates.
  • No simulation crashes or obvious “explosions.”

Document:

  • The world and model files you created.
  • The topics you used for commands and sensors.
  • Any physics parameters you had to tune to achieve stability.

This lab establishes your baseline digital twin in Gazebo, which later modules will extend with high-fidelity rendering, synthetic data, and sim-to-real validation.