Welcome to Isaac Lab!#
Isaac Lab is a unified and modular framework for robot learning that aims to simplify common workflows in robotics research (such as reinforcement learning, learning from demonstrations, and motion planning). It is built on NVIDIA Isaac Sim to leverage the latest simulation capabilities for photo-realistic scenes, and fast and efficient simulation.
The core objectives of the framework are:
Modularity: Easily customize and add new environments, robots, and sensors.
Agility: Adapt to the changing needs of the community.
Openness: Remain open-source to allow the community to contribute to and extend the framework.
Batteries-included: Include a number of environments, sensors, and tasks that are ready to use.
Key features available in Isaac Lab include fast and accurate physics simulation provided by PhysX, tiled rendering APIs for vectorized rendering, domain randomization for improving robustness and adaptability, and support for running in the cloud.
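To make the domain-randomization feature concrete: in manager-based environments, randomization is declared as event terms inside the environment's configuration file (which is imported after the simulation app is launched). The following is a minimal sketch modeled on the pattern used by the bundled locomotion tasks; the isaaclab module paths and the "robot" and "base" names are assumptions that depend on your scene definition and installed release.

from isaaclab.managers import EventTermCfg as EventTerm
from isaaclab.managers import SceneEntityCfg
from isaaclab.utils import configclass

import isaaclab.envs.mdp as mdp


@configclass
class EventCfg:
    """Randomization terms applied across all parallel environment instances."""

    # Perturb the mass of the robot's base link once at startup, independently
    # for each environment instance ("robot" and "base" are placeholder names
    # taken from the scene definition).
    add_base_mass = EventTerm(
        func=mdp.randomize_rigid_body_mass,
        mode="startup",
        params={
            "asset_cfg": SceneEntityCfg("robot", body_names="base"),
            "mass_distribution_params": (-5.0, 5.0),
            "operation": "add",
        },
    )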
Additionally, Isaac Lab provides a variety of environments, and we are actively working on adding more to the list. These include classic control tasks, fixed-arm and dexterous manipulation tasks, legged locomotion tasks, and navigation tasks. A complete list is available in the Environments section.
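For a flavor of the workflow, the snippet below shows one common way to instantiate a bundled task as a vectorized gymnasium environment. It is a minimal sketch rather than a full training script, and it assumes the package layout and the Isaac-Cartpole-v0 task name of a recent Isaac Lab release.

from isaaclab.app import AppLauncher

# The Omniverse app must be launched before importing any other Isaac Lab modules.
app_launcher = AppLauncher(headless=True)
simulation_app = app_launcher.app

import gymnasium as gym

import isaaclab_tasks  # noqa: F401  (importing this package registers the bundled tasks)
from isaaclab_tasks.utils import parse_env_cfg

# Load the task's default configuration and vectorize it across 64 parallel
# environment instances simulated on the GPU.
env_cfg = parse_env_cfg("Isaac-Cartpole-v0", num_envs=64)
env = gym.make("Isaac-Cartpole-v0", cfg=env_cfg)

obs, _ = env.reset()
env.close()
simulation_app.close()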
Isaac Lab is developed with a number of robot assets that come batteries-included as part of the platform and are ready to learn! These robots include:
Classic: Cartpole, Humanoid, Ant
Fixed-Arm and Hands: UR10, Franka, Allegro, Shadow Hand
Quadrupeds: ANYbotics ANYmal-B, ANYmal-C, ANYmal-D, Unitree A1, Unitree Go1, Unitree Go2, Boston Dynamics Spot
Humanoids: Unitree H1, Unitree G1
Quadcopter: Crazyflie
The platform is also designed so that you can add your own robots! Please refer to the How-to Guides section for details.
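As a taste of what adding a robot involves, a new robot is usually described through an articulation configuration that points at its USD asset. The sketch below is hypothetical: the USD path, joint-name pattern, and gain values are placeholders, and the class names assume the isaaclab package layout of a recent release.

import isaaclab.sim as sim_utils
from isaaclab.actuators import ImplicitActuatorCfg
from isaaclab.assets import ArticulationCfg

# Hypothetical asset configuration: the USD path, joint-name regex, and gains
# are placeholders showing the overall shape of a robot definition.
MY_ROBOT_CFG = ArticulationCfg(
    # Where to find the robot's USD file on disk (placeholder path).
    spawn=sim_utils.UsdFileCfg(usd_path="/path/to/my_robot.usd"),
    # Joint state the robot is reset to at the start of each episode.
    init_state=ArticulationCfg.InitialStateCfg(joint_pos={".*": 0.0}),
    # PD gains for every joint, grouped under a single actuator model.
    actuators={
        "all_joints": ImplicitActuatorCfg(
            joint_names_expr=[".*"],
            stiffness=100.0,
            damping=10.0,
        ),
    },
)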
For more information about the framework, please refer to the technical report [MRT+25]. For clarifications on the NVIDIA Isaac ecosystem, please check out the Isaac Lab Ecosystem section.
License#
The Isaac Lab framework is open-sourced under the BSD-3-Clause license, with certain parts under the Apache-2.0 license. Please refer to the License page for more details.
Citation#
If you use Isaac Lab in your research, please cite our technical report:
@article{mittal2025isaaclab,
title={Isaac Lab: A GPU-Accelerated Simulation Framework for Multi-Modal Robot Learning},
author={Mayank Mittal and Pascal Roth and James Tigue and Antoine Richard and Octi Zhang and Peter Du and Antonio Serrano-Muñoz and Xinjie Yao and René Zurbrügg and Nikita Rudin and Lukasz Wawrzyniak and Milad Rakhsha and Alain Denzler and Eric Heiden and Ales Borovicka and Ossama Ahmed and Iretiayo Akinola and Abrar Anwar and Mark T. Carlson and Ji Yuan Feng and Animesh Garg and Renato Gasoto and Lionel Gulich and Yijie Guo and M. Gussert and Alex Hansen and Mihir Kulkarni and Chenran Li and Wei Liu and Viktor Makoviychuk and Grzegorz Malczyk and Hammad Mazhar and Masoud Moghani and Adithyavairavan Murali and Michael Noseworthy and Alexander Poddubny and Nathan Ratliff and Welf Rehberg and Clemens Schwarke and Ritvik Singh and James Latham Smith and Bingjie Tang and Ruchik Thaker and Matthew Trepte and Karl Van Wyk and Fangzhou Yu and Alex Millane and Vikram Ramasamy and Remo Steiner and Sangeeta Subramanian and Clemens Volk and CY Chen and Neel Jawale and Ashwin Varghese Kuruttukulam and Michael A. Lin and Ajay Mandlekar and Karsten Patzwaldt and John Welsh and Huihua Zhao and Fatima Anes and Jean-Francois Lafleche and Nicolas Moënne-Loccoz and Soowan Park and Rob Stepinski and Dirk Van Gelder and Chris Amevor and Jan Carius and Jumyung Chang and Anka He Chen and Pablo de Heras Ciechomski and Gilles Daviet and Mohammad Mohajerani and Julia von Muralt and Viktor Reutskyy and Michael Sauter and Simon Schirm and Eric L. Shi and Pierre Terdiman and Kenny Vilella and Tobias Widmer and Gordon Yeoman and Tiffany Chen and Sergey Grizan and Cathy Li and Lotus Li and Connor Smith and Rafael Wiltz and Kostas Alexis and Yan Chang and David Chu and Linxi "Jim" Fan and Farbod Farshidian and Ankur Handa and Spencer Huang and Marco Hutter and Yashraj Narang and Soha Pouya and Shiwei Sheng and Yuke Zhu and Miles Macklin and Adam Moravanszky and Philipp Reist and Yunrong Guo and David Hoeller and Gavriel State},
journal={arXiv preprint arXiv:2511.04831},
year={2025},
url={https://arxiv.org/abs/2511.04831}
}
Acknowledgement#
Isaac Lab's development began with the Orbit framework. We gratefully acknowledge the authors of Orbit for their foundational contributions.
Table of Contents#
Isaac Lab
Getting Started
- Quickstart Guide
- Build your Own Project or Task
- Walkthrough
- Tutorials
  - Creating an empty scene
  - Spawning prims into the scene
  - Deep-dive into AppLauncher
  - Adding a New Robot to Isaac Lab
  - Interacting with a rigid object
  - Interacting with an articulation
  - Interacting with a deformable object
  - Interacting with a surface gripper
  - Using the Interactive Scene
  - Creating a Manager-Based Base Environment
  - Creating a Manager-Based RL Environment
  - Creating a Direct Workflow RL Environment
  - Registering an Environment
  - Training with an RL Agent
  - Configuring an RL Agent
  - Modifying an existing Direct RL Environment
  - Policy Inference in USD Environment
  - Adding sensors on a robot
  - Using a task-space controller
  - Using an operational space controller
- How-to Guides
  - Importing a New Asset
  - Writing an Asset Configuration
  - Making a physics prim fixed in the simulation
  - Spawning Multiple Assets
  - Saving rendered images and 3D re-projection
  - Find How Many/What Cameras You Should Train With
  - Configuring Rendering Settings
  - Creating Visualization Markers
  - Wrapping environments
  - Adding your own learning library
  - Recording Animations of Simulations
  - Recording video clips during training
  - Curriculum Utilities
  - Mastering Omniverse for Robotics
  - Setting up CloudXR Teleoperation
  - Setting up Haply Teleoperation
  - Simulation Performance and Tuning
  - Optimize Stage Creation
- Developer’s Guide
Overview
Features
Experimental Features
Migration Guides
Source API
References