This is my productionized deployment environment for my real-world AI robot policies, and a place to create training data for reinforcement learning and imitation learning.
Video Demo:
It's also the first project I've built using create-ros-app, a template I'm developing to make it easier for anyone to create and deploy production-ready ROS2 applications.
Check out the large-scale synthetic dataset that's been generated with this project!
Phase 1 focuses on combining imitation learning with synthetic data, improving the robustness of imitation-learning datasets with respect to lighting, camera position, and other environmental factors.
Detailed Status
- Develop a decent teleoperation interface for the MyArm M&C robot leader/follower arms
- Add Isaac Sim support for visualizing the above arms
- Learn how to use Replicator to multiplex trajectories of human demonstrations of robot tasks done in-simulation
- Create dataset collection tools based on lerobot dataset format
- Play around with, train, and test lerobot policies. Done: now available via this fork of lerobot
- Create isaac -> lerobot conversion scripts
- Validate synthetic data improves performance
- In Progress: Collect a synthetic dataset for the cube->basket task
- In Progress: Create STL assets for the cube and basket
- Add randomization for position and scale of ground plane (and other objects too)
- Move main-scene to isaac_src/scenes/cube-cup-task.scene and set it up with 25 preset positions
- Record myself moving the robot IRL, and compare to the simulation when rendered at 30 FPS
- Validate that observation.state vs action latency (and direction of latency) matches real-world captured datasets
- Measured (with this project): 14 frames latency from action -> observation
- Measured (with myarm lerobot branch): ~9 frames latency from action -> observation
- Validate frames are synced as expected, for example, when the robot starts moving in the opposite direction
- Frames appear synced; however, when compared to real-world footage there is a speed-scaling issue...
- Validate "Replay episode" works as expected with episodes collected in Isaac Sim
- Investigate high latency in myarm loop (it is higher than in lerobot branch)
- Better calibrate the robots so that positions in sim match the real world
- Record position of articulator in sim, not just real robot joints
- Fix bug with myarm firmware where there's a singularity at the 0 point
- Actually collect 50 samples
- Sanity check a few episodes in the Lerobot visualizer before uploading
- Upload and start training
- Create a RobotProtocol that emulates the latency and speed of my real robot (see the sketch after this roadmap)
- Collect a small real dataset for cube->basket task
- Train a model on synthetic data, fine-tune on real data
- Train a model on real data
- Compare performance of model trained on synthetic data vs real data
- Create & document easy workflows for:
- Record demonstrations with the real leader arm and a simulated follower arm
- Multiplex demonstrations using domain randomization, leveraging Replicator learnings above
- Training models with a mix of real and simulated data
- Benchmark the sim2real gap with this project, and publish the results to the open-source community
- Add support for the Koch arm and other open-source robot arms
Add reinforcement learning pipelines with ROS2 and Isaac Sim support.
Support long-horizon tasks involving multiple policies, with a focus on VLMs and language-grounded interaction with the robot.
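To make the RobotProtocol roadmap item above concrete, here is a minimal sketch of latency emulation: a wrapper that delays observations by a fixed number of control steps, approximating the ~9-14 frames of action -> observation latency measured above. The RobotProtocol interface shown is a hypothetical stand-in, not this repo's actual class.

```python
from collections import deque
from typing import Protocol

import numpy as np


class RobotProtocol(Protocol):
    """Hypothetical interface; the real class in this repo may differ."""

    def send_action(self, action: np.ndarray) -> None: ...
    def read_joints(self) -> np.ndarray: ...


class LatencyEmulatingRobot:
    """Delays observations by `latency_frames` control steps to mimic the
    action -> observation latency measured on the real arm."""

    def __init__(self, robot: RobotProtocol, latency_frames: int = 9) -> None:
        self._robot = robot
        # Ring buffer of recent joint readings; the oldest entry is returned.
        self._history: deque = deque(maxlen=latency_frames + 1)

    def send_action(self, action: np.ndarray) -> None:
        self._robot.send_action(action)

    def read_joints(self) -> np.ndarray:
        self._history.append(self._robot.read_joints())
        # Until the buffer fills, this returns the earliest reading available.
        return self._history[0]
```

Note this only models delay; the speed-scaling mismatch noted in the checklist would need a separate time-resampling step.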
The simplest thing you can do is teleoperation, with your robot arm mirrored in Isaac Sim. To do that, run:
```bash
docker/launch teleop_myarm
```
This will automatically build and run everything needed for the project. Then, open http://localhost/ in your browser to view the project logs.
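Under the hood, teleoperation boils down to reading the leader arm's joint positions and forwarding them to the simulated follower. As a rough illustration (not this repo's actual node, and the topic names are hypothetical placeholders), a minimal ROS2 relay could look like:

```python
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import JointState


class TeleopRelay(Node):
    """Mirrors leader-arm joint states onto the simulated follower arm.
    Topic names below are hypothetical, not this repo's actual topics."""

    def __init__(self) -> None:
        super().__init__("teleop_relay")
        self._pub = self.create_publisher(JointState, "/follower/joint_command", 10)
        self.create_subscription(JointState, "/leader/joint_states", self._on_joints, 10)

    def _on_joints(self, msg: JointState) -> None:
        # Republish the leader's joint positions as the follower's command.
        self._pub.publish(msg)


def main() -> None:
    rclpy.init()
    rclpy.spin(TeleopRelay())


if __name__ == "__main__":
    main()
```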
For in-depth documentation on the repository features, read the About Template documentation.
There are three steps to synthetic data generation. First, open the `synthetic_generation` launch profile:
```bash
docker/launch synthetic_generation
```
This will spin up Isaac Sim. Set up extensions using this guide.
This extension is custom-built and stored in `isaac_src/extensions`. Next, plug in your robot. When using a real-world robot, you will need to add a `parameters.overrides.yaml` file to the root of `launch-profiles/synthetic_generation` to configure your specific robot's drivers.
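For illustration, ROS2 parameter override files follow the standard node-name / `ros__parameters` layout; the node and parameter names below are purely hypothetical placeholders for whatever your robot's driver actually expects:

```yaml
# parameters.overrides.yaml - hypothetical example; check your driver's docs
myarm_driver:
  ros__parameters:
    serial_port: /dev/ttyUSB0
    baud_rate: 115200
```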
Next, record an episode! Configure the "Trajectory Recorder" window, and click "Start Recording":
Once you've recorded an episode, you can re-render with domain randomization using the "Trajectory Renderer" extension:
This will generate a configurable number of episodes while varying everything possible, including joint interpolation to move the robot faster or slower than the original demonstration.
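For a sense of what this kind of domain randomization involves, here is a minimal standalone Omniverse Replicator sketch (not this repo's extension code) that re-randomizes camera pose and lighting on every rendered frame; run it from Isaac Sim's Script Editor:

```python
import omni.replicator.core as rep

with rep.new_layer():
    camera = rep.create.camera()
    # A render product is needed for the camera to actually render frames.
    render_product = rep.create.render_product(camera, (640, 480))
    dome_light = rep.create.light(light_type="Dome")

    # Re-randomize the camera pose and light intensity on each frame.
    with rep.trigger.on_frame(num_frames=25):
        with camera:
            rep.modify.pose(
                position=rep.distribution.uniform((-1.0, -1.0, 1.0), (1.0, 1.0, 2.0)),
                look_at=(0.0, 0.0, 0.0),
            )
        with dome_light:
            rep.modify.attribute("intensity", rep.distribution.uniform(300.0, 3000.0))
```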
Finally, convert the rendered episodes to a dataset:
```bash
docker/run convert_isaac_to_lerobot \
  --episodes_dir /robot/synthetic-output/recordings \
  --output_dir /robot/synthetic-output/lerobot-fmt \
  --fps 30 \
  --task "Pick up a cube and place it in the basket"
```
Requirements
- Docker, and optionally the NVIDIA Container Toolkit for hardware acceleration.
- Poetry, in order to use the linting tooling.
This repository was initialized by the create-ros-app template. Contributions are welcome!