
Merge pull request #225 from tokk-nv/main
Add images to LeRobot tutorial
tokk-nv authored Oct 16, 2024
2 parents 781c6c9 + a106a76 commit 3031284
Showing 5 changed files with 35 additions and 17 deletions.
Binary file added docs/images/lerobot_csi_camera.png
Binary file added docs/images/lerobot_jetson_rig.png
Binary file added docs/images/lerobot_jetson_ssd.png
Binary file added docs/images/lerobot_visuzalize_dataset_html.png
52 changes: 35 additions & 17 deletions docs/lerobot.md
@@ -2,9 +2,7 @@

Let's run HuggingFace [`LeRobot`](https://github.com/huggingface/lerobot/) to train Transformer-based [action diffusion](https://diffusion-policy.cs.columbia.edu/) policies and [ACT](https://github.com/tonyzhaozh/act) onboard NVIDIA Jetson. These models learn to predict actions for a particular task from visual inputs and prior trajectories, typically collected during teleoperation or in simulation.

<video controls autoplay muted style="max-width: 640px">
<source src="https://github.com/user-attachments/assets/1ec6e4f0-0f85-4a8a-85c0-f70019f3405b" type="video/mp4">
</video>
![LeRobot teleoperation rig with Jetson](images/lerobot_jetson_rig.png)

!!! abstract "What you need"

@@ -44,6 +42,8 @@ This section walks you through the LeRobot official examples.

### a. Check `jetson-container`'s location

![Jetson with NVMe SSD](images/lerobot_jetson_ssd.png){: style="height:240px;" align=right}

Throughout all of the `lerobot` workflows we will be generating a lot of data, especially when capturing datasets.

We will clone the `lerobot` directory on the host and mount it in the container to keep all the data persistent. First, though, make sure your `jetson-containers` directory is placed on your SSD, not on your eMMC or microSD card.
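Before going further, it is worth verifying which storage device actually backs the directory. A minimal sketch (the `storage_type` helper is hypothetical, for illustration; device naming assumes a typical Jetson setup):

```bash
# Show which block device backs the current directory.
# On Jetson, NVMe SSDs enumerate as /dev/nvme*, while eMMC and
# microSD cards enumerate as /dev/mmcblk*.
df -h .

# Hypothetical helper that classifies a device path:
storage_type() {
  case "$1" in
    /dev/nvme*)   echo "ssd" ;;
    /dev/mmcblk*) echo "emmc-or-microsd" ;;
    *)            echo "unknown" ;;
  esac
}

storage_type "$(df --output=source . | tail -1)"
```

If this prints `emmc-or-microsd`, move (or re-clone) `jetson-containers` onto the SSD mount point before continuing.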
@@ -180,6 +180,8 @@ lrwxrwxrwx 1 root root 7 Sep 24 16:13 /dev/ttyACM_kochleader -> ttyACM1

### e. (Optional) CSI cameras

![CSI camera connected to Jetson](images/lerobot_csi_camera.png){: style="height:240px;" align=right}

If you plan to use CSI cameras (not USB webcams) for data capture, use the new `--csi2webcam` option of `jetson-containers`, which exposes V4L2loopback devices that behave like USB webcams (MJPEG) for CSI cameras, using Jetson's hardware JPEG encoder.

This feature requires some packages to be installed.
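After launching the container with `--csi2webcam`, a quick sanity check is to confirm the loopback nodes enumerate like regular webcams. A sketch (node numbering depends on which cameras are attached; `v4l2-ctl` assumes the `v4l-utils` package is present):

```bash
# List all V4L2 capture nodes; the csi2webcam loopback devices should
# appear alongside any USB webcams. Prints nothing if no nodes exist.
list_video_nodes() {
  ls /dev/video* 2>/dev/null || true
}
list_video_nodes

# Probe a node's pixel formats; the loopback nodes report MJPG since
# the CSI stream is hardware JPEG encoded:
# v4l2-ctl -d /dev/video0 --list-formats
```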
@@ -289,6 +291,25 @@ Follow the [official document's section](https://github.com/huggingface/lerobot/

Another thing worth experimenting with is the **wrist cam**. More to come later.

!!! tip

The following commands are registered in the Bash history inside the `lerobot` container.

```bash
wandb login
export HF_USER=
python lerobot/scripts/control_robot.py record \
--robot-path lerobot/configs/robot/koch.yaml \
--fps 30 \
--root data \
--repo-id ${HF_USER}/koch_test_$(date +%Y%m%d_%H%M%S) \
--tags tutorial \
--warmup-time-s 5 \
--episode-time-s 30 \
--reset-time-s 30 \
--num-episodes 10
```

!!! tip

If you plan to perform training on a different machine, `scp` the dataset directory.
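A hedged sketch of that copy, assuming the dataset sits under `data/` (per the `--root data` flag used above) and a placeholder `user@training-host` destination:

```bash
# Build the paths first so they can be checked before copying.
HF_USER=${HF_USER:-yourname}                # placeholder Hugging Face username
SRC="data/${HF_USER}"                       # --root data puts datasets here
DEST="user@training-host:~/lerobot/data/"   # placeholder destination

echo "would copy: $SRC -> $DEST"
# scp -r "$SRC" "$DEST"   # uncomment on a machine with SSH access to the trainer
```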
@@ -312,22 +333,17 @@ You should operate in the container's terminal.
Follow the [official document's section](https://github.com/huggingface/lerobot/blob/main/examples/7_get_started_with_real_robot.md#4-train-a-policy-on-your-data).

!!! tip

The following commands are registered in the Bash history inside the `lerobot` container.

```bash
wandb login
export HF_USER=
python lerobot/scripts/control_robot.py record \
--robot-path lerobot/configs/robot/koch.yaml \
--fps 30 \
--root data \
--repo-id ${HF_USER}/koch_test_$(date +%Y%m%d_%H%M%S) \
--tags tutorial \
--warmup-time-s 5 \
--episode-time-s 30 \
--reset-time-s 30 \
--num-episodes 10
DATA_DIR=data python lerobot/scripts/train.py \
dataset_repo_id=${HF_USER}/koch_test \
policy=act_koch_real \
env=koch_real \
hydra.run.dir=outputs/train/act_koch_test \
hydra.job.name=act_koch_test \
device=cuda \
wandb.enable=true
```
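Besides the wandb dashboard, the Hydra run directory set above (`hydra.run.dir=outputs/train/act_koch_test`) is where checkpoints and logs accumulate; the exact layout depends on the lerobot version, so treat this as a sketch:

```bash
RUN_DIR="outputs/train/act_koch_test"   # matches hydra.run.dir above
if [ -d "$RUN_DIR" ]; then
  # Show whatever the run has produced so far (checkpoints, logs, configs).
  ls -R "$RUN_DIR" | head -20
else
  echo "run dir not created yet: $RUN_DIR"
fi
```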

!!! tip
@@ -371,6 +387,8 @@ Follow the [official document's section](https://github.com/huggingface/lerobot/
--repo-id ${HF_USER}/eval_koch_test
```

![Dataset visualization in a web browser](images/lerobot_visuzalize_dataset_html.png)
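The HTML view above comes from LeRobot's dataset visualizer; the script name and flags below are assumptions based on the lerobot repository and may differ across versions:

```bash
HF_USER=${HF_USER:-yourname}   # placeholder Hugging Face username
# Assumed script/flags; check `ls lerobot/scripts/` in your checkout.
CMD="python lerobot/scripts/visualize_dataset_html.py --root data --repo-id ${HF_USER}/eval_koch_test"
echo "$CMD"
# eval "$CMD"   # run inside the lerobot container, then open the served URL
```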

If everything goes well, you should see

<video controls autoplay muted style="max-width: 960px">
