1 change: 1 addition & 0 deletions docs/extensions/index.md
@@ -8,6 +8,7 @@ rcs_fr3
rcs_panda
rcs_xarm7
rcs_so101
rcs_ur5e
rcs_realsense
rcs_usb_cam
rcs_tacto
49 changes: 49 additions & 0 deletions docs/extensions/rcs_ur5e.md
@@ -0,0 +1,49 @@
# UR5e Extension

The `rcs_ur5e` extension adds UR5e hardware support and an environment creator for Robot Control Stack.

## Safety Notices

- This controller does not implement compliant force control.
- Start slowly on the teach pendant and verify the home pose before running motions.
- Treat the extension as a low-level controller, not a safety layer.

## Installation

```shell
pip install -ve extensions/rcs_ur5e
```

## Hardware Usage

For direct hardware access, create a `UR5eConfig`, then build the robot with a kinematics backend.

```python
import rcs
from rcs_ur5e.hw import UR5e, UR5eConfig

cfg = UR5eConfig(ip="192.168.25.201")
ik = rcs.common.Pin(
    cfg.kinematic_model_path,
    cfg.attachment_site,
    urdf=cfg.kinematic_model_path.endswith(".urdf"),
)
robot = UR5e(cfg, ik)
```

For a Gymnasium-style hardware environment, use `RCSUR5eEnvCreator` from `rcs_ur5e.creators`.

## Simulation Usage

The simulation examples show the current setup pattern for UR5e in MuJoCo:

- `examples/ur5e/ur5e_env_joint_control.py`
- `examples/ur5e/ur5e_env_cartesian_control.py`

Unlike the default FR3 simulation config, the UR5e examples set robot joints, actuators, attachment site, and kinematic model explicitly before creating the environment.

## Notes

- Hardware support is implemented in `extensions/rcs_ur5e/src/rcs_ur5e/hw.py`.
- The environment creator lives in `extensions/rcs_ur5e/src/rcs_ur5e/creators.py`.
- Use the example scripts as the source of truth if this page ever drifts.
8 changes: 8 additions & 0 deletions docs/getting_started/index.md
@@ -14,6 +14,14 @@ We build and test RCS on the latest Debian and on the latest Ubuntu LTS.

2. Create and activate Python virtual environment or conda environment:

RCS requires **Python >= 3.10**. We strongly recommend **Python 3.11** for full compatibility with all extensions (for example `rcs_realsense`).

```{note}
- **Python 3.11** is preferred for development.
- **Python > 3.11**: `rcs_realsense` may have compatibility issues due to `pyrealsense2` limitations.
- **Python > 3.12**: The `ompl` dependency is currently unavailable on PyPI.
```

```shell
conda create -n rcs python=3.11
conda activate rcs
167 changes: 167 additions & 0 deletions docs/user_guide/conventions.md
@@ -0,0 +1,167 @@
# Coordinate and Pose Conventions

This page collects the main 3D conventions used across RCS. It is meant as a quick reference when working with kinematics, environments, simulation objects, and teleoperation.

## Frames

### World frame

In simulation, scenes and objects live in a global **world frame**.

The core robot API exposes explicit conversions between world and robot coordinates:

- `Robot.get_base_pose_in_world_coordinates()`
- `Robot.to_pose_in_world_coordinates(...)`
- `Robot.to_pose_in_robot_coordinates(...)`
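To illustrate what these conversions do, here is a minimal numpy sketch using 4×4 homogeneous transforms. The names `T_world_base` and `pose_in_robot`, and all numeric values, are hypothetical and not part of the RCS API.

```python
import numpy as np

def make_tf(R, t):
    """Build a 4x4 homogeneous transform from a rotation matrix and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical base pose: robot base 1 m along world x, rotated 90 deg about z.
Rz = np.array([[0.0, -1.0, 0.0],
               [1.0,  0.0, 0.0],
               [0.0,  0.0, 1.0]])
T_world_base = make_tf(Rz, np.array([1.0, 0.0, 0.0]))

# A pose expressed in robot (base) coordinates ...
pose_in_robot = make_tf(np.eye(3), np.array([0.5, 0.0, 0.2]))

# ... converted to world coordinates, and back again.
pose_in_world = T_world_base @ pose_in_robot
pose_roundtrip = np.linalg.inv(T_world_base) @ pose_in_world
```

The two RCS methods above play the role of these two matrix products: one maps robot coordinates into the world frame, the other applies the inverse base transform.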

### Robot base frame

Kinematics and low-level robot poses are expressed in the robot's **base frame** (also called robot coordinates).

This is the frame assumed by the kinematics backends, for example in `extensions/rcs_robotics_library/src/pybind/RL.h`:

- `inverse(...)`: `pose is assumed to be in the robots coordinate frame`
- `forward(...)`: `pose is assumed to be in the robots coordinate frame`

The Franka hardware code uses the same convention and explicitly refers to the end-effector pose in the **base frame**.

### End-effector frame: `attachment_site`

Each robot config defines an `attachment_site`. This is the end-effector frame used by the kinematics stack.

Common examples in the repository are:

- `attachment_site_0` for FR3 / Panda
- `attachment_site` for UR5e / XArm7
- `gripper` in the SO101 examples

If you are unsure which frame a robot uses, check its config or the relevant example script.

### Tool frame: `tcp_offset`

`tcp_offset` is applied on top of the attachment site to define the actual tool center point (TCP) used by motion commands and IK.

In other words:

- `attachment_site` = default end-effector frame from the model
- `tcp_offset` = additional transform from that frame to the tool you want to control
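In transform terms, the TCP pose is the attachment-site pose composed with the offset expressed in the site's local frame. A minimal numpy sketch with hypothetical values (this is not the RCS `Pose` API):

```python
import numpy as np

# Pose of the attachment site in the base frame (identity rotation, made-up translation).
T_base_site = np.eye(4)
T_base_site[:3, 3] = [0.4, 0.0, 0.3]

# tcp_offset: e.g. a tool tip 10 cm along the site's local z axis.
T_site_tcp = np.eye(4)
T_site_tcp[:3, 3] = [0.0, 0.0, 0.1]

# TCP pose in the base frame: right-multiply the site pose by the local offset.
T_base_tcp = T_base_site @ T_site_tcp
```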

## Pose representations

RCS uses several pose encodings. The important ones are:

### `Pose`

`rcs.common.Pose` is the main transform type. It supports construction from:

- translation only
- quaternion + translation
- `RPY` + translation
- rotation matrix + translation

### Quaternion order

Within RCS, quaternions are stored in **xyzw** order.

This is visible in two places:

- `python/tests/test_common.py` checks that the identity quaternion is `[0, 0, 0, 1]`
- `python/rcs/envs/sim.py` comments `rotation_q()` as `# xyzw format`

So the convention is:

```text
[qx, qy, qz, qw]
```

### `tquat`

`tquat` means translation plus quaternion and is used by the environment API.

The value layout is:

```text
[x, y, z, qx, qy, qz, qw]
```

This follows directly from `python/rcs/envs/base.py`, where `tquat` is built as:

```python
np.concatenate([
    pose.translation(),
    pose.rotation_q(),
])
```

### `xyzrpy`

`xyzrpy` is the translation plus roll-pitch-yaw representation used by the environment API.

The value layout is:

```text
[x, y, z, roll, pitch, yaw]
```

The `RPY` type in `python/rcs/_core/common.pyi` exposes the fields in exactly that order:

- `roll`
- `pitch`
- `yaw`

RCS examples and environment limits use `np.deg2rad(...)`, so angles are expected in **radians** unless explicitly documented otherwise.
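A short sketch of that pattern, with made-up limit values: write angles in degrees for readability, then convert once before handing them to the API.

```python
import numpy as np

# Limits written in degrees for readability, converted to radians for the API.
rpy_limit_deg = np.array([180.0, 90.0, 180.0])
rpy_limit_rad = np.deg2rad(rpy_limit_deg)

# A full xyzrpy vector: translation in metres, rotation in radians.
xyzrpy = np.concatenate([[0.4, 0.0, 0.3], np.deg2rad([0.0, 180.0, 0.0])])
```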

### Rotation vector / `rotvec`

Some hardware integrations, notably UR, also use a 6D rotation-vector pose:

```text
[x, y, z, rx, ry, rz]
```

You can see this in `extensions/rcs_ur5e/src/rcs_ur5e/hw.py`, where `common.RotVec(...).as_quaternion_vector()` is converted into an RCS `Pose`, and `Pose.rotvec()` is sent back to the robot.
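The conversion behind `as_quaternion_vector()` is the standard axis-angle formula. A pure-numpy sketch (not the RCS `RotVec` implementation) that maps a rotation vector to an xyzw quaternion:

```python
import numpy as np

def rotvec_to_quat_xyzw(rv):
    """Convert a rotation vector (axis * angle, radians) to an xyzw quaternion."""
    angle = np.linalg.norm(rv)
    if angle < 1e-12:
        return np.array([0.0, 0.0, 0.0, 1.0])  # identity rotation
    axis = rv / angle
    return np.concatenate([axis * np.sin(angle / 2), [np.cos(angle / 2)]])

# 90 degrees about z:
q = rotvec_to_quat_xyzw(np.array([0.0, 0.0, np.pi / 2]))
```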

## MuJoCo caveat: free-joint quaternions use `wxyz`

A common source of confusion is that **RCS uses `xyzw`**, but MuJoCo free-joint `qpos` stores the quaternion as **`wxyz`**.

RCS already handles this conversion where needed. For example, `python/rcs/envs/sim.py` explicitly reorders the quaternion when writing directly into MuJoCo joint state:

```python
quat = self.init_object_pose.rotation_q() # xyzw format
...
[self.x, self.y, self.z, quat[3], quat[0], quat[1], quat[2]]
```

So:

- **RCS `Pose` / env API**: `xyzw`
- **MuJoCo free-joint `qpos`**: `wxyz`

If you manipulate MuJoCo state directly, convert between the two explicitly.
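A pair of helpers for that explicit conversion can be as simple as a roll of the quaternion array; these functions are illustrative, not RCS API:

```python
import numpy as np

def xyzw_to_wxyz(q):
    """RCS order [qx, qy, qz, qw] -> MuJoCo free-joint order [qw, qx, qy, qz]."""
    return np.roll(q, 1)

def wxyz_to_xyzw(q):
    """MuJoCo free-joint order [qw, qx, qy, qz] -> RCS order [qx, qy, qz, qw]."""
    return np.roll(q, -1)

q_rcs = np.array([0.0, 0.0, 0.0, 1.0])  # identity quaternion in xyzw
q_mj = xyzw_to_wxyz(q_rcs)              # identity quaternion in wxyz
```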

## Axis convention

RCS generally works with robot-local base coordinates, and the standard Franka teleoperation setup documents the expected axis orientation as:

```text
x = front
y = left
z = up
```

This comes from `examples/teleop/README.md`, which also notes that the Quest alignment step should match the robot's base frame.

If you are integrating a new robot, make sure its model, `attachment_site`, and teleop calibration all agree on the same base-frame orientation.

## Practical checklist

When something looks wrong, check these first:

1. Are you working in **world frame** or **robot/base frame**?
2. Is your end-effector frame the correct `attachment_site`?
3. Did you apply the right `tcp_offset`?
4. Are your quaternions **xyzw** in RCS?
5. Are you accidentally feeding **MuJoCo `wxyz`** into an RCS API?
6. Are your RPY values in **radians**?
7. Does your teleop or calibration frame match the robot base axes?
67 changes: 67 additions & 0 deletions docs/user_guide/data_collection.md
@@ -0,0 +1,67 @@
# Data Collection

RCS provides tools for efficiently collecting and managing robot interaction data, primarily for Imitation Learning and Reinforcement Learning.

## StorageWrapper

The `StorageWrapper` is the primary tool for recording environment transitions. It is designed to be crash-safe and efficient.

### Key Features

- **Asynchronous Writing**: Data is written to disk in a background thread to minimize impact on the control loop.
- **Crash-Safe**: Data is flushed in atomic batches, ensuring that most data is preserved even if the process crashes.
- **Parquet Format**: Uses the Apache Parquet format (via `pyarrow`) for efficient storage and fast reading.
- **Automatic Consolidation**: Small batch files are automatically merged into larger optimized files when the environment is closed.
- **Image Compression**: RGB frames are automatically encoded as JPEGs to save space.

### Usage

```python
from rcs.envs.storage_wrapper import StorageWrapper

# Wrap your environment
env = StorageWrapper(
    env,
    base_dir="data/my_experiment",
    instruction="pick up the red cube",
    always_record=False,  # only record when start_record() is called
)

# Control recording
env.start_record()
# ... perform tasks ...
env.stop_record()

# Close to ensure all data is flushed and consolidated
env.close()
```

## Dataset Structure

The data is organized in a directory structure partitioned by date:

```text
base_dir/
  date=2024-05-20/
    part-0-a1b2c3d4.parquet
    part-1-e5f6g7h8.parquet
  date=2024-05-21/
    ...
```

Each Parquet file contains:
- `obs`: The environment observation (flattened).
- `action`: The action taken.
- `reward`: The reward received.
- `success`: Boolean indicating task success.
- `instruction`: The text instruction for the task.
- `timestamp`: Unix timestamp of the transition.
- `uuid`: A unique identifier for the episode.

## Consolidating Data

If a script exits unexpectedly and consolidation doesn't run, you can manually consolidate the fragmented files using the `StorageWrapper.consolidate` static method or the RCS CLI:

```shell
python -m rcs consolidate data/my_experiment
```
4 changes: 4 additions & 0 deletions docs/user_guide/index.md
@@ -8,4 +8,8 @@ The User Guide provides in-depth information about the core concepts and components
architecture
gym_interface
low_level_api
conventions
teleoperation
data_collection
remote_inference
```
54 changes: 54 additions & 0 deletions docs/user_guide/remote_inference.md
@@ -0,0 +1,54 @@
# Remote Inference

RCS includes a lightweight RPC (Remote Procedure Call) layer based on [RPyC](https://rpyc.readthedocs.io/). This allows you to run an RCS environment on one machine (e.g., a machine connected to robot hardware) and control it or run inference from another machine (e.g., a powerful GPU server).

## Overview

The RPC system consists of two main components:
- **`RcsServer`**: Wraps an existing RCS environment and exposes it over the network.
- **`RcsClient`**: A Gymnasium-compatible environment that forwards all calls to a remote `RcsServer`.

## Usage

### Starting the Server

The server machine should be the one physically connected to the robot or running the simulation.

```python
from rcs.envs.configs import EmptyWorldFR3
from rcs.rpc.server import RcsServer

# Create your environment
scene = EmptyWorldFR3()
env = scene.create_env(scene.config())

# Start the RPC server
server = RcsServer(env, port=50051)
server.start()
```

### Connecting with the Client

The client machine runs your control logic or neural network inference.

```python
from rcs.rpc.client import RcsClient

# Connect to the remote server
client = RcsClient(host="robot-machine-ip", port=50051)

# Use it like a local Gymnasium environment
obs, info = client.reset()
action = model.predict(obs)
obs, reward, terminated, truncated, info = client.step(action)
```

## Advantages

- **Hardware Isolation**: Keep your expensive GPU server away from the robot workspace.
- **Resource Management**: Run heavy inference on a dedicated machine while the robot control machine handles low-level loops.
- **Flexibility**: Easily switch between local and remote environments by just changing the environment initialization.

## Examples

See `examples/rpc_server_client/` for a complete working example of a server and client setup.