
openclaw-robotics skill

/skills/qinrui-dm/openclaw-robotics

This skill lets you control Unitree mobile robots via instant messaging, enabling precise navigation, movement commands, and sensor integration across IM platforms.

npx playbooks add skill openclaw/skills --skill openclaw-robotics


Files (27)
SKILL.md
1.2 KB
---
name: unitree-robot
description: "Control mobile robots (quadruped, bipedal, wheeled, aerial) via IM platforms. Supports Unitree robots and Insight9 AI stereo camera."
metadata: {
  "openclaw": {
    "emoji": "🤖",
    "requires": {
      "python": ">=3.8",
      "pip": ["numpy"]
    }
  }
}
---

# Unitree Robot Controller Skill

Control various mobile robots through instant messaging platforms.

## Supported Robots

| Code | Model | Type |
|------|-------|------|
| `unitree_go1` | Unitree GO1 | Quadruped |
| `unitree_go2` | Unitree GO2 | Quadruped |
| `unitree_g1` | Unitree G1 | Bipedal/Humanoid |
| `unitree_h1` | Unitree H1 | Bipedal/Humanoid |

## Coming Soon

| Code | Type |
|------|------|
| `wheeled_*` | Wheeled robots |
| `drone_*` | Aerial robots |
| `surface_*` | Surface vehicles |

## Supported Sensors

| Code | Sensor |
|------|--------|
| `insight9` | Looper Robotics AI Stereo Camera (RGB-D) |

## Navigation

- **TinyNav** integration for path planning and obstacle avoidance (coming soon)

## Usage

```python
from unitree_robot_skill import initialize, execute

# Connect to a Unitree GO2 and bridge commands over WeCom.
initialize(robot="unitree_go2", im="wecom")

# Motion primitives are short textual commands.
execute("forward 1m")    # walk forward one meter
execute("turn left 45")  # rotate 45 degrees to the left
```

Overview

This skill provides remote control of mobile robots (quadruped, bipedal/humanoid, and future wheeled/aerial vehicles) via instant messaging platforms. It integrates with Unitree robot models and the Insight9 AI stereo camera to send movement commands, receive status, and stream sensor data. The implementation is Python-based and designed for quick command-and-control workflows over IM. It is useful for testing, demonstrations, and lightweight teleoperation.

How this skill works

The skill initializes a connection to a specified Unitree robot model and an IM platform, then accepts short textual commands (for example: "forward 1m", "turn left 45") to drive motion primitives. It can pull RGB-D frames from the Insight9 stereo camera and expose basic sensor readings and status through the IM channel. Path planning and obstacle avoidance are handled via TinyNav integration when enabled; otherwise the skill executes direct motion primitives with safety limits.
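The command grammar is not specified beyond the two examples above. A minimal sketch of how such commands might be parsed into motion primitives with safety limits follows; the function name, limit values, and primitive dict shape are all assumptions for illustration, not part of the skill's API:

```python
import re

# Hypothetical safety limits; the skill's actual limits are not documented.
MAX_DISTANCE_M = 2.0
MAX_TURN_DEG = 90.0

def parse_command(text: str) -> dict:
    """Parse a short textual command like 'forward 1m' or 'turn left 45'
    into a motion-primitive dict, clamping values to the safety limits."""
    text = text.strip().lower()
    m = re.fullmatch(r"(forward|backward)\s+([\d.]+)\s*m", text)
    if m:
        dist = min(float(m.group(2)), MAX_DISTANCE_M)
        return {"primitive": "move", "direction": m.group(1), "distance_m": dist}
    m = re.fullmatch(r"turn\s+(left|right)\s+([\d.]+)", text)
    if m:
        deg = min(float(m.group(2)), MAX_TURN_DEG)
        return {"primitive": "turn", "direction": m.group(1), "degrees": deg}
    raise ValueError(f"unrecognized command: {text!r}")

print(parse_command("forward 1m"))
print(parse_command("turn left 45"))
```

Clamping rather than rejecting out-of-range values keeps a demo moving, but a stricter controller might refuse the command instead.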

When to use it

  • Teleoperate a Unitree quadruped or humanoid robot from a remote IM client.
  • Run quick demonstrations or manual tests without a full GUI or SDK.
  • Stream basic stereo camera frames and sensor status through IM for remote inspection.
  • Prototype IM-driven command workflows before deploying more complex autonomy.
  • Archive and replay simple command sequences for repeatable demos.

Best practices

  • Always verify robot firmware compatibility and safe operation modes before issuing motion commands.
  • Start with short, low-speed commands when testing new hardware or environments.
  • Calibrate and validate the Insight9 camera depth data in your environment prior to relying on it for navigation.
  • Use TinyNav or another planner for cluttered environments instead of direct manual commands.
  • Log IM sessions and command responses to support debugging and replay.
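Logging and replaying IM sessions (the last bullet) can be as simple as a timestamped JSON-lines file. A sketch, assuming hypothetical `log_command` and `replay` helpers that are not part of the skill itself:

```python
import json
import time
from pathlib import Path

LOG_PATH = Path("im_session.jsonl")
LOG_PATH.unlink(missing_ok=True)  # start a fresh session log

def log_command(command: str, response: str) -> None:
    """Append one command/response pair as a JSON line."""
    entry = {"t": time.time(), "command": command, "response": response}
    with LOG_PATH.open("a") as f:
        f.write(json.dumps(entry) + "\n")

def replay(path: Path = LOG_PATH) -> list:
    """Return the logged commands in order, for re-execution in a demo."""
    with path.open() as f:
        return [json.loads(line)["command"] for line in f]

log_command("forward 1m", "ok")
log_command("turn left 45", "ok")
print(replay())
```

One JSON object per line keeps the log append-only and easy to tail during a live session.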

Example use cases

  • Send a sequence of move and turn commands from a team chat to demonstrate balance and gait.
  • Request a snapshot from the Insight9 stereo camera via IM for remote visual inspection.
  • Run a short path using TinyNav integration to navigate around simple obstacles (when enabled).
  • Archive command sequences for regression testing of locomotion across firmware versions.
  • Remote-start a supervised demo at an event using a lightweight IM control interface.

FAQ

Which Unitree models are supported?

Current support includes Unitree GO1, GO2 (quadrupeds), G1 and H1 (bipedal/humanoid). Wheeled and aerial models are listed as coming soon.

Can I use this skill for autonomous navigation?

Basic command execution is supported immediately. Autonomous navigation relies on the TinyNav integration, which is listed as coming soon; once available, enable it for path planning and obstacle avoidance.

How do I access camera data?

The skill can stream RGB-D frames from the Insight9 AI stereo camera to the IM channel or provide snapshots on request.
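The transport format for snapshots is not documented. One plausible sketch is base64-encoding a frame into an IM-friendly JSON payload; the `frame_to_payload` helper and its field names are assumptions, not the skill's API:

```python
import base64
import json

def frame_to_payload(rgb: bytes, depth: bytes, width: int, height: int) -> str:
    """Package raw RGB and depth buffers as a JSON string that a text-based
    IM channel can carry; real buffers would come from the Insight9 camera."""
    return json.dumps({
        "sensor": "insight9",
        "width": width,
        "height": height,
        "rgb_b64": base64.b64encode(rgb).decode("ascii"),
        "depth_b64": base64.b64encode(depth).decode("ascii"),
    })

# Tiny fake 2x1 frame standing in for real camera buffers.
payload = frame_to_payload(b"\x00\xff\x00" * 2, b"\x10\x20", width=2, height=1)
print(payload)
```

Base64 inflates the data by about a third, so production streaming would more likely use the IM platform's native image/file upload rather than inline text.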