Reachy Mini

Reachy Mini FAQ

Your guide to getting started and beyond

🧭

1. Getting Started

1.1 General / SDK Basics

Product overview image / GIF
How do I connect to Reachy Mini from Python?
SDK PYTHON START

To control Reachy Mini, you mainly use the ReachyMini class from the reachy_mini package:

from reachy_mini import ReachyMini

mini = ReachyMini()

This connects to the Reachy Mini daemon (started via reachy-mini-daemon or reachy_mini.daemon.app.main) and initializes motors and sensors.

Recommended pattern: use a context manager to automatically handle connection and cleanup:

from reachy_mini import ReachyMini

with ReachyMini() as mini:
    # your code here
    ...

Source: API documentation.

Do I need to start the daemon manually?
SDK START

Yes. All examples assume you have already started the Reachy Mini daemon:

  • Either via command line: reachy-mini-daemon
  • Or via Python: reachy_mini.daemon.app.main

The ReachyMini instance will then connect to this daemon.

Source: API documentation.

1.2 Assembly

Assembly guide image / GIF
How long does assembly usually take?
ASSEMBLY HARDWARE

Most testers report 1.5–2 hours, with some taking up to 4 hours depending on experience.

Source: Feedback – "What's your background?" (hobbyist / developer / experienced technology professional).

What do testers consider an ideal assembly time?
ASSEMBLY

Most testers consider about 2 hours to be an ideal assembly time.

Source: Feedback – "What's your background?"

What were the trickiest parts of assembly?
ASSEMBLY HARDWARE

Testers highlighted:

  • Cable routing
  • Correct torque on screws

Source: Feedback – "What's your background?"

1.3 Dashboard & First Run

The dashboard at http://localhost:8000 doesn't work – what should I check?
DASHBOARD NETWORK SDK

Typical checks:

  1. You are using a proper Python virtual environment (.venv).
  2. You installed/updated the Reachy Mini SDK inside that environment:
    pip install -U reachy-mini
  3. The daemon is running.

This combination fixed most dashboard issues.

Source: Discord – author: matth_lap (trusted).

🧩

2. Using & Developing Applications

2.1 Moving the Robot

Robot movement demo GIF
How do I move Reachy Mini's head to a specific pose?
MOVEMENT HEAD SDK

Use goto_target with a pose created by create_head_pose:

from reachy_mini import ReachyMini
from reachy_mini.utils import create_head_pose

with ReachyMini() as mini:
    mini.goto_target(head=create_head_pose(y=-10, mm=True))

Here:

  • create_head_pose builds a 4x4 transform matrix (position + orientation).
  • mm=True means translation arguments are in millimeters.
  • The head frame is located at the base of the head.
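create_head_pose returns a 4x4 homogeneous transform. As a standalone illustration with plain numpy (not the SDK's source; the mm-to-meter conversion and ZYX rotation order are assumptions here), such a matrix can be built like this:

```python
import numpy as np

def head_pose_sketch(x=0.0, y=0.0, z=0.0, roll=0.0, pitch=0.0, yaw=0.0,
                     degrees=False, mm=False):
    """Build a 4x4 homogeneous transform from a translation and RPY angles."""
    if degrees:
        roll, pitch, yaw = np.deg2rad([roll, pitch, yaw])
    if mm:
        x, y, z = x / 1000.0, y / 1000.0, z / 1000.0  # assumed mm -> m
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])   # yaw
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])   # pitch
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])   # roll
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx   # rotation block (ZYX composition)
    T[:3, 3] = [x, y, z]       # translation column
    return T

pose = head_pose_sketch(y=-10, mm=True)
print(pose.shape)   # (4, 4)
print(pose[1, 3])   # -0.01
```

The real create_head_pose may use a different rotation convention; this sketch only shows the shape of the data that goto_target consumes.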

Source: API documentation.

How do I control head orientation (roll, pitch, yaw)?
MOVEMENT HEAD

You can add orientation arguments to create_head_pose, for example:

pose = create_head_pose(z=10, roll=15, degrees=True, mm=True)
mini.goto_target(head=pose, duration=2.0)

  • degrees=True ⇒ angles are given in degrees.
  • You can combine translation (x, y, z) and orientation (roll, pitch, yaw).

Source: API documentation.

How do I move head, body, and antennas at the same time?
MOVEMENT HEAD ANTENNAS BODY

Use goto_target with multiple named arguments:

import numpy as np
from reachy_mini import ReachyMini
from reachy_mini.utils import create_head_pose

with ReachyMini() as mini:
    mini.goto_target(
        head=create_head_pose(y=-10, mm=True),
        antennas=np.deg2rad([45, 45]),
        duration=2.0,
        body_yaw=np.deg2rad(30),
    )

  • antennas is a 2-element array in radians [right, left].
  • body_yaw controls body rotation.
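A quick offline check of the unit conversion above (pure numpy, no robot needed):

```python
import numpy as np

antennas = np.deg2rad([45, 45])    # [right, left], degrees -> radians
print(np.round(antennas, 4))       # [0.7854 0.7854]  (pi/4 each)
```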

Source: API documentation.

What's the difference between goto_target and set_target?
MOVEMENT CONTROL REALTIME

goto_target:

  • Interpolates motion over a duration (default 0.5s).
  • Supports different interpolation methods (linear, minjerk, ease, cartoon).
  • Ideal for smooth, timed motions.

set_target:

  • Sets the target immediately, without interpolation.
  • Suited for high-frequency control (e.g. sinusoidal trajectories, teleoperation).

Example (sinusoidal motion, called repeatedly inside a high-frequency loop where t is the elapsed time in seconds):

y = 10 * np.sin(2 * np.pi * 0.5 * t)  # 0.5 Hz sinusoid, amplitude 10 mm
mini.set_target(head=create_head_pose(y=y, mm=True))
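To inspect the y values such a loop would send, without a robot, here is a pure-numpy sketch:

```python
import numpy as np

t = np.arange(20) * 0.1                  # 2 s of timestamps at 10 Hz
y = 10 * np.sin(2 * np.pi * 0.5 * t)     # amplitude 10 mm, frequency 0.5 Hz
print(round(float(y[5]), 3))             # 10.0 (sine peak at t = 0.5 s)
```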

Source: API documentation.

How do I choose the interpolation method for movements?
MOVEMENT INTERPOLATION

Use the method argument in goto_target:

mini.goto_target(
    head=create_head_pose(y=10, mm=True),
    antennas=np.deg2rad([-45, -45]),
    duration=2.0,
    method="cartoon",  # "linear", "minjerk", "ease", or "cartoon"
)

For a visual comparison of methods, you can run the example:

  • examples/goto_interpolation_playground.py
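For background, the classic minimum-jerk position profile is a quintic in normalized time (a standard robotics formula; the SDK's "minjerk" implementation may differ in detail):

```python
def minjerk(tau):
    """Position fraction at normalized time tau in [0, 1] (quintic profile)."""
    return 10 * tau**3 - 15 * tau**4 + 6 * tau**5

print(minjerk(0.0))   # 0.0  (starts at rest)
print(minjerk(0.5))   # 0.5  (symmetric midpoint)
print(minjerk(1.0))   # 1.0  (ends at rest)
```

The slow start and stop of this curve is what makes minjerk motions look smoother than linear interpolation.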

Source: API documentation.

2.2 Writing & Sharing Apps

How do I write a Reachy Mini app?
APPS SDK PROGRAMMING

Inherit from ReachyMiniApp and implement run:

import threading
from reachy_mini.apps.app import ReachyMiniApp
from reachy_mini import ReachyMini

class MyApp(ReachyMiniApp):
    def run(self, reachy_mini: ReachyMini, stop_event: threading.Event):
        # your app logic
        ...

  • stop_event tells you when the app should stop.
  • This pattern is used for standalone apps and for HF Spaces.

Source: API documentation.

How can I generate a new app template quickly?
APPS TOOLING

Use the app template generator:

reachy-mini-make-app my_app_name

This creates:

my_app_name/
├── pyproject.toml
├── README.md
└── my_app_name/
    ├── __init__.py
    └── main.py

You can run it via:

python my_app_name/main.py

Or install it as a package:

pip install -e my_app_name/

Source: API documentation.

Is there a recommended way to share apps with the community?
APPS COMMUNITY CONTRIBUTING

Yes, via Hugging Face Spaces:

  • Example hub: Reachy Mini Apps
  • Official example space: reachy_mini_app_example

Source: API documentation + Discord – author: annecharlotte_pollen (trusted).

2.3 Playing Recorded Moves

Dance / emotion demo GIF
How do I play predefined moves?
MOVEMENT DATASET PLAYBACK

Use RecordedMoves and play_move:

from reachy_mini import ReachyMini
from reachy_mini.motion.recorded_move import RecordedMoves

with ReachyMini() as mini:
    recorded_moves = RecordedMoves("pollen-robotics/reachy-mini-dances-library")
    for move_name in recorded_moves.list_moves():
        mini.play_move(recorded_moves.get(move_name), initial_goto_duration=1.0)

  • initial_goto_duration smoothly brings the robot to the starting pose of the move.
  • Datasets are hosted on Hugging Face (e.g. emotions / dances libraries).

Source: API documentation.

How do I record my own moves for later replay?
MOVEMENT RECORDING TELEOP

Call start_recording() and stop_recording(), which record all set_target calls:

with ReachyMini() as mini:
    mini.start_recording()
    # run your teleop / control code...
    recorded_motion = mini.stop_recording()

You can then unpack time, head, antennas, body_yaw from each frame and save them.
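As a hedged sketch of the saving step, assuming each frame unpacks to (time, head, antennas, body_yaw) as described above (the exact structure returned by stop_recording() may differ):

```python
import json
import numpy as np

# Stand-in for the frames returned by stop_recording():
frames = [(0.0, np.eye(4), np.array([0.1, -0.1]), 0.0)]

serialized = json.dumps([
    {"time": t,
     "head": head.tolist(),          # 4x4 pose as nested lists
     "antennas": antennas.tolist(),  # [right, left] in radians
     "body_yaw": float(body_yaw)}
    for t, head, antennas, body_yaw in frames
])
print(len(json.loads(serialized)))   # 1
```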

Tools to record/upload datasets are available in:

  • reachy_mini_toolbox/tools/moves

Source: API documentation.

🎛

3. Hardware Guide & Motion Limits

Motion limits diagram
What are the safety limits of the head and body?
MOTORS LIMITS SAFETY

Physical & software limits include:

  1. Motors have limited range of motion.
  2. Head can collide with the body.
  3. Body yaw: [-180°, 180°].
  4. Head pitch/roll: [-40°, 40°].
  5. Head yaw: [-180°, 180°].
  6. Difference (body_yaw - head_yaw): [-65°, 65°].

If you command a pose outside these limits, Reachy Mini will automatically clamp to the nearest safe pose (no exception is raised).
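The clamping itself happens inside the SDK/daemon. As a plain illustration of the behavior (not the SDK code), using the documented head roll range:

```python
def clamp(value, low=-40.0, high=40.0):
    """Clamp a commanded angle (degrees) into the allowed [low, high] range."""
    return max(low, min(high, value))

# A -50 deg roll command exceeds the limit and is pulled back to it:
print(clamp(-50.0))   # -40.0
# Commands inside the range pass through unchanged:
print(clamp(25.0))    # 25.0
```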

Source: API documentation.

What happens if I ask for an impossible pose (beyond limits)?
MOTORS LIMITS

Example:

reachy.goto_target(head=create_head_pose(roll=-50, degrees=True))

  • This exceeds the roll limit (±40°).
  • The robot will not throw an error but will move to the closest safe pose.

You can inspect the actual pose using:

head_pose = reachy.get_current_head_pose()

Source: API documentation.

Are power supply specs documented in the API docs?
POWER HARDWARE

No, the API documentation does not specify the exact power supply specs (voltage, current). This was also requested in the Discord chat but not answered there.

Source: [NO DIRECT DATA in API docs] + Discord question with no official answer.

🧠

4. Sensors & Media

4.1 Camera

How do I grab camera frames from Reachy Mini?
VISION CAMERA MEDIA

Use the media object:

from reachy_mini import ReachyMini

with ReachyMini() as mini:
    frame = mini.media.get_frame()
    # frame is a numpy array as returned by OpenCV

Source: API documentation.

4.2 Microphone

How do I access microphone audio samples?
AUDIO MEDIA

Use the media object:

from reachy_mini import ReachyMini

with ReachyMini() as mini:
    sample = mini.media.get_audio_sample()
    # sample is a numpy array as output by sounddevice

Source: API documentation.

4.3 Speaker

How do I send audio to the speaker?
AUDIO MEDIA

Use the media object:

from reachy_mini import ReachyMini

with ReachyMini() as mini:
    mini.media.push_audio_sample(chunk)  # chunk = numpy array of audio samples
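push_audio_sample expects a numpy array of samples. A hedged way to generate a test chunk offline (the 16 kHz sample rate and float32 format are assumptions; match them to your device settings):

```python
import numpy as np

sample_rate = 16000                           # assumed; check your device
n_samples = 8000                              # 0.5 s at 16 kHz
t = np.arange(n_samples) / sample_rate        # timestamps in seconds
chunk = 0.2 * np.sin(2 * np.pi * 440 * t)     # quiet 440 Hz test tone
chunk = chunk.astype(np.float32)              # sounddevice-style float32
print(chunk.shape)   # (8000,)
```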

Source: API documentation.

4.4 GStreamer Backend

How can I use the GStreamer backend instead of the default OpenCV/sounddevice?
AUDIO VIDEO GSTREAMER ADVANCED

Install with the GStreamer extra:

pip install -e ".[gstreamer]"

Then run examples with:

  • --backend gstreamer

You need GStreamer binaries properly installed on your system. You can also define custom pipelines (see camera_gstreamer.py in the repo).

Source: API documentation.

👀

5. Interaction Features

Look-at feature demo GIF
How can I make Reachy Mini look at a point in the image?
VISION LOOK_AT_IMAGE

Use the look_at_image method (see example look_at_image.py):

  • Provide a 2D point in image coordinates (0,0 = top-left).
  • You can specify a duration, similar to goto_target.

Source: API documentation.

How can I make Reachy Mini look at a 3D point in the world?
VISION LOOK_AT_WORLD

Use look_at_world, which accepts 3D coordinates in the robot world frame (see world frame illustration in docs).

Source: API documentation.

🦴

6. Motors, Compliancy & Manual Manipulation

How do I enable, disable, or make motors compliant?
MOTORS COMPLIANCY SAFETY

Three main methods:

  1. enable_motors
    • Powers motors ON.
    • Robot holds position, cannot be moved by hand.
  2. disable_motors
    • Powers motors OFF.
    • Robot is limp; you can move it freely by hand.
  3. make_motors_compliant
    • Motors ON but compliant (soft).
    • Good for demonstrations or teaching-by-demonstration.
    • Used in the "gravity compensation" example.

Source: API documentation.

🧪

7. Software Setup & Environment

Why do examples assume a virtual environment?
SDK PYTHON ENVIRONMENT

While not strictly required, using a virtual environment:

  • Avoids dependency conflicts.
  • Makes upgrading the SDK (pip install -U reachy-mini) safer.

Beta testers who worked inside a .venv also reported fewer installation issues.

Source: API documentation (implicit) + Discord – author: matth_lap.

🎨

8. Customization & CAD

Are CAD files mentioned in the API documentation?
CAD 3D_PRINTING

The API documentation doesn't mention CAD files directly. Some testers requested them; no official link was provided in the logs.

Source: [NO DIRECT DATA in API docs] + Feedback / Discord.

🤝

9. Contributing & Community

How can I share my app with other users?
APPS COMMUNITY CONTRIBUTING

Recommended workflow:

  • Wrap your behavior in a ReachyMiniApp.
  • Publish as a HF Space or Python package.
  • Look at the official example space reachy_mini_app_example.

Source: API documentation + Discord – author: annecharlotte_pollen.

How can I contribute to datasets of moves (dances/emotions)?
DATASET CONTRIBUTING MOVEMENT

Use the tools in:

  • reachy_mini_toolbox/tools/moves

They help you:

  • Record moves (via start_recording / stop_recording)
  • Upload them to the Hugging Face Hub for reuse by others.

Source: API documentation.