1. Getting Started
1.1 General / SDK Basics
How do I connect to Reachy Mini from Python?
To control Reachy Mini, you mainly use the ReachyMini class from the reachy_mini package:
from reachy_mini import ReachyMini
mini = ReachyMini()
This connects to the Reachy Mini daemon (started via reachy-mini-daemon or reachy_mini.daemon.app.main) and initializes motors and sensors.
Recommended pattern: use a context manager to automatically handle connection and cleanup:
from reachy_mini import ReachyMini
with ReachyMini() as mini:
# your code here
...
Source: API documentation.
Do I need to start the daemon manually?
Yes. All examples assume you have already started the Reachy Mini daemon:
- Either via the command line: reachy-mini-daemon
- Or via Python: reachy_mini.daemon.app.main
The ReachyMini instance will then connect to this daemon.
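For example, a minimal sketch of starting the daemon from Python (assuming main() is the entry point, per the module path above):
from reachy_mini.daemon.app import main

main()  # starts the Reachy Mini daemon (assumed entry point)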
Source: API documentation.
1.2 Assembly
How long does assembly usually take?
Most testers report 1.5–2 hours, with some up to 4 hours depending on experience.
Source: Feedback – "What's your background?" = hobbyist / developer / experienced technology professional.
What do testers consider an ideal assembly time?
Most testers consider about 2 hours to be an ideal assembly time.
Source: Feedback – "What's your background?"
What were the trickiest parts of assembly?
Testers highlighted:
- Cable routing
- Correct torque on screws
Source: Feedback – "What's your background?"
1.3 Dashboard & First Run
The dashboard at http://localhost:8000 doesn't work – what should I check?
Typical checks:
- You are using a proper Python virtual environment (.venv).
- You installed/updated the Reachy Mini SDK inside that environment: pip install -U reachy-mini
- The daemon is running.
This combination fixed most dashboard issues.
Source: Discord – author: matth_lap (trusted).
2. Using & Developing Applications
2.1 Moving the Robot
How do I move Reachy Mini's head to a specific pose?
Use goto_target with a pose created by create_head_pose:
from reachy_mini import ReachyMini
from reachy_mini.utils import create_head_pose
with ReachyMini() as mini:
mini.goto_target(head=create_head_pose(y=-10, mm=True))
Here:
- create_head_pose builds a 4x4 transform matrix (position + orientation).
- mm=True means translation arguments are in millimeters.
- The head frame is located at the base of the head.
Source: API documentation.
How do I control head orientation (roll, pitch, yaw)?
You can add orientation arguments to create_head_pose, for example:
pose = create_head_pose(z=10, roll=15, degrees=True, mm=True)
mini.goto_target(head=pose, duration=2.0)
- degrees=True means angles are given in degrees.
- You can combine translation (x, y, z) and orientation (roll, pitch, yaw).
Source: API documentation.
How do I move head, body, and antennas at the same time?
Use goto_target with multiple named arguments:
import numpy as np
from reachy_mini import ReachyMini
from reachy_mini.utils import create_head_pose
with ReachyMini() as mini:
mini.goto_target(
head=create_head_pose(y=-10, mm=True),
antennas=np.deg2rad([45, 45]),
duration=2.0,
body_yaw=np.deg2rad(30),
)
- antennas is a 2-element array in radians [right, left].
- body_yaw controls body rotation.
Source: API documentation.
What's the difference between goto_target and set_target?
goto_target:
- Interpolates motion over a duration (default 0.5s).
- Supports different interpolation methods (linear, minjerk, ease, cartoon).
- Ideal for smooth, timed motions.
set_target:
- Sets the target immediately, without interpolation.
- Suited for high-frequency control (e.g. sinusoidal trajectories, teleoperation).
Example (sinusoidal motion, where t is the elapsed time in seconds):
y = 10 * np.sin(2 * np.pi * 0.5 * t)  # 0.5 Hz oscillation, ±10 mm amplitude
mini.set_target(head=create_head_pose(y=y, mm=True))
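For reference, a self-contained sketch of such a high-frequency loop (the 5-second run time and ~100 Hz update rate are illustrative choices, not API requirements):
import time
import numpy as np
from reachy_mini import ReachyMini
from reachy_mini.utils import create_head_pose

with ReachyMini() as mini:
    start = time.time()
    while (t := time.time() - start) < 5.0:  # run for 5 seconds
        y = 10 * np.sin(2 * np.pi * 0.5 * t)  # 0.5 Hz oscillation, ±10 mm
        mini.set_target(head=create_head_pose(y=y, mm=True))
        time.sleep(0.01)  # ~100 Hz update rate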
Source: API documentation.
How do I choose the interpolation method for movements?
Use the method argument in goto_target:
mini.goto_target(
head=create_head_pose(y=10, mm=True),
antennas=np.deg2rad([-45, -45]),
duration=2.0,
method="cartoon", # "linear", "minjerk", "ease", or "cartoon"
)
For a visual comparison of methods, you can run the example:
examples/goto_interpolation_playground.py
Source: API documentation.
2.2 Writing & Sharing Apps
How do I write a Reachy Mini app?
Inherit from ReachyMiniApp and implement run:
import threading
from reachy_mini.apps.app import ReachyMiniApp
from reachy_mini import ReachyMini
class MyApp(ReachyMiniApp):
def run(self, reachy_mini: ReachyMini, stop_event: threading.Event):
# your app logic
...
- stop_event tells you when the app should stop.
- This pattern is used for standalone apps and for HF Spaces.
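As a concrete illustration, here is a minimal sketch of an app that wiggles the antennas until it is asked to stop (the behavior itself is just an example):
import threading
import time
import numpy as np
from reachy_mini import ReachyMini
from reachy_mini.apps.app import ReachyMiniApp

class WiggleApp(ReachyMiniApp):
    def run(self, reachy_mini: ReachyMini, stop_event: threading.Event):
        up = True
        while not stop_event.is_set():
            angle = 45 if up else -45
            # alternate both antennas between +45 and -45 degrees
            reachy_mini.goto_target(antennas=np.deg2rad([angle, angle]), duration=0.5)
            time.sleep(0.5)  # pace the loop to roughly match the motion duration
            up = not up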
Source: API documentation.
How can I generate a new app template quickly?
Use the app template generator:
reachy-mini-make-app my_app_name
This creates:
my_app_name/
├── pyproject.toml
├── README.md
└── my_app_name/
    ├── __init__.py
    └── main.py
You can run it via:
python my_app_name/main.py
Or install it as a package:
pip install -e my_app_name/
Source: API documentation.
Is there a recommended way to share apps with the community?
Yes, via Hugging Face Spaces:
- Example hub: Reachy Mini Apps
- Official example space: reachy_mini_app_example
Source: API documentation + Discord – author: annecharlotte_pollen (trusted).
2.3 Playing Recorded Moves
How do I play predefined moves?
Use RecordedMoves and play_move:
from reachy_mini import ReachyMini
from reachy_mini.motion.recorded_move import RecordedMoves
with ReachyMini() as mini:
recorded_moves = RecordedMoves("pollen-robotics/reachy-mini-dances-library")
for move_name in recorded_moves.list_moves():
mini.play_move(recorded_moves.get(move_name), initial_goto_duration=1.0)
- initial_goto_duration smoothly brings the robot to the starting pose of the move.
- Datasets are hosted on Hugging Face (e.g. the emotions / dances libraries).
Source: API documentation.
How do I record my own moves for later replay?
Call start_recording() and stop_recording(), which record all set_target calls:
with ReachyMini() as mini:
mini.start_recording()
# run your teleop / control code...
recorded_motion = mini.stop_recording()
You can then unpack time, head, antennas, body_yaw from each frame and save them.
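A hedged sketch of doing so, assuming each frame unpacks into (time, head, antennas, body_yaw) as described above (the exact frame layout may differ in your SDK version):
import json
import numpy as np

frames = []
for t, head, antennas, body_yaw in recorded_motion:
    frames.append({
        "time": t,  # timestamp (s)
        "head": np.asarray(head).tolist(),  # 4x4 head pose matrix
        "antennas": np.asarray(antennas).tolist(),  # [right, left] (rad)
        "body_yaw": float(body_yaw),  # body rotation (rad)
    })

with open("my_move.json", "w") as f:
    json.dump(frames, f)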
Tools to record/upload datasets are available in:
reachy_mini_toolbox/tools/moves
Source: API documentation.
3. Hardware Guide & Motion Limits
What are the safety limits of the head and body?
Physical & software limits include:
- Motors have a limited range of motion.
- The head can collide with the body.
- Body yaw: [-180°, 180°].
- Head pitch/roll: [-40°, 40°].
- Head yaw: [-180°, 180°].
- Difference (body_yaw - head_yaw): [-65°, 65°].
If you command a pose outside these limits, Reachy Mini will automatically clamp to the nearest safe pose (no exception is raised).
Source: API documentation.
What happens if I ask for an impossible pose (beyond limits)?
Example:
mini.goto_target(head=create_head_pose(roll=-50, degrees=True))
- This exceeds the roll limit (±40°).
- The robot will not throw an error but will move to the closest safe pose.
You can inspect the actual pose using:
head_pose = mini.get_current_head_pose()
Source: API documentation.
Are power supply specs documented in the API docs?
No, the API documentation does not specify the exact power supply specs (voltage, current). This was also requested in the Discord chat but not answered there.
Source: [NO DIRECT DATA in API docs] + Discord question with no official answer.
4. Sensors & Media
4.1 Camera
How do I grab camera frames from Reachy Mini?
Use the media object:
from reachy_mini import ReachyMini
with ReachyMini() as mini:
frame = mini.media.get_frame()
# frame is a numpy array as returned by OpenCV
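For example, a small sketch that streams frames to an OpenCV window (press q to quit):
import cv2
from reachy_mini import ReachyMini

with ReachyMini() as mini:
    while True:
        frame = mini.media.get_frame()
        cv2.imshow("Reachy Mini camera", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break

cv2.destroyAllWindows()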
Source: API documentation.
4.2 Microphone
How do I access microphone audio samples?
from reachy_mini import ReachyMini
with ReachyMini() as mini:
sample = mini.media.get_audio_sample()
# sample is a numpy array as output by sounddevice
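For instance, a quick sketch that estimates the current input level from one sample block:
import numpy as np
from reachy_mini import ReachyMini

with ReachyMini() as mini:
    sample = mini.media.get_audio_sample()
    rms = float(np.sqrt(np.mean(np.square(sample.astype(np.float64)))))
    print(f"RMS input level: {rms:.4f}")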
Source: API documentation.
4.3 Speaker
How do I send audio to the speaker?
from reachy_mini import ReachyMini
with ReachyMini() as mini:
mini.media.push_audio_sample(chunk) # chunk = numpy array of audio samples
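A hedged sketch of building such a chunk, assuming the speaker accepts float32 samples (check the SDK docs for the expected sample rate and format):
import numpy as np
from reachy_mini import ReachyMini

SAMPLE_RATE = 16000  # assumed sample rate (Hz); adjust to your device
t = np.linspace(0, 1.0, SAMPLE_RATE, endpoint=False)
chunk = (0.2 * np.sin(2 * np.pi * 440 * t)).astype(np.float32)  # 1 s, 440 Hz tone

with ReachyMini() as mini:
    mini.media.push_audio_sample(chunk)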
Source: API documentation.
4.4 GStreamer Backend
How can I use the GStreamer backend instead of the default OpenCV/sounddevice?
Install with the GStreamer extra:
pip install -e ".[gstreamer]"
Then run examples with:
--backend gstreamer
You need GStreamer binaries properly installed on your system. You can also define custom pipelines (see camera_gstreamer.py in the repo).
Source: API documentation.
5. Interaction Features
How can I make Reachy Mini look at a point in the image?
Use the look_at_image method (see example look_at_image.py):
- Provide a 2D point in image coordinates (0,0 = top-left).
- You can specify a duration, similar to goto_target.
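A minimal sketch, assuming look_at_image takes pixel coordinates plus an optional duration (see look_at_image.py for the exact signature):
from reachy_mini import ReachyMini

with ReachyMini() as mini:
    frame = mini.media.get_frame()
    h, w = frame.shape[:2]
    mini.look_at_image(w // 2, h // 2, duration=1.0)  # look at the image center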
Source: API documentation.
How can I make Reachy Mini look at a 3D point in the world?
Use look_at_world, which accepts 3D coordinates in the robot world frame (see world frame illustration in docs).
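A minimal sketch, assuming look_at_world takes x, y, z coordinates (here guessed to be in meters) plus an optional duration:
from reachy_mini import ReachyMini

with ReachyMini() as mini:
    # look at a point roughly half a meter ahead of the robot, slightly above the base
    mini.look_at_world(0.5, 0.0, 0.2, duration=1.0)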
Source: API documentation.
6. Motors, Compliancy & Manual Manipulation
How do I enable, disable, or make motors compliant?
Three main methods:
- enable_motors: powers motors ON. The robot holds its position and cannot be moved by hand.
- disable_motors: powers motors OFF. The robot is limp; you can move it freely by hand.
- make_motors_compliant: motors ON but compliant (soft). Good for demonstrations or teaching-by-demonstration; used in the "gravity compensation" example.
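A short sketch putting the three modes together (method names per the API documentation above; calling them on the ReachyMini instance is an assumption):
from reachy_mini import ReachyMini

with ReachyMini() as mini:
    mini.make_motors_compliant()  # soft: guide the head by hand to demonstrate a pose
    input("Move the robot by hand, then press Enter...")
    mini.enable_motors()  # stiff again: the robot holds its position
    # mini.disable_motors()  # or power off entirely: the robot goes limp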
Source: API documentation.
7. Software Setup & Environment
Why do examples assume a virtual environment?
While not strictly required, using a virtual environment:
- Avoids dependency conflicts.
- Makes upgrading the SDK (pip install -U reachy-mini) safer.
Beta testers who worked inside a .venv also reported fewer installation issues.
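A typical setup, using standard Python tooling (Unix shell shown):
python -m venv .venv
source .venv/bin/activate   # on Windows: .venv\Scripts\activate
pip install -U reachy-mini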
Source: API documentation (implicit) + Discord – author: matth_lap.
8. Customization & CAD
Are CAD files mentioned in the API documentation?
The API documentation doesn't mention CAD files directly. Some testers requested them; no official link was provided in the logs.
Source: [NO DIRECT DATA in API docs] + Feedback / Discord.
9. Contributing & Community
How can I share my app with other users?
Recommended workflow:
- Wrap your behavior in a ReachyMiniApp.
- Publish it as a HF Space or Python package.
- Look at the official example space reachy_mini_app_example.
Source: API documentation + Discord – author: annecharlotte_pollen.
How can I contribute to datasets of moves (dances/emotions)?
Use the tools in:
reachy_mini_toolbox/tools/moves
They help you:
- Record moves (via start_recording / stop_recording).
- Upload them to the Hugging Face Hub for reuse by others.
Source: API documentation.