AgileX Piper
Compact tabletop manipulator with CAN bus control, Python SDK, ROS2 integration, and Meta Quest 3 VR teleoperation. Ready for imitation learning and data collection.
Your Setup Journey
Follow these steps to go from unboxing to first teleoperated episode with your AgileX Piper.
CAN Bus & Host Setup
Connect USB-to-CAN adapter, bring up the can0 interface at 1 Mbps
Install piper_sdk
Install from PyPI or source, verify import, run first connect-and-enable script
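A first connect-and-enable script might look like the sketch below. The `C_PiperInterface`, `ConnectPort`, and `EnableArm(7)` names follow the piper_sdk demo scripts and should be verified against the version you installed, since the SDK API has changed across releases.

```python
# pip install piper_sdk     (or: git clone the SDK repo && pip install -e .)
import time
from typing import Callable

def wait_for(predicate: Callable[[], bool], timeout_s: float = 5.0,
             poll_s: float = 0.1) -> bool:
    """Poll `predicate` until it returns True or `timeout_s` elapses."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if predicate():
            return True
        time.sleep(poll_s)
    return False

def main() -> None:
    # Class/method names follow the piper_sdk demos; verify against your version.
    from piper_sdk import C_PiperInterface
    piper = C_PiperInterface("can0")
    piper.ConnectPort()          # starts the CAN read thread
    time.sleep(0.1)              # brief pause so feedback starts flowing
    piper.EnableArm(7)           # 7 = all six joints plus the gripper
    print("Piper connected and enabled")

# Call main() on the robot host with can0 up.
```

`wait_for` is a generic polling helper you can reuse to gate on any SDK feedback field before commanding motion.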
First Motion
Enable all joints, send a joint position command, read feedback in a loop
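A first-motion sketch, with one caveat: the 0.001-degree integer units for `JointCtrl` and the `MotionCtrl_2` mode arguments below follow the official demo scripts and are assumptions to check against your SDK version before sending real commands.

```python
import time

MDEG_PER_DEG = 1000  # assumed: joint commands use 0.001-degree integer units,
                     # as in the official piper_sdk demos

def deg_to_mdeg(deg: float) -> int:
    """Convert degrees to the SDK's 0.001-degree integer units."""
    return round(deg * MDEG_PER_DEG)

def main() -> None:
    from piper_sdk import C_PiperInterface
    piper = C_PiperInterface("can0")
    piper.ConnectPort()
    piper.EnableArm(7)                    # all six joints plus the gripper
    piper.MotionCtrl_2(0x01, 0x01, 30)    # CAN mode, joint control, 30% speed
    target = [0.0, 20.0, -20.0, 0.0, 0.0, 0.0]   # degrees; small example pose
    piper.JointCtrl(*[deg_to_mdeg(d) for d in target])
    for _ in range(50):                   # read joint feedback at ~10 Hz
        print(piper.GetArmJointMsgs())
        time.sleep(0.1)

# Call main() on the robot host; start with small angles and low speed.
```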
ROS2 / MoveIt Integration
Launch piper_ros, publish to ROS topics, run MoveIt planning in RViz
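For a quick CLI smoke test before wiring up MoveIt, a helper like this can build the `sensor_msgs/msg/JointState` YAML payload for `ros2 topic pub`. The topic name in the comment is an assumption — run `ros2 topic list` after launching piper_ros to find the actual command and feedback topics.

```python
def joint_state_yaml(names: list[str], positions: list[float]) -> str:
    """Build the inline-YAML payload for publishing a JointState message."""
    assert len(names) == len(positions), "one position per joint name"
    return ("{name: [" + ", ".join(names) + "], "
            "position: [" + ", ".join(f"{p:.4f}" for p in positions) + "]}")

# Example usage (topic name is an assumption -- check `ros2 topic list`):
#   ros2 topic pub --once /joint_states sensor_msgs/msg/JointState \
#     "<output of joint_state_yaml([...], [...])>"
```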
Quest 3 VR Teleoperation
Set up Unity UDP bridge, stream hand pose to PiperController, record episodes
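The pose stream needs an agreed wire format between the Unity sender and the Python side. The format below is purely hypothetical — eight little-endian float32s: position (x, y, z), orientation quaternion (x, y, z, w), and a gripper value in [0, 1] — so match it to whatever your Unity bridge actually serializes.

```python
import struct

# Hypothetical Unity -> Python hand-pose packet: 8 little-endian float32s.
POSE_FMT = "<8f"
POSE_SIZE = struct.calcsize(POSE_FMT)  # 32 bytes

def pack_pose(pos, quat, grip: float) -> bytes:
    """Serialize position (x,y,z), quaternion (x,y,z,w), and gripper value."""
    return struct.pack(POSE_FMT, *pos, *quat, grip)

def unpack_pose(data: bytes):
    """Inverse of pack_pose; ignores any trailing bytes in the datagram."""
    vals = struct.unpack(POSE_FMT, data[:POSE_SIZE])
    return vals[0:3], vals[3:7], vals[7]
```

Keeping the format in one shared constant makes it easy to evolve the packet (e.g. adding a frame counter) without the two ends drifting apart.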
Data Collection
Record teleoperated demonstrations, export RLDS/LeRobot format datasets
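The per-step schema a recorder needs is small. This is a minimal sketch mirroring the RLDS observation/action step structure; a real pipeline should export through the LeRobot or RLDS tooling rather than raw JSON, but the buffer shape is the same.

```python
import json
import time

class EpisodeRecorder:
    """Minimal episode buffer: one dict per step with observation,
    action, and timestamp, mirroring the RLDS step schema."""

    def __init__(self) -> None:
        self.steps: list[dict] = []

    def add_step(self, joint_positions: list[float], action: list[float]) -> None:
        self.steps.append({
            "observation": {"joint_positions": joint_positions},
            "action": action,
            "timestamp": time.time(),
        })

    def save(self, path: str) -> None:
        """Dump the episode to JSON; swap for LeRobot/RLDS export in practice."""
        with open(path, "w") as f:
            json.dump({"steps": self.steps}, f)
```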
Hardware at a Glance
VR Teleoperation
Control the Piper in real time using a Meta Quest 3 headset. Hand pose data streams over UDP from Unity to a Python server that drives the arm via piper_sdk.
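On the Python side, the server described above reduces to a UDP receive loop. A minimal sketch: the port number and the `handle` callback (where a PiperController would unpack the pose and drive the arm) depend entirely on your Unity sender and controller code.

```python
import socket

def make_pose_server(port: int = 0) -> socket.socket:
    """Bind a UDP socket for incoming hand-pose packets from Unity.
    port=0 lets the OS pick a free port (handy for testing); a fixed
    port must match the one configured in the Unity bridge."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    return sock

def serve_forever(sock: socket.socket, handle) -> None:
    """Blocking receive loop: pass each datagram to handle(data), which
    would unpack the pose and forward it to the arm controller."""
    while True:
        data, _addr = sock.recvfrom(1024)
        handle(data)
```

At teleop rates (~60-90 Hz from the headset) a plain blocking loop is fine; drop stale packets rather than queueing them if the arm controller falls behind.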
Compatible AI Models
The AgileX Piper is well-suited for imitation learning. These policy frameworks work with data collected via piper_sdk or piper_ros.
ACT
Action Chunking with Transformers — best for pick-and-place tasks. Works well with the Piper's 6-DOF joint-space data.
Diffusion Policy
Best for contact-rich manipulation. Generates smooth trajectories over Piper's compact workspace envelope.
OpenVLA
Language-conditioned tasks. Combines vision-language understanding with robot action prediction via piper_ros topics.
Technical Guides & Docs
Detailed guides covering every layer of the AgileX Piper stack — from CAN bus setup to VR teleoperation.
Community
Have a question about CAN bus setup, SDK integration, or VR teleop?