Bimanual Manipulation Datasets

Curated open-source and custom bimanual manipulation datasets for imitation learning, VLA fine-tuning, and dual-arm policy research. Two-arm coordination is one of the hardest open problems in robot learning — these datasets provide the demonstrations to tackle it.

Key Open-Source Bimanual Datasets

| Dataset | Episodes | Robot | Format |
|---|---|---|---|
| ALOHA | ~50 per task, ~10 tasks | 2x ViperX-300 | HDF5, LeRobot |
| Mobile ALOHA | 50+ per task | 2x ViperX-300 + wheeled base | HDF5, LeRobot |
| ALOHA 2 | Varies | Google ALOHA 2 cells | RLDS |
| DROID (bimanual split) | Subset of 76K | Franka + assorted | RLDS, HDF5 |
| RH20T | 110K+ episodes | Dual Franka Panda | HDF5 |
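To make the formats above concrete, here is a sketch of an ALOHA-style HDF5 episode layout, represented as a plain dict of dataset paths and array shapes so it runs without an HDF5 library. The camera names and episode length are illustrative assumptions, not guaranteed to match any particular release:

```python
# Sketch: ALOHA-style HDF5 episode layout (paths -> array shapes).
# Shapes follow the 14-DoF bimanual convention (7 joints per arm,
# gripper included); camera set and T=400 are illustrative only.
T = 400  # timesteps in one episode (e.g. 8 s at 50 Hz)

episode = {
    "/action": (T, 14),                    # commanded joint positions, both arms
    "/observations/qpos": (T, 14),         # measured joint positions
    "/observations/qvel": (T, 14),         # measured joint velocities
    "/observations/images/cam_high": (T, 480, 640, 3),        # overhead RGB
    "/observations/images/cam_left_wrist": (T, 480, 640, 3),  # left wrist RGB
    "/observations/images/cam_right_wrist": (T, 480, 640, 3), # right wrist RGB
}

for path, shape in episode.items():
    print(f"{path}: {shape}")
```

Whatever the exact camera set, the defining feature is that the action and proprioception arrays carry both arms in a single, jointly timestamped record rather than two separate files.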

What Makes Bimanual Data Special

Bimanual datasets record two synchronized action streams — typically 14 DoF total, with six arm joints plus one gripper per side. Coordination timing between the left and right arms is critical: a 50 ms desynchronization can turn a successful handoff into a dropped object. Bimanual datasets therefore demand stricter time synchronization, higher collection frequency (usually 50 Hz), and operators trained specifically in dual-arm coordination.
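The synchronization requirement can be checked mechanically. A minimal sketch, assuming each arm's frames carry their own timestamps (the function names and the 50 ms threshold here are illustrative, mirroring the figure above, not taken from any specific toolkit):

```python
# Sketch: verifying left/right arm timestamp alignment in a bimanual
# episode. Field names and thresholds are illustrative assumptions.

DESYNC_LIMIT_S = 0.050  # 50 ms: beyond this, handoffs become unreliable
RATE_HZ = 50            # typical bimanual collection frequency

def max_desync(left_ts, right_ts):
    """Worst-case timestamp gap between paired left/right frames."""
    assert len(left_ts) == len(right_ts), "streams must have equal length"
    return max(abs(l - r) for l, r in zip(left_ts, right_ts))

def is_synchronized(left_ts, right_ts, limit=DESYNC_LIMIT_S):
    return max_desync(left_ts, right_ts) <= limit

# Simulated 1-second episode at 50 Hz; the right arm drifts 2 ms per frame.
left = [i / RATE_HZ for i in range(50)]
right = [t + 0.002 * i for i, t in enumerate(left)]

print(is_synchronized(left, left))   # identical streams -> True
print(is_synchronized(left, right))  # drift reaches ~98 ms -> False
```

A per-episode check like this is cheap to run at ingest time and catches clock drift before it contaminates a training set.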

Common bimanual tasks include: object handoffs, box packing, lid opening and pouring, threading and tying, two-handed assembly, and cooperative lifting of large or deformable objects.

Custom Bimanual Data Collection

SVRC operates Mobile ALOHA and OpenArm bimanual collection stations in our Mountain View lab. We collect custom bimanual datasets with leader-follower teleoperation, deliver in your target format, and provide full QA with per-episode quality scores.

Need a bimanual platform? The Mobile ALOHA is available for purchase or lease.

Need bimanual manipulation data?

Commission custom bimanual teleoperation data or explore our open-source dataset catalog.