
Gloves & wearables

Gloves (and hand trackers) translate human motion into robot commands. The hard parts are calibration, latency, and mapping human kinematics onto robot hands with different degrees of freedom — covered in demonstration data and Communication & architecture.

Hand tracking and wearable teleoperation

[Motion sketch: stylized fingers — same idea as dexterous hands, but here the story is human → robot mapping.]

Learning outcomes

  • Explain calibration, latency, and retargeting for human → robot mapping.
  • Compare gloves-only vs headset-mediated teleop for your setup.
  • Collect episodes with timestamps suitable for downstream learning.

Learn

Hand tracking pipeline, mapping DOF, safety envelopes.
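Mapping DOF can be sketched concretely. Below is a minimal, illustrative retargeting example: a linear map from many tracked human joint angles down to a robot hand with fewer DOF, clamped to a safety envelope. All counts, weights, and limits are assumptions for illustration, not values for any specific tracker or hand.

```python
import numpy as np

# Illustrative sketch, not a real hand spec:
HUMAN_DOF = 20   # e.g. 4 joints x 5 fingers from a tracker (assumption)
ROBOT_DOF = 6    # e.g. an underactuated robot hand (assumption)

# Linear retargeting: each robot joint is a weighted sum of human joints.
# Here each robot joint averages a group of 3 human joints as a placeholder;
# leftover human joints are ignored in this toy mapping.
W = np.zeros((ROBOT_DOF, HUMAN_DOF))
group = HUMAN_DOF // ROBOT_DOF
for r in range(ROBOT_DOF):
    W[r, r * group:(r + 1) * group] = 1.0 / group

ROBOT_LIMITS = (0.0, 1.6)  # radians, illustrative safety envelope

def retarget(human_angles: np.ndarray) -> np.ndarray:
    """Map human joint angles (radians) to robot joint commands, clamped to limits."""
    q = W @ human_angles
    return np.clip(q, *ROBOT_LIMITS)

q_cmd = retarget(np.full(HUMAN_DOF, 0.8))
```

In practice the weight matrix comes from a calibration routine rather than being hand-written, but the clamp-after-map order is the important habit: retargeted commands should never leave the safety envelope.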

Practice

Measure latency in open air; retarget to sim with matched DOF.
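One way to get latency numbers early is to timestamp each tracked pose at capture and again when the corresponding command would leave your process, using one monotonic clock. The sketch below stands in for a real tracker and command path (both functions are placeholders, and the 2 ms sleep simulates processing delay):

```python
import time

def capture_pose():
    """Stand-in for a glove/tracker read; returns (timestamp, pose)."""
    return time.monotonic(), {"wrist": (0.0, 0.0, 0.0)}

def send_command(pose):
    """Stand-in for retargeting + transport; returns the send timestamp."""
    time.sleep(0.002)  # placeholder for real processing delay
    return time.monotonic()

latencies_ms = []
for _ in range(50):
    t_capture, pose = capture_pose()
    t_sent = send_command(pose)
    latencies_ms.append((t_sent - t_capture) * 1000.0)

latencies_ms.sort()
p50 = latencies_ms[len(latencies_ms) // 2]
p95 = latencies_ms[int(len(latencies_ms) * 0.95)]
print(f"p50={p50:.1f} ms  p95={p95:.1f} ms")
```

Report p50 and p95 rather than a single average: teleop feels fine at the median and falls apart at the tail.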

Challenge

One hardware session with speed caps and an e-stop drill; post metrics on the Forum.

Facilitation: Run sim-only first; pair with demonstration data for episode hygiene.

Self-check

What goes wrong if latency is ignored?
Oscillation, poor contact control, and unsafe corrective motions — measure and show latency numbers early.
Why shared timestamps?
Vision, proprioception, and commands must align for imitation and debugging.
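Alignment is usually a nearest-timestamp lookup. A minimal sketch, assuming all streams stamp from one shared monotonic clock (the stamp values below are made up):

```python
import bisect

# Illustrative stamps in seconds from one shared clock:
frame_stamps = [0.000, 0.033, 0.066, 0.100]        # vision stream (~30 Hz, assumed)
command_log = [(0.010, "open"), (0.070, "close")]  # (stamp, command)

def nearest_frame(t: float, stamps: list) -> int:
    """Index of the frame whose timestamp is closest to t."""
    i = bisect.bisect_left(stamps, t)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(stamps)]
    return min(candidates, key=lambda j: abs(stamps[j] - t))

aligned = [(cmd, nearest_frame(t, frame_stamps)) for t, cmd in command_log]
# pairs each command with its closest video frame index
```

If each stream stamps from its own clock, this lookup silently pairs the wrong frames — which is exactly why shared timestamps come first.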

STEM alignment: human–machine interfaces, measurement, iterative design under safety constraints.

Practice drills

  1. Track hands in open air with visible latency metrics.
  2. Retarget to a simulated robot hand with matched DOF.
  3. Move to hardware with conservative speed limits and e-stop drills.
  4. Record episodes with shared timestamps for learning pipelines.
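For drill 3, a conservative speed cap can be as simple as rate-limiting the commanded step each control tick, so a tracking glitch cannot command a jump. The cap and control rate below are deliberately conservative placeholders, not hardware values:

```python
# Illustrative values, not from any robot's spec:
CONTROL_HZ = 50
MAX_JOINT_SPEED = 0.5                    # rad/s, deliberately conservative
MAX_STEP = MAX_JOINT_SPEED / CONTROL_HZ  # max change per control tick

def rate_limit(q_prev, q_target):
    """Clamp the commanded step so no joint exceeds the speed cap."""
    out = []
    for prev, tgt in zip(q_prev, q_target):
        step = max(-MAX_STEP, min(MAX_STEP, tgt - prev))
        out.append(prev + step)
    return out

# A sudden 1.0 rad jump in the target only advances by MAX_STEP this tick,
# while a small in-bounds change passes through unchanged:
q = rate_limit([0.0, 0.0], [1.0, 0.005])
```

This belongs in software in addition to, never instead of, the physical e-stop you drill with.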

VR headsets vs gloves-only

Headsets add scene context but another latency path. Gloves-only can be lower friction for bench arms if you already have global cameras for the world frame.

Pairing with SO-101 / OpenArm

Start with SO-101 joint limits before scaling to OpenArm — identical teleop discipline, different torque and safety envelopes.
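Whichever arm you target, clamp every retargeted command to per-joint position limits before sending. The limit values below are placeholders only, not the real SO-101 specification — read the actual limits from your arm's configuration:

```python
# Placeholder limits, NOT the real SO-101 spec — load from your arm's config:
JOINT_LIMITS = [(-1.57, 1.57)] * 6   # (low, high) in radians per joint

def clamp_to_limits(q_cmd):
    """Saturate each joint command to its (low, high) position limit."""
    return [min(max(q, lo), hi) for q, (lo, hi) in zip(q_cmd, JOINT_LIMITS)]

safe = clamp_to_limits([2.0, -3.0, 0.5, 0.0, 1.0, -1.0])
```

Keeping the limits in one config table makes the move from SO-101 to OpenArm a data change, not a code change — the teleop discipline stays identical.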
