YCB-Handovers Dataset
- The YCB-Handovers Dataset is a comprehensive, weight-sensitive motion dataset capturing 2,771 handover trials with varied objects and mass conditions.
- The dataset features time-aligned kinematics from high-frequency motion capture, providing detailed velocity and acceleration metrics for giver and taker roles.
- Its structured data and rigorous segmentation support adaptive robotic handover planning and enable benchmarking of weight-sensitive motion generation algorithms.
The YCB-Handovers Dataset is a human-human handover motion dataset oriented toward weight-sensitive research in human-robot handover planning. It leverages the standard Yale-CMU-Berkeley (YCB) object set and captures 2,771 handover trials across 27 objects varying systematically in mass, geometric class, and handling requirements. The dataset provides time-aligned kinematics for both giver and taker, segmented for handover events, and is structured for direct integration into models of adaptive, human-inspired robotic handover motion. It is hosted at https://github.com/paragkhanna1/YCB-Handovers and described in (Khanna et al., 25 Feb 2025, Khanna et al., 23 Dec 2025).
1. Composition and Object Properties
The YCB-Handovers Dataset consists of 2,771 trials recorded from 12 healthy adult participants (forming six fixed giver–taker pairs), each performing handovers of 27 distinct objects sourced from the YCB Object and Model Set and selected weighted variants. Object choices reflect a spectrum of everyday manipulation challenges, spanning cylinders, spheres, boxes, tool-like shapes, and containers. Object weights range from 0.008 kg (Measuring Cup) to 2.060 kg (Pitcher+Weights or Pitcher+Water). “Careful” and “non-careful” handovers are distinguished for select objects (e.g., Pitcher+Water).
Objects are categorized for analysis into three bins: low ( < 0.10 kg), moderate (0.10–1.00 kg), and high (> 1.00 kg) mass. Approximate trial counts per category are: low = 380, moderate = 1,750, high = 640 (Khanna et al., 25 Feb 2025).
| Object Name | YCB (ID) | Mass (kg) |
|---|---|---|
| Marker Small | 1 | 0.008 |
| Hand Drill | 9 | 0.874 |
| Cleanser Bottle | 5 | 1.131 |
| Pitcher+Weights/Water | 24/25 | 2.060 |
| Measuring Cup | 26 | 0.008 |
| Skillet | 23 | 0.925 |
The full set includes tool-like, container, and block-shaped objects. Each is represented in ~85–130 handover trials, with care taken to cover light, moderate, and heavy-weight handling.
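The three-bin mass categorization used for analysis can be expressed as a small helper. This is an illustrative sketch; the boundary treatment at exactly 0.10 kg and 1.00 kg is an assumption, since the paper states the bins only as low (< 0.10 kg), moderate (0.10–1.00 kg), and high (> 1.00 kg).

```python
def mass_bin(mass_kg: float) -> str:
    """Assign an object mass (kg) to the dataset's three analysis bins.

    Bin edges follow the paper: low (< 0.10 kg), moderate (0.10-1.00 kg),
    high (> 1.00 kg). Inclusive handling of the edges is an assumption.
    """
    if mass_kg < 0.10:
        return "low"
    if mass_kg <= 1.00:
        return "moderate"
    return "high"

# Examples using masses from the object table above
light = mass_bin(0.008)    # Measuring Cup
medium = mass_bin(0.925)   # Skillet
heavy = mass_bin(2.060)    # Pitcher+Water
```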
2. Motion Capture and Data Acquisition
Motion data were recorded using a 12-camera OptiTrack system at 120 Hz. Reflective markers tracked 13 segments per participant (hips, chest, shoulders, upper/lower arms, hands, head), with rigid clusters for object pose estimation (Khanna et al., 23 Dec 2025). Calibration ensured all data streams are referenced to a unified world coordinate frame (z-axis up). The dataset contains no direct force/torque or tactile sensor channels—kinematic data constitutes the primary modality.
Raw data are distributed in ROS bag (.bag) format, while post-processed CSV exports provide segment-wise kinematic data per trial. CSV fields per row include the time, header/child frame names, translation (tx, ty, tz in meters), and rotation quaternion (qx, qy, qz, qw). Each trial subdirectory contains giver and taker segment files and a metadata JSON describing participant pairing, object identity, weight, basket, and trial ID (Khanna et al., 25 Feb 2025).
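A minimal loader for one trial directory can be sketched from the CSV and JSON layout described above. The segment filename (`giver_hand.csv`) and the metadata keys used here are hypothetical placeholders; the column names (`time`, `tx`, `ty`, `tz`, `qx`, `qy`, `qz`, `qw`) follow the field description above, but exact header strings in the release may differ.

```python
import csv
import json
from pathlib import Path

def load_trial(trial_dir: Path):
    """Load one trial's metadata JSON and a segment CSV into Python structures.

    Assumes CSV columns time, tx, ty, tz (meters) and qx, qy, qz, qw,
    per the dataset's documented field layout; filenames are illustrative.
    """
    meta = json.loads((trial_dir / "metadata.json").read_text())
    rows = []
    with open(trial_dir / "giver_hand.csv", newline="") as f:  # hypothetical name
        for row in csv.DictReader(f):
            rows.append({
                "t": float(row["time"]),
                "p": (float(row["tx"]), float(row["ty"]), float(row["tz"])),
                "q": (float(row["qx"]), float(row["qy"]),
                      float(row["qz"]), float(row["qw"])),
            })
    return meta, rows
```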
3. Data Annotation and Feature Definitions
Handover segments are algorithmically extracted using hand proximity and velocity thresholds. For YCB handovers, onset is defined as the moment the skeleton-tracked hand and object are separated by less than 2 cm and relative velocity drops below 0.05 m/s; the segment ends 6.66 s later (800 frames at 120 Hz) (Khanna et al., 25 Feb 2025). Manual validation using secondary RGB streams (not publicly released) confirms handover quality.
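The onset criterion above (hand–object distance below 2 cm and relative speed below 0.05 m/s, followed by a fixed 800-frame segment) can be sketched as follows. This is an assumed reimplementation from the stated thresholds, not the authors' released code.

```python
import numpy as np

def find_onset(hand_pos, obj_pos, fs=120.0, d_thresh=0.02, v_thresh=0.05):
    """First frame where hand-object distance < d_thresh (m) AND their
    relative speed < v_thresh (m/s); returns None if never satisfied.

    hand_pos, obj_pos: (N, 3) position arrays in meters at sample rate fs.
    """
    rel = np.asarray(hand_pos, float) - np.asarray(obj_pos, float)
    dist = np.linalg.norm(rel, axis=1)
    # Relative speed from frame-to-frame differences of the relative position
    rel_speed = np.linalg.norm(np.diff(rel, axis=0), axis=1) * fs
    rel_speed = np.append(rel_speed, rel_speed[-1])  # pad to length N
    hits = np.where((dist < d_thresh) & (rel_speed < v_thresh))[0]
    return int(hits[0]) if hits.size else None

def segment(onset, n_frames=800):
    """Handover segment: 800 frames (~6.66 s at 120 Hz) starting at onset."""
    return onset, onset + n_frames
```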
Feature extraction proceeds from raw positions $p_i$ and orientations $q_i$ (sampled at $\Delta t = 1/120$ s) for each segment. Derived metrics include:
- Instantaneous linear velocity: $v_i = (p_{i+1} - p_i)/\Delta t$
- Linear acceleration: $a_i = (v_{i+1} - v_i)/\Delta t$
- Angular velocity magnitude, from consecutive unit quaternions: $\omega_i = 2\arccos(|\langle q_i, q_{i+1}\rangle|)/\Delta t$
- Average and peak velocity/acceleration over the segmented handover phase: $\bar{v} = \frac{1}{N}\sum_{i=1}^{N}\|v_i\|$, $v_{\max} = \max_i \|v_i\|$; analogously $\bar{a}$, $a_{\max}$.
Hand proximity for readiness cue detection uses the inter-hand distance $d(t) = \|p_{\mathrm{giver}}(t) - p_{\mathrm{taker}}(t)\|$, with a segmentation threshold expressed in meters (Khanna et al., 23 Dec 2025).
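The finite-difference feature definitions above can be sketched directly in NumPy. Function names are illustrative; the quaternion-based angular speed uses the standard geodesic-angle increment between consecutive orientations.

```python
import numpy as np

def kinematics(pos, quat, fs=120.0):
    """Finite-difference kinematics from positions (N,3) and quaternions (N,4).

    Returns linear velocity (N-1,3), linear acceleration (N-2,3), and
    angular speed (N-1,) in rad/s from quaternion geodesic increments.
    """
    pos = np.asarray(pos, float)
    dt = 1.0 / fs
    vel = np.diff(pos, axis=0) / dt
    acc = np.diff(vel, axis=0) / dt
    q = np.asarray(quat, float)
    q = q / np.linalg.norm(q, axis=1, keepdims=True)
    # Angle between consecutive orientations: 2*arccos(|<q_i, q_{i+1}>|)
    dots = np.clip(np.abs(np.sum(q[:-1] * q[1:], axis=1)), -1.0, 1.0)
    ang_speed = 2.0 * np.arccos(dots) / dt
    return vel, acc, ang_speed

def summarize(vel, acc):
    """Average and peak speed/acceleration magnitude over one segment."""
    sv = np.linalg.norm(vel, axis=1)
    sa = np.linalg.norm(acc, axis=1)
    return {"v_avg": sv.mean(), "v_max": sv.max(),
            "a_avg": sa.mean(), "a_max": sa.max()}
```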
4. Object Weight Effects on Human Motion
Analysis of kinematic features demonstrates systematic, weight-dependent modulation of handover trajectories. After 4th-order Butterworth filtering (5 Hz cutoff), “careful” trials are excluded. Pearson correlations between object mass and the key motion metrics (average velocity, maximum velocity, average acceleration, maximum acceleration) are all negative: heavier objects are handed over more slowly and with gentler accelerations (specific coefficients are reported in Khanna et al., 23 Dec 2025).
Linear regression of acceleration on mass yields a negative slope (in m/s² per kg), with an analogous negative slope (in m/s per kg) for average velocity. Group-wise statistical tests show significant differences (p < 0.004) in mean acceleration and velocity between the low and high mass bins; both mean velocity and mean acceleration decrease from low- to high-weight objects (Khanna et al., 23 Dec 2025).
No sharp transition (“kink”) is noted near 0.5 kg; trends are smooth. This suggests any adaptive controller will require continuous, rather than discrete, mass input.
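The mass-to-kinematics regression and correlation analysis can be reproduced in outline with standard least-squares tooling. The (mass, peak acceleration) pairs below are illustrative placeholders only, not values from the dataset.

```python
import numpy as np

# Hypothetical per-object (mass, mean peak acceleration) pairs -- illustrative
# placeholders; real values must be computed from the dataset's trials.
mass = np.array([0.008, 0.20, 0.50, 0.925, 1.131, 2.060])   # kg
a_max = np.array([6.1, 5.6, 5.0, 4.4, 4.1, 3.2])            # m/s^2

slope, intercept = np.polyfit(mass, a_max, 1)   # least-squares line fit
r = np.corrcoef(mass, a_max)[0, 1]              # Pearson correlation
print(f"slope = {slope:.2f} m/s^2 per kg, r = {r:.2f}")
```

With real per-trial features, the same two calls yield the mass-conditioned slope and correlation coefficient reported in the paper; the smooth (kink-free) trend supports feeding mass into a controller as a continuous regressor rather than a categorical bin.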
5. Data Structure, Preprocessing, and Usage Paradigms
The dataset directory is structured for programmatic traversal; each trial directory includes paired CSV files for giver/taker kinematics and a metadata.json descriptor. Column headers match segment ID and quaternion labels; sample code is supplied for velocity/acceleration computation and batch-feature extraction.
Recommended applications include:
- Benchmarking handover-motion generation across weight bins
- Training weight-adaptive trajectory planners for robot givers
- Learning grip-release and readiness strategies from human timing
- Estimating object mass from kinematic signals
- Developing vision-only or haptics-only handover phase-detection algorithms
A representative feature-extraction pipeline involves low-pass filtering position signals, numerical differentiation to obtain and , threshold-based event segmentation, and statistical aggregation for each trial. See provided pseudocode in (Khanna et al., 23 Dec 2025). Data-driven approaches (e.g., neural regressors, GPs) are encouraged; model-based strategies such as weight-conditioned DMP interpolation are directly feasible.
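That pipeline can be sketched end to end under the filtering parameters stated earlier (4th-order Butterworth, 5 Hz cutoff, 120 Hz sampling). This is an assumed reimplementation using SciPy's zero-phase `filtfilt`, not the authors' released pseudocode.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def extract_features(pos, fs=120.0, cutoff=5.0, order=4):
    """Pipeline sketch: low-pass filter positions, differentiate, aggregate.

    pos: (N, 3) raw positions in meters. Applies a zero-phase 4th-order
    Butterworth filter (5 Hz cutoff), central-difference differentiation,
    and per-trial statistical aggregation.
    """
    b, a = butter(order, cutoff / (fs / 2.0))        # normalized cutoff
    smooth = filtfilt(b, a, np.asarray(pos, float), axis=0)
    vel = np.gradient(smooth, 1.0 / fs, axis=0)      # central differences
    acc = np.gradient(vel, 1.0 / fs, axis=0)
    speed = np.linalg.norm(vel, axis=1)
    amag = np.linalg.norm(acc, axis=1)
    return {"v_avg": float(speed.mean()), "v_max": float(speed.max()),
            "a_avg": float(amag.mean()), "a_max": float(amag.max())}
```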
6. Limitations and Prospects for Extension
Limitations include lack of force/torque data or grip force traces (except in auxiliary baton data), potential marker occlusion artifacts (trials with high drop rates are excluded), moderate participant diversity (12 adults, fixed pairings), and exclusive lab-environment context (single table, neutral lighting). No joint-angle or finger contact data is available; only gross hand/arm kinematics are captured.
Future directions include adding wrist/hand-mounted force or tactile sensors, expanding the participant pool (age, handedness), deployment in unstructured or natural environments, and augmenting with additional modalities such as eye-tracking or EMG for readiness cues.
The YCB-Handovers Dataset provides an extensive, weight-calibrated basis for quantifying and modeling human handover strategies, supporting rigorous development of weight-sensitive robotic handover algorithms. For citation and full data access, see (Khanna et al., 25 Feb 2025, Khanna et al., 23 Dec 2025).