YOR: Your Own Mobile Manipulator for Generalizable Robotics

This presentation introduces YOR, an open-source mobile manipulation platform that lowers the cost barrier to advanced robotics research. For under $10,000, YOR delivers bimanual dexterity, omnidirectional mobility, and vertical reach through a modular design combining a swerve-drive base, a commercial lift actuator, and compliant dual arms. We'll explore how YOR's whole-body teleoperation enables rapid data collection, examine policy learning results achieving 90% success on complex pick-carry-place tasks, and show how visual-inertial SLAM supports dynamic obstacle avoidance with sub-second replanning, all while preserving the accessibility and extensibility needed to accelerate embodied AI research worldwide.
Script
What if the biggest barrier to advancing home robotics wasn't the algorithms, but simply the price tag and closed designs of the platforms themselves? YOR tackles this head-on by delivering a fully open-source mobile manipulator for under 10 thousand dollars.
Building on that accessibility problem, let's examine why existing platforms fall short.
The researchers identified a critical gap: existing mobile manipulators either cost tens of thousands of dollars, sacrifice the dexterity needed for household tasks, or lock researchers into proprietary ecosystems. This creates a fundamental bottleneck for scaling up data-driven policy learning in real-world environments.
So how does YOR break through these limitations?
YOR's architecture cleverly balances competing demands. The four-module swerve drive provides holonomic motion that traditional wheeled bases can't match in cluttered spaces, while the repurposed standing-desk lift gives floor-to-overhead reach without custom actuators. Most importantly, every subsystem is intentionally decoupled, so researchers can swap sensors or upgrade compute without redesigning the entire platform.
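To make the holonomic claim concrete, here is a minimal sketch of standard swerve-drive inverse kinematics: each module steers its wheel toward the local velocity induced by the commanded body twist. The module positions and function names are illustrative assumptions, not YOR's actual geometry.

```python
import math

# Hypothetical module positions (metres) for a square four-module base;
# the real YOR layout is not specified in the talk.
MODULES = [(0.25, 0.25), (0.25, -0.25), (-0.25, -0.25), (-0.25, 0.25)]

def swerve_ik(vx, vy, omega):
    """Map a body twist (vx, vy in m/s, omega in rad/s) to per-module
    (wheel speed, steering angle) commands. Because every module can
    steer independently, any combination of translation and rotation
    is reachable, which is what makes the base holonomic."""
    commands = []
    for mx, my in MODULES:
        # Velocity of each module's contact point = body velocity + omega x r
        wx = vx - omega * my
        wy = vy + omega * mx
        commands.append((math.hypot(wx, wy), math.atan2(wy, wx)))
    return commands
```

A differential or Ackermann base cannot realize arbitrary (vx, vy, omega) triples, which is exactly the limitation this geometry removes in cluttered indoor spaces.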
This comparison reveals YOR's unique position in the design space. While platforms like Mobile ALOHA sacrifice vertical workspace and RB-Y1 pushes costs to 50 thousand dollars, YOR achieves both bimanual dexterity and full-height manipulation at one-fifth the price. The omnidirectional base further distinguishes it from non-holonomic alternatives that struggle in tight indoor spaces.
Now let's see what this design enables in practice.
The teleoperation interface directly maps controller movements to end-effector poses, making expert demonstration collection fast and ergonomic. When the researchers trained a vision-based policy on just 100 filtered examples, the system achieved 10 out of 10 success on grasping and height adjustment, with a single navigation failure across trials, attributable to odometry drift caused by camera occlusion.
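The "direct mapping" idea can be sketched as a clutched relative mapping: when the operator engages, the controller and end-effector positions are latched, and subsequent controller displacement is applied to the end-effector reference. The class, method names, and scale factor below are assumptions for illustration; the talk only states that controller movements map directly to end-effector poses.

```python
SCALE = 1.0  # 1:1 motion scaling; a smaller value would give finer control

class TeleopMapper:
    """Toy sketch of clutched position teleoperation (positions only;
    a real system would also map orientation and gripper state)."""

    def __init__(self):
        self.ctrl_ref = None  # controller position at clutch engagement
        self.ee_ref = None    # end-effector position at clutch engagement

    def engage(self, ctrl_pos, ee_pos):
        """Latch reference frames when the operator presses the clutch."""
        self.ctrl_ref = ctrl_pos
        self.ee_ref = ee_pos

    def target(self, ctrl_pos):
        """Commanded end-effector position for the current controller
        position; all poses are (x, y, z) tuples in metres."""
        return tuple(e + SCALE * (c - r)
                     for e, c, r in zip(self.ee_ref, ctrl_pos, self.ctrl_ref))
```

The clutch lets the operator reposition the controller without moving the robot, which is part of what keeps long demonstration sessions ergonomic.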
This sequence captures the full complexity of the task. YOR must coordinate both arms to securely grasp the box, raise the lift to clear obstacles, navigate around barriers using its omnidirectional base, and precisely position over the recycling bin for release. The policy handles this entire chain autonomously, demonstrating true whole-body coordination learned from teleoperated demonstrations.
The navigation stack demonstrates remarkable robustness for such an affordable platform. Using the onboard depth camera and inertial measurement unit, YOR constructs world-referenced maps that enable dynamic path updates within one second of detecting new obstacles like walking humans. Perhaps most impressively, the tight integration between base and arm control allows kinematic compensation that keeps the end-effector locked in world coordinates even as the base translates and rotates.
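The kinematic compensation described here reduces to re-expressing a fixed world-frame target in the moving base frame at every control tick. A minimal planar sketch, assuming the SLAM stack provides a base pose (x, y, theta); the full system would do this in SE(3) with the arm's inverse kinematics downstream.

```python
import math

def world_to_base(ee_world, base_pose):
    """Express a world-frame end-effector point (x, y) in the base frame,
    given the base pose (x, y, theta) estimated by SLAM. Recomputing
    this each tick and commanding the arm to the result keeps the
    end-effector fixed in world coordinates even as the base translates
    and rotates underneath it."""
    bx, by, th = base_pose
    dx, dy = ee_world[0] - bx, ee_world[1] - by
    # Rotate the world-frame offset by -theta into the base frame
    return (math.cos(th) * dx + math.sin(th) * dy,
            -math.sin(th) * dx + math.cos(th) * dy)
```

For example, a target one metre ahead of the robot ends up directly to its right after the base yaws 90 degrees left, and the arm command changes accordingly while the world-frame target never moves.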
Watch how quickly the system responds to unexpected obstacles. The top row shows the internal voxel map where red indicates the detected human and green traces the replanned path. Within one second of detection, YOR has computed and begun executing an alternative route that maintains clearance while still reaching the goal. This responsiveness is critical for deployment in dynamic human environments where static maps quickly become obsolete.
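The detect-then-replan loop in this demonstration can be sketched with two small pieces: a check of the current path against newly occupied cells, and a fresh search when the check fails. The breadth-first search on a toy 2D grid below is a stand-in assumption for YOR's actual planner, which is not named in the talk, and the voxel map is reduced to a set of blocked cells.

```python
from collections import deque

def path_blocked(path, occupied):
    """True if any waypoint on the current path hits a newly occupied cell
    (e.g. one marked red in the voxel map when a human is detected)."""
    return any(cell in occupied for cell in path)

def replan(start, goal, occupied, size=10):
    """Breadth-first search over a size x size grid, returning a list of
    cells from start to goal that avoids occupied cells, or None."""
    frontier, parent = deque([start]), {start: None}
    while frontier:
        cur = frontier.popleft()
        if cur == goal:
            path = []
            while cur is not None:
                path.append(cur)
                cur = parent[cur]
            return path[::-1]  # reverse: start -> goal
        x, y = cur
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if (0 <= nxt[0] < size and 0 <= nxt[1] < size
                    and nxt not in occupied and nxt not in parent):
                parent[nxt] = cur
                frontier.append(nxt)
    return None
```

Running this check on every map update is what bounds the response time: as soon as the human's cells invalidate the old path, the green replanned route replaces it.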
YOR proves that advanced mobile manipulation no longer requires choosing between capability and accessibility. By open-sourcing a platform that delivers bimanual dexterity, omnidirectional mobility, and policy learning performance at one-tenth the cost of commercial alternatives, the authors have removed a fundamental barrier to embodied AI research. To explore the full hardware specifications, software stack, and experimental results, visit EmergentMind.com.