
Habitat-PyRobot Bridge (HaPy)

Updated 8 February 2026
  • Habitat-PyRobot Bridge (HaPy) is a minimal, configurable Python library that unites simulation and real-world robot control via a unified API.
  • It abstracts disparate sensor pipelines, action semantics, and environment resets, thereby simplifying reproducible embodied AI research.
  • HaPy enables rigorous sim-to-real benchmarking: tuning simulator parameters yields quantifiable improvements in the Sim-vs-Real Correlation Coefficient (SRCC).

The Habitat-PyRobot Bridge (HaPy) is a minimal, highly configurable Python library that enables seamless execution of embodied AI agents in both simulation (Habitat-Sim) and on physical robots (notably LoCoBot) via a unified API. HaPy’s design philosophy is to treat “reality” as just another backend, so researchers can deploy algorithms trained and evaluated in simulation directly onto robotic hardware with only a one-line configuration change—eliminating the engineering burden of maintaining parallel “sim” and “robot” codebases with divergent interfaces, action spaces, and sensor pipelines. By unifying task logic, metrics, sensor preprocessing, resets, and other environment details, HaPy facilitates rigorous sim-to-real research and quantitative predictivity analysis, including via the Sim-vs-Real Correlation Coefficient (SRCC) metric (Kadian et al., 2019).

1. Motivation and Rationale

Conventional embodied AI workflows require separate stacks for simulation (e.g., Habitat-Sim in Python/C++) and robotics (e.g., ROS plus PyRobot or custom drivers). This duplication leads to divergent observation processing, inconsistent action semantics (“forward 0.25 m” in sim vs. velocity commands in ROS), and non-uniform definitions of tasks, resets, or episodic metrics. Such fragmentation introduces barriers to reproducibility, complicates cross-platform benchmarking, and impairs the empirical validity of sim vs. real-world generalization studies.

HaPy’s objective is to provide a single Env interface against which agents are written and evaluated, regardless of underlying backend. Agents written for sim can run on real robots—such as the LoCoBot platform (mobile base plus Intel D435 depth/RGB sensor)—through a single configuration parameter (SIMULATOR.TYPE), with all sensor, actuator, and metric semantics remaining strictly identical.

2. System Architecture and Components

HaPy sits between policy code (agent) and the hardware/simulator, abstracting hardware/backend differences with a unified “HabitatEnv” interface. The core architecture is organized as follows:

  • Agent code: policy logic and the training loop.
  • Unified Env interface (HabitatEnv): wraps the backend-specific simulators and exposes a gym-style reset/step API.
  • Simulation backend (HabitatSimulator): calls habitat-sim to render RGB and depth, simulate agent kinematics, and inject actuation noise.
  • Real backend (PyRobotSimulator): interfaces with ROS, manages the camera, LIDAR, and SLAM, and actuates the real robot via the PyRobot API.

Major Components

  • HabitatEnv: Implements a gym-like interface and dispatches to HabitatSimulator or PyRobotSimulator based on cfg.SIMULATOR.TYPE.
  • HabitatSimulator: Instantiates habitat-sim with SceneID, SensorSpec (modifiable FOV, resolution, cropping), AgentSpec (radius, height, collision/sliding), and an ActuationNoiseModel (2D Gaussian for forward/turn).
  • PyRobotSimulator: Launches or connects to ROS nodes (camera driver, LIDAR/HectorSLAM); subscribes to image/pose topics; publishes velocity commands; provides sensor observations identical in shape/dtype to simulator.
  • Middleware (habitat_pyrobot_bridge): Registers the “HabitatSim-v0” and “PyRobot-Locobot-v0” gym environments; performs all data-format conversion (e.g., ROS Image messages to NumPy arrays, habitat-sim GPU buffers to NumPy arrays).

All differences—sensor field of view, normalization, cropping, action mapping, agent kinematics and collision semantics—are specified in a single config file, ensuring identical perception and actuation semantics in both sim and real scenarios.
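
The config-driven dispatch described above can be sketched as follows. The class names HabitatEnv, HabitatSimulator, and PyRobotSimulator come from the text, but the registry, the factory function, and the stub bodies are illustrative assumptions rather than HaPy's actual implementation:

```python
# Minimal sketch of backend dispatch keyed on SIMULATOR.TYPE.
# Both backends expose the same reset/step interface, so agent
# code never needs to know which one it received.

class HabitatSimulator:
    """Simulation backend: would wrap habitat-sim rendering and kinematics."""
    def reset(self):
        return {"rgb": None, "depth": None}
    def step(self, action):
        return {"rgb": None, "depth": None}

class PyRobotSimulator:
    """Real-robot backend: would wrap ROS topics and the PyRobot API."""
    def reset(self):
        return {"rgb": None, "depth": None}
    def step(self, action):
        return {"rgb": None, "depth": None}

_BACKENDS = {
    "HabitatSim-v0": HabitatSimulator,
    "PyRobot-Locobot-v0": PyRobotSimulator,
}

def make_backend(simulator_type):
    """Instantiate the backend named by the config key."""
    try:
        return _BACKENDS[simulator_type]()
    except KeyError:
        raise ValueError(f"Unknown SIMULATOR.TYPE: {simulator_type}")
```

Because both backends return observations of identical shape and dtype, a policy written against one runs unmodified against the other.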

3. API, Configuration, and Usage

HaPy provides a minimal Python API. Switching from simulation to real deployment is achieved by modifying only SIMULATOR.TYPE in the configuration (YAML or code). All task logic runs unchanged. For example:

from habitat.config import Config
from habitat.core.env import Env

# Load the shared task configuration; only SIMULATOR.TYPE differs
# between simulated and real deployments.
cfg = Config.from_yaml("configs/pointnav.yaml")
cfg.SIMULATOR.TYPE = "PyRobot-Locobot-v0"  # or "HabitatSim-v0"

env = Env(config=cfg)
agent = MyPolicyNetwork(cfg)  # any policy implementing act(obs)

for episode in range(cfg.EVAL.EPISODES):
    obs = env.reset()
    done = False
    while not done:
        action = agent.act(obs)
        obs, reward, done, info = env.step(action)
    print("Success:", info["success"], "SPL:", info["spl"])

env.close()

Key Classes and Configuration Parameters

  • HabitatEnv: Dispatches according to SIMULATOR.TYPE.
  • PyRobotSimulator: Handles step/reset using ROS and PyRobot.
  • SensorSpec: Controls FOV, resolution, modality (RGB, depth).
  • ActuationNoiseModel: Specifies Gaussian actuation noise (σ_x, σ_y).
  • YAML Configuration: Unified. Key elements:
    • SIMULATOR.TYPE: “HabitatSim-v0” or “PyRobot-Locobot-v0”
    • SENSORS: RGB, depth (resolution, FOV, clipping, normalization)
    • COLLISION.SLIDE_ON_COLLISION: true/false
    • ACTUATION_NOISE.MULTIPLIER: float
    • ENVIRONMENT.SCENE: mesh file location
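
As a concrete illustration of the ActuationNoiseModel and ACTUATION_NOISE.MULTIPLIER entries above, a per-step 2D Gaussian noise model might look like the following sketch; the sampling details and parameter semantics are assumptions, not HaPy's actual implementation:

```python
import numpy as np

class ActuationNoiseModel:
    """Sketch of per-step 2D Gaussian actuation noise: the commanded
    planar displacement (dx, dy) is perturbed by zero-mean Gaussian
    noise with standard deviations (sigma_x, sigma_y), scaled by a
    global multiplier (cf. ACTUATION_NOISE.MULTIPLIER)."""

    def __init__(self, sigma_x, sigma_y, multiplier=1.0, seed=None):
        self.sigma = np.array([sigma_x, sigma_y]) * multiplier
        self.rng = np.random.default_rng(seed)

    def apply(self, dx, dy):
        # Sample one noise value per axis and add it to the command.
        noise = self.rng.normal(0.0, self.sigma)
        return dx + noise[0], dy + noise[1]
```

Setting the multiplier to 0.0 recovers noiseless actuation, which corresponds to the default configuration discussed in Section 7.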

4. Supported Platforms, Sensors, and Actuators

HaPy supports the following hardware and sensor/actuator modalities:

  • Platforms: LoCoBot (default); extensible to any PyRobot-supported robot (e.g., Sawyer arm + mobile base).
  • Sensors:
    • RGB camera (FOV, aspect ratio, resolution matched via SensorSpec)
    • Depth camera (clipped at 10 m to mimic Intel D435)
    • Simulated GPS+Compass (in sim), LIDAR+HectorSLAM (real, ~7 cm pose error)
    • Optional: Use of FastDepth as an in-simulation depth predictor
  • Actuators:
    • Discrete: FORWARD 0.25 m, TURN_LEFT 30°, TURN_RIGHT 30°, STOP
    • Simulator: Applies actuation noise (2D Gaussian per step)
    • Real robot: Velocity commands issued via PyRobot API
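
On the real robot, the discrete actions above must be realized as timed velocity commands. A hypothetical mapping might look like the following sketch; the nominal speeds and the helper function are illustrative assumptions, not HaPy's actual controller:

```python
import math

# Hypothetical nominal speeds for the LoCoBot base (assumed values).
LIN_SPEED = 0.25              # m/s
ANG_SPEED = math.radians(30)  # rad/s

def action_to_velocity_command(action):
    """Map a discrete action to (linear m/s, angular rad/s, duration s).
    FORWARD covers 0.25 m; TURN_LEFT/TURN_RIGHT cover 30 degrees."""
    if action == "FORWARD":
        return (LIN_SPEED, 0.0, 0.25 / LIN_SPEED)
    if action == "TURN_LEFT":
        return (0.0, ANG_SPEED, math.radians(30) / ANG_SPEED)
    if action == "TURN_RIGHT":
        return (0.0, -ANG_SPEED, math.radians(30) / ANG_SPEED)
    if action == "STOP":
        return (0.0, 0.0, 0.0)
    raise ValueError(f"Unknown action: {action}")
```

The returned tuple would then be issued to the base through the PyRobot API for the computed duration.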

5. Installation, Dependencies, and Setup

HaPy is designed for streamlined installation and configuration. Essential dependencies include Python 3.6+, habitat-sim, habitat-api, pyrobot, ROS Kinetic or Melodic, and auxiliary packages (OpenCV, PyTorch, NumPy, PyYAML).

Setup Procedure (Linux/Ubuntu 18.04)

  1. Install ROS, create a catkin workspace.
  2. Install required ROS packages:
    sudo apt install ros-melodic-hector-slam ros-melodic-cv-bridge
  3. Install Python dependencies:
    pip install habitat-sim habitat-api pyrobot numpy opencv-python pyyaml
  4. Clone and install HaPy:
    git clone https://github.com/facebookresearch/habitat-pyrobot-bridge.git
    cd habitat-pyrobot-bridge
    pip install -e .
  5. Prepare config file; set SIMULATOR.TYPE as required.
  6. Launch ROS nodes for LoCoBot and HectorSLAM as needed.

Sample launch scripts—run_sim.sh (simulator) and run_real.sh (real robot)—allow experiment automation with the same Python codebase.

6. Evaluation Methodology and Predictivity Analysis

HaPy facilitates systematic sim-to-real benchmarking by running identical agent code, environment configurations, and metric logging in both simulation and reality. In the “Sim2Real Predictivity” study, nine deep reinforcement learning (DRL) agents were trained in Habitat on the Gibson environment, then evaluated on both HabitatSim-v0 (sim, 810 episodes) and PyRobot-Locobot-v0 (real robot, 810 episodes, ~40 hours of robotic operation). The only change required for deployment was the SIMULATOR.TYPE configuration key.

HaPy automates real-world resets using LIDAR+SLAM for pose relocalization and supports fully scripted data collection across both platforms, enabling geographically distributed research teams to perform reproducible sim2real experiments.

7. Metrics, Results, and Configurable Predictivity

A central contribution enabled by HaPy is the use of the Sim-vs-Real Correlation Coefficient (SRCC) to quantify the predictive validity of simulation with respect to real-robot outcomes. Let $(S_i, R_i)$ be the performance of agent $i$ in simulation and reality, for $i = 1, \dots, n$:

$$\overline{S} = \frac{1}{n}\sum_{i=1}^{n} S_i, \qquad \overline{R} = \frac{1}{n}\sum_{i=1}^{n} R_i$$

$$\mathrm{SRCC} = \frac{\sum_{i=1}^{n} (S_i - \overline{S})(R_i - \overline{R})}{\sqrt{\sum_{i=1}^{n} (S_i - \overline{S})^2}\,\sqrt{\sum_{i=1}^{n} (R_i - \overline{R})^2}}$$

SRCC values range from +1 (perfect sim-to-real linear predictivity) to 0 (no predictive power) to –1 (inverse relationship).
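
The SRCC defined above is simply the Pearson correlation between paired sim and real scores, and can be computed directly:

```python
import math

def srcc(sim_scores, real_scores):
    """Pearson correlation between per-agent simulation and
    real-world performance (the SRCC defined above)."""
    n = len(sim_scores)
    assert n == len(real_scores) and n > 1
    s_bar = sum(sim_scores) / n
    r_bar = sum(real_scores) / n
    cov = sum((s - s_bar) * (r - r_bar)
              for s, r in zip(sim_scores, real_scores))
    var_s = sum((s - s_bar) ** 2 for s in sim_scores)
    var_r = sum((r - r_bar) ** 2 for r in real_scores)
    return cov / math.sqrt(var_s * var_r)
```

Perfectly linearly related scores give SRCC = 1, inversely related scores give SRCC = -1, and uncorrelated scores give values near 0.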

In the default simulator configuration (collision sliding enabled, no actuation noise), SRCC was found to be low: $\mathrm{SRCC}_{\mathrm{success}} = 0.18$ and $\mathrm{SRCC}_{\mathrm{SPL}} = 0.603$, indicating poor sim-to-real correspondence. Tuning simulator parameters (disabling collision sliding and introducing actuation noise) substantially improved correlation: $\mathrm{SRCC}_{\mathrm{success}} = 0.844$ and $\mathrm{SRCC}_{\mathrm{SPL}} = 0.875$.

HaPy’s unified metric tracking and identical episode definitions enable robust, reproducible computation of SRCC. Treating SRCC as an “objective” for simulator parameter tuning allows optimizing simulation realism without repeated real-robot trials.
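
The tuning loop described above can be sketched as a grid search: real-robot scores are collected once, then simulator parameters are varied and the configuration maximizing SRCC is kept. The parameter grid and the evaluation callback below are illustrative assumptions, not the study's actual search procedure:

```python
import itertools
import math

def _pearson(xs, ys):
    """Pearson correlation (the SRCC) between two score lists."""
    n = len(xs)
    xb, yb = sum(xs) / n, sum(ys) / n
    cov = sum((x - xb) * (y - yb) for x, y in zip(xs, ys))
    vx = sum((x - xb) ** 2 for x in xs)
    vy = sum((y - yb) ** 2 for y in ys)
    return cov / math.sqrt(vx * vy)

def tune_simulator(real_scores, evaluate_in_sim):
    """Grid-search over (collision sliding, actuation-noise multiplier),
    keeping the configuration whose simulated scores correlate best with
    the fixed real-robot scores. evaluate_in_sim(sliding, noise) must
    return per-agent sim scores under those parameters."""
    best_params, best_srcc = None, -2.0  # SRCC is always >= -1
    for sliding, noise in itertools.product([True, False], [0.0, 0.5, 1.0]):
        sim_scores = evaluate_in_sim(sliding, noise)
        s = _pearson(sim_scores, real_scores)
        if s > best_srcc:
            best_params, best_srcc = (sliding, noise), s
    return best_params, best_srcc
```

Since only the simulated evaluations are repeated, the expensive real-robot trials are amortized across the whole search.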

8. Summary and Significance

Habitat-PyRobot Bridge establishes a minimal, config-driven approach to embodied agent research in which a physical robot is treated as “just another backend simulator.” Its unified environment interface guarantees that observations, actions, and evaluation metrics are consistent across platforms, dramatically lowering the barrier to reproducible sim2real experiments. In the referenced study, HaPy enabled the collection of 1,620 runs (simulation and reality) in a fully automated, end-to-end pipeline, underpinning rigorous research into sim-to-real transfer and predictive evaluation methodologies (Kadian et al., 2019).
