Astribot Suite: Advanced Robotics System

Updated 27 July 2025
  • Astribot Suite is a comprehensive, platform-agnostic robotics system designed for advanced autonomous operations across planetary and terrestrial environments.
  • Its AI perception module combines deep-learning object detection, Kalman-filter tracking, and semantic segmentation to support real-time situational awareness and navigation safety.
  • Its multi-agent planning and human–robot interaction protocols enable seamless coordination between rovers and astronauts in dynamic, hazardous settings.

The Astribot Suite is a comprehensive, platform-agnostic robotics software system designed for advanced planetary and terrestrial robotics operations where complex rover–rover and astronaut–rover interactions are required. Developed within the CISRU project framework, the suite emphasizes high autonomy, enabled by state-of-the-art AI perception modules, multi-agent autonomous planning systems, and robust protocols for human–robot interaction—all abstracted from specific robotic platforms to facilitate deployment across diverse scenarios and hardware (Romero-Azpitarte et al., 2023).

1. AI-Based Perception

A central innovation within the Astribot Suite is its AI-powered perception module, which integrates deep learning and probabilistic tracking to endow robotic agents with rich situational awareness and competence in human–robot interaction tasks. The module comprises multiple neural network architectures:

  • Object Detection: Employs a lightweight MobileNet-SSD (Single Shot Detector) architecture for real-time inference on continuous video streams. The network is trained explicitly to identify astronauts, rovers, rocks, and solar panels.
  • Tracking: Integrates a Kalman filter–based algorithm atop detection outputs to persistently track object states and identities over sequential frames. Tracking involves standard prediction and update equations:

Prediction: $\hat{x}_{k|k-1} = A\,\hat{x}_{k-1} + B\,u_{k-1}$

Update: $\hat{x}_k = \hat{x}_{k|k-1} + K_k\,(z_k - H\,\hat{x}_{k|k-1})$

where $A$, $B$, and $H$ are the state transition, control input, and measurement matrices, $u_{k-1}$ is the control input, $z_k$ is the current measurement, and $K_k$ is the Kalman gain; the underlying process and measurement models are assumed to be perturbed by zero-mean noise terms $w$ and $v$, respectively.
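
The following is a minimal sketch of this predict/update cycle for a constant-velocity track state (position and velocity per detected object). The state layout, noise covariances, and matrices below are illustrative assumptions, not the suite's actual tracker configuration.

```python
import numpy as np

def kalman_predict(x, P, A, B, u, Q):
    """Propagate the state estimate and covariance through the process model."""
    x_pred = A @ x + B @ u          # x_{k|k-1} = A x_{k-1} + B u_{k-1}
    P_pred = A @ P @ A.T + Q        # covariance grows by the process noise Q
    return x_pred, P_pred

def kalman_update(x_pred, P_pred, z, H, R):
    """Correct the prediction with the measurement z (e.g., a detection centroid)."""
    S = H @ P_pred @ H.T + R                    # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)         # Kalman gain K_k
    x = x_pred + K @ (z - H @ x_pred)           # x_k = x_{k|k-1} + K_k (z_k - H x_{k|k-1})
    P = (np.eye(P_pred.shape[0]) - K @ H) @ P_pred
    return x, P

# Illustrative constant-velocity model: state = [px, py, vx, vy], measurement = [px, py].
dt = 0.1
A = np.array([[1, 0, dt, 0], [0, 1, 0, dt], [0, 0, 1, 0], [0, 0, 0, 1]], dtype=float)
B = np.zeros((4, 1))                            # no control input for a passively tracked object
H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]], dtype=float)
Q, R = np.eye(4) * 1e-2, np.eye(2) * 1e-1       # assumed process/measurement noise covariances

x, P = np.zeros(4), np.eye(4)
x_pred, P_pred = kalman_predict(x, P, A, B, np.zeros(1), Q)
x, P = kalman_update(x_pred, P_pred, z=np.array([0.5, 0.2]), H=H, R=R)
```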

  • Semantic Segmentation: Deploys DeepLabV3+ for pixelwise classification of RGB images from stereo cameras. Segmentation assigns each pixel to classes such as soil, multiple rock types, and unknown objects. Outputs are presented as "palettised" (palette-indexed) images, overlaying class labels onto depth data for navigation safety.
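
A minimal sketch of producing such a palette-indexed output is shown below. It uses torchvision's DeepLabV3 head as a stand-in for the suite's DeepLabV3+ model; the class list, palette colours, and weights are placeholders, not the suite's trained network.

```python
import numpy as np
import torch
from torchvision.models.segmentation import deeplabv3_resnet50
from PIL import Image

# Stand-in model: the suite uses DeepLabV3+ trained on terrain classes; weights here are placeholders.
NUM_CLASSES = 4                      # assumed: soil, small rock, large rock, unknown object
model = deeplabv3_resnet50(weights=None, num_classes=NUM_CLASSES).eval()

# Illustrative palette: one RGB colour per terrain class.
PALETTE = [0, 0, 0,  139, 69, 19,  128, 128, 128,  255, 0, 0]

def palettised_segmentation(rgb: np.ndarray) -> Image.Image:
    """Run pixelwise classification and return a palette-indexed image for overlay on depth data."""
    x = torch.from_numpy(rgb).permute(2, 0, 1).float().unsqueeze(0) / 255.0
    with torch.no_grad():
        logits = model(x)["out"]                 # shape: (1, NUM_CLASSES, H, W)
    labels = logits.argmax(dim=1)[0].byte().cpu().numpy()
    out = Image.fromarray(labels, mode="P")      # "palettised" (palette-indexed) image
    out.putpalette(PALETTE)
    return out

seg = palettised_segmentation(np.zeros((480, 640, 3), dtype=np.uint8))
seg.save("segmentation_overlay.png")
```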

These perception capabilities directly underpin real-time interaction protocols; for example, the system can detect proximity between a rover and astronaut, recognize falls or hazardous situations, and trigger rescue or avoidance maneuvers as required for operational safety and mission continuity.
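
As an illustration of how such a protocol could be layered on tracker output, the sketch below flags a hazard when a tracked astronaut comes within an assumed safety radius of the rover, or when the astronaut's bounding-box aspect ratio collapses (a crude fall heuristic). The thresholds, field names, and alert strings are hypothetical.

```python
from dataclasses import dataclass

SAFETY_RADIUS_M = 2.0        # assumed minimum astronaut-rover separation
FALL_ASPECT_RATIO = 0.6      # assumed height/width ratio below which a fall is suspected

@dataclass
class Track:
    label: str               # e.g. "astronaut", "rover", "rock", "solar_panel"
    position: tuple          # (x, y) in metres, from tracking plus stereo depth
    bbox_h: float            # bounding-box height in pixels
    bbox_w: float            # bounding-box width in pixels

def assess_hazards(tracks, rover_position):
    """Return a list of alert strings derived from tracked perception outputs."""
    alerts = []
    for t in tracks:
        if t.label != "astronaut":
            continue
        dx = t.position[0] - rover_position[0]
        dy = t.position[1] - rover_position[1]
        if (dx * dx + dy * dy) ** 0.5 < SAFETY_RADIUS_M:
            alerts.append("PROXIMITY: astronaut inside safety radius, trigger avoidance manoeuvre")
        if t.bbox_h / max(t.bbox_w, 1e-6) < FALL_ASPECT_RATIO:
            alerts.append("FALL_SUSPECTED: notify mission control and nearby rovers")
    return alerts
```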

2. Multi-Agent Autonomous Planning Systems

Inherited from ERGO/ADE and the PERASPERA program, the Astribot Suite features advanced planning mechanisms for distributed autonomy:

  • Multi-Agent Synchronization (MAS): A MAS reactor enables agents—including multiple rovers and astronaut-facing interfaces—to synchronize and exchange high-level mission goals and world model observations. This fosters fault tolerance and adaptability across semi-structured or unstructured environments.
  • E4-Level Autonomy: The system operates at the E4 autonomy level, per ECSS standards, allowing high-level goals to be submitted directly to the agents; onboard planners decompose these goals into medium-level commands addressing locomotion, manipulation, and other functional primitives.
  • Onboard Replanning: Autonomous planners can adjust execution plans in response to deviations (e.g., unexpected obstacles, agent failures) without requiring human intervention, supporting resilience where teleoperation is suboptimal or unavailable.
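
A highly simplified sketch of this decompose-execute-replan pattern is shown below. The goal names, command vocabulary, and failure handling are illustrative assumptions and do not reflect the actual ERGO/ADE planner interface.

```python
# Illustrative E4-style goal decomposition and onboard replanning loop.

def decompose(goal: str) -> list[str]:
    """Map a high-level goal to assumed medium-level functional primitives."""
    plans = {
        "sample_regolith": ["navigate_to(site)", "deploy_arm", "scoop_sample", "stow_sample"],
        "inspect_solar_panel": ["navigate_to(panel)", "capture_images", "report_status"],
    }
    return plans.get(goal, [])

def execute(command: str, world_model: dict) -> bool:
    """Pretend to execute a command; fail if perception has flagged it as blocked."""
    return command not in world_model.get("blocked_commands", set())

def run_goal(goal: str, world_model: dict, max_replans: int = 3) -> bool:
    plan = decompose(goal)
    for _ in range(max_replans):
        if all(execute(cmd, world_model) for cmd in plan):
            return True                          # goal achieved
        # Onboard replanning: e.g. insert an avoidance manoeuvre before retrying the plan.
        plan = ["replan_avoid(obstacle)"] + plan
        world_model.setdefault("blocked_commands", set()).clear()
    return False                                 # escalate to ground control or the astronaut

success = run_goal("sample_regolith", {"blocked_commands": {"navigate_to(site)"}})
```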

3. Interaction Protocols and Human–Robot Interface

Mechanisms for robust astronaut–robot interaction are a cornerstone of the Astribot Suite, incorporating:

  • Shared Agency Ontology: Both human and robotic agents utilize an abstract command interface based on an integrated Human–Robot Interaction (HRI) ontology, ensuring mutual interpretability of commands and situational context.
  • Wearable and Mixed Reality Interfaces: Astronauts are equipped with wearable consoles integrated into spacesuit forearms and Microsoft HoloLens 2–based Mixed Reality (MR) devices. These allow multimodal input (voice, gaze, gesture) for direct command manipulation and mission data overlay.
  • Inter-Agent Communication: ROS2 interface messages and elements from the ERGO architecture are used for reliable, low-latency inter-agent communication. This supports both telecommand initiation from ground control and fully autonomous notification flows (e.g., alerting mission control and nearby rovers to emergencies detected by AI perception).
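
A minimal rclpy sketch of such an autonomous notification flow is shown below. The topic name, message type, and alert payload are assumptions for illustration rather than the suite's actual ROS2 interface definitions.

```python
import rclpy
from rclpy.node import Node
from std_msgs.msg import String   # placeholder; the suite defines its own ROS2 interface messages

class EmergencyAlertPublisher(Node):
    """Publishes an alert when the perception module reports a hazardous situation."""

    def __init__(self):
        super().__init__("emergency_alert_publisher")
        # The topic name '/astribot/emergency' is an assumption for illustration.
        self.publisher = self.create_publisher(String, "/astribot/emergency", 10)

    def notify(self, event: str, agent_id: str):
        msg = String()
        msg.data = f"{agent_id}:{event}"       # e.g. "rover_1:ASTRONAUT_FALL_DETECTED"
        self.publisher.publish(msg)
        self.get_logger().warn(f"Emergency broadcast: {msg.data}")

def main():
    rclpy.init()
    node = EmergencyAlertPublisher()
    node.notify("ASTRONAUT_FALL_DETECTED", "rover_1")
    node.destroy_node()
    rclpy.shutdown()

if __name__ == "__main__":
    main()
```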

4. Modular Architecture and Task Management

The architecture of the Astribot Suite is modular, with five principal components operating concurrently to manage the full cycle of planetary or hazardous-terrestrial tasks:

  • Multi-agent autonomy
  • AI-based perception
  • Guidance, navigation, and control (GNC)
  • Manipulation
  • Cooperative behavior

This modularity enables dynamic plan adaptation; for example, detection of an unforeseen hazard by the perception module initiates immediate replanning by the multi-agent system. Integrated inference allows for task switching and parallel operations such as cooperative sample retrieval, tool exchange, or multi-rover exploration, all with minimal human oversight.
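
One way to picture this coordination is a lightweight event bus between the modules: in the sketch below, a perception-detected hazard preempts the current task and triggers replanning. The module names, topics, and payloads are entirely hypothetical.

```python
from collections import defaultdict

class EventBus:
    """Minimal publish/subscribe hub standing in for the suite's inter-module messaging."""
    def __init__(self):
        self._handlers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._handlers[topic].append(handler)

    def publish(self, topic, payload):
        for handler in self._handlers[topic]:
            handler(payload)

bus = EventBus()

# Hypothetical module reactions to a hazard flagged by the perception module.
bus.subscribe("hazard_detected", lambda p: print(f"[autonomy]     replanning around {p['object']}"))
bus.subscribe("hazard_detected", lambda p: print(f"[gnc]          pausing trajectory near {p['object']}"))
bus.subscribe("plan_updated",    lambda p: print(f"[manipulation] resuming task '{p['task']}'"))

bus.publish("hazard_detected", {"object": "unexpected rock"})
bus.publish("plan_updated", {"task": "cooperative sample retrieval"})
```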

5. Technical Framework and Algorithms

The technical underpinnings of the Astribot Suite are grounded in contemporary robotics middleware, perception, and control techniques:

| Aspect | Technology/Specification | Role in Suite |
| --- | --- | --- |
| Middleware | ROS2 | Communication, module integration |
| Navigation/Planning | NAV2, Fast Marching Square (NAV2 plugin) | Path planning, trajectory control |
| Perception | MobileNet-SSD, DeepLabV3+, Kalman filter | Detection, segmentation, tracking |
| Navigation sensors | Stereo cameras, IMUs, wheel odometry | Visual SLAM, environment reconstruction |
| Semantic output | "Palettised" segmentation images | Navigation safety, hazard identification |

Visual SLAM is achieved by fusing stereo camera streams with inertial and odometric data, filtered through AI-based perception to generate robust point clouds. The Fast Marching Square algorithm is implemented within NAV2 as a plugin to support efficient navigation in unknown or evolving terrains.
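
The sketch below illustrates the Fast Marching Square idea on a toy occupancy grid using the third-party scikit-fmm package: a first fast-marching pass from obstacles builds a saturated velocity map, and a second pass from the goal yields an arrival-time field whose descent gives a smooth, obstacle-aware path. This is an illustrative reimplementation under those assumptions, not the suite's NAV2 plugin.

```python
import numpy as np
import skfmm   # scikit-fmm: fast marching method solver

def fast_marching_square(occupancy: np.ndarray, goal: tuple, max_dist: float = 10.0):
    """Compute an arrival-time field for Fast Marching Square on a binary occupancy grid."""
    # Pass 1: distance from obstacles -> saturated velocity map in [0, 1].
    phi_obs = np.where(occupancy > 0, -1.0, 1.0)      # zero contour at obstacle boundaries
    dist = skfmm.distance(phi_obs)
    velocity = np.clip(dist / max_dist, 0.0, 1.0)
    velocity[occupancy > 0] = 1e-6                    # near-zero speed inside obstacles

    # Pass 2: travel time from the goal through the velocity map.
    phi_goal = np.ones_like(occupancy, dtype=float)
    phi_goal[goal] = -1.0                             # zero contour around the goal cell
    return skfmm.travel_time(phi_goal, velocity)

def descend(arrival_time: np.ndarray, start: tuple, max_steps: int = 500):
    """Greedy descent over the arrival-time field yields a path toward the goal."""
    path, cell = [start], start
    for _ in range(max_steps):
        r, c = cell
        neighbours = [(r + dr, c + dc) for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                      if 0 <= r + dr < arrival_time.shape[0] and 0 <= c + dc < arrival_time.shape[1]]
        nxt = min(neighbours, key=lambda n: arrival_time[n])
        if nxt == cell:          # reached the field minimum (the goal region)
            break
        path.append(nxt)
        cell = nxt
    return path

grid = np.zeros((50, 50)); grid[20:30, 10:40] = 1     # toy map with one rectangular obstacle
T = fast_marching_square(grid, goal=(45, 45))
path = descend(T, start=(2, 2))
```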

6. Use Cases in Planetary and Terrestrial Contexts

The suite addresses a spectrum of use cases, with particular emphasis on the following:

  • Planetary Exploration: Teams of autonomous rovers (e.g., LAMARR as reconnaissance, MAE for tool delivery and sampling) collaborate with astronauts for tasks such as regolith sampling, geological mapping, or infrastructure inspection (e.g., solar panel monitoring and maintenance). Emergency scenarios (astronaut fall, proximity to danger) engage integrated rescue and alert protocols.
  • Terrestrial and Industrial Applications: Analogous multi-agent autonomy, perception, and human–robot interaction capabilities are targeted at hazardous terrestrial applications—such as nuclear facility inspection, mining, or refinery plant navigation—where remote and autonomous operations reduce human risk and operational cost.

7. Significance and Broader Implications

By uniting neural perception, high-level planning autonomy, and rich interaction protocols within a hardware-agnostic and extensively modular software framework, the Astribot Suite represents a convergence of developments in planetary robotics, autonomous systems, and intelligent HRI architectures. This integration enables operational flexibility, resilience to unstructured environments, and rapid adaptation to mission demands, with demonstrated applicability both to future planetary exploration missions and to demanding terrestrial domains (Romero-Azpitarte et al., 2023).
