Smart Guiding Glasses for Visually Impaired People in Indoor Environment (1709.09359v1)

Published 27 Sep 2017 in cs.HC

Abstract: To overcome the travelling difficulty for the visually impaired group, this paper presents a novel ETA (Electronic Travel Aids)-smart guiding device in the shape of a pair of eyeglasses for giving these people guidance efficiently and safely. Different from existing works, a novel multi sensor fusion based obstacle avoiding algorithm is proposed, which utilizes both the depth sensor and ultrasonic sensor to solve the problems of detecting small obstacles, and transparent obstacles, e.g. the French door. For totally blind people, three kinds of auditory cues were developed to inform the direction where they can go ahead. Whereas for weak sighted people, visual enhancement which leverages the AR (Augment Reality) technique and integrates the traversable direction is adopted. The prototype consisting of a pair of display glasses and several low cost sensors is developed, and its efficiency and accuracy were tested by a number of users. The experimental results show that the smart guiding glasses can effectively improve the user's travelling experience in complicated indoor environment. Thus it serves as a consumer device for helping the visually impaired people to travel safely.

Citations (167)

Summary

  • The paper introduces smart guiding glasses using multi-sensor fusion (depth and ultrasonic) to improve indoor navigation for visually impaired people.
  • Key results show the system accurately detects small and transparent obstacles, outperforming single-sensor methods, with real-time processing (approx. 30ms per frame).
  • The developed prototype demonstrates a practical, low-cost aid for visually impaired individuals, with potential for future enhancements in varied conditions and outdoor use.

Smart Guiding Glasses for Visually Impaired People in Indoor Environment

The paper introduces an Electronic Travel Aid (ETA) in the form of smart guiding glasses, designed to help visually impaired individuals navigate complex indoor environments. The device relies on a multi-sensor fusion obstacle-avoidance algorithm that combines depth and ultrasonic sensors to overcome the limitations of existing navigation aids, particularly in detecting small and transparent objects such as glass doors. This combination makes the device useful to both totally blind and weak-sighted users, who receive directional guidance through auditory and visual cues, respectively.

The smart glasses employ a depth camera coupled with an ultrasonic rangefinder to acquire environmental information, which is processed via an embedded CPU. The integration of Augmented Reality (AR) techniques for visual enhancement allows weak-sighted users to perceive surroundings and determine traversable paths effectively. The robustness and applicability of the system were tested through a prototype equipped with low-cost sensors, demonstrating improvements in user navigation experiences within complex indoor environments.
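The complementary roles of the two sensors can be illustrated with a minimal sketch. This is not the paper's implementation; the function name, the valid depth range, and the fusion rule are assumptions chosen to show why an ultrasonic rangefinder catches transparent obstacles that an IR-based depth camera misses.

```python
# Illustrative sketch (not the paper's algorithm): fuse a depth-camera
# reading with an ultrasonic range so transparent obstacles, e.g. a glass
# door that returns no valid depth, still trigger avoidance.

def fuse_obstacle_distance(depth_m, ultrasonic_m,
                           depth_valid_range=(0.3, 4.0)):
    """Return the fused distance (metres) to the nearest obstacle, or None.

    depth_m      -- nearest depth-camera reading, or None if no valid return
    ultrasonic_m -- ultrasonic rangefinder reading, or None if out of range
    """
    d_ok = (depth_m is not None
            and depth_valid_range[0] <= depth_m <= depth_valid_range[1])
    u_ok = ultrasonic_m is not None

    if d_ok and u_ok:
        # Both sensors see something: trust the closer reading, so a pane
        # of glass seen only by ultrasound still dominates the decision.
        return min(depth_m, ultrasonic_m)
    if u_ok:
        # Depth camera saw nothing valid -- likely a transparent obstacle.
        return ultrasonic_m
    if d_ok:
        return depth_m
    return None  # path appears clear to both sensors


print(fuse_obstacle_distance(3.5, 1.2))   # glass door close by: ultrasound wins
print(fuse_obstacle_distance(None, 0.9))  # depth camera sees nothing at all
print(fuse_obstacle_distance(1.8, None))  # small opaque obstacle, depth only
```

Taking the minimum of the two readings is the conservative choice for a safety device: a false alarm costs the user a detour, while a missed glass door costs a collision.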

Key Results

The paper systematically evaluates the performance of the smart glasses through both objective and subjective testing. Objective tests measured the adaptability, correctness, and computational efficiency of the obstacle-avoidance algorithm. Notably, the experiments demonstrate that the proposed system can detect obstacles as short as 5 cm when the camera height and the obstacle distance fall within favorable ranges.
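The dependence on camera height and distance can be made concrete with a simple geometric sketch. The model below is an assumption for illustration, not the paper's detection criterion: it supposes an obstacle becomes distinguishable from the ground plane once its top subtends at least `min_rows` pixel rows in the depth image, with hypothetical field-of-view and resolution values.

```python
import math

# Hedged illustration: the smallest obstacle height distinguishable from
# the ground plane, assuming detection needs an angular margin of
# `min_rows` pixel rows above the ground-plane return. All parameter
# values are assumptions, not figures from the paper.

def min_detectable_height(cam_height_m, distance_m,
                          vfov_deg=45.0, v_res_px=480, min_rows=4):
    """Smallest obstacle height (m) separable from the ground plane."""
    rad_per_px = math.radians(vfov_deg) / v_res_px
    needed_angle = min_rows * rad_per_px                  # required angular margin
    ground_angle = math.atan2(cam_height_m, distance_m)   # depression to ground point
    top_angle = ground_angle - needed_angle               # depression to obstacle top
    h_top = cam_height_m - distance_m * math.tan(top_angle)
    return max(h_top, 0.0)

# e.g. an eyeglass-mounted camera roughly 1.5 m up, obstacle 1 m ahead
print(round(min_detectable_height(1.5, 1.0), 3))
```

Under these assumed parameters the minimum detectable height grows with distance, which matches the intuition that low obstacles must be caught while they are still close.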

The paper further explores the effectiveness of the multi-sensor approach by testing against various transparent objects. The results substantiate the hypothesis that combining a depth camera with an ultrasonic sensor significantly increases obstacle-detection accuracy compared to depth imaging alone.

The computational analysis shows that the system achieves real-time performance, with a maximum frame-processing time of 30.2 ms. This is crucial for user safety, since obstacles must be detected and announced quickly enough to be avoided.

Implications and Future Directions

The successful implementation and testing of these smart guiding glasses hold substantial practical implications, offering visually impaired individuals an accessible and efficient navigation aid in complex environments. The device's affordability and efficiency make it a strong candidate for widespread consumer use. Future work may focus on improving the system's robustness under varied lighting conditions and extending the device to outdoor environments. Integrating machine learning for pathfinding and user feedback is another potential direction, which could further improve accuracy and adaptability.

The smart guiding glasses present an intriguing convergence of sensor fusion, AR technology, and human-computer interaction, contributing significantly to the enhancement of mobility aids for the visually impaired community. Continued advancements and interdisciplinary collaboration could further propel the accessibility and functionality of such innovative devices in real-world applications.