
A method for three-dimensional (3D) scene modelling from sequences of depth images in a personal micro-navigation aid for the visually impaired

Building an electronic travel aid (ETA) is a difficult interdisciplinary challenge. In this work we demonstrate how to build a simplified model of 3D scenes for the purpose of presenting the environment non-visually to the visually impaired.

  • Methods

    The proposed solution relies on an off-the-shelf stereo vision camera, the Stereolabs ZED. Among the available depth estimation techniques, stereo vision was selected for its simplicity, its effectiveness under natural illumination of the observed scene (it is a passive method that requires no additional source of radiation), and its small size and weight compared to competing solutions. The concept of a simplified 3D model of the scene puts the emphasis on determining the ground plane (GP), i.e. a flat area devoid of any objects in the bottom part of the imaged scene. Such an approach is well justified given the role of planes in a 3D scene: detecting the GP significantly simplifies identifying the remaining scene objects (obstacles), which can be identified after discarding the parts of the imaged scene corresponding to the ground plane. We propose a particle filtering algorithm that models the movement of the ground plane in camera coordinates on the basis of changes in the components of its normal vector or in the coordinates of its points.
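The idea of tracking the ground plane with a particle filter can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: it assumes the plane is parameterised as a unit normal n and offset d (points p on the plane satisfy n·p = d), and all function names, noise levels, and the initial guess of a horizontal plane 1.5 m below the camera are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_particles(n_particles=200):
    """Start all particles near a horizontal plane ~1.5 m below the camera
    (an assumed prior; a real system would initialise from the first frame)."""
    normals = np.tile([0.0, -1.0, 0.0], (n_particles, 1))
    normals += rng.normal(0, 0.05, normals.shape)
    normals /= np.linalg.norm(normals, axis=1, keepdims=True)
    offsets = 1.5 + rng.normal(0, 0.1, n_particles)
    weights = np.full(n_particles, 1.0 / n_particles)
    return normals, offsets, weights

def step(normals, offsets, weights, points, sigma=0.03):
    """One predict-update-resample cycle on one frame's 3D point cloud."""
    # Predict: diffuse particles so they can follow camera motion.
    normals = normals + rng.normal(0, 0.02, normals.shape)
    normals /= np.linalg.norm(normals, axis=1, keepdims=True)
    offsets = offsets + rng.normal(0, 0.02, offsets.shape)
    # Update: weight each particle by how close the cloud's points lie
    # to that particle's plane (Gaussian likelihood of point-plane distance).
    dist = np.abs(points @ normals.T - offsets)   # shape (n_points, n_particles)
    weights = np.exp(-(dist ** 2) / (2 * sigma ** 2)).mean(axis=0)
    weights /= weights.sum()
    # Resample: draw particles in proportion to their weights.
    idx = rng.choice(len(weights), size=len(weights), p=weights)
    return normals[idx], offsets[idx], np.full(len(idx), 1.0 / len(idx))
```

The state estimate at each frame is then simply the (weighted) mean of the particle normals and offsets; pixels whose 3D points lie within a small distance of the estimated plane can be labelled as ground plane, and the rest treated as obstacle candidates.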

  • Results

    Our particle filter (PF)-based algorithm estimates the positions of planes in sequences of depth images of three-dimensional scenes captured by a freely moving depth camera. Tests on a set of 38 stereo vision test image sequences showed that the RMS errors of the pitch and roll orientation angles of the ground plane in the camera coordinate system do not exceed 3° and 2°, respectively. A large overlap between the ground-truth ground plane regions and the ground plane detected by the PF-based algorithm was achieved for both indoor and outdoor test image sequences (mean Jaccard similarity JC = 0.94).
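The two evaluation measures quoted above are standard and can be sketched briefly. The functions below are illustrative helpers, not the paper's code; in particular, the pitch/roll convention shown is one common choice and may differ from the one used in the evaluation.

```python
import numpy as np

def jaccard(mask_a, mask_b):
    """Jaccard similarity (intersection over union) of two binary
    ground-plane masks; 1.0 means perfect overlap."""
    inter = np.logical_and(mask_a, mask_b).sum()
    union = np.logical_or(mask_a, mask_b).sum()
    return inter / union if union else 1.0

def rms(errors_deg):
    """Root-mean-square of per-frame angular errors, in degrees."""
    e = np.asarray(errors_deg, dtype=float)
    return float(np.sqrt(np.mean(e ** 2)))

def pitch_roll_deg(n):
    """Pitch and roll of a plane normal n, under one assumed convention
    (camera y-axis pointing down toward the ground plane)."""
    nx, ny, nz = n / np.linalg.norm(n)
    pitch = np.degrees(np.arctan2(nz, -ny))
    roll = np.degrees(np.arctan2(nx, -ny))
    return pitch, roll
```

For example, two 2x2 masks that agree on one ground-plane pixel out of two marked pixels give a Jaccard similarity of 0.5.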

  • Conclusions

    The presented work focuses on image processing in the service of the quality of life and safety of blind and visually impaired pedestrians. The overarching aim of the work was to develop a method for building a simplified three-dimensional (3D) model of the scene. Demo movies of the proposed method:
    (Initial) obstacle detection: walk in the corridor
    (Initial) obstacle detection: crossing the platform

  • Contact

    • Dr. Mateusz Owczarek
    • Dr. Piotr Skulimowski
    • Prof. Pawel Strumillo

  • Publication


    A method for three-dimensional (3D) scene modelling from sequences of depth images in a personal micro-navigation aid for the visually impaired
