Person independent recognition of head gestures from parametrised and non-parametrised IMU signals

Introduction

Human–System Interaction (HSI) is actively pursued as a separate research field dedicated to the development of new technologies facilitating human communication with systems. Depending on the application, such interaction systems are also referred to as Human–Computer Interfaces (HCI) or, more generally, human–machine interfaces. The purpose of this research is to develop a hands-free, head-gesture-controlled interface that can help persons with disabilities communicate with other people and devices, e.g. enable paralysed individuals to signal messages or the visually impaired to operate electronic travel aids.

  • Methods

    The hardware of the interface consists of a small stereovision rig with a built-in inertial measurement unit (IMU). The device is positioned on the user’s forehead. Two approaches to recognising head movements were considered. In the first approach, statistical parameters were calculated from the signals recorded by a three-axis accelerometer and a three-axis gyroscope: the average, minimum and maximum amplitude, standard deviation, kurtosis, correlation coefficient, and signal energy. In the second, non-parametric approach, the focus was on direct analysis of the signal samples recorded from the IMU. In both approaches, the accuracies of sixteen different data classifiers in distinguishing the head movements pitch, roll, yaw, and immobility were evaluated. Recordings of head gestures were collected from 65 individuals; the data set can be accessed at: http://eletel.p.lodz.pl/abterka
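    The parametric approach described above can be sketched as a per-channel feature extractor. This is a minimal illustration, not the paper's exact implementation: the window shape, channel ordering, and precise feature definitions (e.g. energy as the sum of squared samples) are assumptions.

    ```python
    import numpy as np
    from scipy.stats import kurtosis

    def imu_features(window):
        """Compute statistical features for one gesture window.

        `window` is assumed to be an (n_samples, 6) array: three
        accelerometer axes followed by three gyroscope axes.
        """
        feats = []
        for ch in window.T:
            feats += [
                ch.mean(),        # average
                ch.min(),         # minimum amplitude
                ch.max(),         # maximum amplitude
                ch.std(),         # standard deviation
                kurtosis(ch),     # kurtosis
                np.sum(ch ** 2),  # signal energy (assumed definition)
            ]
        # pairwise correlation coefficients between the six channels
        corr = np.corrcoef(window.T)
        upper = np.triu_indices_from(corr, k=1)
        feats += list(corr[upper])
        return np.array(feats)
    ```

    With six features per channel plus 15 pairwise correlations, each window is reduced to a 51-element feature vector that can be fed to any of the evaluated classifiers.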


  • Results

    The best results on the testing data were obtained with the non-parametric approach, i.e. direct classification of unprocessed IMU signal samples, using a Support Vector Machine (SVM) classifier (95% correct recognitions). Marginally worse results in this approach were obtained with the random forest classifier (93%). The high recognition rates achieved suggest that a person with a physical or sensory disability can efficiently communicate with other people or control applications using simple head-gesture sequences.

  • Conclusions

    Key conclusions from this study are the following:

    • head gestures (roll, yaw, and pitch) can be efficiently recognised on the basis of signal recordings from an IMU positioned on the user’s forehead; for the user-independent tests the correct recognition rates exceeded 95%,
    • data classifiers trained on IMU signal samples outperformed classifiers trained on a set of statistical parameters derived from the IMU signals (confirmed by the paired Wilcoxon signed-rank test),
    • the recognition accuracy could be further improved by accumulating classifier recognitions with every arriving IMU signal sample (i.e. every 100 ms) and by applying an ensemble of classifiers.
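    The accumulation of recognitions mentioned in the last point can be illustrated with a simple fusion rule. A majority vote over the labels produced so far is one plausible scheme; it is an assumption of this sketch, not necessarily the paper's exact fusion method.

    ```python
    from collections import Counter

    def accumulate_decision(per_sample_labels):
        """Fuse per-sample classifier outputs into one gesture decision.

        Each arriving IMU sample (every 100 ms in the described setup)
        yields one classifier label; the running majority vote over the
        labels seen so far is returned as the current decision.
        """
        counts = Counter(per_sample_labels)
        label, _ = counts.most_common(1)[0]
        return label

    # With each new sample, the decision stabilises as evidence accumulates:
    accumulate_decision(["pitch", "yaw", "pitch", "pitch"])  # → "pitch"
    ```

    More elaborate variants would weight recent samples or combine the votes of several classifiers in an ensemble, as the conclusion suggests.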

    We hypothesise that the presented proof-of-concept study can serve as a basis for building an alternative communication channel for people with serious physical disabilities, e.g. individuals suffering from severe spinal injuries or stroke.

  • Contact

    • Anna Borowska-Terka
