
In this paper, we present a framework for integrating two different types of sensors for hand-gesture recognition using deep learning. The two sensors detect the signal in completely different ways: one is an ultra-wideband (UWB) impulse radar sensor and the other a thermal sensor. For robust gesture classification, two parallel paths are used, each employing a combination of a convolutional neural network (CNN) and a long short-term memory (LSTM) network, one path operating on the radar signal and the other on the thermal signal. The classification results from the two paths are then fused to improve the overall detection probability. The two sensors complement each other: the UWB radar is accurate for radial movement but less accurate for lateral movement, while the reverse holds for the thermal sensor. Combining both sensors thus yields near-perfect classification accuracy of 99% across 14 different hand gestures.
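The fusion step described above can be sketched minimally as follows. This is a hypothetical illustration, not the paper's implementation: we assume each CNN+LSTM path outputs a softmax probability vector over the 14 gesture classes, and we assume simple probability averaging as the fusion rule (the abstract does not specify the exact fusion method). The function names `softmax` and `fuse` are illustrative.

```python
import math

NUM_GESTURES = 14  # number of hand-gesture classes in the paper


def softmax(logits):
    """Convert raw per-class scores into a probability distribution."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]


def fuse(radar_probs, thermal_probs):
    """Fuse the two paths by averaging per-class probabilities
    (an assumed rule), returning the winning class and fused vector."""
    fused = [(r + t) / 2.0 for r, t in zip(radar_probs, thermal_probs)]
    return fused.index(max(fused)), fused


# Toy example: both sensor paths favour gesture class 3,
# the radar path more confidently than the thermal path.
radar_probs = softmax([5.0 if i == 3 else 0.0 for i in range(NUM_GESTURES)])
thermal_probs = softmax([3.0 if i == 3 else 0.0 for i in range(NUM_GESTURES)])
label, fused = fuse(radar_probs, thermal_probs)
```

Averaging is the simplest late-fusion choice; a weighted average (e.g. weighting the radar path more for radial gestures) would be a natural refinement given the complementary strengths noted in the abstract.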

Original publication

DOI

10.1109/SENSORS47125.2020.9278683

Type

Conference paper

Publication Date

25/10/2020

Volume

2020-October