Abstract
The ability to measure heart rate (HR) from face videos is useful in applications such as neonatal monitoring, telemedicine, and affective computing. In realistic environments, subjects often exhibit spontaneous head movements and facial expressions that severely degrade the performance of current methods. We propose a novel patch-based fusion framework for estimating HR accurately from face videos in the presence of subject motion. Wavelet time–frequency analysis is applied to the raw blood volume pulse (BVP) signals to select the less contaminated patches. A weighted fusion formula based on frequency and gradient information then combines the selected patches into the final, precise BVP signal. Our method is validated on both a self-collected dataset and the public MAHNOB-HCI dataset. Experimental results show that the proposed method clearly outperforms the state of the art in accuracy and robustness.
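The abstract does not specify the paper's fusion formula, but the idea of weighting per-patch BVP signals by a frequency cue and a gradient cue can be sketched as follows. This is a minimal illustration, not the authors' method: the band-power ratio, the inverse-gradient weighting, and the function `patch_fusion` are all assumptions made for the sake of the example.

```python
import numpy as np
from scipy.signal import periodogram

def patch_fusion(patch_signals, fs=30.0, hr_band=(0.7, 4.0)):
    """Fuse per-patch BVP signals into one signal via quality weights.

    Hypothetical weights combine (a) spectral energy concentrated in a
    plausible heart-rate band (frequency cue) and (b) inverse temporal-
    gradient energy (motion cue); the actual formula in the paper is
    not given in the abstract.
    """
    patch_signals = np.asarray(patch_signals, dtype=float)
    # Zero-mean each patch signal along the time axis
    patch_signals = patch_signals - patch_signals.mean(axis=1, keepdims=True)

    freqs, psd = periodogram(patch_signals, fs=fs, axis=1)
    in_band = (freqs >= hr_band[0]) & (freqs <= hr_band[1])
    # Frequency cue: fraction of power inside the HR band (0.7-4 Hz ~ 42-240 bpm)
    band_ratio = psd[:, in_band].sum(axis=1) / (psd.sum(axis=1) + 1e-12)
    # Gradient cue: patches with large frame-to-frame jumps get low weight
    grad_energy = np.mean(np.diff(patch_signals, axis=1) ** 2, axis=1)
    weights = band_ratio / (grad_energy + 1e-12)
    weights /= weights.sum()
    # Weighted sum of patch signals yields the fused BVP estimate
    return weights @ patch_signals, weights
```

Under this weighting, a patch carrying a clean pulse-like oscillation dominates the fusion, while a motion-corrupted patch with broadband, high-gradient noise is suppressed.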
http://bit.ly/2RqnVwl