Fully automatic 3D facial expression recognition using local depth features
Facial expressions form a significant part of our nonverbal communication, and understanding them is essential for effective human-computer interaction. Due to the diversity of facial geometry and expressions, automatic expression recognition is a challenging task. This paper deals with the problem of person-independent facial expression recognition from a single 3D scan. We consider only the 3D shape because facial expressions are mostly encoded in deformations of the facial geometry rather than in texture. Unlike the majority of existing works, our method is fully automatic, including the detection of landmarks. We detect the four eye corners and the nose tip in real time on the depth image and its gradients using Haar-like features and an AdaBoost classifier. From these five points, another 25 heuristic points are defined to extract local depth features for representing facial expressions. The depth features are projected to a lower-dimensional linear subspace, where feature selection is performed by maximizing their relevance and minimizing their redundancy. The selected features are then used to train a multi-class SVM for the final classification. Experiments on the benchmark BU-3DFE database show that the proposed method outperforms existing automatic techniques, and is comparable even to approaches that rely on manually annotated landmarks.
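The classification stage described in the abstract can be sketched as a simple pipeline. The following is a minimal illustration, not the authors' implementation: it uses scikit-learn, PCA as a stand-in for the linear subspace projection, mutual-information ranking as a rough stand-in for the max-relevance/min-redundancy selection criterion, and random data in place of real depth features extracted at the 25 heuristic points.

```python
# Hypothetical sketch of the pipeline: local depth features -> linear
# subspace projection -> relevance-based feature selection -> multi-class SVM.
# All data and dimensions below are illustrative, not from the paper.
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.decomposition import PCA
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Toy stand-in data: 120 scans, 25 landmark patches x 16 depth values each.
X = rng.normal(size=(120, 25 * 16))
y = rng.integers(0, 6, size=120)  # six prototypical expression classes

pipe = Pipeline([
    ("pca", PCA(n_components=40)),                        # subspace projection
    ("select", SelectKBest(mutual_info_classif, k=20)),   # relevance-based selection
    ("svm", SVC(kernel="rbf", decision_function_shape="ovr")),  # multi-class SVM
])
pipe.fit(X, y)
pred = pipe.predict(X)
print(pred.shape)  # one predicted expression label per scan
```

On real data, the random features would be replaced by depth patches sampled around the automatically detected landmarks, and the selection step by a proper min-redundancy/max-relevance criterion.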
Related items (by title, author, creator and subject):
Xue, M.; Mian, A.; Liu, Wan-Quan; Li, Ling (2015) This paper addresses the problem of person-independent 4D facial expression recognition. Unlike the majority of existing works, we propose to extract spatio-temporal features in 4D data (3D expression sequences changing ...
Xue, M.; Duan, X.; Zhou, J.; Wang, C.; Wang, Y.; Li, Z.; Liu, Wan-Quan (2016) © Springer International Publishing AG 2016. This paper investigates the other-race effects in automatic 3D facial expression recognition, giving the computational analysis of the recognition performance obtained from two ...
Classifying pretended and evoked facial expressions of positive and negative affective states using infrared measurement of skin temperature
Khan, Masood Mehmood; Ward, R. D.; Ingleby, M. (2009) Earlier researchers were able to extract the transient facial thermal features from thermal infrared images (TIRIs) to make binary distinctions between the expressions of affective states. However, effective human-computer ...