Adaptive Classification of Occluded Facial Expressions of Affective States
Internationally, the recent pandemic caused severe social changes, forcing people to adopt new practices in their daily lives. One of these changes requires people to wear masks in public spaces to mitigate the spread of viral diseases. Affective state assessment (ASA) systems that rely on facial expression analysis become impaired and less effective in the presence of visual occlusions caused by wearing masks. Therefore, ASA systems need to be future-proofed and equipped with adaptive technologies so that they can analyze and assess occluded facial expressions, particularly in the presence of masks. This paper presents an adaptive approach for classifying occluded facial expressions when human faces are partially covered with masks. We deployed an unsupervised, cosine similarity-based clustering approach that exploits the continuous nature of the extended Cohn-Kanade (CK+) dataset. The cosine similarity-based clustering resulted in twenty-one micro-expression clusters that describe minor variations of human facial expressions. Linear discriminant analysis was used to project all clusters onto lower-dimensional discriminant feature spaces, allowing for binary occlusion classification and the dynamic assessment of affective states. During the validation stage, we observed 100% accuracy when classifying faces using features extracted from the lower part of occluded faces (occlusion detection). We observed 76.11% facial expression classification accuracy when features were gathered from uncovered full faces, and 73.63% accuracy when classifying upper-facial expressions, applied when the lower part of the face is occluded. The presented system promises an improvement to visual inspection systems through an adaptive occlusion detection and facial expression classification framework.
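The cosine similarity-based clustering step described in the abstract could, under assumptions about how it is implemented (the paper does not publish its algorithm here), be sketched as a greedy threshold procedure: each feature vector joins the first cluster whose centroid is sufficiently similar, otherwise it seeds a new cluster. The function names, the centroid-update rule, and the `sim_threshold` value below are all hypothetical choices for illustration, not the authors' method.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine of the angle between two feature vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def threshold_cluster(features, sim_threshold=0.95):
    """Greedy cosine-similarity clustering (illustrative sketch).

    Each sample is assigned to the first cluster whose running centroid
    has cosine similarity >= sim_threshold; otherwise it starts a new
    cluster. Returns (labels, centroids).
    """
    centroids, labels = [], []
    for x in features:
        for i, c in enumerate(centroids):
            if cosine_similarity(x, c) >= sim_threshold:
                labels.append(i)
                # Simple running-centroid update (a hypothetical choice).
                centroids[i] = (c + x) / 2.0
                break
        else:
            labels.append(len(centroids))
            centroids.append(x.astype(float))
    return np.array(labels), centroids

# Two nearly parallel vectors group together; an orthogonal one does not.
labels, centroids = threshold_cluster(
    np.array([[1.0, 0.0], [0.99, 0.1], [0.0, 1.0]])
)
```

With real facial features, the number of clusters that emerges depends on the threshold; the paper reports twenty-one micro-expression clusters on CK+. The subsequent LDA projection could then be fit with the cluster labels as classes, e.g. scikit-learn's `LinearDiscriminantAnalysis`.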
Showing items related by title, author, creator and subject.
Classifying pretended and evoked facial expressions of positive and negative affective states using infrared measurement of skin temperature. Khan, Masood Mehmood; Ward, R. D.; Ingleby, M. (2009). Earlier researchers were able to extract the transient facial thermal features from thermal infrared images (TIRIs) to make binary distinctions between the expressions of affective states. However, effective human-computer ...
Vice, Jordan; Khan, Masood; Tan, Tele; Yanushkevich, Svetlana (2022). Independent, discrete models like Paul Ekman's six basic emotions model are widely used in affective state assessment (ASA) and facial expression classification. However, the continuous and dynamic nature of human expressions ...
Hargreaves, T.; Khan, Masood Mehmood; Bensen, D.; Tan, Tele (2016). A closed-loop Petri Net (PN) model was developed to exhibit, maintain, and withdraw facial expressions of six basic affective states, in a human-like manner, on a robotic face. The PN model was aimed to enable execution ...