Adaptive Classification of Occluded Facial Expressions of Affective States
Date: 2022
Abstract
Internationally, the recent pandemic caused severe social changes, forcing people to adopt new practices in their daily lives. One such change requires people to wear masks in public spaces to mitigate the spread of viral diseases. Affective state assessment (ASA) systems that rely on facial expression analysis become impaired and less effective in the presence of visual occlusions caused by masks. ASA systems therefore need to be future-proofed and equipped with adaptive technologies that can analyze and assess occluded facial expressions, particularly in the presence of masks. This paper presents an adaptive approach for classifying occluded facial expressions when human faces are partially covered with masks. We deployed an unsupervised, cosine similarity-based clustering approach that exploits the continuous nature of the extended Cohn-Kanade (CK+) dataset. The cosine similarity-based clustering yielded twenty-one micro-expression clusters that describe minor variations of human facial expressions. Linear discriminant analysis was used to project all clusters onto lower-dimensional discriminant feature spaces, allowing for binary occlusion classification and the dynamic assessment of affective states. During validation, we observed 100% accuracy when classifying faces using features extracted from the lower part of occluded faces (occlusion detection). We observed 76.11% facial expression classification accuracy when features were gathered from uncovered full faces, and 73.63% accuracy when classifying upper-facial expressions, which is applied when the lower part of the face is occluded. The presented system promises to improve visual inspection systems through an adaptive occlusion detection and facial expression classification framework.
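The pipeline described in the abstract (unsupervised cosine similarity-based clustering followed by linear discriminant analysis for classification) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the greedy clustering rule, the similarity threshold, the feature dimensionality, and the synthetic data standing in for facial-expression feature vectors are all assumptions.

```python
# Sketch (assumed, not the paper's code): cosine-similarity clustering of
# expression feature vectors, then LDA projection onto a lower-dimensional
# discriminant space for classification.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis


def cosine_similarity(a, b):
    """Cosine of the angle between two feature vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def cluster_by_cosine(features, threshold=0.9):
    """Greedy clustering: assign each sample to the most similar existing
    cluster representative if similarity >= threshold, else start a new
    cluster. Returns one integer label per sample."""
    reps, labels = [], []
    for x in features:
        sims = [cosine_similarity(x, r) for r in reps]
        if sims and max(sims) >= threshold:
            labels.append(int(np.argmax(sims)))
        else:
            reps.append(x)               # x becomes a new cluster representative
            labels.append(len(reps) - 1)
    return np.array(labels)


# Synthetic stand-in for facial feature vectors: three well-separated
# directions, 20 samples each (illustrative only).
rng = np.random.default_rng(0)
groups = [rng.normal(scale=0.3, size=(20, 8)) + 10.0 * np.eye(8)[i]
          for i in range(3)]
features = np.vstack(groups)

labels = cluster_by_cosine(features, threshold=0.9)

# Project the clusters onto a 2-D discriminant feature space.
lda = LinearDiscriminantAnalysis(n_components=2)
projected = lda.fit_transform(features, labels)
print(projected.shape)  # (60, 2)
```

New, unoccluded or occluded face samples could then be classified in the discriminant space with `lda.predict`; the threshold controls how finely expressions are split into micro-expression clusters.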
Related items
Showing items related by title, author, creator and subject.
-
Vice, Jordan; Khan, Masood; Tan, Tele; Murray, Iain; Yanushkevich, Svetlana (2023) Models of seven discrete expressions developed using macro-level facial muscle variations would suffice identifying macro-level expressions of affective states. These models won't discretise continuous and dynamic ...
-
Khan, Masood Mehmood; Ward, R. D.; Ingleby, M. (2009) Earlier researchers were able to extract the transient facial thermal features from thermal infrared images (TIRIs) to make binary distinctions between the expressions of affective states. However, effective human-computer ...
-
Vice, Jordan; Khan, Masood; Tan, Tele; Yanushkevich, Svetlana (2022) Independent, discrete models like Paul Ekman's six basic emotions model are widely used in affective state assessment (ASA) and facial expression classification. However, the continuous and dynamic nature of human expressions ...