Show simple item record

dc.contributor.author: Vice, Jordan
dc.contributor.author: Khan, Masood
dc.contributor.author: Murray, Iain
dc.contributor.author: Yanushkevich, Svetlana
dc.contributor.editor: Papadopoulos, George
dc.contributor.editor: Angelov, Plamen
dc.date.accessioned: 2022-06-08T07:25:33Z
dc.date.available: 2022-06-08T07:25:33Z
dc.date.issued: 2022
dc.identifier.citation: Vice, J. and Khan, M. and Murray, I. and Yanushkevich, S. 2022. Adaptive Classification of Occluded Facial Expressions of Affective States. In: 2022 IEEE International Conference on Evolving and Adaptive Intelligent Systems, 25th May 2022, Larnaca, Cyprus.
dc.identifier.uri: http://hdl.handle.net/20.500.11937/88712
dc.identifier.doi: 10.1109/EAIS51927.2022.9787693
dc.description.abstract:

Internationally, the recent pandemic caused severe social changes, forcing people to adopt new practices in their daily lives. One of these changes requires people to wear masks in public spaces to mitigate the spread of viral diseases. Affective state assessment (ASA) systems that rely on facial expression analysis become impaired and less effective due to the visual occlusions caused by wearing masks. Therefore, ASA systems need to be future-proofed and equipped with adaptive technologies to analyze and assess occluded facial expressions, particularly in the presence of masks. This paper presents an adaptive approach for classifying occluded facial expressions when human faces are partially covered with masks. We deployed an unsupervised, cosine similarity-based clustering approach exploiting the continuous nature of the extended Cohn-Kanade (CK+) dataset. The cosine similarity-based clustering resulted in twenty-one micro-expression clusters that describe minor variations of human facial expressions. Linear discriminant analysis was used to project all clusters onto lower-dimensional discriminant feature spaces, allowing for binary occlusion classification and the dynamic assessment of affective states. During the validation stage, we observed 100% accuracy when classifying faces with features extracted from the lower part of the occluded faces (occlusion detection). We observed 76.11% facial expression classification accuracy when features were gathered from the uncovered full faces, and 73.63% classification accuracy when classifying upper-facial expressions, applied when the lower part of the face is occluded. The presented system promises an improvement to visual inspection systems through an adaptive occlusion detection and facial expression classification framework.
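The unsupervised, cosine similarity-based clustering described in the abstract can be sketched in outline. The greedy assignment strategy, centroid update, and similarity threshold below are illustrative assumptions for exposition only, not the procedure actually used in the paper (which operated on facial-expression features from CK+ sequences and yielded twenty-one clusters):

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def cluster_by_cosine(vectors, threshold=0.9):
    """Greedy clustering sketch: assign each vector to the first cluster
    whose centroid is within the cosine-similarity threshold; otherwise
    start a new cluster. Both the greedy rule and the threshold value
    are hypothetical choices for this illustration."""
    clusters = []  # each cluster: {"centroid": [...], "members": [[...], ...]}
    for v in vectors:
        for c in clusters:
            if cosine_similarity(c["centroid"], v) >= threshold:
                c["members"].append(v)
                # Recompute the centroid as the element-wise mean of members.
                n = len(c["members"])
                c["centroid"] = [sum(col) / n for col in zip(*c["members"])]
                break
        else:
            clusters.append({"centroid": list(v), "members": [v]})
    return clusters
```

In the paper's pipeline, the resulting micro-expression clusters are then projected onto lower-dimensional discriminant feature spaces via linear discriminant analysis; that step is omitted here.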

dc.language: English
dc.publisher: IEEE Press
dc.subject: 0915 - Interdisciplinary Engineering
dc.subject: 4611 - Machine learning
dc.subject: 4602 - Artificial intelligence
dc.subject: 4603 - Computer vision and multimedia computation
dc.title: Adaptive Classification of Occluded Facial Expressions of Affective States
dc.type: Conference Paper
dcterms.source.volume: 1
dcterms.source.number: 1
dcterms.source.title: Proceedings of the 2022 IEEE International Conference on Evolving and Adaptive Intelligent Systems
dcterms.source.isbn: 978-1-6654-3706-6
dcterms.source.conference: 2022 IEEE International Conference on Evolving and Adaptive Intelligent Systems
dcterms.source.conference-start-date: 25 May 2022
dcterms.source.conference-end-date: 27 May 2022
dcterms.source.conferencelocation: Larnaca, Cyprus
dcterms.source.place: New Jersey
dc.date.updated: 2022-06-08T07:25:32Z
curtin.department: School of Civil and Mechanical Engineering
curtin.accessStatus: Fulltext not available
curtin.faculty: Faculty of Science and Engineering
curtin.contributor.orcid: Khan, Masood [0000-0002-2769-2380]
curtin.contributor.scopusauthorid: Khan, Masood [7410317782]


