A Hierarchical Separation and Classification Network for Dynamic Micro-Expression Classification
dc.contributor.author | Vice, Jordan | |
dc.contributor.author | Khan, Masood | |
dc.contributor.author | Tan, Tele | |
dc.contributor.author | Murray, Iain | |
dc.contributor.author | Yanushkevich, Svetlana | |
dc.date.accessioned | 2023-12-12T10:31:05Z | |
dc.date.available | 2023-12-12T10:31:05Z | |
dc.date.issued | 2023 | |
dc.identifier.citation | Vice, J. and Khan, M. and Tan, T. and Murray, I. and Yanushkevich, S. 2023. A Hierarchical Separation and Classification Network for Dynamic Micro-Expression Classification. IEEE Transactions on Computational Social Systems. | |
dc.identifier.uri | http://hdl.handle.net/20.500.11937/93940 | |
dc.identifier.doi | 10.1109/TCSS.2023.3334823 | |
dc.description.abstract |
Models of seven discrete expressions developed using macro-level facial muscle variations suffice for identifying macro-level expressions of affective states. These models, however, cannot discretise the continuous and dynamic micro-level variations in facial expressions. We present a Hierarchical Separation and Classification Network (HSCN) for discovering dynamic, continuous macro- and micro-level variations in facial expressions of affective states. In the HSCN, we first invoke an unsupervised cosine similarity-based separation method on continuous facial expression data to extract twenty-one dynamic facial expression classes from the seven common discrete affective states. The between-cluster separation is then optimised for discovering the macro-level changes resulting from facial muscle activations. A subsequent step in the HSCN separates the upper and lower facial regions for realising changes pertaining to upper and lower facial muscle activations. Data from the two separated facial regions are then clustered in a linear discriminant space using similarities in muscular activation patterns. Next, the dynamic expression data are mapped onto the discriminant features to develop a rule-based expert system that classifies twenty-one upper and twenty-one lower micro-expressions. A random forest classifier recognised the twenty-one macro-level facial expressions with 76.11% accuracy. A support vector machine (SVM), applied separately to the upper and lower facial regions, classified them with accuracies of 73.63% and 87.68%, respectively. This work demonstrates a novel and effective method for the dynamic assessment of affective states. The HSCN further demonstrates that facial muscle variations gathered from either the upper, lower, or full face suffice for classifying affective states. We also provide new insights into the discovery of micro-level facial muscle variations and their utilisation in the dynamic assessment of facial expressions of affective states. | |
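The abstract outlines a multi-stage pipeline: cosine similarity-based separation of continuous expression data into twenty-one classes, projection into a linear discriminant space per facial region, and classification with a random forest (macro level) and region-wise SVMs. The Python sketch below illustrates one plausible arrangement of those stages on synthetic data; the 7 x 3 sub-class split, the feature dimensionality, and the split of the feature vector into upper and lower halves are assumptions for illustration only and are not taken from the paper.

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.metrics.pairwise import cosine_similarity
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for continuous facial-expression features:
# 2100 frames, 136 landmark-derived values per frame (hypothetical dimensions).
X = rng.normal(size=(2100, 136))
coarse_labels = rng.integers(0, 7, size=2100)   # seven discrete affective states

# Step 1 (assumed form of the cosine similarity-based separation): within each
# discrete state, split frames into three sub-classes by their cosine similarity
# to the state centroid, yielding 7 x 3 = 21 dynamic expression classes.
fine_labels = np.zeros(len(X), dtype=int)
for state in range(7):
    idx = np.where(coarse_labels == state)[0]
    centroid = X[idx].mean(axis=0, keepdims=True)
    sims = cosine_similarity(X[idx], centroid).ravel()
    bins = np.quantile(sims, [1 / 3, 2 / 3])
    fine_labels[idx] = state * 3 + np.digitize(sims, bins)

# Step 2: macro-level classification of the 21 classes with a random forest.
Xtr, Xte, ytr, yte = train_test_split(X, fine_labels, test_size=0.3, random_state=0)
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(Xtr, ytr)
print("macro-level accuracy:", rf.score(Xte, yte))

# Step 3: region-wise micro-expression classification in a linear discriminant
# space. The upper/lower split of the feature vector is hypothetical.
for name, region in [("upper", X[:, :68]), ("lower", X[:, 68:])]:
    Z = LinearDiscriminantAnalysis(n_components=10).fit_transform(region, fine_labels)
    Ztr, Zte, ytr, yte = train_test_split(Z, fine_labels, test_size=0.3, random_state=0)
    svm = SVC(kernel="rbf").fit(Ztr, ytr)
    print(f"{name}-face SVM accuracy:", svm.score(Zte, yte))

Because the data above are random, the printed accuracies are near chance; the sketch only shows how the separation, discriminant projection, and classifiers could be chained, not the results reported in the abstract.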
dc.relation.sponsoredby | Curtin University | |
dc.relation.sponsoredby | University of Calgary (partial) | |
dc.rights.uri | http://creativecommons.org/licenses/by/4.0/ | |
dc.subject | Cosine similarity-based separation, Hierarchical classification, Micro-expression detection, Affective state assessment, Facial expression classification, Rule-based systems | |
dc.title | A Hierarchical Separation and Classification Network for Dynamic Micro-Expression Classification | |
dc.type | Journal Article | |
dcterms.source.title | IEEE Transactions on Computational Social Systems | |
dc.date.updated | 2023-12-12T10:31:03Z | |
curtin.department | School of Civil and Mechanical Engineering | |
curtin.accessStatus | Open access | |
curtin.faculty | Faculty of Science and Engineering | |
curtin.contributor.orcid | Khan, Masood [0000-0002-2769-2380] | |
curtin.contributor.orcid | Murray, Iain [0000-0003-1840-9624] | |
curtin.contributor.orcid | Tan, Tele [0000-0003-3195-3480] | |
curtin.contributor.scopusauthorid | Khan, Masood [7410317782] | |
curtin.repositoryagreement | V3 |