    espace - Curtin’s institutional repository

    Deep, dense and accurate 3D face correspondence for generating population specific deformable models

    Access Status
    Fulltext not available
    Authors
    Gilani, S.
    Mian, A.
    Eastwood, Peter
    Date
    2017
    Type
    Journal Article
    
    Citation
    Gilani, S. and Mian, A. and Eastwood, P. 2017. Deep, dense and accurate 3D face correspondence for generating population specific deformable models. Pattern Recognition. 69: pp. 238-250.
    Source Title
    Pattern Recognition
    DOI
    10.1016/j.patcog.2017.04.013
    ISSN
    0031-3203
    School
    School of Physiotherapy and Exercise Science
    URI
    http://hdl.handle.net/20.500.11937/63382
    Collection
    • Curtin Research Publications
    Abstract

    © 2017 Elsevier Ltd. We present a multilinear algorithm to automatically establish dense point-to-point correspondence over an arbitrarily large number of population specific 3D faces across identities, facial expressions and poses. The algorithm is initialized with a subset of anthropometric landmarks detected by our proposed Deep Landmark Identification Network, which is trained on synthetic images. The landmarks are used to segment the 3D face into Voronoi regions by evolving geodesic level set curves. Exploiting the intrinsic features of these regions, we extract discriminative keypoints on the facial manifold to elastically match the regions across faces for establishing dense correspondence. Finally, we generate a Region based 3D Deformable Model which is fitted to unseen faces to transfer the correspondences. We evaluate our algorithm on the tasks of facial landmark detection and recognition using two benchmark datasets. Comparison with thirteen state-of-the-art techniques shows the efficacy of our algorithm.
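    The abstract outlines a multi-stage pipeline: detect anthropometric landmarks with a deep network, segment the face into Voronoi regions, extract discriminative keypoints per region, match regions elastically across faces, and fit a region-based deformable model to unseen faces. The sketch below illustrates only the region-segmentation step, and only in spirit: it uses synthetic points, Euclidean distance as a stand-in for the geodesic level-set evolution the paper performs on the facial manifold, and hypothetical helper names (segment_into_regions, region_centroids). It is not the authors' implementation.

import numpy as np

def segment_into_regions(vertices, landmarks):
    """Assign each vertex to its nearest landmark (Voronoi-style regions).

    The paper evolves geodesic level-set curves on the facial manifold;
    Euclidean distance is used here only as a simplified stand-in.
    """
    # Pairwise distances, shape (num_vertices, num_landmarks).
    d = np.linalg.norm(vertices[:, None, :] - landmarks[None, :, :], axis=2)
    return np.argmin(d, axis=1)  # region label per vertex

def region_centroids(vertices, labels, num_regions):
    """Crude per-region descriptor (centroid), standing in for the
    discriminative keypoints the paper extracts in each region."""
    return np.stack([vertices[labels == r].mean(axis=0) for r in range(num_regions)])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic stand-ins: 5000 "face" vertices and 14 detected landmarks.
    face = rng.normal(size=(5000, 3))
    landmarks = rng.normal(size=(14, 3))

    labels = segment_into_regions(face, landmarks)
    descriptors = region_centroids(face, labels, len(landmarks))
    print(descriptors.shape)  # (14, 3): one descriptor per region

    In the full method, each region would carry keypoint descriptors that drive the elastic matching across faces, and the resulting correspondences would be aggregated into the Region based 3D Deformable Model described above.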

    Related items

    Showing items related by title, author, creator and subject.

    • Facial feature discovery for ethnicity recognition
      Wang, C.; Zhang, Q.; Liu, Wan-Quan; Liu, Y.; Miao, L. (2018)
      The salient facial feature discovery is one of the important research tasks in ethnical group face recognition. In this paper, we first construct an ethnical group face dataset including Chinese Uyghur, Tibetan, and Korean. ...
    • Heuristic algorithms for routing problems.
      Chong, Yen N. (2001)
      General routing problems deal with transporting some commodities and/or travelling along the axes of a given network in some optimal manner. In the modern world such problems arise in several contexts such as distribution ...
    • Robust and flexible landmarks detection for uncontrolled frontal faces in the wild
      Liang, A.; Wang, C.; Liu, Wan-Quan; Li, L. (2016)
      In this paper, we propose a robust facial landmarking scheme for frontal faces which can be applied on both controlled and uncontrolled environment. This scheme is based on improvement/extension of the tree-structured ...
