
    Human interaction prediction using deep temporal features

    Access Status
    Fulltext not available
    Authors
    Ke, Q.
    Bennamoun, M.
    An, Senjian
    Boussaid, F.
    Sohel, F.
    Date
    2016
    Type
    Conference Paper
    
    Citation
    Ke, Q., Bennamoun, M., An, S., Boussaid, F. and Sohel, F. 2016. Human interaction prediction using deep temporal features, pp. 403-414.
    Source Title
    Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
    DOI
    10.1007/978-3-319-48881-3_28
    ISBN
    9783319488806
    School
    School of Electrical Engineering, Computing and Mathematical Sciences (EECMS)
    URI
    http://hdl.handle.net/20.500.11937/69567
    Collection
    • Curtin Research Publications
    Abstract

    © Springer International Publishing Switzerland 2016. Interaction prediction has a wide range of applications, such as robot control and the prevention of dangerous events. In this paper, we introduce a new method to capture deep temporal information in videos for human interaction prediction. We propose to use flow coding images to represent the low-level motion information in videos and to extract deep temporal features using a deep convolutional neural network architecture. We tested our method on the UT-Interaction dataset and the challenging TV Human Interaction dataset, and demonstrated the advantages of the proposed deep temporal features based on flow coding images. The proposed method, though using only the temporal information, outperforms state-of-the-art methods for human interaction prediction.
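
    The pipeline the abstract describes, encoding optical flow as a colour image and passing it through a CNN, can be illustrated with a short sketch. The Python fragment below is not the authors' implementation: it assumes OpenCV's Farneback optical flow as the motion estimator, a standard HSV colour coding of flow (direction as hue, magnitude as value) as a stand-in for the paper's flow coding images, and an ImageNet-pretrained ResNet-18 from torchvision as the deep network; the helper names flow_coding_image and deep_temporal_features are hypothetical.

    # A minimal sketch of flow coding images + CNN feature extraction,
    # under the assumptions stated above (not the paper's exact method).
    import cv2
    import numpy as np
    import torch
    from torchvision import models, transforms

    def flow_coding_image(prev_frame, next_frame):
        """Encode dense optical flow between two BGR frames as a colour
        image: flow direction -> hue, flow magnitude -> value."""
        prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
        next_gray = cv2.cvtColor(next_frame, cv2.COLOR_BGR2GRAY)
        flow = cv2.calcOpticalFlowFarneback(prev_gray, next_gray, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        mag, ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])
        hsv = np.zeros_like(prev_frame)
        hsv[..., 0] = ang * 180 / np.pi / 2   # hue encodes direction
        hsv[..., 1] = 255                     # full saturation
        hsv[..., 2] = cv2.normalize(mag, None, 0, 255, cv2.NORM_MINMAX)
        return cv2.cvtColor(hsv, cv2.COLOR_HSV2BGR)

    def deep_temporal_features(flow_img, backbone):
        """Extract a deep feature vector from one flow coding image."""
        preprocess = transforms.Compose([
            transforms.ToTensor(),
            transforms.Resize((224, 224)),
            transforms.Normalize(mean=[0.485, 0.456, 0.406],
                                 std=[0.229, 0.224, 0.225]),
        ])
        x = preprocess(cv2.cvtColor(flow_img, cv2.COLOR_BGR2RGB)).unsqueeze(0)
        with torch.no_grad():
            return backbone(x).flatten(1)     # shape (1, feature_dim)

    # Backbone: ImageNet-pretrained ResNet-18 with its classifier removed.
    cnn = models.resnet18(weights="IMAGENET1K_V1")
    backbone = torch.nn.Sequential(*list(cnn.children())[:-1])
    backbone.eval()

    In the prediction setting the abstract describes, features extracted this way from the frames of a partially observed video could then be pooled and fed to a classifier to predict the interaction class before the interaction completes; the pooling and classification stages are likewise assumptions here, as the abstract does not detail them.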

    Related items

    Showing items related by title, author, creator and subject.

    • Leveraging Structural Context Models and Ranking Score Fusion for Human Interaction Prediction
      Ke, Q.; Bennamoun, M.; An, Senjian; Sohel, F.; Boussaid, F. (2018)
      Predicting an interaction before it is fully executed is very important in applications such as human-robot interaction and video surveillance. In a two-human interaction scenario, there are often contextual dependency ...
    • Improving strategy for discovering interacting genetic variants in association studies
      Uppu, S.; Krishna, Aneesh (2016)
      Revealing the underlying complex architecture of human diseases has received considerable attention since the exploration of genotype-phenotype relationships in genetic epidemiology. Identification of these relationships ...
    • A new representation of skeleton sequences for 3D action recognition
      Ke, Q.; Bennamoun, M.; An, Senjian; Sohel, F.; Boussaid, F. (2017)
      © 2017 IEEE. This paper presents a new method for 3D action recognition with skeleton sequences (i.e., 3D trajectories of human skeleton joints). The proposed method first transforms each skeleton sequence into three clips ...