Computer vision for human-machine interaction
dc.contributor.author | Ke, Q. | |
dc.contributor.author | Liu, J. | |
dc.contributor.author | Bennamoun, M. | |
dc.contributor.author | An, S. | |
dc.contributor.author | Sohel, F. | |
dc.contributor.author | Boussaid, F. | |
dc.date.accessioned | 2019-02-19T04:17:21Z | |
dc.date.available | 2019-02-19T04:17:21Z | |
dc.date.created | 2019-02-19T03:58:21Z | |
dc.date.issued | 2018 | |
dc.identifier.citation | Ke, Q. and Liu, J. and Bennamoun, M. and An, S. and Sohel, F. and Boussaid, F. 2018. Computer vision for human-machine interaction. In Computer Vision For Assistive Healthcare, 127-145. | |
dc.identifier.uri | http://hdl.handle.net/20.500.11937/74557 | |
dc.identifier.doi | 10.1016/B978-0-12-813445-0.00005-8 | |
dc.description.abstract | © 2018 Elsevier Ltd. All rights reserved. Human-machine interaction (HMI) refers to the communication and interaction between a human and a machine via a user interface. Nowadays, natural user interfaces such as gestures have gained increasing attention, as they allow humans to control machines through natural and intuitive behaviors. In gesture-based HMI, a sensor such as the Microsoft Kinect is used to capture human postures and motions, which are processed to control a machine. The key task of gesture-based HMI is to recognize the. | |
dc.title | Computer vision for human-machine interaction | |
dc.type | Book Chapter | |
dcterms.source.startPage | 127 | |
dcterms.source.endPage | 145 | |
dcterms.source.title | Computer Vision For Assistive Healthcare | |
dcterms.source.isbn | 9780128134450 | |
curtin.department | School of Electrical Engineering, Computing and Mathematical Sciences (EECMS) | |
curtin.accessStatus | Fulltext not available |