Touch event recognition for human interaction
dc.contributor.author | Chen, Q.
dc.contributor.author | Li, H.
dc.contributor.author | Abu-Zhaya, R.
dc.contributor.author | Seidl, A.
dc.contributor.author | Zhu, Maggie
dc.contributor.author | Delp, E.
dc.date.accessioned | 2018-08-08T04:43:22Z
dc.date.available | 2018-08-08T04:43:22Z
dc.date.created | 2018-08-08T03:50:50Z
dc.date.issued | 2016
dc.identifier.citation | Chen, Q. and Li, H. and Abu-Zhaya, R. and Seidl, A. and Zhu, M. and Delp, E. 2016. Touch event recognition for human interaction.
dc.identifier.uri | http://hdl.handle.net/20.500.11937/70052
dc.identifier.doi | 10.2352/ISSN.2470-1173.2016.11.IMAWM-465
dc.description.abstract | © 2016 Society for Imaging Science and Technology. This paper investigates the interaction between two people, namely, a caregiver and an infant. A particular type of action in human interaction known as "touch" is described. We propose a method to detect "touch events" that uses color and motion features to track the hand positions of the caregiver. Our approach addresses the problem of hand occlusions during tracking. We propose an event recognition method that determines when the caregiver touches the infant and labels it as a "touch event" by analyzing the merging of the caregiver's hand contours with the infant's contour. The proposed method shows promising results compared to human-annotated data.
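The abstract describes contour-based touch detection: skin-colored hand contours are tracked and a "touch event" is flagged when they merge with the infant's contour. The sketch below is only an illustration of that general idea, not the authors' pipeline; the HSV skin range, the area threshold, and the `hand_contours`, `contours_touch`, and `touch_frames` helpers are all assumptions, and the paper's color/motion tracking and occlusion handling are not reproduced here.

```python
# Illustrative sketch only (assumes OpenCV 4.x); not the method from the paper.
import cv2
import numpy as np

# Assumed HSV skin-color range and contour-area threshold, not values from the paper.
SKIN_LOWER = np.array([0, 48, 80], dtype=np.uint8)
SKIN_UPPER = np.array([20, 255, 255], dtype=np.uint8)
MIN_HAND_AREA = 500

def hand_contours(frame_bgr):
    """Return candidate hand contours from a simple HSV skin-color mask."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, SKIN_LOWER, SKIN_UPPER)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [c for c in contours if cv2.contourArea(c) > MIN_HAND_AREA]

def contours_touch(contour_a, contour_b, frame_shape):
    """Heuristic merge test: the two filled contours overlap or are adjacent."""
    mask_a = np.zeros(frame_shape[:2], dtype=np.uint8)
    mask_b = np.zeros(frame_shape[:2], dtype=np.uint8)
    cv2.drawContours(mask_a, [contour_a], -1, 255, thickness=cv2.FILLED)
    cv2.drawContours(mask_b, [contour_b], -1, 255, thickness=cv2.FILLED)
    # Dilate slightly so touching-but-not-overlapping regions also register.
    mask_a = cv2.dilate(mask_a, np.ones((3, 3), np.uint8))
    return cv2.countNonZero(cv2.bitwise_and(mask_a, mask_b)) > 0

def touch_frames(video_path, infant_contour):
    """Return frame indices where any hand contour merges with the infant contour."""
    events = []
    cap = cv2.VideoCapture(video_path)
    idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if any(contours_touch(hand, infant_contour, frame.shape)
               for hand in hand_contours(frame)):
            events.append(idx)
        idx += 1
    cap.release()
    return events
```

In this sketch the infant's contour is supplied by the caller; in practice it would have to be segmented per frame, and consecutive flagged frames would be grouped into a single touch event.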
dc.title | Touch event recognition for human interaction
dc.type | Conference Paper
dcterms.source.issn | 2470-1173
dcterms.source.title | IS&T International Symposium on Electronic Imaging Science and Technology
dcterms.source.series | IS&T International Symposium on Electronic Imaging Science and Technology
curtin.department | School of Public Health
curtin.accessStatus | Fulltext not available
Files in this item
There are no files associated with this item.