Show simple item record

dc.contributor.author: Cocks, Naomi
dc.contributor.author: Sautin, L.
dc.contributor.author: Kita, S.
dc.contributor.author: Morgan, G.
dc.contributor.author: Zlotowitz, S.
dc.date.accessioned: 2017-01-30T11:17:57Z
dc.date.available: 2017-01-30T11:17:57Z
dc.date.created: 2013-09-03T20:00:20Z
dc.date.issued: 2009
dc.identifier.citation: Cocks, Naomi and Sautin, Laetitia and Kita, Sotaro and Morgan, Gary and Zlotowitz, Sally. 2009. Gesture and speech integration: An exploratory study of a man with aphasia. International Journal of Language & Communication Disorders. 44 (5): pp. 795-804.
dc.identifier.uri: http://hdl.handle.net/20.500.11937/10304
dc.identifier.doi: 10.1080/13682820802256965
dc.description.abstract:

Background: In order to comprehend fully a speaker's intention in everyday communication, information is integrated from multiple sources, including gesture and speech. There are no published studies that have explored the impact of aphasia on iconic co-speech gesture and speech integration.

Aims: To explore the impact of aphasia on co-speech gesture and speech integration in one participant with aphasia and 20 age-matched control participants.

Methods & Procedures: The participant with aphasia and 20 control participants watched video vignettes of people producing 21 verb phrases in three different conditions: verbal only (V), gesture only (G), and verbal gesture combined (VG). Participants were required to select a corresponding picture from one of four alternatives: an integration target, a verbal-only match, a gesture-only match, and an unrelated foil. The probability of choosing the integration target in the VG condition that goes beyond what is expected from the probabilities of choosing the integration target in V and G was referred to as multi-modal gain (MMG).

Outcomes & Results: The participant with aphasia obtained a significantly lower multi-modal gain score than the control participants (p<0.05). Error analysis indicated that in speech and gesture integration tasks, the participant with aphasia relied on gesture in order to decode the message, whereas the control participants relied on speech. Further analysis of the speech-only and gesture-only tasks indicated that the participant with aphasia had intact gesture comprehension but impaired spoken word comprehension.

Conclusions & Implications: The results confirm findings by Records (1994) that impaired verbal comprehension leads to a greater reliance on gesture to decode messages. Moreover, multi-modal integration of information from speech and iconic gesture can be impaired in aphasia. The findings highlight the need for further exploration of the impact of aphasia on gesture and speech integration.

dc.publisher: John Wiley & Sons Ltd.
dc.title: Gesture and speech integration: An exploratory study of a man with aphasia
dc.type: Journal Article
dcterms.source.volume: 44
dcterms.source.number: 5
dcterms.source.startPage: 795
dcterms.source.endPage: 804
dcterms.source.issn: 1368-2822
dcterms.source.title: International Journal of Language & Communication Disorders
curtin.department: of Technology
curtin.accessStatus: Fulltext not available

