
dc.contributor.author: Siddiqui, Aniqa Azeem
dc.contributor.supervisor: Masood Khan
dc.contributor.supervisor: Tele Tan
dc.date.accessioned: 2023-02-28T03:25:48Z
dc.date.available: 2023-02-28T03:25:48Z
dc.date.issued: 2021
dc.identifier.uri: http://hdl.handle.net/20.500.11937/90683
dc.description.abstract: This thesis reports a novel approach to translating voluntary tongue-motion data into user-specific parametric tongue-position models, which are then used to train a classifier. The classifier is then tested on localising tongue movements as cursor movements on screen, and can be used by rehabilitation centres and researchers to develop tongue-operated assistive technology. The system, designed and developed to let people with disabilities use the tongue to interact with computers, is called the InfraRed (IR) Activated Tongue-Computer Interaction System (IRTCIS).
dc.publisher: Curtin University
dc.title: Design and development of Infrared activated tongue computer interaction system
dc.type: Thesis
dcterms.educationLevel: MPhil
curtin.department: School of Civil and Mechanical Engineering
curtin.accessStatus: Open access
curtin.faculty: Science and Engineering
curtin.contributor.orcid: Siddiqui, Aniqa Azeem [0000-0002-6993-275X]
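
The abstract describes a pipeline in which user-specific tongue-position features are used to train a classifier whose predictions are then mapped to cursor movements on screen. The sketch below illustrates only that general idea; the feature layout, class labels, classifier choice (a scikit-learn SVM), and cursor step sizes are all illustrative assumptions, not the IRTCIS implementation reported in the thesis.

```python
# Minimal sketch of a tongue-position classifier driving cursor steps.
# Everything here (features, labels, step sizes) is hypothetical, standing in
# for the user-specific parametric position models described in the abstract.
import numpy as np
from sklearn.svm import SVC

# Hypothetical calibration data: each row is a feature vector derived from
# IR sensor readings for one voluntary tongue position; each label is the
# cursor direction the user intended during calibration.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 4))        # 4 IR-derived features per sample
y_train = rng.integers(0, 4, size=200)     # 0=left, 1=right, 2=up, 3=down

# Train a classifier on the user-specific calibration data.
clf = SVC(kernel="rbf")
clf.fit(X_train, y_train)

# Map a predicted tongue-position class to a small cursor step (pixels).
STEP = {0: (-5, 0), 1: (5, 0), 2: (0, -5), 3: (0, 5)}

def cursor_delta(ir_features):
    """Return a (dx, dy) cursor step for one frame of IR-derived features."""
    label = int(clf.predict(np.asarray(ir_features).reshape(1, -1))[0])
    return STEP[label]

# Example: one frame of (hypothetical) features produces one cursor step.
print(cursor_delta([0.1, -0.3, 0.8, 0.2]))
```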

