Visually guided autonomous robot navigation : an insect based approach.
dc.contributor.author | Weber, Keven | |
dc.date.accessioned | 2017-01-30T10:09:34Z | |
dc.date.available | 2017-01-30T10:09:34Z | |
dc.date.created | 2008-05-14T04:36:27Z | |
dc.date.issued | 1998 | |
dc.identifier.uri | http://hdl.handle.net/20.500.11937/1609 | |
dc.description.abstract |
Giving robots the ability to move around autonomously in various real-world environments has long been a major challenge for Artificial Intelligence. New approaches to the design and control of autonomous robots have shown the value of drawing inspiration from the natural world. Animals navigate, perceive and interact with various uncontrolled environments with seemingly little effort. Flying insects, in particular, are quite adept at manoeuvring in complex, unpredictable and possibly hostile environments.

Inspired by the view of insects as miniature machines, this thesis contributes to the autonomous control of mobile robots through the application of insect-based visual cues and behaviours. The parsimonious, yet robust, solutions offered by insects are directly applicable to the computationally restrictive world of autonomous mobile robots. To this end, two main navigational domains are focussed on: corridor guidance and visual homing.

Within a corridor environment, safe navigation is achieved through the application of simple and intuitive behaviours observed in insect visual navigation. By observing and responding to apparent image motion in a reactive, yet intelligent, way, the robot is able to exhibit useful corridor guidance behaviours at modest expense. Through a combination of simulation and real-world robot experiments, the feasibility of equipping a mobile robot with the ability to navigate safely in various environments is demonstrated. It is further shown that the reactive nature of the robot can be augmented to incorporate a map-building method that allows previously encountered corridors to be recognised through the observation of landmarks en route. This allows for a more globally directed navigational goal.

Many animals, including insects such as bees and ants, successfully engage in visual homing. This is achieved through the association of visual landmarks with a specific location. In this way, the insect is able to 'home in' on a previously visited site simply by moving in such a way as to maximise the match between the currently observed environment and the memorised 'snapshot' of the panorama as seen from the goal. A mobile robot can exploit the very same strategy to return simply and reliably to a previously visited location.

This thesis describes a system that allows a mobile robot to home successfully. Specifically, a simple, yet robust, homing scheme that relies only upon the observation of the bearings of visible landmarks is proposed. It is also shown that this strategy can easily be extended to incorporate other visual cues, which may improve overall performance. The homing algorithm described allows a mobile robot to home incrementally by moving in such a way as to gradually reduce the discrepancy between the current view and the view obtained from the home position. Both simulation and mobile robot experiments are again used to demonstrate the feasibility of the approach. | |
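As a concrete illustration of the corridor-guidance idea sketched in the abstract, the snippet below balances the apparent image motion measured on the left and right of the visual field and steers away from the side that appears to move faster, in the spirit of the bee "centring response" reported in the insect literature. This is a minimal sketch only, not the controller developed in the thesis; the function name, the flow inputs and the sign convention are assumptions.

```python
def steer_from_flow(left_flow: float, right_flow: float, gain: float = 0.5) -> float:
    """Centring-response sketch: steer away from the side with larger apparent motion.

    left_flow / right_flow are average optic-flow magnitudes (e.g. pixels per frame)
    measured over the left and right halves of the visual field; for a robot moving
    at constant speed, larger flow indicates a nearer wall.
    Returns a turn rate (positive = turn left), a convention assumed here.
    """
    total = left_flow + right_flow
    if total == 0.0:
        return 0.0  # no measurable image motion: hold the current heading
    # Normalised flow imbalance in [-1, 1]; positive when the right side moves faster.
    imbalance = (right_flow - left_flow) / total
    return gain * imbalance  # faster flow on the right -> nearer right wall -> turn left
```

Normalising by the total flow makes the response depend on the ratio of the two flows rather than their absolute magnitudes, which is one way such a balancing behaviour can remain robust across different travel speeds.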
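The bearing-only homing described in the abstract can likewise be illustrated with an average-landmark-vector style sketch: a "snapshot" of landmark bearings is stored at the goal, and the robot repeatedly steps along the difference between the current and stored average landmark vectors until the two agree. This is an illustrative approximation under assumed conditions (a compass providing a common reference direction, and the same set of landmarks visible throughout), not necessarily the exact scheme proposed in the thesis; all names below are hypothetical.

```python
import numpy as np

def average_landmark_vector(bearings: np.ndarray) -> np.ndarray:
    """Mean of the unit vectors pointing along each landmark bearing (radians)."""
    return np.stack([np.cos(bearings), np.sin(bearings)]).mean(axis=1)

def landmark_bearings(position: np.ndarray, landmarks: np.ndarray) -> np.ndarray:
    """Bearings of each landmark from `position`, measured against a fixed (compass) axis."""
    offsets = landmarks - position
    return np.arctan2(offsets[:, 1], offsets[:, 0])

def home_incrementally(start, goal, landmarks, step=0.05, tol=1e-2, max_steps=10000):
    """Step toward `goal` using only landmark bearings, never positions directly."""
    landmarks = np.asarray(landmarks, dtype=float)
    snapshot = average_landmark_vector(landmark_bearings(np.asarray(goal, dtype=float), landmarks))
    position = np.asarray(start, dtype=float).copy()
    for _ in range(max_steps):
        # Homing vector: discrepancy between the current view and the stored snapshot.
        h = average_landmark_vector(landmark_bearings(position, landmarks)) - snapshot
        if np.linalg.norm(h) < tol:
            break  # bearings (approximately) match the snapshot, i.e. close to home
        position += step * h / np.linalg.norm(h)
    return position

# Example: four landmarks around the origin; the robot starts away from home.
landmarks = [[0.0, 5.0], [4.0, -3.0], [-5.0, -2.0], [6.0, 4.0]]
print(home_incrementally(start=[3.0, 3.0], goal=[0.0, 0.0], landmarks=landmarks))
```

Stepping along the difference of average landmark vectors amounts to gradient descent on a convex potential (the mean distance to the landmarks plus a linear term), which is one reason such incremental, view-matching schemes home in reliably while the landmarks remain visible.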
dc.language | en | |
dc.publisher | Curtin University | |
dc.subject | visual navigation | |
dc.subject | insect homing | |
dc.subject | autonomous robot navigation | |
dc.title | Visually guided autonomous robot navigation : an insect based approach. | |
dc.type | Thesis | |
dcterms.educationLevel | PhD | |
curtin.thesisType | Traditional thesis | |
curtin.department | School of Computing | |
curtin.identifier.adtid | adt-WCU20020708.133936 | |
curtin.accessStatus | Open access |