Meeting Abstract
A number of experiments have revealed how different visual features are used to guide familiar foraging routes in wood ants. Using these data, we have developed algorithms that extract, from panoramic scenes, the visual features ants use for guidance. Here we examine how these visual cues are extracted, prioritized and stored during navigation. A model was created to simulate navigation in a procedurally generated environment in which the visual cues could be precisely characterized. In these environments, our algorithms extracted and stored the visual cues available during a single Lévy-walk foraging event, and success on subsequent foraging bouts using the stored information was then examined. In addition to the procedurally generated scenes, panoramic images from wooded areas, similar to the foraging terrain of our ants, were used to test our visual perception algorithms in natural scenes. Again, a single Lévy walk was used to simulate the foraging route: the points of the walk were generated in Matlab, and these points determined the sites to be imaged. Following a single walk, the images were analyzed to identify and store visual features. Our algorithms extracted several stable features that could serve as reliable landmarks to facilitate route learning. After analysis and storage of the visual cues extracted from a natural scene, success in finding the goal location on subsequent bouts was examined. Together, these two studies provide insight into how ants extract visual information from complex natural scenes. They give us a platform to test predictions about the reliability and stability of specific visual features within complex, cluttered panoramic scenes, and they show which cues may be stored to facilitate success on repeated foraging bouts.
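For illustration, the Lévy-walk waypoint generation described above can be sketched as follows. The abstract's points were produced in Matlab; this is a minimal Python sketch under assumed parameters (a power-law step-length exponent `mu` and minimum step `l_min` are illustrative choices, not values from the study):

```python
import numpy as np

def levy_walk_points(n_steps, mu=2.0, l_min=1.0, seed=0):
    """Generate 2D waypoints of a Lévy walk.

    Step lengths follow a power law P(l) ~ l^(-mu), drawn by
    inverse-transform sampling; headings are uniform on [0, 2*pi).
    Returns an (n_steps + 1, 2) array of waypoints starting at the origin.
    """
    rng = np.random.default_rng(seed)
    # 1 - random() lies in (0, 1], avoiding division by zero below.
    u = 1.0 - rng.random(n_steps)
    lengths = l_min * u ** (-1.0 / (mu - 1.0))   # heavy-tailed step lengths
    headings = rng.uniform(0.0, 2.0 * np.pi, n_steps)
    steps = np.column_stack((lengths * np.cos(headings),
                             lengths * np.sin(headings)))
    # Cumulative sum of steps gives the sequence of sites to be imaged.
    return np.vstack(([0.0, 0.0], np.cumsum(steps, axis=0)))
```

Each returned waypoint would correspond to one site at which a panoramic image is taken and passed to the feature-extraction algorithms.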