A Perception-aware Architecture for Autonomous Robots


Service robots are required to operate in indoor environments to help humans in their daily lives. To achieve the tasks they may be assigned, robots must be able to autonomously model and interact with the elements of their environment. Even in homes, which are usually more predictable than outdoor scenarios, robot perception remains an extremely challenging task. Clutter, distance and partial views complicate modelling the environment, making it essential for robots to approach the objects they need to perceive in order to gain favourable points of view. This article proposes a novel grammar-based distributed architecture, designed with reusability and scalability in mind, which enables robots not only to find and execute the perception-aware plans they need to achieve their goals, but also to verify that the world representation they build is valid according to a set of grammatical rules for the world model. Additionally, it describes a real-world example of use, providing qualitative results, in which a robot successfully models the room in which it is located and finds a coffee mug.
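The idea of checking a world representation against grammatical rules can be illustrated with a minimal sketch. The node types, rule set, and function names below are hypothetical, chosen only for illustration; they are not the rules or API of the architecture described in the article:

```python
# Toy illustration of validating a symbolic world model against grammatical
# rules. All symbol types and productions here are hypothetical.

# World model: each symbol maps to its parent in the containment hierarchy.
world_model = {
    "room1": None,        # rooms are root symbols
    "table1": "room1",    # furniture hangs from a room
    "mug1": "table1",     # small objects hang from furniture
}

# Grammar: the set of allowed (child type, parent type) productions.
RULES = {
    ("room", None),
    ("table", "room"),
    ("mug", "table"),
}

def symbol_type(name):
    """Strip the trailing instance number, e.g. 'mug1' -> 'mug'."""
    return name.rstrip("0123456789")

def is_valid(model):
    """A model is valid if every parent-child edge matches a production."""
    for child, parent in model.items():
        pair = (symbol_type(child), symbol_type(parent) if parent else None)
        if pair not in RULES:
            return False
    return True

print(is_valid(world_model))                       # True
print(is_valid({"room1": None, "mug1": "room1"}))  # False: no mug-in-room rule
```

In a sketch like this, a plan step that would attach a symbol in a way no production allows can be rejected before execution, which is the spirit of the verification described above.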

Publication DOI: https://doi.org/10.5772/61742
Divisions: College of Engineering & Physical Sciences
Additional Information: © 2015 Author(s). Licensee InTech. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/3.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Publication ISSN: 1729-8814
Last Modified: 16 Apr 2024 07:31
Date Deposited: 08 Jun 2021 13:46
Related URLs: https://journal ... l/10.5772/61742 (Publisher URL)
PURE Output Type: Article
Published Date: 2015-12-01
Accepted Date: 2015-10-08
Authors: Manso, Luis J. (ORCID Profile 0000-0003-2616-1120)
Bustos, Pablo
Bachiller, Pilar
Núñez, Pedro



Version: Published Version

License: Creative Commons Attribution



