The objective of this project is to develop the tools and algorithms necessary for a team of mobile robots equipped with vision sensors to explore and map a rough outdoor environment in a scenario similar to planetary exploration.
To explore an area, the robots must be able to build a map of the environment and to locate themselves within that map. This joint environment-modeling and localization process is called Simultaneous Localization and Mapping (SLAM). It is considered a prerequisite for truly autonomous robots; it has therefore been a central research topic in the mobile robotics community over the past two decades, and it still receives a lot of attention from researchers.
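The joint estimation at the heart of SLAM can be illustrated with a deliberately simplified sketch. Everything here is invented for illustration (the landmark ids, coordinates, and the crude 50/50 pose fusion); a real SLAM system would use a probabilistic filter such as the EKF, or graph optimisation, rather than this averaging scheme:

```python
def slam_step(pose, landmarks, odometry, observations):
    """One SLAM cycle: predict the pose from odometry, then use
    relative observations of landmarks to refine both the map and
    the pose estimate.

    pose         -- (x, y) estimate of the robot position
    landmarks    -- dict id -> (x, y) estimated landmark position
    odometry     -- (dx, dy) measured displacement since the last step
    observations -- dict id -> (rx, ry) landmark position relative to robot
    """
    # Prediction: apply the odometry (noisy in a real robot).
    x, y = pose[0] + odometry[0], pose[1] + odometry[1]

    # Update: each re-observed landmark pulls the pose estimate toward
    # agreement with the map; landmarks seen for the first time are
    # added to the map at the position implied by the current pose.
    for lid, (rx, ry) in observations.items():
        if lid in landmarks:
            lx, ly = landmarks[lid]
            # Pose implied by this observation of a known landmark.
            px, py = lx - rx, ly - ry
            # Blend predicted and observed pose (a crude 50/50 fusion).
            x, y = (x + px) / 2, (y + py) / 2
        else:
            landmarks[lid] = (x + rx, y + ry)
    return (x, y), landmarks

pose, landmarks = (0.0, 0.0), {}
# Step 1: move right by 1, see landmark "A" two units ahead.
pose, landmarks = slam_step(pose, landmarks, (1.0, 0.0), {"A": (2.0, 0.0)})
# Step 2: move right by 1 again; "A" is now one unit ahead.
pose, landmarks = slam_step(pose, landmarks, (1.0, 0.0), {"A": (1.0, 0.0)})
```

The point of the sketch is only that pose and map are estimated together: the odometry prediction and the landmark observations constrain each other, which is what distinguishes SLAM from mapping with a known pose or localization in a known map.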
Mobile robots used for SLAM experiments have mostly relied on sonars or 2D laser range-finders as sensors. These sensors have the advantage of directly producing features that SLAM algorithms can readily use. Until recently, cameras were not at the center of mobile-robot SLAM because of the computational power required by the image-analysis algorithms that produce vision features, and because few cameras could efficiently be embedded in mobile robots. However, vision is an appropriate choice of SLAM sensor: cameras accurately capture the world's geometry, they allow SLAM algorithms to work in 3D environments, and they appeal to humans because vision is the sense we primarily use for navigation. Cameras are also passive sensors, which allows a team of robots to work in the same area without interfering with each other. Nowadays cameras are becoming cheap, ubiquitous, and compact. Furthermore, embedded systems now have the computational power to process the high data rates coming from a camera and to turn them into SLAM features in real time.
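As a sketch of how camera output might become SLAM features, the snippet below matches toy feature descriptors between two frames by nearest-neighbour distance with a ratio test, a common step in vision pipelines. The 4-element descriptors and the 0.8 threshold are assumptions for illustration only; a real system would extract descriptors such as SIFT or ORB from the images first:

```python
def match_features(desc_a, desc_b, ratio=0.8):
    """Return index pairs (i, j) where desc_a[i] matches desc_b[j].

    A candidate is accepted only if its nearest neighbour in desc_b is
    clearly closer than the second-nearest (the ratio test), which
    rejects ambiguous matches.
    """
    def dist(u, v):
        # Euclidean distance between two descriptor vectors.
        return sum((x - y) ** 2 for x, y in zip(u, v)) ** 0.5

    matches = []
    for i, da in enumerate(desc_a):
        ranked = sorted(range(len(desc_b)), key=lambda j: dist(da, desc_b[j]))
        best, second = ranked[0], ranked[1]
        if dist(da, desc_b[best]) < ratio * dist(da, desc_b[second]):
            matches.append((i, best))
    return matches

# Toy data: the first descriptor has a clear counterpart in frame B,
# the second is ambiguous and is filtered out by the ratio test.
matches = match_features(
    [(1.0, 0.0, 0.0, 0.0), (0.0, 1.0, 0.0, 0.0)],
    [(0.9, 0.1, 0.0, 0.0), (0.0, 0.0, 1.0, 0.0), (5.0, 5.0, 5.0, 5.0)],
)
```

Matched features tracked across frames become the landmark observations that a SLAM back-end consumes.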
For an exploration task, a team of robots is clearly more efficient than a single one: the robots can be spread over the area to share the work. Furthermore, this redundancy makes the system more robust: if one robot fails, the others can continue their work and take over the tasks of the failed robot.
© 2015 EPFL, LSRO, Station 9, 1015 Lausanne, tel. +41 21 693 3825
Last modification 09/08/10