Walking robots have an advantage over wheeled robots: they can traverse rough terrain and, in particular, environments built for humans (buildings, stairs, etc.). Different walking approaches for quadrupedal and bipedal robots are explored, along with possible applications of machine learning and ZMP (zero-moment point) based control for generating efficient and stable walking patterns.
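To make the ZMP idea concrete, here is a minimal sketch assuming the widely used linear inverted pendulum model (LIPM); the function names, parameters, and the rectangular support-polygon approximation are illustrative assumptions, not the institute's actual implementation.

```python
G = 9.81  # gravitational acceleration, m/s^2

def zmp(com_pos, com_acc, com_height):
    """ZMP under the LIPM: p = x - (z_c / g) * x_ddot, per horizontal axis.
    com_pos/com_acc are (x, y) position and acceleration of the center of mass."""
    x, y = com_pos
    ax, ay = com_acc
    return (x - com_height / G * ax,
            y - com_height / G * ay)

def is_stable(zmp_point, support_polygon):
    """A gait is dynamically balanced if the ZMP stays inside the support
    polygon (here approximated by the bounding box of the foot vertices)."""
    px, py = zmp_point
    xs = [v[0] for v in support_polygon]
    ys = [v[1] for v in support_polygon]
    return min(xs) <= px <= max(xs) and min(ys) <= py <= max(ys)

# Example: center of mass at 0.8 m, slight forward acceleration in single support
foot = [(-0.05, -0.06), (0.15, -0.06), (0.15, 0.06), (-0.05, 0.06)]
p = zmp(com_pos=(0.05, 0.0), com_acc=(0.5, 0.0), com_height=0.8)
print(p, is_stable(p, foot))
```

A pattern generator plans CoM trajectories so that this condition holds at every time step; accelerating the CoM shifts the ZMP backward, which is why abrupt motions can tip the robot even when the CoM projection is inside the foot.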
Perception is central to autonomous robots: all action planning builds on knowledge extracted from sensor data. Information about the robot's own state (posture, position in space, etc.) comes mainly from accelerometers, gyroscopes, and joint-angle sensors, while information about the robot's environment is extracted mainly from camera systems. This must happen in real time, i.e. 20-30 times per second, to allow fluid tracking of fast movements.
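A standard way to combine accelerometer and gyroscope readings into a posture estimate is a complementary filter: the gyroscope tracks fast changes, the accelerometer's gravity measurement corrects long-term drift. The sketch below is a generic illustration of this technique, not the institute's estimator; the filter constant and loop rate are assumptions.

```python
import math

def complementary_filter(pitch, gyro_rate, accel, dt, alpha=0.98):
    """One update step: blend the gyro-integrated angle (short-term accurate)
    with the gravity-based angle from the accelerometer (drift-free).
    accel = (ax, az) in the sagittal plane; angles in radians."""
    gyro_pitch = pitch + gyro_rate * dt           # integrate angular velocity
    accel_pitch = math.atan2(accel[0], accel[1])  # tilt from gravity direction
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch

# Simulate a 30 Hz sensor loop (the upper end of the 20-30 updates per second
# mentioned above) for a robot held at a constant 0.1 rad forward tilt.
dt, pitch = 1 / 30, 0.0
for _ in range(60):  # two seconds of data
    pitch = complementary_filter(pitch, gyro_rate=0.0,
                                 accel=(math.sin(0.1), math.cos(0.1)), dt=dt)
print(pitch)  # converges toward the true tilt of 0.1 rad
```

The blending weight alpha trades responsiveness against noise rejection: values near 1 trust the gyroscope, smaller values let the accelerometer pull the estimate back faster.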
Mobile robots are becoming increasingly capable thanks to advances in hardware. This applies both to perceiving the environment through sensor technology and to interacting with it through actuators. As a result, the complexity of the tasks that autonomous robots can perform is growing rapidly. At the IRF, methods are being investigated for defining these tasks appropriately for the robot, both to let developers describe complex tasks intuitively and to let the robot adapt its behavior autonomously.