Machine learning for robot perception
Many modern robots are equipped with powerful sensors, such as cameras and lidars, which produce large amounts of high-dimensional data. Robots frequently rely on state-of-the-art deep learning techniques to make sense of such data and extract useful information that guides autonomous intelligent behaviour. In this context, the institute tackles important research challenges such as:
- enabling robots to autonomously learn and adapt their perception models to changing conditions without explicit human supervision, leveraging self-supervised learning techniques;
- deploying perception models on exciting new platforms, such as nano-sized unmanned aerial vehicles, which impose severe constraints on payload, battery life, and power budget;
- pioneering applications to novel, practical and challenging tasks, both for mobile (wheeled, legged or flying) and industrial robots, with special attention to cases in which limited training data is available.
Past work in this area led to exciting demonstrations such as fully onboard vision-based nano drone control in human proximity, robots that automatically learn to perceive long-range obstacles without any external supervision, a quadrotor that follows a forest trail, and traversability estimation models for ground robots on challenging terrain. The same research results also found successful application in industrial R&D projects with companies, tackling challenging tasks such as visual quality inspection systems that automatically adapt to different types of products.
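One of the demonstrations above, learning to perceive long-range obstacles without external supervision, can be illustrated with a self-supervised labelling scheme in which a short-range sensor (e.g. a bumper or proximity sensor) retroactively labels camera frames recorded earlier along the robot's path. The sketch below is purely illustrative: the function and data names are assumptions, not the institute's actual code.

```python
# Minimal sketch of self-supervised label generation: a short-range
# proximity sensor labels earlier camera frames, with no human annotation.
# All names here are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class LogEntry:
    t: float          # timestamp (s)
    frame_id: int     # camera frame captured at time t
    position: float   # robot position along its path (m)

def self_supervised_labels(camera_log, proximity_events, horizon=5.0):
    """Label each camera frame with whether an obstacle was later
    detected by the short-range sensor within `horizon` metres ahead.

    camera_log: list of LogEntry, ordered by position.
    proximity_events: positions (m) at which the short-range sensor fired.
    Returns {frame_id: 1 if an obstacle lay within the horizon, else 0}.
    """
    labels = {}
    for entry in camera_log:
        ahead = any(entry.position < p <= entry.position + horizon
                    for p in proximity_events)
        labels[entry.frame_id] = int(ahead)
    return labels

# Example: one frame per metre travelled; the proximity sensor fires at 7.5 m,
# so frames taken between 2.5 m and 7.5 m are labelled "obstacle ahead".
log = [LogEntry(t=float(i), frame_id=i, position=float(i)) for i in range(10)]
print(self_supervised_labels(log, proximity_events=[7.5]))
```

The labelled frames could then train a vision model that predicts obstacles well beyond the short-range sensor's reach, which is the essence of the self-supervised approach described above.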
Interaction interfaces for humans and robots sharing spaces
Humans who share an environment (such as a corridor, an office, or a factory floor) find it natural to interact and coordinate with each other. This research area aims to develop interaction methods that let humans and robot swarms coordinate with each other just as efficiently.
In past research, this led to algorithms that allow swarms of robots to navigate efficiently among crowds of humans without collisions, and to an award-winning human-robot interaction pipeline that lets an operator wearing a smartwatch localize, identify, select, and guide a robot in their direct line of sight: the operator simply points at the robot to take control of it, then points at its target in the environment. The system works for both ground and flying robots and has clear applications, for example, in disaster robotics scenarios, where human rescuers are expected to interact with rescue robots. More recently, the same approach is being applied to interaction with automation systems and conveyor belts in logistics plants.
Robotics for education
Using robots in education is an interdisciplinary field of research at the intersection of educational sciences and robotics. The lab is part of ‘Introducing People to Research in Robotics through an Extended Peer Community in Southern Switzerland’, a project awarded the Optimus Agora Prize by the Swiss National Science Foundation.
-
Department of Innovative Technologies
Dalle Molle Institute for Artificial Intelligence USI-SUPSI
Polo universitario Lugano - Campus Est, Via la Santa 1
CH-6962 Lugano-Viganello
T +41 (0)58 666 66 66
info@idsia.ch
-
Faculty of Informatics
Università della Svizzera italiana
Polo universitario Lugano, Campus Est, Via la Santa 1
6900 Lugano-Viganello
T +41 (0)58 666 46 90
decanato.inf@usi.ch