Aerial Robots with the Sense of Touch

Published on

Team
Thesis start date (if known)
October 2023
Location
IRISA, Rennes
Research unit
IRISA - UMR 6074
Thesis topic description

Aerial robots (commonly called “drones”) are nowadays extensively used to observe the environment for surveillance and monitoring in applications such as agriculture and mapping. If aerial robots were also able to effectively manipulate the environment by physically interacting with it, the application domains could be extended toward new areas like contact-based inspection, assembly, and construction. Today, however, contact is synonymous with crashing, and is thus avoided. The unstable nature of aerial robots, together with their nonlinear dynamics and limited vision-based perception, makes manipulation in real environments extremely difficult and delicate.

To show the feasibility of Aerial Physical Interaction (APhI), the research community has in recent years focused on the design and control of aerial manipulators, i.e., drones equipped with an onboard manipulator arm. This has opened the door to new applications such as contact-based inspection. However, current methodologies are still limited to very simple interaction tasks, involving limited contact behaviors with static, rigid surfaces (e.g., touching a flat wall with a stick attached to the robot). Furthermore, although the preliminary results are very encouraging, they have been obtained in extremely controlled lab conditions. Current experiments mostly rely on external Motion Capture Systems (MoCap) that guarantee precise state estimation and environmental models. In real scenarios, however, robots can rely only on onboard sensors, which today do not provide anywhere near the required accuracy.

By investigating novel haptic sensing for aerial robots, combined with well-established vision sensing, this thesis has the ambition to drastically enhance aerial physical interaction capabilities for complex manipulation in real environments. We aim to bring aerial robots much closer and faster to real applications than current vision-only approaches can.

For a grounded robot, which is very precise and stable, manipulating articulated objects may be an easy task, but for an aerial robot it is extremely complex. These challenging tasks require in-flight physical interaction with millimeter precision. However, because of their limited payload, aerial robots can carry only lightweight vision-based sensors which, subject to propeller vibrations, provide only centimeter accuracy. Vision-based methods alone are therefore not enough to perform complex manipulation tasks.

In nature, on the other hand, the sense of touch has been shown to be a key element of manipulation. Even humans, with a very accurate vision system and dexterous hands, would struggle to perform manipulation tasks without the sense of touch.

Nevertheless, so far no one has dared to extend haptic and touch capabilities to aerial robots, let alone to exploit touch sensors as direct feedback for manipulation. We will address these challenges in the following work packages:

WP1 - Touch sensor: On one side, we must answer the question of whether, and which, “touch sensors” are feasible for flying robots. The vibrations and air gusts produced by the propellers might drastically deteriorate the measurements, making them useless (this, for instance, was the case with many vision-based sensors that required specific modifications to be used on aerial robots, e.g., event-based cameras). We will review and evaluate different touch sensors mounted on the end-effector of an aerial manipulator. We will analyze the quality of contact and interaction-force measurements, designing appropriate filters and mechanical supports to free the sensor readings from propeller vibrations.
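As a simple illustration of the kind of filtering WP1 calls for, the sketch below applies a first-order low-pass filter to a noisy force signal: propeller vibrations typically sit well above the bandwidth of the contact forces of interest, so a low cutoff suppresses them. The cutoff frequency, sample rate, and vibration frequency are illustrative assumptions, not values from the project.

```python
import math

def lowpass_force(samples, cutoff_hz, sample_rate_hz):
    """First-order low-pass filter (exponential smoothing) for force readings.

    Propeller vibrations lie well above the bandwidth of the contact forces
    we care about, so a low cutoff attenuates them strongly.
    Cutoff and sample rate here are illustrative, not project values.
    """
    dt = 1.0 / sample_rate_hz
    rc = 1.0 / (2.0 * math.pi * cutoff_hz)
    alpha = dt / (rc + dt)  # smoothing factor from the RC time constant
    filtered = []
    y = samples[0]
    for x in samples:
        y = y + alpha * (x - y)  # move a fraction alpha toward the new sample
        filtered.append(y)
    return filtered

# Example: a constant 2 N contact force corrupted by a 100 Hz vibration,
# filtered with a 5 Hz cutoff at a 1 kHz sample rate.
rate = 1000.0
signal = [2.0 + 0.5 * math.sin(2.0 * math.pi * 100.0 * i / rate)
          for i in range(1000)]
smoothed = lowpass_force(signal, cutoff_hz=5.0, sample_rate_hz=rate)
```

A first-order filter is only a starting point; in practice one would compare it against higher-order or notch filters tuned to the propeller rotation frequency.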

WP2 - Haptic-based control: On the other side, we must understand how best to use those measurements. Currently, they are used only to improve models of the environment, on which standard control methods are then applied. This introduces overhead in both computational time and resources. We instead foresee using the sense of touch to directly drive actions. Skin-like sensors produce a matrix in which every cell holds a pressure value; one can notice the similarity with cameras. Skin-like sensors therefore provide “images of the contact”. Inspired by this correlation, we will investigate the similarities and differences further, in order to transfer the rich knowledge on vision-based perception to the poorly explored area of haptic-based perception. We will take inspiration from visual-servoing and reinforcement-learning control methods to define a new haptic-servoing paradigm, in which robot actions are defined to reproduce a desired haptic feeling. The expertise of the Rainbow group in both visual servoing and haptics will be helpful. We want to study whether humans' capacity to use the sense of touch for manipulation can be transferred to robots.
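To make the vision/touch analogy concrete, here is a minimal, hypothetical sketch of one haptic-servoing step: the pressure matrix from a skin-like sensor is treated as a “contact image”, its total pressure and pressure-weighted centroid are used as features (much like image moments in visual servoing), and a proportional law drives them toward desired values. The feature choice, function names, and gain are assumptions made for illustration, not the project's design.

```python
def haptic_features(pressure):
    """Image-moment-style features of a 'contact image':
    total pressure and the pressure-weighted centroid (row, col)."""
    total = sum(sum(row) for row in pressure)
    if total == 0.0:
        return 0.0, 0.0, 0.0
    r = sum(i * p for i, row in enumerate(pressure) for p in row) / total
    c = sum(j * p for row in pressure for j, p in enumerate(row)) / total
    return total, r, c

def haptic_servo_step(measured, desired, gain=0.5):
    """One proportional haptic-servoing step, analogous to visual servoing:
    command = -gain * (measured features - desired features).
    Returns (push_cmd, move_row_cmd, move_col_cmd) in arbitrary units."""
    mt, mr, mc = haptic_features(measured)
    dt_, dr, dc = haptic_features(desired)
    return (-gain * (mt - dt_), -gain * (mr - dr), -gain * (mc - dc))

# Contact is centered and firm in the desired map but light and offset in
# the measured one, so the command says: push harder and slide the
# end-effector toward the desired contact centroid.
desired = [[0.0, 0.0, 0.0],
           [0.0, 4.0, 0.0],
           [0.0, 0.0, 0.0]]
measured = [[1.0, 0.0, 0.0],
            [0.0, 0.0, 0.0],
            [0.0, 0.0, 0.0]]
cmd = haptic_servo_step(measured, desired)
```

Mapping the resulting feature-space command to rotor thrusts would require the full aerial-manipulator dynamics; this sketch only shows how a contact image can define an error signal in the same spirit as visual servoing.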

WP3 - Experimental evaluation: The proposed methodologies will be validated on real aerial manipulators. Their effectiveness will be demonstrated through complex interaction tasks in real-world scenarios, such as opening a door using only onboard sensors and minimal knowledge of the environment. We will further assess their practical value by investigating their performance when sensor accuracy degrades, as well as their limitations.

Bibliography
  1. A. Ollero, M. Tognon, A. Suarez, D. J. Lee, and A. Franchi, “Past, Present, and Future of Aerial Robotic Manipulators,” IEEE Transactions on Robotics, 2021.
  2. M. Tognon, H. A. Tello-Chavez, E. Gasparin, Q. Sablé, D. Bicego, A. Mallet, M. Lany, G. Santi, B. Revaz, J. Cortés, and A. Franchi, “A Truly-Redundant Aerial Manipulator System With Application to Push-and-Slide Inspection in Industrial Plants,” IEEE Robotics and Automation Letters, vol. 4, no. 2, pp. 1846-1851, 2019.
  3. M. Hamandi, F. Usai, Q. Sablé, N. Staub, M. Tognon, and A. Franchi, “Design of Multirotor Aerial Vehicles: A Taxonomy Based on Input Allocation,” International Journal of Robotics Research, vol. 40, no. 8-9, pp. 1015-1044, 2021.
  4. G. Nava, Q. Sablé, M. Tognon, D. Pucci, and A. Franchi, “Direct Force Feedback Control and Online Multi-Task Optimization for Aerial Manipulators,” IEEE Robotics and Automation Letters, vol. 5, no. 2, pp. 331-338, 2020.
  5. K. Bodie, M. Tognon, and R. Siegwart, “Dynamic End Effector Tracking with an Omnidirectional Parallel Aerial Manipulator,” IEEE Robotics and Automation Letters, vol. 6, no. 4, pp. 8165-8172, 2021.
  6. M. Tognon, B. Yüksel, G. Buondonno, and A. Franchi, “Dynamic Decentralized Control for Protocentric Aerial Manipulators,” in 2017 IEEE International Conference on Robotics and Automation (ICRA), Singapore, 2017.
  7. P. Robuffo Giordano, Q. Delamare, and A. Franchi, “Trajectory Generation for Minimum Closed-Loop State Sensitivity,” in IEEE International Conference on Robotics and Automation (ICRA), pp. 286-293, Brisbane, Australia, May 2018.
  8. P. Brault, Q. Delamare, and P. Robuffo Giordano, “Robust Trajectory Planning with Parametric Uncertainties,” in IEEE International Conference on Robotics and Automation (ICRA), pp. 11095-11101, Xi'an, China, May 2021.

List of thesis supervisors

Last name, first name
Robuffo Giordano, Paolo
Type of supervision
Thesis director
Research unit
UMR 6074
Team

Last name, first name
Tognon, Marco
Type of supervision
Co-supervisor
Research unit
Inria
Team
Contact(s)
Name
Robuffo Giordano, Paolo
Email
prg@irisa.fr
Keywords
drones, vision, haptics, physical interaction with the environment