Augmenting the interaction with everyday objects using wearable haptics and Augmented Reality

Thesis start date (if known)
October 2021
Thesis subject description

Context

One of the presence-breaking factors when interacting with an artificial world is the lack of haptic feedback. There are many ways of simulating haptic sensations in virtual and augmented environments, e.g., using actuated devices known as force-feedback or tactile interfaces, or using passive props, also known as tangible objects.

Tangible objects are known to be very effective at providing human users with haptic information about distributed shape and weight (see Fig. 1 for an example of a user interacting with a tangible cube). Although rather effective, developing multiple, ad-hoc, haptic-enabled tangible objects for each scenario requires a significant amount of work. Moreover, actuated tangible objects can be expensive and complex to build.

Fig. 1. User interaction in VR using a tangible cube [Harley et al. (2017)].

 
Wearable haptics refers to haptic interfaces that can be easily and comfortably worn by the user. They provide haptic sensations localized at the skin, including pressure, vibrotactile, and skin stretch stimuli. Although rather popular, since they only provide tactile stimulation (no kinesthetic/force feedback), they cannot simulate stiff contacts or convey information about distributed shape.


Fig. 2. Wearable haptic interface for the palm [Trinitatova and Tsetserukou (2019)]

Subject

Imagine sitting at your desk. There are a few pens, a notebook, a large screen, a keyboard, and other everyday objects. Through an AR headset, your surroundings are visually augmented with a plethora of virtual objects grounded on the real, tangible environment, e.g., the notebook gets augmented with buttons and control knobs, the pens become drumsticks, and the keyboard turns into a large drum cymbal. Through wearable haptics, the surroundings are haptically augmented so as to match the physical characteristics anticipated by this visual augmentation, e.g., you can now feel the relief of the buttons and knobs on the notebook, perceive the weight of the drumsticks, and experience vibrations when hitting the cymbal.

The objective of this Ph.D. is to study how we can improve interaction with augmented reality environments by taking the best of two simple haptic solutions: tangible objects and wearable haptics. Tangible objects will provide the global, distributed shape percept of the virtual/augmented object, while wearable haptics will dynamically alter its perceived mechanical properties (e.g., stiffness, friction, texture).
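To make this combination concrete, here is a minimal sketch of how such visuo-haptic augmentation could be prototyped in Unity (one of the tools mentioned below). It is only an illustration under stated assumptions, not an existing implementation: the component name, the "TangibleObject" tag, and the one-byte serial protocol to an Arduino-driven vibrotactile ring are all hypothetical. The idea is simply to trigger a vibrotactile cue on the wearable device whenever the tracked fingertip touches a tangible object that has been visually augmented in AR.

    // Minimal sketch (assumptions flagged below), not the project's actual code.
    using System.IO.Ports;
    using UnityEngine;

    public class TangibleContactVibration : MonoBehaviour
    {
        [Tooltip("Serial port of the Arduino driving the vibrotactile ring (assumed setup).")]
        public string vibroRingPort = "COM3";
        [Range(0f, 1f)] public float amplitudeOnContact = 0.8f;

        private SerialPort port;

        void Start()
        {
            // Open the (assumed) serial link to the wearable actuator.
            port = new SerialPort(vibroRingPort, 115200);
            port.Open();
        }

        // Called by Unity physics when the fingertip collider enters the tangible's trigger volume.
        void OnTriggerEnter(Collider other)
        {
            if (other.CompareTag("TangibleObject"))
                SendAmplitude(amplitudeOnContact);   // brief "tap" cue at contact
        }

        void OnTriggerExit(Collider other)
        {
            if (other.CompareTag("TangibleObject"))
                SendAmplitude(0f);                   // stop vibration when contact ends
        }

        void SendAmplitude(float a)
        {
            // Hypothetical 1-byte protocol: 0..255 maps to motor amplitude.
            port.Write(new byte[] { (byte)(Mathf.Clamp01(a) * 255f) }, 0, 1);
        }

        void OnDestroy() { if (port != null && port.IsOpen) port.Close(); }
    }

In a real prototype, the binary on/off cue would likely be replaced by richer, possibly data-driven vibrotactile patterns matched to the virtual material assigned to the tangible object.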

The project will proceed by developing four main key aspects:

  • study the perceptual interaction and conflicts between vision and haptics when interacting with passive tangible objects in AR/MR;

  • design interaction techniques combining the capabilities of wearable haptics (vibrotactile rings, pressure fingertip displays) and tangible objects for AR/MR, addressing the known limitations of the proposed system, e.g., under-actuation, limited range of forces, and visual occlusions;

  • develop use cases augmenting the shape, stiffness, and texture of everyday objects through wearable haptics and AR, focusing on representative job (e.g., a desk with a computer) and household (e.g., a dinner table with plates and cutlery) environments (a minimal sketch of the stiffness-augmentation idea is given after this list);

  • evaluate the performance and user experience of the proposed augmentation approach on human subjects.
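As referenced in the third point above, here is a hedged sketch of the stiffness-augmentation idea, in the spirit of De Tinguy et al. (2018) and Salazar et al. (2020) cited below: a rigid tangible surface is overlaid with a virtual compliant layer, and a wearable fingertip display renders a pressure cue that grows with the virtual penetration depth, so the rigid prop can be perceived as softer. The PressureDisplay class and its SetPressure() method are hypothetical placeholders for the driver of whichever wearable device is eventually chosen.

    // Hedged sketch of stiffness augmentation on top of a rigid tangible object.
    using UnityEngine;

    public class VirtualStiffnessLayer : MonoBehaviour
    {
        public Transform fingertip;           // tracked fingertip (e.g., from hand tracking)
        public Transform tangibleSurface;     // top surface of the real, rigid object
        public float layerThickness = 0.01f;  // virtual compliant layer, in meters
        public float virtualStiffness = 300f; // N/m, stiffness we want the user to perceive
        public PressureDisplay display;       // hypothetical wearable fingertip actuator driver

        void Update()
        {
            // Signed height of the fingertip above the tangible surface, along its normal.
            float height = Vector3.Dot(fingertip.position - tangibleSurface.position,
                                       tangibleSurface.up);

            // Penetration into the virtual compliant layer sitting on top of the rigid prop.
            float penetration = Mathf.Clamp(layerThickness - height, 0f, layerThickness);

            // Render a pressure cue proportional to penetration (simple Hooke-like model).
            display.SetPressure(virtualStiffness * penetration);
        }
    }

    // Stub for the wearable fingertip display driver (assumed interface; in Unity this
    // class would live in its own file).
    public class PressureDisplay : MonoBehaviour
    {
        public void SetPressure(float newtons)
        {
            // A real driver would command the actuator; here we only log the target value.
            Debug.Log($"Commanded fingertip pressure: {newtons:F2} N");
        }
    }

The linear spring model is a deliberately simple perceptual proxy; part of the Ph.D. work would be to study which rendering laws actually shift the perceived stiffness of the underlying rigid object.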

We are looking for excellent, highly motivated students interested in Virtual/Augmented/Mixed Reality and haptics, with a computer science background and previous experience in computer programming (C#, C++). Experience with VR/AR tools (e.g., Unity 3D, ARToolKit, Oculus Rift, HoloLens) and single-board/embedded computers (e.g., Arduino, Raspberry Pi) is considered a plus.

The student should be able to speak and write in English.

The team

The Ph.D. work will be carried out in the RAINBOW team at IRISA, whose research focuses on human-machine interaction and sensor-based robotics. Its activities revolve around the idea of (shared) cooperation between machines and humans: on the one hand, empowering robots with a large degree of autonomy, allowing them to operate effectively in non-trivial environments (e.g., outside completely defined factory settings); on the other hand, keeping human users in the loop so that they remain in control of some aspects of the overall robot behavior. The three research axes of the team are: Optimal and Uncertainty-Aware Sensing, Advanced Sensor-based Control, and Haptics for Robotics Applications.

Bibliography

Trinitatova, Daria, and Dzmitry Tsetserukou. "DeltaTouch: a 3D Haptic Display for Delivering Multimodal Tactile Stimuli at the Palm." 2019 IEEE World Haptics Conference (WHC). IEEE, 2019.

Harley, Daniel, et al. "Tangible VR: Diegetic tangible objects for virtual reality narratives." Proceedings of the 2017 Conference on Designing Interactive Systems. 2017.

Salazar, Steeven Villa, et al. "Altering the stiffness, friction, and shape perception of tangible objects in virtual reality using wearable haptics." IEEE Transactions on Haptics 13.1 (2020): 167-174.

De Tinguy, Xavier, et al. "Enhancing the stiffness perception of tangible objects in mixed reality using wearable haptics." 2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR). IEEE, 2018.

List of thesis supervisors

Last name, First name
Maud MARCHAL
Type of supervision
Thesis director
Research unit
UMR6074

Last name, First name
Claudio PACCHIEROTTI
Type of supervision
Co-supervisor
Research unit
UMR6074
Contact(s)