With the recent development of character simulation and virtual reality, avatars (also called virtual characters, or virtual representations of users) are increasingly used in a wide range of applications: training users in procedures or motor tasks, interacting with other users to access digital data, or enhancing entertainment and art. In most cases, these avatars serve to increase engagement by embodying users in their avatars [KGS12]. Embodiment, i.e., the capacity to consider an avatar as our virtual self in the virtual world, is also essential in these applications to increase immersion and presence (the feeling of being there). However, current avatars often fail to convey appropriate levels of embodiment, either because of inadequate control mechanisms (e.g., the avatar's body motion does not match the user's motions) or because the avatar cannot convey physical information about the environment (e.g., the effort involved in lifting virtual objects).
This PhD aims to explore how to create avatars that enhance user acceptance and improve sensory appreciation of the environment through (1) natural user control and (2) physicalization of the avatar. Such avatars should enhance user immersion and embodiment by providing seamless control of the avatar and the ability to perceive the physical properties of the virtual environment.
The first objective of this PhD is to design new control methods for animating an avatar in real time from the user's body motion. The main research question is how to capture and interpret the complexity of human motion. The challenges are multiple: 1) the range of possible actions is infinite; 2) the choice of hardware influences the quality of the input (e.g., high-quality motion capture systems such as Vicon are extremely expensive, whereas low-cost systems such as Kinect are less accurate); and 3) user and avatar morphologies usually differ, requiring motion retargeting (e.g., the reaching capabilities of users and avatars might be incompatible). In contrast with systems based on expensive full-body tracking, this PhD aims to design novel avatar controls (e.g., mapping methods and transfer functions) based on a reduced set of tracked joints. The goal is to facilitate their use in interactive VR applications, similarly to what has been done to simplify the complex simultaneous capture of finger and body motions for games [HRM12]. These control mechanisms will be evaluated and validated experimentally based on their efficiency in terms of control, presence and embodiment [DMH15].
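As an illustration of the kind of mapping such controls could rely on, the following minimal sketch retargets a single tracked hand position to an avatar with a different arm length (the function and parameter names are hypothetical and not taken from the project; a real system would handle full joint hierarchies and orientation):

```python
import numpy as np

def retarget_reach(user_hand_pos, user_shoulder_pos, user_arm_len, avatar_arm_len):
    """Map a tracked user hand position to an avatar hand position.

    Hypothetical example: the shoulder-to-hand vector is scaled by the
    ratio of arm lengths, so that a full user reach produces a full
    avatar reach despite differing morphologies.
    """
    hand = np.asarray(user_hand_pos, dtype=float)
    shoulder = np.asarray(user_shoulder_pos, dtype=float)
    scale = avatar_arm_len / user_arm_len  # morphology transfer ratio
    return shoulder + scale * (hand - shoulder)

# A user with a 0.60 m arm fully extends the hand forward; the avatar,
# whose arm is 0.75 m, also fully extends its (longer) arm.
avatar_hand = retarget_reach([0.6, 0.0, 0.0], [0.0, 0.0, 0.0], 0.6, 0.75)
```

Such a per-limb scaling is only one possible transfer function; part of the research question is precisely which mappings best preserve control, presence and embodiment when only a few joints are tracked.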
The second objective of this PhD is to enhance the capacity of avatars to convey relevant information about the properties of the virtual environment through "full-body pseudo-haptic feedback" (introduced in [JAO14]). Pseudo-haptic feedback [Lec09] is an approach that aims to simulate a wide range of haptic sensations in the absence of haptic devices. For example, by combining visual feedback with the user's action in a synchronized way, visuo-haptic illusions can be created (e.g., the sensation that an object is heavy). The goal is to take advantage of this approach and propose novel kinds of pseudo-haptic avatar-based interactions by decoupling the user's actions from the avatar's reactions, in order to convey a "distorted perception", or pseudo-haptic feedback, to the user (e.g., of kinematic or dynamic properties of the interaction). Several tasks could be explored, such as the manipulation of virtual objects (lifting, pushing, throwing, stretching, etc.) or the navigation and locomotion of the avatar in virtual environments (walking, running, jumping, etc.). These novel pseudo-haptic techniques will also be evaluated and validated experimentally based on their efficiency in terms of perception, control, presence and embodiment. In addition, the final goal will be to evaluate the capacity of the different levels of control mechanisms (from objective 1) to elicit different types of pseudo-haptic feedback.
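The decoupling principle can be sketched with a simple control/display ratio, as in classical pseudo-haptic weight illusions (a minimal sketch; the function name and the linear mass-to-ratio mapping are illustrative assumptions, not the project's method):

```python
def pseudo_haptic_lift(user_displacement, virtual_mass, reference_mass=1.0):
    """Hypothetical pseudo-haptic weight illusion for a lifting task.

    The avatar's lifting motion is scaled by a control/display (C/D)
    ratio derived from the virtual object's mass: the heavier the object,
    the smaller the avatar displacement for the same user motion, which
    can induce an illusion of weight without any haptic device.
    """
    cd_ratio = reference_mass / virtual_mass  # < 1 for "heavy" objects
    return cd_ratio * user_displacement

# A 10 cm user lift of a 2 kg virtual object moves the avatar hand 5 cm.
avatar_displacement = pseudo_haptic_lift(0.10, 2.0)
```

The same decoupling scheme could, in principle, be applied to other quantities (walking speed, jump height, stretching amplitude), which is what the "full-body" extension of pseudo-haptics investigates.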
Results and Impact
This PhD aims to improve interactions with virtual environments through the use of avatars, by taking into account the entire interaction loop, from accurate control of the avatar to improved feedback provided to the user through "physicalized" avatars. The results obtained in this PhD could be directly transferred to interactive applications (from high-end VR applications to consumer-market games), including the local ImmerStar virtual reality platform, which provides immersive services to users from various research teams. We therefore expect the results to have a potentially wide impact.