Perception-based Virtual Human Motion Personalisation

Team and supervisors
Department / Team: 
Team website: 
http://team.inria.fr/mimetic
Thesis supervisor
Franck Multon
Co-supervisor(s), co-advisor(s)
Ludovic Hoyet
Contact(s)
Name / E-mail address / Phone
Ludovic Hoyet
ludovic.hoyet@inria.fr
0299842521
Thesis subject
Description

Context

The goal of this project is to propose new methods to personalise the motions of virtual characters. Such characters are now a requisite to create increasingly lifelike virtual worlds, and are widely used in industries ranging from entertainment to training and education. While many factors influence their realism (e.g., appearance [MLD*08,MLH*09], behaviours, motions [HOKP16], clothes, etc.), the goal of this PhD is to focus on creating natural personalised motions. As people do not perform actions in precisely the same manner as one another, nor even in the same manner every time, variations in natural motions are important to create increasingly believable virtual humans.
 

Although the visual realism of virtual human motions has drastically improved over the last decades, especially thanks to improvements in motion-capture-based approaches, current animation techniques still create a certain uniformity of motion across characters. For single individuals (e.g., a main character), displaying the same generic motions for all users can limit their engagement, as the motions are not personalised for any user. Similarly, the absence of variation in large groups of individuals also affects realism when they all move in the same manner. Tremendous amounts of manual artistic work can indeed create such variations, which undeniably improves overall realism (e.g., crowds in computer-generated movies like Warcraft, Star Wars or The Hobbit); however, it is still impossible to automatically create such levels of personalisation for interactive applications.

Objectives

The goal of this PhD is therefore to explore the creation of new models for personalising human motions for different users and characters. Unlike existing approaches, which rely on learning statistical models to create variations [HPP05,LBK09], the goal is to explore the creation of variations from a perceptual point of view. Based on some of our recent results exploring the factors that make biological human motions recognisable and appealing [HRZ*13], the challenge is to express variations as differences from what users consider to be an average motion.
 

Therefore, the first objective of this PhD is to define perception-based average motions. Amongst other results, our previous study found that creating average motions (from 15 actors) always resulted in motions considered to be amongst the least distinctive and most attractive. This result was consistent across walking and jogging motions, as well as male and female ones, and corroborated findings in Psychology on average faces. However, in traditional approaches used to create variations, such average motions are computed as simple numerical averages. Therefore, the first challenge of this PhD is to explore what defines natural average motions from a perceptual point of view. Then, the second objective is to explore methods to produce natural motion personalisation as perceptual differences from these average motions. The challenge consists in understanding and determining which types of differences from real human data lead to perceptually personalised motions, as opposed to those which do not (i.e., are not visually different), and in relating the former to physiological parameters of the users (e.g., age, weight, height).
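To make the contrast concrete, the numerical averaging that traditional approaches rely on can be sketched as follows. This is a minimal illustration, not part of the proposed work: function names and the data layout (one joint-angle array per actor, one row per frame) are assumptions for the example, and motions are first time-normalised so that corresponding phases of the action line up.

```python
import numpy as np

def time_normalise(motion, n_frames=100):
    """Resample a (frames, channels) joint-angle trajectory to a fixed length
    by linear interpolation, so motions of different durations can be compared."""
    src = np.linspace(0.0, 1.0, motion.shape[0])
    dst = np.linspace(0.0, 1.0, n_frames)
    return np.stack(
        [np.interp(dst, src, motion[:, c]) for c in range(motion.shape[1])],
        axis=1,
    )

def average_motion(motions, n_frames=100):
    """Numerical average of several actors' motions after time normalisation."""
    return np.mean([time_normalise(m, n_frames) for m in motions], axis=0)

def variation_from_average(motion, avg):
    """Express an individual motion as a per-frame offset from the average;
    a perceptual approach would instead ask which of these offsets are visible."""
    return time_normalise(motion, avg.shape[0]) - avg
```

Such a pipeline treats all deviations from the average equally, whereas the objective above is precisely to distinguish the deviations that viewers perceive as personalising a motion from those they cannot see.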

Results and Impact

The aim of this PhD is to propose a new model to interactively personalise the motions of virtual characters, based on physiological parameters and drawing on perceptual insights to produce natural motions. Such results would be highly beneficial for improving the naturalness of applications using virtual characters, especially in the current context of the democratisation of consumer virtual reality, with potential applications in numerous other fields using virtual humans, such as rehabilitation, virtual therapies, ergonomics, etc. For instance, patient-personalised motions in health-related applications might improve patients' acceptance of their virtual representation, such as in the case of body representation therapies. This work would also contribute to the local ImmerStar virtual reality platform by providing novel methods to produce more natural personalised human motions.

Bibliography

[HOKP16]    L. Hoyet, A.-H. Olivier, R. Kulpa and J. Pettré. 2016. Perceptual Effect of Shoulder Motions on Crowd Animations. In ACM Transactions on Graphics (SIGGRAPH 2016), 35(4).

[HRZ*13]    L. Hoyet, K. Ryall, K. Zibrek, H. Park, J. Lee, J. Hodgins and C. O'Sullivan. 2013. Evaluating the Distinctiveness and Attractiveness of Human Motions on Realistic Virtual Bodies. In ACM Transactions on Graphics (SIGGRAPH Asia 2013), 32(6).

[HPP05]    E. Hsu, K. Pulli and J. Popović. 2005. Style Translation for Human Motion. In ACM Transactions on Graphics (SIGGRAPH 2005), 24(3).

[LBK09]    M. Lau, Z. Bar-Joseph and J. Kuffner. 2009. Modeling Spatial and Temporal Variation in Motion Data. In ACM Transactions on Graphics (SIGGRAPH Asia 2009), 28(5).

[MLD*08]    R. McDonnell, M. Larkin, S. Dobbyn, S. Collins and C. O'Sullivan. Clone Attack! Perception of Crowd Variety. In ACM Transactions on Graphics (SIGGRAPH 2008), 27(3), 2008.

[MLH*09]    R. McDonnell, M. Larkin, B. Hernandez, I. Rudomin and C. O'Sullivan. Eye-catching Crowds: Saliency based Selective Variation. In ACM Transactions on Graphics (SIGGRAPH 2009), 28(3), 2009.

Start of work: 
As soon as possible
Keywords: 
Virtual Characters, Human Motion, Perception, User Experimentation
Location: 
IRISA - Campus universitaire de Beaulieu, Rennes