Head of team
Olivier LE MEUR (Faculty researcher, Université de Rennes 1)

PERCEPT: Visual Behavior of Different Populations - Computational Visual Perception and Applications

The research of the PERCEPT team deals with applied visual perception. This is a cross-disciplinary project spanning computer science, cognitive science and vision science. We consider that visual phenomena are induced not only by the stimulus itself but also by the observer as well as by the intent of the creative director. Our research objectives are to study these three components independently but, more importantly, jointly. This last point is the most important scientific challenge for the PERCEPT team. It requires understanding the visual behavior of observers under many viewing conditions, encompassing a variety of content (e.g. natural images, webpages, comics, paintings...) and a variety of styles and aesthetics.

Visual behavior mainly refers to overt attention, which, by definition, involves eye movements. Observing and understanding eye movements is therefore fundamental to the team's objectives. Indeed, eye movements are an overt manifestation of where we look and reflect, to some extent, the visual processing involved in the perception of our visual environment. It is often said that eye movements are a window on the mind. The way we look at a visual scene and the way we jump from one location to another may indicate the task we are trying to perform, our gender, our age, and whether or not we suffer from a visual disorder.

We believe that monitoring, understanding and modelling the factors influencing our visual behavior are fundamental to opening up new avenues for the development of cutting-edge computer-based applications and content tailored to different groups of people.


Reporting institution
Université de Rennes 1
Campus de Beaulieu, Rennes (35)
Activity reports
percept2019.pdf (796.64 KB)
percept2018.pdf (784.89 KB)