A Perceptual-driven Approach to Film Editing

Team and supervisors
Department / Team: 
Team Web Site: http://people.irisa.fr/Olivier.Le_Meur/
PhD Director
Olivier Le Meur
Co-director(s), co-supervisor(s)
Marc Christie
Contact(s)
Name: Olivier Le Meur
Email address: olemeur@irisa.fr
Phone number: 02 99 84 74 25
PhD subject
Abstract

Recent years have witnessed the rise of perceptual studies in the analysis of moving images [1]. Theories such as attentional synchrony [2] have played a key role in demonstrating how spectators' attention is driven within a shot (a continuous camera sequence) and, more importantly, across shots, thereby providing foundations to explain long-used empirical editing techniques adopted by filmmakers (i.e. how and when cuts should be performed between viewpoints). Studies of attentional synchrony have indeed shown that spectators' gaze exhibits limited spatial dispersion on the screen, and that this dispersion is affected by a number of factors, including film editing, shot composition, and camera motion [3]. Interestingly, the recent development of computational attention techniques [4,5], which aim to predict where and in which order the gaze explores the spatial extent of an image, could be a key ingredient for automatically monitoring the quality of an edited film sequence, and furthermore for optimizing the edit of a film sequence with respect to visual continuity and the aesthetic intentions of a film director.
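
To make attentional synchrony operational, a simple proxy is the spatial dispersion of observers' gaze points within a frame. The following minimal Python sketch is our own illustration; the metric choice (mean distance to the gaze centroid) is an assumption and is not prescribed by [2,3]:

```python
import numpy as np

def gaze_dispersion(gaze_points):
    """Spatial dispersion of observers' gaze in one frame.

    gaze_points: array of shape (n_observers, 2) holding (x, y) screen
    coordinates. Returns the mean Euclidean distance to the gaze
    centroid; low values indicate high attentional synchrony.
    """
    gaze_points = np.asarray(gaze_points, dtype=float)
    centroid = gaze_points.mean(axis=0)
    return float(np.linalg.norm(gaze_points - centroid, axis=1).mean())

# Illustrative example: twelve observers clustered on the same region
# of a 1280x720 frame yield a small dispersion (strong synchrony).
rng = np.random.default_rng(0)
points = rng.normal(loc=(640, 360), scale=15.0, size=(12, 2))
print(gaze_dispersion(points))
```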

Our objective in this PhD is twofold.

First, we aim to devise a computational perceptual model capable of evaluating the continuity of an edited sequence of shots by measuring attentional synchrony. To this end, we first propose a study using gaze-tracking techniques on a large collection of edits. These edits will be constructed by generating synthetic movies with the rule-based automated editing techniques developed by the Mimetic team [6]. The interest of such techniques lies in their capacity to generate a large number of edit variations with fine control over editing parameters, in stark contrast with existing approaches, which are mostly based on real footage and manual edits. This study will serve to calibrate our computational model; an illustrative sketch of such a continuity measure is given below. The challenge consists in identifying the factors and editing rules that control observers' gaze dispersion or focalization.
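
As a sketch of how a calibrated model might score the continuity of a single cut, assuming per-frame gaze recordings for a panel of observers (the data layout, window size, and ratio-based score below are illustrative assumptions, not the project's final design):

```python
import numpy as np

def frame_dispersion(points):
    # Mean distance of gaze points, shape (n_observers, 2), to their centroid.
    points = np.asarray(points, dtype=float)
    return float(np.linalg.norm(points - points.mean(axis=0), axis=1).mean())

def cut_continuity_score(gaze_by_frame, cut_frame, window=12):
    """Ratio of mean gaze dispersion after a cut to the dispersion before it.

    gaze_by_frame: dict mapping a frame index to an (n_observers, 2)
    array of gaze points. A ratio close to 1 suggests the cut preserved
    attentional synchrony; a markedly larger ratio suggests observers
    had to re-search the frame after the cut.
    """
    before = [frame_dispersion(gaze_by_frame[f])
              for f in range(cut_frame - window, cut_frame)]
    after = [frame_dispersion(gaze_by_frame[f])
             for f in range(cut_frame, cut_frame + window)]
    return float(np.mean(after) / max(np.mean(before), 1e-6))
```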

Our second objective is to use this computational model to automatically generate an edit from a collection of shots. We will not only explore means of controlling the gaze across shots through reframing and cutting, but also investigate how these choices influence viewers' experiences and emotions. For instance, the experience of fear is amplified when the camera focuses on people who are scared [7]. By relying on machine learning techniques (e.g. deep learning), the PhD will explore means of correlating filmic techniques, gaze dispersion, and their impact on spectators.
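
As a purely illustrative sketch of this learning step, a standard regressor could map editing parameters to the gaze dispersion a cut induces; the feature set and all numeric values below are invented for illustration and do not come from any experiment:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Hypothetical per-cut features: shot length (s), camera motion
# magnitude, and on-screen displacement of the main subject across
# the cut. In the project these would come from the editing system [6].
X = np.array([
    [4.2, 0.1, 0.05],
    [1.1, 0.8, 0.40],
    [2.5, 0.3, 0.12],
    [0.8, 0.9, 0.55],
])
# Invented post-cut gaze dispersion values standing in for the
# eye-tracking measurements collected during the calibration study.
y = np.array([18.0, 95.0, 32.0, 120.0])

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X, y)

# Predict the dispersion a candidate edit would induce, so that an
# automated editor can favor cuts that keep observers synchronized.
print(model.predict([[1.5, 0.5, 0.20]]))
```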

Bibliography

[1] Society for Cognitive Studies of the Moving Image, http://scsmi-online.org/

[2] Smith, T. J. (2006). An attentional theory of continuity editing. Doctoral thesis, University of Edinburgh.

[3] Smith, T. J., & Mital, P. K. (2013). Attentional synchrony and the influence of viewing task on gaze behavior in static and dynamic scenes. Journal of Vision, 13(8). ISSN 1534-7362.

[4] Le Meur, O., & Liu, Z. (2015). Saccadic model of eye movements for free-viewing condition. Vision Research, 116, 152-164.

[5] Le Meur, O., & Coutrot, A. (2016). Introducing context-dependent and spatially-variant viewing biases in saccadic models. Vision Research, 121, 72-84.

[6] Galvane, Q., Ronfard, R., Lino, C., & Christie, M. (2015). Continuity editing for 3D animation. In Proceedings of the AAAI Conference on Artificial Intelligence.

[7] Vorderer, P., Wulff, H. J., & Friedrichsen, M. (2013). Suspense: Conceptualizations, theoretical analyses, and empirical explorations. Routledge.

Keywords: 
Eye-tracking analysis, visual perception, visual dispersion behaviors, gaze-based motion behaviors, film editing
Place: 
IRISA - Campus universitaire de Beaulieu, Rennes