Cornell University Program of Computer Graphics
Merging live video with synthetic imagery. Jeremy Adam Selan.
Master's thesis, Cornell University, 2003.
Merging live action and synthetic imagery in a realistic manner is becoming increasingly important, and has become ubiquitous in filmmaking and television. However, current techniques for merging live and synthetic imagery typically produce either low-quality images in real time or high-quality results through off-line processing. The ability to realistically merge live and synthetic imagery in real time would enable many new applications, particularly in the virtual set, augmented reality, military, and entertainment industries.
This thesis presents a framework for the realistic, real-time merging of live action with synthetic imagery by analyzing the three major components of the process. First, we argue for the necessity of using synthetic imagery that mimics the visual complexity of the real world. Second, we demonstrate that attention must be paid to acquiring live action that is visually compatible with the synthetic imagery, with emphasis on matching specific illumination characteristics. Finally, we propose a compositing mechanism that accounts for the missing visual interaction between the live and synthetic components, including occlusion, shadowing, and reflection.
We present real-time implementations of two virtual set systems that adhere to these principles. An image-based renderer generates realistic imagery in limited interaction environments, while a software-based ray-engine simulates physically based, dynamic environments. Illumination in the live and synthetic environments is matched by manipulating the histogram characteristics of the live video. Finally, we implement a compositing system that accounts for the visual interactions between the live and synthetic imagery, synthesizing inter-reflections and shadows using a silhouette reprojection technique.
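The histogram manipulation mentioned above can be illustrated with standard histogram matching: remapping the intensities of the live video frame so that its empirical distribution follows that of a reference rendering. The sketch below is a generic grayscale version and is not taken from the thesis; the function name, the uint8 single-channel assumption, and the per-frame lookup-table approach are illustrative assumptions.

```python
import numpy as np

def match_histogram(live: np.ndarray, reference: np.ndarray) -> np.ndarray:
    """Remap the intensity histogram of `live` to match `reference`.

    Both inputs are uint8 grayscale images; a color version would apply
    this per channel (or in a luminance channel only).
    """
    # Empirical CDFs of both images over the 256 intensity levels.
    live_hist = np.bincount(live.ravel(), minlength=256)
    ref_hist = np.bincount(reference.ravel(), minlength=256)
    live_cdf = np.cumsum(live_hist) / live.size
    ref_cdf = np.cumsum(ref_hist) / reference.size

    # For each live intensity, pick the reference intensity whose CDF
    # value first reaches it, yielding a monotone lookup table.
    lut = np.searchsorted(ref_cdf, live_cdf).clip(0, 255).astype(np.uint8)

    # Apply the lookup table to every pixel.
    return lut[live]
```

Because the mapping is a 256-entry lookup table computed once per frame, applying it is cheap enough for real-time video rates, which matches the real-time constraint the systems above operate under.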