Light Measurement and Image Validation
Cornell University Program of Computer Graphics
For more than a decade we have been developing a system to test, validate, and improve the fidelity and efficiency of computer graphics algorithms. Our research framework is structured in three stages: the local light reflection model, the global light transport simulation, and the image display. At each stage, simulations are compared against physical measurements.
For the first stage, our ultimate goal is an accurate, physically based local light reflection model valid for arbitrary reflectance functions. We have assembled a measurement laboratory to goniometrically measure the reflectance of a large number of samples and compare the results against the model. A gonioreflectometer captures the reflectance of surfaces with complex, strongly directional behavior; for surfaces that are specular or nearly ideal diffuse, either a specular reflectometer or an integrating-sphere reflectometer provides fast, accurate spectral reflectance measurements. Both reflectometers can also be used for transmission measurements, and the monochromator and detectors can be rearranged for light source measurements. Scene geometry is obtained by direct mensuration or, for small objects, with a Cyberware scanner.
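The basic reduction a gonioreflectometer performs can be summarized in a few lines. The sketch below is ours, not the laboratory's actual software: it assumes a distant, uniform source subtending a small solid angle, and the function names and numeric values are illustrative. It converts one measured radiance sample into a BRDF value, f_r = L_o / E_i, and checks it against the known constant BRDF of an ideal Lambertian reference.

```python
import math

def incident_irradiance(L_source, theta_i_deg, d_omega):
    # E_i = L_i * cos(theta_i) * dOmega for a distant source
    # subtending a small solid angle d_omega (steradians).
    return L_source * math.cos(math.radians(theta_i_deg)) * d_omega

def brdf_sample(L_reflected, L_source, theta_i_deg, d_omega):
    # One goniometric BRDF sample: f_r(w_i, w_o) = L_o / E_i  [1/sr].
    return L_reflected / incident_irradiance(L_source, theta_i_deg, d_omega)

# Sanity check with an ideal Lambertian sample of albedo rho:
# it reflects L_o = rho * E_i / pi in every direction,
# so every measured f_r should equal rho / pi.
rho = 0.75
E_i = incident_irradiance(L_source=1000.0, theta_i_deg=30.0, d_omega=1e-3)
L_o = rho * E_i / math.pi
f_r = brdf_sample(L_o, L_source=1000.0, theta_i_deg=30.0, d_omega=1e-3)
```

A real instrument sweeps both the incident and reflected directions over the hemisphere and repeats this reduction per wavelength band.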
To create our "gold standard" images of physical scenes, we use a calibrated CCD camera to capture a twelve-bit image of the scene in any chosen wavelength band. Note that we have not yet created an output RGB image; at this stage we compare only measured and simulated radiant energy on the image plane, preserving the full dynamic range of the data.
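A camera calibrated in this way maps raw sensor counts to radiance. The fragment below is a minimal sketch of such a mapping, assuming a simple linear response model with a dark frame, gain, and exposure time; the function name and all numeric values are hypothetical, not the laboratory's actual calibration.

```python
import numpy as np

def counts_to_radiance(counts, dark, gain, exposure_s):
    # Hypothetical linear calibration for a 12-bit CCD:
    # radiance = gain * (counts - dark) / exposure, clipped at zero
    # so dark-current noise cannot produce negative radiance.
    c = np.asarray(counts, dtype=np.float64)
    return gain * np.clip(c - dark, 0.0, None) / exposure_s

# Toy 2x2 frame of 12-bit counts (valid range 0..4095).
raw = np.array([[100, 4095],
                [520, 2048]])
radiance = counts_to_radiance(raw, dark=100.0, gain=0.01, exposure_s=0.5)
```

In practice the response would be characterized per wavelength filter against a reference source, but the per-pixel reduction has this general shape.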
Figure: (a) calibrated CCD camera image through a 550 nm filter; (b) the same camera image shown as radiance contours.
With accurate data for geometry, emission, and reflectance, it is then necessary to accurately simulate the physical propagation of light energy throughout the environment. Our validation objective at this stage of realistic image synthesis is to compare a simulation directly against a calibrated physical measurement, whereas previous comparisons were based on subjective appearance alone. The captured-versus-simulated image comparisons we have performed to date quantify our progress in light reflection models, BRDF representations, and simulation algorithms.
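Once both images exist in radiance units, the comparison itself reduces to a per-pixel error metric. The sketch below shows one plausible choice, a relative RMS difference; the metric, function name, and toy data are our own illustration, not the specific statistic used in the laboratory's published comparisons.

```python
import numpy as np

def relative_rms_error(measured, simulated, eps=1e-6):
    # Per-pixel relative RMS difference between two radiance images.
    # eps guards against division by zero in very dark pixels.
    m = np.asarray(measured, dtype=np.float64)
    s = np.asarray(simulated, dtype=np.float64)
    rel = (s - m) / np.maximum(m, eps)
    return float(np.sqrt(np.mean(rel ** 2)))

# Toy 1x3 "images" of radiance values (hypothetical numbers):
# the simulation is 10% high, 10% low, and exact, respectively.
measured = [1.0, 2.0, 4.0]
simulated = [1.1, 1.8, 4.0]
err = relative_rms_error(measured, simulated)
```

Reporting error relative to the measured radiance, rather than in absolute units, keeps bright and dim image regions on an equal footing.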
The repercussions of this research are far-reaching: the ability to predict exact lighting and color appearance will not only improve visualization accuracy, but also aid the development of new technology for color printing, digital photography, and image displays.