This presentation will describe a new approach to augmented reality, an area within virtual reality research that merges the real-world environment with a synthetic virtual world. These synthetic elements may add information to the world we see, as in equipment maintenance tasks or image-guided surgery, or they can create the illusion that the real world is seamlessly composited with virtual elements, as in simulations and entertainment. Previous methods for augmenting reality required careful calibration of elements in the real world, knowledge of the camera's calibration parameters, and the position of the person interacting with the system. This new approach requires minimal a priori calibration information: it only requires the ability to track at least four feature points through time. I will describe the implementation of such a system and show its current state of operation on videotape. The work is also described at my Augmented Reality Page.
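To give a flavor of why tracking four feature points can suffice, here is a minimal sketch of one way such a scheme can work (my own illustration, not necessarily the speaker's exact method): under an affine camera model, any virtual point can be assigned fixed affine coordinates relative to four tracked, non-coplanar feature points, and its image position in each new frame is then just the same affine combination of the basis points' tracked image positions. The function name and array shapes below are assumptions for the sketch.

```python
import numpy as np

def reproject(basis_2d, affine_coords):
    """Project a virtual point into the current frame as an affine
    combination of four tracked feature points (affine camera model).

    basis_2d:      (4, 2) array, image positions of the tracked points
    affine_coords: (3,) array, the virtual point's fixed affine
                   coordinates relative to that basis
    """
    p0 = basis_2d[0]            # first tracked point acts as the origin
    B = basis_2d[1:] - p0       # (3, 2) image-plane basis vectors
    return p0 + affine_coords @ B

# Example: a unit-square basis; the virtual point sits halfway along
# the first two basis directions.
basis = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
print(reproject(basis, np.array([0.5, 0.5, 0.0])))  # → [0.5 0.5]
```

No camera calibration appears anywhere: as long as the four points are tracked from frame to frame, the virtual point's image position updates automatically, which is the appeal of this style of approach.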