Seminar: Stereo Matching Solutions for Real-Time 3D Reconstruction of Outdoor Environments
Thesis Proposal Presentation
Supervisor: Dr. Oscar Meruvia-Pastor
Department of Computer Science
Tuesday, May 21, 2013, 12:30 pm, Room EN 2022
Augmented Reality systems combine standard video input with computer-generated objects or environments. Stereoscopic video is a distinctive video source that is not typically used in Augmented Reality systems. In addition to color, stereoscopic systems allow viewers to perceive depth, providing important information to support navigation and manipulation tasks. Using parallax, stereo vision systems determine depth from two or more images taken at the same time from slightly different viewpoints. The most important and time-consuming task for a stereo vision system is the identification of corresponding pixels within a pair of stereo images, from which depth is then extracted. Once depth information has been extracted from a scene, it can be transmitted over a network to remote users for tele-collaboration. It can also be used for 3D model reconstruction, or combined with computer-generated 3D models to produce an Augmented Reality environment displayed on regular screens or on 3D see-through glasses or displays. In this research we are ultimately interested in integrating stereoscopic systems into an Augmented Reality system for outdoor applications. For this purpose, we will implement two top-ranked solutions from the Middlebury stereo benchmark for stereo matching and depth extraction, and evaluate them against the fundamental requirements (speed and accuracy) of a mobile Augmented Reality system in an outdoor environment. We will also test extensions, such as real-time noise filtering techniques and the addition of one stereoscopic camera, to improve the accuracy of the results for the particular purpose of 3D reconstruction of outdoor environments.
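The correspondence and depth-extraction steps described above can be illustrated with a minimal sketch: sum-of-absolute-differences (SAD) block matching over a horizontal disparity search, followed by triangulation via Z = f * B / d. This is a simplified illustration of the principle only, not one of the Middlebury-ranked solutions the thesis will evaluate; the focal length and baseline values below are hypothetical.

```python
import numpy as np

def sad_disparity(left, right, block=3, max_disp=8):
    """Per-pixel disparity via sum-of-absolute-differences block matching.

    For each block in the left image, try leftward shifts in the right
    image and keep the shift (disparity) with the lowest SAD cost.
    """
    h, w = left.shape
    half = block // 2
    disp = np.zeros((h, w), dtype=np.int32)
    for y in range(half, h - half):
        for x in range(half, w - half):
            patch = left[y - half:y + half + 1, x - half:x + half + 1]
            best_d, best_cost = 0, np.inf
            # Only shifts that keep the candidate window inside the image.
            for d in range(0, min(max_disp, x - half) + 1):
                cand = right[y - half:y + half + 1,
                             x - d - half:x - d + half + 1]
                cost = np.abs(patch.astype(int) - cand.astype(int)).sum()
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y, x] = best_d
    return disp

def depth_from_disparity(disp, focal_px, baseline_m):
    """Triangulate depth as Z = f * B / d; zero disparity maps to infinity."""
    with np.errstate(divide="ignore"):
        return np.where(disp > 0,
                        focal_px * baseline_m / np.maximum(disp, 1e-9),
                        np.inf)

# Synthetic stereo pair: the right view is the left view shifted by
# 2 pixels (a point at column x in the left image appears at x - 2
# in the right image), so the true disparity is 2 everywhere.
rng = np.random.default_rng(0)
left = rng.integers(0, 255, size=(12, 16)).astype(np.uint8)
right = np.roll(left, -2, axis=1)
disp = sad_disparity(left, right, block=3, max_disp=5)
# Hypothetical camera: focal length 100 px, baseline 0.1 m.
depth = depth_from_disparity(disp, 100.0, 0.1)
```

The exhaustive per-pixel search is what makes naive block matching too slow for real-time use, which is why the thesis focuses on faster top-ranked matchers and on filtering to handle outdoor noise.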