Abstract
We present a method for capturing the visual appearance of a real environment, such as the interior of a room. We propose a method for generating arbitrary-viewpoint images by constructing a light field with an omni-directional camera. In this method, the omni-directional camera positions of the input image sequences are automatically estimated by extending Zhang's homography-based camera calibration method to omni-directional cameras. We also use a B-tree data structure for the light field to improve the efficiency of virtual-view image synthesis. Our method thus allows the user to explore a virtual environment with a wide field of view and a realistic visual representation. To demonstrate the proposed method, we captured our lab's interior with an omni-directional camera and successfully generated arbitrary-viewpoint images for a virtual tour of the lab environment.