2017, Vol. 22, No. 2, pp. 177-187
Localizing the user with a feature database of a scene is a basic and necessary step for presenting localized augmented reality (AR) content. Because of the time and effort required to prepare such a database, typically only a single appearance of the scene is stored. That appearance depends on various factors, such as the position of the sun and the degree of cloudiness. Observing the scene under lighting conditions that differ from those in the database decreases both the success rate and the accuracy of localization.
To address these problems, we propose generating a feature database from the simulated appearance of the scene model under different lighting conditions. We also propose extending the feature descriptors used in localization with a parametric representation of how they change under varying lighting conditions. We compare our method with the standard representation and matching based on the L2 norm, both in simulation and in real-world experiments. Our results show that the simulated environment is a satisfactory representation of the scene's appearance and improves feature matching from a single database. The proposed feature descriptor achieves a higher localization ratio with fewer feature points and a lower processing cost.
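The contrast between the two matching strategies can be illustrated with a minimal sketch. The abstract does not specify the parametric form used in the paper, so the code below makes a simplifying assumption: each database feature is modeled as a line in descriptor space (a mean descriptor plus a unit direction along which the descriptor drifts as lighting changes), and a query is matched to the entry whose line it lies closest to. The baseline matches each query to the L2-nearest stored descriptor. All function names and the linear model are illustrative, not the authors' actual method.

```python
import numpy as np

def match_l2(query, database):
    """Baseline: match each query descriptor to its L2-nearest database descriptor.

    query:    (n_query, dim) array of descriptors
    database: (n_db, dim) array, one descriptor per feature (single appearance)
    returns:  (n_query,) indices of the matched database features
    """
    # Pairwise squared L2 distances, shape (n_query, n_db)
    d2 = ((query[:, None, :] - database[None, :, :]) ** 2).sum(axis=2)
    return d2.argmin(axis=1)

def match_parametric(query, db_mean, db_dir):
    """Hypothetical parametric matching: each database feature is the line
    db_mean[i] + t * db_dir[i], modeling descriptor drift under lighting
    change. A query is matched to the feature whose line it is closest to.

    db_mean: (n_db, dim) mean descriptors
    db_dir:  (n_db, dim) unit vectors giving the drift direction per feature
    """
    diff = query[:, None, :] - db_mean[None, :, :]       # (n_query, n_db, dim)
    t = (diff * db_dir[None, :, :]).sum(axis=2)          # projection onto the line
    resid = diff - t[:, :, None] * db_dir[None, :, :]    # component off the line
    d2 = (resid ** 2).sum(axis=2)
    return d2.argmin(axis=1)
```

In this toy setup, a query descriptor that has drifted far along its feature's lighting direction can end up L2-closer to the wrong database entry, while the line model still assigns it correctly; this is the kind of failure mode a lighting-aware representation is meant to avoid.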