Title: Reconstructing illumination environment by omnidirectional camera calibration
Abstract: This paper presents a novel approach to reconstructing the illumination environment via omnidirectional camera calibration. Camera positions are estimated by a method that takes the inlier distribution into account. Light positions are then computed as the intersection points of rays cast from the camera positions toward corresponding points between two images. In addition, the method can composite various synthetic objects into real photographs using distributed ray tracing and an HDR (High Dynamic Range) radiance map. Simulation results show that photo-realistic images can be synthesized in the reconstructed illumination environment.
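The light-position step described above amounts to triangulating a 3D point from two rays. Since rays from two real camera positions are usually skew and never meet exactly, a standard heuristic is to take the midpoint of the closest points between them. Below is a minimal NumPy sketch of that idea; the function name and inputs are illustrative and not taken from the paper.

```python
import numpy as np

def triangulate_light(o1, d1, o2, d2):
    """Approximate the intersection of two rays o_i + t_i * d_i.

    Skew rays rarely meet exactly, so we return the midpoint of the
    closest points between them (a common triangulation heuristic).
    o1, o2: camera centers; d1, d2: ray directions (need not be unit).
    """
    d1 = np.asarray(d1, float) / np.linalg.norm(d1)
    d2 = np.asarray(d2, float) / np.linalg.norm(d2)
    o1, o2 = np.asarray(o1, float), np.asarray(o2, float)
    c = np.dot(d1, d2)  # cosine of the angle between the rays
    # Normal equations for minimizing |(o1 + t1*d1) - (o2 + t2*d2)|^2
    A = np.array([[1.0, -c],
                  [c, -1.0]])
    b = np.array([np.dot(d1, o2 - o1),
                  np.dot(d2, o2 - o1)])
    t1, t2 = np.linalg.solve(A, b)  # singular only for parallel rays
    p1 = o1 + t1 * d1
    p2 = o2 + t2 * d2
    return 0.5 * (p1 + p2)
```

For rays that do intersect, the midpoint coincides with the true intersection, e.g. rays from (0, 0, 0) and (2, 0, 0) aimed at a light at (1, 1, 1) recover exactly that point.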
Year: 2006
DOI: 10.1007/11941439_57
Venue: Australian Conference on Artificial Intelligence
Keywords: novel approach, high dynamic range, inlier distribution, camera position, reconstructed illumination environment, omnidirectional camera calibration, light position, illumination environment, intersection point, corresponding point, distributed ray tracing
Field: Omnidirectional camera, Iterative reconstruction, Computer vision, Computer graphics (images), Ray tracing (graphics), Computer science, Camera auto-calibration, Camera resectioning, Distributed ray tracing, Global illumination, Artificial intelligence, High dynamic range
DocType: Conference
Volume: 4304
ISSN: 0302-9743
ISBN: 3-540-49787-0
Citations: 0
PageRank: 0.34
References: 10
Authors: 2
Name: Yongho Hwang, Order: 1, Citations: 12, PageRank: 5.74
Name: Hyun-ki Hong, Order: 2, Citations: 64, PageRank: 14.17