Title
Physically-Based Editing of Indoor Scene Lighting from a Single Image.
Abstract
We present a method to edit complex indoor lighting from a single image with its predicted depth and light source segmentation masks. This is an extremely challenging problem that requires modeling complex light transport, and disentangling HDR lighting from material and geometry with only a partial LDR observation of the scene. We tackle this problem using two novel components: 1) a holistic scene reconstruction method that estimates reflectance and parametric 3D lighting, and 2) a neural rendering framework that re-renders the scene from our predictions. We use physically-based light representations that allow for intuitive editing, and infer both visible and invisible light sources. Our neural rendering framework combines physically-based direct illumination and shadow rendering with deep networks to approximate global illumination. It can capture challenging lighting effects, such as soft shadows, directional lighting, specular materials, and interreflections. Previous single image inverse rendering methods usually entangle lighting and geometry and only support applications like object insertion. Instead, by combining parametric 3D lighting estimation with neural scene rendering, we demonstrate the first automatic method for full scene relighting from a single image, including light source insertion, removal, and replacement.
Year: 2022
DOI: 10.1007/978-3-031-20068-7_32
Venue: European Conference on Computer Vision
DocType: Conference
Citations: 0
PageRank: 0.34
References: 0
Authors: 9
Name                 Order  Citations  PageRank
Zhengqin Li          1      52         5.82
Jia Shi              2      0          0.68
Sai Bi               3      63         5.28
Rui Zhu              4      30         4.20
Kalyan Sunkavalli    5      500        31.75
Miloš Hašan          6      396        21.61
Zexiang Xu           7      101        10.17
Ravi Ramamoorthi     8      4481       237.21
Manmohan Chandraker  9      451        25.58