Title
Texture Memory-Augmented Deep Patch-Based Image Inpainting
Abstract
Patch-based methods and deep networks have been employed to tackle the image inpainting problem, each with its own strengths and weaknesses. Patch-based methods are capable of restoring a missing region with high-quality texture by searching for nearest-neighbor patches in the unmasked regions. However, these methods introduce problematic content when recovering large missing regions. Deep networks, on the other hand, show promising results in completing large regions. Nonetheless, their results often lack faithful, sharp details that resemble the surrounding area. Bringing together the best of both paradigms, we propose a new deep inpainting framework in which texture generation is guided by a texture memory of patch samples extracted from unmasked regions. The framework has a novel design that allows texture memory retrieval to be trained end-to-end with the deep inpainting network. In addition, we introduce a patch distribution loss to encourage high-quality patch synthesis. The proposed method shows superior performance both qualitatively and quantitatively on three challenging image benchmarks, i.e., the Places, CelebA-HQ, and Paris Street-View datasets (code will be made publicly available at https://github.com/open-mmlab/mmediting).
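The retrieval step described in the abstract, searching the unmasked regions for nearest-neighbor patches to guide texture synthesis, can be sketched as follows. This is a minimal illustration only, not the paper's implementation: the function names (`extract_patches`, `retrieve`), the fixed patch size/stride, and the use of cosine similarity over raw pixel vectors are all assumptions for the sake of the example.

```python
import numpy as np

def extract_patches(image, mask, patch=8, stride=8):
    """Split an image into flattened patch vectors (illustrative helper).

    mask is 1 where pixels are missing, 0 where they are known.
    Patches entirely inside the known region form the texture memory;
    patches touching the masked region become queries.
    """
    memory, queries, query_pos = [], [], []
    h, w = image.shape[:2]
    for y in range(0, h - patch + 1, stride):
        for x in range(0, w - patch + 1, stride):
            vec = image[y:y + patch, x:x + patch].reshape(-1)
            if mask[y:y + patch, x:x + patch].any():
                queries.append(vec)
                query_pos.append((y, x))
            else:
                memory.append(vec)
    return np.asarray(memory, float), np.asarray(queries, float), query_pos

def retrieve(memory, queries):
    """Cosine-similarity nearest-neighbor lookup into the texture memory."""
    m = memory / (np.linalg.norm(memory, axis=1, keepdims=True) + 1e-8)
    q = queries / (np.linalg.norm(queries, axis=1, keepdims=True) + 1e-8)
    # For each query patch, the index of the best-matching memory patch.
    return np.argmax(q @ m.T, axis=1)
```

In the paper's framework this lookup is made differentiable and trained end-to-end with the inpainting network; the hard `argmax` above is only the non-learned analogue of that retrieval.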
Year
2021
DOI
10.1109/TIP.2021.3122930
Venue
IEEE TRANSACTIONS ON IMAGE PROCESSING
Keywords
Image reconstruction, Image restoration, Training, Optimization, Interpolation, Generative adversarial networks, Semantics, Image completion, generative adversarial network, texture synthesis
DocType
Journal
Volume
30
Issue
1
ISSN
1057-7149
Citations
0
PageRank
0.34
References
19
Authors
6
Name	Order	Citations	PageRank
Rui Xu	1	20	2.66
Guo Minghao	2	4	1.44
Jiaqi Wang	3	77	4.20
Xiaoxiao Li	4	253	9.86
Bolei Zhou	5	1529	66.96
Chen Change Loy	6	4484	178.56