Title
Self-Attention Implicit Function Networks for 3D Dental Data Completion
Abstract
While complete dental models are crucial for digital dentistry, current technologies mostly focus on the 3D dental crown and overlook the dental gum, which is important for applications in orthodontics and prosthodontics. To reconstruct complete dental models with visually realistic geometry from given crown data, we propose to combine the implicit function representation with the self-attention mechanism. Recent studies have shown that the implicit function is an effective 3D representation for shape completion. However, existing methods fail on dental models with complex shapes and details, because the convolution and linear operations adopted in their networks are inefficient at modeling long-range dependencies or unable to preserve the detailed geometry of the shapes. We therefore introduce self-attention into the implicit function network for the first time and use it to effectively capture non-local features at different levels. Extensive ablation studies were conducted to validate the effectiveness of our method. Quantitative and qualitative comparisons demonstrate that the features extracted by our network are more expressive and thus lead to better dental model completion and reconstruction results.
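The record contains no implementation details beyond the abstract. As a rough illustration only, the minimal PyTorch sketch below shows one way to combine a self-attention feature encoder with an implicit-function decoder that predicts occupancy at 3D query points, which is the general idea the abstract describes. All layer sizes, the occupancy-style output, and the global max pooling are illustrative assumptions, not the authors' architecture.

```python
# Minimal sketch (not the authors' code): self-attention encoder over a
# partial point set feeding an implicit-function (occupancy) decoder.
import torch
import torch.nn as nn


class SelfAttentionBlock(nn.Module):
    """Multi-head self-attention over per-point features, capturing
    non-local (long-range) dependencies across the shape."""

    def __init__(self, dim: int, heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        # feats: (B, N, dim) features of the partial (crown) point set
        attended, _ = self.attn(feats, feats, feats)
        return self.norm(feats + attended)  # residual connection


class ImplicitDecoder(nn.Module):
    """MLP mapping a 3D query point plus a pooled shape feature to an
    occupancy value in [0, 1]."""

    def __init__(self, dim: int):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(3 + dim, 256), nn.ReLU(),
            nn.Linear(256, 256), nn.ReLU(),
            nn.Linear(256, 1),
        )

    def forward(self, query: torch.Tensor, shape_feat: torch.Tensor) -> torch.Tensor:
        # query: (B, Q, 3); shape_feat: (B, dim) broadcast to every query point
        feat = shape_feat.unsqueeze(1).expand(-1, query.shape[1], -1)
        return torch.sigmoid(self.mlp(torch.cat([query, feat], dim=-1)))


class AttentionImplicitNet(nn.Module):
    """Self-attention encoder + implicit decoder for shape completion."""

    def __init__(self, dim: int = 128):
        super().__init__()
        self.embed = nn.Linear(3, dim)  # lift input coordinates to features
        self.attention = nn.Sequential(
            SelfAttentionBlock(dim), SelfAttentionBlock(dim)
        )
        self.decoder = ImplicitDecoder(dim)

    def forward(self, partial_points: torch.Tensor, queries: torch.Tensor) -> torch.Tensor:
        feats = self.attention(self.embed(partial_points))  # (B, N, dim)
        shape_feat = feats.max(dim=1).values                # global pooling
        return self.decoder(queries, shape_feat)            # (B, Q, 1) occupancy


# Usage: predict occupancy at 2048 query points from a 1024-point crown scan.
net = AttentionImplicitNet()
occ = net(torch.rand(2, 1024, 3), torch.rand(2, 2048, 3))
print(occ.shape)  # torch.Size([2, 2048, 1])
```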
Year
2021
DOI
10.1016/j.cagd.2021.102026
Venue
COMPUTER AIDED GEOMETRIC DESIGN
Keywords
Dental model completion, Self-attention, Implicit function
DocType
Journal
Volume
90
ISSN
0167-8396
Citations
0
PageRank
0.34
References
0
Authors
5
Name           Order  Citations  PageRank
Yuhan Ping     1      0          0.34
Guodong Wei    2      3          1.74
Lei Yang       3      12         2.87
Zhiming Cui    4      4          6.48
Wenping Wang   5      2491       176.19