Title
Digital Makeup from Internet Images.
Abstract
We present a novel approach for creating face makeup on a subject image using other images as style examples. Our approach is analogous to physical makeup: we modify color and skin details while preserving the face structure. More precisely, we extract the image foreground from both the subject image and multiple example images. Using image matting algorithms, the system then extracts semantic regions such as the face, lips, teeth, eyes, and eyebrows from these foregrounds, and the makeup style is transferred between corresponding regions that share the same semantic label. Next, the transferred regions are seamlessly composited together with alpha blending to obtain the makeup-transfer result. In a final step, we present an efficient makeup-consistency method that optimizes the colors of a collection of images. The main advantage of our method over existing techniques is that it does not need face matching, since more than one example image can be used; a single example image rarely fulfills all of a user's requirements. Our algorithm is not restricted to head-shot images, as it can also change the makeup style in the wild, and it does not require the subject and example images to have the same pose or image size. Experimental results demonstrate the effectiveness of the proposed method in faithfully transferring makeup.
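A minimal sketch, in Python with NumPy, of the alpha-blending compositing step described in the abstract: each transferred semantic part (lips, eyes, skin, and so on) is blended back onto the subject image using its soft matte. The function name, dictionary interface, and value ranges are illustrative assumptions, not the authors' actual implementation.

# Sketch of per-part alpha-blend compositing (assumed interface).
import numpy as np

def composite_parts(subject, transferred, alphas):
    """Blend transferred makeup parts onto the subject image.

    subject     -- H x W x 3 float image in [0, 1]
    transferred -- dict: part name -> H x W x 3 image with transferred makeup
    alphas      -- dict: part name -> H x W soft matte in [0, 1]
    """
    result = subject.astype(np.float64).copy()
    for part, styled in transferred.items():
        a = alphas[part][..., None]               # broadcast matte over RGB channels
        result = a * styled + (1.0 - a) * result  # standard alpha blend for this part
    return np.clip(result, 0.0, 1.0)

# Example usage with hypothetical inputs:
# output = composite_parts(subject_img, {"lips": lips_styled}, {"lips": lips_matte})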
Year
2016
Venue
Optik
Field
Computer vision, High color, Color histogram, Pattern recognition, Computer science, Color depth, Color balance, RGB color model, Artificial intelligence, False color, Color quantization, Color image
DocType
Journal
Volume
abs/1610.04861
Citations
0
PageRank
0.34
References
32
Authors
3
Name            Order  Citations  PageRank
Asad I. Khan    1      161        31.63
Yudong Guo      2      0          1.01
Ligang Liu      3      5          1.76