BibTeX
@inproceedings{Ghafourzadeh:2020:10.20380/GI2020.03,
author = {Ghafourzadeh, Donya and Rahgoshay, Cyrus and Fallahdoust, Sahel and Beauchamp, Andre and Aubame, Adeline and Popa, Tiberiu and Paquette, Eric},
title = {Part-Based 3D Face Morphable Model with Anthropometric Local Control},
booktitle = {Proceedings of Graphics Interface 2020},
series = {GI 2020},
year = {2020},
isbn = {978-0-9947868-5-2},
location = {University of Toronto},
pages = {7--16},
numpages = {10},
doi = {10.20380/GI2020.03},
publisher = {Canadian Human-Computer Communications Society / Société canadienne du dialogue humain-machine},
}
Abstract
We propose an approach to construct realistic 3D facial morphable models (3DMMs) that allow an intuitive facial attribute editing workflow. Current face modeling methods using 3DMMs suffer from a lack of local control. We thus create a 3DMM by combining local part-based 3DMMs for the eyes, nose, mouth, ears, and facial mask regions. Our local PCA-based approach uses a novel method to select the best eigenvectors from the local 3DMMs, ensuring that the combined 3DMM is expressive while allowing accurate reconstruction. The editing controls we provide to the user are intuitive, as they are extracted from anthropometric measurements found in the literature. Out of a large set of possible anthropometric measurements, we keep only those that have meaningful generative power given the face data set. We bind the measurements to the part-based 3DMM through mapping matrices derived from our data set of facial scans. Our part-based 3DMM is compact yet accurate, and compared to other 3DMM methods it provides a new trade-off between local and global control. We tested our approach on a data set of 135 scans used to derive the 3DMM, plus 19 scans reserved for validation. The results show that our part-based 3DMM approach has excellent generative properties and gives the user intuitive local control.
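The two ingredients the abstract names — a PCA-based local morphable model and a linear mapping from anthropometric measurements to model coefficients — can be sketched in a few lines of NumPy. This is a minimal illustrative sketch, not the paper's implementation: the toy data, dimensions, 95% variance cutoff, and least-squares mapping are all assumptions standing in for the paper's eigenvector selection and measurement-binding procedures.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "scans": 20 faces, each a flattened set of 50 3D vertices (150 dims).
# In the paper these would be registered scans of one local region (e.g. the nose).
n_scans, n_dims = 20, 150
scans = rng.normal(size=(n_scans, n_dims))

# --- Local PCA morphable model (illustrative) ---
mean_shape = scans.mean(axis=0)
centered = scans - mean_shape
# SVD of the centered data gives the principal directions; here we simply keep
# enough components to explain 95% of the variance (the paper uses a more
# involved eigenvector-selection criterion).
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
explained = (S**2) / np.sum(S**2)
k = int(np.searchsorted(np.cumsum(explained), 0.95)) + 1
basis = Vt[:k]                      # (k, n_dims) retained eigenvectors

# Project each scan into the model and reconstruct it.
coeffs = centered @ basis.T         # per-scan PCA coefficients, (n_scans, k)
recon = mean_shape + coeffs @ basis

# --- Binding measurements to coefficients ---
# Hypothetical anthropometric measurements (e.g. nose width/length) per scan.
n_meas = 5
measurements = rng.normal(size=(n_scans, n_meas))
# A least-squares mapping matrix M so that coeffs ~= measurements @ M,
# a simple stand-in for the paper's mapping matrices.
M, *_ = np.linalg.lstsq(measurements, coeffs, rcond=None)

# Editing: perturb one measurement, map to coefficients, synthesize a shape.
new_meas = measurements[0] + np.array([0.5, 0.0, 0.0, 0.0, 0.0])
new_shape = mean_shape + (new_meas @ M) @ basis
```

With real data, one such model would be built per facial part (eyes, nose, mouth, ears, facial mask), and the edited parts recombined into the full face.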