A Compact Embedding for Facial Expression Similarity
Abstract: Most existing work on automatic facial expression analysis focuses on discrete emotion recognition or facial action unit detection. However, facial expressions do not always fall neatly into pre-defined semantic categories, and the similarity between expressions measured in action-unit space need not correspond to how humans perceive expression similarity. In contrast to previous work, our goal is to describe facial expressions in a continuous fashion using a compact embedding space that mimics human visual preferences. To achieve this goal, we collect a large-scale faces-in-the-wild dataset with human annotations of the form "expressions A and B are visually more similar to each other than to expression C", and use this dataset to train a neural network that produces a compact (16-dimensional) expression embedding. We experimentally demonstrate that the learned embedding can be successfully used for various applications such as expression retrieval, photo album summarization, and emotion recognition. We also show that the embedding learned using the proposed dataset performs better than several other embeddings learned using existing emotion or action unit datasets.
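The triplet annotations described above suggest a natural training objective: push the odd-one-out expression C farther from A and B than A and B are from each other. Below is a minimal sketch of such a triplet comparison loss in PyTorch. It is not the authors' released code; the class name `ExpressionEmbedder`, the `backbone` argument, and the margin value are illustrative assumptions.

```python
# Minimal sketch (assumed names, not the authors' code) of triplet training
# on FEC-style annotations: A and B look more alike than either does to C.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ExpressionEmbedder(nn.Module):
    """CNN backbone followed by a compact 16-d embedding head."""
    def __init__(self, backbone: nn.Module, feat_dim: int, embed_dim: int = 16):
        super().__init__()
        self.backbone = backbone              # any face feature extractor
        self.head = nn.Linear(feat_dim, embed_dim)

    def forward(self, x):
        z = self.head(self.backbone(x))
        return F.normalize(z, dim=-1)         # unit-norm 16-d embedding

def triplet_comparison_loss(za, zb, zc, margin: float = 0.2):
    """C must be farther from both A and B than A is from B, by `margin`."""
    d_ab = (za - zb).pow(2).sum(dim=-1)
    d_ac = (za - zc).pow(2).sum(dim=-1)
    d_bc = (zb - zc).pow(2).sum(dim=-1)
    loss = F.relu(d_ab + margin - d_ac) + F.relu(d_ab + margin - d_bc)
    return loss.mean()
```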
Contributions
- We introduce the FEC dataset, the first large-scale face dataset with expression comparison annotations. The dataset is publicly available.
- We experimentally demonstrate that a 16-dimensional expression embedding, learned by training a deep neural network on the FEC dataset, can be successfully used for several expression-based applications (a minimal retrieval sketch follows this list).
- We show that the embedding learned using the FEC dataset performs better than several other embeddings learned using existing emotion or action unit datasets.
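As a concrete illustration of one such application, expression retrieval reduces to nearest-neighbor search in the learned 16-dimensional space. The sketch below assumes precomputed, L2-normalized embeddings; the function name `retrieve` is hypothetical.

```python
# Hypothetical retrieval sketch: return the k gallery images whose
# expressions are nearest to a query in the learned 16-d embedding space.
import numpy as np

def retrieve(query_emb: np.ndarray, gallery: np.ndarray, k: int = 5):
    """query_emb: (16,) embedding; gallery: (N, 16) embeddings."""
    dists = np.linalg.norm(gallery - query_emb, axis=1)
    return np.argsort(dists)[:k]  # indices of the k most similar expressions
```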
Publications:
Raviteja Vemulapalli, Aseem Agarwala, "A Compact Embedding for Facial Expression Similarity", CVPR, 2019.
[PDF]
Patents:
Raviteja Vemulapalli, Aseem Agarwala, "Compact Language-Free Facial Expression Embedding and Novel Triplet Training Scheme", US Patent App. 15/639,086.