
# Face

The face modality involves detecting affect from facial expressions using both static and dynamic sources, i.e., images and videos.

## Image

### CK+

CK+ comprises posed facial expressions obtained from 123 adults under laboratory conditions. A total of 593 image sequences, each lasting 10 to 60 frames, were collected, and a subset of 327 sequences is labelled with 7 discrete affect states.

[Get] [Paper]

Label information:

Sadness, Surprise, Happy, Fear, Angry, Contempt and Disgust.
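
Since each CK+ sequence runs from a neutral face to the peak expression, the last frame is commonly taken as the labelled image. A minimal sketch of that convention is below; the directory names (`cohn-kanade-images/`, `Emotion/`, `*_emotion.txt`) are assumptions about the standard CK+ distribution layout, so adjust to your copy of the data.

```python
from pathlib import Path

# Assumed CK+ layout: frames under cohn-kanade-images/<subject>/<sequence>/*.png,
# and each labelled sequence has a matching Emotion/<subject>/<sequence>/*_emotion.txt
# containing a single discrete label (1-7).
IMAGE_ROOT = Path("cohn-kanade-images")
LABEL_ROOT = Path("Emotion")

def peak_frames_with_labels():
    """Yield (peak_frame_path, label) for the labelled subset of sequences."""
    for label_file in LABEL_ROOT.glob("*/*/*_emotion.txt"):
        label = int(float(label_file.read_text().strip()))
        seq_dir = IMAGE_ROOT / label_file.parent.relative_to(LABEL_ROOT)
        frames = sorted(seq_dir.glob("*.png"))
        if frames:
            # Last frame of the sequence is the peak (apex) expression.
            yield frames[-1], label
```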

| Paper | Year | Metric | Code | Link |
| --- | --- | --- | --- | --- |

### AffectNet

AffectNet comprises in-the-wild face images obtained from the web. A total of ~440,000 images were labelled with 7 discrete affect states and 2 continuous affect ratings.

Label information:

Discrete - Sad, Surprise, Happy, Fear, Angry, Contempt, Disgust.
Continuous - Valence and Arousal.

[Get] [Paper]
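
Because each AffectNet image carries both a discrete expression label and continuous valence/arousal ratings, a single sample can be represented as sketched below. The class and field names are illustrative assumptions, not the dataset's official annotation schema.

```python
from dataclasses import dataclass

# Discrete labels as listed above; order is illustrative.
DISCRETE_LABELS = ("Sad", "Surprise", "Happy", "Fear", "Angry", "Contempt", "Disgust")

@dataclass
class AffectSample:
    image_path: str
    expression: int   # index into DISCRETE_LABELS
    valence: float    # continuous rating, typically in [-1, 1]
    arousal: float    # continuous rating, typically in [-1, 1]

# Hypothetical usage with made-up values.
sample = AffectSample("images/000123.jpg", expression=2, valence=0.7, arousal=0.4)
print(DISCRETE_LABELS[sample.expression], sample.valence, sample.arousal)
```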

| Paper | Year | Metric | Code | Link |
| --- | --- | --- | --- | --- |

### ISLab

| Paper | Year | Metric | Code | Link |
| --- | --- | --- | --- | --- |

### FER-2013

FER-2013 was obtained using a keyword-based Google image search and comprises 35,887 images labelled with 7 discrete affect states.

Label information:

Angry, Disgust, Fear, Happy, Sad, Surprise, Neutral

[Get] [Paper]
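
A minimal loading sketch is shown below. It assumes the `fer2013.csv` file as distributed for the Kaggle challenge, with columns `emotion` (0-6, in the label order listed above), `pixels` (2,304 space-separated grayscale values forming a 48x48 image) and `Usage`; verify the layout of the copy you download.

```python
import csv
import numpy as np

# Label order assumed to match the 0-6 emotion indices in fer2013.csv.
LABELS = ("Angry", "Disgust", "Fear", "Happy", "Sad", "Surprise", "Neutral")

def load_fer2013(path="fer2013.csv"):
    """Return (images, labels) as arrays of shape (N, 48, 48) and (N,)."""
    images, labels = [], []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            # Each "pixels" field is a space-separated 48x48 grayscale image.
            img = np.array(row["pixels"].split(), dtype=np.uint8).reshape(48, 48)
            images.append(img)
            labels.append(int(row["emotion"]))
    return np.stack(images), np.array(labels)
```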

| Paper | Year | Metric | Code | Link |
| --- | --- | --- | --- | --- |

## Video

### Aff-Wild

| Paper | Year | Metric | Code | Link |
| --- | --- | --- | --- | --- |