We list some widely used facial expression databases and summarize their specifications below.
o Source: this database is provided by the Second Emotion Recognition in the Wild Challenge and Workshop.
o Purpose: this database is primarily used to evaluate emotion recognition methods under real-world conditions.
o Properties:

  Properties                        | Descriptions
  # of subjects                     | N/A
  # of images/videos                | Training (578 videos), validation (383 videos), and test sets (N/A)
  Static/Videos                     | Video and audio
  Single/Multiple faces             | Single
  Gray/Color                        | Color
  Resolution                        | N/A
  Face pose                         | Various face poses
  Facial expression                 | 7 basic facial expressions: anger, disgust, fear, happiness, neutral, sadness, and surprise
  Description of facial expression  | Video clips from movies
  Illumination                      | N/A
  Accessories                       | N/A
  3D data                           | N/A
  Frame rate                        | N/A
  Ground truth                      | Facial expression label for each video
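Specification tables like the one above can be captured in a small record type so that databases can be compared programmatically. A minimal Python sketch, with hypothetical class and field names and with values taken from the table above:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class ExpressionDatabaseSpec:
    """Summary record for one facial expression database (illustrative)."""
    name: str
    n_subjects: Optional[int]     # None where the table lists N/A
    n_videos: Optional[int]       # total labeled videos, where known
    modality: str                 # e.g. "video+audio", "video", "static"
    color: str                    # "color" or "gray"
    expressions: Tuple[str, ...]  # label set used as ground truth
    description: str              # how the expressions were collected

# The challenge database summarized above, as one record:
wild_challenge = ExpressionDatabaseSpec(
    name="Second Emotion Recognition in the Wild Challenge",
    n_subjects=None,
    n_videos=578 + 383,           # training + validation; test-set size is N/A
    modality="video+audio",
    color="color",
    expressions=("anger", "disgust", "fear", "happiness",
                 "neutral", "sadness", "surprise"),
    description="video clips from movies",
)
print(wild_challenge.n_videos)    # 961
```

Fields such as resolution or frame rate could be added in the same way; they are omitted here because the table lists them as N/A.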
o Source: this database is provided by Jeff Cohn from Carnegie Mellon University.
o Purpose: this database is widely used as the standard database for evaluating facial action unit recognition systems. It may also be used for facial expression recognition and face recognition.
o Properties:

  Properties                        | Descriptions
  # of subjects                     | 100 university students (released version); 65% female, 15% African-American, and 3% Asian or Latino
  # of images/videos                | 486
  Static/Videos                     | Videos
  Single/Multiple faces             | Single
  Gray/Color                        | Eight-bit gray
  Resolution                        | 640 × 490
  Face pose                         | Frontal-view only
  Facial expression                 | 23 facial displays including single AUs or combinations of AUs
  Description of facial expression  | Neutral to apex; posed facial expressions
  Illumination                      | N/A
  Accessories                       | N/A
  3D data                           | N/A
  Frame rate                        | 12 frames/sec
  Ground truth                      | AU label for the final frame in each image sequence; identities of subjects
o Source: this database is provided by M. Pantic and M. F. Valstar.
o Purpose: this database is primarily used to evaluate facial action unit recognition systems. It may also be used for face recognition.
o Properties:

  Properties                        | Descriptions
  # of subjects                     | 43
  # of images/videos                | 1280 videos and over 250 images
  Static/Videos                     | Videos and static images
  Single/Multiple faces             | Single
  Gray/Color                        | Color
  Resolution                        | 720 × 576
  Face pose                         | Frontal-view or dual-view (frontal and profile) captured by two cameras simultaneously
  Facial expression                 | 79 facial displays including single AUs or combinations of AUs
  Description of facial expression  | Neutral-apex-neutral; posed facial expressions
  Illumination                      | N/A
  Accessories                       | N/A
  3D data                           | N/A
  Frame rate                        | 24 frames/sec
  Ground truth                      | AU label for the frame with the apex facial expression in each image sequence; some image sequences have been FACS-coded frame by frame; extensive subject metadata
o Source: this database was planned and assembled by Miyuki Kamachi, Michael Lyons, and Jiro Gyoba.
o Purpose: this database is primarily used to evaluate facial expression recognition systems. It may also be used for face recognition.
o Properties:

  Properties                        | Descriptions
  # of subjects                     | 10
  # of images/videos                | 213
  Static/Videos                     | Static
  Single/Multiple faces             | Single
  Gray/Color                        | Eight-bit gray
  Resolution                        | 256 × 256
  Face pose                         | Frontal-view
  Facial expression                 | 7 facial expressions: neutral, sadness, surprise, happiness, fear, anger, and disgust
  Description of facial expression  | Posed facial expressions
  Illumination                      | N/A
  Accessories                       | N/A
  3D data                           | N/A
  Frame rate                        | N/A
  Ground truth                      | Facial expression label; identities of subjects
o Reference: please refer to the paper "Michael J. Lyons, Shigeru Akamatsu, Miyuki Kamachi, and Jiro Gyoba, Coding Facial Expressions with Gabor Wavelets, Proc. of FGR98, pp. 200-205, 1998".
o Source: this database was created by Queen's University of Belfast.
o Purpose: this database is primarily used for emotion recognition. It may also be used to evaluate algorithms for facial expression and facial action unit recognition under spontaneous conditions.
o Properties:

  Properties                        | Descriptions
  # of subjects                     | 125 (31 males and 94 females)
  # of images/videos                | >250
  Static/Videos                     | Videos (audio-visual)
  Single/Multiple faces             | Single
  Gray/Color                        | Color
  Resolution                        | N/A
  Face pose                         | Various
  Facial expression                 | Wide range of facial expressions
  Description of facial expression  | Neutral-apex-neutral; spontaneous facial expressions
  Illumination                      | Indoor
  Accessories                       | N/A
  3D data                           | N/A
  Frame rate                        | N/A
  Ground truth                      | Identities of subjects; emotional descriptors of each sequence
o Reference: please refer to the paper "E. Douglas-Cowie, R. Cowie, and M. Schroeder, The description of naturally occurring emotional speech, 15th ICPhS, Barcelona".
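Given several such summary records, choosing a database that matches experimental requirements reduces to a simple filter. A sketch under the same idea, where the field names are hypothetical and the values are condensed from the specification tables in this section:

```python
# Selecting databases by experimental requirements; field names are
# hypothetical and values are condensed from the tables above.
databases = [
    {"name": "CMU database (Cohn)", "media": "video",
     "posed": True, "au_labels": True},
    {"name": "Pantic & Valstar database", "media": "video+static",
     "posed": True, "au_labels": True},
    {"name": "Kamachi/Lyons/Gyoba database", "media": "static",
     "posed": True, "au_labels": False},
    {"name": "Queen's University of Belfast database", "media": "video+audio",
     "posed": False, "au_labels": False},
]

def matching(dbs, **required):
    """Return names of databases whose fields equal every required key/value."""
    return [d["name"] for d in dbs
            if all(d.get(k) == v for k, v in required.items())]

# AU-coded databases (suitable for action unit recognition):
print(matching(databases, au_labels=True))
# Spontaneous (non-posed) recordings:
print(matching(databases, posed=False))
```

With criteria of this kind, for example, the two AU-coded databases surface for action unit work, while the Belfast recordings are the only spontaneous option among these entries.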