List of facial expression databases
{{short description|none}}
A '''facial expression database''' is a collection of images or video clips with [[facial expression]]s of a range of [[emotions]].
Well-annotated ([[emotion]]-tagged) media content of facial behavior is essential for training, testing, and validating [[algorithm]]s used to develop [[Emotion recognition|expression recognition systems]]. Emotions can be annotated with [[discrete emotion theory|discrete emotion]] labels or on a continuous scale. Most databases are based on the [[basic emotions]] theory (by [[Paul Ekman]]), which assumes the existence of six discrete basic emotions: anger, fear, disgust, surprise, joy, and sadness. Some databases, however, tag emotion on a continuous arousal–valence scale.
In posed expression databases, participants are asked to display different basic emotional expressions, whereas in spontaneous expression databases the expressions are natural. Spontaneous expressions differ markedly from posed ones in intensity, configuration, and duration. Moreover, synthesis of some action units (AUs) is barely achievable without the associated emotional state actually being experienced. Therefore, in most cases, posed expressions are exaggerated, while spontaneous ones are subtle and differ in appearance.
 
Many publicly available databases are categorized here.<ref>{{Cite web|url=http://emotion-research.net/wiki/Databases|title=collection of emotional databases|archive-url=https://web.archive.org/web/20180325205102/http://emotion-research.net/wiki/Databases|archive-date=2018-03-25|url-status=dead}}</ref><ref>{{Cite web|url=https://www.ecse.rpi.edu/~cvrl/database/other_facial_expression.htm|title=facial expression databases}}</ref> Some details of these [[facial expression]] databases follow.
{| class="wikitable"
!Database
!Ground truth
!Type
|-
|FERG-3D-DB (Facial Expression Research Group 3D Database) for stylized characters <ref>Aneja, Deepali, et al. "Learning to generate 3D stylized character expressions from humans." 2018 IEEE Winter Conference on Applications of Computer Vision (WACV). IEEE, 2018.</ref>
|Ryerson Audio-Visual Database of Emotional Speech and Song (RAVDESS)<ref>Livingstone & Russo (2018). The Ryerson Audio-Visual Database of Emotional Speech and Song (RAVDESS): A dynamic, multimodal set of facial and vocal expressions in North American English. {{doi|10.1371/journal.pone.0196391}}</ref>
|Speech: Calm, happy, sad, angry, fearful, surprise, disgust, and neutral.
Song: Calm, happy, sad, angry, fearful, and neutral.
|Posed; spontaneous smiles
|-
|Japanese Female Facial Expressions (JAFFE)<ref>{{Cite book | doi=10.5281/zenodo.3451524| year=1998| last1=Lyons| first1=Michael| title=The Japanese Female Facial Expression (JAFFE) Database| last2=Kamachi| first2=Miyuki| last3=Gyoba| first3=Jiro}}</ref>
|neutral, sadness, surprise, happiness, fear, anger, and disgust
|10
|
|-
|Indian Semi-Acted Facial Expression Database (iSAFE)<ref>{{Cite book|last1=Singh|first1=Shivendra|last2=Benedict|first2=Shajulin|chapter=Indian Semi-Acted Facial Expression (iSAFE) Dataset for Human Emotions Recognition|title=Advances in Signal Processing and Intelligent Recognition Systems|date=2020|editor-last=Thampi|editor-first=Sabu M.|editor2-last=Hegde|editor2-first=Rajesh M.|editor3-last=Krishnan|editor3-first=Sri|editor4-last=Mukhopadhyay|editor4-first=Jayanta|editor5-last=Chaudhary|editor5-first=Vipin|editor6-last=Marques|editor6-first=Oge|editor7-last=Piramuthu|editor7-first=Selwyn|editor8-last=Corchado|editor8-first=Juan M.|series=Communications in Computer and Information Science|volume=1209|language=en|___location=Singapore|publisher=Springer|pages=150–162|doi=10.1007/978-981-15-4828-4_13|isbn=978-981-15-4828-4}}</ref> [https://github.com/shivendra2015iiit/Indian-Semi-Acted-Facial-Expression-Database-iSAFE- download]
|Happy, Sad, Fear, Surprise, Angry, Neutral, Disgust
|44
|Posed
|-
|Indian Spontaneous Expression Database (ISED)<ref>S L Happy, P. Patnaik, A. Routray, and R. Guha,&nbsp;"The Indian Spontaneous Expression Database for&nbsp;Emotion Recognition," in IEEE Transactions on Affective Computing,&nbsp;2016, {{doi|10.1109/TAFFC.2015.2498174}}.</ref>
|sadness, surprise,&nbsp;happiness, and disgust
|50
|Spontaneous
|-
|Radboud Faces Database (RaFD)<ref>Langner, O., Dotsch, R., Bijlstra, G., Wigboldus, D.H.J., Hawk, S.T., & van Knippenberg, A. (2010). Presentation and validation of the Radboud Faces Database. Cognition & Emotion, 24(8), 1377–1388. {{doi|10.1080/02699930903485076}}</ref>
|neutral, sadness,&nbsp;contempt, surprise,&nbsp;happiness, fear, anger, and disgust
|67
|Frontal pose
|-
|AffectNet<ref>{{Cite journal|last1=Mollahosseini|first1=A.|last2=Hasani|first2=B.|last3=Mahoor|first3=M. H.|date=2017|title=AffectNet: A Database for Facial Expression, Valence, and Arousal Computing in the Wild|journal=IEEE Transactions on Affective Computing|volume=PP|issue=99|pages=18–31|doi=10.1109/TAFFC.2017.2740923|issn=1949-3045|arxiv=1708.03985|s2cid=37515850}}</ref>
|neutral, happy, sad, surprise, fear, disgust, anger, contempt
|
|Posed
|-
|Aff-Wild<ref>{{Cite book|last1=Zafeiriou|first1=S.|last2=Kollias|first2=D.|last3=Nicolaou|first3=M.A.|last4=Papaioannou|first4=A.|last5=Zhao|first5=G.|last6=Kotsia|first6=I.|title=2017 IEEE Conference on Computer Vision and Pattern Recognition Workshops (CVPRW) |chapter=Aff-Wild: Valence and Arousal 'In-the-Wild' Challenge |date=2017|chapter-url=https://eprints.mdx.ac.uk/22045/1/aff_wild_kotsia.pdf|pages=1980–1987|doi=10.1109/CVPRW.2017.248|isbn=978-1-5386-0733-6|s2cid=3107614|url=http://urn.fi/urn:nbn:fi-fe201902276466}}</ref><ref>{{Cite journal|last1=Kollias|first1=D.|last2=Tzirakis|first2=P.|last3=Nicolaou|first3=M.A.|last4=Papaioannou|first4=A.|last5=Zhao|first5=G.|last6=Schuller|first6=B.|last7=Kotsia|first7=I.|last8=Zafeiriou|first8=S.|date=2019|title=Deep Affect Prediction in-the-wild: Aff-Wild Database and Challenge, Deep Architectures, and Beyond|url=https://rdcu.be/bmGm2|journal=International Journal of Computer Vision|volume=127|issue=6–7|pages=907–929|doi=10.1007/s11263-019-01158-4|s2cid=13679040|doi-access=free|arxiv=1804.10938}}</ref>
|valence and arousal
|200
|In-the-Wild setting
|-
|Aff-Wild2<ref>{{Cite journal|last1=Kollias|first1=D.|last2=Zafeiriou|first2=S.|date=2019|title=Expression, affect, action unit recognition: Aff-wild2, multi-task learning and arcface|url=https://bmvc2019.org/wp-content/uploads/papers/0399-paper.pdf|journal=British Machine Vision Conference (BMVC)|arxiv=1910.04855}}</ref><ref>{{Cite book|last1=Kollias|first1=D.|last2=Schulc|first2=A.|last3=Hajiyev|first3=E.|last4=Zafeiriou|first4=S.|title=2020 15th IEEE International Conference on Automatic Face and Gesture Recognition (FG 2020) |chapter=Analysing Affective Behavior in the First ABAW 2020 Competition |date=2020|chapter-url=https://www.computer.org/csdl/proceedings-article/fg/2020/307900a794/1kecIYu9wL6|pages=637–643|doi=10.1109/FG47880.2020.00126|arxiv=2001.11409|isbn=978-1-7281-3079-8|s2cid=210966051}}</ref>
|neutral, happiness, sadness, surprise, fear, disgust, anger + valence-arousal + action units 1,2,4,6,12,15,20,25
|458
|In-the-Wild setting
|-
|Real-world Affective Faces Database (RAF-DB)<ref>{{Cite web|last=Li|first=S.|title=RAF-DB|url=http://www.whdeng.cn/RAF/model1.html|website=Real-world Affective Faces Database}}</ref><ref>{{Cite book|last1=Li|first1=S.|last2=Deng|first2=W.|last3=Du|first3=J.|title=2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR) |chapter=Reliable Crowdsourcing and Deep Locality-Preserving Learning for Expression Recognition in the Wild |date=2017|pages=2584–2593|doi=10.1109/CVPR.2017.277|isbn=978-1-5386-0457-1|s2cid=11413183}}</ref>
|6 classes of '''basic emotions''' (Surprised, Fear, Disgust, Happy, Sad, Angry) plus Neutral and 12 classes of '''compound emotions''' (Fearfully Surprised, Fearfully Disgusted, Sadly Angry, Sadly Fearful, Angrily Disgusted, Angrily Surprised, Sadly Disgusted, Disgustedly Surprised, Happily Surprised, Sadly Surprised, Fearfully Angry, Happily Disgusted)
|29,672 annotated examples
|Color
|Various for original dataset and 100x100 for aligned dataset
|Emotion labels
|Posed and Spontaneous
|}
{{reflist}}
 
{{Nonverbal communication}}
 
[[Category:Database-related lists|Facial expression]]
[[Category:Facial expressions]]