Purpose: Uncertainty is an under-acknowledged issue in the automatic assessment of human emotion by machines. The purpose of this paper is to highlight existing approaches to measuring such uncertainty and to identify further research needs.

Design/methodology/approach: The discussion is based on a literature review.

Findings: Technical solutions for measuring uncertainty in automatic emotion recognition (AER) exist, but they need to be extended to cover a range of so-far underrepresented sources of uncertainty. These then need to be integrated into systems available to general users.

Research limitations/implications: Not all sources of uncertainty in AER, including emotion representation and annotation, can be touched upon in this communication.

Practical implications: AER systems should be enhanced to provide more meaningful and complete information on the uncertainty underlying their estimates. Limitations of their applicability should be communicated to users.

Social implications: Users of AER technology will become aware of its limitations, potentially leading to fairer usage in critical application contexts.

Originality/value: There is no previous discussion that includes the technical viewpoint on extended uncertainty measurement in AER.
Journal of Information, Communication and Ethics in Society – Emerald Publishing
Published: Aug 12, 2019