The Smile Detector
slice.mit.edu

A Media Lab research team has developed a way to discern the authenticity of a smile. The team, led by Affective Computing Research Group Director Rosalind Picard SM ’86, PhD ’91 and doctoral candidates Ehsan Hoque and Daniel McDuff, has developed an algorithm that analyzes the subtle differences between a “fake” smile generated by frustration and a “real” smile born of enjoyment.
From MIT News:

[Video: http://www.youtube.com/watch?v=MYmgCQjgXQU]

The research could pave the way for computers that better assess the emotional states of their users and respond accordingly. It could also help train those who have difficulty interpreting expressions, such as people with autism, to more accurately gauge the expressions they see.
The team recorded volunteers’ expressions via webcam at the Media Lab, capturing both frustrated grimaces and happy grins in a range of situations, including:

- Feigning expressions of happiness or frustration
- Watching a video of a cute baby
- Filling out a long, arduous form whose contents were deleted after hitting the “submit” button

While fully formed “real” and “fake” smiles look much alike in a still frame, the progression of each is quite different: an authentic smile builds gradually, while a smile born of frustration appears quickly and rapidly fades.
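That timing cue can be made concrete with a toy sketch. The code below is not the team’s classifier (which was trained on real video features); it only encodes the one dynamic the article describes, using a hypothetical per-frame smile-intensity signal and an illustrative onset-rate threshold:

```python
def classify_smile(intensity, fps=30.0, rate_threshold=1.5):
    """Label a smile-intensity time series (values in [0, 1]).

    intensity: per-frame smile strength, e.g. from a facial landmark
               tracker (hypothetical upstream source).
    rate_threshold: onset speed (intensity units per second) above which
               the smile counts as a sudden, frustration-like spike.
               The value 1.5 is an illustrative assumption, not a
               parameter from the study.
    """
    peak = max(intensity)
    peak_idx = intensity.index(peak)          # frame where the smile peaks
    onset_time = peak_idx / fps               # seconds from start to peak
    onset_rate = peak / onset_time if onset_time > 0 else float("inf")
    return "frustration" if onset_rate > rate_threshold else "delight"

# Synthetic curves: a gradual 2-second rise vs. a sharp 0.2-second spike
# that then decays, mimicking the two smile progressions described above.
slow = [i / 60 for i in range(61)]                                   # builds over 2 s
fast = [i / 6 for i in range(7)] + [1 - i / 20 for i in range(20)]   # spikes in 0.2 s

print(classify_smile(slow))  # delight
print(classify_smile(fast))  # frustration
```

A real system would, of course, estimate smile intensity from video frames and learn its decision boundary from labeled data rather than use a hand-picked threshold.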
The research was published in IEEE Transactions on Affective Computing and, according to the team, the computer program has shown a better success rate than human observers at judging the emotional states of its subjects.