An MIT Alumni Association Publication
Have you ever found yourself trapped in an aggravating conversation, smiling endlessly and hoping it might end? If so, be careful. Someone may determine that your smile is insincere.

A Media Lab research team has developed a way to discern the authenticity of a smile. The team, led by Affective Computing Research Group Director Rosalind Picard SM ’86, PhD ’91 and doctoral candidates Ehsan Hoque and Daniel McDuff, has developed an algorithm that analyzes the subtle differences between a “fake” smile generated by frustration and a “real” smile born of enjoyment.

From MIT News:

The research could pave the way for computers that better assess the emotional states of their users and respond accordingly. It could also help train those who have difficulty interpreting expressions, such as people with autism, to more accurately gauge the expressions they see.

Video: http://www.youtube.com/watch?v=MYmgCQjgXQU

The team recorded volunteers' expressions via webcam at the Media Lab, capturing footage that juxtaposed frustrated grimaces with happy grins across a range of situations, including:

  1. Feigning expressions of happiness or frustration
  2. Watching a video of a cute baby
  3. Filling out a long, arduous form; the form's contents were deleted after hitting the “submit” button

While fully formed "real" and "fake" smiles look much alike, the progression of each smile is quite different. An authentic smile builds gradually; a smile born of frustration appears quickly and fades rapidly.
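The temporal pattern described above can be sketched as a toy heuristic. This is not the team's actual method (which applies machine learning to features extracted from video); the function, thresholds, and test signals below are all hypothetical, chosen only to illustrate why rise time separates the two smile types.

```python
def classify_smile(intensity, fps=30.0, rise_threshold=0.5):
    """Toy classifier: label a smile from per-frame intensity values in [0, 1].

    Hypothetical heuristic: a genuine smile builds gradually (long rise
    time from onset to peak), while a frustrated smile spikes abruptly.
    """
    peak = max(intensity)
    # Onset: first frame clearly above baseline (10% of peak, an assumed cutoff).
    onset = next(i for i, v in enumerate(intensity) if v > 0.1 * peak)
    peak_idx = intensity.index(peak)
    rise_time = (peak_idx - onset) / fps  # seconds from onset to peak
    return "genuine" if rise_time >= rise_threshold else "frustrated"

# Two synthetic 2-second clips at 30 fps:
slow = [i / 59 for i in range(60)]                         # gradual build
flash = [1.0 if 5 <= i <= 10 else 0.0 for i in range(60)]  # abrupt spike
print(classify_smile(slow))   # genuine
print(classify_smile(flash))  # frustrated
```

Even this crude rule separates the two synthetic signals, which mirrors the team's finding that the smiles differ in their dynamics rather than in their fully formed appearance.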

The research was published in IEEE Transactions on Affective Computing and, according to the team, the computer program has shown a better success rate than human observers at gauging the emotional state of its subjects.