
Emotion-Recognizer Demonstration

The interactive Emotion-Recognizer demonstration provides an engaging introduction to applications of machine learning. The Emotion-Recognizer uses NEXT to map out the space of concepts, simple machine-vision tools to identify facial features, and a learned mapping from feature space to emotion space.
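As a rough sketch of how that last step could work (this is an illustration with placeholder data, not the demo's actual code; the feature vectors and the choice of ridge regression are assumptions), a mapping from facial-feature space to the 2-D emotion space can be fit from faces whose emotion-map coordinates are already known:

```python
import numpy as np
from sklearn.linear_model import Ridge

# X: facial-feature vectors (e.g., measurements derived from facial landmarks),
# one row per training face. Y: the 2-D emotion-map coordinates of those same
# faces, taken from the crowdsourced embedding described below.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))   # placeholder feature vectors
Y = rng.normal(size=(200, 2))    # placeholder (intensity, positivity) coordinates

# Fit a regularized linear map from feature space to emotion space.
feature_to_emotion = Ridge(alpha=1.0).fit(X, Y)

# A new face's features can then be projected onto the emotion map.
new_face_features = rng.normal(size=(1, 10))
print(feature_to_emotion.predict(new_face_features))  # -> [[intensity, positivity]]
```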


Crowdsourced human judgements were used to generate this map (below), which has clusters for happy, angry, bored, and excited emotions. The demo lets you take a picture of your face and then projects it onto this map of facial emotions.
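One simple way to label a projected face with one of those clusters is a nearest-centroid lookup on its map coordinates. This is an assumption for illustration only (the demo's actual decision rule is not described here), and the centroid coordinates below are made up:

```python
import numpy as np

# Illustrative cluster centroids on the 2-D emotion map (made-up coordinates).
centroids = {
    "happy":   np.array([ 0.8,  0.8]),
    "angry":   np.array([ 0.8, -0.8]),
    "bored":   np.array([-0.8, -0.3]),
    "excited": np.array([ 0.9,  0.3]),
}

def nearest_cluster(point):
    """Return the emotion cluster whose centroid is closest to the projected face."""
    return min(centroids, key=lambda name: np.linalg.norm(centroids[name] - point))

print(nearest_cluster(np.array([0.7, 0.6])))  # -> "happy"
```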

Try this with your face!

In summary:

  • Generating the embedding of faces is time-consuming (be patient!)
  • The embedding relies on human judgements, collected by asking questions like “is face X more similar to face A or face B?” (a minimal sketch of this kind of triplet embedding follows this list)
  • There are two axes to the circle of faces: an intensity axis (calm/rage) and a positivity axis (happy/sad).
  • The faces are roughly distributed on a ring; all neutral faces (i.e., neither happy nor sad) are calm
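As promised above, here is a minimal sketch of how triplet answers of the form “face X is more similar to face A than to face B” can be turned into a 2-D embedding by minimizing a hinge loss. This is not the NEXT system's actual algorithm, and it uses synthetic triplets in place of real crowd responses:

```python
import numpy as np

rng = np.random.default_rng(0)
n_faces, dim = 30, 2
emb = rng.normal(scale=0.1, size=(n_faces, dim))   # one 2-D point per face

# Each answer is a triplet (x, a, b): "face x is more similar to a than to b".
# Real answers would come from the crowd; here they are random placeholders.
triplets = [tuple(rng.choice(n_faces, size=3, replace=False)) for _ in range(2000)]

lr, margin = 0.05, 1.0
for epoch in range(50):
    for x, a, b in triplets:
        d_xa = emb[x] - emb[a]
        d_xb = emb[x] - emb[b]
        # Hinge loss: penalize triplets where x is not closer to a than to b by the margin.
        violation = np.dot(d_xa, d_xa) - np.dot(d_xb, d_xb) + margin
        if violation > 0:
            # Gradient step pulls x toward a and pushes it away from b.
            emb[x] -= lr * 2 * (d_xa - d_xb)
            emb[a] += lr * 2 * d_xa
            emb[b] -= lr * 2 * d_xb

print(emb[:5])  # first few learned 2-D coordinates
```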

If you are in the Madison area, please join us at the Wisconsin Science Festival, Oct 11-14, 2018. More details here.