Hod Lipson, professor of innovation, recalled: “One day I was minding my own business when EVA suddenly gave me a big, friendly smile. I knew it was purely mechanical, but I found myself smiling back by reflex.”
Once the team was satisfied with EVA’s mechanics, they turned to the artificial intelligence (AI) that would guide the robot’s facial movements, using deep learning to “read” the expressions on nearby human faces and reflect them back. But EVA’s capabilities go further: she can imitate a wide range of human facial expressions, a skill she acquired by trial and error, watching videos of herself.
Thus, after several refinements, EVA learned to read a person’s facial gestures through a camera and respond by mirroring that person’s expression.
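The learn-by-self-observation idea described above can be illustrated with a toy sketch. This is not Columbia’s actual EVA code: the expression vector, the stand-in “camera” function, and the simple one-gain-per-motor model are all invented here purely to show the shape of the trial-and-error loop, in which the robot tries motor commands, watches the expression they produce, and learns to invert that relation so it can reproduce an expression seen on a human face.

```python
import random

# Toy illustration (hypothetical, not EVA's real architecture): an
# expression is a vector of facial-action intensities, and each motor
# is assumed to drive one of those intensities.

N = 4  # number of motors / expression dimensions (made up)

def observed_expression(motors):
    # Stand-in for the camera watching the robot's own face: here each
    # motor simply scales its expression dimension by a hidden factor.
    return [0.5 * m for m in motors]

def learn_gains(trials=200):
    # Trial and error: issue random motor commands, watch the resulting
    # expression, and nudge each gain estimate toward the observed
    # expression/motor ratio -- analogous to EVA learning from videos
    # of herself.
    gains = [1.0] * N
    for _ in range(trials):
        motors = [random.uniform(-1, 1) for _ in range(N)]
        expr = observed_expression(motors)
        for i in range(N):
            if abs(motors[i]) > 1e-6:
                gains[i] += 0.1 * (expr[i] / motors[i] - gains[i])
    return gains

def mirror(target_expression, gains):
    # Invert the learned model: choose the motor commands predicted to
    # reproduce an expression seen on a nearby human face.
    return [e / g for e, g in zip(target_expression, gains)]

gains = learn_gains()
smile = [0.8, 0.1, 0.0, 0.3]  # hypothetical target expression
motors = mirror(smile, gains)
print([round(x, 2) for x in observed_expression(motors)])
# → [0.8, 0.1, 0.0, 0.3]
```

After enough trials the gain estimates converge to the hidden factor, so the commanded motors reproduce the target expression; the real system replaces this linear toy with a deep network and real video of the robot’s face.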